Using CI to deploy database changes - entity-framework-migrations

I am using Entity Framework Code First Migrations to apply changes to my local database when someone changes the data model. I run "Update-Database" when I want to incorporate database changes checked in by another developer; the model changes are applied and the Seed method is called, populating the database with the latest data.
I have just set up a CI environment with TeamCity to push code changes to the Latest build environment, the IAT environment and the UAT environment.
The problem is that when I run the site, an empty database is created but the Seed method is not run. How can I make the Seed method run, and have any new database changes applied, when I deploy?

The Seed method should have run every time a developer runs Update-Database against their own DB. For a shared server, you can run migrate.exe on the command line against each server environment.
This executable is available in the tools subfolder when you download EF via NuGet. If you want, you can automate running it by deploying it along with your application. Then set up a command step in your CI configuration to call migrate.exe with the appropriate flags.
You can find more information on migrate.exe on MSDN:
http://msdn.microsoft.com/en-us/data/jj618307.aspx
e.g., to migrate to the latest migration:
Migrate.exe MyMvcApplication.dll /startupConfigurationFile="..\web.config"
When running migrate.exe the only mandatory parameter is the assembly, which is the assembly that contains the migrations you are trying to run; it will use convention-based settings if you do not specify the configuration file.
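As an alternative to shipping migrate.exe, EF6 can also apply pending migrations (and then call Seed) at application startup through its MigrateDatabaseToLatestVersion initializer. A minimal sketch, assuming a hypothetical MyContext DbContext and the standard migrations Configuration class:

using System.Data.Entity;

public static class DatabaseBootstrap
{
    public static void Initialize()
    {
        // MyContext and Configuration are hypothetical names for your DbContext
        // and your migrations configuration class. This initializer applies any
        // pending migrations and then calls the Seed method.
        Database.SetInitializer(
            new MigrateDatabaseToLatestVersion<MyContext, Configuration>());

        using (var context = new MyContext())
        {
            // Force the initializer to run now instead of on the first query.
            context.Database.Initialize(force: true);
        }
    }
}

With this initializer in place, the first startup after each deployment should migrate the database and reseed it, which addresses the empty-database symptom above.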
Best of luck!

Related

How to UPDATE DATABASE in Code First approach automatically after check-in and Publish code in Continuous deployment

In our Web API application, continuous deployment needs to follow this scenario:
A user checks in code from Visual Studio, the code is automatically built, the code is published, and the code is deployed.
But if we are using the Entity Framework Code First approach, how can we update the database without manual commands (Add-Migration/Update-Database) and bring the database up to date with that check-in?
You can try running the Add-Migration/Update-Database commands in the build/deploy process.
Assuming you are using a vNext build:
Add a "Nuget Installer" task in your build definition first to
restore the Entity Framework during the build. Migrate.exe will be
installed in \packages\EntityFramework.\tools folder.
Then add a "Command Line" task to run the migrate.exe. Enter
“\packages\EntityFramework.\tools\migrate.exe" in "Tool" area and
the arguments in "Arguments" field.
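For example, assuming your migrations live in a hypothetical MyWebApp.dll, the "Arguments" field might look like this (same flags as the earlier Migrate.exe example):
MyWebApp.dll /startupConfigurationFile="..\MyWebApp\web.config" /verbose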
Reference this thread : How can I run Entity Framework's migrate.exe from Visual Studio Online?
You can also try the "Entity Framework Migrations" extension, which contains a set of tasks that let you work with Entity Framework Code First migrations:
Method 1: Generating SQL script
The first method allows you to generate a SQL script containing all migrations. This script can be obtained by manually running Update-Database -SourceMigration 0 -Script in the NuGet Package Manager Console in Visual Studio. You can then either run this script manually after the release or run it automatically during the release using an extension that allows you to run SQL scripts.
Task name: Generate migration SQL script
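If you run the generated script yourself rather than through an extension, the sqlcmd utility is one option; a hedged example with placeholder server, database, and file names:
sqlcmd -S YourServer -d YourDatabase -i migrations.sql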
Other articles that may help:
Deploying an Entity Framework Database into Production
Using TFS Build to Deploy Entity Framework Database Migrations with Migrate.exe
Continuous Delivery with TFS: Our Sample Application

Issue with Database project (DACPAC) - Continuous delivery

I have a Microsoft TFS build process that deploys a web project to an Azure web role; this runs in an automated way every day. I have followed the Azure article https://azure.microsoft.com/en-in/documentation/articles/cloud-services-dotnet-continuous-delivery/
I have the following MSBuild arguments in my build process, as the above article suggested:
/t:Publish /p:PublishDir=C:\MSCD\
When I add a database project to my solution, the build keeps failing with the error message:
The "SqlPublishTask" task was not given a value for the required parameter "SqlPublishProfilePath"
When I publish my web project, I don't want the database project to be published. How do I get the DACPAC file into the drop folder so I can use PowerShell to update my database in Azure?
I am using TFS 2012 on-premises. Could someone give a suggestion on how to solve this problem?
You need to create master and child build definitions. In the master build definition, configure the solution with build and deploy disabled for the database project in Configuration Manager, and in the child build configure the database project alone.
Create the master and child build definitions such that they share a common drop folder.
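Once the DACPAC is in the drop folder, your PowerShell step can publish it with SqlPackage.exe; a hedged example with a hypothetical file name and a placeholder connection string:
SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetConnectionString:"<your-azure-sql-connection-string>"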
Siva Palla's answer solved this exact same issue for me. Here is the complete set of changes we made to get this working.
Changes in Visual Studio
Originally I was using a single configuration called 'Deployment' that was set to deploy both our WinForms project and our SQL project (VisionShellWin is the WinForms project; the two Vision.SQLMigration projects are the SSDT projects), so everything below is based on changes to Deployment. Most people will have Release instead of Deployment; all of the below should still work, just substitute Release for Deployment.
To separate that single configuration into two, I created a new configuration.
I set that new configuration to copy its settings from the old Deployment configuration and not to create new configurations for each of the projects (the existing project-level Deployment ones are fine).
In that new configuration I then unticked the boxes to Build and Deploy the database projects.
I then did the exact same thing again to create a database-specific configuration,
and in that one I unticked everything except the database files.
That's everything needed in Visual Studio, so I committed all of that and synced it back to DevOps.
Changes in Azure DevOps
In DevOps I cloned my existing Visual Studio Build stage (called 'Build Winforms solution') and renamed the clone to Build Databases.
I added two new build variables in the Variables tab, named ClickOnceBuildStageConfiguration and DatabasesBuildStageConfiguration, with their values set to the names of the new configurations we just created in VS.
I then changed the WinForms build stage to use the new ClickOnceBuildStageConfiguration variable - note that we still have the /T:"VisionShellWin" /target:Publish MSBuild arguments needed for ClickOnce set.
The new Build Databases stage uses the databases variable - note that it has no MSBuild arguments.
Finally, in addition to the copy stage I already had for copying the ClickOnce application files into the drop artifact, I also added a Copy Files stage (called 'Copy Dacpacs') to copy the DACPAC into the drop too.
Once you've done all of that, you should end up with a build that works and an artifact that contains both the ClickOnce files and the DACPACs.

TFS - running MSBuild integration tests against specific SQL Servers

In our TFS 2013 project we have a set of simple MSBuild-based integration tests alongside our unit tests, which test stored procedures and other logic that need a database server to be present. For example:
[TestMethod]
[TestCategory("Integration")]
public void SomeTest()
{
    // Arrange: set up the database state the procedure expects.
    InitialiseData();

    // Act: run the stored procedure under test.
    var results = RunStoredProcedure();

    // Assert: verify the procedure produced the expected results.
    AssertResultIsCorrect(results);
}
As you can see we have tagged these tests as "Integration" so that we can supply a test case filter that excludes these tests when we don't want to run them. These tests are all designed to run against an installed copy of our database (we have a process which deploys an up-to-date copy of our database to a SQL Server instance).
What we would like to do is create an integration test build definition in TFS which we can schedule to deploy our database and then run all of these tests against a specific SQL Server instance.
At the moment we use the standard TFS build process with two test sources - the first runs a "unit test" which installs the database, the second contains all of the actual integration tests. The problem we have is passing the connection string & database name into the unit tests from the build definition. Currently the connection string is in app.config files for each of the test projects, however this is less than ideal as it means that we are constantly getting failing test runs, either due to developers checking in the wrong connection string, or running tests locally against the build database at the same time that a build is running. This setup also limits us to running one build at a time.
Is there a way that we can specify the connection string and database name to use as part of the build workflow template instead?
With a combination of SlowCheetah for your config transformation and VS linked files, I think you can solve this (and based on the OP you probably already have :). Make a new solution configuration in your solution for the scenario you described. This configuration will not be used on dev machines, only by the TFS build definition (under Process, Items to build, Configurations to build).
The Configuration Manager for the solution would then use the new solution configuration only for the test project.
Add a SlowCheetah transform for your new solution configuration, and put the DB connection string you need for TFS into that new transform.
Now, in the tests project, copy over all the config files as linked files. This will allow the test executions to respect the test config file that SlowCheetah will transform. You may have to adjust the configuration reading in your test project(s) to account for this.
This isolates the solution configuration to the TFS server, since only it will build with your new solution configuration. Now TFS will have a config file that points to your specific TFS database connection that no other machine respects.
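The test code itself does not need to change; it keeps reading its connection string the usual way, and the transform decides the value per configuration. A minimal sketch, assuming a hypothetical connection string named "IntegrationDb":

using System.Configuration;

public static class TestDatabase
{
    // "IntegrationDb" is a hypothetical name; the active solution configuration's
    // SlowCheetah transform determines the value written into the test config file.
    public static string ConnectionString =>
        ConfigurationManager.ConnectionStrings["IntegrationDb"].ConnectionString;
}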

Ideas on how to clean remote DB before running selenium tests

I have a project where we have created a set of Selenium tests, using RSpec and Capybara, that run against a remote server. This means that these tests do not run in the same Rails instance/environment as the application and, therefore, do not have access to that application's rake tasks.
What we are trying to figure out is a good method of cleaning/restoring the database before each run. We deploy the application via a Jenkins build task and then, if successful, kick off the Selenium tests. We are using Selenium 2 and the tests are run via Selenium Server (formerly Selenium Grid). We do have the capability of firing off a Cap task when we deploy the application to restore the DB.
The question is how to do the restore while minimizing the number of migrations that we need to do (preferably limiting migrations to only the most recent ones) and pre-seeding the database with the required data.
Some interesting things to note about our setup: we have a fair bit of information to seed, not Gigs of it, but more than what you would want to enter into a seeds file and we have a fully partitioned database with both public and private schemas. We have a multi-tenant application and use private schemas to isolate data access.
So, what are some of the ways that other people have used to solve this problem?
I think most people use database-cleaner for this problem but, as I said at the beginning, the Selenium tests run outside of the Rails environment, so database-cleaner won't work.
If you're using Jenkins, you could build another Jenkins job that is solely responsible for resetting / refreshing your database. This could contain scripts in your flavor of choice for cleaning up the database. Then set your current Jenkins testing job as a downstream project that gets kicked off upon the successful execution of your cleanup job.
Then when you want to kickoff a full test, just run the cleanup job and go make a sandwich :)

environment configuration for tests running in NUnit

I have some integration tests that hit a webserver and verify certain functionalities. Depending on the build environment, the server will be at a different address (http://localhost:8080/, http://test-vm/, etc).
I would like to run these tests from a TFS build.
I'm wondering what's the appropriate way to configure these tests? Would I just add a setting to the config file? I'm doing that currently. Incidentally, we do have a separate branch per test environment, so I could have a different config file checked in for each environment. I wonder if there is a better way, though?
I'd like the build project to be able to tell the test what server to test. This seems better because then I don't have to maintain config information on a per branch basis.
I believe I'd be using NUnit for Team Build (http://nunit4teambuild.codeplex.com/) to get NUnit/TFS to play together.
I can think of a couple of options:
Edit the .config file via the command line before the test runs.
If the setting depends on which machine the test is run from, you could put it in machine.config.
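Either way, the tests themselves can stay agnostic about where the value comes from and simply read it at run time. A minimal sketch, assuming a hypothetical appSettings key named "ServerUrl":

using System.Configuration;
using NUnit.Framework;

[TestFixture]
public class WebServerTests
{
    private string _serverUrl;

    [SetUp]
    public void ReadConfiguredServer()
    {
        // "ServerUrl" is a hypothetical key; the build edits this value
        // (in the .config file or machine.config) before the test run.
        _serverUrl = ConfigurationManager.AppSettings["ServerUrl"];
    }

    [Test]
    public void ServerAddressIsConfigured()
    {
        Assert.That(_serverUrl, Is.Not.Null.And.Not.Empty);
    }
}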
