How to update the database automatically in the Code First approach after check-in and publish in continuous deployment - TFS

In our Web API application, continuous deployment needs to support the following scenario:
A user checks in code in VS, the code is automatically built, the code is published, and the code is deployed.
But if we are using the Entity Framework Code First approach, how can we update the database without manual commands (Add-Migration/Update-Database) and keep the database up to date with that check-in?

You can try running the Add-Migration/Update-Database commands in the build/deploy process.
Assuming you are using vNext build:
Add a "NuGet Installer" task to your build definition first to restore Entity Framework during the build. Migrate.exe will be installed in the \packages\EntityFramework.\tools folder.
Then add a "Command Line" task to run migrate.exe. Enter "\packages\EntityFramework.\tools\migrate.exe" in the "Tool" field and the arguments in the "Arguments" field.
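As a minimal sketch of that command-line step (the project, assembly and config names below are placeholders, and the EF package folder is resolved with a wildcard rather than a hard-coded version):

# Locate migrate.exe in the restored EF package and run it against the
# migrations assembly. Assembly and config names are hypothetical.
$migrate = Get-ChildItem ".\packages\EntityFramework.*\tools\migrate.exe" |
    Select-Object -First 1
& $migrate.FullName MyWebApi.dll `
    /startupDirectory=".\MyWebApi\bin" `
    /startupConfigurationFile=".\MyWebApi\web.config" /verbose
if ($LASTEXITCODE -ne 0) { throw "Database migration failed." }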
Reference this thread: How can I run Entity Framework's migrate.exe from Visual Studio Online?
You can also try the extension "Entity Framework Migrations", which contains a set of tasks that let you work with Entity Framework Code First migrations:
Method 1: Generating a SQL script
The first method allows you to generate a SQL script containing all migrations. This script can be obtained by manually running Update-Database -SourceMigration 0 -Script in the NuGet Package Manager Console in Visual Studio. You can then either run this script manually after the release or run it automatically during the release using an extension that allows you to run SQL scripts.
Task name: Generate migration SQL script
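Running the generated script during the release amounts to something like the following sketch (server, database and script path are placeholders; it assumes the SQL Server PowerShell module is available on the agent):

# Apply the migration script produced by Update-Database -Script.
Invoke-Sqlcmd -ServerInstance "MyDbServer" -Database "MyAppDb" `
    -InputFile ".\drop\migrations.sql" -QueryTimeout 300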
Other articles that may help:
Deploying an Entity Framework Database into Production
Using TFS Build to Deploy Entity Framework Database Migrations with Migrate.exe
Continuous Delivery with TFS: Our Sample Application

Related

Issue with Database project (DACPAC) - Continuous delivery

I have a Microsoft TFS build process to deploy a web project to an Azure web role; this runs automatically every day. I have followed the Azure article https://azure.microsoft.com/en-in/documentation/articles/cloud-services-dotnet-continuous-delivery/
I have the following MSBuild arguments in my build process, as the above article suggested:
/t:Publish /p:PublishDir=C:\MSCD\
When I add a database project to my solution, the build keeps failing with the error message:
The "SqlPublishTask" task was not given a value for the required parameter "SqlPublishProfilePath"
When I publish my web project, I don't want the database project to be published. How can I drop the DACPAC file into the drop folder, so I can use PowerShell to update my database in Azure?
I am using TFS 2012 on-premises. Could someone give a suggestion on how to solve this problem?
You need to create master and child build definitions: in the master build definition, configure the solution with build and deploy disabled for the database project in Configuration Manager, and in the child build configure the database project alone.
Create the master and child build definitions such that they share a common drop folder.
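Once the DACPAC reaches the drop folder, the PowerShell update the asker mentions could look roughly like this (the SqlPackage.exe path, drop path and connection string are all placeholders):

# Publish the DACPAC from the drop folder to an Azure SQL database.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe"
& $sqlPackage /Action:Publish `
    /SourceFile:"\\buildserver\drop\MyDatabase.dacpac" `
    /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Database=MyDb;User ID=myadmin;Password=<password>;Encrypt=True"
if ($LASTEXITCODE -ne 0) { throw "DACPAC deployment failed." }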
Siva Palla's answer solved this exact same issue for me. Here is the complete set of changes we made to get this working.
Changes in Visual Studio
Originally I was using a single configuration called 'Deployment' that was set to deploy both our WinForms project and our SQL project (VisionShellWin is the WinForms project, the two Vision.SQLMigration projects are the SSDT projects), so everything below is based on changes to Deployment. Most people will have Release instead of Deployment; all of the below should still work fine, just substitute Release for Deployment:
To separate that single configuration into two, I created a new configuration:
I set that new deployment to copy its settings from the old Deployment configuration and not to create new configurations for each of the projects (the existing project-level Deployment ones are fine):
In that new deployment I then unticked the boxes to Build and Deploy the database projects:
I then did the exact same thing again to create a Database specific deployment:
And in that one I unticked everything except the database files:
That's everything needed in Visual Studio, so I committed all of that and synced it back to DevOps.
Changes in Azure DevOps
In DevOps I cloned my existing Visual Studio Build stage (called 'Build Winforms solution') and renamed the clone to Build Databases:
I added two new build variables in the Variables tab, named ClickOnceBuildStageConfiguration and DatabasesBuildStageConfiguration with their values set to the names of the new configurations we just created in VS:
And then I changed the WinForms build stage to use the new ClickOnceBuildStageConfiguration variable - note that we still have the /T:"VisionShellWin" /target:Publish MSBuild Arguments needed for ClickOnce set:
And the new Build Databases stage to use the databases variable - note that we don't have any MSBuild Arguments:
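Roughly speaking, the two stages end up invoking MSBuild like this (a sketch only: the solution name is made up, and it assumes msbuild.exe is on the agent's path and that the pipeline variables surface as environment variables):

# WinForms stage: build the ClickOnce configuration with the publish targets.
& msbuild.exe VisionShell.sln "/p:Configuration=$env:CLICKONCEBUILDSTAGECONFIGURATION" /T:"VisionShellWin" /target:Publish
# Databases stage: build the database configuration with no extra arguments.
& msbuild.exe VisionShell.sln "/p:Configuration=$env:DATABASESBUILDSTAGECONFIGURATION"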
Finally, in addition to the copy stage I already had for copying the ClickOnce application files into the drop artifact:
I also added a Copy Files stage (called 'Copy Dacpacs') to copy the DacPac into the drop too:
Once you've done all of that, you should end up with a build that works and an artifact that contains both the ClickOnce files and the DacPacs.
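If you prefer scripting the copy over the GUI task, the 'Copy Dacpacs' step amounts to something like this (the predefined agent variables are standard; everything else is an assumption):

# Gather every built .dacpac and stage it for the drop artifact.
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Filter *.dacpac -Recurse |
    Copy-Item -Destination $env:BUILD_ARTIFACTSTAGINGDIRECTORY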

TFS - running MSBuild integration tests against specific SQL Servers

In our TFS 2013 project we have a set of simple MSBuild-based integration tests alongside our unit tests which test stored procedures and other logic that needs a database server to be present, for example:
[TestMethod]
[TestCategory("Integration")]
public void SomeTest()
{
    InitialiseData();
    var result = RunStoredProcedure();
    AssertResultIsCorrect(result);
}
As you can see we have tagged these tests as "Integration" so that we can supply a test case filter that excludes these tests when we don't want to run them. These tests are all designed to run against an installed copy of our database (we have a process which deploys an up-to-date copy of our database to a SQL Server instance).
What we would like to do is create an integration test build definition in TFS which we can schedule to deploy our database and then run all of these tests against a specific SQL Server instance.
At the moment we use the standard TFS build process with two test sources - the first runs a "unit test" which installs the database, the second contains all of the actual integration tests. The problem we have is passing the connection string & database name into the unit tests from the build definition. Currently the connection string is in app.config files for each of the test projects, however this is less than ideal as it means that we are constantly getting failing test runs, either due to developers checking in the wrong connection string, or running tests locally against the build database at the same time that a build is running. This setup also limits us to running one build at a time.
Is there a way that we can specify the connection string and database name to use as part of the build workflow template instead?
With a combination of SlowCheetah for your config transformation and VS linked files, I think you can solve this (and based on the OP you probably already have :). Make a new solution configuration in your solution for the scenario you described. This solution configuration will not be used on dev machines, only by the TFS build definition (under Process, Items to Build, Configurations to Build).
The Configuration Manager for the solution would then use the solution configuration only for the test proj.
Add your SlowCheetah transform for your new solution configuration and put in your db conn string you need for TFS for that new transform.
Now in the tests project, copy over all the config files as linked files. This will allow the test executions to respect the test config file that SlowCheetah will transform. You may have to adjust your configuration reading in your test proj(s) to account for this.
This isolates the solution configuration to only the TFS server since only it will be building with your new solution configuration. Now TFS will have a config file that points to your specific TFS database connection that no other machines respect.
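If maintaining a transform ever feels heavyweight, a swapped-in alternative is to patch the test config from the build itself; a sketch, in which the config path, connection string name and environment variables are all placeholders:

# Rewrite the test project's connection string before the test run.
$configPath = ".\Tests\bin\Release\MyTests.dll.config"
[xml]$config = Get-Content $configPath
$node = $config.configuration.connectionStrings.add |
    Where-Object { $_.name -eq "TestDb" }
$node.connectionString = "Server=$env:TEST_DB_SERVER;Database=$env:TEST_DB_NAME;Integrated Security=True"
$config.Save((Resolve-Path $configPath).Path)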

TFS Build DOS command

We have a TFS (2012) project that we use as a PL/SQL source code repository. This project does not contain any .NET code or solution, just PL/SQL code.
We have a command line build process & deploy process to move the PL/SQL code to the database. Currently we are running the command line daily by hand. I would really like TFS to kick off the command line daily.
I have created a custom TFS Build Process Template that just contains an Invoke Process activity; however, when I go to create a new TFS Build Definition it requires me to fill in Items to Build. I do not have any items to build, I just want to kick off the build process template.
Is there any way to create a stripped-down TFS Build that will just run the Invoke Process and not worry about the Items to Build?
You have made a good choice, but your Invoke Process custom activity must be integrated into your main build; the items to build correspond to your C# solution.
So when you create your build, you compile your items, and at the end you execute your deployment step for the PL/SQL code.
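The command that Invoke Process launches could be as simple as a SQL*Plus call that applies a master script; a sketch, with the credentials, TNS alias and script path all placeholders:

# Apply the PL/SQL deployment script via SQL*Plus and fail the build on error.
& sqlplus.exe "deploy_user/secret@PRODDB" "@.\scripts\apply_all.sql"
if ($LASTEXITCODE -ne 0) { throw "PL/SQL deployment failed." }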

Using CI to deploy database changes

I am using Entity Framework Code First Migrations to make changes to my local database when someone makes a change to the data model. I run Update-Database when I want to incorporate database changes checked in from another developer; the model changes are made and the Seed method is called, populating the database with the latest data.
I have just set up a CI environment with TeamCity to push code changes to the Latest build environment, the IAT environment and the UAT environment.
The problem is that when I run the site, an empty database is created but the Seed method is not run. How can I make the Seed method run, and apply any new database changes, when I deploy?
The Seed method should run every time a developer runs Update-Database on their own DB. For a shared server, you can run migrate.exe on the command line against each server environment.
This executable is available in the Tools subfolder when you download EF via NuGet. If you want, you can automate its run by deploying it along with your application. Then, set up a command step in your CI configuration to call migrate.exe with the appropriate flags.
You can find more information on migrate.exe on MSDN:
http://msdn.microsoft.com/en-us/data/jj618307.aspx
e.g., to migrate to the latest migration:
Migrate.exe MyMvcApplication.dll /startupConfigurationFile="..\web.config"
When running migrate.exe, the only mandatory parameter is the assembly (the assembly that contains the migrations you are trying to run), but it will use all convention-based settings if you do not specify the configuration file.
Best of luck!

TFS Build - Powershell or custom activity?

I understand I can write my own custom activity (in C#) to execute custom logic during the build process. My understanding is that PowerShell can also be used, but I am not sure where it fits in. I do understand PowerShell is used for executing command-line commands, but how and where would I use it to customize the build process?
Thanks
The decision whether to use PowerShell or a custom activity is, for me, based on who is responsible. If an activity is created by the build master (for TFS) and is therefore reusable by all the teams in the organization, I create a custom activity.
If the project team is responsible (for example, for a deployment script), then I use PowerShell. I create an argument where the team can enter the path of the PowerShell script that needs to be executed to deploy. The project team can optionally choose to enter a value in that argument, and can maintain their PowerShell deployment script themselves without the help of the build master.
So in short:
A reusable activity: custom activity
An activity for the team only: PowerShell
For me, PowerShell is the way to go. Here are my reasons why:
Script Independence:
You get script independence using this approach. Example: I have a number of scripts that run after the build (i.e. the compile) process has completed:
Instantiate Database
Deploy Database Code
Deploy Web Applications
Verify Deployment
Run Acceptance Tests
All of the above can be launched, debugged and tested independently without the need to queue a new build.
PowerShell is easy to work with:
Custom assemblies tend to add a lot of complexity and flakiness to the solution. Example: upgrading from TFS 2010 to TFS 2012 was very painful, because all of the build templates broke. We had to recompile all of our custom assemblies, and only one dev on the team knew how TFS Build was set up to run our custom activities. I have recently removed all custom assemblies from our build templates and am using PowerShell exclusively.
I have customised my process templates to call a user-defined PowerShell script after the TFS build has completed. I do this by using a paths argument in the build definition. This argument is simply an array of strings pointing to the scripts. I agree with Ewald, above, that TFS does not pass the build arguments to scripts. To solve this, in my workflow template I parse each script path in the string array and replace well-known tokens with the build arguments, e.g. #(BuildNumber), #(SourcesDirectory) etc. I find this to be a very easy and solid solution.
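The substitution happens inside the XAML workflow, but in PowerShell terms it amounts to something like this illustration (the token names follow the convention above; the sample values and script path are hypothetical):

# Expand well-known tokens in each user-supplied script path, then run it.
$tokens = @{
    "#(BuildNumber)"      = "20150101.1"       # placeholder build number
    "#(SourcesDirectory)" = "C:\Builds\1\Src"  # placeholder sources path
}
$scriptPaths = @("#(SourcesDirectory)\Deploy\deploy.ps1")  # hypothetical argument value
foreach ($path in $scriptPaths) {
    $expanded = $path
    foreach ($t in $tokens.GetEnumerator()) {
        $expanded = $expanded.Replace($t.Key, $t.Value)
    }
    & $expanded   # run the user-defined deployment script
}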
