Automatic update and check-in of AssemblyInfo.cs files occasionally causes partial build failure - TFS

We have our TFS 2008 build set up to check out all AssemblyInfo.cs files in the project, update them with AssemblyInfoTask, and then either undo the checkout or check them in, depending on whether the build passed. Unfortunately, when two builds are queued close together this results in a partially completed build, as the AssemblyInfo.cs files seem to be checked out at a version earlier than the previous check-in.
In order to get around this I thought that I could use the "Get" task to force the AssemblyInfo.cs files to the latest version before updating them, but this appears to have no effect. Any ideas?
<Target Name="AfterGet" Condition="'$(IsDesktopBuild)'!='true'">
<Message Text="SolutionRoot = $(SolutionRoot)" />
<Message Text="OutDir = $(OutDir)" />
<!-- Set the AssemblyInfoFiles items dynamically -->
<CreateItem Include="$(SolutionRoot)\Main\Source\InputApplicationSln\**\$(AssemblyInfoSpec)">
<Output ItemName="AssemblyInfoFiles" TaskParameter="Include" />
</CreateItem>
<Message Text="$(AssemblyInfoFiles)" />
<!-- When builds are queued up successively, it is possible for the next build to be set up before the AssemblyInfoSpec is checked in so we need to force
the latest these versions of these files to be got before a checkout -->
<Get Condition=" '$(SkipGet)'!='true' " TeamFoundationServerUrl="$(TeamFoundationServerUrl)" Workspace="$(WorkspaceName)" Filespec="$(AssemblyInfoSpec)" Recursive="$(RecursiveGet)" Force="$(ForceGet)" />
<Exec WorkingDirectory="$(SolutionRoot)\Main\Source\InputApplicationSln"
Command="$(TF) checkout /recursive $(AssemblyInfoSpec)"/>

Does your build re-write the AssemblyInfo files and then check them back in, or do you just modify the AssemblyInfo files locally? Personally I prefer the latter approach, as documented over at the TFSBuild recipes site:
http://tfsbuild.com/AssemblyVersioning%20.ashx
I've never actually sat down and checked, but if you do check in the AssemblyInfo files, could the following be happening, which might be causing your problems...
Request a build, current changeset = 42
Build 1 for changeset 42 starts running
Request a build, current changeset = 42 (still)
Build 2 for changeset 42 queued
Build 1 checks in new assemblyinfo files, current changeset = 43
Build 1 completes
Build 2 for changeset 42 starts, does a get of changeset 42, meaning the AssemblyInfo files are the old ones.
As I say, I'm not exactly sure when the changeset number is determined for the build - at the time of queuing or at the time of running. It would make more sense for it to be at the time of queuing, though.
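For illustration, here's a minimal sketch of that local-update approach. It assumes the FileUpdate task from the MSBuild Community Tasks project (not part of the stock TFS build targets) and reuses the AssemblyInfoFiles item from your target; the $(BuildNumber) property is a placeholder for however you derive the version:
<!-- Sketch only: nothing is checked out or checked in, so queued builds cannot collide -->
<Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />
<Target Name="UpdateAssemblyInfoLocally">
  <!-- Clear the read-only flag TFS leaves on files in the workspace -->
  <Exec WorkingDirectory="$(SolutionRoot)" Command="attrib -r AssemblyInfo.cs /s" />
  <!-- Stamp the version in place; the change never goes back to source control -->
  <FileUpdate Files="@(AssemblyInfoFiles)"
              Regex="AssemblyVersion\(&quot;[^&quot;]*&quot;\)"
              ReplacementText="AssemblyVersion(&quot;$(BuildNumber)&quot;)" />
</Target>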

Changing:
<Get Condition=" '$(SkipGet)'!='true' " TeamFoundationServerUrl="$(TeamFoundationServerUrl)" Workspace="$(WorkspaceName)" Filespec="$(AssemblyInfoSpec)" Recursive="$(RecursiveGet)" Force="$(ForceGet)" />
to:
<Get Condition=" '$(SkipGet)'!='true' " TeamFoundationServerUrl="$(TeamFoundationServerUrl)" Workspace="$(WorkspaceName)" Filespec="$(AssemblyInfoSpec)" Recursive="True" Force="True" />
forces the AssemblyInfo.cs files to be overwritten with the top of tree. It's been working so far, but it's more of a hack than an elegant solution.
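An equivalent and arguably more explicit variant (an untested sketch) is to skip the Get task and force the get through the command-line client just before the checkout, using the same $(TF) property as the Exec above:
<Exec WorkingDirectory="$(SolutionRoot)\Main\Source\InputApplicationSln"
      Command="$(TF) get /recursive /force /noprompt $(AssemblyInfoSpec)" />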


Publish an MVC application with a web.config for each production environment

I have an MVC application with a Dev, Staging, and Production environment. Dev and Staging are essentially the same thing (same VM, IIS, DB, etc.); however, Production is hosted on 4 VMs behind a load balancer. Each VM has its own DB. For example, the instance deployed to VM1 communicates with the PROD1 DB, VM2->PROD2, etc.
For deployment to Dev and Staging, I do a simple File System deployment from VS2013 to the VM using Debug/Release web.config transforms. For Production deployments, a SysAdmin will copy the bits deployed and tested in Staging to each Production VM. This is to ensure that what was tested and verified by QA in Staging is what we promote to Production -- I don't want to do another build between Staging and Production. Because of this, our SysAdmin is responsible for (with DevOps guidance) editing each web.config between Staging and Production. This basically consists of changing connectionString values from "Data Source=STAGINGDB" to "Data Source=PROD1" (and PROD2, PROD3, PROD4).
Ultimately, when I publish to Staging, I want to deploy my web.config using the standard Release web.config transform; however, alongside this file I also want to create and drop 4 additional files (web.config.PROD1, .PROD2, etc.). This will allow us to create scripts which ignore the existing web.config (with Staging settings) and copy/rename the appropriate .PROD config.
I am able to (sort of) achieve this with MSBuild:
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="Build">
    <TransformXml Source="Web.config" Transform="Web.PROD1.config" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="Web.PROD2.config" Destination="Web.config.PROD2" />
    ...
  </Target>
</Project>
My main issue with this approach is that I have to create 4 essentially redundant solution configurations to wire up to the Transform. Every setting is the same except the DB connectionString. Seems like there should be a more efficient way.
Can I execute individual transforms without solution configurations by simply calling the appropriate transform via MSBuild, like:
<add name="connectionString" connectionString="PROD1" xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
Should I be using another process altogether? I'd rather not use a 3rd-party NuGet solution if I can stay away from it. Should I be using a .wpp.targets file? XmlPoke?
My desired workflow
Right-click my MVC app and choose "Publish" (File System)
Let the Release transforms do their thing and generate the web.config. I have the basic configurations: Debug = Dev, Release = Staging.
Add a custom step that generates 4 additional web.config files
Package everything up and publish to the Staging server, so the web.config (with Staging settings) and the four web.config.PRODx files all end up on the VM
Everything I've read leads me to believe that I should be writing custom MSBuild steps, but I don't know what I should be doing (or how). Here's some pseudo-code:
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="Build">
    <TransformXml Source="Web.config" Transform="[Do-Basic-Transform-On-Connection-String]" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="[Do-Basic-Transform-On-Connection-String]" Destination="Web.config.PROD2" />
    <IncludeFilesInPublish>
      <FileToInclude>Web.config.PROD1</FileToInclude>
      <FileToInclude>Web.config.PROD2</FileToInclude>
    </IncludeFilesInPublish>
  </Target>
</Project>
Can I [Do-Basic-Transform-On-Connection-String] inline here without a solution configuration? I'll only be changing 2 connectionString values. If I need to create a solution config, that's fine... I just don't think it's totally necessary especially if I can do it inline. Maybe I'm wrong?
How do I accomplish the <IncludeFilesInPublish> bit so that whatever I generate gets packaged up during the Publish, so my Staging deployment has my release-candidate code and web.configs ready for promotion?
I think your question is twofold: 1) how do I pass environment variables or parameters (i.e. PROD1) into my xdt transformation file so I only have to use one transformation file? and 2) how do I get MSBuild to iterate over a set of known named items to produce outputs distinguished by each item in this set?
For the first part, the only reason I assert you might be asking this is because you said "I have to create 4 essentially redundant solution configurations to wire up to the Transform", so if your transform took "PROD1" as a parameter you could ideally use just one transform. But I'm not sure that you can do this without creating your own TransformXml task. The xdt transformation tooling is really limited. The nice thing about MSBuild, though, is that it's flexible enough that you could theoretically come up with your own transformation task that extends/subclasses or behaves like the one out of the box.
using System;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;
using Microsoft.Web.Publishing.Tasks;

namespace Thanks.IllWriteMyOwnTasks
{
    public class MyCustomTransformXml : TransformXml // no idea if you can do this
    {
        public override bool Execute()
        {
            // do stuff here,
            // maybe declare parameters that you can pass down to base.Execute()
            return true;
        }
    }
}
Then register your task in the project file in place of the stock one:
<!--<UsingTask TaskName="TransformXml" AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v11.0\Web\Microsoft.Web.Publishing.Tasks.dll" />-->
<UsingTask TaskName="MyCustomTransformXml" AssemblyFile="Thanks.IllWriteMyOwnTasks.dll" />
For the second part, I think you can use an ItemGroup with task batching:
<ItemGroup>
  <MyEnvironments Include="PROD1" />
  <MyEnvironments Include="PROD2" />
  <MyEnvironments Include="PROD3" />
  <MyEnvironments Include="PROD4" />
</ItemGroup>
<Target Name="BeforeBuild">
  <TransformXml Source="Web.config"
                Transform="Web.%(MyEnvironments.Identity).config"
                Destination="Web.config.%(MyEnvironments.Identity)" />
</Target>
I haven't tested this, but I think based on what I see over here it will automatically repeat the same task as it iterates over MyEnvironments in this example.
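To also get the generated files picked up when you publish (the <IncludeFilesInPublish> part of your pseudo-code), one untested possibility is to hook into the Web Publishing Pipeline's collect phase and add them to FilesForPackagingFromProject; the property and item names below are the real WPP extension points, but the wiring is a sketch:
<PropertyGroup>
  <PipelineCollectFilesPhaseDependsOn>
    AddProdConfigsToPackage;
    $(PipelineCollectFilesPhaseDependsOn);
  </PipelineCollectFilesPhaseDependsOn>
</PropertyGroup>
<Target Name="AddProdConfigsToPackage">
  <ItemGroup>
    <!-- Reuses the MyEnvironments items above to add each generated config to the package -->
    <FilesForPackagingFromProject Include="Web.config.%(MyEnvironments.Identity)">
      <DestinationRelativePath>Web.config.%(MyEnvironments.Identity)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>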
You can add the extra transform files to your solution without adding a new solution configuration, and run the transform as in your own example (except for the target; 'Build' didn't work for me, so I used 'BeforeBuild'):
<Project ToolsVersion="12.0" DefaultTargets="Build">
  ...
  <Target Name="BeforeBuild">
    <TransformXml Source="Web.config" Transform="Web.PROD1.config" Destination="Web.config.PROD1" />
    <TransformXml Source="Web.config" Transform="Web.PROD2.config" Destination="Web.config.PROD2" />
    ...
  </Target>
</Project>
If you are only changing a connection string, your transform file will be pretty minimal:
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="MyDB"
         connectionString="Data Source=PROD1SQLServer;Initial Catalog=MyReleaseDB;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/>
  </connectionStrings>
</configuration>
After adding the above and compiling, you will need to add the newly generated Web.config.PRODX files to your solution. After adding them, simply open the properties for each file and ensure that their build action is set to 'Content'. This means they are included in your deployments.
As the web.PRODX.config transform files are not part of a solution configuration, you could stick them in a folder to reduce clutter.
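If you prefer to skip the IDE step, the equivalent can be done by hand in the .csproj (a sketch; the file names assume the four environments above):
<!-- Same effect as setting the build action to Content in the IDE -->
<ItemGroup>
  <Content Include="Web.config.PROD1" />
  <Content Include="Web.config.PROD2" />
  <Content Include="Web.config.PROD3" />
  <Content Include="Web.config.PROD4" />
</ItemGroup>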

Retain changes to existing files when publishing

I'm stuck in a scenario where my published file is initially correct, but is later replaced by its original version. I suspect that the order of events is wrong, coupled with what's in the root directory.
Essentially, I have successfully set up my publishing environment so that it executes a custom command to create some JavaScript (which gets created outside of my project). Because a file in source control needs to reference this newly created JavaScript, I simply copy the file (MyControl.ascx) to a temp location (so it loses the read-only lock from TFS) and have the custom command update its references to the JS. Once this is done, I gather all the files (the custom JavaScript, as well as the edited MyControl.ascx in its temp location) and publish.
It publishes everything first, so I see the new JS as well as the updated MyControl.ascx, but a few minutes later the publish finishes and the control looks like it did in the root directory.
I think what's happening is it's just pushing out what's in the root (which includes MyControl.ascx) on top of my custom control (which is in another directory).
<PropertyGroup>
  <PipelineCollectFilesPhaseDependsOn>
    CustomCollectFiles;
    $(PipelineCollectFilesPhaseDependsOn);
  </PipelineCollectFilesPhaseDependsOn>
</PropertyGroup>
<Target Name="CustomCollectFiles">
  <Exec Command="MyCommand.bat" />
  <ItemGroup>
    <_BundledJS Include="$(MSBuildThisFileDirectory)..\..\Includes\javascript\*.js" />
    <FilesForPackagingFromProject Include="%(_BundledJS.Identity)">
      <DestinationRelativePath>Includes\javascript\%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
  <ItemGroup>
    <_UpdatedControl Include="$(MSBuildThisFileDirectory)..\..\TempArea\MyControl.ascx" />
    <FilesForPackagingFromProject Include="%(_UpdatedControl.Identity)">
      <DestinationRelativePath>Controls\%(Filename)%(Extension)</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>
So you can see above, I'm taking something that's in ..\..\TempArea\MyControl.ascx and pushing it out to Controls\%(Filename)%(Extension).
Any ideas how I can tell it to retain my ..\..\TempArea\MyControl.ascx without then overwriting it with the original MyControl.ascx from the project (Controls\MyControl.ascx)?
Thanks so much!
So I actually ended up figuring it out, and wanted to share:
<ItemGroup>
  <_UpdatedControl Include="$(MSBuildThisFileDirectory)..\..\TempArea\MyControl.ascx" />
  <FilesForPackagingFromProject Remove="Controls\MyControl.ascx" />
  <FilesForPackagingFromProject Include="%(_UpdatedControl.Identity)">
    <DestinationRelativePath>Controls\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
  </FilesForPackagingFromProject>
</ItemGroup>
So you can see here it actually removes the original control from the packaging list, then adds the updated copy back.
I also changed how the target is hooked in, so it runs after the collect phase:
<PropertyGroup>
  <OnAfterPipelineCollectFilesPhase>
    CustomCollectFiles;
    $(OnAfterPipelineCollectFilesPhase);
  </OnAfterPipelineCollectFilesPhase>
</PropertyGroup>
Let me know if you have any questions; I'd be happy to help!

How do I make MvcBuildViews continue to other views on error?

I've got a handy visual studio external tool shortcut to build the current project with MvcBuildViews enabled.
Arguments: /m:2 $(ProjectFileName) /p:MvcBuildViews=true
Command Line: C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe /m:2 "ProviderPortal.csproj" /p:MvcBuildViews=true
Anytime there is an error in a view, it stops at that one and reports it.
I want to know all the views that have errors, not just stop at the first one.
How would I tell the ASP.NET compiler to continue on errors? Or is there a way to get MSBuild to invoke the aspnet_compiler per view rather than in a single one-shot call?
You can try setting ContinueOnError to ErrorAndContinue:
<Target Name="MvcBuildViews" AfterTargets="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
<AspNetCompiler VirtualPath="temp" PhysicalPath="$(WebProjectOutputDir)" ContinueOnError="ErrorAndContinue" />
</Target>
http://msdn.microsoft.com/en-us/library/ms171484.aspx
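For reference, the target typically looks roughly like this in an MVC project file, with the property defaulted off so your external tool's /p:MvcBuildViews=true switch turns it on (a sketch based on the stock template; your project may differ):
<PropertyGroup>
  <MvcBuildViews Condition="'$(MvcBuildViews)'==''">false</MvcBuildViews>
</PropertyGroup>
<Target Name="MvcBuildViews" AfterTargets="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
  <!-- ErrorAndContinue logs the compile errors and lets the rest of the build run,
       while still marking the overall build as failed -->
  <AspNetCompiler VirtualPath="temp" PhysicalPath="$(WebProjectOutputDir)" ContinueOnError="ErrorAndContinue" />
</Target>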

Ant - conditional statement

I'm using Ant to build my app, and I want to have a single process for the dev/qa/prod versions of the app. I want to be able to specify the build target from the command line:
ant -Dbuildtarget=dev|qa|prod
and in build.xml check the value of buildtarget and set an application-specific base URL property based on the buildtarget specified by the user. I will subsequently set the correct runtime param using:
<copy file="pre.app.properties" tofile="./app.properties" overwrite="true">
  <filterset>
    <filter token="BASE_URL" value="${baseurl}" />
  </filterset>
</copy>
What I am stuck on is how to express this in build.xml:
if buildtarget=='dev'
baseurl="http://my_dev_url"
else if buildtarget=='qa'
baseurl="http://my_qa_url"
else if buildtarget=='prod'
baseurl="http://my_prod_url"
I've searched around, but this seems to be difficult to do in Ant. Any ideas?
When starting your Ant script with ant -Dbuildtarget=dev|qa|prod, it's as simple as:
<project>
  <property name="baseurl" value="http://my_${buildtarget}_url"/>
  <echo>$${baseurl} => ${baseurl}</echo>
</project>
The buildtarget property can be used as a dynamic part of the baseurl property. Afterwards, ${baseurl} can be used for further processing; running ant -Dbuildtarget=qa, for example, echoes ${baseurl} => http://my_qa_url.
Perhaps you should try using Ant's condition task?
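For completeness, a sketch of what that might look like with <condition> (untested; the URLs are the placeholders from the question). Since Ant properties are immutable, only the first matching condition takes effect:
<!-- Pick the base URL from the buildtarget property passed on the command line -->
<condition property="baseurl" value="http://my_dev_url">
  <equals arg1="${buildtarget}" arg2="dev"/>
</condition>
<condition property="baseurl" value="http://my_qa_url">
  <equals arg1="${buildtarget}" arg2="qa"/>
</condition>
<condition property="baseurl" value="http://my_prod_url">
  <equals arg1="${buildtarget}" arg2="prod"/>
</condition>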

What are the custom targets you all run when using ant to build project?

I am thinking of running these custom targets to find out more about my project's build status:
- Jalopy
- JDepend
- CVS tagdiff report
- custom task for NoUnit
- generate UML diagram (ESS-Model)
What are your views?
I think it's a great idea and do this myself; that way I never forget to run the reports.
I also keep the reports for a decent amount of time and eventually create a spreadsheet of "progress".
In your main Ant build, call another target to do "whatever"; in my case the reporting targets live in a separate JDepend.xml file.
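The hand-off from the main build might look something like this (a sketch; the antfile and target names match the snippet below):
<!-- In the main build.xml: run the reporting targets from JDepend.xml -->
<target name="reports">
  <ant antfile="JDepend.xml" target="doJDepend"/>
</target>
JDepend.xml itself contains the actual reporting targets: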
<target name="statsAll">
<!-- master file that describes where everything is -->
<property file="./ant/ant-global.properties" prefix="ant-global" />
<tstamp>
<format property="gen.time" pattern="yyyyMMdd_hh"/>
</tstamp>
<echo message="LOG:./ant/logs/jdepend.${version.FILETAG}.${gen.time}.rpt"/>
<!-- generate stats to see if we're improving -->
<jdepend
outputfile="./ant/logs/jdepend.${version.FILETAG}.${gen.time}.rpt" >
<exclude name="java.*"/>
<exclude name="javax.*"/>
<classespath>
<pathelement location="./jar" />
</classespath>
<classpath location="./jar" />
</jdepend>
</target>
<target name="doJDepend" depends="getVersion,statsAll">
<echo message="FTP'ing report"/>
<ftp verbose="yes" passive="yes" depends="yes"
remotedir="/videojet/metrics" server="xxxxx"
userid="xxxx" password="xxxxx"
binary="no"
systemTypeKey="UNIX">
<fileset dir="./ant/logs/" casesensitive="no">
<include name="**/jdepend.${version.FILETAG}*.rpt"/>
<exclude name="**/*.txt"/>
</fileset>
</ftp>
</target>
I second the 'good idea' part, although for a project of reasonable size you might want to make it part of an automated build on one of the CI servers (Bamboo, Continuum).
You might also consider a code coverage tool to see how your test coverage is going.
This will ensure the reports get run on a regular basis, could give you somewhere to publish them, and won't slow down the developers' quick-turnaround development cycle.
I also think some reports about your project are a good idea. My template project for an Ant build script (Antiplate) currently has the following reports: JUnit report, Emma report, PMD, CPD and Checkstyle. I'm thinking about including a JDepend report.
At work we use these templates with Hudson as our continuous integration system. Hudson creates wonderful graphs for these reports and shows how the measures change from build to build.
