We have an automated build and QA process for our software, using TFS/Team Build and MSBuild, and we want to be able to know (for audit purposes) whether a component has gone through that process or not.
For example, if a library is installed on a user's machine, I'd like to be able to inspect it in some way to tell that it went through the build. In particular, I want to be able to distinguish it from components built directly on a developer's machine, and then manually installed.
What is the best way to do this? Code signing as part of the build process seems closest to these requirements, but presumably this would not cover any 3rd-party libraries that might be used? I have also read about the ILMerge tool for merging all assemblies into one, but I don't know enough to work out whether the merged assembly can then be signed or not.
I'm sure we're not the first people to have this requirement, so I'm casting around for any ideas or hints from others who might have done such a thing.
Thanks!
Our developer builds are set to keep the version at "0.0.0.0", but our build server marks the build with a pre-configured version and an automatically generated build string, e.g. "1.0.3.xxx". Doesn't your build server allow for this?
Your build process should be updating each of your projects' AssemblyInfo.cs files (or a globally linked equivalent). You can do this with the TFS changeset number, so, as the previous poster indicated, you end up with a version on each DLL of 1.0.changeset.buildno or something similar. This is easy to do in MSBuild.
You could set the values in each AssemblyInfo file in source control to something obvious, like 0 or 999.
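As a minimal sketch of one way to do this in MSBuild (ChangesetNumber and BuildNumber are hypothetical properties your build server would pass in, and the checked-in AssemblyInfo.cs must not also define AssemblyFileVersion, or the compiler will complain about a duplicate attribute):

<!-- Hypothetical target: generate a Version.cs stamped with the changeset and
     build number, and compile it into the assembly alongside AssemblyInfo.cs -->
<Target Name="BeforeBuild" Condition="'$(ChangesetNumber)' != ''">
  <WriteLinesToFile File="$(IntermediateOutputPath)Version.cs"
                    Lines="[assembly: System.Reflection.AssemblyFileVersion(&quot;1.0.$(ChangesetNumber).$(BuildNumber)&quot;)]"
                    Overwrite="true" />
  <ItemGroup>
    <!-- pull the generated file into the compile -->
    <Compile Include="$(IntermediateOutputPath)Version.cs" />
  </ItemGroup>
</Target>

On a developer machine the condition is false, so local builds keep the obvious placeholder version.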
A lot of what you're asking about is process and training as well, though.
If you're using installers or zips to package your deliverables, then you can also label them with the build number as part of your build process.
And if you have the changeset number, you have the link from DLL back to code, so it's traceable, coupled with the links to third-party DLL references as defined in each csproj.
I've been tasked with setting up a new Team Foundation/Build server at my company, with which we'll be starting a new project. Nobody here currently has experience with TFS, so I'm learning all of this on my own. Everything is working so far: the server has been set up, the repository and team project have been created, the build server has been created, and I've created a simple hello-world application to verify that source control and the Continuous Integration builds (on the build server) run properly.
However, I'm having a problem setting up the automatic versioning. I've installed the TfsVersioning project, and it's working fine; I'm able to define a format for my assembly versions. I haven't yet decided what format I'll use; probably something like Major.Minor.Changeset.Revision (I'm aware of the potential problem regarding using the changeset number in the assembly version, so I may decide to switch to Major.Minor.Julian.Revision before we begin development).
The problem:
I don't want assemblies to have new file versions if their source code has NOT changed since the last build. With a Continuous Integration build this isn't a problem, as the build server will only grab the source files that have changed, causing an incremental build which produces only updated modules; the existing unchanged modules won't be rebuilt, so their versions will remain unchanged.
If I set up a nightly build, I'll want to clean the workspace and perform a Build-All. However, this means that ALL assemblies will get new versions (assuming the Assembly File Version includes the build number).
A solution?
This has prompted me to consider using the latest changeset number in the Assembly File Version. This way, if nothing has been committed between two successive Build-Alls, the versions won't be incremented. However, this would mean that a change and commit to a single file would force a version increment on ALL assemblies.
I'm looking for one of two things:
A way to only increment Assembly Version Numbers if their source/dependencies have changed since the last build. Successive Build-Alls should not cause changes in version numbers.
OR
A way for testers and non-developers to be able to tell version W.X.Y.Z and version W.X.Y.Z+1 of assembly 'Foo' are identical, even though they have differing file versions.
I've probably read about 20 articles on the subject, and nobody (except this guy) seems to address the issue. If what I'm asking for isn't common practice in Team Foundation ALM, how do I address the second bullet point above?
Thanks for your time!
This is something I did in the past. The solution has two critical points:
You must use an incremental build, i.e. Clean Workspace = None
The change to AssemblyInfo.cs must be computed per project.
The latter is the more complex part, and I will just sketch the solution here.
In the custom MSBuild properties, use CustomAfterMicrosoftCommonTargets to inject a hook into the normal Visual Studio compile:
/property:CustomAfterMicrosoftCommonTargets=custom.proj
Also forward a value for the version:
/property:BuildNumber=1.2.3.4
In custom.proj, redefine the BeforeCompile target to something like this:
<Target Name="BeforeCompile"
Inputs="$(MSBuildAllProjects);
#(Compile);
#(_CoreCompileResourceInputs);
$(ApplicationIcon);
$(AssemblyOriginatorKeyFile);
#(ReferencePath);
#(CompiledLicenseFile);
#(EmbeddedDocumentation);
$(Win32Resource);
$(Win32Manifest);
#(CustomAdditionalCompileInputs)"
Outputs="#(DocFileItem);
#(IntermediateAssembly);
#(_DebugSymbolsIntermediatePath);
$(NonExistentFile);
#(CustomAdditionalCompileOutputs)"
Condition="'$(BuildNumber)'!=''">
<Message Text="*TRACE* BuildNumber: $(BuildNumber)"/>
<MyTasksThatReplaceAssemblyVersion
BuildNumber="$(BuildNumber)"
File="$(MSBuildProjectDirectory)\Properties\AssemblyInfo.cs"/>
</Target>
You need to have a task for replacing the AssemblyFileVersion in the AssemblyInfo.cs source. MSBuild Extension Pack has an AssemblyInfo task for this purpose.
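If you would rather not take a dependency on the Extension Pack, here is a minimal self-contained sketch of such a task using MSBuild 4.0's inline CodeTaskFactory (the task name and the regex are my own; adjust them to your AssemblyInfo layout):

<UsingTask TaskName="StampFileVersion" TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <AssemblyInfoFile Required="true" />
    <BuildNumber Required="true" />
  </ParameterGroup>
  <Task>
    <Using Namespace="System.IO" />
    <Using Namespace="System.Text.RegularExpressions" />
    <Code Type="Fragment" Language="cs"><![CDATA[
      // Rewrite AssemblyFileVersion("...") in place with the number passed by the build.
      string text = File.ReadAllText(AssemblyInfoFile);
      text = Regex.Replace(text,
          "AssemblyFileVersion\\(\"[^\"]*\"\\)",
          "AssemblyFileVersion(\"" + BuildNumber + "\")");
      File.WriteAllText(AssemblyInfoFile, text);
    ]]></Code>
  </Task>
</UsingTask>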
I posted the full details at my blog here, here and here.
Our project group stored the binary files of the project we are working on in an SVN repository for over a year. In the end our repository grew out of control; taking backups of the SVN repo became impossible at one point, since each binary that is checked in is around 20 MB.
Now we have switched to TFS. We are not responsible for backing the repository up; our IT team takes care of it, and we have more network and storage capacity for backups because of that. But we still want to decide what to do with the binaries. As far as I know TFS stores deltas, but for binary files the deltas will be huge, so we might end up reaching our disk-space quota one day. I would like to plan things better from the start; I don't want to get caught in a bad situation when it's too late to fix the problem.
I would prefer not to keep builds in source control, but our project group insists on keeping a copy of every binary for reproducing problems that we see in the production system. I can't get them to fetch the source code from TFS, build it and create the binary, because according to them it is not straightforward.
Does TFS offer a better build versioning method? If someone can share some insight I'd really be grateful.
As a general rule you should not be storing build output in TFS. Occasionally you may want to store binaries for common libraries used by many applications, but tools such as NuGet get around that.
Build output has a few phases in its life, and each phase should be stored in a separate place, e.g.:
Build output: When code is built (by TFS / Jenkins / Hudson etc.) the output is stored in a drop location. This storage should be considered volatile as you'll be producing a lot of builds, many of which will be discarded.
Builds that have been passed to testers: these are builds that have passed some very basic QA, e.g. the code compiles, static code analysis tools are happy, unit tests pass. Once a build has been deemed good enough to give to test, it should be moved from the drop location to another area. This could be a network share (non-production, as the build can be reproduced). There may be a number of builds that get promoted during the lifetime of a project, and you will want to keep track of which versions the testers are using in each environment.
Builds that have passed test and are in production: your test team deem the build to be of a high enough quality to ship. As part of your go-live process, you should take the build that has been signed off by test and store it in a third location. In ITIL speak this is a Definitive Media Library. This can be a simple file share, but it should be considered "production" and have the same backup and resilience criteria as any other production system.
The DML is the place where you store the binaries that are in production (and associated configuration items such as install instructions, symbol files etc.) The tool producing the build should also have labelled the source in TFS so that you can work out what code was used to produce the binary. Your branching strategy will also help with being able to connect the binary to the code.
It's also a good idea to have a "live-like" environment, separate from your regular dev and test environments. As the name suggests, it contains only the code that has been released to production. This enables you to quickly reproduce bugs seen in production.
Two methods that may help you:
Use the Team Foundation Build system. One of its advantages is that you can set up retention policies for finished builds. For example, you can tell TFS to store the 10 latest successful builds and the two latest failed ones. You can also tell TFS to keep certain builds (e.g. "production builds"/final releases) indefinitely. These binary folders can of course also be backed up externally, if needed.
Use a different collection for your binaries, with another (less frequent) backup schedule. TFS backups operate on whole collections, but by separating out data that doesn't change as frequently as the source, you can lower the backup cost. This of course depends on how frequently you are required to have the binaries backed up.
You might want to look into creating build definitions in TFS to give your project group an easy 'one button' push to grab the source code from a particular branch and then build it and drop it to a location. That way they get to have their binaries, and you don't have to source control them.
If you are using a branching strategy where you create Release or RTM branches when you push something to production, then you can point your build definitions at those branches and they can manually trigger them from the TFS portal or from within Visual Studio.
Our team is sharing a Jenkins server with other teams, and this currently means that we are sharing the same OS-level build-user account. The different teams' OS-level build-user settings (Maven settings, bash settings, user-level Ant libraries, etc.) have collided a few times: "fixing" the settings for one team's jobs inadvertently "breaks" another team's jobs. The easiest solution that occurs to me is giving each team its own OS-level build-user account with which to execute its Jenkins jobs, but I cannot find a way to do this.
I have checked with Google, and also here
https://wiki.jenkins-ci.org/display/JENKINS/Use+Jenkins
and here
https://wiki.jenkins-ci.org/display/JENKINS/Plugins
to no avail.
Is there a way to do this? If not, can you recommend any best practices for segregating sets of builds from one another?
Maven Specific
Two options come to mind:
Add additional installations of Maven to your Jenkins global configuration, each using its own home directory and thus its own settings files. This allows you to use totally different versions of Maven, selected based on job requirements (you are given the option of selecting which "version" of Maven to use on the job itself).
Similar to (1), but specify particular settings configurations using Maven command-line arguments. It's a little less "obvious", but may be quicker to implement.
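For option 2, each job's Maven goals can point at a per-team settings file with Maven's standard -s/--settings flag. A minimal per-team settings.xml might look like this (the paths are hypothetical):

<!-- Hypothetical /var/jenkins/teams/team-a/settings.xml, selected on a job with:
     mvn -s /var/jenkins/teams/team-a/settings.xml clean install -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- keep each team's artifacts in its own local repository -->
  <localRepository>/var/jenkins/teams/team-a/.m2/repository</localRepository>
</settings>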
Multi-slave
You could possibly make use of multiple slaves on each machine. This increases the overhead of the builds quite significantly, and the implementation is such that you'd have multiple user accounts on a machine, each set up as needed, and then one slave instance for each user.
I'm not sure these solutions will totally answer your problem. I'll have a think and see if anything else pops into mind, but this might give you some starting points.
Key builds to a specific team directory that contains that team's settings. For example, provide a parameter 'TEAM' to every build, set its default value to the appropriate team name, and use that parameter as a key to a directory that contains the team's settings (so instead of using ${HOME} as you do now, you'll use something like ${TEAM_SETTINGS}/${TEAM}).
You can set per-job users (who have access to, and can build, a particular job).
Under "Manage Jenkins" > "Configure System":
Click on Enable Security
Check Project-based Matrix Authorization Strategy
However, I do not think there is a "per-build" option for a single job.
If you have the same project that you are sharing between teams, you could (and probably should) create two jobs for this project, and have different libraries/scripts be used in each.
You could also parameterize the build (on the job page, "Configure" > "This build is parameterized") and supply the library versions, etc. via string parameters.
You could also use a parameter for the team's name, and have your build script select libraries based on it.
For example, have a parameter called "TEAM" with choices TEAM_A and TEAM_B, and in your script have:
if [ $TEAM == "TEAM_A" ]
then
ANT_HOME=/opt/ant/libA
else
ANT_HOME=/opt/ant/libB
fi
Have you considered sourcing your settings? On Linux, you could do this by saving your OS settings (paths, etc.) in a script file and running source /path/to/settings/file; on Windows it would be call /path/to/settings/batch/file.
Can you give examples of the OS-level settings that you would require a per-build user for?
Your problem is a common one.
Whenever something nonstandard is installed on a build server, something will break for someone.
The only solutions I know are
Set up a separate build slave for each team or product. Then they can install whatever they want on the build slave and any mess they create is all their own fault.
Any dependencies required by a job need to come with the job. This is my preferred way of working. For example, if a job needs a library or a tool, the library or tool is not installed on the build server but checked into the source tree, and the build uses it from there (see the sketch below).
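In a .NET/MSBuild shop, for instance, the "dependency comes with the job" approach can be as simple as resolving references from a lib folder that is checked in next to the code (the names and paths here are hypothetical):

<!-- In the .csproj: resolve the reference from the source tree, not from the machine -->
<ItemGroup>
  <Reference Include="ThirdParty.Widget">
    <HintPath>..\lib\ThirdParty.Widget.dll</HintPath>
  </Reference>
</ItemGroup>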
Sometimes the latter way is more work: you need to set up the tool or library so that it works when it is installed in the source tree. Some tools have hard-coded paths and will not work this way. In that case you can check in the source of the tool and compile the tool during the build.
An even better solution is to set up separate Jenkins jobs for all the tools and libraries; the jobs that need a library or tool then download it from those jobs.
This way you can control all your dependencies and different jobs do not conflict when e.g. one needs an older version of a library and one a newer version. And if someone upgrades the library, it is immediately visible in the version control who did what.
I'm just getting started with the Team Build functionality, and I'm finding the sheer number of things required to do something pretty simple a bit overwhelming. My setup at the moment is a solution with a web app, an assembly app and a test app. The web app has a PublishProfile set up which publishes via the filesystem.
I have a TFS build definition set up which currently builds the entire solution nightly and drops it onto a network share as a backup of old builds. All I want to do now is have the PublishProfile I've already set up publish the web app for me. I'm sure this is really simple, but I've been playing with MSBuild commands for a full day now with no luck. Help!
Unfortunately, sharing of the publish profile is not supported or implemented in MSBuild; the logic to publish from the profile is contained in VS itself. Fortunately the profile doesn't contain much information, so there are ways to achieve what you are looking for. Our targets do not specifically support the exact same steps as followed by the publish dialog, but to achieve the same result from Team Build you have two choices; I will outline both here.
When you set up your Team Build definition, in order to deploy you need to pass in some values for the MSBuild Arguments of the build process.
Option 1:
Pass in the following arguments:
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
Let me explain these parameters a bit and show you the result, then explain the next option.
DeployOnBuild=true: This tells the project to execute the target(s) defined in the DeployTarget property.
DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder: This specifies which target to run when DeployOnBuild is true.
PackageTempRootDir="\\sayedha-w500\BuildDrops\Publish": This specifies the location where the files are written before they are packaged.
AutoParameterizationWebConfigConnectionStrings=false: This tells the Web Publishing Pipeline (WPP) to not parameterize the connection strings in the web.config file. If you do not specify this then your connection string values will be replaced with placeholders like $(ReplacableToken_dummyConStr-Web.config Connection String_0)
After you do this you can kick off a build; then, inside the PackageTempRootDir location, you will find a PackageTmp folder, and this contains the content that you are looking for.
Option 2:
With the previous option you probably noticed that it creates a folder named PackageTmp; if you do not want that, you can use the following arguments instead.
/p:DeployOnBuild=true;DeployTarget=PipelinePreDeployCopyAllFilesToOneFolder;_PackageTempDir="\\sayedha-w500\BuildDrops\Publish";AutoParameterizationWebConfigConnectionStrings=false
The difference here is that instead of PackageTempRootDir you pass in _PackageTempDir. The reason I don't suggest this to begin with is that MSBuild properties starting with _ signify that the property is essentially "internal", in the sense that in a future version it may mean something else or not exist at all. So use it at your own risk.
Option 3:
With all that said, you could just use the build to package your web. If you want to do this, use the following arguments:
/p:DeployOnBuild=true;DeployTarget=Package
When you do this, in the drop folder for your build you will find the _PublishedWebsites folder as you normally would; inside of that there will be a folder {ProjectName}_Package, where {ProjectName} is the name of the project. This folder will contain the package: the .cmd file, the parameters file and a couple of others. You can use these files to deploy your web.
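For instance, assuming the project is named MyProject, the generated script can be run from that package folder like this (/T performs a trial run, /Y performs the actual deployment):

MyProject.deploy.cmd /T
MyProject.deploy.cmd /Y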
I hope that wasn't information overload.
The ability to publish web sites, configure IIS and push schema changes for the DEV -> QA -> RELEASE cycle has required either custom configuration to imitate publishing, or custom code where IIS settings are involved.
As of Visual Studio 2013.2, Microsoft has added a third-party product that manages deployment of web sites, configuration changes and database deployment with Windows Workflow, and it would be the recommended solution for automating deployment from TFS Build.
More information can be found here:
http://www.visualstudio.com/en-us/explore/release-management-vs.aspx
You can use the Publish/Deploy feature in Visual Studio 2010.
See http://www.ewaldhofman.nl/post/2010/04/12/Auto-deployment-of-my-web-application-with-Team-Build-2010-to-add-Interactive-Testing.aspx for more information
I'm trying to work with MSBuild and TFS.
I've managed to create my own MSBuild script that works great from the command line. The script works with .csproj files, and compiles, obfuscates, signs and copies everything that's needed.
However, looking at the documentation for TFS and Team Build, it appears to expect solutions as the "input" to the script.
Also, I haven't found an easy/intuitive way of performing a "Get Latest Version" from TFS as part of the script. I'm assuming that Team Build automatically does a "Get Latest" on the solutions it's supposed to compile, but again, I don't (want to) work with solutions...
Any insights? any pointers? any links?
Team Build defines about 25 targets of its own. When you queue a Team Build, they are automatically run for you in the predefined order listed on MSDN. Don't modify this process. Instead, simply set a couple of the properties that determine how the tasks behave. For example, set <IncrementalGet> to "true" if you want ordinary Get behavior, or "false" if you want something closer to tf get /force.
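As a minimal sketch, such properties go in the PropertyGroup of your TFSBuild.proj (property name as documented for TFS 2008 Team Build):

<PropertyGroup>
  <!-- ordinary incremental Get instead of a forced full Get -->
  <IncrementalGet>true</IncrementalGet>
</PropertyGroup>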
As far as running your own MSBuild script: again, this shouldn't be necessary. Start with the TFSBuild.proj file that's provided for you. It should only require minimal modifications to do everything you describe. Call your obfuscation and signing code by overriding a target like AfterCompile or AfterTest. Put your auto-deploy code in AfterDropBuild. Etc.
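For example, a sketch of what an AfterCompile override in TFSBuild.proj might look like; the obfuscator and the signing step here are placeholders, not part of Team Build:

<!-- Runs after Team Build compiles the solution(s);
     $(BinariesRoot) is where Team Build places compiler output -->
<Target Name="AfterCompile">
  <Exec Command="obfuscator.exe &quot;$(BinariesRoot)\Release\MyApp.dll&quot;" />
  <Exec Command="signtool sign /a &quot;$(BinariesRoot)\Release\MyApp.dll&quot;" />
</Target>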
Even really complex scenarios are possible if you refactor appropriately. See past answers #1 #2.
As far as the actual compile, you're right that Team Build operates on solutions. I recommend giving it what it wants. I'll be the first to admit that *.sln files are ugly and largely undocumented, but at least you're offloading the work to a well tested & supported product.
If you really wanted to, you could give it a blank/dummy solution and override the CoreCompile target with your custom compiler logic. But this is really asking for trouble. At bare minimum, you lose all of Team Build's flexibility WRT building multiple platforms and flavors. More practically, you're bound to spend a lot of time debugging something that's designed to "just work" -- and there are no good MSBuild debuggers yet (that I know of). Not worth it, IMO.
BTW, the solution files do not affect the Get process. As you can see in the 1st link, the Get is done very early on, long before Team Build even reads the solution file(s). Apart from a few options like <IncrementalGet>, this is not controlled from MSBuild at all -- in particular, the paths to be downloaded are determined by the workspace mappings associated with the build definition. I.e., they are stored in the Team Build SQL database, not the filesystem, and managed with tools (like Team Explorer) that call the TFS webservice API.