Paket + FAKE + swapping dependencies in CI tool - f#

I'm messing about with FAKE and Paket (on F#) and Jenkins; I'm not really sure I know what I'm doing yet, but I know what I WANT to do.
The short description: I want the build server to build a whole family of related services against a referenced package, but the package comes in different flavours (which share the same basic namespace/module names).
The long description:
I have a family of services that sit on top of an external API, i.e. they all reference some external package and access it through modules etc.
e.g.
ServiceA.fsproj
...
let f (x : ExternalApi.Foo) = ....
---------------
ServiceB.fsproj
...
let g (x : ExternalApi.Foo) = ....
The developer will probably develop against the most common flavour, let's say ExternalApiVanilla.
The developer will be using Paket and FAKE as build tools, and Jenkins.
When the code is checked in, though, I want the build server to attempt to build it against the vanilla flavour... but also against chocolate, strawberry and banana.
The flavours are not "versions" in the sense of a version number; they are distinct products with their own NuGet packages. So I think (somehow) I want to parameterise a Jenkins folder containing all the jobs with the name of the API package, pass that into the build script, and then get the build script to swap out whatever the engineer has referenced and reference the parameter instead.
Of course some compilations will fail; we have to develop different variants of services to handle some of the API variants, but 90% of our stuff works on all of them. We just need an automated way to check the build, and then create new variants of services and jobs to handle the exceptions.
As an aside, we are doing some things with C# and Cake/NuGet, but controlling the versioning by passing the NuGet folder in and forcing the build to find specific versions of one flavour... I understand this, though I wouldn't be able to write it, but I want to go one step further and replace the reference itself with a different one.
---------------
I'll try looking at the paket.dependencies/paket.references files in the build script, removing the existing reference and adding the Jenkins-defined one from a shell via Paket, and see what happens. I don't especially like it; I'm dependent on the format of these files, and I was hoping this would be more mainstream.
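For what it's worth, a minimal sketch of that idea as an F# step in the FAKE script, assuming the Jenkins job exposes the flavour in an environment variable called API_FLAVOUR and the checked-in code references ExternalApiVanilla (both names are placeholders). It just rewrites paket.dependencies and every paket.references file as text, after which you would run "paket install":

open System
open System.IO

// placeholder names: the package the developer checked in against, and the
// flavour the Jenkins job wants to build against (passed as an env variable)
let vanilla = "ExternalApiVanilla"
let flavour = Environment.GetEnvironmentVariable "API_FLAVOUR"

// naive line-based swap; this is exactly the "dependent on the file format"
// concern mentioned above
let swapPackage (file : string) =
    let lines =
        File.ReadAllLines file
        |> Array.map (fun line -> line.Replace(vanilla, flavour))
    File.WriteAllLines(file, lines)

swapPackage "paket.dependencies"
Directory.GetFiles(".", "paket.references", SearchOption.AllDirectories)
|> Array.iter swapPackage
// then shell out to "paket install" so the lock file and project references follow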

I have solved this, at least in the context of Cake + NuGet (and the same approach will apply here), by simply search-and-replacing the package reference (using XDocument) in the Cake script, driven by a reference parameter set up in the job parameters.
I'll now implement it in the FAKE version of this build, though I may simply drop Paket altogether.
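For reference, that search-and-replace transliterated to F# looks roughly like the sketch below. It assumes the reference lives in a classic packages.config and that the flavour arrives in an API_FLAVOUR job parameter; both the file path and the parameter name are assumptions, not part of the original answer:

open System
open System.Xml.Linq

// swap the NuGet package id in a packages.config-style file
let swapNuGetReference (configPath : string) (oldId : string) (newId : string) =
    let doc = XDocument.Load configPath
    doc.Descendants(XName.Get "package")
    |> Seq.filter (fun e ->
           match e.Attribute(XName.Get "id") with
           | null -> false
           | attr -> attr.Value = oldId)
    |> Seq.toList
    |> List.iter (fun e -> e.SetAttributeValue(XName.Get "id", newId))
    doc.Save configPath

swapNuGetReference
    "ServiceA/packages.config"
    "ExternalApiVanilla"
    (Environment.GetEnvironmentVariable "API_FLAVOUR")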

Related

Call Build vNext task directly

Build vNext tasks are an awesome improvement over the previous build process. One downside though is that I can't make some tasks conditional. I can create an additional build for every combination, but this clearly scales badly and causes lots of additional work if we have to change some other part of the build.
Instead I'd prefer being able to write my own PowerShell tasks that can call existing build tasks. There is at least one downside to this (if no build asks specifically for the vso-task the build agent won't download it), but considering we are using on-premise TFS and build agents I can live with this.
I tried to do something like the following:
$path = get-item "$env:AGENT_HOMEDIRECTORY\Tasks\NuGetPackager\0.1.56\NuGetPackager.ps1"
& "$path" -searchPattern $searchPattern -outputDir "$packageFolder" -configurationToPackage $configurationToPackage -nugetAdditionalArgs "$nugetAdditionalArgs -version $nugetVersion"
Sadly this causes the following error:
2016-04-12T09:50:22.3652811Z ##[error]import-module : Could not load file or assembly 'Microsoft.TeamFoundation.DistributedTask.Agent.Interfaces,
2016-04-12T09:50:22.3652811Z ##[error]Version=14.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find
2016-04-12T09:50:22.3652811Z ##[error]the file specified.
2016-04-12T09:50:22.3652811Z ##[error]At C:\Agent1\Tasks\NuGetPackager\0.1.56\NuGetPackager.ps1:19 char:1
Now one solution I found on the web indicates that I could add the DLLs it is looking for to the GAC, but I really, really don't want to. Also, the tasks clearly work just fine when called from TFS directly, so what configuration am I missing?
I tried adding the folder containing the DLLs to the path, and even calling SetDllDirectory explicitly in the PowerShell, but neither of those helps.
Environment: Windows Server 2012 R2 on both build agent and TFS server. TFS 2015 Update 1.
The PowerShell task host that's used by the build agent for 2015 RTM up to Update 2 is a custom host which does creative things to resolve assemblies and handle input/output. These tasks can't be called from outside the agent.
Plus, quite a few build tasks are implemented using Node, so you'll have to detect which one is which and invoke them accordingly.
The build tasks are being migrated to a new vsts-task-lib, which will support out-of-agent invocation. These would allow exactly what you want.
In the meantime you could take the existing tasks (they're a simple manifest plus script in most cases), add one string parameter to the task in which you stick a variable, and then treat that as the condition. You'd need to replace all the standard tasks and then push them again. If you keep the Extension ID and the Task GUID the same, they'll act as in-place replacements. This is probably the easiest way to do what you want without having to perform all kinds of hacks that take away the task's UI. Just set the version number to something ridiculously higher, like 100.0.1.83; that way you'll always end up using your version.
Note: the new builds are meant to be repeatable, in that calling the same build multiple times always yields the same results. Conditional actions can be captured in custom PowerShell scripts that are stored in source control; these can be executed as part of the workflow.

TFS Build Copy to Versioned Folder

I'm currently looking at a TFS build server setup, and I'm trying to set up a process whereby a build template builds to a folder based on the version number of a .NET assembly that's part of the build (as per the AssemblyInfo.cs file). I've got it building to the standard-looking folder ("Release_20130502.1"), but that's not exactly useful in 4 months' time when we want to find the build for the v1.1.0 release.
Basically I want to make a special build template which will create a major release, and I'd like the folders it makes to be more noticeable as versions rather than timestamps. I'm also hoping to automatically label the release as that version. I know how to copy the files and make labels, but I'm not sure how to get at the version numbers.
I'd also love if I could get this into the Build Name recorded in TFS somehow but I suspect that might be a bit optimistic.
Does anyone have any idea how to do this (Or alternatively any other technique that'll get me easily recognizable release version builds)?
Take a look at the build number format property for build definitions. This is used to generate the build number and, in turn, the build folder during the build and the label in source control (if it's enabled); you can modify it manually to pass the desired build number. The mentioned drop folder can be given manually as well. And you can always copy the folder after the build by hand.
You can use the revision variable to create something that works similarly to what you want; the revision number gets incremented if there is already a build with the same name in the system.
Format: Release v1.$(Rev).0 - this would give you Release v1.1.0, Release v1.2.0, ... on each build
You will have to customize the build definition with a custom activity for your stated goal to work without manual interaction. However, with the details you have provided, this approach has issues: each project has its own assembly info, so which one do you use, and what if they differ? This should get you started:
http://www.ewaldhofman.nl/post/2010/04/20/Customize-Team-Build-2010-e28093-Part-1-Introduction.aspx
You can also take a look at these; you may find something useful:
http://tfsbuildextensions.codeplex.com/
It all comes down to how often these builds take place; if it's a week or more between builds, then doing it by hand is a perfectly valid approach in my book.
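On the "how do I get at the version numbers" part: purely as an illustration (sketched in F#, like the main question on this page; a real TFS 2010 build would do this inside a custom workflow activity), here is a regex pull of AssemblyVersion out of a chosen AssemblyInfo.cs, which you could then feed into the drop-folder name or the label. The path below is a placeholder:

open System.IO
open System.Text.RegularExpressions

// returns e.g. Some "1.1.0.0" for [assembly: AssemblyVersion("1.1.0.0")]
let assemblyVersion (assemblyInfoPath : string) =
    let text = File.ReadAllText assemblyInfoPath
    let m = Regex.Match(text, @"AssemblyVersion\(""([^""]+)""\)")
    if m.Success then Some m.Groups.[1].Value else None

// placeholder path; pick the project whose version should drive the folder name
assemblyVersion "ProjectA/Properties/AssemblyInfo.cs"
|> Option.iter (fun v -> printfn "Release_v%s" v)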

Can I specify the OS-level build user on a per-job basis?

Our team is sharing a Jenkins server with other teams, and this currently means that we are sharing the same OS-level build-user account. The different teams' OS-level build-user settings (Maven settings, bash settings, user-level Ant libraries, etc.) have collided a few times: "fixing" the settings for one team's jobs inadvertently "breaks" another team's jobs. The easiest solution that occurs to me is giving each team its own OS-level build-user account with which to execute its Jenkins jobs, but I cannot find a way to do this.
I have checked with Google, and also here
https://wiki.jenkins-ci.org/display/JENKINS/Use+Jenkins
and here
https://wiki.jenkins-ci.org/display/JENKINS/Plugins
to no avail.
Is there a way to do this? If not, can you recommend any best practices for segregating sets of builds from one another?
Maven Specific
You have two options that come to mind:
1. Add additional installations of Maven to your Jenkins global configuration, each using its own home directory and thus its own settings files. This will allow you to use totally different versions of Maven, selected based on job requirements (you are given the option to select which "version" of Maven you wish to use on the job itself).
2. Similar to (1), but specify particular settings configurations using Maven command-line arguments. It's a little less "obvious", but may be quicker to implement.
Multi-slave
You could possibly make use of multiple slaves on each machine. It increases the overheads of the builds quite significantly, and the implementation is such that you'd have multiple user accounts on a machine, each set up as needed, and then one slave instance for each user.
I'm not sure these solutions will totally answer your problem; I'll have a think and see if anything else comes to mind, but this might give you some starting points.
Key builds to a specific team directory that contains that team's settings. For example, provide a parameter 'TEAM' to every build, set its default value to the appropriate team name, and use that parameter as a key to a directory that contains the team's settings (so instead of using ${HOME}, as you do now, you'll use something like ${TEAM_SETTINGS}/${TEAM}).
You can set per-job users (who has access to/can build a particular job).
Under "Manage Jenkins" > "Configure System" >
Click on Enable Security
Check Project-based Matrix Authorization Strategy
However, I do not think there is a "per-build" option for a single job.
If you have the same project that you are sharing between teams, you could (and probably should) create two jobs for this project, and have different libraries/scripts be used in each.
You could also parameterize the build (on the job page, "Configure" > "This build is parameterized") and supply the library versions, etc. via string parameters.
You could also use a parameter for the team's name, and in your build script switch libraries based on it. For example, have a parameter called "TEAM" with choices TEAM_A and TEAM_B, and in your script have:
if [ "$TEAM" == "TEAM_A" ]
then
    # Team A's Ant libraries
    ANT_HOME=/opt/ant/libA
else
    # Team B's Ant libraries
    ANT_HOME=/opt/ant/libB
fi
# export so that child processes (e.g. the ant invocation) see it
export ANT_HOME
======================================================================
Have you considered sourcing your settings? On Linux, you could do this by saving your OS settings (paths, etc.) in a script file and using source /path/to/settings/file; on Windows it would be call /path/to/settings/batch/file.
Can you give examples of OS-level settings that you would require a per-build user for?
Your problem is a common one.
Whenever something nonstandard is installed on a build server, something will break for someone.
The only solutions I know are
Set up a separate build slave for each team or product. Then they can install whatever they want on the build slave and any mess they create is all their own fault.
Any dependencies required by a job need to come with the job. This is my preferred way of working. For example: If a job needs a library or a tool, the library or tool is not installed on the build server but in the source tree and the build uses it from the source tree.
Sometimes the latter way is more work. You need to set up the tools or library so it works when it is installed in the source tree. Some tools have hard-coded paths and they do not work. In that case you can install the source of the tool and compile the tool during the build.
An even better solution is to set up separate Jenkins jobs for all the tools and libraries; the jobs that need a library or tool then download them from those Jenkins jobs.
This way you can control all your dependencies and different jobs do not conflict when e.g. one needs an older version of a library and one a newer version. And if someone upgrades the library, it is immediately visible in the version control who did what.

TFS 2010 mapping dependent files for builds

I am pretty new to TFS and Build configuration tasks so forgive me if this problem has a simple answer.
I have a team project that is sort of a common library (CL); it contains DLLs and APIs that I commonly use throughout my projects. All my other projects reference files directly from the mapped folder for the CL on my dev machine.
I am trying to set up a build definition for Project A (the build server is on a different machine). I want to always ensure that the CL is the latest before each build, so is it possible to have the build definition pull the latest files first? The only other alternative is to start including the CL in every project directly.
I tried adding a working folder for the CL, but it does not seem to get the files before it attempts to build Project A. And when I try to rebuild after the failure, I receive an error saying that the CL working folder "is already mapped in workspace".
Instead of mapping in the sources, why not build the common library, deploy it to a common location, and have all the projects that use it reference it at the common location?
In addition to simply making more sense (it should be common binary, not common source), this greatly improves Continuous Integration builds. If several builds map the same source into their workspace, then when the common source is changed, all of those CI builds will be kicked off.

Teambuild / MSBuild and stamping QA-approved builds

We have an automated build and QA process for our software, using tfs/teambuild and msbuild, and we want to be able to know (for audit purposes) whether a component has gone through that process or not.
For example, if a library is installed on a user's machine, I'd like to be able to inspect it in some way to tell that it went through the build. In particular, I want to be able to distinguish it from components built directly on a developer's machine, and then manually installed.
What is the best way to do this? Code signing as part of the build process seems closest to these requirements, but presumably this would not cover any 3rd-party libraries that might be used? I also read about the ILMerge tool to merge all assemblies into one, but then I don't know enough to work out whether they can then be signed or not?
I'm sure we're not the first people to have this requirement, so I'm casting around for any ideas or hints from others who might have done such a thing.
Thanks!
Our developer builds are set to keep the versions at "0.0.0.0", but our build server marks the build based on a pre-configured version and an automagically generated build string, e.g. "1.0.3.xxx". Does your build server not allow for this?
Your build process should be updating each of your projects' AssemblyInfo.cs files (or a globally linked equivalent). You can do this with the TFS changeset number, so, as the previous poster indicated, you end up with a version property on each DLL of 1.0.changeset.buildno or something similar. You can do this easily in MSBuild.
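As a concrete sketch of that stamping step (hedged: the answer above means an MSBuild task, whereas this shows the classic FAKE 4 AssemblyInfoFile helper doing the same thing, with BUILD_NUMBER standing in for the TFS-provided changeset/build number and the file path being a placeholder; FAKE 5 later renamed these helpers):

#r "packages/FAKE/tools/FakeLib.dll"
open Fake
open Fake.AssemblyInfoFile

// stand-in for the TFS changeset/build number
let version = environVarOrDefault "BUILD_NUMBER" "0.0.0.0"

Target "Version" (fun _ ->
    // placeholder path; point it at the (globally linked) AssemblyInfo file
    CreateCSharpAssemblyInfo "./src/GlobalAssemblyInfo.cs"
        [ Attribute.Version version
          Attribute.FileVersion version ])

RunTargetOrDefault "Version"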
You could have the values of each assembly info file set in source control to be something obvious like 0 or 999.
A lot of what you're asking about is process and training as well, though.
If you're using installers or zips to package your deliverables, then you can also label them with the build number as part of your build process.
But if you have the changeset number you have the link from DLL to code, so it's traceable, coupled with the links to third-party DLL references as defined in each csproj.
