How can a Jenkins Ivy Project be converted to a Freestyle Project?

After a recent update (both Jenkins and plug-ins), my Ivy Project settings can no longer be changed due to incompatible layouts (the table-to-div change in a minor version update, from Jenkins 2.263 to 2.264). This broke every plugin that was involved in configuring projects, but it went unnoticed for two months because our project settings hadn't needed to change in quite a while, and the builds were still working fine in the meantime.
For reference, my build process is based on:
Ant for the build
Ivy for dependency resolution
Artifactory as a dependency repository
Subversion as a code repository (with Jenkins commit triggers)
JUnit with Cobertura, JMeter
FindBugs, CheckStyle, CLOC
Projects are based on Java and JavaDoc
I tried reverting to the earlier version of Jenkins, but this affected nearly every plugin, and I wasn't able to successfully revert to the plugin version combination from prior to the breaking update. After failing to revert the updates, I decided instead to plow forward in updating all of our 68 projects to accommodate the new plugin versions.
Unfortunately, I can't save any configuration changes to Ivy Projects. After trial and error, I've found that I can reproduce my builds using Freestyle Projects. However, Jenkins doesn't seem to offer any way to convert projects from one type to another. If I were to create new projects from scratch to replace my existing projects (all 68, including their dependencies and specific plugin settings), I would lose all of my previous build histories, including the build numbers (which carry over to our deployments) and our project metrics (which we use for performance evaluation). So, I don't want to lose all of that information.
How can I manually change an Ivy Project to a Freestyle Project?

I found a partial solution, but it doesn't seem to work for all projects.
Stop the Jenkins webapp (important).
For each Ivy Project that you want to convert to a Freestyle Project, rename the root element of jobs/[project]/config.xml from <hudson.ivy.IvyModuleSet plugin="ivy@2.1"> to <project> (don't forget to also change the closing tag at the end of the document from </hudson.ivy.IvyModuleSet> to </project>); a before/after sketch follows these steps.
Restart Jenkins.
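For illustration, a minimal before/after sketch of that edit (only the root tags change; every child element in between stays exactly as it was):

Before:
<hudson.ivy.IvyModuleSet plugin="ivy@2.1">
  ...existing project settings...
</hudson.ivy.IvyModuleSet>

After:
<project>
  ...existing project settings...
</project>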
For most projects, I am then able to change the project configuration and save (importantly, Ant/Ivy-Artifactory Integration in a Freestyle Project is a feature-matched substitute for an Ivy Project).
However, three other projects still show up as Ivy Projects after changing the root element tag. What these projects have in common is that they all use the Performance Plugin. In order to finish converting these to Freestyle, I needed to additionally:
Disable the Performance Plugin
Restart Jenkins
Edit/Save the configuration for those projects as above.
Side effects and special considerations:
All of my build timestamps (prior to the change) are now listed as Dec 31, 1969 7:00 PM EDT, and the dashboard shows the most recent build as "50 yr" old. New build timestamps are correct. This was likely the result of no longer depending on the CloudBees plugin for build pipelines, which mapped build timestamps to build versions to avoid an old regression bug.
Every project immediately changed to red (Failed) on the dashboard, even though no builds had been attempted after the update, and the previous status was blue (Success) or yellow (Unstable). I suspect this is related to the above issue. After the next attempted build, whether successful or not, the status accurately reflects the build status.
No ability to use the Performance Plugin.
Several projects now show up as both an Upstream and Downstream Project, causing endless build cycles. There were three cases of this involving different combinations of projects, and in those cases, one or both projects needed to be removed from the build triggers. I suspect it had been this way for a while but for some reason the endless cycles only happen after the update.
I suddenly have a lot of "Unreadable Data" across all of my Jenkins projects. Unfortunately, discarding it is an all-or-nothing process (can't pick a single project to test). I backed up my jobs directory and clicked Discard, and to my surprise everything still works.
It looks like I'm back in business. My build numbers have been preserved, and the only noticeable side effect is the 50 year old builds. If I encounter any other issues resulting from these changes, I will update this answer.

Related

In Jenkins, can we delete build artifacts for older builds, but keep build details/logs?

As is good practice, I've got Jenkins set up at work to automatically build everything for continuous integration, pulling files from our Git repositories. On our development branches, builds get kicked off automatically whenever anyone commits a change. When we want to do formal testing, we pull the build from Jenkins and use that; and when we want to sign off a change request, we quote the Jenkins build number where the change went in. So far, so good.
The problem we have is that builds are a significant size. For our SDK, we have to build across multiple platforms so that we can check it works on all of them. At maybe 50MB per build, this starts to mount up! Short term I can keep asking IT to give me more storage space, but longer term I'd like a more strategic solution.
The obvious answer in Jenkins is to set up deletion rules, whether deleting after some time or after some number of builds. The problem then though is that if we delete that older development build, we lose the traceability of what we tested. I'm sure most engineers at one time or another have had to do a binary chop through older builds to find an obscure bug/regression which was only spotted some time later. For me, it is unacceptable to lose that history.
The important feature of build history though is not the binary build artifacts, but the build log recording what Git commits (or anything else; toolchain versions for example) went into each build. That's what lets us go back to investigate older builds and recreate them if required. The build log is relatively small (and highly compressible, being a text file). We do still need to keep build artifacts for recent builds though, so that testers can use them. So I'm thinking a better alternative would be to preserve the build log in Jenkins for all builds, but to have Jenkins automatically delete build artifacts after some time.
Does anyone know of a way in Jenkins (perhaps a plugin?) which would let us automatically delete/archive build artifacts from older builds, but still keep the build details and log for those builds? I'm happy to do a Jenkins upgrade if necessary to get this feature. And of course this needs to be only for selected development build jobs - all release build jobs need their build artifacts to be preserved forever, as do any builds which have the "keep forever" button ticked.
If it's absolutely necessary, I could set up a separate cron job to do this on the Jenkins file area. That's a nasty hack though, and I suspect it's likely to cause some issues with Jenkins, so I'd rather not do something that brute-force if there's a better alternative.
I think you need this option in your Jenkinsfile:
buildDiscarder(logRotator(artifactNumToKeepStr: '10'))
artifactNumToKeepStr: only this many of the most recent builds keep their archived artifacts; the build records and logs of older builds are not deleted.
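In a declarative pipeline that option lives in the options block; a minimal sketch (the stage contents and artifact path are placeholders):

pipeline {
    agent any
    options {
        // Keep the build records and logs for every build, but only keep
        // the archived artifacts of the 10 most recent builds.
        buildDiscarder(logRotator(artifactNumToKeepStr: '10'))
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
                archiveArtifacts artifacts: 'build/**'
            }
        }
    }
}

For freestyle jobs, the equivalent setting is in the job configuration under Discard old builds > Advanced > "Max # of builds to keep with artifacts". Builds marked "Keep this build forever" are exempt from the rotation, which covers the release builds you mention.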

Any quick way to convert VS .net manual build into Jenkins?

We are migrating 50+ .NET projects from TFS to GitHub, and at the same time we want to use Jenkins to automate the builds. Currently all the builds are done manually inside Visual Studio. I know how to automate this build using MSBuild, and we already have a lot of these projects building inside Jenkins.
My question: is there a way to set up these 50+ projects quickly without creating them one by one manually? Any way to script them? E.g. a Jenkins project has everything inside a folder, so I could copy a sample project/folder to create a new one and modify something, or create a Jenkins project using a script reading a config file? Any idea that can save some time is appreciated.
Not a direct answer but too long for a comment so here it goes anyway. Following the Joel test (which in no way is dogmatic for me but does make a lot of good points), and in my experience, you should already have an msbuild file now to build all those projects 'in one click'. Then, setting up a build server, in fact any build server, is just a matter of making it build that single parent project. This might not work for everyone, but for several projects I've worked on this had the following advantages:
the entire build process gets defined by developers, working locally on their machine, using 'standard' tools
as such they don't need to spend hours in a web interface figuring out the appropriate build steps, dependencies and whatnot (also those hours would have been worthless in the end if switching to a different build server)
since a complete build is now just a matter of msbuild master.proj, possibly along with some options to define configuration/platform/output directories, getting this running on any build server should be painless and quick (see the sketch after this list)
in the same manner this makes it easy to test different build servers with a minimum of time and migrate between them (also no need to ask SO questions on how to set everything up :)
this also makes it easy for other developers to get complete builds as well without having to go round via a build server
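For illustration, such a master project can be very small; a minimal sketch with hypothetical solution paths, invoked as msbuild master.proj /p:Configuration=Release:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <PropertyGroup>
    <!-- Default configuration unless the caller passes /p:Configuration=... -->
    <Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
  </PropertyGroup>
  <ItemGroup>
    <!-- Every solution that makes up a complete build -->
    <Solutions Include="Src\ProjectA\ProjectA.sln"/>
    <Solutions Include="Src\ProjectB\ProjectB.sln"/>
  </ItemGroup>
  <Target Name="Build">
    <!-- Build each solution with the chosen configuration -->
    <MSBuild Projects="@(Solutions)" Properties="Configuration=$(Configuration)"/>
  </Target>
</Project>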
Anecdote: we once had Jenkins running on multiple different projects as well. It took us days to get everything running, with the templates etc., and we found the web interface slow and cumbersome (and getting to know the API would have taken even more days). Then one day I got sick of this and made a bunch of msbuild scripts which could build everything from one msbuild command. That took much less time than setting up Jenkins, a couple of hours or so. Then I took a TeamCity installation we already had and made it build the new master project. Took like an hour and everything worked. Just recently I took the same project and got it working on Visual Studio Online, again in no time.
If those projects are more or less similar to build, you will probably be interested in using the Template plug-in for Jenkins. There you configure a dummy project such that it does what is common to (most of) the 50+ projects.
Afterwards you create a separate project for each: create the first project and make it use the template project for each of the steps which can be shared with the template project (use build step from other project). All subsequent projects can then be created as slightly adapted copies of this first 'real' project.
I use it such that the variable $JOB_NAME (the actual project name in Jenkins, that is) is part of the repository path, and I can thus clone from http://example.org/$JOB_NAME/
Configured that way, I can include the source code management step in the templating job and use it unmodified. Similarly for the build step and post-build step: they are run by a script which is fairly universal across all my projects (mostly calling make and deriving deployment/publication paths from $JOB_NAME again).
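If creating those 50+ copies by hand is still too slow, the copying itself can be scripted from the Jenkins script console; a minimal sketch, assuming a finished first job named 'ProjectA-build' to clone and hypothetical names for the rest:

// Run from Manage Jenkins > Script Console
import jenkins.model.Jenkins

def jenkins = Jenkins.instance
def template = jenkins.getItem('ProjectA-build')  // the finished job to clone

['ProjectB-build', 'ProjectC-build'].each { name ->
    def job = jenkins.copy(template, name)  // duplicates the job's config.xml
    job.disable()                           // leave it disabled until its settings are adjusted
}

The same thing can also be driven from the command line with jenkins-cli (get-job/create-job), which makes it easy to post-process each job's config.xml with a script before creating it.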

How can I control the order of builds in TFS 2010 when common library is checked in?

I have a TFS 2010 server with some projects and a common library used in 5 of them. We use VS 2013 and we have Rolling Builds enabled in most if not all build definitions. When the common library is checked in, all of the projects referencing it are recompiled, but the order is poor: the most commonly used project is compiled last. Is there a way to change that so it gets compiled first?
This question hints at a lot of problems and possible solutions. The simplest answer is probably to just add more build servers to run all the builds in parallel.
Otherwise you need to consider turning off rolling builds and writing your own build scheduler. That, or other strategies, such as building the DLL once and checking it in as source for the other builds, or running the build on a branch outside of the other 5 builds and merging the source into those builds when they want to pick up the change.
If you are otherwise happy with how things are now and don't want to do a lot of work to solve the problem.... then just scale out your build farm with additional agents.
There's a "Priority in queue" setting on the build definition, but it sounds like you might want to change your solution's Project Dependencies > Build Order?

Automatic Versioning with Team Foundation Server 2012; Increment Only on Changed Assembly

I've been tasked with setting up a new Team Foundation/Build server at my company, with which we'll be starting a new project. Nobody here currently has experience with TFS, so I'm learning all of this on my own. Everything is working so far; The server's been set up, the Repository and Team Project has been created, the Build Server has been created, and I've created a simple hello world application to verify the source control and Continuous Integration builds (on the build server) run properly.
However, I'm having a problem setting up the automatic versioning. I've installed the TfsVersioning project, and it's working fine; I'm able to define a format for my assembly versions. I haven't yet decided what format I'll use; probably something like Major.Minor.Changeset.Revision (I'm aware of the potential problem regarding using the changeset number in the assembly version, so I may decide to switch to Major.Minor.Julian.Revision before we begin development).
The problem:
I don't want assemblies to have new file versions if their source code has NOT changed since the last build. With a continuous integration build this isn't a problem, as the build server will only grab the source files that have changed, causing an incremental build which produces only updated modules; the existing unchanged modules won't be built, so their version will remain unchanged.
If I set up a nightly build, I'll want to clean the workspace and perform a Build-All. However, this means that ALL assemblies will get a new version (assuming the Assembly File Version includes the build number).
A solution?
This has prompted me to consider using the latest changeset number in the Assembly File Version. This way, if nothing has been committed between two successive Build-Alls, the versions won't be incremented. However, this would mean that a change and commit to a single file would force a version increment on ALL assemblies.
I'm looking for one of two things:
A way to only increment Assembly Version Numbers if their source/dependencies have changed since the last build. Successive Build-Alls should not cause changes in version numbers.
OR
A way for testers and non-developers to be able to tell version W.X.Y.Z and version W.X.Y.Z+1 of assembly 'Foo' are identical, even though they have differing file versions.
I've probably read about 20 articles on the subject, and nobody (except this guy) seems to address the issue. If what I'm asking for isn't common practice in the Team Foundation ALM, how do I address the second bullet point above?
Thanks for your time!
This is something I did in the past. The solution has two critical points:
You must use an incremental build, i.e. Clean Workspace = None
The change to AssemblyInfo.cs must be computed for each project
The latter is the most complex part, and I will just sketch the solution here.
In the custom MSBuild properties, use CustomAfterMicrosoftCommonTargets to inject a hook into the normal Visual Studio compile:
/property:CustomAfterMicrosoftCommonTargets=custom.proj
Also forward a value for the version
/property:BuildNumber=1.2.3.4
In custom.proj, redefine the BeforeCompile target to something similar to this:
<Target Name="BeforeCompile"
        Inputs="$(MSBuildAllProjects);
                @(Compile);
                @(_CoreCompileResourceInputs);
                $(ApplicationIcon);
                $(AssemblyOriginatorKeyFile);
                @(ReferencePath);
                @(CompiledLicenseFile);
                @(EmbeddedDocumentation);
                $(Win32Resource);
                $(Win32Manifest);
                @(CustomAdditionalCompileInputs)"
        Outputs="@(DocFileItem);
                 @(IntermediateAssembly);
                 @(_DebugSymbolsIntermediatePath);
                 $(NonExistentFile);
                 @(CustomAdditionalCompileOutputs)"
        Condition="'$(BuildNumber)'!=''">
  <Message Text="*TRACE* BuildNumber: $(BuildNumber)"/>
  <MyTasksThatReplaceAssemblyVersion
      BuildNumber="$(BuildNumber)"
      File="$(MSBuildProjectDirectory)\Properties\AssemblyInfo.cs"/>
</Target>
You need to have a task for replacing the AssemblyFileVersion in the AssemblyInfo.cs source. MSBuild Extension Pack has an AssemblyInfo task for this purpose.
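If you'd rather avoid the extra dependency, the task can also be defined inline; a minimal sketch using MSBuild 4's CodeTaskFactory, where the task name matches the placeholder MyTasksThatReplaceAssemblyVersion above and the regex-based replacement is my assumption of what such a task does:

<UsingTask TaskName="MyTasksThatReplaceAssemblyVersion"
           TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <BuildNumber ParameterType="System.String" Required="true"/>
    <File ParameterType="System.String" Required="true"/>
  </ParameterGroup>
  <Task>
    <Using Namespace="System.Text.RegularExpressions"/>
    <Code Type="Fragment" Language="cs"><![CDATA[
      // Overwrite the AssemblyFileVersion attribute in the given file
      // with the build number passed in from the build definition.
      string text = System.IO.File.ReadAllText(File);
      text = Regex.Replace(
          text,
          @"AssemblyFileVersion\(""[^""]*""\)",
          "AssemblyFileVersion(\"" + BuildNumber + "\")");
      System.IO.File.WriteAllText(File, text);
    ]]></Code>
  </Task>
</UsingTask>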
I posted the full details at my blog here, here and here.

TFS 2012: Correlating binaries to builds and source code

I'm starting to dive into TFS 2012 and I have a basic understanding of the tiers and how build servers, controllers and agents work and how different build scripts can have different configurations and projects.
However, one of the things I'm struggling with is a requirement for our source control solution that says that I need to be able to prove a particular changeset or shelfset produced a particular build. That is, given a particular binary, I can point to a release changeset that generated that binary. I should also be able to point to the test changeset that was merged into the release branch. The idea here is not just a separation of duty, but validating that because the release and test changesets are identical, no code was injected into a project by a code reviewer.
I've read one blog post that talks about "Binary promotions" -- would that concept be useful in my situation? I'm having a hard time finding how this binary promotion is set up in TFS.
Deployment
Out of the box, TFS doesn't really support deployments; it can deploy to one location on build, which is often a test server (think Lab Management). TFS 2012 has built-in support for Azure deployments, but they still happen at the end of a build, and the build artifacts cannot be automatically deployed to a new location.
You could modify the build template to allow to release to different locations, but that would still be a fresh build for every environment and not true binary promotions.
TFS does, however, have a concept of build quality, and it actually fires off events when this quality is changed. TFS Deployer is a 3rd-party tool that hooks into the quality-change event and can execute PowerShell scripts. This means that with a simple change of a dropdown value you can automatically kick off a script that releases to any environment you want. You can customize the build quality list (per team collection) to be a list of environments (dev, uat, staging, production, etc.), which the script then uses to figure out where to release the specific build to.
VS 2012 also has some nice improvements to Web Deploy, so deployment configurations are stored in source control with the project, which in theory means they'll be available in the drop folder for TFS Deployer to make use of.
I don't believe TFS keeps a history of build qualities, which means you can't really use the build quality history to maintain a list of what is deployed to which environment. You could fairly easily record this information as part of the deployment script though. Or at the very least add a custom summary node to the build with information about the release.
TFS 2012 does have the ability to mark a build as deployed as part of the Azure deployment functionality, and you can mark TFS Deployer builds as deployed using a script, but it doesn't feel very useful.
Octopus Deploy is another project that's worth checking out, and could be used instead of TFS Deployer if your build template creates NuGet packages. It requires a bit more control over the production hardware as you need to install agents on each environment to handle releases, but it solves a lot of other issues with deployment.
Versioning
Once you have a nice consistent way of automatically releasing that people don't bypass, you can look at enhancing the build template to inject the build version, or changeset number as the assembly version for anything built as part of that automated build. There's a number of different ways to do it and plenty of blog posts and tools to help you achieve that.
Alternatively you could just use automatic assembly versioning ([assembly: AssemblyVersion("1.0.*")]) to encode the date/time the build occurred, which ends up like 1.0.1234.123, where 1234 is the number of days since Jan 1st 2000 and 123 is the number of seconds since midnight divided by two.
If you're deploying websites, then I highly recommend injecting the current build version into the html somewhere. This way you can check what version a website is running without needing access to the bin directory. It can also be appended as a querystring to css/js file imports to ensure no browser caching occurs between versions.
Thoughts
Personally I'm hoping Microsoft realise that the XAML build workflows are trying to do too much, and that they split the different concerns (build, test, deployment...) into different scriptable parts. Of course that would not be until the next major release of TFS, which is years away. Although with Team Foundation Service they are trying to iterate a lot quicker, so they may actually extend the Azure deployment stuff into something more useful in the nearer future.
