I have a Jenkins job that creates multiple Debian packages. Each created package file is archived as an artifact of the build. This works well so far.
Currently I am trying to trigger multiple builds of another job, one for each created package file. That job should install each package in an isolated Vagrant box and run some tests on it.
The question is how to trigger the builds. Since I would like to parallelize them, it is not as simple as running one build for all packages. The number of packages is not always the same, so it is also impractical to duplicate the job for each package.
Thanks,
krissi
To act on every build of a project, you probably want "Promotions". Read about it here:
How to promote a specific build number from another job in Jenkins?
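Separately from the Promotions approach, here is a rough illustration of fanning out one downstream build per archived package if the packaging job is (or becomes) a Pipeline job. The downstream job name "package-test", its PACKAGE parameter, and the output/*.deb glob are all made-up examples; findFiles comes from the Pipeline Utility Steps plugin.

```groovy
// Scripted Pipeline sketch: trigger one downstream test build per .deb file,
// all in parallel. Job name, parameter name and glob are hypothetical.
node {
    // Assume the packaging steps earlier in this job left the .deb files here.
    def debs = findFiles(glob: 'output/*.deb')

    def branches = [:]
    for (def f in debs) {
        def pkg = f.path            // capture per iteration for the closure
        branches[pkg] = {
            build job: 'package-test',
                  parameters: [string(name: 'PACKAGE', value: pkg)]
        }
    }
    parallel branches
}
```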
I have been searching far and wide to see if I can find information on Jenkins incremental pipeline builds that do not involve Maven.
The general idea is that I want to build a generic project and run specific steps of the pipeline if the underlying code has changed. If the code did not change, I want to re-use the results from a previous build.
The reason why I want to do this, is to drastically reduce build times for huge projects.
Imagine that you only need to fix one line in a SCSS file, but the whole project needs to be rebuilt, repackaged, etc. because of it. In the meantime, the site is live and broken, waiting 15 minutes for the fix.
Can someone give a basic example of how such a build can be created or where I can find more information on incremental building?
The only thing I have been able to find is incremental building for Maven projects, but this is not applicable for me.
The standard solution is to create modules that depend on each other.
Publish the built artifacts of your modules to a binary repository like Sonatype Nexus (you can easily create a private npm repo as well as a proxy npm repo).
During the build, download the dependencies instead of building them.
If this solution is not the one you want to take, you will have a hard time hacking one together. To persist the state of your steps, an easy solution is to create files in the job workspace and read them back in the next build (see the sketch below).
This is 100% reproducible.
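A minimal sketch of that file-based approach, assuming a Pipeline job where frontend/ is the part of the tree whose rebuild we want to skip when nothing changed. The path, hash file name, and build command are all hypothetical.

```groovy
// Sketch only: skip the frontend build when its sources have not changed
// since the last run. The stamp file lives in the job workspace, so this
// relies on the workspace not being wiped between builds.
node {
    checkout scm

    // Hash everything under frontend/ (path is hypothetical).
    def current = sh(
        script: "find frontend -type f | sort | xargs sha1sum | sha1sum | cut -d' ' -f1",
        returnStdout: true
    ).trim()

    def stampFile = 'frontend.lastbuild.sha1'
    def previous = fileExists(stampFile) ? readFile(stampFile).trim() : ''

    if (current != previous) {
        stage('Build frontend') {
            sh 'npm ci && npm run build'   // whatever your real build step is
        }
        writeFile file: stampFile, text: current
    } else {
        echo 'frontend/ unchanged, reusing previous build output'
    }
}
```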
We are working on different branches of the release, but each branch should run the same jobs, with some minor changes. So ideally I want to copy all the jobs from one working branch to a new branch.
I select New Item -> Folder and choose to copy from another folder.
The new folder contains all the jobs from the source folder, but all the job configurations are missing. In other words, the jobs are created with just their names and I need to refill everything else. This is essentially useless.
I googled and did not see any related errors. Does anyone have any good advice on copying Jenkins folders? I am on Jenkins 1.651.3, Ubuntu 14.04.
I tried the same on Jenkins 2.19.1 and it worked without the issue you are seeing.
The best way to create a similar array of jobs for new branches is via Groovy, using https://jenkinsci.github.io/job-dsl-plugin/
Create a seed job that executes a Groovy script to iterate over a list of branches and create the jobs (see the sketch below).
The Job DSL plugin is available for Jenkins 1.642 and above.
Note that manipulating content in JENKINS_HOME directly is not advised and is typically restricted.
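For illustration, a seed-job script along those lines might look like this. The branch list, repository URL, job names, and build step are all made up; the real job definitions would mirror whatever your existing jobs do.

```groovy
// Job DSL seed script (runs in a "Process Job DSLs" build step).
// Branch list and job contents are hypothetical -- adapt to your setup.
def branches = ['release-1.0', 'release-1.1', 'master']

branches.each { branchName ->
    job("myproject-build-${branchName}") {
        description("Build job for branch ${branchName} (generated by Job DSL)")
        scm {
            git {
                remote {
                    url('https://example.com/myproject.git')
                }
                branch(branchName)
            }
        }
        steps {
            shell('make package')
        }
    }
}
```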
I should also mention that it turned out to be just a machine issue on our side: lack of RAM. After we added more RAM, it's all working perfectly!
I'm using TFS 2013 on premises. I have four build agents configured on a build machine. Several build definitions compile ASP.NET websites. I configured the msbuild parameters to deploy the IIS application to the integration server, which sits out there in Rackspace.
By default WebDeploy does differential deployments by comparing file dates. In my case that's a big plus because copying files from our network to Rackspace takes quite some time. Now, in order to preserve file dates, the build agent has to compile the same base set of source code. On every build, only the changed source code yields new DLLs, minimizing the number of files deployed.
All of that works fine, with a caveat: a given build definition has to be assigned to a build agent (by agent name or tag). The problem is that this creates a lot of contention when all the builds assigned to the same agent are queued up. They wait in line until the previous build is done.
In an ideal world any agent should be able to take care of any build, but the source code being compiled has to be the same, regardless of the agent.
I tried changing the working folder of all agents to point to the same location but I get an error because two agents can't be mapped to the same folder. I guess there is one workspace per agent.
Any ideas?
Finally I found a way to do this. Here are all the changes you need to do:
By default the working folder of each agent is $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath). That means there's one working folder per BuildAgentId. I changed it so that all Agents share the same folder: $(SystemDrive)\Builds\WorkingFolder\$(BuildDefinitionPath)
By default, at runtime the workflow creates a workspace named something like "[BuildDefinitionId][AgentId][MachineName]". Because all agents share the same working folder, there's an error when trying to create each separate workspace. The solution is in the build definition: edit the XAML and look for an activity called "Get sources from Team Foundation Version Control". There's a property called WorkspaceName. Since I want to have one workspace per build definition, I set that property to BuildDetail.BuildDefinition.Name.
Save your customized build template and create a build that uses it.
Make sure the option "1. TF VersionControl/1. Clean workspace" is set to False. Otherwise the build will wipe out all the source code on every build.
Make sure the option "2. Build/3. Clean build" is set to false. Otherwise the build will wipeout the output binaries on every build.
With this setup you can queue up the same build on any agent, and all of them will point to the same source code and bin output. When the source code changes, only the affected binaries are recompiled. I have a custom step in the template that deploys the output files to IIS, to all the servers in our web farm, using msdeploy.exe. Now my builds plus deployments take one or two minutes, because only the DLLs or content that changed during the build are synchronized to the servers.
You can't run two build agents in the same folder. The point of build agents is to run multiple builds in parallel, usually on separate PCs. If you try to run them on the same source code, then (a) it's pointless, as two builds of exactly the same source should produce identical results, and (b) they are almost certainly going to trip over each other and cause the builds to fail or produce unexpected results.
If you want to be able to build and then deploy a series of versions of your codebase, then there are two options:
If you queue up multiple builds, then the last one will "win", so the intermediate builds are of no real value. So if you check in new code before your first build completes, you may as well stop the active build and start a new one. You should be asking yourself why the build is so slow, or why you are checking in changes so often that this is necessary.
if each build produces an incremental update to the deployed result, then you need to pass the output of your builds to some deployment agent that is able to diff it against the deployed version and send only the changes to be deployed. This could be set up to gather results from multiple build agents if that would be beneficial.
But I wonder if perhaps your build is slow because you are doing a complete build each time (which cleans the build folder, gets all the sources, and does a full rebuild), when what you want is an incremental build (which gets the latest changes, compiles only what is affected, and completes quickly). Perhaps you should investigate making your build incremental.
My current setup consists of:
Project job - the one that fetches the sources, deploys to the test environment and runs tests against that environment.
Building job - a job that runs on a special machine and builds the sources into deb packages.
The issue: it's fairly easy to get the deb packages back from the building job (as a job artifact), but how would I pass the sources from the project job to the building one?
They run on different Jenkins slaves.
What are the possible options?
Note: the building job isn't specific to this particular project. Several projects use it as a helper to build debs from the sources, so I cannot hardcode anything project-specific there.
You want to look into the Clone Workspace SCM Plugin.
If you are using SVN, you might want to look at the Tracking SVN Plugin. This allows you to pull the same version out of SVN as another job. We use this so that we can create both a "debug" and a "release" build from the same SVN revision. The debug version builds first. If it succeeds, the release version is built using the same revision that was built for the debug build.
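One more option, not covered by the answers above and sketched here only for illustration: if both jobs are (or can be) Pipeline jobs, the project job can archive a source tarball and the building job can pull it with the Copy Artifact plugin. All job names, the node label, the UPSTREAM_BUILD parameter, and the packaging command are hypothetical.

```groovy
// In the project job (Pipeline sketch): pack the sources and archive them.
node {
    checkout scm
    sh 'tar czf sources.tar.gz --exclude=sources.tar.gz .'
    archiveArtifacts artifacts: 'sources.tar.gz'

    // Hand the sources over to the generic packaging job
    // (job name and parameter are hypothetical).
    build job: 'deb-builder',
          parameters: [string(name: 'UPSTREAM_BUILD', value: env.BUILD_NUMBER)]
}

// In the building job (which declares a string parameter UPSTREAM_BUILD):
// fetch the archived tarball from the triggering build.
// copyArtifacts requires the Copy Artifact plugin.
node('deb-builder-node') {
    copyArtifacts projectName: 'project-job',
                  selector: specific(params.UPSTREAM_BUILD),
                  filter: 'sources.tar.gz'
    sh 'tar xzf sources.tar.gz && dpkg-buildpackage -us -uc'
}
```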
So I have 4 projects that depend on each other, with 4 different configurations in TeamCity. When I run one, they all run. But each one of them does its own checkout when starting its run, so it is possible that some files were committed during the build and then it is not the same revision.
I want to check them all out at the beginning, so the build is always against the same revision.
Does anyone have an idea?
If they are all pulling from the same repository, you can create a snapshot dependency from the dependent build to the build it depends on. What this means is that they will use the same sources as the build that they depend on.
Snapshot Dependency
You could group the projects logically in version control and add a parent script to build them in order. You then have one TC checkout/build that does all 4.
We use maven multi-module builds in Hudson for the same purpose.