Is there a way to save/archive multiple artifacts from the same build?
Jenkins only allows a single 'Archive the artifacts' post-build step, and greys out the option after it has been used once.
Maybe the ArtifactArchiver allows multiple patterns?
You can use an Ant-style pattern, e.g. target/*.jar, to archive multiple artifacts.
It is also possible to use a comma-separated list of patterns if your files can't be matched with one pattern, e.g. target/*.jar, target/*.war.
The ? button next to the input field reveals this info.
You can comma-separate the paths, like this:
XXX.UnitTests\bin\Release\**.* , XXX.WriteAPI.Service/bin/Release/**.*
Then you get two separate artifacts.
See http://ant.apache.org/manual/Types/fileset.html for details of the Ant Pattern syntax.
If you want to save two different types of files, like zip files and html files, you can use
*.html,*.zip
This will archive all zip and html files in that directory.
No, Jenkins does not limit you to saving only one artifact. You can use wildcard patterns to save any number of artifacts. For example:
All JARs - **/*.jar
All WARs - **/*.war
and so on.
**/ means any directory.
Copying @Steven the Easily Amused's comment on one of the bottom-ranked answers for visibility. You can just run it twice:
Notice that the step names are plural. One can run archiveArtifacts - the pipeline step - as often as desired, though it's more efficient to run it once with Ant-style patterns. When invoked, archiveArtifacts transports the selected file(s) back to the master, where they are stored. Similarly, one can run copyArtifacts multiple times to select all of, or a portion of, the archived artifacts.
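For reference, here is a minimal Pipeline sketch of both approaches; the build command and the target/ and reports/ paths are made-up placeholders, not anything from the question:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B package' // assumed build command
            }
        }
    }
    post {
        success {
            // One call with a comma-separated list of Ant-style patterns...
            archiveArtifacts artifacts: 'target/*.jar, target/*.war'
            // ...or a second call for further files; both sets end up archived.
            archiveArtifacts artifacts: 'reports/**/*.html'
        }
    }
}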
All the answers here show how to combine multiple file patterns into one artifact, which is not what the OP asked for.
An example of what was asked for is a single-page web app build that has environment-specific settings compiled into the JavaScript for QA, Staging and Production.
Since you would want to deploy the same build to multiple environments, you would need three builds, each with its own environment settings in it. When you deploy the archive, you would not want to deploy the contents of all three to each environment and extract just the content for that one environment, because it is expensive to copy 66% more than is needed each time and could be error-prone.
So it is reasonable to generate two or more builds into their own artifacts and deploy one of those artifacts, depending on the target environment.
Jenkins should support multiple artifacts, not just making one artifact bigger.
Related
So I want to have a number of different websites running identical copies of binaries, but with differently transformed config files. These are different regional 'copies' of basically the same website (but connected to different backend DBs etc.)
I have a Jenkins job which builds my ASP.NET site, e.g.:
MSBUILD
C:\Code\ProjectX\src\Website\adminsite.projectx\adminsite.projectx.csproj
/m /p:Configuration=Debug /p:OutputPath=C:\Code\ProjectX\build\Website\adminsite.projectx /t:Rebuild
When that job completes I want it to trigger a transform of the .configs, and a deployment of the binaries. Is there any recommended means of achieving this?
Right now there are only two different regional versions of the site deployed, each with its own web.config transformation file.
I know that I could have each region BUILD its own copy of binaries, and do a straightforward deployment. But both regions will have identical binaries, so it seems like a waste of time for them to both kick off a build...
If both jobs try to build from the same source location, msbuild seems to produce artefacts in sub-folders of that location, so when both are kicked off at the same time they trip over each other...
Any suggestions? :)
For what it's worth, msbuild seems to ignore OutputPath when I provide it.
That would have been ideal, because I could just use something like:
/p:OutputPath=c:\Code\ProjectX\Build\$(Configuration)\.... etc.
I found that the least wasteful way is to build (or "prepackage") once and include the transforms in the artefact, for environment-specific transformation and deployment later. Basically you'll have a custom MSBuild project; on build it calls the PipelinePreDeployCopyAllFilesToOneFolder target (less wasteful than Package, since we don't need the final .zip) and redirects it with the _PackageTempDir property, and includes all Web.*.config items; then on deploy you call the appropriate transform task and deploy via msdeploy sync.
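Not from the original answer, but here is a rough Pipeline sketch of driving that prepackage target from Jenkins on a Windows agent; the output folder and agent label are invented, and the custom Web.*.config copy, transform and msdeploy steps described above are not shown:

pipeline {
    agent { label 'windows' } // assumed agent label
    stages {
        stage('Prepackage') {
            steps {
                // Copies the site content into _PackageTempDir without building the final .zip.
                bat '''
                msbuild src\\Website\\adminsite.projectx\\adminsite.projectx.csproj ^
                  /m /p:Configuration=Release ^
                  /t:PipelinePreDeployCopyAllFilesToOneFolder ^
                  /p:_PackageTempDir=%WORKSPACE%\\build\\prepackage
                '''
            }
        }
        stage('Archive') {
            steps {
                // Archive the prepackaged folder so a later job can transform and deploy it.
                archiveArtifacts artifacts: 'build/prepackage/**'
            }
        }
    }
}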
I have a couple of Jenkins builds that run every other hour or so. Since Jenkins stores the data and metadata for each build, this takes up a lot of space, and most of that space goes to the jars that are stored.
Jenkins keeps every jar for every build, and most of them don't really change from one build to another, so I was wondering whether there's a way to either:
a) store only the jars that changed, which would be the best-case scenario, perhaps using symbolic links or something similar;
b) not store the jars at all; we don't really check the builds by using the jars as a debugging tool, so we don't really need them. Of course I could set up a cron job to erase them, but I'd prefer to do that from inside Jenkins if possible.
Jenkins only stores jars and such if you have an "Archive the artifacts" post-build action in your job. If you don't have this, it doesn't archive anything except for logs and results.
If you want to store some things but just not the jars, you can change the Excludes field in the advanced settings of the "Archive the artifacts" post-build action.
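If the job were a Pipeline rather than a freestyle job, a one-line sketch of the same idea (the patterns are just placeholders) would be:

// Archive build output but leave out the jars themselves.
archiveArtifacts artifacts: 'target/**', excludes: '**/*.jar'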
I have a single code base being used to build an application for multiple platforms.
Locally I have set up a main build-env.properties file, and a series of additional *.properties files that I use to switch settings for the different platforms I am publishing to.
Doing my build on the command line I simply use the command:
ant build -propertyfile dev-build.properties
How can I do this in Jenkins?
I currently use the "Invoke Ant" build step with the target set to build, but am at a loss for how to specify the secondary property file.
Although not exactly the same, you can take the contents of those properties files and put them into the Jenkins Invoke Ant build step, using the Properties field under Advanced.
The most basic way:
You will need to create a new job for each different set of sub-properties you wish to use.
In your "Invoke Ant" build step, pressing Advanced... reveals a "Properties" field; copy the properties from one of your *.properties files into that field.
Repeat for each different properties file you wish to use.
The Parameterized Build plugin might help you, assuming the number of properties you are changing is only one or two. When you run the job, you get a drop-down to select your OS and go.
Though, as I have mentioned here, what counts against this plugin is that it makes the process manual.
In the thread Hudson / Jenkins: share parameters between several jobs, the second option in Anders's answer describes an alternative approach.
A better approach is to use a parameterised job with a file parameter (refer to the docs for creating such builds). Passing the uploaded file's location as the propertyfile would help. This is better than reconfiguring the job again and again to run a build (i.e. copying the properties file to the input location).
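If the job is a Pipeline, a sketch of the same idea using a choice parameter to select the properties file; the dev/qa/prod names are invented, since only dev-build.properties appears in the question:

pipeline {
    agent any
    parameters {
        choice(name: 'PLATFORM', choices: ['dev', 'qa', 'prod'],
               description: 'Selects which *-build.properties file to build with')
    }
    stages {
        stage('Build') {
            steps {
                // Pass the chosen properties file to Ant, same as on the command line.
                sh "ant build -propertyfile ${params.PLATFORM}-build.properties"
            }
        }
    }
}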
I have a fairly complicated Jenkins job that builds, unit tests and packages a web application. Depending on the situation, I would like to do different things once this job completes. I have not found a re-usable/maintainable way to do this. Is that really the case or am I missing something?
The options I would like to have once my complicated job completes:
Do nothing
Start my low-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to production
Start my high-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to test
run acceptance tests
deploy to production
I have not found an easy way to do this. The simplest, but not very maintainable, approach would be to make three separate jobs, each of which kicks off a downstream build. This approach scares me for a few reasons, including the fact that changes would have to be made in three places instead of one. In addition, many of the downstream jobs are nearly identical; the only difference is which downstream jobs they call. The proliferation of jobs seems like it would lead to an unmaintainable mess.
I have looked at using several approaches to keep this as one job, but none have worked so far:
Make the job a multi-configuration project (https://wiki.jenkins-ci.org/display/JENKINS/Building+a+matrix+project). This provides a way to inject the job with a parameter. I have not found a way to make the "build other projects" step respond to a parameter.
Use the Parameterized-Trigger plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin). This plugin lets you trigger downstream-jobs based on certain triggers. The triggers appear to be too restrictive though. They're all based on the state of the build, not arbitrary variables. I don't see any option provided here that would work for my use case.
Use the Flexible Publish plugin (https://wiki.jenkins-ci.org/display/JENKINS/Flexible+Publish+Plugin). This plugin has the opposite problem to the Parameterized Trigger plugin: it has many useful conditions it can check, but it doesn't look like it can start building another project. Its actions are limited to publishing-type activities.
Use Flexible Publish + Any Build Step plugin (https://wiki.jenkins-ci.org/display/JENKINS/Any+Build+Step+Plugin). The Any Build Step plugin allows making any build action available to the Flexible Publish plugin. While more actions were made available once this plugin was activated, those actions didn't include "build other projects."
Is there really not an easy way to do this? I'm surprised that I haven't found one, and even more surprised that I haven't really seen anyone else trying to do this. Am I doing something unusual? Is there something obvious that I am missing?
If I understood it correctly, you should be able to do this by following these steps:
First Build Step:
Does the regular work; in your case, building, unit testing and packaging the web application.
Depending on the result, let it create a file with a specific name.
This means that if you want the low-risk-change pipeline to run afterwards, create a file low-risk.prop.
Second Build Step:
Create a Trigger/call builds on other projects step from the Parameterized Trigger plugin.
Enter the name of your low-risk job into the Projects to build field.
Click on: Add Parameter
Choose: Parameters from properties File
Enter low-risk.prop into the Use properties from file Field
Enable Don't trigger if any files are missing
Third Build Step:
Check whether a low-risk.prop file exists
Delete the file
Do the same for the high-risk job
Now you should have the following setup:
if a file called low-risk.prop appears during the first build step, the low-risk job will be started
if a file called high-risk.prop appears during the first build step, the high-risk job will be started
if there's no .prop file, nothing happens
And that's what you wanted to achieve, isn't it?
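For comparison, if the upstream job were a Pipeline, the same branching could be expressed directly with the build step instead of marker files; the downstream job names and the risk flag below are hypothetical:

// Decide which downstream job to trigger based on a flag computed earlier in the build.
def riskLevel = 'low' // e.g. set by an earlier stage to 'none', 'low' or 'high'

if (riskLevel == 'low') {
    build job: 'low-risk-pipeline', wait: false
} else if (riskLevel == 'high') {
    build job: 'high-risk-pipeline', wait: false
}
// 'none' (or anything else) triggers nothing, matching the "do nothing" option.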
Have you looked at the Conditional BuildStep Plugin? (https://wiki.jenkins.io/display/JENKINS/Conditional+BuildStep+Plugin)
I think it can do what you're looking for.
If you want a conditional post-build step, there is a plugin for that:
https://wiki.jenkins-ci.org/display/JENKINS/Post+build+task
It will search the console log for a RegEx you specify, and if found, will execute a custom script. You can configure fairly complex criteria, and you can configure multiple sets of criteria each executing different post build tasks.
It doesn't provide you with the usual "build step" actions, so you've got to write your own script there. You can trigger execution of the same job with different parameters, or another job with some parameters, in standard ways that Jenkins supports (for example, using curl).
Yet another alternative is Jenkins text finder plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Text-finder+Plugin
This is a post-build step that allows you to forcibly mark a build as "unstable" if a regex is found in the console text (or even in some file in the workspace). So, in your build steps, depending on your conditions, echo a unique line into the console log, and then match that line with a regex. You can then use "Trigger parameterized builds" and set the condition to "unstable". This has the added benefit of visually marking the build differently (with a yellow ball); however, you only get one conditional option with this method, and from your OP it looks like you need two.
Try a combination of these two methods.
Do you use Ant for your builds?
If so, it's possible to do conditional building in Ant by having a set of environment variables your build scripts can use to build conditionally. In Jenkins, your job will then nominally build all of the projects, but your actual build script decides whether each one builds or just short-circuits.
I think the way to do it is to add an intermediate job that you trigger in the post-build step, passing it all the parameters your downstream jobs could possibly need; then, within that job, place conditional builds for the real downstream jobs.
The simplest approach I found is to trigger the other jobs remotely, so that you can use the Conditional BuildStep plugin, or any other plugin, to build other jobs conditionally.
I'm having real problems creating artifacts in TeamCity 6.5 (using TFS and MSBuild as the build runner, if it makes any odds, which it probably does, as any examples I find seem to use SVN...).
The build works, so long as I enter no checkout rules.
If I understand it, I'll need to set up some artifacts that themselves rely on checkout rules(?).
I have two builds that are identical other than the way they are kicked off.
One is initiated on check-in
One is initiated manually from within TeamCity. This build is the test build.
Assembly version numbers come from a single versioninfo.cs file that is a linked file in all projects in the solution. This method is detailed here: http://www.codeproject.com/Articles/328977/The-Right-Way-to-Version-Your-Assemblies and holds the version number thus:
[assembly: AssemblyFileVersion("9.1.0.0")]
Ultimately, I'm unable to copy the output of the test build to another location.
As it stands, the only output of a build is in the TeamCity data directory, for example:
C:\TeamCity\buildAgent\work\ceaaf65dc87ff856\Project1\bin\Debug
C:\TeamCity\buildAgent\work\ceaaf65dc87ff856\Project2\bin\Debug
etc
I'd like to copy the output files (exes and DLLs) to an output folder that has the build number of the build on it.
For argument's sake, let's say that for the version number above this would be:
c:\BuildServer_Output\SolutionName\9.1.0.0
Currently I have not been able to create artifact paths that actually do anything, i.e. that copy anything anywhere.
For instance, I have a couple of artifact paths, but nothing ever gets put into C:\BuildServer_TestBuilds -
+:Accounts\bin\debug* => C:\BuildServer_TestBuilds
+:BackOffice\bin\debug* => C:\BuildServer_TestBuilds
Am I getting no artifacts (and my artifact paths therefore ignored) because I have no checkout rules?
Any help would be appreciated.
I am pretty sure artifacts and checkout rules are completely independent. Artifacts just deal with what has been built; checkout rules tell TeamCity how to react to and check out changes in the VCS.
It looks like your artifact paths are using absolute paths. I have always found it easier to use relative paths with wildcards; that way I don't need to worry about where TeamCity put the build. We use the following to get all DLLs and EXEs into one folder:
**\bin\Debug\*.*=>deploymentdir
Our build configuration page has an artifacts link, and when we open it, it will have things like:
deploymentdir\common\bin\debug\common.dll
deploymentdir\common\bin\debug\common.pdb
deploymentdir\runner\bin\debug\runner.exe
In one of our other builds we use an msbuild script to flatten our output before putting it through the artifact process.
We do use checkout rules but we have not had to change our artifact paths to accommodate them.