I have a Jenkins server that I'd like to download build artifacts from. The problem is that the way the job is set up, the build artifact includes the job number e.g. NightlyBuild-346.tar.bz2. We like the job numbers, because they make it easy to know how old a specific build is.
This becomes problematic because I don't know the precise name of the file I'm downloading--I just know I want the last successful build. I could do something like this:
- name: download build from CI
  get_url:
    url: "https://ci.contoso.com/job/NightlyBuild/lastSuccessfulBuild/artifact/NightlyBuild-345.tar.bz2"
    dest: /tmp/NightlyBuild-345.tar.bz2
...but this will break after Jenkins finishes the next nightly build, because the artifact will become NightlyBuild-346.tar.bz2. I think I have a few options here:
Try to use wildcards in the get_url module (not so sure about that)
Download ALL artifacts from the job (there are several) as a single archive.zip and use command-line and regex magic to find the actual build artifact I care about. (potential for a hot unmaintainable mess)
Use the REST API to obtain the job number for the last successful job and form the full URL. (not sure that Ansible allows me to set variables on-the-fly like that).
Are these my options? Is there a better way to go about this? I want to eventually publish to an Artifactory repository from Jenkins, and if that's the right thing to do here, I'd appreciate some pointers in that direction too.
You can query Jenkins for the build number with the uri module:
- uri:
    url: http://ci/job/NightlyBuild/lastSuccessfulBuild/buildNumber
    return_content: yes
  register: build_number_resp

- debug: msg="URL with build number http://ci/job/NightlyBuild/lastSuccessfulBuild/artifact/NightlyBuild-{{ build_number_resp.content }}.tar.bz2"
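A follow-up get_url task can then use the registered value directly; a minimal sketch (the trim filter is only a guard in case the response body carries a trailing newline):

- name: download last successful build
  get_url:
    url: "http://ci/job/NightlyBuild/lastSuccessfulBuild/artifact/NightlyBuild-{{ build_number_resp.content | trim }}.tar.bz2"
    dest: "/tmp/NightlyBuild-{{ build_number_resp.content | trim }}.tar.bz2"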
Since Ansible 2.0, the maven_artifact module is available. The module supports Maven version coordinates via the version parameter. Use it like this:
- maven_artifact:
    group_id: junit
    artifact_id: junit
    dest: /tmp/junit-latest.jar
    version: latest
    repository_url: http://your-artifactory
I have installed the copyArtifacts plugin and created two freestyle jobs: experiment-main and experiment-1
experiment-1 just creates a file called artifact.txt with the build # in it, and archives it.
experiment-main triggers experiment-1 and then tries to copy the artifact with the Copy Artifact build step, but this is the result:
Running as SYSTEM
Building on master in workspace /var/lib/jenkins/workspace/experiment-main
Waiting for the completion of experiment-1
experiment-1 #4 started.
experiment-1 #4 completed. Result was SUCCESS
Build step 'Trigger/call builds on other projects' changed build result to SUCCESS
ERROR: Unable to find a build for artifact copy from: experiment-1
Finished: FAILURE
which isn't what I expected (or at least what I was hoping for).
I hoped it would find the experiment-1 build that was downstream from the current build.
Any ideas?
I figured out that there are variables with the numbers of triggered builds that I can use. To figure out the variable, I just printed all the environment variables with env and then found the right variable in the list.
Then I configured the copy artifacts plugin to use that build number.
I couldn't do it the way @alex-o suggested, just getting the last build of the subjob, because I might have more than one job using the subjob at once; but if you don't have that problem, that might work for you.
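For illustration, in my setup the 'Trigger/call builds on other projects' step exposed the triggered build number in an environment variable (the exact name below is what env printed for experiment-1 in my case, so verify it in your own output), and the Copy Artifact step can then point at that specific build:

Copy artifacts from another project
  Project name:  experiment-1
  Which build:   Specific build
  Build number:  ${TRIGGERED_BUILD_NUMBER_experiment_1}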
Yes, this is unexpected behavior indeed.
The reason why this won't work is hidden in the help text of the "Upstream Project Name" input field:
Downstream builds are found using fingerprints of files. That is, a build that is triggered from a build isn't always considered downstream, but you need to fingerprint files used in builds to let Jenkins track them.
So, the Copy Artifact plugin relies on fingerprint data to determine job ancestry. For that reason, you cannot use the "Downstream build of..." feature with the current job as the parent: fingerprints are recorded in a post-build step, so an ongoing build of experiment-main does not have any fingerprints associated with it by the time it is looking for a matching build of experiment-1.
It is possible to modify fingerprint information at build run-time (e.g., via Groovy), but then, it's probably best to avoid the Copy-Artifact plugin entirely and to implement the whole procedure in Groovy right away.
Your best bet is probably to refer to experiment-1 via "Last successful build" and to ensure that this is the build that you triggered before (usually this will be correct, but depending on your setup there can be race conditions).
I have a multibranch pipeline in Jenkins. I want to include my script file (jenkinsfile) as an svn:externals file in my development branches, so the script is kept in one central place for all branches. Unfortunately, the scan of the multibranch pipeline isn't able to find the script file, as it only looks inside the declared branch and not in the included svn external locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
  scripts/
    jenkinsfile
  code/
    version1/
      branchX/
        ...
    version11/
      branchY/
        ...
SVN external property for branchX, branchY, etc.
Local path: jenkinsfile
URL: ^/scripts/jenkinsfile
Revision Peg: 12345
Multibranch job configuration:
Subversion
Project Repository Base: http://.../root/code/
Include branches: version1/branchX, version11/branchY
Build configuration
Mode: by Jenkinsfile
Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the subversion scm plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I've added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml)
But Jenkins is still not able to find the script. And in fact, if I put my script directly in e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first one to read the script file and the second one because checkout is the first stage in the script itself).
Maybe my whole setup is wrong, or not the ideal way of doing this?
I would appreciate your help and tips. Thanks and greetings!
If you are working on a Linux or BSD (OS X) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available to all other branches (the file system keeps a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link is:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks.
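A rough sketch of that automation, assuming a working copy under /srv/checkout that follows the layout above (the path is just a placeholder):

#!/usr/bin/env bash
# Re-create the hard link to the central jenkinsfile in every branch directory.
for branch in /srv/checkout/root/code/*/*/; do
    ln -f /srv/checkout/root/scripts/jenkinsfile "${branch}jenkinsfile"
done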
We are trying to define a set of jobs on Jenkins that each do very specific actions: JobA1 will build a Maven project, while JobA2 will build .NET code; JobB will upload the artifact to Artifactory, JobC will download it from Artifactory, and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and I always get some output; whatever happens in between is something I don't care about. On the other side, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package via JobC and then deploy to staging.
PipelineC will download the package via JobC and then deploy to production, based on some internal validations.
I have tried to get some variables from JobA1 (basic POM values such as artifactId or version) injected into JobB, but the information does not seem to be transferred.
The same happens when downloading files: I call JobC, but the file ends up in that job's workspace, not available to any other job, and I'm afraid that the "External Workspace Manager" plugin adds too much complexity.
Is there any way other than sharing the workspace to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage ('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}

stage ('TargetJob') {
    dir("/var/lib/jenkins/jobs/TargetJob/workspace/") {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This feature of the pipeline plugin is better than artifact archiving, as it saves the data only locally; the stash is deleted after the build (which helps with data management).
You can also use Copy Artifact Plugin.
There are two things to consider for copying an artifact:
a) Archive the artifacts in the host project and assign permissions:
   - In the host job's configuration, select 'Permission to copy artifact' → Projects to allow copy artifacts: *
   - Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
b) Copy the required artifacts from the host to the target project:
   - Create a Build step → Copy artifacts from another project → enter the '$Project name - Host project', Which build: 'e.g. Latest successful build', Artifacts to copy: '$host project folder', Target directory: '$localfolder location'.
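Since the orchestration here is a pipeline, it may also be worth noting that the same plugin exposes a copyArtifacts pipeline step; a minimal sketch, where the project name, filter and target directory are placeholders:

// Runs inside a steps {} (or node {}) block of the downstream pipeline.
copyArtifacts(
    projectName: 'JobA1',           // host project (placeholder)
    selector: lastSuccessful(),     // which build to copy from
    filter: 'target/*.jar',         // artifacts to copy (placeholder)
    target: 'incoming/'             // target directory in this workspace
)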
For the first part of your question (passing variables between jobs), use the snippet below in a post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results; similar post conditions can be triggered on a specific build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by 'JobB' (provide the exact location of the job). Please note that a matching parameter should be defined in JobB.
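For illustration, a minimal sketch of the receiving side, assuming JobB is itself a pipeline (the stage and echo are just placeholders):

// Hypothetical Jenkinsfile for JobB, declaring the parameter it expects from JobA
pipeline {
    agent any
    parameters {
        // Must match the parameter name passed by the calling job
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch handed over by the calling job')
    }
    stages {
        stage('Use parameter') {
            steps {
                echo "Received branch: ${params.BRANCH}"
            }
        }
    }
}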
Moreover, for sharing the workspace, you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share, and to define shared global variables.
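As an illustration only, a minimal sketch of such a library; the library name my-shared-lib and the global variable deployInfo are made-up placeholders:

// vars/deployInfo.groovy in the shared library repository
// A global variable that any pipeline loading the library can call like a step.
def call(String artifactId, String version) {
    echo "Deploying ${artifactId}:${version}"
}

// Jenkinsfile of a consuming pipeline
@Library('my-shared-lib') _   // name as configured under Manage Jenkins → Global Pipeline Libraries

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                deployInfo('my-app', '1.2.3')   // resolves to vars/deployInfo.groovy
            }
        }
    }
}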
I'm using the Scriptler plugin for Jenkins, and am having a hard time finding any information on how to share the Scriptler scripts I'm writing between scripts. I've tried using the ScriptHelper from the Scriptler API, but have run into issues when passing in arguments to the script.
Has anyone else come across this and solved it? Is there a standard way (without calling the Jenkins REST API) to execute one script from another?
More Details
We have a full build MultiJob that contains many phase jobs, each with their own artifacts, with a 3-day time-to-live on them. When this full build job is promoted, a Scriptler script runs against it, pulling each of the phase jobs' artifacts into the full build job. By doing so, we can keep the full build alive forever, without changing the lifetime on the artifacts for each phase job (essentially 'keep this build forever' on the full build, ignoring the lifetimes set in the phase jobs).
We also want to pull these artifacts into a deploy job. The idea is that we can point a deploy job to a full build, and it will pull out the artifacts we specify. If the full build is promoted, this script will pull the artifacts directly from the full build job, otherwise, it will pull them from the internal phase jobs. Since we have 2 scripts that work with MultiJobs, I would like to be able to share this code between them.
The script would take a MultiJob name and build number, and return the individual phase job's build numbers, build statuses, and artifact information.
This is possible using Groovy capabilities, though I don't know if Scriptler supports it directly. If you are running on the master node, you can use Groovy's evaluate. Scriptler scripts are stored as Groovy files on the file system of the master node in the $JENKINS_HOME/scriptler/scripts directory. The Scriptler script ID is the file name within that directory.
Here is a very simple example. It uses two files. The first is the parameterized function, findByScm.groovy, which finds jobs using a given source control type. The second script, findByGitScm.groovy, will evaluate the first function for Git SCMs and print the results.
findByScm.groovy
import jenkins.model.*
jenkins = Jenkins.instance;
// Notice that myScmType is not defined in this function
scmJobs = jenkins.items.findAll { job -> job.scm != null && job.scm.type == myScmType }
findByGitScm.groovy
// This is supplying the argument to findByScm.groovy
myScmType = 'hudson.plugins.git.GitSCM'
// Now we are evaluating the script
evaluate(new File("${System.getProperty('JENKINS_HOME')}/scriptler/scripts/findByScm.groovy"))
// scmJobs is a variable which was introduced in findByScm.groovy
scmJobs.each { println it }
Is there a way to change the build number that is sent via email after a job completes? The problem is that our product builds are NOT being done by Jenkins, so we want to be able to get the build number (i.e. from a text file) and update the build number in Jenkins to match it. I have tried to set the build number:
set BUILD_NUMBER=45
But the email is still showing the build number that Jenkins originally set.
If you have access to the script console (Manage Jenkins -> Script Console), then you can do the following:
Jenkins.instance.getItemByFullName("YourJobName").updateNextBuildNumber(45)
It can also be done with the Next Build Number plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Next+Build+Number+Plugin
more info:
http://www.alexlea.me/2010/10/howto-set-hudson-next-build-number.html
If you don't like the plugin and want to change the build number via the nextBuildNumber file, you should "Reload Configuration from Disk" from the "Manage Jenkins" page.
Under the job folder, like:
C:\Program Files (x86)\Jenkins\jobs\job_name
there is a file named nextBuildNumber.
Setting the build number in the file and reloading the configuration from disk (Manage Jenkins menu) will force the next build you start to have the value from the file as BUILD_NUMBER.
If your branch name includes a forward slash (when using git flow, for example), you will need to replace the forward slash with its URL-encoded form %2F within the branch name.
Here is an example for the pipeline My-Pipeline-Name and the branch release/my-release-branch-name
Jenkins.instance.getItemByFullName("My-Pipeline-Name/release%2Fmy-release-branch-name").updateNextBuildNumber(BUILD_NUMBER)
I was able to find out about this by running the following command which will list the different jobs (branches) for your pipeline
Jenkins.instance.getItem("My-Pipeline-Name").getAllJobs()
Hope it helps.
Perhaps a combination of these plugins may come in handy:
Parameterized Build plugin - define some variable which holds your build number
Version number plugin - use the variable to change the build number
Build name setter plugin - use the variable to change the build number
You can change the build number by updating the file ${JENKINS_HOME}/jobs/job_name/nextBuildNumber on the Jenkins server.
You can also install the Next Build Number plugin to change the build number using the CLI or UI.
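If I remember correctly, that plugin also adds a set-next-build-number CLI command; a sketch, where the command name, server URL and job name should be treated as assumptions to verify against the plugin documentation:

# Assumed CLI command provided by the Next Build Number plugin
java -jar jenkins-cli.jar -s http://your-jenkins:8080/ set-next-build-number YourJobName 45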
For multibranch pipeline projects, do this in the script console:
def project = Jenkins.instance.getItemByFullName("YourMultibranchPipelineProjectName")
project.getAllJobs().each { item ->
    if (item.name == 'jobName') { // master, develop, feature/......
        item.updateNextBuildNumber(#Number);
        item.saveNextBuildNumber();
        println('new build: ' + item.getNextBuildNumber())
    }
}
Follow the steps: Jenkins Console > Manage Jenkins > Script Console, then write the script as:
Jenkins.instance.getItemByFullName("Your_job_name").updateNextBuildNumber(45)
By using environment variables:
$BUILD_NUMBER=4