Copy artifacts from an upstream multi-branch pipeline - Jenkins

I have the following Jenkins setup:
A multi-branch pipeline which sometimes (on certain tag builds) triggers
a pipeline that builds an installer from the upstream artifacts.
In the upstream MB-pipeline, I have the following fragments:
options {
    copyArtifactPermission('my-downstream-project');
}
post {
    success {
        script {
            if (isRelease()) {
                build job: 'my-downstream-project'
            }
        }
    }
}
In the downstream pipeline, I then try to grab the artifacts:
copyArtifacts projectName: 'my-upstream-project',
selector: upstream(),
filter: '*.jar',
fingerprintArtifacts: true
The downstream build is started, but it fails with:
ERROR: Unable to find project for artifact copy: hds-access-code-cache
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
My understanding so far:
While I can't configure the Copy Artifact permission via the configuration UI for the MB-pipeline, the option is accepted and should work.
The examples I can find would use projectName: 'my-upstream-project/tag-name' as that's the actual job. I don't have a fixed branch or tag, though.
How can I correctly access the upstream artifact?

It is possible to pass the job (branch) name down as a parameter.
Change the upstream pipeline to:
build job: 'my-downstream-project',
parameters: [string(name: 'upstreamJobName', value: env.BRANCH_NAME)]
Add the parameter to the downstream pipeline:
parameters {
    string(name: 'upstreamJobName',
           defaultValue: '',
           description: 'The branch name of the triggering upstream build'
    )
}
And change the copy directive to:
copyArtifacts projectName: "my-upstream-project/${params.upstreamJobName}",
selector: upstream(),
filter: '*.jar',
fingerprintArtifacts: true
Et voila:
Copied 1 artifact from "My Upstream Project » my-tag" build number 1
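Putting the pieces together, the whole downstream Jenkinsfile could look roughly like this (a sketch using the declarative syntax from above; the agent and stage name are illustrative):
pipeline {
    agent any
    parameters {
        string(name: 'upstreamJobName',
               defaultValue: '',
               description: 'The branch name of the triggering upstream build'
        )
    }
    stages {
        stage('Copy upstream artifacts') {
            steps {
                copyArtifacts projectName: "my-upstream-project/${params.upstreamJobName}",
                              selector: upstream(),
                              filter: '*.jar',
                              fingerprintArtifacts: true
            }
        }
    }
}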

Related

Use workspace location in the post-build script in Jenkins

I am trying to use the artifacts created in the workspace during the Jenkins build in a post-build shell script.
I am not able to use them, as the workspace artifacts are automatically deleted before the post-build script runs.
Could anyone help me address this?
When the post-build stage is running, your workspace is already removed. When you think of it, your regular stage and post-build stage may even be running on different nodes, so there can't be any expectation that the files are in your workspace.
To access your artifacts in the post-build stage, you need to fetch them manually, e.g. by using the Copy Artifact plugin:
post {
    always {
        // fetch artifacts of this job and this build number to $WORKSPACE
        step([
            $class: 'CopyArtifact',
            filter: '*',
            fingerprintArtifacts: true,
            optional: true,
            projectName: "${JOB_NAME}",
            selector: [$class: 'SpecificBuildSelector',
                       buildNumber: "${BUILD_NUMBER}"]
        ])
        script {
            try {
                for (file in findFiles(glob: "*")) {
                    println "Found file ${file}"
                }
            } catch (error) {
                println "Failed to find files"
            }
        }
    }
}

Jenkins groovy script: get parameter from another job

I have one pipeline and one other job. I want to pass a parameter.
This is my groovy script which is inside pipeline job.
pipeline {
    agent any
    stages {
        stage('release') {
            steps {
                echo 'This is release!'
                echo branch
                build job: projectname, parameters: [[$class: 'StringParameterValue', name: 'branch', value: branch]]
            }
        }
    }
}
So I want to pass this branch into the build job; echo branch also prints correctly.
And this is how I tried to get the branch name from the release job.
This triggers an error:
org.tmatesoft.svn.core.SVNException: svn: E160005: Target path '/${branch}' does not exist
It does not resolve to the branch name I want.
This should work:
build job: projectname , parameters: [string(name: 'branch', value: "${branch}")]
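Note that ${branch} will only resolve in the downstream job if that job declares a branch parameter. If the downstream job is itself a pipeline, the receiving side could look roughly like this (a sketch; the stage name and default value are illustrative):
pipeline {
    agent any
    parameters {
        // the name must match the parameter passed by the upstream build step
        string(name: 'branch', defaultValue: 'trunk', description: 'Branch passed in by the release pipeline')
    }
    stages {
        stage('Use branch') {
            steps {
                echo "Received branch: ${params.branch}"
                // use params.branch wherever the SVN path is needed
            }
        }
    }
}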

How to copy artifacts from another Jenkins job from a Pipeline?

I want to copy a build artifact from another Jenkins Job using the CopyArtifact plugin.
The artifact is created using the following command:
archiveArtifacts artifacts: '_Builds/BuildRelease/**', fingerprint: true
build 'Release Installer'
Within the 'Release Installer' job, I try to obtain the archived artifacts
using the following command within the Pipeline:
stages {
    stage('Get Artifacts') {
        steps {
            step([$class: 'CopyArtifact',
                  projectName: "MyBuildJob",
                  filter: "_Builds/BuildRelease/archive.zip"
            ])
        }
    }
}
When the "Release Installer" Job is executed, the artifact is not found.
Both jobs are executed on the same Build node.
I think my filter rule is missing something. Unfortunately the available Jenkins documentation is a little thin on details and examples.
I believe that the default is to copy from the last successful build. However, it looks like you currently want the upstream build. Here is a snippet:
copyArtifacts fingerprintArtifacts: true, projectName: 'MyBuildJob', selector: upstream()
I generated this code using the snippet generator. It should exist on the left panel of the classic view of a job. The button text reads "Pipeline Syntax" and the URL is "my.jenkins.instance.com/pipeline-syntax/".
Specifying an artifact filter is not required; without one, all artifacts are copied. However, if you want to keep the filter:
copyArtifacts filter: '_Builds/BuildRelease/archive.zip', fingerprintArtifacts: true, projectName: 'MyBuildJob', selector: upstream()
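If you instead want the default behaviour mentioned above (copying from the last successful build rather than from the triggering upstream build), the selector could be swapped, for example:
copyArtifacts filter: '_Builds/BuildRelease/archive.zip', fingerprintArtifacts: true, projectName: 'MyBuildJob', selector: lastSuccessful()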

Copy artifact within a Jenkins pipeline

I have a Jenkins pipeline job that archives an artifact in its first phase; I then need to copy that artifact in another stage of the pipeline build:
node {
    stage 'Stage 1 of build'
    // Run tests, if successful archive the artifact
    archiveArtifacts artifacts: 'build/test.js', excludes: null

    stage 'Stage 2 of build'
    // want to copy artifact from stage 1 of the build
    step([$class: 'CopyArtifact', filter: 'build/test.js', fingerprintArtifacts: true, flatten: true, projectName: 'echo-develop-js-pipeline', selector: [$class: 'WorkspaceSelector'], target: './client/public/vendor/echo/'])
}
With this I get an "Unable to find a build for artifact copy" error.
When the artifact is created it is saved here:
http://localhost:8181/view/Echo JS Develop/job/echo-develop-js-pipeline/233/artifact/build/test.js
How do I access the created artifact from within a pipeline job?
I needed this recently, and none of the other solutions here did exactly what I wanted, because I needed to use multiple parameter filters for my selection. Here's what I did using the "Run Selector Plugin" in addition to directly calling the "Copy Artifact Plugin":
Step One: Select the build number you need.
prereq_build = selectRun filter: parameters("TARGET_OS=${TARGET_OS},GIT_BRANCH_NAME=${GIT_BRANCH_NAME}"), job: 'prereq_rpms', selector: status('STABLE'), verbose: true
Step Two: Copy (updated 2017-11: Native pipeline support now!).
copyArtifacts(
projectName: 'prereq_rpms',
filter: '**/*.rpm',
fingerprintArtifacts: true,
target: 'prereq',
flatten: true,
selector: specific(prereq_build.getId())
)
Figured this one out: using the var ${BUILD_NUMBER} you can access artifacts in the current pipeline:
step([$class: 'CopyArtifact', filter: 'build/test.js', fingerprintArtifacts: true, flatten: true, projectName: 'echo-develop-js-pipeline', selector: [$class: 'SpecificBuildSelector', buildNumber: '${BUILD_NUMBER}'], target: './client/public/vendor/echo/'])
In the Pipeline plugin there are newer 'stash' and 'unstash' steps that can be used instead of artifacts.
Artifact: archives are designed for longer-term file storage (e.g., intermediate binaries from your builds). Artifacts require more storage space and resource management.
Stash: saves a set of files for use later in the same build, generally on another node/workspace. The stash and unstash steps are designed for use with small files. Stash/unstash can be used inside a pipeline by just assigning a name to the stash, and it works only within the same build.
Here is a good example for stash/unstash: Tutorial
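For completeness, a minimal sketch of the stash/unstash pattern (the node labels, stash name and file path are illustrative):
node('build') {
    stage('Build') {
        // produce build/test.js here, then stash it for a later node
        stash name: 'js-build', includes: 'build/test.js'
    }
}
node('deploy') {
    stage('Deploy') {
        // retrieve the stashed files into this node's workspace
        unstash 'js-build'
    }
}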

How can I trigger another job from a Jenkins pipeline (Jenkinsfile) with the GitHub Org Plugin?

How can I trigger build of another job from inside the Jenkinsfile?
I assume that this job is another repository under the same GitHub organization, one that already has its own Jenkinsfile.
I also want to do this only if the branch name is master, as it doesn't make sense to trigger downstream builds of any local branches.
Update:
stage 'test-downstream'
node {
def job = build job: 'some-downtream-job-name'
}
Still, when executed I get an error
No parameterized job named some-downtream-job-name found
I am sure that this job exists in Jenkins and is under the same organization folder as the current one. It is another job that has its own Jenkinsfile.
Please note that this question is specific to the GitHub Organization Plugin which auto-creates and maintains jobs for each repository and branch from your GitHub Organization.
In addition to the above-mentioned answers: I wanted to start a job with a simple parameter passed to a second pipeline and found the answer on http://web.archive.org/web/20160209062101/https://dzone.com/refcardz/continuous-delivery-with-jenkins-workflow
So I used:
stage ('Starting ART job') {
build job: 'RunArtInTest', parameters: [[$class: 'StringParameterValue', name: 'systemname', value: systemname]]
}
First of all, it is a waste of an executor slot to wrap the build step in node. Your upstream executor will just be sitting idle for no reason.
Second, from a multibranch project, you can use the environment variable BRANCH_NAME to make logic conditional on the current branch.
Third, the job parameter takes an absolute or relative job name. If you give a name without any path qualification, that would refer to another job in the same folder, which in the case of a multibranch project would mean another branch of the same repository.
Thus what you meant to write is probably
if (env.BRANCH_NAME == 'master') {
build '../other-repo/master'
}
You can use the build job step from Jenkins Pipeline (Minimum Jenkins requirement: 2.130).
Here's the full API for the build step: https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
How to use build:
job: Name of a downstream job to build. May be another Pipeline job, but more commonly a freestyle or other project.
Use a simple name if the job is in the same folder as this upstream Pipeline job;
You can instead use relative paths like ../sister-folder/downstream
Or you can use absolute paths like /top-level-folder/nested-folder/downstream
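For instance, the three forms side by side (job names are illustrative):
// same folder; in a multibranch project this means another branch of the same repository
build job: 'downstream'
// relative path to a job in a sibling folder
build job: '../sister-folder/downstream'
// absolute path from the Jenkins root
build job: '/top-level-folder/nested-folder/downstream'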
Trigger another job using a branch as a param
At my company many of our branches include "/". You must replace any instances of "/" with "%2F" (as it appears in the URL of the job).
In this example we're using relative paths
stage('Trigger Branch Build') {
    steps {
        script {
            echo "Triggering job for branch ${env.BRANCH_NAME}"
            BRANCH_TO_TAG = env.BRANCH_NAME.replace("/", "%2F")
            build job: "../my-relative-job/${BRANCH_TO_TAG}", wait: false
        }
    }
}
Trigger another job using build number as a param
build job: 'your-job-name',
parameters: [
string(name: 'passed_build_number_param', value: String.valueOf(BUILD_NUMBER)),
string(name: 'complex_param', value: 'prefix-' + String.valueOf(BUILD_NUMBER))
]
Trigger many jobs in parallel
Source: https://jenkins.io/blog/2017/01/19/converting-conditional-to-pipeline/
More info on Parallel here: https://jenkins.io/doc/book/pipeline/syntax/#parallel
stage('Trigger Builds In Parallel') {
    steps {
        // Freestyle build trigger calls a list of jobs
        // Pipeline build() step only calls one job
        // To run all three jobs in parallel, we use the "parallel" step
        // https://jenkins.io/doc/pipeline/examples/#jobs-in-parallel
        parallel (
            linux: {
                build job: 'full-build-linux', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            mac: {
                build job: 'full-build-mac', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            windows: {
                build job: 'full-build-windows', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            failFast: false
        )
    }
}
Or alternatively:
stage('Build A and B') {
    failFast true
    parallel {
        stage('Build A') {
            steps {
                build job: "/project/A/${env.BRANCH}", wait: true
            }
        }
        stage('Build B') {
            steps {
                build job: "/project/B/${env.BRANCH}", wait: true
            }
        }
    }
}
The build command in Pipeline is there to trigger other jobs in Jenkins.
Example on github
The job must exist in Jenkins and can be parameterized.
As for the branch, I guess you can read it from git.
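A sketch of that idea in scripted pipeline, assuming a parameterized downstream job named 'other-job' with a BRANCH string parameter (both names are illustrative):
node {
    checkout scm
    // in a multibranch pipeline env.BRANCH_NAME is already set; otherwise ask git
    // (note: a detached-HEAD checkout prints HEAD, so prefer the env var when present)
    def branch = env.BRANCH_NAME ?: sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
    build job: 'other-job', parameters: [string(name: 'BRANCH', value: branch)]
}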
Use the build job step for that task in order to trigger other jobs from a Jenkinsfile.
You can add a variety of logic to your execution, such as parallel, node and agent options, and steps for triggering external jobs. I give some easy-to-read cookbook examples for that.
1. Example of triggering an external job from a Jenkinsfile, with a conditional:
if (env.BRANCH_NAME == 'master') {
    build job: 'exactJobName', parameters: [
        string(name: 'keyNameOfParam1', value: 'valueOfParam1'),
        booleanParam(name: 'keyNameOfParam2', value: 'valueOfParam2')
    ]
}
2. Example of triggering multiple jobs from a Jenkinsfile, with conditionals:
def jobs = [
    'job1Title': {
        if (env.BRANCH_NAME == 'master') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam1', value: 'valueNameOfParam1'),
                booleanParam(name: 'keyNameOfParam2', value: 'valueNameOfParam2')
            ]
        }
    },
    'job2Title': {
        if (env.GIT_COMMIT == 'someCommitHashToPerformAdditionalTest') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam3', value: 'valueOfParam3'),
                booleanParam(name: 'keyNameOfParam4', value: 'valueNameOfParam4'),
                booleanParam(name: 'keyNameOfParam5', value: 'valueNameOfParam5')
            ]
        }
    }
]
// run both entries of the map in parallel
parallel jobs
