Use workspace location in the post-build script in Jenkins

I am trying to use the artifacts created in the workspace of a Jenkins build in a post-build shell script.
I am not able to use them because the workspace artifacts are automatically deleted before the post-build script runs.
Could anyone help me address this?

When the post-build stage is running, your workspace has already been removed. When you think about it, your regular stages and the post-build stage may even run on different nodes, so there can be no expectation that the files are still in your workspace.
To access your artifacts in the post-build stage, you need to fetch them manually, e.g. by using the Copy Artifact plugin:
post {
    always {
        // fetch artifacts of this job and this number to $WORKSPACE
        step([
            $class: 'CopyArtifact',
            filter: '*',
            fingerprintArtifacts: true,
            optional: true,
            projectName: "${JOB_NAME}",
            selector: [$class: 'SpecificBuildSelector',
                       buildNumber: "${BUILD_NUMBER}"]
        ])
        script {
            try {
                for (file in findFiles(glob: "*")) {
                    println "Found file ${file}"
                }
            } catch (error) {
                println "Failed to find files"
            }
        }
    }
}
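If your version of the Copy Artifact plugin supports the pipeline step symbol, the same fetch can be written more compactly. This is only a sketch of an equivalent call, assuming the copyArtifacts step and the specific() selector are available in your plugin version:
post {
    always {
        // sketch: copy this build's own archived artifacts into $WORKSPACE
        copyArtifacts projectName: "${JOB_NAME}",
                      selector: specific("${BUILD_NUMBER}"),
                      filter: '*',
                      fingerprintArtifacts: true,
                      optional: true
    }
}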

Related

Jenkins: unable to access the artifacts on the initial run

My setup: the main node runs on Linux and an agent on Windows. I want to compile a library on the agent, archive those artifacts, and copy them to the main node to create a release together with the Linux-compiled binaries.
This is my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                archiveArtifacts artifacts: 'build_32/bin/mylib.dll'
            }
        }
    }
    post {
        success {
            node('linux') {
                copyArtifacts filter: 'build_32/bin/mylib.dll', flatten: true, projectName: '${JOB_NAME}', target: 'Win32'
            }
        }
    }
}
My problem is that when I run this project for the first time, I get the following error:
Unable to find project for artifact copy: mylib
But when I comment out the copyArtifacts block and rerun the project, it succeeds and the artifacts are visible in the project overview. After that I can re-enable copyArtifacts, and the artifacts are then copied as expected.
How to configure the pipeline so it can access the artifacts on the initial run?
The copyArtifacts capability is usually used to copy artifacts between different builds, not between agents within the same build. Instead, to achieve what you want, you can use the stash and unstash keywords, which are designed exactly for passing artifacts between different agents in the same pipeline execution:
stash: Stash some files to be used later in the build.
Saves a set of files for later use on any node/workspace in the same Pipeline run. By default, stashed files are discarded at the end of a pipeline run.
unstash: Restore files previously stashed.
Restores a set of files previously stashed into the current workspace.
In your case it can look like:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                // dir is used to control the path structure of the stashed artifact
                dir('build_32/bin') {
                    stash name: "build_artifact", includes: 'mylib.dll'
                }
            }
        }
    }
    post {
        success {
            node('linux') {
                // dir is used to control the output location of the unstash keyword
                dir('Win32') {
                    unstash "build_artifact"
                }
            }
        }
    }
}

Sharing files between Jenkins pipelines

A lot of the examples I see, like "How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)?", share a file within the SAME pipeline. I want to share a file between two different pipelines.
I tried to use the Copy Artifacts plugin like so:
Pipeline1:
node('linux-0') {
    stage("Create file") {
        sh "echo \"hello world\" > hello.txt"
        archiveArtifacts artifacts: 'hello.txt', fingerprint: true
    }
}
Pipeline2:
node('linux-1') {
    stage("copy") {
        copyArtifacts projectName: 'Pipeline1',
            fingerprintArtifacts: true,
            filter: 'hello.txt'
    }
}
and I get the following error for Pipeline2
ERROR: Unable to find project for artifact copy: Pipeline1
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
What am I missing?
NOTE: My real pipelines are scripted (not declarative) and more complicated than these, so I can't readily convert them to declarative pipelines.
I just tested this and it worked fine for me. Here is my code, pipeline1:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                sh "echo \"hello world\" > hello.txt"
                archiveArtifacts artifacts: 'hello.txt', fingerprint: true
            }
        }
    }
}
pipeline2:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                copyArtifacts projectName: 'pipeline1',
                    fingerprintArtifacts: true,
                    filter: 'hello.txt'
            }
        }
    }
}
copyArtifacts projectName: 'pipeline1'
Ensure that the project name is exactly the same as the first pipeline's name (and that there are no conflicts on that name). If you have conflicts, or use the Folders plugin, see this link for how to reference the project accordingly:
https://wiki.jenkins.io/display/JENKINS/How+to+reference+another+project+by+name
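For example, when both pipelines live inside a folder, the plain job name is not enough and a folder-qualified (or relative) name is needed. A hedged sketch, where 'my-folder' is only a hypothetical folder name:
// assuming both jobs sit in a folder called 'my-folder' (hypothetical name)
copyArtifacts projectName: 'my-folder/pipeline1',
    fingerprintArtifacts: true,
    filter: 'hello.txt'

// or, from another job inside the same folder, a relative reference
copyArtifacts projectName: '../pipeline1',
    fingerprintArtifacts: true,
    filter: 'hello.txt'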

Jenkins pipeline script to copy artifacts of current build to server location

I want to create a Jenkins job which does the following:
Git > Mvn build > copy the jar to some location on the server.
Can this be done using a single job, or does it need two?
And which is the preferred way of doing this: is a pipeline preferred over creating a Maven job?
I have created this pipeline script, but it does not copy the current build's jar to the server location; it copies the previous build's artifact jar.
node {
    def mvnHome
    stage('Preparation') { // for display purposes
        // Get some code from a GitHub repository
        git 'git#github.pie.ABC.com:abcdef/BoltRepo.git'
        mvnHome = tool 'M2'
    }
    stage('Build') {
        // Run the maven build
        if (isUnix()) {
            sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
        } else {
            bat(/"${mvnHome}\bin\mvn" -Dmaven.test.failure.ignore clean package/)
        }
    }
    stage('Results') {
        archiveArtifacts 'target/*/BoltRepo*.jar'
    }
    stage('Deploy Artifact') {
        copyArtifacts(
            projectName: currentBuild.projectName,
            filter: 'target/*/BoltRepo*.jar',
            fingerprintArtifacts: true,
            target: '/ngs/app/boltd/bolt/bolt_components/bolt_provision/test',
            flatten: true)
    }
}
What is the best way of achieving this?
I haven't used pipelines before, but I have done what you want using "ArtifactDeployer" from the "Post-build Actions" in the job's configuration.
Note: you will need to install the "Artifact Deployer Plug-in".
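If you prefer to stay with the pipeline from the question, one alternative to the ArtifactDeployer route is to skip copyArtifacts for the deploy step and copy the freshly built jar straight out of the current workspace, which avoids picking up a previous build's artifact. This is just a sketch, assuming the target directory is reachable from the build node:
stage('Deploy Artifact') {
    // copy the jar produced by this very build directly from the workspace;
    // the target path is the one used in the question
    sh "cp target/*/BoltRepo*.jar /ngs/app/boltd/bolt/bolt_components/bolt_provision/test/"
}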

Copy artifacts from an upstream multi-branch pipeline

I have the following Jenkins setup:
A multi-branch pipeline which sometimes (on certain tag builds) triggers a pipeline that builds an installer from the upstream artifacts.
In the upstream MB-pipeline, I have the following fragments:
options {
    copyArtifactPermission('my-downstream-project');
}
post {
    success {
        script {
            if (isRelease()) {
                build job: 'my-downstream-project'
            }
        }
    }
}
In the downstream pipeline, I then try to grab the artifacts:
copyArtifacts projectName: 'my-upstream-project',
    selector: upstream(),
    filter: '*.jar',
    fingerprintArtifacts: true
The downstream build is started, but it fails with:
ERROR: Unable to find project for artifact copy: hds-access-code-cache
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
My understanding so far:
While I can't configure the Copy Artifact permission via the configuration UI for the MB-pipeline, the option is accepted and should work.
The examples I can find would use projectName: 'my-upstream-project/tag-name' as that's the actual job. I don't have a fixed branch or tag, though.
How can I correctly access the upstream artifact?
It is possible to pass down the job name as a parameter.
Change the upstream pipeline to:
build job: 'my-downstream-project',
    parameters: [string(name: 'upstreamJobName', value: env.BRANCH_NAME)]
Add the parameter to the downstream pipeline:
parameters {
    string(name: 'upstreamJobName',
           defaultValue: '',
           description: 'The name of the job of the triggering upstream build'
    )
}
And change the copy directive to:
copyArtifacts projectName: "my-upstream-project/${params.upstreamJobName}",
    selector: upstream(),
    filter: '*.jar',
    fingerprintArtifacts: true
Et voila:
Copied 1 artifact from "My Upstream Project » my-tag" build number 1

How to Copy Artifacts from other Jenkins Job from a Pipeline?

I want to copy a build artifact from another Jenkins Job using the CopyArtifact plugin.
The artifact is created using the following command:
archiveArtifacts artifacts: '_Builds/BuildRelease/**', fingerprint: true
build 'Release Installer'
Within the 'Release Installer' job, I try to obtain the archived artifacts
using the following command within the Pipeline:
stages {
    stage('Get Artifacts') {
        steps {
            step([$class: 'CopyArtifact',
                  projectName: "MyBuildJob",
                  filter: "_Builds/BuildRelease/archive.zip"
            ])
        }
    }
}
When the "Release Installer" Job is executed, the artifact is not found.
Both jobs are executed on the same Build node.
I think my filter rule is missing something. Unfortunately the available Jenkins documentation is a little thin on details and examples.
I believe that the default is to copy from the last successful build. However, it looks like you actually want the upstream build here. Here is a snippet:
copyArtifacts fingerprintArtifacts: true, projectName: 'MyBuildJob', selector: upstream()
I generated this code using the Snippet Generator. It can be found in the left panel of the classic view of a job; the button text reads "Pipeline Syntax" and the URL is "my.jenkins.instance.com/pipeline-syntax/".
Specifying an artifact filter is not required; without one, all artifacts are copied. However, if you want to keep the filter:
copyArtifacts filter: '_Builds/BuildRelease/archive.zip', fingerprintArtifacts: true, projectName: 'MyBuildJob', selector: upstream()
