Sharing files between Jenkins pipelines

Most of the examples I see, like How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)?, share a file within the SAME pipeline. I want to share a file between two different pipelines.
I tried to use the Copy Artifacts plugin like so
Pipeline1:
node('linux-0') {
    stage("Create file") {
        sh "echo \"hello world\" > hello.txt"
        archiveArtifacts artifacts: 'hello.txt', fingerprint: true
    }
}
Pipeline2:
node('linux-1') {
    stage("copy") {
        copyArtifacts projectName: 'Pipeline1',
                      fingerprintArtifacts: true,
                      filter: 'hello.txt'
    }
}
and I get the following error for Pipeline2
ERROR: Unable to find project for artifact copy: Pipeline1
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
What am I missing?
NOTE: My real pipelines are scripted (i.e., not declarative) and more complicated than these, so I can't readily convert them to declarative pipelines.

I just tested this and it worked fine for me. Here is my code, pipeline1:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                sh "echo \"hello world\" > hello.txt"
                archiveArtifacts artifacts: 'hello.txt', fingerprint: true
            }
        }
    }
}
pipeline2:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                copyArtifacts projectName: 'pipeline1',
                              fingerprintArtifacts: true,
                              filter: 'hello.txt'
            }
        }
    }
}
copyArtifacts projectName: 'pipeline1'
Ensure that the project name is exactly the same as the first pipeline's name (and that there are no conflicts on that name). If the name conflicts, or you use the Folders plugin, see this link for how to reference the project accordingly:
https://wiki.jenkins.io/display/JENKINS/How+to+reference+another+project+by+name
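For instance, if both jobs live inside a Jenkins folder, the project name can be qualified with the folder path. A minimal sketch (`MyFolder` is a placeholder for your actual folder name):

```groovy
// Absolute path from the Jenkins root; a relative form such as '../Pipeline1'
// also works when the copying job sits in the same folder.
copyArtifacts projectName: '/MyFolder/Pipeline1',
              fingerprintArtifacts: true,
              filter: 'hello.txt'
```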

Related

Jenkins: unable to access the artifacts on the initial run

My setup: the main node runs on Linux, with an agent on Windows. I want to compile a library on the agent, archive those artifacts, and copy them to the main node to create a release together with the Linux-compiled binaries.
This is my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                archiveArtifacts artifacts: 'build_32/bin/mylib.dll'
            }
        }
    }
    post {
        success {
            node('linux') {
                copyArtifacts filter: 'build_32/bin/mylib.dll', flatten: true, projectName: '${JOB_NAME}', target: 'Win32'
            }
        }
    }
}
My problem is, when I run this project for the first time, I get the following error
Unable to find project for artifact copy: mylib
But when I comment out the copyArtifacts block and rerun the project, it is successful and the artifacts are visible in the project overview. After this I can re-enable copyArtifacts and the artifacts will be copied as expected.
How to configure the pipeline so it can access the artifacts on the initial run?
The copyArtifacts capability is usually used to copy artifacts between different builds, not between agents in the same build. To achieve what you want, use the stash and unstash steps instead, which are designed exactly for passing artifacts between different agents in the same pipeline execution:
stash: Stash some files to be used later in the build.
Saves a set of files for later use on any node/workspace in the same Pipeline run. By default, stashed files are discarded at the end of a pipeline run
unstash: Restore files previously stashed.
Restores a set of files previously stashed into the current workspace.
In your case it can look like:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                // dir is used to control the path structure of the stashed artifact
                dir('build_32/bin') {
                    stash name: 'build_artifact', includes: 'mylib.dll'
                }
            }
        }
    }
    post {
        success {
            node('linux') {
                // dir is used to control the output location of the unstash step
                dir('Win32') {
                    unstash 'build_artifact'
                }
            }
        }
    }
}

Jenkinsfile copyArtifacts plugin: Use environment variable in filter

I have the Jenkins copyArtifact plugin installed and am using declarative pipeline syntax.
I've set up my pipeline declaring environment variables as such:
pipeline {
    environment {
        ENVIRONMENT = "prod"
    }
}
I want to be able to filter copying artifacts based on a filter determined by the environment variable specified above. However, when specifying the following in my declarative pipeline:
steps {
    copyArtifacts(
        filter: "build_${env.ENVIRONMENT}_*.exe"
    )
    // do stuff
}
I get the following error:
Failed to copy artifacts from feature/project with filter: build_${env.ENVIRONMENT}_*.exe
The documentation indicates that "filter" is meant to be a string of syntax
ant-expression to filter artifacts to copy
However, I've been unable to get pipeline environment variables read in this filter parameter. Does anyone know if including environment variables in the filter string is possible? If so, what is the syntax?
Yes, it is allowed, but the files must be archived first (i.e., already be available as artifacts) before you copy them.
Example:
#!groovy
import hudson.model.Result
import groovy.json.*

pipeline
{
    agent any
    environment
    {
        ENVIRONMENT = "prod"
    }
    stages
    {
        stage('stage 1')
        {
            steps
            {
                script
                {
                    // Example to show archiving of data: copies folder "My_Archive" to the Jenkins artifacts.
                    dir("C:\\Data")
                    {
                        // Archive all files in folder "My_Archive"
                        archiveArtifacts allowEmptyArchive: true, artifacts: "My_Archive/"
                        copyArtifacts filter: "My_Archive/build_${env.ENVIRONMENT}_*.exe", fingerprintArtifacts: true, projectName: 'Archivetest_Pipeline', selector: lastWithArtifacts(), target: 'C:\\Data\\Test'
                    }
                }
            }
        }
    }
}
You can also use the pipeline Snippet Generator to configure the inputs as needed.
https://www.jenkins.io/doc/book/pipeline/getting-started/#snippet-generator
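For what it's worth, the literal `${env.ENVIRONMENT}` in the error message is exactly what Groovy produces for a single-quoted string; only double-quoted GStrings are interpolated before the plugin ever sees the filter. A minimal sketch (the project name here is taken from the error message):

```groovy
// filter: 'build_${env.ENVIRONMENT}_*.exe'  // single quotes: NOT interpolated
copyArtifacts projectName: 'feature/project',
              filter: "build_${env.ENVIRONMENT}_*.exe" // double quotes: interpolated by Groovy
```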

Jenkins pipeline script to copy artifacts of current build to server location

I want to create a Jenkins job which does following:
Git > mvn build > copy jar to some location on the server.
Can this be done using a single job, or does it need two?
Also, which is the preferred way of doing this: is a pipeline preferred over a Maven job?
I have created this pipeline script, but it does not copy the current build's jar to the server location; it copies the previous build's artifact jar.
node {
    def mvnHome
    stage('Preparation') { // for display purposes
        // Get some code from a GitHub repository
        git 'git@github.pie.ABC.com:abcdef/BoltRepo.git'
        mvnHome = tool 'M2'
    }
    stage('Build') {
        // Run the maven build
        if (isUnix()) {
            sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
        } else {
            bat(/"${mvnHome}\bin\mvn" -Dmaven.test.failure.ignore clean package/)
        }
    }
    stage('Results') {
        archiveArtifacts 'target/*/BoltRepo*.jar'
    }
    stage('Deploy Artifact') {
        copyArtifacts(
            projectName: currentBuild.projectName,
            filter: 'target/*/BoltRepo*.jar',
            fingerprintArtifacts: true,
            target: '/ngs/app/boltd/bolt/bolt_components/bolt_provision/test',
            flatten: true)
    }
}
What is the best way of achieving this.
I haven't used pipelines before, but I have done what you want using "ArtifactDeployer" from the "Post-build Actions" in the job's configuration.
Note: you will need to install "Artifact Deployer Plug-in"
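As for why the previous build's jar gets copied: without an explicit selector, copyArtifacts defaults to the last successful completed build, which excludes the build that is still running. A sketch of pinning the selector to the current run, assuming the `specific` build selector provided by the Copy Artifact plugin:

```groovy
stage('Deploy Artifact') {
    copyArtifacts(
        projectName: currentBuild.projectName,
        filter: 'target/*/BoltRepo*.jar',
        fingerprintArtifacts: true,
        // Select this build's own (already archived) artifacts
        // instead of the last completed build's.
        selector: specific(env.BUILD_NUMBER),
        target: '/ngs/app/boltd/bolt/bolt_components/bolt_provision/test',
        flatten: true)
}
```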

jenkinsfile copyArtifacts fails

I have the Copy Artifact plugin installed and am trying to build and deploy through a Jenkins pipeline with the following Jenkinsfile.
The parameter DEPLOY_BUILD_NUMBER defaults to the current build number. I want the pipeline to build and deploy if DEPLOY_BUILD_NUMBER is the current build number, OR to just deploy whatever build number is specified for DEPLOY_BUILD_NUMBER.
pipeline {
    agent { label 'windows' }
    parameters {
        string(
            name: 'DEPLOY_BUILD_NUMBER',
            defaultValue: '${BUILD_NUMBER}',
            description: 'Fresh Build and Deploy OR Deploy Previous Build Number'
        )
    }
    stages {
        stage('Build') {
            steps {
                echo "Building"
            }
            post {
                success {
                    archiveArtifacts artifacts: 'build.tar.gz', fingerprint: true
                }
            }
        }
        stage('Deploy') {
            steps {
                echo "Deploying...."
                script {
                    step([$class: 'CopyArtifact',
                          projectName: '${JOB_NAME}',
                          filter: "*.tar.gz"]);
                }
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
When I run this pipeline I get following error
java.lang.UnsupportedOperationException: no known implementation of interface jenkins.tasks.SimpleBuildStep is named CopyArtifact
Also tried
stage('Deploy') {
    steps {
        echo "Deploying...."
        copyArtifacts filter: '*.tar.gz', fingerprintArtifacts: true, projectName: '${JOB_NAME}'
    }
}
which failed with following error
java.lang.NoSuchMethodError: No such DSL method 'copyArtifacts' found among steps
and
stage('Deploy') {
    steps {
        echo "Deploying...."
        script {
            copyArtifacts filter: '*.tar.gz', fingerprintArtifacts: true, projectName: '${JOB_NAME}'
        }
    }
}
which gave me
java.lang.NoSuchMethodError: No such DSL method 'copyArtifacts' found among steps
What is the correct syntax for copyArtifacts? What am I missing here?
I would check the version of the Copy Artifact plugin you have installed (you can see that in /pluginManager/installed); the minimum version that supports pipeline is 1.39.
CopyArtifact defines a step, copyArtifacts, that you can use directly.
Check the step reference here
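With a current plugin version installed, the step form from the question should then work. A minimal sketch, using `env.JOB_NAME` rather than relying on the plugin's `'${JOB_NAME}'` token expansion:

```groovy
// Requires Copy Artifact plugin >= 1.39 for pipeline support.
copyArtifacts filter: '*.tar.gz',
              fingerprintArtifacts: true,
              projectName: env.JOB_NAME
```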

Reuse artifacts at a later stage in the same Jenkins project

I have a Jenkins pipeline whose Build step has an archiveArtifacts command.
After the Build step there is Unit test, Integration test and Deploy.
In Deploy step, I want to use one of the artifacts. I thought I could find it in the same place the Build step generated it, but apparently the archiveArtifacts has deleted them.
As a workaround I can copy the artifact before it is archived, but it doesn't look elegant to me. Is there any better way?
As I understand it, archiveArtifacts is more for saving artifacts for use by something (or someone) after the build has finished. I would recommend looking at using "stash" and "unstash" for transferring files between stages or nodes.
You just go...
stash includes: 'globdescribingfiles', name: 'stashnameusedlatertounstash'
and when you want to later retrieve that artifact...
unstash 'stashnameusedlatertounstash'
and the stashed files will be put into the current working directory.
Here's the example of that given in the Jenkinsfile docs (https://jenkins.io/doc/book/pipeline/jenkinsfile/#using-multiple-agents):
pipeline {
    agent none
    stages {
        stage('Build') {
            agent any
            steps {
                checkout scm
                sh 'make'
                stash includes: '**/target/*.jar', name: 'app'
            }
        }
        stage('Test on Linux') {
            agent {
                label 'linux'
            }
            steps {
                unstash 'app'
                sh 'make check'
            }
            post {
                always {
                    junit '**/target/*.xml'
                }
            }
        }
        stage('Test on Windows') {
            agent {
                label 'windows'
            }
            steps {
                unstash 'app'
                bat 'make check'
            }
            post {
                always {
                    junit '**/target/*.xml'
                }
            }
        }
    }
}