Jenkins artifacts corrupted when copied

I'm setting up a Jenkins declarative pipeline in which I need to copy an artifact from a different job. The artifact is of substantial size, 10.8 MB, and seems to get corrupted when copied. When I archive the copied artifact again in the second job, its size is reported as 10.78 MB. Is there any reason for this behaviour, or a way to avoid it?
The resulting file seems corrupted, and a byte-by-byte comparison reveals differences between the artifact in the first and second jobs.
I use the Copy Artifact Plugin for Jenkins like so:
step([$class: 'CopyArtifact',
    projectName: 'First_Job',
    filter: '**/*.rbf',
    fingerprintArtifacts: true,
    target: '.',
])
And I save the artifact for the second time like this:
archiveArtifacts artifacts: 'My_Artifact.rbf', fingerprint: true
The artifact is copied and renamed on the system by a bat script between being copied into the second job and being archived again.

After digging around on the second build machine, I found that the problem was a 'bug' in the Copy Artifact plugin. The copied artifact wasn't being cleaned up correctly after each build, and the plugin neither overwrites the previous artifact nor reports that it can't overwrite the file.
This gave the appearance of a successful copy while the pipeline was actually using the old artifact.
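A practical workaround that follows from this finding is to clear out any stale copy before the copy step runs. A minimal scripted sketch, assuming the artifact lands in the workspace root on a Windows agent (the file name is taken from the question; deleteDir() would be the blunter cross-platform option):

node {
    stage('Copy artifact') {
        // Remove any leftover copy from a previous build first;
        // Copy Artifact silently leaves the old file in place otherwise.
        bat 'if exist My_Artifact.rbf del My_Artifact.rbf'
        step([$class: 'CopyArtifact',
            projectName: 'First_Job',
            filter: '**/*.rbf',
            fingerprintArtifacts: true,
            target: '.',
        ])
        archiveArtifacts artifacts: 'My_Artifact.rbf', fingerprint: true
    }
}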

Jenkins Copy Artifact - does not copy any files

I am using Copy Artifact 1.46 in Jenkins 2.263.4 and want to copy a file from one job to another. However, it fails to do so. The error is always:
ERROR: Failed to copy artifacts from TestPack with filter: **
I have tried this with both a scripted pipeline job and a freestyle one, on both Windows and CentOS, with the same result. I know it has found the job, because I get an error if the job name is wrong. The job I want to copy from has only a single text file in its root directory.
My pipeline script is:
node ("${env.Node}") {
stage('dodeploy') {
copyArtifacts(projectName: 'TestPack');
}
}
I have tried copyArtifacts with and without a filter, and with and without a target. In the freestyle project I tried similar settings, but get exactly the same error.
I feel I must be missing something obvious, but cannot see what.
Turns out that I was not interpreting the 'Artifacts' part of 'copyArtifacts' literally enough. It looks as though copyArtifacts can only copy files that the source job has previously archived, in a post-build step (or a pipeline archiving step).
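To illustrate, a minimal sketch with a hypothetical file name: the source job has to archive the file first, and only then can the consuming job copy it:

// In the source job (TestPack): archive the file so it becomes an artifact.
node {
    stage('archive') {
        writeFile file: 'readme.txt', text: 'hello'   // hypothetical file
        archiveArtifacts artifacts: 'readme.txt'
    }
}

// In the consuming job: copyArtifacts now has something to copy.
node("${env.Node}") {
    stage('dodeploy') {
        copyArtifacts(projectName: 'TestPack')
    }
}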

Jenkins scripted pipeline - multiple execution with passing artifacts between stages

I have searched but could not find proper information on how to resolve an issue I have with copying artifacts to jobs that are executed multiple times in parallel.
I have defined a scripted pipeline that executes predefined jobs in stages, some of which run in parallel, as follows:
The main pipeline job is located at /jenkins/workspace/<main_job>.
These jobs also prepare artifacts, which I later copy to different stages/jobs in the same pipeline using the build ID of the executed job.
node() {
    stage("Creating Build") {
        def stages = [:]
        stages.failFast = true
        stages["Core"] = {
            copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + coreBuildJob)
            buildCore = build job: coreBuildJob
        }
        stages["Content"] = {
            copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + contentBuildJob)
            buildContent = build job: contentBuildJob
        }
        parallel(stages)
    }
}
I am using the Copy Artifact plugin to copy the artifacts that were created, but it appears that it copies the files to the main job's folder on the instance.
Because of the different workspace/project locations, I needed to set the 'target' parameter to copy the artifacts to the job that I execute later in the script.
e.g. for coreBuildJob in the Core stage:
`copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + <job_for_execution>)`
This does help me resolve the issue of copying the required artifacts to these jobs, but I then hit another problem:
I want this scripted pipeline job to be executed multiple times with different parameters.
The issue is that when the pipeline is executed for the 2nd time, the job run in one of the stages also runs a 2nd time and creates the following path on the local machine:
`/jenkins/workspace/test_jobs/<job_for_execution>#2`
That means that what I have in my script is not correct, because it copies the files to:
`/jenkins/workspace/test_jobs/<job_for_execution>`
so it does not copy the artifacts to the proper location, and they are not accessible from the executed job.
I thought of having the copyArtifacts part executed as part of the 'build job' command (as you can in the Jenkins UI, by passing BUILD_ID as a variable for copying artifacts), but I can't find any details on achieving the same behaviour in a script.
How can this issue be resolved?
You can use stash/unstash.
After running your build, you can stash:
stash name: 'data', includes: '**/*'
where data is the name (an identifier) and includes is an Ant-style pattern that can match a directory, a subdirectory, or a single file.
Then, in the stages you want to have the output of your build, use unstash:
unstash 'data'
After doing unstash, the files will be available in the corresponding folder and you can run your other steps.
Refer to https://www.jenkins.io/doc/pipeline/examples/ for more information.
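Put together, a minimal sketch of the stash/unstash approach (build commands and file names are hypothetical; note that a stash only lives within a single pipeline run, it does not cross job boundaries):

node {
    stage('Build') {
        // Produce the artifact in this run's workspace (hypothetical build step).
        sh 'mkdir -p build && echo data > build/output.bin'
        stash name: 'data', includes: 'build/**'
    }
}
def branches = [:]
branches['Core'] = {
    node {
        unstash 'data'   // the files reappear under build/ in this node's workspace
        sh 'ls build/'
    }
}
branches['Content'] = {
    node {
        unstash 'data'
        sh 'ls build/'
    }
}
parallel(branches)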

Jenkins trying to copyArtifacts from a build that I trigger

I have installed the Copy Artifact plugin and created two freestyle jobs: experiment-main and experiment-1.
experiment-1 just creates a file called artifact.txt with the build number in it, and archives it.
experiment-main triggers experiment-1 and then tries to copy the artifact, but this is the result:
Running as SYSTEM
Building on master in workspace /var/lib/jenkins/workspace/experiment-main
Waiting for the completion of experiment-1
experiment-1 #4 started.
experiment-1 #4 completed. Result was SUCCESS
Build step 'Trigger/call builds on other projects' changed build result to SUCCESS
ERROR: Unable to find a build for artifact copy from: experiment-1
Finished: FAILURE
which isn't what I expected (or at least what I was hoping for).
I hoped it would find the experiment-1 build that was triggered downstream from the current build.
Any ideas?
I figured out that there are environment variables containing the numbers of the triggered builds. To find the right variable, I printed all the environment variables with env and looked through the list.
Then I configured the Copy Artifact plugin to use that build number.
I couldn't do it the way @alex-o suggested, just taking the last build of the sub-job, because I might have more than one job using the sub-job at once; but if you don't have that problem, that might work for you.
Yes, this is unexpected behavior indeed.
The reason why this won't work is hidden in the help text of the "Upstream Project Name" input field:
Downstream builds are found using fingerprints of files. That is, a build that is triggered from a build isn't always considered downstream, but you need to fingerprint files used in builds to let Jenkins track them.
So, the Copy Artifact plugin relies on fingerprint data to determine job ancestry. For that reason, you cannot use the "Downstream build of..." feature with the current job as the parent: fingerprints are recorded in a post-build step, so an ongoing build of experiment-main does not have any fingerprints associated with it by the time it is looking for a matching build of experiment-1.
It is possible to modify fingerprint information at build run-time (e.g., via Groovy), but then, it's probably best to avoid the Copy-Artifact plugin entirely and to implement the whole procedure in Groovy right away.
Your best bet is probably to refer to experiment-1 via "Last successful build" and to ensure that this is the build you triggered before (usually this will be correct, but depending on your setup there can be race conditions).
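If the triggering side can be a pipeline rather than a freestyle job, the race can be avoided altogether by capturing the build number that the build step returns and copying from exactly that run. A minimal sketch, assuming the copyartifact plugin's specific selector is available:

node {
    stage('trigger and copy') {
        // 'build job:' waits for the downstream build and returns a handle to it
        def downstream = build job: 'experiment-1'
        // Copy from exactly the run we just triggered, not "last successful".
        copyArtifacts(projectName: 'experiment-1',
                      selector: specific("${downstream.number}"),
                      filter: 'artifact.txt')
    }
}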

Jenkins Copy Artifact unable to find folder/multiProjectPipeline/branchWithSlash

I have Jenkins LTS 2.60.2 on Windows Server 2016 and am using these plugins:
Folders plugin (6.1.0)
Copy Artifact plugin (1.38.1)
Pipeline plugin (2.5) + all dependent pipeline sub-plugins
Various other dependent plugins...
See Pipeline to use artifacts from 2 projects associated by the same git branch name for more details about my setup, but to sum it up I have these items:
playground (a folder created with the Folders plugin to group all these following items)
frontend (multibranch pipeline)
backend (multibranch pipeline)
configure (pipeline with a parameter called BRANCH_NAME)
The frontend and backend git repos both have a branch called master and one called release/2017.2.
The idea is to call the configure pipeline automatically after each successful build, passing the git branch name. Automatically triggering the configure pipeline works.
What doesn't work, and where I need your help, is the step inside the configure pipeline that copies the artifacts from a specific branch of a multibranch pipeline.
If the BRANCH_NAME parameter (from the upstream pipeline) is master, it works. If BRANCH_NAME is release/2017.2, I get this error:
ERROR: Unable to find project for artifact copy: playground/frontend/release%2f2017.2
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
The configure pipeline looks like this:
node {
    stage('Prepare') {
        def projectname = "playground/frontend/" + "${BRANCH_NAME}".replace("/", "%2f")
        step([$class: 'CopyArtifact', projectName: "${projectname}", selector: [$class: 'StatusBuildSelector', stable: false]])
    }
    stage('Archive') {
        archiveArtifacts '**'
    }
}
As you can see, I already replace / with %2f (it's needed).
If I don't use the "playground" folder (all my pipelines at top level, not inside a folder item), it works. If I use the folder and the master branch, it works. It doesn't work if I use the folder and a branch name like release/2017.2. What am I doing wrong? Can you help me make it work? And if it is a bug in the Copy Artifact plugin (I searched https://issues.jenkins-ci.org and found some bugs where a similar setup with folders didn't work, but they have been fixed... so I really wonder...), please file the bug and share the link here, so we can all monitor its progress.
Thank you.
I finally found the issue. The configure pipeline was failing to find a branch with a slash because the encoding was incorrect.
So, in my question, in the configure pipeline:
replacing / with %2f like this is wrong and generates the error:
def projectname = "playground/frontend/" + "${BRANCH_NAME}".replace("/", "%2f")
this is the proper way to encode the slash, and it works:
def projectname = "playground/frontend/" + URLEncoder.encode("${BRANCH_NAME}", "UTF-8").replace("+", "%20")
Credits to: http://www.pipegrep.se/copy-artifacts-from-jenkins-pipeline-jobs.html
UPDATE: actually, I investigated a bit further. I added echo "${projectname}" just before the step, with both the previous and the fixed projectname, and noticed that the only difference was that %2f was lowercase.
Uppercase %2F works:
def projectname = "playground/frontend/" + "${BRANCH_NAME}".replace("/", "%2F")
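A quick snippet to make the difference visible (hypothetical values; URLEncoder produces the uppercase escape, which is apparently what the project-name lookup expects):

def branch = 'release/2017.2'
echo branch.replace('/', '%2f')           // release%2f2017.2 -> lookup fails
echo branch.replace('/', '%2F')           // release%2F2017.2 -> works
echo URLEncoder.encode(branch, 'UTF-8')   // release%2F2017.2 -> works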
So, the fixed configure pipeline looks like this (I kept my replace function, which was enough for my case):
node {
    stage('Prepare') {
        def projectname = "playground/frontend/" + "${BRANCH_NAME}".replace("/", "%2F")
        step([$class: 'CopyArtifact', projectName: "${projectname}", selector: [$class: 'StatusBuildSelector', stable: false]])
    }
    stage('Archive') {
        archiveArtifacts '**'
    }
}
I created a sample project to try to recreate what you were seeing, and I was able to, after a fashion, except that the build I had trouble with was master instead of release/2017.2. Eventually, I realized that I was invoking the build job incorrectly from the frontend project. It gave me the same error as you because I had never completed a successful build of the frontend/master branch (I had completed a successful build of the release/2017.2 branch, since it initially wasn't set to trigger the configure build, so that branch didn't hit the error until I did configure the trigger).
What worked was changing the build job in the frontend Jenkinsfile to this:
build job: 'playground/configure', parameters: [[$class: 'StringParameterValue', name: 'BRANCH_NAME', value: env.BRANCH_NAME]], quietPeriod: 2, wait: false
Adding the quietPeriod gives a couple of seconds of quiet time between jobs (I'm not certain this is critical, but it seems like a nice fail-safe to make sure the previous build has time to complete). The important part is wait: false, which tells Jenkins that this build shouldn't wait for the triggered build to complete. Once I changed that, the frontend/master branch completed successfully, and the configure build it triggered also completed successfully.
Hopefully this helps. I was able to get both my master and release/2017.2 branches to build properly, so I don't believe there's any intrinsic problem with the / in the project name. You can see my simple Jenkinsfiles in the referenced repo, and I used the same pipeline script as you posted in your question.

Jenkins pipeline share information between jobs

We are trying to define a set of jobs on Jenkins that each do one specific action. JobA1 will build a Maven project, while JobA2 will build .NET code; JobB will upload the result to Artifactory; JobC will download it from Artifactory; and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and always get some output; whatever happens in between is something I don't care about. On the other hand, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package via JobC and then deploy to staging.
PipelineC will download the package via JobC and then deploy to production, based on some internal validations.
I have tried to get some variables from JobA1 (basic POM values such as the artifactId or version) injected into JobB, but the information doesn't seem to be transferred.
The same happens when downloading files: I call JobC, but the file stays in that job's workspace, unavailable to any other job, and I'm afraid the "External Workspace Manager" plugin adds too much complexity.
Is there any way, other than sharing the workspace, to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path, or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
stage('TargetJob') {
    dir("/var/lib/jenkins/jobs/TargetJob/workspace/") {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This pipeline feature is handier than archiving artifacts, since the stash is kept only locally for the duration of the build and is deleted once the build finishes (which helps with data management).
You can also use the Copy Artifact plugin.
There are a few steps to consider when copying an artifact (a pipeline equivalent is sketched after this list):
a) Archive the artifacts in the host project and assign permissions.
b) In the host job's configuration, select 'Permission to copy artifact' → Projects to allow copy artifacts: *
c) Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
d) Copy the required artifacts from the host to the target project:
Create a Build step → Copy artifacts from another project → enter the Project name (the host project), Which build (e.g. Latest successful build), Artifacts to copy (the host project's files), Target directory (the local folder location).
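For reference, here is the same flow expressed as pipeline steps rather than UI configuration; a minimal sketch with hypothetical job and path names, using the copyartifact plugin's lastSuccessful selector:

// Host job: archive the artifact so other projects are allowed to copy it.
node {
    stage('Build') {
        sh 'mvn package'                           // hypothetical build
        archiveArtifacts artifacts: 'target/*.jar'
    }
}

// Target job: copy the latest successful host build's artifacts.
node {
    stage('Fetch') {
        copyArtifacts(projectName: 'HostJob',      // hypothetical job name
                      filter: 'target/*.jar',
                      selector: lastSuccessful(),
                      target: 'incoming/')
    }
}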
For the first part of your question (passing variables between jobs), use something like the following post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results; similarly, a post action can be triggered on a specific build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by JobB (provide the exact location of the job). Please note that a parameter with the same name must be defined in JobB.
Moreover, for sharing the workspace, you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share, and to define shared global variables.
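As a minimal sketch of that approach (library and variable names are hypothetical): a file vars/deployInfo.groovy in the library repository becomes a step callable from any pipeline that imports the library:

// vars/deployInfo.groovy in the shared-library repository
def call(String product) {
    // One central place for per-product settings that every pipeline reuses.
    return [artifactId: "${product}-app", repo: 'libs-release-local']
}

And in a consuming Jenkinsfile:

@Library('my-shared-lib') _   // hypothetical library name configured in Jenkins
node {
    stage('Upload') {
        def info = deployInfo('productA')
        echo "Uploading ${info.artifactId} to ${info.repo}"
    }
}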
