I'm using a Jenkins job to trigger a few downstream jobs. I pass parameters through a properties file. But there's a file that was uploaded when the upstream job was submitted that I want to pass to the downstream jobs. There is an option in the Copy Artifact Plugin that allows copying from the workspace of the latest completed upstream job.
The problem is that my upstream job is blocked on the downstream jobs and cannot complete before them. For the same reason, I cannot copy the file as an artifact, since archiving artifacts is only possible as a post-build action (AFAIK).
Is there any way around this problem?
You could stick the uploaded artifact from the upstream job into an online file repository like Artifactory, or an external network/file share, and access it in the downstream job.
That way, you just need to pass the path of the file rather than the entire file, and the child job can download it.
You could even use the build number of the upstream job as the unique identifier of the artifact, so you just need to pass the build number down to download it.
http://myonlinerepository/{build number}/upload.zip
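With freestyle jobs this can be done with a couple of "Execute shell" build steps; the repository URL, file names and the UPSTREAM_BUILD parameter below are placeholders, and the sketch assumes the repository accepts plain HTTP uploads and downloads:
# Upstream job, before it triggers the downstream jobs:
# push the uploaded file under this build's number.
curl -T upload.zip "http://myonlinerepository/${BUILD_NUMBER}/upload.zip"
# Pass only the build number down, e.g. through the properties file already in use.
echo "UPSTREAM_BUILD=${BUILD_NUMBER}" >> downstream.properties

# Downstream job: pull the file back using the number it was handed.
curl -o upload.zip "http://myonlinerepository/${UPSTREAM_BUILD}/upload.zip"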
I am trying to make our Jenkins setup more secure, but I am unable to protect artifacts with the Permission to Copy Artifact option.
I have two jobs, where the downstream job copies artifacts from the upstream job by using the Copy Artifact Plugin. This works as expected. Now I want to ensure that only this specific downstream job can copy the artifacts of the upstream job, but that doesn't seem to work.
As far as I understand, you should specify the name of the downstream job in the Permission to Copy Artifact option of the upstream job. I tried this, but no matter what I configure in the Permission to Copy Artifact option of the upstream job, the downstream job is always allowed to copy the artifacts.
So I would like to know: are there any global options that must be enabled or disabled for this to work? Is there something else that I must configure before the Permission to Copy Artifact option actually limits permissions?
Edit: I'm using Copy Artifact Plugin version 1.38.1.
Since Copyartifact 1.30, there is a limitation: "Permission to Copy Artifacts" accepts only relative project names.
If you use the CloudBees Folders Plugin and the projects are located in different folders, you need to specify project names like "../folder/project".
This is fixed in Copyartifact 1.31.
Details: https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
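As a quick illustration of the folder case above (the folder and job names are made up):
# Jobs as seen in Jenkins, using CloudBees Folders:
#   build/upstream-job      <- archives the artifacts and has the "Permission to Copy Artifact" setting
#   deploy/downstream-job   <- runs the Copy Artifact build step
# With Copyartifact 1.30, the upstream job's "Permission to Copy Artifact" field has to
# name the downstream job relative to the upstream job's own folder:
../deploy/downstream-job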
I'm a little confused about what exactly Jenkins (2.74) is doing when archiving artifacts from a Maven project. It appears from the logs that Jenkins is automatically archiving artifacts in the project without my having to specify a post-build action. The artifact files are indeed in the jobs/<project>/modules subdirectory and are available for download from the Jenkins web UI.
When triggering a follow-on job, I've added a build step to copy artifacts from the upstream job ("Which build: upstream build that triggered this job") without specifying artifacts. The log for the downstream job states that it has copied artifacts from the upstream job, but I'm unable to see them in the workspace.
Are the artifacts available to the downstream job? If so, where are they?
Or do I need to explicitly archive the artifacts (with filenames) in the upstream job and explicitly copy the artifacts by filename in the downstream job?
In your downstream job, configure the Copy Artifact build step to specify which artifacts you want to retrieve from your upstream job and where you want to copy them.
In your case, we copy the openidm zip file into a local "archives" folder.
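In concrete terms, the Copy Artifact build step in the downstream job would look roughly like this (the artifact pattern is only an example based on the zip mentioned above):
# "Copy artifacts from another project" build step:
#   Project name:       <name of the upstream job>
#   Which build:        Upstream build that triggered this job
#   Artifacts to copy:  **/openidm-*.zip
#   Target directory:   archives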
I've got a job that builds a jar artifact.
In other jobs I want to use this artifact.
As I understand it, there is a plugin called "Copy Artifact Plugin", but it copies the file.
I don't want to have a copy of this artifact in every job I create; I want to pass a reference to this artifact.
Is it possible?
Thanks!
This is not technically feasible in a situation with multiple slaves, so I believe there is no such functionality in any Jenkins plugin.
Your options are to either:
1. Force the second job to run on the master and calculate the artifact file path from the root path configured in the Jenkins settings, $JOB_NAME and $BUILD_NUMBER (sketched below), or
2. Upload the artifact to an external artifact repository and reference it from there later, or
3. Save artifacts in a common shared folder accessible from every node.
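A minimal shell sketch of the first option, assuming the first job archives the jar, the second job is forced onto the master, and the upstream job name and build number arrive as parameters; the parameter names and the jar path are hypothetical, and the layout below is the default for top-level jobs (folders or custom build directories change it):
# Archived artifacts live on the master under $JENKINS_HOME/jobs/<job>/builds/<number>/archive/
ARTIFACT="${JENKINS_HOME}/jobs/${UPSTREAM_JOB}/builds/${UPSTREAM_BUILD}/archive/target/myapp.jar"
# Use the jar in place instead of copying it into this job's workspace.
java -jar "${ARTIFACT}"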
I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job is currently running on the slave, the artifacts are also created there.
I tried using the post-build action "Archive the artifacts", but it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regard is highly appreciated.
Sounds like the Copy To Slave Plugin is what you need.
It can copy files to the slave (before the build) and back from the slave (after the build).
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as for the Copy files to slave node before building section (note that the label in the screenshot is old).
If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to understand.
In case you just want to archive the artifacts already in JobA, then you are already heading in the right direction and need to check what you are missing... are you sure that the artifacts are in the current workspace? (A quick check is sketched below.)
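For a freestyle job, one way to verify that is to add an "Execute shell" build step ahead of the archiving post-build action and list whatever matches the pattern:
# If this prints nothing, the post-build "Archive the artifacts" step
# has nothing to pick up either.
find "$WORKSPACE" -name '*.gz' -print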
Doron
I have a job that uses the Copy Artifacts Plugin to upload a .ipa file to TestFlight. I would like to only run this job by hand, not trigger it automatically. The job is configured with a build-selector parameter so that I can start the upload from one of a handful of similar jobs.
Is there an easy way (possibly with a plugin or script) to get the URL to the specific job that provided the artifact being uploaded?
Essentially I want to take the $BUILD_URL value from the upstream job so that I can include it in the TestFlight build notes.
(I am not sure if it directly pertains to what I want, but Get Jenkins upstream jobs seems to suggest that Groovy scripting might be the way to go. I also found a post on the Jenkins forums, http://jenkins-ci.361315.n4.nabble.com/Getting-upstream-job-s-build-number-td3167291.html that looked promising, but does not seem to apply to my scenario of manually triggered builds.)
Unless you are using "Trigger parameterized build..." on a downstream job, in which case you could pass along the predefined parameter UPSTREAM_BUILD_URL=$BUILD_URL, I think you would have to store the BUILD_URL with the artifacts.
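A minimal sketch of the second option, storing the URL with the artifacts (the file name is arbitrary): the producing job writes out its own $BUILD_URL and archives it next to the .ipa, and the TestFlight job reads it back after the Copy Artifact step has run.
# In the producing job, as a build step before the post-build archiving:
echo "${BUILD_URL}" > build_url.txt
# ...and add build_url.txt to the "Archive the artifacts" pattern.

# In the TestFlight job, after the Copy Artifact build step:
UPSTREAM_BUILD_URL="$(cat build_url.txt)"
echo "Upstream build: ${UPSTREAM_BUILD_URL}"   # use this in the build notes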