Jenkins Permission to Copy Artifact is always granted

I am trying to make our Jenkins setup more secure, but I am unable to protect artifacts with the Permission to Copy Artifact option.
I have two jobs, where the downstream job copies artifacts from the upstream job by using the Copy Artifact Plugin. This works as expected. Now I want to ensure that only this specific downstream job can copy the artifacts of the upstream job, but that doesn't seem to work.
As far as I understand, you should specify the name of the downstream job in the Permission to Copy Artifact option of the upstream job. I tried this, but no matter what I configure in the Permission to Copy Artifact option of the upstream job, the downstream job is always allowed to copy the artifacts.
So I would like to know: are there any global options that must be enabled or disabled for this to work? Is there anything else that I must configure before the Permission to Copy Artifact option actually limits permissions?
Edit: I'm using Copy Artifact Plugin version 1.38.1.

Since Copyartifact 1.30, there is a limitation: "Permission to Copy Artifacts" accepts only relative project names.
If you use the CloudBees Folders Plugin and the projects are located in different folders, you need to specify project names like "../folder/project".
This is fixed in Copyartifact 1.31.
Details: https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
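To illustrate the relative naming, here is a minimal downstream pipeline sketch, assuming hypothetical folders "teamA" (upstream job) and "teamB" (downstream job); on the affected Copyartifact versions, both the copyArtifacts call and the upstream job's "Permission to Copy Artifacts" entry need names written relative to the job's own folder:

    // downstream pipeline living in folder teamB, copying from /teamA/upstream
    node {
        copyArtifacts projectName: '../teamA/upstream',   // relative name, "../folder/project"
                      filter: '**/*.jar',
                      selector: lastSuccessful()
    }
    // with Copyartifact 1.30 the upstream job's "Permission to Copy Artifacts" field would
    // likewise need "../teamB/downstream"; per the answer above, 1.31 lifts this limitation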

Related

Jenkins - access archived artifacts from downstream job

I'm a little confused about what exactly Jenkins (2.74) is doing when archiving artifacts from a Maven project. It appears from the logs that Jenkins is automatically archiving artifacts in the project without having to specify a post-build action. The artifact files are indeed in the jobs/<project>/modules subdirectory and are available for download from the Jenkins web UI.
When triggering a follow-on job, I've added a build step to copy artifacts from the upstream job (which build: upstream build that triggered this job) without specifying artifacts. The log files for the downstream job state that it has copied artifacts from the upstream job, but I'm unable to see them in the workspace.
Are the artifacts available to the downstream job? If so, where are they?
Or do I need to explicitly archive the artifacts (with filenames) in the upstream job and explicitly copy the artifacts by filename in the downstream job?
In your downstream job, add a "Copy artifacts from another project" build step, where you specify the artifacts you want to retrieve from your upstream job and where you want to copy them.
In your case, copy the openidm-zip file into a local "archives" folder.
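For reference, here is a roughly equivalent pipeline sketch; the upstream job name and the zip pattern below are assumptions based on the question:

    node {
        // copy the openidm zip produced by the upstream build into a local "archives" folder
        copyArtifacts projectName: 'upstream-job',        // assumed upstream job name
                      filter: '**/openidm-*.zip',         // assumed artifact pattern
                      target: 'archives',
                      selector: upstream()                // the upstream build that triggered this job
    }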

Jenkins config file provider plugin in a slave

I want to use a config file provided by the Config File Provider plugin in a pipeline project.
However, when I run a build step inside a slave, I get a "PermissionDenied" exception. The same runs on the master, however.
So the question is: what is the best possible way to share files between the master and slaves? I may not be able to use the Copy To Slave plugin as there doesn't seem to be pipeline support.
If you want to share files between stages or nodes, you can use the stash and unstash steps; see the example below.
If you want to share files between builds, you can use the archive step and the Copy Artifact Plugin.
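A minimal sketch combining the two ideas from the question, assuming a managed file with the hypothetical id 'my-settings' and a hypothetical agent label 'linux-slave': read the config file where the Config File Provider step works for you, stash it, then unstash it on the slave.

    node('master') {
        configFileProvider([configFile(fileId: 'my-settings', variable: 'SETTINGS')]) {
            sh 'cp "$SETTINGS" settings.xml'   // materialise the managed file in the workspace
        }
        stash name: 'config', includes: 'settings.xml'
    }
    node('linux-slave') {
        unstash 'config'                       // same file, now available on the slave
        sh 'cat settings.xml'
    }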

Using reference to artifact without multiplying the artifact

I've got a job that builds a jar artifact.
In other jobs I want to use this artifact.
As I understand, there is a plugin called "Copy Artifact Plugin", but it copies the file.
I don't want to have a copy of this artifact in every job I create; I want to pass a reference to this artifact.
Is it possible?
Thanks!
This is not technically feasible in a situation with multiple slaves, so I believe there is no such functionality in any Jenkins plugin.
You have a choice to either:
- Force the second job to run on the master and calculate the artifact file path from the root path configured in the Jenkins settings, $JOB_NAME and $BUILD_NUMBER (a sketch follows below), or
- Upload the artifact to an external artifact repository and reference it later, or
- Save artifacts in a common shared folder accessible from every node.
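A minimal sketch of the first option, assuming the default on-disk layout $JENKINS_HOME/jobs/<job>/builds/<number>/archive/, a hypothetical upstream job 'build-jar' that archives target/app.jar, and a hypothetical UPSTREAM_BUILD_NUMBER job parameter:

    node('master') {
        // reference the archived jar in place on the master instead of copying it
        def jarPath = "${env.JENKINS_HOME}/jobs/build-jar/builds/${params.UPSTREAM_BUILD_NUMBER}/archive/target/app.jar"
        sh "ls -l '${jarPath}'"                // consume the jar directly from this path
    }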

copy file from workspace of the upstream running build

I'm using a Jenkins job to trigger a few downstream jobs. I pass parameters through a properties file. But there's a file that was uploaded when the upstream job was submitted that I want to pass to the downstream jobs. There is an option under the Copy Artifact Plugin that allows copying from the workspace of the latest completed upstream job.
The problem is that my upstream job is blocked on the downstream jobs and cannot complete before them. This is the same reason that I cannot copy the file as an artifact, as archiving artifacts is only possible as a post-build action (AFAIK).
Is there any way around this problem?
Could you stick the uploaded file from the upstream job into an online file repository like Artifactory, or an external network/file share, and access it in the downstream job?
That way, you just need to pass the path of the file rather than the entire file, and the child job can download it.
You could even use the build number of the upstream job as the unique identifier of the artifact, so you just need to pass the build number down to download it.
http://myonlinerepository/{build number}/upload.zip
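A rough sketch of that idea, assuming the hypothetical repository URL above and a downstream job that takes the build number as a parameter:

    // upstream: upload the file under this build's number, then hand only the number down
    node {
        sh "curl -T upload.zip http://myonlinerepository/${env.BUILD_NUMBER}/upload.zip"
        build job: 'downstream-job',
              parameters: [string(name: 'UPSTREAM_BUILD', value: env.BUILD_NUMBER)]
    }

    // downstream: fetch the file by the number it was given
    node {
        sh "curl -O http://myonlinerepository/${params.UPSTREAM_BUILD}/upload.zip"
    }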

Access Jenkins host drive, beside the job workspace

I would like to share byproducts of one Jenkins job with another one that runs after it.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to move a few files to a location that is read by the next job.
So far I can't find out how to actually tell Jenkins jobs to look in a specific folder, since Jenkins has no concept of the file system beyond what is going on in the job's workspace folder.
Is there a way to access the host file system, or declare a shared folder inside Jenkins (like in the main workspace folder, which contains all the other jobs?), so I can copy and read files in it from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
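The pipeline equivalent of those two steps, as a sketch (the job name and file paths are assumptions):

    // first job: mark the generated files as artifacts
    node {
        sh './build.sh'                                  // whatever produces the output/ files
        archiveArtifacts artifacts: 'output/**'
    }

    // second job: grab them; flatten: true drops the re-created directory structure
    node {
        copyArtifacts projectName: 'first-job',
                      filter: 'output/**',
                      flatten: true,
                      selector: lastSuccessful()
    }
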
Configure Jenkins to run a Maven build and deploy your artifacts with "mvn clean deploy". This will push them to an "artifact server", which you probably already have or, if not, need to add and configure.
Then in your downstream job, also a Maven job, configure it to depend on the same artifact that was published by the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.
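A bare-bones sketch of that flow, assuming Maven is available on the agents and the pom's distributionManagement already points at your artifact server:

    // upstream job: build and push the jar to the artifact server
    node {
        sh 'mvn clean deploy'
    }

    // downstream job: its pom declares the same artifact as a dependency, so a normal
    // build pulls the freshly deployed version back down from the artifact server
    node {
        sh 'mvn clean package'
    }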
