Jenkins matrix artifacts with PostBuildScript

I'd like to use the PostBuildScript plugin to deploy the artifacts from a Matrix job that runs on several slaves.
The slaves are archiving the artifacts, but it's unclear how to access them from the PostBuildScript. How can I get the matrix node artifacts into the master workspace where the PostBuildScript step is running?

There is a plugin called Copy To Slave Plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Copy+To+Slave+Plugin
It can copy artifacts from master to slave or vice versa. To get your work done you can use its "Copy files back to master node" feature, which copies the files back to the master's workspace. So you don't need the PostBuildScript plugin to copy the artifacts; this approach is simpler.
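If moving to a Pipeline job is an option, a minimal sketch of the same idea using stash/unstash (the 'linux-slave' label and the 'dist/' output directory are hypothetical) would be:
node('linux-slave') {
    // The build runs here on the slave; collect its outputs into a stash
    stash name: 'build-output', includes: 'dist/**'
}
node('master') {
    // Pull the stashed files into the master workspace for deployment
    unstash 'build-output'
}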

Related

How to read a YAML file from master using a Jenkinsfile pipeline script when the Jenkins job is running on a slave machine?

I am building a Shared Library (I'm not a CI engineer) for a Jenkins pipeline. As part of it, we are thinking of putting the configuration in a YAML file and having the Jenkinsfile pipeline script read from that YAML file.
So, we are planning to commit the Jenkinsfile and the YAML file in one git repository (let's say repo A), and the job is going to run on a slave machine utilising another git repository (let's say repo B). The Jenkinsfile will be executed from master after it clones repo A. The YAML file is also in the master workspace. On the slave, repo B will be cloned and a build will take place as defined in the Jenkinsfile. But the question I have is: how do I read the YAML file from the Jenkinsfile script without having to clone repo A on the slave, i.e., how do I reference a file present on master and not on the slave from the Jenkinsfile? This question arises because whatever file I am trying to open is being opened from the slave and not from the master.
Thanks in advance.
EDIT:
Something I forgot to mention: we actually thought of using stash from master and unstash on the slave. But the problem is that we only do the job configurations, and the environment is provided to us by another team which also provides it to many other teams. Because of that, master rarely has executors free. So, even when we run the job, it hangs waiting for the master to be free. Is there any other way to load the YAML file when the pipeline script is being loaded in the master's memory?
I think the most straightforward way is to use stash. Stashes are temporary packages that can be created on one node and copied to any other node.
There is some overhead on the master for the package creation, so stashes are not recommended for really big files, but they are ideally suited for your use case of transferring a small configuration file.
node('master') {
    // Create temporary stash package on master
    stash name: 'MyConfiguration', includes: 'SomeFile.yaml'
}
node('MySlave') {
    // Copy to and extract the stash package on slave
    unstash 'MyConfiguration'
}
This expects 'SomeFile.yaml' to be in the workspace on master and will also extract it to the workspace on the slave. In case you want a subdirectory of WORKSPACE, simply wrap the stash and/or unstash steps in a dir step.
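For example, a sketch of the subdirectory variant (the 'config' subdirectory name is only an illustration) might look like:
node('master') {
    dir('config') {
        // Stash the file from the config subdirectory of the master workspace
        stash name: 'MyConfiguration', includes: 'SomeFile.yaml'
    }
}
node('MySlave') {
    dir('config') {
        // Extract into the config subdirectory of the slave workspace
        unstash 'MyConfiguration'
    }
}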

Jenkins - Copy Artifacts from upstream job built on a different node

There is a job controlled by the Development team which is built on a different node. I am on the Testing team and want to take the artifacts and deploy them on a test device.
I can see those artifacts from dev are stored in some path on dev's node. Does it mean they must first be archived on the Jenkins master before I can copy them to my job?
I am using the Copy Artifact plugin and constantly getting the error
Failed to copy artifacts from <dev-job> with filter: <path-in-dev-node>
*Some newbie question since I just moved from TeamCity
You probably want to use the Copy Artifact plugin.
Adds a build step to copy artifacts from another project.
Consider also the Jenkins post-build step "Archive the artifacts".
If you copy from the other job's workspace, what happens if another job is in progress or the workspace is wiped? That step copies the artifacts from the node to the master and stores a copy along with the build logs, etc. That makes them available via the UI as long as the build logs remain. It can take up space, though.
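As a rough Pipeline sketch of that flow (the job name 'dev-job' and the artifact pattern are placeholders, and it assumes the dev job archives its outputs and the Copy Artifact plugin's copyArtifacts step is available):
// In the upstream (dev) job: archive the build outputs so they are stored on the master
archiveArtifacts artifacts: 'build/output/**/*.gz', fingerprint: true

// In the downstream (test) job: copy the archived artifacts into this job's workspace
copyArtifacts projectName: 'dev-job', filter: 'build/output/**/*.gz', selector: lastSuccessful()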
If you do use archived artifacts, consider using the system property jenkins.model.Jenkins.buildsDir to store all the build logs (and artifacts) outside of the jobs' config directory. Some downtime and work is required to separate the two (config / logs).
You may also want to consider using a proper repository manager (Nexus / Artifactory).
Finally, you may want to learn about using a Jenkins pipeline rather than relying on chained jobs, triggers, users and so forth. Why? Because it's much more controlled and easier to maintain.
PS: I'm not a huge fan of artifactDeployer, but it may work for you.
PPS: you may want to review this in-depth answer: Jenkins downstream job fails to find upstream artifacts

View Jenkins workspace on slave

We're running tests and producing build files on a Jenkins master and a Jenkins slave for extra parallelisation; our RCPTT tests take ages.
Our problem is that Jenkins -> -> Show workspace only shows the workspace on the master, so we have no way to get the builds except copying files manually over ssh.
We don't want duplication since different patches run either on master or slave, and we want to be able to get the files from both master and slave nodes.
You can use "Copy To Slave Plugin" to copy any files from master to slave.
If you use the Jenkins pipeline plugin https://jenkins.io/doc/book/pipeline/
then you can use stash / unstash:
https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-stash-code-stash-some-files-to-be-used-later-in-the-build
Here is an example: https://www.cloudbees.com/blog/parallelism-and-distributed-builds-jenkins
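For instance, a minimal sketch (the 'rcptt-slave' label, the test command and the results path are placeholders) that runs the two branches in parallel and archives each branch's output so it shows up on the master might be:
parallel onMaster: {
    node('master') {
        // Hypothetical test invocation; replace with the real RCPTT run
        sh './run-tests.sh'
        // Archived files are stored with the build record on the master
        archiveArtifacts artifacts: 'results/**'
    }
}, onSlave: {
    node('rcptt-slave') {
        sh './run-tests.sh'
        // Files archived on the slave are transferred to the master automatically
        archiveArtifacts artifacts: 'results/**'
    }
}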

How to copy the artifacts from slave to Jenkins workspace?

I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job is currently running on the slave, the artifacts are also created there.
I tried using Post-build Actions -> Archive the artifacts. But it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regard is highly appreciated.
Sounds like the Copy To Slave Plugin is what you need.
It can copy to the slave (before the build) and from the slave (after the build).
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as for the Copy files to slave node before building section.
If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to understand.
In case you just want to archive the artifacts already in JobA, then you are already heading in the right direction and need to check what you are missing... are you sure that the artifacts are in the current workspace?
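In Pipeline terms, a quick sanity check could look like this sketch (the slave label and the listing command are just illustrations):
node('my-slave') {
    // List the workspace to confirm the .gz files actually exist relative to its root
    sh 'ls -R .'
    // Archive relative to the workspace root; archived files are stored on the master
    archiveArtifacts artifacts: '**/*.gz'
}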
Doron

How can we iterate over the slave list in a Jenkins job?

Is it possible to iterate over slaves with a particular label in a Jenkins job?
For example, let's say I have a few slaves with the label "redhat". I have a job in which I want to logically do something like:
for slave in slave_list_with_label_redhat do
    ssh someuser@${slave.hostname}
done
Thanks in advance!!
Edit: use case in detail:
So this is to work around a bug in Jenkins where archiving artifacts fails from an AIX slave. https://issues.jenkins-ci.org/browse/JENKINS-13614
So what we really want to do is, once the build is complete on the build slave, scp the "build files" to the available Jenkins AIX slaves and install and run a few tests on the test slaves.
You might find https://wiki.jenkins-ci.org/display/JENKINS/Elastic+Axis to fit your needs.
I think this is used as part of a multi-configuration (Axis) job: when you create the job, you can select this as one of the axes.
I usually let my jobs download all necessary artifacts. So your option would be to add another build step to your build job and archive the artifacts manually outside of Jenkins. You can use SCM tools (not perfect for binary artifacts, but they offer great auditing features), binary artifact repositories (e.g. Nexus - best, but I have no idea about the auditing features) or just a remote filesystem (you can mount it locally on your slave machines). The build job just needs to pass some information to the test jobs so that these can get the right artifact. This way, Jenkins still decides where the test job(s) will run, but the slaves always have access to the artifacts.
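If Pipeline is an option, a rough sketch of iterating over the nodes that carry a label (this assumes the nodesByLabel step from the Pipeline Utility Steps plugin; the label and the command are placeholders) could be:
// Collect the names of the online nodes with the 'redhat' label
def slaves = nodesByLabel label: 'redhat'
for (name in slaves) {
    node(name) {
        // Hypothetical command run on each matching slave
        sh './deploy-and-test.sh'
    }
}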
