How to copy the artifacts from slave to Jenkins workspace? - jenkins

I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job is currently running on the slave, the artifacts are also created there.
I tried using post-build actions ---> Archive the artifacts. But it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regards is highly appreciated.

Sounds like Copy To Slave Plugin is what you need
It can copy to slave (before build) and from slave (after build)
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as in the Copy files to slave node before building section.
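As a minimal illustration (assuming the .gz files from the question are produced somewhere under the slave's workspace), an Ant-style include pattern such as
**/*.gz
in the plugin's files field should bring the matching files back to the job's workspace on the master.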

If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to understand.
In case you just want to archive the artifacts already in JobA, then you are already on the right track and need to check what you are missing... are you sure that the artifacts are in the current workspace?
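As a quick sanity check (a sketch; assumes a Unix slave), an Execute shell step placed just before the archiving action can list any matching files in the workspace:
# list every .gz file anywhere under the job's workspace
find "$WORKSPACE" -name '*.gz' -print
If nothing is printed, the build isn't producing the .gz files where the archiver expects them.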
Doron

Related

Change working directory during Jenkins build (not gradle, not maven)

Can I change the working directory during a Jenkins job for all successive steps?
My job's first step checks out a git project. Unfortunately this project has a mix of technologies; it's not a java/maven project (so the trick of 'mvn -f subdir/pom.xml' doesn't apply) nor is it a gradle project. So I'd like to change to a subdirectory of the checked-out project and start running Jenkins plugins, like invoking shell scripts, like running tox to test python code, like running docker to build images, etc.
Maybe Jenkins wants every step to begin in $WORKSPACE, and allowing a directory change during the job would break some vital assumptions?
I know this has been asked before. Similar questions but answers specific to maven:
Jenkins Maven Build -> Change Directory and
Jenkins: How To Build multiple top-level projects from a git repository?
Similar question but answer specific to gradle:
Change directory during a build job on jenkins
You can separate out jobs based on subfolders and use a filter in your SCM configuration so that only the subfolder you want for that job gets cloned into your workspace. As the first step of your build, use a batch/shell command to move all files from the subfolder to the workspace, and then run all the steps that you want; see the sketch below.
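A minimal sketch of that first build step (an Execute shell step on a Unix node; "subdir" is a hypothetical folder name):
# flatten the checked-out subfolder into the workspace root
# (hidden files would need extra handling)
mv "$WORKSPACE/subdir/"* "$WORKSPACE/"
# later steps (shell scripts, tox, docker builds, ...) now run from the workspace root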

Jenkins - Copy build log from master to a shared drive

Can someone point me in the right direction here? I have a simple job configured in Jenkins on a Windows environment (master and all slaves running on Windows), and the job is supposed to run on a particular slave. When the job is built, the build log (log.log) gets stored in "%JENKINS_HOME%\jobs\\builds\%BUILD_NUMBER%\" on the master.
I do have a Jenkins workspace (which is required when you add a slave node) set on the slave for this job, where nothing gets stored when the job runs.
With this scenario, I would like to copy the build log (the log.log file that's available on the master) to a shared drive. Please advise how to get this done. I have tried a few plugins ("Copy To Slave", "Copy Artifact Plugin" and "ArtifactDeployer Plugin") but could not get them working for what I need.
Use a second build action with the execute batch option. Put the copy command there to copy the log to another location.
The following command kind-of works:
curl ${BUILD_URL}consoleFull -o ${TargetDir}/Log.txt
where
TargetDir="${WORKSPACE}/Directory/target"
BUILD_URL and WORKSPACE are set by Jenkins. Unfortunately Jenkins doesn't copy the whole log. I've tried consoleText and gotten the same result: partial log files. :-(
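A likely cause of the truncation is that the curl runs as a build step, i.e. while the build is still writing its log, so only the output produced up to that point is returned. A hedged variant (USER and API_TOKEN are placeholders, and it assumes the job is readable with those credentials) is to fetch consoleText as late as possible, for example from a post-build script step:
curl -u USER:API_TOKEN "${BUILD_URL}consoleText" -o "${TargetDir}/Log.txt"
Even then, the very last post-build lines will not be in the copy.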

Jenkins downstream job fails to find upstream artifacts

The setup is used to build and deploy to Adobe AEM.
The master build job pulls from a git repository, builds and packages, runs the tests and then fires downstream jobs that should use the built packages from the upstream job.
The issue is that downstream job fail with the message:
Unable to access upstream artifacts area /var/lib/jenkins/jobs/PROJECTNAME-Master-Branch/builds/2014-10-22_11-33-46/archive. Does source project archive artifacts?
It seems to me that somehow the CopyArtifacts plugin, triggered by the downstream job, is looking for the artifacts in the wrong location. The correct location would be
/var/lib/jenkins/jobs/PROJECTNAME-Master-Branch/workspace/PROJECTNAME-*/**/*.jar,/var/lib/jenkins/jobs/PROJECTNAME-Master-Branch/workspace/PROJECTNAME-*/**/*.zip
But then, it complains about
java.io.IOException: Expecting Ant GLOB pattern, but saw '/var/lib/jenkins/jobs/PROJECTNAME-Master-Branch/workspace/PROJECTNAME-*/**/*.jar,/var/lib/jenkins/jobs/PROJECTNAME-Master-Branch/workspace/PROJECTNAME-*/**/*.zip'. See http://ant.apache.org/manual/Types/fileset.html for syntax
The downstream job copies artifacts from another project, and the build to copy from was set to either "Upstream build that triggered this job" or "Copy from workspace of latest completed build". Neither works.
Any ideas?
TL;DR
You are trying to use artifacts without archiving them first.
You are trying to use absolute paths, but they should be relative to $WORKSPACE and/or "archive location".
Full Answer
You are misunderstanding the concept of "Artifacts" as it relates to Jenkins.
What are Jenkins Artifacts
Artifacts are files that are specifically preserved after the build with the help of Archive the Artifacts post-build action.
When the build runs, it runs within:
$WORKSPACE, which on filesystem usually resides within
$JENKINS_HOME/jobs/$JOB_NAME/workspace
Inside there, you can have your SCM checkout folders, temporary build files, final built files, binaries, etc.
The contents of $WORKSPACE are volatile; you should never rely on them outside of the build timeframe (and downstream jobs are outside of the build timeframe). The contents of $WORKSPACE could differ between master/slave nodes, and could be deleted at any time by an admin or by an SCM update/cleanup/checkout.
It's also important to understand that there is only one $WORKSPACE for the whole Job.
But now pay attention to your Build History: there are several entries in that list, referenced by build number (#) and date timestamp.
These are stored under:
$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID
with $BUILD_ID being the date-timestamp of the build, like 2014-10-22_11-33-46
The $WORKSPACE contains the information relevant to current or last (and the problem is: you can never be sure if it's "current" or "last") build;
The builds folder contains a record of all past (retained) build executions (this is what makes up the Build History list on your left), per build.
By default, it contains only what Jenkins itself needs: build.xml copy, changelog information, console log. When you go to URL http://$JENKINS_URL/job/$JOB_NAME/[nn]/ where [nn] is a numeric job build/run number (#), it's reading this information from the builds folder on the filesystem.
To preserve artifacts of a build (to avoid them being overwritten by the next build or a wiped-out workspace, or just to access older builds), you need to Archive the Artifacts (with the post-build action of the same name). When you archive the artifacts, you indicate which files within $WORKSPACE you want to preserve. When Jenkins does the archiving, it will place those files (keeping their paths relative to $WORKSPACE) into:
$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive/.
This way, you can have multiple sets of artifacts preserved for previous builds, not just "latest/last" from $WORKSPACE.
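As a concrete illustration (the file name is hypothetical): a file built at $WORKSPACE/PROJECTNAME-core/target/app.jar and archived with the pattern PROJECTNAME-*/**/*.jar would be preserved as:
$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive/PROJECTNAME-core/target/app.jar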
For the sake of completeness, I will mention that Jenkins's "permalinks", such as http://$JENKINS_URL/job/$JOB_NAME/lastSuccessfulBuild and /lastFailedBuild, etc are in fact symlinks on the filesystem to one of the preserved builds/$BUILD_ID folders.
Lastly, you control how many build runs and how many sets of artifacts are retained (the two can be configured separately) through the "Discard old builds" checkbox in the job configuration. By default, all are retained, but if you start retaining artifacts, you need to think about hard-disk space capacity.
Solutions to your problem
So with the information above, and looking at your error messages, you should now see that the Copy Artifacts plugin is correctly looking for artifacts under the /archive/ section of a build.
You should also notice that Copy Artifacts plugin does not let you pick "current build" when selecting which build to copy from. It has permalinks (like "last successful" or "last build"), and specific build numbers, all of which translate to preserved builds under $JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive/
Even "Upstream Build that triggered this job" will link to a specific $BUILD_ID.
In either of the options below:
Configuration for Archiving Artifacts is relative to $WORKSPACE.
Configuration for Copy Artifacts is relative to "archive location", that is $JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive/.
Since "Copy Artifacts" is relative to "archive location", and "archive location" is relative to $WORKSPACE, then for all intensive purposes, the relative paths of both configurations can be same and relative to $WORKSPACE
Option 1
First Archive the Artifacts with the post-build action, otherwise you have nothing to copy from.
If you have your files in the root of $WORKSPACE, it should be:
PROJECTNAME-*/**/*.jar,PROJECTNAME-*/**/*.zip
(Note: these are not full paths)
Then use Upstream Build that triggered this job for Copy Artifacts selection.
For Artifacts to copy field use either:
** or blank to copy all archived artifacts, or
PROJECTNAME-*/**/*.jar,PROJECTNAME-*/**/*.zip (same as the archiving section)
Option 2
If you don't want to archive, you can use $WORKSPACE directly with Copy from workspace of latest completed build; however, you must ensure that no second upstream build can run while the downstream build is executing, or else you risk getting a partial file from a partial build, because, as previously explained, $WORKSPACE is volatile.
Again, for the Copy Artifacts step, under Artifacts to copy field, use path relative to $WORKSPACE, that is:
PROJECTNAME-*/**/*.jar,PROJECTNAME-*/**/*.zip
Option 3
If you really want to copy the whole WORKSPACE between different jobs, use either
Clone Workspace SCM plugin or
Shared Workspace plugin
The fix may be this simple: disable or remove Compress Artifacts plugin and reboot Jenkins.
This workaround was deduced from a long-standing bug report: "Copy Artifacts Plugin" should support ArtifactManager.
The solution lies in the configuration of the downstream job.
The root cause sits in the configuration of the downstream job: once "Copy from workspace of latest completed build" is chosen as the build to copy from, and the path of artifacts to copy is set to a relative path, such as projectname-*/**/*.jar,projectname-*/**/*.zip, the build succeeds.
Furthermore, in the parent job configuration, the downstream job needs to be allowed to CopyArtifact, and the "Projects to allow copy artifacts" field should specify the downstream job.
Edit: Now I see that you responded in the meantime. Great answer and basically clears up some of the questions I had.
The one unclear thing about option 1 is that archiving of the files happens after the parent job completes.
Waiting for the completion of projectname-Deploy
projectname-Deploy #19 completed. Result was SUCCESS
Waiting for the completion of projectname-Deploy
projectname-Deploy #20 completed. Result was SUCCESS
Build step 'Trigger/call builds on other projects' changed build result to SUCCESS
Strings match run condition: string 1=[lab2b], string 2=[both]
Run condition [Strings match] preventing perform for step [BuilderChain]
Archiving artifacts
Once I changed the approach to option two it worked for me, but I would like to understand the first option as well.

jenkins matrix artifacts with PostBuildScript

I'd like to use the PostBuildScript plugin to deploy the artifacts from a Matrix job that runs on several slaves.
The slaves are archiving the artifacts, but it's unclear how to access them from the PostBuildScript. How can I get the matrix node artifacts into the master workspace where the PostBuildScript job is running?
There is a plugin called Copy To Slave Plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Copy+To+Slave+Plugin
This plugin can copy artifacts from the master to a slave or vice versa. In order to get your work done you can use this plugin: it has a feature called "Copy files back to master node", which will copy the files back to the master's workspace. So you don't need the PostBuildScript plugin to copy artifacts; this way is simpler.

How to copy build XMLs from Master to the Slave node in Jenkins?

I have created a Jenkins job which used to run on the master, and I have the build.xml file on the master.
Now I have added a slave node and added the setting Restrict where this project can be run so that my job always runs on a particular slave.
Now my build jobs are failing and I can see:
[EnvInject] - Loading node environment variables.
Building remotely on demo_slave_inst2 (slave1) in workspace /root/slave/workspace/demo_job
FATAL: Unable to find build script at /root/slave/workspace/demo_job/autobvt.xml
Build step 'Invoke Ant' marked build as failure
Recording test results
Finished: FAILURE
This autobvt.xml file already exists on the master, so it looks like I need to copy this file over to the slave node manually, which does not seem very handy.
How can I instruct Jenkins to copy this file as part of the build?
Use "Copy data to Workspace" http://wiki.jenkins-ci.org/display/JENKINS/Copy+Data+To+Workspace+Plugin using which you can copy the files from master to slave and run them as a part of build process (No manual effort needed!)
I sorted the issue using the Copy to Slave plugin.
