Obtaining test results from another job in Jenkins

I have a Jenkins pipeline A that looks something like this:
Run pre-build tests
Build project
Run post-build tests
Run another pipeline B with parameters extracted from the current build
I was wondering if there is a way to get the test results from pipeline B and aggregate them with the test results of pipeline A.
Currently, I have to open the console output and follow the URL to the external build.
If the above is not possible, is it possible to display this URL somewhere other than the console (e.g. as an artifact)?

I believe what you are looking for is "stash". The example below is copied directly from https://jenkins.io/doc/pipeline/examples/
Synopsis
This is a simple demonstration of how to unstash to a different directory than the root directory, so that you can make sure not to overwrite directories or files, etc.
// First we'll generate a text file in a subdirectory on one node and stash it.
stage "first step on first node"
// Run on a node with the "first-node" label.
node('first-node') {
    // Make the output directory.
    sh "mkdir -p output"
    // Write a text file there.
    writeFile file: "output/somefile", text: "Hey look, some text."
    // Stash that directory and file.
    // Note that the includes could be "output/", "output/*" as below, or even
    // "output/**/*" - it all works out basically the same.
    stash name: "first-stash", includes: "output/*"
}
// Next, we'll make a new directory on a second node, and unstash the original
// into that new directory, rather than into the root of the build.
stage "second step on second node"
// Run on a node with the "second-node" label.
node('second-node') {
    // Run the unstash from within that directory!
    dir("first-stash") {
        unstash "first-stash"
    }
    // Look, no output directory under the root!
    // pwd() outputs the current directory Pipeline is running in.
    sh "ls -la ${pwd()}"
    // And look, output directory is there under first-stash!
    sh "ls -la ${pwd()}/first-stash"
}
Basically, you can copy your artifacts (say, the .xml files produced by running your unit tests) from the first job to the node running the second job, then have the unit test processor run on the results from both the first and the second job.
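For instance, a minimal sketch of that idea within a single pipeline run (the node labels, the test scripts and the results/ layout are placeholders, not anything from the question):
// Sketch: stash the JUnit XML produced on one node, then unstash it next to
// another set of results and publish both in a single report.
node('test-node') {                        // placeholder label
    stage('Produce results') {
        sh 'mkdir -p results && ./run-tests.sh'   // hypothetical test runner writing results/*.xml
        stash name: 'upstream-results', includes: 'results/**/*.xml'
    }
}
node('another-node') {                     // placeholder label for the aggregating node
    stage('Aggregate results') {
        sh './run-more-tests.sh'           // hypothetical; writes its own results/*.xml
        dir('upstream') {
            unstash 'upstream-results'     // lands under upstream/results/
        }
        // The JUnit publisher accepts comma-separated Ant patterns,
        // so both result sets end up in one aggregated report.
        junit 'results/**/*.xml, upstream/results/**/*.xml'
    }
}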

Related

How to change the # and #tmp directory creation in the Jenkins workspace

Every time my build runs, I get 2 or 3 extra folders with # and #tmp suffixes.
Example: if my build name is test and I run the build, it fetches some code from Git and stores it in the Jenkins workspace under the names test, test#2, test#2tmp and test#tmp. The original folder is test; I only want the test folder and need to remove the other folders. How can I do this?
My present working directory is automatically chosen as test#2.
I want the default pwd to be /var/lib/jenkins/workspace/
I want to delete the #2 and #tmp folders and change my working directory back after the build runs.
Sample output:
pwd
/var/lib/jenkins/workspace/test#2
You can use customWorkspace in your Jenkins pipelines.
Example:
agent {
    node {
        label 'my-label'
        customWorkspace '/my/path/to/workspace'
    }
}
Note that Jenkins uses different directories to support concurrent builds:
If concurrent builds ask for the same workspace, a directory with a suffix such as #2 may be locked instead.
Since you don't want this behaviour, I advise you to use the disableConcurrentBuilds option:
Disallow concurrent executions of the Pipeline. Can be useful for preventing simultaneous accesses to shared resources, etc. For example: options { disableConcurrentBuilds() }
References on customWorkspace and disableConcurrentBuilds: https://jenkins.io/doc/book/pipeline/syntax/
Look for the agent { } block in your pipeline Groovy script. It should be only at the pipeline block level, not inside any stage block.
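Putting the two suggestions together, a minimal declarative sketch might look like this (the label and the workspace path are placeholders):
pipeline {
    // agent belongs at the pipeline level, not inside a stage block
    agent {
        node {
            label 'my-label'
            customWorkspace '/var/lib/jenkins/workspace/test'
        }
    }
    options {
        // avoid the suffixed (#2-style) workspaces that concurrent runs create
        disableConcurrentBuilds()
    }
    stages {
        stage('Build') {
            steps {
                sh 'pwd'   // should print the customWorkspace path above
            }
        }
    }
}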

Why does Jenkins shell file copy not work as expected (does not overwrite existing files)

I have a step in a Jenkins pipeline to copy some source files to the workspace.
stage('Copy Files') {
    script {
        echo 'Staging files'
        sh "cp -ar /home/dev/src/ ${env.WORKSPACE}"
    }
}
Yet, when I rerun the build it uses the old code. The only solution is to delete the workspace prior to the copy. On a normal Linux file system, a copy overwrites the destination. Why does Jenkins behave differently, i.e. why are the old files not overwritten? From the syntax it seems like it is just running a shell command, so why does this not have the expected behavior?
It is because Jenkins runs on the master node while the workspace is on the worker node.
When the checkout scm and sh "" code blocks are in different stages, files are not guaranteed to carry over from the first stage to the others. You should use stash & unstash: when you stash a directory path, the files in that directory become available to the unstash step in later stages.
Jenkins doc - here
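A rough sketch of what that suggests, assuming the files come from a checkout scm stage (the stash name 'sources' is just a placeholder):
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm
                // capture the freshly checked-out files for later stages
                stash name: 'sources', includes: '**/*'
            }
        }
        stage('Build') {
            steps {
                // restore exactly those files, even if this stage happens to
                // run on a different node or in a different workspace
                unstash 'sources'
                sh 'ls -la'
            }
        }
    }
}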

How to make subsequent checkout scm stages use local repo in a Jenkins pipeline?

We use the Jenkins ECS plugin to spawn Docker containers for "each" job we build. So our pipelines look like this:
node('linux') {
    stage('comp-0') {
        checkout scm
    }
    parallel(
        "comp-1": {
            node('linux') {
                checkout scm
                ...
            }
        },
        "comp-2": {
            node('linux') {
                checkout scm
                ...
            }
        }
    )
}
The above pipeline will spawn 3 containers, one for each node('linux') call.
We set up a 'linux' node in our Jenkins configuration page to tell Jenkins the Docker repo/image we want to spawn. Its setup has a notion of 'Container mount points', which I assume are mounts on the host that the container can access.
So in the above pipeline, I want the "first" checkout scm to clone our repo onto a host path mounted by our containers, say /tmp/git. I then want the succeeding checkout scm lines to use the repo cloned at my host's /tmp/git path.
I'm looking at How to mount Jenkins workspace in docker container using Jenkins pipeline to see how to mount a local path into my Docker container.
Is this possible?
You can stash the code from your checkout scm step and then unstash it in subsequent steps. Here's an example from the Jenkins pipeline documentation.
// First we'll generate a text file in a subdirectory on one node and stash it.
stage "first step on first node"
// Run on a node with the "first-node" label.
node('first-node') {
    // Make the output directory.
    sh "mkdir -p output"
    // Write a text file there.
    writeFile file: "output/somefile", text: "Hey look, some text."
    // Stash that directory and file.
    // Note that the includes could be "output/", "output/*" as below, or even
    // "output/**/*" - it all works out basically the same.
    stash name: "first-stash", includes: "output/*"
}
// Next, we'll make a new directory on a second node, and unstash the original
// into that new directory, rather than into the root of the build.
stage "second step on second node"
// Run on a node with the "second-node" label.
node('second-node') {
    // Run the unstash from within that directory!
    dir("first-stash") {
        unstash "first-stash"
    }
    // Look, no output directory under the root!
    // pwd() outputs the current directory Pipeline is running in.
    sh "ls -la ${pwd()}"
    // And look, output directory is there under first-stash!
    sh "ls -la ${pwd()}/first-stash"
}
Jenkins Documentation on Stash/Unstash
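Mapped onto the pipeline in the question, that might look roughly like this (a sketch only; the 'repo' stash name is a placeholder, and the parallel branches unstash instead of re-running checkout scm):
node('linux') {
    stage('comp-0') {
        checkout scm
        // stash the whole checkout so the parallel branches can reuse it
        stash name: 'repo', includes: '**/*'
    }
    parallel(
        'comp-1': {
            node('linux') {
                unstash 'repo'
                // build component 1 here
            }
        },
        'comp-2': {
            node('linux') {
                unstash 'repo'
                // build component 2 here
            }
        }
    )
}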

Parallel execution of Jenkins builds in different directories

I use Jenkinsfile Pipelines and want to run multiple build steps in parallel (with different constants, for example; these can't be passed to the compiler, so the source code has to be changed by a script).
This could look something like this:
stage('Build') {
    steps {
        parallel(
            build_default: {
                echo "WORKSPACE: ${WORKSPACE}"
                bat 'build.bat'
            },
            build_remove: {
                echo "WORKSPACE 2: ${WORKSPACE}"
                // EXAMPLE: only to test interference
                deleteDir() // <- this would be code changes
            }
        )
    }
}
This is not working, since all the code is deleted before compilation is done. I want to run both steps in parallel the way Jenkins does it (creating multiple temp directories, #2 and so on) when two builds run in parallel (triggered by button presses, for example).
The only thing I have found so far is to create temp directories myself in the working directory, copy the source code into them and work there, but I'm looking for a nicer/more automatic solution. (When using the node command I have the same problem, since I only have one node.)
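A rough sketch of that manual workaround, giving each parallel branch its own subdirectory of the workspace (the directory names are placeholders, and it assumes the sources were checked out into a src folder of the workspace):
stage('Build') {
    steps {
        parallel(
            build_default: {
                dir('variant-default') {
                    // copy the checked-out sources into this branch's own directory
                    bat "xcopy /E /I /Y \"${WORKSPACE}\\src\" ."
                    bat 'build.bat'
                }
            },
            build_remove: {
                dir('variant-remove') {
                    bat "xcopy /E /I /Y \"${WORKSPACE}\\src\" ."
                    // apply this variant's source changes here, then build
                    bat 'build.bat'
                }
            }
        )
    }
}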

Correct usage of stash/unstash into a different directory

In one of my stages I need to copy the contents of two folders to a different directory after the build is completed.
I am actually converting a freestyle job to a pipeline and have been using the Artifact Deployer plugin. Reading around, it looks like the stash and unstash commands should help with what I want to achieve.
Can someone verify whether the approach below is correct, please?
stage('Build') {
    steps {
        sh '''
            gulp set-staging-node-env
            gulp prepare-staging-files
            gulp webpack
        '''
        stash includes: '/dist/**/*', name: 'builtSources'
        stash includes: '/config/**/*', name: 'appConfig'
        dir('/some-dir') {
            unstash 'builtSources'
            unstash 'appConfig'
        }
    }
}
If I change dir in one stage, does that mean all other stages thereafter will try to execute commands from that directory, or do they go back to using the default workspace location?
Thanks
EDIT
I have realised that what I actually want to do is copy the built sources to a different node (running a different OS). So in the snippet I have shared, the directory I am switching to is actually meant to be on a different machine (node) that I have set up.
Would I need to wrap the dir() block in a node('my-node-name') block? I'm struggling to find examples.
Thanks
I believe it is meant to be this:
stash includes: 'dist/**/*', name: 'builtSources'
stash includes: 'config/**/*', name: 'appConfig'
where dist and config are directories in the workspace path, so the includes should be relative paths like the above.
The rest seems alright; just note that the path "/some-dir" should be writable by the jenkins user (the user the Jenkins daemon runs as).
And yes, it falls back to the enclosing workspace path (in this case the default) when it exits the dir block.
EDIT
When you stash a path, it is available to be unstashed at any later step in the pipeline. So yes, you could put the dir block under a node('<nodename>') block.
You could add something like this:
stage('Move the Build') {
    node('datahouse') {
        dir('/opt/jenkins_artifacts') {
            unstash 'builtSources'
            unstash 'appConfig'
        }
    }
}
