Jenkins scripted pipeline - multiple executions with passing artifacts between stages

I have searched but could not find proper information on how to copy artifacts to jobs that are executed multiple times in parallel.
The main pipeline job is located at /jenkins/workspace/<main_job>. The jobs it triggers also prepare artifacts, which I later copy to different stages/jobs in the same pipeline using the build id of the executed job.
I have defined a scripted pipeline which executes predefined jobs in stages, some of which run in parallel, as follows:
node() {
    stage("Creating Build") {
        def stages = [:]
        stages.failFast = true
        stages["Core"] = {
            copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + coreBuildJob)
            buildCore = build job: coreBuildJob
        }
        stages["Content"] = {
            copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + contentBuildJob)
            buildContent = build job: contentBuildJob
        }
        parallel(stages)
    }
}
I am using the Copy Artifact plugin to copy the artifacts that were created, but it appears that it copies the files to the main job's folder on the instance.
Because of the different workspace/project locations, I had to define the 'target' location to properly copy the artifacts to the required job that I execute in the script prior to the jobs' execution.
For example, for coreBuildJob in the Core stage:
`copyArtifacts(projectName: <job to copy from>, flatten: true, target: '../' + <job_for_execution>)`
This does help me resolve the issue of copying the required artifacts to these jobs, but it leads to another problem: I want this scripted pipeline job to be executed multiple times with different parameters.
When the pipeline is executed a second time and the job run in one of the stages runs for the second time, it creates the following path on the local machine:
`/jenkins/workspace/test_jobs/<job_for_execution>#2`
That means that what I have in my script is not correct, because it copies the files to:
`/jenkins/workspace/test_jobs/<job_for_execution>`
so it does not copy the artifacts to the proper location and they are not accessible from the executed job.
I thought of having the copyArtifacts part executed during the 'build job' command (as you can define in the Jenkins UI, passing BUILD_ID as a variable to copy artifacts that way), but I can't find any details on how to achieve the same behavior in the script.
How can this issue be resolved?

You can use stash/unstash.
After running your build, you can stash:
stash name: 'data', includes: '**/*'
where data is the name (an identifier) and includes is an Ant-style pattern that can match a directory, subdirectory or single file.
Then, in the stages you want to have the output of your build, use unstash:
unstash 'data'
After the unstash, the files will also be in the respective folder and you can run your other steps.
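A minimal sketch of the whole flow in a scripted pipeline (stage names and paths here are illustrative). Note that stash/unstash only moves files within a single run of the same pipeline; for passing files between separate jobs you still need archiveArtifacts/copyArtifacts:

node() {
    stage('Build') {
        // produce something in this workspace (illustrative)
        sh 'mkdir -p output && echo 123 > output/result.txt'
        // stash everything under output/ under the id 'data'
        stash name: 'data', includes: 'output/**'
    }
}
node() {
    stage('Test') {
        // possibly a different node: restore the stashed files
        unstash 'data'
        sh 'cat output/result.txt'
    }
}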
Refer to https://www.jenkins.io/doc/pipeline/examples/ for more information.

Related

Can I set up dependency builds in Jenkins similar to TeamCity?

I have not found information about this. I want to trigger a build, but before it executes, I want it to trigger some other pipelines and wait for a file created by those pipelines to be passed to the main pipeline. Can I do this in Jenkins?
You can, yes, in multiple ways, depending on your actual use case.
The simplest way would be to create the job you want to call, add a step that calls that job, copy the artifacts from it, and then continue with the pipeline.
Using Jenkins Build step:
stage('Child job') {
    steps {
        build(job: 'foo', wait: true, propagate: true, parameters: [parameters_list])
    }
}
The wait parameter makes your pipeline wait for the child to complete before continuing, and propagate means that the result of the child job is reflected in the parent pipeline as well. The parameters section is needed only if your child job requires parameters. See the parameter types in the Build step documentation.
You can also use the Jenkins pipeline snippet generator to create the call properly in your own Jenkins instance.
To get build artifacts from the child job, the easiest way is to use the Copy Artifact plugin. You'll need to archiveArtifacts in the child job and then, when the artifacts are needed, run:
copyArtifacts fingerprintArtifacts: true, projectName: 'foo', selector: lastCompleted()
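On the child job side, 'foo' has to archive the files first, for example (the artifact path is illustrative):

// in the child job 'foo', after the build has produced its output
archiveArtifacts artifacts: 'build/output/**', fingerprint: true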

How to run a job from a Jenkins Pipeline on the same executor (declarative syntax)

I want to use the Jenkins "PRQA" plugin, which does not seem to offer the option of being used from a pipeline. The plugin would run static code analysis and publish the results.
In my case, it requires some preparations that are already done in a pipeline job. Because of that, I want to include the job in that pipeline, on the same executor, with the data prepared by the pipeline, as a kind of inlined job step.
I have tried to create a job for the PRQA plugin step and to execute it with the build step from the pipeline. But this tries to start the job on a new executor (and stalls, because I have only one executor).
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Prepare'
            }
        }
        stage('SCA') {
            steps {
                // Run this without using a new executor, with the environment that exists now
                build 'PRQA_Job'
            }
        }
    }
}
What is the correct way to run the job on the same executor, with the current working directory?
With build 'PRQA_Job' it's not possible to run the second job on the same executor (1 job = 1 executor), since the main job is just waiting for the triggered job to finish. But you can run the other job on the same agent, with more than 1 executor, to reach the workspace of the main job.
To test this, specify the agent name in both jobs: agent 'agent_name_here'
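A minimal sketch of the parent side under that setup (the label is illustrative, the child job must be pinned to the same label, and passing the workspace path as a parameter is an assumption, not part of the original answer):

pipeline {
    // same label as in PRQA_Job; the agent needs at least 2 executors
    agent { label 'agent_name_here' }
    stages {
        stage('SCA') {
            steps {
                // hypothetical parameter handing the prepared workspace to the child job
                build job: 'PRQA_Job', parameters: [string(name: 'PARENT_WORKSPACE', value: "${env.WORKSPACE}")]
            }
        }
    }
}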
If you want to use functionality from a plugin that has no native pipeline support, you could try the "step: General Build step" feature for Jenkins pipelines. You can use the Pipeline Syntax wizard linked in the job configuration window to generate the needed pipeline description.
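For plugins that do expose a generic build step, the call the wizard generates looks something like this (shown with the core artifact archiver as a stand-in, since the PRQA class name is not known here):

// generic "step" syntax; $class selects the build/publish step implementation
step([$class: 'ArtifactArchiver', artifacts: 'build/output/**', fingerprint: true])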
If the plugin does not show up in the "step: General Build step" part of Jenkins, you can use a separate job. To copy all the needed files/data into this second job, you will need to use the Archive Artifact/Copy Artifact functionality of Jenkins to save files from your pipeline build.
For more information on how to use Archive Artifact/Copy Artifact, see https://plugins.jenkins.io/copyartifact/ and
https://www.jenkins.io/doc/pipeline/tour/tests-and-artifacts/

Passing workspace url of Job A to Job B in Jenkins

I have two pipeline jobs, Job A and Job B. I need to pass the workspace url of Job A (say /var/lib/jenkins/workspace/JobA) to be used by Job B. The main idea is that I am trying to copy the contents of the target folder generated by the Maven build, but I don't want to use the Copy Artifacts or Archive Artifacts plugins to achieve this.
I have tried using the option "This project is parameterized", where Job A is the upstream of Job B, but I am unable to do so using that option.
Can anyone help me achieve this?
The WORKSPACE variable is an environment variable from Jenkins and points to <jenkins_path>/<job_name>.
For example, if the job name is Job_A, the workspace value will be <jenkins_path>/Job_A; if the job name is Job_B, it will be <jenkins_path>/Job_B.
So you can't use the WORKSPACE variable and expect Job_B to point to Job_A's workspace value.
The question below can be used to get certain properties from the upstream job:
Jenkins - How to get and use upstream info in downstream
Even if you want to hard-code the path in Job_B, it will work (not recommended).
Also, for this to work, the node should be the same for both jobs.
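For illustration, the hard-coded variant in Job_B could look like this (using the path from the question; it only works because both jobs share the node, and the copy is a plain filesystem operation):

// Job_B, scripted pipeline: read straight out of Job_A's workspace on the shared node
node {
    sh 'cp -r /var/lib/jenkins/workspace/JobA/target .'
}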
I have found a way to do this and it is working fine.
I made Job B a parameterized job using "This project is parameterized" with a string parameter.
Then, in the pipeline script of Job A, I invoked Job B, passing the WORKSPACE env variable. Here is the declarative pipeline script for Job A:
pipeline {
    agent any
    stages {
        stage('Build JobB') {
            steps {
                build job: 'jobB', parameters: [string(name: 'UPSTREAM_WORKSPACE', value: "${env.WORKSPACE}")]
            }
        }
    }
}
Now, in the Job B pipeline, you can echo the variable UPSTREAM_WORKSPACE. This is how we can pass the workspace url and use it to copy the artifacts.
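A minimal sketch of the Job B side (assuming the string parameter UPSTREAM_WORKSPACE defined above, and that both jobs run on the same node):

pipeline {
    agent any
    stages {
        stage('Copy from upstream') {
            steps {
                echo "Upstream workspace: ${params.UPSTREAM_WORKSPACE}"
                // plain filesystem copy; only valid because Job A ran on this same node
                sh "cp -r '${params.UPSTREAM_WORKSPACE}/target' ."
            }
        }
    }
}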

Jenkinsfile how to mimic 2 separate test and build jobs

In the old configuration we had 2 jobs, test and build.
The build job ran after the test job had run successfully, but we could trigger the build manually if we wanted to skip the tests.
After we switched to pipelines using a Jenkinsfile, we had to put those 2 jobs into the same file:
stage('Running tests'){
...
}
stage('Build'){
...
}
So now the build stage is only triggered after the tests run successfully, and we cannot trigger the build manually without commenting out the test steps and committing that to the repository.
I am wondering if there is a better approach/practice for using the Jenkinsfile to overcome this limitation?
Using pipelines and a Jenkinsfile is becoming the standard and preferred way of running jobs on Jenkins nowadays, so using a Jenkinsfile is certainly the way to go.
One way to solve the problem is to make the job parameterized:
// Set the parameter properties; this will be done on the first run so that we can trigger with parameters manually
properties([parameters([booleanParam(defaultValue: true, description: 'Testing will be done if this is checked', name: 'DO_TEST')])])

stage('Running tests') {
    // Putting the check inside the stage step so that we don't confuse the stage view
    if (params['DO_TEST']) {
        ...
    }
}

stage('Build') {
    ...
}
The first time the job runs, it will add the parameter to the job. After that we can trigger the job manually and select whether the tests should run. The default value is used when the job is triggered by SCM.
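If you use declarative pipelines instead, the same idea could look like this (a sketch, not part of the original answer; the when directive skips the test stage when the parameter is unchecked):

pipeline {
    agent any
    parameters {
        booleanParam(defaultValue: true, description: 'Testing will be done if this is checked', name: 'DO_TEST')
    }
    stages {
        stage('Running tests') {
            // skipped entirely when DO_TEST is false
            when { expression { params.DO_TEST } }
            steps {
                echo 'Running tests...'
            }
        }
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}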

How to pass output from one pipeline to another in jenkins

I'm new to Jenkins and I've been tasked with the simple task of passing the output from one pipeline to another.
Let's say that the first pipeline has a script that says echo HelloWorld; how would I pass this output to another pipeline so it displays the same thing?
I've looked at parameterized triggers and a couple of other answers, but I was hoping someone could lay out the procedure step by step.
If you want to implement it purely with Jenkins pipeline code, what I do is have an orchestrator pipeline job that builds all the pipeline jobs in my process, waits for them to complete, then gets the build number:
Orchestrator job
def result = build job: 'jobA'
def buildNumber = result.getNumber()
echo "jobA build number : ${buildNumber}"
In each job, say 'jobA', I arrange to write the output to a known file (a properties file, for example), which is then archived:
jobA
writeFile encoding: 'utf-8', file: 'results.properties', text: 'a=123\r\nb=foo'
archiveArtifacts 'results.properties'
Then, after the build of each job like jobA, use the build number with the Copy Artifact plugin to get the file back into your orchestrator job and process it however you want:
Orchestrator job
step([$class: 'CopyArtifact',
      filter: 'results.properties',
      flatten: true,
      projectName: 'jobA',
      selector: [$class: 'SpecificBuildSelector', buildNumber: buildNumber.toString()]])
You will find these plugins useful to look at:
Copy Artifact Plugin
Pipeline Utility Steps Plugin
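The Pipeline Utility Steps plugin provides readProperties, which can parse the copied file back in the orchestrator; a short sketch continuing the example above:

// Orchestrator job, after the CopyArtifact step
def props = readProperties file: 'results.properties'
echo "a is ${props['a']}, b is ${props['b']}"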
If you are chaining jobs instead of using an orchestrator - say jobA builds jobB, which builds jobC, etc. - then you can use a similar method: copyArtifacts can copy from the upstream job, or you can pass parameters with the build number and name of the upstream job, as sketched below. I chose an orchestrator job after moving away from chained jobs because I need some jobs to be built in parallel.
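A sketch of that chained variant (the parameter names are illustrative, not from the original answer):

// jobA: trigger jobB and tell it where to copy from
build job: 'jobB', parameters: [
    string(name: 'UPSTREAM_JOB', value: env.JOB_NAME),
    string(name: 'UPSTREAM_BUILD', value: env.BUILD_NUMBER)
]

// jobB: copy the properties file from that exact upstream build
step([$class: 'CopyArtifact',
      filter: 'results.properties',
      projectName: params.UPSTREAM_JOB,
      selector: [$class: 'SpecificBuildSelector', buildNumber: params.UPSTREAM_BUILD]])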
