Access builds from parallel execution in Jenkins 2.0 Pipeline

I'm currently using the Build Flow plugin, which seems to have been abandoned in favor of Pipelines in Jenkins 2.0.
Running into some problems re-building our existing jobs using the new pipelines.
Currently, I have code similar to this:
ignore(FAILURE) {
    join = parallel([
        job1: {build('job1')},
        job2: {build('job2')},
        job3: {build('job3')}
    ])
}
results = [join.job1.result.toString(), join.job2.result.toString(), join.job3.result.toString()]
if (join.job1.result.toString() == 'SUCCESS') {
    buildList << join.job1.lastBuild.getDisplayName()
}
The goal here is to run multiple existing jobs in parallel, and then access information about the builds that completed. This has been working without issue in the Build Flow plugin.
I have been unable to find a way to access this data using the new Pipelines. My current attempt:
echo 'Checking streams for latest builds'
join = [:]
join['Job1'] = { build job: 'Job1', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job2'] = { build job: 'Job2', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job3'] = { build job: 'Job3', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
parallel join
A dump of join['Job1'] doesn't give access to an AbstractBuild or similar, the way the Build Flow plugin does. Instead, it shows:
<org.jenkinsci.plugins.workflow.cps.CpsClosure2@2eac6ed9
def=com.cloudbees.groovy.cps.impl.CpsClosureDef@59647704
delegate=WorkflowScript@3aa1807f
owner=WorkflowScript@3aa1807f
thisObject=WorkflowScript@3aa1807f
resolveStrategy=0
directive=0
parameterTypes=null
maximumNumberOfParameters=0
bcw=null>
Using the new Pipelines, is there a way to access data like job1.result, job1.lastBuild, job1.lastBuild.getDisplayName()?

A little late, but you can also capture the RunWrapper object returned by the build step inside your closure and store it in a map defined outside the parallel step.
Here's an example. Note: I am using propagate: false so that exceptions (JUnit test failures, etc.) are not thrown; you have to decide for yourself how you want to handle exceptions (try/catch/finally, etc.).
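For instance, one way to handle this is to wrap the parallel step itself; this is only a sketch with placeholder job names, not part of the original answer:

```groovy
// Sketch: with propagate: false the parallel step itself does not throw on a
// failed downstream build, so downgrade the run manually; only infrastructure
// errors (aborts, missing jobs, ...) reach the catch block.
def runs = [:]
runs['jobA'] = { build(job: 'jobA', propagate: false) }   // placeholder job names
runs['jobB'] = { build(job: 'jobB', propagate: false) }
try {
    def results = parallel runs   // map of branch name -> RunWrapper
    results.each { name, run ->
        if (run.result != 'SUCCESS') {
            currentBuild.result = 'UNSTABLE'
        }
    }
} catch (err) {
    echo "parallel step failed: ${err}"
    currentBuild.result = 'FAILURE'
}
```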
Example Pipeline Job to execute (needs to be parameterized with a string param commandStr):
env.PASSED_CMD = "${params.commandStr}"
stage('command-exec') {
    node {
        sh "${commandStr}"
    }
}
Executing job (config):
buildRuns = [:]
buildResults = [:]
def buildClosure(String jobKey, String paramAValue) {
    return {
        def runWrapper = build(
            job: 'command-test-job',
            propagate: false,
            parameters: [[$class: 'StringParameterValue', name: 'commandStr', value: paramAValue]]
        )
        buildResults."$jobKey" = runWrapper
    }
}
buildRuns."job1" = buildClosure("job1", "echo 'HI' && exit 0")
buildRuns."job2" = buildClosure("job2", "echo 'HO' && exit 0")
parallel buildRuns
for (k in buildRuns.keySet()) {
    def runResult = buildResults."$k"
    echo "$k -> ${runResult.result}"
    echo "$k -> ${runResult.buildVariables.PASSED_CMD}"
}
The build log shows:
[Pipeline] parallel
[Pipeline] [job1] { (Branch: job1)
[Pipeline] [job2] { (Branch: job2)
[Pipeline] [job1] build (Building command-test-job)
[job1] Scheduling project: command-test-job
[Pipeline] [job2] build (Building command-test-job)
[job2] Scheduling project: command-test-job
[job1] Starting building: command-test-job #7
[job2] Starting building: command-test-job #8
[Pipeline] [job2] }
[Pipeline] [job1] }
[Pipeline] // parallel
[Pipeline] echo
job1 -> SUCCESS
[Pipeline] echo
job1 -> echo 'HI' && exit 0
[Pipeline] echo
job2 -> SUCCESS
[Pipeline] echo
job2 -> echo 'HO' && exit 0
[Pipeline] End of Pipeline
Finished: SUCCESS

This is very similar to Steve-B's answer, but you don't actually need to define the RunWrapper explicitly or place it in an additional map beforehand.
tl;dr: you can just store the result of the parallel step in a map and loop directly over its keySet instead.
Take this answer with a grain of salt: I am using an older version of Pipeline (Jenkins 2.7.2 and Pipeline 2.2).
You can store the parallel build results in a map and loop over the map's keySet to get some information about each build.
def create_build_job(job_name, pool_label="master", propagate=false) {
    build job: job_name, parameters: [[$class: 'LabelParameterValue', name: "node_label", label: "${pool_label}"]], propagate: propagate, wait: true
}
def buildmap = [:]
def build_results
stage 'Perform Build'
//test1 is set to fail, test2 is set to succeed
buildmap['test1'] = {create_build_job('test1', "your_node_label")}
buildmap['test2'] = {create_build_job('test2', "your_node_label")}
build_results = parallel buildmap
for (k in build_results.keySet()) {
    println build_results["${k}"].getProperties()
}
For this pipeline I'm just dumping all of the properties of the RunWrapper stored in each map entry; however, you can access each property directly, so if you want the result of a build you can just do:
build_results["${k}"].result
The console output produced by this pipeline (with any potentially identifying information redacted) is:
Started by user <user>
[Pipeline] stage (Perform Build)
Entering stage Perform Build
Proceeding
[Pipeline] parallel
[Pipeline] [test1] { (Branch: test1)
[Pipeline] [test2] { (Branch: test2)
[Pipeline] [test1] build (Building test1)
[test1] Scheduling project: test1
[test1] Starting building: test1 #11
[Pipeline] [test2] build (Building test2)
[test2] Scheduling project: test2
[test2] Starting building: test2 #11
[Pipeline] }
[Pipeline] }
[Pipeline] // parallel
[Pipeline] echo
{rawBuild=test1 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test1/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper@1480013a, id=11, nextBuild=null, changeSets=[], result=FAILURE, description=null, startTimeInMillis=1509667550519, timeInMillis=1509667550510, duration=956, number=11, displayName=#11}
[Pipeline] echo
{rawBuild=test2 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test2/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper@2d9c7128, id=11, nextBuild=null, changeSets=[], result=SUCCESS, description=null, startTimeInMillis=1509667550546, timeInMillis=1509667550539, duration=992, number=11, displayName=#11}
[Pipeline] End of Pipeline
Finished: SUCCESS

You can access that data by using the Jenkins API after the parallel step:
Jenkins.instance.getItemByFullName('Job1').lastBuild
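Note that this goes through the Jenkins model API rather than Pipeline steps, so it usually requires script approval (or running outside the sandbox). A sketch of what you can then read off the build, with 'Job1' taken from the question:

```groovy
// Sketch: reading build data via the Jenkins API after the parallel step.
// Jenkins.instance access is typically blocked by the script security
// sandbox and needs administrator approval.
def lastBuild = Jenkins.instance.getItemByFullName('Job1').lastBuild
echo "result: ${lastBuild.result}"
echo "display name: ${lastBuild.displayName}"
```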

Related

How to use Jenkinsfile to pass BRANCH_NAME to a jenkins job

So I have a Jenkins job that checks out an SVN repo like this: remote: "svn://xyz-repo/svn/xyzclientjs/$BRANCH_NAME"]],
I pass this $BRANCH_NAME through a Jenkinsfile that is present in this SVN repo.
Now, inside the Jenkinsfile, I am doing this:
node('xyz-169') {
    checkout scm
    def BRANCH_NAME = sh "svn info | grep -Po 'Relative URL: \\^/\\K.*'"
    def BRANCH_REV = sh "svn info --show-item revision"
    stage('Build A') {
        build job: 'xyzclientjs-webui-test', propagate: true, parameters:
        [
            [$class: 'StringParameterValue', name: 'BRANCH_NAME', value: $env.BRANCH_NAME],
            [$class: 'StringParameterValue', name: 'BRANCH_REV', value: $env.BRANCH_REV],
        ]
    }
}
But when I run the job on Jenkins I get the following error:
Error while checking out xyzclientjs branch from SVN
10:26:05 [Pipeline] error
10:26:05 [Pipeline] }
10:26:05 [Pipeline] // stage
10:26:05 [Pipeline] }
10:26:05 [Pipeline] // node
10:26:05 [Pipeline] End of Pipeline
10:26:05 java.lang.ClassCastException: org.jenkinsci.plugins.workflow.steps.ErrorStep.message expects class java.lang.String but received class groovy.lang.MissingPropertyException
Is there any way to do this? Any suggestions would be highly appreciated.
You should have a look here for an example of how to correctly fetch the output of a sh step:
How do I get the output of a shell command executed using into a variable from Jenkinsfile (groovy)?
For example (not tested due to lack of SVN):
def BRANCH_NAME = sh(
    script: "svn info | grep -Po 'Relative URL: \\^/\\K.*'",
    returnStdout: true
).trim()

No item named 'freestyle' found

I am working on a basic Jenkins pipeline and I took this example from their documentation, but an error is popping up: 'No item named Pipeline found'.
// in this map we'll place the jobs that we wish to run
def branches = [:]
// running the job 4 times concurrently
// the dummy parameter is for preventing mutation of the parameter before the execution of the closure:
// we have to assign it outside the closure, or it will run the job multiple times with the same parameter "4"
// and Jenkins will unite them into a single run of the job
for (int i = 0; i < 4; i++) {
    def index = i // if we tried to use i below, it would equal 4 in each job execution
    branches["branch${i}"] = {
        // Parameters:
        //   param1: an example string parameter for the triggered job.
        //   dummy:  a parameter used to prevent triggering the job with the same parameter values;
        //           it has to accept a different value each time the job is triggered.
        build job: 'freestyle', parameters: [
            string(name: 'param1', value: 'test_param'),
            string(name: 'dummy', value: "${index}")]
    }
}
parallel branches
Started by user unknown or anonymous
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] parallel
[Pipeline] { (Branch: branch0)
[Pipeline] { (Branch: branch1)
[Pipeline] { (Branch: branch2)
[Pipeline] { (Branch: branch3)
[Pipeline] build
[Pipeline] }
Failed in branch branch0
[Pipeline] build
[Pipeline] }
Failed in branch branch1
[Pipeline] build
[Pipeline] }
Failed in branch branch2
[Pipeline] build
[Pipeline] }
Failed in branch branch3
[Pipeline] // parallel
[Pipeline] End of Pipeline
ERROR: No item named Pipeline found
Finished: FAILURE
You should create a freestyle job and put that job's name in your pipeline job; that's how your pipeline job will call the new job that you created.
As you are passing parameters as well, your freestyle job should declare those parameters.
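If you want the pipeline to fail with a clearer message when the target job is missing, a sketch (uses the Jenkins model API, so it may need script approval; the job name 'freestyle' comes from the question):

```groovy
// Sketch: fail early with an explicit message if the downstream job
// doesn't exist, instead of the generic "No item named ... found".
if (Jenkins.instance.getItemByFullName('freestyle') == null) {
    error "Job 'freestyle' does not exist - create the freestyle job first"
}
```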

Failure to trigger freestyle job from declarative pipeline

I have a Jenkins declarative pipeline job that is required to trigger a downstream freestyle job. I do that using this snippet:
build job: 'DL_TVG_Backward_Compatibility_Verification',
parameters: [booleanParam(name: 'CHECK_CAM2', value: true),
[$class: 'ListSubversionTagsParameterValue',
name: 'CAM2_GOLDEN_TAG', tag: '',
tagsDir: '<snip>/tags'],
string(name: 'CAM2_SCENARIOS', value: ''),
booleanParam(name: 'CHECK_CAM3', value: false),
<snip>
[$class: 'NodeParameterValue',
name: 'UPSTREAM_NODE',
labels: ['jenkinswin10'],
nodeEligibility: [$class: 'AllNodeEligibility']],
string(name: 'EMAIL_RECIPIENTS',
value: '<snip>')
]
The downstream job fails:
[Pipeline] build (Building DL_TVG_Backward_Compatibility_Verification)
Scheduling project: DL_TVG_Backward_Compatibility_Verification
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] End of Pipeline
ERROR: Failed to trigger build of DL_TVG_Backward_Compatibility_Verification
Finished: FAILURE
Unfortunately no reason for the failure is given. Is there a way to get more information about the reason?
I am unsure about the line:
[$class: 'NodeParameterValue',
name: 'UPSTREAM_NODE',
labels: ['jenkinswin10'],
nodeEligibility: [$class: 'AllNodeEligibility']],
Maybe that is wrong.
Any idea why this snippet fails?
Did you check the Jenkins log for errors? It looks like the backward-compatibility job was found by Jenkins (console log: "Scheduling...").
Do you have a node parameter UPSTREAM_NODE declared in your downstream job? Check: How to trigger a jenkins build on specific node using pipeline plugin?
The syntax of your NodeParameterValue in the build step looks fine. Check: How to use a parameter from NodeLabelParameter Plugin with the "build" step of Jenkins Workflow
Did you check the configuration of the label 'jenkinswin10'? Have you already executed builds successfully on an agent with that label?
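The checks above can be scripted from the Jenkins script console; this is a sketch that needs administrator rights, with the label and job name taken from the question:

```groovy
// Sketch: verify the label resolves to at least one agent,
// and that the downstream job actually declares its parameters.
def label = Jenkins.instance.getLabel('jenkinswin10')
println "agents with label: ${label.nodes.collect { it.nodeName }}"

def job = Jenkins.instance.getItemByFullName('DL_TVG_Backward_Compatibility_Verification')
def props = job.getProperty(hudson.model.ParametersDefinitionProperty)
println "declared parameters: ${props?.parameterDefinitionNames}"
```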

How to run two Jenkins multi phase Jobs at the same time?

I have two groups of multi-phase jobs, parallel test 1 and parallel test 2; where I need to execute both the groups together at the same time.
Does the MultiJob Jenkins plugin have a hack for it? Or any alternatives...
Note: I don't want all the 3 jobs in the same MultiJob Phase
Since you can't run those jobs in one MultiJob phase, as an alternative you could use a Jenkins pipeline job (Pipeline docs). Parallel stage execution can be achieved by using the declarative pipeline parallel block. A dummy example of how your MultiJob could be achieved with a pipeline:
pipeline {
    agent any
    stages {
        stage('MultiJob like stage') {
            parallel {
                stage('Parallel Test') {
                    steps {
                        echo "Here trigger job: allure_behave. Triggered at time:"
                        sh(script: "date -u")
                        // build(job: "allure_behave")
                    }
                }
                stage('Parallel Test 2') {
                    steps {
                        echo "Here trigger job: allure_behave_new. Triggered at time:"
                        sh(script: "date -u")
                        // build(job: "allure_behave_new")
                        echo "Here trigger job: allure_behave_old. Triggered at time:"
                        sh(script: "date -u")
                        // build(job: "allure_behave_old")
                    }
                }
            }
        }
    }
}
In this case you have a stage called MultiJob like stage, which has sub-stages Parallel Test and Parallel Test 2, just like in your MultiJob. The difference is that both of those sub-stages are executed in parallel.
To trigger other jobs from inside the pipeline job use build step:
build(job: "job-name")
Or, if you need to run it with parameters, just add the parameters option to build():
build(job: "${JOB_NAME}", parameters: [string(name: 'ENVNAME', value: 'EXAMPLE_STR_PARAM')])
Blue Ocean View:
Output:
Running on Jenkins in /var/jenkins_home/workspace/Dummy_pipeline
[Pipeline] {
[Pipeline] stage
[Pipeline] { (MultiJob like stage)
[Pipeline] parallel
[Pipeline] { (Branch: Parallel Test)
[Pipeline] { (Branch: Parallel Test 2)
[Pipeline] stage
[Pipeline] { (Parallel Test)
[Pipeline] stage
[Pipeline] { (Parallel Test 2)
[Pipeline] echo
Here trigger job: allure_behave. Triggered at time:
[Pipeline] sh
[Pipeline] echo
Here trigger job: allure_behave_new. Triggered at time:
[Pipeline] sh
+ date -u
Thu Nov 22 18:48:56 UTC 2018
+ date -u
Thu Nov 22 18:48:56 UTC 2018
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] echo
Here trigger job: allure_behave_old. Triggered at time:
[Pipeline] sh
+ date -u
Thu Nov 22 18:48:56 UTC 2018
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Is this alternative valid for your use case?
Regards

Trigger parameterized build doesn't find build path in Jenkins pipeline job

My code dynamically creates a .groovy file which triggers parameterized builds inside a parallel step:
def executeParallelBuilds() {
    try {
        parallel(
            build1BUILD: {
                def build1BUILD = build job: 'TA/test1', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            build2BUILD: {
                def build2BUILD = build job: 'TA/test2', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            failFast: false
        )
    } catch (e) {
        echo "An error ocurred while building"
        currentBuild.result = "UNSTABLE"
    }
}
return this
Now, I load and execute the groovy file with:
node('master') {
    def executeGroovyFile = load buildFilePath
    executeGroovyFile.executeParallelBuilds()
}
But it seems that my pipeline can't find the build jobs by their path:
[Pipeline] }
[Pipeline] // node
[Pipeline] node
Running on master in C:\DevApps\Jenkins\workspace\TA\pipeline_1.0_TEMPLATE
[Pipeline] {
[Pipeline] load
[Pipeline] { (D:\BuildResults_tmp\TA\MBE3\\buildString.groovy)
[Pipeline] }
[Pipeline] // load
[Pipeline] parallel
[Pipeline] [build1BUILD] { (Branch: build1BUILD)
[Pipeline] [build2BUILD] { (Branch: build2BUILD)
[Pipeline] [build1BUILD] build
[Pipeline] [build1BUILD] }
[build1BUILD] Failed in branch build1BUILD
[Pipeline] [build2BUILD] build
[Pipeline] [build2BUILD] }
[build2BUILD] Failed in branch build2BUILD
[Pipeline] // parallel
[Pipeline] echo
An error ocurred while building
[Pipeline] }
What am I doing wrong? I load and execute the .groovy file on my master, so the pipeline should be able to find the other jobs. (Without a node declaration I am not able to load and execute the file.)
EDIT: What confuses me is that I don't get the following error:
No parameterized job named some-downtream-job-name found
There was a problem with the build call.
I saved the whole code as a String to a .groovy file. This gave me some trouble with getting the notation right (quotes and double quotes).
After calling my script as:
def build1BUILD = build job: BuildJobNameList[i], parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
everything works fine
