Failure to trigger a freestyle job from a declarative pipeline - Jenkins

I have a Jenkins declarative pipeline job that needs to trigger a downstream freestyle job. I do that using this snippet:
build job: 'DL_TVG_Backward_Compatibility_Verification',
    parameters: [
        booleanParam(name: 'CHECK_CAM2', value: true),
        [$class: 'ListSubversionTagsParameterValue',
         name: 'CAM2_GOLDEN_TAG', tag: '',
         tagsDir: '<snip>/tags'],
        string(name: 'CAM2_SCENARIOS', value: ''),
        booleanParam(name: 'CHECK_CAM3', value: false),
        <snip>
        [$class: 'NodeParameterValue',
         name: 'UPSTREAM_NODE',
         labels: ['jenkinswin10'],
         nodeEligibility: [$class: 'AllNodeEligibility']],
        string(name: 'EMAIL_RECIPIENTS',
               value: '<snip>')
    ]
Triggering the downstream job fails:
[Pipeline] build (Building DL_TVG_Backward_Compatibility_Verification)
Scheduling project: DL_TVG_Backward_Compatibility_Verification
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] End of Pipeline
ERROR: Failed to trigger build of DL_TVG_Backward_Compatibility_Verification
Finished: FAILURE
Unfortunately no reason for the failure is given. Is there a way to get more information about the reason?
I am unsure about these lines:
[$class: 'NodeParameterValue',
 name: 'UPSTREAM_NODE',
 labels: ['jenkinswin10'],
 nodeEligibility: [$class: 'AllNodeEligibility']],
Maybe they are wrong.
Any idea why this snippet fails?

Did you check the Jenkins system log for errors? It looks like the backward-compatibility job was found by Jenkins (the console log shows "Scheduling...").
Do you have a node parameter UPSTREAM_NODE declared in your downstream job? Check: How to trigger a jenkins build on specific node using pipeline plugin?
The syntax of your NodeParameterValue in the build step looks fine. Check: How to use a parameter from NodeLabelParameter Plugin with the "build" step of Jenkins Workflow
Did you check the configuration of the label 'jenkinswin10'? Have you already executed builds successfully on an agent with that label?
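If nothing useful shows up in the Jenkins log, one way to get more detail is to wrap the trigger in a try/catch inside a script block and echo the exception, so the underlying error is printed in the console instead of only "Failed to trigger build". A rough, untested sketch using only the first parameter from your snippet:
script {
    try {
        build job: 'DL_TVG_Backward_Compatibility_Verification',
            parameters: [booleanParam(name: 'CHECK_CAM2', value: true)]
            // ... remaining parameters as in the original snippet
    } catch (err) {
        echo "build step failed: ${err}"   // print the underlying exception
        throw err                          // keep the build result as FAILURE
    }
}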

Related

How to use Jenkinsfile to pass BRANCH_NAME to a jenkins job

So I have a Jenkins job that checks out an SVN repo like this: remote: "svn://xyz-repo/svn/xyzclientjs/$BRANCH_NAME"]],
I pass this $BRANCH_NAME through a Jenkinsfile that is present in this SVN repo.
Now inside the Jenkinsfile I am doing this:
node 'xyz-169' {
    checkout scm
    def BRANCH_NAME = sh "svn info | grep -Po 'Relative URL: \\^/\\K.*'"
    def BRANCH_REV = sh "svn info --show-item revision"
    stage('Build A') {
        build job: 'xyzclientjs-webui-test', propagate: true, parameters:
        [
            [$class: 'StringParameterValue', name: 'BRANCH_NAME', value: $env.BRANCH_NAME],
            [$class: 'StringParameterValue', name: 'BRANCH_REV', value: $env.BRANCH_REV],
        ]
But when I run the job on Jenkins I get the following error:
Error while checking out xyzclientjs branch from SVN
10:26:05 [Pipeline] error
10:26:05 [Pipeline] }
10:26:05 [Pipeline] // stage
10:26:05 [Pipeline] }
10:26:05 [Pipeline] // node
10:26:05 [Pipeline] End of Pipeline
10:26:05 java.lang.ClassCastException: org.jenkinsci.plugins.workflow.steps.ErrorStep.message expects class java.lang.String but received class groovy.lang.MissingPropertyException
Is there any way to do this? Any suggestions would be highly appreciated.
You should have a look here for an example to correctly fetch the output of a sh step:
How do I get the output of a shell command executed using into a variable from Jenkinsfile (groovy)?
For example (not tested due to lack of SVN):
def BRANCH_NAME = sh(
    script: "svn info | grep -Po 'Relative URL: \\^/\\K.*'",
    returnStdout: true
).trim()
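Once the sh steps actually capture their output, the downstream build can be triggered with the plain Groovy variables instead of the non-existent $env.BRANCH_NAME property. A rough, untested sketch reusing the job name from the question:
def BRANCH_NAME = sh(script: "svn info | grep -Po 'Relative URL: \\^/\\K.*'", returnStdout: true).trim()
def BRANCH_REV = sh(script: "svn info --show-item revision", returnStdout: true).trim()

stage('Build A') {
    // Pass the local variables directly; string() is the shorthand for StringParameterValue.
    build job: 'xyzclientjs-webui-test', propagate: true, parameters: [
        string(name: 'BRANCH_NAME', value: BRANCH_NAME),
        string(name: 'BRANCH_REV', value: BRANCH_REV)
    ]
}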

Multibranch pipeline job does not start the build step even after the downstream job finished building

I run a multibranch pipeline with a Jenkinsfile and I got 'Scheduling project: caiwu » sis-server-test', but no 'Starting building:' appears, even after the job sis-test finishes building. Who can give me a hand?
pipeline {
    agent { label 'test-slave' }
    stages {
        stage('deploy') {
            steps {
                echo 'deploy...'
                build job: '/folder/sis-test', propagate: true, wait: true
            }
        }
    }
}
Here is the build log:
[Pipeline] script
[Pipeline] {
[Pipeline] build (Building caiwu » sis-server-test)
Scheduling project: caiwu » sis-server-test
Aborted by ***
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
Your question is quite difficult to understand. If the problem is that you do NOT want it to wait for the job, your current line:
build job: '/folder/sis-test', propagate: true, wait: true
will need to be changed to:
build job: '/folder/sis-test', propagate: true, wait: false
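For completeness, here is a minimal sketch of the stage with wait: false: the step only schedules /folder/sis-test and returns immediately, so the surrounding stage no longer blocks on it.
stage('deploy') {
    steps {
        echo 'deploy...'
        // Schedule the downstream job and continue without waiting for its result.
        build job: '/folder/sis-test', propagate: true, wait: false
    }
}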

How to pass environment variables to a downstream job from a Jenkinsfile

I am running Jenkins on a Windows machine.
How can I pass environment variables like %BUILD_NUMBER% to a downstream job?
I am using the piece of code below, but it is not working as expected: it prints the placeholder text literally in the output.
build job: 'DeployBuild', parameters: [string(name: 'BuildId', value: $env:BUILD_NUMBER)], wait: false
The output I am getting:
C:\JenkinsBuilds\jobs\DeployBuild\workspace>echo $env:BRANCH_NAME-$env:BUILD_NUMBER
$env:BRANCH_NAME-$env:BUILD_NUMBER
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Expected Output:
C:\JenkinsBuilds\jobs\DeployBuild\workspace>echo TESTJob-12
TESTJob-12
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
I tried multiple patterns like ${BUILD_NUMBER}, $BUILD_NUMBER and %BUILD_NUMBER, but none gives the expected output.
You can try sending env.BUILD_NUMBER or "${BUILD_NUMBER}" or "${env.BUILD_NUMBER}" as the value; note the double quotes, since Groovy string interpolation only happens inside double-quoted strings.
Put the value: $env:BUILD_NUMBER as value: "${env.BUILD_NUMBER}". This will do it, so the complete call will look like:
build job: 'DeployBuild', parameters: [string(name: 'BuildId', value: "${env.BUILD_NUMBER}")], wait: false
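A small sketch that combines both variables into the value the downstream job is expected to print (the buildId variable is just for illustration); again, interpolation only happens inside double-quoted Groovy strings:
// Build the combined value, e.g. "TESTJob-12", via double-quoted interpolation.
def buildId = "${env.BRANCH_NAME}-${env.BUILD_NUMBER}".toString()   // toString() yields a plain String
build job: 'DeployBuild', parameters: [string(name: 'BuildId', value: buildId)], wait: false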

Trigger parameterized build doesn't find the build path in a Jenkins pipeline job

My code dynamically creates a .groovy file which triggers parameterized build inside a parallel step:
def executeParallelBuilds() {
    try {
        parallel(
            build1BUILD: {
                def build1BUILD = build job: 'TA/test1', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            build2BUILD: {
                def build2BUILD = build job: 'TA/test2', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            failFast: false
        )
    } catch (e) {
        echo "An error ocurred while building"
        currentBuild.result = "UNSTABLE"
    }
}
return this;
Now, I load and execute the groovy file with:
node('master') {
    def executeGroovyFile = load buildFilePath
    executeGroovyFile.executeParallelBuilds()
}
But it seems that my pipeline can't find the build jobs by their path.
[Pipeline] }
[Pipeline] // node
[Pipeline] node
Running on master in C:\DevApps\Jenkins\workspace\TA\pipeline_1.0_TEMPLATE
[Pipeline] {
[Pipeline] load
[Pipeline] { (D:\BuildResults_tmp\TA\MBE3\\buildString.groovy)
[Pipeline] }
[Pipeline] // load
[Pipeline] parallel
[Pipeline] [build1BUILD] { (Branch: build1BUILD)
[Pipeline] [build2BUILD] { (Branch: build2BUILD)
[Pipeline] [build1BUILD] build
[Pipeline] [build1BUILD] }
[build1BUILD] Failed in branch build1BUILD
[Pipeline] [build2BUILD] build
[Pipeline] [build2BUILD] }
[build2BUILD] Failed in branch build2BUILD
[Pipeline] // parallel
[Pipeline] echo
An error ocurred while building
[Pipeline] }
What am I doing wrong? I load and execute the .groovy file on my master, so the pipeline should be able to find the other jobs. (Without the node declaration I am not able to load and execute it.)
EDIT: What confuses me is that I don't get the following error:
No parameterized job named some-downstream-job-name found
There was a problem with the build call.
I saved the whole code as a String to a .groovy file. This gave me some trouble getting the notation right (quotes and double quotes).
After changing the call in my script to:
def build1BUILD = build job: BuildJobNameList[i], parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
everything works fine.
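If the file content is generated as one big string, it may help to let Groovy interpolation do the quoting. A hypothetical sketch of writing the build call into the file (writeFile, the jobName variable and the literal file path are assumptions, not from the original post):
// Hypothetical sketch: generate the build call as text so the job path
// ends up single-quoted inside the generated .groovy file.
def buildFilePath = 'D:/BuildResults_tmp/TA/MBE3/buildString.groovy'   // assumed path, based on the log above
def jobName = 'TA/test1'                                               // assumed job path
def callText = "def build1BUILD = build job: '${jobName}', " +
               "parameters: [string(name: \"CPNUM_PARAM\", value: 1.141)]"
writeFile file: buildFilePath, text: callText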

Access builds from Parallel execution in Jenkins 2.0 Pipeline

I'm currently using the Build Flow plugin, which seems to have been abandoned in favor of Pipelines in Jenkins 2.0.
Running into some problems re-building our existing jobs using the new pipelines.
Currently, I have code similar to this:
ignore(FAILURE) {
    join = parallel([
        job1: { build('job1') },
        job2: { build('job2') },
        job3: { build('job3') }
    ])
}
results = [join.job1.result.toString(), join.job2.result.toString(), join.job3.result.toString()]
if (join.job1.result.toString() == 'SUCCESS') {
    buildList << join.job1.lastBuild.getDisplayName()
}
The goal here is to run multiple existing jobs in parallel, and then access information about the builds that completed. This has been working without issue in the Build Flow plugin.
I have been unable find a way to access this data using the new Pipelines.
echo 'Checking streams for latest builds'
join = [:]
join['Job1'] = { build job: 'Job1', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job2'] = { build job: 'Job2', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job3'] = { build job: 'Job3', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
parallel join
A dump of join['Job1'] doesn't give access to an AbstractBuild or similar, the way the Build Flow plugin does. Instead, it shows:
<org.jenkinsci.plugins.workflow.cps.CpsClosure2#2eac6ed9
def=com.cloudbees.groovy.cps.impl.CpsClosureDef#59647704
delegate=WorkflowScript#3aa1807f
owner=WorkflowScript#3aa1807f
thisObject=WorkflowScript#3aa1807f
resolveStrategy=0
directive=0
parameterTypes=null
maximumNumberOfParameters=0
bcw=null>
Using the new Pipelines, is there a way to access data like job1.result, job1.lastBuild, job1.lastBuild.getDisplayName()?
A little late, but you can also capture the runWrapper object returned by the build command in your closure and place it in a map defined outside of the parallel command.
Here's an example. Note: I am using propagate: false so that exceptions (JUnit test failures, etc) are not thrown. You would have to decide how you want to handle exceptions, try/catch/finally, etc.
Example Pipeline Job to execute (needs to be parameterized with a string param commandStr):
env.PASSED_CMD = "${params.commandStr}"
stage('command-exec') {
    node {
        sh "${commandStr}"
    }
}
Executing job (config):
buildRuns = [:]
buildResults = [:]

def buildClosure(String jobKey, String paramAValue) {
    return {
        def runWrapper = build(
            job: 'command-test-job',
            propagate: false,
            parameters: [[$class: 'StringParameterValue', name: 'commandStr', value: paramAValue]]
        )
        buildResults."$jobKey" = runWrapper
    }
}

buildRuns."job1" = buildClosure("job1", "echo 'HI' && exit 0")
buildRuns."job2" = buildClosure("job2", "echo 'HO' && exit 0")

parallel buildRuns

for (k in buildRuns.keySet()) {
    def runResult = buildResults."$k"
    echo "$k -> ${runResult.result}"
    echo "$k -> ${runResult.buildVariables.PASSED_CMD}"
}
The build log shows:
[Pipeline] parallel
[Pipeline] [job1] { (Branch: job1)
[Pipeline] [job2] { (Branch: job2)
[Pipeline] [job1] build (Building command-test-job)
[job1] Scheduling project: command-test-job
[Pipeline] [job2] build (Building command-test-job)
[job2] Scheduling project: command-test-job
[job1] Starting building: command-test-job #7
[job2] Starting building: command-test-job #8
[Pipeline] [job2] }
[Pipeline] [job1] }
[Pipeline] // parallel
[Pipeline] echo
job1 -> SUCCESS
[Pipeline] echo
job1 -> echo 'HI' && exit 0
[Pipeline] echo
job2 -> SUCCESS
[Pipeline] echo
job2 -> echo 'HO' && exit 0
[Pipeline] End of Pipeline
Finished: SUCCESS
This is very similar to Steve-B's answer, but you don't actually need to define the runWrapper explicitly or place it in an additional map beforehand.
tl;dr: You can just store the parallel build results in a hashMap and access that map by looping directly over its keySet instead.
Take this answer with a grain of salt; I am using an older version of Pipeline (Jenkins 2.7.2 and Pipeline 2.2).
You can store the parallel build results in a hashMap and loop over the map's keySet to get some information about each build.
def create_build_job(job_name, pool_label = "master", propagate = false) {
    build job: job_name, parameters: [[$class: 'LabelParameterValue', name: "node_label", label: "${pool_label}"]], propagate: propagate, wait: true
}

def buildmap = [:]
def build_results

stage 'Perform Build'
// test1 is set to fail, test2 is set to succeed
buildmap['test1'] = { create_build_job('test1', "your_node_label") }
buildmap['test2'] = { create_build_job('test2', "your_node_label") }
build_results = parallel buildmap

for (k in build_results.keySet()) {
    println build_results["${k}"].getProperties()
}
For this pipeline I'm just dumping all of the properties of the RunWrapper stored in each item in the map; however, you can access each property directly, so if you want the result of the build you can just do:
build_results["${k}"].result
The console output produced by this pipeline (with any potentially identifying information redacted) is:
Started by user <user>
[Pipeline] stage (Perform Build)
Entering stage Perform Build
Proceeding
[Pipeline] parallel
[Pipeline] [test1] { (Branch: test1)
[Pipeline] [test2] { (Branch: test2)
[Pipeline] [test1] build (Building test1)
[test1] Scheduling project: test1
[test1] Starting building: test1 #11
[Pipeline] [test2] build (Building test2)
[test2] Scheduling project: test2
[test2] Starting building: test2 #11
[Pipeline] }
[Pipeline] }
[Pipeline] // parallel
[Pipeline] echo
{rawBuild=test1 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test1/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper#1480013a, id=11, nextBuild=null, changeSets=[], result=FAILURE, description=null, startTimeInMillis=1509667550519, timeInMillis=1509667550510, duration=956, number=11, displayName=#11}
[Pipeline] echo
{rawBuild=test2 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test2/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper#2d9c7128, id=11, nextBuild=null, changeSets=[], result=SUCCESS, description=null, startTimeInMillis=1509667550546, timeInMillis=1509667550539, duration=992, number=11, displayName=#11}
[Pipeline] End of Pipeline
Finished: SUCCESS
You can access that data by using the Jenkins API after the parallel step:
Jenkins.instance.getItemByFullName('Job1').lastBuild
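Note that this goes through the Jenkins runtime API rather than a pipeline step, so in a sandboxed pipeline the calls usually need script approval. A rough sketch of reading the result:
// Rough sketch: query the last build of 'Job1' through the Jenkins API.
def lastBuild = Jenkins.instance.getItemByFullName('Job1').getLastBuild()
echo "Job1 last build: ${lastBuild.getDisplayName()} -> ${lastBuild.getResult()}"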
