trigger parameterized build doesn't find build path in jenkins pipeline job - jenkins

My code dynamically creates a .groovy file which triggers parameterized build inside a parallel step:
def executeParallelBuilds() {
    try {
        parallel(
            build1BUILD: {
                def build1BUILD = build job: 'TA/test1', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            build2BUILD: {
                def build2BUILD = build job: 'TA/test2', parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
            },
            failFast: false
        )
    } catch (e) {
        echo "An error ocurred while building"
        currentBuild.result = "UNSTABLE"
    }
}
return this;
Now, I load and execute the groovy file with:
node('master') {
    def executeGroovyFile = load buildFilePath
    executeGroovyFile.executeParallelBuilds()
}
But it seems that my pipeline can't find the build jobs by path.
[Pipeline] }
[Pipeline] // node
[Pipeline] node
Running on master in C:\DevApps\Jenkins\workspace\TA\pipeline_1.0_TEMPLATE
[Pipeline] {
[Pipeline] load
[Pipeline] { (D:\BuildResults_tmp\TA\MBE3\\buildString.groovy)
[Pipeline] }
[Pipeline] // load
[Pipeline] parallel
[Pipeline] [build1BUILD] { (Branch: build1BUILD)
[Pipeline] [build2BUILD] { (Branch: build2BUILD)
[Pipeline] [build1BUILD] build
[Pipeline] [build1BUILD] }
[build1BUILD] Failed in branch build1BUILD
[Pipeline] [build2BUILD] build
[Pipeline] [build2BUILD] }
[build2BUILD] Failed in branch build2BUILD
[Pipeline] // parallel
[Pipeline] echo
An error ocurred while building
[Pipeline] }
What am I doing wrong? I load and execute the .groovy file on my master, so the pipeline should be able to find the other jobs. (Without a node declaration I am not able to load and execute the file.)
EDIT: What confuses me is that I don't get the following error:
No parameterized job named some-downtream-job-name found

There was a problem with the build call. I saved the whole code as a String to a .groovy file, which gave me some trouble with the right notation (single quotes vs. double quotes).
After calling my script as:
def build1BUILD = build job: BuildJobNameList[i], parameters: [string(name: "CPNUM_PARAM", value: 1.141)]
everything works fine
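As a sketch of the quoting issue described above (the writeFile usage is an assumption, since the question does not show how the file is generated): a single-quoted triple-quoted Groovy string keeps the inner double quotes from being interpolated while the script text is assembled, and the parameter value is passed as a String rather than a number.

```groovy
// Sketch only: the outer ''' ... ''' string is NOT interpolated, so the
// inner double quotes survive into the generated .groovy file as-is.
def scriptText = '''
def executeParallelBuilds() {
    parallel(
        build1BUILD: {
            // pass the parameter value as a String, not as a number
            build job: 'TA/test1', parameters: [string(name: "CPNUM_PARAM", value: "1.141")]
        },
        failFast: false
    )
}
return this
'''
writeFile file: 'buildString.groovy', text: scriptText
```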

Related

How to use Jenkins environment variables in parameters in a Pipeline

I'm trying to use the "JOB_BASE_NAME" Jenkins environment variable in a parameter's path in a pipeline script; the parameter gets set while building the project.
example: string(defaultValue: "/abc/test/workspace/test_${JOB_BASE_NAME}/sample", description: 'test', name: 'HOME')
but while executing, ${JOB_BASE_NAME} is not getting replaced by the value (the Jenkins job name). I'm unsure if I'm setting the Jenkins environment variable in the parameter's path correctly.
thank you!
I have replicated your use case and it works for me. This is the section of code
node {
    stage ('test') {
        sh "echo ${HOME}"
    }
}
and this is the output (my job name was stackoverflow):
[Pipeline] { (hide)
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] sh
+ echo /abc/test/workspace/test_stackoverflow/sample
/abc/test/workspace/test_stackoverflow/sample
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
The String parameter HOME was configured in the job with the default value shown above.
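An equivalent check without going through a parameter, as a sketch: Jenkins exposes the same value through the env map inside the pipeline script, so the interpolation can be verified directly (the path is the illustrative one from the question).

```groovy
// Sketch: JOB_BASE_NAME is an environment variable Jenkins sets for every build
node {
    def path = "/abc/test/workspace/test_${env.JOB_BASE_NAME}/sample"
    echo path  // for a job named 'stackoverflow', prints .../test_stackoverflow/sample
}
```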

jenkins pipeline readYaml how to use variable to specify key

Suppose the yaml file is like this:
#test.yaml
0.6.5.1.0:
  module:
    - mysql
    - zk
0.7.1.0.0:
  module:
    - java
Now I want to get the module list of a specified version, where the version is a variable. I tried to write the Jenkins pipeline like this:
yamlFile = readYaml file: 'test.yaml'
version = '0.7.1.0.0'
moduleList = yamlFile.get("${version}").get('module')
but this doesn't work: yamlFile.get("${version}") is a null object. How can I achieve this?
This works for me:
pipeline {
    agent any
    stages {
        stage ('read') {
            steps {
                script {
                    def data = readYaml text: """
0.6.5.1.0:
  module:
    - mysql
    - zk
0.7.1.0.0:
  module:
    - java
"""
                    version = '0.7.1.0.0'
                    println data.get(version).get('module')
                }
            }
        }
    }
}
The output:
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on server in /home/user/workspace/task
[Pipeline] {
[Pipeline] stage
[Pipeline] { (read)
[Pipeline] script
[Pipeline] {
[Pipeline] readYaml
[Pipeline] echo
[java]
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
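The likely cause of the null in the original attempt: "${version}" is a groovy.lang.GString, and a GString does not hash-match or equal the plain String keys that readYaml produces, so Map.get misses. A minimal plain-Groovy illustration (the map literal stands in for the parsed YAML):

```groovy
def data = ['0.7.1.0.0': ['module': ['java']]]   // stands in for readYaml's result
def version = '0.7.1.0.0'

assert data.get(version) != null                   // plain String key: found
assert data.get("${version}") == null              // GString key: not found
assert data.get("${version}".toString()) != null   // explicit conversion also works
```

This is why the working answer above passes the plain variable (data.get(version)) instead of an interpolated string.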

No item named 'freestyle' found

I am working on a basic Jenkins pipeline. I took this example from the documentation, but an error keeps popping up: 'No item named Pipeline found'.
// in this array we'll place the jobs that we wish to run
def branches = [:]

// running the job 4 times concurrently
// the dummy parameter is for preventing mutation of the parameter before the execution of the closure.
// we have to assign it outside the closure or it will run the job multiple times with the same parameter "4"
// and jenkins will unite them into a single run of the job
for (int i = 0; i < 4; i++) {
    def index = i // if we tried to use i below, it would equal 4 in each job execution.
    branches["branch${i}"] = {
        // Parameters:
        // param1: an example string parameter for the triggered job.
        // dummy: a parameter used to prevent triggering the job with the same parameter values.
        //        this parameter has to accept a different value each time the job is triggered.
        build job: 'freestyle', parameters: [
            string(name: 'param1', value: 'test_param'),
            string(name: 'dummy', value: "${index}")
        ]
    }
}
parallel branches
Started by user unknown or anonymous
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] parallel
[Pipeline] { (Branch: branch0)
[Pipeline] { (Branch: branch1)
[Pipeline] { (Branch: branch2)
[Pipeline] { (Branch: branch3)
[Pipeline] build
[Pipeline] }
Failed in branch branch0
[Pipeline] build
[Pipeline] }
Failed in branch branch1
[Pipeline] build
[Pipeline] }
Failed in branch branch2
[Pipeline] build
[Pipeline] }
Failed in branch branch3
[Pipeline] // parallel
[Pipeline] End of Pipeline
ERROR: No item named Pipeline found
Finished: FAILURE
You should create a freestyle job and put that job's name in your pipeline job - that is how your pipeline job will call the new job that you created.
As you are passing parameters as well, your freestyle job should have those parameters defined.
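As a sketch, the build step only needs the target job's full path relative to the Jenkins root; if the freestyle job lives inside a folder, the folder must be part of the name (the folder name here is illustrative, not from the question):

```groovy
// 'freestyle' job at the Jenkins root:
build job: 'freestyle', parameters: [string(name: 'param1', value: 'test_param')]

// the same job inside a hypothetical folder called 'MyFolder':
build job: 'MyFolder/freestyle', parameters: [string(name: 'param1', value: 'test_param')]
```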

Getting Jenkinsfile error - command not found

What is wrong with this Jenkinsfile? I am new to this and I don't get what I am doing wrong.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                dir ('/var/lib/jenkins/workspace/pipleline_2') {
                }
            }
        }
    }
    post {
        always {
            sh 'hello2.sh'
        }
        failure {
            mail(from: "heenashree2010@gmail.com",
                 to: "qshoretechnologies@gmail.com",
                 subject: "That build passed.",
                 body: "Nothing to see here")
        }
    }
}
I am getting the error below. hello2.sh exists in the directory which I have specified, but I am not able to execute it. I also tried sh('hello2.sh') but it didn't work for me. What am I doing wrong?
Started by user qshore
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/workspace/pipleline_2
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] dir
Running in /var/lib/jenkins/workspace/pipleline_2
[Pipeline] {
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] sh
[pipleline_2] Running shell script
+ hello2.sh
/var/lib/jenkins/workspace/pipleline_2@tmp/durable-dbcba8b2/script.sh: line 2: hello2.sh: command not found
[Pipeline] mail
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
The script called hello2.sh is not found. Make sure that it is included in the repo that you're checking out, and call it with an explicit path (for example sh './hello2.sh') from the directory that contains it - a bare name is only resolved through $PATH, which is why the shell reports exit code 127. Note also that the dir step in the Test stage is empty; the sh in the post section runs at the workspace root, not inside that directory.
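A sketch of a corrected post section, assuming the script really lives in the directory named in the question: invoke it by explicit path from inside that directory and make sure it is executable.

```groovy
post {
    always {
        dir('/var/lib/jenkins/workspace/pipleline_2') {
            // './' resolves the script relative to the current directory;
            // a bare 'hello2.sh' is looked up on $PATH only (hence exit 127)
            sh 'chmod +x hello2.sh && ./hello2.sh'
        }
    }
}
```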

Access builds from Parallel execution in Jenkins 2.0 Pipeline

I'm currently using the Build Flow plugin, which seems to have been abandoned in favor of Pipelines in Jenkins 2.0.
Running into some problems re-building our existing jobs using the new pipelines.
Currently, I have code similar to this:
ignore(FAILURE) {
    join = parallel([
        job1: { build('job1') },
        job2: { build('job2') },
        job3: { build('job3') }
    ])
}
results = [join.job1.result.toString(), join.job2.result.toString(), join.job3.result.toString()]
if (join.job1.result.toString() == 'SUCCESS') {
    buildList << join.job1.lastBuild.getDisplayName()
}
The goal here is to run multiple existing jobs in parallel, and then access information about the builds that completed. This has been working without issue in the Build Flow plugin.
I have been unable find a way to access this data using the new Pipelines.
echo 'Checking streams for latest builds'
join = [:]
join['Job1'] = { build job: 'Job1', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job2'] = { build job: 'Job2', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
join['Job3'] = { build job: 'Job3', parameters: [[$class: 'StringParameterValue', name: 'TimeWindow', value: '1200']], propagate: false}
parallel join
A dump of join['Job1'] doesn't give access to an AbstractBuild or similar, the way the Build Flow plugin does. Instead, it shows:
<org.jenkinsci.plugins.workflow.cps.CpsClosure2@2eac6ed9
def=com.cloudbees.groovy.cps.impl.CpsClosureDef@59647704
delegate=WorkflowScript@3aa1807f
owner=WorkflowScript@3aa1807f
thisObject=WorkflowScript@3aa1807f
resolveStrategy=0
directive=0
parameterTypes=null
maximumNumberOfParameters=0
bcw=null>
Using the new Pipelines, is there a way to access data like job1.result, job1.lastBuild, job1.lastBuild.getDisplayName()?
A little late, but you can also capture the RunWrapper object returned by the build command in your closure and place it in a map defined outside of the parallel command.
Here's an example. Note: I am using propagate: false so that exceptions (JUnit test failures, etc.) are not thrown. You would have to decide how you want to handle exceptions (try/catch/finally, etc.).
Example Pipeline Job to execute (needs to be parameterized with a string param commandStr):
env.PASSED_CMD = "${params.commandStr}"
stage('command-exec') {
    node {
        sh "${commandStr}"
    }
}
Executing job (config):
buildRuns = [:]
buildResults = [:]

def buildClosure(String jobKey, String paramAValue) {
    return {
        def runWrapper = build(
            job: 'command-test-job',
            propagate: false,
            parameters: [[$class: 'StringParameterValue', name: 'commandStr', value: paramAValue]]
        )
        buildResults."$jobKey" = runWrapper
    }
}

buildRuns."job1" = buildClosure("job1", "echo 'HI' && exit 0")
buildRuns."job2" = buildClosure("job2", "echo 'HO' && exit 0")
parallel buildRuns

for (k in buildRuns.keySet()) {
    def runResult = buildResults."$k"
    echo "$k -> ${runResult.result}"
    echo "$k -> ${runResult.buildVariables.PASSED_CMD}"
}
The build log shows:
[Pipeline] parallel
[Pipeline] [job1] { (Branch: job1)
[Pipeline] [job2] { (Branch: job2)
[Pipeline] [job1] build (Building command-test-job)
[job1] Scheduling project: command-test-job
[Pipeline] [job2] build (Building command-test-job)
[job2] Scheduling project: command-test-job
[job1] Starting building: command-test-job #7
[job2] Starting building: command-test-job #8
[Pipeline] [job2] }
[Pipeline] [job1] }
[Pipeline] // parallel
[Pipeline] echo
job1 -> SUCCESS
[Pipeline] echo
job1 -> echo 'HI' && exit 0
[Pipeline] echo
job2 -> SUCCESS
[Pipeline] echo
job2 -> echo 'HO' && exit 0
[Pipeline] End of Pipeline
Finished: SUCCESS
This is very similar to Steve-B's answer, but you don't actually need to define the RunWrapper explicitly or place it in an additional map beforehand.
tl;dr: you can just store the result of the parallel build in a hashmap and access that map by looping directly over its keySet.
Take this answer with a grain of salt: I am using an older version of Pipeline (Jenkins 2.7.2 and Pipeline 2.2).
You can store the parallel build results in a hashmap and loop over the map's keySet to get information about each build.
def create_build_job(job_name, pool_label = "master", propagate = false) {
    build job: job_name, parameters: [[$class: 'LabelParameterValue', name: "node_label", label: "${pool_label}"]], propagate: propagate, wait: true
}

def buildmap = [:]
def build_results

stage 'Perform Build'
// test1 is set to fail, test2 is set to succeed
buildmap['test1'] = { create_build_job('test1', "your_node_label") }
buildmap['test2'] = { create_build_job('test2', "your_node_label") }
build_results = parallel buildmap
for (k in build_results.keySet()) {
    println build_results["${k}"].getProperties()
}
For this pipeline I'm just dumping all of the properties of the RunWrapper stored in each item of the map; however, you can access each property directly, so if you want the result of the build you can just do:
build_results["${k}"].result
The console output produced by this pipeline (with any potentially identifying information redacted) is:
Started by user <user>
[Pipeline] stage (Perform Build)
Entering stage Perform Build
Proceeding
[Pipeline] parallel
[Pipeline] [test1] { (Branch: test1)
[Pipeline] [test2] { (Branch: test2)
[Pipeline] [test1] build (Building test1)
[test1] Scheduling project: test1
[test1] Starting building: test1 #11
[Pipeline] [test2] build (Building test2)
[test2] Scheduling project: test2
[test2] Starting building: test2 #11
[Pipeline] }
[Pipeline] }
[Pipeline] // parallel
[Pipeline] echo
{rawBuild=test1 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test1/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper@1480013a, id=11, nextBuild=null, changeSets=[], result=FAILURE, description=null, startTimeInMillis=1509667550519, timeInMillis=1509667550510, duration=956, number=11, displayName=#11}
[Pipeline] echo
{rawBuild=test2 #11, class=class org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper, absoluteUrl=<jenkins_url>/job/test2/11/, buildVariables={}, previousBuild=org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper@2d9c7128, id=11, nextBuild=null, changeSets=[], result=SUCCESS, description=null, startTimeInMillis=1509667550546, timeInMillis=1509667550539, duration=992, number=11, displayName=#11}
[Pipeline] End of Pipeline
Finished: SUCCESS
You can access that data by using the Jenkins API after the parallel step:
Jenkins.instance.getItemByFullName('Job1').lastBuild