Start multiple Jenkins jobs in a loop

I want to start another Jenkins job from a job called nightly. I've created a for loop, but my problem is that the loop runs only once and then finishes.
This is my code:
stage('Build_Nightly') {
    def devices = [
        'device1',
        'device2',
        'device3'
    ]
    for (int i = 0; i < devices.size(); i++) {
        build job: 'Build_Daily',
            parameters: [
                [$class: 'StringParameterValue', name: 'Device', value: devices[i]]
            ],
            wait: true
    }
}
The first run is successful, but there is no second run with device2.

Check out the propagate attribute in the build step documentation:
propagate (optional)
If enabled (default state), then the result of this step is that of the downstream build (e.g., success, unstable, failure, not built, or aborted). If disabled, then this step succeeds even if the downstream build is unstable, failed, etc.; use the result property of the return value as needed.
By default propagate is set to true, which means that if the downstream build fails, the step throws an exception, thus failing the caller job. If you want to run all jobs regardless of the result, pass the propagate attribute as false.
If you do need the result of the downstream build, the value returned by the build step contains a result property with the downstream job's result, which you can then use for any logic you want.
Example:
stage('Build_Nightly') {
    def devices = ['device1', 'device2', 'device3']
    devices.each { device ->
        def buildResults = build job: 'Build_Daily', wait: true, propagate: false,
            parameters: [string(name: 'Device', value: device)]
        println "The result of the downstream job is: ${buildResults.result}"
    }
}
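Building on that, the collected results can drive later logic, for example marking the calling build unstable only after every device has been built. This is a sketch, not tested against a live Jenkins; the `results` map name is illustrative:

```groovy
stage('Build_Nightly') {
    def devices = ['device1', 'device2', 'device3']
    def results = [:]
    devices.each { device ->
        // propagate: false keeps the loop going even when a downstream build fails
        def downstream = build job: 'Build_Daily', wait: true, propagate: false,
            parameters: [string(name: 'Device', value: device)]
        results[device] = downstream.result
    }
    // Act on the aggregate outcome once all devices have run
    if (results.any { it.value != 'SUCCESS' }) {
        currentBuild.result = 'UNSTABLE'
        echo "Failed devices: ${results.findAll { it.value != 'SUCCESS' }.keySet()}"
    }
}
```

This way a single failing device no longer aborts the nightly loop, but the nightly build still reflects that something went wrong.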

Related

How to force a Jenkins job to wait until all jobs triggered in a loop have executed

I have a requirement to trigger my job for 3 iterations (3 in the example below) without waiting on each one, but after all 3 jobs are triggered, the pipeline has to wait until all 3 have finished, irrespective of pass or fail.
If I use wait: true, the loop waits for each iteration; that's not what I want.
If I use wait: false, the pipeline doesn't wait at all once the loop has completed; it doesn't wait for the downstream jobs to finish. I want the current job to wait until I have the results of all 3 downstream builds.
// job1 is a pipeline job which I am triggering multiple times with different params
stage {
    for (int cntr = 0; cntr < 3; cntr++) {
        build job: 'job1',
            parameters: [string(name: 'param1', value: val[cntr])],
            wait: false
    }
}
I think what you actually want is to run them all in parallel and then wait until they all finish.
To do so you can use the parallel keyword:
parallel: Execute in parallel.
Takes a map from branch names to closures and an optional argument failFast which will terminate all branches upon a failure in any other branch:
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
In your case it can look something like:
stage('Build Jobs') {
    def values = ['value1', 'value2', 'value3']
    parallel values.collectEntries { value ->
        ["Building With ${value}": {
            build job: 'job1',
                parameters: [string(name: 'param1', value: value)],
                wait: true
        }]
    }
}
Or if you want to use indexes instead of a constant list:
stage('Build Jobs') {
    def range = 0..2 // or range = [0, 1, 2]
    parallel range.collectEntries { num ->
        ["Iteration ${num}": {
            build job: 'job1',
                parameters: [string(name: 'param1', value: somefunc(num))],
                wait: true
        }]
    }
}
This will execute all the jobs in parallel and then wait until they are all finished before progressing with the pipeline (don't forget to set the wait parameter of the build step to true).
You can find more examples for things like this here.

Can Jenkins run the same job multiple times in parallel?

I'm running Jenkins version 2.249.3 and am trying to create a pipeline that removes all old instances.
for (String clusterName : ClustersToRemove) {
    buildRemoveJob(clusterName, removeClusterBuilds, removeClusterBuildsResults)
    parallel removeClusterBuilds
}
and what the method does is:
def buildRemoveJob(clusterName, removeClusterBuilds, removeClusterBuildsResults) {
    removeClusterBuilds[clusterName] = {
        //Random rnd = new Random()
        //sleep (Math.abs(rnd.nextInt(Integer.valueOf(rangeToRandom)) + Integer.valueOf(minimumRunInterval)))
        removeClusterBuildsResults[clusterName] = build job: 'Delete_Instance', wait: true, propagate: false,
            parameters: [
                [$class: 'StringParameterValue', name: 'Cluster_Name', value: clusterName]
            ]
    }
}
But... I get only one downstream job being launched.
I found this bug https://issues.jenkins.io/browse/JENKINS-55748, but it looks like someone must have solved this issue since it's a very common scenario.
Also here - How to run the same job multiple times in parallel with Jenkins? - I found documentation, but it does not seem to apply to the same job.
The version of the Build Pipeline plugin is 1.5.8.
From the parallel command documentation:
Takes a map from branch names to closures and an optional argument failFast which will terminate all branches upon a failure in any other branch:
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
So you should first create a map of all executions and then run them all in parallel.
In your example, you should first iterate over the strings to create the executions map, then pass it to the parallel command. Something like this:
def executions = ClustersToRemove.collectEntries {
    ["building ${it}": {
        stage("Build") {
            removeClusterBuildsResults[it] = build job: 'Delete_Instance', wait: true, propagate: false,
                parameters: [[$class: 'StringParameterValue', name: 'Cluster_Name', value: it]]
        }
    }]
}
parallel executions
or without the variable:
parallel ClustersToRemove.collectEntries {
...
To answer directly: it depends on the number of executors you have.
With a single executor, the other triggered builds go into the queue.
Hope that answers your question.

How to get child job logs in parent job on child job failure?

I want to get child job logs in the parent job irrespective of whether the child job passes or fails. The following code returns child job logs in the parent job only when the child job passes:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo 'Hello World'
                echo 'In parent job'
                script {
                    def result = build job: 'ChildJob', parameters: []
                    println result.getRawBuild().getLog()
                }
            }
        }
    }
}
Is there any way where on failure I can get child job logs in parent job?
From https://jenkins.io/doc/pipeline/steps/pipeline-build-step/:
propagate (optional)
If set, then if the downstream build is anything but successful (blue ball), this step fails. If disabled, then this step succeeds even if the downstream build is unstable, failed, etc.; use the result property of the return value as needed.
This should answer your question exactly.
A bit late to the game, but I managed this with propagate: false, like this:
script {
    def run = build(
        job: "job name here",
        propagate: false
    )
    println(run.rawBuild.getLog())
    currentBuild.result = run.getResult()
}
Note that accessing rawBuild from a sandboxed pipeline typically requires in-process script approval by an administrator.

How can I run parallel jobs efficiently with a pipeline without having queueing of builds

I have a pipeline something like the one below.
stage('Build, run, report') {
    for (int i = 0; i < components.size(); ++i) {
        def component = components[i]
        builds[i] = {
            stage('Build') {
                build job: 'Build', parameters: [string(name: 'Component', value: component)]
            }
            stage('Run') {
                build job: 'Run', parameters: [string(name: 'Component', value: component)]
            }
            stage('Reporting') {
                build job: 'Reporting', parameters: [string(name: 'Component', value: component)]
            }
        }
    }
    parallel builds
}
Here "components" is a list coming from a pipeline parameter. I want to run the same flow for each component.
I have only one agent node with 4 executors. If I have 10 components, 4 will start running immediately and the other 6 will be queued, waiting for an executor to be free.
I can get even more than 50 components in the list, and having so many in the queue doesn't look good; I don't feel this is the right approach then. (I suspect there is also a limit on how many builds can be queued.)
Do we have a way to pause the parallel triggering until executors/agents are available, and resume one by one as they free up?
Or do we have a better way to handle this than a parallel run in the pipeline?
I haven't tried it myself, but maybe you can consider using a quiet period for the build job after every 4 components.
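Another option, sketched here under the assumption that `components`, `Build`, `Run`, and `Reporting` are the list and jobs from the question, is to split the list into batches no larger than the executor count and run one batch in parallel at a time, so at most four downstream builds are ever queued at once. This is an untested sketch, not a definitive implementation:

```groovy
stage('Build, run, report') {
    // Groovy's collate(n) splits a list into sub-lists of at most n items.
    // Each batch runs in parallel; the next batch starts only when the
    // previous one has fully finished.
    components.collate(4).each { batch ->
        parallel batch.collectEntries { component ->
            ["Flow ${component}": {
                build job: 'Build',     parameters: [string(name: 'Component', value: component)]
                build job: 'Run',       parameters: [string(name: 'Component', value: component)]
                build job: 'Reporting', parameters: [string(name: 'Component', value: component)]
            }]
        }
    }
}
```

The trade-off is that each batch waits for its slowest member before the next batch starts; the Lockable Resources plugin's `lock` step with a limited resource pool is another way to throttle concurrency without that drawback.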

Jenkins 2.8 pipeline loop to trigger the same job multiple times with different parameters

I am building a pipeline workflow in Jenkins v2.8. What I would like to achieve is one step which triggers the same job multiple times at the same time with different parameters.
Example: I have a workflow called "Master" which has one step that reads my parameter "Number", a checkbox with multiple selection options. So the user can trigger the workflow and select options for Number like "1, 2, 3". What I would like to achieve when this step is executed is that it calls my job "Master_Child" and triggers "Master_Child" with 3 different parameters at the same time.
I tried to do it in this way:
stage('MyStep') {
    steps {
        echo 'Deploying MyStep'
        script {
            env.NUMBER.split(',').each {
                build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")]
            }
        }
    }
}
But with this, it reads the first parameter, triggers Master_Child with parameter 1, and waits until the job is finished; only then does it trigger the same job with parameter 2.
If I use wait: false on the job call, the pipeline workflow just calls the jobs with different parameters, but its result no longer depends on whether a sub-job fails.
Any ideas how to implement that?
Thank you in advance.
I resolved my problem in this way:
stage('MyStage') {
    steps {
        echo 'Deploying MyStep'
        script {
            def numbers = [:]
            env.NUMBER.split(',').each {
                numbers["numbers${it}"] = {
                    build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")]
                }
            }
            parallel numbers
        }
    }
}
Alternatively, set wait to false in the build job syntax (wait: false):
stage('MyStep') {
    steps {
        echo 'Deploying MyStep'
        script {
            env.NUMBER.split(',').each {
                build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")], wait: false
            }
        }
    }
}