Dynamic parallel pipeline - Jenkins

I have a Jenkins DSL pipeline where I create some steps dynamically, and I was wondering how I can run all of those steps in parallel.
Since I use a script block to split the string into an array and iterate over it, parallel complains, because it has to be placed between stage tags.
Here is a code example:
stages {
    stage('BuildAll') {
        script {
            parallel {
                "${names}".split(",").each { name ->
                    stage(name) {
                        sh "env | grep -i NODE_NAME"
                    }
                }
            }
        }
    }
}

Because you are running the parallel function inside the script directive, you must use the scripted syntax for the parallel function:
It takes a map from branch names to closures, and an optional argument failFast, which will terminate all branches upon a failure in any other branch:
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
So you can use the collectEntries method to iterate over your list and generate the Map that will be passed to the parallel function. Something like:
stages {
    stage('BuildAll') {
        steps {
            script {
                parallel names.split(',').collectEntries { name ->
                    ["Execution ${name}": { // Map key is the branch name
                        // Following code will be executed in parallel for each branch
                        stage(name) {
                            sh "env | grep -i NODE_NAME"
                        }
                    }]
                }
            }
        }
    }
}
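To make the transformation concrete: assuming names is the string 'alpha,beta' (illustrative values, not from the original question), the collectEntries call above builds a map roughly equivalent to the following, which is exactly the shape the scripted parallel step expects:

[
    'Execution alpha': { stage('alpha') { sh 'env | grep -i NODE_NAME' } },
    'Execution beta' : { stage('beta') { sh 'env | grep -i NODE_NAME' } }
]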
Another option is to define the map and then call parallel:
stages {
    stage('BuildAll') {
        steps {
            script {
                def executions = names.split(',').collectEntries { name ->
                    ["Execution ${name}": {
                        stage(name) {
                            sh "env | grep -i NODE_NAME"
                        }
                    }]
                }
                parallel executions
            }
        }
    }
}
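If you also want the failFast behaviour mentioned in the quoted documentation, it can be added as an extra entry in the same map before calling parallel. A minimal sketch, reusing the executions map from the example above:

def executions = names.split(',').collectEntries { name ->
    ["Execution ${name}": {
        stage(name) {
            sh "env | grep -i NODE_NAME"
        }
    }]
}
// failFast is passed to the scripted parallel step as just another map entry
executions.failFast = true
parallel executions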

Related

Jenkins pipeline: detect if a stage is started with the "Restart from stage" icon

Let's say I have a declarative pipeline, and I want to run a stage only when the 'Restart from stage' icon is used.
Is there a way to do this (a method, a variable...)? I want to run the stage only if "Restart from stage" is used.
stage('Test') {
    when {
        expression {
            // An expression to detect if Restart from this stage is used
        }
    }
    steps {
        sh 'echo 1'
    }
}
You can define a global variable that holds a Boolean indicating whether the pipeline was executed from the beginning or from a specific stage, update it in your first stage, and use it later in the when condition to determine whether a restart from stage has occurred. This works because "Restart from stage" skips all earlier stages, so the first stage never gets a chance to reset the variable and it keeps its initial value.
Something like:
RESTART = true

pipeline {
    agent any
    stages {
        stage('Setup') {
            steps {
                script {
                    // Signal that the pipeline was executed from the beginning (first stage)
                    RESTART = false
                }
                // other setup steps
            }
        }
        stage('Test') {
            when {
                expression { return RESTART }
            }
            steps {
                sh 'echo 1'
            }
        }
    }
}
Another nice option, based on @Pamela's answer about using a cause condition, is to use the built-in triggeredBy option of the when directive, thus avoiding the need to call getBuildCauses() and filter all causes, and instead getting the condition out of the box.
Something like:
stage('Test') {
    when { triggeredBy 'RestartDeclarativePipelineCause' }
    steps {
        sh 'echo 1'
    }
}
You can use currentBuild.getBuildCauses(): https://www.jenkins.io/doc/pipeline/examples/#get-build-cause
Then, in your Test stage, add a when expression checking that the cause of the build matches the one you need.
stage('Test') {
    when {
        expression {
            return currentBuild.getBuildCauses().any { cause ->
                cause._class == 'org.jenkinsci.plugins.pipeline.modeldefinition.causes.RestartDeclarativePipelineCause'
            }
        }
    }
    steps {
        sh 'echo 1'
    }
}
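If you are unsure which cause class a restart produces on your Jenkins version, you can print the causes first and copy the exact _class value from the output. A small debugging sketch:

steps {
    script {
        // Print all build causes so the exact _class value can be copied into the when expression
        echo "Build causes: ${currentBuild.getBuildCauses()}"
    }
}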

How to run all stages in parallel in a Jenkinsfile

I want to execute all the stages in parallel, with the loop based on user input.
This gives an error because script is not allowed directly under stages.
How can I achieve this?
pipeline {
    agent {
        node {
            label 'ec2'
        }
    }
    stages {
        script {
            int[] array = params.elements;
            for (int i in array) {
                parallel {
                    stage('Preparation') {
                        echo 'Preparation'
                        println(i);
                    }
                    stage('Build') {
                        echo 'Build'
                        println(i);
                    }
                }
            }
        }
    }
}
If you are using declarative pipelines you have two options. The first is to use static parallel stages, which are an integral part of the declarative syntax but do not allow dynamic or runtime modifications.
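For reference, a minimal static parallel block in declarative syntax looks like this (the stage names are only illustrative):

stage('Static Parallel Example') {
    parallel {
        stage('Preparation') {
            steps {
                echo 'Preparation'
            }
        }
        stage('Build') {
            steps {
                echo 'Build'
            }
        }
    }
}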
The second option (which is probably what you attempted) is to use the scripted parallel function:
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
When using it inside a declarative pipeline it should be wrapped in a script block, as you did, but the basic declarative structure must still be kept: pipeline -> stages -> stage -> steps -> script. In addition, the scripted parallel function receives a specifically formatted map like the example above.
In your case it could look something like:
pipeline {
    agent {
        node {
            label 'ec2'
        }
    }
    stages {
        stage('Parallel Execution') {
            steps {
                script {
                    parallel params.elements.collectEntries {
                        // The key of each entry is the parallel branch name
                        // and the value of each entry is the code to execute
                        ["Iteration for ${it}": {
                            stage('Preparation') {
                                echo 'Preparation'
                                println(it)
                            }
                            stage('Build') {
                                echo 'Build'
                                println(it)
                            }
                        }]
                    }
                }
            }
        }
    }
}
Or if you want to use the for loop:
pipeline {
    agent {
        node {
            label 'ec2'
        }
    }
    stages {
        stage('Parallel Execution') {
            steps {
                script {
                    def executions = [:]
                    for (int i in params.elements) {
                        def element = i // local copy so each closure captures its own value
                        executions["Iteration for ${element}"] = {
                            stage('Preparation') {
                                echo 'Preparation'
                                println(element)
                            }
                            stage('Build') {
                                echo 'Build'
                                println(element)
                            }
                        }
                    }
                    parallel executions
                }
            }
        }
    }
}
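One caveat, based on an assumption about the parameter type: if elements is actually a plain comma-separated string parameter such as "1,2,3" rather than a real list, it would need to be split before iterating, for example:

script {
    // Assumption: 'elements' is a comma-separated string parameter like "1,2,3"
    def elementList = params.elements.split(',').collect { it.trim() }
    parallel elementList.collectEntries { element ->
        ["Iteration for ${element}": {
            stage('Preparation') {
                echo "Preparation ${element}"
            }
            stage('Build') {
                echo "Build ${element}"
            }
        }]
    }
}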
Other useful examples for the parallel function can be found here

How to configure Jenkinsfile to only run a specific stage when running a daily cron job?

I've set a cron job to run every night; however, I only want it to run stage B within the Jenkinsfile, not all of the stages.
pipeline {
    agent any
    triggers {
        cron('@midnight')
    }
    stages {
        stage('A') {
            ...
        }
        stage('B') {
            when {
                allOf {
                    expression { env.CHANGE_TARGET == "master" }
                    branch "PR-*"
                }
            }
            steps {
                sh """
                    echo 'running script'
                    make run-script
                """
            }
        }
        stage('C') {
            ...
        }
    }
}
Without removing the conditionals in stage B, I can't figure out how to make the cron trigger run only stage B of the Jenkinsfile - I need to run that makefile script only when those conditionals are met OR during the daily midnight cron job.
You can achieve what you want with the Parameterized Scheduler Plugin, which lets you define cron triggers that start the job with a specific parameter value; you can then use this parameter as a condition to determine which stages to execute.
In your case you can reference that parameter in the when directive of each stage to decide whether it should run.
Something like:
pipeline {
    agent any
    parameters {
        booleanParam(name: 'MIDNIGHT_BUILD', defaultValue: false, description: 'Midnight build')
    }
    triggers {
        parameterizedCron('''
            0 0 * * * %MIDNIGHT_BUILD=true
        ''')
    }
    stages {
        stage('A') {
            when {
                expression { !params.MIDNIGHT_BUILD }
            }
            steps {
                ...
            }
        }
        stage('B') {
            when {
                expression { params.MIDNIGHT_BUILD || env.CHANGE_TARGET == "master" }
            }
            steps {
                sh """
                    echo 'running script'
                    make run-script
                """
            }
        }
        stage('C') {
            when {
                expression { !params.MIDNIGHT_BUILD }
            }
            steps {
                ...
            }
        }
    }
}

How to run parallel jobs from a map inside a Groovy function?

I have a Jenkinsfile that calls a function from a Groovy file:
Jenkinsfile:
pipeline {
    agent none
    environment {
        HOME = '.'
    }
    stages {
        stage("initiating") {
            agent {
                docker {
                    image 'docker-image'
                }
            }
            stages {
                stage('scanning') {
                    steps {
                        script {
                            workloadPipeline = load("Pipeline.groovy")
                            workloadPipeline.loopImages1(Images)
                        }
                    }
                }
            }
        }
    }
}
Groovy function:
def loopImages1(Images) {
    Images.each { entry ->
        parallel {
            stage('test-' + entry.key) {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    script {
                        sh """
                            docker run -d $entry.value
                        """
                    }
                }
            }
        }
    }
}
Images returns a map, something like this:
image-1 : 123.dkr.ecr.eu-west-1.amazonaws.com....
image-2 : 123.dkr.ecr.eu-west-1.amazonaws.com....
image-3 : 123.dkr.ecr.eu-west-1.amazonaws.com....
And I was trying to run it with parallel, which in this case should run 3 jobs in parallel, but it gives me the following error message:
java.lang.IllegalArgumentException: Expected named arguments but got
org.jenkinsci.plugins.workflow.cps.CpsClosure2@19027e83
What do I need to change in order to get this to work? From what I read it needs a map as input, which I'm already giving.
In case anyone has a similar question, here is the answer that solved my problem:
Groovy function:
def loopImages1(Images) {
    def parallelStage = [:]
    Images.each { entry ->
        parallelStage[entry.key] = {
            stage('test-' + entry.key) {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    script {
                        sh """
                            docker run -d $entry.value
                        """
                    }
                }
            }
        }
    }
    parallel parallelStage
}

Dynamically defining parallel steps in a declarative Jenkins pipeline

I am trying to parallelize a dynamically defined set of functions as follows:
def somefunc() {
    echo 'echo1'
}

def somefunc2() {
    echo 'echo2'
}

running_set = [
    { somefunc() },
    { somefunc2() }
]

pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                parallel(running_set)
            }
        }
    }
}
And what I end up with is:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 17: No "steps" or "parallel" to execute within stage "Run" @ line 17, column 9.
    stage('Run') {
Although steps are defined within the stage 'Run'. In any case, what I would like to achieve is running a dynamically defined set of functions in parallel.
If you want to use a dynamic parallel block with a declarative pipeline script, you have to apply two changes to your Jenkinsfile:
You have to define running_set as a Map, like ["task 1": { somefunc() }, "task 2": { somefunc2() }] - the keys from this map are used as the parallel stage names.
You have to pass running_set to the parallel method inside a script {} block.
Here is what the updated Jenkinsfile could look like:
def somefunc() {
    echo 'echo1'
}

def somefunc2() {
    echo 'echo2'
}

running_set = [
    "task1": {
        somefunc()
    },
    "task2": {
        somefunc2()
    }
]

pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel(running_set)
                }
            }
        }
    }
}
It is not obvious, but Szymon's approach can be written very compactly:
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel([
                        'parallelTask1_Name': {
                            // any code you like
                        },
                        'parallelTask2_Name': {
                            // any other code you like
                        }
                        // ... etc, more entries can be added here
                    ])
                }
            }
        }
    }
}
