How to run parallel jobs from a map inside a Groovy function? - jenkins

I have a Jenkinsfile that calls a function from a Groovy file:
Jenkinsfile:
pipeline {
    agent none
    environment {
        HOME = '.'
    }
    stages {
        stage("initiating") {
            agent {
                docker {
                    image 'docker-image'
                }
            }
            stages {
                stage('scanning') {
                    steps {
                        script {
                            workloadPipeline = load("Pipeline.groovy")
                            workloadPipeline.loopImages1(Images)
                        }
                    }
                }
            }
        }
    }
}
Groovy function:
def loopImages1(Images) {
    Images.each { entry ->
        parallel {
            stage('test-' + entry.key) {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    script {
                        sh """
                        docker run -d $entry.value
                        """
                    }
                }
            }
        }
    }
}
Images is a map, something like this:
image-1 : 123.dkr.ecr.eu-west-1.amazonaws.com....
image-2 : 123.dkr.ecr.eu-west-1.amazonaws.com....
image-3 : 123.dkr.ecr.eu-west-1.amazonaws.com....
And I was trying to run it with parallel, which in this case should run 3 jobs in parallel, but it gives me the following error message:
java.lang.IllegalArgumentException: Expected named arguments but got
org.jenkinsci.plugins.workflow.cps.CpsClosure2#19027e83
What do I need to change in order to get this to work? From what I read it needs a map as input, which I'm already giving.

In case anyone has a similar question, here is the answer that solved my problem:
Groovy function:
def loopImages1(Images) {
    def parallelStage = [:]                    // map of branch name -> closure
    Images.each { entry ->
        parallelStage['test-' + entry.key] = { // the key becomes the parallel branch name
            stage('test-' + entry.key) {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    script {
                        sh """
                        docker run -d $entry.value
                        """
                    }
                }
            }
        }
    }
    parallel parallelStage                     // the scripted parallel step takes the map
}
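If one failing branch should also cancel the branches still running, the map accepted by the scripted parallel step can carry an optional failFast entry. A sketch of that variation (an optional addition, not part of the original answer):

```groovy
def loopImages1(Images) {
    def parallelStage = [:]
    Images.each { entry ->
        parallelStage['test-' + entry.key] = {
            stage('test-' + entry.key) {
                sh "docker run -d $entry.value"
            }
        }
    }
    parallelStage.failFast = true  // abort the remaining branches on the first failure
    parallel parallelStage
}
```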

Related

Jenkins Declarative Pipeline - Running multiple things paralellel but skip "branch" if earlier failure occured

I want to build one Jenkins pipeline that builds and runs tests against multiple versions of a program (e.g. different databases).
But when any step fails, I want to skip the following steps only for that "branch", so to speak.
This is my example code, where Stage 1 runs first with possible parallel steps (1.a, 1.b). The code does not work and is only a sketch of how I would like it to work:
pipeline {
    agent any
    environment {
        stageOneFailed = "false"
        stageTwoFailed = "false"
    }
    stages {
        stage("Stage 1") {
            parallel {
                stage("Stage 1.a") {
                    // Something like this maybe?
                    steps {
                        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                            // Do stuff here..
                        }
                    }
                    post {
                        unsuccessful {
                            // When stage did not succeed..
                            // Set stageOneFailed = "true"
                        }
                    }
                }
                stage("Stage 1.b") {
                    // Do Stuff..
                    // If Stage 1.b fails, set stageTwoFailed="true"
                }
            }
        }
        stage("Stage 2") {
            parallel {
                // Only run stages if earlier steps didn't fail
                stage("Stage 2.a") {
                    when {
                        environment(name: "stageOneFailed", value: "false")
                    }
                    steps {
                        // Do stuff..
                        // If Stage 2.a fails, set stageOneFailed="true"
                    }
                }
                stage("Stage 2.b") {
                    when {
                        environment(name: "stageTwoFailed", value: "false")
                    }
                    steps {
                        // Do stuff..
                        // If Stage 2.b fails, set stageTwoFailed="true"
                    }
                }
            }
        }
        // stage()
    }
}
Can anyone give any advice on how to do this the proper way?
Thanks in advance
EDIT: Changed code example. The example runs now!
pipeline {
    agent any
    environment {
        stageOneFailed = "false"
        stageTwoFailed = "false"
    }
    stages {
        stage("Stage 1") {
            parallel {
                stage("Stage 1.a") {
                    steps {
                        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                            bat "ech Stage 1.a" // Should fail because ech is not a valid command
                        }
                    }
                    post {
                        failure {
                            script {
                                env.stageOneFailed = "true"
                            }
                        }
                    }
                }
                stage("Stage 1.b") {
                    steps {
                        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                            bat "echo Stage 1.b" // Should not fail
                        }
                    }
                    post {
                        failure {
                            script {
                                env.stageTwoFailed = "true"
                            }
                        }
                    }
                }
            }
        }
        stage("Stage 2") {
            parallel {
                // Only run stages if earlier steps didn't fail
                stage("Stage 2.a") {
                    when {
                        environment(name: "stageOneFailed", value: "false")
                    }
                    steps {
                        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                            bat "echo Stage 2.a"
                        }
                    }
                    post {
                        failure {
                            script {
                                env.stageOneFailed = "true"
                            }
                        }
                    }
                }
                stage("Stage 2.b") {
                    when {
                        environment(name: "stageTwoFailed", value: "false")
                    }
                    steps {
                        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
                            bat "echo Stage 2.b"
                        }
                    }
                    post {
                        failure {
                            script {
                                env.stageTwoFailed = "true"
                            }
                        }
                    }
                }
            }
        }
    }
}
But when running the example, Stage 1.a fails and yet Stage 2.a is still run; maybe someone could help out here.
EDIT: I added output to see what value stageNFailed is set to. Even after setting env.stageOneFailed, the next stage still sees the old value "false".
My assumption is that when calling script env.stageNFailed = "true", the value is only set temporarily for that stage.
The example you have used is a perfectly acceptable way to do it. You introduced two env variables that are used to determine whether the previous step failed, and you used catchError to ensure that the pipeline doesn't fail when a stage fails. You have to use catchError in every stage to prevent the pipeline from failing (but I guess you already know that). In the post section of the stage, you set the appropriate env variable to true, which is also correct.
post {
    failure {
        script {
            env.stageOneFailed = true
        }
    }
}
Then, when the next relevant stage starts, you used the when condition to check whether the stage should run. You could also use an expression, but note that environment variable values are strings, so compare against the string:
when {
    expression { env.stageOneFailed == 'false' }
}
So basically you have done everything right.

Dynamic parallel pipeline

I have a Jenkins DSL pipeline where I create some steps dynamically, and I was wondering how I can run all the steps in parallel.
Since I use script to split the string into an array and iterate over it, parallel complains, since it has to be between stage tags.
Here is a code example:
stages {
    stage('BuildAll') {
        script {
            parallel {
                "${names}".split(",").each { name ->
                    stage(name) {
                        sh "env | grep -i NODE_NAME"
                    }
                }
            }
        }
    }
}
Because you are running the parallel function inside the script directive you must use the scripted syntax for the parallel function:
Takes a map from branch names to closures and an optional argument failFast, which will terminate all branches upon a failure in any other branch:
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
So you can use the collectEntries method to iterate over your list and generate the Map that will be passed to the parallel function. Something like:
stages {
    stage('BuildAll') {
        steps {
            script {
                parallel names.split(',').collectEntries { name ->
                    ["Execution ${name}": { // Map key is the branch name
                        // The following code will be executed in parallel for each branch
                        stage(name) {
                            sh "env | grep -i NODE_NAME"
                        }
                    }]
                }
            }
        }
    }
}
Another option is to define the map and then call parallel:
stages {
    stage('BuildAll') {
        steps {
            script {
                def executions = names.split(',').collectEntries { name ->
                    ["Execution ${name}": {
                        stage(name) {
                            sh "env | grep -i NODE_NAME"
                        }
                    }]
                }
                parallel executions
            }
        }
    }
}
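As the quoted documentation mentions, failFast can also be combined with a dynamically built map. One way to sketch this (an optional variation, not from the original answer) is to set it as an extra entry before calling parallel:

```groovy
steps {
    script {
        def executions = names.split(',').collectEntries { name ->
            ["Execution ${name}": {
                stage(name) {
                    sh "env | grep -i NODE_NAME"
                }
            }]
        }
        executions.failFast = true  // terminate all branches upon a failure in any branch
        parallel executions
    }
}
```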

Show a Jenkins pipeline build job stage as failed without failing the whole job

I have a Jenkins pipeline with some parallel stages that should not fail the job if they fail.
Those stages start a build job.
I started from https://stackoverflow.com/a/56975220/1817610.
The original sample works, but not if my stage builds another pipeline.
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            parallel {
                stage('2.1') {
                    steps {
                        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                            build job: 'Failing pipeline'
                        }
                    }
                }
                stage('2.2') {
                    steps {
                        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                            build job: 'Succesful pipeline'
                        }
                    }
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
See build 7 in the screenshot.
If I change the stage to
stage('2.1') {
    steps {
        build job: 'Failing pipeline', propagate: false
    }
}
the job does not fail, but the stage does not fail either; see build 8.
I'd like the global state to be successful while still showing that one of the builds failed.
You could make use of a pure Groovy try..catch block and control SUCCESS and FAILURE with some condition.
Below is an example:
pipeline {
    agent any;
    stages {
        stage('01') {
            steps {
                sh "echo Hello"
            }
        }
        stage('02') {
            parallel {
                stage('02.1') {
                    steps {
                        script {
                            try {
                                def v = 10/0
                                println v
                            } catch (Exception e) {
                                println e
                            }
                        }
                    }
                }
                stage('02.2') {
                    steps {
                        script {
                            try {
                                def v = 10 % 2
                                println v
                            } catch (Exception e) {
                                println e
                            }
                        }
                    }
                }
            }
        }
    }
}
In my example, parallel stage 02.1 will fail with java.lang.ArithmeticException: Division by zero, but the catch block will handle it by catching the exception.
If you want to fail the build under some condition, you can put an if..else inside the catch {} and fail by re-throwing the exception to Jenkins, like:
...
stage('02.1') {
    steps {
        script {
            try {
                def v = 10/0
                println v
            } catch (Exception e) {
                if (someCondition) {
                    println e
                } else {
                    throw e
                }
            }
        }
    }
}
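For the original problem of marking a stage red while keeping the build green when a downstream build fails, another sketch (a combination of the two approaches discussed, not from the original answer) is to use propagate: false, inspect the returned run, and fail the stage explicitly inside catchError:

```groovy
stage('2.1') {
    steps {
        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
            script {
                // propagate: false returns the downstream run instead of failing this build
                def run = build(job: 'Failing pipeline', propagate: false)
                if (run.result != 'SUCCESS') {
                    // error fails this stage; catchError keeps the overall build SUCCESS
                    error("Downstream build ended with ${run.result}")
                }
            }
        }
    }
}
```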

Jenkins Pipeline execute same step on multiple nodes

I have the code below, which executes fine on one node, Server1:
pipeline {
    agent {
        node {
            label 'Server1'
        }
    }
    stages {
        stage('Stage1') {
            steps {
                callStage1()
            }
        }
        stage('Stage2') {
            steps {
                callStage2()
            }
        }
    }
}
def callStage1() {
    sh ''' #shell script
    '''
}
def callStage2() {
    sh ''' #shell script
    '''
}
I want to execute Stage1 on Server1 only, and Stage2 serially on three nodes: Server1, Server2 and Server3.
How can this be achieved?
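Since the question is open, here is one possible sketch (an assumption about the intended setup, not a confirmed answer): keep Stage1 pinned to Server1 via the pipeline-level agent, and inside Stage2 loop over the three labels with the scripted node step, so each server runs the shell script one after another:

```groovy
stage('Stage2') {
    steps {
        script {
            // Serial execution: each node block finishes before the next label starts
            ['Server1', 'Server2', 'Server3'].each { label ->
                node(label) {
                    callStage2()
                }
            }
        }
    }
}
```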

Dynamically defining parallel steps in declarative jenkins pipeline

I am trying to parallelize a dynamically defined set of functions, as follows:
def somefunc() {
    echo 'echo1'
}
def somefunc2() {
    echo 'echo2'
}
running_set = [
    { somefunc() },
    { somefunc2() }
]
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                parallel(running_set)
            }
        }
    }
}
And what I end up with is:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 17: No "steps" or "parallel" to execute within stage "Run" @ line 17, column 9.
stage('Run') {
Although the steps are defined within the 'Run' stage. What I would like to achieve is running a dynamically defined set of functions in parallel.
If you want to use a dynamic parallel block with a declarative pipeline script, you have to apply two changes to your Jenkinsfile:
1. Define running_set as a Map like ["task 1": { somefunc() }, "task 2": { somefunc2() }] - the keys of this map are used as the parallel stage names.
2. Pass running_set to the parallel method inside a script {} block.
Here is what updated Jenkinsfile could look like:
def somefunc() {
    echo 'echo1'
}
def somefunc2() {
    echo 'echo2'
}
running_set = [
    "task1": {
        somefunc()
    },
    "task2": {
        somefunc2()
    }
]
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel(running_set)
                }
            }
        }
    }
}
And here is what it looks like in Blue Ocean UI:
It is not obvious, but Szymon's approach can also be written very straightforwardly inline:
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel([
                        'parallelTask1_Name': {
                            // any code you like
                        },
                        'parallelTask2_Name': {
                            // any other code you like
                        },
                        // ... etc
                    ])
                }
            }
        }
    }
}
