Jenkins declarative pipeline parallel steps executors

I am migrating a job from multijob to a Jenkins declarative pipeline job. I am unable to run the parallel steps on multiple executors.
For example, in the pipeline below I see only one executor being used when I run the pipeline.
I was wondering why only a single executor is used. The idea is that each parallel step would invoke a make target that builds a Docker image.
pipeline {
    agent none
    stages {
        stage('build libraries') {
            agent { label 'master' }
            steps {
                parallel(
                    "nodejs_lib": {
                        dir(path: 'nodejs_lib') {
                            sh 'sleep 110'
                        }
                    },
                    "python_lib": {
                        dir(path: 'python_lib') {
                            sh 'sleep 100'
                        }
                    }
                )
            }
        }
    }
    options {
        ansiColor('gnome-terminal')
        buildDiscarder(logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '30'))
        timestamps()
    }
}

You can try the following approach to run parallel tasks in your pipeline job:
def tasks = [:]

tasks["Task No.1"] = {
    stage("TASK1") {
        node('master') {
            sh '<docker_build_command_here>'
        }
    }
}

tasks["Task No.2"] = {
    stage("TASK2") {
        node('master') {
            sh '<docker_build_command_here>'
        }
    }
}

tasks["Task No.3"] = {
    stage("TASK3") {
        node('remote_node') {
            sh '<docker_build_command_here>'
        }
    }
}

parallel tasks
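Note that each node(...) call above requests its own executor: the two branches pinned to 'master' occupy two executors on that node (so 'master' needs at least two executors configured), while the third branch runs on its own executor on 'remote_node'. That is what lets the branches actually run concurrently.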
If you want to execute the parallel tasks on a single node and also share the same workspace between them, you can use the following approach:
node('master') {
    def tasks = [:]

    tasks["Task No.1"] = {
        stage("TASK1") {
            sh '<docker_build_command_here>'
        }
    }

    tasks["Task No.2"] = {
        stage("TASK2") {
            sh '<docker_build_command_here>'
        }
    }

    parallel tasks
}
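If you prefer to stay with declarative syntax, the same idea can be sketched by nesting stages under a parallel block and giving each branch its own agent, so each branch gets its own executor. This is only a sketch: the 'docker-build' label and the make targets are placeholders, not part of the original question.
pipeline {
    agent none
    stages {
        stage('build libraries') {
            parallel {
                stage('nodejs_lib') {
                    agent { label 'docker-build' } // placeholder label
                    steps {
                        dir('nodejs_lib') {
                            sh 'make nodejs_lib'   // placeholder make target
                        }
                    }
                }
                stage('python_lib') {
                    agent { label 'docker-build' } // placeholder label
                    steps {
                        dir('python_lib') {
                            sh 'make python_lib'   // placeholder make target
                        }
                    }
                }
            }
        }
    }
}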

Related

Trigger pipeline job from other and wait for it on next stage

I need to execute pipeline B from pipeline A, do some work, and then come back and wait until B is finished.
Something like this:
pipeline {
    stages {
        stage('Init') {
            steps {
                job_b = build(job: "my_name", wait: false)
            }
        }
        stage('step 2') {
            steps {
                // ...do some work
            }
        }
        stage('step3') {
            steps {
                job_b.waitUtilFinish()   // pseudocode: no such step exists
            }
        }
    }
}
I am familiar with parallel, but I don't want to use it because of how Blue Ocean displays parallel branches, and because I have the entire rest of the pipeline to run before checking on the other job.
You can merge the 'Init' and 'step 3' stages into a single stage and use the parallel directive:
pipeline {
    agent any
    stages {
        stage('step 1 + step 3') {
            parallel {
                stage('step 1') {
                    steps {
                        // trigger job B and block this branch until it finishes
                        build(job: 'my_name', wait: true)
                    }
                }
                stage('step 2') {
                    steps {
                        echo '...do some work'
                    }
                }
            }
        }
    }
}
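As a usage note (a sketch, not from the original answer): with wait: true the build step blocks its branch until job B completes and returns a RunWrapper, while propagate: false keeps a failing B from immediately failing this pipeline so you can inspect the result yourself.
script {
    // sketch: 'my_name' is the downstream job name from the question
    def b = build(job: 'my_name', wait: true, propagate: false)
    echo "B finished with result: ${b.result}"
}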

Run multiple Jobs in parallel via Jenkins Declarative pipeline syntax

I want to execute multiple jobs in parallel from a single pipeline using declarative syntax. Is this possible? I know we can make a declarative parallel pipeline using the "parallel" directive.
pipeline {
    agent any
    parallel {
        stages {
            stage('Test1') {
                steps {
                    sh 'pip install -r requirements.txt'
                }
            }
            stage('Test2') {
                steps {
                    echo 'Stage 2'
                    sh 'behave -f allure_behave.formatter:AllureFormatter -o allure-results features/scenarios/**/*.feature'
                }
            }
            stage('Test3') {
                steps {
                    script {
                        allure([
                            includeProperties: false,
                            jdk: '',
                            properties: [],
                            reportBuildPolicy: 'ALWAYS',
                            results: [[path: 'allure-results']]
                        ])
                    }
                }
            }
        }
    }
}
The flow I want is each job running as its own parallel branch. Is there any approach to do this?
// Pipeline project: SO-69680107-1-parallel-downstream-jobs-matrix
pipeline {
    agent any
    stages {
        stage('Clean Workspace') {
            steps {
                cleanWs()
            }
        }
        stage('Job matrix') {
            matrix {
                axes {
                    axis {
                        name 'job'
                        values 'SO-69680107-2', 'SO-69680107-3', 'SO-69680107-k' // , ...
                    }
                }
                stages {
                    stage('Run job') {
                        steps {
                            build "$job"
                            copyFiles( "$WORKSPACE\\..\\$job", "$WORKSPACE" )
                        }
                    } // stage 'Run job'
                }
            } // matrix
        } // stage 'Job matrix'
        stage('List upstream workspace') {
            steps {
                bat "#dir /b \"$WORKSPACE\""
            }
        }
    } // stages
}

def copyFiles( downstreamWorkspace, upstreamWorkspace ) {
    dir("$downstreamWorkspace") {
        bat """
            #set prompt=\$g\$s
            #echo Begin: %time%
            dir /b
            xcopy /f *.* \"$upstreamWorkspace\\\"
            #echo End: %time%
        """
    }
}
Template for downstream projects SO-69680107-2, SO-69680107-3, SO-69680107-k:
// Pipeline project: SO-69680107-X
pipeline {
    agent any
    stages {
        stage('Stage X') {
            steps {
                sh 'set +x; echo "Step X" | tee SO-69680107-X.log; date; sleep 3; date'
            }
        }
    }
}
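If you only have a handful of fixed downstream jobs and don't need the matrix or the workspace copying, a plain parallel block of build steps is a lighter-weight sketch of the same idea (reusing the job names from the example above):
pipeline {
    agent any
    stages {
        stage('Run downstream jobs') {
            parallel {
                stage('SO-69680107-2') {
                    steps { build job: 'SO-69680107-2', wait: true }
                }
                stage('SO-69680107-3') {
                    steps { build job: 'SO-69680107-3', wait: true }
                }
            }
        }
    }
}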

Quicker syntax for Jenkins identical parallel stages

I have some parallel stages in my Jenkins pipeline. They are all identical, except that they run on different agents:
stage {
    parallel {
        stage {
            agent {
                label 'agent-1'
            }
            steps {
                sh 'do task number 468'
            }
        }
        stage {
            agent {
                label 'agent-2'
            }
            steps {
                sh 'do task number 468'
            }
        }
        stage {
            agent {
                label 'agent-3'
            }
            steps {
                sh 'do task number 468'
            }
        }
    }
}
I want to add more parallel stages on more nodes, but the script is long and repetitive. What's the best way to rewrite this to tell Jenkins to parallelize the same steps across agents 1, 2, 3, 4, etc.?
Please see the code below, which creates and runs the same stage on multiple agents:
// Define your agents
def agents = ['agent-1', 'agent-2', 'agent-3']

def createStage(label) {
    return {
        stage("Runs on ${label}") {
            node(label) {
                // build steps that should happen on all nodes go here
                echo "Running on ${label}"
                sh 'do task number 468'
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : createStage(it)]
}

pipeline {
    agent none
    stages {
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
More information is available at: Jenkins examples
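To make the collectEntries call concrete: for agents = ['agent-1', 'agent-2', 'agent-3'] it produces the same map you would otherwise write by hand:
// Equivalent hand-written map for the three example agents
def parallelStagesMap = [
    'agent-1': createStage('agent-1'),
    'agent-2': createStage('agent-2'),
    'agent-3': createStage('agent-3')
]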

Jenkins Scripted Pipeline use global timestamps options

In my scripted pipeline I would like to set the global timestamps and ansiColor options.
The scripted pipeline below is not working. How can I add these two options to a scripted pipeline?
Declarative Pipeline
pipeline {
    agent none
    options {
        timestamps()
        ansiColor('xterm')
    }
    stages {
        stage('Checkout') {
            agent { label 'linux' }
            steps {
                echo "test"
            }
        }
    }
}
Scripted Pipeline
node('linux') {
    options {
        timestamps()
        ansiColor('xterm')
    }
    stage('Pre Build Setup') {
        task('Display env') {
            echo "test"
        }
    }
}
In the case of a scripted pipeline, all you have to do is wrap your script in the timestamps and ansiColor('xterm') steps, as shown in the example below:
node {
    timestamps {
        ansiColor("xterm") {
            stage("A") {
                echo 'This is stage A'
                sh 'printf "\\e[31mHello World\\e[0m\\n"'
                sh "sleep 3s"
            }
            stage("B") {
                echo "This is stage B"
            }
        }
    }
}
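If you also need job-level settings that declarative puts under options (a build discarder, for example), those are declared with the properties step in a scripted pipeline, while wrapper behaviour such as timestamps and ansiColor stays with the block steps shown above. A minimal sketch:
// Job-level settings in a scripted pipeline go through the properties step,
// since scripted pipelines have no options {} section.
properties([
    buildDiscarder(logRotator(numToKeepStr: '30'))
])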

How to use failFast in dynamic pipeline in Jenkins

I have a pipeline with dynamic parallel stages, and I want the pipeline to fail fast if any of the stages fail. I tried to add failFast: true, but my pipeline is stuck at "Failed at Stage ABC".
stage("Deploy") {
steps {
script {
def stages = createStages("Name", "Project")
fastFail: true
for (stage in stages) {
parallel stage
}
}
}
}
Solution: use the failFast flag in your Jenkins pipeline.
From the documentation: you can force all of your parallel stages to be aborted when one of them fails by adding failFast true to the stage containing the parallel block.
Note that a failing branch only aborts the other branches whose agent nodes have already started: if job 'a' fails while job 'b' is still waiting for a node and has not started yet, job 'b' will continue (this is an edge case).
Examples - the options are:
1. Use the parallelsAlwaysFailFast() method in the options section of your pipeline:
pipeline {
    agent any
    options {
        parallelsAlwaysFailFast()
    }
    stages {
        stage('Non-Parallel Stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('Parallel Stage') {
            when {
                branch 'master'
            }
            parallel {
                stage('Branch A') {
                    agent {
                        label "for-branch-a"
                    }
                    steps {
                        echo "On Branch A"
                    }
                }
                stage('Branch B') {
                    agent {
                        label "for-branch-b"
                    }
                    steps {
                        echo "On Branch B"
                    }
                }
                stage('Branch C') {
                    agent {
                        label "for-branch-c"
                    }
                    stages {
                        stage('Nested 1') {
                            steps {
                                echo "In stage Nested 1 within Branch C"
                            }
                        }
                        stage('Nested 2') {
                            steps {
                                echo "In stage Nested 2 within Branch C"
                            }
                        }
                    }
                }
            }
        }
    }
}
2. Use failFast true before the parallel block:
stage('Parallel Stage') {
    when {
        branch 'master'
    }
    failFast true
    parallel {
3.Configure jobs in map and execute with failFast attribute on.
jobsList = [
{job: 'jobA', parameters: [booleanParam(name: 'flag', value: true)]},
{job: 'jobB', parameters: [booleanParam(name: 'flag', value: true)]}
]
jobsList.failFast = true
parallel(jobsList)
I couldn't reply to the answer provided by @avivamg, but I wasn't able to use their solution directly.
This worked for me:
stages.failFast = true
parallel stages
Or in your case:
stage("Deploy") {
steps {
script {
def stages = createStages("Name", "Project")
stages.fastFail = true
// I'm not sure if the for loop will work as failFast is on the map
// so if that doesn't work then you could use this instead:
// parallel stages
for (stage in stages) {
parallel stage
}
}
}
}
If you're using a scripted pipeline, you need to add failFast to the parallel step like so:
stage('SomeStage') {
    parallel(
        "Process1": {
            // do something
        },
        "Process2": {
            // do something else
        },
        failFast: true
    )
}
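For the dynamic case in the original question, one sketch (assuming createStages can return, or be adapted to return, a map of branch names to closures) is to collect everything into a single map, set failFast on it, and make a single parallel call; the branch names here are hypothetical:
script {
    def branches = [failFast: true]
    // 'ABC' and 'DEF' are hypothetical names; build this map from createStages instead
    ['ABC', 'DEF'].each { name ->
        branches[name] = {
            stage(name) {
                echo "deploying ${name}"
            }
        }
    }
    parallel branches
}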
