Jenkins parallel script in loop using wrong variables

I'm trying to build a dynamic group of steps to run in parallel. The following example is what I came up with (and found examples of at https://devops.stackexchange.com/questions/3073/how-to-properly-achieve-dynamic-parallel-action-with-a-declarative-pipeline). But I'm having trouble getting it to use the expected variables. The result always seems to be the variables from the last iteration of the loop.
In the following example the echo output is always bdir2 for both tests:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        rolePath = new File(f).getParentFile()
                        roleName = rolePath.toString().split('/')[1]
                        tests[roleName] = {
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
I'm expecting one of the tests to output adir2 and another to be bdir2. What am I missing here?

Move the role-name parsing inside the test closure and key the map by f, so each closure captures the each parameter f, which is scoped per iteration. In your version, rolePath and roleName are not declared with def, so they are shared script-level variables; by the time the parallel closures actually run, they only see the values from the last iteration:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        tests[f] = {
                            rolePath = new File(f).getParentFile()
                            roleName = rolePath.toString().split('/')[1]
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
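For comparison, here is a minimal sketch (not from the answer) that keeps roleName as the map key: declaring the variables with def inside the each closure makes them local to each iteration, so every test closure captures its own copies. The File-based path parsing is simplified to a plain split here; this would go inside the same script block.

def tests = [:]
['adir1/adir2/adir3', 'bdir1/bdir2/bdir3'].each { f ->
    def roleName = f.split('/')[1]   // local to this iteration
    tests[roleName] = {
        echo roleName                // captures this iteration's roleName
    }
}
parallel tests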

Related

Declarative dynamic parallel stages

I figure I’m doing something unorthodox here, but I’d like to stick to declarative for convenience while dynamically generating parallel steps.
I found a way to do something like that, but mixing both paradigms, which doesn’t seem to work well with the BlueOcean UI (multiple stages inside each parallel branch do not show up properly).
The closest I got was with something like this:
def accounts() {
    return ["dynamic", "list"]
}

def parallelJobs() {
    jobs = []
    for (account in accounts()) {
        jobs << stage(account) {
            steps {
                echo "Step for $account"
            }
        }
    }
    return jobs
}

// this is inside a shared library, called by my Jenkinsfile, like what is described
// under "Defining Declarative Pipelines in Shared Libraries" in
// https://www.jenkins.io/blog/2017/09/25/declarative-1/
def call() {
    pipeline {
        stages {
            stage('Build all variations') {
                parallel parallelJobs()
            }
        }
    }
}
The problem is Jenkins errors like this:
Expected a block for parallel @ line X, column Y.
parallel parallelJobs()
^
So, I was wondering if there is a way I could transform that list of stages, returned by parallelJobs(), into the block expected by Jenkins...
Yes, you can. You need to return a map of stages. Following is a working pipeline example.
pipeline {
    agent any
    stages {
        stage('Parallel') {
            steps {
                script {
                    parallel parallelJobs()
                }
            }
        }
    }
}

def accounts() {
    return ["dynamic", "list"]
}

def parallelJobs() {
    jobs = [:]
    for (account in accounts()) {
        jobs[account] = {
            stage(account) {
                echo "Step for $account"
            }
        }
    }
    return jobs
}
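Note that when the branch closures are built in a plain for loop, they can all end up capturing the same loop variable (the pitfall discussed in the next question below). As an alternative sketch (not part of the original answer), collectEntries sidesteps that, because the closure parameter is scoped per element:

def parallelJobs() {
    return accounts().collectEntries { account ->
        [(account): {
            stage(account) {
                echo "Step for ${account}"
            }
        }]
    }
}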

Getting the same output from parallel stages in jenkins scripted pipelines

I'm trying to create parallel stages in a Jenkins pipeline, say with this example:
node {
    stage('CI') {
        script {
            doDynamicParallelSteps()
        }
    }
}

def doDynamicParallelSteps() {
    tests = [:]
    for (f in ["Branch_1", "Branch_2", "Branch_3"]) {
        tests["${f}"] = {
            node {
                stage("${f}") {
                    echo "${f}"
                }
            }
        }
    }
    parallel tests
}
I'm expecting to see "Branch_1", "Branch_2", "Branch_3" and instead I'm getting "Branch_3", "Branch_3", "Branch_3"
I don't understand why. Can you please help?
Short answer: in the classic view, the stage names display the last value of the variable ${f}, and all the echo steps print that same last value. You need to change the loop.
Long answer: Jenkins does not allow multiple stages with the same name, so this could never have worked as written :)
In your example you can see it fine in Blue Ocean, and the names are right in the console output too.
It is only the Jenkins classic view that shows the last value of ${f} as the stage name.
Solution: Change your loop. This worked fine for me.
node {
    stage('CI') {
        script {
            doDynamicParallelSteps()
        }
    }
}

void doDynamicParallelSteps() {
    def branches = [:]
    for (int i = 0; i < 3; i++) {
        int branch = i + 1   // fresh local variable each iteration, so each closure captures its own value
        branches["branch_${branch}"] = {
            stage("Branch_${branch}") {
                node {
                    sh "echo branch_${branch}"
                }
            }
        }
    }
    parallel branches
}
This has to do with closures and iteration: the loop variable f is shared, so every closure ends up seeing its last value. Making a per-iteration copy might fix it:
for (f in ["Branch_1", "Branch_2", "Branch_3"]) {
    def definitive_name = f
    tests[definitive_name] = {
        node {
            stage(definitive_name) {
                echo definitive_name
            }
        }
    }
}

How to access folder variables across pipeline stages?

I am trying to create multiple pipeline jobs under a folder. Under this folder I have created some folder properties. I am having a hard time using these folder properties across multiple stages in a job.
plugin used : https://wiki.jenkins.io/display/JENKINS/Folder+Properties+Plugin
def region

pipeline {
    agent any
    stages {
        stage('Assign values to global properties') {
            steps {
                withFolderProperties {
                    region = "${env.appRegion}"
                }
            }
        }
        stage('Print') {
            steps {
                print(region)
            }
        }
    }
}
Error:
Expected a step @ line 8, column 21.
    region = "${env.appRegion}"
Thanks in Advance
region = "${env.appRegion}" is not a pipeline step or directive; it's a Groovy statement, so it has to go inside a script step. In a Scripted Pipeline you can put Groovy statements anywhere, but in a Declarative Pipeline any Groovy statement must be wrapped in a script step.
steps {
    script {
        withFolderProperties {
            region = "${env.appRegion}"
        }
    }
}

steps {
    withFolderProperties {
        script {
            region = "${env.appRegion}"
        }
    }
}
I'm not sure which of the two code blocks above works, but you can give them a try.
#!groovy
def CI_NAMESPACE = ""

withFolderProperties {
    CI_NAMESPACE = "${env.CI_NAMESPACE}"
}

println "CI_NAMESPACE = ${CI_NAMESPACE}"
if (CI_NAMESPACE == '' || CI_NAMESPACE == null || CI_NAMESPACE == 'null') {
    currentBuild.result = 'ABORTED'
    error('CI_NAMESPACE not defined in the Folder Properties plugin!')
}

pipeline {
    environment {
        CI_NAMESPACE = "${CI_NAMESPACE}"
    }
    stages {
        stage('Test') {
            steps {
                echo "CI_NAMESPACE: ${env.CI_NAMESPACE}"
            }
        }
    }
}

Dynamic number of parallel steps in declarative pipeline

I'm trying to create a declarative pipeline which runs a number of jobs (configurable via a parameter) in parallel, but I'm having trouble with the parallel part.
Basically, for some reason the below pipeline generates the error
Nothing to execute within stage "Testing" @ line .., column ..
and I cannot figure out why, or how to solve it.
import groovy.transform.Field

@Field def mayFinish = false

def getJob() {
    return {
        lock("finiteResource") {
            waitUntil {
                script {
                    mayFinish
                }
            }
        }
    }
}

def getFinalJob() {
    return {
        waitUntil {
            script {
                try {
                    echo "Start Job"
                    sleep 3 // Replace with something that might fail.
                    echo "Finished running"
                    mayFinish = true
                    true
                } catch (Exception e) {
                    echo e.toString()
                    echo "Failed :("
                }
            }
        }
    }
}

def getJobs(def NUM_JOBS) {
    def jobs = [:]
    for (int i = 0; i < (NUM_JOBS as Integer); i++) {
        jobs["job${i}"] = getJob()
    }
    jobs["finalJob"] = getFinalJob()
    return jobs
}

pipeline {
    agent any
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    parameters {
        string(
            name: "NUM_JOBS",
            description: "Set how many jobs to run in parallel"
        )
    }
    stages {
        stage('Setup') {
            steps {
                echo "Setting it up..."
            }
        }
        stage('Testing') {
            steps {
                parallel getJobs(params.NUM_JOBS)
            }
        }
    }
}
I've seen plenty of examples doing this in the old pipeline, but not declarative.
Anyone know what I'm doing wrong?
At the moment, it doesn't seem possible to dynamically provide the parallel branches when using a Declarative Pipeline.
Even if you add an earlier stage that calls getJobs() in a script block and stores the result in the binding, the same error message is thrown.
In this case you'd have to fall back to using a Scripted Pipeline.
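For reference, a rough sketch (my addition, not from the answer) of what that scripted fallback could look like, keeping the question's getJob()/getFinalJob()/getJobs() helpers, except that the script wrappers inside them can simply be dropped, since the script step is only needed inside Declarative steps:

node {
    stage('Setup') {
        echo "Setting it up..."
    }
    stage('Testing') {
        // assumes the NUM_JOBS parameter is defined on the job, e.g. via
        // properties([parameters([string(name: 'NUM_JOBS', defaultValue: '3', description: '...')])])
        parallel getJobs(params.NUM_JOBS)
    }
}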

Can I create dynamically stages in a Jenkins pipeline?

I need to launch a dynamic set of tests in a declarative pipeline.
For better visualization purposes, I'd like to create a stage for each test.
Is there a way to do so?
The only way to create a stage I know is:
stage('foo') {
...
}
I've seen this example, but it does not use declarative syntax.
Use the scripted syntax, which allows more flexibility than the declarative syntax, even though declarative is more documented and recommended.
For example, stages can be created in a loop:
def tests = params.Tests.split(',')
for (int i = 0; i < tests.length; i++) {
    stage("Test ${tests[i]}") {
        sh '....'
    }
}
As JamesD suggested, you can create stages dynamically (but they will be sequential) like this:
def list

pipeline {
    agent none
    options { buildDiscarder(logRotator(daysToKeepStr: '7', numToKeepStr: '1')) }
    stages {
        stage('Create List') {
            agent { node 'nodename' }
            steps {
                script {
                    // you may create your list here, lets say reading from a file after checkout
                    list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
        stage('Dynamic Stages') {
            agent { node 'nodename' }
            steps {
                script {
                    for (int i = 0; i < list.size(); i++) {
                        stage(list[i]) {
                            echo "Element: $i"
                        }
                    }
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
    }
}
That will result in dynamic sequential stages.
If you don't want to use a for loop, and you want the generated stages to be executed in parallel, here is an answer.
def jobs = ["JobA", "JobB", "JobC"]

def parallelStagesMap = jobs.collectEntries {
    ["${it}" : generateStage(it)]
}

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
Note that all generated stages will be executed on a single node.
If you want the generated stages to be executed on different nodes, you can do this:
def agents = ['master', 'agent1', 'agent2'] // enter valid agent names in the array

def generateStage(nodeLabel) {
    return {
        stage("Runs on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running on ${nodeLabel}"
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : generateStage(it)]
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
You can of course add more than one parameter and use collectEntries with two parameters, as sketched below.
Remember that the return in the generateStage function is a must.
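As a hedged illustration of that last point (the names here are hypothetical, not from the answer), generateStage could take both a job name and a node label, and collectEntries can destructure a list of pairs into two closure parameters:

def combos = [['build', 'agent1'], ['test', 'agent2']]   // hypothetical job/agent pairs

def generateStage(job, nodeLabel) {
    return {
        stage("${job} on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running ${job} on ${nodeLabel}"
            }
        }
    }
}

def parallelStagesMap = combos.collectEntries { job, nodeLabel ->
    ["${job}-${nodeLabel}".toString(): generateStage(job, nodeLabel)]
}
// then, inside a script block: parallel parallelStagesMap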
@Jorge Machado: Because I cannot comment, I had to post this as an answer. I solved it recently; I hope it helps you.
Declarative pipeline:
A simple static example:
stage('Dynamic') {
    steps {
        script {
            stage('NewOne') {
                echo('new one echo')
            }
        }
    }
}
Dynamic real-life example:
// in a declarative pipeline
stage('Trigger Building') {
    when {
        environment(name: 'DO_BUILD_PACKAGES', value: 'true')
    }
    steps {
        executeModuleScripts('build') // local method, see at the end of this script
    }
}

// at the end of the file or in a shared library
void executeModuleScripts(String operation) {
    def allModules = ['module1', 'module2', 'module3', 'module4', 'module11']
    allModules.each { module ->
        String action = "${operation}:${module}"
        echo("---- ${action.toUpperCase()} ----")
        String command = "npm run ${action} -ddd"
        // here is the trick
        script {
            stage(module) {
                bat(command)
            }
        }
    }
}
You might want to take a look at this example - you can have a function return a closure which should be able to have a stage in it.
This code shows the concept, but doesn't have a stage in it.
def transformDeployBuildStep(OS) {
    return {
        node('master') {
            wrap([$class: 'TimestamperBuildWrapper']) {
                ...
            }
        } // node
    } // closure
} // transformDeployBuildStep

stage("Yum Deploy") {
    stepsForParallel = [:]
    for (int i = 0; i < TargetOSs.size(); i++) {
        def s = TargetOSs.get(i)
        def stepName = "CentOS ${s} Deployment"
        stepsForParallel[stepName] = transformDeployBuildStep(s)
    }
    stepsForParallel['failFast'] = false
    parallel stepsForParallel
} // stage
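For completeness, a small sketch (my addition, not the answerer's code) of the same closure with a stage inside it, so that each parallel branch shows up as its own stage:

def transformDeployBuildStep(OS) {
    return {
        stage("CentOS ${OS} Deployment") {
            node('master') {
                wrap([$class: 'TimestamperBuildWrapper']) {
                    echo "deploy steps for ${OS} go here"   // placeholder for the real deploy steps
                }
            }
        }
    } // closure
}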
Just an addition to what @np2807 and @Anton Yurchenko have already presented: you can create stages dynamically and run them in parallel by simply delaying the creation of the stage list (but keeping its declaration), e.g. like this:
def parallelStagesMap

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent { label 'master' }
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                    // you may create your list here, let's say reading from a file after checkout
                    // personally, I like to use scriptler scripts and load them as simply as:
                    // list = load '/var/lib/jenkins/scriptler/scripts/load-list-script.groovy'
                    parallelStagesMap = list.collectEntries {
                        ["${it}" : generateStage(it)]
                    }
                }
            }
        }
        stage('Run Stages in Parallel') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
That will result in dynamic parallel stages.
I use this to generate stages, each of which triggers a Jenkins job.
build_list is a list of Jenkins jobs that I want to trigger from my main Jenkins job, with a stage for each job that is triggered.
build_list = ['job1', 'job2', 'job3']
for (int i = 0; i < build_list.size(); i++) {
    stage(build_list[i]) {
        build job: build_list[i], propagate: false
    }
}
If you are using a Jenkinsfile, I achieved this by dynamically creating the stages, running them in parallel, and also getting the Jenkins UI to show a separate column for each. This assumes the parallel steps are independent of each other (otherwise don't use parallel), and you can nest them as deeply as you want, depending on how many for loops you nest to create the stages; see the sketch below. For more detail, see "Jenkinsfile Pipeline DSL: How to Show Multi-Columns in Jobs dashboard GUI - For all Dynamically created stages - When within PIPELINE section".
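A minimal sketch of that nesting idea (scripted, with made-up branch and phase names, not the linked answer's code): the outer loop builds one parallel branch per item, and the inner loop creates sequential stages inside each branch.

def branches = [:]
['linux', 'windows'].each { os ->                 // outer loop: one parallel branch per OS
    branches[os] = {
        node {
            ['build', 'test'].each { phase ->     // inner loop: sequential stages in this branch
                stage("${os}-${phase}") {
                    echo "Running ${phase} on ${os}"
                }
            }
        }
    }
}
parallel branches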
