Jenkins multithreading functions

I need to run two defined functions in parallel in a Jenkins pipeline.
The parallel keyword, as documented in Jenkins, is used with jobs and doesn't seem to work with function calls.
What I've tried is:
def first_func(){
    echo "first function"
}
def second_func(){
    echo "second function"
}
node {
    task = [:]
    function_lists = ['first_func()', 'second_func()']
    stage ('build') {
        for (job in function_lists) {
            task[job] = { '${job}' }
        }
        parallel task
    }
}
but it doesn't actually call the functions. Is there any way to do so in Jenkins?

Yes, this can be achieved in the following way:
def first_func(){
    echo "first function"
}
def second_func(){
    echo "second function"
}
node {
    def task = [:]
    stage ('build') {
        // Loop through the list of function names and call each one by name
        ['first_func', 'second_func'].each {
            def a = it
            task[a] = { "${a}"() }
        }
        parallel task
    }
}
Output: each function runs in its own parallel branch and prints its message.
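For reference, the original attempt fails because task[job] = { '${job}' } stores a closure that only evaluates a string literal (single quotes don't even interpolate), so nothing ever calls the functions. If you don't need to build the map from a list of names at all, a minimal sketch that maps the closures directly also works:

def first_func(){
    echo "first function"
}
def second_func(){
    echo "second function"
}
node {
    def task = [:]
    stage('build') {
        // Reference the functions directly instead of dispatching by name
        task['first']  = { first_func() }
        task['second'] = { second_func() }
        parallel task
    }
}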

Related

How to create stages dynamically / concatenate closures?

What I want to achieve is building a list of stages while avoiding when{}. I'm trying to run parallel pipelines.
Here is example code:
def stage_pull = {
    stage('pulling') {
        echo 'pulling'
    }
}
def stage_build = {
    stage('building') {
        echo 'building'
    }
}
def stage_deb = {
    stage('deb') {
        echo 'deb file'
    }
}
def transformIntoStages(stage1, stage2) {
    //return stage1 + stage2
    //return {stage1;stage2}
    return stage1 << stage2
}
def agent_list = ["agent1", "agent2"]
stepsForParallel = [:]
stepsForParallel['agent1'] = transformIntoStages(stage_pull, stage_build)
stepsForParallel['agent2'] = transformIntoStages(stage_pull, stage_deb)
pipeline{
    agent any
    options {
        timestamps()
    }
    stages{
        stage('BUILD'){
            steps{
                script{
                    parallel stepsForParallel
                }
            }
        }
    }
}
This is a simplified version. In the real project, the number of stages used will be different for each agent.
I also have a version with closures inside methods:
https://pastebin.com/gPJjPx59
But none of this works.
PS: I know matrix{}, and I use it often, but I don't want to use it in this particular case.
I think I managed to achieve the goal by using strings and the evaluate() function.
def stage_pull() {
    return """
        stage('pulling') {
            echo 'pulling'
        }
    """
}
def stage_build() {
    return """
        stage('building') {
            echo 'building'
        }
    """
}
def stage_deb() {
    return """
        stage('deb') {
            echo 'deb file'
        }
    """
}
def transformIntoStages(stage1, stage2) {
    echo "{" + stage1 + stage2 + "}"
    return { evaluate(stage1 + stage2) }
}
stepsForParallel = [:]
stepsForParallel['agent1'] = transformIntoStages(stage_pull(), stage_build())
stepsForParallel['agent2'] = transformIntoStages(stage_pull(), stage_deb())
stepsForParallel['agent3'] = transformIntoStages(stage_pull(), '')
pipeline{
    agent any
    options {
        timestamps()
    }
    stages{
        stage('BUILD'){
            steps{
                script{
                    parallel stepsForParallel
                }
            }
        }
    }
}
However, I'm afraid that with more complicated stages/functions/structures using different kinds of parentheses it will start to become a mess, and Blue Ocean can't display this properly. But in the logs with timestamps, and above all in the "Pipeline Steps" section, I can see that it works as it should.
So I'm still open to suggestions.
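For reference, the closure-based version can also work without evaluate() if the closures are actually invoked rather than concatenated: return a new closure that calls both arguments. A minimal, untested sketch (shown with a scripted node block for brevity; the same map can be passed to parallel inside a script{} step as above):

def stage_pull = {
    stage('pulling') {
        echo 'pulling'
    }
}
def stage_build = {
    stage('building') {
        echo 'building'
    }
}
// Compose by calling both closures in order inside a new closure,
// instead of concatenating them with << or evaluating strings
def transformIntoStages(stage1, stage2) {
    return {
        stage1()
        stage2()
    }
}
def stepsForParallel = [:]
stepsForParallel['agent1'] = transformIntoStages(stage_pull, stage_build)
node {
    parallel stepsForParallel
}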

How to build a combination of parallel and sequential stages in Jenkins pipeline with dynamic data

I am trying to build a Jenkins pipeline which has a combination of parallel and sequential stages. I am able to accomplish this with static data, but I fail to get it working when using dynamic data, i.e. when using a parameterized build and reading data from the build parameters.
The below snippet works fine:
pipeline {
    agent any
    stages {
        stage('Parallel Tests') {
            parallel {
                stage('Ordered Tests Set') {
                    stages {
                        stage('Building seq test 1') {
                            steps {
                                echo "build seq test 1"
                            }
                        }
                        stage('Building seq test 2') {
                            steps {
                                echo "build seq test 2"
                            }
                        }
                    }
                }
                stage('Building Parallel test 1') {
                    steps {
                        echo "Building Parallel test 1"
                    }
                }
                stage('Building Parallel test 2') {
                    steps {
                        echo "Building Parallel test 2"
                    }
                }
            }
        }
    }
}
This gives me the expected execution result (screenshot omitted).
Now I want to read the values from my build parameters and simply loop over the stages. This is what I have tried but could not get to work. This snippet is taken from another answer I found a few months back on SO, but I am unable to trace it now, otherwise I would have added the link:
def parallelStagesMap = params['Parallel Job Set'].split(',').collectEntries {
    ["${it}" : generateStage(it)]
}
def orderedStagesMap = params['Ordered Job Set'].split(',').collectEntries {
    ["${it}" : generateStage(it)]
}
def orderedMap() {
    def orderedStagesMapList = [:]
    orderedStagesMapList['Ordered Tests Set'] = {
        stage('Ordered Tests Set') {
            stages{
                orderedStagesMap
            }
        }
    }
    return orderedStagesMapList;
}
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}
pipeline {
    agent none
    stages {
        stage ("Parallel Stage to trigger Tests"){
            steps {
                script {
                    parallel orderedMap() + parallelStagesMap
                }
            }
        }
    }
}
Declarative and Scripted Pipeline syntax do not mix in a Pipeline; see Pipeline Syntax. Since you are dynamically creating a Pipeline definition based on the parameters, you should most likely switch entirely to Scripted syntax, unless your use case matches matrix.
Removing the Declarative syntax from your Pipeline definition would give something like the below. Note that I did not test it on a live Jenkins instance.
def parallelStagesMap = params['Parallel Job Set'].split(',').collectEntries {
    ["${it}" : generateStage(it)]
}
def orderedStagesMap = params['Ordered Job Set'].split(',').collectEntries {
    ["${it}" : generateStage(it)]
}
def orderedMap() {
    def orderedStagesMapList = [:]
    orderedStagesMapList['Ordered Tests Set'] = {
        stage('Ordered Tests Set') {
            orderedStagesMap.each { key, value ->
                value.call()
            }
        }
    }
    return orderedStagesMapList;
}
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}
stage("Parallel Stage to trigger Tests") {
    parallel orderedMap() + parallelStagesMap
}

Jenkins parallel script in loop using wrong variables

I'm trying to build a dynamic group of steps to run in parallel. The following example is what I came up with (and I found examples at https://devops.stackexchange.com/questions/3073/how-to-properly-achieve-dynamic-parallel-action-with-a-declarative-pipeline). But I'm having trouble getting it to use the expected variables; the result always seems to use the values from the last iteration of the loop.
In the following example the echo output is always bdir2 for both tests:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        rolePath = new File(f).getParentFile()
                        roleName = rolePath.toString().split('/')[1]
                        tests[roleName] = {
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
I'm expecting one of the tests to output adir2 and the other to output bdir2. What am I missing here?
Just move the tests[...] assignment a little higher, so the computation happens inside each branch's closure, and it will work:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        tests[f] = {
                            rolePath = new File(f).getParentFile()
                            roleName = rolePath.toString().split('/')[1]
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
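If you prefer to keep the computation outside the per-branch work, another option (an untested sketch) is to declare the per-iteration values with def, so each closure captures its own local copies instead of sharing script-level variables:

script {
    def tests = [:]
    def files = ['adir1/adir2/adir3', 'bdir1/bdir2/bdir3']
    files.each { f ->
        // 'def' makes these per-iteration locals, so each closure captures its own values
        def rolePath = new File(f).getParentFile()
        def roleName = rolePath.toString().split('/')[1]
        tests[roleName] = {
            echo roleName
        }
    }
    parallel tests
}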

Dynamic number of parallel steps in declarative pipeline

I'm trying to create a declarative pipeline which runs a number of jobs (configurable via a parameter) in parallel, but I'm having trouble with the parallel part.
Basically, for some reason the below pipeline generates the error
Nothing to execute within stage "Testing" @ line .., column ..
and I cannot figure out why, or how to solve it.
import groovy.transform.Field

@Field def mayFinish = false

def getJob() {
    return {
        lock("finiteResource") {
            waitUntil {
                script {
                    mayFinish
                }
            }
        }
    }
}

def getFinalJob() {
    return {
        waitUntil {
            script {
                try {
                    echo "Start Job"
                    sleep 3 // Replace with something that might fail.
                    echo "Finished running"
                    mayFinish = true
                    true
                } catch (Exception e) {
                    echo e.toString()
                    echo "Failed :("
                }
            }
        }
    }
}

def getJobs(def NUM_JOBS) {
    def jobs = [:]
    for (int i = 0; i < (NUM_JOBS as Integer); i++) {
        jobs["job${i}"] = getJob()
    }
    jobs["finalJob"] = getFinalJob()
    return jobs
}

pipeline {
    agent any
    options {
        buildDiscarder(logRotator(numToKeepStr:'5'))
    }
    parameters {
        string(
            name: "NUM_JOBS",
            description: "Set how many jobs to run in parallel"
        )
    }
    stages {
        stage('Setup') {
            steps {
                echo "Setting it up..."
            }
        }
        stage('Testing') {
            steps {
                parallel getJobs(params.NUM_JOBS)
            }
        }
    }
}
I've seen plenty of examples doing this in the old pipeline, but not declarative.
Anyone know what I'm doing wrong?
At the moment, it doesn't seem possible to dynamically provide the parallel branches when using a Declarative Pipeline.
Even if you have a stage prior where, in a script block, you call getJobs() and add it to the binding, the same error message is thrown.
In this case you'd have to fall back to using a Scripted Pipeline.
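A scripted fallback might look roughly like the sketch below (untested), reusing getJob()/getFinalJob()/getJobs() from the question and replacing the declarative stages:

// In Scripted Pipeline, build parameters are declared via properties()
properties([
    parameters([
        string(name: 'NUM_JOBS', description: 'Set how many jobs to run in parallel')
    ])
])

node {
    stage('Setup') {
        echo "Setting it up..."
    }
    stage('Testing') {
        // getJobs() builds the map of parallel branches exactly as in the question
        parallel getJobs(params.NUM_JOBS)
    }
}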

Can I create dynamically stages in a Jenkins pipeline?

I need to launch a dynamic set of tests in a declarative pipeline.
For better visualization purposes, I'd like to create a stage for each test.
Is there a way to do so?
The only way to create a stage I know is:
stage('foo') {
...
}
I've seen this example, but it does not use the declarative syntax.
Use the scripted syntax, which allows more flexibility than the declarative syntax, even though declarative is better documented and recommended.
For example, stages can be created in a loop:
def tests = params.Tests.split(',')
for (int i = 0; i < tests.length; i++) {
    stage("Test ${tests[i]}") {
        sh '....'
    }
}
As JamesD suggested, you may create stages dynamically (but they will be sequential) like this:
def list
pipeline {
    agent none
    options {buildDiscarder(logRotator(daysToKeepStr: '7', numToKeepStr: '1'))}
    stages {
        stage('Create List') {
            agent {node 'nodename'}
            steps {
                script {
                    // you may create your list here, lets say reading from a file after checkout
                    list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
        stage('Dynamic Stages') {
            agent {node 'nodename'}
            steps {
                script {
                    for(int i=0; i < list.size(); i++) {
                        stage(list[i]){
                            echo "Element: $i"
                        }
                    }
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
    }
}
That will result in dynamic sequential stages (screenshot omitted).
If you don't want to use a for loop, and want the generated stages to be executed in parallel, then here is an answer.
def jobs = ["JobA", "JobB", "JobC"]

def parallelStagesMap = jobs.collectEntries {
    ["${it}" : generateStage(it)]
}

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
Note that all generated stages will be executed on a single node.
If you want the generated stages to be executed on different nodes:
def agents = ['master', 'agent1', 'agent2']
// enter valid agent name in array.

def generateStage(nodeLabel) {
    return {
        stage("Runs on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running on ${nodeLabel}"
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : generateStage(it)]
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
You can of course add more than one parameter, and use collectEntries for two parameters.
Please remember that the return in the generateStage function is a must.
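For instance, here is a sketch (my own illustration, not from the answer above) that iterates over a hypothetical map of agent label to job name and passes both values to the generator:

def agentJobs = [agent1: 'build', agent2: 'deploy'] // hypothetical label -> job pairs

def generateStage(nodeLabel, jobName) {
    return {
        stage("${jobName} on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running ${jobName} on ${nodeLabel}"
            }
        }
    }
}

def parallelStagesMap = agentJobs.collectEntries { label, job ->
    ["${label}" : generateStage(label, job)]
}

The resulting parallelStagesMap plugs into the same pipeline block shown above.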
@Jorge Machado: because I cannot comment, I had to post this as an answer. I've solved it recently; I hope it'll help you.
Declarative pipeline:
A simple static example:
stage('Dynamic') {
    steps {
        script {
            stage('NewOne') {
                echo('new one echo')
            }
        }
    }
}
Dynamic real-life example:
// in a declarative pipeline
stage('Trigger Building') {
    when {
        environment(name: 'DO_BUILD_PACKAGES', value: 'true')
    }
    steps {
        executeModuleScripts('build') // local method, see at the end of this script
    }
}

// at the end of the file or in a shared library
void executeModuleScripts(String operation) {
    def allModules = ['module1', 'module2', 'module3', 'module4', 'module11']
    allModules.each { module ->
        String action = "${operation}:${module}"
        echo("---- ${action.toUpperCase()} ----")
        String command = "npm run ${action} -ddd"
        // here is the trick
        script {
            stage(module) {
                bat(command)
            }
        }
    }
}
You might want to take a look at this example: you can have a function return a closure, which should be able to have a stage in it.
This code shows the concept, but doesn't have a stage in it (a variant with a stage inside the closure is sketched after the snippet).
def transformDeployBuildStep(OS) {
    return {
        node ('master') {
            wrap([$class: 'TimestamperBuildWrapper']) {
                ...
            } // timestamper
        } // node
    } // closure
} // transformDeployBuildStep

stage("Yum Deploy") {
    stepsForParallel = [:]
    for (int i = 0; i < TargetOSs.size(); i++) {
        def s = TargetOSs.get(i)
        def stepName = "CentOS ${s} Deployment"
        stepsForParallel[stepName] = transformDeployBuildStep(s)
    }
    stepsForParallel['failFast'] = false
    parallel stepsForParallel
} // stage
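For completeness, a variant of the same idea with the stage inside the returned closure might look like this (an untested sketch; the OS list and step contents are illustrative):

def transformDeployBuildStep(OS) {
    return {
        // The stage now lives inside the closure, so each parallel branch gets its own stage
        stage("CentOS ${OS} Deployment") {
            node('master') {
                wrap([$class: 'TimestamperBuildWrapper']) {
                    echo "Deploying for CentOS ${OS}"
                }
            }
        }
    }
}

def stepsForParallel = [failFast: false]
['7', '8'].each { os ->
    stepsForParallel["CentOS ${os} Deployment"] = transformDeployBuildStep(os)
}
parallel stepsForParallel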
Just an addition to what @np2807 and @Anton Yurchenko have already presented: you can create stages dynamically and run them in parallel by simply delaying creation of the list of stages (but keeping its declaration), e.g. like this:
def parallelStagesMap

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent { label 'master' }
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                    // you may create your list here, let's say reading from a file after checkout
                    // personally, I like to use scriptler scripts and load them as simply as:
                    // list = load '/var/lib/jenkins/scriptler/scripts/load-list-script.groovy'
                    parallelStagesMap = list.collectEntries {
                        ["${it}" : generateStage(it)]
                    }
                }
            }
        }
        stage('Run Stages in Parallel') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
That will result in dynamic parallel stages (screenshot omitted).
I use this to generate my stages, each of which contains a Jenkins job.
build_list is a list of Jenkins jobs that I want to trigger from my main Jenkins job, with a stage for each job that is triggered.
build_list = ['job1', 'job2', 'job3']
for(int i=0; i < build_list.size(); i++) {
    stage(build_list[i]){
        build job: build_list[i], propagate: false
    }
}
If you are using a Jenkinsfile, then I achieved this by dynamically creating the stages, running them in parallel, and also getting the Jenkinsfile UI to show separate columns. This assumes the parallel steps are independent of each other (otherwise don't use parallel), and you can nest them as deep as you want (depending on the number of for loops you nest for creating stages).
Jenkinsfile Pipeline DSL: How to Show Multi-Columns in Jobs dashboard GUI - For all Dynamically created stages - When within PIPELINE section; see here for more.
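As an illustration of the nesting idea (my own sketch, not the linked answer; group and test names are hypothetical), each parallel branch can itself build and run a second-level parallel map:

def innerStage(group, test) {
    return {
        stage("${group} / ${test}") {
            echo "Running ${test} in ${group}"
        }
    }
}

def outerBranch(group, tests) {
    return {
        stage(group) {
            // Build and run a second-level parallel map inside this branch
            parallel tests.collectEntries { t -> ["${t}": innerStage(group, t)] }
        }
    }
}

// In practice the groups would come from parameters or a file
def groups = [unit: ['test-1', 'test-2'], integration: ['test-3']]

node {
    parallel groups.collectEntries { g, tests -> ["${g}": outerBranch(g, tests)] }
}

How the nested branches are visualized will depend on the UI (classic stage view vs. Blue Ocean).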
