Is it possible to define entire stages in a custom step? - jenkins

Currently I define shared pipelines in my library and pass them params from the Jenkinsfiles. I also modularize a lot of code into custom steps.
This works great and does everything I want:
// Jenkinsfile in repo just uses the shared pipeline
@Library('shared-lib') _
mySharedPipeline ([
myParam: "sdlkfjlskdjfsd"
])
// The shared pipeline vars/mySharedPipeline.groovy
def call(Map config) {
someVar = config.myParam
someOtherVar = params.SomeUIParam
pipeline {
agent none
stages {
stage ('one') {
steps {
script {
myCustomStep derp: someVar
}
}
}
stage ('two') {
steps {
script {
myCustomStep derp: someVar
}
}
}
stage ('three') {
steps {
script {
myCustomStep derp: someVar
}
}
}
}
}
}
// Custom steps are used by the shared pipeline vars/myCustomStep.groovy
def call(Map config) {
derp = config.derp
// do stuff
}
But what I'd like to do now is put entire stages into custom steps too. Is this possible? I can't figure out how to pass a stage into stages {} while also defining stages in the usual way.
For example I want this (does not work):
// Custom steps vars/myCustomStep.groovy
def call(Map config) {
derp = config.derp
// do stuff
}
// Define a stage vars/mySharedStage.groovy
stage ('one') {
steps {
script {
// the shared stage will also use custom steps
myCustomStep derp: someVar
}
}
}
// The shared pipeline vars/mySharedPipeline.groovy
def call(Map config) {
someVar = config.myParam
someOtherVar = params.SomeUIParam
pipeline {
agent none
stages {
// I want to be able to just plug in a stage wherever I want, like this
mySharedStage
stage ('one') {
steps {
script {
myCustomStep derp: someVar
}
}
}
stage ('two') {
steps {
script {
myCustomStep derp: someVar
}
}
}
stage ('three') {
steps {
script {
myCustomStep derp: someVar
}
}
}
}
}
}
I get the error: Expected a stage
I even tried mySharedStage.call() and I still get this error.
I added the groovy label, but FYI the Jenkins runtime is very restrictive, so many Groovy-isms will not work here.
Edit
Another attempt was to nest a stage in a custom step and reference it inside the pipeline block; this also throws Expected a stage:
// vars/mySharedStage.groovy
def call(Map config) {
stage ('My Shared Stage') {
steps {
echo "derps"
}
}
}
// inside vars/mySharedPipeline.groovy
stages {
stage ('one') {
steps {
script {
myCustomStep derp: someVar
}
}
}
mySharedStage
stage ('two') {
steps {
script {
myCustomStep derp: someVar
}
}
}
}
Edit
Just as an experiment I instantiated a Stage object in raw Groovy code. I could create a Stage object without issue, but adding it in between other stage blocks raised the same error. I want to do both interchangeably: use the stage closures AND plug in a shared stage wherever I want.
// model classes used below live in the pipeline-model-definition plugin
import org.jenkinsci.plugins.pipeline.modeldefinition.model.*
def call(Map config) {
String stageName = "my stage"
StepsBlock stepsBlock = new StepsBlock()
Agent myAgent = new Agent({})
PostStage myPost = new PostStage(["myPostStage": new StepsBlock()])
StageConditionals myWhen = new StageConditionals({})
Tools tools = new Tools(["my tools": {}])
Environment myEnvironment = new Environment(new EnvironmentResolver(), new EnvironmentResolver())
Boolean failFast = true
StageOptions myOptions = new StageOptions(["xxx":{}], ["yyy":{}])
StageInput input = new StageInput("one", "two", "three", "four", "five", [])
Stages myStages = new Stages([])
Parallel myParallel = new Parallel([])
Environment anotherEnvironment = new Environment(new EnvironmentResolver(), new EnvironmentResolver())
// https://javadoc.jenkins.io/plugin/pipeline-model-definition/org/jenkinsci/plugins/pipeline/modeldefinition/model/Stage.html
// I can at least instantiate an object
Stage myStage = new Stage (
stageName,
stepsBlock,
myAgent,
myPost,
myWhen,
tools,
myEnvironment,
failFast,
myOptions,
input,
myStages,
myParallel,
anotherEnvironment
)
pipeline {
agent none
stages {
stage ('one') {
steps {
script {
myCustomStep derp: someVar
}
}
}
// throws the same "expected stage" error
myStage
stage ('two') {
steps {
script {
myCustomStep derp: someVar
}
}
}
}
}
}

Related

How to run all stages in parallel in jenkinsfile

I want to execute all the stages in parallel, with the loop based on user input.
This gives an error because script is not allowed under stages.
How should I achieve this?
pipeline {
agent {
node {
label 'ec2'
}
}
stages{
script{
int[] array = params.elements;
for(int i in array) {
parallel{
stage('Preparation') {
echo 'Preparation'
println(i);
}
stage('Build') {
echo 'Build'
println(i);
}
}
}
}
}
}
If you are using declarative pipelines you have two options. The first is to use static parallel stages, which are an integral part of the declarative syntax but do not allow dynamic or runtime modifications.
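For reference, the static form might look roughly like this (a minimal sketch; the stage names and the ec2 label are only illustrative):
pipeline {
  agent { node { label 'ec2' } }
  stages {
    stage('Static Parallel Execution') {
      // these branches are fixed at pipeline-definition time
      parallel {
        stage('Preparation') {
          steps {
            echo 'Preparation'
          }
        }
        stage('Build') {
          steps {
            echo 'Build'
          }
        }
      }
    }
  }
}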
The second option (which is probably what you attempted) is to use the scripted parallel function:
parallel firstBranch: {
// do something
}, secondBranch: {
// do something else
},
failFast: true|false
When using it inside a declarative pipeline it should be used inside a script block, like you did, but the basic declarative structure must still be kept: pipeline -> stages -> stage -> steps -> script. In addition, the scripted parallel function receives a specifically formatted map, like the example above.
In your case it can look something like:
pipeline {
agent {
node {
label 'ec2'
}
}
stages {
stage('Parallel Execution') {
steps {
script {
parallel params.elements.collectEntries {
// the key of each entry is the parallel execution branch name
// and the value of each entry is the code to execute
["Iteration for ${it}" : {
stage('Preparation') {
echo 'Preparation'
println(it);
}
stage('Build') {
echo 'Build'
println(it);
}
}]
}
}
}
}
}
}
Or if you want to use the for loop:
pipeline {
agent {
node {
label 'ec2'
}
}
stages {
stage('Parallel Execution') {
steps {
script {
def executions = [:]
for (int i in params.elements) {
// copy the loop variable so each branch closure captures its own value
def current = i
executions["Iteration for ${current}"] = {
stage('Preparation') {
echo 'Preparation'
println(current);
}
stage('Build') {
echo 'Build'
println(current);
}
}
}
parallel executions
}
}
}
}
}
Other useful examples for the parallel function can be found here
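For instance, with the map form, failFast can simply be added as one more entry in the same map that holds the branches (a minimal sketch with made-up branch names):
script {
  // each named entry is a branch; failFast aborts the others as soon as one fails
  def branches = [
    'Branch A': { echo 'doing A' },
    'Branch B': { echo 'doing B' },
    failFast: true
  ]
  parallel branches
}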

Jenkins Parallel Build reads in an empty map, but it held data in a previous stage

Total noobie trying to make a parallel build more dynamic.
Using this declarative script https://stackoverflow.com/a/48421660/14335065,
instead of reading in a prepopulated map def jobs = ["JobA", "JobB", "JobC"] (which works perfectly),
I am trying to read from a global map variable JOBS = [], which I populate in a stage using JOBS.add("JobAAA") syntax.
Printing out JOBS in a pipeline stage shows there are contents within,
JOBS map is [JobAAA, JobBBB, JobCCC]
but when I use it to generate a parallel build it seems to be empty, and I am getting the error message
No branches to run
I know I must be mixing my understanding up somewhere, but can anyone please point me in the right direction?
Here is the code I am fighting with
def jobs = ["JobA", "JobB", "JobC"]
JOBS_MAP = []
def parallelStagesMap = jobs.collectEntries() {
["${it}" : generateStage(it)]
}
def parallelStagesMapJOBS = JOBS_MAP.collectEntries(){
["${it}" : generateStage(it)]
}
def generateStage(job) {
return {
stage("Build: ${job}") {
echo "This is ${job}."
}
}
}
pipeline {
agent any
stages {
stage('populate JOBS map') {
steps {
script {
JOBS_MAP.add("JobAAA")
JOBS_MAP.add("JobBBB")
JOBS_MAP.add("JobCCC")
}
}
}
stage('print out JOBS map'){
steps {
echo "JOBS_MAP map is ${JOBS_MAP}"
}
}
stage('parallel job stage') {
steps {
script {
parallel parallelStagesMap
}
}
}
stage('parallel JOBS stage') {
steps {
script {
parallel parallelStagesMapJOBS
}
}
}
}
}
Try this. Your parallelStagesMapJOBS is built at the top of the Jenkinsfile, before any stage has run, so it captures JOBS_MAP while it is still empty; building the map inside the stage's script block defers that until after JOBS_MAP has been populated:
def jobs = ["JobA", "JobB", "JobC"]
JOBS_MAP = []
def generateStage(job) {
return {
stage("Build: ${job}") {
echo "This is ${job}."
}
}
}
pipeline {
agent any
stages {
stage('populate JOBS map') {
steps {
script {
JOBS_MAP.add("JobAAA")
JOBS_MAP.add("JobBBB")
JOBS_MAP.add("JobCCC")
}
}
}
stage('print out JOBS map'){
steps {
echo "JOBS_MAP map is ${JOBS_MAP}"
}
}
stage('parallel job stage') {
steps {
script {
def parallelStagesMap = jobs.collectEntries() {
["${it}" : generateStage(it)]
}
parallel parallelStagesMap
}
}
}
stage('parallel JOBS stage') {
steps {
script {
def parallelStagesMapJOBS = JOBS_MAP.collectEntries(){
["${it}" : generateStage(it)]
}
parallel parallelStagesMapJOBS
}
}
}
}
}

How to build a combination of parallel and sequential stages in Jenkins pipeline with dynamic data

I am trying to build a Jenkins pipeline which has a combination of parallel and sequential stages. I am able to accomplish this with static data, but fail to get it working with dynamic data, i.e. when using a parameterized build and reading data from the build parameters.
The below snippet works fine:
pipeline {
agent any
stages {
stage('Parallel Tests') {
parallel {
stage('Ordered Tests Set') {
stages {
stage('Building seq test 1') {
steps {
echo "build seq test 1"
}
}
stage('Building seq test 2') {
steps {
echo "build seq test 2"
}
}
}
}
stage('Building Parallel test 1') {
steps {
echo "Building Parallel test 1"
}
}
stage('Building Parallel test 2') {
steps {
echo "Building Parallel test 2"
}
}
}
}
}
}
This gives me the following execution result.
Now I want to read the values from my build parameters and just loop the stages. This is what I have tried but could not get working. This snippet is taken from another answer I found a few months back on SO but am unable to trace now, else I would have added the link:
def parallelStagesMap = params['Parallel Job Set'].split(',').collectEntries {
["${it}" : generateStage(it)]
}
def orderedStagesMap = params['Ordered Job Set'].split(',').collectEntries {
["${it}" : generateStage(it)]
}
def orderedMap (){
def orderedStagesMapList= [:]
orderedStagesMapList['Ordered Tests Set']= {
stage('Ordered Tests Set') {
stages{
orderedStagesMap
}
}
}
return orderedStagesMapList;
}
def generateStage(job) {
return {
stage("stage: ${job}") {
echo "This is ${job}."
}
}
}
pipeline {
agent none
stages {
stage ("Parallel Stage to trigger Tests"){
steps {
script {
parallel orderedMap()+parallelStagesMap
}
}
}
}
}
Declarative and Scripted Pipeline syntax do not mix in Pipeline; see Pipeline Syntax. Since you are dynamically creating a Pipeline definition based on the parameters, you should most likely go completely to Scripted syntax, unless your use-case matches matrix.
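For reference, a minimal declarative matrix sketch might look like the following (the axis name and values are placeholders, not taken from the question):
pipeline {
  agent any
  stages {
    stage('Tests') {
      matrix {
        axes {
          axis {
            name 'TEST_SET'
            values 'set-a', 'set-b', 'set-c'
          }
        }
        stages {
          stage('Run') {
            steps {
              // each matrix cell sees its axis value as an environment variable
              echo "Running tests for ${TEST_SET}"
            }
          }
        }
      }
    }
  }
}
Note that matrix axes are declared statically, which is why it only helps if the set of combinations is known up front.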
Removing the Declarative syntax from your pipeline definition would give something like the following. Note that I did not test it on a live Jenkins instance.
def parallelStagesMap = params['Parallel Job Set'].split(',').collectEntries {
["${it}" : generateStage(it)]
}
def orderedStagesMap = params['Ordered Job Set'].split(',').collectEntries {
["${it}" : generateStage(it)]
}
def orderedMap (){
def orderedStagesMapList= [:]
orderedStagesMapList['Ordered Tests Set']= {
stage('Ordered Tests Set') {
orderedStagesMap.each { key, value ->
value.call()
}
}
}
return orderedStagesMapList;
}
def generateStage(job) {
return {
stage("stage: ${job}") {
echo "This is ${job}."
}
}
}
stage("Parallel Stage to trigger Tests") {
parallel orderedMap()+parallelStagesMap
}

Issue porting Jenkinsfile scripted to declarative withEnv{} => environment{}

I have an issue porting a scripted pipeline to declarative. In scripted I used to have:
//Scripted
def myEnv = [:]
stage ('Prepare my env') { [...] myEnv = ... }
stage ('Fancy stuff') {
node() {
withEnv(myEnv) {
// here use what is defined in myEnv
}
}
}
stage ('Fancy stuff2') {
node() {
withEnv(myEnv) {
// here use what is defined in myEnv
}
}
}
and now in declarative I would like to have
//Declarative
def myEnv = [:]
pipeline {
agent none
stage('Prepare my env') {
steps {
script {
[...]
myEnv = ...
}
}
}
stages {
environment { myEnv }
stage('Fancy stuff') {
[...]
}
stage('Fancy stuff2') {
[...]
}
} }
When I try to run this, it fails with:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: WorkflowScript: xx: "myEnv" is not a valid environment
expression. Use "key = value" pairs with valid Java/shell keys.
Fair enough.
What should I do to be able to use the declarative environment { } block and avoid using withEnv(myEnv) in every subsequent stage?
It seems that the part you are missing is the usage of the environment clause.
Instead of
environment { myEnv }
It should be
environment { myEnvVal = myEnv }
Just as the error message mentions, this should be a key = value pair.
Your issue comes from the type of your variable myEnv. You define it as a map when you do def myEnv = [:].
So it works with withEnv, which takes a map as a parameter, but it does not work with environment {...}, which takes only "key = value" statements.
The solution depends on how you add the environment variables contained in myEnv.
The simplest way is using the environment directive, listing all the key/value pairs contained in your former myEnv variable:
pipeline{
agent none
environment {
test1 = 'test-1'
test2 = 'test-2'
}
stages{
stage('Fancy stuff'){
steps{
echo "${test1}"
}
}
stage('Fancy stuff2'){
steps{
echo "${test2}"
}
}
}
}
But you can also do it the scripted way:
pipeline{
agent none
stages{
stage('Prepare my env') {
steps {
script {
def test = []
for (int i = 1; i < 3; ++i) {
test[i] = 'test-' + i.toString()
}
test1 = test[1]
test2 = test[2]
}
}
}
stage('Fancy stuff'){
steps{
echo "${test1}"
}
}
stage('Fancy stuff2'){
steps{
echo "${test2}"
}
}
}
}
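If myEnv really has to be built as a map at runtime, another option (a sketch under that assumption, not part of the original answer) is to copy its entries onto env inside a script block; values set on env are visible to all later stages:
def myEnv = [:]
pipeline {
  agent any
  stages {
    stage('Prepare my env') {
      steps {
        script {
          // hypothetical values; myEnv would normally be computed here
          myEnv = [test1: 'test-1', test2: 'test-2']
          // copy every entry onto env so later stages can read it
          myEnv.each { k, v -> env."${k}" = v }
        }
      }
    }
    stage('Fancy stuff') {
      steps {
        echo "${env.test1}"
      }
    }
    stage('Fancy stuff2') {
      steps {
        echo "${env.test2}"
      }
    }
  }
}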

Can I create stages dynamically in a Jenkins pipeline?

I need to launch a dynamic set of tests in a declarative pipeline.
For better visualization purposes, I'd like to create a stage for each test.
Is there a way to do so?
The only way to create a stage I know is:
stage('foo') {
...
}
I've seen this example, but it does not use declarative syntax.
Use the scripted syntax, which allows more flexibility than the declarative syntax, even though declarative is better documented and recommended.
For example, stages can be created in a loop:
def tests = params.Tests.split(',')
for (int i = 0; i < tests.length; i++) {
stage("Test ${tests[i]}") {
sh '....'
}
}
As JamesD suggested, you may create stages dynamically (but they will be sequential) like this:
def list
pipeline {
agent none
options {buildDiscarder(logRotator(daysToKeepStr: '7', numToKeepStr: '1'))}
stages {
stage('Create List') {
agent {node 'nodename'}
steps {
script {
// you may create your list here, lets say reading from a file after checkout
list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
}
}
post {
cleanup {
cleanWs()
}
}
}
stage('Dynamic Stages') {
agent {node 'nodename'}
steps {
script {
for(int i=0; i < list.size(); i++) {
stage(list[i]){
echo "Element: $i"
}
}
}
}
post {
cleanup {
cleanWs()
}
}
}
}
}
That will result in dynamic sequential stages.
If you don't want to use a for loop, and want the generated stages to be executed in parallel, then here is an answer.
def jobs = ["JobA", "JobB", "JobC"]
def parallelStagesMap = jobs.collectEntries {
["${it}" : generateStage(it)]
}
def generateStage(job) {
return {
stage("stage: ${job}") {
echo "This is ${job}."
}
}
}
pipeline {
agent none
stages {
stage('non-parallel stage') {
steps {
echo 'This stage will be executed first.'
}
}
stage('parallel stage') {
steps {
script {
parallel parallelStagesMap
}
}
}
}
}
Note that all generated stages will be executed on a single node.
If you want the generated stages to be executed on different nodes:
def agents = ['master', 'agent1', 'agent2']
// enter valid agent name in array.
def generateStage(nodeLabel) {
return {
stage("Runs on ${nodeLabel}") {
node(nodeLabel) {
echo "Running on ${nodeLabel}"
}
}
}
}
def parallelStagesMap = agents.collectEntries {
["${it}" : generateStage(it)]
}
pipeline {
agent none
stages {
stage('non-parallel stage') {
steps {
echo 'This stage will be executed first.'
}
}
stage('parallel stage') {
steps {
script {
parallel parallelStagesMap
}
}
}
}
}
You can of course add more than one parameter and use collectEntries with two parameters; a sketch of that follows below.
Please remember that the return in the generateStage function is a must.
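For example, a two-parameter variant might look like this (a sketch; the branch-name-to-label pairs are made up):
// hypothetical map of branch name -> node label
def builds = ['build-a': 'agent1', 'build-b': 'agent2']
def generateStage(name, nodeLabel) {
  return {
    stage("Runs ${name} on ${nodeLabel}") {
      node(nodeLabel) {
        echo "Running ${name} on ${nodeLabel}"
      }
    }
  }
}
def parallelStagesMap = builds.collectEntries { name, label ->
  ["${name}" : generateStage(name, label)]
}
The resulting parallelStagesMap is then passed to parallel inside a script block exactly as in the pipelines above.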
@Jorge Machado: Because I cannot comment I had to post it as an answer. I've solved it recently; I hope it helps you.
Declarative pipeline:
A simple static example:
stage('Dynamic') {
steps {
script {
stage('NewOne') {
echo('new one echo')
}
}
}
}
Dynamic real-life example:
// in a declarative pipeline
stage('Trigger Building') {
when {
environment(name: 'DO_BUILD_PACKAGES', value: 'true')
}
steps {
executeModuleScripts('build') // local method, see at the end of this script
}
}
// at the end of the file or in a shared library
void executeModuleScripts(String operation) {
def allModules = ['module1', 'module2', 'module3', 'module4', 'module11']
allModules.each { module ->
String action = "${operation}:${module}"
echo("---- ${action.toUpperCase()} ----")
String command = "npm run ${action} -ddd"
// here is the trick
script {
stage(module) {
bat(command)
}
}
}
}
You might want to take a look at this example: you can have a function return a closure, which should be able to have a stage in it.
This code shows the concept, but doesn't have a stage in it (a variant with a stage added is sketched after it).
def transformDeployBuildStep(OS) {
return {
node ('master') {
wrap([$class: 'TimestamperBuildWrapper']) {
...
} } // ts / node
} // closure
} // transformDeployBuildStep
stage("Yum Deploy") {
stepsForParallel = [:]
for (int i = 0; i < TargetOSs.size(); i++) {
def s = TargetOSs.get(i)
def stepName = "CentOS ${s} Deployment"
stepsForParallel[stepName] = transformDeployBuildStep(s)
}
stepsForParallel['failFast'] = false
parallel stepsForParallel
} // stage
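For illustration, a variant of transformDeployBuildStep with a stage inside the returned closure might look like this (a sketch, reusing the hypothetical OS parameter and wrapper from above):
def transformDeployBuildStep(OS) {
  return {
    stage("CentOS ${OS} Deployment") {
      node('master') {
        wrap([$class: 'TimestamperBuildWrapper']) {
          echo "Deploying for ${OS}"
        }
      }
    }
  }
}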
Just an addition to what @np2807 and @Anton Yurchenko have already presented: you can create stages dynamically and run them in parallel by simply delaying the creation of the list of stages (but keeping its declaration), e.g. like this:
def parallelStagesMap
def generateStage(job) {
return {
stage("stage: ${job}") {
echo "This is ${job}."
}
}
}
pipeline {
agent { label 'master' }
stages {
stage('Create List of Stages to run in Parallel') {
steps {
script {
def list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
// you may create your list here, lets say reading from a file after checkout
// personally, I like to use scriptler scripts and load the as simple as:
// list = load '/var/lib/jenkins/scriptler/scripts/load-list-script.groovy'
parallelStagesMap = list.collectEntries {
["${it}" : generateStage(it)]
}
}
}
}
stage('Run Stages in Parallel') {
steps {
script {
parallel parallelStagesMap
}
}
}
}
}
That will result in dynamic parallel stages.
I use this to generate my stages, each of which triggers a Jenkins job.
build_list is a list of Jenkins jobs that I want to trigger from my main Jenkins job, with a stage for each job that is triggered.
build_list = ['job1', 'job2', 'job3']
for(int i=0; i < build_list.size(); i++) {
stage(build_list[i]){
build job: build_list[i], propagate: false
}
}
If you are using a Jenkinsfile, I achieved this by dynamically creating the stages, running them in parallel, and also getting the Jenkins UI to show separate columns. This assumes the parallel steps are independent of each other (otherwise don't use parallel), and you can nest them as deep as you want (depending on the number of for loops you nest for creating stages).
See 'Jenkinsfile Pipeline DSL: How to Show Multi-Columns in Jobs dashboard GUI - For all Dynamically created stages - When within PIPELINE section' for more.
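As a rough sketch of that idea (scripted syntax, with made-up suite and test names), an outer loop can build the parallel branches while each branch runs its own sequence of stages:
def suites = ['suite-a', 'suite-b']       // hypothetical outer set
def testsPerSuite = ['test-1', 'test-2']  // hypothetical inner set
def branches = [:]
for (int i = 0; i < suites.size(); i++) {
  def suite = suites[i]  // copy so each closure captures its own value
  branches[suite] = {
    // runs sequentially inside this parallel branch
    for (int j = 0; j < testsPerSuite.size(); j++) {
      stage("${suite} / ${testsPerSuite[j]}") {
        echo "Running ${testsPerSuite[j]} of ${suite}"
      }
    }
  }
}
parallel branches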
