Can Jenkins pipelines have variable stages?

From my experience with Jenkins declarative-syntax pipelines, I'm aware that you can conditionally skip a stage with a when clause. E.g.:
run_one = true
run_two = false
run_three = true

pipeline {
    agent any
    stages {
        stage('one') {
            when {
                expression { run_one }
            }
            steps {
                echo 'one'
            }
        }
        stage('two') {
            when {
                expression { run_two }
            }
            steps {
                echo 'two'
            }
        }
        stage('three') {
            when {
                expression { run_three }
            }
            steps {
                echo 'three'
            }
        }
    }
}
...in the above code block, there are three stages, one, two, and three, each of whose execution is conditional on a boolean variable.
I.e. the paradigm is that there is a fixed superset of known stages, of which individual stages may be conditionally skipped.
Does Jenkins pipeline script support a model where there is no fixed superset of known stages, and stages can be "looked up" for conditional execution?
To phrase it as pseudocode, is something along the lines of the following possible:
my_list = list populated _somehow_, maybe reading a file, maybe Jenkins build params, etc.

pipeline {
    agent any
    stages {
        if (stage(my_list[0]) exists) {
            run(stage(my_list[0]))
        }
        if (stage(my_list[1]) exists) {
            run(stage(my_list[1]))
        }
        if (stage(my_list[2]) exists) {
            run(stage(my_list[2]))
        }
    }
}
?
I think another way to think about what I'm asking is: is there a way to dynamically build a pipeline from some dynamic assembly of stages?

For dynamic stages you could write either a fully scripted pipeline or use a declarative pipeline with a scripted section (e.g. by using the script {…} step or calling your own function). For an overview see Declarative versus Scripted Pipeline syntax and Pipeline syntax overview.
Declarative pipelines are better supported by Blue Ocean, so I would personally use that as a starting point. A disadvantage is that you need a fixed root stage, but I usually name it "start" or "init" so it doesn't look too awkward.
In scripted sections you can call stage as a function, so stages can be created completely dynamically.
pipeline {
    agent any
    stages {
        stage('start') {
            steps {
                createDynamicStages()
            }
        }
    }
}

void createDynamicStages() {
    // The stage list could be read from a file, build parameters, or whatever
    def stageList = ['foo', 'bar']
    for (stageName in stageList) {
        stage(stageName) {
            echo "Hello from stage $stageName"
        }
    }
}
The dynamically created stages then show up in the Blue Ocean stage view (screenshot omitted).
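If the stage list should come from build parameters rather than being hard-coded, the same pattern applies. The following is a minimal sketch, assuming a hypothetical string parameter named STAGE_LIST that holds a comma-separated list of stage names:

void createDynamicStages() {
    // STAGE_LIST is a hypothetical comma-separated string parameter, e.g. "foo,bar".
    def stageList = params.STAGE_LIST.split(',').collect { it.trim() }
    for (stageName in stageList) {
        stage(stageName) {
            echo "Hello from stage $stageName"
        }
    }
}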

Related

Declarative dynamic parallel stages

I figure I’m doing something unorthodox here, but I’d like to stick to declarative for convenience while dynamically generating parallel steps.
I found a way to do something like that, but mixing both paradigms, which doesn’t seem to work well with the BlueOcean UI (multiple stages inside each parallel branch do not show up properly).
The closest I got was with something like this:
def accounts() {
    return ["dynamic", "list"]
}

def parallelJobs() {
    jobs = []
    for (account in accounts()) {
        jobs << stage(account) {
            steps {
                echo "Step for $account"
            }
        }
    }
    return jobs
}

// This is inside a shared library, called by my Jenkinsfile, like what is described
// under "Defining Declarative Pipelines in Shared Libraries" in
// https://www.jenkins.io/blog/2017/09/25/declarative-1/
def call() {
    pipeline {
        stages {
            stage('Build all variations') {
                parallel parallelJobs()
            }
        }
    }
}
The problem is that Jenkins errors out like this:
Expected a block for parallel @ line X, column Y.
    parallel parallelJobs()
    ^
So, I was wondering if there is a way I could transform that list of stages, returned by parallelJobs(), into the block expected by Jenkins...
Yes, you can: you need to return a map of stage names to closures. The following is a working pipeline example.
pipeline {
    agent any
    stages {
        stage('Parallel') {
            steps {
                script {
                    parallel parallelJobs()
                }
            }
        }
    }
}

def accounts() {
    return ["dynamic", "list"]
}

def parallelJobs() {
    jobs = [:]
    for (account in accounts()) {
        jobs[account] = {
            stage(account) {
                echo "Step for $account"
            }
        }
    }
    return jobs
}
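The same map can also be built without an explicit loop. Here is a minimal sketch using Groovy's collectEntries, under the same assumptions as the example above:

def parallelJobs() {
    // Build a map from account name to a closure; parallel runs one branch per entry.
    return accounts().collectEntries { account ->
        [(account): {
            stage(account) {
                echo "Step for $account"
            }
        }]
    }
}

Each closure receives its own account argument from collectEntries, so there is no loop variable to capture.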

How do I get access to Jenkins when condition functions outside of when condition directives?

For example these:
when { changeRequest() }
when { buildingTag() }
when { changeset "**/*.js" }
I don't want this information only in when conditions; I want to access it in script {} blocks, and potentially even before the pipeline directive.
For example, I want to do things like this:
IS_A_PR = changeRequest()
IS_A_TAG = buildingTag()

pipeline {
    stages {
        stage('Checkout') {
            steps {
                script {
                    sh "someCommand ${IS_A_TAG}"
                }
            }
        }
    }
}
I don't want to capture this only in when conditions that turn stages on or off.
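For change requests and tags, multibranch pipelines expose the same information as environment variables: env.CHANGE_ID is set for change requests (pull requests) and env.TAG_NAME is set when building a tag. A minimal sketch along those lines, assuming a multibranch job (someCommand is the placeholder from the question):

// Both variables are null when the build is not a PR / tag build.
def IS_A_PR  = env.CHANGE_ID != null
def IS_A_TAG = env.TAG_NAME != null

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                script {
                    sh "someCommand ${IS_A_TAG}"
                }
            }
        }
    }
}

There is no single variable for changeset-style conditions; the changes can be inspected via currentBuild.changeSets inside a script block.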

Jenkins 'agent: none' lightweight executor equivalent with scripted pipeline

With Jenkins declarative syntax, it's possible to run parallel stages with no top-level agent. This ends up consuming two executors, since the top-level agent is marked 'none':
pipeline {
    agent none
    stages {
        stage('Run on parallel nodes') {
            parallel {
                stage('Do one thing') {
                    agent any
                    steps {
                        ...
                    }
                }
                stage('Do another thing') {
                    agent any
                    steps {
                        ...
                    }
                }
            }
        }
    }
}
With scripted pipelines, which seemingly require a top-level 'node' element, this is not possible. The following ends up consuming three executors, even though only two are doing real work:
node {
    stage('Run on parallel nodes') {
        parallel([
            'Do one thing': {
                node {
                    ...
                }
            },
            'Do another thing': {
                node {
                    ...
                }
            }
        ])
    }
}
Is a 'lightweight' top level executor possible with scripted pipelines?
Scripted pipelines don't require a top-level node allocation; that assumption is simply wrong, and the outer node can be left out.
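For example, the parallel block from the question can be written without the outer node, so only the two inner node blocks consume executors (the branch bodies here are placeholders):

// No wrapping node: the stage and parallel steps themselves don't occupy a heavyweight executor.
stage('Run on parallel nodes') {
    parallel([
        'Do one thing': {
            node {
                echo 'one thing'
            }
        },
        'Do another thing': {
            node {
                echo 'another thing'
            }
        }
    ])
}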

Jenkins Declarative Pipeline detect first run and fail when choice parameters present

I often write Declarative Pipeline jobs where I set up parameters such as "choice". The first time I run the job, it blindly executes using the first value in the list. I don't want that to happen. I want to detect this case and not continue the job if a user didn't select a real value.
I thought about using "SELECT_VALUE" as the first item in the list and then fail the job if that is the value. I know I can use a 'when' condition on each stage, but I'd rather not have to copy that expression to each stage in the pipeline. I'd like to fail the whole job with one check up front.
I don't like the UI for 'input' tasks because the controls are hidden until you hover over a running stage.
What is the best way to validate arguments with a Declarative Pipeline? Is there a better way to detect when the job is run for the first time and stop?
I've been trying to figure this out myself, and it looks like the pipeline runs with a fully populated parameters list.
So the answer for your choice parameter is to make the first item a placeholder value like "select" and have your code use when to check for it.
For example:
def paramset = true

pipeline {
    agent any
    parameters {
        choice(choices: ['select', 'test', 'proof', 'prod'], name: 'ENVI')
    }
    stages {
        stage('check') {
            when { expression { return params.ENVI == 'select' } }
            steps {
                script {
                    echo "Missing parameters"
                    paramset = false
                }
            }
        }
        stage('step 1') {
            when { expression { return paramset } }
            steps {
                script {
                    echo "Doing step 1"
                }
            }
        }
        stage('step 2') {
            when { expression { return paramset } }
            steps {
                script {
                    echo "Doing step 2"
                }
            }
        }
        stage('step 3') {
            when { expression { return paramset } }
            steps {
                script {
                    echo "Doing step 3"
                }
            }
        }
    }
}
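If the goal is to fail the whole job up front rather than skip each stage, an alternative is to call the error step in a single validation stage. A minimal sketch, reusing the same ENVI parameter:

pipeline {
    agent any
    parameters {
        choice(choices: ['select', 'test', 'proof', 'prod'], name: 'ENVI')
    }
    stages {
        stage('Validate parameters') {
            steps {
                script {
                    if (params.ENVI == 'select') {
                        // Aborts the build immediately; later stages never run.
                        error 'No environment selected; re-run the job with a real value.'
                    }
                }
            }
        }
        stage('step 1') {
            steps {
                echo 'Doing step 1'
            }
        }
    }
}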

Can I build stages with a function in a Jenkinsfile?

I'd like to use a function to build some of the stages of my Jenkinsfile. This is going to be a build with a number of repetitive stages/steps, and I'd rather not write everything out manually.
I was wondering if it's possible to do something like this:
def _make_stage() {
    stage("xx") {
        step("A") {
            echo "A"
        }
        step("B") {
            echo "B"
        }
    }
}

def _make_stages() {
    stages {
        _make_stage()
    }
}

// pipeline starts here!
pipeline {
    agent any
    _make_stages()
}
Unfortunately, Jenkins doesn't like this; when I run it, I get the error:
WorkflowScript: 24: Undefined section "_make_stages" @ line 24, column 5.
    _make_stages()
    ^
WorkflowScript: 22: Missing required section "stages" @ line 22, column 1.
pipeline {
^
So what's going wrong here? The function _make_stages() really looks like it returns whatever the stages object returns. Why does it matter whether I put that in a function call or just inline it into the pipeline definition?
As explained here, Pipeline "scripts" are not simple Groovy scripts; they are heavily transformed before running, some parts on the master, some parts on slaves, with their state (variable values) serialized and passed on to the next step. As such, not every Groovy feature is supported, and what looks like a simple function really is not one.
It does not mean that what you want to achieve is impossible. You can create stages programmatically, but apparently not with the declarative syntax. See also this question for good suggestions.
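With scripted syntax, the stage-building function the question asks for is straightforward. A minimal sketch (the stage names and echo steps are placeholders):

// Plain Groovy function that creates one stage with a couple of steps.
def makeStage(String name) {
    stage(name) {
        echo "${name}: step A"
        echo "${name}: step B"
    }
}

node {
    for (name in ['build', 'test', 'package']) {
        makeStage(name)
    }
}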
You can define a declarative pipeline in a shared library, for example:
// vars/evenOrOdd.groovy
def call(int buildNumber) {
    if (buildNumber % 2 == 0) {
        pipeline {
            agent any
            stages {
                stage('Even Stage') {
                    steps {
                        echo "The build number is even"
                    }
                }
            }
        }
    } else {
        pipeline {
            agent any
            stages {
                stage('Odd Stage') {
                    steps {
                        echo "The build number is odd"
                    }
                }
            }
        }
    }
}

// Jenkinsfile
@Library('my-shared-library') _
evenOrOdd(currentBuild.getNumber())
See Defining Declarative Pipelines
