Jenkins pipeline - building parallel with map received from function - jenkins

Recently I tried to build a Jenkins pipeline with a large number of 'Tests' in one of its stages.
The thing is, at some point I got an error saying my stages section was too large, so I tried to solve it with a function that builds all of my stages into a map, so that I can run this function's output (a map of stages) in parallel.
Some of the stages have to run on an agent (node) taken from a label, and others have some unique steps in them.
I am trying to understand, in general, how I can write a function that builds a map to run in parallel, but I was not successful, nor did I find any good example of it online.
I know the question is general, but if anyone can point me to some examples, or just write one, that would be great.
This is the snippet I am working on (not the full Jenkinsfile):
def getParallelBuilders(list_arr) {
    def builders = [:]
    builders['Test-1'] =
        stage('Test-1') {
            node('ci-nodes') {
                when {
                    environment name: 'TEST_NAME', value: 'true'
                    beforeAgent true
                }
                timeout(time: 1, unit: 'HOURS')
                script { runtests() }
                post {
                    success { onTestSuccess title: 'Temp', pytest: 'results.xml' }
                    cleanup { afterTestCleanup2("clean") }
                }
            }
        }
    return builders
}
The call to this function happens from my 'pipeline' block, after the build, configure, etc. stages:
stage('Testing') {
    steps {
        script { parallel getParallelBuilders(list_arr) }
    }
}
I am not sure if my approach to this problem is right at all;
hopefully someone can point me in the right direction.

After a while, here is the solution I got for my problem:
builders = [
    'Test1': {
        stage('Test1') {
            if (RUN_TESTS == 'true') {
                timeout(time: 30, unit: 'MINUTES') {
                    node('ci-nodes') {
                        try {
                            runtests()
                            onTestSuccess title: 'Temp', pytest: 'results.xml'
                        } catch (err) {
                            onTestFailure testName: "Test1"
                        } finally {
                            afterTestCleanup()
                        }
                    }
                }
            }
        }
    }
]
The main issue was understanding the scripted pipeline syntax, which is really different from the declarative way.
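For completeness, the static map above can also be produced by a function that loops over a list of test names, which was the original goal. This is only a sketch, assuming the same runtests(), onTestSuccess, onTestFailure and afterTestCleanup helpers from the snippets above exist in your shared library:

```groovy
// Sketch: build the parallel map from a list of test names.
// Assumes runtests(), onTestSuccess, onTestFailure and afterTestCleanup
// exist as shown above; adjust the label and timeout to your setup.
def getParallelBuilders(List testNames) {
    def builders = [:]
    for (name in testNames) {
        def testName = name  // copy the loop variable so each closure captures its own value
        builders[testName] = {
            stage(testName) {
                timeout(time: 30, unit: 'MINUTES') {
                    node('ci-nodes') {
                        try {
                            runtests()
                            onTestSuccess title: testName, pytest: 'results.xml'
                        } catch (err) {
                            onTestFailure testName: testName
                        } finally {
                            afterTestCleanup()
                        }
                    }
                }
            }
        }
    }
    return builders
}

// Usage from the declarative 'Testing' stage:
// script { parallel getParallelBuilders(['Test1', 'Test2']) }
```

Copying the loop variable into a local before building the closure matters: otherwise every closure would see the last value of the loop.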

Related

Can Jenkins pipelines have variable stages?

From my experience with Jenkins declarative-syntax pipelines, I'm aware that you can conditionally skip a stage with a when clause. E.g.:
run_one = true
run_two = false
run_three = true

pipeline {
    agent any
    stages {
        stage('one') {
            when {
                expression { run_one }
            }
            steps {
                echo 'one'
            }
        }
        stage('two') {
            when {
                expression { run_two }
            }
            steps {
                echo 'two'
            }
        }
        stage('three') {
            when {
                expression { run_three }
            }
            steps {
                echo 'three'
            }
        }
    }
}
...in the above code block, there are three stages, one, two, and three, each of whose execution is conditional on a boolean variable.
I.e. the paradigm is that there is a fixed superset of known stages, of which individual stages may be conditionally skipped.
Does Jenkins pipeline script support a model where there is no fixed superset of known stages, and stages can be "looked up" for conditional execution?
To phrase it as pseudocode, is something along the lines of the following possible:
my_list = list populated _somehow_, maybe reading a file, maybe Jenkins build params, etc.

pipeline {
    agent any
    stages {
        if (stage(my_list[0]) exists) {
            run(stage(my_list[0]))
        }
        if (stage(my_list[1]) exists) {
            run(stage(my_list[1]))
        }
        if (stage(my_list[2]) exists) {
            run(stage(my_list[2]))
        }
    }
}
?
I think another way to think about what I'm asking is: is there a way to dynamically build a pipeline from some dynamic assembly of stages?
For dynamic stages you could write either a fully scripted pipeline or use a declarative pipeline with a scripted section (e.g. by using the script {…} step or by calling your own function). For an overview see Declarative versus Scripted Pipeline syntax and Pipeline syntax overview.
Declarative pipeline is better supported by Blue Ocean, so I personally would use that as a starting point. A disadvantage might be that you need to have a fixed root stage, but I usually name that "start" or "init" so it doesn't look too awkward.
In scripted sections you can call stage as a function, so it can be used completely dynamically.
pipeline {
    agent any
    stages {
        stage('start') {
            steps {
                createDynamicStages()
            }
        }
    }
}

void createDynamicStages() {
    // Stage list could be read from a file or whatever
    def stageList = ['foo', 'bar']
    for (stageName in stageList) {
        stage(stageName) {
            echo "Hello from stage $stageName"
        }
    }
}
In Blue Ocean, this shows up as a sequence of dynamically created stages.

Jenkins execute all sub jobs before marking a top job fail or pass?

def jobs = [
    'subjob1': true,
    'subjob2': false,
    'subjob3': true
]

pipeline {
    agent { label "ag1" }
    stages {
        stage('stage1') {
            steps {
                script {
                    jobs.each {
                        if ("$it.value".toBoolean()) {
                            stage("Stage $it.key") {
                                build([job: "$it.key", wait: true, propagate: true])
                            }
                        }
                    }
                }
            }
        }
    }
}
This Jenkins job triggers other sub-jobs (via pipeline build step): subjob1, subjob2, subjob3. If any of the sub-jobs fail, this job immediately fails (propagate:true).
However, what I'd like to do is continue executing all jobs. And mark this one as failed if one or more sub-jobs fail. How would I do that?
Here is how you can do it. You can simply use a catchError block for this.
def jobs = [
    'subjob1': true,
    'subjob2': false,
    'subjob3': true
]

pipeline {
    agent any
    stages {
        stage('stage1') {
            steps {
                script {
                    jobs.each {
                        if ("$it.value".toBoolean()) {
                            stage("Stage $it.key") {
                                catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
                                    echo "Building"
                                    build([job: "$it.key", wait: true, propagate: true])
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
Instead of executing all the jobs one by one, you can execute them in parallel. This way, all the jobs will be executed independently of each other, and stage1 will fail only if one or more jobs fail.
According to the documentation
The parallel directive takes a map from branch names to closures and an optional argument failFast which will terminate all branches upon a failure in any other branch.
So, we have to transform the jobs to a Map of stage names to Closures that will execute in parallel. We will use jobs.collectEntries() to build the mapping and pass it as the argument to the parallel directive:
stage('Parallel') {
    steps {
        script {
            parallel(jobs.collectEntries {
                [(it.key): {
                    if (it.value) {
                        build(job: it.key)
                    } else {
                        echo "Skipping job execution: ${it.key}"
                        // This is required to mark the parallel stage as skipped - it is not required for the solution to work
                        org.jenkinsci.plugins.pipeline.modeldefinition.Utils.markStageSkippedForConditional(it.key)
                    }
                }]
            })
        }
    }
}
We can omit the wait and propagate flags in the build step because they are set by default.
In the provided solution the Parallel stage (and the resulting build) will fail only if one or more jobs fail. Additionally, if you have the Blue Ocean plugin installed, you will see a nice graph view of the Parallel stage along with all the parallel children.

Jenkins declarative pipeline conditional post action depending on stage (not pipeline) status

I have a Jenkins declarative pipeline in which some stages have a post action:
stage('1 unit tests') { ..... }
stage('2 other tests') {
    steps { ..... }
    post {
        success { ...... }
    }
}
It seems that if a unit test fails (build becomes unstable in stage 1), then the conditional post action of stage 2 is not performed.
How can I make sure the post action is only skipped if the build status changes
during the current stage?
There are some "options"; I don't know which you might like or find acceptable. Note that if a stage is skipped, it also skips all of its internals.
1: It's not exactly what you want, but you can manipulate/mark a stage differently from other stages and continue the execution using something like the skipStagesAfterUnstable option or catchError. For more info see this post. But it may also be too heavy-handed, forcing you into a limited set of results or leading to unwanted execution.
catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE')
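For illustration, a minimal sketch of option 1, with a hypothetical test command, that marks only the stage as failed while the build keeps running, so subsequent stages and their post actions still execute:

```groovy
stage('2 other tests') {
    steps {
        // Keep the overall build SUCCESS but mark this stage FAILURE,
        // so later stages and their conditional post actions still run.
        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
            sh './run-other-tests.sh'  // hypothetical test command
        }
    }
    post {
        success { echo 'Only runs if this stage itself succeeded' }
    }
}
```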
2: You can move the stage to a separate pipeline/job and trigger it from, or after, the run of this pipeline.
3: Another option might be something like the following pseudo-code, but this feels more like a hack, and adding ('global') 'state flags' adds clutter:
failedOne = false
failedTwo = false

pipeline {
    agent any
    stages {
        stage('Test One') {
            steps {...}
            post {
                failure {
                    failedOne = true
                    postFailureOne()
                }
            }
        }
        stage('Test Two') {
            steps {...}
            post {
                failure {
                    failedTwo = true
                    postFailureTwo()
                }
            }
        }
    }
    post {
        success { .... }
        failure {
            if (!failedTwo) postFailureTwo()
        }
    }
}

void postFailureOne() { echo 'Oeps 1' }
void postFailureTwo() { echo 'Oeps 2' }

How to abort a declarative pipeline

I'm trying the new declarative pipeline syntax.
I wonder how I can abort all the stages and steps of a pipeline when, for example, a parameter has an invalid value.
I could add a when clause to every stage, but this isn't optimal for me. Is there a better way to do so?
This should work fine with a when directive, if you make use of the error step.
For example, you could do an up-front check and abort the build if the given parameter value is not acceptable, preventing subsequent stages from running:
pipeline {
    agent any
    parameters {
        string(name: 'targetEnv',
               defaultValue: 'dev',
               description: 'Must be "dev", "qa", or "staging"')
    }
    stages {
        stage('Validate parameters') {
            when {
                expression {
                    // Only run this stage if the targetEnv is invalid
                    !['dev', 'qa', 'staging'].contains(params.targetEnv)
                }
            }
            steps {
                // Abort the build, skipping subsequent stages
                error("Invalid target environment: ${params.targetEnv}")
            }
        }
        stage('Checkout') {
            steps {
                echo 'Checking out source code...'
            }
        }
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}
You could use the FlowInterruptedException, e.g.:
import hudson.model.Result
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

pipeline {
    ...
    steps {
        script {
            throw new FlowInterruptedException(Result.ABORTED)
        }
        ...
    }
}
This will stop execution immediately like the error step, but with more control over the result.
Please note that it requires you to approve a signature:
new org.jenkinsci.plugins.workflow.steps.FlowInterruptedException hudson.model.Result
Apart from Result.ABORTED there are also Result.SUCCESS, Result.UNSTABLE, Result.FAILURE, and Result.NOT_BUILT.
Disclaimer: it's a bit of a hack.

Skip Jenkins Pipeline Steps If Node Is Offline

I have a Jenkins Pipeline job that, for part of the build, uses a node with a lot of downtime. I'd like this step performed if the node is online and skipped without failing the build if the node is offline.
This is related, but different from the problem of skipping parts of a Matrix Project.
I tried to programmatically check if a node is online, like so:
jenkins.model.Nodes.getNode('my-node').toComputer().isOnline()
This runs up against the Jenkins security sandbox:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: unclassified method java.lang.Class getNode java.lang.String
I tried setting a timeout that will be tripped if the node is offline.
try {
    timeout(time: 10, unit: 'MINUTES') {
        node('my-node') {
            // Do optional step
        }
    }
} catch (e) {
    echo 'Time out on optional step. Node down?'
}
This has a major downside. I have to know what the longest time the step would take, then wait even longer when the node is down. I tried working around that with a "canary" step:
try {
    timeout(time: 1, unit: 'SECONDS') {
        node('my-node') {
            echo 'Node is up. Performing optional step.'
        }
    }
    node('my-node') {
        echo 'This is an optional step.'
    }
} catch (e) {
    echo 'Time out on optional step. Node down?'
}
This skips the step if the node is up, but busy with another job. This is the best solution I have come up with so far. Is there just a way to check if the node is online without using a timeout?
This should work:
Jenkins.instance.getNode('my-node').toComputer().isOnline()
see http://javadoc.jenkins-ci.org/jenkins/model/Jenkins.html
There is a pipeline call for this.
nodesByLabel 'my-node'
It returns [] if no node with the label is online, and an ArrayList of the online instances otherwise.
I simply did this:
pipeline {
    agent none
    environment { AGENT_NODE = "somenode" }
    stages {
        stage('Offline Node') {
            when {
                beforeAgent true
                expression {
                    return nodesByLabel(env.AGENT_NODE).size() > 0
                }
            }
            agent {
                label "${env.AGENT_NODE}"
            }
            steps {
                ...
            }
        }
    }
}
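The same nodesByLabel guard (it comes from the Pipeline Utility Steps plugin) can also be written inline in a scripted pipeline. A short sketch, assuming a hypothetical agent label 'somenode':

```groovy
// Scripted-pipeline sketch: only enter the node block when at least
// one agent carrying the label is currently online.
def label = 'somenode'
if (nodesByLabel(label).size() > 0) {
    node(label) {
        echo "Running optional step on an agent labelled '${label}'"
    }
} else {
    echo "No online agent for label '${label}', skipping the optional step."
}
```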