Jenkins Job DSL guard-rescue in pipeline

Lately I have been working on a project where I am converting deprecated Build Flow jobs to Pipeline across several Job DSL-generated Jenkins instances.
One particular problem was converting the Build Flow guard-rescue mechanism to Pipeline syntax. I'm curious what you think about my solution.
See the following flow DSL:
guard {
    b = build("parameterised_job")
} rescue {
    build("analyzer_job",
        PARAMETER_ONE: b.environment.get("PARAMETER_ONE"),
        PARAMETER_TWO: b.environment.get("PARAMETER_TWO")
    )
}
I created the following alternative with the pipeline syntax:
pipeline {
    agent any
    stages {
        stage("build") {
            steps {
                script {
                    def b = build(job: "parameterised_job", propagate: false)
                    build(job: "analyzer_job",
                        parameters: [
                            [$class: 'StringParameterValue', name: 'PARAMETER_ONE', value: b.buildVariables.PARAMETER_ONE],
                            [$class: 'StringParameterValue', name: 'PARAMETER_TWO', value: b.buildVariables.PARAMETER_TWO]
                        ])
                    if (b.result == 'FAILURE') {
                        error("${b.projectName} FAILED")
                    }
                }
            }
        }
    }
}
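For comparison, here is a minimal sketch of the same idea with a try/finally inside the script block, which keeps the guard/rescue intent visible: the analyzer always runs, and the stage still fails afterwards if the guarded build failed. It reuses the job names above and is only one possible translation, not the only correct one.

// Sketch: try/finally as a guard-rescue lookalike; analyzer_job always runs.
script {
    def b = build(job: "parameterised_job", propagate: false)
    try {
        if (b.result == 'FAILURE') {
            error("${b.projectName} FAILED")   // fails the stage once the finally block has run
        }
    } finally {
        // the "rescue" part: always run the analyzer with the guarded build's variables
        build(job: "analyzer_job",
            parameters: [
                string(name: 'PARAMETER_ONE', value: b.buildVariables.PARAMETER_ONE),
                string(name: 'PARAMETER_TWO', value: b.buildVariables.PARAMETER_TWO)
            ])
    }
}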

Related

Is it possible to get build number even if build is unstable but not failed?

When building a job in a scripted pipeline, I would like to keep the external build number even if that build is unstable but not failed.
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                script {
                    Job1 = build job: 'Job1'
                }
            }
        }
        stage('Job2') {
            steps {
                build job: 'Job2',
                    parameters: [
                        string(
                            name: 'Job1_ID',
                            value: "${Job1.number}"
                        )
                    ]
            }
        }
    }
}
I have tried with catchError() around the Job1 build, but I still have the problem when that build is unstable.
I have also tried the propagate: false parameter, but then I can never see the actual status of the build visually, and I don't want the second build to be triggered if the first one failed.
Is there any solution for that?
What you can do is set propagate: false and then conditionally execute your second Job. Please see the pipeline below.
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                script {
                    Job1 = build job: 'Job1', propagate: false
                }
            }
        }
        stage('Job2') {
            when { expression { return Job1.resultIsBetterOrEqualTo("SUCCESS") } }
            steps {
                build job: 'Job2',
                    parameters: [
                        string(name: 'Job1_ID', value: "${Job1.number}")
                    ]
            }
        }
    }
}
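Note that the question asks for Job2 to run even when Job1 is unstable but not failed; resultIsBetterOrEqualTo("SUCCESS") would also skip unstable builds. If unstable should still trigger Job2, the threshold can be lowered, for example:

// Run Job2 when Job1 ended SUCCESS or UNSTABLE; skip it only on worse results such as FAILURE
when { expression { return Job1.resultIsBetterOrEqualTo("UNSTABLE") } }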

How to use a matrix section in a declarative pipeline

I have the following pipeline. I need this pipeline to run on 4 different nodes at the same time. I have read that using a matrix section within the declarative pipeline is key to making this work. How can I go about doing that with the pipeline below?
pipeline {
    stages {
        stage('Test') {
            steps {
                script {
                    def test_proj_choices = ['AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA']
                    for (choice in test_proj_choices) {
                        stage("${choice}") {
                            echo "Running ${choice}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: choice), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
One helpful article can be found here: https://www.jenkins.io/blog/2019/11/22/welcome-to-the-matrix/
The official documentation is here: https://www.jenkins.io/doc/book/pipeline/syntax/#declarative-matrix
Accordingly, the syntax should be:
pipeline {
    agent none
    stages {
        stage('Tests') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'CHOICE'
                        values 'AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA'
                    }
                }
                stages {
                    stage("Test") {
                        steps {
                            echo "Running ${CHOICE}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: CHOICE), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
Note that your inner stage cannot be named dynamically; you would get a syntax error trying to expand "${CHOICE}" in the stage name.
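The question also mentions running on 4 different nodes at the same time. With the matrix directive that is usually expressed as an axis whose values are agent labels; here is a minimal sketch, assuming labels node1 through node4 exist on your Jenkins (substitute your real labels):

// Sketch only: run the same Test stage on four labelled agents in parallel.
// node1..node4 are placeholder labels, not part of the original question.
pipeline {
    agent none
    stages {
        stage('Tests') {
            matrix {
                agent { label "${NODE_LABEL}" }
                axes {
                    axis {
                        name 'NODE_LABEL'
                        values 'node1', 'node2', 'node3', 'node4'
                    }
                }
                stages {
                    stage('Test') {
                        steps {
                            echo "Running on ${NODE_LABEL}"
                        }
                    }
                }
            }
        }
    }
}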

Jenkins: Trigger another job with branch name

I am using a declarative Jenkins Pipeline and I would like to trigger another job with the branch name.
For instance, I have two different pipelines (PipelineA and PipelineB) with stages JobA and JobB.
One of the stages of JobA should trigger JobB, passing env.GIT_BRANCH as a parameter. What I mean is: if JobA is triggered for origin/develop, it should trigger JobB and run the stages that have an origin/develop condition.
Meanwhile, we are also making separate changes on JobB, which has its own GIT_BRANCH expression, so I could not find a way to manage this separately without affecting JobA. To clarify: when JobA triggers JobB with the origin/stage parameter, JobB's own GIT_BRANCH is still origin/development because of the latest changes on JobB, so I cannot run the stages that have the stage condition.
Here is my script.
stage('Job A') {
    steps {
        script {
            echo "Triggering job for branch ${env.GIT_BRANCH}"
            ret = build(job: "selenium_tests",
                parameters: [
                    string(name: "projectName", value: "Project1"),
                    string(name: "branchName", value: "${env.GIT_BRANCH}")
                ],
                propagate: true,
                wait: true)
            echo ret.result
            currentBuild.result = ret.result
        }
    }
}
parameters {
    string(defaultValue: "project1", description: 'Which project do you want to test?', name: 'projectName')
    string(defaultValue: "origin/development", description: 'Environment for selenium tests', name: 'branchName')
}
stage('Job B') {
    when {
        beforeAgent true
        expression { params.projectName == 'Project1' }
        expression { params.branchName == "origin/stage" }
        expression { return env.GIT_BRANCH == "origin/stage" }
    }
    steps {
        script {
            // Do something
        }
    }
}
Pass down one more parameter for the branch when triggering Job B:
stage('Trigger Job A') { /* ... */ }
stage('Trigger Job B') {
    when {
        beforeAgent true
        allOf {
            expression { params.projectName == 'Project1' }
            expression { return env.GIT_BRANCH == "origin/stage" }
        }
    }
    steps {
        build(job: "selenium_tests/Job B",
            parameters: [
                string(name: "projectName", value: "Project1"),
                string(name: "branchName", value: "${env.GIT_BRANCH}")
            ],
            propagate: true,
            wait: true)
    }
}
In Job B's Jenkinsfile, add one stage as the first stage to switch to the desired branch:
pipeline {
    parameters {
        string(name: 'branchName', defaultValue: 'develop')
    }
    stages {
        stage('Switch branch') {
            steps {
                sh "git checkout ${params.branchName}"
            }
        }
        // other stages
    }
}

How to build pipeline jobs in parallel based on choice parameter values?

In Jenkins, I am configuring a pipeline job that runs based on choice parameter values; for each choice value a certain set of jobs needs to run in parallel. For example, if I build with the Job1 parameter, only Job1's parallel jobs should run. But with the script below, all the jobs are built. Is there a way to build the jobs based on the parameter value?
Choice Parameter
Name: Param
Values: Job1, Job2
import jenkins.model.*
import hudson.model.*

node('') {
    stage('Parallel-Job1') {
        parallel(Job1: {
            stage('Parallel-test1') {
                build job: 'test1', propagate: false
                def jobname1 = "test1"
            }
        }, Job1: {
            stage('Parallel-test2') {
                build job: 'test2', propagate: false
                def jobname2 = "test2"
            }
        })
        stage('Parallel-Job2') {
            parallel(Job2: {
                stage('Parallel-test3') {
                    build job: 'test3', propagate: false
                    def jobname1 = "test3"
                }
            })
        }
    }
}
You can wrap each stage in a check on the parameter, e.g.:
if (params.Param == "Job1") {
    stage('Parallel-Job1') { /* steps ... */ }
}
PS: in this case you won't see the skipped pipeline stage in the general view.
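Put together, a minimal scripted sketch of that approach (reusing the Param choice parameter and the test1/test2/test3 jobs from the question; adjust names as needed) could look like this:

// Sketch: only run the parallel branches that belong to the selected choice value.
node('') {
    if (params.Param == 'Job1') {
        stage('Parallel-Job1') {
            parallel(
                test1: { build job: 'test1', propagate: false },
                test2: { build job: 'test2', propagate: false }
            )
        }
    }
    if (params.Param == 'Job2') {
        stage('Parallel-Job2') {
            parallel(
                test3: { build job: 'test3', propagate: false }
            )
        }
    }
}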
Or:
stage('conditional stage') {
    agent { label 'my-node' }
    when {
        expression {
            return params.Param != 'Job1'
        }
    }
    steps {
        echo 'foo bar'
    }
}

Jenkinsfile - How to pass parameters for all the stages

To explain the issue, consider that I have 2 Jenkins jobs.
Job1: PARAM_TEST1, which accepts a parameterized value called 'MYPARAM'.
Job2: PARAM_TEST2, which also accepts a parameterized value called 'MYPARAM'.
Sometimes I need to run these 2 jobs in sequence, so I created a separate pipeline job as shown below. It works just fine.
It also accepts a parameterized value called 'MYPARAM', which it simply passes to the build job steps.
pipeline {
    agent any
    stages {
        stage("PARAM 1") {
            steps {
                build job: 'PARAM_TEST1', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
        stage("PARAM 2") {
            steps {
                build job: 'PARAM_TEST2', parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
            }
        }
    }
}
My question:
This example is simple. Actually I have 20 jobs. I do not want to repeat parameters: [string(name: 'MYPARAM', value: "${params.MYPARAM}")] in every single stage.
Is there any way to set the parameters for all the build job steps in one single place?
What you could do is place the common params at the pipeline level and add the job-specific ones to them in the stages:
pipeline {
    agent any
    parameters {
        string(name: 'PARAM1', description: 'Param 1?')
        string(name: 'PARAM2', description: 'Param 2?')
    }
    stages {
        stage('Example') {
            steps {
                echo "${params}"
                script {
                    def myparams = params + string(name: 'MYPARAM', value: "${params.MYPARAM}")
                    build job: 'downstream-pipeline-with-params', parameters: myparams
                }
            }
        }
    }
}
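Another option, sketched here on the assumption that every downstream job takes the same MYPARAM, is to build the common parameter list in one small Jenkinsfile-level helper and call that from each stage. triggerWithCommon and the EXTRA parameter below are made-up names for illustration, not built-in steps:

// Sketch: a Jenkinsfile-local helper that appends the common parameters on every call.
def triggerWithCommon(String jobName, List extraParams = []) {
    def commonParams = [string(name: 'MYPARAM', value: "${params.MYPARAM}")]
    build job: jobName, parameters: commonParams + extraParams
}

pipeline {
    agent any
    stages {
        stage("PARAM 1") {
            steps {
                script { triggerWithCommon('PARAM_TEST1') }
            }
        }
        stage("PARAM 2") {
            steps {
                // hypothetical extra, job-specific parameter on top of the common ones
                script { triggerWithCommon('PARAM_TEST2', [string(name: 'EXTRA', value: 'foo')]) }
            }
        }
    }
}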
