How to compare string variables in Jenkins `when` condition

I'm defining a Jenkins declarative pipeline and having a hard time configuring a step to not execute if two strings are equal.
I've tried several things but string comparison doesn't work.
Here's my current state:
stages {
    stage('Check if image has changed') {
        steps {
            script {
                OLD_DIGEST = sh(returnStdout: true, script: "podman manifest inspect registry/myimage:11 2>/dev/null | jq .config.digest").trim()
                NEW_DIGEST = sh(returnStdout: true, script: "podman inspect --format='sha256:{{.Id}}' myimage:11-tmp").trim()
            }
            sh "echo previous digest:${OLD_DIGEST}, new digest:${NEW_DIGEST}"
        }
    }
    stage('Release') {
        when {
            allOf {
                expression { env.RELEASE != null && env.RELEASE == "true" }
                expression { env.OLD_DIGEST != env.NEW_DIGEST }
            }
        }
        steps {
            sh "echo Releasing image..."
            sh "podman image push myimage:11-tmp registry/myimage:11.${DATE_TIME}"
            sh "podman image push myimage:11-tmp registry/myimage:11"
        }
    }
}
More specifically, the issue lies in the `when` block:
allOf {
    expression { env.RELEASE != null && env.RELEASE == "true" }
    expression { env.OLD_DIGEST != env.NEW_DIGEST }
}
The first expression works fine but I can't make the second work: even if OLD_DIGEST and NEW_DIGEST are different, the step is skipped.
Example output:
previous digest:sha256:736fd651afdffad2ee48a55a3fbab8de85552f183602d5bfedf0e74f90690e32, new digest:sha256:9003077f080f905d9b1a960b7cf933f04756df9560663196b65425beaf21203d
...
Stage "Release" skipped due to when conditional
I've also tried expression { OLD_DIGEST != NEW_DIGEST } (removing the env.), but then the result is the opposite: even when both strings are equal, the stage is NOT skipped.
Output in this case:
previous digest:sha256:8d966d43262b818073ea23127dedb61a43963a7fafc5cffdca85141bb4aada57, new digest:sha256:8d966d43262b818073ea23127dedb61a43963a7fafc5cffdca85141bb4aada57
...
Releasing image...
I'm wondering whether the issue lies in the expression or in the allOf.

According to my tests on a recent (2023) Jenkins version, env variables are re-initialized in every stage, so the values set in a previous stage get overridden.
Note: inside when, the env vars have their default values, ignoring the values set in the previous stage. After that, inside steps, they have the expected values (as updated in the previous stage).
If you use global variables instead of env variables, it works. I simulated your podman output with echo.
def OLD_DIGEST
def NEW_DIGEST
pipeline {
    agent any
    environment {
        RELEASE = "true"
    }
    stages {
        stage('Check if image has changed') {
            steps {
                script {
                    OLD_DIGEST = sh(returnStdout: true, script: "echo '1'").trim()
                    NEW_DIGEST = sh(returnStdout: true, script: "echo '1'").trim()
                }
                sh "echo previous digest:${OLD_DIGEST}, new digest:${NEW_DIGEST}"
            }
        }
        stage('Release') {
            when {
                allOf {
                    expression { env.RELEASE != null && env.RELEASE == "true" }
                    expression { OLD_DIGEST != NEW_DIGEST }
                }
            }
            steps {
                sh "echo Releasing image..."
            }
        }
    }
}
When OLD_DIGEST == 1 and NEW_DIGEST == 1, the stage is skipped; when they differ, the stage is executed.

The root cause of my issue was that the two strings I compared were indeed different: one was "xxx" (with literal double quotes, because jq without -r prints the JSON-encoded string) while the other was xxx. The Jenkins output didn't reveal the quotes because the shell strips them when the value is interpolated into the echo command.
The correct Jenkins comparison, as stated in the comments, is expression { OLD_DIGEST != NEW_DIGEST } (without env.).
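The mismatch is easy to reproduce in plain shell (the digest value below is made up); using `jq -r .config.digest` in the first sh step would have avoided it, since `-r` emits the raw string without the JSON quotes:

```shell
# Made-up digest value, just to show the quoting mismatch.
OLD='"sha256:abc123"'   # jq .config.digest keeps the JSON double quotes
NEW='sha256:abc123'     # podman inspect --format emits the raw value
if [ "$OLD" = "$NEW" ]; then echo "equal"; else echo "different"; fi   # prints: different
```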

Related

why is this environment variable evaluated every time it is used?

I have noticed this (to me) strange behaviour. I have this Jenkins declarative pipeline:
#!groovy
pipeline {
    agent {
        node {
            label 'mine-agent-pod'
        }
    }
    environment {
        MARKER = """run-${sh(
            returnStdout: true,
            script: "date -Ins | sed 's/[^a-zA-Z0-9-]/_/g'"
        ).trim()}"""
        STATUS_DATA = "status-data-${MARKER}.json"
    }
    stages {
        stage('Setup') {
            steps {
                sh("""echo MARKER=${MARKER}""")
                sh("""echo STATUS_DATA=${STATUS_DATA}""")
            }
        }
    }
}
I wanted MARKER to be a kind of ID that I would use to mark all the temporary stuff I create in a build (and I'd like it to be a date). But it looks like MARKER is evaluated every time it is used, as the build output shows (notice how the nanoseconds part of the string differs):
[Pipeline] sh
+ echo MARKER=run-2020-07-07T12_04_23_369785902_00_00
MARKER=run-2020-07-07T12_04_23_369785902_00_00
[Pipeline] sh
+ echo STATUS_DATA=status-data-run-2020-07-07T12_04_23_727188019_00_00.json
STATUS_DATA=status-data-run-2020-07-07T12_04_23_727188019_00_00.json
Why is that? How do I get a "static" variable?
This is because Groovy closures have an interesting property compared to mere expressions: lazy evaluation, so the value is recomputed each time the variable is used. Forcing eager evaluation with plain string concatenation gives a stable value:
environment {
    MARKER = 'run-' + sh(
        returnStdout: true,
        script: "date -Ins | sed 's/[^a-zA-Z0-9-]/_/g'").trim()
    STATUS_DATA = "status-data-${MARKER}.json"
}
Following a colleague's great advice, defining the variable outside of the pipeline block helped:
#!groovy
def MARKER = """run-${ new Date().format("yyyy-MM-dd'T'HH:mm:ss.SZ") }"""
pipeline {
    agent {
        node {
            label 'sat-cpt'
        }
    }
    environment {
        STATUS_DATA = "status-data-${MARKER}.json"
    }
    stages {
        stage('Setup') {
            steps {
                sh("""echo MARKER=${MARKER}""")
                sh("""echo STATUS_DATA=${STATUS_DATA}""")
            }
        }
    }
}
This prints:
[Pipeline] sh
+ echo MARKER=run-2020-07-08T19:41:56.130+0000
MARKER=run-2020-07-08T19:41:56.130+0000
[Pipeline] sh
+ echo STATUS_DATA=status-data-run-2020-07-08T19:41:56.130+0000.json
STATUS_DATA=status-data-run-2020-07-08T19:41:56.130+0000.json

buildingTag() always returns false

Whenever I try to create a conditional stage with buildingTag(), the stage always gets skipped, even when the current commit is a tag. Here is my Jenkinsfile:
pipeline {
    agent {
        docker {
            image 'node:10'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'yarn install'
                sh 'node scripts/build.js'
            }
        }
        stage('Lint') {
            steps {
                sh 'yarn lint'
            }
        }
        stage('Deploy') {
            when {
                buildingTag()
            }
            environment {
            }
            steps {
                sh 'node scripts/deploy.js'
                sh 'node scripts/publish.js'
            }
        }
    }
}
Likely due to this bug:
https://issues.jenkins-ci.org/browse/JENKINS-55987
Workaround is:
when {
    expression {
        return isVersionTag(readCurrentTag())
    }
}
with:
def boolean isVersionTag(String tag) {
    echo "checking version tag $tag"
    if (tag == null) {
        return false
    }
    // use your preferred pattern
    def tagMatcher = tag =~ /\d+\.\d+\.\d+/
    return tagMatcher.matches()
}

// workaround for https://issues.jenkins-ci.org/browse/JENKINS-55987
def String readCurrentTag() {
    return sh(returnStdout: true, script: "git describe --tags").trim()
}
buildingTag() requires that the TAG_NAME environment variable is set.
This is not set automatically in a simple (not multibranch) pipeline.
pipeline {
    agent any
    environment {
        // To get the tag as shown in soru's answer:
        // TAG_NAME = sh(returnStdout: true, script: "git describe --tags").trim()
        // In my case I already have a tag saved as an environment variable:
        // gitlabBranch=refs/tags/tagname
        TAG_NAME = "${env.gitlabBranch.split('/')[2]}"
    }
    stages {
        stage('buildingTag') {
            when { buildingTag() }
            steps {
                echo 'buildingTag works here.'
            }
        }
    }
}
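For reference, here is what the `split('/')[2]` above extracts from a ref like `refs/tags/tagname`, sketched in plain shell (the tag value is illustrative):

```shell
ref="refs/tags/v1.2.3"      # illustrative value of env.gitlabBranch
echo "$ref" | cut -d/ -f3   # prints: v1.2.3  (the split('/')[2] equivalent)
echo "${ref##*/}"           # prints: v1.2.3  (same thing via parameter expansion)
```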
I also came across this problem. All you need to do is enable Advanced clone behaviours -> Fetch tags in the project settings and set the TAG_NAME environment variable in the Jenkinsfile.
1- Advanced clone behaviours
2- Fetch tags
3- Set the TAG_NAME variable in the pipeline (the buildingTag function requires this)
pipeline {
    environment {
        TAG_NAME = sh(returnStdout: true, script: "git --no-pager tag --points-at HEAD").trim()
    }
    agent {
    ...
4- Use Jenkins's buildingTag function to check whether the commit has a tag or not
    ...
    stage("Publish Release Artifact") {
        when {
            buildingTag()
        }
    ...
I have been using soru's solution, but I had problems when I was building a branch that is tagged, so I tried this and it seems to work:
def boolean isVersionTag(String tag) {
    echo "checking version tag $tag"
    if (tag == null) {
        return false
    }
    // use your preferred pattern
    def tagMatcher = tag =~ /\d+\.\d+\.\d+/
    return tagMatcher.matches()
}

def String readCurrentTag() {
    return sh(returnStdout: true, script: 'echo ${TAG_NAME}').trim()
}
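The difference between the two readCurrentTag variants matters exactly in the tagged-branch case: `git describe --tags` falls back to the nearest reachable tag, while `git tag --points-at HEAD` (which typically feeds TAG_NAME) only reports an exact tag. A self-contained sketch in a throwaway repo (tag name and commit messages are made up):

```shell
# Build a scratch repo where HEAD is one commit past tag v1.0.0.
tmp=$(mktemp -d) && cd "$tmp" && git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "first"
git tag v1.0.0
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "second"
git describe --tags        # prints v1.0.0-1-g<sha>: nearest tag, even though HEAD is untagged
git tag --points-at HEAD   # prints nothing: HEAD itself carries no tag
```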

Jenkins pipeline is skipping groovy 'else if' clause

I am conducting some tests in my pipeline. My aim is that if an error file exists, the build should fail. But if for some reason the tests hit an exception and wrote neither an error nor a success file, the pipeline should also fail. If neither failure condition is met, I would like an upstream job to execute.
I wrote it in stage and initially it looked like this:
stage('system tests') {
    steps {
        dir(project_root) {
            def error_exists = sh(
                script: 'ls error.txt', returnStatus: true
            )
            if (error_exists == 0) {
                currentBuild.result = 'FAILED'
                return
            }
            build job: 'my-job'
        }
    }
}
The above code works: when the tests wrote an error file, the pipeline failed. I then tried to modify the code to cater for the outcome where neither an error nor a success file is written.
stage('system tests') {
    steps {
        dir(project_root) {
            def error_exists = sh(
                script: 'ls error.txt', returnStatus: true
            )
            def success_exists = sh(
                script: 'ls success.txt', returnStatus: true
            )
            if (error_exists == 0) {
                currentBuild.result = 'FAILED'
                return
            } else if (success_exists == 1 && error_exists == 1) {
                currentBuild.result = 'FAILED'
                return
            }
            build job: 'my-job'
        }
    }
}
I simulated a situation where neither file was written, and the pipeline didn't fail; instead it triggered the upstream build. Why am I not entering the else if clause if both shell scripts fail? I think both conditions should be met. (The output below is from the shell scripts in the new-job pipeline.)
[new-job] Running shell script
+ ls error.txt
ls: cannot access error.txt: no such file or directory
[new-job] Running shell script
+ ls success.txt
ls: cannot access success.txt: no such file or directory
If these files do not exist, the Jenkins sh step returns exit code 2 (what ls reports for a missing operand on GNU systems), not 1. You should rewrite your if condition like this:
success_exists == 2 && error_exists == 2
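That exit-code behavior is easy to verify outside Jenkins (the 2 is what GNU coreutils `ls` returns for a missing command-line argument; other `ls` implementations may return a different nonzero code, which is why comparing against 0 is safer):

```shell
cd "$(mktemp -d)"                         # empty scratch directory
touch error.txt
ls error.txt >/dev/null 2>&1 && echo "error.txt present, exit code 0"
rc=0; ls missing.txt >/dev/null 2>&1 || rc=$?
echo "missing.txt absent, exit code $rc"  # 2 with GNU coreutils
```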
But, I think that in your case this code is more suitable:
stage('system tests') {
    steps {
        dir(project_root) {
            def error_exists = sh(
                script: 'ls error.txt', returnStatus: true
            )
            def success_exists = sh(
                script: 'ls success.txt', returnStatus: true
            )
            if (error_exists == 0) {
                currentBuild.result = 'FAILED'
                return
            } else if (success_exists != 0 && error_exists != 0) {
                currentBuild.result = 'FAILED'
                return
            }
            build job: 'my-job'
        }
    }
}
This is safer because there may be other reasons the file cannot be found (lack of access, etc.).
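A sturdier probe than parsing `ls` exit codes is `test -f` (or the `fileExists` pipeline step), since `test -f` returns exactly 0 when the file exists and 1 when it does not, regardless of the `ls` implementation. A quick shell sketch:

```shell
cd "$(mktemp -d)"              # empty scratch directory
touch success.txt
rc=0; test -f success.txt || rc=$?
echo "success_exists=$rc"      # prints: success_exists=0
rc=0; test -f error.txt || rc=$?
echo "error_exists=$rc"        # prints: error_exists=1
```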

Using contains with Jenkins expression and env var

I am trying to create a Jenkinsfile to handle different steps in prod vs dev environments. I was attempting to use the anyOf pattern with an expression that checks the JOB_URL environment variable to determine which build server/build instructions to follow.
Jenkinsfile ends up looking something like the below:
stage('In Prod') {
    when {
        allOf {
            expression { params.P1 == 'x' }
            expression { params.P2 == 'y' }
            expression { env.JOB_URL.contains('prod_server.com') }
        }
    }
    ...
}
stage('In Dev') {
    when {
        allOf {
            expression { params.P1 == 'x' }
            expression { params.P2 == 'y' }
            expression { env.JOB_URL.contains('dev_server.com') }
        }
    }
    ...
}
Expected Behavior:
In Dev -> run In Dev step
In Prod -> run In Prod step
Actual Behavior:
In Dev -> run In Prod AND In Dev step
In Prod -> run In Prod step
Yes, I have checked to make sure that JOB_URL on the dev does not contain prod_server.com.
I have also tried !env.JOB_URL.contains('dev_server.com') as an additional expression for the prod step with the same result.
I only know enough groovy to get through Jenkins, and am somewhat new to Jenkins pipeline syntax, so maybe I've misunderstood the behavior here, but from what I understand from the Jenkins expression documentation:
when returning strings from your expressions they must be
converted to booleans or return null to evaluate to false. Simply
returning "0" or "false" will still evaluate to "true".
And as a sanity check, I confirmed that the groovy documentation says contains should be returning a boolean.
You can use a regular expression comparator in the expression to check one of these environment variables:
built-in: JENKINS_URL and BUILD_URL
plugin-provided: JOB_URL (it exists, but I can't find where it's documented)
Note: environment variables are exposed through the reserved variable map env, e.g. server_url = env.JENKINS_URL.
Try something like this:
pipeline {
    agent none
    parameters {
        string(name: 'P1', defaultValue: 'x', description: '')
        string(name: 'P2', defaultValue: 'y', description: '')
    }
    stages {
        stage('Init') {
            steps {
                echo "params = ${params.toString()}"
                echo "env.JENKINS_URL = ${env.JENKINS_URL}"
            }
        }
        stage('In Prod') {
            when {
                allOf {
                    expression { params.P1 == 'x' }
                    expression { params.P2 == 'y' }
                    expression { env.JENKINS_URL ==~ /.*prod_server.com.*/ }
                }
            }
            steps {
                echo "Prod"
            }
        }
        stage('In Dev') {
            when {
                allOf {
                    expression { params.P1 == 'x' }
                    expression { params.P2 == 'y' }
                    expression { env.JENKINS_URL ==~ /.*dev_server.com.*/ }
                }
            }
            steps {
                echo "DEV"
            }
        }
    }
}
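One Groovy detail worth noting about the answer above: `==~` requires the pattern to match the entire string, which is why the `.*` on both sides is needed, while `contains()` (and Groovy's `=~` find operator) look for a substring anywhere. The same distinction in shell terms, with an illustrative URL:

```shell
url="https://prod_server.com/jenkins/job/demo/"    # illustrative JOB_URL
# substring search, like contains() or =~ :
echo "$url" | grep -q 'prod_server.com' && echo "found"         # prints: found
# whole-string match, like ==~ (hence the .* on both sides):
echo "$url" | grep -qx '.*prod_server\.com.*' && echo "matched" # prints: matched
```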

How do I pass variables between stages in a declarative Jenkins pipeline?

How do I pass variables between stages in a declarative pipeline?
In a scripted pipeline, I gather the procedure is to write to a temporary file, then read the file into a variable.
How do I do this in a declarative pipeline?
E.g. I want to trigger a build of a different job, based on a variable created by a shell action.
stage("stage 1") {
    steps {
        sh "do_something > var.txt"
        // I want to get var.txt into VAR
    }
}
stage("stage 2") {
    steps {
        build job: "job2", parameters: [string(name: "var", value: "${VAR}")]
    }
}
If you want to use a file (since a script is the thing generating the value you need), you could use readFile as seen below. If not, use sh with the script option as seen below:
// Define a groovy local variable, myVar.
// A global variable without the def, like myVar = 'initial_value',
// was required for me in older versions of Jenkins. Your mileage
// may vary. Defining the variable here maybe adds a bit of clarity,
// showing that it is intended to be used across multiple stages.
def myVar = 'initial_value'
pipeline {
    agent { label 'docker' }
    stages {
        stage('one') {
            steps {
                echo "1.1. ${myVar}" // prints '1.1. initial_value'
                sh 'echo hotness > myfile.txt'
                script {
                    // OPTION 1: set variable by reading from file.
                    // FYI, trim removes leading and trailing whitespace from the string
                    myVar = readFile('myfile.txt').trim()
                }
                echo "1.2. ${myVar}" // prints '1.2. hotness'
            }
        }
        stage('two') {
            steps {
                echo "2.1 ${myVar}" // prints '2.1. hotness'
                sh "echo 2.2. sh ${myVar}, Sergio" // prints '2.2. sh hotness, Sergio'
            }
        }
        // this stage is skipped due to the when expression, so nothing is printed
        stage('three') {
            when {
                expression { myVar != 'hotness' }
            }
            steps {
                echo "three: ${myVar}"
            }
        }
    }
}
Simply:
pipeline {
    agent any
    parameters {
        string(name: 'custom_var', defaultValue: '')
    }
    stages {
        stage("make param global") {
            steps {
                script {
                    tmp_param = sh(script: 'most amazing shell command', returnStdout: true).trim()
                    env.custom_var = tmp_param
                }
            }
        }
        stage("test if param was saved") {
            steps {
                echo "${env.custom_var}"
            }
        }
    }
}
I had a similar problem, as I wanted one specific pipeline to provide variables and many other pipelines to consume them.
I created a my-set-env-variables pipeline
script {
    env.my_dev_version = "0.0.4-SNAPSHOT"
    env.my_qa_version = "0.0.4-SNAPSHOT"
    env.my_pp_version = "0.0.2"
    env.my_prd_version = "0.0.2"
    echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
I can then reuse these variables in another pipeline, my-set-env-variables-test:
script {
    env.dev_version = "NOT DEFINED DEV"
    env.qa_version = "NOT DEFINED QA"
    env.pp_version = "NOT DEFINED PP"
    env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
    echo "PRE DEV version = ${env.dev_version}"
    script {
        def variables = build job: 'my-set-env-variables'
        def vars = variables.getBuildVariables()
        // println "found variables" + vars
        env.dev_version = vars.my_dev_version
        env.qa_version = vars.my_qa_version
        env.pp_version = vars.my_pp_version
        env.prd_version = vars.my_prd_version
    }
}
stage('next job') {
    echo "NEXT JOB DEV version = ${env.dev_version}"
    echo "NEXT JOB QA version = ${env.qa_version}"
    echo "NEXT JOB PP version = ${env.pp_version}"
    echo "NEXT JOB PRD version = ${env.prd_version}"
}
There is no need for (hidden plugin) parameter definitions or temp-file access. Sharing variables across stages can be accomplished by using global Groovy variables in a Jenkinsfile, like so:
#!/usr/bin/env groovy
def MYVAR
def outputOf(cmd) { return sh(returnStdout: true, script: cmd).trim() }
pipeline {
    agent any
    stages {
        stage("stage 1") {
            steps {
                script {
                    MYVAR = outputOf('echo do_something')
                }
                sh "echo MYVAR has been set to: '${MYVAR}'"
            }
        }
        stage("stage 2") {
            steps {
                sh '''echo "...in multiline quotes: "''' + MYVAR + '''" ... '''
                build job: "job2", parameters: [string(name: "var", value: MYVAR)]
            }
        }
    }
}
I have enhanced the existing solution by correcting the syntax. I also used the hidden parameter plugin so that the variable does not show up as an extra parameter in the Jenkins UI. Works well :)
properties([parameters([[$class: 'WHideParameterDefinition', defaultValue: 'yoyo', name: 'hidden_var']])])
pipeline {
    agent any
    stages {
        stage("make param global") {
            steps {
                script {
                    env.hidden_var = "Hello"
                }
            }
        }
        stage("test if param was saved") {
            steps {
                echo "About to check result"
                echo "${env.hidden_var}"
            }
        }
    }
}
