why is this environment variable evaluated every time it is used? - jenkins

I have noticed this (to me) strange behaviour. I have this Jenkins declarative pipeline:
#!groovy
pipeline {
agent {
node {
label 'mine-agent-pod'
}
}
environment {
MARKER = """run-${sh(
returnStdout: true,
script: "date -Ins | sed 's/[^a-zA-Z0-9-]/_/g'"
).trim()}"""
STATUS_DATA = "status-data-${MARKER}.json"
}
stages {
stage('Setup') {
steps {
sh("""echo MARKER=${MARKER}""")
sh("""echo STATUS_DATA=${STATUS_DATA}""")
}
}
}
}
I wanted MARKER to be a kind of ID I could use to mark all the temporary stuff I create in a build (and I'd like it to be a date). But it looks like MARKER is evaluated whenever it is used, as the output of the build shows (notice how the nanoseconds part of the string differs):
[Pipeline] sh
+ echo MARKER=run-2020-07-07T12_04_23_369785902_00_00
MARKER=run-2020-07-07T12_04_23_369785902_00_00
[Pipeline] sh
+ echo STATUS_DATA=status-data-run-2020-07-07T12_04_23_727188019_00_00.json
STATUS_DATA=status-data-run-2020-07-07T12_04_23_727188019_00_00.json
Why is that? How do I get a "static" variable?

This is because Groovy closures have an interesting advantage over mere expressions: lazy evaluation (more detail). To get a value that is evaluated only once, build the string eagerly instead:
environment {
MARKER = 'run-' + sh(
returnStdout: true,
script: "date -Ins | sed 's/[^a-zA-Z0-9-]/_/g'").trim()
STATUS_DATA = "status-data-${MARKER}.json"
}
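To see what the answer means by lazy evaluation, here is a minimal plain-Groovy sketch (not Jenkins-specific, just the underlying Groovy behaviour) contrasting an eagerly evaluated GString expression with a lazily evaluated closure inside a GString:
def eager = "stamp-${System.nanoTime()}"      // expression evaluated once, when the GString is built
def lazy  = "stamp-${-> System.nanoTime()}"   // closure re-evaluated on every toString()
println eager; println eager                  // prints the same value twice
println lazy;  println lazy                   // prints two different values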

Following a colleague's great advice, defining the variable outside of the pipeline block helped:
#!groovy
def MARKER = """run-${ new Date().format("yyyy-MM-dd'T'HH:mm:ss.SZ") }"""
pipeline {
agent {
node {
label 'sat-cpt'
}
}
environment {
STATUS_DATA = "status-data-${MARKER}.json"
}
stages {
stage('Setup') {
steps {
sh("""echo MARKER=${MARKER}""")
sh("""echo STATUS_DATA=${STATUS_DATA}""")
}
}
}
}
This prints:
[Pipeline] sh
+ echo MARKER=run-2020-07-08T19:41:56.130+0000
MARKER=run-2020-07-08T19:41:56.130+0000
[Pipeline] sh
+ echo STATUS_DATA=status-data-run-2020-07-08T19:41:56.130+0000.json
STATUS_DATA=status-data-run-2020-07-08T19:41:56.130+0000.json

Related

Jenkinsfile pipeline.environment values excluded from env.getEnvironment()

(edited/updated from original post to attempt to address confusion about what the problem is)
The problem is: Values that are set in a Jenkinsfile environment section are not added to the object returned by env.getEnvironment()
The question is: How do I get a map of the complete environment, including values that were assigned in the environment section? Because env.getEnvironment() doesn't do that.
Example Jenkinsfile:
pipeline {
agent any
environment {
// this is not included in env.getEnvironment()
ONE = '1'
}
stages {
stage('Init') {
steps {
script {
// this is included in env.getEnvironment()
env['TWO'] = '2'
}
}
}
stage('Test') {
steps {
script {
// get env values as a map (for passing to groovy methods)
def envObject = env.getEnvironment()
// see what env.getEnvironment() looks like
// notice ONE is not present in the output, but TWO is
// ONE is set using ONE = '1' in the environment section above
// TWO is set using env['TWO'] = '2' in the Init stage above
println envObject.toString()
// for good measure loop through the env.getEnvironment() map
// and print any value(s) named ONE or TWO
// only TWO: 2 is output
envObject.each { k,v ->
if (k == 'ONE' || k == 'TWO') {
println "${k}: ${v}"
}
}
// now show that both ONE and TWO are indeed in the environment
// by shelling out and using the env linux command
// this outputs ONE=1 and TWO=2
sh 'env | grep -E "ONE|TWO"'
}
}
}
}
}
Output (output of envObject.toString() shortened to ... except relevant part):
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Init)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
[..., TWO:2]
[Pipeline] echo
TWO: 2
[Pipeline] sh
+ env
+ grep -E ONE|TWO
ONE=1
TWO=2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Notice ONE is missing from the env.getEnvironment() object, but TWO is present.
Also notice that both ONE and TWO are set in the actual environment and I am not asking how to access the environment or how to iterate through the values returned by env.getEnvironment(). The issue is that env.getEnvironment() does not return all the values in the environment, it excludes any values that were set inside the environment section of the Jenkinsfile.
I don't have a "why" answer for you, but you can cheat and get a map by parsing the output from env via the readProperties step.
def envMap = readProperties(text: sh(script: 'env', returnStdout: true))
println(envMap.getClass())
println("${envMap}")
I would get the env and convert it to a map with the help of readProperties:
pipeline {
agent any
environment {
// this is not included in env.getEnvironment()
ONE = '1'
}
stages {
stage('Init') {
steps {
script {
// this is included in env.getEnvironment()
env['TWO'] = '2'
}
}
}
stage('Test') {
steps {
script {
def envProp = readProperties text: sh (script: "env", returnStdout: true).trim()
Map envMapFromProp = envProp as Map
echo "ONE=${envMapFromProp.ONE}\nTWO=${envMapFromProp.TWO}"
// now show that both ONE and TWO are indeed in the environment
// by shelling out and using the env linux command
// this outputs ONE=1 and TWO=2
sh 'env | grep -E "ONE|TWO"'
}
}
}
}
}
The env.getEnvironment() method will not return a List or Map, hence it's difficult to iterate over with each, but there are some workarounds you can use to make this work.
import groovy.json.JsonSlurper
pipeline {
agent any;
environment {
ONE = 1
TWO = 2
}
stages {
stage('debug') {
steps {
script {
def jsonSlurper = new JsonSlurper()
def object = jsonSlurper.parseText(env.getEnvironment().toString())
assert object instanceof Map
object.each { k,v ->
echo "Key: ${k}, Value: ${v}"
}
}
}
}
}
}
Note: env.getEnvironment().toString() will give you a JSON-like string. While parsing that string, jsonSlurper.parseText will throw an error if it encounters any special character.
You can also explore the env Jenkins API a little and find an appropriate method that returns a Map or List, so that you can use each.
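If the JsonSlurper route turns out to be fragile because of that special-character caveat, here is a sketch that sidesteps toString() parsing entirely by iterating the readProperties map from the earlier answers (my combination of the approaches above, not a separate API):
pipeline {
    agent any
    environment {
        ONE = '1'
        TWO = '2'
    }
    stages {
        stage('debug') {
            steps {
                script {
                    // full environment as a real Map, including environment {} values
                    def envMap = readProperties(text: sh(script: 'env', returnStdout: true))
                    envMap.each { k, v ->
                        if (k == 'ONE' || k == 'TWO') {
                            echo "Key: ${k}, Value: ${v}"
                        }
                    }
                }
            }
        }
    }
}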

Jenkins pipeline (parallel && dynamically)?

Question
I have a simple parallel pipeline (see code) which I use together with Jenkins 2.89.2. Additionally, I use parameters and now want to be able to increase or decrease the number of deployVM A..Z stages automatically by providing a parameter before job execution.
How can I dynamically build my pipeline by providing a parameter?
Researched so far:
Jenkins pipeline script created dynamically - Not getting this to work with my Jenkins version
Can I create dynamically stages in a Jenkins pipeline? - Not working either
Code
The pseudo code of what I want - dynamic generation:
pipeline {
agent any
parameters {
string(name: 'countTotal', defaultValue: '3')
}
stages {
stage('deployVM') {
def list = [:]
for(int i = 0; i < countTotal.toInteger; i++) {
list += stage("deployVM ${i}") {
steps {
script {
sh "echo p1; sleep 12s; echo phase${i}"
}
}
}
}
failFast true
parallel list
}
}
}
The code I have so far - it executes in parallel but is static:
pipeline {
agent any
stages {
stage('deployVM') {
failFast true
parallel {
stage('deployVM A') {
steps {
script {
sh "echo p1; sleep 12s; echo phase1"
}
}
}
stage('deployVM B') {
steps {
script {
sh "echo p1; sleep 20s; echo phase2"
}
}
}
}
}
}
}
Although the question assumes a declarative pipeline, I would suggest using a scripted pipeline because it is far more flexible.
Your task can be accomplished this way:
properties([
parameters([
string(name: 'countTotal', defaultValue: '3')
])
])
def stages = [failFast: true]
for (int i = 0; i < params.countTotal.toInteger(); i++) {
def vmNumber = i // alias the loop variable so it can be referenced in the closure
stages["deployVM ${vmNumber}"] = {
stage("deployVM ${vmNumber}") {
sh "echo p1; sleep 12s; echo phase${vmNumber}"
}
}
}
node() {
parallel stages
}
Also take a look at the Snippet Generator, which allows you to generate some scripted pipeline code.
You can also achieve this using a declarative pipeline.
Follow my answer HERE
In the linked answer I used Var.collectEntries, but a map can be used as well; a sketch of that idea follows below.
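A minimal sketch of the collectEntries idea inside a declarative pipeline (my illustration, reusing the question's stage names and commands, not the linked answer itself):
// Helper that returns the closure for one parallel branch
def generateDeployStage(int i) {
    return {
        stage("deployVM ${i}") {
            sh "echo p1; sleep 12s; echo phase${i}"
        }
    }
}

pipeline {
    agent any
    parameters {
        string(name: 'countTotal', defaultValue: '3')
    }
    stages {
        stage('deployVM') {
            steps {
                script {
                    def branches = (0..<params.countTotal.toInteger()).collectEntries { i ->
                        ["deployVM ${i}".toString(), generateDeployStage(i)]
                    }
                    branches.failFast = true
                    parallel branches
                }
            }
        }
    }
}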
@Vitalii
I wrote a similar piece of code, but unfortunately all three looped elements show the last one; I'm not sure whether it has something to do with Groovy or the Jenkinsfile itself, as if some closure/reference breaks with this usage.
My purpose is to distribute tasks to specific worker nodes:
node_candidates = ["worker-1", "worker-2", "worker-3"]
def jobs = [:]
for (node_name in node_candidates){
jobs["run on $node_name"] = { // good
stage("run on $node_name"){ // all show the third
node(node_name){ // all show the third
print "on $node_name"
sh "hostname"
}
}
}
}
parallel jobs
It works totally fine if I expand and spell out the loop instead of looping over it, like:
parallel worker_1: {
stage("worker_1"){
node("worker_1"){
sh """hostname ; pwd """
print "on worker_1"
}
}
}, worker_2: {
stage("worker_2"){
node("worker_2"){
sh """hostname ; pwd """
print "on worker_2"
}
}
}, worker_3: {
stage("worker_3"){
node("worker_3"){
sh """hostname ; pwd """
print "on worker_3"
}
}
}
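That behaviour is the classic closure-over-loop-variable problem: every branch closure ends up referring to the same node_name variable, which holds the last value by the time parallel runs them (the map keys look right because the GString key is evaluated immediately). Copying the loop variable into a local inside the loop, the same trick as vmNumber in the answer above, is enough. A minimal sketch:
node_candidates = ["worker-1", "worker-2", "worker-3"]   // hypothetical node labels
def jobs = [:]
for (node_name in node_candidates) {
    def target = node_name        // local copy: each closure now captures its own value
    jobs["run on ${target}"] = {
        stage("run on ${target}") {
            node(target) {
                sh "hostname"
            }
        }
    }
}
parallel jobs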

How do I pass variables between stages in a declarative Jenkins pipeline?

How do I pass variables between stages in a declarative pipeline?
In a scripted pipeline, I gather the procedure is to write to a temporary file, then read the file into a variable.
How do I do this in a declarative pipeline?
E.g. I want to trigger a build of a different job, based on a variable created by a shell action.
stage("stage 1") {
steps {
sh "do_something > var.txt"
// I want to get var.txt into VAR
}
}
stage("stage 2") {
steps {
build job: "job2", parameters[string(name: "var", value: "${VAR})]
}
}
If you want to use a file (since a script is the thing generating the value you need), you could use readFile as seen below. If not, use sh with the script option as seen below:
// Define a groovy local variable, myVar.
// A global variable without the def, like myVar = 'initial_value',
// was required for me in older versions of jenkins. Your mileage
// may vary. Defining the variable here maybe adds a bit of clarity,
// showing that it is intended to be used across multiple stages.
def myVar = 'initial_value'
pipeline {
agent { label 'docker' }
stages {
stage('one') {
steps {
echo "1.1. ${myVar}" // prints '1.1. initial_value'
sh 'echo hotness > myfile.txt'
script {
// OPTION 1: set variable by reading from file.
// FYI, trim removes leading and trailing whitespace from the string
myVar = readFile('myfile.txt').trim()
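// OPTION 2 (a sketch, not part of the original answer): skip the temp file
// and capture the command's stdout directly, e.g.
//   myVar = sh(returnStdout: true, script: 'echo hotness').trim()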
}
echo "1.2. ${myVar}" // prints '1.2. hotness'
}
}
stage('two') {
steps {
echo "2.1 ${myVar}" // prints '2.1. hotness'
sh "echo 2.2. sh ${myVar}, Sergio" // prints '2.2. sh hotness, Sergio'
}
}
// this stage is skipped due to the when expression, so nothing is printed
stage('three') {
when {
expression { myVar != 'hotness' }
}
steps {
echo "three: ${myVar}"
}
}
}
}
Simply:
pipeline {
parameters {
string(name: 'custom_var', defaultValue: '')
}
stage("make param global") {
steps {
tmp_param = sh (script: 'most amazing shell command', returnStdout: true).trim()
env.custom_var = tmp_param
}
}
stage("test if param was saved") {
steps {
echo "${env.custom_var}"
}
}
}
I had a similar problem, as I wanted one specific pipeline to provide variables and many other pipelines to consume them.
I created a my-set-env-variables pipeline
script
{
env.my_dev_version = "0.0.4-SNAPSHOT"
env.my_qa_version = "0.0.4-SNAPSHOT"
env.my_pp_version = "0.0.2"
env.my_prd_version = "0.0.2"
echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
I can reuse these variables in another pipeline, my-set-env-variables-test:
script
{
env.dev_version = "NOT DEFINED DEV"
env.qa_version = "NOT DEFINED QA"
env.pp_version = "NOT DEFINED PP"
env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
echo "PRE DEV version = ${env.dev_version}"
script
{
def variables = build job: 'my-set-env-variables'
def vars = variables.getBuildVariables()
//println "found variables" + vars
env.dev_version = vars.my_dev_version
env.qa_version = vars.my_qa_version
env.pp_version = vars.my_pp_version
env.prd_version = vars.my_prd_version
}
}
stage('next job') {
echo "NEXT JOB DEV version = ${env.dev_version}"
echo "NEXT JOB QA version = ${env.qa_version}"
echo "NEXT JOB PP version = ${env.pp_version}"
echo "NEXT JOB PRD version = ${env.prd_version}"
}
There is no need for (hidden plugin) parameter definitions or temp-file access. Sharing variables across stages can be accomplished by using global Groovy variables in a Jenkinsfile, like so:
#!/usr/bin/env groovy
def MYVAR
def outputOf(cmd) { return sh(returnStdout: true, script: cmd).trim() }
pipeline {
agent any
stages {
stage("stage 1") {
steps {
script {
MYVAR = outputOf('echo do_something')
}
sh "echo MYVAR has been set to: '${MYVAR}'"
}
}
stage("stage 2") {
steps {
sh '''echo "...in multiline quotes: ''' + MYVAR + ''' ..."'''
build job: "job2", parameters: [string(name: "var", value: MYVAR)]
}
}
}
}
I have enhanced the existing solution by correcting the syntax. I also used the hidden parameter plugin so that it does not show up as an extra parameter in the Jenkins UI. Works well :)
properties([parameters([[$class: 'WHideParameterDefinition', defaultValue: 'yoyo', name: 'hidden_var']])])
pipeline {
agent any
stages{
stage("make param global") {
steps {
script{
env.hidden_var = "Hello"
}
}
}
stage("test if param was saved") {
steps {
echo"About to check result"
echo "${env.hidden_var}"
}
}
}
}

Jenkinsfile Declarative Pipeline defining dynamic env vars

I'm new to Jenkins Pipeline; I'm defining a declarative-syntax pipeline and I don't know whether my problem can be solved, because I haven't found a solution.
In this example, I need to pass a variable to the Ansible plugin (in the old version I used an ENV_VAR or injected it from a file with the inject plugin); that variable comes from a script.
This is my ideal scenario (but it doesn't work because of the environment{} block):
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('Deploy') {
environment {
ANSIBLE_CONFIG = '${WORKSPACE}/chimera-ci/ansible/ansible.cfg'
VERSION = sh("python3.5 docker/get_version.py")
}
steps {
ansiblePlaybook credentialsId: 'example-credential', extras: '-e version=${VERSION}', inventory: 'development', playbook: 'deploy.yml'
}
}
}
}
I tried other ways to test how env vars work, based on another post, for example:
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('PREPARE VARS') {
steps {
script {
env['VERSION'] = sh(script: "python3.5 get_version.py")
}
echo env.VERSION
}
}
}
}
but "echo env.VERSION" return null.
Also tried the same example with:
- VERSION=python3.5 get_version.py
- VERSION=python3.5 get_version.py > props.file (and try to inject it, but didnt found how)
If this is not possible I will do it in the ansible role.
UPDATE
There is another "issue" in Ansible Plugin, to use vars in extra vars it must have double quotes instead of single.
ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
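The difference is simply Groovy string interpolation; a minimal sketch (the VERSION value here is a stand-in):
node {
    def VERSION = '1.2.3'
    echo '-e version=${VERSION}'   // single quotes: no interpolation, prints -e version=${VERSION} literally
    echo "-e version=${VERSION}"   // double quotes: Groovy interpolates first, prints -e version=1.2.3
}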
You can create variables before the pipeline block starts. You can have sh return stdout to assign to these variables. You don't have the same flexibility to assign to environment variables in the environment stanza. So substitute in python3.5 get_version.py where I have echo 0.0.1 in the script here (and make sure your python script just returns the version to stdout):
def awesomeVersion = 'UNKNOWN'
pipeline {
agent { label 'docker' }
stages {
stage('build') {
steps {
script {
awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
}
}
}
stage('output_version') {
steps {
echo "awesomeVersion: ${awesomeVersion}"
}
}
}
}
The output of the above pipeline is:
awesomeVersion: 0.0.1
In Jenkins 2.76 I was able to simplify the solution from @burnettk to:
pipeline {
agent { label 'docker' }
environment {
awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1')
}
stages {
stage('output_version') {
steps {
echo "awesomeVersion: ${awesomeVersion}"
}
}
}
}
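One caveat with this form: sh keeps the trailing newline of the command's stdout, so the value may need trimming. A small tweak to the snippet above (my addition, not part of the original answer):
environment {
    awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
}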
Using the "pipeline utility steps" plugin, you can define general vars available to all stages from a properties file. For example, let props.txt as:
version=1.0
fix=alfa
and mix scripted and declarative Jenkins pipeline code as:
def props
def VERSION
def FIX
def RELEASE
node {
props = readProperties file:'props.txt'
VERSION = props['version']
FIX = props['fix']
RELEASE = VERSION + "_" + FIX
}
pipeline {
agent any
stages {
stage('Build') {
steps {
echo "${RELEASE}"
}
}
}
}
A possible variation of the main answer is to provide the variables using another pipeline instead of a sh script.
example (set the variable pipeline) : my-set-env-variables pipeline
script
{
env.my_dev_version = "0.0.4-SNAPSHOT"
env.my_qa_version = "0.0.4-SNAPSHOT"
env.my_pp_version = "0.0.2"
env.my_prd_version = "0.0.2"
echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
(use these variables) in another pipeline, my-set-env-variables-test
script
{
env.dev_version = "NOT DEFINED DEV"
env.qa_version = "NOT DEFINED QA"
env.pp_version = "NOT DEFINED PP"
env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
echo "PRE DEV version = ${env.dev_version}"
script
{
// call set variable job
def variables = build job: 'my-set-env-variables'
def vars = variables.getBuildVariables()
//println "found variables" + vars
env.dev_version = vars.my_dev_version
env.qa_version = vars.my_qa_version
env.pp_version = vars.my_pp_version
env.prd_version = vars.my_prd_version
}
}
stage('next job') {
echo "NEXT JOB DEV version = ${env.dev_version}"
echo "NEXT JOB QA version = ${env.qa_version}"
echo "NEXT JOB PP version = ${env.pp_version}"
echo "NEXT JOB PRD version = ${env.prd_version}"
}
For those who want the environment keys to be dynamic, the following code can be used:
stage('Prepare Environment') {
steps {
script {
def data = [
"k1": "v1",
"k2": "v2",
]
data.each { key ,value ->
env."$key" = value
// env[key] = value // Deprecated, this can be used as well, but need approval in sandbox ScriptApproval page
}
}
}
}
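Later stages can then read the dynamically created keys back from env. A small follow-up sketch, reusing the hypothetical k1/k2 keys from the map above (add it after the 'Prepare Environment' stage):
stage('Use Environment') {
    steps {
        echo "k1 is ${env.k1}"   // prints "k1 is v1"
        echo "k2 is ${env.k2}"   // prints "k2 is v2"
    }
}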
You can also dump all your vars into a file, and then use the '-e @file' syntax. This is very useful if you have many vars to populate.
steps {
echo "hello World!!"
sh """
echo "var1: ${params.var1}
var2: ${params.var2}" > vars
"""
ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', sudoUser: null, extras: '-e @vars'
}
You can also use shared library functions in the environment section, like so:
@Library('mylibrary') _ // contains functions.groovy with several functions.
pipeline {
environment {
ENV_VAR = functions.myfunc()
}
…
}
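The shared-library side is not shown above; a minimal sketch of what vars/functions.groovy could look like (the function name matches the snippet above, the body is hypothetical):
// vars/functions.groovy in the 'mylibrary' shared library
def myfunc() {
    // whatever is computed here ends up in ENV_VAR via the environment {} block
    return 'computed-value'
}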

Cannot define variable in pipeline stage

I'm trying to create a declarative Jenkins pipeline script but having issues with simple variable declaration.
Here is my script:
pipeline {
agent none
stages {
stage("first") {
def foo = "foo" // fails with "WorkflowScript: 5: Expected a step # line 5, column 13."
sh "echo ${foo}"
}
}
}
However, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: Expected a step # line 5, column 13.
def foo = "foo"
^
I'm on Jenkins 2.7.4 and Pipeline 2.4.
The Declarative model for Jenkins Pipelines has a restricted subset of syntax that it allows in the stage blocks - see the syntax guide for more info. You can bypass that restriction by wrapping your steps in a script { ... } block, but as a result, you'll lose validation of syntax, parameters, etc within the script block.
I think the error is not coming from the specified line but from the first 3 lines. Try this instead:
node {
stage("first") {
def foo = "foo"
sh "echo ${foo}"
}
}
I think you had some extra lines that are not valid...
From the declarative pipeline model documentation, it seems that you have to use an environment declaration block to declare your variables, e.g.:
pipeline {
environment {
FOO = "foo"
}
agent none
stages {
stage("first") {
steps {
sh "echo ${FOO}"
}
}
}
}
Agreed with @Pom12, @abayer. To complete the answer, you need to add a script block.
Try something like this:
pipeline {
agent any
environment {
ENV_NAME = "${env.BRANCH_NAME}"
}
// ----------------
stages {
stage('Build Container') {
steps {
echo 'Building Container..'
script {
if (ENVIRONMENT_NAME == 'development') {
ENV_NAME = 'Development'
} else if (ENVIRONMENT_NAME == 'release') {
ENV_NAME = 'Production'
}
}
echo 'Building Branch: ' + env.BRANCH_NAME
echo 'Build Number: ' + env.BUILD_NUMBER
echo 'Building Environment: ' + ENV_NAME
echo "Running your service with environemnt ${ENV_NAME} now"
}
}
}
}
In Jenkins 2.138.3 there are two different types of pipelines.
Declarative and Scripted pipelines.
"Declarative pipelines is a new extension of the pipeline DSL (it is basically a pipeline script with only one step, a pipeline step with arguments (called directives), these directives should follow a specific syntax. The point of this new format is that it is more strict and therefore should be easier for those new to pipelines, allow for graphical editing and much more.
scripted pipelines is the fallback for advanced requirements."
jenkins pipeline: agent vs node?
Here is an example of using environment and global variables in a Declarative Pipeline. From what I can tell, environment variables are static after they are set.
def browser = 'Unknown'
pipeline {
agent any
environment {
//Use Pipeline Utility Steps plugin to read information from pom.xml into env variables
IMAGE = readMavenPom().getArtifactId()
VERSION = readMavenPom().getVersion()
}
stages {
stage('Example') {
steps {
script {
browser = sh(returnStdout: true, script: 'echo Chrome')
}
}
}
stage('SNAPSHOT') {
when {
expression {
return !env.JOB_NAME.equals("PROD") && !env.VERSION.contains("RELEASE")
}
}
steps {
echo "SNAPSHOT"
echo "${browser}"
}
}
stage('RELEASE') {
when {
expression {
return !env.JOB_NAME.equals("TEST") && !env.VERSION.contains("RELEASE")
}
}
steps {
echo "RELEASE"
echo "${browser}"
}
}
}//end of stages
}//end of pipeline
You are using a Declarative Pipeline which requires a script-step to execute Groovy code. This is a huge difference compared to the Scripted Pipeline where this is not necessary.
The official documentation says the following:
The script step takes a block of Scripted Pipeline and executes that
in the Declarative Pipeline.
pipeline {
agent none
stages {
stage("first") {
script {
def foo = "foo"
sh "echo ${foo}"
}
}
}
}
You can define the variable globally, but when using this variable you must write it inside a script block:
def foo="foo"
pipeline {
agent none
stages {
stage("first") {
script{
sh "echo ${foo}"
}
}
}
}
Try this declarative pipeline; it's working:
pipeline {
agent any
stages {
stage("first") {
steps{
script {
def foo = "foo"
sh "echo ${foo}"
}
}
}
}
}
