(edited/updated from original post to attempt to address confusion about what the problem is)
The problem is: Values that are set in a Jenkinsfile environment section are not added to the object returned by env.getEnvironment()
The question is: How do I get a map of the complete environment, including values that were assigned in the environment section? Because env.getEnvironment() doesn't do that.
Example Jenkinsfile:
pipeline {
    agent any
    environment {
        // this is not included in env.getEnvironment()
        ONE = '1'
    }
    stages {
        stage('Init') {
            steps {
                script {
                    // this is included in env.getEnvironment()
                    env['TWO'] = '2'
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    // get env values as a map (for passing to groovy methods)
                    def envObject = env.getEnvironment()
                    // see what env.getEnvironment() looks like
                    // notice ONE is not present in the output, but TWO is
                    // ONE is set using ONE = '1' in the environment section above
                    // TWO is set using env['TWO'] = '2' in the Init stage above
                    println envObject.toString()
                    // for good measure loop through the env.getEnvironment() map
                    // and print any value(s) named ONE or TWO
                    // only TWO: 2 is output
                    envObject.each { k, v ->
                        if (k == 'ONE' || k == 'TWO') {
                            println "${k}: ${v}"
                        }
                    }
                    // now show that both ONE and TWO are indeed in the environment
                    // by shelling out and using the env linux command
                    // this outputs ONE=1 and TWO=2
                    sh 'env | grep -E "ONE|TWO"'
                }
            }
        }
    }
}
Output (output of envObject.toString() shortened to ... except relevant part):
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Init)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
[..., TWO:2]
[Pipeline] echo
TWO: 2
[Pipeline] sh
+ env
+ grep -E ONE|TWO
ONE=1
TWO=2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Notice ONE is missing from the env.getEnvironment() object, but TWO is present.
Also notice that both ONE and TWO are set in the actual environment, and I am not asking how to access the environment or how to iterate through the values returned by env.getEnvironment(). The issue is that env.getEnvironment() does not return all the values in the environment; it excludes any values that were set inside the environment section of the Jenkinsfile.
I don't have a "why" answer for you, but you can cheat and get a map by parsing the output from env via the readProperties step.
def envMap = readProperties(text: sh(script: 'env', returnStdout: true))
println(envMap.getClass())
println("${envMap}")
I would get the env output and convert it to a map with the help of readProperties:
pipeline {
    agent any
    environment {
        // this is not included in env.getEnvironment()
        ONE = '1'
    }
    stages {
        stage('Init') {
            steps {
                script {
                    // this is included in env.getEnvironment()
                    env['TWO'] = '2'
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    def envProp = readProperties text: sh(script: 'env', returnStdout: true).trim()
                    Map envMapFromProp = envProp as Map
                    echo "ONE=${envMapFromProp.ONE}\nTWO=${envMapFromProp.TWO}"
                    // now show that both ONE and TWO are indeed in the environment
                    // by shelling out and using the env linux command
                    // this outputs ONE=1 and TWO=2
                    sh 'env | grep -E "ONE|TWO"'
                }
            }
        }
    }
}
The env.getEnvironment() method will not return a plain List or Map, hence it's difficult to iterate with each, but there are some workarounds you can use to make this work.
import groovy.json.JsonSlurper

pipeline {
    agent any
    environment {
        ONE = 1
        TWO = 2
    }
    stages {
        stage('debug') {
            steps {
                script {
                    def jsonSlurper = new JsonSlurper()
                    def object = jsonSlurper.parseText(env.getEnvironment().toString())
                    assert object instanceof Map
                    object.each { k, v ->
                        echo "Key: ${k}, Value: ${v}"
                    }
                }
            }
        }
    }
}
Note - env.getEnvironment().toString() will give you a JSON-like String. While parsing that string, jsonSlurper.parseText will throw an error if it encounters any special characters.
You can also explore the env Jenkins API a little and find an appropriate method that returns either a Map or a List so that you can use each.
Related
I want to use some common value across different conditions in post section of pipeline hence I tried following -
1.
post {
    script {
        def variable = "<some dynamic value here>"
    }
    failure {
        script {
            "<use variable here>"
        }
    }
    success {
        script {
            "<use variable here>"
        }
    }
}
2.
post {
    def variable = "<some dynamic value here>"
    failure {
        script {
            "<use variable here>"
        }
    }
    success {
        script {
            "<use variable here>"
        }
    }
}
But both result in a compilation error.
Can you please suggest how I can declare a variable in post section which can be used across conditions?
You could use the always condition, which is guaranteed to be executed before any other condition like success or failure. If you want to store a String value, you can use an env variable to store it (environment variables always cast the given value to a string). Alternatively, you can define a global variable outside the pipeline and then initialize it with the expected dynamic value inside the always condition. Consider the following example:
def someGlobalVar

pipeline {
    agent any
    stages {
        stage("Test") {
            steps {
                echo "Test"
            }
        }
    }
    post {
        always {
            script {
                env.FOO = "bar"
                someGlobalVar = 23
            }
        }
        success {
            echo "FOO is ${env.FOO}"
            echo "someGlobalVar = ${someGlobalVar}"
        }
    }
}
Output:
Running on Jenkins in /home/wololock/.jenkins/workspace/pipeline-post-sections
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
Test
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] echo
FOO is bar
[Pipeline] echo
someGlobalVar = 23
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
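A small aside on the String cast mentioned above, as a hedged sketch (COUNT is a made-up name): anything assigned to env becomes a String, so arithmetic on it turns into string concatenation, while a global variable keeps its original type:
    post {
        always {
            script {
                env.COUNT = 42          // stored as the String "42"
                someGlobalVar = 42      // the global declared above keeps its Integer type
            }
        }
        success {
            echo "env.COUNT + 1 = ${env.COUNT + 1}"           // prints 421 (string concatenation)
            echo "someGlobalVar + 1 = ${someGlobalVar + 1}"   // prints 43 (integer addition)
        }
    }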
My project has many variables in common with many other projects, so I use a Jenkins Shared Library and created a vars/my_vars.groovy file where I define my variables and return a Map of them:
class my_vars {
    static Map varMap = [:]

    static def loadVars(Map config) {
        varMap.var1 = "val1"
        varMap.var2 = "val2"
        // Many more variables ...
        return varMap
    }
}
I load the Shared Library in my Jenkinsfile and call the function in the environment block, as I want those variables to be environment variables.
Jenkinsfile:
pipeline {
    environment {
        // initialize common vars
        common_vars = my_vars.loadVars()
    } // environment
    stages {
        stage('Some Stage') {
            // ...
        }
    }
    post {
        always {
            script {
                // Print environment variables
                sh "env"
            } // script
        } // always
    } // post
} // pipeline
The thing is that the environment block takes KEY=VALUE pairs, so my common_vars map is stored as a String value (I can see that in the sh "env" output).
...
vars=[var1:val1, var2:val2]
...
What is the correct way to declare those values as an environment variables?
My target to get this:
...
var1=val1
var2=val2
...
Pipeline environment variables store only String values. That is why, when you assign a map to the env.common_vars variable, it stores the map.toString() equivalent.
If you want to rewrite key-values from a map to the environment variables, you can iterate over the variables map and assign each k-v pair to something like env."$k" = v. You can do that by calling a class method inside the environment block - that way you can be sure that the environment variables are assigned no matter which stage your pipeline gets restarted from. Consider the following example:
class MyVars {
    private Map config = [
        var1: "val1",
        var2: "val2"
    ]

    String initializeEnvironmentVariables(final Script script) {
        config.each { k, v ->
            script.env."$k" = v
        }
        return "Initialization of env variables completed!"
    }
}

pipeline {
    agent any
    environment {
        INITIALIZE_ENV_VARIABLES_FROM_MAP = "${new MyVars().initializeEnvironmentVariables(this)}"
    }
    stages {
        stage("Some stage") {
            steps {
                echo "env.var1 = ${env.var1}"
            }
        }
    }
    post {
        always {
            script {
                sh 'printenv | grep "var[0-9]\\+"'
            }
        }
    }
}
In this example, we use the MyVars class to store some global config map (it could be part of a shared library; here, for simplicity, it is part of the Jenkinsfile). We use the INITIALIZE_ENV_VARIABLES_FROM_MAP environment variable assignment to call the MyVars.initializeEnvironmentVariables(this) method, which can access env through the script parameter. Calling this method from inside the environment block has one significant benefit - it guarantees that the environment variables will be initialized even if you restart the pipeline from any stage.
And here is the output of this exemplary pipeline:
Running on Jenkins in /home/wololock/.jenkins/workspace/pipeline-env-map
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Some stage)
[Pipeline] echo
env.var1 = val1
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ grep 'var[0-9]\+'
+ printenv
var1=val1
var2=val2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
As you can see, it sets env.var1 and env.var2 from the map encapsulated in the MyVars class. Both variables can be accessed inside pipeline steps, script blocks, or even as shell environment variables.
As far as I know there is no easy way to do this in declarative pipeline (e.g. in the environment directive). Instead, what you can do is set up the environment outside of the declarative definition, like this:
my_vars.loadVars().each { key, value ->
    env[key] = value
}

// Followed by your pipeline definition:
pipeline {
    stages {
        stage('Some Stage') {
            // ...
        }
    }
    // ...
} // pipeline
As a full example:
class my_vars {
    static Map varMap = [:]

    static def loadVars(Map config) {
        varMap.var1 = "val1"
        varMap.var2 = "val2"
        // Many more variables ...
        return varMap
    }
}

my_vars.loadVars().each { key, value ->
    env[key] = value
}

pipeline {
    agent any
    stages {
        stage("Some stage") {
            steps {
                echo "env.var1 = ${env.var1}"
            }
        }
    }
}
Which outputs the following when built:
Started by user xxx
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on yyy in /zzz
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Some stage)
[Pipeline] echo
env.var1 = val1
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Edit: If your class (my_vars) is located in a shared library (MySharedLibrary):
library 'MySharedLibrary' // Will load vars/my_vars.groovy

my_vars.loadVars().each { key, value ->
    env[key] = value
}

pipeline {
    agent any
    stages {
        stage("Some stage") {
            steps {
                echo "env.var1 = ${env.var1}"
            }
        }
    }
}
You don't have to return a map of your environment variables from your shared library. You can simply set them in a shared library method; the method will run in the same context as your pipeline.
In your shared library's vars/ directory:
def setVars() {
    env.var1 = "var1"
    env.var2 = "var2"
    env.var3 = "var3"
}
In your pipeline:
pipeline {
    agent any
    stages {
        stage("Setup") {
            steps {
                script {
                    imported_shared_lib.setVars()
                }
            }
        }
    }
}
Others mentioned the need to preserve the environment variables even if you restart the pipeline from a certain stage. In my experiments, the variables are preserved using this method, even if the setVars() method is not called in the environment{} block.
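If you still want the call to run even when the pipeline is restarted from a later stage, a hedged variant of the earlier INITIALIZE_ENV_VARIABLES_FROM_MAP trick should also work with a shared-library step; imported_shared_lib and VARS_INITIALIZED are placeholder names, not a documented API:
    pipeline {
        agent any
        environment {
            // assumption: evaluating this GString calls setVars() whenever the
            // environment block is (re)evaluated, mirroring the earlier answer
            VARS_INITIALIZED = "${imported_shared_lib.setVars() ?: 'done'}"
        }
        stages {
            stage("Use") {
                steps {
                    echo "env.var1 = ${env.var1}"
                }
            }
        }
    }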
I've inherited a Jenkins pipeline and am trying to improve it. Jenkins and Groovy are quite fresh topics for me, so most probably I'm doing something wrong.
I'm using Jenkins ver. 2.121.3
The main aim was to add a build parameter to do some extra cleaning during the build. So I've added a parameter CLEAN_FIRST of Boolean type with default value false to the job configuration and did something like this in the pipeline:
// CLEAN_FIRST = false
// def prefix = CLEAN_FIRST ? "" : "REM"
pipeline {
    agent none
    stages {
        stage('Some step') {
            steps {
                script {
                    node('master') {
                        cleanWs()
                        try {
                            def prefix = CLEAN_FIRST ? "" : "REM"
                            echo "CLEAN_FIRST=$CLEAN_FIRST prefix=$prefix"
                            bat (label: 'build third party',
                                 script: """
                                     $prefix call cleanSomthing.bat
                                     call doOtherStuff.bat
                                 """)
                        } finally {
                            echo "some stuff"
                        }
                    } // node
                } // script
            } // steps
        } // stage
    } // stages
} // pipeline
Now this doesn't work as expected. "REM" prefix is not added.
Echo prints:
CLEAN_FIRST=false prefix=
And bat invokes cleanSomthing.bat which I wish to avoid (to save on build times).
I've tried to make prefix global, but with the same result.
Most probably this is caused by some evaluation order or scoping issue, but I can't put my finger on it.
Can someone give me a clue why it doesn't work? How to fix it?
Answered own question. Is this problem fixed in some version of Jenkins?
replace
def prefix = CLEAN_FIRST ? "" : "REM"
with
def prefix = params.CLEAN_FIRST ? "" : "REM"
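If you prefer to declare the parameter in the Jenkinsfile itself rather than in the job configuration, a minimal sketch (not the asker's full pipeline) could look like this; booleanParam keeps the value a real Boolean in params:
    pipeline {
        agent none
        parameters {
            booleanParam(name: 'CLEAN_FIRST', defaultValue: false, description: 'Run extra cleaning before build')
        }
        stages {
            stage('Some step') {
                steps {
                    script {
                        def prefix = params.CLEAN_FIRST ? "" : "REM"
                        echo "CLEAN_FIRST=${params.CLEAN_FIRST} prefix=${prefix}"
                    }
                }
            }
        }
    }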
OK, I've found the source of the problem. It is a bit funny.
When running this pipeline (tested on a Mac machine since it had an empty job queue):
pipeline {
    agent none
    stages {
        stage('Some step') {
            steps {
                script {
                    node('Mac') {
                        cleanWs()
                        try {
                            def logic = true
                            def prefix = CLEAN_FIRST ? "Ole" : "REM"
                            def typeLogic = logic.getClass()
                            def typeParam = CLEAN_FIRST.getClass()
                            echo "typeLogic=$typeLogic typeParam=$typeParam"
                            echo "CLEAN_FIRST=$CLEAN_FIRST prefix=$prefix"
                            sh (script: """
                                echo prefix=$prefix
                            """)
                        } finally {
                            echo "some stuff"
                        }
                    } // node
                } // script
            } // steps
        } // stage
    } // stages
} // pipeline
I've got this outcome:
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] stage
[Pipeline] { (Some step)
[Pipeline] script
[Pipeline] {
[Pipeline] node
Running on master in /Users/builder/jenkins/workspace/EIbuild_MacOS
[Pipeline] {
[Pipeline] cleanWs
[WS-CLEANUP] Deleting project workspace...[WS-CLEANUP] done
[Pipeline] echo
typeLogic=class java.lang.Boolean typeParam=class java.lang.String
[Pipeline] echo
CLEAN_FIRST=false prefix=Ole
[Pipeline] sh
[EIbuild_MacOS] Running shell script
+ echo prefix=Ole
prefix=Ole
[Pipeline] echo
some stuff
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] End of Pipeline
Finished: SUCCESS
So now the source of the problem is obvious.
Jenkins promises a variable of type Boolean in the job configuration, but in fact provides a String with the value "true" or "false", which always evaluates as true when used as a condition, since both values are non-empty strings :).
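If you are stuck with the String value (for example, the parameter only exists in the job configuration), a hedged workaround is to convert it explicitly; Groovy's String.toBoolean() returns true only for "true" (ignoring case):
    // sketch: CLEAN_FIRST arrives here as the String "true" or "false"
    def cleanFirst = CLEAN_FIRST.toString().toBoolean()
    def prefix = cleanFirst ? "" : "REM"
    echo "CLEAN_FIRST=$CLEAN_FIRST cleanFirst=$cleanFirst prefix=$prefix"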
My question is similar to this one about how to load an external groovy script and then call a method from it in a different groovy script. So far I have been able to get methods that don't return a value to work, but I am having trouble getting a returned value into a variable in the calling script.
For example, the following pipeline code works but gives a value of null for $build_user when I run the Jenkins pipeline. It doesn't actually return what I expect it to and I don't know why.
node {
    stage('test') {
        def tools = load "/var/lib/jenkins/workflow-libs/vars/tools.groovy"
        build_user = tools.get_user()
        echo "build_user: $build_user"
    }
}
Here is what the relevant tools.groovy looks like.
def exampleMethod() {
    // Do stuff
}

// Try to get a build username
def get_user() {
    try {
        wrap([$class: 'BuildUser']) {
            // Set up our variables
            fallback_user = 'GitHub'
            github_user = BUILD_USER
            commit_author = 'Test1'
            // Try to use Jenkins build user first
            if (github_user) {
                echo "using github_user: $github_user"
                return github_user
            }
            // Otherwise try to use commit author
            else if (commit_author) {
                echo "using commit_author: $commit_author"
                return commit_author
            }
            // Otherwise username is blank so we use the default fallback
            else {
                echo "using fallback: $fallback_user"
                return fallback_user
            }
        }
    }
    catch (err) {
        // Ignore errors
    }
    echo "Done."
}

return this
Here is the full Jenkins output for the above code.
Started by user XXX
[Pipeline] node
Running on master in /var/lib/jenkins/workspace/test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] load
[Pipeline] { (/var/lib/jenkins/workflow-libs/vars/tools.groovy)
[Pipeline] }
[Pipeline] // load
[Pipeline] wrap
[Pipeline] {
[Pipeline] echo
using github_user: XXX
[Pipeline] }
[Pipeline] // wrap
[Pipeline] echo
Done.
[Pipeline] echo
build_user: null
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
The above method doesn't work at all if I remove return this at the end; it throws the following error in Jenkins.
java.lang.NullPointerException: Cannot invoke method get_user() on
null object
...
What am I doing wrong? I suspect that I'm missing something easy but I'm not great with Groovy, so I'm not sure what it could be.
You have to end your tools.groovy with return this.
See the answer on this question How do you load a groovy file and execute it
Your function get_user() returns nothing. The return(s) inside wrap([$class: 'BuildUser']) {...} return from the wrap closure and not from your function.
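A minimal sketch of one way around it, reusing the BuildUser wrapper from the question: assign the value to a local variable inside the wrap closure and return it from the method itself:
    // sketch of tools.groovy - capture the value inside the closure, return it from the method
    def get_user() {
        def user = 'GitHub'                  // fallback
        try {
            wrap([$class: 'BuildUser']) {
                if (BUILD_USER) {
                    user = BUILD_USER        // assignment, not return, inside the closure
                }
            }
        } catch (err) {
            // ignore errors and keep the fallback
        }
        return user                          // this return belongs to get_user()
    }
    return this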
Despite following this answer and others, I am unable to successfully use a local groovy file in my Jenkinsfile (both are in the same repository).
def deployer = null
...
...
...
pipeline {
    agent {
        label 'cf_slave'
    }
    options {
        skipDefaultCheckout()
        disableConcurrentBuilds()
    }
    stages {
        stage ("Checkout SCM") {
            steps {
                checkout scm
            }
        }
        ...
        ...
        ...
        stage ("Publish CF app") {
            steps {
                script {
                    STAGE_NAME = "Publish CF app"
                    deployer = fileLoader.load ('deployer')
                    withCredentials(...) {
                        if (BRANCH_NAME == "develop") {
                            ...
                            ...
                            ...
                        } else {
                            deployer.generateManifest()
                        }
                    }
                }
            }
        }
    ...
    ...
}
deployer.groovy:
#!/usr/bin/env groovy

def generateManifest() {
    sh "..."
    echo "..."
}
In the console log (stack):
[Pipeline] stage
[Pipeline] { (Publish CF app)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
before loading groovy file
[Pipeline] echo
Loading from deployer.groovy
[Pipeline] load
[Pipeline] // load
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
Update:
It seems the problem was not with loading the file but rather with the contents of the file, where I execute the following which apparently does not play well:
sh "node $(pwd)/config/mustacher manifest.template.yml config/environments/common.json config/environments/someFile.json"
echo "..."
When only the echo is there, this is the stack.
So neither the sh "node ..." nor the echo works. Even changing it to just sh "pwd" fails as well. What could it be? The syntax in the file? The way it is called in the pipeline?
If I make the same node call in the pipeline (for example in the withCredentials if statement), it works.
Add a return this to the bottom of the deployer.groovy file, and then change your load step to use a relative path and the groovy extension, like load('deployer.groovy').
The return this is documented on jenkins.io:
Takes a filename in the workspace and runs it as Groovy source text.
The loaded file can contain statements at top level or just load and run a closure. For example:
def pipeline
node('slave') {
    pipeline = load 'pipeline.groovy'
    pipeline.functionA()
}
pipeline.functionB()

pipeline.groovy:

def pipelineMethod() {
    ...code
}
return this
Where pipeline.groovy defines functionA and functionB functions (among others) before ending with return this
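Applied to the question above, a minimal sketch of deployer.groovy would keep the original (elided) sh/echo body and just add the trailing return this; the load call in the pipeline then becomes deployer = load 'deployer.groovy' instead of fileLoader.load('deployer'):
    #!/usr/bin/env groovy

    def generateManifest() {
        sh "..."       // the elided shell command from the original file
        echo "..."
    }

    return this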