I have a Jenkins pipeline that basically reads a JSON file and prints out some values from it.
import groovy.json.JsonSlurper
import groovy.json.JsonSlurperClassic
pipeline {
agent any
stages {
stage('test') {
steps {
script {
def environment = new JsonSlurperClassic().parseText('''
{
"list": [
{
"name": "service1"
},
{
"name": "service2"
},
{
"name": "NotAService"
},
{
"name": "AnotherDummyService-mock",
}
]
}
'''
)
def forLoopBuilders = [:]
for (artifact in environment.list) {
if (!artifact.name.contains("-mock")) {
println("Before parallel, the value is ${artifact.name}")
forLoopBuilders[artifact.name] = { println(artifact.name) }
}
}
parallel forLoopBuilders
def closureBuilders = [:]
environment.list.each { artifact ->
if (!artifact.name.contains("-mock")) {
println("Before parallel, the value is ${artifact.name}")
closureBuilders[artifact.name] = { println(artifact.name) }
}
}
parallel closureBuilders
}
}
}
}
}
@NonCPS
def jsonParse(def json) {
new groovy.json.JsonSlurperClassic().parseText(json)
}
The builders maps (forLoopBuilders and closureBuilders) are how I store the stages that will run in parallel. Basically each map looks like
[stageA: what to do in stageA, anotherStage: what to do in anotherStage]
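For illustration, a stripped-down version of that pattern (dummy stage names and bodies) would be something like:
def builders = [
    stageA: { echo 'what to do in stageA' },
    anotherStage: { echo 'what to do in anotherStage' }
]
parallel builders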
The output is as below
10:20:53 Before parallel, the value is service1
10:20:53 [Pipeline] echo
10:20:53 Before parallel, the value is service2
10:20:53 [Pipeline] echo
10:20:53 Before parallel, the value is NotAService
10:20:53 [Pipeline] parallel
10:20:53 [Pipeline] { (Branch: service1)
10:20:53 [Pipeline] { (Branch: service2)
10:20:53 [Pipeline] { (Branch: NotAService)
10:20:53 [Pipeline] echo
10:20:53 AnotherDummyService-mock
10:20:53 [Pipeline] }
10:20:53 [Pipeline] echo
10:20:53 AnotherDummyService-mock
10:20:53 [Pipeline] }
10:20:53 [Pipeline] echo
10:20:53 AnotherDummyService-mock
10:20:53 [Pipeline] // parallel
10:20:53 [Pipeline] echo
10:20:53 Before parallel, the value is service1
10:20:53 [Pipeline] echo
10:20:54 Before parallel, the value is service2
10:20:54 [Pipeline] echo
10:20:54 Before parallel, the value is NotAService
10:20:55 [Pipeline] parallel
10:20:55 [Pipeline] { (Branch: service1)
10:20:55 [Pipeline] { (Branch: service2)
10:20:55 [Pipeline] { (Branch: NotAService)
10:20:55 [Pipeline] echo
10:20:55 service1
10:20:55 [Pipeline] }
10:20:55 [Pipeline] echo
10:20:55 service2
10:20:55 [Pipeline] }
10:20:55 [Pipeline] echo
10:20:55 NotAService
As you can see, the outputs from the two parallel blocks are different. Why is that?
What I want is for the output of parallel forLoopBuilders to match the output of parallel closureBuilders.
This seems to be the result of how closures inside for loops capture the loop variable, artifact in your case.
There is only a single loop variable that keeps being re-assigned, so all the closures capture that one variable; once the loop ends, that variable holds the last value assigned to it, hence every closure sees only that value when it runs later.
You can see how that behaves by running this simple example in pure Groovy:
def map = [
'Service1': 's1',
'Service2': 's2'
]
def closures = []
for (entry in map.entrySet()) {
closures << { println "ENTRY: $entry" }
}
closures*.call()
This will print:
ENTRY: Service2=s2
ENTRY: Service2=s2
I.e. the closures capture the last value of the entry variable.
The each { ... } construct behaves differently: its closure parameter is a fresh variable on every iteration, so each inner closure captures its own value. If you replace the for loop with each { ... }, it works:
def map = [
'Service1': 's1',
'Service2': 's2'
]
def closures = []
map.entrySet().each { entry ->
closures << { println "ENTRY: $entry" }
}
closures*.call()
Prints:
ENTRY: Service1=s1
ENTRY: Service2=s2
In your case, just use each { } and it should do what you want.
EDIT
If you insist on using the for loop, do what you would do in Java and assign the current value to a new local variable inside the loop.
The following code has the same result as the one using each:
def map = [
'Service1': 's1',
'Service2': 's2'
]
def closures = []
for (entry in map.entrySet()) {
def value = entry
closures << { println "ENTRY: $value" }
}
closures*.call()
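Applied to the Jenkinsfile from your question, a minimal sketch of the fixed for loop could look like this (same filtering, just copying the current value into a variable local to the iteration):
def forLoopBuilders = [:]
for (artifact in environment.list) {
    if (!artifact.name.contains("-mock")) {
        // copy the current value into a fresh variable for this iteration
        def name = artifact.name
        println("Before parallel, the value is ${name}")
        // the closure captures 'name', which is never re-assigned afterwards
        forLoopBuilders[name] = { println(name) }
    }
}
parallel forLoopBuilders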
I'm new to Groovy and can't figure out what's wrong here.
Depending on the input choice, I expect the script to execute either the 'Hello' or the 'Bye' stage, but it skips both. I mostly oriented myself on this: Jenkins pipeline conditional stage using "When" for choice parameters, but I still can't figure it out.
How can I use these choice parameters correctly?
pipeline {
agent any
stages {
stage('Init') {
steps('Log-in'){
echo 'Log-in'
}
}
stage('Manual Step') {
input {
message "Hello or Goodbye?"
ok "Say!"
parameters{choice(choices:['Hello','Bye'], description: 'Users Choice', name: 'CHOICE')}
}
steps('Input'){
echo "choice: ${CHOICE}"
echo "choice params.: " + params.CHOICE //null
echo "choice env: " + env.CHOICE //Hello
}
}
stage('Hello') {
when{ expression {env.CHOICE == 'Hello'}}
steps('Execute'){
echo 'Say Hello'
}
}
stage('Bye') {
when{ expression {env.CHOICE == 'Bye'}}
steps('Execute'){
echo 'Say Bye'
}
}
}
}
Output:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Init)
[Pipeline] echo
Log-in
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Manual Step)
[Pipeline] input
Input requested
Approved by Admin
[Pipeline] withEnv
[Pipeline] {
[Pipeline] echo
choice: Hello
[Pipeline] echo
choice params.: null
[Pipeline] echo
choice env: Hello
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Hello)
Stage "Hello" skipped due to when conditional
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Bye)
Stage "Bye" skipped due to when conditional
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
From the docs:
Any parameters provided as part of the input submission will be available in the environment for the rest of the stage.
This means that your parameter CHOICE does not exist in the other stages. If you want a parameter that is available in all stages, define it at the pipeline level instead, e.g.:
pipeline {
agent any
parameters {
choice(choices:['Hello','Bye'], description: 'Users Choice', name: 'CHOICE')
}
stages {
stage('Init') {
steps('Log-in') {
echo 'Log-in'
}
}
stage('Manual Step') {
steps('Input') {
echo "choice: ${CHOICE}"
echo "choice params.: " + params.CHOICE
echo "choice env: " + env.CHOICE
}
}
stage('Hello') {
when {
expression { env.CHOICE == 'Hello' }
}
steps('Execute') {
echo 'Say Hello'
}
}
stage('Bye') {
when {
expression {env.CHOICE == 'Bye'}
}
steps('Execute'){
echo 'Say Bye'
}
}
}
}
This will behave as expected. The difference is that the job won't ask you for input; instead, you provide the desired parameters before pressing Build.
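If you do want to keep the manual prompt, one possible workaround (just a sketch, not something from the quoted docs) is to call input as a regular step inside a script block and copy the returned value into env yourself, so that the later when conditions can read it:
stage('Manual Step') {
    steps {
        script {
            // with a single parameter, the input step returns the chosen value directly
            env.CHOICE = input(message: 'Hello or Goodbye?',
                               ok: 'Say!',
                               parameters: [choice(choices: ['Hello', 'Bye'],
                                                   description: 'Users Choice',
                                                   name: 'CHOICE')])
        }
    }
}
The Hello and Bye stages can then keep their when { expression { env.CHOICE == ... } } checks unchanged.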
(edited/updated from original post to attempt to address confusion about what the problem is)
The problem is: Values that are set in a Jenkinsfile environment section are not added to the object returned by env.getEnvironment()
The question is: How do I get a map of the complete environment, including values that were assigned in the environment section? Because env.getEnvironment() doesn't do that.
Example Jenkinsfile:
pipeline {
agent any
environment {
// this is not included in env.getEnvironment()
ONE = '1'
}
stages {
stage('Init') {
steps {
script {
// this is included in env.getEnvironment()
env['TWO'] = '2'
}
}
}
stage('Test') {
steps {
script {
// get env values as a map (for passing to groovy methods)
def envObject = env.getEnvironment()
// see what env.getEnvironment() looks like
// notice ONE is not present in the output, but TWO is
// ONE is set using ONE = '1' in the environment section above
// TWO is set using env['TWO'] = '2' in the Init stage above
println envObject.toString()
// for good measure loop through the env.getEnvironment() map
// and print any value(s) named ONE or TWO
// only TWO: 2 is output
envObject.each { k,v ->
if (k == 'ONE' || k == 'TWO') {
println "${k}: ${v}"
}
}
// now show that both ONE and TWO are indeed in the environment
// by shelling out and using the env linux command
// this outputs ONE=1 and TWO=2
sh 'env | grep -E "ONE|TWO"'
}
}
}
}
}
Output (output of envObject.toString() shortened to ... except relevant part):
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Init)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
[..., TWO:2]
[Pipeline] echo
TWO: 2
[Pipeline] sh
+ env
+ grep -E ONE|TWO
ONE=1
TWO=2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Notice ONE is missing from the env.getEnvironment() object, but TWO is present.
Also notice that both ONE and TWO are set in the actual environment and I am not asking how to access the environment or how to iterate through the values returned by env.getEnvironment(). The issue is that env.getEnvironment() does not return all the values in the environment, it excludes any values that were set inside the environment section of the Jenkinsfile.
I don't have a "why" answer for you, but you can cheat and get a map by parsing the output from env via the readProperties step.
def envMap = readProperties(text: sh(script: 'env', returnStdout: true))
println(envMap.getClass())
println("${envMap}")
I would get the env output and convert it to a map with the help of readProperties:
pipeline {
agent any
environment {
// this is not included in env.getEnvironment()
ONE = '1'
}
stages {
stage('Init') {
steps {
script {
// this is included in env.getEnvironment()
env['TWO'] = '2'
}
}
}
stage('Test') {
steps {
script {
def envProp = readProperties text: sh (script: "env", returnStdout: true).trim()
Map envMapFromProp = envProp as Map
echo "ONE=${envMapFromProp.ONE}\nTWO=${envMapFromProp.TWO}"
// now show that both ONE and TWO are indeed in the environment
// by shelling out and using the env linux command
// this outputs ONE=1 and TWO=2
sh 'env | grep -E "ONE|TWO"'
}
}
}
}
}
The env.getEnvironment() method does not return a plain List or Map, so it is awkward to iterate over with each, but there are a few workarounds you can use to make this work.
import groovy.json.JsonSlurper
pipeline {
agent any;
environment {
ONE = 1
TWO = 2
}
stages {
stage('debug') {
steps {
script {
def jsonSlurper = new JsonSlurper()
def object = jsonSlurper.parseText(env.getEnvironment().toString())
assert object instanceof Map
object.each { k,v ->
echo "Key: ${k}, Value: ${v}"
}
}
}
}
}
}
Note: env.getEnvironment().toString() gives you a JSON-like string. While parsing that string, jsonSlurper.parseText will throw an error if it runs into any special characters.
You can also explore the env Jenkins API a little and look for a method that returns a Map or a List directly, so that you can use each.
I've got a job with multiple parallel branches, and each branch contains numerous stages.
def build_jobs = [:]
build_jobs['1'] = {
stage ('1A'){}
stage ('1B'){}
}
build_jobs['2'] = {
stage ('2A'){}
stage ('2B'){}
}
build_jobs['3'] = {
stage ('3A'){}
stage ('3B'){}
}
parallel build_jobs
Is there a way to know, during the run, which parallel branch a stage is executing in?
For example:
1A -> 1
1B -> 1
2B -> 2
Thanks
You can get this information from the Jenkins Blue Ocean REST API at ${JENKINS_URL}blue/rest/organizations/jenkins/pipelines/${JOB_NAME}/runs/${BUILD_ID}/nodes/. This returns JSON that you can iterate over until you find a node with the same name as your ${STAGE_NAME}.
Here's a little example:
pipeline {
agent { label 'master' }
stages {
stage('doing something') {
steps {
script {
def build_jobs = [:]
build_jobs['1'] = {
stage ('1A'){ script { getParent() } }
stage ('1B'){}
}
build_jobs['2'] = {
stage ('2A'){}
stage ('2B'){ script { getParent() } }
}
build_jobs['3'] = {
stage ('3A'){}
stage ('3B'){}
}
parallel build_jobs
}
}
}
}
}
def getParent() {
def URL = "${JENKINS_URL}blue/rest/organizations/jenkins/pipelines/${JOB_NAME}/runs/${BUILD_ID}/nodes/"
def output = sh (returnStdout: true,
script: "curl -s $URL")
def parsed_data = readJSON text: output
def my_parent_id = null
for (stage in parsed_data) {
def displayName = stage.displayName
if (displayName == env.STAGE_NAME) {
my_parent_id = stage.firstParent
break
}
}
for (stage in parsed_data) {
def my_id = stage.id
if (my_id == my_parent_id) {
println "FOUND! The stage ${STAGE_NAME} is running under a parent: ${stage.displayName}"
return
}
}
}
Output:
[Pipeline] }
+ curl -s https://<jenkins>/blue/rest/organizations/jenkins/pipelines/test/runs/3108/nodes/
[Pipeline] readJSON
[Pipeline] echo
FOUND! The stage 1A is running under a parent: 1
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (1B)
+ curl -s https://<jenkins>/blue/rest/organizations/jenkins/pipelines/test/runs/3108/nodes/
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] readJSON
[Pipeline] echo
FOUND! The stage 2B is running under a parent: 2
[Pipeline] }
I want to use a common value across the different conditions in the post section of a pipeline, so I tried the following:
1.
post {
script {
def variable = "<some dynamic value here>"
}
failure{
script{
"<use variable here>"
}
}
success{
script{
"<use variable here>"
}
}
}
2.
post {
def variable = "<some dynamic value here>"
failure{
script{
"<use variable here>"
}
}
success{
script{
"<use variable here>"
}
}
}
But both attempts result in a compilation error.
Can you please suggest how I can declare a variable in the post section that can be used across its conditions?
You could use the always condition, which is guaranteed to be executed before any other condition such as success or failure. If you want to store a String value, you can use an env variable to store it (environment variables always cast the given value to a string). Alternatively, you can define a global variable outside the pipeline and then initialize it with the expected dynamic value inside the always condition. Consider the following example:
def someGlobalVar
pipeline {
agent any
stages {
stage("Test") {
steps {
echo "Test"
}
}
}
post {
always {
script {
env.FOO = "bar"
someGlobalVar = 23
}
}
success {
echo "FOO is ${env.FOO}"
echo "someGlobalVar = ${someGlobalVar}"
}
}
}
Output:
Running on Jenkins in /home/wololock/.jenkins/workspace/pipeline-post-sections
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
Test
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] echo
FOO is bar
[Pipeline] echo
someGlobalVar = 23
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
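Because always runs before the other post conditions, the same values are visible in failure as well; for example (a sketch reusing the same post section):
post {
    always {
        script {
            env.FOO = "bar"
            someGlobalVar = 23
        }
    }
    success {
        echo "FOO is ${env.FOO}"
        echo "someGlobalVar = ${someGlobalVar}"
    }
    failure {
        // always has already run at this point, so both values are set here as well
        echo "FOO is ${env.FOO}, someGlobalVar = ${someGlobalVar}"
    }
}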
I'm trying to pass a concatenated string of commit messages to a shell script via a Jenkins declarative pipeline. I can build the concatenated string, but I cannot figure out how to pass it to my shell script. Environment variables are readable in my shell script, but I cannot set the environment variable outside of my stages, since the stage is where I define my git connection, and setting it inside the stage does not update the environment variable that I use in my post section. How can I pass the value of changeString to my bash script (in success)?
pipeline {
agent any
environment {
CHANGE_STRING = 'Initial default value.'
}
stages {
stage('Build') {
environment {
CHANGE_STRING = 'This change is only available in this stage and not in my shell script'
}
steps {
echo 'Build stage'
git branch: 'develop',
credentialsId: 'blah',
url: 'blah.git'
sh """
npm install
"""
script{
MAX_MSG_LEN = 100
def changeString = ""
def changeLogSets = currentBuild.changeSets
for (int i = 0; i < changeLogSets.size(); i++) {
def entries = changeLogSets[i].items
for (int j = 0; j < entries.length; j++) {
def entry = entries[j]
truncated_msg = entry.msg.take(MAX_MSG_LEN)
changeString += " - ${truncated_msg} [${entry.author}]\n"
}
}
if (!changeString) {
changeString = " - No new changes"
}
//I would like to set CHANGE_STRING here
}
}
}
}
post {
success {
echo 'Successfull build'
sh """
bash /var/lib/jenkins/jobs/my-project/hooks/onsuccess
"""
}
}
}
If you want to export an environment variable from a script step and access it outside the current stage, you have to use a variable name that was not specified in a global or local environment {} block. Consider the following example:
pipeline {
agent any
environment {
IMMUTABLE_VARIABLE = 'my value'
}
stages {
stage('Build') {
steps {
script{
def random = new Random()
if (random.nextInt(2) == 1) {
env.CHANGE_STRING = "Lorem ipsum dolor sit amet"
} else {
env.CHANGE_STRING = "Foo Bar"
}
env.IMMUTABLE_VARIABLE = 'a new value'
echo "IMMUTABLE_VARIABLE = ${env.IMMUTABLE_VARIABLE}"
}
}
}
}
post {
success {
echo 'Successfull build'
sh '''
echo $CHANGE_STRING
echo "IMMUTABLE_VARIABLE = $IMMUTABLE_VARIABLE"
'''
}
}
}
This is just a simplified version of your pipeline script. When I run it, I see the following console output:
[Pipeline] node
Running on Jenkins in /var/jenkins_home/workspace/test-pipeline
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Build)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
IMMUTABLE_VARIABLE = my value
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] echo
Successfull build
[Pipeline] sh
[test-pipeline] Running shell script
+ echo Foo Bar
Foo Bar
+ echo IMMUTABLE_VARIABLE = my value
IMMUTABLE_VARIABLE = my value
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
The shell script in the post success block prints Foo Bar on the first line and IMMUTABLE_VARIABLE = my value on the second one. Also notice that even though I explicitly tried to override
env.IMMUTABLE_VARIABLE = 'a new value'
it had no effect, and when I did
echo "IMMUTABLE_VARIABLE = ${env.IMMUTABLE_VARIABLE}"
it simply echoed the initial value from the environment {} block:
IMMUTABLE_VARIABLE = my value
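Applied back to your original pipeline, a minimal sketch could look like the following. It uses a new name, CHANGE_LOG, that is not declared in any environment {} block, and passing the value to the hook script as a quoted argument is just one possible way to hand it over:
// inside the Build stage, after building changeString
script {
    MAX_MSG_LEN = 100
    def changeString = ""
    def changeLogSets = currentBuild.changeSets
    for (int i = 0; i < changeLogSets.size(); i++) {
        def entries = changeLogSets[i].items
        for (int j = 0; j < entries.length; j++) {
            def entry = entries[j]
            changeString += " - ${entry.msg.take(MAX_MSG_LEN)} [${entry.author}]\n"
        }
    }
    if (!changeString) {
        changeString = " - No new changes"
    }
    // CHANGE_LOG is not declared in any environment {} block, so it stays visible in post
    env.CHANGE_LOG = changeString
}

post {
    success {
        echo 'Successful build'
        // the shell expands $CHANGE_LOG itself because the Groovy string is single-quoted
        sh 'bash /var/lib/jenkins/jobs/my-project/hooks/onsuccess "$CHANGE_LOG"'
    }
}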
Hope it helps.