I have a set of static environment variables in the environment directive section of a declarative pipeline. These values are available to every stage in the pipeline.
I want the values to change based on an arbitrary condition.
Is there a way to do this?
pipeline {
    agent any
    environment {
        if ${params.condition} {
            var1 = '123'
            var2 = abc
        } else {
            var1 = '456'
            var2 = def
        }
    }
    stages {
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Looking for the same thing, I found a nice answer in another question.
Basically, the idea is to use the ternary conditional operator:
pipeline {
    agent any
    environment {
        var1 = "${params.condition == true ? "123" : "456"}"
        var2 = "${params.condition == true ? abc : def}"
    }
}
Note: keep in mind that, as the question is written (and my answer follows it), the numbers are Strings and the letters are variables.
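If you actually want plain strings for both, just quote the letters. A minimal sketch (assuming condition is a boolean parameter, so the == true comparison can also be dropped):
environment {
    var1 = "${params.condition ? '123' : '456'}"
    var2 = "${params.condition ? 'abc' : 'def'}"
}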
I would suggest creating an "Environment" stage and declaring your variables according to the condition you want, something like below:
pipeline {
    agent any
    environment {
        // Declare variables which will remain the same throughout the build
    }
    stages {
        stage('Environment') {
            agent { node { label 'master' } }
            steps {
                script {
                    // Write the condition for the variables which need to change
                    if (params.condition) {
                        env.var1 = '123'
                        env.var2 = abc
                    } else {
                        env.var1 = '456'
                        env.var2 = def
                    }
                    sh "printenv"
                }
            }
        }
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Suppose we want to use optional params for a downstream job when it is called from an upstream job, and default params when the downstream job is run by itself.
But we don't want to have "holder" params with default values in the downstream job for some reason.
This can be done via a Groovy function.
Upstream Jenkinsfile - the param CREDENTIALS_ID is passed downstream:
pipeline {
    agent any
    stages {
        stage('Trigger downstream') {
            steps {
                build job: "my_downstream_job_name",
                    parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
Downstream Jenkinsfile - if the param CREDENTIALS_ID is not passed from upstream, the function returns a default value:
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID
    } else {
        return "default_credentials_id"
    }
}
pipeline {
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
You can get another level of flexibility by using maps:
stage("set_env_vars") {
steps {
script {
def MY_MAP1 = [A: "123", B: "456", C: "789"]
def MY_MAP2 = [A: "abc", B: "def", C: "ghi"]
env.var1 = MY_MAP1."${env.switching_var}"
env.var2 = MY_MAP2."${env.switching_var}"
}
}
}
This way, more choices are possible.
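For example, switching_var could be fed from a parameter in an earlier step. A hypothetical usage sketch (the TARGET parameter name is an assumption):
stage("use_env_vars") {
    steps {
        script {
            // assuming an earlier step did: env.switching_var = params.TARGET  // 'A', 'B' or 'C'
            echo "var1 = ${env.var1}"  // e.g. '123' when switching_var == 'A'
            echo "var2 = ${env.var2}"  // e.g. 'abc' when switching_var == 'A'
        }
    }
}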
I'm using a Jenkins declarative pipeline and I want to make a conditional step depending on an environment variable, which is set according to the existence of a file.
So I just want to do something like this: if a Dockerfile exists, perform the next stage, else skip it.
To achieve this I tried:
pipeline {
    // ...
    stages {
        stage('Docker') {
            environment {
                IS_DOCKERFILE = fileExists 'Dockerfile'
            }
            when {
                environment name: 'IS_DOCKERFILE', value: true
            }
            stage('Build') {
                // ...
            }
        }
    }
}
Or:
pipeline {
    // ...
    stages {
        stage('Docker') {
            environment {
                IS_DOCKERFILE = fileExists 'Dockerfile'
            }
            when {
                expression {
                    env.IS_DOCKERFILE == true
                }
            }
            stage('Build') {
                // ...
            }
        }
    }
}
In both cases, the Dockerfile exists and is in the workspace. I also tried with strings ("true"), but every time the pipeline continues without executing the 'Build' stage.
Any suggestions?
This is because the expression:
IS_DOCKERFILE = fileExists 'Dockerfile'
creates the environment variable with the boolean value stored as a string:
$ set
IS_DOCKERFILE='false'
So the solution would be to use .toBoolean() like this:
environment {
    IS_DOCKERFILE = fileExists 'Dockerfile'
}
stages {
    stage("build docker image") {
        when {
            expression {
                env.IS_DOCKERFILE.toBoolean()
            }
        }
        steps {
            echo 'fileExists'
        }
    }
    stage("build libraries") {
        when {
            expression {
                !env.IS_DOCKERFILE.toBoolean()
            }
        }
        steps {
            echo 'fileNotExists'
        }
    }
}
As @Sergey already posted, the problem is that you're comparing a string to a boolean. See fileExists: Verify if file exists in workspace.
Besides his answer, you can compare directly to a string:
environment {
    IS_DOCKERFILE = fileExists 'Dockerfile'
}
stages {
    stage("build docker image") {
        when {
            expression { IS_DOCKERFILE == 'true' }
        }
        steps {
            echo 'fileExists'
        }
    }
    stage("build libraries") {
        when {
            expression { IS_DOCKERFILE == 'false' }
        }
        steps {
            echo 'fileNotExists'
        }
    }
}
I have this template:
def call(body) {
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent any
        ....
        stages {
            stage('My stages') {
                steps {
                    script {
                        pipelineParams.stagesParams.each { k, v ->
                            stage("$k") {
                                $v
                            }
                        }
                    }
                }
            }
        }
        post { ... }
    }
}
Then I use the template in a pipeline:
@Library('pipeline-library') _
pipelineTemplateBasic {
    stagesParams = [
        'First stage': sh "do something...",
        'Second stage': myCustomCommand("foo","bar")
    ]
}
In stagesParams I pass the invocations of my commands (sh and myCustomCommand) and they land in the template as $v. How can I then execute them? Some sort of InvokeMethod($v)?
At the moment I am getting this error:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
The problem with using node is that it doesn't work in situations like parallel:
parallelStages = [:]
v.each { k2, v2 ->
    parallelStages["$k2"] = {
        // node {
        stage("$k2") {
            notifySlackStartStage()
            $v2
            checkLog()
        }
        // }
    }
}
If you want to execute the sh step provided via a map, you need to store the map values as closures, e.g.:
@Library('pipeline-library') _
pipelineTemplateBasic {
    stagesParams = [
        'First stage': {
            sh "do something..."
        },
        'Second stage': {
            myCustomCommand("foo","bar")
        }
    ]
}
Then, in the script part of your pipeline stage, you will need to execute the closure, but also set the delegate and delegation strategy to the workflow script, e.g.:
script {
    pipelineParams.stagesParams.each { k, v ->
        stage("$k") {
            v.resolveStrategy = Closure.DELEGATE_FIRST
            v.delegate = this
            v.call()
        }
    }
}
I have an issue porting a scripted pipeline to a declarative one. In the scripted pipeline I used to have:
//Scripted
def myEnv = [:]
stage('Prepare my env') { [...] myEnv = ... }
stage('Fancy stuff') {
    node() {
        withEnv(myEnv) {
            // here use what is defined in myEnv
        }
    }
}
stage('Fancy stuff2') {
    node() {
        withEnv(myEnv) {
            // here use what is defined in myEnv
        }
    }
}
And now in declarative I would like to have:
//Declarative
def myEnv = [:]
pipeline {
    agent none
    stage('Prepare my env') {
        steps {
            script {
                [...]
                myEnv = ...
            }
        }
    }
    stages {
        environment { myEnv }
        stage('Fancy stuff') {
            [...]
        }
        stage('Fancy stuff2') {
            [...]
        }
    }
}
When I try to run this, it fails with:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: WorkflowScript: xx: "myEnv" is not a valid environment
expression. Use "key = value" pairs with valid Java/shell keys.
Fair enough.
What should I do to be able to use the declarative environment { } directive and avoid wrapping every further step in withEnv(myEnv)?
It seems that the part you are missing is the usage of the environment clause.
Instead of
environment { myEnv }
it should be
environment { myEnvVal = myEnv }
Just as the error message says, this should be a "key = value" pair.
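For instance, if myEnv holds a plain Groovy map, you can pull individual values out of it per key. A minimal sketch (the map contents are made up; assigning the map without def puts it in the script binding so the environment block can see it):
myEnv = [FOO: 'foo-value', BAR: 'bar-value']
pipeline {
    agent any
    environment {
        // each entry must be a "key = value" pair
        FOO = "${myEnv.FOO}"
        BAR = "${myEnv.BAR}"
    }
    stages {
        stage('check') {
            steps {
                echo "${env.FOO} / ${env.BAR}"
            }
        }
    }
}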
Your issue comes from the type of your variable myEnv. You defined it as a map with def myEnv = [:].
So it works with withEnv, which accepts that kind of argument, but it does not work with environment {...}, which takes only "key = value" statements.
The solution depends on how you add the environment variables contained in myEnv.
The simplest way is using the environment directive, listing all the key/values contained in your former myEnv variable:
pipeline {
    agent none
    environment {
        test1 = 'test-1'
        test2 = 'test-2'
    }
    stages {
        stage('Fancy stuff') {
            steps {
                echo "${test1}"
            }
        }
        stage('Fancy stuff2') {
            steps {
                echo "${test2}"
            }
        }
    }
}
But you can also do it the scripted way:
pipeline {
    agent none
    stages {
        stage('Prepare my env') {
            steps {
                script {
                    def test = []
                    for (int i = 1; i < 3; ++i) {
                        test[i] = 'test-' + i.toString()
                    }
                    test1 = test[1]
                    test2 = test[2]
                }
            }
        }
        stage('Fancy stuff') {
            steps {
                echo "${test1}"
            }
        }
        stage('Fancy stuff2') {
            steps {
                echo "${test2}"
            }
        }
    }
}
I previously asked a question about how to overwrite variables defined in an environment directive, and it seems that's not possible.
I want to set a variable in one stage and have it accessible to other stages.
In a declarative pipeline it seems the only way to do this is in a script{} block.
For example I need to set some vars after checkout. So at the end of the checkout stage I have a script{} block that sets those vars and they are accessible in other stages.
This works, but it feels wrong. For the sake of readability I'd much prefer to declare these variables at the top of the pipeline and have them overwritten. But that would mean having a "set variables" stage at the beginning with a script{} block that just defines vars - that's ugly.
I'm pretty sure I'm missing an obvious feature here. Do declarative pipelines have a global variable feature, or must I use script{}?
This works without an error:
def my_var
pipeline {
    agent any
    environment {
        REVISION = ""
    }
    stages {
        stage('Example') {
            steps {
                script {
                    my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                script {
                    echo "$my_var"
                }
            }
        }
    }
}
Like @mkobit says, you can define the variable at global level, outside the pipeline block. Have you tried that?
def my_var
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                script {
                    println(my_var)
                }
            }
        }
    }
}
For strings, add it to the 'environment' block:
pipeline {
    environment {
        myGlobalValue = 'foo'
    }
}
But for non-string variables, the easiest solution I've found for declarative pipelines is to wrap the values in a method.
Example:
pipeline {
    // Now I can reference myGlobalValue() in my pipeline.
    ...
}
def myGlobalValue() {
    return ['A', 'list', 'of', 'values']
}
// I can also reference myGlobalValue() in other methods below
def myGlobalSet() {
    return myGlobalValue().toSet()
}
@Sameera's answer is good for most use cases. I had a problem with the append operator += though. So this did NOT work (MissingPropertyException):
def globalvar = ""
pipeline {
stages {
stage("whatever) {
steps {
script {
globalvar += "x"
}
}
}
}
}
But this did work:
globalvar = ""
pipeline {
stages {
stage("whatever) {
steps {
script {
globalvar += "x"
}
}
}
}
}
The correct syntax is:
For a global static variable:
Somewhere at the top of the file, before pipeline {, declare:
def MY_VAR = 'something'
For a global variable that you can edit and reuse across stages:
At the top of your file, add an import for Field:
import groovy.transform.Field
Somewhere before pipeline {, declare:
@Field def myVar
Then inside your stage, inside a script block, set the variable:
stage('some stage') {
    steps {
        script {
            myVar = 'I mutate myVar with success'
        }
    }
}
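A later stage can then read the mutated value back; a minimal sketch:
stage('read it back') {
    steps {
        echo "myVar is now: ${myVar}"
    }
}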
To go even further, you can declare functions.
Before the pipeline {, add:
def initSteps() {
    cleanWs()
    checkout scm
}
and then
stages {
    stage('Init') {
        steps {
            initSteps()
        }
    }
}
This worked for me:
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    env.my_var = 'value1'
                }
            }
        }
        stage('Example2') {
            steps {
                script {
                    println(my_var)
                }
            }
        }
    }
}
How do I pass variables between stages in a declarative pipeline?
In a scripted pipeline, I gather the procedure is to write to a temporary file, then read the file into a variable.
How do I do this in a declarative pipeline?
E.g. I want to trigger a build of a different job, based on a variable created by a shell action.
stage("stage 1") {
steps {
sh "do_something > var.txt"
// I want to get var.txt into VAR
}
}
stage("stage 2") {
steps {
build job: "job2", parameters[string(name: "var", value: "${VAR})]
}
}
If you want to use a file (since a script is the thing generating the value you need), you could use readFile as shown below. If not, use sh with the returnStdout option (a sketch follows the main example):
// Define a groovy local variable, myVar.
// A global variable without the def, like myVar = 'initial_value',
// was required for me in older versions of jenkins. Your mileage
// may vary. Defining the variable here maybe adds a bit of clarity,
// showing that it is intended to be used across multiple stages.
def myVar = 'initial_value'
pipeline {
    agent { label 'docker' }
    stages {
        stage('one') {
            steps {
                echo "1.1. ${myVar}" // prints '1.1. initial_value'
                sh 'echo hotness > myfile.txt'
                script {
                    // OPTION 1: set variable by reading from file.
                    // FYI, trim removes leading and trailing whitespace from the string
                    myVar = readFile('myfile.txt').trim()
                }
                echo "1.2. ${myVar}" // prints '1.2. hotness'
            }
        }
        stage('two') {
            steps {
                echo "2.1 ${myVar}" // prints '2.1. hotness'
                sh "echo 2.2. sh ${myVar}, Sergio" // prints '2.2. sh hotness, Sergio'
            }
        }
        // this stage is skipped due to the when expression, so nothing is printed
        stage('three') {
            when {
                expression { myVar != 'hotness' }
            }
            steps {
                echo "three: ${myVar}"
            }
        }
    }
}
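For completeness, the second option mentioned above (capturing the command's output directly via sh, with no temp file) would look something like this inside the script block; a minimal sketch:
script {
    // OPTION 2: capture stdout of the command directly
    myVar = sh(returnStdout: true, script: 'echo hotness').trim()
}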
Simply:
pipeline {
    agent any
    parameters {
        string(name: 'custom_var', defaultValue: '')
    }
    stages {
        stage("make param global") {
            steps {
                script {
                    tmp_param = sh(script: 'most amazing shell command', returnStdout: true).trim()
                    env.custom_var = tmp_param
                }
            }
        }
        stage("test if param was saved") {
            steps {
                echo "${env.custom_var}"
            }
        }
    }
}
I had a similar problem: I wanted one specific pipeline to provide variables and many other pipelines to consume them.
I created a my-set-env-variables pipeline:
script
{
    env.my_dev_version = "0.0.4-SNAPSHOT"
    env.my_qa_version = "0.0.4-SNAPSHOT"
    env.my_pp_version = "0.0.2"
    env.my_prd_version = "0.0.2"
    echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
I can then reuse these variables in another pipeline, my-set-env-variables-test:
script
{
    env.dev_version = "NOT DEFINED DEV"
    env.qa_version = "NOT DEFINED QA"
    env.pp_version = "NOT DEFINED PP"
    env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
    echo "PRE DEV version = ${env.dev_version}"
    script
    {
        def variables = build job: 'my-set-env-variables'
        def vars = variables.getBuildVariables()
        //println "found variables" + vars
        env.dev_version = vars.my_dev_version
        env.qa_version = vars.my_qa_version
        env.pp_version = vars.my_pp_version
        env.prd_version = vars.my_prd_version
    }
}
stage('next job') {
    echo "NEXT JOB DEV version = ${env.dev_version}"
    echo "NEXT JOB QA version = ${env.qa_version}"
    echo "NEXT JOB PP version = ${env.pp_version}"
    echo "NEXT JOB PRD version = ${env.prd_version}"
}
There is no need for (hidden plugin) parameter definitions or temp-file access. Sharing variables across stages can be accomplished by using global Groovy variables in a Jenkinsfile, like so:
#!/usr/bin/env groovy
def MYVAR
def outputOf(cmd) { return sh(returnStdout: true, script: cmd).trim() }
pipeline {
    agent any
    stages {
        stage("stage 1") {
            steps {
                script {
                    MYVAR = outputOf('echo do_something')
                }
                sh "echo MYVAR has been set to: '${MYVAR}'"
            }
        }
        stage("stage 2") {
            steps {
                sh '''echo "...in multiline quotes: "''' + MYVAR + '''" ... '''
                build job: "job2", parameters: [string(name: "var", value: MYVAR)]
            }
        }
    }
}
I have enhanced the existing solution by correcting the syntax. I also used the hidden parameter plugin so that the variable does not show up as an extra parameter in the Jenkins UI. Works well :)
properties([parameters([[$class: 'WHideParameterDefinition', defaultValue: 'yoyo', name: 'hidden_var']])])
pipeline {
    agent any
    stages {
        stage("make param global") {
            steps {
                script {
                    env.hidden_var = "Hello"
                }
            }
        }
        stage("test if param was saved") {
            steps {
                echo "About to check result"
                echo "${env.hidden_var}"
            }
        }
    }
}