Expand ENV vars from String - jenkins

How can I expand ${ENV} variables in Jenkins Pipeline if they are in a string I do not control?
For example, I have configured my Pipeline Job to load the Pipeline from a parameterized SCM:
If I now access that branch via scm.branches[0].name inside the Pipeline, I currently get ${REF}, too.
(The checkout scm part of the pipeline works fine, that's not the problem.)
I have tried the tm() step, but that throws org.jenkinsci.plugins.tokenmacro.MacroEvaluationException: Unrecognized macro 'REF' in '${REF}'
For example, I cannot use it to update the build name:
currentBuild.displayName = "${scm.branches[0].name} (#$BUILD_NUMBER)"

If REF is an environment variable, you can use string interpolation: just put the variable into double quotes within the pipeline.
echo "${REF}"
Update
Not sure if there is a better Groovy way to do this, but the following is an option you can use.
steps {
    script {
        echo "${scm.branches[0].name}"                    // prints ${REF}
        String branchName = "${scm.branches[0].name}"
        // strip the leading "${" and the trailing "}" to get just the variable name
        String envNameOnly = branchName.substring(2, branchName.length() - 1)
        // note: System.getenv() reads the controller's process environment; for variables
        // set on the build itself, env.getProperty(envNameOnly) is the safer lookup
        def resolved = System.getenv()[envNameOnly]
        echo "$resolved"
    }
}
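For the asker's original goal of setting the build display name, a minimal sketch building on the same idea (assuming REF ends up in the build environment; env.getProperty and the fallback are my additions, not part of the original answer):
script {
    String branchName = "${scm.branches[0].name}"                 // e.g. the literal "${REF}"
    String envNameOnly = branchName.substring(2, branchName.length() - 1)
    // resolve the variable by its runtime name; fall back to the raw string if it is unset
    String resolved = env.getProperty(envNameOnly) ?: branchName
    currentBuild.displayName = "${resolved} (#${env.BUILD_NUMBER})"
}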

Related

How to read parameters from a file within a Jenkins pipeline job

I have a pipeline and it runs for three different branches (dev/uat/master). Some parameters change for each branch, hence they are hardcoded for each environment, resulting in three Jenkinsfiles (one for each environment).
My second solution is to have three different properties files, one per environment. A single Jenkins job will then trigger the build, but based on the branch name (which I will pick up from the GitHub webhook trigger).
My Jenkinsfile has an environment variable whose assignment looks like below:
myJenkinsJob.jenkinsfile
serviceAccountName = sh(returnStdout: true, script: "awk -F= '{$1 ~ /serviceAccountName/ ; gsub($1"=","") ; print}' dev.properties").trim()
dev.properties file looks like this:
serviceAccountName=abc#def.com
This evaluates to the value mentioned in the properties file, i.e. abc#def.com.
Does anyone have a better/easier alternative? Some plugin that can read the parameters passed in a file, without going through all these sh commands for assignments in the environment/parameters block?
You can use one Jenkinsfile for all branches and add an init stage to set up your variables according to the branch name, using the BRANCH_NAME environment variable:
stage('Init') {
    steps {
        script {
            switch (env.BRANCH_NAME) {
                case 'dev':
                    serviceAccountName = 'dev#def.com'
                    break
                case 'uat':
                    serviceAccountName = 'uat#def.com'
                    break
                case 'master':
                    serviceAccountName = 'master#def.com'
                    break
                default:
                    error('Unexpected branch name')
            }
        }
    }
}
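A later stage in the same Jenkinsfile can then read that variable; a minimal sketch (the Deploy stage and the echo command are placeholders of mine):
stage('Deploy') {
    steps {
        // serviceAccountName was assigned in the Init stage above
        sh "echo deploying with service account ${serviceAccountName}"
    }
}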
If you want to use a properties file, you can use the readFile step, or use a YAML file with readYaml, which makes it easier to parse the retrieved values.
Example:
A dev.yml file can look like this:
service-account-name: abc#def.com
And then use readYaml in your pipeline:
def devData = readYaml file: 'dev.yml'
def serviceAccountName = devData.'service-account-name' // quote the key because it contains hyphens
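Combining the two ideas, you could pick the YAML file by branch name; a sketch, assuming one file per environment named after the branch (dev.yml, uat.yml, master.yml):
script {
    // assumption: the per-environment files are named after the branches
    def cfg = readYaml file: "${env.BRANCH_NAME}.yml"
    def serviceAccountName = cfg.'service-account-name'
    echo "Using service account ${serviceAccountName}"
}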
For all the environment variables Jenkins supplies, see the page https://your.jenkins.host:port/env-vars.html.

How to use inject environment variables (Properties File Path) in Jenkins Pipeline

I want to use the below functionality (shown in the image link) in Jenkins as code, but I'm failing to do so. Kindly help me replicate the functionality from the image as a Groovy script:
stage('Build Instance') {
    sh '''
        bash ./build.sh -Ddisable-rpm=false
    '''
    env "/fl/tar/ver.prop"
}
Jenkins GUI usage of Env Inject
Got a simple workaround:
script {
    def props = readProperties file: '/fl/tar/ver.prop' // readProperties is a step in the Pipeline Utility Steps plugin
    env.WEATHER = props.WEATHER // assuming the key name is WEATHER in the properties file
}
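Put together with the build step from the question, the stage could look roughly like this (declarative syntax; the WEATHER key and the /fl/tar/ver.prop path are taken from the snippets above, the rest is a sketch):
stage('Build Instance') {
    steps {
        sh 'bash ./build.sh -Ddisable-rpm=false'
        script {
            // inject the values written by the build into the environment
            def props = readProperties file: '/fl/tar/ver.prop'
            env.WEATHER = props.WEATHER
        }
        sh 'echo "$WEATHER"' // the injected variable is visible to later steps
    }
}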

Load env variables successfully in Jenkins pipeline but not while the pipeline was used as shared library

In one stage of my declarative Jenkins pipeline, I execute a bash script (sh '''./a.sh''', where a.sh is maintained outside the repository). In that script, the value of jarVersion is written to ${WORKSPACE}/.jarVersion (echo "jarVersion=${jarVersion}" > ${WORKSPACE}/.jarVersion). In a later stage we need the value of jarVersion, so we use load "${WORKSPACE}/.jarVersion" and then ${jarVersion} to read it. This works when we do it in a pipeline script.
However, when we turn this pipeline into a shared library (put it in /vars/testSuite.groovy) and call it from another pipeline script, it cannot recognize the variable ${jarVersion}.
Please advise how to solve the issue. The general question is: how do I transfer a value from stage A to stage B in a script?
stage('getJarVersion') {
    steps {
        script {
            load "${WORKSPACE}/.jarVersion"
            currentBuild.description = "jarVersion:${jarVersion}"
        }
    }
}
I expected it to work as it does in pipeline scripts, but it shows:
groovy.lang.MissingPropertyException: No such property: jarVersion for class: testSuite
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:458)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:34)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at testSuite.call(/jenkins/jobs/TestSuite1/builds/11/libs/pipelineUtilities/vars/testSuite.groovy:84)
With the stages in the same Groovy file, you have to declare the variable outside the stage blocks and before the node block. Each stage can then assign a value to it:
def my_var // declared before the node block, visible to every stage

node {
    stage('stage1') {
        // ---------
    }
    stage('stage2') {
        // ---------
    }
}
If you are defining a stage per file, you have to create the closure with the input object and pass it in the call from the parent Groovy file:
test.groovy:
def call(def my_obj, String my_string) {
    stage('my_stage') {
        println(my_obj)
    }
}
parent_test.groovy:
test(obj_value, string_value)
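Applied to the original problem, a minimal sketch of vars/testSuite.groovy could declare the variable once and read the file with readProperties instead of load (the stage name and the .jarVersion format come from the question; everything else is an assumption):
// vars/testSuite.groovy (sketch)
def call() {
    def jarVersion // declared outside the stages so both can see it
    node {
        stage('getJarVersion') {
            // .jarVersion contains a line like: jarVersion=1.2.3
            def props = readProperties file: "${env.WORKSPACE}/.jarVersion"
            jarVersion = props.jarVersion
        }
        stage('useJarVersion') {
            currentBuild.description = "jarVersion:${jarVersion}"
        }
    }
}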

Jenkins Global environment variables in Jenkinsfile

How do I invoke Global environment variables in Jenkinsfile?
For example, if I have a variable -
name:credentialsId
value:xxxx-xxxx-xxxxx-xxxxxxxxx
How do I use it in the groovy script?
I tried ${credentialsId}, but it didn't work; it just gives an error:
java.lang.NoSuchMethodError: No such DSL method '$' found among steps [ArtifactoryGradleBuild, ........
In a Jenkinsfile, you have the "Working with the Environment" section, which mentions:
The full list of environment variables accessible from within Jenkins Pipeline is documented at localhost:8080/pipeline-syntax/globals#env,
The syntax is ${env.xxx} as in:
node {
    echo "Running ${env.BUILD_ID} on ${env.JENKINS_URL}"
}
See also "Managing the Environment".
How can I pass the Global variables to the Jenkinsfile?
When I say Global variables - I mean in
Jenkins -> Manage Jenkins -> Configure System -> Global properties -> Environment variables
See "Setting environment variables"
Setting an environment variable within a Jenkins Pipeline can be done with the withEnv step, which allows overriding specified environment variables for a given block of Pipeline Script, for example:
Jenkinsfile (Pipeline Script)
node {
    /* .. snip .. */
    withEnv(["NAME=value"]) {
        ... your job
    }
}
When referring to env in Groovy scope, simply use env.VARIABLE_NAME; for example, to pass the BUILD_NUMBER of the upstream job to a triggered job:
stage('Starting job') {
    build job: 'TriggerTest', parameters: [
        [$class: 'StringParameterValue', name: 'upstream_build_number', value: env.BUILD_NUMBER]
    ]
}
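For the globally configured variable from the question (credentialsId under Manage Jenkins -> Configure System -> Global properties), it is exposed to every build through env, so a minimal sketch would be:
node {
    // credentialsId is the variable name configured under Global properties -> Environment variables
    echo "Global credentialsId: ${env.credentialsId}"
}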
Scripted pipeline
To read an environment variable whose name you know, use env.NAME
To read an environment variable whose name is not known until runtime use env.getProperty(name).
For example, a value from a YAML config file represents an environment variable name:
config.yaml (in workspace)
myconfig:
  key: JOB_DISPLAY_URL
Jenkinsfile
node {
    println("Running job ${env.JOB_NAME}")
    def config = readYaml(file: 'config.yaml')
    def value = env.getProperty(config.myconfig.key)
    println("Value of property ${config.myconfig.key} is ${value}")
}
For getting values, all of env.VAR, env['VAR'], and env.getProperty('VAR') are fine.
For setting values, the only safe way at the moment is withEnv. If you try to assign values to env.VAR, it may not work in some cases, such as parallel pipelines (see JENKINS-59871).
Another syntax is $ENV:xxxx
node {
    echo "Running $ENV.BUILD_ID on $ENV.JENKINS_URL"
}
This worked for me

Load file with environment variables Jenkins Pipeline

I am doing a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source the variables.
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it returns Not found
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I run the same command when I log in via SSH, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'

    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"

    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
I was thinking about not using Pipeline (but then I will not be able to use my Jenkinsfile).
Or making different jobs for staging and production, using the EnvInject Plugin (but I lose my stage view).
Or using withEnv (but the code gets big, because today I am working with 12 env vars).
One way you could load environment variables from a file is to load a Groovy file.
For example:
Let's say you have a Groovy file in '$JENKINS_HOME/.envvars' called 'stacktest-staging.groovy'.
Inside this file, you define the 2 environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load this in using
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
From the comments to the accepted answer
Don't use the global 'env' but use the 'withEnv' construct instead, e.g. see:
issue #9: "don't set env vars with global env" in the top 10 best practices for the Jenkins Pipeline plugin
In the following example, VAR1 is a plain Java string (no Groovy variable expansion), while VAR2 is a Groovy string (so the variable 'someGroovyVar' is expanded).
The passed script is a plain Java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echo commands access the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE',
             "VAR2=${someGroovyVar}"
    ]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
For secrets / passwords you can use the Credentials Binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a username/password secret registered in the Jenkins settings.
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see the Credentials plugin.
For completeness: most of the time we need the secrets in environment variables, as we use them from shell scripts, so we combine withCredentials and withEnv as follows:
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}",
                 "ENV_PASSWORD=${PASSWORD}"
        ]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the 'Pipeline Utility Steps' plugin, which provides the readProperties method (for reference, see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the plugin's example, the keys are stored in an array and used to retrieve the values.
But in that case, if we later add any variable to the property file, it also needs to be added to the array in the Jenkinsfile. To get rid of this tight coupling, we can write the code so that the Jenkins build environment automatically picks up all the keys currently present in the property file. Here is an example for reference:
def loadEnvironmentVariables(path) {
    def props = readProperties file: path
    def keys = props.keySet()
    for (key in keys) {
        // promote every key=value pair from the file to an environment variable
        value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the client code looks like
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
With a declarative pipeline, you can do it in one line (replace path with your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
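In context, that one-liner could sit in an early stage of a declarative pipeline; a sketch, assuming a file named env.properties in the workspace and a hypothetical SOME_KEY entry inside it:
pipeline {
    agent any
    stages {
        stage('Load env') {
            steps {
                script {
                    // every key=value pair from the file becomes an environment variable
                    readProperties(file: 'env.properties').each { key, value -> env[key] = value }
                }
            }
        }
        stage('Use env') {
            steps {
                sh 'echo "$SOME_KEY"' // SOME_KEY is hypothetical; any key from the file works
            }
        }
    }
}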
Using withEnv() to pass environment variables from a file, split by newline and cast to a List:
writeFile file: 'version.txt', text: 'version=6.22.0'
withEnv(readFile('version.txt').split('\n') as List) {
    sh 'echo $version' // single quotes: the shell expands $version from the environment
}
If you are using Jenkins 2.0, you can load the property file (which consists of all the required environment variables together with their corresponding values), read all the environment variables listed there, and inject them into the Jenkins-provided env entity. Here is a method which performs the above:
def loadProperties(path) {
    // note: java.util.Properties and java.io.File run on the controller and may require script approval
    def properties = new Properties()
    File propertiesFile = new File(path)
    properties.load(propertiesFile.newDataInputStream())
    Set<Object> keys = properties.keySet()
    for (Object k : keys) {
        String key = (String) k
        String value = (String) properties.getProperty(key)
        env."${key}" = "${value}"
    }
}
To call this method, we need to pass the path of the property file as a string variable. For example, in our Jenkinsfile we can call it like this:
path = "${workspace}/pic_env_vars.properties"
loadProperties(path)
Please ask if you have any doubts.
Here is a complete example of externalizing environment variables and loading them in Jenkins pipeline execution. The pipeline is written in a declarative style.
stage('Reading environment variable defined in groovy file') {
    steps {
        script {
            load "./pipeline/basics/extenvvariable/env.groovy"
            echo "${env.env_var1}"
            echo "${env.env_var2}"
        }
    }
}
Complete code example:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/Jenkinsfile
The variables are loaded from a Groovy file kept alongside the pipeline code:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/env.groovy
This pattern comes in very handy when you are creating a generic pipeline that can be used across teams. You can externalize the dependent variables in such a Groovy file, and each team can define their values according to their ecosystem.
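Such an env.groovy file is just a set of assignments to env; a sketch with placeholder values (the real ones live in the linked repository):
// pipeline/basics/extenvvariable/env.groovy (sketch, placeholder values)
env.env_var1 = 'value1'
env.env_var2 = 'value2'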
Another solution is to use a custom method, without granting extra script-approval permissions such as for new Properties(), which otherwise leads to this error:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.util.Properties
and without adding extra plugin methods such as readProperties.
Here is a method which reads a simple file named env_vars in this format:
FOO=bar
FOO2=bar
pipeline {
    <... skipped lines ...>
    script {
        loadEnvironmentVariablesFromFile("env_vars")
        echo "show time! ${FOO} ${FOO2}"
    }
    <... skipped lines ...>
}
private void loadEnvironmentVariablesFromFile(String path) {
    def file = readFile(path)
    file.split('\n').each { envLine ->
        // each line looks like KEY=value; register it as an environment variable
        def (key, value) = envLine.tokenize('=')
        env."${key}" = "${value}"
    }
}
