How to use environment variable inside Jenkinsfile - jenkins

I am having a similar issue to the one mentioned here.
I am trying to deploy an application via a Jenkinsfile, for which I have to run this command in the deploy stage (if I hardcode the value, it works fine):
xldDeploy serverCredentials: 'usernam', environmentId: 'Environments/SysTest1/SysTest1_1', packageId: 'Applications/Testapp/testapp_1.0.4.5.Build39_TAG-test'
"testapp_1.0.4.5.Build39_TAG-test" is getting generated at running time. Which can be created by concating "${TagVersion}.Build${env.BUILD_NUMBER}_${ComponentTagName}"
I tried below code in my Jenkins pipeline:
stage('Deploy') {
    node('noibuild01') {
        if ("${env.Build_WildflyCPECommon}" == 'true') {
            echo "${TagVersion}"
            echo "${ComponentTagName}"
            echo "${env.BUILD_NUMBER}"
            script {
                env.buildNumber = "${TagVersion}.Build${env.BUILD_NUMBER}_${ComponentTagName}"
                env.packageid = "'Applications/Testapp/${env.buildNumber}'"
            }
            echo "${env.buildNumber}"
            echo "${env.packageid}"
            xldDeploy serverCredentials: 'nex8voo', environmentId: 'Environments/SysTest1/SysTest1_1', packageId: "${env.packageid}"
        }
    }
}
I checked the output and the values look correct:
echo "${env.buildNumber}" gives:
testapp_1.0.4.5.Build39_TAG-test
echo "${env.packageid}" gives:
'Applications/Testapp/testapp_1.0.4.5.Build39_TAG-test'
But
xldDeploy serverCredentials: 'username', environmentId: 'Environments/SysTest1/SysTest1_1', packageId: "${env.packageid}"
resolves the package ID as:
[/repository/ci/'Applications/Testapp/testapp_1.0.4.5.Build39_TAG-test']
Repository entity: ['Applications/Testapp/testapp_1.0.4.5.Build39_TAG-test'] not found
so it seems I can't use packageId: "${env.packageid}" like this.
Is there anything I could try? Maybe Groovy or Python code?

Your packageid environment variable is not being assigned the concatenated string correctly: you have literal single quotes inside the interpolated string's double quotes. You should change it to:
env.packageid = "Applications/Testapp/${env.buildNumber}"
to only interpolate the string, which is the functionality you want here.
Additionally, you do not need to wrap the environment variable in an interpolated string for the method parameter, so the method invocation can be cleaned up to:
xldDeploy serverCredentials: 'nex8voo', environmentId: 'Environments/SysTest1/SysTest1_1', packageId: env.packageid
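Putting both fixes together, the Deploy stage from the question would look roughly like this (a sketch only; the credential, node, and path values are taken verbatim from the snippets above):

stage('Deploy') {
    node('noibuild01') {
        if ("${env.Build_WildflyCPECommon}" == 'true') {
            script {
                // Plain interpolation, no embedded single quotes
                env.buildNumber = "${TagVersion}.Build${env.BUILD_NUMBER}_${ComponentTagName}"
                env.packageid   = "Applications/Testapp/${env.buildNumber}"
            }
            echo "${env.packageid}"
            // Pass the environment variable directly as the parameter value
            xldDeploy serverCredentials: 'nex8voo',
                      environmentId: 'Environments/SysTest1/SysTest1_1',
                      packageId: env.packageid
        }
    }
}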

Related

Jenkins pipeline returns "Bad Substitution" for shell command

I'm attempting to run the following command in a shell block in my Jenkins pipeline:
jq '.Resources[].TargetService.Properties.TaskDefinition = "'"arn:aws:ecs:us-east-1:${ACCOUNT_NUMBER}:task-definition/${TASK_NAME}:${NEW_REVISION}"'"'
This command works perfectly fine when I run it directly on the Jenkins node in shell.
When I insert it into the Pipeline like this:
stage('process json') {
    steps {
        dir('mydir') {
            sh """
                NEW_REVISION=\$(cat revision.txt)
                jq '.Resources[].TargetService.Properties.TaskDefinition = "'"arn:aws:ecs:us-east-1:\${env.AWS_ACCOUNT_NUMBER}:task-definition/\${env.TASK_NAME}:\${NEW_REVISION}"'"'
            """
        }
    }
}
I get a Bad substitution error without any more information. As far as I know, I'm escaping variables and quoting correctly. I can bypass the error if I remove the double quotes, like this:
jq '.Resources[].TargetService.Properties.TaskDefinition = "arn:aws:ecs:us-east-1:${ACCOUNT_NUMBER}:task-definition/${TASK_NAME}:${NEW_REVISION}"'
But that ends up processing the variables literally.
Notes: I'm aware of the security implications of not using jq --arg and am prepared to modify my command once I get the simpler form working. revision.txt contains a numeric value. The env.* variables are declared earlier as part of the pipeline environment.
env is a Jenkins object, and you are escaping the env.* references as well, so the literal text ${env.AWS_ACCOUNT_NUMBER} reaches the shell; the shell cannot substitute a parameter name containing a dot, which is what produces the bad substitution error. If you have already exported these values as environment variables, they are available in the shell environment, so simply drop the env. prefix from those variables, or remove the escape characters and let Groovy interpolate them.
stage('process json') {
    steps {
        dir('mydir') {
            sh """
                NEW_REVISION=\$(cat revision.txt)
                jq '.Resources[].TargetService.Properties.TaskDefinition = "'"arn:aws:ecs:us-east-1:\${AWS_ACCOUNT_NUMBER}:task-definition/\${TASK_NAME}:\${NEW_REVISION}"'"'
            """
        }
    }
}
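The other option mentioned above, removing the escapes and letting Groovy interpolate the env.* references, would be a sketch along these lines (only NEW_REVISION keeps its backslash, since that one is a shell variable):

stage('process json') {
    steps {
        dir('mydir') {
            sh """
                NEW_REVISION=\$(cat revision.txt)
                jq '.Resources[].TargetService.Properties.TaskDefinition = "'"arn:aws:ecs:us-east-1:${env.AWS_ACCOUNT_NUMBER}:task-definition/${env.TASK_NAME}:\${NEW_REVISION}"'"'
            """
        }
    }
}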

Expand ENV vars from String

How can I expand ${ENV} variables in Jenkins Pipeline if they are in a string I do not control?
For example, I have configured my Pipeline job to load the Pipeline script from a parameterized SCM, with the branch specifier set to ${REF}.
If I now access that branch via scm.branches[0].name inside the Pipeline, I currently get the literal ${REF}, too.
(The checkout scm part of the pipeline works fine, that's not the problem.)
I have tried the tm() step, but that throws org.jenkinsci.plugins.tokenmacro.MacroEvaluationException: Unrecognized macro 'REF' in '${REF}'
So, for example, I cannot use the following to update the build name:
currentBuild.displayName = "${scm.branches[0].name} (#$BUILD_NUMBER)"
If REF is an environment variable, you can use string interpolation: just put the variable inside a double-quoted string in the pipeline.
echo "${REF}"
Update
I'm not sure if there is a better Groovy way to do this, but the following is an option you can use.
steps {
    script {
        echo "${scm.branches[0].name}"                                          // prints the literal ${REF}
        String branchName = "${scm.branches[0].name}"
        String envNameOnly = branchName.substring(2, branchName.length() - 1)   // strip "${" and "}"
        def value = System.getenv()[envNameOnly]                                // look the name up in the environment
        echo "$value"
    }
}
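A variation on the same idea, as a sketch: resolve the extracted name against the pipeline's env object instead of System.getenv(), so a value injected by Jenkins (e.g. a REF build parameter exported to the environment) is picked up, and then use it for the display name as in the question. This assumes REF really is present in the build environment:

script {
    String raw = scm.branches[0].name                  // e.g. the literal ${REF}
    String name = raw.substring(2, raw.length() - 1)   // strip "${" and "}"
    String resolved = env."${name}"                     // dynamic lookup on the env object
    currentBuild.displayName = "${resolved} (#${env.BUILD_NUMBER})"
}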

Bad substitution when passing parameter to shell script in Jenkinsfile

In a Jenkinsfile I'm attempting to set an environment variable to the stdout of a shell script. The script contains an AWS command that returns an InstanceID:
stage('Set InstanceID') {
    steps {
        script {
            env.IID = sh (script: 'scripts/get-node-id.sh "${params.ENVIRONMENT}" "${params.NODE}"', returnStdout: true).trim()
        }
    }
}
No matter what I do or how many backslashes I use to escape the quotes, nothing works. I get a bad substitution error. I've also tried without double quotes.
If I hardcode in the shell script arguments, it runs fine.
How do I get this working if I want to use the parameter values here?
Groovy (the language of the Jenkinsfile) and Bash share the same substitution syntax. Because you're using single quotes in your example, Groovy does not interpolate the string (see https://groovy-lang.org/syntax.html#_single_quoted_string). So the shell tries to do the substitution instead, but ${params.ENVIRONMENT} is not a valid shell parameter name (the dot is not allowed), hence the bad substitution error; the values only exist as Jenkins parameters.
To solve this, use double quotes for the script string and escape the double quotes inside it (or use single quotes):
stage('Set InstanceID') {
    steps {
        script {
            env.IID = sh (script: "scripts/get-node-id.sh \"${params.ENVIRONMENT}\" \"${params.NODE}\"", returnStdout: true).trim()
        }
    }
}
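The single-quote variant mentioned above would be a sketch like this; Groovy still interpolates because the outer string is double-quoted, and the single quotes only protect the expanded values in the shell:

stage('Set InstanceID') {
    steps {
        script {
            // Single quotes around the arguments protect the expanded values in the shell
            env.IID = sh(script: "scripts/get-node-id.sh '${params.ENVIRONMENT}' '${params.NODE}'", returnStdout: true).trim()
        }
    }
}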
I managed to resolve this; the comment from #yong above was almost what I needed. I needed three sets of double quotes (a triple-quoted string) and to unquote the variables:
stage('Set InstanceID') {
    steps {
        script {
            env.IID = sh (script: """scripts/get-node-id.sh ${params.ENVIRONMENT} ${params.NODE}""", returnStdout: true).trim()
        }
    }
}

How to access a Groovy variable from the pipeline in a shell script?

I have a global variable in my pipeline, say BACKUP_DIR_NAME, and in a shell script inside the pipeline I want to build a path using it, hence the following code:
BACKUP_DIR_NAME="10-04-2020"

pipeline {
    agent any
    stages {
        stage('First') {
            steps {
                script {
                    sh '''
                        BACKUP_DIR_PATH="/home/oracle/SeleniumFramework/SeleniumResultsBackup/"$BACKUP_DIR_NAME"/"
                        echo "Directory path is "$BACKUP_DIR_PATH
                    '''
                }
            }
        }
    }
}
When I execute this, the value of BACKUP_DIR_NAME evaluates as empty. Could you please help me correct the above code?
You mix two types of variables in your sh step. In the first line you are trying to access the Groovy variable and interpolate its value to construct a shell variable; in the second line you expect to access that shell variable.
To satisfy the first part, you need double quotes so the Groovy string supports variable interpolation. To satisfy the second part, you need to escape the dollar sign (\$) to prevent $BACKUP_DIR_PATH from being interpolated by Groovy.
BACKUP_DIR_NAME="10-04-2020"

pipeline {
    agent any
    stages {
        stage('First') {
            steps {
                script {
                    sh """
                        BACKUP_DIR_PATH="/home/oracle/SeleniumFramework/SeleniumResultsBackup/"$BACKUP_DIR_NAME"/"
                        echo "Directory path is "\$BACKUP_DIR_PATH
                    """
                }
            }
        }
    }
}
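As an alternative sketch (keeping the shell script single-quoted), the Groovy value can be exposed to the shell through withEnv, the construct also recommended further down this page. This assumes the top-level BACKUP_DIR_NAME assignment from above:

withEnv(["BACKUP_DIR_NAME=${BACKUP_DIR_NAME}"]) {
    // The shell expands ${BACKUP_DIR_NAME} itself, since withEnv put it in the environment
    sh '''
        BACKUP_DIR_PATH="/home/oracle/SeleniumFramework/SeleniumResultsBackup/${BACKUP_DIR_NAME}/"
        echo "Directory path is ${BACKUP_DIR_PATH}"
    '''
}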

Load file with environment variables Jenkins Pipeline

I am doing a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source them from a file:
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it returns "not found":
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I run the same command when I log in via SSH, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'

    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"

    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
I was thinking about not using a Pipeline job (but then I cannot use my Jenkinsfile),
or making separate jobs for staging and production with the EnvInject plugin (but I lose my stage view),
or using withEnv (but the code gets big, because I am currently working with 12 env vars).
One way you could load environment variables from a file is to load a Groovy file.
For example:
Let's say you have a groovy file in '$JENKINS_HOME/.envvars' called 'stacktest-staging.groovy'.
Inside this file, you define the two environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load it using:
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
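Applied to the staging/production flow from the question, the same approach would look roughly like this (a sketch; it assumes a matching stacktest-production.groovy exists next to the staging file):

node {
    stage('Staging') {
        load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
        echo "${env.DB_URL}"        // staging value
        sh 'gradle flywayMigrate'
    }
    input message: "Does Staging server look good?"
    stage('Production') {
        load "$JENKINS_HOME/.envvars/stacktest-production.groovy"
        echo "${env.DB_URL}"        // production value
        sh 'gradle flywayMigrate'
        sh './deploy.sh'
    }
}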
From the comments to the accepted answer:
Don't use the global env; use the withEnv construct instead. See, e.g., issue #9 ("don't set env vars with global env") in the top 10 best practices for the Jenkins Pipeline plugin.
In the following example, VAR1 is a plain Java string (no Groovy variable expansion) and VAR2 is a Groovy string (so the variable someGroovyVar is expanded).
The passed script is a plain Java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echo calls access the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE',
             "VAR2=${someGroovyVar}"
            ]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
For secrets/passwords you can use the Credentials Binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a username/password secret registered in the Jenkins credentials settings.
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see the Credentials plugin documentation.
For completeness: most of the time we need the secrets in environment variables, as we use them from shell scripts, so we combine withCredentials and withEnv as follows:
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}",
                 "ENV_PASSWORD=${PASSWORD}"
                ]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the 'Pipeline Utility Steps' plugin, which provides the readProperties step (for reference, see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the example there, the keys are stored in an array and then used to retrieve the values.
The problem in production is that if we later add a variable to the property file, that variable also needs to be added to the array in the Jenkinsfile.
To get rid of this tight coupling, we can write the code so that the Jenkins build environment automatically discovers all the keys currently present in the property file. Here is an example for reference:
// Reads a .properties file and exports every key/value pair as an environment variable
def loadEnvironmentVariables(path) {
    def props = readProperties file: path
    def keys = props.keySet()
    for (key in keys) {
        def value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the calling code looks like:
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
With a declarative pipeline, you can do it in one line (change path to your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
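For example (a sketch with a hypothetical config.properties in the workspace):

// config.properties might contain:
//   DB_URL=http://staging_url/my_db
//   DB_URL2=http://staging_url/my_db2
script {
    readProperties(file: 'config.properties').each { key, value -> env[key] = value }
    echo "${env.DB_URL}"    // http://staging_url/my_db
}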
Using withEnv() to pass environment variables from a file, split by newline and cast to a List:
writeFile file: 'version.txt', text: 'version=6.22.0'
withEnv(readFile('version.txt').split('\n') as List) {
    sh "echo ${version}"
}
If you are using Jenkins 2.0, you can load a property file (which contains all required environment variables along with their corresponding values), read all the environment variables listed there automatically, and inject them into the Jenkins-provided env entity.
Here is a method which performs the above action:
def loadProperties(path) {
    properties = new Properties()
    File propertiesFile = new File(path)
    properties.load(propertiesFile.newDataInputStream())
    Set<Object> keys = properties.keySet();
    for (Object k : keys) {
        String key = (String) k;
        String value = (String) properties.getProperty(key)
        env."${key}" = "${value}"
    }
}
To call this method we need to pass the path of the property file as a string. For example, in our Jenkinsfile we can call it from a Groovy script like:
path = "${workspace}/pic_env_vars.properties"
loadProperties(path)
Please ask if you have any doubts.
Here is a complete example of externalizing environment variables and loading them during Jenkins pipeline execution. The pipeline is written in a declarative style.
stage('Reading environment variable defined in groovy file') {
    steps {
        script {
            load "./pipeline/basics/extenvvariable/env.groovy"
            echo "${env.env_var1}"
            echo "${env.env_var2}"
        }
    }
}
Complete code example:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/Jenkinsfile
The variables are loaded from a Groovy file kept alongside the pipeline code:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/env.groovy
This pattern comes in very handy when you are creating a generic pipeline that can be used across teams.
You can externalize the dependent variables in such a Groovy file, and each team can define its values according to its ecosystem.
Another solution is to use a custom method, without granting extra script-approval permissions such as the one needed for new Properties(), which otherwise leads to this error:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.util.Properties
and without adding extra plugin steps such as readProperties.
Here is a method which reads a simple file named env_vars in this format:
FOO=bar
FOO2=bar
pipeline {
    <... skipped lines ...>
    script {
        loadEnvironmentVariablesFromFile("env_vars")
        echo "show time! ${FOO} ${FOO2}"
    }
    <... skipped lines ...>
}
private void loadEnvironmentVariablesFromFile(String path) {
    def file = readFile(path)
    file.split('\n').each { envLine ->
        def (key, value) = envLine.tokenize('=')
        env."${key}" = "${value}"
    }
}
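One caveat, with a hedged refinement: tokenize('=') splits on every '=' and drops empty tokens, so a value that itself contains '=' (or an empty value) gets mangled. Splitting only at the first '=' avoids that:

private void loadEnvironmentVariablesFromFile(String path) {
    readFile(path).split('\n').each { envLine ->
        def trimmed = envLine.trim()
        if (trimmed && !trimmed.startsWith('#') && trimmed.contains('=')) {   // skip blanks, comments, malformed lines
            def idx = trimmed.indexOf('=')
            env."${trimmed.substring(0, idx)}" = trimmed.substring(idx + 1)
        }
    }
}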
