Jenkins pipeline error in handling JSON file

I'm a newbie to Jenkins pipelines and am writing a Groovy script to parse a JSON file. However, I'm facing an error that many have faced, but none of the solutions worked for me. Below are my Jenkinsfile and the error message.
def envname = readJSON file: '${env.WORKSPACE}/manifest.json'
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo WORKSPACE
                sh "ls -a ${WORKSPACE}"
            }
        }
    }
}
[Pipeline] Start of Pipeline
[Pipeline] readJSON
[Pipeline] End of Pipeline
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException:
Required context class hudson.FilePath is missing Perhaps you forgot
to surround the code with a step that provides this, such as: node at
org.jenkinsci.plugins.pipeline.utility.steps.AbstractFileOrTextStepExecution.run(AbstractFileOrTextStepExecution.java:30)
I even tried readJSON file: '${WORKSPACE}/manifest.json' but that didn't work either. I'm sure the mistake is in the first line, since the execution succeeds when I remove that line. The docs are pretty helpful, but I'm not able to track down where exactly I'm going wrong, which is why I posted here.
UPDATE:
I tried the following methods, def envname = readJSON file: "./manifest.json" and def envname = readJSON file: "${env.WORKSPACE}/manifest.json", and even tried defining them under the steps block. Nothing worked. Below is the error message I received when I defined them under the steps block:
WorkflowScript: 5: Expected a step # line 7, column 13
def envname =
^
Below is the official syntax doc for readJSON, and as far as I can see I'm using the correct syntax, but it still doesn't work as expected.
https://www.jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#readjson-read-json-from-files-in-the-workspace

'${env.WORKSPACE}/manifest.json' is a single-quoted Groovy string, so ${env.WORKSPACE} is never interpolated and is passed literally. You need a double-quoted string so that Groovy interpolates the env map, like "${env.WORKSPACE}/manifest.json".
sh "ls -a ${WORKSPACE}" is interpolating the shell environment variable WORKSPACE as a Groovy variable. You should interpolate it as a shell variable instead, like sh 'ls -a ${WORKSPACE}'.
echo WORKSPACE is attempting to resolve the shell variable WORKSPACE as a first-class Groovy variable expression. You need to use the Groovy env map instead, like echo env.WORKSPACE.
As for the variable assignment on the first line: if it still throws the error above after those fixes, it is likely because scripted syntax is being used in a declarative pipeline. In that case you need to place it inside a script block within the steps of your pipeline.
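Putting those fixes together, the relevant lines would look something like this (a sketch only; as noted above, the readJSON call still has to run inside a steps/script context):
def envname = readJSON file: "${env.WORKSPACE}/manifest.json"  // double-quoted, so Groovy interpolates env.WORKSPACE
sh 'ls -a ${WORKSPACE}'  // single-quoted, so the shell expands ${WORKSPACE}
echo env.WORKSPACE  // resolve WORKSPACE through the Groovy env map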

I've solved this myself with the help of Matt Schuchard's answer below. I'm not sure whether this is the only way to solve it, but it worked for me.
pipeline {
    agent any
    stages {
        stage('Json-Build') {
            steps {
                script {
                    def envname = readJSON file: "${env.WORKSPACE}/manifest.json"
                    element1 = "${envname.dev}"
                    echo element1
                }
            }
        }
    }
}
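For reference, a manifest.json shaped roughly like this (hypothetical contents, since the original file isn't shown) would make envname.dev resolve to a value:
{
    "dev": "dev-environment",
    "prod": "prod-environment"
}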

Related

Jenkinsfile shell command not using env variables as expected

In my Jenkinsfile I want to dynamically find the Unity version using a Python script, like so:
environment {
    UNITY_EDITOR = bat(script: "py $WORKSPACE/get_versions.py --unity", returnStdout: true).trim()
    UNITY_BASE = "C:/Program Files/Unity/Hub/Editor/$UNITY_EDITOR/Editor/Unity.exe"
    UNITY_WRAPPER = "UnityBatchWrapper -silent-crashes -no-dialogs -batchmode -quit -unityPath \"$UNITY_BASE\""
}
post {
    always {
        script {
            echo "Returning license"
            licenseReturnStatus = bat (
                script: "$UNITY_WRAPPER -returnlicense",
                returnStatus: true
            ) == 0
        }
    }
}
From other stackoverflow answers this seems like it should work, but instead my Jenkins job errors out during the post-build step because $UNITY_WRAPPER isn't defined:
groovy.lang.MissingPropertyException: No such property: UNITY_WRAPPER for class: groovy.lang.Binding
I'm thinking the batch step is what's failing, even though Jenkins doesn't complain about it. I've also tried using $env.WORKSPACE and %WORKSPACE%, and that doesn't work either.
I'm beginning to think $WORKSPACE doesn't exist until after the environment step...
Turns out I didn't have Python installed, since it was an ephemeral GCP builder and I hadn't updated the node label yet.
For anyone reading this who has trouble with bat commands: be sure to put an @ sign in front of your command, like "@py ...", or else the command itself will be echoed into the output. Also trim your output so it doesn't have a CRLF in it.
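Applied to the environment block from the question, that advice would look something like this (a sketch; only the @ prefix is new):
environment {
    // '@' keeps the batch command itself out of the captured stdout; trim() strips the trailing CRLF
    UNITY_EDITOR = bat(script: "@py $WORKSPACE/get_versions.py --unity", returnStdout: true).trim()
}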

How to set environment PATH in Jenkins Declarative Pipeline bat block

I'm trying to set the PATH environment variable in a Jenkins declarative pipeline and to use it in a bat block on a Windows machine. (I'm modifying the path so that I can call an executable without explicitly specifying its location.)
The path does not get passed to the bat block for some reason.
Any pointers to what could be the issue are highly appreciated from all you experienced developers out there. Thanks in advance!
Following is my code.
pipeline {
    agent { label 'docker' }
    environment {
        PATH = "/hot/new/bin:$PATH"
    }
    stages {
        stage ('build') {
            steps {
                echo "PATH is: $PATH"
                bat """
                    echo PATH is: %PATH%
                """
            }
        }
    }
}
Output is as follows:
PATH is: /hot/new/bin:blah:blah:my_env_path_content_remaining
PATH is: blah:blah:blah:my_env_path_content_remaining
What about using this syntax so that Groovy is able to interpolate the variable?
bat """
echo PATH is: ${env.PATH}
"""
Or like this:
bat "echo PATH is: ${env.PATH}"

new File("path/tmp.txt") at Jenkins node

I have a very simple pipeline that works on the master. It reads a line from tmp.txt, and it works on the Jenkins master.
stage ('Stage 1'){
    node('master') {
        File file1 = new File("env.Workspace/tmp.txt")
        def String my_line = file1.readLines().get(0)
        …
    }
}
I had to move the stage to another node (an agent) and it doesn't work anymore. If there is a tmp.txt in the master's workspace, the pipeline reads it. But I want to read the tmp.txt in the node's workspace, not the master's!
stage ('Stage 1'){
    node('Agent_1') {
        File file1 = new File("env.Workspace/tmp.txt")
        def String my_line = file1.readLines().get(0)
        …
    }
}
I've found a note saying that:
"File always implies a file path on the current computer."
What does that mean? It must be possible to read a file from a node. Can anybody help?
Do not use native Groovy/Java IO functions, but use pipeline steps instead. The reason for this is that the pipeline code itself is always executed on the master!
The correct (pseudo) code, using the readFile step, would be like:
stage ('Stage 1'){
    node('Agent_1') {
        def String my_line = readFile("tmp.txt")
        …
    }
}
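If you need only the first line, as in the original snippet, you could split the result of readFile, for example (a sketch, assuming the same Agent_1 node):
stage ('Stage 1'){
    node('Agent_1') {
        // readFile runs on the node that owns the current workspace, unlike java.io.File
        String my_line = readFile('tmp.txt').readLines().get(0)
        echo my_line
    }
}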

Syntax error while using backslash in Jenkinsfile

I'm trying to make a simple pipeline on Jenkins to remove files from a few directories from time to time. I decided not to create a Python script with a Jenkinsfile as a new project; instead I'm trying to define the pipeline script directly in a Jenkins job.
pipeline {
    agent any
    stages {
        stage('Check virtualenv') {
            steps {
                sh """
                    rm -r /mnt/x/some/directory/Problem\ 1.0/path
                """
            }
        }
    }
}
And I got the error WorkflowScript: 4: unexpected char: '\'. How can I use a path with whitespace in it without using a backslash? Any other ideas for how to define the path?
The '\' character is a special character in Groovy. If you tried to compile this kind of code with the normal Groovy compiler, it would give you a better error message. The easiest way to handle it would be to escape it:
"""
rm -r /mnt/x/some/directory/Problem\\ 1.0/path
"""
You can modify the shell command as follows:
sh """
rm -r /mnt/x/some/directory/Problem""" + """ 1.0/path"""
Provide space before 1.0 as required. Hope this helps.
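Another option, just as a sketch and not from the answers above, is to drop the backslash entirely and let the shell handle the space by quoting the path inside the sh step:
sh """
    rm -r '/mnt/x/some/directory/Problem 1.0/path'
"""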

Load file with environment variables Jenkins Pipeline

I am doing a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source the variables:
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it returns "not found":
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I can run the same command when I log in via SSH, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'
    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"
    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
I was thinking about not using a pipeline (but then I would not be able to use my Jenkinsfile).
Or making separate jobs for staging and production using the EnvInject plugin (but I lose my stage view).
Or using withEnv (but the code gets big, because today I am working with 12 env vars).
One way you could load environment variables from a file is to load a Groovy file.
For example:
Let's say you have a Groovy file in '$JENKINS_HOME/.envvars' called 'stacktest-staging.groovy'.
Inside this file, you define the two environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load this in using
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
From the comments on the accepted answer:
Don't use the global env, but use the withEnv construct instead; e.g. see issue #9, "don't set env vars with global env", in the top 10 best practices for the Jenkins Pipeline plugin.
In the following example, VAR1 is a plain Java string (no Groovy variable expansion) and VAR2 is a Groovy string (so the variable someGroovyVar is expanded).
The passed script is a plain Java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echos access the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE',
             "VAR2=${someGroovyVar}"
    ]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
For secrets/passwords you can use the Credentials Binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a registered username/password secret in the Jenkins settings.
stage('Push') {
    withCredentials([usernamePassword(
        credentialsId: 'CREDENTIALS_ID1',
        passwordVariable: 'PASSWORD',
        usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see the Credentials plugin.
For completeness: most of the time we need the secrets in environment variables, since we use them from shell scripts, so we combine withCredentials and withEnv as follows:
stage('Push') {
    withCredentials([usernamePassword(
        credentialsId: 'CREDENTIALS_ID1',
        passwordVariable: 'PASSWORD',
        usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}",
                 "ENV_PASSWORD=${PASSWORD}"
        ]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the Pipeline Utility Steps plugin, which provides the readProperties method (for reference, see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the example there, the keys are stored in an array and then used to retrieve the values.
But in that case, in production, the problem is that if we later add any variable to the properties file, that variable needs to be added to the array in the Jenkinsfile as well.
To get rid of this tight coupling, we can write the code so that the Jenkins build environment automatically picks up all the keys currently present in the properties file. Here is an example for reference:
def loadEnvironmentVariables(path){
    def props = readProperties file: path
    keys = props.keySet()
    for(key in keys) {
        value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the client code looks like this:
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
With a declarative pipeline, you can do it in one line (replace path with your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
Using withEnv() to pass environment variables from a file, split by newline and cast to a List:
writeFile file: 'version.txt', text: 'version=6.22.0'
withEnv(readFile('version.txt').split('\n') as List) {
    sh "echo ${version}"
}
If you are using Jenkins 2.0, you can load the properties file (which contains all the required environment variables along with their corresponding values), read all the environment variables listed there automatically, and inject them into the Jenkins-provided env entity.
Here is a method that performs the action described above.
def loadProperties(path) {
    properties = new Properties()
    File propertiesFile = new File(path)
    properties.load(propertiesFile.newDataInputStream())
    Set<Object> keys = properties.keySet();
    for(Object k:keys){
        String key = (String)k;
        String value = (String) properties.getProperty(key)
        env."${key}" = "${value}"
    }
}
To call this method, we need to pass the path of the properties file as a string. For example, in our Jenkinsfile, using a Groovy script, we can call it like this:
path = "${workspace}/pic_env_vars.properties"
loadProperties(path)
Please ask me if you have any doubt
Here is a complete example of externalizing environment variables and loading them in Jenkins pipeline execution. The pipeline is written in a declarative style.
stage('Reading environment variable defined in groovy file') {
    steps {
        script {
            load "./pipeline/basics/extenvvariable/env.groovy"
            echo "${env.env_var1}"
            echo "${env.env_var2}"
        }
    }
}
Complete code example:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/Jenkinsfile
The variables are loaded from a Groovy file placed alongside the pipeline code:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/env.groovy
This pattern comes in very handy when you are creating a generic pipeline that can be used across teams.
You can externalize the dependent variables in such a Groovy file, and each team can define its values according to its ecosystem.
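For reference, the loaded env.groovy only needs to set entries on the env map, along these lines (a hypothetical sketch; see the linked repository for the actual file):
env.env_var1 = "value1"
env.env_var2 = "value2"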
Another solution is to use a custom method that does not require approving extra permissions, such as for new Properties(), which leads to this error before it is allowed:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.util.Properties
and does not require extra plugin methods such as readProperties.
Here is a method that reads a simple file named env_vars in this format:
FOO=bar
FOO2=bar
pipeline {
    <... skipped lines ...>
    script {
        loadEnvironmentVariablesFromFile("env_vars")
        echo "show time! ${FOO} ${FOO2}"
    }
    <... skipped lines ...>
}
private void loadEnvironmentVariablesFromFile(String path) {
    def file = readFile(path)
    file.split('\n').each { envLine ->
        def (key, value) = envLine.tokenize('=')
        env."${key}" = "${value}"
    }
}
