How to retrieve current workspace using Jenkins Pipeline Groovy script? - jenkins

I am trying to get the current workspace of my Jenkins build using a Groovy pipeline script:
node('master') {
    // PULL IN ENVIRONMENT VARIABLES
    // Jenkins makes these variables available for each job it runs
    def buildNumber = env.BUILD_NUMBER
    def workspace = env.WORKSPACE
    def buildUrl = env.BUILD_URL
    // PRINT ENVIRONMENT TO JOB
    echo "workspace directory is ${workspace}"
    echo "build URL is ${env.BUILD_URL}"
}
It returns:
[Pipeline] Allocate node : Start
Running on master in /Users/Shared/Jenkins/Home/jobs/test/workspace
[Pipeline] node {
[Pipeline] echo
workspace directory is null
[Pipeline] echo
build URL is http://localhost:8080/job/test/5/
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
Finished: SUCCESS

For me, just ${WORKSPACE} worked, without even initializing the variable workspace.
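A minimal sketch of that approach, assuming a scripted pipeline running inside a node block (the shell command is only illustrative):
node {
    // WORKSPACE resolves from the build environment once a workspace is allocated
    echo "workspace directory is ${WORKSPACE}"
    sh "ls ${WORKSPACE}"
}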

There is no variable included for that yet, so you have to use the shell-out-and-read-file method:
sh 'pwd > workspace'
workspace = readFile('workspace').trim()
Or (if running on master node):
workspace = pwd()

I think you can also execute the pwd() function on the particular node:
node {
def PWD = pwd();
...
}

A quick note for anyone who is using bat in the job and needs to access Workspace:
It won't work.
As mentioned in JENKINS-33511 (https://issues.jenkins-ci.org/browse/JENKINS-33511), $WORKSPACE only works with PowerShell, so your code should use the powershell step for execution:
stage('Verifying Workspace') {
powershell label: '', script: 'dir $WORKSPACE'
}

I have successfully used it as shown below in a Jenkinsfile:
steps {
script {
def reportPath = "${WORKSPACE}/target/report"
...
}
}

You can find the answer in the job-dsl-plugin code.
Basically you can do something like this:
readFileFromWorkspace('src/main/groovy/com/groovy/jenkins/scripts/enable_safehtml.groovy')
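For context, a minimal seed-job sketch, assuming the Job DSL plugin; the job name, build step, and file path below are illustrative, not taken from the question:
// Job DSL seed script: inline a file from the seed job's workspace into the generated job
job('example-generated-job') {
    steps {
        // shell() is a standard Job DSL build step; the path is hypothetical
        shell(readFileFromWorkspace('scripts/build.sh'))
    }
}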

In Jenkins pipeline script, I am using
targetDir = workspace
Works perfectly for me. No need to use ${WORKSPACE}.

The key is that this only works when used within double quotes instead of single quotes. Below is my code, and it worked!
script {
echo 'Entering Stage - Nexus Upload'
def artefactPath = "${WORKSPACE}/build/deploy/identityiq.war"
echo "printing the path ${artefactPath}"
}

Related

How to read property from config file inside Jenkins pipeline using Config File Provider Plugin

I want to parametrize my Jenkins pipeline with a simple properties config file
skip_tests=true
that I've added to the Jenkins Config File Management.
In my pipeline I'm importing this file and trying to read from it using the Jenkins Config File Provider Plugin.
node('my-swarm') {
MY_CONFIG = '27206b95-d69b-4494-a430-0a23483a6408'
try {
stage('prepare') {
configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'skip_tests')]) {
echo $skip_tests
assert $skip_tests == 'true'
}
}
} catch (Exception e) {
currentBuild.result = 'FAILURE'
print e
}
}
This results in an error:
provisioning config files...
copy managed file [my.properties] to file:/home/jenkins/build/workspace/my-workspace#tmp/config7043792000148664559tmp
[Pipeline] {
[Pipeline] }
Deleting 1 temporary files
[Pipeline] // configFileProvider
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
groovy.lang.MissingPropertyException: No such property: $skip_tests for
class: groovy.lang.Binding
Any ideas what I'm doing wrong here?
With the help of the other answers and "How to read properties file from Jenkins 2.0 pipeline script", I found the following code to work:
configFileProvider([configFile(fileId: "$PBD1_CONFIG", variable: 'configFile')]) {
def props = readProperties file: "$configFile"
def skip_tests = props['skip_tests']
if (skip_tests == 'true') {
print 'skipping tests'
} else {
print 'running tests'
}
}
I had to use readProperties from Jenkins' Pipeline Utility Steps Plugin.
Since the file is in property format you can use it in a shell step:
sh """
source ${MY_CONFIG}
.
.
.
"""
You would need to export the properties that need to be available to programs the shell calls (e.g. Maven).
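For instance, a minimal sketch, assuming ${MY_CONFIG} resolves to the path of the copied properties file (plain KEY=value lines) and that a Maven build is what consumes the values (mvn verify is illustrative):
sh """
    set -a              # auto-export every variable defined while sourcing
    . ${MY_CONFIG}      # source the properties file
    set +a
    mvn verify          # Maven now sees skip_tests (and friends) in its environment
"""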
You are misusing the Groovy GString: you should either wrap $skip_tests in double quotes or use skip_tests directly.
configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'skip_tests')]) {
echo skip_tests
assert skip_tests == 'true'
echo "$skip_tests"
assert "$skip_tests" == 'true'
}
Note: the value of skip_tests is the file path of the config file, which is copied from the master to the job's workspace. It's not the content of the config file.
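If you need the content rather than the path, a minimal sketch, assuming readFile accepts the temp-file path exposed by configFileProvider (CONFIG_PATH is just a name chosen here):
configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'CONFIG_PATH')]) {
    // CONFIG_PATH holds the path of the copied temp file, not its content
    def content = readFile(env.CONFIG_PATH)
    echo "config file content: ${content}"
}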

Jenkins Pipeline Conditional Stage based on Environment Variable

I want to create a Jenkins (v2.126) Declarative syntax pipeline, which has stages with when() clauses checking the value of an environment variable. Specifically I want to set a Jenkins job parameter (so 'build with parameters', not pipeline parameters) and have this determine if a stage is executed.
I have stage code like this:
stage('plan') {
when {
environment name: ExecuteAction, value: 'plan'
}
steps {
sh 'cd $dir && $tf plan'
}
}
The parameter name is ExecuteAction. However, when ExecuteAction is set via a Job "Choice" parameter to plan, this stage does not run. I can see the appropriate value is coming in via an environment variable by adding this debug stage:
stage('debug') {
steps {
sh 'echo "ExecuteAction = $ExecuteAction"'
sh 'env'
}
}
And I get Console output like this:
[Pipeline] stage
[Pipeline] { (debug)
[Pipeline] sh
[workspace] Running shell script
+ echo 'ExecuteAction = plan'
ExecuteAction = plan
[Pipeline] sh
[workspace] Running shell script
+ env
...
ExecuteAction=plan
...
I am using the when declarative syntax from the Jenkins Pipeline Syntax documentation, under the "when" section, built-in conditions.
Jenkins is running on Gnu/Linux.
Any ideas what I might be doing wrong?
Duh! You need to quote the environment variable's name in the when clause.
stage('plan') {
when {
environment name: 'ExecuteAction', value: 'plan'
}
steps {
sh 'cd $dir && $tf plan'
}
}
I believe you need to use params instead of environment. Try the following:
when {
expression { params.ExecuteAction == 'plan' }
}
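Putting the two together, a minimal sketch, assuming a declarative pipeline in which ExecuteAction is declared as a choice parameter (the extra choice values are illustrative; the sh step is copied from the question):
pipeline {
    agent any
    parameters {
        choice(name: 'ExecuteAction', choices: ['plan', 'apply', 'none'], description: 'Action to run')
    }
    stages {
        stage('plan') {
            when {
                // the parameter name must be quoted; expression { params.ExecuteAction == 'plan' } also works
                environment name: 'ExecuteAction', value: 'plan'
            }
            steps {
                sh 'cd $dir && $tf plan'
            }
        }
    }
}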

Create new Jenkins jobs using Pipeline Job and Groovy script

I have a Jenkins Pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy)
I want to create a new job (named name_group_taskNumber) every time I build the main Pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the Pipeline script look like?
node{
stage('Build'){
def pipeline = load "CreateJob.groovy"
pipeline.run()
}
}
You can use and configure a shared library, like this one (a git repo): https://github.com/lvthillo/shared-library . You need to configure it in your Jenkins global configuration.
It contains a vars/ folder, where you can manage pipelines and groovy scripts, like my slackNotifier.groovy. That script is just a groovy script to print the build result in Slack.
In the jenkins pipeline job we will import our shared library:
@Library('name-of-shared-pipeline-library') _
mavenPipeline {
//define parameters
}
In the case above the pipeline itself is also in the shared library, but this isn't necessary.
You can just write your pipeline in the job itself and call only the function from the shared library, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
echo "Hello, ${name}."
}
And in your pipeline:
final Lib= library('my-shared-library')
...
stage('stage name'){
echo "output"
Lib.sayHello.groovy('Peter')
}
...
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}

def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it by using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): add another build step in the pipeline script and pass the parameters that are to be considered.
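As an alternative sketch (using the Job DSL plugin's jobDsl step, not the Jobcopy Builder plugin mentioned above), assuming that plugin is installed and script approval is satisfied; the generated job's body is illustrative:
node {
    stage('Create job') {
        // build the new job's name from the current build's parameters (from the question)
        def newJobName = "${params.name}_${params.group}_${params.taskNumber}"
        jobDsl scriptText: """
            job('${newJobName}') {
                steps {
                    shell('echo Hello from ${newJobName}')
                }
            }
        """
    }
}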

iterate over environment variables in Jenkins Pipeline Groovy [duplicate]

Given a jenkins build pipeline, jenkins injects a variable env into the node{}. Variable env holds environment variables and values.
I want to print all env properties within the jenkins pipeline. However, I do not know all env properties ahead of time.
For example, environment variable BRANCH_NAME can be printed with code
node {
echo "${env.BRANCH_NAME}"
...
But again, I don't know all variables ahead of time. I want code that handles that, something like
node {
for(e in env){
echo e + " is " + ${e}
}
...
which would echo something like
BRANCH_NAME is myBranch2
CHANGE_ID is 44
...
I used Jenkins 2.1 for this example.
According to Jenkins documentation for declarative pipeline:
sh 'printenv'
For Jenkins scripted pipeline:
echo sh(script: 'env|sort', returnStdout: true)
The above also sorts your env vars for convenience.
Another, more concise way:
node {
echo sh(returnStdout: true, script: 'env')
// ...
}
cf. https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#code-sh-code-shell-script
The following works:
@NonCPS
def printParams() {
env.getEnvironment().each { name, value -> println "Name: $name -> Value $value" }
}
printParams()
Note that it will most probably fail on first execution and require you to approve various groovy methods to run in the Jenkins sandbox. This is done in "Manage Jenkins / In-process Script Approval".
The list I got included:
BUILD_DISPLAY_NAME
BUILD_ID
BUILD_NUMBER
BUILD_TAG
BUILD_URL
CLASSPATH
HUDSON_HOME
HUDSON_SERVER_COOKIE
HUDSON_URL
JENKINS_HOME
JENKINS_SERVER_COOKIE
JENKINS_URL
JOB_BASE_NAME
JOB_NAME
JOB_URL
You can accomplish the result using sh/bat step and readFile:
node {
sh 'env > env.txt'
readFile('env.txt').split("\r?\n").each {
println it
}
}
Unfortunately env.getEnvironment() returns a very limited map of environment variables.
Why all this complexity?
sh 'env'
does what you need (under *nix)
Cross-platform way of listing all environment variables:
if (isUnix()) {
    sh 'env'
}
else {
    bat 'set'
}
Here's a quick script you can add as a pipeline job to list all environment variables:
node {
echo(env.getEnvironment().collect({environmentVariable -> "${environmentVariable.key} = ${environmentVariable.value}"}).join("\n"))
echo(System.getenv().collect({environmentVariable -> "${environmentVariable.key} = ${environmentVariable.value}"}).join("\n"))
}
This will list both system and Jenkins variables.
I use the Blue Ocean plugin and did not like each environment entry getting its own block. I want one block with all the lines.
Prints poorly:
sh 'echo `env`'
Prints poorly:
sh 'env > env.txt'
for (String i : readFile('env.txt').split("\r?\n")) {
println i
}
Prints well:
sh 'env > env.txt'
sh 'cat env.txt'
Prints well: (as mentioned by @mjfroehlich)
echo sh(script: 'env', returnStdout: true)
The pure Groovy solutions that read the global env variable don't print all environment variables (e. g. they are missing variables from the environment block, from withEnv context and most of the machine-specific variables from the OS). Using shell steps it is possible to get a more complete set, but that requires a node context, which is not always wanted.
Here is a solution that uses the getContext step to retrieve and print the complete set of environment variables, including pipeline parameters, for the current context.
Caveat: Doesn't work in Groovy sandbox. You can use it from a trusted shared library though.
def envAll = getContext( hudson.EnvVars )
echo envAll.collect{ k, v -> "$k = $v" }.join('\n')
Showing all variables differs between Windows and Unix systems, so you can define a function and call it every time.
def showSystemVariables(){
if(isUnix()){
sh 'env'
} else {
bat 'set'
}
}
I will call this function first to show all variables in all pipeline scripts:
stage('1. Show all variables'){
steps {
script{
showSystemVariables()
}
}
}
The easiest and quickest way is to use the following URL to print all environment variables:
http://localhost:8080/env-vars.html/
The answers above are now antiquated due to the new pipeline syntax. The following prints out the environment variables.
script {
sh 'env > env.txt'
String[] envs = readFile('env.txt').split("\r?\n")
for(String vars: envs){
println(vars)
}
}
Includes both system and build environment vars:
sh script: "printenv", label: 'print environment variables'
If you really want to loop over the env list, just do:
def envs = sh(returnStdout: true, script: 'env').split('\n')
envs.each { name ->
println "Name: $name"
}
I found this to be the easiest way:
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    stages {
        stage('hello world') {
            steps {
                sh 'env'
            }
        }
    }
}
You can get all variables from your jenkins instance. Just visit:
${jenkins_host}/env-vars.html
${jenkins_host}/pipeline-syntax/globals
ref: https://www.jenkins.io/doc/pipeline/tour/environment/
node {
sh 'printenv'
}
You can use sh 'printenv'
stage('1') {
sh "printenv"
}
Another way to get exactly the output mentioned in the question:
envtext = "printenv".execute().text
envtext.split('\n').each {
    envvar = it.split("=")
    println envvar[0] + " is " + envvar[1]
}
This can easily be extended to build a map with a subset of env vars matching a criterion:
envdict = [:]
envtext = "printenv".execute().text
envtext.split('\n').each {
    envvar = it.split("=")
    if (envvar[0].startsWith("GERRIT_"))
        envdict.put(envvar[0], envvar[1])
}
envdict.each { println it.key + " is " + it.value }
I suppose you needed this in the form of a script, but if someone else just wants to have a look through the Jenkins GUI, that list can be found by selecting the "Environment Variables" section in the contextual left menu of every build:
Select project => Select build => Environment Variables

Using loaded script in Jenkinsfile

I had a script that was being executed as a Jenkins pipeline and it was working fine. I wanted to reuse it for multiple environments, so I moved all the code into functions and load them from multiple files.
Library file - Healthcheck:
#!groovy
@NonCPS
def check(type) {
    stage "prepare"
    echo "TEST1"
    props = readProperties file: 'build.properties'
    echo "TEST2"
    stage "queues"
    checkQueues()
}

@NonCPS
def checkQueues() {
    txt = "http://ltxl0207.sgdcelab.sabre.com:8161/api/jolokia/read/org.apache.activemq:brokerName=localhost,destinationName=!/tss!/trip_source_updates,destinationType=Queue,type=Broker/QueueSize".toURL().getText(requestProperties: [Authorization: "Basic " + "admin:admin".getBytes().encodeBase64().toString()])
    json = new groovy.json.JsonSlurper().parseText(txt)
    echo "Got response: " + txt
}
return this;
File that uses it - Healthcheck-dev:
#!groovy
node {
checkout scm
healthcheck = load 'Healthcheck'
healthcheck.check('DEV')
}
And the trouble is that the script doesn't get past readProperties and the prepare stage; it just stops there, ignoring the queues stage:
[Pipeline] load
[Pipeline] { (Healthcheck)
[Pipeline] }
[Pipeline] // load
[Pipeline] stage (prepare)
Entering stage prepare
Proceeding
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
What am I doing wrong? When I move the code to a single file, it works correctly.
Have you tried to execute your pipeline with your two files ("Healthcheck" & "Healthcheck-dev") in the same repository? Because when you load a script from another SCM repo, as you seem to be doing, Jenkins actually creates a separate directory, something like @script, for your loaded script.
You might need to do something like this to load your general script from the correct workspace:
node {
    checkout scm
    dir("${projectWorkspace}@script") {
        healthcheck = load 'Healthcheck'
        healthcheck.check('DEV')
    }
}
with ${projectWorkspace} being the original workspace for your build.
