Pass variables between Jenkins stages

I want to pass a variable that I read in stage A to stage B somehow. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but I haven't been successful with that. How can I set it up properly?
To get it working I tried a lot of things, and read that I should use """ instead of ''' to start the shell block and escape variables as \${foo}, for example.
Below is what I have as a pipeline:
#!/usr/bin/env groovy
pipeline {
    agent { node { label 'php71' } }
    environment {
        packageName='my-package'
        packageVersion=''
        groupId='vznl'
        nexus_endpoint='http://nexus.devtools.io'
        nexus_username='jenkins'
        nexus_password='J3nkins'
    }
    stages {
        // Package dependencies
        stage('Install dependencies') {
            steps {
                sh '''
                    echo Skip composer installation
                    #composer install --prefer-dist --optimize-autoloader --no-interaction
                '''
            }
        }
        // Unit tests
        stage('Unit Tests') {
            steps {
                sh '''
                    echo Running PHP code coverage tests...
                    #composer test
                '''
            }
        }
        // Create artifact
        stage('Package') {
            steps {
                echo 'Create package refs'
                sh """
                    mkdir -p ./build/zpk
                    VERSIONTAG=\$(grep 'version' composer.json)
                    REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
                    if [[ \${VERSIONTAG} =~ \${REGEX} ]]
                    then
                        env.packageVersion=\${BASH_REMATCH[1]}
                        /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
                    else
                        echo "No version found!"
                        exit 1
                    fi
                """
            }
        }
        // Publish ZPK package to Nexus
        stage('Publish packages') {
            steps {
                echo "Publish ZPK Package"
                sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
                archive includes: './build/**/*.{zpk,rpm,deb}'
            }
        }
    }
}
As you can see, the packageVersion that I read in the Package stage needs to be used in the Publish stage as well.
General tips on the pipeline are of course always welcome too.

A problem in your code is that you are assigning the version to an environment variable within the sh step. That step executes in its own isolated process, which inherits the parent process's environment variables.
However, the only way to pass data back to the parent is through STDOUT/STDERR or the exit code. Since you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context persists and the variables remain available in the subsequent stage. A working example is below. Note that any attempt to put this inside a parallel block is likely to fail, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy
pipeline {
    environment {
        AGENT_INFO = ''
    }
    agent {
        docker {
            image 'alpine'
            reuseNode true
        }
    }
    stages {
        stage('Collect agent info') {
            steps {
                echo "Current agent info: ${env.AGENT_INFO}"
                script {
                    def agentInfo = sh script: 'uname -a', returnStdout: true
                    println "Agent info within script: ${agentInfo}"
                    AGENT_INFO = agentInfo.replace("\n", "")
                    env.AGENT_INFO = AGENT_INFO
                }
            }
        }
        stage("Print agent info") {
            steps {
                script {
                    echo "Collected agent info: ${AGENT_INFO}"
                    echo "Environment agent info: ${env.AGENT_INFO}"
                }
            }
        }
    }
}
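Applied to the question's Package stage, the same pattern could look something like the following minimal sketch (the sed command and the overall wiring are assumptions, not taken from the original pipeline); later stages can then read env.packageVersion:
stage('Package') {
    steps {
        script {
            // Extract the version in the shell and print only the matched group back on stdout
            def version = sh(
                script: '''sed -n 's/.*"version": "\\([0-9][0-9.]*\\)".*/\\1/p' composer.json''',
                returnStdout: true
            ).trim()
            if (!version) {
                error 'No version found!'
            }
            // Values set on env persist for the rest of the run, including the Publish stage
            env.packageVersion = version
        }
        sh "mkdir -p ./build/zpk"
        sh "/usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}"
    }
}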

Another option, which doesn't involve script and stays declarative, is to stash things in a small temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) if the workload is spread across parallel or distributed nodes as needed.
Something like:
pipeline {
    agent any
    stages {
        stage('first stage') {
            steps {
                // Write out any environment variables you like to a temporary file
                sh 'echo export FOO=baz > myenv'
                // Stash away for later use
                stash 'myenv'
            }
        }
        stage("later stage") {
            steps {
                // Unstash the temporary file and apply it
                unstash 'myenv'
                // use the unstashed vars
                sh 'source myenv && echo $FOO'
            }
        }
    }
}
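If you also need those values in non-shell steps, one option (a sketch, not part of the original answer; it assumes the file holds simple KEY=value lines written as above) is to read the unstashed file back into env from a script block:
stage("even later stage") {
    steps {
        unstash 'myenv'
        script {
            // Strip the leading "export " and load each KEY=value pair into env
            for (line in readFile('myenv').readLines()) {
                def kv = line.replaceFirst(/^export\s+/, '').split('=', 2)
                if (kv.length == 2) {
                    env."${kv[0]}" = kv[1]
                }
            }
        }
        echo "FOO is ${env.FOO}"
    }
}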

Related

Jenkins pipeline script global variable

I am learning Jenkins and am working on a sample pipeline:
pipeline {
    agent any
    stages {
        stage('Stage1') {
            steps {
                bat '''
                    cd C:/Users/roger/project/
                    python -u script1.py
                '''
            }
        }
        stage('Stage2') {
            steps {
                bat '''
                    cd C:/Users/roger/project/abc/
                    python -u script2.py
                '''
            }
        }
        stage('Stage3') {
            steps {
                bat '''
                    cd C:/Users/roger/project/abc/new_dir/
                    python -u demo.py
                '''
            }
        }
    }
}
Is there a way to store the project's base path C:/Users/roger/project/ as a variable, so that new paths can be appended to it instead of writing out the whole path each time?
How could I write the above stages so that I don't have to repeat the same base path in each stage?
You have several options. The easiest is to define the parameter inside the environment directive, which makes it available to all stages in the pipeline and also loads it into the execution environment of any interpreter step such as sh, bat and powershell, so the scripts you execute can read it as an environment variable.
In addition, the environment directive supports credential parameters, which is very useful.
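For example, a short sketch of such a credential parameter (the credentials ID 'nexus-creds' is an assumption, not something from the question); for a username/password credential this also exposes NEXUS_CREDS_USR and NEXUS_CREDS_PSW:
environment {
    // Looks up the Jenkins credential with ID 'nexus-creds' and masks its value in the logs
    NEXUS_CREDS = credentials('nexus-creds')
}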
In your case it will look like:
pipeline {
    agent any
    environment {
        BASE_PATH = 'C:/Users/roger/project/'
    }
    stages {
        stage('Stage1') {
            steps {
                // Using the parameter as a runtime environment variable with bat syntax %%
                bat '''
                    cd %BASE_PATH%
                    python -u script1.py
                '''
            }
        }
        stage('Stage2') {
            steps {
                // Using groovy string interpolation to construct the command with the parameter value
                bat """
                    cd ${env.BASE_PATH}abc/
                    python -u script2.py
                """
            }
        }
    }
}
Another option is to use global variables defined at the top of the pipeline file, which behave like any Groovy variable and are available to all stages in your pipeline (but not to the execution environment of interpreter steps).
Something like:
BASE_PATH = 'C:/Users/roger/project/'
pipeline {
    agent any
    stages {
        stage('Stage1') {
            steps {
                // Using the parameter inside a dir step to change directory
                dir(BASE_PATH) {
                    bat 'python -u script1.py'
                }
            }
        }
        stage('Stage2') {
            steps {
                // Using groovy string interpolation to construct the command with the parameter value
                bat """
                    cd ${BASE_PATH}abc/
                    python -u script2.py
                """
            }
        }
    }
}

Jenkins using docker agent with environment declarative pipeline

I would like to install maven and npm via a docker agent using a Jenkins declarative pipeline. But when I use the script below, Jenkins throws the error shown below. It might be because of agent none, but how can I use a node with a docker agent in a declarative pipeline?
ERROR: Attempted to execute a step that requires a node context while
‘agent none’ was specified. Be sure to specify your own ‘node { ... }’
blocks when using ‘agent none’.
I tried to set agent any, but then I received the error "Still waiting to schedule task / Waiting for next available executor".
pipeline {
    agent none
    // environment {
    //     proxy = https://
    //     stable_revision = sh(script: 'curl -H "Authorization: Basic $base64encoded"
    // }
    stages {
        stage('Build') {
            agent {
                docker { image 'maven:3-alpine' }
            }
            steps {
                sh 'mvn --version'
                echo "$apigeeUsername"
                echo "Stable Revision: ${env.stable_revision}"
            }
        }
        stage('Test') {
            agent { docker { image 'maven:3-alpine' image 'node:8.12.0' } }
            environment {
                HOME = '.'
            }
            steps {
                script {
                    try {
                        sh 'npm install'
                        sh 'node --version'
                        //sh 'npm test/unit/*.js'
                    } catch(e) {
                        throw e
                    }
                }
            }
        }
        // stage('Policy-Code Analysis') {
        //     steps {
        //         sh "npm install -g apigeelint"
        //         sh "apigelint -s wiservice_api_v1/apiproxy/ -f codeframe.js"
        //     }
        // }
        stage('Promotion') {
            steps {
                timeout(time: 2, unit: 'DAYS') {
                    input 'Do you want to Approve?'
                }
            }
        }
        stage('Deployment') {
            steps {
                sh "mvn -f wiservice_api_v1/pom.xml install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} -Dapigee.config.options=update"
                //sh "mvn apigee-enterprise:install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} "
            }
        }
    }
}
Basically your error message tells you everything you need to know:
ERROR: Attempted to execute a step that requires a node context while
‘agent none’ was specified. Be sure to specify your own ‘node { ... }’
blocks when using ‘agent none’.
So what is the issue here? You use agent none for your pipeline, which means you do not specify a default agent for all stages. An agent is what executes a stage; if a stage has no agent it can't be executed, and that is your issue here.
The following two stages have no agent, which means there is no docker container, server, or other executor where they can run:
stage('Promotion') {
    steps {
        timeout(time: 2, unit: 'DAYS') {
            input 'Do you want to Approve?'
        }
    }
}
stage('Deployment') {
    steps {
        sh "mvn -f wiservice_api_v1/pom.xml install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} -Dapigee.config.options=update"
        //sh "mvn apigee-enterprise:install -Ptest -Dusername=${apigeeUsername} -Dpassword=${apigeePassword} "
    }
}
So you have to add agent { ... } to both stages separately, or use a global agent like the following and remove the agent declarations from your stages:
pipeline {
    agent {
        docker { image 'maven:3-alpine' }
    } ...
For further information, see the guide on setting up master and agent machines, distributed Jenkins builds, or the official documentation.
I think you meant to add agent any instead of agent none, because each stage requires at least one agent (either declared at the top for the pipeline or per stage).
Also, I see some more issues.
Your Test stage specifies two images for the same stage:
agent { docker { image 'maven:3-alpine' image 'node:8.12.0' } }
although the stage only executes npm commands. I believe only one of the images will actually be used.
To clarify a bit more on mkemmerz's answer, your Promotion stage is designed correctly. If you plan to have an input step in the pipeline, do not add an agent for the whole pipeline, because input steps block the executor context. See https://jenkins.io/blog/2018/04/09/whats-in-declarative/
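A minimal sketch of that layout (the image name and commands are just placeholders): agent none at the pipeline level, a docker agent on the stages that need one, and no agent on the stage that only waits for input, so the input step does not hold an executor.
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { docker { image 'maven:3-alpine' } }
            steps {
                sh 'mvn --version'
            }
        }
        stage('Promotion') {
            // No agent here: input and timeout do not require a node context
            steps {
                timeout(time: 2, unit: 'DAYS') {
                    input 'Do you want to Approve?'
                }
            }
        }
        stage('Deployment') {
            agent { docker { image 'maven:3-alpine' } }
            steps {
                sh 'mvn --version'
            }
        }
    }
}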

How to pass variables set in sh script to subsequent Jenkins Pipeline Steps

I have a Jenkins pipeline file where I need to call an sh script:
node {
    stage("Stage1") {
        checkout scm
        sh '''
            echo "Invoking the sh script"
            valueNeedstobepassed = "test"
        '''
    }
    stage('stage2') {
        // Need to refer to the "valueNeedstobepassed" variable in my pipeline step
    }
}
I am not able to reference the variable "valueNeedstobepassed" in stage 2.
Any help please?
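One way to approach this, sketched with the returnStdout pattern shown elsewhere on this page (the echo command stands in for whatever the shell script really produces):
node {
    stage("Stage1") {
        checkout scm
        // Capture the value from the shell's stdout instead of trying to set it inside the sh script
        valueNeedstobepassed = sh(script: 'echo "test"', returnStdout: true).trim()
    }
    stage('stage2') {
        // The Groovy variable is still in scope in the next stage of the same scripted pipeline
        echo "Value from Stage1: ${valueNeedstobepassed}"
    }
}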

Passing variables extracted from shell in Jenkinsfile

I am trying to pass variables extracted in a stage in Jenkinsfile between stages. For example:
stage('Dummy Stage') {
    sh '''#!/bin/bash -l
        export abc=`output of some command`
        .....
        .....
    '''
}
Now, how can I pass the variable abc to a subsequent stage? I have tried setting the variable by adding a def section at the top of the file, but it looks like it doesn't work. In the absence of a neater way, I am having to retype the commands.
Here is what I do to get the number of commits on master as a global environment variable:
pipeline {
    agent any
    environment {
        COMMITS_ON_MASTER = sh(script: "git rev-list HEAD --count", returnStdout: true).trim()
    }
    stages {
        stage("Print commits") {
            steps {
                echo "There are ${env.COMMITS_ON_MASTER} commits on master"
            }
        }
    }
}
You can use the longer form of the sh step and return the output (see the Pipeline documentation). Your variable should be defined outside the stages.
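The same capture also works with a plain Groovy variable defined outside the stages, similar to the global-variable approach shown earlier on this page; a minimal sketch:
// Defined outside the pipeline block, so every stage can read and write it
COMMIT_COUNT = ''
pipeline {
    agent any
    stages {
        stage('Collect') {
            steps {
                script {
                    COMMIT_COUNT = sh(script: 'git rev-list HEAD --count', returnStdout: true).trim()
                }
            }
        }
        stage('Report') {
            steps {
                echo "There are ${COMMIT_COUNT} commits on master"
            }
        }
    }
}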
You can use an imperatively created environment variable inside a script block in your stage steps, for example:
stage("Stage 1") {
steps {
script {
env.RESULT_ON_STAGE_1 = sh (
script: 'echo "Output of some command"',
returnStdout: true
)
}
echo "In stage 1: ${env.RESULT_ON_STAGE_1}"
}
}
stage("Stage 2") {
steps {
echo "In stage 2: ${env.RESULT_ON_STAGE_1}"
}
}
This guide explains use of environment variables in pipelines with examples.
My issue concerned having two sh steps, where one uses single quotes (where I set a variable) and the other uses double quotes (where I access env variables set in the Jenkinsfile, such as BUILD_ID).
Here's how I solved it.
script {
    env.TEST = sh(
        script:
        '''
        echo "TEST"
        ''',
        returnStdout: true
    )
    sh """
        echo ${env.BUILD_ID}
        echo ${env.TEST}
    """
}
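One small caveat: returnStdout keeps the surrounding whitespace and trailing newline, so it is common to trim the captured value, for example:
script {
    // trim() drops the trailing newline left by returnStdout
    env.TEST = sh(script: 'echo "TEST"', returnStdout: true).trim()
}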

Pipeline step having trouble resolving a file path

I am having trouble getting a shell command to complete in a stage I have defined:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
            }
        }
    }
}
The shell command issues a protractor call which takes a config file argument, but that file fails to be found when protractor tries to retrieve it.
If I take a look at the workspace directory where the repo is checked out by the checkout scm step, I can see the test directory is present, with the config file the sh step is referencing.
So I'm unsure why the file cannot be found.
I thought about trying to verify the files that can be seen around the time the protractor command is being issued.
So something like:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                def files = findFiles(glob: 'test/**/*.conf.js')
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
                echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
            }
        }
    }
}
But this doesn't work; I don't think findFiles can be used inside a step?
Can anyone offer any suggestions about what may be going on here?
Thanks
To do the debugging you were attempting (to see if the file is actually there), you could wrap the findFiles in a script block (making sure your echo comes before the step that fails), or use a basic find in an sh step like this:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                // you could use the unix find command instead of groovy's findFiles
                sh 'find test -name *.conf.js'
                // if you're using a non-dsl-step (like findFiles), you must wrap it in a script
                script {
                    def files = findFiles(glob: 'test/**/*.conf.js')
                    echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
                    sh '''
                        npm install
                        sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                    '''
                }
            }
        }
    }
}
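As an aside, the nested sh 'protractor ...' line inside the shell block looks like leftover pipeline syntax rather than a shell command; if the intent is simply to run protractor from the shell, the block would presumably look more like this (a sketch, keeping the original arguments):
sh '''
    npm install
    protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091
'''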
