My pipeline script is
VersionNumber([
versionNumberString : '1.0.${BUILD_DAY}',
projectStartDate : '1990-07-01',
PrefixVariable : ''
])
Through jobs it creates an environment variable. But in a pipeline, how can I echo the version number string?
Just assign it to an environment variable and use it:
environment {
VERSION = VersionNumber([
versionNumberString : '${BUILD_YEAR}.${BUILD_MONTH}.${BUILD_ID}',
projectStartDate : '2014-05-19'
]);
}
then you can output it to a file:
steps {
sh 'echo "$VERSION" > version.txt';
}
or to console:
steps {
sh 'echo "$VERSION"';
}
Wherever you use $VERSION, it will be replaced with your version number.
Try with the following code snippet:
environment {
VERSION = VersionNumber([projectStartDate: '2017-05-12', skipFailedBuilds: true, versionNumberString: '${YEARS_SINCE_PROJECT_START, XX}.${BUILD_MONTH, XX}.${BUILDS_THIS_MONTH}', versionPrefix: 'v']);
}
Here is a Jenkins Declarative Pipeline example:
pipeline {
agent any
environment {
XCODE_BUILD_NUMBER = VersionNumber(projectStartDate: '1970-01-01', versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMddHHmm"}', versionPrefix: '')
}
stages {
stage('Example Print') {
steps{
echo XCODE_BUILD_NUMBER
sh 'echo "add a dollar sign when using sh: $XCODE_BUILD_NUMBER"'
}
}
}
}
(update below)
I have a declarative pipeline job which can take an argument VERSION.
pipeline {
parameters {
string(name: VERSION, defaultValue: '')
}
// ...
}
If no VERSION is given, for example when GitLab sends a hook to this job, I want to compute it from git, so I do something like this:
stages {
stage('Prepare') {
steps {
// ...
if (! env.VERSION) {
VERSION = sh(script: "git describe", returnStdout: true).trim()
}
}
}
}
Now I want to "inject" this variable to
my build script. It needs to find "VERSION" in the environment variables
to the jenkins mail notificator. And get it to retreive ${VERSION} in subject or body text
I tried changing the above code to
stages {
stage('Prepare') {
steps {
// ...
if (! env.VERSION) {
env.VERSION = sh(script: "git describe", returnStdout: true).trim()
}
}
}
}
I got this error: groovy.lang.MissingPropertyException: No such property: VERSION for class: groovy.lang.Binding
I then tried to add an "environment" block below:
environment {
VERSION = ${VERSION}
}
but it didn't solve my problem.
I'm looking for any help to solve it.
UPDATE
I now have a working pipeline which looks like
pipeline {
agent any
parameters {
string(name: 'VERSION', defaultValue: '')
}
environment {
def VERSION = "${params.VERSION}"
}
stages {
stage('Prepare & Checkout') {
steps {
script {
if (! env.VERSION) {
VERSION = sh(script: "date", returnStdout: true).trim()
}
echo "** version: ${VERSION} **"
}
}
}
stage('Build') {
steps {
// sh "./build.sh"
echo "** version2: ${VERSION} **"
}
}
} // stages
post {
always {
mail to: 'foo@example.com',
subject: "SUCCESS: ${VERSION}",
body: """<html><body><p>SUCCESS</p></body></html>""",
mimeType: 'text/html',
charset: 'UTF-8'
deleteDir()
}
}
} // pipeline
I needed to add the "environment" block to be able to get $VERSION in all stages (not only in the one where it is set).
I still need to find a way to inject this $VERSION variable in the environment variables, so that my build script can find it.
If you want to inject the variable into the environment so that you can use it later, you could define another variable that is equal to env.VERSION or the output of the shell script. Then use that variable in your pipeline, e.g.:
def version = env.VERSION
pipeline {
parameters {
string(name: 'VERSION', defaultValue: '')
}
stages {
stage('Prepare') {
steps {
// ...
script {
if (!version) {
version = sh(script: "git describe", returnStdout: true).trim()
}
}
}
}
}
post {
success {
mail subject: "$version build succeeded", ...
}
}
}
If you want other jobs to be able to access the value of VERSION after the build is run, you can write it in a file and archive it.
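For instance, a minimal sketch of that approach (the file name version.txt is arbitrary, not something from the question):
steps {
    // write the computed version to a file and archive it so other jobs can pick it up
    writeFile file: 'version.txt', text: "${env.VERSION}"
    archiveArtifacts artifacts: 'version.txt'
}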
Edit:
In order for your script to be able to use the version variable, you can either make your script take version as a parameter or you can use the withEnv step.
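For example, a rough sketch of the withEnv option (./build.sh stands in for your existing build script, and version is the variable defined in the snippet above):
script {
    // expose the computed version to the child process environment for this block only
    withEnv(["VERSION=${version}"]) {
        sh './build.sh'   // build.sh can now read $VERSION from its environment
    }
}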
Assuming you are using parametrized pipelines, you should reference the variable as ${params.parameterName}.
Although parameters are available in env, they currently are created before the first time the pipeline is run; therefore you should access them via params:
In your case:
${params.VERSION}
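For example, a minimal illustration (the echo message is made up):
steps {
    echo "Requested version: ${params.VERSION}"
}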
I am using the declarative syntax for my pipeline, and would like to store the path to the workspace used in one of my stages, so that the same path can be used in a later stage.
I have seen I can call pwd() to get the current directory, but how do I assign to a variable to be used between stages?
EDIT
I have tried to do this by defining my own custom variable and using it with the ws directive like so:
pipeline {
agent { label 'master' }
stages {
stage('Build') {
steps {
script {
def workspace = pwd()
}
sh '''
npm install
bower install
gulp set-staging-node-env
gulp prepare-staging-files
gulp webpack
'''
stash includes: 'dist/**/*', name: 'builtSources'
stash includes: 'config/**/*', name: 'appConfig'
node('Protractor') {
dir('/opt/foo/deploy/') {
unstash 'builtSources'
unstash 'appConfig'
}
}
}
}
stage('Unit Tests') {
steps {
parallel (
"Jasmine": {
node('master') {
ws("${workspace}"){
sh 'gulp karma-tests-ci'
}
}
},
"Mocha": {
node('master') {
ws("${workspace}"){
sh 'gulp mocha-tests'
}
}
}
)
}
post {
success {
sh 'gulp combine-coverage-reports'
sh 'gulp clean-lcov'
publishHTML(target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: false,
reportDir: 'test/coverage',
reportFiles: 'index.html',
reportName: 'Test Coverage Report'
])
}
}
}
}
}
In the Jenkins build console, I see this happens:
[Jasmine] Running on master in /var/lib/jenkins/workspace/_Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ@2
[Pipeline] [Jasmine] {
[Pipeline] [Jasmine] ws
[Jasmine] Running in /var/lib/jenkins/workspace/_Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ@2@2
The original workspace allocated from the first stage is actually _Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ
So it doesn't look like it's working; what am I doing wrong here?
Thanks
pipeline {
agent none
stages {
stage('Stage-One') {
steps {
echo 'StageOne.....'
script{ name = 'StackOverFlow'}
}
}
stage('Stage-Two'){
steps{
echo 'StageTwo.....'
echo "${name}"
}
}
}
}
The above prints StackOverFlow in Stage-Two for echo "${name}".
You can also use sh "echo ${env.WORKSPACE}" to get the absolute path of the directory assigned to the build as a workspace.
You could put the value into an environment variable as described in this answer:
CURRENT_PATH = sh(
script: 'pwd',
returnStdout: true
).trim()
Which version are you running? Maybe you can just assign the WORKSPACE variable to an environment var?
Or did I totally misunderstand, and this is what you are looking for?
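For instance, a rough sketch of that idea (SAVED_WORKSPACE is an invented name, not a built-in variable):
stage('Build') {
    steps {
        script {
            // remember the workspace allocated to this stage so later stages can reuse it
            env.SAVED_WORKSPACE = env.WORKSPACE
        }
    }
}
// ...later: ws("${env.SAVED_WORKSPACE}") { sh 'gulp karma-tests-ci' }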
How do I pass variables between stages in a declarative pipeline?
In a scripted pipeline, I gather the procedure is to write to a temporary file, then read the file into a variable.
How do I do this in a declarative pipeline?
E.g. I want to trigger a build of a different job, based on a variable created by a shell action.
stage("stage 1") {
steps {
sh "do_something > var.txt"
// I want to get var.txt into VAR
}
}
stage("stage 2") {
steps {
build job: "job2", parameters[string(name: "var", value: "${VAR})]
}
}
If you want to use a file (since a script is the thing generating the value you need), you could use readFile as seen below. If not, use sh with the script option as seen below:
// Define a groovy local variable, myVar.
// A global variable without the def, like myVar = 'initial_value',
// was required for me in older versions of jenkins. Your mileage
// may vary. Defining the variable here maybe adds a bit of clarity,
// showing that it is intended to be used across multiple stages.
def myVar = 'initial_value'
pipeline {
agent { label 'docker' }
stages {
stage('one') {
steps {
echo "1.1. ${myVar}" // prints '1.1. initial_value'
sh 'echo hotness > myfile.txt'
script {
// OPTION 1: set variable by reading from file.
// FYI, trim removes leading and trailing whitespace from the string
myVar = readFile('myfile.txt').trim()
}
echo "1.2. ${myVar}" // prints '1.2. hotness'
}
}
stage('two') {
steps {
echo "2.1 ${myVar}" // prints '2.1. hotness'
sh "echo 2.2. sh ${myVar}, Sergio" // prints '2.2. sh hotness, Sergio'
}
}
// this stage is skipped due to the when expression, so nothing is printed
stage('three') {
when {
expression { myVar != 'hotness' }
}
steps {
echo "three: ${myVar}"
}
}
}
}
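For completeness, a sketch of the second option mentioned above (setting the variable straight from sh with returnStdout), which could replace the readFile script block in the example:
script {
    // OPTION 2: set the variable directly from the output of a shell command
    myVar = sh(returnStdout: true, script: 'echo hotness').trim()
}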
Simply:
pipeline {
agent any
parameters {
string(name: 'custom_var', defaultValue: '')
}
stages {
stage("make param global") {
steps {
script {
def tmp_param = sh(script: 'most amazing shell command', returnStdout: true).trim()
env.custom_var = tmp_param
}
}
}
stage("test if param was saved") {
steps {
echo "${env.custom_var}"
}
}
}
}
I had a similar problem, as I wanted one specific pipeline to provide variables and many other pipelines to consume them.
I created a my-set-env-variables pipeline
script
{
env.my_dev_version = "0.0.4-SNAPSHOT"
env.my_qa_version = "0.0.4-SNAPSHOT"
env.my_pp_version = "0.0.2"
env.my_prd_version = "0.0.2"
echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
I can reuse these variables in another pipeline, my-set-env-variables-test:
script
{
env.dev_version = "NOT DEFINED DEV"
env.qa_version = "NOT DEFINED QA"
env.pp_version = "NOT DEFINED PP"
env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
echo "PRE DEV version = ${env.dev_version}"
script
{
def variables = build job: 'my-set-env-variables'
def vars = variables.getBuildVariables()
//println "found variables" + vars
env.dev_version = vars.my_dev_version
env.qa_version = vars.my_qa_version
env.pp_version = vars.my_pp_version
env.prd_version = vars.my_prd_version
}
}
stage('next job') {
echo "NEXT JOB DEV version = ${env.dev_version}"
echo "NEXT JOB QA version = ${env.qa_version}"
echo "NEXT JOB PP version = ${env.pp_version}"
echo "NEXT JOB PRD version = ${env.prd_version}"
}
There is no need for (hidden plugin) parameter definitions or temp-file access. Sharing variables across stages can be accomplished by using global Groovy variables in a Jenkinsfile, like so:
#!/usr/bin/env groovy
def MYVAR
def outputOf(cmd) { return sh(returnStdout:true,script:cmd).trim(); }
pipeline {
agent any
stage("stage 1") {
steps {
MYVAR = outputOf('echo do_something')
sh "echo MYVAR has been set to: '${MYVAR}'"
}
}
stage("stage 2") {
steps {
sh '''echo "...in multiline quotes: "''' + MYVAR + '''" ... '''
build job: "job2", parameters[string(name: "var", value: MYVAR)]
}
}
}
I have enhanced the existing solution by correcting the syntax. I also used the hidden parameter plugin so that it does not show up as an extra parameter in the Jenkins UI. Works well :)
properties([parameters([[$class: 'WHideParameterDefinition', defaultValue: 'yoyo', name: 'hidden_var']])])
pipeline {
agent any
stages{
stage("make param global") {
steps {
script{
env.hidden_var = "Hello"
}
}
}
stage("test if param was saved") {
steps {
echo"About to check result"
echo "${env.hidden_var}"
}
}
}
}
I'm new to Jenkins Pipeline; I'm defining a declarative-syntax pipeline and I don't know if my problem can be solved, because I haven't found a solution.
In this example, I need to pass a variable to the Ansible plugin (in the old version I used an ENV_VAR or injected it from a file with the inject plugin); that variable comes from a script.
This is my ideal scenario (but it doesn't work because of the environment{} block):
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('Deploy') {
environment {
ANSIBLE_CONFIG = '${WORKSPACE}/chimera-ci/ansible/ansible.cfg'
VERSION = sh("python3.5 docker/get_version.py")
}
steps {
ansiblePlaybook credentialsId: 'example-credential', extras: '-e version=${VERSION}', inventory: 'development', playbook: 'deploy.yml'
}
}
}
}
I tried other ways to test how env vars work, based on another post, for example:
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('PREPARE VARS') {
steps {
script {
env['VERSION'] = sh(script: "python3.5 get_version.py")
}
echo env.VERSION
}
}
}
}
but "echo env.VERSION" return null.
Also tried the same example with:
- VERSION=python3.5 get_version.py
- VERSION=python3.5 get_version.py > props.file (and try to inject it, but didnt found how)
If this is not possible I will do it in the ansible role.
UPDATE
There is another "issue" in Ansible Plugin, to use vars in extra vars it must have double quotes instead of single.
ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
You can create variables before the pipeline block starts. You can have sh return stdout to assign to these variables. You don't have the same flexibility to assign to environment variables in the environment stanza. So substitute in python3.5 get_version.py where I have echo 0.0.1 in the script here (and make sure your python script just returns the version to stdout):
def awesomeVersion = 'UNKNOWN'
pipeline {
agent { label 'docker' }
stages {
stage('build') {
steps {
script {
awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
}
}
}
stage('output_version') {
steps {
echo "awesomeVersion: ${awesomeVersion}"
}
}
}
}
The output of the above pipeline is:
awesomeVersion: 0.0.1
In Jenkins 2.76 I was able to simplify the solution from @burnettk to:
pipeline {
agent { label 'docker' }
environment {
awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1')
}
stages {
stage('output_version') {
steps {
echo "awesomeVersion: ${awesomeVersion}"
}
}
}
}
Using the "pipeline utility steps" plugin, you can define general vars available to all stages from a properties file. For example, let props.txt as:
version=1.0
fix=alfa
and mix scripted and declarative Jenkins pipeline as:
def props
def VERSION
def FIX
def RELEASE
node {
props = readProperties file:'props.txt'
VERSION = props['version']
FIX = props['fix']
RELEASE = VERSION + "_" + FIX
}
pipeline {
agent any
stages {
stage('Build') {
steps {
echo "${RELEASE}"
}
}
}
}
A possible variation of the main answer is to provide the variable using another pipeline instead of an sh script.
example (set the variable pipeline) : my-set-env-variables pipeline
script
{
env.my_dev_version = "0.0.4-SNAPSHOT"
env.my_qa_version = "0.0.4-SNAPSHOT"
env.my_pp_version = "0.0.2"
env.my_prd_version = "0.0.2"
echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
(use these variables) in another pipeline, my-set-env-variables-test:
script
{
env.dev_version = "NOT DEFINED DEV"
env.qa_version = "NOT DEFINED QA"
env.pp_version = "NOT DEFINED PP"
env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
echo "PRE DEV version = ${env.dev_version}"
script
{
// call set variable job
def variables = build job: 'my-set-env-variables'
def vars = variables.getBuildVariables()
//println "found variables" + vars
env.dev_version = vars.my_dev_version
env.qa_version = vars.my_qa_version
env.pp_version = vars.my_pp_version
env.prd_version = vars.my_prd_version
}
}
stage('next job') {
echo "NEXT JOB DEV version = ${env.dev_version}"
echo "NEXT JOB QA version = ${env.qa_version}"
echo "NEXT JOB PP version = ${env.pp_version}"
echo "NEXT JOB PRD version = ${env.prd_version}"
}
For those who want the environment variable's key to be dynamic, the following code can be used:
stage('Prepare Environment') {
steps {
script {
def data = [
"k1": "v1",
"k2": "v2",
]
data.each { key ,value ->
env."$key" = value
// env[key] = value // Deprecated, this can be used as well, but need approval in sandbox ScriptApproval page
}
}
}
}
You can also dump all your vars into a file, and then use the '-e @file' syntax. This is very useful if you have many vars to populate.
steps {
echo "hello World!!"
sh """
var1: ${params.var1}
var2: ${params.var2}
" > vars
"""
ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', sudoUser: null, extras: '-e #vars'
}
You can also use library functions in the environment section, like so:
@Library('mylibrary') _ // contains functions.groovy with several functions.
pipeline {
environment {
ENV_VAR = functions.myfunc()
}
…
}
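For reference, assuming functions.groovy lives under vars/ in a standard shared library, it could look roughly like this (myfunc and its return value are illustrative, not from the original answer):
// vars/functions.groovy inside the shared library 'mylibrary'
def myfunc() {
    // compute and return the value that ends up in ENV_VAR
    return 'some-computed-value'
}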
I'm trying to create a declarative Jenkins pipeline script but having issues with simple variable declaration.
Here is my script:
pipeline {
agent none
stages {
stage("first") {
def foo = "foo" // fails with "WorkflowScript: 5: Expected a step # line 5, column 13."
sh "echo ${foo}"
}
}
}
However, I get this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: Expected a step @ line 5, column 13.
def foo = "foo"
^
I'm on Jenkins 2.7.4 and Pipeline 2.4.
The Declarative model for Jenkins Pipelines has a restricted subset of syntax that it allows in the stage blocks - see the syntax guide for more info. You can bypass that restriction by wrapping your steps in a script { ... } block, but as a result, you'll lose validation of syntax, parameters, etc within the script block.
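For example, a minimal sketch of that workaround (the git command is just an illustration):
steps {
    script {
        // arbitrary Groovy, such as variable declarations, is allowed inside the script block
        def gitHash = sh(returnStdout: true, script: 'git rev-parse --short HEAD').trim()
        echo "Commit: ${gitHash}"
    }
}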
I think the error is not coming from the specified line but from the first 3 lines. Try this instead:
node {
stage("first") {
def foo = "foo"
sh "echo ${foo}"
}
}
I think you had some extra lines that are not valid...
From the declarative pipeline model documentation, it seems that you have to use an environment declaration block to declare your variables, e.g.:
pipeline {
environment {
FOO = "foo"
}
agent none
stages {
stage("first") {
sh "echo ${FOO}"
}
}
}
Agree with @Pom12, @abayer. To complete the answer, you need to add a script block.
Try something like this:
pipeline {
agent any
environment {
ENV_NAME = "${env.BRANCH_NAME}"
}
// ----------------
stages {
stage('Build Container') {
steps {
echo 'Building Container..'
script {
if (ENVIRONMENT_NAME == 'development') {
ENV_NAME = 'Development'
} else if (ENVIRONMENT_NAME == 'release') {
ENV_NAME = 'Production'
}
}
echo 'Building Branch: ' + env.BRANCH_NAME
echo 'Build Number: ' + env.BUILD_NUMBER
echo 'Building Environment: ' + ENV_NAME
echo "Running your service with environemnt ${ENV_NAME} now"
}
}
}
}
In Jenkins 2.138.3 there are two different types of pipelines.
Declarative and Scripted pipelines.
"Declarative pipelines is a new extension of the pipeline DSL (it is basically a pipeline script with only one step, a pipeline step with arguments (called directives), these directives should follow a specific syntax. The point of this new format is that it is more strict and therefore should be easier for those new to pipelines, allow for graphical editing and much more.
scripted pipelines is the fallback for advanced requirements."
jenkins pipeline: agent vs node?
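As a rough side-by-side illustration of the two forms (both snippets only echo a message and are simplified sketches):
// Scripted pipeline: plain Groovy built around node/stage blocks
node {
    stage('Example') {
        echo 'Hello from a scripted pipeline'
    }
}

// Declarative pipeline: the structured pipeline { } DSL with directives
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello from a declarative pipeline'
            }
        }
    }
}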
Here is an example of using environment and global variables in a declarative pipeline. From what I can tell, environment variables are static after they are set.
def browser = 'Unknown'
pipeline {
agent any
environment {
//Use Pipeline Utility Steps plugin to read information from pom.xml into env variables
IMAGE = readMavenPom().getArtifactId()
VERSION = readMavenPom().getVersion()
}
stages {
stage('Example') {
steps {
script {
browser = sh(returnStdout: true, script: 'echo Chrome')
}
}
}
stage('SNAPSHOT') {
when {
expression {
return !env.JOB_NAME.equals("PROD") && !env.VERSION.contains("RELEASE")
}
}
steps {
echo "SNAPSHOT"
echo "${browser}"
}
}
stage('RELEASE') {
when {
expression {
return !env.JOB_NAME.equals("TEST") && !env.VERSION.contains("RELEASE")
}
}
steps {
echo "RELEASE"
echo "${browser}"
}
}
}//end of stages
}//end of pipeline
You are using a Declarative Pipeline which requires a script-step to execute Groovy code. This is a huge difference compared to the Scripted Pipeline where this is not necessary.
The official documentation says the following:
The script step takes a block of Scripted Pipeline and executes that
in the Declarative Pipeline.
pipeline {
agent none
stages {
stage("first") {
script {
def foo = "foo"
sh "echo ${foo}"
}
}
}
}
You can define the variable globally, but when using this variable you must write it inside a script block.
def foo="foo"
pipeline {
agent none
stages {
stage("first") {
script{
sh "echo ${foo}"
}
}
}
}
Try this declarative pipeline; it's working:
pipeline {
agent any
stages {
stage("first") {
steps{
script {
def foo = "foo"
sh "echo ${foo}"
}
}
}
}
}