I need to pass environment variables to my executable and my unit tests. This works locally but not on Jenkins; on Jenkins, my environment variable gets reset between Gradle tasks.
task setupEnv(type: Exec) {
    commandLine 'export', "ABC=def"
}

test {
    dependsOn 'setupEnv'
    scanForTestClasses = false
    include '**/*Test.*'
}
Note: I'm simplifying here for SO (I'm aware of the environment method in Gradle), but even with this simple example it works locally and not on Jenkins, meaning the *Test.java files see nothing for System.getenv("ABC"). I'm looking for a way to stop Jenkins from resetting environment variables between tasks.
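For reference, a minimal sketch of the in-build alternative the note alludes to (Gradle's environment method on the Test task); the variable name and value are taken from the example above, and this avoids relying on a shell export surviving between tasks:

// build.gradle -- minimal sketch: set the variable directly on the test task,
// so it is passed to the forked test JVM regardless of the CI shell environment
test {
    environment 'ABC', 'def'   // visible to tests via System.getenv("ABC")
    scanForTestClasses = false
    include '**/*Test.*'
}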
There are two options for setting dynamic environment variables in a pipeline. In both cases the variables become available globally (to all stages in the pipeline):
Set it in a stage:
stage('set date') {
    steps {
        script {
            env.ANY_NAME_OF_THE_SCRIPT = sh(returnStdout: true, script: "date +%Y-%m-%d").trim()
        }
    }
}
Set it in the environment block:
pipeline {
    agent {
        label 'some_label'
    }
    environment {
        HELLO = """${sh(
            returnStdout: true,
            script: 'echo hello'
        ).trim()}"""
    }
}
I have a question about the issue below; can someone please help me with it?
I want to pass Maven pom.xml properties from the shell in a Jenkins pipeline, where the properties need to be substituted by Maven and not by Groovy or the shell.
Example:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=${db.url} package'
            }
        }
    }
}
Here ${db.url} should be substituted by Maven, using the URL from the settings.xml properties, not by Groovy or the shell in the Jenkins pipeline.
I have tried different combinations, but they all give me errors in the Jenkins pipeline.
If the Maven property is a constant (some fixed URL) it is easy to pass, but when I want to pass a variable property (${db.url}) I cannot find any syntax that works.
If you want Maven to evaluate ${db.url}, it has to be escaped like this:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=\\${db.url} package'
            }
        }
    }
}
Now, if you look at what Jenkins prepares, the shell receives the escaped \${db.url}, so the literal ${db.url} is passed through to Maven, which resolves it itself.
If you don't escape it, the shell tries to expand ${db.url} and you get a "Bad substitution" error.
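To make the layers explicit, here is a small sketch (the stage body is assumed to be the same as above) of what each layer sees, with the failing unescaped variant for comparison:

// Groovy source:  'mvn -Doracle.db.url=\\${db.url} package'
// Shell receives:  mvn -Doracle.db.url=\${db.url} package
// Maven receives:  -Doracle.db.url=${db.url}   (resolved by Maven from its own properties)
sh 'mvn -Doracle.db.url=\\${db.url} package'

// Unescaped, the shell tries to expand ${db.url} itself and fails with "Bad substitution":
// sh 'mvn -Doracle.db.url=${db.url} package'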
I am unable to express the functionality circled in the attached image in Declarative Pipeline syntax.
PS: I am new to this; I searched other answers, but none matches my requirements.
For example, if there is a parameter in Jenkins named VERSION, the Maven command should become
clean deploy -B -s pathtosettings.xml -DVERSION=valueinparameter
Below is my current code.
Note: I want all the parameters added automatically; hard-coding -DVERSION=${params.VERSION} doesn't help me.
pipeline {
    agent any
    stages {
        stage('Checkout Scm') {
            steps {
                git 'ssh://git#XXXXXXXXXXXXXXXXXXXXXXXXX.git'
            }
        }
        stage('Maven Build 0') {
            steps {
                configFileProvider([configFile(fileId: '0c0631a5-6510-4b4a-833d-4b80fa67d5f3', targetLocation: 'settings.xml', variable: 'SETTINGS_XML')]) {
                    withMaven {
                        sh "mvn clean deploy -B -s ${SETTINGS_XML}"
                    }
                }
            }
        }
    }
    tools {
        jdk 'JDK_1.8'
    }
    parameters {
        string(name: 'VERSION', defaultValue: '3_12_0', description: 'version to be in maven')
    }
}
First, I don't think you need targetLocation for this.
To access your parameter value, you need to use the params prefix.
This is how I use configFileProvider to make it work:
configFileProvider([configFile(fileId: 'configFileId', variable: 'SETTINGS_XML')]) {
    sh "mvn clean deploy -s \$SETTINGS_XML -B -DVERSION=$params.VERSION"
}
With this, the variable that links the settings file is not interpolated by Groovy, so it is resolved correctly in my pipeline, and the version is substituted into the command. Don't forget to use a
'Maven settings.xml' type of file in the configFileProvider.
steps {
    script {
        foo = " "
        params.each { param ->
            foo = "${foo} -D${param.key}=${param.value} "
        }
    }
    configFileProvider([configFile(fileId: 'XXXX', targetLocation: 'settings.xml', variable: 'SETTINGS_XML')]) {
        withMaven {
            sh "mvn clean deploy -B -s ${SETTINGS_XML} ${foo}"
        }
    }
}
This is the only approach I found.
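For illustration, with the VERSION parameter from the question above (default value 3_12_0), the loop builds foo as roughly " -DVERSION=3_12_0 ", so the executed command ends up looking something like:

mvn clean deploy -B -s <provided settings.xml> -DVERSION=3_12_0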
Given a Jenkins build pipeline, Jenkins injects a variable env into the node{}. The variable env holds environment variables and their values.
I want to print all env properties within the Jenkins pipeline. However, I do not know all the env properties ahead of time.
For example, the environment variable BRANCH_NAME can be printed with the code
node {
    echo "${env.BRANCH_NAME}"
    ...
But again, I don't know all variables ahead of time. I want code that handles that, something like
node {
    for (e in env) {
        echo e + " is " + ${e}
    }
    ...
which would echo something like
BRANCH_NAME is myBranch2
CHANGE_ID is 44
...
I used Jenkins 2.1 for this example.
According to Jenkins documentation for declarative pipeline:
sh 'printenv'
For Jenkins scripted pipeline:
echo sh(script: 'env|sort', returnStdout: true)
The above also sorts your env vars for convenience.
Another, more concise way:
node {
    echo sh(returnStdout: true, script: 'env')
    // ...
}
cf. https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#code-sh-code-shell-script
The following works:
@NonCPS
def printParams() {
    env.getEnvironment().each { name, value -> println "Name: $name -> Value $value" }
}
printParams()
Note that it will most probably fail on first execution and require you to approve various Groovy methods to run in the Jenkins sandbox. This is done in "Manage Jenkins" / "In-process Script Approval".
The list I got included:
BUILD_DISPLAY_NAME
BUILD_ID
BUILD_NUMBER
BUILD_TAG
BUILD_URL
CLASSPATH
HUDSON_HOME
HUDSON_SERVER_COOKIE
HUDSON_URL
JENKINS_HOME
JENKINS_SERVER_COOKIE
JENKINS_URL
JOB_BASE_NAME
JOB_NAME
JOB_URL
You can accomplish the result using the sh/bat step and readFile:
node {
    sh 'env > env.txt'
    readFile('env.txt').split("\r?\n").each {
        println it
    }
}
Unfortunately, env.getEnvironment() returns a very limited map of environment variables.
Why all this complexity?
sh 'env'
does what you need (under *nix)
Cross-platform way of listing all environment variables:
if (isUnix()) {
    sh 'env'
}
else {
    bat 'set'
}
Here's a quick script you can add as a pipeline job to list all environment variables:
node {
    echo(env.getEnvironment().collect({ environmentVariable -> "${environmentVariable.key} = ${environmentVariable.value}" }).join("\n"))
    echo(System.getenv().collect({ environmentVariable -> "${environmentVariable.key} = ${environmentVariable.value}" }).join("\n"))
}
This will list both system and Jenkins variables.
I use the Blue Ocean plugin and did not like each environment entry getting its own block. I want one block with all the lines.
Prints poorly:
sh 'echo `env`'
Prints poorly:
sh 'env > env.txt'
for (String i : readFile('env.txt').split("\r?\n")) {
    println i
}
Prints well:
sh 'env > env.txt'
sh 'cat env.txt'
Prints well (as mentioned by @mjfroehlich):
echo sh(script: 'env', returnStdout: true)
The pure Groovy solutions that read the global env variable don't print all environment variables (e.g. they are missing variables from the environment block, from the withEnv context, and most of the machine-specific variables from the OS). Using shell steps it is possible to get a more complete set, but that requires a node context, which is not always wanted.
Here is a solution that uses the getContext step to retrieve and print the complete set of environment variables, including pipeline parameters, for the current context.
Caveat: Doesn't work in Groovy sandbox. You can use it from a trusted shared library though.
def envAll = getContext( hudson.EnvVars )
echo envAll.collect{ k, v -> "$k = $v" }.join('\n')
Showing all variables differs between Windows and Unix systems, so you can define a function and call it whenever you need it:
def showSystemVariables() {
    if (isUnix()) {
        sh 'env'
    } else {
        bat 'set'
    }
}
I call this function first to show all variables in every pipeline script:
stage('1. Show all variables') {
    steps {
        script {
            showSystemVariables()
        }
    }
}
The easiest and quickest way is to use the following URL to print all environment variables:
http://localhost:8080/env-vars.html/
The answers above are now antiquated due to the new pipeline syntax. The following prints out the environment variables:
script {
    sh 'env > env.txt'
    String[] envs = readFile('env.txt').split("\r?\n")
    for (String vars : envs) {
        println(vars)
    }
}
Includes both system and build environment vars:
sh script: "printenv", label: 'print environment variables'
If you really want to loop over the env list, just do:
def envs = sh(returnStdout: true, script: 'env').split('\n')
envs.each { name ->
    println "Name: $name"
}
I found this to be the easiest way:
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    stages {
        stage('hello world') {
            steps {
                sh 'env'
            }
        }
    }
}
You can get all variables from your Jenkins instance. Just visit:
${jenkins_host}/env-vars.html
${jenkins_host}/pipeline-syntax/globals
ref: https://www.jenkins.io/doc/pipeline/tour/environment/
node {
    sh 'printenv'
}
You can use sh 'printenv'
stage('1') {
    sh "printenv"
}
Another way to get exactly the output mentioned in the question:
envtext= "printenv".execute().text
envtext.split('\n').each
{ envvar=it.split("=")
println envvar[0]+" is "+envvar[1]
}
This can easily be extended to build a map with the subset of env vars matching a criterion:
envdict = [:]
envtext = "printenv".execute().text
envtext.split('\n').each {
    envvar = it.split("=")
    if (envvar[0].startsWith("GERRIT_"))
        envdict.put(envvar[0], envvar[1])
}
envdict.each { println it.key + " is " + it.value }
I suppose you needed this in the form of a script, but if someone else just wants to have a look, the list can be found by selecting the "Environment Variables" section in the contextual left menu of every build:
Select project => Select build => Environment Variables
I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are Libraries, but it does not seem like I can include the stage definitions using them.
You will have to create a shared library to implement what I am about to suggest. For the shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (a kind of template) that can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with a vars directory in it, you just have to create a Groovy file (let's say commonPipeline.groovy) inside the vars directory.
Here's an example that works; I have used it in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you probably already have an existing Jenkinsfile in your Jenkins project (if not, create one), just replace its content with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` and set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [:]
params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file, so all of your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all of those projects. Also note that I used jenkins_var above; it can be any name. It's not actually used here, but a map argument is required for the pipeline call to run (a small sketch of how the map could actually be consumed follows).
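For illustration, a minimal, hypothetical sketch of how the config map could be consumed inside vars/commonPipeline.groovy (the jenkins_var key and the fallback are assumptions, not part of the original example):

// Hypothetical sketch: read values out of the map passed from the Jenkinsfile
def call(Map config = [:]) {
    def jobName = config.jenkins_var ?: env.JOB_BASE_NAME   // fall back if the caller omits the key
    echo "Running shared pipeline for ${jobName}"
    // ... the pipeline { } block shown above goes here
}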
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/