Token replacement in a Jenkins Config file

I have a deployment pipeline job that needs a deployment template file. That file contains some passwords that I want to keep secure.
So I added the Config File Provider plugin (v2.13) and put placeholders in the file that correspond to global passwords. Unfortunately, this is not working. Just to test, I used a Jenkinsfile like the one below:
node {
    checkout scm
    withEnv(['INSTANCE=Something']) {
        configFileProvider(
            [configFile(fileId: 'prescribe', variable: 'DEPLOY_FILE')]) {
            sh "echo $env.INSTANCE"
            sh "cat ${env.DEPLOY_FILE}"
        }
    }
}
And the file with ID 'prescribe' contains:
${branch}
${ENV, var=INSTANCE}
${ENV.INSTANCE}
${ENV,INSTANCE}
${env, var=INSTANCE}
I also tried keeping INSTANCE as a global password and as a global variable.
However, none of the tokens are replaced.
Any ideas what I'm doing wrong?

The only way I got it working was by using the parameters of the job configuration.
And in the file I use this interpolation:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<appSettings>
<add key="overrideDefaultEndpoint" value="true" />
<add key="endpoint" value="${ENDPOINT}"/>
where ENDPOINT is the name of the job parameter.

The problem here is that Token Macro only accepts predefined environment variables.
Options:
Use job parameters, as stated by @Taringalio in another answer (see the sketch below)
Provide the environment variables via the EnvInject plugin
See also the related issue in the Jenkins tracker: https://issues.jenkins-ci.org/browse/JENKINS-39998
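For the job-parameters route, here is a minimal pipeline sketch (the parameter name ENDPOINT and the file ID 'prescribe' are taken from this thread; the default value is a placeholder). The Config File Provider runs token macro expansion against the build's variables, so a ${ENDPOINT} placeholder in the managed file is replaced with the parameter value:
properties([parameters([
    string(name: 'ENDPOINT', defaultValue: 'https://example.test/api', description: 'endpoint URL')
])])

node {
    configFileProvider([configFile(fileId: 'prescribe', variable: 'DEPLOY_FILE')]) {
        // ${ENDPOINT} in the managed file should now be substituted
        sh 'cat "$DEPLOY_FILE"'
    }
}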

Related

Seed job fails to create a MavenJob but no error is reported

I created a JUnit Jenkins test case where an in-memory Jenkins instance is launched (we use a @Rule JenkinsRule). The code of the test case is available here.
The test case creates a FreeStyleProject (= seed job) which uses a maven.groovy file as its Job DSL script.
But when the test case is executed, the following message is reported during the job build. The message reports the consequence of importing/parsing the mavenJob.groovy file, as the job expects that a new job will be created:
Legacy code started this job. No cause information is available
Running as SYSTEM
Building in workspace /var/folders/t2/jwchtqkn5y76hrfrws7dqtqm0000gn/T/j h5344303144116520886/workspace/test0
Processing provided DSL script
ERROR: java.io.IOException: Unable to read /var/folders/t2/jwchtqkn5y76hrfrws7dqtqm0000gn/T/j h5344303144116520886/jobs/mvn-spring-boot-rest-http/config.xml
Finished: FAILURE
And of course, no stack trace of the error appears on stdout or stderr.
How can I investigate the problem and fix it?
Remarks:
If I take the generated config.xml file and import it into a separate Jenkins instance, the job succeeds.
The generated config.xml looks good (compared with the same config.xml created using the UI):
<?xml version='1.1' encoding='UTF-8'?>
<project>
  <keepDependencies>false</keepDependencies>
  <properties/>
  <scm class="hudson.scm.NullSCM"/>
  <canRoam>false</canRoam>
  <disabled>false</disabled>
  <blockBuildWhenDownstreamBuilding>false</blockBuildWhenDownstreamBuilding>
  <blockBuildWhenUpstreamBuilding>false</blockBuildWhenUpstreamBuilding>
  <triggers/>
  <concurrentBuild>false</concurrentBuild>
  <builders>
    <javaposse.jobdsl.plugin.ExecuteDslScripts>
      <scriptText>mavenJob(&apos;mvn-spring-boot-rest-http&apos;) {
  description &apos;A Maven Job compiling the project Spring Boot Rest HTTP Example&apos;
  parameters {
    gitParameter {
      name &apos;SELECTED_TAG&apos;
      description &apos;The Git tag to checkout&apos;
      type &apos;PT_TAG&apos;
      defaultValue &apos;2.3.4-2&apos;
      branch &apos;&apos;
      branchFilter &apos;origin/(.*)&apos;
      quickFilterEnabled false
      selectedValue &apos;DEFAULT&apos;
      sortMode &apos;DESCENDING_SMART&apos;
      tagFilter &apos;*&apos;
      useRepository &apos;.*rest-http-example.git&apos;
      listSize &apos;10&apos;
    }
  }
  scm {
    git {
      remote {
        url &apos;https://github.com/snowdrop/rest-http-example.git&apos;
        // branch(&apos;$SELECTED_TAG&apos;)
        branch(&apos;2.3.4-2&apos;)
      }
    }
  }
  rootPOM &apos;pom.xml&apos;
  goals &apos;clean install&apos;
}</scriptText>
      <usingScriptText>true</usingScriptText>
      <sandbox>false</sandbox>
      <ignoreExisting>false</ignoreExisting>
      <ignoreMissingFiles>false</ignoreMissingFiles>
      <failOnMissingPlugin>false</failOnMissingPlugin>
      <failOnSeedCollision>false</failOnSeedCollision>
      <unstableOnDeprecation>false</unstableOnDeprecation>
      <removedJobAction>IGNORE</removedJobAction>
      <removedViewAction>IGNORE</removedViewAction>
      <removedConfigFilesAction>IGNORE</removedConfigFilesAction>
      <lookupStrategy>JENKINS_ROOT</lookupStrategy>
    </javaposse.jobdsl.plugin.ExecuteDslScripts>
  </builders>
  <publishers/>
  <buildWrappers/>
</project>
Many thanks in advance for your help.
I created a thread discussion here too: https://groups.google.com/g/jenkinsci-users/c/mRSwARFapyA
Charles
The problem was related to many missing dependencies needed to run the test case.
I upgraded the build.gradle file and now it works:
https://github.com/ch007m/jenkins-job-dsl/blob/jenkins-2.271/build.gradle#L53-L72
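For illustration only (the linked build.gradle has the authoritative list; the coordinates and versions below are placeholders): the in-memory Jenkins needs every plugin the DSL script relies on available on the test classpath, roughly along these lines:
// Hypothetical sketch; see the linked build.gradle for the real dependency list.
dependencies {
    testImplementation 'org.jenkins-ci.main:jenkins-test-harness:2.71'   // @Rule JenkinsRule
    testImplementation 'org.jenkins-ci.plugins:job-dsl:1.77'             // the seed job step
    testImplementation 'org.jenkins-ci.main:maven-plugin:3.8'            // mavenJob(...)
    testImplementation 'org.jenkins-ci.plugins:git:4.5.2'                // scm { git { ... } }
    testImplementation 'org.jenkins-ci.plugins:git-parameter:0.9.13'     // gitParameter { ... }
}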
BTW, the error message reported was not correlated at all with the root cause or how to fix the problem; that should be improved in the code ;-)

environmentVariables() not working for Jenkins Job DSL and Pipeline Jobs

We use Jenkins Job DSL for our CI setup. Since we are using a special command only available in the traditional Jenkinsfile syntax, we need to use a pipeline job.
Inside the pipeline job we check out our project from Git. We use the pipeline job for multiple projects, so we want to inject the Git URL into the pipeline script.
This is a short version of our script generating the pipeline job:
def createPipelineJob(def jobName, def gitUrl) {
    pipelineJob(jobName) {
        environmentVariables(GIT_URL: gitUrl)
        definition {
            cps {
                script('''
                    node {
                        sh 'env | sort'
                    }
                ''')
                sandbox(true)
            }
        }
    }
}
This creates the following XML config:
<flow-definition>
  <actions/>
  <description/>
  <keepDependencies>false</keepDependencies>
  <properties>
    <EnvInjectJobProperty>
      <info>
        <propertiesContent>GIT_URL=my-git.url</propertiesContent>
        <loadFilesFromMaster>false</loadFilesFromMaster>
      </info>
      <on>true</on>
      <keepJenkinsSystemVariables>true</keepJenkinsSystemVariables>
      <keepBuildVariables>true</keepBuildVariables>
      <overrideBuildParameters>false</overrideBuildParameters>
      <contributors/>
    </EnvInjectJobProperty>
  </properties>
  <triggers/>
  <definition class="org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition">
    <script>
      node { sh 'env | sort' }
    </script>
    <sandbox>true</sandbox>
  </definition>
</flow-definition>
But if I run this, the GIT_URL environment variable is not listed (other environment variables are). If I instead create the pipeline job manually with the same setup, the GIT_URL environment variable is printed just fine. Creating the job manually produces pretty much the same XML configuration:
<flow-definition plugin="workflow-job@2.15">
  <actions>
    <io.jenkins.blueocean.service.embedded.BlueOceanUrlAction plugin="blueocean-rest-impl@1.3.1">
      <blueOceanUrlObject class="io.jenkins.blueocean.service.embedded.BlueOceanUrlObjectImpl">
        <mappedUrl>blue/organizations/jenkins/test-jobname</mappedUrl>
        <modelObject class="flow-definition" reference="../../../.."/>
      </blueOceanUrlObject>
    </io.jenkins.blueocean.service.embedded.BlueOceanUrlAction>
  </actions>
  <description/>
  <keepDependencies>false</keepDependencies>
  <properties>
    <com.sonyericsson.rebuild.RebuildSettings plugin="rebuild@1.27">
      <autoRebuild>false</autoRebuild>
      <rebuildDisabled>false</rebuildDisabled>
    </com.sonyericsson.rebuild.RebuildSettings>
    <EnvInjectJobProperty plugin="envinject@2.1.5">
      <info>
        <propertiesContent>GIT_URL=my-git.url</propertiesContent>
        <secureGroovyScript plugin="script-security@1.35">
          <script/>
          <sandbox>false</sandbox>
        </secureGroovyScript>
        <loadFilesFromMaster>false</loadFilesFromMaster>
      </info>
      <on>true</on>
      <keepJenkinsSystemVariables>true</keepJenkinsSystemVariables>
      <keepBuildVariables>true</keepBuildVariables>
      <overrideBuildParameters>false</overrideBuildParameters>
    </EnvInjectJobProperty>
    <org.jenkinsci.plugins.workflow.job.properties.PipelineTriggersJobProperty>
      <triggers/>
    </org.jenkinsci.plugins.workflow.job.properties.PipelineTriggersJobProperty>
  </properties>
  <definition class="org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition" plugin="workflow-cps@2.41">
    <script>
      node { sh 'env | sort' }
    </script>
    <sandbox>true</sandbox>
  </definition>
  <triggers/>
  <disabled>false</disabled>
</flow-definition>
We are pretty lost; we are new to Jenkins and this problem has been holding us up for days now.
Edit:
The job is generated on the Jenkins master node but executed on a slave node.
Jenkins 2.37.3
Environment Injector Plugin 2.1.5
Pipeline 2.5
This is more of a comment than an answer, but I modified and tested your DSL code and it works fine.
I created a DSL job using the script:
def createPipelineJob(def jobName, def gitUrl) {
    pipelineJob(jobName) {
        environmentVariables(GIT_URL: gitUrl)
        definition {
            cps {
                script('''
                    node {
                        sh "echo $GIT_URL"
                    }
                ''')
                sandbox(true)
            }
        }
    }
}

createPipelineJob('new-job-2', 'my-git.url')
The resulting pipeline job has the same XML as the one you posted (minus the shell script), and building the pipeline job prints the value of GIT_URL.
[new-job-1] Running shell script
+ echo my-git.url
my-git.url
My recommendation:
If the short version you posted (or maybe mine) doesn't work, I would try upgrading Jenkins or the plugins to see if that makes any difference.
If the short version you posted (or mine) does work, maybe post the full version; perhaps there's an error there.
As it turns out, the Environment Injector Plugin was not installed successfully, so the script did not run properly. All I had to do was restart Jenkins and everything worked just fine. Special thanks to Javier Garcés for assuring me that my script was indeed correct.
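If you run into something similar, one quick sanity check (a sketch using the core Jenkins API from the Script Console) is to verify that the plugin is actually present and active:
// Script Console sketch: check that the envinject plugin is installed and active
def plugin = Jenkins.instance.pluginManager.getPlugin('envinject')
println(plugin ? "${plugin.shortName} ${plugin.version} active=${plugin.isActive()}" : 'envinject not installed')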

Use Jenkins DSL to specify a Git executable in a GitHub SCM node

I'm converting some Jenkins jobs to DSL scripts.
Some of these use GitHub for SCM, and as this is supported by the DSL it is easy enough to configure. However, after over 100 job conversions, for the first time I need to specify a Git executable (all jobs so far have used the default), and there doesn't seem to be a way to do this. The job.xml shows this:
<scm class="hudson.plugins.git.GitSCM" plugin="git#2.4.4">
<configVersion>2</configVersion>
<userRemoteConfigs>...</userRemoteConfigs>
<branches>...</branches>
<doGenerateSubmoduleConfigurations>false</doGenerateSubmoduleConfigurations>
<gitTool>Ubuntu Git</gitTool>
<submoduleCfg class="list"/>
<extensions>
<hudson.plugins.git.extensions.impl.SparseCheckoutPaths>
<sparseCheckoutPaths>
<hudson.plugins.git.extensions.impl.SparseCheckoutPath>
<path>
octane.pricing/octane.trader/server/work/mif_interface/cfg
</path>
</hudson.plugins.git.extensions.impl.SparseCheckoutPath>
</sparseCheckoutPaths>
</hudson.plugins.git.extensions.impl.SparseCheckoutPaths>
</extensions>
</scm>
I can do all of this using the DSL apart from <gitTool>Ubuntu Git</gitTool>.
This isn't mentioned in the DSL docs, so I presume it isn't supported, and I tried using the configure block instead (bearing in mind I'm still learning exactly how to use that). I tried a few things, but the one I most expected to work was:
configure { project ->
    project << 'hudson.plugins.git.GitSCM' {
        paramDefs << 'gitTool' {
            string('Ubuntu Git')
        }
    }
}
But no dice: the XML still shows the "default" option.
I'm surprised this can't be specified directly in the DSL, but can anyone see what I am doing wrong with that configure block?
The best option is to use the nested configure block of the Git SCM context:
job('example') {
    scm {
        git {
            remote {
                github('owner/repo')
            }
            configure { scmNode ->
                scmNode / gitTool('changeme')
            }
        }
    }
}
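The div operator in a configure block returns the named child node and creates it if it does not exist, so scmNode / gitTool('changeme') both adds the <gitTool> element and sets its text; replace 'changeme' with the tool name as configured in Jenkins, e.g. 'Ubuntu Git'.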
See configure in the Job DSL API Viewer and more info about the Configure Block in the Job DSL wiki.

Jenkins task for remote hosts

In a deploy scenario I need to create and run a Jenkins task on a list of hosts, i.e. create something like a parametrized task (where the IP address is a parameter) or a Multijob Plugin task with a HOST axis, but run on only 2 hosts in parallel at a time across the whole list.
One option could be to run Ansible with the list of hosts, but I'd like to see the status of each host separately, and relaunch a Jenkins job if needed.
The main option is to use the Job DSL Plugin or the Pipeline Plugin, but I need help understanding which classes/methods of the DSL Groovy code should be used to achieve this.
Can anyone help with it?
Assume that the hosts have been configured as Jenkins slaves already, and that they are provided in the pipeline job parameter HOSTS as a whitespace-separated list. The following example should get you started:
def host_pairs = HOSTS.split().collate(2)
for (pair in host_pairs) {
    def branches = [:]
    for (h in pair) {
        def host = h // fresh variable per iteration; the loop variable is mutated
        branches[host] = {
            stage(host) {
                node(host) {
                    // do the actual job here, e.g.
                    // execute a shell script
                    sh "echo hello world"
                }
            }
        }
    }
    parallel branches
}
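Note that collate(2) splits the host list into pairs, and each pair runs to completion before the next pair starts, so at most two hosts are processed at any time; if one host in a pair finishes early, its slot stays idle until the slower host is done.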
A combination of a Matrix project and the Throttle Concurrent Builds Plugin is also possible.
All you need is to set up a single user-defined axis (e.g. "targetHost") with all IP addresses as values and set the desired throttling under "Throttle Concurrent Builds" (note that you have to enable the "Execute concurrent builds if necessary" option to tell Jenkins to allow concurrent execution).
The axis value is available in every child build in the corresponding environment variable (e.g. targetHost).
Below is an example config.xml with a simple ping-and-wait build step:
<?xml version='1.0' encoding='UTF-8'?>
<matrix-project plugin="matrix-project@1.7.1">
  <actions/>
  <description></description>
  <keepDependencies>false</keepDependencies>
  <properties>
    <hudson.plugins.throttleconcurrents.ThrottleJobProperty plugin="throttle-concurrents@1.9.0">
      <maxConcurrentPerNode>2</maxConcurrentPerNode>
      <maxConcurrentTotal>2</maxConcurrentTotal>
      <categories class="java.util.concurrent.CopyOnWriteArrayList"/>
      <throttleEnabled>true</throttleEnabled>
      <throttleOption>project</throttleOption>
      <limitOneJobWithMatchingParams>false</limitOneJobWithMatchingParams>
      <matrixOptions>
        <throttleMatrixBuilds>true</throttleMatrixBuilds>
        <throttleMatrixConfigurations>true</throttleMatrixConfigurations>
      </matrixOptions>
      <paramsToUseForLimit></paramsToUseForLimit>
    </hudson.plugins.throttleconcurrents.ThrottleJobProperty>
  </properties>
  <scm class="hudson.scm.NullSCM"/>
  <canRoam>true</canRoam>
  <disabled>false</disabled>
  <blockBuildWhenDownstreamBuilding>false</blockBuildWhenDownstreamBuilding>
  <blockBuildWhenUpstreamBuilding>false</blockBuildWhenUpstreamBuilding>
  <triggers/>
  <concurrentBuild>true</concurrentBuild>
  <axes>
    <hudson.matrix.TextAxis>
      <name>targetHost</name>
      <values>
        <string>127.0.0.1</string>
        <string>127.0.0.2</string>
        <string>127.0.0.3</string>
        <string>127.0.0.4</string>
        <string>127.0.0.5</string>
      </values>
    </hudson.matrix.TextAxis>
  </axes>
  <builders>
    <hudson.tasks.Shell>
      <command>sleep 7
ping -c 7 $targetHost
sleep 7</command>
    </hudson.tasks.Shell>
  </builders>
  <publishers/>
  <buildWrappers/>
  <executionStrategy class="hudson.matrix.DefaultMatrixExecutionStrategyImpl">
    <runSequentially>false</runSequentially>
  </executionStrategy>
</matrix-project>
Good luck!

Load file with environment variables Jenkins Pipeline

I am doing a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source variables:
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it returns "not found":
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I can run the same command when I log in via SSH, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'

    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"

    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
I was thinking about not using pipeline (but then I would not be able to use my Jenkinsfile).
Or making different jobs for staging and production, using the EnvInject Plugin (but I lose my stage view).
Or using withEnv (but the code gets big, because I am currently working with 12 env vars).
One way you could load environment variables from a file is to load a Groovy file.
For example:
Let's say you have a Groovy file in '$JENKINS_HOME/.envvars' called 'stacktest-staging.groovy'.
Inside this file, you define the 2 environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load this in using
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
From the comments to the accepted answer:
Don't use the global 'env', but use the 'withEnv' construct; see, for example, issue #9, "don't set env vars with global env", in the top 10 best practices for the Jenkins Pipeline plugin.
In the following example, VAR1 is a plain Java string (no Groovy variable expansion) and VAR2 is a Groovy string (so the variable 'someGroovyVar' is expanded).
The passed script is a plain Java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echoes access the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE',
             "VAR2=${someGroovyVar}"
    ]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
For secrets/passwords you can use the Credentials Binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a registered username/password credential in the Jenkins settings.
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see the Credentials plugin.
For completeness: most of the time we need the secrets in environment variables, since we use them from shell scripts, so we combine withCredentials and withEnv as follows:
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}",
                 "ENV_PASSWORD=${PASSWORD}"
        ]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the 'Pipeline Utility Steps' plugin, which provides the readProperties step (for reference, see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the documentation example, the keys are stored in an array and then used to retrieve the values.
The problem with that in production is that if we later add any variable to the property file, that variable also needs to be added to the array in the Jenkinsfile.
To get rid of this tight coupling, we can write the code so that the build finds out automatically about all keys currently present in the property file. Here is an example for reference:
def loadEnvironmentVariables(path) {
    def props = readProperties file: path
    def keys = props.keySet()
    for (key in keys) {
        def value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the client code looks like:
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
With a declarative pipeline, you can do it in one line (replace path with your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
Using withEnv() to pass environment variables from a file, split by newline and cast to a List:
writeFile file: 'version.txt', text: 'version=6.22.0'
withEnv(readFile('version.txt').split('\n') as List) {
    sh "echo ${version}"
}
If you are using Jenkins 2.0, you can load a property file (which contains all required environment variables along with their corresponding values), read all the environment variables listed there automatically, and inject them into the Jenkins-provided env entity.
Here is a method which performs the above-stated action.
def loadProperties(path) {
    def properties = new Properties()
    File propertiesFile = new File(path)
    properties.load(propertiesFile.newDataInputStream())
    Set<Object> keys = properties.keySet()
    for (Object k : keys) {
        String key = (String) k
        String value = (String) properties.getProperty(key)
        env."${key}" = "${value}"
    }
}
To call this method we need to pass the path of the property file as a string variable. For example, in our Jenkinsfile, using a Groovy script, we can call it like:
path = "${workspace}/pic_env_vars.properties"
loadProperties(path)
Please ask if you have any doubts.
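(A caveat worth noting: new Properties() and new File() need script-security approval when the pipeline runs in the Groovy sandbox, as the RejectedAccessException quoted in a later answer shows, and plain Java file access runs on the master, not on the agent executing the build.)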
Here is a complete example of externalizing environment variables and loading them during Jenkins pipeline execution. The pipeline is written in a declarative style.
stage('Reading environment variable defined in groovy file') {
    steps {
        script {
            load "./pipeline/basics/extenvvariable/env.groovy"
            echo "${env.env_var1}"
            echo "${env.env_var2}"
        }
    }
}
Complete code example:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/Jenkinsfile
The variables are loaded from a Groovy file placed alongside the pipeline code:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/env.groovy
This pattern comes in very handy when you are creating a generic pipeline that can be used across teams.
You can externalize the dependent variables in such a Groovy file, and each team can define their values according to their ecosystem.
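For illustration, a minimal sketch of what such an env.groovy could contain (the variable names match the echoes above; the values are made up):
// Hypothetical env.groovy content; values are placeholders
env.env_var1 = 'value1'
env.env_var2 = 'value2'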
Another solution is to use a custom method, without granting extra permissions such as for new Properties(), which leads to this error until it is approved:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.util.Properties
and without adding extra plugin steps such as readProperties.
Here is a method which reads a simple file named env_vars in this format:
FOO=bar
FOO2=bar
pipeline {
    <... skipped lines ...>
    script {
        loadEnvironmentVariablesFromFile("env_vars")
        echo "show time! ${FOO} ${FOO2}"
    }
    <... skipped lines ...>
}
private void loadEnvironmentVariablesFromFile(String path) {
    def file = readFile(path)
    file.split('\n').each { envLine ->
        def (key, value) = envLine.tokenize('=')
        env."${key}" = "${value}"
    }
}
