Jenkins pipeline: load properties from file

The pipeline code below works well:
pipeline {
    agent {
        label "test_agent"
    }
    stages {
        stage("test") {
            steps {
                script {
                    sh "echo 'number=${BUILD_NUMBER}' >log"
                    if (fileExists('log')) {
                        load 'log'
                        retVal = "${number}"
                    }
                    echo "${retVal}"
                }
            }
        }
    }
}
However, when I tried to move the file-reading logic into a shared library (named getNumber.groovy) and call it in the pipeline, like this:
getNumber.groovy
def call() {
    def retVal
    if (fileExists('log')) {
        load 'log'
        retVal = "${number}"
    }
    return retVal
}
This is how the pipeline (test.groovy) calls this lib:
@Library('lib') _
pipeline {
    agent {
        label "test_agent"
    }
    stages {
        stage("test") {
            steps {
                script {
                    sh "echo 'number=${BUILD_NUMBER}' >log"
                    def retVal = getNumber()
                    echo "${retVal}"
                }
            }
        }
    }
}
It always fails with the error below:
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: number for class: getNumber
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:458)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:34)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
Any suggestions? How can I fix this if I want to encapsulate the logic in a lib?
[Edit]
If I change this segment
load 'log'
retVal = "${number}"
to this:
def matcher = readFile('log') =~ '^number=(.+)'
retVal = matcher ? matcher[0][1] : null
it works. But I'm just curious why the previous one doesn't work.
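(A likely explanation: variables created by load land on the loaded script's own binding; a Jenkinsfile can see them, but a shared-library global variable class such as getNumber cannot, hence the MissingPropertyException. As an alternative sketch, the library step could parse the file with readProperties instead of load, assuming the Pipeline Utility Steps plugin is installed:
def call() {
    def retVal
    if (fileExists('log')) {
        // readProperties is provided by the Pipeline Utility Steps plugin
        def props = readProperties file: 'log'
        retVal = props['number']
    }
    return retVal
}
)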

Related

How to create Dynamic array of sshUserPrivateKey in jenkinsfile?

Error:
Could not instantiate
{bindings=[@sshUserPrivateKey(credentialsId=dev-02,keyFileVariable=dev-02-key),
@sshUserPrivateKey(credentialsId=dev-01,keyFileVariable=dev-01-key)]}
for org.jenkinsci.plugins.credentialsbinding.impl.BindingStep:
java.lang.ClassCastException:
org.jenkinsci.plugins.credentialsbinding.impl.BindingStep.bindings
expects java.util.List<org.jenkinsci.plugins.credentialsbinding.MultiBinding>
but received class java.lang.String
I am trying to create an array of sshUserPrivateKey values in my Jenkinsfile.
I want to build this array (service_array) and use it in withCredentials, but I am facing an issue with that.
stages {
    stage('configure value') {
        steps {
            script {
                def service_array = []
                def conjourString = ""
                if (env.BRANCH_NAME == 'stage') {
                    env.varible_host = 'stage'
                } else if (env.BRANCH_NAME == 'prod') {
                    env.varible_host = 'production'
                } else if (env.BRANCH_NAME == 'dev') {
                    env.varible_host = 'dev'
                } else {
                    env.varible_host = 'dev'
                }
                def listofserver = readJSON file: "${env.WORKSPACE}/inventory/listofserver.json"
                for (def i = 0; i < listofserver[env.varible_host].size(); i++) {
                    println(listofserver[env.varible_host][i])
                    def hostname = listofserver[env.varible_host][i]['host_name']
                    def keyString = hostname + "-key"
                    service_array.add(sshUserPrivateKey(credentialsId: hostname,
                                                        keyFileVariable: keyString))
                }
                env.service_array = service_array
            }
        }
    }
    stage('Run Ansible Playbook') {
        steps {
            echo "!* Running Ansible Playbook ${env.ANSIBLE_PB}. *!"
            withCredentials(env.service_array) {
            }
        }
    }
}
I am getting an error with env.service_array; what can I use here?
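(One hedged sketch of a workaround, not from the original thread: env.* values are always coerced to strings, which is why the bindings list arrives at withCredentials as a java.lang.String. Keeping the list in a plain script-level Groovy variable avoids the coercion; the credential IDs below are illustrative:
// Sketch: store the bindings in a Groovy variable, not in env.*
def service_array = []
pipeline {
    agent any
    stages {
        stage('configure value') {
            steps {
                script {
                    service_array.add(sshUserPrivateKey(credentialsId: 'dev-01',
                                                        keyFileVariable: 'dev_01_key'))
                }
            }
        }
        stage('Run Ansible Playbook') {
            steps {
                script {
                    withCredentials(service_array) {
                        echo 'credentials are bound inside this block'
                    }
                }
            }
        }
    }
}
)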

Jenkins Pipeline Conditional Environmental Variables

I have a set of static environment variables in the environment directive section of a declarative pipeline. These values are available to every stage in the pipeline.
I want the values to change based on an arbitrary condition.
Is there a way to do this?
pipeline {
    agent any
    environment {
        if ${params.condition} {
            var1 = '123'
            var2 = abc
        } else {
            var1 = '456'
            var2 = def
        }
    }
    stages {
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Looking for the same thing, I found a nice answer in another question:
Basically, use the ternary conditional operator:
pipeline {
    agent any
    environment {
        var1 = "${params.condition == true ? "123" : "456"}"
        var2 = "${params.condition == true ? abc : def}"
    }
}
Note: keep in mind that, the way you wrote your question (and the way I wrote my answer), the numbers are Strings and the letters are variables.
I would suggest creating an "Environment" stage and declaring your variables according to the condition you want, something like below:
pipeline {
    agent any
    environment {
        // Declare variables which will remain the same throughout the build
    }
    stages {
        stage('Environment') {
            agent { node { label 'master' } }
            steps {
                script {
                    // Write the condition for the variables which need to change
                    if (params.condition) {
                        env.var1 = '123'
                        env.var2 = abc
                    } else {
                        env.var1 = '456'
                        env.var2 = def
                    }
                    sh "printenv"
                }
            }
        }
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Suppose we want to use optional params for a downstream job when it is called from an upstream job, and default params when the downstream job is run by itself.
But we don't want to have "holder" params with default values in the downstream job for some reason.
This can be done via a Groovy function.
Upstream Jenkinsfile - the param CREDENTIALS_ID is passed downstream:
pipeline {
    stages {
        stage('Trigger downstream') {
            steps {
                build job: "my_downstream_job_name",
                      parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
Downstream Jenkinsfile - if the param CREDENTIALS_ID is not passed from upstream, the function returns a default value:
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID
    } else {
        return "default_credentials_id"
    }
}
pipeline {
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
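(To round the downstream snippet out, a hedged sketch of that Jenkinsfile end to end; the stage name and deploy.sh call are illustrative, and a secret-text credential is assumed:
def getCredentialsId() {
    return params.CREDENTIALS_ID ? params.CREDENTIALS_ID : 'default_credentials_id'
}
pipeline {
    agent any
    environment {
        // resolves to the upstream-supplied ID, or the default
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
    stages {
        stage('Use credential') {
            steps {
                // the bound value is masked in the build log;
                // deploy.sh is an illustrative placeholder
                sh './deploy.sh "$TEST_PASSWORD"'
            }
        }
    }
}
)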
You can get another level of flexibility using maps:
stage("set_env_vars") {
steps {
script {
def MY_MAP1 = [A: "123", B: "456", C: "789"]
def MY_MAP2 = [A: "abc", B: "def", C: "ghi"]
env.var1 = MY_MAP1."${env.switching_var}"
env.var2 = MY_MAP2."${env.switching_var}"
}
}
}
This way, more choices are possible (with env.switching_var set to 'B', for instance, var1 becomes '456' and var2 becomes 'def').

Issue porting Jenkinsfile scripted to declarative withEnv{} => environment{}

I have an issue porting a scripted pipeline to declarative. I used to have, in scripted:
//Scripted
def myEnv = [:]
stage('Prepare my env') { [...] myEnv = ... }
stage('Fancy stuff') {
    node() {
        withEnv(myEnv) {
            // here use what is defined in myEnv
        }
    }
}
stage('Fancy stuff2') {
    node() {
        withEnv(myEnv) {
            // here use what is defined in myEnv
        }
    }
}
and now in declarative I would like to have
//Declarative
def myEnv = [:]
pipeline {
    agent none
    stages {
        stage('Prepare my env') {
            steps {
                script {
                    [...]
                    myEnv = ...
                }
            }
        }
        environment { myEnv }
        stage('Fancy stuff') {
            [...]
        }
        stage('Fancy stuff2') {
            [...]
        }
    }
}
When I try to run this, it fails with:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: WorkflowScript: xx: "myEnv" is not a valid environment
expression. Use "key = value" pairs with valid Java/shell keys.
Fair enough.
What should I do to be able to use the declarative environment { } and avoid using withEnv(myEnv) in every further step?
It seems that the part you are missing is the usage of the environment clause.
Instead of
environment { myEnv }
It should be
environment { myEnvVal = myEnv }
Just as the error message mentions, this should be a key = value pair.
Your issue comes from the type of your myEnv variable. You define it as a map when you do def myEnv = [:].
So it works with withEnv, which takes a list of 'KEY=value' entries as its parameter, but it does not work with environment {...}, which accepts only "key = value" statements.
The solution depends on how you add the environment variables contained in myEnv.
The simplest way is to use the environment directive, listing all the key/values contained in your former myEnv variable:
pipeline {
    agent none
    environment {
        test1 = 'test-1'
        test2 = 'test-2'
    }
    stages {
        stage('Fancy stuff') {
            steps {
                echo "${test1}"
            }
        }
        stage('Fancy stuff2') {
            steps {
                echo "${test2}"
            }
        }
    }
}
But you can also do it the scripted way:
pipeline {
    agent none
    stages {
        stage('Prepare my env') {
            steps {
                script {
                    def test = []
                    for (int i = 1; i < 3; ++i) {
                        test[i] = 'test-' + i.toString()
                    }
                    test1 = test[1]
                    test2 = test[2]
                }
            }
        }
        stage('Fancy stuff') {
            steps {
                echo "${test1}"
            }
        }
        stage('Fancy stuff2') {
            steps {
                echo "${test2}"
            }
        }
    }
}

Dynamic number of parallel steps in declarative pipeline

I'm trying to create a declarative pipeline which runs a number of jobs (configurable via a parameter) in parallel, but I'm having trouble with the parallel part.
Basically, for some reason the pipeline below generates the error
Nothing to execute within stage "Testing" # line .., column ..
and I cannot figure out why, or how to solve it.
import groovy.transform.Field

@Field def mayFinish = false

def getJob() {
    return {
        lock("finiteResource") {
            waitUntil {
                script {
                    mayFinish
                }
            }
        }
    }
}

def getFinalJob() {
    return {
        waitUntil {
            script {
                try {
                    echo "Start Job"
                    sleep 3 // Replace with something that might fail.
                    echo "Finished running"
                    mayFinish = true
                    true
                } catch (Exception e) {
                    echo e.toString()
                    echo "Failed :("
                }
            }
        }
    }
}

def getJobs(def NUM_JOBS) {
    def jobs = [:]
    for (int i = 0; i < (NUM_JOBS as Integer); i++) {
        jobs["job${i}"] = getJob()
    }
    jobs["finalJob"] = getFinalJob()
    return jobs
}

pipeline {
    agent any
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    parameters {
        string(
            name: "NUM_JOBS",
            description: "Set how many jobs to run in parallel"
        )
    }
    stages {
        stage('Setup') {
            steps {
                echo "Setting it up..."
            }
        }
        stage('Testing') {
            steps {
                parallel getJobs(params.NUM_JOBS)
            }
        }
    }
}
I've seen plenty of examples doing this with the old scripted pipeline, but not with declarative.
Does anyone know what I'm doing wrong?
At the moment, it doesn't seem possible to dynamically provide the parallel branches when using a declarative pipeline.
Even if you have a prior stage where, in a script block, you call getJobs() and add it to the binding, the same error message is thrown.
In this case you'd have to fall back to using a scripted pipeline.
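For illustration, a minimal sketch of that fallback, reusing getJobs() from the question (a sketch, not a tested answer from the thread):
// Scripted fallback: scripted pipelines accept a dynamically built map
node {
    stage('Setup') {
        echo "Setting it up..."
    }
    stage('Testing') {
        parallel getJobs(params.NUM_JOBS)
    }
}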

Can I dynamically create stages in a Jenkins pipeline?

I need to launch a dynamic set of tests in a declarative pipeline.
For better visualization purposes, I'd like to create a stage for each test.
Is there a way to do so?
The only way to create a stage I know is:
stage('foo') {
    ...
}
I've seen this example, but it does not use the declarative syntax.
Use the scripted syntax, which allows more flexibility than the declarative syntax, even though declarative is more documented and recommended.
For example, stages can be created in a loop:
def tests = params.Tests.split(',')
for (int i = 0; i < tests.length; i++) {
    stage("Test ${tests[i]}") {
        sh '....'
    }
}
As JamesD suggested, you may create stages dynamically (but they will be sequential) like this:
def list
pipeline {
    agent none
    options { buildDiscarder(logRotator(daysToKeepStr: '7', numToKeepStr: '1')) }
    stages {
        stage('Create List') {
            agent { node { label 'nodename' } }
            steps {
                script {
                    // you may create your list here, let's say reading from a file after checkout
                    list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
        stage('Dynamic Stages') {
            agent { node { label 'nodename' } }
            steps {
                script {
                    for (int i = 0; i < list.size(); i++) {
                        stage(list[i]) {
                            echo "Element: $i"
                        }
                    }
                }
            }
            post {
                cleanup {
                    cleanWs()
                }
            }
        }
    }
}
That will result in: [screenshot: dynamic-sequential-stages]
If you don't want to use a for loop, and you want the generated stages to be executed in parallel, here is an answer.
def jobs = ["JobA", "JobB", "JobC"]

def parallelStagesMap = jobs.collectEntries {
    ["${it}" : generateStage(it)]
}

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
Note that all generated stages will be executed on one node.
If you want the generated stages to be executed on different nodes:
def agents = ['master', 'agent1', 'agent2'] // enter valid agent names in the array
def generateStage(nodeLabel) {
    return {
        stage("Runs on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running on ${nodeLabel}"
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : generateStage(it)]
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
You can of course add more than one parameter, and you can use collectEntries with two parameters, as sketched below.
Please remember that the return in the generateStage function is a must.
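A hedged sketch of that two-parameter variant (the job names and agent labels here are illustrative, not from the original answer):
def generateStage(job, nodeLabel) {
    return {
        stage("${job} on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running ${job} on ${nodeLabel}"
            }
        }
    }
}

// Each entry pairs a job with an agent label (illustrative values)
def combos = [[job: 'JobA', agent: 'agent1'], [job: 'JobB', agent: 'agent2']]
def parallelStagesMap = combos.collectEntries { c ->
    ["${c.job}" : generateStage(c.job, c.agent)]
}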
@Jorge Machado: Because I cannot comment, I had to post this as an answer. I solved it recently; I hope it helps you.
Declarative pipeline:
A simple static example:
stage('Dynamic') {
    steps {
        script {
            stage('NewOne') {
                echo('new one echo')
            }
        }
    }
}
Dynamic real-life example:
// in a declarative pipeline
stage('Trigger Building') {
    when {
        environment(name: 'DO_BUILD_PACKAGES', value: 'true')
    }
    steps {
        executeModuleScripts('build') // local method, see at the end of this script
    }
}

// at the end of the file or in a shared library
void executeModuleScripts(String operation) {
    def allModules = ['module1', 'module2', 'module3', 'module4', 'module11']
    allModules.each { module ->
        String action = "${operation}:${module}"
        echo("---- ${action.toUpperCase()} ----")
        String command = "npm run ${action} -ddd"
        // here is the trick
        script {
            stage(module) {
                bat(command)
            }
        }
    }
}
You might want to take a look at this example - you can have a function return a closure which should be able to have a stage in it.
This code shows the concept, but doesn't have a stage in it.
def transformDeployBuildStep(OS) {
    return {
        node('master') {
            wrap([$class: 'TimestamperBuildWrapper']) {
                ...
            } // wrap
        } // node
    } // closure
} // transformDeployBuildStep

stage("Yum Deploy") {
    stepsForParallel = [:]
    for (int i = 0; i < TargetOSs.size(); i++) {
        def s = TargetOSs.get(i)
        def stepName = "CentOS ${s} Deployment"
        stepsForParallel[stepName] = transformDeployBuildStep(s)
    }
    stepsForParallel['failFast'] = false
    parallel stepsForParallel
} // stage
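To make the concept concrete, a hedged sketch of the same function with a stage inside the closure (the stage name and echo are illustrative):
def transformDeployBuildStep(OS) {
    return {
        node('master') {
            // a stage can live inside the returned closure in scripted pipelines
            stage("CentOS ${OS} Deployment") {
                echo "Deploying for ${OS}"
            }
        }
    } // closure
}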
Just an addition to what @np2807 and @Anton Yurchenko have already presented: you can create stages dynamically and run them in parallel by simply delaying the creation of the stage list (but keeping its declaration), e.g. like this:
def parallelStagesMap

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            echo "This is ${job}."
        }
    }
}

pipeline {
    agent { label 'master' }
    stages {
        stage('Create List of Stages to run in Parallel') {
            steps {
                script {
                    def list = ["Test-1", "Test-2", "Test-3", "Test-4", "Test-5"]
                    // you may create your list here, let's say reading from a file after checkout
                    // personally, I like to use scriptler scripts and load them as simply as:
                    // list = load '/var/lib/jenkins/scriptler/scripts/load-list-script.groovy'
                    parallelStagesMap = list.collectEntries {
                        ["${it}" : generateStage(it)]
                    }
                }
            }
        }
        stage('Run Stages in Parallel') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
That will result in: [screenshot: Dynamic Parallel Stages]
I use this to generate stages which each contain a Jenkins job.
build_list is a list of Jenkins jobs that I want to trigger from my main Jenkins job, with a stage for each job that is triggered.
build_list = ['job1', 'job2', 'job3']
for (int i = 0; i < build_list.size(); i++) {
    stage(build_list[i]) {
        build job: build_list[i], propagate: false
    }
}
If you are using a Jenkinsfile, I achieved this by dynamically creating the stages, running them in parallel, and also getting the Jenkins UI to show a separate column for each. This assumes the parallel steps are independent of each other (otherwise don't use parallel), and you can nest them as deep as you want (depending on the number of for loops you nest for creating stages).
See "Jenkinsfile Pipeline DSL: How to Show Multi-Columns in Jobs dashboard GUI - For all Dynamically created stages - When within PIPELINE section" for more.
