Evaluation of the configuration body of a custom pipeline step implementation (Jenkins)

For a custom Jenkins pipeline step, I have a pipeline that uses the step like this:
library 'infrastructure'
infrastructurePipeline {
    project = 'jenkins'
    artifact = ['master', 'backup']
    dockerRegistry = [
        credentialsId: 'dockerRegistryDeployer',
        url          : "http://${env.DOCKER_REGISTRY}"
    ]
}
However, in this context the variable env doesn't seem to be bound, so the expression can't be evaluated.
I can replace env with System.getenv() and it will work, but that comes with a side effect I would rather avoid: I have to approve the usage of System.getenv(), which is flagged as a security vulnerability.
The custom step implements configuration evaluation as recommended in https://jenkins.io/doc/book/pipeline/shared-libraries/#defining-a-more-structured-dsl.
The most relevant code for the step is in vars/infrastructurePipeline.groovy:
def call(Closure body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    def label = config.label ?: 'docker'
    String tag = env.BUILD_NUMBER
    String deliveryTag = null
    String project = config.project
    def artifact = config.artifact
    ...
}
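A plausible explanation (my own reading, not stated in the original post): with DELEGATE_FIRST and a plain Map as the delegate, reading a key the map does not contain returns null instead of throwing MissingPropertyException, so property resolution never falls back to the closure's owner, where env is actually bound. Local variables, by contrast, are captured lexically and bypass property resolution entirely, so one workaround is to resolve env-dependent values into locals before the closure:

```groovy
// Sketch of a workaround: `registryUrl` is a local variable, so the
// closure captures it lexically and the Map delegate is never consulted.
library 'infrastructure'

def registryUrl = "http://${env.DOCKER_REGISTRY}"

infrastructurePipeline {
    project = 'jenkins'
    artifact = ['master', 'backup']
    dockerRegistry = [
        credentialsId: 'dockerRegistryDeployer',
        url          : registryUrl
    ]
}
```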

Related

How to pass parameters and variables from a file to a Jenkinsfile?

I'm trying to convert my Jenkins pipeline to a shared library, since it can be reused by most of our applications. As part of that, I created a Groovy file in the vars folder, kept the pipeline in a Jenkinsfile in GitHub, and was able to call it from Jenkins successfully.
To improve this, I want to pass params, variables, and node labels through a file, so that we never have to touch the Jenkins pipeline itself; if we want to modify any vars or params, we can do it in the Git repo.
pipeline {
    agent {
        node {
            label 'jks_deployment'
        }
    }
    environment {
        ENV_CONFIG_ID = 'jenkins-prod'
        ENV_CONFIG_FILE = 'test.groovy'
        ENV_PLAYBOOK_NAME = 'test.tar.gz'
    }
    parameters {
        string(
            defaultValue: 'test.x86_64',
            description: 'Enter app version',
            name: 'app_version'
        )
        choice(
            choices: ['10.0.0.1', '10.0.0.2', '10.0.0.3'],
            description: 'Select a host to be deployed',
            name: 'host'
        )
    }
    stages {
        stage("reading properties from properties file") {
            steps {
                // Use a script block to do custom scripting
                script {
                    def props = readProperties file: 'extravars.properties'
                    env.var1 = props.var1
                    env.var2 = props.var2
                }
                echo "The variable 1 value is $var1"
                echo "The variable 2 value is $var2"
            }
        }
    }
}
In the above code, I used the Pipeline Utility Steps plugin and was able to read variables from the extravars.properties file. Can the same approach be used for Jenkins parameters? Or is there a more suitable method to pass these parameters via a file from the Git repo?
Also, is it possible to pass a variable for the node label?
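One common pattern (a sketch; the file name params.properties and its keys are hypothetical) is to build the parameter definitions dynamically from a file in the repo using the properties step, so that changing defaults or choices only requires a commit to Git. Note that the job must run once before newly defined parameters appear in the UI:

```groovy
// Scripted-pipeline sketch: derive job parameters from a file in the repo.
// Requires the Pipeline Utility Steps plugin for readProperties.
node('jks_deployment') {
    checkout scm
    def p = readProperties file: 'params.properties'
    // e.g. params.properties contains:
    //   defaultVersion=test.x86_64
    //   hosts=10.0.0.1,10.0.0.2,10.0.0.3
    properties([
        parameters([
            string(name: 'app_version',
                   defaultValue: p.defaultVersion,
                   description: 'Enter app version'),
            choice(name: 'host',
                   choices: p.hosts.tokenize(','),
                   description: 'Select a host to be deployed')
        ])
    ])
}
```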
=====================================================================
Below are the improvements I have made to this project:
I used the Node Label plugin to pass the node name as a variable.
Below is my vars/sayHello.groovy file content:
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent {
            node {
                label "${pipelineParams.slaveName}"
            }
        }
        stages {
            stage("reading properties from properties file") {
                steps {
                    // Use a script block to do custom scripting
                    script {
                        readProperties(file: 'extravars.properties').each { key, value -> env[key] = value }
                    }
                    echo "The variable 1 value is $var1"
                    echo "The variable 2 value is $var2"
                }
            }
            stage('stage2') {
                steps {
                    sh "echo ${var1}"
                    sh "echo ${var2}"
                    sh "echo ${pipelineParams.appVersion}"
                    sh "echo ${pipelineParams.hostIp}"
                }
            }
        }
    }
}
Below is my vars/params.groovy file
properties([
    parameters([
        choice(choices: ['10.80.66.171', '10.80.67.6', '10.80.67.200'], description: 'Select a host to be deployed', name: 'host'),
        string(defaultValue: 'fxxxxx.x86_64', description: 'Enter app version', name: 'app_version')
    ])
])
Below is my Jenkinsfile:
def _hostIp = params.host
def _appVersion = params.app_version
sayHello {
    slaveName = 'master'
    hostIp = _hostIp
    appVersion = _appVersion
}
Is there anything else we can improve here? Let me know if you have any suggestions.

Passing environment variable as a pipeline parameter to Jenkins shared library

I have a shared Jenkins library that has my pipeline for Jenkinsfile. The library is structured as follows:
myPipeline.groovy file
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = params.name
                }
            }
        }
    }
}
And my Jenkinsfile is as follows:
Jenkinsfile
@Library("some-shared-lib") _
myPipeline {
    name = "Some name"
}
Now, I would like to replace the "Some name" string with "${env.JOB_NAME}". Normally in a Jenkinsfile, I would use name = "${env.JOB_NAME}" to get that information, but because I am using my shared library instead, it fails to work. The error message is as follows:
java.lang.NullPointerException: Cannot get property 'JOB_NAME' on null object
I tried playing around with brackets and other notation but never got it to work. I think I am passing the parameter incorrectly. I would like the Jenkinsfile to assign "${env.JOB_NAME}" to the projectName variable once the library runs the pipeline that I am calling (via the myPipeline {} block).
You can do it like this in myPipeline.groovy:
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = "${env.JOB_NAME}"
                }
            }
        }
    }
}
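An alternative sketch (not from the answer above): pass a closure from the Jenkinsfile and let the library evaluate it lazily. When the library calls the closure, it resolves env through its owner, the Jenkinsfile script, where env is bound:

```groovy
// Jenkinsfile: defer evaluation by wrapping the value in a closure
myPipeline {
    name = { env.JOB_NAME }
}

// myPipeline.groovy: unwrap the value when it is actually needed
// (hypothetical helper, not part of the original answer)
def resolve(value) {
    return (value instanceof Closure) ? value.call() : value
}
// ... inside the pipeline: projectName = resolve(params.name)
```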

How to read and use the node label from properties file?

I am trying to read and use the node label from a properties file as shown below. I have hello/world.jenkins, which is my Jenkinsfile, and a hello/world file checked in that has some properties, including NODE_LABEL=my-server-name. Both files are in Git, and I am using the "Pipeline script from SCM" definition with hello/world.jenkins as the Script Path in the Jenkins pipeline configuration.
def scriptPath = currentBuild.rawBuild.parent.definition.scriptPath // hello/world.jenkins
String fileWithoutExt = scriptPath.take(scriptPath.lastIndexOf('.')) // hello/world
println "props_file=" + fileWithoutExt // prints correctly.
properties = readProperties file: "$fileWithoutExt" // here it fails; I can see the hello/world file present in the workspace
echo "node: ${properties.NODE_LABEL}"
pipeline {
    agent { label props1.NODE_LABEL }
    ...
    stages {
        ...
    }
}
I cannot load the properties file outside of a stage; is there any other way to read the node name from the properties file?
log:
props_file=hello/world
[Pipeline] readProperties
[Pipeline] End of Pipeline
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
properties is not visible because it's not declared as a global environment variable. Do this instead:
env.properties = readProperties file: "$fileWithoutExt"
This works:
def scriptPath = currentBuild.rawBuild.parent.definition.scriptPath // hello/world.jenkins
String fileWithoutExt = scriptPath.take(scriptPath.lastIndexOf('.')) // hello/world
pipeline {
    environment {
        nodeProp = readProperties file: "${fileWithoutExt}"
        nodeLabel = "$nodeProp.NODE_LABEL"
    }
    agent { label env.nodeLabel }
    ...

How can I define a custom pipeline directive?

For example something like:
pipeline {
    customDirective {
        sh "env"
        ..
    }
}
This is currently not possible. You can only define custom pipeline steps via a Shared Library and use them within the stage/steps section and in the condition block of the post section. If you need more customization for any reason, you will have to look into the Scripted Pipeline syntax. It allows you to use most of Groovy's functionality and is therefore very flexible.
This works:
customDirective.groovy (Shared Library)
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    config.script()
}
Jenkinsfile
customDirective {
    url = "http://something.another.com"
    title = "The Title"
    script = {
        sh "env"
    }
}

Jenkins Shared Library delegation error

I have a Jenkins shared library with the following file:
vars/testlib.groovy
def foo() {
    echo 'foo'
}

def bar(body) {
    body.delegate = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
}
And a Pipeline script as follows:
Jenkinsfile
library 'testlib@master'
testlib.foo()
testlib.bar {
    testlib.foo()
}
I get the following output:
[Pipeline] echo
foo
[Pipeline] End of Pipeline
java.lang.NullPointerException: Cannot invoke method foo() on null object
For some reason, the closure being passed to testlib.bar doesn't see testlib anymore. This only happens if the resolution strategy favors the delegate; if I use OWNER_ONLY or OWNER_FIRST it works. It also works if I provide testlib in the delegate, either by setting it in the map or by just setting body.delegate = body.owner, and it works if I avoid the resolution by just referring to owner.testlib.foo in the closure. Furthermore, this only happens with library code; if I just make a test class in the Jenkinsfile it works fine.
It seems as though if the resolution strategy is to check the delegate, and the delegate doesn't provide that property, it immediately fails without bothering to check the owner next. Am I doing something wrong?
I can't explain exactly what is going on with the Groovy closure delegation in the Jenkins pipeline, but I had a similar problem and fixed it like this:
vars/foo.groovy:
def call() {
    echo 'foo'
}
vars/bar.groovy:
//
// Something like:
//
// bar {
//     script = {
//         foo()
//         return 'Called foo'
//     }
// }
//
def call(body) {
    def config = [:]
    body.delegate = config
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
    // In the bar DSL element
    echo 'I am bar'
    // Expecting a script element as a closure. The instanceof needs script approvals
    //assert config.script != null, 'A script element was not supplied'
    //assert config.script instanceof Closure, 'The script element supplied must be a closure'
    // Call the script closure
    config.script.delegate = this
    config.script.resolveStrategy = Closure.DELEGATE_FIRST
    def result = config.script.call()
    // Return the script result
    return result
}
Jenkinsfile:
library 'testlib@master'
def result = bar {
    script = {
        foo()
        return 'Called foo'
    }
}
echo "result from bar: ${result}"
Jenkins output:
[Pipeline] echo
I am bar
[Pipeline] echo
foo
[Pipeline] echo
result from bar: Called foo
[Pipeline] End of Pipeline
Finished: SUCCESS
Just consider the bar DSL closure body as configuration passed in the form of assignments like "x = y". Make one of those assignments a closure element that is executed by the implementation of bar(), and then you can call other library elements from it. I have the code for this example on my GitHub: https://github.com/macg33zr/jenkins-pipeline-experiments. You might also want to try unit testing outside of Jenkins; I have an example using the JenkinsPipelineUnit library here: https://github.com/macg33zr/pipelineUnit. I recommend this unit-test approach if you are doing complex work in a pipeline, as it will preserve your sanity!
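As a taste of that approach, a minimal JenkinsPipelineUnit test for the bar step above might look like the following (a sketch assuming the library's JUnit 4 API; the class name and stubbing details are illustrative):

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class BarTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub the `foo` step defined in vars/foo.groovy
        helper.registerAllowedMethod('foo', []) { println 'foo' }
    }

    @Test
    void barExecutesTheScriptClosure() {
        def bar = loadScript('vars/bar.groovy')
        def result = bar.call {
            script = {
                foo()
                return 'Called foo'
            }
        }
        assert result == 'Called foo'
    }
}
```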
