Jenkins Shared Library delegation error

I have a Jenkins shared library with the following file:
vars/testlib.groovy
def foo() {
    echo 'foo'
}

def bar(body) {
    body.delegate = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
}
And a Pipeline script as follows:
Jenkinsfile
library 'testlib#master'

testlib.foo()
testlib.bar {
    testlib.foo()
}
I get the following output:
[Pipeline] echo
foo
[Pipeline] End of Pipeline
java.lang.NullPointerException: Cannot invoke method foo() on null object
For some reason, the closure being passed to testlib.bar doesn't see testlib anymore. This only happens if the resolution strategy favors the delegate; if I use OWNER_ONLY or OWNER_FIRST it works. It also works if I provide testlib in the delegate, either by setting it in the map or by just setting body.delegate = body.owner, and it works if I avoid the resolution by just referring to owner.testlib.foo in the closure. Furthermore, this only happens with library code; if I just make a test class in the Jenkinsfile it works fine.
It seems as though if the resolution strategy is to check the delegate, and the delegate doesn't provide that property, it immediately fails without bothering to check the owner next. Am I doing something wrong?

I can't explain exactly what is going on with Groovy closure delegation in the Jenkins pipeline, but I had a similar problem and I fixed it like this:
vars/foo.groovy:
def call() {
    echo 'foo'
}
vars/bar.groovy:
//
// Something like:
//
//   bar {
//       script = {
//           foo()
//           return 'Called foo'
//       }
//   }
//
def call(body) {
    def config = [:]
    body.delegate = config
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()

    // In the bar DSL element
    echo 'I am bar'

    // Expecting a script element as a closure. The instanceof needs script approvals:
    //assert config.script != null, 'A script element was not supplied'
    //assert config.script instanceof Closure, 'The script element supplied must be a closure'

    // Call the script closure
    config.script.delegate = this
    config.script.resolveStrategy = Closure.DELEGATE_FIRST
    def result = config.script.call()

    // Return the script result
    return result
}
Jenkinsfile:
library 'testlib#master'
def result = bar {
script = {
foo()
return 'Called foo'
}
}
echo "result from bar: ${result}"
Jenkins output:
[Pipeline] echo
I am bar
[Pipeline] echo
foo
[Pipeline] echo
result from bar: Called foo
[Pipeline] End of Pipeline
Finished: SUCCESS
Just think of the 'bar' DSL closure body as configuration passed in the form of assignments like "x = y". Make one of those elements a closure that the implementation of bar() executes, and then you can call the other library elements you have defined. I have the code for this example on my GitHub: https://github.com/macg33zr/jenkins-pipeline-experiments. You might also want to try unit testing outside of Jenkins - I have an example using the JenkinsPipelineUnit library here: https://github.com/macg33zr/pipelineUnit. I recommend the unit-test approach when doing complex work in pipeline, as it will preserve your sanity!
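For completeness, the workaround the question itself hints at - keeping the owner in the resolution path - can be sketched like this. This is an alternative to the config.script pattern above, not part of it, and it only applies when bar() does not need to collect "x = y" assignments into a map:

```groovy
// vars/testlib.groovy - foo() unchanged from the question
def foo() {
    echo 'foo'
}

def bar(body) {
    // No map delegate: names such as testlib keep resolving against the
    // owner (the Jenkinsfile script), so the closure can call testlib.foo().
    body.resolveStrategy = Closure.OWNER_FIRST
    body()
}
```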

jenkins parallel call for external function loaded from file

I have an external function to call multiple times (with different variable values) from my jenkins pipeline and was hoping to make this call parallel.
Here's my pipeline code.
stepsForParallel = [:]
parallel_list = ['apple', 'banana', 'orange']
def groovyFile = load "${path}/runFruit.groovy"
for (value in parallel_list) {
    stepsForParallel.put("Run for fruit-${value}", groovyFile.runForFruit(value))
}
parallel stepsForParallel
And my groovy file called 'runFruit' is as follows.
#!/usr/bin/env groovy
def runForFruit(fruitValue) {
    ..do something here..
}
return this
With this code the pipeline still runs sequentially. Perhaps that is because this line inside the for loop makes an actual call to the external function:
stepsForParallel.put("Run for fruit-${value}", groovyFile.runForFruit(value))
Any suggestions on how to achieve this parallelization?
Take a look at the documentation for the parallel step: it takes a map from branch names (the names of the parallel executions) to closures containing the code that will be executed per parallel stage. In your case you are not passing a closure, but rather executing the function itself.
parallel firstBranch: {
    // do something
}, secondBranch: {
    // do something else
},
failFast: true|false
So you need to construct your code to fit the parallel format by passing a closure that contains your code to execute as the map value.
In your case it will look something like (using collectEntries):
parallel_list = ['apple', 'banana', 'orange']
def groovyFile = load "${path}/runFruit.groovy"
stepsForParallel = parallel_list.collectEntries { fruit ->
    ["Run for fruit-${fruit}": {
        groovyFile.runForFruit(fruit)
    }]
}
parallel stepsForParallel
Or, if you prefer your format using the for loop - note the local copy of the loop variable: closures created in a loop capture the variable itself, so without the copy every branch would run with the last value:
stepsForParallel = [:]
parallel_list = ['apple', 'banana', 'orange']
def groovyFile = load "${path}/runFruit.groovy"
for (value in parallel_list) {
    def fruit = value  // local copy so each closure captures its own value
    stepsForParallel.put("Run for fruit-${fruit}", { groovyFile.runForFruit(fruit) })
}
parallel stepsForParallel
You can also use it in the following way (again wrapping the call in a closure so it is not executed while the map is being built, and running parallel inside the script block):
def fruit_list = ['apple', 'banana', 'orange']
def groovyFile
def stepsForParallel = [:]
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    groovyFile = load "runFruit.groovy"
                    stepsForParallel = fruit_list.collectEntries {
                        ["${it}": { groovyFile.runForFruit(it) }]
                    }
                    parallel stepsForParallel
                }
            }
        }
    }
}

How to pass parameters in a stage call in Jenkinsfile

Actually my Jenkinsfile looks like this:
@Library('my-libs') _
myPipeline {
    my_build_stage(project: 'projectvalue', tag: '1.0')
    my_deploy_stage()
}
I am trying to pass these two variables (project and tag) to my_build_stage.groovy, but it is not working.
What is the correct syntax to be able to use $params.project or $params.tag in my_build_stage.groovy?
Please see the code below, which passes the parameters. In your Jenkinsfile, write:
// Global variable is used to get data from the groovy file (shared library file)
def mylibrary
def PROJECT_VALUE = "projectvalue"
def TAG = 1
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    // Load the shared library groovy file mylibs. Give the path of your
                    // mylibs file, which contains all your function definitions
                    mylibrary = load 'C:\\Jenkins\\mylibs'
                    // Call the function my_build_stage and pass parameters
                    mylibrary.my_build_stage(PROJECT_VALUE, TAG)
                }
            }
        }
        stage('Deploy') {
            steps {
                script {
                    // Call the function my_deploy_stage
                    mylibrary.my_deploy_stage()
                }
            }
        }
    }
}
Create a groovy file named mylibs:
#!groovy
// Functions (definitions of stages) which will be called from your Jenkinsfile
def my_build_stage(PROJECT_VALUE, TAG_VALUE) {
    echo "${PROJECT_VALUE} : ${TAG_VALUE}"
}

def my_deploy_stage() {
    echo "In deploy stage"
}
return this
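If you want to keep the my_build_stage(project: 'projectvalue', tag: '1.0') call style from the question, Groovy collects named arguments into a single Map parameter, so the library function can read them as map keys. A minimal sketch (the Map signature is my assumption, not part of the answer above):

```groovy
// mylibs - named arguments arrive as one Map
def my_build_stage(Map args) {
    echo "${args.project} : ${args.tag}"
}
return this
```

Called as mylibrary.my_build_stage(project: PROJECT_VALUE, tag: TAG), Groovy packs the name/value pairs into args automatically.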

Jenkins custom DSL Cannot invoke method image() on null object

I'm trying to define a custom DSL as described in https://www.jenkins.io/doc/book/pipeline/shared-libraries/#defining-custom-steps.
It seems to work if I just put simple commands in the {} block,
but it fails with more complicated commands.
(root)
+- vars
| +- shareL.groovy
| +- xxx.groovy
| +- monitorStep.groovy
shareL.groovy
def install() {
    print "test install"
}

def checkout() {
    print "test checkout"
}
monitorStep.groovy
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    // This is where the magic happens - put your pipeline snippets in here, get variables from config.
    script {
        def status
        def failed_cause = ''
        try {
            body()
        } catch (e) {
            status = 'fail'
            failed_cause = "${e}"
            throw e
        } finally {
            def myDataMap = [:]
            myDataMap['stage'] = STAGE_NAME
            myDataMap['status'] = status
            myDataMap['failed_cause'] = failed_cause
            influxDbPublisher selectedTarget: 'myTest', measurementName: 'myTestTable', customData: myDataMap
        }
    }
}
Jenkinsfile
#!groovy
@Library('myShareLibray#') _
pipeline {
    stages {
        stage('Checkout') {
            steps {
                script {
                    monitorStep {
                        shareL.checkout()
                    }
                }
            }
        }
        stage('Install') {
            steps {
                script {
                    monitorStep {
                        docker.image("node").inside() {
                            shareL.install()
                        }
                    }
                }
            }
        }
    }
}
first stage failed with
java.lang.NullPointerException: Cannot invoke method checkout() on null object
second stage failed with
java.lang.NullPointerException: Cannot invoke method image() on null object
The problem is that the closure cannot find the name shareL, which it looks up in the closure's delegate - in this case the map config.
You need to redeclare the map so that it exposes the name shareL, with a second name install under it that must be invokable.
The solution is to rewrite the map like this:
def config = [shareL: [install: { println "Inside the map" }]]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = config
body()
Then when you call body() it will resolve shareL.install(), but this will point to the entry in the map, not to the install() method in shareL.groovy.
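An alternative sketch, assuming the monitorStep body only runs steps and never needs to write "x = y" values into a config map: drop the map delegate entirely so that names like shareL and docker keep resolving against the owner (the Jenkinsfile script):

```groovy
// vars/monitorStep.groovy - sketch without a map delegate
def call(body) {
    def status
    def failed_cause = ''
    try {
        body()  // shareL.checkout(), docker.image(...) etc. resolve via the owner
    } catch (e) {
        status = 'fail'
        failed_cause = "${e}"
        throw e
    } finally {
        def myDataMap = [stage: STAGE_NAME, status: status, failed_cause: failed_cause]
        influxDbPublisher selectedTarget: 'myTest', measurementName: 'myTestTable', customData: myDataMap
    }
}
```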

Passing environment variable as a pipeline parameter to Jenkins shared library

I have a shared Jenkins library that has my pipeline for Jenkinsfile. The library is structured as follows:
myPipeline.groovy file
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = params.name
                }
            }
        }
    }
}
And my Jenkinsfile is as follows:
Jenkinsfile
@Library("some-shared-lib") _
myPipeline {
    name = "Some name"
}
Now, I would like to replace the "Some name" string with env.JOB_NAME. Normally in a Jenkinsfile, I would use name = "${env.JOB_NAME}" to get the info, but because I am using my shared library instead, it fails with:
java.lang.NullPointerException: Cannot get property 'JOB_NAME' on null object
I tried to play around with brackets and other notation but never got it to work. I think I am passing the parameter incorrectly. I would like the Jenkinsfile to assign "${env.JOB_NAME}" to the projectName variable once the library runs the pipeline that I am calling via myPipeline{}.
You can do it like this in myPipeline.groovy:
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = "${env.JOB_NAME}"
                }
            }
        }
    }
}

How to stop a Jenkins Pipeline build with using a closure processor

How can I have a closure stop a build as aborted without looking like a failure?
In my original Jenkins file I have something like this:
node() {
    println 'should print'
    if (shouldStop()) {
        return
    }
    println 'should NOT print'
}
# Output
should print
The return out of the node block ends the build with whatever status it presently has and does not perform the additional actions.
But if I have a closure in the mix, the unwanted code still executes:
def withMyAction(closure) {
    def result
    try {
        prepareEnv()
        result = closure()
    } finally {
        cleanUp()
    }
    return result
}

node() {
    println 'should print'
    withMyAction {
        if (shouldStop()) {
            return
        }
    }
    println 'should NOT print'
}
# Output
should print
should NOT print
It seems that my options are:
1. Stop via throw new Exception or error()
2. Wrap my downstream sections of the job in if status == ... checks
Is there an approach where I can gracefully stop the build early without using return from Jenkinsfile as my approach?
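One pattern often suggested for this - a sketch, not verified against this exact setup, and constructing the exception may require script approval in a sandboxed pipeline - is to throw FlowInterruptedException, the exception Jenkins itself uses for aborts, from inside the closure and let it propagate. The build then ends as ABORTED rather than FAILURE, and the processor's finally block still runs:

```groovy
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
import hudson.model.Result

node() {
    println 'should print'
    withMyAction {
        if (shouldStop()) {
            // Ends the whole build as ABORTED (grey) instead of FAILURE (red);
            // cleanUp() in withMyAction's finally block still executes.
            throw new FlowInterruptedException(Result.ABORTED)
        }
    }
    println 'should NOT print'  // not reached when shouldStop() is true
}
```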
