I'm trying to define a custom DSL step, following https://www.jenkins.io/doc/book/pipeline/shared-libraries/#defining-custom-steps
It seems to work if I just put a simple command in the {} block,
but it fails when I use a more complicated command.
(root)
+- vars
| +- shareL.groovy
| +- xxx.groovy
| +- monitorStep.groovy
shareL.groovy
def install() {
    print "test install"
}

def checkout() {
    print "test checkout"
}
monitorStep.groovy
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    // This is where the magic happens - put your pipeline snippets in here, get variables from config.
    script {
        def status
        def failed_cause = ''
        try {
            body()
        } catch (e) {
            status = 'fail'
            failed_cause = "${e}"
            throw e
        } finally {
            def myDataMap = [:]
            myDataMap['stage'] = STAGE_NAME
            myDataMap['status'] = status
            myDataMap['failed_cause'] = failed_cause
            influxDbPublisher selectedTarget: 'myTest', measurementName: 'myTestTable', customData: myDataMap
        }
    }
}
Jenkinsfile
#!groovy
@Library('myShareLibray') _

pipeline {
    stages {
        stage('Checkout') {
            steps {
                script {
                    monitorStep {
                        shareL.checkout()
                    }
                }
            }
        }
        stage('Install') {
            steps {
                script {
                    monitorStep {
                        docker.image("node").inside() {
                            shareL.install()
                        }
                    }
                }
            }
        }
    }
}
The first stage failed with
java.lang.NullPointerException: Cannot invoke method checkout() on null object
and the second stage failed with
java.lang.NullPointerException: Cannot invoke method image() on null object
The problem is that the closure cannot find the name shareL, which it looks up in the closure's delegate, in our case the map config.
You need to redeclare the map to expose the name shareL, and under it a second name install, which must be invokable.
The solution is to rewrite the map like this:
def config = [ shareL : [install: { println "Inside the map" }] ]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = config
body()
Then when you call body() it will find shareL.install(), but it will not point to the install() method defined in shareL.groovy; it will point to the closure stored in the map.
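As a fuller sketch of that idea (hedged: it assumes monitorStep.groovy can call the other global variable shareL directly, which steps in vars/ normally can), the map entries can forward to the real library code:
def call(body) {
    // Sketch only: expose the name `shareL` inside the closure's delegate
    // and forward each entry to the real library step
    def config = [shareL: [install : { shareL.install() },
                           checkout: { shareL.checkout() }]]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
}
Note that this only exposes shareL; the docker step used in the second stage would still resolve against the map and fail the same way.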
Related
I simply want to pass the repo name cloud-nates to the shared pipeline, so I've passed the parameter deployName from the Jenkinsfile to the shared library. Below is the Jenkinsfile:
#Library("minePipelines#auto") _
if (env.BRANCH_NAME in ["auto", "stage"]) {
reloadDeploy {
deployName = "cloud-nates"
}
}
And below is the shared library reloadDeploy.groovy code:
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    properties([disableConcurrentBuilds()])

    node("ops") {
        timeout(unit: 'SECONDS', time: 60) {
            stage("Reload Deployment") {
                echo params.deployName
            }
        }
    }
}
This prints null in the console output. I've googled it but had no luck. Please feel free to ask if anything is unclear.
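For what it's worth, the deployName assignment inside the reloadDeploy block lands in the config map built in call(body), not in the build's params object, which is why params.deployName prints null. A minimal sketch of reading it from config instead (an assumption on my part, not a confirmed fix from this thread):
node("ops") {
    timeout(unit: 'SECONDS', time: 60) {
        stage("Reload Deployment") {
            echo config.deployName   // sketch: read from the collected config, not params
        }
    }
}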
I have a shared Jenkins library that contains the pipeline for my Jenkinsfile. The library is structured as follows:
myPipeline.groovy file
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()

    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = params.name
                }
            }
        }
    }
}
And my Jenkinsfile is as follows:
Jenkinsfile
#Library("some-shared-lib") _
myPipeline{
name = "Some name"
}
Now I would like to replace the "Some name" string with the value of env.JOB_NAME. Normally in a Jenkinsfile I would use name = "${env.JOB_NAME}" to get that info, but because I am using my shared library instead, it fails to work. The error message is as follows:
java.lang.NullPointerException: Cannot get property 'JOB_NAME' on null object
I tried to play around with brackets and other notation but never got it to work. I think I am passing the parameter incorrectly. I would like the Jenkinsfile to assign "${env.JOB_NAME}" to the projectName variable once the library runs the pipeline I am calling (via the myPipeline{} step).
You can do it like this in myPipeline.groovy:
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()

    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = "${env.JOB_NAME}"
                }
            }
        }
    }
}
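This works because env is resolved inside the library script itself, where it is a real global, instead of inside the Jenkinsfile closure, whose delegate is the plain params map (a map lookup returns null for an unknown key, hence the NullPointerException). If you prefer to keep the assignment in the Jenkinsfile, a possible workaround, sketched under the assumption that `this` inside the closure still resolves lexically to the pipeline script and is unaffected by the map delegate, is:
@Library("some-shared-lib") _

myPipeline {
    // Sketch only: `this` refers to the enclosing pipeline script, so env is reachable
    name = "${this.env.JOB_NAME}"
}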
I have this template:
def call(body) {
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()

    pipeline {
        agent any
        ....
        stages {
            stage('My stages') {
                steps {
                    script {
                        pipelineParams.stagesParams.each { k, v ->
                            stage("$k") {
                                $v
                            }
                        }
                    }
                }
            }
        }
        post { ... }
    }
}
Then I use the template in a pipeline:
@Library('pipeline-library') _

pipelineTemplateBasic {
    stagesParams = [
        'First stage': sh "do something...",
        'Second stage': myCustomCommand("foo","bar")
    ]
}
In stagesParams I pass the invocations of my commands (sh and myCustomCommand), and they land in the template as $v. How can I then execute them? Some sort of invokeMethod($v)?
At the moment I am getting this error:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
The problem with using node is that it doesn't work in situations like parallel:
parallelStages = [:]
v.each { k2, v2 ->
    parallelStages["$k2"] = {
        // node {
        stage("$k2") {
            notifySlackStartStage()
            $v2
            checkLog()
        }
        // }
    }
}
If you want to execute the sh step provided via the map, you need to store the map values as closures, e.g.:
@Library('pipeline-library') _

pipelineTemplateBasic {
    stagesParams = [
        'First stage': {
            sh "do something..."
        },
        'Second stage': {
            myCustomCommand("foo","bar")
        }
    ]
}
Then, in the script part of your pipeline stage, you will need to execute the closure, but also set its delegate and resolution strategy so that names resolve against the workflow script, e.g.:
script {
    pipelineParams.stagesParams.each { k, v ->
        stage("$k") {
            v.resolveStrategy = Closure.DELEGATE_FIRST
            v.delegate = this
            v.call()
        }
    }
}
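For the parallel case mentioned in the question, the same delegation fix should apply when the closures are collected into the map handed to parallel. A rough sketch (an assumption, reusing the flat stagesParams map of closures from above):
// Sketch only: set delegate/strategy on each stage closure before parallel()
def parallelStages = [:]
pipelineParams.stagesParams.each { k, v ->
    parallelStages[k] = {
        stage(k) {
            v.resolveStrategy = Closure.DELEGATE_FIRST
            v.delegate = this
            v.call()
        }
    }
}
parallel parallelStages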
I'd like to load a closure for my Jenkins build, but pass it some variables that are generic to any type of build (Go, Java, Docker) running on our system. Since I'm loading the specific closure from a separate Groovy file, it doesn't see those variables. To keep the example simpler, I've commented out the load and inlined that closure.
I'm a little unsure how to do this: how do I pass the config from buildProject to buildSpecificProject? Am I referring to it wrong?
#!/usr/bin/groovy
//def buildSpecificProject = load 'buildSpecificProject.groovy'
def buildSpecificProject = { body ->
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    println config.name
    println config.builddirectory
}

def buildProject = { projbody ->
    def config = [:]
    projbody.resolveStrategy = Closure.DELEGATE_FIRST
    projbody.delegate = config
    projbody()
    config.builddirectory = "/bar"
    return config
}

try {
    def newProjectVersion = buildSpecificProject { body ->
        buildProject { projbody ->
            name = 'projectname'
            versionPrefix = "4.2.0"
            fetchFromURL = 'git@github.com:myorg/myproject.git'
        }
    }
    println "New Project Version = ${newProjectVersion}\n"
} catch (err) {
    println err
}
I have no real experience with build scripts in Jenkins, so maybe my answer is not applicable, but from a pure Groovy perspective the situation is the following:
The config variable in buildSpecificProject is a local variable to which you have no access unless you expose it or its value. Right now you do that by setting the delegate.
If we assume buildProject is only called from within a block given to buildSpecificProject, then the block given to buildProject is a Closure nested inside the Closure given to buildSpecificProject. That Closure object has an owner property, which in this case refers to the enclosing Closure instance (see http://groovy-lang.org/closures.html#_owner_of_a_closure). We know that enclosing closure has the configuration set as its delegate, so you can do projbody.owner.delegate to access the configuration set by buildSpecificProject.
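A minimal sketch of that owner chain, staying close to the original code (an illustration only, not the approach recommended below):
// Sketch: the nested closure's owner is the enclosing closure, whose
// delegate is the config map set by buildSpecificProject
def buildProject = { projbody ->
    projbody()
    def outerConfig = projbody.owner.delegate   // config map from buildSpecificProject
    outerConfig.builddirectory = "/bar"
    return outerConfig
}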
But actually I would consider doing something like this:
def buildSpecificProject = { body ->
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = [config: config] // expose config
    body()
    println config.name
    println config.builddirectory
    return config
}

def buildProject = { config, projbody ->
    projbody()
    config.builddirectory = "/bar"
}

try {
    def newProjectVersion = buildSpecificProject { body ->
        // make config accessible to buildProject by providing it as parameter
        buildProject(config) { projbody ->
            config.name = 'projectname'
            config.versionPrefix = "4.2.0"
            config.fetchFromURL = 'git@github.com:myorg/myproject.git'
        }
    }
    println "New Project Version = ${newProjectVersion}\n"
} catch (err) {
    println err
}
As you can see, it is actually enough for buildSpecificProject to set the delegate, which I set to a map with a single key named config containing the actual config. The disadvantage is, of course, that you now have to write config.name. Also note the call buildProject(config) { projbody -> ... }, which hands the config to buildProject. And of course we can combine both ideas:
def buildSpecificProject = { body ->
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    println config.name
    println config.builddirectory
    return config
}

def buildProject = { config, projbody ->
    projbody()
    config.builddirectory = "/bar"
}

try {
    def newProjectVersion = buildSpecificProject {
        buildProject(delegate) { projbody ->
            name = 'projectname'
            versionPrefix = "4.2.0"
            fetchFromURL = 'git@github.com:myorg/myproject.git'
        }
    }
    println "New Project Version = ${newProjectVersion}\n"
} catch (err) {
    println err
}
But I would not recommend this so much, as I do not like depending on the delegate in this manner; it can break too easily.
I have a Jenkins shared library with the following file:
vars/testlib.groovy
def foo() {
    echo 'foo'
}

def bar(body) {
    body.delegate = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
}
And a Pipeline script as follows:
Jenkinsfile
library 'testlib@master'

testlib.foo()

testlib.bar {
    testlib.foo()
}
I get the following output:
[Pipeline] echo
foo
[Pipeline] End of Pipeline
java.lang.NullPointerException: Cannot invoke method foo() on null object
For some reason, the closure being passed to testlib.bar doesn't see testlib anymore. This only happens if the resolution strategy favors the delegate; if I use OWNER_ONLY or OWNER_FIRST it works. It also works if I provide testlib in the delegate, either by setting it in the map or by just setting body.delegate = body.owner, and it works if I avoid the resolution by just referring to owner.testlib.foo in the closure. Furthermore, this only happens with library code; if I just make a test class in the Jenkinsfile it works fine.
It seems as though if the resolution strategy is to check the delegate, and the delegate doesn't provide that property, it immediately fails without bothering to check the owner next. Am I doing something wrong?
I can't explain exactly what is going on with the Groovy closure delegation in the Jenkins pipeline but I had a similar problem and I fixed it like this:
vars/foo.groovy:
def call() {
    echo 'foo'
}
vars/bar.groovy:
//
// Something like:
//
// bar {
//     script = {
//         foo()
//         return 'Called foo'
//     }
// }
//
def call(body) {
    def config = [:]
    body.delegate = config
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()

    // In the bar DSL element
    echo 'I am bar'

    // Expecting a script element as a closure. The instanceof needs script approvals
    //assert config.script != null, 'A script element was not supplied'
    //assert config.script instanceof Closure, 'The script element supplied must be a closure'

    // Call the script closure
    config.script.delegate = this
    config.script.resolveStrategy = Closure.DELEGATE_FIRST
    def result = config.script.call()

    // Returning the script result
    return result
}
Jenkinsfile:
library 'testlib@master'

def result = bar {
    script = {
        foo()
        return 'Called foo'
    }
}

echo "result from bar: ${result}"
Jenkins output:
[Pipeline] echo
I am bar
[Pipeline] echo
foo
[Pipeline] echo
result from bar: Called foo
[Pipeline] End of Pipeline
Finished: SUCCESS
Just consider the 'bar' DSL closure body as some configuration passed in the form of assignments like x = y. Make one of those a closure element that is executed by the implementation of bar(), and then you can call the other library elements that are defined. I have the code for this example on my GitHub: https://github.com/macg33zr/jenkins-pipeline-experiments. You might also want to try unit testing outside of Jenkins: I have an example using the JenkinsPipelineUnit library here: https://github.com/macg33zr/pipelineUnit. I recommend this unit-test approach if you are doing complex work in pipelines, as it will preserve your sanity!