How To Use groovy.xml.StreamingMarkupBuilder in Jenkins Pipeline

I am trying to modify an HTML file in a Jenkins pipeline and I need to add a span tag. In plain Groovy I can do the following:
def newNode = new StreamingMarkupBuilder().bind { span { mkp.yield("$child") } }
where child is a string to put in the span tag.
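For context, a self-contained version of that plain-Groovy snippet (the value of child here is just an example) would look roughly like this:

import groovy.xml.StreamingMarkupBuilder

def child = 'some text'                    // example value
def newNode = new StreamingMarkupBuilder().bind {
    span { mkp.yield(child) }              // mkp.yield escapes the text content
}
println newNode.toString()                 // prints: <span>some text</span>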
When I tried to do this in a Jenkins pipeline, I got an error related to a CPS mismatch, so I added @NonCPS. Now I am getting an error that says java.lang.NoSuchMethodError: No such DSL method 'span' found among steps.
I found this page about CPS mismatches: https://www.jenkins.io/doc/book/pipeline/cps-method-mismatches/. I think Jenkins is basically trying to use the span tag as a DSL method, similar to stage or steps. So is it possible to use the StreamingMarkupBuilder.bind() function the way I am trying to, without Jenkins interpreting the span tag as a DSL method?

I ended up just creating a Node object separately. So I used the following line:
def node = new groovy.util.Node(null, 'span', child)
Then I just appended that node to another node using the append function.
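For reference, a minimal sketch of that approach (the parent node and the value of child here are only examples, and inside a sandboxed pipeline the groovy.util classes may require script approval or a @NonCPS helper method):

def child = 'some text'
// Build the <span> node directly, so there is no builder DSL for Jenkins to
// mistake for pipeline steps.
def spanNode = new groovy.util.Node(null, 'span', child)
// Append it to a parent node (a freshly created <div> here; in practice this
// would be a node taken from the parsed HTML).
def parentNode = new groovy.util.Node(null, 'div')
parentNode.append(spanNode)
// Serialise the result back to markup.
def writer = new StringWriter()
new groovy.util.XmlNodePrinter(new PrintWriter(writer)).print(parentNode)
println writer.toString()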

Related

shared declarative pipeline: overloading call operator while keeping access to pipeline steps

I have a shared Jenkins pipeline that is currently invoked via its call() method.
The call method accepts a fixed set of parameters.
I want to change the call() signature to accept named parameters instead. This will make it easier to extend the parameters in the future.
Because I cannot update all the Jenkinsfiles that call the shared pipeline at once, I have to keep the signature backwards compatible.
My idea was to override the call() method like:
// old call method
def call(String repo, String version) {
    call(repo: repo, version: version)
}

def call(Map params) {
    pipeline {
        agent { label 'master' }
        [...]
    }
}
If I invoke the pipeline via call("repo1", "master"), it fails with: No such DSL method 'agent'.
It seems I lost access to the variable binding of the Jenkinsfile (?).
How do I ensure that the jenkins steps are still accessible when overloading call()?
Is there maybe a better solution to keep the shared pipelines compatible while changing the method signature?
According to this site: Jenkins Templating Engine, overloading of a step should work. I checked it and it works in my environment.
It seems like your issue with No such DSL method 'agent' is connected to another issue, not to the overloading.
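For reference, this is roughly what the two signatures look like side by side in a shared library var (a sketch based on the code above; myPipeline and the stage contents are placeholders):

// vars/myPipeline.groovy
// Backwards-compatible positional signature that delegates to the Map variant.
def call(String repo, String version) {
    call(repo: repo, version: version)
}

def call(Map args) {
    pipeline {
        agent { label 'master' }
        stages {
            stage('Build') {
                steps {
                    echo "Building ${args.repo} at ${args.version}"
                }
            }
        }
    }
}

With this, myPipeline('repo1', 'master') and myPipeline(repo: 'repo1', version: 'master') should both end up in the same pipeline block.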

How to add to existing properties in groovy script?

I have a DSL file in which properties such as the log rotator and parameters are defined for a Jenkins job. However, I want to add a property to the Jenkins job from a Groovy script. I do this by putting
properties([pipelineTriggers([githubPush()])])
in the corresponding Groovy script. However, this overwrites all other parameters and properties defined in the DSL script.
What I have right now is to duplicate all properties from the DSL script in the Groovy script as well, but that creates two different places where developers need to change the properties. Is there a way in Groovy to simply add a new property instead of overwriting the old ones?
Something like
properties.add([pipelineTriggers([githubPush()])])
would be very helpful.
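One way to keep the list in a single place is to have the common properties come from one helper and concatenate the extra entry onto them; a sketch, assuming a hypothetical shared library global commonJobProperties() that returns the properties every job should have:

// commonJobProperties() is a hypothetical helper (e.g. a shared library var)
// returning something like [buildDiscarder(logRotator(numToKeepStr: '10')), ...].
def common = commonJobProperties()

// Append the extra trigger instead of replacing everything:
properties(common + [pipelineTriggers([githubPush()])])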

Use special agent for whole pipeline when a condition is met

There is a declarative pipeline. At the beginning of the pipeline block the agent is selected with the agent directive, using label-based selection. The agent selected this way is the standard/default agent.
How can a special agent be set for the whole pipeline when a certain condition is met?
The plan is to do the condition check based on one of the pipeline's parameters. Can that work?
What are the points the chosen approach needs to address?
Current solution blueprint:
Groovy code prior to the pipeline block.
That Groovy code sets a variable based on the value of a pipeline parameter (how do I access a pipeline parameter from Groovy code located outside the pipeline block?).
The agent section uses the variable set in the Groovy code, matching the label the special agent is attached to.
According to both Jenkins.io and CloudBees, dynamic agent selection is not supported with Declarative Pipeline syntax, so adding a "when" expression within the agent block will not work. However, the approach below can be tried:
1. Create a pipeline library with a Groovy file in the vars folder. Keep all the stages inside this file and parameterize the agent block.
2. In the Jenkinsfile, load the library and invoke that Groovy file using the call(body) syntax, passing the agent-deciding parameter from the Jenkinsfile (see the sketch below).
For the library syntax, please refer to this URL:
Shared Library syntax
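A minimal sketch of that pattern (agentPipeline, my-shared-library, and the USE_SPECIAL_AGENT parameter are hypothetical names, and the stage content is a placeholder):

// vars/agentPipeline.groovy in the shared library
def call(body) {
    // Collect the configuration the Jenkinsfile passes in through the closure.
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    pipeline {
        // The label is decided by the caller, so the whole pipeline runs on it.
        agent { label "${config.agentLabel}" }
        stages {
            stage('Build') {
                steps {
                    echo "Running on an agent labelled ${config.agentLabel}"
                }
            }
        }
    }
}

// Jenkinsfile
@Library('my-shared-library') _
agentPipeline {
    // Assumes the job already has a boolean parameter named USE_SPECIAL_AGENT.
    agentLabel = params.USE_SPECIAL_AGENT ? 'special-agent' : 'default-agent'
}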

Adding custom job property to Jenkins job

I want to add a new mandatory job property to capture some custom fields in the Jenkins job. I searched the plugin list but couldn't find any relevant plugin that solves the issue. Is there a plugin to solve this? (Note: the Extra Columns plugin doesn't solve my use case.)
A freestyle job can be configured to build with parameters. See: https://wiki.jenkins.io/display/JENKINS/Parameterized+Build
You can configure the parameter type (string, boolean, drop down etc), give a description of the parameter and a default value. The string parameters can include validation rules:
https://wiki.jenkins.io/display/JENKINS/Validating+String+Parameter+Plugin
Though this only warns when the current parameter value does not meet the regex validation rule, it doesn't prevent the build from being submitted. If submitted in this state, however, the build will fail.
From a quick Google search, it appears this doesn't work for pipeline jobs. See the last comment from Miguelángel Fernández on the plugin page linked above:
If you look at the implementation of class ValidatingStringParameterValue you'll see that it overrides the implementation of public BuildWrapper createBuildWrapper(AbstractBuild build) in a way that aborts if the string is invalid. This will only work on Freestyle jobs and other job types extending AbstractBuild. I'm afraid this does not apply to pipeline jobs. Maybe in your prior project you used freestyle jobs.
An alternative for freestyle jobs is to do in-job validation before initiating any build steps, using the 'Prepare an environment for the run' option from:
https://wiki.jenkins.io/display/JENKINS/EnvInject+Plugin
You would need to write groovy to check the parameters submitted and abort the build at this point if the values aren't suitable. Something like:
def validateString = binding.variables.get('testParam')
if (!validateString.matches('\\d+')) {
    println "failure of parameter validation - does not match regex"
    throw new InterruptedException()
} else {
    println "Validation passed, carry on with build"
}
This doesn't work on pipeline builds, as the plugin page itself notes:
'This plugin has some known limitations. For Example, Pipeline Plugin is not fully supported.'
But if you are using scripted pipelines you can implement something similar:
stage 'start up'

if (!env.testParam.matches('\\d+')) {
    error 'failure of parameter validation - does not match regex'
}

Trigger Jenkins Job for every parameter

I have created a Global choice Parameter using Extensible Choice Parameter plugin.
I am using this parameter list in one of my parametrized jenkins job.
Is there a way in jenkins, where I can execute the job with each of the parameters in the Global choice Parameter list?
I have had a look at the Build Flow job in Jenkins, as suggested in this answer, but it seems it accepts only hardcoded parameters, not dynamic ones.
I finally managed to resolve this using the following steps (with great help from this post) -
As my parameter list is dynamic in nature and can be added to or modified by other jobs, we manage it in a text file.
Next, we used the Extensible Choice Parameter plugin to display the parameters, using the following Groovy script -
def list = []
File file = new File("D:/JenkinJob/parameterList.txt")
file.eachLine { line ->
    list.add("$line")
}
return list
Now I want to call this Jenkins job for each of the parameters.
For this, I have installed the Build Flow plugin and created a new Jenkins job of the BuildFlow type -
Next, get the Extended Choice Parameter plugin and configure it as follows -
Now, in the flow step of this job, write this script, where "Features" is the parameter just created above, and in the call to "build", pass the name of the job we want to call for each parameter -
def features = params['Features'].split(',')
for (feature in features) {
    build("JobYouWantToCall", JobParameter: feature)
}
