I want to use Groovy closures with collections in a Jenkinsfile:
// other parts removed for brevity
steps {
    script {
        def testList = ["item1", "item2", "item3"]
        testList.stream().map{it+".jpeg"}.each{println it}
    }
}
// other parts removed for brevity
However, it gives this error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: java.util.stream.ReferencePipeline$Head.map() is applicable for argument types: (org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [org.jenkinsci.plugins.workflow.cps.CpsClosure2@248ba046]
Possible solutions: map(java.util.function.Function), max(java.util.Comparator), min(java.util.Comparator), wait(), dump(), grep()
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:153)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:161)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:165)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
at WorkflowScript.run(WorkflowScript:49)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:86)
...
I am fighting the same problem: constructs I use all the time in Groovy, like the handy collection methods, keep breaking.
It turns out that in a Jenkinsfile the code is passed through a transformer called Groovy CPS, which enables Jenkins to pause and resume execution of a stage. In the process, closures and other objects are rewritten into CPS-transformed classes, and that causes the kind of error you're seeing here: you are mixing non-CPS code (the Java library methods, such as java.util.stream) with CPS-transformed code (anything defined inside the Jenkinsfile).
There is documentation of this behavior, but I didn't find it very helpful, so I am reverting to doing things the way I would in pre-lambda Java, with lots of for loops :(
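A minimal sketch of that fallback for the snippet in the question (my illustration, not part of the original answer): drop java.util.stream entirely and use a plain loop inside the script block.
steps {
    script {
        def testList = ["item1", "item2", "item3"]
        // a Java-style loop survives the CPS transformation without trouble
        for (int i = 0; i < testList.size(); i++) {
            echo testList[i] + ".jpeg"
        }
    }
}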
While it’s not as handy as locally defined closures, you can, in some cases, define methods in the global scope and then rely on them via the Groovy method pointer operator.
void doSomething(final String key, final String value) {
    println "${key} → ${value}"
}

pipeline {
    stages {
        stage('Process data') {
            steps {
                script {
                    final Map myData = [
                        "foo": "bar",
                        "plop": "yo",
                    ]
                    myData.each(this.&doSomething)
                }
            }
        }
    }
}
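The same pattern should carry over to the list from the original question; a short sketch (addExtension is a name I made up for illustration):
void addExtension(final String item) {
    echo item + ".jpeg"
}

// ...and inside a steps/script block:
script {
    ["item1", "item2", "item3"].each(this.&addExtension)
}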
I've been getting an odd error in Jenkins when trying to run readFile() within a shared library, so that I can determine whether a Dockerfile is pulling from ECR or Docker Hub. The error is:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: org.myorg.DockerManager.readFile() is applicable for argument types: (java.lang.String) values: [DockerServiceDescription/Dockerfile]
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.methodCall(DefaultInvoker.java:20)
at org.proj.DockerManager.login(/tmp/persistence/jobs/PROJ/jobs/builds/jobs/proj-subproj/branches/PROJ-4190/builds/47/libs/Global/src/org/me/DockerManager.groovy:26)
The method that's triggering the error is:
/**
 * Login to docker
 */
void login() {
    dockerfileHandle = readFile(dockerfile)
    dockerfileLines = dockerfileHandle.readLines()
    def dockerfileFROM = dockerfileLines.find { dockerfileLine -> dockerfileLine =~ /^FROM / }
    println("FROM: " + dockerfileFROM)
    if (dockerfileFROM =~ /\.ecr\./) {
        println("Using ECR")
        ecrlogin()
    } else {
        println("Using DOCKER HUB")
        dockerhublogin()
    }
}
However, that same code worked perfectly well within the pipeline code; it only breaks when moved into a shared library.
Any ideas?
Access to pipeline steps is done via an instance of CpsScript that represents the running pipeline. To use those steps in a shared library class, you have to pass this into your library, either as a constructor or a method argument:
/**
 * Login to docker
 */
void login(def script) { // actually a CpsScript, but typed as def so no import is needed
    dockerfileHandle = script.readFile(dockerfile)
    //...
}
Then, from your actual Jenkinsfile, you call it like:
myInstance.login(this)
Also note that your println calls end up in the Jenkins system log, not the build log; what you are aiming for is script.echo().
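A slightly fuller sketch of the constructor variant (my own illustration; the class name and Dockerfile path come from the question, the rest is assumed):
// src/org/myorg/DockerManager.groovy
package org.myorg

class DockerManager implements Serializable {
    private def script          // the running pipeline script (CpsScript), injected once
    private String dockerfile

    DockerManager(def script, String dockerfile) {
        this.script = script
        this.dockerfile = dockerfile
    }

    void login() {
        def dockerfileFROM = script.readFile(dockerfile).readLines().find { it =~ /^FROM / }
        script.echo("FROM: " + dockerfileFROM)
        if (dockerfileFROM =~ /\.ecr\./) {
            script.echo("Using ECR")        // call your ecrlogin() here
        } else {
            script.echo("Using DOCKER HUB") // call your dockerhublogin() here
        }
    }
}

// In the Jenkinsfile, after loading the library:
def mgr = new org.myorg.DockerManager(this, 'DockerServiceDescription/Dockerfile')
mgr.login()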
I am trying to have one common shared library that contains the actual pipeline as code, something like this under vars/demo.groovy:
def call(Map pipelineParams) {
    pipeline {
        agent {
            docker { image 'centos:latest' }
        }
        stages {
            stage("Env Variables") {
                steps {
                    sh "printenv"
                    //echo ${SERVICE_NAME}
                }
            }
            stage("test") {
                steps {
                    sh "printenv"
                }
            }
        }
    }
}
And I will be accessing the shared library from my Jenkinsfile like this:
library identifier: 'mylibraryname@master',
    //'master' refers to a valid git-ref
    //'mylibraryname' can be any name
    retriever: modernSCM([
        $class: 'GitSCMSource',
        //credentialsId: 'your-credentials-id',
        remote: 'GIT URL'
    ])
demo()
It is working as expected, but I want to pass an extra environment variable or override existing variables from my Jenkinsfile without making changes to the shared library that holds my pipeline as code. Can you help me with how this could be achieved?
I have tried passing the variables as below:
demo {
    service = 'test'
    var1 = 'value'
}
And tried to access them this way:
def call(Map pipelineParams) {
    pipeline {
        agent {
            docker { image 'centos:latest' }
        }
        stages {
            stage("Env Variables") {
                steps {
                    sh "printenv"
                    echo "pipelineParams.service"
                }
            }
            stage("test") {
                steps {
                    sh ' echo "hello pipelineParams.service '
                }
            }
        }
    }
}
But I am getting the following error:
[Pipeline] End of Pipeline
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: demo.call() is applicable for argument types: (org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [org.jenkinsci.plugins.workflow.cps.CpsClosure2@cad5c94]
Possible solutions: call(java.util.Map), wait(), any(), wait(long), main([Ljava.lang.String;), each(groovy.lang.Closure)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:64)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:160)
at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:157)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:142)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:158)
The signature of this is a Map, as defined here:
def call(Map pipelineParams)
So you need to pass a map (https://www.baeldung.com/groovy-maps).
What this would look like is:
demo([
    service: 'test',
    var1: 'value'
])
Your echo statement is also not quite right; it should be:
echo(pipelineParams.service)
There is no need for a GString, or a string at all, because it is a Groovy variable.
Note: I use () in my examples. Strictly speaking you don't need them; I just prefer to code this way so it's clear these are method arguments.
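Putting that together, the shared library side might look something like this (my own sketch; for the sh step a double-quoted GString is needed so the value gets interpolated into the shell command):
// vars/demo.groovy
def call(Map pipelineParams) {
    pipeline {
        agent {
            docker { image 'centos:latest' }
        }
        stages {
            stage("Env Variables") {
                steps {
                    echo(pipelineParams.service)
                    sh "echo hello ${pipelineParams.service}"
                }
            }
        }
    }
}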
You could also just create env vars, though I probably don't recommend it and would do it more like the way you are currently planning.
withEnv(['SERVICE=test', 'VAR1=value']) {
    demo([:]) // empty map here; if you did this you would change your call method to drop the Map argument
}
This way the env vars only live until the end of the closure (the last } brace).
Another way is to set them directly, but then they just exist globally for the rest of the run. Not great.
env.SERVICE = 'test'
env.VAR1 = 'value'
demo([:])
In either case, in your shared lib you would check:
echo env.SERVICE
Again, though, I think it's better the first way you're doing it: accepting a map argument and using that. Env vars are global and can sometimes cause issues depending on what you're running.
I've been trying for a while now to start working towards moving our freestyle projects over to Pipeline. To do so, I feel it would be best to build up a shared library, since most of our builds are the same. I read through this blog post from Jenkins and came up with the following:
// vars/buildGitWebProject.groovy
def call(body) {
    def args = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = args
    body()

    pipeline {
        agent {
            node {
                label 'master'
                customWorkspace "c:\\jenkins_repos\\${args.repositoryName}\\${args.branchName}"
            }
        }
        environment {
            REPOSITORY_NAME = "${args.repositoryName}"
            BRANCH_NAME = "${args.branchName}"
            SOLUTION_NAME = "${args.solutionName}"
        }
        options {
            buildDiscarder(logRotator(numToKeepStr: '3'))
            skipStagesAfterUnstable()
            timestamps()
        }
        stages {
            stage("checkout") {
                steps {
                    script {
                        assert REPOSITORY_NAME != null : "repositoryName is null. Please include it in configuration."
                        assert BRANCH_NAME != null : "branchName is null. Please include it in configuration."
                        assert SOLUTION_NAME != null : "solutionName is null. Please include it in configuration."
                    }
                    echo "building with ${REPOSITORY_NAME}"
                    echo "building with ${BRANCH_NAME}"
                    echo "building with ${SOLUTION_NAME}"
                    checkoutFromGitWeb(args)
                }
            }
            stage('build and test') {
                steps {
                    executeRake(
                        "set_assembly_to_current_version",
                        "build_solution[$args.solutionName, Release, Any CPU]",
                        "copy_to_deployment_folder",
                        "execute_dev_dropkick"
                    )
                }
            }
        }
        post {
            always {
                sendEmail(args)
            }
        }
    }
}
In my pipeline project I configured the Pipeline to use Pipeline script, and the script is as follows:
buildGitWebProject {
    repositoryName:'my-git-repo'
    branchName: 'qa'
    solutionName: 'my_csharp_solution.sln'
    emailTo='testuser@domain.com'
}
I've tried with and without the environment block, but the result ends up the same: the value is null for each of those arguments. Oddly enough, the script portion of the code doesn't make the build fail either, so I'm not sure what's wrong with that. The echo parts show null as well. What am I doing wrong?
Your Closure body is not behaving the way you expect/believe it should.
At the beginning of your method you have:
def call(body) {
    def args = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = args
    body()
Your call body is:
buildGitWebProject {
    repositoryName:'my-git-repo'
    branchName: 'qa'
    solutionName: 'my_csharp_solution.sln'
    emailTo='testuser@domain.com'
}
Let's take a stab at debugging this.
If you add a println(args) after the body() in your call(body) method you will see something like this:
[emailTo:testuser@domain.com]
But, only one of the values got set. What is going on?
There are a few things to understand here:
What does setting a delegate of a Closure do?
Why does repositoryName:'my-git-repo' not do anything?
Why does emailTo='testuser@domain.com' set the property in the map?
What does setting a delegate of a Closure do?
This one is mostly straightforward, but I think it helps to understand. Closure is powerful and is the Swiss Army knife of Groovy. The delegate essentially sets what this refers to in the body of the Closure. You are also using a resolveStrategy of Closure.DELEGATE_FIRST, so methods and properties are looked up on the delegate first and then in the enclosing scope (the owner); see the Javadoc for an in-depth explanation. If you call methods like size(), put(...), entrySet(), etc., they are first tried on the delegate. The same is true for property access.
Why does repositoryName:'my-git-repo' not do anything?
This may appear to be a Groovy map literal, but it is not; these are actually labeled statements. If you surrounded it with square brackets instead, like [repositoryName:'my-git-repo'], that would be a map literal. But all that would do is create a map literal and immediately discard it. We want to make sure these values actually get consumed in the Closure.
Why does emailTo='testuser@domain.com' set the property in the map?
This is using Groovy's map property notation. As mentioned earlier, you have set the delegate of the Closure to def args = [:], which is a Map, and the resolveStrategy to Closure.DELEGATE_FIRST. That makes emailTo='testuser@domain.com' resolve against args, which is why the emailTo key gets set to that value. It is equivalent to calling args.emailTo='testuser@domain.com'.
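Here is a small plain-Groovy sketch of my own (runnable outside Jenkins) that shows both effects side by side:
def args = [:]
def body = {
    repositoryName: 'my-git-repo'      // labeled statement: evaluates the string and throws it away
    emailTo = 'testuser@domain.com'    // property write, resolved against the delegate (the map)
}
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = args
body()
println(args)   // => [emailTo:testuser@domain.com]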
So, how do you fix this?
If you want to keep your Closure syntax approach, you could change the body of your call to anything that essentially stores values in the delegated args map:
buildGitWebProject {
    repositoryName = 'my-git-repo'
    branchName = 'qa'
    solutionName = 'my_csharp_solution.sln'
    emailTo = 'testuser@domain.com'
}

buildGitWebProject {
    put('repositoryName', 'my-git-repo')
    put('branchName', 'qa')
    put('solutionName', 'my_csharp_solution.sln')
    put('emailTo', 'testuser@domain.com')
}

buildGitWebProject {
    delegate.repositoryName = 'my-git-repo'
    delegate.branchName = 'qa'
    delegate.solutionName = 'my_csharp_solution.sln'
    delegate.emailTo = 'testuser@domain.com'
}

buildGitWebProject {
    // example of Map literal where the square brackets are not needed
    putAll(
        repositoryName:'my-git-repo',
        branchName: 'qa',
        solutionName: 'my_csharp_solution.sln',
        emailTo: 'testuser@domain.com'
    )
}
Another way would be to have your call take in the Map as an argument and remove your Closure.
def call(Map args) {
    // no more args and delegates needed right now
}

buildGitWebProject(
    repositoryName: 'my-git-repo',
    branchName: 'qa',
    solutionName: 'my_csharp_solution.sln',
    emailTo: 'testuser@domain.com'
)
There are also other ways you could model your API; it will depend on the UX you want to provide.
Side note around declarative pipelines in shared library code:
It's worth keeping in mind the limitations of declarative pipelines in shared libraries. It looks like you are already doing it in vars, but I'm just adding it here for completeness. At the very end of the documentation it is stated:
Only entire pipelines can be defined in shared libraries as of this time. This can only be done in vars/*.groovy, and only in a call method. Only one Declarative Pipeline can be executed in a single build, and if you attempt to execute a second one, your build will fail as a result.
For my build job "generated-job-1" I need several parameters, which are passed in when the build (of the generated-job-1) is triggered via URL.
Here is my job definition with parameters inside the seed job DSL:
job('generated-job-1') {
    label('master')
    parameters {
        stringParam('DEPLOY_URI', 'https://192.168.200.176/hyperManager', 'Provide the URL where DeploymentManager can be accessed.')
        stringParam('REG_ID', '12', 'The id of the owner (Registration) of this deployment.')
    }
    steps {
        groovyCommand(readFileFromWorkspace('stepscript.groovy')) {
            prop('name', 'value')
            prop('DEPLOY_URI', $DEPLOY_URI)
        }
    }
}
I tried to use DEPLOY_URI, $DEPLOY_URI and ${DEPLOY_URI}, and the build fails with different error messages like:
No such property: DEPLOY_URI for class: javaposse.jobdsl.dsl.helpers.step.GroovyContext
or
ERROR: (script, line 12) No such property: $DEPLOY_URI for class: javaposse.jobdsl.dsl.helpers.step.GroovyContext
or
ERROR: (script, line 12) No signature of method: javaposse.jobdsl.dsl.helpers.step.GroovyContext.$() is applicable for argument types: (script$_run_closure1$_closure3$_closure4$_closure5) values: [script$_run_closure1$_closure3$_closure4$_closure5@1a11cf0]
How can I define and pass those parameters to my step-script.groovy?
How could I use those parameters in other steps, such as shell or batchFile?
How do I access those parameters in my step-script.groovy, to work with the given data?
I have searched for a while now and tried hard to get this working, with no success so far.
Help really appreciated, as I am new to Job DSL and to Groovy.
Thanks in advance,
Anne
You need to put the variable name in quotes so that it gets evaluated when the generated job is executed, not when the DSL script runs.
job('generated-job-1') {
    parameters {
        stringParam('DEPLOY_URI', '...', '...')
    }
    steps {
        groovyCommand(readFileFromWorkspace('stepscript.groovy')) {
            prop('DEPLOY_URI', '$DEPLOY_URI')
        }
    }
}
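As for the follow-up questions: in shell or batchFile steps of the generated job you can use $DEPLOY_URI (or %DEPLOY_URI%) directly, since it is an ordinary build parameter exposed as an environment variable. For the Groovy step, my understanding is that prop(...) entries reach the script as Java system properties, so a sketch of stepscript.groovy could look like this (treat the exact mechanism as an assumption and check the Groovy plugin documentation for your version):
// stepscript.groovy
def deployUri = System.getProperty('DEPLOY_URI')  // assumes prop('DEPLOY_URI', ...) arrives as a system property
println "Deploying to ${deployUri}"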
I have a Pipeline job in Jenkins (v2.7.1) where I'd like to print each element of a multi-line string parameter (Params) that contains three strings, one per line: Foo, Bar, Baz.
So I've tried the following syntax (using split and each):
Params.split("\\r?\\n").each { param ->
    println "Param: ${param}"
}
but it fails with:
java.lang.UnsupportedOperationException: Calling public static java.lang.Object
org.codehaus.groovy.runtime.DefaultGroovyMethods.each(java.lang.Object,groovy.lang.Closure) on a CPS-transformed closure is not yet supported (JENKINS-26481); encapsulate in a #NonCPS method, or use Java-style loops
at org.jenkinsci.plugins.workflow.cps.GroovyClassLoaderWhitelist.checkJenkins26481(GroovyClassLoaderWhitelist.java:90)
which suggests encapsulating it in a @NonCPS method, or using Java-style loops.
So I've tried to encapsulate it in a @NonCPS method like:
@NonCPS
def printParams() {
    Params.split("\\r?\\n").each { param ->
        println "Param: ${param}"
    }
}
printParams()
but it fails with:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use staticMethod org.codehaus.groovy.runtime.DefaultGroovyMethods println groovy.lang.Closure java.lang.Object
Without the function (as per the first example), adding @NonCPS at the beginning makes it complain about an unexpected token.
I also tried Java-style syntax as suggested, using a for loop (similar to here):
String[] params = Params.split("\\r?\\n")
for (String param : params) {
    println "Param: ${param}"
}
which seems to work in plain Groovy, but it fails in Jenkins with:
java.io.NotSerializableException: java.util.AbstractList$Itr
at org.jboss.marshalling.river.RiverMarshaller.doWriteObject(RiverMarshaller.java:860)
Which syntax should I use to make it work?
The code works fine after disabling the Use Groovy Sandbox option and adding a @NonCPS helper method. Alternatively, as suggested by @agg3l, go to Manage Jenkins (In-process Script Approval) to permit this method access.
So the working code is (same as the 2nd example):
@NonCPS
def printParams() {
    Params.split("\\r?\\n").each { param ->
        println "Param: ${param}"
    }
}
printParams()
I know it's an old post, but this is my way to do it; hopefully it helps someone else:
Params.readLines().each {
    println it
    if (it) {
        // put work here if you want to skip operating on empty lines
    }
}