Groovy - readYaml() expecting java.util.LinkedHashMap instead of a file - jenkins

As part of our Jenkins solutions, we use Groovy in our pipelines.
In one of our Groovy files I want to update a docker-stack.yaml.
To do so I'm using readYaml():
stage("Write docker-stack.yaml") {
    def dockerStackYamlToWrite = readFile 'docker-stack.yaml'
    def dockerStackYaml = readYaml file: "docker-stack.yaml"
    def imageOrigin = dockerStackYaml.services[domain].image
    def versionSource = imageOrigin.substring(imageOrigin.lastIndexOf(":") + 1, imageOrigin.length())
    def imageWithNewVersion = imageOrigin.replace(versionSource, imageTag)
    dockerStackYamlToWrite = dockerStackYamlToWrite.replace(imageOrigin, imageWithNewVersion)
    sh "rm docker-stack.yaml"
    writeFile file: "docker-stack.yaml", text: dockerStackYamlToWrite
    sh "git add docker-stack.yaml"
    sh "git commit -m 'promote dockerStack to ${envname}'"
    sh "git push origin ${envname}"
}
I am using a test to validate my code:
import org.junit.Before
import org.junit.Test

class TestUpdateVersionInDockerStack extends JenkinsfileBaseTest {
    @Before
    void setUp() throws Exception {
        helper.registerAllowedMethod("build", [Map.class], null)
        helper.registerAllowedMethod("steps", [Object.class], null)
        super.setUp()
    }

    @Test
    void success() throws Exception {
        def script = loadScript("src/test/jenkins/updateVersionInDockerStack/success.jenkins")
        script.execute()
    }
}
Here is the success.jenkins:
def execute() {
    node() {
        stage("Build") {
            def version = buildVersion()
            updateVersionInDockerStack([
                DOMAIN      : "security-package",
                IMAGE_TAG   : version,
                GITHUB_ORGA : "Bla",
                TARGET_ENV  : "int"
            ])
        }
    }
}
return this
When I run my test I get this message:
groovy.lang.MissingMethodException: No signature of method: updateVersionInDockerStack.readYaml() is applicable for argument types: (java.util.LinkedHashMap) values: [[file:docker-stack.yaml]]
At this point I'm lost. From what I understand of the documentation, readYaml() can take a file as an argument.
Can you help me understand why it is expecting a LinkedHashMap? Do I have to convert my value into a LinkedHashMap?
Thank you

Your pipeline unit test fails because there is no readYaml method registered in the pipeline's allowed methods. In your TestUpdateVersionInDockerStack test class, simply add the following line to the setUp method:
helper.registerAllowedMethod("readYaml", [Map.class], null)
This tells the Jenkins Pipeline Unit environment that a method readYaml accepting a single argument of type Map is allowed in the pipeline, and each invocation of this method will be recorded in the unit test call stack. You can add a printCallStack() call to your test method to see the stack of all steps executed during the test:
@Test
void success() throws Exception {
    def script = loadScript("src/test/jenkins/updateVersionInDockerStack/success.jenkins")
    script.execute()
    printCallStack()
}
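If the pipeline under test actually reads values out of the parsed YAML (as dockerStackYaml.services[domain].image does above), a null stub is not enough. The closure form of registerAllowedMethod lets you return a fixture instead; here is a sketch, assuming the Jenkins Pipeline Unit helper and a hypothetical image name:

```groovy
// Stub readYaml to return a fixture map so downstream access such as
// dockerStackYaml.services[domain].image works in the unit test.
helper.registerAllowedMethod("readYaml", [Map.class]) { Map args ->
    // Hypothetical fixture mirroring the structure of docker-stack.yaml
    [services: ['security-package': [image: 'registry/security-package:1.0.0']]]
}
```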

Related

Jenkins samples groovy code vs plain groovy (closure errors)

In a jenkins shared library I can do something like this:
Jenkinsfile
@Library(value="my-shared-lib", changelog=false) _
jobGenerator {
    notifier = [notifyEveryUnstableBuild: true]
}
sharedLibary/vars/jobGenerator.groovy
def call(body) {
    println 'hi!'
}
To better understand the flow of what goes on, I have created two Groovy files locally (with no reference to Jenkins at all):
samples/launcher.groovy
jobGenerator {
    s = 's'
}
samples/jobGenerator.groovy
def call(body) {
    println 'inside jobGenerator'
}
But when I run that with:
groovy "/home/user/samples/launcher.groovy"
I get:
Caught: groovy.lang.MissingMethodException: No signature of method: launcher.jobGenerator() is applicable for argument types: (launcher$_run_closure1) values: [launcher$_run_closure1#61019f59]
groovy.lang.MissingMethodException: No signature of method: launcher.jobGenerator() is applicable for argument types: (launcher$_run_closure1) values: [launcher$_run_closure1#61019f59]
at launcher.run(launcher.groovy:2)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
So how much of the above code is Jenkins/shared-library specific? And is it even possible to write something like the above in plain Groovy?
Or, put another way: how do I convert the above Jenkins code to plain Groovy?
IMHO, the following is close to what Jenkins is doing:
launcher.groovy
// load library scripts/functions
def binding = this.getBinding()
def gshell = new GroovyShell(this.getClass().getClassLoader(), binding)
new File("./my-lib").traverse(nameFilter: ~/.*\.groovy$/) { f ->
    binding[f.name[0..-8]] = gshell.parse(f)
}
// main
bar {
    foo(name: "world")
}
./my-lib/foo.groovy
def call(Map m) {
    return "hello $m.name"
}
./my-lib/bar.groovy
def call(Closure c) {
    println("BAR: " + c())
}
#> groovy launcher.groovy
BAR: hello world
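For completeness, the assignment notifier = [...] inside the closure works in Jenkins because the library's call(body) typically delegates the closure to a config map. A plain-Groovy sketch of that idiom (no Jenkins involved; jobGenerator here is just a local closure):

```groovy
// Plain-Groovy sketch of the shared-library "config closure" idiom:
// property assignments inside the closure are captured in a map.
def jobGenerator = { Closure body ->
    def config = [:]
    body.delegate = config                       // unresolved assignments go to the map
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body()
    return config
}

def config = jobGenerator {
    notifier = [notifyEveryUnstableBuild: true]
}
println "notifier: ${config.notifier}"
```

With DELEGATE_FIRST, the assignment to notifier is routed to the delegate map rather than the enclosing script, which is how the DSL-style block ends up populating configuration.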

How to pass parameters in a stage call in Jenkinsfile

Currently my Jenkinsfile looks like this:
@Library('my-libs') _
myPipeline {
    my_build_stage(project: 'projectvalue', tag: '1.0')
    my_deploy_stage()
}
I am trying to pass these two variables (project and tag) to my_build_stage.groovy, but it is not working.
What is the correct syntax to be able to use $params.project or $params.tag in my_build_stage.groovy?
Please see the code below, which passes the parameters.
In your Jenkinsfile, write:
// Global variable used to get data from the Groovy file (shared library file)
def mylibrary
def PROJECT_VALUE = "projectvalue"
def TAG = 1

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    // Load the shared library Groovy file; use the path of the mylibs file
                    // which contains all your function definitions
                    mylibrary = load 'C:\\Jenkins\\mylibs'
                    // Call the function my_build_stage and pass parameters
                    mylibrary.my_build_stage(PROJECT_VALUE, TAG)
                }
            }
        }
        stage('Deploy') {
            steps {
                script {
                    // Call the function my_deploy_stage
                    mylibrary.my_deploy_stage()
                }
            }
        }
    }
}
Create a Groovy file named mylibs:
#!groovy
// Functions (definitions of stages) which will be called from your Jenkinsfile
def my_build_stage(PROJECT_VALUE, TAG_VALUE) {
    echo "${PROJECT_VALUE} : ${TAG_VALUE}"
}

def my_deploy_stage() {
    echo "In deploy stage"
}

return this
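An alternative that keeps the original myPipeline { my_build_stage(...) } call style is the standard shared-library vars/ layout, where each .groovy file under vars/ exposes a call method and named arguments arrive as a Map. A sketch, with hypothetical file names:

```groovy
// vars/my_build_stage.groovy (hypothetical shared-library step)
def call(Map args = [:]) {
    // args.project and args.tag come from my_build_stage(project: '...', tag: '...')
    echo "Building ${args.project} at tag ${args.tag}"
}
```

With this layout, my_build_stage(project: 'projectvalue', tag: '1.0') in the Jenkinsfile resolves directly as a step, without a load call.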

Groovy - Jenkins Pipeline - Groovy CPS doesn't go through .eachLine method

I am trying to run this code inside Jenkins Pipeline script:
def getTags = { svnurl ->
    def command = ["svn", "ls", "${svnurl}"]
    def proc = command.execute()
    proc.waitFor()
    proc.in.eachLine {
        println(it)
    }
}
getTags('http://svnurlexample.net/')
The result should be a list of folders at the SVN location, but what I am getting is an error:
[Pipeline] echo:
1.0.0/
expected to call java.lang.ProcessImpl$ProcessPipeInputStream.eachLine but wound up catching org.jenkinsci.plugins.workflow.cps.CpsClosure2.call
The proc.in.eachLine call is causing the issue: it is as if Groovy finds the first folder at the location but cannot handle the rest and reports an error.
This is what worked for me:
@NonCPS
def getTags(svnurl) {
    def command = ["svn", "ls", "${svnurl}"]
    def proc = command.execute()
    proc.waitFor()
    proc.in.eachLine {
        println(it)
    }
}
getTags('http://svnurlexample.net/')
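An alternative that avoids @NonCPS entirely is to read the whole stream first and then iterate with a plain for loop, which the CPS transform handles without passing a closure across the CPS boundary. A sketch (in a real pipeline you could also use sh(script: "svn ls ...", returnStdout: true) instead of spawning the process yourself):

```groovy
def getTags(svnurl) {
    def proc = ["svn", "ls", "${svnurl}"].execute()
    proc.waitFor()
    // proc.text drains stdout in one call; readLines() yields a plain list,
    // so the iteration below is an ordinary CPS-friendly loop
    for (line in proc.text.readLines()) {
        println(line)
    }
}
getTags('http://svnurlexample.net/')
```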

Call timeout() & input() inside a shared groovy script

I have a Groovy class shared between different Jenkins pipelines. I would like to move this part of the pipeline into the shared Groovy script:
timeout(time: 15, unit: 'SECONDS') {
    input('Validation is required')
}
But it doesn't recognize input() or timeout(), so I have to pass them in as parameters:
def requireValidation(Closure timeout, Closure input) {
    timeout(time: 15, unit: 'SECONDS') {
        input('Validation is required')
    }
}
Is there a way to import input and timeout into the Groovy script so that I can have a function without parameters?
def requireValidation()
A normal groovy class G.groovy:
class G {
    def hello(s) {
        println("hello ${s}")
    }
    def timeout( ...
    def input( ...
}
and the script that needs to use it main.groovy:
def requireValidation() {
    def script = new GroovyScriptEngine('.').with {
        loadScriptByName('G.groovy')
    }
    this.metaClass.mixin script
    hello('jon')
}
requireValidation()
which will print:
hello jon
The import occurs inside the requireValidation function (inspired by Python) using the GroovyScriptEngine. Calling hello directly is possible thanks to the magic of this.metaClass.mixin script. A better approach in main.groovy would be:
def script = new GroovyScriptEngine('.').with {
    loadScriptByName('G.groovy')
}
this.metaClass.mixin script

def requireValidation() {
    hello('jon')
}
requireValidation()
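In an actual Jenkins shared library, the usual alternative is to hand the pipeline script itself into the class, so that its steps (timeout, input) can be called without mixins. A sketch, with a hypothetical class name:

```groovy
// Hypothetical shared-library class that receives the pipeline script.
// In the Jenkinsfile: new Validator(this).requireValidation()
class Validator implements Serializable {
    def steps   // the pipeline script, passed in as 'this' from the Jenkinsfile

    Validator(steps) { this.steps = steps }

    def requireValidation() {
        steps.timeout(time: 15, unit: 'SECONDS') {
            steps.input('Validation is required')
        }
    }
}
```

This avoids metaprogramming entirely: every pipeline step is an ordinary method call on the stored script object.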

Jenkins pipeline script fails with "General error during class generation: Method code too large!"

When running a large Jenkins pipeline script, it can give the error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed: General error during class generation: Method code too large!
java.lang.RuntimeException: Method code too large!
What is the reason for this error and how can it be fixed?
This is due to a JVM limitation: the bytecode of a single method must be no larger than 64 KB. It is not specific to the Jenkins Pipeline DSL.
To solve this, instead of using a single monolithic pipeline script, break it up into methods and call the methods.
For example, instead of having:
stage foo
parallel([
    ... giant list of maps ...
])
Instead do:
stage foo
def build_foo() {
    parallel([
        ... giant list of maps ...
    ])
}
build_foo()
If you are using a declarative pipeline with a shared library, you may need to refactor and externalize your global variables into the new methods. Here is a full example:
Jenkinsfile:
@Library("my-shared-library") _
myPipeline()
myPipeline.groovy:
def call() {
    String SOME_GLOBAL_VARIABLE
    String SOME_GLOBAL_FILENAME
    pipeline {
        agent any
        stages {
            stage('stage 1') {
                steps {
                    script {
                        SOME_GLOBAL_VARIABLE = 'hello'
                        SOME_GLOBAL_FILENAME = 'hello.txt'
                        ...
                    }
                }
            }
            stage('stage 2') {
                steps {
                    script {
                        doSomething(fileContent: SOME_GLOBAL_VARIABLE, filename: SOME_GLOBAL_FILENAME)
                        sh "cat $SOME_GLOBAL_FILENAME"
                    }
                }
            }
        }
    }
}

def doSomething(Map params) {
    String fileContent = params.fileContent
    String filename = params.filename
    sh "echo $fileContent > $filename"
}
