How can I define a class in a Jenkins shared library that contains an empty JSON object?
This class:

// src/org/build/Report.groovy
package org.build

public class Report implements Serializable {
    def steps
    def json

    Report(steps) {
        this.steps = steps
        this.json = emptyJson()
    }

    @NonCPS
    def emptyJson() {
        return this.steps.readJSON( text: '{}' )
    }
}
...is instantiated from this pipeline:

@Library('my-library')
import org.build.Report

pipeline {
    agent any
    stages {
        stage("foo") {
            steps {
                script {
                    rep = new org.build.Report(this)
                }
            }
        }
    }
}
...and fails with the error:

expected to call org.build.Report.<init> but wound up catching readJSON; see: https://jenkins.io/redirect/pipeline-cps-method-mismatches/
Only earlier today I thought I'd figured out how to solve this "class" of problem.
Earlier, I encountered the same error when invoking a shared-library function from a shared-library class. I fixed that problem per the guidance at the link noted in the error message, i.e. by annotating the shared-library function with @NonCPS.
That is, in the code below, class FirstClass is able to invoke the function firstNonNull() because the function is annotated with @NonCPS; without the annotation, this code generated the same error as in the question above:
// src/org/example/FirstClass.groovy
package org.example

public class FirstClass implements Serializable {
    def steps
    def var

    FirstClass(steps) {
        this.steps = steps
        this.var = steps.utils.firstNonNull( [null, null, "assigned_from_ctor"] )
    }
}
// vars/utils.groovy
@NonCPS
def firstNonNull( arr ) {
    for ( def i in arr ) { if ( i ) { return i } }
    return null
}
@Library('my-library')
import org.example.FirstClass

pipeline {
    agent any
    stages {
        stage("foo") {
            steps {
                script {
                    first_class = new org.example.FirstClass(this)
                }
            }
        }
    }
}
Why does this approach not work with the Report class invoking readJSON?
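A likely reason: readJSON is itself a Pipeline step and is therefore CPS-transformed, while firstNonNull() is plain Groovy; neither a constructor nor a @NonCPS method is CPS-transformed, so neither may call a Pipeline step. A possible workaround, sketched below under that assumption, is to build the empty JSON object with plain Groovy (here via groovy.json.JsonSlurperClassic) instead of the readJSON step:

// src/org/build/Report.groovy -- sketch only; swaps the readJSON step for plain Groovy
package org.build

import groovy.json.JsonSlurperClassic

public class Report implements Serializable {
    def steps
    def json

    Report(steps) {
        this.steps = steps
        this.json = emptyJson()
    }

    @NonCPS
    def emptyJson() {
        // JsonSlurperClassic is plain Groovy (and returns a serializable map),
        // so it is safe to call from non-CPS code such as this method or the constructor;
        // readJSON is a Pipeline step and cannot be called from here.
        return new JsonSlurperClassic().parseText('{}')
    }
}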
Related
I need to create a Script from a String and execute it in the context of the current test class. Here's my simplified code:
import spock.lang.Specification

class MyTestSpec extends Specification {
    Integer getOne() { return 1 }
    Integer getTwo() { return 2 }

    void 'call script with local methods'() {
        given:
        GroovyShell shell = new GroovyShell()
        Script script = shell.parse("getOne() + getTwo()")

        when:
        def result = script.run()

        then:
        result == 3
    }
}
This gives me the following error:
No signature of method: Script1.getOne() is applicable for argument types: () values: []
I see that to set variables one can use shell.setProperty, but how do I pass the methods' implementations to the script?
Of course, as soon as I posted this, I found my answer.
import org.codehaus.groovy.control.CompilerConfiguration
import spock.lang.Specification

class MyTestSpec extends Specification {
    Integer getOne() { return 1 }
    Integer getTwo() { return 2 }

    void 'call script with local methods'() {
        given:
        CompilerConfiguration cc = new CompilerConfiguration()
        cc.setScriptBaseClass(DelegatingScript.name)
        GroovyShell sh = new GroovyShell(this.class.classLoader, new Binding(), cc)
        DelegatingScript script = (DelegatingScript) sh.parse("getOne() + getTwo()")
        script.setDelegate(this)

        when:
        def result = script.run()

        then:
        result == 3
    }
}
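An alternative that avoids changing the script base class (my sketch, not part of the original answer; the feature-method name is mine) is to expose the methods to the script as closures through the binding; groovy.lang.Script falls back to a bound closure when a method of that name is not defined in the script itself:

import spock.lang.Specification

class MyTestSpec extends Specification {
    Integer getOne() { return 1 }
    Integer getTwo() { return 2 }

    void 'call script with methods exposed via the binding'() {
        given:
        // Method references become closures the parsed script can invoke like methods.
        Binding binding = new Binding(getOne: this.&getOne, getTwo: this.&getTwo)
        GroovyShell shell = new GroovyShell(binding)
        Script script = shell.parse("getOne() + getTwo()")

        when:
        def result = script.run()

        then:
        result == 3
    }
}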
Below is my pipeline code:

dir(my_directory) {
    retry(1) {
        // something
    }
}
Is there a possibility to access the dir step in Groovy through the pipeline context?
I'm thinking of something like this below:

class StepExecutor {
    // some code
    void dir(String directory, Closure statement) {
        this.steps.dir(directory) { statement }
    }
}
Yes, you can. It requires you to pass the pipeline's steps object, though.
class StepExecutor {
    def steps

    public StepExecutor(def steps) {
        this.steps = steps
    }

    // some code
    void dir(String directory, Closure statement) {
        // pass the closure on to the dir step (merely wrapping it as "{ statement }" would never run it)
        this.steps.dir(directory, statement)
    }
}
Creating the object from inside of your pipeline:

pipeline {
    ...
    def stepExecutor = new StepExecutor(this)
    ...
}
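A usage sketch (my illustration, assuming a scripted pipeline, since def statements are not allowed directly inside a declarative pipeline {} block) could look like this:

// Scripted-pipeline sketch; StepExecutor and my_directory are the names used above.
node {
    def stepExecutor = new StepExecutor(this)
    stepExecutor.dir('my_directory') {
        // runs inside my_directory via the wrapped dir step
        echo 'inside the directory'
    }
}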
While this is specific to Jenkins code, I think this question is more generally about the metaprogramming features of Groovy.
This is what my tests look like:
// MyDeclarativePipelineTest.groovy
import com.lesfurets.jenkins.unit.declarative.DeclarativePipelineTest
import org.junit.Before

class MyDeclarativePipelineTest extends DeclarativePipelineTest {
    @Override
    @Before
    public void setUp() throws Exception {
        super.setUp();
        helper.registerAllowedMethod("parameters", [ArrayList.class], null)
    }
}
// someTests.groovy
import org.junit.Before
import org.junit.Test

class someTests extends MyDeclarativePipelineTest {
    @Override
    @Before
    public void setUp() throws Exception {
        super.setUp();
    }

    @Test
    public void pipeline_should_execute_without_errors() throws Exception {
        addParam('MYPARAM', 'sdfsdfsdf')
        def script = loadScript("Jenkinsfile")
        script.run()
        assertJobStatusSuccess()
    }
}
// Jenkinsfile
pipeline {
    stages {
        stage('Build') {
            steps {
                If(true) {print("sdfsdfsd")} // should fail because not wrapped in script{} step
                echo 'Building..'
            }
        }
    }
}
I'm using this library in my Spock tests: JenkinsPipelineUnit
I want to override the steps{} closure in my MyDeclarativePipelineTest class.
The steps closure is defined in the class StageDeclaration.groovy.
StageDeclaration is used in the GenericPipelineDeclaration class like this:
def stage(String name,
          @DelegatesTo(strategy = DELEGATE_FIRST, value = StageDeclaration) Closure closure) {
    this.stages.put(name, createComponent(StageDeclaration, closure).with { it.name = name; it })
}
And then finally the GenericPipelineDeclaration is used in DeclarativePipelineTest like this:
def pipelineInterceptor = { Closure closure ->
    GenericPipelineDeclaration.binding = binding
    GenericPipelineDeclaration.createComponent(DeclarativePipeline, closure).execute(delegate)
}
I'm extending DeclarativePipelineTest with my own class MyDeclarativePipelineTest
Is there a way I can override the steps closure inside of MyDeclarativePipelineTest? I want it to throw an error if raw Groovy code is defined in the steps{} closure that isn't wrapped in a script{} closure.
I want to use JMockit to mock a static method in a Spock test, combined with the where: block, so that each iteration uses a different mocked return value and exercises different business logic. I tried many ways of writing this, but they all failed. I hope I can get help or suggestions here. Thank you very much.
Here is an example of my business code:
public class MyUtils {
    public static int staticMethod(int origin) {
        return 0;
    }
}

public class MyClass {
    public void verify(int origin) {
        if (MyUtils.staticMethod(origin) == 1) {
            System.out.println("1");
        }
        if (MyUtils.staticMethod(origin) == 2) {
            System.out.println("2");
        }
        ...
    }
}
This is my Spock test code:
def "verify"() {
    when:
    myClass.verify(0)

    then:
    true

    where:
    mock          | _
    mockStatic(1) | _
    mockStatic(2) | _
}

def mockStatic(val) {
    new MockUp<MyUtils>() {
        @Mock
        public int staticMethod(int origin) {
            return val
        }
    }
}
I know that PowerMock can implement such a function, but because our team has been using JMockit, we want to know whether JMockit can provide multiple different mock values like this in Spock.
Put your method call into a closure and evaluate the closure during each iteration:
package de.scrum_master.stackoverflow.q67882559

import mockit.Mock
import mockit.MockUp
import mockit.internal.state.SavePoint
import spock.lang.Requires
import spock.lang.Specification
import spock.lang.Unroll

class StaticMethodJMockitTest extends Specification {
    def jMockitSavePoint = new SavePoint()

    def cleanup() {
        jMockitSavePoint.rollback()
    }

    @Unroll
    def "verify"() {
        given:
        mockClosure()
        MyClass myClass = new MyClass()

        when:
        myClass.verify(0)

        then:
        true

        where:
        mockClosure << [
            { /* no mock */ },
            { mockStatic(1) },
            { mockStatic(2) }
        ]
    }

    def mockStatic(val) {
        new MockUp<MyUtils>() {
            @Mock
            int staticMethod(int origin) {
                return val
            }
        }
    }

    public static class MyUtils {
        public static int staticMethod(int origin) {
            return 0;
        }
    }

    public static class MyClass {
        public void verify(int origin) {
            if (MyUtils.staticMethod(origin) == 1) {
                System.out.println("1");
            }
            if (MyUtils.staticMethod(origin) == 2) {
                System.out.println("2");
            }
        }
    }
}
If you wish to use data tables, you need to help the parser a bit by explicitly adding it -> inside the closure if the closure is in the first column of the data table. You can also use some nice naming for your unrolled iterations:
@Unroll
def "verify #description"() {
    given:
    mockClosure()
    MyClass myClass = new MyClass()

    when:
    myClass.verify(0)

    then:
    true

    where:
    description     | mockClosure
    "no mock"       | { /* no mock */ }
    "mock result 1" | { mockStatic(1) }
    "mock result 2" | { mockStatic(2) }
}
The reason for creating and rolling back the save point is that JMockit does not play nice with Spock concerning mock lifecycles and the maintainer has no intention to even think about helping. See JMockit issue #668 for more info.
Given:
#!groovy
@Library('GreatUtils')
def utils = new com.X.Utils(script:this)

node {
    stage('Call utils.check directly') {
        utils.check()
    }
}
This code works, and the library utils is called.
#!groovy
@Library('GreatUtils')
def utils = new com.X.Utils(script:this)

node {
    stage('Call utils.check indirectly') {
        checkUtils()
    }
}

def checkUtils() {
    utils.check() // <-- throws an exception
}
This throws:

No such property: utils for class: groovy.lang.Binding
Any ideas?
In Groovy, functions don't have access to variables declared outside of their scope; the error occurs simply because your variable utils is out of scope.
Passing it as a parameter like this should work:
#!groovy
@Library('GreatUtils')
def utils = new com.X.Utils(script:this)

node {
    stage('Call utils.check indirectly') {
        checkUtils(utils)
    }
}

def checkUtils(utils) {
    utils.check()
}
Or, if you don't want to use a parameter, you could go for a functional programming style and use closures:
#!groovy
@Library('GreatUtils')
def utils = new com.X.Utils(script:this)
def check = { -> utils.check() }

node {
    stage('Call utils.check indirectly') {
        check()
    }
}
Edit:
Adding the global-initialization possibility (omitting def stores utils in the script binding, so it is visible inside checkUtils()).
#!groovy
@Library('GreatUtils')
utils = new com.X.Utils(script:this) // no 'def': utils goes into the script binding

node {
    stage('Call utils.check indirectly') {
        checkUtils()
    }
}

def checkUtils() {
    utils.check()
}
Should work.
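For context, here is a plain-Groovy illustration of the scoping rule at work (my sketch, not part of the original answer; the file and variable names are made up): a variable assigned without def goes into the script's binding and is therefore visible to methods defined in the same script, while def makes it local to the script body.

// scoping.groovy -- plain Groovy, not Jenkins-specific
def localVar = 'local'   // 'def': local to the script body, invisible to methods
globalVar = 'global'     // no 'def': stored in the script binding

def show() {
    println globalVar     // OK - resolved through the binding
    // println localVar   // would throw groovy.lang.MissingPropertyException
}

show()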