How to use the Jenkins Job DSL API in custom Groovy classes

Is something like this possible, i.e. using the JobDSL API from a class outside the main DSL script?
//main_jobdsl_script.groovy:
new JobCreator().createJob()

//JobCreator.groovy:
class JobCreator {
    void createJob() {
        job("new-job") {
            steps {
                batchFile("Hello World")
            }
        }
    }
}
When running it I get the error
13:03:18 ERROR: No signature of method: JobCreator.job() is applicable for argument types:
(org.codehaus.groovy.runtime.GStringImpl, StartJobCreator$_createJob_closure1)
values: ["new-job", de.dbh.jobcreation.StartJobCreator$_createStartJob_closure1#374d293]
I want to avoid the main script getting too big and cluttered, and would rather divide the code into several scripts/classes.

Yes, it is possible. The current script has access to all API methods, so you need to pass it to the custom class.
//main_jobdsl_script.groovy:
new JobCreator(this).createJob()

//JobCreator.groovy:
class JobCreator {
    private final Object context

    JobCreator(Object context) {
        this.context = context
    }

    void createJob() {
        context.job('new-job') {
            steps {
                batchFile('Hello World')
            }
        }
    }
}
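Note that for the seed job to find JobCreator.groovy, the class has to be on the Job DSL classpath. With the jobDsl pipeline step that could look like the sketch below; the paths are assumptions, so adjust them to your repository layout:
// Hypothetical seed step; additionalClasspath points at the directory
// that contains JobCreator.groovy
jobDsl targets: 'jobdsl/main_jobdsl_script.groovy',
       additionalClasspath: 'jobdsl/src/main/groovy'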

Related

How can I inject my api with Guice into dataflow jobs without it needing to be serializable?

This question is a follow-on to a great answer: Is there a way to upload jars for a dataflow job so we don't have to serialize everything?
This made me realize 'ok, what I want is injection with no serialization so that I can mock and test'.
Our current method requires our apis/mocks to be serializable, BUT THEN I have to put static fields in the mock, because it gets serialized and deserialized, creating a new instance that dataflow uses.
My colleague pointed out that perhaps this needs to be a sink and that is treated differently? <- We may try that later and update but we are not sure right now.
My desire is from the top to replace the apis with mocks during testing. Does someone have an example for this?
Here is our bootstrap code, which does not know whether it is running in production or inside a feature test. We test end-to-end results with no Apache Beam imports in our tests, meaning we can swap to any tech if we want to pivot and keep all our tests. Not only that, we catch far more integration bugs and can refactor without rewriting tests, since the contracts we test are customer ones we can't easily change.
import com.google.inject.Inject;
import org.apache.beam.sdk.Pipeline;

public class App {
    private final Pipeline pipeline;
    private final RosterFileTransform transform;

    @Inject
    public App(Pipeline pipeline, RosterFileTransform transform) {
        this.pipeline = pipeline;
        this.transform = transform;
    }

    public void start() {
        pipeline.apply(transform);
        pipeline.run();
    }
}
Notice that everything we do is Guice-injection based, so the Pipeline may be a direct runner or not. I may need to modify this class to pass things through :( but anything that works for now would be great.
The function into which I am trying to get our api (and mock, and impl) with no serialization is this:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        microServiceApi.writeRecord(element.getValue());
    }
}
I am not sure how to pass in microServiceApi in a way that avoids serialization. I would be OK with delayed creation as well, after deserialization, using a Guice Provider with provider.get(), if there is a solution there too.
Solved in such a way that mocks no longer need statics or serialization anymore, with one class bridging the world of Dataflow (in prod and in test) like so.
NOTE: There is additional magic we have in our company that passes headers through from service to service and through Dataflow; some of that is in there and you can ignore it (i.e. the RouterRequest request = Current.request();). So anyone else will have to pass the projectId into getInstance each time.
public abstract class DataflowClientFactory implements Serializable {
    private static final Logger log = LoggerFactory.getLogger(DataflowClientFactory.class);
    public static final String PROJECT_KEY = "projectKey";

    private transient static Injector injector;
    private transient static Module overrides;
    private static int counter = 0;

    public DataflowClientFactory() {
        counter++;
        log.info("creating again (usually due to deserialization). counter=" + counter);
    }

    public static void injectOverrides(Module dfOverrides) {
        overrides = dfOverrides;
    }

    private synchronized void initialize(String project) {
        if (injector != null)
            return;

        /********************************************
         * The hardest part is this piece since this is specific to each Dataflow,
         * so each project subclasses DataflowClientFactory.
         * This solution is the best ONLY in the fact of time crunch and it works
         * decently for end to end testing without developers needing fancy
         * wrappers around mocks anymore.
         ***/
        Module module = loadProjectModule();
        Module modules = Modules.combine(module, new OrderlyDataflowModule(project));
        if (overrides != null) {
            modules = Modules.override(modules).with(overrides);
        }
        injector = Guice.createInjector(modules);
    }

    protected abstract Module loadProjectModule();

    public <T> T getInstance(Class<T> clazz) {
        if (!Current.isContextSet()) {
            throw new IllegalStateException("Someone on the stack is extending DoFn instead of OrderlyDoFn so you need to fix that first");
        }
        RouterRequest request = Current.request();
        String project = (String) request.requestState.get(PROJECT_KEY);
        initialize(project);
        return injector.getInstance(clazz);
    }
}
I suppose this may not be what you're looking for, but your use case makes me think of using factory objects. They may depend on the pipeline options that you pass (i.e. your PipelineOptions object), or on some other configuration object.
Perhaps something like this:
class MicroserviceApiClientFactory implements Serializable {
    private final PipelineOptions options;

    MicroserviceApiClientFactory(PipelineOptions options) {
        this.options = options;
    }

    public MicroserviceApiClient getClient() {
        MySpecialOptions specialOpts = options.as(MySpecialOptions.class);
        if (specialOpts.getMockMicroserviceApi()) {
            return new MockedMicroserviceApiClient(...); // Or whatever
        } else {
            return new MicroserviceApiClient(specialOpts.getMicroserviceEndpoint()); // Or whatever parameters it needs
        }
    }
}
And for your DoFns and any other execution-time objects that need it, you would pass the factory:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    private final MicroserviceApiClientFactory msFactory;
    private transient MicroserviceApiClient microServiceApi;

    ValidRecordPublisher(MicroserviceApiClientFactory msFactory) {
        this.msFactory = msFactory;
    }

    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        if (microServiceApi == null) microServiceApi = msFactory.getClient();
        microServiceApi.writeRecord(element.getValue());
    }
}
This should allow you to encapsulate the mocking functionality into a single class that lazily creates your mock or your client at pipeline execution time.
Let me know if this matches what you want somewhat, or if we should try to iterate further.
I have no experience with Guice, so I don't know if Guice configurations can easily pass the boundary between pipeline construction and pipeline execution (serialization / submitting JARs / etc.).
Should this be a sink? Maybe, if you have an external service, and you're writing to it, you can write a PTransform that takes care of it - but the question of how you inject various dependencies will remain.

Obtain CpsScript instance in workflow-cps groovy code?

Currently I'm writing a lot of Groovy for very specific Jenkins scenarios.
The problem is that I have to keep track of the current CpsScript instance for the context (getting properties, the environment and so on) and its invokeMethod (workflow steps and the like).
Currently this means I pass this from the pipeline Groovy script into my entry class, and from there it's passed on to every class separately, which is very annoying.
The script instance is created by the CpsFlowExecution and stored within the Continuable instance and the CpsThreadGroup, neither of which allows you to retrieve it.
It seems that GlobalVariable-derived extensions receive it so that they have a context, but I'm currently not knowledgeable enough to write my own extension to leverage that.
So the question is:
Does anyone know of a way to keep track of the CpsScript-instance that doesn't require me to pass it on to every new class I create? (Or alternatively: obtain it from anywhere - does this really need to be so hard?)
Continued looking into ways to accomplish this. I even wrote a Jenkins plugin that provides a cpsScript global variable. Unfortunately you need the instance to provide a context for that call, so it's useless.
So as the "least bad solution"(tm) I created a class called ScriptContext that I can use as a base class for my pipeline classes (it implements Serializable).
When you write your pipeline script you either pass it the CpsScript statically once:
ScriptContext.script = this
Or, if you derived from it (make sure to call super()):
new MyPipeline(this)
If your class derives from ScriptContext, your work is done. Everything will work as though you didn't create a class but just used the automagic conversion. If you use any CpsScript-level functions besides println, you might want to add these here as well.
Anywhere else you can just call ScriptContext.script to get the script instance.
The class code (removed most of the comments to keep it as short as possible):
package ...

import org.jenkinsci.plugins.workflow.cps.*

class ScriptContext implements Serializable {
    protected static CpsScript _script = null

    ScriptContext(CpsScript script = null) {
        if (!_script && script) {
            _script = script
        }
    }

    ScriptContext withScript(CpsScript script) {
        setScript(script)
        this
    }

    static void setScript(CpsScript script) {
        if (!_script && script) {
            _script = script
        }
    }

    static CpsScript getScript() {
        _script
    }

    // functions defined in CpsScript itself are not automatically found
    void println(what) {
        _script.println(what)
    }

    /**
     * For derived classes we provide missing method functionality by trying to
     * invoke the method in script context.
     */
    def methodMissing(String name, args) {
        if (!_script) {
            throw new GroovyRuntimeException('ScriptContext: No script instance available.')
        }
        return _script.invokeMethod(name, args)
    }

    /**
     * For derived classes we provide missing property functionality.
     * Note: Since it's sometimes unclear whether a property is an actual property or
     * just a function name without brackets, use evaluate for this instead of getProperty.
     * @param name
     * @return
     */
    def propertyMissing(String name) {
        if (!_script) {
            throw new GroovyRuntimeException('ScriptContext: No script instance available.')
        }
        _script.evaluate(name)
    }

    /**
     * Wrap in node if needed.
     * @param body
     * @return
     */
    protected <V> V node(Closure<V> body) {
        if (_script.env.NODE_NAME != null) {
            // Already inside a node block.
            body()
        } else {
            _script.node {
                body()
            }
        }
    }
}
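For illustration, here is a minimal usage sketch under the derived-class approach; MyPipeline and the echo step call are hypothetical stand-ins for your own classes and steps:
// Jenkinsfile
new MyPipeline(this).run()

// MyPipeline.groovy (hypothetical)
class MyPipeline extends ScriptContext {
    MyPipeline(script) {
        super(script)
    }

    void run() {
        // node() comes from ScriptContext; echo is forwarded to the
        // stored CpsScript instance via methodMissing
        node {
            echo 'building...'
        }
    }
}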

Mocking findFiles in JenkinsPipelineUnit

Currently I'm trying to register the findFiles step.
My setup is as follows:
src/
    test/
        groovy/
            TestJavaLib.groovy
vars/
    javaLib.groovy
javaApp.jenkinsfile
Inside TestJavaApp.groovy I have:
...
import com.lesfurets.jenkins.unit.RegressionTest
import com.lesfurets.jenkins.unit.BasePipelineTest

class TestJavaLibraryPipeline extends BasePipelineTest implements RegressionTest {

    // Some overridden setUp() which loads shared libs
    // and registers methods referenced in javaLib.groovy

    void registerPipelineMethods() {
        ...
        def fileList = [new File("testFile1"), new File("testFile2")]
        helper.registerAllowedMethod('findFiles', { f -> return fileList })
        ...
    }
}
and my javaLib.groovy contains this currently failing part:
...
def pomFiles = findFiles glob: "target/publish/**/${JOB_BASE_NAME}*.pom"
if (pomFiles.length < 1) { // Fails with java.lang.NullPointerException: Cannot get property 'length' on null object
    error("no pom file found")
}
...
I have tried multiple closures returning various objects, but every time I get an NPE.
Question is - how do I correctly register the "findFiles" method?
N.B. I'm very new to mocking and closures in Groovy.
Looking at the source code and examples on GitHub, I see a few overloads of the method (here):
void registerAllowedMethod(String name, List<Class> args = [], Closure closure)
void registerAllowedMethod(MethodSignature methodSignature, Closure closure)
void registerAllowedMethod(MethodSignature methodSignature, Function callback)
void registerAllowedMethod(MethodSignature methodSignature, Consumer callback)
It doesn't look like you are registering the right signature with your call. I'm actually surprised you aren't getting a MissingMethodException with your current call pattern.
You need to add the rest of the method signature during registration. The findFiles method is taking a Map of parameters (glob: "target/publish/**/${JOB_BASE_NAME}*.pom" is a map literal in Groovy). One way to register that type would be like this:
helper.registerAllowedMethod('findFiles', [Map.class], { f -> return fileList })
I also faced the same issue. However, I was able to mock the findFiles() method using the following method signature:
helper.registerAllowedMethod(method('findFiles', Map.class), { map ->
    return [['path': 'testPath/test.zip']]
})
So I found a way to mock findFiles when I needed the length property:
helper.registerAllowedMethod('findFiles', [Map.class], { [length: findFilesLength ?: 1] })
This also allows changing the findFilesLength variable in tests to exercise different conditions in the pipeline, like the one in my OP.
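For completeness, a hedged sketch of how this can look in a full JenkinsPipelineUnit test class; the test name and the loading/running of the pipeline script are illustrative only:
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class TestJavaLib extends BasePipelineTest {
    def findFilesLength

    @Override
    @Before
    void setUp() {
        super.setUp()
        // the mocked findFiles returns an object exposing the length
        // property that the pipeline code reads
        helper.registerAllowedMethod('findFiles', [Map.class], { [length: findFilesLength ?: 1] })
    }

    @Test
    void errorsWhenNoPomIsFound() {
        findFilesLength = 0
        // load and run the pipeline under test here (e.g. via loadScript),
        // then assert that error(...) was called
    }
}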

How to access Job DSL's readFileFromWorkspace from a helper class?

I have a large number of Jenkins job definitions in Job DSL that all rely on some common functionality that I implemented in helper classes. This is the essence of the jobDsl step running these scripts:
jobDsl {
    additionalClasspath('jobdsl/src/main/groovy')
    targets('jobdsl/*.groovy')
    sandbox(true)
}
One of the helper classes in jobdsl/src/main/groovy needs to read a file from the workspace, but it cannot access the readFileFromWorkspace function.
So this one wouldn't work:
class MyHelper {
    static Closure processFile(String src) {
        ...
        def txt = readFileFromWorkspace(src)
        ...
    }
}
I have to take a closure parameter instead:
class MyHelper {
    static Closure processFile(String src, Closure rffw) {
        ...
        def txt = rffw(src)
        ...
    }
}
Which makes the code calling this helper bloated:
MyHelper.processFile('foo.txt', { readFileFromWorkspace(it) })
Is there a way to make my class see readFileFromWorkspace? Actually, I couldn't even figure out which class this function belongs to, or whether it is a real function at all or something "magically" defined by the DSL.
The helper class lives in another file, outside the Job DSL context. To make the DSL methods visible to it, pass the script in, as below.
class MyHelper {
    static Closure processFile(String src, def dslFactory) {
        ...
        def txt = dslFactory.readFileFromWorkspace(src)
        ...
    }
}
MyHelper.processFile('foo.txt', this)
The above should work for you; if you run into any problems, let me know.
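If you also want compile-time checking and IDE completion, you can type the parameter as DslFactory, which (assuming the Job DSL API jar is on the classpath of your helper sources) is the interface that declares readFileFromWorkspace:
import javaposse.jobdsl.dsl.DslFactory

class MyHelper {
    // DslFactory declares readFileFromWorkspace(String), so this call
    // is resolvable statically instead of going through def
    static String processFile(String src, DslFactory dslFactory) {
        dslFactory.readFileFromWorkspace(src)
    }
}
The call site stays the same: MyHelper.processFile('foo.txt', this)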

Using ant tasks from gradle-script-kotlin

How can I access ant tasks from my build.gradle.kts script? In particular, I am interested in the ant.patch task.
Can I extend it, like so?
task("patchSources", Patch::class) {
Can I invoke it from other task, like this?
task("patchSources") {
doLast {
ant.patch(...)
}
}
I know how to do it in Groovy: How do I apply a patch file in Gradle?
Gradle's AntBuilder extends Groovy's AntBuilder. You can translate dynamic method invocations from Groovy like ant.patch() to Kotlin by using invokeMethod, providing the desired task name as the first argument and the properties to bind as a map in the second argument.
For example, for your Patch use case (available properties documentation) the Kotlin could look like this:
val patchSources by tasks.creating {
    doLast {
        ant.invokeMethod("patch", mapOf(
            "patchfile" to patchFile,
            "dir" to configDir,
            "strip" to 1
        ))
    }
}
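For comparison, the Groovy build-script equivalent (as in the linked question) is a plain dynamic call; the map keys are the Ant Patch task's own attributes, and patchFile/configDir are placeholders as above:
// build.gradle (Groovy DSL)
task patchSources {
    doLast {
        ant.patch(patchfile: patchFile, dir: configDir, strip: 1)
    }
}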
This works for me:
import org.apache.tools.ant.taskdefs.Patch

val patchConfigTask = task("patchConfig") {
    dependsOn(unzipTask)
    doLast {
        val resources = projectDir.resolve("src/main/resources")
        val patchFile = resources.resolve("config.patch")
        Patch().apply {
            setPatchfile(patchFile)
            setDir(buildDir.resolve("config/"))
            setStrip(1) // gets rid of the a/ b/ prefixes
            execute()
        }
    }
}
I am not sure if it's the one-right-way-to-do-it.
