I am writing a shared library for Jenkins and am running across a bit of an organizational issue.
I have a number of pipeline scripts in vars; however, I'm finding there are a number of repeated functions, and the code is not very DRY.
One solution has been to create helper scripts inside vars, like vars/log.groovy and vars/formatter.groovy. This has worked fine, and I've been calling these functions from within my pipeline scripts such as vars/myPipeline.groovy.
I would just like to organize my vars folder a bit better and keep my helper functions inside, for example, vars/utils/log.groovy.
The problem is that I'm not sure how to access them from my pipeline scripts in vars once I put them in a sub-directory.
How can I access them? Or is there a better way to organize my global functions?
You can put them in src in a package structure that makes sense organizationally, then import what you need in your vars scripts.
In src/com/yourco/Formatter.groovy:

package com.yourco

class Formatter {
    static String formatThis(String something) {
        "this is ${something}"
    }
}
In your vars script:
import com.yourco.Formatter
..
..
..
echo Formatter.formatThis('test')
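For completeness, here's a minimal Jenkinsfile sketch of pulling that class in. The library name 'my-shared-library' is a placeholder for whatever name the library is registered under in Jenkins, and the node block is just for illustration:

#!/usr/bin/env groovy
// 'my-shared-library' must match the name configured under
// Manage Jenkins > Global Pipeline Libraries.
@Library('my-shared-library') import com.yourco.Formatter

node {
    echo Formatter.formatThis('test')
}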
It seems really difficult to store a bunch of variables for use in shared code in Jenkins/Groovy scripted pipelines. I've tried a bunch of methods, and none of them seem to give the desired result.
This method looked the most promising, but the values all came back as null in the calling pipeline: Get Global Variables in jenkins pipeline.
My code is something like:
import org.blabla.JobHelper
println("env.NO_PROXY: -->${env.NO_PROXY}<--")
And in the JobHelper.groovy file, I've defined:
package org.blabla.project
env.NO_PROXY = 'localhost,127.0.0.1,169.254.169.254'
The names have been changed a bit to protect the innocent, but you get the idea.
The script just prints null for the value.
Is there a simple way (or indeed any way) that I can pull in a bunch of variables from a shared library file? This feels like it should be a really simple exercise, but after spending many hours searching I'm none the wiser.
In general, env is only available once the pipeline has started, but Groovy scripts are resolved much earlier.
I'm using static class members as global variables. Applied to your code sample, it would look like this:
JobHelper.groovy
package org.blabla.project

// The class must be named like the file that contains it.
class JobHelper {
    static String getNO_PROXY() { 'localhost,127.0.0.1,169.254.169.254' }
}
Elsewhere:
import org.blabla.project.JobHelper
println("NO_PROXY: -->${JobHelper.NO_PROXY}<--")
Note that Groovy automatically generates properties from get*() and set*() methods, so we can use the short form instead of having to write JobHelper.getNO_PROXY().
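The same trick scales to any number of shared values; each extra getter becomes another read-only property. A small sketch (the second getter below is illustrative, not from the original answer):

package org.blabla.project

class JobHelper {
    static String getNO_PROXY()     { 'localhost,127.0.0.1,169.254.169.254' }
    // Hypothetical extra value, exposed the same way:
    static String getBUILD_LABEL()  { 'linux && docker' }
}

// Elsewhere: both short forms resolve through the generated properties.
assert JobHelper.NO_PROXY == JobHelper.getNO_PROXY()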
I have to write around 20 different scripts in K6 for an application, and most of these scripts contain common functionality like logging in, choosing some options, etc.
So is there a better way to write K6 scripts without duplicating this common functionality? Can we implement common methods somewhere and execute them inside the default function, or something similar?
You can write your own module containing the common functionality and then import it:
$ cat index.js
import { hello_world } from './modules/module.js';

export default function() {
    hello_world();
}

$ cat modules/module.js
export function hello_world() {
    console.log("Hello world");
}
You can read here for more details.
Yes, you can move the common methods to separate JS files and then import them in the scripts that require them: https://docs.k6.io/docs/modules
All the tutorials that I have come across regarding writing a declarative pipeline suggest including the stages and steps in the Jenkinsfile.
But I have noticed one of my seniors writing it the opposite way. He uses the Jenkinsfile just for defining all the properties, i.e. his Jenkinsfile is just a properties file, nothing more, nothing less.
And for defining the pipeline, he makes use of the shared library concept and writes his pipeline code in a file in the vars folder. I am not able to guess the wisdom behind this approach.
Nowhere over the internet did I come across anything similar.
Any guidance in this regard is highly appreciated. I am a beginner in the Jenkins world.
As illustrated in Extending with Shared Libraries, that approach (which I am using as well) allows you to:
keep the Jenkinsfile content to a minimum
enforce a standard way of doing a particular job (as coded in the shared library)
That shared library becomes a template of a process for which you provide only values in your Jenkinsfile before delegating the actual execution to the pre-defined library.
The OP Asif Kamran Malick notes that the documentation does include:
There is also a “builder pattern” trick using Groovy’s Closure.DELEGATE_FIRST, which permits Jenkinsfile to look slightly more like a configuration file than a program, but this is more complex and error-prone and is not recommended.
He then asks:
Why did the blogger prefer that way when it's actually discouraged in the official doc?
I checked, and we are also using Closure.DELEGATE_FIRST.
The reason is in the part "permits Jenkinsfile to look slightly more like a configuration file than a program".
This avoids having to define a JSON block and keeps the parameters as a series of key=value lines, which is easier to read.
A call to a shared library is then:
#!/usr/bin/env groovy
@Library("MyLibraries") _

MyLibrary {
    config1 = 'value1'
    config2 = 'value2'
    ...
}
{
    anotherConfigA = 'valueA'
    anotherConfigB = 'valueB'
    ...
    astep(
        ...
    )
}
Then your Jenkins pipeline template in MyLibraries/vars/MyLibrary.groovy can use those closure blocks:
def call(Closure configBlock, Closure body) {
    def config = [:]
    configBlock.resolveStrategy = Closure.DELEGATE_FIRST
    configBlock.delegate = config
    configBlock()

    astep(
        ...
    ) {
        if (body) { body() }
    }
}
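For illustration, once configBlock() has run with DELEGATE_FIRST, everything assigned in the first Jenkinsfile block lands in the config map, so the library step can read it directly (a hypothetical line, not from the original answer):

    // Inside call(), after configBlock():
    echo "config1 resolved to ${config.config1}"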
I'm writing a somewhat complex global pipeline library. The library really just orchestrates a complex build, with a whole bunch of steps exposed as vars/* and a single src/com/myorg/pipeline/utils.groovy class that handles all common pieces of functionality. Each Jenkinsfile defines all 'build'-specific config and passes it to a vars/myBuildFlavor.groovy step that then calls all steps required for that flavor of the build. The vars/myBuildFlavor.groovy step also reads a server config file that contains all config that is global to each Jenkins instance.
This setup works incredibly well. It allows users either to piece together their own builds from the steps I've exposed in the global library, or just to set all build properties in their Jenkinsfile and call an existing flavor of a build that I've exposed as a step. What I'm struggling with is how to access configuration values from both the 'build' and 'server' configuration, plus I have some properties from steps early in the build that I want to save and use later in the build. What is incredibly annoying is that I have to pass the entire script context around with 'this', or use extremely long method signatures to handle the juggling of all of these values.
What I'm thinking may be a good idea is to write a file in the workspace root that contains all build and server config values, plus any properties that I need later on in the build. Has anyone had to deal with this previously? Any major issues with my approach? Better ideas?
I haven't tried this, but you make me want to make sure this works. If you don't beat me to it, I'll give it a shot, so please report back...
The things in vars are created as singletons. So I think you should be able to do something like this:
// vars/customConfig.groovy
class customConfig implements Serializable {
    private String url
    private Map allTheThings

    def setUrl(myUrl) {
        url = myUrl
    }

    def getUrl() {
        url
    }

    def setAllTheThings(Map configMap) {
        allTheThings = configMap
    }

    def getAllTheThings() {
        return allTheThings
    }

    def coolMethod(myVar) {
        echo "This method does something cool with the ${myVar} and with ${name}"
    }
}
Then access these things like:
customConfig.url = 'https://www.google.com'
echo "${customConfig.url}"
customConfig.coolMethod "FOOBAR"
customConfig.allTheThings = [:]   // initialize the map before adding keys
customConfig.allTheThings.configItem1 = "BAZ"
customConfig.allTheThings.configItem2 = 12345
echo "${customConfig.allTheThings.configItem2} is an Int"
Since it is a "global var" or a singleton, I think you can use it everywhere and the values are all shared.
Let me know if this does what I think it will do.
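As for the file-based approach floated in the question, a minimal sketch using the writeJSON/readJSON steps from the Pipeline Utility Steps plugin (assuming that plugin is installed; the maps, file name, and version value below are all illustrative):

// Early in the build: merge build and server config and persist it.
def config = buildConfig + serverConfig        // hypothetical maps
config.artifactVersion = '1.2.3'               // a value computed earlier
writeJSON file: 'pipeline-config.json', json: config

// Later in the build, from any step that has the workspace:
def cfg = readJSON file: 'pipeline-config.json'
echo "Deploying version ${cfg.artifactVersion}"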
What I'm trying to do
I have a script that looks something like this:
def doStuff() {
    println 'stuff done'
}

return this
I am loading this script in another script so that I have a Groovy Script object that I can call doStuff from. This is in a script, call it myscript.groovy, that looks like this:
Script doer = load('doStuff.groovy')
doer.doStuff()
I would like to be able to mock the Script object that is returned by load, stub doStuff, and assert that it is called. Ideally, something like the following (assume that load is already mocked):
given:
Script myscript = load('myscript.groovy')
Script mockDoer = Mock(Script)

when:
myscript.execute()

then:
1 * load('doStuff.groovy') >> mockDoer
1 * mockDoer.doStuff()
However, I am getting an NPE at the line:
doer.doStuff()
How can I mock the Script object in a way that I can make sure that the doStuff method is stubbed and called properly in my test?
Why I'm doing it this way
I know this is a bit of a weird use case. I figured I should give some context for why I am trying to do this, in case people want to suggest completely different approaches that might not apply to what I am trying to do.
I recently started working on a project that uses some fairly complex Jenkins Pipeline scripts. In order to modularize the scripts to some degree, utility functions and pieces of different pipelines are contained in different scripts and loaded and executed similarly to how doStuff.groovy is above.
I am trying to make a small change to the scripts at the same time as introducing some testing using this library: https://github.com/lesfurets/JenkinsPipelineUnit
In one test in particular I want to mock a particular utility method and assert that it is called depending on parameters to the pipeline.
Because the scripts are currently untested and reasonably complex, I am new to them, and many different projects depend on them, I am reluctant to make any sweeping changes to how the code is structured or modularized.
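For what it's worth, JenkinsPipelineUnit can stub load without Spock's Mock(), via its registerAllowedMethod hook. A rough, untested sketch (the test class and the recording map are hypothetical):

import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class MyScriptTest extends BasePipelineTest {
    def recorded = [doStuffCalled: false]

    @Before
    void setUp() {
        super.setUp()
        // Stub the 'load' step: return a stand-in coerced to Script so the
        // 'Script doer = load(...)' declaration accepts it; its doStuff()
        // just records that it was invoked.
        helper.registerAllowedMethod('load', [String]) { String path ->
            [doStuff: { recorded.doStuffCalled = true }] as Script
        }
    }

    @Test
    void doStuffIsCalled() {
        // runScript executes the pipeline script body under test.
        runScript('myscript.groovy')
        assert recorded.doStuffCalled
    }
}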