Use multi-method global variables in Jenkins Shared Libraries

Consider this groovy file in a repo that is loaded as shared library in Jenkins:
vars/
└── Utility.groovy
// Utility.groovy
def funcA() { ... }
def funcB() { ... }
And in the Jenkinsfile:
// Jenkinsfile
@Library('LibName') _
pipeline {
    ...
    steps {
        script {
            def util = new Utility()
            util.funcA()
        }
    }
}
This works fine. But if I try to load the library dynamically:
// Jenkinsfile
pipeline {
    ...
    steps {
        script {
            library 'LibName'
            def util = new Utility()
        }
    }
}
That doesn't work...
Can someone explain this with respect to the following quote from the Shared Libraries documentation in Jenkins:
Internally, scripts in the vars directory are instantiated on-demand as singletons. This allows multiple methods to be defined in a single .groovy file for convenience.

Loading a Jenkins Shared Library dynamically has some limitation and challenges because of:
"Using classes from the src/ directory is also possible, but trickier. Whereas the @Library annotation prepares the “classpath” of the script prior to compilation, by the time a library step is encountered the script has already been compiled. Therefore you cannot import or otherwise “statically” refer to types from the library." — which is explained here.
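A commonly used workaround (untested sketch, based on the quoted limitation): after the `library` step runs, the singletons from `vars/` are injected into the script binding under their file names, so you can reference the existing `Utility` instance directly instead of constructing one with `new` (which would require a static type reference that the already-compiled script cannot resolve):

```groovy
// Jenkinsfile — sketch; assumes the vars/Utility.groovy shown above
pipeline {
    agent any
    stages {
        stage('demo') {
            steps {
                script {
                    library 'LibName'
                    // 'Utility' is instantiated on demand as a singleton and
                    // injected into the script binding by the library step,
                    // so call methods on that instance rather than 'new Utility()'
                    Utility.funcA()
                }
            }
        }
    }
}
```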
And it seems this question is kind of similar to this one.

Related

Jenkins - set options in a shared library for all pipelines that use the shared library

I have a bunch of repositories which use (parts of) the same Jenkins shared library for running tests, docker builds, etc. So far the shared library has greatly reduced the maintenance costs for these repos.
However, it turned out that basically all pipelines use the same set of options, e.g.:
@Library("myExample.jenkins.shared.library") _
import org.myExample.Constants

pipeline {
    options {
        disableConcurrentBuilds()
        parallelsAlwaysFailFast()
    }
    agent {
        label 'my-label'
    }
    stages {
        stage {
            runThisFromSharedLibrary(withParameter: "foo")
            runThatFromSharedLibrary(withAnotherParameter: "bar")
            ...
            ...
In other words, I need to copy-and-paste the same option snippets in any new specific pipeline that I create.
Also, this means that I need to edit separately each Jenkinsfile (along with any peer-review processes we use internally) when I decide to change the set of options.
I'd very much like to remove this maintenance overhead somehow.
How can I delegate the option-setting to a shared library, or otherwise configure the required options for all pipelines at once?
Two options will help you the most:
Using global variables at the Master/Agent level.
Go to Jenkins --> Manage Jenkins --> Configure System --> Global properties.
Check the Environment variables box, then add a name and a value for the variable.
You will then be able to use it normally in your Jenkins pipelines, as in the code snippet below.
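For example (a sketch; `DEPLOY_TARGET` is a hypothetical variable defined under Global properties):

```groovy
// Jenkinsfile — sketch; DEPLOY_TARGET is assumed to be defined in
// Manage Jenkins --> Configure System --> Global properties
pipeline {
    agent any
    stages {
        stage('demo') {
            steps {
                // globally configured environment variables are exposed via 'env'
                echo "Deploying to ${env.DEPLOY_TARGET}"
            }
        }
    }
}
```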
Wrap the whole pipeline in a function inside the shared library.
The Jenkinsfile will then look like this:
@Library('shared-library') _

customServicePipeline(agent: 'staging',
                      timeout: 3,
                      server: 'DEV')
The shared library function:
// vars/customServicePipeline.groovy
def call(Map pipelineParams = [:]) {
    pipeline {
        agent { label "${pipelineParams.agent}" }
        tools {
            maven 'Maven-3.8.6'
            jdk 'JDK 17'
        }
        options {
            // pass the numeric value directly; timeout expects an integer, not a string
            timeout(time: pipelineParams.timeout, unit: 'MINUTES')
        }
        stages {
            stage('Prep') {
                steps {
                    echo 'prep started'
                    pingServer(pipelineParams.get("server"))
                }
            }
        }
    }
}

Executing shell commands from inside Pipeline Shared Library

I'm writing a shared library that will get used in Pipelines.
class Deployer implements Serializable {
    def steps
    Deployer(steps) {
        this.steps = steps
    }
    def deploy(env) {
        // convert environment from steps to list
        def process = "ls -l".execute(envlist, null)
        process.consumeProcessOutput(output, error)
        process.waitFor()
        println output
        println error
    }
}
In the Jenkinsfile, I import the library, call the class and execute the deploy function inside a script section:
stage('mystep') {
    steps {
        script {
            def deployer = com.mypackage.HelmDeployer("test")
            deployer.deploy()
        }
    }
}
However, no output or errors are printed on the Console log.
Is it possible to execute stuff inside a shared library class? If so, how, and what am I doing wrong?
Yes, it is possible, but the solution is not really obvious. Every call that is usually done in the Jenkinsfile but was moved to the shared library needs to reference the steps object you passed.
You can also reference the Jenkins environment by calling steps.env.
I will give you a short example:
class Deployer implements Serializable {
    def steps
    Deployer(steps) {
        this.steps = steps
    }
    def callMe() {
        // Always call the steps object
        steps.echo("Test")
        steps.echo("${steps.env.BRANCH_NAME}")
        steps.sh("ls -al")
        // Your command could look something like this:
        // def process = steps.sh(script: "ls -l", returnStdout: true).execute(steps.env, null)
        ...
    }
}
You also have to import the class from the shared library and create an instance of it. Define the following outside of your pipeline block:
import com.mypackage.Deployer // path is relative to the src/ folder of the shared library

def deployer = new Deployer(this) // 'this' references the step context of the Jenkinsfile
Then you can call it in your pipeline like this:
... script { deployer.callMe() } ...
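Put together, a declarative Jenkinsfile using this class might look like the following (untested sketch; `my-shared-lib` is a placeholder library name):

```groovy
// Jenkinsfile — sketch; 'my-shared-lib' is a hypothetical library name
@Library('my-shared-lib') _
import com.mypackage.Deployer

// instantiate outside the pipeline block, passing the script context
def deployer = new Deployer(this)

pipeline {
    agent any
    stages {
        stage('deploy') {
            steps {
                script {
                    // all Jenkins steps inside callMe() go through the
                    // 'steps' object we passed to the constructor
                    deployer.callMe()
                }
            }
        }
    }
}
```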

How to inject environment variables in jenkinsfile using shared libraries before beginning the pipeline code?

I want to inject multiple environment variables using shared libraries into multiple Jenkinsfiles, as the env variables are common across them. The motive is to inject properties at a global level so that the variables are global and accessible throughout the pipeline.
I have tried the following:
a. Used the environment tag in the Jenkinsfile. This has to be done in every Jenkinsfile, hence no code re-usability.
b. I am able to inject env variables inside the script tag of a stage. But I want to do it before the pipeline code begins, like global properties which can be accessed from anywhere in the pipeline.
Instead of the below:
//Jenkinsfile
pipeline {
    environment {
        TESTWORKSPACE = "some_value"
        BUILDWORKSPACE = "some_value"
        ...
        ...
        // 30+ such env properties
    }
}
I am looking for something where I can declare these env variables in a shared library groovy script and then access it throughout the pipeline. Something like below:
//Jenkinsfile
def call(Map pipelineParams) {
    pipeline {
        <code>
        <Use pipelineParams.TESTWORKSPACE as a variable anywhere in my pipeline>
    }
}
I think this is possible. Maybe your Jenkinsfile could look something like this:
@Library(value="my-shared-lib#master", changelog=false) _
import com.MyClass

def extendsEnv(env) {
    def myClass = new MyClass()
    myClass.getSharedVars().each { String key, String value ->
        env[key] = value
    }
}

pipeline {
    ...
    stage('init') {
        steps {
            extendsEnv(env)
        }
    }
}
Note that your shared vars should be string values if you plan to add them as environment variables later. I didn't try this code, though.
You could also imagine another solution: factorize the pipeline for all teams and use a parameter override system (project level > team level > global level):
@Library(value="my-shared-lib#master", changelog=false) _

def projectConfig = [:]
projectConfig['RESOURCE_REQUEST_CPU'] = '2000m'
projectConfig['RESOURCE_LIMIT_MEMORY'] = '2000Mi'

teamPipeline(projectConfig)
The teamPipeline function is declared in the shared lib's vars directory, with additional team parameters, and finally calls a common pipeline with a parameter map. You set those parameters as env vars as shown above.
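A minimal sketch of such a teamPipeline global (untested; the team-level default values and the single demo stage are placeholders):

```groovy
// vars/teamPipeline.groovy — hypothetical sketch
def call(Map projectConfig = [:]) {
    // team-level defaults, overridden by the project-level map
    // (project level > team level > global level)
    def config = [
        RESOURCE_REQUEST_CPU : '1000m',
        RESOURCE_LIMIT_MEMORY: '1024Mi',
    ] << projectConfig

    pipeline {
        agent any
        environment {
            // string values from the merged map become pipeline-wide env vars
            RESOURCE_REQUEST_CPU  = "${config.RESOURCE_REQUEST_CPU}"
            RESOURCE_LIMIT_MEMORY = "${config.RESOURCE_LIMIT_MEMORY}"
        }
        stages {
            stage('build') {
                steps {
                    echo "cpu=${env.RESOURCE_REQUEST_CPU}, mem=${env.RESOURCE_LIMIT_MEMORY}"
                }
            }
        }
    }
}
```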

Inject variable in jenkins pipeline with groovy script

I am building a Jenkins pipeline and the job can be triggered remotely. I have the requirement to know which IP triggered the job. So I have a little Groovy script which returns the remote IP. With the EnvInject plugin I can easily use this variable in a normal freestyle job, but how can I use it in the pipeline script? I can't use the EnvInject plugin with the Pipeline plugin :(
Here is the little script for getting the IP:
import hudson.model.*
import static hudson.model.Cause.RemoteCause

def ipaddress = ""
for (CauseAction action : currentBuild.getActions(CauseAction.class)) {
    for (Cause cause : action.getCauses()) {
        if (cause instanceof RemoteCause) {
            ipaddress = cause.addr
            break
        }
    }
}
return ["ip": ipaddress]
You can create a shared library function (see here for examples and the directory structure). This is one of the undocumented (or at least poorly documented) features of Jenkins.
Put a file ipTrigger.groovy in the vars directory, which lives in the workflow-libs directory at the root level of JENKINS_HOME, and put your code in that file.
The full filename will then be $JENKINS_HOME/workflow-libs/vars/ipTrigger.groovy
(You can even make a git repo for your shared libraries and clone it in that directory)
// workflow-libs/vars/ipTrigger.groovy
import hudson.model.*
import static hudson.model.Cause.RemoteCause

@com.cloudbees.groovy.cps.NonCPS
def call(currentBuild) {
    def ipaddress = ""
    for (CauseAction action : currentBuild.getActions(CauseAction.class)) {
        for (Cause cause : action.getCauses()) {
            if (cause instanceof RemoteCause) {
                ipaddress = cause.addr
                break
            }
        }
    }
    return ["ip": ipaddress]
}
After a restart of Jenkins, from your pipeline script, you can call the method by the filename you gave it.
So from your pipeline just call def trigger = ipTrigger(currentBuild)
The IP address will then be trigger.ip (sorry for the bad naming, couldn't come up with something original)
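In a scripted pipeline, that usage would look something like this (sketch, assuming the ipTrigger global above):

```groovy
// Jenkinsfile — sketch using the ipTrigger global defined above
node {
    // the global returns a map with an 'ip' key
    def trigger = ipTrigger(currentBuild)
    echo "Build was triggered from IP: ${trigger.ip}"
}
```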

Jenkinsfile: use traits and other Groovy syntax

I would like to build a slightly more complex pipeline via Jenkinsfiles, with some reusable steps, as I have a lot of similar projects. I'm using Jenkins 2.0 with the Pipeline plugins. I know that you can load Groovy scripts which can contain some generic pieces of code, but I was wondering if these scripts can use some of the object-oriented features of Groovy, like traits. For example, say I had a trait called Step:
package com.foo.something.ci

trait Step {
    void execute() { echo 'Null execution' }
}
And a class that then implemented the trait in another file:
class Lint implements Step {
    def execute() {
        stage('lint')
        node {
            echo 'Do Stuff'
        }
    }
}
And then another class that contained the 'main' function:
class foo {
    def f = new Lint()
    f.execute()
}
How would I load and use all these classes in a Jenkinsfile, especially since I may have multiple classes each defining a step? Is this even possible?
Have a look at Shared Libraries. These enable the use of native Groovy code in Jenkins.
Your Jenkinsfile would include your shared library and then use the classes you defined. Be aware that you have to pass the steps variable of Jenkins if you want to use stage or the other variables defined in the Jenkins Pipeline plugin.
Excerpt from the documentation:
This is the class which would define your stages:
package org.foo

class Utilities implements Serializable {
    def steps
    Utilities(steps) { this.steps = steps }
    def mvn(args) {
        steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
    }
}
You would use it like this:
@Library('utils') import org.foo.Utilities

def utils = new Utilities(steps)
node {
    utils.mvn 'clean package'
}
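Applying that pattern to the trait from the question, an untested sketch would inject the step context into the trait and call all Jenkins steps through it (the library name and constructor are assumptions for illustration):

```groovy
// src/com/foo/something/ci/Step.groovy — sketch adapting the question's trait
package com.foo.something.ci

trait Step {
    def steps  // Jenkins step context, passed in from the Jenkinsfile
    void execute() { steps.echo 'Null execution' }
}

// src/com/foo/something/ci/Lint.groovy — same package, shown here inline
class Lint implements Step {
    Lint(steps) { this.steps = steps }
    void execute() {
        // scripted-pipeline steps are reached through the injected context
        steps.stage('lint') {
            steps.echo 'Do Stuff'
        }
    }
}
```

The Jenkinsfile would then do something like `new com.foo.something.ci.Lint(this).execute()` after the @Library line, with `this` supplying the step context.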
