I would like to use a slightly more complex pipeline built via Jenkinsfiles, with some reusable steps, as I have a lot of similar projects. I'm using Jenkins 2.0 with the Pipeline plugins. I know that you can load Groovy scripts which can contain some generic pieces of code, but I was wondering whether these scripts can use some of the object-oriented features of Groovy, like traits. For example, say I had a trait called Step:
package com.foo.something.ci
trait Step {
void execute(){ echo 'Null execution'}
}
And a class that then implemented the trait in another file:
class Lint implements Step {
def execute() {
stage('lint')
node {
echo 'Do Stuff'
}
}
}
And then another class that contained the 'main' function:
class Foo {
    def main() {
        def f = new Lint()
        f.execute()
    }
}
How would I load and use all these classes in a Jenkinsfile, especially since I may have multiple classes each defining a step? Is this even possible?
Have a look at Shared Libraries. These enable the use of native Groovy code in Jenkins.
Your Jenkinsfile would include your shared library and then use the classes you defined. Be aware that you have to pass the steps variable of Jenkins if you want to use stage or the other steps defined by the Jenkins Pipeline plugin.
Excerpt from the documentation:
This is the class that would define your stages:
package org.foo
class Utilities implements Serializable {
def steps
Utilities(steps) {this.steps = steps}
def mvn(args) {
steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
}
}
You would use it like this:
@Library('utils') import org.foo.Utilities
def utils = new Utilities(steps)
node {
utils.mvn 'clean package'
}
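Applied to the example from the question, the Step abstraction and the Lint step would live under the library's src/ directory, and every Jenkins step (echo, stage, node) has to go through the passed-in steps object. The following is only a sketch under assumptions: the library name 'ci-steps' is made up, and because Groovy traits tend to be problematic in the sandboxed Pipeline runtime, it uses an abstract base class instead of a trait.
// src/com/foo/something/ci/Step.groovy
package com.foo.something.ci

// Base class instead of a trait; it carries the Jenkins steps object
abstract class Step implements Serializable {
    def steps
    Step(steps) { this.steps = steps }
    def execute() { steps.echo 'Null execution' }
}

// src/com/foo/something/ci/Lint.groovy
package com.foo.something.ci

class Lint extends Step {
    Lint(steps) { super(steps) }
    def execute() {
        steps.stage('lint') {
            steps.node {
                steps.echo 'Do Stuff'
            }
        }
    }
}

// Jenkinsfile
@Library('ci-steps') import com.foo.something.ci.Lint

new Lint(this).execute()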
I'm writing a shared library that will get used in Pipelines.
class Deployer implements Serializable {
def steps
Deployer(steps) {
this.steps = steps
}
def deploy(env) {
// convert environment from steps to list
def process = "ls -l".execute(envlist, null)
process.consumeProcessOutput(output, error)
process.waitFor()
println output
println error
}
}
In the Jenkinsfile, I import the library, call the class and execute the deploy function inside a script section:
stage('mystep') {
steps {
script {
def deployer = com.mypackage.HelmDeployer("test")
deployer.deploy()
}
}
}
However, no output or errors are printed on the Console log.
Is it possible to execute stuff inside a shared library class? If so, how, and what am I doing wrong?
Yes, it is possible, but the solution is not really obvious. Every call that is usually done in the Jenkinsfile but was moved to the shared library needs to reference the steps object you passed.
You can also reference the Jenkins environment by calling steps.env.
I will give you a short example:
class Deployer implements Serializable {
def steps
Deployer(steps) {
this.steps = steps
}
def callMe() {
// Always call the steps object
steps.echo("Test")
steps.echo("${steps.env.BRANCH_NAME}")
steps.sh("ls -al")
// Your command could look something like this:
// def output = steps.sh(script: "ls -l", returnStdout: true)
...
}
}
You also have to import the class from the shared library and create an instance of it. Define the following outside of your pipeline:
import com.mypackage.Deployer // path is relative to your src/ folder of the shared library
def deployer = new Deployer(this) // 'this' refers to the steps object of the Jenkins script
Then you can call it in your pipeline like this:
... script { deployer.callMe() } ...
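Putting those pieces together, a complete Jenkinsfile could look roughly like the sketch below; the library name 'my-shared-library' and the agent declaration are assumptions.
// Jenkinsfile (sketch)
@Library('my-shared-library') import com.mypackage.Deployer

// 'this' is the pipeline script, which exposes the step objects
def deployer = new Deployer(this)

pipeline {
    agent any
    stages {
        stage('mystep') {
            steps {
                script {
                    deployer.callMe()
                }
            }
        }
    }
}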
Let me preface this by saying that I don't yet fully understand how the Jenkins DSL / Groovy handles namespaces, scope, variables, etc.
In order to keep my code DRY I put repeated command sequences into variables.
It turns out the variable script below is not readable by the code in doParallelStuff. Why is that? Is there a way to share global variables defined in the script (or elsewhere) among both the main pipeline steps and the doParallelStuff code?
def script = """\
#!/bin/bash
python xyz.py
"""
def doParallelStuff() {
def tests = [:]
tests["1"] = {
node {
stage('ps1') {
sh script
}
}
}
tests["2"] = {
node {
stage('ps2') {
sh script
}
}
}
parallel tests
}
pipeline {
stages {
stage("myStage") {
steps {
script {
sh script
doParallelStuff()
}
}
}
}
}
The actual steps are a bit more complicated, but this causes an error like the following to be thrown:
hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: script for class: WorkflowScript
When you define a variable outside of the pipeline directive scope using the def keyword, you are defining it in the local scope of the main script. Because the pipeline keyword is actually a method that is executed in the main script, it can access the variable, as they are defined and executed in the same scope (they are actually transformed into a separate class).
When you define a function outside of the pipeline directive, that function has its own scope for variables, which is separate from the scope of the main script, and therefore it cannot access the variable defined at the top level.
To solve it you can define the variable without the def keyword, which changes the scope in which the variable is created: without def (in a Groovy script, not a class) the variable is added to the global variables of the script (the binding), which makes it accessible from any function or code within the Groovy script. You can read more in the following question: What is the difference between defining variables using def and without?
So in your case you want a variable that is available both to the pipeline code itself and to the defined functions, which means it needs to be available anywhere in the script as a global variable. Therefore just define it without the def keyword, and it should do the trick:
script = """\
#!/bin/bash
python xyz.py
"""
Consider this Groovy file in a repo that is loaded as a shared library in Jenkins:
vars/
└── Utility.groovy
// Utility.groovy
def funcA() { ... }
def funcB() { ... }
And in the Jenkinsfile:
// Jenkinsfile
@Library('LibName') _
pipeline {
...
steps {
script {
def util = new Utility()
util.funcA()
}
}
}
This works fine. But if I try to load the library dynamically:
// Jenkinsfile
pipeline {
...
steps {
script {
library 'LibName'
def util = new Utility()
}
}
}
That doesn't work...
Can someone explain this with respect to this quote from the documentation:
The documentation of Shared Libraries in Jenkins says:
Internally, scripts in the vars directory are instantiated on-demand as singletons. This allows multiple methods to be defined in a single .groovy file for convenience.
Loading a Jenkins Shared Library dynamically has some limitations and challenges, which are explained in the documentation:
Using classes from the src/ directory is also possible, but trickier. Whereas the @Library annotation prepares the “classpath” of the script prior to compilation, by the time a library step is encountered the script has already been compiled. Therefore you cannot import or otherwise “statically” refer to types from the library.
And it seems this question is kind of similar to this one.
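To spell that out: with the @Library annotation the Utility type is on the script's classpath before compilation, so new Utility() compiles; by the time the library step runs, the script has already been compiled, so the constructor call cannot be resolved. What remains available after dynamic loading is the global variable that vars/Utility.groovy becomes, plus dynamic access to src/ classes through the object returned by the library step. A rough sketch, using the library name from the question and a hypothetical src/ class; treat the exact access patterns as something to verify against the documentation:
// Jenkinsfile (scripted, sketch)
node {
    def lib = library 'LibName'

    // vars/Utility.groovy is instantiated on demand as the global variable 'Utility',
    // so call its methods directly instead of constructing it with 'new'
    Utility.funcA()

    // src/ classes can only be reached dynamically through the library step's
    // return value, e.g. for a hypothetical class com.foo.Helper:
    // def helper = lib.com.foo.Helper.new(this)
}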
I would like to integrate a global shared library into my build flow. I have written a basic function.
src/core/jenkins/Checks.groovy:
package core.jenkins
class Checks implements Serializable {
def script
Checks(script) {
this.script = script
}
def fileExists(){
script.echo "File exists in the repo."
}
}
And it is exposed as a global var
vars/fileExisits.groovy:
def call() {
new core.jenkins.Checks(this).fileExists()
}
The library is configured in Jenkins under the Global Shared Library settings.
Now in my Jenkinsfile, I'm doing something like this:
pipeline {
agent { label 'master' }
stages {
stage('Check for md files'){
steps {
sh 'echo hello'
script {
checks.fileExists()
}
}
}
}
}
This always gives the error
groovy.lang.MissingPropertyException: No such property: checks for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
...
For it to work, I have to add these lines to the top of my Jenkinsfile:
import core.jenkins.Checks
def checks = new Checks(this)
Is there a way for me to invoke the fileExists function from the library without always having to add the above two lines?
Just replace:
checks.fileExists()
with:
fileExists()
All Groovy scripts that implement a def call() method and are stored in the vars/ folder can be invoked by their script file name. Alternatively, if you would like to keep the checks.fileExists() syntax, you need to create a vars/checks.groovy script file and implement a def fileExists() method inside it.
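A minimal sketch of that second option, delegating to the Checks class from the question (the wiring through 'this' mirrors what the question already does in its call() version):
// vars/checks.groovy
def fileExists() {
    // 'this' is the pipeline script, which the Checks class stores as 'script'
    new core.jenkins.Checks(this).fileExists()
}
With that file in the library, checks.fileExists() works inside a script block without any import or explicit instantiation in the Jenkinsfile.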
I want to refactor my Jenkins Pipeline script into classes for readability and reuse.
The problem is I get exceptions when doing so.
Let's look at a simple example:
When I run
echo currentBuild.toString()
everything is fine.
But when I extract it into a class like so:
class MyClass implements Serializable {
def runBuild() {
echo currentBuild.toString()
}
}
new MyClass().runBuild()
I get an exception:
Started by user admin
Replayed #196
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: currentBuild for class: MyClass
What is the proper way of extracting pipeline code in to classes?
You are on the right track, but the problem is that you didn't pass the script object to the instance of your class, and you were trying to call a method that is not defined in the class you created.
Here is one way to solve this:
// Jenkins file or pipeline scripts editor in your job
new MyClass(this).runBuild()
// Class declaration
class MyClass implements Serializable {
def script
MyClass(def script) {
this.script=script
}
def runBuild() {
script.echo script.currentBuild.toString()
}
}
Your code is missing the declaration of the script class field:
class MyClass implements Serializable {
def script
MyClass(def script) {
this.script=script
}
def runBuild() {
script.echo script.currentBuild.toString()
}
}
This code should be OK, @bram.