Jenkins Shared Library - Importing classes from the /src folder in /vars

I am trying to write a Jenkins Shared Library for my CI process. I'd like to reference a class from the /src folder inside a global function defined in the /vars folder, since that would allow me to put most of the logic in classes instead of in the global functions. I am following the repository structure documented in the official Jenkins documentation:
Jenkins Shared Library structure
Here's a simplified example of what I have:
/src/com/example/SrcClass.groovy
package com.example
class SrcClass {
    def aFunction() {
        return "Hello from src folder!"
    }
}
/vars/classFromVars.groovy
import com.example.SrcClass

def call(args) {
    def sc = new SrcClass()
    return sc.aFunction()
}
Jenkinsfile
@Library('<lib-name>') _
pipeline {
    ...
    post {
        always {
            classFromVars()
        }
    }
}
My goal was for the global functions in the /vars folder to act as a sort of public facade, letting me use them in my Jenkinsfile as custom steps without having to instantiate a class in a script block (keeping things compatible with declarative pipelines). It all seems pretty straightforward to me, but I am getting this error when running the classFromVars file:
<root>\vars\classFromVars.groovy: 1: unable to resolve class com.example.SrcClass
 @ line 1, column 1.
   import com.example.SrcClass
   ^

1 error
I tried running the classFromVars class directly with the groovy CLI locally and on the Jenkins server and I have the same error on both environments. I also tried specifying the classpath when running the /vars script, getting the same error, with the following command:
<root>>groovy -cp <root>\src\com\example vars\classFromVars.groovy
Is what I'm trying to achieve possible? Or should I simply put all of my logic in the /vars class and avoid using the /src folder?
I have found several repositories on GitHub that seem to indicate this is possible, for example this one: https://github.com/fabric8io/fabric8-pipeline-library, which uses the classes in the /src folder in many of the classes in the /vars folder.

As @Szymon Stepniak pointed out, the -cp parameter in my groovy command was incorrect. It now works locally and on the Jenkins server. I have yet to explain why it wasn't working on the Jenkins server before, though.
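For reference, the classpath has to point at the root of the package hierarchy (the src folder) rather than at the package directory itself, so the working invocation is presumably along these lines:

<root>>groovy -cp <root>\src vars\classFromVars.groovy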

I found that when I wanted to import a class from my shared library into a script in /vars, I needed to do it like this:
// Thanks to '_', the classes are imported automatically.
// MUST have the '@' at the beginning, otherwise it will not work.
// When not using "@BRANCH" it will use the default branch from the git repo.
@Library('my-shared-library@BRANCH') _

// Only by calling them can you tell whether they exist or not.
def exampleObject = new example.GlobalVars()
// Then call methods or attributes from the class.
exampleObject.runExample()
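The example.GlobalVars class used above would live in the library's /src folder. A minimal sketch of what it could look like (the package, class name and runExample method are simply the names assumed by the snippet above):

// src/example/GlobalVars.groovy
package example

class GlobalVars {
    def runExample() {
        println "Hello from the shared library's src folder!"
    }
}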

Related

how to call property file syntax and define in JOB DSL in jenkins

I want to use a properties file in a DSL job, which will take my project name for the job name and the SVN location. Does anyone have an idea how to write this and what the syntax is?
For handling properties files stored outside your repository, there is a plugin called "Config File Provider Plugin".
You use it like this:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-file-in-jenkins', targetLocation: 'path/destinationfile')]) {
            // some block
        }
    }
}
It is capable of replacing tokens in JSON and XML files, or the whole file (as in the example).
For handling data coming from SVN or the project name, you can access the environment variables. See this thread and this link.
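As a rough sketch of how these pieces could fit together (the file ID, the property key, and the use of readProperties from the Pipeline Utility Steps plugin are assumptions, not part of the original answer):

stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'my-props-file-id', targetLocation: 'build/job.properties')]) {
            script {
                // readProperties is provided by the Pipeline Utility Steps plugin
                def props = readProperties file: 'build/job.properties'
                // The job/project name is available as an environment variable
                echo "Job: ${env.JOB_NAME}, svn location: ${props['svn.location']}"
            }
        }
    }
}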

Jenkins: Get FOLDER Library by Name

In /vars for the folder library I have AaConfig.groovy.
In the job's Groovy script I have:
#!groovy
@Library('globals') _
Class config = AaConfig
I can use config instead of AaConfig with no problems in subsequent code.
This is useful because I can call different configuration files: AaConfig, BbConfig, etc. without changing anything, because I'm referring to them through config.
So now I got greedy :-), and wanted to specify the config file externally:
#!groovy
@Library('globals') _
String configName = "AaConfig"
Class config = Class.forName(configName)
Now I get:
java.lang.ClassNotFoundException: AaConfig
Any ideas how to solve this?
If you want to load the AaConfig class via reflection then you have to use the same classloader your pipeline script uses. Class.forName() in this case uses ClassLoaderForClassArtifacts, while the Groovy CPS plugin uses CpsGroovyShell$CleanGroovyClassLoader.
Below you can find an example of loading class using pipeline current classloader:
String configName = 'AaConfig'
Class config = this.getClass().getClassLoader().loadClass(configName)
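From there, config can be used just like the directly assigned class in the first snippet; for example (the static member name below is hypothetical):

// Hypothetical static field defined on AaConfig, accessed through the dynamically loaded Class
println config.SOME_SETTING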

Defining FOLDER level variables in Jenkins using a shared \vars library

So I'm trying to define folder-level variables by putting them in a groovy file in the /vars directory.
Alas, the documentation is so bad that it's impossible to figure out how to do that...
Assuming we have two globals G1 and G2, is this how we define them in the groovy file?
#!Groovy
static string G1 = "G1"
static string G2 = "G2"
Assuming the Groovy file is called XYZ.Groovy, how do I define it in the folder so it's available to the folder's scripts?
Assuming I get over that, and that LIBXYZ is the name the folder associates with the stuff in the /vars directory, is it correct to assume that when I call
@Library("LIBXYZ") _
it will make XYZ available?
In that case, is XYZ.G1 the way to access the globals?
thanks, a.
I have a working example here as I was recently curious about this. I agree that the documentation is wretched.
The following is similar to the info in README.md.
Prep: note that folder here refers to Jenkins Folders from the CloudBees Folder plugin. It is a way to organize jobs.
Code Layout
The first part to note is src/net/codetojoy/shared/Bar.groovy :
package net.codetojoy.shared

class Bar {
    static def G1 = "G1"
    static def G2 = "G2"

    def id

    def emitLog() {
        println "TRACER hello from Bar. id: ${id}"
    }
}
The second part is vars/folderFoo.groovy:
def emitLog(message) {
    println "TRACER folderFoo. message: ${message}"
    def bar = new net.codetojoy.shared.Bar(id: 5150)
    bar.emitLog()
    println "TRACER test : " + net.codetojoy.shared.Bar.G1
}
Edit: To use a static/"global" variable in the vars folder, consider the following vars/Keys.groovy:
class Keys {
    static def MY_GLOBAL_VAR3 = "beethoven"
}
The folderFoo.groovy script can use Keys.MY_GLOBAL_VAR3.
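For example, a line like this could be added to vars/folderFoo.groovy (this particular print statement is an illustration, not part of the original example):

println "TRACER key : " + Keys.MY_GLOBAL_VAR3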
And then usage (in my example: Basic.Folder.Jenkinsfile):
@Library('folderFoo') _

stage "use shared library"

node {
    script {
        folderFoo.emitLog 'pipeline test!'
    }
}
Jenkins Setup: Folder
Go to New Item and create a new Folder
configure the folder with a new Pipeline library:
Name is folderFoo
Default version is master
Retrieval Method is Modern SCM
Source Code Management in my example is this repo
Jenkins Setup: Pipeline Job
create a new Pipeline job in the folder created above
though a bit confusing (and self-referential), I create a pipeline job that uses this same repo
specify the Jenkinsfile Basic.Folder.Jenkinsfile
the job should run and use the library
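When it does, the console log should contain output along these lines, derived from the print statements in the library code above:

TRACER folderFoo. message: pipeline test!
TRACER hello from Bar. id: 5150
TRACER test : G1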

Can I import a groovy script from a relative directory into a Jenkinsfile?

I've got a project structured like this:
/
    Jenkinsfile
    build_tools/
        pipeline.groovy       # Functions which define the pipeline
        reporting.groovy      # Other misc build reporting stuff
        dostuff.sh            # A shell script used by the pipeline
        domorestuff.sh        # Another pipeline-supporting shell script
Is it possible to import the groovy files in /build_tools so that I can use functions inside those 2 files in my Jenkinsfile?
Ideally, I'd like to have a Jenkinsfile that looks something like this (pseudocode):
from build_tools.pipeline import build_pipeline
build_pipeline(project_name="my project", reporting_id=12345)
The bit I'm stuck on is how you write a working equivalent of that pretend import statement on line #1 of my pseudocode.
PS. Why I'm doing this: The build_tools folder is actually a git submodule shared by many projects. I'm trying to give each project access to a common set of build tooling to stop each project maintainer from reinventing this wheel.
The best-supported way to load shared groovy code is through shared libraries.
If you have a shared library like this:
simplest-jenkins-shared-library master % cat src/org/foo/Bar.groovy
package org.foo;

def awesomePrintingFunction() {
    println "hello world"
}
Shove it into source control and configure it in your Jenkins job, or even globally (this is one of the only things you do through the Jenkins UI when using Pipeline),
and then use it, for example, like this:
pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    @Library('simplest-jenkins-shared-library')
                    def bar = new org.foo.Bar()
                    bar.awesomePrintingFunction()
                }
            }
        }
    }
}
Output from the console log for this build would of course include:
hello world
There are lots of other ways to write shared libraries (like using classes) and to use them (like defining vars so you can use them in Jenkinsfiles in super-slick ways). You can even load non-groovy files as resources. Check out the shared library docs for these extended use-cases.
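For instance, loading a file from the library's resources directory is done with the libraryResource step (the resource path below is only an illustration):

// Reads resources/com/example/config.json from the shared library into a String
def configText = libraryResource 'com/example/config.json'
writeFile file: 'config.json', text: configText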

How to implement a Groovy global AST transformation in a Grails plugin?

I'd like to modify some of my Grails domain classes at compilation time. I initially thought this was a job for Groovy's global ASTTransformation since I don't want to annotate my domain classes (which local transformers require). What's the best way to do this?
I also tried mimicking DefaultGrailsDomainClassInjector.java by creating my own class in the same package, implementing the same interfaces, but I probably just didn't know how to package it up in the right place because I never saw my methods get invoked.
On the other hand I was able to manually create a JAR which contained a compiled AST transformation class, along with the META-INF/services artifacts that plain Groovy global transformations require. I threw that JAR into my project's "lib" dir and visit() was successfully invoked. Obviously this was a sloppy job because I am hoping to have the source code of my AST transformation in a Grails plugin and not require a separate JAR artifact if I don't have to, plus I couldn't get this approach to work by having the JAR in my Grails plugin's "lib" but had to put it into the Grails app's "lib" instead.
This post helped a bit too: Grails 2.1.1 - How to develop a plugin with an AstTransformer?
The thing about global transforms is that the transform code has to be available when compilation starts. Having the transformer in a jar was what I did first too! But, as you said, it is a sloppy job.
What you want to do is have your AST-transforming class compiled before the other classes reach the compilation phase. Here is what you do!
Preparing the transformer
Create a directory called precompiled in the src folder, and add the transformation class and the classes the transformer uses (such as annotations) to this directory with the correct package structure.
Then create a file called org.codehaus.groovy.transform.ASTTransformation in precompiled/META-INF/services, and you will have the following structure.
precompiled
--amanu
----LoggingASTTransformation.groovy
--META-INF
----services
------org.codehaus.groovy.transform.ASTTransformation
Then write the fully qualified name of the transformer in the org.codehaus.groovy.transform.ASTTransformation file; for the example above, the fully qualified name would be amanu.LoggingASTTransformation.
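So for this example, the services file would contain a single line:

amanu.LoggingASTTransformation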
Implementation
package amanu

import org.codehaus.groovy.transform.GroovyASTTransformation
import org.codehaus.groovy.transform.ASTTransformation
import org.codehaus.groovy.control.CompilePhase
import org.codehaus.groovy.ast.ASTNode
import org.codehaus.groovy.ast.ClassNode
import org.codehaus.groovy.control.SourceUnit

@GroovyASTTransformation(phase = CompilePhase.CANONICALIZATION)
class LoggingASTTransformation implements ASTTransformation {

    public void visit(ASTNode[] nodes, SourceUnit sourceUnit) {
        println("*********************** VISIT ************")
        sourceUnit.getAST()?.getClasses()?.each { classNode ->
            // classNode is a class contained in the file being compiled
            classNode.addProperty("field", ClassNode.ACC_PUBLIC, new ClassNode(Class.forName("java.lang.String")), null, null, null)
        }
    }
}
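The net effect is roughly as if every class in the compiled file had declared the following property itself:

String field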
Compilation
After implementing this you can go two ways. The first approach is to put it in a jar, like you did, and the other is to use a groovy script to compile it before everything else. To do this in Grails, we use the _Events.groovy script.
You can do this from a plugin or the main project, it doesn't matter. If it doesn't exist, create a file called _Events.groovy in the scripts directory and add the following content.
The code is copied from reinhard-seiler.blogspot.com with modifications
eventCompileStart = { target ->
    ...
    compileAST(pluginBasedir, classesDirPath)
    ...
}

def compileAST(def srcBaseDir, def destDir) {
    ant.sequential {
        echo "Precompiling AST Transformations ..."
        echo "src ${srcBaseDir} ${destDir}"
        path id: "grails.compile.classpath", compileClasspath
        def classpathId = "grails.compile.classpath"
        mkdir dir: destDir
        groovyc(destdir: destDir,
                srcDir: "$srcBaseDir/src/precompiled",
                classpathref: classpathId,
                stacktrace: "yes",
                encoding: "UTF-8")
        copy(toDir: "$destDir/META-INF") {
            fileset(dir: "$srcBaseDir/src/precompiled/META-INF")
        }
        echo "done precompiling AST Transformations"
    }
}
The previous script will compile the transformer before the rest of the code is compiled. This enables the transformer to be available for transforming your domain classes.
Don't forget
If you use any classes other than those already on your classpath, you will have to precompile those too. The above script will compile everything in the precompiled directory, and you can also put classes there that don't need AST transformation themselves but are needed by it.
If you want to use domain classes in the transformation, you might want to do the precompilation in the eventCompileEnd block instead! But this will make things slower.
Update
@Douglas Mendes mentioned there is a simpler way to cause precompilation, which is more concise.
eventCompileStart = { target ->
    projectCompiler.srcDirectories.add(0, "./src/precompiled")
}
