Groovy script loaded from Jenkinsfile not found

Currently I have an "all inclusive" Jenkinsfile which contains various functions.
In order to re-use those functions in other Jenkinsfiles, I want to put them into separate Groovy scripts and load them from the Jenkinsfile(s).
scmHandler.groovy:
#!groovy
def handleCheckout() {
    if (env.gitlabMergeRequestId) {
        echo 'Merge request detected. Merging...'
    }
    ...
}
return this;
In the Jenkinsfile I do:
...
def scmHandler = load("test/scmHandler.groovy")
scmHandler.handleCheckout()
I tried to follow the instructions from here, but Jenkins constantly complains that there is no such file scmHandler.groovy and I get:
java.io.FileNotFoundException: d:\jenkins\workspace\myJenkinsJob\test\scmHandler.groovy
Both the Jenkinsfile and scmHandler.groovy reside in a test/ subdir of the workspace in the Git repo of the project to build, and are checked out correctly on the master:
/var/lib/jenkins/jobs/myJenkinsJob/workspace@script/test/scmHandler.groovy
However, I cannot find them on the slave node where the Jenkinsfile executes the build steps inside a node {}. There I only see old versions of the Jenkinsfile, since the (separate) checkout step has not been executed yet.
How do I correctly access scmHandler.groovy? What am I missing here?
Actually I find this a neat way to "include" external groovy files without using a separate library.

Use checkout scm before loading scmHandler.groovy:
checkout scm
def scmHandler = load("test/scmHandler.groovy")
scmHandler.handleCheckout()
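Putting it together, a minimal Jenkinsfile sketch (only the test/ path is taken from the question; the rest is a plain scripted pipeline): load reads the file from the workspace of the node it runs on, so the checkout has to happen first on that same node:
node {
    // populate the workspace with test/scmHandler.groovy before loading it
    checkout scm
    def scmHandler = load 'test/scmHandler.groovy'
    scmHandler.handleCheckout()
}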

Related

Share Functions among Jobs

I am using Jenkins Pipeline. I created 4 jobs; each job has some functions, and there is a redundant function existing in all those jobs.
How can I put that redundant function in a shared place so that all those jobs can call it?
You are looking for a Jenkins shared library.
As the name suggests, you create a library - a pipeline shared among Jenkins jobs - in an SCM (Git, SVN, ...), and in your project you create a simple Jenkinsfile calling the library.
So every build will check out your project, read the Jenkinsfile, and then check out the library with the pipeline.
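As a rough sketch (the library name my-shared-lib and the step name commonBuild are placeholders; the library must first be registered under Global Pipeline Libraries), the library repo would contain a step such as:
// vars/commonBuild.groovy in the shared library repo
def call() {
    echo 'running the shared build steps'
}
and each project's Jenkinsfile would then reduce to:
@Library('my-shared-lib') _
node {
    commonBuild()
}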
I did it by:
1. Creating a folder in the Jenkins working home.
2. In that folder, creating a file.groovy that contains the functions I need.
3. Ending that file with:
return this
4. In the Jenkinsfile, adding:
node { shared_functionality = load "FilePath.groovy" }
Step four includes the functions from the .groovy file in your Jenkinsfile, so you can add such a node statement to your Jenkinsfiles to include the functions you need (see the sketch below).
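A minimal sketch of that layout (the path, file name, and function are placeholders; note that load reads the file on the node executing the step, so the path must exist there):
// functions.groovy, e.g. under the Jenkins home directory
def notifyBuild(String status) {
    echo "Build status: ${status}"
}
// must be the last line so the loaded script can be used as an object
return this

// Jenkinsfile
node {
    shared_functionality = load "${env.JENKINS_HOME}/functions.groovy"
    shared_functionality.notifyBuild('STARTED')
}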

Jenkins Shared Libraries context

I have a pipeline job which loads its Jenkinsfile from a Git repository. My Jenkinsfile looks like this:
#!groovy
@Library('global-utils-lib') _
node("mvn") {
    stage('build') {
        checkout scm
    }
    stage('merge-request') {
        mergeRequest()
    }
}
global-utils-lib is a shared library, loaded in Global Pipeline Libraries from another Git repo, with the following structure:
vars/mergeRequest.groovy
mergeRequest.groovy:
def call() {
    sh "ip addr"
    def workspacePath = env.WORKSPACE
    new File(workspacePath + "/file.txt").text
}
The job is run against a Docker container (Docker plugin).
When I run this job, the Docker container is provisioned correctly and the SCM is checked out, but I get a FileNotFoundException.
It looks like the code from the shared library is executed against the Jenkins master, not the slave:
the presented IP comes from the master
the file is loaded correctly when I pass the correct path to the SCM checkout on the master
How can I run library code against the slave? What am I missing?
It's generally not a good idea to try and do things like new File() instead of using existing Pipeline steps.
Your Pipeline script is interpreted and executed by the Jenkins master so, as you're seeing, the attempt to use the File API doesn't work as you might expect.
Sticking to Pipeline steps helps ensure that your pipeline is durable (i.e. survives restarts), is pausable, and doesn't block the execution thread, preventing parallel steps from working, for example.
In this case, the existing readFile step can be used.
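For example, the library step could be rewritten along these lines (a sketch; it assumes file.txt sits at the workspace root):
// vars/mergeRequest.groovy
def call() {
    sh "ip addr"
    // readFile is a Pipeline step, so it runs in the workspace of the
    // node executing it, unlike java.io.File, which always resolves
    // paths on the master's filesystem
    def text = readFile 'file.txt'
    echo text
}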
I don't know how well the Docker Plugin interacts with Pipeline (though I imagine it should be transparent), and without knowing which agents have the "mvn" label, or whether you can reproduce this outside of a shared library, it's unclear why your sh step would appear to be running on the master.
The Docker Pipeline Plugin is explicitly designed for Pipeline, so it might give better results.
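For comparison, a minimal Docker Pipeline sketch (the image name is arbitrary): the steps inside the inside { } block run in a container on the agent, and Pipeline steps such as readFile behave as expected there:
node("mvn") {
    checkout scm
    // the workspace is mounted into the container, so workspace files
    // are visible to the steps below
    docker.image('maven:3-jdk-8').inside {
        sh 'mvn --version'
        echo readFile('file.txt')
    }
}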

Multi-branch configuration with externally-defined Jenkinsfile

I have an open-source project that resides in GitHub and is built using a build farm controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, there is information that the Jenkinsfile would include that is not intrinsic to the project build itself, but rather specific to the build environment instance, which may change.
The way I accomplished this is:
1. Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
2. Tell Jenkins via the Jenkinsfile to call a function defined in a single global library.
3. Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings - for example, using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                // run the project's build script via the standard sh step
                sh 'python build.py'
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "Pipeline script from SCM" option and then check out the actual code, but this forces you to take care of the branch builds manually.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm" repository
    flow.runFlow()
}
where the pipeline.groovy file, which contains the actual pipeline, would look like this:
def runFlow() {
    // your pipeline code
}
// Has to end with 'return this;' in order to be usable as a library
return this;

How to re-use groovy script in Jenkins Groovy Post Build plugin?

I have some groovy code which I am planning to re-use in Jenkins Groovy Post Build plugin of multiple jobs. How can I achieve this? Is there a place I can store the script in a global variable and call that in the jobs where ever I need?
You can load any Groovy file living on the Jenkins master within the Groovy Postbuild and execute it. For example, you could have a special directory on the C: drive where all the common scripts live. I'll update my answer later with some code that shows you how to load the script in.
Update
Assuming you have a test.groovy file on your C: drive, it should be as simple as the following in Groovy Postbuild:
evaluate(new File("C:\\test.groovy"))
Please view the comment section of the Groovy Postbuild for more examples and possibly other ways.
Here is the solution that worked for me:
Installed the Scriptler plugin for Jenkins and saved the Groovy script in it. Now the script is available in the JENKINS_HOME/scriptler/scripts directory. This way we avoid the manual step of copying files to the Jenkins master.
Used the Groovy file in the post-build step:
def env = manager.build.getEnvironment(manager.listener)
evaluate(new File(env['JENKINS_HOME'] + "\\scriptler\\scripts\\GroovyForPostBuild.groovy"))
This is a copy of my answer to this similar question on StackOverflow:
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
import hudson.FilePath

// Get an instance of the Build object, and from there
// the channel from the master to the workspace
build = Thread.currentThread().executable
channel = build.workspace.channel

// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")

// Some have suggested that the "not null" check is redundant;
// I've kept it for completeness
if (fp != null) {
    // 'evaluate' requires a String, so read the file contents into one
    script = fp.readToString()
    // Execute the script
    evaluate(script)
}

How do I load CPS Global Lib from the specified branch instead of master?

I am studying the Jenkins Pipeline Global Lib functionality. It seems pretty handy; however, due to its global nature, any harmful change will affect all the jobs. Thus I want to be able to test changes on a different branch before pushing to master.
Is there a way to specify the branch from which I want to include the global lib sources for a particular job?
UPDATE. I tried a workaround: a direct git clone from the test branch, then loading my library file explicitly, replacing the automatically loaded one.
The problem comes when this lib uses some other class from src/, because in this case its pre-loaded version from master is always used.
So under the conditions below it runs common.groovy from feature-test but prints Hello from master!!!! when b.dummy() is called.
Pipeline Script in Jenkins:
node('myhost') {
    git url: 'ssh://10.0.0.1:12345/workflowLibs.git',
        branch: 'feature-test'
    dir('src') {
        load 'com/foo/Base.groovy'
    }
    dir('vars') {
        common = load 'common.groovy'
    }
}
println common.dummy()
vars/common.groovy (feature-test):
package com.foo
def dummy() {
    def b = new com.foo.Base()
    b.dummy()
}
src/com/foo/Base.groovy (master):
package com.foo
def dummy() {
    return 'Hello from master!!!!'
}
src/com/foo/Base.groovy (feature-test):
package com.foo
def dummy() {
    return 'Hello from feature-test!!!!'
}
As far as I'm aware, this isn't possible — scripts pushed to this repo can't be versioned by having different branches.
There are a few alternative approaches you could take, which involve hosting your scripts in an external repo (i.e. not with the Global Lib repo):
You could use the Pipeline Remote Loader plugin, which allows you to pull Pipeline scripts from a remote repo, e.g.
def p = fileLoader.fromGit('bar/common.groovy',
                           'https://example.com/foo/pipelines.git',
                           'test-branch', null, '')
p.doSomething()
You can also use this plugin to easily load multiple Pipeline scripts from the same repo.
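For instance, the plugin also offers a withGit helper for loading several scripts from a single checkout; a sketch (the URL, branch, and script paths are placeholders, and the exact signature should be checked against the plugin docs):
def common, extra
fileLoader.withGit('https://example.com/foo/pipelines.git', 'test-branch', null, '') {
    common = fileLoader.load('bar/common.groovy')
    extra = fileLoader.load('bar/extra.groovy')
}
common.doSomething()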
Alternatively, you could check out your script Git repo within a Pipeline execution and load a script directly:
stage 'Load scripts'
def p
dir('tmp') {
git url: 'https://example.com/foo/pipelines.git',
branch: 'test-branch'
p = load 'bar/common.groovy'
}
stage 'Do something'
p.doSomething()
If you want to keep on using the Global Lib repo, you could use the above techniques for testing the scripts, and then set up a Jenkins job to push your script changes into the master Global Lib repo.
