Loading a Groovy script from a Jenkins slave

I have a Jenkins pipeline which loads a Groovy utility script like this:
Utils = load('/var/lib/jenkins/utils/Utils.groovy')
Everything is fine when I execute the pipeline on the master node. In this case I'm able to use the methods of my Utils class in the pipeline.
node('master') {
    stage('stage1') {
        def Utils = load('/var/lib/jenkins/utils/Utils.groovy')
        Utils.doSomething()
    }
}
The problem comes when I try to execute the pipeline on a slave. In that case the load above causes the error
java.io.IOException: java.io.FileNotFoundException: /var/lib/jenkins/utils/Utils.groovy (No such file or directory)
To avoid this error, in the pipeline I load the file on the master node like this:
node('master') {
    stage('stage1') {
        Utils = load('/var/lib/jenkins/utils/Utils.groovy')
    }
}
node() {
    stage('stage2') {
        Utils.doSomething()
    }
}
This is not very efficient and I don't want to use the master just for loading the file.
Do you have any advice on how to load a Groovy script on a slave node?
Thank you

First, the error java.io.IOException: java.io.FileNotFoundException: /var/lib/jenkins/utils/Utils.groovy (No such file or directory) was raised because you tried to load the file while executing on the slave node, but the file is stored at /var/lib/jenkins/utils/Utils.groovy on your master node, which is another computer with another file system, I guess. So the error is logical.
When you execute pipeline operations on another node (computer, server etc.) and you want to load a file, the file needs to be stored on that computer (the slave) and loaded from there, so the path has to match the file's location on the slave.
So I would suggest one of the following:
- simply store the Utils.groovy file on the slave machine and load it from there
- load it in your workspace on the master (as you did already)
- store the Groovy code in a Git repository (e.g. on GitHub) and load it from there, independent of the master/slave file systems (I would advise this option; see the sketch below)
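For the last option, a minimal sketch of what the pipeline could look like (the repository URL and the slave label are placeholders):
node('my-slave-label') {
    stage('stage1') {
        // Check out the repo containing Utils.groovy on this node, so that
        // load() finds the file in this node's workspace instead of on the master
        git url: 'https://github.com/your-org/jenkins-utils.git' // hypothetical repository
        def Utils = load('Utils.groovy')
        Utils.doSomething()
    }
}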

Related

Reading config file in DSL build on agent host

I am trying to configure a Jenkins seed job whose whole logic is in a provided DSL script. I want to separate that script from its configuration, which I want to keep in an additional yml file. When I try to read that file:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml
def workDir = SEED_JOB.getWorkspace()
def config = new Yaml().load(("${workDir}/config.yml" as File).text)
I receive the error
java.io.FileNotFoundException: /var/lib/jenkins/workspace/test.dsl/config.yml (No such file or directory)
I suppose that Jenkins is looking for the file on a master host, not an agent node where workspace is located.
Is it possible to read the yml file in a DSL build step on the agent node? Or do I always have to execute the seed job on my master host?
This does not seem possible, as the Job DSL script is executed on the master. You can try to force the job to run on the master with the label master.
From the documentation in section Script location:
Job DSL scripts are executed on the Jenkins master node, but the seed job's workspace which contains the script files may reside on a build node. This means that direct access to the file specified by FILE may not be possible from a DSL script. See Distributed builds for details.
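Alternatively, the Job DSL API provides readFileFromWorkspace for exactly this case; it fetches the file through the master, so it also works when the seed job's workspace resides on an agent. A minimal sketch, assuming config.yml sits at the root of the seed job's workspace:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

// readFileFromWorkspace streams the file from the seed job's workspace,
// even when that workspace lives on a build node
def config = new Yaml().load(readFileFromWorkspace('config.yml'))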

Copy file from Jenkins master to slave in Pipeline

I have some Windows slaves in my Jenkins setup, so I need to copy files to them in a pipeline. I have heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
Direct copy doesn't work.
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy file from master to slave in Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, and therefore the source path is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also this example might be helpful.
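A minimal stash/unstash sketch (the 'windows' label is a placeholder for whatever label your slaves carry):
node('master') {
    // pick the file up where it exists, e.g. in the master's workspace
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    // restores parameters.xml into this node's workspace
    unstash 'params'
}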
The Pipeline DSL context runs on the master node even if you write node('someAgentName') in your pipeline. Some options:
- Try stash/unstash, but it is bad for large files.
- Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
- Try intermediate storage; archive() and sh("wget $url") will be helpful (see the sketch below).
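A rough sketch of the intermediate-storage idea, using the build's artifact URL and curl as the Windows counterpart of wget (this assumes the slave can reach the master over HTTP and the artifact is downloadable without authentication):
node('master') {
    // publish the file as a build artifact, downloadable from the master
    archiveArtifacts artifacts: 'parameters.xml'
}
node('windows') {
    // fetch it from this build's artifact URL (BUILD_URL ends with a slash)
    bat "curl -O ${env.BUILD_URL}artifact/parameters.xml"
}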
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (normal Windows shared folder).
After the build: the build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this.
stage('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
I recommend the Pipeline: Phoenix AutoTest plugin.
Jenkins plugin website: https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of the plugin: https://github.com/jenkinsci/phoenix-autotest-plugin

Groovy script loaded from Jenkinsfile not found

Currently I have an "all inclusive" Jenkinsfile which contains various functions.
In order to reuse those functions in other Jenkinsfiles I want to put them into separate Groovy scripts and load them from the Jenkinsfile(s).
scmHandler.groovy:
#!groovy
def handleCheckout() {
    if (env.gitlabMergeRequestId) {
        echo 'Merge request detected. Merging...'
    }
    ...
}
return this;
In the Jenkinsfile I do:
...
def scmHandler = load ("test/scmHandler.groovy")
scmHandler.handleCheckout()
I tried to follow the instructions from here, but Jenkins constantly complains that there is no such file scmHandler.groovy, and I get:
java.io.FileNotFoundException: d:\jenkins\workspace\myJenkinsJob\test\scmHandler.groovy
Both the Jenkinsfile and scmHandler.groovy reside in a test/ subdirectory of the workspace in the Git repo of the project to build, and they are checked out correctly on the master:
/var/lib/jenkins/jobs/myJenkinsJob/workspace@script/test/scmHandler.groovy
However, I cannot find them on the slave node where the Jenkinsfile executes the build steps inside a node {}; there I only see old versions of the Jenkinsfile, since the (separate) checkout step has not been executed yet.
How do I correctly access scmHandler.groovy? What am I missing here?
Actually I find this a neat way to "include" external Groovy files without using a separate library.
Use checkout scm before loading scmHandler.groovy:
checkout scm
def scmHandler = load ("test/scmHandler.groovy")
scmHandler.handleCheckout()

Load pipeline script from remote URL

In a Jenkins pipeline I can load a script from a local file with
load 'dir/my-script.groovy'
Now how can I load a pipeline groovy script from a remote URL?
It would be also nice, if the loading from a remote URL could be done without allocating a node first.
It is actually very well explained in the official pipeline plugin documentation.
Basically it's just:
git 'your-remote-repo'
load 'my-script.groovy'
Or, if your file is not in a Git repo, just use Unix wget in a shell to fetch your file, e.g.:
sh "wget example.org/myscript.groovy"
load 'myscript.groovy'
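Putting the Git variant together inside a node, as a minimal sketch (the repository URL and the doSomething() method are placeholders, and my-script.groovy is assumed to end with return this):
node {
    // fetch the repo that contains the shared script, then load and use it
    git url: 'https://example.org/your-remote-repo.git'
    def myScript = load 'my-script.groovy'
    myScript.doSomething()
}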
Also, I'm not sure what your second question means. You can't run pipeline code outside a node; a node just defines the environment in which your Groovy code runs, and there has to be one (master by default).

Jenkins pipeline unable to read files

I have a simple Jenkinsfile where I want to load some data from the workspace. I am using the pipeline plugin to leverage the Jenkinsfile inside the repository. The build is farmed out to a matching Jenkins agent. When I try to use readFile I get the following message:
java.io.FileNotFoundException: /path/to/jenkins/workspace/XXXXX/project/data.json (No such file or directory)
I also get the same message when trying to load a Groovy file from the workspace.
My Jenkinsfile looks like:
node('master') {
    stage "Start"
    echo "Starting"
    stage "Load File"
    def myJson = readFile "data.json"
}
Any ideas why I can't read these files?
Thanks,
Tim
When Jenkins processes a Jenkinsfile, it does not automatically pull down the entire source repository. You need to execute checkout scm to pull down the contents of the repository; if you fail to do so, no other files will be available to the pipeline script.
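Applied to the Jenkinsfile above, that looks like this sketch:
node('master') {
    stage "Start"
    echo "Starting"
    stage "Load File"
    // without this, only the Jenkinsfile itself is available on the node
    checkout scm
    def myJson = readFile "data.json"
}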
