Load multiple scripts from a shared folder in a pipeline - Jenkins

I would like to share a global repository with a few Python scripts that can be called from a pipeline Jenkinsfile.
I created the global shared library and added @Library('...') _ to the Jenkinsfile. It clones the repo specified, but I don't know how to call the scripts from that shared pipeline folder, or do I have to put the scripts in the resources/ folder?
I haven't been able to find any specifics on this. Some of the scripts in that repo depend on each other. Any help appreciated.

In your Jenkinsfile, load the script with libraryResource (note that libraryResource reads files from the library's resources/ folder, so that is where the script has to live):
script = libraryResource 'my_script.py'
and use it:
sh script
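For reference, a minimal end-to-end sketch, assuming the library is configured in Jenkins under the (hypothetical) name my-shared-library and that my_script.py lives in the library's resources/ folder. Writing the resource into the workspace and calling the interpreter explicitly also lets scripts that depend on each other find one another once they have all been written out:
@Library('my-shared-library') _   // hypothetical library name configured in Jenkins

node {
    // read the script text out of the library's resources/ folder
    def scriptText = libraryResource 'my_script.py'
    // materialize it in the workspace so dependent scripts can find it
    writeFile file: 'my_script.py', text: scriptText
    // run it with an explicit interpreter instead of relying on a shebang
    sh 'python my_script.py'
}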

Related

Jenkins Multibranch Pipeline: script / jenkinsfile as svn external

I have a multibranch pipeline in Jenkins. I want to include my script file (jenkinsfile) as an svn file external in my development branches, to keep the script centralized for all branches. Unfortunately, the scan of the multibranch pipeline isn't able to find the script file, as it only looks inside the declared branch and not in the included svn external locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
    scripts/
        jenkinsfile
    code/
        version1/
            branchX/
            ...
        version11/
            branchY/
            ...
SVN external property for branchX, branchY, etc.
Local path: jenkinsfile
URL: ^/scripts/jenkinsfile
Revision Peg: 12345
Multibranch job configuration:
Subversion
Project Repository Base: http://.../root/code/
Include branches: version1/branchX, version11/branchY
Build configuration
Mode: by Jenkinsfile
Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the Subversion SCM plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I've added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml.)
But Jenkins is still not able to find the script. And in fact, if I put my script directly into e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first to read the script file, and a second because checkout is the first stage in the script itself).
Maybe my whole setup is wrong, or not the ideal way of doing this?
I would be pleased about your help and tips. Thanks and greetings!
If you are working on a Linux or BSD (OSX) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available to all other branches (the file system keeps a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link is:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks.

Share Functions among Jobs

I am using Jenkins Pipeline. I created 4 jobs; each job has some functions, and there is a function duplicated across all of those jobs.
How can I move that redundant function to a shared place so that all of the jobs can call it?
You are looking for a Jenkins shared library.
As the name suggests, you create a library - a pipeline shared among Jenkins jobs - in an SCM (git, svn, ...), and in your project you create a simple Jenkinsfile calling the library.
So every build will check out your project, read the Jenkinsfile, and then check out the library with the pipeline.
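A minimal sketch, assuming a library configured in Jenkins under the (hypothetical) name my-shared-library, with the shared function defined under vars/ (notifyBuild is a made-up example name):
// vars/notifyBuild.groovy in the shared library repository
def call(String status) {
    // the formerly redundant function, now defined once for all jobs
    echo "${env.JOB_NAME} #${env.BUILD_NUMBER}: ${status}"
}

// Jenkinsfile in each of the 4 jobs
@Library('my-shared-library') _   // name as configured under Manage Jenkins
node {
    notifyBuild('SUCCESS')
}
Files under vars/ are exposed as global steps named after the file, which is why each Jenkinsfile can call notifyBuild() directly.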
I did it by:
1. Creating a folder in the Jenkins working home.
2. In that folder, creating a file.groovy that contains the functions I need.
3. Ending that file with:
return this
4. In the Jenkinsfile, adding:
node { shared_functionality = load "FilePath.groovy" }
Step 4 makes the functions in the .groovy file available in your Jenkinsfile, so you can add such node statements to your Jenkinsfiles to include the functions you need.
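A minimal sketch of this approach, with a hypothetical path and function name:
// /var/lib/jenkins/shared/file.groovy (hypothetical location)
def sayHello(String name) {
    echo "Hello, ${name}"
}
return this // required so 'load' returns an object exposing the functions

// Jenkinsfile
node {
    def shared_functionality = load '/var/lib/jenkins/shared/file.groovy'
    shared_functionality.sayHello('world')
}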

Copy file from Jenkins master to slave in Pipeline

I have some Windows slaves in my Jenkins instance, so I need to copy files to them in a pipeline. I have heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
A direct copy doesn't work:
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy a file from the master to a slave in a Jenkins pipeline?
As I understand it, copyFrom is executed on your Windows node, and therefore the source path is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also this example might be helpful.
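A minimal sketch of that pattern, with hypothetical node labels and file names:
node('master') {
    // produce or fetch parameters.xml into this workspace first, then pick it up
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    // recreate the stashed file in this node's workspace
    unstash 'params'
    bat 'type parameters.xml'
}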
The pipeline DSL context runs on the master node even if you write node('someAgentName') in your pipeline. Some options:
- Try stash/unstash, but it is bad for large files.
- Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
- Try an intermediate storage; archive() and sh("wget $url") will be helpful.
If the requirement is to copy an executable to the test slaves and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (a normal Windows shared folder).
After build: the build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this.
stage('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
During test: the test script copies the executable to another directory and runs it.
After test: the post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
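For illustration, the copy step could also be inlined in the pipeline instead of a separate batch file; a sketch, where the share name and paths are assumptions:
stage('Copy to slaves') {
    steps {
        // copy the built executable to the slave's shared folder
        // (\\testslave1\builds and the file names are hypothetical)
        bat 'copy /Y build\\app.exe \\\\testslave1\\builds\\'
    }
}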
I recommend the Pipeline: Phoenix AutoTest plugin.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of the plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin

Groovy script loaded from Jenkinsfile not found

Currently I have an "all-inclusive" Jenkinsfile which contains various functions.
In order to re-use those functions in other Jenkinsfiles, I want to put them into separate Groovy scripts and load them from the Jenkinsfile(s).
scmHandler.groovy:
#!groovy
def handleCheckout() {
    if (env.gitlabMergeRequestId) {
        echo 'Merge request detected. Merging...'
    }
    ...
}
return this;
In the Jenkinsfile I do:
...
def scmHandler = load ("test/scmHandler.groovy")
scmHandler.handleCheckout()
I tried to follow the instructions from here, but Jenkins constantly complains that there is no such file scmHandler.groovy, and I get:
java.io.FileNotFoundException: d:\jenkins\workspace\myJenkinsJob\test\scmHandler.groovy
Both the Jenkinsfile and scmHandler.groovy reside in a test/ subdirectory of the workspace in the git repo of the project to build, and they are checked out correctly on the master:
/var/lib/jenkins/jobs/myJenkinsJob/workspace#script/test/scmHandler.groovy
However, I cannot find them on the slave node where the Jenkinsfile executes the build steps inside a node {}. There I only see old versions of the Jenkinsfile, since the (separate) checkout step has not been executed yet.
How do I correctly access scmHandler.groovy? What am I missing here?
Actually I find this a neat way to "include" external Groovy files without using a separate library.
Use checkout scm before loading scmHandler.groovy, so that the script is present in the workspace of the node executing the load:
checkout scm
def scmHandler = load ("test/scmHandler.groovy")
scmHandler.handleCheckout()

Multi-branch configuration with externally-defined Jenkinsfile

I have an open-source project that resides in GitHub and is built using a build farm, controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store the Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, the Jenkinsfile may include information that is not intrinsic to the project build itself, but rather specific to the build environment instance, which may change.
The way I accomplished this is:
1. Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
2. Tell Jenkins via the Jenkinsfile to call a function defined in a single global library.
3. Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings, for example using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                // 'execute' is presumably a custom step from the same library;
                // the built-in 'sh' step would work here as well
                execute 'python build.py'
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much a part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "Pipeline script from SCM" option and then check out the actual code, but this will force you to manually take care of the branch builds.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm" repository
    flow.runFlow()
}
Where the pipeline.groovy file containing the actual pipeline would look like this:
def runFlow() {
    // your pipeline code
}
// Has to end with 'return this;' in order to be used as a library
return this;
