Multi-branch configuration with externally-defined Jenkinsfile - jenkins

I have an open-source project that resides on GitHub and is built using a build farm controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store the Jenkinsfile inside the code. Is there a way to accomplish this?

I have encountered the same issue. While the idea of having the build process as part of the code is good, a Jenkinsfile often includes information that is not intrinsic to the project build itself, but rather specific to the build environment instance, which may change.
The way I accomplished this is:
1. Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
2. Tell Jenkins, via the Jenkinsfile, to call a function defined in a single global library.
3. Define the global Jenkins build function to call the build script (e.g. build.py) with the appropriate environment settings, for example using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                sh 'python build.py'
            }
        }
    }
}

First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much a part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "Pipeline script from SCM" option and then check out the actual code, but this will force you to take care of the branch builds manually.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage('Checkout') {
        checkout scm // shorthand for checking out the "from SCM" repository
    }
    flow.runFlow()
}
The pipeline.groovy file containing the actual pipeline would look like this:
def runFlow() {
    // your pipeline code
}
// Has to end with 'return this;' in order to be usable as a library
return this;

Related

How to have modular Jenkins Pipeline?

I would like to create a Jenkins declarative pipeline with the following structure:
mainPipeline.groovy
stage1.groovy
stage2.groovy
stage3.groovy
mainPipeline looks like the following:
pipeline {
    stages {
        stage('stage1') {
            // Call method from the file stage1.groovy
        }
        stage('stage2') {
            // Call method from the file stage2.groovy
        }
    }
}
I have two main questions:
How do I link these files to a Library?
How do I configure Jenkins Pipeline, so that Jenkins not only knows the main JenkinsFile which is mainPipeline but also the submodules?
I would not recommend separating your Jenkinsfile into separate files, since there are better options:
You can execute jobs within your pipeline with the Pipeline: Build Step plugin. Use this to execute stages that are going to be used by multiple jobs. For example, I use this to deploy my applications in a common deploy job.
You can extend Jenkins with your own libraries, which you can load per job or for all jobs. See: Extending with Shared Libraries
For both methods the defining Jenkinsfiles/Groovy scripts can come from SCM.
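As a rough sketch of the first option, a pipeline can trigger a shared downstream job with the build step. The job name and parameters below are placeholders for illustration, not real ones from the question:

```groovy
// Hypothetical example: trigger a shared deploy job from a pipeline.
// 'common-deploy' and the parameter names are made-up placeholders.
build job: 'common-deploy',
    parameters: [
        string(name: 'APP_NAME', value: 'my-app'),
        string(name: 'TARGET_ENV', value: 'staging')
    ],
    wait: true // block until the downstream job finishes
```

This way the deploy logic lives in one job, and every project pipeline only passes its own parameters.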
If you really want to load a script from the project path, then check this question. If you want to use multiple Jenkinsfiles from the project path, you can just add more Jenkinsfiles as "Project Recognizers" when you configure the job.

Run external jenkinsfile in another jenkinsfile

Let's say I have a 'global' Jenkinsfile stored in a separate git repo where I've defined all possible stages that any of my pipelines might want to use. Some of those stages are inside if statements, to give the possibility to skip them if needed.
Is there any way to create a dedicated Jenkinsfile in my project that includes this global Jenkinsfile and passes some parameters?
Thanks in advance.
The ability to call such methods in your pipeline is provided by Jenkins Shared Libraries. So the best way to implement these conditional pipeline steps would be to define your stages in Closures and then call these Closures as required in your Jenkinsfile.
Example Closure defined in your Shared Library:
// Closure which defines Groovy or Jenkins Pipeline DSL to be executed
Closure javaBuildStage = {
    stage('Build Java') {
        echo "This is the build stage for Java apps"
        sh("./mvn clean package")
    }
}
Example Jenkinsfile:
@Library('YourSharedLibrary@master') _
if (project == "java") {
    javaBuildStage()
}
You can centralize all your functions with Jenkins Shared Libraries. The Jenkins Shared Libraries are located in a Git repository.
To trigger the same Jenkinsfile for all repositories, you can use the Remote File Plugin.
For details you can check this answer:
https://stackoverflow.com/a/58877133/6110485

groovy script loaded from jenkinsfile not found

Currently I have an "all-inclusive" Jenkinsfile which contains various functions.
In order to re-use those functions in other Jenkinsfiles, I want to put them into separate Groovy scripts and load them from the Jenkinsfile(s).
scmHandler.groovy:
#!groovy
def handleCheckout() {
    if (env.gitlabMergeRequestId) {
        echo 'Merge request detected. Merging...'
    }
    ...
}
return this;
In the Jenkinsfile I do:
...
def scmHandler = load("test/scmHandler.groovy")
scmHandler.handleCheckout()
I tried to follow the instructions from here but Jenkins is constantly complaining that there is no such file scmHandler.groovy, and I get:
java.io.FileNotFoundException: d:\jenkins\workspace\myJenkinsJob\test\scmHandler.groovy
Both the Jenkinsfile and scmHandler.groovy reside in a test/ subdir of the workspace in the git repo of the project to build, and they are checked out correctly on the master:
/var/lib/jenkins/jobs/myJenkinsJob/workspace#script/test/scmHandler.groovy
However, I cannot find them on the slave node where the Jenkinsfile executes the build steps inside a node {}. There I only see old versions of the Jenkinsfile, since the (separate) checkout step has not been executed yet.
How do I correctly access scmHandler.groovy? What am I missing here?
Actually I find this a neat way to "include" external Groovy files without using a separate library.
Use checkout scm before loading scmHandler.groovy:
checkout scm
def scmHandler = load("test/scmHandler.groovy")
scmHandler.handleCheckout()

How to manage multiple Jenkins pipelines from a single repository?

At the moment we use JJB to compile Jenkins jobs (mostly pipelines already) in order to configure about 700 jobs, but JJB2 does not seem to scale well for building pipelines, and I am looking for a way to drop it from the equation.
Mainly I would like to be able to have all these pipelines stored in a single centralized repository.
Please note that keeping the CI config (Jenkinsfile) inside each repository and branch is not possible in our use case; we need to keep all pipelines in a single "jenkins-jobs.git" repo.
As far as I know this is not possible yet, but in progress. See: https://issues.jenkins-ci.org/browse/JENKINS-43749
I think this is the purpose of Jenkins Shared Libraries.
I didn't develop such a library myself, but I am using some. Basically:
Develop the "shared code" of the Jenkins pipeline in a shared library
it can contain the whole pipeline (sequence of steps)
Add this library to the Jenkins server
In each project, add a Jenkinsfile that "imports" those using @Library
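The steps above can be sketched as a minimal project Jenkinsfile. Assume the shared library is registered on the Jenkins server under the name 'my-shared-lib' and exposes a global step 'standardPipeline' from its vars/ directory (both names are made up for illustration):

```groovy
// Hypothetical: 'my-shared-lib' and 'standardPipeline' are placeholder names.
@Library('my-shared-lib@master') _

// The whole pipeline lives in vars/standardPipeline.groovy of the library;
// the project Jenkinsfile only supplies its project-specific parameters.
standardPipeline(buildTool: 'maven', deploy: false)
```

With this layout, changing the pipeline for all projects means editing only the shared library repository.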
As @Juh_ said, you can use Jenkins Shared Libraries. Here are the complete steps. Suppose that we have three branches:
master
develop
stage
and we want to create a single Jenkinsfile so that we can make changes in only one place. All you need is to create a new branch, e.g. common. This branch MUST have this structure. What we are interested in for now is adding a new Groovy file in the vars directory, e.g. common.groovy. Here we can put the common Jenkinsfile that you wish to be used across all branches.
Here is a sample:
def call() {
    node {
        stage("Install Stage from common file") {
            if (env.BRANCH_NAME.equals('master')) {
                echo "npm install from common files master branch"
            }
            else if (env.BRANCH_NAME.equals('develop')) {
                echo "npm install from common files develop branch"
            }
        }
        stage("Test") {
            echo "npm test from common files"
        }
    }
}
You must wrap your code in a call function in order for it to be usable from other branches. Now that we have finished the work in the common branch, we need to use it in our branches. Go to any branch where you wish to use this pipeline, e.g. master, and create a Jenkinsfile containing just this one line of code:
common()
This will call the common function that you created before in the common branch and will execute the pipeline.

Template Jenkinsfile for all project

For example:
I have 30 multibranch Git projects. All projects have their own Jenkinsfile. Suddenly I find a cool Jenkins plugin which I want to add to all projects. It is painful to do this in all 30 projects, and a big waste of time.
Is it possible to create something like a template Jenkinsfile which would act as a wrapper for the project Jenkinsfiles, or something similar that gives me the possibility to make changes in 1 place instead of 30?
What I want is something like that:
stage {
    ...
}
timestamps {
    <include rest of stages defined in projects>
}
stage {
    ...
}
The template file, which lives in some repo, would look like this. All of the projects have their own stages defined, which are included in the middle of the template Jenkinsfile.
So the Jenkinsfile in each project must:
load the template from the repository
put its stages in the middle of the template Jenkinsfile
You could use:
the Config File Provider Plugin, using the configuration files in Jenkins Pipelines:
node {
    ...
    configFileProvider(
        [configFile(fileId: 'jenkinsfile-template', ...)]) {
        ...
    }
    ...
}
load: Evaluate a Groovy source file into the Pipeline script:
Takes a filename in the workspace and runs it as Groovy source text.
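A minimal sketch of the load approach, where 'template.groovy' and 'runTemplate' are made-up names for illustration:

```groovy
// Hypothetical: 'template.groovy' and 'runTemplate' are placeholder names.
node {
    checkout scm // the file must exist in the workspace before load
    def template = load 'template.groovy' // the script must end with 'return this'
    template.runTemplate()
}
```
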
Shared Libraries:
Oftentimes it is useful to share parts of Pipelines between various projects to reduce redundancies and keep code "DRY".
Pipeline has support for creating "Shared Libraries" which can be defined in external source control repositories and loaded into existing Pipelines.
