Run external Jenkinsfile in another Jenkinsfile - jenkins

Let's say I have a 'global' Jenkinsfile stored in a separate git repo, where I've defined all possible stages that any of my pipelines might want to use. Some of those steps are inside if statements so they can be skipped if needed.
Is there any way to create a dedicated Jenkinsfile in my project that includes this global Jenkinsfile and passes some parameters?
Thanks in advance.

The ability to call certain kinds of methods in your pipeline is provided by Jenkins Shared Libraries. So the best way to implement these conditional pipeline steps would be to define your stages in Closures and then call these Closures as required in your Jenkinsfile.
Example Closure defined in your Shared Library:
//Closure which defines Groovy or Jenkins Pipeline DSL to be executed
Closure javaBuildStage = {
    stage('Build Java') {
        echo "This is the build stage for Java apps"
        sh("./mvn clean package")
    }
}
Example Jenkinsfile:
@Library('YourSharedLibrary@master') _

if (project == "java") {
    javaBuildStage()
}
You can decentralize all your functions with Jenkins Shared Libraries. The Jenkins Shared Libraries live in a Git repository.
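Note that for a step like javaBuildStage() to be callable directly from a Jenkinsfile, it is usually exposed as a global variable in the library's vars/ directory. A minimal sketch (the file name and the call() wrapper are an assumption, not part of the original answer):
// vars/javaBuildStage.groovy in the shared library repository
// exposes the stage so a Jenkinsfile can simply call javaBuildStage()
def call() {
    stage('Build Java') {
        echo "This is the build stage for Java apps"
        sh "./mvn clean package"
    }
}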

To trigger the same Jenkinsfile for all repositories, you can use the Remote File Plugin.
For details you can check this answer:
https://stackoverflow.com/a/58877133/6110485

Related

Jenkins: Access job/plugin configuration values inside pipeline

I am trying to access the values set on a job's configuration page from within my pipeline. These values are not made available as params, nor are they injected as envvars.
Setup
Jenkins, v2.263.1
GitLab Branch Source plugin, v1.5.3 (link)
Multibranch pipeline job which is pointed to a Gitlab repo
Remote Jenkinsfile Provider, v1.13 (link)
Problem
Ordinarily, one would have a Jenkinsfile in the root of the repo and therefore the scm would be associated with the repo we want to checkout and build. However, in my case the code I want to build is in a different repo to the Jenkinsfile (hence the Remote Jenkinsfile Provider plugin).
This means that I need to checkout the code I wish to build as an explicit step in the pipeline, and to do that I need to know the repo. This repo is, however, already defined in the job config.
The Branch Source plugin does export things like the branch name or merge request number/branch/target into appropriate envvars, but NOT the actual repo.
As this is a multibranch pipeline, I cannot use something like envInject either (multibranch jobs do not provide the option to 'Prepare an environment for the run' as with other jobs)
Goal
I would like to be able to access the server, owner and project fields set in the job config page. Ultimately I could manage with just the project's ssh/http address even.
Is there some clever way of accessing a job's config from within the pipeline?
Thanks for any suggestions!
With the GitLab Branch Source plugin (see its documentation) you get a lot more information than with the normal branch source plugin. There are environment variables for the project, like GITLAB_PROJECT_GIT_SSH_URL/GITLAB_PROJECT_GIT_HTTPS_URL for the git source, and many more. So far I did not see one for the server, but that would be parseable out of the URLs.
With this information, it should be fairly easy to check out the repository and build it.
As became clear during the discussion, the pipeline also needs to be triggered manually, and this is normally possible with parameters as well (not sure about the Remote File plugin). I assume your Jenkinsfile is a Groovy script, which opens up a lot of possibilities. You can define variables and use some logic to determine whether the env variable or the parameter is used.
pipeline {
    agent any
    parameters {
        string(name: 'projectUrl', defaultValue: "")
    }
    stages {
        stage('Prepare') {
            steps {
                script {
                    // Prefer the URL injected by the GitLab Branch Source plugin,
                    // fall back to the manually supplied parameter
                    def projectUrl = env.GITLAB_PROJECT_GIT_SSH_URL ?: params.projectUrl
                    // DO Checkout with projectUrl
                }
            }
        }
    }
}
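For the actual checkout, a minimal sketch of what could replace the comment above (the credentialsId is a placeholder, not something taken from the job config):
// inside the script block, after resolving projectUrl
git url: projectUrl,
    branch: env.GITLAB_PROJECT_DEFAULT_BRANCH ?: 'master',
    credentialsId: 'gitlab-ssh-key'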
The only critical thing you have to take into account is that the multibranch pipeline has to run once for each branch or MR, so that the variables are detected. Afterwards you can easily trigger it manually by providing your values.
This allows you to utilize webhooks for automatic actions, and also allows you to trigger the build manually whenever you like.
Sidenote: if you use a centralized Jenkinsfile to reduce duplication, you might also want to check out Shared Libraries for Jenkins.
For completeness, here is a list of all current environment variables added by the Jenkins GitLab Branch Source plugin version 1.5.3 (and only for push events, but they are pretty similar for the other event types too):
GITLAB_OBJECT_KIND
GITLAB_AFTER
GITLAB_BEFORE
GITLAB_REF
GITLAB_CHECKOUT_SHA
GITLAB_USER_ID
GITLAB_USER_NAME
GITLAB_USER_EMAIL
GITLAB_PROJECT_ID
GITLAB_PROJECT_ID_2
GITLAB_PROJECT_NAME
GITLAB_PROJECT_DESCRIPTION
GITLAB_PROJECT_WEB_URL
GITLAB_PROJECT_AVATAR_URL
GITLAB_PROJECT_GIT_SSH_URL
GITLAB_PROJECT_GIT_HTTP_URL
GITLAB_PROJECT_NAMESPACE
GITLAB_PROJECT_VISIBILITY_LEVEL
GITLAB_PROJECT_PATH_NAMESPACE
GITLAB_PROJECT_CI_CONFIG_PATH
GITLAB_PROJECT_DEFAULT_BRANCH
GITLAB_PROJECT_HOMEPAGE
GITLAB_PROJECT_URL
GITLAB_PROJECT_SSH_URL
GITLAB_PROJECT_HTTP_URL
GITLAB_REPO_NAME
GITLAB_REPO_URL
GITLAB_REPO_DESCRIPTION
GITLAB_REPO_HOMEPAGE
GITLAB_REPO_GIT_SSH_URL
GITLAB_REPO_GIT_HTTP_URL
GITLAB_REPO_VISIBILITY_LEVEL
GITLAB_COMMIT_COUNT
GITLAB_COMMIT_ID_#
GITLAB_COMMIT_MESSAGE_#
GITLAB_COMMIT_TIMESTAMP_#
GITLAB_COMMIT_URL_#
GITLAB_COMMIT_AUTHOR_AVATAR_URL_#
GITLAB_COMMIT_AUTHOR_CREATED_AT_#
GITLAB_COMMIT_AUTHOR_EMAIL_#
GITLAB_COMMIT_AUTHOR_ID_#
GITLAB_COMMIT_AUTHOR_NAME_#
GITLAB_COMMIT_AUTHOR_STATE_#
GITLAB_COMMIT_AUTHOR_USERNAME_#
GITLAB_COMMIT_AUTHOR_WEB_URL_#
GITLAB_COMMIT_ADDED_#
GITLAB_COMMIT_MODIFIED_#
GITLAB_COMMIT_REMOVED_#
GITLAB_REQUEST_URL
GITLAB_REQUEST_STRING
GITLAB_REQUEST_TOKEN
GITLAB_REFS_HEAD

How to have modular Jenkins Pipeline?

I would like to create a Jenkins declarative pipeline and would like to structure it as follows:
mainPipeline.groovy
stage1.groovy
stage2.groovy
stage3.groovy
mainPipeline looks like the following:
pipeline {
    stages {
        stage('stage1') {
            // Call method from the file Stage1.groovy
        }
        stage('stage2') {
            // Call method from the file Stage2.groovy
        }
    }
}
I have two main questions:
How do I link these files to a Library?
How do I configure the Jenkins pipeline so that Jenkins not only knows the main Jenkinsfile (mainPipeline) but also the submodules?
I would not recommend separating your Jenkinsfile into separate files, since there are better options:
You can execute jobs within your pipeline with the Pipeline: Build Step plugin. Use this to execute stages that are going to be used by multiple jobs. For example, I use this to deploy my applications in a common deploy job.
You can extend Jenkins with your own libraries, which you can load per job or for all jobs. See: Extending with Shared Libraries
For both methods the defining Jenkinsfiles/Groovy scripts can come from SCM.
If you really want to load scripts from the project path then check this question. If you want to use multiple Jenkinsfiles from the project path, you can just add more Jenkinsfiles as "Project Recognizers" when you configure the job.
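If you go the load route, a minimal sketch of how mainPipeline could pull in stage1.groovy (the runStage() method name is an assumption, not part of the question):
// stage1.groovy - must end with 'return this' so the loaded script can be used
def runStage() {
    echo 'Running stage 1'
}
return this

// mainPipeline (the Jenkinsfile of the job)
pipeline {
    agent any
    stages {
        stage('stage1') {
            steps {
                script {
                    // 'load' needs the file to be present in the workspace (checked out from SCM)
                    def stage1 = load 'stage1.groovy'
                    stage1.runStage()
                }
            }
        }
    }
}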

How to manage multiple Jenkins pipelines from a single repository?

At this moment we use JJB to compile Jenkins jobs (mostly pipelines already) in order to configure about 700 jobs, but JJB2 does not seem to scale well for building pipelines and I am looking for a way to drop it from the equation.
Mainly I would like to be able to have all these pipelines stored in a single centralized repository.
Please note that keeping the CI config (Jenkinsfile) inside each repository and branch is not possible in our use case; we need to keep all pipelines in a single "jenkins-jobs.git" repo.
As far as I know this is not possible yet, but it is in progress. See: https://issues.jenkins-ci.org/browse/JENKINS-43749
I think this is the purpose of Jenkins shared libraries.
I didn't develop such a library myself but I am using some. Basically:
Develop the "shared code" of the Jenkins pipeline in a shared library
it can contain the whole pipeline (sequence of steps)
Add this library to the Jenkins server
In each project, add a Jenkinsfile that "imports" those using @Library
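A minimal sketch of such a project Jenkinsfile (library name and step name are assumptions; they depend on what your shared library actually defines):
@Library('jenkins-jobs-lib') _

// the whole pipeline is delegated to a global step defined in the library's vars/ directory
standardPipeline()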
As @Juh_ said, you can use Jenkins shared libraries. Here are the complete steps. Suppose that we have three branches:
master
develop
stage
and we want to create a single Jenkins file so that we can change it in only one place. All you need is to create a new branch, e.g. common. This branch MUST have the standard shared-library structure (see the sketch below). What we are interested in for now is adding a new Groovy file in the vars directory, e.g. common.groovy. Here we can put the common Jenkins file that you wish to be used across all branches.
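For reference, a minimal sketch of that layout (only the vars directory is strictly required for this example; the rest is the conventional shared-library structure):
(root of the common branch)
├── vars/
│   └── common.groovy    // global step, callable as common() from a Jenkinsfile
├── src/                 // optional: Groovy classes
└── resources/           // optional: files loadable with libraryResource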
Here is a sample:
def call() {
    node {
        stage("Install Stage from common file") {
            if (env.BRANCH_NAME.equals('master')) {
                echo "npm install from common files master branch"
            }
            else if (env.BRANCH_NAME.equals('develop')) {
                echo "npm install from common files develop branch"
            }
        }
        stage("Test") {
            echo "npm test from common files"
        }
    }
}
You must wrap your code in a call function in order for it to be usable from other branches. Now that we have finished the work in the common branch, we need to use it in our branches. Go to any branch in which you wish to use this pipeline, e.g. master, create a Jenkinsfile, and put this one line of code:
common()
This will call the common function that you created earlier in the common branch and will execute the pipeline.
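Note that this one-liner assumes the common branch has been configured as a global shared library with "Load implicitly" enabled; otherwise the Jenkinsfile would also need to import it explicitly, roughly like this (the library name is an assumption from your Jenkins configuration):
@Library('common@common') _

common()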

Jenkins Shared Libraries context

I have a pipeline job which loads its Jenkinsfile from a git repository. My Jenkinsfile looks like this:
#!groovy
@Library('global-utils-lib') _

node("mvn") {
    stage('build') {
        checkout scm
    }
    stage('merge-request') {
        mergeRequest()
    }
}
global-utils-lib is a shared library loaded in Global Pipeline Libraries from another git repo with the following structure:
vars/mergeRequest.groovy
mergeRequest.groovy:
def call() {
    sh "ip addr"
    def workspacePath = env.WORKSPACE
    new File(workspacePath + "/file.txt").text
}
The job is run against a docker container (docker plugin).
When I run this job the docker container is provisioned correctly and the scm is downloaded, but I get a FileNotFoundException.
It looks like the code from the shared library is executed against the Jenkins master, not the slave:
the presented IP comes from the master
the file is loaded correctly when I pass the correct path to the scm on the master
How can I run library code against the slave? What am I missing?
It's generally not a good idea to try and do things like new File() instead of using existing Pipeline steps.
Your Pipeline script is interpreted and executed by the Jenkins master so, as you're seeing, the attempt to use the File API doesn't work as you might expect.
Sticking to Pipeline steps helps ensure that your pipeline is durable (i.e. survives restarts), is pausable, and doesn't block the execution thread, preventing parallel steps from working, for example.
In this case, the existing readFile step can be used.
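A minimal sketch of the library step rewritten with Pipeline steps (keeping the original behaviour of reading file.txt from the workspace; echoing the content is an assumption about what should happen with it):
// vars/mergeRequest.groovy
def call() {
    sh "ip addr"
    // readFile runs on the agent that executes the surrounding node block,
    // unlike new File(), which is evaluated on the Jenkins master
    def content = readFile 'file.txt'
    echo content
}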
I don't know how well the Docker Plugin interacts with Pipeline (though I imagine it should be transparent), and without knowing which agents have the "mvn" label, or whether you can reproduce this outside of a shared library, it's unclear why your sh step would appear to be running on the master.
The Docker Pipeline Plugin is explicitly designed for Pipeline, so it might give better results.
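For reference, a minimal sketch of how the Docker Pipeline plugin is typically used from a Jenkinsfile (the image name is an assumption):
node('mvn') {
    checkout scm
    // runs the enclosed steps inside the container, on the same agent
    docker.image('maven:3-eclipse-temurin-17').inside {
        sh 'mvn -B clean package'
    }
}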

Multi-branch configuration with externally-defined Jenkinsfile

I have an open-source project, that resides in GitHub and is built using a build farm, controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store the Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, there is information that the Jenkinsfile would include that is not intrinsic to the project build itself, but rather is specific to the build environment instance, which may change.
The way I accomplished this is:
Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
Tell Jenkins via the Jenkinsfile to call a function defined in a single global library
Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings. For example, using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                sh 'python build.py'
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "from SCM" option and then check out the actual code, but this will force you to manually take care of the branch builds.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm repository"
    flow.runFlow()
}
Where the pipeline.groovy file, which contains the actual pipeline, would look like this:
def runFlow() {
    // your pipeline code
}

// Has to exit with 'return this;' in order to be used as a library
return this;
