Jenkins multibranch pipeline only for subfolder - jenkins

I have a Git monorepo with different apps. Currently I have a single Jenkinsfile in the root folder that contains the pipeline for all apps. It is very time consuming to execute the full pipeline for all apps when a commit changes only one app.
We use a GitFlow-like approach to branching, so Multibranch Pipeline jobs in Jenkins are a perfect fit for our project.
I'm looking for a way to have several jobs in Jenkins, each one triggered only when the code of the appropriate application has changed.
Perfect solution for me looks like this:
I have several Multibranch Pipeline jobs in Jenkins. Each one looks for changes only in a given directory and its subdirectories, and each one uses its own Jenkinsfile. Jobs poll Git every X minutes; if there are changes to the appropriate directories in existing branches, a build is initiated; if there are new branches with changes to the appropriate directories, a build is initiated.
What stops me from this implementation
I'm missing a way to define which folders' commits should be ignored during scan execution by the Multibranch Pipeline. The "Additional Behaviours" section of a Multibranch Pipeline doesn't have the "Polling ignores commits in certain paths" option that Pipeline or Freestyle jobs have. But I want to use a Multibranch Pipeline.
The solution described here doesn't work for me: if there is a new branch with changes only to "project1", then whenever the Multibranch Pipeline for "project2" is triggered, it will discover this new branch anyway and build it. That means each of my Multibranch Pipelines will be executed at least once for every new branch, no matter whether the appropriate code was changed or not.
I'd appreciate any help or suggestions on how I can implement a few Multibranch Pipelines watching the same Git repository but triggered only when the appropriate pieces of code change.

This can be accomplished by using the Multibranch build strategy extension plugin. With this plugin, you can define a rule where the build only initiates when the changes belong to a sub-directory.
Install the plugin
On the Multibranch pipeline configuration, add a Build strategy
Select Build included regions strategy
Put a sub-folder in the field, such as subfolder/**
This way the changes will still be discovered, but they won't initiate a build if they don't belong to a certain set of files or folders.
This is the best approach I'm aware of so far. But I think the ideal solution would be one where such changes don't even get discovered.
Edit: Gerrit Code Review plugin configuration
In case you're using the Gerrit Code Review plugin, you can also prevent new changes from being discovered by using a custom query:

I solved this by creating a project that builds other projects depending on the files changed. For example, from your repo root:
/Jenkinsfile
#!/usr/bin/env groovy
pipeline {
    agent any
    options {
        timestamps()
    }
    triggers {
        bitbucketPush()
    }
    stages {
        stage('Build project A') {
            when {
                changeset "project-a/**"
            }
            steps {
                build 'project-a'
            }
        }
        stage('Build project B') {
            when {
                changeset "project-b/**"
            }
            steps {
                build 'project-b'
            }
        }
    }
}
You would then have other Pipeline projects with their own Jenkinsfile (i.e., project-a/Jenkinsfile).
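For illustration, a downstream project-a/Jenkinsfile could be as small as the following sketch; the dir() path matches the layout above, while the make build command is just a placeholder for whatever actually builds project A:
#!/usr/bin/env groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // only this application's sub-directory is relevant here
                dir('project-a') {
                    sh 'make build' // placeholder build command
                }
            }
        }
    }
}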

I know that this post is quite old, but I solved this problem by changing the "include branches" parameter for SVN repositories (this can possibly also be done using the property "Filter by name (with wildcards)" for git repos). Instead of supplying only the actual branch name, I also included the subfolder. So instead of only supplying "trunk", I used "trunk/subfolder". This limits scanning to only that specific directory. Note that I have not yet fully tested this solution.

Related

Jenkins: Access job/plugin configuration values inside pipeline

I am trying to access the values set on a job's configuration page from within my pipeline. These values are not made available as params, nor are they injected as envvars.
Setup
Jenkins, v2.263.1
GitLab Branch Source plugin, v1.5.3 (link)
Multibranch pipeline job which is pointed to a Gitlab repo
Remote Jenkinsfile Provider, v1.13 (link)
Problem
Ordinarily, one would have a Jenkinsfile in the root of the repo and therefore the scm would be associated with the repo we want to checkout and build. However, in my case the code I want to build is in a different repo to the Jenkinsfile (hence the Remote Jenkinsfile Provider plugin).
This means that I need to checkout the code I wish to build as an explicit step in the pipeline, and to do that I need to know the repo. This repo is, however, already defined in the job config.
The Branch Source plugin does export things like the branch name or merge request number/branch/target into appropriate envvars, but NOT the actual repo.
As this is a multibranch pipeline, I cannot use something like envInject either (multibranch jobs do not provide the option to 'Prepare an environment for the run' as with other jobs)
Goal
I would like to be able to access the server, owner and project fields set in the job config page. Ultimately I could manage with just the project's ssh/http address even.
Is there some clever way of accessing a job's config from within the pipeline?
Thanks for any suggestions!
Reference images
Within the GitLab Branch Source plugin (and its documentation) you have a lot more information than with the normal branch source plugin. There are environment variables for the project, such as GITLAB_PROJECT_GIT_SSH_URL / GITLAB_PROJECT_GIT_HTTP_URL for the Git source, and many more. So far I did not see one for the server, but that would be parseable out of the URLs.
With this information, it should be fairly easy to check out the repository and build it.
As it became clear during the process that the pipeline also needs to be triggered manually, note that this is normally also possible with parameters (not sure about the Remote File plugin). Your Jenkinsfile is a Groovy script, which opens up a lot of possibilities: you can define variables and use some logic to determine whether the environment variable or the parameter is used.
pipeline {
    agent any
    parameters {
        string(name: 'projectUrl', defaultValue: "")
    }
    stages {
        stage('Prepare') {
            steps {
                script {
                    // Prefer the env var injected by the GitLab Branch Source plugin,
                    // fall back to the manually supplied parameter.
                    def projectUrl = env.GITLAB_PROJECT_GIT_SSH_URL ?: params.projectUrl
                    // Do the checkout with projectUrl
                }
            }
        }
    }
}
The only critical thing you have to take into account is that the multibranch pipeline has to run once for each branch or MR so that they detect the variables. Afterwards you can easily trigger it manually by providing your values.
This allows you to utilize webhooks for automatic actions, and also allows you to trigger the build manually whenever you like.
Sidenote: if you use a centralized Jenkinsfile to reduce duplication, you might also want to check out Shared Libraries for Jenkins.
For completeness, here is a list of all current environment variables added by the Jenkins GitLab Branch Source plugin version 1.5.3 (for push events only - but they are pretty similar for the other event types too):
GITLAB_OBJECT_KIND
GITLAB_AFTER
GITLAB_BEFORE
GITLAB_REF
GITLAB_CHECKOUT_SHA
GITLAB_USER_ID
GITLAB_USER_NAME
GITLAB_USER_EMAIL
GITLAB_PROJECT_ID
GITLAB_PROJECT_ID_2
GITLAB_PROJECT_NAME
GITLAB_PROJECT_DESCRIPTION
GITLAB_PROJECT_WEB_URL
GITLAB_PROJECT_AVATAR_URL
GITLAB_PROJECT_GIT_SSH_URL
GITLAB_PROJECT_GIT_HTTP_URL
GITLAB_PROJECT_NAMESPACE
GITLAB_PROJECT_VISIBILITY_LEVEL
GITLAB_PROJECT_PATH_NAMESPACE
GITLAB_PROJECT_CI_CONFIG_PATH
GITLAB_PROJECT_DEFAULT_BRANCH
GITLAB_PROJECT_HOMEPAGE
GITLAB_PROJECT_URL
GITLAB_PROJECT_SSH_URL
GITLAB_PROJECT_HTTP_URL
GITLAB_REPO_NAME
GITLAB_REPO_URL
GITLAB_REPO_DESCRIPTION
GITLAB_REPO_HOMEPAGE
GITLAB_REPO_GIT_SSH_URL
GITLAB_REPO_GIT_HTTP_URL
GITLAB_REPO_VISIBILITY_LEVEL
GITLAB_COMMIT_COUNT
GITLAB_COMMIT_ID_#
GITLAB_COMMIT_MESSAGE_#
GITLAB_COMMIT_TIMESTAMP_#
GITLAB_COMMIT_URL_#
GITLAB_COMMIT_AUTHOR_AVATAR_URL_#
GITLAB_COMMIT_AUTHOR_CREATED_AT_#
GITLAB_COMMIT_AUTHOR_EMAIL_#
GITLAB_COMMIT_AUTHOR_ID_#
GITLAB_COMMIT_AUTHOR_NAME_#
GITLAB_COMMIT_AUTHOR_STATE_#
GITLAB_COMMIT_AUTHOR_USERNAME_#
GITLAB_COMMIT_AUTHOR_WEB_URL_#
GITLAB_COMMIT_ADDED_#
GITLAB_COMMIT_MODIFIED_#
GITLAB_COMMIT_REMOVED_#
GITLAB_REQUEST_URL
GITLAB_REQUEST_STRING
GITLAB_REQUEST_TOKEN
GITLAB_REFS_HEAD
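For illustration only, the indexed variables (the '#' suffix) can be read from a pipeline script with dynamic property access on env; this sketch assumes the index is 1-based:
// Iterate over the per-commit variables of a push event
def count = (env.GITLAB_COMMIT_COUNT ?: '0') as Integer
for (int i = 1; i <= count; i++) {
    def id  = env."GITLAB_COMMIT_ID_${i}"        // dynamic lookup, e.g. GITLAB_COMMIT_ID_1
    def msg = env."GITLAB_COMMIT_MESSAGE_${i}"
    echo "Commit ${id}: ${msg}"
}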

How to manage multiple Jenkins pipelines from a single repository?

At this moment we use JJB to compile Jenkins jobs (mostly pipelines already) in order to configure about 700 jobs, but JJB2 seems not to scale well for building pipelines and I am looking for a way to drop it from the equation.
Mainly, I would like to be able to have all these pipelines stored in a single centralized repository.
Please note that keeping the CI config (Jenkinsfile) inside each repository and branch is not possible in our use case; we need to keep all pipelines in a single "jenkins-jobs.git" repo.
As far as I know this is not possible yet, but it is in progress. See: https://issues.jenkins-ci.org/browse/JENKINS-43749
I think this is the purpose of Jenkins shared libraries.
I didn't develop such a library myself, but I am using some. Basically:
Develop the "shared code" of the Jenkins pipeline in a shared library
it can contain the whole pipeline (sequence of steps)
Add this library to the Jenkins server
In each project, add a Jenkinsfile that imports it using @Library, as sketched below
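A consuming Jenkinsfile can then be as small as the sketch below; the library name my-shared-lib and the step name buildAndTest are placeholders, and the library must be registered under Manage Jenkins > Configure System > Global Pipeline Libraries:
#!/usr/bin/env groovy
@Library('my-shared-lib') _   // hypothetical library name
buildAndTest()                // hypothetical step defined in vars/buildAndTest.groovy of the library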
As @Juh_ said, you can use Jenkins shared libraries. Here are the complete steps. Suppose that we have three branches:
master
develop
stage
and we want to create a single Jenkinsfile so that we can make changes in only one place. All you need is to create a new branch, e.g. common. This branch MUST have the standard shared-library directory structure. What we are interested in for now is adding a new Groovy file in the vars directory, e.g. common.groovy. Here we can put the common pipeline code that you wish to be used across all branches.
Here is a sample:
def call() {
    node {
        stage("Install Stage from common file") {
            if (env.BRANCH_NAME.equals('master')) {
                echo "npm install from common files master branch"
            }
            else if (env.BRANCH_NAME.equals('develop')) {
                echo "npm install from common files develop branch"
            }
        }
        stage("Test") {
            echo "npm test from common files"
        }
    }
}
You must wrap your code in a call function so it can be used from other branches. Now that we have finished the work in the common branch, we need to use it in our branches. Go to any branch you wish to use this pipeline in, e.g. master, create a Jenkinsfile, and put in this one line of code:
common()
This will call the common function that you created before in the common branch and will execute the pipeline.

Jenkins: how to trigger pipeline on git tag

We want to use Jenkins to generate releases/deployments on specific project milestones. Is it possible to trigger a Jenkins Pipeline (defined in a Jenkinsfile or Groovy script) when a tag is pushed to a Git repository?
We host a private Gitlab server, so Github solutions are not applicable to our case.
This is currently something that is sorely lacking in the pipeline / multibranch workflow. See a ticket around this here: https://issues.jenkins-ci.org/browse/JENKINS-34395
If you're not opposed to using release branches instead of tags, you might find that to be easier. For example, if you decided that all branches that start with release- are to be treated as "release branches", you can go...
if (env.BRANCH_NAME.startsWith("release-")) {
    // groovy code on release goes here
}
And if you need to use the name that comes after release-, such as release-10.1 turning into 10.1, just create a variable like so...
if (env.BRANCH_NAME.startsWith("release-")) {
    def releaseName = env.BRANCH_NAME.drop(8)
}
Both of these will probably require some method whitelisting in order to be functional.
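In Declarative Pipeline, roughly the same effect can be achieved with a when { branch } condition; this is only a sketch, not part of the original answer:
pipeline {
    agent any
    stages {
        stage('Release') {
            when { branch 'release-*' }   // glob match on the branch name
            steps {
                // e.g. release-10.1 -> 10.1
                echo "Releasing ${env.BRANCH_NAME.drop(8)}"
            }
        }
    }
}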
I had the same desire and rolled my own, maybe not pretty but it worked...
In your pipeline job, mark "This project is parameterized" and add a parameter for your tag. Then, in the pipeline script, check out the tag if it is present.
Create a freestyle job that runs a script to:
Checkout
Run git describe --tags --abbrev=0 to get the latest tag.
Check that tag against a running list of builds (like in a file).
If the build hasn't occurred, trigger the pipeline job via a URL, passing your tag as a parameter (in your pipeline job under "Build Triggers" set "Trigger builds remotely (e.g., from scripts)" and it will show the correct URL).
Add the tag to your running list of builds so it doesn't get triggered again.
Have this job run frequently.
If you use a multibranch pipeline, there is a "Discover tags" behaviour. Use that plus Spencer's solution.
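With "Discover tags" enabled on the multibranch job, a stage can be limited to tag builds; a minimal Declarative sketch:
pipeline {
    agent any
    stages {
        stage('Release') {
            when { buildingTag() }   // only runs when the build is for a discovered tag
            steps {
                echo "Building tag ${env.TAG_NAME}"
            }
        }
    }
}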

Jenkins multibranch pipeline with Jenkinsfile from different repository

I have a Git repository with code I'd like to build but I'm not "allowed" to add a Jenkinsfile in its root (it is a Debian package so I can't add files to upstream source). Is there a way to store the Jenkinsfile in one repository and have it build code from another repository? Since my code repository has several branches to build (one for each Debian release) this should be a multibranch pipeline. Commits in either the code or Jenkinsfile repositories should trigger a build.
Bonus complexity: I have several code/packaging repositories like this and I'd like to reuse the same Jenkinsfile for all of them. Thus it should somehow dynamically fetch the right Git URL to use. The branches to build have the same names across all repositories.
The short answer is: you cannot do that with a multibranch pipeline. Multibranch pipelines are only designed (at least for now) to execute a specific pipeline in "Pipeline script from SCM" style, with a fixed Jenkinsfile at the root of the project.
You can however use the Multi-Branch Project plugin made for multibranch freestyle projects. First, you need to define your multibranch freestyle configuration just like you would with a multibranch pipeline configuration.
Select this new item as shown below:
This type of configuration will behave exactly the same as the multibranch pipeline type, i.e. it will create a folder with the name of your configuration and a sub-project for each branch it automatically detects.
The implementation should then be a piece of cake:
Specify your SCM repository in the multibranch configuration
Call another build as part of your build/post-build as you would do in a standard freestyle project, except that you have to call a parameterized job (let's call it build-job) and give it your repository information, i.e. Git URL and current branch (you can use the pre-defined variables $GIT_URL and $GIT_BRANCH for this purpose)
In your build-job, just define either an inline pipeline or a pipeline script checked out from SCM, and inside this script do an SCM checkout and go on with the steps you need to build. Example of build-job pipeline content:
node() {
    stage 'Checkout'
    // GIT_URL and GIT_BRANCH are the parameters passed in by the calling job
    checkout scm: [$class: 'GitSCM',
                   branches: [[name: "*/${env.GIT_BRANCH}"]],
                   userRemoteConfigs: [[url: "${env.GIT_URL}"]]]
    stage 'Build'
    // Build steps...
}
Of course, if your different multibranch projects need to be treated a bit differently, you could also use intermediate projects (let's say build-project-A, build-project-B, ...) that would in turn call the generic build-job pipeline.
The one major drawback of this solution is that you will only have one job responsible for all of your builds, making it harder to debug. You would still have your multibranch projects going blue/red in case of success/error, but you will have to go back to the called build-job to find the real cause of a broken build.
The best way I have found is to use the Remote Jenkinsfile Provider plugin. https://plugins.jenkins.io/remote-file/
This will add an option "by Remote Jenkinsfile Provider plugin" under Build Configuration>Mode then you can point to another repo where the Jenkinsfile is. I find this to be a much better solution than the Pipeline Multibranch Defaults Plugin, which makes you store the Jenkins file in Jenkins itself, rather than in source control.
You can make use of this plugin:
https://github.com/jenkinsci/pipeline-multibranch-defaults-plugin/blob/master/README.md
With it, you configure the Jenkinsfile on Jenkins itself rather than having it in each branch of your repo.
I have version 2.121 and you can do this two ways:
Way 1
In the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and, in "Marker File" below it, put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition select "Pipeline Script from SCM" and enter the "SCM" information for how to find the "Jenkinsfile" that holds the script you want to run. It can be in the same repo you are finding branches in to create the jobs (if you put in the same GitHub repo's info) but I can't find a way to indicate that you just use the same branch for the file.
Way 2
Same as above: in the multibranch pipeline configuration > Build Configuration > Mode, select "Custom Script" and, in "Marker File" below it, put the name of a file you will use to identify branches that you want to have builds for.
Then, below that in Pipeline > Definition select "Pipeline Script" and put a bit of Groovy in the text box to load whatever you want or to run some script that already got loaded into the workspace.
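For illustration, such a "Pipeline Script" body could be as small as the sketch below, assuming checkout scm resolves to the branch the multibranch job discovered; ci/build.groovy is a hypothetical path inside that branch:
node {
    checkout scm               // the branch the multibranch job discovered
    load 'ci/build.groovy'     // hypothetical path; the loaded script's top-level code runs here
}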
In my case, I have a scenario with a GitLab project based on Gradle that has dependencies on another GitLab project, also based on Gradle (same dashboard, but different commits, different developers).
I have added the following lines into my Jenkinsfile (the dependent one):
stage('Build') {
    steps {
        // Check out the dependency project first, then build
        git branch: 'dev', credentialsId: 'jenkins-generated-ssh-key', url: 'git@gitlab.project.com:root/coreProject.git'
        sh './gradlew clean'
    }
}
Note: Be aware of the order of the statements.
If you have doubts about how to create jenkins-generated-ssh-key, please ask me.

How do I make a Jenkins job start after multiple simultaneous upstream jobs succeed?

In order to get the fastest feedback possible, we occasionally want Jenkins jobs to run in parallel. Jenkins has the ability to start multiple downstream jobs (or 'fork' the pipeline) when a job finishes. However, Jenkins doesn't seem to have any way of making a downstream job start only if all branches of that fork succeed (or 'joining' the fork back together).
Jenkins has a "Build after other projects are built" button, but I interpret that as "start this job when any upstream job finishes" (not "start this job when all upstream jobs succeed").
Here is a visualization of what I'm talking about. Does anyone know if a plugin exists to do what I'm after?
Edit:
When I originally posted this question in 2012, Jason's answer (the Join and Promoted Build plugins) was the best, and the solution I went with.
However, dnozay's answer (The Build Flow plugin) was made popular a year or so after this question, which is a much better answer. For what it's worth, if people ask me this question today, I now recommend that instead.
Pipeline plugin
You can use the Pipeline Plugin (formerly workflow-plugin).
It comes with many examples, and you can follow this tutorial.
e.g.
// build
stage 'build'
...
// deploy
stage 'deploy'
...
// run tests in parallel
stage 'test'
parallel 'functional': {
...
}, 'performance': {
...
}
// promote artifacts
stage 'promote'
...
Build flow plugin
You can also use the Build Flow Plugin. It is simply awesome - but it is deprecated (development frozen).
Setting up the jobs
Create jobs for:
build
deploy
performance tests
functional tests
promotion
Setting up the upstream
in the upstream (here build) create a unique artifact, e.g.:
echo ${BUILD_TAG} > build.tag
archive the build.tag artifact.
record fingerprints to track file usage; if any job copies the same build.tag file and records fingerprints, you will be able to track the parent.
Configure it to get promoted when the promotion job is successful.
Setting up the downstream jobs
I use 2 parameters PARENT_JOB_NAME and PARENT_BUILD_NUMBER
Copy the artifacts from upstream build job using the Copy Artifact Plugin
Project name = ${PARENT_JOB_NAME}
Which build = ${PARENT_BUILD_NUMBER}
Artifacts to copy = build.tag
Record fingerprints; that's crucial.
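If the downstream jobs are pipelines rather than freestyle jobs, the same two steps might look like this sketch (it assumes the Copy Artifact plugin and the two parameters defined above):
node {
    // Copy build.tag from the parent build and fingerprint it so Jenkins
    // can link this run back to the originating build.
    copyArtifacts projectName: params.PARENT_JOB_NAME,
                  selector: specific(params.PARENT_BUILD_NUMBER),
                  filter: 'build.tag'
    fingerprint 'build.tag'
    // ... test steps go here ...
}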
Setting up the downstream promotion job
Do the same as the above, to establish upstream-downstream relationship.
It does not need any build step. You can perform additional post-build actions like "hey QA, it's your turn".
Create a build flow job
// start with the build
parent = build("build")
parent_job_name = parent.environment["JOB_NAME"]
parent_build_number = parent.environment["BUILD_NUMBER"]

// then deploy
build("deploy")

// then your qualifying tests
parallel (
    { build("functional tests",
            PARENT_BUILD_NUMBER: parent_build_number,
            PARENT_JOB_NAME: parent_job_name) },
    { build("performance tests",
            PARENT_BUILD_NUMBER: parent_build_number,
            PARENT_JOB_NAME: parent_job_name) }
)

// if nothing failed till now...
build("promotion",
      PARENT_BUILD_NUMBER: parent_build_number,
      PARENT_JOB_NAME: parent_job_name)

// knock yourself out...
build("more expensive QA tests",
      PARENT_BUILD_NUMBER: parent_build_number,
      PARENT_JOB_NAME: parent_job_name)
good luck.
There are two solutions that I have used for this scenario in the past:
Use the Join Plugin on your "deploy" job and specify "promote" as the targeted job. You would have to specify "Functional Tests" and "Performance Tests" as the joined jobs and start them in some fashion post-build. The Parameterized Trigger Plugin is good for this.
Use the Promoted Builds Plugin on your "deploy" job, specify a promotion that works when downstream jobs are completed, and specify the Functional and Performance test jobs. As part of the promotion action, trigger the "promote" job. You still have to start the two test jobs from "deploy".
There is a CRITICAL aspect to both of these solutions: fingerprints must be correctly used. Here is what I found:
The "build" job must ORIGINATE a new fingerprinted file. In other words, it has to fingerprint some file that Jenkins thinks was originated by the initial job. Double check the "See Fingerprints" link of the job to verify this.
All downstream linked jobs (in this case, "deploy", "Functional Tests" and "Performance tests") need to obtain and fingerprint this same file. The Copy Artifacts plugin is great for this sort of thing.
Keep in mind that some plugins allow you to change the order of fingerprinting and downstream job starting; in this case, the fingerprinting MUST occur before a downstream job fingerprints the same file to ensure the ORIGIN of the fingerprint is properly set.
The Multijob plugin works beautifully for that scenario. It also comes in handy if you want a single "parent" job to kick off multiple "child" jobs but still be able to execute each of the children manually, by themselves. This works by creating "phases", to which you add 1 to n jobs. The build only continues when the entire phase is done, so if a phase has multiple jobs they all must complete before the rest are executed. Naturally, it is configurable whether the build continues if there is a failure within the phase.
Jenkins recently announced first class support for workflow.
I believe the Workflow Plugin is now called the Pipeline Plugin and is the (current) preferred solution to the original question, inspired by the Build Flow Plugin. There is also a Getting Started Tutorial in GitHub.
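In today's Declarative Pipeline syntax, the diamond from the question can be expressed roughly as follows; the job and stage names are illustrative:
pipeline {
    agent any
    stages {
        stage('Build')  { steps { build 'build' } }
        stage('Deploy') { steps { build 'deploy' } }
        stage('Test') {
            parallel {
                stage('Functional')  { steps { build 'functional-tests' } }
                stage('Performance') { steps { build 'performance-tests' } }
            }
        }
        // Runs only if both parallel test stages succeeded
        stage('Promote') { steps { build 'promotion' } }
    }
}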
The answers by jason & dnozay are good enough. But in case someone is looking for an easy way, just use the JobFanIn plugin.
This diamond dependency build pipeline could be configured with the DepBuilder plugin. DepBuilder uses its own domain-specific language, which in this case would look like:
_BUILD {
    // define the maximum duration of the build (4 hours)
    maxDuration: 04:00
}
// define the build order of the existing Jenkins jobs
Build -> Deploy
Deploy -> "Functional Tests" -> Promote
Deploy -> "Performance Tests" -> Promote
After building the project, the build visualization will be shown on the project dashboard page:
If any of the upstream jobs didn't succeed, the build will be automatically aborted. Abort behavior could be tweaked on a per job basis, for more info see the DepBuilder documentation.
