Is it possible to configure Jenkins to NOT run a pipeline on branch discovery?

I have a multi-branch pipeline that has been configured to use branch auto-discovery. However, I don't want Jenkins to automatically start a pipeline job when it discovers a new branch. I instead want the pipeline job to be started by other means (e.g. a timer or a REST API call).
Is this possible?

Yes, add the Suppress automatic SCM triggering property in the branch sources of your multibranch project.
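If you configure the multibranch project with Job DSL rather than through the UI, the same property can be set in the branch source's strategy block. A minimal sketch, assuming a recent Branch API plugin and a placeholder Git repository; noTriggerBranchProperty() is the dynamically generated Job DSL symbol for that checkbox:
multibranchPipelineJob('my-project') {
    branchSources {
        branchSource {
            source {
                git {
                    id('my-repo')
                    remote('https://example.com/my-repo.git')   // placeholder URL
                }
            }
            strategy {
                defaultBranchPropertyStrategy {
                    props {
                        // Equivalent of ticking "Suppress automatic SCM triggering"
                        noTriggerBranchProperty()
                    }
                }
            }
        }
    }
}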

Alternatively, you can explicitly name the branches you want auto-discovery to include, using Filter by name (with wildcards).
The documentation states: Space-separated list of name patterns to consider. You may use * as a wildcard; for example: master release*
A new branch will therefore not be processed until it is explicitly added to the list of included branches; once it is added, any subsequent changes to it trigger processing as usual.
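For reference, a minimal Job DSL sketch of this approach with a placeholder repository; includes() takes the same space-separated wildcard patterns as the Filter by name field:
multibranchPipelineJob('my-project') {
    branchSources {
        git {
            id('my-repo')
            remote('https://example.com/my-repo.git')   // placeholder URL
            includes('master release*')                 // only these branches are discovered
        }
    }
}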

Related

How to set up a GitHub webhook trigger on pushes to a certain branch

I have a Jenkins pipeline and have configured a GitHub webhook to trigger it.
How can I trigger the pipeline only when a certain branch is pushed, instead of on every push to every branch?
The webhook itself is generic; there is no branch filter on the GitHub or Bitbucket side, so you have to handle everything based on the payload.
You can use the Generic Webhook Trigger Plugin. It allows you to parse data from the payload and conditionally trigger a build depending on the branch name: apply a regexp filter on the branch name.
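A minimal declarative pipeline sketch using that plugin, assuming a GitHub push payload; the token value is a placeholder you choose yourself:
pipeline {
    agent any
    triggers {
        GenericTrigger(
            // Extract the pushed ref (e.g. "refs/heads/main") from the webhook payload
            genericVariables: [
                [key: 'ref', value: '$.ref']
            ],
            token: 'my-job-token',                     // placeholder token
            // Build only when the push is to the branch we care about
            regexpFilterText: '$ref',
            regexpFilterExpression: '^refs/heads/main$'
        )
    }
    stages {
        stage('Build') {
            steps {
                echo "Triggered by push to ${env.ref}"
            }
        }
    }
}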

Can a single seed job process DSLs from multiple repos?

I recently managed to convert several manually-created jobs to DSL scripts (inlined into temporary 'seed' jobs), and was pleasantly surprised how straightforward it was. Now I'd like to get rid of the multiple seed jobs and try to structure things more cleanly.
To that end, I created a new jenkins-ci repo and committed all the Groovy DSL scripts to it. Then I created a job-generator Jenkins job that pulls from the jenkins-ci repo and has a single Process Job DSLs step. This step has the Look on Filesystem box ticked, with the DSL Scripts field set to jobs/*.groovy. With global push notifications already in place, this works more-or-less as intended: if I make a change to the jenkins-ci repo, the job-generator job automatically runs and regenerates all the jobs—awesome!
What I don't like about this solution is that it has poor locality of reference: the DSL scripts for the job live in a completely separate repository from the code. What I'd really like is to keep the job DSL scripts in each individual code repository, in a jenkins subfolder, and have a single seed job that processes them all. That way, changes to CI setup could be code-reviewed right alongside the code. To me, that just feels like an ideal setup.
Unfortunately, I don't have a clear idea about how to make this happen. If I could figure out a way to make the seed job watch multiple repos, such that a commit to any one of them would trigger it, perhaps I could inject another build step before the Process Job DSLs step and (somehow) script my way to victory, but... I'm unsure how to even get to that point. (I certainly don't want to do full clones of each repo in the generator job just to pull in the DSL scripts!)
I suspect I'm not the first person to wish they could put the Job DSL scripts alongside the code, though perhaps I'm over-estimating the benefits. Any advice on this topic would be much appreciated—thanks!
Unfortunately there is no direct way of solving this. Several feature requests have been opened (JENKINS-33275, JENKINS-37220), but AFAIK no one is working on any of them.
As a workaround you can use the Pipeline Multibranch Plugin and create a multibranch project for each of your repositories. You must then add a simple Jenkinsfile to each repo/branch and use the Jenkinsfile to execute your Job DSL scripts. See Use Job DSL in Pipeline scripts for details. This would require minimal coding, but I think each repo must be cloned for this to work because the Job DSL files must be available on the file system.
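A minimal sketch of such a Jenkinsfile, assuming the DSL scripts live in a jenkins/ subfolder as proposed in the question:
node {
    checkout scm                          // the multibranch project clones this repo
    jobDsl targets: 'jenkins/*.groovy'    // process every Job DSL script in the folder
}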
You can use Job DSL to create the multibranch jobs, see multibranchPipelineJob in the API viewer. This would be your "root" seed job.
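A sketch of what such a root seed might look like; the repository names and URL are placeholders:
['repo-a', 'repo-b'].each { name ->
    multibranchPipelineJob(name) {
        branchSources {
            git {
                id(name)
                remote("https://example.com/${name}.git")   // placeholder URL
            }
        }
    }
}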
If your repos are hosted on GitHub, you can also check out the GitHub Organization Folder Plugin. With that plugin you only need to create one job per organization instead of one multibranch job per repository.

Build GitLab merge request with Jenkins

I'm aware that both GitLab and Jenkins have integration points with one another, however for reasons beyond my control I am not able to use either.
It's easy to pass parameters to a job telling it which branch, or even which commit, to build. However, I just can't seem to find the right configuration to make it build the merge request whose number I pass in as a parameter.
I need to do this with the out-of-the-box 'git' functionality in Jenkins. (Can't use the GitLab Merge Request plugin because it requires polling of the repo.) This job must be initiated manually, and the merge request number specified via parameter. I will not be triggering it with a webhook from GitLab either. This requirement is a manual and on-demand build of a specific merge request.
Is it possible, and I'm just missing something (not) obvious?
So no one else has to endure figuring this out themselves ... yes... Jenkins can build a GitLab merge request out of the box, with no crazy plugins.
In Jenkins, in the Source Code Management section, click Advanced, and set the Refspec to:
+refs/merge-requests/*/head:refs/remotes/origin/merge-requests/*
then, in the Branch Specifier field, use this:
origin/merge-requests/${MR}
where ${MR} is a parameter passed to the build - the number of the merge request to get.
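For reference, a hypothetical Job DSL rendering of the same configuration, with a placeholder repository URL:
job('build-merge-request') {
    parameters {
        stringParam('MR', '', 'Number of the merge request to build')
    }
    scm {
        git {
            remote {
                url('https://gitlab.example.com/group/project.git')   // placeholder URL
                refspec('+refs/merge-requests/*/head:refs/remotes/origin/merge-requests/*')
            }
            branch('origin/merge-requests/${MR}')   // expanded from the MR build parameter
        }
    }
}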
This is what works for me as of April 2022 (Jenkins 2.303.2).
The other answers now seem to be outdated (2016 and 2018) and don't work for me, though they pointed me in the right direction.
Using gitlabMergeRequestId gave me some weird huge ID for a merge request that didn't even exist in my repository (I don't know where it comes from), and MR seems to be an old placeholder from 2016.
Here's what works for me:
1. Checkout of the commit
Set Pipeline > Repositories > Advanced > Refspec to:
+refs/merge-requests/*/head:refs/remotes/origin/merge-requests/*
Set Pipeline > Branches to build > Branch Specifier to:
origin/merge-requests/${gitlabMergeRequestIid}
N.B. gitlabMergeRequestIid is not a typo & differs from gitlabMergeRequestId
2. Specifying branch in Pipeline script
With the above, checkout is successful but you still need to specify what branch your pipeline script will use.
Use the gitlabSourceBranch environment variable, which works with the Git plugin inside your script (it didn't work for me at all in the branch specifier 🤷‍♂️).
branch: '${gitlabSourceBranch}'
...
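For context, a fuller sketch of the checkout that snippet comes from, using the Git plugin's git step with a placeholder URL:
git branch: '${gitlabSourceBranch}',                      // expanded by the Git plugin, as described above
    url: 'https://gitlab.example.com/group/project.git'   // placeholder URL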
End result (excluding script): [screenshot omitted]
P.S. in case the placeholders change again or this doesn't work, check https://github.com/jenkinsci/gitlab-plugin#defined-variables
In Jenkins, in the Source Code Management section, click Advanced and set the Refspec to:
+refs/merge-requests/*/head:refs/remotes/origin/merge-requests/*
then, in the Branch Specifier field, use:
origin/merge-requests/${gitlabMergeRequestId}
where ${gitlabMergeRequestId} is a parameter passed to the build (the number of the merge request to get). The variables you can use are listed at https://github.com/jenkinsci/gitlab-plugin#defined-variables; for this case, use gitlabMergeRequestId.
I have implemented the GitLab webhook and it works correctly.

Generic Jenkins jobs

I wonder if it is possible to create generic parameterized jobs, ready to copy, where the only post-copy action is to redefine their parameters.
During my investigation I found out that:
- I can use parameters in the SVN path definition
- I can define the flow of builds using the *Build Flow Plugin*
However, I cannot get Jenkins to use parameters inside job name definitions for the promotion process. Is there any way to achieve that?
Since I sometimes create branches from master, I would like to copy the whole configuration of the jobs; most times the only difference is that I replace master with the branch name in the job name definition.
Why not use a parameter as the branch name?
Then, when you start the job, you can input the parameters for the branch you want to build.
Or find some way to get the branch info and inject it.
For example, we have one common job that monitors maybe 20 projects; when a change in any of them is merged into Git, our Gerrit Trigger plugin kicks in, and the job name and branch are all obtained dynamically from parameters.
Hope this works for you.
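A minimal sketch of that suggestion as a declarative pipeline, with a placeholder repository URL:
pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch to build')
    }
    stages {
        stage('Checkout') {
            steps {
                // One generic job; the branch comes in as a parameter instead of
                // being hard-coded in the job name
                git branch: params.BRANCH,
                    url: 'https://example.com/my-repo.git'   // placeholder URL
            }
        }
    }
}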

Jenkins manual trigger, how to display configuration variables from SVN on page

I have all my test environment configurations stored in SVN in .properties files. I also have a Jenkins job that can deploy my artifacts to a specific test environment via a manual trigger in the delivery/build pipeline. However, this creates a sense of uncertainty, because I am never quite sure which configurations I am deploying to the test environment, as Jenkins does not automatically show them.
I noticed Jenkins offers parameterized builds, which give you a page that lets you set some values, e.g. using a drop-down, before triggering the build. My question is: would it be possible to have Jenkins display all the key/value pairs I have defined in my .properties file, and even let me change them? That way I could always review/edit my environment properties before actually deploying. Of course, if I make changes, I need to remember to commit them to SVN too... Thanks for your input!
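One possible approach (not from the thread, just a sketch): read the file with readProperties from the Pipeline Utility Steps plugin, then pause with an input step so the values can be reviewed and overridden before deploying. The file path is a placeholder:
node {
    checkout scm                                            // fetch the configs from SVN
    def props = readProperties file: 'env/test.properties'  // placeholder path
    def answers = input(
        message: 'Review environment configuration before deploying',
        // One editable string parameter per key/value pair in the file
        parameters: props.collect { k, v ->
            string(name: k, defaultValue: v, description: 'from test.properties')
        }
    )
    echo "Deploying with: ${answers}"
}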
