Short explanation:
There are many repos in our Git server.
Each repo has its own Jenkinsfile, which has its own separate job in Jenkins.
All of these Jenkinsfiles do 99% the same thing!
What we want to achieve at the moment:
Build one Jenkinsfile for all repos.
Maintain branches consistently across the repos.
If possible, delete the current Jenkinsfile from each repo and use only the new generic file.
Have the flexibility to use and manipulate parameters so the Jenkinsfile isn't affected by any other repo's config.
Our solution so far:
Remove the Jenkinsfiles from all repos.
Reconfigure the Jenkinsfile path in the Jenkins GUI to point to our new file.
Put a config .yaml file in each repo that contains all of that repo's relevant information (as key-value pairs).
So, when a repo is triggered, our new Jenkinsfile will load that config file and use its parameters to run all of the stages accordingly.
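To make it concrete, this is roughly what we have in mind for the generic Jenkinsfile (a minimal sketch, assuming the Pipeline Utility Steps plugin for readYaml; the keys appName and buildCommand are only illustrative):

// Minimal sketch of a shared Jenkinsfile, assuming the Pipeline Utility Steps
// plugin (for readYaml) and a config.yaml at each repo root, e.g.:
//   appName: my-service
//   buildCommand: "make build"
pipeline {
    agent any
    stages {
        stage('Load repo config') {
            steps {
                script {
                    // Each repo carries its own key-value config,
                    // so the shared pipeline stays identical everywhere.
                    config = readYaml file: 'config.yaml'
                    echo "Building ${config.appName} on branch ${env.BRANCH_NAME}"
                }
            }
        }
        stage('Build') {
            steps {
                script {
                    // The actual build command comes from the per-repo config.
                    sh config.buildCommand
                }
            }
        }
    }
}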
I would be happy to hear ideas / examples / snippets from you guys! It would help me a lot!
Regards
Niv
(Answering due to lack of reputation to comment. Please excuse.)
Hi @n1vgabay,
If you are using Bitbucket for your SCM, then you may try these:
Create an Organization Folder / Bitbucket Team Project inside Jenkins (from Jenkins --> New Item --> Organization Folder or Bitbucket Team/Project).
Update the config to filter all the repos (or regex them) under the project inside your SCM. This will create all the repos as individual multibranch (MB) pipelines, with all the branches under them as individual jobs. Also, with the Bitbucket Server Integration plugin, it automatically creates webhooks for all the repos so the jobs are triggered on the relevant events (push, commit, PR opened, etc.).
Using the Remote Jenkinsfile Provider plugin, you may choose to place your Jenkinsfile elsewhere, in another project/repo, and reference it from this config.
This Jenkinsfile can have all the steps you need and will run the same for all the branches, which run as individual MB jobs (see the sketch below).
More details on the same can be obtained from here.
Now, if you want to use individual Jenkinsfiles, you might have to come up with Jenkinsfiles specific to each repo, which could get complicated: your Jenkinsfile at the root-folder level would have to call the Jenkinsfile present at the repo/branch level across all the repos and branches.
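For illustration, the single shared Jenkinsfile mentioned above could be as simple as this sketch (the build script name is a placeholder):

// Sketch of one shared Jenkinsfile stored in its own repo; the same stages
// run for every repo/branch job the Organization Folder creates.
pipeline {
    agent any
    stages {
        stage('Info') {
            steps {
                echo "Job: ${env.JOB_NAME}, branch: ${env.BRANCH_NAME}"
            }
        }
        stage('Build') {
            steps {
                sh './build.sh'   // placeholder: the common build step for all repos
            }
        }
    }
}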
Hopefully this helps! :)
I'm setting up a Jenkins job to run a SonarQube scan on the code in a GitHub PR, but I don't know how to integrate it: how can the name of the Jenkins job be defined under "Require branches to be up to date before merging"? My goal is that whenever a new PR comes in, SonarQube scans the code before the merge is accepted or rejected.
Please let me know if you have any idea, thanks!
I can't provide a full answer for you, but I'll clarify some of this.
None of these "branch protection rules" have anything to do with SonarQube scans.
Typically, you will want to configure your SCM repository system so that when a pull request is created, it spawns a Jenkins build, which can do whatever it needs to do, including running a SonarQube scan on the code in the pull request branch.
GitHub, like other similar repository technologies, has a way to detect the creation of the pull request. That will include configuring the Jenkins connection information; there are different ways of doing this.
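As a rough illustration of the scan part only, here is a sketch assuming the SonarQube Scanner for Jenkins plugin, with 'MySonarServer' as a placeholder for the server name configured in Jenkins:

// Sketch only: scan stage for a multibranch PR build, assuming the SonarQube
// Scanner for Jenkins plugin and a server configured as 'MySonarServer'.
pipeline {
    agent any
    stages {
        stage('SonarQube scan') {
            when { changeRequest() }          // run only for pull request builds
            steps {
                withSonarQubeEnv('MySonarServer') {
                    sh 'sonar-scanner'        // assumes sonar-scanner is on the agent's PATH
                }
            }
        }
        stage('Quality gate') {
            when { changeRequest() }
            steps {
                timeout(time: 10, unit: 'MINUTES') {
                    // Fails the build, and therefore the PR status check, if the gate fails.
                    waitForQualityGate abortPipeline: true
                }
            }
        }
    }
}

The name you then require in the branch protection rule is whatever status check Jenkins reports for the PR build (typically something like continuous-integration/jenkins/pr-merge when the GitHub Branch Source plugin is used), not anything configured in SonarQube itself.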
I am trying to create a process that blocks a GitHub PR from being approved and merged to the main branch until a Jenkins pipeline can confirm that a terraform plan (or whatever checks need to happen for that repo) is successful.
There are two restrictions, though:
We're not allowed to install plugins that aren't approved by the company, and getting approval is just too much hassle!
The Jenkins instance is internal, so I can't use a webhook.
I'm trying to use a multibranch pipeline that executes when a PR is raised, but I can't see how to approve the PR once the check is complete. Perhaps this isn't the best way to go?
I'd appreciate any help/pointers on this
Thanks
We need to set up a Jenkins declarative pipeline to manage automated builds/deployments for Terraform-based project repos in GitHub. Basically, for any Terraform project repo in GitHub, when a pull request is submitted from a feature branch to some base branch like master, the single multibranch-pipeline job for that repo should run a build against that feature branch, and then, for the step where it runs a Terraform command like the one below:
terraform plan -out=tfplan -input=false
it should post that output to the corresponding GitHub PR in the comments section (not as an issue comment but as a PR comment), so that the reviewer can review the plan output and approve/reject the PR or add further comments on what needs to be modified in the source code. If it is approved, a separate job off that base branch just does the terraform apply, which we have already configured.
The short of it is that, regardless of Terraform being the use case here, all we are looking for is how to add a comment back to a GitHub PR as part of a Jenkins build. I did install the GitHub Pull Request Builder plugin and could post comments on issues, but I'm not sure how to do that for the actual PR. I would like to have that coded in my declarative pipeline, so I'm very much looking forward to your help/suggestions on that.
I'm just not sure how to grab the PR id each time a feature build runs, or alternatively how to trigger the build on a branch only when there is a PR with that branch as the source branch. Any help or suggestions here will be greatly appreciated, as always.
I was able to figure this out by following this post: Create comment on pull request. I wasn't quite understanding that GitHub treats every PR as an issue (though not vice versa), so what you can achieve with a POST /repos/:owner/:repo/issues/:number/comments is exactly what I was looking for.
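In a declarative multibranch pipeline it can look roughly like the sketch below. To be clear about the assumptions: 'github-token' is a placeholder Jenkins string credential, OWNER/REPO are placeholders, jq is assumed to be available on the agent, and env.CHANGE_ID holds the PR number for PR builds.

// Sketch: post the terraform plan output back to the PR that triggered the build.
pipeline {
    agent any
    stages {
        stage('Comment plan on PR') {
            when { changeRequest() }   // only PR builds have a CHANGE_ID
            steps {
                sh 'terraform plan -out=tfplan -input=false -no-color > plan.txt'
                withCredentials([string(credentialsId: 'github-token', variable: 'GITHUB_TOKEN')]) {
                    sh '''
                        # Wrap the plan text into the JSON body the comments endpoint expects.
                        BODY=$(jq -Rs '{body: .}' plan.txt)
                        curl -s -H "Authorization: token $GITHUB_TOKEN" \
                             -H "Content-Type: application/json" \
                             -d "$BODY" \
                             "https://api.github.com/repos/OWNER/REPO/issues/${CHANGE_ID}/comments"
                    '''
                }
            }
        }
    }
}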
I have successfully set up a webhook trigger in Bitbucket for a Jenkins freestyle project, for test purposes.
Unfortunately my Jenkins project is using the Pipeline format, and I am unable to get Bitbucket to trigger that kind of project; the problem seems to be that there is no Jenkins project registered to pull from the repo that the Bitbucket webhook is coming from, and Jenkins replies with:
Error: Jenkins response: No git jobs using repository: ssh://git@myhost:7999/xxx/testing-jenkins.git and branches: master
No Git consumers using SCM API plugin for: ssh://git@myhost:7999/xxx/testing-jenkins.git
The pipeline project is set up in such a way that the Jenkinsfile is to be found in the given repository (ssh://git@myhost:7999/xxx/testing-jenkins.git), using the "Pipeline script from SCM" option.
Therefore there is actually a kind of "git consumer" for the Pipeline, but this does not seem to be taken into account by Jenkins, probably because this is not a real project source, but a pipeline source.
Are there any examples of integration of Bitbucket and Jenkins Pipeline projects? I have been unable to find any.
If you are looking for a full Bitbucket and Jenkins Pipeline integration, I highly recommend using the Bitbucket Branch Source plugin. The plugin will discover all branches and pull requests and build all of those that have a Jenkinsfile in the root of the repo.
You can also create a project as a Bitbucket Team, which will scan all the repos of your organization:
See the official CloudBees doc.
I was struggling with the same problem. Here are the key steps I followed.
In the Jenkins pipeline job:
Under Build Triggers, check 'Trigger builds remotely (e.g., from scripts)' and fill in the 'Authentication Token' with some random and unique token.
In the Bitbucket repository:
Go to Settings > Services
Select 'Jenkins' from the drop down and 'Add service'.
Check 'Csrf Enabled'
Endpoint: http://username:apitoken@yourjenkinsurl.com/
You can find the username and API token at Jenkins home > People.
Select the user and click Configure. Under 'API Token', click the 'Show API Token' button and you will see the username and API token.
Module name: This is optional. It can be any particular file or folder that is to be watched.
Project name: The project name in Jenkins.
If the job is inside a folder structure, say 'MyTestFolder/MyTestPipelineJob', the project name to use is 'MyTestFolder/job/MyTestPipelineJob'.
Token: the 'Authentication Token' created in the Jenkins job.
You are ready to go!!
I referred http://felixleong.com/blog/2012/02/hooking-bitbucket-up-with-jenkins/ and some of my instincts. :)
A simple solution is to use the Generic Webhook Trigger plugin in Jenkins.
You would need to:
Enable it in a freestyle or pipeline job.
Configure a token string.
Construct JSONPath expressions to gather whatever you need from the Bitbucket webhook.
Add the plugin endpoint in Bitbucket. JENKINS_URL/generic-webhook-trigger/invoke?token=whatever_you_picked
The plugin will give you clear feedback when it is invoked so that troubleshooting is made easy.
It is up to you to pick whatever values you need from the webhook in order to clone the correct repository, or whatever it is you want to do when the job is invoked.
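In a pipeline job, the same thing can be declared in the Jenkinsfile itself. A sketch, where the JSONPath is my assumption about a Bitbucket push payload and should be checked against the JSON you actually receive:

// Sketch of the Generic Webhook Trigger configured from the Jenkinsfile.
pipeline {
    agent any
    triggers {
        GenericTrigger(
            token: 'whatever_you_picked',          // must match the ?token= in the Bitbucket URL
            genericVariables: [
                // Assumed JSONPath for the pushed branch in a Bitbucket payload.
                [key: 'PUSHED_BRANCH', value: '$.push.changes[0].new.name']
            ],
            causeString: 'Triggered by Bitbucket webhook',
            printContributedVariables: true,       // handy for troubleshooting
            printPostContent: true
        )
    }
    stages {
        stage('Show what arrived') {
            steps {
                echo "Webhook reported branch: ${env.PUSHED_BRANCH}"
            }
        }
    }
}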
I have this same issue. My workaround was just to create a freestyle project that can be triggered by the webhook, and have the Pipeline triggered by that project's completion.
In the meantime, here's the Jenkins bug you can watch for a fix:
https://issues.jenkins-ci.org/browse/JENKINS-38447
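As a sketch, the Pipeline side of that workaround just declares the freestyle job as its upstream trigger (the job name is a placeholder):

// Sketch: run this Pipeline whenever the webhook-triggered freestyle job succeeds.
pipeline {
    agent any
    triggers {
        upstream(upstreamProjects: 'bitbucket-webhook-freestyle',
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Build') {
            steps {
                echo 'Runs after the freestyle webhook job completes successfully'
            }
        }
    }
}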
I spent hours figuring out how to do this (as of 2017.10).
As @JPLemelin described, create a new Jenkins item using a Bitbucket Team/Project.
Per the doc https://support.cloudbees.com/hc/en-us/articles/115000051132-How-to-Trigger-Multibranch-Jobs-from-BitBucket-Cloud- , install the Bitbucket Branch Source plugin.
Go to Bitbucket and add a webhook: ${your-jenkins-url}/bitbucket-scmsource-hook/notify
After these 3 steps, I finally got the pipeline jobs to run after a new commit to Bitbucket.
I had the same exact issue...
The cause was using */master as the branch specifier. I needed to spell it out: origin/master (no wildcards).
It works well now.
I was finally able to make this work with a Jenkinsfile in a multibranch pipeline:
In Bitbucket I created a webhook with my Jenkins URL and my clone URL, and in the webhook I put the following URL (exactly the URL used in the Jenkins project):
http://<jenkins>/git/notifyCommit?url=http://<user>@<bitbucket>/scm/<project>/<repo>.git
When I test the trigger, the result is the following:
No git jobs using repository: http://<user>#<bitbucket>/scm/<project>/<repo>.git and branches:
Scheduled indexing of <repo>
So it didn't trigger any jobs directly, but it did trigger the multibranch scanning, so my changed branches get built.