Jenkins project that checks pull requests from multiple GitHub repositories

I'm setting up a Jenkins project to test many repositories in a GitHub organization.
My intent is to have a single Jenkins project that can check for PRs in a set of GitHub repos in my GitHub organization, and then use that project to trigger another Jenkins project that checks out, builds, and tests the code in my GitHub repos.
So far I have been able to set up a Jenkins project that checks PRs on a single GitHub repo, but I have not figured out whether there is a way to check for PRs on multiple GitHub repos belonging to the same GitHub organization through a single Jenkins project. Is there a way to achieve this?

The Jenkins Bitbucket and Jenkins GitHub plugins unfortunately don't offer the ability to watch for hooks from multiple git repositories.
Possible solutions to your problem include:
Work with branches: I don't know what kind of code you are trying to organize into repositories, but in my experience, people often organize code belonging to the same project into different repositories when different branches of the same git repository would do. Maybe that approach solves your problem?
Have one pipeline that clones the other git repositories: When using Jenkins declarative pipelines, you can use the checkout() and git() steps. Look at the example below:
// Single pipeline example
pipeline {
    agent any
    stages {
        stage("I'm just printing a message in here") {
            steps {
                script {
                    print('Yep, just printing some happy message')
                }
            }
        }
        stage("Cloning repository A") {
            steps {
                script {
                    checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'my-repo-a']], userRemoteConfigs: [[credentialsId: 'MY-GIT-CREDENTIALS', url: 'https://github.com/my-user/my-repo-a.git']]])
                }
            }
        }
    }
}
Have two pipelines. A main (let's call it PIPELINE-A) which will be called by the hook, and a secondary (let's call it PIPELINE-B), which will clone the other repositories and do some fun stuff:
// PIPELINE-A
pipeline {
    agent any
    stages {
        stage('Calling PIPELINE-B') {
            steps {
                script {
                    build 'pipeline-b'
                }
            }
        }
    }
}
// PIPELINE-B
pipeline {
    agent any
    stages {
        stage('Cloning Git repository') {
            steps {
                script {
                    checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'my-repo-a']], userRemoteConfigs: [[credentialsId: 'MY-GIT-CREDENTIAL', url: 'https://github.com/my-user/my-repo-a.git']]])
                    checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'my-repo-b']], userRemoteConfigs: [[credentialsId: 'MY-GIT-CREDENTIAL', url: 'https://github.com/my-user/my-repo-b.git']]])
                }
            }
        }
    }
}
With this solution, each repository gets cloned into a distinct folder, so you should change into that folder (for example with the dir() step, as sketched below) before running any tasks specific to that project. For better organization you can have distinct pipelines for distinct repositories, each containing its own tasks, but as a con of these two last solutions, you're gonna have to pick a main repository to trigger your jobs.
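For example, a minimal sketch of a follow-up stage that runs tasks inside one of the cloned folders, assuming the relativeTargetDir values from the example above and hypothetical make targets:
stage('Testing repository A') {
    steps {
        script {
            // dir() scopes the enclosed steps to the folder the repo was cloned into
            dir('my-repo-a') {
                // Hypothetical build/test commands; replace with your project's own
                sh 'make build'
                sh 'make test'
            }
        }
    }
}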
Best regards!

Related

Jenkins: Building multiple repos with different branches

I have multiple repos, each with its own Jenkinsfile, and when I am working on one repo I need to build the others so I have an end-to-end app deployed for feature development. As the app runs on AWS with the containers deployed into EKS, my preference is to be able to build and run on AWS.
There is an order to the building: the infrastructure needs to be deployed first, before the backend services (there are 3) and the UI.
Ideally I can choose which branches from the 5 repos are deployed, and when a change occurs on any branch that is deployed as part of the ephemeral environment, the pipeline will trigger.
So far I am thinking of having a Jenkinsfile in each repo and creating a 6th repo, which would have just a YAML file and a Jenkinsfile of its own. The pipeline job for this repo would take data from the YAML file about which branches to use and trigger the other pipelines, passing the branch to each; it would be the only repo with an actual pipeline job.
Has anyone tried this? I'm not sure if it's possible to have a pipeline watch multiple different repos and branches and act as an orchestrator, kicking off other pipelines.
There might be a much easier way to do this; I have read a lot of posts and articles but none seem to achieve what I want.
One approach can be to write a single Jenkinsfile, combining all the stages from each repo into this single Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Infra Setup') {
            steps {
                // The below will clone your repo; it is checked out to the master branch by default.
                git credentialsId: 'jenkins_git_cred', url: '<your_git_url_for_clone>'
                sh "git checkout branchname"
                // Your steps
            }
        }
        stage('Backend1') {
            steps {
                // To check out a specific branch by default instead of master, use the below in your pipeline stage.
                checkout([$class: 'GitSCM', branches: [[name: '*/branchname']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'jenkins_git_cred', url: 'your_git_url_for_clone']]])
            }
        }
        stage('backend_n') {
            steps {
                // One or more steps need to be included within the steps block.
                echo 'backend_n steps go here'
            }
        }
        stage('UI') {
            steps {
                // One or more steps need to be included within the steps block.
                echo 'UI steps go here'
            }
        }
    }
}
You can generate syntax using the Jenkins Pipeline Syntax generator at:
https://your-jenkins-url.com/pipeline-syntax/
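If you would rather keep the orchestrator pattern you described (a 6th repo holding a YAML file that pins the branches), here is a minimal sketch, assuming the Pipeline Utility Steps plugin for readYaml and hypothetical downstream job names (infra-job, backend-job, ui-job) that each accept a BRANCH parameter:
// Orchestrator Jenkinsfile in the 6th repo
pipeline {
    agent any
    stages {
        stage('Deploy environment') {
            steps {
                script {
                    // branches.yaml maps each repo to the branch to deploy, e.g.
                    // infra: feature/vpc
                    // backend: dev
                    // ui: main
                    def cfg = readYaml file: 'branches.yaml'
                    // Order matters: infrastructure first, then backends, then UI
                    build job: 'infra-job', parameters: [string(name: 'BRANCH', value: cfg.infra)]
                    build job: 'backend-job', parameters: [string(name: 'BRANCH', value: cfg.backend)]
                    build job: 'ui-job', parameters: [string(name: 'BRANCH', value: cfg.ui)]
                }
            }
        }
    }
}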

How to run the specific branch of bitbucket in Jenkins pipeline repository through triggers

To set up Bitbucket with a Jenkins pipeline, I am using the Generic Webhook Trigger Plugin in Jenkins.
I enable it in the pipeline job.
Configure the token string.
Add the plugin endpoint in Bitbucket:
JENKINS_URL/generic-webhook-trigger/invoke?token=whatever_you_picked
This is my pipeline code for cloning the repo:
pipeline {
    agent any
    parameters {
        gitParameter branchFilter: 'origin/(.*)', defaultValue: 'dev', name: 'BRANCH_NAME', type: 'PT_BRANCH'
    }
    stages {
        stage("clone") {
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: "${params.BRANCH_NAME}"]],
                    doGenerateSubmoduleConfigurations: false,
                    extensions: [],
                    submoduleCfg: [],
                    userRemoteConfigs: [[credentialsId: "${GIT_CREDENTIAL_ID}", url: "${REPO_URL}"]]
                ])
            }
        }
    }
}
It always clones the dev branch whenever code is pushed to Bitbucket, but I want to clone the branch that was just pushed. There is probably a way to get it from the webhook's JSON payload, but I am not getting how to do that.
We're building our branches using a "MultiBranch Pipeline" project, which includes a Jenkinsfile in each branch. The MultiBranch Pipeline acts as both a folder and a triggerable job. The Jenkinsfile in Bitbucket gets merged like all other code.
Have your Generic Webhook Trigger Plugin trigger the MultiBranch Pipeline job. This will then scan all branches in your repository and trigger a build for any that have changes since the last scan.
In the Jenkinsfile, you can use env.BRANCH_NAME, set by Jenkins at build time, to pull the branch code you're building at the moment.
pipeline {
    agent any
    stages {
        stage('Prebuild') {
            steps {
                script {
                    if (env.BRANCH_NAME == 'master') {
                        git credentialsId: 'myCreds', url: 'myGitUrl'
                    }
                    else {
                        git branch: env.BRANCH_NAME, credentialsId: 'myCreds', url: 'myGitUrl'
                    }
                }
            }
        }
    }
}
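Alternatively, if you want to stay with the Generic Webhook Trigger Plugin, it can extract the pushed branch from the webhook's JSON payload via a JSONPath expression. A minimal sketch, assuming a Bitbucket Cloud push payload (the JSONPath may differ for Bitbucket Server):
pipeline {
    agent any
    triggers {
        GenericTrigger(
            // Pulls the branch name out of the Bitbucket push payload into BRANCH
            genericVariables: [
                [key: 'BRANCH', value: '$.push.changes[0].new.name']
            ],
            token: 'whatever_you_picked',
            causeString: 'Push to $BRANCH'
        )
    }
    stages {
        stage('clone') {
            steps {
                // BRANCH is resolved from the webhook at trigger time
                git branch: env.BRANCH, credentialsId: "${GIT_CREDENTIAL_ID}", url: "${REPO_URL}"
            }
        }
    }
}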

Jenkins save author of commit and listen for pushes

Hi, I want to ask if it is possible for a Jenkins pipeline to run every time there is a push to some repository in git, and to save the author of the commit into a variable. My code:
stage('checkout') {
    steps {
        checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxxxxxxxxxxxx-yyyyyyyy-zzzzzzzzzz', url: 'git@website:group/project.git']]])
    }
}
To your 1st question: "Is it possible for a Jenkins pipeline to run every time there is a push to some repository in git?"
Answer: Yes, there are various ways:
Make use of webhooks. Many SCM providers like GitHub and Bitbucket have this option.
You can also make use of the Jenkins Git plugin to do that, as sketched below.
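For example, a minimal declarative sketch, assuming a pipeline-from-SCM job with the GitHub plugin's push trigger (SCM polling shown as a webhook-free fallback):
pipeline {
    agent any
    triggers {
        // Fires when the GitHub webhook delivers a push event
        githubPush()
        // Or, without webhooks, poll the repository every 5 minutes:
        // pollSCM('H/5 * * * *')
    }
    stages {
        stage('checkout') {
            steps {
                checkout scm
            }
        }
    }
}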
For your 2nd question: "Save the author of the commit into a variable."
Answer: This seems to be a duplicate of an existing question. In short, you can read the author from git right after checkout, as in the sketch below.
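A minimal sketch of capturing the commit author into a variable after checkout, assuming a Unix agent with git on the PATH (the credentialsId and url are placeholders):
stage('checkout') {
    steps {
        checkout([$class: 'GitSCM', branches: [[name: '*/master']], userRemoteConfigs: [[credentialsId: 'my-credentials-id', url: 'git@website:group/project.git']]])
        script {
            // Read the author of the most recent commit into an environment variable
            env.GIT_AUTHOR = sh(returnStdout: true, script: "git log -1 --pretty=format:'%an'").trim()
            echo "Last commit author: ${env.GIT_AUTHOR}"
        }
    }
}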

Jenkins - Multiple jobs for a single Project Issue

I am relatively new to Jenkins (using 2.32). So pardon my ignorance.
In my current setup, I have 2 free-style jobs for a single project: one pointing to the production branch (/master) and another to the dev branch (/dev). Bitbucket is configured to invoke (webhook) Jenkins on changes.
Once dev is built and passes all the unit tests, it gets deployed to the dev server. Eventually, all dev changes are pushed to master via pull request. The change in the master branch triggers the master job and deploys the artifacts to production.
I don't feel this setup is correct and would like your expert advice on this. Having 2 jobs makes me uncomfortable. What if I want a staging release? I would need another free-style job. That doesn't make much sense.
How do I go about doing this with one job? How do you guys achieve this? Using Pipeline? Any pointers would be greatly appreciated.
TIA.
You are correct; you can manage this better with a Jenkins Pipeline.
What you can do is the following:
1) Check out the code from the dev branch and put it in one directory in the workspace.
2) Compile and deploy from that directory.
3) Add a manual approval step before deploying from the master branch.
4) Repeat steps 1 and 2 for master.
A sample code would look something like this:
node {
    // Get code from the dev branch
    checkout changelog: false, poll: false, scm: [$class: 'GitSCM', branches: [[name: "origin/dev"]], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'test-dev-dir']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: '<jenkins-github-credential-id>', url: 'https://github.com/test']]]
    dir('test-dev-dir') {
        // Do your stuff
    }
    stage('approve') {
        input id: 'master-deploy', message: 'Deploy from master?', ok: 'Deploy'
    }
    // Get code from the master branch
    checkout changelog: false, poll: false, scm: [$class: 'GitSCM', branches: [[name: "origin/master"]], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'test-master-dir']], submoduleCfg: [], userRemoteConfigs: [[credentialsId: '<jenkins-github-credential-id>', url: 'https://github.com/test']]]
    dir('test-master-dir') {
        // Preferably create a tag here for future hotfixes
        // Do your stuff
    }
}

Clean builds with Multibranch Workflow

Using Multibranch Workflow, the command to check out looks like
checkout scm
I can't find a way to tell Jenkins to perform a clean checkout. By "clean," I mean it should remove all files from the workspace that aren't under version control.
I'm not sure if this answers the original question or not (I couldn't tell if the intention was to leave some files in the workspace), but why not just remove the workspace first? This would allow a clean checkout:
stage ('Clean') {
    deleteDir()
}
stage ('Checkout') {
    checkout scm
}
I ran into the same problem and here is my workaround.
I created a new scm object for the checkout and extended the extensions with CleanBeforeCheckout, but I kept the other configuration such as branches and userRemoteConfigs:
checkout([
    $class: 'GitSCM',
    branches: scm.branches,
    extensions: scm.extensions + [[$class: 'CleanBeforeCheckout']],
    userRemoteConfigs: scm.userRemoteConfigs
])
It's still not perfect because you have to create a new object :(
First, you cannot assume that a workflow job has a single workspace as it did for freestyle jobs. Actually, a workflow job can use more than one workspace (one for each node or ws block).
That said, what I'm going to propose is kind of hacky: modify the scm object before checkout to set up a CleanCheckout extension (you will have to approve some calls there).
import hudson.plugins.git.extensions.impl.CleanCheckout
scm.extensions.replace(new CleanCheckout())
checkout scm
But I'd prefer Christopher Orr's proposal: use a shell step after checkout (sh 'git clean -fdx').
Behaviors can be added when configuring the source: Clean before checkout, Clean after checkout, and Wipe out repository and force clone. This removes the need to add logic to declarative/scripted pipelines.
Adding Christopher Orr's comment as an answer, just do:
stage('Checkout') {
    checkout scm
    sh 'git clean -fdx'
}
Jenkins currently contains a page to generate Groovy pipeline syntax. Selecting the checkout step, you should be able to add all the additional options you're used to.
I generated the following, which should do what you want:
checkout poll: false, scm: [$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'CleanBeforeCheckout']], submoduleCfg: [], userRemoteConfigs: [[url: 'ssh://repo/location.git']]]
