Jenkins: PR pipeline triggered by a commit to target branch

I have the following Jenkinsfile:
pipeline {
    environment {
        STAGING_BRANCH = 'project'
    }
    agent any
    stages {
        stage('Staging Environment') {
            when {
                expression { env.CHANGE_TARGET == env.STAGING_BRANCH && env.CHANGE_ID }
            }
            steps {
                sh "sh /bin/create-staging-env"
            }
        }
    }
}
The condition expression { env.CHANGE_TARGET == env.STAGING_BRANCH && env.CHANGE_ID } is meant to only execute the stage when it is a pull request and the target branch is project.
I have this pull request, which has only one commit:
But Jenkins ran this pipeline multiple (7) times:
My guess is that the additional builds are triggered when a commit is pushed to another branch that also has an open pull request.
Edit
Now I understand why the builds were created. They were caused by commits to the target branch project: since the target branch changed, Jenkins re-executed the pipeline for that PR.
My question then becomes: how do I get the branch that the triggering commit was pushed to? For example, if a commit is pushed to the project branch, I would like to skip the stage for PRs. I want to do something like expression { ... && env.COMMIT_BRANCH != 'project' }

Look at the when branch clause for stages. You can tell Jenkins to run a stage only on particular branches, or to skip it on particular branches, e.g.
stage('PR stuff') {
    when {
        not {
            branch 'project'
        }
    }
    steps {
        sh 'something'
    }
}
You can also use allOf and anyOf to match multiple clauses in the when block.
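If the goal is to run the stage only for pull requests that target project while never running on builds of the project branch itself, the clauses can be combined; a sketch (untested, assuming the standard multibranch change-request conditions are available):

```groovy
stage('PR stuff') {
    when {
        allOf {
            // only change-request (PR) builds whose target branch is 'project'
            changeRequest target: 'project'
            // and never the branch job for 'project' itself
            not { branch 'project' }
        }
    }
    steps {
        sh 'something'
    }
}
```

In a multibranch pipeline the PR jobs and the branch jobs are separate items, so `not { branch 'project' }` filters out the job that builds the target branch directly.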

jenkins configuration for building on different branches

I am doing code review with Gerrit Code Review and need to create a Jenkins pipeline for CI/CD. I am using the events triggered by the Gerrit Trigger plugin.
I want to obtain this:
PatchSet Created:
- build starts on the refs/changes/**/**/** branch
- results are reported to Gerrit for code review
Change Merged (into develop) or Ref Updated (develop):
- build starts on the origin/develop branch
- code is deployed to an internal server
Ref Updated (master):
- build starts on the origin/master branch
- code is deployed to an external server
Questions for which I didn't find good answers:
do I need to use a simple pipeline or multibranch pipeline?
how do I start the build on the correct branch?
how can I checkout the correct branch using a Jenkinsfile instead of using the configuration page?
You should create a multibranch pipeline and write your declarative/scripted pipeline in a Jenkinsfile.
Example pipeline:
pipeline {
    agent any
    tools {
        maven 'maven-3.3.6'
        jdk 'jdk-11'
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    stages {
        stage('Build/Test') {
            when {
                changeRequest()
            }
            steps {
                sh "mvn clean verify"
            }
            post {
                success {
                    gerritReview labels: [Verified: 1], message: "Successful build, ${env.RUN_DISPLAY_URL}."
                }
                unstable {
                    gerritReview labels: [Verified: 0], message: "Unstable build, ${env.RUN_DISPLAY_URL}"
                }
                failure {
                    gerritReview labels: [Verified: -1], message: "Failed build, ${env.RUN_DISPLAY_URL}"
                }
            }
        }
        stage('Deploy') {
            when {
                branch 'develop'
            }
            steps {
                sh 'mvn deploy'
            }
        }
    }
}
The Build/Test stage will run for any change request: any new change or patchset will trigger it.
The Deploy stage will be triggered for any change merged into develop.
You can have multiple stages for one branch; they will be executed in sequence.
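As a sketch of that sequencing (the stage names and extra Maven profile are illustrative, not from the original setup), two stages gated on the same branch run in the order they are declared:

```groovy
stage('Integration Test') {
    when { branch 'develop' }
    steps {
        sh 'mvn verify -Pintegration' // hypothetical integration-test profile
    }
}
stage('Deploy') {
    when { branch 'develop' }
    steps {
        sh 'mvn deploy'
    }
}
```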

Jenkins - How to run a stage/function before pipeline starts?

We are using a Jenkins multibranch pipeline with BitBucket to build pull request branches as part of our code review process.
We wanted to abort any queued or in-progress builds so that we only run and keep the latest build - I created a function for this:
def call() {
    def jobName = env.JOB_NAME
    def buildNumber = env.BUILD_NUMBER.toInteger()
    def currentJob = Jenkins.instance.getItemByFullName(jobName)
    for (def build : currentJob.builds) {
        def exec = build.getExecutor()
        if (build.isBuilding() && build.number.toInteger() != buildNumber && exec != null) {
            exec.interrupt(
                Result.ABORTED,
                new CauseOfInterruption.UserInterruption("Job aborted by #${currentBuild.number}")
            )
            println("Job aborted previously running build #${build.number}")
        }
    }
}
Now in my pipeline, I want to run this function when the build is triggered by the creation or push to a PR branch.
It seems the only way I can do this is to set the agent to none and then set the correct node for each of the subsequent stages. This results in missing environment variables etc., since the environment section runs on the master.
If I use agent { label 'mybuildnode' } then it won't run the stage until the agent is free from the currently in-progress/running build.
Is there a way I can get the 'cancelpreviousbuilds()' function to run before the agent is allocated as part of my pipeline?
This is roughly what I have currently, but it doesn't work because of environment variable issues. I had to use skipDefaultCheckout so I could check out manually as part of my build:
pipeline {
    agent none
    environment {
        VER = '1.2'
        FULLVER = "${VER}.${BUILD_NUMBER}.0"
        PROJECTPATH = "<project path>"
        TOOLVER = "2017"
    }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('Check Builds') {
            when {
                branch 'PR-*'
            }
            steps {
                // abort any queued/in-progress builds for PR branches
                cancelpreviousbuilds()
            }
        }
        stage('Checkout') {
            agent {
                label 'buildnode'
            }
            steps {
                checkout scm
            }
        }
    }
}
It works and aborts the build successfully, but I get errors related to missing environment variables because I had to start the pipeline with agent none and then set each stage to agent { label 'buildnode' }.
I would prefer that my entire pipeline run on the correct agent, so that workspaces and environment variables are set correctly, but I need a way to trigger the cancelpreviousbuilds() function without requiring the buildnode agent to be allocated first.
You can combine a scripted-pipeline prologue with a declarative pipeline in the same Jenkinsfile.
Example (note: I haven't tested it):
// this is scripted pipeline
node('master') { // use whatever node name or label you want
    stage('Cancel older builds') {
        cancel_old_builds()
    }
}

// this is declarative pipeline
pipeline {
    agent { label 'buildnode' }
    ...
}
As a small side comment: you use build.number.toInteger() != buildNumber, which would abort not only older builds but also newer ones. In our CI setup, we decided it is best to abort the current build when a newer build has already been scheduled.
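That inverted policy could look roughly like this (an untested sketch using the same Jenkins APIs as the function above; it aborts the current build instead of interrupting older ones):

```groovy
def call() {
    def buildNumber = env.BUILD_NUMBER.toInteger()
    def job = Jenkins.instance.getItemByFullName(env.JOB_NAME)
    // If any running build is newer than this one, give up early.
    def newerExists = job.builds.any { it.isBuilding() && it.number > buildNumber }
    if (newerExists) {
        error("Aborting build #${buildNumber}: a newer build is already running")
    }
}
```

The `error` step fails the current build immediately, so no later stages run and the newest build wins.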

Jenkins stage not triggered even if changeset is correct

My Jenkinsfile looks something like this:
options {
    skipDefaultCheckout()
}
stages {
    stage("Unit Test") {
        when {
            allOf {
                expression { params.REPOSITORY_CREDENTIALS != null && params.TRG_BRANCH != null }
                anyOf {
                    changeset pattern: "([a-zA-Z]*_cli|clarity_xml_tools)\\/.*", comparator: "REGEX"
                    changeset pattern: "jenkins\\/pipeline_scripts\\/.*", comparator: "REGEX"
                }
            }
        }
        steps {
            container('omics-build-agent') {
                echo "Stage triggered.."
            }
        }
        post {
            always {
                echo "Stage completed.."
            }
            success {
                echo "Unit Test completed successfully.."
            }
            failure {
                echo "Unit Test Failed!!"
            }
        }
    }
}
I checked the Changes in my Jenkins build info:
jenkins/pipeline_scripts/build-Build_Python-clarity_xml_tools.sh
But my stage is still not getting triggered. What am I missing?
Note: I cannot remove skipDefaultCheckout() because I have two different repos that should trigger my job when changes are pushed, and they should be checked out into two different sub-folders.
My pipeline gets triggered by the Generic Webhook Trigger plugin (which is working fine) from either of the two repos, so I had to suppress the default checkout and then use the gitSCM checkout step to get my repos inside the stage.
Could it be because of this?
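For reference, the described two-repo checkout could be sketched like this (the repo names, URLs, and branches are placeholders, not from the original job):

```groovy
steps {
    // Each repository goes into its own sub-folder of the workspace.
    dir('repo_one') {
        checkout([$class: 'GitSCM',
                  branches: [[name: '*/main']],
                  userRemoteConfigs: [[url: 'https://example.com/repo_one.git',
                                       credentialsId: params.REPOSITORY_CREDENTIALS]]])
    }
    dir('repo_two') {
        checkout([$class: 'GitSCM',
                  branches: [[name: '*/main']],
                  userRemoteConfigs: [[url: 'https://example.com/repo_two.git',
                                       credentialsId: params.REPOSITORY_CREDENTIALS]]])
    }
}
```

Note that the changeset condition relies on the changelog Jenkins records when it performs the checkout itself; with skipDefaultCheckout() and manual checkouts, the build may have no changeset data for the when condition to match.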

How to get jenkins multibranch pipeline last build revision?

I have a Jenkins pipeline and also use a shared library.
In my multibranch pipeline, three or four repos are cloned during the build using the Bitbucket plugin.
My question is: how do I get the SCM revisions from the previous build?
I have tried the currentBuild.changeSets approach, but it fails when multiple repositories are cloned.
I had to get SCM revisions from previous builds too. I didn't find any API to get them nicely, so I implemented a workaround. It is not great, but at least it works ;-)
When you save an environment variable using env.setProperty(name, value), it is stored in the build metadata as a build variable. You can read it at any moment.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    env.setProperty('MY_ENV', env.BUILD_NUMBER)
                    def previousBuild = currentBuild.previousBuild
                    if (previousBuild != null) {
                        echo previousBuild.buildVariables['MY_ENV'] // prints env.BUILD_NUMBER - 1
                    }
                }
            }
        }
    }
}
In your case you have 4 checkouts. I don't know how you clone your sources, so let's imagine you have a cloneRepo method that sets the GIT_COMMIT environment variable. Then you may use:
def previousBuild = currentBuild.previousBuild
if (previousBuild != null) {
    echo previousBuild.buildVariables['GIT_COMMIT_REPO_1']
    echo previousBuild.buildVariables['GIT_COMMIT_REPO_2']
    echo previousBuild.buildVariables['GIT_COMMIT_REPO_3']
    echo previousBuild.buildVariables['GIT_COMMIT_REPO_4']
}
cloneRepo(repo1)
env.setProperty('GIT_COMMIT_REPO_1', env.GIT_COMMIT)
cloneRepo(repo2)
env.setProperty('GIT_COMMIT_REPO_2', env.GIT_COMMIT)
cloneRepo(repo3)
env.setProperty('GIT_COMMIT_REPO_3', env.GIT_COMMIT)
cloneRepo(repo4)
env.setProperty('GIT_COMMIT_REPO_4', env.GIT_COMMIT)
If you use the checkout step, it returns a map of the environment variables it sets, so you may do:
def commitId = checkout(scm)['GIT_COMMIT']
env.setProperty('GIT_COMMIT_REPO_1', commitId)

Jenkins Pipeline - conditional execution with branch and 1 other parameter (manual)

We are deploying our application using a Jenkins pipeline like this -
pipeline {
    agent any
    stages {
        stage('Build For Development') {
            when { branch 'development' }
            steps {
                sh './bin/build.sh'
            }
        }
        stage('Build For Production') {
            when { branch 'master' }
            steps {
                sh './bin/copy_needed_auth.sh'
                sh './bin/build.sh'
            }
        }
    }
}
When a developer pushes code to Bitbucket, the application is deployed automatically; the branch determines the deployment strategy:
when { branch 'master' }
But we need a manual check for deploying to production (the master branch): when a developer merges code into the master branch, he will also set a tag or something similar, so that the Jenkins pipeline checks the branch plus this extra condition before deploying to production.
We are trying this -
when {
    branch 'master'
    tag: 'release-*'
}
But it's not working. Is there any other strategy to do that?
Use the following code to combine several when conditions:
when {
    allOf {
        branch 'master'
        tag "release-*"
    }
}
The related documentation is in the Pipeline Syntax reference, under the when directive.
