Trigger Jenkins Pipeline job on merge event for GitLab MR

This should be fairly basic, but when I research it I run into things like Gerrit triggers and the like, which seem way too complicated for something as simple as this.
I would like to do either something like this in the Job DSL script:
pipelineJob('deploy-game') {
    definition {
        environmentVariables {
            env('ENVIRONMENT', "${ENVIRONMENT}")
            keepBuildVariables(true)
        }
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://blabla.git')
                        credentials('gitlab-credentials')
                    }
                    branches('${gitlabsourcebranch}')
                }
            }
            scriptPath('path/to/this.jenkinsfile')
        }
        triggers {
            gitlabPush {
                buildOnMergeRequestEvents(true)
                if ($gitlabMergeRequestState == 'merged') // this part
            }
        }
    }
}
Or, trigger on all MR events, and then filter out in the pipeline script:
pipeline {
    agent none
    environment {
        ENVIRONMENT = "${ENVIRONMENT}"
    }
    triggers {
        $gitlabMergeRequestState == 'merged' // this one
    }
    stages {
        stage('do-stuff') {
            agent {
                label 'agent'
            }
            steps {
                sh 'some commands ...'
            }
        }
    }
}
How do I do this?

So this is how it should be; I hope this is what you are looking for.
pipelineJob('Job_Name') {
    definition {
        cpsScm {
            lightweight(true)
            triggers {
                gitlabPush {
                    buildOnMergeRequestEvents(true) // trigger a build when an MR is opened
                    buildOnPushEvents(true)
                    commentTrigger('retry a build') // writing this comment on the MR in GitLab also triggers a build
                    enableCiSkip(true)
                    rebuildOpenMergeRequest('source')
                    skipWorkInProgressMergeRequest(false)
                    targetBranchRegex('.*master.*|.*release.*') // only pushes to master or release trigger a Jenkins build; a normal feature-branch push does not trigger a build until an MR is opened
                }
            }
            configure {
                it / triggers / 'com.dabsquared.gitlabjenkins.GitLabPushTrigger' << secretToken('ADD_TOKEN_FROM_JENKINS_JOB')
            }
            scm {
                git {
                    remote {
                        credentials('ID')
                        url("git@URL.git")
                    }
                    branch("refs/heads/master")
                }
            }
            scriptPath("jenkinsfile")
        }
    }
}
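If you also need the "only when the MR is actually merged" part of the question, one option is to accept all MR events from the trigger and filter inside the Jenkinsfile. Below is a minimal sketch, assuming the GitLab plugin injects the gitlabActionType and gitlabMergeRequestState environment variables into triggered builds (the question already references gitlabMergeRequestState; check the gitlab-plugin documentation for the exact variable names your version provides):
pipeline {
    agent none
    stages {
        stage('deploy') {
            agent { label 'agent' }
            when {
                // only proceed for merge-request events whose state is 'merged';
                // these variable names are assumptions based on the gitlab-plugin docs and may differ per version
                expression { env.gitlabActionType == 'MERGE' && env.gitlabMergeRequestState == 'merged' }
            }
            steps {
                sh 'some commands ...'
            }
        }
    }
}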

Related

Conditional post section in Jenkins pipeline

Say I have a simple Jenkins pipeline file as below:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh ...
            }
        }
        stage('Build') {
            steps {
                sh ...
            }
        }
        stage('Publish') {
            when {
                buildingTag()
            }
            steps {
                sh ...
                send_slack_message("Built tag")
            }
        }
    }
    post {
        failure {
            send_slack_message("Error building tag")
        }
    }
}
Since there are a lot of non-tag builds every day, I don't want to send any Slack messages about non-tag builds. But for tag builds, I want to send either a success message or a failure message, regardless of which stage failed.
So for the above example, I want:
When it's a tag build and the 'Test' stage fails, I should see an "Error building tag" message. (This works in the example.)
When it's a tag build and all stages succeed, I should see a "Built tag" message. (This also works in the example.)
When it's not a tag build, no Slack message should ever be sent. (This is not the case in the example: when the 'Test' stage fails, there will still be an "Error building tag" message.)
As far as I know, there's no such thing as a "conditional post section" in Jenkins Pipeline syntax, which could really help me out here. So my question is: is there any other way I can do this?
post {
    failure {
        script {
            if (isTagBuild) {
                send_slack_message("Error building tag")
            }
        }
    }
}
where isTagBuild is whatever mechanism you have for differentiating between tag and non-tag builds.
You could also apply the same logic and move send_slack_message("Built tag") down to a success post block.
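For instance, here is a minimal sketch of both post conditions, assuming a multibranch pipeline where Jenkins sets the TAG_NAME environment variable on tag builds (used here as the isTagBuild check; adapt it to however you detect tag builds):
post {
    success {
        script {
            // env.TAG_NAME is only set for tag builds in multibranch pipelines
            if (env.TAG_NAME) {
                send_slack_message("Built tag")
            }
        }
    }
    failure {
        script {
            if (env.TAG_NAME) {
                send_slack_message("Error building tag")
            }
        }
    }
}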
In the post section you can also use a script step with an if inside it; within that if you can, for example, call the emailext plugin.
Well, for those who just want some copy-pastable code, here's what I ended up with based on @eez0's answer.
pipeline {
    agent any
    environment {
        BUILDING_TAG = 'no'
    }
    stages {
        stage('Setup') {
            when {
                buildingTag()
            }
            steps {
                script {
                    BUILDING_TAG = 'yes'
                }
            }
        }
        stage('Test') {
            steps {
                sh ...
            }
        }
        stage('Build') {
            steps {
                sh ...
            }
        }
        stage('Publish') {
            when {
                buildingTag()
            }
            steps {
                sh ...
            }
        }
    }
    post {
        failure {
            script {
                if (BUILDING_TAG == 'yes') {
                    slackSend(color: '#dc3545', message: "Error publishing")
                }
            }
        }
        success {
            script {
                if (BUILDING_TAG == 'yes') {
                    slackSend(color: '#28a745', message: "Published")
                }
            }
        }
    }
}
As you can see, I'm relying on Jenkins' built-in buildingTag() condition to sort things out, using an environment variable as a "bridge". I'm not very good at Jenkins Pipeline, so please leave comments if you have any suggestions.

Run Jenkins jobs in parallel

I have 3 different jobs (Build, Undeploy and Deploy); I want to execute Build and Undeploy in parallel and, after that, Deploy.
From searching I learned that the Build Flow Plugin has been deprecated.
Please suggest a plugin.
You can write a Jenkinsfile in the following format:
pipeline {
    agent { node { label 'master' } }
    stages {
        stage('Build/Undeploy') {
            parallel {
                stage('Build') {
                    agent { node { label 'Build' } }
                    steps {
                        script {
                            // Call your build script
                        }
                    }
                }
                stage('Undeploy') {
                    agent { node { label 'Undeploy' } }
                    steps {
                        script {
                            // Call your undeploy script
                        }
                    }
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    // Call your deploy script
                }
            }
        }
    }
}
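Since the question is about three existing Jenkins jobs rather than inline scripts, another option is to call them with the build step from a small orchestrating pipeline. A minimal sketch, assuming the jobs are literally named Build, Undeploy and Deploy (adjust the names and add parameters as needed):
pipeline {
    agent any
    stages {
        stage('Build/Undeploy') {
            parallel {
                stage('Build') {
                    steps {
                        build job: 'Build' // triggers the existing Build job and waits for it
                    }
                }
                stage('Undeploy') {
                    steps {
                        build job: 'Undeploy'
                    }
                }
            }
        }
        stage('Deploy') {
            steps {
                build job: 'Deploy'
            }
        }
    }
}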

Jenkinsfile - How to make a cron trigger kick off only a specific stage?

I have a Jenkinsfile with the following triggers:
triggers {
    cron('0 * * * 1-5')
}
So it will trigger at the top of the hour, every hour, Monday through Friday.
In the Jenkinsfile I have a number of stages like:
stage('CI Build and push snapshot') {
    when {
        anyOf { branch 'PR-*'; branch 'develop' }
    }
    .
    .
    .
stage('Build Release') {
    when {
        branch 'master'
    }
    .
    .
    .
stage('Integration Tests') {
    when {
        ? // not sure what goes here
    }
What I want is that, when that trigger kicks off a build, only the Integration Tests stage runs. How do I achieve this? I think with what I have now, every stage is going to run.
Thanks!
I was able to get it working using something like:
stage('CI Build and push snapshot') {
    when {
        anyOf { branch 'PR-*'; branch 'develop' }
        not {
            expression { return currentBuild.rawBuild.getCause(hudson.triggers.TimerTrigger$TimerTriggerCause) }
        }
    }
}
stage('Integration Tests') {
    when {
        branch 'develop'
        expression { return currentBuild.rawBuild.getCause(hudson.triggers.TimerTrigger$TimerTriggerCause) }
    }
}
Note that this uses shared-library functions and scripted syntax (not declarative), so you will need to use script {} blocks to implement it.
For organisation purposes, I put this into its own function in a shared-library file called jobCauses.groovy under /vars; you can keep it inline if you like, or put it at the bottom of the Jenkinsfile, etc.
/**
 * Checks if job cause is Cron
 *
 * @return boolean
 */
boolean hasTriggeredCause() {
    List jobCauses = currentBuild.rawBuild.getCauses().collect { it.getClass().getCanonicalName().tokenize('.').last() }
    return jobCauses.contains('TimerTriggerCause')
}
Then in your pipeline:
stage('Integration Tests') {
    script {
        if ( jobCauses.hasTriggeredCause() ) {
            // do the thing
        }
    }
}
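If you are on a reasonably recent Pipeline version and using declarative syntax, the built-in triggeredBy when condition may be a simpler alternative to inspecting rawBuild causes. A sketch, assuming your Declarative Pipeline version supports that condition:
stage('Integration Tests') {
    when {
        branch 'develop'
        triggeredBy 'TimerTrigger' // only run this stage when the cron trigger started the build
    }
    steps {
        // do the thing
    }
}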

DSL seed job for multibranch pipeline with Bitbucket Branch Source plugin: suppress auto build of branches

I have a DSL job to create multibranch pipeline jobs in Jenkins, running Jenkins 2.107.1 with these plugins: 'Branch API Plugin' 2.0.18 and 'Bitbucket Branch Source Plugin' 2.2.10.
I'm unable to find a proper configuration function to enable the "Suppress automatic SCM triggering" property; please help.
Here is my job, which works, but it just triggers a build as soon as it scans the branches:
multibranchPipelineJob("job") {
    configure {
        it / sources / data / 'jenkins.branch.BranchSource' / source(class: 'com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource') {
            credentialsId('..')
            id("..")
            checkoutCredentialsId("..")
            repoOwner("owner")
            repository("my-repo")
            includes()
            excludes("PR-*")
        }
    }
}
This is how it works now... with the help of the following source code:
https://github.com/jenkinsci/bitbucket-branch-source-plugin
multibranchPipelineJob("job") {
    branchSources {
        branchSource {
            source {
                bitbucket {
                    credentialsId("myid")
                    repoOwner("iam")
                    repository("job")
                    traits {
                        headWildcardFilter {
                            includes("branchestoinclude")
                            excludes("toexclude")
                        }
                    }
                }
            }
            strategy {
                defaultBranchPropertyStrategy {
                    props {
                        // keep only the last 8 builds
                        buildRetentionBranchProperty {
                            buildDiscarder {
                                logRotator {
                                    daysToKeepStr("-1")
                                    numToKeepStr("8")
                                    artifactDaysToKeepStr("-1")
                                    artifactNumToKeepStr("-1")
                                }
                            }
                        }
                    }
                }
            }
        }
    }
    // Branch behaviour
    configure {
        def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
        traits << 'com.cloudbees.jenkins.plugins.bitbucket.BranchDiscoveryTrait' {
            strategyId(3) // detect all branches - refer to the plugin source code for the various options
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(8)
        }
    }
}
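For the "Suppress automatic SCM triggering" behaviour the question actually asks about, the Branch API plugin contributes a branch property for it. Depending on your plugin versions it may be exposed in the generated Job DSL as noTriggerBranchProperty(); that name is an assumption here, so verify it in your instance's Job DSL API viewer. A sketch of how the strategy block above would change:
strategy {
    defaultBranchPropertyStrategy {
        props {
            // assumed name for the "Suppress automatic SCM triggering" branch property;
            // confirm it in your Jenkins instance's Job DSL API viewer
            noTriggerBranchProperty()
            buildRetentionBranchProperty {
                buildDiscarder {
                    logRotator {
                        daysToKeepStr("-1")
                        numToKeepStr("8")
                        artifactDaysToKeepStr("-1")
                        artifactNumToKeepStr("-1")
                    }
                }
            }
        }
    }
}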

Job DSL to create "Pipeline" type job

I have installed the Pipeline Plugin, which used to be called the Workflow Plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
    definition {
        cps {
            script('logic-here')
            sandbox()
        }
    }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo 'logic'
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
or load it from a file located in the workspace:
pipelineJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
            sandbox()
        }
    }
}
Example:
Seed-job file structure:
jobs
\- productJob.groovy
logic
\- productPipeline.jenkinsfile
Then the content of productJob.groovy is:
pipelineJob('product-job') {
    definition {
        cps {
            script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
            sandbox()
        }
    }
}
I believe this question is asking how to use the Job DSL to create a pipeline job that references the project's Jenkinsfile, rather than combining job creation with the detailed step definitions as the answers to date have done. This makes sense: Jenkins job creation and metadata configuration (description, triggers, etc.) can belong to Jenkins admins, while the dev team keeps control over what the job actually does.
@meallhour, is the below what you're after? (Works as of Job DSL 1.64.)
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { } // required, as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it
                // only covers the 'master' branch and may not give you
                // enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
        }
    }
}
See the Job DSL pipelineJob reference: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
This example worked for me. Here's another example based on it:
pipelineJob('Your App Pipeline') {
    def repo = 'https://github.com/user/yourApp.git'
    def sshRepo = 'git@git.company.com:user/yourApp.git'
    description("Your App Pipeline")
    keepDependencies(false)
    properties {
        githubProjectUrl(repo)
        rebuild {
            autoRebuild(false)
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(sshRepo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { } // required, as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
}
If you build the pipeline first through the UI, you can use the config.xml file and the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
In Job DSL, pipeline is still called workflow, see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
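For reference, a minimal sketch using the older workflowJob name, which mirrors the pipelineJob structure shown above (this assumes a Job DSL version from before the rename; newer versions should simply use pipelineJob):
workflowJob('job-name') {
    definition {
        cps {
            // inline workflow/pipeline script, same idea as with pipelineJob
            script('node { echo "logic" }')
        }
    }
}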
First you need to install the Job DSL plugin, then create a freestyle project in Jenkins and select "Process Job DSLs" from the dropdown in the build section.
Select "Use the provided DSL script" and provide the following script.
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage name 1') {
                            steps {
                                // your logic here
                            }
                        }
                        stage('Stage name 2') {
                            steps {
                                // your logic here
                            }
                        }
                    }
                }
            ''')
        }
    }
}
Or you can create your job by pointing to a Jenkinsfile located in a remote Git repository.
pipelineJob("job-name") {
definition {
cpsScm {
scm {
git {
remote {
url("<REPO_URL>")
credentials("<CREDENTIAL_ID>")
}
branch('<BRANCH>')
}
}
scriptPath("<JENKINS_FILE_PATH>")
}
}
}
If you are using a Git repo, add a file called Jenkinsfile at the root of the repo; it should contain your pipeline definition.
