Bitbucket webhooks to trigger Jenkins project - jenkins

I have a simple Jenkins pipeline job that does a bunch of things and calls other jobs. There is no repo associated with the job. But this job should be called when a pull request is created in a certain repo in Bitbucket. This was easy with Gitlab where I just had to add a webhook with the Jenkins job url on Gitlab. How do I achieve this with Bitbucket? It looks like it always needs a repo url in Jenkins for the webhook to be triggered but I have no repo.
For example, my pipeline job on Jenkins is:
node {
    stage('Stage 1') {
        echo "hello, world!"
    }
}
I want to trigger this build when a PR is created on Bitbucket for repo xyz.
Or, in general, how do I make Jenkins pipeline jobs (with an inline pipeline script) work with Bitbucket webhooks? Everything I find covers either freestyle jobs, multibranch jobs, or pipeline jobs with a Jenkinsfile.

For this, you can use the Generic Webhook Trigger Plugin. The job will be identified by the token you add to it. Here is a sample Jenkins pipeline showing how you can extract information from the webhook request to determine whether it is coming from a PR.
pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'PR_ID', value: '$.pullrequest.id', defaultValue: 'null'],
                [key: 'PR_TYPE', value: '$.pullrequest.type', defaultValue: 'null'],
                [key: 'PR_TITLE', value: '$.pullrequest.title', defaultValue: 'null'],
                [key: 'PR_STATE', value: '$.pullrequest.state', defaultValue: 'null'],
                [key: 'PUSH_DETAILS', value: '$.push', defaultValue: 'null']
            ],
            causeString: 'Triggered By Bitbucket',
            token: '12345678',
            tokenCredentialId: '',
            printContributedVariables: true,
            printPostContent: true,
            silentResponse: false
        )
    }
    stages {
        stage('ProcessWebHook') {
            steps {
                script {
                    echo "Received a Webhook Request from Bitbucket."
                    if (PR_ID != "null") {
                        echo "Received a PR with the following details"
                        echo "PR_ID: $PR_ID ; PR_TYPE: $PR_TYPE ; PR_TITLE: $PR_TITLE ; PR_STATE: $PR_STATE"
                        // If the PR state is either MERGED or DECLINED we have to do some cleanup
                        if (PR_STATE == "DECLINED" || PR_STATE == "MERGED") {
                            // Do your cleanup here; you have all the PR details to figure out what to clean
                            echo "Cleaning UP!!!!!!"
                        }
                    } else if (PUSH_DETAILS != "null") {
                        echo "This is a commit."
                    }
                }
            }
        }
    }
}
Then in Bitbucket, add a repository webhook pointing at a URL like the one below.
JENKINS_URL/generic-webhook-trigger/invoke?token=12345678
You can read more about the webhook messages Bitbucket will send here.
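If you want the job to fire only for pull request events (rather than for every event the webhook delivers), the plugin's regexp filter can be applied to one of the contributed variables. A minimal sketch reusing the variables above; the expected PR_STATE value of OPEN is an assumption about the Bitbucket Cloud payload, so verify it against the printed post content:
triggers {
    GenericTrigger(
        genericVariables: [
            [key: 'PR_ID', value: '$.pullrequest.id', defaultValue: 'null'],
            [key: 'PR_STATE', value: '$.pullrequest.state', defaultValue: 'null']
        ],
        token: '12345678',
        // Only build when the resolved PR_STATE matches OPEN; all other requests are ignored.
        regexpFilterText: '$PR_STATE',
        regexpFilterExpression: 'OPEN',
        printContributedVariables: true,
        printPostContent: true
    )
}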

Related

Trigger Sonar Jenkins job from another Jenkins job

I want to create a process in Jenkins where, when one job is building, it internally calls another job that generates a SONAR report for the same code pull request.
I am trying to call the following API to trigger the Jenkins job automatically:
https://jenkins.com/job/DPNew/job/xyz/buildWithParameters?token=DW&FROM_HASH=195c8df91791768f3098ce260eb2dd8728&REPO_NAME=_python&PROJECT_KEY=%7Eabc&EMAIL=abc#gmail.com&FROM_BRANCH_NAME=feature%2FDO-451&TO_BRANCH_NAME=Port-2.7&PR_ID=622"
I am getting the below error in the response:
content: "<html><head><body style='background-color:white;
color:white;'>\n\n\nAuthentication required\n<!--\nYou are authenticated as: anonymous\nGroups that you are in:\n \nPermission you need to have (but didn't):
hudson.model.Hudson.Read\n ... which is implied by: hudson.security.Permission.GenericRead\n ... which is implied by:
hudson.model.Hudson.Administer\n-->\n\n</body></html>
I have already created a Jenkins API token under 'user -> configure'.
Edit 1:
The first Jenkins job is triggered by a pull request from Bitbucket, and the Bitbucket UI shows whether the build is successful; if the build succeeds, it shows a Sonar report.
What should I do to resolve this issue?
Instead of an API call, use the build step to call the job. This also lets you use parameters from your Sonar report job and display/use them here.
Example:
pipeline {
    agent any
    stages {
        stage('stage_name') {
            steps {
                build job: 'JOB_NAME'
            }
        }
    }
}
Or with parameters:
pipeline {
    agent any
    stages {
        stage('stage_name') {
            steps {
                build job: 'JOB_NAME_HERE', propagate: true, parameters: [
                    [
                        $class: 'StringParameterValue',
                        name: 'STRING_NAME_HERE',
                        value: "STRING_VALUE_HERE"
                    ]
                ]
            }
        }
    }
}
propagate: true means that the original job will fail if JOB_NAME_HERE fails.
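If you would rather not fail the calling job automatically, you can set propagate: false and inspect the downstream result yourself. A minimal sketch (JOB_NAME_HERE is the same placeholder as above):
script {
    // With propagate: false the build step returns even if the downstream job fails.
    def downstream = build job: 'JOB_NAME_HERE', propagate: false, wait: true
    echo "Downstream result: ${downstream.result}"
    if (downstream.result != 'SUCCESS') {
        unstable("Downstream job did not succeed")
    }
}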

How to trigger a build on commit to branch in scripted pipeline

Will triggers { pollSCM('H */4 * * 1-5') } work for a merge to a branch?
I see we have 3 options for triggers, but these seem to be for declarative pipelines:
cron
pollSCM
upstream
Whereas for a scripted pipeline, is it something like this?
properties([pipelineTrigger([triggers('gitPush')])])
OR
properties([pipelineTriggers([githubPush()])]) // With this, should I also enable an option on the Jenkins instance?
You can also use the Generic Webhook Trigger plugin.
You will need to create a webhook in GitHub and add something like the below to your Jenkins pipeline.
triggers {
    GenericTrigger(
        genericVariables: [
            [defaultValue: '', key: 'commit_author', regexpFilter: '', value: '$.pusher.name'],
            [defaultValue: '', key: 'ref', regexpFilter: '', value: '$.ref']
        ],
        causeString: '$commit_author committed to $ref',
        printContributedVariables: false,
        printPostContent: false,
        regexpFilterText: '$ref',
        regexpFilterExpression: 'refs/heads/develop',
        token: '12345'
    )
}
Hope this helps.
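Since the question asks about scripted pipeline: the same trigger can also be registered from a scripted pipeline via the properties step. A rough, untested sketch (the GenericTrigger symbol and its parameters are the same as above; verify against the plugin docs for your version):
node {
    // Register the webhook trigger as a job property (scripted equivalent of the triggers {} block).
    properties([
        pipelineTriggers([
            GenericTrigger(
                genericVariables: [
                    [key: 'ref', value: '$.ref']
                ],
                token: '12345',
                regexpFilterText: '$ref',
                regexpFilterExpression: 'refs/heads/develop'
            )
        ])
    ])
    stage('Build') {
        echo "Triggered for ${env.ref}"
    }
}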
Use something like this in your Jenkinsfile. Use only the options you need and remove the ones you don't want.
pipeline {
    agent any
    triggers {
        github(
            triggerOnPush: true,
            triggerOnMergeRequest: true,
            triggerOpenMergeRequestOnPush: "source",
            branchFilterType: "All",
            triggerOnNoteRequest: false)
    }
    stages {
        ...
    }
}
NOTE: Make sure you have done all the webhook configuration between GitHub and Jenkins, and installed the webhook plugin on Jenkins.
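To answer the scripted-pipeline part of the question directly: the properties/pipelineTriggers form guessed at above is the right approach, and githubPush() is a valid trigger there when the GitHub plugin is installed. For example, the pollSCM trigger from the question can be registered in a scripted pipeline like this (a minimal sketch):
// Scripted pipeline: register triggers as job properties (they take effect after the first run).
properties([
    pipelineTriggers([
        pollSCM('H */4 * * 1-5')
    ])
])

node {
    stage('Build') {
        echo 'Triggered by an SCM change (push/merge to the tracked branch)'
    }
}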

Trigger builds remotely (e.g., from scripts) syntax in Jenkinsfile

I am using this option in my freestyle jobs, but now my team is moving to a standard format, so I have to rewrite all my freestyle jobs as pipeline scripts. I googled a lot but could not figure out how to express this option in a pipeline script.
You can trigger remote Jenkins jobs using triggerRemoteJob pipeline step.
Documentation: https://jenkins.io/doc/pipeline/steps/Parameterized-Remote-Trigger/
And here is a short example that illustrates how to use this step with authentication. I used a Jenkins user token for authentication; the token and user name were stored in the Jenkins credentials with ID xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx (obfuscated ID, of course). The remote job in the example below is triggered with a single parameter foo == qwe123, and the step is configured to wait until the remote job completes (blockBuildUntilComplete: true); because shouldNotFailBuild is set to true, the triggering job will not fail even if the remote job does.
pipeline {
    agent any
    stages {
        stage("Execute remote job") {
            steps {
                script {
                    def jobUrl = "https://remote-jenkins-host/job/remote-job-to-trigger/"
                    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', usernameVariable: 'USERNAME', passwordVariable: 'TOKEN']]) {
                        def handle = triggerRemoteJob job: jobUrl,
                            blockBuildUntilComplete: true,
                            shouldNotFailBuild: true,
                            parameters: "foo=qwe123",
                            auth: TokenAuth(apiToken: env.TOKEN, userName: env.USERNAME)
                        echo "Remote tests status: ${handle.buildStatus.toString()}"
                    }
                }
            }
        }
    }
}
Hope it helps.
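If you do not need to wait for the remote job, the same step works fire-and-forget style by turning off blockBuildUntilComplete. A sketch reusing the placeholder URL and credentials ID from above:
withCredentials([usernamePassword(credentialsId: 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
                                  usernameVariable: 'USERNAME', passwordVariable: 'TOKEN')]) {
    // Trigger the remote job and continue immediately without waiting for its result.
    triggerRemoteJob job: 'https://remote-jenkins-host/job/remote-job-to-trigger/',
        blockBuildUntilComplete: false,
        parameters: 'foo=qwe123',
        auth: TokenAuth(apiToken: env.TOKEN, userName: env.USERNAME)
}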

Append to Job properties

My job parameters defined in job-dsl.groovy are overwritten by those defined in the pipeline.
I am using the job-dsl-plugin and a Jenkins pipeline to generate a Jenkins job for each Git branch. Since my code is stored in GitLab, the jobs require GitLab integration, which I am providing with the gitlab-plugin. The problem is with 'gitLabConnection': it looks like it can only be applied from inside the Jenkins pipeline.
So if in job-dsl I would do:
branches.each { branch ->
    String safeBranchName = branch.name.replaceAll('/', '-')
    if (safeBranchName ==~ "^release.*") {
        return
    }
    def branch_folder = "${basePath}/${safeBranchName}"
    folder branch_folder
    pipelineJob("$branch_folder/build") {
        logRotator {
            numToKeep 20
        }
        parameters {
            stringParam("BRANCH_NAME", "${safeBranchName}", "")
            stringParam("PROJECT_NAME", "${basePath}", "")
        }
    }
}
And then in my Jenkins pipeline I would add the 'gitLabConnection'
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal')
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
I have to do it like:
node('node_A') {
    properties([
        gitLabConnection('gitlab.internal'),
        parameters([
            string(name: 'BRANCH_NAME', defaultValue: BRANCH_NAME, description: ''),
            string(name: 'PROJECT_NAME', defaultValue: PROJECT_NAME, description: '')
        ])
    ])
    stage('clean up') {
        deleteDir()
    }
    ///(...)
So that my BRANCH_NAME and PROJECT_NAME are not overwritten.
Is there another way to tackle this?
Is it possible to append 'gitLabConnection('gitlab.internal')' to the properties in the Jenkins pipeline?
Unfortunately it doesn't seem like there is a way to do this yet. There's some discussion about this at https://issues.jenkins-ci.org/browse/JENKINS-43758, and I may end up opening a feature request to allow people to "append to properties".
There are two ways of solving this. The first one uses only Jenkins pipeline code, but if you choose this path the initial job run will most likely fail. This initial failure happens because the pipeline only creates the Jenkins job parameters during the first run; once the parameters are created, the job will work.
Option '1' - using Jenkins pipeline Only.
In 'Pipeline Syntax'/'Snippet Generator' check:
'This project is parameterised'.
Add the parameter(s) you need and hit 'Generate Pipeline Script'. In my case I get:
properties([
    gitLabConnection(gitLabConnection: 'my_gitlab_connection', jobCredentialId: '', useAlternativeCredential: false),
    [$class: 'JobRestrictionProperty'],
    parameters([
        string(defaultValue: 'test', description: 'test', name: 'test', trim: false)
    ]),
    throttleJobProperty(categories: [], limitOneJobWithMatchingParams: false, maxConcurrentPerNode: 0, maxConcurrentTotal: 0, paramsToUseForLimit: '', throttleEnabled: false, throttleOption: 'project')
])
Option '2' - more complicated, but also far more powerful. This is the one I finally took, because of the issues described above.
Use Jenkins job DSL plugin - https://github.com/jenkinsci/job-dsl-plugin
Gitlab plugin works quite well with it https://github.com/jenkinsci/gitlab-plugin#declarative-pipeline-jobs
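To sketch option 2: the idea is to declare both the job parameters and the GitLab connection from the seed job, so the pipeline no longer needs to call properties() at all. The nested gitLabConnection block name below is an assumption; the generated Job DSL depends on your gitlab-plugin version, so confirm the exact syntax in the Job DSL API viewer (JENKINS_URL/plugin/job-dsl/api-viewer/index.html) or the gitlab-plugin README linked above:
pipelineJob("$branch_folder/build") {
    parameters {
        stringParam("BRANCH_NAME", "${safeBranchName}", "")
        stringParam("PROJECT_NAME", "${basePath}", "")
    }
    properties {
        // Block name is an assumption; check the generated Job DSL API on your instance.
        gitLabConnection {
            gitLabConnection('gitlab.internal')
        }
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
}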

How can I trigger another job from a jenkins pipeline (jenkinsfile) with GitHub Org Plugin?

How can I trigger build of another job from inside the Jenkinsfile?
I assume that this job is another repository under the same github organization, one that already has its own Jenkins file.
I also want to do this only if the branch name is master, as it doesn't make sense to trigger downstream builds of any local branches.
Update:
stage 'test-downstream'
node {
def job = build job: 'some-downtream-job-name'
}
Still, when executed I get an error
No parameterized job named some-downtream-job-name found
I am sure that this job exists in jenkins and is under the same organization folder as the current one. It is another job that has its own Jenkinsfile.
Please note that this question is specific to the GitHub Organization Plugin which auto-creates and maintains jobs for each repository and branch from your GitHub Organization.
In addition to the above-mentioned answers: I wanted to start a job with a simple parameter passed to a second pipeline and found the answer at http://web.archive.org/web/20160209062101/https://dzone.com/refcardz/continuous-delivery-with-jenkins-workflow
So I used:
stage('Starting ART job') {
    build job: 'RunArtInTest', parameters: [[$class: 'StringParameterValue', name: 'systemname', value: systemname]]
}
First of all, it is a waste of an executor slot to wrap the build step in node. Your upstream executor will just be sitting idle for no reason.
Second, from a multibranch project, you can use the environment variable BRANCH_NAME to make logic conditional on the current branch.
Third, the job parameter takes an absolute or relative job name. If you give a name without any path qualification, that would refer to another job in the same folder, which in the case of a multibranch project would mean another branch of the same repository.
Thus what you meant to write is probably
if (env.BRANCH_NAME == 'master') {
    build '../other-repo/master'
}
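As a side note: if you do not want the upstream build to block while the downstream branch builds, the build step also accepts wait: false, for example:
if (env.BRANCH_NAME == 'master') {
    // Fire and forget: do not wait for the downstream build to finish.
    build job: '../other-repo/master', wait: false
}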
You can use the build job step from Jenkins Pipeline (Minimum Jenkins requirement: 2.130).
Here's the full API for the build step: https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
How to use build:
job: Name of a downstream job to build. May be another Pipeline job, but more commonly a freestyle or other project.
Use a simple name if the job is in the same folder as this upstream Pipeline job;
You can instead use relative paths like ../sister-folder/downstream
Or you can use absolute paths like /top-level-folder/nested-folder/downstream
Trigger another job using a branch as a param
At my company many of our branches include "/". You must replace any instances of "/" with "%2F" (as it appears in the URL of the job).
In this example we're using relative paths
stage('Trigger Branch Build') {
    steps {
        script {
            echo "Triggering job for branch ${env.BRANCH_NAME}"
            BRANCH_TO_TAG = env.BRANCH_NAME.replace("/", "%2F")
            build job: "../my-relative-job/${BRANCH_TO_TAG}", wait: false
        }
    }
}
Trigger another job using build number as a param
build job: 'your-job-name',
    parameters: [
        string(name: 'passed_build_number_param', value: String.valueOf(BUILD_NUMBER)),
        string(name: 'complex_param', value: 'prefix-' + String.valueOf(BUILD_NUMBER))
    ]
Trigger many jobs in parallel
Source: https://jenkins.io/blog/2017/01/19/converting-conditional-to-pipeline/
More info on Parallel here: https://jenkins.io/doc/book/pipeline/syntax/#parallel
stage('Trigger Builds In Parallel') {
    steps {
        // Freestyle build trigger calls a list of jobs
        // Pipeline build() step only calls one job
        // To run all three jobs in parallel, we use the "parallel" step
        // https://jenkins.io/doc/pipeline/examples/#jobs-in-parallel
        parallel(
            linux: {
                build job: 'full-build-linux', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            mac: {
                build job: 'full-build-mac', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            windows: {
                build job: 'full-build-windows', parameters: [string(name: 'GIT_BRANCH_NAME', value: env.BRANCH_NAME)]
            },
            failFast: false)
    }
}
Or alternatively:
stage('Build A and B') {
    failFast true
    parallel {
        stage('Build A') {
            steps {
                build job: "/project/A/${env.BRANCH}", wait: true
            }
        }
        stage('Build B') {
            steps {
                build job: "/project/B/${env.BRANCH}", wait: true
            }
        }
    }
}
The build command in Pipeline is there to trigger other jobs in Jenkins.
Example on GitHub
The job must exist in Jenkins and can be parameterized.
As for the branch, I guess you can read it from Git.
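For example, a rough way to read the current branch inside the pipeline and pass it along (env.BRANCH_NAME is only set for multibranch jobs, so the git fallback and the 'other-job' name are illustrative assumptions):
script {
    // Prefer the multibranch-provided variable, otherwise ask git directly.
    def branch = env.BRANCH_NAME ?: sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
    echo "Current branch: ${branch}"
    build job: 'other-job', parameters: [string(name: 'GIT_BRANCH_NAME', value: branch)]
}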
Use the build step for this task in order to trigger other jobs from the Jenkinsfile.
You can add a variety of logic to your execution, such as parallel, node, and agent options, and steps for triggering external jobs. Here are some easy-to-read cookbook examples.
1. Example of triggering an external job from a Jenkinsfile, with a conditional:
if (env.BRANCH_NAME == 'master') {
    build job: 'exactJobName', parameters: [
        string(name: 'keyNameOfParam1', value: 'valueOfParam1'),
        booleanParam(name: 'keyNameOfParam2', value: 'valueOfParam2')
    ]
}
2. Example of triggering multiple jobs from a Jenkinsfile, with conditionals:
def jobs = [
    'job1Title': {
        if (env.BRANCH_NAME == 'master') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam1', value: 'valueNameOfParam1'),
                booleanParam(name: 'keyNameOfParam2', value: 'valueNameOfParam2')
            ]
        }
    },
    'job2Title': {
        if (env.GIT_COMMIT == 'someCommitHashToPerformAdditionalTest') {
            build job: 'exactJobName', parameters: [
                string(name: 'keyNameOfParam3', value: 'valueOfParam3'),
                booleanParam(name: 'keyNameOfParam4', value: 'valueNameOfParam4'),
                booleanParam(name: 'keyNameOfParam5', value: 'valueNameOfParam5')
            ]
        }
    }
]
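Presumably the jobs map is then executed with the parallel step (my assumption about the intended usage):
// Run both conditional triggers concurrently.
parallel jobs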
