I'm trying to use a GitLab webhook to trigger a job in Jenkins after pushing a commit or opening a merge request, using a pipeline script.
For some reason, Jenkins always checks out the master branch and builds it. How
can I specify which branch to build using the Groovy script?
I tried to use the environment variable from the Gitlab POST request, but it still always uses the master branch:
checkout changelog: false, poll: false, scm: [$class: 'GitSCM',
    branches: [[name: 'origin/${env.gitlabSourceBranch}']],
    browser: [$class: 'GitLab', repoUrl: 'some-git-repo.com', version: '9.0'],
    doGenerateSubmoduleConfigurations: false,
    extensions: [
        [$class: 'SubmoduleOption', disableSubmodules: false, parentCredentials: true, recursiveSubmodules: true, reference: '', trackingSubmodules: false],
        [$class: 'PreBuildMerge', options: [fastForwardMode: 'FF', mergeRemote: '', mergeTarget: 'origin/${env.gitlabTargetBranch}']]
    ],
    submoduleCfg: [],
    userRemoteConfigs: [[credentialsId: '12345', url: 'git@some-git-repo.com:A/repo.git']]]
(I generated this command using the snippet generator)
You can use the "Generic Webhook Trigger" plugin in Jenkins.
It allows you to get the branch name from the POST request sent by Gitlab, map it as a variable and use it inside your pipeline.
Configuration:
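A rough sketch of the trigger configuration, assuming the plugin's pipeline syntax and a GitLab push payload (the token value is a placeholder; the GitLab webhook would point at <jenkins-url>/generic-webhook-trigger/invoke?token=...):
triggers {
    GenericTrigger(
        // map the "ref" field of the GitLab push payload to a variable,
        // stripping the refs/heads/ prefix so $branch is just the branch name
        genericVariables: [
            [key: 'branch', value: '$.ref', regexpFilter: 'refs/heads/']
        ],
        token: 'my-webhook-token',
        causeString: 'Triggered on $branch',
        printContributedVariables: true,
        printPostContent: true
    )
}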
Inside the pipeline:
stage('Clone sources') {
steps {
git credentialsId: '...', poll: false, branch: "${branch}", url: '...'
}
}
Full example at: https://pillsfromtheweb.blogspot.com/2021/10/trigger-jenkins-on-merge-request-from.html
Related
I'm new to Jenkins and I'm trying to understand the following step in Jenkins pipeline line by line:
checkout scm
dir("some_directory") {
checkout(
changelog: false,
poll: false,
scm: [
$class : 'GitSCM',
branches : [[name: SOME_BRANCH_NAME]],
doGenerateSubmoduleConfigurations: false,
extensions : [[$class: 'CloneOption', depth: 0, honorRefspec: true, reference: '', shallow: false]],
submoduleCfg : [],
userRemoteConfigs : [[url: SOME_URL]]
]
)
sh 'pwd; ls'
}
From the research I've done I understood that
checkout scm
dir("some_directory")
'dir' creates a folder in the workspace if it doesn't exist, and the git project gets checked out into this directory
checkout(
changelog:false,
poll:false,
scm: [...]
)
this block of code specifies the git parameters of the project that is being checked out into the directory specified above.
This is so far what I understood. Can someone please let me know if my understanding is correct, and possibly add more details to it?
Also, I am confused by the current code syntax. Would it make any difference if I rewrote the top few lines as:
checkout scm dir("some_directory")(
changelog: false,
poll: false,
etc.
)
instead of using 'checkout' two times.
I have a Jenkins multibranch pipeline with the Jenkins git plugin.
When a new pull request is created, a new PR job starts and a checkout of the repository is done automatically. The problem is that it sometimes hits a timeout (networking).
I try to retry the checkout in the pipeline by using GitSCM code with some conditionals:
checkout([
$class: 'GitSCM',
branches: scm.branches,
doGenerateSubmoduleConfigurations: scm.doGenerateSubmoduleConfigurations,
extensions: scm.extensions + [[$class: 'CloneOption', noTags: false, reference: '', shallow: false]],
submoduleCfg: [],
userRemoteConfigs: scm.userRemoteConfigs
])
It repeats the checkout just fine, but I still need to disable the first, default checkout done by the plugin (if it fails, the job fails). How do I do that? How do I override the built-in checkout?
The skipDefaultCheckout option should disable the default checkout, e.g.:
options { skipDefaultCheckout() }
Read more here about it: https://www.jenkins.io/doc/book/pipeline/syntax/#available-options
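For example, a minimal sketch of a declarative pipeline that skips the implicit checkout and retries an explicit one (the retry count and stage name are arbitrary):
pipeline {
    agent any
    options {
        // disable the implicit checkout done by the multibranch pipeline
        skipDefaultCheckout()
    }
    stages {
        stage('Checkout') {
            steps {
                // retry the checkout a few times to survive transient network timeouts
                retry(3) {
                    checkout scm
                }
            }
        }
    }
}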
I have a Java project in http://localhost:7990/scm/bout/boutique-a.git
I want to have 2 Jenkins pipeline jobs:
Job 1: triggered by a commit on */develop
Job 2: triggered by a commit on any */feature
each job will do a basic mvn install, mvn test, sonar ...
a simple script with
node {
checkout([$class: 'GitSCM',
branches: [[name: 'develop']],
doGenerateSubmoduleConfigurations: false,
extensions: [[$class: 'SubmoduleOption', disableSubmodules: false,
parentCredentials: false, recursiveSubmodules: true, reference: '',
trackingSubmodules: false]], submoduleCfg: [],
userRemoteConfigs: [[credentialsId: 'admin',
url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}
works if a commit is done on develop, or if I explicitly give the branch name like feature/test-a, but how do I configure the script for any feature/* branch?
It seems that what I'm asking is not possible using a pipeline job.
I found a workaround for "feature/**". I created a BRANCH_NAME parameter in the job; the branch name is then sent by Bitbucket when a push is made on "feature/**", through a basic POST request:
http://user:token@localhost:8081/jenkins/job/MY_JOB_NAME/buildWithParameters?token=U1C1yQo7x3&BRANCH_NAME=feature/branche-test
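The job itself can then use that parameter in its checkout step; a rough sketch based on the script above (same placeholder credentials id and URL):
node {
    // BRANCH_NAME is the job parameter filled in by the Bitbucket POST request
    checkout([$class: 'GitSCM',
        branches: [[name: "${params.BRANCH_NAME}"]],
        doGenerateSubmoduleConfigurations: false,
        userRemoteConfigs: [[credentialsId: 'admin',
            url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}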
I'm creating a new pipeline-based job in Jenkins. I want my Jenkinsfile to be in a Bitbucket repository; let's say my config file is at bitbucket.org/config.git.
The job's mission is to run a clean install of the project at bitbucket.org/myProject.git.
How can I configure the pipeline so that it triggers on any push to bitbucket.org/myProject.git and follows the steps defined in bitbucket.org/config.git?
I do not want to create a multibranch pipeline, and I do not want my Jenkinsfile to be in the same repository as the project it compiles.
My current config is:
pipeline {
agent any
parameters {
string(defaultValue: '', description: 'URL', name: 'GIT_URL')
string(defaultValue: '', description: 'Credential', name: 'CREDENTIAL_ID')
}
stages {
stage ('Initialize') {
steps {
git branch: 'develop', credentialsId: "${params.CREDENTIAL_ID}", url: "${params.GIT_URL}"
}
}
stage ('Build') {
steps {
sh 'mvn clean install '
echo 'build'
}
}
}
}
You can use Shared Libraries in Jenkins. You would still need a Jenkinsfile in your code repository, but it would not contain any logic; it would simply refer to the shared library and pass any params, like the git repo path.
For more information on shared libraries, please refer to https://jenkins.io/doc/book/pipeline/shared-libraries/.
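As a rough illustration (the library name and the buildMyProject step are hypothetical placeholders, not an existing API), the Jenkinsfile kept in myProject.git could be as thin as:
// load the shared library that holds the real pipeline logic (e.g. hosted in config.git)
@Library('config-lib') _

// 'buildMyProject' would be a global step defined in the library's vars/ directory
buildMyProject(gitUrl: 'https://bitbucket.org/myProject.git', credentialsId: 'my-credentials')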
For triggering the build, you can define a trigger in your pipeline. Example:
triggers {
pollSCM('H/5 * * * *')
}
or use webhooks if you don't want to poll.
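For example, the Git plugin exposes a notification endpoint that a Bitbucket webhook can call on every push (the job still needs SCM polling enabled, but with an empty schedule it never actually runs on a timer):
http://<jenkins-host>/git/notifyCommit?url=https://bitbucket.org/myProject.git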
Actually, I managed to make it work.
In my Jenkins pipeline, I activated "Build when a change is pushed to BitBucket".
node {
checkout([$class: 'GitSCM',
branches: [[name: 'feature/test-b']],
doGenerateSubmoduleConfigurations: false,
extensions: [[$class: 'SubmoduleOption', disableSubmodules: false,
parentCredentials: false, recursiveSubmodules: true, reference: '',
trackingSubmodules: false]], submoduleCfg: [],
userRemoteConfigs: [[credentialsId: 'admin',
url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}
When a change is made in boutique-a on the branch 'feature/test-b', my job is triggered, which is cool.
Now I have another issue: how can I trigger when changes are made in feature/*?
It looks like I cannot access env.BRANCH_NAME when I'm not in a multibranch pipeline.
I have a Jenkins pipeline script which is supposed to get 2 repositories from GitLab (triggered by a GitLab webhook) if there are changes on the "master" branch in either of the repositories (and then do more stuff afterwards).
My job DSL config:
pipelineJob('pipline') {
// Enable but do not use SCM polling. This needs to be enabled for
// notifyCommit to work but we don't want to do any actual polling.
configure { project ->
project / 'triggers' << 'hudson.triggers.SCMTrigger' {
spec('')
}
}
definition {
cps {
sandbox(true)
script ('''node('Foo') {
//repo abc
checkout changelog: false,
poll: true,
scm: [$class: 'GitSCM',
branches: [[name: 'refs/heads/master']],
doGenerateSubmoduleConfigurations: false,
extensions: [
[$class: 'SubmoduleOption',
disableSubmodules: true,
parentCredentials: false,
recursiveSubmodules: false,
reference: '',
trackingSubmodules: false],
[$class: 'RelativeTargetDirectory',
relativeTargetDir: "abc"]
],
submoduleCfg: [],
userRemoteConfigs: [
[url: '<repository abc>'],
[refspec: '+refs/heads/master:refs/remotes/origin/master'],
[name: 'origin']
]]
//repo def
checkout changelog: false,
poll: true,
scm: [$class: 'GitSCM',
branches: [[name: 'refs/heads/master']],
doGenerateSubmoduleConfigurations: false,
extensions: [
[$class: 'CheckoutOption', timeout: 20],
[$class: 'SubmoduleOption',
disableSubmodules: true,
parentCredentials: false,
recursiveSubmodules: false,
reference: '',
trackingSubmodules: false],
[$class: 'RelativeTargetDirectory',
relativeTargetDir: "def"],
[$class: 'PruneStaleBranch']],
submoduleCfg: [],
userRemoteConfigs: [
[url: '<repository def>'],
[refspec: '+refs/heads/master:refs/remotes/origin/master'],
[name: 'origin']
]]
...
do more things
''')
}
}
}
The polling works fine for the first repository, but when I have changes in the second repository that are not on the master branch, the whole thing still gets started. It looks like it takes the last built revision SHA only from the first repository and then uses it for comparison in all the other repositories.
Here is what the git polling log looks like:
Started on Feb 15, 2017 12:10:09 PM
Using strategy: Default
[poll] Last Built Revision: Revision <SHA from last commit in repository abc on master branch > (refs/remotes/origin1/master)
> git ls-remote -h <repository abc> # timeout=10
Found 4 remote heads on <repository abc>
[poll] Latest remote head revision on refs/heads/master is: <SHA from last commit in repository abc on master branch > - already built by 76
Using strategy: Default
[poll] Last Built Revision: Revision <SHA from last commit in repository abc on master branch > (refs/remotes/origin1/master)
> git ls-remote -h <repository def> # timeout=10
Found 2 remote heads on <repository def>
[poll] Latest remote head revision on refs/heads/master is: <SHA from last commit in repository def on master branch >
Done. Took 1.1 sec
Changes found
I was using a buildFlowJob with the same functionality before, and it worked fine.