I'm using Jenkins together with the Bitbucket branch source plugin.
Everything works great, but I want to be able to run/exclude certain stages in my pipeline depending on whether the branch is associated with a pull request or not, such as:
pipeline {
    stages {
        stage('build') {
            // compile
        }
        stage('package') {
            when {
                environment name: 'IS_PULL_REQUEST', value: 'true'
            }
            // create deployable package
        }
    }
}
Jenkins knows when a branch is part of a PR, because it merges the source branch with the target and also lists the branch under the pull requests section of the multibranch pipeline page.
Is there an environment variable I can use within the pipeline to exclude/include stages?
You can use BRANCH_NAME and CHANGE_ID environment variables to detect pull requests. When you run a multibranch pipeline build from a branch (before creating a pull request), the following environment variables are set:
env.BRANCH_NAME is set to the repository branch name (e.g. develop),
env.CHANGE_BRANCH is null,
env.CHANGE_ID is null.
But once you create a pull request, then:
env.BRANCH_NAME is set to the PR-\d+ name (e.g. PR-11),
env.CHANGE_BRANCH is set to the real branch name (e.g. develop),
env.CHANGE_ID is set to the pull request ID (e.g. 11).
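If you want to verify this in your own job, a throwaway stage like the following (a minimal sketch) prints all three variables on both branch and PR builds:
stage('Print change info') {
    steps {
        // CHANGE_BRANCH and CHANGE_ID are only set on pull request builds.
        echo "BRANCH_NAME:   ${env.BRANCH_NAME}"
        echo "CHANGE_BRANCH: ${env.CHANGE_BRANCH}"
        echo "CHANGE_ID:     ${env.CHANGE_ID}"
    }
}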
I use the following when condition in my pipelines to detect pull requests:
when {
    expression {
        // True for pull requests, false otherwise.
        env.CHANGE_ID && env.BRANCH_NAME.startsWith("PR-")
    }
}
In Declarative Pipelines, you can also use the built-in condition changeRequest inside the when directive to determine if the branch is associated with a pull request.
stage('package') {
    when {
        changeRequest()
    }
    // create deployable package
}
You can also check if the pull request is targeted at a particular branch:
stage('package') {
    when {
        changeRequest target: 'master'
    }
    // create deployable package
}
See https://jenkins.io/doc/book/pipeline/syntax/#when.
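If you also need the inverse (a stage that runs only for plain branch builds and never for pull requests), the not condition can wrap changeRequest. A minimal sketch (the stage name is just an example):
stage('deploy snapshot') {
    when {
        // Runs only when the build is NOT a pull request build.
        not { changeRequest() }
    }
    steps {
        echo 'branch build, not a PR'
    }
}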
Related
I coded a generic pipeline which accepts several parameters in order to deploy releases from a pre-defined GitHub repository to specific nodes. I wanted to host this pipeline on a Jenkinsfile on GitHub, so I configured the job to work with a "Pipeline script from SCM". The fact is - when I try and build the job - the Jenkinsfile gets checked out on every node. Is it possible to checkout and execute the Jenkinsfile only on, say, the master node and run the pipeline as intended?
EDIT: As I stated before, the pipeline works just fine and as intended when the job is configured with an inline pipeline script. The thing is, when I change it to "Pipeline script from SCM", the Jenkinsfile gets checked out on every agent, which is a problem since I don't have git installed on any agent other than master. I want the Jenkinsfile to be checked out only on the master agent and the pipeline to be executed as intended. FYI, the pipeline is below:
def agents = "$AGENTS".toString()
def agentLabel = "${ println 'Agents: ' + agents; return agents; }"
def skipBuild = false // declared up front so the when expressions below can read it

pipeline {
    agent none
    stages {
        stage('Prep') {
            steps {
                script {
                    if (agents == null || agents == "") {
                        println "Skipping build"
                        skipBuild = true
                    }
                    if (!skipBuild) {
                        println "Agents set for this build: " + agents
                    }
                }
            }
        }
        stage('Powershell deploy script checkout') {
            agent { label 'master' }
            when {
                expression { !skipBuild }
            }
            steps {
                git url: 'https://github.com/owner/repo.git', credentialsId: 'git-credentials', branch: 'main'
                stash includes: 'deploy-script.ps1', name: 'deploy-script'
            }
        }
        stage('Deploy') {
            agent { label agentLabel }
            when {
                expression { !skipBuild }
            }
            steps {
                unstash 'deploy-script'
                script {
                    println "Execute powershell deploy script on agents set for deploy"
                }
            }
        }
    }
}
I think that skipDefaultCheckout is what you are looking for:
pipeline {
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('Prep') {
            steps {
                script {
                    // ...
                }
            }
        }
    }
}
Take a look at the documentation:
skipDefaultCheckout
Skip checking out code from source control by default in the agent directive.
https://www.jenkins.io/doc/book/pipeline/syntax/
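Applied to your pipeline, it could look roughly like the sketch below (reusing the repository URL, credentials ID and stash name from your question; the deploy agent label stands in for your agentLabel logic). skipDefaultCheckout disables the implicit checkout on every agent, and only the master stage performs an explicit checkout plus stash:
pipeline {
    agent none
    options {
        // No implicit 'checkout scm' on the agents that run the stages.
        skipDefaultCheckout true
    }
    stages {
        stage('Powershell deploy script checkout') {
            agent { label 'master' }
            steps {
                // Explicit checkout only here, where git is installed.
                git url: 'https://github.com/owner/repo.git', credentialsId: 'git-credentials', branch: 'main'
                stash includes: 'deploy-script.ps1', name: 'deploy-script'
            }
        }
        stage('Deploy') {
            // Replace with the label expression you compute from $AGENTS.
            agent { label 'deploy-agents' }
            steps {
                // The deploy agents only need the stashed file, not a git checkout.
                unstash 'deploy-script'
            }
        }
    }
}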
I think you are requesting the impossible.
Now (with an inline pipeline script):
your Jenkinsfile is inside your Jenkins configuration and is sent as such to each of your agents. No need for git on your agents.
Pipeline script from SCM:
Since you use git, SCM = git. So you are saying: my pipeline needs to be fetched from a git repository. You declare the Deploy stage to run on agent { label agentLabel }, so that stage is supposed to run on an agent other than master.
How would that agent get the content of the Jenkinsfile to know what to do without using git?
What happens in Jenkins?
Your master agent gets triggered that it needs to build.
The master agent checks out the Jenkinsfile using git (since it is a Pipeline script from SCM).
Jenkins reads the Jenkinsfile and sees what has to be done.
For the Prep stage, I'm not quite sure what happens without an agent; I guess it runs on the master agent.
The Powershell deploy script checkout stage is marked to run on the master agent, so it runs there. Note that the Jenkinsfile will get checked out with git two more times:
once before starting the stage, because Jenkins needs to know what to execute,
and once more because you specify git url: 'https://github.com/owner/repo.git'...
The Deploy stage is marked to run on agentLabel, so Jenkins tries to check out your Jenkinsfile on that agent (using git)...
You can use a Scripted Pipeline to do this; it should basically look like this:
node('master') {
    checkout scm
    stash includes: 'deploy-script.ps1', name: 'deploy-script'
}

def stepsForParallel = [:]
env.AGENTS.split(' ').each { agent ->
    stepsForParallel["deploy ${agent}"] = { ->
        node(agent) {
            unstash 'deploy-script'
        }
    }
}
parallel stepsForParallel
You can find all the info about the Jenkins agent section in the Pipeline syntax documentation: https://www.jenkins.io/doc/book/pipeline/syntax/#agent
In short: you can call any agent by name or label.
pipeline {
    agent {
        label 'master'
    }
}
If that does not work for you, you will need to set a label on the master node and call it by that label:
pipeline {
    agent {
        label 'master_label_here'
    }
}
I am building a declarative pipeline Jenkinsfile for semantic branching. It has the format:
pipeline {
    stages {
        stage('develop branch build') {
            when {
                branch 'develop'
            }
            // build and deploy to QA environment
        }
        stage('release branch build') {
            when {
                branch 'release'
            }
            // build and deploy to live/preproduction environment
        }
    }
}
I would like an additional stage to run upon a Bitbucket pull request. It would do a particular PR test and deploy task, and pass or fail the pipeline accordingly.
How might I enhance this script to do that?
I use the generic webhook plugin. It works pretty nicely with Bitbucket.
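If you stay with the multibranch/branch source setup instead, the changeRequest condition shown earlier in this thread gives you a PR-only stage. A minimal sketch (the test script name is hypothetical):
stage('pull request verification') {
    when {
        changeRequest()
    }
    steps {
        // Hypothetical PR test/deploy task; a failure here fails the pipeline.
        sh './run-pr-tests.sh'
    }
}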
I have a multibranch pipeline whose branch source is configured to discover pull requests by merging the pull request with the current target branch revision.
And the following Jenkinsfile:
pipeline {
    agent {
        label 'apple'
    }
    stages {
        stage('Lint') {
            when {
                changeRequest()
            }
            steps {
                sh 'fastlane lint'
            }
        }
    }
    post {
        success {
            reportSuccess()
        }
        failure {
            reportFailure()
        }
    }
}
I use a slave to run the actual build, but the master still needs to check out the code to get the Jenkinsfile. For that, it seems to use the same behaviors as the ones defined in the job, even though it really only needs the Jenkinsfile.
My problem is that I want to discover pull requests by merging the pull request with the current target branch revision, but when there is a merge conflict the build will fail before the Jenkinsfile is executed. This prevents any kind of reporting done in post steps.
Is there a way to have the initial checkout not merge the target branch, but still have it merged when actually running the Jenkinsfile on a slave?
You may want to check out using the "Current Pull Request revision" strategy, and then, once the checkout succeeds, issue a git merge command yourself.
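A rough sketch of that idea: with the PR discovered at its own head revision, the checkout can no longer fail on a merge conflict, and the merge becomes an ordinary step where a conflict simply fails that step and still triggers the post { failure { ... } } block. CHANGE_TARGET holds the PR's target branch on change request builds; the git identity settings are an assumption and may already be configured on your agents:
stage('Merge target branch') {
    when {
        changeRequest()
    }
    steps {
        // Merge the PR target branch manually; a conflict fails this step
        // instead of aborting the build before the Jenkinsfile runs.
        sh """
            git config user.name jenkins
            git config user.email jenkins@example.com
            git fetch origin ${env.CHANGE_TARGET}
            git merge --no-edit FETCH_HEAD
        """
    }
}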
I'm trying to find a way to pass a configuration for a Multibranch pipeline job into the jenkinsfile when it's executing.
My goal is to configure something like the following:
Branch : Server
"master" : "prodServer"
"develop" : "devServer"
"release/*", "hotfix/*" : "stagingServer"
"feature/Thing-I-Want-To-Change-Regularly" : "testingServer"
where I can then write a Jenkinsfile like this:
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                // branch is in config branches
            }
            steps {
                // deploy to server
            }
        }
    }
}
I'm having trouble finding a way to achieve this. EnvInject Plugin seems to be the solution for non-Pipeline projects, but it's currently got security issues and only partial Pipeline support.
If you want to deploy to different servers depending on the branch, in Multibranch Pipelines you can use:
when { branch 'master' } (declarative)
or
${env.BRANCH_NAME} (scripted)
to access which branch you are on and then add logic to deploy to corresponding servers based on this.
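For example, a small branch-to-server helper next to the pipeline block can express the mapping from the question (the map contents and the deploy step are hypothetical):
// Hypothetical mapping; adjust patterns and server names to your setup.
def serverFor(String branch) {
    if (branch == 'master') return 'prodServer'
    if (branch == 'develop') return 'devServer'
    if (branch.startsWith('release/') || branch.startsWith('hotfix/')) return 'stagingServer'
    return null // no deployment for other branches
}

pipeline {
    agent any
    stages {
        stage('Example Deploy') {
            when {
                expression { serverFor(env.BRANCH_NAME) != null }
            }
            steps {
                echo "Deploying to ${serverFor(env.BRANCH_NAME)}"
            }
        }
    }
}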
Going to post my current best approach to a global config value and hope something better comes along.
In Manage Jenkins -> Configure System -> Global Properties you can define global Environment Variables which can be accessed from Jenkins jobs. Defining a MY_BRANCH variable there lets a pipeline access it:
when { branch env.MY_BRANCH }
Or it can even hold a regex and be used like this:
when { expression { env.BRANCH_NAME ==~ env.MY_BRANCH } }
However, this has the disadvantage that the Environment Variables are shared between every Jenkins job, not just across all branches of a single job. So careful naming will be necessary.
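As a concrete example, assuming MY_BRANCH is defined in Global Properties as a regular expression such as release/.*:
stage('Deploy') {
    when {
        // MY_BRANCH comes from Manage Jenkins -> Configure System -> Global Properties.
        expression { env.BRANCH_NAME ==~ env.MY_BRANCH }
    }
    steps {
        echo "Deploying ${env.BRANCH_NAME}"
    }
}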
I am using Jenkins multi branch pipeline with bitbucket and I see an issue where the automatic build created for a PR fails as I rely on env.BRANCH_NAME.
The problem is that this variable no longer holds the feature branch name as expected; instead it holds the PR id (e.g. PR-2 instead of feature/test-branch).
I have code in my job that pushes to branch based on the BRANCH_NAME. This code obviously now fails as there is no branch named PR-2.
Anyone saw this before and has a workaround?
I have a stage in my pipeline that sets the build name accordingly, using CHANGE_BRANCH instead of the normal branch name when the build is for a PR.
stage('Set Build Name') {
    steps {
        script {
            if (env.BRANCH_NAME.startsWith('PR')) {
                currentBuild.displayName = "#${env.BUILD_NUMBER} - ${env.CHANGE_BRANCH}"
            } else {
                currentBuild.displayName = "#${env.BUILD_NUMBER} - ${env.BRANCH_NAME}"
            }
        }
    }
}
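For the original problem (pushing to the real branch), the same distinction works anywhere the branch name is needed. A minimal sketch, assuming the push itself is a plain git command and the workspace already has push credentials:
script {
    // On PR builds CHANGE_BRANCH holds the real branch; on branch builds it is null.
    def realBranch = env.CHANGE_BRANCH ?: env.BRANCH_NAME
    sh "git push origin HEAD:${realBranch}"
}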