How to get the branch name in a Jenkins shared library

I'm trying to write my first Jenkins shared library and I'm struggling with something basic - getting the branch name.
I could do:
sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
However, that requires a checkout. Is it possible to get the branch name for both multibranch and freestyle pipeline projects without one? I know I'll be using Git, but I would like to avoid doing a checkout until it is necessary.

The GIT_BRANCH environment variable should give you what you want. Note that it does not work in Pipeline jobs until Jenkins 2.60 together with an upgraded Pipeline Model Definition plugin.

If you are using a pipeline job, you can either:
capture the object returned from checkout scm, or
reference the environment variable directly.
pipeline {
    // ...
    stages {
        stage('Setup') {
            steps {
                script {
                    // capture the variables returned by the scm checkout
                    def scmVars = checkout scm
                    String branch = scmVars.GIT_BRANCH
                    // or use the environment variable
                    branch = env.GIT_BRANCH
                }
            }
        }
        // ...
    }
}
Environment variable reference.

I ended up using this:
env.CHANGE_BRANCH ?: env.GIT_BRANCH ?: scm.branches[0]?.name?.split('/')[1] ?: 'UNKNOWN'
However, this requires me to approve several signatures on the In-process Script Approval page.
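For reuse across pipelines, that expression can be wrapped in a shared-library global. This is a hypothetical sketch: the vars/branchName.groovy file name is illustrative, not from the original post, and the same script approvals would still apply.

```groovy
// vars/branchName.groovy -- hypothetical shared-library global
// Returns the most specific branch name available, falling back in order.
def call() {
    return env.CHANGE_BRANCH ?:                       // PR source branch
           env.GIT_BRANCH ?:                          // plain branch builds
           scm.branches[0]?.name?.split('/')[1] ?:    // e.g. '*/master' -> 'master'
           'UNKNOWN'
}
```

A Jenkinsfile that loads the library could then simply call branchName().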


Jenkins pipeline determine if a branch is for Bitbucket pull request

I'm using Jenkins together with the Bitbucket branch source plugin.
Everything works great, but I want to be able to run/exclude certain stages in my pipeline depending on whether the branch is associated with a pull request or not, such as:
pipeline {
    stages {
        stage('build') {
            //compile
        }
        stage('package') {
            when {
                environment name: 'IS_PULL_REQUEST', value: 'true'
            }
            //create deployable package
        }
    }
}
Jenkins knows when the branch is for a PR because it merges the source with the target and also displays the branch in the pull request folder on the multibranch pipeline page.
Is there an environment variable I can use within the pipeline to exclude/include stages?
You can use BRANCH_NAME and CHANGE_ID environment variables to detect pull requests. When you run a multibranch pipeline build from a branch (before creating a pull request), the following environment variables are set:
env.BRANCH_NAME is set to the repository branch name (e.g. develop),
env.CHANGE_BRANCH is null,
env.CHANGE_ID is null.
But once you create a pull request, then:
env.BRANCH_NAME is set to the PR-\d+ name (e.g. PR-11),
env.CHANGE_BRANCH is set to the real branch name (e.g. develop),
env.CHANGE_ID is set to the pull request ID (e.g. 11).
I use the following when condition in my pipelines to detect pull requests:
when {
    expression {
        // True for pull requests, false otherwise.
        env.CHANGE_ID && env.BRANCH_NAME.startsWith("PR-")
    }
}
In Declarative Pipelines, you can also use the built-in condition changeRequest inside the when directive to determine if the branch is associated with a pull request.
stage('package') {
    when {
        changeRequest()
    }
    //create deployable package
}
You can also check if the pull request is targeted at a particular branch:
stage('package') {
    when {
        changeRequest target: 'master'
    }
    //create deployable package
}
See https://jenkins.io/doc/book/pipeline/syntax/#when.

Is there a way for a Jenkins Pipeline, in a Multibranch setup, to automatically checkout the branch that is at the latest revision?

I'm trying to configure a job as a Jenkins Multibranch pipeline. There are a lot of branches in SVN and I want the job to check out only the latest one and ignore the rest. This job triggers a pipeline that does multiple checks on the whole build, so I always need to run it on the latest branch, because that is where I will have the latest revision of the build.
The SVN structure is like this: V01_01_01 up to the latest one, V01_08_03. Currently I have it set up as below, and in the Jenkins pipeline I have "checkout scm", but if a new branch appears, e.g. V01_08_04, I need V01_08_03 to be replaced by V01_08_04. Is there any way to do this?
(Screenshot: my set-up in the Jenkins Multibranch pipeline.)
I found a hack for this. I created a Python script that checks the whole repository for the most recently updated folder.
pipeline {
    agent any
    parameters {
        string(name: 'latest_folder', defaultValue: '')
    }
    stages {
        stage('find latest folder') {
            steps {
                withPythonEnv('System-CPython-3.8') {
                    sh 'pip3 install svn'
                    script {
                        def folder_name = sh(script: 'python3 latest_folder_svn.py', returnStdout: true)
                        env.latest_folder = folder_name
                    }
                }
            }
        }
        stage('Checkout Step') {
            steps {
                echo "${env.latest_folder}"
            }
        }
    }
}
I then use this variable in the checkout step so that the latest branch is always checked out.
The Python script is pretty straightforward: I use the svn library to walk the repository and extract what I need.
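The script itself isn't shown in the answer. As an illustrative sketch (the folder-selection rule and the use of the svn library for listing are my assumptions, not the author's actual code), the core logic of picking the highest VXX_YY_ZZ folder could look like this:

```python
import re

def latest_version_folder(folders):
    """Return the folder name with the highest VXX_YY_ZZ version tuple, or None."""
    pattern = re.compile(r'V(\d+)_(\d+)_(\d+)')
    best = None
    for name in folders:
        m = pattern.fullmatch(name)
        if m:
            key = tuple(int(g) for g in m.groups())
            if best is None or key > best[0]:
                best = (key, name)
    return best[1] if best else None

# In the real script the folder list would come from the svn library
# (e.g. listing the branches directory of the repository) -- assumed here.
print(latest_version_folder(['V01_01_01', 'V01_07_12', 'V01_08_03']))
# prints V01_08_03
```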

Pass configuration into a Jenkins Pipeline

I'm trying to find a way to pass a configuration for a Multibranch pipeline job into the Jenkinsfile when it executes.
My goal is to configure something like the following:
Branch : Server
"master" : "prodServer"
"develop" : "devServer"
"release/*", "hotfix/*" : "stagingServer"
"feature/Thing-I-Want-To-Change-Regularly" : "testingServer"
where I can then write a Jenkinsfile like this:
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                //branch is in config branches
            }
            steps {
                //deploy to server
            }
        }
    }
}
I'm having trouble finding a way to achieve this. EnvInject Plugin seems to be the solution for non-Pipeline projects, but it's currently got security issues and only partial Pipeline support.
If you want to deploy to different servers depending on the branch, in Multibranch Pipelines you can use:
when { branch 'master' } (Declarative)
or
${env.BRANCH_NAME} (Scripted)
to access which branch you are on and then add logic to deploy to corresponding servers based on this.
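As a sketch of that logic (the server names and patterns are the asker's examples; serverFor is a hypothetical helper and deployTo a hypothetical deploy step), the mapping could live in the Jenkinsfile itself:

```groovy
// Illustrative only: map branch-name regexes to target servers.
def serverFor(String branch) {
    def mapping = [
        'master'                                   : 'prodServer',
        'develop'                                  : 'devServer',
        '(release|hotfix)/.*'                      : 'stagingServer',
        'feature/Thing-I-Want-To-Change-Regularly' : 'testingServer',
    ]
    // ==~ performs a full regex match against the branch name
    def hit = mapping.find { pattern, server -> branch ==~ pattern }
    return hit?.value
}

// Usage inside a stage:
// script { deployTo(serverFor(env.BRANCH_NAME)) }   // deployTo is hypothetical
```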
Going to post my current best approach to a global config value and hope something better comes along.
In Manage Jenkins -> Configure System -> Global Properties you can define global environment variables that can be accessed from Jenkins jobs. A MY_BRANCH variable defined there could be accessed from a pipeline:
when { branch MY_BRANCH }
Or it can even hold a regex and be used like this:
when { expression { BRANCH_NAME ==~ MY_BRANCH } }
However, this has the disadvantage that the Environment Variables are shared between every Jenkins job, not just across all branches of a single job. So careful naming will be necessary.

Jenkins declarative pipeline with Docker/Dockerfile agent from SCM

With Jenkins and the Declarative Pipeline syntax, how do I get the Dockerfile (Dockerfile.ci in this example) from the SCM (Git), given that the agent block is executed before all the stages?
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.ci'
        }
    }
    stages {
        stage ('Checkout') {
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        [...]
    }
}
In all the examples I've seen, the Dockerfile seems to be already present in the workspace.
You could try declaring an agent for each stage separately: use a default agent for the checkout stage and the Docker agent for the others.
pipeline {
    agent none
    stages {
        stage ('Checkout') {
            agent any
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        stage ('Build') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                }
            }
            steps {
                [...]
            }
        }
        [...]
    }
}
If you're using a multi-branch pipeline it automatically checks out your SCM before evaluating the agent. So in that case you can specify the agent from a file in the SCM.
The answer is in the Jenkins documentation on the Dockerfile parameter:
In order to use this option, the Jenkinsfile must be loaded from
either a Multibranch Pipeline or a Pipeline from SCM.
Just scroll down to the Dockerfile section, and it's documented there.
The obvious problem with this approach is that it impairs pipeline development. Now instead of testing code in a pipeline field on the server, it must be committed to the source repository for each testable change. NOTE also that the Jenkinsfile checkout cannot be sparse or lightweight as that will only pick up the script -- and not any accompanying Dockerfile to be built.
I can think of a couple ways to work around this.
Develop against agents in nodes with the reuseNode true directive. Then when code is stable, the separate agent blocks can be combined together at the top of the Jenkinsfile which must then be loaded from the SCM.
Develop using the dir() solution that specs the exact workspace directory, or alternately use one of the other examples in this solution.
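A minimal sketch of the reuseNode approach, assuming checkout scm brings Dockerfile.ci into the workspace (the sh 'make' build step is a placeholder, not from the original answer):

```groovy
pipeline {
    agent any   // top-level node whose workspace the container will reuse
    stages {
        stage('Checkout') {
            steps {
                checkout scm   // brings Dockerfile.ci into the workspace
            }
        }
        stage('Build') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                    reuseNode true   // run on the same node, in the same workspace
                }
            }
            steps {
                sh 'make'   // placeholder build step
            }
        }
    }
}
```

Because the per-stage agent is only evaluated when its stage starts, the Dockerfile is already checked out by then.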

Jenkins pipeline: get the Repository URL variable under "Pipeline script from SCM"

I'm using a Jenkinsfile located in my git repository.
I have configured a new job using "Pipeline script from SCM", pointing at my Jenkinsfile. In my pipeline script I'm trying to use the git step to pull my data from my repo without configuring a static variable, just reusing the Repository URL that is already configured under "Pipeline script from SCM" in the job.
Is there a way to get the Repository URL variable from this plugin without using parameters in my Jenkins pipeline script?
I have already tried the GIT_URL environment variable and other git-related variables from here, but that didn't work.
You can find all the information about the SCM in the scm variable (an instance of GitSCM if you are using git).
You can get repository URL this way
def repositoryUrl = scm.userRemoteConfigs[0].url
But if you just want to checkout that repository you can simply invoke checkout scm without needing to specify anything else. See checkout step
From this post I found that you can use checkout scm to get the git repo URL like this:
checkout scm
def url = sh(returnStdout: true, script: 'git config remote.origin.url').trim()
but checkout scm will pull the code, which I want to avoid.
So I found another way (not a pretty one):
node('master') {
    try {
        GIT_REPO_URL = null
        command = "grep -oP '(?<=url>)[^<]+' /var/lib/jenkins/jobs/${JOB_NAME}/config.xml"
        GIT_REPO_URL = sh(returnStdout: true, script: command).trim()
        echo "Detected Git Repo URL: ${GIT_REPO_URL}"
    }
    catch (err) {
        // report before rethrowing; the original had this after an unreachable throw
        echo "Could not find any Git repository for the job ${JOB_NAME}"
        throw err
    }
}
This did the trick for me.
Probably not directly a solution for your particular case, as you're working with git.
But for those still working with SVN using the SubversionSCM, the repository URL can be obtained using
def repositoryUrl = scm.locations[0].remote
I believe that the best solution is like this answer.
An example using declarative pipeline:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    def s = checkout scm
                    if (s.GIT_URL != null) print s.GIT_URL
                    else if (s.SVN_URL != null) print s.SVN_URL
                    else print s
                }
            }
        }
    }
}
Note - this does a full checkout. If that is not desirable, I would try to handle that in checkout parameters (like here)
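For instance, one way to keep the checkout light is via GitSCM extensions. A sketch reusing the job's configured branches and remotes, with CloneOption settings chosen as an example (adjust to your needs):

```groovy
script {
    def s = checkout([
        $class: 'GitSCM',
        branches: scm.branches,
        userRemoteConfigs: scm.userRemoteConfigs,
        // shallow single-commit clone without tags, to minimise transfer
        extensions: [[$class: 'CloneOption', shallow: true, depth: 1, noTags: true]]
    ])
    echo "Repository URL: ${s.GIT_URL}"
}
```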
