Using Jenkins to deploy to staging and production based on condition - jenkins

My project has a Jenkinsfile that runs smoothly. The problem is that I need to run some commands only on certain occasions. I'm using the GitHub plugin. I need to run the deploy only when the build is on master or on a new tag: one will deploy to staging and the other to production.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'node -v'
                sh 'yarn install'
                sh 'yarn test -- --coverage'
            }
        }
        stage('Build') {
            steps {
                sh 'yarn build'
            }
        }
        stage('Deploy') {
            steps {
                sh 'aws s3 sync ./build s3://my.bucket --only-show-errors'
            }
        }
    }
}
I need master to deploy to one bucket and a new tag to deploy to another. How can I create this conditional?

How about the following, working as two conditionals for two separate deployment scenarios? I think it would be better to use variables to indicate the deployment scenario instead of splitting this into two distinctly different stages, though. For example, you could write a shell script that handles everything internally depending on tags/branches/whatever you need, instead of forcing yourself to control this at the pipeline level.
Each stage will have its steps executed only when its when clause is satisfied. Stage Deploy will only run for the master branch, while stage Deploy_NonMaster will only run for any non-master branch. Using the method shown in the when conditionals you can check for anything, including tags; a sketch tailored to the staging/production buckets follows the example below.
stage('Deploy') {
    when {
        expression {
            GIT_BRANCH = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
            return (GIT_BRANCH == 'master')
        }
    }
    steps {
        echo 'Do stuff/deploy.'
    }
}
stage('Deploy_NonMaster') {
    when {
        expression {
            GIT_BRANCH = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
            return !(GIT_BRANCH == 'master')
        }
    }
    steps {
        echo 'Do stuff/deploy.'
    }
}
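If you are on a declarative pipeline and the job is a multibranch job configured to also discover tags, you can let Jenkins evaluate the condition itself with the built-in branch and buildingTag conditions instead of shelling out to git. Below is a minimal sketch for the staging/production split from the question; the bucket names my.staging.bucket and my.production.bucket are placeholders.
stage('Deploy staging') {
    when {
        branch 'master'          // runs only for builds of the master branch
    }
    steps {
        sh 'aws s3 sync ./build s3://my.staging.bucket --only-show-errors'
    }
}
stage('Deploy production') {
    when {
        buildingTag()            // runs only when the build was triggered by a tag
    }
    steps {
        sh 'aws s3 sync ./build s3://my.production.bucket --only-show-errors'
    }
}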

Related

Jenkins Multibranch Pipeline: How to checkout only once?

I have created a very basic Multibranch Pipeline on my local Jenkins via the Blue Ocean UI. From the default config I removed almost all behaviors except the one for discovering branches.
Within my Jenkinsfile I'm trying to set up the following scenario:
Checkout branch
(optionally) Merge it to master branch
Build Back-end
Build Front-end
Snippet from my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Setup') {
            agent {
                label "master"
            }
            steps {
                sh "git checkout -f ${env.BRANCH_NAME}"
            }
        }
        stage('Merge with master') {
            when {
                not {
                    branch 'master'
                }
            }
            agent {
                label "master"
            }
            steps {
                sh 'git checkout -f origin/master'
                sh "git merge --ff-only ${env.BRANCH_NAME}"
            }
        }
        stage('Build Back-end') {
            agent {
                docker {
                    image 'openjdk:8'
                }
            }
            steps {
                sh './gradlew build'
            }
        }
        stage('Build Front-end') {
            agent {
                docker {
                    image 'saddeveloper/node-chromium'
                }
            }
            steps {
                dir('./front-end') {
                    sh 'npm install'
                    sh 'npm run buildProd'
                    sh 'npm run testHeadless'
                }
            }
        }
    }
}
The pipeline itself and the build steps work fine, but the problem is that Jenkins adds a "Check out from version control" step before each stage. The step looks for new branches and fetches refs, but also checks out the current branch. Here is the relevant output from the full build log:
// stage Setup
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh git checkout -f my-branch
// stage Merge with master
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh git checkout -f origin/master
sh git merge --ff-only my-branch
// stage Build Back-end
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh ./gradlew build
// stage Build Front-end
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh npm install
sh npm run buildProd
sh npm run testHeadless
So as you can see, it effectively resets the working directory to a particular commit before every stage (git checkout -f f067...595).
Is there any way to disable this default checkout behavior?
Or is there any viable option for implementing such optional merging to the master branch?
Thanks!
By default, the git scm checkout will be executed in a Jenkins pipeline. You can disable it by doing:
pipeline {
    agent none
    options {
        skipDefaultCheckout true
    }
    ...
Also, I'd recommend taking a look at the other useful pipeline options: https://jenkins.io/doc/book/pipeline/syntax/#options. A sketch of how skipDefaultCheckout combines with a single explicit checkout is shown below.
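As a rough sketch of how the two fit together (my own simplification, assuming every stage runs on the same node and therefore shares one workspace, unlike the per-stage agents above), you would pair skipDefaultCheckout with a single explicit checkout scm in the first stage:
pipeline {
    agent { label "master" }
    options {
        skipDefaultCheckout true             // suppress the implicit "Check out from version control" step
    }
    stages {
        stage('Setup') {
            steps {
                checkout scm                 // the only checkout in the whole run
                sh "git checkout -f ${env.BRANCH_NAME}"
            }
        }
        stage('Build Back-end') {
            steps {
                sh './gradlew build'         // reuses the workspace checked out above
            }
        }
    }
}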

Jenkins declarative single pipeline if branch condition not working

Using a single declarative pipeline (not a multibranch pipeline), is there a way I can trigger a certain stage only if it's the master branch?
I've been unsuccessful with the following:
stage('Deploy') {
    steps {
        script {
            if (env.BRANCH_ENV == 'master') {
                sh "mvn deploy"
            } else {
                echo 'Ignoring'
            }
        }
    }
}
No matter what branch I'm deploying, everything gets ignored.
Any help or advice would be great.
I had the same issue before and found that env.BRANCH_ENV does not return what I expected. You can echo env.BRANCH_ENV in your pipeline to confirm.
My solution was to get the git branch manually (a sketch applying it to the Deploy stage follows the snippet):
scmVars = checkout scm
gitBranch = sh(
    script: "echo ${scmVars.GIT_BRANCH} | cut -d '/' -f2",
    returnStdout: true
).trim()
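To tie that back to the Deploy stage from the question, here is a minimal sketch (my own illustration, not from the answer) that resolves the branch inside the script block and then branches on it; it assumes checkout scm reports GIT_BRANCH in the usual origin/branch form:
stage('Deploy') {
    steps {
        script {
            def scmVars = checkout scm
            // strip the remote prefix, e.g. "origin/master" -> "master"
            def gitBranch = scmVars.GIT_BRANCH.tokenize('/').last()
            if (gitBranch == 'master') {
                sh "mvn deploy"
            } else {
                echo 'Ignoring'
            }
        }
    }
}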
Here are some approaches:
Use the return command to end the stage prematurely:
https://stackoverflow.com/a/51406870/3957754
Use the when directive (a sketch applying it to the Deploy stage from the question follows below).
The when directive allows the pipeline to determine whether the stage should be executed, depending on the given condition.
Built-in conditions: branch, expression, allOf, anyOf, not, etc.
when {
    // Execute the stage when the specified Groovy expression evaluates to true
    expression {
        return params.ENVIRONMENT ==~ /(?i)(STG|PRD)/
    }
}
Complete sample:
https://gist.github.com/HarshadRanganathan/97feed7f91b7ae542c994393447f3db4
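Applied to the Deploy stage from the question, a when-based sketch could look like the following; note this assumes the job actually populates env.BRANCH_NAME (multibranch jobs do, a plain pipeline job may not, in which case fall back to the manual checkout shown above):
stage('Deploy') {
    when {
        expression { return env.BRANCH_NAME == 'master' }
    }
    steps {
        sh "mvn deploy"
    }
}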

Pass variables between Jenkins stages

I want to pass a variable that I read in stage A to stage B somehow. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but I wasn't really successful with that. How can I set this up properly?
To get it working I tried a lot of things and read that I should use """ instead of ''' to start a shell block and escape variables as \${foo}, for example.
Below is what I have as a pipeline:
#!/usr/bin/env groovy
pipeline {
    agent { node { label 'php71' } }
    environment {
        packageName = 'my-package'
        packageVersion = ''
        groupId = 'vznl'
        nexus_endpoint = 'http://nexus.devtools.io'
        nexus_username = 'jenkins'
        nexus_password = 'J3nkins'
    }
    stages {
        // Package dependencies
        stage('Install dependencies') {
            steps {
                sh '''
                    echo Skip composer installation
                    #composer install --prefer-dist --optimize-autoloader --no-interaction
                '''
            }
        }
        // Unit tests
        stage('Unit Tests') {
            steps {
                sh '''
                    echo Running PHP code coverage tests...
                    #composer test
                '''
            }
        }
        // Create artifact
        stage('Package') {
            steps {
                echo 'Create package refs'
                sh """
                    mkdir -p ./build/zpk
                    VERSIONTAG=\$(grep 'version' composer.json)
                    REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
                    if [[ \${VERSIONTAG} =~ \${REGEX} ]]
                    then
                        env.packageVersion=\${BASH_REMATCH[1]}
                        /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
                    else
                        echo "No version found!"
                        exit 1
                    fi
                """
            }
        }
        // Publish ZPK package to Nexus
        stage('Publish packages') {
            steps {
                echo "Publish ZPK Package"
                sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
                archive includes: './build/**/*.{zpk,rpm,deb}'
            }
        }
    }
}
As you can see, the packageVersion which I read in stage Package needs to be used in stage Publish packages as well.
Overall tips about the pipeline are of course always welcome as well.
A problem in your code is that you are assigning the version to an environment variable within the sh step. This step executes in its own isolated process, inheriting the parent process's environment variables.
However, the only way of passing data back to the parent is through STDOUT/STDERR or the exit code. As you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context will persist, and variables will be available in the subsequent stage. A working example is below, followed by a sketch adapted to your composer.json case. Note that any attempt to put this within a parallel block is likely to fail, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy
pipeline {
    environment {
        AGENT_INFO = ''
    }
    agent {
        docker {
            image 'alpine'
            reuseNode true
        }
    }
    stages {
        stage('Collect agent info') {
            steps {
                echo "Current agent info: ${env.AGENT_INFO}"
                script {
                    def agentInfo = sh script: 'uname -a', returnStdout: true
                    println "Agent info within script: ${agentInfo}"
                    AGENT_INFO = agentInfo.replace("\n", "")
                    env.AGENT_INFO = AGENT_INFO
                }
            }
        }
        stage("Print agent info") {
            steps {
                script {
                    echo "Collected agent info: ${AGENT_INFO}"
                    echo "Environment agent info: ${env.AGENT_INFO}"
                }
            }
        }
    }
}
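Adapted to the composer.json case from the question, a rough sketch of the same pattern (my own adaptation, not part of the answer, assuming PHP is available on the php71 agent and using a hypothetical detectedVersion variable) would look like this:
// Create artifact
stage('Package') {
    steps {
        script {
            // Read the version from composer.json on the agent and hand it back via STDOUT
            detectedVersion = sh(
                script: 'php -r "echo json_decode(file_get_contents(\'composer.json\'))->version;"',
                returnStdout: true
            ).trim()
            echo "Detected package version: ${detectedVersion}"
        }
        sh "mkdir -p ./build/zpk"
        sh "/usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${detectedVersion}.zpk --version=${detectedVersion}"
    }
}
// Publish ZPK package to Nexus
stage('Publish packages') {
    steps {
        // detectedVersion set in the previous stage is still visible here because the node is reused
        sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${detectedVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${detectedVersion}.zpk"
    }
}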
Another option, which doesn't involve using script but stays purely declarative, is to stash things in a small temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) if the workload is spread out across parallel or distributed nodes as needed.
Something like:
pipeline {
    agent any
    stages {
        stage('first stage') {
            steps {
                // Write out any environment variables you like to a temporary file
                sh 'echo export FOO=baz > myenv'
                // Stash away for later use
                stash 'myenv'
            }
        }
        stage("later stage") {
            steps {
                // Unstash the temporary file and apply it
                unstash 'myenv'
                // use the unstashed vars
                sh 'source myenv && echo $FOO'
            }
        }
    }
}

Deploy to Heroku staging, then production with Jenkins

I have a Rails application with a Jenkinsfile which I'd like to set up so that a build is first deployed to staging; then, if I am happy with the result, it can be deployed to production.
I've set up 2 Heroku instances, myapp-staging and myapp-production.
My Jenkinsfile has a node block that looks like:
node {
    currentBuild.result = "SUCCESS"
    setBuildStatus("Build started", "PENDING");
    try {
        stage('Checkout') {
            checkout scm
            gitCommit = sh(returnStdout: true, script: 'git rev-parse HEAD').trim()
            shortCommit = gitCommit.take(7)
        }
        stage('Build') {
            parallel 'build-image': {
                sh "docker build -t ${env.BUILD_TAG} ."
            }, 'run-test-environment': {
                sh "docker-compose --project-name myapp up -d"
            }
        }
        stage('Test') {
            ansiColor('xterm') {
                sh "docker run -t --rm --network=myapp_default -e DATABASE_HOST=postgres ${env.BUILD_TAG} ./ci/bin/run_tests.sh"
            }
        }
        stage('Deploy - Staging') {
            // TODO. Use env.BRANCH_NAME to make sure we only deploy from staging
            withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'Heroku Git Login', usernameVariable: 'GIT_USERNAME', passwordVariable: 'GIT_PASSWORD']]) {
                sh('git push https://${GIT_USERNAME}:${GIT_PASSWORD}@git.heroku.com/myapp-staging.git staging')
            }
            setBuildStatus("Staging build complete", "SUCCESS");
        }
        stage('Sanity check') {
            steps {
                input "Does the staging environment look ok?"
            }
        }
        stage('Deploy - Production') {
            // TODO. Use env.BRANCH_NAME to make sure we only deploy from master
            withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'Heroku Git Login', usernameVariable: 'GIT_USERNAME', passwordVariable: 'GIT_PASSWORD']]) {
                sh('git push https://${GIT_USERNAME}:${GIT_PASSWORD}@git.heroku.com/myapp-production.git HEAD:refs/heads/master')
            }
            setBuildStatus("Production build complete", "SUCCESS");
        }
    }
My questions are:
Is this the correct way to do this, or is there some other best practice? For example, do I need two Jenkins pipelines for this, or is one project pipeline enough?
How can I use Jenkins' BRANCH_NAME variable to change dynamically depending on the stage I'm at?
Thanks in advance!
For the first question, using one Jenkinsfile to describe the complete project pipeline is desirable. It keeps the description of the process all in one place and shows you the process flow in one UI, so your Jenkinsfile seems great in that regard.
For the second question, you can wrap steps in if conditions based on the branch. So if you wanted to, say, skip the prod deployment and the step that asks the user if staging looks ok (since you're not going to do the prod deployment) when the branch is not master, this would work:
node('docker') {
    try {
        stage('Sanity check') {
            if (env.BRANCH_NAME == 'master') {
                input "Does the staging environment look ok?"
            }
        }
        stage('Deploy - Production') {
            echo 'deploy check'
            if (env.BRANCH_NAME == 'master') {
                echo 'do prod deploy stuff'
            }
        }
    } catch (error) {
    }
}
I removed some stuff from your pipeline that wasn't necessary to demonstrate the idea, but I also fixed what looked to me like two issues: 1) you seemed to be mixing metaphors between scripted and declarative pipelines. I think you are trying to use a scripted pipeline, so I made it fully scripted; that means you cannot use steps, I think. 2) Your try was missing a catch.
At the end of the day, the UI is a bit weird with this solution, since all stages will always show up in all cases and will just show as green, as if they passed and did what they said they would do (it will look like it deployed to prod, even on non-master branches). There is no way around this with scripted pipelines, to my knowledge. With declarative pipelines, you can do the same conditional logic with when, and the UI (at least the Blue Ocean UI) actually understands your intent and shows it differently (see the sketch below).
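As a rough declarative sketch of that alternative (my own illustration, reusing the stage names from the question), the when directive gates the production steps on the master branch, and Blue Ocean shows the stages as skipped on other branches:
stage('Sanity check') {
    when { branch 'master' }
    steps {
        input "Does the staging environment look ok?"
    }
}
stage('Deploy - Production') {
    when { branch 'master' }
    steps {
        echo 'do prod deploy stuff'
    }
}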
Have fun!

How to change a Jenkins Declarative Pipeline environment variable?

I'm trying to create some Docker images. For that I want to use the version number specified in the Maven pom.xml file as the tag. I am, however, rather new to declarative Jenkins pipelines and I can't figure out how to change my environment variable so that VERSION contains the right version in all stages.
This is my code:
#!groovy
pipeline {
    tools {
        maven 'maven 3.3.9'
        jdk 'Java 1.8'
    }
    environment {
        VERSION = '0.0.0'
    }
    agent any
    stages {
        stage('Checkout') {
            steps {
                git branch: 'master', credentialsId: '290dd8ee-2381-4c5b-8d33-5631d03ee7be', url: 'git@gitlab.crosslang.local:company/SOME-API.git'
                sh "git clean -f && git reset --hard origin/master"
            }
        }
        stage('Build and Test Java code') {
            steps {
                script {
                    def pom = readMavenPom file: 'pom.xml'
                    VERSION = pom.version
                }
                echo "${VERSION}"
                sh "mvn clean install -DskipTests"
            }
        }
        stage('Build Docker images') {
            steps {
                dir('whales-microservice/src/main/docker') {
                    sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
                    script {
                        docker.build "company/whales-microservice:${VERSION}"
                    }
                }
            }
        }
    }
}
The problem is the single quotes in the statement
sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
Single quotes don't interpolate variables in Groovy: http://docs.groovy-lang.org/latest/html/documentation/#_string_interpolation
So you have to double-quote your shell statement:
sh "cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar"
I just wanted to mention that if you have the pipeline-utility-steps plugin installed, you can use readMavenPom() in the environment block, too. It looks like this:
environment {
    VERSION = readMavenPom().getVersion()
}
