I have a multi-branch Jenkins pipeline which is working fine. Now, I'm trying to optimize it. I have the following step:
stage('Migration-Dev') {
    environment {
        ENVIRONMENT = 'development'
    }
    when {
        anyOf {
            branch 'development'
            branch 'staging'
            branch 'production'
        }
    }
    steps {
        echo "Installing the NodeJS dependencies..."
        sh 'npm ci'
        echo "Performing some more tasks"
        sh 'do something else'
    }
}
Every git branch mentioned in the above stage requires its own unique ENVIRONMENT value in order to execute some branch-specific actions.
Is it possible to set the ENVIRONMENT value dynamically depending on the git branch instead of hard-coding it? For example, if the branch is staging, then ENVIRONMENT=release?
I would greatly appreciate any help.
Sure, please have a look at this answer.
For example, this is what I have in my pipeline:
environment {
    APIGEE_ORGANIZATION = "companyName"
}
stages {
    stage('Setup') {
        steps {
            echo 'Getting Proxies source from Gitea'
            script {
                if (env.BRANCH_NAME == 'qas') {
                    echo 'Deployment to qas'
                    env.APIGEE_ENVIRONMENT = 'qas'
                }
                if (env.BRANCH_NAME == 'sandbox') {
                    env.APIGEE_ENVIRONMENT = 'sandbox'
                    echo 'Deployment to sandbox'
                }
                if (env.BRANCH_NAME == 'main') {
                    env.APIGEE_ENVIRONMENT = 'prod'
                    echo 'Deployment to prod'
                }
            }
        }
    }
}
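For the original question (e.g. branch staging → ENVIRONMENT=release), the same idea can be written with a lookup map instead of an if chain. A minimal sketch — the branch names and environment values here are assumptions based on the question:

stage('Migration-Dev') {
    when {
        anyOf {
            branch 'development'
            branch 'staging'
            branch 'production'
        }
    }
    steps {
        script {
            // Hypothetical mapping; adjust the values to your real environment names.
            def envByBranch = [
                development: 'development',
                staging    : 'release',
                production : 'production'
            ]
            env.ENVIRONMENT = envByBranch[env.BRANCH_NAME]
            echo "Using ENVIRONMENT=${env.ENVIRONMENT} for branch ${env.BRANCH_NAME}"
        }
        sh 'npm ci'
    }
}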
I am unable to add the functionality shown (circled) in the attached image using Declarative Pipeline syntax.
PS: I am new to this; I searched other answers but none match my requirements.
For example, if there is a parameter in Jenkins named VERSION, the Maven command should become:
clean deploy -B -s pathtosettings.xml -DVERSION=valueinparameter
Below is my current code.
Note: I want all the parameters added automatically; -DVERSION=${params.VERSION} doesn't help me.
pipeline {
    agent any
    stages {
        stage('Checkout Scm') {
            steps {
                git 'ssh://git#XXXXXXXXXXXXXXXXXXXXXXXXX.git'
            }
        }
        stage('Maven Build 0') {
            steps {
                configFileProvider([configFile(fileId: '0c0631a5-6510-4b4a-833d-4b80fa67d5f3', targetLocation: 'settings.xml', variable: 'SETTINGS_XML')]) {
                    withMaven {
                        sh "mvn clean deploy -B -s ${SETTINGS_XML}"
                    }
                }
            }
        }
    }
    tools {
        jdk 'JDK_1.8'
    }
    parameters {
        string(name: 'VERSION', defaultValue: '3_12_0', description: 'version to be in maven')
    }
}
First, I don't think you need targetLocation for this.
To access your parameter value, you need to use the params prefix.
This is how I'm using configFileProvider to make it work:
configFileProvider([configFile(fileId: 'configFileId', variable: 'SETTINGS_XML')]) {
sh "mvn clean deploy -s \$SETTINGS_XML -B -DVERSION=$params.VERSION"
}
With this, the variable that references the settings file is not interpolated by Groovy and is correctly resolved in my pipeline, and the version is substituted into the command. Don't forget to use a
'Maven settings.xml' type of file in the configFileProvider.
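Applied to the pipeline in the question, the build stage could then look roughly like this (a sketch reusing the fileId from the question; the SETTINGS_XML variable is left for the shell to expand, hence the escaped $, while the VERSION parameter is interpolated by Groovy):

stage('Maven Build 0') {
    steps {
        configFileProvider([configFile(fileId: '0c0631a5-6510-4b4a-833d-4b80fa67d5f3', variable: 'SETTINGS_XML')]) {
            withMaven {
                // $SETTINGS_XML is expanded by the shell; params.VERSION by Groovy
                sh "mvn clean deploy -B -s \$SETTINGS_XML -DVERSION=${params.VERSION}"
            }
        }
    }
}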
steps {
    script {
        foo = " "
        params.each { param ->
            foo = "${foo} -D${param.key}=${param.value} "
        }
    }
    configFileProvider([configFile(fileId: 'XXXX', targetLocation: 'settings.xml', variable: 'SETTINGS_XML')]) {
        withMaven {
            sh "mvn clean deploy -B -s ${SETTINGS_XML} ${foo}"
        }
    }
}
This is the only approach I found.
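A slightly more idiomatic sketch of the same loop, building the property string with collect/join instead of mutating a variable (functionally equivalent to the each above):

steps {
    script {
        // Build a "-Dname=value" pair for every job parameter automatically
        def extraProps = params.collect { name, value -> "-D${name}=${value}" }.join(' ')
        configFileProvider([configFile(fileId: 'XXXX', variable: 'SETTINGS_XML')]) {
            withMaven {
                sh "mvn clean deploy -B -s \$SETTINGS_XML ${extraProps}"
            }
        }
    }
}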
Using a single declarative pipeline (not a multibranch pipeline), is there a way I can trigger a certain stage only if it's the master branch?
I've been unsuccessful with the following:
stage('Deploy') {
    steps {
        script {
            if (env.BRANCH_ENV == 'master') {
                sh "mvn deploy"
            } else {
                echo 'Ignoring'
            }
        }
    }
}
No matter which branch I'm deploying, everything gets ignored.
Any help or advice would be great.
I had the same issue before and figured out that env.BRANCH_ENV does not return what I expected. You can echo env.BRANCH_ENV in your pipeline to confirm.
My solution was to get the git branch manually:
scmVars = checkout scm
gitBranch = sh(
script: "echo ${scmVars.GIT_BRANCH} | cut -d '/' -f2",
returnStdout: true
).trim()
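A minimal sketch of how that value can then gate the deploy stage, assuming the job is configured with "Pipeline script from SCM" so checkout scm is available, and that gitBranch is declared outside the pipeline block so later stages can read it:

def gitBranch = ''

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                script {
                    def scmVars = checkout scm
                    // Strip the remote prefix, e.g. "origin/master" -> "master"
                    gitBranch = sh(
                        script: "echo ${scmVars.GIT_BRANCH} | cut -d '/' -f2",
                        returnStdout: true
                    ).trim()
                }
            }
        }
        stage('Deploy') {
            when {
                expression { gitBranch == 'master' }
            }
            steps {
                sh 'mvn deploy'
            }
        }
    }
}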
Here are some approaches:
Use the return command to end the stage prematurely:
https://stackoverflow.com/a/51406870/3957754
Use the when directive.
The when directive allows the Pipeline to determine whether the stage should be executed depending on the given condition.
Built-in conditions: branch, expression, allOf, anyOf, not, etc.
when {
// Execute the stage when the specified Groovy expression evaluates to true
expression {
return params.ENVIRONMENT ==~ /(?i)(STG|PRD)/
}
}
Complete sample:
https://gist.github.com/HarshadRanganathan/97feed7f91b7ae542c994393447f3db4
I want to pass a variable that I read in stage A to stage B. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but I'm not really successful with that. How can I set it up properly?
To get it working I tried a lot of things, and read that I should use """ instead of ''' to start a shell and escape variables as \${foo}, for example.
Below is what I have as a pipeline:
#!/usr/bin/env groovy
pipeline {
agent { node { label 'php71' } }
environment {
packageName='my-package'
packageVersion=''
groupId='vznl'
nexus_endpoint='http://nexus.devtools.io'
nexus_username='jenkins'
nexus_password='J3nkins'
}
stages{
// Package dependencies
stage('Install dependencies') {
steps {
sh '''
echo Skip composer installation
#composer install --prefer-dist --optimize-autoloader --no-interaction
'''
}
}
// Unit tests
stage('Unit Tests') {
steps {
sh '''
echo Running PHP code coverage tests...
#composer test
'''
}
}
// Create artifact
stage('Package') {
steps {
echo 'Create package refs'
sh """
mkdir -p ./build/zpk
VERSIONTAG=\$(grep 'version' composer.json)
REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
if [[ \${VERSIONTAG} =~ \${REGEX} ]]
then
env.packageVersion=\${BASH_REMATCH[1]}
/usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
else
echo "No version found!"
exit 1
fi
"""
}
}
// Publish ZPK package to Nexus
stage('Publish packages') {
steps {
echo "Publish ZPK Package"
sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
archive includes: './build/**/*.{zpk,rpm,deb}'
}
}
}
}
As you can see, the packageVersion that I read in the Package stage needs to be used in the Publish stage as well.
General feedback on the pipeline is of course always welcome too.
A problem in your code is that you are assigning the version to an environment variable within the sh step. That step executes in its own isolated process, which inherits the parent process's environment variables but cannot modify them.
The only way of passing data back to the parent is through STDOUT/STDERR or the exit code. Since you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context persists and the variable will be available in the subsequent stage. A working example is below. Note that putting this inside a parallel block can fail, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy
pipeline {
environment {
AGENT_INFO = ''
}
agent {
docker {
image 'alpine'
reuseNode true
}
}
stages {
stage('Collect agent info'){
steps {
echo "Current agent info: ${env.AGENT_INFO}"
script {
def agentInfo = sh script:'uname -a', returnStdout: true
println "Agent info within script: ${agentInfo}"
AGENT_INFO = agentInfo.replace("\n", "")
env.AGENT_INFO = AGENT_INFO
}
}
}
stage("Print agent info"){
steps {
script {
echo "Collected agent info: ${AGENT_INFO}"
echo "Environment agent info: ${env.AGENT_INFO}"
}
}
}
}
}
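Applied to the original question, the same returnStdout pattern could look roughly like this for the Package stage (a sketch; the sed expression is a hypothetical way to pull the version out of composer.json):

stage('Package') {
    steps {
        script {
            // Hypothetical extraction of "version": "x.y.z" from composer.json
            def version = sh(
                script: "sed -n 's/.*\"version\": *\"\\([0-9.]*\\)\".*/\\1/p' composer.json",
                returnStdout: true
            ).trim()
            env.packageVersion = version
        }
        sh """
            mkdir -p ./build/zpk
            /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
        """
    }
}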
Another option, which doesn't involve script and stays purely declarative, is to stash the values in a small temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) even if the workload is sprayed out across parallel or distributed nodes.
Something like:
pipeline {
agent any
stages {
stage('first stage') {
steps {
// Write out any environment variables you like to a temporary file
sh 'echo export FOO=baz > myenv'
// Stash away for later use
stash 'myenv'
}
}
stage ("later stage") {
steps {
// Unstash the temporary file and apply it
unstash 'myenv'
// use the unstashed vars
sh 'source myenv && echo $FOO'
}
}
}
}
My project has a Jenkinsfile that runs smoothly. The problem is that I need to run some commands only on certain occasions. I'm using the GitHub plugin. I need to run the deploy only on master or on a new tag; one will be for staging and the other for production.
pipeline {
agent any
stages {
stage('Test') {
steps {
sh 'node -v'
sh 'yarn install'
sh 'yarn test -- --coverage'
}
}
stage('Build') {
steps {
sh 'yarn build'
}
}
stage('Deploy') {
steps {
sh 'aws s3 sync ./build s3://my.bucket --only-show-errors'
}
}
}
}
I need master to deploy to one bucket and a new tag to deploy to another. How can I create this conditional?
How about the following, working as two conditionals for two separate deployment scenarios? I think it's better to handle this with variables that indicate the deployment scenario instead of splitting it into two distinctly different stages, though. You could, for example, write a shell script that handles everything internally depending on tags/branches/whatever you need, instead of forcing yourself to control this at the pipeline level.
Each stage will have its steps executed only when its when part is satisfied. Stage Deploy will only run for the master branch, while stage Deploy_NonMaster will only run for any non-master branch. Using the method written in the when conditionals you can check for anything, including tags.
stage ('Deploy') {
when {
expression {
GIT_BRANCH = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
return (GIT_BRANCH == 'master')
}
}
steps {
echo 'Do stuff/deploy.'
}
}
stage ('Deploy_NonMaster') {
when {
expression {
GIT_BRANCH = sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
return !(GIT_BRANCH == 'master')
}
}
steps {
echo 'Do stuff/deploy.'
}
}
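For the tag half of the question: if the job is set up as a multibranch pipeline that also discovers tags, the declarative when directive has branch and buildingTag()/tag conditions, which avoids shelling out to git entirely. A sketch — the two bucket names are made up here to illustrate the staging/production split:

stage('Deploy staging') {
    when { branch 'master' }
    steps {
        sh 'aws s3 sync ./build s3://my.staging.bucket --only-show-errors'
    }
}
stage('Deploy production') {
    // buildingTag() matches any tag build; tag 'v*' would restrict it to a pattern
    when { buildingTag() }
    steps {
        sh 'aws s3 sync ./build s3://my.production.bucket --only-show-errors'
    }
}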
I'm trying to create some Docker images. For that I want to use the version number specified in the Maven pom.xml file as the tag. However, I am rather new to declarative Jenkins pipelines and I can't figure out how to set my environment variable so that VERSION contains the right version in all stages.
This is my code
#!groovy
pipeline {
tools {
maven 'maven 3.3.9'
jdk 'Java 1.8'
}
environment {
VERSION = '0.0.0'
}
agent any
stages {
stage('Checkout') {
steps {
git branch: 'master', credentialsId: '290dd8ee-2381-4c5b-8d33-5631d03ee7be', url: 'git#gitlab.crosslang.local:company/SOME-API.git'
sh "git clean -f && git reset --hard origin/master"
}
}
stage('Build and Test Java code') {
steps {
script {
def pom = readMavenPom file: 'pom.xml'
VERSION = pom.version
}
echo "${VERSION}"
sh "mvn clean install -DskipTests"
}
}
stage('Build Docker images') {
steps {
dir('whales-microservice/src/main/docker'){
sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
script {
docker.build "company/whales-microservice:${VERSION}"
}
}
}
}
}
}
The problem is the single-quoted string in the statement
sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
Single quotes don't interpolate variables in Groovy: http://docs.groovy-lang.org/latest/html/documentation/#_string_interpolation
So you have to double-quote your shell statement:
sh "cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar"
I just wanted to mention that if you have the pipeline-utility-steps plugin installed, you can use readMavenPom() in the environment block too. It looks like this:
environment {
VERSION = readMavenPom().getVersion()
}