Use environment variable in Jenkins pipeline options {} block

I'm trying to use an env var in Jenkins' pipeline options {} block, but it seems they aren't interpolated at that stage. Am I missing something, or is this intentional and there's no way to make it work?
Example:
pipeline {
    agent {
        docker {
            image '...'
            label 'docker'
        }
    }
    environment {
        MAGIC_APP_NAME = "xxx"
        MAGIC_APP_ID = "yyy"
    }
    options {
        connection: gitLabConnection("GitLab-${env.MAGIC_APP_ID}")
    }
}

Related

how to add shell environment variable to global env variable in jenkins?

I have the below Jenkins pipeline and it is working fine:
pipeline {
    agent {
        node {
            label 'test'
        }
    }
    environment {
        ansible_pass = credentials('ans-pass')
    }
    stages {
        stage('Load Vars') {
            steps {
                script {
                    configFileProvider([configFile(fileId: "${ENV_CONFIG_ID}", targetLocation: "${ENV_CONFIG_FILE}")]) {
                        load "${ENV_CONFIG_FILE}"
                    }
                }
            }
        }
        stage('svc install') {
            steps {
                sshagent(["${SSH_KEY_ID}"]) {
                    sh '''
                    ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file $ansible_pass
                    '''
                }
            }
        }
    }
}
Now I want to pass the credentials ID from the managed file instead of hardcoding it in
ansible_pass = credentials('ans-pass')
That is, the ID ansible-pass1 should come from the managed files (Config File Provider).
I already have the following in the managed file:
env.ARTI_TOKEN_ID='art-token'
env.PLAYBOOK_REPO='dep.stg'
env.SSH_KEY_ID = 'test_key'
Now how do I add this credentials ID to that file? I tried the following:
env.ansible_pass = 'ansible-pass1'
and referred to it in the Jenkins pipeline like this:
environment {
    ansible_pass = 'credentials($ansible_pass)'
}
But it didn't work. Could you please advise?
As you are using secrets in a config file, it is better to use the 'secret file' credential type in Jenkins. See the Jenkins documentation on the different types of credentials.
Also, the correct way of setting credentials is:
environment {
    ansible_pass = credentials('credentials-id-here')
}
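If the credentials ID itself has to come out of the managed file, one option (a minimal sketch, not the answer's exact code) is to skip the environment {} binding and resolve the credential inside a script step, where the ID can be a runtime value. This assumes the loaded config sets env.ansible_pass to the ID of a 'secret file' credential; the stage and variable names below are illustrative:

stage('svc install') {
    steps {
        script {
            // env.ansible_pass is assumed to hold a credentials ID loaded from the managed file
            withCredentials([file(credentialsId: env.ansible_pass, variable: 'VAULT_PASS_FILE')]) {
                sshagent(["${SSH_KEY_ID}"]) {
                    // the bound secret file path is exposed to the shell as VAULT_PASS_FILE
                    sh 'ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file "$VAULT_PASS_FILE"'
                }
            }
        }
    }
}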

Setting environment variable in Jenkins pipeline stage from build parameter

I would like to configure an environment variable for my Jenkins pipeline, but dynamically based on an input parameter to the build. I'm trying to configure my pipeline to set the KUBECONFIG environment variable for kubectl commands.
My pipeline is as follows (slightly changed):
pipeline {
    parameters {
        choice(name: 'CLUSTER_NAME', choices: 'cluster1/cluster2')
    }
    stages {
        // Parallel stages since only one environment variable should be set, based on input
        stage('Set environment variable') {
            parallel {
                stage('Set cluster1') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster1"
                        }
                    }
                    environment {
                        KUBECONFIG = "~/kubeconf/cluster1.conf"
                    }
                    steps {
                        echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                    }
                }
                stage('Set cluster2') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster2"
                        }
                    }
                    environment {
                        KUBECONFIG = "~/kubeconf/cluster2.conf"
                    }
                    steps {
                        echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                    }
                }
            }
        }
        stage('Test env') {
            steps {
                sh "cat ${env.KUBECONFIG}"
            }
        }
    }
}
However, while the stage where I set the environment variable can print it, once I move to another stage I only get null.
Is there some way of sharing env variables between stages? Since I'd like to rely on the default KUBECONFIG lookup (and not specify a file/context in my kubectl commands), it would be much easier to find a way to dynamically set the env variable.
I've seen the EnvInject plugin mentioned, but was unable to get it working for a pipeline, and was struggling with the documentation.
I guess that with environment {} you are setting the environment variable only for the stage where it runs; it does not affect the environment of the pipeline itself. Set environment variables as below to affect the main context. Works for me.
pipeline {
    agent any
    parameters {
        choice(name: 'CLUSTER_NAME', choices: 'cluster1\ncluster2')
    }
    stages {
        // Parallel stages since only one environment variable should be set, based on input
        stage('Set environment variable') {
            parallel {
                stage('Set cluster1') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster1"
                        }
                    }
                    steps {
                        script {
                            env.KUBECONFIG = "~/kubeconf/cluster1.conf"
                            echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                        }
                    }
                }
                stage('Set cluster2') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster2"
                        }
                    }
                    steps {
                        script {
                            env.KUBECONFIG = "~/kubeconf/cluster2.conf"
                            echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                        }
                    }
                }
            }
        }
        stage('Test env') {
            steps {
                sh "cat ${env.KUBECONFIG}"
            }
        }
    }
}

How to use env variable inside triggers section in jenkins pipeline?

I am reading a properties file to get the node label and triggerConfigURL. The node label works, but I couldn't read and set triggerConfigURL from the environment.
def propFile = "hello/world.txt" // This is present in the workspace, and it works.
pipeline {
    environment {
        nodeProp = readProperties file: "${propFile}"
        nodeLabel = "$nodeProp.NODE_LABEL"
        dtcPath = "$nodeProp.DTC"
    }
    agent { label env.nodeLabel } // this works!! sets NODE_LABEL value from the properties file.
    triggers {
        gerrit dynamicTriggerConfiguration: 'true',
            triggerConfigURL: env.dtcPath, // THIS DOESN'T WORK, tried "${env.dtcPath}" and a few other notations too.
            serverName: 'my-gerrit-server',
            triggerOnEvents: [commentAddedContains('^fooBar$')]
    }
    stages {
        stage('Print Env') {
            steps {
                script {
                    sh 'env' // This prints "dtcPath=https://path/of/the/dtc/file", so the dtcPath env is set.
                }
            }
        }
    }
}
After running the job, the resulting trigger configuration (screenshot omitted) still shows that triggerConfigURL was not set from the environment.
Jenkins evaluates the environment and triggers clauses in a fixed order, and it looks like you have experimentally proven that triggers runs first and environment second. It also looks like agent runs after environment as well.
While I don't know why the developers made this specific decision, I think you are in a kind of chicken-and-egg problem, where you want to define the pipeline using a file but can only read the file once the pipeline is defined and running.
Having said that, the following might work:
def propFile = "hello/world.txt"
def nodeProp = null
node {
    nodeProp = readProperties file: propFile
}
pipeline {
    environment {
        nodeLabel = nodeProp.NODE_LABEL
        dtcPath = nodeProp.DTC
    }
    agent { label env.nodeLabel }
    triggers {
        gerrit dynamicTriggerConfiguration: 'true',
            triggerConfigURL: nodeProp.DTC,
            //etc.

Dynamically select agent in Jenkinsfile

I want to be able to select whether a pipeline stage is executed with the dockerfile agent, depending on the presence of a Dockerfile in the repository. If there's no Dockerfile, the stage should run locally.
I tried something like:
pipeline {
    stage('AwesomeStage') {
        when {
            beforeAgent true
            expression { return fileExists("Dockerfile") }
        }
        agent { dockerfile }
        steps {
            // long list of awesome steps that should be run either on Docker or locally, depending on the presence of a Dockerfile
        }
    }
}
But the result is that the whole stage is skipped when there's no Dockerfile.
Is it possible to do something like the following block?
//...
if (fileExists("Dockerfile")) {
    agent { dockerfile }
}
else {
    agent none
}
//...
I came up with this solution, which relies on defining a function to avoid repetition and defines two different stages according to the type of agent.
If anyone has a more elegant solution, please let me know.
def awesomeScript() {
    // long list of awesome steps that should be run either on Docker or locally, depending on the presence of a Dockerfile
}
pipeline {
    stage('AwesomeStageDockerfile') {
        when {
            beforeAgent true
            expression { return fileExists("Dockerfile") }
        }
        agent { dockerfile }
        steps {
            awesomeScript()
        }
    }
    stage('AwesomeStageLocal') {
        when {
            beforeAgent true
            expression { return !fileExists("Dockerfile") }
        }
        agent none
        steps {
            awesomeScript()
        }
    }
}
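For comparison, the same idea can also be expressed as a scripted pipeline, where a plain if can choose the execution context. A minimal sketch, assuming the Docker Pipeline plugin is available; the node label and image name are illustrative:

def awesomeScript() {
    // long list of awesome steps shared by both paths
}
node('docker') {
    checkout scm
    if (fileExists('Dockerfile')) {
        // build an image from the repository's Dockerfile and run the steps inside it
        docker.build('awesome-image').inside {
            awesomeScript()
        }
    } else {
        // no Dockerfile: run the steps directly on the node
        awesomeScript()
    }
}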

Conditional environment variables in Jenkins Declarative Pipeline

I'm trying to get a declarative pipeline that looks like this:
pipeline {
    environment {
        ENV1 = 'default'
        ENV2 = 'default also'
    }
}
The catch is, I'd like to be able to override the values of ENV1 or ENV2 based on an arbitrary condition. My current need is just to base it off the branch but I could imagine more complicated conditions.
Is there any sane way to implement this? I've seen some examples online that do something like:
stages {
    stage('Set environment') {
        steps {
            script {
                ENV1 = 'new1'
            }
        }
    }
}
But I believe this isn't setting the actually environment variable, so much as it is setting a local variable which is overriding later calls to ENV1. The problem is, I need these environment variables read by a nodejs script, and those need to be real machine environment variables.
Is there any way to set environment variables to be dynamic in a jenkinsfile?
Maybe you can try Groovy's ternary operator:
pipeline {
    agent any
    environment {
        ENV_NAME = "${env.BRANCH_NAME == "develop" ? "staging" : "production"}"
    }
}
or extract the conditional to a function:
pipeline {
    agent any
    environment {
        ENV_NAME = getEnvName(env.BRANCH_NAME)
    }
}
// ...
def getEnvName(branchName) {
    if ("int".equals(branchName)) {
        return "int";
    } else if ("production".equals(branchName)) {
        return "prod";
    } else {
        return "dev";
    }
}
But, actually, you can do whatever you want using Groovy syntax (at least the features supported by Jenkins).
So the most flexible option would be to play with regexes and branch names, so you can fully support Git Flow if that's the way you do it at the VCS level; see the sketch below.
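A minimal sketch of that regex-on-branch-names idea, assuming Git Flow style branch names (the patterns and environment names here are illustrative):

pipeline {
    agent any
    environment {
        ENV_NAME = getEnvName(env.BRANCH_NAME)
    }
    // ...
}

def getEnvName(branchName) {
    // map Git Flow branch patterns to environment names
    if (branchName == 'master') {
        return 'prod'
    } else if (branchName ==~ /(release|hotfix)\/.+/) {
        return 'staging'
    } else {
        return 'dev'
    }
}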
Use withEnv to set environment variables dynamically for use in a certain part of your pipeline (when running your node script, for example), like this (replace the contents of the sh steps with your node script):
pipeline {
    agent { label 'docker' }
    environment {
        ENV1 = 'default'
    }
    stages {
        stage('Set environment') {
            steps {
                sh "echo $ENV1" // prints default
                // override with a hardcoded value
                withEnv(['ENV1=newvalue']) {
                    sh "echo $ENV1" // prints newvalue
                }
                // override with a variable
                script {
                    def newEnv1 = 'new1'
                    withEnv(['ENV1=' + newEnv1]) {
                        sh "echo $ENV1" // prints new1
                    }
                }
            }
        }
    }
}
Here is the correct syntax to conditionally set a variable in the environment section.
environment {
    MASTER_DEPLOY_ENV = "TEST"   // Likely set as a pipeline parameter
    RELEASE_DEPLOY_ENV = "PROD"  // Likely set as a pipeline parameter
    DEPLOY_ENV = "${env.BRANCH_NAME == 'master' ? env.MASTER_DEPLOY_ENV : env.RELEASE_DEPLOY_ENV}"
    CONFIG_ENV = "${env.BRANCH_NAME == 'master' ? 'MASTER' : 'RELEASE'}"
}
I managed to get this working by explicitly calling the shell in the environment section, like so:
UPDATE_SITE_REMOTE_SUFFIX = sh(returnStdout: true, script: "if [ \"$GIT_BRANCH\" == \"develop\" ]; then echo \"\"; else echo \"-$GIT_BRANCH\"; fi").trim()
However, I know that my Jenkins runs on *nix, so it's probably not that portable.
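A sketch of a more portable variant of the same idea, using the Groovy ternary shown earlier instead of shelling out (assuming GIT_BRANCH is populated by the SCM plugin):

environment {
    // empty suffix on develop, "-<branch>" otherwise, without calling the shell
    UPDATE_SITE_REMOTE_SUFFIX = "${env.GIT_BRANCH == 'develop' ? '' : '-' + env.GIT_BRANCH}"
}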
Here is a way to set the environment variables with high flexibility, using maps:
stage("Environment_0") {
steps {
script {
def MY_MAP = [ME: "ASSAFP", YOU: "YOUR_NAME", HE: "HIS_NAME"]
env.var3 = "HE"
env.my_env1 = env.null_var ? "not taken" : MY_MAP."${env.var3}"
echo("env.my_env1: ${env.my_env1}")
}
}
}
This way gives a wide variety of options, and if it is not enough, a map of maps can be used to widen the span even more.
Of course, the switching can also be driven by input parameters, so the environment variables are set according to the input parameter values; see the sketch below.
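A minimal sketch of that map-of-maps idea keyed off a build parameter (the parameter name, keys and values are illustrative):

pipeline {
    agent any
    parameters {
        choice(name: 'TARGET_ENV', choices: ['dev', 'staging', 'prod'])
    }
    stages {
        stage('Set env from map') {
            steps {
                script {
                    // one inner map of settings per target environment
                    def SETTINGS = [
                        dev    : [URL: 'https://dev.example.com',     REPLICAS: '1'],
                        staging: [URL: 'https://staging.example.com', REPLICAS: '2'],
                        prod   : [URL: 'https://prod.example.com',    REPLICAS: '4']
                    ]
                    def selected = SETTINGS[params.TARGET_ENV]
                    env.APP_URL = selected.URL
                    env.APP_REPLICAS = selected.REPLICAS
                    echo "APP_URL=${env.APP_URL}, APP_REPLICAS=${env.APP_REPLICAS}"
                }
            }
        }
    }
}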
pipeline {
    agent none
    environment {
        ENV1 = 'default'
        ENV2 = 'default'
    }
    stages {
        stage('Preparation') {
            steps {
                script {
                    ENV1 = 'foo' // or a variable
                    ENV2 = 'bar' // or a variable
                }
                echo ENV1
                echo ENV2
            }
        }
        stage('Build') {
            steps {
                sh "echo ${ENV1} and ${ENV2}"
            }
        }
        // more stages...
    }
}
This method is simpler and looks better. The overridden environment variables will also be applied to all other stages.
I tried to do it in a different way, but unfortunately it does not entirely work:
pipeline {
    agent any
    environment {
        TARGET = "${changeRequest() ? CHANGE_TARGET : BRANCH_NAME}"
    }
    stages {
        stage('setup') {
            steps {
                echo "target=${TARGET}"
                echo "${BRANCH_NAME}"
            }
        }
    }
}
Strangely enough, this works for my pull request builds (changeRequest() returns true and TARGET becomes my target branch name), but it does not work for my CI builds (in which case the branch name is e.g. release/201808 but the resulting TARGET evaluates to null).
