How to add a shell environment variable to a global env variable in Jenkins?

I have the below Jenkins pipeline and it is working fine:
pipeline {
    agent {
        node {
            label 'test'
        }
    }
    environment {
        ansible_pass = 'credentials('ans-pass')'
    }
    stages {
        stage('Load Vars') {
            steps {
                script {
                    configFileProvider([configFile(fileId: "${ENV_CONFIG_ID}", targetLocation: "${ENV_CONFIG_FILE}")]) {
                        load "${ENV_CONFIG_FILE}"
                    }
                }
            }
        }
        stage('svc install') {
            steps {
                sshagent(["${SSH_KEY_ID}"]) {
                    sh '''
                        ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file $ansible_pass
                    '''
                }
            }
        }
    }
}
Now I want to pass the credential ID via a global environment variable instead of hardcoding it, i.e. replace
ansible_pass = 'credentials('ans-pass')'
so that the ID (ansible-pass1) comes from the managed files (Config File Provider).
I already have the following in the managed file:
env.ARTI_TOKEN_ID='art-token'
env.PLAYBOOK_REPO='dep.stg'
env.SSH_KEY_ID = 'test_key'
Now how do I add this credential ID in this file? I tried the below:
env.ansible_pass = 'ansible-pass1'
and in the Jenkins pipeline referred to it like this:
environment{
ansible_pass = 'credentials($ansible_pass)'
}
But it didn't work. Could you please advise?

As you are using secrets in a config file, it is better to use the 'secret file' credential type in Jenkins; see the Jenkins documentation on the different types of credentials.
Also, the correct way of setting credentials is:
environment{
ansible_pass = credentials('credentials-id-here')
}
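If the credential ID itself has to come from the managed file (so it is only known after the 'Load Vars' stage), one option is to skip the environment{} binding and bind the secret inside the stage with withCredentials, using the ID loaded earlier. A minimal sketch, assuming the managed file sets env.ansible_pass_id = 'ansible-pass1' (a hypothetical name) and that 'ansible-pass1' is a 'secret file' credential holding the vault password:
stage('svc install') {
    steps {
        sshagent(["${SSH_KEY_ID}"]) {
            // Bind the secret file whose ID was loaded from the managed config file
            withCredentials([file(credentialsId: env.ansible_pass_id, variable: 'VAULT_PASS_FILE')]) {
                sh '''
                    ansible-playbook main.yaml -i hosts.yaml -b --vault-password-file "$VAULT_PASS_FILE"
                '''
            }
        }
    }
}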

Related

Reduce Stages in Jenkinsfile when using Multibranch Pipeline

In this Jenkinsfile I am trying to execute the same stage for different branches in a Multibranch Pipeline. I have to configure the environment variables for each branch name every time. Is there a better way to do this?
stage('Create New AMI for master branch') {
    when { branch 'master' }
    environment {
        BRANCH_NAME = "${env.BRANCH_NAME}"
        ENV_NAME = "prod"
    }
    steps {
        sh "packer build jenkins/${PROJECT_NAME}/${PROJECT_NAME}-ami.json"
    }
}
stage('Create New AMI for development branch') {
    when { branch 'development' }
    environment {
        BRANCH_NAME = "${env.BRANCH_NAME}"
        ENV_NAME = "dev"
    }
    steps {
        sh "packer build jenkins/${PROJECT_NAME}/${PROJECT_NAME}-ami.json"
    }
}
stage('Create New AMI for staging branch') {
    when { branch 'staging' }
    environment {
        BRANCH_NAME = "${env.BRANCH_NAME}"
        ENV_NAME = "staging"
    }
    steps {
        sh "packer build jenkins/${PROJECT_NAME}/${PROJECT_NAME}-ami.json"
    }
}
Please use a shared library in this case, which will contain all your stage implementations. A reference example implementation is shown below.
Create a Groovy file named stagelibraries:
#!groovy
// Functions (stage definitions) that will be called from your Jenkinsfile
def Create_AMI(PROJECT_NAME, ENV_NAME) {
    echo ENV_NAME
    sh "packer build jenkins/${PROJECT_NAME}/${PROJECT_NAME}-ami.json"
    // You can also set environment variables here, for example:
    env.ENV_NAME = "dev"
}
return this
In your Jenkinsfile, write the below code:
// Global variable used to hold the loaded groovy file (shared library)
def stagelibrary
stage('Create New AMI') {
    steps {
        script {
            // Load the shared library Groovy file; use the path where the stagelibraries file was created
            stagelibrary = load 'C:\\Jenkins\\stagelibraries'
            // Execute the Create_AMI function. You can add if/else conditions here based on your branches.
            // You can also pass your environment variable to the Create_AMI function using env.<YOURENVVARIABLE>
            stagelibrary.Create_AMI(PROJECT_NAME, env.ENV_NAME)
        }
    }
}
The above example was shown to provide an overview of shared libraries; with it you don't need to write the same functions or redundant stages.
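For the branch-based if/else the example mentions, a small lookup inside the script block keeps everything in one stage. A minimal sketch, assuming the stagelibraries file above and the BRANCH_NAME variable provided by a multibranch pipeline (the mapping itself is hypothetical):
script {
    stagelibrary = load 'C:\\Jenkins\\stagelibraries'
    // Map each branch to its environment name, falling back to 'dev'
    def envByBranch = [master: 'prod', development: 'dev', staging: 'staging']
    def envName = envByBranch[env.BRANCH_NAME] ?: 'dev'
    stagelibrary.Create_AMI(PROJECT_NAME, envName)
}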

Jenkins Environment Variables Conditional set

So I tried a lot of different renditions of this with no success unless the environment was set before the stages. I am trying to define the AWS credentials environment depending on the branch I'm in: if the branch is qa, then use the qa creds for the env. But it does not get set when it's inside the stage phase:
agent {
    docker {
        image '/terraform-npm:latest'
        registryCredentialsId 'dockerhubPW'
    }
}
stages {
    stage('Initialize Dev Environment') {
        when {
            branch 'dev'
        }
        environment {
            TF_VAR_aws_access_key = credentials('dev-aws-access-key-id')
            TF_VAR_aws_secret_key = credentials('dev-aws-secret-access-key')
            AWS_ACCESS_KEY_ID = credentials('dev-aws-access-key-id')
            AWS_SECRET_ACCESS_KEY = credentials('dev-aws-secret-access-key')
            AWS_REGION = "us-west-2"
        }
        steps {
            sh 'terraform init -backend-config="bucket=${GIT_BRANCH}-terraform-state" -backend-config="dynamodb_table=${GIT_BRANCH}-terraform-state-locking" -backend-config="region=$AWS_REGION" -backend-config="key=${GIT_BRANCH}-terraform-state/terraform.tfstate"'
        }
    }
If I set it before the stages block in the pipeline, of course it works:
agent {
    docker {
        image '/terraform-npm:latest'
        registryCredentialsId 'dockerhubPW'
    }
}
environment {
    TF_VAR_aws_access_key = credentials('dev-aws-access-key-id')
    TF_VAR_aws_secret_key = credentials('dev-aws-secret-access-key')
    AWS_ACCESS_KEY_ID = credentials('dev-aws-access-key-id')
    AWS_SECRET_ACCESS_KEY = credentials('dev-aws-secret-access-key')
    AWS_REGION = "us-west-2"
}
stages {
    stage('Initialize Dev Environment') {
        when {
            branch 'dev'
        }
        steps {
            sh 'terraform init -backend-config="bucket=${GIT_BRANCH}-terraform-state" -backend-config="dynamodb_table=${GIT_BRANCH}-terraform-state-locking" -backend-config="region=$AWS_REGION" -backend-config="key=${GIT_BRANCH}-terraform-state/terraform.tfstate"'
        }
    }
My question is: is there a way to set the environment variables before the stages block, but conditionally, depending on the branch?
Well, yes, there is.
First option: you can run a combination of scripted and declarative pipeline (please note that I haven't checked that it works; this is just to send you down the right path):
// scripted pipeline
node('master') {
    stage("Init variables") {
        if (env.GIT_BRANCH == 'dev') {
            env.AWS_REGION = "us-west-2"
        }
        else {
            // ...
        }
    }
}
// declarative pipeline
pipeline {
    agent {
        docker {
            image '/terraform-npm:latest'
            registryCredentialsId 'dockerhubPW'
        }
    }
    stages {
        stage('Use variables') {
            steps {
                sh 'echo $AWS_REGION'
            }
        }
    }
}
Another option is to use the withEnv directive inside steps:
stage('Initialize Dev Environment') {
    when {
        branch 'dev'
    }
    steps {
        withEnv(['AWS_REGION=us-west-2']) {
            sh 'echo $AWS_REGION'
        }
    }
}
Thank you MaratC for guiding me down the right path, it definitely helped. Here is what I used:
steps {
    withCredentials([
        string(credentialsId: 'qa-aws-access-key-id', variable: 'TF_VAR_aws_access_key'),
        string(credentialsId: 'qa-aws-secret-access-key', variable: 'TF_VAR_aws_secret_key'),
        string(credentialsId: 'qa-aws-access-key-id', variable: 'AWS_ACCESS_KEY_ID'),
        string(credentialsId: 'qa-aws-secret-access-key', variable: 'AWS_SECRET_ACCESS_KEY')
    ]) {
        sh 'terraform plan -var-file=${GIT_BRANCH}.tfvars -out=${GIT_BRANCH}-output.plan'
    }
}
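To keep this branch-aware (the original goal), the credential IDs themselves can also be derived from the branch name before binding. A minimal sketch, assuming the IDs follow the <branch>-aws-... naming used above (the derivation is hypothetical):
steps {
    script {
        // Derive the credential ID prefix from the branch, e.g. dev-, qa-
        def prefix = env.GIT_BRANCH
        withCredentials([
            string(credentialsId: "${prefix}-aws-access-key-id", variable: 'AWS_ACCESS_KEY_ID'),
            string(credentialsId: "${prefix}-aws-secret-access-key", variable: 'AWS_SECRET_ACCESS_KEY')
        ]) {
            sh 'terraform plan -var-file=${GIT_BRANCH}.tfvars -out=${GIT_BRANCH}-output.plan'
        }
    }
}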

Setting environment variable in Jenkins pipeline stage from build parameter

I would like to configure an environment variable for my Jenkins pipeline, but dynamically based on an input parameter to the build. I'm trying to configure my pipeline to set the KUBECONFIG environment variable for kubectl commands.
My pipeline is as follows (slightly changed):
pipeline {
    parameters {
        choice(name: 'CLUSTER_NAME', choices: 'cluster1/cluster2')
    }
    stages {
        // Parallel stages since only one environment variable should be set, based on input
        stage('Set environment variable') {
            parallel {
                stage('Set cluster1') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster1"
                        }
                    }
                    environment {
                        KUBECONFIG = "~/kubeconf/cluster1.conf"
                    }
                    steps {
                        echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                    }
                }
                stage('Set cluster2') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster2"
                        }
                    }
                    environment {
                        KUBECONFIG = "~/kubeconf/cluster2.conf"
                    }
                    steps {
                        echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                    }
                }
            }
        }
        stage('Test env') {
            steps {
                sh "cat ${env.KUBECONFIG}"
            }
        }
    }
}
However, while the stage where I set the environment variable can print it, once I move to another stage I only get null.
Is there some way of sharing env variables between stages? Since I'd like to rely on the default KUBECONFIG environment variable (and not specify a file/context in my kubectl commands), it would be much easier to find a way to dynamically set the env variable.
I've seen the EnvInject plugin mentioned, but was unable to get it working for a pipeline, and was struggling with the documentation.
I guess that with environment{} you are setting the environment variable only for the stage where it runs; it does not affect the environment of the pipeline itself. Set environment variables as below to affect the main context. This works for me:
pipeline {
    agent any
    parameters {
        choice(name: 'CLUSTER_NAME', choices: 'cluster1\ncluster2')
    }
    stages {
        // Parallel stages since only one environment variable should be set, based on input
        stage('Set environment variable') {
            parallel {
                stage('Set cluster1') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster1"
                        }
                    }
                    steps {
                        script {
                            env.KUBECONFIG = "~/kubeconf/cluster1.conf"
                            echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                        }
                    }
                }
                stage('Set cluster2') {
                    when {
                        expression {
                            params.CLUSTER_NAME == "cluster2"
                        }
                    }
                    steps {
                        script {
                            env.KUBECONFIG = "~/kubeconf/cluster2.conf"
                            echo "Using KUBECONFIG: ${env.KUBECONFIG}"
                        }
                    }
                }
            }
        }
        stage('Test env') {
            steps {
                sh "cat ${env.KUBECONFIG}"
            }
        }
    }
}
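Since env.KUBECONFIG set inside a script block is visible to the whole pipeline, the two parallel stages could also be collapsed into one. A minimal sketch of that variation, assuming the same CLUSTER_NAME parameter:
stage('Set environment variable') {
    steps {
        script {
            // The chosen cluster name drives the kubeconfig path directly
            env.KUBECONFIG = "~/kubeconf/${params.CLUSTER_NAME}.conf"
        }
    }
}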

Declarative Jenkins Pipeline; How to declare a variable and use it in script or mail notification?

(update below)
I have a declarative pipeline job which can take an argument VERSION.
pipeline {
parameters {
string(name: VERSION, defaultValue: '')
}
// ...
}
If no VERSION is given, like when GitLab sends a hook to this job, I want to compute it from git, so I do something like this:
stages {
stage('Prepare') {
steps {
// ...
if (! env.VERSION) {
VERSION = sh(script: "git describe", returnStdout: true).trim()
}
}
}
}
Now I want to "inject" this variable into:
- my build script, which needs to find "VERSION" in its environment variables;
- the Jenkins mail notificator, so it can retrieve ${VERSION} in the subject or body text.
I tried changing the above code to:
stages {
stage('Prepare') {
steps {
// ...
if (! env.VERSION) {
env.VERSION = sh(script: "git describe", returnStdout: true).trim()
}
}
}
}
I got this error: groovy.lang.MissingPropertyException: No such property: VERSION for class: groovy.lang.Binding
I then tried to add an environment block:
environment {
VERSION = ${VERSION}
}
but it didn't solve my problem.
I'm looking for any help to solve it.
UPDATE
I now have a working pipeline which looks like this:
pipeline {
    agent any
    parameters {
        string(name: 'VERSION', defaultValue: '')
    }
    environment {
        def VERSION = "${params.VERSION}"
    }
    stages {
        stage('Prepare & Checkout') {
            steps {
                script {
                    if (! env.VERSION) {
                        VERSION = sh(script: "date", returnStdout: true).trim()
                    }
                    echo "** version: ${VERSION} **"
                }
            }
        }
        stage('Build') {
            steps {
                // sh "./build.sh"
                echo "** version2: ${VERSION} **"
            }
        }
    } // stages
    post {
        always {
            mail to: 'foo@example.com',
                subject: "SUCCESS: ${VERSION}",
                body: """<html><body><p>SUCCESS</p></body></html>""",
                mimeType: 'text/html',
                charset: 'UTF-8'
            deleteDir()
        }
    }
} // pipeline
I needed to add the environment block to be able to get $VERSION in all stages (not only in the one where it is manipulated).
I still need to find a way to inject this $VERSION variable into the environment variables, so that my build script can find it.
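A minimal sketch of one way to do that injection: write the computed value back to env.VERSION inside the script block, since env entries are exported as real environment variables to sh steps, where a build script can read them.
stage('Prepare & Checkout') {
    steps {
        script {
            if (! env.VERSION) {
                // Writing to env makes the value visible to later sh steps
                env.VERSION = sh(script: "git describe", returnStdout: true).trim()
            }
        }
        // A build script sees VERSION as an ordinary environment variable
        sh 'echo "building version: $VERSION"'
    }
}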
If you want to inject the variable into the environment so that you can use it later, you could define another variable that is equal to env.VERSION or to the output of the shell script, then use that variable in your pipeline, e.g.:
pipeline {
parameters {
string(name: VERSION, defaultValue: '')
}
def version = env.VERSION
stages {
stage('Prepare') {
steps {
// ...
if (!version) {
version = sh(script: "git describe", returnStdout: true).trim()
}
}
}
mail subject: "$version build succeeded", ...
}
If you want other jobs to be able to access the value of VERSION after the build is run, you can write it in a file and archive it.
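A minimal sketch of that approach, using the standard writeFile and archiveArtifacts steps (the file name is arbitrary):
script {
    // Persist the computed version so downstream jobs can fetch it from this build's artifacts
    writeFile file: 'version.txt', text: version
    archiveArtifacts artifacts: 'version.txt', fingerprint: true
}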
Edit:
In order for your script to be able to use the version variable, you can either make your script take version as a parameter or you can use the withEnv step.
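A minimal sketch of the withEnv variant, assuming version holds the value computed earlier and build.sh is the (hypothetical) build script:
withEnv(["VERSION=${version}"]) {
    // build.sh can now read VERSION from its environment
    sh './build.sh'
}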
Assuming you are using parametrized pipelines, you should reference the variable as ${params.parameterName}.
Although parameters are available in env, they are created before the first run of the pipeline, so you should access them via params:
In your case:
${params.VERSION}

Jenkinsfile Declarative Pipeline defining dynamic env vars

I'm new to Jenkins pipelines; I'm defining a declarative-syntax pipeline and I don't know whether my problem can be solved, because I haven't found a solution.
In this example, I need to pass a variable to the Ansible plugin (in the old version I used an ENV_VAR or injected it from a file with the inject plugin); that variable comes from a script.
This is my ideal scenario (but it doesn't work, because of environment{}):
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('Deploy') {
environment {
ANSIBLE_CONFIG = '${WORKSPACE}/chimera-ci/ansible/ansible.cfg'
VERSION = sh("python3.5 docker/get_version.py")
}
steps {
ansiblePlaybook credentialsId: 'example-credential', extras: '-e version=${VERSION}', inventory: 'development', playbook: 'deploy.yml'
}
}
}
}
I tried other ways to test how env vars work, based on other posts, for example:
pipeline {
agent { node { label 'jenkins-node'}}
stages {
stage('PREPARE VARS') {
steps {
script {
env['VERSION'] = sh(script: "python3.5 get_version.py")
}
echo env.VERSION
}
}
}
}
but "echo env.VERSION" return null.
I also tried the same example with:
- VERSION=python3.5 get_version.py
- VERSION=python3.5 get_version.py > props.file (and tried to inject it, but didn't find out how)
If this is not possible I will do it in the ansible role.
UPDATE
There is another "issue" in the Ansible plugin: to use variables in extra vars, the string must use double quotes instead of single quotes.
ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
You can create variables before the pipeline block starts, and you can have sh return stdout to assign to these variables. You don't have the same flexibility to assign to environment variables in the environment stanza. So substitute python3.5 get_version.py where I have echo 0.0.1 in the script here (and make sure your Python script just prints the version to stdout):
def awesomeVersion = 'UNKNOWN'
pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
                }
            }
        }
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
The output of the above pipeline is:
awesomeVersion: 0.0.1
In Jenkins 2.76 I was able to simplify the solution from @burnettk to:
pipeline {
    agent { label 'docker' }
    environment {
        awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1')
    }
    stages {
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
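Combining that environment{} technique with the double-quote fix from the question's update, the original 'Deploy' stage might look like this (a sketch, assuming a Jenkins version where sh works inside environment{} and that get_version.py prints only the version to stdout):
stage('Deploy') {
    environment {
        ANSIBLE_CONFIG = "${WORKSPACE}/chimera-ci/ansible/ansible.cfg"
        VERSION = sh(returnStdout: true, script: 'python3.5 docker/get_version.py').trim()
    }
    steps {
        // Double quotes so Groovy interpolates ${VERSION} into the extras string
        ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
    }
}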
Using the "pipeline utility steps" plugin, you can define general vars available to all stages from a properties file. For example, let props.txt as:
version=1.0
fix=alfa
and mix scripted and declarative Jenkins pipeline as:
def props
def VERSION
def FIX
def RELEASE
node {
    props = readProperties file: 'props.txt'
    VERSION = props['version']
    FIX = props['fix']
    RELEASE = VERSION + "_" + FIX
}
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "${RELEASE}"
            }
        }
    }
}
A possible variation of the main answer is to provide the variable from another pipeline instead of a sh script.
Example (the variable-setting pipeline): my-set-env-variables
script
{
env.my_dev_version = "0.0.4-SNAPSHOT"
env.my_qa_version = "0.0.4-SNAPSHOT"
env.my_pp_version = "0.0.2"
env.my_prd_version = "0.0.2"
echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
(Use these variables) in another pipeline, my-set-env-variables-test:
script
{
env.dev_version = "NOT DEFINED DEV"
env.qa_version = "NOT DEFINED QA"
env.pp_version = "NOT DEFINED PP"
env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
echo "PRE DEV version = ${env.dev_version}"
script
{
// call set variable job
def variables = build job: 'my-set-env-variables'
def vars = variables.getBuildVariables()
//println "found variables" + vars
env.dev_version = vars.my_dev_version
env.qa_version = vars.my_qa_version
env.pp_version = vars.my_pp_version
env.prd_version = vars.my_prd_version
}
}
stage('next job') {
echo "NEXT JOB DEV version = ${env.dev_version}"
echo "NEXT JOB QA version = ${env.qa_version}"
echo "NEXT JOB PP version = ${env.pp_version}"
echo "NEXT JOB PRD version = ${env.prd_version}"
}
For those who want the environment key to be dynamic, the following code can be used:
stage('Prepare Environment') {
steps {
script {
def data = [
"k1": "v1",
"k2": "v2",
]
data.each { key ,value ->
env."$key" = value
// env[key] = value // Deprecated, this can be used as well, but need approval in sandbox ScriptApproval page
}
}
}
}
You can also dump all your vars into a file and then use Ansible's '-e @file' syntax. This is very useful if you have many vars to populate.
steps {
    echo "hello World!!"
    sh """
echo "var1: ${params.var1}
var2: ${params.var2}
" > vars
    """
    ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', sudoUser: null, extras: '-e @vars'
}
You can also use library functions in the environment section, like so:
@Library('mylibrary') _ // contains functions.groovy with several functions.
pipeline {
    environment {
        ENV_VAR = functions.myfunc()
    }
    …
}
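For context, functions.groovy in such a library might look like this (a hypothetical sketch; the function name and return value are placeholders):
// vars/functions.groovy in the shared library
def myfunc() {
    // Return whatever value the pipeline should pick up when it starts
    return 'computed-value'
}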
