Jenkinsfile Folder variables missing from env

I'm trying to use folder variables in a Jenkinsfile.
I tried the approach from "How to access folder variables across pipeline stages?" but I can't get it working.
On the Experimental folder I added two properties:
CCEBUILD_TOOLS_BRANCH
TEST
sh 'printenv' displays my folder variables, but they do not show up in env.getEnvironment().
Why is this not working?
def printParams() {
    env.getEnvironment().each { name, value -> println "In printParams Name: $name -> Value $value" }
}

node() {
    withFolderProperties {
        stage('Info') {
            sh 'printenv'
            println env.getEnvironment().collect({ environmentVariable -> "${environmentVariable.key} = ${environmentVariable.value}" }).join("\n")
            /* printParams() */
        }
    }
}
printenv output
_=/bin/printenv
BUILD_DISPLAY_NAME=#25
BUILD_ID=25
BUILD_NUMBER=25
BUILD_TAG=*****-*******-********-*****
BUILD_URL=https://hostname/job/Experimental/job/Linux%20info/25/
CCEBUILD_TOOLS_BRANCH=develop_staging
EXECUTOR_NUMBER=4
HOME=/var/lib/jenkins
HUDSON_COOKIE=*****-*******-********-*****
HUDSON_HOME=/var/lib/jenkins
HUDSON_SERVER_COOKIE=************
HUDSON_URL=https://hostname/
JENKINS_HOME=/var/lib/jenkins
JENKINS_NODE_COOKIE=*****-*******-********-*****
JENKINS_SERVER_COOKIE=durable-*****-*******-********-*****
JENKINS_URL=https://hostname/
JOB_BASE_NAME=Linux info
JOB_DISPLAY_URL=https://hostname/job/Experimental/job/Linux%20info/display/redirect
JOB_NAME=Experimental/Linux info
JOB_URL=https://hostname/job/Experimental/job/Linux%20info/
LANG=en_US.UTF-8
LOGNAME=jenkins
NODE_LABELS=developmenthost devlinux linux master releasehost rtbhost
NODE_NAME=master
PATH=/sbin:/usr/sbin:/bin:/usr/bin
PWD=/var/lib/jenkins/jobs/Experimental/jobs/Linux info/workspace
RUN_ARTIFACTS_DISPLAY_URL=https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=artifacts
RUN_CHANGES_DISPLAY_URL=https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=changes
RUN_DISPLAY_URL=https://hostname/job/Experimental/job/Linux%20info/25/display/redirect
RUN_TESTS_DISPLAY_URL=https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=tests
SHELL=/bin/bash
SHLVL=4
STAGE_NAME=Info
TEST=Ok
USER=jenkins
WORKSPACE=/var/lib/jenkins/jobs/Experimental/jobs/Linux info/workspace
env.getEnvironment() output
BUILD_DISPLAY_NAME = #25
BUILD_ID = 25
BUILD_NUMBER = 25
BUILD_TAG =*****-*******-********-*****
BUILD_URL = https://hostname/job/Experimental/job/Linux%20info/25/
CLASSPATH =
HUDSON_HOME = /var/lib/jenkins
HUDSON_SERVER_COOKIE = ************
HUDSON_URL = https://hostname/
JENKINS_HOME = /var/lib/jenkins
JENKINS_SERVER_COOKIE = ************
JENKINS_URL = https://hostname/
JOB_BASE_NAME = Linux info
JOB_DISPLAY_URL = https://hostname/job/Experimental/job/Linux%20info/display/redirect
JOB_NAME = Experimental/Linux info
JOB_URL = https://hostname/job/Experimental/job/Linux%20info/
RUN_ARTIFACTS_DISPLAY_URL = https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=artifacts
RUN_CHANGES_DISPLAY_URL = https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=changes
RUN_DISPLAY_URL = https://hostname/job/Experimental/job/Linux%20info/25/display/redirect
RUN_TESTS_DISPLAY_URL = https://hostname/job/Experimental/job/Linux%20info/25/display/redirect?page=tests
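For comparison, a minimal sketch (my addition, assuming the Folder Properties plugin only injects these values for the duration of the withFolderProperties block) that reads the properties directly from env rather than via env.getEnvironment():

node() {
    withFolderProperties {
        stage('Info') {
            // Folder properties are exposed as environment variables inside this block
            echo "CCEBUILD_TOOLS_BRANCH = ${env.CCEBUILD_TOOLS_BRANCH}"
            echo "TEST = ${env.TEST}"
        }
    }
}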

Related

How to pass parameters and variables from a file to a Jenkinsfile?

I'm trying to convert my Jenkins pipeline to a shared library since it can be reused across most of our applications. As part of that I have created a Groovy file in the vars folder, kept the pipeline in a Jenkinsfile in GitHub, and I am able to call it from Jenkins successfully.
To improve this, I want to pass params, variables, and node labels through a file, so that we never have to touch the Jenkins pipeline itself; if we want to modify any vars or params, we do it in the Git repo.
pipeline {
    agent {
        node {
            label 'jks_deployment'
        }
    }
    environment {
        ENV_CONFIG_ID = 'jenkins-prod'
        ENV_CONFIG_FILE = 'test.groovy'
        ENV_PLAYBOOK_NAME = 'test.tar.gz'
    }
    parameters {
        string(
            defaultValue: 'test.x86_64',
            description: 'Enter app version',
            name: 'app_version'
        )
        choice(
            choices: ['10.0.0.1', '10.0.0.2', '10.0.0.3'],
            description: 'Select a host to be deployed',
            name: 'host'
        )
    }
    stages {
        stage("reading properties from properties file") {
            steps {
                // Use a script block to do custom scripting
                script {
                    def props = readProperties file: 'extravars.properties'
                    env.var1 = props.var1
                    env.var2 = props.var2
                }
                echo "The variable 1 value is $var1"
                echo "The variable 2 value is $var2"
            }
        }
    }
}
In the above code, I used the Pipeline Utility Steps plugin and am able to read variables from the extravars.properties file. Can we do the same for Jenkins parameters? Or is there a better way to pass these parameters via a file from the Git repo?
Also, is it possible to pass a variable for the node label?
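One possible direction (a sketch, assuming the Pipeline Utility Steps plugin and a hypothetical params.yaml kept in the repo): read the parameter definitions from the file and register them with the properties step.

// params.yaml (hypothetical):
//   app_version: test.x86_64
//   hosts:
//     - 10.0.0.1
//     - 10.0.0.2
//     - 10.0.0.3
node('jks_deployment') {
    checkout scm
    def cfg = readYaml file: 'params.yaml'
    properties([
        parameters([
            string(defaultValue: cfg.app_version, description: 'Enter app version', name: 'app_version'),
            choice(choices: cfg.hosts, description: 'Select a host to be deployed', name: 'host')
        ])
    ])
}

Note that parameter definitions registered this way only take effect from the next run of the job.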
=====================================================================
Below are the improvements I have made in this project:
Used the node label plugin to pass the node name as a variable.
Below is my vars/sayHello.groovy file content:
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent {
            node {
                label "${pipelineParams.slaveName}"
            }
        }
        stages {
            stage("reading properties from properties file") {
                steps {
                    // Use a script block to do custom scripting
                    script {
                        // def props = readProperties file: 'extravars.properties'
                        readProperties(file: 'extravars.properties').each { key, value -> env[key] = value }
                        // env.var1 = props.var1
                        // env.var2 = props.var2
                    }
                    echo "The variable 1 value is $var1"
                    echo "The variable 2 value is $var2"
                }
            }
            stage('stage2') {
                steps {
                    sh "echo ${var1}"
                    sh "echo ${var2}"
                    sh "echo ${pipelineParams.appVersion}"
                    sh "echo ${pipelineParams.hostIp}"
                }
            }
        }
    }
}
Below is my vars/params.groovy file
properties([
    parameters([
        choice(choices: ['10.80.66.171', '10.80.67.6', '10.80.67.200'], description: 'Select a host to be deployed', name: 'host'),
        string(defaultValue: 'fxxxxx.x86_64', description: 'Enter app version', name: 'app_version')
    ])
])
Below is my jenkinsfile
def _hostIp = params.host
def _appVersion = params.app_version
sayHello {
    slaveName = 'master'
    hostIp = _hostIp
    appVersion = _appVersion
}
Now, is there anything else we can improve? Any suggestions, let me know.

Jenkins pipeline environment section not executing serially

I am facing an issue with the order in which the assignments in the Jenkins pipeline environment section are executed.
import groovy.transform.Field

@Field def gitScriptPath = "https://raw.github.com/Innovation/"
@Field def clrInfo
@Field def gitlabMem
@Field def gitSubGroupURL
@Field def clrDuration
@Field def cloudProvider
@Field def userSpecData
@Field def slackIntMes

pipeline {
    agent { label 'master' }
    environment {
        GITHUB_TOKEN = credentials('GITHUB_TOKEN')
        GIT_URL = 'github.com/Innovation/exp-selling-iac.git'
        PRE_PROV = 'k8s-jobs/iac_preprovision.yaml'
        OS_PROV = 'k8s-jobs/iac_openshift.yaml'
        USER_PROV = 'k8s-jobs/rhos-user-onboard-offboard.yaml'
        ISTIO_PROV = 'k8s-jobs/iac_istio.yaml'
        KAFKA_PROV = 'k8s-jobs/iac_kafka.yaml'
        MONOLITH_PROV = 'k8s-jobs/iac_monolith.yaml'
        POST_PROV = 'k8s-jobs/iac_postprovision.yaml'
        DEVOPS_PROV = 'k8s-jobs/k8s_iac_devops.yaml'
        dummy = sh( script: '''echo "${USER_SPEC}" > userspec.yaml''', returnStdout: true )
        NAMESPACE = sh( script: "$JENKINS_HOME/custompath/yq r userspec.yaml Cluster.Name", returnStdout: true )
        requestor = sh( script: "$JENKINS_HOME/custompath/yq r userspec.yaml Cluster.Users.User1.ID", returnStdout: true ).trim()
        APPOPS_ROLE = 'appops-customrole-v2'
    }
    stages {
        stage('Download - Groovy Scripts') {
            ...
        }
    }
}
In the Download - Groovy Scripts stage we need the values of NAMESPACE and requestor, which should only be computed after the dummy line has executed.
But the line starting with dummy runs after the NAMESPACE and requestor lines.
The same code was working earlier. If I remove requestor = or APPOPS_ROLE =, then everything is fine. Please help me understand what is happening here.
As a workaround I can make APPOPS_ROLE a parameter in Jenkins by configuring the job. The case of the variable name also seems to matter, i.e. renaming dummy to DUMMY makes a difference.
Jenkins ver. 2.204.2 on OpenShift 3.11.
I don't know why the ordering is undefined. Maybe the assignments are first stored in a hash table and then the hash table is enumerated, which would result in a seemingly random order.
As a workaround you could move the environment initialization into a stage, where you could use a script block to ensure execution order:
pipeline {
    agent { label 'master' }
    stages {
        stage('Initialize') {
            steps {
                script {
                    env.dummy = sh( script: '''echo "${USER_SPEC}" > userspec.yaml''', returnStdout: true )
                    env.NAMESPACE = sh( script: "$JENKINS_HOME/custompath/yq r userspec.yaml Cluster.Name", returnStdout: true )
                    env.requestor = sh( script: "$JENKINS_HOME/custompath/yq r userspec.yaml Cluster.Users.User1.ID", returnStdout: true ).trim()
                    ...
                }
            }
        }
        stage('Download - Groovy Scripts') {
            ...
        }
    }
}

How to list all directories from within directory in jenkins pipeline script

I want to get all the directories present in a particular directory from a Jenkins pipeline script.
How can I do this?
If you want a list of all directories under a specific directory, e.g. mydir, using the Pipeline Utility Steps plugin, you can do this (assuming mydir is under the current directory):
dir('mydir') {
    def files = findFiles()
    files.each { f ->
        if (f.directory) {
            echo "This is directory: ${f.name}"
        }
    }
}
Just make sure you do NOT provide the glob option; providing it makes findFiles return file names only.
More info: https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/
I didn't find any plugin to list folders, so I used an sh/bat script in the pipeline; this also works irrespective of the operating system.
pipeline {
    agent any
    stages {
        stage('Find all folders from given folder') {
            steps {
                script {
                    def foldersList = []
                    def osName = isUnix() ? "UNIX" : "WINDOWS"
                    echo "osName: " + osName
                    echo ".... JENKINS_HOME: ${JENKINS_HOME}"
                    if (isUnix()) {
                        def output = sh returnStdout: true, script: "ls -l ${JENKINS_HOME} | grep ^d | awk '{print \$9}'"
                        foldersList = output.tokenize('\n').collect() { it }
                    } else {
                        def output = bat returnStdout: true, script: "dir \"${JENKINS_HOME}\" /b /A:D"
                        foldersList = output.tokenize('\n').collect() { it }
                        foldersList = foldersList.drop(2)
                    }
                    echo ".... " + foldersList
                }
            }
        }
    }
}
I haven't tried this, but I would look at the findFiles step provided by the Jenkins Pipeline Utility Steps plugin and set glob to an Ant-style directory pattern, something like '**/*/'.
If you just want to log them, use
sh("ls -A1 ${myDir}")
for Linux/Unix. (Note: that's a capital letter A and the number one.)
Or, use
bat("dir /B ${myDir}")
for Windows.
If you want the list of files in a variable, you'll have to use
def dirOutput = sh(script: "ls -A1 ${myDir}", returnStdout: true)
or
def dirOutput = bat(script: "dir /B ${myDir}", returnStdout: true)
and then parse the output.
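For example, a minimal parsing sketch (my addition, based on the Unix variant above):

def dirOutput = sh(script: "ls -A1 ${myDir}", returnStdout: true)
// Split stdout into lines and drop empty entries; note that ls -A1 lists files as well as directories
def entries = dirOutput.tokenize('\n').collect { it.trim() }.findAll { it }
echo "Found ${entries.size()} entries: ${entries}"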
Recursively getting all the directories within a directory:
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    def directories = getDirectories("$WORKSPACE")
                    echo "$directories"
                }
            }
        }
    }
}
@NonCPS
def getDirectories(path) {
    // Note: java.io.File operates on the controller's filesystem, not on the agent's workspace
    def dir = new File(path)
    def dirs = []
    dir.traverse(type: groovy.io.FileType.DIRECTORIES, maxDepth: -1) { d ->
        dirs.add(d)
    }
    return dirs
}
A suggestion for the very end of Jenkinsfile:
post {
    always {
        echo '\n\n-----\nThis build process has ended.\n\nWorkspace Files:\n'
        sh 'find ${WORKSPACE} -type d -print'
    }
}
Place the find wherever you think fits best. Check more alternatives here.

Jenkinsfile Declarative Pipeline defining dynamic env vars

I'm new to Jenkins Pipeline; I'm defining a declarative-syntax pipeline and I don't know if my problem can be solved, because I haven't found a solution.
In this example, I need to pass a variable to the Ansible plugin (in the old version I used an ENV_VAR or injected it from a file with the EnvInject plugin); that variable comes from a script.
This is my ideal scenario (but it doesn't work because of the environment {} block):
pipeline {
    agent { node { label 'jenkins-node' } }
    stages {
        stage('Deploy') {
            environment {
                ANSIBLE_CONFIG = '${WORKSPACE}/chimera-ci/ansible/ansible.cfg'
                VERSION = sh("python3.5 docker/get_version.py")
            }
            steps {
                ansiblePlaybook credentialsId: 'example-credential', extras: '-e version=${VERSION}', inventory: 'development', playbook: 'deploy.yml'
            }
        }
    }
}
I tried other ways to test how env vars work, based on other posts, for example:
pipeline {
    agent { node { label 'jenkins-node' } }
    stages {
        stage('PREPARE VARS') {
            steps {
                script {
                    env['VERSION'] = sh(script: "python3.5 get_version.py")
                }
                echo env.VERSION
            }
        }
    }
}
but "echo env.VERSION" return null.
Also tried the same example with:
- VERSION=python3.5 get_version.py
- VERSION=python3.5 get_version.py > props.file (and try to inject it, but didnt found how)
If this is not possible I will do it in the ansible role.
UPDATE
There is another "issue" in the Ansible plugin: to use variables in the extra vars, they must be wrapped in double quotes instead of single quotes.
ansiblePlaybook credentialsId: 'example-credential', extras: "-e version=${VERSION}", inventory: 'development', playbook: 'deploy.yml'
You can create variables before the pipeline block starts. You can have sh return stdout to assign to these variables. You don't have the same flexibility to assign to environment variables in the environment stanza. So substitute python3.5 get_version.py where I have echo 0.0.1 in the script below (and make sure your Python script just prints the version to stdout):
def awesomeVersion = 'UNKNOWN'

pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
                }
            }
        }
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
The output of the above pipeline is:
awesomeVersion: 0.0.1
In Jenkins 2.76 I was able to simplify the solution from @burnettk to:
pipeline {
    agent { label 'docker' }
    environment {
        awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1')
    }
    stages {
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}
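One caveat worth noting (my addition, not from the original answer): returnStdout keeps the trailing newline, so trimming it in the environment block may be safer:

pipeline {
    agent { label 'docker' }
    environment {
        // .trim() strips the trailing newline left by returnStdout
        awesomeVersion = sh(returnStdout: true, script: 'echo 0.0.1').trim()
    }
    stages {
        stage('output_version') {
            steps {
                echo "awesomeVersion: ${awesomeVersion}"
            }
        }
    }
}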
Using the "pipeline utility steps" plugin, you can define general vars available to all stages from a properties file. For example, let props.txt as:
version=1.0
fix=alfa
and mix script and declarative Jenkins pipeline as:
def props
def VERSION
def FIX
def RELEASE

node {
    props = readProperties file: 'props.txt'
    VERSION = props['version']
    FIX = props['fix']
    RELEASE = VERSION + "_" + FIX
}

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "${RELEASE}"
            }
        }
    }
}
A possible variation of the main answer is to provide the variables from another pipeline instead of an sh script.
Example (the pipeline that sets the variables): my-set-env-variables
script {
    env.my_dev_version = "0.0.4-SNAPSHOT"
    env.my_qa_version = "0.0.4-SNAPSHOT"
    env.my_pp_version = "0.0.2"
    env.my_prd_version = "0.0.2"
    echo " My versions [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}
Then use these variables in another pipeline, my-set-env-variables-test:
script {
    env.dev_version = "NOT DEFINED DEV"
    env.qa_version = "NOT DEFINED QA"
    env.pp_version = "NOT DEFINED PP"
    env.prd_version = "NOT DEFINED PRD"
}
stage('inject variables') {
    echo "PRE DEV version = ${env.dev_version}"
    script {
        // call set variable job
        def variables = build job: 'my-set-env-variables'
        def vars = variables.getBuildVariables()
        //println "found variables" + vars
        env.dev_version = vars.my_dev_version
        env.qa_version = vars.my_qa_version
        env.pp_version = vars.my_pp_version
        env.prd_version = vars.my_prd_version
    }
}
stage('next job') {
    echo "NEXT JOB DEV version = ${env.dev_version}"
    echo "NEXT JOB QA version = ${env.qa_version}"
    echo "NEXT JOB PP version = ${env.pp_version}"
    echo "NEXT JOB PRD version = ${env.prd_version}"
}
For those who want the environment keys to be dynamic, the following code can be used:
stage('Prepare Environment') {
    steps {
        script {
            def data = [
                "k1": "v1",
                "k2": "v2",
            ]
            data.each { key, value ->
                env."$key" = value
                // env[key] = value // Deprecated; this can be used as well, but needs sandbox approval on the Script Approval page
            }
        }
    }
}
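For completeness, a short usage sketch (my addition): later stages can read the dynamically set keys like any other environment variable:

stage('Use Environment') {
    steps {
        // k1 and k2 were set dynamically in the previous stage
        sh 'echo "k1=$k1 k2=$k2"'
    }
}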
You can also dump all your vars into a file, and then use the '-e @file' syntax. This is very useful if you have many vars to populate.
steps {
    echo "hello World!!"
    sh """
    echo "
    var1: ${params.var1}
    var2: ${params.var2}
    " > vars
    """
    ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', sudoUser: null, extras: '-e @vars'
}
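An alternative sketch (my own suggestion, assuming the Pipeline Utility Steps plugin is available): writeYaml avoids the shell quoting entirely:

steps {
    // Write the vars file without going through a shell echo
    writeYaml file: 'vars.yml', data: [var1: params.var1, var2: params.var2]
    ansiblePlaybook inventory: _inventory, playbook: 'test-playbook.yml', extras: '-e @vars.yml'
}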
You can also use shared library functions in the environment section, like so:
@Library('mylibrary') _ // contains functions.groovy with several functions.

pipeline {
    environment {
        ENV_VAR = functions.myfunc()
    }
    …
}
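For context, a minimal sketch of what the library side could look like (my assumption; the answer doesn't show it), i.e. a vars/functions.groovy file in the shared library:

// vars/functions.groovy (hypothetical)
def myfunc() {
    // Return whatever value should end up in ENV_VAR
    return 'some-computed-value'
}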

In this Jenkins declarative pipeline, how can I pass a value to DISABLE_AUTH from a file?

pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                sh 'printenv'
            }
        }
    }
}
I tried load(/path/to/file), but it gives me an error (unexpected char '\').
Just use the readFile step, which is available out of the box.
This step reads a file in the workspace into a String.
You can then read DISABLE_AUTH from this string and use it in the pipeline script.
Here is an example script:
def content = readFile 'gradle.properties'
Properties properties = new Properties()
InputStream is = new ByteArrayInputStream(content.getBytes())
properties.load(is)
def runtimeString = 'DISABLE_AUTH'
echo properties."$runtimeString"
DISABLE_AUTH = properties."$runtimeString"
echo DISABLE_AUTH
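As a variation (a sketch, assuming the Pipeline Utility Steps plugin and a hypothetical config.properties containing DISABLE_AUTH=true): readProperties can feed the value into env from a script block inside the declarative pipeline:

pipeline {
    agent any
    stages {
        stage('Load config') {
            steps {
                script {
                    // Hypothetical file name; adjust to your repository layout
                    def props = readProperties file: 'config.properties'
                    env.DISABLE_AUTH = props.DISABLE_AUTH
                }
                sh 'printenv | grep DISABLE_AUTH'
            }
        }
    }
}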
