How to get a Jenkins parameter's default value

I want to have a check in a Jenkins job where, if the job's parameters have been modified, an input modal is displayed asking whether to proceed or abort. But for that I need to verify whether the parameters have been modified.
I can get the modified parameter values in the job, but how do I retrieve the default values of those parameters so I can check whether any of them have changed?
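One approach is to keep the default in a script-level constant and compare the runtime value against it: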

def DEFAULT_VALUE = "42"

pipeline {
    agent any
    parameters {
        string(name: 'MY_PARAM', defaultValue: DEFAULT_VALUE, description: '')
    }
    stages {
        stage('Example') {
            steps {
                script {
                    if (params.MY_PARAM == DEFAULT_VALUE) {
                        echo 'Default value used'
                    } else {
                        echo 'Non-default value used'
                    }
                }
            }
        }
    }
}
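If you would rather not duplicate the defaults in script constants, you can read them from the job's own configuration. Below is a sketch using the Jenkins model API; getItemByFullName, ParametersDefinitionProperty and getDefaultParameterValue exist in core Jenkins, but treat the exact shape as an assumption to verify on your version, and note these calls require script approval or a trusted (non-sandboxed) context:

script {
    // Look up this job's configured parameter definitions.
    def job = Jenkins.instance.getItemByFullName(env.JOB_NAME)
    def paramsProp = job.getProperty(hudson.model.ParametersDefinitionProperty)
    paramsProp?.parameterDefinitions?.each { pd ->
        def defaultVal = pd.defaultParameterValue?.value
        if (params[pd.name] != defaultVal) {
            echo "Parameter ${pd.name} was modified (default: ${defaultVal}, actual: ${params[pd.name]})"
        }
    }
}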

Related

Converting a GStringImpl to java.lang.String in a Jenkinsfile

I have a pipeline which takes a persistent string parameter as input. The pipeline then checks whether the parameter value is present in a list.
The problem is that the persisted string is of type GStringImpl, while the list items are of type java.lang.String. When I use the .contains() method, it won't return true even though the value is in the list, which I believe is due to the different types.
I've tried everything I found online, including the toString() method, but I can't get it to work. My code is attached below.
String ver = ""

pipeline {
    agent {
        docker {
            image 'registry/abc/builder:0.1.5'
            args '-t -d -v maven-m2-cache:/home/node/.m2'
        }
    }
    parameters {
        persistentString(name: 'Version', defaultValue: '8.4.7.8', description: 'Version to build', successfulOnly: false)
    }
    stages {
        stage('Analyze Parameter') {
            steps {
                script {
                    ver = "${Version}".toString()
                }
            }
        }
        stage('Build') {
            steps {
                script {
                    def version_list1 = ['8.4.7.8', '8.3.7.9', '8.5.4.7']
                    if (version_list1.contains("${ver}")) {
                        println("build version branch")
                    } else {
                        println("${ver}")
                        println("${ver}".getClass())
                        println(version_list1[0])
                        println(version_list1[0].getClass())
                        println("build master branch")
                    }
                }
            }
        }
    }
}
The pipeline always goes into the else block and prints the following:
8.4.7.8
class org.codehaus.groovy.runtime.GStringImpl
8.4.7.8
class java.lang.String
build master branch
Don't use string interpolation to resolve parameters. Instead, access them directly as params.PARAM_NAME; example below:
script {
    def version_list1 = ['8.4.7.8', '8.3.7.9', '8.5.4.7']
    if (version_list1.contains(params.Version)) {
        println("build version branch")
    } else {
        println("build master branch")
    }
}
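The underlying gotcha: wrapping a value in "${...}" produces a new GStringImpl, and java.util.List.contains() uses Java equals(), which never treats a GString as equal to a String. A minimal plain-Groovy illustration (the variable names are mine):

def ver = '8.4.7.8'
def list = ['8.4.7.8']
println list.contains("${ver}")            // false: GStringImpl is not equals() to a String
println list.contains("${ver}".toString()) // true: explicit conversion to java.lang.String
println list.contains(ver)                 // true: ver is already a String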

How to set a string parameter's default value when the input value is null in a Jenkins pipeline

I set a string parameter in a Jenkins pipeline (Groovy script) like this:
def call(String type, Map map) {
    if (type == "gradle") {
        pipeline {
            agent any
            parameters {
                string(name: 'k8sResourceType', defaultValue: "${map.k8sResourceType}", description: 'Kubernetes Resource Type')
            }
        }
    }
}
Is it possible to set a default value when ${map.k8sResourceType} is null? If it is null, I want the Kubernetes resource type to default to Deployment, because 90% of our apps are Deployments and only a few special apps are StatefulSets. I am a newbie in Groovy.
You are better off using environment instead of parameters to achieve what you want:
pipeline {
    agent any
    environment {
        k8sResourceType = getResourceType(map.k8sResourceType)
    }
    stages {
        stage('Hello World') {
            steps {
                echo "Value: ${env.k8sResourceType}"
            }
        }
    }
}

def getResourceType(value) {
    return value == null ? "Deployment" : value
}
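A more compact variant of the same helper is Groovy's Elvis operator; note that it also falls back when the value is an empty string, not just null, which may or may not be what you want (a sketch):

def getResourceType(value) {
    // Elvis operator: fall back to 'Deployment' when value is null (or empty)
    return value ?: "Deployment"
}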

Is there a way to set a Jenkins global variable from a stage-local variable?

Is there a way to access a stage-local variable in the global pipeline scope? I'm trying to use the var1 value from the Example stage in the post always block.
// Declarative //
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                def var1 = sh "ssh yourname@yourmachine 'grep uploadRate= /root/yourscript'"
            }
        }
    }
    post {
        always {
            echo 'Reading a Var1 Value' + var1
        }
    }
}
error:
Error when executing always post condition:
groovy.lang.MissingPropertyException: No such property: var1 for class: WorkflowScript
You can't directly access a variable assigned in the build steps from a post action.
As a solution, you can write the 'Example' stage result to a file and then, using the Environment Injector plugin, access the value in the post action.
After installing the plugin, set the file name in the job configuration.
[screenshot: plugin setup]
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    sh 'date > output.txt'
                }
            }
        }
    }
    post {
        always {
            script {
                curDate = readFile 'output.txt'
                echo "The current date is ${curDate}"
            }
        }
    }
}
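Alternatively, you can skip the file round-trip entirely: declare the variable at script level, above the pipeline block, and assign it inside a script step; it is then visible in the post section. A sketch reusing the asker's ssh command, wrapped with returnStdout to capture the output:

def var1 = '' // script-level, so it is visible in the post section

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    // returnStdout captures the command's output instead of discarding it
                    var1 = sh(script: "ssh yourname@yourmachine 'grep uploadRate= /root/yourscript'", returnStdout: true).trim()
                }
            }
        }
    }
    post {
        always {
            echo "Reading a Var1 Value: ${var1}"
        }
    }
}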

Value returned from a script is not assigned to a variable declared in a Jenkins declarative pipeline stage

I am working on a Jenkins declarative pipeline for automation testing. In the test run stage I want to extract the failed tests from the log. I am using a Groovy function for extracting the test result; this function is not part of the Jenkins pipeline but lives in another script file. The function works fine and builds a string containing the failure details. Inside a pipeline stage I call this function and assign the returned string to another variable. But when I echo the variable's value it prints an empty string.
pipeline {
    agent {
        kubernetes {
            yamlFile 'kubernetesPod.yml'
        }
    }
    environment {
        failure_msg = ""
    }
    stages {
        stage('Run Test') {
            steps {
                container('ansible') {
                    script {
                        def notify = load('src/TestResult.groovy')
                        def result = notify.extractTestResult("${WORKSPACE}/testreport.xml")
                        sh "${result}"
                        if (result != "") {
                            failure_msg = failure_msg + result
                        }
                    }
                }
            }
        }
    }
    post {
        always {
            script {
                sh 'echo Failure message.............${failure_msg}'
            }
        }
    }
}
Here sh 'echo ${result}' prints an empty string, but extractTestResult() returns a non-empty string.
Also, I am not able to use the environment variable failure_msg in the post section; it returns the error 'groovy.lang.MissingPropertyException: No such property: failure_msg for class: groovy.lang.Binding'.
Can anyone please help me with this?
EDIT:
Even after I fixed the string interpolation, I was getting the same error. That was because Jenkins does not allow using sh inside a Docker container; there is an open bug ticket on the Jenkins issue board.
I would suggest using a global variable to hold the error message. My guess is that the variable does not exist in your scope.
def FAILURE_MSG // Global Variable

pipeline {
    ...
    stages {
        stage(...
            steps {
                container('ansible') {
                    script {
                        ...
                        if (result != "") {
                            FAILURE_MSG = FAILURE_MSG + result
                        }
                    }
                }
            }
        }
    }
    post {
        always {
            script {
                sh "${FAILURE_MSG}" // Hint: Use correct String Interpolation
            }
        }
    }
}
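One detail to watch: FAILURE_MSG is declared without an initial value, so Groovy renders the first concatenation with a literal 'null' prefix; initializing it as def FAILURE_MSG = '' keeps the message clean.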
(Similar SO question can be found here)

How to force Jenkins to reload a Jenkinsfile?

My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, I do not see the job's input screen updated accordingly in Jenkins; I have to run an execution, cancel it, and only then see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*', ),
        string(name: 'b', description: '*', ),
        string(name: 'c', description: '*', ),
    ])
])
any clues?
One of the ugliest things I've done to get around this is to create a Refresh parameter which basically exits the pipeline right away. This way I can run the pipeline just to update the properties.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately the answer of TomDotTom did not work for me - I had the same issue, and my Jenkins required another stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax documentation).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
applied to Jenkins 2.233
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently this is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.
I overcome this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipeline.
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and regenerates the pipeline's configuration, including params. After the pipeline is created/updated, Job DSL will queue the pipeline with the special param RELOAD.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline without an error at the end of a stage, which causes a "red" pipeline.)
As the parameters in the Jenkinsfile are set via properties, they will override anything set by the seed job, such as RELOAD. At this stage the pipeline is ready with the actual params, without any sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])

pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters as they are not available of first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed are updated and run to pick up the updated params list. Such a run is labelled 'Parameter Initialization' to indicate what it is.
There is potentially a way to improve this and only update the affected pipelines, but I haven't explored that, as all my pipelines are in one repository and I'm happy with always updating them.
Another upgrade could be that, if someone doesn't like aborting with error, you could add a when condition to every other stage to skip it when the RELOAD parameter is set, but I find adding when to every other stage cumbersome.
I initially tried #TomDotTom's answer but didn't like the manual effort.
A scripted pipeline workaround - it can probably be made to work in declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that Poll SCM must be enabled on the job to detect the Jenkinsfile changes automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
I solve this by using the Jenkins Job Builder Python package. The main goal of this package is to achieve Jenkins Job as Code.
To solve your problem, I simply keep a definition like the one below in SCM, with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job, so that whenever I trigger my job all the needed parameters are ready.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
        stage('info') {
          print params
        }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load / update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note - to use the jenkins-jobs command, you need to install the jenkins-job-builder Python package.
This package has a lot of features: creating (free-style, pipeline, multibranch) jobs, updating, deleting, and validating Jenkins job configuration. It supports templates, meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, and so on.
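For example, to preview the generated job XML locally without touching the Jenkins instance, you can use the package's test subcommand (the output directory name here is my choice):

jenkins-jobs --conf config.ini test jobs.yaml -o out/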
