I found out how to create input parameters dynamically from this SO answer
pipeline {
    agent any
    stages {
        stage("Release scope") {
            steps {
                script {
                    // This list is going to come from a file, and is going to be big.
                    // For example purposes, I am creating a file with 3 items in it.
                    sh "echo \"first\nsecond\nthird\" > ${WORKSPACE}/list"
                    // Load the list into a variable
                    env.LIST = readFile(file: "${WORKSPACE}/list")
                    // Show the select input
                    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
                        parameters: [choice(name: 'CHOOSE_RELEASE', choices: env.LIST, description: 'What are the choices?')]
                }
                echo "Release scope selected: ${env.RELEASE_SCOPE}"
            }
        }
    }
}
This only allows choosing one item, since it's a choice parameter. How can I use the same list to create a checkbox parameter, so the user can choose more than one item as needed? For example, if the user chooses first and third, the last echo should print:
Release scope selected: first,third
The following output is fine too, since I can iterate over it and find the true ones:
Release scope selected: {first: true, second: false, third: true}
I could use extendedChoice as below:
pipeline {
    agent any
    stages {
        stage("Release scope") {
            steps {
                script {
                    // This list is going to come from a file, and is going to be big.
                    // For example purposes, I am creating a file with 3 items in it.
                    sh "echo \"first\nsecond\nthird\" > ${WORKSPACE}/list"
                    // Load the list and turn the newlines into the comma delimiter
                    env.LIST = readFile("${WORKSPACE}/list").replaceAll(~/\n/, ",")
                    // Show the checkbox input; the selected values come back comma-separated
                    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
                        parameters: [extendedChoice(
                            name: 'ArchitecturesCh',
                            defaultValue: "${env.BUILD_ARCHS}",
                            multiSelectDelimiter: ',',
                            type: 'PT_CHECKBOX',
                            value: env.LIST
                        )]
                }
                echo "Release scope selected: ${env.RELEASE_SCOPE}"
            }
        }
    }
}
There is a booleanParam: https://www.jenkins.io/doc/book/pipeline/syntax/#parameters
parameters {
    booleanParam(
        name: 'MY_BOOLEAN',
        defaultValue: true,
        description: 'My boolean'
    )
} // parameters
It's oddly named, as none of the other param types have "Param" in their name, e.g. string, choice, etc.
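Tying this back to the checkbox question above: an input step that receives more than one parameter returns a map keyed by parameter name, so one hedged sketch (assuming the same ${WORKSPACE}/list file as in the earlier example) is to generate one booleanParam per list item, which yields exactly the {first: true, second: false, third: true} shape the question asks for:

```groovy
script {
    // Split the file into one item per line (assumes the list file from the earlier example)
    def items = readFile("${WORKSPACE}/list").trim().split('\n')
    // One checkbox per item; with multiple parameters, `input` returns a Map
    def selections = input message: 'User input required', ok: 'Release!',
        parameters: items.collect {
            booleanParam(name: it, defaultValue: false, description: "Include ${it}?")
        }
    echo "Release scope selected: ${selections}"
    // Or keep only the checked ones as a comma-separated string
    def chosen = selections.findAll { name, checked -> checked }.keySet().join(',')
    echo "Release scope selected: ${chosen}"
}
```

Note this uses only built-in step types, so it avoids the Extended Choice Parameter plugin entirely.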
Related
I am having a somewhat difficult time figuring out how to make a simple Jenkins pipeline print values from a simple map.
I use the extendedChoice plugin.
The requirement is the following: the user gets a dropdown selection of names (the map keys); once a name is selected, the job should simply print its value to the log.
This is the code I am trying to work with. I have made numerous changes and still get various errors, and nothing works.
If anyone has any idea, I will be glad to hear about it :D
def data = ["john": "33", "alex": "45", "michael": "22"]
properties([
    parameters([
        extendedChoice(
            name: 'CHOICE',
            description: 'name and age selection',
            type: 'PT_SINGLE_SELECT',
            value: data.key // I think I am writing this wrong; I need to see the names in the selection dropdown box
        )
    ])
])
pipeline {
    agent any
    stages {
        stage('print choice') {
            steps {
                println params.CHOICE.value // how to print .value for the user I selected?
            }
        }
    }
}
Here is a working Pipeline for your example.
def data = ["john": "33", "alex": "45", "michael": "22"]
properties([
    parameters([
        extendedChoice(
            name: 'CHOICE',
            description: 'name and age selection',
            type: 'PT_SINGLE_SELECT',
            value: data.keySet().join(',')
        )
    ])
])
pipeline {
    agent any
    stages {
        stage('print choice') {
            steps {
                println params.CHOICE
                println data.get(params.CHOICE)
            }
        }
    }
}
I have a Jenkins job where I need to set a parameter: if the user chooses "EXISTING" to use an existing file, the next parameter should be a dropdown listing the existing file names for the user to choose from. If the user chooses "NEW", the next parameter should be a string parameter for the user to enter a file name.
I cannot figure out how to use the Active Choices Parameter and Active Choices Reactive Parameter to accomplish this. Maybe I am looking in the wrong direction?
For the first parameter CHOICE, I have the groovy script:
return [
    "EXISTING",
    "NEW"
]
For the second parameter FILES, I have this:
if (CHOICE.equals("EXISTING")) {
    return ["file1", "file2", ... "filen"]
}
else if (CHOICE.equals("NEW")) {
    return []
}
This does not do what I want.
How do I make the second parameter accept the user's input when the first parameter is "NEW"? Alternatively, how can I use two additional parameters: one that shows the existing file list when the user chooses EXISTING, and one that lets the user enter a file name when the user chooses NEW?
After I pick up the parameter values from above, I will pass them to an Ansible playbook in the Build section.
Thanks!
The following will work for you. If you want to get the choices from a function, you can also do that.
def INPUT_1 = ""
def INPUT_2 = ""
pipeline {
    agent any
    stages {
        stage("Get File Name") {
            steps {
                timeout(time: 300, unit: 'SECONDS') {
                    script {
                        // The initial choice
                        INPUT_1 = input message: 'Please select', ok: 'Next',
                            parameters: [choice(name: 'PRODUCT', choices: ['NEW', 'EXISTING'], description: 'Please select')]
                        if (INPUT_1.equals("NEW")) {
                            INPUT_2 = input message: 'Please Name the file', ok: 'Next',
                                parameters: [string(name: 'File Name', defaultValue: 'abcd.txt', description: 'Give me a name for the file?')]
                        } else {
                            INPUT_2 = input message: 'Please Select the File', ok: 'Next',
                                parameters: [choice(name: 'File Name', choices: ["file1", "file2", "filen"], description: 'Select the file')]
                        }
                        echo "INPUT 1 ::: ${INPUT_1}"
                        echo "INPUT 2 ::: ${INPUT_2}"
                    }
                }
            }
        }
    }
    post {
        success {
            echo 'The process is successfully Completed....'
        }
    }
}
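If you would rather stay within the Active Choices plugin instead of input steps, one commonly used workaround (an assumption to verify on your installation) is to make the second parameter an Active Choices Reactive Reference Parameter with choice type "Formatted HTML" that references CHOICE. Returning an HTML form field whose name attribute is "value" makes Jenkins submit its content as the parameter's value:

```groovy
// Hypothetical Groovy script for a Reactive Reference Parameter named FILE_NAME,
// choice type "Formatted HTML", referenced parameter: CHOICE
if (CHOICE.equals("NEW")) {
    // Free-text box for entering a new file name
    return "<input name='value' class='setting-input' type='text'>"
} else {
    // Dropdown of the existing files
    return """<select name='value'>
        <option>file1</option>
        <option>file2</option>
    </select>"""
}
```

The file names here are placeholders from the question; in practice you would build the option list from wherever the existing files live.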
How can I convert all the parameters in a Jenkins pipeline to lowercase? Similar to trim, is there an attribute one could add as part of the parameter declaration?
For trim, I have something like this:
parameters {
    string defaultValue: '', description: 'Some dummy parameter', name: 'someparameter', trim: true
}
In my pipeline job, I have more than 10 string parameters and would like to convert them all to lowercase.
Here's one approach:
pipeline {
    agent any
    parameters {
        string(name: 'testName', description: 'name of the test to run')
    }
    stages {
        stage('only') {
            environment {
                TEST_NAME = params.testName.toLowerCase()
            }
            steps {
                echo "the name of the test to run is: ${params.testName}"
                sh 'echo "In Lower Case the test name is: ${TEST_NAME}"'
            }
        }
    }
}
sh """ ${the_parameter.toLowerCase()} """
Use double quotes so you have a GString.
Put the toLowerCase() call inside the braces so that Groovy interpolates the lowercased value before the shell ever sees it.
Actually, one can just do
VAR = "${VAR.toLowerCase()}"
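Since the question mentions more than 10 string parameters, here is a hedged sketch (plain Groovy in a script block; params itself is read-only, so copy into a new map) that lowercases them all in one pass:

```groovy
script {
    // Build a lowercased copy of every string parameter; non-strings pass through untouched
    def loweredParams = params.collectEntries { name, value ->
        [(name): (value instanceof String ? value.toLowerCase() : value)]
    }
    // `someparameter` is the example name from the question
    echo "lowercased someparameter: ${loweredParams.someparameter}"
}
```

In a sandboxed pipeline, collectEntries may need script approval, so check your Jenkins security settings.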
I had to use this for my use case. It will not convert the value, but it prevents a wrong value from being passed (validatingString comes from the Validating String Parameter plugin):
validatingString(
    name: "MYVAR",
    defaultValue: "",
    regex: /^[a-z0-9]+$/,
    failedValidationMessage: "",
    description: ""
)
My Question:
I want to send an email on a regular basis with an attached CSV file. If an end user ever wants to receive these emails, we can change the pipeline to set "isSendEmail" to "true" and populate the default value with that user's email address, and the job will automatically send emails from that point on.
However, while the job configured with this Jenkinsfile runs fine, no email notification is received.
What is the correct way to configure a parameter for email IDs in the Jenkinsfile, so that I can also easily run the job manually and send the email to other people on an ad hoc basis?
Jenkinsfile
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Use AWS CLI to detect untagged resources') {
            steps {
                script {
                    def awsUtil = new AwsUtil()
                    sh """#!/bin/bash
                        set
                        aws resourcegroupstaggingapi get-resources --tags-per-page 100 --tag-filters Key=SysName,Values=KickMe --tag-filters Key="SysName in ServiceNow",Values=False > tag-filter.json
                        jq '.ResourceTagMappingList[].Tags |= map(select(.Key|IN("ManagedBy","SysName","Owner","SysName in ServiceNow","Name","TSM")))' tag-filter.json > untagged-resources.json
                        ls
                        echo "un tag resource.json file data"
                        cat untagged-resources.json
                        echo "--------------"
                        cat tag-filter.json
                        python untag_resources_convert_csv.py
                        ls
                    """
                }
            }
        }
        stage('Setup parameters') {
            steps {
                script {
                    properties([
                        parameters([
                            booleanParam(
                                defaultValue: true,
                                description: '',
                                name: 'isSendEmail'
                            ),
                            string(
                                defaultValue: '',
                                description: '',
                                name: 'def#xxx.com,abc#xxx.com,ghi#xxx.com',
                                trim: true
                            )
                        ])
                    ])
                }
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'unTagResources_Details.csv', onlyIfSuccessful: true
            emailext(
                attachmentsPattern: 'unTagResources_Details.csv',
                body: '''${SCRIPT, template="groovy-html.template"}''',
                subject: '$DEFAULT_SUBJECT',
                mimeType: 'text/html',
                to: "email id"
            )
        }
    }
}
One way to achieve this is the following technique: define an environment variable (or a global one) in your pipeline that holds the default mailing list (DEFUALT_MAIL_LIST in the example), which can be changed in the pipeline code according to future needs.
In addition, define a pipeline parameter (MailingList in the example) that lets users who build the job manually pass a comma-separated string of addresses that should receive the mail notification.
Finally, add a condition to your post block that checks whether one of the parameters is filled, and if so sends the mail to all recipients.
This solution gives you both a default, code-configured mailing list and a user-controlled list.
Here is an implementation example:
pipeline {
    agent any
    options {
        buildDiscarder logRotator(numToKeepStr: '5')
    }
    environment {
        DEFUALT_MAIL_LIST = 'def#xxx.com,abc#xxx.com,ghi#xxx.com'
    }
    parameters {
        string(name: 'MailingList', defaultValue: '', description: 'Email mailing list', trim: true)
    }
    stages {
        stage('Use AWS CLI to detect untagged resources') {
            steps {
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'unTagResources_Details.csv', onlyIfSuccessful: true
            script {
                if (params.MailingList || env.DEFUALT_MAIL_LIST) {
                    emailext subject: '$DEFAULT_SUBJECT', mimeType: 'text/html',
                        attachmentsPattern: 'unTagResources_Details.csv',
                        to: "${params.MailingList},${env.DEFUALT_MAIL_LIST}",
                        body: '${SCRIPT, template="groovy-html.template"}'
                }
            }
        }
    }
}
If both parameters are empty, no mail will be sent.
You can also modify the logic so that if the user entered a mailing list, the mail is sent to the user's list; otherwise it is sent to the default list.
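That variant could be sketched like this (same names as the example above; the user's list wins when it is non-empty, otherwise it falls back to the default):

```groovy
post {
    always {
        script {
            // Prefer the user-supplied list; otherwise use the code-configured default
            def recipients = params.MailingList?.trim() ? params.MailingList : env.DEFUALT_MAIL_LIST
            if (recipients) {
                emailext subject: '$DEFAULT_SUBJECT', mimeType: 'text/html',
                    attachmentsPattern: 'unTagResources_Details.csv',
                    to: recipients,
                    body: '${SCRIPT, template="groovy-html.template"}'
            }
        }
    }
}
```

This also avoids the stray comma that "${params.MailingList},${env.DEFUALT_MAIL_LIST}" produces when one of the two is empty.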
Using the declarative pipeline syntax, I want to be able to define parameters based on an array of repos, so that when starting the build, the user can check/uncheck the repos that should not be included when the job runs.
final String[] repos = [
    'one',
    'two',
    'three',
]

pipeline {
    parameters {
        booleanParam(name: ...) // static param
        // now include a booleanParam for each item in the `repos` array,
        // like this, but it's not allowed:
        script {
            repos.each {
                booleanParam(name: it, defaultValue: true, description: "Include the ${it} repo in the release?")
            }
        }
    }
    // later on, I'll loop through each repo and do stuff only if its value in `params` is `true`
}
Of course, you can't have a script within the parameters block, so this won't work. How can I achieve this?
Using the Active Choices Parameter plugin is probably the best choice, but if for some reason you can't (or don't want to) use a plugin, you can still achieve dynamic parameters in a Declarative Pipeline.
Here is a sample Jenkinsfile:
def list_wrap() {
    sh(script: 'echo choice1 choice2 choice3 choice4 | sed -e "s/ /\\n/g"', returnStdout: true)
}
pipeline {
    agent any
    stages {
        stage('Gather Parameters') {
            steps {
                timeout(time: 30, unit: 'SECONDS') {
                    script {
                        properties([
                            parameters([
                                choice(
                                    description: 'List of arguments',
                                    name: 'service_name',
                                    choices: 'DEFAULT\n' + list_wrap()
                                ),
                                booleanParam(
                                    defaultValue: false,
                                    description: 'Whether we should apply changes',
                                    name: 'apply'
                                )
                            ])
                        ])
                    }
                }
            }
        }
        stage('Run command') {
            when { expression { params.apply == true } }
            steps {
                sh """
                    echo choice: ${params.service_name}
                """
            }
        }
    }
}
This embeds a script {} in a stage, which calls a function, which runs a shell script on the agent/node of the Declarative Pipeline, and uses the script's output to set the choices for the parameters. The parameters are then available in the next stages.
The gotcha is that you must first run the job with no build parameters in order for Jenkins to populate the parameters, so they're always going to be one run out of date. That's why the Active Choices Parameter plugin is probably a better idea.
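Applying that same properties() technique directly to the question's repo list, here is a hedged sketch that generates one booleanParam per repo (the same one-run-out-of-date gotcha applies, and the stage and repo names are the question's examples):

```groovy
def repos = ['one', 'two', 'three']

pipeline {
    agent any
    stages {
        stage('Setup parameters') {
            steps {
                script {
                    properties([
                        parameters(
                            // One checkbox per repo, built dynamically with collect
                            repos.collect { repo ->
                                booleanParam(name: repo, defaultValue: true,
                                    description: "Include the ${repo} repo in the release?")
                            }
                        )
                    ])
                }
            }
        }
        stage('Release') {
            steps {
                script {
                    // Only act on repos whose checkbox is ticked
                    repos.findAll { params[it] }.each { repo ->
                        echo "Releasing ${repo}"
                    }
                }
            }
        }
    }
}
```

Since parameters() takes a list of parameter definitions, collect slots in naturally wherever the list needs to be computed.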
You could also combine this with an input command to cause the pipeline to prompt the user for a parameter:
script {
    def INPUT_PARAMS = input message: 'Please Provide Parameters', ok: 'Next',
        parameters: [
            choice(name: 'ENVIRONMENT', choices: ['dev', 'qa'].join('\n'), description: 'Please select the Environment'),
            choice(name: 'IMAGE_TAG', choices: getDockerImages(), description: 'Available Docker Images')
        ]
    env.ENVIRONMENT = INPUT_PARAMS.ENVIRONMENT
    env.IMAGE_TAG = INPUT_PARAMS.IMAGE_TAG
}
Credit goes to Alex Lashford (https://medium.com/disney-streaming/jenkins-pipeline-with-dynamic-user-input-9f340fb8d9e2) for this method.
You can use a Jenkins choice parameter, with which the user can select a repository.
pipeline {
    agent any
    parameters {
        choice(name: "REPOS", choices: ['REPO1', 'REPO2', 'REPO3'])
    }
    stages {
        stage('stage 1') {
            steps {
                // the repository selected by the user will be printed
                println("${params.REPOS}")
            }
        }
    }
}
You can also use the Active Choices Parameter plugin if you want multiple select: https://plugins.jenkins.io/uno-choice/#documentation
You can also visit the Pipeline Syntax page and configure the parameter there to generate a code snippet, then copy the snippet and paste it at the start of your Jenkinsfile.