Use Groovy to add an additional parameter to a Jenkins job

We've got a set of Groovy scripts that our users invoke in their Jenkinsfile to set some common job properties. However, we haven't been able to figure out how to preserve their existing parameters when we do this update.
A snippet of our Groovy code:
def newParamsList = []
def newbool = booleanParam(defaultValue: false, description: "deploy", name: "deploy_flag")
newParamsList.add(newbool)
def newParams = parameters(newParamsList)
properties([ //job property declaration
jobProperties,
disableConcurrentBuilds(),
newParams,
addSchedule,
])
However, this overwrites the parameter definitions, so if the user had specified a different parameter definition in their Jenkinsfile before invoking our Groovy, it gets wiped out.
I can get access to the existing parameters using currentBuild.rawBuild.getAction(ParametersAction), but if I understand correctly, I need the ParameterDefinition, not the ParameterValue, in order to set the property. I tried currentBuild.rawBuild.getAction(ParametersDefinitionProperty.class), thinking I could use that like ParametersAction, but it returns null.
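For reference, my understanding of where the two live, as a sketch (assuming script approval permits the rawBuild calls):
// ParametersAction on the build holds the parameter *values* this build was started with:
def paramValues = currentBuild.rawBuild.getAction(hudson.model.ParametersAction)?.parameters
// ParametersDefinitionProperty holds the parameter *definitions*, and it is a property of
// the job, not an action on the build -- which would explain the null from getAction():
def pdp = currentBuild.rawBuild.getParent().getProperty(hudson.model.ParametersDefinitionProperty)
def paramDefs = pdp?.parameterDefinitions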
Is it possible to get the parameter definitions inside the Groovy being called from a Jenkinsfile? Or is there a different way that would let us add an additional parameter to the job without wiping out the existing ones currently defined in the Jenkinsfile?

So the way we do this is to treat it all like a simple list and then join the lists together. The Jenkinsfiles first get a list from the shared library, then add their own parameters to the list, and then they set the params themselves (not the shared library).
Each repo's Jenkinsfile does this:
#!groovy
@Library('shared') _
// Call shared library for common params
def paramList = jobParams.listParams ([
"var1": "value",
"var2": "value2"
])
// Define repo specific params
def additionalParams = [
booleanParam(defaultValue: false, name: 'SOMETHING', description: 'description?'),
booleanParam(defaultValue: false, name: 'SOMETHING_ELSE', description: 'description?'),
]
// Set Jenkins job properties, combining both
properties([
buildDiscarder(logRotator(numToKeepStr: '20')),
parameters(paramList + additionalParams)
])
// Do repo stuff
Our shared library looks like this:
List listParams(def body = [:]) {
//return list of parameters
def config = BuildConfig.resolve(body)
// Always common params
def paramsList = [
choice(name: 'ENV', choices: ['dev', 'tst'].join('\n'), description: 'Environment'),
string(name: 'ENV_NO', defaultValue: "1", description: 'Environment number'),
]
// Sometimes common params, switch based on jenkinsfile input
def additionalParams = []
switch (config.var1) {
case 'something':
case 'something2':
additionalParams = [
choice(name: 'AWS_REGION', choices: ['us-west-2'].join('\n'), description: 'AWS Region to build/deploy'),
]
break
case 'something3':
additionalParams = [
string(name: 'DEBUG', defaultValue: '*', description: 'Namespaces for debug logging'),
]
break
}
return paramsList + additionalParams
}
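A repo that has no extra parameters of its own can still reuse the same shared list, for example:
@Library('shared') _
// No repo-specific params: just set the shared list as-is
properties([
    buildDiscarder(logRotator(numToKeepStr: '20')),
    parameters(jobParams.listParams(["var1": "value", "var2": "value2"]))
])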

We wrote the following Groovy code to retrieve the existing parameter definitions and add new parameters to them (we have no knowledge of what the user will put as parameters). If you have something simpler, I'll take it:
// Requires: import org.jenkinsci.Symbol and hudson.model.ParameterDefinition
boolean isSupported = true
// nParams is the list of new parameters to add
List tempParams = []
Map initParamsMap = this.initializeParamsMap(nParams)
currentBuild.rawBuild.getParent().getProperties().each { k, v ->
if (v instanceof hudson.model.ParametersDefinitionProperty) {
// get each parameter definition
v.parameterDefinitions.each { ParameterDefinition paramDef ->
String param_symbol_name = null
// get the symbol name from the nested DescriptorImpl class
paramDef.class.getDeclaredClasses().each {
if(it.name.contains('DescriptorImpl')){
param_symbol_name = it.getAnnotation(Symbol).value().first()
}
}
// ... processing... //
if( !initParamsMap.containsKey(paramDef.name) ) {
//Valid parameter types are booleanParam, choice, file, text, password, run, or string.
if (param_symbol_name == 'choice') {
String defaultParamVal = paramDef.defaultParameterValue == null ? null : paramDef.defaultParameterValue.value
tempParams.add(
"$param_symbol_name"(name: paramDef.name,
defaultValue: defaultParamVal,
description: paramDef.description,
choices: paramDef.choices)
)
} else if (param_symbol_name == 'run') {
logError {"buildParametersArray does not yet support an existing RunParameterDefinition " +
"in the current job parameters list, so the job parameters will not be modified"}
isSupported = false
} else {
tempParams.add(
"$param_symbol_name"(name: paramDef.name,
defaultValue: paramDef.defaultParameterValue.value,
description: paramDef.description)
)
}
}
}
}
}
if (isSupported) {
properties([parameters(tempParams)])
}
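initializeParamsMap() is our own helper and isn't shown above; a hypothetical version that simply indexes the new parameters by name (so existing definitions that are about to be redefined can be skipped) could look like this:
// Hypothetical helper: index the new parameter descriptions by name.
// Assumes each element of nParams is a pipeline parameter description such as
// booleanParam(...), whose arguments are exposed as a map.
Map initializeParamsMap(List nParams) {
    Map byName = [:]
    nParams.each { p ->
        byName[p.arguments.name] = p
    }
    return byName
}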

I think you can also do something like this:
// Get existing ParameterDefinitions
existing = currentBuild.rawBuild.parent.properties
.findAll { it.value instanceof hudson.model.ParametersDefinitionProperty }
.collectMany { it.value.parameterDefinitions }
// Create new params and merge them with existing ones
jobParams = [
booleanParam(name: 'boolean_param', defaultValue: false)
/* other params */
] + existing
// Create properties
properties([
parameters(jobParams)
])
Note: but you should either run it in a non-sandboxed environment or use it with @NonCPS.
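For example, the lookup could be wrapped in a @NonCPS method (a sketch; in a sandboxed pipeline the rawBuild calls still need script approval):
@NonCPS
List getExistingParamDefs() {
    // Collect the current ParameterDefinition objects from the owning job
    def prop = currentBuild.rawBuild.parent.getProperty(hudson.model.ParametersDefinitionProperty)
    return prop ? prop.parameterDefinitions : []
}

def jobParams = [booleanParam(name: 'boolean_param', defaultValue: false)] + getExistingParamDefs()
properties([parameters(jobParams)])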

Here is an example (e.g. from the Script Console) of how to add an additional string parameter NEW_PARAM to a job named test:
import hudson.model.ParameterDefinition
import hudson.model.ParametersDefinitionProperty
import hudson.model.StringParameterDefinition
import jenkins.model.Jenkins

job = Jenkins.instance.getJob("test")
ParametersDefinitionProperty params = job.getProperty(ParametersDefinitionProperty.class);
List<ParameterDefinition> newParams = new ArrayList<>();
newParams.addAll(params.getParameterDefinitions());
newParams.add(new StringParameterDefinition("NEW_PARAM", "default_value"));
job.removeProperty(params);
job.addProperty(new ParametersDefinitionProperty(newParams));

Related

Error creating jobdsl parameters programmatically

I am creating jobs for use with Terraform. There are several environments and the number is growing all the time. Rather than update both the pipeline file and the jobdsl file as the parameters change, I started working from the standpoint of scanning the repo for environment files and updating the pipeline and jobdsl file as needed.
My jobdsl script:
@Library('mylib') _
params = [
"serviceName": "infrastructure-${repo}",
"repoUrl": "${repoUrl}",
"sshCredentials": 'git-readonly',
"environment": "${env.Environment}",
"configParams": getTFConfigs(
repoUrl,
"env/${env.AccountName}/${env.AWSRegion}/${env.Environment}")
]
template = libraryResource('dslTemplates/infra.groovy')
jobDsl scriptText: helpers.renderTemplate(template, params)
shared library method: getTFConfigs
#!/usr/bin/env groovy
import java.util.zip.ZipEntry
import java.util.zip.ZipInputStream

@NonCPS
def call(String repoUrl, String filter="") {
def gitProc = new ProcessBuilder(
"git",
"archive",
"--format=zip",
"--remote=${repoUrl}",
"main").start()
def zipIn = new ZipInputStream(gitProc.inputStream)
def zipMembers = []
while (true) {
ZipEntry entry = zipIn.getNextEntry()
if (entry == null) break
if (entry.getName().contains(filter)) {
def entryName = entry.getName()
zipMembers.push("${entryName}")
}
}
println zipMembers
return zipMembers
}
dslTemplates/infra.groovy template
pipelineJob("${serviceName}") {
description("Apply TF for ${serviceName} to all environment configurations")
definition {
parameters {
<% configParams.each { %>
booleanParam(name: "<%= "${it}" %>", defaultValue: true, description: "<%= "${it}" %>" )
<% } %>
}
logRotator {
numToKeep(20)
}
cpsScm {
scm {
git{
remote{
url("${repoUrl}")
credentials("${sshCredentials}")
branch('*/main')
}
}
}
scriptPath('infra.groovy')
}
}
}
Template result
...
definition {
parameters {
booleanParam(name: env1.tfvars, defaultValue: true, description: env1.tfvars )
booleanParam(name: env2.tfvars, defaultValue: true, description: env2.tfvars )
}
...
When the seed job runs and executes the code, the parameters should be updated with a checkbox for each environment. However, the Job DSL step fails with this:
ERROR: (script, line 6) No signature of method: javaposse.jobdsl.dsl.helpers.BuildParametersContext.booleanParam() is applicable for argument types: (java.util.LinkedHashMap) values: [[name:env1.tfvars, defaultValue:true, ...]]
Possible solutions: booleanParam(java.lang.String), booleanParam(java.lang.String, boolean), booleanParam(java.lang.String, boolean, java.lang.String)
Finished: FAILURE
I have tried applying toString() at various steps and cannot seem to find any solution to this.
I have tried writing the entire Job DSL script to a file and reading it back in using "jobDsl targets: filename", and got the same result!
Banging my head, as it were!
Thanks
It looks like you used Pipeline syntax for the parameters in the DSL script. If you want to define a parameter in a Job DSL script, do not use the named arguments name, defaultValue and description; the Job DSL booleanParam takes positional arguments (see the Job DSL Plugin):
booleanParam('BOOL_PARAM', true, 'This is a boolean param')
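Applied to the template above, the parameters block would then presumably look like this:
parameters {
    <% configParams.each { %>
    booleanParam("<%= "${it}" %>", true, "<%= "${it}" %>")
    <% } %>
}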

Jenkinsfile: one type of parameter instead of two, convert choice into string or string into choice

I'm using two parameters: one is a choice (ID) and the other is a string (NID), but the values are the same. The requirement is to use only one parameter, either choice or string. Is it possible to convert the choice parameter into a string, or the string into a choice parameter, so that I can use one parameter and one deploy function?
def deploy1(env) {
step([$class: 'UCDeployPublisher',
siteName: siteName,
deploy: [
$class: 'com.urbancode.jenkins.plugins.ucdeploy.DeployHelper$DeployBlock',
deployApp: appName,
deployEnv: 'DEV',
deployVersions: "${compName}:${version}",
deployProc: simpleDeploy,
deployOnlyChanged: false,
deployReqProps: "ID=${params.ID}" ===> string paramater
]])
}
def deploy2(env) {
step([$class: 'UCDeployPublisher',
siteName: siteName,
deploy: [
$class: 'com.urbancode.jenkins.plugins.ucdeploy.DeployHelper$DeployBlock',
deployApp: appName,
deployEnv: 'DEV',
deployVersions: "${compName}:${version}",
deployProc: simpleDeploy,
deployOnlyChanged: false,
deployReqProps: "ID=${params.NID}" ===> Needs choice paramater
]])
}
parameters {
choice(
name: 'ID',
choices: [ '8922', '9292', '3220' ]
)
string(
name: 'NID',
defaultValue: '8922,9292,3220'
)
}
stage('DEV') {
steps {
script {
if (params.ENVIRONMENT == "dev"){
deploy1('devl') // ===> this will call my deploy function
}
}
}
}
Yes, you can convert the string parameter to an array just by using split.
Below is an example:
// Define a list which will contain all the ID values as an array
def ID= []
pipeline {
agent none
parameters
{
// The string below is an example parameter passed to the job; this can be changed based on your needs
// Example: pass the NID list as a comma-separated string in your project
string(name: 'NID', defaultValue:'8922,9292,3220', description: 'Enter , separated NID values in your project e.g. 8922,9292,3220')
}
stages {
stage('DEV') {
agent any
steps {
script
{
// Update ID list
ID= params.NID.split(",")
// You can loop through the ID list
for (myid in ID)
{
println ("ID is : ${myid}")
}
}
}
}
}
}
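Building on that, the split values could also drive the deploy itself; for example (reusing the variables and deploy step from the question, so this is only a sketch):
script {
    // One deploy per ID parsed from the single comma-separated NID parameter
    for (myid in params.NID.split(',')) {
        echo "Deploying with ID ${myid}"
        // inside the deploy helper, deployReqProps would then be "ID=${myid}"
    }
}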

fetch source values from jenkins extended choice parameter

I have added an extended choice parameter. The source values are lin1, lin2, lin3, as listed in the screenshot.
Now when I run:
If I select lin1, I get param3 = lin1.
If I select lin1 and lin2, I get param3 = lin1,lin2 (the delimiter is a comma).
The question here is: inside a Jenkins pipeline, how can I get all the source values that were set when the parameter was created? In short, without selecting any of the checkboxes, I want to get the list of possible values, probably in a list.
Eg:
list1 = some_method(param3)
// expected output >> list1 = [lin1,lin2,lin3]
Let me know if this description is not clear.
The user who runs this does not have configure access (we don't want to give configure access to the anonymous user), hence the job/config.xml idea will not work here.
As requested you can also get the values dynamically:
import hudson.model.*
import org.jenkinsci.plugins.workflow.job.*
import com.cwctravel.hudson.plugins.extended_choice_parameter.ExtendedChoiceParameterDefinition
def getJob(name) {
def hi = Hudson.instance
return hi.getItemByFullName(name, Job)
}
def getParam(WorkflowJob job, String paramName) {
def prop = job.getProperty(ParametersDefinitionProperty.class)
for (param in prop.getParameterDefinitions()) {
if (param.name == paramName) {
return param
}
}
return null
}
pipeline {
agent any
parameters {
choice(name: 'FOO', choices: ['1','2','3','4'])
}
stages {
stage('test') {
steps {
script {
def job = getJob(JOB_NAME)
def param = getParam(job, "FOO")
if (param instanceof ChoiceParameterDefinition) {
// for the standard choice parameter
print param.getChoices()
} else if (param instanceof ExtendedChoiceParameterDefinition) {
// for the extended choice parameter plugin
print param.getValue()
}
}
}
}
}
}
As you can see it requires a lot of scripting, so you must either disable the Groovy sandbox or approve most of the calls on the script approval page.
I couldn't find any variable or method to get the parameter list. I guess it's somehow possible through an undocumented method on the params or currentBuild maps.
A possible solution to your problem could be defining the list outside of the pipeline and then just using that variable, like this:
def param3Choices = ['lin1', 'lin2', 'lin3']
pipeline {
agent any
parameters {
choice(name: 'PARAM3', choices: param3Choices, description: '')
}
stages {
stage('Debug') {
steps {
echo params.PARAM3
print param3Choices
}
}
}
}

Is it possible to set Jenkins job parameters dynamically from a pipeline step?

I have the following (simplified) Jenkins pipeline code.
jobParams.groovy
List get(Object paramVars = {}) {
def params = []
params += [
choice(
choices: ['branch', 'tag'],
name: 'RELEASE_TYPE'
),
string(
defaultValue: '',
name: 'VERSION'
),
]
return params
}
pipeline.groovy
def call() {
properties([
parameters(
jobParams.get()
)
])
pipeline {
agent { label 'deploy-slave' }
stages {
stage('Prepare') {
steps {
script {
// Do some logic here and set a job parameter?
}
}
}
}
}
}
This works fine. When the pipeline starts, the job parameters are set and available the next time the job runs.
However, is it also possible to set job parameters dynamically after some logic in a pipeline step?
It turned out to be pretty easy!
I created a jobProperties.groovy file in my shared pipeline library, which composes the parameter list and calls the properties() function.
def call() {
params = [
string(
defaultValue: '',
description: 'Version to deploy',
name: 'VERSION'
),
]
if (env.HANDLER == 'ansible') {
params += [
string(
defaultValue: '',
description: 'DEPLOY_ARGS | Ad hoc "ansible-playbook" args. Example to limit hosts to' +
' deploy to "-l somehost"',
name: 'DEPLOY_ARGS'
),
]
} else if (env.HANDLER == 'capistrano') {
params += [
string(
defaultValue: '',
description: 'DEPLOY_ARGS | Ad hoc "cap" args. Example to limit hosts to' +
' deploy to "-z somehost"',
name: 'DEPLOY_ARGS'
),
]
}
properties([
parameters(
params
)
])
}
pipeline.groovy
def call() {
pipeline {
agent { label 'deploy-slave' }
stages {
stage('Prepare') {
steps {
script {
jobProperties()
}
}
}
}
}
}
I think that if you don't have a shared pipeline library, the code of jobParams.groovy can also be put directly in the script {} wrapper of the pipeline.
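That is, something roughly like this inline, with the same effect as the shared-library call:
stage('Prepare') {
    steps {
        script {
            // Inline equivalent of jobProperties(): build the list and set it here
            properties([
                parameters([
                    string(defaultValue: '', description: 'Version to deploy', name: 'VERSION')
                ])
            ])
        }
    }
}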
It is, but there are some complications as params is an immutable map.
We use a shared library function we wrote when we want to change our params during job execution.
This will probably require admin for script approvals.
The first function is for setting a new string param, or updating an existing one with a new value.
The second and third functions are just interfaces for adding a new option to a choice param, for either the current job or a different job.
The fourth does the main grunt work for this choice-adding logic (it is not called directly).
The OrganizationFolder usage is based on us using the GitHub Branch Source plugin.
import hudson.model.ChoiceParameterDefinition
import hudson.model.Job
import hudson.model.ParameterDefinition
import hudson.model.ParameterValue
import hudson.model.ParametersAction
import hudson.model.ParametersDefinitionProperty
import hudson.model.StringParameterValue
import jenkins.branch.OrganizationFolder
import jenkins.model.Jenkins

/**
* Change param value during build
*
* @param paramName new or existing param name
* @param paramValue param value
* @return nothing
*/
def setParam(String paramName, String paramValue) {
List<ParameterValue> newParams = new ArrayList<>();
newParams.add(new StringParameterValue(paramName, paramValue))
try {
$build().addOrReplaceAction($build().getAction(ParametersAction.class).createUpdated(newParams))
} catch (err) {
$build().addOrReplaceAction(new ParametersAction(newParams))
}
}
/**
* Add a new option to choice parameter for the current job
*
* @param paramName parameter name
* @param optionValue option value
* @return nothing
*/
def addChoice(String paramName, String optionValue) {
addChoice($build().getParent(), paramName, optionValue)
}
/**
* Add a new option to choice parameter to the given job
*
* @param paramName parameter name
* @param optionValue option value
* @return nothing
*/
def addChoice(String jobName, String paramName, String optionValue) {
List jobNames = jobName.tokenize("/")
Job job = ((OrganizationFolder)Jenkins.getInstance().getItem(jobNames[0])).getItem(jobNames[1]).getItem(jobNames[2])
addChoice(job, paramName, optionValue)
}
/**
* Add a new option to choice parameter to the given job
* Will be added as the first (default) choice
* @param job job object
* @param paramName parameter name
* @param optionValue option value
* @return
*/
def addChoice(Job job, String paramName, String optionValue) {
ParametersDefinitionProperty paramsJobProperty = job.getProperty(ParametersDefinitionProperty.class);
ChoiceParameterDefinition oldChoiceParam = (ChoiceParameterDefinition)paramsJobProperty.getParameterDefinition(paramName);
List<ParameterDefinition> oldJobParams = paramsJobProperty.getParameterDefinitions();
List<ParameterDefinition> newJobParams = new ArrayList<>();
for (ParameterDefinition p: oldJobParams) {
if (!p.getName().equals(paramName)) {
newJobParams.add(0,p);
}
}
List<String> choices = new ArrayList(oldChoiceParam.getChoices());
choices.add(0,optionValue);
ChoiceParameterDefinition newChoiceParam = new ChoiceParameterDefinition(paramName, choices, oldChoiceParam.getDefaultParameterValue().getValue(), oldChoiceParam.getDescription());
newJobParams.add(newChoiceParam);
ParametersDefinitionProperty newParamsJobProperty = new ParametersDefinitionProperty(newJobParams);
job.removeProperty(paramsJobProperty);
job.addProperty(newParamsJobProperty);
}
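Note: $build() above is a helper from their shared library that isn't shown here; a minimal guess at what it might be is simply a wrapper around the current Run object:
// Hypothetical helper assumed by the functions above: return the underlying Run
// object of the current build (needs script approval when sandboxed).
def $build() {
    return currentBuild.rawBuild
}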
As mentioned, the params map is immutable; however, as described here, Jenkins also creates an environment variable for each build parameter.
So the option I've used to override a build param is to use the environment variable rather than the value in the params map, e.g.:
if (environment == "PRD") {
env.vpc_id = 'vpc-0fc6d952bbbf0000'
}
// Now run your script or command that refers to the environment var
sh './script.sh'
I guess you are talking about deciding downstream parameters dynamically for a job. It can certainly be done the way it's depicted in the code below.
#Library("shared-library") _
Map buildDetails = [:]
downStreamParams = [["\$class: 'StringParameterValue', name: 'TARGET_WORKSPACE', value: 'prod'"]]
pipeline {
agent {
label 'builds'
}
stages {
stage('Get Details'){
steps {
script {
buildDetails = [
"releaseType":"minor",
"workspace":"prod",
"featureType":"ENHANCEMENT",
"PARAMS=Jenkins_Controller_Image":true,
"PARAMS=APPLY":true,
"PARAMS=PLUGIN_CLEANUP":true,
"PARAMS=RESTART":true,
]
buildDetails.each{ key, value ->
println("$key:$value")
if(key.contains("PARAMS=")){
"[\$class: 'BooleanParameterValue', name: \"${key.split('=')[1]}\", value: true]"
}
}
}
}
}
stage("Build"){
steps{
script{
job = "build( job: jobFullName,parameters: ${downStreamParams},propagate: true,wait: true)"
evaluate(job)
}
}
}
}
}
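If the evaluate() indirection isn't required, a simpler sketch is to build the list with the standard pipeline parameter helpers and pass it straight to the build step (jobFullName, as used above, is assumed to be defined elsewhere):
script {
    // Collect downstream parameters dynamically...
    def downstream = [string(name: 'TARGET_WORKSPACE', value: 'prod')]
    buildDetails.findAll { k, v -> k.contains('PARAMS=') }.each { k, v ->
        downstream << booleanParam(name: k.split('=')[1], value: true)
    }
    // ...then trigger the downstream job directly
    build(job: jobFullName, parameters: downstream, propagate: true, wait: true)
}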

Extract params from Jenkinsfile

I have a Jenkinsfile that takes a bunch of params (approx. 50), and another 50 for input processing:
pipeline {
agent { label 'ansible24' }
parameters {
string(name: 'NAME', defaultValue: 'Nightly Valid', description: ' instance name')
// ... x50
}
script {
def filename = "configuration.yml"
def yaml = readYaml file: filename
yaml.global.name = "${params.NAME}".toString()
// ... x50
}
}
Tomorrow, I will also have a validation for each field.
How could I extract this logic into separate files?
I already saw this: How do you load a groovy file and execute it
but it doesn't help much for the case of params, and my pipeline is declarative, not scripted.
Any idea?
