Wrong variable format when filling YAML with a Jenkinsfile

I have a YAML file that I need to fill in from Jenkins:
global:
name: 'my_name'
code: 'my_code'
So, I define Jenkins params:
string(name: 'NAME', defaultValue: 'Nightly Valid', description: 'Nightly Valid Name')
string(name: 'CODE', defaultValue: 'NIGHTLY', description: '')
And further in my Jenkinsfile, I have:
script {
def filename = "configuration.yml"
def yaml = readYaml file: filename
// General data
yaml.global.name = "${params.NAME}"
yaml.global.code = "${params.CODE}"
// ...
sh "rm $filename"
writeYaml file: filename, data: yaml
}
When I do that, I get:
global:
name: '''my_name'''
code: '''my_code'''
How can I get just:
global:
name: 'my_name'
code: 'my_code'

"${params.NAME}" is GStringImpl, try to convert it to string directly: "${params.NAME}".toString()

Related

Jenkins dynamic parameters based on groovy method

I am trying to do the below things as part of a Jenkins pipeline DSL.
I have a YAML file where I store all my static values.
I created a pipeline job which should show 2 parameters:
a) region: northamerica/europe
b) environment: this should be populated based on the region selected.
I am defining the 2 functions outside of the pipeline so that I can use them in the parameters section.
Syntax:
#!/usr/bin/env groovy
def yaml_file = "JenkinsFiles/environments.yaml"
def getRegions() {
def var_regions = []
yaml_settings.environments.each { key, value -> var_regions.add(key) }
return var_regions
}
def getEnvironments(String region) {
def var_envs = []
yaml_settings.environments."${region}".non_prod.each { key, value -> var_envs.add("\"" + key + "\"") }
return var_envs
}
environment {
yaml_settings = {}
}
pipeline {
agent
{
node
{
label 'docker'
}
}
stages {
stage('Prepare') {
steps{
script{
yaml_settings = readYaml file: "${yaml_file}"
list_regions = getRegions()
properties([
parameters([
choice(choices: list_regions , description: 'Please select region to deploy', name: 'REGION'),
[$class: 'CascadeChoiceParameter', choiceType: 'PT_SINGLE_SELECT', description: 'Please select environment to deploy', filterLength: 1, filterable: false, name: 'ACP_ENVIRONMENTS', randomName: 'choice-parameter-deploy-env', referencedParameters: 'REGION', script: [$class: 'GroovyScript', fallbackScript: [classpath: [], sandbox: false, script: ''], script: [classpath: [], sandbox: true, script: """
envs = getEnvironments($REGION)
return $envs
"""]]]])])}}}}}
Issue:
The getEnvironments method is not returning the value into the variable, so it has no effect on the parameter. The $REGION value does come through, though. I could do if/else based on the referenced parameter to get the values, but I don't want to use if/else because I will have many values down the line.
HELP APPRECIATED!!
As with many other questions, the issue here is that Jenkins needs to know the parameters before it executes your pipeline. Once the pipeline is running, the parameters have already been defined, and any change to the parameters won't impact this build.
You may want to take a look at the ActiveChoice plugin to address this.
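A hedged sketch of how the cascading lookup can look with Active Choices; the YAML path and the use of SnakeYAML on the controller are assumptions, and because the parameter scripts run on the controller outside the Jenkinsfile, they re-read the YAML themselves rather than calling getEnvironments() (sandbox: false, so the scripts need approval):

```groovy
properties([
    parameters([
        [$class: 'ChoiceParameter', choiceType: 'PT_SINGLE_SELECT', name: 'REGION',
         script: [$class: 'GroovyScript',
                  fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
                  script: [classpath: [], sandbox: false, script: '''
                      // assumption: the YAML is re-read here because Jenkinsfile
                      // methods are not visible to this controller-side script
                      def settings = new org.yaml.snakeyaml.Yaml().load(new File('/path/to/environments.yaml').text)
                      return settings.environments.keySet() as List
                  ''']]],
        [$class: 'CascadeChoiceParameter', choiceType: 'PT_SINGLE_SELECT', name: 'ACP_ENVIRONMENTS',
         referencedParameters: 'REGION',
         script: [$class: 'GroovyScript',
                  fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
                  script: [classpath: [], sandbox: false, script: '''
                      // REGION is injected by referencedParameters
                      def settings = new org.yaml.snakeyaml.Yaml().load(new File('/path/to/environments.yaml').text)
                      return settings.environments[REGION].non_prod.keySet() as List
                  ''']]]
    ])
])
```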

Is there a way to use a pipeline env var within the extendedChoice parameter?

I am using the extendedChoice plugin for my Jenkins pipeline. It fetches the s3 objects from the bucket and provides the list of values using a short Groovy script. The issue is that I need to parametrize the s3 bucket by using the corresponding variable defined within the pipeline's environment section. How can I do this?
So I tried a lot of different snippets to get the env vars though with no result.
import jenkins.model.*
// This will print out the requested var from the global Jenkins config.
def envVars = Jenkins.instance.getGlobalNodeProperties()[0].getEnvVars()
return envVars['S3_BUCKET']
// This will print out values from the env vars of the node itself where the Jenkins is running.
def env = System.getenv('S3_BUCKET')
return env
// This is what I have now
def domainsList = "aws s3api list-objects-v2 --bucket someRandomBucket --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
domainsList.waitFor()
def output = domainsList.in.text
return output.split('COMMONPREFIXES')
// This is the Jenkinsfile
pipeline {
agent any
environment {
DOMAIN_NAME = "${params.DOMAIN_NAME}"
MODEL_VERSION = "${params.MODEL_VERSION}"
S3_BUCKET = "someRandomBucket"
}
parameters {
extendedChoice(
bindings: '',
defaultValue: '',
description: '',
descriptionPropertyValue: '',
groovyClasspath: '',
groovyScript: '''
def domainsList = "aws s3api list-objects-v2 --bucket someRandomBucket --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
domainsList.waitFor()
def output = domainsList.in.text
return output.split('COMMONPREFIXES')
''',
multiSelectDelimiter: ',',
name: 'DOMAIN_NAME',
quoteValue: false,
saveJSONParameterToFile: false,
type: 'PT_SINGLE_SELECT',
visibleItemCount: 10)
choice(
choices: ['a', 'b'],
description: 'Select a model version for processing',
name: 'MODEL_VERSION')
}
stages {
stage('Clean workdir') {
steps {
cleanWs()
}
}
stage('build') {
steps {
sh "echo $S3_BUCKET"
sh "echo $DOMAIN_NAME"
sh "echo $MODEL_VERSION"
}
}
}
}
As I mentioned above, I need to substitute the someRandomBucket hardcoded value with the S3_BUCKET env var value in the Groovy script within the extendedChoice parameter.
RESOLVED - Environment variables can be injected specifically for the parameter via the Jenkins job UI
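In pipeline code, the `bindings` argument (left empty in the snippet above) appears intended for exactly this; the assumption here is that it takes properties-format text whose keys become variables visible inside the parameter's Groovy script:

```groovy
extendedChoice(
    name: 'DOMAIN_NAME',
    type: 'PT_SINGLE_SELECT',
    // assumption: each key in this properties-format string becomes
    // a variable available inside groovyScript below
    bindings: 'S3_BUCKET=someRandomBucket',
    groovyScript: '''
        def domainsList = "aws s3api list-objects-v2 --bucket ${S3_BUCKET} --output text --delimiter /".execute() | 'cut -d / -f 1'.execute() | 'sed 1d'.execute()
        domainsList.waitFor()
        return domainsList.in.text.split('COMMONPREFIXES')
    ''',
    multiSelectDelimiter: ',',
    quoteValue: false,
    visibleItemCount: 10)
```

The outer triple single quotes keep ${S3_BUCKET} uninterpolated in the Jenkinsfile; it is the double-quoted string inside the parameter script that resolves it from the binding when the script runs.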

Load jenkins parameters from external groovy file

I have a big Jenkinsfile that I would like to re-use for other projects, but I have different parameters per project, so I've tried having one file per project containing only those parameters, like this:
Jenkinsfile
node {
checkout scm
def options = []
def optionsBuilder = load pwd() + '/global-scripts/optionsBuilder.groovy'
options.addAll(optionsBuilder.buildOptions(env.JOB_BASE_NAME))
properties { options }
}
global-scripts/optionsBuilder.groovy
def buildOptions(jobName) {
echo "load " + pwd() + "/project-scripts/" + jobName + ".groovy"
def jobOptionsBuilder = load pwd() + "/project-scripts/" + jobName + ".groovy"
return jobOptionsBuilder.buildOptions()
}
return this
project-scripts/job.groovy
def buildOptions() {
def options = [buildDiscarder(logRotator(numToKeepStr: '5')),
parameters([string(name: 'releaseVersion', defaultValue: env.releaseVersion, description: 'Version that needs to be released'),
string(name: 'nextVersion', defaultValue: env.nextVersion, description: 'Next snapshot version' ),
string(name: 'branch', defaultValue: env.branch, description: 'Branch that needs to be released'),
booleanParam(name: 'sendRocketChatNotification', defaultValue: true, description: 'Send notification to Rocket_Chat'),
booleanParam(name: 'sendEmail', defaultValue: true, description: 'Send an email with changelog'),
booleanParam(name: 'dryRun', defaultValue: false, description: 'No git push and no mvn deploy')])]
return options
}
return this
But it seems I can't find the right syntax; Jenkins throws this error:
java.lang.ClassCastException: org.jenkinsci.plugins.workflow.multibranch.JobPropertyStep.properties expects java.util.List<hudson.model.JobProperty> but received class org.jenkinsci.plugins.workflow.cps.CpsClosure2
at org.jenkinsci.plugins.structs.describable.DescribableModel.coerce(DescribableModel.java:394)
at org.jenkinsci.plugins.structs.describable.DescribableModel.buildArguments(DescribableModel.java:318)
I have the impression that in your Jenkinsfile you should just write
properties(options)
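The curly braces pass a closure (the CpsClosure2 in the error) where the properties step expects a List. A sketch of the corrected Jenkinsfile:

```groovy
node {
    checkout scm
    def optionsBuilder = load pwd() + '/global-scripts/optionsBuilder.groovy'
    def options = optionsBuilder.buildOptions(env.JOB_BASE_NAME)
    // properties is a step taking a List<JobProperty> argument;
    // properties { options } handed it a closure instead, hence the ClassCastException
    properties(options)
}
```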

Extract params from Jenkinsfile

I have a Jenkinsfile that takes a bunch of params (approx. 50), and another 50 for input processing:
pipeline {
agent { label 'ansible24' }
parameters {
string(name: 'NAME', defaultValue: 'Nightly Valid', description: ' instance name')
// ... x50
}
script {
def filename = "configuration.yml"
def yaml = readYaml file: filename
yaml.global.name = "${params.NAME}".toString()
// ... x50
}
Tomorrow, I will also have a validation for each field.
How could I extract this logic in separated files?
I already saw this: How do you load a groovy file and execute it
but it doesn't help much for the case of params, and my pipeline is declarative, not scripted.
Any idea?

Jenkins pipeline - How to give choice parameters dynamically

pipeline {
agent any
stages {
stage("foo") {
steps {
script {
env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
description: 'What is the release scope?')]
}
echo "${env.RELEASE_SCOPE}"
}
}
}
}
In the above code, the choices are hardcoded (patch\nminor\nmajor); my requirement is to give the choice values in the dropdown dynamically.
I get the values from calling an API - artifact list (.zip) file names from Artifactory.
In the above example, it requests input when we do the build, but I want to do a "Build with parameters".
Please suggest/help on this.
Depending on how you get the data from the API there will be different options; for example, let's imagine that you get the data as a List of Strings (let's call it releaseScope). In that case your code would be the following:
...
script {
def releaseScopeChoices = ''
releaseScope.each {
releaseScopeChoices += it + '\n'
}
parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices, description: 'What is the release scope?')]
}
...
hope it will help.
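Note that a bare parameters: label inside a script block does nothing on its own; to actually (re)define the job's parameters from scripted code, the list has to go through the properties step. A sketch under that assumption, with releaseScope standing in for the API result:

```groovy
script {
    // assumption: releaseScope is the List<String> fetched from the API
    def releaseScope = ['patch', 'minor', 'major']
    properties([
        parameters([
            choice(name: 'RELEASE_SCOPE',
                   choices: releaseScope.join('\n'),
                   description: 'What is the release scope?')
        ])
    ])
}
```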
This is a cut-down version of what we use. We separate stuff into shared libraries, but I have consolidated a bit to make it easier.
Jenkinsfile looks something like this:
#!groovy
@Library('shared') _
def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')
properties([
buildDiscarder(logRotator(numToKeepStr: '20')),
parameters([
choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
])
])
Shared library that looks at artifactory, its pretty simple.
Essentially it makes a GET request (providing auth creds), then filters/splits the result to whittle it down to the desired values and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher
List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
// URL components
String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
Object responseJson = getRequest(url)
String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
Pattern regex = ~ regexPattern
List<String> outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
List<String> artifactlist=[]
for (i in outlist) {
artifactlist.add(i['uri'].tokenize('/')[-1])
}
return artifactlist.reverse()
}
// Artifactory Get Request - Consume in other methods
@NonCPS
Object getRequest(url_string){
URL url = url_string.toURL()
// Open connection
URLConnection connection = url.openConnection()
connection.setRequestProperty ("Authorization", basicAuthString())
// Open input stream
InputStream inputStream = connection.getInputStream()
def json_data = new groovy.json.JsonSlurper().parseText(inputStream.text)
// Close the stream
inputStream.close()
return json_data
}
// Artifactory basic auth header - Consume in other methods
@NonCPS
Object basicAuthString() {
// Retrieve password
String username = "artifactoryMachineUsername"
String credid = "artifactoryApiKey"
def credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
)
credentials_store[0].credentials.each { it ->
if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
if (it.getId() == credid) {
apiKey = it.getSecret()
}
}
}
// Create authorization header format using Base64 encoding
String userpass = username + ":" + apiKey;
String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes());
return basicAuth
}
I could achieve it without any plugin:
With Jenkins 2.249.2 using a declarative pipeline,
the following pattern prompts the user with a dynamic dropdown menu
(for them to choose a branch).
(The surrounding withCredentials block is optional, required only if your script and Jenkins configuration use credentials.)
node {
withCredentials([[$class: 'UsernamePasswordMultiBinding',
credentialsId: 'user-credential-in-gitlab',
usernameVariable: 'GIT_USERNAME',
passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
BRANCH_NAMES = sh (script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\' ', returnStdout:true).trim()
}
}
pipeline {
agent any
parameters {
choice(
name: 'BranchName',
choices: "${BRANCH_NAMES}",
description: 'to refresh the list, go to configure, disable "this build has parameters", launch build (without parameters)to reload the list and stop it, then launch it again (with parameters)'
)
}
stages {
stage("Run Tests") {
steps {
sh "echo SUCCESS on ${BranchName}"
}
}
}
}
The drawback is that one has to refresh the Jenkins configuration and use a blank run for the list to be refreshed by the script.
Solution (not from me): this limitation can be made less annoying using an additional parameter used specifically to refresh the values:
parameters {
booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
then within the stage:
stage('a stage') {
when {
expression {
return ! params.REFRESH_BRANCHES.toBoolean()
}
}
...
}
This is my solution.
def envList
def dockerId
node {
envList = "defaultValue\n" + sh (script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}
pipeline {
agent any
parameters {
choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
booleanParam(name: 'SECURITY_SCAN',defaultValue: false, description: 'container vulnerability scan')
}
The example of Jenkinsfile below contains AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. Active Choices Plug-in is required.
Note! You need to approve the script specified in parameters after the first run in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it
automatically (might require administrator permissions).
properties([
parameters([[
$class: 'ChoiceParameter',
choiceType: 'PT_SINGLE_SELECT',
name: 'image',
description: 'Docker image',
filterLength: 1,
filterable: false,
script: [
$class: 'GroovyScript',
fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
script: [
classpath: [],
sandbox: false,
script: '''\
def repository = "frontend"
def aws_ecr_cmd = "aws ecr list-images" +
" --repository-name ${repository}" +
" --filter tagStatus=TAGGED" +
" --query imageIds[*].[imageTag]" +
" --region us-east-1 --output text"
def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
def images = aws_ecr_out.text.tokenize().reverse()
return images
'''.stripIndent()
]
]
]])
])
pipeline {
agent any
stages {
stage('First stage') {
steps {
sh 'echo "${image}"'
}
}
}
}
choiceArray = [ "patch" , "minor" , "major" ]
properties([
parameters([
choice(choices: choiceArray.join('\n') ,
description: '',
name: 'SOME_CHOICE')
])
])
