Jenkins Job DSL Parameter from list

I'm trying to create a DSL job that will create jobs from a list of maps, iterating over them like the following:
def git_branch = "origin/awesomebranch"
def credential_id = "awesomerepocreds"

def jobs = [
    [
        title: "AwesomePipeline",
        description: "This pipeline is awesome.",
        directory: "awesomepath",
        repo: "ssh://git@bitbucket.XXX.XX.XX:XXXX/repo/repo.git"
    ]
]

jobs.each { i ->
    pipelineJob(i.title) {
        description("${i.description}\n\n__branch__: ${git_branch}")
        parameters {
            stringParam('branch', defaultValue='origin/develop', description='Branch to build')
        }
        definition {
            cpsScm {
                scm {
                    git {
                        branch('$branch')
                        remote {
                            url(i.repo)
                            credentials(credential_id)
                        }
                    }
                }
                scriptPath("jenkins/${i.directory}/Jenkinsfile")
            }
        }
    }
}
For jobs without parameters this works great, but I don't know how to pass a list into a job's map that can then be used by the parameters block, something like:
....
def jobs = [
    [
        title: "AwesomePipeline",
        description: "This pipeline is awesome.",
        directory: "awesomepath",
        repo: "ssh://git@bitbucket.XXX.XX.XX:XXXX/repo/repo.git",
        params: [
            stringParam('branch', defaultValue='origin/develop', description='Branch to build'),
            stringParam('awesomeparam', defaultValue='awesomevalue', description='awesomething')
        ]
    ]
]
....
which might then be consumed by some sort of each, but I'm not sure how to formulate this properly:
....
jobs.each { i ->
    pipelineJob(i.title) {
        description("${i.description}\n\n__branch__: ${git_branch}")
        parameters {
            i.params.each { p ->
                p
            }
        }
....
Thanks in advance

This is a relatively old question, and I came across it a few days ago. This is how I ended up solving it.
First we need a class with a static context method that (I think) gets called by Groovy. This context is later reused to actually add the parameters. I suppose you should use the additionalClasspath feature of the job-dsl plugin to move the Params class into a separate file so you can re-use it.
import javaposse.jobdsl.dsl.helpers.BuildParametersContext

class Params {
    protected static Closure context(@DelegatesTo(BuildParametersContext) Closure params) {
        params.resolveStrategy = Closure.DELEGATE_FIRST
        return params
    }

    static Closure extend(Closure params) {
        return context(params)
    }
}
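For that additionalClasspath idea, here is a minimal sketch of how a pipeline-based seed job might pick the class up, assuming Params.groovy is kept under src/main/groovy (both the seed-job style and the path are assumptions on my part):

// Hypothetical seed pipeline step; 'src/main/groovy' is an assumed source root
jobDsl targets: 'jobs/**/*.groovy',
       additionalClasspath: 'src/main/groovy'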
Then you can set up your job using a params attribute that contains a list of closures.
def jobs = [
    [
        name: "myjob",
        params: [
            {
                stringParam('ADDITIONAL_PARAM', 'default', 'description')
            }
        ]
    ]
]
Lastly, you can call the parameters property using the static extend method of the above class.
jobs.each { j ->
    pipelineJob(j.name) {
        // here go all the default parameters
        parameters {
            string {
                name('SOMEPARAM')
                defaultValue('')
                description('')
                trim(true)
            }
        }
        j.params.each { p ->
            parameters Params.extend(p)
        }
        // additional pipelineJob properties...
    }
}
The reason I made the params attribute of the map a list is that this way you can have additional methods returning Closures that contain extra params. For example:
class DefaultParams {
    static Closure someParam(String valueDefault) {
        return {
            string {
                name('SOME_EXTRA_PARAM')
                defaultValue(valueDefault)
                description("It's a description bonanza")
                trim(true)
            }
        }
    }
}
def myjobs = [
    [
        name: "myjob",
        params: [
            DefaultParams.someParam('default'),
            {
                stringParam('ADDITIONAL_PARAM', 'default', 'description')
            }
        ]
    ]
]
You should be able to use all the different syntaxes shown on the job-dsl API page.

The simplest method is:
jobs.each { i ->
    pipelineJob(i.title) {
        description("${i.description}\n\n__branch__: ${git_branch}")
        parameters {
            i.params.each { p ->
                p.delegate = owner.delegate
                p()
            }
        }
    }
}
where owner.delegate is the delegate of parameters.
Or better, this variant, which does not create an empty parameters block for jobs without params:
jobs.each { i ->
    pipelineJob(i.title) {
        description("${i.description}\n\n__branch__: ${git_branch}")
        i.params.each { p ->
            parameters {
                p.delegate = getDelegate()
                p()
            }
        }
    }
}
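Putting that together with the jobs list from the question, a complete sketch might look like this (the parameter definitions are stored as closures in the map):

def jobs = [
    [
        title: "AwesomePipeline",
        params: [
            { stringParam('branch', 'origin/develop', 'Branch to build') },
            { stringParam('awesomeparam', 'awesomevalue', 'awesomething') }
        ]
    ]
]

jobs.each { i ->
    pipelineJob(i.title) {
        i.params.each { p ->
            parameters {
                p.delegate = getDelegate()  // rebind the closure to the parameters context
                p()
            }
        }
    }
}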

Related

Groovy modify map in list

I am trying to modify each map in a list in my build. The first stage validates the config, and I want to generate the image name for each Dockerfile and attach it to its map; the value is used in a later stage. The issue is that .image_name is still null in the later stages.
def call(build = [:]) {
    def docker_files = build.docker_files
    stage("Validate") {
        steps {
            script {
                docker_files.each {
                    // do validation stuff
                    it.image_name = build_util.get_image_name(it)
                }
            }
        }
    }
    stage("Build") {
        steps {
            script {
                docker_files.each {
                    println "${it.image_name}" // would print 'null'
                    build_util.build(it)
                }
            }
        }
    }
}
The App Jenkinsfile looks like this:
App([
    docker_files: [
        [file: "Dockerfile", name: "blah"],
        [file: "nginx/Dockerfile", name: "nginx"]
    ]
])
Edit: I have since attempted the following as well, to no avail:
docker_files.eachWithIndex { it, idx ->
    it.image_name = build_util.get_image_name(it)
    docker_files[idx] = it
}
I'm assuming this has something to do with scoping; however, I have modified other values that were defined immediately inside call, and those modifications carried over to later stages, so I'm not sure why I am seeing this issue here.

How to pass and invoke a method utility to Jenkins template?

I have this template:
def call(body) {
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent any
        ....
        stages {
            stage('My stages') {
                steps {
                    script {
                        pipelineParams.stagesParams.each { k, v ->
                            stage("$k") {
                                $v
                            }
                        }
                    }
                }
            }
        }
        post { ... }
    }
}
Then I use the template in a pipeline:
@Library('pipeline-library') _

pipelineTemplateBasic {
    stagesParams = [
        'First stage': sh "do something...",
        'Second stage': myCustomCommand("foo", "bar")
    ]
}
In stagesParams I pass the invocations of my commands (sh and myCustomCommand), and they land in the template as $v. How can I then execute them? Some sort of invokeMethod($v)?
At the moment I am getting this error:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
The problem with using node is that it doesn't work in situations like parallel:
parallelStages = [:]
v.each { k2, v2 ->
    parallelStages["$k2"] = {
        // node {
        stage("$k2") {
            notifySlackStartStage()
            $v2
            checkLog()
        }
        // }
    }
}
If you want to execute an sh step provided via a map, you need to store the map values as closures, e.g.:
@Library('pipeline-library') _

pipelineTemplateBasic {
    stagesParams = [
        'First stage': {
            sh "do something..."
        },
        'Second stage': {
            myCustomCommand("foo", "bar")
        }
    ]
}
Then in the script part of your pipeline stage you will need to execute each closure, but also set its delegate and resolution strategy to the workflow script, e.g.:
script {
    pipelineParams.stagesParams.each { k, v ->
        stage("$k") {
            v.resolveStrategy = Closure.DELEGATE_FIRST
            v.delegate = this
            v.call()
        }
    }
}
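Applied to the parallel snippet from the question, the same delegation would look roughly like this sketch (notifySlackStartStage and checkLog are the question's own helpers; v2 is assumed to be one of the stored closures):

def parallelStages = [:]
v.each { k2, v2 ->
    parallelStages["$k2"] = {
        stage("$k2") {
            notifySlackStartStage()
            v2.resolveStrategy = Closure.DELEGATE_FIRST
            v2.delegate = this
            v2.call()   // execute the stage body closure
            checkLog()
        }
    }
}
parallel parallelStages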

Jenkins Pipeline Conditional Environmental Variables

I have a set of static environment variables in the environment directive of a declarative pipeline. These values are available to every stage in the pipeline.
I want the values to change based on an arbitrary condition.
Is there a way to do this?
pipeline {
    agent any
    environment {
        if ${params.condition} {
            var1 = '123'
            var2 = abc
        } else {
            var1 = '456'
            var2 = def
        }
    }
    stages {
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Looking for the same thing, I found a nice answer in another question: basically, use the ternary conditional operator:
pipeline {
    agent any
    environment {
        var1 = "${params.condition == true ? "123" : "456"}"
        var2 = "${params.condition == true ? abc : def}"
    }
}
Note: keep in mind that, in the way you wrote your question (and the way I wrote my answer), the numbers are Strings and the letters are variables.
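If condition is declared as a boolean parameter, the == true comparison can be dropped. A minimal sketch, assuming the parameter is defined in the same pipeline (the declaration itself is my addition):

pipeline {
    agent any
    parameters {
        booleanParam(name: 'condition', defaultValue: false, description: 'Switches var1 and var2')
    }
    environment {
        var1 = "${params.condition ? '123' : '456'}"
    }
    stages {
        stage('One') {
            steps {
                echo env.var1   // prints 123 or 456 depending on the parameter
            }
        }
    }
}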
I would suggest creating an "Environment" stage and declaring your variables according to the condition you want, something like below:
pipeline {
    agent any
    environment {
        // Declare variables which will remain the same throughout the build
    }
    stages {
        stage('Environment') {
            agent { node { label 'master' } }
            steps {
                script {
                    // Write the condition for the variables which need to change
                    if (params.condition) {
                        env.var1 = '123'
                        env.var2 = abc
                    } else {
                        env.var1 = '456'
                        env.var2 = def
                    }
                    sh "printenv"
                }
            }
        }
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Suppose we want to use optional params for a downstream job when it is called from an upstream job, and default params when the downstream job is run by itself, but for some reason we don't want "holder" params with default values in the downstream job. This can be done via a Groovy function.
Upstream Jenkinsfile - the param CREDENTIALS_ID is passed downstream:
pipeline {
    stages {
        stage('trigger') {
            steps {
                build job: "my_downstream_job_name",
                    parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
Downstream Jenkinsfile - if the param CREDENTIALS_ID was not passed from upstream, the function returns a default value:
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID
    } else {
        return "default_credentials_id"
    }
}

pipeline {
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
You can get another level of flexibility using maps:
stage("set_env_vars") {
steps {
script {
def MY_MAP1 = [A: "123", B: "456", C: "789"]
def MY_MAP2 = [A: "abc", B: "def", C: "ghi"]
env.var1 = MY_MAP1."${env.switching_var}"
env.var2 = MY_MAP2."${env.switching_var}"
}
}
}
This way, more choices are possible.
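For example, if env.switching_var is "A", var1 resolves to "123" and var2 to "abc"; with "B" you get "456" and "def".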

How can I use foreach with conditionalSteps in Jenkins Job DSL

I'm trying to use the conditionalSteps add-in with Jenkins Job DSL to conditionally trigger a build step. I want this step to trigger if any file in a given set exists. I am able to get this to work by explicitly calling out multiple fileExists conditions inside an or. However, I would like to build this dynamically using a foreach.
Here's what I have been playing with on http://job-dsl.herokuapp.com/
def files = ["file1", "file2", "file3"]

job('SomeJob') {
    steps {
        conditionalSteps {
            condition {
                /* This works fine:
                or {
                    fileExists("file1.jenkinsTrigger", BaseDir.WORKSPACE)
                } {
                    fileExists("file2.jenkinsTrigger", BaseDir.WORKSPACE)
                } {
                    fileExists("file3.jenkinsTrigger", BaseDir.WORKSPACE)
                }
                */
                // But I want to create the or clause from the array above
                or {
                    files.each {
                        fileExists("${it}.jenkinsTrigger", BaseDir.WORKSPACE)
                    }
                }
            }
            runner('Unstable')
            steps {
                gradle 'test'
            }
        }
    }
}
The above fails with:
javaposse.jobdsl.dsl.DslScriptException: (script, line 17) No condition specified
and I have tried all manner of combinations to get this to work, without avail... any tips would be much appreciated.
The or DSL method expects an array of closures, so you need to convert the collection of file names into an array of closures.
Example:
def files = ["file1", "file2", "file3"]
job('example') {
steps {
conditionalSteps {
condition {
or(
(Closure[]) files.collect { fileName ->
return {
fileExists("${fileName}.jenkinsTrigger", BaseDir.WORKSPACE)
}
}
)
}
runner('Unstable')
steps {
gradle 'test'
}
}
}
}

Jenkins DSL - Parse Yaml for complex processing

I'm using Jenkins Job DSL to construct pipelines for multiple SOA-style services. All these service pipelines are identical.
job('wibble') {
    publishers {
        downstreamParameterized {
            trigger("SOA_Pipeline_Builder") {
                condition('SUCCESS')
                parameters {
                    predefinedProp('PROJECT_NAME', "myproject-2")
                    predefinedProp('PROJECT_REPO', "myprojecttwo@gitrepo.com")
                }
            }
            trigger("SOA_Pipeline_Builder") {
                condition('SUCCESS')
                parameters {
                    predefinedProp('PROJECT_NAME', "myproject-1")
                    predefinedProp('PROJECT_REPO', "myprojectone@gitrepo.com")
                }
            }
        }
    }
}
Given that I'm adding new projects every day, I have to keep manipulating the DSL. I've decided I'd rather keep all the config in a YAML file outside of the DSL. I know I can use Groovy to create arrays, do loops, etc., but I'm not having much luck.
I'm trying to do something like this...
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

List projects = new Yaml().load(("conf/projects.yml" as File).text)

job('wibble') {
    publishers {
        downstreamParameterized {
            projects.each {
                trigger("SOA_Pipeline_Builder") {
                    condition('SUCCESS')
                    parameters {
                        predefinedProp('PROJECT_NAME', it.name)
                        predefinedProp('PROJECT_REPO', it.repo)
                    }
                }
            }
        }
    }
}
conf/projects.yml
---
- name: myproject-1
  repo: myprojectone@gitrepo.com
- name: myproject-2
  repo: myprojecttwo@gitrepo.com
Does anyone have any experience with this sort of thing?
This is how I'm using snakeyaml with jobDSL to separate configuration from "application" code.
config.yml
services:
  - some-service-1
  - some-service-2
target_envs:
  - stage
  - prod
folder_path: "promotion-jobs"
seed_job.groovy
#!/usr/bin/groovy
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

def workDir = SEED_JOB.getWorkspace()
print("Loading config from ${workDir}/config.yml")
def config = new Yaml().load(("${workDir}/config.yml" as File).text)

for (service in config.services) {
    for (stage in config.target_envs) {
        folder("${config.folder_path}/to-${stage}") {
            displayName("Deploy to ${stage} jobs")
            description("Deploy ECS services to ${stage}")
        }
        if (stage == "stage") {
            stage_trigger = """
                pipelineTriggers([cron('1 1 * * 1')]),
            """
        } else {
            stage_trigger = ""
        }
        pipelineJob("${config.folder_path}/to-${stage}/${service}") {
            definition {
                cps {
                    sandbox()
                    script("""
                        node {
                            properties([
                                ${stage_trigger}
                                parameters([
                                    choice(
                                        choices: ['dev,stage'],
                                        description: 'The source environment to promote',
                                        name: 'sourceEnv'
                                    ),
                                    string(
                                        defaultValue: '',
                                        description: 'Specify a specific Docker image tag to deploy. This will override sourceEnv and should be left blank',
                                        name: 'sourceTag',
                                        trim: true
                                    )
                                ])
                            ])
                            properties([
                                disableConcurrentBuilds(),
                            ])
                            stage('init') {
                                dockerPromote(
                                    app="${service}",
                                    destinationEnv="${stage}"
                                )
                            }
                        }
                    """)
                }
            }
        }
    }
}
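One side note: if loading the config with "as File" ever resolves against the wrong directory, Job DSL also provides readFileFromWorkspace, which reads paths relative to the seed job's workspace; a minimal sketch:

@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

// readFileFromWorkspace resolves the path against the seed job's workspace
def config = new Yaml().load(readFileFromWorkspace('config.yml'))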
