Groovy: modify each map in a list - Jenkins

I am trying to modify each map in a list in my build. The first stage validates the config, and I want to generate the image name for each Dockerfile and attach it to its map; the value is then used in a later stage. The problem is that .image_name is still null in the later stages.
def call(build = [:]) {
    def docker_files = build.docker_files
    stage("Validate") {
        steps {
            script {
                docker_files.each {
                    // do validation stuff
                    it.image_name = build_util.get_image_name(it)
                }
            }
        }
    }
    stage("Build") {
        steps {
            script {
                docker_files.each {
                    println "${it.image_name}" // would print 'null'
                    build_util.build(it)
                }
            }
        }
    }
}
The App Jenkinsfile looks like this:
App([
    docker_files: [
        [file: "Dockerfile", name: "blah"],
        [file: "nginx/Dockerfile", name: "nginx"]
    ]
])
Edit: I have since attempted the following as well, to no avail:
docker_files.eachWithIndex { it, idx ->
    it.image_name = build_util.get_image_name(it)
    docker_files[idx] = it
}
I'm assuming this has something to do with scoping; however, I have modified other values that were defined immediately inside call, and those modifications carried through to later stages, so I'm not sure why I'm seeing this issue here.
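One workaround worth trying (a sketch only, not verified against a live Jenkins controller): instead of mutating the maps in place, build a new list with collect and reassign the variable, so the later stage reads the rebuilt maps rather than depending on in-place mutation surviving between stages:

```groovy
def docker_files = build.docker_files
stage("Validate") {
    steps {
        script {
            // build fresh maps rather than mutating the originals in place
            docker_files = docker_files.collect { df ->
                df + [image_name: build_util.get_image_name(df)]
            }
        }
    }
}
```

Here `df + [image_name: ...]` is plain Groovy map concatenation, which produces a new map with the extra key.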


How to send a referencedParameter to a readFileFromWorkspace through activeChoiceReactiveParam

I'm trying to pass a referencedParameter ('product') to a Groovy script (services.groovy) that is loaded via readFileFromWorkspace inside an activeChoiceReactiveParam.
Expected result: a dropdown list with the file's content.
Actual result: the job fails while processing the DSL script:
ERROR: (services.groovy, line 5) No such property: product for class: dsl.jobs.argocd.services
I tried defining the 'product' referenced parameter as an environment variable (and updating the services.groovy script accordingly); it didn't work.
I also tried re-creating the services.groovy file under the /tmp/ directory, but I had issues finding the file.
products.groovy:
package dsl.jobs.argocd
return ['a','b','c']
services.groovy:
package dsl.jobs.argocd

return_value = []
if (product.equals("a")) {
    return_value = ['e']
}
if (product.equals("b")) {
    return_value = ['f']
}
if (product.equals("c")) {
    return_value = ['g']
}
return return_value
Pipeline:
pipelineJob("test") {
    description("test")
    keepDependencies(false)
    parameters {
        activeChoiceParam('product') {
            description('What product would you like to update?')
            filterable()
            choiceType('SINGLE_SELECT')
            groovyScript {
                script(readFileFromWorkspace('dsl/jobs/argocd/products.groovy'))
                fallbackScript('return ["ERROR"]')
            }
        }
        activeChoiceReactiveParam('service') {
            description('Which services would you like to update?')
            filterable()
            choiceType('CHECKBOX')
            groovyScript {
                script(readFileFromWorkspace('dsl/jobs/argocd/services.groovy'))
                fallbackScript('return ["ERROR"]')
            }
            referencedParameter("product")
        }
    }
}
Am I approaching this wrong? Is there a different way to use the same parameter across multiple Groovy files?
It turns out the code above is fine; the only problem was the location of the services.groovy script. I moved the file out of the DSL directory (since I don't want it parsed as a DSL file), pointed readFileFromWorkspace at the correct location, and it works perfectly.
Updated pipeline:
pipelineJob("test") {
    description("test")
    keepDependencies(false)
    parameters {
        activeChoiceParam('product') {
            description('What product would you like to update?')
            filterable()
            choiceType('SINGLE_SELECT')
            groovyScript {
                script(readFileFromWorkspace('dsl/jobs/argocd/products.groovy'))
                fallbackScript('return ["ERROR"]')
            }
        }
        activeChoiceReactiveParam('service') {
            description('Which services would you like to update?')
            filterable()
            choiceType('CHECKBOX')
            groovyScript {
                script(readFileFromWorkspace('services.groovy'))
                fallbackScript('return ["ERROR"]')
            }
            referencedParameter("product")
        }
    }
}

Getting the same output from parallel stages in jenkins scripted pipelines

I'm trying to create parallel stages in a Jenkins scripted pipeline, for example:
node {
    stage('CI') {
        script {
            doDynamicParallelSteps()
        }
    }
}

def doDynamicParallelSteps() {
    tests = [:]
    for (f in ["Branch_1", "Branch_2", "Branch_3"]) {
        tests["${f}"] = {
            node {
                stage("${f}") {
                    echo "${f}"
                }
            }
        }
    }
    parallel tests
}
I'm expecting to see "Branch_1", "Branch_2", "Branch_3", but instead I'm getting "Branch_3", "Branch_3", "Branch_3". I don't understand why. Can you please help?
Short answer: in the classic view, the stage names display the last value of the variable ${f}, and all the echo steps print that same value. You need to change the loop.
Long answer: Jenkins does not allow multiple stages with the same name, so this could never have worked successfully :)
In your example, Blue Ocean shows the stages fine, and the console output has the right names too; it is only the Jenkins classic view that shows the last value of ${f} for every stage name and echo.
Solution: change your loop. This worked fine for me:
node {
    stage('CI') {
        script {
            doDynamicParallelSteps()
        }
    }
}

def doDynamicParallelSteps() {
    def branches = [:]
    for (int i = 0; i < 3; i++) {
        int branch = i + 1   // fresh variable per iteration, captured by the closure
        branches["branch_${branch}"] = {
            stage("Branch_${branch}") {
                node {
                    sh "echo branch_${branch}"
                }
            }
        }
    }
    parallel branches
}
This has to do with closures and iteration, but in the end this might fix it:
for (f in ["Branch_1", "Branch_2", "Branch_3"]) {
    def definitive_name = f   // fresh variable per iteration
    tests[definitive_name] = {
        node { stage(definitive_name) { echo definitive_name } }
    }
}
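The root cause is Groovy closure capture rather than anything Jenkins-specific: closures created inside a for-in loop all capture the single loop variable, so by the time they run they see its final value. A minimal sketch in plain Groovy:

```groovy
def broken = []
def fixed = []
for (f in ["a", "b", "c"]) {
    broken << { f }      // every closure shares the one loop variable f
    def local = f        // a fresh variable per iteration
    fixed << { local }   // each closure keeps the value it was created with
}
// the broken closures all return "c"; the fixed ones return "a", "b", "c"
```

Loop constructs that take a closure parameter, such as each, don't have this problem, because the parameter is a fresh binding on every call.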

Can I have a reusable "post" block for my jenkins pipelines?

I have many Jenkins pipelines for several different platforms, but my post{} block is pretty much the same across all of them. It's also quite large at this point, because it includes success, unstable, failure and aborted.
Is there a way to parameterize a reusable post{} block that I can import into all my pipelines? I'd like to be able to import it and pass it params as well (because while it's almost the same, it varies slightly between pipelines).
Example post block that is currently copy-pasted inside all my pipeline{}s:
post {
    success {
        script {
            // I'd like to be able to pass in values for param1 and param2
            someGroovyScript {
                param1 = 'blah1'
                param2 = 'blah2'
            }
            // maybe I'd want a conditional here that does something with a passed-in param
            if (param3 == 'blah3') {
                echo 'doing something'
            }
        }
    }
    unstable {
        ... you get the idea
    }
    aborted {
        ... you get the idea
    }
    failure {
        ... you get the idea
    }
}
The following does not work:
// in mypipeline.groovy
...
post {
    script {
        myPost {}
    }
}

// in vars/myPost.groovy
def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    return always {
        echo 'test'
    }
}
Invalid condition "myPost" - valid conditions are [always, changed, fixed, regression, aborted, success, unstable, failure, notBuilt, cleanup]
Can I override post{} somehow, or something?
Shared libraries are one approach for this; you were pretty close:
@Library('my-shared-library') _
pipeline {
    ...
    post {
        always {
            script {
                myPost()
            }
        }
    }
}
Answer based on https://stackoverflow.com/a/48563538/1783362
Shared Libraries link: https://jenkins.io/doc/book/pipeline/shared-libraries/
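To pass parameters through, the shared-library step can take a map instead of a body closure. A minimal sketch, assuming the someGroovyScript step and the param names from the question exist:

```groovy
// vars/myPost.groovy
def call(Map config = [:]) {
    someGroovyScript {
        param1 = config.param1 ?: 'blah1'   // fall back to a default if not passed
        param2 = config.param2 ?: 'blah2'
    }
    if (config.param3 == 'blah3') {
        echo 'doing something'
    }
}
```

It would then be invoked from a pipeline as myPost(param1: 'x', param3: 'blah3') inside post { always { script { ... } } }.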

How can I use foreach with conditionalSteps in Jenkins Job DSL

I'm trying to use the conditionalSteps add-on with the Jenkins Job DSL to conditionally trigger a build step. I want the step to trigger if any file in a given set exists. I can get this to work by explicitly calling out multiple fileExists conditions inside an or; however, I'd like to build the condition dynamically with a foreach.
Here's what I have been playing with on http://job-dsl.herokuapp.com/
def files = ["file1", "file2", "file3"]
job('SomeJob') {
    steps {
        conditionalSteps {
            condition {
                /* This works fine:
                or {
                    fileExists("file1.jenkinsTrigger", BaseDir.WORKSPACE)
                }{
                    fileExists("file2.jenkinsTrigger", BaseDir.WORKSPACE)
                }{
                    fileExists("file3.jenkinsTrigger", BaseDir.WORKSPACE)
                }
                */
                // But I want to create the or clause from the array above
                or {
                    files.each {
                        fileExists("${it}.jenkinsTrigger", BaseDir.WORKSPACE)
                    }
                }
            }
            runner('Unstable')
            steps {
                gradle 'test'
            }
        }
    }
}
The above fails with:
javaposse.jobdsl.dsl.DslScriptException: (script, line 17) No condition specified
I have tried all manner of combinations to get this to work, to no avail... any tips would be much appreciated.
The or DSL method expects an array of closures, so you need to convert the collection of file names into an array of closures.
Example:
def files = ["file1", "file2", "file3"]
job('example') {
    steps {
        conditionalSteps {
            condition {
                or(
                    (Closure[]) files.collect { fileName ->
                        return {
                            fileExists("${fileName}.jenkinsTrigger", BaseDir.WORKSPACE)
                        }
                    }
                )
            }
            runner('Unstable')
            steps {
                gradle 'test'
            }
        }
    }
}
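The cast trick is plain Groovy: collect produces a List of closures, and the (Closure[]) cast turns it into the array the or method expects. The capture behaviour can be sanity-checked outside Jenkins (the inner closure here just returns the file name, standing in for the real fileExists condition):

```groovy
def files = ["file1", "file2", "file3"]
// each collect call gets its own fileName parameter, so every inner
// closure captures a distinct value (unlike a for-in loop variable)
Closure[] conditions = (Closure[]) files.collect { fileName ->
    return { "${fileName}.jenkinsTrigger" }
}
assert conditions.length == 3
```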

Is Jenkins parallel stages pipeline conditional flow, possible?

Is it possible to run Jenkins pipeline parallel stages conditionally? I have a full build that runs all the stages in parallel, but say I only wanted to run 2 of the 5 stages... is this possible, and what would the syntax look like? Here is my Groovy script:
def call() {
    def my_automation = load("my-lib/groovies/my_automation.groovy")
    parallel thing1: {
        stage('thing1') {
            my_automation.my_func("thing1")
        }
    }, thing2: {
        stage('thing2') {
            my_automation.my_func("thing2")
        }
    }, thing3: {
        stage('thing3') {
            my_automation.my_func("thing3")
        }
    }, thing4: {
        stage('thing4') {
            my_automation.my_func("thing4")
        }
    }, thing5: {
        stage('thing5') {
            my_automation.my_func("thing5")
        }
    }, thing6: {
        stage('thing6') {
            my_automation.my_func("thing6")
        }
    }, thing7: {
        stage('thing7') {
            my_automation.my_func("thing7")
        }
    }
}
return this
But I'm looking for this kind of thing:
for (all things defined, run them at once in parallel) {
    etc...
}
Is this possible?
What you need to do is conditionally define what you pass to parallel, rather than executing it conditionally. Use a loop to build a map of the branches you want to execute, then pass that map to the parallel step:
def branches = [:]
for (int i = 0; i < 4; i++) {
    int index = i
    branches["thing${index}"] = {
        stage("thing${index}") {
            node('label_example') {
                my_automation.my_func("thing${index}")
            }
        }
    }
}
parallel branches
Use logic inside the loop to skip adding entries to the map if you want. This isn't syntax-checked or tested, but you should get the idea.
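For example, filtering which branches get added before calling parallel (a sketch; the enabled list is a hypothetical placeholder, and my_automation is taken from the question):

```groovy
def enabled = ['thing1', 'thing3']   // hypothetical: whatever subset should run this build
def branches = [:]
for (t in ['thing1', 'thing2', 'thing3', 'thing4', 'thing5']) {
    if (!(t in enabled)) {
        continue                     // skip branches that shouldn't run
    }
    def name = t                     // fresh variable per iteration for the closure
    branches[name] = {
        stage(name) {
            my_automation.my_func(name)
        }
    }
}
parallel branches
```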
I ended up doing something like this: I pass in the image name, match it against a map, and use the result to decide which one to run against (it took me long enough to respond, but this might help someone in the future):
String call(String image_name) {
    def map = ["CentOS-7": "rpm", "CentOS-6": "rpm", "RHEL-7": "rpm",
               "RHEL-6": "rpm", "Ubuntu-14.04": "deb"]
    for (element in map) {
        echo "${element.key} ${element.value}"
        if ("${element.key}" == image_name) {
            return "${element.value}"
        }
    }
    return "notfound"
}
return this
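Since the image names are exact map keys, the loop can also be replaced with a direct lookup; a small sketch of the same function:

```groovy
String call(String image_name) {
    def map = ["CentOS-7": "rpm", "CentOS-6": "rpm", "RHEL-7": "rpm",
               "RHEL-6": "rpm", "Ubuntu-14.04": "deb"]
    // map[image_name] is null for unknown keys, so ?: supplies the fallback
    return map[image_name] ?: "notfound"
}
```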
