Cannot get build parameters in Jenkins Pipeline Job

I'm trying to get some build parameters in a Jenkins pipeline job. In this context the parameters are defined via the "This project is parameterized" checkbox and passed in at build time.
In the job I call two branches:
parallel firstBranch: {
build job: 'Run Blah', parameters: [string(name: 'BLAH', value: '$app.blah.blah')]
}, secondBranch: {
build job: 'Run BlahBlah', parameters: [string(name: 'BLAH', value: '$app.blah.blah')]
}
I've tried accessing the build parameter app.blah.blah in these various ways:
${app.blah.blah}
$app.blah.blah
"${app.blah.blah}"
app.blah.blah
currentBuild.buildVariableResolver.resolve("app.blah.blah")
System.getenv("app.blah.blah")
I always get some exception that I can somewhat understand, but I'm starting to get very annoyed. It should not be this hard to get a build parameter in the script for God's sake. What am I doing wrong?

This is working for me:
println blah
So I guess it should be enough for you to do it like this:
parallel firstBranch: {
build job: 'Run Blah', parameters: [string(name: 'BLAH', value: blah)]
}, secondBranch: {
build job: 'Run BlahBlah', parameters: [string(name: 'BLAH', value: blah)]
}

Well, it looks like you can't have dots in your build parameter names! The dots confuse Groovy into thinking you're accessing a class property. It's a shame I won't be able to keep my parameter names consistent across Ant scripts and Jenkins jobs, but it's not a huge deal right now. If anyone knows how to access dotted variables, please feel free to add input!
Correct syntax for build parameter variable: alphanumeric_underline_only
Correct syntax for accessing: println(alphanumeric_underline_only)

If your pipeline has a parameter with a dot in its name, you can try to get the value of that parameter by using getProperty(), for example (tested):
//println params.MyApp.Api.Server // will not work
//println "${MyApp.Api.Server}" // will not work
println getProperty('MyApp.Api.Server') //works perfectly
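Another option worth trying (an untested sketch, reusing the same parameter name): since params is a map, subscript access sidesteps Groovy's property-chain parsing entirely:

```groovy
// Bracket/subscript access treats the whole dotted string as one map key,
// so Groovy never tries to resolve MyApp.Api.Server as a property chain.
println params['MyApp.Api.Server']
echo "Value: ${params['MyApp.Api.Server']}"
```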

Related

Jenkins run another job on quiet mode

I have a Jenkinsfile that calls another job. It looks something like this (other than this stage, there's only the pipeline wrapper with agent any):
stage('Call Job1') {
steps{
build job: 'Job1', parameters: [
[$class: 'StringParameterValue', name: 'gitBranch', value: "${gitBranch}"],
[$class: 'StringParameterValue', name: 'callingJob', value: "${JOB_NAME}"]
]
}
}
But for some reason, it runs the job with a really long quiet period, even though one isn't defined anywhere:
(pending—In the quiet period. Expires in 9 hr 40 min)
When I go to said job and put in the same parameters manually, it works right away.
Am I doing something wrong? I couldn't find anything about this online.
Thanks ahead!
SOLVED: I called the job like this:
build job: 'Job1', quietPeriod: 0, parameters: [...]
though I feel like there should be a better solution.
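For reference, the full stage with the fix applied might look like this (a sketch reusing the parameter names from the question, with the string() shorthand swapped in for the $class form):

```groovy
stage('Call Job1') {
    steps {
        build job: 'Job1',
              quietPeriod: 0, // override the downstream job's quiet period (seconds)
              parameters: [
                  string(name: 'gitBranch', value: gitBranch),
                  string(name: 'callingJob', value: env.JOB_NAME)
              ]
    }
}
```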

How to access stringParam in groovy Jenkins pipeline

I have a job which takes a stringParam, and I am trying to access that stringParam from the workflow; however, I am getting a No such property exception.
parameters {
stringParam("COMMIT_SHA", "", "[Required] Short SHA of the commit used to build the image")
}
print "The commit sha val is: $env.COMMIT_SHA"
I have tried different options like ${params.COMMIT_SHA}, ${COMMIT_SHA}
Solution
Modify your parameters block as follows
parameters {
string(name: 'COMMIT_SHA', defaultValue: '', description: '[Required] Short SHA of the commit used to build the image')
}
You should then be able to reference it using ${params.COMMIT_SHA} or ${COMMIT_SHA} or ${env.COMMIT_SHA}
Your syntax is for the Job DSL plugin, but you stated you are using a pipeline. If you are, in fact, using a freestyle job to create other jobs instead of a pipeline, please provide the entire script and I will update my answer. However, your post appears to want to set the parameter for a pipeline, not specify the metadata for the creation of a separate job via the Job DSL plugin.
The following Job DSL script, when executed, will create a Job called "example" with COMMIT_SHA as a parameter. It won't be added as a parameter to the job that contains the Job DSL script
job('example') {
parameters {
stringParam('COMMIT_SHA', 'my default stringParam value', 'my description')
}
}
In other words, there is nothing to print in the job that contains the stringParam code, because that code configures a different job to contain a string parameter; it doesn't add that parameter to the current job.
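For completeness, a minimal declarative pipeline using the corrected parameters block might look like this (a sketch):

```groovy
pipeline {
    agent any
    parameters {
        string(name: 'COMMIT_SHA', defaultValue: '',
               description: '[Required] Short SHA of the commit used to build the image')
    }
    stages {
        stage('Show parameter') {
            steps {
                // With the parameter declared this way, params.COMMIT_SHA,
                // env.COMMIT_SHA, and bare COMMIT_SHA all resolve.
                echo "The commit sha val is: ${params.COMMIT_SHA}"
            }
        }
    }
}
```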

Pass parameters from KEY=VALUE properties file to downstream Jenkins job

In my declarative pipeline, I do have the below line to call the downstream job.
build job: 'my_downstream_job'
I have a file in KEY=VALUE format; how do I pass in the parameters from this properties file? The downstream job receives each parameter under its KEY. Using the Jenkins GUI, I use "Parameters from properties file" and put the filename in there, and it works. I'd like to know how to do the same with a pipeline.
You might need some extra logic to process this file and construct the appropriate list. See https://jenkins.io/doc/pipeline/steps/pipeline-build-step/ for documentation, but typically it will look like this:
build(job: "my_downstream_job",
parameters: [
new StringParameterValue('MY_NAME', myname_var),
],
propagate: true
)
So you might need to parse your file and create a list of new StringParameterValues, one for each line.
It doesn't appear that there is a one-liner way to do this, but I got this to work:
params = readProperties file: "params.properties"
build job: 'some_jenkins_job', parameters: [
string(name: 'PROP_STR', value: params.PROP_STR),
booleanParam(name: 'PROP_BOOL', value: params.PROP_BOOL),
]
And maybe this is for the best, so it's clear exactly which params are being passed in, just by reading the Jenkinsfile.
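That said, if you do want something closer to a one-liner, the parameter list can be built dynamically from the file (a sketch; note that readProperties returns every value as a String, so a boolean would need an explicit toBoolean() somewhere):

```groovy
// Sketch: pass every KEY=VALUE pair through as a string parameter.
def props = readProperties file: 'params.properties'
def paramList = props.collect { key, value -> string(name: key, value: value) }
build job: 'some_jenkins_job', parameters: paramList
```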

Jenkins declarative pipeline dynamic choice parameter doesn't get updated after first build

I'm trying to convert old Jenkins jobs to declarative pipeline code.
When trying to use the choice parameter in the script, I implement a function which should return updated values; if the values are not the most recent ones, the job will fail.
The problem is that after the first build which looks ok, the values stay static, they don't get updated afterwards which as I said above - fails my job.
It's like the function that I wrote runs only once, at the first build, and doesn't run ever again.
I've tried writing the code so that the output is sent to a file and read back from it, thinking the function might pick up the updated text from the file, but that didn't work.
I've tried looking at the Jenkins documentation / a lot of other threads and didn't find a thing.
My code looks like this:
def GetNames() {
def workspace = "..."
def proc = "${workspace}/script.sh list".execute()
return proc.text
}
${workspace} - Is just my workspace, doesn't matter.
script.sh - A script that 100% works and tested
return proc.text - Does return the values, I've tested it in my Jenkins website/script section and the values do return properly and updated.
My parameters section:
parameters {
choice(name: 'Names', choices: GetNames(), description: 'The names')
}
On the first build I get 5 names, which is good because those are the updated values. On the second build I know there are 10 values, but I still get the 5 from before, and every build after that I will still get the same 5 names. They do not get updated at all; the function does not get triggered again.
It seems like this is a very long-standing issue which still hasn't been patched; the only thread that mentioned it was this one:
Jenkins dynamic declarative pipeline parameters, but the solution there is scripted rather than declarative.
Well, I've finally figured it out: the solution is combining the declarative and scripted ways, using the Active Choices parameter plugin.
node {
properties([
parameters([
[$class: 'ChoiceParameter',
choiceType: 'PT_SINGLE_SELECT',
description: 'The names',
filterLength: 1,
filterable: true,
name: 'Name',
randomName: 'choice-parameter-5631314439613978',
script: [
$class: 'GroovyScript',
script: [
classpath: [],
sandbox: false,
script: '''
some code.....
return something'''
]
]
],
])
])
}
pipeline {
agent any
.
.
This way, the script part of the Active Choices parameter runs every time you load the "Build with Parameters" page, and the values come back updated every time.
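The elided script body could, for example, shell out the same way GetNames() did (a hypothetical sketch; the script path is an assumption, and Active Choices expects a List of choices back):

```groovy
script: '''
    // This runs on every page load of "Build with Parameters",
    // so the choices are refreshed each time rather than frozen
    // at the first build.
    def proc = "/path/to/workspace/script.sh list".execute()
    proc.waitFor()
    return proc.text.readLines()
'''
```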

Pipeline jobs - pass parameters upstream?

TL;DR: Obviously in a Jenkins pipeline job you can easily pass parameters downstream. What I want to know is if you can pass them upstream.
Use case:
We have three jobs: job_one, job_two, and job_three. These are frequently run separately, since often only one stage is needed, but increasingly often we'd like to be able to run all three back to back.
The first and second rely on parameters you can define ahead of time, but the third needs a parameter that is generated from the second job (a file name whose structure is unknown until job_two runs).
I have built umbrella, which calls something like the following for each job. In this case, PARAM1 is populated because umbrella runs as "Build with parameters".
build job: 'job_one', parameters: [[$class: 'StringParameterValue', name: 'PARAM1', value: "$PARAM1"]]
All fine and dandy, I can then use PARAM1 in job_one just fine.
The Problem:
For job_three I need the parameter filename. This is generated within job_two, and therefore from what I can tell is inaccessible because job_three has no idea what job_two is doing.
In an ideal world I would just have job_two pass the filename to the umbrella job, which would feed it back into job_three. Therefore, how can I pass the generated filename back up to the umbrella job?
I'm picturing a final script something like this;
node('on-demand-t2small'){
stage ('Build 1') {
build job: 'job_one', parameters: [[$class: 'StringParameterValue', name: 'PARAM1', value: "$PARAM1"]]
}
stage ('Build 2') {
build job: 'job_two', parameters: [[$class: 'StringParameterValue', name: 'PARAM2', value: "$PARAM2"]]
//somehow get the filename parameter out of job_two here so that I can move it to job three...
}
stage ('Build 3') {
build job: 'job_three', parameters: [[$class: 'StringParameterValue', name: 'filename', value: "$filename"]]
} }
Additional Notes:
I recognize that the first question will be "why not have job_two trigger job_three?" I can't set the system up this way, for two reasons:
job_two needs to be able to run without triggering job_three, and job_three can't always require job_two's input to run.
I debated having the umbrella kick off job_two and then having a clause in job_two that would trigger job_three ONLY IF it had been started by the umbrella, but as far as I can tell this would limit feedback in the umbrella job; you wouldn't know whether job_two failed because job_two itself failed, or because job_three (as a part of job_two) failed. If I'm wrong about this assumption, please let me know.
I had thought about setting the parameter as an environment variable, but I believe that's node-specific, and I can't guarantee both jobs will run on the same node, so that didn't seem to be the solution.
Umbrella is a pipeline job written in groovy, the other three may be pipeline or freestyle jobs, if that matters.
I would appreciate detailed answers where possible, I'm still new to Groovy, Jenkins, and coding in general.
It should be as simple as this: capture the return value of the build step for job_two, which exposes that build's environment via buildVariables:
stage ('Build 2') {
res = build job: 'job_two', parameters: [[$class: 'StringParameterValue', name: 'PARAM2', value: "$PARAM2"]]
echo "$res.buildVariables.filename"
}
assuming that in job_two you do
env.filename = "cool new file name"
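On the downstream job's side, any value written into env during the run is exposed to the caller through buildVariables, so the generating job might do something like this (the sh command and the output/ directory are assumptions for illustration):

```groovy
// Capture the generated file name and publish it via env so the
// calling (umbrella) job can read it back from buildVariables.
script {
    def generated = sh(script: 'ls -1 output/ | head -n 1', returnStdout: true).trim()
    env.filename = generated
}
```

The umbrella job can then feed res.buildVariables.filename straight into the parameters of the job_three build call.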
