How to access stringParam in a Groovy Jenkins pipeline

I have a job which takes a stringParam, and I am trying to access that stringParam from the workflow; however, I am getting a No such property exception.
parameters {
    stringParam("COMMIT_SHA", "", "[Required] Short SHA of the commit used to build the image")
}
print "The commit sha val is: $env.COMMIT_SHA"
I have tried different options like ${params.COMMIT_SHA} and ${COMMIT_SHA}, with no luck.

Solution
Modify your parameters block as follows:
parameters {
    string(name: 'COMMIT_SHA', defaultValue: '', description: '[Required] Short SHA of the commit used to build the image')
}
You should then be able to reference it using ${params.COMMIT_SHA}, ${COMMIT_SHA}, or ${env.COMMIT_SHA}.
Your syntax is for the Job DSL plugin, but you stated you are using a pipeline. If you are, in fact, using a freestyle job to create other jobs instead of a pipeline, please provide the entire script and I will update my answer. However, your post appears to want to set the parameter for a pipeline, not specify the metadata for the creation of a separate job via the Job DSL plugin.
The following Job DSL script, when executed, will create a job called "example" with a string parameter. The parameter won't be added to the job that contains the Job DSL script:
job('example') {
    parameters {
        stringParam('myParameterName', 'my default stringParam value', 'my description')
    }
}
In other words, there is nothing to print in the job that contains the stringParam code, because that code configures a different job to contain a string parameter; it doesn't add that parameter to the current job.
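For reference, a minimal declarative pipeline that declares the parameter on the pipeline itself (rather than configuring another job via Job DSL) might look like this sketch:

```groovy
// Sketch: the parameters directive here belongs to this pipeline itself,
// unlike the Job DSL stringParam, which configures a different job.
pipeline {
    agent any
    parameters {
        string(name: 'COMMIT_SHA', defaultValue: '', description: '[Required] Short SHA of the commit used to build the image')
    }
    stages {
        stage('Print') {
            steps {
                echo "The commit sha val is: ${params.COMMIT_SHA}"
            }
        }
    }
}
```

Note that the parameter only appears in the "Build with Parameters" UI after the pipeline has run at least once, since Jenkins learns the parameters from the Jenkinsfile.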

Related

Pass parameters from KEY=VALUE properties file to downstream Jenkins job

In my declarative pipeline, I do have the below line to call the downstream job.
build job: 'my_downstream_job'
I have a file in KEY=VALUE format; how do I pass in the parameters from this properties file? The downstream job receives each parameter as KEY. Using the Jenkins GUI, I use "Parameters from properties file", put the filename in there, and it works; I'd like to know how to do the same with a pipeline.
You might need some extra logic to process this file and construct the appropriate list. See https://jenkins.io/doc/pipeline/steps/pipeline-build-step/ for documentation, but typically it will look like:
build(job: "my_downstream_job",
    parameters: [
        new StringParameterValue('MY_NAME', myname_var),
    ],
    propagate: true
)
So you might need to parse your file and create a list of new StringParameterValue objects, one for each line.
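A sketch of that parsing step could look like the following (the file name params.properties is an assumption; adjust it to your setup):

```groovy
// Sketch: build a parameter list from a KEY=VALUE file.
// 'params.properties' is an assumed file name.
import hudson.model.StringParameterValue

def lines = readFile('params.properties').readLines()
def paramList = lines.findAll { it.contains('=') }.collect { line ->
    def (key, value) = line.split('=', 2)  // split on the first '=' only
    new StringParameterValue(key.trim(), value.trim())
}
build(job: 'my_downstream_job', parameters: paramList, propagate: true)
```

Depending on your sandbox settings, constructing StringParameterValue directly may require script approval.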
It doesn't appear that there is a one-liner way to do this, but I got this to work:
params = readProperties file: "params.properties"
build job: 'some_jenkins_job', parameters: [
    string(name: 'PROP_STR', value: params.PROP_STR),
    booleanParam(name: 'PROP_BOOL', value: params.PROP_BOOL),
]
And maybe this is for the best, so it's clear exactly which params are being passed in, just by reading the Jenkinsfile.
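If you do want to forward every entry without listing each one, a hedged sketch (all values passed as strings; readProperties comes from the Pipeline Utility Steps plugin) could combine readProperties with collect:

```groovy
// Sketch: forward every property in the file as a string parameter.
// Requires the Pipeline Utility Steps plugin for readProperties.
def props = readProperties file: 'params.properties'
build job: 'some_jenkins_job',
      parameters: props.collect { k, v -> string(name: k, value: v) }
```

The trade-off is exactly the one noted above: the explicit list documents the interface between the jobs, while the dynamic version silently forwards whatever the file contains.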

Can manually triggered jobs of a pipeline take user input for parameters?

I have two jobs (Job1 & Job2). Both are parameterized with the same parameters, but the parameter values differ; they are defined using the Active-Choice-Parameter (uno) plugin.
I wish to run both jobs in a pipeline; however, below is the exact requirement.
When the pipeline is executed, Job1 executes and prompts the user to enter parameters (UI). The user enters/selects the values and triggers the build.
Once the build of Job1 completes, the user is prompted for approval to proceed to Job2. The user approves by clicking the "OK/Proceed" button, and thereby Job2 of the pipeline gets triggered.
Note: I have achieved this using "input" feature of Groovy Script.
The parameter values of Job1 should be passed to and show up in Job2; however, the user should be able to see and modify the passed values for any parameter in Job2 (UI).
Note: I'm able to pass the parameter values using "Parameterized Trigger Plugin" on "Post-Build-Actions" of Job1
Problem statement:
Running the pipeline does not show the user a parameter screen (UI) for either Job1 or Job2, so the user cannot enter/select or change the parameters for either job during the pipeline run.
Note:
I'm able to overcome the problem statement by using the Build Pipeline Plugin, but the reasons I do not wish to consider this solution are:
I don't know how I can inject the Groovy pipeline script's input element, which prompts for approval between jobs.
I have read that using the Pipeline plugin has advantages over using the Build Pipeline Plugin.
Below is the Groovy script (pipeline script):
pipeline {
    agent any // agent specifies where the pipeline will execute
    stages {
        stage("build PROD") { // an arbitrary stage name
            steps {
                build 'job1' // this is where we specify which job to invoke
            }
        }
        stage("build DR") { // an arbitrary stage name
            input {
                message "Press Ok to continue"
                submitter "user1,user2"
                parameters {
                    string(name: 'username', defaultValue: 'user', description: 'Username of the user pressing Ok')
                }
            }
            steps {
                echo "User: ${username} said Ok."
                build 'job2' // this is where we specify which job to invoke
            }
        }
    }
}
Any solution would be of great help. Thanks.
Is there a reason you are keeping the jobs separate? I would re-evaluate your job flow and see if it makes more sense to merge the jobs into one pipeline.
You could simply use the parameters directive: https://jenkins.io/doc/book/pipeline/syntax/#parameters
Then you have the default user interface, which is simpler than your custom Groovy code.
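Merging both jobs into a single pipeline could look roughly like this sketch (stage names, the ENV_NAME parameter, and the echoed steps are placeholders for the real work of Job1 and Job2):

```groovy
// Sketch: one pipeline replacing Job1 and Job2.
// The parameters directive gives the standard "Build with Parameters" UI,
// and the input step provides the approval gate between the two stages.
pipeline {
    agent any
    parameters {
        string(name: 'ENV_NAME', defaultValue: 'dev', description: 'Placeholder parameter shared by both stages')
    }
    stages {
        stage('Job1 work') {
            steps {
                echo "Running Job1 steps with ${params.ENV_NAME}"
            }
        }
        stage('Job2 work') {
            input {
                message "Press Ok to continue"
                submitter "user1,user2"
            }
            steps {
                echo "Running Job2 steps with ${params.ENV_NAME}"
            }
        }
    }
}
```

One caveat of this approach: the parameters are entered once up front rather than re-prompted between the stages, so if the user must be able to modify values before the second stage, an input step with its own parameters block is still needed there.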

Jenkins: update parameters via Groovy script

I have a text file on server, e.g. /var/lib/jenkins/.../myChoices.txt
FirstChoice,SecondChoice
As the file will be updated from time to time, I want the script to update the parameters every time I click "Build with Parameters".
But my code only works when I build the job, i.e. it is not updating in real time.
def getMyChoices() {
    List<String> choices = Arrays.asList(readFileFromWorkspace('/var/lib/jenkins/.../myChoices.txt').split(','))
    return choices
}

job(jobName) {
    description("Deploy something based on choice.")
    parameters {
        ...
        ...
        choiceParam('EB_ACTIVE_ENV_NAME', getMyChoices(), '')
    }
}
I do not want to use the Hudson plugin either, for vulnerability reasons.
Groovy scripts are executed only when the job is run; hence, until the job is run, the parameters will not be refreshed.
The only solution available is to run this job at regular intervals with an additional flag to refresh the parameters alone and then exit.
This way, whenever you click "Build with Parameters", you will have the latest parameters that exist in the file.
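A sketch of that refresh-and-exit idea could look like the following (the REFRESH_ONLY flag and the seed script path are assumptions, not part of the original setup):

```groovy
// Sketch: a job that, when run with REFRESH_ONLY=true, only regenerates
// its own definition (and thus its choices) from the file, then exits.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'REFRESH_ONLY', defaultValue: false, description: 'Regenerate parameters from myChoices.txt and exit')
        choice(name: 'EB_ACTIVE_ENV_NAME', choices: ['placeholder'], description: 'Refreshed from file on the next regeneration')
    }
    stages {
        stage('Refresh parameters') {
            when { expression { params.REFRESH_ONLY } }
            steps {
                // Re-run the Job DSL seed script that reads myChoices.txt;
                // 'seed/job.groovy' is an assumed path.
                jobDsl targets: 'seed/job.groovy'
            }
        }
        stage('Deploy') {
            when { expression { !params.REFRESH_ONLY } }
            steps {
                echo "Deploying to ${params.EB_ACTIVE_ENV_NAME}"
            }
        }
    }
}
```

Scheduling this with a cron trigger and REFRESH_ONLY checked keeps the choices reasonably close to the file's contents, at the cost of the refresh lagging by one interval.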
It is required to regenerate the job in order to refresh the parameters.
What I would do is create a job that regenerates the jobs with the jobDsl step whenever there is a change in the repository where myChoices.txt is versioned.
Here is an example of the use of jobDsl:
jobDsl removedJobAction: 'DELETE',
    removedViewAction: 'DELETE',
    targets: targetFile,
    unstableOnDeprecation: true,
    additionalParameters: [
        pipelineJobs: arrFiles,
        props: [
            basePath: destination,
            gitRemoteUrl: config.gitRemoteUrl,
            gitConfigJenkinsBranch: config.gitConfigJenkinsBranch,
            localPath: config.localPath ?: ''
        ]
    ]
I use it with a shared library I created that lets me abstract Job DSL away and write only pipeline DSL: https://github.com/SAP/jenkins-pipelayer/. There are restrictions to this lib, though; because I parse the pipeline DSL, getMyChoices() would not be evaluated in the current version of the lib.

Pass large amount of parameters between jobs

I have two Jenkins jobs that run on separate computers. On computer 1 I read a properties file and use it for environment variables. But I need the same file on PC 2, and it only exists on the first one. When the first Jenkins job finishes, it starts the second one and can pass a parameters file to it, but I would have to create a separate parameter with the Parameterized Trigger Plugin for each entry, and I have a lot of them and don't want to do so. Is there a simple solution for this issue?
Forget Jenkins 1 and the Parameterized Trigger Plugin. Using Jenkins 2, here's an example of what you need:
node("pc1") {
    stage "step1"
    stash name: "app", includes: "properties_dir/*"
}
node("pc2") {
    stage "step2"
    dir("dir_to_unstash") {
        unstash "app"
    }
}
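On the second node, you could then load the properties back into the environment, for example with readProperties from the Pipeline Utility Steps plugin (the file name app.properties inside the stashed directory is an assumption):

```groovy
// Sketch: after unstashing on pc2, read the properties file back in
// and expose its entries as environment variables for the inner steps.
node("pc2") {
    stage "step2"
    dir("dir_to_unstash") {
        unstash "app"
        // readProperties comes from the Pipeline Utility Steps plugin;
        // 'properties_dir/app.properties' is an assumed file name.
        def props = readProperties file: 'properties_dir/app.properties'
        withEnv(props.collect { k, v -> "${k}=${v}" }) {
            echo "MY_VAR is ${env.MY_VAR ?: 'unset'}"
        }
    }
}
```

This sidesteps per-parameter wiring entirely: the whole file travels with the stash, and every key becomes available in one withEnv block.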

Cannot get build parameters in Jenkins Pipeline Job

I'm trying to get some build parameters in a Jenkins pipeline job. In this context, the parameters are defined via the "This project is parameterized" checkbox and passed at build time.
In the job I call two branches:
parallel firstBranch: {
    build job: 'Run Blah', parameters: [string(name: 'BLAH', value: '$app.blah.blah')]
}, secondBranch: {
    build job: 'Run BlahBlah', parameters: [string(name: 'BLAH', value: '$app.blah.blah')]
}
I've tried accessing the build parameter app.blah.blah in these various ways:
${app.blah.blah}
$app.blah.blah
"${app.blah.blah}"
app.blah.blah
currentBuild.buildVariableResolver.resolve("app.blah.blah")
System.getenv("app.blah.blah")
I always get some exception that I can somewhat understand, but I'm starting to get very annoyed. It should not be this hard to get a build parameter in the script for God's sake. What am I doing wrong?
This is working for me:
println blah
So I guess it should be enough for you to do it like this:
parallel firstBranch: {
    build job: 'Run Blah', parameters: [string(name: 'BLAH', value: blah)]
}, secondBranch: {
    build job: 'Run BlahBlah', parameters: [string(name: 'BLAH', value: blah)]
}
Well, it looks like you can't have dots in your build parameter names! It confuses Groovy into thinking you're accessing a class. It's a shame that I won't be able to keep my parameters consistent across Ant scripts and Jenkins jobs, but it's not a huge deal right now. If anyone knows how to access dotted variables, please feel free to add input!
Correct syntax for a build parameter name: alphanumeric_underscore_only
Correct syntax for accessing it: println(alphanumeric_underscore_only)
If your pipeline has a parameter with a dot inside its name, you can try to get the value of that parameter by using getProperty(), for example (tested):
//println params.MyApp.Api.Server // will not work
//println "${MyApp.Api.Server}" // will not work
println getProperty('MyApp.Api.Server') //works perfectly
