Parameters are not passing properly to downstream job - Jenkins

I am trying to pass parameter values from an upstream job to a downstream job, but only one of the parameters is passed with its exact value; the other is not.
def newJob = build job: 'downstreamJob', propagate: false,
    parameters: [
        [$class: 'ChoiceParameterValue', name: 'PARAM1', value: "${params.PARAM1}"],
        [$class: 'CascadeChoiceParameterValue', name: 'PARAM2', value: "${params.PARAM2}"]
    ]
My upstream job has the same parameter names as the downstream job, PARAM1 and PARAM2, but my downstream job uses different parameter types: an Active Choice parameter for PARAM1 and an Active Choice Reference parameter for PARAM2. When I run the pipeline, PARAM1 is passed properly, but PARAM2 ends up with the value from its fallback script. Is there an error in my definition of the downstream job call, or is there another way to define it?
I also tried string for the parameter class, but no luck.

I generated the call with the Pipeline Syntax snippet generator for my job and observed that there is an issue with Jenkins fetching the CascadeChoiceParameter in the downstream job. So I changed PARAM2 to an Active Choice parameter and it's working now.
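For reference, a minimal sketch of how the upstream call could look after that change, passing both values as plain string parameters (whether this is accepted can depend on the Active Choices plugin version; the job and parameter names are taken from the question):

// Sketch: after PARAM2 is a plain Active Choice parameter in the downstream job,
// both values can usually be handed over as simple string parameters.
def newJob = build job: 'downstreamJob',
    propagate: false,
    parameters: [
        string(name: 'PARAM1', value: params.PARAM1),
        string(name: 'PARAM2', value: params.PARAM2)
    ]
echo "Downstream result: ${newJob.result}"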

Related

post build action with parameters

I have two parameterized pipelines, A and B.
Project A executes project B as a post-build action.
I'm using "Predefined parameters" to pass parameters to project B, but it seems project B is using the default values and not the provided ones.
The passed parameter is a project A parameter.
Jenkins can get weird around parameters. If you are using a declarative pipeline, then define the parameters within the code instead of using the option on the Jenkins page:
build([
    job: 'B',
    wait: false,
    parameters: [
        string(name: 'process_id', value: "${id}")
    ]
])
And in pipeline B:
parameters {
    string(defaultValue: null, description: 'parameter', name: 'process_id')
}
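For completeness, a minimal declarative sketch of how pipeline B could declare and read the value (the stage name is illustrative):

// Minimal sketch for pipeline B: declare the parameter and read it via params.
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: 'parameter', name: 'process_id')
    }
    stages {
        stage('Use parameter') {
            steps {
                echo "Received process_id: ${params.process_id}"
            }
        }
    }
}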
If using freestyle jobs, the way you have defined the parameter is correct. If Jenkins is not using the correct parameter and instead is using some cached value, then try these steps:
Clone your downstream job
Rename your downstream job to something else
Rename the cloned downstream job to the correct name that should be used
Run the downstream job once
If the problem is that Jenkins caches the parameters used, this should fix it.

Jenkins - Setting parameters for a parameterized build

I have two Jenkins jobs - job A (located on server A) triggers job B (located on server B). These servers are internal to my company - we have it set up now that server A can trigger the job on server B. The job on server B requires parameters for it to build.
There are multiple build stages that get triggered in job B when a parameter is set to a particular value (when the job type is set to install). Since all these stages use the same parameters, I am trying to do it in such a way that I only need to set the parameters once instead of copying and pasting the code from the pipeline for each build stage.
I have tried looking online for a way to do this, but there doesn't seem to be much. All I have found is the following:
params.myParam = "some value"
and the following: How to access parameters in a Parameterized Build?
From the above-linked question, would I have to do the following in my JenkinsfilePreCommit (a Groovy file):
properties([
    parameters([
        booleanParam(
            defaultValue: false,
            description: 'isFoo should be false',
            name: 'isFoo'
        )
    ])
])
where booleanParam can be replaced with whatever type of parameter it is, defaultValue is the value I want to set it to, description is optional, and name is the name of the parameter in the Jenkins build, i.e. job_type?
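For illustration, a sketch of what that could look like with a job_type parameter added (the 'install'/'upgrade' choices are assumed for illustration only; older Jenkins versions expect choices as a newline-separated string rather than a list):

// Sketch: defining parameters from a Jenkinsfile (scripted pipeline).
// The job_type choices are assumed for illustration only.
properties([
    parameters([
        booleanParam(
            defaultValue: false,
            description: 'isFoo should be false',
            name: 'isFoo'
        ),
        choice(
            name: 'job_type',
            choices: ['install', 'upgrade'],
            description: 'Which kind of build to run'
        )
    ])
])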

Passing extended choice parameter value from one job to another remote job in Jenkins pipeline

I am working on a scripted Jenkins pipeline, and I am using the triggerRemoteJob plugin to trigger a remote job on another Jenkins instance.
The remote job has an extended choice parameter.
The syntax for passing parameters to the triggerRemoteJob plugin seems to differ from the build Job plugin.
What is the correct syntax to pass the value of an extended choice parameter while using the triggerRemoteJob plugin?
EDIT
Posted an answer below.
If there is a way to solve the issue in Jenkins pipeline, please post it as an answer.
As far as I know, there is no special class for those parameters. I've always used the String one, and it works as long as you introduce a valid option:
string(name: 'PARAM', value: "option"),
--- EDIT ----
I do it with this syntax:
build(job: 'my_job', parameters: [
    string(name: 'PARAMETER', value: 'value')
])
Referring to this issue thread on GitHub:
https://github.com/jenkinsci/coordinator-plugin/issues/46
It seems that because the Extended Choice parameter does not support the necessary interface in Jenkins, calling triggerRemoteJob with an Extended Choice parameter is not supported.
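If you still need to send a value to the remote job, one workaround (unverified; the exact syntax depends on the Parameterized Remote Trigger plugin version, and the Jenkins/job names below are placeholders) is to pass it as plain key=value text, which is how that plugin generally accepts parameters:

// Unverified sketch for the Parameterized Remote Trigger plugin: parameters are
// passed as plain KEY=value lines rather than a parameters: [...] list, so the
// extended choice value travels as an ordinary string.
// 'remote-jenkins' and 'remote_job' are placeholder names.
triggerRemoteJob(
    remoteJenkinsName: 'remote-jenkins',
    job: 'remote_job',
    parameters: 'EXT_CHOICE_PARAM=option1,option2'
)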

Jenkins: having problems passing environment variable for use in another job (maybe a bug)

I seem to have found a bug when trying to pass environment variables from one Jenkins job to another.
I have a Jenkins job which contains a PowerShell build step. My question is not about the PowerShell script, as that does exactly what I want (it goes to Artifactory, finds a list of all the builds and then gets the build number of the latest one). The script ends up with the Artifactory build number in a text string, '$LATEST_BUILD_NO_SLASH' (for clarity, this is not the Jenkins build number), which is eventually stored in an environment variable called 'LATEST_BUILD_NUM_VAL'.
This is definitely creating an environment variable with my value stored in it, as it can be seen in the 'Environment Variables' list.
This environment variable is passed in the standard way in the parameterized build step.
My issue is that when I use this environment variable in a downstream build, having passed it using 'LATEST_BUILD_NUM = ${LATEST_BUILD_NUM_VAL}', I get the literal string '${LATEST_BUILD_NUM_VAL}' as the value passed to the downstream job.
But if I pass a Jenkins-created environment variable, e.g. 'LATEST_BUILD_NUM = ${JOB_BASE_NAME}', I get the correct value in the downstream job.
I have spent all day banging my head around this and don't really know where to go from here. I seem to be creating the environment variable correctly, as it is in the environment variables list and it works if I use a standard environment variable. I have declared 'LATEST_BUILD_NUM' as a parameter in my downstream build.
Is there any other way of achieving what I am trying to do?
I have checked in the 'Jenkins Issues' log for issues with parameterised builds and I can't find anything similar to my issue.
In case it is of any relevance, the Jenkins Environment Injector plugin is v2.1.6 and the Parameterized Trigger plugin is v2.35.2.
This is easy to achieve in Jenkins Pipeline:
Your second job (JobB) is called from your first job (JobA) as a downstream job. Thus somewhere (probably at the end of your JobA pipeline) you will have:
build job: 'CloudbeeFolder1/Path/To/JobB', propagate: false, wait: false, parameters: [
    [$class: 'StringParameterValue', name: 'MY_PARAM', value: "${env.SOME_VALUE}"]
]
Then in JobB on the "other side" you have:
environment {
    PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
}
This gets the value of your parameter into an environment variable in JobB.
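Putting the JobB side together, a minimal declarative sketch (the parameter name MY_PARAM matches the build call above):

// Minimal sketch for JobB: declare the parameter and map it to an environment
// variable so later steps can use it.
pipeline {
    agent any
    parameters {
        string(name: 'MY_PARAM', defaultValue: '', description: 'Value passed from JobA')
    }
    environment {
        PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
    }
    stages {
        stage('Show') {
            steps {
                echo "PARAM_FROM_PIPELINE = ${env.PARAM_FROM_PIPELINE}"
            }
        }
    }
}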
In the first job, under post-build actions, select 'Trigger parameterized build on other projects'.
In that build step, give the name of the downstream job.
Select 'Add parameter' and add a 'Predefined parameters' entry; give the parameter in key=value format, e.g. Temp=${BUILD_ID}.
In the second job, select 'This project is parameterized', add a parameter (e.g. a string parameter) named Temp, and use it in a shell step or anywhere else as $Temp.

Ensure Jenkins pipeline uses the same node for downstream job

Case:
I have 3 machines (A, B, C) used as slaves, all sharing the same node label (e.g. 'build').
I have a pipeline which may trigger different downstream jobs, and I need to make sure that the job and all its downstream jobs use the same node (for sharing some files, etc.). How can I do that?
a) I can pass the node label to the downstream job, but I am not sure the downstream job will take the same node (the parent job uses slave 'A' and I pass the node label 'build' to the downstream job, but the downstream job might take slave 'B').
b) Is there some way to get the slave the pipeline is running on at runtime, so that I can pass this slave name to the downstream job?
Or is there any better way to do that?
I advise you to try the NodeLabel Parameter Plugin.
Once it is installed, check the 'This project is parameterized' option and select 'Node' from the 'Add Parameter' drop-down.
It will populate all nodes in a drop-down when you build the job with parameters.
It also has some other options which may help you.
The most important question to me would be: why do they need to run on the very same node?
Anyway, one way to achieve this would be to retrieve the name of the node inside the node block of the first pipeline, like this (CAUTION: I was not able to verify the code written below):
// Code for upstream job
@NonCPS
def getNodeName(def context) {
    context.toComputer().name
}

def nodeName = 'undefined'
node('build') {
    // getContext(hudson.FilePath) returns the current workspace, whose
    // toComputer() gives the node this block is running on
    nodeName = getNodeName(getContext(hudson.FilePath))
}
build job: 'downstream', parameters: [string(name: 'nodeName', value: nodeName)]
In the downstream job you use that string parameter as input to your node block; of course, you should make sure that the downstream job actually is parameterized in the first place, with a string parameter named nodeName:
node(nodeName) {
// do some stuff
}
Even with static agents, workspaces are eventually cleaned up, so don't rely on the existence of files in the workspace between your builds.
Just archive whatever you need in the upstream job (using the archive step) and then use the Copy Artifact plugin in the downstream jobs to get what you need there. You'll probably need to parameterize the downstream jobs to pass them a reference to the upstream artifact(s) you need (there are plenty of selectors available in the Copy Artifact plugin that you can play with to achieve what you want).
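A rough sketch of that flow (the job names, artifact pattern, and the UPSTREAM_BUILD parameter are placeholders):

// Upstream job: archive what the downstream job needs and pass a reference.
archiveArtifacts artifacts: 'build/output/**', fingerprint: true
build job: 'downstream', parameters: [
    string(name: 'UPSTREAM_BUILD', value: "${env.BUILD_NUMBER}")
]

// Downstream job: fetch the archived files with the Copy Artifact plugin,
// selecting the exact upstream build that triggered it.
copyArtifacts(
    projectName: 'upstream',
    selector: specific("${params.UPSTREAM_BUILD}"),
    filter: 'build/output/**'
)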
If you are triggering child jobs manually from a pipeline, then you can use syntax like this to pass a specific node label:
build job: 'test_job', parameters: [[$class: 'LabelParameterValue', name: 'node', label: 'tester1']]
build job: 'test_job', parameters: [[$class: 'LabelParameterValue', name: 'node', label: 'tester2']]
You should be able to get the name of the current node with ${env.NODE_NAME}.
Found at: How to trigger a jenkins build on specific node using pipeline plugin?
Reference docs: https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
But yes, if you want to work with files from this job in other jobs, then you will need to use e.g. the Copy Artifact plugin mentioned above, because the workspaces of the jobs are independent and each job will have different content.
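Combining the snippets above, the upstream pipeline can pin the downstream job to whatever node it is currently running on, assuming the downstream job has a 'node' label parameter as in the example:

// Sketch: pass the node the upstream build is running on (env.NODE_NAME) as the
// downstream job's label parameter, so both builds land on the same machine.
node('build') {
    build job: 'test_job', parameters: [
        [$class: 'LabelParameterValue', name: 'node', label: env.NODE_NAME]
    ]
}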
