I want to pass $CHANGES from my upstream project to a downstream project.
I looked at "How to pass ${CHANGES} to downstream job?", which did not work for me. The All Changes Plugin does not put the changes in an environment variable, so I can't access them in the downstream job (or maybe I just don't know the correct environment variable it uses).
The method of fetching the changes from the parent job's URL and parsing the XML also does not work, because it would be hard to correlate which parent build number triggered this downstream build.
Is there something else that I can try?
The Parameterized Trigger plugin allows you to pass variables to a downstream job.
Click "Add Parameters" dropdown under "Trigger parameterized build"
Select "Predefined Parameter"
Type CHANGES=${CHANGES}
The left side of = is the variable that will be injected into the child job.
The right side of = is the value from the current build.
Provided that you have ${CHANGES} as an environment variable in the current build, it will pass the same value into the child build. You can change the left-side variable name to avoid any conflicts.
Note: since version 2.23 of the plugin, the left-side variable has to exist as a parameter in the child job. You need to define an empty "Text" parameter called CHANGES (or whatever your left-side name is) in the child job's configuration.
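If both jobs are Pipeline jobs rather than freestyle jobs, roughly the same handoff can be done with the build step. This is only a sketch: the child job name is a placeholder, and it assumes CHANGES really is available as an environment variable in the upstream build.
// Upstream Pipeline step: pass the upstream CHANGES value to the child job's CHANGES parameter
build job: 'child-job', parameters: [string(name: 'CHANGES', value: env.CHANGES ?: '')]
As with the plugin, the child job still needs a CHANGES parameter defined for the value to arrive.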
Related
I have a parameterized Jenkins job with one parameter; the job accepts webhooks to kick off builds.
I use the parameter in the branch specifier, and it works except for one use case.
The parameter is called a, the branch specifier in the job configuration is defined as refs/${a}, and the default value for a is set to heads/master.
This all works, but as soon as I kick off the job manually (passing in the default parameter value), webhooks no longer trigger the job.
Any ideas?
I seem to have found a bug when trying to pass environment variables from one Jenkins job to another.
I have a Jenkins job which contains a PowerShell build step. My question is not about the PowerShell script, as that does exactly what I want (it goes to Artifactory, finds a list of all the builds, and then gets the build number of the latest one). The script ends up with the Artifactory build number as a text string '$LATEST_BUILD_NO_SLASH' (for clarity, this is not the Jenkins build number). This is eventually stored in an environment variable called 'LATEST_BUILD_NUM_VAL'.
This is definitely creating an environment variable with my value stored in it, as it can be seen in the 'Environment Variables' list.
This environment variable is passed in the standard way in the parameterized build step.
My issue is that when I use this environment variable in a downstream build, having passed it using 'LATEST_BUILD_NUM = ${LATEST_BUILD_NUM_VAL}', the downstream job receives the literal string '${LATEST_BUILD_NUM_VAL}' rather than its value:
But if I pass a Jenkins-created environment variable, e.g. 'LATEST_BUILD_NUM = ${JOB_BASE_NAME}', I get the correct value in the downstream job:
I have spent all day banging my head around this and don't really know where to go from here. I seem to be creating the environment variable correctly, as it is in the environment variables list and it works if I use a standard environment variable. I have declared 'LATEST_BUILD_NUM' as a parameter in my downstream build.
Is there any other way of achieving what I am trying to do?
I have checked the 'Jenkins Issues' log for issues with parameterized builds and I can't find anything similar to mine.
In case it is of any relevance, the Jenkins Environment Injector plugin is v2.1.6 and the Parameterized Trigger plugin is v2.35.2.
This is easy to achieve in Jenkins Pipeline:
Your second job (JobB) is called from your first job (JobA) as a downstream job. Thus somewhere (probably at the end of your JobA pipeline) you will have:
build job: 'CloudbeeFolder1/Path/To/JobB', propagate: false, wait: false, parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: "${env.SOME_VALUE}"]]
Then in JobB on the "other side" you have:
environment {
    PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
}
This gets the value of your parameter into an environment variable in JobB.
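For context, a minimal JobB Jenkinsfile could look like the sketch below. The parameter declaration, stage name, and echo step are illustrative assumptions; the environment block simply mirrors the incoming parameter as shown above.
// Hypothetical JobB Jenkinsfile (declarative Pipeline)
pipeline {
    agent any
    parameters {
        string(name: 'MY_PARAM', defaultValue: '', description: 'Value passed from JobA')
    }
    environment {
        PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
    }
    stages {
        stage('Use value') {
            steps {
                echo "PARAM_FROM_PIPELINE is ${env.PARAM_FROM_PIPELINE}"
            }
        }
    }
}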
In the first job, under post-build actions, select "Trigger parameterized build on other projects".
In that trigger, give the name of the downstream job.
Select "Add parameters" and add a "Predefined parameters" entry in key=value format, e.g. Temp=${BUILD_ID}.
In the second job, tick "This project is parameterized", add (for example) a string parameter named Temp, and use it in a shell step or anywhere else as $Temp.
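If the second job were a Pipeline job instead, the same parameter could be declared and read like the sketch below (only an illustration; the echo and shell lines are placeholders). String parameters are available both as params.Temp and, in shell steps, as the $Temp environment variable.
// Downstream Pipeline sketch (scripted syntax): declare the Temp parameter, then read it
properties([parameters([string(name: 'Temp', defaultValue: '', description: 'BUILD_ID passed from the upstream job')])])
node {
    echo "Upstream BUILD_ID (via params): ${params.Temp}"
    sh 'echo "Upstream BUILD_ID (via env): $Temp"'
}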
I have an upstream job (a MultiJob) which takes a string parameter called freshORrerun, whose value is the string "fresh" or "rerun". I need to pass it on to the downstream (standalone build) jobs so they can check whether the value is "fresh" or "rerun". Based on that, the child jobs will trigger either a complete test run (pybot) or a rerun (rebot) of failed tests.
I have attached screenshots of how I have configured this. When I print the passed string in the child job, it is empty.
Overall Job configuration.
Multi Job phase config and child Jobs
I have a large number of Robot tests, and running them all takes a lot of time. I need a way to run only the failures of the previous run, so that I get a quick picture of how many got fixed. Could someone please help me with this?
Click the 'Add parameters' button, select 'Predefined parameters' and add freshORrerun=${freshORrerun} to the list.
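Once the value arrives, the child job can branch on it. The sketch below only illustrates that branching (scripted Pipeline syntax); the two shell scripts are placeholders for your actual pybot and rebot invocations.
// Hypothetical child-job step: choose a full run or a rerun based on the passed parameter
if (params.freshORrerun == 'fresh') {
    sh './run_all_tests.sh'       // placeholder for the complete pybot run
} else {
    sh './rerun_failed_tests.sh'  // placeholder for the rebot rerun of failed tests
}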
You can do it using the Parameterized Trigger plugin, which gives you options to pass parent job parameters to the child job.
Note: for this, you have to create the parameters in the child job as well. These parameters will be overwritten.
plugin link
There is a job parameterized with Active Choices parameters using the Active Choices Plugin.
I want to trigger this job from the upstream job.
The upstream job should use the default parameters of the downstream job.
The parameter UtilityPath depends on UtilityVersion to evaluate itself and to form the list of choices.
How can I
Get the list of choices returned by the groovy script of UtilityVersion from the upstream job?
Supply my choice for UtilityVersion to the parameter UtilityPath, so it can generate its own list of choices for me (again, in the upstream job)?
Trigger the job with my choices for parameters UtilityVersion and UtilityPath?
Whatever your downstream job's parameter does in its Groovy script/code section, if you can put that into a Scriptler script (see the Jenkins Scriptler plugin), then you can call that Scriptler script from your upstream job's Build section (either an "Execute shell" or a "Run Groovy script" step) and pass it the same parameters you were passing in the downstream job. As you mentioned, you don't want to add the same downstream parameters to your upstream job because of the complexity. NOTE: see the conditional run plugin for how to call the Scriptler script in the Build section only when needed, e.g. depending on whether you are dealing with TFS vs. ProjectC vs. someAutomationD, or when parameterX is set to true (your call there).
It's much the same as what CSchulz mentioned, but a Scriptler script is better because you change the code in one place (the Scriptler Scripts section, on the left-hand side of the Jenkins home page) and then reuse it anywhere, either in parameters that support Groovy/Scriptler scripts or in the Build section. That way you avoid reading a downstream job's parameter values in some hacky way before the downstream job is even called (time changes everything sometimes), or doing something crazy with the Jenkins API that makes it all more complex.
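As an illustration only, a shared Scriptler script that both jobs could call might look like the sketch below. The script name, the UtilityVersion parameter binding, and the paths are all made-up assumptions.
// Hypothetical Scriptler script "utilityPathChoices.groovy", defined with a UtilityVersion parameter.
// It returns the same list of choices that the downstream job's UtilityPath parameter would build,
// so the upstream job can evaluate it without duplicating the logic.
def basePath = '/opt/utilities'   // made-up install location
return ["${basePath}/${UtilityVersion}/bin".toString(),
        "${basePath}/${UtilityVersion}/lib".toString()]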
In my experience, you cannot trigger upstream/downstream jobs with the Active Choices Plugin. Active Choices and Reactive parameters are only evaluated when you trigger the job manually. For instance, if you trigger the build from Bitbucket, the Active Choices parameter gets its value but the Reactive parameter's value will be empty.
But you can achieve this in different ways.
If you are triggering the first job manually (by yourself), set the downstream job's parameters as strings so you can read those values directly.
The second option is to use environment variables. Active Choices is essentially a conditional choice parameter: you can write a Groovy script to set parameters as environment variables. This can be achieved with the EnvInject Plugin. Write your conditional script in Groovy, and the parameters are then available in every build step.
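For the EnvInject route, the plugin can evaluate a Groovy script whose returned map is injected as environment variables. A minimal sketch, with made-up names and values:
// EnvInject "Evaluated Groovy script": the returned map becomes environment variables for the build
def version = '1.2.3'   // compute this conditionally here instead of hard-coding it
return [UtilityVersion: version,
        UtilityPath   : "/opt/utilities/${version}".toString()]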
Use environment variables to pass parameters to the downstream job.
I am triggering a job (a child job) on 'Server B' from a job (a parent job) on 'Server A' through a Python script. I have 2-3 parent jobs, so I want to know which parent job triggered the child job. How can I find out which parent job triggered it?
Can I pass the parent job's name to the child job?
Or
Can I get the parent's name directly from the child job (via an environment variable or a Python script)?
Every build has an environment variable JOB_NAME. You can pass this as a string parameter to your child job.
The following description is provided in /env-vars.html:
JOB_NAME
Name of the project of this build, such as "foo" or "foo/bar". (To strip off folder paths from a Bourne shell script, try: ${JOB_NAME##*/})
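In a Pipeline parent, for instance, passing it along could look roughly like the line below (the child job name and the PARENT_JOB_NAME parameter are placeholders):
// Parent job: hand its own JOB_NAME to the child as a string parameter
build job: 'child-job', parameters: [string(name: 'PARENT_JOB_NAME', value: env.JOB_NAME)]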
Passing the job name as an environment variable, as mentioned by OltzU, may be the way to go, but that depends on how you are triggering the child job. If you are directly triggering the child job from the parent job using a post-build step, you can use something like the Parameterized Remote Trigger plugin to pass the job name along. If you are using a script in the parent job to fire off the child job, you can pass the job name along as a parameter.
If you can't pass the triggering job as a parameter, you can programmatically get the build trigger(s) using Groovy. Groovy is really the only language that fully integrates with the Jenkins API, so if you want to use another language (Python), you are stuck with the REST API or the Jenkins CLI, which are both limited in what they can give you (e.g. neither can give you the triggering job, to my knowledge).
If you want to use Groovy to get the triggering job, you will need the Groovy Plugin, which you will run as a build step in your child job. Here's a snippet of code to get the chain of upstream jobs that triggered your build. You may need to modify the code depending on the type of trigger that is used.
// Returns the names of the upstream projects whose builds triggered this one
def getUpstreamProjectTriggers(causes) {
    def upstreamCauses = []
    for (cause in causes) {
        if (cause.class.toString().contains("UpstreamCause")) {
            upstreamCauses.add(cause.getUpstreamProject())
        }
    }
    return upstreamCauses
}
getUpstreamProjectTriggers(build.getCauses())
From here, if you want to use the triggering job in, say, a Python script, you would need to use Groovy to set the triggering job in an environment variable. This SO thread gives more info on that, or you can skip to my answer in that thread to see how I do it.
In the child Jenkinsfile this Groovy code will get the name of the triggering job:
String getTriggeringProjectName() {
    if (currentBuild.upstreamBuilds) {
        return currentBuild.upstreamBuilds[0].projectName
    } else {
        return ""
    }
}
currentBuild.upstreamBuilds is a list of RunWrapper objects
You could use an additional parameter for your child job.