I have a MultiJob project A, and it has some downstream jobs (subA1, subA2, ...). Each downstream job has its own parameters.
When I run job A, all of job A's parameters are passed to the downstream jobs, but the downstream jobs' own parameters (subA1, subA2, ...) do not appear.
How can I make subA1, subA2, ... get their own parameters?
I am currently running two Jenkins jobs, where one job (Job A) calls another job (Job B) via "when builds promote". Once the promotion is approved manually, Job B gets triggered. After this step, I would like to get the status of Job B in Job A.
If Job B fails, then Job A should fail, and vice versa. Any help is appreciated!
You are missing the fundamentals of how Jenkins works. If an upstream job (job 1) is successful and the downstream job (job 2) fails, Jenkins is doing its job correctly by showing the actual statuses of the jobs (i.e., job 1 is green and job 2 is red).
If the upstream job fails then the downstream should never kick off in the first place, so you won't have to worry about that. This is how Jenkins was designed to work.
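That said, if the upstream job were a Pipeline job, mirroring Job B's result in Job A is straightforward; a minimal sketch (JobB is assumed to be the downstream job's name):

// Hypothetical Pipeline version of the upstream job (Job A)
pipeline {
    agent any
    stages {
        stage('Trigger Job B') {
            steps {
                // propagate: true (the default) makes this build fail if JobB fails;
                // wait: true blocks until JobB completes
                build job: 'JobB', propagate: true, wait: true
            }
        }
    }
}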
I have a freestyle job in Jenkins and would normally pass parameters to another freestyle job using the Predefined parameters option.
example:
PROJECT=Myproject
PATH=/depot/workspace/
Previously, I could access the above values by their keys in the downstream job through the environment, using ${PROJECT} or ${PATH}.
My problem now is that I have a pipeline job that needs to access the above values, but using ${PROJECT} or ${PATH} does not work.
So, in general, I want the freestyle job to run first and pass its parameters to the downstream pipeline job.
You might need to use "${params.PROJECT}" in your pipeline to access the parameters.
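For ${params.PROJECT} to resolve, the pipeline job generally also has to declare the parameters it receives; a minimal sketch of the downstream Jenkinsfile, assuming a string parameter named PROJECT:

// Downstream pipeline job declaring the parameter passed by the upstream freestyle job
pipeline {
    agent any
    parameters {
        // Filled in by the upstream job's "Predefined parameters" entry (PROJECT=Myproject)
        string(name: 'PROJECT', defaultValue: '', description: 'Set by the upstream job')
    }
    stages {
        stage('Use upstream parameter') {
            steps {
                echo "Building project: ${params.PROJECT}"
            }
        }
    }
}

Note that a name other than PATH (e.g. DEPOT_PATH) may save trouble, since PATH shadows the system environment variable.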
I am using a parametrized job to trigger the pipeline jobs job1, job2 and job3. My intention is that by default job1, job2 and job3 should run on node "A", and whenever I run the parametrized job and select node "B" manually, all downstream jobs, i.e. job1, job2 and job3, should run on node "B".
I used the NodeLabel Parameter plugin, but only the parent job runs on the selected node; the downstream jobs are not triggered on the node selected in the parent job.
Make a node parameter (say, node) for job1, job2 and job3, and use this parameter for the node label. Make nodeB the default for this parameter. When you start the jobs from the parametrized job, set the parameter to nodeA as seen below:
build job: 'job1', parameters: [[$class: 'StringParameterValue', name: 'node', value: 'nodeA']]
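A fuller sketch of the parent pipeline, assuming each downstream job declares a NodeLabel parameter named node (both the parameter name and the label values are assumptions):

// Forward the node label chosen in the parent job to every downstream job
def chosenNode = params.node   // e.g. 'nodeA' or 'nodeB'
for (jobName in ['job1', 'job2', 'job3']) {
    // LabelParameterValue is provided by the NodeLabel Parameter plugin
    build job: jobName,
          parameters: [[$class: 'LabelParameterValue', name: 'node', label: chosenNode]]
}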
I have a question about downstream jobs in Jenkins.
I want to directly use the value of a predefined parameter from the upstream job in the Git parameter of a downstream job.
How is this done?
Yes, just add the parameters to the downstream job, and it will do the trick.
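For instance, assuming the downstream job declares its Git parameter as BRANCH (a hypothetical name) and the upstream job has a predefined parameter of the same name, an upstream pipeline could forward it like this:

// Upstream job: pass the predefined parameter through to the downstream Git parameter
build job: 'downstream-job', parameters: [string(name: 'BRANCH', value: params.BRANCH)]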
I have created a wrapper job in Jenkins that gets triggered every hour if there are any new commits in my Git repository. This wrapper job in turn calls 6 other downstream jobs. So the structure of my wrapper job (W) is like this:
W -> A -> B -> C -> D -> E -> F
I am using the Jenkins Parameterized Trigger plugin to stitch one job to the next so that my upstream jobs fail if the downstream job fails. Upon completion of the last downstream job (F), the wrapper job (W) copies the artifacts from all the downstream jobs into its current workspace.
Now when one of my downstream jobs (let's say E) fails, I get failure notifications from the failed downstream job (E) as well as from all the upstream jobs (D, C, B, A and W). So I get 6 mails in total, which creates some noise.
If I activate the email notification on only the wrapper job (W), then I get a single failure notification saying that Job A has failed. Then I have to check Job A's logs only to find out that it was Job B that failed, and continue the log checks until I reach Job E.
How can I customize the notification to send a single mail identifying the specific downstream job (in this case E) that caused the failure?
OR
Is there a better way to trigger the downstream jobs, wait for all the downstream jobs to get completed and copy the artifacts from all the downstream jobs to the trigger job?
I wrote a Groovy script in Groovy Postbuild that iterates through all the subprojects of the wrapper job and marks the wrapper job as a failure if any of the subprojects have failed.
I also changed the exit criteria in "Trigger/call builds on other projects" to never mark the job as failure/unstable. Instead, marking the job as a failure is handled in the Groovy script itself, based on the status of the downstream subprojects.
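A minimal sketch of such a Groovy Postbuild script (not the exact script used; it naively checks each subproject's last completed build, which assumes wrapper runs do not overlap):

import hudson.model.Result
import jenkins.model.Jenkins

def failed = []
// Check the last completed build of each subproject in the chain
['A', 'B', 'C', 'D', 'E', 'F'].each { name ->
    def job = Jenkins.instance.getItemByFullName(name)
    def lastBuild = job?.lastCompletedBuild
    if (lastBuild != null && lastBuild.result.isWorseThan(Result.SUCCESS)) {
        failed << "${name} #${lastBuild.number}: ${lastBuild.result}"
    }
}
if (failed) {
    // Name the failing subproject(s) in the wrapper's log, then fail the wrapper build
    manager.listener.logger.println('Failed subprojects: ' + failed.join(', '))
    manager.buildFailure()
}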