I would like to build a pipeline in Jenkins where the combined result of multiple upstream jobs triggers a single downstream job.
For example:
Job 1 --> Job 2 --\
      --> Job 3 ----> Job 5 --> Job 6
      --> Job 4 --/
Job 1: when a new piece of code is committed to the production branch in Git, it should trigger Jobs 2, 3 and 4 (this part I was able to get running using the 'Build other jobs' option in the Post-build Actions, although any suggestion to improve this is also greatly appreciated).
Jobs 2, 3 and 4 are regressions to be run on different test machines.
The part I am not able to figure out is how to trigger the downstream Job 5 only when Jobs 2, 3 and 4 are all successful; eventually Job 5 can trigger Job 6.
I am currently using the Build Pipeline plugin, but it only handles one upstream job fanning out to many downstream jobs, not many upstream jobs joining into a single downstream job.
Any help/suggestion/guidance is greatly appreciated. Thanks in advance!
Cheers!!
You can do this by using the 'Build after other projects are built' option in the 'Build Triggers' section of Job 5's configuration. There you add Job2, Job3 and Job4 as upstream dependencies and select the option 'Trigger only if build is stable'. This should do the job and will wait for the three jobs to finish.
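For reference, if Job 5 is itself a Pipeline job, the same trigger can be declared in its Jenkinsfile. This is just a sketch using the job names from the question; like the UI option, it fires when any one of the listed upstream jobs finishes successfully:
pipeline {
    agent any
    triggers {
        // equivalent of 'Build after other projects are built'
        upstream(upstreamProjects: 'Job2,Job3,Job4',
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Run') {
            steps {
                echo 'triggered by an upstream job'
            }
        }
    }
}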
But, as you said, this does not accomplish the goal of executing Job5 only when Jobs 2, 3 and 4 are all successful (Job5 gets executed even if one of them failed). I think the best solution for your case is to create a new job of type 'Pipeline' (let's call it PipelineJob). This way you do not have to edit every single job with its own configuration and dependencies, and you can tweak your pipeline's behaviour more easily. Also, thanks to error propagation, the pipeline fails if any phase fails. This should work as intended:
pipeline {
    agent any
    stages {
        stage('Phase 1') {
            steps {
                build 'Job1'
            }
        }
        stage('Phase 2') {
            parallel {
                stage('Job 2') {
                    steps {
                        build 'Job2'
                    }
                }
                stage('Job 3') {
                    steps {
                        build 'Job3'
                    }
                }
                stage('Job 4') {
                    steps {
                        build 'Job4'
                    }
                }
            }
        }
        stage('Phase 3') {
            steps {
                build 'Job5'
            }
        }
        stage('Phase 4') {
            steps {
                build 'Job6'
            }
        }
    }
}
In addition to #CMassa's answer (yes, this works - thanks for the answer), I found the Join plugin for Jenkins, which also works great for this scenario.
Related
How can I start a job if all 3 other jobs are successful?
Here is the scenario:
job1 - build module 1
job2 - build module 2
job3 - build module 3
job4 - main program
Run job4 (the main program) if all module builds (jobs 1, 2 and 3) are successful, so the main program can check out all 3 modules from 3 different GitHub repos and build the main program.
How can it be chained so it's automatic?
I want all 3 jobs to stay separate. Job 4 then just needs to check whether all 3 jobs were successful; if so, it will re-build all 3 modules inside job 4, or just pull the code for all 3 modules, and then proceed with other stuff. The only question is: how can I check the status of all 3 jobs within a Jenkinsfile?
One thing you can do, if jobs 1, 2 and 3 don't do much, is to put them in a single Jenkinsfile (or Groovy file) with a stage for each job; if all of those stages succeed, schedule job 4.
pipeline {
    agent any
    stages {
        stage('Job 1 Checkout') {
            steps { echo 'checkout module 1' } // placeholder
        }
        stage('Job 1 Build') {
            steps { echo 'build module 1' } // placeholder
        }
        stage('Job 2 Checkout') {
            steps { echo 'checkout module 2' } // placeholder
        }
        stage('Job 2 Build') {
            steps { echo 'build module 2' } // placeholder
        }
    }
}
and so on
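If the three jobs need to stay separate instead, a small orchestrator pipeline is another option (a sketch; the job names are taken from the question). The build step fails the calling pipeline whenever a downstream job fails, so the final stage is reached only when all three module builds succeeded:
pipeline {
    agent any
    stages {
        stage('Build modules') {
            parallel {
                // each build step fails this pipeline if its job fails
                stage('job1') { steps { build 'job1' } }
                stage('job2') { steps { build 'job2' } }
                stage('job3') { steps { build 'job3' } }
            }
        }
        stage('Main program') {
            steps {
                build 'job4' // reached only if all three modules built successfully
            }
        }
    }
}
If you need to inspect a job's status explicitly instead of failing fast, build job: 'job1', propagate: false returns a run object whose result field you can check.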
I have two pipeline jobs in Jenkins (one for build and another for deployment), and the deployment job has a choice parameter whose values (tags) need to come from Docker Hub.
What we want to do is update the choice field of the deploy job from the build job's post actions, without building it. Is that possible in any way?
I don't think that's possible at the moment. You need to run a build to update the parameters.
There are some tickets for this on the Jenkins Jira: JENKINS-52939 and JENKINS-50365. The latter is marked as "Fixed but unreleased", but it doesn't appear to be actually fixed yet.
One comment mentions a workaround. You can put this in your pipeline to update the parameters and then abort the build:
stages {
    stage("parameterizing") {
        steps {
            script {
                if ("${params.Invoke_Parameters}" == "Yes") {
                    currentBuild.result = 'ABORTED'
                    error('DRY RUN COMPLETED. JOB PARAMETERIZED.')
                }
            }
        }
    }
}
We are using Jenkins and the Pipeline plugin for CI/CD. We have two pipelines that we need to run in parallel, and there is a downstream pipeline which should trigger ONLY if both upstream pipelines finish and are successful.
P1 --\
      --> P3
P2 --/
Basically P3 should run only when P1 and P2 are finished and successful and not depend on just one of them.
Is there a way to achieve this? We are using version 2.5 of the plugin.
Since stages only run if previous stages run successfully, and since you can execute other pipelines via build, and since there is a magical instruction called parallel, I think this might do it:
pipeline {
    agent { label 'docker' }
    stages {
        stage("build_p1_and_p2_in_parallel") {
            steps {
                parallel p1: {
                    build 'p1'
                }, p2: {
                    build 'p2'
                }
            }
        }
        stage("build_p3_if_p1_and_p2_succeeded") {
            steps {
                build 'p3'
            }
        }
    }
}
Use the "Snippet Generator" embedded in your jenkins instance to figure out what the argument to build should be. If it's another pipeline at the same level as the top level Jenkinsfile, you could just reference it by job name. Caveat: I've used parallel, but never build within parallel, but it seems like it should work.
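For instance, a parameterized downstream pipeline could be invoked like this (a sketch; the parameter name GIT_BRANCH is made up for illustration):
build job: 'p1', parameters: [
    string(name: 'GIT_BRANCH', value: 'main') // hypothetical parameter
]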
You can try wrapping the pipeline jobs with the MultiJob plugin, which can implement the logic you require as two jobs inside a phase.
Could you suggest the best Jenkins plugin to manage multi-level, complex dependency builds?
Similar to a diamond:
many builds start in parallel
a downstream job has to wait for two or more upstream jobs to finish before it triggers, e.g. job C should wait for both A and B to complete with build success
Edit:
Seems like Pipeline plugin is the one that will be officially supported and developed by CloudBees.
Original Answer:
IMHO the easiest to start with is the Build Flow Plugin.
From the plugin Wiki:
parallel (
    // job 1, 2 and 3 will be scheduled in parallel.
    { build("Job1") },
    { build("Job2") },
    { build("Job3") }
)
if (params["PARAM1"] == "BOO") {
    println "BUILDING OPTIONAL JOB4"
    // job4 will be triggered after jobs 1, 2 and 3 complete and if condition is met
    build("Job4")
}
Additional plugins to check would be:
Pipeline Plugin
Tikal's Multijob
Is there a possible way / plugin that can run a post-build script when a Jenkins job is aborted?
I do see that the post build plugin provides an action to execute a set of scripts, but these can be run in only 2 cases: either a successful job or a failed job.
This question is positively answered here.
The Post Build Task plugin is run even if a job is aborted.
Use it to search the log text for "Build was aborted" and you can specify a shell script to run.
Works like a charm. :-)
For a declarative Jenkins pipeline, you can achieve it as follows:
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'MINUTES') // abort on exceeding the timeout
    }
    stages {
        stage('Echo 1') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Sleep') {
            steps {
                sh 'sleep 180'
            }
        }
        stage('Wake up') {
            steps {
                echo "Woken up"
            }
        }
    }
    // this post part is executed if the job is aborted
    post {
        aborted {
            script {
                echo "Damn it. I was aborted!"
            }
        }
    }
}
As far as I know, if a build is aborted, there's no way to execute any build steps (or post build steps) in it any more - which makes sense, that's what I would expect of "abort".
What you could do is create another job that monitors the first one's status, and triggers if it was aborted (e.g. see the BuildResultTrigger plugin).
Another solution might be to create a "wrapper" job, which calls the first one as a build step - this way you can execute additional steps after its completion, like checking its status, even if it was aborted.
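A minimal sketch of that wrapper idea as a scripted pipeline (the job name 'first-job' is an assumption): passing propagate: false to the build step keeps the wrapper from failing along with the inner job, and the returned run object exposes its result:
def run = build job: 'first-job', propagate: false
if (run.result == 'ABORTED') {
    echo 'first-job was aborted - running follow-up steps'
    // status checks or cleanup for the aborted case go here
}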
If you use a scripted pipeline, you can always use a try/catch/finally set, where the build is executed in the try block and the post-build steps go in the finally block. This way, even if the build fails, the post-build steps are still executed:
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

node {
    try {
        sh './build.sh' // the actual build steps go here
    } catch (FlowInterruptedException interruptEx) {
        echo 'build was aborted' // FlowInterruptedException is thrown on abort
        throw interruptEx        // re-throw so the run keeps its ABORTED result
    } finally {
        echo 'running post-build steps' // executed whatever the outcome
    }
}