Jenkins Parallel Trigger and Wait

I have 4 jobs which need to be executed in the following sequence:
JOB A
 |------> JOB B
 |------> JOB C
           |------> JOB D
In the above,
A should trigger B & C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins and couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin I haven't tried that would help me resolve this? Or can this be achieved in a different way? Please advise.

Use a DSL script with the Build Flow plugin.
Try this example for your setup:
build("job A")
parallel(
    { build("job B") },
    { build("job C") }
)
build("job D")

Try the Locks and Latches plugin.

This may not be the optimal way, but it should work. Use the Parameterized Trigger Plugin. In Job A, add a build step (NOT a Post Build Action) that starts both Jobs B and C in the same build step AND blocks until they finish. In Job C, add a build step (NOT a Post Build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal though: Job A is held open waiting for B and C to finish. Then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C are started, but have a Promotion on Job A that will execute your final actions after jobs B, C and D are successful.
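For reference, the same blocking topology can be sketched with the Pipeline build step, assuming Pipeline is an option for you (job names are the ones from the question):
// Scripted Pipeline sketch: B and C start in parallel, C's branch then runs D,
// and the orchestrating build stays open until every downstream build has finished.
parallel(
    B: { build(job: 'JOB B', wait: true) },
    CthenD: {
        build(job: 'JOB C', wait: true)
        build(job: 'JOB D', wait: true)
    }
)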

I am trying to build a similar system. I am building a certification pipeline where I need to run packager/build/deploy jobs and corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger a release job that can do an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. I initially tried the Parameterized Trigger plugin with blocking builds. I could not set up artifact archiving/fingerprinting and the downstream build relationship this way, since archiving artifacts works only as a post-build action. I then moved the Parameterized Trigger to a post-build action; this way I was able to set up downstream builds, fingerprinting, and aggregated test results, but build failures did not bubble up the upstream job chain and the upstream jobs were non-blocking.
I was finally able to achieve this using these plugins:
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
The system looks like this:
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job:
Create it as a MultiJob and create a fingerprint file in a shell build step:
echo $(date +%s) > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
ARCHIVEDIR="$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive"
mkdir -p "$ARCHIVEDIR"
cp fingerprint.txt "$ARCHIVEDIR"
Create a MultiJob phase consisting of the build/deploy job.
The build/deploy job is itself a MultiJob; follow the same fingerprinting steps as above.
Copy the fingerprint.txt artifact from the upstream job.
Set up a MultiJob phase in the deploy job that triggers the test job.
Create a new fingerprint file and force-archive it as in the step above.
Collect the JUnit results in the final test job.
In the trigger job, use the Join Plugin to execute the release job by choosing 'Run Post Build Actions at join', and run the release project only on a stable build of the trigger job.
This way all the steps show up in the Build Pipeline view, and the trigger job blocks until all downstream builds finish, taking the worst downstream status as its own, which gives a decision point for the release job.

Multijob Plugin
Use it if you'd like to stop the mess of downstream/upstream job chain definitions, or when you want to define a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel. You can add context to your build flow through parameter inheritance from the MultiJob to all its phases and jobs. Phases run sequentially, while the jobs inside each phase run in parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin
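As a rough illustration, a Job DSL sketch of two phases might look like this (it assumes the Job DSL and MultiJob plugins are installed; the job names are placeholders):
// Phases run one after another; the jobs inside a phase run in parallel.
multiJob('orchestrator') {
    steps {
        phase('Build') {
            phaseJob('JOB-B')
            phaseJob('JOB-C')
        }
        phase('Finish') {
            phaseJob('JOB-D')
        }
    }
}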

Related

How to run jenkins job only once but with two conditions

Thanks for looking into my concern.
I have 3 Jenkins jobs: Job A, B & C.
Job A starts at 10 PM at night.
Job B is a downstream job of Job A and runs only if Job A succeeds.
Job C is a downstream job of Job B.
Now I want Job C to be triggered after successful completion of Job B, or at a scheduled time. The problem is that if I set up Job C both as a downstream job and with a schedule, it runs twice.
But, it should run only once.
Please help me to achieve this.
Did you try the "Conditional BuildStep" plugin? You can execute a downstream job (or a script) based on the "Build cause".
You can add more than one "single" condition, one for each build cause.
You'll still need to decide when to run the job: from the timer or as a downstream build.
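For illustration, the same mechanism (branching on the build cause) could be sketched in Pipeline Groovy roughly like this; getBuildCauses() and the class-name matching are assumptions about your Jenkins version:
// Inspect why this build started; decide per cause whether the real work should run.
def causes = currentBuild.getBuildCauses()
def byUpstream = causes.any { it._class.toString().contains('UpstreamCause') }
def byTimer    = causes.any { it._class.toString().contains('TimerTrigger') }
if (byUpstream) {
    echo 'Started by the upstream job'
    // run the downstream steps here
} else if (byTimer) {
    echo 'Started by the schedule'
    // decide here whether this run should proceed or be skipped
}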
You can use the Jenkins Pipeline plugin. You can create a pipeline job with stages; a pipeline proceeds to the next stage only if the previous stage succeeded. Refer to the documentation for more details on Pipeline.
Pipeline gives you a lot of flexibility in defining the flow. You can use either a declarative pipeline or a scripted pipeline. A good number of examples can be found here
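For instance, a minimal declarative sketch for the A -> B -> C chain above (the cron spec and job names are assumptions):
pipeline {
    agent any
    triggers { cron('0 22 * * *') }                       // start around 10 PM
    stages {
        stage('Job A') { steps { build job: 'Job A' } }   // each build step waits for the job to finish
        stage('Job B') { steps { build job: 'Job B' } }   // runs only if Job A succeeded
        stage('Job C') { steps { build job: 'Job C' } }   // runs only if Job B succeeded
    }
}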

Aggregating test results in Jenkins with Parametrized Jobs

I understand this post is similar to:
Aggregating results of downstream is no test in Jenkins
and also to:
Aggregating results of downstream parameterised jobs in Jenkins
Nonetheless, I am not able to figure out how to make this work for my case. I am currently using Jenkins 1.655.
I have jobs A, B, C - A being the upstream job. What I want is for A to call B and B to call C. Each needs to block and wait for the completion of the next; if one fails, all fail. B and C generate unit test reports, so I want to aggregate these reports in A and then publish the result in A. Here's the current setup of the jobs:
Job A:
Build Steps
Execute shell: echo $(date) > aggregate
Trigger Parametrized Build Job: Job B
Post Build Steps
Aggregate downstream test results
Record fingerprints of files to track usage: set Files to fingerprint to aggregate
Publish JUnit test result report (report files from B and C)
Job B:
Build Steps
Copy artifacts from another project: copy the aggregate file from the upstream job
Run tests to generate unit test reports
Trigger Parametrized Build Job: Job C
It ultimately fails here because aggregate is only archived in the
Post Build Steps of Job A. How can I archive an artifact in the Build Step?
Post Build Steps
Aggregate downstream test results (unit test.xml generated)
Record fingerprints of files to track usage: set Files to fingerprint to aggregate
I won't post Job C here for simplicity but it follows pretty much what B does.
So, summing up, I want interlinked jobs that depend on each other, use the Parametrized Trigger plugin, and have the upstream job aggregate the test results of all downstream jobs.
Any help appreciated, thanks!
If you have no limitation on where to run your jobs, you can always make them run in the same workspace/machine - this will solve all your issues.
If for some reason you can't run them in the same workspace, then instead of using the Copy Artifact plugin you can use the Jenkins link to the workspace (guessing you're using the Parameterized Trigger Plugin), so it'll be easy to wget the "aggregate" file of job A from the triggered job using the TRIGGERED_BUILD_NUMBER_<project name>="Last build number triggered" variable that A defines. This will also help you keep track of the jobs B and C you triggered, to get the artifacts from there.
Hope it helps!

Jenkins: parallelize test execution

I started using Jenkins in my project, and I am trying to parallelize my test suite (RSpec test cases), which is split across 4 files:
spec/features/
|-- test1.rb
|-- test2.rb
|-- test3.rb
|-- test4.rb
We can run all test cases with the command below; it runs all the tests in test1.rb..test4.rb sequentially, which takes around 1 hour.
script spec/features/
To execute the test cases from a single test file, we can run:
script spec/features/test1.rb
Now I want to parallelize these test cases, which should reduce the run from 1 hour to 15 minutes. All of these test cases can run on one machine in parallel.
I followed the approach below in Jenkins:
1) Set up a new job "Main_Test_job"
2) In its configuration, add four triggers:
Selected "Trigger/Call builds on other projects"
Projects to build: "Child_test_job"
Build on same node
Predefined Parameters: TEST_UNIT=test1.rb
Block until the triggered projects finish their builds (not selected)
Add trigger --->
Selected "Trigger/Call builds on other projects"
Projects to build: "Child_test_job"
Build on same node
Predefined Parameters: TEST_UNIT=test2.rb
Block until the triggered projects finish their builds (not selected)
Add trigger --->
Selected "Trigger/Call builds on other projects"
Projects to build: "Child_test_job"
Build on same node
Predefined Parameters: TEST_UNIT=test3.rb
Block until the triggered projects finish their builds (not selected)
Add trigger --->
Selected "Trigger/Call builds on other projects"
Projects to build: "Child_test_job"
Build on same node
Predefined Parameters: TEST_UNIT=test4.rb
Block until the triggered projects finish their builds (not selected)
3) Created the job "Child_test_job", which is triggered from "Main_Test_job" as above, with an "Execute shell" build step containing the command below:
script spec/$TEST_UNIT
When I start "Main_Test_job", it automatically starts 4 "Child_test_job" builds on the same machine, which reduces my total run time to 15 minutes.
But in this case "Main_Test_job" has no way to monitor the statuses of the 4 child builds; it always succeeds immediately after starting them.
The "Block until the triggered projects finish their builds" option does monitor the child jobs, but if we select it for all of the triggers, they run sequentially instead of in parallel.
I can't use the Join plugin, as I am not running different jobs but triggering the same job multiple times.
My ideas:
Have separate jobs for each test .rb file and use a join trigger to monitor the statuses of all the jobs.
Have some shell script as a post-build task of "Main_Test_job" which aggregates/monitors the statuses/results of each job.
I think this must be a common scenario in many organizations, and there must be an easy way in Jenkins to achieve this.
Please let me know your approaches/ideas. Maybe I am missing something here.
If your jobs can run in parallel on the same machine, then the Multijob plugin might be of interest to you. It starts the jobs in parallel but waits until all of them finish.
You can also use the Build Flow Plugin.
You can run any type of job with this plugin.
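A rough Build Flow DSL sketch for this case (assuming the child job takes the TEST_UNIT parameter described above):
// Triggers the same parameterized job four times in parallel and
// waits until every run has finished before the flow completes.
parallel(
    { build("Child_test_job", TEST_UNIT: "test1.rb") },
    { build("Child_test_job", TEST_UNIT: "test2.rb") },
    { build("Child_test_job", TEST_UNIT: "test3.rb") },
    { build("Child_test_job", TEST_UNIT: "test4.rb") }
)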

Jenkins - Build Pipeline - Showing unwanted Job after using Join Plugin

I'm trying to set up Jenkins as follows:
Test Job --> (Test Job 1 & Test Job 2 in parallel) --> Test Job 3 --> Test Job 4
I have this working at present using the Join Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Join+Plugin) and Build Pipeline Plugin.
However, the Build Pipeline view unnecessarily shows 2 x Test Job 3 and 2 x Test Job 4 after the join.
The setup for each job (Test Job, Test Job 1 & 2, Test Job 3, Test Job 4) was shown in screenshots.
I would like to remove the "blue" versions of Test Job 3 and Test Job 4 from my Build Pipeline after the two parallel processes finish.
Anybody able to help me to remove these?
Cheers
Try the Build Flow plugin.
It will handle both parallel and sequential jobs.
I recommend using the Multijob Plugin alone without the Build Pipeline Plugin.
The Multijob Plugin gives you the functionality of the Join Plugin, and its configuration is more straightforward. I actually prefer how it displays my running build.
You can put a multijob into a build pipeline, but the placement of the jobs within the pipeline is wrong: the jobs within the multijob are displayed vertically in alphabetical order (not build order). On the positive side, everything else seems to work, so this should be easy to fix. I reported this problem as Jenkins bug 22074.
The 'Build Pipeline' plugin supports custom CSS; maybe you could make it invisible with CSS.
You can use the Build Pipeline plugin together with the Multijob plugin. Just use the Multijob plugin as a substitute for the Join plugin; basically, the Multijob plugin will only be used to make certain jobs execute simultaneously.
If you do it this way, the Build Pipeline view won't get screwed up.
This is how it looks in the Build Pipeline view:
build-bv-docker-images is a Multijob plugin job.
build-(activemq|postgres|tomcat|wildfly)-bv_image are simple jobs used for building Docker images.
deploy-staging is a job which is triggered after the build-bv-docker-images job. Logically speaking it is supposed to appear right after the stack of build-*-bv-images jobs, but it appears as part of that stack. Nevertheless, it waits until all the jobs of that stack are completed. I had to prefix the deploy-staging job with a + sign to make it appear at the top of the stack. It looks awkward, but it's still better than seeing the deploy-staging job at the bottom of the stack.
This is how the build-bv-docker-images multijob is configured.

Triggering a Hudson job when all upstream jobs are completed?

I have 3 jobs: A, B, C. Job C has an upstream dependency on Job A and Job B. Both Job A and B can run in parallel. We want Job C to be triggered only when Job A and Job B have completed. Is there an existing plug-in that I can use? We are using Hudson 3.0.1.
From other posts here I figured out that there is an existing plug-in in Jenkins called the Build Flow Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Build+Flow+Plugin) that provides this functionality. Is there a plug-in in Hudson that provides the same functionality? Or can I reuse this plug-in with Hudson?
Try the Build Pipeline Plugin; it may do what you want.
Try Multijob Plugin
(https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin)
It provides multi-phase configuration.
If you want to run 2 or more jobs in parallel, put them all in the same phase.
In your case you need to create 2 phases.
Example:
phase 1:
job A
job B
phase 2:
job C
Here, phase 2 executes only after the completion of phase 1.
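Since the question mentions the Build Flow plugin, the same ordering could also be sketched in its DSL (assuming that plugin is usable on your installation):
// A and B run in parallel; C starts only after both have finished.
parallel(
    { build("Job A") },
    { build("Job B") }
)
build("Job C")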
