I have a couple of jobs that have to be built in a specific order.
Job A triggers Job B that triggers Job C.
Job D must not start before A-B-C has run its course. However, this job (D) is in its turn triggered by a repository change, and once D has run successfully it will trigger A.
Any way to block D until A, B and C have run?
/J
You are looking for the "Throttle Concurrent Builds" plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Throttle+Concurrent+Builds+Plugin
Thanks for looking into my concern.
I have 3 Jenkins jobs: Job A, B & C.
Job A starts at 10 PM at night.
Job B is a downstream job of Job A and runs only if Job A is successful.
Job C is a downstream job of Job B.
Now I want Job C to be triggered after successful completion of Job B, or at a scheduled time. The problem is that if I configure Job C both as a downstream job and with a schedule, it runs twice.
But it should run only once.
Please help me to achieve this.
Did you try the "Conditional BuildStep" plugin? You can execute a downstream job (or a script) based on the build cause.
You can add more than one "single" condition for each build cause.
Now you'll need to decide when to run the job: as a timer or as a downstream build.
You can use the Jenkins Pipeline plugin. You can create a pipeline job with stages. A pipeline will only proceed to the next stage if the previous stage is successful. Refer to the documentation for more details on Pipeline.
Pipeline comes with a lot of flexibility in how you define the flow. You can use either a declarative pipeline or a scripted pipeline. A good number of examples can be found here.
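For illustration, here is a minimal declarative pipeline sketch (the job names job-B and job-C are placeholders, not from the question) in which each stage triggers an existing job and the next stage only runs if the previous one succeeded:
pipeline {
    agent any
    stages {
        stage('Run Job B') {
            steps {
                // the build step waits for job-B and fails this stage if job-B fails
                build job: 'job-B'
            }
        }
        stage('Run Job C') {
            steps {
                build job: 'job-C'
            }
        }
    }
}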
I have created Job A, which looks for the success result of the upstream jobs (Job B and Job C) and triggers a shell script to verify a condition.
Once Job B and Job C have executed successfully, Job A executes the downstream jobs (Job D and Job E).
I have used the reverse (to configure upstream jobs) and downstream-ext (to configure downstream jobs) plugins in Job A using JJB.
The issue I am facing: Job A executes after Job B finishes successfully, without waiting for Job C's result. Job A should wait for both Job B and Job C and then execute based on their results.
Could you please help me configure this scenario?
You can try using the Join Plugin, here is the documentation:
https://wiki.jenkins-ci.org/display/JENKINS/Join+Plugin
This'd be easier if you convert your A job to a build flow https://wiki.jenkins.io/display/JENKINS/Build+Flow+Plugin?focusedCommentId=60917290 or, even better, its successor Pipeline 2.0 https://jenkins.io/doc/book/pipeline/
(Groovy) Code in A would then be something like:
if (build('scenario-B-Job') && build('scenario-C-Job')) {
    build('scenario-E-Job')
    build('scenario-D-Job')
}
You can also parallelize (B, C and then D, E) to shorten the overall execution time if you have enough slaves around.
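A rough sketch of that parallel variant in the same Build Flow DSL (same placeholder job names as above):
parallel(
    { build('scenario-B-Job') },
    { build('scenario-C-Job') }
)
parallel(
    { build('scenario-D-Job') },
    { build('scenario-E-Job') }
)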
Is it possible to set up a job X to build a sequence of jobs A, B and C sequentially (B does not start before A has finished successfully, C does not start before B, ...) without specifying in B that it should start after A is done, etc.?
Precisely, I would like to set up a "Master" job which executes the subjobs sequentially without modifying the subjobs.
(OR)
Is there a way to specify which jobs to run sequentially from a list file or text file?
Can I use some plugin to perform this?
I guess you should try the Build Pipeline Plugin.
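Alternatively, a "Master" Pipeline job can chain the existing jobs without modifying them. A minimal scripted sketch, assuming the Pipeline plugin is installed and A, B, C are the names of the existing jobs:
node {
    // each build step blocks until the triggered job finishes and
    // aborts the sequence if that job fails
    stage('A') { build job: 'A' }
    stage('B') { build job: 'B' }
    stage('C') { build job: 'C' }
}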
I have 4 jobs which needs to be executed in the following sequence
JOB A
  |------> JOB B
  |------> JOB C
              |------> JOB D
In the above
A should trigger B & C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins and couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin which I haven't tried that would help me resolve this? Or can this be achieved in a different way? Please advise.
Use a DSL script with the Build Flow plugin.
Try this example for your flow:
build("job A")
parallel
(
{build("job B")}
{build("job C")}
)
build("job D")
Try the Locks and Latches plugin.
This may not be optimal way, but it should work. Use the Parameterized Trigger Plugin. To Job A, add a build step (NOT a Post Build Action) to start both Jobs B and C in the same build step AND block until they finish. In Job C, add a build step (NOT a Post Build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal though: Job A is held open waiting for B and C to finish. Then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C are started, but have a Promotion on Job A that will execute your final actions after jobs B, C and D are successful.
I am trying to build the same kind of system. I am building a certification pipeline where I need to run packager/build/deploy jobs and corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger the release job that can do an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. Initially I tried the Parameterized Trigger plugin with blocking builds. I could not set up artifact archiving/fingerprinting and the downstream build relationship this way, since archiving the artifacts works only as a post-build action. Then I put the Parameterized Trigger in the post-build activity. This way I was able to set up downstream builds, fingerprinting and aggregated test results, but build failures were not bubbling up the upstream job chain and the upstream jobs were non-blocking.
I was finally able to achieve this using these plugins:
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
System looks like this
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job -
Create it as a MultiJob and create a fingerprint file in a shell exec step:
echo $(date +%s) > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
ARCHIVEDIR="$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive"
mkdir -p "$ARCHIVEDIR"
cp fingerprint.txt "$ARCHIVEDIR"
Create a MultiJob phase consisting of the build/deploy job.
The build/deploy job is itself a MultiJob.
Follow the same steps for creating the build/deploy job as above with respect to fingerprinting.
Copy the fingerprint.txt artifact from the upstream job.
Set up a MultiJob phase in the deploy job that triggers the test job.
Create a new fingerprint file and force-archive it as in the step above.
Collect JUnit results in the final test job.
In the trigger job, use the Join Plugin to execute the release job by choosing 'Run Post Build Actions at join' and execute the release project only on a stable build of the trigger job.
This way all the steps show up in the Build Pipeline view, and the trigger job blocks until all downstream builds finish and sets its status to the worst downstream result, giving a decision point for the release job.
Multijob Plugin
Use it if you'd like to stop the mess of downstream/upstream job chain definitions, or when you want to add a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel. You can add context to your build flow by implementing parameter inheritance from the MultiJob down to all its phases and jobs. Phases run sequentially, while the jobs inside each phase run in parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin
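If you manage jobs as code with the Job DSL plugin, a MultiJob with sequential phases and parallel jobs inside each phase could be sketched roughly like this (assuming Job DSL with MultiJob support is installed; job names are placeholders):
multiJob('my-hierarchy') {
    steps {
        // phases run one after another
        phase('Build') {
            // jobs inside a phase run in parallel
            phaseJob('job-B')
            phaseJob('job-C')
        }
        phase('Deploy') {
            phaseJob('job-D')
        }
    }
}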
I have three jobs that I would like to serialize in Jenkins.
They should run as a block after a job that triggers them:
Job1 -> [A,B,C]
Job2 -> [A,B,C]
Right now when Job1 is triggered twice, I get the following behavior:
Order that jobs are run now:
-Job1
-Job2
-A
-B
-(job A or C)
-Order is not guaranteed after this
What I would like to see is:
Order that jobs are run:
-Job1
-Job2
-A (from Job1)
-B (from Job1)
-C (from Job1)
------------
-A (from Job2)
-B (from Job2)
-C (from Job2)
Have a look at the Jenkins Locks and Latches plugin.
So, this is the solution I found:
Using the Jenkins Parameterized Trigger plug-in, an extra Build step is available:
Trigger/call builds on other projects
On Job1 add three build triggers:
Projects to Build: JobA
  [x] Block until the triggered projects finish their builds
Projects to Build: JobB
  [x] Block until the triggered projects finish their builds
Projects to Build: JobC
  [x] Block until the triggered projects finish their builds
Do the same on Job2.
Add a lock that is shared between the two Jobs 1 and 2.
This way, when Job 1 and Job 2 are triggered, they will wait for each other. And each job won't be marked as passed until all the jobs A -> B -> C have run and are successful.
More options are available to control the return value of each build step, i.e. whether it should depend on the result of the triggered job or continue regardless of the result:
Fail this build step if the triggered build is worse or equal to
Mark this build as failure if the triggered build is worse or equal to
Mark this build as unstable if the triggered build is worse or equal to
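For comparison, the same pattern could be sketched as a Pipeline job using the Lockable Resources plugin's lock step in place of the shared Locks and Latches lock ('abc-sequence' is an assumed resource name; job names as above):
node {
    lock('abc-sequence') {
        // each build step blocks until the triggered job finishes,
        // and this job only passes if A, B and C all succeed
        build job: 'JobA'
        build job: 'JobB'
        build job: 'JobC'
    }
}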
I cannot think of a way to do exactly what you want in Jenkins. Maybe Jenkins is not the tool for you?
Another way to approach this: your question actually presents a solution to your problem (a bunch of jobs and their execution order), and you are asking how to implement it.
Would you be willing to explain what your goal is, what it is you want to achieve? Maybe some Jenkins expert can then tell you how to do it using Jenkins.