Jenkins diamond dependency with 3 upstream jobs triggers only 2 times

We have one jenkins job (A) that triggers 3 other jobs (B1,B2,B3).
These 3 jobs all trigger the same job (C).
When job A is triggered, job C executes only twice (I expected 3 times).
Question: can someone please explain why C is triggered twice rather than 3 times?
Dependencies overview:
    +-> B1 -+
A --+-> B2 -+-> C
    +-> B3 -+
The downstream build view shows that 2 of the jobs (for example B2 and B3) together trigger only one execution of C. Note that it is not always the same 2 jobs.
Execution overview (downstream build view of A):
    +-> B1 -> C (build #1)
A --+-> B2 -> C (build #2)
    +-> B3 -> C (build #2)   <<< same build as for B2
More details about the job configs:
Job A has Post-build Actions/Build other projects: B1, B2, B3
Job C has Build trigger/Build after other projects are built/Projects to watch: B1, B2, B3
Jenkins version: 1.583

That's simply how Jenkins triggers jobs. If concurrent builds are not allowed in C (I assume they aren't), the following happens:
A finishes and triggers B1,B2,B3
B1 (for example; it could just as well be B2 or B3) finishes and triggers C, build #1 (C#1).
B2 finishes and triggers C. The build is queued, since C#1 is still running.
B3 finishes and triggers C. While C#1 is running, further builds wait in the queue, and if they were triggered the same way (i.e. C is not a parameterized build, or the parameters are identical), the queued builds are merged into one. Thus only a single queued build of C (C#2) remains.
C#1 finishes and the next build in the queue (C#2) starts. Since C#2 was merged (from the B2 and B3 triggers), the build queue is now empty.
C#2 finishes.
As you can see, C ran only twice.
There is a workaround, though: make C parameterized and supply a different parameter value from each trigger (for example, the name of the triggering job). Alternatively, allow concurrent builds of C, but then you must ensure the builds don't access the same shared resource (e.g. protect it with a critical section / lock).
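In pipeline syntax, the parameterized-trigger workaround could be sketched like this (the parameter name TRIGGERED_BY is illustrative, not from the original setup):

```groovy
// In each of B1, B2, B3: trigger C with a distinct parameter value so the
// queued builds of C are no longer identical and therefore won't be merged.
build job: 'C',
      parameters: [string(name: 'TRIGGERED_BY', value: env.JOB_NAME)],
      wait: false   // fire-and-forget, like a post-build trigger
```

Because each queued build of C now carries a different TRIGGERED_BY value, Jenkins treats them as distinct queue items, and C runs three times.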

Related

Jenkins. Generate parallel stages depending on passed parameters

I need to run identical steps (file manipulation) on different servers (agents) that hold different files. E.g.:
server A has files A1-A3
server B has files B1-B6
server C has a file C1
Every server has to run the steps on its own file list, independently of and in parallel with the other servers. Every file has to be processed in 3 steps: step 1, then step 2, then send to a network share (step 3).
So my ideas were:
Idea1. Create a map:
filelist = [
    [serverhostname: 'serverA', files: ['A1', 'A2', 'A3']],
    [serverhostname: 'serverB', files: ['B1', 'B2', 'B3', 'B4', 'B5', 'B6']],
    [serverhostname: 'serverC', files: ['C1']]
]
Idea2. Generate parallel stages and pass the map into them. I read the article Generating parallel stages in Jenkinsfile according to passed parameters but couldn't combine its examples into working code.
Idea3. To limit server load (step 2 eats CPU, step 3 eats network bandwidth), I want each server to process only 1 file at a time rather than the whole file set.
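One way to combine all three ideas is to build a map of closures from the file list and hand it to the scripted-pipeline parallel step. A sketch, assuming each server is a Jenkins agent labeled with its hostname and the three steps are shell commands (step1.sh, step2.sh, copy-to-share.sh are placeholders):

```groovy
def filelist = [
    [serverhostname: 'serverA', files: ['A1', 'A2', 'A3']],
    [serverhostname: 'serverB', files: ['B1', 'B2', 'B3', 'B4', 'B5', 'B6']],
    [serverhostname: 'serverC', files: ['C1']]
]

def branches = [:]
filelist.each { entry ->
    def host  = entry.serverhostname   // local copies: each closure must capture
    def files = entry.files            // its own values, not the loop variable
    branches[host] = {
        node(host) {                   // run on the agent labeled with the hostname
            files.each { f ->          // sequential loop => one file at a time (Idea3)
                stage("${host}: ${f}") {
                    sh "step1.sh ${f}"          // placeholder commands
                    sh "step2.sh ${f}"
                    sh "copy-to-share.sh ${f}"
                }
            }
        }
    }
}
parallel branches                      // servers run in parallel (Idea2)
```

The sequential inner loop is what limits each server to one file at a time, while the parallel step keeps the servers independent of each other.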

Jenkins - How to make a child wait for both parents?

I have 5 jobs in Jenkins:
[Image: dependency diagram of the 5 jobs; E has two parents, C and D]
"E" is executed only after both parents (C & D) have completed their builds.
How do I trigger the child job only after both parent jobs have finished?
Note: I want to make sure that "E" executes only once.
Use a pipeline for this.
Run jobs C and D in parallel; once both complete, run job E.
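A minimal scripted-pipeline sketch of that approach (job names assumed to match the question):

```groovy
// Orchestrator pipeline: C and D run in parallel; E runs exactly once,
// and only after both parallel branches have succeeded.
parallel(
    C: { build job: 'C' },
    D: { build job: 'D' }
)
build job: 'E'
```

If either C or D fails, the parallel step fails the pipeline and E is never reached, which also guarantees E executes at most once per pipeline run.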
Is there any case where C or D would be run independently of the other before E runs? If not, you could simply add D as a post-build downstream job of C, and E downstream of D, so the jobs build in the order C -> D -> E.
You can set up upstream & downstream projects in your desired order. Use the General => Advanced => "Block build when upstream project is building" option on project E so that it waits until the project C & D builds are done. You can do the same for projects C & D. Hope this helps.

Jenkins plugin - Merge build queue?

I'm looking for a Jenkins plugin.
Here is my scenario:
1) Job B's quiet period is set to 10 minutes.
2) Job B will have 10 queued builds.
3) After 10 minutes, job B-1 starts running.
4) After B-1 finished, then B-2 starts running.
5) ...
==> Instead of running a single B-1 build in step 3), I want to gather the parameters of all 10 queued builds, run one merged build B-x, and discard all 10 queue entries.
Is this possible?
If I understood your question correctly, you have a parameterized job with 10 builds in the queue, and you want to run only the last one?
If so, you should use a Groovy script, either before you trigger the job or inside the job as a build step, to check the queue and cancel all earlier builds still waiting in it.
Here is an example that cleans queued builds for a specific branch; you can modify it for your needs.
import jenkins.model.*

// Runs as a system Groovy build step, where 'build' is the current build.
def branchName = build.environment.get("GIT_BRANCH_NAME")
def buildNo = build.environment.get("BUILD_NUMBER")
println "Checking whether to clean the queue for ${branchName}, build number: ${buildNo}"

def q = Jenkins.instance.queue

// Log everything currently in the queue.
q.items.each {
    println "${it.task.name}:"
}

// Cancel queued items that belong to this branch.
q.items.findAll { it.task.name.startsWith(branchName) }.each {
    q.cancel(it.task)
}
You sound like you're describing a matrix (multi-configuration) project, which builds a matrix of different build-parameter combinations.
For example, two axes with three values each would give you 9 builds, each in its own workspace. There are options to exclude some of the combinations.
This is a good explanation of matrix builds.
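In modern declarative pipelines the same idea is available as the matrix directive. A sketch with two axes of three values each (the axis names and values are purely illustrative):

```groovy
pipeline {
    agent any
    stages {
        stage('BuildMatrix') {
            matrix {
                axes {
                    axis { name 'OS';   values 'linux', 'windows', 'mac' }
                    axis { name 'ARCH'; values 'x86_64', 'arm64', 'ppc64' }
                }
                excludes {                 // drop unwanted combinations
                    exclude {
                        axis { name 'OS';   values 'windows' }
                        axis { name 'ARCH'; values 'ppc64' }
                    }
                }
                stages {
                    stage('Compile') {
                        steps { echo "Building for ${OS}/${ARCH}" }
                    }
                }
            }
        }
    }
}
```

Two 3-value axes yield 9 combinations; the excludes block above removes one, leaving 8 builds.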

PBFT consensus with 4 VPs

I have set up a dev network consisting of 4 VPs using PBFT consensus.
I am trying to test the behaviour of the VPs when one of them is down.
Step one
While the 4 VPs were running, I deployed a chaincode (chaincode_example02).
Checking localhost:7050/chain -> returns 2.
Step two
I shut down one of the VPs (docker stop containerID).
Now when I execute an invoke transaction and recheck the chain length:
localhost:7050/chain -> it still returns 2.
Step three
I restart the VP (from step two), and then I see that the invoke transaction (from step two) is executed automatically and the chain length is now 3:
localhost:7050/chain -> now returns 3.
My understanding is that with 4 VPs using PBFT consensus, the network tolerates 1 faulty VP. If that is the case, the invoke transaction should have been executed in step two.
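For reference, the sizing arithmetic behind that expectation (these are the standard PBFT formulas, not something from the post) can be sketched as:

```groovy
// Standard PBFT replica-count arithmetic: n = 3f + 1.
int n = 4                    // validating peers in the network
int f = (n - 1).intdiv(3)    // faulty replicas tolerated: f = 1
int quorum = 2 * f + 1       // matching replicas needed to commit: 3
assert f == 1 && quorum == 3
```

So with 4 VPs and only 1 of them down, the remaining 3 peers do form a commit quorum, which is why the poster expected the transaction to execute in step two.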
Can someone please advise if that is the expected result and why?
Thanks in advance

Jenkins - Multijob Phase marks build as failed when one of the steps is not executed

I'm in need of a little help here and would appreciate any pointers on this issue.
One of my projects (a MultiJob) has a few phases (P1, P2, P3). I have set conditions for proceeding to each phase:
The condition from P1 to P2 is "only if P1 is successful".
The condition from P2 to P3 is "only if P2 failed".
Everything would be fine except for the case where P2 succeeds. Whenever P2 succeeds, P3 does not run (which is what I want), but the Jenkins job is marked UNSTABLE.
Am I missing something here? Is this the right approach for these cases? Please advise.
Use the Conditional BuildStep plugin: https://wiki.jenkins-ci.org/display/JENKINS/Conditional+BuildStep+Plugin
