Could you suggest the best Jenkins plugin to manage multi-level, complex build dependencies?
Something similar to a diamond:
many builds start in parallel
a downstream job has to wait for two or more upstream jobs to finish before it triggers, e.g. job C should wait for both A and B to complete with a successful build.
Edit:
It seems like the Pipeline plugin is the one that will be officially supported and developed by CloudBees.
Original Answer:
IMHO the easiest one to start with is the Build Flow Plugin.
From the plugin Wiki:
parallel (
    // job 1, 2 and 3 will be scheduled in parallel.
    { build("Job1") },
    { build("Job2") },
    { build("Job3") }
)
if (params["PARAM1"] == "BOO") {
    println "BUILDING OPTIONAL JOB4"
    // job4 will be triggered after jobs 1, 2 and 3 complete and if the condition is met
    build("Job4")
}
Additional plugins to check would be:
Pipeline Plugin
Tikal's Multijob
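Since the edit above notes that the Pipeline plugin is the officially supported option, here is a minimal scripted Pipeline sketch of the same fan-in (diamond) pattern; the job names JobA, JobB and JobC are placeholders, not from the original question:

// JobA and JobB run in parallel; JobC runs only if both succeed,
// because the build step propagates failures by default.
node {
    stage('Upstream in parallel') {
        parallel(
            A: { build job: 'JobA' },
            B: { build job: 'JobB' }
        )
    }
    stage('Downstream') {
        build job: 'JobC'
    }
}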
I want to use the Jenkins "PRQA" plugin, which does not seem to offer a way to use it from a pipeline. The plugin runs static code analysis and publishes the results.
In my case it requires some preparation that is already done in a pipeline job. Because of that, I want to include the job in that pipeline, but on the same executor, with the data prepared by the pipeline, as a kind of inlined job step.
I have tried creating a job for the PRQA plugin step and executing it with the build step from the pipeline, but this tries to start the job on a new executor (and stalls because I have only one executor).
pipeline {
agent any
stages {
stage('Build') {
steps {
echo 'Prepare'
}
}
stage('SCA') {
steps {
//Run this without using a new executor with the Environment that exists now
build 'PRQA_Job'
}
}
}
}
What is the correct way to run the job on the same executor, with the current working directory?
With build 'PRQA_Job' it's not possible to run the second job on the same executor (1 job = 1 executor), since the main job just waits for the triggered job to finish. But you can run the other job on the same agent, if that agent has more than one executor, so it can reach the main job's workspace.
For test purposes, specify the agent name in both jobs: agent 'agent_name_here'
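A minimal sketch of that setup, assuming an agent labelled agent_name_here with at least two executors (the label is a placeholder; the rest mirrors the pipeline from the question):

// Main pipeline: pinned to one agent so PRQA_Job can run on the same machine.
pipeline {
    agent { label 'agent_name_here' }
    stages {
        stage('Build') {
            steps {
                echo 'Prepare'
            }
        }
        stage('SCA') {
            steps {
                // Occupies a second executor of the same agent while this job waits.
                build 'PRQA_Job'
            }
        }
    }
}

PRQA_Job would then also be restricted to agent_name_here, so both builds end up on the same machine and can share data through the file system.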
If you want to use functionality from a plugin that has no native pipeline support, you could try the "step: General Build Step" feature for Jenkins Pipelines. You can use the Pipeline Syntax wizard linked in the job configuration window to generate the needed pipeline description.
If the plugin does not show up in the "step: General Build Step" part of Jenkins, you can use a separate job. To copy all the needed files/data into this second job, you will need to use the Archive Artifact/Copy Artifact functionality of Jenkins to save files from your pipeline build.
For more information on how to use Archive Artifact/Copy Artifact see https://plugins.jenkins.io/copyartifact/ and
https://www.jenkins.io/doc/pipeline/tour/tests-and-artifacts/
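For example, a rough sketch of that artifact hand-off, assuming the Copy Artifact plugin is installed ('prepared/**' and 'MainPipeline' are placeholder names, not from the original question):

// 1) In the main pipeline build: archive the prepared files.
archiveArtifacts artifacts: 'prepared/**', fingerprint: true

// 2) In the second job that runs the PRQA step: copy those files back in.
//    Requires the Copy Artifact plugin; 'MainPipeline' is a placeholder job name.
copyArtifacts projectName: 'MainPipeline', selector: lastSuccessful(), filter: 'prepared/**'

If the second job is triggered by the main pipeline, a selector tied to the triggering build (for example a build-number parameter with specific(...)) is usually more reliable than lastSuccessful().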
I have Jenkins declarative pipelines for a few different repos that trigger a database refresh, and unit tests that depend on the database. These Jenkins jobs are triggered from pull requests in GitHub.
To avoid resource collisions, I need to prevent these jobs from running at the same time -- both within each project, and across projects.
The "Throttle Concurrent Builds" plugin seems to be built for this.
I have installed the plugin and configured a category like so:
And I added the "throttle" option to the Jenkinsfile in one of the repositories whose builds should be throttled:
pipeline {
agent any
options {
throttle(['ci_database_build'])
}
stages {
stage('Build') {
parallel {
stage('Build source') {
steps {
// etc etc...
However, this does not seem to be preventing 2 jobs from executing at once. As evidence, here are 2 jobs (both containing the above Jenkinsfile change) executing at the same time:
What am I missing?
The following in the options block should work:
options {
throttleJobProperty(
categories: ['ci_database_build'],
throttleEnabled: true,
throttleOption: 'category',
)
}
The full syntax can be seen here: https://github.com/jenkinsci/throttle-concurrent-builds-plugin/pull/68
I would like to build a pipeline in Jenkins where the result of multiple upstream jobs should trigger a single downstream job.
For example:
Job 1 --> Job 2 --> Job 5 --> Job 6
          Job 3 -->
          Job 4 -->
Job 1: when a new piece of code is committed to Git production, it should trigger Jobs 2, 3 and 4 (this part I was able to set up using the Build Other Jobs option in Post-Build Actions, although any suggestion to improve this is also greatly appreciated).
Job 2,3 and 4 are regressions to be run on different test machines.
The part I am not able to figure out is this: only when Jobs 2, 3 and 4 are all successful should the downstream Job 5 be triggered, and eventually 5 can trigger 6.
I am currently using the Build Pipeline plugin, but I only got it working for one upstream job triggering many downstream jobs, and not vice versa.
Any help/suggestion/guidance is greatly appreciated. Thanks in advance!
Cheers!!
You can do this by using the 'Build after other projects are built' option in the 'Build Triggers' section of Job 5's configuration. There you add Job2, Job3 and Job4 as dependencies and select the option 'Trigger only if build is stable' (as in the image below). This should do the job and will wait for the three jobs to finish.
But, as you said, this does not accomplish the goal of executing Job5 only when Jobs 2, 3 and 4 are all successful (Job5 gets executed even if one of them failed). I think the best solution for your case is to create a new 'Pipeline' job (let's call it PipelineJob). This way you do not have to edit every single job's configuration and dependencies, and you can tweak your pipeline behaviour more easily. Also, thanks to error propagation, it will fail if some phase fails. This should work as intended:
pipeline {
agent any
stages{
stage('Phase 1') {
steps {
build 'Job1'
}
}
stage('Phase 2') {
parallel {
stage('Job 2') {
steps {
build 'Job2'
}
}
stage('Job 3') {
steps {
build 'Job3'
}
}
stage('Job 4') {
steps {
build 'Job4'
}
}
}
}
stage('Phase 3') {
steps {
build 'Job5'
}
}
stage('Phase 4') {
steps {
build 'Job6'
}
}
}
}
In addition to @CMassa's answer (yes, this works; thanks for the answer), I found the Join Plugin for Jenkins, and it also works great for this scenario.
We are using Jenkins and the Pipeline plugin for CI/CD. We have two pipelines that we need to run in parallel, and there is a downstream pipeline which should trigger ONLY if the two upstream pipelines both finish and are successful.
P1--
    | -- P3
P2--
Basically P3 should run only when P1 and P2 are finished and successful and not depend on just one of them.
Is there a way to achieve this? We are using version 2.5 of the plugin.
Since stages only run if previous stages run successfully, and since you can execute other pipelines via build, and since there is a magical instruction called parallel, I think this might do it:
pipeline {
agent { label 'docker' }
stages {
stage("build_p1_and_p2_in_parallel") {
steps {
parallel p1: {
build 'p1'
}, p2: {
build 'p2'
}
}
}
stage("build_p3_if_p1_and_p2_succeeded") {
steps {
build 'p3'
}
}
}
}
Use the "Snippet Generator" embedded in your jenkins instance to figure out what the argument to build should be. If it's another pipeline at the same level as the top level Jenkinsfile, you could just reference it by job name. Caveat: I've used parallel, but never build within parallel, but it seems like it should work.
You can try wrapping the pipeline jobs with the MultiJob plugin, which can implement the logic you require as two jobs inside a phase.
I have many Jenkins jobs that I need to run on every build.
At present I have 4 slave servers.
I would like the jobs to run in parallel as much as possible, hence I defined the jobs as follows:
Execute concurrent builds if necessary - Disabled
Restrict where this project can be run - Enabled with the following values SalveLinux1HT||SalveLinux2HT||SalveLinux3HT||SalveLinux4HT
To my understanding, if jobs A and B are triggered at the same time, one should use 1HT and the other should use 2HT, and they should run in parallel.
However, Jenkins builds job A on all 4 slaves and only after it has finished does it build job B on all 4 slaves.
This is the opposite of my goal
Any ideas?
Thanks in advance
You can use the Build Flow Plugin.
You can find both installation and configuration instructions for this plugin at the above-mentioned link.
If you want to run jobs in parallel, you can use the following script:
parallel (
    // job A and B will be scheduled in parallel.
    { build("jobA") },
    { build("jobB") }
)
// jobC will be triggered after jobs A and B are completed
build("jobC")