Disabling cloning of a particular Jenkins job - jenkins

We have some jobs which shouldn't be duplicated.
Is it possible to disable cloning (copying) of a particular job in Jenkins?

The only thing that comes close in the Matrix Authorization Strategy Plugin is the Job > Create permission column in the permission matrix (a checkbox per user or group).

Related

Jenkins jobs to run concurrently with conditions

I have a Jenkins job that does something (by calling an Ansible playbook) on an environment (a user-selected parameter). I want to set up the job to run in a conditional concurrent mode, so that builds can run concurrently only for different environments.
I saw there was a similar question:
Jenkins - Stop concurrent job with same parameter
posted 6 years ago and I did what the recommendation said, but when I tried with the plugins, I do not see the:
Builder: Set the build result
Result: Aborted
for the first Conditional step (single) in the answer.
So has something changed since then? Are there any new solutions for my case?
Thanks!

Custom metrics for jenkins job

My Jenkins job's "Build" configuration has:
3 Execute shell build tasks
2 post-build actions
Is there any way to get the runtime of each build task and post-build action?
Not directly, which is why:
I use the Timestamper plugin to add timestamps to the logs, which lets me extract each task's duration.
There are plugins like additional-metrics-plugin or build-metrics which collect job duration (but not necessarily task or post-build task duration): they could be a good starting point for writing a custom plugin that collects what you need.

Is it possible to pause a stage until a time is reached in a Jenkins declarative pipeline?

I have a requirement to only allow deployments to production in a specific time range, e.g.: between 04:00 and 06:00.
My Jenkins pipeline is triggered by a Git webhook and starts a flow consisting of the usual stages (Build, Test, Deploy Dev, Deploy Pre, Deploy Pro).
When we reach the Deploy Pro stage I would need to check whether the current time is in the allowed range to go on. Otherwise I need to pause the pipeline (freeing the Jenkins agent) until that time range is reached.
Is it possible to implement this? How?
To do this, you can use the sleep step, which accepts different units: NANOSECONDS, MICROSECONDS, MILLISECONDS, SECONDS, MINUTES, HOURS, DAYS (by default it's SECONDS).
E.g. sleep(time: 3, unit: "SECONDS") (or just sleep 3 - it's the same).
So, check whether the current time is in the allowed range, and if it's not, pass the sleep step a duration of 04:00 - <current time> (you need a bit of arithmetic here to get the right value in seconds or minutes, but it shouldn't be hard).
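A minimal sketch of that arithmetic, assuming a script block inside the Deploy Pro stage and the 04:00-06:00 window from the question (note that sleep keeps the executor busy while it runs inside an agent/node block, and the java.time calls may need script approval in the Groovy sandbox):

def now   = java.time.LocalTime.now()
def start = java.time.LocalTime.of(4, 0)   // window start, 04:00
def end   = java.time.LocalTime.of(6, 0)   // window end, 06:00
if (now.isBefore(start) || !now.isBefore(end)) {
    // seconds until the next 04:00; add a day if 04:00 is already past today
    long waitSeconds = java.time.Duration.between(now, start).seconds
    if (waitSeconds < 0) {
        waitSeconds += 24 * 60 * 60
    }
    echo "Outside the deployment window, sleeping ${waitSeconds} seconds"
    sleep(time: waitSeconds as int, unit: 'SECONDS')
}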
But if all the other stages are quite fast, I recommend using the Poll SCM option (with a restricted schedule) for the whole pipeline, because it's much easier and the result will be the same: the job will only finish after 04:00.
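For the Poll SCM variant, the restricted schedule is ordinary Jenkins cron syntax; in a declarative pipeline it might look roughly like this (the 4-5 hour range is just an illustration of the 04:00-06:00 start window):

triggers {
    // poll the SCM only between 04:00 and 05:59, so builds can only start inside the window
    pollSCM('H/5 4-5 * * *')
}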
Since your pipeline is triggered by a Git webhook and runs the usual stages (Build, Test, Deploy Dev, Deploy Pre, Deploy Pro), there are two options:
Option 1: skip the deployment stage using a when condition if the current time is not inside the allowed window (see the sketch below).
Option 2: separate the pipelines for CI and CD, and have the CD pipeline trigger only during that window, or keep it paused in a stage using a shell command.
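A rough sketch of Option 1, assuming a declarative pipeline and the 04:00-06:00 window (the controller's timezone is an assumption here, and Date.format may need script approval in the sandbox):

stage('Deploy Pro') {
    when {
        // run the stage only if the current hour is 04 or 05
        expression {
            def hour = new Date().format('HH') as int
            return hour >= 4 && hour < 6
        }
    }
    steps {
        // ... deployment steps here ...
    }
}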

Jenkins - Build Pipeline - Showing unwanted jobs after using Join Plugin

I'm trying to set up Jenkins as follows:
Test Job --> (Test Job 1 & Test Job 2 in parallel) --> Test Job 3 --> Test Job 4
I have this working at present using the Join Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Join+Plugin) and Build Pipeline Plugin.
However, the Build Pipeline view unnecessarily shows 2 x Test Job 3 and 2 x Test Job 4 after the join, see below:
The setup for each job (Test Job, Test Job 1 & 2, Test Job 3, Test Job 4) is as follows:
I would like to remove the "blue" versions of Test Job 3 and Test Job 4 from my Build Pipeline after the two parallel processes finish.
Anybody able to help me to remove these?
Cheers
Try the Build Flow plugin.
It handles both parallel and sequential jobs.
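For this particular layout, a Build Flow DSL script might look roughly like this (a sketch; the job names are taken from the question):

build("Test Job")
// Test Job 1 and Test Job 2 run in parallel
parallel (
    { build("Test Job 1") },
    { build("Test Job 2") }
)
// these run only after both parallel branches have finished
build("Test Job 3")
build("Test Job 4")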
I recommend using the Multijob Plugin alone without the Build Pipeline Plugin.
The Multijob Plugin gives you the functionality of the Join Plugin, and its configuration is more straightforward. I actually prefer how it displays my running build.
You can put a multijob into a build pipeline, but the placement of the jobs within the pipeline is wrong: the jobs within the multijob are displayed vertically in alphabetical order (not build order). On the positive side, everything else seems to work, so this should be easy to fix. I reported this problem as Jenkins bug 22074.
The Build Pipeline plugin supports custom CSS; maybe you could make it invisible via CSS.
You can use the Build Pipeline plugin together with the Multijob plugin. Just use the Multijob plugin as a substitute for the Join plugin. Basically, the Multijob plugin will only be used to make certain jobs execute simultaneously.
If you do it this way, the Build Pipeline view won't get messed up.
This is how it looks in Pipeline Build view
build-bv-docker-images is a Multijob plugin Job.
build-(activemq|postgres|tomcat|wildfly)-bv_image are simple jobs used for building docker images
deploy-staging is a job which is triggered after the build-bv-docker-images job. Logically speaking it should appear right after the stack of build-*-bv_image jobs, but it appears as part of that stack. Nevertheless, it waits until all jobs of that stack are completed. I had to prefix the deploy-staging job with a + sign to make it appear at the top of the stack. It looks awkward, but it's still better than seeing the deploy-staging job at the bottom of the stack.
This is how build-bv-docker-images multijob is configured

How do I make a Jenkins job start after multiple simultaneous upstream jobs succeed?

In order to get the fastest feedback possible, we occasionally want Jenkins jobs to run in parallel. Jenkins has the ability to start multiple downstream jobs (or 'fork' the pipeline) when a job finishes. However, Jenkins doesn't seem to have any way of making a downstream job start only if all branches of that fork succeed (or 'joining' the fork back together).
Jenkins has a "Build after other projects are built" button, but I interpret that as "start this job when any upstream job finishes" (not "start this job when all upstream jobs succeed").
Here is a visualization of what I'm talking about. Does anyone know if a plugin exists to do what I'm after?
Edit:
When I originally posted this question in 2012, Jason's answer (the Join and Promoted Build plugins) was the best, and the solution I went with.
However, dnozay's answer (the Build Flow plugin), which became popular a year or so after this question, is a much better answer. For what it's worth, if people ask me this question today, I now recommend that instead.
Pipeline plugin
You can use the Pipeline Plugin (formerly workflow-plugin).
It comes with many examples, and you can follow this tutorial.
e.g.
// build
stage 'build'
...
// deploy
stage 'deploy'
...
// run tests in parallel
stage 'test'
parallel 'functional': {
...
}, 'performance': {
...
}
// promote artifacts
stage 'promote'
...
Build flow plugin
You can also use the Build Flow Plugin. It is simply awesome - but it is deprecated (development frozen).
Setting up the jobs
Create jobs for:
build
deploy
performance tests
functional tests
promotion
Setting up the upstream
in the upstream (here build) create a unique artifact, e.g.:
echo ${BUILD_TAG} > build.tag
archive the build.tag artifact.
record fingerprints to track file usage; if any job copies the same build.tag file and records fingerprints, you will be able to track the parent.
Configure it to get promoted when the promotion job is successful.
Setting up the downstream jobs
I use 2 parameters PARENT_JOB_NAME and PARENT_BUILD_NUMBER
Copy the artifacts from upstream build job using the Copy Artifact Plugin
Project name = ${PARENT_JOB_NAME}
Which build = ${PARENT_BUILD_NUMBER}
Artifacts to copy = build.tag
Record fingerprints; that's crucial.
Setting up the downstream promotion job
Do the same as above to establish the upstream-downstream relationship.
It does not need any build step. You can perform additional post-build actions like "hey QA, it's your turn".
Create a build flow job
// start with the build
parent = build("build")
parent_job_name = parent.environment["JOB_NAME"]
parent_build_number = parent.environment["BUILD_NUMBER"]
// then deploy
build("deploy")
// then your qualifying tests
parallel (
    { build("functional tests",
            PARENT_BUILD_NUMBER: parent_build_number,
            PARENT_JOB_NAME: parent_job_name) },
    { build("performance tests",
            PARENT_BUILD_NUMBER: parent_build_number,
            PARENT_JOB_NAME: parent_job_name) }
)
// if nothing failed till now...
build("promotion",
      PARENT_BUILD_NUMBER: parent_build_number,
      PARENT_JOB_NAME: parent_job_name)
// knock yourself out...
build("more expensive QA tests",
      PARENT_BUILD_NUMBER: parent_build_number,
      PARENT_JOB_NAME: parent_job_name)
good luck.
There are two solutions that I have used for this scenario in the past:
Use the Join Plugin on your "deploy" job and specify "promote" as the targeted job. You would have to specify "Functional Tests" and "Performance Tests" as the joined jobs and start them in some fashion post-build. The Parameterized Trigger Plugin is good for this.
Use the Promoted Builds Plugin on your "deploy" job, specify a promotion that fires when the downstream jobs are completed, and specify the Functional and Performance test jobs. As part of the promotion action, trigger the "promote" job. You still have to start the two test jobs from "deploy".
There is a CRITICAL aspect to both of these solutions: fingerprints must be correctly used. Here is what I found:
The "build" job must ORIGINATE a new fingerprinted file. In other words, it has to fingerprint some file that Jenkins thinks was originated by the initial job. Double check the "See Fingerprints" link of the job to verify this.
All downstream linked jobs (in this case, "deploy", "Functional Tests" and "Performance tests") need to obtain and fingerprint this same file. The Copy Artifacts plugin is great for this sort of thing.
Keep in mind that some plugins allow you to change the order of fingerprinting and downstream job starting; in this case, the fingerprinting MUST occur before a downstream job fingerprints the same file to ensure the ORIGIN of the fingerprint is properly set.
The Multijob plugin works beautifully for that scenario. It also comes in handy if you want a single "parent" job to kick off multiple "child" jobs but still be able to execute each of the children manually, by themselves. This works by creating "phases", to which you add 1 to n jobs. The build only continues when the entire phase is done, so if a phase has multiple jobs, they all must complete before the rest are executed. Naturally, it is configurable whether the build continues if there is a failure within the phase.
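If you script your jobs with the Job DSL plugin, a Multijob parent with phases can be sketched roughly like this (job names are illustrative, and the exact syntax depends on your Multijob and Job DSL plugin versions):

multiJob('parent-build') {
    steps {
        phase('Tests') {
            // a phase finishes only when all of its jobs have finished
            phaseJob('functional-tests')
            phaseJob('performance-tests')
        }
        phase('Promote') {
            // runs only after the Tests phase has completed
            phaseJob('promotion')
        }
    }
}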
Jenkins recently announced first class support for workflow.
I believe the Workflow Plugin is now called the Pipeline Plugin and is the (current) preferred solution to the original question, inspired by the Build Flow Plugin. There is also a Getting Started Tutorial in GitHub.
The answers by Jason & dnozay are good enough. But in case someone is looking for an easy way, just use the JobFanIn plugin.
This diamond dependency build pipeline can be configured with the DepBuilder plugin. DepBuilder uses its own domain-specific language, which in this case would look like:
_BUILD {
// define the maximum duration of the build (4 hours)
maxDuration: 04:00
}
// define the build order of the existing Jenkins jobs
Build -> Deploy
Deploy -> "Functional Tests" -> Promote
Deploy -> "Performance Tests" -> Promote
After building the project, the build visualization will be shown on the project dashboard page:
If any of the upstream jobs didn't succeed, the build will be automatically aborted. The abort behavior can be tweaked on a per-job basis; for more info see the DepBuilder documentation.
