Jenkins: how to block a job to make it unrunnable

This is not just another question about concurrent job execution in Jenkins. The problem I have is that there are several jobs that run independently from one another. When they finish, it should be possible to run a manual job. The condition, though, is that all those automated jobs must be in a successful state. Otherwise it should not be possible to run this manual job. It should also not be possible to run, or even schedule a run of, this manual job while those other jobs are running.
I searched for the answer everywhere and checked every possible plugin that deals with synchronization, but I could not figure out how to solve the above problem.

IMHO the Delivery Pipeline Plugin (see https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin for the download and http://www.infoq.com/articles/orch-pipelines-jenkins for a thorough description) could do what you want.
You can run a lot of jobs (in parallel or not), and when (and only when) they succeed, run another job (or several). You can even add manual steps (requiring a button click before the pipeline may continue).
Everything is configurable, and quite stable at the moment.
No one is able to manually (or otherwise) start a job that is still waiting for other jobs to finish.
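
If you are on a recent Jenkins, a Pipeline (Jenkinsfile) expresses the same idea directly. This is only a minimal sketch of that approach, not the Delivery Pipeline plugin itself; the job names are placeholders:

// Minimal Jenkinsfile sketch (assumes the Pipeline plugin; job names are hypothetical).
// The automated jobs run in parallel; the manual job only becomes reachable
// once all of them have succeeded, behind an explicit input step.
pipeline {
    agent any
    stages {
        stage('Automated jobs') {
            parallel {
                stage('Job A') { steps { build job: 'job-a' } }
                stage('Job B') { steps { build job: 'job-b' } }
            }
        }
        stage('Manual job') {
            steps {
                input message: 'All automated jobs are green - run the manual job?'
                build job: 'manual-job'
            }
        }
    }
}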

Regarding this part of the question:
Otherwise it should not be possible to run this manual job. It should also not be possible to run, or even schedule a run of, this manual job if those other jobs are running.
You can use the Throttle Concurrent Builds Plugin and create a category which includes your automated jobs and the manual job.
If one automated job is running, it will be impossible to launch the manual job.
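
If you manage jobs with the Job DSL, the same category can be attached there. A sketch, assuming the Throttle Concurrent Builds plugin is installed and a category named 'exclusive-pipeline' is defined with "Maximum Total Concurrent Builds" set to 1 under Manage Jenkins (the job and category names are placeholders):

// Job DSL sketch: the automated job and the manual job share one throttle
// category, so only one of them can run at a time (the limit itself lives
// in the global category configuration).
['automated-job', 'manual-job'].each { name ->
    job(name) {
        throttleConcurrentBuilds {
            categories(['exclusive-pipeline'])
        }
    }
}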
Regarding your first question, did you have a look at the Join Plugin?
Cheers

The Promoted Builds Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Promoted+Builds+Plugin) can also be an option. Set up the promotions so that manual approval is required and the promotion only succeeds once the automated jobs are done.

Related

Jenkins job dsl causes a branch scan to happen for multibranchPipelineJob jobs on every run even if there are no changes

I think this may be related to the "id" field; previously we weren't setting that and had issues where the multibranch job reindexed all branches as new. That's fixed now, but there is another issue.
Every time the Job DSL runs, it causes a branch scan job to kick off for all of our multibranchPipelineJobs.
Why does this happen? Is there a way to prevent it? For a few jobs it's not a big deal, but we have almost 200 multibranchPipelineJobs, so a huge branch scan queue builds up every time the seed job is run. Also, according to CloudBees there is no way to increase the number of scan jobs Jenkins processes at a time, so it always takes forever to burn down.
Am I doing something wrong? This happens even if there are no changes, and frankly I don't think it should happen even if there are. I notice that if I modify the config of a Jenkins job and save it, it usually kicks off a branch scan job, so maybe this is just Jenkins behavior?
It seems like the ugliest way to handle this, but could the Job DSL kill scan jobs in the queue for the jobs it just configured, without affecting other scan jobs that aren't related to the seed job run?
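
For reference, the "id" mentioned above is set on the branch source; a minimal Job DSL sketch, with placeholder job name, id and repository URL:

// Job DSL sketch: give the branch source a stable id so the seed job does not
// recreate the source (and re-index every branch as new) on each run.
multibranchPipelineJob('my-service') {
    branchSources {
        git {
            id('my-service-source')                          // arbitrary, but must stay fixed
            remote('https://example.com/scm/my-service.git') // placeholder repository URL
        }
    }
}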

How to lock builds on circleci

The idea is that when 2-3 concurrent commits are pushed to a branch, CircleCI shouldn't start all build jobs for a given step. It should wait until the 1st job is finished and only then run the next one in the queue.
I have tried using the below links, but no luck. Please help.
* https://circleci.com/orbs/registry/orb/gastfreund/dynamo-lock?version=1.0.1
* https://circleci.com/orbs/registry/orb/freighthub/lock
You can take a look at https://circleci.com/developer/orbs/orb/eddiewebb/queue#usage-examples as it seems quite robust, is still maintained and has pretty good documentation. It also lets you define a custom job and queue the entire workflow or just specific jobs.

How to run a job concurrently in Jenkins

I am using the Throttle Concurrent Builds plugin to run a job in parallel, but I am not able to run the job in parallel; only a single build is triggered.
In the job configuration I selected Throttle Concurrent Builds and specified Maximum Total (e.g. 4) and/or Maximum Per Node (e.g. 2),
and also selected the "Execute concurrent builds if possible" option.
I have one master (2 executors) and one agent (2 executors) in Jenkins.
Kindly help me to resolve this problem.
From the Throttle Concurrent Builds Plugin documentation:
It should be noted that Jenkins, by default, never executes the same job in parallel, so you do not need to actually throttle anything if you go with the default. However, there is the option "Execute concurrent builds if necessary", which allows running the same job multiple times in parallel, and of course if you use the categories below, you will also be able to restrict multiple jobs.
So you need to check that box, which I think might be in the advanced settings.
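
For completeness, the same settings can be applied from the Job DSL; a sketch, assuming the Throttle Concurrent Builds plugin is installed and using a placeholder job name:

// Job DSL sketch: enable "Execute concurrent builds if necessary" (the checkbox
// the answer refers to) and set the throttle limits from the question.
job('parallel-test-job') {
    concurrentBuild()
    throttleConcurrentBuilds {
        maxTotal(4)
        maxPerNode(2)
    }
}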

Using a lock in a Jenkins Workflow Job

I want to use a lock in a Workflow job in order to prevent jobs from running at the same time on the same node.
I want to use the functionality of the Locks and Latches plugin to control the parallel execution of jobs: when Job A starts building on a specific node, Job B should wait until A is done, and then B can run.
How can I achieve that? Or is there another solution (in case locks are not supported in Workflow jobs)?
Thank you.
What exactly are you trying to prevent? The easiest way would be to give each node only 1 executor. If you do this, the node will only ever run one job at a time. Note that some flyweight tasks may still run, but generally these are insignificant and involve polling the remote SCM repository and such.
If you just mean within the same workflow, you can use the parallel step to split work into parallel sections and then combine the results.
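
As an illustration of that second suggestion, a scripted Pipeline sketch (the branch names, node label and shell commands are made up):

// Scripted Pipeline sketch: two branches of the same workflow run in parallel,
// each on its own executor; the job continues once both have finished.
parallel(
    'build': {
        node('linux') {
            sh 'make build'
        }
    },
    'docs': {
        node('linux') {
            sh 'make docs'
        }
    }
)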

jenkins job on two slaves?

We need to be able to run a Jenkins job that consumes two slaves. (Or two jobs, if we can guarantee that they run at the same time and at least one knows about the other.) The situation is that we have a heavyweight application that we need to run tests against. The tests run on one machine, the application runs on another. It's not practical to have them on the same host.
Right now we have a Jenkins job that uses a script to bring a dedicated application server up, install the correct version and the correct data, and then run the tests against it. That means I can't use the dedicated application server for other tasks when the heavyweight testing isn't going on. It also pretty much limits us to one loop. Being able to assign the app server dynamically would allow more of them.
There's clearly no way to do this in core Jenkins, but I'm hoping there's some plugin or hackery to make this possible. The current test build is a Maven 2 job, but that's configurable if we have to wrap it in something else. It's kicked off by the successful completion of another job, which could be changed to start two, or whatever else is required.
I just learned that simultaneous allocation of multiple slaves can be done nicely in a Pipeline job by nesting node clauses:
node('label1') {
    node('label2') {
        // your code here
        [...]
    }
}
See this question where Mateusz suggested that solution for a similar problem.
Let me see if I understood the problem.
You want to dynamically choose a slave and start the app server on it.
When the app server is running on a slave, you do not want that slave to run any other job.
But when the app server is not running, you want to use that slave like any other slave for other jobs.
One way out would be to label the slaves, and use "Restrict where this project can be run" to make the app server and the test suite run on the machines with that label.
Then, on the slave nodes, set "# of executors" to 1. This will make sure that at any time only one job will run.
The next step would be to create a job that starts the app server and then kicks off the test job once the app server start job is successful.
If your test job needs to know the server details of the machine where your app server is running, then it becomes interesting.
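
To give an idea of that last point, a nested-node sketch in which the test side learns where the app server is running; the labels, scripts and port are placeholders:

// Scripted Pipeline sketch (assumes nodes labelled 'app-server' and 'test').
// The outer node starts the application and captures its hostname; the inner
// node runs the tests against that host, and the server is stopped afterwards.
node('app-server') {
    def appHost = sh(script: 'hostname -f', returnStdout: true).trim()
    sh './start-app-server.sh'
    try {
        node('test') {
            sh "./run-tests.sh --target http://${appHost}:8080"
        }
    } finally {
        sh './stop-app-server.sh'
    }
}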
