Jenkins Build Queue Limit

I've noticed that there seems to be a build queue limit of one in Jenkins. When I trigger a lot of builds, it seems to place at most one build in the build queue. Is there a way to remove this limit so there can be more than one build in the build queue?

This is intended behaviour:
Normally, your jobs will depend on some input (from SCM, or from some upstream jobs).
If your slave capacity is too low to catch up with each and every build, then you'd normally want to test/build/... only the very latest "item".
This is the default behaviour. Without that, there'd be a risk that the build queue grows indefinitely.
On top of that, Jenkins does not track the properties of normal build requests -- they all look the same, and Jenkins cannot (for example) separate different SCM states that existed at different triggering times.
This, however, is exactly the point that gives you a workaround: parameterize your jobs, and then trigger them with, for example, the "Trigger parameterized build on other projects" post-build action. Jenkins will then queue each build request individually -- and inside your job, you can use the parameter to find out what exactly has to be done.
Jenkins will squash queued parameterized builds that have identical parameter values (thanks to user "atline" for checking).
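For illustration, here is a minimal scripted Pipeline sketch of that workaround; the downstream job name acceptance-build and the GIT_SHA parameter are hypothetical placeholders, not anything Jenkins provides by default:
node {
    // With a recent Git plugin, checkout returns a map that includes GIT_COMMIT.
    def scmVars = checkout scm
    // Each distinct GIT_SHA value queues as its own build request;
    // requests with identical parameter values may still be squashed.
    build job: 'acceptance-build',
          parameters: [string(name: 'GIT_SHA', value: scmVars.GIT_COMMIT)],
          wait: false
}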

Related

Jenkins - Job Y is configured to run after Job X, but is not triggered

I have a Jenkins server on K8s (using Rancher).
Job Y is configured to run after job X (image is attached).
Job X runs successfully, but at its end, job Y is not triggered.
Both jobs are on the same Jenkins master.
This problem has never happened to me before on our on-prem Jenkins servers.
I have checked everything I could think of, including creating test jobs that basically do nothing, just to test whether triggering works for other jobs on this master; it does not.
Checking the logs: job X is in the log (image is also attached), but job Y is missing from the log completely.
I will mention that on another master on the same cluster, triggering is working. Any ideas?
Check the trigger condition for job Y.
Below are several ways you can trigger jobs in Jenkins:
Chain jobs together
Chain Jenkins jobs together using the “Build other projects” option in the “Post-build Actions” of a job. The downside to this is that it introduces a dependency between jobs. For example, I want to be able to run the database backup job without doing a deployment every time.
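If you manage jobs as code, the same post-build chaining can be sketched with the Job DSL plugin (assuming it is installed; the job names and shell step below are placeholders):
// Hypothetical seed-script fragment: 'deploy' triggers 'database-backup'
// via the "Build other projects" post-build action, creating the dependency described above.
job('deploy') {
    steps {
        shell('./deploy.sh')
    }
    publishers {
        downstream('database-backup')
    }
}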
Parameterized Trigger Plugin
When you have the Parameterized Trigger Plugin installed:
Create a wrapper job for your sequential jobs
For each sequential job
Select Build->Add build step->Trigger/call builds on other projects
Enter the sequential job name
Check the ‘Block until the triggered projects finish their builds’ checkbox (this only appears when you have the Parameterized Trigger Plugin installed)
Now when you run the wrapper job, all the triggered jobs will be run in order and sequentially. You of course also have the option of using the parameters functionality of the plugin too.
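The Pipeline equivalent of such a wrapper job is simply a series of blocking build steps; a rough sketch (the job names and the parameter are placeholders):
node {
    // Each build call blocks until the triggered job finishes (wait defaults to true),
    // so the jobs run sequentially and in order.
    build job: 'job-one'
    build job: 'job-two', parameters: [string(name: 'ENV', value: 'staging')]
    build job: 'job-three'
}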
Throttle Concurrent Builds Plugin
The docs for the Throttle Concurrent Builds Plugin seem to be a little lacking, but it works well. With it installed, a “Throttle Concurrent Builds” section shows up in the Jenkins config section (Jenkins -> Manage Jenkins -> Configure System). There, you create a “Multi-Project Throttle Category”
Next, for each job that needs to be run sequentially, you
Check the ‘Throttle Concurrent Builds’ check box
Select the ‘Throttle this project as part of one or more categories’ radio button
Check the checkbox of the “Multi-Project Throttle Category” you created above
Save
Finally, you create a wrapper job that triggers all your to-be-run-sequentially jobs and when you run it, you should see your jobs running sequentially and in order.
Although a fairly complex option, this approach also allows you to run some jobs sequentially and others in parallel. For example, your wrapper job can run jobs A and B sequentially and jobs C and D sequentially, but trigger A and C in parallel. You can also combine this plugin with the Parameterized Trigger Plugin for maximum flexibility.
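If your version of the Throttle Concurrent Builds Plugin includes its Pipeline support, the category can also be applied from a Jenkinsfile; a hedged sketch, where the category name 'sequential-jobs' must match one defined in Configure System and the shell step is a placeholder:
// Only as many builds as the category allows run at once; the rest wait in the queue.
throttle(['sequential-jobs']) {
    node {
        sh './run-step.sh'
    }
}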
Join Plugin
Although I haven’t used the Join Plugin myself, it seems worth mentioning. Its docs state that
This plugin allows a job to be run after all the immediate downstream jobs have completed. In this way, the execution can branch out and perform many steps in parallel, and then run a final aggregation step just once after all the parallel work is finished.
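In Pipeline terms, the same fan-out/fan-in idea looks roughly like this (the job names are placeholders):
node {
    // branch out: run the downstream builds in parallel
    parallel(
        linux:   { build job: 'build-linux' },
        windows: { build job: 'build-windows' }
    )
    // fan in: runs once, only after both parallel branches have completed
    build job: 'aggregate-results'
}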
That’s it. Several options for running Jenkins jobs in parallel. Choose the one that best suits your needs…

Jenkins - Purge build history logs - Configure a job to keep just the last 100 builds' history and delete the rest

I have a Jenkins job that runs periodically (every minute).
As a result, I end up with thousands of logs that don't really matter to me and that overload the disk space.
Is there a way to configure the job so that it keeps just the last 100 builds and deletes the older ones?
I know this is possible manually, but I am looking for a way where I don't have to do it myself every time; I want the job, or another one, to do it automatically.
You do not need to do that manually; you just configure the number of builds to retain when creating the job. This can also be done when you create the job through a REST API: all you have to do is set the appropriate values in the job's config.xml. You configure the job once and never have to worry; Jenkins automatically takes care of the clean-up.
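If the job is (or becomes) a Pipeline job, the same retention policy can be declared directly in the Jenkinsfile; a minimal sketch, assuming you want to keep the last 100 builds:
// Keep only the most recent 100 builds; older ones are discarded automatically.
properties([
    buildDiscarder(logRotator(numToKeepStr: '100'))
])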
Note
Once you configure the job, the next run will apply the threshold and delete the excess build logs.
Also, since LTS 2.204.6, Jenkins supports globally configured build discarders that delete old builds not marked as "keep forever", even if a project has no per-project build discarder configured (or only a less aggressive one); they are executed periodically and after a build finishes.

How to execute only the most recent queued job in Jenkins?

I've got a commit build project in Jenkins which schedules an acceptance build project on completion. Since commits come in faster than the acceptance build job finishes, after a short time there are now six queued acceptance build jobs. I would like the acceptance build project to work like the "Poll SCM" functionality: on completion, start the most recently queued job and skip the rest.
I can't use "Build after other projects are built" without more hacks, since I need to pass information from the commit build job to the acceptance build job.
@l0b0,
Jenkins' behaviour is to coalesce builds so that the queue only contains the currently running build and one enqueued request. The queue depth only increases if a newly enqueued build takes parameters that differ from what's already in the queue.
So I'm gathering that your downstream (acceptance) job takes some sort of parameters, but you need to supply more details of how it's working.
If you're using the Parameterized Trigger plugin, then you should check out this existing SO thread.
More generally speaking, you should look into your parameters. It sounds like you are passing too much information from upstream to downstream jobs, resulting in the Jenkins queue treating them as distinct requests when that is not necessarily the case.
Are you passing in the run number of the last successful upstream job as a parameter? If so, then yeah you've got problems. What you should do instead is use the Promoted Build Plugin on the upstream job to mark the last successful build, and then have the downstream job simply jump to the most recent promoted build.
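To illustrate the coalescing behaviour described above, passing a coarse-grained value instead of a per-build one keeps queued requests identical so Jenkins can squash them; the job and parameter names in this sketch are hypothetical:
// Passing e.g. the branch name (stable across commits) instead of a build number
// lets Jenkins treat the queued downstream requests as duplicates and coalesce them.
build job: 'acceptance-build',
      parameters: [string(name: 'BRANCH', value: env.BRANCH_NAME ?: 'master')],
      wait: false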
Hope that helps.

Jenkins - Mutualize SCM polling

Using Jenkins, I am running two builds (Linux + Windows) and one Doxygen job.
At the moment, I am using three separate SCM polling triggers pointing to the same source code.
How can I use a single trigger for all three jobs, provided that I still want to get separate statuses?
For the record, the underlying SCM is Git.
Off the top of my head, some solutions which might do what you are looking for:
Instead of setting an SCM trigger, use a post-receive hook in your repository, which can notify Jenkins that there are new changes (see: https://wiki.jenkins-ci.org/display/JENKINS/Git+Plugin#GitPlugin-Pushnotificationfromrepository). This way Jenkins doesn't have to constantly poll the repository (multiple times for the different jobs), and the trigger is faster, since there is no waiting for the next polling: when there is a push, a build will be started.
Use an extra job that does nothing else but hold the SCM polling trigger and start all three original jobs, without waiting for any of them to finish (see the sketch after this list).
If the configuration is similar for all three jobs, you could consider creating a single project with a matrix configuration. Roughly, what it does is let you define a variable for the build type, with values like linux, windows, doxygen. When the job is triggered, it starts multiple builds with all the possible values; of course, you have to set up the job so that the current parameter changes the build process according to what needs to be done. Actually, I haven't had to use a matrix configuration yet, so my example may not be the best, but you can probably find lots of examples on the Jenkins wiki if you think this is a good direction.
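A rough Pipeline sketch of the second option: a single wrapper job that holds the only SCM polling trigger and fans out to the three real jobs without waiting for them (job names are placeholders):
node {
    build job: 'build-linux',   wait: false
    build job: 'build-windows', wait: false
    build job: 'doxygen',       wait: false
}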

Jenkins - Running instances of single build concurrently

I'd like to be able to run several builds of the same Jenkins job simultaneously.
Example:
Build [jenkins_job_1]: calls an ant script with parameter 'A'
Build [jenkins_job_1]: calls an ant script with parameter 'B'
Repeat as necessary.
Each instance of the job runs simultaneously, rather than through a queue.
The reason I'd like to do this is to avoid having to create several jobs that are nearly identical, all of which would need to be maintained.
Is there a way to do this, or maybe another solution (i.e., dynamically create a job from a base job and remove it after it's finished)?
Jenkins has a check box: "Execute concurrent builds if necessary"
If you check this, then it'll start multiple builds for a job.
This works with the "This build is parameterized" checkbox.
You would still trigger the builds, passing your A or B as parameters. You can use another job to trigger them or you could do it manually via a script.
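A hedged sketch of triggering those concurrent parameterized runs from another Pipeline job or script; the job name jenkins_job_1 comes from the question, while the parameter name ANT_PARAM is assumed:
node {
    // Fire both runs without waiting, so they can execute concurrently
    // (requires "Execute concurrent builds if necessary" on jenkins_job_1).
    ['A', 'B'].each { value ->
        build job: 'jenkins_job_1',
              parameters: [string(name: 'ANT_PARAM', value: value)],
              wait: false
    }
}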
You can select Build a Multi-configuration project (Matrix build) when you create the job. Then, under the job's configuration, you can define the Configuration Matrix which lets you specify one or more parameters (axes) for different builds. Regarding running simultaneously, you should be able to run as many simultaneous builds as you have executors (with the appropriate label).
Unfortunately, the Jenkins wiki lacks documentation about this setup. There are a couple previous SO questions, here and here, that might provide a little guidance. There was a "recent" blog post about setting up a multi-configuration job to perform builds on various platforms.
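For reference, newer Jenkins versions also offer a declarative Pipeline matrix directive that expresses the same one-job-many-configurations idea in code; a hedged sketch, with the axis name, values and ant invocation as placeholders:
pipeline {
    agent any
    stages {
        stage('Build') {
            matrix {
                axes {
                    axis {
                        name 'VARIANT'
                        values 'A', 'B'
                    }
                }
                stages {
                    stage('Ant build') {
                        steps {
                            // one cell per axis value, run in parallel where executors allow
                            sh 'ant -Dvariant=$VARIANT build'
                        }
                    }
                }
            }
        }
    }
}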
A newer (and better) solution is the Jenkins Job DSL Plugin.
We've been using it with great success. Our job configurations are now disposable... we can set up a huge stack of complicated jobs from some groovy files and a couple template jobs. It's great.
I'm liking it a lot more than the matrix builds, which were complicated and harder to understand.
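A small, hedged Job DSL sketch of what a seed script can generate from a template-like loop; the job names and the ant invocation are placeholders:
// Generates one nearly identical job per variant instead of maintaining copies by hand.
['A', 'B'].each { variant ->
    job("my-build-${variant}") {
        concurrentBuild()                             // allow concurrent runs
        steps {
            shell("ant -Dvariant=${variant} build")   // same ant script, different parameter
        }
    }
}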
Nothing is stopping you from doing this using the Jenkins Pipeline DSL.
We have the same pipeline running in parallel in order to model combined loads for an application that exposes web services, provides a database to several external applications, receives data via several work queues and has a GUI front end. The business gives us non-functional requirements (NFRs) which our application must meet that guarantees its responsiveness even at busy times.
The different instances of the pipeline are run with different parameters. The first instance might be WS_Load, the second GUI_Load and the third Daily_Update_Load, modelling a large data queue that needs processing within a certain time-frame. More can be added depending on which combination of loads we're wanting to test.
Other answers have talked about the checkboxes for concurrent builds, but I wanted to mention another issue: resource contention.
If your pipeline uses temporary files or stashes files between pipeline stages, the instances can end up pulling the rug out from under each other's feet. For example, you can end up overwriting a file in one concurrent instance while another instance expects to find the pre-overwritten version of the same stash. We use the following code to ensure stashes and temporary filenames are unique per concurrent instance:
def concurrentStash(stashName, String includes) {
    /* make a stash unique to this pipeline and build
       that can be unstashed using concurrentUnstash() */
    echo "Safe stashing $includes in ${concurrentSafeName(stashName)}..."
    stash name: concurrentSafeName(stashName), includes: includes
}

def concurrentSafeName(name) {
    /* make a name or name component unique to this pipeline and build
     * guards against contention caused by two or more builds from the same
     * Jenkinsfile trying to:
     * - read/write/delete the same file
     * - stash/unstash under the same name
     */
    "${name}-${BUILD_NUMBER}-${JOB_NAME}"
}

def concurrentUnstash(stashName) {
    echo "Safe unstashing ${concurrentSafeName(stashName)}..."
    unstash name: concurrentSafeName(stashName)
}
We can then use concurrentStash stashName and concurrentUnstash stashName and the concurrent instances will have no conflict.
If, say, the two pipelines both need to store stats, we can do something like this for filenames:
def statsDir = concurrentSafeName('stats')
and then the instances will each use a unique filename to store their output.
You can create a build and configure it with parameters. Click the This build is parameterized checkbox and add your desired param(s) in the Configuration of the build. You can then fire off simultaneous builds using different parameters.
Side note: The "Bulk Builder" in Jenkins might push it into a queue, but there's also a This bulk build is parameterized checkbox.
I was having a pretty large build queue, and I performed the steps below to run jobs in parallel in Jenkins and reduce the number of jobs waiting in the queue:
For each job, navigate to Configure and select the checkbox "Execute concurrent builds if necessary".
Navigate to Manage Jenkins -> Configure System, look for "# of executors", and set the number of parallel executors you want (in my case it was set to 0 and I updated it to 2). A Script Console equivalent is sketched below.
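For completeness, a hedged Script Console equivalent of the executor setting (Manage Jenkins -> Script Console); the value 2 is just the example from above:
import jenkins.model.Jenkins

// Set the number of executors on the built-in node and persist the change.
Jenkins.instance.numExecutors = 2
Jenkins.instance.save()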
