I have a job that takes a while to complete and is quite resource heavy. I only ever want to run one instance of the job at a time, and only if there are new SCM changes.
I set it up with an SCM trigger and left the "Execute concurrent builds if necessary" checkbox unchecked.
The problem I have is that it queues up the next build on the first SCM change it detects.
What I want is for the SCM trigger, whenever it sees a change, to replace the build it has queued up with a new one at the newer SCM revision. I don't want it starting a new instance for every change, and I don't want a ton of builds queued up.
Basically, I want the job to run in a loop, pausing only when there are no SCM changes.
How do I do that?
You might be looking for the Quiet period option, which is under the Advanced Project Options section.
You can set the Quiet period value (in seconds) as per your requirement. This will help you reduce the frequency of builds.
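For reference, a minimal sketch of how that setting ends up in a freestyle job's config.xml (the 300-second value is just an example; configuring it through the UI and reading back the config.xml is the safest way to see the exact layout for your Jenkins version):

<!-- excerpt from a freestyle job's config.xml -->
<project>
  <!-- other job settings omitted -->
  <!-- wait this many seconds after a trigger before the build actually starts,
       so several quick SCM changes collapse into one build -->
  <quietPeriod>300</quietPeriod>
</project>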
I have a Jenkins job that runs periodically (every minute).
As a result, I end up with thousands of build logs that don't really matter to me and overload the disk space.
Is there a way to configure the job so that it keeps just the last 100 builds and deletes the older ones?
I know this is possible manually, but I am looking for a way that doesn't require me to do it myself every time; I want the job, or another one, to do it automatically.
You do not need to do that manually; you just configure the job to retain a set number of builds when creating it. This can also be done when you create the job through the REST API: all you have to do is set the appropriate values in the job's config.xml. You configure the job once and never have to worry; Jenkins automatically takes care of the clean-up.
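As a sketch (element names can vary a little between Jenkins versions, so the safest approach is to configure one job through the UI and copy whatever appears in its config.xml), the "Discard old builds" setting keeping the last 100 builds looks roughly like this:

<!-- excerpt from a job's config.xml: keep only the last 100 builds -->
<project>
  <properties>
    <jenkins.model.BuildDiscarderProperty>
      <strategy class="hudson.tasks.LogRotator">
        <daysToKeep>-1</daysToKeep>                  <!-- -1 = no age-based limit -->
        <numToKeep>100</numToKeep>                   <!-- keep the last 100 builds -->
        <artifactDaysToKeep>-1</artifactDaysToKeep>
        <artifactNumToKeep>-1</artifactNumToKeep>
      </strategy>
    </jenkins.model.BuildDiscarderProperty>
  </properties>
</project>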
Note
Once you configure the job, the next run will apply the threshold and delete the excess build logs.
Also, since LTS 2.204.6 there are globally configured build discarders that delete old builds not marked as "keep forever", even if there is no per-project build discarder configured (or only a less aggressive one); they are executed periodically and after a build finishes.
Is it possible to prevent a periodically scheduled Jenkins job from running if there are no SCM changes since the last build?
For example,
There is a nightly job that creates a build. The completion of this job triggers (as the upstream project) the automation testing job for that build.
I would like to be able to do two things:
Stop the 1st build job if there is no SCM change since the last run.
Run the 2nd automation test job only if it has not run for 1 week (after it has been triggered by the 1st job).
Thanks in advance
The first part is simple: instead of "Build periodically", set "Poll SCM" with the same schedule. That is exactly what it does: it periodically checks for changes and only runs the job if there were some.
The second part (triggering another job with a time constraint) is more complicated. One option is "Throttle builds" (set to 1 per week) in addition to your usual build-triggering scheme. Another is the "Trigger builds remotely (e.g., from scripts)" option, checking whether the required conditions are met in some sort of script or service.
For the first:
Did you try polling the SCM every night? If there are no changes, the Jenkins job won't start.
0 23 * * *
Will run every night at 11 p.m.
For the second:
use the Run Condition Plugin: https://wiki.jenkins.io/display/JENKINS/Run+Condition+Plugin
Both take a cron expression as input, so what's the actual difference between the two?
Poll SCM periodically polls the SCM to check whether changes were made (i.e. new commits) and builds the project only if new commits were pushed since the last build, whereas Build periodically builds the project on the schedule even if nothing has changed.
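As an illustration of the difference, here is a sketch of how the two options are stored in a job's config.xml; both use the same cron-style spec field, and normally you would pick only one of them (the H in the spec just spreads the exact minute):

<!-- excerpt from a job's config.xml: same schedule, two different trigger types -->
<triggers>
  <!-- "Build periodically": builds on the schedule regardless of changes -->
  <hudson.triggers.TimerTrigger>
    <spec>H 23 * * *</spec>
  </hudson.triggers.TimerTrigger>
  <!-- "Poll SCM": checks the SCM on the schedule and builds only if there are new commits -->
  <hudson.triggers.SCMTrigger>
    <spec>H 23 * * *</spec>
  </hudson.triggers.SCMTrigger>
</triggers>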
I've noticed that there seems to be a build queue limit of one in Jenkins. When I trigger a lot of builds, it seems to place at most one build in the queue. Is there a way to remove this limit so there can be more than one build in the build queue?
This is intended behaviour:
Normally, your jobs will depend on some input (from SCM, or from some upstream jobs).
If your slave capacity is too low to catch up with each and every build, then you'd normally want to test/build/... only the very latest "item".
This is the default behaviour. Without that, there'd be a risk that the build queue grows indefinitely.
On top of that, Jenkins does not track the properties of normal build requests -- they all look the same, and Jenkins cannot (for example) separate different SCM states that existed at different triggering times.
This is however exactly the point that gives you a workaround: parameterize your jobs, and then use for example the Trigger parameterized build on other projects post-build action to trigger those. Then Jenkins will queue each build request individually -- and inside your job, you can use the parameter to find out what exactly has to be done.
Jenkins will squash queued parameterized builds that have identical parameter values (thanks to user "atline" for checking).
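For illustration, here is a sketch of what parameterizing the downstream job might look like in its config.xml, using a made-up string parameter for the SCM revision; queued requests whose parameter values differ are kept as separate queue items, while requests with identical values are still squashed:

<!-- excerpt from the downstream job's config.xml: a string parameter the upstream job fills in -->
<properties>
  <hudson.model.ParametersDefinitionProperty>
    <parameterDefinitions>
      <hudson.model.StringParameterDefinition>
        <name>GIT_REVISION</name>                    <!-- hypothetical parameter name -->
        <description>SCM revision this request should build</description>
        <defaultValue></defaultValue>
      </hudson.model.StringParameterDefinition>
    </parameterDefinitions>
  </hudson.model.ParametersDefinitionProperty>
</properties>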
Using Jenkins, I am running two builds (Linux + Windows) and one Doxygen job.
At the moment, I am using three separate SCM polling triggers pointing to the same source code.
How can I use a single trigger for all three jobs, provided that I still want to get separate statuses?
For the record, the underlying SCM is Git.
Off the top of my head, some solutions which might do what you are looking for:
Instead of setting an SCM trigger, use a post-receive hook in your repository that notifies Jenkins of new changes (see: https://wiki.jenkins-ci.org/display/JENKINS/Git+Plugin#GitPlugin-Pushnotificationfromrepository). This way Jenkins doesn't have to constantly poll the repository (multiple times for the different jobs), and the trigger fires faster, since there is no waiting for the next poll: when there is a push, a build is started.
Use an extra job that does nothing else: it just has the SCM polling trigger and starts all three original jobs (without waiting for any of them to finish); a sketch of such a trigger-only job's config.xml follows below.
If the configuration is similar for all three jobs, you could consider creating a single project with a matrix configuration. Roughly, that means you have a variable for the build type, with values like linux, windows, and doxygen. When the job is triggered, it starts multiple builds covering all the possible values; of course, you would have to set up the job so that the current parameter changes the build process according to what needs to be done. Actually, I haven't had to use a matrix configuration yet, so my example may not be the best, but you can probably find lots of examples on the Jenkins wiki if you think this is a good direction.
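For the second option, a sketch of what the trigger-only job's config.xml might look like (the downstream job names are made up, and the exact XML is best copied from a job configured through the UI; the "Build other projects" post-build action starts the listed jobs without waiting for them to finish):

<!-- excerpt from the trigger-only job's config.xml -->
<project>
  <!-- the Git repository itself still has to be configured in the job's scm section (omitted here) -->
  <triggers>
    <!-- poll the repository every 15 minutes; H spreads the exact minute -->
    <hudson.triggers.SCMTrigger>
      <spec>H/15 * * * *</spec>
    </hudson.triggers.SCMTrigger>
  </triggers>
  <publishers>
    <!-- "Build other projects": start all three jobs once this job has run after detecting changes -->
    <hudson.tasks.BuildTrigger>
      <childProjects>linux-build, windows-build, doxygen-docs</childProjects>
      <threshold>
        <name>SUCCESS</name>
        <ordinal>0</ordinal>
        <color>BLUE</color>
      </threshold>
    </hudson.tasks.BuildTrigger>
  </publishers>
</project>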