Jenkins: block SCM polling when upstream projects are building

I have been looking for an answer for a few days, but haven't found an easy solution.
All the projects are related to each other by upstream and downstream triggers, and the same SCM polling interval is set in all of them. After the first project starts, the second project builds, then the third, and so on. But when the SCM polling timer fires on a project in the middle of the chain, that project starts building on its own. There is an option to block a project while its upstream projects are building. But!!! The problem is that the SCM trigger still adds a job to the queue. I need it to not add downstream jobs to the queue, and to do nothing at all, while upstream projects are building. In other words, I need the upstream job to freeze the SCM triggers in all downstream projects.
I know about pipeline, but I'm looking for something simpler. Thanks.
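For reference, the pipeline route mentioned above could look roughly like this: only the first job in the chain polls SCM, and every downstream job drops polling entirely in favor of an upstream trigger, so a polling timer can never start a mid-chain build. This is a sketch, not a drop-in answer; the job name 'project-one' is assumed for illustration.

```groovy
// Hypothetical Jenkinsfile for a downstream job in the chain.
// No pollSCM trigger here: the job only starts when the job
// before it in the chain finishes successfully.
pipeline {
    agent any
    triggers {
        // 'project-one' is a placeholder for the preceding job
        upstream(upstreamProjects: 'project-one',
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building after upstream finished'
            }
        }
    }
}
```

Because the downstream jobs have no SCM trigger at all, there is nothing for the polling timer to add to the queue while the chain is running.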

Related

How to avoid scheduling/starting multiple runs of a Jenkins job at the same time

We are moving our build system over from Hudson to Jenkins, and also to declarative pipelines in SCM. Alas, it looks like there are some hiccups. In Hudson, when a job was scheduled and waiting in the queue, no new builds were scheduled for that project, which makes all the sense in the world. In Jenkins, however, I observe that e.g. 5 instances of a job are started at the same time, triggered by various upstream or SCM change events. They have all even started, in a sense: one of them is actually running on the build node, and the rest are waiting in "Waiting for next available executor on (build node)". When the build node becomes available, they all dutifully start running in turn, and all dutifully run through, most of them serving no purpose at all as there are no more changes, and this all takes a huge amount of time.
The declarative pipeline script in SCM starts with the node declaration:
pipeline {
    agent {
        label 'BuildWin6'
    }
    ...
I guess the actual problem is that Jenkins starts to run these jobs even though the specified build node is busy. Maybe it thinks I might have changed the Jenkinsfile in the SCM and specified another build node to run the thing on? Anyway, how to avoid this? This is probably something obvious as googling does not reveal any similar complaints.
For the record, answering myself. It looks like the best solution is to define another trigger job which is itself triggered by SCM changes. It should do nothing else, only check out the needed svn repos (with depthOption: 'empty' for space and speed). The job needs to be bound to run on the same agent as the main job.
The main job is triggered only by the trigger job, not by SCM changes. Now if the main job builds for an hour and there are 10 svn commits during that time, Jenkins will schedule 10 trigger-job builds. They all wait in the queue while the agent is busy. When the agent becomes available, they all run through quickly and trigger the main job. The main job is triggered only once; for that, one must ensure its grace/quiet period is longer than the trigger job's run time.
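The trigger-job idea above could be sketched as a small pipeline of its own; the job name 'main-job' and the quiet period of 300 seconds are assumptions for illustration, and the real job would also perform the sparse svn checkout described above so polling has something to compare against.

```groovy
// Hypothetical Jenkinsfile for the lightweight trigger job.
// It is the only job that polls SCM, runs quickly, and kicks
// off the main job with a quiet period longer than its own
// run time so repeated triggers collapse into one main build.
pipeline {
    // must run on the same agent as the main job, per the answer above
    agent { label 'BuildWin6' }
    triggers { pollSCM('H/5 * * * *') }
    stages {
        stage('Trigger main job') {
            steps {
                // 'main-job' is a placeholder name; wait: false lets
                // the queued trigger builds drain quickly
                build(job: 'main-job', wait: false, quietPeriod: 300)
            }
        }
    }
}
```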

Jenkins builds are being triggered without a trigger

I inherited a project set up to use Jenkins. I set the trigger to poll SCM every 5 minutes and disabled the pre-existing "build remotely" trigger. So now the only trigger set for Jenkins is to poll SCM every 5 minutes. However, Jenkins builds the project every time the project repo is updated, and NOT on polling. The polling log specifically says that it won't build (because the author of the commit is in the excluded list), but the project builds anyway, unrelated to the polling. There are no other triggers set.
I didn't find any scripts anywhere that could explain this behaviour.

Jenkins build summary link to post-build build summary

I have a job that is triggered as a post-build action by dozens of other jobs. It essentially organizes and processes the artifacts of those upstream jobs (using the Copy Artifact Plugin), and publishes the reformatted logs, and the originals, as artifacts of its own.
I want the build summary pages of an upstream job to have a link to that downstream job. From what I gather, this is not an intended use case. Conventional wisdom seems to be that if we want a link to a downstream job, we should run it as a sub-project within the Build step of the upstream job. But if I do that, I don't have the artifacts to pass to the downstream job. Catch 22.
Is there something (even something really hacky and nasty) I can do to make this work? People want to get the processed artifacts directly from the build page.
One way (and I think the only way) to do this would be to call the Jenkins API from the downstream job to put a link to itself in the upstream job's description. But this seemed like more work than it was worth, so we just didn't do anything, and we're all fine.
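If one did want to try the API approach, a rough sketch could look like the following. The upstream build's URL is not exposed as a standard environment variable, so here it is assumed to arrive as a build parameter (UPSTREAM_BUILD_URL); the credentials id 'jenkins-api-token' is likewise an assumption. The submitDescription endpoint is the same one the UI's "edit description" form posts to, and note that it replaces the description rather than appending to it.

```groovy
// Hypothetical post section in the downstream job's Jenkinsfile:
// push a link to this build into the upstream build's description.
post {
    success {
        withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                          usernameVariable: 'USER',
                                          passwordVariable: 'TOKEN')]) {
            sh '''
                # UPSTREAM_BUILD_URL is an assumed build parameter,
                # e.g. https://jenkins.example.com/job/upstream/42/
                curl -u "$USER:$TOKEN" \
                     --data-urlencode "description=Processed artifacts: ${BUILD_URL}" \
                     "${UPSTREAM_BUILD_URL}submitDescription"
            '''
        }
    }
}
```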

Jenkins P4 plugin triggers, trigger by branch

I am using the "P4 Plugin" - https://wiki.jenkins-ci.org/display/JENKINS/P4+Plugin in Jenkins v2.32.1.
I would like to trigger specific jobs in Jenkins, depending on what path is changed in SCM.
For example if something changes in
//depot/branchA
Build job A. If something changes in
//depot/branchB
Build job B.
As far as I can tell, the plugin can only trigger every job in Jenkins that has "Perforce triggered build" enabled (building both jobs A and B). Am I missing something? I am currently using SCM polling and trying to move to a more efficient system.
This understanding is based on reading the "Triggering" section of https://github.com/jenkinsci/p4-plugin/blob/master/SETUP.md
You may be defining too broad a workspace, causing Jenkins to trigger on every submit. The client workspace associated with job A should only map //depot/branchA/..., and the workspace for job B should only map //depot/branchB/....
Jenkins polls for changes, and if it sees any, triggers any build that has a matching path. So if both jobs had a workspace that mapped //depot/... then submits to branchA or branchB would trigger both jobs.
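The difference between a narrow and a too-broad mapping can be illustrated with client view lines (the workspace names jenkins-jobA and jenkins-broad are made up for the example):

```
# View for job A's workspace: maps only branchA, so submits to
# branchB never show up as changes and never trigger this job.
//depot/branchA/...    //jenkins-jobA/...

# Too-broad view: a job with this workspace sees, and is
# triggered by, submits anywhere under //depot.
//depot/...            //jenkins-broad/...
```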

jenkins downstream jobs not triggering

I have a large multi-module maven project, which is handled by a number of jobs in jenkins.
I have noticed that not all downstream jobs are triggered when an upstream job finishes successfully. The upstream and downstream jobs are calculated automatically by Jenkins. Sometimes a subset of the jobs are triggered, and sometimes none. This puzzles me; any good explanations out there?
In the category better late than never:
There was a bug in the Maven job code which would ignore triggers if upstream dependencies were still building:
see https://issues.jenkins-ci.org/browse/JENKINS-21903
This was fixed at the end of September 2014, so any relatively recent version should no longer be affected by it.