triggers directive not executed in Jenkins within pipeline script - jenkins

I'm trying to build a pipeline script, which basically works. Now I tried to move the pollSCM trigger from the job configuration (where it is now unchecked) into the pipeline script. See screenshot:
It works with the checkbox, so new code triggers the build, but I want to get it working via the pipeline script. Unfortunately, nothing is being triggered, although I checked in some new code.
The syntax seems to be correct, see https://www.jenkins.io/doc/book/pipeline/syntax/#triggers
Any idea why the job is not triggered from the pipeline script?

You need to trigger the Pipeline once manually for those configurations to be picked up. Once you execute the Pipeline after adding the triggers, you should see the check mark getting added automatically.
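For reference, a minimal declarative sketch of such a triggers block (the polling schedule and stage contents are placeholders):

```groovy
pipeline {
    agent any
    triggers {
        // Poll the SCM every ~5 minutes; only takes effect
        // after the job has been run once with this Jenkinsfile
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}
```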

Related

Azure devops pipeline triggers

I am trying to build a pipeline that gets triggered by another pipeline and should not be able to be queued by itself. I am unable to find a way to do this. Any help would be greatly appreciated.
Updated:
The structure I am looking for is: PipelineA triggers PipelineB and waits for PipelineB's completion. If I add a trigger saying "start when completed", it won't trigger PipelineB, since A is technically not complete.
Thanks
Assuming you are using Azure DevOps, you can add a pipeline trigger to run your pipeline upon the successful completion of the triggering pipeline.
To prevent triggering two runs of a pipeline, you must remove its own CI trigger or pipeline trigger.
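A minimal YAML sketch of such a pipeline resource trigger (the pipeline names and branch are placeholders for your setup):

```yaml
# azure-pipelines.yml for PipelineB
resources:
  pipelines:
    - pipeline: upstream        # local alias for the resource
      source: PipelineA         # name of the triggering pipeline
      trigger:
        branches:
          include:
            - main

trigger: none  # disable PipelineB's own CI trigger so only PipelineA starts it

steps:
  - script: echo "Triggered by PipelineA"
```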
We do not have this built-in feature at present; you need to customize it yourself.
Triggering a pipeline can be done via the REST API or through PowerShell. You can write your own script file and use the PowerShell task.
Then you could use the REST API to query the result of the build you triggered above.
Finally, use task conditions:
Inside the Control Options of each task, and in the Additional options
for a job in a release pipeline, you can specify the conditions under
which the task or job will run.
Do not continue until the query shows that the PipelineB build you triggered is completed/succeeded; then you can run the remaining tasks in PipelineA.
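A hedged PowerShell sketch of that trigger-and-poll approach (the organization, project, definition id, and API version are placeholders and may differ in your setup):

```powershell
# Queue PipelineB via the Azure DevOps REST API, then poll until it finishes.
$org     = "https://dev.azure.com/my-org"   # placeholder
$project = "MyProject"                      # placeholder
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }

$body = '{"definition": {"id": 42}}'        # PipelineB's definition id (placeholder)
$run = Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" `
    -Uri "$org/$project/_apis/build/builds?api-version=6.0" -Body $body

do {
    Start-Sleep -Seconds 15
    $build = Invoke-RestMethod -Headers $headers `
        -Uri "$org/$project/_apis/build/builds/$($run.id)?api-version=6.0"
} while ($build.status -ne "completed")

# Fail this task unless PipelineB succeeded, so later tasks (with the
# default "succeeded" condition) are skipped.
if ($build.result -ne "succeeded") { exit 1 }
```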

Is there any way to link two different pipelines using jenkinsfile?

The developer team needs a pipeline that is only allowed to start once another related pipeline has completed, and they need the pipeline view on the same page.
It is something like this: a set of stages completes, and the next stages start when the project manager triggers the pipeline manually.
Simply put, they need to visualize both pipelines on a single page, like the picture below.
(https://puppet.com/sites/default/files/2016-09/puppet_continuous_diagram.gif)
You can use the step build job: '<first_pipeline>', parameters: [...] in, say, stage 1 of your second pipeline as an upstream job. Then define the steps of your second pipeline from stage 2 onwards. This makes sure that the first pipeline is always built when you trigger the second, and it also works with the delivery pipeline view for single-page visualization.
Or, if you just want to check whether the first pipeline has completed without actually triggering it, call the API <jenkins_url>/job/<first_pipeline>/lastBuild/api/json in stage 1 of your second pipeline inside a while loop that waits until the status is "completed".
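A sketch of that first approach in a Jenkinsfile (the job and stage names are placeholders):

```groovy
pipeline {
    agent any
    stages {
        stage('Run first pipeline') {
            steps {
                // Blocks until first_pipeline finishes; fails this build if it fails
                build job: 'first_pipeline', wait: true
            }
        }
        stage('Second pipeline work') {
            steps {
                echo 'Continuing after first_pipeline completed'
            }
        }
    }
}
```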
I haven't heard of a way to link them, but you could write the steps in the same pipeline.
You can use input steps to ask the project manager to confirm. Note that this occupies a build executor until the input is confirmed (or a timeout you set expires).
You might also want to have a look at conditional steps.
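A minimal sketch of such an input step with a timeout (the stage name, message, and submitter are illustrative):

```groovy
stage('Manager approval') {
    steps {
        // Pauses the build here until someone confirms, or aborts after 2 days
        timeout(time: 2, unit: 'DAYS') {
            input message: 'Start the next stages?', submitter: 'project-manager'
        }
    }
}
```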

Updating declarative pipeline job's cron triggers doesn't update triggers

I have a declarative pipeline job defined as a pipeline script (not pipeline from SCM). It has a cron trigger:
triggers {
    cron('H */4 * * 1-5')
}
I've run this a few times, both on demand and cron-triggered, and everything is fine so far. But if I change the cron trigger, Jenkins does not pick up the change; the old trigger stays in effect until I force a job run.
How do I get Jenkins to use the changed triggers without running the job manually? I think the question can be extended to any declarative job definition changes really, how do I get jenkins to update job settings without being forced to run the job.
This is related to how Jenkins pipelines work. Triggers, like other job configurations, are only loaded into Jenkins after the job has been executed once. It is simply a chicken-and-egg problem.
Since a pipeline job should live in the context of the place that stores it (GitHub, for example), you should consider triggering it from there and then using some internal logic to decide whether to run it or not.
The complexity of this solution should be proportional to how often you update your trigger.
In short, currently, at time of writing, you can't.
I work around this for parameterised pipelines by adding a "noop" option and then making sure my pipeline does nothing when this option is selected. That way the job runs but has no side effects.
If your pipeline is not parameterised we are currently, as I said, out of luck.
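A sketch of that "noop" workaround, assuming a boolean parameter (the parameter and stage names are illustrative):

```groovy
pipeline {
    agent any
    parameters {
        booleanParam(name: 'NOOP', defaultValue: false,
                     description: 'Run without side effects, just to refresh triggers')
    }
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Deploy') {
            // Skipped on a noop run, so the job still executes (and
            // re-registers its triggers) without doing any real work
            when { expression { !params.NOOP } }
            steps {
                echo 'Doing real work'
            }
        }
    }
}
```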

Remove unnecessary jobs from build pipeline on Jenkins

I'm a bit new to Jenkins and I'm having a display issue with the Build Pipeline plugin. I'm executing some jobs in parallel using the JobFanIn plugin, i.e. the next job in the pipeline will only be executed when all the previous jobs have concluded. However, the Build Pipeline plugin believes that every one of those jobs will trigger a new instance of the last job. The execution goes right, but the display doesn't.
Below it is possible to observe what is happening in practice. The GENERATE TEST REPORT job is only triggered when all TEST jobs are finished, and this works fine. But since the Build Pipeline plugin expects 3 instances of this job, only one happens and the others appear as pending forever. Any ideas?
First image displaying what is happening
Second image displaying what is supposed to happen

Jenkins Pipeline: how to trigger a build as a downstream that requires an input

I have a Jenkins pipeline job defined. The Jenkinsfile of this job has an input defined using a dropdown. So when the build is triggered the user is requested to select an input.
For manually triggering this is working great!
However, if we want the same job to be triggered by another job as a downstream build, how can we do this while providing the input parameters?
Thanks
As @StephenKing pointed out, to get the setup we need, we have to move away from using the input step and make the build parameterized.
The answer I was looking for is here
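A sketch of replacing the input dropdown with a choice parameter so an upstream job can pass the value non-interactively (all names and values are placeholders):

```groovy
// Downstream Jenkinsfile: a choice parameter instead of an interactive input step
pipeline {
    agent any
    parameters {
        choice(name: 'TARGET_ENV', choices: ['dev', 'staging', 'prod'],
               description: 'Environment to deploy to')
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying to ${params.TARGET_ENV}"
            }
        }
    }
}

// The upstream job can then trigger it without any manual input:
// build job: 'downstream-job', parameters: [string(name: 'TARGET_ENV', value: 'staging')]
```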
