My Jenkinsfile is built from several stages and can be triggered either by the scheduler or by a push to GitLab. I would like to use the trigger source to skip several stages depending on how the build was triggered. How can I identify which event triggered the job?
If you mean push/merge events, you can check the env.gitlabActionType variable.
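For example, a stage can be skipped unless the build was started by a GitLab push. A minimal sketch, assuming the GitLab plugin sets env.gitlabActionType (the stage name and echo step are only illustrative):

pipeline {
    agent any
    stages {
        stage('Deploy') {
            // env.gitlabActionType is set by the GitLab plugin (e.g. 'PUSH', 'MERGE')
            // and is unset when the build was started by the cron scheduler.
            when { expression { env.gitlabActionType == 'PUSH' } }
            steps {
                echo 'Running only for GitLab push events'
            }
        }
    }
}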
Example: a job with multiple SCMs, GIT1 and GIT2.
If I use PollSCM, the job gets triggered whenever changes are detected on either SCM.
I would like to poll only GIT2, and trigger only when the changes are on GIT2.
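One way to approach this in a pipeline job is to exclude GIT1 from polling with the checkout step's poll: false option, so that only GIT2 changes cause a PollSCM-triggered build. A rough sketch, with placeholder repository URLs and branches:

pipeline {
    agent any
    triggers {
        pollSCM('H/15 * * * *')
    }
    stages {
        stage('Checkout') {
            steps {
                // poll: false (and changelog: false) excludes GIT1 from SCM polling
                checkout changelog: false, poll: false,
                    scm: [$class: 'GitSCM', branches: [[name: '*/main']],
                          userRemoteConfigs: [[url: 'https://example.com/git1.git']]]
                // GIT2 is still polled, so only its changes trigger the job
                checkout scm: [$class: 'GitSCM', branches: [[name: '*/main']],
                               userRemoteConfigs: [[url: 'https://example.com/git2.git']]]
            }
        }
    }
}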
I am trying to build a pipeline that gets triggered by another pipeline and should not be able to queue itself. I am unable to find a way to do this. Any help would be greatly appreciated.
Updated:
The structure I am looking for is: PipelineA triggers PipelineB and waits for PipelineB's completion. If I add a trigger saying "start when PipelineA completes", it won't trigger PipelineB, since A is technically not complete at that point.
Thanks
Assuming you are using Azure DevOps, you can add a pipeline trigger to run your pipeline upon the successful completion of the triggering pipeline.
To prevent triggering two runs of a pipeline, you must remove its own CI trigger or pipeline trigger.
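In YAML, such a pipeline trigger can look roughly like the following in PipelineB's definition (a sketch; the pipeline name, resource alias, and branch filter are illustrative):

trigger: none                   # remove PipelineB's own CI trigger so it is not queued twice

resources:
  pipelines:
    - pipeline: upstream        # local alias for the pipeline resource
      source: PipelineA         # name of the pipeline that triggers this one
      trigger:
        branches:
          include:
            - main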
There is no built-in feature for this at present; you need to customize it yourself.
Triggering a pipeline can be done via the REST API, for example through PowerShell. You can write your own script file and use a PowerShell task.
Then you can use the REST API to query the result of the build you triggered above.
Finally, use task conditions.
Inside the Control Options of each task, and in the Additional options for a job in a release pipeline, you can specify the conditions under which the task or job will run.
Once the query shows that the PipelineB build you triggered is completed/succeeded, you can continue to run the remaining tasks in PipelineA.
I have a declarative pipeline job defined as a pipeline script (not pipeline from SCM). It has a cron trigger:
triggers {
    cron('H */4 * * 1-5')
}
I've run this a few times, both on demand and cron-triggered, and everything is fine so far. But if I change the cron trigger, Jenkins does not pick up the change; the old trigger stays in effect until I force a run of the job.
How do I get Jenkins to use the changed trigger without running the job manually? I think the question extends to any declarative job definition change, really: how do I get Jenkins to update the job settings without being forced to run the job?
This is related to how Jenkins pipelines work. Triggers, like other job configuration defined in the Jenkinsfile, are only loaded into Jenkins after the job has been executed once. It is simply a chicken-and-egg problem.
Since a pipeline job should live in the context of the place that stores it (GitHub, for example), you should consider triggering it from there and then using some internal logic to decide whether to run it or not.
The complexity of this solution should be proportional to how often you update your trigger.
In short: currently, at the time of writing, you can't.
I work around this for parameterised pipelines by adding a "noop" option and then making sure my pipeline does nothing when this option is selected. That way the job runs but has no side effects.
If your pipeline is not parameterised, then, as I said, you are currently out of luck.
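A minimal sketch of that workaround (the NOOP parameter name and the stage are made up):

pipeline {
    agent any
    parameters {
        booleanParam(name: 'NOOP', defaultValue: false,
            description: 'Run the job only to pick up trigger/configuration changes')
    }
    triggers {
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Build') {
            // Skip the real work on a "noop" run; the run itself is enough for Jenkins
            // to reload the triggers defined above.
            when { expression { !params.NOOP } }
            steps {
                echo 'Doing the real work'
            }
        }
    }
}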
I am having trouble running the initial pipeline from Job DSL. Is there a way to run the initial pipeline automatically after it is created? I am creating a dynamic pipeline for each feature branch, so each time a feature pipeline is created, I want it to run automatically once the pipeline and its job creation are complete. I don't want to trigger each pipeline manually from Jenkins, because doing that for every created feature pipeline is too tedious.
You can use queue to schedule a build of a job. See https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-DSL-Commands#queue
pipelineJob('example') {
    // ...
}
queue('example')
I want to create a Jenkins job (stage 1) that will gather all parameters needed throughout a build pipeline, as I do not want to hard-code each stage individually since they are likely to change regularly.
At stage 3 of the pipeline there will be 5 simultaneous jobs running, each with different parameters that were obtained in stage 1.
Is there a way I can gather the parameters I need in stage 1, using a cron job, so that they are available to the subsequent stages?
I think what is throwing everyone off answering your question is the "cron" part. What has "cron" got to do with any of this?
If we ignore that, there is an answer here that deals with a similar situation:
How to build a pipeline of jobs in Jenkins?
Using the Parameterized Trigger Plugin, you can collect all your parameters in the first job and then pass them from one job to the next as environment variables.
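If the stages are modelled as separate pipeline jobs, a rough equivalent using the built-in build step is to compute the values once and then pass them to the downstream jobs as parameters. A sketch, where the job names and parameters are made up:

// Values gathered in "stage 1"; in practice these might be computed or read from a file.
def targetEnv  = 'staging'
def appVersion = '1.2.3'

// "Stage 3": five jobs could be launched the same way; two are shown here.
parallel(
    'job-a': {
        build job: 'stage3-job-a', parameters: [
            string(name: 'TARGET_ENV', value: targetEnv),
            string(name: 'APP_VERSION', value: appVersion)
        ]
    },
    'job-b': {
        build job: 'stage3-job-b', parameters: [
            string(name: 'TARGET_ENV', value: targetEnv),
            string(name: 'APP_VERSION', value: appVersion)
        ]
    }
)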