Bitbucket webhook trigger after pipeline completes successfully - bitbucket

I'd like to trigger a webhook after a pipeline completes successfully. I looked in the trigger list and didn't find one. Is there a workaround to trigger a webhook manually via Pipelines?

You can use the Build status updated trigger to fire a webhook based on a Pipelines build. However, it will also fire for the INPROGRESS and FAILED states, and there is no way to restrict it to a specific pipeline.
If you only want to trigger a webhook when a specific pipeline completes successfully, you can do this manually by adding the necessary commands to your bitbucket-pipelines.yml file.
If you have multiple parallel steps in your pipeline, add the webhook call in a separate serial step, so it only runs once all parallel steps have completed successfully.
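For illustration, a minimal bitbucket-pipelines.yml sketch along these lines; the step names, the scripts, and the WEBHOOK_URL repository variable are placeholder assumptions, not something Pipelines provides out of the box:

    pipelines:
      default:
        - parallel:
            - step:
                name: Unit tests
                script:
                  - ./run-unit-tests.sh
            - step:
                name: Lint
                script:
                  - ./run-lint.sh
        # This serial step only runs if every step above succeeded,
        # because a failed step stops the pipeline.
        - step:
            name: Notify webhook
            script:
              - curl -X POST -d "commit=$BITBUCKET_COMMIT&status=SUCCESSFUL" "$WEBHOOK_URL"

BITBUCKET_COMMIT is one of the default Pipelines variables; ./run-unit-tests.sh and ./run-lint.sh are just stand-ins for your own build commands.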

Related

How to update GitLab Merge Request build status (pipeline status) when using Jenkins and the pipeline was previously canceled/deleted?

We set up GitLab with Jenkins integration by using the Jenkins GitLab plugin and triggering Jenkins webhooks (a regular Pipeline-type job) on GitLab Merge Request events (configured in GitLab -> Repo -> Integrations), and we are successfully displaying the job build status on the Merge Request page (by using updateGitlabCommitStatus in the pipeline). It is displayed as the status of some pipeline which, as I understand it, is created and associated with the last commit in the source branch.
At some point, I canceled this pipeline from the MR page and after that, closed and reopened the MR, thus re-triggering the build.
Unfortunately, after cancelling the pipeline, the latest build job statuses were reflected neither in the MR nor in the pipeline itself. The pipeline page wouldn't even display the newest jobs running in Jenkins.
I tried deleting this specific pipeline (via curl; we are using GitLab 12.3, which doesn't allow deleting pipelines via the GUI) and creating a new Merge Request (same branch, same commit), hoping that a new pipeline would be created, but nothing changed. It seems I have no way to display the build status for this specific commit again.
Any suggestions on how to overcome this?
Thanks in advance!
I have a similar case and the only way to do this is to re-run the pipeline from GitLab... You have to go into the integrations page and look through all the requests sent to Jenkins. Once you locate the correct one, click Resend and it should give you the correct status.
From my observations, the updateGitlabCommitStatus command only works when the build is invoked from a webhook.
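If resending the request doesn't help, one possible workaround (a hedged sketch, not part of the answer above) is to set the commit status directly via GitLab's commit status API, which is essentially what updateGitlabCommitStatus does under the hood. The host, project ID, commit SHA, and token below are placeholders:

    # All values below are placeholders; state can be pending, running, success, failed or canceled.
    curl --request POST \
         --header "PRIVATE-TOKEN: <your_access_token>" \
         "https://gitlab.example.com/api/v4/projects/<project_id>/statuses/<commit_sha>?state=success&name=jenkins"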

How to trigger a Jenkins Job from outside and receive the status of build

I need to trigger a job from another scheduler and want to receive the status of the triggered job after it finishes.
Sure, it would be possible to create a status file or something similar, but it would be more convenient to trigger the job from a script or via an HTTP request and wait for some kind of exit or return code.
Is that possible?
For this, you can use the Build Authorization Token Root Jenkins plugin.
It allows you to trigger a Jenkins build remotely.
Please check my article: https://medium.com/appgambit/trigger-jenkins-job-from-slack-5b07b6131e25
But yes, you can skip the Slack integration and just use the API to trigger a particular job.
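For example, with the Build Authorization Token Root plugin installed, a rough shell sketch like this could trigger the job and wait for a result. The Jenkins URL, job name, and token are placeholders, and polling lastBuild is a simplification; a more robust script would follow the queue item URL returned when triggering:

    #!/bin/sh
    JENKINS=https://jenkins.example.com   # placeholder
    JOB=MY_JOB                            # placeholder
    TOKEN=SOME_SECURE_TOKEN               # placeholder

    # Trigger the job through the Build Authorization Token Root endpoint.
    curl -s -X POST "$JENKINS/buildByToken/build?job=$JOB&token=$TOKEN"

    # Crude wait so the queued build has time to start (see note above).
    sleep 15

    # Poll the last build until it reports a result (SUCCESS, FAILURE, ABORTED, ...).
    while true; do
      result=$(curl -s "$JENKINS/job/$JOB/lastBuild/api/json" | grep -o '"result":"[A-Z]*"')
      [ -n "$result" ] && break
      sleep 10
    done
    echo "Triggered build finished with $result"
    [ "$result" = '"result":"SUCCESS"' ]   # exit code reflects the build result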
Or, if you want to run your job after another job completes successfully, you can use the "Build after other projects are built" trigger.

How to prioritize the JOB NOTIFICATION plugin to run first in post build actions

I have a setup in my job config where I want to execute some downstream jobs after the Job Notification plugin is invoked from the post-build actions. But the downstream jobs always trigger first, and only then is the Job Notification plugin invoked.
Is there any way to run the Job Notification plugin first, followed by the downstream job execution?
You can use the Flexible Publish plugin for Jenkins. It maintains the order of execution of post-build actions and also allows you to use a publisher more than once in a build.
https://plugins.jenkins.io/flexible-publish
https://wiki.jenkins.io/display/JENKINS/Flexible+Publish+Plugin

How to trigger a Jenkins job from an Ant script running on the same Jenkins server

I'm running an Ant job that runs several things on the master node, and I need to trigger several jobs on slave servers based on the options I select in the main job's parameters.
Is there a way to call another job from within the Ant script without using jenkins-cli.jar as an external command?
You can trigger Jenkins jobs by doing an HTTP request:
Go to your Job Configuration
Build triggers > Check 'Trigger Builds Remotely' and think of an access token, e.g. SOME_SECURE_TOKEN.
In your ant script: execute a POST request to JENKINS_URL/job/JOB_NAME/build?token=SOME_SECURE_TOKEN
Note that if you have authentication in place, you need to set up a user who is authorized to start the other jobs. In that case read this more detailed explanation: https://www.nczonline.net/blog/2015/10/triggering-jenkins-builds-by-url/
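For example, the request could be a plain curl call, which the Ant script can run through an exec task. The Jenkins URL, JOB_NAME, the token, and the TARGET_ENV parameter below are placeholders:

    # Plain trigger (no parameters), with "Trigger builds remotely" enabled on the target job:
    curl -X POST "https://jenkins.example.com/job/JOB_NAME/build?token=SOME_SECURE_TOKEN"

    # If the downstream job takes parameters (e.g. the option selected in the main job),
    # use buildWithParameters instead; TARGET_ENV is a hypothetical parameter name:
    curl -X POST "https://jenkins.example.com/job/JOB_NAME/buildWithParameters?token=SOME_SECURE_TOKEN&TARGET_ENV=dev"

    # If Jenkins requires authentication, add credentials, e.g. --user someuser:API_TOKEN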
Another solution would be to use the Parameterized Trigger Plugin to trigger builds from a Jenkins step. You mention that the jobs that need to be triggered may depend on the job parameters. In that case you can combine the Conditional Buildstep plugin with the Parameterized Trigger plugin.

How to trigger a Jenkins downstream job from a script but not manually through Jenkins?

Using Jenkins plugins (i.e. Delivery Pipeline, Parameterized Trigger), we can set up a pipeline that contains multiple jobs in sequence, for instance: Build -> Unit Test -> Deploy To DEV.
Now, for each pipeline, we want to stop after "Unit Test" because we need to wait for someone to approve the deployment before it can go on (we use JIRA as the tracking system for this).
When someone approves the deployment ticket in JIRA, we have already set up a post action to fire and trigger a job in Jenkins, "Deploy To Dev" in this case. However, this job runs independently, outside the pipeline.
Is there a way we can trigger the downstream job from a script within an instance of a pipeline, so it carries over all the parameters from upstream and is shown as part of the pipeline?
Thx
