Is there a way to check the error log, in particular the number of consecutive failures, and then integrate a post-build task into the Jenkins job accordingly?
I want to trigger the PagerDuty plugin after a job failure, but only under some specific conditions.
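For illustration, here is a minimal Workflow/Pipeline-style Groovy sketch of the "consecutive failures" check; the threshold and the PagerDuty trigger are assumptions, and the plugin's actual step name should be checked against its documentation:

    // Sketch: count the unbroken run of previous failed builds and only page
    // once a threshold is reached. The pagerduty() call is a hypothetical
    // placeholder; use whatever step your PagerDuty plugin version provides.
    def consecutiveFailures() {
        int count = 0
        def run = currentBuild.previousBuild
        while (run != null && run.result == 'FAILURE') {
            count++
            run = run.previousBuild
        }
        return count
    }

    node {
        try {
            // ... your existing build steps ...
        } catch (err) {
            currentBuild.result = 'FAILURE'
            int failures = consecutiveFailures() + 1   // include this failure
            if (failures >= 3) {                        // threshold is an assumption
                echo "Notifying PagerDuty after ${failures} consecutive failures"
                // pagerduty(...)   // hypothetical placeholder
            }
            throw err
        }
    }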
Using Jenkins plugins (i.e. Delivery Pipeline, Parameterized Trigger), we can set up a pipeline containing multiple jobs in sequence, for instance: Build -> Unit Test -> Deploy To DEV.
Now, for each pipeline run, we want to stop after "Unit Test" because we need to wait for someone to approve the deployment before it can go on (we use JIRA as the tracking system for this).
When someone approves the deployment ticket in JIRA, a post action we have already set up fires and triggers a job in Jenkins, "Deploy To Dev" in this case. However, this job runs independently, outside the pipeline.
Is there a way to trigger the downstream job from a script within an instance of the pipeline, so that it carries over all the parameters from upstream and shows up as part of the pipeline?
Thx
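For what it's worth, in a Workflow/Pipeline Groovy script the downstream job can be triggered with the build step from inside the same run, so it inherits parameters and appears under that run. A rough sketch, where the job names, the input-based approval, and the GIT_COMMIT parameter are all placeholders:

    // Sketch: run the whole chain inside one pipeline, pause for approval,
    // then trigger "Deploy To DEV" with parameters carried over from upstream.
    // Job and parameter names are placeholders for illustration.
    node {
        stage('Build')     { build job: 'Build' }
        stage('Unit Test') { build job: 'Unit Test' }

        // Pause here until someone approves (an alternative to the external
        // JIRA callback; the message text is arbitrary).
        stage('Approval')  { input message: 'Deployment ticket approved in JIRA?' }

        stage('Deploy To DEV') {
            build job: 'Deploy To DEV',
                  parameters: [string(name: 'GIT_COMMIT', value: params.GIT_COMMIT ?: '')]
        }
    }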
I'm using webhooks on Bitbucket to trigger builds on Jenkins when a push event occurs; for this purpose I'm using the Bitbucket plugin.
My Jenkins pipeline consists of multiple cross-dependent tasks, e.g.:
Main pipeline (triggered task)
1) build docker images
2) run tests
3) do something
The build is triggered when expected, but the tasks are failing because they rely on a specific branch that I need to provide. Unfortunately, I don't know how to access the webhook's payload, which has all the information I need.
The alternative would be to use the Poll SCM option in Jenkins, but I prefer to build on demand rather than periodically.
From:
https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin
they say:
Since 1.1.5 Bitbucket automatically injects the payload received by Bitbucket into the build. You can catch the payload to process it accordingly through the environmental variable $BITBUCKET_PAYLOAD.
Regards
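Purely as an illustration, and assuming the payload is the Bitbucket Cloud push-event JSON, a Groovy/Workflow snippet like the one below could pull the branch name out of $BITBUCKET_PAYLOAD; the JSON path used is an assumption and should be checked against the payload you actually receive:

    // Sketch: read the branch name from the $BITBUCKET_PAYLOAD environment
    // variable. The path push.changes[0].new.name matches a Bitbucket Cloud
    // push event; verify it against your own payload before relying on it.
    import groovy.json.JsonSlurper

    node {
        def raw = env.BITBUCKET_PAYLOAD
        if (raw) {
            def payload = new JsonSlurper().parseText(raw)
            def branch = payload?.push?.changes?.getAt(0)?.'new'?.name
            echo "Webhook was triggered for branch: ${branch}"
            // ... check out and build that branch here ...
        } else {
            echo 'No BITBUCKET_PAYLOAD found; build probably not triggered by the webhook.'
        }
    }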
I've just started working with the Workflow plugin.
The set-up I have currently consists of a Workflow script that uses the build step to basically define a pipeline made up of multiple downstream jobs.
This is working well, but there isn't really any link between the output of the Workflow build and the output from all the downstream builds. Is there a way to either:
Link from the Workflow project build output to all the corresponding downstream builds.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
I'm hoping with either of these options it will be possible to see the output from the whole pipeline via the Workflow job output.
IMO, the intention of Workflow is to replace pipelines of various Jenkins jobs with just a single job. This may be why Workflow doesn't make any significant effort to link to downstream jobs. I've been converting my "pipelines" to monolithic Workflow jobs, and really appreciating the fact that all the actions are more tightly grouped together.
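For example, a Build -> Unit Test -> Deploy chain of separate jobs can collapse into a single Workflow script along these lines (the stage names and shell commands are placeholders), after which all the console output naturally lives in one build:

    // Sketch of a monolithic Workflow (scripted pipeline) replacing a chain of
    // downstream jobs; every stage's console output lands in this one build log.
    node {
        stage('Build') {
            sh 'mvn -B clean package'   // placeholder build command
        }
        stage('Unit Test') {
            sh 'mvn -B test'            // placeholder test command
        }
        stage('Deploy To DEV') {
            sh './deploy.sh dev'        // placeholder deploy command
        }
    }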
Link from the Workflow project build output to all the corresponding downstream builds.
PR 218 under review as of this writing.
Capture the console output of the downstream jobs and include it in the output of the Workflow job.
JENKINS-26124
We are setting up a continuous delivery pipeline in Jenkins, using the build pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is obvious:
For a Manually Triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin, Manually Execute Downstream Project check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.B. multiple projects can be specified by using a comma, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different between a general project and a pipeline one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
As stated above by rafal S, do the following:
Read a file that lists the project names for which the build job has to be triggered, then do a curl HTTP POST on JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN within a for loop, where the loop iterates over all the project names read from the file.
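A Groovy sketch of that idea (the Jenkins URL, token, and file name are placeholders; depending on your security settings you may also need an API token or CSRF crumb):

    // Sketch: fire a parameterless build for every project listed, one per line,
    // in projects.txt via the remote access API. URL, token and file name are
    // placeholders; job names containing spaces would need URL-encoding.
    def jenkinsUrl = 'https://jenkins.example.com'   // placeholder
    def token      = 'TOKEN'                         // the trigger token from the job config

    new File('projects.txt').eachLine { line ->
        def jobName = line.trim()
        if (!jobName) return
        def conn = new URL("${jenkinsUrl}/job/${jobName}/build?token=${token}").openConnection()
        conn.requestMethod = 'POST'
        println "${jobName}: HTTP ${conn.responseCode}"
    }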
I've got a Jenkins job that is intended to do the following:
Build a project and deploy it to a test server
Run tests
If the tests fail, roll back the server to the previous version
If the tests succeed, update the version in our source control system
Because we have a single test server, we need to ensure that Jenkins only runs a single instance of this job at a time. Unfortunately, we can't seem to find a way to run a job on failure while also keeping the upstream job from executing while the downstream job is running.
Is there an easy way to do this? Is there a better way?
The Jenkins Post Build Task allows you to run tasks in a job after failure. Rolling the server back sounds more like a task than a job, so that might suit.
Otherwise, there are a couple of plugins that allow for more complex pipelining features. The Pipeline Plugin seems to be the most popular at the moment.
In the job configuration, under Advanced Project Options (just before the SCM part), click the Advanced... button. You can now choose to Block build when upstream/downstream is executing.
As for running conditional steps on failure:
- Use Post Build Tasks as Paul suggested, or
- Configure logic using Conditional Build steps
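If the whole flow were ever expressed as a single Workflow/Pipeline job instead of chained jobs, the failure handling maps onto a try/catch; a rough sketch with placeholder scripts for deploy, rollback, and version tagging:

    // Sketch: build, deploy, test, then either tag the version or roll back.
    // Keeping it in one job (combined with the blocking option above) helps
    // ensure only one run touches the single test server at a time.
    node {
        stage('Build and deploy') {
            sh 'mvn -B clean package'        // placeholder build command
            sh './deploy.sh test-server'     // placeholder deploy command
        }
        try {
            stage('Tests') {
                sh './run-tests.sh'          // placeholder test command
            }
            stage('Tag version') {
                sh './tag-release.sh'        // placeholder: update version in SCM
            }
        } catch (err) {
            stage('Rollback') {
                sh './rollback.sh test-server'   // placeholder rollback command
            }
            throw err                            // keep the build marked as failed
        }
    }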