Running a Post Build script when a Jenkins job is aborted

Is there a way, or a plugin, to run a post-build script when a Jenkins job is aborted?
I do see that the post-build plugin provides an action to execute a set of scripts, but these can be triggered only on two conditions: a successful job or a failed job.

This question is positively answered here.
The Post Build Task plugin is run even if a job is aborted.
Use it to search the log text for "Build was aborted" and you can specify a shell script to run.
Works like a charm. :-)

For a declarative Jenkins pipeline, you can achieve it as follows:
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'MINUTES') // abort on exceeding the timeout
    }
    stages {
        stage('Echo 1') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Sleep') {
            steps {
                sh 'sleep 180'
            }
        }
        stage('Wake up') {
            steps {
                echo 'Woken up'
            }
        }
    }
    // this post section is executed if the job is aborted
    post {
        aborted {
            script {
                echo 'Damn it. I was aborted!'
            }
        }
    }
}

As far as I know, once a build is aborted, there is no way to execute any further build steps (or post-build steps) in it - which makes sense; that is what I would expect of "abort".
What you could do is create another job that monitors the first one's status and triggers if it was aborted (e.g. see the BuildResultTrigger plugin).
Another solution might be to create a "wrapper" job that calls the first one as a build step - this way you can execute additional steps after its completion, such as checking its status, even if it was aborted.
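A minimal sketch of that wrapper idea as a scripted pipeline (the job name 'real-job' and the notification script are hypothetical placeholders):

```groovy
// Wrapper pipeline: runs the real job as a build step and reacts to its
// result even when that job was aborted. propagate: false keeps this
// wrapper running regardless of the downstream result.
node {
    def run = build job: 'real-job', propagate: false
    echo "real-job finished with result: ${run.result}"
    if (run.result == 'ABORTED') {
        // place your "post abort" steps here, e.g. a notification script
        sh './notify-abort.sh'
    }
}
```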

If you use a scripted pipeline, you can always use a try/catch/finally block: the build runs in the try block and the post-build steps go in the finally block. This way, even if the build fails or is aborted, the post-build steps are executed.
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

try {
    // build steps here
} catch (FlowInterruptedException interruptEx) {
    // the build was aborted (user abort or timeout) - handle it here
    throw interruptEx // rethrow to keep the build marked as ABORTED
} finally {
    postBuild(parameters) // post-build steps run even on failure or abort
}

Related

How to create a post-build script for all Jenkins jobs

Is there a way to create a post build script for all Jenkins jobs? Some script that is shared across jobs? I would like to avoid manually creating a post-build script for each job if possible.
AFAIK there is no job that will always run after every other job. You can emulate that by creating a new job and then either configuring a post-build trigger on all your jobs to run the new one, or configuring a build trigger in the new job to run after all the jobs you specify.
However, if all your jobs are pipelines and you have a shared library, you can create a step that is actually a pipeline with a built-in post section - for example, a step called postPipeline.groovy:
def call(Closure body) {
    pipeline {
        agent any
        stages {
            stage('Run pipeline') {
                steps {
                    script {
                        body()
                    }
                }
            }
        }
        post {
            always {
                // << routine post actions go here >>
            }
        }
    }
}
By changing all the pipelines to use this step you ensure they all run the post script:
postPipeline {
    stage('Pipeline stage') {
        // << code >>
    }
    ...
}
Still, either way some manual work is involved.

How to avoid that Jenkins reuses a workspace?

I have several parallel stages that share a node, and clean up their workspace after they are done. The issue I have is that when the stage fails, I want the workspace NOT cleaned up, so I can inspect it.
What happens instead is:
failing stage fails, leaves the workspace as I want it
second stage reuses the workspace, succeeds
second stage cleans up the workspace
How can I avoid this?
Jenkins has a post section for this. Depending on the result of your pipeline, a different branch of code is executed. So let's say your pipeline is successful: then your cleanup script or cleanup plugin is called. If your pipeline fails, you can archive your results or simply skip the cleanup of the workspace.
Check the official Jenkins documentation for more information (search for 'post'): https://jenkins.io/doc/book/pipeline/syntax/
pipeline {
    agent any
    stages {
        stage('PostExample') {
            steps {
                // do something here
            }
        }
    }
    post { // is called after your stages
        failure {
            // pipeline failed - do not clear the workspace
        }
        success {
            // pipeline is successful - clear the workspace
        }
    }
}
On the other hand, if you want to keep your results, you could think about archiving them so they are independent of your workspace - you can access them anytime from the Jenkins GUI.
You can also use a finally block (it executes irrespective of the stage outcome) in a scripted Jenkinsfile; refer to How to perform actions for failed builds in Jenkinsfile.
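As a hedged sketch of such a post section (this assumes the Workspace Cleanup plugin for cleanWs(); the artifact patterns are placeholders):

```groovy
post {
    failure {
        // keep the evidence: archive results so they survive even if the
        // workspace is later reused or cleaned
        archiveArtifacts artifacts: 'logs/**, reports/**', allowEmptyArchive: true
    }
    success {
        cleanWs() // provided by the Workspace Cleanup plugin
    }
}
```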

How to execute script before the job starts?

Is it possible to execute my shell script before the job starts? We are using the Jenkins pipeline, but by the time Jenkins processes the pipeline script it is already too late: we are dealing with an unknown problem involving the keychain and git, and we also use global libraries that need to be downloaded from git before the pipeline script is executed.
Therefore we need to delete the problematic items from the keychain BEFORE Jenkins downloads the global library for the job. Is there anything like this available in Jenkins?
I recommend using a pipeline, so you can control which stage is executed first.
See the example below:
pipeline {
    agent any
    stages {
        stage('before job starts') {
            steps {
                sh 'your_scripts.sh'
            }
        }
        stage('the job') {
            steps {
                sh 'run_job.sh'
            }
        }
    }
    post {
        always {
            echo 'I will always run!'
        }
    }
}

Manual trigger of delivery in Jenkins 2.0 for a successful build

I have a pipeline with build and test stages, and on success archiving the result:
node {
    try {
        stage('build') {
            sh 'make'
        }
        stage('test') {
            sh 'make test'
        }
        stage('archive') {
            archiveArtifacts artifacts: 'build/'
        }
    } finally {
        junit 'tests/*.xml'
    }
}
I'd like to trigger a 'deploy' stage manually from the build page. For instance on the build page, there is a link to download the artifact. I'd like to have a custom button to trigger a 'deploy' action. This action would need to access to the archived artifact and call a web service.
I have tried the 'input' step in my pipeline, but it is not satisfactory: the build does not appear as successful and stays active until I decide to proceed or abort. Since I'm using concurrentBuild=false, it is not a practical solution.
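One workaround is a separate, manually triggered 'deploy' job that fetches the archived artifact and calls the web service. This is only a sketch: it assumes the Copy Artifact plugin is installed, and the job name 'main-job', the artifact path, and the service URL are hypothetical:

```groovy
// Separate 'deploy' pipeline, triggered manually from the Jenkins UI.
node {
    stage('deploy') {
        // pull the archived artifact from the last successful build
        // (copyArtifacts comes from the Copy Artifact plugin)
        copyArtifacts projectName: 'main-job',
                      selector: lastSuccessful(),
                      filter: 'build/**'
        // call the deployment web service with the artifact
        sh 'curl -X POST -F "artifact=@build/app.tar.gz" https://deploy.example.com/api'
    }
}
```

This keeps the main build's status clean: it finishes as SUCCESS, and deployment becomes an independent, on-demand action.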

Run next job in sequence even if the previous job fails in a Jenkins Build flow

I have a build flow which builds 4 jobs in sequence
eg;
build(Job 1)
build(Job 2)
build(Job 3)
build(Job 4)
I want to run Job 4 even if any of the previous jobs fail. How can I do that in the build flow?
You can set propagate to false; that ensures your workflow continues even if a particular job fails:
build job: '<job_name>', propagate: false
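Applied to the four-job sequence above, a pipeline sketch (job names taken from the question) that records each result and still runs Job 4:

```groovy
// Each upstream build continues the flow regardless of its result;
// the results map lets you inspect what happened afterwards.
def results = [:]
['Job 1', 'Job 2', 'Job 3'].each { name ->
    results[name] = build(job: name, propagate: false).result
}
build job: 'Job 4' // runs even if Jobs 1-3 failed
echo "Upstream results: ${results}"
```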
For me, propagate: false didn't work, so I used ignore(FAILURE) instead in my BuildFlow to make sure that all the jobs in the flow execute, even if there are failures. (Ref)
ignore(FAILURE) {
    build("JobToCall", Param1: "param1Val", Param2: "param2Val")
}
You can use Jenkins Workflow Plugin as follows:
try {
    build 'A'
} catch (e) {
    echo 'Build for job A failed'
}
try {
    build 'B'
} catch (e) {
    echo 'Build for job B failed'
}
You can extend this idiom to any number of jobs or any combination of success/failure flow you want (for example, adding build steps inside the catch blocks if you want to build some job in case another failed).
