How to trigger a job when one job is finished - Jenkins

I used this:
post {
    always {
        sh "echo Jenkins Job is Done"
        junit 'target/surefire-reports/*.xml'
        echo 'Run darkWeb Test pipeline!'
        build job: 'DarkWeb'
    }
}
and it works. The problem is that the original job continues running while the second job (DarkWeb) is running too.
I want the 'DarkWeb' job to run only after the original job is completely finished

You can try the following syntax:
build job: 'DarkWeb', wait: false
The wait: false option lets the first job finish without waiting for the second job to complete.
Let me know if it works!
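Putting it together, the post section from the question might look like this (a sketch, assuming the downstream job is named DarkWeb):

```groovy
post {
    always {
        sh "echo Jenkins Job is Done"
        junit 'target/surefire-reports/*.xml'
        echo 'Run darkWeb Test pipeline!'
        // Trigger DarkWeb without blocking this build's completion
        build job: 'DarkWeb', wait: false
    }
}
```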

Related

Jenkins build job: execute jobs in parallel - how to create them programmatically without creating pipelines in the Jenkins GUI

I have 2 functions which I would like to execute in parallel; each job will run on a slave or wherever there are enough resources. The problem is that I don't want to create pipelines in the Jenkins GUI and then execute my function from that pipeline. I would like to be able to create the pipeline on the fly, in code.
this is example what i have now :
// downstream job
build job: "my_job_pipeline_1",
    parameters: [string(name: 'PROJECT_NAME', value: "${PROJECT_NAME}")],
    propagate: false,
    wait: true
// downstream job
build job: "my_job_pipeline_2",
    parameters: [string(name: 'PROJECT_NAME', value: "${PROJECT_NAME}")],
    propagate: false,
    wait: true
This is called from my main pipeline, but for this to work I have to create 2 pipelines in the Jenkins GUI:
my_job_pipeline_2 and my_job_pipeline_1
Can I create those pipelines programmatically?
The Jenkins CLI allows you to create jobs from the terminal. Follow the documentation to set it up. Under Manage Jenkins > Jenkins CLI you can find the available commands (including "create-job").
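For example, assuming the CLI jar is fetched from your own Jenkins instance and config.xml is a job definition exported from an existing job (the server URL and credentials below are placeholders), creating a job from the terminal might look like:

```shell
# Download the CLI jar from the Jenkins server
curl -O http://jenkins.example.com/jnlpJars/jenkins-cli.jar

# Create a new job from an XML job definition
java -jar jenkins-cli.jar -s http://jenkins.example.com \
    -auth user:apitoken create-job my_job_pipeline_1 < config.xml
```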

Jenkins delivery pipeline view -how to add manual trigger(play button)?

I am trying to achieve the following:
A pipeline job with multiple stages and tasks
Configure the above pipeline with the Delivery Pipeline view plugin
The last task of the last stage deploys to production [caution: I want this task to have a manual trigger]
Jenkins version: 2.222.x
Things I tried
node {
    stage 'Build'
    task 'Compile'
    echo 'Compiling'
    sleep 1
    task 'Unit test'
    sleep 1
    stage 'Test'
    task 'Component tests'
    echo 'Running component tests'
    sleep 1
    task 'Integration tests'
    echo 'Running integration tests'
    sleep 1
    stage 'Deploy'
    task 'Deploy to UAT'
    echo 'Deploy to UAT environment'
    sleep 1
    task 'Deploy to production'
    echo 'Deploy to production, but wanted with manual trigger'
    sleep 1
}
Below is the desired configuration I am looking for:
desired configuration, delivery pipeline plugin wiki
I was able to achieve the manual trigger by creating multiple freestyle jobs with an upstream/downstream configuration; for the manual step I can set a post-build job with a manual trigger.
But I want this in a pipeline, because there we have the task feature (inside a stage we can also have separate vertical tasks).
Please help and suggest how to achieve this.
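One common way to implement a manual gate in a scripted pipeline (not mentioned in this thread, so treat it as a suggestion rather than the accepted answer) is the built-in input step, which pauses the build until a user confirms:

```groovy
stage 'Deploy to production'
// Pause here until someone clicks "Deploy"; aborting the prompt aborts the build
input message: 'Deploy to production?', ok: 'Deploy'
echo 'Deploying to production'
```

Whether the Delivery Pipeline view renders this pause as a play button depends on the plugin version.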

Run next job in sequence even if the previous job fails in a Jenkins Build flow

I have a build flow which builds 4 jobs in sequence
eg;
build(Job 1)
build(Job 2)
build(Job 3)
build(Job 4)
I want Job 4 to run even if any of the previous jobs fails. How can I do that in the build flow?
You can set propagate to false; that ensures your workflow continues even if a particular job fails:
build job: '<job_name>', propagate: false
For me, propagate: false didn't work, so I used ignore(FAILURE) instead in my build flow to make sure that all the jobs in the flow execute even if there are failures. (Ref)
ignore(FAILURE) {
build("JobToCall", Param1: "param1Val", Param2: "param2Val")
}
You can use Jenkins Workflow Plugin as follows:
try {
    build 'A'
} catch (e) {
    echo 'Build for job A failed'
}
try {
    build 'B'
} catch (e) {
    echo 'Build for job B failed'
}
You can extend this idiom to any number of jobs or any combination of success/failure flow you want (for example, adding build steps inside catches if you want to build some job in case another failed).
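For instance, the idiom can be wrapped in a small helper so each job takes one line (the function name is illustrative):

```groovy
// Run a downstream job; report failure instead of aborting the flow
def tryBuild(String jobName) {
    try {
        build job: jobName
        return true
    } catch (e) {
        echo "Build for job ${jobName} failed"
        return false
    }
}

tryBuild('A')
tryBuild('B')
```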

Jenkins workflow: check if a job is running or scheduled

Is it possible to check whether some job is running or scheduled from a workflow script?
Although it may seem enough to manage concurrency using stages:
stage name: 'stageName', concurrency: 1
and run builds in those stages like this:
build job: 'test-job', wait: false
it may happen that someone started the job test-job manually, and I want to handle that situation in my workflow script, for example by skipping the build or waiting until the build is finished.
This works in flows that are not using Groovy Sandbox:
for (Project job : Hudson.getInstance().getProjects()) {
    if (job.isBuilding() || job.isInQueue()) {
        echo "${job.getName()}"
    }
}

Running a Post Build script when a Jenkins job is aborted

Is there a possible way / plugin that can run a post build script when a Jenkins job is aborted.
I do see that the Post Build plugin provides an action to execute a set of scripts, but these can be run only in two cases: a successful job or a failed job.
This question is positively answered here.
The Post Build Task plugin is run even if a job is aborted.
Use it to search the log text for "Build was aborted" and you can specify a shell script to run.
Works like a charm. :-)
For a declarative Jenkins pipeline, you can achieve it as follows:
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'MINUTES') // abort on exceeding the timeout
    }
    stages {
        stage('Echo 1') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Sleep') {
            steps {
                sh 'sleep 180'
            }
        }
        stage('Wake up') {
            steps {
                echo "Woken up"
            }
        }
    }
    // this post section is executed if the job is aborted
    post {
        aborted {
            script {
                echo "Damn it. I was aborted!"
            }
        }
    }
}
As far as I know, if a build is aborted, there's no way to execute any build steps (or post build steps) in it any more - which makes sense, that's what I would expect of "abort".
What you could do is create another job that monitors the first one's status, and triggers if it was aborted (e.g. see the BuildResultTrigger plugin).
Another solution might be to create a "wrapper" job, which calls the first one as a build step - this way you can execute additional steps after its completion, like checking its status, even if it was aborted.
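A sketch of such a wrapper pipeline, using propagate: false so the wrapper itself does not fail or abort along with the inner job (the job name is a placeholder):

```groovy
// Run the real job and capture its result instead of propagating it
def downstream = build(job: 'real-job', propagate: false, wait: true)
echo "Downstream finished with: ${downstream.getResult()}"
if (downstream.getResult() == 'ABORTED') {
    echo 'Running cleanup for the aborted build'
}
```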
If you use a scripted pipeline, you can always use a try/catch/finally set, where the build is executed in the try block and the post-build steps are in the finally block. This way, even if the build fails, the post-build steps are executed.
// import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
try {
    build job: 'my-build-job' // the main build (job name is a placeholder)
} catch (FlowInterruptedException interruptEx) {
    echo 'Build was aborted' // handle the interruption here
} finally {
    postBuild(parameters) // post-build steps always run
}
