Jenkins continue pipeline on failed stage

I have a Jenkins setup with a bunch of pipelines.
I wrote a new pipeline which can start all of them at once.
I would like the other stages to still build even if one of them fails.
The script currently looks like this:
stage 'CentOS6'
build 'centos6.testing'
stage 'CentOS7'
build 'centos7.testing'
stage 'Debian7'
build 'debian7-x64.testing'
stage 'Debian8'
build 'debian8-x64.testing'
The build scripts themselves contain the node they should run on.
How can the script continue with the following stages even if one of them fails?
Cheers

If they should be run in a sequence you can do something like this:
def buildResult = 'SUCCESS'
try {
    build 'centos6.testing'
} catch (e) {
    buildResult = 'FAILURE'
}
currentBuild.result = buildResult
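For the question's four downstream jobs the same pattern could simply be repeated, so every build is attempted and the overall result still reflects any failure. A rough sketch (assuming a scripted pipeline; job names taken from the question):
// Attempt every downstream build; remember if any of them failed
def buildResult = 'SUCCESS'
['centos6.testing', 'centos7.testing', 'debian7-x64.testing', 'debian8-x64.testing'].each { jobName ->
    stage(jobName) {
        try {
            build jobName
        } catch (e) {
            echo "${jobName} failed"
            buildResult = 'FAILURE'
        }
    }
}
currentBuild.result = buildResult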
If they should be run in parallel you just run them:
https://www.cloudbees.com/blog/parallelism-and-distributed-builds-jenkins

If you use the parallel step, this should work as you expect by default: the failFast option, which aborts the remaining branches as soon as one of them fails, defaults to false.
For example:
parallel(
    centos6: { build 'centos6.testing' },
    centos7: { build 'centos7.testing' },
    debian7: { build 'debian7-x64.testing' },
    debian8: { build 'debian8-x64.testing' }
)
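If you also want the parent job to record which downstream builds failed without the parallel step itself failing the run, one option (my sketch, not part of the original answer) is to pass propagate: false and check the returned results yourself:
def results = [:]
parallel(
    centos6: { results.centos6 = build(job: 'centos6.testing', propagate: false).result },
    centos7: { results.centos7 = build(job: 'centos7.testing', propagate: false).result },
    debian7: { results.debian7 = build(job: 'debian7-x64.testing', propagate: false).result },
    debian8: { results.debian8 = build(job: 'debian8-x64.testing', propagate: false).result }
)
// Fail the parent build if any downstream build did not succeed
if (results.any { it.value != 'SUCCESS' }) {
    currentBuild.result = 'FAILURE'
}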

What worked for me (with wait: false the parent job only triggers the downstream builds and does not wait for or track their results):
parallel(
    'Task': {
        build(job: 'DemoJob-2', wait: false)
        build(job: 'DemoJob-3', wait: false)
    }
)

Related

Jenkins Pipeline - how to execute sequential sub-jobs and propagate the error while keeping the sequence

I am trying to have a pipeline that executes multiple sequential jobs.
The problem is that if I use the propagate: false flag, the jobs are executed but the pipeline build always returns 'Success' regardless of the sub-jobs' status.
If I want the pipeline to reflect the 'Failed' status when a sub-job fails and I remove the propagate flag, the sequence is broken at that failure point and no more jobs are executed.
Can you help me find the best way to achieve this?
I hope I was clear. Thank you very much.
pipeline {
    agent any
    stages {
        stage('Tests') {
            steps {
                parallel(
                    'TestSet': {
                        build wait: true, job: 'Test A'
                        build wait: true, job: 'Test B'
                        build wait: true, job: 'Test C'
                    }
                )
            }
        }
    }
}
When you run the build step it actually returns a RunWrapper object (see the Java docs).
The RunWrapper has a getResult() function which lets you get the result of the build that was executed, alongside many other properties of the executed build, like the build number.
You can then run your jobs with the propagate: false option, save the results, examine them after all builds are finished and then run your required logic.
For example:
pipeline {
    agent any
    stages {
        stage('Tests') {
            steps {
                script {
                    parallel(
                        'TestSet': {
                            // Collect the results of all build executions into a map.
                            // propagate: false keeps the sequence going even if a job fails.
                            def results = [:]
                            results['Test A'] = build(wait: true, propagate: false, job: 'Test A').getResult()
                            results['Test B'] = build(wait: true, propagate: false, job: 'Test B').getResult()
                            results['Test C'] = build(wait: true, propagate: false, job: 'Test C').getResult()
                            // Analyze the results
                            def failedJobs = results.findAll { it.value != 'SUCCESS' } // you can also use Result.SUCCESS instead of the string
                            if (failedJobs) {
                                error "The following jobs have failed: ${failedJobs.collect { it.key }.join(',')}"
                            }
                        }
                    )
                }
            }
        }
    }
}
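If you would rather have later stages still run instead of stopping at the error step, a variation of the final check in the example above (my sketch, not from the original answer) is to record the failure on the build instead of throwing:
// Instead of error(...): mark the build as failed but let the pipeline continue
if (failedJobs) {
    echo "The following jobs have failed: ${failedJobs.collect { it.key }.join(',')}"
    currentBuild.result = 'FAILURE' // or 'UNSTABLE' if a warning state is preferred
}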

Jenkins / How to deploy with one click

I am working on a project with Git and Jenkins (Pipeline).
I want to build the project at every commit but only deploy it when the chief wants.
So I would like to have something like two pipelines: one that runs at every commit and only builds / tests, and one that I can run by clicking a button labelled "click me to deploy" that does the job.
Must I create 2 Jenkins jobs, or is there a plugin or a way to do this with 1 job?
I have searched but found nothing about this.
You can achieve this with one job using the input step (Pipeline). As part of your pipeline, after the build and test execution, add an input step (wait for interactive input) and then add the deployment-related stages.
So for each check-in a Jenkins build will trigger, but it will complete only the build and test stages; after that it will wait for the chief's approval to proceed with the deployment.
Reference: https://jenkins.io/doc/pipeline/steps/pipeline-input-step
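A minimal scripted sketch of that approach (stage names and shell commands are placeholders); note that written this way the node's executor stays occupied while waiting for approval, which the declarative example in the next answer avoids:
node {
    stage('Build & Test') {
        sh './build-and-test.sh'   // placeholder build/test command
    }
    stage('Approval') {
        input message: 'Deploy to production?'
    }
    stage('Deploy') {
        sh './deploy.sh'           // placeholder deploy command
    }
}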
This is an example of how to build a pipeline that builds, waits for input, and deploys when the input is yes. If the input timeout is exceeded then the job will exit. If the timeout is not needed it can be omitted and the pipeline will wait indefinitely without consuming an executor (note the agent annotation in the top-level pipeline and in each stage).
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'master' }
            steps {
                sh 'build something'
            }
        }
        stage('Production deploy confirmation') {
            options {
                timeout(time: 60, unit: 'SECONDS')
            }
            input {
                message "Deploy to production?"
                ok "Yes"
            }
            steps {
                echo 'Confirmed production deploy'
            }
        }
        stage('Deploy Production') {
            agent { label 'master' }
            steps {
                sh 'deploy something'
            }
        }
    }
}
Try a parameterized job with a Boolean parameter and two separate stages for Build and Deploy:
pipeline {
    agent any
    parameters {
        booleanParam(name: 'deploy_param', defaultValue: false, description: 'Check if you want to deploy')
    }
    stages {
        stage("Build") {
            steps {
                // build steps
            }
        }
        stage("Deploy") {
            when {
                environment name: 'deploy_param', value: 'true'
            }
            steps {
                // deploy steps
            }
        }
    }
}
In this way you can have a CI build with the "Deploy" stage turned off, as deploy_param is set to false by default, and also a manual build ("when the chief wants") with the "Deploy" stage turned on by manually setting deploy_param to true.
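An equivalent and arguably more direct form of the same condition (a sketch using the params object instead of the environment comparison) would be:
stage("Deploy") {
    when {
        // params.deploy_param is a Boolean, so it can drive the expression directly
        expression { params.deploy_param }
    }
    steps {
        // deploy steps
    }
}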

Jenkins Pipeline does not execute the next stage

I have a Jenkinsfile like this:
node any {
def global variables.
stage {
//build job 1
build job 'job1'
}
stage{
//build job 2
build job 'job2'
}
What's happening when I run this is: job 1 gets successfully built on Jenkins, but anything written after the first 'build' statement doesn't get executed. I have tried moving the stages around; the first always works, but the second doesn't, as control never reaches the second stage.
What am I doing wrong here?
Did you try naming your stages? As shown here: https://jenkins.io/doc/book/pipeline/#scripted-pipeline-fundamentals
node {
    // define any global variables here
    stage('Build 1') {
        // build job 1
        build job: 'job1'
    }
    stage('Build 2') {
        // build job 2
        build job: 'job2'
    }
}

Jenkins pipeline script to trigger other pipeline jobs

I want to create a parent pipeline job with stages that trigger other jobs, which are also pipeline jobs.
Can I achieve this?
Here is a skeleton of what I want:
Parent job's script:
pipeline {
    parallel {
        stage("A") {
            build 'name of job 1 which is a pipeline job again and has a parallel block with stages in it'
        }
        stage("B") {
            build 'name of job 2 which is a pipeline job again and has a parallel block with stages in it'
        }
        stage("C") {
            build 'name of job 3 which is a pipeline job again and has a parallel block with stages in it'
        }
    }
}
Does it work this way? Is there any way to achieve this?
Sure it does.
This is what we are using: we promote between environments by kicking off the same job from the current execution and don't wait for the result.
build(job: "org/${jobName}/${BRANCH_NAME}",
parameters: [
new StringParameterValue('ENV', env),
new StringParameterValue('ENV_NO', env_no),
new StringParameterValue('ARTIFACT_NAME', params.ARTIFACT_NAME)
],
propagate: false,
wait: false,
)
Refer to the reference for all options
https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
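Putting that together with the skeleton from the question, the parent job could look roughly like this scripted sketch (the child job names are placeholders):
node {
    stage('Trigger child pipelines') {
        parallel(
            A: { build job: 'child-pipeline-1', wait: false },
            B: { build job: 'child-pipeline-2', wait: false },
            C: { build job: 'child-pipeline-3', wait: false }
        )
    }
}
With wait: false the parent only fires the child pipelines and moves on; wait: true (optionally with propagate: false) would make the parent wait for and track their results.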

Jenkins continuous delivery pipeline skip stage based on input

A simplified pipeline will look something like:
1. build
2. unit test
3. deploy to dev
4. integration tests
5. deploy to prod
For step #5 I've set up a Jenkins pipeline input command. We won't be deploying to prod on every commit, so if we abort all those jobs there will be a big list of grey builds. Is it possible to have a skip option so the build can still be shown as green/blue?
There is a better solution I just found. You can access the result of the input by using its return value. The user has to check the checkbox to run the optional stage; otherwise the steps of the stage are skipped. If you skip the whole stage, the stage will disappear, and that "cleans" the stage view history.
stage('do optional stuff?') {
    // With a single parameter, input returns that parameter's value directly
    userInput = input(
        id: 'userInput', message: "Some important question?", parameters: [
            booleanParam(defaultValue: false, description: 'really?', name: 'myValue')
        ])
}
stage('optional: do magic') {
    if (userInput) {
        echo "do magic"
    } else {
        // do whatever you want when skipping this stage
        currentBuild.result = "UNSTABLE"
    }
}
How about:
stage('Deploy') {
    when { branch 'master' }
    steps {
        sh '...'
    }
}
The stage will be skipped for commits on other branches and the build will still be green.
Can't you do something like this? The build will be blue/green whatever you choose from the input, and you can then run the deployment depending on it too:
def deployToProduction = true
try {
    input 'Deploy to Production'
} catch (e) {
    deployToProduction = false
}
if (deployToProduction) {
    println "Deploying to production"
}
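To avoid the build waiting at the input indefinitely, the same idea could be wrapped in a timeout (a sketch; the timeout value is arbitrary):
def deployToProduction = true
try {
    timeout(time: 30, unit: 'MINUTES') {
        input 'Deploy to Production'
    }
} catch (e) {
    // A timeout or a manual abort both land here; treat either as "skip the deploy"
    deployToProduction = false
}
if (deployToProduction) {
    echo 'Deploying to production'
}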
Instead of using the Pipeline-as-code feature of Jenkins 2, you can set up jobs with a downstream/upstream configuration:
Build -> Unit test -> Deploy to Dev -> Integration tests -> Promote to Prod -> Deploy to Prod
At present this gives more control over choosing which version of the pipeline you wish to promote to Prod.
For greater visibility you can configure a delivery pipeline view using the Delivery Pipeline plugin.
