Jenkins pipeline if/else not working

I am creating a sample Jenkins pipeline; here is the code:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                sh 'echo hello'
            }
        }
        stage('test1') {
            steps {
                sh 'echo $TEST'
            }
        }
        stage('test3') {
            if (env.BRANCH_NAME == 'master') {
                echo 'I only execute on the master branch'
            } else {
                echo 'I execute elsewhere'
            }
        }
    }
}
This pipeline fails with the following error log:
Started by user admin
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 15: Not a valid stage section definition: "if (env.BRANCH_NAME == 'master') {
echo 'I only execute on the master branch'
} else {
echo 'I execute elsewhere'
}". Some extra configuration is required. @ line 15, column 9.
stage('test3') {
^
WorkflowScript: 15: Nothing to execute within stage "test3" @ line 15, column 9.
stage('test3') {
^
But when I execute the following example from this URL, it executes successfully and prints the else part.
node {
    stage('Example') {
        if (env.BRANCH_NAME == 'master') {
            echo 'I only execute on the master branch'
        } else {
            echo 'I execute elsewhere'
        }
    }
}
The only difference I can see is that the working example has no stages block, while mine does.
What is wrong here? Can anyone please suggest?

Your first try uses a declarative pipeline, and the second, working one uses a scripted pipeline. In a declarative pipeline you need to enclose steps in a steps declaration, and you can't use if as a top-level step, so you need to wrap it in a script step. Here's a working declarative version:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                sh 'echo hello'
            }
        }
        stage('test1') {
            steps {
                sh 'echo $TEST'
            }
        }
        stage('test3') {
            steps {
                script {
                    if (env.BRANCH_NAME == 'master') {
                        echo 'I only execute on the master branch'
                    } else {
                        echo 'I execute elsewhere'
                    }
                }
            }
        }
    }
}
You can simplify this and potentially avoid the if statement (as long as you don't need the else) by using "when"; see the "when" directive at https://jenkins.io/doc/book/pipeline/syntax/. You can also validate Jenkinsfiles using the Jenkins REST API. It's super sweet. Have fun with declarative pipelines in Jenkins!
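For example, here is a minimal sketch of linting a Jenkinsfile via the declarative linter endpoint (this assumes the Pipeline: Declarative plugin is installed and that JENKINS_URL points at your server; add authentication or a CSRF crumb if your instance requires them):
# Asks Jenkins to validate the Jenkinsfile in the current directory without running it.
curl -s -X POST -F "jenkinsfile=<Jenkinsfile" "$JENKINS_URL/pipeline-model-converter/validate"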

It requires a bit of rearranging, but when does a good job of replacing the conditionals above. Here's the example from above written using the declarative syntax. Note that the test3 stage is now two different stages: one that runs on the master branch and one that runs on anything else.
stage('Test 3: Master') {
    when { branch 'master' }
    steps {
        echo 'I only execute on the master branch.'
    }
}
stage('Test 3: Dev') {
    when { not { branch 'master' } }
    steps {
        echo 'I execute on non-master branches.'
    }
}

If you want a stage to execute only when an expression holds, you can use the when keyword:
stage('test3') {
    when { expression { return env.BRANCH_NAME == 'master' } }
    steps {
        echo 'I only execute on the master branch.'
    }
}
With the expression keyword you can add any condition, e.g. if a stage depends on a file generated in the workspace:
stage('File Dependent stage') {
    when { expression { return fileExists('myfile') } }
    steps {
        echo "file exists"
    }
}

if (params.build_deploy == '1') {
    println "build_deploy is ${params.build_deploy}"
    jobB = build job: 'k8s-core-user_deploy', propagate: false, wait: true, parameters: [
        string(name: 'environment', value: "${params.environment}"),
        string(name: 'branch_name', value: "${params.branch_name}"),
        string(name: 'service_name', value: "${params.service_name}"),
    ]
    println jobB.getResult()
}

Related

Declarative pipeline: detect a failed build step and trigger the next build step without failing the job

I am trying to detect a failed build step in my Jenkinsfile so that, once the step fails, it triggers my rollback job. Tried many different things, but had no luck. Any help would be greatly appreciated.
pipeline {
    agent any
    stages {
        stage('Git Checkout') {
            steps {
                script {
                    git 'somegit-repo'
                    sh '''
                        mvn package
                    '''
                    echo currentBuild.result
                    catchError {
                        build 'rollback'
                    }
                }
            }
        }
    }
}
One way is to use a shell script with an exit 1 statement,
e.g.
sh "exit 1"
Or you can use the error step:
error('Failing build because...')
See https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#error-error-signal
Use a try/catch block:
node {
    stage("Run scripts") {
        try {
            <some command/script>
        } catch (error) {
            <rollback command/script>
        }
    }
}
Thank you so much. This seems to work!
stages {
    stage("some test") {
        steps {
            script {
                git 'mygitrepo.git'
                try {
                    sh ''' mvn test '''
                } catch (error) {
                    def job = build job: 'rollback-job'
                }
            }
        }
    }
}
If you check the cleaning and notifications page, you can use a post section, get rid of all the try/catch stuff, and end up with a cleaner Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('No-op') {
            steps {
                sh 'ls'
            }
        }
    }
    post {
        always {
            echo 'One way or another, I have finished'
            deleteDir() /* clean up our workspace */
        }
        success {
            echo 'I succeeded!'
        }
        unstable {
            echo 'I am unstable :/'
        }
        failure {
            echo 'I failed :('
        }
        changed {
            echo 'Things were different before...'
        }
    }
}

Declarative pipeline when condition in post

As far as declarative pipelines go in Jenkins, I'm having trouble with the when keyword.
I keep getting the error No such DSL method 'when' found among steps. I'm sort of new to Jenkins 2 declarative pipelines and don't think I am mixing up scripted pipelines with declarative ones.
The goal of this pipeline is to run mvn deploy after a successful Sonar run and send out mail notifications of a failure or success. I only want the artifacts to be deployed when on master or a release branch.
The part I'm having difficulties with is in the post section. The Notifications stage is working great. Note that I got this to work without the when clause, but really need it or an equivalent.
pipeline {
    agent any
    tools {
        maven 'M3'
        jdk 'JDK8'
    }
    stages {
        stage('Notifications') {
            steps {
                sh 'mkdir tmpPom'
                sh 'mv pom.xml tmpPom/pom.xml'
                checkout([$class: 'GitSCM', branches: [[name: 'origin/master']], doGenerateSubmoduleConfigurations: false, submoduleCfg: [], userRemoteConfigs: [[url: 'https://repository.git']]])
                sh 'mvn clean test'
                sh 'rm pom.xml'
                sh 'mv tmpPom/pom.xml ../pom.xml'
            }
        }
    }
    post {
        success {
            script {
                currentBuild.result = 'SUCCESS'
            }
            when {
                branch 'master|release/*'
            }
            steps {
                sh 'mvn deploy'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result,
            )
        }
        failure {
            script {
                currentBuild.result = 'FAILURE'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result
            )
        }
    }
}
In the documentation of declarative pipelines, it's mentioned that you can't use when in the post block. when is allowed only inside a stage directive.
So what you can do is test the conditions using an if in a script:
post {
    success {
        script {
            if (env.BRANCH_NAME == 'master')
                currentBuild.result = 'SUCCESS'
        }
    }
    // failure block
}
Using a GitHub Repository and the Pipeline plugin I have something along these lines:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh '''
                    make
                '''
            }
        }
    }
    post {
        always {
            sh '''
                make clean
            '''
        }
        success {
            script {
                if (env.BRANCH_NAME == 'master') {
                    emailext(
                        to: 'engineers@green-planet.com',
                        subject: "${env.JOB_NAME} #${env.BUILD_NUMBER} master is fine",
                        body: "The master build is happy.\n\nConsole: ${env.BUILD_URL}.\n\n",
                        attachLog: true,
                    )
                } else if (env.BRANCH_NAME.startsWith('PR')) {
                    // also send email to tell people their PR status
                } else {
                    // this is some other branch
                }
            }
        }
    }
}
And that way, notifications can be sent based on the type of branch being built. See the pipeline model definition and also the global variable reference available on your server at http://your-jenkins-ip:8080/pipeline-syntax/globals#env for details.
Ran into the same issue with post. Worked around it by annotating the variable with @groovy.transform.Field. This was based on info I found in the Jenkins docs for defining global variables.
e.g.
#!groovy
pipeline {
    agent none
    stages {
        stage("Validate") {
            parallel {
                stage("Ubuntu") {
                    agent {
                        label "TEST_MACHINE"
                    }
                    steps {
                        sh "run tests command"
                        recordFailures('Ubuntu', 'test-results.xml')
                        junit 'test-results.xml'
                    }
                }
            }
        }
    }
    post {
        unsuccessful {
            notify()
        }
    }
}

// Make testFailures global so it can be accessed from a 'post' step
@groovy.transform.Field
def testFailures = [:]

def recordFailures(key, resultsFile) {
    def failures = ... parse test-results.xml script for failures ...
    if (failures) {
        testFailures[key] = failures
    }
}

def notify() {
    if (testFailures) {
        ... do something here ...
    }
}

Chaining multiple pipelines based on the Jenkins 'post' block

I'm a beginner with Jenkins. I have a code pipeline structure like this:
Repo1 -> Repo2 -> Repo3 -> Deploy
I already created this hierarchy via the GUI, but I want to create it as pipeline code. I want to create a chain of pipelines where I clone different repos, perform tests on them, and then continue to another repo based on the current pipeline's post result.
This is my Jenkinsfile (pseudo-code, as it gives me an error when building):
pipeline {
    agent any
    stages {
        stage('Build Repo1') {
            steps {
                sh 'echo "repo1 build!"'
            }
        }
        stage('Test Repo1') {
            steps {
                sh 'echo "repo success!"'
            }
        }
    }
    post {
        success {
            pipeline {
                agent any
                stages {
                    stage('Build Repo2') {
                        steps {
                            sh 'echo "build repo2!"'
                        }
                    }
                    stage('Test Repo2') {
                        steps {
                            sh 'echo "test repo2!"'
                        }
                    }
                }
                post {
                    success {
                        // continue to generate pipeline for repo3
                        echo 'This will always run'
                    }
                    failure {
                        echo 'This will run only if failed'
                    }
                }
            }
        }
        failure {
            echo 'This will run only if failed'
        }
        unstable {
            echo 'This will run only if the run was marked as unstable'
        }
        changed {
            echo 'This will run only if the state of the Pipeline has changed'
            echo 'For example, if the Pipeline was previously failing but is now successful'
        }
    }
}
Please help!
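One thing to note is that a pipeline block cannot be nested inside post; a post condition may only contain steps. A common alternative (a minimal sketch, where 'repo2-pipeline' is a hypothetical downstream job containing the Repo2 stages) is to give each repo its own job and trigger the next one from the post block with the build step:
pipeline {
    agent any
    stages {
        stage('Build Repo1') {
            steps {
                sh 'echo "repo1 build!"'
            }
        }
        stage('Test Repo1') {
            steps {
                sh 'echo "repo1 test!"'
            }
        }
    }
    post {
        success {
            // Trigger the next pipeline in the chain; 'repo2-pipeline' is a
            // hypothetical job name that would hold the Repo2 stages.
            build job: 'repo2-pipeline', wait: false
        }
        failure {
            echo 'Repo1 failed, so Repo2 is not triggered'
        }
    }
}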

How to lock multiple stages of declarative Jenkins pipeline?

I want to run multiple stages inside a lock within a declarative Jenkins pipeline:
pipeline {
    agent any
    stages {
        lock(resource: 'myResource') {
            stage('Stage 1') {
                steps {
                    echo "my first step"
                }
            }
            stage('Stage 2') {
                steps {
                    echo "my second step"
                }
            }
        }
    }
}
I get the following error:
Started by user anonymous
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 10: Expected a stage @ line 10, column 9.
lock(resource: 'myResource') {
^
WorkflowScript: 10: Stage does not have a name @ line 10, column 9.
lock(resource: 'myResource') {
^
WorkflowScript: 10: Nothing to execute within stage "null" @ line 10, column 9.
lock(resource: 'myResource') {
^
3 errors
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:116)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:430)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:393)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:257)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:405)
Finished: FAILURE
What's the problem here? The documentation explicitly states:
lock can be also used to wrap multiple stages into a single
concurrency unit
It should be noted that you can lock all stages in a pipeline by using the lock option:
pipeline {
    agent any
    options {
        lock resource: 'shared_resource_lock'
    }
    stages {
        stage('will_already_be_locked') {
            steps {
                echo "I am locked before I enter the stage!"
            }
        }
        stage('will_also_be_locked') {
            steps {
                echo "I am still locked!"
            }
        }
    }
}
This has since been addressed.
You can lock multiple stages by grouping them in a parent stage, like this:
stage('Parent') {
    options {
        lock('something')
    }
    stages {
        stage('one') {
            ...
        }
        stage('two') {
            ...
        }
    }
}
(Don't forget you need the Lockable Resources Plugin)
The problem is that, despite the fact that declarative pipelines were technically available in beta in September, 2016, the blog post you reference (from October) is documenting scripted pipelines, not declarative (it doesn't say as much, so I feel your pain). Lockable resources hasn't been baked in as a declarative pipeline step in a way that would enable the feature you're looking for yet.
You can do:
pipeline {
    agent { label 'docker' }
    stages {
        stage('one') {
            steps {
                lock('something') {
                    echo 'stage one'
                }
            }
        }
    }
}
But you can't do:
pipeline {
    agent { label 'docker' }
    stages {
        lock('something') {
            stage('one') {
                steps {
                    echo 'stage one'
                }
            }
            stage('two') {
                steps {
                    echo 'stage two'
                }
            }
        }
    }
}
And you can't do:
pipeline {
    agent { label 'docker' }
    stages {
        stage('one') {
            lock('something') {
                steps {
                    echo 'stage one'
                }
            }
        }
    }
}
You could use a scripted pipeline for this use case.
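For reference, a minimal scripted sketch of that approach (the resource name 'something' is just a placeholder):
node {
    // In a scripted pipeline, lock() can wrap several stage blocks directly.
    lock('something') {
        stage('one') {
            echo 'stage one'
        }
        stage('two') {
            echo 'stage two'
        }
    }
}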
If the resource is only used by this pipeline you could also disable concurrent builds:
pipeline {
    agent any
    options {
        disableConcurrentBuilds()
    }
    stages {
        stage('will_already_be_locked') {
            steps {
                echo "I am locked before I enter the stage!"
            }
        }
        stage('will_also_be_locked') {
            steps {
                echo "I am still locked!"
            }
        }
    }
}
Although the options{} block offers this functionality, it is not possible to use it in some cases.
Let's say you have to name your lock() with a specific name depending on a branch or an environment. You have a pipeline which you don't want to be blocked by disableConcurrentBuilds(), and you want to lock resources depending on a discriminator. You cannot name your lock() inside the options{} block using an environment variable or any other variable from the pipeline, because the block is evaluated outside the agent.
The best solution in my opinion is the following:
pipeline {
    agent { label 'docker' }
    stages {
        stage('Wrapper') {
            steps {
                script {
                    lock(env.BRANCH_NAME) {
                        stage('Stage 1') {
                            sh('echo "stage1"')
                        }
                        stage('Stage 2') {
                            sh('echo "stage2"')
                        }
                    }
                }
            }
        }
    }
}
Keep in mind that the script {} block takes a block of Scripted Pipeline and executes it within the Declarative Pipeline, so no steps{} blocks are allowed inside.
I run multiple build and test containers on the same build nodes. The test containers must lock the node name, which is used as the DB username for the tests.
lock(resource: "${env.NODE_NAME}" as String, variable: 'DBUSER')
Locks in options are computed at load time, but NODE_NAME is unknown that early. In order to lock multiple stages, we can create stages inside a script block, such as the 'run test' stage in the snippet below. The stage visualization is just as good as with other stage blocks.
pipeline {
    agent any
    stages {
        stage('refresh') {
            steps {
                echo "freshing on $NODE_NAME"
                lock(resource: "${env.NODE_NAME}" as String, variable: 'DBUSER') {
                    sh '''
                        printenv | sort
                    '''
                    script {
                        stage('run test')
                        sh '''
                            printenv | sort
                        '''
                    }
                }
            }
        }
    }
}

How do I assure that a Jenkins pipeline stage is always executed, even if a previous one failed?

I am looking for a Jenkinsfile example of having a step that is always executed, even if a previous step failed.
I want to ensure that I archive some build results in case of failure, and I need to be able to have an always-running step at the end.
How can I achieve this?
We switched to using Jenkinsfile Declarative Pipelines, which lets us do things like this:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh './gradlew check'
            }
        }
    }
    post {
        always {
            junit 'build/reports/**/*.xml'
        }
    }
}
References:
Tests and Artifacts
Jenkins Pipeline Syntax
try {
    sh "false"
} finally {
    stage 'finalize'
    echo "I will always run!"
}
Another possibility is to use a parallel section in combination with a lock. For example:
pipeline {
    agent any
    stages {
        stage('Parallel stages') {
            parallel {
                stage('Stage 1') {
                    steps {
                        lock('MY_LOCK') {
                            echo 'do stuff 1'
                        }
                    }
                }
                stage('Stage 2') {
                    steps {
                        lock('MY_LOCK') {
                            echo 'do stuff 2'
                        }
                    }
                }
            }
        }
    }
}
Parallel stages in a parallel section only abort other stages in the same parallel section if the fail fast option for the parallel section is set. See the docs.
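For completeness, here is a minimal sketch of that fail fast option in declarative syntax (only use it if you do want the surviving branch aborted):
pipeline {
    agent any
    stages {
        stage('Parallel work') {
            failFast true  // abort the other parallel branch as soon as one fails
            parallel {
                stage('Stage 1') {
                    steps { echo 'do stuff 1' }
                }
                stage('Stage 2') {
                    steps { echo 'do stuff 2' }
                }
            }
        }
    }
}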
