Jenkins publishers postBuildScripts doesn't work

I have a groovy script that sets up a scheduled job in Jenkins.
I want to execute some shell scripts when the build fails.
If I add the scripts manually after the job has been created or updated by the groovy script, they run.
But the groovy script does not add them:
job('TestingAnalysis') {
    triggers {
        cron('H 8 28 * *')
    }
    steps {
        shell('some jiberish to create error')
    }
    publishers {
        postBuildScripts {
            steps {
                shell('echo "fff"')
                shell('echo "FFDFDF"')
            }
            onlyIfBuildSucceeds(false)
            onlyIfBuildFails(true)
        }
        retryBuild {
            rerunIfUnstable()
            retryLimit(3)
            fixedDelay(600)
        }
    }
}
Everything works fine except:
postBuildScripts {
    steps {
        shell('echo "fff"')
        shell('echo "FFDFDF"')
    }
    onlyIfBuildSucceeds(false)
    onlyIfBuildFails(true)
}
This is my result (the post-build steps are missing from the created job):
I tried postBuildSteps and got an error as well. I also tried the following, which also errors:
postBuildScripts {
    steps {
        sh 'echo "ggg"'
    }
    onlyIfBuildSucceeds(false)
    onlyIfBuildFails(true)
}

Take a look at JENKINS-66189: it seems there is an issue with version 3.0 of the PostBuildScript plugin, in which the old syntax (that you are using) is no longer supported. In order to use the new version in a Job DSL script you will need to use the Dynamic DSL syntax.
Use the following link in your own Jenkins instance to see the correct usage:
YOUR_JENKINS_URL/plugin/job-dsl/api-viewer/index.html#path/freeStyleJob-publishers-postBuildScript
It will help you build the correct command. In your case it will be:
job('TestingAnalysis') {
    triggers {
        cron('H 8 28 * *')
    }
    steps {
        shell('some jiberish to create error')
    }
    publishers {
        postBuildScript {
            buildSteps {
                postBuildStep {
                    stopOnFailure(false) // Mandatory setting
                    results(['FAILURE']) // Replaces onlyIfBuildFails(true)
                    buildSteps {
                        shell {
                            command('echo "fff"')
                        }
                        shell {
                            command('echo "FFDFDF"')
                        }
                    }
                }
            }
            markBuildUnstable(false) // Mandatory setting
        }
    }
}
Notice that instead of using functions like onlyIfBuildSucceeds and onlyIfBuildFails, you now just pass a list of the relevant build results to the results function (SUCCESS, UNSTABLE, FAILURE, NOT_BUILT, ABORTED).
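For example, to run the post-build steps for unstable builds as well as failed ones, you would pass both results. A sketch based on the Dynamic DSL snippet above:
postBuildStep {
    stopOnFailure(false)
    results(['FAILURE', 'UNSTABLE']) // run on failed or unstable builds
    buildSteps {
        shell {
            command('echo "notify"')
        }
    }
}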

Conditional post section in Jenkins pipeline

Say I have a simple Jenkins pipeline file as below:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh ...
            }
        }
        stage('Build') {
            steps {
                sh ...
            }
        }
        stage('Publish') {
            when {
                buildingTag()
            }
            steps {
                sh ...
                send_slack_message("Built tag")
            }
        }
    }
    post {
        failure {
            send_slack_message("Error building tag")
        }
    }
}
Since there are a lot of non-tag builds every day, I don't want to send any Slack message about non-tag builds. But for tag builds, I want to send either a success message or a failure message, regardless of which stage it failed in.
So for the above example, I want:
When it's a tag build and the 'Test' stage fails, I shall see an "Error building tag" message. (This works in the example.)
When it's a tag build and all stages succeed, I shall see a "Built tag" message. (This also works in the example.)
When it's not a tag build, no Slack message is ever sent. (This is not the case in the example: when the 'Test' stage fails, there will still be an "Error building tag" message.)
As far as I know, there's no such thing as a "conditional post section" in Jenkins pipeline syntax, which could really help me out here. So my question is: is there any other way I can do this?
post {
    failure {
        script {
            if (isTagBuild) {
                send_slack_message("Error building tag")
            }
        }
    }
}
where isTagBuild is whatever way you have to differentiate between a tag and a non-tag build.
You could also apply the same logic and move send_slack_message("Built tag") down into a success post condition.
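For example, a minimal sketch that keys off env.TAG_NAME, which Jenkins sets for tag builds in multibranch pipelines (send_slack_message stands in for whatever notification step you use):
post {
    success {
        script {
            if (env.TAG_NAME) { // only set for tag builds
                send_slack_message("Built tag")
            }
        }
    }
    failure {
        script {
            if (env.TAG_NAME) {
                send_slack_message("Error building tag")
            }
        }
    }
}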
In the post section you can also use a script step with an if inside, and within that if you can call the emailext plugin.
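For instance, a sketch using the Email Extension plugin's emailext step (assuming that plugin is installed; the recipient address is made up here):
post {
    failure {
        script {
            if (env.TAG_NAME) { // only notify for tag builds
                emailext(
                    subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                    body: "See ${env.BUILD_URL}",
                    to: 'team@example.com' // hypothetical recipient
                )
            }
        }
    }
}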
Well, for those who just want some copy-pastable code, here's what I ended up with, based on @eez0's answer.
pipeline {
    agent any
    environment {
        BUILDING_TAG = 'no'
    }
    stages {
        stage('Setup') {
            when {
                buildingTag()
            }
            steps {
                script {
                    BUILDING_TAG = 'yes'
                }
            }
        }
        stage('Test') {
            steps {
                sh ...
            }
        }
        stage('Build') {
            steps {
                sh ...
            }
        }
        stage('Publish') {
            when {
                buildingTag()
            }
            steps {
                sh ...
            }
        }
    }
    post {
        failure {
            script {
                if (BUILDING_TAG == 'yes') {
                    slackSend(color: '#dc3545', message: "Error publishing")
                }
            }
        }
        success {
            script {
                if (BUILDING_TAG == 'yes') {
                    slackSend(color: '#28a745', message: "Published")
                }
            }
        }
    }
}
As you can see, I'm really relying on Jenkins' built-in buildingTag() condition to help me sort things out, using an env var as a "bridge". I'm really not good at Jenkins pipeline, so please leave comments if you have any suggestions.

Jenkins DSL configure block makes duplicate tags

I'm trying to create a Job DSL script which creates a multibranch-pipeline job.
The job is created successfully, but some configuration is missing from the multibranch-pipeline job, so I tried to use the "configure" block.
The configure block was applied, but it created a duplicate tag of jenkins.branch.BranchSource. I guess I am missing something; I tried tons of manipulations but nothing worked for me.
Any advice?
This is my groovy DSL:
multibranchPipelineJob('TestDocker_pipeline_DSL') {
    branchSources {
        git {
            remote(gitUrl)
            credentialsId('Dev_Builder_ssh')
            //includes("(V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
        }
        configure {
            it / sources / data / "jenkins.branch.BranchSource" << "jenkins.plugins.git.GitSCMSource" {
                id("8fd33e1d-07b6-4cc4-8f1c-a18d955b4b6e")
                remote(gitUrl)
                credentialsId('Dev_Builder_ssh')
                traits {
                    "jenkins.scm.impl.trait.RegexSCMHeadFilterTrait" {
                        regex("V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
                    }
                }
            }
        }
    }
    factory {
        workflowBranchProjectFactory {
            scriptPath('main/Docker/DockerJenkinsfileSlave.groovy')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(3)
        }
    }
}
And this is the job XML being created:
Well, after a lot of struggling, I think my problem was that I didn't define some of the tags as plugins in the groovy DSL, and removing the "git" section also helped.
So the final groovy that finally worked was this one:
branchSources {
    configure {
        it / sources / data / "jenkins.branch.BranchSource" << source(class: "jenkins.plugins.git.GitSCMSource", plugin: "git@3.9.2") {
            remote(gitUrl)
            credentialsId('Dev_Builder_ssh')
            includes('*')
            excludes('')
            ignoreOnPushNotifications(false)
            traits {
                "jenkins.scm.impl.trait.RegexSCMHeadFilterTrait" {
                    regex("(V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
                }
            }
        }
    }
}
Which resulted in this beautiful XML job:

Jenkins: use a shared library in the options phase

So I have created a shared library in Jenkins with a listener that gets triggered each time the pipeline reads a FlowNode, so I can run groovy code before and after each stage, step, etc.
I'm able to call the shared library in a steps block like this:
pipeline {
    agent any
    stages {
        stage('prepare') {
            steps {
                prepareStepsWrapper()
            }
        }
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
        stage('step2') {
            steps {
                echo 'step2'
            }
        }
        stage('step3') {
            steps {
                echo 'step3'
                // fail on purpose
                sh 'notfoundexecutablelol'
            }
        }
        stage('step4') {
            steps {
                echo 'step4'
            }
        }
    }
    post {
        always {
            println env.getEnvironment()
        }
    }
}
And it works pretty great!
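For reference, a step like prepareStepsWrapper() would typically live in the shared library's vars/ directory; a minimal sketch (the actual listener registration from the question is omitted):
// vars/prepareStepsWrapper.groovy in the shared library
def call() {
    echo 'Registering stage/step listeners...'
    // listener setup logic goes here
}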
With this approach, the 'prepare' stage needs to be filtered out, so I've switched to the options directive:
pipeline {
    agent any
    options {
        prepareStepsWrapper()
    }
    stages {
        stage('step1') {
            steps {
                echo 'step1'
            }
        }
        ...
    }
}
But the pipeline fails with
WorkflowScript: 4: Invalid option type "prepareStepsWrapper"
tl;dr; How can I load a shared library within the options directive?
What does the options directive do?
The options directive allows configuring Pipeline-specific options
from within the Pipeline itself.
You can't call the shared library in the options directive. This section should not be used to execute any logic; rather, it sets configuration options for the pipeline. All available options and their documentation can be found here.
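For illustration, the options block is meant for configuration values like these (a typical sketch):
options {
    timeout(time: 1, unit: 'HOURS') // abort the build if it runs too long
    buildDiscarder(logRotator(numToKeepStr: '10')) // keep only the last 10 builds
    disableConcurrentBuilds() // never run two builds of this job at once
}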
You could try to create a stage that simply calls your prepareStepsWrapper() and use locks to prevent other stages from executing before this stage, as sketched below.
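A sketch of that suggestion, assuming the Lockable Resources plugin provides the lock step (the resource name is made up):
pipeline {
    agent any
    stages {
        stage('prepare') {
            steps {
                lock(resource: 'pipeline-prepare') { // hypothetical resource name
                    prepareStepsWrapper()
                }
            }
        }
        // remaining stages follow and will not start before 'prepare' finishes
    }
}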

How to catch manual UI cancel of job in Jenkinsfile

I've tried to find documentation about how, in a Jenkinsfile pipeline, to catch the error that occurs when a user cancels a job in the Jenkins web UI.
I haven't gotten the post or try/catch/finally approaches to work; they only work when something fails within the build.
This causes resources not to be freed up when someone cancels a job.
What I have today is a script within a declarative pipeline, like so:
pipeline {
    agent none
    stages {
        stage("test") {
            steps {
                parallel(
                    unit: {
                        node("main-builder") {
                            script {
                                try { sh "<build stuff>" } catch (ex) { report } finally { cleanup }
                            }
                        }
                    }
                )
            }
        }
    }
}
So, everything within the catch(ex) and finally blocks is ignored when a job is manually cancelled from the UI.
Non-declarative approach:
When you abort a pipeline build, an exception of type org.jenkinsci.plugins.workflow.steps.FlowInterruptedException is thrown. Release resources in the catch block and re-throw the exception.
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

def releaseResources() {
    echo "Releasing resources"
    sleep 10
}

node {
    try {
        echo "Doing steps..."
    } catch (FlowInterruptedException interruptEx) {
        releaseResources()
        throw interruptEx
    }
}
Declarative approach (UPDATED 11/2019):
According to the Jenkins Declarative Pipeline docs, under the post section:
cleanup
Run the steps in this post condition after every other post condition has been evaluated, regardless of the Pipeline or stage’s status.
So that should be a good place to free resources, no matter whether the pipeline was aborted or not.
def releaseResources() {
    echo "Releasing resources"
    sleep 10
}

pipeline {
    agent none
    stages {
        stage("test") {
            steps {
                parallel(
                    unit: {
                        node("main-builder") {
                            script {
                                echo "Doing steps..."
                                sleep 20
                            }
                        }
                    }
                )
            }
            post {
                cleanup {
                    releaseResources()
                }
            }
        }
    }
}
You can add a "cleanup" post condition to the stage:
post {
    cleanup {
        script { ... }
        sh "remove lock"
    }
}

Job DSL to create "Pipeline" type job

I have installed the Pipeline plugin, which used to be called the Workflow plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
    definition {
        cps {
            script('logic-here')
            sandbox()
        }
    }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo 'logic'
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
or load it from a file located in the workspace:
pipelineJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
            sandbox()
        }
    }
}
Example:
Seed-job file structure:
jobs
\- productJob.groovy
logic
\- productPipeline.jenkinsfile
then productJob.groovy content:
pipelineJob('product-job') {
    definition {
        cps {
            script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
            sandbox()
        }
    }
}
I believe this question is asking how to use the Job DSL to create a pipeline job which references the Jenkinsfile for the project, and doesn't combine the job creation with the detailed step definitions as the answers to date have done. This makes sense: the Jenkins job creation and metadata configuration (description, triggers, etc.) could belong to Jenkins admins, but the dev team should have control over what the job actually does.
@meallhour, is the below what you're after? (Works as of Job DSL 1.64.)
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it
                // only covers the 'master' branch and may not give you
                // enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
        }
    }
}
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The example above worked for me. Here's another one based on it:
pipelineJob('Your App Pipeline') {
    def repo = 'https://github.com/user/yourApp.git'
    def sshRepo = 'git@git.company.com:user/yourApp.git'
    description("Your App Pipeline")
    keepDependencies(false)
    properties {
        githubProjectUrl(repo)
        rebuild {
            autoRebuild(false)
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(sshRepo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
}
If you build the pipeline first through the UI, you can use the config.xml file and the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
In Job DSL, pipeline is still called workflow, see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
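A minimal sketch of the workflowJob variant, assuming it mirrors the pipelineJob structure shown above (the file name is made up):
workflowJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy')) // hypothetical file in the seed job workspace
        }
    }
}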
First you need to install the Job DSL plugin, then create a freestyle project in Jenkins and select "Process Job DSLs" from the dropdown in the Build section.
Select "Use the provided DSL script" and provide the following script:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage name 1') {
                            steps {
                                // your logic here
                            }
                        }
                        stage('Stage name 2') {
                            steps {
                                // your logic here
                            }
                        }
                    }
                }
            ''')
        }
    }
}
Or you can create your job by pointing to a Jenkinsfile located in a remote git repository:
pipelineJob("job-name") {
definition {
cpsScm {
scm {
git {
remote {
url("<REPO_URL>")
credentials("<CREDENTIAL_ID>")
}
branch('<BRANCH>')
}
}
scriptPath("<JENKINS_FILE_PATH>")
}
}
}
If you are using a git repo, add a file called Jenkinsfile at the root directory of your repo. This should contain your job DSL.
