Jenkinsfile: send email on failure

I set up a multibranch project on Jenkins.
This is my Jenkinsfile:
properties([[$class: 'BuildDiscarderProperty', strategy: [$class: 'LogRotator', artifactDaysToKeepStr: '14', artifactNumToKeepStr: '10', daysToKeepStr: '14', numToKeepStr: '10']]])
node {
    checkout scm
    def lib = load 'cicd/shared-library.groovy'
    stage('build project') {
        lib.compileProject()
    }
    stage('Unit test') {
        lib.executeUnitTest()
    }
    stage('Archive log files') {
        def files = ["failure_services.txt", "unit_test.log"]
        lib.archiveFile(files, "unit_test_result.tar.xz")
    }
    stage('send email') {
        def subject = "Test Result"
        def content = 'Log file attached'
        def toList = ["aaa@gmail.com", "bbb@gmail.com"]
        def ccList = ["xxx@gmail.com", "zzz@gmail.com"]
        def attachmentFiles = ["unit_test_result.tar.xz"]
        lib.sendMail(toList, ccList, subject, content, attachmentFiles)
    }
    cleanWs()
}
Sometimes the Unit test stage results in an error, and in that case the following stages are not executed.
I want the send email stage to be executed under any circumstances.
How can I configure that in the Jenkinsfile?

In a scripted pipeline (shared library) you can define the send-email steps in a function: wrap your pipeline steps in a try-catch-finally block and call the send_email() function in the finally part.
For a declarative pipeline, you can wrap your pipeline steps in a catchError block and send the email outside of it.
Example:
properties([[$class: 'BuildDiscarderProperty', strategy: [$class: 'LogRotator', artifactDaysToKeepStr: '14', artifactNumToKeepStr: '10', daysToKeepStr: '14', numToKeepStr: '10']]])
node {
    def lib // declared outside catchError so the send email stage can see it
    catchError {
        checkout scm
        lib = load 'cicd/shared-library.groovy'
        stage('build project') {
            lib.compileProject()
        }
        stage('Unit test') {
            lib.executeUnitTest()
        }
        stage('Archive log files') {
            def files = ["failure_services.txt", "unit_test.log"]
            lib.archiveFile(files, "unit_test_result.tar.xz")
        }
    }
    stage('send email') {
        def subject = "Test Result"
        def content = 'Log file attached'
        def toList = ["aaa@gmail.com", "bbb@gmail.com"]
        def ccList = ["xxx@gmail.com", "zzz@gmail.com"]
        def attachmentFiles = ["unit_test_result.tar.xz"]
        lib.sendMail(toList, ccList, subject, content, attachmentFiles)
    }
    cleanWs()
}
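If you prefer the try-catch-finally route mentioned above, a minimal scripted sketch (reusing the same shared-library functions from the question) could look like this:
node {
    def lib = null
    try {
        checkout scm
        lib = load 'cicd/shared-library.groovy'
        stage('build project') {
            lib.compileProject()
        }
        stage('Unit test') {
            lib.executeUnitTest()
        }
        stage('Archive log files') {
            lib.archiveFile(["failure_services.txt", "unit_test.log"], "unit_test_result.tar.xz")
        }
    } catch (err) {
        currentBuild.result = 'FAILURE'
        throw err // rethrow so the build is still marked as failed
    } finally {
        stage('send email') {
            if (lib != null) { // lib is null if checkout or load itself failed
                lib.sendMail(["aaa@gmail.com", "bbb@gmail.com"], ["xxx@gmail.com", "zzz@gmail.com"], "Test Result", "Log file attached", ["unit_test_result.tar.xz"])
            }
        }
        cleanWs()
    }
}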

Most likely you just need to add a post section in your pipeline.
The post section defines one or more additional steps that are run upon the completion of a Pipeline’s or stage’s run (depending on the location of the post section within the Pipeline). post can support any of the following post-condition blocks: always, changed, fixed, regression, aborted, failure, success, unstable, unsuccessful, and cleanup. These condition blocks allow the execution of steps inside each condition depending on the completion status of the Pipeline or stage. The condition blocks are executed in the order shown below.
Find further information in the Pipeline syntax docs: https://www.jenkins.io/doc/book/pipeline/syntax/#post
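For the question above, a minimal declarative sketch could look like the following (the emailext call and its recipients are placeholders standing in for the question's lib.sendMail):
pipeline {
    agent any
    stages {
        stage('Unit test') {
            steps {
                sh './run_unit_tests.sh' // placeholder for the real test step
            }
        }
    }
    post {
        always {
            // runs regardless of whether the stages above succeeded
            emailext(subject: 'Test Result', body: 'Log file attached',
                to: 'aaa@gmail.com', attachmentsPattern: 'unit_test_result.tar.xz')
        }
    }
}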

Related

How to call a shared-library pipeline inside another pipeline?

I've written a pipeline as a shared library and I would like to call it as one of the stages of a master pipeline, but I am getting an error that node is probably not defined. What is the best approach for that? As an alternative, I rewrote sharedTest as a standard pipeline and used "build job" instead of calling a shared library, but that way I end up repeating code in several places.
So generally I would like to have:
sharedTest as an independent pipeline while also reusing it in some places. The first part is simple, because I can create a separate pipeline where I import the lib and then call its method. The problem is when I want to use the shared pipeline as a stage of the master pipeline.
sharedTest.groovy:
def call() {
    pipeline {
        agent {
            label "ansirobotSpy3-devel"
        }
        parameters {
            choice(name: 'TEST', choices: ['bts1', 'bts2'], description: '')
            string(name: 'PATH', defaultValue: '/bts1/', description: '')
        }
        environment {
            HTTPS_PROXY = 'http://1.1.1.1'
            HTTP_PROXY = 'http://1.1.1.1'
        }
        stages {
            stage('Test stage') {
                steps {
                    script {
                        sh "ls -lart ./*"
                        installPyLibs('pytest')
                    }
                }
            }
        }
    }
}
master pipeline:
...
stage("tests") {
    agent none
    options {
        skipDefaultCheckout()
    }
    when {
        beforeAgent true
        allOf {
            not { expression { currentBuild.result == 'ABORTED' } }
            not { expression { SharedTest == 'true' } }
        }
    }
    steps {
        script {
            stage("Seek && Destroy") {
                sharedTest()
            }
            stage("Deploy") {
                def deploy = build job: 'Deploy',
                    parameters: [
                        string(name: 'BUILD_NUMBER', value: "${env.NEW_BUILD_NR_VAR}")
                    ], wait: true, propagate: false
            }
...
In my experience Jenkins doesn't allow using shared libraries locally. As a workaround I registered my shared library this way:
library identifier: 'LIBRARYNAME#BRANCH',
    retriever: modernSCM([$class: 'GitSCMSource',
        credentialsId: 'CREDENTIALS_FROM_JENKINS',
        id: 'GUID',
        remote: 'CLONE_LINK_TO_GIT_REPO',
        traits: [[$class: 'jenkins.plugins.git.traits.BranchDiscoveryTrait']]])
This can also be achieved by registering the library in the UI. More on that here: https://www.jenkins.io/doc/book/pipeline/shared-libraries/
As for the code - I assume your pipeline is inside the vars folder of your repository (see the folder structure described in the docs above). This way it will be accessible during the pipeline.
Let's assume I have the file vars/internalStepTestEcho.groovy. After loading the library it can be invoked as:
internalStepTestEcho()
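A minimal sketch of what that vars file could contain (the body is an assumption for illustration):
// vars/internalStepTestEcho.groovy
def call() {
    // any pipeline steps can go here; echo is just a stand-in
    echo 'internalStepTestEcho was called'
}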

How to configure dynamic parameters in declarative pipeline (Jenkinsfile)?

Using the declarative pipeline syntax, I want to be able to define parameters based on an array of repos, so that when starting the build, the user can check/uncheck the repos that should not be included when the job runs.
final String[] repos = [
    'one',
    'two',
    'three',
]
pipeline {
    parameters {
        booleanParam(name: ...) // static param
        // now include a booleanParam for each item in the `repos` array
        // like this but it's not allowed
        script {
            repos.each {
                booleanParam(name: it, defaultValue: true, description: "Include the ${it} repo in the release?")
            }
        }
    }
    // later on, I'll loop through each repo and do stuff only if its value in `params` is `true`
}
Of course, you can't have a script within the parameters block, so this won't work. How can I achieve this?
Using the Active Choices Parameter plugin is probably the best choice, but if for some reason you can't (or don't want to) use a plugin, you can still achieve dynamic parameters in a Declarative Pipeline.
Here is a sample Jenkinsfile:
def list_wrap() {
    sh(script: 'echo choice1 choice2 choice3 choice4 | sed -e "s/ /\\n/g"', returnStdout: true)
}
pipeline {
    agent any
    stages {
        stage('Gather Parameters') {
            steps {
                timeout(time: 30, unit: 'SECONDS') {
                    script {
                        properties([
                            parameters([
                                choice(
                                    description: 'List of arguments',
                                    name: 'service_name',
                                    choices: 'DEFAULT\n' + list_wrap()
                                ),
                                booleanParam(
                                    defaultValue: false,
                                    description: 'Whether we should apply changes',
                                    name: 'apply'
                                )
                            ])
                        ])
                    }
                }
            }
        }
        stage('Run command') {
            when { expression { params.apply == true } }
            steps {
                sh """
                echo choice: ${params.service_name} ;
                """
            }
        }
    }
}
This embeds a script {} block in a stage, which calls a function that runs a shell script on the agent/node of the Declarative Pipeline and uses the script's output to set the choices for the parameters. The parameters are then available in the next stages.
The gotcha is that you must first run the job without build parameters in order for Jenkins to populate the parameters, so they are always one run out of date. That's why the Active Choices Parameter plugin is probably the better idea.
You could also combine this with an input command to cause the pipeline to prompt the user for a parameter:
script {
    def INPUT_PARAMS = input message: 'Please Provide Parameters', ok: 'Next',
        parameters: [
            choice(name: 'ENVIRONMENT', choices: ['dev', 'qa'].join('\n'), description: 'Please select the Environment'),
            choice(name: 'IMAGE_TAG', choices: getDockerImages(), description: 'Available Docker Images')]
    env.ENVIRONMENT = INPUT_PARAMS.ENVIRONMENT
    env.IMAGE_TAG = INPUT_PARAMS.IMAGE_TAG
}
Credit goes to Alex Lashford (https://medium.com/disney-streaming/jenkins-pipeline-with-dynamic-user-input-9f340fb8d9e2) for this method.
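Note that getDockerImages() is not defined in the snippet above. A hypothetical sketch (the registry URL and the curl/jq usage are assumptions) might be:
// Hypothetical helper: returns image tags as a newline-separated string,
// which is the format the choice parameter expects.
def getDockerImages() {
    // Assumes curl and jq are available on the node and the registry
    // exposes the Docker Registry HTTP API v2.
    return sh(
        script: 'curl -s https://registry.example.com/v2/my-app/tags/list | jq -r ".tags[]"',
        returnStdout: true
    ).trim()
}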
You can use a choice parameter in Jenkins, from which the user can select a repository.
pipeline {
    agent any
    parameters {
        choice(name: "REPOS", choices: ['REPO1', 'REPO2', 'REPO3'])
    }
    stages {
        stage('stage 1') {
            steps {
                // the repository selected by the user will be printed
                println("$params.REPOS")
            }
        }
    }
}
You can also use the Active Choices Parameter plugin if you want to do multiple select: https://plugins.jenkins.io/uno-choice/#documentation
You can also open the Pipeline Syntax page in Jenkins and configure the parameters there to generate a code snippet.
Copy the generated snippet and paste it at the start of your Jenkinsfile.
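The generated snippet has the same properties/parameters shape already shown in the question, for example:
properties([
    parameters([
        booleanParam(name: 'one', defaultValue: true, description: 'Include the one repo in the release?')
    ])
])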

Jenkins parameterized input in stage

I know how to request user input for the whole pipeline using the parameters directive. Now I want to be able to request user input inside a specific stage and access that value inside the stage. Here's my pipeline:
pipeline {
    agent none
    stages {
        stage('Stage 1') {
            when {
                beforeAgent true
                expression {
                    timeout(time: 30, unit: "SECONDS") {
                        input message: 'Should we continue?', ok: 'Yes',
                            parameters: [[
                                $class: 'ChoiceParameterDefinition',
                                choices: ['Stable release', 'SNAPSHOT'],
                                description: 'Which Version?',
                                name: 'version'
                            ]]
                    }
                    return true
                }
            }
            agent any
            steps {
                echo 'Checking dependencies ...'
                echo "${params}"
            }
        }
    }
}
In this pipeline I'm able to prompt the user to choose between Stable release and SNAPSHOT inside the Stage 1 stage. However, I'm not able to access this value via ${params.version}. Any ideas how to solve this?
I managed to work around the problem and read the input chosen by the user, as in the following pipeline:
def version // define a global variable for the whole pipeline.
pipeline {
    agent none
    stages {
        stage('Stage 1') {
            when {
                beforeAgent true
                expression {
                    timeout(time: 30, unit: "SECONDS") {
                        // Assign the variable here.
                        version = input message: 'Should we continue?', ok: 'Yes',
                            parameters: [[
                                $class: 'ChoiceParameterDefinition',
                                choices: ['Stable release', 'SNAPSHOT'],
                                description: 'Which Version?',
                                name: 'v'
                            ]]
                    }
                    return true
                }
            }
            agent any
            steps {
                // And finally access it.
                echo "${version}"
            }
        }
    }
}

How to force Jenkins to reload a Jenkinsfile?

My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, I do not see the job's input screen updated accordingly in Jenkins; I have to start a run, cancel it, and only then do I see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*'),
        string(name: 'b', description: '*'),
        string(name: 'c', description: '*'),
    ])
])
Any clues?
One of the ugliest things I've done to get around this is to create a Refresh parameter which basically exits the pipeline right away. That way I can run the pipeline just to update its properties:
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately TomDotTom's answer was not working for me - I had the same issue, and my Jenkins required an additional stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a 'steps' block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax docs).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
This applies to Jenkins 2.233.
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently it is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.
I overcame this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipeline:
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and regenerates the pipeline's configuration, including params. After the pipeline is created/updated, Job DSL queues it with the special param RELOAD.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline without an error at the end of a stage, which causes a "red" pipeline.)
Since the parameters in the Jenkinsfile are declared in properties, they override anything set by the seed job, such as RELOAD. At this point the pipeline is ready with the actual params, without any sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])
pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters, as they are not available on the first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed are updated and run once to fetch the updated params list. A run with the display name "Parameter Initialization" indicates such a job.
There is potentially a way to improve this so that only affected pipelines are updated, but I haven't explored that, as all my pipelines are in one repository and I'm happy with always updating them.
Another upgrade, if someone doesn't like aborting with error(), would be a when condition in every other stage to skip it if the RELOAD parameter is set, but I find adding when to every other stage cumbersome.
I initially tried TomDotTom's answer, but I didn't like the manual effort.
Scripted pipeline workaround - you can probably make it work in declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that Poll SCM must be enabled on the job so that Jenkinsfile changes are detected automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
I solved this using the Jenkins Job Builder Python package, whose main goal is to achieve Jenkins job as code.
To solve your problem, I simply keep a definition like the one below in SCM, together with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job, so that whenever I trigger my job all the needed parameters are already in place.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
        stage('info') {
          print params
        }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load/update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note - to use the jenkins-jobs command you need to install the jenkins-job-builder Python package (e.g. pip install jenkins-job-builder).
This package has a lot of features, like creating (free-style, pipeline, multibranch) jobs, and updating, deleting, and validating Jenkins job configurations. It supports templates - meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, and so on.
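As an illustration of the template feature, a minimal sketch under assumed names (note the doubled braces, since JJB runs template bodies through Python string formatting):
- job-template:
    # '{name}' is filled in by each project that instantiates the template
    name: '{name}-info'
    project-type: pipeline
    dsl: |
      node() {{
        stage('info') {{
          print params
        }}
      }}

- project:
    name: app1
    jobs:
      - '{name}-info'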

Use Jenkins 'Mailer' inside pipeline workflow

I'd like to leverage the existing Mailer plugin from Jenkins within a Jenkinsfile that defines a pipeline build job. Given the following simple failure script, I would expect an email on every build.
stage 'Test'
node {
    try {
        sh 'exit 1'
    } finally {
        step([$class: 'Mailer', notifyEveryUnstableBuild: true, recipients: 'me@me.com', sendToIndividuals: true])
    }
}
The output from the build is:
Started by user xxxxx
[Pipeline] stage (Test)
Entering stage Test
Proceeding
[Pipeline] node
Running on master in /var/lib/jenkins/jobs/rpk-test/workspace
[Pipeline] {
[Pipeline] sh
[workspace] Running shell script
+ exit 1
[Pipeline] step
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
As you can see, it records that it performed the pipeline step immediately after the failure, but no email gets generated.
Emails in other free-style jobs that leverage the Mailer work fine; the problem only occurs when invoking it from pipeline jobs.
This is running with Jenkins 2.2 and Mailer 1.17.
Is there a different mechanism by which I should be invoking failed-build emails? I don't need all the overhead of the mail step, just notifications on failures and recoveries.
In a Pipeline, a failed sh step doesn't immediately set currentBuild.result to FAILURE; its initial value is null. Hence, build steps that rely on the build status, like Mailer, can appear to work incorrectly.
You can check this by adding a debug print:
stage 'Test'
node {
    try {
        sh 'exit 1'
    } finally {
        println currentBuild.result // this prints null
        step([$class: 'Mailer', notifyEveryUnstableBuild: true, recipients: 'me@me.com', sendToIndividuals: true])
    }
}
The whole pipeline is wrapped with an exception handler provided by Jenkins, which is why Jenkins only marks the build as failed at the very end.
So if you want to utilize Mailer you need to maintain the build status properly. For instance:
stage 'Test'
node {
    try {
        sh 'exit 1'
        currentBuild.result = 'SUCCESS'
    } catch (any) {
        currentBuild.result = 'FAILURE'
        throw any // rethrow exception to prevent the build from proceeding
    } finally {
        step([$class: 'Mailer', notifyEveryUnstableBuild: true, recipients: 'me@me.com', sendToIndividuals: true])
    }
}
If you don't need to re-throw the exception, you can use catchError. It is a Pipeline built-in which catches any exception within its scope, prints it to the console, and sets the build status. For instance:
stage 'Test'
node {
    catchError {
        sh 'exit 1'
    }
    step([$class: 'Mailer', notifyEveryUnstableBuild: true, recipients: 'me@me.com', sendToIndividuals: true])
}
In addition to izzekil's excellent answer, you may wish to choose email recipients based on the commit authors. You can use email-ext to do this (based on their pipeline examples):
step([$class: 'Mailer',
    notifyEveryUnstableBuild: true,
    recipients: emailextrecipients([[$class: 'CulpritsRecipientProvider'],
                                    [$class: 'RequesterRecipientProvider']])])
If you're using a recent email-ext (2.50+), you can use that in your pipeline:
emailext(body: '${DEFAULT_CONTENT}', mimeType: 'text/html',
    replyTo: '$DEFAULT_REPLYTO', subject: '${DEFAULT_SUBJECT}',
    to: emailextrecipients([[$class: 'CulpritsRecipientProvider'],
                            [$class: 'RequesterRecipientProvider']]))
If you're not using a declarative Jenkinsfile, you will need to add checkout scm so Jenkins can find the committers. See JENKINS-46431.
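In a scripted pipeline that could look like this (a minimal sketch, with the emailext call from above):
node {
    checkout scm // needed so email-ext can resolve culprits/committers
    // ... build and test steps ...
    emailext(body: '${DEFAULT_CONTENT}', mimeType: 'text/html',
        replyTo: '$DEFAULT_REPLYTO', subject: '${DEFAULT_SUBJECT}',
        to: emailextrecipients([[$class: 'CulpritsRecipientProvider'],
                                [$class: 'RequesterRecipientProvider']]))
}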
If you're still on an older version of email-ext, you'll hit JENKINS-25267. You could roll your own HTML email:
def emailNotification() {
    def to = emailextrecipients([[$class: 'CulpritsRecipientProvider'],
                                 [$class: 'DevelopersRecipientProvider'],
                                 [$class: 'RequesterRecipientProvider']])
    String currentResult = currentBuild.result
    String previousResult = currentBuild.getPreviousBuild().result

    def causes = currentBuild.rawBuild.getCauses()
    // E.g. 'started by user', 'triggered by scm change'
    def cause = null
    if (!causes.isEmpty()) {
        cause = causes[0].getShortDescription()
    }
    // Ensure we don't keep a list of causes, or we get
    // "java.io.NotSerializableException: hudson.model.Cause$UserIdCause"
    // see http://stackoverflow.com/a/37897833/509706
    causes = null

    String subject = "$env.JOB_NAME $env.BUILD_NUMBER: $currentResult"
    String body = """
    <p>Build $env.BUILD_NUMBER ran on $env.NODE_NAME and terminated with $currentResult.
    </p>
    <p>Build trigger: $cause</p>
    <p>See: $env.BUILD_URL</p>
    """
    String log = currentBuild.rawBuild.getLog(40).join('\n')
    if (currentResult != 'SUCCESS') {
        body = body + """
        <h2>Last lines of output</h2>
        <pre>$log</pre>
        """
    }
    if (to != null && !to.isEmpty()) {
        // Email on any failures, and on first success.
        if (currentResult != 'SUCCESS' || currentResult != previousResult) {
            mail to: to, subject: subject, body: body, mimeType: "text/html"
        }
        echo 'Sent email notification'
    }
}
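A minimal sketch of how this function might be wired into a scripted pipeline (the sh step is a placeholder for the real build):
node {
    try {
        sh 'make build' // placeholder for the real build steps
        currentBuild.result = 'SUCCESS'
    } catch (err) {
        currentBuild.result = 'FAILURE'
        throw err
    } finally {
        emailNotification()
    }
}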
I think a better way to send mail notifications in Jenkins pipelines is to use the post section of a pipeline, as described in the Jenkins docs, instead of using try/catch:
pipeline {
    agent any
    stages {
        stage('whatever') {
            steps {
                ...
            }
        }
    }
    post {
        always {
            step([$class: 'Mailer',
                notifyEveryUnstableBuild: true,
                recipients: "example@example.com",
                sendToIndividuals: true])
        }
    }
}
