How to run post actions if setting up the shared library fails? - jenkins

I'm using a shared library via @Library('shared-lib') _. The pipeline script implements post actions, e.g.
post {
    always {
        script {
            // Do stuff
        }
    }
}
When there's an error with the shared library, Jenkins just fails the entire build, and the post block is apparently not executed (tested with a wrong repository URL and a non-existing branch). In case GitHub is down, I want Jenkins to run the post actions to notify the issuer of the build that it failed. Is there a way to do this without having the issuer make API calls of some kind for verification?
Thanks for any suggestion!

One way you can control the loading of the shared library is by loading it dynamically. When you do so, you can wrap the loading phase in a try/catch block and handle the failure.
However, when using this technique the error is handled outside the pipeline execution, so in order to avoid duplicating the error handler (which sends the notifications), you can define the error handling in a separate method (or in a shared library) and call it from both the catch block and the post block.
Something like:
try {
    library "shared-lib"
}
catch (Exception ex) {
    // handle the exception
    handleError(ex.getMessage())
}

pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                ...
            }
        }
    }
    post {
        always {
            script {
                // pass an actual message; the build status is one useful option
                handleError("Build finished with status: ${currentBuild.currentResult}")
            }
        }
    }
}

def handleError(message) {
    emailext ...
}
You can also try to load the library inside a pipeline step, thus utilizing the post directive on failure, but this can cause issues with the context of the loaded library and is therefore not recommended.
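For completeness, here is roughly what that in-step loading looks like (the stage name and the notification step are placeholders):
node {
    // intentionally left empty: this sketch only shows where loading happens
}
pipeline {
    agent any
    stages {
        stage('Load Library') {
            steps {
                script {
                    // a failure here lets the post section below run,
                    // but the loaded library's context can be problematic
                    library 'shared-lib'
                }
            }
        }
    }
    post {
        failure {
            // placeholder: notify the issuer here (mail, emailext, etc.)
            echo 'Loading the shared library (or a later stage) failed'
        }
    }
}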
You can also, of course, handle each failure type separately and avoid the need for an external function.
One last thing: shared library failures are usually not handled, because if the job fails to load the library from the SCM, it will probably fail to load the pipeline itself from the SCM. So, assuming you host them both on the same SCM platform, this scenario is relatively rare.

Related

jenkins job dsl plugin issue where none of the internal jobs have access to the external jobs

I am using jenkins jobDsl as follows:
#!groovy
node('master') {
    stage('Prepare') {
        deleteDir()
        checkout scm
    }
    stage('Provision Jobs') {
        jobDsl(targets: ['jenkins/commons.groovy', 'folderA/jenkins/jobA.groovy'].join('\n'),
               removedJobAction: 'DELETE',
               removedViewAction: 'DELETE',
               sandbox: false)
    }
}
I want jobA.groovy to use a function that is defined in commons.groovy.
Currently, jobA.groovy doesn't have access to the function defined in commons.groovy. How can I allow this behavior?
Attached:
jobA.groovy:
test_job("param1", "param2")
commons.groovy:
def test_job(String team, String submodule) {
    pipelineJob("${team}/${submodule}/test_job") {
        displayName("Test Job")
        description("This is a Continuous Integration job for testing")
        properties {
            githubProjectUrl("githubUrl")
        }
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('githubUrl')
                            credentials('credentials')
                            refspec('+refs/pull/*:refs/remotes/origin/pr/*')
                        }
                        branch('${sha1}')
                        scriptPath("scriptPath")
                    }
                }
            }
        }
    }
}
The idea is to be able to call the method test_job("param1", "param2") from jobA.groovy with no issues, but I am currently getting:
ERROR: (jobA.groovy, line 9) No signature of method: test_job() is applicable for argument types: (java.lang.String, java.lang.String)
JobDSL creates the jobs; then, at runtime, you want your job to call your function. The function must be imported through a shared library.
Create a shared lib
here is a sample: https://github.com/sap-archive/jenkins-pipelayer
The most important piece there is that you need to create a vars/ folder that defines the functions you can call from your pipelines. Host the lib in its own repo or an orphan branch.
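For example, a minimal function in vars/ could look like this (the file name testJob.groovy and its body are just an illustration):
// vars/testJob.groovy in the shared library repo (hypothetical name)
def call(String team, String submodule) {
    // whatever logic you want to reuse across pipelines
    echo "Running tests for ${team}/${submodule}"
}
A pipeline that imports the library can then simply call testJob('param1', 'param2').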
Import a shared lib
To import a library in Jenkins, go from the Manage page to Configure System. Under the section Global Pipeline Libraries, add a new library with a name of your choice, e.g. name-of-your-lib, default version master, modern SCM, git, https://urlofyoursharedlib.git
Run the jobDSL job a first time, then go to the In-process Script Approval page and approve everything.
Use a shared lib
To import the library inside your job, you must include at the top of the file, after the shebang, the statement @Library('name-of-your-lib') _
There is also a similar statement, library 'name-of-your-lib'. This one is useful to debug and fix a shared library, because when you hit the replay button you'll see the shared library files used in the pipeline.
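So at the top of your pipeline, either form works (using the library name you configured above):
#!groovy
@Library('name-of-your-lib') _

// or, dynamically (useful with the replay button):
library 'name-of-your-lib'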
Finally, if all you are trying to do is create job templates, I would recommend looking at what the shared library I linked above is doing; it helps with creating declarative templates and solves issues and limitations you will encounter with JobDSL and shared pipelines.

How to write email configuration in jenkins pipeline to send different set of information for build success and build failure?

I have a pipeline which triggers from SCM. I want to capture as much information as possible about what went wrong if the build fails, and the needed information if the build succeeds. I will be using all the captured info in the mail body (as detailed as possible). I want to know how to capture that info, and whether I need to use try/catch or there is some other way.
Could anyone help me with a solution, please?
I did something similar in a scripted pipeline.
You certainly need try/catch/finally, as the email-sending step needs to run whether the build passes or fails; you catch the exception so that you can take useful information out of it, and then finally run the send-email step.
At a high level, it will look like this
try {
    // put your stages/logic here
} catch (ex) {
    // get exception details, e.g. ex.message
    throw ex
} finally {
    // put the email step here
}
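A slightly fuller sketch, assuming the email-ext plugin provides the emailext step (the stage, subject, and body below are placeholders to adapt):
node {
    def errorMessage = ''
    try {
        stage('Build') {
            // your stages/logic; 'make build' is a placeholder
            sh 'make build'
        }
    } catch (ex) {
        // capture details for the mail body and set the result explicitly,
        // so currentResult is accurate inside the finally block
        errorMessage = ex.getMessage()
        currentBuild.result = 'FAILURE'
        throw ex
    } finally {
        emailext(
            subject: "${env.JOB_NAME} #${env.BUILD_NUMBER}: ${currentBuild.currentResult}",
            body: errorMessage ? "Build failed: ${errorMessage}" : 'Build succeeded.',
            recipientProviders: [requestor()]
        )
    }
}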

Can we use a single jenkins file for multibranch pipeline in jenkins using shared libraries?

I am trying to write a Jenkinsfile which takes data from shared libraries in Jenkins for a multibranch pipeline, something like below:
@Library('Template') _
if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}
The pipeline should take the shared library depending on the if condition: if the branch is master the first branch of the if statement should run, otherwise the else branch should build.
Yes, that's possible. Actually, we're using a multibranch project to test our changes to our shared library that way.
You have to use the library step to load the library instead of the @Library annotation, like:
if (condition) {
    // double quotes so ${env.BRANCH_NAME} is interpolated
    library("someLib@${env.BRANCH_NAME}")
} else {
    library('someOtherLib')
}
See https://jenkins.io/doc/pipeline/steps/workflow-cps-global-lib/#library-load-a-shared-library-on-the-fly for all details.
By the way, in case you're planning to do pull requests, the following post might be useful to you as well: https://stackoverflow.com/a/51915362/4279361

Jenkins DSL script - Test Failure - Found multiple extensions which provide method lastCompleted

Trying to create multijobs in Jenkins with DSL scripting.
There are multiple jobs in a phase, and I want to create a consolidated report for the multijob from the downstream jobs.
I am using copy artifact to copy the results of the downstream jobs to the multijob's target dir, using the selector lastCompleted().
However, I am getting an error saying multiple extensions provide the method, and the tests are failing. lastCompleted() is apparently present in both the copyartifact and multijob plugins, and in this case I require both.
Here is my script:
multiJob('dailyMultiJob') {
    concurrentBuild(true)
    logRotator(-1, 10, -1, 10)
    triggers {
        cron('H H(0-4) * * 0-6')
    }
    steps {
        phase('Smoke Tests') {
            phaseJob('JobA')
            phaseJob('JobB')
            phaseJob('JobC')
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobA')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobB')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobC')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
    }
    publishers {
        allure {
            results {
                resultsConfig {
                    path('target/allure-results')
                }
            }
        }
        archiveArtifacts {
            pattern('target/reports/**/*.*')
            pattern('target/allure-results/**/*.*')
            allowEmpty(true)
        }
    }
}
I am getting the error below after running the Gradle tests:
Caused by: javaposse.jobdsl.dsl.DslException: Found multiple extensions which provide method lastCompleted with arguments []: [[hudson.plugins.copyartifact.LastCompletedBuildSelector, com.tikal.jenkins.plugins.multijob.MultiJobBuildSelector]]
I am not sure if there is a way to indicate that a specific plugin's method should be used.
I've been stuck on this for quite some time. Any help is highly appreciated. Thank you in advance!
I came across the same issue a few months back.
There are two possible solutions to this issue:
1 - Keep only one of the plugins, which avoids the conflict. (Not recommended, as it might break other jobs.)
2 - Use a configure block to modify the XML directly; this avoids the conflict and lets you keep multiple plugins that support the same extensions. (Recommended solution; a rough sketch is below.)
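Something along these lines could work (the exact XML element path is an assumption; check the generated config.xml of your job to confirm it):
multiJob('dailyMultiJob') {
    steps {
        copyArtifacts {
            // no lastCompleted() here, to avoid the ambiguous extension
            projectName('JobA')
            filter('target/allure-results/*.*')
        }
    }
    configure { project ->
        // force the copyartifact selector class via raw XML (path is an assumption)
        def copy = project / 'builders' / 'hudson.plugins.copyartifact.CopyArtifact'
        (copy / selector).@class = 'hudson.plugins.copyartifact.LastCompletedBuildSelector'
    }
}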
Thanks,
Late update: what I ended up doing is switching to scripted pipeline jobs instead.
Configure blocks are not allowed on all the methods you may want to use, and they are limited by design; I believe some plugins also don't allow them for security reasons.
Better to use Pipelines.
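For example, the equivalent copy in a scripted pipeline could look like this (assuming the copyartifact plugin's pipeline step and its lastCompleted() selector symbol):
node {
    stage('Collect results') {
        // the plugin's own pipeline step; no Job DSL extension ambiguity here
        copyArtifacts(
            projectName: 'JobA',
            selector: lastCompleted(),
            filter: 'target/allure-results/*.*'
        )
    }
}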

Abort, rather than error, stage within a Jenkins declarative pipeline

Problem
Our source is a single large repository that contains multiple projects. We need to be able to avoid building all projects within the repository if a commit happens within specific areas. We are managing our build process using pipelines.
Research
The git plugin provides the ability to ignore commits from certain users, paths, or message content. However, as we are using the pipeline, we believe we are experiencing the issue described in JENKINS-36195. In one of the most recent comments, Jesse suggests examining the changeset and returning early if the changes look boring. He mentions that a return statement does not work inside a library, closure, etc., but he doesn't mention how a job could be aborted.
Potential Approaches
We have considered using the error step, but this would result in the job being marked as a failure, which would then need to be investigated.
While a job result can be marked as NOT_BUILT, the job is not aborted but continues to process all stages.
Question
How would you abort a job during an early step without marking it as a failure and without processing all stages of the pipeline (and potentially additional pipelines)?
Can you try using a try/catch/finally block in the pipeline?
try {
    // stages/logic here
} catch (e) {
    // handle or inspect the error
} finally {
    // steps that must always run
}
I suppose you can also use a post build action in the pipeline.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
    }
    post {
        always {
            junit '**/target/*.xml'
        }
        failure {
            mail to: 'team@example.com', subject: 'The Pipeline failed :('
        }
    }
}
The documentation is here: https://jenkins.io/doc/book/pipeline/syntax/#post
You can also try using the snippet below outside of the build step, with your own conditions, as specified by Slawomir Demichowicz in the ticket.
if (stash.isJenkinsLastAuthor() && !params.SKIP_CHANGELOG_CHECK) {
    echo "No more new changes, stopping pipeline"
    currentBuild.result = "SUCCESS"
    return
}
I am not sure whether this helps in your case.
