How to define post-build actions for a Jenkins Multibranch Pipeline project?
There is a separate option available when you have a simple (freestyle) project, but not for a Multibranch Pipeline.
To add post-build steps to a Multibranch Pipeline, you need to code these steps into a finally block; an example is below:
node {
    try {
        stage("Checkout") {
            // checkout scm
        }
        stage("Build & test") {
            // build & unit test
        }
    } catch (e) {
        // fail the build if an exception is thrown
        currentBuild.result = "FAILED"
        throw e
    } finally {
        // Post-build steps here
        /* Success or failure, always run post-build steps */
        // send email
        // publish test results, etc.
    }
}
Online examples exist for most of the post-build steps you might want, showing how to write them in pipeline format. If you have a specific one in mind, please list it here.
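For instance, the placeholder comments in that finally block could be filled in like this (a minimal sketch; it assumes the JUnit and Email Extension plugins are installed, and the report path and recipient address are placeholders):
finally {
    // Success or failure, always publish test results
    junit '**/target/surefire-reports/TEST-*.xml'

    // Send a notification email (Email Extension plugin)
    emailext(
        subject: "${env.JOB_NAME} #${env.BUILD_NUMBER} - ${currentBuild.result ?: 'SUCCESS'}",
        body: "See ${env.BUILD_URL}",
        to: 'team@example.com'
    )
}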
When you write a pipeline, you describe the whole flow yourself, which gives you great flexibility to do whatever you want, including running post-build steps.
You can see an example of using post-build steps in a pipeline I wrote:
https://github.com/geek-kb/Android_Pipeline/blob/master/Jenkinsfile
Example from that code:
run_in_stage('Post steps', {
    sh """
        # Add libCore.so files to symbols.zip
        find ${cwd}/Product-CoreSDK/obj/local -name libCore.so | zip -r ${cwd}/Product/build/outputs/symbols.zip -#
        # Remove unaligned apk's
        rm -f ${cwd}/Product/build/outputs/apk/*-unaligned.apk
    """
})
I'm using the Cucumber reports plugin in my declarative pipeline like this:
cucumber '**/cucumber.json'
I'm able to check whether some tests fail through the link on the sidebar, but do I need to do something to mark the stage containing the cucumber.json check as failed if some cucumber reports fail? The problem is that the build and stage are both green and successful despite some failed cucumber reports.
Jenkins version is 2.176.3
Cucumber reports version is 4.10.0
The cucumber command you are using just generates the report, regardless of the test result.
So yes, you have to make your pipeline fail somehow; the problem you are facing is that your test command is not failing the pipeline.
The way to go is to make the command that runs the tests return a non-zero exit code (exit 1) if something went wrong in your tests. That will make your pipeline stage go red.
If you run your tests using Maven, this is handled automatically by 'mvn test' (or whatever goal you use).
Otherwise, if you cannot do that, you will have to arrange something like an sh script that returns the exit code (0 pass / 1 fail), or a Groovy function inside a 'script' block that sets the pipeline's currentBuild.result value:
def checkTestResult() {
    // Check some file to see if tests went fine or not
    return 'SUCCESS' // or 'FAILURE'
}
...
stage('Check tests') {
    steps {
        script {
            currentBuild.result = checkTestResult()
            if (currentBuild.result == 'FAILURE') {
                sh "exit 1" // Force pipeline exit with build result failed
            }
        }
    }
}
...
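If your test runner cannot be made to return a failing exit code, another option is to inspect the cucumber.json report directly. Below is a minimal sketch, assuming the Pipeline Utility Steps plugin (for readJSON) and the standard cucumber.json layout; the report path is a placeholder:
def cucumberFailed() {
    // cucumber.json is an array of features, each holding scenarios ("elements"),
    // each holding steps with a result.status of 'passed', 'failed', etc.
    def report = readJSON file: 'target/cucumber.json' // path is an assumption
    return report.any { feature ->
        feature.elements.any { scenario ->
            scenario.steps.any { step -> step.result.status != 'passed' }
        }
    }
}
You could then call it inside a script block, e.g. script { if (cucumberFailed()) { error 'Cucumber scenarios failed' } }.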
I recommend running the cucumber command in an 'always' post-build action of your declarative pipeline, as it is a step you will likely want to execute every time at the end of the pipeline, whether it passes or fails. See the following example:
pipeline {
    agent any
    stages {
        stage('Get code') {
            steps {
                checkout scm // or whatever gets your code
            }
        }
        stage('Run tests') {
            steps {
                sh "mvn test" // run_tests.sh or groovy code
            }
        }
    }
    post {
        always {
            cucumber '**/cucumber.json'
        }
    }
}
It is possible to set buildStatus: 'FAILURE' to mark the build as failed if a report is marked as failed.
cucumber fileIncludePattern: '**/cucumber.json', buildStatus: 'FAILURE'
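Put together with a post section, that could look like this (a short sketch):
post {
    always {
        // buildStatus marks the whole build FAILED when the report contains failed scenarios
        cucumber fileIncludePattern: '**/cucumber.json', buildStatus: 'FAILURE'
    }
}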
My Jenkinsfile does not compile anymore when I try to add a post action. This post section should print to the Jenkins console output at the end of the build.
Part I is my Jenkinsfile code, for which builds work well.
Part II is the patch added to Part I, with which every build fails.
I want to integrate Part I and Part II to get the expected output described hereafter, but the integration fails however the insertion is made.
I have tried a lot of things and I'm stuck now, so any help will be appreciated.
// Part I : my base code
node {
    def mvnHome
    stage('Preparation') {
        git 'https://github.com/jglick/simple-maven-project-with-tests.git'
        // Get the Maven tool.
        // ** NOTE: This 'M3' Maven tool must be configured
        // ** in the global configuration.
        mvnHome = tool 'M3'
    }
    stage('Build') {
        // Run the maven build
        if (isUnix()) {
            sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
        } else {
            bat(/"${mvnHome}\bin\mvn" -Dmaven.test.failure.ignore clean package/)
        }
    }
    stage('Results') {
        junit '**/target/surefire-reports/TEST-*.xml'
        archiveArtifacts 'target/*.jar'
    }
}
// Part II : code to add to the previous code
post {
    always {
        echo 'I have finished and deleting workspace'
        // deleteDir()
    }
    success {
        echo 'Job succeeded!'
    }
    unstable {
        echo 'I am unstable :/'
    }
    failure {
        echo 'I failed :('
    }
    changed {
        echo 'Things were different before...'
    }
}
Expected output in the console: 'Job succeeded!', 'I am unstable :/', or 'I failed :(' depending on the Jenkins build status, and the workspace should always be cleaned before each new build.
The actual result is this error message in the console output:
java.lang.NoSuchMethodError: No such DSL method 'post' found among steps [archive, bat, build, catchError, checkout, deleteDir, dir ......
You are mixing up scripted and declarative pipeline syntax. post is part of declarative pipeline, but you are using the scripted variant (no pipeline block, just node).
In scripted pipeline you have to use try/catch/finally instead.
See the documentation.
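For example, your Part II could be translated into scripted syntax roughly like this (a sketch; it reproduces the always/success/unstable/failure branches with try/catch/finally around your existing stages):
node {
    try {
        // ... your existing Preparation / Build / Results stages ...
    } catch (e) {
        currentBuild.result = 'FAILURE'
        throw e
    } finally {
        // 'always' behaviour
        echo 'I have finished and deleting workspace'
        // deleteDir()
        def result = currentBuild.result ?: 'SUCCESS' // null means nothing has failed so far
        if (result == 'SUCCESS') {
            echo 'Job succeeded!'
        } else if (result == 'UNSTABLE') {
            echo 'I am unstable :/'
        } else {
            echo 'I failed :('
        }
    }
}
The junit step in your Results stage already sets the build to UNSTABLE when tests fail, so the unstable branch works without extra code.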
I'm trying to automate the creation of a Jenkins Pipeline build from within a pipeline.
I have a pipeline which creates a Bitbucket repository and commits some code to it, including a Jenkinsfile.
I need to add another step to this pipeline to then create the Pipeline build for it, which would run the steps in the Jenkinsfile.
I think the Job DSL should be able to handle this, but the documentation I've found for it has been sparse, and I'm still not entirely sure whether it's possible or how to do it.
Any help would be appreciated. I imagine the generated Pipeline build just needs a link to the repository and to be told to run the Jenkinsfile there?
Yes, Job DSL is what you need for your use case.
See this and this to help you get started.
EDIT
pipeline {
    agent {
        label 'slave'
    }
    stages {
        stage('stage') {
            steps {
                // some other steps
                jobDsl scriptText: '''pipelineJob('new-job') {
                    def repo = 'https://xxxxx@bitbucket.org/xxxx/dummyrepo.git'
                    triggers {
                        scm('H/5 * * * *')
                    }
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        url(repo)
                                        credentials('bitbucket-jenkins-access')
                                    }
                                    branches('master')
                                    scriptPath('Jenkinsfile')
                                    extensions { }
                                }
                            }
                        }
                    }
                }'''
            }
        }
    }
}
Documentation - https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob-scm-git
By using the Python library jenkins-job-builder you can easily create your expected pipeline or freestyle job from another pipeline or from any other remote location.
Example:
steps-1
python3 -m venv .venv
source .venv/bin/activate
pip install jenkins-job-builder
steps-2
Once you have done the above, create two files: one named config.ini and the other job.yml. Please note there are no strict rules about the file names; they are up to you.
The contents of the config.ini file can look like:
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
password = jenkins-password
query_plugins_info = False
url = http://jenkins-url.net
user = jenkins-username
If you are creating a pipeline job, then your job.yml file can look like:
- job:
    name: pipeline01
    display-name: 'pipeline01'
    description: 'Do not edit this job through the web!'
    project-type: pipeline
    dsl: |
      node() {
          stage('hello') {
              sh 'echo "Hello World!"'
          }
      }
steps-3
After all the above, invoke the command below:
jenkins-jobs --conf config.ini update job.yml
Note: the jenkins-jobs command is only available if you have followed steps-1.
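As a side note, if you want to verify what would be generated before touching the Jenkins server, jenkins-job-builder also provides a test subcommand that renders the job XML locally (a usage sketch; output/ is an arbitrary directory):
jenkins-jobs --conf config.ini test job.yml -o output/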
I have a project which has multiple build pipelines to allow for different types of builds against it (no, I don't have the ability to make one build out of it; that is outside my control).
Each of these pipelines is represented by a Jenkinsfile in the project repo, and each one must use the same build agent label (they need to share other pieces of configuration as well, but it's the build agent label which is the current problem). I'm trying to put the label into some sort of a configuration file in the project repo, so that all the Jenkinsfiles can read it.
I expected this to be simple, as you don't need this config data until you have already checked out a copy of the sources to read the Jenkinsfile. As far as I can tell, it is impossible.
It seems to me that a Jenkinsfile cannot read files from SCM until the project has done its SCM step. However, that's too late: the argument to agent{label} is read before any stages get run.
Here's a minimal case:
final def config
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                checkout scm // we don't need all the submodules here
                echo "Reading configuration JSON"
                script { config = readJSON file: 'buildjobs/buildjob-config.json' }
                echo "Read configuration JSON"
            }
        }
        stage('Build and Deploy') {
            agent {
                label config.agent_label
            }
            steps {
                echo 'Got into Stage 2'
            }
        }
    }
}
When I run this, I get:
java.lang.NullPointerException: Cannot get property 'agent_label' on null object
I don't get either of the echoes from the 'Configure' stage.
If I change the label for the 'Build and Deploy' stage to 'master', the build succeeds and prints out all three echo statements.
Is there any way to read a file from the Git workspace before the agent labels need to be set?
Please see https://stackoverflow.com/a/52807254/7983309. I think you are running into this issue. label is unable to resolve config.agent_label to its updated value. Whatever is set in the first line is being sent to your second stage.
EDIT1:
env.agentName = ''
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                script {
                    env.agentName = 'slave'
                    echo env.agentName
                }
            }
        }
        stage('Finish') {
            steps {
                node(agentName as String) { println env.agentName }
                script {
                    echo agentName
                }
            }
        }
    }
}
Source - In a declarative jenkins pipeline - can I set the agent label dynamically?
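Applied to the question above, the same workaround could be combined with readJSON, along these lines (a sketch; it assumes the Pipeline Utility Steps plugin and the asker's buildjobs/buildjob-config.json file with an agent_label key):
env.agentName = ''
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent { label 'master' }
            steps {
                checkout scm
                script {
                    def config = readJSON file: 'buildjobs/buildjob-config.json'
                    env.agentName = config.agent_label
                }
            }
        }
        stage('Build and Deploy') {
            steps {
                // Allocate the node dynamically from the value read above
                node(env.agentName) {
                    echo "Running on ${env.agentName}"
                }
            }
        }
    }
}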
I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now I am thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline; I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are Libraries, but it does not seem like I could include the stage definition using them.
You will have to create a shared library to implement what I am about to suggest. For shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (kind of a template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with a vars directory in it, you just have to create a Groovy file (let's say commonPipeline.groovy) inside the vars directory.
Here's an example that works; I have used it in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        def check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it in your various Jenkins projects. Since you might already have an existing Jenkinsfile (if not, create one) in your Jenkins project, you just have to replace the existing content of that file with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` and set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]

commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file. So all your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all those projects. Also note that I have used jenkins_var above; it can be any name. It is not actually used here, but passing a Map is required by the call(Map config) signature; a sketch of how to actually consume it follows below.
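If you do want the parameters to carry real data, the call(Map config) signature makes them available inside the shared pipeline. A minimal illustrative sketch (the stage name is hypothetical; jenkins_var comes from the Jenkinsfile above):
// Inside vars/commonPipeline.groovy
def call(Map config) {
    pipeline {
        agent any
        stages {
            stage('Info') {
                steps {
                    // Entries passed from the Jenkinsfile are plain map lookups
                    echo "Triggered from job: ${config.jenkins_var}"
                }
            }
        }
    }
}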
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/