How to build a pipeline job inside the post section in a Jenkins pipeline

I have a Jenkins pipeline which, among multiple steps, should have a final step that executes regardless of the status of the previous steps. To achieve that, I've tried using a post section, which looks like this:
pipeline {
    agent {
        label 'master'
    }
    stages {
        stage('Stage 1') {
            steps {
                build job: 'stage 1 job', parameters: [
                    ...
                ]
            }
        }
        stage('Stage 2') {
            steps {
                build job: 'stage 2 job', parameters: [
                    ...
                ]
            }
        }
    }
    post {
        always {
            build job: "cleanup", parameters: [
                ...
            ]
        }
    }
}
However, I'm getting the following error when trying to execute something like this:
No such DSL method '$' found among steps
Question: Is it even possible to use the build step inside a post section? If not, what would be a good alternative to ensure that the "cleanup" job always executes at the end (regardless of the status of the stages above)?

Yes, it is possible to use the build step inside a post section. Here is the pipeline script:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                script {
                    echo "Hello"
                }
            }
        }
    }
    post {
        always {
            build job: 'schedule-job', parameters: [string(name: 'PLATFORM', value: 'Windows')]
        }
    }
}
In the above example, I have a schedule-job which accepts a PLATFORM parameter, and it will always run, regardless of the build status.
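As for the original "No such DSL method" error: it usually points to a syntax problem inside the (elided) parameters list rather than to the post section itself, for example a stray ${...} outside a string. A minimal sketch of well-formed typed parameter entries, using hypothetical job and parameter names:

post {
    always {
        // string() and booleanParam() are the standard typed entries
        // for the build step's parameters list (names here are hypothetical)
        build job: 'cleanup', parameters: [
            string(name: 'REGION', value: 'us-east-1'),
            booleanParam(name: 'DRY_RUN', value: false)
        ]
    }
}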

Is it possible to get the build number even if the build is unstable but not failed?

When building a job in a scripted pipeline, I would like to keep the external build number even if that build is unstable but not failed.
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                script {
                    Job1 = build job: 'Job1'
                }
            }
        }
        stage('Job2') {
            steps {
                build job: 'Job2',
                    parameters: [
                        string(
                            name: 'Job1_ID',
                            value: "${Job1.number}"
                        )
                    ]
            }
        }
    }
}
I have tried with a catchError() around the Job1 build, but I still have the problem if the build is unstable.
I have also tried the propagate: false parameter, but then I can never see the actual status of the build visually; plus, I don't want the second build to be triggered if the first one failed.
Is there any solution for that?
What you can do is set propagate: false and then conditionally execute your second Job. Please see the pipeline below.
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                script {
                    Job1 = build job: 'Job1', propagate: false
                }
            }
        }
        stage('Job2') {
            when { expression { return Job1.resultIsBetterOrEqualTo("SUCCESS") } }
            steps {
                build job: 'Job2',
                    parameters: [
                        string(name: 'Job1_ID', value: "${Job1.number}")
                    ]
            }
        }
    }
}
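On the visibility concern ("I can never see the actual status"): with propagate: false the downstream result is still available on the returned object, so you can mirror it onto the current build. A small sketch extending the Job1 stage above:

script {
    Job1 = build job: 'Job1', propagate: false
    // Surface Job1's result (e.g. UNSTABLE) on this build's own status
    // without aborting the rest of the pipeline.
    currentBuild.result = Job1.result
}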

Re-run a pipeline using a script in Jenkins

I have a pipeline whose relevant details are shown below:
pipeline {
    agent any
    parameters {
        booleanParam(name: 'RERUN', defaultValue: false, description: 'Run Failed Tests')
    }
    stages {
        stage('Run tests') {
            steps {
                runTest()
            }
        }
    }
    post {
        always {
            reRun()
        }
    }
}
def reRun() {
    if ("SUCCESS".equals(currentBuild.result)) {
        echo "LAST BUILD WAS SUCCESS"
    } else if ("UNSTABLE".equals(currentBuild.result)) {
        echo "LAST BUILD WAS UNSTABLE"
    }
}
But I want that, after the "Run tests" stage executes, if some tests fail, the pipeline re-runs with the RERUN parameter set to true instead of false. How can I replay via script instead of using plugins?
I wasn't able to find how to re-run with parameters in my search; if someone could help me I would be grateful.
First of all, you can use the post section to determine whether the job was unstable:
post {
    unstable {
        echo "..."
    }
}
Then you could just trigger the same job with the new parameter like this:
build job: 'your-project-name', parameters: [[$class: 'BooleanParameterValue', name: 'RERUN', value: Boolean.valueOf("true")]]
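The $class form still works, but recent Pipeline versions also accept the shorter typed syntax. A sketch combining both ideas (wait: false is optional and stops the current run from blocking on its own re-run; env.JOB_NAME avoids hard-coding the job name):

post {
    unstable {
        // Re-trigger this same job asynchronously with RERUN=true
        build job: env.JOB_NAME,
              wait: false,
              parameters: [booleanParam(name: 'RERUN', value: true)]
    }
}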

Jenkins Pipeline passing parameter as shell script argument

I have a parameterized Jenkins Pipeline with a default value, and I'm trying to pass that param as a script argument, but it doesn't seem to pass anything. Here is the script:
pipeline {
    agent any
    stages {
        stage('Building') {
            steps {
                build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
            }
        }
        stage('Doing stuff') {
            steps {
                sh "~/scripts/myScript ${configuration}"
            }
        }
    }
}
It seems to work for the build step but not for the script; it returns an error saying I have no argument.
I tried to get it with ${configuration}, ${params.configuration} and $configuration.
What is the right way to access a param and pass it correctly to a script?
Thanks.
Actually, you are using the build step to pass a parameter to the Jenkins job 'myProject'.
build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
If you want to declare a parameter in this job, you need to declare it in a parameters block:
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: '', name: 'configuration')
    }
    stages {
        stage('Doing stuff') {
            steps {
                sh "~/scripts/myScript ${configuration}"
            }
        }
    }
}
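Two details worth noting here. In the question's build step, value: '${configuration}' is single-quoted, so Groovy never interpolates it and the literal text ${configuration} is passed downstream. And inside sh steps, the unambiguous way to read a declared parameter is params.configuration in a double-quoted string:

steps {
    // Double quotes enable Groovy interpolation; 'params' is the
    // namespace for parameters declared in the parameters block.
    sh "~/scripts/myScript ${params.configuration}"
}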

Jenkins DSL Pipeline: delete a job from its pipeline

I have a Jenkins pipeline job that (among other things) creates another pipelineJob (to clean everything up afterwards) using the Job DSL plugin.
pipeline {
    agent { label 'Deployment' }
    stages {
        stage('Clean working directory and Checkout') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        // Complex logic omitted
        stage('Generate cleanup job') {
            steps {
                build job: 'cleanup-job-template',
                    parameters: [
                        string(name: 'REGION', value: "${REGION}"),
                        string(name: 'DEPLOYMENT_TYPE', value: "${DEPLOYMENT_TYPE}")
                    ]
            }
        }
    }
}
The thing is that I need this newly generated job to be built only once and then, if the build was successful, the job should be deleted.
pipeline {
    stages {
        stage('Cleanup afterwards') {
            // cleanup logic
        }
    }
    post {
        success {
            // delete this job?
        }
    }
}
I thought that this could be done using a Pipeline Post Action but, unfortunately, I couldn't find any out-of-the-box solution for it.
Is it possible to achieve this at all?
You can achieve this using the post section, plus some Groovy code to delete the job:
#!/usr/bin/env groovy
import hudson.model.*

pipeline {
    agent none
    stages {
        stage('Cleanup afterwards') {
            // cleanup logic
            steps {
                node('worker') {
                    sh 'ls -la'
                }
            }
        }
    }
    post {
        success {
            script {
                jobsToDelete = ["<JOB_TO_DELETE>"]
                deleteJob(Hudson.instance.items, jobsToDelete)
            }
        }
    }
}
def deleteJob(items, jobsToDelete) {
    items.each { item ->
        if (item.class.canonicalName != 'com.cloudbees.hudson.plugins.folder.Folder') {
            if (jobsToDelete.contains(item.fullName)) {
                manager.listener.logger.println(item.fullName)
                item.delete()
            }
        }
    }
}
Tested both cases; they work on Jenkins 2.89.4.
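If the full name of the generated job is known up front, a shorter variant can skip the iteration. A sketch (it assumes the Jenkins model APIs are script-approved on your instance; the job name is hypothetical):

post {
    success {
        script {
            // Look the job up by its full name and delete it directly
            def job = jenkins.model.Jenkins.instance.getItemByFullName('cleanup-generated-job')
            if (job != null) {
                job.delete()
            }
        }
    }
}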
You should do all of that in one job instead of creating and deleting jobs. Use multiple stages for that, e.g. deploy the test system, run the tests / wait for them to finish, undeploy. There is no need for extra jobs; a minimal sketch of that shape follows. Example posted here: Can a Jenkins pipeline have an optional input step?
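A sketch of that single-job shape (the stage contents and script names are hypothetical), with the undeploy step in post { always } so it runs however the tests end:

pipeline {
    agent any
    stages {
        stage('Deploy test system') {
            steps { sh './deploy.sh' } // hypothetical deploy script
        }
        stage('Run tests') {
            steps { sh './run-tests.sh' } // hypothetical test script
        }
    }
    post {
        always {
            sh './undeploy.sh' // cleanup runs regardless of test results
        }
    }
}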

How to force Jenkins to reload a Jenkinsfile?

My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, I do not see the job's input screen updated accordingly in Jenkins; I have to start a run, cancel it, and only then see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*'),
        string(name: 'b', description: '*'),
        string(name: 'c', description: '*'),
    ])
])
Any clues?
One of the ugliest things I've done to get around this is create a Refresh parameter which basically exits the pipeline right away. This way I can run the pipeline just to update the properties.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately the answer from TomDotTom was not working for me - I had the same issue, and my Jenkins required another stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax documentation).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
Applied to Jenkins 2.233.
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently it is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.
I overcome this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipeline.
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and regenerates the pipeline's configuration, including its params. After the pipeline is created/updated, Job DSL queues it with the special RELOAD param set.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline without an error at the end of a stage, which causes a "red" pipeline.)
As the parameters in the Jenkinsfile are declared in properties, they will override anything set by the seed job, such as RELOAD. At this stage the pipeline is ready with its actual params, without any sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])

pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters, as they are not available on the first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed job are updated and run to pick up the updated params list. Such a run is labelled "Parameter Initialization".
There is potentially a way to improve this and only update the affected pipelines, but I haven't explored that, as all my pipelines are in one repository and I'm happy with always updating them.
Another upgrade could be that, if someone doesn't like the "abort" with "error", you could add a when condition to every other stage to skip it if the RELOAD parameter is set, but I find adding when to every other stage cumbersome.
I initially tried TomDotTom's answer, but then I didn't like the manual effort.
Scripted pipeline workaround - you can probably make it work in declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that polling SCM must be enabled on the job to detect the Jenkinsfile changes automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
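As hinted above, the same idea can probably be wrapped in declarative form. A rough, untested sketch that reuses the checkJenkinsfileChanges() helper defined above:

pipeline {
    agent any
    stages {
        stage('Check Jenkinsfile changes') {
            steps {
                script {
                    if (checkJenkinsfileChanges()) {
                        // Mark the run aborted instead of failed when only
                        // the Jenkinsfile changed
                        currentBuild.result = 'ABORTED'
                        error('Only the Jenkinsfile changed; properties reloaded, stopping.')
                    }
                }
            }
        }
        stage('Build') {
            steps {
                echo 'build' // build stuff
            }
        }
    }
}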
I solve this by using the Jenkins Job Builder Python package. The main goal of this package is to achieve Jenkins Job as Code.
To solve your problem, I keep a definition like the one below in SCM, together with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job for me, so that whenever I trigger my job all the needed parameters are ready.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
          stage('info') {
              print params
          }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load / update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note - to use the jenkins-jobs command, make sure you install the jenkins-job-builder Python package.
This package has a lot of features, like creating (free-style, pipeline, multibranch), updating, deleting and validating Jenkins job configurations. It supports templates - meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, etc.
