Rerun current Jenkins pipeline

I have Jenkins job A that triggers Jenkins job B.
I want to do some work (deploy to env1), and then I want Jenkins job B to rerun itself once certain conditions are met, deploy to env2, then rerun again and deploy to env3. My problem is that after I approve the continuation, nothing happens. I want the job to trigger itself automatically after approval.
I also want the parameters to be the same as in the first run, but I want to update some values in my environment.
This is what I've tried:
stage('Continue deploy?') {
    when {
        branch 'develop'
    }
    steps {
        input message: "Continue deploy to env2?"
        script {
            if (currentBuild.result) {
                if (env.ENVIRONMENT == 'env1') {
                    env.ENVIRONMENT = 'env2'
                } else {
                    input message: "Continue deploy to env3?"
                    env.ENVIRONMENT = 'env3'
                }
                currentBuild.restart()
            }
        }
    }
}

Do you want to trigger Job B three times, one after another, with manual approval?
The following pseudocode is for Job A:
def envList = ['env1', 'env2', 'env3']
for (int i = 0; i < envList.size(); i++) {
    // propagate: false so a failed Job B run does not abort this loop
    // before we can inspect its result
    def b = build job: 'JobB', wait: true, propagate: false,
            parameters: [[$class: 'StringParameterValue', name: 'envName', value: envList[i]]]
    // only ask for approval when there is a next environment to deploy to
    if (b.result == 'SUCCESS' && i < envList.size() - 1) {
        input message: "Continue deploy to next environment?"
    }
}
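For completeness, here is a minimal sketch of what the receiving side (Job B) could look like, assuming it declares the envName parameter that Job A passes in; deployTo is a hypothetical placeholder for the actual deployment step:
pipeline {
    agent any
    parameters {
        // envName is supplied by Job A on each triggered run
        string(name: 'envName', defaultValue: 'env1', description: 'Target environment')
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying to ${params.envName}"
                // deployTo(params.envName) // hypothetical deployment step
            }
        }
    }
}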

Re-run a pipeline using a script in Jenkins

I have a pipeline, detailed below:
pipeline {
    agent any
    parameters {
        booleanParam(name: 'RERUN', defaultValue: false, description: 'Run Failed Tests')
    }
    stages {
        stage('Run tests') {
            steps {
                runTest()
            }
        }
    }
    post {
        always {
            reRun()
        }
    }
}
def reRun() {
    if ("SUCCESS".equals(currentBuild.result)) {
        echo "LAST BUILD WAS SUCCESS"
    } else if ("UNSTABLE".equals(currentBuild.result)) {
        echo "LAST BUILD WAS UNSTABLE"
    }
}
but I want that, after the "Run tests" stage executes, if some tests failed, the pipeline re-runs itself with the parameter RERUN set to true instead of false. How can I replay via a script instead of using plugins?
I wasn't able to find out how to re-run with parameters in my search; if someone could help me, I would be grateful.
First of all, you can use a post condition to determine whether the job was unstable:
post {
    unstable {
        echo "..."
    }
}
Then you could just trigger the same job with the new parameter like this:
build job: 'your-project-name', parameters: [[$class: 'BooleanParameterValue', name: 'RERUN', value: true]]
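Putting the two together, a minimal sketch of a self-retriggering post block, assuming the job can reference its own name via env.JOB_NAME and guarding on params.RERUN so the retriggered run does not loop forever:
post {
    unstable {
        script {
            // only retrigger once: the RERUN run itself must not retrigger again
            if (!params.RERUN) {
                build job: env.JOB_NAME, wait: false,
                    parameters: [booleanParam(name: 'RERUN', value: true)]
            }
        }
    }
}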

Jenkins Pipeline - build same job multiple times in parallel

I am trying to create a pipeline in Jenkins which triggers the same job multiple times on different nodes (agents).
I have a "Create_Invoice" job in Jenkins, configured with "Execute concurrent builds if necessary".
If I click Build 10 times, it will run 10 times on different (available) agents/nodes.
Instead of me clicking 10 times, I want to create a parallel pipeline.
I created something like below - it triggers the job but only once.
What am I missing, or is it even possible to trigger the same test more than once at the same time from a pipeline?
Thank you in advance
node {
    def notifyBuild = { String buildStatus ->
        // build status of null means successful
        buildStatus = buildStatus ?: 'SUCCESSFUL'
        echo "Build status: ${buildStatus}" // report the status
    }
    // Default values
    def tasks = [:]
    try {
        tasks["Test-1"] = {
            stage("Test-1") {
                b = build(job: "Create_Invoice", propagate: false).result
            }
        }
        tasks["Test-2"] = {
            stage("Test-2") {
                b = build(job: "Create_Invoice", propagate: false).result
            }
        }
        parallel tasks
    } catch (e) {
        // If there was an exception thrown, the build failed
        currentBuild.result = "FAILED"
        throw e
    } finally {
        notifyBuild(currentBuild.result)
    }
}
I had the same problem and solved it by passing different parameters to the same job. You should add parameters to your build steps, although you obviously don't need them. For example, I added a string parameter.
tasks["Test-1"] = {
stage ("Test-1") {
b = build(job: "Create_Invoice", parameters: [string(name: "PARAM", value: "1")], propagate: false).result
}
}
tasks["Test-2"] = {
stage ("Test-2") {
b = build(job: "Create_Invoice", parameters: [string(name: "PARAM", value: "2")], propagate: false).result
}
}
As long as the same parameters (or no parameters) are passed to the same job, the job is only triggered once.
See also this Jenkins issue; it describes the same problem:
https://issues.jenkins.io/browse/JENKINS-55748
I think you have to switch to a Declarative pipeline instead of a Scripted pipeline.
Declarative pipelines have parallel stages support, which is your goal:
https://www.jenkins.io/blog/2017/09/25/declarative-1/
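For illustration, a minimal declarative sketch of parallel stages, reusing the distinct PARAM values from the answer above so the two builds are not deduplicated:
pipeline {
    agent any
    stages {
        stage('Run Create_Invoice twice') {
            parallel {
                stage('Test-1') {
                    steps {
                        build job: 'Create_Invoice', propagate: false,
                            parameters: [string(name: 'PARAM', value: '1')]
                    }
                }
                stage('Test-2') {
                    steps {
                        build job: 'Create_Invoice', propagate: false,
                            parameters: [string(name: 'PARAM', value: '2')]
                    }
                }
            }
        }
    }
}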
This example will grab the available agents from Jenkins, then iterate over them and run the pipeline on all the active agents.
With this approach you don't need to invoke this job from an upstream job many times to build on different agents. The job itself manages everything and runs all the defined stages on every online node.
jenkins.model.Jenkins.instance.computers.each { c ->
    if (c.node.toComputer().online) {
        node(c.node.labelString) {
            stage('steps-one') {
                echo "Hello from Steps One"
            }
            stage('stage-two') {
                echo "Hello from Steps Two"
            }
        }
    } else {
        println "SKIP ${c.node.labelString} because the status is: ${c.node.toComputer().online}"
    }
}
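Note that the loop above visits the nodes one after another. As a hedged variant (my addition, not part of the original answer), you could collect the per-node work into a map first and then run it concurrently with a single parallel step:
def branches = [:]
jenkins.model.Jenkins.instance.computers.each { c ->
    if (c.node.toComputer().online) {
        String label = c.node.labelString
        branches["run-on-${label}"] = {
            node(label) {
                stage("steps-one (${label})") {
                    echo "Hello from ${label}"
                }
            }
        }
    }
}
// run all node branches at the same time instead of sequentially
parallel branches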

killing/stopping Jenkins job with a very long list

I'm new to Jenkins so I hope my terms are correct:
I have a Jenkins job that triggers another job. This second job tests a very long list of items (maybe 2000) it gets from the trigger.
Because it's such a long list, I pass it to the second job in groups of 20.
Unfortunately, this list turned out to take an extremely long time, and I can't stop it.
No matter what I tried, stop/kill only stops the current group of 20 and proceeds to the next group.
Waiting for it to finish, or doing this manually for each group is not an option.
I guess the entire list was already passed to the second job, and it's loading the next group whenever the current one ends.
What I tried:
Clicking the "stop" button next to the build, on both the trigger job and the second job
Using the Purge Build Queue add-on
Using the following script in the script console:
def jobname = "Trigger Job"
def buildnum = 123
def job = Jenkins.instance.getItemByFullName(jobname)
for (build in job.builds) {
    if (buildnum == build.getNumber().toInteger()) {
        if (build.isBuilding()) {
            build.doStop();
            build.doKill();
        }
    }
}
Using the following script in the script console:
String job = 'Job name';
List<Integer> build_list = [];
def result = jenkins.model.Jenkins.instance.getItem(job).getBuilds().findAll {
    it.isBuilding() == true && (!build_list || build_list.contains(it.id.toInteger()))
}.each { it.doStop() }.collect { it.id };
println new groovy.json.JsonBuilder(result).toPrettyString();
This is the Groovy part of my code that splits the list into groups of 20. Maybe I should put the parallel part outside the sub-list loop?
Is there a better way to divide the list into sub-lists for future use?
stages {
    stage('Execute tests') {
        steps {
            script {
                // Limit number of items to run
                def shortList = IDs.take(IDs.size()) // For testing purposes, can be removed if not needed
                println(Arrays.toString(shortList))
                // divide the list of items into small, equal sub-lists
                def colList = shortList.toList().collate(20)
                for (subList in colList) {
                    testStepsForParallel = subList.collectEntries {
                        ["Testing on ${it}": {
                            catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
                                stage(it) {
                                    def buildWrapper = build job: "Job name",
                                        parameters: [
                                            string(name: 'param1', value: it.trim()),
                                            string(name: 'param2', value: "")
                                        ],
                                        propagate: false
                                    remoteBuildResult = buildWrapper.result
                                    println("Remote build results: ${remoteBuildResult}")
                                    if (remoteBuildResult == "FAILURE") {
                                        currentBuild.result = "FAILURE"
                                    }
                                    catchError(stageResult: 'UNSTABLE') {
                                        copyArtifacts projectName: "Job name", selector: specific("${buildWrapper.number}")
                                    }
                                }
                            }
                        }]
                    }
                    parallel testStepsForParallel
                }
            }
        }
    }
}
Thanks for your help!
I don't know what else to do to stop this run.
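One thing the scripts above do not cover is the build queue: if the trigger job has already queued further groups, stopping the running builds alone won't help. A hedged script-console sketch (assuming the second job is literally named "Job name") that first cancels queued items and then stops whatever is still running:
import jenkins.model.Jenkins

// cancel anything from the downstream job still waiting in the queue
def queue = Jenkins.instance.queue
queue.items.findAll { it.task.name == 'Job name' }.each { queue.cancel(it.task) }

// then stop the runs that are already executing
def job = Jenkins.instance.getItemByFullName('Job name')
job.builds.findAll { it.isBuilding() }.each { it.doStop() }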

How to trigger another job outside the pipeline

I have three jobs in a pipeline. Whenever any of them fails due to an internal account lock, it has to trigger a post-build action. In the post-build action I configured "Trigger when build fails". I wrote a Robot test to unlock the account, and a shell script to call this test.
I am calling this template in both jobs in the post-build action and building it on the same node. But what I found is that the post-build action is kept in a pending state while Jenkins triggers the downstream project. How do I make Jenkins run the post-build action when the current job fails?
How can I achieve that?
You can play with the seed job's propagate property.
Simple example:
Map jobResults = [:]
pipeline {
    agent any
    stages {
        stage('Build seedjob 1') {
            steps {
                script {
                    String seedJobName = 'testjob1'
                    def seedJob = build job: seedJobName, propagate: false
                    jobResults[seedJobName] = seedJob.result
                    echo "Result of ${seedJobName}: ${seedJob.result}"
                }
            }
        }
        stage('Build seedjob 2') {
            steps {
                script {
                    String seedJobName = 'testjob2'
                    def seedJob = build job: seedJobName, propagate: false
                    jobResults[seedJobName] = seedJob.result
                    echo "Result of ${seedJobName}: ${seedJob.result}"
                }
            }
        }
    }
    post {
        success {
            script {
                if (jobResults['testjob1'] == 'FAILURE') {
                    echo "Running another job"
                    build job: 'another-job1', propagate: true
                }
                if (jobResults['testjob2'] == 'FAILURE') {
                    echo "Running another job"
                    build job: 'another-job2', propagate: true
                }
            }
        }
    }
}

How to force jenkins to reload a jenkinsfile?

My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, the job's input screen is not updated accordingly in Jenkins. I have to start an execution, cancel it, and only then do I see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*'),
        string(name: 'b', description: '*'),
        string(name: 'c', description: '*'),
    ])
])
Any clues?
One of the ugliest things I've done to get around this is create a Refresh parameter which basically exits the pipeline right away. This way I can run the pipeline just to update the properties.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately the answer of TomDotTom was not working for me - I had the same issue, and my Jenkins required another stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
Applied to Jenkins 2.233.
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently it is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.
I overcame this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing the pipeline.
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and re-generates the pipeline's configuration, including params. After the pipeline is created/updated, Job DSL queues the pipeline with the special param RELOAD.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to abort a pipeline without an error at the end of the stage, which causes a "red" pipeline.)
As the parameters in the Jenkinsfile are set via properties, they override anything set by the seed job, such as RELOAD. At that point the pipeline is ready with the actual params, without any sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])
pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters as they are not available on the first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed are updated and run to pick up the updated params list. A "Parameter Initialization" message indicates such a run.
There is potentially a way to improve this and only update the affected pipelines, but I haven't explored it, as all my pipelines are in one repository and I'm happy always updating them.
Another upgrade could be that, if someone doesn't like the "abort" with "error", you could add a when condition to every other stage to skip it if the RELOAD parameter is set, but I find adding when to every other stage cumbersome.
I initially tried TomDotTom's answer, but I didn't like the manual effort.
A scripted-pipeline workaround - it can probably be made to work in declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that Poll SCM must be enabled on the job so that Jenkinsfile changes are detected automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
I solved this using the Jenkins Job Builder Python package. Its main goal is to achieve Jenkins Job as Code.
To solve your problem, you could simply keep something like the YAML below in SCM, together with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job, so that whenever you trigger your job all the needed parameters are ready.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
        stage('info') {
          print params
        }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load/update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note: to use the jenkins-jobs command, you need to install the jenkins-job-builder Python package.
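For reference, the package is typically installed from PyPI:
pip install jenkins-job-builder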
This package has a lot of features: creating free-style, pipeline, and multibranch jobs; updating, deleting, and validating Jenkins job configurations. It supports templates - meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, and so on.
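As a hedged illustration of the template feature (my example, with hypothetical names): Job Builder expands the {env} placeholder once per value listed in the project, and literal braces inside the dsl block have to be doubled:
- job-template:
    name: 'deploy-{env}'
    project-type: pipeline
    dsl: |
      node() {{
        stage('deploy') {{
          echo 'Deploying to {env}'
        }}
      }}

- project:
    name: deploy-jobs
    env:
      - dev
      - prod
    jobs:
      - 'deploy-{env}'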
