Stop parallel Jenkins jobs from being superseded/skipped?

I am running a Jenkins job where I call another Jenkins job to build Azure environments.
I create a map ([:]) and store 3 jobs inside.
When I call the 'parallel' step on the map, the 3 jobs should run in parallel. This has worked in all my past Jenkinsfiles, but when I run it here, only one or two of the three jobs run.
node(label: 'master')
{
    def branches = [:]
    stage ('Parallel Builds')
    {
        for (int i = 0; i < 3; i++)
        {
            branches["branch${i}"] = prepare(i)
        }
        echo "branches: ${branches}"
        parallel branches
    }
}
def prepare(def num)
{
    return {
        build job: 'Azure/Environment-General/Environment - Create', parameters: [
            [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snap-win10-19.6.9-boh-cfc-qs'],
            [$class: 'StringParameterValue', name: 'Terminal1SnapshotName', value: 'none'],
            [$class: 'StringParameterValue', name: 'Terminal2SnapshotName', value: 'none'],
            [$class: 'StringParameterValue', name: 'EnvironmentPrefix', value: 'jl250638-' + num]
        ]
    }
}
[Screenshot: Jenkins console - skipping job when running in parallel]
I expect all parallel jobs to run together, but it keeps skipping one or two.
EDIT: I have also implemented a retry(3) for the failed branches, but the jobs that fail just hang indefinitely. Below is the retry code, as well as a picture of the Jenkins console.
[Screenshot: Jenkins console - jobs that fail hang]
node(label: 'master')
{
    def branches = [:]
    branches.failFast = false
    stage ('Parallel Builds')
    {
        for (int i = 0; i < 3; i++)
        {
            branches["branch${i}"] = prepare(i)
        }
        echo "branches: ${branches}"
    }
    try
    {
        parallel branches
    }
    catch (Exception e)
    {
        e.getCauses().each
        {
            echo "${it.getShortDescription()}"
        }
    }
}
def prepare(def num)
{
    return {
        try
        {
            build job: 'Azure/Environment-General/Environment - Create', parameters: [
                [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snapPOSQSWithEDC1.2'],
                [$class: 'StringParameterValue', name: 'Terminal1SnapshotName', value: 'none'],
                [$class: 'StringParameterValue', name: 'Terminal2SnapshotName', value: 'none'],
                [$class: 'StringParameterValue', name: 'EnvironmentPrefix', value: 'jl250638-0-PARALLEL-' + num],
                [$class: 'StringParameterValue', name: 'ResourceGroupName', value: 'rg-aloha-pos-automation']
            ]
        }
        catch (error)
        {
            echo "First build failed, let's retry if accepted"
            retry(3)
            {
                input "***Retry the job***"
                build job: 'Azure/Environment-General/Environment - Create', parameters: [
                    [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snapPOSQSWithEDC1.2'],
                    [$class: 'StringParameterValue', name: 'Terminal1SnapshotName', value: 'none'],
                    [$class: 'StringParameterValue', name: 'Terminal2SnapshotName', value: 'none'],
                    [$class: 'StringParameterValue', name: 'EnvironmentPrefix', value: 'jl250638-PARALLEL-' + num],
                    [$class: 'StringParameterValue', name: 'ResourceGroupName', value: 'rg-aloha-pos-automation']
                ]
            }
        }
    }
}
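If the hang comes from the input step waiting forever for approval, one option (a sketch, not from the original post) is to wrap the prompt in a timeout so an unanswered prompt aborts that branch instead of blocking the build indefinitely:
// Sketch only: give the approval prompt a deadline so a branch that is never
// approved fails after 30 minutes instead of hanging forever.
retry(3)
{
    timeout(time: 30, unit: 'MINUTES')
    {
        input "***Retry the job***"
    }
    build job: 'Azure/Environment-General/Environment - Create', parameters: [
        // same parameters as in the catch block above
    ]
}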

Try adding propagate: false to the build command.
build job: 'Azure/Environment-General/Environment - Create', propagate: false, parameters: [
    [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snap-win10-19.6.9-boh-cfc-qs'],
    [$class: 'StringParameterValue', name: 'Terminal1SnapshotName', value: 'none'],
    [$class: 'StringParameterValue', name: 'Terminal2SnapshotName', value: 'none'],
    [$class: 'StringParameterValue', name: 'EnvironmentPrefix', value: 'jl250638-' + num]
]
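With propagate: false the downstream result no longer fails the branch automatically, so you would normally check it yourself. A minimal sketch (the variable name is illustrative):
// Sketch: build returns a RunWrapper; inspect its result instead of relying on propagation.
def downstream = build job: 'Azure/Environment-General/Environment - Create',
                       propagate: false,
                       parameters: [/* same parameters as above */]
if (downstream.getResult() != 'SUCCESS')
{
    echo "Downstream build ${downstream.getAbsoluteUrl()} finished with ${downstream.getResult()}"
}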

Found an embarrassingly simple solution: the job I was calling had the 'Do not allow concurrent builds' option checked. I unchecked it and parallel works fine.
Will keep this up just in case anyone else makes the same mistake.
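For reference, if the downstream job is itself a pipeline, the same option is controlled from its Jenkinsfile; a minimal sketch of where the setting lives (and what has to be absent for concurrent runs to work):
// Sketch: a downstream Jenkinsfile with this option set will queue builds instead
// of running them concurrently -- remove it to allow the parallel upstream calls.
pipeline {
    agent any
    options {
        disableConcurrentBuilds()
    }
    stages {
        stage('Create environment') {
            steps {
                echo 'provisioning...'
            }
        }
    }
}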

Related

Can we run a Jenkinsfile in a pipeline?

I have a pipeline with a Generic Webhook Trigger from Bitbucket; this job triggers another job.
currentBuild.displayName = "Generic-Job-#" + env.BUILD_NUMBER
pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'actorName', value: '$.actor.display_name'],
                [key: 'TAG', value: '$.push.changes[0].new.name'],
                [key: 'REPONAME', value: '$.repository.name'],
                [key: 'GIT_URL', value: '$.repository.links.html.href'],
            ],
            token: '11296ae8d97b2134550f',
            causeString: ' Triggered on $actorName version $TAG',
            printContributedVariables: true,
            printPostContent: true
        )
    }
    stages {
        stage('Build Job DEVELOPMENT') {
            when {
                expression { return params.TARGET_ENV == 'DEVELOPMENT' }
            }
            steps {
                build job: 'DEVELOPMENT',
                    parameters: [
                        [$class: 'StringParameterValue', name: 'FROM_BUILD', value: "${BUILD_NUMBER}"],
                        [$class: 'StringParameterValue', name: 'TAG', value: "${TAG}"],
                        [$class: 'StringParameterValue', name: 'GITURL', value: "${GIT_URL}"],
                        [$class: 'StringParameterValue', name: 'REPONAME', value: "${REPONAME}"],
                        [$class: 'StringParameterValue', name: 'REGISTRY_URL', value: "${REGISTRY_URL}"],
                    ]
            }
        }
    }
}
Another Pipeline
pipeline {
    agent any
    stages {
        stage('Cleaning') {
            steps {
                cleanWs()
            }
        }
        def jenkinsFile
        stage('Loading Jenkins file') {
            jenkinsFile = fileLoader.fromGit('Jenkinsfile', "${GIT_URL}", "${TAG}", null, '')
        }
        jenkinsFile.start()
    }
}
Can I run a Jenkinsfile in a pipeline? Every project I make has a different Jenkinsfile, so they can't all be the same, but when I run this it doesn't execute the Jenkinsfile.
it works for me :D
Sample Pipeline
stage 'Load a file from GitHub'
def jenkinsFile = fileLoader.fromGit('<path-jenkinsfile>', "<path-git>", "<branch>", '<credentials>', '')
stage 'Run method from the loaded file'
jenkinsFile
pipeline {
    agent any
    stages {
        stage('Print Hello World #1') {
            steps {
                echo "Hello Juan"
            }
        }
    }
}
Before running the pipeline, you must install the "Pipeline Remote Loader" plugin.
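A note on the 'Run method from the loaded file' part: the object returned by fileLoader.fromGit exposes whatever the loaded script defines, and the loaded script conventionally ends with return this. A minimal sketch under that assumption (the file name and method are made up):
// pipeline.groovy in the remote repository (illustrative): define methods and
// end with 'return this' so the caller gets an object it can invoke.
def deploy(String environment) {
    echo "Deploying to ${environment}"
}
return this
The calling pipeline would then do def remote = fileLoader.fromGit('pipeline.groovy', "${GIT_URL}", "${TAG}", null, '') and call remote.deploy('staging').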

Jenkins declarative pipeline: if-else statement inside parameters directive

I'm trying to display a choice parameter if I have options to choose from, and otherwise display a text input, something like this (which does not work):
pipeline {
    agent any
    parameters {
        if (someOptions) {
            choice(name: 'FIELD_NAME', choices: "$someOptions", description: 'Field description')
        } else {
            string(name: 'FIELD_NAME', defaultValue: '', description: 'Field description')
        }
    }
    environment {
        // environment params
    }
    stages {
        // stages
    }
}
Is there a way of doing this?
To expand on @Matt Schuchard's comment, here's what this might look like:
def my_param = []
if (someOptions) {
    my_param = [$class: 'ChoiceParameter',
                name: 'FIELD_NAME',
                choiceType: 'PT_SINGLE_SELECT',
                description: 'Choose the desired option',
                script:
                    [$class: 'GroovyScript',
                     fallbackScript:
                         [classpath: [], sandbox: false, script: 'return ""'],
                     script:
                         [classpath: [], sandbox: false, script: "return $someOptions"]
                    ]
               ]
} else {
    my_param = [$class: 'StringParameterDefinition',
                name: 'FIELD_NAME',
                defaultValue: false,
                description: '']
}
properties([
    parameters([my_param,
                // other parameters
    ])
])
Don't forget to approve the Groovy scripts in the script approval console.
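Whichever branch is taken, the parameter is registered under the same name once the job has run with this properties() call, so later stages can read it uniformly; a minimal sketch:
// Works for both the choice and the string variant, since they share the name FIELD_NAME.
stage('Use the parameter') {
    echo "FIELD_NAME = ${params.FIELD_NAME}"
}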

Pipeline DSL: Change build name on the fly

I would like to run some builds in parallel based on one core (downstream) job. In the following code there is no option to change the build name on the fly; for example, instead of numbers (the default) I want to get ABCDE_${number}. Is that possible?
I tried to use the Version Number plugin, but I cannot set the build number.
stage ("Run Tests") {
steps {
dir("UIAutomationV2.0") {
script {
tasks = [:]
products_set = [].toSet()
features_list.each {
def featureName = it.key
tasks[featureName] = {
withEnv(["FEATURE_NAME=${featureName}"]) {
def valArr = it.value.split(",")
def productName = valArr[0]
def productPath = valArr[1]
def runnerFeaturePath = productPath.replace("UIAutomationV2.0/", '')
metaData["tests"][it.key]['phases']['Run Tests']["startTime"] = getEpochTime();
println "Run test startTime : " + metaData["tests"][it.key]['phases']['Run Tests']["startTime"]
println "Calling build for feature '${featureName}' in job '${productName}' under path ='${productPath}' "
pJob = build job: "v2_Core_Task_Runner_Slack", propagate: false, parameters: [
[$class: 'StringParameterValue', name: 'FeatureName', value: "${featureName}"],
[$class: 'StringParameterValue', name: 'FeaturePath', value: "${runnerFeaturePath}"],
[$class: 'StringParameterValue', name: 'TagName', value: "${params.TagName}"],
[$class: 'StringParameterValue', name: 'Environment', value: "${params.Environment}"],
[$class: 'BooleanParameterValue', name: 'CreateTenant', value: params.CreateTenant],
[$class: 'StringParameterValue', name: 'TagNameCondition', value: "${params.TagNameCondition}"],
[$class: 'StringParameterValue', name: 'TenantTemplate', value: "${params.TenantTemplate}"],
[$class: 'StringParameterValue', name: 'ClientLabel', value: "AGENTS_LABEL_${JOB_NAME}_${BUILD_NUMBER}"]
]
metaData["tests"][it.key]['phases']['Run Tests']["endTime"] = getEpochTime();
metaData["tests"][it.key]['consoleUrl'] = pJob.getAbsoluteUrl();
metaData["tests"][it.key]['result'] = pJob.getResult();
//pJob.nextBuildNumber = pJob.nextBuildNumber() + 1
println "Job result = " + pJob.getResult() + ", Url: " + pJob.getAbsoluteUrl()
// def nextBldNo = VersionNumber(versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMdd"}-feature-${featureName}-${BUILDS_TODAY}')
// nextBldNo = '${nextBldNo}' + pJob.nextBuildNumber
// println "next build :: " + '${nextBldNo}'
// println "next build num >> " + VersionNumber(versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMdd"}-feature-${featureName}-${BUILDS_TODAY}')
println "Copy artificats to 'allreports/${productName}/${featureName}'"
copyArtifacts(
projectName: 'v2_Core_Task_Runner_Slack',
filter: '**/report.json',
fingerprintArtifacts: true,
target: "allreports/${productName}/${featureName}",
flatten: true,
selector: specific(pJob.getId()),
optional: true
)
println "Run test endtime : " + metaData["tests"][it.key]['phases']['Run Tests']["endTime"]
parallel tasks
metaData["endTime"] = getEpochTime()
metDataStr = new JsonBuilder(metaData).toPrettyString()
//killPhaseCondition("NEVER")
}
}
}
}
def test = [:]
pipeline {
    agent any   // a top-level agent section is required in declarative pipelines; per-branch nodes are allocated in testFunc()
    stages {
        stage ('create map') {
            steps {
                script {
                    for (int i = 0; i < (yourCounter as Integer); i++) {
                        def name = i
                        test.put(name, testFunc(name))
                    }
                }
            }
        }
        stage ('run parallel') {
            steps {
                script {
                    parallel test
                }
            }
        }
    }
}
def testFunc(name) {
    return {
        stage ("${name}") {
            node ("${name}") {
                script {
                    echo "${name}"
                }
            }
        }
    }
}
This way you can pass any value to testFunc() and control its parameters.
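On the original question of naming the builds: the display name can be set from inside the downstream job itself, using the same technique the webhook example above uses; a minimal sketch (the prefix is the one the question asks for):
// Sketch: put this at the top of the downstream job's pipeline to rename the build as it starts.
currentBuild.displayName = "ABCDE_${env.BUILD_NUMBER}"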

Execute only selected jobs using a Jenkins pipeline

I have a Jenkins pipeline with multiple stages. Inside these stages, multiple jobs are executed.
When I build the job I'd like to have a set of checkboxes, and the pipeline should build only what I've checked inside the pipeline stages. Are there any plugins or methods I can use to achieve this?
Sample pipeline code:
As per the example below, there are jobs called job_A1, job_B1, job_C1, job_D1, job_A2, job_B2, job_C2 and job_D2. If I click 'Build with Parameters', it should prompt me with checkboxes, and I should be able to check any jobs I want so that the pipeline builds only the ones I checked.
Thanks in advance.
pipeline {
    agent { label 'server01' }
    stages {
        stage('Build 01') {
            steps {
                parallel (
                    "BUILD A1" : {
                        build job: 'job_A1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B1" : {
                        build job: 'job_B1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C1" : {
                        build job: 'job_C1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D1" : {
                        build job: 'job_D1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                )
            }
        }
        stage('Build 02') {
            steps {
                parallel (
                    "BUILD A2" : {
                        build job: 'job_A2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B2" : {
                        build job: 'job_B2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C2" : {
                        build job: 'job_C2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D2" : {
                        build job: 'job_D2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                )
            }
        }
    }
}
Thanks @mbn217 for your answer, but the ExtendedChoice parameter didn't help much in my scenario.
Anyway, I could do it using boolean parameters and checking them inside the pipeline in a script block.
Example pipeline script
stage ('BUILD A') {
    steps {
        script {
            if (params.get('boolA', true)) {
                build job: '_build_A', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "A is not selected to build"
            }
        }
    }
}
stage ('BUILD B') {
    steps {
        script {
            if (params.get('boolB', true)) {
                build job: '_build_B', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "B is not selected to build"
            }
        }
    }
}
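For the checkboxes to appear on 'Build with Parameters', each job needs a matching boolean parameter; a minimal sketch of the parameters directive that pairs with the stages above (the defaults are assumptions):
// One checkbox per job; leaving a box unchecked makes the matching stage skip its build.
parameters {
    booleanParam(name: 'boolA', defaultValue: true, description: 'Build _build_A')
    booleanParam(name: 'boolB', defaultValue: true, description: 'Build _build_B')
}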
You can use the Extended Choice Parameter plugin to accomplish what you want. Basically you will need to parametrize the job names too using this Jenkins plugin.
You can use a list of checkboxes, as shown in the screenshot.

Aggregate downstream test results

I have a Jenkins workflow configuration that runs several test jobs in parallel. What I would like to do is aggregate all the test results and display them on the 'workflow' job page.
Here is my current configuration:
node('git && linux') {
    // do some stuff here
}
stage "Running unit tests for $REVISION"
build job: 'unit-tests', parameters: [[$class: 'StringParameterValue', name: 'REVISION', value: REVISION], [$class: 'StringParameterValue', name: 'REFSPEC', value: REFSPEC]]
stage "Running integration tests $REVISION"
def jobs = [:]
jobs['integration-tests-1'] = { build job: 'integration-tests-job-1', parameters: [[$class: 'StringParameterValue', name: 'REVISION', value: REVISION], [$class: 'StringParameterValue', name: 'REFSPEC', value: REFSPEC]] }
jobs['integration-tests-2'] = { build job: 'integration-tests-job-2', parameters: [[$class: 'StringParameterValue', name: 'REVISION', value: REVISION], [$class: 'StringParameterValue', name: 'REFSPEC', value: REFSPEC]] }
jobs['integration-tests-3'] = { build job: 'integration-tests-job-3', parameters: [[$class: 'StringParameterValue', name: 'REVISION', value: REVISION], [$class: 'StringParameterValue', name: 'REFSPEC', value: REFSPEC]] }
parallel jobs
// Here I would like to add something like
// aggregateDownstream('unit-tests', 'integration-tests-1', 'integration-tests-2', 'integration-tests-3')
Do you know if it would be possible to achieve this?
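As far as I know there is no built-in aggregateDownstream step, so this is only a sketch of one common workaround: have each downstream job archive its JUnit XML, copy those artifacts back into the workflow job, and publish them with the junit step so they appear on the workflow job page (the report paths are assumptions):
// Sketch: pull test reports from the downstream jobs and publish them on this job's page.
node('linux') {
    ['unit-tests', 'integration-tests-job-1', 'integration-tests-job-2', 'integration-tests-job-3'].each { jobName ->
        copyArtifacts(
            projectName: jobName,
            filter: '**/test-results/*.xml',   // assumed archive location in the downstream jobs
            target: "aggregated/${jobName}",
            selector: lastSuccessful(),
            optional: true
        )
    }
    junit 'aggregated/**/*.xml'
}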
