I would like to run some builds in parallel based on one core job (a downstream job). In the following code there is no option to change the build name on the fly; for example, instead of the default numbers I want to get ABCDE_${number}. Is that possible?
I tried the Version Number plugin, but I cannot set the build number.
stage ("Run Tests") {
steps {
dir("UIAutomationV2.0") {
script {
tasks = [:]
products_set = [].toSet()
features_list.each {
def featureName = it.key
tasks[featureName] = {
withEnv(["FEATURE_NAME=${featureName}"]) {
def valArr = it.value.split(",")
def productName = valArr[0]
def productPath = valArr[1]
def runnerFeaturePath = productPath.replace("UIAutomationV2.0/", '')
metaData["tests"][it.key]['phases']['Run Tests']["startTime"] = getEpochTime();
println "Run test startTime : " + metaData["tests"][it.key]['phases']['Run Tests']["startTime"]
println "Calling build for feature '${featureName}' in job '${productName}' under path ='${productPath}' "
pJob = build job: "v2_Core_Task_Runner_Slack", propagate: false, parameters: [
[$class: 'StringParameterValue', name: 'FeatureName', value: "${featureName}"],
[$class: 'StringParameterValue', name: 'FeaturePath', value: "${runnerFeaturePath}"],
[$class: 'StringParameterValue', name: 'TagName', value: "${params.TagName}"],
[$class: 'StringParameterValue', name: 'Environment', value: "${params.Environment}"],
[$class: 'BooleanParameterValue', name: 'CreateTenant', value: params.CreateTenant],
[$class: 'StringParameterValue', name: 'TagNameCondition', value: "${params.TagNameCondition}"],
[$class: 'StringParameterValue', name: 'TenantTemplate', value: "${params.TenantTemplate}"],
[$class: 'StringParameterValue', name: 'ClientLabel', value: "AGENTS_LABEL_${JOB_NAME}_${BUILD_NUMBER}"]
]
metaData["tests"][it.key]['phases']['Run Tests']["endTime"] = getEpochTime();
metaData["tests"][it.key]['consoleUrl'] = pJob.getAbsoluteUrl();
metaData["tests"][it.key]['result'] = pJob.getResult();
//pJob.nextBuildNumber = pJob.nextBuildNumber() + 1
println "Job result = " + pJob.getResult() + ", Url: " + pJob.getAbsoluteUrl()
// def nextBldNo = VersionNumber(versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMdd"}-feature-${featureName}-${BUILDS_TODAY}')
// nextBldNo = '${nextBldNo}' + pJob.nextBuildNumber
// println "next build :: " + '${nextBldNo}'
// println "next build num >> " + VersionNumber(versionNumberString: '${BUILD_DATE_FORMATTED, "yyyyMMdd"}-feature-${featureName}-${BUILDS_TODAY}')
println "Copy artificats to 'allreports/${productName}/${featureName}'"
copyArtifacts(
projectName: 'v2_Core_Task_Runner_Slack',
filter: '**/report.json',
fingerprintArtifacts: true,
target: "allreports/${productName}/${featureName}",
flatten: true,
selector: specific(pJob.getId()),
optional: true
)
println "Run test endtime : " + metaData["tests"][it.key]['phases']['Run Tests']["endTime"]
parallel tasks
metaData["endTime"] = getEpochTime()
metDataStr = new JsonBuilder(metaData).toPrettyString()
//killPhaseCondition("NEVER")
}
}
}
}
def test = [:]

pipeline {
    agent any
    stages {
        stage ('create map') {
            steps {
                script {
                    for (int i = 0; i < (yourCounter as Integer); i++) {
                        def name = i
                        test.put(name, testFunc(name))
                    }
                }
            }
        }
        stage ('run parallel') {
            steps {
                script {
                    parallel test
                }
            }
        }
    }
}

def testFunc(name) {
    return {
        stage ("${name}") {
            node ("${name}") {
                script {
                    echo "${name}"
                }
            }
        }
    }
}
In this way you can pass any value to testFunc() and control its parameters.
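If the goal is a display name like ABCDE_${number}, one option (a minimal sketch, assuming you can edit the downstream job's pipeline) is to set currentBuild.displayName at the top of the downstream job, using its build number and the parameters passed to it:

// first lines of the downstream job (here assumed to be v2_Core_Task_Runner_Slack)
// FeatureName is the string parameter passed from the upstream job
currentBuild.displayName = "ABCDE_${env.BUILD_NUMBER}"
currentBuild.description = "Feature: ${params.FeatureName}"

The upstream job can still read the real number of the triggered run from the handle returned by the build step, e.g. pJob.getNumber().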
Related
I am running a Jenkins job where I call another Jenkins job to build Azure environments.
I create a map ([:]) and store 3 jobs inside.
When I call the 'parallel' keyword on the map, the 3 jobs should run in parallel. This has worked in all my past Jenkinsfiles, but when I run it here, it only runs one or two of the three jobs.
node(label: 'master')
{
    def branches = [:]
    stage ('Parallel Builds')
    {
        for (int i = 0; i < 3; i++)
        {
            branches["branch${i}"] = prepare(i)
        }
        echo "branches: ${branches}"
        parallel branches
    }
}
def prepare(def num)
{
    return {
        build job: 'Azure/Environment-General/Environment - Create', parameters: [
            [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snap-win10-19.6.9-boh-cfc-qs'],
            [$class: 'StringParameterValue', name:'Terminal1SnapshotName', value: 'none'],
            [$class: 'StringParameterValue', name:'Terminal2SnapshotName', value: 'none'],
            [$class: 'StringParameterValue', name:'EnvironmentPrefix', value: 'jl250638-'+num]
        ]
    }
}
Jenkins console - skipping job when running in parallel
I am expecting all parallel jobs to run together but it keeps skipping one or two.
EDIT: I have also implemented a retry(3) for the failed branches, but the jobs that fail just hang infinitely... below is the retry code as well as a picture of the Jenkins console.
Jenkins console - jobs that fail hang
node(label: 'master')
{
    def branches = [:]
    branches.failFast = false
    stage ('Parallel Builds')
    {
        for (int i = 0; i < 3; i++)
        {
            branches["branch${i}"] = prepare(i)
        }
        echo "branches: ${branches}"
    }
    try
    {
        parallel branches
    }
    catch(Exception e)
    {
        e.getCauses().each
        {
            echo "${it.getShortDescription()}"
        }
    }
}
def prepare(def num)
{
    return {
        try
        {
            build job: 'Azure/Environment-General/Environment - Create', parameters: [
                [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snapPOSQSWithEDC1.2'],
                [$class: 'StringParameterValue', name:'Terminal1SnapshotName', value: 'none'],
                [$class: 'StringParameterValue', name:'Terminal2SnapshotName', value: 'none'],
                [$class: 'StringParameterValue', name:'EnvironmentPrefix', value: 'jl250638-0-PARALLEL-'+num],
                [$class: 'StringParameterValue', name:'ResourceGroupName', value: 'rg-aloha-pos-automation']
            ]
        }
        catch(error)
        {
            echo "First build failed, let's retry if accepted"
            retry(3)
            {
                input "***Retry the job***"
                build job: 'Azure/Environment-General/Environment - Create', parameters: [
                    [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snapPOSQSWithEDC1.2'],
                    [$class: 'StringParameterValue', name:'Terminal1SnapshotName', value: 'none'],
                    [$class: 'StringParameterValue', name:'Terminal2SnapshotName', value: 'none'],
                    [$class: 'StringParameterValue', name:'EnvironmentPrefix', value: 'jl250638-PARALLEL-'+num],
                    [$class: 'StringParameterValue', name:'ResourceGroupName', value: 'rg-aloha-pos-automation']
                ]
            }
        }
    }
}
Try adding propagate: false to the build command.
build job: 'Azure/Environment-General/Environment - Create', propagate: false, parameters: [
    [$class: 'StringParameterValue', name: 'BOHSnapshotName', value: 'snap-win10-19.6.9-boh-cfc-qs'],
    [$class: 'StringParameterValue', name:'Terminal1SnapshotName', value: 'none'],
    [$class: 'StringParameterValue', name:'Terminal2SnapshotName', value: 'none'],
    [$class: 'StringParameterValue', name:'EnvironmentPrefix', value: 'jl250638-'+num]
]
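With propagate: false a failed downstream build no longer fails its branch (and with it the whole parallel step); if you still want to react to failures, you can inspect the RunWrapper returned by the build step. A minimal sketch:

def run = build job: 'Azure/Environment-General/Environment - Create', propagate: false, parameters: [
    [$class: 'StringParameterValue', name:'EnvironmentPrefix', value: 'jl250638-'+num]
    // ... remaining parameters as above ...
]
if (run.getResult() != 'SUCCESS') {
    echo "branch${num} finished with ${run.getResult()}"
}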
Found an embarrassingly simple solution: the job I was calling had the 'Do not allow concurrent builds' option checked. I unchecked it and parallel works fine.
I'll keep this up just in case anyone else makes the same mistake.
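For a downstream job that is itself a declarative pipeline, this setting corresponds to the disableConcurrentBuilds() option; removing it (or never adding it) lets several builds of that job run at once. A sketch of what to look for:

pipeline {
    agent any
    options {
        // disableConcurrentBuilds()   // if present, parallel upstream triggers will queue instead of running concurrently
    }
    stages {
        stage('Create environment') {
            steps {
                echo 'building...'
            }
        }
    }
}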
I have a simple pipeline where I want to receive multiple choices from a file as parameters.
In my file I have
#Test1,Accounts
#Test2,Services
#Test3,Accesses
and I want to have all of "#Test1", "#Test2" and "#Test3" as checkbox parameters so I can run only the selected tests.
But I don't understand what I'm doing wrong.
Pipeline
def code = """tests = getChoices()
return tests
def getChoices() {
def filecontent = readFile "/var/jenkins_home/test.txt"
def stringList = []
for (line in filecontent.readLines()) {stringList.add(line.split(",")[0].toString())}
List modifiedList = stringList.collect{'"' + it + '"'}
return modifiedList
}""".stripIndent()
properties([
    parameters([
        [$class : 'CascadeChoiceParameter',
            choiceType : 'PT_CHECKBOX',
            description : 'Select a choice',
            filterLength : 1,
            filterable : false,
            name : 'choice1',
            referencedParameters: 'role',
            script : [$class : 'GroovyScript',
                fallbackScript: [
                    classpath: [],
                    sandbox : true,
                    script : 'return ["ERROR"]'
                ],
                script : [
                    classpath: [],
                    sandbox : true,
                    script : code
                ]
            ]
        ]
    ])
])
pipeline {
    agent {
        docker { image 'node:latest' }
    }
    stages {
        stage('Tags') {
            steps {
                getChoices()
            }
        }
    }
}

def getChoices() {
    def filecontent = readFile "/var/jenkins_home/test.txt"
    def stringList = []
    for (line in filecontent.readLines()) {
        stringList.add(line.split(',')[0].toString())
    }
    List modifiedList = stringList.collect { '"' + it + '"' }
    echo "$modifiedList"
    return modifiedList
}
With this approach I know I can use multi-select checkboxes, because when I substitute
def code = """ tests = ["Test1", "Test2", "Test3"]
return tests""".stripIndent()
I get the output that I wanted.
But when I run my pipeline the build succeeds, yet I always get the fallbackScript result in my Build parameters checkbox. Can anyone help me understand what causes the fallbackScript to always run? Thanks :)
If you want to auto-populate build parameters, you have to return a list of values from your function. When you execute the pipeline, the 'Build with Parameters' page will be populated. Note that the new parameters only become available from the second execution of the pipeline onwards. Refer to the following.
pipeline {
    agent any
    parameters {
        choice(name: 'TESTES', choices: tests(), description: 'example')
    }
    stages {
        stage('Hello') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

def tests() {
    return ["Test01", "Test2", "Test4"]
}
If you want to get user input each time you execute a build you should move your choice parameter into a stage. Please refer to the following.
pipeline {
    agent any
    stages {
        stage('Get Parameters') {
            steps {
                script {
                    def choice = input message: 'Please select', ok: 'Next',
                        parameters: [choice(name: 'PRODUCT', choices: tests(), description: 'Please select the test')]
                    echo "$choice"
                }
            }
        }
    }
}

def tests() {
    return ["Test01", "Test2", "Test4"]
}
Update 02
Following is how to read from a file and dynamically create the choice list.
pipeline {
    agent any
    stages {
        stage('Get Parameters') {
            steps {
                script {
                    sh '''
                        echo "#Test1,Accounts" >> test.txt
                        echo "#Test2,Services" >> test.txt
                    '''
                    def choice = input message: 'Please select', ok: 'Next',
                        parameters: [choice(name: 'PRODUCT', choices: getChoices(), description: 'Please select the test')]
                }
            }
        }
    }
}

def getChoices() {
    def filecontent = readFile "test.txt"
    def choices = []
    for (line in filecontent.readLines()) {
        echo "$line"
        choices.add(line.split(',')[0].split('#')[1])
    }
    return choices
}
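The value returned by the input step is the selected string, so it can be used directly afterwards, for example to trigger a matching downstream job (a sketch; 'Run-Selected-Test' is a hypothetical job name):

// continuing inside the same script block, after the input step
echo "Selected test: ${choice}"
build job: 'Run-Selected-Test', parameters: [
    string(name: 'TEST_NAME', value: choice)
]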
I have a pipeline with a Generic Webhook trigger from Bitbucket; this job triggers another job.
currentBuild.displayName = "Generic-Job-#" + env.BUILD_NUMBER

pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'actorName', value: '$.actor.display_name'],
                [key: 'TAG', value: '$.push.changes[0].new.name'],
                [key: 'REPONAME', value: '$.repository.name'],
                [key: 'GIT_URL', value: '$.repository.links.html.href'],
            ],
            token: '11296ae8d97b2134550f',
            causeString: ' Triggered on $actorName version $TAG',
            printContributedVariables: true,
            printPostContent: true
        )
    }
    stages {
        stage('Build Job DEVELOPMENT') {
            when {
                expression { return params.TARGET_ENV == 'DEVELOPMENT' }
            }
            steps {
                build job: 'DEVELOPMENT',
                    parameters: [
                        [$class: 'StringParameterValue', name: 'FROM_BUILD', value: "${BUILD_NUMBER}"],
                        [$class: 'StringParameterValue', name: 'TAG', value: "${TAG}"],
                        [$class: 'StringParameterValue', name: 'GITURL', value: "${GIT_URL}"],
                        [$class: 'StringParameterValue', name: 'REPONAME', value: "${REPONAME}"],
                        [$class: 'StringParameterValue', name: 'REGISTRY_URL', value: "${REGISTRY_URL}"],
                    ]
            }
        }
    }
}
Another Pipeline
pipeline {
    agent any
    stages {
        stage('Cleaning') {
            steps {
                cleanWs()
            }
        }
        def jenkinsFile
        stage('Loading Jenkins file') {
            jenkinsFile = fileLoader.fromGit('Jenkinsfile', "${GIT_URL}", "${TAG}", null, '')
        }
        jenkinsFile.start()
    }
}
Can I run a Jenkinsfile from within a Pipeline? Every project I make has a different Jenkinsfile, so it can't be the same, but when I run this it doesn't execute the Jenkinsfile.
It works for me :D
Sample Pipeline
stage 'Load a file from GitHub'
def jenkinsFile = fileLoader.fromGit('<path-jenkinsfile>', "<path-git>", "<branch>", '<credentials>', '')
stage 'Run method from the loaded file'
jenkinsFile
pipeline {
agent any
stages {
stage('Print Hello World Ke #1') {
steps {
echo "Hello Juan"
}
}
}
}
Before running the pipeline, you must install the "Pipeline Remote Loader" plugin.
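An alternative that needs no extra plugin (a sketch, assuming the target repository exposes its logic in a Groovy file ending with return this, here hypothetically named pipeline.groovy) is the built-in load step:

node {
    // GIT_URL and TAG are assumed to come from the generic webhook variables above
    git url: "${GIT_URL}", branch: "${TAG}"
    def lib = load 'pipeline.groovy'   // the loaded file must end with 'return this'
    lib.start()                        // call a method defined in that file
}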
I have a pipeline script that needs to trigger a "TEST" job.
The main parameter (a string) is SETUP_DESCRIPTION, which I parse from a JSON file I'm creating.
Each server can have a different number of outputs depending on server resources (some have 2 setups and some 3).
The code looks like this:
#!/usr/bin/env groovy
import hudson.model.Result
import hudson.model.Run
import groovy.json.JsonSlurperClassic
import jenkins.model.CauseOfInterruption.UserInterruption
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException

def projectProperties = [
    buildDiscarder(
        logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '14', numToKeepStr: '')
    ),
    parameters([
        string(defaultValue: '', description: '', name: 'SERVER_NAME'),
        string(defaultValue: 'Ubuntu_17.10_x86_64_kvm', description: '', name: 'KVM_TEMPLATE'),
        string(defaultValue: 'test@test.com', description: 'mailing list', name: 'SW_MAIL'),
        choice(choices: ['no', 'eth', 'ib'], description: '', name: 'SIMX_SERVER'),
        choice(choices: ['cib', 'cx3pro', 'cx4', 'cx4lx', 'cx5', 'cx6'], description: '', name: 'SIMX_BOARD'),
        choice(choices: ['os_install', 'provision', 'add_jks_slave', 'add_to_noga', 'tests'], description: '', name: 'RUN_STAGE')
    ]),
    [$class: 'RebuildSettings', autoRebuild: false, rebuildDisabled: false],
    [$class: 'ThrottleJobProperty',
        categories: [],
        limitOneJobWithMatchingParams: true,
        maxConcurrentPerNode: 5,
        maxConcurrentTotal: 5,
        paramsToUseForLimit: '',
        throttleEnabled: true,
        throttleOption: 'project'
    ],
]

properties(projectProperties)

def build_sanity (SETUP_DESCRIPTION) {
    IMAGE = "linux/upstream_devel-x86_64"
    CLOUD_IP = "dev-l-vrt-storage"
    TAGS = "test_new_setup"
    if ("$SETUP_DESCRIPTION" != "b2b x86-64 cib cloud_test") {
        DATA_BASE = "b2b_eth_drivertest_mini_reg_db.json"
        LINK_LAYER = "eth"
    }
    else {
        DATA_BASE = "b2b_ib_drivertest_mini_reg_db.json"
        LINK_LAYER = "ib"
    }
    build job: 'SANITY_TESTS/new_cloud_setup_GENERAL_SANITY_CHECK2', propagate: false
    parameters:
        [string(name: 'SETUP_DESCRIPTION', value: "${SETUP_DESCRIPTION}"),
        string(name: 'DATA_BASE', value: "${DATA_BASE}"),
        string(name: 'LINK_LAYER', value: "${LINK_LAYER}"),
        string(name: 'IMAGE', value: "${IMAGE}"),
        string(name: 'CLOUD_IP', value: "${CLOUD_IP}"),
        string(name: 'TAGS', value: "${TAGS}")]
}

try {
    ansiColor('xterm') {
        timestamps {
            node('cloud-slave1') {
                stage('Test Setups') {
                    if (params.RUN_STAGE == 'os_install' || params.RUN_STAGE == 'provision' || params.RUN_STAGE == 'add_jks_slave' || params.RUN_STAGE == 'add_to_noga' || params.RUN_STAGE == 'tests') {
                        def stepsForParrallel = [:]
                        def NOGA_DESCRIPTION_LIST = sh (
                            script: "curl -X GET 'https://noga.mellanox.com/app/server/php/rest_api/?api_cmd=get_resources&pattern=${params.SERVER_NAME}&resource_type=Setup&group_name=Yaron&sub_group=Cloud'",
                            returnStdout: true
                        ).trim()
                        @NonCPS
                        def JSON = new groovy.json.JsonSlurperClassic().parseText(NOGA_DESCRIPTION_LIST)
                        def DESCRIPTION_LIST = JSON.data.each{
                            SETUP_NAME = "${it.NAME}"
                            SETUP_DESCRIPTION = "${it.DESCRIPTION}"
                            println "${it.DESCRIPTION}" // PRINT ALL DESCRIPTIONS INSIDE DATA
                            stepsForParrallel["${it.NAME}"] = {
                                build_sanity(SETUP_DESCRIPTION)
                            }
                        }
                        parallel stepsForParrallel
                    }
                }
            }
        }
    }
} catch (exc) {
    def recipient = "${SW_MAIL}"
    def subject = "${env.JOB_NAME} (${env.BUILD_NUMBER}) Failed"
    def body = """
It appears that build ${env.BUILD_NUMBER} is failing, please check HW or network stability:
${env.BUILD_URL}
"""
    mail subject: subject,
        to: recipient,
        replyTo: recipient,
        from: 'cloud-host-provision@mellanox.com',
        body: body
    throw exc
}
1) When I run it like the code above, the build_sanity function is called and executed only once (instead of 3 times as expected).
2) When I take the build_sanity function content and run it inside the each loop in the tests stage, it runs 3 times as expected, but it does not pick different parameters as expected.
So I managed to figure it out.
1) I printed some parameters and saw that my function did not receive the variables correctly, so I changed the build job: part so that the parameters: list is passed as part of the same call, and that fixed that issue.
2) I also had an issue in the stage part: I had put the "run parallel" inside the each loop, which caused it to run several times. I moved it one } down and that fixed the loop issue.
Here is the function code + stage, in case anyone encounters this issue in the future.
def build_sanity (SETUP_DESCRIPTION) {
    if ("$SETUP_DESCRIPTION" != "b2b x86-64 cib cloud_test") {
        DATA_BASE = "b2b_eth_drivertest_mini_reg_db.json"
        LINK_LAYER = "eth"
    }
    else {
        DATA_BASE = "b2b_ib_drivertest_mini_reg_db.json"
        LINK_LAYER = "ib"
    }
    IMAGE = "linux/upstream_devel-x86_64"
    CLOUD_IP = "dev-l-vrt-storage"
    TAGS = "test_new_setup"
    build job: 'SANITY_TESTS/new_cloud_setup_GENERAL_SANITY_CHECK',
        parameters: [
            string(name: 'SETUP_DESCRIPTION', value: "${SETUP_DESCRIPTION}"),
            string(name: 'DATA_BASE', value: "${DATA_BASE}"),
            string(name: 'LINK_LAYER', value: "${LINK_LAYER}"),
            string(name: 'IMAGE', value: "${IMAGE}"),
            string(name: 'TAGS', value: "${TAGS}"),
        ]
}

stage('Test Setups') {
    if (params.RUN_STAGE == 'os_install' || params.RUN_STAGE == 'provision' || params.RUN_STAGE == 'add_jks_slave' || params.RUN_STAGE == 'add_to_noga' || params.RUN_STAGE == 'tests') {
        def stepsForParrallel = [:]
        def NOGA_DESCRIPTION_LIST = sh (
            script: "curl -X GET 'https://noga.mellanox.com/app/server/php/rest_api/?api_cmd=get_resources&pattern=${params.SERVER_NAME}&resource_type=Setup&group_name=Yaron&sub_group=Cloud'",
            returnStdout: true
        ).trim()
        @NonCPS
        def JSON = new groovy.json.JsonSlurperClassic().parseText(NOGA_DESCRIPTION_LIST)
        def DESCRIPTION_LIST = JSON.data.each{
            def SETUP_NAME = "${it.NAME}"
            stepsForParrallel["${it.NAME}"] = {
                build_sanity("${it.DESCRIPTION}")
            }
        }
        parallel stepsForParrallel
    }
}
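As a side note, @NonCPS is a method-level annotation and does not belong on a plain def statement; if the JsonSlurperClassic objects ever cause serialization errors, one option (a sketch) is to do the parsing in a separate annotated helper that returns plain lists:

@NonCPS
def parseSetups(String json) {
    // return simple [name, description] pairs instead of the parser's internal objects
    def parsed = new groovy.json.JsonSlurperClassic().parseText(json)
    return parsed.data.collect { [it.NAME, it.DESCRIPTION] }
}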
I'm trying to use the Version Number plugin to format a version number for our packages.
For some reason the version variable placeholders don't resolve, and when I echo the following I only get the build number, for instance: "...54"
def Version_Major = '1'
def Version_Minor = '0'
def Version_Patch = '0'

pipeline {
    environment {
        VERSION = VersionNumber([
            versionNumberString: '${Version_Major}.${Version_Minor}.${Version_Patch}.${BUILD_NUMBER}',
            worstResultForIncrement: 'SUCCESS'
        ]);
    }
    stage ('Restore packages') {
        steps {
            script {
                echo "${VERSION}"
            }
        }
    }
}
Edit: It does look like an issue with the plugin usage since this works:
properties([
    parameters([
        string(name: 'Version_Major', defaultValue: '1', description: 'Version Major'),
        string(name: 'Version_Minor', defaultValue: '0', description: 'Version Minor'),
        string(name: 'Version_Patch', defaultValue: '0', description: 'Version Patch')
    ])
])

pipeline {
    agent any
    environment {
        VERSION = "${params.Version_Major}.${params.Version_Minor}.${params.Version_Patch}.${BUILD_NUMBER}"
    }
    stages {
        stage ('Test') {
            steps {
                echo "${VERSION}"
            }
        }
    }
}
You must define the variables inside the pipeline.
Try this:
pipeline {
    agent any
    environment {
        Version_Major = '1'
        Version_Minor = '0'
        Version_Patch = '0'
        VERSION = VersionNumber([
            versionNumberString: '${Version_Major}.${Version_Minor}.${Version_Patch}.${BUILD_NUMBER}',
            worstResultForIncrement: 'SUCCESS'
        ]);
    }
    stages {
        stage ('Restore packages') {
            steps {
                script {
                    echo "${VERSION}"
                }
            }
        }
    }
}
If you need to use a parameter instead that's also possible via:
parameters {
    string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
}
usage:
"Hello ${params.PERSON}"