stage ("install_new_image") {
    when { expression { IMAGE_FILE_PATH_VAR } }
    steps {
        script {
            def IMAGE_FILE_VAR = IMAGE_FILE_PATH_VAR.split('/').last()
            def INSTALL_IMAGE_ARGS = "./job_files/image_upload_job.py --testbed-file ${TESTBED_FILE} --image_file ${IMAGE_FILE_VAR} --mail-to ${MAIL_TO_VAR}"
            echo "args for install_new_image= ${INSTALL_IMAGE_ARGS}"
        }
        build job: "/team_eng_ent_routing/Helper_Projects/PYATS_JOB_EXECUTOR", parameters: [
            string(name: "pyats_job_args", value: INSTALL_IMAGE_ARGS),
            string(name: "branch_name", value: BRANCH_NAME_VAR),
            string(name: "platform_name", value: PLATFORM_NAME_VAR)
        ]
    }
}
The above stage fails with the error below. What's the correct way to initialise and use variables?
hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: INSTALL_IMAGE_ARGS for class: WorkflowScript
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
You can either declare variables globally, before the pipeline block, and then initialise them at any point in the pipeline; these are accessible from anywhere in the pipeline. Or you can declare and initialise variables within a script block; these are accessible only within the scope of that script block.
In your case, the easiest fix is to move everything into the script block.
stage ("install_new_image") {
    when { expression { IMAGE_FILE_PATH_VAR } }
    steps {
        script {
            def IMAGE_FILE_VAR = IMAGE_FILE_PATH_VAR.split('/').last()
            def INSTALL_IMAGE_ARGS = "./job_files/image_upload_job.py --testbed-file ${TESTBED_FILE} --image_file ${IMAGE_FILE_VAR} --mail-to ${MAIL_TO_VAR}"
            echo "args for install_new_image= ${INSTALL_IMAGE_ARGS}"
            build job: "/team_eng_ent_routing/Helper_Projects/PYATS_JOB_EXECUTOR", parameters: [
                string(name: "pyats_job_args", value: INSTALL_IMAGE_ARGS),
                string(name: "branch_name", value: BRANCH_NAME_VAR),
                string(name: "platform_name", value: PLATFORM_NAME_VAR)
            ]
        }
    }
}
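For completeness, a minimal sketch of the first option: declare the variable before the pipeline block so it is visible from every stage. The variable names other than INSTALL_IMAGE_ARGS are carried over from the question.

```groovy
// Sketch of the "global variable" option: declared before the pipeline,
// initialised inside a script block, then readable from any later step.
def INSTALL_IMAGE_ARGS

pipeline {
    agent any
    stages {
        stage("install_new_image") {
            when { expression { IMAGE_FILE_PATH_VAR } }
            steps {
                script {
                    def IMAGE_FILE_VAR = IMAGE_FILE_PATH_VAR.split('/').last()
                    INSTALL_IMAGE_ARGS = "./job_files/image_upload_job.py --testbed-file ${TESTBED_FILE} --image_file ${IMAGE_FILE_VAR} --mail-to ${MAIL_TO_VAR}"
                }
                build job: "/team_eng_ent_routing/Helper_Projects/PYATS_JOB_EXECUTOR", parameters: [
                    string(name: "pyats_job_args", value: INSTALL_IMAGE_ARGS),
                    string(name: "branch_name", value: BRANCH_NAME_VAR),
                    string(name: "platform_name", value: PLATFORM_NAME_VAR)
                ]
            }
        }
    }
}
```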
def BUILD_USER = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
pipeline {
    agent { label "master" }
    parameters {
        string(name: 'BUILD', defaultValue: '123')
        booleanParam(name: 'Deploy', defaultValue: 'true')
        booleanParam(name: 'Upgrade_Config', defaultValue: 'true')
        booleanParam(name: 'SchemaComparison', defaultValue: 'false')
        booleanParam(name: 'Publish_Server', defaultValue: 'false')
    }
    stages {
        stage ('Start Deployment') {
            agent { label "master" }
            steps {
                script {
                    sh '''
                        rm -rf /params/parameters
                        cd /params
                        echo $BUILD_USER
                        python3 buildParameters.py --Build=$BUILD --Publish_VM=$Publish_Server --userName=BUILD_USER --Upgrade_Config=$Upgrade_Config
                    '''
                    file = readFile('/params/parameters.txt')
                }
            }
        }
        stage ('UpgradeConfigurations') {
            when {
                expression { params.Deploy == true }
            }
            agent { label "master" }
            environment {
                file = "${file}"
            }
            steps {
                script {
                    println("${file}")
                    build(job: 'UpgradeConfigurations', parameters: [
                        file(name: 'parameters', file: "${file}"),
                        string(name: 'build_uniqe_id', value: "${BUILD_USER}"),
                        booleanParam(name: 'Deploy', value: "${Deploy}"),
                        booleanParam(name: 'SchemaComparison', value: "${SchemaComparison}")
                    ], propagate: false, wait: false)
                }
            }
        }
    }
}
The buildParameters.py file generates some additional parameters in a parameters.txt file on the master VM, and I am trying to pass it to the downstream job UpgradeConfigurations.
The downstream job UpgradeConfigurations starts, but the file parameters are not passed to it as parameters.
I have tried using base64File as well, but no luck.
Referred Build Plugin doc:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
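The classic file parameter type is known not to propagate through the build step; the File Parameter plugin documents a base64File workaround for exactly this case. A sketch under the question's paths (this assumes the downstream job declares a "Base64 file" parameter named parameters):

```groovy
// Sketch: pass the generated parameters.txt to the downstream job as a
// base64File parameter. Requires the File Parameter plugin, and the
// downstream job must declare a Base64 file parameter named 'parameters'.
script {
    def content = readFile('/params/parameters.txt')
    build job: 'UpgradeConfigurations',
        parameters: [base64File(name: 'parameters',
                                base64: Base64.encoder.encodeToString(content.bytes))],
        propagate: false, wait: false
}
```

The downstream job can then recover the text with the plugin's withFileParameter step or by base64-decoding the parameter value.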
I have the following stage in the Groovy script of a Jenkins job:
stage('Remove servers') {
    when {
        expression { params.DO_REMOVE == true }
    }
    steps {
        script {
            parallel RemoveSource: {
                sh """set -x
                export KUBECONFIG=${source_config}
                kubectl get ns ${source_namespace} || exists="False"
                """
                echo "${exists}"
                if ("${exists}" != "False") {
                    build job: 'RemoveFCC',
                        parameters: [string(name: 'Branch', value: Branch),
                                     booleanParam(name: 'build_ansible', value: false),
                                     string(name: 'pipeline', value: 'yes')]
                } else {
                    echo "Server does not exist. skipped fcc run"
                }
            },
            RemoveTarget: {
                sh """set -x
                export KUBECONFIG=${target_config}
                kubectl get ns ${target_namespace} || exists="False"
                """
                echo "${exists}"
                if ("${exists}" != "False") {
                    build job: 'RemoveFCC',
                        parameters: [string(name: 'Branch', value: Branch),
                                     booleanParam(name: 'build_ansible', value: false),
                                     string(name: 'pipeline', value: 'yes')]
                } else {
                    echo "Server does not exist. skipped fcc run"
                }
            }
        }
    }
}
Even though echo "${exists}" prints False, the if condition still gets executed. I am not sure what I am missing here. I have tried things like using when instead of if.
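One likely cause, assuming nothing else in the script sets exists: a variable assigned inside the sh step lives only in that shell process and never propagates back to Groovy, so the if is not comparing what the shell saw. A sketch that instead captures the kubectl exit status directly in Groovy (shown for the RemoveSource branch; variable names are from the question):

```groovy
// Sketch: sh(returnStatus: true) hands the command's exit code back to
// Groovy, so the existence check becomes a real Groovy boolean instead of
// a shell variable that is lost when the shell exits.
def nsExists = sh(
    script: "KUBECONFIG=${source_config} kubectl get ns ${source_namespace}",
    returnStatus: true
) == 0

if (nsExists) {
    build job: 'RemoveFCC',
        parameters: [string(name: 'Branch', value: Branch),
                     booleanParam(name: 'build_ansible', value: false),
                     string(name: 'pipeline', value: 'yes')]
} else {
    echo "Server does not exist. skipped fcc run"
}
```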
I have this Jenkins pipeline which has multiple stages. Inside these stages, multiple jobs are executed.
When I build the job I'd like to have a set of checkboxes, and the pipeline should build only what I've checked inside the pipeline stages. Are there any plugins or methods I can use to achieve this?
Sample pipeline code.
As per the example below, there are jobs called job_A1, job_B1, job_C1, job_D1, job_A2, job_B2, job_C2 and job_D2. If I click "Build with parameters", it should prompt me with checkboxes and I should be able to check any job I want, so that the pipeline builds only the ones I checked.
Thanks in advance.
pipeline {
    agent { label 'server01' }
    stages {
        stage('Build 01') {
            steps {
                parallel (
                    "BUILD A1" : {
                        build job: 'job_A1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B1" : {
                        build job: 'job_B1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C1" : {
                        build job: 'job_C1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D1" : {
                        build job: 'job_D1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    }
                )
            }
        }
        stage('Build 02') {
            steps {
                parallel (
                    "BUILD A2" : {
                        build job: 'job_A2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B2" : {
                        build job: 'job_B2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C2" : {
                        build job: 'job_C2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D2" : {
                        build job: 'job_D2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    }
                )
            }
        }
    }
}
Thanks @mbn217 for your answer, but the ExtendedChoice parameter didn't help much in my scenario.
Anyway, I could do it using boolean parameters and checking them inside the pipeline in a script block.
Example pipeline script
stage ('BUILD A') {
    steps {
        script {
            if (params.get('boolA', true)) {
                build job: '_build_A', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "A is not selected to build"
            }
        }
    }
}
stage ('BUILD B') {
    steps {
        script {
            if (params.get('boolB', true)) {
                build job: '_build_B', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "B is not selected to build"
            }
        }
    }
}
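The stages above read params.boolA and params.boolB but the checkbox declarations are not shown; a minimal sketch of the matching parameters block (the names boolA/boolB come from the if conditions above, everything else is assumed):

```groovy
// Sketch: booleanParam entries render as checkboxes on the
// "Build with Parameters" page; the names must match the params lookups.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'boolA', defaultValue: true, description: 'Build A?')
        booleanParam(name: 'boolB', defaultValue: true, description: 'Build B?')
    }
    stages {
        stage('BUILD A') {
            when { expression { params.boolA } }
            steps {
                echo "A is selected to build"
            }
        }
    }
}
```

A when { expression { ... } } block, as sketched here, also skips the stage visually in the UI instead of just echoing inside it.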
You can use the ExtendedChoiceParameter to accomplish what you want. Basically, you will need to parameterize the job names too using this Jenkins plugin.
You can use a list of checkboxes as shown in the screenshot.
I have a pipeline script that needs to trigger a "TEST" job.
The main parameter (string) is SETUP_DESCRIPTION, which I parse from a JSON file I'm creating.
Each server can have a different number of setups depending on its resources (some have 2 setups and some 3).
The code looks like this:
#!/usr/bin/env groovy
import hudson.model.Result
import hudson.model.Run
import groovy.json.JsonSlurperClassic
import jenkins.model.CauseOfInterruption.UserInterruption
import org.jenkinsci.plugins.workflow.steps.FlowInterruptedException
def projectProperties = [
    buildDiscarder(
        logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '14', numToKeepStr: '')
    ),
    parameters([
        string(defaultValue: '', description: '', name: 'SERVER_NAME'),
        string(defaultValue: 'Ubuntu_17.10_x86_64_kvm', description: '', name: 'KVM_TEMPLATE'),
        string(defaultValue: 'test@test.com', description: 'mailing list', name: 'SW_MAIL'),
        choice(choices: ['no', 'eth', 'ib'], description: '', name: 'SIMX_SERVER'),
        choice(choices: ['cib', 'cx3pro', 'cx4', 'cx4lx', 'cx5', 'cx6'], description: '', name: 'SIMX_BOARD'),
        choice(choices: ['os_install', 'provision', 'add_jks_slave', 'add_to_noga', 'tests'], description: '', name: 'RUN_STAGE')
    ]),
    [$class: 'RebuildSettings', autoRebuild: false, rebuildDisabled: false],
    [$class: 'ThrottleJobProperty',
        categories: [],
        limitOneJobWithMatchingParams: true,
        maxConcurrentPerNode: 5,
        maxConcurrentTotal: 5,
        paramsToUseForLimit: '',
        throttleEnabled: true,
        throttleOption: 'project'
    ]
]
properties(projectProperties)
def build_sanity (SETUP_DESCRIPTION) {
    IMAGE = "linux/upstream_devel-x86_64"
    CLOUD_IP = "dev-l-vrt-storage"
    TAGS = "test_new_setup"
    if ("$SETUP_DESCRIPTION" != "b2b x86-64 cib cloud_test") {
        DATA_BASE = "b2b_eth_drivertest_mini_reg_db.json"
        LINK_LAYER = "eth"
    }
    else {
        DATA_BASE = "b2b_ib_drivertest_mini_reg_db.json"
        LINK_LAYER = "ib"
    }
    build job: 'SANITY_TESTS/new_cloud_setup_GENERAL_SANITY_CHECK2', propagate: false
        parameters:
            [string(name: 'SETUP_DESCRIPTION', value: "${SETUP_DESCRIPTION}"),
             string(name: 'DATA_BASE', value: "${DATA_BASE}"),
             string(name: 'LINK_LAYER', value: "${LINK_LAYER}"),
             string(name: 'IMAGE', value: "${IMAGE}"),
             string(name: 'CLOUD_IP', value: "${CLOUD_IP}"),
             string(name: 'TAGS', value: "${TAGS}")]
}
try {
    ansiColor('xterm') {
        timestamps {
            node('cloud-slave1') {
                stage('Test Setups') {
                    if (params.RUN_STAGE == 'os_install' || params.RUN_STAGE == 'provision' || params.RUN_STAGE == 'add_jks_slave' || params.RUN_STAGE == 'add_to_noga' || params.RUN_STAGE == 'tests') {
                        def stepsForParrallel = [:]
                        def NOGA_DESCRIPTION_LIST = sh (
                            script: "curl -X GET 'https://noga.mellanox.com/app/server/php/rest_api/?api_cmd=get_resources&pattern=${params.SERVER_NAME}&resource_type=Setup&group_name=Yaron&sub_group=Cloud'",
                            returnStdout: true
                        ).trim()
                        @NonCPS
                        def JSON = new groovy.json.JsonSlurperClassic().parseText(NOGA_DESCRIPTION_LIST)
                        def DESCRIPTION_LIST = JSON.data.each {
                            SETUP_NAME = "${it.NAME}"
                            SETUP_DESCRIPTION = "${it.DESCRIPTION}"
                            println "${it.DESCRIPTION}" // PRINT ALL DESCRIPTIONS INSIDE DATA
                            stepsForParrallel["${it.NAME}"] = {
                                build_sanity(SETUP_DESCRIPTION)
                            }
                        }
                        parallel stepsForParrallel
                    }
                }
            }
        }
    }
} catch (exc) {
    def recipient = "${SW_MAIL}"
    def subject = "${env.JOB_NAME} (${env.BUILD_NUMBER}) Failed"
    def body = """
    It appears that build ${env.BUILD_NUMBER} is failing, please check HW or network stability:
    ${env.BUILD_URL}
    """
    mail subject: subject,
        to: recipient,
        replyTo: recipient,
        from: 'cloud-host-provision@mellanox.com',
        body: body
    throw exc
}
1) When I run it like the code above, the build_sanity function is called and executed only once (instead of 3 times as expected).
2) When I take the build_sanity function content and run it inside the each loop in the tests stage, it runs 3 times as expected, but does not pick up the different parameters as expected.
So I managed to figure it out.
1) I printed some parameters and saw my function did not receive the variables correctly, so I changed the build job: part and that fixed the issue.
2) I also had an issue in the stage part: I had put the "run parallel" inside the each loop, which caused it to run several times. So I moved it one } down and that fixed the loop issue.
Here is the function code + stage, in case anyone encounters this issue in the future.
def build_sanity (SETUP_DESCRIPTION) {
    if ("$SETUP_DESCRIPTION" != "b2b x86-64 cib cloud_test") {
        DATA_BASE = "b2b_eth_drivertest_mini_reg_db.json"
        LINK_LAYER = "eth"
    }
    else {
        DATA_BASE = "b2b_ib_drivertest_mini_reg_db.json"
        LINK_LAYER = "ib"
    }
    IMAGE = "linux/upstream_devel-x86_64"
    CLOUD_IP = "dev-l-vrt-storage"
    TAGS = "test_new_setup"
    build job: 'SANITY_TESTS/new_cloud_setup_GENERAL_SANITY_CHECK',
        parameters: [
            string(name: 'SETUP_DESCRIPTION', value: "${SETUP_DESCRIPTION}"),
            string(name: 'DATA_BASE', value: "${DATA_BASE}"),
            string(name: 'LINK_LAYER', value: "${LINK_LAYER}"),
            string(name: 'IMAGE', value: "${IMAGE}"),
            string(name: 'TAGS', value: "${TAGS}")
        ]
}
stage('Test Setups') {
    if (params.RUN_STAGE == 'os_install' || params.RUN_STAGE == 'provision' || params.RUN_STAGE == 'add_jks_slave' || params.RUN_STAGE == 'add_to_noga' || params.RUN_STAGE == 'tests') {
        def stepsForParrallel = [:]
        def NOGA_DESCRIPTION_LIST = sh (
            script: "curl -X GET 'https://noga.mellanox.com/app/server/php/rest_api/?api_cmd=get_resources&pattern=${params.SERVER_NAME}&resource_type=Setup&group_name=Yaron&sub_group=Cloud'",
            returnStdout: true
        ).trim()
        @NonCPS
        def JSON = new groovy.json.JsonSlurperClassic().parseText(NOGA_DESCRIPTION_LIST)
        def DESCRIPTION_LIST = JSON.data.each {
            def SETUP_NAME = "${it.NAME}"
            stepsForParrallel["${it.NAME}"] = {
                build_sanity("${it.DESCRIPTION}")
            }
        }
        parallel stepsForParrallel
    }
}
We have a Jenkins job that runs autotests with parameters:
HOST;
EXPERIMENT;
TAKE_NEW_SCREENSHOT;
XML_NAME.
All of these parameters have default values;
see the screenshot taken before running the parameterized job:
I need to run several jobs simultaneously with only 2 parameters: HOST and EXPERIMENT.
I created the following pipeline script:
def tasks = [:]
parameters {
    string(name: 'HOST', defaultValue: 'www', description: 'host: www, dev3, etc')
    string(name: 'EXPERIMENT', defaultValue: 'withoutExperiment')
}
tasks['Actions MyBox'] = {
    build job: 'MyDocs_Actions_And_Manage_Buttons_MyBox_Tests', parameters: [
        string(name: 'HOST', value: 'www'),
        string(name: 'EXPERIMENT', value: 'withoutExperiment'),
        booleanParam(name: 'TAKE_NEW_SCREENSHOT', value: false),
        string(name: 'XML_NAME', value: 'my_docs_actions_buttons_mybox_tests')
    ]
}
tasks['DashBoard General'] = {
    build job: 'DashBoard_General_Tests', parameters: [
        string(name: 'HOST', value: 'www'),
        string(name: 'EXPERIMENT', value: 'withoutExperiment'),
        booleanParam(name: 'TAKE_NEW_SCREENSHOT', value: false),
        string(name: 'XML_NAME', value: 'my_docs_dash_board_general_tests')
    ]
}
tasks['Actions InBox'] = {
    build job: 'MyDocs_Actions_Buttons_InBox_Tests', parameters: [
        string(name: 'HOST', value: 'www'),
        string(name: 'EXPERIMENT', value: 'withoutExperiment'),
        booleanParam(name: 'TAKE_NEW_SCREENSHOT', value: false),
        string(name: 'XML_NAME', value: 'my_docs_actions_buttons_inbox_tests')
    ]
}
parallel tasks
and specified the parameters in the "General" pipeline configuration:
But when I run this pipeline item with a parameter value != the default value, for example HOST = dev12,
all jobs still run simultaneously with the default parameter values, and the build shows null for the specified parameter.
Please help me find the problem.
You're passing hardcoded values to your tasks. For example, you defined
tasks['Actions MyBox'] = {
    build job: 'MyDocs_Actions_And_Manage_Buttons_MyBox_Tests', parameters: [
        string(name: 'HOST', value: 'www'),
        string(name: 'EXPERIMENT', value: 'withoutExperiment'),
        booleanParam(name: 'TAKE_NEW_SCREENSHOT', value: false),
        string(name: 'XML_NAME', value: 'my_docs_actions_buttons_mybox_tests')
    ]
}
In this case all parameters are hardcoded, so each time the pipeline is executed the value of HOST will be www. That is also why you see null in the HOST parameter description in the build execution info: you are not referencing it in the build job command.
So, you need to use something like string(name: 'HOST', value: "${params.HOST}").
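Putting that together, one of the tasks from the question would look like the sketch below. The job and parameter names are from the question; the hardcoded TAKE_NEW_SCREENSHOT and XML_NAME values are kept as-is, on the assumption that only HOST and EXPERIMENT should vary per run.

```groovy
// Sketch: forward the pipeline's own HOST and EXPERIMENT parameters
// instead of hardcoding them, so values entered on the
// "Build with Parameters" page propagate to the triggered job.
tasks['Actions MyBox'] = {
    build job: 'MyDocs_Actions_And_Manage_Buttons_MyBox_Tests', parameters: [
        string(name: 'HOST', value: "${params.HOST}"),
        string(name: 'EXPERIMENT', value: "${params.EXPERIMENT}"),
        booleanParam(name: 'TAKE_NEW_SCREENSHOT', value: false),
        string(name: 'XML_NAME', value: 'my_docs_actions_buttons_mybox_tests')
    ]
}
```

The same substitution applies to the 'DashBoard General' and 'Actions InBox' tasks.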