MissingPropertyException on trigger build - Jenkins

I am getting the following error: No such property: $env for class: WorkflowScript
Here is my implementation:
node('test node') {
    stage('apply terraform') {
        // this stage is passing successfully
        env_meta = getEnvMeta("test", "${env.ENV_NAME}")
    }
    stage('Run Env Tets') {
        build job: 'infra_tets', parameters: [
            string(name: 'UI_TESTS', value: 'all'),
            string(name: 'env', value: String.valueOf($env.ENV_NAME)),
        ]
    }
}

Please try:
node('test node') {
    stage('apply terraform') {
        // this stage is passing successfully
        env.env_meta = getEnvMeta("test", "${env.ENV_NAME}")
    }
    stage('Run Env Tets') {
        build job: 'infra_tets', parameters: [
            string(name: 'UI_TESTS', value: 'all'),
            string(name: 'env', value: env.env_meta),
        ]
    }
}
Notice that if you use double quotes, you need to access the env var as "${env.ENV_NAME}".
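The root cause, in short: `$env` is only valid inside a double-quoted GString, so outside a string Groovy looks for a property literally named `$env` on the WorkflowScript. A minimal sketch of the quoting rules (values shown are illustrative):

```groovy
// Inside double quotes, ${...} interpolates:
def name = "${env.ENV_NAME}"   // expands the environment variable

// Outside a string, reference the object directly, with no dollar sign:
def value = env.ENV_NAME       // OK
// def bad = $env.ENV_NAME     // MissingPropertyException: No such property: $env

// Single quotes never interpolate; the literal text passes through:
echo '${env.ENV_NAME}'         // prints ${env.ENV_NAME} verbatim
```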

Related

Jenkinsfile: one type of parameter instead of two, convert choice into string or string into choice

I'm using two parameters: one is a choice (ID) and the other a string (NID), but the values are the same. The requirement is to use only one parameter, either choice or string. Is it possible to convert a choice parameter into a string, or a string into a choice parameter,
so that I can use one parameter and one deploy function?
def deploy1(env) {
    step([$class: 'UCDeployPublisher',
        siteName: siteName,
        deploy: [
            $class: 'com.urbancode.jenkins.plugins.ucdeploy.DeployHelper$DeployBlock',
            deployApp: appName,
            deployEnv: 'DEV',
            deployVersions: "${compName}:${version}",
            deployProc: simpleDeploy,
            deployOnlyChanged: false,
            deployReqProps: "ID=${params.ID}" // ===> string parameter
        ]])
}
def deploy2(env) {
    step([$class: 'UCDeployPublisher',
        siteName: siteName,
        deploy: [
            $class: 'com.urbancode.jenkins.plugins.ucdeploy.DeployHelper$DeployBlock',
            deployApp: appName,
            deployEnv: 'DEV',
            deployVersions: "${compName}:${version}",
            deployProc: simpleDeploy,
            deployOnlyChanged: false,
            deployReqProps: "ID=${params.NID}" // ===> Needs choice parameter
        ]])
}
parameters {
    choice(
        name: 'ID',
        choices: [ '8922', '9292', '3220' ]
    )
    string(
        name: 'NID',
        defaultValue: '8922,9292,3220'
    )
}
stage('DEV') {
    steps {
        script {
            if (params.ENVIRONMENT == "dev") {
                deploy1('devl') // ===> this will call my deploy function
            }
        }
    }
}
Yes, you can convert the string parameter to an array by using split.
Below is an example:
// Define a list which will contain all IDs
def ID = []
pipeline {
    agent none
    parameters {
        // Example: pass the NID list as a comma-separated string. This can be changed based on your needs.
        string(name: 'NID', defaultValue: '8922,9292,3220', description: 'Enter comma-separated NID values, e.g. 8922,9292,3220')
    }
    stages {
        stage('DEV') {
            agent any
            steps {
                script {
                    // Update the ID list
                    ID = params.NID.split(",")
                    // You can loop through the ID list
                    for (myid in ID) {
                        println("ID is : ${myid}")
                    }
                }
            }
        }
    }
}
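With the parameter split into a list, a single deploy function can serve both cases. A hypothetical wiring (the `deploy` helper below stands in for the asker's `deploy1`/`deploy2`, and `siteName`, `appName`, `compName`, `version`, and `simpleDeploy` are assumed to be defined as in the question):

```groovy
// Hypothetical: one deploy helper taking the ID value explicitly, so the same
// function works whether the value came from a choice or a string parameter.
def deploy(String targetEnv, String id) {
    step([$class: 'UCDeployPublisher',
        siteName: siteName,
        deploy: [
            $class: 'com.urbancode.jenkins.plugins.ucdeploy.DeployHelper$DeployBlock',
            deployApp: appName,
            deployEnv: targetEnv,
            deployVersions: "${compName}:${version}",
            deployProc: simpleDeploy,
            deployOnlyChanged: false,
            deployReqProps: "ID=${id}"
        ]])
}

// Inside the stage: deploy once per ID parsed from the single string parameter.
for (myid in params.NID.split(",")) {
    deploy('DEV', myid)
}
```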

How to use a matrix section in a declarative pipeline

I have the following pipeline. I need this pipeline to run on 4 different nodes at the same time. I have read that using a matrix section within the declarative pipeline is key to making this work. How can I go about doing that with the pipeline below?
pipeline
{
stages
{
stage ('Test')
{
steps
{
script
{
def test_proj_choices = ['AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA']
for (choice in test_proj_choices)
{
stage ("${choice}")
{
echo "Running ${choice}"
build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: choice), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
}
}
}
}
}
}
}
One helpful article can be found here: https://www.jenkins.io/blog/2019/11/22/welcome-to-the-matrix/
The official documentation is here: https://www.jenkins.io/doc/book/pipeline/syntax/#declarative-matrix
Accordingly, the syntax should be:
pipeline {
    agent none
    stages {
        stage('Tests') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'CHOICE'
                        values 'AD', 'CD', 'DC', 'DISP_A', 'DISP_PROC', 'EGI', 'FD', 'FLT', 'FMS_C', 'IFF', 'liblO', 'libNGC', 'libSC', 'MISCMP_MP', 'MISCMP_GP', 'NAV_MGR', 'RADALT', 'SYS', 'SYSIO15', 'SYSIO42', 'SYSRED', 'TACAN', 'VOR_ILS', 'VPA', 'WAAS', 'WCA'
                    }
                }
                stages {
                    stage("Test") {
                        steps {
                            echo "Running ${CHOICE}"
                            build job: "UH60Job", parameters: [string(name: "TEST_PROJECT", value: CHOICE), string(name: "SCADE_SUITE_TEST_ACTION", value: "all"), string(name: "VIEW_ROOT", value: "myview")]
                        }
                    }
                }
            }
        }
    }
}
Note that your inner stage cannot be named dynamically; you'd get a syntax error trying to expand "${CHOICE}" in the stage name.
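If some combinations shouldn't run, the declarative matrix also supports an `excludes` section alongside `axes`; a sketch with an arbitrarily chosen excluded value (shortened axis list for brevity):

```groovy
matrix {
    agent any
    axes {
        axis {
            name 'CHOICE'
            values 'AD', 'CD', 'DC'
        }
    }
    excludes {
        exclude {
            axis {
                name 'CHOICE'
                values 'DC'   // skip this cell of the matrix
            }
        }
    }
    stages {
        stage("Test") {
            steps {
                echo "Running ${CHOICE}"
            }
        }
    }
}
```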

Jenkins Pipeline passing parameter as shell script argument

I have a parameterized Jenkins Pipeline with a default value and I'm trying to pass that param as a script argument, but it doesn't seem to pass anything. Here is the script:
pipeline {
    agent any
    stages {
        stage('Building') {
            steps {
                build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
            }
        }
        stage('Doing stuff') {
            steps {
                sh "~/scripts/myScript ${configuration}"
            }
        }
    }
}
It seems to work for the build step but not for the script. It returns an error saying I have no argument.
I tried to get it with ${configuration}, ${params.configuration} and $configuration.
What is the right way to access a param and pass it correctly to a script?
Thanks.
Actually, you are using the build step to pass a parameter to the Jenkins job 'myProject'.
build job: 'myProject', parameters: [string(name: 'configuration', value: '${configuration}')]
If you want to declare a parameter in this job, you need to declare it in a "parameters" block.
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: '', name: 'configuration')
    }
    stages {
        stage('Doing stuff') {
            steps {
                sh "~/scripts/myScript ${configuration}"
            }
        }
    }
}
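Once declared, the parameter is also available through the `params` namespace, which makes its origin explicit and avoids clashes with other globals; a sketch of the same stage:

```groovy
stage('Doing stuff') {
    steps {
        // params.<name> is the canonical way to read a declared parameter;
        // quoting it on the shell line guards against spaces in the value.
        sh "~/scripts/myScript '${params.configuration}'"
    }
}
```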

How to trigger a Jenkins Job on multiple nodes from a Pipeline (only one job executing)

I have a Jenkins Job, configured as a Scripted Jenkins Pipeline, which:
Checks out the code from GitHub
merges in developer changes
builds a debug image
it is then supposed to split into 3 separate parallel processes - one of which builds the release version of the code and unit tests it.
The other 2 processes are supposed to be identical, with the debug image being flashed onto a target and various tests running.
The targets are identified in Jenkins as slave_1 and slave_2 and are both allocated the label 131_ci_targets
I am using 'parallel' to trigger the release build, and the multiple instances of the test job. I will post a (slightly redacted) copy of my Scripted pipeline below for full reference, but for the question I have tried all 3 of the following options.
Using a single build call with LabelParameterValue and allNodesMatchingLabel set to true. In this, TEST_TARGETS is the label 131_ci_targets.
parallel_steps = [:]
parallel_steps["release"] = { // Release build and test steps
}
parallel_steps["${TEST_TARGETS}"] = {
    stage("${TEST_TARGETS}") {
        build job: 'Trial_Test_Pipe',
            parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                         string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                         [$class: 'LabelParameterValue',
                          name: 'RUN_NODE', label: "${TEST_TARGETS}",
                          allNodesMatchingLabel: true,
                          nodeEligibility: [$class: 'AllNodeEligibility']]]
    }
} // ${TEST_TARGETS}
stage('Parallel') {
    parallel parallel_steps
} // Parallel
Using a single build call with NodeParameterValue and a list of all nodes. In this, TEST_TARGETS is again the label, while test_nodes is a list of 2 strings: [slave_1, slave_2].
parallel_steps = [:]
parallel_steps["release"] = { // Release build and test steps
}
test_nodes = hostNames("${TEST_TARGETS}")
parallel_steps["${TEST_TARGETS}"] = {
    stage("${TEST_TARGETS}") {
        echo "test_nodes: ${test_nodes}"
        build job: 'Trial_Test_Pipe',
            parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                         string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                         [$class: 'NodeParameterValue',
                          name: 'RUN_NODE', labels: test_nodes,
                          nodeEligibility: [$class: 'AllNodeEligibility']]]
    }
} // ${TEST_TARGETS}
stage('Parallel') {
    parallel parallel_steps
} // Parallel
3: Using multiple stages, each with a single build call with NodeParameterValue and a list containing only one slave id.
test_nodes is the list of strings [slave_1, slave_2]; the first call passes slave_1 and the second slave_2.
for (tn in test_nodes) {
    parallel_steps["${tn}"] = {
        stage("${tn}") {
            echo "test_nodes: ${test_nodes}"
            build job: 'Trial_Test_Pipe',
                parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                             string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                             [$class: 'NodeParameterValue',
                              name: 'RUN_NODE', labels: [tn],
                              nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
                wait: false
        }
    } // ${tn}
}
All of the above will trigger only a single run of the 'Trial_Test_Pipe' on slave_2 assuming that both slave_1 and slave_2 are defined, online and have available executors.
The Trial_Test_Pipe job is another Jenkins Pipeline job, and has the checkbox "Do not allow concurrent builds" unchecked.
Any thoughts on:
Why the job will only trigger one of the runs, not both?
What the correct solution may be?
For reference now: here is my full(ish) scripted Jenkins job:
import hudson.model.*
import hudson.EnvVars
import groovy.json.JsonSlurperClassic
import groovy.json.JsonBuilder
import groovy.json.JsonOutput
import java.net.URL

def BUILD_SLAVE = ""

// clean the workspace before starting the build process
def clean_before_build() {
    bat label: '',
        script: '''cd %GITHUB_REPO_PATH%
git status
git clean -x -d -f
'''
}

// Routine to build the firmware
// Can build Debug or Release depending on the environment variables
def build_the_firmware() {
    return
    def batch_script = """
REM *** Build script here
echo "... Build script here ..."
"""
    bat label: '',
        script: batch_script
}

// Copy the hex files out of the Build folder and into the Jenkins workspace
def copy_hex_files_to_workspace() {
    return
    def batch_script = """
REM *** Copy HEX file to workspace:
echo "... Copy HEX file to workspace ..."
"""
    bat label: '',
        script: batch_script
}

// Updated from stackOverflow answer: https://stackoverflow.com/a/54145233/1589770
@NonCPS
def hostNames(label) {
    nodes = []
    jenkins.model.Jenkins.instance.computers.each { c ->
        if (c.isOnline()) {
            labels = c.node.labelString
            labels.split(' ').each { l ->
                if (l == label) {
                    nodes.add(c.node.selfLabel.name)
                }
            }
        }
    }
    return nodes
}
try {
    node('Build_Slave') {
        BUILD_SLAVE = "${env.NODE_NAME}"
        echo "build_slave=${BUILD_SLAVE}"
        stage('Checkout Repo') {
            // Set a description on the build history to make for easy identification
            currentBuild.setDescription("Pull Request: ${PULL_REQUEST_NUMBER} \n${TARGET_BRANCH}")
            echo "... checking out dev code from our repo ..."
        } // Checkout Repo
        stage('Merge PR') {
            // Merge the base branch into the target for test
            echo "... Merge the base branch into the target for test ..."
        } // Merge PR
        stage('Build Debug') {
            withEnv(['LIB_MODE=Debug', 'IMG_MODE=Debug', 'OUT_FOLDER=Debug']) {
                clean_before_build()
                build_the_firmware()
                copy_hex_files_to_workspace()
                archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
            }
        } // Build Debug
        stage('Post Build') {
            if (currentBuild.resultIsWorseOrEqualTo("UNSTABLE")) {
                echo "... Send a mail to the Admins and the Devs ..."
            }
        } // Post Merge
    } // node

    parallel_steps = [:]
    parallel_steps["release"] = {
        node("${BUILD_SLAVE}") {
            stage('Build Release') {
                withEnv(['LIB_MODE=Release', 'IMG_MODE=Release', 'OUT_FOLDER=build\\Release']) {
                    clean_before_build()
                    build_the_firmware()
                    copy_hex_files_to_workspace()
                    archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
                }
            } // Build Release
            stage('Unit Tests') {
                echo "... do Unit Tests here ..."
            }
        }
    } // release

    test_nodes = hostNames("${TEST_TARGETS}")
    if (true) {
        parallel_steps["${TEST_TARGETS}"] = {
            stage("${TEST_TARGETS}") {
                echo "test_nodes: ${test_nodes}"
                build job: 'Trial_Test_Pipe',
                    parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                                 string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                                 [$class: 'LabelParameterValue',
                                  name: 'RUN_NODE', label: "${TEST_TARGETS}",
                                  allNodesMatchingLabel: true,
                                  nodeEligibility: [$class: 'AllNodeEligibility']]]
            }
        } // ${TEST_TARGETS}
    } else if (false) {
        parallel_steps["${TEST_TARGETS}"] = {
            stage("${TEST_TARGETS}") {
                echo "test_nodes: ${test_nodes}"
                build job: 'Trial_Test_Pipe',
                    parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                                 string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                                 [$class: 'NodeParameterValue',
                                  name: 'RUN_NODE', labels: test_nodes,
                                  nodeEligibility: [$class: 'AllNodeEligibility']]]
            }
        } // ${TEST_TARGETS}
    } else {
        for (tn in test_nodes) {
            parallel_steps["${tn}"] = {
                stage("${tn}") {
                    echo "test_nodes: ${test_nodes}"
                    build job: 'Trial_Test_Pipe',
                        parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
                                     string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
                                     [$class: 'NodeParameterValue',
                                      name: 'RUN_NODE', labels: [tn],
                                      nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
                        wait: false
                }
            } // ${tn}
        }
    }
    stage('Parallel') {
        parallel parallel_steps
    } // Parallel
} // try
catch (Exception ex) {
    if (manager.logContains(".*Merge conflict in .*")) {
        manager.addWarningBadge("Pull Request ${PULL_REQUEST_NUMBER} Experienced Git Merge Conflicts.")
        manager.createSummary("warning.gif").appendText("<h2>Experienced Git Merge Conflicts!</h2>", false, false, false, "red")
    }
    echo "... Send a mail to the Admins and the Devs ..."
    throw ex
}
So ... I have a solution for this ... as in, I understand what to do, and why one of the above solutions wasn't working.
The winner is Option 3 ... the reason it wasn't working is that the code inside the closure (the stage part) isn't evaluated until the stage is actually run. As a result the strings aren't expanded until then and, since tn is fixed at slave_2 by that point, that's the value used in both parallel streams.
In the Jenkins examples here ... https://jenkins.io/doc/pipeline/examples/#parallel-from-grep ... the closures are returned from a function transformIntoStep, and by doing this I was able to force early evaluation of the strings and so get parallel steps running on both slaves.
If you're here looking for answers, I hope this helps. If you are, and it has, please feel free to give me an uptick. Cheers :)
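The capture behaviour can be reproduced in plain Groovy, away from Jenkins (names here are illustrative):

```groovy
def steps = [:]
for (tn in ['slave_1', 'slave_2']) {
    // Every closure created in the loop shares the single variable 'tn',
    // so by the time they run, both see the final value 'slave_2'.
    steps[tn] = { -> println tn }
}

// The fix used below: pass the value through a function parameter,
// which gives each closure its own binding, fixed at creation time.
def makeStep(String nodeId) {
    return { -> println nodeId }
}
for (tn in ['slave_1', 'slave_2']) {
    steps[tn] = makeStep(tn)
}
```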
My final scripted jenkinsfile looks something like this:
import hudson.model.*
import hudson.EnvVars
import groovy.json.JsonSlurperClassic
import groovy.json.JsonBuilder
import groovy.json.JsonOutput
import java.net.URL

BUILD_SLAVE = ""
parallel_steps = [:]

// clean the workspace before starting the build process
def clean_before_build() {
    bat label: '',
        script: '''cd %GITHUB_REPO_PATH%
git status
git clean -x -d -f
'''
}

// Routine to build the firmware
// Can build Debug or Release depending on the environment variables
def build_the_firmware() {
    def batch_script = """
REM *** Build script here
echo "... Build script here ..."
"""
    bat label: '',
        script: batch_script
}

// Copy the hex files out of the Build folder and into the Jenkins workspace
def copy_hex_files_to_workspace() {
    def batch_script = """
REM *** Copy HEX file to workspace:
echo "... Copy HEX file to workspace ..."
"""
    bat label: '',
        script: batch_script
}

// Updated from stackOverflow answer: https://stackoverflow.com/a/54145233/1589770
@NonCPS
def hostNames(label) {
    nodes = []
    jenkins.model.Jenkins.instance.computers.each { c ->
        if (c.isOnline()) {
            labels = c.node.labelString
            labels.split(' ').each { l ->
                if (l == label) {
                    nodes.add(c.node.selfLabel.name)
                }
            }
        }
    }
    return nodes
}

def transformTestStep(nodeId) {
    return {
        stage(nodeId) {
            build job: 'Trial_Test_Pipe',
                parameters: [string(name: 'TARGET_BRANCH', value: TARGET_BRANCH),
                             string(name: 'FRAMEWORK_VERSION', value: FRAMEWORK_VERSION),
                             [$class: 'NodeParameterValue',
                              name: 'RUN_NODE', labels: [nodeId],
                              nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
                wait: false
        }
    }
}

def transformReleaseStep(build_slave) {
    return {
        node(build_slave) {
            stage('Build Release') {
                withEnv(['LIB_MODE=Release', 'IMG_MODE=Release', 'OUT_FOLDER=build\\Release']) {
                    clean_before_build()
                    build_the_firmware()
                    copy_hex_files_to_workspace()
                    archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
                }
            } // Build Release
            stage('Unit Tests') {
                echo "... do Unit Tests here ..."
            }
        }
    }
}
try {
    node('Build_Slave') {
        BUILD_SLAVE = "${env.NODE_NAME}"
        echo "build_slave=${BUILD_SLAVE}"
        parallel_steps["release"] = transformReleaseStep(BUILD_SLAVE)
        test_nodes = hostNames("${TEST_TARGETS}")
        for (tn in test_nodes) {
            parallel_steps[tn] = transformTestStep(tn)
        }
        stage('Checkout Repo') {
            // Set a description on the build history to make for easy identification
            currentBuild.setDescription("Pull Request: ${PULL_REQUEST_NUMBER} \n${TARGET_BRANCH}")
            echo "... checking out dev code from our repo ..."
        } // Checkout Repo
        stage('Merge PR') {
            // Merge the base branch into the target for test
            echo "... Merge the base branch into the target for test ..."
        } // Merge PR
        stage('Build Debug') {
            withEnv(['LIB_MODE=Debug', 'IMG_MODE=Debug', 'OUT_FOLDER=Debug']) {
                clean_before_build()
                build_the_firmware()
                copy_hex_files_to_workspace()
                archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
            }
        } // Build Debug
        stage('Post Build') {
            if (currentBuild.resultIsWorseOrEqualTo("UNSTABLE")) {
                echo "... Send a mail to the Admins and the Devs ..."
            }
        } // Post Merge
    } // node
    stage('Parallel') {
        parallel parallel_steps
    } // Parallel
} // try
catch (Exception ex) {
    if (manager.logContains(".*Merge conflict in .*")) {
        manager.addWarningBadge("Pull Request ${PULL_REQUEST_NUMBER} Experienced Git Merge Conflicts.")
        manager.createSummary("warning.gif").appendText("<h2>Experienced Git Merge Conflicts!</h2>", false, false, false, "red")
    }
    echo "... Send a mail to the Admins and the Devs ..."
    throw ex
}

Passing maps into Jenkins pipeline jobs

I have a Jenkins Job DSL seed job that calls out to a couple of pipeline jobs e.g.
pipelineJob("job1") {
    definition {
        cps {
            script(readFileFromWorkspace('job1.groovy'))
        }
        parameters {
            choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
        }
    }
}
pipelineJob("job2") {
    definition {
        cps {
            script(readFileFromWorkspace('job2.groovy'))
        }
        parameters {
            choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
        }
    }
}
job1.groovy and job2.groovy are standard Jenkinsfile style pipelines.
I want to pass a couple of common maps into these jobs. These contain things that may vary between environments, e.g. target servers, credential names.
Something like:
def SERVERS_MAP = [
    'prod': [
        'prod-server1',
        'prod-server2',
    ],
    'dev': [
        'dev-server1',
        'dev-server2',
    ],
]
Can I define a map in my seed job that I can then pass and access as a map in my pipeline jobs?
I've come up with a hacky workaround using the pipeline-utility-steps plugin.
Essentially I pass my data maps around as JSON.
So my seed job might contain:
def SERVERS_MAP = '''
{
    "prod": [
        "prod-server1",
        "prod-server2"
    ],
    "dev": [
        "dev-server1",
        "dev-server2"
    ]
}
'''
pipelineJob("job1") {
    definition {
        cps {
            script(readFileFromWorkspace('job1.groovy'))
        }
        parameters {
            choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
            stringParam('SERVERS_MAP', "${SERVERS_MAP}", "")
        }
    }
}
and my pipeline would contain something like:
def serversMap = readJSON text: SERVERS_MAP
def targetServers = serversMap["${ENV}"]
targetServers.each { server ->
    echo server
}
I could also extract these variables into a JSON file and read them from there.
Although it works, it feels wrong somehow.
You can use a string parameter to pass the map value; the downstream job then reads it as JSON.
UPSTREAM PIPELINE
timestamps {
    node("sse_lab_CI_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage("-- regression execute --") {
            def test_map = """
                {
                    "gerrit_patchset_commit": "aad5fce",
                    "build_cpu_x86_ubuntu": [
                        "centos_compatible_build_test",
                        "gdb_compatible_build_test",
                        "visual_profiler_compatible_build_test"
                    ]
                }
            """
            build(job: 'tops_regression_down',
                  parameters: [string(name: 'UPSTREAM_JOB_NAME', value: "${env.JOB_BASE_NAME}"),
                               string(name: 'UPSTREAM_BUILD_NUM', value: "${env.BUILD_NUMBER}"),
                               string(name: 'MAP_PARAM', value: "${test_map}")],
                  propagate: true,
                  wait: true)
        }
    }
}
DOWNSTREAM PIPELINE
timestamps{
node("sse_lab_inspur_076"){ //${execNode}
currentBuild.description="${env.NODE_NAME};"
stage('--in precondition--'){
dir('./'){
cleanWs()
println("hello world")
println("${env.MAP_PARAM}")
Map result_json = readJSON(text: "${env.MAP_PARAM}")
println(result_json)
}
}
}
}
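For completeness, the downstream job can then drive its work from the parsed map; a sketch continuing the example (key names taken from the upstream test_map):

```groovy
// Parse the JSON string handed over via the MAP_PARAM string parameter.
Map result_json = readJSON(text: "${env.MAP_PARAM}")
echo "commit under test: ${result_json.gerrit_patchset_commit}"
// Iterate one of the lists carried inside the map:
result_json.build_cpu_x86_ubuntu.each { testName ->
    echo "would run: ${testName}"
}
```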
