Jenkins Job DSL for pipelineJob missing parameters

I am using a Jenkins Job DSL pipelineJob to create a new job, and I need to pass three parameters to the new job. Here is my DSL code:
pipelineJob("cronjob/${ACTION}_${ENVRIONMENT_NAME}_environment_CRONJOB") {
parameters {
stringParam("ENVRIONMENT", "${ENVRIONMENT_NAME}")
stringParam("INSTANCE", "${INSTANCE_NAME}")
}
triggers {
scm("${SCHEDULE}")
}
definition {
cpsScm {
scm {
git {
remote {
url("<my github URL>")
credentials("my_credential_Id")
}
branch('*/develop')
}
}
scriptPath("myhome/code/single-${ACTION}")
}
}
disabled()
}
Here ACTION, ENVRIONMENT_NAME, and INSTANCE_NAME are Active Choices parameters of the DSL seed job. The seed job creates the new job, and the new job's parameters show the correct values from the seed job.
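For context, a minimal sketch of such a seed job, with plain choiceParam standing in for the Active Choices parameters; the job name, choices, and script path are illustrative, and the parameter names are kept exactly as in the script above:
// Hypothetical freestyle seed job. Its build parameters are exposed to the
// Job DSL script as ${ACTION}, ${ENVRIONMENT_NAME}, and ${INSTANCE_NAME}.
job('cronjob-seed') {
    parameters {
        // choiceParam stands in for the Active Choices parameters here
        choiceParam('ACTION', ['start', 'stop'], 'Action for the generated job')
        choiceParam('ENVRIONMENT_NAME', ['Select a environment', 'test_1'], 'Target environment')
        choiceParam('INSTANCE_NAME', ['Select a instance', 'instance_1'], 'Target instance')
    }
    steps {
        dsl {
            external('jobs/cronjob.groovy') // the pipelineJob script shown above
        }
    }
}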
The Jenkinsfile at myhome/code/single-${ACTION}:
pipeline {
    agent any
    stages {
        stage('Run inflate') {
            steps {
                script {
                    if (env.ENVIRONMENT != "Select a environment" && env.INSTANCE == "Select a instance") {
                        echo "Now for env.... ${env.ENVIRONMENT}"
                        ansiblePlaybook become: true,
                            colorized: true,
                            credentialsId: 'my_credential_ID',
                            extras: "-e environment_name=${env.ENVIRONMENT}",
                            installation: 'ansible',
                            inventory: 'ansible/hosts',
                            playbook: "ansible/scripts/single-action.yml"
                    }
                }
            }
        }
    }
}
When I run the created job, it does not take the values assigned to the parameters, and part of the output is:
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Run inflate)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
Now for env.... null
[Pipeline] echo
env.ENVIRONMENT is null instead of test_1, the value shown in the job's parameters. Because the parameters have no values, the Ansible playbook fails afterwards.
How can I make the job pick up the parameter values?
Thanks!

Turns out there was a typo in the pipelineJob code: ENVRIONMENT instead of ENVIRONMENT, so the parameter name never matched the env.ENVIRONMENT that the Jenkinsfile reads. The correct version is:
pipelineJob("cronjob/${ACTION}_${ENVIRONMENT_NAME}_environment_CRONJOB") {
parameters {
stringParam("ENVIRONMENT", "${ENVIRONMENT_NAME}")
stringParam("INSTANCE", "${INSTANCE_NAME}")
}
triggers {
scm("${SCHEDULE}")
}
definition {
cpsScm {
scm {
git {
remote {
url("<my github URL>")
credentials("my_credential_Id")
}
branch('*/develop')
}
}
scriptPath("myhome/code/single-${ACTION}")
}
}
disabled()
}
and it works!
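The takeaway in miniature: the name given to stringParam must match exactly what the Jenkinsfile reads, via either params or env. A minimal sketch (assuming the corrected name ENVIRONMENT):
pipeline {
    agent any
    stages {
        stage('Show parameters') {
            steps {
                // Both lookups resolve the stringParam defined by the seed job;
                // a one-letter mismatch in the name yields null instead.
                echo "params.ENVIRONMENT = ${params.ENVIRONMENT}"
                echo "env.ENVIRONMENT = ${env.ENVIRONMENT}"
            }
        }
    }
}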

Related

Jenkins stage doesn't call custom method

I have a Jenkins pipeline that does some code linting in different environments. I have a linting method that I call based on what parameters are passed. However, during my build, the stage that calls the method does nothing and returns nothing. Everything looks sane to me. Below is my code, and the build output showing the empty result.
Jenkinsfile:
IAMMap = [
    "west": [
        account: "XXXXXXXX",
    ],
    "east": [
        account: "YYYYYYYYY",
    ],
]

pipeline {
    options {
        ansiColor('xterm')
    }
    parameters {
        booleanParam(
            name: 'WEST',
            description: 'Whether to lint code from west account or not. Defaults to "false"',
            defaultValue: false
        )
        booleanParam(
            name: 'EAST',
            description: 'Whether to lint code from east account or not. Defaults to "false"',
            defaultValue: true
        )
        booleanParam(
            name: 'LINT',
            description: 'Whether to perform linting. This should always default to "true"',
            defaultValue: true
        )
    }
    environment {
        CODE_DIR = "/code"
    }
    stages {
        stage('Start Lint') {
            steps {
                script {
                    if (params.WEST && params.LINT) {
                        codeLint("west")
                    }
                    if (params.EAST && params.LINT) {
                        codeLint("east")
                    }
                }
            }
        }
    }
    post {
        always {
            cleanWs disableDeferredWipeout: true, deleteDirs: true
        }
    }
}

def codeLint(account) {
    return {
        stage('Code Lint') {
            dir(env.CODE_DIR) {
                withAWS(IAMMap[account]) {
                    sh script: "./lint.sh"
                }
            }
        }
    }
}
Results:
15:00:20 [Pipeline] { (Start Lint)
15:00:20 [Pipeline] script
15:00:20 [Pipeline] {
15:00:20 [Pipeline] }
15:00:20 [Pipeline] // script
15:00:20 [Pipeline] }
15:00:20 [Pipeline] // stage
15:00:20 [Pipeline] stage
15:00:20 [Pipeline] { (Declarative: Post Actions)
15:00:20 [Pipeline] cleanWs
15:00:20 [WS-CLEANUP] Deleting project workspace...
15:00:20 [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
15:00:20 [WS-CLEANUP] done
As you can see nothing gets executed. I assure you I am checking the required parameters when running Build with Parameters in the console. As far as I know, this is the correct syntax for a declarative pipeline.
Don't return the Stage, just execute it within the codeLint function.
def codeLint(account) {
    stage('Code Lint') {
        dir(env.CODE_DIR) {
            withAWS(IAMMap[account]) {
                sh script: "./lint.sh"
            }
        }
    }
}
Or, since the stage is returned as a closure, you can run it explicitly. This may need script approval:
codeLint("west").run()

How to get the value of params inside jenkins pipeline stages

How can I access parameter values inside different stages of a Jenkins pipeline?
So far I have done:
env_vars = 'Initial value'

pipeline {
    agent { label 'master' }
    stages {
        stage('Inject-Env-Vars') {
            steps {
                script {
                    env_vars = params.collect { string(name: it.key, value: it.value) } // Collecting all the params here
                }
            }
        }
        stage('add_env_variable') {
            steps {
                script {
                    env_vars.add(string(name: 'Change_Reason', value: "sam")) // Adding an extra param
                }
            }
        }
        stage('Parent') {
            parallel {
                stage('RCP') {
                    steps {
                        echo "$env_vars"
                        echo "${env_vars.Change_Reason}" // want to print value of Change_Reason
                    }
                }
            }
        }
    }
}
Output:
[Pipeline] { (RCP)
[Pipeline] echo
[#string(name=Change_Reason,value=sam)]
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch RCP
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: No such field found: field org.jenkinsci.plugins.structs.describable.UninstantiatedDescribable Change_Reason
Please let me know if there is a way to get the param value in different stages.
env_vars seems to be a List (of string() parameter descriptors), yet you are accessing it as a Map.
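A minimal sketch of a Map-based alternative, assuming the goal is simply to carry values across stages rather than to build string() descriptors:
// Use a plain Map so values can be added and looked up by key in any stage.
env_vars = [:]

pipeline {
    agent { label 'master' }
    stages {
        stage('Inject-Env-Vars') {
            steps {
                script {
                    params.each { k, v -> env_vars[k] = v } // copy all params
                }
            }
        }
        stage('add_env_variable') {
            steps {
                script {
                    env_vars['Change_Reason'] = 'sam' // add an extra entry
                }
            }
        }
        stage('RCP') {
            steps {
                echo "${env_vars.Change_Reason}" // prints: sam
            }
        }
    }
}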

How to pass variable from pipeline to job in Jenkins?

I need to create a unique identifier in a pipeline and then all jobs started from this pipeline should have access to this unique identifier.
I do not want to parameterize those jobs.
I thought that an environment variable defined at the pipeline level would be accessible from those jobs, but it isn't:
pipeline {
    agent any
    environment {
        TEST_VAR = 'TEST_VAR'
    }
    stages {
        stage('Stage1') {
            steps {
                build(job: 'job1')
            }
        }
    }
}
You do not really need to parameterize the downstream pipelines but can still pass the variable as a parameter from the upstream and access it in the downstream.
Upstream pipeline
pipeline {
    agent any
    environment {
        TEST_VAR = 'hello_world'
    }
    stages {
        stage('Build-downstream-job') {
            steps {
                build job: 'downstream-job', parameters: [string(name: 'TEST_VAR', value: env.TEST_VAR)], wait: false
            }
        }
    }
}
Downstream pipeline
pipeline {
    agent any
    stages {
        stage('Get-test-var') {
            steps {
                println(params.TEST_VAR)
            }
        }
    }
}
Downstream pipeline console output
[Pipeline] stage
[Pipeline] { (Get-test-var)
[Pipeline] echo
hello_world
[Pipeline] }
[Pipeline] // stage
You should try adding a '$' before TEST_VAR:
environment {
    TEST_VAR = '$TEST_VAR'
}

Jenkins: For each loop issue

I am using Jenkins to trigger an Ansible playbook. I would like to pass in an array of computer names in order for the Ansible playbook to trigger the playbook and image the machines one by one.
In order to do this I believe I need a foreach loop.
I have little skill in Groovy/Jenkins and have run into an issue.
The error is "Expected a symbol @ line...", and the line it refers to is HOSTS.each { item ->.
Can someone please assist me? My script is below (I have edited out some private data):
pipeline {
    agent any
    // every day
    triggers {
        cron('H 7 * * *')
    }
    environment {
        HOSTS = ['node1','node2']
    }
    stages {
        stage('MachineDB Scheduler') {
            steps {
                HOSTS.each { item -> // review
                    HOSTNAME = ${item} // review
                    ansibuildPlaybookperf(
                        sshUser: env.USER,
                        vaultUser: env.USER,
                        server: "$SERVER",
                        dir: "$HOMEDIR/$BUILD_TAG",
                        playbook: "$PLAYBOOK",
                        extras: "--vault-password-file passmgr.sh",
                        extraVars: "$VARS_JENKINS"
                    )
                }
            }
        }
    }
}
I don't really know a lot about Ansible, but maybe this helps.
This pipeline loops over a list of PCs. In the declarative pipeline I call a Groovy function that is defined after the pipeline; in that function I go through the list and print every PC name.
def list = [
    'PCNAME1',
    'PCNAME2',
    'PCNAME3'
]

pipeline {
    agent any
    stages {
        stage('Loop through PCs') {
            steps {
                loopPC(list)
            }
        }
    }
}

def loopPC(list) {
    list.each {
        println "Computer ${it}"
    }
}
OUTPUT:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Loop through PCs)
[Pipeline] echo
Computer PCNAME1
[Pipeline] echo
Computer PCNAME2
[Pipeline] echo
Computer PCNAME3
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
You can also use a script block in your declarative pipeline to execute the loop inline. It's less clean, but perhaps easier to get working at first (and closer to your attempt):
def list = [
    'PCNAME1',
    'PCNAME2',
    'PCNAME3'
]

pipeline {
    agent any
    stages {
        stage('Loop through PCs') {
            steps {
                script {
                    list.each {
                        println "Computer ${it}"
                    }
                }
            }
        }
    }
}
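To connect this back to the Ansible goal, a sketch (untested; the playbook path, inventory, and credentials ID are placeholders) that loops over the hosts inside a script block and uses the ansiblePlaybook step's limit option to image one machine per iteration:
def hosts = ['node1', 'node2']

pipeline {
    agent any
    triggers {
        cron('H 7 * * *') // every day, as in the question
    }
    stages {
        stage('Image machines') {
            steps {
                script {
                    hosts.each { host ->
                        // limit restricts each playbook run to a single host
                        ansiblePlaybook(
                            playbook: 'ansible/scripts/image.yml', // placeholder
                            inventory: 'ansible/hosts',            // placeholder
                            credentialsId: 'my_credential_ID',     // placeholder
                            limit: host,
                            colorized: true
                        )
                    }
                }
            }
        }
    }
}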

Declarative pipeline when condition in post

As far as declarative pipelines go in Jenkins, I'm having trouble with the when keyword.
I keep getting the error No such DSL method 'when' found among steps. I'm sort of new to Jenkins 2 declarative pipelines and don't think I am mixing up scripted pipelines with declarative ones.
The goal of this pipeline is to run mvn deploy after a successful Sonar run and send out mail notifications of a failure or success. I only want the artifacts to be deployed when on master or a release branch.
The part I'm having difficulties with is in the post section. The Notifications stage is working great. Note that I got this to work without the when clause, but really need it or an equivalent.
pipeline {
    agent any
    tools {
        maven 'M3'
        jdk 'JDK8'
    }
    stages {
        stage('Notifications') {
            steps {
                sh 'mkdir tmpPom'
                sh 'mv pom.xml tmpPom/pom.xml'
                checkout([$class: 'GitSCM', branches: [[name: 'origin/master']], doGenerateSubmoduleConfigurations: false, submoduleCfg: [], userRemoteConfigs: [[url: 'https://repository.git']]])
                sh 'mvn clean test'
                sh 'rm pom.xml'
                sh 'mv tmpPom/pom.xml ../pom.xml'
            }
        }
    }
    post {
        success {
            script {
                currentBuild.result = 'SUCCESS'
            }
            when {
                branch 'master|release/*'
            }
            steps {
                sh 'mvn deploy'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result,
            )
        }
        failure {
            script {
                currentBuild.result = 'FAILURE'
            }
            sendNotification(recipients,
                null,
                'https://link.to.sonar',
                currentBuild.result
            )
        }
    }
}
The declarative pipeline documentation mentions that you can't use when in the post block; when is allowed only inside a stage directive.
So what you can do is test the condition using an if inside a script block:
post {
    success {
        script {
            if (env.BRANCH_NAME == 'master')
                currentBuild.result = 'SUCCESS'
        }
    }
    // failure block
}
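Alternatively, since when does work at the stage level, the deploy itself can be guarded in its own stage and only the notification left in post. A sketch of such a stage, to sit alongside the existing ones:
stages {
    // ... build and test stages ...
    stage('Deploy') {
        when {
            anyOf {
                branch 'master'
                branch 'release/*'
            }
        }
        steps {
            sh 'mvn deploy'
        }
    }
}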
Using a GitHub Repository and the Pipeline plugin I have something along these lines:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh '''
                make
                '''
            }
        }
    }
    post {
        always {
            sh '''
            make clean
            '''
        }
        success {
            script {
                if (env.BRANCH_NAME == 'master') {
                    emailext(
                        to: 'engineers@green-planet.com',
                        subject: "${env.JOB_NAME} #${env.BUILD_NUMBER} master is fine",
                        body: "The master build is happy.\n\nConsole: ${env.BUILD_URL}.\n\n",
                        attachLog: true,
                    )
                } else if (env.BRANCH_NAME.startsWith('PR')) {
                    // also send email to tell people their PR status
                } else {
                    // this is some other branch
                }
            }
        }
    }
}
And that way, notifications can be sent based on the type of branch being built. See the pipeline model definition and also the global variable reference available on your server at http://your-jenkins-ip:8080/pipeline-syntax/globals#env for details.
Ran into the same issue with post. Worked around it by annotating the variable with @groovy.transform.Field. This was based on info I found in the Jenkins docs for defining global variables. For example:
#!groovy

pipeline {
    agent none
    stages {
        stage("Validate") {
            parallel {
                stage("Ubuntu") {
                    agent {
                        label "TEST_MACHINE"
                    }
                    steps {
                        sh "run tests command"
                        recordFailures('Ubuntu', 'test-results.xml')
                        junit 'test-results.xml'
                    }
                }
            }
        }
    }
    post {
        unsuccessful {
            notify()
        }
    }
}

// Make testFailures global so it can be accessed from a 'post' step
@groovy.transform.Field
def testFailures = [:]

def recordFailures(key, resultsFile) {
    def failures = ... parse test-results.xml script for failures ...
    if (failures) {
        testFailures[key] = failures
    }
}

def notify() {
    if (testFailures) {
        ... do something here ...
    }
}
