Jenkins DSL - Parse YAML for complex processing

I'm using Jenkins Job DSL to construct pipelines for multiple SOA style services. All these service pipelines are identical.
job('wibble') {
    publishers {
        downstreamParameterized {
            trigger("SOA_Pipeline_Builder") {
                condition('SUCCESS')
                parameters {
                    predefinedProp('PROJECT_NAME', "myproject-2")
                    predefinedProp('PROJECT_REPO', "myprojecttwo@gitrepo.com")
                }
            }
            trigger("SOA_Pipeline_Builder") {
                condition('SUCCESS')
                parameters {
                    predefinedProp('PROJECT_NAME', "myproject-1")
                    predefinedProp('PROJECT_REPO', "myprojectone@gitrepo.com")
                }
            }
        }
    }
}
Given I'm adding new projects every day, I have to keep manipulating the DSL. I've decided I'd rather have all the config in a YAML file outside of the DSL. I know I can use Groovy to create arrays, do loops, etc., but I'm not having much luck.
I'm trying to do something like this...
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

List projects = new Yaml().load(("conf/projects.yml" as File).text)

job('wibble') {
    publishers {
        downstreamParameterized {
            projects.each {
                trigger("SOA_Pipeline_Builder") {
                    condition('SUCCESS')
                    parameters {
                        predefinedProp('PROJECT_NAME', it.name)
                        predefinedProp('PROJECT_REPO', it.repo)
                    }
                }
            }
        }
    }
}
conf/projects.yml
---
- name: myproject-1
  repo: myprojectone@gitrepo.com
- name: myproject-2
  repo: myprojecttwo@gitrepo.com
Does anyone have any experience with this sort of thing?

This is how I'm using snakeyaml with jobDSL to separate configuration from "application" code.
config.yml
services:
  - some-service-1
  - some-service-2
target_envs:
  - stage
  - prod
folder_path: "promotion-jobs"
seed_job.groovy
#!/usr/bin/groovy
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

def workDir = SEED_JOB.getWorkspace()
print("Loading config from ${workDir}/config.yml")
def config = new Yaml().load(("${workDir}/config.yml" as File).text)

for (service in config.services) {
    for (stage in config.target_envs) {
        folder("${config.folder_path}/to-${stage}") {
            displayName("Deploy to ${stage} jobs")
            description("Deploy ECS services to ${stage}")
        }
        if (stage == "stage") {
            stage_trigger = """
                pipelineTriggers([cron('1 1 * * 1')]),
            """
        } else {
            stage_trigger = ""
        }
        pipelineJob("${config.folder_path}/to-${stage}/${service}") {
            definition {
                cps {
                    sandbox()
                    script("""
                        node {
                            properties([
                                ${stage_trigger}
                                parameters([
                                    choice(
                                        choices: ['dev', 'stage'],
                                        description: 'The source environment to promote',
                                        name: 'sourceEnv'
                                    ),
                                    string(
                                        defaultValue: '',
                                        description: 'Specify a specific Docker image tag to deploy. This will override sourceEnv and should be left blank',
                                        name: 'sourceTag',
                                        trim: true
                                    )
                                ])
                            ])
                            properties([
                                disableConcurrentBuilds(),
                            ])
                            stage('init') {
                                dockerPromote(
                                    app="${service}",
                                    destinationEnv="${stage}"
                                )
                            }
                        }
                    """)
                }
            }
        }
    }
}
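Applied back to the original question, the same pattern could look roughly like this (a minimal sketch, assuming conf/projects.yml sits in the seed job's workspace and SEED_JOB is available as used above):
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml

// Load the project list from the seed job's workspace instead of a relative path
def workDir = SEED_JOB.getWorkspace()
List projects = new Yaml().load(("${workDir}/conf/projects.yml" as File).text)

job('wibble') {
    publishers {
        downstreamParameterized {
            // One trigger per entry in conf/projects.yml
            projects.each { project ->
                trigger("SOA_Pipeline_Builder") {
                    condition('SUCCESS')
                    parameters {
                        predefinedProp('PROJECT_NAME', project.name)
                        predefinedProp('PROJECT_REPO', project.repo)
                    }
                }
            }
        }
    }
}
Adding a new service then only means adding an entry to projects.yml and re-running the seed job.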

Related

Jenkins pipeline - How to run a stage based on defined parameter

I have a Jenkins pipeline that includes a few stages. I wish to run the copy_file stage only if the deploy parameter == yes. I have tried to use when, but it is not working.
servers = ['100.1.1.1', '100.1.1.2']
deploy = yes
pipeline {
    agent { label 'server-1' }
    stages {
        stage('Connect to git') {
            steps {
                git branch: 'xxxx', credentialsId: 'yyy', url: 'https://zzzz'
            }
        }
        stage('Copy file') {
            when { deploy == 'yes' }
            steps {
                dir('folder_a') {
                    file_copy(servers)
                }
            }
        }
    }
}

def file_copy(list) {
    list.each { item ->
        sh "echo Copy file"
        sh "scp 11.txt user@${item}:/data/"
    }
}
First, declare an environment variable:
environment {
    DEPLOY = 'YES'
}
Now use it in the when condition like this:
when { environment name: 'DEPLOY', value: 'YES' }
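Put together with your example, it looks roughly like this (a minimal sketch; the server list and file_copy helper are taken from your pipeline):
servers = ['100.1.1.1', '100.1.1.2']

pipeline {
    agent { label 'server-1' }
    environment {
        DEPLOY = 'YES'
    }
    stages {
        stage('Copy file') {
            // The stage runs only when DEPLOY equals 'YES'
            when { environment name: 'DEPLOY', value: 'YES' }
            steps {
                dir('folder_a') {
                    file_copy(servers)
                }
            }
        }
    }
}

def file_copy(list) {
    list.each { item ->
        sh "scp 11.txt user@${item}:/data/"
    }
}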
There are two types of pipeline code in Jenkins:
Declarative pipeline
Scripted pipeline (Groovy script)
You are coding in the declarative style, so you need to follow the declarative syntax.
NOTE: There are other ways to achieve what you are trying to do; I mean to say you may use different logic.
Another way is to use parameters:
parameters {
    choice choices: ['YES', 'NO'], description: 'Deploy?', name: 'DEPLOY'
}
stages {
    stage('continue if DEPLOY set to YES') {
        when {
            expression { params.DEPLOY == 'YES' }
        }
        steps {
            ...
        }
    }
}

Jenkins file groovy issues

Hi, my Jenkinsfile code is as follows. I am basically trying to call a Python script and execute it; I have defined some variables in my code. When I try to run it, it gives a "no such property" error at the beginning and I can't find the reason behind it.
I would really appreciate any suggestions on this.
import groovy.json.*

pipeline {
    agent {
        label 'test'
    }
    parameters {
        choice(choices: '''\
env1
env2''',
            description: 'Environment to deploy', name: 'vpc-stack')
        choice(choices: '''\
node1
node2''',
            description: '(choose )', name: 'stack')
    }
    stages {
        stage('Tooling') {
            steps {
                script {
                    // set up terraform
                    def tfHome = tool name: 'Terraform 0.12.24'
                    env.PATH = "${tfHome}:${env.PATH}"
                    env.TFHOME = "${tfHome}"
                }
            }
        }
        stage('Build all modules') {
            steps {
                wrap([$class: 'BuildUser']) {
                    // build all modules
                    script {
                        if (params.refresh) {
                            echo "Jenkins refresh!"
                            currentBuild.result = 'ABORTED'
                            error('Jenkinsfile refresh! Aborting any real runs!')
                        }
                        sh(script: """pwd""")
                        def status_code = sh(script: """PYTHONUNBUFFERED=1 python3 scripts/test/test_script.py /$vpc-stack""", returnStatus: true)
                        if (status_code == 0) {
                            currentBuild.result = 'SUCCESS'
                        }
                        if (status_code == 1) {
                            currentBuild.result = 'FAILURE'
                        }
                    }
                }
            }
        }
    }
    post {
        always {
            echo 'cleaning workspace'
            step([$class: 'WsCleanup'])
        }
    }
}
And this code is giving me the following error:
hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: vpc for class
Any suggestions on what can be done to resolve this?
Use another name for the choice variable without the dash sign (-), e.g. vpc_stack or vpcstack, and replace the variable name in the Python call.
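As a sketch, the renamed parameter and the call that uses it could look like this (only the relevant fragments are shown; the script path comes from the question):
parameters {
    // Renamed from 'vpc-stack': Groovy identifiers cannot contain a dash
    choice(choices: '''\
env1
env2''',
        description: 'Environment to deploy', name: 'vpc_stack')
}
and in the build stage:
// Reference it via params; the original $vpc-stack made Groovy look for a
// property named 'vpc', which produced the MissingPropertyException above
def status_code = sh(
    script: """PYTHONUNBUFFERED=1 python3 scripts/test/test_script.py ${params.vpc_stack}""",
    returnStatus: true
)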

Jenkins Declarative Pipeline Include File

I am trying to use a separate file to hold variables for a Jenkins pipeline, because it will be used by multiple pipelines. But I can't seem to find the proper way to include it, or whether there is any way to include it at all.
MapA:
def MapA = [
    ItemA: [
        Environment: 'envA',
        Name: 'ItemA',
        Version: '1.0.0.2',
    ],
    ItemB: [
        Environment: 'envB',
        Name: 'ItemB',
        Version: '2.0.0.1',
    ]
]
return this;
MainScript:
def NodeLabel = 'windows'
def CustomWorkSpace = "C:/Workspace"

// Tried loading it here (Location 1)
load 'MapA'

pipeline {
    agent {
        node {
            // Restrict Project Execution
            label NodeLabel
            // Use Custom Workspace
            customWorkspace CustomWorkSpace
            // Tried loading it here (Location 2)
            load 'MapA'
        }
    }
    stages {
        // Solution
        stage('Solution') {
            steps {
                script {
                    // Using it here
                    MapA.each { Solution ->
                        stage("Stage A") {
                            ...
                        }
                        stage("Stage B") {
                            ...
                        }
                        // Extract Commit Solution
                        stage("Stage C") {
                            ...
                            echo "${Solution.value.Environment}"
                            echo "${Solution.value.Name}"
                            echo "${Solution.value.Version}"
                        }
                    }
                }
            }
        }
    }
}
On Location 1, outside the pipeline and node sections, it gave the below error:
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException: Required context class hudson.FilePath is missing
Perhaps you forgot to surround the code with a step that provides this, such as: node
On Location 2, inside the node section, it gave the below error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 7: Expected to find ‘someKey "someValue"’ @ line 7, column 14.
load 'MapA'
node {
^
You can achieve your scenario in 2 ways:
#1
If you want, you can hardcode the variable in the same Jenkinsfile and make use of it in your pipeline, like the example below:
Jenkinsfile content
def MapA = [
    ItemA: [
        Environment: 'envA',
        Name: 'ItemA',
        Version: '1.0.0.2',
    ],
    ItemB: [
        Environment: 'envB',
        Name: 'ItemB',
        Version: '2.0.0.1',
    ]
]

pipeline {
    agent any;
    stages {
        stage('debug') {
            steps {
                script {
                    MapA.each { k, v ->
                        stage(k) {
                            v.each { k1, v1 ->
                                // do your actual task by accessing the map value like below
                                echo "${k} , ${k1} value is : ${v1}"
                            }
                        }
                    }
                }
            }
        }
    }
}
#2
If you would like to keep the variables in a separate Groovy file in a Git repo, it will look like below.
Git Repo file and folder structure
.
├── Jenkinsfile
├── README.md
└── var.groovy
var.groovy
def mapA() {
    return [
        ItemA: [
            Environment: 'envA',
            Name: 'ItemA',
            Version: '1.0.0.2',
        ],
        ItemB: [
            Environment: 'envB',
            Name: 'ItemB',
            Version: '2.0.0.1',
        ]
    ]
}

def helloWorld() {
    println "Hello World!"
}

return this;
Jenkinsfile
pipeline {
    agent any
    stages {
        stage("iterate") {
            steps {
                sh """
                    ls -al
                """
                script {
                    def x = load "${env.WORKSPACE}/var.groovy"
                    x.helloWorld()
                    x.mapA().each { k, v ->
                        stage(k) {
                            v.each { k1, v1 ->
                                echo "for ${k} value of ${k1} is ${v1}"
                            }
                        } // stage
                    } // each
                } // script
            } // steps
        } // stage
    }
}
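Note that this assumes var.groovy is already present in the workspace, which is the case when the Jenkinsfile itself is checked out from the same repository (e.g. a multibranch or Pipeline-from-SCM job). If your pipeline script is defined elsewhere, a sketch of an explicit checkout before the load could look like this (assuming the job's SCM points at the repo containing var.groovy):
stage("checkout") {
    steps {
        // Fetch the repository containing var.groovy before load() is called
        checkout scm
    }
}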

Jenkins declarative pipeline load parameters in pre condition

In a Jenkins declarative pipeline, is there a way to execute a pre-condition that loads build parameters from a file? In Jenkins there is an option to restart an individual stage, so I wish for each stage to load the parameters from a Groovy file.
Currently it is:
pipeline {
    agent any
    stages {
        stage("Grep the values") {
            steps {
                load "${WORKSPACE}/file-parameter.groovy"
            }
        }
        stage("Perform Deployment") {
            when {
                expression { "${Perform_Deployment}" == "true" }
            }
            steps {
                withCredentials([
                    usernamePassword(credentialsId: "LoginID", passwordVariable: "LoginPassword", usernameVariable: "LoginUser")
                ]) {
                    ansiblePlaybook(
                        playbook: "${WORKSPACE}/ansible-playbook.yml",
                        forks: 5,
                        extraVars: [
                            loginUser: "${LoginUser}",
                            loginPassword: "${LoginPassword}"
                        ]
                    )
                }
            }
        }
    }
}
How can I load "${WORKSPACE}/file-parameter.groovy" in the stage before the when condition? My expectation is something like below:
pipeline {
    agent any
    stages {
        stage("Grep the values") {
            steps {
                load "${WORKSPACE}/file-parameter.groovy"
            }
        }
        stage("Perform Deployment") {
            load "${WORKSPACE}/file-parameter.groovy"
            when {
                expression { "${Perform_Deployment}" == "true" }
            }
            steps {
                withCredentials([
                    usernamePassword(credentialsId: "LoginID", passwordVariable: "LoginPassword", usernameVariable: "LoginUser")
                ]) {
                    ansiblePlaybook(
                        playbook: "${WORKSPACE}/ansible-playbook.yml",
                        forks: 5,
                        extraVars: [
                            loginUser: "${LoginUser}",
                            loginPassword: "${LoginPassword}"
                        ]
                    )
                }
            }
        }
    }
}
The load step returns whatever the Groovy script returned when it was executed, so you need to store it in a variable.
file-parameter.groovy could either look like this:
return [
    performDeployment: true,
    // other variables
]
or like this:
performDeployment = true
// other variables and methods
return this
In both cases you could use it in your pipeline like so:
stage("Grep the values") {
steps {
script {
fileParams = load("${WORKSPACE}/file-parameter.groovy")
}
}
}
stage("Perform Deploynment) {
when {
expression { fileParams.performDeployment }
}
I am pretty sure there is no need for the string comparison you are doing and you could use just the boolean value instead.
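To illustrate the point with the two forms already shown above, instead of:
when { expression { "${Perform_Deployment}" == "true" } }
you can simply write:
when { expression { fileParams.performDeployment } }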

Passing jenkins variables to Job dsl script in a Pipeline file

Question
How can I pass variables to a Job DSL script embedded in a pipeline file? I have a Jenkins pipeline which sets some variables, and I want to use these variables in the pipelineJob template in the pipeline. I have tried different combinations but can't seem to get it right. For example, the following is my pipeline, which gets an input from the user, i.e. the Git repo URL.
pipeline {
    agent any
    parameters {
        string(name: 'repo_url', defaultValue: '')
    }
    stages {
        stage('Input gathering') {
            steps {
                script {
                    env.repo_url = input message: 'Enter github url', parameters: [string(defaultValue: '', description: '', name: 'repo_url', trim: false)]
                }
                echo "====${env.repo_url}======"
            }
        }
        stage('stage') {
            steps {
                // some other steps
                echo "====${env.repo_url}======"
                jobDsl scriptText: '''pipelineJob('new-job') {
                    triggers {
                        scm('H/5 * * * *')
                    }
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        url(repo_url)
                                        credentials('bitbucket-jenkins-access')
                                    }
                                    branches('master')
                                    scriptPath('Jenkinsfile')
                                    extensions { }
                                }
                            }
                        }
                    }
                }'''
            }
        }
    }
}
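One possible approach (a sketch, not from the original post): build the DSL text as a double-quoted GString so the pipeline interpolates ${env.repo_url} before jobDsl runs the script:
jobDsl scriptText: """pipelineJob('new-job') {
    triggers {
        scm('H/5 * * * *')
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        // Interpolated by the pipeline, not by Job DSL
                        url('${env.repo_url}')
                        credentials('bitbucket-jenkins-access')
                    }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { }
                }
            }
        }
    }
}"""
Depending on the Job DSL plugin version, the jobDsl step may also accept an additionalParameters map, which exposes values as variables inside the DSL script without string interpolation.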
