YAML file rewrite outputting "type mismatch" - Jenkins

I am trying to write to a YAML file (specifically, to overwrite a value in it) in a declarative Groovy pipeline. The file looks like this:
transfers:
  - name: xyz
    cloud: aws
    subheading:
      impact: Low
      reason: ---
    artifacts:
      - name: name1
        type: type1
        source:
          hash: a1b2C3dd4   # ---> VALUE TO OVERWRITE
My pipeline code:
HASH = sh(returnStdout: true, script: 'git rev-parse HEAD').trim()
data.transfers['artifacts'].source['hash'] = HASH
writeYaml file: filename, data: data, overwrite: true
However, the second line produces a type mismatch: data.transfers['artifacts'].source['hash'] is of type ArrayList, and I am trying to set it to HASH, which is a String. I know I could work around this by converting HASH to an ArrayList, but I don't understand why the assignment doesn't overwrite the value inside the array. Is there any way to avoid converting HASH to an ArrayList?
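For reference, the mismatch comes from Groovy's GPath behavior: property or subscript access on a List collects the value from every element, so the whole expression yields a list rather than a single map entry, and assigning to it does not reach into the YAML structure. A minimal plain-Groovy sketch of this (the map literal stands in for what readYaml would return; 'deadbeef' is a placeholder hash):

```groovy
// Stand-in for the structure readYaml would return for the YAML above
def data = [transfers: [[name: 'xyz',
                         artifacts: [[name: 'name1',
                                      source: [hash: 'a1b2C3dd4']]]]]]

// 'transfers' is a list, so this GPath expression collects values into a list:
assert data.transfers['artifacts'].source['hash'] instanceof List

// Index into the lists to reach the single source map, then assign the string:
def HASH = 'deadbeef'
data.transfers[0].artifacts[0].source.hash = HASH
assert data.transfers[0].artifacts[0].source.hash == 'deadbeef'
```

With explicit indices the left-hand side is a plain map entry, so no conversion of HASH is needed.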

Related

Reading YAML config file and creating environment variables

I was wondering what the best approach is to read in a YAML config and set environment variables.
For example, my yaml config looks like this:
amps-ml:
  models:
    - name: app-sample
      type: sagemaker
      inference:
        image: project_id.dkr.ecr.us-west-2.amazonaws.com/template-model-bert:test_1
        data: s3://my_project/sagemaker/huggingface-pytorch-inference-recommender/sentiment-analysis/model/model.tar.gz
        endpoint: amp-app-endpoint-test
        model_name: sample-model
        endpoint_config_name: amp-app-config
        model_package_group_name: sample-package-group
        endpoint_instance_count: 1
        endpoint_instance_type: ml.m5.large
I essentially want to set environment variables in my Jenkins pipeline for all the variables under inference.
Try
def yaml = readYaml file: "your-file.yaml"
yaml["amps-ml"]["models"][0]["inference"].each { name, value ->
    env["$name"] = value
}
You can also iterate over the models instead of using an explicit index (0).
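Iterating over the models could look like the sketch below (plain Groovy, with a trimmed map literal standing in for the parsed file; only the commented env assignment is Jenkins-specific):

```groovy
// Stand-in for what readYaml would return for the config above (trimmed)
def yaml = ['amps-ml': [models: [[name: 'app-sample',
                                  type: 'sagemaker',
                                  inference: [endpoint: 'amp-app-endpoint-test',
                                              model_name: 'sample-model']]]]]

def collected = [:]
yaml['amps-ml']['models'].each { model ->
    model['inference'].each { name, value ->
        // In a pipeline you would write: env["$name"] = value
        collected[name] = value
    }
}
assert collected.model_name == 'sample-model'
```

Note that with several models, later models overwrite variables of the same name set by earlier ones.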

Dynamic assignment of values as default parameters in Jenkinsfile

Whenever I run this pipeline in Jenkins I have to manually copy-paste some values from a YAML file in a remote GitLab repository. What I would like to achieve is an auto-fill of those values.
This is how my Jenkinsfile and the YAML look like:
Jenkinsfile
pipeline {
    agent {
        docker {
            image 'artifactory...'
            args "..."
        }
    }
    parameters {
        string(name: 'BACKEND_TAG_1', defaultValue: '', description: 'Tag...')
        string(name: 'BACKEND_TAG_2', defaultValue: '', description: 'Tag...')
    }
    stages {
        stage('prepare') {
            steps {
                script {
                    dir('application') {
                        git url: env.PIPELINE_APPLICATION_GIT_URL, branch: env.PIPELINE_APPLICATION_GIT_BRANCH
                    }
                    Values = readYaml file: 'application/values.yaml'
                }
            }
        }
    }
}
values.yaml
version:
  default: 0.1.2
  company_tag_1: 0.1.124
  company_tag_2: 0.1.230
So I need to loop over the parameters and assign the corresponding values:
Values.each { Value ->
    Value.version.minus('company')
    /* This value should be assigned to the corresponding BACKEND_TAG_* parameter,
       e.g.: BACKEND_TAG_1.default = company_tag_1
             BACKEND_TAG_2.default = company_tag_2 */
}
Reading the YAML works fine, but I don't know how to proceed with assigning the values.
I presume you would like to populate all parameters before clicking the Build button; that is, after clicking "Build with Parameters", you would like to see the parameters already populated from your YAML file.
If this is the case, you can use the Active Choices Parameter or Extended Choice Parameter plugin for this purpose. These plugins can run a Groovy script, so you can write a small script that reads the YAML and selects the parameters automatically.
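Independent of which plugin populates the parameters, the mapping itself can be done by stripping the company_tag_ prefix to get the parameter suffix. A plain-Groovy sketch under that naming assumption:

```groovy
// Stand-in for: readYaml file: 'application/values.yaml'
def values = [version: [default: '0.1.2',
                        company_tag_1: '0.1.124',
                        company_tag_2: '0.1.230']]

def defaults = [:]
values.version.each { key, value ->
    if (key.startsWith('company_tag_')) {
        // company_tag_1 -> BACKEND_TAG_1, etc. (the naming rule is an assumption)
        defaults['BACKEND_TAG_' + key.minus('company_tag_')] = value
    }
}
assert defaults == [BACKEND_TAG_1: '0.1.124', BACKEND_TAG_2: '0.1.230']
```

The resulting map can then be fed to whichever mechanism supplies the parameter defaults.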

Get the variable value if variable name is stored as string in Groovy Script

I am completely new to Groovy and Jenkins. I have some predefined variables in a Groovy script (of a Jenkins pipeline) and need to pick one of them dynamically based on job/user input.
An example of the requirement is provided below.
Here the variable env is my input, and based on it I should get the correct user ID.
env = "dev" // Input
def stg_userid = "abc"
def dev_userid = "xyz"
uid_var_name = "${env}_userid"
print "${uid_var_name}" // It is giving "dev_userid"
// Desired: print 'abc' if we give 'stg' for env;
//          print 'xyz' if we give 'dev' for env
I tried searching online for the dynamic-variable-name use case in Groovy, but didn't find anything useful.
Usually this is solved with a complex variable (a Map) that holds the parameters for all possible environments; you can then get the relevant section of the configuration by environment name:
env = "dev"
def config = [
    dev: [
        user: 'aaa',
        url: 'aaa-url'
    ],
    stg: [
        user: 'zzz',
        url: 'zzz-url'
    ]
]
def uid_var_name = config[env].user // returns "aaa"
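If you genuinely need to resolve a variable by its name rather than restructure into a map, a plain Groovy script can go through its binding. A sketch of that approach (the variables must be script-level, i.e. declared without def; sandboxed Jenkins pipelines may restrict this):

```groovy
env = 'dev'            // binding (script-level) variables: no 'def'
stg_userid = 'abc'
dev_userid = 'xyz'

// Look the variable up by its constructed name
def uid = binding.getVariable("${env}_userid")
assert uid == 'xyz'
```

The map-based approach above is generally preferable, since it keeps the configuration explicit and avoids sandbox issues.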

Merging two yaml files in a Jenkins Groovy pipeline

In my Jenkins pipeline, I've got a YAML file that I need to apply to multiple environments, and separate environment-specific YAML files that I'd like to merge into the default file and write out as a new file.
I've looked at readYaml and writeYaml here: https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/ but I'm not finding a good way to merge multiple files.
A simple example of what I'd like to achieve is here:
# config.yaml
config:
  num_instances: 3
  instance_size: large

# dev-overrides.yaml
config:
  instance_size: small

# dev-config.yaml (desired output after merging dev-overrides.yaml into config.yaml)
config:
  num_instances: 3
  instance_size: small
The Jenkins implementation of readYaml uses SnakeYAML as the processor and supports YAML 1.1. You could possibly use the merge operator to accomplish your goal, but the merge operator was removed in YAML 1.2, so I would not advise using this feature even though it is currently available.
I would instead merge the objects with some Groovy code like this:
Map merge(Map... maps) {
    Map result = [:]
    maps.each { map ->
        map.each { k, v ->
            // Recurse only when both sides are maps; otherwise the later value wins
            result[k] = (result[k] instanceof Map && v instanceof Map) ? merge(result[k], v) : v
        }
    }
    result
}
def config = readYaml text: """
config:
  num_instances: 3
  instance_size: large
"""
def configOverrides = readYaml text: """
config:
  instance_size: small
"""

// Showcasing what the above code does:
println "merge(config, configOverrides): " + merge(config, configOverrides)
// => [config:[num_instances:3, instance_size:small]]
println "merge(configOverrides, config): " + merge(configOverrides, config)
// => [config:[instance_size:large, num_instances:3]]

// Write to file
writeYaml file: 'dev-config.yaml', data: merge(config, configOverrides)
Inspired by https://stackoverflow.com/a/27476077/1549149

Special Character in Property setting in artifactory through jenkins pipeline

I am following this link to add properties to a file in Artifactory, but I am not able to find a way to include special characters in property values. Is there an escape character for this?
I have tried using \ and %5C, as suggested for the Artifactory API, but neither works in the pipeline.
This is my pipeline script
node('master') {
    stage('test') {
        def arti_server = Artifactory.server 'Artifactory_Server'
        def setPropsSpec = """{
            "files": [{
                "pattern": "test/test.groovy"
            }]
        }"""
        arti_server.setProps spec: setPropsSpec, props: "p1=%5C;1;p2=test2"
    }
}
The error I am getting shows that ; is not being treated as escaped; instead it is taken as the start of another property. Here is my error:
java.io.IOException: Setting properties: Every property must have at least one value.
at org.jfrog.build.extractor.clientConfiguration.util.EditPropertiesHelper.validateSetProperties(EditPropertiesHelper.java:93)
