Referring to or passing variables within a YAML file - Jenkins

I have the sample YAML configuration below. I'm trying to reference variables within the same YAML file, like this:
appname: delta
configrepo: dev
development:
  clusterurl: "/sa1/demo/dfn"
  email: pseudo@my-fdn.com
  mount_dir: /mnt/tigers
  app_config_dir: "/opt/tigers/$appname/$configrepo/svcaccount"
After reading the YAML into vardict through Groovy, when I try echo '$vardict.app_config_dir' I get the value below:
/opt/tigers///svcaccount
The desired output is:
/opt/tigers/delta/dev/svcaccount
Kindly suggest the correct way of referencing variables within a single YAML file.
Thanks
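Plain YAML has no built-in variable interpolation; anchors and aliases can reuse whole nodes, but $appname inside a string is just literal text, so the placeholders have to be expanded after the file is loaded. A minimal Groovy sketch of that approach (the file name config.yaml is an assumption; SnakeYAML is bundled with Jenkins):

import org.yaml.snakeyaml.Yaml

// Load the YAML, then expand the $appname/$configrepo placeholders
// using the top-level values from the same file.
def vardict = new Yaml().load(new File('config.yaml').text)
def raw = vardict.development.app_config_dir
def resolved = raw.replace('$appname', vardict.appname).replace('$configrepo', vardict.configrepo)
println resolved   // /opt/tigers/delta/dev/svcaccount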

Related

Which csv format is appropriate for influxdb2?

I'm trying to load a csv file into a bucket using InfluxDB v2.1.
Attempting to insert a simple example file results in the following error:
error in csv.from(): failed to read metadata: failed to read annotations: expected annotation datatype
The csv file that I was going to write is as follows:
#datatype measurement,tag,double,dateTime:RFC3339
m,host,used_percent,time
mem,host1,64.23,2020-01-01T00:00:00Z
mem,host2,72.01,2020-01-01T00:00:00Z
mem,host1,62.61,2020-01-01T00:00:10Z
mem,host2,72.98,2020-01-01T00:00:10Z
mem,host1,63.40,2020-01-01T00:00:20Z
mem,host2,73.77,2020-01-01T00:00:20Z
This is the example data from the official InfluxData documentation.
Looking at the first line of the example, you can see that the datatype is annotated, so why does the error occur?
How should I modify it?
This looks like invalid annotated CSV. The #datatype measurement,tag,double,dateTime:RFC3339 header is the extended annotated CSV format understood by the influx write command, not the annotated CSV that csv.from() parses.
In the csv.from function documentation, you can find examples (as string literals) of both annotated and raw CSV that csv.from supports.
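For reference, here is a rough sketch of the same data rewritten as the kind of annotated CSV csv.from() expects: annotation rows are comma-separated with one entry per column, and every row carries a leading annotation column. Check the csv.from documentation for the exact column set, as this is an illustration rather than the canonical form:

#datatype,string,long,dateTime:RFC3339,double,string,string,string
#group,false,false,false,false,true,true,true
#default,_result,,,,,,
,result,table,_time,_value,_field,_measurement,host
,,0,2020-01-01T00:00:00Z,64.23,used_percent,mem,host1
,,0,2020-01-01T00:00:10Z,62.61,used_percent,mem,host1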

Dynamically use current date in OUTPUT EXPORT filename

Stripped down example code using a static file name:
OUTPUT EXPORT /CONTENTS EXPORT=ALL /PDF DOCUMENTFILE='example.pdf'
My question is how to generate a datestamped file name. I have tried using $DATE, '$DATE' and running it through a macro, but can't seem to find the right syntax.
Hey, this is a really nice idea for saving backups in a running syntax production - I will use this myself from now on :).
So the following syntax works for me:
* Compute today's date and write a macro definition into a helper syntax file.
compute tdy= $time.
formats tdy (date11).
temporary.
select if $casenum=1.
write out="somepath\datemacro.sps" /"define !dated__filename () !quote(!concat('YOUR FILE NAME',' ','", tdy, "','.pdf')) !enddefine.".
exe.
* Run the generated file to define the macro, then clean up.
Insert file ="somepath\datemacro.sps".
delete vars tdy.
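For clarity, the generated somepath\datemacro.sps then contains a single macro definition along these lines (the date part depends on the day the syntax runs):

define !dated__filename () !quote(!concat('YOUR FILE NAME',' ','05-JAN-2022','.pdf')) !enddefine.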
The file name you used in the code above is now stored in a macro with the date added, and you can use it here:
OUTPUT EXPORT /CONTENTS EXPORT=ALL /PDF DOCUMENTFILE= !dated__filename .
Note that you can change "YOUR FILE NAME" into anything you like, including adding a path. And you can also change ".pdf" to save other kinds of files.
EDIT: Also of course change "somepath" to a valid path on your machine.

How can I pass a pointer to a file in helm upgrade command?

I have a truststore file (a binary file) that I need to provide during helm upgrade. This file is different for each target env (dev, qa, staging or prod), so I can only provide it at deployment time. helm upgrade --set-file does not take a binary file. This seems to be the issue I found here: https://github.com/helm/helm/issues/3276. These truststore files are stored in the Jenkins credential store.
The command itself is described as follows:
--set-file stringArray set values from respective files specified via the command line (can specify multiple or separate values with commas: key1=path1,key2=path2)
it is also important to know The Format and Limitations of --set.
The error you see, Error: failed parsing --set-file data..., means that the file you are trying to use does not meet the requirements. See the example below:
--set-file key=filepath is another variant of --set. It reads the
file and use its content as a value. An example use case of it is to
inject a multi-line text into values without dealing with indentation
in YAML. Say you want to create a brigade project with certain value
containing 5 lines JavaScript code, you might write a values.yaml
like:
defaultScript: |
  const { events, Job } = require("brigadier")
  function run(e, project) {
    console.log("hello default script")
  }
  events.on("run", run)
Being embedded in YAML, this makes it harder for you to use IDE features, testing frameworks, and other tooling that supports writing code. Instead, you can use --set-file defaultScript=brigade.js with brigade.js containing:
const { events, Job } = require("brigadier")
function run(e, project) {
  console.log("hello default script")
}
events.on("run", run)
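For the binary truststore itself, the workaround commonly suggested in the linked GitHub issue (not an official helm feature, so treat this as a sketch) is to base64-encode the file and pass the encoded text, which a Kubernetes Secret expects anyway. Assuming a chart value named trustStore and a file truststore.jks:

base64 -w 0 truststore.jks > truststore.b64
helm upgrade my-release ./my-chart --set-file trustStore=truststore.b64

and a hypothetical templates/secret.yaml in the chart:

apiVersion: v1
kind: Secret
metadata:
  name: truststore
type: Opaque
data:
  truststore.jks: {{ .Values.trustStore }}

Since Secret data values must be base64-encoded, no decoding is needed in the template. (-w 0 is GNU base64's flag to disable line wrapping; other platforms use a different flag.)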
I hope it helps.

Writing to a json file in workspace using Jenkins

I have a Jenkins job with a few parameters set up, and a JSON file in the workspace which has to be updated with the parameters I pass through Jenkins.
Example:
I have the following parameters, for which I'll take input from the user who triggers the job:
Environment (assume the user selects "ENV2")
Filename (assume the user keeps the default value)
I have a json file in my workspace under run/job.json with the following contents:
{
  "environment": "ENV1",
  "filename": "abc.txt"
}
Now, whatever values the user supplies before triggering the job have to be written into job.json.
So when the user triggers the job, the job.json file should be:
{
  "environment": "ENV2",
  "filename": "abc.txt"
}
Note the environment value in the JSON, which has been updated.
I've tried the https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin plugin, but I'm unable to find any help on parameterizing the values.
Kindly suggest how to configure this plugin, or suggest any other plugin which can serve my purpose.
The Config File Provider Plugin doesn't allow you to pass parameters to configuration files. You can solve your problem with any scripting language. My favorite approach is using the Groovy plugin: tick the "Execute system Groovy script" checkbox and paste the following script:
import groovy.json.*
// read build parameters
env = build.getEnvironment(listener)
environment = env.get('environment')
filename = env.get('filename')
// prepare json
def builder = new JsonBuilder()
builder environment: environment, filename: filename
json = builder.toPrettyString()
// print to console and write to a file
println json
new File(build.workspace.toString() + "\\job.json").write(json)
Output sample:
{
    "environment": "ENV2",
    "filename": "abc.txt"
}
With the Pipeline Utility Steps plugin this is very easy to achieve:
jsonfile = readJSON file: 'path/to/your.json'
jsonfile['environment'] = 'ENV2'
writeJSON file: 'path/to/your.json', json: jsonfile
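For context, here is how those steps might sit in a declarative pipeline; the parameter names environment and filename are assumptions chosen to match the example above:

pipeline {
    agent any
    parameters {
        choice(name: 'environment', choices: ['ENV1', 'ENV2'], description: 'Target environment')
        string(name: 'filename', defaultValue: 'abc.txt', description: 'File name')
    }
    stages {
        stage('Update JSON') {
            steps {
                script {
                    // Read the file, overwrite the keys with the build parameters, write it back.
                    def jsonfile = readJSON file: 'run/job.json'
                    jsonfile['environment'] = params.environment
                    jsonfile['filename'] = params.filename
                    writeJSON file: 'run/job.json', json: jsonfile
                }
            }
        }
    }
}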
I will keep it simple: a Windows batch file or a shell script (depending on the OS) can read the environment values, open the JSON file, and make the changes.

Custom Config File Parsing in Bash

I'm creating a bash script for screen management and want to pull variables from a config file formatted like so:
[s=sample1]
FOLDER=folder/right/here
COMMAND=python script.py
[i=irssi]
COMMAND=irssi
BOOT
"FOLDER", "COMMAND", and "BOOT" would be optional. "[x=y]" would be required, where "x" is a single lowercase letter.
I'd like for this sample to be parsed into something like:
NAME[0]="sample1"
SHORT[0]="s"
FOLDER[0]="folder/right/here"
COMMAND[0]="python script.py"
NAME[1]="irssi"
SHORT[1]="i"
BOOT[1]="1"
If you really need to parse INI-style config files in bash, have a look at this link. Otherwise, as others have pointed out, just place the variables in a file and source it...
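If sourcing is not an option, here is a minimal bash sketch for the exact format described above (written for this answer, not taken from the linked page, so adjust to taste):

#!/usr/bin/env bash
# Parse sections like [s=sample1] followed by optional FOLDER/COMMAND/BOOT
# lines into parallel arrays, one index per section.
declare -a NAME SHORT FOLDER COMMAND BOOT
i=-1
while IFS= read -r line; do
  case $line in
    \[[a-z]=*\])                    # section header, e.g. [s=sample1]
      ((i++))
      body=${line:1:${#line}-2}     # strip the surrounding brackets
      SHORT[i]=${body%%=*}
      NAME[i]=${body#*=}
      ;;
    FOLDER=*)  FOLDER[i]=${line#FOLDER=} ;;
    COMMAND=*) COMMAND[i]=${line#COMMAND=} ;;
    BOOT)      BOOT[i]=1 ;;
  esac
done < "$1"

for ((n=0; n<=i; n++)); do
  echo "NAME[$n]=${NAME[n]} SHORT[$n]=${SHORT[n]} FOLDER[$n]=${FOLDER[n]} COMMAND[$n]=${COMMAND[n]} BOOT[$n]=${BOOT[n]}"
done

Run it as, e.g., ./parse.sh screens.conf to print the arrays for each section.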
