How to inject variables from a file in a Jenkins Declarative Pipeline?

I have a text file:
export URL = "useful url"
export NAME = "some name"
What I do is execute this file with the command source var_file.txt.
But when I then do echo $URL or env.URL, it returns nothing.
Note that I don't have the ability to change var_file.txt: its lines will always have the form export var = value.
I know that it is possible to use the load file.groovy step in a pipeline to load variables, but then the file must be a list of env.URL = 'url' assignments; I can't use this because I can't change the file.
We could also work with withEnv(["URL=url"]), but then I must first get the values from another script, which would really be a complicated solution.
So is there a way to use a file with a list of export var = var_value lines in a Jenkins Pipeline?

What I have done is:
def varsFile = "var_file.txt"
def content = readFile varsFile
Then I go through the content line by line and turn each export line into an env.variable = value assignment:
def lines = content.split("\n")
for (l in lines) {
    // drop the leading "export ", then split on the first "=" only,
    // so values containing "=" survive intact
    def assignment = l.replaceFirst(/^\s*export\s+/, '').split('=', 2)
    String variable = assignment[0].trim()
    // strip the surrounding double quotes from the value
    String value = assignment[1].trim().replaceAll(/^"|"$/, '')
    sh("echo env.$variable = \\\"$value\\\" >> var_to_exp.groovy")
}
And then load the generated Groovy file with the load step in the pipeline:
load 'var_to_exp.groovy'
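With the example file above, the generated var_to_exp.groovy should end up looking roughly like this (a sketch of the expected output, not verified):

env.URL = "useful url"
env.NAME = "some name"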

Alternative suggestion: embed a scripted block (I'm not sure there is a genuine "declarative" way of doing this -- at least I haven't found it so far):
stage('MyStage') {
    steps {
        script {
            // <extract your variables using some Groovy>
            env.myvar = 'myvalue'
        }
        echo env.myvar
    }
}
I'm not entirely sure how much you are allowed to modify your input (e.g. get rid of the export etc.), or whether it has to remain an executable shell script.
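Combining both ideas, here is a minimal sketch that parses the export file directly into env inside a script block, without the intermediate Groovy file. It assumes the file sits in the workspace and each line has the form export NAME = "value"; treat it as an untested illustration rather than a drop-in implementation:

stage('LoadVars') {
    steps {
        script {
            readFile('var_file.txt').split('\n').each { line ->
                def trimmed = line.trim()
                if (trimmed.startsWith('export ')) {
                    // split on the first '=' only, then strip surrounding quotes
                    def parts = trimmed.substring('export '.length()).split('=', 2)
                    def name  = parts[0].trim()
                    def value = parts[1].trim().replaceAll(/^"|"$/, '')
                    env."$name" = value   // visible to later sh/bat steps
                }
            }
        }
        sh 'echo $URL'   // should print: useful url
    }
}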

Related

How to read config file in Jenkins pipeline

I'm creating a new pipeline job; before executing the actual bat files, there are lots of variables to define.
node('BuildMachine')
{
    env.ReleaseNumber = '1.00.00'
    env.BuildType = 'Test'
    env.Language = 'ENU'
    ...
    stage('Build')
    {
        bat '''
        call build.bat %ReleaseNumber%_%BuildType%_%BUILD_NUMBER%
        '''
    }
}
Can I save these global variables to a config file, store it in a git repository, and read it from there?
Will these variables still work in bat?
The EnvInject plugin (aka Environment Injector Plugin) gives you several options to set and use environment variables in a Jenkins job.
It will write the variables to a file which can be loaded later to get them back, but I don't think the variables will directly work in bat.
If you have to define the variables in every build, you could (because the Jenkinsfile is actually just Groovy) define variables and use them in the call as command line arguments:
node('BuildMachine') {
    def releaseNumber = '1.00.00'
    def buildType = 'Test'
    def language = 'ENU'
    stage('Build')
    {
        bat "call build.bat ${releaseNumber}_${buildType}_%BUILD_NUMBER%"
    }
}
I assume that BUILD_NUMBER is an environment variable set before the build job starts. Otherwise you could do it like the others. Note that the double quotation marks, instead of single quotation marks, are necessary for the Groovy string interpolation to work.
Another option is to define a <key> = <value> file; then you can do:
def file = readFile "<your config file>"
def configData = file.split("\n")
configData.each {
    def lineData = it.split("=")
    switch (lineData[0].toLowerCase().trim()) {
        case "<key 1>": <varName 1> = lineData[1].trim(); break;
        case "<key 2>": <varName 2> = lineData[1].trim(); break;
        case "<key 3>": <varName 3> = lineData[1].trim(); break;
        ....
    }
}
And then use the varNames to call bat as you mentioned.
The advantage of this code is that you don't depend on the order of the entries in the config file.
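As a concrete variant, here is a minimal sketch that loads a key=value file from the repository into a map and feeds it to bat. The file name build.properties and the key names are illustrative assumptions; if the Pipeline Utility Steps plugin is available, its readProperties step does the parsing in one call:

node('BuildMachine') {
    checkout scm   // make the config file from git available in the workspace

    // parse "key=value" lines into a map
    def config = [:]
    readFile('build.properties').split('\n').each { line ->
        def parts = line.split('=', 2)
        if (parts.length == 2) {
            config[parts[0].trim()] = parts[1].trim()
        }
    }
    // with the Pipeline Utility Steps plugin this would simply be:
    // def config = readProperties file: 'build.properties'

    stage('Build') {
        // Groovy interpolates the map values; %BUILD_NUMBER% stays a batch-side expansion
        bat "call build.bat ${config['ReleaseNumber']}_${config['BuildType']}_%BUILD_NUMBER%"
    }
}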

File.exists() in a Jenkins groovy file does not work

I want to create a Groovy function in my Jenkins job that looks into a folder and deletes all files that are older than X days.
So I searched the internet and found different kinds of solutions.
First I created a .groovy file with Visual Studio Code on my local PC to understand how it all works. That is why my code doesn't look like the code from the internet: I changed it so that I understand what it does.
def deleteFilesOlderThanDays(int daysBack, String path) {
    def DAY_IN_MILLIS = 24 * 60 * 60 * 1000
    File directory = new File(path)
    if (directory.exists()) {
        File[] listFiles = directory.listFiles()
        for (File listFile : listFiles) {
            def days_from_now = ((System.currentTimeMillis() - listFile.lastModified()) / (DAY_IN_MILLIS))
            if (days_from_now > daysBack) {
                println('------------')
                println('file is older')
                println(listFile)
            }
            else {
                println('------------')
                println('File is not older')
                println(listFile)
            }
        } //End: for(File listFile : listFiles)
    } //End: if(directory.exists())
}
(I know the code doesn't delete anything yet. It is only for my understanding.)
The second step was to include this newly created function in my Jenkins Groovy file. But since then I'm desperate.
The problem is that the initial check whether the folder really exists never gives me a positive result.
The line:
if(directory.exists()){
gives me a lot of problems and it is not clear to me why.
I have tried so many different versions but I haven't found a solution.
I have also used the "Pipeline Syntax" example [Sample Step fileExists] but it doesn't help me.
I have included:
import java.io.File
at the beginning of my file.
I have a basic file which I include in the Jenkins job. This file includes my library files. One of these library files is file.groovy. In the basic Jenkins file I execute the function file.deleteFilesOlderThanDays() (for testing I do not use any parameters).
The code from my function for testing is:
def deleteFilesOlderThanDays() {
    dir = '.\\ABC'
    echo "1. ----------------------------------------"
    File directory1 = new File('.\\ABC\\')
    exist = directory1.exists()
    echo 'Directory1 name is = ' + directory1
    echo 'exist value is = ' + exist
    echo "2. ----------------------------------------"
    File directory2 = new File('.\\ABC')
    exist = directory2.exists()
    echo 'Directory2 name is = ' + directory2
    echo 'exist value is = ' + exist
    echo "3. ----------------------------------------"
    File directory3 = new File(dir)
    exist = directory3.exists()
    echo 'Directory3 name is = ' + directory3
    echo 'exist value is = ' + exist
    echo "4. Pipeline Syntax ------------------------"
    exist = fileExists '.\\ABC'
    echo 'exist value is = ' + exist
    echo "5. ----------------------------------------"
    File directory5 = new File(dir)
    echo 'Directory5 name is = ' + directory5
    // throws an error:
    // exist = fileExists(directory5)
    exist = fileExists "directory5"
    echo 'exist value is = ' + exist
    echo "6. ----------------------------------------"
    exist = fileExists(dir)
    echo 'exist value is = ' + exist
    File[] listFiles = directory5.listFiles()
    echo 'List file = ' + listFiles
}
And the output in the Jenkins console is (I cleaned it up a little bit):
1. ----------------------------------------
Directory1 name is = .\ABC\
exist value is = false
2. ----------------------------------------
Directory2 name is = .\ABC
exist value is = false
3. ----------------------------------------
Directory3 name is = .\ABC
exist value is = false
4. Pipeline Syntax ------------------------
exist value is = true
5. ----------------------------------------
Directory5 name is = .\ABC
exist value is = false
6. ----------------------------------------
exist value is = true
List file = null
I only get a true value in steps 4 and 6, so I can be sure that the folder really exists.
So it seems to me that the command:
File directory = new File(dir)
does not work correctly in my case.
I can't create a listFiles variable because the directory is not initialized correctly.
It is also not clear to me which kind of commands I should use. The Groovy examples always use methods like:
.exists()
But in the Jenkins examples I always find code like this:
fileExists()
Why are there differences between Groovy and Jenkins Groovy style? They should be the same, or not?
Does anyone have an idea for me, or can tell me what I'm doing wrong?
You may benefit from this answer to a similar question:
"java.io.File methods will refer to files on the master where Jenkins is running, so not in the current workspace on the slave machine. To refer to files on the slave machine, you should use the readFile method."
def dir = readFile("${WORKSPACE}/ABC");
Link to original answer
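To make the master-versus-agent distinction concrete, here is a minimal contrast sketch (illustrative only; the java.io.File call may additionally require script approval in a sandboxed pipeline):

node('some-agent') {
    // java.io.File resolves on the master JVM, relative to its working directory,
    // so this usually reports false even when ABC exists in the workspace:
    echo "File API:   ${new File('ABC').exists()}"
    // pipeline steps run against the current workspace on the agent:
    echo "fileExists: ${fileExists('ABC')}"
}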
Thanks for all the feedback.
OK, it is now clear to me that Jenkins Groovy != Groovy.
I have read a lot about there being different commands depending on whether the file search runs on the Jenkins master or on a Jenkins slave.
The suggestion from Youg to start only after confirming the folder exists helped me.
I had problems deleting the file, so in the end I used a primitive batch command to get my function to run.
The final function now looks like this:
def deleteFilesOlderThanXDays(daysBack, path) {
    def DAY_IN_MILLIS = 24 * 60 * 60 * 1000
    if (fileExists(path)) {
        // change into path
        dir(path) {
            // find all kinds of files
            files = findFiles(glob: '*.*')
            for (int i = 0; i < files.length; i++) {
                def days_from_now = ((System.currentTimeMillis() - files[i].lastModified) / (DAY_IN_MILLIS))
                if (days_from_now > daysBack) {
                    echo('file : >>' + files[i].name + '<< is older than ' + daysBack + ' days')
                    bat('del /F /Q "' + files[i].name + '"')
                }
                else {
                    echo('file : >>' + files[i].name + '<< is not older than ' + daysBack + ' days')
                }
            } // End: for (int i = 0; i < files.length; i++)
        } // End: dir(path)
    } // End: if(fileExists(path))
}
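A hypothetical call site for this function, assuming it runs on a Windows agent (the node label and folder name are made up for illustration):

node('windows') {
    // delete everything in the reports folder older than 30 days
    deleteFilesOlderThanXDays(30, 'reports')
}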
Thanks for helping and best regards,
You can add the script below to list the files and folders in the current working directory, so that you can confirm whether the folder ABC exists or not.
After you have confirmed that the ABC folder exists, dig into the rest of the code.
def deleteFilesOlderThanDays() {
    // print current working directory
    echo pwd()
    // if the Jenkins job runs on a Windows machine
    bat 'dir'
    // if the Jenkins job runs on a Linux machine
    sh 'ls -l'
    dir = '.\\ABC'
    echo "1. ----------------------------------------"
    .....
For fileExists usage, I think the correct way is the following:
fileExists './ABC'
or:
def dir = './ABC'
fileExists dir
You should use / as the path separator, rather than \, according to the step's documentation.
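Putting the pieces together, a minimal sketch of that confirmation step with forward slashes (the folder name comes from the question; findFiles is provided by the Pipeline Utility Steps plugin):

node {
    if (fileExists('ABC')) {
        // forward slashes work in pipeline steps even on Windows agents
        def files = findFiles(glob: 'ABC/*.*')
        echo "ABC exists and contains ${files.length} file(s)"
    } else {
        echo 'ABC does not exist in this workspace'
    }
}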

I have a Jenkins global variable in a string - how do I evaluate it?

I need to accept all kinds of global Jenkins variables as strings (basically as parameters to an Ansible-like templating system, with templates stored in \vars).
def proof = "\"${params.REPOSITORY_NAME}\""
echo proof
def before = "\"\${params.REPOSITORY_NAME}\""
echo before
def after = Eval.me(before)
echo after
The result is:
[Pipeline] echo
"asfd"
[Pipeline] echo
"${params.REPOSITORY_NAME}"
groovy.lang.MissingPropertyException: No such property: params for class: Script1
The first echo proves that the param value actually exists.
The second echo shows what the input actually looks like.
The third echo should have emitted asfd; instead I get the exception.
Any ideas? I'm hours into this :-(
You may want to check:
groovy: Have a field name, need to set value and don't want to use switch
1st Variant
In case you have xyz = "REPOSITORY_NAME" and want the value of the parameter REPOSITORY_NAME, you can simply use:
def xyz = "REPOSITORY_NAME"
echo params."$xyz" // will print the value of params.REPOSITORY_NAME
In case your variable xyz must hold the full string including params., you could use the following solution:
@NonCPS
def split(string) {
    string.split(/\./)
}
def xyz = "params.REPOSITORY_NAME"
def splitString = split(xyz)
echo this."${splitString[0]}"."${splitString[1]}" // will print the value of params.REPOSITORY_NAME
2nd Variant
In case you want to specify an environment variable name as a parameter you can use:
env."${params.REPOSITORY_NAME}"
In plain Groovy env[params.REPOSITORY_NAME] would work, but in a pipeline this does not work inside the sandbox.
That way you first retrieve the value of REPOSITORY_NAME and then use it as the key to an environment variable.
Using env.REPOSITORY_NAME directly is not the same, as it would try to use REPOSITORY_NAME itself as the key.
E.g. say you have a job named MyJob with the following script:
assert(params.MyParameter == "JOB_NAME")
echo env."${params.MyParameter}"
assert(env."${params.MyParameter}" == 'MyJob')
This will print the name of the job (MyJob) to the console assuming you did set the MyParameter parameter to JOB_NAME. Both asserts will pass.
Please don’t forget to open a node{} block first in case you want to retrieve the environment of that very node.
After trying all those solutions, I found out that this works for my problem (which sounds VERY similar to the question asked, though I'm not exactly sure):
${env[REPOSITORY_NAME]}
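For quick reference, a minimal sketch of the two lookups discussed above. It assumes a job with a string parameter REPOSITORY_NAME whose value is the name of an environment variable (e.g. JOB_NAME):

node {
    // dynamic property access on params works in the sandbox:
    def key = 'REPOSITORY_NAME'
    echo params."$key"                    // the value of the parameter itself

    // use a parameter's value as the *name* of an environment variable:
    echo env."${params.REPOSITORY_NAME}"  // e.g. resolves env.JOB_NAME
}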

How can I build custom rules using the output of workspace_status_command?

The bazel build flag --workspace_status_command supports calling a script to retrieve e.g. repository metadata. This is also known as build stamping and is available in rules like java_binary.
I'd like to create a custom rule using this metadata.
I want to use this for a common support function. It should receive the git version and some other attributes and create a version.go output file usable as a dependency.
So I started a journey looking at rules in various bazel repositories.
Rules like rules_docker support stamping with stamp in container_image and let you reference the status output in attributes.
rules_go supports it in the x_defs attribute of go_binary.
This would be ideal for my purpose and I dug in...
It looks like I can get what I want with ctx.actions.expand_template using the entries in ctx.info_file or ctx.version_file as a dictionary for substitutions. But I didn't figure out how to get a dictionary of those files. And those two files seem to be "unofficial", they are not part of the ctx documentation.
Building on what I found out already: How do I get a dict based on the status command output?
If that's not possible, what is the shortest/simplest way to access workspace_status_command output from custom rules?
I've been exactly where you are, and I ended up following the path you've started exploring. I generate a JSON description that also includes information collected from git to package with the result, and I ended up doing something like this:
def _build_mft_impl(ctx):
    args = ctx.actions.args()
    args.add('-f')
    args.add(ctx.info_file)
    args.add('-i')
    args.add(ctx.files.src)
    args.add('-o')
    args.add(ctx.outputs.out)
    ctx.actions.run(
        outputs = [ctx.outputs.out],
        inputs = ctx.files.src + [ctx.info_file],
        arguments = [args],
        progress_message = "Generating manifest: " + ctx.label.name,
        executable = ctx.executable._expand_template,
    )

def _get_mft_outputs(src):
    return {"out": src.name[:-len(".tmpl")]}

build_manifest = rule(
    implementation = _build_mft_impl,
    attrs = {
        "src": attr.label(mandatory=True,
                          allow_single_file=[".json.tmpl", ".json_tmpl"]),
        "_expand_template": attr.label(default=Label("//:expand_template"),
                                       executable=True,
                                       cfg="host"),
    },
    outputs = _get_mft_outputs,
)
//:expand_template is a label in my case pointing to a py_binary performing the transformation itself. I'd be happy to learn about a better (more native, fewer hops) way of doing this, but (for now) I went with: it works. A few comments on the approach and your concerns:
AFAIK you cannot read in the file and perform operations on it in Skylark itself...
...speaking of which, it's probably not a bad thing to keep the transformation (tool) and the build description (bazel) separate anyway.
It could be debated what constitutes the official documentation, but while ctx.info_file may not appear in the reference manual, it is documented in the source tree. :) Which is the case for other areas as well (and I hope that is not because those interfaces are considered not committed to yet).
For the sake of completeness, in src/main/java/com/google/devtools/build/lib/skylarkbuildapi/SkylarkRuleContextApi.java there is:
@SkylarkCallable(
    name = "info_file",
    structField = true,
    documented = false,
    doc =
        "Returns the file that is used to hold the non-volatile workspace status for the "
            + "current build request."
)
public FileApi getStableWorkspaceStatus() throws InterruptedException, EvalException;
EDIT: a few extra details, as asked for in the comment.
In my workspace_status.sh I would have for instance the following line:
echo STABLE_GIT_REF $(git log -1 --pretty=format:%H)
In my .json.tmpl file I would then have:
"ref": "${STABLE_GIT_REF}",
I've opted for shell-like notation for the text to be replaced, since it's intuitive for many users as well as easy to match.
As for the replacement, relevant (CLI kept out of this) portion of the actual code would be:
import re

def get_map(val_file):
    """
    Return dictionary of key/value pairs from ``val_file``.
    """
    value_map = {}
    for line in val_file:
        (key, value) = line.split(' ', 1)
        value_map.update(((key, value.rstrip('\n')),))
    return value_map

def expand_template(val_file, in_file, out_file):
    """
    Read each line from ``in_file`` and write it to ``out_file`` replacing all
    ${KEY} references with values from ``val_file``.
    """
    def _substitute_variable(mobj):
        return value_map[mobj.group('var')]

    re_pat = re.compile(r'\${(?P<var>[^} ]+)}')
    value_map = get_map(val_file)
    for line in in_file:
        out_file.write(re_pat.subn(_substitute_variable, line)[0])
EDIT2: This is how I expose the Python script to the rest of bazel:
py_binary(
    name = "expand_template",
    main = "expand_template.py",
    srcs = ["expand_template.py"],
    visibility = ["//visibility:public"],
)
Building on Ondrej's answer, I now use something like this (adapted in the SO editor, might contain small errors):
tools/bazel.rc:
build --workspace_status_command=tools/workspace_status.sh
tools/workspace_status.sh:
echo STABLE_GIT_REV $(git rev-parse HEAD)
version.bzl:
_VERSION_TEMPLATE_SH = """
set -e -u -o pipefail
while read line; do
    export "${line% *}"="${line#* }"
done <"$INFILE" \
    && cat <<EOF >"$OUTFILE"
{ "ref": "${STABLE_GIT_REV}"
, "service": "${SERVICE_NAME}"
}
EOF
"""

def _commit_info_impl(ctx):
    ctx.actions.run_shell(
        outputs = [ctx.outputs.outfile],
        inputs = [ctx.info_file],
        progress_message = "Generating version file: " + ctx.label.name,
        command = _VERSION_TEMPLATE_SH,
        env = {
            'INFILE': ctx.info_file.path,
            'OUTFILE': ctx.outputs.outfile.path,
            'SERVICE_NAME': ctx.attr.service,
        },
    )

commit_info = rule(
    implementation = _commit_info_impl,
    attrs = {
        'service': attr.string(
            mandatory = True,
            doc = 'name of versioned service',
        ),
    },
    outputs = {
        'outfile': 'manifest.json',
    },
)
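A hypothetical BUILD file usage of this rule, just to show how it would be invoked (the load path and service name are made up):

load("//:version.bzl", "commit_info")

# produces manifest.json containing the stamped STABLE_GIT_REV
commit_info(
    name = "version_manifest",
    service = "my-service",
)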

How to write a Jenkins plugin that imports the last successful build's artifact report into the current build

I wrote a plugin following
http://www.baeldung.com/jenkins-custom-plugin
and it generates an HTML report:
File artifactsDir = build.getArtifactsDir();
String path = artifactsDir.getCanonicalPath() + REPORT_TEMPLATE_PATH;
File reportFile = new File(path);
// write the report's text to the report file
For the next build, I want to import this report file to see the changes.
I tried these, but none of them works:
build.getPreviousSuccessfulBuild().getArtifactManager().root() + REPORT_TEMPLATE_PATH
// fails with File not found, but the file is there in bash
build.getPreviousSuccessfulBuild().getArtifactsDir() + REPORT_TEMPLATE_PATH
// null pointer exception, seems to be thrown by getArtifactsDir()
build.getPreviousBuild().getArtifactManager().root() + REPORT_TEMPLATE_PATH
So how can I obtain the last successful build's report file within the current build?
This is how I did it in a pipeline job. I stripped parts of the original code (I hope I didn't introduce an error) and removed error handling for clarity:
// find the last successful build
def lastSuccessfulBuild = currentBuild.getPreviousBuild()
while (lastSuccessfulBuild && (lastSuccessfulBuild.currentResult != 'SUCCESS')) {
    lastSuccessfulBuild = lastSuccessfulBuild.getPreviousBuild()
}
// here I go for a file named 'crc.txt';
// this works only if you have a
//     archiveArtifacts artifacts: 'crc.txt', fingerprint: true
// somewhere in your build
def build = lastSuccessfulBuild?.getRawBuild()
def artifact = build.getArtifacts().find { it.fileName == 'crc.txt' }
def uri = build.artifactManager.root().child(artifact.relativePath).toURI()
def content = uri.toURL().text
Comparing our solutions: you don't use child(), and you have the relative path in REPORT_TEMPLATE_PATH whereas I obtain it from the artifact.
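As a hypothetical follow-up, once content holds the previous report, it can be re-published alongside the current build using standard pipeline steps (the file name is illustrative):

// save the previous report into the current workspace and archive it
writeFile file: 'previous_report.html', text: content
archiveArtifacts artifacts: 'previous_report.html'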
