File.exists() in a Jenkins Groovy file does not work

I want to create a Groovy function in my Jenkins job that looks into a folder and deletes all files that are older than X days.
So I started looking on the internet and found different kinds of solutions.
At first I created a .groovy file with Visual Studio Code on my local PC to understand how it works. That is why my code does not look similar to the code on the internet: I changed it so that I understand how it works.
def deleteFilesOlderThanDays(int daysBack, String path) {
    def DAY_IN_MILLIS = 24 * 60 * 60 * 1000
    File directory = new File(path)
    if (directory.exists()) {
        File[] listFiles = directory.listFiles()
        for (File listFile : listFiles) {
            def days_from_now = ((System.currentTimeMillis() - listFile.lastModified()) / (DAY_IN_MILLIS))
            if (days_from_now > daysBack) {
                println('------------')
                println('file is older')
                println(listFile)
            }
            else {
                println('------------')
                println('File is not older')
                println(listFile)
            }
        } // End: for (File listFile : listFiles)
    } // End: if (directory.exists())
}
(I know the code does not delete anything; it is only for my understanding.)
The second step was to include this newly created function into my Jenkins Groovy file. But since then I have been desperate.
The problem is that the code does not give me a positive result at the beginning, even though the folder really exists.
The line:
if (directory.exists()) {
causes me a lot of problems, and it is not clear to me why.
I have tried many different versions, but I have not found a solution.
I have also used the "Pipeline Syntax" example [Sample Step fileExists], but it does not help me.
I have included:
import java.io.File
at the beginning of my file.
I have a basic file which I include in the Jenkins job. This file includes my library files. One of these library files is file.groovy. In the basic Jenkins file I execute the function file.deleteFilesOlderThanDays() (for testing I do not use any parameters).
The code from my function for testing is:
def deleteFilesOlderThanDays() {
    dir = '.\\ABC'
    echo "1. ----------------------------------------"
    File directory1 = new File('.\\ABC\\')
    exist = directory1.exists()
    echo 'Directory1 name is = ' + directory1
    echo 'exist value is = ' + exist
    echo "2. ----------------------------------------"
    File directory2 = new File('.\\ABC')
    exist = directory2.exists()
    echo 'Directory2 name is = ' + directory2
    echo 'exist value is = ' + exist
    echo "3. ----------------------------------------"
    File directory3 = new File(dir)
    exist = directory3.exists()
    echo 'Directory3 name is = ' + directory3
    echo 'exist value is = ' + exist
    echo "4. Pipeline Syntax ------------------------"
    exist = fileExists '.\\ABC'
    echo 'exist value is = ' + exist
    echo "5. ----------------------------------------"
    File directory5 = new File(dir)
    echo 'Directory5 name is = ' + directory5
    // throws an error:
    // exist = fileExists(directory5)
    exist = fileExists "directory5"
    echo 'exist value is = ' + exist
    echo "6. ----------------------------------------"
    exist = fileExists(dir)
    echo 'exist value is = ' + exist
    File[] listFiles = directory5.listFiles()
    echo 'List file = ' + listFiles
}
And the output in the Jenkins console is (I cleaned it up a little):
1. ----------------------------------------
Directory1 name is = .\ABC\
exist value is = false
2. ----------------------------------------
Directory2 name is = .\ABC
exist value is = false
3. ----------------------------------------
Directory3 name is = .\ABC
exist value is = false
4. Pipeline Syntax ------------------------
exist value is = true
5. ----------------------------------------
Directory5 name is = .\ABC
exist value is = false
6. ----------------------------------------
exist value is = true
List file = null
I only get a true value in steps 4 and 6, so I can be sure that the folder really exists.
So it seems to me that the command:
File directory = new File(dir)
does not work correctly in my case.
I cannot create a listFiles variable because the directory is not initialized correctly.
It is also not clear to me which kind of commands I should use. The Groovy examples always use functions like:
.exists()
But in the Jenkins examples I always find code like this:
fileExists()
Why are there differences between Groovy and Jenkins Groovy style? It should be the same, or not?
Does anyone have an idea for me, or can tell me what I am doing wrong?

You may benefit from this answer from a similar question:
"
java.io.File methods will refer to files on the master where Jenkins is running, so not in the current workspace on the slave machine.
To refer to files on the slave machine, you should use the readFile method
"
def dir = readFile("${WORKSPACE}/ABC");
Link to original answer
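Building on that distinction, here is a minimal sketch of the same check done with pipeline steps, which run on the node executing the build (this assumes a scripted pipeline and a hypothetical ABC folder in the workspace; findFiles comes from the Pipeline Utility Steps plugin):
node {
    // fileExists is a pipeline step and is evaluated on the agent,
    // unlike java.io.File, which is evaluated on the master
    if (fileExists('ABC')) {
        def files = findFiles(glob: 'ABC/*.*')
        echo "ABC contains ${files.length} file(s)"
    } else {
        echo 'ABC does not exist in this workspace'
    }
}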

Thanks for all the feedback.
OK, it is now clear to me that Jenkins Groovy != Groovy.
I have read a lot about the different commands you need depending on whether the file search is executed on the Jenkins master or on a Jenkins slave.
The suggestion from Youg to start after confirming that the folder exists helped me.
I had problems with deleting the files, so in the end I used a primitive batch command to get my function running.
The final function now looks like this:
def deleteFilesOlderThanXDays(daysBack, path) {
    def DAY_IN_MILLIS = 24 * 60 * 60 * 1000
    if (fileExists(path)) {
        // change into path
        dir(path) {
            // find all kinds of files
            files = findFiles(glob: '*.*')
            for (int i = 0; i < files.length; i++) {
                def days_from_now = ((System.currentTimeMillis() - files[i].lastModified) / (DAY_IN_MILLIS))
                if (days_from_now > daysBack) {
                    echo('file : >>' + files[i].name + '<< is older than ' + daysBack + ' days')
                    bat('del /F /Q "' + files[i].name + '"')
                }
                else {
                    echo('file : >>' + files[i].name + '<< is not older than ' + daysBack + ' days')
                }
            } // End: for (int i = 0; i < files.length; i++)
        } // End: dir(path)
    } // End: if (fileExists(path))
}
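For reference, a hypothetical call from the basic Jenkins file might look like this (the 30-day threshold and the ABC path are just example values):
// delete everything in the workspace folder ABC that is older than 30 days
deleteFilesOlderThanXDays(30, 'ABC')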
Thanks for helping and best regards,

You can add the script below to list the files and folders in the current working directory, so that you can confirm whether the folder ABC exists.
After you confirm that the ABC folder exists, dig into the rest of the code.
def deleteFilesOlderThanDays() {
    // print the current working directory
    echo pwd()
    // if the Jenkins job runs on a Windows machine
    bat 'dir'
    // if the Jenkins job runs on a Linux machine
    sh 'ls -l'
    dir = '.\\ABC'
    echo "1. ----------------------------------------"
    .....
For fileExists usage, I think the correct way is the following:
fileExists './ABC'
def dir = './ABC'
fileExists dir
You should use / as the path separator rather than \, according to its documentation.
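For example, a minimal sketch with forward slashes (assuming the ABC folder sits directly under the workspace):
def dir = './ABC'
// fileExists resolves the path relative to the current workspace
if (fileExists(dir)) {
    echo "${dir} exists in the workspace"
}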

Related

Start Jenkins-Pipeline if file exists

Is there any possibility to trigger my pipeline only if a special file exists in my filesystem?
I want to start a new build of my pipeline at 9 p.m., but only if the file run.xy exists on my filesystem.
Under Build periodically in the Build Triggers section, add 0 21 * * *.
And start the pipeline with something like this:
def folder = new File('filepath')
// if it does exist
if (folder.exists()) {
    // run build
} else {
    println "File doesn't exist"
}
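Note that new File(...) is evaluated on the master, as discussed in the first answer above. A sketch of the same gate using the agent-side fileExists step instead (assuming a scripted pipeline and that run.xy sits in the workspace root):
node {
    if (fileExists('run.xy')) {
        echo 'run.xy found, running build'
        // run build steps here
    } else {
        println "File doesn't exist"
    }
}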

How to inject variables from a file in Jenkins Declarative Pipeline?

I have a text file :
export URL = "useful url"
export NAME = "some name"
What I do is execute this file with the command source var_file.txt.
But when I do echo $URL or env.URL it returns nothing.
Please note that I don't have the ability to change the file var_file.txt: it will always stay export var = value.
I know that it is possible to use the load file.groovy step in a pipeline to load variables, but the file must be a list of env.URL = 'url' lines. I can't use this because I can't change the file.
We could also work with withEnv(['URL=url']), but I would first have to get the values from another script. That would really be a complicated solution.
So is there a way to use the file with its list of export var = var_value lines in a Jenkins pipeline?
What I have done is:
def varsFile = "var_file.txt"
def content = readFile varsFile
Get the content line by line and change each line to env.variable = value:
def lines = content.split("\n")
for (l in lines) {
    String variable = "${l.split(" ")[1].split("=")[0]}"
    String value = l.split(" ")[1].split("=")[1]
    sh("echo env.$variable = \\\"$value\\\" >> var_to_exp.groovy")
}
And then load the Groovy file with the load step in the pipeline:
load 'var_to_exp.groovy'
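A variant of the same idea that skips the intermediate Groovy file: parse the export lines in memory and hand them to withEnv. This is only a sketch and assumes every non-empty line of var_file.txt has the form export KEY = "value":
def content = readFile 'var_file.txt'
// turn each 'export KEY = "value"' line into 'KEY=value'
def envList = content.split('\n').findAll { it.trim() }.collect { line ->
    def (key, value) = line.replaceFirst(/^\s*export\s+/, '').split('=', 2)
    key.trim() + '=' + value.trim().replaceAll(/^"|"$/, '')
}
withEnv(envList) {
    echo "URL is ${env.URL}"
}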
Alternative suggestion: embed a scripted pipeline (I'm not sure whether there is a genuine "declarative" way of doing this; at least I haven't found one so far):
stage('MyStage') {
    steps {
        script {
            <extract your variables using some Groovy>
            env.myvar = 'myvalue'
        }
        echo env.myvar
    }
}
I'm not entirely sure how much modification you are allowed to do on your input (e.g. getting rid of the export etc.), or whether it has to remain an executable shell script.

How can I build custom rules using the output of workspace_status_command?

The bazel build flag --workspace_status_command supports calling a script to retrieve e.g. repository metadata. This is also known as build stamping and is available in rules like java_binary.
I'd like to create a custom rule using this metadata.
I want to use this for a common support function. It should receive the git version and some other attributes and create a version.go output file usable as a dependency.
So I started a journey looking at rules in various bazel repositories.
Rules like rules_docker support stamping with stamp in container_image and let you reference the status output in attributes.
rules_go supports it in the x_defs attribute of go_binary.
This would be ideal for my purpose and I dug in...
It looks like I can get what I want with ctx.actions.expand_template, using the entries in ctx.info_file or ctx.version_file as a dictionary for substitutions. But I didn't figure out how to get a dictionary from those files. And those two files seem to be "unofficial": they are not part of the ctx documentation.
Building on what I found out already: How do I get a dict based on the status command output?
If that's not possible, what is the shortest/simplest way to access workspace_status_command output from custom rules?
I've been exactly where you are and I ended up following the path you've started exploring. I generate a JSON description that also includes information collected from git to package with the result and I ended up doing something like this:
def _build_mft_impl(ctx):
    args = ctx.actions.args()
    args.add('-f')
    args.add(ctx.info_file)
    args.add('-i')
    args.add(ctx.files.src)
    args.add('-o')
    args.add(ctx.outputs.out)
    ctx.actions.run(
        outputs = [ctx.outputs.out],
        inputs = ctx.files.src + [ctx.info_file],
        arguments = [args],
        progress_message = "Generating manifest: " + ctx.label.name,
        executable = ctx.executable._expand_template,
    )

def _get_mft_outputs(src):
    return {"out": src.name[:-len(".tmpl")]}

build_manifest = rule(
    implementation = _build_mft_impl,
    attrs = {
        "src": attr.label(mandatory=True,
                          allow_single_file=[".json.tmpl", ".json_tmpl"]),
        "_expand_template": attr.label(default=Label("//:expand_template"),
                                       executable=True,
                                       cfg="host"),
    },
    outputs = _get_mft_outputs,
)
//:expand_template is a label in my case pointing to a py_binary performing the transformation itself. I'd be happy to learn about a better (more native, fewer hops) way of doing this, but (for now) I went with: it works. A few comments on the approach and your concerns:
AFAIK you cannot read in the file and perform operations on it in Skylark itself...
...speaking of which, it's probably not a bad thing to keep the transformation (tool) and build description (bazel) separate anyway.
It could be debated what constitutes the official documentation, but while ctx.info_file may not appear in the reference manual, it is documented in the source tree. :) Which is the case for other areas as well (and I hope that is not because those interfaces are considered not committed to yet).
For the sake of completeness: in src/main/java/com/google/devtools/build/lib/skylarkbuildapi/SkylarkRuleContextApi.java there is:
@SkylarkCallable(
    name = "info_file",
    structField = true,
    documented = false,
    doc =
        "Returns the file that is used to hold the non-volatile workspace status for the "
            + "current build request."
)
public FileApi getStableWorkspaceStatus() throws InterruptedException, EvalException;
EDIT: a few extra details, as asked for in the comment.
In my workspace_status.sh I would have for instance the following line:
echo STABLE_GIT_REF $(git log -1 --pretty=format:%H)
In my .json.tmpl file I would then have:
"ref": "${STABLE_GIT_REF}",
I've opted for shell-like notation for the text to be replaced, since it's intuitive for many users as well as easy to match.
As for the replacement, the relevant portion of the actual code (CLI handling left out) would be:
import re

def get_map(val_file):
    """
    Return dictionary of key/value pairs from ``val_file``.
    """
    value_map = {}
    for line in val_file:
        (key, value) = line.split(' ', 1)
        value_map.update(((key, value.rstrip('\n')),))
    return value_map

def expand_template(val_file, in_file, out_file):
    """
    Read each line from ``in_file`` and write it to ``out_file`` replacing all
    ${KEY} references with values from ``val_file``.
    """
    def _substitute_variable(mobj):
        return value_map[mobj.group('var')]

    re_pat = re.compile(r'\${(?P<var>[^} ]+)}')
    value_map = get_map(val_file)
    for line in in_file:
        out_file.write(re_pat.subn(_substitute_variable, line)[0])
EDIT2: This is how I expose the Python script to the rest of bazel:
py_binary(
    name = "expand_template",
    main = "expand_template.py",
    srcs = ["expand_template.py"],
    visibility = ["//visibility:public"],
)
Building on Ondrej's answer, I now use something like this (adapted in the SO editor, might contain small errors):
tools/bazel.rc:
build --workspace_status_command=tools/workspace_status.sh
tools/workspace_status.sh:
echo STABLE_GIT_REV $(git rev-parse HEAD)
version.bzl:
_VERSION_TEMPLATE_SH = """
set -e -u -o pipefail
while read line; do
  export "${line% *}"="${line#* }"
done <"$INFILE" \
  && cat <<EOF >"$OUTFILE"
{ "ref": "${STABLE_GIT_REV}"
, "service": "${SERVICE_NAME}"
}
EOF
"""
def _commit_info_impl(ctx):
    ctx.actions.run_shell(
        outputs = [ctx.outputs.outfile],
        inputs = [ctx.info_file],
        progress_message = "Generating version file: " + ctx.label.name,
        command = _VERSION_TEMPLATE_SH,
        env = {
            'INFILE': ctx.info_file.path,
            'OUTFILE': ctx.outputs.outfile.path,
            'SERVICE_NAME': ctx.attr.service,
        },
    )
commit_info = rule(
    implementation = _commit_info_impl,
    attrs = {
        'service': attr.string(
            mandatory = True,
            doc = 'name of versioned service',
        ),
    },
    outputs = {
        'outfile': 'manifest.json',
    },
)

How to write a Jenkins plugin that imports the last successful build's artifact report into the current build

I wrote a plugin following
http://www.baeldung.com/jenkins-custom-plugin
And it generates an HTML report:
File artifactsDir = build.getArtifactsDir();
String path = artifactsDir.getCanonicalPath() + REPORT_TEMPLATE_PATH;
File reportFile = new File(path);
// write the report's text to the report's file
For the next build, I want to import this report file to see the changes.
I tried these, but none of them works:
build.getPreviousSuccessfulBuild().getArtifactManager().root() + REPORT_TEMPLATE_PATH
// fails with "File not found", but the file is there in bash
build.getPreviousSuccessfulBuild().getArtifactsDir() + REPORT_TEMPLATE_PATH
// null pointer exception, seems to come from getArtifactsDir()
build.getPreviousBuild().getArtifactManager().root() + REPORT_TEMPLATE_PATH
So how can I obtain the last successful build's report file from within the current build?
This is how I did it in a pipeline job. I stripped parts of the original code, so I hope I didn't introduce an error. I also removed the error handling for clarity:
// find last successful build
def lastSuccessfulBuild = currentBuild.getPreviousBuild()
while (lastSuccessfulBuild && (lastSuccessfulBuild.currentResult != 'SUCCESS')) {
    lastSuccessfulBuild = lastSuccessfulBuild.getPreviousBuild()
}
// here I go for a file named 'crc.txt';
// this works only if you have a
//   archiveArtifacts artifacts: 'crc.txt', fingerprint: true
// somewhere in your build
def build = lastSuccessfulBuild?.getRawBuild()
def artifact = build.getArtifacts().find { it.fileName == 'crc.txt' }
def uri = build.artifactManager.root().child(artifact.relativePath).toURI()
def content = uri.toURL().text
When I compare our solutions: you don't use child() and you have the relative path in REPORT_TEMPLATE_PATH while I obtain it from the artifact.

Groovy Missing Property Exception

I have a Jenkins build that needs to get the filenames of all files checked in within a changeset.
I have installed Groovy on the slave computer and configured Jenkins to use it. I am running the script below, which should return the names (or so I assume, as this may be wrong as well) and print them to the console. However, I am getting this error:
groovy.lang.MissingPropertyException: No such property: paths for class: hudson.plugins.tfs.model.ChangeSet
Here is the system Groovy script:
import hudson.plugins.tfs.model.ChangeSet

// work with the current build
def build = Thread.currentThread()?.executable
// get ChangeSets with all changed items
def changeSet = build.getChangeSet()
def items = changeSet.getItems()
def affectedFiles = items.collect { it.paths }
// get file names
def fileNames = affectedFiles.flatten().findResults
fileNames.each {
    println "Item: $it" // `it` is an implicit parameter corresponding to the current element
}
I am very new to Groovy and Jenkins, so if it's a syntax issue or if I'm missing a step, please let me know.
I don't know the version of Jenkins you are using, but according to the source code of ChangeSet that you can find here, I suggest you replace the line def affectedFiles = items.collect { it.paths } with:
def affectedFiles = items.collect { it.getAffectedPaths() }
// or with the equivalent more groovy-idiomatic version
def affectedFiles = items.collect { it.affectedPaths }
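For completeness, a sketch of the tail of the script with that fix applied (the rest of the original script stays unchanged; the dangling findResults call is dropped here, since flatten() already yields a plain list):
def affectedFiles = items.collect { it.affectedPaths }
// flatten the per-changeset collections into one list of paths
def fileNames = affectedFiles.flatten()
fileNames.each {
    println "Item: $it"
}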
Feel free to comment on the answer if there are more issues.
