Jenkins Groovy file can't import another Groovy file

In simple form: I have two Groovy files in the repo, in a /jenkins subfolder, A.groovy and B.groovy. Inside A.groovy I have a load line:
load(env.WORKSPACE + "@script/jenkins/B.groovy")
The problem is that I get an error:
java.nio.file.NoSuchFileException:
/Users/user/.jenkins/workspace/JobName@script/jenkins/B.groovy
As you can see, the load function builds an almost-correct path. The catch is that the fetched repo, and in particular A.groovy, ends up in an additional subfolder. I can see that at the very beginning of the logs, and I can find the files there locally:
Checking out git ... into /Users/user/.jenkins/workspace/JobName@script/ecb7a9317b1ad672698830264d9e0ce2b9b6f330c043bb85f48623f3cdcab65e/jenkins/A.groovy
I tried logging the whole env object using echo sh(script: 'env|sort', returnStdout: true), and there is no property containing that subfolder name at all.
Why am I getting that extra ecb7a9317b1ad672698830264d9e0ce2b9b6f330c043bb85f48623f3cdcab65e subfolder, and how can I either get rid of it or obtain its name somehow to compose the correct path for the import?

I've found a workaround (by searching for the folder), but I would like to find a better, native way.
Here, the jenkins in -name 'jenkins' is the name of the subfolder containing the Groovy scripts.
git_jenkins_folder = sh(
    script: "find \"${WORKSPACE}@script\" -type d -name 'jenkins'",
    returnStdout: true
).trim()
utils = load("$git_jenkins_folder/Utils.groovy")
This generates the correct working path:
/Users/user/.jenkins/workspace/JobName@script/ecb7a9317b1ad672698830264d9e0ce2b9b6f330c043bb85f48623f3cdcab65e/jenkins
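For the original goal of loading B.groovy, the same lookup can be reused. A minimal sketch of that (the head -n 1 is an addition, in case find ever matches more than one directory):
// Resolve the hashed @script checkout directory once, then load
// scripts relative to it.
def scriptDir = sh(
    script: "find \"${env.WORKSPACE}@script\" -type d -name 'jenkins' | head -n 1",
    returnStdout: true
).trim()
def b = load("${scriptDir}/B.groovy")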

I think the issue is with the path you pass to load. The env.WORKSPACE does not end with /.
load("${env.WORKSPACE}/#script/jenkins/B.groovy")

Related

Outputting script output values from jenkins to a file

I have my Jenkins pipeline working and calling PowerShell scripts.
env.output= powershell(returnStdout: true, script: '.\\scripts\\script.ps1 -target_servername')
I would like to save env.output to an automatically generated file within the workspace directory, something like %workspace%\logs\%jobname%_%job_no%.log, and I would like to append details onto the file.
You are already using PowerShell, so I think the Add-Content cmdlet is the perfect tool for you. Just switch to a multiline PowerShell script.
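For example, a sketch of such a step (the script path and parameter are from the question; the logs folder layout is an assumption matching the %workspace%\logs\%jobname%_%job_no%.log idea):
powershell '''
    # Run the script and capture its output
    $output = .\\scripts\\script.ps1 -target_servername
    # Build <workspace>\\logs\\<jobname>_<build number>.log (assumed layout)
    $logDir = Join-Path $env:WORKSPACE 'logs'
    New-Item -ItemType Directory -Force -Path $logDir | Out-Null
    $logFile = Join-Path $logDir "$($env:JOB_NAME)_$($env:BUILD_NUMBER).log"
    # Append the details onto the file
    Add-Content -Path $logFile -Value $output
'''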

How would I pass a variable into Jenkins which contains wildcards?

I am trying to build a jar and include specific files:
jar cf models.jar target/classes/**/models
However, I am making a pipeline with variables:
jar cf ${JAR_NAME}.jar ${FILE_SEARCH_PATTERN}
This causes the command to run as:
jar cf models.jar 'target/classes/**/models'
which causes the system to not find any files as the quotes break the search.
I found a solution to my problem; while it doesn't get around how the Groovy script is translated in Jenkins, it might help people trying to achieve something similar.
# in project Jenkinsfile
file_search_path = "target/classes/.*/models/.*\\.class"
# in library Jenkinsfile
files=\$(find . -print | grep -i ${FILE_SEARCH_PATH})
jar cf ${JAR_NAME}.jar \$files
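Wired into a single sh step, the same idea might look like this (a sketch; the variable names here are illustrative, not from the original post):
// Pass the pattern as a regex and let the shell resolve the file
// list via find/grep instead of relying on glob expansion.
def jarName = 'models'
def fileSearchPath = 'target/classes/.*/models/.*\\.class'
sh """
    files=\$(find . -print | grep -i '${fileSearchPath}')
    jar cf ${jarName}.jar \$files
"""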

How do I pass information from one step in a TFS 2017 build to a later Copy Files step

OK - I've read all about environment variables and how they can't be set and read by the same process (even this can't be read by a later step in the build):
Environment.SetEnvironmentVariable("Major_Build_Number", BaseReleaseNumber, EnvironmentVariableTarget.Machine)
So has anyone come up with a simple way to pass along info from one build step to another? In my first step I determine the build number (this is a fairly complex process believe it or not) and I need to pass that build number to the last build step (which is a Copy Files step) so that it can copy the build into a folder that's named with the build number. I've tried setting an environment variable, but unfortunately that can't be set and read from the same session. There's gotta be a simple way to do this. Yes I could write a PS or batch script to do it and store the value in a file or the registry, but I would prefer to use the Copy Files task and I can't figure out how to pass that value along.
I tried defining a variable in the build definition and storing the value there but I can't seem to change that value after it's set in the build definition.
BTW, this is an on-premises installation - not VSTS.
Anyone have any ideas?
Thanks Andy for your response. So I tried this in SetBuildNumberENVVar.ps1:
param([Int32]$MajorBuildNumber=0,[Int32]$MinorBuildNumber=0)
Write-Host "##vso[task.setvariable variable=MajorBuildNumber]$MajorBuildNumber"
Write-Host "##vso[task.setvariable variable=MinorBuildNumber]$MinorBuildNumber"
I then run it from the command line:
C:\> powershell .\SetBuildNumberENVVar.ps1 23 45
and then try to echo the variable:
echo %MajorBuildNumber%
%MajorBuildNumber%
and as you can see it doesn't appear to work. I tried this from a C# script:
int.TryParse("$(MajorBuildNumber)", out mbn);
and mbn = 0 after this runs.
Any idea what I'm doing wrong?
Generally you can use the predefined variable $(Build.BuildNumber) to name the folder; it can be used directly throughout the entire build process. See Predefined variables for details.
If you customized the build number as you said ("I determine the build number (this is a fairly complex process believe it or not)"), then you can pass the value along with the logging command ##vso[task.setvariable]value:
Add a PowerShell task in your build definition
Copy and paste the script below and save it as a *.ps1 file
$value= "The value of the build number here"
Write-Host "##vso[task.setvariable variable=BuildNumber]$value"
Check in the PS file, then run it in the PowerShell task
After that, you can use the BuildNumber variable directly in any task after the PowerShell task.
You can reference my answer in another similar thread : Custom TFS Enviroment Variable doesn't read $(Date)
UPDATE:
You need to run the PowerShell script in the TFS build process.
See the example below:
I created two PS scripts: one to set the variables that pass the value, and another that uses the variables to create a folder named with the passed values:
PS1: PassBuildNumber
param([Int32]$MajorBuildNumber=0,[Int32]$MinorBuildNumber=0)
Write-Host "##vso[task.setvariable variable=MajorBuildNumber]$MajorBuildNumber"
Write-Host "##vso[task.setvariable variable=MinorBuildNumber]$MinorBuildNumber"
PS2: Use the variables
Write-Host "The Major build number is:" $env:MajorBuildNumber
Write-Host "The Minor build number is:" $env:MinorBuildNumber
$foldername = $env:MajorBuildNumber + "." + $env:MinorBuildNumber
Write-Host "foldername:" $foldername
$path = "\\myshare\DirA\$foldername"
Write-Host "path:" $path
New-Item -Path $path -ItemType directory # Create a folder
Write-Host "##vso[task.setvariable variable=path]$path" # Set path as a variable to be used in Copy Task
Then you can use Copy Files task to copy files to that folder.

Jenkins "Console Output" log location in filesystem

I want to access and grep Jenkins Console Output as a post build step in the same job that creates this output. Redirecting logs with >> log.txt is not a solution since this is not supported by my build steps.
Build:
echo "This is log"
Post build step:
grep "is" path/to/console_output
Where is the specific log file created in filesystem?
@Bruno Lavit has a great answer, but if you want you can just access the log and download it as a txt file to your workspace from the job's URL:
${BUILD_URL}/consoleText
Then it's only a matter of downloading this page to your ${WORKSPACE}
You can use "Invoke ANT" and use the GET target
On Linux you can use wget to download it to your workspace
etc.
Good luck!
Edit:
The actual log file on the file system is not on the slave; it is kept on the master machine. You can find it under: $JENKINS_HOME/jobs/$JOB_NAME/builds/lastSuccessfulBuild/log
If you're looking for another build just replace lastSuccessfulBuild with the build you're looking for.
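As a concrete example of the download approach, a pipeline sketch along these lines should work (assuming curl is available on the agent and the job is readable without authentication; otherwise pass credentials to curl):
node {
    // BUILD_URL is set by Jenkins and already ends with a slash
    sh '''
        curl -s "${BUILD_URL}consoleText" -o console.log
        grep "is" console.log
    '''
}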
Jenkins stores the console log on master. If you want programmatic access to the log, and you are running on master, you can access the log that Jenkins already has, without copying it to the artifacts or having to GET the http job URL.
From http://javadoc.jenkins.io/archive/jenkins-1.651/hudson/model/Run.html#getLogFile(), this returns the File object for the console output (in the jenkins file system, this is the "log" file in the build output directory).
In my case, we use a chained (child) job to do parsing and analysis on a parent job's build.
When using a Groovy script run in Jenkins, you get an object named "build" for the run. We use this to get the Build object (http://javadoc.jenkins.io/archive/jenkins-1.651/hudson/model/Build.html) for the upstream job, then call that build's .getLogFile().
Added bonus; since it's just a File object, we call .getParent() to get the folder where Jenkins stores build collateral (like test xmls, environment variables, and other things that may not be explicitly exposed through the artifacts) which we can also parse.
Double added bonus; we also use matrix jobs. This sometimes makes inferring the file path on the system a pain. .getLogFile().getParent() takes away all the pain.
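A sketch of that approach as a system Groovy build step in the child job (the upstream-cause lookup is an assumption about how the jobs are chained; getLogFile() and getParent() are the calls described above):
import hudson.model.Cause
import hudson.model.Job
import jenkins.model.Jenkins

// 'build' is bound to the current run in a system Groovy step
def cause = build.getCause(Cause.UpstreamCause)
def parentJob = Jenkins.instance.getItemByFullName(cause.upstreamProject, Job)
def parentBuild = parentJob.getBuildByNumber(cause.upstreamBuild)
def logFile = parentBuild.logFile             // hudson.model.Run#getLogFile()
println "Upstream log: ${logFile}"
println "Build folder: ${logFile.parentFile}" // test xmls, env vars, etc.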
You can install this Jenkins Console log plugin to write the log in your workspace as a post build step.
You have to build the plugin yourself and install the plugin manually.
Next, you can add a post build step that writes the log into the workspace. With an additional post build step (shell script), you will be able to grep your log.
I hope it helped :)
Log location:
${JENKINS_HOME}/jobs/${JOB_NAME}/builds/${BUILD_NUMBER}/log
Get log as a text and save to workspace:
cat ${JENKINS_HOME}/jobs/${JOB_NAME}/builds/${BUILD_NUMBER}/log >> log.txt
For very large output logs it can be difficult to open them in the browser (network delays, slow scrolling). This is the solution I'm using to check big log files:
https://${URL}/jenkins/job/${jobName}/${buildNumber}/
In the left column you see: View as plain text. Right-click on it and choose Save link as. Now you can save your big log as a .txt file. Open it with Notepad++ and you can go through your logs easily without network delays while scrolling.
I found the console output of my job in the browser at the following location:
http://[Jenkins URL]/job/[Job Name]/default/[Build Number]/console
You can get the console log file (using bash magic) for the current build from a shell script this way, and check it for some error string, failing the job if found. This is designed for use in a shell script build step; use only the first two lines if you just need the file name:
logFilename=${JENKINS_HOME}/${JOB_URL:${#JENKINS_URL}}
logFilename=${logFilename//job\//jobs\/}builds/${BUILD_NUMBER}/log
grep "**Failure**" ${logFilename} ; exitCode=$?
[[ $exitCode -ne 1 ]] && exit 1
You have to build the file name by taking the JOB_URL, stripping off the leading host name part, adding in the path to JENKINS_HOME, replacing "/job/" to "/jobs/" to handle all nested folders, adding the current build number and the file name.
grep returns 0 if the string is found, 1 if it is not found, and 2 if there is a file error. So any exit code other than 1 means either the error indication string was found or the log could not be read, and that makes the build fail.
An easy solution would be:
curl http://jenkinsUrl/job/<Build_Name>/<Build_Number>/consoleText -OutFile <FilePathToLocalDisk>
or, for the last successful build:
curl http://jenkinsUrl/job/<Build_Name>/lastSuccessfulBuild/consoleText -OutFile <FilePathToLocalDisk>
Note that -OutFile is Windows PowerShell syntax (where curl is an alias for Invoke-WebRequest); with standard curl, use -o <FilePathToLocalDisk> instead.

Read DSL from file in Jenkins outside of workspace

I know it's possible to run a .dsl file from an external source instead of just writing the flow's code in the job's description, but every time I try to run, let's say:
/home/flows/flow_script.dsl
I get the following error:
java.io.FileNotFoundException:/home/flows/flow_script.dsl (No such file or directory)
The path is correct, and I can see the file at that path from the shell, but apparently it doesn't let me select anything outside the build's workspace.
I recently ran into this very issue: my DSL script was outside of my workspace (installed via a package). The problem is that the DSL Scripts path takes an Ant-style pattern, which only allows specific relative patterns (and not absolute paths).
My workaround is hacky, but it did work: add an Execute Shell step before the "Process Job DSLs" step that symlinks the external directory into the workspace.
Something like this:
echo "Creating a symlink from /home/flows to workspace"
ln -sf "/home/flows" .flows
Then you can set the DSL Scripts path to ".flows/flow_script.dsl".
This has some additional caveats, of course: the directory you're symlinking from will need to be accessible by the jenkins user. And it likely violates a lot of best practices.
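If the seed job is a pipeline rather than a freestyle job, the same trick can be written with the Job DSL plugin's jobDsl step; a sketch along those lines (assuming the plugin's pipeline step is available in your version):
node {
    // Symlink the external directory into the workspace, then point
    // the Job DSL step at it with a relative Ant-style pattern
    sh 'ln -sf /home/flows .flows'
    jobDsl targets: '.flows/flow_script.dsl'
}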
