java.io.FileNotFoundException in Jenkins workspace

I have a problem with a file in my Jenkins workspace. I need to read a diffFile.txt, and I'm using the global variable WORKSPACE like this: File fileDiff = new File(env.WORKSPACE+"/diffFile.txt"), but I get a java.io.FileNotFoundException. I've checked and the file is there; I can read it with cat, but not with File. Do you know what I can do to fix that?

Instead of using File, try Jenkins' native readFile step. java.io.File always resolves on the node where the Groovy code runs (the Jenkins controller), not on the agent that owns the workspace, which is why the file appears to be missing even though cat can see it. readFile, by contrast, reads from the current node's workspace. Please check the following sample.
pipeline {
    agent any
    stages {
        stage('Stage') {
            steps {
                script {
                    // Dummy code to create a file with new entries
                    sh "echo 'node1' >> nodeList.txt"
                    sh "echo 'node2' >> nodeList.txt"
                    sh "echo 'node3' >> nodeList.txt"
                    // Reading the file
                    def data = readFile(file: 'nodeList.txt')
                    for (def line : data.split('\n')) {
                        echo line
                    }
                }
            }
        }
    }
}
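Applied to the original question, the same approach would be (a sketch, assuming diffFile.txt sits at the workspace root):
script {
    // readFile runs on the node that owns the workspace and resolves
    // relative paths against it, so no env.WORKSPACE prefix is needed
    def diff = readFile(file: 'diffFile.txt')
    echo diff
}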

Related

How to read property from config file inside Jenkins pipeline using Config File Provider Plugin

I want to parametrize my Jenkins pipeline with a simple properties config file
skip_tests=true
that I've added to Jenkins Config File Management.
In my pipeline I import this file and try to read from it using the Config File Provider Plugin.
node('my-swarm') {
    MY_CONFIG = '27206b95-d69b-4494-a430-0a23483a6408'
    try {
        stage('prepare') {
            configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'skip_tests')]) {
                echo $skip_tests
                assert $skip_tests == 'true'
            }
        }
    } catch (Exception e) {
        currentBuild.result = 'FAILURE'
        print e
    }
}
This results in an error:
provisioning config files...
copy managed file [my.properties] to file:/home/jenkins/build/workspace/my-workspace@tmp/config7043792000148664559tmp
[Pipeline] {
[Pipeline] }
Deleting 1 temporary files
[Pipeline] // configFileProvider
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
groovy.lang.MissingPropertyException: No such property: $skip_tests for class: groovy.lang.Binding
Any ideas what I'm doing wrong here?
With the help of the other answers and How to read properties file from Jenkins 2.0 pipeline script I found the following code to work:
configFileProvider([configFile(fileId: "$PBD1_CONFIG", variable: 'configFile')]) {
    def props = readProperties file: "$configFile"
    def skip_tests = props['skip_tests']
    if (skip_tests == 'true') {
        print 'skipping tests'
    } else {
        print 'running tests'
    }
}
I had to use readProperties from Jenkins' Pipeline Utility Steps Plugin.
Since the file is in properties format, you can also use it in a shell step:
sh """
source ${MY_CONFIG}
.
.
.
"""
You would need to export any properties that must be visible to programs the shell calls (e.g. Maven), as sketched below.
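A minimal sketch of that, assuming the config file contains plain KEY=value lines: set -a marks every variable assigned while sourcing for export, so child processes such as Maven see them (the mvn call is illustrative).
sh """
    set -a          # auto-export every variable assigned from here on
    source ${MY_CONFIG}
    set +a
    mvn verify      # hypothetical child process that reads the exported variables
"""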
You've made wrong use of Groovy's GString interpolation: $skip_tests is only meaningful inside a double-quoted string. Either use skip_tests directly or wrap it in double quotes as "$skip_tests".
configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'skip_tests')]) {
    echo skip_tests
    assert skip_tests == 'true'
    echo "$skip_tests"
    assert "$skip_tests" == 'true'
}
Note: the value of skip_tests is the file path of the config file that gets copied from the master to the job's workspace; it is not the content of the config file.
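So to get at the content, you still have to read the file at that path, for example by combining this with the readProperties approach above (a sketch):
configFileProvider([configFile(fileId: "$MY_CONFIG", variable: 'skip_tests')]) {
    echo skip_tests                           // prints the temporary file path
    def props = readProperties file: skip_tests
    echo props['skip_tests']                  // prints the value from inside the file
}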

Pass variables between Jenkins stages

I want to pass a variable that I read in stage A to stage B somehow. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but wasn't really successful with that. How can I set this up properly?
To get it working I tried a lot of things, and read that I should use """ instead of ''' to start a shell, and escape variables as \${foo}, for example.
Below is what I have as a pipeline:
#!/usr/bin/env groovy

pipeline {
    agent { node { label 'php71' } }
    environment {
        packageName='my-package'
        packageVersion=''
        groupId='vznl'
        nexus_endpoint='http://nexus.devtools.io'
        nexus_username='jenkins'
        nexus_password='J3nkins'
    }
    stages {
        // Package dependencies
        stage('Install dependencies') {
            steps {
                sh '''
                    echo Skip composer installation
                    #composer install --prefer-dist --optimize-autoloader --no-interaction
                '''
            }
        }
        // Unit tests
        stage('Unit Tests') {
            steps {
                sh '''
                    echo Running PHP code coverage tests...
                    #composer test
                '''
            }
        }
        // Create artifact
        stage('Package') {
            steps {
                echo 'Create package refs'
                sh """
                    mkdir -p ./build/zpk
                    VERSIONTAG=\$(grep 'version' composer.json)
                    REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
                    if [[ \${VERSIONTAG} =~ \${REGEX} ]]
                    then
                        env.packageVersion=\${BASH_REMATCH[1]}
                        /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
                    else
                        echo "No version found!"
                        exit 1
                    fi
                """
            }
        }
        // Publish ZPK package to Nexus
        stage('Publish packages') {
            steps {
                echo "Publish ZPK Package"
                sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
                archive includes: './build/**/*.{zpk,rpm,deb}'
            }
        }
    }
}
As you can see the packageVersion which I read from stage Package needs to be used in stage Publish as well.
Overall tips against the pipeline are of course always welcome as well.
The problem in your code is that you are assigning the version to an environment variable within the sh step. The step executes in its own isolated process, which inherits the parent process's environment variables but cannot modify them.
The only ways of passing data back to the parent are STDOUT/STDERR and the exit code. Since you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context persists and the variable is available in the subsequent stage. A working example is below. Note that any attempt to put this inside a parallel block is prone to failure, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy

pipeline {
    environment {
        AGENT_INFO = ''
    }
    agent {
        docker {
            image 'alpine'
            reuseNode true
        }
    }
    stages {
        stage('Collect agent info') {
            steps {
                echo "Current agent info: ${env.AGENT_INFO}"
                script {
                    def agentInfo = sh script: 'uname -a', returnStdout: true
                    println "Agent info within script: ${agentInfo}"
                    // strip the trailing newline that returnStdout keeps
                    AGENT_INFO = agentInfo.replace("\n", "")
                    env.AGENT_INFO = AGENT_INFO
                }
            }
        }
        stage("Print agent info") {
            steps {
                script {
                    echo "Collected agent info: ${AGENT_INFO}"
                    echo "Environment agent info: ${env.AGENT_INFO}"
                }
            }
        }
    }
}
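Applied to the original question, a minimal sketch (assuming composer.json contains a "version": "x.y.z" entry) could capture the version via returnStdout in the Package stage and reuse it when publishing:
script {
    // sed extracts the x.y.z value from the "version" line of composer.json;
    // the assignment lands in the script binding, so later stages can read it
    packageVersion = sh(
        script: '''sed -n 's/.*"version": *"\\([0-9.]*\\)".*/\\1/p' composer.json''',
        returnStdout: true
    ).trim()
    echo "Detected package version: ${packageVersion}"
}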
Another option, which doesn't involve script blocks and stays purely declarative, is to stash things in a small temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) even if the workload is sprayed out across parallel or distributed nodes.
Something like:
pipeline {
    agent any
    stages {
        stage('first stage') {
            steps {
                // Write out any environment variables you like to a temporary file
                sh 'echo export FOO=baz > myenv'
                // Stash away for later use
                stash 'myenv'
            }
        }
        stage("later stage") {
            steps {
                // Unstash the temporary file and apply it
                unstash 'myenv'
                // use the unstashed vars; '.' is the POSIX form of 'source',
                // since the sh step is not guaranteed to run bash
                sh '. ./myenv && echo $FOO'
            }
        }
    }
}

Jenkins pipeline shell step

Trying to get this pipeline working...
I need to prepare some variables (a list or string) in Groovy, and iterate over them in bash. As I understand it, Groovy scripts run on the Jenkins master, but I need to download some files into the build workspace, which is why I try to download them in an sh step.
import groovy.json.JsonSlurper
import hudson.FilePath

pipeline {
    agent { label 'xxx' }
    parameters {
        ...
    }
    stages {
        stage('Get rendered images') {
            steps {
                script {
                    // select grafana API url based on environment
                    if (params.grafana_env == "111") {
                        grafana_url = "http://xxx:3001"
                    } else if (params.grafana_env == "222") {
                        grafana_url = "http://yyy:3001"
                    }
                    // get available grafana dashboards
                    def grafana_url = "${grafana_url}/api/search"
                    URL apiUrl = grafana_url.toURL()
                    List json = new JsonSlurper().parse(apiUrl.newReader())
                    def workspace = pwd()
                    List dash_names = []
                    // save png for each available dashboard
                    for (dash in json) {
                        def dash_name = dash['uri'].split('/')
                        dash_names.add(dash_name[1])
                    }
                    dash_names_string = dash_names.join(" ")
                }
                sh "echo $dash_names_string"
                sh """
                    for dash in $dash_names_string;
                    do
                        echo $dash
                    done
                """
            }
        }
    }
}
I get this error when I run it:
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: dash for class: WorkflowScript
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:458)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:33)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at WorkflowScript.run(WorkflowScript:42)
Looks like I'm missing something obvious...
Escape the $ for the shell variable with a backslash; that should help:
for dash in $dash_names_string;
do
    echo \$dash
done
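In context, the sh step from the question then becomes (only the shell-side $dash is escaped; $dash_names_string is still interpolated by Groovy before the shell runs):
sh """
    for dash in $dash_names_string;
    do
        echo \$dash
    done
"""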
The problem is on line three here:
for dash in $dash_names_string;
do
    echo $dash
done
It's trying to find a $dash property in Groovy-land and finding none. I can't actually think how to make this work via an inline sh step (possibly not enough sleep), but if you save the relevant contents of your JSON response to a file and then replace those four lines with a shell script that reads the file, calling it from the Jenkinsfile like sh './hotScript.sh', it will not try to evaluate that dollar value as Groovy, and ought to at least fail in a different way. :)
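A sketch of that idea (the file name is illustrative): write the names out from Groovy, then loop over them purely in shell, using single quotes for the sh step so Groovy never tries to interpolate the $:
writeFile file: 'dash_names.txt', text: dash_names_string
sh '''
    for dash in $(cat dash_names.txt); do
        echo "$dash"
    done
'''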

Pipeline step having trouble resolving a file path

I am having trouble getting a shell command to complete in a stage I have defined:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
            }
        }
    }
}
The shell command issues a protractor call that takes a config file argument, but this file fails to be found when protractor tries to retrieve it.
If I look at the workspace directory where the repo is checked out by the checkout scm step, I can see the test directory is present, along with the config file the sh step is referencing.
So I'm unsure why the file cannot be found.
I thought about trying to verify the files that can be seen around the time the protractor command is being issued.
So something like:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                def files = findFiles(glob: 'test/**/*.conf.js')
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
                echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
            }
        }
    }
}
But this doesn't work; I don't think findFiles can be used inside a step?
Can anyone offer any suggestions about what may be going on here?
Thanks
To do the debugging you were attempting (to see whether the file is actually there), you could wrap the findFiles in a script block (making sure your echo comes before the step that fails), or use a basic find in an sh step like this:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                // you could use the unix find command instead of groovy's findFiles
                sh 'find test -name *.conf.js'
                // if you're using a non-dsl-step (like findFiles), you must wrap it in a script
                script {
                    def files = findFiles(glob: 'test/**/*.conf.js')
                    echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
                    sh '''
                        npm install
                        # invoke protractor directly; the nested sh '...' wrapper from the question is not valid shell
                        protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091
                    '''
                }
            }
        }
    }
}

Jenkins Pipeline - Reading previous stage log

Consider a Jenkins Pipeline with two stages, Stage A then Stage B.
In Stage B, is it possible to parse the logs of Stage A for some particular text?
Use tee to split the output to both stdout and a file, then parse the file for your text.
STAGE_A_LOG_FILE = 'stage_a.log'

pipeline {
    agent any
    stages {
        stage('Stage A') {
            steps {
                script {
                    // tee log into file
                    tee(STAGE_A_LOG_FILE) {
                        echo 'print some Stage_A log content ...'
                    }
                }
            }
        }
        stage('Stage B') {
            steps {
                script {
                    // search log file for 'Stage_A'
                    regex = java.util.regex.Pattern.compile('some (Stage_A) log')
                    matcher = regex.matcher(readFile(STAGE_A_LOG_FILE))
                    if (matcher.find()) {
                        echo "found: ${matcher.group(1)}"
                    }
                }
            }
        }
    }
}
Pipeline output:
print some Stage_A log content ...
found: Stage_A
Finished: SUCCESS
There's been an update since July 28th!
As mentioned in this answer, as of version 2.4 of Pipeline: Nodes and Processes you can use:
def out = sh script: 'command', returnStdout: true
At least it's much simpler and cleaner than writing output to a file and reading the file afterwards.
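Applied to the two-stage question, a minimal sketch (the command and variable names are illustrative) captures Stage A's output in a Groovy variable and searches it in Stage B, no file needed:
def stageALog = ''
pipeline {
    agent any
    stages {
        stage('Stage A') {
            steps {
                script {
                    // keep the command's stdout instead of teeing it to a file
                    stageALog = sh script: 'echo print some Stage_A log content ...', returnStdout: true
                }
            }
        }
        stage('Stage B') {
            steps {
                script {
                    if (stageALog.contains('Stage_A')) {
                        echo 'found: Stage_A'
                    }
                }
            }
        }
    }
}
Note this only captures the output of commands run through sh with returnStdout, not everything the stage prints.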
What I finally did was, as suggested, to write to a file (and stdout) using tee.
sh "command | tee <filename>"
Then parse the file as needed, using readFile <filename> to read the file from the workspace.
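For example (the file name and search text are illustrative):
sh 'command | tee output.log'
def logText = readFile 'output.log'
if (logText.contains('some text')) {
    echo 'found it'
}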
If you want to search for the first occurrence of a pattern, you can also use manager.logContains(regexp) or manager.getLogMatcher(regexp). See my other answer for more details: https://stackoverflow.com/a/39873765/4527766
