Maybe I am misunderstanding the intended use for the Jenkins file parameter here...
I want to be able to upload a file containing some data (in my case comma-separated values). I then want to simply read this file and do stuff with the data. I've got this set up using a Pipeline job.
My file location is set to 'email_list.csv'. In my pipeline script I have
node {
    stage('post') {
        emailFile = readFile 'email_list.csv'
        println "${emailFile}"
        //.........
    }
}
This fails with a java.io.FileNotFoundException: /var/lib/jenkins/workspace/job-name/email_list.csv (No such file or directory) exception
Shouldn't the parameterized build have set up this file? If not, how do I read the data uploaded?
Jenkins by default provides build job parameters to a Pipeline as a params map of key-value pairs. All your comma-separated values will end up in the parameter's value, which you can refer to in your Groovy script as:
print params.emailFile
To dump it to a file, you can use the writeFile step.
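A minimal sketch, assuming the uploaded content really does land in params under the name emailFile as described above:

node {
    // Dump the parameter's content into the workspace, then read it back.
    writeFile file: 'email_list.csv', text: params.emailFile
    def emailFile = readFile 'email_list.csv'
    echo emailFile
}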
P.S.: If you print params in your Groovy script, you will see all the build parameters of your job.
There is a long-standing bug that makes it impossible to use fileParameter in Pipeline jobs:
Handle file parameters
file parameter not working in pipeline job
Related
I need to pass a list of parameters to Jenkins from a file written in this way
param1=test1
param2=test2
paramBool=true
....
How can I pass these parameters?
I have to use the parameters in a Jenkins pipeline
I mentioned dynamic declarative pipeline parameters before.
You might combine that with one of the following (a sketch follows the list):
either readFile
or load, which can evaluate a Groovy source file into the Pipeline script, and modify params.
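A minimal sketch of the readFile route, assuming the key=value file above is checked in as 'params.txt', the pipeline itself comes from SCM, and the Pipeline Utility Steps plugin provides readProperties:

node {
    checkout scm
    // Parse the key=value lines into a map (Pipeline Utility Steps).
    def props = readProperties file: 'params.txt'
    // Expose each entry as a string parameter of the job.
    properties([
        parameters(
            props.collect { k, v -> string(name: k, defaultValue: v) }
        )
    ])
    echo "param1 = ${params.param1}"
}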
I currently have a git repo which has a text file. I want to load its contents as a build parameter for a jenkins job.
One way would be to manually copy the contents of this file into a Jenkins multi-line string parameter. But since the content is already in git, I want to keep the two coupled.
Not sure, if this is even possible using Jenkins?
I am using Jenkins Job DSL to generate the job.
EDIT: You can find several different ways of achieving this in the following answer: Jenkins dynamic declarative pipeline parameters
I think you can achieve it the following way (scripted pipeline).
node {
    stage("read file") {
        // Produce a sample file, then read its content back.
        sh('echo -n "some text" > afile.txt')
        def fileContent = readFile('afile.txt')
        // Redefine the job's parameters, seeding the default with the file content.
        properties([
            parameters([
                string(name: 'FILE_CONTENT', defaultValue: fileContent)
            ])
        ])
    }
    stage("Display properties") {
        echo("${params.FILE_CONTENT}")
    }
}
The first time you execute it, there will be no parameter choice. From the second time on, you'll have the option to build with parameters, and the default will be the content of your file.
The bad thing about this approach is that the parameter is always one execution behind: when you start the build on a commit where you changed the content of the file, it will prefill the parameter with the content of the file as of the last execution of the build.
The only way I know around this is to split your pipeline into two pipelines. The first one reads the content of the file and then triggers the second one with the file content as a build parameter, using the build step.
If you find a better way let us know.
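A rough sketch of that split (the downstream job name is made up):

// First pipeline: read the file, then hand its content to the second job.
node {
    def fileContent = readFile('afile.txt')
    build job: 'downstream-job',
          parameters: [string(name: 'FILE_CONTENT', value: fileContent)]
}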
Why don't you have Jenkins pull the repo as part of the job, then parse the parameters (say, from a JSON file within the repo) and continue executing with those parameters?
I need to get data from a CSV file placed on the same server as Jenkins; the idea is to fill the parameter with values from that file, in this case IP addresses. I also want to check whether Jenkins is reaching the file or not (permissions or something else).
I tried a simple Groovy script using the readCSV plugin inside an extended-choice-parameter, but it didn't work:
readCSV(file: '/var/lib/jenkins/ips.csv').each { line ->
    line.each { field ->
        return field
    }
}
File example:
0.0.0.0;1.1.1.1
I can install additional plugins in Jenkins if necessary
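One thing to note: readCSV comes from the Pipeline Utility Steps plugin and is only available inside a Pipeline script, not in an Extended Choice Parameter Groovy script. For the parameter script, a plain-Groovy sketch (file path and ';' delimiter taken from the question) might look like this; the guard also surfaces permission problems:

// Plain Groovy: pipeline steps are not available here.
def file = new File('/var/lib/jenkins/ips.csv')
if (!file.canRead()) {
    // Makes permission/reachability problems visible in the parameter dropdown.
    return ['ERROR: Jenkins cannot read ' + file.path]
}
// The example file is a single ';'-separated line of IP addresses.
return file.text.trim().split(';') as List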
I already use the Jenkins API for some tasks in my build pipeline. Now there is a task where I want to persist some simple dynamic data, say "50.24", for each build, and then be able to retrieve this data in a different job.
More concretely, I am looking for something along these lines:
POST to http://localhost:8080/job/myjob//api/json/store
{"code-coverage":"50.24"}
Then in a different job
GET
http://localhost:8080/job/myjob//api/json?code-coverage
One idea is to use archiveArtifacts to save it to a file and then read it back via the API. But I am wondering if there is a plugin or a simpler way to store such data for a job.
If you need to send a variable from one build to another:
The parametrized build is the easiest way to do this:
https://wiki.jenkins.io/display/JENKINS/Parameterized+Build
The URL would look like:
http://server/job/myjob/buildWithParameters?PARAMETER=Value
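For instance, triggering it from a script could look like the following (user, API token, and values are placeholders; depending on your security setup a CSRF crumb may also be required):

curl -X POST --user user:apiToken \
     "http://server/job/myjob/buildWithParameters?PARAMETER=Value"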
If you need to share complex data, you can save some files in your workspace and use them from another build (send the absolute path).
If you need to re-use a simple variable computed during your build,
I would go for an environment variable, updated during your flow:
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE    = 'sqlite'
    }
    stages {
        stage('Build') {
            steps {
                sh 'printenv'
            }
        }
    }
}
All the details there:
https://jenkins.io/doc/pipeline/tour/environment/
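Since the value in the original question is computed during the build rather than fixed, a sketch of setting it from a script block (the coverage figure is just the question's example) could be:

pipeline {
    agent any
    stages {
        stage('Compute') {
            steps {
                script {
                    // Hypothetical computation; assignments to env persist
                    // for the remaining stages of this build.
                    env.CODE_COVERAGE = '50.24'
                }
            }
        }
        stage('Report') {
            steps {
                echo "Coverage: ${env.CODE_COVERAGE}"
            }
        }
    }
}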
If you need to re-use complex data between two builds
There are two cases here: either your builds share the same workspace or they don't.
In the same workspace, it's totally fine to write your data to a text file that is re-used later by another job.
The archiveArtifacts step is convenient if your use case is about extracting test results from logs and re-using them later; otherwise you will have to write the process yourself.
If your second job uses another workspace, you will need to provide the absolute path to your child job so that it can copy and process the file.
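A sketch of that cross-job hand-off using archived artifacts instead of absolute paths, assuming the Copy Artifact and Pipeline Utility Steps plugins are installed (job and file names are made up):

// Jenkinsfile of job A: persist the value as an archived artifact.
node {
    writeFile file: 'metrics.properties', text: 'code-coverage=50.24'
    archiveArtifacts artifacts: 'metrics.properties'
}

// Jenkinsfile of job B: fetch the artifact and read the value back.
node {
    copyArtifacts projectName: 'job-A', selector: lastSuccessful()
    def metrics = readProperties file: 'metrics.properties'
    echo "Coverage from job A: ${metrics['code-coverage']}"
}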
Job A uses "For every property file, invoke one build" parameter factory to call downstream job B.
Here is the file pattern I am using:
d:\temp\*.properties
There are two files in that folder:
build0.properties
build1.properties
Each file looks something like this:
modified=SampleApp
Job B fails because job A is not setting the parameters from the above files. If I look at the parameters for a build of job B, they are empty.
The process works when I use the "Parameters from properties file" parameter type instead of a parameter factory and specify the full path to one of the files, so I know the files are in the right format. I do not want to add a parameter for each file I have,
since these files will be generated dynamically. I would prefer to use the parameter factory if possible.
The issue was with file permissions: when I pointed the file pattern at the workspace directory, it started working fine.