I want to check from a Groovy pipeline whether a specific file (artifact) was collected or not.
How can I access the list of artifacts?
archiveArtifacts artifacts: 'foo.txt', allowEmptyArchive: true
...
// much later
// check if 'foo.txt' was collected?
Please note that I am looking for a solution that does not require modifying the code that collects the artifacts. That code lives in multiple places, so I only need something to run at the end, not at every possible call of archiveArtifacts (which can be deeply hidden).
You can get the list of artifacts from the Jenkins API in either XML or JSON format.
Call the URL below, inserting the job name and build number:
http://localhost:8080/jenkins/job/job-name/build-number/api/json?pretty=true
JSON example:
"artifacts" : [
{
"displayPath" : "temp.jar",
"fileName" : "temp.jar",
"relativePath" : "target/temp.jar"
}
]
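From the pipeline itself you can fetch and inspect that JSON at the end of the build. A minimal sketch, assuming the Pipeline Utility Steps plugin (readJSON) is installed and the controller lets the build read its own api/json (otherwise pass credentials to curl):
node {
    // Query this build's own REST API and look for the artifact by file name
    def json = sh(script: "curl -s '${env.BUILD_URL}api/json'", returnStdout: true).trim()
    def build = readJSON text: json
    def collected = false
    for (artifact in build.artifacts) {
        if (artifact.fileName == 'foo.txt') {
            collected = true
        }
    }
    echo "foo.txt archived: ${collected}"
}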
It's possible to access the artifacts using something like currentBuild.rawBuild.artifacts.each { println it.fileName }, but you will need to whitelist rawBuild, artifacts and fileName properties.
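A hedged sketch of that approach, assuming the rawBuild, artifacts and fileName signatures have been approved in the script security whitelist (or the code runs from a trusted shared library):
@NonCPS
boolean wasArchived(String name) {
    // rawBuild exposes the underlying hudson.model.Run and its archived artifacts
    return currentBuild.rawBuild.artifacts.any { it.fileName == name }
}

if (wasArchived('foo.txt')) {
    echo 'foo.txt was collected'
} else {
    echo 'foo.txt was NOT collected'
}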
I need to hide some values in Jenkins pipelines. I've created libraries with classes that handle some common operations. The problem is that some of these classes execute commands with sh, which shows sensitive information. I cannot use withCredentials because these credentials are not stored in Jenkins (they come from YAML files in the repository and are decrypted with a library). Is there a way to use withCredentials without having the credentials stored in Jenkins? Or can I tell Jenkins to hide some values?
I need something like:
maskVariables([var1: 'secret value 1', var2: 'secret value 2']) {
    pipeline.sh('echo "don\'t show $var1 $var2"')
}
// don't show ************** **************
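For reference, one way to get close to this without storing anything in Jenkins is the Mask Passwords plugin's wrap step. This is only a sketch under the assumption that the plugin is installed; secret1 and secret2 stand in for the values your library decrypts:
// secret1 / secret2 come from your own YAML-decryption library (hypothetical)
wrap([$class: 'MaskPasswordsBuildWrapper',
      varPasswordPairs: [[var: 'VAR1', password: secret1],
                         [var: 'VAR2', password: secret2]]]) {
    withEnv(["VAR1=${secret1}", "VAR2=${secret2}"]) {
        sh 'echo "do not show $VAR1 $VAR2"'   // the values are masked in the console log
    }
}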
I'm migrating a Free Style job to a Pipeline on Jenkins. The Freestyle Job uses the ExportParametersBuilder (Export Parameters to File) plug-in. This is important for our workflow because the application expects the parameters as a JSON file.
I have tried with a Basic Step, as documented in Pipeline: Basic Steps - Jenkins documentation (search for ExportParametersBuilder):
step([
    $class: 'ExportParametersBuilder',
    filePath: 'config/parameters',
    fileFormat: 'json',
    keyPattern: '',
    useRegexp: 'false'
])
But when I try to run the Pipeline I get the following error:
No known implementation of interface jenkins.tasks.SimpleBuildStep is named ExportParametersBuilder
The Pipeline Job is running on the same Jenkins instance as the Freestyle Job (which is currently working). So, the Plug-in is installed and working. I'm not sure why this is happening.
Does anyone know if this plug-in can be used in Pipeline jobs? And if so, how? What am I missing?
If it cannot be used, my apologies; Jenkins' documentation is often misleading.
I couldn't find a way to use the plug-in, but I found an alternative. I'm leaving it here in case it is useful for someone else.
// Import the JsonOutput class at the top of your Jenkinsfile
import groovy.json.JsonOutput
...
stage('Environment Setup') {
    steps {
        writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(params))
    }
}
This is probably not the cleanest or most elegant way to do it, but it works. The params are all written to the JSON file, and the JsonOutput class takes care of all the escaping magic and so on.
Do keep in mind that the format of the JSON file is a little different from the one ExportParametersBuilder created, so you'll need to adapt to it:
ExportParametersBuilder format:
[
  ...
  {
    "key": "target_node",
    "value": "c3po"
  }
  ...
]
JsonOutput format:
{
  ...
  "target_node": "c3po"
  ...
}
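If downstream tooling insists on the old key/value list layout, a small reshaping step inside a script block (reusing the JsonOutput import from above) gets close to it. A sketch, assuming only the field names shown in the examples above:
script {
    // Rebuild the ExportParametersBuilder-style list of {key, value} objects
    def entries = []
    for (name in params.keySet() as List) {
        entries << [key: name, value: String.valueOf(params[name])]
    }
    writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(entries))
}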
I have a build with a parameter, and also a few dependencies from another project. I need to print the parameter values and save them in a specific folder. I used echo, but is there any other way or plugin?
If I understand correctly, you want to write the parameters that were passed out to a file in a folder. Here's an example using Groovy / scripted pipeline:
stage('Create and write output file') {
    f = new File("${WORKSPACE}/output_file.txt")
    f.append("SomeParameter = ${params.SomeParameter}")
}
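Note that java.io.File always resolves on the machine running the Groovy (the controller), so a hedged alternative is the built-in writeFile step, which writes into the workspace of whatever agent executes the stage (inside a node block):
stage('Create and write output file') {
    // writeFile is a pipeline step and therefore works on any agent's workspace
    writeFile file: 'output_file.txt',
              text: "SomeParameter = ${params.SomeParameter}\n"
}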
I have Pylint running in a Jenkins pipeline. To implement it, I used the Gerrit Trigger plugin and the Next Generation Warnings plugin. Everything is working as expected: Jenkins joins the review, checks the change with pylint and generates a report.
Now I'd like to post the pylint score in a custom "Build successful" message. I wanted to pass the pylint score to an environment variable and use it in the dedicated window for the Gerrit plugin message.
Unfortunately, no matter what I try, I cannot pass any "new" variable to the message. Passing parameters embedded in the pipeline works (e.g. patchset number).
I created a new environment variable in the Configure Jenkins menu, tried exporting it in the shell, and tried writing to it (via $VAR and env. syntax), but nothing works; the build message displays a raw string like $VAR instead of the variable's contents.
What should I do to pass the local pylint score (distinct for every pipeline occurrence) to the custom build message for Gerrit?
I don't think the custom message can be used for this. This is just supposed to be a static message.
The way I do this is to use the SSH command to perform the review. You can also achieve the same using the REST API.
First I run my linting and whitespace-checking script, which generates a JSON file with the information I would like to pass to Gerrit. Next I send it to Gerrit using SSH. See below for my pipeline script and an example JSON file.
As a bonus I have added robot comments. These will show up in your review as a remark from Jenkins that line 8 of my Jenkinsfile has a trailing whitespace. You can easily replace this with your lint result if you like, or just ignore it and only put the message. Using a JSON file makes it easier to create multi-line messages.
node('master') {
    sh """
        cat lint_change.json | ssh -p ${env.GERRIT_PORT} ${env.GERRIT_HOST} gerrit review ${env.GERRIT_PATCHSET_REVISION} --json
    """
}
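The same review (using the example JSON file shown below) can also be posted over the REST API instead of SSH. A sketch only: 'gerrit-http' is a hypothetical username/password credential and gerrit.example.com a placeholder host; GERRIT_CHANGE_NUMBER and GERRIT_PATCHSET_NUMBER are provided by the Gerrit Trigger plugin:
withCredentials([usernamePassword(credentialsId: 'gerrit-http',
                                  usernameVariable: 'GERRIT_USER',
                                  passwordVariable: 'GERRIT_PASS')]) {
    // POST the ReviewInput-style JSON to the change's review endpoint
    sh '''
        curl -s -u "$GERRIT_USER:$GERRIT_PASS" \
             -H "Content-Type: application/json" \
             -d @lint_change.json \
             "https://gerrit.example.com/a/changes/$GERRIT_CHANGE_NUMBER/revisions/$GERRIT_PATCHSET_NUMBER/review"
    '''
}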
Example JSON file:
{
  "labels": {
    "Code-Style": "-1"
  },
  "message": "Lint Bot Review\nLint Results:\n Errors: 0\n Warnings: 0\n\nWhitespace results:\n Errors: 1",
  "robot_comments": {
    "Jenkinsfile": [
      {
        "robot_id": "lint-bot",
        "line": "8",
        "message": "trailing whitespace."
      }
    ]
  }
}
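You can also build that JSON inside the pipeline instead of in the lint script, e.g. with groovy.json.JsonOutput. In this sketch pylintScore is a hypothetical variable holding whatever score you parsed from the pylint output:
import groovy.json.JsonOutput   // at the top of the Jenkinsfile

def review = [
    labels : ['Code-Style': '-1'],
    message: "Lint Bot Review\nPylint score: ${pylintScore}".toString()
]
// Write the file that the ssh/curl command above picks up
writeFile file: 'lint_change.json', text: JsonOutput.toJson(review)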
Alternatively, you may want to look at the new gerrit-code-review-plugin, which should make these things even easier. However, I have not tried it yet.
I have a text file on the server, e.g. /var/lib/jenkins/.../myChoices.txt:
FirstChoice,SecondChoice
As the file will be updated from time to time, I want the script to refresh the parameters every time I click "Build with Parameters".
But my code only refreshes them when I build the job, i.e. it is not updating in real time.
def getMyChoices() {
    List<String> choices = Arrays.asList(readFileFromWorkspace('/var/lib/jenkins/.../myChoices.txt').split(','))
    return choices
}

job(jobName) {
    description("Deploy something based on choice.")
    parameters {
        ...
        ...
        choiceParam('EB_ACTIVE_ENV_NAME', getMyChoices(), '')
    }
}
I also do not want to use the hudson plugin, for vulnerability reasons.
Groovy scripts are executed only when the job is run, so the parameters will not be refreshed until the job runs.
The only solution available is to run this job at regular intervals with an additional flag that refreshes the parameters alone and then exits.
This way, whenever you click "Build with Parameters", you will have the latest parameters that exist in the file.
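For a pipeline job, that refresh flag could look like the sketch below: every run re-reads the file and republishes the parameter definitions via properties(), and a refresh-only run exits early. The REFRESH_ONLY parameter and the deploy stage are illustrative, and the job must run on a node that can read the file:
node {
    // Re-read the choices and republish the parameter definitions on every run
    def choices = sh(script: 'cat /var/lib/jenkins/.../myChoices.txt', returnStdout: true).trim().split(',') as List
    properties([
        parameters([
            booleanParam(name: 'REFRESH_ONLY', defaultValue: false,
                         description: 'Only refresh the parameter list, then exit'),
            // older Jenkins versions may need choices joined with newlines instead of a List
            choice(name: 'EB_ACTIVE_ENV_NAME', choices: choices, description: '')
        ])
    ])

    if (params.REFRESH_ONLY) {
        currentBuild.result = 'NOT_BUILT'
        return   // nothing else to do on a refresh-only run
    }

    stage('Deploy') {
        echo "Deploying to ${params.EB_ACTIVE_ENV_NAME}"
    }
}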
It is required to regenerate the job in order to refresh the parameters.
What I would do is create a job that regenerates the jobs with the jobDsl step whenever there is a change in the repository where myChoices.txt is versioned.
Here is an example of using jobDsl:
jobDsl removedJobAction: 'DELETE',
       removedViewAction: 'DELETE',
       targets: targetFile,
       unstableOnDeprecation: true,
       additionalParameters: [
           pipelineJobs: arrFiles,
           props: [
               basePath: destination,
               gitRemoteUrl: config.gitRemoteUrl,
               gitConfigJenkinsBranch: config.gitConfigJenkinsBranch,
               localPath: config.localPath ?: ''
           ]
       ]
I use it with a shared library I created that lets me abstract the jobDSL and write only pipelineDSL: https://github.com/SAP/jenkins-pipelayer/. There are restrictions to this lib, though: because I parse the pipelineDSL, getMyChoices() would not be evaluated in the current version of the lib.