How can I use Jenkins' ExportParametersBuilder in a Pipeline?

I'm migrating a Freestyle job to a Pipeline on Jenkins. The Freestyle job uses the ExportParametersBuilder (Export Parameters to File) plugin. This is important for our workflow because the application expects the parameters as a JSON file.
I have tried with a Basic Step, as documented in Pipeline: Basic Steps - Jenkins documentation (search for ExportParametersBuilder):
step([
    $class: 'ExportParametersBuilder',
    filePath: 'config/parameters',
    fileFormat: 'json',
    keyPattern: '',
    useRegexp: 'false'
])
But when I try to run the Pipeline I get the following error:
No known implementation of interface jenkins.tasks.SimpleBuildStep is named ExportParametersBuilder
The Pipeline job is running on the same Jenkins instance as the Freestyle job (which is currently working), so the plugin is installed and working. I'm not sure why this is happening.
Does anyone know if this plugin can be used in Pipeline jobs? And if so, how? What am I missing?
If it cannot be used, my apologies; Jenkins' documentation is often misleading.

I couldn't find a way to use the plugin (the error suggests its builder does not implement the SimpleBuildStep interface that the step([...]) syntax requires), but I found an alternative. I'm leaving it here in case it's useful for someone else.
// Import the JsonOutput class at the top of your Jenkinsfile
import groovy.json.JsonOutput
...
stage('Environment Setup') {
    steps {
        writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(params))
    }
}
This is probably not the cleanest or most elegant way to do it, but it works. The params are all written to the JSON file, and the JsonOutput class takes care of all the escaping magic and so on.
Do keep in mind that the format of the JSON file is a little different from the one ExportParametersBuilder created, so you'll need to adapt to it:
ExportParametersBuilder format:
[
    ...
    {
        "key": "target_node",
        "value": "c3po"
    }
    ...
]
JsonOutput format:
{
    ...
    "target_node": "c3po"
    ...
}
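If your application depends on the old key/value layout, here is a minimal, untested sketch of reproducing it with JsonOutput, assuming the standard params map is available (wrap it in a script block in declarative pipelines):

// Hypothetical adaptation: emit the ExportParametersBuilder-style list of
// {key, value} objects instead of a flat map.
import groovy.json.JsonOutput

def kvList = params.collect { k, v -> [key: k, value: v?.toString()] }
writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(kvList))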

Related

Jenkins Log Parser

I was using the https://plugins.jenkins.io/log-parser/ plugin with freestyle Jenkins jobs. But since moving to Jenkins Pipeline, I have not been able to integrate the log parser into the Declarative Pipeline syntax.
How can this be done? I also didn't find info in their docs. Also, what would be a good log parsing rule and where to specify it? In the Jenkinsfile also? Could you give an example? Thanks.
I don't use log-parser, but a quick glance at the issues suggests it is not presently compatible:
JENKINS-27208: Make Log Parser Plugin compatible with Workflow
JENKINS-32866: Log Parser Plugin does not parse Pipeline console outputs
Update:
This old response by Jesse Glick (CloudBees; Jenkins sponsor) to a similar question suggests it does in fact work now and suggests how to generate the syntax, but the OP complains that the DSL and documentation are weak.
gdemengin wrote pipeline-logparser to work around another issue, JENKINS-54304.
Build Failure Analyzer may also be of use to you.
YMMV
You can try something like the following:
stage('check') {
    steps {
        echo 'checking logs from previous stages...'
        logParser failBuildOnError: true, parsingRulesPath: '/path/to/rules', useProjectRule: false, projectRulePath: ''
    }
}
The Pipeline Syntax section in Jenkins allows you to generate snippets like this for your Jenkinsfile.
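As for the rules themselves: the Log Parser plugin reads a plain-text rules file (the parsingRulesPath above), where each line maps a level to a regular expression. A minimal illustrative file; the patterns here are examples to adapt, not recommendations:

error /ERROR/
warning /WARN/
ok /known harmless message/
info /INFO/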

Jenkins declarative pipeline: How to configure the Klocwork result display on the job page

I am creating a pipeline using the declarative pipeline flavour, with Klocwork steps enclosed within a klocworkWrapper where I can define the Klocwork setup:
klocworkWrapper(installConfig: 'My Klocwork', ltoken: "${HOME}/.klocwork/ltoken", serverConfig: 'Klocwork#XYZ', serverProject: 'S3cr3TPr0j3ct') {
    klocworkBuildSpecGeneration([additionalOpts: '', buildCommand: 'make', ignoreErrors: true, output: 'kwinject.out', tool: 'kwinject'])
    klocworkIntegrationStep1([additionalOpts: '', buildSpec: 'kwinject.out', disableKwdeploy: false, ignoreCompileErrors: true, importConfig: '', incrementalAnalysis: false, tablesDir: 'kwtables'])
    klocworkIntegrationStep2([additionalOpts: '', buildName: "${JOB_BASE_NAME}_${BUILD_NUMBER}", tablesDir: 'kwtables'])
}
Ok, the analysis is launched, and I can see the results on the Klocwork server web interface.
But I cannot find a way to retrieve the resulting diagrams on the Jenkins web interface, even when using the pipeline script generator.
Unless I am totally wrong, I think that I should use klocworkQualityGateway, but the generated script snippet is not correct.
Once copied within the wrapper, it fails for lack of some enableXYGateway or gatewayXYConfig property.
For example, this line:
klocworkQualityGateway([enableCiGateway: false, enableServerGateway: true, gatewayServerConfigs: [[conditionName: 'Issues', jobResult: 'failure', query: 'state:+Status,Fix', threshold: '1']]])
fails with an error message:
WorkflowScript: 92: Missing required parameter: "gatewayCiConfig" # line 92, column 1.
klocworkQualityGateway([enableCiGateway: false, enableServerGateway: true, gatewayServerConfigs: [[conditionName: 'Issues', jobResult: 'failure', query: 'state:+Status,Fix', threshold: '1']]])
I really cannot find a way to make it work, and I guess I have taken a wrong turn somewhere... so any help would be appreciated.
Thanks for your help and best regards
J-L
Well, after a fruitful discussion with the plugin maintainer (M. Baron), it appears that there is currently no simple and direct solution to display Klocwork results on a pipeline job page.
He said:
This step doesn't have a native pipeline interface and a few people have tried, but haven't had much success with workarounds to use this in a pipeline.
The simplest thing to do seems to be to trigger a freestyle job that will do only that, as sketched below.
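A minimal sketch of such a trigger from the pipeline side, using the standard build step (the job name and parameter name here are hypothetical):

// Trigger a freestyle job that publishes the Klocwork results;
// 'klocwork-results' and UPSTREAM_BUILD are illustrative names.
build job: 'klocwork-results',
    parameters: [string(name: 'UPSTREAM_BUILD', value: "${env.BUILD_NUMBER}")],
    wait: true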
As far as I have understood, a new plugin version with full pipeline support will replace the current one.
So, I think this discussion can be closed.

How to access collected artifacts from Jenkins pipeline groovy script?

I want to check from a Groovy pipeline whether a specific file (artifact) was collected or not.
How can I access the list of artifacts?
archiveArtifacts artifacts: 'foo.txt', allowEmptyArchive: true
...
// much later
// check if 'foo.txt' was collected?
Please note that I am looking for a solution that does not involve modifying the code that collects the artifacts. This is because this code is in multiple places; I only need something to do at the end, not at every possible call of archiveArtifacts (which can be deeply hidden).
You can get the list of artifacts using the Jenkins API, in either XML or JSON format.
Call the URL below, inserting the job name and build number:
http://localhost:8080/jenkins/job/job-name/build-number/api/json?pretty=true
JSON example:
"artifacts" : [
    {
        "displayPath" : "temp.jar",
        "fileName" : "temp.jar",
        "relativePath" : "target/temp.jar"
    }
]
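From within the pipeline itself, one way to consume this API is sketched below. This assumes the Pipeline Utility Steps plugin (for readJSON), an agent with curl, and that the build URL is reachable without extra authentication:

// Query this build's own JSON API and check the artifact list.
def raw = sh(script: "curl -s ${env.BUILD_URL}api/json", returnStdout: true)
def json = readJSON(text: raw)
def found = json.artifacts.any { it.fileName == 'foo.txt' }
echo "foo.txt collected: ${found}"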
It's possible to access the artifacts using something like currentBuild.rawBuild.artifacts.each { println it.fileName }, but you will need to whitelist the rawBuild, artifacts and fileName properties in the script security settings.
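Applied to the original question, a sketch of the end-of-build check (the same whitelisting caveat applies):

// Check whether 'foo.txt' was archived by any earlier archiveArtifacts call.
// Requires script-security approval for rawBuild, artifacts and fileName.
def wasArchived = currentBuild.rawBuild.artifacts.any { it.fileName == 'foo.txt' }
echo "foo.txt archived: ${wasArchived}"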

Jenkins pipeline and Robot Framework results

I had to implement a Pipeline and am trying to find a way to publish Robot Framework results in a Jenkins Pipeline.
I found multiple questions about the integration of the Robot Framework plugin into a Pipeline and also found this question, which seems to be the solution. However, I have tried this approach and the results are still missing.
Is there any workaround or functional example?
[Edited to reflect successful workaround]
This comment on the issue tracker shows a workaround that seems to work:
step([
    $class: 'RobotPublisher',
    outputPath: outputDirectory,
    outputFileName: "*.xml",
    disableArchiveOutput: false,
    passThreshold: 100,
    unstableThreshold: 95.0,
    otherFiles: "*.png",
])
However, the Robot Framework plugin does not seem to be fully compatible with Pipeline right now: https://issues.jenkins-ci.org/browse/JENKINS-34469
This is common with many plugins in the Jenkins ecosystem that have not yet been updated to be compatible with the new Jenkins Pipeline. You could potentially create the full compatibility yourself, though, if you're motivated enough.
I used the workaround mentioned in the other answer, but it wouldn't display the results with the job like in non-pipeline jobs, so I made a freestyle project that is triggered by the pipeline job and just copies the result files across, then runs the analysis. This is crufty and won't be portable across nodes, and the job numbers might get confusing over time, so the correlations might be tricky. At that point I will investigate using generic artifact storage, or just getting rid of Robot altogether.
I had trouble using the answer given above, resulting in errors, but I was able to figure it out and add it to the Pipeline. Here is how I fixed it, in case anyone else has come across the same issues:
stage('Tests') {
    steps {
        echo 'Testing...'
        script {
            step([
                $class: 'RobotPublisher',
                outputPath: '<insert/the/output/path>',
                outputFileName: "*.xml",
                reportFileName: "report.html",
                logFileName: "log.html",
                disableArchiveOutput: false,
                passThreshold: 100,
                unstableThreshold: 95.0,
                otherFiles: "*.png"
            ])
        }
    }
}

How to implement and invoke utility methods in Jenkins pipelines?

I want to implement reusable functions/methods to use in my Jenkins pipeline...
listAllUnitTests([
    $class: 'MyUtilities',
    arg1: 'foo',
    arg2: 'bar'
])
What's not clear is how to actually do it; is this a plugin, an extension, or what?
The Hunt
So I started with something familiar, such as Git checkout...
node {
    checkout([
        $class: 'GitSCM',
        branches: scm.branches,
        extensions: scm.extensions,
        userRemoteConfigs: scm.userRemoteConfigs
    ])
}
Looking at the source for GitSCM, a Jenkins plugin, the checkout method appears to be fairly standard; no special annotations or anything else, although I'm not sure how the pipeline arguments align with the method signature, because there's clearly a mismatch. I suspect I'm on the wrong track.
@Override
public void checkout(
        Run<?, ?> build,
        Launcher launcher,
        FilePath workspace,
        TaskListener listener,
        File changelogFile,
        SCMRevisionState baseline)
        throws IOException, InterruptedException {
Question
I'll keep it simple: how do I implement parameterized functionality to invoke from Jenkins pipelines to achieve something like this?
node {
    stage('test') {
        myUtilMethod([
            $class: 'MyUtilities',
            arg1: 'foo',
            arg2: 'bar'
        ])
    }
}
You can implement one or more shared libraries using https://github.com/jenkinsci/workflow-cps-global-lib-plugin/
I recommend explicitly specifying that you need the library (with the @Library annotation as mentioned in the page above), and not making it implicitly available. This way you can use a specific branch of it on test jobs while developing and testing your library.
Check out fabric8 for a pretty comprehensive set of examples: https://github.com/fabric8io/fabric8-pipeline-library
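A minimal sketch of what this looks like in practice (the repository layout is the shared-library convention; the names are illustrative, not from the question):

// vars/myUtilMethod.groovy in the shared library repository
def call(Map args = [:]) {
    // Utility logic goes here; args carries the named parameters.
    echo "arg1=${args.arg1}, arg2=${args.arg2}"
}

And the corresponding Jenkinsfile, pinning a branch of the library while testing it:

@Library('my-utils@develop') _
node {
    stage('test') {
        myUtilMethod(arg1: 'foo', arg2: 'bar')
    }
}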
