Jenkins Log Parser - jenkins

I was using the https://plugins.jenkins.io/log-parser/ plugin with freestyle Jenkins jobs. But since moving to Jenkins Pipeline, I have not been able to integrate the log parser into the Declarative Pipeline syntax.
How can this be done? I didn't find any info in the docs either. Also, what would be a good log parsing rule, and where should it be specified? Also in the Jenkinsfile? Could you give an example? Thanks.

I don't use log-parser, but a quick glance at the issues suggests it is not presently compatible:
JENKINS-27208: Make Log Parser Plugin compatible with Workflow
JENKINS-32866: Log Parser Plugin does not parse Pipeline console outputs
Update:
This old response by Jesse Glick (CloudBees; Jenkins sponsor) to a similar question suggests it does in fact work now and explains how to generate the syntax, but the OP complains that the DSL and documentation are weak.
gdemengin wrote pipeline-logparser to work around another issue, JENKINS-54304.
Build Failure Analyzer may also be of use to you.
YMMV

You can try something like the following:
stage('check') {
    steps {
        echo 'checking logs from previous stages...'
        logParser failBuildOnError: true, parsingRulesPath: '/path/to/rules', useProjectRule: false, projectRulePath: ''
    }
}
The Pipeline Syntax section (Snippet Generator) in Jenkins allows you to generate snippets like this for your Jenkinsfile.
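As for the parsing rules: logParser reads them from a plain-text rules file referenced by parsingRulesPath (or from a file in the project when useProjectRule is true). A minimal rules file, loosely following the sample on the plugin page (the regexes here are only illustrative, adjust them to your own log output), could look like this:
# lines matching 'ERROR' are marked as errors in the parsed report
error /ERROR/
# warnings, case-insensitive
warning /(?i)warning/
# quick-access links for informational lines
info /INFO/
# each 'BUILD' line starts a new section grouping the errors and warnings that follow
start /BUILD/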

Related

What to do about the Jenkins pipeline error: Process working directory '/var/lib/jenkins/workspace/<yourpipelinenamehere>' doesn't exist

I have found this question asked in various places in various forms and even fought it myself.
I believe I have found the solution for the scenario in which I encountered this, and I am curious whether it helps others who run into the same thing.
The short answer of what I found to do is to set the first stage of my pipeline to a known module that has logic to create the workspace, i.e. the following:
pipeline {
    agent any
    stages {
        stage('Opening Workspace') {
            steps {
                script {
                    def date = new Date()
                    def data = "I am arbitrary text\nSecond line\n" + date
                    writeFile(file: 'workspacecreated.txt', text: data)
                    sh "ls -l"
                }
            }
        }
        stage('alltherest') {
            << the rest of your steps and end of your pipeline to paste here>>
In my case of fighting this, my first stage was ansiblePlaybook(), which, as it turns out, does not seem to try to create this workspace. I have filed this as a bug in Jenkins against the Ansible plugin.
So the first question is,
If you hit this error message in Jenkins, does setting the first step to writeFile help you?
If so, what was your original first step? Perhaps you should report that first step's plugin failing to create a workspace for itself as a bug to Jenkins.
The second question is,
Does anyone have a more elegant solution than this workaround?

How can I use Jenkins' ExportParametersBuilder in a Pipeline?

I'm migrating a Freestyle job to a Pipeline on Jenkins. The Freestyle job uses the ExportParametersBuilder (Export Parameters to File) plug-in. This is important for our workflow because the application expects the parameters as a JSON file.
I have tried with a Basic Step, as documented in Pipeline: Basic Steps - Jenkins documentation (search for ExportParametersBuilder):
step([
    $class: 'ExportParametersBuilder',
    filePath: 'config/parameters',
    fileFormat: 'json',
    keyPattern: '',
    useRegexp: 'false'
])
But when I try to run the Pipeline I get the following error:
No known implementation of interface jenkins.tasks.SimpleBuildStep is named ExportParametersBuilder
The Pipeline Job is running on the same Jenkins instance as the Freestyle Job (which is currently working). So, the Plug-in is installed and working. I'm not sure why this is happening.
Does anyone know if this plug-in can be used in Pipeline jobs? If so, how? What am I missing?
If it cannot be used, my apologies, Jenkins' documentation is often misleading.
I couldn't find a way to use the plug-in, but I found an alternative. I'm leaving it here in case it is useful for someone else.
// Import the JsonOutput class at the top of your Jenkinsfile
import groovy.json.JsonOutput
...
stage('Environment Setup') {
    steps {
        writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(params))
    }
}
This is probably not the cleanest, or the most elegant way to do it but it works. The params are all written to the JSON file and the JsonOutput class takes care of all the escaping magic and so on.
Do keep in mind that the format of the JSON file is a little different from the one ExportParametersBuilder created, so you'll need to adapt to it:
ExportParametersBuilder format:
[
    ...
    {
        "key": "target_node",
        "value": "c3po"
    }
    ...
]
JsonOutput format:
{
    ...
    "target_node": "c3po"
    ...
}
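If whatever consumes the file really does expect the old key/value-list layout, the same JsonOutput approach can be adapted with a small reshaping step. This is only a sketch under that assumption (run it inside a script block in declarative syntax):
import groovy.json.JsonOutput

// Reshape the flat params map into the [{"key": ..., "value": ...}] list layout
// that ExportParametersBuilder produced (field names assumed from the example above)
def keyValueList = params.collect { k, v -> [key: k, value: v] }
writeFile(file: 'config/parameters.json', text: JsonOutput.toJson(keyValueList))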

Jenkins declarative pipeline: How to configure the Klocwork result display on the job page

I am creating a pipeline using the declarative pipeline flavour, with Klocwork steps enclosed within a klocworkWrapper block where I can define the Klocwork setup:
klocworkWrapper(installConfig: 'My Klocwork', ltoken: "${HOME}/.klocwork/ltoken", serverConfig: 'Klocwork#XYZ', serverProject: 'S3cr3TPr0j3ct') {
    klocworkBuildSpecGeneration([additionalOpts: '', buildCommand: 'make', ignoreErrors: true, output: 'kwinject.out', tool: 'kwinject'])
    klocworkIntegrationStep1([additionalOpts: '', buildSpec: 'kwinject.out', disableKwdeploy: false, ignoreCompileErrors: true, importConfig: '', incrementalAnalysis: false, tablesDir: 'kwtables'])
    klocworkIntegrationStep2([additionalOpts: '', buildName: "${JOB_BASE_NAME}_${BUILD_NUMBER}", tablesDir: 'kwtables'])
}
Ok, analysis is launched, and I can see the results on the Klocwork server web interface.
But I cannot find a way to retrieve resulting diagrams on the Jenkins web interface, even when using the pipeline script generator.
Unless I am totally wrong, I think that I should use klocworkQualityGateway, but the generated script snippet is not correct.
Once copied into the wrapper, it fails complaining about a missing enableXYGateway or gatewayXYConfig property.
For example, this line:
klocworkQualityGateway([enableCiGateway: false, enableServerGateway: true, gatewayServerConfigs: [[conditionName: 'Issues', jobResult: 'failure', query: 'state:+Status,Fix', threshold: '1']]])
fails with an error message:
WorkflowScript: 92: Missing required parameter: "gatewayCiConfig" # line 92, column 1.
klocworkQualityGateway([enableCiGateway: false, enableServerGateway: true, gatewayServerConfigs: [[conditionName: 'Issues', jobResult: 'failure', query: 'state:+Status,Fix', threshold: '1']]])
I really cannot find a way to make it work, and I guess I may have taken a wrong turn somewhere... so any help would be appreciated.
Thanks for your help and best regards
J-L
Well, after a fruitful discussion with the plugin maintainer (M. Baron), it appears that there is currently no simple and direct way to display Klocwork results on a pipeline job page.
He said:
This step doesn't have a native pipeline interface and a few people have tried, but haven't had much success with workarounds to use this in a pipeline.
The simplest thing to do seems to be to trigger a freestyle job that does only that.
As far as I have understood, a new plugin version with full pipeline support will replace the current one.
So, I think this discussion can be closed.
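For reference, the freestyle-job workaround mentioned above can be wired into a pipeline with the standard build step; the job name below is just a placeholder:
stage('Klocwork report') {
    steps {
        // Trigger a freestyle job that runs the Klocwork gateway/report publishing.
        // 'klocwork-report-freestyle' is a hypothetical job name.
        build job: 'klocwork-report-freestyle', wait: true, propagate: false
    }
}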

file parameter in declarative pipeline

I am developing a declarative pipeline and want to use a file parameter to read its content, but it's not working as expected:
parameters {
    file(fileLocation:'list.txt', description:'contains list of projects to be build')
}
I am getting the following error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 12: Invalid parameter "fileLocation", did you mean "description"? # line 12, column 14.
file(fileLocation:'release-list.txt', description:'contains list of projects to be build')
The following is another option, mentioned for the Pipeline: Basic Steps plugin:
readFile: Read file from workspace
Reads a file from a relative path (with root in current directory, usually workspace) and returns its content as a plain string.
file
Relative ( /-separated) path to file within a workspace to read.
Type: String
encoding (optional)
Type: String
It's working in a script step like:
def myfile = readFile('list.txt')
echo "${myfile}"
But how can it be used directly in a declarative script, like other basic steps such as dir?
The correct arguments for the file parameter are name and description. So it should be:
file(name:'list.txt', description:'contains list of projects to be build')
However, there's an open Jenkins issue dating back to 2015 about the file parameter not working for pipelines, so I don't think even this will solve your issue: https://issues.jenkins-ci.org/browse/JENKINS-27413
The following syntax is working:
parameters {
    file name:'list.txt', description:'contains list of projects to be build'
}
But the fileLocation parameter is still not accepted.
The syntax below is given in the Jenkins 2: Up & Running book, but it's not working:
parameters {
    file(fileLocation:'list.txt', description:'contains list of projects to be build')
}
Until the outstanding issues get fixed, I believe we may have to stick to freestyle mode and handle things either in a downstream pipeline job or within the same job, leveraging the needed plugin feature.
Here is my attempt, which seems to work irrespective of file type (yes, it supports binaries as well): https://i.stack.imgur.com/vH7mQ.png
${list.txt} will point to the right file in your case.
Take a look at the plug-in https://plugins.jenkins.io/file-parameters/.
This plug-in adds support for file parameters in your Jenkinsfile: https://plugins.jenkins.io/file-parameters/#plugin-content-usage-in-declarative-pipeline
parameters {
    base64File 'small'
    stashedFile 'large'
}
https://github.com/jenkinsci/file-parameters-plugin
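Based on the plugin's documented usage, the uploaded file can then be consumed with the withFileParameter wrapper, which exposes the upload as a temporary file through an environment variable named after the parameter. A minimal sketch (the parameter name LIST is just an example):
pipeline {
    agent any
    parameters {
        // a base64-encoded file parameter holding the list of projects to build
        base64File 'LIST'
    }
    stages {
        stage('build projects') {
            steps {
                // withFileParameter writes the upload to a temp file and puts its path in $LIST
                withFileParameter('LIST') {
                    sh 'cat "$LIST"'
                }
            }
        }
    }
}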

gradle - make a step 100% optional

We use Clover for code coverage testing, but it interferes with stack traces and error information. I want to be able to use cloverGenerateReport when doing automated builds via Jenkins, but to skip this step entirely when doing local builds.
I've tried the various suggestions from searches for 'gradle optional dependencies', but I can't seem to get Clover completely out of the way.
Suggestions?
You can use the onlyIf method:
cloverGenerateReport.onlyIf {
    project.hasProperty('enableClover') ? Boolean.valueOf(project.getProperty('enableClover')) : false
}
On the command line you can enable it by providing the project property:
gradle cloverGenerateReport -PenableClover=true
One solution would be to check if the environment variable "JENKINS_HOME" exists. If it does, then set cloverGenerateReport as a dependency of another task.
In your build.gradle:
def env = System.getenv()
if (env.containsKey('JENKINS_HOME')) {
    reportTask.dependsOn cloverGenerateReport
}
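The two ideas can also be combined by keying the onlyIf predicate directly off JENKINS_HOME instead of a project property; a small sketch, assuming Clover reports are only ever wanted on Jenkins:
// Skip the Clover report for local builds; run it automatically on Jenkins.
cloverGenerateReport.onlyIf {
    System.getenv('JENKINS_HOME') != null
}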
