FindBugs warnings not displayed in Jenkins Dashboard

I am using a Multibranch Pipeline job to trigger my build. One of the steps of the build is to run Sonar. After Sonar runs, the findbugs-result.xml file is created in the target/sonar directory.
I publish the results using the commands below in Groovy. The build shows that there is 1 FindBugs warning, but I do not see it in the Jenkins dashboard (FindBugs Warning portlet).
If I create a normal Freestyle job and try to do the same thing using a Post-build action, the results are visible on the Jenkins Dashboard.
bat "${env.M2_HOME}/bin/mvn sonar:sonar --settings ../HudsonSettings/settings.xml -B -U -P reporting-plugins"
step([$class: 'FindBugsPublisher', canComputeNew: false, canRunOnFailed: true, defaultEncoding: '', excludePattern: '', healthy: '', includePattern: '', isRankActivated: true, pattern: '**/target/sonar/findbugs-result.xml', unHealthy: ''])
Can anyone help?
Thanks and Regards
Saroj Gharat

By default, the Jenkins FindBugs plugin looks for files named findbugsXml.xml.
If your integration drops a report with a different filename, you need to specify that filename, findbugs-result.xml, explicitly (under Post-build action > Publish FindBugs analysis result > FindBugs results).
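In a pipeline job there is no post-build UI, so the equivalent fix is to pass the non-default filename via the publisher's pattern parameter. A minimal sketch, reusing the path from the question:
// Point the publisher at the Sonar-generated report instead of relying on
// the default findbugsXml.xml lookup.
step([$class: 'FindBugsPublisher',
      pattern: '**/target/sonar/findbugs-result.xml'])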

Related

Allure report generated from Jenkins job returns 404

I'm trying to use the Allure report plugin with Jenkins on CentOS 7 for a Java Maven project that uses TestNG and the Allure test adaptor. The tests run fine and allure-results are generated as expected. My build step in the Jenkinsfile is:
post {
    always {
        script {
            allure([
                commandline: 'allure',
                includeProperties: false,
                jdk: '',
                properties: [],
                reportBuildPolicy: 'ALWAYS',
                results: [[path: 'target/allure-results']]
            ])
        }
    }
}
I can see the little allure icon in front of the build and when I click on it I get routed to http://xxx.xx.x.xx/index.html#404 with the following information:
URI: /index.html
STATUS: 404
MESSAGE: Not Found
SERVLET: Stapler
In the Jenkins build logs I see the following:
$ /bin/allure generate "/home/path/to/job/target/allure-results" -c -o "/home/path/to/job/allure-report"
WARN: No current version specified. Use `allure switch <version>` to select version of the report.
Allure report was successfully generated.
Creating artifact for the build.
Artifact was added to the build.
Finished: SUCCESS
When I manually try generating the allure report using allure serve, I can view the HTML report in the browser. But it's returning 404 in the Jenkins job.

Publishing newman-reporter-htmlextra reports with Jenkins html publisher fails

I am running newman with newman-reporter-htmlextra in a Jenkins pipeline, generating an HTML report which I want to publish via the Jenkins HTML Publisher plugin.
This is the stage in the pipeline I'm using:
stage('Newman tests') {
    steps {
        script {
            dir("${JENKINS_HOME}/workspace/myproject") {
                sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
                junit "*.xml"
            }
        }
        publishHTML target: [
            allowMissing: false,
            alwaysLinkToLastBuild: false,
            keepAll: true,
            reportDir: '.',
            reportFiles: 'newman_result.html',
            reportName: 'Newman HTML Reporter'
        ]
    }
}
This runs and creates an entry "Newman HTML Reporter" in my Jenkins project.
However, when I open the report, it is empty.
Any ideas?
Many thanks in advance,
Christian
I guess you are accessing the wrong folder when you publish your HTML result.
You are not creating your file in the regular Jenkins workspace:
dir("${JENKINS_HOME}/workspace/myproject") {
sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
junit "*.xml"
}
After you leave the script{} you are back in the original Jenkins workspace, so reportDir: '.' is not the folder where your file is, which means no file = no HTML can be displayed.
You have 3 choices here:
Create the file in the regular Jenkins workspace
Point the HTML Publisher plugin at the correct folder
Put the HTML Publisher plugin inside the scope of your dir{}, as you did with your junit step (a sketch of this option follows below)
To find out easily which folder you are accessing in which scope, you can echo the working directory (e.g. echo pwd() or sh 'pwd'). Do this once in the scope of your dir{} and once in the scope of your plugin; then it should be clear that the reportDir of your plugin is wrong.
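As an illustration of the third option, here is a minimal sketch that keeps publishHTML in the same dir{} scope that created newman_result.html, so reportDir: '.' resolves to that folder (paths taken from the question):
dir("${JENKINS_HOME}/workspace/myproject") {
    sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
    junit '*.xml'
    // Publishing from inside dir{} means '.' is the folder that contains the report.
    publishHTML target: [
        allowMissing: false,
        alwaysLinkToLastBuild: false,
        keepAll: true,
        reportDir: '.',
        reportFiles: 'newman_result.html',
        reportName: 'Newman HTML Reporter'
    ]
}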

Jenkins: How to access copied artifact?

I'm trying to execute a jar file built in a different Jenkins job. For that I use the CopyArtifact plugin.
Copying the artifact seems to work. The following code
step([$class: 'CopyArtifact',
      projectName: '/BuildJob/master',
      filter: 'target/mytool.jar',
      target: './'])
finishes with
Copied 1 artifact from "BuildJob » master" build number 1
But how can I access this copied jar file? I tried the following:
sh(returnStdout: true, script: "java -jar mytool.jar")
but this ended up in the following error message:
Error: Unable to access jarfile mytool.jar
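A hedged guess: CopyArtifact keeps the artifact's relative path by default, so the copied jar most likely sits at target/mytool.jar rather than in the workspace root. A minimal sketch under that assumption, either referencing the copied path directly or flattening the directory structure during the copy (the flatten option is an assumption about the plugin's configuration):
// Option 1: run the jar from the path it was copied to.
sh 'java -jar target/mytool.jar'

// Option 2: drop the directory structure during the copy so the jar
// lands directly in the workspace root.
step([$class: 'CopyArtifact',
      projectName: '/BuildJob/master',
      filter: 'target/mytool.jar',
      flatten: true,
      target: './'])
sh 'java -jar mytool.jar'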

Cobertura code coverage report for jenkins pipeline jobs

I'm using the Pipeline plugin for Jenkins and I'd like to generate a code coverage report for each run and display it along with the pipeline UI. Is there a plugin I can use to do that (e.g. Cobertura, but it doesn't seem to be supported by Pipeline)?
There is a way to add a pipeline step to publish your coverage report, but it doesn't show in the Blue Ocean interface. It will show fine in the normal UI.
pipeline {
    agent any
    stages {
        ...
    }
    post {
        always {
            junit '**/nosetests.xml'
            step([$class: 'CoberturaPublisher', autoUpdateHealth: false, autoUpdateStability: false, coberturaReportFile: '**/coverage.xml', failUnhealthy: false, failUnstable: false, maxNumberOfBuilds: 0, onlyStable: false, sourceEncoding: 'ASCII', zoomCoverageChart: false])
        }
    }
}
Note that one of the parameters to the Cobertura plugin is the XML that it will use ('**/coverage.xml' in the example).
If you are using python, you will want to use something like:
nosetests --with-coverage --cover-xml --cover-package=pkg1,pkg2 --with-xunit test
Nowadays you can also use the cobertura command directly in a Jenkinsfile
stage ("Extract test results") {
cobertura coberturaReportFile: 'path-to/coverage.xml'
}
source: https://issues.jenkins-ci.org/browse/JENKINS-30700
The answer from hwjp is correct; however, there are extra parameters you can add to the command that are not easy to find.
Once you have installed the Cobertura plugin, you can find the cobertura step options in
Job Dashboard Page -> Pipeline Syntax -> Steps Reference
There's also a snippet generator which is really useful to get started at
Job Dashboard Page -> Pipeline Syntax
example command:
cobertura coberturaReportFile: 'coverage.xml', enableNewApi: true, lineCoverageTargets: '80, 60, 70'
enableNewApi is a good one to set to true, as the new API is much prettier :D
Setting coverage targets will automatically fail the job if the code coverage is too low.
Generate the report with the cobertura-report command line tool in the specified directory and attach the results as artifacts.
cobertura-report [--datafile file] --destination dir [--format html|xml] [--encoding encoding] directory [--basedir dir]
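A hedged sketch of wiring that into a Jenkinsfile, assuming cobertura-report is on the agent's PATH, the coverage data file is at its default location (cobertura.ser), and src/main/java is the source directory:
// Generate an HTML report from the coverage data and keep it as a build artifact.
sh 'cobertura-report --datafile cobertura.ser --destination coverage-report --format html src/main/java'
archiveArtifacts artifacts: 'coverage-report/**'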

Aggregating results of downstream jobs shows ‘no tests’ in Jenkins

After running the main project, every downstream project has test results, but the "Latest Aggregated Test Result" shows "no tests". How do I configure Jenkins so that all the test results display in the aggregated list?
Aggregating downstream test results is not obvious, and not well documented. The steps below are synthesized from How To Aggregate Downstream Test Results in Hudson.
To aggregate, you need to archive an artifact in the upstream job, fingerprint the artifact, and then pass the artifact from the upstream job to the downstream job. In my own words:
the shared, finger-printed artifact "ties" the jobs together and allows the upstream job to see the downstream test results
To show this, we can make a very simple flow between two free-style jobs, Job_A and Job_B.
Upstream
Job_A will run and create an artifact named some_file.txt. We're not aggregating the value/contents of some_file.txt, but it needs to be finger-printed and so it cannot be empty. Job_A will then trigger a build of Job_B.
Job_A's configuration:
Execute shell:
echo $(date) > some_file.txt
Archive the artifacts:
set Files to archive to the file, some_file.txt
Aggregate downstream test results:
check the Automatically aggregate... option
Build other projects:
set Projects to build to Job_B
Record fingerprints of files to track usage:
set Files to fingerprint to some_file.txt
Downstream
Job_B will run, copy the file some_file.txt from the upstream job that triggered this run, echo out some mock test results to an XML file, then publish that XML result file. It's the published results that will get aggregated back into Job_A.
Job_B's configuration:
Copy artifacts from another project:
Project name: Job_A
Which build: Upstream build that triggered this job
Artifacts to copy: some_file.txt
Fingerprint Artifacts: ✔
Execute shell:
XML_VAR='<testsuite tests="3">
    <testcase classname="foo" name="ASuccessfulTest"/>
    <testcase classname="foo" name="AnotherSuccessfulTest"/>
    <testcase classname="foo" name="AFailingTest">
        <failure type="ValueError">Not enough foo!!</failure>
    </testcase>
</testsuite>'
echo "$XML_VAR" > results.xml
Publish JUnit test result report:
set Test report XMLs with the file, results.xml
This should be sufficient to have Job_A aggregate Job_B's test results. I'm not sure if there's a way/plugin to change Job_A's status based on downstream results (like if Job_B failed, then Job_A would retroactively fail).
For a Scripted Pipeline, say I have:
one upstream job - mainJob
two downstream jobs - downStreamJob1 and downStreamJob2.
To aggregate test results from downStreamJob1 and downStreamJob2, here is what the Jenkinsfiles will look like:
downStreamJob1 Jenkinsfile - Archive and fingerprint the test result xml
archiveArtifacts allowEmptyArchive: true,
    artifacts: '**/test-results/test/*.xml',
    fingerprint: true, defaultExcludes: false
downStreamJob2 Jenkinsfile - Archive and fingerprint the test result xml
archiveArtifacts allowEmptyArchive: true,
    artifacts: '**/output/junit-report/*.xml',
    fingerprint: true, defaultExcludes: false
The artifacts path uses an Ant fileset pattern to grab all the test report XML files.
mainJob Jenkinsfile - Copy artifact from each of the downstream jobs
copyArtifacts filter: 'build/test-result/test/*.xml', fingerprintArtifacts: true, projectName: 'downStreamJob1', selector: lastCompleted()
copyArtifacts filter: 'output/junit-report/*.xml', fingerprintArtifacts: true, projectName: 'downStreamJob2', selector: lastCompleted()
The best way to make sure you have the right path for filter and artifacts is to navigate to the artifact in each downstream job using the URL $BUILD_URL/artifact/, where BUILD_URL is the full URL of that build, e.g. http://server:port/jenkins/job/foo/15/
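One hedged addition: after copying, mainJob still needs to publish the copied XML so the downstream results actually show up on the upstream build. Assuming the copied reports land under the filter paths above, a junit step covering them would do it:
// Publish the XML copied from both downstream jobs on the upstream build.
junit allowEmptyResults: true, testResults: '**/*.xml'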
