Jenkins plugin to monitor custom metrics

I have a build system where one of the tools in the toolchain analyzes the code and performs static code analysis. The metrics are output in an XML file similar to the one below:
<metrics>
  <metric name="metric_a">65</metric>
  <metric name="metric_b">32</metric>
  <metric name="metric_c">42</metric>
</metrics>
What I want in Jenkins is to be able to parse this file, visualize the metrics over time, and set thresholds so that the build fails if, for example, metric_a is below a certain value.
I've been looking for a suitable plugin, but the closest I've found is the Warnings Plugin. However, the Warnings Plugin parses logs and aggregates the results by itself rather than parsing an actual file with the final metrics.
Are there any other plugins suitable for handling "custom metrics", or what is my best option?

For visualization, you could go for the Measurement Plots Plugin.
For failing the build when a custom metric falls below a threshold, I can only think of making the metrics' name/format identical to what some existing plugin expects, e.g. FindBugs or PMD, and configuring that plugin's thresholds accordingly.
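If no existing plugin's format fits, a small Groovy build step can enforce the threshold directly. Below is a minimal sketch, assuming the metrics file is written as metrics.xml to the workspace root; the script name and the threshold of 40 are made up for illustration:
// check_metrics.groovy - hypothetical sketch, run as an "Execute Groovy script"
// build step or via "groovy check_metrics.groovy"; a non-zero exit fails the build
def threshold = 40  // illustrative threshold

// XmlSlurper lives in groovy.util, which Groovy imports by default
def metrics = new XmlSlurper().parse(new File('metrics.xml'))
def value = metrics.metric.find { it.@name == 'metric_a' }.text().toInteger()

if (value < threshold) {
    println "metric_a (${value}) is below the threshold (${threshold})"
    System.exit(1)
}
println "metric_a (${value}) meets the threshold (${threshold})"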

Related

What version of Checkstyle is used by the Warnings Next Generation plugin 6.1.1 in Jenkins?

I am trying to configure static code analysis on my Jenkins server for a Maven project. I want to use Google checks for Checkstyle-based code analysis. To use the appropriate Google checks XML file, I want to know the version of Checkstyle used in the Warnings Next Generation plugin 6.1.1.
Can anyone please help me with this?
The warnings-ng plugin does not analyze your code; it just creates a visualization of Checkstyle analysis results.
So you need to perform the analysis with some other tool (e.g. Maven) first, and after that warnings-ng can show you the results.
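In other words, the Checkstyle version comes from your Maven build, not from warnings-ng. A minimal Pipeline sketch of that split, assuming the maven-checkstyle-plugin is configured in the POM and writes its report to the default location:
// Hypothetical Pipeline stage: Maven runs the analysis, warnings-ng records it
stage('Checkstyle') {
    steps {
        sh 'mvn checkstyle:checkstyle'  // writes target/checkstyle-result.xml
        recordIssues tools: [checkStyle(pattern: '**/checkstyle-result.xml')]
    }
}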

How to exclude the analysis of all other languages and only analyze the required one in SonarQube

I am using SonarQube 5.1.2 and Maven 3.0.4. I run Sonar analysis for one of my projects and it gives me all the results. But the issue is that it shows the analysis for all languages, and I need it only for Java. Is there a way to exclude the others and keep only Java?
The property sonar.language is what you're looking for; see Analysis Parameters.
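For a Maven-driven analysis this can be passed on the command line, e.g. (the property is from the SonarQube 5.x docs; note that later versions deprecate it):
$ mvn sonar:sonar -Dsonar.language=java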

Is there a Jenkins code coverage plugin for Perl?

We have a software product which is written in Perl. We have set up a Jenkins build for it which runs the automated tests. To get the coverage values we run the unit tests under Devel::Cover and get the coverage values for each module. But I find it very difficult to go and check the coverage values for each module individually, and it also doesn't tell us where we are heading as far as coverage is concerned (increasing/decreasing trend). So what I'd like is a history or graph of those values, so that I can have a better picture of where our coverage values are and how they compare to, say, a month ago.
Is there a plug-in available that would do this thing?
I searched the Internet and found some plugins like Cobertura and Emma that I hoped would achieve that purpose, but they are for Java. Is there a good alternative for Perl?
Also, would it be worth it to develop our own tool to display such plots? We already have the coverage data in Jenkins, and all we need is just to get that data and plot it.
With luck, you can use Devel::Cover's HTML output formatting, which is put in the cover_db directory by default when you run:
$ cover -test
Then install and configure the HTML Publisher Plugin.
Configure it like so:
HTML directory: cover_db
index page: coverage.html
Report title: Coverage Report
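For reference, a Pipeline sketch of the same setup, assuming the publishHTML step provided by the HTML Publisher Plugin:
// Hypothetical Pipeline equivalent of the freestyle configuration above
sh 'cover -test'  // Devel::Cover writes its HTML report into cover_db by default
publishHTML([
    reportDir: 'cover_db',
    reportFiles: 'coverage.html',
    reportName: 'Coverage Report',
    allowMissing: false,
    keepAll: true
])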
I have written up my notes for setting this up, and they are available at:
https://logiclab.jira.com/wiki/display/OPEN/Continuous+Integration#ContinuousIntegration-CoverageTests
I am currently looking into the Jenkins Clover plugin, since Devel::Cover can also output in Clover format. I will update my notes soon.
With luck, you can also use my CPAN distribution Task::Jenkins to install all of the CPAN dependencies needed for the setup on the referenced wiki page.
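Installing it is the usual CPAN one-liner (assuming a cpanm client on the build machine):
$ cpanm Task::Jenkins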

How can I track values between Jenkins Builds (Static Analysis)

I'm running a number of static analysis tools and I want to track the results from build to build. For example, if a commit to a branch increases the number of security vulnerabilities, I want to send an email to the committer. I know there are plugins like Sonar and Analysis Collector, but they don't cover all of the areas of analysis I want and they don't seem to have the ability to trigger actions based on build trends (correct me if I'm wrong).
You can use the Groovy Postbuild Plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Groovy+Postbuild+Plugin
It lets you extract data (such as the number of vulnerabilities detected) from the current build's log with num_vul = manager.getLogMatcher(regexp).
You can then compare that to previous builds by extracting the same info from their logs, e.g.:
currentBuildNumber = manager.build.number
manager.setBuildNumber(currentBuildNumber - 1)
prev_num_vul = manager.getLogMatcher(regexp)
Then, if the number of vulnerabilities has gone up, I would call manager.buildFailure(), which sets the build status to FAILURE, and have the next post-build step use the Email-ext plugin, which allows you to send email to the committer in the event of a failure.
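Put together, a runnable version of that idea might look like the sketch below. The log line format and the regular expression are made up for illustration; getLogMatcher, setBuildNumber and buildFailure are the Groovy Postbuild calls used above:
// Hypothetical Groovy Postbuild script; assumes the analysis step prints
// a log line like "VULNERABILITIES: 12" in every build
def regexp = /.*VULNERABILITIES: (\d+).*/

def currentNumber = manager.build.number
def m = manager.getLogMatcher(regexp)
def current = m?.matches() ? m.group(1).toInteger() : null

manager.setBuildNumber(currentNumber - 1)  // point the manager at the previous build
m = manager.getLogMatcher(regexp)
def previous = m?.matches() ? m.group(1).toInteger() : null

if (current != null && previous != null && current > previous) {
    manager.buildFailure()  // FAILURE status lets Email-ext notify the committer
}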
I would recommend the SonarQube tool, which does just what you describe. You mention that you already looked at it, but maybe you missed the Notifications feature or the Build Breaker Plugin. There are more SonarQube features centered around Jenkins integration. SonarQube is free to use.
If you are still missing something, it might be worthwhile asking specifically how that aspect could be covered by SonarQube. Just my two cents.

How to display performance test results on Jenkins

We've written a framework to test the performance of our Java application (none of the existing frameworks, e.g. JMeter, were appropriate). The framework produces various metrics, e.g. mean/min/max transactions per second.
We'd like each Jenkins build to display these metrics so that we can keep track of whether a commit has improved performance or not.
I can't figure out how to do this.
One idea is to modify our performance test framework to output an HTML file, then somehow make Jenkins display/link to it on the build results page.
Any advice gratefully received.
The Performance Plugin can show the results of JMeter and JUnit tests in a nice, graphical fashion, and the plugin page has a description of how to use it.
This is an open-source plugin hosted on GitHub. The JUnit and JMeter parsers are already there, but you can implement your own just by subclassing PerformanceReportParser. It's pretty easy; you can just fork the repo and start your implementation.
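If your framework can emit JMeter-format results instead, newer versions of the Performance Plugin also expose a Pipeline step; a sketch, assuming the step name perfReport and a results file at results.jtl:
// Hypothetical Pipeline usage of the Performance Plugin's report step
perfReport sourceDataFiles: 'results.jtl'  // parsed with the built-in JMeter parser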
I agree that it is hard (if not impossible) to squeeze all the information into standard formats like JUnit. They are good for quick identification of problems. Once you know there is a problem, you need more information, which is usually free-form or custom-formatted to fit your particular needs. So we use both: JUnit, which can be immediately processed by Jenkins to decide if the build is stable or not, draw the nice trend graph, etc. We also produce an HTML report that is much more detailed.
Now to your immediate question: you can simply archive your HTML file as an artifact (there is a standard post-build step to do that). Then a link to it will be displayed among the artifacts for the build. There are permalinks to the latest artifacts and latest successful build artifacts:
http://[server]/job/[job_name]/lastCompletedBuild/artifact/foo.html
http://[server]/job/[job_name]/lastSuccessfulBuild/artifact/foo.html
You may bookmark those links and have quick and easy one-click access to your results.
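In Pipeline syntax, archiving the report is a single step (foo.html stands in for whatever file your framework actually writes):
// Hypothetical Pipeline step: archive the report so the permalinks above resolve
archiveArtifacts artifacts: 'foo.html'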
You could use the HTML Publisher Plugin to publish the resulting HTML page. That would be pretty straightforward.
If you want better integration you could try to create output that follows the same format JMeter produces, and use the Performance Plugin.
For the best result you could take Ɓukasz's advice and modify the Performance Plugin to your needs. That requires the most effort on your part, of course.
