How to reuse fxcop.xml in SonarQube - Jenkins

We run FxCop analysis through NAnt/Jenkins. A SonarQube C# analysis is then triggered and completes successfully.
We'd like to reuse the fxcop.xml result file from Jenkins in Sonar's analysis.
We tried this configuration with no luck:
sonar.fxcop.mode=reuseReport
sonar.fxcop.reportsPaths=fxcop.xml
SonarQube asks for the path to FxCopCmd.exe, and if we supply it, Sonar runs the FxCop analysis a second time. We don't want that.

According to this thread, sonar.fxcop.mode is no longer supported.
Quoted from the thread:
You can skip the execution of FxCop on a specific project by using
another Quality Profile for it, that does not contain any FxCop rule.
Indeed, it is important for the evolution of a project's technical
debt to be trackable over time. Changes in the rules that are applied
during the analysis have an obvious impact on the technical debt, and
therefore should be tracked.
The "sonar.fxcop.mode" property did not allow that, and was therefore
removed. For example, with the reuseReports mode, you could launch
just 1 or 2 rules, whereas in SonarQube all FxCop rules are enabled.
Skipping the FxCop analysis entirely when all FxCop rules are enabled in
SonarQube is obviously misleading.
The reuseReports mode should not be required, as SonarQube is able to
drive FxCop's execution.
So it seems FxCop has to be started directly by SonarQube.
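Given that, the two supported options are letting SonarQube drive FxCop itself, or assigning a Quality Profile with no FxCop rules to skip it entirely. Below is a minimal sketch of the analysis properties for the first option, assuming the sonar-fxcop plugin; the exact property names vary between plugin versions, so check the documentation for the plugin you have installed.

```properties
# sonar-project.properties - letting SonarQube drive FxCop itself.
# Property names below are assumptions based on the sonar-fxcop plugin
# and may differ in your plugin version.
sonar.projectKey=my:project
sonar.projectName=My Project
sonar.cs.fxcop.fxCopCmdPath=C:/Program Files (x86)/Microsoft Fxcop 10.0/FxCopCmd.exe
sonar.cs.fxcop.assembly=MyProject/bin/Release/MyProject.dll
```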

Related

Can you ask for user input for TFS 2015 CI build?

This seems simple enough, but I can't find a solution for this online.
I am integrating SonarQube into our build definitions that get triggered on check in. I want the version SonarQube uses to be tied back to the project number defined by the business side of things.
Ideally, I would like to be able to prompt the user for input. When you go to check in and it kicks off the build, it would ask you for the project number to be used as the version for SonarQube. Is this something TFS 2015 supports?
User input for build definitions
As far as I know, build definitions that are not manually triggered do not prompt for user input. A prompt allowing users to set build variables is shown for manually triggered builds from the VSTS web page.
SonarQube project version
I would recommend against using the build or assembly version in your build tasks. The SonarQube concept of a version is quite different from the build concept: SonarQube uses versions as a baselining mechanism, i.e. to determine the leak period. If you bump the version number often, the leak period will be too short to be actionable.
I'd recommend keeping the SonarQube project version in sync with your release schedule instead.
The short answer to this question is no, there is no way to prompt for input on a non-manually triggered CI build.
Here's what we did to work around this:
I wrote a PowerShell script that reads a config file and sets the values as environment variables exposed to later build steps; those variables are then what get specified in the Sonar Begin Analysis build task. I packaged the script as a custom build task that reads a "sonar.config" file. All we have to do is add a "sonar.config" file to each solution we want Sonar analysis for, defining the key, name and version for the project; the build task then populates the necessary environment variables as the first step of the build.
So not a perfect solution, but it gets the job done without us having to add a lot of extra code to our solutions.
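The actual custom build task was written in PowerShell; the following is a hypothetical Python sketch of the same idea, just to illustrate the mechanism. The file name, keys, and regex-free key=value format are assumptions; the `##vso[task.setvariable]` logging command is the standard TFS/VSTS way to expose a variable to later build steps.

```python
# Hypothetical sketch of the "sonar.config" reader described above.
# (The original was a PowerShell custom build task; keys and file
# format here are assumptions.)

def parse_sonar_config(text):
    """Parse simple key=value lines, ignoring blanks and # comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

def emit_build_variables(values):
    """Print TFS/VSTS logging commands that expose each value as a
    build variable available to later steps (e.g. Sonar Begin Analysis)."""
    for key, value in values.items():
        print(f"##vso[task.setvariable variable={key};]{value}")

if __name__ == "__main__":
    sample = """
    # sonar.config checked in next to the solution
    SonarProjectKey=my:project
    SonarProjectName=My Project
    SonarProjectVersion=4.2
    """
    emit_build_variables(parse_sonar_config(sample))
```

The first build step runs this reader; every later step in the definition then sees the variables without any per-build prompting.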

SonarQube C# Runner and Visual Studio report different CA results

I am trying to set up SonarQube to report on our C# projects. I have created a new Quality Profile in SQ that includes only the Code Analysis rules (225 in total), and I have made sure these rules are in sync with the projects in Visual Studio.
When SonarQube analysis is run, different results are reported. Visual Studio tends to pick up more than the SonarQube runner.
For example, here are 3 results from SonarQube about rule CA1704:
and for the same solution in VS, there are many more:
The 3 that I have highlighted are the 3 that SonarQube is picking up.
This is the same for a number of different rules. I want SQ and VS to report the same results. I run the analysis on a TFS (vNext) build; can I simply pass the results from the build to SonarQube? If I don't have to run the analysis twice, great.
Do I need to modify the SonarQube rules themselves? Has anyone experienced this problem before?
UPDATE
I have enabled verbose logging on the Sonar publish step, and I have found that it is skipping some of the issues found:
2016-01-08T14:33:53.5086817Z 14:33:53.430 DEBUG - Skipping the FxCop issue at line 10 which has no associated file.
2016-01-08T14:33:53.5243155Z 14:33:53.430 DEBUG - Skipping the FxCop issue at line 19 which has no associated file.
There are lots of these for every project in my solution and the gap matches exactly, e.g. in the above case, VS reports 47 issues but SonarQube reports 45. I cannot yet find a correlation and Google doesn't have much info on it. This is going to be a big problem as one of my solutions has 18.5k issues but SonarQube is only reporting 13k.
Are the CA1704 violations that aren't showing up in SonarQube for classes, or for members declared as fields rather than properties? If so, you've run into one of the more "interesting" behaviours of the FxCop plug-in for SonarQube: it ignores any violation that does not include a file and line number (if you're interested in the details, see the relevant source file at https://github.com/SonarSource/sonar-fxcop-library/blob/master/src/main/java/org/sonar/plugins/fxcop/FxCopSensor.java, whose current version is c518065).
Line numbers in FxCop reports are taken from the PDB for the target assembly. However, the PDB only contains lines for executable code, so violations that aren't associated with executable lines of code (or at least with a method that FxCop can tie to its first line) won't have a file name or line number in the FxCop report. These will all end up getting ignored by SonarQube.
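To check whether this explains your gap, you can count how many issues in the raw fxcop.xml report carry no location at all, mirroring the check in FxCopSensor.java. The sketch below is an assumption-laden illustration: it treats the `Path`, `File`, and `Line` attributes on `<Issue>` elements as the location fields (which matches the FxCop XML report format), but your report layout may differ.

```python
# Sketch: quantify how many FxCop issues SonarQube will skip because
# they carry no file/line information. Attribute names (Path/File/Line)
# follow the FxCop XML report format; verify against your own report.
import xml.etree.ElementTree as ET

def count_issues(report_xml):
    """Return (with_location, without_location) counts for a report string."""
    root = ET.fromstring(report_xml)
    with_loc = without_loc = 0
    for issue in root.iter("Issue"):
        if issue.get("Path") and issue.get("File") and issue.get("Line"):
            with_loc += 1
        else:
            without_loc += 1  # this one gets dropped by the plugin
    return with_loc, without_loc

if __name__ == "__main__":
    sample = """<FxCopReport>
      <Issue Path="src" File="Foo.cs" Line="10">CA1704 on a method</Issue>
      <Issue>CA1704 on a field - no PDB line available</Issue>
    </FxCopReport>"""
    print(count_issues(sample))  # (1, 1): one kept, one skipped
```

If the `without_location` count matches the VS-minus-SonarQube difference, you have confirmed the cause.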
If you're dependent on SonarQube reporting of your FxCop results, you may wish to consider submitting a bug report.

Enabling PMD-style analysis in Fortify

We run Fortify to check for security vulnerabilities and Sonar for code cleanup.
I would like to know whether we can enable static code analysis in Fortify and get rid of Sonar/PMD/FindBugs etc.
I have a Java project that is checked for security vulnerabilities with Fortify SCA. I also use Sonar for code quality and cleanup.
Someone told me that I can configure Sonar-style rules in Fortify, so that I can avoid running Sonar and save build time.
Basically, I want to configure the Sonar rule set in Fortify, so that Fortify checks both the security vulnerabilities and the code quality.
Thanks in advance.
The default Fortify ruleset includes many "sonar-like" style checks. For example, Null Dereference or Poor Exception Handling. It does not have ALL of the Sonar checks, but you could make them yourself using Custom Rules.
See this post for an example:
How to write Fortify custom rules language specific?
And here is one based on Spring MVC:
http://blog.gdssecurity.com/labs/2013/12/2/building-fortify-custom-rules-for-spring-mvc.html
Fortify also bundles FindBugs, and contributes to the FindBugs open-source code.

How can I track values between Jenkins Builds (Static Analysis)

I'm running a number of static analysis tools and I want to track the results from build to build. For example, if a commit to a branch increases the number of security vulnerabilities, I want to send an email to the committer. I know there are plugins like Sonar and Analysis Collector, but they don't cover all of the areas of analysis I want and they don't seem to have the ability to trigger actions based on build trends (correct me if I'm wrong).
You can use the Groovy Postbuild Plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Groovy+Postbuild+Plugin
It lets you extract data (such as the number of vulnerabilities detected) from the current build's log, e.g.:
matcher = manager.getLogMatcher(regexp)
num_vul = matcher ? matcher.group(1).toInteger() : 0
You can then compare that to previous builds by pointing the manager at an earlier build and extracting the same information from its log:
currentBuildNumber = manager.build.number
manager.setBuildNumber(currentBuildNumber - 1)
prev_matcher = manager.getLogMatcher(regexp)
prev_num_vul = prev_matcher ? prev_matcher.group(1).toInteger() : 0
Then, if the number of vulnerabilities has gone up, call manager.buildFailure(), which sets the build status to FAILURE, and have the next post-build step be the Email-ext plugin, which can send email to the committer in the event of a failure.
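The Groovy postbuild above is the real Jenkins mechanism; the following Python sketch just illustrates the underlying log-scraping logic in isolation. The log message format and regex are assumptions, so adapt them to whatever your analysis tool actually prints.

```python
# Generic sketch of the trend check the Groovy postbuild performs:
# scrape a count out of each build log, compare, and signal failure.
# (Log format and regex below are assumptions, not a real tool's output.)
import re

PATTERN = re.compile(r"Found (\d+) security vulnerabilities")

def vulnerability_count(log_text):
    """Return the reported vulnerability count, or None if absent."""
    match = PATTERN.search(log_text)
    return int(match.group(1)) if match else None

def should_fail(current_log, previous_log):
    """Fail the build when the count increased since the last build."""
    current = vulnerability_count(current_log)
    previous = vulnerability_count(previous_log)
    if current is None or previous is None:
        return False  # can't compare - don't fail spuriously
    return current > previous

if __name__ == "__main__":
    prev = "Found 3 security vulnerabilities"
    curr = "Found 5 security vulnerabilities"
    print(should_fail(curr, prev))  # True: count went up, fail the build
```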
I would recommend the SonarQube tool, which does just what you describe. You mention that you already looked at it, but maybe you missed the Notifications feature or the Build Breaker Plugin. There are more SonarQube features centered around Jenkins integration. SonarQube is free to use.
If you are still missing something, it might be worthwhile asking specifically how that aspect could be covered by SonarQube. Just my two cents.

More Violations in Sonar than in Jenkins

We are using only FxCop and StyleCop rules in Jenkins and Sonar.
But we are getting a higher number of violations on the Sonar dashboard than in Jenkins.
Jenkins is using 318 rules (FxCop and StyleCop combined), whereas Sonar is using only 130 rules (FxCop and StyleCop combined).
Yet we still get more violations on the Sonar dashboard than in Jenkins.
Can anyone tell me why this is happening?
Also, Jenkins shows the FxCop and StyleCop results separately, whereas Sonar merges and displays them together. Does anyone know how we can separate the results?
There can be several explanations for why you get more violations in Sonar:
the FxCop and StyleCop rules are not configured with the same parameters (thresholds, ...) in Sonar and Jenkins. This is the first thing you should check.
maybe you also have Gendarme rules activated in your quality profile on Sonar? (this is the case if you're using the default "Sonar way" profile)
you may also have lots of other Sonar violations (which don't come from external tools like FxCop), ranging from detected duplications to design issues.
And to answer your last question: there's currently no way in the Sonar UI to separate violations based on the tool that generated them. What matters most to Sonar users is getting the whole list of violations, whatever tool generated them.
