I have the following environment:
We use Microsoft Team Foundation Server 2013 to build our system daily; it consists of hundreds of C# projects.
Our code quality department has identified a set of 194 C# code analysis rules whose warnings we must keep to a minimum.
The compilation produces around 80,000 code analysis warnings that belong to those 194 rules.
10-20 developers update the system and check in source files every day.
I have the following business requirement:
Report the progress of the code rule warnings on a daily basis in the form of charts and tables.
The objective is to monitor and control the number of code warnings on a daily basis.
According to https://learn.microsoft.com/en-us/vsts/report/sql-reports/table-reference-build-project, compile errors are saved in a fact table called FactBuildProject.
I am not sure about:
Are all my 80,000 warnings saved in the TFS warehouse database every day?
Generally, the warnings are saved with the builds.
For XAML builds, the number of compiler warnings is stored in the TFS warehouse (BuildProjectView).
So, you can get the number of warnings from the BuildProjectView in the TFS warehouse.
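Once you have a daily warning count from BuildProjectView, turning it into the "progress" numbers the requirement asks for is just a day-over-day delta. A minimal sketch (the sample data below is made up, not from a real warehouse):

```python
# Sketch: turn daily warning counts (e.g. queried from BuildProjectView)
# into a day-over-day progress table suitable for a chart or report.

def warning_trend(daily_counts):
    """daily_counts: list of (date_string, warning_count), ordered by date.
    Returns a list of (date, count, delta_from_previous_day)."""
    trend = []
    previous = None
    for date, count in daily_counts:
        delta = 0 if previous is None else count - previous
        trend.append((date, count, delta))
        previous = count
    return trend

counts = [("2016-01-04", 80000), ("2016-01-05", 79650), ("2016-01-06", 79900)]
for date, count, delta in warning_trend(counts):
    print(date, count, "%+d" % delta)
```

A positive delta means new warnings were introduced that day, which is exactly the condition you would want the daily report to flag.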
UPDATE:
You cannot find the specific warning messages or IDs in the TFS warehouse.
As I mentioned above, the warnings are saved with the builds, so we can try checking the build logs with the TFS API. Reference this thread: How to fetch Build Warning from TFS MS Build or TFS API
You can get the build list first, then get the logs in a loop.
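Once you have the log text for each build, tallying warnings per rule ID is straightforward. A small sketch; the regex assumes the usual MSBuild warning format (e.g. `Foo.cs(10,5): warning CA1704: ...`), and the sample log below is invented:

```python
import re
from collections import Counter

# Matches MSBuild-style code analysis warnings such as:
#   Foo.cs(10,5): warning CA1704: Correct the spelling of ...
WARNING_RE = re.compile(r"warning (CA\d{4})")

def count_warnings_by_rule(log_text):
    """Tally code analysis warnings per rule ID in one build log."""
    return Counter(WARNING_RE.findall(log_text))

sample_log = """\
Foo.cs(10,5): warning CA1704: Correct the spelling of 'Foo'.
Bar.cs(19,1): warning CA1704: Correct the spelling of 'Bar'.
Baz.cs(3,2): warning CA2000: Dispose objects before losing scope.
"""
print(count_warnings_by_rule(sample_log))
```

Running this over the log of each build in the loop gives you per-rule counts that the warehouse alone cannot provide.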
Related
We currently use Visual Studio 2019 + TFS 2018 with DevArt ReviewAssistant for peer review. I would like to know how many lines of code have been changed in order to keep our reviews small, but neither VS nor ReviewAssistant seems to be able to do this.
Do you know a tool that works with VS2019/TFS2018 that can do that?
You can use a Code Churn Report to achieve this.
You can report on the software quality by using the Code Churn and Run
Coverage perspectives from the SQL Server Analysis Services cube for
Team Foundation Server. By using these perspectives, you can view just
the measures, dimensions, and attributes that are associated with the
changes in lines of code and the extent to which code is covered in
builds and test runs.
You should be able to create the report based on team project and a specific date (build/release date) to view the lines of code added, deleted, and modified.
Detailed tutorials for your reference:
Analyze and report on code churn and coverage using the code churn
and run coverage perspectives
Code Churn tables
Getting Started with the TFS Data Warehouse
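As a rough illustration of what the churn report computes, here is how per-file churn rows could be rolled up into a per-changeset total. The row shape mirrors the LinesAdded/LinesDeleted/LinesModified measures described in the Code Churn tables documentation, but the sample rows are made up:

```python
# Sketch: aggregate per-file churn rows into per-changeset totals,
# the "how big is this change to review?" number.

def total_churn(rows):
    """rows: iterable of (changeset_id, added, deleted, modified).
    Returns {changeset_id: total_lines_changed}."""
    totals = {}
    for changeset, added, deleted, modified in rows:
        totals[changeset] = totals.get(changeset, 0) + added + deleted + modified
    return totals

rows = [
    (101, 120, 30, 15),   # Foo.cs
    (101, 10, 0, 5),      # Bar.cs
    (102, 0, 200, 0),     # Baz.cs (code removed)
]
print(total_churn(rows))  # {101: 180, 102: 200}
```

If a changeset's total exceeds your review-size threshold, that is the signal to split the review.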
I am trying to set up SonarQube to report on our C# projects. I have created a new Quality Profile in SQ which only includes the Code Analysis rules (225 in total). I have made sure that these rules are in sync with the projects in source in Visual Studio.
When SonarQube analysis is run, different results are reported. Visual Studio tends to pick up more than the SonarQube runner.
For example, here are 3 results from SonarQube about rule CA1704:
and for the same solution in VS, there are many more:
The 3 that I have highlighted are the 3 that SonarQube is picking up.
This is the same for a number of different rules. I want SQ and VS to report the same results. I run analysis on a TFS build (vNext); can I simply pass the results from the build to SonarQube? I mean, if I don't have to run it twice, then great.
Do I need to modify the SonarQube rules themselves? Has anyone experienced this problem before?
UPDATE
I have enabled verbose logging on the Sonar publish step and I have found that it is skipping some of the issues found:
2016-01-08T14:33:53.5086817Z 14:33:53.430 DEBUG - Skipping the FxCop issue at line 10 which has no associated file.
2016-01-08T14:33:53.5243155Z 14:33:53.430 DEBUG - Skipping the FxCop issue at line 19 which has no associated file.
There are lots of these for every project in my solution, and the gap matches exactly; e.g. in the above case, VS reports 47 issues but SonarQube reports 45. I cannot yet find a correlation and Google doesn't have much info on it. This is going to be a big problem, as one of my solutions has 18.5k issues but SonarQube is only reporting 13k.
Are the CA1704 violations that aren't showing up in SonarQube for classes, or for members that are declared as fields as opposed to properties? If so, you've run into one of the more "interesting" behaviours of the FxCop plug-in for SonarQube: it ignores any violations that do not include a file and line number (see the relevant source file at https://github.com/SonarSource/sonar-fxcop-library/blob/master/src/main/java/org/sonar/plugins/fxcop/FxCopSensor.java, current version c518065, for the details if you're interested).
Line numbers in FxCop reports are taken from the PDB for the target assembly. However, the PDB only contains lines for executable code, so violations that aren't associated with executable lines of code (or at least with a method that FxCop can tie to its first line) won't have a file name or line number in the FxCop report. These will all end up getting ignored by SonarQube.
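You can estimate how many issues SonarQube will drop by counting the FxCop report entries that lack a line number. A sketch, assuming the FxCop XML report's `<Issue>` elements carry Path/File/Line attributes only when a source location is known (the sample report below is invented):

```python
import xml.etree.ElementTree as ET

# Sketch: count FxCop issues with and without a source location,
# to predict how many SonarQube's FxCop plug-in will skip.

def split_issues(report_xml):
    """Returns (issues_with_line, issues_without_line) for an FxCop report."""
    root = ET.fromstring(report_xml)
    with_line = without_line = 0
    for issue in root.iter("Issue"):
        if issue.get("Line") is not None:
            with_line += 1
        else:
            without_line += 1   # these get ignored by SonarQube
    return with_line, without_line

sample = """\
<FxCopReport>
  <Issue Path="src" File="Foo.cs" Line="10">Rename member 'foo'.</Issue>
  <Issue>Rename field 'bar'.</Issue>
</FxCopReport>
"""
print(split_issues(sample))  # (1, 1)
```

If the "without line" count matches your observed gap (e.g. 47 vs 45), that confirms the skipped-issue behaviour is the whole explanation.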
If you're dependent on SonarQube reporting of your FxCop results, you may wish to consider submitting a bug report.
Our project group stored binary files of the project that we are working on in an SVN repository for over a year. In the end the repository grew out of control; taking backups of the SVN repo became impossible at one point, since each binary that is checked in is around 20 MB.
Now we have switched to TFS. We are not responsible for backing the repository up; our IT team takes care of it, and we have more network and storage capacity for backups because of that. But we still want to decide what to do with the binaries. As far as I know, TFS stores deltas, but for binary files the deltas will be huge, and we might end up reaching our disk space quota one day. So I would like to plan things better from the start; I don't want to get caught in a bad situation when it's too late to fix the problem.
I would prefer not keeping builds in source control, but our project group insists on keeping a copy of every binary for reproducing the problems that we see in the production system. I can't get them to fetch the source code from TFS, build it and create the binary, because according to them it is not straightforward.
Does TFS offer a better build versioning method? If someone can share some insight I'd really be grateful.
As a general rule you should not be storing build output in TFS. Occasionally you may want to store binaries for common libraries used by many applications, but tools such as NuGet get around that.
Build output has a few phases of its life and each phase should be stored in a separate place. e.g.
Build output: When code is built (by TFS / Jenkins / Hudson etc.) the output is stored in a drop location. This storage should be considered volatile as you'll be producing a lot of builds, many of which will be discarded.
Builds that have been passed to testers: These are builds that have passed some very basic QA, e.g. it compiles, static code analysis tools are happy, unit tests pass. Once a build has been deemed good enough to be given to test, it should be moved from the drop location to another area. This could be a network share (non-production, as the build can be reproduced). There may be a number of builds that get promoted during the lifetime of a project, and you will want to keep track of which versions the testers are using in each environment.
Builds that have passed test and are in production: Your test team deem the build to be of a high enough quality to ship. As part of your go live process, you should take the build that has been signed off by test and store it in a 3rd location. In ITIL speak this is a Definitive Media Library. This can be a simple file share, but it should be considered to be "production" and have the same backup and resilience criteria as any other production system.
The DML is the place where you store the binaries that are in production (and associated configuration items such as install instructions, symbol files etc.) The tool producing the build should also have labelled the source in TFS so that you can work out what code was used to produce the binary. Your branching strategy will also help with being able to connect the binary to the code.
It's also a good idea to have a "live-like" environment, separate from your regular dev and test environments. As the name suggests, it contains only the code that has been released to production. This enables you to quickly reproduce bugs seen in production.
Two methods that may help you:
Use Team Foundation Build System. One of the advantages is that you can set up retention periods for finished builds. For example, you can order TFS to store the 10 latest successful builds, and the two latest failed ones. You can also tell TFS to store certain builds (e.g. "production builds"/final releases) indefinitely. These binaries folders can of course also be backed up externally, if needed.
Use a different collection for your binaries, with another (less frequent) backup schedule. TFS backs up whole collections, but by separating out data that doesn't change as frequently as the source you can lower the backup cost. This of course depends on how frequently you are required to have the binaries backed up.
You might want to look into creating build definitions in TFS to give your project group an easy 'one button' push to grab the source code from a particular branch and then build it and drop it to a location. That way they get to have their binaries, and you don't have to source control them.
If you are using a branching strategy where you create Release or RTM branches when you push something to production, then you can point your build definitions at those branches and they can manually trigger them from the TFS portal or from within Visual Studio.
I am having a problem with the TFS SharePoint portals code coverage chart.
We have a .NET 4 solution that is being developed using TDD; as a result we have pretty good code coverage, but as a quality check I want to monitor the code coverage rates as the project progresses.
To this end I have a test configuration (a .testsettings file in the solution) which is configured to instrument our solution assemblies for code coverage, and two team build definitions that use that test configuration.
Both team builds (one has a CI trigger, the other a nightly scheduled trigger) work and produce code coverage figures.
However, despite scheduled team builds with code coverage, the dashboard "code coverage" Excel report always shows 0% coverage; in fact the Excel spreadsheet containing the report does not contain any data. This is rather unexpected!
So my question boils down to what steps have I missed to make code coverage data from team builds show up in the TFS database used by the excel code coverage report?
As a side note, the SSRS reports do show code coverage from the builds; it just seems to be the Excel spreadsheets that fail to see the data.
UPDATE
It seems the problem is the "Is Build Verification Run" filter: when this filter is removed, I see data.
Specifically, in my template (MSF for Agile v5.0) the version of the "code coverage" report had a filter applied restricting output to just the "Other" value. Very odd.
When I am back in the office I'll try creating a new project based on the MSF Agile 5.0 template and see if this odd filter setting is part of it, or something I did in the past to this project!
I have verified this by creating a new project from the "MSF for Agile Software Development v5.0" template.
This turns out to be what I perceive as a bug in the "MSF for Agile Software Development v5.0" template.
When the project is created, the Excel spreadsheet used for the "code coverage" chart in the dashboards has a filter on it restricting the data to just items whose origin is "Other". This excludes code coverage data from a TFS build, which has this value set to "True" in the cube.
Simply clearing the filter, or including "True" in it, resolves the issue and shows your TFS build code coverage data.
I am told TFS can accept data on build/test metrics from 3rd party continuous integration tools. Does anyone know how this works, or have any good links for me? My google-fu seems weak today and I cannot find any info on this.
We would like to have a short PowerShell script or app run at the end of the build and send all known metrics up to TFS so they show up in certain reports. These are things that (I think) should already have space in the data warehouse for TFS Build Server, but I will be using CC.NET: build name, result (pass/fail), number of warnings, number of errors, time, unit tests run, unit tests passed, code coverage, FxCop results.
Thanks.
I'm afraid there is not a ready-made integration that does this yet. The plug-in that links CC.NET to TFS is available over at CodePlex, but it just lets CC.NET use TFS for version control and doesn't allow the results of the builds to be published back into TFS.
To get the data into TFS from CruiseControl.NET you have a couple of options. You could write your own custom TFS data warehouse adapter, which is complex but ultimately flexible, or you could use a combination of the Team Build API and a little bit of voodoo to push data into the TFS build store, from where it would also get pushed into the TFS data warehouse. However, this would be limited in TFS 2008, as you would only be able to push data about the build and the unit tests but not things like code coverage.
That said, pushing data from CC.NET to TFS is something that I originally wanted to do. However, in TFS 2008 the built-in build system was so good that I switched from using CC.NET to trigger the builds to using TFS to trigger and manage them. This had the advantage that all the stats were taken care of automatically (along with the built-in UI in Visual Studio). Because I moved to TFS 2008, I then lost the motivation to get the CC.NET integration built.
If anyone wanted to contribute a TFS build result publishing feature to the CC.NET integration then feel free to join the project on CodePlex - I would love to have any help going.