I was trying to generate a code coverage report from trace1.json. The trace1.json was not generated by gcovr; it was generated by Lauterbach software from real hardware trace data. According to the Lauterbach documentation, it can export coverage information about functions and lines to a JSON file in a format compatible with gcov. So after I got the JSON file, I tried to use gcovr to generate the code coverage report:
gcovr --add-tracefile result.json --html-details result.html --verbose
I got an empty report, and the gcovr log shows "Gathered coveraged data for 0 files".
So I'm wondering: after I get the JSON file, do I still need to compile the source with --coverage? Even if I compile the source with coverage flags, the executable runs on separate real hardware, which will not be able to collect any .gcda files.
As of gcovr 5.2, there is no support for the gcov JSON format. Gcovr's internal JSON format is closely based on the gcov JSON format, but differs in some details. Importantly, gcovr validates that the input JSON document has a matching gcovr/format_version. This means gcovr should die with an error in your scenario, and shouldn't even get to the “Gathered coveraged data for 0 files” message.
This suggests that your result.json is an empty JSON report generated by a previous gcovr --json result.json run, and NOT the trace1.json generated by your Lauterbach tool!
You may be able to write a script to modify the JSON file to conform to gcovr's expected format, but you're on your own there. The JSON format is documented in the Gcovr User Guide, though there's a pending update.
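For orientation, the overall shape of a gcovr JSON report looks roughly like the sketch below. The exact field set and the required "gcovr/format_version" value differ between gcovr releases, so generate a reference document with gcovr --json from your own installation and compare against that, not against this sketch:

```json
{
  "gcovr/format_version": "0.5",
  "files": [
    {
      "file": "src/example.c",
      "lines": [
        { "line_number": 12, "count": 3, "branches": [] }
      ],
      "functions": []
    }
  ]
}
```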
Assuming things work as expected, the GCC --coverage flag is unnecessary. While --coverage is necessary for producing .gcda and .gcno files that are needed by gcov to work, gcovr's --add-tracefile mode only consumes the JSON file (and perhaps the source code files) and no other data.
I am working on creating a Jenkins pipeline for unit-testing maybe with GTest.
My plan is to use the following tools:
GTest for unit testing, gcov for generating .gcda and .gcno files, and gcovr for XML or HTML outputs of the unit test results.
It's working well so far, with help from the internet and particularly Stack Overflow.
But I am struggling with 3 issues.
gcov is creating .gcda and .gcno files for the gtest sources and my unit tests. gcovr picks them up, and I see them in the HTML files. How can I avoid this? I only want my production code in the HTML files.
I can only see code coverage for template classes if gcov generates .gcda and .gcno files for my unit tests, so I need a simple idea for 1). Maybe I can use an exclude flag in gcovr?
Unused functions in template classes (inline functions) are not covered; code coverage is always 100%. I tried different flags, but nothing helped.
-fprofile-abs-path --coverage -fno-inline -fno-inline-small-functions -fno-default-inline -fkeep-inline-functions
I added a picture to show what I am talking about. The UnitTests and GTest coverage results should not appear in the gcovr HTML...
You can filter out unwanted coverage data, but you can't create data that doesn't exist.
1. gcov is creating .gcda and .gcno files for the gtest sources and my unit tests. gcovr picks them up, and I see them in the HTML files. How can I avoid this? I only want my production code in the HTML files.
Use gcovr --exclude GoogleTest/ --exclude UnitTests/
Gcovr has a per-file filtering system that lets you specify which source code files to include or exclude. For a file to be included in the coverage report,
at least one --filter pattern must match, and
no --exclude pattern must match.
Or phrased in reverse: a file is excluded if it doesn't match any --filter or if it matches any --exclude pattern.
If you don't provide an explicit --filter, then the default filter is the --root directory, which in turn defaults to the current working directory.
These patterns are regexes. Usually, these are used to match paths relative to the current working directory. For example, you can limit the reports to a src/ directory with gcovr --filter src/. Or you can exclude the GoogleTest/ directory with gcovr --exclude GoogleTest/.
Gcovr also has a way to filter gcda/gcno files (search_paths and --gcov-filter), but that is mostly useful as a performance optimization.
2. I can only see code coverage for template classes if gcov generates .gcda and .gcno files for my unit tests, so I need a simple idea for 1). Maybe I can use an exclude flag in gcovr?
This is by design. As explained above, you can solve this via gcovr's exclude flag.
You get one .gcda/.gcno file pair per compilation unit. Header files are included into multiple compilation units, so their coverage information is essentially split across all compilation units that include them.
So, if you want coverage for code in header files, and you include these headers into your unit tests, then gcovr must also process the gcda/gcno files from those unit tests.
3. Unused functions in template classes (inline functions) are not covered; code coverage is always 100%. I tried different flags, but nothing helped. -fprofile-abs-path --coverage -fno-inline -fno-inline-small-functions -fno-default-inline -fkeep-inline-functions
The gcov coverage data model works on an assembly-code level. Counters are inserted by the compiler itself, but only for functions for which the compiler actually generates machine code. Thus, as far as gcov is concerned, inline functions, optimized-out code, and non-instantiated templates simply do not exist.
This is quite annoying, but it's potentially difficult to work around.
The most reliable way to avoid this is to make sure that all functions for which you want coverage data are referenced by your unit tests. It is not necessary to actually invoke the function; merely referencing it should be sufficient. For example, I'd write a function to ignore() arbitrary values despite optimizations, then:
ignore(&some_inline_function);
Possible implementation: template<class T> void ignore(T const& t) { volatile T sinkhole = t; }
Your suggested options like -fno-inline do not work because the code for these functions isn't generated in the first place.
With GCC, and when using C++ (but not C), -fkeep-inline-functions should work, but only for non-templated inline functions.
If a non-templated inline function is only used within one file and isn't provided in a header to multiple files, then it should instead be declared static (in C) or in an anonymous namespace (in C++11 or later), so that -Wunused-function or -Wall notify you if it isn't referenced.
Templates are trickier in general. Each distinct instantiation of a template results in separate functions. Gcovr aggregates coverage data across instantiations, but for the template to appear in the coverage data at all, it must be instantiated at least once; you will have to do this manually.
I am able to generate coverage.dat files with the bazel command:
bazel coverage //tests/... --instrumentation_filter=/src[/:]
This generates a report for only one of the classes, because coverage.dat files are generated separately for each instrumented file in different directories. How do I get a merged coverage.dat?
The coverage.dat report should contain coverage information about all the classes affected by the --instrumentation_filter. This file should be located under bazel-testlogs/path/to/your/package/TestTarget.
You shouldn't have to write anything additional. Bazel does generate multiple temporary .dat files, but it merges all of them into the final coverage.dat file, whose location is printed by bazel when it finishes running. That file is the one at the location I described above. Make sure to check that file and verify that you're using --instrumentation_filter (*) correctly.
(*) From the command line manual:
When coverage is enabled, only rules with names included by the
specified regex-based filter will be instrumented. Rules prefixed
with '-' are excluded instead. Note that only non-test rules are
instrumented unless --instrument_test_targets is enabled.
Is the output, as shown below,
File 'printtokens.c'
Lines executed:47.18% of 195
Branches executed:65.14% of 109
Taken at least once:35.78% of 109
Calls executed:33.33% of 81
printtokens.c:creating 'printtokens.c.gcov'
generated by gcov, stored somewhere? If not, how can we store it?
gcov calculates that from the information in the *.gcda files, but if you want to keep the summary around, the easiest thing to do is:
gcov printtokens.c > printtokens.c.summary
If you want to see that information together with the line coverage, you could look at lcov, which uses gcov to generate HTML files with the line coverage and summary coverage information.
I totally agree with Ankur.
You should use it like this:
From the directory in which the .gcno and .gcda files are located, run:
lcov -c -d . -o fileName.info
And then, to generate an HTML report:
genhtml fileName.info
This is the proper way to view the results.
I'm using lcov to generate code coverage reports for a C code base. I would like to integrate test descriptions into the final output (using lcov's gendesc utility.)
However, I have no clue how to do that, and documentation on gendesc seems rather sparse (as far as good old Google has been able to tell me).
The gendesc info at LTP describes how to create the input test case description files (as expected by genhtml). And the genhtml info provides --show-descriptions, and --description-file for inputting such test case description files.
However, I don't know how to reference the test cases so that they get included in the final report. genhtml sees them as unused test cases and thus keeps them out of the generated HTML output. I can use --keep-descriptions, but that doesn't tell me which test cases were run (obviously, because I do not know how to make the reference from code to test description).
So, how do we tell lcov/genhtml which tests were run in the final output? Any ideas?
To associate a test case name with coverage data, specify that name while collecting coverage data using lcov's --test-name option:
lcov --capture --directory project-dir --output-file coverage.info --test-name "test01"
Then continue with the steps that you already mentioned, that is create a test case description file "tests.txt":
test01
Some test
Convert it into the format expected by genhtml:
gendesc tests.txt --output-filename tests.desc
Finally specify the descriptions file to genhtml:
genhtml coverage.info --output-directory out --description-file tests.desc --show-details
When I run gcov . there are no problems. However, when I run gcov -a ., gcov freezes. The last few lines of the output are:
File '/usr/include/boost/archive/detail/iserializer.hpp'
Lines executed:78.18% of 55
/usr/include/boost/archive/detail/iserializer.hpp:creating 'iserializer.hpp.gcov'
File '/usr/include/boost/serialization/extended_type_info_typeid.hpp'
Lines executed:40.74% of 27
/usr/include/boost/serialization/extended_type_info_typeid.hpp:creating 'extended_type_info_typeid.hpp.gcov'
Do you know why that is happening? The reason I need "-a" is that when I use lcov, it passes that option to gcov. I could hack geninfo to ignore that option, but I prefer not to, since I'll eventually run lcov on a public system.
Thank you for any inputs!
I also have code that uses boost::serialization - the lcov process isn't /frozen/, it just takes a very, very long time to run. I have had it complete successfully after several hours, and I finally get a nice lcov report.
It would be lovely to be able to exclude the boost serialization code when running lcov -c, but I have not figured out exactly how to do that yet. (Of course, I /want/ coverage of the code that uses boost serialization, just not of the boost headers themselves.) Even putting // LCOV_EXCL_START and // LCOV_EXCL_STOP around the majority of the serialization code doesn't work, as I think those exclusion markers are only applied when genhtml is called, not during lcov -c.