How to include all targets in bazel coverage

How do I include non-test targets in bazel coverage? Currently I use the following bazel command to get code coverage:
bazel coverage \
  --instrument_test_targets \
  --experimental_cc_coverage \
  --combined_report=lcov \
  //... --test_arg=--logtostderr
The project is written in C++. The command works fine. However, the output lcov trace file only includes the files that have coverage; if a C++ file has no test, it does not appear in the trace file at all.
Does bazel coverage only execute the test targets? Is there a way to include all targets (the non-test targets as well), so that even a file with no test still shows up in the report with zero coverage? The intention is that if someone adds new files without writing unit tests, those files are still visible in the coverage report.

Can you reproduce using --incompatible_cc_coverage?
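As far as I know, bazel coverage only runs test targets, so only files that are instrumented and linked into at least one executed test end up in the trace file. One workaround for gcov-based C++ coverage (a rough sketch, assuming the .gcno notes files are produced under bazel-out; the paths are illustrative) is to capture a zero-count baseline with lcov and merge it into Bazel's combined report:
# Capture a baseline that lists every instrumented file with zero counts.
# --initial reads only the .gcno notes files, so no test needs to run.
lcov --capture --initial \
     --directory "$(bazel info output_path)" \
     --output-file baseline.info

# Merge the baseline with the combined report from bazel coverage; files
# that no test touches keep their zero counts from the baseline.
lcov --add-tracefile baseline.info \
     --add-tracefile "$(bazel info output_path)/_coverage/_coverage_report.dat" \
     --output-file full_coverage.info

genhtml -o coverage_html full_coverage.info
Untested files then appear in full_coverage.info (and the genhtml output) with 0% line coverage.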

Related

bazel coverage results are incomplete

The problem I'm running into is that the coverage data produced by bazel coverage is incomplete.
For example, I have a project with 10 source files, and 1 test file (which tests only 1 source file).
Run bazel clean && bazel coverage --combined_report=lcov -- //src/...:all
Verify that coverage data was generated with find -L -type f -name 'coverage.dat'
Notice that:
There exists 1 coverage file for each test run
(Problem!) There is no coverage data for the untested source files
Does anyone know how to configure bazel to generate a complete coverage report, which includes the untested source files as well?
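To see exactly which files made it into the combined report (assuming --combined_report=lcov was passed, so the report exists at the path below), the tracefile can be listed directly:
lcov --list "$(bazel info output_path)/_coverage/_coverage_report.dat"
Untested source files will simply be absent from that listing; the lcov baseline-merge sketch under the first question above is one way to backfill them with zero counts.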

Bazel's Java lcov report uses package paths instead of file paths

I am collecting code coverage for a Java-based repository using the following command:
bazel coverage --jobs=$PARALLEL_JOBS --keep_going --flaky_test_attempts=3 --cache_test_results=no --combined_report=lcov --coverage_report_generator="@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main" -- //$TARGET/...
This creates a coverage report in this directory:
$(bazel info output_path)/_coverage/_coverage_report.dat
Problem: All of the files reported in this coverage report are listed by Java package path instead of their actual source file path (e.g. com/my_package/path/my_class.java)
Question: How do I configure bazel coverage to report coverage data by original source file paths instead of Java package paths?
I would like to post-process these coverage files, but I don't have a clear way to map all of these Java packages back to the original source files.
Thank you!
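One possible workaround (a rough sketch, not a built-in Bazel option): the combined report is a plain lcov tracefile whose SF: lines carry the package-derived paths, so they can be rewritten to workspace paths by searching the source tree. The script below assumes each package path matches exactly one file in the workspace and that GNU sed is available:
#!/usr/bin/env bash
# Rewrite SF: entries in the combined lcov report so they point at real
# files in the workspace instead of package-derived paths.

report="$(bazel info output_path)/_coverage/_coverage_report.dat"

grep '^SF:' "$report" | sed 's/^SF://' | sort -u | while IFS= read -r pkg_path; do
  # Look for a matching source file, skipping Bazel's convenience symlinks.
  real_path=$(find . -path './bazel-*' -prune -o -path "*/${pkg_path}" -print | head -n 1)
  if [ -n "$real_path" ]; then
    sed -i "s|^SF:${pkg_path}\$|SF:${real_path#./}|" "$report"
  fi
done
After the rewrite, tools such as genhtml can resolve the SF: entries against the real files.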

Bazel: How to exclude path from code coverage for Scala / Java?

I am using Bazel with rules_scala. My problem now is how to exclude files from code coverage. So far this is how I am running coverage:
rm -rf coverage
bazel coverage --combined_report=lcov --coverage_report_generator="@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main" ...
genhtml -o coverage --ignore-errors source bazel-out/_coverage/_coverage_report.dat
But there are some folders I would like to exclude from code coverage. I tried using the --instrumentation_filter flag, but no matter what I put there, Bazel still collects coverage for those folders.
Are there any examples of how I should use this flag?
Thanks!
Use
bazel coverage --combined_report=lcov --coverage_report_generator="@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main" -- //... -//package/to/ignore/...
This appears to be a bug with rules_scala. See this issue for more details.
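For reference, --instrumentation_filter takes a comma-separated list of regular expressions matched against target labels, with a leading - marking an exclusion. A typical invocation looks like this sketch (the package names are placeholders); per the rules_scala issue above the filter may simply be ignored there, which is why the target-pattern exclusion shown in the answer is the reliable workaround:
bazel coverage \
  --combined_report=lcov \
  --instrumentation_filter="//src[/:],-//src/generated[/:]" \
  -- //...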

Implementing clang based code coverage in custom Bazel C rules

I am trying to get bazel coverage //my:test to output coverage data files, building with custom C rules and using a custom clang toolchain.
For Bazel's native C rules this is a solved problem. I can build coverage output using the cc_library and cc_test native rules by running the following command with env set:
export BAZEL_USE_LLVM_NATIVE_COVERAGE=1
export GCOV=/path/to/llvm-profdata
export BAZEL_LLVM_COV=/path/to/llvm-cov
export CC=/path/to/clang
bazel coverage //my:test --experimental_generate_llvm_lcov --combined_report=lcov
The test target outputs a coverage.dat file, and a combined .dat report file is produced as well. I have noticed that the cc_library target returns an InstrumentedFilesInfo provider that has the "metadata files" attribute populated with the .gcno files output during compilation.
I am using the cc_common Starlark library to build custom C rules, and my compile action is set up via cc_common.compile(). While *.gcno files are outputs that Bazel expects from this action [0], the compile function does not return any *.gcno File objects in the compilation context or compilation outputs, so using them as inputs to another action, returning them in a provider, or adding them to the target's runfiles is not possible.
I understand that the .dat files are produced from the *.gcno compile outputs and the *.gcda outputs of the sandboxed test execution, combined in the collect_cc_coverage.sh script. Something in the plumbing of my rule implementation is missing that is not fixed by returning a provider constructed with coverage_common.instrumented_files_info(), and declaring extra outputs of cc_common.compile() is currently not possible.
[0]: Running under coverage rather than test enables the toolchain coverage feature: the coverage flag is added to the compile command, the .gcno files are output, and they appear in bazel-out.
My questions:
Has anyone had any experience implementing code coverage for custom C rules?
How do I get my test executable to take in .gcno files, generate the .gcda files and combine the two using my toolchain to produce the .dat files that are expected with the native C rules? (This question does not require .gcno - solutions involving profraw/profdata are equally valid.)
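For orientation, this is roughly the pipeline a custom rule has to reproduce when going the profraw/profdata route. The sketch below runs it by hand outside Bazel with illustrative file names; it mirrors what the native rules' coverage tooling drives when BAZEL_USE_LLVM_NATIVE_COVERAGE is set:
# Compile and link the test with clang's source-based instrumentation.
clang -fprofile-instr-generate -fcoverage-mapping -o my_test my_test.c

# Run the test; each run writes a raw profile.
LLVM_PROFILE_FILE=my_test.profraw ./my_test

# Index the raw profile(s).
llvm-profdata merge -sparse my_test.profraw -o my_test.profdata

# Export an lcov tracefile, the same format as the coverage.dat files
# produced by the native C rules.
llvm-cov export -format=lcov -instr-profile=my_test.profdata ./my_test > coverage.dat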

How to show branch coverage for C++ project on coveralls.io?

I am using the coveralls.io service to display line coverage for my C++ project. I also want to track branch coverage, but cannot get it to work.
On Travis CI, I use this call to generate the coverage report:
coveralls -r <my_project_root> -b <my_build_dir> --verbose --gcov=gcov --gcov-options '\-lpbc';
The coveralls script was previously installed with pip:
pip install cpp-coveralls urllib3[secure]
I get the line coverage shown correctly on coveralls.io, but not the branch coverage. I don't know which of the following things I am doing wrong:
Do I have to activate it on coveralls.io explicitly?
Is there something wrong with the coveralls command?
Can coveralls.io even show branch coverage?
Pretty late to the party, but to answer your question(s):
Yes, you will want to enable the Coveralls setting for BRANCH COVERAGE: INCLUDE IN AGGREGATE %.
Of course, this will only work if branch coverage is included in your original coverage report.
That happens in a prior step, when you compile the original project into an "instrumented" version of the source code and generate the GCOV coverage report, before you use the coveralls command to POST the coverage report to Coveralls.
Something like:
gcc -Wall -ftest-coverage -fprofile-arcs cov.c
gcov --branch-probabilities cov.c
Source: gcov Wiki - Example
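A quick local sanity check before posting to Coveralls (following the example above, where gcc produces a.out and gcov writes cov.c.gcov) is to confirm that branch records are actually present in the gcov output:
./a.out                            # run the instrumented binary to produce the .gcda data file
gcov --branch-probabilities cov.c  # regenerate cov.c.gcov with branch annotations
grep -c '^branch' cov.c.gcov       # a non-zero count means branch data is in the report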
