Is there a way to focus lcov code coverage reports to just one or two directories? - code-coverage

I recently started using lcov to visualize my code coverage. It's a great tool.
One thing I'm noticing is that it generates code coverage reports for all the files that I'm using - including those that I'm not interested in. For example, it will give me code coverage reports for boost and mysql++ files.
Is there an easy way to force lcov to only generate coverage reports for specific files?
I have tried using the -k parameter like so:
/usr/bin/lcov -q -c -i -b . -d .obj -k src/ -k include/ -o app_base.info
{run unit tests now}
/usr/bin/lcov -q -c -b . -d .obj -k src/ -k include/ -o app_test.info
/usr/bin/lcov -q -a app_base.info -a app_test.info -o app_total.info
/usr/bin/genhtml -q -o lcov_output_directory app_total.info
(Meaning that I only want coverage files for the "include" and "src" directories.)
However, this doesn't seem to work. The report still shows me all the extraneous files. Any suggestions are very much appreciated. Thanks!

I used the --no-external flag together with the --directory flag to exclude unwanted files.
The definition of "external" from the man page:
External source files are files which are not located in one of the directories specified by --directory or --base-directory.
So my command looked like this:
$ lcov --directory src -c -o report.info --no-external
Capturing coverage data from src
Found gcov version: 4.2.1
Scanning src for .gcda files ...
Found 4 data files in src
Processing src/C####.gcda
ignoring data for external file /usr/include/c++/4.2.1/bits/allocator.h
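From there the captured data can be turned into the HTML view as usual (a minimal sketch, reusing the report.info produced above):
$ genhtml -q -o lcov_output_directory report.info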

lcov supports a command line argument --remove to do exactly what you are asking for.
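For example, starting from the app_total.info produced in the question, a hedged sketch (the removal patterns for boost, mysql++ and system headers are assumptions; adjust them to where those files actually live):
/usr/bin/lcov -q -r app_total.info '/usr/include/*' '*/boost/*' '*/mysql++/*' -o app_filtered.info
/usr/bin/genhtml -q -o lcov_output_directory app_filtered.info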

A possible approach is to constrain which files are compiled with the coverage flags (-fprofile-arcs -ftest-coverage). If you don't want to engineer your makefiles to be selective about which files are built with test instrumentation, the following trick might work for you:
Build your application without instrumentation.
Remove the .o files for the source you want to instrument.
Turn on instrumentation and rebuild. Only the deleted object files will be rebuilt with instrumentation.
Run lcov.
This should result in only the targeted areas emitting gcov artifacts, which are blindly consumed by the lcov scripts.
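A minimal sketch of that sequence, assuming a make-based build that keeps objects under .obj as in the question; COVERAGE_FLAGS, LDFLAGS and ./run_tests are placeholders for however your own build and test steps are invoked:
# 1. Full build without instrumentation
make
# 2. Delete only the objects you want covered (example paths)
rm -f .obj/src/*.o .obj/include/*.o
# 3. Rebuild just those objects with instrumentation
#    (the final link also needs --coverage or -lgcov)
make COVERAGE_FLAGS="-fprofile-arcs -ftest-coverage" LDFLAGS="--coverage"
# 4. Run the tests and capture as before
./run_tests
/usr/bin/lcov -q -c -b . -d .obj -o app_test.info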

Related

Why does Bazel's foreign_cc rules dereference symlinks in the output? How can I change this?

I'm currently "migrating" some third-party dependency projects (typically old-style configure/make based) to Bazel using its foreign_cc rules.
One goal is to have output identical to what I had before the migration, and besides attributes like permissions and RPATH I'm still struggling with symlinks being dereferenced, seemingly unconditionally.
So instead of libfoo.so -> libfoo.so.3 and libfoo.so.3 -> libfoo.so.3.14, I now always get three separate regular files.
Inspecting the generated bazel-bin/external/foo/foo_foreign_cc/build_script.sh, the last commands contain two invocations of cp -L, with no variables modifying the behavior:
[configure command]
[make commands]
set +x
cp -L -r --no-target-directory "$BUILD_TMPDIR/$INSTALL_PREFIX" "$INSTALLDIR" && find "$INSTALLDIR" -type f -exec touch -r "$BUILD_TMPDIR/$INSTALL_PREFIX" "{}" \;
[content of #postfix_script]
replace_in_files $INSTALLDIR $BUILD_TMPDIR \${EXT_BUILD_DEPS}
replace_in_files $INSTALLDIR $EXT_BUILD_DEPS \${EXT_BUILD_DEPS}
replace_in_files $INSTALLDIR $EXT_BUILD_ROOT \${EXT_BUILD_ROOT}
mkdir -p $EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo
cp -L -r --no-target-directory "$INSTALLDIR" "$EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo" && find "$EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo" -type f -exec touch -r "$INSTALLDIR" "{}" \;
cd $EXT_BUILD_ROOT
So it looks quite obvious to me that, for some reason, configure_make doesn't even consider keeping symlinks, turning this into something I have to do outside the Bazel rule (while also possibly polluting the remote cache).
Is there a reason for this? I.e., why shouldn't I create a fork of rules_foreign_cc just to remove this -L flag, which someone seems to have added intentionally?
I'm one of the rules_foreign_cc maintainers.
The reason rules_foreign_cc dereferences the symlinks there is that, in general, the outputs being copied into named outputs may be dangling symlinks, since they may not be relative to other build outputs, and at least in Bazel 4, which is the minimum version we currently support, dangling symlinks are not allowed as build artifacts. (This behaviour may have changed in later Bazel versions, but I'm not 100% sure.)
What you likely want to actually consume is the output_group gendir. This can be accessed like so:
filegroup(
    name = "my_install_tree",
    srcs = [":cmake_target"],
    output_group = "gendir",
)
The gendir output group is the entire install directory created by the build.
Note that you wouldn't actually need to fork the rules to achieve what you were proposing either. The shell script is generated by a toolchain (whose type currently lives in the private package, so the right to change it is reserved), and thus you could provide your own implementation of that toolchain to override the behaviour.

Using lcov with gcc-8

I am trying to determine my test coverage. To do this I compile my program with a newer version of gcc:
CC=/usr/local/gcc8/bin/gcc FC=/usr/local/gcc8/bin/gfortran ./configure.sh -external cmake -d
After compiling this with the --coverage option I run my tests, and this creates *.gcda, *.gcno and *.o.provides.build files. If I then run something like:
$ /usr/local/gcc8/bin/gcov slab_dim.f90.gcda
File '/Local/tmp/fleur/cdn/slab_dim.f90'
Lines executed:0.00% of 17
Creating 'slab_dim.f90.gcov'
Which shows me that gcov runs fine. However, if I try to run lcov on these results:
lcov -t "result" -o ex_test.info -c -d CMakeFiles/
I get error messages like these for every file:
Processing fleur.dir/hybrid/gen_wavf.F90.gcda
/Local/tmp/fleur/build.debug/CMakeFiles/fleur.dir/hybrid/gen_wavf.F90.gcno:version 'A82*', prefer '408R'
/Local/tmp/fleur/build.debug/CMakeFiles/fleur.dir/hybrid/gen_wavf.F90.gcno:no functions found
geninfo: WARNING: gcov did not create any files for /Local/tmp/fleur/build.debug/CMakeFiles/fleur.dir/hybrid/gen_wavf.F90.gcda!
This is the same error message I get when I use the system's standard /usr/bin/gcov.
This leads me to believe that lcov calls the old gcov rather than the new one. How do I force lcov to use the new gcov?
The simplest solution I found was to run /usr/bin/gcov-8 instead of /usr/bin/gcov.
The $PATH environment variable needs to be extended with /usr/local/gcc8/bin/.
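lcov can also be pointed at a specific gcov binary with its --gcov-tool option, which avoids relying on PATH ordering (a sketch based on the command from the question and the gcc 8 install path mentioned there):
lcov -t "result" -o ex_test.info -c -d CMakeFiles/ --gcov-tool /usr/local/gcc8/bin/gcov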
The source of the error is clear from the fact that you get the same result when using /usr/bin/gcov: /usr/bin/gcov should be a link to a binary from the installed compiler, but in your case the link doesn't point to a binary within the gcc 8.2 installation.
You can delete the link and re-create it to point to the correct gcov, or you can set up something like update-alternatives to change the version of gcov whenever you change the default compiler.
The previous answer should work as well if you have a binary called gcov in /usr/local/gcc8/bin: if you prepend that path to your PATH, it will be found first.
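For the update-alternatives route, a sketch (the path assumes the gcc 8 installation from the question; the priority value is arbitrary):
sudo update-alternatives --install /usr/bin/gcov gcov /usr/local/gcc8/bin/gcov 80
# then choose the active gcov interactively
sudo update-alternatives --config gcov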

Installing Documentation fails

I followed the instructions for installing the documentation from the source build (by compiling the documentation scheme in CorePlotExamples), but it fails with the following errors when compiling the documentation.
3068: protocol_c_p_t_bar_plot_data_source-p.html
3069: protocol_c_p_t_scatter_plot_data_source-p.html
3070: _c_p_t_utilities_8m.html#a794f89cd14d4cfb21bf8c050b2df8237
3071: category_c_p_t_layer_07_c_p_t_platform_specific_layer_extensions_08.html
3072: interface_c_p_t_line_style.html#a4013bcb6c2e1af2e37cfabd7d8222320
3073: _c_p_t_utilities_8h.html#ae826ae8e3f55a0aa794ac2e699254cad
Loading symbols from /Users/GeoffCoopeMP/Downloads/core-plot-master-3/framework/CorePlotDocs.docset/html/com.CorePlot.Framework.docset/Contents/Resources/Tokens.xml
1000 tokens processed...
2000 tokens processed...
3000 tokens processed...
4000 tokens processed...
5000 tokens processed...
* 5145 tokens processed ( 1.8 sec)
* 20 tokens ignored
Linking up related token references
Sorting tokens
rm -f com.CorePlot.Framework.docset/Contents/Resources/Documents/Nodes.xml
rm -f com.CorePlot.Framework.docset/Contents/Resources/Documents/Info.plist
rm -f com.CorePlot.Framework.docset/Contents/Resources/Documents/Makefile
rm -f com.CorePlot.Framework.docset/Contents/Resources/Nodes.xml
rm -f com.CorePlot.Framework.docset/Contents/Resources/Tokens.xml
mkdir -p ~/Library/Developer/Shared/Documentation/DocSets
cp -R com.CorePlot.Framework.docset ~/Library/Developer/Shared/Documentation/DocSets
cp: /Users/GeoffCoopeMP/Library/Developer/Shared/Documentation/DocSets/com.CorePlot.Framework.docset: Not a directory
make: *** [install] Error 1
find: /Users/GeoffCoopeMP/Library/Developer/Shared/Documentation/DocSets/com.CorePlot.Framework.docset/Contents/: Not a directory
false
Showing first 200 notices only
Command /bin/sh emitted errors but did not return a nonzero exit code to indicate failure
I found the com.CorePlot.Framework.docset files (7 KB) but noticed the Kind is "Unix executable file" rather than the expected "Documentation Set" like other Xcode help files.
The docset files are also 7 KB in the zip file download under the documentation folder, and the Kind is shown as "Unix executable file" there too.
Under my user Library folder I can see the docsets as in:
I also noticed that the docsets can live within the Xcode.app contents, but placing these files there didn't work either.
So, is this 7 KB file the right one? Should it be of kind "Documentation Set" rather than "Unix executable file"? Why does the documentation not compile in Xcode but still generate the files?
I am using Xcode version 5.1.1, Doxygen 1.8.7, graphviz 2.36 and Core Plot 2.0 source from github.
Any help would be much appreciated as I am trying to learn how to use this excellent SDK.
The Core Plot docsets should each be around 70 MB in size. A "docset" is a package which is a special type of folder treated as a single file in the Finder. When building Core Plot documentation, Doxygen makes the docset folder inside the Core Plot "framework" folder and copies it to your library from there.
Did the docset get built in the "framework" folder? Are there any aliases or file links in the path to the Core Plot folder that might be confusing Doxygen or the cp command?

IOS project code coverage source file is relative path

I have generated .gcno and .gcda files after running my iPhone app. Then I use CoverStory to view the coverage rate. However, CoverStory could not open the source file, and I found that the source path is a relative path, not a full path. All I can see is a screen full of /EOF/.
The strange thing to me is that only some of the files cannot be opened due to this path issue. Most of them have full paths and CoverStory can open them successfully. (Unable to attach a screenshot.)
How can I show the correct path names in CoverStory?
I suggest generating an HTML report with lcov, which allows you to normalize the directory names.
Another benefit of using an HTML report is that the coverage information is available both on desktop machines and from a Continuous Integration build server.
To install lcov
Use Homebrew or MacPorts
Example:
brew install lcov
First, generate the data file
#!/bin/bash
set -e # fail script if any commands fail
${gen.info} ${temp.dir}/coverage-data/*.gcno --no-recursion --output-filename \
${temp.dir}/${module.name}-temp.info
#Remove symbols we're not interested in.
${lcov} -r ${temp.dir}/${module.name}-temp.info > ${temp.dir}/${module.name}.info
Now generate the HTML report
#!/bin/bash
set -e # fail script if any commands fail
${gen.html} --no-function-coverage --no-branch-coverage -o ${coverage.reports.dir} \
--prefix ${basedir} ${temp.dir}/${module.name}.info
If you're interested, I have a build script that produces HTML reports here. An example report: http://jasperblues.github.io/Typhoon/coverage/index.html

Run gcov tool on folder

I run the gcov tool on some .c files using the gcc -fprofile-arcs -ftest-coverage [filenames] command.
But it is a very tedious job to supply file names to this command one by one.
Instead, I need a way to run the gcov tool on a folder that contains all the source files.
Is this possible?
Please help me out with a solution.
Thanks in advance.
I ran into the same problem, my project contains ~3000 files.
Write a shell script that gathers all the .c, .gcno and .gcda files into a common folder using find with -exec, then run gcov there in the same way.
sample:
LOCATION=your_gcov_folder_name
mkdir -p "$LOCATION"
# Gather sources and coverage data into one place
find -name '*.c' -exec cp -t "$LOCATION" {} +
find -name '*.gcno' -exec cp -t "$LOCATION" {} +
find -name '*.gcda' -exec cp -t "$LOCATION" {} +
cd "$LOCATION"
# Produce .gcov files with branch (-b) and per-function (-f) summaries
find -name '*.c' -exec gcov -bf {} \;
Run it on the code folder that contains your project.
LCOV provides user-friendly reports automatically; I would suggest taking a look at it first.
If you really want to use gcov to show coverage data you could try
find . -name "*.cpp" -exec sh -c 'gcov {} -o "$(dirname {})"' \;
This will create .gcov files based on your .gcno and .gcda files.
Also, it is usually not a good idea to move .gcno/.gcda files around; it will cause problems with locating the source code.
First of all, the command you specified in the question is for compiling C/C++ files and instrumenting them so that coverage data can be generated later, at execution time.
Equivalently, you can use the --coverage shorthand:
gcc --coverage
g++ --coverage
Note: you must specify the same flag for linking too.
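For example, a minimal sketch with hypothetical file names, passing --coverage at both the compile and the link step:
gcc --coverage -c foo.c -o foo.o
gcc --coverage -c bar.c -o bar.o
gcc --coverage foo.o bar.o -o app   # the link step needs the flag as well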
Now, about the question: if it is about compiling multiple files, there are a lot of ways to build projects, no matter how complex; you can use an automated build system for it.
If your question is about generating coverage report for multiple files then:
You can use gcovr to generate reports in various formats just by specifying the root directory (the directory above src and obj) with the -r/--root=ROOT flag.
Refer to this user guide.
The answers given by others work too if you really want to use only gcov and nothing else. But in my opinion gcovr meets every purpose that can be fulfilled with gcov (except function-level detail; you can still get line-level details).
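A sketch of running gcovr from the project root after the tests have been executed (the output file name is just an example):
# plain text summary on stdout
gcovr -r .
# detailed per-line HTML report
gcovr -r . --html --html-details -o coverage.html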
If you are not getting a coverage report, try removing
"coverageReporters": [
  "text",
  "text-summary"
],
from jest.config.js.
