Makefile: generic way to indicate all available .info files from one folder to lcov - code-coverage

I'm working on some tests and I'm using lcov.
In order to correctly manage the coverage of all tests, my makefile:
compiles each test
launches the test
generates the .info file using geninfo
deletes *.gcda, *.gcno
Once this is done for all tests, I want to generate the overall coverage by calling:
lcov --add-tracefile <name>.info -o global_coverage.info
However, it seems that I need to pass each .info file with its own --add-tracefile <name>.info option.
For example, if I have file1.info and file2.info, I need to call:
lcov --add-tracefile file1.info --add-tracefile file2.info -o global_coverage.info
I can retrieve the list of all .info files with:
# Get all .info files for coverage
INFO_FILES = $(wildcard $(TEST_BUILD_DIR)/*.info)
Is there a generic way to call lcov with all the .info files at once?
Thanks

You can use this:
# Get all .info files for coverage
INFO_FILES := $(wildcard $(TEST_BUILD_DIR)/*.info)
global_coverage.info: $(INFO_FILES)
lcov $(addprefix --add-tracefile ,$^) -o $@
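With the two files from the question, $^ (all prerequisites) and the addprefix call expand that recipe to exactly the command written out by hand above:
lcov --add-tracefile file1.info --add-tracefile file2.info -o global_coverage.info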

Related

Make offsetting file contents during build

I'm trying to use Make to ... make modular Dockerfiles. Long story short, I want to centralize certain elements and make them composable and reusable, like classes and functions really, but the Dockerfile syntax does not - and according to the developers, will not - offer any facilities in the image of C's #include or similar composability solutions. Not to worry, #include and friends to the rescue!
Except...
I have the following Makefile in my project:
BUILD_DIR := ${CI_PROJECT_DIR}/build
TEMPLATE_FILES := $(shell find ${CI_PROJECT_DIR} -name '*.build')
TEMPLATE_FILENAMES := $(foreach file,$(TEMPLATE_FILES),$(BUILD_DIR)/$(notdir $(file)).built)
BUILT_TEMPLATES := $(TEMPLATE_FILENAMES:.build.built=.built)
DOCKER_FILES := $(shell find ${CI_PROJECT_DIR} -name '*.Dockerfile')
DOCKER_OBJS := $(foreach file,$(DOCKER_FILES),$(BUILD_DIR)/$(notdir $(file)))
all: $(BUILT_TEMPLATES) $(DOCKER_OBJS)
$(BUILD_DIR)/%.built: $(TEMPLATE_FILES) $(BUILD_DIR) # build any templated Dockerfiles
cpp -E -P -o $(BUILD_DIR)/$(notdir $@) -I ${CI_PROJECT_DIR}/modules $<
sed -i 's/__NL__ /\n/g' $(BUILD_DIR)/$(notdir $@)
$(BUILD_DIR)/%.Dockerfile: $(DOCKER_FILES) $(BUILD_DIR)
cp $< $(BUILD_DIR)/$(notdir $@)
$(BUILD_DIR):
mkdir -p $(BUILD_DIR)
.PHONY: clean
clean:
-rm -r $(BUILD_DIR)
The objective is to run the templated Dockerfiles through GCC to compile the #includes in them into proper Docker instructions, and just copy the rest of the files. Sounds simple enough.
Except that it looks like all the target files are "offset" from their sources - like the file names are correct, but the contents are from a file elsewhere in the list, and with no discernible order either.
One thing that I'm fairly sure is wrong - but even more wrong otherwise - is the line
$(BUILD_DIR)/%.built: $(TEMPLATE_FILES) $(BUILD_DIR) # build any templated Dockerfiles
By all manuals and documentation, it ought to be
$(BUILD_DIR)/%.built: %.build $(BUILD_DIR) # build any templated Dockerfiles
but that's even worse, because then Make just says make: *** No rule to make target '/docker/build/runner-dart-2-18-firebase.built', needed by 'all'. Stop.
I'm out of ideas here, along with my limited knowledge of Make. What am I missing to make Make make - sorry - my Dockerfiles?
This line:
$(BUILD_DIR)/%.built: $(TEMPLATE_FILES) $(BUILD_DIR)
Says that if make wants to build a target that matches that pattern, and it can find all the prerequisites, then the pattern rule matches and the recipe can be used. Let's ignore BUILD_DIR (note that it's always a bad idea to list a directory as a prerequisite, but that's not causing this problem). Suppose TEMPLATE_FILES is set to the value ./foo/foo.build ./bar/bar.build. Now the above rule expands to:
./build/%.built: ./foo/foo.build ./bar/bar.build ./build
What is the recipe?
cpp -E -P -o $(BUILD_DIR)/$(notdir $@) -I ${CI_PROJECT_DIR}/modules $<
First, it's always wrong to create a file that is not exactly $@, so you should use just $@, not $(BUILD_DIR)/$(notdir $@). But more importantly, what will $< be set to? It is always set to the first prerequisite, and the first prerequisite is always ./foo/foo.build. So every time you run this recipe, regardless of which .built file you're trying to create, you will always be preprocessing the first .build file.
Your idea that you want this instead:
$(BUILD_DIR)/%.built: %.build $(BUILD_DIR)
is correct, in general. Why do you get the error? Because if you are trying to build the target ./build/foo.built, then the stem (part that matches %) is foo. Then make will look to see if the prerequisite foo.build exists or can be created, because you said the prerequisite is %.build. That file does NOT exist and CANNOT be created (make doesn't know how to create it), because the file is ./foo/foo.build not foo.build which is a totally different file.
You have three options. You can either write separate rules for each source directory:
$(BUILD_DIR)/%.built: foo/%.build
...
$(BUILD_DIR)/%.built: bar/%.build
...
Or, you can change your generated files so they are not all in the same directory but instead keep the source directory structure; you would change this:
TEMPLATE_FILENAMES := $(foreach file,$(TEMPLATE_FILES),$(BUILD_DIR)/$(notdir $(file)).built)
BUILT_TEMPLATES := $(TEMPLATE_FILENAMES:.build.built=.built)
to just this:
BUILT_TEMPLATES := $(patsubst %.build,$(BUILD_DIR)/%.built,$(TEMPLATE_FILES))
then create the output directory as part of the recipe:
@mkdir -p $(@D)
cpp -E -P -o $@ -I ${CI_PROJECT_DIR}/modules $<
sed -i 's/__NL__ /\n/g' $@
Or finally, you could use VPATH to tell make what directories to look in to find the *.build files:
VPATH := $(sort $(dir $(TEMPLATE_FILES)))
(note, you should choose only one of these options).
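Putting the second option together, a minimal sketch of the rewritten template rule might look like this (untested, keeping the original variable names; recipe lines must start with a tab, and because find returns absolute paths here, the .built files end up under $(BUILD_DIR) with the full source path appended; only the template rule is shown):

BUILD_DIR := ${CI_PROJECT_DIR}/build
TEMPLATE_FILES := $(shell find ${CI_PROJECT_DIR} -name '*.build')
# Keep the source directory structure under BUILD_DIR instead of flattening it
BUILT_TEMPLATES := $(patsubst %.build,$(BUILD_DIR)/%.built,$(TEMPLATE_FILES))

all: $(BUILT_TEMPLATES)

$(BUILD_DIR)/%.built: %.build
	@mkdir -p $(@D)
	cpp -E -P -o $@ -I ${CI_PROJECT_DIR}/modules $<
	sed -i 's/__NL__ /\n/g' $@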

Coverage report when source and objects are in different directories

I am trying to generate coverage report for my project and running into a problem.
I understand that to get the coverage info, I need .gcno, .gcda and source files.
My current project dir structure is
/root/proj/src --> top level Makefile and main.c
/root/proj/src/module1
/root/proj/src/module2
..... -> these contain all the .c/.h files and makefiles
/root/proj/build/obj -> contains all .o,.gcno,.gcda files after compilation
/root/proj/build/exe -> contains the executable
(copying minimal lines below to show the problem)
cd /root/proj/build/obj
When I run:
lcov -b ../../src/ --directory . --capture --output-file app.info
Processing module1.gcda
module1.c:cannot open source file
......
Finished .info-file creation
Then :
genhtml --legend -o ./latest_code_cov/ app.info
Reading data file app.info
Found 5 entries.
Found common filename prefix "/root/proj/src"
Writing .css and .png files.
Generating output.
Processing file src/module1.c
genhtml: ERROR: cannot read /root/proj/src/module1.c
bash-4.1$
1) Do I need to change my makefile to dump the .gcno/.gcda files in the same folders as the sources?
2) Is there a way (some flag) to set the source file path in the .gcno/.gcda files?
Any suggestions?
gcc version 4.4.7 20120313 (Red Hat 4.4.7-11) (GCC)
lcov: LCOV version 1.13
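One thing worth trying (a sketch, not verified against this exact tree) is to pass an absolute base directory with -b and point --directory at the object directory, so geninfo can resolve the source names recorded in the .gcno files:
cd /root/proj/src
lcov -b /root/proj/src --directory /root/proj/build/obj --capture --output-file app.info
genhtml --legend -o ./latest_code_cov/ app.info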

Using Bazel to generate coverage report

I am using the genhtml command to generate an HTML coverage report from a Bazel-generated coverage.dat file:
genhtml bazel-testlogs/path/to/TestTarget/coverage.dat --output-directory coverage
The problem with using genhtml is that I have to provide the paths to the coverage.dat files (which are generated in bazel-testlogs/..). Is it possible to fetch those coverage.dat files as an output from another rule?
I would like not to have to call the genhtml command directly, but to have Bazel handle everything.
I was not able to find a way to get the coverage.dat files as an output of a Bazel rule. However, I was able to list the locations of all the .dat files as the srcs of a filegroup in the WORKSPACE directory:
filegroup(
name = "coverage_files",
srcs = glob(["bazel-out/**/coverage.dat"]),
)
and then use that filegroup in a custom .bzl rule that wraps the genhtml command to generate the HTML coverage report. So now I only have to call
bazel coverage //path/... --instrumentation_filter=/path[/:]
command to generate the coverage.dat files, generate html report and zip it up. Thus, bazel handles everything.
Bazel added support for C++ coverage (though I couldn't find much documentation for it).
I was able to generate a combined coverage.dat file with
bazel coverage -s \
--instrument_test_targets \
--experimental_cc_coverage \
--combined_report=lcov \
--coverage_report_generator=@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main \
//...
The coverage file gets added to bazel-out/_coverage/_coverage_report.dat
For a Java-based project, we can get code coverage in the following way.
To get coverage for a complete module:
Run the following command to collect coverage for the whole project module:
bazel coverage ... --compilation_mode=dbg --subcommands --announce_rc --verbose_failures --jobs=auto --sandbox_debug --build_runfile_links --combined_report=lcov --coverage_report_generator=@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main
Then run the following command from the parent project directory to get the HTML view. The HTML report is generated in the output directory we specify; open index.html from there to see the coverage report.
genhtml -o <output-directory-name> bazel-out/_coverage/_coverage_report.dat
The bazel-out directory usually gets created in the project's parent directory (i.e. where the Bazel WORKSPACE file is present).
To get coverage for a specific IT/test in a module:
Run the following command from the project/sub-project directory:
bazel coverage <class-name-of-Test-or-IT> --compilation_mode=dbg --subcommands --announce_rc --verbose_failures --jobs=auto --sandbox_debug --build_runfile_links --combined_report=lcov --coverage_report_generator=@bazel_tools//tools/test/CoverageOutputGenerator/java/com/google/devtools/coverageoutputgenerator:Main
Then run the following command from the parent project directory to get the HTML view. The HTML report is generated in the output directory we specify; open index.html from there to see the coverage report.
genhtml -o <output-directory-name> bazel-out/_coverage/_coverage_report.dat

How to use gcovr with source files outside the current/build/run directory?

mkdir -p /tmp/build &&
cd /tmp/build &&
mkdir -p /tmp/src &&
echo "int main(){return 0;}" > /tmp/src/prog.c &&
gcc --coverage -o prog /tmp/src/prog.c &&
./prog &&
gcovr -v -r .
will output an empty report.
Scanning directory . for gcda/gcno files...
Found 2 files (and will process 1)
Processing file: /tmp/build/prog.gcda
Running gcov: 'gcov /tmp/build/prog.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /tmp/build' in '/tmp/build'
Finding source file corresponding to a gcov data file
currdir /tmp/build
gcov_fname #tmp#src#prog.c.gcov
[' -', ' 0', 'Source', '/tmp/src/prog.c\n']
source_fname /tmp/build/prog.gcda
root /tmp/build
fname /tmp/src/prog.c
Parsing coverage data for file /tmp/src/prog.c
Filtering coverage data for file /tmp/src/prog.c
Gathered coveraged data for 0 files
------------------------------------------------------------------------------
GCC Code Coverage Report
Directory: .
------------------------------------------------------------------------------
File Lines Exec Cover Missing
------------------------------------------------------------------------------
------------------------------------------------------------------------------
TOTAL 0 0 --%
------------------------------------------------------------------------------
However if I manually run
gcov /tmp/build/prog.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /tmp/build
I get correct results
File '/tmp/src/prog.c'
Lines executed:100.00% of 1
No branches
No calls
Creating '#tmp#src#prog.c.gcov'
It seems that gcovr did not extract the coverage from the otherwise correct gcov output. This only happens if the source file is outside the current directory (same as the build directory, output directory, and run directory) and gcc is called with an absolute path to the source file.
How can I fix this?
Edit
Fixed in upstream gcovr for relative paths, but looks like a bug for absolute paths.
See https://github.com/gcovr/gcovr/issues/169.
What I understand from your code above is that you built everything and ran the program, but you are still inside the build directory where the object files reside.
So, what you need to understand is:
gcovr -v -r <path>
The -r flag takes the root directory, meaning the parent directory inside which both the source and object directories reside, so that gcovr can trace them both and generate the coverage data (and whatever else you want it to generate).
Try doing that and it will work.
For your understanding:
The .gcno file generated at compilation is just flowchart-like data for that particular source file.
Later, when you execute the program, a .gcda file gets generated for each source file. This file contains the real coverage data, but gcovr needs all three files (.gcno, .gcda, and the source file).
Hope it helped. :)
Update:
Back with a workaround.
You can supply your coverage data location as a positional argument (no option) and point the root at your sources.
gcovr .../path/To/GCDA -r .../path/to/src/ [other desired flags]
This will solve your problem.
Worked for me in covering my projects.
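Applied to the paths from the question, that workaround would translate to something like the following (untested sketch):
gcovr /tmp/build -r /tmp/src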
Gcovr only generates reports for source files within your project. This is intended to exclude coverage from library headers etc.
The question is, which files are in your project? This is determined by the -r root path.
If you are in /tmp/build and root is . aka /tmp/build and the source file is /tmp/src/prog.c, then that source file is clearly outside of your project. In the verbose output, gcovr will report Filtering coverage data for file /tmp/src/prog.c.
If you are in /tmp/build and root is .. aka /tmp and the source file is /tmp/src/prog.c, then that source file is within the project.
If you are in /tmp/build and root is . aka /tmp/build and the source file is ../src/prog.c, then gcovr seems to do something questionable: It joins the file name with the current directory and checks that. So we actually see /tmp/build/../src/prog.c. As far as gcovr is concerned, that's within your project. It seems this behaviour is necessary to include code that is symlinked into a project.
You can disable this “is the source within the project?” filter by providing your own, better filter. For example, you can ask gcovr to only report coverage for sources under /tmp/src:
gcovr -r . -f /tmp/src
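For the layout in the question, run from /tmp/build, either variant makes /tmp/src/prog.c part of the project:
gcovr -r ..              # widen the project root to /tmp
gcovr -r . -f /tmp/src   # or keep the root and add an explicit filter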

GCOV: why sample.gcda and sample.gcno may be different

At first I get the message sample.gcda: stamp mismatch with graph file.
The correct order of compilation and running was observed, yet the stamps differ:
hexdump -e '"%x\n"' -s8 -n4 sample.gcno -> aaa1aaaa
hexdump -e '"%x\n"' -s8 -n4 sample.gcda -> bbb2bbbb
The message
stamp mismatch with graph file
means that the graph (.gcno) file has been compiled again after the binaries were built.
If the compilation order is correct, check whether sample.cpp is compiled twice somewhere in the build rules.
For example, suppose we have something like this:
g++ ... sample.cpp -o sample
g++ ... -shared sample.cpp -o sample2.o
Here one file is compiled twice. This causes the .gcno file to be updated with a new timestamp that no longer matches the .gcda file.
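One way to avoid the double compile is to build the object once and reuse it for both link steps (a sketch; the --coverage and -fPIC flags and the output names are illustrative assumptions, not taken from the original build):
g++ --coverage -fPIC -c sample.cpp -o sample.o   # sample.gcno is written once, here
g++ --coverage sample.o -o sample                # link the executable from the object
g++ --coverage -shared sample.o -o libsample.so  # reuse the same object for the shared library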
Suppose you tested your product or application thoroughly and manually, spending a lot of effort on it, and your objective is to get a code coverage report using lcov and gcov, but you deleted the .gcno files by mistake. You can regenerate the .gcno files by recompiling the code, but they will be generated with a new timestamp, gcov will report "stamp mismatch with graph file", and no code coverage report will be generated, so all your testing effort is wasted.
There is a shortcut to still generate the code coverage report. This is just a workaround and should not be relied upon all the time; it is recommended to preserve the *.gcno files until your testing completes.
Note down your gcc version (gcc -v) and download its source code from one of the mirror sites,
e.g. ftp://gd.tuwien.ac.at/gnu/sourceware/gcc/releases/gcc-4.4.6/gcc-4.4.6.tar.bz2
After extracting the downloaded file, the folder structure will be as follows:
gcc-4.4.6
gcc-4.4.6/gcc
If you go directly into gcc-4.4.6/gcc and try to run ./configure and compile (make) from there, you will encounter the following problem:
build/genmodes -h > tmp-modes.h
/bin/sh: build/genmodes: No such file or directory
The solution is to run ./configure and make from gcc-4.4.6; then no errors related to genmodes will be shown. This compiles all modules, including gcc. You may have to install the mpfr and gmp libraries needed by gcc if ./configure reports an error.
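A sketch of those steps (configure options such as an install prefix are left out; add whatever your environment needs):
cd gcc-4.4.6      # configure from the top-level directory, not from gcc-4.4.6/gcc
./configure
make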
Go to gcc-4.4.6/gcc/gcov.c, comment out the lines below, and then recompile with the above command:
/* if (tag != bbg_stamp)
{
fnotice (stderr, "%s:stamp mismatch with graph file\n", da_file_name);
goto cleanup;
}*/
An example path of the new gcov binary after compilation is gcc-4.4.6/host-x86_64-unknown-linux-gnu/gcc/gcov.
Place this binary in /usr/bin and regenerate the code coverage report with a command like the one shown in the example below:
lcov --capture --directory ./ --output-file coverage.info ; genhtml coverage.info --output-directory /var/www/html/coverage
Now you should not get the "stamp mismatch with graph file" error, and the code coverage report will be generated properly.
