Run gcov tool on a folder - gcov

I run the gcov tool on some .c files using the command gcc -fprofile-arcs -ftest-coverage [filenames].
But it is a very tedious job to supply the file names to this command.
Instead, I would like to run the gcov tool on a folder that contains all the source files.
Is this possible?
Please help me out with a solution.
Thanks in advance.

I ran into the same problem; my project contains ~3000 files.
Write a shell script that copies all the .c, .gcno, and .gcda files into a common folder using find with -exec, then run gcov there using the same kind of find command.
Sample:
LOCATION=your_gcov_folder_name
mkdir -p "$LOCATION"   # make sure the common folder exists
find . -name '*.c'    -exec cp -t "$LOCATION" {} +
find . -name '*.gcno' -exec cp -t "$LOCATION" {} +
find . -name '*.gcda' -exec cp -t "$LOCATION" {} +
cd "$LOCATION"
find . -name '*.c' -exec gcov -bf {} \;
Run it from your code folder, the one that contains your project.

[LCOV][1] provides user-friendly reports automatically, so I would suggest taking a look at it first.
If you really want to use gcov to show coverage data, you could try:
find . -name "*.cpp" -exec sh -c 'gcov "$1" -o "$(dirname "$1")"' _ {} \;
This will create .gcov files based on your .gcno and .gcda files.
Also, it is usually not a good idea to move the .gcno/.gcda files around; it will cause problems with finding the source code.
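For the LCOV route, a minimal sketch (my own example, assuming lcov is installed and the instrumented tests have already been run so the .gcda files exist):
# capture coverage data from the build tree and render an HTML report
lcov --capture --directory . --output-file coverage.info
genhtml coverage.info --output-directory coverage-html
Open coverage-html/index.html in a browser to inspect per-file and per-line coverage.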

First of all, the command you have specified in the question is for compiling C/C++ files and instrumenting them so that coverage data can be generated later, at execution time.
That command can also be written as:
gcc --coverage
g++ --coverage
Note: you must specify the same flag for linking too.
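For example, here is a minimal sketch of the full cycle for a single file (file names are placeholders):
# compile and link with instrumentation, run the binary, then ask gcov for the report
gcc --coverage -c foo.c -o foo.o
gcc --coverage foo.o -o foo
./foo          # writes foo.gcda next to the existing foo.gcno
gcov foo.c     # produces foo.c.gcov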
Now, about the question: if it is about compiling multiple files, there are a lot of ways to build projects, no matter how complex; you can use an automated build system for that.
If your question is about generating a coverage report for multiple files, then:
You can use gcovr to generate reports in various forms just by specifying the root directory (the directory above src and obj) with the "-r" or "--root=ROOT" flag.
Refer to the gcovr user guide.
The answers given by others work too if you really want to use only gcov and nothing else, but in my opinion gcovr meets every purpose that can be fulfilled with gcov (except function-level detail; you can still get line-level detail).
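A short sketch of such a gcovr invocation, assuming it is run from the project root after the tests have been executed:
gcovr -r .                                          # plain-text summary on stdout
gcovr -r . --html --html-details -o coverage.html   # detailed per-file HTML report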

If you are not getting a coverage report, try removing
"coverageReporters": [
  "text",
  "text-summary"
],
from the jest.config.js file.

Related

Why do Bazel's foreign_cc rules dereference symlinks in the output? How can I change this?

I am currently "migrating" some third-party dependency projects (typically old-style configure/make based) to Bazel using its foreign_cc rules.
One goal is to have identical output compared to before the migration, and besides attributes like permissions and RPATH, I'm still struggling with symlinks being dereferenced, seemingly unconditionally.
So instead of libfoo.so -> libfoo.so.3 and libfoo.so.3 -> libfoo.so.3.14, I now always get three regular files.
Inspecting the generated bazel-bin/external/foo/foo_foreign_cc/build_script.sh, the last commands contain two invocations of cp -L with no variables modifying the behavior:
[configure command]
[make commands]
set +x
cp -L -r --no-target-directory "$BUILD_TMPDIR/$INSTALL_PREFIX" "$INSTALLDIR" && find "$INSTALLDIR" -type f -exec touch -r "$BUILD_TMPDIR/$INSTALL_PREFIX" "{}" \;
[content of #postfix_script]
replace_in_files $INSTALLDIR $BUILD_TMPDIR \${EXT_BUILD_DEPS}
replace_in_files $INSTALLDIR $EXT_BUILD_DEPS \${EXT_BUILD_DEPS}
replace_in_files $INSTALLDIR $EXT_BUILD_ROOT \${EXT_BUILD_ROOT}
mkdir -p $EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo
cp -L -r --no-target-directory "$INSTALLDIR" "$EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo" && find "$EXT_BUILD_ROOT/bazel-out/k8-fastbuild/bin/external/foo/copy_foo/foo" -type f -exec touch -r "$INSTALLDIR" "{}" \;
cd $EXT_BUILD_ROOT
So it looks quite obvious to me that for some reason configure_make doesn't even consider keeping symlinks, turning this into something I have to do outside the Bazel rule (while also possibly polluting the remote cache).
Is there a reason for this? I.e. why shouldn't I create a fork of rules_foreign_cc just to remove this -L flag, which someone seems to have added intentionally?
I'm one of the rules_foreign_cc maintainers.
The reason rules_foreign_cc dereferences the symlinks there is that, in general, the outputs being copied into named outputs may be dangling symlinks, since they may not be links relative to other build outputs, and at least in Bazel 4, which is the minimum version we currently support, dangling symlinks are not allowed as build artifacts. (This behaviour may have changed in later Bazel versions, but I'm not 100% sure.)
What you likely want to actually consume is the output_group gendir. This can be accessed like so:
filegroup(
    name = "my_install_tree",
    srcs = [":cmake_target"],
    output_group = "gendir",
)
The gendir output group is the entire install directory as created by the build artifacts.
Note that you wouldn't actually need to fork the rules to achieve what you were proposing either. The shell script is generated by a toolchain (whose type is currently in the private package, so the right to change it is reserved), and thus you could provide your own implementation of the toolchain to override the behaviour.

How to remove unused images from an Xcode project?

I want to delete all the unused images from an Xcode project, and in order to do that I am using the following script:
#!/bin/sh
PROJ=`find . -name '*.xib' -o -name '*.[mh]'`
for png in `find . -name '*.png'`
do
name=`basename $png`
if ! grep -q $name $PROJ; then
rm -Rf "$png"
echo "$png is not referenced"
fi
done
The above script works fine and deletes all the images from the project that are not referenced in ".xib" files; however, there is a catch.
Problem
The script is also deleting the images that are referenced in ".m" files (images that are set programmatically).
Request
Could you please tell me how I can include ".m" files along with ".xib" files in the search.
PROJ=`find . -name '*.xib' -o -name '*.[mh]'`
First, note that you are using rm -Rf to delete a single image. Be careful! This removes recursively and without prompting, so it can be risky and remove things you don't want. It is probably better to just use rm.
Your script is quite well organized and tidy. To make it more robust, it is always good to quote your variables; this way the script will also support names with spaces. That is, if you want to remove a file called "a b.png" and the name is stored in the variable $png, saying rm $png runs rm a b.png, so it will try to remove a and b.png; rm "$png" removes the intended file.
After all this introduction, let's focus on the specific problem here.
It looks like you are looking for the files that end in .xib, .m, or .h. The find . -name '*.xib' -o -name '*.[mh]' syntax seems to be fine, but it may be better to use a regex in find:
find . -type f -regex '.*\.\(xib\|m\|h\)'
Finally, you are using a for loop to go through the result of a find. Note you can also say:
while IFS= read -r png;
do
# things with "$png"
done < <(find ...)
but I won't go and suggest anything else here because I don't really follow the logic on these .xib, .png files. If you can show an example I will update my answer.
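Putting these suggestions together, here is a hedged sketch of the revised script (my own assumptions: it is run from the project root and the repository paths contain no spaces):
#!/bin/bash
# Search .xib, .m and .h files for each image's basename before deleting it.
PROJ=$(find . -type f \( -name '*.xib' -o -name '*.[mh]' \))
while IFS= read -r png
do
    name=$(basename "$png")
    # $PROJ is intentionally unquoted so it expands to the list of project files;
    # grep -q exits 0 as soon as one of them mentions the image name.
    if ! grep -q "$name" $PROJ; then
        echo "$png is not referenced"
        rm "$png"    # destructive: consider only echoing on a first dry run
    fi
done < <(find . -name '*.png')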

Script to generate an empty Xcode project

I always need some empty Xcode projects for testing purposes. (I cannot use CodeRunner or similar tools; I really need an Xcode project.)
I tried different approaches but I didn't find a real solution:
Created a basic, empty project and wrote a script to copy the entire folder.
It works, but you cannot have different names for the project; this means that you have to rename the project manually after the copy.
Using the Crafter gem
It's a useful gem, but you can only configure an existing project; you cannot create a new one.
Using KZBootstrap
The same as before, it's useful for configuring the project, not for creating a new one.
Using the xcodeproj gem (http://rubygems.org/gems/xcodeproj)
The documentation is not enough for me, and I don't understand how to use it :(
Any advice?
Finally, I found a solution that fits my needs.
I started with an empty sample project (here: https://dl.dropboxusercontent.com/u/792862/SamplePRJ.zip)
and wrote a bash script to rename all the files and all occurrences of the previous name.
The script can be improved (a parameterized variant is sketched after the listing), but it basically works:
export LC_CTYPE=C
export LANG=C
OLDNAME="SamplePRJ"
NEWNAME="Sample2PRJ"
mv "$OLDNAME" "$NEWNAME"
cd "$NEWNAME"
mv "$OLDNAME" "$NEWNAME"
mv "${OLDNAME}Tests" "${NEWNAME}Tests"
mv "${OLDNAME}.xcodeproj" "${NEWNAME}.xcodeproj"
mv "${NEWNAME}.xcodeproj/xcshareddata/xcschemes/${OLDNAME}.xcscheme" "${NEWNAME}.xcodeproj/xcshareddata/xcschemes/${NEWNAME}.xcscheme"
find . -type f -print0 | xargs -0 sed -i '' "s/${OLDNAME}/${NEWNAME}/g"
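As a possible improvement (my own sketch, not part of the original answer), the old and new names could be passed on the command line so the template never has to be edited; the script name and paths below are assumptions:
#!/bin/bash
# Hypothetical usage: ./clone_project.sh SamplePRJ MyNewApp
set -e
OLDNAME="${1:?usage: $0 OLDNAME NEWNAME}"
NEWNAME="${2:?usage: $0 OLDNAME NEWNAME}"
export LC_CTYPE=C LANG=C

cp -R "$OLDNAME" "$NEWNAME"      # work on a copy so the template stays untouched
cd "$NEWNAME"
mv "$OLDNAME" "$NEWNAME"
mv "${OLDNAME}Tests" "${NEWNAME}Tests"
mv "${OLDNAME}.xcodeproj" "${NEWNAME}.xcodeproj"
mv "${NEWNAME}.xcodeproj/xcshareddata/xcschemes/${OLDNAME}.xcscheme" \
   "${NEWNAME}.xcodeproj/xcshareddata/xcschemes/${NEWNAME}.xcscheme"
find . -type f -print0 | xargs -0 sed -i '' "s/${OLDNAME}/${NEWNAME}/g"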

iOS project code coverage: source file is a relative path

I have generated .gcno and .gcda files after running my iPhone app. Then I use CoverStory to view the coverage rate. However, CoverStory could not open the source file, and I found that the source path is a relative path, not a full path. All I can see is a screen full of /EOF/.
The strange thing to me is that only some of the files cannot be opened due to this path issue; most of them have full paths and CoverStory can open them successfully. (Unable to attach a screenshot.)
How can I show the correct path names in CoverStory?
I suggest generating an HTML report with lcov, which allows you to normalize the directory names.
Another benefit of using an HTML report is that the coverage information is available both on desktop machines and from a Continuous Integration build server.
To install lcov
Use Homebrew or MacPorts
Example:
brew install lcov
First Generate Datafile
#!/bin/bash
set -e # fail the script if any command fails
# ${gen.info}, ${lcov}, ${temp.dir} and ${module.name} are build-script variables
# (typically the paths to the geninfo/lcov binaries and to your build directories).
${gen.info} ${temp.dir}/coverage-data/*.gcno --no-recursion --output-filename \
    ${temp.dir}/${module.name}-temp.info
# Remove symbols we're not interested in.
${lcov} -r ${temp.dir}/${module.name}-temp.info > ${temp.dir}/${module.name}.info
Now Generate the HTML Report
#!/bin/bash
set -e # fail script if any commands fail
${gen.html} --no-function-coverage --no-branch-coverage -o ${coverage.reports.dir} \
--prefix ${basedir} ${temp.dir}/${module.name}.info
If you're interested, I have a build script that produces HTML reports here. An example report: http://jasperblues.github.io/Typhoon/coverage/index.html
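For reference, a concrete sketch with those placeholders replaced by plain geninfo/lcov/genhtml calls; the directory and module names are assumptions, so adjust them to your build:
#!/bin/bash
set -e
OBJ_DIR="build/coverage-data"   # wherever the .gcno/.gcda files end up
MODULE="MyApp"

geninfo "$OBJ_DIR" --no-recursion --output-filename "${MODULE}-temp.info"
# Drop symbols we're not interested in (SDK and system headers).
lcov --remove "${MODULE}-temp.info" '/Applications/Xcode*' '/usr/include/*' \
     --output-file "${MODULE}.info"
genhtml --no-function-coverage --no-branch-coverage --prefix "$(pwd)" \
        -o coverage-report "${MODULE}.info"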

Is there a way to focus lcov code coverage reports to just one or two directories?

I recently started using lcov to visualize my code coverage. It's a great tool.
One thing I'm noticing is that it generates code coverage reports for all the files that I'm using - including those that I'm not interested in. For example, it will give me code coverage reports for boost and mysql++ files.
Is there an easy way to force lcov to only generate coverage reports for specific files?
I have tried using the -k parameter like so:
/usr/bin/lcov -q -c -i -b . -d .obj -k src/ -k include/ -o app_base.info
{run unit tests now}
/usr/bin/lcov -q -c -b . -d .obj -k src/ -k include/ -o app_test.info
/usr/bin/lcov -q -a app_base.info -a app_test.info -o app_total.info
/usr/bin/genhtml -q -o lcov_output_directory app_total.info
(Meaning that I only want coverage files for the "include" and "src" directories.)
However, this doesn't seem to work. The report still shows me all the extraneous files. Any suggestions are very much appreciated. Thanks!
I used the --no-external flag together with the --directory flag to exclude unwanted files.
The definition of external from the man page:
External source files are files which are not located in one of the directories specified by --directory or --base-directory.
So my command looked like this:
$ lcov --directory src -c -o report.info --no-external
Capturing coverage data from src
Found gcov version: 4.2.1
Scanning src for .gcda files ...
Found 4 data files in src
Processing src/C####.gcda
ignoring data for external file /usr/include/c++/4.2.1/bits/allocator.h
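To turn that capture into an HTML report, a short follow-up step (the output directory name is just an example):
# render the filtered tracefile as browsable HTML
genhtml report.info -o lcov_output_directory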
lcov supports a command line argument --remove to do exactly what you are asking for.
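For instance, a hedged sketch that strips the third-party directories mentioned in the question from the combined tracefile (the patterns are examples):
# remove coverage entries matching the given path patterns, then regenerate the report
lcov --remove app_total.info '/usr/include/*' '*/boost/*' '*/mysql++/*' -o app_filtered.info
genhtml -q -o lcov_output_directory app_filtered.info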
A possible approach is to constrain which files are compiled with the coverage flags (-fprofile-arcs -ftest-coverage). If you don't want to engineer your makefile system to be selective about which files are built with test instrumentation, the following trick might work for you:
Build your application without instrumentation.
Remove the .o files for source that you want to instrument
Turn on instrumentation and rebuild. Only the deleted object files will be rebuilt with instrumentation.
Run lcov
This should result in only the targeted areas emitting gcov artifacts, which are blindly consumed by the lcov scripts.
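A hedged sketch of that trick for a make-based build; the targets, object directory, and flag variables are assumptions about your build system:
# 1. Full build without instrumentation.
make
# 2. Delete only the objects you want covered (here: everything built from src/).
find .obj -path '*/src/*' -name '*.o' -delete
# 3. Rebuild with coverage flags; only the deleted objects are recompiled.
make CFLAGS="-fprofile-arcs -ftest-coverage" LDFLAGS="--coverage"
# 4. Run the unit tests, then capture with lcov/genhtml as above.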
