PartCover: Exclude multiple namespaces via command line

I'm using PartCover to generate code coverage statistics and want to exclude multiple namespaces using the command-line parameter --exclude.
I can't manage to specify multiple namespaces to be excluded. Could anyone suggest how to do it? A single namespace mask obviously works well.

You should just be able to add a --exclude for each assembly/namespace combination, e.g.:
--exclude [Assembly*]Namespace1.* --exclude [Assembly*]Namespace2.*
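A full invocation might then look like the sketch below; the runner path, assembly names, and namespaces are all hypothetical and need to be adapted to your project:
PartCover.exe --target C:\Tools\nunit-console.exe --target-args MyApp.Tests.dll --include [MyApp*]* --exclude [MyApp*]MyApp.Generated.* --exclude [MyApp*]MyApp.Tests.* --output coverage.xml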

Related

gcov is generating coverage for the gtest sources and unit tests. How can I avoid this?

I am working on creating a Jenkins pipeline for unit testing, probably with GTest.
My plan is to use the following tools:
GTest for unit testing, gcov for generating the .gcda and .gcno files, and gcovr for XML or HTML output of the unit-testing results.
It's working well so far, with help from the internet and particularly Stack Overflow.
But I am struggling with three issues:
1. gcov is creating .gcda and .gcno files for the gtest sources and my unit tests. gcovr picks them up, and I see them in the HTML report. How can I avoid this? I only want my production code in the HTML report.
2. I can only see code coverage for template classes if gcov generates .gcda and .gcno files for my unit tests. So I need a simple idea for 1). Maybe I can use an exclude flag in gcovr?
3. Unused functions in template classes (inline functions) are not covered. Code coverage is always 100%, but I tried different flags and nothing helped:
-fprofile-abs-path --coverage -fno-inline -fno-inline-small-functions -fno-default-inline -fkeep-inline-functions
I added a picture to show what I am talking about. The UnitTests and GTest coverage results should not appear in the gcovr HTML...
You can filter out unwanted coverage data, but you can't create data that doesn't exist.
1. gcov is creating .gcda and .gcno files for the gtest sources and my unit tests. gcovr picks them up, and I see them in the HTML report. How can I avoid this? I only want my production code in the HTML report.
Use gcovr --exclude GoogleTest/ --exclude UnitTests/
Gcovr has a per-file filtering system that allows you to specify which source code files to include/exclude. For a file to be included in the coverage report,
any --filter pattern must match, and
no --exclude pattern must match.
Or phrased in reverse: a file is excluded if it doesn't match any --filter or if it matches any --exclude pattern.
If you don't provide an explicit --filter, then the default filter is the --root directory, which in turn defaults to the current working directory.
These patterns are regexes. Usually, these are used to match paths relative to the current working directory. For example, you can limit the reports to a src/ directory with gcovr --filter src/. Or you can exclude the GoogleTest/ directory with gcovr --exclude GoogleTest/.
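Putting that together for this question, a typical invocation might look like the following sketch (the directory names come from the question; --html-details and the output file name are just illustrative):
gcovr --root . --exclude 'GoogleTest/' --exclude 'UnitTests/' --html-details -o coverage.html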
Gcovr also has a way to filter gcda/gcno files (search_paths and --gcov-filter), but that is mostly useful as a performance optimization.
2. I can only see code coverage for template classes if gcov generates .gcda and .gcno files for my unit tests. So I need a simple idea for 1). Maybe I can use an exclude flag in gcovr?
This is by design. As explained above, you can solve this via gcovr's exclude flag.
You get a gcda/gcno file per compilation unit. Header files are included into multiple compilation units, so their coverage information is essentially split across all compilation units that include them.
So, if you want coverage for code in header files, and you include these headers into your unit tests, then gcovr must also process the gcda/gcno files from those unit tests.
3. Unused functions in template classes (inline functions) are not covered. Code coverage is always 100%, but I tried different flags and nothing helped: -fprofile-abs-path --coverage -fno-inline -fno-inline-small-functions -fno-default-inline -fkeep-inline-functions
The gcov coverage data model works on an assembly-code level. Counters are inserted by the compiler itself, but only for functions for which the compiler actually generates machine code. Thus, as far as gcov is concerned, inline functions, optimized-out code, and non-instantiated templates simply do not exist.
This is quite annoying, and working around it can be difficult.
The most reliable way to avoid this is to make sure that every function you want coverage data for is referenced by your unit tests. It is not necessary to actually invoke the function; merely referencing it should be sufficient. For example, I'd write a small ignore() helper that consumes arbitrary values despite optimizations:
template<class T> void ignore(T const& t) { volatile T sinkhole = t; }  // the volatile copy cannot be optimized away
ignore(&some_inline_function);  // merely taking the address forces the compiler to emit the function
Your suggested options like -fno-inline do not work because the code for these functions isn't generated in the first place.
With GCC, when compiling C++ (but not C), -fkeep-inline-functions should work, but only for non-templated inline functions.
If a non-templated inline function is only used within one file and isn't provided in a header to multiple files, it should instead be declared static (in C) or placed in an anonymous namespace (in C++11 or later), so that -Wunused-function or -Wall notifies you if it isn't referenced; a small illustration follows.
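A minimal illustration (the function names are hypothetical):
static int helper(int x) { return x + 1; }          // C: -Wunused-function fires if this is never referenced
namespace { int helper2(int x) { return x + 2; } }  // C++: internal linkage, same warning applies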
Templates are trickier in general. Each distinct instantiation of a template results in a separate function. Gcovr does aggregate coverage data across them, but for the template to appear in the coverage data at all it must be instantiated at least once. You will have to do this manually, as in the example below.
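For example, a single explicit instantiation in some test file is enough to make the compiler emit (and thus gcov count) every member of one specialization; the class name here is hypothetical:
template class MyContainer<int>;  // forces code generation for all members of MyContainer<int>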

How to use --save_temps in a Bazel rule instead of on the command line?

Is there a way to control the Bazel build to generate the wanted temp files for a list of source files, instead of just using the command-line option --save_temps?
One way is to use a cc_binary and add the -E option to copts, but the object file name will always end in ".o", and such ".o" files get overwritten by other build targets. I don't know how to control the compiler output file name in Bazel.
Any better ideas?
cc_library has an output group with the static library, which you can then extract. Something like this:
filegroup(
    name = "extract_archive",
    srcs = [":some_cc_library"],
    output_group = "archive",
)
Many tools will accept the static archive instead of an object file. If the tool you're using does, then that's easy. If not, things get a bit more complicated.
Extracting the object file from the static archive is a bit trickier. You could use a genrule with the $(AR) Make variable, but that won't work with some C++ toolchains that require additional flags to configure architectures etc.
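For illustration, such a genrule might look like the sketch below. It assumes GNU ar, a library built from a single foo.cpp, and that the toolchains attribute is enough to make $(AR) resolve; as noted above, this can break with other toolchains:
genrule(
    name = "extract_object",
    srcs = [":extract_archive"],
    outs = ["foo_extracted.o"],
    # ar extracts members into the working directory; move the known one to the declared output
    cmd = "$(AR) x $< && mv foo.o $@",
    toolchains = ["@bazel_tools//tools/cpp:current_cc_toolchain"],
)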
The better (but more complicated) answer is to follow the guidance in integrating with C++ rules. You can get the ar from the toolchain and the flags to use it in a custom rule, and then create an action to extract it. You could also access the OutputGroupInfo from the cc_library in the rule directly instead of using filegroup if you've already got a custom rule.
Thanks all for your suggestions.
Now I think I can solve this problem in two steps (it seems Bazel does not allow combining two rules into one):
Step 1: add a -E option like a normal cc_library; we can call it a pp_library. That part is easy.
Step 2: in a new rule whose input is the target of the pp_library, find the object files (they can be found via action.outputs.to_list()) and copy them to a new place via ctx.actions.run_shell(); a sketch is below.
I used Bazel: copy multiple files to binary directory as a reference.
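A minimal sketch of such a copying rule. All names are hypothetical, and it assumes the pp_library exposes its ".o" outputs in DefaultInfo, filtering by file extension instead of inspecting actions:
def _extract_objs_impl(ctx):
    outs = []
    for f in ctx.attr.dep[DefaultInfo].files.to_list():
        if f.extension == "o":
            out = ctx.actions.declare_file(ctx.label.name + "/" + f.basename)
            # plain file copy; run_shell keeps this toolchain-independent
            ctx.actions.run_shell(
                inputs = [f],
                outputs = [out],
                command = "cp '%s' '%s'" % (f.path, out.path),
            )
            outs.append(out)
    return [DefaultInfo(files = depset(outs))]

extract_objs = rule(
    implementation = _extract_objs_impl,
    attrs = {"dep": attr.label()},
)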

Search for files that contain pattern

I have this search - I would like to print out the paths of files that contain the matching text:
grep -r "jasmine" .
and it yields results that look like this:
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:- [Jasmine Google Group](http://groups.google.com/group/jasmine-js)
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:- [Jasmine-dev Google Group](http://groups.google.com/group/jasmine-js-dev)
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:git clone git#github.com:yourUserName/jasmine.git # Clone your fork
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:cd jasmine # Change directory
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:git remote add upstream https://github.com/jasmine/jasmine.git # Assign original repository to a remote named 'upstream'
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:Note that Jasmine tests itself. The files in `lib` are loaded first, defining the reference `jasmine`. Then the files in `src` are loaded, defining the reference `j$`. So there are two copies of the code loaded under test.
./app-root/runtime/repo/node_modules/jasmine-core/.github/CONTRIBUTING.md:The tests should always use `j$` to refer to the objects and functions that are being tested. But the tests can use functions on `jasmine` as needed. _Be careful how you structure any new test code_. Copy the patterns you see in the existing code - this ensures that the code you're testing is not leaking into the `jasmine` reference and vice-versa.
But I just want the file names; I don't want to print out the matching contents. How can I do that?
The problem is that the matching text wraps around in the terminal and makes the results basically unreadable.
Did you use the -l flag from grep?
-l, --files-with-matches
Suppress normal output; instead print the name of each input file from which output would normally have been printed. The scanning will stop on the first match.
A simple search on my home directory:
grep -rl 'bash' .
./.bashrc
./.bash_history
./.bash_logout
./.bash_profile
./.profile
./.viminfo
As a matter of fact, -l is a POSIX-defined option for grep, so it should be available in almost all distros.

How do I find a particular sub directory using Ant and then use it to create a symlink?

I need to create a symlink to a sub-directory using Ant. The issue is that I don't know where the target sub-directory is.
To create a symlink with ant I do this:
<symlink link="${parent.dir}/FOO/linkname" resource="${parent.dir}/BAR/target"/>
But I don't know what BAR is called in advance so I need to do a search for "target" under parent.dir and then pass the one result into the resource.
Is this possible using fileset? Or another way?
It might be possible to use a fileset, but that could give you several symlinks, or none.
A much better approach is to define the path to BAR in a property. If there is a dynamic part in this path, change the build so that Ant computes the dynamic part and everything else consumes Ant's value.
The typical example here is a path that contains a version or timestamp. Define those in your build file so you can use them everywhere; if a Java process needs the values, pass them to it as system properties (-D...), as sketched below.
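A minimal sketch, assuming the dynamic part is a version number (the property names, paths, and class name are hypothetical):
<property name="bar.version" value="1.2.3"/>
<property name="bar.dir" value="${parent.dir}/BAR-${bar.version}"/>
<symlink link="${parent.dir}/FOO/linkname" resource="${bar.dir}/target"/>
<java classname="com.example.Main" fork="true">
    <jvmarg value="-Dbar.version=${bar.version}"/>
</java>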

How to add filters to OpenCover tool to skip some of the classes in a namespace

How can I add filters to skip some of the classes in a namespace/assembly? For example, SYM.UI is the base assembly and I want to skip SYM.UI.ViewModels. I'm using the filter below, but it includes all of them and doesn't fulfill my request:
+[SYM.UI*]* -[SYM.UI.ViewModels*]*
Kindly help me correct this?
The OpenCover wiki is a good place to start.
The usage is described as +/-[modulefilter]typefilter (based on how you would see the types in IL), where the type filter also includes the namespace, and the module filter is usually the name of the assembly (without the file extension).
Thus to exclude your types you could use
+[SYM.UI]* -[SYM.UI]SYM.UI.ViewModels.*
NOTE: Exclusion filters take preference over inclusion filters.
You can use the following:
"-filter:+[*]* -[SYM.UI]SYM.UI.ViewModels.*"
Note that the quotes must be around the -filter: part, too.
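In context, a full console invocation might look like this sketch (the target, its arguments, and the output name are hypothetical; only the -filter argument comes from the answers above):
OpenCover.Console.exe -register:user -target:"nunit3-console.exe" -targetargs:"SYM.UI.Tests.dll" -output:coverage.xml "-filter:+[*]* -[SYM.UI]SYM.UI.ViewModels.*"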
