I am using Common Test for code coverage analysis in my erlang project.
File structure
myProject/
ebin/
src/
test/
myProject.coverspec
The .beam files for the source code are located in ebin/, and those for the tests are in test/ together with the test sources.
I am currently using absolute paths to the .beam files in the .coverspec-file.
myProject.coverspec:
{level,details}.
{incl_dirs, ["/home/user/myProject/ebin", "/home/user/myProject/test"]}.
This works, but it is far from optimal since development on the project is distributed.
Although ct_run is called from the base of the project, myProject, the paths don't seem to be relative to myProject but to somewhere else.
I've tried paths relative to both myProject and myProject/test without success.
My question is: what are the paths in myProject.coverspec relative to?
In my last Erlang project where I have used CT and cover my setup looked like this:
cover.spec in project root directory:
{incl_dirs, ["apps/application_manager/ebin", "apps/session_counter/ebin", "apps/session_heartbeat/ebin", "apps/session_api/ebin"]}.
Tests are executed from the project root via:
ct_run -pa apps/*/ebin -pa deps/*/ebin -dir apps/*/test/ -logdir tests -cover cover.spec
Not sure if this solves your problem but it worked for me.
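For the single-app layout in the question, the analogous relative-path coverspec would be a sketch like the following (an assumption on my part, not something I have tested with that exact layout; it relies on ct_run being started from the project root, myProject/, as in my setup above):

```erlang
%% myProject.coverspec — sketch; paths given relative to where ct_run starts
{level, details}.
{incl_dirs, ["ebin", "test"]}.
```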
I am trying to generate a coverage report for my project using a .bat file, as detailed below.
I see very few .gcov files. Also, when I click a link in the generated HTML output, I am not able to see the file details (file not found error). How do I fix this?
After I execute the .bat file, I see output like 'parsing coverage data for QString.h' (Qt library files). Is that expected?
I have seen many related questions but have not been able to figure this out.
(in report_coverage.bat)
set GCovrpath= C:\python37\script\lib\
set GCovpath= C:\abc\ghj\bin\
set datafiles= C:\source\mywork\root\testing\unittests\rose\build\debug\
set gcovr_src= C:\source\mywork\root\
%GCovpath%gcov.exe %datafiles% >> output.log
gcovr %datafiles% -s -p --html --html-details --gcov-executable %GCovpath%gcov.exe -o Test.html --verbose
Here are the details.
Compile and execute the code using:
QMAKE_CXXFLAGS += -fprofile-arcs -ftest-coverage
QMAKE_LFLAGS += --coverage
.gcno and .gcda files are generated as expected.
The directory structure is simple:
Root
Header
Rose
Marigold
Jasmin
Source
Rose
Marigold
Jasmin
Testing
UnitTests
Rose
build
debug
Marigold
build
debug
Jasmin
build
debug
Thank you.
Update: see the answer below.
I cannot emphasize enough: use "\" as the path separator on Windows.
Run this command from the debug folder (because test.exe is there):
gcov -b -l -s C:\source\mywork\root\ debug\*.gcno
Run this command from the unit tests folder (this will exclude .h files and files containing "test"):
gcovr -g -k -v --root C:\source\mywork\root\ -e ".*\.h" -e ".*test[_-|A-Z|a-z|0-9]*\.cpp" --html --html-details -o report.html
If you invoke gcov yourself, you need to run it from the same directory where the compiler was executed, and you need to give it either the path to the gcno, gcda, or source file. Gcov can only handle one input file at a time.
When gcov runs in the correct place, it can look at compilation metadata to find the correct source file. If there are errors about missing source files, that indicates that you didn't use the correct directory.
Gcovr runs gcov automatically, and has heuristics to figure out the correct directory. However, you should still run it from the directory where you started the compilation (typically, a build directory).
And gcovr will exclude coverage data if it doesn't belong to your project. If you have a separate build directory, you will need to set the --root argument to the directory containing your source code. Gcov processes coverage data for all files that were compiled, which makes this post-processing by gcovr necessary.
In verbose mode, gcovr will output “Parsing coverage data for <file>” when opening a gcov report. It will then use data within the file to decide whether it belongs to your project, and output “Filtering coverage data” if the source code is part of your project, “Excluding coverage data” otherwise.
There are multiple reasons why the coverage report might not be complete:
There is a problem with filtering.
Gcovr's heuristics can get confused when multiple files have the same name, e.g. two files called util.h in different directories.
Gcovr's --html-details report consists of multiple .html files, so make sure that they are all available.
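As an aside not covered in the original answer: newer gcovr versions can also read options from a gcovr.cfg file in the directory where gcovr is run, which keeps long filter expressions out of the BAT file. A hypothetical fragment for the layout in the question (option names mirror the command-line flags, and the paths are assumptions):

```
# gcovr.cfg — hypothetical filters for the layout in the question
root = C:/source/mywork/root
filter = .*/Source/.*
exclude = .*\.h
```

Check your installed gcovr version's documentation before relying on this, as config-file support was added relatively recently.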
In your BAT file, this invocation might work better:
gcovr --root ../src --print-summary --sort-percentage --html-details --gcov-executable %GCovpath%gcov.exe --output Test.html --verbose
assuming the following directory structure, and that you run gcovr from within build/:
your-project/
src/
Header/
...
Source/
...
Testing/
...
build/
...
If there are problems with a root path like ../src, consider using an absolute path like C:/path/to/the/src.
When using:
$ rebar3 as test eunit
it compiles the code into ebin, but the other directories are symlinked in the _build/test/lib folder. I've tried using the test profile by modifying:
{relx, [{dev_mode, false}]}
This works only for the _build/test/rel directory, not the lib directory, so during tests the symlinked lib directories are referenced. Is there a way to have these directories be actual copies, as provided in the release, instead of symlinks to the originals?
After a quick look at the rebar3 code, there doesn't seem to be a way to force copying of these directories. It looks like the priv directory is always symlinked here. Even though the function used is called symlink_or_copy, it only falls back to copying when there is an error while creating the symlink.
dev_mode is a relx option, that's why it doesn't affect rebar3 features.
If you want this feature added, you can create a feature request explaining your use case and why you think it would be useful, and it might get implemented.
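In the meantime, a shell post-compile hook in rebar.config can replace the symlink with a real copy. This is a workaround sketch of my own, not a documented rebar3 feature for this case, and the app name myapp is hypothetical:

```erlang
%% rebar.config — hypothetical workaround: copy priv for the test profile
{profiles, [
    {test, [
        {post_hooks, [
            {compile, "rm -f _build/test/lib/myapp/priv"
                      " && cp -r priv _build/test/lib/myapp/priv"}
        ]}
    ]}
]}.
```

This assumes a Unix shell; on Windows the hook command would need the equivalent copy commands.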
I'm a newbie to Erlang, having just gone through some tutorials. Coming from a TDD background, I thought I should follow some TDD principles in Erlang. I have organized my code as below:
root
|- tests
| |- name_builder_tests.erl
|- src
| |- name_builder.erl
I start the Erlang shell in the root directory, but I cannot compile my .erl files from there, so I have to switch to the tests or src directory every time I change one of those files and need to recompile it.
Is there any way to tell the shell to look for modules in all the subdirectories when compiling modules or executing functions from particular modules? What I'm trying to ask is: with my shell in the root directory, can I successfully execute the following?
c(name_builder).
c(name_builder_tests).
Organize the code like this:
root
|- test
| |- name_builder_tests.erl
|- src
| |- name_builder.erl
|- rebar
|- rebar.config
Then run ./rebar compile eunit.
You can find the rebar script and docs here: https://github.com/basho/rebar/wiki
One of the approaches when doing unit testing, is to place the tests within the same module as the production code:
-module(my_code).
-export([run/0]).
run() -> ok.
-ifdef(TEST).
-include_lib("eunit/include/eunit.hrl").
run_test() ->
?assertEqual(ok, run()).
-endif.
That way you have the tests close to you.
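One detail the snippet above relies on: the -ifdef(TEST) block is only compiled in when the TEST macro is defined. From the Erlang shell, that looks roughly like this (a sketch; my_code matches the module above):

```erlang
%% Compile with the TEST macro defined, then run the EUnit tests
1> c("src/my_code.erl", [{d, 'TEST'}]).
{ok,my_code}
2> eunit:test(my_code).
```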
Regarding the availability of the code, you should run the erlang shell using the "-pa" parameter and specify the location of your code:
erl -pa src/
That's because, by default, compiling will put the .beam files in the same folder as the sources.
But I would recommend using something like rebar, as it will make your life easier.
HTH,
Alin
You can do this using an Emakefile as well to tell the compiler where to look for source files.
Create a file named Emakefile in the root dir with following contents
{'src/*', [debug_info,
{i, "src"},
{i, "include"},
{outdir, "ebin"}]}.
{'test/*', [debug_info,
{i, "src"},
{i, "include"},
{i, "test"},
{outdir, "ebin"}]}.
Now compile all the modules using erl -make, which will put all the .beam files inside the ebin/ dir.
Then start the shell by running erl -pa ebin/, which adds the ebin dir to the code path.
PS 1: I am pretty much an Erlang newbie too, and I learnt this approach from Learn You Some Erlang, more precisely from this lesson
PS 2: If you are working on an app that is planned to be more complicated than this, I would recommend that you check out rebar
In my project I want to use MySQL, so I checked out https://github.com/dizzyd/erlang-mysql-driver. I want to know how to install the application so that my project can interact with it.
Have a look at "rebar" - https://bitbucket.org/basho/rebar/wiki/Home
It can be used for installing dependencies, and for creating independent releases.
And a quick look at erlang-mysql-driver, that you want to use, shows that it is also using rebar for its dependency management.
rebar may complicate things if you have already started laying out your app (done some coding already) or if you are a newbie. However, if your project is an Erlang/OTP app, then I suggest that you first organize your code according to the recommended file system layout, like this:
MyProject--/src
/ebin
/lib
/include
/priv
/doc
/examples
/test
/Emakefile
The Emakefile is an important file. It has no file extension. It enables the library call make:all() to compile all the Erlang source modules you point it to and transfer the resulting .beam files to the destination you want.
For example, if I want all the modules in src to be compiled and the .beam files transferred into ebin, I enter this into the Emakefile:
{"src/*", [debug_info, netload, strict_record_tests, warn_obsolete_guard, {outdir, "ebin"}]}.
In that case I would start the Erlang shell with its pwd() in the folder MyProject, so that the call make:all() can find the Emakefile and compile all my src files.
Now, suppose you have another OTP app which you want to include as an extra package in your build. Say it is arranged OTP-style as I have shown you, but not yet built, i.e. only its src folder is populated and its ebin folder is empty (or contains just a .app file). Then you copy this OTP application into your lib folder, so that your project looks like this:
MyProject--/src
/ebin
/lib/some_otp_app-1.0
/include
/priv
/doc
/examples
/test
/Emakefile
Then we would change our Emakefile to look like this:
{"src/*", [debug_info, netload,strict_record_tests,warn_obsolete_guard,{outdir, "ebin"}]}.
{"lib/some_otp_app-1.0/src/*", [debug_info, netload,strict_record_tests,warn_obsolete_guard,{outdir, "lib/some_otp_app-1.0/ebin"}]}.
In the folder MyProject, you can put a shell script that starts your project and adds all the relevant ebin paths to your node's code path. The script may look like this:
#!/bin/bash
erl \
-name my_node#my_domain \
-pa ./ebin ./lib/*/ebin ./include \
-mnesia dump_log_write_threshold 10000 \
-eval "make:all()"
You can save this file as start_project.sh. Then, as you make changes to your source code, you start the project by running the script with your terminal pointing at the folder MyProject:
$pwd
/export/home/your_user_name/MyProject
$sh start_project.sh
This will start your project at the node named in the script and compile any src files that changed while it was down. Not only that: you can also call make:all() in the shell whenever you change your src code, and then call l(some_module) afterwards so that the Erlang VM reloads the new object code of the compiled module.
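The recompile-and-reload cycle described above looks roughly like this in the shell (some_module is the placeholder name from the text):

```erlang
1> make:all().         % recompile everything listed in the Emakefile
up_to_date
2> l(some_module).     % reload the new .beam into the running VM
{module,some_module}
```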
So, your entire project will now appear like this:
MyProject--/src
/ebin
/lib/some_otp_app-1.0
/include
/priv
/doc
/examples
/test
/Emakefile
/start_project.sh
So if you substitute the Erlang MySQL driver application for "some_otp_app-1.0", everything will be fine. Success!
Let's say I have this directory structure:
SConstruct
src/
a.cpp
b.cpp
include/
a.h
b.h
in SConstruct I don't want to specify ['src/a.cpp', 'src/b.cpp'] every time; I'm looking for some way to set the base source directory to 'src'.
Any hints? I've been looking through the docs but can't find anything useful.
A couple of options for you:
First, scons likes to use SConscript files for subdirectories. Put an SConscript in src/ and it can refer to local files (and will generate output in a build subdir as well). You can set up your environment once in the SConstruct. Then you "load" the SConscript from your master SConstruct.
SConscript('src/SConscript')
As your project grows, managing SConscript files in subdirectories is easier than putting everything in the master SConstruct.
Second, here's a similar question / answer that might help -- it uses Glob with a very simple example.
Third, since it's just python, you can make a list of files without the prefix and use a list comprehension to build the real list:
import os
file_sources = [ 'a.c', 'b.c' ]
real_sources = [os.path.join('src', f) for f in file_sources]
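Outside of SCons, that list comprehension behaves like this (plain Python; the file names are just examples):

```python
import os

# Prefix bare file names with the base source directory.
file_sources = ['a.cpp', 'b.cpp']
real_sources = [os.path.join('src', f) for f in file_sources]
print(real_sources)  # on POSIX: ['src/a.cpp', 'src/b.cpp']
```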