Does a BEAM file remember whether it was built with -Werror? - erlang

I am working on a tool that deals with BEAM files, and we want to be able to assume the code was compiled with -Werror, so we don't have to repeat validations that are already done by the erl_lint compiler pass.
Is there a way to figure out if the BEAM was built with -Werror?
I'd expect beam_lib:chunks/2 to help here, but unfortunately it doesn't seem to have what I'm looking for:
beam_lib:chunks("sample.beam", [debug_info, attributes, compile_info]).
% the stuff returned says nothing about -Werror, even if I compile with -Werror

It seems that this information is always stripped.
However, if you are in control of the compilation process, you can put additional info into BEAM files, which will be accessible through M:module_info(compile) and via BEAM chunks as well.
For example in rebar:
{erl_opts, [debug_info, {compile_info, [{my_key, my_value}]}]}.
And then:
1> my_module:module_info(compile).
[{version,"7.6.6"},
{options,[debug_info, ...
{my_key,my_value}]
The same is true for "discoverability" of this key directly from beam chunks:
2> beam_lib:chunks("my_beam.beam", [compile_info]).
{ok, ... {my_key,my_value}]}]}}
This means you can easily "stamp" your BEAM files with some meta-information. So a workaround may be to stamp those BEAM files with such a marker.
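For illustration, here is a minimal sketch of the same idea without rebar, stamping the BEAM directly via compile:file/2 and reading the marker back with beam_lib:chunks/2. The built_with_werror key and the paths are placeholders I made up; warnings_as_errors is the compile-option equivalent of -Werror:
%% Stamp the BEAM at compile time (built_with_werror is an arbitrary marker key).
{ok, my_module} =
    compile:file("src/my_module.erl",
                 [debug_info, warnings_as_errors,
                  {compile_info, [{built_with_werror, true}]},
                  {outdir, "ebin"}]),
%% Later, read the marker back from the compile_info chunk.
{ok, {my_module, [{compile_info, Info}]}} =
    beam_lib:chunks("ebin/my_module.beam", [compile_info]),
proplists:get_value(built_with_werror, Info, false).
Depending on how the build tool passes the option through, the key may end up at the top level of the chunk or nested under options, so check both when in doubt.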

Related

Different checksum results for jar files compiled on subsequent build?

I am verifying jar files present on remote Unix boxes against the ones built on my local machine (Windows & Cygwin) with the same JVM.
As a proof of concept, I am trying to verify whether the same checksum is produced for jar files generated on my machine by consecutive builds. I tried the following:
Generated the jar file first time using ant script
Calculated the checksum (e.g. "xyz abc")
Generated the jar file again with same ant script without changing anything
I got different checksum but same byte count (e.g. "xvw abc")
I am not sure how Java's internal processes produce the class files and then the jar files. Can someone please help me understand the points below?
Does the cksum utility of unix/cygwin consider timestamp of the file while coming up with the value?
Will the checksum be different for compiled class files/jar file produced if we keep every other things same [Compiler version + sourcecode + machine + environment]?
Answer to question 1: cksum doesn't consider the timestamp of the archive itself (e.g. the jar file), but the timestamps of the files inside the jar are part of the archive's bytes, so they do influence the checksum.
Answer to question 2: The checksums of the individual class files will be the same when everything else is the same (source code, compiler, etc.). The checksums of the jar files will be different. Causes of the differences can be the timestamps of the files inside the jar, or files being put into the archive in a different order (e.g. caused by parallel builds).
If you want to create a reproducible build with gradle you can do so with the config below:
tasks.withType(AbstractArchiveTask) {
preserveFileTimestamps = false
reproducibleFileOrder = true
}
Maven allows something similar; sorry, I don't know how to do this with Ant.
More info here:
https://dzone.com/articles/reproducible-builds-in-java
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=74682318

How to control the function order in edoc file?

I want to generate the edoc file for one Erlang module; here is my code: http://ideone.com/TFktht
I want to control the order of the exported API functions in the generated documentation, but I cannot get it to work.
generate_log_statistics/4 Analyse the log files and draw graph.
prepare_sipp_cmd/3 Shows SIPp online help documentation of what config parameters there are.
prepare_systeminfo_filenames/1 Prepare the log files according which blades you want to monitor.
run_scenario/6 Run the SIPp scenario.
start_collect_system_infos/1 Start collecting the logs.
stop_collect_system_infos/1 Stop collecting the logs.
These exported API functions are not ordered in the edoc output.
I want:
prepare_sipp_cmd/3
prepare_systeminfo_filenames/1
start_collect_system_infos/1
run_scenario/6
stop_collect_system_infos/1
generate_log_statistics/4
To order functions
Put this in your rebar.config:
{edoc_opts, [{sort_functions, true}]}.
Or you can generate single doc files with something like:
edoc:file("src/foo.erl", [{dir, "doc"}, {sort_functions, true}]).
"doc" being the output directory.
To achieve specific order
Turn off function sorting, as above, but with {sort_functions, false} and order the functions in your source as required.
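For example (a sketch reusing the placeholder paths from above), to make the generated docs follow the source order:
edoc:file("src/foo.erl", [{dir, "doc"}, {sort_functions, false}]).
and then arrange the functions in src/foo.erl in the order you want them to appear.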

How do I get the exported types of an erlang module?

I had cause to check the types exported by a module, and I immediately thought "right, module_info then" but was surprised to run into a few difficulties. I found I can get the exported types from modules I compile, but not from say modules in stdlib.
My (three) questions are, how do I reliably get the exported types of a module, why are the exported types in the attributes bit of the module info on some modules, and why some modules and not others?
I discovered that if I build this module:
-module(foo).
-export([bar/0]).
-export_types([baz/0]).
bar() -> bat .
And then use foo:module_info/0, I get this:
[{exports,[{bar,0},{module_info,0},{module_info,1}]},
{imports,[]},
{attributes,[{vsn,[108921085595958308709649797749441408863]},
{export_types,[{baz,0}]}]},
{compile,[{options,[{outdir,"/tmp"}]},
{version,"5.0.1"},
{time,{2015,10,22,10,38,8}},
{source,"/tmp/foo.erl"}]}]
Great, hidden away in 'attributes' is 'export_types'. Why this is in attributes I'm not quite sure, but... whatever...
I now know this will work:
4> lists:keyfind(export_types, 1, foo:module_info(attributes)).
{export_types,[{baz,0}]}
Great. So I assumed this would also work:
5> lists:keyfind(export_types, 1, ets:module_info(attributes)).
false
Ah... it doesn't.
I know there are exported types, of course; if the documentation isn't proof enough, the ets source shows:
-export_type([tab/0, tid/0, match_spec/0, comp_match_spec/0, match_pattern/0]).
In fact the exported type information for the ets module doesn't seem to be anywhere in the module info:
6> rp(ets:module_info()).
[{exports,[{match_spec_run,2},
{repair_continuation,2},
{fun2ms,1},
{foldl,3},
{foldr,3},
{from_dets,2},
{to_dets,2},
{test_ms,2},
{init_table,2},
{tab2file,2},
{tab2file,3},
{file2tab,1},
{file2tab,2},
{tabfile_info,1},
{table,1},
{table,2},
{i,0},
{i,1},
{i,2},
{i,3},
{module_info,0},
{module_info,1},
{tab2list,1},
{match_delete,2},
{filter,3},
{setopts,2},
{give_away,3},
{update_element,3},
{match_spec_run_r,3},
{match_spec_compile,1},
{select_delete,2},
{select_reverse,3},
{select_reverse,2},
{select_reverse,1},
{select_count,2},
{select,3},
{select,2},
{select,1},
{update_counter,3},
{slot,2},
{safe_fixtable,2},
{rename,2},
{insert_new,2},
{insert,2},
{prev,2},
{next,2},
{member,2},
{match_object,3},
{match_object,2},
{match_object,1},
{match,3},
{match,2},
{match,1},
{last,1},
{info,2},
{info,1},
{lookup_element,3},
{lookup,2},
{is_compiled_ms,1},
{first,1},
{delete_object,2},
{delete_all_objects,1},
{delete,2},
{delete,1},
{new,2},
{all,0}]},
{imports,[]},
{attributes,[{vsn,[310474638056108355984984900680115120081]}]},
{compile,[{options,[{outdir,"/tmp/buildd/erlang-17.1-dfsg/lib/stdlib/src/../ebin"},
{i,"/tmp/buildd/erlang-17.1-dfsg/lib/stdlib/src/../include"},
{i,"/tmp/buildd/erlang-17.1-dfsg/lib/stdlib/src/../../kernel/include"},
warnings_as_errors,debug_info]},
{version,"5.0.1"},
{time,{2014,7,25,16,54,59}},
{source,"/tmp/buildd/erlang-17.1-dfsg/lib/stdlib/src/ets.erl"}]}]
ok
I took things to extremes now and ran this, logging the output to a file:
rp(beam_disasm:file("/usr/lib/erlang/lib/stdlib-2.1/ebin/ets.beam")).
Not that I don't consider this absurd, but anyway: it's about 5,000 lines of output, and nowhere in it do I find an instance of the string "tid".
Up to Erlang 18 this information is not easily available.
Dialyzer, for example, extracts it from the abstract syntax tree of the core Erlang version of a module (see e.g. dialyzer_utils:get_record_and_type_info/1, used by e.g. dialyzer_analysis_callgraph:compile_byte/5).
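As a rough illustration of that approach (not from the original answer; it assumes the BEAM was compiled with debug_info, as the stdlib modules are), you can pull the export_type attributes out of the abstract code chunk yourself:
%% Returns the exported types of a module compiled with debug_info.
exported_types(Beam) ->
    {ok, {_Mod, [{abstract_code, {raw_abstract_v1, Forms}}]}} =
        beam_lib:chunks(Beam, [abstract_code]),
    lists:append([Types || {attribute, _, export_type, Types} <- Forms]).
For example, exported_types(code:which(ets)) should then include {tab,0}, {tid,0} and the other types listed in the -export_type attribute quoted from the ets source above.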
Regarding this part:
why are the exported types in the attributes bit of the module info on some modules, and why some modules and not others?
this is due to a bad attribute in your module. The attribute should be -export_type, not -export_types. If you use the correct one (and define the baz/0 type and use it somewhere so that the module compiles), the exported types vanish from the attributes list, as expected.

How to make rebar/reltool respect subdirs in ebin when making release?

In my app, I have dir structure close to the following:
src/
  api/
    server.erl
    model.erl
  common/
    common_stuff.erl
    util.erl
  some_app.erl
  some_server.erl
  something_else.erl
  some_app.app.src
Files that reside in subdirs (common, api etc) are namespaced in a usual package style.
For example, src/common/util.erl is declared as:
-module(common.util).
src/api/server.erl is declared as:
-module(api.server).
and so on.
rebar compile works perfectly and generates appropriate subdir tree in ebin:
ebin/
  api/
    server.beam
    model.beam
  common/
    common_stuff.beam
    util.beam
  some_app.beam
  some_server.beam
  something_else.beam
  some_app.app
But, rebar generate only copies top-level files to the rel dir:
rel/some_app/lib/some_app-0.0.2/ebin/
  some_app.beam
  some_server.beam
  something_else.beam
  some_app.app
Everything that resides in subdirs is not copied over to release. Thus, when I try to start generated release, I immediately get this kind of error message:
{"init terminating in do_boot",{'cannot load','api.server',get_files}}
Crash dump was written to: erl_crash.dump
init terminating in do_boot ()
My rebar-generated ebin/some_app.app does list all the required modules:
{application,some_app,
[{description,"0.0.2"},
{vsn,"0.0.2"},
{registered,[]},
{applications,[kernel,stdlib,sasl]},
{mod,{some_app,[]}},
{env,[]},
{modules,['api.server','api.model','common.common_stuff',
'common.util', some_app, some_server,
something_else]}]}.
Does anybody know how to make "rebar generate" respect ebin's subdirs? I believe this could be reltool's problem as well.
Thanks.
Only flat application structures are well supported by Erlang.
"Packages has since it was introduced more than 5 years ago been an experimental feature. Use it at your own risk, we do not actively maintain and develop this feature. It might however be supported some day.
In spite of this packages work quite well, but there are some known issues in tools and other parts where packages don't work well."
http://www.erlang.org/doc/man/packages.html
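Given that, the usual workaround (my suggestion, not part of the quoted documentation) is to drop the package namespaces and encode them in flat module names instead, e.g.:
%% src/api_server.erl - flat name instead of src/api/server.erl with -module(api.server).
-module(api_server).
-export([start/0]).  %% placeholder export, just for illustration
start() -> ok.
All modules then compile directly into ebin/, which is the flat layout that reltool and rebar generate handle.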

Erlang: add libraries to application

I use erlIDE (based on Eclipse) to work on Erlang projects. Until today everything was fine, but now I have to use an external library (couchbeam) in my application (which, I found out, is hell, by the way).
The problem is simple: I cannot add an external library to the compiler path. I've used rebar to get couchbeam's dependencies, and it also downloaded ibrowse, mochiweb and ejson.
How can I add those libraries to the compiler path without modifying ERL_LIBS, so that I can work on the project in erlIDE?
I do not want to modify ERL_LIBS, because I might change the project's path, start a new project (and then have to modify ERL_LIBS again), and so on.
I've tried compiler options in erlIDE:
{pa, {pa, 'site_stater/deps/couchbeam/'}}
or
{pa, {pa, '../deps/couchbeam/'}}
where 'site_stater' is the project's name.
I also wonder how professional Erlang programmers organize their project workflow (where they write Erlang programs, how they debug them, how they deal with external libraries, and so on).
Many thanks for your attention.
UPDATE
I wrote a simple function to load the libraries, but I think it is still the wrong way to deal with this problem:
load_libraries() ->
    ProjectRoot = filename:join([filename:absname("./"), "site_stater"]),
    {ok, DepsList} = file:list_dir(ProjectRoot ++ "/deps/"),
    lists:foreach(fun (Folder) ->
                      RealFolder = ProjectRoot ++ "/deps/" ++ Folder,
                      case filelib:is_dir(RealFolder) of
                          true ->
                              code:add_patha(filename:join([RealFolder, "/ebin"]));
                          false -> ok
                      end
                  end,
                  DepsList),
    ok.
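A slightly more compact variant of the same idea (just a sketch based on the asker's deps/ layout) is to let filelib:wildcard/1 find the ebin directories and add them in one go:
%% Prepend every deps/*/ebin directory to the code path.
load_deps(ProjectRoot) ->
    code:add_pathsa(filelib:wildcard(filename:join(ProjectRoot, "deps/*/ebin"))).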
I can't verify it right now, but you should be able to use {pa, '../deps/couchbeam/'} in the compiler options. If that doesn't work, please try using an absolute path.
The compiler settings are not finished yet; we plan to have a simpler way to refer to external libraries, but we're not there yet. Every such query from users increases the importance of fixing it!
regards,
Vlad
