Passing BIGINT between Erlang VM and the NIFs - erlang

Is there an efficient way to pass a BIGINT (an integer exceeding 64 bits on x86_64/amd64 architectures) between the Erlang VM and a NIF? So far I haven't found a supporting function in the erl_nif API. Maybe converting BIGINTs to binaries would help, but there might be a better way.

This post from 2011 says there wasn't any support for big integers in the NIF API at the time. I couldn't find any such function in Erlang/OTP 21's documentation, so the statement is likely true as of today as well.
Here's how you could pass a big integer as an array of bytes:
From Erlang, instead of passing the integer directly, pass two values: the sign of the integer and the binary obtained by calling binary:encode_unsigned/1 on the integer.
Integer = ...,
my_nif_function(Integer < 0, binary:encode_unsigned(Integer)).
In the NIF function, you can get access to the bytes of the second argument using enif_inspect_binary:
ErlNifBinary bin;
if (!enif_inspect_binary(env, bin_term, &bin)) {
    return enif_make_badarg(env); // second argument was not a binary
}
bin.data now points to bin.size bytes, representing the bytes of the integer in Big Endian order (if you want Little Endian, pass little as the second argument to binary:encode_unsigned/2 above).

Related

Lua dissection functions definition

This code is part of a Lua dissection script. Could you explain the meaning of this code, especially the functions add_le and le_uint? Thanks.
-- Function: Upload functions request
function upload_function_req(buffer, subtree)
    subtree:add_le(buffer(14,2), "func_id:", buffer(14,2):le_uint())
    subtree:add_le(buffer(16,4), "fixed_values:", buffer(16,4):le_uint())
    subtree:add_le(buffer(20,2), "offset:", buffer(20,2):le_uint())
end
The function adds 3 fields to the protocol tree. The buffer(n,m) is a tvbrange, with n indicating the offset into the buffer and m indicating the length. All 3 fields are unsigned integers in little-endian format. The 1st and 3rd fields are 2-byte integers; the 2nd is a 4-byte integer. The function does some unnecessary work though and could be simplified like so:
function upload_function_req(buffer, subtree)
    subtree:add_le(buffer(14,2), "func_id:")
    subtree:add_le(buffer(16,4), "fixed_values:")
    subtree:add_le(buffer(20,2), "offset:")
end
If you want to learn more about the Lua API in Wireshark, you should have a look at the Wireshark Developer's Guide. Under Chapter 11. Wireshark's Lua API Reference Manual, you will find the relevant sub-chapters.
In particular:
The treeitem:add_le() is described in 11.7.1.3 treeitem:add_le([protofield], [tvbrange], [value],[label]).
The tvbrange:le_uint() is described in 11.8.3.3 tvbrange:le_uint().
add_le is the same as add, except that "le" means little-endian. Since network protocols mostly use big-endian, add is the default, and add_le is used where a field is stored little-endian. The same relationship holds between uint and le_uint: both take bytes from the buffer and combine them into an integer, according to big- or little-endian byte order.

I am getting warning for generating srandom(time(NULL)) [duplicate]

With the iPhone 5S update I want my app to be able to support the new 64-Bit processor.
However, using 64-Bit may cause truncation if a larger data type is cast into a smaller one, as in the case of casting a long into an int. Most of the time this can easily be fixed by just using the bigger data type, but in the case of random number generators, which are often seeded using the time(NULL) function, I cannot do that.
The current code is simple:
srandom(time(NULL));
But in Xcode 5 with 64-Bit it causes the following warning: Implicit conversion loses integer precision: 'time_t' (aka 'long') to 'unsigned int'. This is because time(NULL) returns a long integer and srandom requires an unsigned int. Therefore there are two options:
Convert the long integer to an unsigned int
Replace "time(NULL)" with another function which does the same job but returns an unsigned int.
Which one would you recommend and what function should I use to do it?
NOTE: I use random() instead of arc4random() because I also need to be able to seed the random number generator in order to get a repeatable outcome.
time() typically returns the number of seconds since the epoch (not counting leap seconds), which means if you use it more than once in a second (or two people run the program at the same time) then it will return the same value, resulting in a repeated sequence even when you don't want it. I recommend against using time(NULL) as a seed, even in the absence of a warning (or error with -Werror) caused by the truncation.
You could use arc4random() to get a random seed instead of a seed based on time. It also happens to return an unsigned 32-bit value which will fix the error you're seeing.
srandom(arc4random());
You might consider moving to Objective-C++ so that you can use the standard C++ <random> library, which is much more powerful and flexible than these other libraries, and which also enables simpler and more direct expression of many ideas:
C++ <random> documentation
On iOS, just use arc4random(3) and don't worry about seeding.

Julia: efficient memory allocation

My program is memory-hungry, so I need to save as much memory as I can.
When you assign an integer value to a variable, the type of the value will always be Int64 (on a 64-bit system), whether it's 0 or +2^63-1 or -2^63.
I couldn't find a smart way to efficiently allocate memory, so I wrote a function that looks like this (in this case for integers):
function right_int(n)
    types = [Int8, Int16, Int32, Int64, Int128]
    for t in reverse(types)
        try
            n = t(n)
        catch InexactError
            break
        end
    end
    n
end

a = right_int(parse(Int, eval(readline(STDIN))))
But I don't think this is a good way to do it.
I also have a related problem: what's an efficient way of operating with numbers without worrying about typemins and typemaxs? Convert each operand to BigInt and then apply right_int?
You're missing the forest for the trees. right_int is type unstable. Type stability is a key concept in reducing allocations and making Julia fast. By trying to "right-size" your integers to save space, you're actually causing more allocations and higher memory use. As a simple example, let's try making a "right-sized" array of 100 integers from 1-100. They're all small enough to fit in Int8, so that's just 100 bytes plus the array header, right?
julia> @allocated [right_int(i) for i=1:100]
26496
Whoa, 26,496 bytes! Why didn't that work? And why is there so much overhead? The key is that Julia cannot infer what the type of right_int might be, so it has to support any type being returned:
julia> typeof([right_int(i) for i=1:100])
Array{Any,1}
This means that Julia can't pack the integers densely into the array, and instead represents them as pointers to 100 individually "boxed" integers. These boxes tell Julia how to interpret the data that they contain, and that takes quite a bit of overhead. This doesn't just affect arrays, either — any time you use the result of right_int in any function, Julia can no longer optimize that function and ends up making lots of allocations. I highly recommend you read more about type stability in this very good blog post and in the manual's performance tips.
As far as which integer type to use: just use Int unless you know you'll exceed its range (2^63-1 on a 64-bit system). In the cases where you know you need to support huge numbers, use BigInt. It's notable that creating a similar array of BigInt uses significantly less memory than the "right-sized" array above:
julia> @allocated [big(i) for i=1:100]
6496

How to convert DWORD to FILETIME in LUA?

I am trying to read a file that has two DWORDs for the FILETIME (this is a prefetch file).
I read at offset 0x81 (0x80 + 1 because of 1-indexing in Lua). How do I go about taking the 8 bytes and converting them into a FILETIME using only Lua?
Starting at 0x80 in my hex editor, I have:
FB54B341B70CCF01
Needs to correlate to 01/08/2014
What is FILETIME
The Windows platform defines FILETIME to be a 64-bit integer "count of 100ns intervals since January 1, 1601 UTC".
You will have at least two challenges with dealing with FILETIME in Lua.
First, a FILETIME is a 64-bit integer, and Lua stores numbers internally as IEEE double precision, which carries only 53 bits of integer precision. To the precision of the envelope I just scribbled on, you need more than 57 significant bits to name any time today as a FILETIME.
(Aside: I estimated that by noticing that there are about 1e7*pi seconds in a year, 1e7 100ns ticks in a second, and today is about 413 years after the FILETIME epoch. So dates in 2014 need about log2(413e14 * pi) bits, or a little more than 57 bits.)
Second, pure Lua doesn't have easy to use functions for converting binary data structures to and from native Lua data types. It isn't difficult to build such functions out of string.byte() and string.sub() and that is even safe to do since strings are 8-bit clean. But it is something you have to build yourself, or find from a third-party source.
But be aware that although there are binary structure libraries out there, many of them only provide limited support for 64-bit integers due to the limitations of Lua numbers. You may be better suited by a hand-crafted module in C that stores a FILETIME in a userdata and provides suitable operators to allow them to be compared, converted to and from a string, and so forth.
Your Example
Starting at 0x80 in my hex editor, I have:
FB54B341B70CCF01
Needs to correlate to 01/08/2014
Windows on a PC is a little-endian platform. That means that values are stored with the least-significant byte at the lowest address. So we can rewrite your sample timestamp to be more readable by reversing the bytes:
01CF0CB741B354FB
As expected, the 57th bit is the most significant set bit, so this value is plausible for this century.

