Xcode 7 Memory Address - iOS

In Xcode 6:
int i = 17;
printf ("i stores its value at %p\n", &i);
I see something like this:
i stores its value at 0xbffff738
But in Xcode 7, the output looks like this:
i stores its value at 0x7fff5fbff7cc
Can someone explain the difference?

It has to do with the architecture you built for. The first is a 32-bit address; the second is a 64-bit address.
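A quick way to confirm which architecture a target is built for is to print the pointer size. This is a minimal C sketch (not from the original answer, just an illustration) that prints 4 under a 32-bit build and 8 under a 64-bit build:

#include <stdio.h>

int main(void) {
    int i = 17;
    /* A pointer is 4 bytes in a 32-bit build and 8 bytes in a 64-bit build,
       which is why the printed addresses have different widths. */
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    printf("i stores its value at %p\n", (void *)&i);
    return 0;
}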

Related

Why are printed memory addresses in Rust a mix of both 40-bit and 48-bit addresses?

I'm trying to understand the way Rust deals with memory, and I have a little program that prints some memory addresses:
fn main() {
    let a = &&&5;
    let x = 1;
    println!(" {:p}", &x);
    println!(" {:p} \n {:p} \n {:p} \n {:p}", &&&a, &&a, &a, a);
}
This prints the following (varies for different runs):
0x235d0ff61c
0x235d0ff710
0x235d0ff728
0x235d0ff610
0x7ff793f4c310
This is actually a mix of both 40-bit and 48-bit addresses. Why this mix? Also, can somebody please tell me why addresses 2, 3 and 4 do not fall in locations separated by 8 bytes (since std::mem::size_of_val(&a) gives 8)? I'm running Windows 10 on an AMD x86-64 processor (Phenom II X4) with 24 GB of RAM.
All the addresses do have the same size; Rust is just not printing the leading zero digits.
The actual memory layout is an implementation detail of your OS, but the reason a prints a location in a different memory region than all the other variables is that a actually lives in your loaded binary: it is a value the compiler can already compute. All the other variables are computed at runtime and live on the stack.
See the compilation result on https://godbolt.org/z/kzSrDr:
.L__unnamed_4 contains the value 5; .L__unnamed_5, .L__unnamed_6 and .L__unnamed_1 are &5, &&5 and &&&5.
So .L__unnamed_1 is what ends up at 0x7ff793f4c310 on your system, while the 0x235d0ff??? addresses are on your stack and are computed in the red and blue areas of the code.
This is actually a mix of both 40-bit and 48-bit addresses. Why this mix?
It's not really a mix; Rust just doesn't display leading zeroes. It's really about where the OS maps the various components of the program (data, bss, heap and stack) in the address space.
Also, can somebody please tell me why the addresses (2, 3, 4) do not fall in locations separated by 8-bytes (since std::mem::size_of_val(&a) gives 8)?
Because println! is a macro that expands to a bunch of stuff in the stack frame, your values are not laid out next to one another in the final code (https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=5b812bf11e51461285f51f95dd79236b). Even if they were, there would be no guarantee that the compiler wouldn't, for example, reuse now-dead memory to save on frame size.
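The leading-zero point can be reproduced outside Rust as well. Here is a small C analogy (my own sketch, not part of the answers above): %p typically suppresses leading zeros, while zero-padding the same pointer to 16 hex digits shows the full 64-bit value.

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    int x = 1;
    /* Default pointer formatting usually drops leading zeros... */
    printf("default:    %p\n", (void *)&x);
    /* ...zero-padding to 16 hex digits shows all 64 bits of the address. */
    printf("full width: 0x%016" PRIxPTR "\n", (uintptr_t)&x);
    return 0;
}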

D: (Win32) lua_rawgeti() pushes nil

I am investigating a strange problem: on Windows, lua_rawgeti() does not push back the value I created a reference to, but nil instead. Code:
lua_State *L = luaL_newstate();
luaL_requiref(L, "_G", luaopen_base, 1);
lua_pop(L, 1);
lua_getglobal(L, toStringz("_G"));        // push _G onto the stack
int t1 = lua_type(L, -1);                 // should be LUA_TTABLE (5)
auto r = luaL_ref(L, LUA_REGISTRYINDEX);  // pop _G and store a reference to it
lua_rawgeti(L, LUA_REGISTRYINDEX, r);     // push the referenced value back
int t2 = lua_type(L, -1);                 // expected to match t1
lua_close(L);
writefln("Ref: %d, types: %d, %d", r, t1, t2);
assert(r != LUA_REFNIL);
assert((t1 != LUA_TNIL) && (t1 == t2));
Full source and build bat: https://github.com/mkoskim/games/tree/master/tests/luaref
Compile & run:
rdmd -I<path>/DerelictLua/source/ -I<path>/DerelictUtil/source/ testref.d
64-bit Linux (_G is a table, and rawgeti pushes a table onto the stack):
$ build.bat
Ref: 3, types: 5, 5
32-bit Windows (_G is a table, but rawgeti pushes nil onto the stack):
$ build.bat
Ref: 3, types: 5, 0
<assertion fail>
So either luaL_ref() fails to store the reference to _G correctly, or lua_rawgeti() fails to retrieve it correctly.
Update: I compiled the Lua library from source and added a printf() to lua_rawgeti() (lapi.c:660) to print out the reference:
printf("lua_rawgeti(%d)\n", n);
I also added a writeln() to test.d to tell me at which point we call lua_rawgeti(). It shows that D sends the reference number correctly:
lua_rawgeti(2)
lua_rawgeti(0)
Dereferencing:
lua_rawgeti(3)
Ref: 3, types: 5, 0
On Windows, I use:
DMD 2.086.0 (32-bit Windows)
lua53.dll (32-bit Windows, I have tried both lua-5.3.4 and lua-5.3.5), from here: http://luabinaries.sourceforge.net/download.html
DerelictLua newest version (commit 5549c1a)
DerelictUtil newest version (commit 8dda339)
Questions:
Is there a bug in the code that I'm just not catching? Are there any known quirks when using 32-bit D and Lua on Windows? There can't be any big problem with my compiler and libraries, because everything compiles and links without errors, and most Lua calls work (e.g. opening the Lua state, pushing _G onto the stack and so on).
I was not able to find anything related by googling, so I am pretty sure there is something wrong in my setup (something is mismatched). It is hard for me to suspect problems in the Lua libraries, because they have been stable for quite a long time (even the 32-bit versions).
I would like to know whether people have used 64-bit Windows DMD + Lua successfully. Of course, I would also appreciate hearing whether people use 32-bit Windows DMD + Lua successfully.
I am a bit out of ideas about where to look for a solution. Any ideas what to try next?
Thanks in advance!
I got an answer from the Lua mailing list: http://lua-users.org/lists/lua-l/2019-05/msg00076.html
I suspect this is a bug in DerelictLua.
Lua defines lua_rawgeti thus:
int lua_rawgeti (lua_State *L, int index, lua_Integer n);
While DerelictLua defines its binding thus:
alias da_lua_rawgeti = int function(lua_State*, int, int);
I fixed that and created a pull request to DerelictLua.
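For context, here is my paraphrase (in C, not part of the mailing-list answer) of why the mismatch only bites on 32-bit Windows: in Lua 5.3, lua_Integer defaults to long long, so the third parameter of lua_rawgeti() is 64 bits wide even in a 32-bit lua53.dll. A binding that declares that parameter as a 32-bit int passes too few bytes, and the callee reads garbage for the upper half of n, which plausibly explains the nil result.

/* Sketch paraphrased from lua.h / luaconf.h (Lua 5.3 defaults). */
typedef struct lua_State lua_State;
typedef long long lua_Integer;   /* 64-bit by default, even on 32-bit Windows */

/* The real API: the key n is a lua_Integer. */
int lua_rawgeti(lua_State *L, int idx, lua_Integer n);

/* The old binding behaved as if the API were:
       int lua_rawgeti(lua_State *L, int idx, int n);
   On 32-bit Windows the caller then pushes only 4 bytes for n while the
   callee reads 8, so the registry key actually looked up is not r. */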

Getting garbage value when converting to long in Objective-C

I am trying to convert an NSString to a long, but I am getting a garbage value. Below is my code:
long t1 = [[jsonDict valueForKeyPath:@"detail.amount"] doubleValue] * 1000000000000000000;
long t2 = [[jsonDict valueForKeyPath:@"detail.fee"] doubleValue] * 10000000000000000;
NSLog(@"t1: %ld", t1);
NSLog(@"t2: %ld", t2);
detail.amount = 51.74
detail.fee = 2.72
Output:
t1: 9223372036854775807 (Getting Garbage value here)
t2: 27200000000000000 (Working fine)
Thanks in advance.
Each number type (int, long, double, float) has limits. For your 64-bit long (your device is 64-bit), the upper limit is 9,223,372,036,854,775,807 (see here: https://en.wikipedia.org/wiki/9,223,372,036,854,775,807).
In your case, 51.74 * 1,000,000,000,000,000,000 =
51,740,000,000,000,000,000
While Long 64bit only has a maximum of
9,223,372,036,854,775,807
So an overflow happens at 9,223,372,036,854,775,808 and above, which is what your calculation runs into.
Also note that what you are doing will cause problems even if you only cater for the 64-bit long range: what happens when your app runs on a 32-bit device (like the iPhone 5c or earlier)?
It is generally a bad idea to use such large numbers unless you are doing complex maths. If number accuracy is not critical, consider simplifying the number, e.g. 51,740G (G = giga), etc.
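To make the arithmetic in the answer above concrete, here is a minimal C sketch (values taken from the question; the out-of-range cast is undefined behaviour in C, and on arm64 the conversion saturates, which matches the value in the question):

#include <stdio.h>
#include <limits.h>

int main(void) {
    double amount = 51.74;
    double fee = 2.72;

    double scaled_amount = amount * 1000000000000000000.0;  /* ~5.174e19, above LLONG_MAX */
    double scaled_fee    = fee * 10000000000000000.0;       /* ~2.72e16, still in range */

    printf("LLONG_MAX     = %lld\n", LLONG_MAX);
    printf("scaled amount = %.0f\n", scaled_amount);
    printf("scaled fee    = %.0f\n", scaled_fee);

    /* Converting an out-of-range double to a signed integer is undefined
       behaviour in C; on arm64 the conversion instruction saturates, which
       is why the question sees 9223372036854775807 for t1. */
    long long t1 = (long long)scaled_amount;
    long long t2 = (long long)scaled_fee;
    printf("t1 = %lld\nt2 = %lld\n", t1, t2);
    return 0;
}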
It's because you are storing the product in long variables t1 and t2. Use either float or double and you'll get the correct answer.
Based on C's data types:
Long signed integer type. Capable of containing at least the [−2,147,483,647, +2,147,483,647] range; thus, it is at least 32 bits in size.
Ref: https://en.wikipedia.org/wiki/C_data_types
9223372036854775807 is the maximum value of a 64-bit signed long. I deduce that [[jsonDict valueForKeyPath:@"detail.amount"] doubleValue] * 1000000000000000000 is larger than the maximum long value, so when you cast it to long, you get the closest value that long can represent.
As you have read, this is not possible with long. Since it looks like you are doing financial math, you should use NSDecimalNumber instead of double to solve that problem.

Swift InputStream status does not match any possible enum values

I have an InputStream in my iOS app (Swift 4) and check its status a few seconds after opening it. The returned value is 7:
print(String(self.inputStream!.streamStatus.rawValue)) // prints 7
But there is no corresponding entry with the value 7 in the enum described in the docs: https://developer.apple.com/documentation/foundation/stream.status
How can the value be 7? How can I get a textual representation of the status?
My guess was that the 7 might be a combination of other values, obtained by interpreting the 7 in binary and looking at which bits are 1. In this case the first 3 bits are set (decimal 1, 2, 4), which would correspond to opening = 1, open = 2 and writing = 4. But this seems strange to me.
EDIT: As Martin said in the comments, 7 is error.
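For reference, Stream.Status.rawValue maps onto Foundation's underlying NSStreamStatus values, which are sequential cases rather than bit flags. A C-style sketch of that enum (paraphrased from the Foundation headers) shows why 7 is a single status, not a combination:

/* Paraphrased from Foundation's NSStream.h: sequential values, not bit flags. */
typedef enum {
    NSStreamStatusNotOpen = 0,
    NSStreamStatusOpening = 1,
    NSStreamStatusOpen    = 2,
    NSStreamStatusReading = 3,
    NSStreamStatusWriting = 4,
    NSStreamStatusAtEnd   = 5,
    NSStreamStatusClosed  = 6,
    NSStreamStatusError   = 7
} NSStreamStatus;

A textual representation can be obtained by switching over the status cases; when the status is the error case, streamError carries the underlying error.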

Trouble with phone number on iPad mini

I am developing an app that shows a 10-digit number. When I run it on my iPhone 6 it displays fine: 8183874201. But when I run it on my old iPad mini it shows the number like this: -467821389.
The code I am running is:
var telefonoCasaStaff = self.timelineData[0].objectForKey("TelCasa") as Int
self.telCasaTextLabel.text = String(telefonoCasaStaff)
Any ideas?
Int is a 64-bit integer on 64-bit devices and a 32-bit integer on 32-bit devices.
8183874201 = 0x1E7CC0299 exceeds the range of 32-bit integers and apparently is truncated to 32 bits.
You could use Int64, but generally, storing phone numbers as integers does not make much sense; you should use strings instead.
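To illustrate the truncation, here is a minimal C sketch (an analogy for what happens when Int is only 32 bits wide):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    int64_t phone = 8183874201;   /* 0x1E7CC0299, needs more than 32 bits */

    /* Keeping only the low 32 bits discards the high bits and can set the
       sign bit, so the number comes out negative instead of 8183874201.
       (Conversion of an out-of-range value is implementation-defined in C.) */
    int32_t truncated = (int32_t)phone;

    printf("64-bit: %lld\n", (long long)phone);
    printf("32-bit: %d\n", (int)truncated);
    return 0;
}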
