Xamarin - Copy data from the IntPtr of the NSData.Bytes property into a UInt16 variable - iOS

I am creating a Bluetooth app on the Xamarin.iOS platform. I am having issues collating the data that comes from the peripheral my app is connected to. I have the Objective-C code below, but I am having a hard time converting it to C#, as I am not sure how to split the array of bytes in C#. Any advice on how to correctly receive the data from the device? I thought of using the Marshal class, but I am not sure whether it does the same thing at the iOS level.
This is the Objective-C code, which works fine on iOS:
UInt16 cValue;
[characteristic.value getBytes:&cValue length:2];
As you can see, it calls the getBytes:length: method of the NSData class from Apple's API, which does the trick, but I could not find anything similar in the NSData class of the Xamarin.iOS API.
And this is how I am thinking of doing it in C#:
byte[] destination = new byte[16];
Marshal.Copy(characteristic.Value.Bytes, destination, 2, /* not sure about the length */);
Here's the Marshal.Copy method that I am referring to:
http://msdn.microsoft.com/en-us/library/vstudio/ms146631
One last thing: I am not sure about the byte[16]; I just assumed that because I thought UInt16 in Objective-C was an 8-bit unsigned integer.

First of all, UInt16 is a 16-bit unsigned integer. byte[16] is a 16-byte array of 8-bit unsigned integers.
You will want a byte[2] to store the UInt16.
byte[] bytes = new byte[characteristic.Value.Length];
Marshal.Copy(characteristic.Value.Bytes, bytes, 0, Convert.ToInt32(characteristic.Value.Length));
To finally convert to a ushort, which is the C# equivalent of Objective-C's UInt16, you can use
ushort value = (ushort)(bytes[0] | (bytes[1] << 8));
Update
For a little-endian uint (Objective-C UInt32), the conversion would look like this:
uint value = (uint)(bytes[0] | (bytes[1] << 8) | (bytes[2] << 16) | (bytes[3] << 24));
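Putting it together, a minimal sketch in C# (assuming characteristic is a CBCharacteristic whose Value carries a little-endian UInt16, as in the Objective-C snippet in the question, and that System.Runtime.InteropServices is imported for Marshal):
byte[] bytes = new byte[(int)characteristic.Value.Length];
Marshal.Copy(characteristic.Value.Bytes, bytes, 0, bytes.Length); // copy the raw NSData bytes into managed memory
ushort cValue = (ushort)(bytes[0] | (bytes[1] << 8));             // assemble the little-endian UInt16
BitConverter.ToUInt16(bytes, 0) would give the same result, since iOS/ARM is little-endian.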

Related

What type is NSString and how many bytes?

I am new to Objective-C and am trying to find out the type of NSString. I use the sizeof() operator from C and NSString's lengthOfBytesUsingEncoding: method with UTF-8 encoding.
NSString *test = @"a";
NSLog(@"LengthOfBytesUsingEncoding: %lu bytes", [test lengthOfBytesUsingEncoding:NSUTF8StringEncoding]);
printf("NSString: %lu\n", sizeof(test));
This gives me the following in the console:
LengthOfBytesUsingEncoding: 1 bytes
and NSString: 8 bytes
What is the difference between the two results?
Why does lengthOfBytesUsingEncoding return 1 byte while sizeof returns 8 bytes?
What is the type of NSString? Int, float, long, long double?
lengthOfBytesUsingEncoding: gives you the length of the text content in the specified encoding. In this case the string contains a single character, which UTF-8 encodes as 1 byte.
sizeof gives you the size of the variable's type, which in this case is a pointer to NSString. All pointers on 64-bit architectures are 8 bytes; it is essentially the size of the memory address where the NSString data is stored. Note that sizeof is not a method, and not even a function; it is an operator, and its result is known at compile time.
In other words:
The actual string contents are stored in memory in a format that is opaque and shouldn't interest you.
Somewhere else in memory there is an NSString structure that contains a pointer to those contents. You can get the size of this structure with sizeof(NSString) (the actual size differs depending on the concrete NSString subclass, e.g. NSMutableString, NSPlaceholderString, etc.).
Your variable contains a pointer to that NSString, so its size is sizeof(NSString*), which is 8 bytes on a 64-bit platform.
The sizeof operator shouldn't interest you much in Objective-C unless you are doing pointer arithmetic, which should be rather rare.
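A hypothetical C# parallel (not part of the question, just to illustrate the same distinction): the size of a pointer or reference is fixed by the architecture, while the byte length of the text depends on the content and encoding.
Console.WriteLine(IntPtr.Size);                     // 8 in a 64-bit process: the size of a pointer
Console.WriteLine(Encoding.UTF8.GetByteCount("a")); // 1: the UTF-8 byte length of the content
(IntPtr lives in System, Encoding in System.Text.)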

Convert first two bytes of Lua string (in bigendian format) to unsigned short number

I want a Lua function that takes a string argument. The string has N+2 bytes of data: the first two bytes hold the length in big-endian format, and the remaining N bytes contain the data.
Say the data is "abcd", so the string is 0x00 0x04 a b c d.
This string is the input argument to my Lua function.
How can I calculate the length in an optimal way?
So far I have tried the code below.
function calculate_length(s)
  local len = string.len(s)
  if len >= 2 then
    local first_byte = s:byte(1)
    local second_byte = s:byte(2)
    -- in C: len = ((first_byte & 0xFF) << 8) | (second_byte & 0xFF)
    len = second_byte
  else
    len = 0
  end
  return len
end
See the commented line (that's how I would have done it in C).
How do I achieve the commented line in Lua?
The number of data bytes in your string s is #s-2 (assuming that even a string with no data still carries the two length bytes, each with a value of 0). If you really need to use those header bytes, you could compute:
len = first_byte * 256 + second_byte
When it comes to strings in Lua, a byte is a byte as this excerpt about strings from the Reference Manual makes clear:
The type string represents immutable sequences of bytes. Lua is 8-bit clean: strings can contain any 8-bit value, including embedded zeros ('\0'). Lua is also encoding-agnostic; it makes no assumptions about the contents of a string.
This is important if using the string.* library:
The string library assumes one-byte character encodings.
If the internal representation in Lua of your number is important, the following excerpt from the Lua Reference Manual may be of interest:
The type number uses two internal representations, or two subtypes, one called integer and the other called float. Lua has explicit rules about when each representation is used, but it also converts between them automatically as needed.... Therefore, the programmer may choose to mostly ignore the difference between integers and floats or to assume complete control over the representation of each number. Standard Lua uses 64-bit integers and double-precision (64-bit) floats, but you can also compile Lua so that it uses 32-bit integers and/or single-precision (32-bit) floats.
In other words, the 2-byte "unsigned short" C data type does not exist in Lua. In standard Lua, integers are stored as 64-bit signed values (the C "long long" type).
Lastly, as lhf pointed out in the comments, bitwise operators were added to Lua in version 5.3, so on 5.3 and later the commented line can be written directly as len = (first_byte << 8) | second_byte. And if lhf is the lhf, he should know ;-)

How to send int between 32bit and 64bit processors iOS

Pretty much the title: I send an int in a struct using GameKit, and on the receiving end the other device gets it.
Between 64-bit CPUs (iPhone 5S and newer) the number is received fine, but when an iPhone 5 (32-bit CPU) gets it, the int is received as 0. What's the correct way to do this?
I've tried sending it as an NSInteger and the results are the same.
I should add that I have the same issue with uint32_t:
When the devices connect, each device trades random numbers. These numbers determine which player starts, and I'm using uint32_t for this; however, 32-bit CPUs still receive 0. For example:
I declare
uint32_t _ourRandomNumber;
Then, _ourRandomNumber = arc4random();
And then the numbers are sent in a struct like this:
typedef struct {
    Message message;
    uint32_t randomNumber;
} MessageRandomNumber;
Using a method like this:
- (void)sendRandomNumber {
    MessageRandomNumber message;
    message.message.messageType = kMessageTypeRandomNumber;
    message.randomNumber = _ourRandomNumber;
    NSData *data = [NSData dataWithBytes:&message length:sizeof(MessageRandomNumber)];
    [self sendData:data];
}
When the 32-bit CPU receives it, the receiving method does this:
Message *message = (Message *)[data bytes];
if (message->messageType == kMessageTypeRandomNumber) {
    MessageRandomNumber *messageRandomNumber = (MessageRandomNumber *)[data bytes];
    NSLog(@"Received random number:%d", messageRandomNumber->randomNumber);
The NSLog shows: Received random number:0
NSInteger is going to be 64-bit on a 64-bit platform and 32-bit on a 32-bit platform. If you don't care about 64-bit precision, you could always use an int32_t (or a u_int32_t if you want unsigned) type to explicitly just use a 32-bit value. It is generally wise to be explicit about data lengths when sending values between devices, which is what these types exist for (there's int8_t, int16_t, int32_t, and int64_t and their unsigned counterparts).
It's also worth mentioning that you need to be concerned about the byte order of the values (assuming larger values than int8_t and u_int8_t) when sending values to arbitrary hardware. If you're only working with iOS devices this isn't going to be an issue, however.
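To illustrate the byte-order point in C# (the language of the main question on this page), here is a minimal sketch of packing a 32-bit value in an explicit big-endian order, so that decoding does not depend on either device's native layout. The value 0xA1B2C3D4 is just a stand-in for the random number:
uint randomNumber = 0xA1B2C3D4;
byte[] payload =
{
    (byte)(randomNumber >> 24), // most significant byte first ("network order")
    (byte)(randomNumber >> 16),
    (byte)(randomNumber >> 8),
    (byte)randomNumber
};
// The receiver reassembles it with the mirror-image shifts, whatever its word size.
uint received = (uint)((payload[0] << 24) | (payload[1] << 16) | (payload[2] << 8) | payload[3]);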

iOS 64 bit support with C framework libxml2 "implicit conversion loses integer precision"

My app uses libxml2, which contains the function "xmlReadMemory(const char *buffer, int size, const char *URL, const char *encoding, int options)".
I do have a "size" to send, but it is an NSUInteger.
This framework is written in C, so it expects me to send an int, which throws a warning in my application since I am now including arm64 as a valid architecture: "Implicit conversion loses integer precision: NSUInteger (aka 'unsigned long') to int". Is there a safe way to resolve this warning?
link to framework API: http://www.xmlsoft.org/html/libxml-parser.html
As long as you're confident that you'll never need to pass a value larger than the largest 32-bit integer (very unlikely), just cast it to int by adding (int) in front of the parameter in the function call. A 32-bit size can describe a buffer of 2 GB (signed) to 4 GB (unsigned), which is more than the memory on an iPhone and would take hours to download over the cell network.

Convert stream bytes to string

I am using Google protocol buffers to send some data from a C++ server to an iOS app. I use this function on the iOS side to convert the stream bytes to a string:
- (NSString *)convertStreamBytesToString:(NSMutableData *)data
{
    int len = [data length];
    char raw[len];
    [data getBytes:raw length:len];
    NSString *protocStruct = [[NSString alloc] initWithBytes:raw length:len encoding:NSUTF8StringEncoding];
    return protocStruct;
}
My problem is that sometimes this doesn't work. I can see that I send and receive all the bytes, but some of them are lost in the conversion. For example, I receive 83 bytes, but when printing the string I get only about 20 characters. Where is the rest? Is there a problem with this conversion method that I don't know about?
NSString is a class for handling Unicode strings. You cannot store arbitrary bytes in it as you would in a C string. (And even then, you probably cannot transmit binary data in place of a character string and expect it to survive the transport.)
You will need to convert your binary data to a string in a way that results in valid text, for example via Base64 encoding.
There are lots of iOS projects you can use to encode/decode Base64; just google it.
Here's an article about it: http://www.cocoawithlove.com/2009/06/base64-encoding-options-on-mac-and.html?m=1
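For comparison, in C# (as used in the main question at the top of this page) the Base64 round trip is built into the framework. A minimal sketch, assuming rawBytes is the binary payload received from the server:
string encoded = Convert.ToBase64String(rawBytes);   // binary -> text that survives any string handling
byte[] decoded = Convert.FromBase64String(encoded);  // back to the original bytes, unchanged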
Allocating a C array on the stack in Objective-C can sometimes cause errors. Try using dynamically allocated memory for the char array instead:
char* raw = (char*)malloc(len*sizeof(char));
And free it afterwards:
free(raw);
