nRF52 Java unsigned byte issue - android-ble

I am using BLE with an nRF52 and sending a byte array command from Android in Java, as below:
byte[] header = new byte[]{
        (byte) 0x5f,
        (byte) 0xf0,
        (byte) 0xf1,
        (byte) 0xf2,
};
When I log these bytes to the console, I see the following:
[111,-16,-15,-14]
The device receives the command but doesn't send an answer.
However, if I send the same command from iOS with Swift, the command data shows as below
[111,240,241,242]
and works as expected.
What may cause this behavior? Could it be related to Java's lack of unsigned byte support?

You are on the right track. While your iOS app handles the received data as unsigned bytes, the Android app interprets them as signed bytes, because Java's byte type is always signed. If you want to work with the values as unsigned, use the corresponding *Unsigned methods, for example Byte.compareUnsigned or Byte.toUnsignedInt.
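As a minimal illustration (my sketch, not part of the original answer), the following shows how Java's default sign-extending widening produces the negative numbers in the Android log, and how Byte.toUnsignedInt recovers the unsigned view; the byte values are taken from the question:

public class UnsignedByteDemo {
    public static void main(String[] args) {
        // The header bytes from the question.
        byte[] header = new byte[]{(byte) 0x5f, (byte) 0xf0, (byte) 0xf1, (byte) 0xf2};

        for (byte b : header) {
            int signed = b;                        // sign-extends: 0xf0 -> -16
            int unsigned = Byte.toUnsignedInt(b);  // masks with 0xFF: 0xf0 -> 240
            System.out.printf("0x%02X signed=%d unsigned=%d%n", unsigned, signed, unsigned);
        }
    }
}

Note that the underlying two's-complement bit patterns are identical in both logs: -16 and 240 are the same byte 0xf0, so the data sent over the air is the same from both platforms.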

Related

Write BLE GATT characteristic on Android — error status 255

Experimenting with a sample Kotlin program, trying to read/write a BLE device with channels from 0-7.
When reading, it gives me a value like this (0x06):
onCharacteristicRead(), status=0, value=[uuid='ba7e7814-5b67-43d3-bd80-e72cc83ae801', hexValue=[06]]
but when I try to write back the same output it gave me, I get GATT error 255, out of range:
CharacteristicWriteOperation{MAC='00:A0:50:E8:78:86', characteristic=[uuid='ba7e7814-5b67-43d3-bd80-e72cc83ae801', hexValue=[30, 36]]}
onCharacteristicWrite(), status=255, value=[uuid='ba7e7814-5b67-43d3-bd80-e72cc83ae801']
What you read: [0x06]
onCharacteristicRead(), status=0, value=[uuid='ba7e7814-5b67-43d3-bd80-e72cc83ae801', hexValue=[06]]
What you wrote: [0x30, 0x36] (which corresponds to the String "06", as the ASCII value of '0' is 0x30 and of '6' is 0x36)
CharacteristicWriteOperation{MAC='00:A0:50:E8:78:86', characteristic=[uuid='ba7e7814-5b67-43d3-bd80-e72cc83ae801', hexValue=[30, 36]]}
You probably want to write back the hexadecimal value 0x06, not the string "06".
Status 255 is GATT_OUT_OF_RANGE, which means the written value is outside the range accepted by your peripheral.
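As a sketch of the fix (the question's log output suggests a BLE wrapper library, so the plain Android GATT calls below are illustrative rather than a drop-in patch):

import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;

public class ChannelEcho {
    // Write back the raw byte that was read, not its ASCII rendering:
    // writing the string "06" sends [0x30, 0x36], which the peripheral
    // rejects with GATT_OUT_OF_RANGE (255); a raw [0x06] stays in range.
    static void echoChannel(BluetoothGatt gatt, BluetoothGattCharacteristic ch) {
        byte[] read = ch.getValue();      // e.g. [0x06] after onCharacteristicRead()
        ch.setValue(new byte[]{read[0]}); // raw single byte, not "06".getBytes()
        gatt.writeCharacteristic(ch);     // result arrives in onCharacteristicWrite()
    }
}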

How to send an int between 32-bit and 64-bit processors on iOS

Pretty much the title: I send an int in a struct using GameKit, and the receiving device gets it.
Between 64-bit CPUs (iPhone 5S and newer) the number is received fine, but when an iPhone 5 (32-bit CPU) receives it, the int arrives as 0. What's the correct way to do this?
I've tried sending it as NSInteger and the results are the same.
I should add that I have the same issue with uint32_t:
When devices connect, each device trades random numbers. These numbers determine which player starts, and I'm using uint32_t for this; however, 32-bit CPUs still receive 0. For example:
I declare
uint32_t _ourRandomNumber;
Then, _ourRandomNumber = arc4random();
And then the numbers are sent, in a struct like this.
typedef struct {
    Message message;
    uint32_t randomNumber;
} MessageRandomNumber;
Using a method like this:
- (void)sendRandomNumber {
    MessageRandomNumber message;
    message.message.messageType = kMessageTypeRandomNumber;
    message.randomNumber = _ourRandomNumber;
    NSData *data = [NSData dataWithBytes:&message length:sizeof(MessageRandomNumber)];
    [self sendData:data];
}
When the 32-bit CPU receives it, in the receiving method:
Message *message = (Message *)[data bytes];
if (message->messageType == kMessageTypeRandomNumber) {
    MessageRandomNumber *messageRandomNumber = (MessageRandomNumber *)[data bytes];
    NSLog(@"Received random number: %u", messageRandomNumber->randomNumber);
}
The NSLog shows: Received random number: 0
NSInteger is going to be 64-bit on a 64-bit platform and 32-bit on a 32-bit platform. If you don't need 64-bit range, you can use an int32_t (or a uint32_t if you want it unsigned) to explicitly use a 32-bit value. It is generally wise to be explicit about data lengths when sending values between devices, which is what these types exist for (there are int8_t, int16_t, int32_t, and int64_t, plus their unsigned counterparts).
It's also worth mentioning that you need to be concerned about the byte order of the values (for anything larger than int8_t and uint8_t) when sending them to arbitrary hardware. If you're only working with iOS devices, this isn't going to be an issue, however.
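Two hedged observations that may help. First, if the embedded Message struct itself contains a platform-dependent type (an NSInteger, say), then sizeof(MessageRandomNumber) and the offset of randomNumber can differ between the 32-bit and 64-bit builds, which would produce exactly this "received as 0" symptom. Second, the safest discipline is to serialize with an explicit width and an explicit byte order on both ends instead of copying a struct. A sketch of that discipline, expressed in Java with java.nio.ByteBuffer (names are mine, not from the thread):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WireFormatDemo {
    // Encode a 32-bit value with a fixed width and byte order so that
    // a 32-bit and a 64-bit device decode it identically.
    static byte[] encode(int randomNumber) {
        return ByteBuffer.allocate(4)
                .order(ByteOrder.LITTLE_ENDIAN) // pick one order and use it on both ends
                .putInt(randomNumber)
                .array();
    }

    static int decode(byte[] wire) {
        return ByteBuffer.wrap(wire).order(ByteOrder.LITTLE_ENDIAN).getInt();
    }

    public static void main(String[] args) {
        byte[] wire = encode(0x12345678);
        System.out.printf("decoded: 0x%08X%n", decode(wire)); // 0x12345678 on any platform
    }
}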

Xamarin - Copy data from the IntPtr of the NSData.Bytes property into a UInt16 var

I am creating a Bluetooth app on the Xamarin.iOS platform. I am having issues collating data that comes from the peripheral my app is connected to. I have the Objective-C code below, but I am having a hard time converting it to C#, as I am not sure how to split the array of bytes in C#. Any advice on how to correctly receive the data from the device? I thought of using the Marshal class, but I am not sure it does the same thing at the iOS operating-system level.
This is the Objective-C code, which works fine on iOS:
UInt16 cValue;
[characteristic.value getBytes:&cValue length:2];
As you can see, it calls the getBytes method of the NSData class from Apple's API, which does the trick, but I could not find anything similar in the NSData class of the Xamarin.iOS API.
And this is what I am thinking of doing in C#:
byte[] destination = new byte[16];
Marshal.Copy(characteristic.Value.Bytes, destination, 2, /* not sure about the length */);
Here's the Marshal method that I used:
http://msdn.microsoft.com/en-us/library/vstudio/ms146631
One last thing: I am not sure about the byte[16]; I just assumed that because UInt16 in Objective-C is an 8-bit unsigned integer.
First of all, UInt16 is a 16-bit unsigned integer; byte[16] is a 16-element array of 8-bit unsigned integers.
You will want a byte[2] to store the UInt16.
byte[] bytes = new byte[characteristic.Value.Length];
Marshal.Copy(characteristic.Value.Bytes, bytes, 0, Convert.ToInt32(characteristic.Value.Length));
To finally convert to a ushort, which is the C# equivalent of an Objective-C UInt16, you can use
ushort value = (ushort)(bytes[0] | (bytes[1] << 8));
Update
For a little-endian uint (Objective-C UInt32), the conversion would look like this:
uint value = (uint)(bytes[0] | (bytes[1] << 8) | (bytes[2] << 16) | (bytes[3] << 24));
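For what it's worth, and tying back to the first question on this page: in Java the same reconstruction needs explicit & 0xFF masks, because Java bytes are signed and would otherwise sign-extend while widening. A minimal sketch (helper names are mine):

public class LittleEndianDemo {
    // Reassemble unsigned 16- and 32-bit little-endian values from raw bytes.
    // The & 0xFF masks matter in Java: without them, a byte like 0xF0 would
    // sign-extend to 0xFFFFFFF0 before the shift and corrupt the result.
    static int readUInt16LE(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8);
    }

    static long readUInt32LE(byte[] b, int off) {
        return (b[off] & 0xFFL)
                | ((b[off + 1] & 0xFFL) << 8)
                | ((b[off + 2] & 0xFFL) << 16)
                | ((b[off + 3] & 0xFFL) << 24);
    }

    public static void main(String[] args) {
        byte[] raw = {(byte) 0xF0, (byte) 0x5F};
        System.out.println(readUInt16LE(raw, 0)); // 24560, i.e. 0x5FF0
    }
}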

Decrypting using CCCrypt returns different results on iOS and MacOS

Decrypting using CCCrypt returns different results on iOS (5, 6) and Mac OS X 10.7.
The same code running on different platforms/architectures produces different outputs for the same input. Why, and how can I fix it?
I have debugged everything. All the variables hold the same values whether running on the Mac or on the iPhone. The point where they diverge is:
ccStatus = CCCrypt(kCCDecrypt,          // operation: decrypt
                   kCCAlgorithmAES128,  // algorithm: AES-128
                   0,                   // options: no padding
                   rawAESKey,           // symmetric key
                   kCCKeySizeAES128,    // key size
                   iv,                  // initialization vector
                   dataIn, dataInBytesSize,    // input
                   dataOut, dataOutBytesSize,  // output
                   &clearTextSize);
At this point, dataOut has different values depending on whether the code is running on the Mac or on the iPhone. ccStatus reports success in both cases.
Note:
Xcode Version 4.6.2 (4H1003)
iOS SDK 5, 6 - binary produced as 32-bit
Mac OS X SDK 10.7 - binary produced as 64-bit

iOS SecItemCopyMatching RSA public key format?

I'm trying to extract a 1024-bit RSA public key from an already generated key pair (two SecKeyRefs), in order to send it over the wire. All I need is a plain (modulus, exponent) pair, which should take up exactly 131 bytes (128 for the modulus and 3 for the exponent).
However, when I fetch the key info as an NSData object, I get 140 bytes instead of 131. Here's an example result:
<30818902 818100d7 514f320d eacf48e1 eb64d8f9 4d212f77 10dd3b48 ba38c5a6
ed6ba693 35bb97f5 a53163eb b403727b 91c34fc8 cba51239 3ab04f97 dab37736
0377cdc3 417f68eb 9e351239 47c1f98f f4274e05 0d5ce1e9 e2071d1b 69a7cac4
4e258765 6c249077 dba22ae6 fc55f0cf 834f260a 14ac2e9f 070d17aa 1edd8db1
0cd7fd4c c2f0d302 03010001>
After retrying the key generation a couple of times and comparing the resulting NSData objects, the bytes that remain the same across all keys are the first seven:
<30818902 818100>
The last three bytes look like the exponent (65537, a common value). There are also two bytes between the "modulus" and the exponent:
<0203>
Can someone with more crypto experience help me identify what encoding this is? DER? How do I properly decode the modulus and exponent?
I tried manually stripping out the modulus and exponent using
NSData* modulus = [keyBits subdataWithRange:(NSRange){ 7, 128 }];
NSData* exponent = [keyBits subdataWithRange:(NSRange){ 7 + 128 + 2, 3 }];
but I get errors when trying to decrypt data that the remote host encrypted using that "key".
EDIT:
Here's a gist of the solution I ended up using to unpack the RSA blob: https://gist.github.com/vl4dimir/6079882
Assuming you want the solution to work under iOS, please have a look at this thread. The post confirms that the encoding is DER and shows how to extract the exponent and modulus from the NSData object you started with.
There is another solution in this thread that won't work on iOS but will work on desktop systems (including Mac OS X) that have OpenSSL installed. Even if you are looking for an iOS-only solution, you can still use it to verify that your code is working correctly.
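For intuition, the blob is the DER encoding of a PKCS#1 RSAPublicKey: 30 81 89 opens a 137-byte SEQUENCE, 02 81 81 opens a 129-byte INTEGER (the modulus, with a leading 0x00 to keep it positive), and 02 03 01 00 01 is the 3-byte INTEGER 65537. Here's a minimal sketch (mine, not the linked gist's approach) that walks that structure by hand over the hex dump from the question; a real parser should handle all DER length forms:

import java.math.BigInteger;
import java.util.Arrays;

public class RsaDerWalk {
    // The 140-byte blob from the question, concatenated.
    static final String HEX =
            "30818902818100d7514f320deacf48e1eb64d8f94d212f7710dd3b48ba38c5a6"
          + "ed6ba69335bb97f5a53163ebb403727b91c34fc8cba512393ab04f97dab37736"
          + "0377cdc3417f68eb9e35123947c1f98ff4274e050d5ce1e9e2071d1b69a7cac4"
          + "4e2587656c249077dba22ae6fc55f0cf834f260a14ac2e9f070d17aa1edd8db1"
          + "0cd7fd4cc2f0d30203010001";

    // Reads a DER length at off; returns {length, offsetAfterLength}.
    static int[] derLength(byte[] d, int off) {
        int first = d[off] & 0xFF;
        if (first < 0x80) return new int[]{first, off + 1}; // short form
        int n = first & 0x7F;                               // long form, e.g. 0x81 -> 1 length byte
        int len = 0;
        for (int i = 0; i < n; i++) len = (len << 8) | (d[off + 1 + i] & 0xFF);
        return new int[]{len, off + 1 + n};
    }

    public static void main(String[] args) {
        byte[] d = new byte[HEX.length() / 2];
        for (int i = 0; i < d.length; i++)
            d[i] = (byte) Integer.parseInt(HEX.substring(2 * i, 2 * i + 2), 16);

        int off = 0;
        if (d[off++] != 0x30) throw new IllegalStateException("expected SEQUENCE");
        off = derLength(d, off)[1];                         // skip sequence length (0x89 = 137)

        if (d[off++] != 0x02) throw new IllegalStateException("expected INTEGER (modulus)");
        int[] m = derLength(d, off);                        // 0x81 0x81 -> 129 bytes incl. leading 0x00
        BigInteger modulus = new BigInteger(1, Arrays.copyOfRange(d, m[1], m[1] + m[0]));
        off = m[1] + m[0];

        if (d[off++] != 0x02) throw new IllegalStateException("expected INTEGER (exponent)");
        int[] e = derLength(d, off);                        // 0x03 -> 3 bytes
        BigInteger exponent = new BigInteger(1, Arrays.copyOfRange(d, e[1], e[1] + e[0]));

        System.out.println("modulus bits: " + modulus.bitLength()); // 1024
        System.out.println("exponent: " + exponent);                // 65537
    }
}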
