I need to represent an NSInteger and an NSString as arrays of bytes. Below are samples of what I am looking for.
For now, these hardcoded values work fine; I want to produce them through code. Any clue?
First, an NSInteger as hex bytes:
NSInteger test = 1;
unsigned char byte[] = { 0x00, 0x01 };
NSInteger test = 16;
unsigned char byte[] = { 0x00, 0x10 };
NSData *data = [NSData dataWithBytes:byte length:sizeof(byte)];
Second, an NSString as hex bytes:
NSString *test = #"31C5B562-BD07-4616-BCBD-130BA6822790";
unsigned char byte[] = {0x31, 0xC5, 0xB5, 0x62, 0xBD, 0x07, 0x46, 0x16, 0xBC, 0xBD, 0x13, 0x0B, 0xA6, 0x82, 0x27, 0x90};
NSData *data = [NSData dataWithBytes:byte length:sizeof(byte)];
I tried the code below and it works well for my UUID, but for the NSInteger to work I need to pass "0010" instead of 16 and "0001" instead of 1. So any clue on how to do this conversion?
- (NSData *)hexData {
    // NSString category method: parses pairs of hex digits into raw bytes.
    NSMutableData *hexData = [NSMutableData data];
    int idx = 0;
    for (idx = 0; idx + 2 <= self.length; idx += 2) {
        NSRange range = NSMakeRange(idx, 2);
        NSString *hexStr = [self substringWithRange:range];
        NSScanner *scanner = [NSScanner scannerWithString:hexStr];
        unsigned int intValue;
        [scanner scanHexInt:&intValue];
        // Appends the first byte of intValue, which holds the scanned
        // value on little-endian hosts.
        [hexData appendBytes:&intValue length:1];
    }
    return hexData;
}
EDIT:
int8_t test = -59;
int8_t bytes = CFSwapInt16HostToBig(test);
NSData *data1 = [NSData dataWithBytes:&bytes length:sizeof(bytes)];
This comes out as 0xFF instead of 0xC4.
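For what it's worth, that result follows from the types: CFSwapInt16HostToBig sign-extends -59 to the 16-bit value 0xFFC5, swaps it to 0xC5FF, and assigning that back to an int8_t keeps only the low byte, 0xFF. A single byte has no byte order, so a sketch that skips the swap entirely (note that -59 is 0xC5 in two's complement; 0xC4 would be -60):
int8_t test = -59;
// One byte needs no endianness swap; copy it directly.
NSData *data1 = [NSData dataWithBytes:&test length:sizeof(test)];
// data1 now contains the single byte 0xC5.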
Since your string is a UUID string you can do something like this:
#include <uuid/uuid.h> // declares uuid_t and uuid_parse
NSString *test = @"";
uuid_t uuid;
uuid_parse([test UTF8String], uuid);
NSData *data = [NSData dataWithBytes:uuid length:16];
For the number you can do:
NSInteger test = 1;
NSData *data = [NSData dataWithBytes:&test length:sizeof(test)];
Keep in mind that NSInteger is probably more than two bytes and you may also need to worry about byte order.
Update: Since it seems you need the integer value to be two bytes, you should do:
uint16_t test = 1;
NSData *data = [NSData dataWithBytes:&test length:sizeof(test)];
This will ensure 2 bytes. You also need to worry about byte ordering, so you really need:
uint16_t test = 1;
uint16_t bytes = CFSwapInt16HostToBig(test);
NSData *data = [NSData dataWithBytes:&bytes length:sizeof(bytes)];
Change CFSwapInt16HostToBig to CFSwapInt16HostToLittle if appropriate.
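Putting the pieces together for the samples in the question, a sketch (big-endian to match the { 0x00, 0x10 } layout shown there):
NSInteger test = 16;
uint16_t value = CFSwapInt16HostToBig((uint16_t)test);
NSData *data = [NSData dataWithBytes:&value length:sizeof(value)];
// data is now { 0x00, 0x10 }, matching the hardcoded sample.
Alternatively, to keep feeding the hexData category from the question, format the integer as four hex digits first:
NSString *hexString = [NSString stringWithFormat:@"%04lX", (long)test]; // @"0010"
NSData *data2 = [hexString hexData];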
Related
I convert an NSString to a byte array, then Base64-encode the resulting NSData, but the output is wrong. With @"010203040506" it is right, but with higher digits (for example @"333435363738") it is wrong. This is my code. Please help me.
Android produces ISIjJCUm while iOS produces MzQ1Njc4.
NSString *command = #"333435363738";
NSMutableData *commandToSend= [[NSMutableData alloc] init];
unsigned long whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i=0; i < [command length] /2; i++) {
NSLog(#"%d",[command characterAtIndex:i*2]);
NSLog(#"%d",[command characterAtIndex:i*2 + 1]);
byte_chars[0] = [command characterAtIndex:i*2];
byte_chars[1] = [command characterAtIndex:i*2 + 1];
whole_byte = strtol(byte_chars, NULL, 16);
[commandToSend appendBytes:&whole_byte length:1];
}
NSString *base64String;
if ([commandToSend respondsToSelector:@selector(base64EncodedStringWithOptions:)]) {
    base64String = [commandToSend base64EncodedStringWithOptions:kNilOptions]; // iOS 7+
} else {
    base64String = [commandToSend base64Encoding]; // pre-iOS 7
}
Your code produces the string MzQ1Njc4, which is the base64 encoding of the bytes 0x33, 0x34, 0x35, 0x36, 0x37, 0x38. This appears to be what the code is meant to do.
The string ISIjJCUm is the base64 encoding of 0x21, 0x22, 0x23, 0x24, 0x25, 0x26.
Note that 0x21 is 33 decimal. So it looks like you were either meant to interpret the string as decimal on iOS or as hex on Android.
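If the Android output is the intended one, a minimal fix (a sketch, assuming each two-character pair is meant as a decimal value) is to parse with base 10 instead of base 16 in the loop above:
// "33" parsed as decimal gives 33, i.e. 0x21.
whole_byte = strtol(byte_chars, NULL, 10);
[commandToSend appendBytes:&whole_byte length:1];
// @"333435363738" then encodes to ISIjJCUm, matching Android.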
I have a string variable in iOS and I would like to convert it to a character array and then to hex bytes like 0xD6, 0xD6, etc.
It would be great if there is a library in Objective-C that I can use for this.
Swift 4
String to bytes:
let strChar = "A"
let data1 = [UInt8](strChar.utf8)
Maybe the answer is here:
string to chars:
NSString *s = #"Some string";
const char *c = [s UTF8String];
chars to hex:
- (NSData *)dataFromHexString {
    // NSString category: converts a string of hex digits (e.g. @"D6D6") to raw bytes.
    const char *chars = [self UTF8String];
    NSUInteger i = 0, len = self.length;
    NSMutableData *data = [NSMutableData dataWithCapacity:len / 2];
    char byteChars[3] = {'\0','\0','\0'};
    unsigned long wholeByte;
    while (i < len) {
        byteChars[0] = chars[i++];
        byteChars[1] = chars[i++];
        wholeByte = strtoul(byteChars, NULL, 16);
        [data appendBytes:&wholeByte length:1];
    }
    return data;
}
Reference: NSString (hex) to bytes
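A usage sketch, assuming dataFromHexString is declared in an NSString category:
NSString *input = @"D6D6";
NSData *bytes = [input dataFromHexString];
// bytes now contains { 0xD6, 0xD6 }.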
I am using the following code to write the value 0xDE to a Bluetooth characteristic (Reset Device) using iOS Core Bluetooth:
...
NSData *bytes = [#"0xDE" dataUsingEncoding:NSUTF8StringEncoding];
[peripheral writeValue:bytes
forCharacteristic:characteristic
type:CBCharacteristicWriteWithResponse];
...
Is there any mistake in my code? The value is not written properly.
Swift 3.0: In case anyone is wondering, the format for Swift is slightly different, as writeValue can get the count from the array.
let value: UInt8 = 0xDE
let data = Data(bytes: [value])
peripheral.writeValue(data, for: characteristic, type: .withResponse)
Try creating your data with an array of single-byte values.
const uint8_t bytes[] = {0xDE};
NSData *data = [NSData dataWithBytes:bytes length:sizeof(bytes)];
This is a useful approach for creating arbitrary constant data. For more bytes,
const uint8_t bytes[] = {0x01,0x02,0x03,0x04,0x05};
NSData *data = [NSData dataWithBytes:bytes length:sizeof(bytes)];
If you want to create data to send using variables, I would recommend using NSMutableData and appending the bytes that you need. It isn't very pretty, but it is easy to read / understand, especially when you are matching a packed struct on the embedded side. Example below is from a BLE project where we were making a simple communication protocol.
NSMutableData *data = [[NSMutableData alloc] init];
//pull out each of the fields in order to correctly
//serialize into a correctly ordered byte stream
const uint8_t start = PKT_START_BYTE;
const uint8_t bitfield = (uint8_t)self.bitfield;
const uint8_t frame = (uint8_t)self.frameNumber;
const uint8_t size = (uint8_t)self.size;
//append the individual bytes to the data chunk
[data appendBytes:&start length:1];
[data appendBytes:&bitfield length:1];
[data appendBytes:&frame length:1];
[data appendBytes:&size length:1];
The answer by bensarz is almost correct, except for one thing: you shouldn't use sizeof(int) as the length for the NSData. An int is 4 bytes on iOS, not 1. Since you want to send 1 byte, use uint8_t or Byte instead:
uint8_t byteToWrite = 0xDE;
NSData *data = [[NSData alloc] initWithBytes:&byteToWrite length:sizeof(byteToWrite)];
[peripheral writeValue:data
     forCharacteristic:characteristic
                  type:CBCharacteristicWriteWithResponse];
Of course you could also use int as the variable's type, but then you would have to initialize the NSData with a length of 1.
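That variant would look like this (a sketch; note it relies on a little-endian host, where the first byte of the int holds the low-order 0xDE):
int byteToWrite = 0xDE;
// Only the first byte is copied; on little-endian hosts that byte is 0xDE.
NSData *data = [[NSData alloc] initWithBytes:&byteToWrite length:1];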
This code will fix the problem:
NSData *data = [self dataWithHexString:@"DE"];
[peripheral writeValue:data
     forCharacteristic:characteristic
                  type:CBCharacteristicWriteWithResponse];
dataWithHexString implementation:
- (NSData *)dataWithHexString:(NSString *)hexstring
{
    NSMutableData *data = [NSMutableData data];
    int idx;
    for (idx = 0; idx + 2 <= hexstring.length; idx += 2) {
        NSRange range = NSMakeRange(idx, 2);
        NSString *hexStr = [hexstring substringWithRange:range];
        NSScanner *scanner = [NSScanner scannerWithString:hexStr];
        unsigned int intValue;
        [scanner scanHexInt:&intValue];
        [data appendBytes:&intValue length:1];
    }
    return data;
}
What you are, in fact, doing here is writing the string "0xDE" to the characteristic. If you want to write the raw numeric value, you need to stay away from strings.
int integer = 0xDE;
// Note: this writes sizeof(int) bytes (4 on iOS), not just one; see the
// uint8_t variant above if the characteristic expects a single byte.
NSData *data = [[NSData alloc] initWithBytes:&integer length:sizeof(integer)];
[peripheral writeValue:data
     forCharacteristic:characteristic
                  type:CBCharacteristicWriteWithResponse];
I have the following code in Objective-C:
NSString* v_plainText = #"1234567890123456789012345678";
NSString* plainText = (const void *) [v_plainText UTF8String];
size_t plainTextBufferSize = [v_plainText length];
size_t bufferPtrSize = (plainTextBufferSize + kCCBlockSize3DES) & ~(kCCBlockSize3DES - 1);
size_t movedBytes = 0;
uint8_t *bufferPtr = malloc( bufferPtrSize * sizeof(uint8_t));
Byte iv[8] = {0,0,0,0,0,0,0,0};
NSString *key = (const void *) [#"12345678ABCDEFGH!##$%^&*" UTF8String];
CCCryptorStatus ccStatus;
ccStatus = CCCrypt(kCCEncrypt,
kCCAlgorithm3DES,
kCCOptionPKCS7Padding & kCCModeCBC,
key,
kCCKeySize3DES,
iv,
plainText,
plainTextBufferSize,
(void *)bufferPtr, // output
bufferPtrSize,
&movedBytes);
NSData* result = [NSData dataWithBytes:(const void*)bufferPtr length:(NSUInteger)movedBytes];
NSString* str = [result base64EncodedStringWithOptions:0];
This gives this result:
geHFnvoept2aKiruo6InSvc7WVPdHNq2
When I run similar code in .NET it gives me this result:
geHFnvoept2aKiruo6InSvc7WVPdHNq2TENQX5q9Beg=
For some reason the Objective-C version only returns 24 bytes while the input is 28 bytes. I would expect it to be 32 bytes, as in the .NET version. I was unable to determine what I'm doing wrong here.
The encryption error is:
kCCOptionPKCS7Padding & kCCModeCBC,
You need |, not & but kCCModeCBC is not a valid option and no option is needed because CBC is the default mode, you simply need:
kCCOptionPKCS7Padding,
Unfortunately there are other Objective-C coding errors, mainly:
NSString* plainText = (const void *) [v_plainText UTF8String];
NSString *key = (const void *) [@"12345678ABCDEFGH!@#$%^&*" UTF8String];
Instead use:
NSData* plainText = [v_plainText dataUsingEncoding:NSUTF8StringEncoding];
NSData *key = [@"12345678ABCDEFGH!@#$%^&*" dataUsingEncoding:NSUTF8StringEncoding];
Here is the complete code converted to use NSData:
NSString* plainText = #"1234567890123456789012345678";
NSString* keyText = #"12345678ABCDEFGH!##$%^&*";
NSData* plainData = [plainText dataUsingEncoding:NSUTF8StringEncoding];
NSData *keyData = [keyText dataUsingEncoding:NSUTF8StringEncoding];
Byte iv[8] = {0,0,0,0,0,0,0,0};
size_t bufferSize = plainData.length + kCCBlockSize3DES;
NSMutableData *cypherData = [NSMutableData dataWithLength:bufferSize];
size_t movedBytes = 0;
CCCryptorStatus ccStatus;
ccStatus = CCCrypt(kCCEncrypt,
kCCAlgorithm3DES,
kCCOptionPKCS7Padding,
keyData.bytes,
kCCKeySize3DES,
iv,
plainData.bytes,
plainData.length,
cypherData.mutableBytes,
cypherData.length,
&movedBytes);
cypherData.length = movedBytes;
NSString* str = [cypherData base64EncodedStringWithOptions:0];
NSLog(#"str: %#", str);
Output:
str: geHFnvoept2aKiruo6InSvc7WVPdHNq2TENQX5q9Beg=
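For completeness, decryption is the mirror image; a sketch reusing keyData, iv, and cypherData from the code above:
NSMutableData *plainOut = [NSMutableData dataWithLength:cypherData.length + kCCBlockSize3DES];
size_t decryptedBytes = 0;
CCCryptorStatus status = CCCrypt(kCCDecrypt,
                                 kCCAlgorithm3DES,
                                 kCCOptionPKCS7Padding,
                                 keyData.bytes,
                                 kCCKeySize3DES,
                                 iv,
                                 cypherData.bytes,
                                 cypherData.length,
                                 plainOut.mutableBytes,
                                 plainOut.length,
                                 &decryptedBytes);
plainOut.length = decryptedBytes;
// plainOut now holds the original UTF-8 bytes of plainText.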
I have an NSData object that contains just <64> which is supposed to represent the int 100
How can I convert this NSData to an int?
I can convert it to its character equivalent, d, using
NSString *string = [[NSString alloc] initWithData:characteristic.value encoding:NSUTF8StringEncoding];
but I need the decimal equivalent, 100.
Thanks
<64> means that the NSData object contains a single byte with the value 0x64 = 100,
so the following should work;
const uint8_t *bytes = [data bytes]; // pointer to the bytes in data
int value = bytes[0]; // first byte
int *b = (int *)data.bytes;
printf("%d", *b); // prints 100
// Caution: this dereference reads sizeof(int) bytes, so it is only safe
// when the data actually contains at least that many.
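A safer sketch that only reads the bytes actually present (for multi-byte payloads this assumes the data is little-endian, which varies by peripheral):
int value = 0;
// Copy at most sizeof(value) bytes; any remaining high bytes stay zero.
[data getBytes:&value length:MIN(data.length, sizeof(value))];
// For the single-byte payload <64>, value is now 100.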
The logic below converts NSData to an integer by parsing the hex digits out of its -description. Note that -description's format is not a guaranteed part of the API (it changed in recent OS releases), and scanHexInt: caps out at the four bytes an unsigned int can hold.
NSData *data;
NSString *stringData = [data description];
// Strip the surrounding "<" and ">" from the hex dump.
stringData = [stringData substringWithRange:NSMakeRange(1, [stringData length] - 2)];
unsigned dataAsInt = 0;
NSScanner *scanner = [NSScanner scannerWithString:stringData];
[scanner scanHexInt:&dataAsInt];