How to properly encrypt and decrypt an NSString with AES-128 - iOS

I am using http://aes.online-domain-tools.com to encrypt my NSString, and what I get back is an array of unsigned char, like this: c2 84 6b 71 72 6d d2 e7 cd 0b a6 08 cd 85 c3 0c.
Then I use this to convert it into an NSString in my code:
const unsigned char encryptedAppIDBytes[] = {0xe5, 0x35, 0xdf, 0x72, 0x57, 0xaf, 0xf7, 0xe6, 0x1f, 0x6d, 0x51, 0x1d, 0x26, 0xe8, 0x5e, 0xa2};
NSData *appIDToDecrypt = [NSData dataWithBytes:encryptedAppIDBytes length:sizeof(encryptedAppIDBytes)];
NSString *decryptedAppID = [[NSString alloc] initWithData:[appIDToDecrypt AES128DecryptedDataWithKey:@"something"] encoding:NSUTF8StringEncoding];
if ([decryptedAppID isEqualToString:@"Something"]) {} // This fails, even though the strings look identical in the debugger.
When I decrypt it, it shows up as the same string, but comparing it against the same hardcoded NSString fails. This breaks an authentication check in my app.
Please point out anything I am doing wrong here.
Thanks,

Alright, so after spending a few hours on this I finally found a solution, which might not be optimal but works in my case.
It seems that after decryption the string contains extra characters which are not visible in the debugger, but checking its length reports more characters than it appears to contain, which indicates something is wrong. For now what I have done is this:
const unsigned char nameBytes[] = {0xa6, 0xf0, 0xea, 0x36, 0x5f, 0x78, 0xb7, 0x52, 0x29, 0x6a, 0x67, 0xb7, 0xeb, 0x73, 0xd5, 0x14};
NSData *nameBytesData = [NSData dataWithBytes:nameBytes length:sizeof(nameBytes)];
NSString *nameBytesString = [[NSString alloc] initWithData:[nameBytesData AES128DecryptedDataWithKey:@"PaymentGateway"] encoding:NSUTF8StringEncoding];
NSCharacterSet *set = [[NSCharacterSet characterSetWithCharactersInString:@"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"] invertedSet];
NSString *safeSearchString = [[nameBytesString componentsSeparatedByCharactersInSet:set] componentsJoinedByString:@""];
NSLog(@"length:%lu", (unsigned long)[safeSearchString length]);
NSLog(@"lengthActual:%lu", (unsigned long)[@"ashutosh" length]);
if ([safeSearchString isEqualToString:@"ashutosh"]) {
    NSLog(@"Success");
}
NSLog(@"Decrypted:%@", nameBytesString);
The code above removes all the special characters and replaces them with @"", so the resulting string contains only valid characters. To treat more characters as valid, just add them to the NSCharacterSet.

Related

How to reinterpret data given by AVAudioRecorder as const char *?

PROBLEM:
The problem I am trying to solve is the following. I have audio data recorded by AVAudioRecorder. I can get the NSData by:
NSData *data = [NSData dataWithContentsOfURL: self.audioRecorder.url];
But then I need to convert/reinterpret this NSData into a const char* form which would essentially look like
00 01 00 ff
i.e. bytes in hex, or at least an equivalent string. They don't have to actually be in hex, but they need to be convertible to hex.
QUESTION:
My question is that the NSData contains "\0" bytes. So if I do something like this:
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
it would not work, as the data will be cut off when it hits the first "\0". I am very new to audio files, but I think this is because of the 0x00 values in the header. So basically, I don't want them interpreted as "\0" but as "00". Is there a way to do this?
Not sure I understand the question or what you are trying to do. Your memcpy will copy all the bytes into the byteData buffer; it is only when you use byteData as a C string (char*) and pass it into a format function (NSLog(@"%s", val)) that it gets cut off. If you want a string representation of the data as hex:
NSString* bytesToHex(Byte* bytes, NSUInteger count) {
    NSMutableString *hex = [NSMutableString string];
    for (int i = 0; i < count; i++) [hex appendFormat:@"%.2x ", *(bytes + i)];
    return hex;
}

NSString* dataToHex(NSData* data) {
    return bytesToHex((Byte*)data.bytes, data.length);
}
will do it, ie:
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSLog(@"%@", dataToHex(data));
will print:
74 00 68 00 69 00 73 00 20 69 00 73 00 20 61 00 20 74 00 65 00 73 74 00
or
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
NSLog(@"%@", bytesToHex(byteData, len));
will print:
74 00 68 00 69 00 73 00 20 69 00 73 00 20 61 00 20 74 00 65 00 73 74 00
Just remembered something
Even easier, if you use the NSData description property, it gives you the data in hex already!
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSLog(@"%@", data.description);
Will print
<74006800 69007300 20690073 00206100 20740065 00737400>
Not as pretty, but the same thing...

const char array in Objective-C has different value depending upon device

In an app we are working on, there is a literal string we would like to keep secret so we have referenced it in the source as a const char array:
const char secret[] = { 0x63, 0x35, 0x4d, 0x58, 0x52, 0x32, 0x2c, 0x52, 0x53, 0x12, 0x3c, 0x74, 0x51, 0x53, 0x69, 0x8a, 0x64, 0x12, 0x7f, 0x6e, 0x25, 0x64, 0x4e, 0x32, 0x23, 0x53, 0x12, 0x7b, 0x4c, 0x87, 0x64, 0x23, 0x41, 0x23, 0x56, 0x34, 0x6c, 0x23, 0x75, 0x5e, 0x56, 0x23, 0x65, 0x5b, 0x23, 0x75, 0x12, 0x65, 0x23, 0x76, 0x3a, 0x2f, 0x53, 0x32, 0x23, 0x54, 0x54, 0x21, 0x64, 0x32, 0x53, 0x13, 0x24, 0x32 };
(I've changed this so it doesn't match our secret :) )
We use -[NSData dataWithBytes:length:] to convert secret to NSData, then base 64 decode it and -[NSString initWithData:encoding:] it.
The problem is, on iPhone 5 & 4s converting the decoded data to a string fails.
Upon inspecting the contents of secret in the debugger, there are more characters than there should be.
Finally, copying the exact same literal to another const char and printing both in succession produces different results.
What is going on?
It seems that the devices in question are reading past the end of the array until they meet some kind of termination character. To stop this from happening, you need the final character in the array to be a newline.
$ echo -en "\n" | xxd -pu
shows us that \n in hex is 0a, so adding 0x0a as the final element of the secret array literal will stop the OS from reading random memory. You may also want to make sure the final NSString doesn't contain the newline character :)
Update
The above fixed all problems on Debug builds, but on a Release build the same issue occurred, so we replaced 0x0a with 0x00 and both builds started working.

iOS: Convert hex values from the characteristic.value result

I am able to retrieve the value from the peripheral as hex, and I need to convert it as per my requirement. I can see that I have received a proper response: 01117100352e36302e313100000000e55a
01 - start byte
11 - 17 (dec) - length of response packet
71 - response ID
00 - ignore this byte
So out of the total length of 17, the first 4 bytes are the header and the last 2 bytes are the CRC. We need to read the remaining 11 bytes and convert them to ASCII.
35 - 5
2e - .
36 - 6
30 - 0
2e - .
31 - 1
31 - 1
So I am getting the version number from the watch as 5.60.11.
But I need to show the value 5.60.11 as a string and print it to the console. How do I convert it? Please help me.
Please try this:
NSString *strOriginalHex = @"01117100352e36302e313100000000e55a";
NSString *strNewHexForVersion = [strOriginalHex substringWithRange:NSMakeRange(8, 14)];
NSLog(@"%@", [self stringFromHexString:strNewHexForVersion]); // 5.60.11
- (NSString *)stringFromHexString:(NSString *)aStrHexString
{
    // The hex codes should all be two characters.
    if (([aStrHexString length] % 2) != 0)
        return nil;
    NSMutableString *aMutStrNewString = [NSMutableString string];
    for (NSInteger i = 0; i < [aStrHexString length]; i += 2)
    {
        NSString *hex = [aStrHexString substringWithRange:NSMakeRange(i, 2)];
        unsigned int decimalValue = 0; // %x expects an unsigned int, not an NSInteger
        sscanf([hex UTF8String], "%x", &decimalValue);
        [aMutStrNewString appendFormat:@"%c", decimalValue];
    }
    return aMutStrNewString;
}

Create Byte array from NSMutableArray

I want to create a Byte array like this one:
Byte UUID[] = {0xEB, 0xEF, 0xD0, 0x83, 0x70, 0xA2, 0x47, 0xC8, 0x98, 0x37, 0xE7, 0xB5, 0x63, 0x4D, 0xF5, 0x24};
But the problem I am facing is that I need to fill all the elements of the above array programmatically from an NSMutableArray that holds the values as below:
(
0xEB,
0xEF,
0xD0,
0x83,
0x70,
0xA2,
0x47,
0xC8,
0x98,
0x37,
0xE7,
0xB5,
0x63,
0x4D,
0xF5,
0x24
)
I have tried with the integer values at each index, but it is showing '\0' in the Byte array.
If anyone has any information regarding this, please share.
Thanks
Assuming that you have an array of strings "0xEB", "0xEF", ..., the following should work:
NSArray *array = @[@"0xEB", @"0xEF", @"0xD0", @"0x83", @"0x70", @"0xA2", @"0x47", @"0xC8", @"0x98", @"0x37", @"0xE7", @"0xB5", @"0x63", @"0x4D", @"0xF5", @"0x24"];
Byte UUID[16];
for (int i = 0; i < 16; i++) {
    UUID[i] = strtoul([array[i] UTF8String], NULL, 16);
}
This works even if the strings do not have the "0x" prefix:
NSArray *array = @[@"EB", @"EF", ...]
because strtoul(string, ..., 16) reads a string with or without "0x" prefix
in base 16, and converts it to an integer.

How to read a NSInputStream with UTF-8?

I am trying to read a large file on iOS using NSInputStream, splitting it at newlines (I don't want to use componentsSeparatedByCharactersInSet as it uses too much memory).
But as not all lines seem to be UTF-8 encoded (some may appear to be plain ASCII, i.e. the same bytes), I often get the warning: Incorrect NSStringEncoding value 0x0000 detected. Assuming NSASCIIStringEncoding. Will stop this compatiblity mapping behavior in the near future.
My question is: Is there a way to suppress this warning, e.g. by setting a compiler flag?
Furthermore: Is it safe to append/concatenate two buffer reads? Reading from the byte stream, converting each buffer to a string, and then appending the strings could corrupt the result.
Below is an example method demonstrating that the byte-to-string conversion discards the first and second halves of the UTF-8 character as invalid.
- (void)NSInputStreamTest {
    uint8_t testString[] = {0xd0, 0x91}; // @"Б"

    // Test 1: Read max 1 byte at a time of UTF-8 string
    uint8_t buf1[1], buf2[1];
    NSString *s1, *s2, *s3;
    NSInteger c1, c2;
    NSInputStream *inStream = [[NSInputStream alloc] initWithData:[[NSData alloc] initWithBytes:testString length:2]];
    [inStream open];
    c1 = [inStream read:buf1 maxLength:1];
    s1 = [[NSString alloc] initWithBytes:buf1 length:1 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 1: Read %ld byte(s): %@", (long)c1, s1);
    c2 = [inStream read:buf2 maxLength:1];
    s2 = [[NSString alloc] initWithBytes:buf2 length:1 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 1: Read %ld byte(s): %@", (long)c2, s2);
    s3 = [s1 stringByAppendingString:s2];
    NSLog(@"Test 1: Concatenated: %@", s3);
    [inStream close];

    // Test 2: Read max 2 bytes at a time of UTF-8 string
    uint8_t buf4[2];
    NSString *s4;
    NSInteger c4;
    NSInputStream *inStream2 = [[NSInputStream alloc] initWithData:[[NSData alloc] initWithBytes:testString length:2]];
    [inStream2 open];
    c4 = [inStream2 read:buf4 maxLength:2];
    s4 = [[NSString alloc] initWithBytes:buf4 length:2 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 2: Read %ld byte(s): %@", (long)c4, s4);
    [inStream2 close];
}
Output:
2013-02-10 21:16:23.412 Test[11144:c07] Test 1: Read 1 byte(s): (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 1: Read 1 byte(s): (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 1: Concatenated: (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 2: Read 2 byte(s): Б
First of all, in the line s3 = [s1 stringByAppendingString:s2]; you are trying to concatenate two nil values, so the result is nil as well. You may want to concatenate the bytes instead of the strings:
uint8_t buf3[2];
buf3[0] = buf1[0];
buf3[1] = buf2[0];
s3 = [[NSString alloc] initWithBytes:buf3 length:2 encoding:NSUTF8StringEncoding];
Output:
2015-11-06 12:57:40.304 Test[10803:883182] Test 1: Read 1 byte(s): (null)
2015-11-06 12:57:40.305 Test[10803:883182] Test 1: Read 1 byte(s): (null)
2015-11-06 12:57:40.305 Test[10803:883182] Test 1: Concatenated: Б
Secondly, a UTF-8 character may occupy anywhere from 1 to 6 bytes:
(1 byte) 0xxx xxxx //if the symbol lies in 0x00 .. 0x7F (ASCII)
(2 bytes) 110x xxxx 10xx xxxx
(3 bytes) 1110 xxxx 10xx xxxx 10xx xxxx
(4 bytes) 1111 0xxx 10xx xxxx 10xx xxxx 10xx xxxx
(5 bytes) 1111 10xx 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx
(6 bytes) 1111 110x 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx
So, if you intend to read raw bytes from an NSInputStream and then translate them into a UTF-8 NSString, you probably want to read byte by byte until you get a valid string:
#define MAX_UTF8_BYTES 6
NSString *utf8String;
NSMutableData *_data = [[NSMutableData alloc] init]; // for easy appending of bytes
int bytes_read = 0;
while (!utf8String) {
    if (bytes_read > MAX_UTF8_BYTES) {
        NSLog(@"Can't decode input byte array into UTF8.");
        return;
    }
    else {
        uint8_t byte[1];
        [_inputStream read:byte maxLength:1];
        [_data appendBytes:byte length:1];
        // Use initWithData:encoding: here; stringWithUTF8String: requires a
        // NUL-terminated buffer, which [_data bytes] is not.
        utf8String = [[NSString alloc] initWithData:_data encoding:NSUTF8StringEncoding];
        bytes_read++;
    }
}
ASCII (and hence the newline character) is a subset of UTF-8, so there should not be any conflict.
It should be possible to divide your stream at the newline characters, as you would in a simple ASCII stream. Then you can convert each chunk ("line") into an NSString using UTF-8.
Are you sure the encoding errors are not real, i.e., that your stream doesn't actually contain byte sequences that are invalid UTF-8?
Edited to add from the comments:
This presumes that each line is short enough to be held in memory in full before converting it from UTF-8.
