Convert XML DSig format to DER ASN.1 public key - iOS

I am working on an iPhone app that retrieves an RSA public key from an ASP.NET web service in the form:
<RSAKeyValue>
<Modulus>qdd0paiiBJ+xYaN4TKDdbEzrJJw9xlbRAltb5OPdegjLoW60yOjL/sni52WVsGC9QxpNitZR33dnUscmI0cTJoxkXypPjbD94UpH+p4el2tuKBypHlE7bERApuUp55y8BiRkbQNFH8smZFWDwtIc/PsJryeGf8fAryel8c5V3PU=</Modulus>
<Exponent>AQAB</Exponent>
</RSAKeyValue>
I need to then convert this response into an NSData * of the appropriate format (from some intense Googling, most likely the 'ASN.1 DER' binary format). I've got code in place to convert both parts from their Base64 representations to the original binary values, but I can't for the life of me figure out a reasonable way to create the one-piece binary key.
The code waiting for the key is the -addPeerPublicKey:(NSString *) keyBits:(NSData *) method of the SecKeyWrapper class from Apple's CryptoExercise example project (Code here).
I would be more than happy to implement this another way--all I need is to encrypt a single string (no decryption required). As far as I can tell, though, the built-in Security framework has what I need, if I could just close this format gap. If there is a way to convert the key and send it Base64-encoded from the webservice, that works for me as well--but I couldn't find any way to ASN.1-encode it there, either.

So, I used the SecKeyWrapper class to generate a random key, then used the -getPublicKeyBits method to get the binary representation of the public key (in whatever format is used internally). Presuming it is some form of DER ASN.1, I NSLog'd it to the console as hex and loaded it into this program. Sure enough, the internal representation is DER ASN.1, but it is a very simplified version of what I normally found for RSA key representations:
SEQUENCE { INTEGER, INTEGER }
Shouldn't be too tough to construct on the fly from a binary rep. of the modulus and exponent, since the DER encoding is just
30 (for SEQUENCE) LL (total sequence byte length)
02 (INTEGER) LL (modulus byte length) XX XX... (modulus data bytes)
02 LL XX XX XX... (exponent length and bytes)
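As a back-of-the-envelope illustration (my own hand expansion of the key above, so double-check it against a DER decoder before relying on it): the modulus decodes to 128 bytes and AQAB decodes to the three bytes 01 00 01, so the finished blob should come out roughly as
30 81 88        -- SEQUENCE, long-form length: 0x88 = 136 content bytes
02 81 80        -- INTEGER, long-form length: 0x80 = 128 modulus bytes
A9 D7 74 ... F5 -- the 128 modulus bytes
02 03 01 00 01  -- INTEGER, 3 exponent bytes (AQAB)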
Here's my code, for simplicity. Just a heads up: it uses a few Google libraries for XML and Base64 handling, as well as Apple's demo SecKeyWrapper class. See my other question for a note on making this work. Also, note that it is not ARC-compatible; converting it is left as an exercise for the reader (I wrote this years ago, now).
#define kTempPublicKey @"tempPayKey"

-(NSData *)encryptedDataWithXMLPublicKey:(NSString *)base64PublicKey data:(NSData *)data {
    if(![data length]){
        @throw [NSException exceptionWithName:@"NSInvalidArgumentException" reason:@"Data not set." userInfo:nil];
    }
    GTMStringEncoding *base64 = [GTMStringEncoding rfc4648Base64StringEncoding];
    NSData *keyData = [base64 decode:base64PublicKey];
    NSError *err = nil;
    GDataXMLDocument *keyDoc = [[GDataXMLDocument alloc] initWithData:keyData options:0 error:&err];
    if(err){
        NSLog(@"Public key parse error: %@", err);
        [keyDoc release];
        return nil;
    }
    NSString *mod64 = [[[[keyDoc rootElement] elementsForName:@"Modulus"] lastObject] stringValue];
    NSString *exp64 = [[[[keyDoc rootElement] elementsForName:@"Exponent"] lastObject] stringValue];
    [keyDoc release];
    if(![mod64 length] || ![exp64 length]){
        @throw [NSException exceptionWithName:@"NSInvalidArgumentException" reason:@"Malformed public key xml." userInfo:nil];
    }
    NSData *modBits = [base64 decode:mod64];
    NSData *expBits = [base64 decode:exp64];
    /* the following is my (bmosher) hack to hand-encode the mod and exp
     * into full DER encoding format, using the following as a guide:
     * http://luca.ntop.org/Teaching/Appunti/asn1.html
     * this is due to the unfortunate fact that the underlying API will
     * only accept this format (not the separate values)
     */
    // 6 extra bytes for tags and lengths
    NSMutableData *fullKey = [[NSMutableData alloc] initWithLength:6+[modBits length]+[expBits length]];
    unsigned char *fullKeyBytes = [fullKey mutableBytes];
    unsigned int bytep = 0; // current byte pointer
    fullKeyBytes[bytep++] = 0x30;
    if(4+[modBits length]+[expBits length] >= 128){
        fullKeyBytes[bytep++] = 0x81;
        [fullKey increaseLengthBy:1];
    }
    unsigned int seqLenLoc = bytep;
    fullKeyBytes[bytep++] = 4+[modBits length]+[expBits length];
    fullKeyBytes[bytep++] = 0x02;
    if([modBits length] >= 128){
        fullKeyBytes[bytep++] = 0x81;
        [fullKey increaseLengthBy:1];
        fullKeyBytes[seqLenLoc]++;
    }
    fullKeyBytes[bytep++] = [modBits length];
    [modBits getBytes:&fullKeyBytes[bytep]];
    bytep += [modBits length];
    fullKeyBytes[bytep++] = 0x02;
    fullKeyBytes[bytep++] = [expBits length];
    [expBits getBytes:&fullKeyBytes[bytep++]];
    SecKeyRef publicKey = [[SecKeyWrapper sharedWrapper] addPeerPublicKey:kTempPublicKey keyBits:fullKey];
    [fullKey release];
    NSData *encrypted = [[SecKeyWrapper sharedWrapper] wrapSymmetricKey:data keyRef:publicKey];
    // remove temporary key from keystore
    [[SecKeyWrapper sharedWrapper] removePeerPublicKey:kTempPublicKey];
    return encrypted;
}

Related

Copyright/Registered symbol encoding not working

I’ve developed an iOS app in which we can send emojis from iOS to a web portal and vice versa. All emojis sent from iOS to the web portal display perfectly except for “© and ®”.
Here is the emoji encoding piece of code.
NSData *data = [messageBody dataUsingEncoding:NSNonLossyASCIIStringEncoding];
NSString *encodedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
// This piece of code returns \251 and \256 for the copyright and registered symbols; since these are not standard \uXXXX escapes, they don't display on the web portal.
So what should I do to convert them to standard Unicode escapes?
Test Run :
messageBody = @"Copy right symbol : © AND Registered Mark symbol : ®";
// Encoded string i get from the above encoding is
Copy right symbol : \\251 AND Registered Mark symbol : \\256
Where as it should like this (On standard unicodes )
Copy right symbol : \\u00A9 AND Registered Mark symbol : \\u00AE
First, I will try to provide the solution. Then I will try to explain why.
Escaping non-ASCII chars
To escape unicode chars in a string, you shouldn't rely on NSNonLossyASCIIStringEncoding. Below is the code that I use to escape unicode&non-ASCII chars in a string:
// NSMutableString category
- (void)appendChar:(unichar)charToAppend {
    [self appendFormat:@"%C", charToAppend];
}

// NSString category
- (NSString *)UEscapedString {
    char const hexChar[] = "0123456789ABCDEF";
    NSMutableString *outputString = [NSMutableString string];
    for (NSInteger i = 0; i < self.length; i++) {
        unichar character = [self characterAtIndex:i];
        if ((character >> 7) > 0) {
            [outputString appendString:@"\\u"];
            [outputString appendChar:(hexChar[(character >> 12) & 0xF])]; // append the hex character for the left-most 4-bits
            [outputString appendChar:(hexChar[(character >> 8) & 0xF])]; // hex for the second group of 4-bits from the left
            [outputString appendChar:(hexChar[(character >> 4) & 0xF])]; // hex for the third group
            [outputString appendChar:(hexChar[character & 0xF])]; // hex for the last group, e.g., the right most 4-bits
        } else {
            [outputString appendChar:character];
        }
    }
    return [outputString copy];
}
(NOTE: I guess Jon Rose's method does the same, but I didn't want to share a method that I didn't test)
Now you have the following string: Copy right symbol : \u00A9 AND Registered Mark symbol : \u00AE
Escaping unicode is done. Now let's convert it back to display the emojis.
Converting back
This is going to be confusing at first, but here it is:
NSData *data = [escapedString dataUsingEncoding:NSUTF8StringEncoding];
NSString *converted = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];
Now you have your emojis (and other non-ASCIIs) back.
What is happening?
The problem
In your case, you are trying to create a common language between your server side and your app. However, NSNonLossyASCIIStringEncoding is a pretty bad choice for that purpose, because it is a black box created by Apple and we don't really know exactly what it does inside. As we can see, it converts some unicode chars into \uXXXX while converting other non-ASCII chars into \XXX. That is why you shouldn't rely on it to build a multi-platform system. There is no equivalent of it on backend platforms or Android.
Mysteriously enough, NSNonLossyASCIIStringEncoding can still convert ® back from \u00AE even though it converted it into \256 in the first place. I'm sure there are tools on other platforms to convert \uXXXX into unicode chars, so that shouldn't be a problem for you.
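As a quick illustration of that round trip (a minimal sketch; the exact escapes are simply what Foundation happens to emit):
NSString *original = @"© and ®";
// Encoding with NSNonLossyASCIIStringEncoding yields the octal escapes \251 and \256
NSData *lossy = [original dataUsingEncoding:NSNonLossyASCIIStringEncoding];
NSLog(@"escaped: %@", [[NSString alloc] initWithData:lossy encoding:NSASCIIStringEncoding]);
// Decoding the escaped bytes with the same encoding restores the symbols
NSString *roundTrip = [[NSString alloc] initWithData:lossy encoding:NSNonLossyASCIIStringEncoding];
NSLog(@"round trip: %@", roundTrip);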
messageBody is a string; there is no reason to convert it to data only to convert it back to a string. Replace your code with
NSString *encodedString = messageBody;
If the messageBody object is incorrect, then the way to fix it is to change the way it was created. The server sends data, not strings. The data that the server sends is encoded in some agreed-upon way. Generally this encoding is UTF-8. If you know the encoding, you can convert the data to a string; if you don't, then the data is gibberish that cannot be read. If messageBody is incorrect, the problem occurred when it was converted from the data that the server sent. It seems likely that you are parsing it with the incorrect encoding.
The code you posted is just plain wrong. It converts a string to data using one encoding (ASCII) and the reads that data with a different encoding (UTF8). That is like translating a book to Spanish and then having a Portuguese speaker translate it back - it might work for some words, but it is still wrong.
If you are still having trouble then you should share the code of where messageBody is created.
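If the encoding at that point turns out to be the culprit, the fix is a one-liner where the raw body is turned into a string. A minimal sketch (the receivedData value here is a hypothetical stand-in for whatever your networking layer hands you):
// Hypothetical stand-in for the raw body received from the server
NSData *receivedData = [@"Copy right symbol : © AND Registered Mark symbol : ®" dataUsingEncoding:NSUTF8StringEncoding];
// Decode with the encoding the server actually uses (usually UTF-8)
NSString *messageBody = [[NSString alloc] initWithData:receivedData encoding:NSUTF8StringEncoding];
if (messageBody == nil) {
    NSLog(@"Body was not valid UTF-8; check the agreed-upon encoding");
}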
If your server expects an ASCII string with all unicode characters changed to \u00xx, then you should first yell at your server guy, because he is an idiot. But if that doesn't work you can use the following code:
NSString* messageBody = @"Copy right symbol : © AND Registered Mark symbol : ®";
NSData* utf32Data = [messageBody dataUsingEncoding:NSUTF32StringEncoding];
uint32_t *bytes = (uint32_t *) [utf32Data bytes];
NSMutableString* escapedString = [[NSMutableString alloc] init];
// Start at 1 because the first 4 bytes are the byte-order mark
for(NSUInteger index = 1; index < utf32Data.length / 4; index++ ){
    uint32_t charValue = bytes[index];
    if (charValue <= 127) {
        [escapedString appendFormat:@"%C", (unichar)charValue];
    } else {
        [escapedString appendFormat:@"\\\\u%04X", charValue];
    }
}
I really do not understand your problem.
You can simply convert ANY character into NSData and turn it back into a string.
You can simply pass a UTF-8 string including both emoji and other symbols using a POST request.
NSString* newStr = [[NSString alloc] initWithData:theData encoding:NSUTF8StringEncoding];
NSData* data = [newStr dataUsingEncoding:NSUTF8StringEncoding];
It has to work for both the server and the client side.
But, of course, you have the other problem that some fonts do not support all UTF-8 chars. That's why, e.g., in a terminal you might not see some of them. But this is beyond the scope of this question.
NSNonLossyASCIIStringEncoding is used only when you really want to convert a symbol into a chain of symbols. But it is not needed here.

Binary hash representation to hex/ASCII in Objective-C

I would like to log a binary hash representation in the console, using a hex or ASCII representation. The algorithm is MD5, so the function is CC_MD5.
I get the binary hash representation via a Theos tweak, which is working well.
EDIT: this tweak intercepts the CC_MD5 call. The call is implemented in the method described below. When CC_MD5 is called, replaced_CC_MD5 intercepts the call.
The app under test is a simple app which I made myself, and it uses this method to calculate the MD5 hash:
- (NSString *) md5:(NSString *) input
{
    const char *cStr = [input UTF8String];
    unsigned char digest[16];
    CC_MD5( cStr, strlen(cStr), digest ); // This is the md5 call
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for(int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}
The hashing is OK, and the app returns the correct hash for the input:
input = prova
MD5 Digest = 189bbbb00c5f1fb7fba9ad9285f193d1
The function in my Theos tweak where I manipulate the CC_MD5 call is below.
EDIT: here data corresponds to cStr, len to strlen(cStr), and md to digest.
static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
    CC_LONG dataLength = (size_t) len;
    NSLog(@"==== START CC_MD5 HOOK ====");
    // hex of digest
    NSData *dataDigest = [NSData dataWithBytes:(const void *)md length:(NSUInteger)CC_MD5_DIGEST_LENGTH];
    NSLog(@"%@", dataDigest);
    // hex of string
    NSData *dataString = [NSData dataWithBytes:(const void *)data length:(NSUInteger)dataLength];
    NSLog(@"%@", dataString);
    NSLog(@"==== END CC_MD5 HOOK ====");
    return original_CC_MD5(data, len, md);
}
The log of dataString is OK: 70726f76 61, which is the hex representation of prova.
The log of dataDigest is e9aa0800 01000000 b8c00800 01000000, which is, if I understood correctly, the binary hash representation.
How can I convert this representation to get the MD5 digest?
In replaced_CC_MD5 you are displaying md before the call to original_CC_MD5 which sets its value. What you are seeing is therefore random data (or whatever was last stored in md).
Move the call to original_CC_MD5 to before the display statement and you should see the value you expect. (You'll of course need to save the result of the call in a local so you can return the value in the return statement.)
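A rough sketch of the reordered hook, reusing the question's names (untested):
static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
    // Let the original implementation compute the digest into md first
    unsigned char *result = original_CC_MD5(data, len, md);
    // md now holds the real digest and can be logged
    NSData *dataDigest = [NSData dataWithBytes:md length:CC_MD5_DIGEST_LENGTH];
    NSLog(@"==== CC_MD5 HOOK digest: %@ ====", dataDigest);
    return result;
}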

Random 256-bit key using SecRandomCopyBytes() in iOS

I have been using UUIDString as an encryption key for the files stored on my iPad, but the security review done on my app by a third party suggested the following.
With the launch of the application, a global database key is generated and stored in the keychain. During generation, the method UUIDString of the class NSUUID provided by the iOS is used. This function generates a random string composed of letters A to F, numbers and hyphens and unnecessarily restricts the key space, resulting in a weakening of the entropy.
Since the key is used only by application logic and does not have to be read, understood or processed by an individual, there is no need to restrict the key space to readable characters. Therefore, a random 256-bit key generated via SecRandomCopyBytes () should be used as the master key.
Now I have searched a lot and tried some code implementations but haven't found exactly what I need.
What I have tried:
NSMutableData* data = [NSMutableData dataWithLength:32];
int result = SecRandomCopyBytes(kSecRandomDefault, 32, data.mutableBytes);
NSLog(@"Description %d", result);
My understanding is that this should give me an integer which I should convert to an NSString and use as my key, but I am pretty sure that this is not what is required here; also, the above method always gives 0 as the result. I am completely lost here and any help is appreciated.
Thanks.
The result of SecRandomCopyBytes should always be 0, unless there is some error (which I can't imagine why that might happen) and then the result would be -1. You're not going to convert that into an NSString.
The thing you're trying to get are the random bytes which are being written into the mutable bytes section, and that's what you'll be using as your "master key" instead of the UUID string.
The way I would do it would be:
uint8_t randomBytes[16];
int result = SecRandomCopyBytes(kSecRandomDefault, 16, randomBytes);
if(result == 0) {
    NSMutableString *uuidStringReplacement = [[NSMutableString alloc] initWithCapacity:16*2];
    for(NSInteger index = 0; index < 16; index++)
    {
        [uuidStringReplacement appendFormat: @"%02x", randomBytes[index]];
    }
    NSLog(@"uuidStringReplacement is %@", uuidStringReplacement);
} else {
    NSLog(@"SecRandomCopyBytes failed for some reason");
}
Using a UUIDString feels secure enough to me, but it sounds like your third party security audit firm is trying really hard to justify their fees.
EDITED: since I'm now starting to collect downvotes because of Vlad's alternative answer and I can't delete mine (as it still has the accepted checkmark), here's another version of my code. I'm doing it with 16 random bytes (which gets doubled in converting to Hex).
The NSData generated is not guaranteed to contain valid UTF-16 characters.
This method will generate a 32-byte string, i.e. 256 bits' worth of characters. (The advantage is that this is plain text and can be sent in GET requests, etc.)
Since a Base64 string is 4/3 the length of its input (equivalently, the input is 3/4 the length of the output), a 24-byte input is needed to produce a 32-character output. Note: Base64 may pad the end with one, two or no '=' chars if the input length is not divisible by three.
With OSX 10.9 & iOS 7 you can use:
-[NSData base64EncodedDataWithOptions:]
This method can be used to generate your UUID:
+ (NSString*)generateSecureUUID {
    NSMutableData *data = [NSMutableData dataWithLength:24];
    int result = SecRandomCopyBytes(NULL, 24, data.mutableBytes);
    NSAssert(result == 0, @"Error generating random bytes: %d", result);
    NSString *base64EncodedData = [data base64EncodedStringWithOptions:0];
    return base64EncodedData;
}
A UUID is a 16-byte (128-bit) unique identifier, so you aren't using a 256-bit key here. Also, as @zaph pointed out, UUIDs use hardware identifiers and other inputs to guarantee uniqueness. Since these factors are predictable, they are definitely not cryptographically secure.
You don't have to use a UUID as an encryption key; instead I would go for base64- or hexadecimal-encoded data of 32 bytes, so you'll have your 256-bit cryptographically secure key:
/** Generates a 256-bit cryptographically secure key.
 * The output will be a 44-character base 64 string (32 bytes of data
 * before the base 64 encoding).
 * @return A base 64 encoded 256-bit secure key.
 */
+ (NSString*)generateSecureKey
{
    NSMutableData *data = [NSMutableData dataWithLength:32];
    int result = SecRandomCopyBytes(kSecRandomDefault, 32, data.mutableBytes);
    if (result != noErr) {
        return nil;
    }
    return [data base64EncodedStringWithOptions:kNilOptions];
}
To answer the part about generating UUID-like (secure) random numbers, here's a good way, but remember these will be 128-bit keys only:
/** Generates a 128-bit cryptographically secure key, formatted as a UUID.
 * Keep in mind that you won't have the same guarantee for uniqueness
 * as you have with regular UUIDs.
 * @return A cryptographically secure UUID.
 */
+ (NSString*)generateCryptoSecureUUID
{
    unsigned char bytes[16];
    int result = SecRandomCopyBytes(kSecRandomDefault, 16, bytes);
    if (result != noErr) {
        return nil;
    }
    return [[NSUUID alloc] initWithUUIDBytes:bytes].UUIDString;
}
Cryptography is great, but doing it right is really hard (it's easy to leave security holes). I cannot recommend the use of RNCryptor enough: it will push you toward good encryption standards, make sure you're not unsafely reusing the same keys, derive encryption keys from passwords correctly, etc.
And I tried this code with a length of 16 and 16 bytes:
uint8_t randomBytes[16];
NSMutableString *ivStr;
int result = SecRandomCopyBytes(kSecRandomDefault, 16, randomBytes);
if(result == 0) {
    ivStr = [[NSMutableString alloc] initWithCapacity:16];
    for(NSInteger index = 0; index < 8; index++)
    {
        [ivStr appendFormat: @"%02x", randomBytes[index]];
    }
    NSLog(@"uuidStringReplacement is %@", ivStr);
} else {
    NSLog(@"SecRandomCopyBytes failed for some reason");
}
Successful
Since the key usually needs to be UTF-8 encoded and "readable" - i.e. with no UTF-8 control characters - I decided to filter the randomly generated bytes produced by SecRandomCopyBytes so the key only contains characters from the Basic Latin Unicode block.
/*!
 * @brief Generates NSData from a randomly generated byte array with a specific number of bits
 * @param numberOfBits the number of bits the generated data must have
 * @return the randomly generated NSData
 */
+ (NSData *)randomKeyDataGeneratorWithNumberBits:(int)numberOfBits {
    int numberOfBytes = numberOfBits/8;
    uint8_t randomBytes[numberOfBytes];
    int result = SecRandomCopyBytes(kSecRandomDefault, numberOfBytes, randomBytes);
    if(result == 0) {
        return [NSData dataWithBytes:randomBytes length:numberOfBytes];
    } else {
        return nil;
    }
}

/*!
 * @brief Generates UTF-8 NSData from a randomly generated byte array with a specific number of bits
 * @param numberOfBits the number of bits the generated data must have
 * @return the randomly generated NSData
 */
+ (NSData *)randomKeyUTF8DataGeneratorWithNumberBits:(int)numberOfBits {
    NSMutableData *result = [[NSMutableData alloc] init];
    int numberOfBytes = numberOfBits/8;
    while (result.length < numberOfBytes) {
        // Creates a random byte
        NSData *byte = [self randomKeyDataGeneratorWithNumberBits:8];
        int asciiValue = [[[NSString alloc] initWithData:byte encoding:NSUTF8StringEncoding] characterAtIndex:0];
        // Keeps the byte only if it is a printable ASCII character
        if (asciiValue > 32 && asciiValue < 127) {
            [result appendData:byte];
        }
    }
    return result;
}
If you want to make your key a little more "readable" you can try and make it Base64 URL Safe
/*!
 * @brief Encodes a string as Base 64 with the URL and Filename Safe Alphabet
 * @discussion Base64url Encoding: the URL- and filename-safe Base64 encoding described in RFC 4648 [RFC4648] (https://tools.ietf.org/html/rfc4648),
 * @discussion Section 5 (https://tools.ietf.org/html/rfc4648#section-5)
 * @param string the string to be encoded
 * @return the encoded string
 */
+ (NSString *)base64URLandFilenameSafeString:(NSString *)string {
    NSString *base64String = string;
    base64String = [base64String stringByReplacingOccurrencesOfString:@"/"
                                                            withString:@"_"];
    base64String = [base64String stringByReplacingOccurrencesOfString:@"+"
                                                            withString:@"-"];
    return base64String;
}
Generate a UTF-8 256-bit key:
NSData *key = [self randomKeyUTF8DataGeneratorWithNumberBits:256];
NSString *UTF8String = [[NSString alloc] initWithBytes:[key bytes] length:key.length encoding:NSUTF8StringEncoding];
NSString *base64URLSafeString = [self base64URLandFilenameSafeString:UTF8String];

Converting NSData to an NSString returns random characters

I am working on a bluetooth iOS project and have managed to get some data from the bluetooth device.
However, I am struggling to convert this data into something useful, such as an NSString. Whenever I try to NSLog the NSString that was converted from the NSData received, it is a bunch of gibberish. The output is:
ēဥ၆䄀
The bluetooth device is a heart monitor from a manufacturer in Asia, and they have provided the protocol reference on how to make calls to the device. This is one thing they mention in the protocol reference:
The PC send 16-byte packets to the device, then the device sent back the 16-byte packets. Except for some special commands, all others can use this communication mode.
Can anyone tell me what I am doing wrong? I have tried everything I know, including every single encoding in the apple docs as well as both initWithData and initWithBytes. Thanks!
-(void)peripheral:(CBPeripheral *)peripheral didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
error:(NSError *)error {
    if (error)
    {
        NSLog(@"error in read is %@", error.description);
        return;
    }
    NSData *data = characteristic.value;
    NSString *myString = [[NSString alloc] initWithBytes:[data bytes] length:[data length] encoding:NSUTF16StringEncoding];
    NSLog(@"Value from device is %@", myString); // OUTPUT IS ēဥ၆䄀
}
What you have here is a string of raw data that can't be directly converted into a human readable string - unless you consider hex-representation to be human readable :)
To make sense of this data you need to either have a protocol specification at hand or prepare for hours (sometimes days) of reverse-engineering.
This byte-sequence can be composed of multiple values formatted in standard (float IEEE 754, uint8_t, uint16_t...) or even proprietary formats.
One important thing to consider when communicating with the outside world is also endianness (ie: does the 'biggest' byte in multi-byte format come first or last).
There are many ways to manipulate this data. To get the raw array of bytes you could do:
NSData *rxData = ...
uint8_t *bytes = (uint8_t *)[rxData bytes];
And then if (for example) the first byte tells you what type of payload the string holds, you can switch on it like:
switch (bytes[0])
{
    case 0x00:
        //first byte 0x00: do the parsing
        break;
    case 0x01:
        //first byte 0x01: do the parsing
        break;
    // ...
    default:
        break;
}
Here would be an example of parsing data that consists of:
byte 0: byte holding some bit-coded flags
bytes 1,2,3,4: 32-bit float
bytes 5,6: uint16_t
bool bitFlag0;
bool bitFlag1;
bool bitFlag2;
bool bitFlag3;
uint8_t firstByte;
float theFloat;
uint16_t theInteger;
NSData *rxData = ...
uint8_t *bytes = (uint8_t *)[rxData bytes];
// getting the flags
firstByte = bytes[0];
bitFlag0 = firstByte & 0x01;
bitFlag1 = firstByte & 0x02;
bitFlag2 = firstByte & 0x04;
bitFlag3 = firstByte & 0x08;
// getting the float
[[rxData subdataWithRange:NSMakeRange(1, 4)] getBytes:&theFloat length:sizeof(float)];
NSLog(@"the float is %.2f", theFloat);
// getting the unsigned integer
[[rxData subdataWithRange:NSMakeRange(5, 2)] getBytes:&theInteger length:sizeof(uint16_t)];
NSLog(@"the integer is %u", theInteger);
One note: depending on the endianness you might need to reverse the 4 float bytes or the 2 uint16_t bytes before converting them. Converting these byte arrays can also be done with unions.
union bytesToFloat
{
    uint8_t b[4];
    float f;
};
and then:
union bytesToFloat conv;
// float would be written on bytes b1b2b3b4 in protocol
conv.b[0] = bytes[1]; //or bytes[4] .. endianness!
conv.b[1] = bytes[2]; //or bytes[3] .. endianness!
conv.b[2] = bytes[3]; //or bytes[2] .. endianness!
conv.b[3] = bytes[4]; //or bytes[1] .. endianness!
theFloat = conv.f;
If for example you know that byte6 and byte7 represent a uint16_t value, you can calculate it from the raw bytes:
value = (uint16_t)((bytes[6]<<8)+bytes[7]);
or (again - endianness):
value = (uint16_t)((bytes[7]<<8)+bytes[6]);
One more note: using simply sizeof(float) is a bit risky since float can be 32-bit on one platform and 64-bit on another.

Obfuscating a number (in a string) in Objective-C

I'm using the following code to obfuscate a passcode for a test app of mine.
- (NSString *)obfuscate:(NSString *)string withKey:(NSString *)key
{
    // Create data object from the string
    NSData *data = [string dataUsingEncoding:NSUTF8StringEncoding];
    // Get pointer to data to obfuscate
    char *dataPtr = (char *) [data bytes];
    // Get pointer to key data
    char *keyData = (char *) [[key dataUsingEncoding:NSUTF8StringEncoding] bytes];
    // Points to each char in sequence in the key
    char *keyPtr = keyData;
    int keyIndex = 0;
    // For each character in data, xor with current value in key
    for (int x = 0; x < [data length]; x++)
    {
        // Replace current character in data with
        // current character xor'd with current key value.
        // Bump each pointer to the next character
        *dataPtr = *dataPtr++ ^ *keyPtr++;
        // If at end of key data, reset count and
        // set key pointer back to start of key value
        if (++keyIndex == [key length])
            keyIndex = 0, keyPtr = keyData;
    }
    return [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
}
This works like a charm with all strings, but I've run into a bit of a problem comparing the following results:
NSLog([self obfuscate:@"0000" withKey:@"maki"]); // Returns 0]<W
NSLog([self obfuscate:@"0809" withKey:@"maki"]); // Returns 0]<W
As you can see, the two strings of digits, while different, return the same result! What has gone wrong in the code I've attached that results in the same output for these two numbers?
Another example:
NSLog([self obfuscate:@"8000" withKey:@"maki"]); // Returns 8U4_
NSLog([self obfuscate:@"8290" withKey:@"maki"]); // Returns 8U4_ as well
I may be misunderstanding the concept of obfuscation, but I was under the impression that each unique string returns a unique obfuscated string!
Please help me fix this bug/glitch
Source of Code: http://iosdevelopertips.com/cocoa/obfuscation-encryption-of-string-nsstring.html
The problem is your last line. You create the new string with the original, unmodified data object.
You need to create a new NSData object from the modified dataPtr bytes.
NSData *newData = [NSData dataWithBytes:dataPtr length:data.length];
return [[NSString alloc] initWithData:newData encoding:NSUTF8StringEncoding];
But you have some bigger issues.
The call to bytes returns a constant, read-only reference to the bytes in the NSData object. You should NOT be modifying that data.
The result of your XOR on the character data could, in theory, result in a byte stream that is no longer a valid UTF-8 encoded string.
The obfuscation algorithm that you have selected is based on XORing the data and the "key" values together. Generally, this is not very strong. Moreover, since XOR is symmetric, the results are very prone to producing duplicates.
Although your implementation is currently broken, fixing it would not be of much help in preventing the algorithm from producing identical results for different data: it is relatively straightforward to construct key/data pairs that produce the same obfuscated string - for example,
[self obfuscate:@"0123" withKey:@"vwxy"]
[self obfuscate:@"pqrs" withKey:@"6789"]
will produce identical results "FFJJ", even though both the strings and the keys look sufficiently different.
If you would like to "obfuscate" your strings in a cryptographically strong way, use a salted secure hash algorithm: it will produce very different results for even slightly different strings.
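As a minimal sketch of that idea, here is a salted SHA-256 digest using CommonCrypto; the salt handling is purely illustrative (a real design would derive keys with a proper KDF such as PBKDF2 and store the salt deliberately):
#import <CommonCrypto/CommonDigest.h>

// Illustrative only: hex-encoded SHA-256 of salt + input
NSString *saltedHash(NSString *input, NSData *salt) {
    NSMutableData *message = [salt mutableCopy];
    [message appendData:[input dataUsingEncoding:NSUTF8StringEncoding]];
    unsigned char digest[CC_SHA256_DIGEST_LENGTH];
    CC_SHA256(message.bytes, (CC_LONG)message.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}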
