Objective-C Google Maps API URL signing function - iOS

At the moment I'm having an issue producing the signature for a given Google Maps Web Services API query.
The documentation stipulates that the signature must be an HMAC-SHA1 hash of the path and query part of the URL, encoded with a modified (URL-safe) Base64 alphabet.
However, using this function and testing the result with this tool shows that it isn't working correctly.
+ (NSString *)hmac:(NSString *)data withKey:(NSString *)key {
    const char *cKey = [key cStringUsingEncoding:NSUTF8StringEncoding];
    const char *cData = [data cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *hash = [HMAC base64EncodedStringWithOptions:0];
    hash = [hash stringByReplacingOccurrencesOfString:@"+" withString:@"-"];
    hash = [hash stringByReplacingOccurrencesOfString:@"/" withString:@"_"];
    return hash;
}
I am calling this function where the data has been percent-encoded:
[urlString stringByAddingPercentEscapesUsingEncoding:NSASCIIStringEncoding];
Where am I going wrong? Any help greatly appreciated.

You can use this for your URL signing: https://github.com/youssman/GMUrlSigner
I am interested in your feedback and comments ;-)

Google has sample Objective-C code for URL signing; is there any particular reason you are not using it and some of Google's functions in your code? In particular, the key (cKey) is not plain 7-bit ASCII (as your sample treats it) but URL-safe Base64 (rfc4648Base64WebsafeStringEncoding). For the unfamiliar, Base64 was typically used in MIME mail years ago (prior to HTML mail) to let attachments pass through plain-text email. The problem is that the signature must be computed with the decoded key bytes; if you feed the Base64 string straight into the HMAC instead of decoding it first, the key bytes are completely different, so Google does not end up with the signature it expects for your URL.
There are other differences, but I guess most of them are just formatting.
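Here is a minimal sketch of that signing step with the key decoded first. It assumes the signing key uses Google's URL-safe Base64 alphabet and that CommonCrypto is available; the method name is made up for illustration.
#import <CommonCrypto/CommonDigest.h>
#import <CommonCrypto/CommonHMAC.h>

+ (NSString *)signPathAndQuery:(NSString *)pathAndQuery withWebsafeKey:(NSString *)websafeKey {
    // Map the URL-safe alphabet back to standard Base64 so NSData can decode it.
    NSString *standardKey = [[websafeKey stringByReplacingOccurrencesOfString:@"-" withString:@"+"]
                                         stringByReplacingOccurrencesOfString:@"_" withString:@"/"];
    NSData *keyData = [[NSData alloc] initWithBase64EncodedString:standardKey options:0];
    const char *cData = [pathAndQuery cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    // HMAC over the path-and-query string, keyed with the *decoded* key bytes.
    CCHmac(kCCHmacAlgSHA1, [keyData bytes], [keyData length], cData, strlen(cData), cHMAC);
    NSData *hmac = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    // Re-encode the signature with the URL-safe alphabet before appending it as &signature=...
    NSString *signature = [hmac base64EncodedStringWithOptions:0];
    signature = [signature stringByReplacingOccurrencesOfString:@"+" withString:@"-"];
    return [signature stringByReplacingOccurrencesOfString:@"/" withString:@"_"];
}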

Copyright/Registered symbol encoding not working

I've developed an iOS app in which we can send emojis from iOS to a web portal and vice versa. All emojis sent from iOS to the web portal display perfectly except “©” and “®”.
Here is the emoji encoding piece of code.
NSData *data = [messageBody dataUsingEncoding:NSNonLossyASCIIStringEncoding];
NSString *encodedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
// This code returns \251 and \256 for the copyright and registered symbols; since these are not standard \uXXXX escapes, they don't display on the web portal.
So what should I do to convert them to standard Unicode escapes?
Test run:
messageBody = @"Copy right symbol : © AND Registered Mark symbol : ®";
// The encoded string I get from the above encoding is
Copy right symbol : \\251 AND Registered Mark symbol : \\256
Whereas it should look like this (using standard Unicode escapes):
Copy right symbol : \\u00A9 AND Registered Mark symbol : \\u00AE
First, I will try to provide the solution. Then I will try to explain why.
Escaping non-ASCII chars
To escape unicode chars in a string, you shouldn't rely on NSNonLossyASCIIStringEncoding. Below is the code that I use to escape unicode & non-ASCII chars in a string:
// NSMutableString category
- (void)appendChar:(unichar)charToAppend {
    [self appendFormat:@"%C", charToAppend];
}

// NSString category
- (NSString *)UEscapedString {
    char const hexChar[] = "0123456789ABCDEF";
    NSMutableString *outputString = [NSMutableString string];
    for (NSInteger i = 0; i < self.length; i++) {
        unichar character = [self characterAtIndex:i];
        if ((character >> 7) > 0) {
            [outputString appendString:@"\\u"];
            [outputString appendChar:(hexChar[(character >> 12) & 0xF])]; // append the hex character for the left-most 4-bits
            [outputString appendChar:(hexChar[(character >> 8) & 0xF])];  // hex for the second group of 4-bits from the left
            [outputString appendChar:(hexChar[(character >> 4) & 0xF])];  // hex for the third group
            [outputString appendChar:(hexChar[character & 0xF])];         // hex for the last group, e.g., the right most 4-bits
        } else {
            [outputString appendChar:character];
        }
    }
    return [outputString copy];
}
(NOTE: I guess Jon Rose's method does the same but I didn't wanna share a method that I didn't test)
Now you have the following string: Copy right symbol : \u00A9 AND Registered Mark symbol : \u00AE
Escaping unicode is done. Now let's convert it back to display the emojis.
Converting back
This is gonna be confusing at first but this is what it is:
NSData *data = [escapedString dataUsingEncoding:NSUTF8StringEncoding];
NSString *converted = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];
Now you have your emojis (and other non-ASCIIs) back.
What is happening?
The problem
In your case, you are trying to create a common language between your server side and your app. However, NSNonLossyASCIIStringEncoding is a pretty bad choice for that purpose, because it is a black box created by Apple and we don't really know exactly what it is doing inside. As we can see, it converts some Unicode chars into \uXXXX while converting other non-ASCII chars into \XXX. That is why you shouldn't rely on it to build a multi-platform system. There is no equivalent of it on backend platforms or Android.
Mysteriously, NSNonLossyASCIIStringEncoding can still convert \u00AE back into ® even though it converted it into \256 in the first place. I'm sure there are tools on other platforms to convert \uXXXX into Unicode chars, so that shouldn't be a problem for you.
messageBody is a string; there is no reason to convert it to data only to convert it back to a string. Replace your code with:
NSString *encodedString = messageBody;
If the messageBody object is incorrect then the way to fix it is to change the way it was created. The server sends data, not strings. The data that the server sends is encoded in some agreed-upon way. Generally this encoding is UTF-8. If you know the encoding you can convert the data to a string; if you don't, then the data is gibberish that cannot be read. If the messageBody is incorrect, the problem occurred when it was converted from the data that the server sent. It seems likely that you are parsing it with the incorrect encoding.
The code you posted is just plain wrong. It converts a string to data using one encoding (ASCII) and then reads that data with a different encoding (UTF-8). That is like translating a book to Spanish and then having a Portuguese speaker translate it back - it might work for some words, but it is still wrong.
If you are still having trouble then you should share the code of where messageBody is created.
If your server expects an ASCII string with all Unicode characters changed to \u00xx, then you should first yell at your server guy because he is an idiot. But if that doesn't work, you can use the following code:
NSString* messageBody = @"Copy right symbol : © AND Registered Mark symbol : ®";
NSData* utf32Data = [messageBody dataUsingEncoding:NSUTF32StringEncoding];
uint32_t *bytes = (uint32_t *)[utf32Data bytes];
NSMutableString* escapedString = [[NSMutableString alloc] init];
// Start at 1 because the first 4 bytes are the byte-order mark (endianness)
for (NSUInteger index = 1; index < [utf32Data length] / 4; index++) {
    uint32_t charValue = bytes[index];
    if (charValue <= 127) {
        [escapedString appendFormat:@"%C", (unichar)charValue];
    } else {
        [escapedString appendFormat:@"\\u%04X", charValue];
    }
}
I really do not understand your problem.
You can simply convert ANY character into NSData and back into a string.
You can simply pass a UTF-8 string, including both emoji and other symbols, in a POST request.
NSString* newStr = [[NSString alloc] initWithData:theData encoding:NSUTF8StringEncoding];
NSData* data = [newStr dataUsingEncoding:NSUTF8StringEncoding];
It has to work for both the server and the client side.
But, of course, there is the other problem that some fonts do not support all UTF-8 chars. That's why, e.g., in a terminal you might not see some of them. But this is beyond the scope of this question.
NSNonLossyASCIIStringEncoding is needed only when you really want to convert a symbol into a chain of escape characters. But it is not needed here.
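As a minimal sketch of that round trip (assuming the portal also reads the request body as UTF-8):
NSString *message = @"Copy right symbol : © AND Registered Mark symbol : ® 😀";
NSData *body = [message dataUsingEncoding:NSUTF8StringEncoding];   // bytes to send in the POST body
NSString *roundTripped = [[NSString alloc] initWithData:body encoding:NSUTF8StringEncoding];
// roundTripped is identical to message; no \uXXXX escaping is involved.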

how to convert std::string to NSData and vice versa?

I am using Google protocol buffers to send and receive data in a cocos2d-x multiplayer game via the Google Play Games Services iOS SDK.
Protocol buffers serialize data to a std::string, but the GPGS iOS SDK sends data via NSData, hence I have to convert from std::string to NSData and then back to std::string after receiving data.
I am currently using the following method:
(std::string to NSData and NSData to std::string will be done in different functions at different times. The following code just summarises what I am doing overall.)
// PlayerData is a protocol buffer class
PlayerData data, temp;
std::string dataStr;
data.SerializeToString(&dataStr);
NSString* nsDataStr = [NSString stringWithCString:dataStr.c_str()
                                          encoding:[NSString defaultCStringEncoding]];
NSData* nsData = [nsDataStr dataUsingEncoding:NSUTF8StringEncoding];
NSString* dataStr_2 = [[NSString alloc] initWithData:nsData
                                             encoding:NSUTF8StringEncoding];
std::string foo = [dataStr_2 UTF8String];
temp.ParseFromString(foo);
Initial string, i.e. dataStr after serializing:
"\r\x95n\x99D\x158\xddNDJ\nUmar SaeedR\x12p_CPH64oqq2K-TXxAB"
size: 42
Final string, i.e. foo before parsing:
"\r\xc3\xafn\xc3\xb4D\x158\xe2\x80\xbaNDJ\nUmar SaeedR\x12p_CPH64oqq2K-TXxAB"
size: 46
The ParseFromString function of protocol buffers fails to parse foo and returns false.
How do I do the conversion so that the string stays the same?
I know this is old, but... could you use protobuf's ParseFromArray instead of ParseFromString?
i.e.:
const void *bytes = [parseData bytes];
int byteLen = (int)[parseData length];
protobufMessage.ParseFromArray(bytes, byteLen);
Just a thought.
Try this:
std::string str = "123";
NSData *data = [NSData dataWithBytes:str.data() length:str.length()];
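And for the reverse direction, a minimal sketch (assuming data is the NSData received from GPGS); copying the raw bytes and length directly avoids the NSString round trip that corrupts the payload:
std::string received(static_cast<const char *>([data bytes]), [data length]);
temp.ParseFromString(received);  // or pass the bytes/length to ParseFromArray as shown above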

Trouble with HMAC for password being sent to website

I am trying to make an app for my school that interacts with PowerSchool, software that allows users to view their grades, teachers, schedules, and much more. I found a library for the basics of interacting with PowerSchool written in PHP and have been trying to rewrite it in Objective-C for the past week. It seems the issue is how I create an HMAC (MD5) from the user's password: either I am using a hex digest where a raw digest is expected, or the other way around; I'm not sure. The error I get back from the server complains about an odd number of characters.
Here is the link to the PHP library class I am trying to re-create:
https://github.com/horvste/powerapi-php/blob/master/src/PowerAPI/Core.php
Here is my code in my test project,
Command line main class:
https://gist.github.com/anonymous/c40cdd99a826c06073aa
NSString Category Implementation file:
#import "NSString+MyAdditions.h"
@implementation NSString (MyAdditions)
- (NSString *) hmacMD5WithData: (NSString *) data
{
const char *cKey = [self cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
const unsigned int blockSize = 64;
char ipad[blockSize], opad[blockSize], keypad[blockSize];
unsigned int keyLen = strlen(cKey);
CC_MD5_CTX ctxt;
if(keyLen > blockSize)
{
//CC_MD5(cKey, keyLen, keypad);
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, cKey, keyLen);
CC_MD5_Final((unsigned char *)keypad, &ctxt);
keyLen = CC_MD5_DIGEST_LENGTH;
}
else
{
memcpy(keypad, cKey, keyLen);
}
memset(ipad, 0x36, blockSize);
memset(opad, 0x5c, blockSize);
int i;
for(i = 0; i < keyLen; i++)
{
ipad[i] ^= keypad[i];
opad[i] ^= keypad[i];
}
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, ipad, blockSize);
CC_MD5_Update(&ctxt, cData, strlen(cData));
unsigned char md5[CC_MD5_DIGEST_LENGTH];
CC_MD5_Final(md5, &ctxt);
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, opad, blockSize);
CC_MD5_Update(&ctxt, md5, CC_MD5_DIGEST_LENGTH);
CC_MD5_Final(md5, &ctxt);
const unsigned int hex_len = CC_MD5_DIGEST_LENGTH*2+2;
char hex[hex_len];
for(i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
{
snprintf(&hex[i*2], hex_len-i*2, "%02x", md5[i]);
}
NSData *HMAC = [[NSData alloc] initWithBytes:hex length:strlen(hex)];
NSString *hash = [HMAC base64EncodedStringWithOptions:0];
return hash;
}
@end
Thank you for taking the time to look at this issue!
First, don't build your own HMAC routine here. Use CCHmac. It's built-in and handles HMAC+MD5 correctly.
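A minimal sketch of what that looks like, keeping your category's shape where self is the key, and assuming the PowerSchool endpoint wants a lowercase hex digest (check the PHP library to confirm the expected output format):
#import <CommonCrypto/CommonDigest.h>
#import <CommonCrypto/CommonHMAC.h>

- (NSString *)hmacMD5WithData:(NSString *)data {
    const char *cKey = [self cStringUsingEncoding:NSUTF8StringEncoding];
    const char *cData = [data cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgMD5, cKey, strlen(cKey), cData, strlen(cData), digest);
    // Hex-encode the raw digest; don't Base64-encode the hex string afterwards.
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return [hex copy];
}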
If at all possible, I recommend going to the API documentation rather than trying to reverse engineer another code base. There are lots of little things going on in the PHP that you may be overlooking; an API doc should explain all of those.
If the PHP code is the only reference you have, then you should break down each piece and see where it's going wrong. For instance, verify that you are getting the auth data in the same form. Then confirm that each program, given the same auth data, generates the same HMAC. Then confirm that given the same HMAC, each program generates the same response. Etc. Somewhere you are doing something differently. Make sure that you're using Base64 vs raw data in the same places (PHP devs tend to treat Base64 strings as though they were actually raw data, which causes confusion when coming over to ObjC).
And of course you should examine the server logs to validate that your final request matches the PHP requests.

NSString initWithBytes with openssl encrypted data returns nil with NSUTF8StringEncoding

I'm having an issue where I'm trying to create an NSString from encrypted data created by OpenSSL and I keep getting nil for the string.
The code that I'm using to encrypt and decrypt the data is taken from the following link http://saju.net.in/code/misc/openssl_aes.c.txt
Now here is the code I'm calling to encrypt my data ("aes_init" is of course called at application init):
//-- encrypt saved data
int textLen = str.size();
char* buff = const_cast<char*>(str.c_str());
EVP_CIPHER_CTX encryptCtx;
unsigned char *ciphertext = aes_encrypt(&encryptCtx,
reinterpret_cast<unsigned char*>(buff),
&textLen);
NSString * nsstring = [[NSString alloc] initWithBytes:reinterpret_cast<const char*>(ciphertext)
length:strlen(reinterpret_cast<const char*>(ciphertext))
encoding:NSUTF8StringEncoding];
[nsstring autorelease];
UIApplication* clientApp = [UIApplication sharedApplication];
AppController* appController = ((AppController *)clientApp.delegate);
[appController saveData:nsstring]; //--> crash at this line
I've tried different encodings (NSASCIIStringEncoding and NSUnicodeStringEncoding) and they don't crash, but the data is completely wrong after I decode.
Any ideas on how to solve this issue?
Thanks :)
Ciphertext, the output of an encryption function, should be indistinguishable from random data, meaning that any byte value can be generated, including byte values that do not map to valid characters in a text encoding. Hence you need to encode the ciphertext before treating it as a string, for instance using Base64 encoding.
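A minimal sketch, assuming aes_encrypt updated textLen to the ciphertext length (as the linked openssl_aes.c does); don't use strlen, since ciphertext can contain embedded zero bytes:
NSData *cipherData = [NSData dataWithBytes:ciphertext length:textLen];
NSString *storable = [cipherData base64EncodedStringWithOptions:0];   // safe to pass around as an NSString
// Later, to get the raw bytes back for decryption:
NSData *recovered = [[NSData alloc] initWithBase64EncodedString:storable options:0];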

Convert XML Dsig format to DER ASN.1 public key

I am working on an iPhone app that retrieves an RSA public key from an ASP.NET web service in the form:
<RSAKeyValue>
<Modulus>qdd0paiiBJ+xYaN4TKDdbEzrJJw9xlbRAltb5OPdegjLoW60yOjL/sni52WVsGC9QxpNitZR33dnUscmI0cTJoxkXypPjbD94UpH+p4el2tuKBypHlE7bERApuUp55y8BiRkbQNFH8smZFWDwtIc/PsJryeGf8fAryel8c5V3PU=</Modulus>
<Exponent>AQAB</Exponent>
</RSAKeyValue>
I need to then convert this response into an NSData * of the appropriate format (from some intense Googling, most likely the 'ASN.1 DER' binary format). I've got code in place to convert both parts from their Base64 representations to the original binary values, but I can't for the life of me figure out a reasonable way to create the one-piece binary key.
The code waiting for the key is the -addPeerPublicKey:(NSString *) keyBits:(NSData *) method of the SecKeyWrapper class from Apple's CryptoExercise example project (Code here).
I would be more than happy to implement this another way--all I need is to encrypt a single string (no decryption required). As far as I can tell, though, the built-in Security framework has what I need, if I could just close this format gap. If there is a way to convert the key and send it Base64-encoded from the webservice, that works for me as well--but I couldn't find any way to ASN.1-encode it there, either.
So, I used the SecKeyWrapper class to generate a random key, then used the -getPublicKeyBits method to get the binary representation of the public key (in whatever format is used internally). Presuming it is some form of DER ASN.1, I NSLog'd it to the console as hex and loaded it into this program. Sure enough, the internal representation is DER ASN.1, but it is a very simplified version of what I normally found for RSA key representations:
SEQUENCE { INTEGER, INTEGER }
Shouldn't be too tough to construct on the fly from a binary rep. of the modulus and exponent, since the DER encoding is just
30 (for SEQUENCE) LL (total sequence byte length)
02 (INTEGER) LL (modulus byte length) XX XX... (modulus data bytes)
02 LL XX XX XX... (exponent length and bytes)
Here's my code, for simplicity. It uses a few Google libs for XML + Base64, just a heads up; also Apple's demo code SecKeyWrapper. See my other question for a note on making this work. Also, note that it is not ARC-compatible; this is left as an exercise for the reader (I wrote this years ago, now).
#define kTempPublicKey @"tempPayKey"
-(NSData *)encryptedDataWithXMLPublicKey:(NSString *)base64PublicKey data:(NSData *)data {
if(![data length]){
@throw [NSException exceptionWithName:@"NSInvalidArgumentException" reason:@"Data not set." userInfo:nil];
}
GTMStringEncoding *base64 = [GTMStringEncoding rfc4648Base64StringEncoding];
NSData *keyData = [base64 decode:base64PublicKey];
NSError *err = nil;
GDataXMLDocument *keyDoc = [[GDataXMLDocument alloc] initWithData:keyData options:0 error:&err];
if(err){
NSLog(@"Public key parse error: %@", err);
[keyDoc release];
return nil;
}
NSString *mod64 = [[[[keyDoc rootElement] elementsForName:@"Modulus"] lastObject] stringValue];
NSString *exp64 = [[[[keyDoc rootElement] elementsForName:@"Exponent"] lastObject] stringValue];
[keyDoc release];
if(![mod64 length] || ![exp64 length]){
@throw [NSException exceptionWithName:@"NSInvalidArgumentException" reason:@"Malformed public key xml." userInfo:nil];
}
NSData *modBits = [base64 decode:mod64];
NSData *expBits = [base64 decode:exp64];
/* the following is my (bmosher) hack to hand-encode the mod and exp
* into full DER encoding format, using the following as a guide:
* http://luca.ntop.org/Teaching/Appunti/asn1.html
* this is due to the unfortunate fact that the underlying API will
* only accept this format (not the separate values)
*/
// 6 extra bytes for tags and lengths
NSMutableData *fullKey = [[NSMutableData alloc] initWithLength:6+[modBits length]+[expBits length]];
unsigned char *fullKeyBytes = [fullKey mutableBytes];
unsigned int bytep = 0; // current byte pointer
fullKeyBytes[bytep++] = 0x30;
if(4+[modBits length]+[expBits length] >= 128){
fullKeyBytes[bytep++] = 0x81;
[fullKey increaseLengthBy:1];
}
unsigned int seqLenLoc = bytep;
fullKeyBytes[bytep++] = 4+[modBits length]+[expBits length];
fullKeyBytes[bytep++] = 0x02;
if([modBits length] >= 128){
fullKeyBytes[bytep++] = 0x81;
[fullKey increaseLengthBy:1];
fullKeyBytes[seqLenLoc]++;
}
fullKeyBytes[bytep++] = [modBits length];
[modBits getBytes:&fullKeyBytes[bytep]];
bytep += [modBits length];
fullKeyBytes[bytep++] = 0x02;
fullKeyBytes[bytep++] = [expBits length];
[expBits getBytes:&fullKeyBytes[bytep++]];
SecKeyRef publicKey = [[SecKeyWrapper sharedWrapper] addPeerPublicKey:kTempPublicKey keyBits:fullKey];
[fullKey release];
NSData *encrypted = [[SecKeyWrapper sharedWrapper] wrapSymmetricKey:data keyRef:publicKey];
// remove temporary key from keystore
[[SecKeyWrapper sharedWrapper] removePeerPublicKey:kTempPublicKey];
return encrypted;
}
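Hypothetical usage, assuming base64XmlFromService is the Base64-encoded <RSAKeyValue> XML returned by the web service:
NSData *plaintext = [@"string to encrypt" dataUsingEncoding:NSUTF8StringEncoding];
NSData *encrypted = [self encryptedDataWithXMLPublicKey:base64XmlFromService data:plaintext];
// encrypted can then be Base64-encoded for transport back to the ASP.NET service.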
