I am using Google Protocol Buffers to send and receive data in a cocos2d-x multiplayer game via the Google Play Games Services iOS SDK.
Protocol Buffers serializes data into a std::string, but the GPGS iOS SDK sends data as NSData, so I have to convert from std::string to NSData and then back to std::string after receiving the data.
I am currently using the following method:
(The std::string-to-NSData and NSData-to-std::string conversions are done in different functions at different times. The following code just summarises what I am doing overall.)
//PlayerData is protocol buffer class
PlayerData data, temp;
std::string dataStr;
data.SerializeToString(&dataStr);
NSString* nsDataStr = [NSString stringWithCString:dataStr.c_str()
                                          encoding:[NSString defaultCStringEncoding]];
NSData* nsData = [nsDataStr dataUsingEncoding:NSUTF8StringEncoding];
NSString* dataStr_2 = [[NSString alloc] initWithData:nsData
                                             encoding:NSUTF8StringEncoding];
std::string foo = [dataStr_2 UTF8String];
temp.ParseFromString(foo);
Initial string, i.e. dataStr, after serializing:
"\r\x95n\x99D\x158\xddNDJ\nUmar SaeedR\x12p_CPH64oqq2K-TXxAB"
size: 42
Final string, i.e. foo, before parsing:
"\r\xc3\xafn\xc3\xb4D\x158\xe2\x80\xbaNDJ\nUmar SaeedR\x12p_CPH64oqq2K-TXxAB"
size: 46
The protocol buffer ParseFromString function fails to parse foo and returns false.
How should I do the conversions so that the string stays the same?
I know this is old, but... could you use protobuf's ParseFromArray instead of ParseFromString?
I.e.:
const void *bytes = [parseData bytes];
int byteLen = (int)[parseData length];
protobufMessage.ParseFromArray(bytes, byteLen);
Just a thought.
Try this:
std::string str = "123";
NSData *data = [NSData dataWithBytes:str.data() length:str.length()];
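Putting the two answers above together, here is a minimal round-trip sketch that never routes the bytes through NSString or any text encoding (which is what corrupts them); the variable names mirror the question:
// PlayerData is the protocol buffer class from the question
PlayerData data, temp;

// std::string -> NSData: copy the raw serialized bytes; no text encoding involved
std::string dataStr;
data.SerializeToString(&dataStr);
NSData *nsData = [NSData dataWithBytes:dataStr.data() length:dataStr.size()];

// NSData -> std::string: rebuild the string from the byte buffer and its length
std::string received((const char *)[nsData bytes], [nsData length]);
temp.ParseFromString(received);
// or, skipping the std::string entirely:
// temp.ParseFromArray([nsData bytes], (int)[nsData length]);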
I would like to log a binary hash representation in the console, using a hex or ASCII representation. The algorithm is MD5, so the function is CC_MD5.
I get the binary hash representation via a Theos tweak, which is working well.
EDIT: this tweak intercepts the CC_MD5 call. The call is implemented in the method described below. When CC_MD5 is called, replaced_CC_MD5 intercepts the call.
The app under test is a simple app I made myself, and it uses this method to calculate the MD5 hash:
- (NSString *) md5:(NSString *) input
{
const char *cStr = [input UTF8String];
unsigned char digest[16];
CC_MD5( cStr, strlen(cStr), digest ); // This is the md5 call
NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
for(int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
[output appendFormat:@"%02x", digest[i]];
return output;
}
The hashing is fine, and the app returns the correct hash for the input:
input = prova
MD5 Digest = 189bbbb00c5f1fb7fba9ad9285f193d1
The function in my Theos tweak where I manipulate the CC_MD5 call is below.
EDIT: here data would be cStr, len would be strlen(cStr), and md would be digest.
static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
CC_LONG dataLength = (size_t) len;
NSLog(#"==== START CC_MD5 HOOK ====");
// hex of digest
NSData *dataDigest = [NSData dataWithBytes:(const void *)md length:(NSUInteger)CC_MD5_DIGEST_LENGTH];
NSLog(#"%#", dataDigest);
// hex of string
NSData *dataString = [NSData dataWithBytes:(const void *)data length:(NSUInteger)dataLength];
NSLog(#"%#", dataString);
NSLog(#"==== END CC_MD5 HOOK ====");
return original_CC_MD5(data, len, md);
}
The log of dataString is fine: 70726f76 61, which is the hex representation of prova.
The log of dataDigest is e9aa0800 01000000 b8c00800 01000000 which, if I understood correctly, is the binary hash representation.
How can i convert this representation to have the MD5 Hash digest?
In replaced_CC_MD5 you are displaying md before the call to original_CC_MD5, which is what sets its value. What you are seeing is therefore random data (or whatever was last stored in md).
Move the call to original_CC_MD5 to before the display statement and you should see the value you expect. (You'll of course need to save the result of the call in a local variable so you can still return it in the return statement.)
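For example, a minimal sketch of the reordered hook, reusing the names from the question (this assumes original_CC_MD5 is the trampoline to the real CC_MD5, as in a standard Theos hook):
static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
    // Call the original first so md actually contains the digest
    unsigned char *result = original_CC_MD5(data, len, md);

    NSLog(@"==== START CC_MD5 HOOK ====");
    NSLog(@"input:  %@", [NSData dataWithBytes:data length:(NSUInteger)len]);
    NSLog(@"digest: %@", [NSData dataWithBytes:md length:CC_MD5_DIGEST_LENGTH]);
    NSLog(@"==== END CC_MD5 HOOK ====");

    return result;
}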
I'm having an issue where I'm trying to create an NSString from encrypted data created by OpenSSL and I keep getting nil for the string.
The code that I'm using to encrypt and decrypt the data is taken from the following link http://saju.net.in/code/misc/openssl_aes.c.txt
Now here is the code I call to encrypt my data (aes_init is of course called at application init):
//-- encrypt saved data
int textLen = str.size();
char* buff = const_cast<char*>(str.c_str());
EVP_CIPHER_CTX encryptCtx;
unsigned char *ciphertext = aes_encrypt(&encryptCtx,
                                        reinterpret_cast<unsigned char*>(buff),
                                        &textLen);
NSString * nsstring = [[NSString alloc] initWithBytes:reinterpret_cast<const char*>(ciphertext)
                                               length:strlen(reinterpret_cast<const char*>(ciphertext))
                                             encoding:NSUTF8StringEncoding];
[nsstring autorelease];
UIApplication* clientApp = [UIApplication sharedApplication];
AppController* appController = ((AppController *)clientApp.delegate);
[appController saveData:nsstring]; //--> crash at this line
I've tried different encodings (NSASCIIStringEncoding and NSUnicodeStringEncoding); they don't crash, but the data is completely wrong after I decode it.
Any ideas on how to solve this issue?
Thanks :)
Ciphertext, the output of an encryption function, should be indistinguishable from random data. That means any byte value can be generated, including byte values that do not map to characters in a text encoding (which is why the UTF-8 conversion returns nil). Hence you need to encode the ciphertext, for instance using Base64.
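A minimal sketch of that approach, assuming aes_encrypt updates textLen to the ciphertext length as in the linked example (binary ciphertext is not NUL-terminated, so never measure it with strlen):
// Wrap the raw ciphertext bytes, using the length reported by aes_encrypt
NSData *cipherData = [NSData dataWithBytes:ciphertext length:textLen];

// Base64 yields a plain-ASCII NSString that is safe to store or transmit
NSString *cipherString = [cipherData base64EncodedStringWithOptions:0];

// Before decrypting later, recover the raw bytes again
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:cipherString options:0];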
At the moment I'm having an issue producing a valid signature for a given Google Maps Web Services API query.
The documentation stipulates that the signature must be produced with a modified base-64 HMAC-SHA1 hash, on the path and query part of the URL.
However using this function, and testing it with this tool shows that it isn't working correctly.
+ (NSString *) hmac:(NSString *)data withKey:(NSString *)key{
const char *cKey = [key cStringUsingEncoding:NSUTF8StringEncoding];
const char *cData = [data cStringUsingEncoding:NSUTF8StringEncoding];
unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
NSString *hash = [HMAC base64EncodedStringWithOptions:0];
hash = [hash stringByReplacingOccurrencesOfString:@"+" withString:@"-"];
hash = [hash stringByReplacingOccurrencesOfString:@"/" withString:@"_"];
return hash;
}
I am calling this function where data has been percent-encoded:
[urlString stringByAddingPercentEscapesUsingEncoding:NSASCIIStringEncoding];
Where am I going wrong? Any help greatly appreciated.
You can use this for your url signing : https://github.com/youssman/GMUrlSigner
I am interested in your feedback and comments ;-)
Google has sample Objective-C code for this. Is there any particular reason you are not using it, along with some of Google's helper functions, in your code? In particular, the key (cKey) is not plain 7-bit ASCII (as coded in your sample) but URL-safe Base64 (rfc4648Base64WebsafeStringEncoding). For the unfamiliar, Base64 was typically used in MIME mail years ago, before HTML mail, to let attachments pass through plain-text email. The problem is that Google expects your key to be Base64, so it must be Base64-decoded before signing; if you treat it as plain ASCII instead, the key bytes are wrong (a short Base64 string such as abcedfg decodes to completely different bytes than its ASCII characters), so Google does not end up with the correct signing key for your URL.
There are other differences, but I guess most of them are just formatting.
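If the key handling is indeed the problem, here is a hedged sketch of decoding the URL-safe Base64 signing key before handing it to CCHmac (the key literal is a placeholder, and data is the percent-encoded path-and-query string from the question):
#import <CommonCrypto/CommonHMAC.h>

// Google's signing key is URL-safe Base64; map it back to standard Base64 first
NSString *webSafeKey = @"your-url-safe-base64-key"; // placeholder
NSString *standardKey = [[webSafeKey stringByReplacingOccurrencesOfString:@"-" withString:@"+"]
                                     stringByReplacingOccurrencesOfString:@"_" withString:@"/"];
NSData *keyData = [[NSData alloc] initWithBase64EncodedString:standardKey options:0];

// Sign with the decoded raw key bytes, not the UTF-8 text of the key
const char *cData = [data cStringUsingEncoding:NSUTF8StringEncoding];
unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA1, [keyData bytes], [keyData length], cData, strlen(cData), cHMAC);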
Some background... I am writing code that interacts with JavaScript via an ObjC-JS bridge utilizing UIWebView's stringByEvaluatingJavaScriptFromString:. The idea is that the "brains" of the app live in JS, which tells Objective-C how to behave. There are multiple benefits to this, like reduced binary size, flexible updates, etc. However, there is a case where there is an Objective-C-only object that the JS needs to hold a reference to (JS instructs ObjC when to use/remove the object). This is being done by placing the native object in a dictionary under a unique identifier which can be passed as a string to JS (over the bridge). My problem is coming up with a nice identifier for said native Objective-C object.
Thus, I am trying to convert a reference to an object to a string with no luck. This is what I have:
// anObject is a custom class
NSValue *handle = [NSValue valueWithPointer:(__bridge const void *)anObject];
NSData *data = [NSData dataWithValue:handle];
NSString *stringHandle = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
The dataWithValue: function (taken from this SO post):
+ (NSData *)dataWithValue:(NSValue *)value {
NSUInteger size;
const char* encoding = [value objCType];
NSGetSizeAndAlignment(encoding, &size, NULL);
void* ptr = malloc(size);
[value getValue:ptr];
NSData* data = [NSData dataWithBytes:ptr length:size];
free(ptr);
return data;
}
Walking through it in the debugger shows me a nil value for stringHandle:
What am I doing wrong?
What you're doing wrong is trying to treat an address as if it's a UTF-8 encoded string. An address -- or any other chunk of arbitrary data -- isn't very likely to be valid UTF-8 data. (If by chance it were, it still wouldn't be the string you expect.)
If you're trying to get a string containing the pointer value, i.e., the address of the original object, that's just [NSString stringWithFormat:@"%p", anObject];
If you really need to do it from the NSValue, then replace anObject with [theValue pointerValue].
If you want to pretty-print arbitrary data, see How to convert an NSData into an NSString Hex string?
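For the bridge scenario described in the question, a small sketch of how that string handle might be used on the native side (the dictionary name here is purely illustrative):
// Registry that maps string handles to the native objects they stand for
NSMutableDictionary *nativeObjects = [NSMutableDictionary dictionary];

// Create the handle and register the object before passing the handle to JS
NSString *handle = [NSString stringWithFormat:@"%p", anObject];
nativeObjects[handle] = anObject;

// Later, when JS sends the handle back over the bridge
id target = nativeObjects[handle];          // look the object up again
[nativeObjects removeObjectForKey:handle];  // when JS says it is done with it
Keeping the object in the dictionary also retains it, so the address-based handle stays valid until it is removed.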
You can get a string representation by calling the NSObject method description. You can override the description method in a subclass if you need to.
An NSValue of a pointer will be an object holding just the bytes of the pointer itself (4 bytes for a 32-bit pointer, 8 for a 64-bit one). It will not hold any of the data pointed to in RAM.
I am working on a bluetooth iOS project and have managed to get some data from the bluetooth device.
However, I am struggling to convert this data into something useful, such as an NSString. Whenever I try to NSLog the NSString that was converted from the NSData received, it is a bunch of gibberish. The output is:
ēဥ၆䄀
The bluetooth device is a heart monitor from a manufacturer in Asia, and they have provided a protocol reference on how to make calls to the device. This is one thing they mention in the protocol reference:
The PC sends 16-byte packets to the device, then the device sends back 16-byte packets. Except for some special commands, all others can use this communication mode.
Can anyone tell me what I am doing wrong? I have tried everything I know, including every single encoding in the apple docs as well as both initWithData and initWithBytes. Thanks!
-(void)peripheral:(CBPeripheral *)peripheral didUpdateValueForCharacteristic:(CBCharacteristic *)characteristic
error:(NSError *)error {
if (error)
{
NSLog(#"erorr in read is %#", error.description);
return;
}
NSData *data= characteristic.value;
NSString *myString = [[NSString alloc] initWithBytes:[data bytes] length:[data length] encoding:NSUTF16StringEncoding];
NSLog(#"Value from device is %#", myString); //OUTPUT IS ēဥ၆䄀
}
What you have here is a string of raw data that can't be directly converted into a human readable string - unless you consider hex-representation to be human readable :)
To make sense of this data you need to either have a protocol specification at hand or prepare for hours (sometimes days) of reverse-engineering.
This byte-sequence can be composed of multiple values formatted in standard (float IEEE 754, uint8_t, uint16_t...) or even proprietary formats.
One important thing to consider when communicating with the outside world is also endianness (i.e., does the most significant byte of a multi-byte value come first or last).
There are many ways to manipulate this data. To get the raw array of bytes you could do:
NSData *rxData = ...
uint8_t *bytes = (uint8_t *)[rxData bytes];
And then if (for example) the first byte tells you what type of payload the data holds, you can switch on it:
switch (bytes[0])
{
case 0x00:
//first byte 0x00: do the parsing
break;
case 0x01:
//first byte 0x01: do the parsing
break;
// ...
default:
break;
}
Here would be an example of parsing data that consists of:
byte 0: byte holding some bit-coded flags
bytes 1,2,3,4: 32-bit float
bytes 5,6: uint16_t
bool bitFlag0;
bool bitFlag1;
bool bitFlag2;
bool bitFlag3;
uint8_t firstByte;
float theFloat;
uint16_t theInteger;
NSData *rxData = ...
uint8_t *bytes = (uint8_t *)[rxData bytes];
// getting the flags
firstByte = bytes[0];
bitFlag0 = firstByte & 0x01;
bitFlag1 = firstByte & 0x02;
bitFlag2 = firstByte & 0x04;
bitFlag3 = firstByte & 0x08;
//getting the float
[[rxData subdataWithRange:NSMakeRange(1, 4)] getBytes:&theFloat length:sizeof(float)];
NSLog (#"the float is &.2f",theFloat);
//getting the unsigned integer
[[data subdataWithRange:NSMakeRange(6, 2)] getBytes:&theInteger length:sizeof(uint16_t)];
NSLog (#"the integer is %u",theInteger);
One note: depending on the endianness you might need to reverse the 4 float bytes or the 2 uint16_t bytes before converting them. Converting these byte arrays can also be done with unions.
typedef union
{
    uint8_t b[4];
    float f;
} bytesToFloat;
and then:
bytesToFloat conv;
//float would be written on bytes b1b2b3b4 in protocol
conv.b[0] = bytes[1]; //or bytes[4] .. endianness!
conv.b[1] = bytes[2]; //or bytes[3] .. endianness!
conv.b[2] = bytes[3]; //or bytes[2] .. endianness!
conv.b[3] = bytes[4]; //or bytes[1] .. endianness!
theFloat = conv.f;
If for example you know that byte6 and byte7 represent an uint16_t value you can calculate it from raw bytes:
value = (uint16_t)((bytes[6] << 8) + bytes[7]);
or (again - endianness):
value = (uint16_t)((bytes[7] << 8) + bytes[6]);
One more note: simply using sizeof(float) is a bit risky, since type sizes on your platform are not guaranteed to match the field widths the protocol specifies; prefer explicitly sized types and verify the lengths you read.
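If you want to handle the byte order explicitly rather than hand-reversing arrays, here is a minimal sketch using the system byte-swapping helpers (this assumes the device sends its multi-byte fields big-endian; use the Little variants if the protocol reference says otherwise):
#import <Foundation/Foundation.h>

NSData *rxData = ...
const uint8_t *bytes = [rxData bytes];

// 32-bit float stored big-endian at bytes 1..4
CFSwappedFloat32 swappedFloat;
memcpy(&swappedFloat, bytes + 1, sizeof(swappedFloat));
float theFloat = CFConvertFloat32SwappedToHost(swappedFloat);

// uint16_t stored big-endian at bytes 5..6
uint16_t rawInt;
memcpy(&rawInt, bytes + 5, sizeof(rawInt));
uint16_t theInteger = CFSwapInt16BigToHost(rawInt);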