byteArray to Hex NSString - adds some wrong hex content - ios

I am trying to convert a byte array to a hex NSString.
Here is the solution I referred to for converting it into a hex NSString. But I discovered it adds ffffffffffffff. How can I get the correct hex NSString?
Best way to serialize an NSData into a hexadecimal string
const char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };
NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
The output string:
12233445566778ffffffffffffff8912233445566778ffffffffffffff89

Don't use unsigned long for each of your bytes. And what's the point of myByteData if you don't use it?
And since you are not really using char, use uint8_t.
Try this:
const uint8_t myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };
size_t len = sizeof(myByteArray) / sizeof(uint8_t);
NSMutableString *myHexString = [NSMutableString stringWithCapacity:len * 2];
for (size_t i = 0; i < len; i++) {
    [myHexString appendFormat:@"%02x", (int)myByteArray[i]];
}
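For example, logging the result should now produce the expected string with no ff padding:
NSLog(@"%@", myHexString); // 12233445566778891223344556677889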

Your initial byte data is char rather than unsigned char. This means that any value greater than 127 (0x7f) is treated as a two's-complement negative number and sign-extended, giving ffffffffffffff89.
If you change your data to be unsigned char you will get the desired result.
const unsigned char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };
NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
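Alternatively, if the array has to stay plain char, a minimal sketch of the same fix is to cast each byte to unsigned char (or mask with 0xFF) at the point of formatting, so no sign extension reaches the format string:

for (int i = 0; i < myByteData.length; i++) {
    // Casting to unsigned char keeps each value in the 0-255 range.
    [myHexString appendFormat:@"%02x", (unsigned char)myByteArray[i]];
}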

Related

Binary hash representation to HEX/Ascii in Objective-c

I would like to log a binary hash representation in the console, using a hex or ASCII representation. The algorithm is MD5, so the function is CC_MD5.
I get the binary hash representation via a Theos tweak, which is working well.
EDIT: this tweak intercepts the CC_MD5 call. The call is implemented in the method described below. When CC_MD5 is called, replaced_CC_MD5 intercepts the call.
The app under test is a simple app I made myself, and it uses this method to calculate the MD5 hash:
- (NSString *)md5:(NSString *)input
{
    const char *cStr = [input UTF8String];
    unsigned char digest[16];
    CC_MD5(cStr, strlen(cStr), digest); // This is the md5 call
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", digest[i]];
    return output;
}
The hashing is OK, and the app returns the correct hash for the input:
input = prova
MD5 Digest = 189bbbb00c5f1fb7fba9ad9285f193d1
The function in my Theos tweak where I manipulate the CC_MD5 function is below.
EDIT: here data would be cStr, len would be strlen(cStr), and md would be digest.
static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
    CC_LONG dataLength = (size_t)len;
    NSLog(@"==== START CC_MD5 HOOK ====");
    // hex of digest
    NSData *dataDigest = [NSData dataWithBytes:(const void *)md length:(NSUInteger)CC_MD5_DIGEST_LENGTH];
    NSLog(@"%@", dataDigest);
    // hex of string
    NSData *dataString = [NSData dataWithBytes:(const void *)data length:(NSUInteger)dataLength];
    NSLog(@"%@", dataString);
    NSLog(@"==== END CC_MD5 HOOK ====");
    return original_CC_MD5(data, len, md);
}
The log of dataString is OK: 70726f76 61, which is the hex representation of prova.
The log of dataDigest is e9aa0800 01000000 b8c00800 01000000 which is, if I understood correctly, the binary hash representation.
How can I convert this representation to get the MD5 hash digest?
In replaced_CC_MD5 you are displaying md before the call to original_CC_MD5 which sets its value. What you are seeing is therefore random data (or whatever was last stored in md).
Move the call to original_CC_MD5 to before the display statement and you should see the value you expect. (You'll of course need to save the result of the call in a local so you can return the value in the return statement.)
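A minimal sketch of the reordered hook (assuming original_CC_MD5 is the saved pointer to the original function from your tweak, as in your code above):

static unsigned char * replaced_CC_MD5(const void *data, CC_LONG len, unsigned char *md) {
    // Run the original first so that md actually contains the digest.
    unsigned char *result = original_CC_MD5(data, len, md);
    NSLog(@"==== START CC_MD5 HOOK ====");
    // hex of digest (now filled in)
    NSLog(@"%@", [NSData dataWithBytes:md length:CC_MD5_DIGEST_LENGTH]);
    // hex of input data
    NSLog(@"%@", [NSData dataWithBytes:data length:(NSUInteger)len]);
    NSLog(@"==== END CC_MD5 HOOK ====");
    return result;
}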

Same HMAC algorithm in obj-c and swift creates different hashes

I have two methods that create a SHA1 hash from a string. Using the same input data, these algorithms create different hashes, even though they should create the same hash.
In Swift (creates 617fb90f14f2eacecc333d558237bf8bb9fc85f7):
static func sha1FromMessage(message: String) -> String {
    let cKey = RestUtils.API_KEY.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
    let output = NSMutableString(capacity: Int(CC_SHA1_DIGEST_LENGTH))
    for byte in cHMAC {
        output.appendFormat("%02hhx", byte)
    }
    return output
}
and in Obj-C (creates d80b816f0b46d5211b6d9487089597e181717ea6):
+ (NSString *)sha1FromMessage:(NSString *)message {
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i) {
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
I would like the Swift method to return the same hash as the Obj-C method. Any ideas where the problem is?
The reason is that cData created by
let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
includes the terminating NUL character of the message string, and that is counted in UInt(cData.count) as well. You could fix that by using UInt(strlen(cData)) instead, as in your Objective-C code.
But a better solution is to convert the input strings to NSData objects instead:
let cKey = RestUtils.API_KEY.dataUsingEncoding(NSASCIIStringEncoding)!
let cData = message.dataUsingEncoding(NSUTF8StringEncoding)!
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey.bytes, UInt(cKey.length), cData.bytes, UInt(cData.length), &cHMAC)
With this modification, Swift and Objective-C code produce the same message digest.

Create hash in swift using key and message

I want to create an SHA1 HMAC hash of a string using a key in Swift. In Obj-C I used this and it worked great:
+ (NSString *)sha1FromMessage:(NSString *)message {
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"%s", cData);
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i) {
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
However, now I am having a hard time translating this into Swift. This is what I have so far:
static func sha1FromMessage(message: String) {
    let cKey = RestUtils.apiKey.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    let cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
    ...
}
and this line
CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
already gives me the error "Int is not convertible to CCHmacAlgorithm". Any ideas how to translate the Obj-C code to Swift?
The last parameter of the CCHmac() function has the type UnsafeMutablePointer<Void> because that's where the result is written to. You have to declare cHMAC as a variable and pass it as an in-out expression with &. In addition, some type conversions are necessary:
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
Apple enum values are defined differently in Swift. Instead of kCCHmacAlgSHA1, it's probably defined like CCHmacAlgorithm.SHA1.
This is answered here:
https://stackoverflow.com/a/24411522/2708650

MD5 from NSData is always different

I create an MD5 of a file that is deployed with my bundle to decide if I need to import it.
My problem is that the MD5 I create is always different. Even when I call the MD5 method 10 times in a loop with the same data, I get different results.
Here is my MD5 method:
- (NSString *)hashForData:(NSData *)data
{
    unsigned char md5Buffer[CC_MD5_DIGEST_LENGTH];
    CC_MD5((__bridge const void *)(data), (CC_LONG)data.length, md5Buffer);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", md5Buffer[i]];
    return output;
}

- (NSData *)data
{
    if (!_data) {
        _data = [NSData dataWithContentsOfFile:_path];
    }
    return _data;
}
Any idea what could be wrong?
Shouldn't that be:
CC_MD5((__bridge const void*)([data bytes]), (CC_LONG)[data length], md5Buffer);
//                             ^^^^^^^^^^^^            ^^^^^^^^^^^^^
(i.e. you are calculating the MD5 hash of the NSData object (and adjacent memory) instead of the data within the NSData object).
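Put together, a corrected version of the method would look roughly like this (same code as in the question, with only the CC_MD5 call changed):

- (NSString *)hashForData:(NSData *)data
{
    unsigned char md5Buffer[CC_MD5_DIGEST_LENGTH];
    // Hash the bytes held by the NSData object, not the object pointer itself.
    CC_MD5([data bytes], (CC_LONG)[data length], md5Buffer);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", md5Buffer[i]];
    return output;
}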

Chinese character to ASCII or Hexadecimal

I'm struggling to convert Chinese words/characters to ASCII or hexadecimal, and none of the values I've got so far are what I was supposed to get.
An example of the conversion: the word 手 in hex is 1534b.
The methods I've followed so far are below, and I got a variety of results, but not the one I was looking for.
I'd really appreciate it if you could help me out with this issue.
Thanks,
Mike
- (NSString *)stringToHex:(NSString *)str {
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
    {
        [hexString appendFormat:@"%02x", chars[i]]; // EDITED PER COMMENT BELOW
    }
    free(chars);
    return hexString;
}
and
const char *cString = [#"手" cStringUsingEncoding:NSASCIIStringEncoding];
Below is similar code in Java for Android; maybe it helps:
public boolean sendText(INotifiableManager manager, String text) {
    final int codeOffset = 0xf100;
    for (char c : text.toCharArray()) {
        int code = (int)c + codeOffset;
        if (! mConnection.getBoolean(manager, "SendKey", Integer.toString(code))) {
        }
Your Java code is just doing this:
Take each 16-bit character of the string and add 0xf100 to it.
If you do the same thing in your above Objective-C code you will get the result you want.
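A minimal sketch of that change in the stringToHex: method above (using the same 0xf100 offset as the Java code; %x is used instead of %02x so the full five-digit value, e.g. 1534b for 手, is printed):

- (NSString *)stringToHex:(NSString *)str {
    const unsigned int codeOffset = 0xf100; // same offset as the Java code
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++) {
        // Add the offset to each 16-bit code unit, as the Java code does.
        [hexString appendFormat:@"%x", chars[i] + codeOffset];
    }
    free(chars);
    return hexString;
}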
