I want to create an SHA1 HMAC hash of a string using a key in Swift. In Obj-C I used this and it worked great:
+(NSString *)sha1FromMessage:(NSString *)message{
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"%s", cData);

    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i){
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
However, I am now having a hard time translating this into Swift. This is what I have so far:
static func sha1FromMessage(message: String){
    let cKey = RestUtils.apiKey.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    let cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
    ...
}
and this line
CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
already gives me the error "Int is not convertible to CCHmacAlgorithm". Any ideas how to translate the Obj-C code to Swift?
The last parameter of the CCHmac() function has the type UnsafeMutablePointer<Void> because that's where the result is written. You have to declare cHMAC as a variable and pass it as an in-out expression with &. In addition, some type conversions are necessary:
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
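For reference, a minimal sketch of the whole function in current Swift syntax, assuming a recent Xcode where CommonCrypto can be imported as a module (otherwise keep the bridging-header import); names like hmacSHA1Hex are illustrative, not from the original code:

import CommonCrypto
import Foundation

// Sketch only: key and message are passed in, substitute your API key when calling.
// UTF-8 is used for both inputs here; the original question used ASCII for the key.
func hmacSHA1Hex(message: String, key: String) -> String {
    let keyBytes = Array(key.utf8)
    let messageBytes = Array(message.utf8)
    var digest = [UInt8](repeating: 0, count: Int(CC_SHA1_DIGEST_LENGTH))
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1),
           keyBytes, keyBytes.count,
           messageBytes, messageBytes.count,
           &digest)
    return digest.map { String(format: "%02hhx", $0) }.joined()
}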
Apple enum values are defined differently in Swift. Instead of kCCHmacAlgSHA1, it is probably exposed as something like CCHmacAlgorithm.SHA1.
This is answered here:
https://stackoverflow.com/a/24411522/2708650
Related
sipUrl and regUrl are NSStrings which I get from a UITextField:
const char *sipAddr = [sipUrl cStringUsingEncoding:NSUTF8StringEncoding];
const char *regAddr = [regUrl cStringUsingEncoding:NSUTF8StringEncoding];
const char *uname = [orgUsername cStringUsingEncoding:NSUTF8StringEncoding];
const char *pword = [password cStringUsingEncoding:NSUTF8StringEncoding];
//const char *pword = [password UTF8String];
const char *realm = [realmStr cStringUsingEncoding:NSUTF8StringEncoding];
//char *sipAddrArr=strdup(sipAddr);
//char *regAddrArr=strdup(regAddr);
//char *unamearr=strdup(uname);
//char *pwordarrnew = strdup(pword);
char sipAddrArr[[sipUrl length]];
strcpy(sipAddrArr,sipAddr);
char regAddrArr[[regUrl length]];
strcpy(regAddrArr,regAddr);
char unamearr[[orgUsername length]] ;
strcpy(unamearr, uname);
char pwordarrnew[[password length]];
strcpy(pwordarrnew,pword);
char realmchr[[realmStr length]];
strcpy(realmchr,realm);
char pwordarr1[] = {"10000"};
printf("\nPassword value sent: %s",pword);
printf("\nPassword value sent: %s",pwordarr1);
pword has a value, but when I copy the constant char into a char array with strcpy, it returns no value for pword. The same code works fine when testing in the simulator but doesn't work when connected to an iOS phone.
This may or may not be the cause of your problem, but it is an error in your code:
You are taking an NSString, converting it into a UTF-8 C string, and then copying that C string into a C array sized according to the length of the NSString. This is wrong: the NSString and its UTF-8 translation can have different lengths. You need to determine the length of the UTF-8 string and allow for the end-of-string marker. For example:
char sipAddrArr[strlen(sipAddr)+1];
strcpy(sipAddrArr,sipAddr);
and similarly for the other arrays.
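The underlying mismatch is that an NSString's length (its UTF-16 code unit count) is not its UTF-8 byte count. A small illustration, written in Swift only to show the difference and not part of the original answer:

let name = "café" as NSString
print(name.length)                   // 4 UTF-16 code units
print((name as String).utf8.count)   // 5 UTF-8 bytes, and strcpy needs one more for the NUL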
HTH
I tried to generate SHA256 in iOS using the Arcane library with the following data:
String: Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS
Key: 71DD0F73AFFBB47825FF9864DDE95F3B
Result was 409dc622b3bef5c9fc46e45c3210111fcb4536d3a55833316fe0dc8154b3ea34
which I thought to be correct. However, the Windows counterpart generates the SHA256 using the following code:
Windows Phone Source Code:
public static string HmacSha256(string secretKey, string value)
{
    var msg = CryptographicBuffer.ConvertStringToBinary(value, BinaryStringEncoding.Utf8);
    byte[] convertedHash = new byte[secretKey.Length / 2];
    for (int i = 0; i < secretKey.Length / 2; i++)
    {
        convertedHash[i] = (byte)Int32.Parse(secretKey.Substring(i * 2, 2), System.Globalization.NumberStyles.HexNumber);
    }
    // Create HMAC.
    var objMacProv = MacAlgorithmProvider.OpenAlgorithm(MacAlgorithmNames.HmacSha256);
    CryptographicHash hash = objMacProv.CreateHash(convertedHash.AsBuffer());
    hash.Append(msg);
    return CryptographicBuffer.EncodeToHexString(hash.GetValueAndReset());
}
and the result is 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73, which is different from what I got. Can somebody help me with this and let me know what the above code is doing and how I can replicate it in iOS?
Edit:
iOS Source code
let key = self.md5(string: "71DD0F73AFFBB47825FF9864DDE95F3B")
let hash = HMAC.SHA256(str, key: key)
The key here is that you need to convert your secret, which is a hex string, into NSData. In other words, the NSData byte stream should "look" like the secret.
This should do what you want:
// Hex string to NSData conversion from here http://stackoverflow.com/questions/7317860/converting-hex-nsstring-to-nsdata
NSString *secret = @"71DD0F73AFFBB47825FF9864DDE95F3B";
NSData *dataIn = [@"Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS" dataUsingEncoding:NSUTF8StringEncoding];
NSMutableData *macOut = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];

secret = [secret stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *secretData = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i = 0; i < [secret length] / 2; i++) {
    byte_chars[0] = [secret characterAtIndex:i*2];
    byte_chars[1] = [secret characterAtIndex:i*2+1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [secretData appendBytes:&whole_byte length:1];
}

CCHmac(kCCHmacAlgSHA256, secretData.bytes, secretData.length, dataIn.bytes, dataIn.length, macOut.mutableBytes);

NSMutableString *stringOut = [NSMutableString stringWithCapacity:macOut.length];
const unsigned char *macOutBytes = macOut.bytes;
for (NSInteger i = 0; i < macOut.length; ++i) {
    [stringOut appendFormat:@"%02x", macOutBytes[i]];
}

NSLog(@"dataIn: %@", dataIn);
NSLog(@"macOut: %@", macOut);
NSLog(@"stringOut: %@", stringOut);
Output:
2016-09-27 20:18:54.181 JKS[27562:5321334] dataIn: <416d6f75 6e743d35 30264269 6c6c6572 49443d35 39264368 616e6e65 6c49443d 3226436f 6e746578 743d3334 7c636865 636b7c74 65737426 52657475 726e5552 4c3d6874 7470733a 2f2f7561 742e6d79 6661746f 6f72612e 636f6d2f 52656365 69707450 4f432e61 73707826 54786e52 65664e75 6d3d3030 30303030 30303030 32303030 33265573 65724e61 6d653d44 4353>
2016-09-27 20:18:54.181 JKS[27562:5321334] macOut: <94a20ca3 9c8487c7 763823ec 9c918d9e 38ae83cb 741439f6 d129bcde f9edba73>
2016-09-27 20:18:54.181 JKS[27562:5321334] stringOut: 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73
Updated with Swift (code should be cleaned up)
// http://stackoverflow.com/questions/29799361/generate-a-hmac-swift-sdk8-3-using-cchmac
func generateHMAC(key: String, data: String) -> String {
    let keyData = key.dataFromHexadecimalString()! as NSData
    let dataIn = data.data(using: .utf8)! as NSData

    var result: [CUnsignedChar]
    result = Array(repeating: 0, count: Int(CC_SHA256_DIGEST_LENGTH))

    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA256), keyData.bytes, keyData.length, dataIn.bytes, dataIn.length, &result)

    let hash = NSMutableString()
    for val in result {
        hash.appendFormat("%02hhx", val)
    }

    return hash as String
}
You can use this extension to convert the hex string to Data
// Modified slightly http://stackoverflow.com/questions/26501276/converting-hex-string-to-nsdata-in-swift
extension String {
    func dataFromHexadecimalString() -> Data? {
        var data = Data(capacity: characters.count / 2)

        let regex = try! NSRegularExpression(pattern: "[0-9a-f]{1,2}", options: .caseInsensitive)
        regex.enumerateMatches(in: self, options: [], range: NSMakeRange(0, characters.count)) { match, flags, stop in
            let byteString = (self as NSString).substring(with: match!.range)
            var num = UInt8(byteString, radix: 16)
            data.append(&num!, count: 1)
        }

        return data
    }
}
And to use it, do something like:
let secret = "71DD0F73AFFBB47825FF9864DDE95F3B"
let value = "Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS"
print("\(generateHMAC(key: secret, data: value))")
Your output should be 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73
You will need #import <CommonCrypto/CommonCrypto.h> in your bridging header.
The Windows code takes the string, interprets it as a hexadecimal number, and converts two characters at a time into one byte.
Your Mac code most likely takes the string as it is. Since the key starts with "71", your Windows code takes that as a single byte with value 0x71 = 113, while your Mac code takes it as two bytes with values '7' = 55 and '1' = 49.
All you need to do is convert the bytes on the Mac exactly as you do it on Windows. You might have to do the unthinkable and look at the source code of the Mac library to see how it does the actual hash calculation.
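A hedged sketch of that hex-decoding step in Swift (names are illustrative and not from the original answer):

let secretHex = "71DD0F73AFFBB47825FF9864DDE95F3B"
let keyBytes: [UInt8] = stride(from: 0, to: secretHex.count, by: 2).compactMap { offset in
    let start = secretHex.index(secretHex.startIndex, offsetBy: offset)
    let end = secretHex.index(start, offsetBy: 2)
    return UInt8(secretHex[start..<end], radix: 16)
}
// keyBytes[0] == 0x71 (113), not the ASCII codes of "7" and "1"

Feeding these 16 bytes to the HMAC as the key, rather than the 32-character string, should reproduce the Windows result.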
#import <CommonCrypto/CommonHMAC.h>
+ (NSString *)hmacSHA256EncryptString{
    NSString *parameterSecret = @"input secret key";
    NSString *plainString = @"input encrypt content string";
    const char *secretKey = [parameterSecret cStringUsingEncoding:NSUTF8StringEncoding];
    const char *plainData = [plainString cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, secretKey, strlen(secretKey), plainData, strlen(plainData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *bufferChar = (const unsigned char *)[HMACData bytes];
    NSMutableString *hmacString = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i){
        [hmacString appendFormat:@"%02x", bufferChar[i]];
    }
    return hmacString;
}
This is my code. Every time, items returns null. I have tried it in Swift and Objective-C, but nothing works.
let certName : String = "private_key" // name of the certificate
//get p12 file path
let resourcePath: String = NSBundle.mainBundle().pathForResource(certName, ofType: "p12")!
let p12Data: NSData = NSData(contentsOfFile: resourcePath)!
//create key dictionary for reading p12 file
let key : NSString = kSecImportExportPassphrase as NSString
let options : NSDictionary = [key : "password"]
//create variable for holding security information
var privateKeyRef: SecKeyRef? = nil
var items : CFArray?
let securityError: OSStatus = SecPKCS12Import(p12Data, options, &items)
print(items)
I'm using PKCS#8. You need to include:
#include <CommonCrypto/CommonDigest.h>
#include <openssl/engine.h>
OpenSSL can be downloaded from GitHub.
+ (NSString*) getSignatureData:(NSString*) signableData {
    // get private key path
    NSString* path = [[NSBundle mainBundle] pathForResource:@"private"
                                                     ofType:@"der"];
    NSData* datasss = [signableData dataUsingEncoding:NSNonLossyASCIIStringEncoding];
    char* text = (char*) [signableData UTF8String];
    unsigned char *data;
    data = (unsigned char *) text;

    // creates a new file BIO; the meaning of mode is the same as for the stdio function fopen()
    BIO *in = BIO_new_file([path cStringUsingEncoding:NSUTF8StringEncoding], "rb");
    // PKCS#8 private key info structure
    PKCS8_PRIV_KEY_INFO *p8inf = d2i_PKCS8_PRIV_KEY_INFO_bio(in, NULL);
    EVP_PKEY *pkey = EVP_PKCS82PKEY(p8inf);
    PKCS8_PRIV_KEY_INFO_free(p8inf);
    BIO_free(in);

    uint8_t * cipherBuffer = NULL;
    // Calculate the buffer sizes.
    unsigned int cipherBufferSize = RSA_size(pkey->pkey.rsa);
    unsigned int signatureLength;
    // Allocate some buffer space.
    cipherBuffer = malloc(cipherBufferSize);
    memset((void *)cipherBuffer, 0x0, cipherBufferSize);

    unsigned char hashedChars[32];
    // returns a pointer to the hash value
    unsigned char *openSSLHash = CC_SHA256(datasss.bytes, (CC_LONG)signableData.length, hashedChars);
    //unsigned char *openSSLHash1 = SHA256(data, signableData.length, NULL);

    /*
     * The following call signs an X509_SIG ASN1 object inside
     * PKCS#8 padded RSA encryption
     */
    RSA_sign(NID_sha256, openSSLHash, SHA256_DIGEST_LENGTH, cipherBuffer, &signatureLength, pkey->pkey.rsa);

    NSData *signedData = [NSData dataWithBytes:(const void*)cipherBuffer length:signatureLength];
    EVP_PKEY_free(pkey);
    NSString *base64String = [signedData base64EncodedStringWithOptions:0];
    return base64String;
}
I am trying to convert the byte array to a hex NSString. Here is the solution that I referred to for converting it into a hex NSString. But I discovered it adds ffffffffffffff. How can I get the correct hex NSString?
Best way to serialize an NSData into a hexadecimal string
const char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length*2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
The output String
12233445566778ffffffffffffff8912233445566778ffffffffffffff89
Don't use unsigned long for each of your bytes. And what's the point of myByteData if you don't use it?
And since you are not really using char, use uint8_t.
Try this:
const uint8_t myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

size_t len = sizeof(myByteArray) / sizeof(uint8_t);
NSMutableString *myHexString = [NSMutableString stringWithCapacity:len * 2];
for (size_t i = 0; i < len; i++) {
    [myHexString appendFormat:@"%02x", (int)myByteArray[i]];
}
Your initial byte data is char rather than unsigned char. This means that any value > 127 (0x7f) will be seen as a two's complement negative number, giving ffffffffffffff89.
If you change your data to be unsigned char you will get the desired result.
const unsigned char myByteArray[] = {
    0x12,0x23,0x34,0x45,0x56,0x67,0x78,0x89,
    0x12,0x23,0x34,0x45,
    0x56,0x67,0x78,0x89 };

NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length*2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
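If it helps, the sign extension described above can be reproduced in a few lines of Swift (an illustration only, not part of the original answer):

let signedByte = Int8(bitPattern: 0x89)                  // -119 when read as a signed char
let widened = UInt64(bitPattern: Int64(signedByte))      // sign-extended to 64 bits
print(String(widened, radix: 16))                        // ffffffffffffff89
print(String(UInt8(bitPattern: signedByte), radix: 16))  // 89, as expected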
I have two methods that create an SHA1 hash from a string. Using the same input data these algorithms create different hashes; however, they should create the same hash.
In Swift (creates 617fb90f14f2eacecc333d558237bf8bb9fc85f7):
static func sha1FromMessage(message: String) -> String {
    let cKey = RestUtils.API_KEY.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
    let output = NSMutableString(capacity: Int(CC_SHA1_DIGEST_LENGTH))
    for byte in cHMAC {
        output.appendFormat("%02hhx", byte)
    }
    return output
}
and in Obj-C (creates d80b816f0b46d5211b6d9487089597e181717ea6):
+(NSString *)sha1FromMessage:(NSString *)message{
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i){
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
I would like the swift method to return the same hash as the obj-c method. Any ideas where the problem is?
The reason is that cData created by
let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
includes the terminating NUL-character of the message string and that is counted in
UInt(cData.count) as well. You could fix that by using UInt(strlen(cData)) instead,
as in your Objective-C code.
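In current Swift syntax the extra byte is easy to see (a small illustration, not from the original answer):

let message = "hello"
let cData = message.cString(using: .utf8)!  // [104, 101, 108, 108, 111, 0]
print(cData.count)                          // 6, includes the trailing NUL
print(message.utf8.count)                   // 5, the length CCHmac should actually be given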
But a better solution is to convert the input strings
to NSData objects instead:
let cKey = RestUtils.API_KEY.dataUsingEncoding(NSASCIIStringEncoding)!
let cData = message.dataUsingEncoding(NSUTF8StringEncoding)!
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey.bytes, UInt(cKey.length), cData.bytes, UInt(cData.length), &cHMAC)
With this modification, Swift and Objective-C code produce the same message digest.