unsigned char in Swift - ios

In Obj-C this code is used to convert an NSData to unsigned char:
unsigned char *dataToSentToPrinter = (unsigned char *)malloc(commandSize);
In Swift, unsigned char is supposedly called CUnsignedChar, but how do I convert an NSData object to CUnsignedChar in Swift?

This could be what you are looking for:
let commandsToPrint: NSData = ...
// Create char array with the required size:
var dataToSentToPrinter = [CUnsignedChar](count: commandsToPrint.length, repeatedValue: 0)
// Fill with bytes from NSData object:
commandsToPrint.getBytes(&dataToSentToPrinter, length: commandsToPrint.length)
Update: Actually you don't need to copy the data at all (neither in the
Objective-C code nor in Swift). It is sufficient to have a pointer to the data.
So your code could look like this (compare
Error ("'()' is not identical to 'UInt8'") writing NSData bytes to NSOutputStream using the write function in Swift
for a similar problem):
let dataToSentToPrinter = UnsafePointer<CUnsignedChar>(commandsToPrint.bytes)
let commandSize = commandsToPrint.length
var totalAmountWritten = 0
while totalAmountWritten < commandSize {
    let remaining = commandSize - totalAmountWritten
    let amountWritten = starPort.writePort(dataToSentToPrinter, totalAmountWritten, remaining)
    totalAmountWritten += amountWritten
    // ...
}
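For reference, here is a minimal sketch of the same no-copy idea in modern Swift (Swift 5), assuming the commands now arrive as a Data value; the writePort call in the comment stands in for the printer API above:
import Foundation

let commandsToPrint = Data([0x1B, 0x40])   // hypothetical printer commands
commandsToPrint.withUnsafeBytes { (buffer: UnsafeRawBufferPointer) in
    let bytes = buffer.bindMemory(to: CUnsignedChar.self)
    // e.g. starPort.writePort(bytes.baseAddress!, 0, bytes.count)
    print(bytes.count)
}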

Related

iOS a Very Amazing (malloc_error_break)

First, here is my code:
#pragma pack (4)
typedef struct _Login {
    char user[32];
    char pwd[32];
    int userID;
} Login, *PLogin;

const unsigned long MSG_TAG_HEADER_YXHY = 0x59485859;

#pragma pack (2)
typedef struct tagTcpPacketHeader
{
    int ulHtag;
    char ucVersion;
    char ucCmd;
    int ulUserId;
    short usPacketNum;
    int ulDataLen;
} TcpPacketHeader, *LPTcpPacketHeader;
#pragma pack ()

const unsigned int TCP_HEADER_PACKET_LEN = sizeof(TcpPacketHeader);
- (NSData *)sendDataFileWithUserId:(const int)nUserId nCmd:(const int)nCmd pData:(NSData *)data {
    NSData *sendData;
    void *sendObj = malloc(data.length);
    [data getBytes:sendObj length:data.length];
    static int nPacketNum = 0;
    int nLen = (int)data.length + TCP_HEADER_PACKET_LEN;
    char *pTmpBuf = malloc(nLen);
    LPTcpPacketHeader tcpHeader = (LPTcpPacketHeader)pTmpBuf;
    tcpHeader->ulHtag = MSG_TAG_HEADER_YXHY;
    tcpHeader->ucVersion = 1;
    tcpHeader->ucCmd = nCmd;
    tcpHeader->ulUserId = nUserId;
    tcpHeader->usPacketNum = nPacketNum;
    tcpHeader->ulDataLen = nLen;
    memcpy(tcpHeader + TCP_HEADER_PACKET_LEN, sendObj, data.length);
    sendData = [NSData dataWithBytes:pTmpBuf length:nLen];
    nPacketNum++;
    free(pTmpBuf);
    free(sendObj);
    return sendData;
}
- (NSData *)get_File_Login:(NSString *)userID {
    int length = sizeof(Login);
    Login log_in = {"123", "456", userID.intValue};
    NSData *login_data = [NSData dataWithBytes:&log_in length:length];
    NSData *ret = [self sendDataFileWithUserId:log_in.userID nCmd:5 pData:login_data];
    return ret;
}
Use
NSData *ms = [self get_File_Login:@"123"];
NSLog(@"%@", ms);
After frequent use, a problem can appear.
Question
This issue gives me a real headache: why does "set a breakpoint in malloc_error_break to debug" appear?
I have added a breakpoint on malloc_error_break, but it doesn't fire.
Who can tell me the answer?
When you use the pointer in memcpy this way
memcpy(tcpHeader + TCP_HEADER_PACKET_LEN, sendObj, data.length);
it means you copy into the memory location pointed to by tcpHeader plus TCP_HEADER_PACKET_LEN times the size of the type the pointer points to. It is the same as writing &tcpHeader[TCP_HEADER_PACKET_LEN].
Assuming you want to write to a location right after the header there are two ways to fix it:
1) Use a pointer whose pointee has size 1, meaning a char *. In your code you already have such a pointer, pTmpBuf, so just change the code to:
memcpy(pTmpBuf + TCP_HEADER_PACKET_LEN, sendObj, data.length);
2) Add 1 instead. Since the type tcpHeader points to has exactly the size TCP_HEADER_PACKET_LEN, adding one advances the pointer by that many bytes and gives the correct location:
memcpy(tcpHeader + 1, sendObj, data.length);
I would recommend the first, since it is clear what you are calculating. In the second it is unclear why you would add one, and you are using a pointer to one type to copy data that isn't of that type.
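Not the poster's code, but to make the scaling rule concrete, here is a minimal Swift sketch; typed pointers in Swift follow the same rule, and the Header struct here is a hypothetical stand-in for TcpPacketHeader:
import Foundation

struct Header { var a: Int32; var b: Int32 }   // hypothetical 8-byte header

let raw = UnsafeMutableRawPointer.allocate(byteCount: 64, alignment: 4)
defer { raw.deallocate() }
let header = raw.bindMemory(to: Header.self, capacity: 1)

// header + 1 advances by MemoryLayout<Header>.stride bytes (8 here),
// the equivalent of the correct `tcpHeader + 1` fix above...
assert(UnsafeMutableRawPointer(header + 1) == raw + MemoryLayout<Header>.stride)
// ...whereas header + 8 ("pointer + byte count") would overshoot to byte 64.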

Generating SHA256 in iOS

I tried to generate SHA256 in iOS using the Arcane library with the following data:
String: Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS
Key: 71DD0F73AFFBB47825FF9864DDE95F3B
Result was 409dc622b3bef5c9fc46e45c3210111fcb4536d3a55833316fe0dc8154b3ea34, which I thought to be correct. However, the Windows counterpart generates the SHA256 using the following code:
Windows Phone Source Code:
public static string HmacSha256(string secretKey, string value)
{
    var msg = CryptographicBuffer.ConvertStringToBinary(value, BinaryStringEncoding.Utf8);
    byte[] convertedHash = new byte[secretKey.Length / 2];
    for (int i = 0; i < secretKey.Length / 2; i++)
    {
        convertedHash[i] = (byte)Int32.Parse(secretKey.Substring(i * 2, 2), System.Globalization.NumberStyles.HexNumber);
    }
    // Create HMAC.
    var objMacProv = MacAlgorithmProvider.OpenAlgorithm(MacAlgorithmNames.HmacSha256);
    CryptographicHash hash = objMacProv.CreateHash(convertedHash.AsBuffer());
    hash.Append(msg);
    return CryptographicBuffer.EncodeToHexString(hash.GetValueAndReset());
}
and the result is 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73, which is different from what I got. Can somebody help me with this and let me know what the above code is doing and how I can replicate it in iOS?
Edit:
iOS Source code
let key = self.md5(string: "71DD0F73AFFBB47825FF9864DDE95F3B")
let hash = HMAC.SHA256(str, key: key)
The key here is that you need to convert your secret, which is a hex string, into NSData. In other words, the NSData byte stream should "look" like the secret.
This should do what you want:
// Hex string to NSData conversion from here http://stackoverflow.com/questions/7317860/converting-hex-nsstring-to-nsdata
NSString *secret = @"71DD0F73AFFBB47825FF9864DDE95F3B";
NSData *dataIn = [@"Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS" dataUsingEncoding:NSUTF8StringEncoding];
NSMutableData *macOut = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];
secret = [secret stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *secretData = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0', '\0', '\0'};
int i;
for (i = 0; i < [secret length] / 2; i++) {
    byte_chars[0] = [secret characterAtIndex:i * 2];
    byte_chars[1] = [secret characterAtIndex:i * 2 + 1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [secretData appendBytes:&whole_byte length:1];
}
CCHmac(kCCHmacAlgSHA256, secretData.bytes, secretData.length, dataIn.bytes, dataIn.length, macOut.mutableBytes);
NSMutableString *stringOut = [NSMutableString stringWithCapacity:macOut.length];
const unsigned char *macOutBytes = macOut.bytes;
for (NSInteger i = 0; i < macOut.length; ++i) {
    [stringOut appendFormat:@"%02x", macOutBytes[i]];
}
NSLog(@"dataIn: %@", dataIn);
NSLog(@"macOut: %@", macOut);
NSLog(@"stringOut: %@", stringOut);
Output:
2016-09-27 20:18:54.181 JKS[27562:5321334] dataIn: <416d6f75 6e743d35 30264269 6c6c6572 49443d35 39264368 616e6e65 6c49443d 3226436f 6e746578 743d3334 7c636865 636b7c74 65737426 52657475 726e5552 4c3d6874 7470733a 2f2f7561 742e6d79 6661746f 6f72612e 636f6d2f 52656365 69707450 4f432e61 73707826 54786e52 65664e75 6d3d3030 30303030 30303030 32303030 33265573 65724e61 6d653d44 4353>
2016-09-27 20:18:54.181 JKS[27562:5321334] macOut: <94a20ca3 9c8487c7 763823ec 9c918d9e 38ae83cb 741439f6 d129bcde f9edba73>
2016-09-27 20:18:54.181 JKS[27562:5321334] stringOut: 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73
Updated with Swift (code should be cleaned up)
// http://stackoverflow.com/questions/29799361/generate-a-hmac-swift-sdk8-3-using-cchmac
func generateHMAC(key: String, data: String) -> String {
    let keyData = key.dataFromHexadecimalString()! as NSData
    let dataIn = data.data(using: .utf8)! as NSData
    var result = [CUnsignedChar](repeating: 0, count: Int(CC_SHA256_DIGEST_LENGTH))
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA256), keyData.bytes, keyData.length, dataIn.bytes, dataIn.length, &result)
    let hash = NSMutableString()
    for val in result {
        hash.appendFormat("%02hhx", val)
    }
    return hash as String
}
You can use this extension to convert the hex string to Data
// Modified slightly http://stackoverflow.com/questions/26501276/converting-hex-string-to-nsdata-in-swift
extension String {
    func dataFromHexadecimalString() -> Data? {
        var data = Data(capacity: characters.count / 2)
        let regex = try! NSRegularExpression(pattern: "[0-9a-f]{1,2}", options: .caseInsensitive)
        regex.enumerateMatches(in: self, options: [], range: NSMakeRange(0, characters.count)) { match, flags, stop in
            let byteString = (self as NSString).substring(with: match!.range)
            var num = UInt8(byteString, radix: 16)!
            data.append(&num, count: 1)
        }
        return data
    }
}
And to use do something like:
let secret = "71DD0F73AFFBB47825FF9864DDE95F3B"
let value = "Amount=50&BillerID=59&ChannelID=2&Context=34|check|test&ReturnURL=https://uat.myfatoora.com/ReceiptPOC.aspx&TxnRefNum=000000000020003&UserName=DCS"
print("\(generateHMAC(key: secret, data: value))")
Your output should be 94a20ca39c8487c7763823ec9c918d9e38ae83cb741439f6d129bcdef9edba73
You will need #import <CommonCrypto/CommonCrypto.h> in your bridging header.
The Windows code takes the string, interprets it as a hexadecimal number, and converts two characters at a time into one byte.
Your Mac code most likely takes the string as it is. Since the key starts with "71", your Windows code takes that as a single byte with value 0x71 = 113; your Mac code takes it as two bytes with values '7' = 55 and '1' = 49.
All you need to do is convert the bytes on the Mac exactly as you do it on Windows. You might have to do the unthinkable and look at the source code of the Mac library to see how it does the actual hash calculation.
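A quick hedged illustration of that difference in Swift (not from the original post):
let asHexByte = UInt8("71", radix: 16)!   // one byte with value 0x71 = 113
let asAsciiBytes = Array("71".utf8)       // two bytes with values [55, 49]
print(asHexByte, asAsciiBytes)            // 113 [55, 49]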
#import <CommonCrypto/CommonHMAC.h>
+ (NSString *)hmacSHA256EncryptString {
    NSString *parameterSecret = @"input secret key";
    NSString *plainString = @"input encrypt content string";
    const char *secretKey = [parameterSecret cStringUsingEncoding:NSUTF8StringEncoding];
    const char *plainData = [plainString cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA256, secretKey, strlen(secretKey), plainData, strlen(plainData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *bufferChar = (const unsigned char *)[HMACData bytes];
    NSMutableString *hmacString = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i) {
        [hmacString appendFormat:@"%02x", bufferChar[i]];
    }
    return hmacString;
}

iOS (objective-c) compression_decode_buffer() returns zero

I'm converting a very large JSON result on my server to a compressed format that I can decompress in my Objective-C app. I would prefer to use the iOS 9 compression library if possible (libcompression.tbd), described in Apple's CompressionSample/BlockCompression.c sample code.
I'm passing the compressed NSData result to the following method:
#include "compression.h"
...
- (NSData *)getDecompressedData:(NSData *)compressed {
    size_t dst_buffer_size = 20000000; // 20 MB
    uint8_t *dst_buffer = malloc(dst_buffer_size);
    uint8_t *src_buffer = malloc(compressed.length);
    [compressed getBytes:src_buffer length:compressed.length];
    size_t decompressedSize = compression_decode_buffer(dst_buffer, dst_buffer_size, src_buffer, compressed.length, nil, COMPRESSION_ZLIB);
    // initWithBytes: copies, so the malloc'd buffers can be freed here.
    NSData *decompressed = [[NSData alloc] initWithBytes:dst_buffer length:decompressedSize];
    free(dst_buffer);
    free(src_buffer);
    return decompressed;
}
The compressed parameter has a length that matches my server logs, but the result from compression_decode_buffer is always zero and dst_buffer is not modified. I'm not receiving any errors, and the log has no relevant info.
I've tried ZLIB and LZ4 compression / decompression methods and several libraries on the server side, all with the same result.
What am I doing wrong here?
After much testing and research, I found that the compression library I was using on my server adds a two-byte compression header, per RFC 1950. I skipped those two bytes and compression_decode_buffer works like a champ!
- (NSData *)getDecompressedData:(NSData *)compressed {
    size_t dst_buffer_size = 20000000; // 20 MB
    uint8_t *dst_buffer = malloc(dst_buffer_size);
    uint8_t *src_buffer = malloc(compressed.length);
    // Skip the 2-byte zlib header (RFC 1950) before decoding:
    [compressed getBytes:src_buffer range:NSMakeRange(2, compressed.length - 2)];
    size_t decompressedSize = compression_decode_buffer(dst_buffer, dst_buffer_size, src_buffer, compressed.length - 2, nil, COMPRESSION_ZLIB);
    NSData *decompressed = [[NSData alloc] initWithBytes:dst_buffer length:decompressedSize];
    free(dst_buffer);
    free(src_buffer);
    return decompressed;
}
Thank you so much, azcoastal - saved me heaps of time!
Here's some working Swift code...
import Compression

let bytes = [UInt8](data) // Data -> [UInt8]
// Remove the first 2 bytes (the zlib header) from the array:
let slice = bytes[2...bytes.count-1]
let noheader = Array(slice)
// MULTIPLY is an assumed expansion factor for sizing the destination buffer.
let dst_count = bytes.count * MULTIPLY
var dst = [UInt8](repeating: 0, count: dst_count) // destination
let size = compression_decode_buffer(&dst, dst_count,
                                     noheader, noheader.count, nil, COMPRESSION_ZLIB)
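A hedged refinement (not from the answer above): rather than stripping the first two bytes unconditionally, you can check for the RFC 1950 header first. 0x78 is the usual CMF byte (deflate with a 32K window) that starts such streams.
func stripZlibHeaderIfPresent(_ bytes: [UInt8]) -> [UInt8] {
    // Keep the input untouched unless it actually starts with a zlib header.
    guard bytes.count > 2, bytes[0] == 0x78 else { return bytes }
    return Array(bytes.dropFirst(2))
}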

Same HMAC algorithm in obj-c and swift creates different hashes

I have two methods that create a SHA1 hash from a string. Given the same input data, these algorithms create different hashes, although they should create the same ones.
In swift (creates 617fb90f14f2eacecc333d558237bf8bb9fc85f7):
static func sha1FromMessage(message: String) -> String {
    let cKey = RestUtils.API_KEY.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
    let output = NSMutableString(capacity: Int(CC_SHA1_DIGEST_LENGTH))
    for byte in cHMAC {
        output.appendFormat("%02hhx", byte)
    }
    return output
}
and obj-c (creates d80b816f0b46d5211b6d9487089597e181717ea6)
+ (NSString *)sha1FromMessage:(NSString *)message {
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i) {
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
I would like the Swift method to return the same hash as the Obj-C method. Any ideas where the problem is?
The reason is that cData created by
let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
includes the terminating NUL-character of the message string and that is counted in
UInt(cData.count) as well. You could fix that by using UInt(strlen(cData)) instead,
as in your Objective-C code.
But a better solution is to convert the input strings
to NSData objects instead:
let cKey = RestUtils.API_KEY.dataUsingEncoding(NSASCIIStringEncoding)!
let cData = message.dataUsingEncoding(NSUTF8StringEncoding)!
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey.bytes, UInt(cKey.length), cData.bytes, UInt(cData.length), &cHMAC)
With this modification, Swift and Objective-C code produce the same message digest.
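A small sketch confirming the explanation, in current Swift syntax (a hedged illustration; the names here are not from the original code):
import Foundation

let message = "abc"
let cString = message.cString(using: .utf8)!   // [97, 98, 99, 0] - four elements
let data = message.data(using: .utf8)!         // 3 bytes, no terminator
print(cString.count, data.count)               // 4 3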

Create hash in swift using key and message

I want to create an SHA1 HMAC hash of a string using a key in Swift. In Obj-C I used this and it worked great:
+ (NSString *)sha1FromMessage:(NSString *)message {
    const char *cKey = [API_KEY cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [message cStringUsingEncoding:NSUTF8StringEncoding];
    NSLog(@"%s", cData);
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMACData = [NSData dataWithBytes:cHMAC length:sizeof(cHMAC)];
    const unsigned char *buffer = (const unsigned char *)[HMACData bytes];
    NSMutableString *HMAC = [NSMutableString stringWithCapacity:HMACData.length * 2];
    for (int i = 0; i < HMACData.length; ++i) {
        [HMAC appendFormat:@"%02hhx", buffer[i]];
    }
    return HMAC;
}
However, now I am having a hard time translating this into Swift. This is what I have so far:
static func sha1FromMessage(message: String) {
    let cKey = RestUtils.apiKey.cStringUsingEncoding(NSASCIIStringEncoding)!
    let cData = message.cStringUsingEncoding(NSUTF8StringEncoding)!
    let cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
    CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
    ...
}
and this line
CCHmac(kCCHmacAlgSHA1, cKey, cKey.count, cData, cData.count, cHMAC)
already gives me the error "Int is not convertible to CCHmacAlgorithm". Any ideas how to translate the Obj-C code to Swift?
The last parameter of the CCHmac() function has the type UnsafeMutablePointer<Void>, because that's where the result is written to. You have to declare cHMAC as a variable and pass it as an in-out expression with &. In addition, some type conversions are necessary:
var cHMAC = [CUnsignedChar](count: Int(CC_SHA1_DIGEST_LENGTH), repeatedValue: 0)
CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, UInt(cKey.count), cData, UInt(cData.count), &cHMAC)
Apple enum values are defined differently in Swift. Instead of kCCHmacAlgSHA1, it's probably defined like CCHmacAlgorithm.SHA1.
This is answered here:
https://stackoverflow.com/a/24411522/2708650
