CCCrypt AES decryption - iOS

In my iOS app I have to decrypt data coming from a server. I use the CommonCrypto framework and, after several attempts, I successfully decrypted it with:
CCCrypt(kCCDecrypt, // operation
kCCAlgorithmAES128, // Algorithm
kCCOptionPKCS7Padding | kCCModeCBC, // options
key.bytes, // key
key.length, // keylength
nil,// iv
cipherData.bytes, // dataIn
cipherData.length, // dataInLength,
decryptedData.mutableBytes, // dataOut
decryptedData.length, // dataOutAvailable
&outLength); // dataOutMoved
On the Java server the data is encrypted with:
byte[] buff = new byte[100];
byte[] buf2 = new byte[32];
byte[] mainKey = ...
byte[] raw = ...
PaddedBufferedBlockCipher cipher = new PaddedBufferedBlockCipher(new AESEngine());
KeyParameter par = new KeyParameter(mainKey);
cipher.init(true, par); // true = encrypt
int minSize = cipher.getOutputSize(data.length);
byte[] outBuf = new byte[minSize];
int length1 = cipher.processBytes(data, 0, data.length, outBuf, 0);
int length2 = cipher.doFinal(outBuf, length1);
int actualLength = length1 + length2;
byte[] result = new byte[actualLength];
System.arraycopy(outBuf, 0, result, 0, result.length);
Now, I don't understand the meaning of kCCOptionPKCS7Padding | kCCModeCBC. kCCOptionPKCS7Padding = 0x0001 and kCCModeCBC = 2, so kCCOptionPKCS7Padding | kCCModeCBC = 3, but there is no block-cipher options constant with the value 3.
Can someone help me understand?

Your use of kCCModeCBC here is incorrect. All CCOptions constants begin with kCCOption; kCCModeCBC belongs to the CCMode enum, and you can't combine the two this way. It appears to work, but not for the reason you might expect (see the note at the end). You should remove | kCCModeCBC. (CCMode is used by a newer interface, CCCryptorCreateWithMode. The interface you're using defaults to CBC and has an option to switch to ECB mode instead.)
To your deeper question: these are bit fields. "Bit zero" (which has a value of 1) is PKCS7 padding. Bit one (which has a value of 2) turns on ECB (not CBC). If you OR them (which, for non-overlapping bits, is the same as adding them), you get 3, which means both options. This is an extremely common way to pass boolean data in C: each flag gets one bit in a larger integer.
If there were more fields, they would have the values 4, 8, 16, 32, and so on (all powers of two), so the set of options you turn on or off is exactly a binary number whose one bits are on and whose zero bits are off.
C does not have a really good way to maintain type safety for these kinds of values, so it won't stop you from combining unrelated enums like you've done here.
The reason it "works" with kCCModeCBC is that it has the same value as kCCOptionECBMode. Your encryption is actually running in ECB mode, not CBC mode. (Which happens to mean that your cipher is almost certainly deeply insecure, but that's a separate issue.)
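For reference, here is a sketch of the same call spelled with the two option bits that the value 3 actually represents; kCCOptionECBMode is what matches the bare AESEngine (ECB) on the Java side. Variable names are taken from the question.
// Hypothetical sketch: the same call as above, but using the two CCOptions
// bits that the value 3 actually stands for. kCCOptionECBMode matches the
// bare AESEngine (ECB) used on the Java side.
size_t outLength = 0;
CCCryptorStatus status = CCCrypt(kCCDecrypt,                               // operation
                                 kCCAlgorithmAES128,                       // algorithm
                                 kCCOptionPKCS7Padding | kCCOptionECBMode, // options (1 | 2 == 3)
                                 key.bytes, key.length,                    // key
                                 NULL,                                     // IV (ignored in ECB)
                                 cipherData.bytes, cipherData.length,      // dataIn
                                 decryptedData.mutableBytes, decryptedData.length,
                                 &outLength);
if (status == kCCSuccess) {
    decryptedData.length = outLength; // trim to the unpadded length
}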

Related

Inconsistent RSA encryption length result

I'm using the external library OpenSSL-Universal to encrypt a password using RSA_PKCS1_PADDING. Unfortunately, the output in the char encoded buffer has an inconsistent length. Say I have a 2048-bit modulus: the result length I expect is 128, otherwise it will fail to decrypt back to plain text.
BIGNUM *xponent = BN_new();
BIGNUM *modulus = BN_new();
BN_hex2bn(&xponent,xponentInHex);
BN_hex2bn(&modulus,modInHex);
RSA *rsa = RSA_new();
rsa->e = xponent;
rsa->n = modulus;
char encoded[1024] = {0};
RSA_public_encrypt(
(int)strlen(charString),// from len
(const unsigned char *)charString, // from
(unsigned char *)encoded, // to
rsa,
RSA_PKCS1_PADDING
);
RSA_free(rsa);
NSLog(#"%lu", strlen(encoded));
if there is anything wrong with my implements or if you have some explanation about inconsistent length result please let me know
The encrypted output from RSA_public_encrypt is not a string, so you cannot check its length using strlen. strlen will treat any '\0' byte in your encrypted output as the end of the string and return the number of bytes up to that point, but an encrypted buffer can legitimately contain '\0' bytes.
Also, please note that RSA_public_encrypt returns the length of the encrypted data, which you should be using.
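As a sketch of that advice (reusing the variables from the question), capture the return value and wrap the bytes in an NSData rather than calling strlen:
// Sketch: use the return value of RSA_public_encrypt as the ciphertext
// length; -1 signals an error. Variable names follow the question.
int encLen = RSA_public_encrypt((int)strlen(charString),
                                (const unsigned char *)charString,
                                (unsigned char *)encoded,
                                rsa,
                                RSA_PKCS1_PADDING);
if (encLen == -1) {
    NSLog(@"RSA_public_encrypt failed");
} else {
    // encLen equals RSA_size(rsa), i.e. the modulus length in bytes
    NSData *ciphertext = [NSData dataWithBytes:encoded length:encLen];
    NSLog(@"%lu", (unsigned long)ciphertext.length);
}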
You could also try a third-party library such as Objective-C-RSA, or read its source code and implement it yourself.

How to change Cipher.getInstance("AES") into Objective-C CCCrypt [duplicate]

I am encrypting a string in Objective-C and also encrypting the same string in Java using AES, and am seeing some strange issues. The first part of the result matches up to a certain point but then it is different, so when I go to decode the result from Java on the iPhone it can't decrypt it.
I am using a source string of "Now then and what is this nonsense all about. Do you know?"
Using a key of "1234567890123456"
The Objective-C code to encrypt is the following. NOTE: it is an NSData category, so assume that the method is called on an NSData object and 'self' contains the byte data to encrypt.
- (NSData *)AESEncryptWithKey:(NSString *)key {
char keyPtr[kCCKeySizeAES128+1]; // room for terminator (unused)
bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)
// fetch key data
[key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];
NSUInteger dataLength = [self length];
//See the doc: For block ciphers, the output size will always be less than or
//equal to the input size plus the size of one block.
//That's why we need to add the size of one block here
size_t bufferSize = dataLength + kCCBlockSizeAES128;
void *buffer = malloc(bufferSize);
size_t numBytesEncrypted = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);
if (cryptStatus == kCCSuccess) {
//the returned NSData takes ownership of the buffer and will free it on deallocation
return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
}
free(buffer); //free the buffer;
return nil;
}
And the Java encryption code is...
public byte[] encryptData(byte[] data, String key) {
byte[] encrypted = null;
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
byte[] keyBytes = key.getBytes();
SecretKeySpec keySpec = new SecretKeySpec(keyBytes, "AES");
try {
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS7Padding", "BC");
cipher.init(Cipher.ENCRYPT_MODE, keySpec);
encrypted = new byte[cipher.getOutputSize(data.length)];
int ctLength = cipher.update(data, 0, data.length, encrypted, 0);
ctLength += cipher.doFinal(encrypted, ctLength);
} catch (Exception e) {
logger.log(Level.SEVERE, e.getMessage());
} finally {
return encrypted;
}
}
The hex output of the objective-c code is -
7a68ea36 8288c73d f7c45d8d 22432577 9693920a 4fae38b2 2e4bdcef 9aeb8afe 69394f3e 1eb62fa7 74da2b5c 8d7b3c89 a295d306 f1f90349 6899ac34 63a6efa0
and the java output is -
7a68ea36 8288c73d f7c45d8d 22432577 e66b32f9 772b6679 d7c0cb69 037b8740 883f8211 748229f4 723984be b50b5aea 1f17594c 9fad2d05 ee092680 5572156d
As you can see everything is fine up to -
7a68ea36 8288c73d f7c45d8d 22432577
I am guessing I have some of the settings different but can't work out what. I tried changing between ECB and CBC on the Java side and it had no effect.
Can anyone help? Please...
Since CCCrypt takes an IV, does it not use a chaining block cipher mode (such as CBC)? This would be consistent with what you see: the first block is identical, but in the second block the Java version applies the original key to encrypt, while the OS X version seems to use something else.
EDIT:
From here I saw an example. It seems you need to pass kCCOptionECBMode to CCCrypt:
ccStatus = CCCrypt(encryptOrDecrypt,
kCCAlgorithm3DES,
kCCOptionECBMode, <-- this could help
vkey, //"123456789012345678901234", //key
kCCKeySize3DES,
nil, //"init Vec", //iv,
vplainText, //"Your Name", //plainText,
plainTextBufferSize,
(void *)bufferPtr,
bufferPtrSize,
&movedBytes);
EDIT 2:
I played around with some command line to see which one was right. I thought I could contribute it:
$ echo "Now then and what is this nonsense all about. Do you know?" | openssl enc -aes-128-ecb -K $(echo 1234567890123456 | xxd -p) -iv 0 | xxd
0000000: 7a68 ea36 8288 c73d f7c4 5d8d 2243 2577 zh.6...=..]."C%w
0000010: e66b 32f9 772b 6679 d7c0 cb69 037b 8740 .k2.w+fy...i.{.@
0000020: 883f 8211 7482 29f4 7239 84be b50b 5aea .?..t.).r9....Z.
0000030: eaa7 519b 65e8 fa26 a1bb de52 083b 478f ..Q.e..&...R.;G.
I spent a few weeks decrypting a base64 encoded, AES256 encrypted string. Encryption was done by CCCrypt (Objective-C) on an iPad. The decryption was to be done in Java (using Bouncy Castle).
I finally succeeded and learnt quite a lot in the process. The encryption code was exactly the same as above (I guess it's taken from the Objective-C sample in the iPhone developer documentation).
What CCCrypt() documentation does NOT mention is that it uses CBC mode by default (if you don't specify an option like kCCOptionECBMode). It does mention that the IV, if not specified, defaults to all zeros (so IV will be a byte array of 0x00, 16 members in length).
Using these two pieces of information, you can create a functionally identical encryption module using CBC (and avoid using ECB, which is less secure) on both Java and OS X/iPhone/iPad (CCCrypt).
The Cipher init function takes the IV as a third argument, wrapped in an IvParameterSpec:
cipher.init(Cipher.ENCRYPT_MODE, keySpec, new IvParameterSpec(IV));
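On the CCCrypt side, the same setup can be made explicit rather than relied on as a default; a minimal sketch, where keyPtr, dataIn, dataInLength, buffer and bufferSize are placeholders standing in for the question's variables:
// Sketch: CBC with an explicit all-zero IV, equivalent to passing NULL.
// keyPtr/dataIn/dataInLength/buffer/bufferSize are assumed to be set up as
// in the question's category method.
uint8_t iv[kCCBlockSizeAES128] = {0};   // 16 zero bytes, same as the default
size_t numBytesEncrypted = 0;
CCCryptorStatus status = CCCrypt(kCCEncrypt,
                                 kCCAlgorithmAES128,
                                 kCCOptionPKCS7Padding,   // CBC is the default mode
                                 keyPtr, kCCKeySizeAES128,
                                 iv,                      // explicit zero IV
                                 dataIn, dataInLength,
                                 buffer, bufferSize,
                                 &numBytesEncrypted);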
For anyone else who needs this, disown was absolutely spot on... the revised call to create the crypt in Objective-C is as follows (note that you need the ECB mode AND the padding)...
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode + kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);
Just to add to the first post: in your Objective-C/Cocoa code you used CBC mode, in your Java code you used ECB, and an IV (initialization vector) wasn't used in either.
An ECB cipher works block by block, while CBC chains each block onto the preceding one, so if your text is smaller than one block (16 bytes in your example), the ciphertext produced by both is the same and each can decrypt the other's output.
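That is easy to check with CommonCrypto. A throwaway sketch, assuming CommonCryptor.h is imported and using made-up key/plaintext placeholders:
// Sketch: with an all-zero IV, the first CBC block equals the ECB block,
// because the IV XORed into the first plaintext block is all zeros.
// Key and plaintext below are made-up 16-byte placeholders.
const char *key   = "1234567890123456";   // 16 bytes
const char *block = "exactly16bytes!!";   // 16 bytes, one full AES block
uint8_t ecbOut[kCCBlockSizeAES128];
uint8_t cbcOut[kCCBlockSizeAES128];
size_t moved = 0;

CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode,
        key, kCCKeySizeAES128, NULL,
        block, kCCBlockSizeAES128, ecbOut, sizeof(ecbOut), &moved);
CCCrypt(kCCEncrypt, kCCAlgorithmAES128, 0 /* CBC, default zero IV */,
        key, kCCKeySizeAES128, NULL,
        block, kCCBlockSizeAES128, cbcOut, sizeof(cbcOut), &moved);

NSLog(@"first blocks match: %d", memcmp(ecbOut, cbcOut, sizeof(ecbOut)) == 0);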
If you are looking for a way to standardize your use of the ciphers, NIST Special Publication 800-38A, 2001 Edition has test vectors. I can post code for the AES CBC and ECB vectors if it's helpful to anyone.

Get SecKeyRef from modulus/exponent

I have an RSA key (pair) represented as a big-integer modulus and exponent, and I need to encrypt/decrypt with those.
I have figured out how to handle keys as needed in iOS using Swift.
To my question: Is there any way to convert the modulus/exponent representation to a standard SecKeyRef?
Both are formatted as big integers (coming from Android); a modulus, for example, looks like this:
23986589886077318012326064844037831693417390067186403792990846282531380456965701688980194375481519508455379138899060072530724598302129656976140458275478340281694599774176865257462922861492999970413042311221914141827738166785420817621605554859384423695247859963064446809695729281306530681131568503935369097838468173777374667631401317163094053418212485192857751897040859007584244053136110895205839896478287122804119514727484734998762296502939823974188856604771622873660784676915716476754048257418841069214486772931445697194023455179601077893872576165858771367831752886749210944303260745331014786145738511592470796648651
I had exactly the same task - given a modulus and an exponent I had to create a public key and encrypt a message using that key. After a long time spent reading and trying various libraries, I was able to accomplish this with OpenSSL. I'm posting my way of doing it below. Although it's written in Objective-C, it might be helpful.
NSData *message, *modulus, *exponent; // inputs: plaintext, modulus bytes, exponent bytes
BIGNUM* mod = BN_bin2bn((unsigned char *)[modulus bytes], (int)modulus.length, NULL);
if (mod == NULL) {
NSLog(#"Error creating modulus BIGNUM");
}
BIGNUM* exp = BN_bin2bn((unsigned char *)[exponent bytes], (int)exponent.length, NULL);
if (exp == NULL) {
NSLog(#"Error creating exponent BIGNUM");
}
RSA* rsa = RSA_new();
rsa->pad = 0;
rsa->e = exp;
rsa->n = mod;
int keylen = RSA_size(rsa);
unsigned char* enc = malloc(keylen);
char* err = malloc(130);
int status = RSA_public_encrypt((int)message.length, (const unsigned char*)[message bytes], enc, rsa, RSA_NO_PADDING);
if (status != -1) {
NSData* encryptedMessage = [NSData dataWithBytes:enc length:keylen];
NSLog(#"Encryption SUCCESSFUL: %#", encryptedMessage);
}
else {
ERR_load_crypto_strings();
ERR_error_string(ERR_get_error(), err);
NSLog(#"Encryption failed with error: %s", err);
}
free(enc);
free(err);
So first I'm creating big integers out of my NSData modulus and exponent. You already have them as big integers, but if they're not represented as OpenSSL's BIGNUM type, you'll have to convert them. The BIGNUM API has other useful functions for creating big integers, like BN_hex2bn and BN_dec2bn, which build big integers from C strings containing hexadecimal or decimal numbers. In my case the modulus and exponent are stored as byte arrays in NSData objects, and BN_bin2bn creates a BIGNUM directly from that.
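Since the modulus in the question above is printed as a decimal string, BN_dec2bn would be the matching conversion. A hedged sketch (the modulus string is truncated here, and the 65537 exponent is only an example value, since the question doesn't give one):
// Sketch: build the BIGNUMs from decimal strings with BN_dec2bn.
// The modulus string is truncated; 65537 is just a common example exponent.
// BN_dec2bn returns 0 on failure.
BIGNUM *mod = NULL;
BIGNUM *exp = NULL;
const char *decimalModulus  = "23986589886077318012326064844037..."; /* full decimal string */
const char *decimalExponent = "65537";
if (BN_dec2bn(&mod, decimalModulus) == 0 || BN_dec2bn(&exp, decimalExponent) == 0) {
    NSLog(@"Error parsing decimal big integers");
}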
Moving on, I create an RSA structure, which represents a key and holds the modulus and exponent, and the enc buffer, which will hold the raw encrypted bytes. The length of enc is the same as the size of the key, because RSA cannot encrypt messages longer than the key.
The main work is done by the RSA_public_encrypt() function. It takes five arguments: the size of the message you're going to encrypt, the actual message bytes, an output buffer to store the encrypted message in, the RSA key, and a padding scheme. I'm using no padding here because my message is exactly the same size as the key, but rsa.h has macros for the most common padding schemes.
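For reference, the input-size limits differ per padding mode (these figures are from the OpenSSL RSA_public_encrypt man page), so a small guard like this can help; message and rsa are the variables from the code above:
// Sketch: maximum plaintext sizes for the common padding modes.
int modBytes = RSA_size(rsa);
int maxNoPadding = modBytes;        // RSA_NO_PADDING: input must be exactly this long
int maxPkcs1     = modBytes - 11;   // RSA_PKCS1_PADDING
int maxOaep      = modBytes - 42;   // RSA_PKCS1_OAEP_PADDING (SHA-1 based OAEP)
NSLog(@"limits: none=%d pkcs1=%d oaep=%d", maxNoPadding, maxPkcs1, maxOaep);
if ((int)message.length > maxOaep) {
    NSLog(@"Message too long for OAEP with this key");
}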
Lastly I check the status which holds the number of encrypted bytes and print an error message if something went wrong.
I hope this will help you and somebody else. Tell me if you managed to do it in Swift. Cheers ;-)
P.S. Adding OpenSSL to your iOS project is easy using CocoaPods. Just add
pod 'OpenSSL-Universal', '1.0.1.k'
to your podfile.

AES decryption \0 character - iOS

I have a problem: when I decrypt the data returned from my PHP page, if the length of the string is less than 16, the character \0 is appended to the string.
The original string is: 100000065912248
I decrypt the encrypted string with this function:
#define FBENCRYPT_ALGORITHM kCCAlgorithmAES128
#define FBENCRYPT_BLOCK_SIZE kCCBlockSizeAES128
#define FBENCRYPT_KEY_SIZE kCCKeySizeAES256
+ (NSData*)decryptData:(NSData*)data key:(NSData*)key iv:(NSData*)iv;
{
NSData* result = nil;
// setup key
unsigned char cKey[FBENCRYPT_KEY_SIZE];
bzero(cKey, sizeof(cKey));
[key getBytes:cKey length:FBENCRYPT_KEY_SIZE];
// setup iv
char cIv[FBENCRYPT_BLOCK_SIZE];
bzero(cIv, FBENCRYPT_BLOCK_SIZE);
if (iv) {
[iv getBytes:cIv length:FBENCRYPT_BLOCK_SIZE];
}
// setup output buffer
size_t bufferSize = [data length] + FBENCRYPT_BLOCK_SIZE;
void *buffer = malloc(bufferSize);
int length = [data length];
// do decrypt
size_t decryptedSize = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCDecrypt,
FBENCRYPT_ALGORITHM,
0,
cKey,
FBENCRYPT_KEY_SIZE,
cIv,
[data bytes],
[data length],
buffer,
bufferSize,
&decryptedSize);
if (cryptStatus == kCCSuccess) {
result = [NSData dataWithBytesNoCopy:buffer length:decryptedSize];
} else {
free(buffer);
NSLog(#"[ERROR] failed to decrypt| CCCryptoStatus: %d", cryptStatus);
}
return result;
}
I pass a nil "iv" parameter to the function, so the cIv buffer used inside the function ends up containing all zero bytes.
The result is correct, but the length of the string is 16 instead of 15 (string: 100000065912248); the last character is \0.
Why? How can I solve this?
EDIT:
PHP encrypt function:
function encrypt($plaintext) {
$key = 'a16byteslongkey!a16byteslongkey!';
$base64encoded_ciphertext = base64_encode(mcrypt_encrypt(MCRYPT_RIJNDAEL_128, $key, $plaintext, MCRYPT_MODE_CBC));
$base64encoded_ciphertext = trim($base64encoded_ciphertext);
return $base64encoded_ciphertext;
}
AES is a block cipher and encrypts/decrypts blocks of 128 bits (16 bytes), so if the data is not a multiple of the block size, some padding must be added. The most popular scheme, and the one supported by Apple, is PKCS7.
When interfacing with PHP, one must consider padding and possibly Base64 encoding.
The solution is to use the same padding on both sides, PHP and iOS.
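For example, if the PHP side is changed to apply PKCS#7 padding before mcrypt_encrypt, the matching change in the question's decryptData: method is just the options flag; a sketch:
// Sketch: ask CommonCrypto to strip PKCS#7 padding on decrypt. Only valid if
// the PHP side actually padded with PKCS#7 (mcrypt's default is zero padding).
CCCryptorStatus cryptStatus = CCCrypt(kCCDecrypt,
                                      FBENCRYPT_ALGORITHM,
                                      kCCOptionPKCS7Padding,   // was 0 in the question
                                      cKey, FBENCRYPT_KEY_SIZE,
                                      cIv,
                                      [data bytes], [data length],
                                      buffer, bufferSize,
                                      &decryptedSize);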
AES always operates on 16-byte blocks; there is no option. So if you have 15 bytes, a byte is going to have to be added, and that is padding. From what I understand (I don't know much about PHP encryption), PHP does not do true PKCS#7 padding, and it is best to pad yourself. Look up PKCS7 on Wikipedia.
You should be OK with zero padding (the default) if you only operate on strings, but I would recommend PKCS#7 padding, if only for interoperability reasons.
With zero padding the plaintext is padded with 00-valued bytes, but only if required. This is different from PKCS#7 padding, which is always applied. After decryption you can use the trim function on the resulting plaintext, and you should then get the original string.
This obviously won't work on binary data, because it may end with a character that is removed by the trim function. Beware that trim in PHP seems to strip off 00 bytes; this is not a given, since officially 00 is not whitespace, even though it is treated that way by many runtimes.
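If you stay with zero padding, the iOS-side equivalent of trim is to drop trailing 0x00 bytes from the decrypted NSData. A small sketch against the result returned by the question's method:
// Sketch: remove zero padding after a successful decrypt. Safe only for text;
// binary plaintext that legitimately ends in 0x00 would be truncated.
NSData *plain = result;                 // decrypted output from the question's method
const uint8_t *bytes = plain.bytes;
NSUInteger len = plain.length;
while (len > 0 && bytes[len - 1] == 0x00) {
    len--;
}
NSData *trimmed = [plain subdataWithRange:NSMakeRange(0, len)];
NSString *string = [[NSString alloc] initWithData:trimmed encoding:NSUTF8StringEncoding];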
You have to remove padding from the decrypted data
function removePadding($decryptedText){
$strPad = ord($decryptedText[strlen($decryptedText)-1]);
$decryptedText= substr($decryptedText, 0, -$strPad);
return $decryptedText;
}

OpenSSL RSA_public_encrypt strange behavior

I'm trying to write a simple encryption routine in C using OpenSSL and I've found something strange. I'm not a C guru nor an OpenSSL professional, so I might have made a mistake.
The function is as follows
char *rsa_encrypt(char *data)
{
const char xponent_in_hex[] = "010001";
const char modulus_in_hex[] = "D0BA16F11907E7B0819705A15264AC29BEE9F1EC5F22642992"
                              "D3E27100B7F212864A624A12FFB6D531712B0B0225AAD0C2E313D077A7DB2A5A33483EEFF41A9D";
BIGNUM *xponent = NULL;
BIGNUM *modulus = NULL;
BN_hex2bn(&xponent, xponent_in_hex);
BN_hex2bn(&modulus, modulus_in_hex);
RSA *rsa = RSA_new();
rsa->e = xponent;
rsa->n = modulus;
rsa->iqmp = NULL;
rsa->d = NULL;
rsa->p = NULL;
rsa->q = NULL;
char encoded[512] = { 0 };
RSA_public_encrypt(
strlen(data),
(const unsigned char *)data,
(unsigned char *)encoded,
rsa,
RSA_PKCS1_OAEP_PADDING
);
RSA_free(rsa);
return (encoded);
}
int _tmain(int argc, _TCHAR* argv[])
{
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
}
I call that function on the same data several times and it generates a different value each time. That seems wrong, because the exponent and modulus of the created RSA structure are constant and the input data is the same in each call.
So why does RSA_public_encrypt behave that way?
How should I generate a public key for RSA encryption based on an exponent and modulus?
And where have I made a mistake?
This is actually correct, and you're not making a mistake. Your confusion stems from the RSA_PKCS1_OAEP_PADDING parameter to RSA_public_encrypt.
The RSA encryption process is actually:
Take the plaintext (plain) and encode it, producing encoded_plain.
Encrypt encoded_plain.
(As you would expect, the decryption process requires you to both decrypt the value, and then decode the message).
The RSA_PKCS1_OAEP_PADDING parameter specifies how the plaintext should be encoded (that OAEP encoding should be used).
A simplified explanation is that OAEP padding uses some random values for the padding, so xxxxxxxABC, yyyyyyyABC, and zzzzzzzABC are all valid encoded_plain values for your plaintext, and each encoded_plain encrypts to a different ciphertext. If you perform the corresponding decrypt (and decode, by passing the same RSA_PKCS1_OAEP_PADDING parameter to RSA_private_decrypt), you should still get "ABC" as the output for each of the ciphertexts, since the padding is stripped off all three.
(If you want to be precise, the OAEP encoding scheme is more complicated than that; see RFC 3447 section 7.1.1. But those are probably details you don't care about.)
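One way to convince yourself of this is to generate a throwaway key pair, encrypt the same message twice with OAEP, and decrypt both. A self-contained sketch (it cannot use the question's fixed public key, since decryption needs the private half):
// Sketch: two OAEP encryptions of the same plaintext produce different
// ciphertexts, yet both decrypt back to it. Uses a freshly generated key pair;
// error handling is omitted for brevity.
#include <openssl/rsa.h>
#include <openssl/bn.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    RSA *rsa = RSA_new();
    BIGNUM *e = BN_new();
    BN_set_word(e, 65537);
    RSA_generate_key_ex(rsa, 2048, e, NULL);

    const unsigned char msg[] = "ABC";
    unsigned char c1[256], c2[256], out[256];

    RSA_public_encrypt(3, msg, c1, rsa, RSA_PKCS1_OAEP_PADDING);
    RSA_public_encrypt(3, msg, c2, rsa, RSA_PKCS1_OAEP_PADDING);
    printf("ciphertexts identical: %d\n", memcmp(c1, c2, RSA_size(rsa)) == 0); // 0: they differ

    int n = RSA_private_decrypt(RSA_size(rsa), c1, out, rsa, RSA_PKCS1_OAEP_PADDING);
    printf("decrypted: %.*s\n", n, out); // ABC

    BN_free(e);
    RSA_free(rsa);
    return 0;
}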
The scope of encoded ends at the end of the rsa_encrypt function, so your returned pointer points to an invalid area of memory that may no longer contain what you expect, because something else (another thread, for example) has written over it. The answer explaining the padding is correct.
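A minimal way to address that, as a sketch with names of my own choosing rather than the asker's: allocate the ciphertext on the heap and return its length explicitly, since the output is binary and has no terminator.
// Sketch (hypothetical names): heap-allocate the ciphertext so it survives
// the function, and hand back its length via out_len. Caller must free() it.
#include <openssl/rsa.h>
#include <stdlib.h>
#include <string.h>

unsigned char *rsa_encrypt_copy(RSA *rsa, const char *data, int *out_len)
{
    unsigned char *encoded = malloc(RSA_size(rsa));
    if (encoded == NULL)
        return NULL;
    *out_len = RSA_public_encrypt((int)strlen(data),
                                  (const unsigned char *)data,
                                  encoded,
                                  rsa,
                                  RSA_PKCS1_OAEP_PADDING);
    if (*out_len == -1) {   // encryption failed
        free(encoded);
        return NULL;
    }
    return encoded;
}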
