I’m trying to write a simple encryption routine in C using OpenSSL and I’ve found something strange. I’m neither a C guru nor an OpenSSL professional, so I might have made a mistake.
The function is as follows:
char *rsa_encrypt(char *data)
{
const char xponent_in_hex[] = "010001";
const char modulus_in_hex[] =
    "D0BA16F11907E7B0819705A15264AC29BEE9F1EC5F22642992"
    "D3E27100B7F212864A624A12FFB6D531712B0B0225AAD0C2E313D077A7DB2A5A33483EEFF41A9D";
BIGNUM *xponent = NULL;
BIGNUM *modulus = NULL;
BN_hex2bn(&xponent, xponent_in_hex);
BN_hex2bn(&modulus, modulus_in_hex);
RSA *rsa = RSA_new();
rsa->e = xponent;
rsa->n = modulus;
rsa->iqmp = NULL;
rsa->d = NULL;
rsa->p = NULL;
rsa->q = NULL;
char encoded[512] = { 0 };
RSA_public_encrypt(
strlen(data),
(const unsigned char *)data,
(unsigned char *)encoded,
rsa,
RSA_PKCS1_OAEP_PADDING
);
RSA_free(rsa);
return (encoded);
}
int _tmain(int argc, _TCHAR* argv[])
{
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
printf("%s\n", base64_encode(rsa_encrypt("ABC")));
}
I call that function on the same data several times and it generates a different value on each call. That seems wrong, because the exponent and modulus of the created RSA structure are constant and the input data is the same in each call.
So why does RSA_public_encrypt behave that way?
How should I generate a public key for RSA encryption from an exponent and modulus?
And where have I made a mistake?
This is actually correct, and you're not making a mistake. Your confusion stems from the RSA_PKCS1_OAEP_PADDING parameter to RSA_public_encrypt.
The RSA encryption process is actually:
Take the plaintext (plain) and encode it, producing encoded_plain.
Encrypt encoded_plain.
(As you would expect, the decryption process requires you to first decrypt the value and then decode the message.)
The RSA_PKCS1_OAEP_PADDING parameter specifies how the plaintext should be encoded (that OAEP encoding should be used).
A simplified explanation is that OAEP padding uses random values for the padding, so xxxxxxxABC, yyyyyyyABC and zzzzzzzABC are all valid encoded_plain values for your plaintext, and each of them encrypts to a different ciphertext. If you perform the corresponding decrypt-and-decode operation (by passing the same RSA_PKCS1_OAEP_PADDING parameter to RSA_private_decrypt), you will still get "ABC" back from each of the ciphertexts, because decoding strips the padding off all three.
(If you want to be precise, the OAEP encoding scheme is more complicated than that; see RFC 3447, section 7.1.1. But those are probably details you don't care about.)
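To make that concrete, here is a minimal sketch of the matching decrypt call. It assumes a hypothetical rsa_priv holding the full private key (the question's code only has the public half) and a ciphertext buffer from one of the encrypt calls:

unsigned char decrypted[512] = { 0 };
int len = RSA_private_decrypt(
    RSA_size(rsa_priv),        /* the ciphertext is always RSA_size() bytes */
    ciphertext,                /* any one of the differing ciphertexts */
    decrypted,
    rsa_priv,
    RSA_PKCS1_OAEP_PADDING     /* must match the padding used when encrypting */
);
/* len is 3 and decrypted holds "ABC" for every one of the differing
   ciphertexts, because decoding strips the random OAEP padding. */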
The scope of encoded ends at the end of the rsa_encrypt function, so your returned pointer points to an invalid area of memory. It may no longer contain what you expect, because that stack space can be reused (by another function call or another thread, for example). The answer explaining the padding is correct.
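As a sketch of the fix (written against the legacy, pre-1.1.0 OpenSSL API the question uses): allocate the output buffer on the heap and return the ciphertext length through an out-parameter, since the ciphertext is not a C string and may contain '\0' bytes:

unsigned char *rsa_encrypt(const char *data, int *out_len)
{
    /* xponent_in_hex and modulus_in_hex as in the question */
    BIGNUM *xponent = NULL;
    BIGNUM *modulus = NULL;
    BN_hex2bn(&xponent, xponent_in_hex);
    BN_hex2bn(&modulus, modulus_in_hex);
    RSA *rsa = RSA_new();
    rsa->e = xponent;   /* direct field access: OpenSSL <= 1.0.x only;    */
    rsa->n = modulus;   /* 1.1.0+ requires RSA_set0_key(rsa, n, e, NULL) */
    unsigned char *encoded = malloc(RSA_size(rsa));   /* caller must free() */
    *out_len = RSA_public_encrypt(
        (int)strlen(data),
        (const unsigned char *)data,
        encoded,
        rsa,
        RSA_PKCS1_OAEP_PADDING
    );
    RSA_free(rsa);      /* also frees e and n, which rsa now owns */
    return encoded;     /* heap memory: still valid after the return */
}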
I'm using the external library OpenSSL-Universal to encrypt a password with RSA_PKCS1_PADDING. Unfortunately, the encoded output has an inconsistent length. Say I have a 2048-bit modulus; the length I expect is 128, otherwise it will fail to decrypt back to plain text.
BIGNUM *xponent = BN_new();
BIGNUM *modulus = BN_new();
BN_hex2bn(&xponent,xponentInHex);
BN_hex2bn(&modulus,modInHex);
RSA *rsa = RSA_new();
rsa->e = xponent;
rsa->n = modulus;
char encoded[1024] = {0};
RSA_public_encrypt(
(int)strlen(charString),// from len
(const unsigned char *)charString, // from
(unsigned char *)encoded, // to
rsa,
RSA_PKCS1_PADDING
);
RSA_free(rsa);
NSLog(@"%lu", strlen(encoded));
If there is anything wrong with my implementation, or if you have an explanation for the inconsistent length, please let me know.
The encrypted output from RSA_public_encrypt is not a string, so you cannot check its length using strlen. strlen will treat any '\0' byte in your encrypted output as the end of the string and return the number of bytes up to that point, but an encrypted buffer can contain valid bytes that happen to be '\0'.
Also, please note that RSA_public_encrypt returns the length of the encrypted data, which you should be using.
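A sketch of that change applied to the snippet above: capture the return value and log that instead of strlen (it is -1 on error):

int enc_len = RSA_public_encrypt(
    (int)strlen(charString),
    (const unsigned char *)charString,
    (unsigned char *)encoded,
    rsa,
    RSA_PKCS1_PADDING
);
RSA_free(rsa);
if (enc_len == -1) {
    /* encryption failed; see ERR_get_error() */
} else {
    /* enc_len equals the key size in bytes: 256 for a 2048-bit modulus */
    NSLog(@"%d", enc_len);
}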
You could also try a third-party library such as Objective-C-RSA, or read its source code and implement the same thing yourself.
I am encrypting a string in Objective-C and encrypting the same string in Java using AES, and I am seeing some strange issues. The first part of the result matches up to a certain point and then differs, so when I try to decrypt the Java result on the iPhone it fails.
I am using a source string of "Now then and what is this nonsense all about. Do you know?"
Using a key of "1234567890123456"
The Objective-C code to encrypt is the following. (Note: it is an NSData category, so the method is called on an NSData object and 'self' contains the byte data to encrypt.)
- (NSData *)AESEncryptWithKey:(NSString *)key {
char keyPtr[kCCKeySizeAES128+1]; // room for terminator (unused)
bzero(keyPtr, sizeof(keyPtr)); // fill with zeroes (for padding)
// fetch key data
[key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];
NSUInteger dataLength = [self length];
//See the doc: For block ciphers, the output size will always be less than or
//equal to the input size plus the size of one block.
//That's why we need to add the size of one block here
size_t bufferSize = dataLength + kCCBlockSizeAES128;
void *buffer = malloc(bufferSize);
size_t numBytesEncrypted = 0;
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);
if (cryptStatus == kCCSuccess) {
//the returned NSData takes ownership of the buffer and will free it on deallocation
return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
}
free(buffer); //free the buffer;
return nil;
}
And the java encryption code is...
public byte[] encryptData(byte[] data, String key) {
byte[] encrypted = null;
Security.addProvider(new org.bouncycastle.jce.provider.BouncyCastleProvider());
byte[] keyBytes = key.getBytes();
SecretKeySpec keySpec = new SecretKeySpec(keyBytes, "AES");
try {
Cipher cipher = Cipher.getInstance("AES/ECB/PKCS7Padding", "BC");
cipher.init(Cipher.ENCRYPT_MODE, keySpec);
encrypted = new byte[cipher.getOutputSize(data.length)];
int ctLength = cipher.update(data, 0, data.length, encrypted, 0);
ctLength += cipher.doFinal(encrypted, ctLength);
} catch (Exception e) {
logger.log(Level.SEVERE, e.getMessage());
} finally {
return encrypted;
}
}
The hex output of the Objective-C code is -
7a68ea36 8288c73d f7c45d8d 22432577 9693920a 4fae38b2 2e4bdcef 9aeb8afe 69394f3e 1eb62fa7 74da2b5c 8d7b3c89 a295d306 f1f90349 6899ac34 63a6efa0
and the Java output is -
7a68ea36 8288c73d f7c45d8d 22432577 e66b32f9 772b6679 d7c0cb69 037b8740 883f8211 748229f4 723984be b50b5aea 1f17594c 9fad2d05 ee092680 5572156d
As you can see everything is fine up to -
7a68ea36 8288c73d f7c45d8d 22432577
I am guessing some of my settings are different, but I can't work out which. I tried switching between ECB and CBC on the Java side and it had no effect.
Can anyone help?! Please...
Since CCCrypt takes an IV, doesn't it use a chaining block cipher mode (such as CBC)? That would be consistent with what you see: the first block is identical, but in the second block the Java version (ECB) encrypts the plaintext block directly, while the OS X version chains in the previous ciphertext block.
EDIT:
From here I saw an example. It seems you need to pass kCCOptionECBMode to CCCrypt:
ccStatus = CCCrypt(encryptOrDecrypt,
kCCAlgorithm3DES,
kCCOptionECBMode, // <-- this could help
vkey, //"123456789012345678901234", //key
kCCKeySize3DES,
nil, //"init Vec", //iv,
vplainText, //"Your Name", //plainText,
plainTextBufferSize,
(void *)bufferPtr,
bufferPtrSize,
&movedBytes);
EDIT 2:
I played around with some command line to see which one was right. I thought I could contribute it:
$ echo "Now then and what is this nonsense all about. Do you know?" | openssl enc -aes-128-ecb -K $(echo 1234567890123456 | xxd -p) -iv 0 | xxd
0000000: 7a68 ea36 8288 c73d f7c4 5d8d 2243 2577 zh.6...=..]."C%w
0000010: e66b 32f9 772b 6679 d7c0 cb69 037b 8740 .k2.w+fy...i.{.@
0000020: 883f 8211 7482 29f4 7239 84be b50b 5aea .?..t.).r9....Z.
0000030: eaa7 519b 65e8 fa26 a1bb de52 083b 478f ..Q.e..&...R.;G.
I spent a few weeks decrypting a base64-encoded, AES256-encrypted string. Encryption was done with CCCrypt (Objective-C) on an iPad; the decryption was to be done in Java (using Bouncy Castle).
I finally succeeded and learnt quite a lot in the process. The encryption code was exactly the same as above (I guess it's taken from the Objective-C sample in the iPhone developer documentation).
What the CCCrypt() documentation does NOT mention is that it uses CBC mode by default (if you don't specify an option like kCCOptionECBMode). It does mention that the IV, if not specified, defaults to all zeros, so the IV will be a 16-byte array of 0x00.
Using these two pieces of information, you can create a functionally identical encryption module on both Java and OS X/iPhone/iPad (CCCrypt) using CBC (and avoid ECB, which is less secure).
The Cipher init function takes the IV as a third argument, wrapped in an IvParameterSpec:
cipher.init(Cipher.ENCRYPT_MODE, keySpec, new IvParameterSpec(iv));
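On the CommonCrypto side, a sketch of the matching call with those defaults made explicit (reusing keyPtr, dataLength, buffer, bufferSize and numBytesEncrypted from the NSData category earlier in this thread; passing the all-zero IV explicitly is equivalent to passing NULL):

uint8_t iv[kCCBlockSizeAES128] = { 0 };   /* CCCrypt's documented default IV */
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128,
                                      kCCOptionPKCS7Padding, /* no ECB flag, so CBC */
                                      keyPtr, kCCKeySizeAES128,
                                      iv,                    /* explicit zero IV */
                                      [self bytes], dataLength,
                                      buffer, bufferSize,
                                      &numBytesEncrypted);

On the Java side, the matching IV is new IvParameterSpec(new byte[16]).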
For anyone else who needs this, disown was absolutely spot on... the revised call to create the cryptor in Objective-C is as follows (note you need the ECB mode AND the padding)...
CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode | kCCOptionPKCS7Padding,
keyPtr, kCCKeySizeAES128,
NULL /* initialization vector (optional) */,
[self bytes], dataLength, /* input */
buffer, bufferSize, /* output */
&numBytesEncrypted);
Just to add to the first post: in your Objective-C/Cocoa code you used CBC mode, in your Java code you used ECB, and an IV (initialization vector) wasn't used in either.
ECB encrypts block by block, while CBC chains each block onto the preceding ciphertext block. With an all-zero IV, CBC's first block is E(P1 XOR 0) = E(P1), identical to ECB's first block, so if your text is smaller than one block (16 bytes in your example), the ciphertext produced by either mode can be decrypted by the other (it is the same).
If you are looking for a way to standardize your use of the ciphers, NIST Special Publication 800-38A, 2001 Edition has test vectors. I can post code for the AES CBC and ECB vectors if it's helpful to anyone.
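For example, here is a sketch that checks the first AES-128 ECB vector from SP 800-38A (appendix F.1.1) against CCCrypt; the key, plaintext and expected ciphertext below are the published values:

#include <CommonCrypto/CommonCryptor.h>
#include <stdio.h>
#include <string.h>

/* NIST SP 800-38A, F.1.1 (ECB-AES128.Encrypt), block #1 */
static const uint8_t key[16]      = { 0x2b,0x7e,0x15,0x16,0x28,0xae,0xd2,0xa6,
                                      0xab,0xf7,0x15,0x88,0x09,0xcf,0x4f,0x3c };
static const uint8_t plain[16]    = { 0x6b,0xc1,0xbe,0xe2,0x2e,0x40,0x9f,0x96,
                                      0xe9,0x3d,0x7e,0x11,0x73,0x93,0x17,0x2a };
static const uint8_t expected[16] = { 0x3a,0xd7,0x7b,0xb4,0x0d,0x7a,0x36,0x60,
                                      0xa8,0x9e,0xca,0xf3,0x24,0x66,0xef,0x97 };

int main(void)
{
    uint8_t out[16];
    size_t moved = 0;
    /* ECB, no padding option: the input is exactly one block */
    CCCryptorStatus s = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode,
                                key, kCCKeySizeAES128,
                                NULL,                 /* IV is ignored in ECB */
                                plain, sizeof(plain),
                                out, sizeof(out), &moved);
    puts(s == kCCSuccess && moved == 16 && memcmp(out, expected, 16) == 0
             ? "vector OK" : "MISMATCH");
    return 0;
}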
I have an RSA key (pair) represented as a big-integer modulus and exponent, and I need to encrypt/decrypt with those.
I figured out how to handle keys as needed in iOS using Swift.
To my question: is there any way to convert the modulus/exponent representation to a standard SecKeyRef?
Both are formatted as big integers (coming from Android); a modulus, for example, looks like this:
23986589886077318012326064844037831693417390067186403792990846282531380456965701688980194375481519508455379138899060072530724598302129656976140458275478340281694599774176865257462922861492999970413042311221914141827738166785420817621605554859384423695247859963064446809695729281306530681131568503935369097838468173777374667631401317163094053418212485192857751897040859007584244053136110895205839896478287122804119514727484734998762296502939823974188856604771622873660784676915716476754048257418841069214486772931445697194023455179601077893872576165858771367831752886749210944303260745331014786145738511592470796648651
I had exactly the same task - given a modulus and an exponent I had to create a public key and encrypt a message using that key. After a long time spent in reading and trying various libraries, I was able to accomplish this with OpenSSL. I'm posting my way of doing it below. Although it's written in Objective-C, it might be helpful.
NSData *message, *modulus, *exponent;
BIGNUM* mod = BN_bin2bn((unsigned char *)[modulus bytes], (int)modulus.length, NULL);
if (mod == NULL) {
NSLog(#"Error creating modulus BIGNUM");
}
BIGNUM* exp = BN_bin2bn((unsigned char *)[exponent bytes], (int)exponent.length, NULL);
if (exp == NULL) {
NSLog(#"Error creating exponent BIGNUM");
}
RSA* rsa = RSA_new();
rsa->pad = 0;
rsa->e = exp;
rsa->n = mod;
int keylen = RSA_size(rsa);
unsigned char* enc = malloc(keylen);
char* err = malloc(130);
int status = RSA_public_encrypt((int)message.length, (const unsigned char*)[message bytes], enc, rsa, RSA_NO_PADDING);
if (status != -1) {
NSData* encryptedMessage = [NSData dataWithBytes:enc length:keylen];
NSLog(#"Encryption SUCCESSFUL: %#", encryptedMessage);
}
else {
ERR_load_crypto_strings();
ERR_error_string(ERR_get_error(), err);
NSLog(#"Encryption failed with error: %s", err);
}
free(enc);
free(err);
So first I'm creating big integers out of my NSData modulus and exponent. You already have them as big integers, but if they're not represented as OpenSSL's BIGNUM type, you'll have to convert them. There are other useful functions for creating BIGNUMs, like BN_hex2bn and BN_dec2bn, which build big integers out of C strings containing hexadecimal or decimal numbers. In my case the modulus and exponent are stored as byte arrays in NSData objects, and BN_bin2bn creates a BIGNUM directly from that.
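For instance, since the modulus in your question is printed in decimal, BN_dec2bn would be the direct route. A quick sketch (decimalModulus stands for that big decimal string):

BIGNUM *mod = NULL;
if (BN_dec2bn(&mod, decimalModulus) == 0) {
    /* conversion failed */
}
char *hex = BN_bn2hex(mod);   /* handy for inspecting the value */
NSLog(@"modulus = %s", hex);
OPENSSL_free(hex);
BN_free(mod);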
Moving on, I create an RSA structure, which represents a key and holds the modulus and exponent, and the enc buffer, which will hold the raw encrypted bytes. The length of enc is the same as the size of the key, because RSA cannot encrypt messages longer than the key.
The main work is done by the RSA_public_encrypt() function. It takes five arguments: the size of the message you're going to encrypt, the actual message bytes, an output buffer for the encrypted message, the RSA key, and a padding scheme. I'm using no padding here, because my message is exactly the same size as the key, but rsa.h has macros for the most common padding schemes.
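Because of that size requirement, a defensive check before calling with RSA_NO_PADDING could look like this (a sketch):

if ((int)message.length != RSA_size(rsa)) {
    NSLog(@"RSA_NO_PADDING needs exactly %d bytes, got %lu",
          RSA_size(rsa), (unsigned long)message.length);
    /* pad the message yourself, or use RSA_PKCS1_OAEP_PADDING instead */
}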
Lastly, I check the status, which holds the number of encrypted bytes, and print an error message if something went wrong.
I hope this will help you and somebody else. Tell me if you managed to do it in Swift. Cheers ;-)
P.S. Adding OpenSSL to your iOS project is easy using CocoaPods. Just add
pod 'OpenSSL-Universal', '1.0.1.k'
to your podfile.
I am trying to make an app for my school that interacts with PowerSchool, software that lets users view their grades, teachers, schedules, and much more. I found a library covering the basics of interacting with PowerSchool written in PHP, and I have been trying to port it to Objective-C for the past week. The issue seems to be how I create an HMAC (MD5) from the user's password; I may be using a hex digest rather than a raw digest, I'm not sure. The error I get back from the server is "an odd number of characters".
Here is the link to the PHP library class I am trying to re-create:
https://github.com/horvste/powerapi-php/blob/master/src/PowerAPI/Core.php
Here is the code from my test project.
Command-line main class:
https://gist.github.com/anonymous/c40cdd99a826c06073aa
NSString Category Implementation file:
#import "NSString+MyAdditions.h"
@implementation NSString (MyAdditions)
- (NSString *) hmacMD5WithData: (NSString *) data
{
const char *cKey = [self cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
const unsigned int blockSize = 64;
char ipad[blockSize], opad[blockSize], keypad[blockSize];
unsigned int keyLen = strlen(cKey);
CC_MD5_CTX ctxt;
if(keyLen > blockSize)
{
//CC_MD5(cKey, keyLen, keypad);
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, cKey, keyLen);
CC_MD5_Final((unsigned char *)keypad, &ctxt);
keyLen = CC_MD5_DIGEST_LENGTH;
}
else
{
memcpy(keypad, cKey, keyLen);
}
memset(ipad, 0x36, blockSize);
memset(opad, 0x5c, blockSize);
int i;
for(i = 0; i < keyLen; i++)
{
ipad[i] ^= keypad[i];
opad[i] ^= keypad[i];
}
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, ipad, blockSize);
CC_MD5_Update(&ctxt, cData, strlen(cData));
unsigned char md5[CC_MD5_DIGEST_LENGTH];
CC_MD5_Final(md5, &ctxt);
CC_MD5_Init(&ctxt);
CC_MD5_Update(&ctxt, opad, blockSize);
CC_MD5_Update(&ctxt, md5, CC_MD5_DIGEST_LENGTH);
CC_MD5_Final(md5, &ctxt);
const unsigned int hex_len = CC_MD5_DIGEST_LENGTH*2+2;
char hex[hex_len];
for(i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
{
snprintf(&hex[i*2], hex_len-i*2, "%02x", md5[i]);
}
NSData *HMAC = [[NSData alloc] initWithBytes:hex length:strlen(hex)];
NSString *hash = [HMAC base64EncodedStringWithOptions:0];
return hash;
}
@end
Thank you for taking the time to look at this issue!
First, don't build your own HMAC routine here. Use CCHmac. It's built-in and handles HMAC+MD5 correctly.
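A sketch of what that looks like, reusing cKey and cData from the question's code (CCHmac is a one-shot function declared in CommonCrypto/CommonHMAC.h):

#include <CommonCrypto/CommonHMAC.h>

unsigned char mac[CC_MD5_DIGEST_LENGTH];
CCHmac(kCCHmacAlgMD5,
       cKey, strlen(cKey),     /* key bytes */
       cData, strlen(cData),   /* message bytes */
       mac);                   /* 16 raw digest bytes, not a hex string */
/* Send whichever form the server expects: the hex encoding of mac, or the
   Base64 of the raw bytes -- not Base64-of-hex as in the question's code. */
NSData *raw = [NSData dataWithBytes:mac length:CC_MD5_DIGEST_LENGTH];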
If at all possible, I recommend going to the API documentation rather than trying to reverse engineer another code base. There are lots of little things going on in the PHP that you may be overlooking; an API doc should explain all of those.
If the PHP code is the only reference you have, then you should break down each piece and see where it's going wrong. For instance, verify that you are getting the auth data in the same form. Then confirm that each program, given the same auth data generates the same HMAC. Then confirm that given the same HMAC, each program generates the same response. Etc. Somewhere you are doing something differently. Make sure that you're using Base64 vs raw data in the same places (PHP devs tend to treat Base64 strings as though they were actually raw data, which causes confusion when coming over to ObjC).
And of course you should examine the server logs to validate that your final request matches the PHP requests.
I have been trying to solve my issue for a week and you are my only hope!
I receive a string from a user via a UITextView and convert it using the following code:
unsigned char* pAr = [myuitextview.text UTF8string];
Then, after some work, I want to show the resulting pAr in the myuitextview using this code:
Myuitextview.text = [NSString stringWithUTF8string:pAr];
As a result I see a blank myuitextview.
After investigating, I discovered that stringWithUtf8string returns nil, even though I received the string as UTF-8 and recreate it with UTF-8.
Then I discovered that UTF8string returns nil as well.
I also discovered that this happens when I use unsigned char* instead of the const char* returned by the utf8string method. When I receive it into a const char* I get a C string back, but with unsigned char* it returns nil.
What can be the reason for nil from stringwithutf8string? As I understand it, unsigned char * and const char * are safe to cast between?
You can't convert an unsigned char * to an NSString quite that easily.
Try this:
NSString *s = [[NSString alloc] initWithBytes:pAr length:strlen((const char *)pAr) encoding:NSUTF8StringEncoding];
NSString has a method "UTF8String" (note the use of caps).
I don't see a method "utf8string" (all lower case). Where does that method come from? (Objective-C method names are case-sensitive, so those are different method names.)
Also, note that the docs for UTF8String say:
The returned C string is a pointer to a structure inside the string object, which may have a lifetime shorter than the string object and will certainly not have a longer lifetime. Therefore, you should copy the C string if it needs to be stored outside of the memory context in which you called this method.
If you use that method, you need to make sure that your text field sticks around and that you don't change its string contents, or the pointer you got back from the UTF8String method will become invalid.
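Putting both answers together, a sketch of a safe round trip (copying the C string so its lifetime no longer depends on the text view):

const char *utf8 = [myuitextview.text UTF8String];   // note the capital S
char *pAr = strdup(utf8);              // private copy, safe to modify
/* ... work on pAr ... */
myuitextview.text = [NSString stringWithUTF8String:pAr];
free(pAr);

Note that if the work on pAr produces bytes that are not valid UTF-8, stringWithUTF8String: will return nil again, which may be the real cause of the blank text view.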