I'm trying to generate an RSA-2048 key pair in ActionScript. The library I'm using is the Flame crypto library.
It takes more than 60 seconds, so it exceeds the maximum script execution time and halts.
So I compiled a C math library (libtommath) into a SWC using CrossBridge and called its prime number generation function. But it still takes more than 40 seconds.
How can I solve this problem?
RSA key generation is a dog. It's probably the most demanding algorithm you'll ever encounter. It requires a lot of random data, and the prime tests are dog slow as well. Furthermore, RSA key pair generation time is rather unpredictable; it may find a prime quickly or it may not. It doesn't help that scripting languages are not very good at this kind of low-level work.
There are two things you could do:
break out to OpenSSL; it is one of the fastest crypto libraries out there;
create the key pair in a secure environment and copy it securely into the ActionScript environment.
That last option seems strange, as you may want the user to generate the key pair instead. But ask yourself this question: how can you trust the public key generated by ActionScript?
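To illustrate that second option: the sketch below is not ActionScript; it assumes the pair is generated in some native environment you trust (here Apple's Security framework; the openssl command-line tool would do the same job) and that only the resulting key material is handed to the Flash runtime.

    import Foundation
    import Security

    // Generate an RSA-2048 key pair natively, then export the public key
    // so it can be shipped to the scripting environment.
    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeRSA,
        kSecAttrKeySizeInBits as String: 2048
    ]

    var error: Unmanaged<CFError>?
    guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error),
          let publicKey = SecKeyCopyPublicKey(privateKey),
          let publicKeyData = SecKeyCopyExternalRepresentation(publicKey, &error) as Data? else {
        fatalError("Key generation failed: \(String(describing: error))")
    }

    // PKCS#1 DER bytes of the public key; base64-encode for transport.
    print(publicKeyData.base64EncodedString())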
I'm using AirDrop to transfer application-internal data between two phones. Because AirDrop was intended for file sharing, it could happen that the user accidentally chooses "save the file" and stores the data file in the Files app. Since my app is a financial planning app, I'm considering encrypting the file transferred via AirDrop to keep the user's data secure. The encryption only applies to the temporary file transferred over AirDrop; once the app on the receiving phone receives it, it decrypts the file immediately.
I'm referring to this thread to determine how I should answer the export compliance question if I encrypt the temp file. And I noticed these two exemption items:
(iii) your app uses, accesses, implements or incorporates encryption with key lengths not exceeding 56 bits symmetric, 512 bits asymmetric and/or 112 bit elliptic curve
(iv) your app is a mass market product with key lengths not exceeding 64 bits symmetric, or if no symmetric algorithms, not exceeding 768 bits asymmetric and/or 128 bits elliptic curve.
I don't quite understand the difference between the conditions in the two items (what is a mass market product?). But I don't think either helps, because iOS CryptoKit provides only AES and ChaChaPoly: the former requires a minimum key size of 128 bits and the latter uses a 256-bit key.
Since there are a lot of apps that use AirDrop to transfer application-internal data (I can tell that from the discussions on SO), I wonder how other people deal with this. Is this considered an exemption case?
BTW, I considered other options, but none of them is satisfactory:
Don't encrypt the data; obfuscate it instead (for example, with something like a Caesar cipher). But that feels very unprofessional.
Don't use AirDrop; implement my own data transfer mechanism. For example, start a tiny web server on the sender side and have the receiver fetch the data over HTTPS, which from my understanding is an exemption case. I don't choose this approach because a) AirDrop provides a much better user experience, and b) I'd need Bonjour for service discovery, which requires the local network permission, and I'd like to avoid that if possible.
The answer depends on what cipher you use to encrypt the data.
Apple summarises your requirements in a couple of documents.
First, in the CryptoKit documentation
Typically, the use of encryption that’s built into the operating system—for example, when your app makes HTTPS connections using URLSession—is exempt from export documentation upload requirements, whereas the use of proprietary encryption is not. To determine whether your use of encryption is considered exempt, see Determine your export compliance requirements.
This leads you to this document, which contains a table that I have shown in part.
Assuming that you use AES from Apple's CryptoKit framework, the second clause would apply: you don't need to provide any documentation to Apple, but you should submit a self-classification report to the U.S. government.
The exemptions you listed in your question do not apply, since you wouldn't be using a symmetric cipher with a key length of only 56 or 64 bits.
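For reference, here is a minimal sketch of that scenario using AES-GCM from CryptoKit on the temporary file; the function names are placeholders, and how the 256-bit key is shared between the two phones is a separate problem not covered here.

    import Foundation
    import CryptoKit

    // Hypothetical helpers: seal the temp file before handing it to AirDrop,
    // and open it again on the receiving phone.
    func sealTempFile(at url: URL, with key: SymmetricKey) throws -> Data {
        let plaintext = try Data(contentsOf: url)
        let box = try AES.GCM.seal(plaintext, using: key)  // AES-256-GCM
        return box.combined!                               // nonce + ciphertext + tag
    }

    func openTempFile(_ sealed: Data, with key: SymmetricKey) throws -> Data {
        let box = try AES.GCM.SealedBox(combined: sealed)
        return try AES.GCM.open(box, using: key)
    }

    let key = SymmetricKey(size: .bits256)  // well above the 56/64-bit exemption thresholds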
My app computes a hash of some strings (that identify in-app purchases) using a simple function of my own making. This function is very far from something sophisticated like MD5 - it is just a simple hash function whose result is multiplied a few times by large primes - the whole computation is 8 lines of Swift. The hash is then stored using NSUserDefaults. The app does not do anything else that could be considered encryption.
When submitting my app Apple asks me to fill Export Compliance starting with this question:
Is your app designed to use cryptography or does it contain or incorporate cryptography?
So does it? The Export Compliance is required by Apple due to the US Export Administration Regulations. Here is the regulation guide linked by Apple, and here are some notes about it by Apple.
Incorporating or using hashing is not using encryption; you are not incorporating cryptography.
Cryptographic hash functions are one-way functions; no reversal or decryption is possible, so this is not encryption.
Common Crypto is not Objective-C, it is C.
Using weaker algorithms in place of standard algorithms because it is easier is not professional.
MD5 should not be used in new work; use SHA-256 or better. On an iPhone 6s, SHA-256 is about 4x faster than MD5.
The Common Crypto implementation is FIPS 140-2 certified.
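For example, hashing one of those identifier strings with SHA-256 takes one line with CryptoKit (iOS 13+); the identifier and defaults key below are placeholders.

    import Foundation
    import CryptoKit

    // Replace the home-made hash with SHA-256.
    let productID = "com.example.myapp.pro_upgrade"  // placeholder identifier
    let digest = SHA256.hash(data: Data(productID.utf8))
    let hex = digest.map { String(format: "%02x", $0) }.joined()

    // Store the digest, as described in the question.
    UserDefaults.standard.set(hex, forKey: "purchaseHash")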
Short answer: yes
MD5 is a cryptographic one-way function designed to be difficult to reverse, and it produces a 128-bit digest. Export restrictions require permission for any key length of 56 bits or higher.
I am encrypting downloaded files and saving them locally in the app's documents directory.
To read them, I must decrypt those files and store them somewhere temporarily.
My concerns are:
1. If I store them in the documents directory while they are being used, then for that time window anyone can extract those files using tools like iExplorer.
2. My idea is to store them in memory while they are being used and flush them after use.
This option is good for small files, but for large files - say a 50 MB document or a 100 MB video - I am afraid the app will receive a memory warning and as a result terminate abruptly.
I want to know the best approach for doing this.
There is no perfectly secure way to store files locally. If a person has full access to the device, they can always find a way to decrypt the files, as long as your application itself is able to decrypt them.
The only question is: how much effort is necessary to decrypt the files?
If your only concern is that a person may use iExplorer to copy and open these files, a simple local symmetric encryption will do the trick.
Just embed a random symmetric key in your application and encrypt the data block by block while you download it.
You can use the convenient "Security Transforms" framework to do the symmetric encryption. There are some good examples in the Apple documentation.
When you load the files, you can use the same key to decrypt them while you load them from the file system.
Just to make things clear: this is not perfect protection of the files. But to decrypt them, an attacker needs access to your app binary, has to analyse that binary in a debugger, and has to find the decryption code in order to extract your symmetric key. That is a lot of effort just to decrypt the files.
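A rough sketch of that embedded-key idea, here using CryptoKit's AES-GCM instead of Security Transforms, and encrypting whole files rather than block by block for brevity; the key constant and helper names are placeholders, and as noted above the key can be recovered from the binary.

    import Foundation
    import CryptoKit

    // Key derived from a constant embedded in the binary -- this only stops
    // casual copying with tools like iExplorer, nothing more.
    let embeddedKey = SymmetricKey(data: SHA256.hash(data: Data("com.example.file-vault-key".utf8)))

    // Encrypt a downloaded file before writing it into Documents.
    func writeEncrypted(_ plaintext: Data, to url: URL) throws {
        let sealed = try AES.GCM.seal(plaintext, using: embeddedKey)
        try sealed.combined!.write(to: url, options: .atomic)
    }

    // Decrypt when the file is needed again.
    func readEncrypted(from url: URL) throws -> Data {
        let sealed = try AES.GCM.SealedBox(combined: Data(contentsOf: url))
        return try AES.GCM.open(sealed, using: embeddedKey)
    }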
Split your files into smaller sizes before saving them, then decrypt on load.
Later edit: I noticed this is mentioned in the comments. I agree splitting files isn't the easiest thing in the world, but presumably you'll only need this for video; about 100 MB is a lot of text or audio. If your PDF weighs as much, it's probably scanned text, and you can turn it into a series of images.
And yes, splitting is better done server-side; you don't want the user wasting battery on video processing.
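If you do go the splitting route, something along these lines could produce the pieces; the chunk size and naming scheme are arbitrary choices, not anything the answer above prescribes.

    import Foundation

    // Split a large (already encrypted) file into fixed-size pieces on disk,
    // so that only one piece at a time needs to be decrypted in memory later.
    func splitFile(at source: URL, into directory: URL,
                   chunkSize: Int = 4 * 1024 * 1024) throws -> [URL] {
        let handle = try FileHandle(forReadingFrom: source)
        defer { try? handle.close() }

        var pieces: [URL] = []
        var index = 0
        while let chunk = try handle.read(upToCount: chunkSize), !chunk.isEmpty {
            let pieceURL = directory.appendingPathComponent("piece-\(index)")
            try chunk.write(to: pieceURL, options: .atomic)
            pieces.append(pieceURL)
            index += 1
        }
        return pieces
    }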
Decrypt them, obfuscate them with a toy algorithm (e.g. XOR with a constant block), and store them in Documents. When needed, load them and reverse the obfuscation.
Since the problem has no solution in theory (a determined enough attacker can read your process memory after all), it's as good a solution as any.
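A toy version of that XOR idea might look like the following; the constant block is a placeholder, applying the function twice restores the data, and, as the answer says, this is obfuscation rather than encryption.

    import Foundation

    // XOR every byte against a repeating constant block.
    let pad: [UInt8] = Array("not-a-real-secret".utf8)  // placeholder constant block

    func xorObfuscate(_ data: Data) -> Data {
        var out = Data(capacity: data.count)
        for (i, byte) in data.enumerated() {
            out.append(byte ^ pad[i % pad.count])
        }
        return out
    }

    let plainFileData = Data("example file contents".utf8)  // stand-in for the decrypted file
    let hidden = xorObfuscate(plainFileData)                // store this in Documents
    let restored = xorObfuscate(hidden)                     // identical to plainFileData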
I'm working on a so-called cartridge for the geolocation-based WheriGo (http://wherigo.com) game. The architecture used for these cartridges is 32-bit and big-endian. However, my luac produces chunks that are 64-bit and little-endian.
While there is an online compilation service for WheriGo, I'd rather be able to produce the proper binary format myself, especially because there are things I'd rather keep a bit obscured in a stripped chunk, loaded by loadstring(), rather than having the full debug information available.
So my question is this: how hard would it be to build a Lua toolchain that generates bytecode for a different architecture than the one it is running on?
If the floating-point representation of both machines is compatible, then this should just require modifications to ldump.c and lundump.c.
Take care to ensure that types such as long are the same size on both sides. I have done this for integer Lua on x86 and x64.
You could always run a 32-bit big-endian machine in a VM, e.g. Aurélien's prebuilt images for Debian/mips (notes). It'll be slow, but it works and can be automated easily. (Do a dist-upgrade from squeeze to at least wheezy, then get the latest Lua.)
I've run VMs like that often enough… it's slow, but I think of it as batch processing: I start a job (apt or a compile), then look at it occasionally (or the next day) to see whether it finished. Most of the time this works out pretty well; some things of course do not work right in emulation (e.g. due to emulator bugs or differences), but to get a big-endian 32-bit Lua, this should work.
Suggested reading: lua bytecode portability and middle-endian doubles on ARM (both on the lua mailing list) – since PocketPC machines are mostly ARM, you might run into that. Best to check the actual Wherigo cartridges to see what settings they use…
The gist of these postings is: endianness, sizeof(int), sizeof(size_t), sizeof(Instruction), sizeof(lua_Number), and type of lua_Number must be the same for the bytecode to be compatible across architectures (says Luiz Henrique de Figueiredo), and middle-endian floats (both single and double) do exist in the wild (steve donovan and Dimiter 'malkia' Stanev).
Do tell if you do it – I'm interested because I'm a geocacher myself (though need to figure out how to play cartridges, no player for my platforms).
I'm a very newb programmer trying to write some iOS programs, and when I reached the part where I must encrypt my data, I ran into a misty and ill-documented wall. Apple apparently provides all the tools one needs to encrypt data but doesn't write about it anywhere. Currently I am experimenting with the code found in https://github.com/AlanQuatermain/aqtoolkit, which apparently works. However, I read in http://robnapier.net/blog/aes-commoncrypto-564 that one should not use user-selected passwords directly as encryption keys, but I have seen a few examples of people using the user's password directly with this library and others. Does this apply here, and should I run the user password through a small hurdle race before using it?
It is a good idea to use a hash of the password, rather than the password itself, as the key for your crypto routines. One reason is that different algorithms need keys of different lengths, and by selecting an appropriate hashing algorithm (e.g. SHA-256 for AES-256) you automatically get a key of the right length.
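In CryptoKit terms, that advice amounts to something like the sketch below; note that a salted, iterated KDF such as PBKDF2 is generally preferred over a plain hash, which is the point of the linked article.

    import Foundation
    import CryptoKit

    // Turn a user-supplied password into a 256-bit AES key by hashing it.
    let password = "correct horse battery staple"  // user input
    let key = SymmetricKey(data: SHA256.hash(data: Data(password.utf8)))

    // Use the derived key, e.g. with AES-GCM (try! only for brevity here).
    let sealed = try! AES.GCM.seal(Data("secret note".utf8), using: key)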