I want to build a website where people can upload files to my S3 bucket via a Rails app. I want the uploads to be encrypted so that I have no knowledge of what is being uploaded, and I want only the user to have the key to decrypt them.
Could someone give me some suggestions on how to go about this or some methods of achieving this?
You can only do this by encrypting locally on the client; anything that happens server-side (or even at the ISP) can be manipulated, so there is no guarantee about what is actually delivered.
Lichtamberg is right: the best and most secure way would be for the user to do it client-side. Perhaps you could tell them which encryption formats are accepted (such as GPG) and provide instructions for doing so, or recommend tools that make it easier.
You could probably enforce this in your code by checking whether an uploaded file is encrypted, and rejecting it if not. The check would be similar to an image upload feature that rejects non-image files, for instance.
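If you go that route, a minimal sketch of the server-side check (assuming the client uploads ASCII-armored GPG output and the Rails controller receives it as params[:upload]; the helper name is made up) could look like this:

PGP_ARMOR_HEADER = "-----BEGIN PGP MESSAGE-----"

# Returns true if the upload starts with the ASCII-armor header that
# gpg --armor --encrypt produces.
def looks_encrypted?(uploaded_file)
  return false unless uploaded_file.respond_to?(:read)
  first_bytes = uploaded_file.read(PGP_ARMOR_HEADER.bytesize)
  uploaded_file.rewind
  first_bytes.to_s.start_with?(PGP_ARMOR_HEADER)
end

# In the controller action:
# return head :unprocessable_entity unless looks_encrypted?(params[:upload])

Note this only verifies that the file looks like PGP data; the actual encryption (and the key) still has to stay on the client side.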
My question is simple: I need to save some sensitive static data, for example the URL of my service or an encryption password. My doubt is: is it secure to save this data in Localizable.strings?
No. A malicious user can easily see this in the IPA from an iTunes backup, and the same goes for any other file in your app bundle. You will need to encrypt the string somehow. The tricky part is to hide the key as well; it may be a good idea to compute the key at runtime somehow (you can be creative here).
Also pay attention to securing your transmission: if you use plain HTTP, anyone with Wireshark can see your sensitive information. Make sure you've set up HTTPS correctly and that you are validating the server's certificate on connect (search Stack Overflow about that).
I totally agree with @DarkDust. Just to add a few more things:
A malicious user can get at the data by jailbreaking one of their devices, installing the app, and extracting the entire contents of the app bundle. They may even change some of the code and run it.
This whole process of extracting the data is called reverse engineering. It's quite a wide field, and it's good to know the basics if you care about data security.
You can read more about reverse engineering in, for example, this free book: https://github.com/iosre/iOSAppReverseEngineering.
A determined hacker always gets the data eventually; it's just a matter of time. For you, as a developer, the task is to make getting the data impractical for less experienced "hackers".
To make things more difficult, you can obfuscate the data.
If you need to save credentials in the app (e.g. a login token), always use the keychain, never any other storage.
I have a RoR web app that allows users to upload images, and I use Cloudinary as cloud storage. I read their documentation and found a cool feature called "direct uploading" which reduces my server's load. To my knowledge, the idea is to change the workflow from
image -> server -> Cloudinary
to
image -> Cloudinary
so that my server only stores a Cloudinary URL in the database, not the image file (tell me if I'm wrong, thanks).
So my question is: how do I check whether I have switched to the "direct uploading" method successfully? Should I open the browser's inspector to see the time cost of each POST and GET request? Are there better options?
I expect a big improvement from this approach, but how can I actually see it?
Thanks from a rookie =)
# The app is deployed on Heroku.
# I haven't switched to the direct uploading method yet.
# This app is private and only serves around 10 people.
You can indeed (and it is highly recommended to) bypass your server and let Cloudinary take care of the upload processing directly. This reduces the work on your server to simply storing the uploaded image's details, while the image itself is stored directly in your Cloudinary account, which also speeds up the upload process. You can test the sample project, which demonstrates both server-side and client-side uploads.
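As a rough sketch of what the direct-upload setup can look like with the cloudinary gem (helper names taken from the gem's documentation; exact usage may vary by version), the two helpers below render a file field that posts straight to Cloudinary. The easiest way to verify the switch is the browser's network tab: with direct uploading, the multipart POST goes to api.cloudinary.com instead of your Heroku app, and your form submit only carries a small identifier string back to your server rather than the file itself.

<%# layout, once: outputs Cloudinary's signed direct-upload JS config %>
<%= cloudinary_js_config %>

<%# form: renders a file field that uploads straight to Cloudinary %>
<%= cl_image_upload_tag(:avatar) %>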
I am trying to hide 2 secrets that I am using in one of my apps.
As I understand it, the keychain is a good place, but I cannot add them to it before I submit the app.
I thought about this scenario:
Pre-seed the secrets in my app's Core Data database, spreading them across other entities to obscure them (I already have a seed DB in that app).
When the app launches for the first time, assemble the keys and move them to the keychain.
Delete the records from Core Data.
Is that safe, or can a hacker watch this happening and get those keys?
THIRD EDIT:
Sorry for not explaining this scenario from the beginning. The app has many levels, and each level contains files (audio, video, images). The user can purchase a level (IAP), and after the purchase is completed I need to download the files to their device.
For iOS 6 the files are stored with Apple's new "Hosted Content" feature. For iOS 5 the files are stored in Amazon S3.
So in this whole process I have 2 keys:
1. IAP key, for verifying the purchase at Apple IAP.
2. S3 keys, for getting the files from S3 for iOS 5 users:
NSString *secretAccessKey = @"xxxxxxxxx";
NSString *accessKey = @"xxxxxxxxx";
Do I need to protect those keys at all? I am afraid that people will be able to get the files from S3 without purchasing the levels, or that hackers will be able to build a hacked version with all the levels pre-downloaded inside.
Let me try to break your question down into several subquestions and assumptions:
Assumptions:
a) The keychain is a safe place
Actually, it's not that safe. If your application is installed on a jailbroken device, a hacker will be able to get your keys out of the keychain.
Questions:
a) Is there a way to put a key into an app (a binary delivered from the App Store) and be completely secure?
The short answer is no. As soon as something is in your binary, it can be reverse engineered.
b) Will obfuscation help?
Yes. It will increase the time it takes a hacker to figure things out. If the keys in your app "cost" less than the time spent reverse engineering them, then generally speaking you are fine.
However, in most cases security through obscurity is bad practice: it gives you the feeling that you are secure when you aren't.
So this can be one of your security measures, but you need to have other security measures in place too.
c) What should I do in such a case?
It's hard to give you a good solution without knowing the background of what you are trying to do.
For example, why should everybody have access to the same Amazon S3 bucket? Do they need read-only access, or write access (as pointed out by Kendall Helmstetter Gein)?
I believe one of the most secure scenarios would be something like this:
Your application is passcode-protected
The first time the user opens your application, it asks them to authenticate (enter their username and password) against the server
This authenticates against your server or another authentication provider (e.g. Google)
The server sends an authentication token to the device (quite often some type of cookie)
You encrypt this token with a hash of your application passcode and save it in the keychain in that form
And now you can do one of two things:
hand over specific keys from the server to the client (so each client will have their own keys) and encrypt them with the hash of your application passcode
handle all S3 operations on the server, and require the client to send its token with each request (a sketch follows below)
This way you protect against multiple possible attacks.
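A rough sketch of the second option on the Rails side, assuming the aws-sdk-s3 gem (the bucket name, region and the authenticate_token! filter are placeholders):

require "aws-sdk-s3"

class DownloadsController < ApplicationController
  before_action :authenticate_token!  # verifies the token issued at login

  def show
    object = Aws::S3::Resource.new(region: "us-east-1")
                              .bucket("my-level-assets")
                              .object(params[:key])

    # The client never sees an S3 key, only a short-lived, read-only URL.
    render json: { url: object.presigned_url(:get, expires_in: 300) }
  end
end

The same pattern covers uploads by presigning a :put instead of a :get.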
d) Whoa... I don't plan to implement all of the stuff you just wrote, because it would take me months. Is there anything simpler?
I think it would be useful, at a minimum, to have one set of keys per client.
If even this is too much, then download encrypted keys from the server, save them in encrypted form on the device, and hardcode the decryption key into your app. I would say this is minimally invasive, and at least your binary doesn't ship with the keys in it.
P.S. Both Kendall and Rob are right.
Update 1 (based on new info)
First of all, have you seen the In-App Purchase Programming Guide?
There is a very good diagram under "Server Product Model". This model protects against somebody who didn't buy the new levels: no Amazon keys are embedded in your application, and your server hands over the levels when it receives a receipt of purchase.
There is no perfect way to protect against somebody who purchased the content (and then decides to rip it out of your application), because at the end of the day your application will have the content downloaded to the device and will need it in plain (unencrypted) form at some point.
If you are really concerned about this case, I would recommend encrypting all your assets and handing them over from the server in encrypted form, together with the encryption key. The encryption key should be generated per client, and the assets should be encrypted with it.
This won't stop an advanced hacker, but at least it protects against somebody using iExplorer and just copying the files (since they will be encrypted).
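As a minimal illustration of that idea (not necessarily what the answer has in mind; every name below is made up), the server-side step could be plain AES-GCM with a per-client key:

require "openssl"
require "securerandom"

# One random key per client, stored server-side against the client's account.
client_key = SecureRandom.random_bytes(32)

def encrypt_asset(plaintext, key)
  cipher = OpenSSL::Cipher.new("aes-256-gcm")
  cipher.encrypt
  cipher.key = key
  iv = cipher.random_iv
  ciphertext = cipher.update(plaintext) + cipher.final
  { iv: iv, tag: cipher.auth_tag, data: ciphertext }
end

payload = encrypt_asset(File.binread("levels/level1/intro.mp4"), client_key)
# Ship payload (and, per the answer above, the key) to that client over HTTPS.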
Update 2
One more thing regarding Update 1: you should store the files encrypted and keep the encryption key somewhere safe (e.g. in the keychain).
If your game requires an internet connection anyway, the best idea is to not store the encryption key on the device at all; you can fetch it from the server each time the app starts.
DO NOT store an S3 key with write access in your app! In short order someone sniffing traffic will see the write calls to S3, and in shorter order they will find that key and do whatever they like with it.
The ONLY way an application can write content to S3 with any degree of security is by going through a server you control.
If it's a read-only key, meaning your S3 bucket cannot be read publicly but the key grants read-only access with no ability to write, then you could embed it in the application, but anyone who wants to can pull it out.
To lightly obscure pre-loaded sensitive data, you could ship it encrypted in a file; the app can read it into memory, decrypt it, and then store it in the keychain. Again, someone will be able to get to these keys, so it had better not matter much if they do.
Edit:
Based on the new information, you are probably better off just embedding the secrets in code. Using a tool like iExplorer, a casual user can easily get to a Core Data database or anything else in your application bundle, whereas the compiled binary is at least somewhat protected by encryption. Someone with a jailbroken device can easily get the unencrypted version, but it can still be hard to find meaningful strings; perhaps store them in two parts and reassemble them in code.
Again it will not stop a determined hacker but it's enough to keep most people out.
You might also want to add some code that asks your server whether there are any override secrets it can download. That way, if the secrets are ever leaked, you can react quickly by changing the secrets your app uses while shutting out anyone relying on a copied secret. To start with, there would be no override to download. The point is that you don't want to have to wait for an application update to be able to use new keys.
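On the server, that override check could be as simple as the following sketch (the route, storage and response shape are all made up):

# GET /secret_override -- returns 204 No Content until you actually rotate a secret.
class SecretOverridesController < ApplicationController
  def show
    override = Rails.cache.read("override_secret")  # or a DB column, config, etc.
    if override
      render json: { secret: override }
    else
      head :no_content
    end
  end
end

The app would call this at launch and swap in the downloaded value whenever one is returned.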
There is no good way to hide a secret in a piece of code you send to your attacker. As with most things of this type, you need to focus more on how to mitigate the damage when the key does leak than on spending unbounded time trying to protect it. For instance, generating different keys for each user allows you to disable a key if it is being used abusively, and working through an intermediary server lets you control the protocol (i.e. the server holds the key and is only willing to do certain things with it).
It is not a waste of time to do a little obfuscating. That's fine. But don't spend a lot of time on it. If it's in the program and it's highly valuable, then it will be hacked out. Focus on how to detect when that happens, and how to recover when it does. And as much as possible, move that kind of sensitive data into some other server that you control.
How can I accept large file uploads of around 250 MB?
http://dropitto.me/ seems OK, but it only allows uploads up to 75 MB, it requires your actual Dropbox account password, and it does not use HTTPS for authentication, so there are a few red flags there.
I have a Dropbox Pro account and EC2 and S3 resources. I'm looking for a method that lets non-technical users send files of between 100 and 250 MB.
I'm not crazy about using FTP because I think it might be too technical for some users to set up. One option might be to ask them to register for a dropbox.com account, install the client, and share a folder, or for me to share a folder with them to initiate the process.
But the real reason I'm asking this on StackOverflow is because I hope that there is some library that is really useful for doing this kind of stuff - and would be fast for the end users since the backbone could be "on the cloud". I don't really care what language it's written in.
Also let me just say that I'm not crazy about the idea of using rapidshare.com or megaupload.com or a service like that, but let me know if you would support those as the solution.
Check out http://kicksend.com: they allow uploads up to a certain size in the browser, but they also have Mac/Windows applications that make basically unlimited file transfers very easy.
I have seen quite a few code samples/plugins that promote uploading assets directly to S3. For example, if you have a user object with an avatar, the file upload field would upload directly to S3.
The only way I see this being possible is if the user object is already created in the database and your S3 bucket + path is something like
user_avatars.domain.com/some/id/partition/medium.jpg
But then if you had an image tag that tried to access that URL when an avatar was not uploaded, it would yield a bad result. How would you handle checking for existence?
Also, it seems like this would not work well for most has_many associations. For example, if a user had many songs/MP3s, where would you store those and how would you access them?
Also, your validations will be shot.
I am having trouble thinking of situations where direct upload to S3 (or any cloud) is a good idea and was hoping people could clarify either proper use cases, or tell me why my logic is incorrect.
Why pay for storage/bandwidth/backups/etc. when you can have somebody in the cloud handle it for you?
S3 (and other Cloud-based storage options) handle all the headaches for you. You get all the storage you need, a good distribution network (almost definitely better than you'd have on your own unless you're paying for a premium CDN), and backups.
Allowing users to upload directly to S3 takes even more of the bandwidth load off of you. I can see the tracking concerns, but S3 makes it pretty easy to handle that situation. If you look at the direct upload methods, you'll see that you can force a redirect on a successful upload.
Amazon will then pass the following to the redirect handler: bucket, key, etag
That should give you what you need to track the uploaded asset after success. Direct uploads give you the best of both worlds. You get your tracking information and it unloads your bandwidth.
Check this link for details: Amazon S3: Browser-Based Uploads using POST
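For reference, with the current aws-sdk-s3 gem, generating the policy and signed fields for such a browser-based POST takes only a few lines (the bucket name and redirect URL are placeholders):

require "aws-sdk-s3"

bucket = Aws::S3::Resource.new(region: "us-east-1").bucket("user-uploads-example")
post   = bucket.presigned_post(
  key: "uploads/${filename}",
  success_action_redirect: "https://example.com/uploads/complete"
)

post.url     # the form's action attribute (the bucket endpoint)
post.fields  # hidden inputs: key, policy, signature/credential fields, etc.

The redirect back to success_action_redirect carries the bucket, key and etag mentioned above.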
If you are hosting your Rails application on Heroku, the reason could very well be that Heroku doesn't allow file uploads larger than 4 MB:
http://docs.heroku.com/s3#direct-upload
So if you would like your users to be able to upload large files, this is the only way forward.
Remember how web servers work.
Unless you're using an async web setup like you could achieve with Node.js or Erlang (just two examples), every upload request your web application serves ties up an entire process or thread while the file is being uploaded.
Imagine that you're uploading a file that's several megabytes large. Most internet users don't have tremendously fast uplinks, so your web server spends a lot of time doing nothing. While it's doing all of that nothing, it can't service any other requests. Which means your users start to get long delays and/or error responses from the server. Which means they start using some other website to get the same thing done. You can always have more processes and threads running, but each of those costs additional memory which eventually means additional $.
By uploading straight to S3, in addition to the bandwidth savings that Justin Niessner mentioned and the Heroku workaround that Thomas Watson mentioned, you let Amazon worry about that problem. You can have a single-process webserver effectively handle very large uploads, since it punts that actual functionality over to Amazon.
So yeah, it's more complicated to set up, and you have to handle the callbacks to track things, but if you deal with anything other than really small files (and even in those cases), why cost yourself more money?
Edit: fixing typos