I want to know if there might be a problem uploading images to my website when I have several EC2 servers behind a load balancer without stickiness. I thought about using Uploadify, but I am not sure if it will work every time users upload files to the server. Any insight on that?
It depends on where you end up saving the files. Typically for a farm, you are better off saving the files on S3 (since you are already on EC2); that way, all servers have access to them. Saving a file on the particular server handling that particular session doesn't scale at all. Ideally, you want your servers to be completely stateless so you can scale them up or down, and saving files on a particular server makes them stateful.
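For example, here is a minimal sketch of that server-side save using the aws-sdk-s3 gem; the bucket name, key scheme, and region are placeholders, and credentials are assumed to come from the environment or an EC2 instance role:

    require 'aws-sdk-s3'
    require 'securerandom'

    # Push an uploaded file straight to S3 instead of the local disk, so any
    # server behind the load balancer can later serve it.
    def save_upload(uploaded_io, original_filename)
      s3  = Aws::S3::Resource.new(region: 'us-east-1')
      key = "uploads/#{SecureRandom.uuid}-#{original_filename}"
      s3.bucket('my-app-uploads').object(key).put(body: uploaded_io)
      key # store this key in your database, not a server-local path
    end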
What library you use on the client to upload is irrelevant for this discussion. Uploadify seems to do the job very well.
Related
I've built an app where users can upload their avatars. I used the Paperclip gem and everything works fine on my local machine. On Heroku everything works fine until the server restarts; then all uploaded images disappear. Is it possible to keep them on the server?
Note: I probably should use a service such as Amazon S3 or Google Cloud. However, each of those services requires credit card or bank account information, even to use a free tier. This is a small app just for my portfolio, and I would rather avoid handing over that information.
No, this isn't possible. Heroku's filesystem is ephemeral and there is no way to make it persistent. You will lose your uploads every time your dyno restarts.
You must use an off-site file storage service like Amazon S3 if you want to store files long-term.
(Technically you could store your images directly in your database, e.g. as a bytea column in Postgres, but I strongly advise against that. It's not very efficient, and then you have to worry about how to serve the saved files to the browser. Go with S3 or something similar.)
I'm struggling to find an answer to this. I have a website that is deployed in a shared hosting environment. I want to allow people to upload files to my Azure Blob Storage account.
I have this working locally, using the storage emulator, however when I publish the site I get a Security Exception.
Is this actually possible under a shared hosting environment?
Cheers
A bit more detail would help in understanding how these uploads are taking place. That said, I'll assume that people are uploading directly to Blob Storage, and not through your website (or web service).
To allow direct uploads, you need to either make a blob or container public (which everyone in the world can see), or create a temporary Shared Access Signature (SAS) on a specific blob or container that grants access for a short time window.
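For illustration, here is a sketch of generating a short-lived write SAS with the azure-storage-common Ruby gem; the account name, key, and blob path are placeholders, and the exact helper API is an assumption on my part (check it against the SDK version you actually use):

    require 'azure/storage/common'
    require 'time'

    # Placeholders: real values come from your storage account settings.
    signer = Azure::Storage::Common::Core::Auth::SharedAccessSignature.new(
      'myaccount', 'base64-account-key'
    )

    # Grant create + write on a single blob for the next 30 minutes.
    sas = signer.generate_service_sas_token(
      'uploads/photo.jpg',
      service:     'b',   # blob service
      resource:    'b',   # a single blob
      permissions: 'cw',
      expiry:      (Time.now.utc + 30 * 60).iso8601
    )

    upload_url = "https://myaccount.blob.core.windows.net/uploads/photo.jpg?#{sas}"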
If your app is Silverlight, then you are probably running into a cross-domain issue (and you'll need to correct that with a client access policy).
If you provide more details around the way uploads are being sent, as well as the client and server technology, I can edit my answer to be more specific.
I need to set up a server so that files can be uploaded from an iOS app. I don't know how best to proceed. I thought about FTP, but I'm not sure if there is a better option.
Any ideas appreciated.
GC
Also, I must add that I will be building the iOS app myself, so I can use server APIs in my code.
It's not ideal to set up a blind file/FTP server and hardcode the details into your app, because all it takes is one person to intercept the login details and they have access to your server, where they can upload (and potentially execute) bad things.
A possible idea is to set up an API frontend on your server in a language of your choice (PHP, Ruby, Python or similar) to which you can POST images. In the API frontend you can also do validation, to ensure that only valid images are uploaded and anything nefarious is thrown away. Then, in your iOS app, you write code to interact with your API frontend and send the actual images, which are then stored on your server.
This is a fairly conceptual idea rather than an absolute implementation; it requires some thinking/reading and more setup/coding on the server side. A rough sketch of the upload endpoint follows.
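Here is a minimal sketch of such an endpoint using Sinatra; the route name, MIME whitelist, and storage directory are all assumptions, and a real version would also authenticate the caller:

    require 'sinatra'

    ALLOWED_TYPES = %w[image/jpeg image/png image/gif]    # assumption: images only
    UPLOAD_DIR    = File.expand_path('uploads', __dir__)  # assumption: local storage

    # Hypothetical endpoint the iOS app POSTs multipart form data to.
    post '/upload' do
      file = params[:image]
      halt 400, 'missing file' unless file && file[:tempfile]
      halt 415, 'bad type'     unless ALLOWED_TYPES.include?(file[:type])

      # Never trust the client-supplied filename; reduce it to a safe basename.
      safe_name = File.basename(file[:filename]).gsub(/[^\w.\-]/, '_')
      Dir.mkdir(UPLOAD_DIR) unless Dir.exist?(UPLOAD_DIR)
      File.binwrite(File.join(UPLOAD_DIR, safe_name), file[:tempfile].read)
      'ok'
    end

On the iOS side you would then send a standard multipart/form-data POST with the image under the image field.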
Edit: Just to add, if you only want a central location to store/get your images without controlling it on a per user basis then you may want to look into Amazon S3 as a File Server.
I'm building a Ruby on Rails app that scrapes images off a website. What is the best location to save these images to?
Edit:
To be clear, I know the file system is the best type of storage, but where on the file system? I suppose I have to stay in the RoR app directory, but which folder is most suitable for this? public?
Your options are a file server (a static Apache server), your app server (save somewhere on the local disk and serve via the app server), or Amazon S3.
But I would suggest not storing them in the database. (Some people think that's alright, so take this as just a suggestion.)
In RoR, that means under <app_name>/public/images -- but the data will be public. If you are worried about privacy, this is probably not right.
If you are concerned about privacy, see the options discussed in How to store private pictures and videos in Ruby on Rails. But as a suggestion: serving files from the app server can be painful under high traffic, and in my experience it is better off-loaded to a file server or a cloud service like S3.
It's not hard to write and/or configure a server that only serves images from a file store outside your website's directory structure. A simple rewrite of the URL can give your code the info it needs to find the actual file location, which it then outputs to the browser.
An alternative is to map the image's URL to the image's directory path in a database, then do a lookup. Make the URL field an indexed column and the lookup will be very fast.
I wrote an image server in Ruby a couple years ago along those lines and it was a pretty simple task.
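Something along those lines, as a minimal sketch (a config.ru Rack app; the storage path and the hard-coded content type are assumptions):

    # config.ru -- run with `rackup`. Serves images stored outside the docroot.
    IMAGE_ROOT = '/var/data/images'   # placeholder path outside the website's tree

    run lambda { |env|
      # Strip any directory components so "../" can't escape IMAGE_ROOT.
      name = File.basename(env['PATH_INFO'].to_s)
      path = File.join(IMAGE_ROOT, name)

      if File.file?(path)
        # Assumes JPEGs; a real server would derive the type from the extension.
        [200, { 'Content-Type' => 'image/jpeg' }, [File.binread(path)]]
      else
        [404, { 'Content-Type' => 'text/plain' }, ['not found']]
      end
    }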
Imagine the following use case:
You have a Basecamp-style application hosting files with S3. Accounts all have their own files, but they are stored on S3.
How, therefore, would a developer go about securing files so users of account 1 couldn't somehow get to the files of account 2?
We're talking Rails if that's a help.
S3 supports signed, time-expiring URLs, which means you can give a user a URL that effectively lets only people with that link view the file, and only within a certain time window from issue.
http://www.miracletutorials.com/s3-amazon-expiring-urls/
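With the aws-sdk-s3 gem, generating such a URL is short; the region, bucket, and key below are placeholders:

    require 'aws-sdk-s3'

    # Generate a GET URL that expires 15 minutes after issue.
    obj = Aws::S3::Resource.new(region: 'us-east-1')
                           .bucket('my-private-bucket')
                           .object('account-1/report.pdf')

    url = obj.presigned_url(:get, expires_in: 900)
    puts url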
If you want to retain control of those remote resources, you could proxy the files through your app. For something like S3 this may defeat the purpose of what you are trying to do, but it still lets you keep the data with Amazon while restricting access.
You should be careful with an approach like this, as it could cause your Ruby thread to block while it proxies the file, which could become a real problem for the application.
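A sketch of that proxying approach in a Rails controller; the current_account helper and Attachment model are hypothetical, but the point is that authorization happens in your app before any bytes leave S3:

    class AttachmentsController < ApplicationController
      def show
        # Scoping the lookup to the signed-in account is the authorization
        # check: account 1 can never find account 2's rows.
        attachment = current_account.attachments.find(params[:id])

        object = Aws::S3::Client.new.get_object(
          bucket: 'my-private-bucket',   # placeholder bucket
          key:    attachment.s3_key
        )

        send_data object.body.read,
                  filename: attachment.filename,
                  type:     attachment.content_type
      end
    end

Note that the whole object is read into memory here, which is exactly the blocking cost described above.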
Serve the files using an EC2 Instance
If you set your S3 bucket to private and then start up an EC2 instance, you could serve your S3 files via EC2, using the EC2 instance to verify permissions based on your application's rules. Because there is no charge for data transfer between EC2 and S3 within the same region, you don't have to double up your bandwidth costs at Amazon.
I haven't tackled this exact issue. But that doesn't stop me from having an opinion :)
Check out cancan:
http://github.com/ryanb/cancan
http://railscasts.com/episodes/192-authorization-with-cancan
It allows custom authorization schemes, without too much hassle.
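For the use case above, the Ability class might look something like this; the Attachment model and its account_id column are assumptions:

    # A minimal sketch of a CanCan ability scoping files to the user's account.
    class Ability
      include CanCan::Ability

      def initialize(user)
        # Assumption: every Attachment row carries the owning account_id.
        can :manage, Attachment, account_id: user.account_id
      end
    end

Combined with load_and_authorize_resource in the controller, users of account 1 simply never see account 2's records.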