rails heroku app/assets/images or upload to AWS/Cloudinary - ruby-on-rails

In my application I have around 400 images that need to be displayed at various times. There will be no user uploaded imagery. In other words, I control all the pictures being used within my application.
I'm wondering what the recommended route is. Would it be best to put all the images in app/assets/images or would it be better to upload all of them to a 3rd party service like AWS?
The application will eventually be hosted on Heroku. Thanks.

According to this question (and its first comment), your total compiled code and assets cannot exceed 100MB. As long as you stay under this, you'll be fine with Heroku. However, if you exceed that, or if the number of files will change dramatically or frequently, I'd recommend Cloudinary, which gives you 500MB of free file storage and is available as a Heroku add-on.
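A quick way to sanity-check that guidance is to total the bytes under app/assets/images before deploying. A minimal Ruby sketch (the demo directory and file sizes here are made up for illustration):

```ruby
require "tmpdir"

MAX_SLUG_BYTES = 100 * 1024 * 1024 # the 100MB limit mentioned above

# Sum the size of every file under a directory (e.g. app/assets/images).
def total_asset_bytes(dir)
  Dir.glob(File.join(dir, "**", "*"))
     .select { |f| File.file?(f) }
     .sum { |f| File.size(f) }
end

# Demo against a throwaway directory standing in for app/assets/images.
Dir.mktmpdir do |dir|
  File.write(File.join(dir, "logo.jpg"), "x" * 20_000) # ~20 KB image
  File.write(File.join(dir, "hero.jpg"), "x" * 50_000) # ~50 KB image
  bytes = total_asset_bytes(dir)
  puts "#{bytes} bytes used, under limit: #{bytes < MAX_SLUG_BYTES}"
  # => "70000 bytes used, under limit: true"
end
```

Note this only counts the images themselves; the real slug also includes your code, gems, and other compiled assets.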

Related

Heroku - hosting files and static files for my project

I want to use Heroku for hosting my Ruby on Rails project. It will involve lots of file uploads, mostly images. Can I host and serve those static files on Heroku, or is it wiser to use a service like Amazon S3? What is your opinion on that approach? What are my options for hosting those static files on Heroku?
To answer your question, Heroku's "ephemeral filesystem" will not serve as a storage for static uploads. Heroku is an app server, period. You have to plug into data storage elsewhere.
From Heroku's spec:
Ephemeral filesystem
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno’s lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted. For example, this occurs any time a dyno is replaced due to application deployment and approximately once a day as part of normal dyno management.
Heroku is a great option for RoR in my opinion. I have used it personally and ran into the problem that has been mentioned here already (you can't store anything in Heroku's filesystem). I therefore used S3 following this tutorial: https://devcenter.heroku.com/articles/s3
Hope it helps!
PS: Make sure not to store the S3 credentials in any file, but rather to create config vars as described here: https://devcenter.heroku.com/articles/config-vars
I used to have them in a file and, long story short, someone gained access to my Amazon account and my account was billed several thousand dollars (in just a couple of days). The Amazon staff were kind enough to waive those charges. Just something to keep in mind.
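For example, a sketch of an initializer that reads those config vars instead of hard-coding the keys. This assumes the aws-sdk-s3 gem and the conventional variable names, set beforehand with `heroku config:set AWS_ACCESS_KEY_ID=...`:

```ruby
# config/initializers/aws.rb -- sketch only; assumes the aws-sdk-s3 gem.
# ENV.fetch raises at boot if a config var is missing, which is safer
# than silently running with nil credentials.
Aws.config.update(
  region: ENV.fetch("AWS_REGION", "us-east-1"),
  credentials: Aws::Credentials.new(
    ENV.fetch("AWS_ACCESS_KEY_ID"),
    ENV.fetch("AWS_SECRET_ACCESS_KEY")
  )
)
```

Nothing sensitive ends up in the repository; rotating a leaked key is then just a `heroku config:set` away.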
As pointed out, you shouldn't do this with Heroku for the specific reason of ephemeral storage, but to answer your question more broadly: storing user-uploaded content on a local filesystem on any host has a few inherent issues:
You can quickly run out of local storage space on the disk
You can lose all your user-uploaded content if the hardware crashes / the directory gets deleted / etc.
Heroku, EC2, Digital Ocean, etc. all provide servers that don't come with any guarantee of persistence (ephemeral storage especially). This means that your instance may shut down at any point, be swapped out, etc.
You can't scale your application horizontally. The files on one server won't be accessible from another (or dyno, or whatever your provider of choice calls them).
S3, however, is such a widely-used solution because:
It's incredibly cheap (we store 20 TB of data for something like $500 a month)
Your uploaded files aren't at risk of disappearing due to hardware failure
Your uploaded files are decoupled from the application, meaning any server / dyno / whatever could access them.
You can always put your S3 buckets behind CloudFront if you need a CDN, without any extra effort.
And certainly many more reasons. The most important thing to remember, is that by storing uploaded content locally on a server, you put yourself in a position where you can't scale horizontally, regardless of how you're hosting your app.
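As one concrete way to decouple uploads from the app server, here is a sketch of pointing an uploader gem at S3. This assumes the carrierwave and fog-aws gems; the bucket name and region are placeholders:

```ruby
# config/initializers/carrierwave.rb -- illustrative sketch, not a drop-in.
# Credentials come from config vars, never from files in the repo.
CarrierWave.configure do |config|
  config.fog_provider    = "fog/aws"
  config.fog_credentials = {
    provider:              "AWS",
    aws_access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"],
    region:                "us-east-1"          # placeholder region
  }
  config.fog_directory = "my-app-uploads"       # hypothetical bucket name
end
```

With this in place, any dyno (or any server, if you later leave Heroku) reads and writes the same bucket, which is exactly the horizontal-scaling property described above.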
It is wiser to host files on S3, and in fact it is wiser still to use direct uploads to S3.
You can read the arguments, for example, here.
Main point: Heroku is really, really expensive thing.
So you need to save every bit of resources you have. And the only way to store static files on Heroku is to have a separate dyno running an app server for you. But static files don't need an app server, so it's just a waste of CPU time (and you should read that as "a waste of a lot of my money").
Also, uploading a huge number of large files will quickly put you over your memory quota (read that as "will waste even more of my money because I will need to run more dynos"). So it's best to upload files directly to S3.
Heroku is great for hosting your app. Use the tool that best suits the task.
UPD: I forgot to mention that not only will you need a separate dyno for static assets, but those assets will also die every time that dyno is restarted.
I had the same problem. I solved it by adding all my images to my Rails app. I then reference the images using links that might look something like
myapp.herokuapp.com/assets/image1.jpg
I might add the link from the CMS. It might not be the best option, but it works.

RoR:-use public images in CDN for better performance in production

I have implemented a rails application and have deployed it on azure webserver.
The issue I am getting is that some of the images in the public folder take a long time to load, so the website's performance is very poor. Some of the images are as small as 20 KB and still take around 13 seconds to load.
My question is: if I were to put the images from the public directory into a CDN (content delivery network) and load them from its cache, would that give better performance, or would it not affect overall performance?
Is it also possible to put all the images for a Rails app into a CDN in production?
Thanks.
Well, remember that a CDN is just another webserver. All you are doing when you use a CDN is hyperlinking to the resource on that other webserver.
<img src="http://www.quackit.com/pix/milford_sound/milford_sound_t.jpg" />
Now, will it speed up the load times of your app? Maybe. There are lots of factors affecting this, namely:
Why is your app loading slowly? Is it your connection? Are you on dial-up? A CDN can't help with that.
Why is your azure server slow? Is there lots of traffic? If so, a CDN will help.
Most large production applications likely use a CDN for all of their static assets, such as images, css, and javascript. (They probably own the CDN, but still, its a CDN.) So, yes, every image in your site could be stored in a CDN. (Very easy if these are all static images.) However, CDN's that do this are not usually free.
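In Rails, the usual way to serve assets from a CDN is to set `config.action_controller.asset_host` in production; conceptually, all that does is prefix each asset path with the CDN's hostname so browsers fetch images from the CDN instead of your app server. A toy sketch of that prefixing (the hostname is a placeholder):

```ruby
# A stripped-down sketch of what Rails' asset_host setting does:
# prefix every asset path with the CDN hostname. In a real app you
# would just set config.action_controller.asset_host in
# config/environments/production.rb rather than write this yourself.
CDN_HOST = "https://cdn.example.com" # placeholder CDN hostname

def cdn_asset_url(path)
  # Strip a leading "public/" since that directory is the web root.
  "#{CDN_HOST}/#{path.sub(%r{\Apublic/}, '')}"
end

puts cdn_asset_url("public/images/logo.png")
# => "https://cdn.example.com/images/logo.png"
```

Once asset_host is set, helpers like image_tag emit CDN URLs automatically; no view code changes are needed.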
Why are you choosing to use Azure for a Rails app? It's possible, but it would be much easier to use something like Heroku or Engine Yard. You could even use a VPS service like Digital Ocean, and set up your own small army of CDNs using VPS providers around the country. (If you're a cheapskate, like me.)
There usually aren't that many images in a production Rails app that are located in /public. Usually those images are in assets/images/... The only things I would keep in public would be a small front-end site and perhaps some 404/error pages.

Can I host images in heroku? Or do I need S3?

I'm deploying my web app (it's for a corporate client). So, users will not add images, but only the business will.
I've deployed to Heroku, and my images are still showing. When do I need to use S3? I'll have around 100 images in total on the site, and they will change, say, more than 7 a week. Can I use only Heroku?
The short answer: if you allow users or admins to upload images, you should not use Heroku's file system for this as the images will suddenly vanish.
As explained in the Heroku documentation:
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno’s lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted.
This means that user-uploaded images on the Heroku filesystem are not only wiped out with every push, but also with every dyno restart, which happens occasionally (even if you ping your dynos frequently to prevent them from going to sleep).
Once you start using a second web dyno, it will not be able to read the other dyno's filesystem, so then images would only be visible from one dyno. This would cause weird issues where users can sometimes see images and sometimes they don't.
That said, you can temporarily store images on the Heroku filesystem if you implement a pass-through file upload to an external file store.
Asset Pipeline
FiveDigit's answer is very good, but there is something more to consider: the role of the asset pipeline in Rails.
If the images you have are used as assets (i.e. they are used in the layout and are not changeable by the user), then you can store them in the app/assets/images folder. There is no limit to the number of assets you can keep with your application, but you must be clear about what these are: they are files which aid your application's operation, not files which can be uploaded or manipulated:
The asset pipeline provides a framework to concatenate and minify or
compress JavaScript and CSS assets. It also adds the ability to write
these assets in other languages and pre-processors such as
CoffeeScript, Sass and ERB.
The asset pipeline will compress and fingerprint the stylesheet, image and JS files it has when you deploy your application to the likes of Heroku, or any other server. This means that if those files don't change, you can store them there.
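For illustration, here is a stripped-down sketch of that fingerprinting: a digest of the file's contents is inserted into the filename, so a changed file gets a new name and browser caches are busted automatically (Sprockets historically used MD5 digests; the filenames below are made up):

```ruby
require "digest"

# Toy version of asset-pipeline fingerprinting: same contents always
# produce the same name; changed contents produce a new name.
def fingerprinted_name(filename, contents)
  digest = Digest::MD5.hexdigest(contents)
  ext    = File.extname(filename)         # ".jpg"
  base   = File.basename(filename, ext)   # "logo"
  "#{base}-#{digest}#{ext}"               # e.g. "logo-<32 hex chars>.jpg"
end

puts fingerprinted_name("logo.jpg", "image bytes v1")
```

This is why fingerprinted assets can be served with far-future cache headers: the name itself changes whenever the content does.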
-
S3
The reason you'd want to use the likes of S3 is specifically if your image files are designed to change (the user can upload / edit them). Regardless of Heroku's filesystem, if the images are tied to changes in the DB, you'll have to keep a central store for them: if you change servers, they still need to be reachable.
To do this, you should make sure you understand how you want the files to work: are they going to be manipulated constantly by the user or not? If so, you'll have to explore integrating S3 into your app.

s3_swf_upload fails regularly while uploading files to s3

I've been having this issue for some time now. On fillim.com (indie film distribution, so large files) we're using this fork of the s3_swf_upload gem for Rails. Almost everyone is complaining that it will fail sometimes 3-4 times before it fully uploads the file.
We're on Heroku, and we're then of course needing to do direct uploads to S3.
We're not getting any errors generated, in our logs or in the browser, and we just can not for the life of us find the cause.
Has anyone had these issues before? Does anyone know of alternatives? If anyone knows of an alternative that supports files larger than 2GB, that would be even better.
If you are trying to upload files to Amazon S3, then use AWS::S3, a Ruby library for uploading files.
http://amazon.rubyforge.org/
I think the default size limit is:
:fileSizeLimit (integer = 524288000)
Individual file size limit in bytes (default is 512 MB)
You need to increase your fileSizeLimit.
The repeated failures are unsurprising. If you're going to upload files that large, you want to leverage S3's "multipart upload" support. Essentially, the file is broken up into pieces, sent in parts, then reassembled on the S3 side.
The official AWS SDK for Ruby supports this feature, but you'd have to implement it into your gem. I don't know whether or not that's outside the scope of what you were looking for.
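As an illustration of the arithmetic behind multipart uploads: S3 requires every part except the last to be at least 5 MB, and caps a single upload at 10,000 parts, so the part size must grow for very large files. A sketch of that planning step (the 2 GB figure is just an example):

```ruby
MIN_PART  = 5 * 1024 * 1024 # S3's minimum part size (5 MB)
MAX_PARTS = 10_000          # S3's cap on parts per upload

# Pick a workable part size for a file and return [part_count, part_size].
# The last part is allowed to be smaller than part_size.
def plan_parts(total_bytes, part_size = MIN_PART)
  # Grow the part size if the file would otherwise exceed 10,000 parts.
  if total_bytes > part_size * MAX_PARTS
    part_size = (total_bytes.to_f / MAX_PARTS).ceil
  end
  count = (total_bytes.to_f / part_size).ceil
  [count, part_size]
end

count, size = plan_parts(2 * 1024**3)        # a 2 GB upload
puts "#{count} parts of #{size} bytes each"  # => "410 parts of 5242880 bytes each"
```

Each part is then uploaded (and retried) independently, which is what makes large transfers resilient: a dropped connection costs you one part, not the whole 2 GB. Later versions of the official Ruby SDK (aws-sdk-s3) can do this splitting for you via Object#upload_file.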
Also, am I correct in understanding that you're wanting to allow users to upload files > 2GB from their web browsers?

Should I store my site's images on heroku?

Should I store my site's images on heroku?
images such as the logo of my site and so on.
I talking just about the images of the design of the site.
Will it affect my sites performance?
Yes. Your logo and the other associated images that make up the site should not be terribly large, and will not negatively affect your slug size or performance much.
The downside of having these assets stored and served separately is that your application will not be all together, which adds an extra layer of difficulty to development, as you have to update images in a separate place from your code.
Any large files uploaded by users, that are not part of the application itself but stored by it, should be stored on something like S3 (not that you can write to the Heroku FS anyway).
Typically, anything that is core to my application (i.e. images for layout, logos, etc.) I commit to git and deploy to Heroku; assets like uploaded images/PDFs etc. all go to S3.