Image organization strategy with Rails & Sass - ruby-on-rails

As the number of our pages has grown, the number of images in the Rails project has increased greatly. Does anyone have a good strategy for keeping track of these images?
We have experimented with a couple of ideas, but nothing seems to solve everything. Here are the high-level requirements we have come up with:
Images should be available to any developer locally (given they have code access).
Updated images can be automatically sent to Amazon S3 (and out to CloudFront) on deploy
An easy way to "expire" an image when it changes, forcing browsers to download the new version instead of using a cached copy

We actually deployed a rake task to achieve this and keep all the files between our application and (in our case) Cloudfiles in sync.
The rake task checks for new or changed files and uploads them to Cloudfiles. If a developer adds an asset, they can just run the rake task to update the cloud. They also check the file into our version control so other devs have access to it.
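The answer above targets Cloudfiles, but the same idea maps onto the S3/CloudFront requirement. A minimal sketch of such a rake task, assuming the aws-sdk-s3 gem and comparing MD5 digests against S3 ETags to skip unchanged files (the region, bucket name and paths are placeholders):
# lib/tasks/images_sync.rake -- illustrative sketch only
require "aws-sdk-s3"
require "digest/md5"

namespace :images do
  desc "Upload new or changed images from app/assets/images to S3"
  task sync: :environment do
    bucket = Aws::S3::Resource.new(region: "us-east-1").bucket("my-app-assets") # placeholders
    root   = Rails.root.join("app/assets/images")

    Dir.glob(root.join("**/*")).select { |f| File.file?(f) }.each do |path|
      key    = Pathname.new(path).relative_path_from(root).to_s
      object = bucket.object(key)
      digest = Digest::MD5.file(path).hexdigest

      # For single-part uploads the S3 ETag is the MD5 of the body,
      # so unchanged files can be skipped.
      next if object.exists? && object.etag.delete('"') == digest

      object.upload_file(path)
      puts "uploaded #{key}"
    end
  end
end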

Related

Execute "assets:precompile" by code/client

I have a site with a lot of images (10k+), several hundred of which are displayed per page, so I use the Rails asset pipeline to compile these images. The problem is that my client can change these images via FTP, which forces me to SSH into the server each time to run "assets:precompile". Is there a way to create a link or button my client can click to launch the command from the code?
I tried to create a cron job with the Whenever gem (https://github.com/javan/whenever) that runs the "assets:precompile" command every hour, but it does not work.
every :hour do
rake "assets:precompile"
end
I also tried hooking this method up to a link, without success:
def compile_image
system('rvm use 2.2.0#project && cd /var/www/project-folder/ && rake assets:precompile')
redirect_to imports_index_path, notice: 'Images compiled'
end
I'm on Ruby 2.2 and Rails 4.2.
Thanks for your help
Asset compilation should only be used for static images used in your app (icons, backgrounds...), not for the actual data your site uses.
This is especially true if it is a dynamic and large collection of images.
This isn't really a programming problem anymore; it's more of a general software engineering problem, but here are my two cents anyway.
I'd suggest you use a CDN to make the photos available to browsers. Your Rails app should store (or build) the URLs to the CDN, so that the client browser can fetch and display them.
For instance, set up an HTTP server such as Nginx on the same machine that hosts the FTP server, then build a consistent URL scheme so that your FTP files can be served over HTTP without copying them.
New files added to the server via FTP are automatically available through HTTP, and maybe a cron job or an asynchronous worker can check for new files and register them to your Rails app.
Of course, that's just an example of what can be done. There are tons of other solutions, but without further details, it's hard to tell.
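As a rough illustration of the "store (or build) the URLs" part, assuming Nginx serves the FTP-managed directory under a fixed base URL (the host name and helper below are made up for the example):
# app/helpers/images_helper.rb -- illustrative only
module ImagesHelper
  CDN_BASE = "https://images.example.com".freeze # placeholder host served by Nginx / the CDN

  # Builds the public URL for a file living in the FTP-managed directory.
  def cdn_image_url(relative_path)
    File.join(CDN_BASE, relative_path)
  end
end

# In a view:
#   <%= image_tag cdn_image_url("catalog/#{product.reference}.jpg") %>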

Rails app to Heroku with existing images. Moving images to AWS bucket?

I'm migrating my Rails apps to Heroku, but from what I read I should not save the images in the public/uploads folder.
I have figured out how to use the AWS buckets for saving new images, but how do I go about moving the existing images?
Also, is it absolutely necessary?
I have read it everywhere, but even after a month of my apps being online I haven't seen my images deleted/moved.
Thank you.
Existing files that are part of your repository will not be deleted.
But new images uploaded at runtime will be lost, because Heroku can reset the dyno at any time with a fresh copy of the repository.
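If you do decide to move the existing files, a one-off rake task along these lines could copy everything under public/uploads to S3 (sketched with the aws-sdk-s3 gem; region, bucket and folder layout are assumptions):
# lib/tasks/migrate_uploads.rake -- one-off sketch
require "aws-sdk-s3"

namespace :uploads do
  desc "Copy everything under public/uploads to S3"
  task migrate: :environment do
    bucket = Aws::S3::Resource.new(region: "us-east-1").bucket("my-app-uploads") # placeholders

    Dir.glob(Rails.root.join("public/uploads/**/*")).select { |f| File.file?(f) }.each do |path|
      key = Pathname.new(path).relative_path_from(Rails.root.join("public")).to_s # keeps the uploads/ prefix
      bucket.object(key).upload_file(path)
      puts "copied #{key}"
    end
  end
end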

Are files uploaded to heroku immediately deleted?

I see everywhere that Heroku deletes changes you make to the filesystem once you deploy or scale your application, but in my case it seems files disappear immediately.
My rails application uploads files to public/uploads and then a delayed job tries to read those files, when it tries to read them they are not found.
If I do everything in the same thread it works, but when I try to use the delayed job or check the filesystem using heroku run bash the files are gone.
Why does this happen?
Heroku's filesystem is ephemeral, and each dyno gets its own copy of it. Files your web process writes to public/uploads only exist on that web dyno; a Delayed Job worker runs on a separate worker dyno, and heroku run bash starts a fresh one-off dyno, so neither of them can see those files. That's why it works when you do everything in the same thread on the same dyno.
If you want to use some free storage system, Google Drive is one option, but you'll need to do some research on how to use it: not long ago they changed their login policy to OAuth only, no more username/password login.
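A common workaround, whichever storage provider you pick, is to put the file somewhere every dyno can reach (S3, for example) before enqueueing the job, and pass only the key to the worker. A rough sketch, assuming the aws-sdk-s3 gem and a hypothetical Active Job class named ProcessUploadJob:
# Sketch: store the upload on S3 first so the worker dyno can fetch it by key.
class UploadsController < ApplicationController
  def create
    file   = params[:file]
    key    = "uploads/#{SecureRandom.uuid}-#{file.original_filename}"
    bucket = Aws::S3::Resource.new(region: "us-east-1").bucket("my-app-uploads") # placeholders
    bucket.object(key).put(body: file.read)

    ProcessUploadJob.perform_later(key) # the worker re-downloads the file from S3
    redirect_to root_path, notice: "Upload queued"
  end
end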

How to deploy to Heroku without losing tmp files?

I have a scheduled job running every 12 hours that unzips image files from an FTP server into my tmp folder. The problem is that due to Heroku's ephemeral filesystem, whenever I deploy new code the tmp folder is cleared when the dynos restart, and the files are no longer there. I'd like to be able to deploy code at will, without this concern.
I have thought of creating a second app that runs this task and connects to the same database, as per this SO answer. This way I can deploy code updates unrelated to this task to my production server, and can choose more selectively when to deploy to the second server.
Does anyone have any experience with having two apps running on the same database? Or is there a better way to solve my issue? I've read that Heroku may change database URLs at any time, so the second app may lose its connection. How common is this? Thanks!
I would create a folder under public, e.g. public/storage, and save the unzipped files there.
I believe that is possible even for an app on Heroku.
Check this out: https://devcenter.heroku.com/articles/s3
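If you go the S3 route from that article instead, the scheduled job itself can push the unzipped files to the bucket and only use tmp as scratch space within a single run. A rough sketch, assuming the rubyzip and aws-sdk-s3 gems (all names are placeholders):
# Sketch: extract into tmp, then push each file to S3 so it survives dyno restarts.
require "zip"          # rubyzip
require "aws-sdk-s3"

def sync_archive_to_s3(zip_path)
  bucket = Aws::S3::Resource.new(region: "us-east-1").bucket("my-app-images") # placeholders

  Zip::File.open(zip_path) do |archive|
    archive.each do |entry|
      next if entry.directory?
      tmp_file = Rails.root.join("tmp", File.basename(entry.name)).to_s
      entry.extract(tmp_file) { true }                    # overwrite the scratch file if it exists
      bucket.object("images/#{entry.name}").upload_file(tmp_file)
    end
  end
end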

How to use paperclip with rails and how does it work in deployment?

I've done two Rails apps involving image uploads with Paperclip. The first one, which I did a while ago, worked fine locally, but I had issues when I deployed to Heroku; I realized I needed to use AWS to enable image uploads.
I did that project a while ago. I recently started another project and tried to incorporate similar functionality. Before enabling AWS with Paperclip this time, I just wanted to test what would happen if I tried to upload an image after deploying. To my surprise, it worked without AWS! I'd like to understand why it didn't work the first time and why it does now. How does Paperclip work with Heroku and Rails behind the scenes?
It's likely the first application was hosted on the legacy Bamboo stack, which had a read-only file system.
In the new Cedar stack, the file system is no longer read-only, so the upload will not fail. However, you should keep using AWS or any other external storage, because Heroku distributes your compiled application across several machines, and it's not guaranteed that the image will be visible from another request. They call it the ephemeral filesystem.
In other words, with the Bamboo stack it was easy to understand that you could not store files on the file system, because you experienced an immediate crash. With Cedar the upload succeeds, but the uploaded image may become unavailable at any time in the future.
The moral of the story is: you should keep using AWS or any other storage outside Heroku file-system.
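For reference, a typical Paperclip-to-S3 setup looks roughly like this (the model, bucket and environment variable names are placeholders; double-check the options against the Paperclip version you're using):
# app/models/photo.rb -- rough example of Paperclip storing attachments on S3
class Photo < ActiveRecord::Base
  has_attached_file :image,
    storage: :s3,
    s3_region: ENV["AWS_REGION"],
    bucket: ENV["S3_BUCKET_NAME"],
    s3_credentials: {
      access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
      secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
    },
    styles: { thumb: "100x100#", medium: "500x500>" }

  validates_attachment_content_type :image, content_type: /\Aimage\/.*\z/
end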
