How to query for old files in CarrierWave?

I know that I can keep old files after an uploader is updated:
class AvatarUploader < CarrierWave::Uploader::Base
  configure do |config|
    config.remove_previously_stored_files_after_update = false
  end
end
But how can I query for those old files? If someone updates their avatar, the old file stays on disk; is there a way to list those old files?

CarrierWave doesn't track old files for you, so if you want this feature you'll have to implement it yourself. You may have some luck with a gem like paper_trail.
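If you only need something lightweight, one option is to snapshot the current URL before attaching the replacement. Here is a minimal sketch, where OldAvatar is a hypothetical model with user_id and url columns:
# In the controller action that handles avatar updates.
# OldAvatar is a hypothetical history model, not part of CarrierWave.
def update
  @user = User.find(params[:id])
  old_url = @user.avatar.url # still points at the currently stored file

  if @user.update(avatar: params[:user][:avatar])
    # Record the previous file so it can be listed later.
    OldAvatar.create(user: @user, url: old_url) if old_url.present?
    redirect_to @user
  else
    render :edit
  end
end
Listing a user's old files is then just OldAvatar.where(user: user).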

Related

CarrierWave file not saving in production

I'm using Rails 4.2.6 and CarrierWave 1.0.0. I have an uploader set up like so:
class LetterUploader < CarrierWave::Uploader::Base
  storage :file

  def store_dir
    "#{Rails.root}/public/uploads/#{model.id}"
  end
end
In development, the file saves just fine to /public/uploads/etc..., but in production it's not saving. It is getting cached (there is a file in the /tmp directory) but never actually stored. I've looked all over for a solution and can't find one. My first thought was a folder-permission issue on my production server, but I made sure the public folder recursively belonged to my user, and that didn't help; I even set permissions to 777, to no avail. I can provide more info (on the controller, or anything else) on request. Any help is appreciated.
The problem wasn't with CarrierWave, but with the fact that an uploaded file isn't stored until the model it is attached to is saved. I refactored so that the model saves first, then sends the file to the other API, and then updates the original model, and it works!
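For anyone hitting the same issue, a rough sketch of that flow (the model, params helper, and API call are made up):
# CarrierWave only moves the file from the cache to store_dir when the
# model is saved, so save first, then work with the stored file.
letter = Letter.new(letter_params) # letter_params: assumed strong-params helper
if letter.save
  # The file now exists under store_dir and has a real path.
  SomeApi.send_document(letter.document.path) # hypothetical external call
end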

Rails, where does CarrierWave store files?

I couldn't find or understand good documentation on this. Where do uploads handled by CarrierWave go? My understanding was that they go directly into the DB; is that right? Could I force it to store the file (or a reference to it) in my asset pipeline? I ran into an issue today where I couldn't use image_tag, since it only grabs assets from the asset pipeline. Could letting users store files in the asset pipeline be risky or harmful?
So my questions:
Can I store / reference a file in the asset pipeline?
Would it be a good idea?
Thanks for sharing!
If you look at your uploader you'll see a method called store_dir. The default looks like this...
def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
So for an attachment named "oranges.jpg" in a field called image in a model called FilmReview in the record with id 45 it's stored in...
public/uploads/film_review/image/45/oranges.jpg
You can change store_dir to store the image in a different directory, or upload it to a cloud service like AWS... see RailsCasts or other resources for examples of how to do this.
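For example, here is a hedged sketch of a custom store_dir (the directory layout is just an illustration):
class ImageUploader < CarrierWave::Uploader::Base
  # Store files under public/uploads/reviews/<id>/ instead of the
  # default uploads/<model>/<mounted_as>/<id>/ layout.
  def store_dir
    "uploads/reviews/#{model.id}"
  end
end
Since the stored file lives under public/, you can render it with image_tag film_review.image.url even though it never touches the asset pipeline.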

How can I migrate CarrierWave files to a new storage mechanism?

I have a Ruby on Rails site with models using CarrierWave for file handling, currently using local storage. I want to start using cloud storage and I need to migrate existing local files to the cloud. I am wondering if anyone can point out a method for doing this?
Bonus points for using a model attribute that would allow me to do this row-by-row in the background without interrupting my site for extended downtime (in other words, some model rows would still have local storage while others used cloud storage).
My first instinct is to create a new uploader for each model that uses cloud storage, so I have two uploaders on each model, then transferring the files from one to the other, setting an attribute to indicate which file should be used until they are all transferred, then removing the old uploader. That seems a little excessive.
Minimal to Possibly Zero Downtime Procedure
In my opinion, the easiest and fastest way to accomplish what you want with almost no downtime is this (I will assume you are using AWS, but a similar procedure applies to any cloud service):
1. Figure out and set up your assets bucket, bucket policies, etc., so the assets are publicly accessible.
2. Using s3cmd (a command-line tool for interacting with S3) or a GUI app, copy the entire assets folder from the file system to the appropriate folder in S3.
3. In your app, set up CarrierWave and update your models/uploaders for :fog storage (see the sketch after this list).
4. Do not restart your application yet. Instead, bring up a Rails console and, for each model with assets, check that the new asset URL is correct and accessible as planned. For example, for a video model with a picture asset:
Video.first.picture.url
This gives you a full cloud URL based on the updated settings. Copy the URL and paste it into a browser to make sure you can reach it.
5. If this works for at least one instance of each model that has assets, you are good to restart your application. Upon restart, all your assets are served from the cloud, and you didn't need any migrations or multiple uploaders in your models.
6. (Based on a comment by @Frederick Cheung) Using s3cmd (or something similar), rsync or sync the assets folder from the filesystem to S3 to account for any assets uploaded between steps 2 and 5.
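For reference (step 3), a minimal fog/AWS initializer sketch; the bucket name, region, and credential variables are assumptions:
# config/initializers/carrierwave.rb (requires the fog-aws gem)
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider:              'AWS',
    aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region:                'us-east-1'
  }
  config.fog_directory = 'my-assets-bucket' # the bucket from step 1
  config.fog_public    = true
end
Each uploader then just declares storage :fog.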
PS: If you need help setting up carrierwave for cloud storage, let me know.
I'd try the following steps:
1. Change the storage in the uploaders to :fog or whatever you want to use.
2. Write a migration like rails g migration MigrateFiles to let CarrierWave fetch the current files, process them, and upload them to the cloud.
If your model looks like this:
class Video < ActiveRecord::Base
  mount_uploader :attachment, VideoUploader
end
The migration would look like this:
@videos = Video.all
@videos.each do |video|
  video.remote_attachment_url = video.attachment_url
  video.save
end
If you execute this migration the following should happen:
CarrierWave downloads each image because you specified a remote URL for the attachment (its current location, like http://test.com/images/1.jpg) and saves it to the cloud, because you changed the storage in the uploader.
Edit:
As San pointed out, this will not work directly. Instead, create an extra column first, run a migration to copy the current attachment URLs of all videos into that column, change the uploader, and then run the migration above using the copied URLs from the new column. A final migration can drop the column again. Not that clean and easy, but done in a few minutes.
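A sketch of that two-step variant, assuming the extra column is called old_attachment_url:
# Pass 1: while storage is still :file, snapshot the current URLs.
Video.find_each do |video|
  # Prefix with your host if attachment_url returns a relative path.
  video.update_column(:old_attachment_url, video.attachment_url)
end

# Pass 2: after switching the uploader to :fog, re-upload from the copies.
Video.find_each do |video|
  video.remote_attachment_url = video.old_attachment_url
  video.save
end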
Most people suggest Cloudinary when deploying on Heroku: free and simple to set up.
In my case, we were using Cloudinary and needed to move to AWS S3 for various reasons.
This is what I did with the uploader:
class AvatarUploader < CarrierWave::Uploader::Base
  def self.set_storage
    if ENV['UPLOADER_SERVICE'] == 'aws'
      :fog
    else
      nil
    end
  end

  if ENV['UPLOADER_SERVICE'] == 'aws'
    include CarrierWave::MiniMagick
  else
    include Cloudinary::CarrierWave
  end

  storage set_storage
end
Also, set up this rake task:
task :migrate_cloudinary_to_aws => :environment do
  profile_image_old_url = []
  Profile.where("picture IS NOT NULL").each do |profile_image|
    profile_image_old_url << profile_image
  end

  ENV['UPLOADER_SERVICE'] = 'aws'
  load("#{Rails.root}/app/uploaders/avatar_uploader.rb")

  Profile.where("picture IS NOT NULL OR cover IS NOT NULL").each do |profile_image|
    old_profile_image = profile_image_old_url.detect { |image| image.id == profile_image.id }
    next if old_profile_image.nil? # cover-only records were not captured above
    profile_image.remote_picture_url = old_profile_image.picture.url
    profile_image.save
  end
end
The trick is switching the uploader's storage provider via an environment variable. Good luck!
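Assuming the task depends on :environment as above, run it once with the variable unset (so the first query still reads the old Cloudinary URLs), e.g. bundle exec rake migrate_cloudinary_to_aws, and only afterwards deploy with UPLOADER_SERVICE=aws set permanently.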
I migrated CarrierWave files to Amazon S3 with s3cmd and it works.
Here are the steps to follow:
1. Change the storage kind of the uploader to :fog.
2. Create a bucket on Amazon S3 if you don't already have one.
3. Install s3cmd on the remote server: sudo apt-get install s3cmd
4. Configure s3cmd: s3cmd --configure. You will need to enter the public and secret keys provided by Amazon.
5. Sync the files: s3cmd sync /path_to_your_files s3://bucket_name/. Pass the --acl-public flag to upload the files as public and avoid permission issues.
6. Restart your server.
Notes:
sync will not duplicate your records; it first checks whether each file is already present on the remote server.

CKEditor gem with Paperclip and Amazon S3

I'm using CKEditor and I've configured it to work with Paperclip, but I can't tell it to store files in S3, so it's storing them with Paperclip on the local filesystem.
So I was wondering if there is some way to tell Paperclip to explicitly use S3 every time it's used.
I know how to configure Paperclip with S3 on certain models (pretty easy, described on the Paperclip GitHub wiki). I'm deploying on Heroku, which is why I can't write to the local filesystem.
One way is to see what the ckeditor install generator does.
For example, if using ActiveRecord as the ORM, take a look at the templates used for the models that use Paperclip here.
The generator copies these templates into your app/models/ckeditor folder. You could edit them and configure Paperclip to use S3 as needed.
For ActiveRecord, the models are:
/app/models/ckeditor/attachment_file.rb
/app/models/ckeditor/picture.rb
Keep in mind that this approach could give you extra work in the future if the ckeditor gem is updated and the update process needs to overwrite these models.
Else, you can use Paperclip's default options. In your Paperclip initializer (/config/initializers/paperclip.rb) use:
Paperclip::Attachment.default_options.merge!(
  YOUR OPTIONS FOR S3 HERE
)
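For example, a hedged sketch of what those options might look like (the bucket name and credential variables are assumptions):
Paperclip::Attachment.default_options.merge!(
  storage: :s3,
  s3_credentials: {
    access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  },
  bucket: 'my-ckeditor-uploads'
)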
For CarrierWave, you can generate the uploader and configure S3 (or whatever you want) there.
class CkeditorAttachmentFileUploader < CarrierWave::Uploader::Base
  include Ckeditor::Backend::CarrierWave

  # Choose what kind of storage to use for this uploader:
  if Rails.env.production?
    storage :fog
  else
    storage :file
  end

  # ...
end
It's pretty straightforward. You can use this post to get started. Alternatively, you can look at this similar question for further details.

How to copy a file using Paperclip

Does anyone know of a way to copy files with Paperclip using S3 for storage? Before I try to write my own, I just wanted to make sure there wasn't already a way to do this. Thanks
After some more messing around with paperclip, I figured it out. It's ridiculously simple to copy files!
# Stupid example method that just copies a user's profile pic to another user.
def copy_profile_picture(user_1, user_2)
  user_2.picture = user_1.picture
  user_2.save # Copied the picture and we're done!
end
This also works great with Amazon S3, since Paperclip knows how to read an existing attachment when it is assigned as the new file. Sweet.
