Update path of file in CarrierWave S3 - ruby-on-rails

I have a Document model with a file that is uploaded to S3 using CarrierWave (fog) via my uploader (mount_uploader :file, DocumentUploader). I am also using the 'paranoia' gem's acts_as_paranoid to soft delete documents. On destroy I want to move the attached file into an 'archive' folder in the same directory, and then move it back to the original (parent) directory when a deleted document is restored.
I have the following in my model:
skip_callback :commit, :after, :remove_file!
before_destroy :move_file_to_archive
after_restore :fetch_file_from_archive
Within move_file_to_archive I establish a connection to Amazon S3 using fog and do the following to move the file into the archive folder:
# Look up the current S3 object, copy it under an /archive/ key, then delete the original
bucket = connection.directories.get(bucket_name)
file = bucket.files.get(self.file.file.path)
new_path = file.key.split('/')[0..-2].join('/') + '/archive/' + file.key.split('/')[-1]
new_file = file.copy(bucket_name, new_path)
file.destroy
The problem is that I cannot find a way to make my Document object point to the new (archived) file instead of the old one. When the object is being destroyed, I want self.file.path to resolve to the archived path instead of the original path, and to revert when the document is restored. Any help would be appreciated!

Got it to work myself. I added a condition to my DocumentUploader so that the path contains /archive/ whenever the document has a value in paranoia's deleted_at field. Just doing that makes CarrierWave look at the archive path while the document is in the deleted state.
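A minimal sketch of that conditional path, assuming a standard store_dir override (the folder names here are placeholders, not taken from the thread):

class DocumentUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    base = "uploads/documents/#{model.id}"
    # paranoia populates deleted_at on soft delete, so point CarrierWave at
    # the archive sub-folder while the document is deleted
    model.deleted_at.present? ? "#{base}/archive" : base
  end
end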

Related

CarrierWave: allowing one-by-one file uploads as well as a bulk zip upload

I'm using CarrierWave to upload and manage resources on my ActiveRecord models. I've defined my own Uploader and mounted it to a bunch of properties on one of my models as shown below:
class Theme < ActiveRecord::Base
  ...
  mount_uploader :masthead, ThemeResourceUploader
  mount_uploader :background, ThemeResourceUploader
  mount_uploader :footer, ThemeResourceUploader
  ...
end
This works as expected when creating a new Theme from the params in my Rails controller, but in addition to letting the user upload one image at a time I also want to let them upload a zip file containing all these images and then use that zip to construct the Theme.
To try to accomplish this, I created a new Uploader for the zip file and a controller method which uses Rubyzip to extract the uploaded zip in memory and then tries to assign the resulting streams to my ActiveRecord model's properties.
def import
  require 'zip'
  @theme = Theme.new
  zip_upload = params.require(:theme).require(:zip)
  uploader = ThemeImportUploader.new
  uploader.cache!(zip_upload)
  Zip::File.open(uploader.file.path) do |zip_file|
    @theme.masthead = zip_file.get_input_stream('masthead.png')
    @theme.background = zip_file.get_input_stream('background.png')
    @theme.footer = zip_file.get_input_stream('footer.png')
  end
  @theme.save
end
Unfortunately this doesn't work. I don't receive any error or failure, but the Theme is saved with empty values for the resources and the files are not created in my upload folder.
I believe I can get this working by extracting the zip to temporary files and then reading those files into the CarrierWave properties, but this seems like a very roundabout way of solving the problem.
How can I upload and extract a zip in memory and assign its contents to my CarrierWave enhanced models?
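One possible shape for the temp-file workaround mentioned above (a sketch only; the entry names and the per-property loop are assumptions based on the code in the question):

def import
  require 'zip'
  require 'tempfile'
  @theme = Theme.new
  zip_upload = params.require(:theme).require(:zip)
  uploader = ThemeImportUploader.new
  uploader.cache!(zip_upload)

  Zip::File.open(uploader.file.path) do |zip_file|
    %w[masthead background footer].each do |name|
      entry = zip_file.find_entry("#{name}.png")
      next unless entry
      # Extract each image to a Tempfile so CarrierWave gets an object with
      # a real path that it can read and cache
      Tempfile.create([name, '.png']) do |tmp|
        tmp.binmode
        tmp.write(entry.get_input_stream.read)
        tmp.rewind
        @theme.public_send("#{name}=", tmp)  # CarrierWave caches the file here
      end
    end
  end

  @theme.save
end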

Updating Paperclip path file names from on server to s3

I have a Paperclip attachment whose files I am migrating to a different location. Originally the files were stored on my server and just given a filename based on the id of the created record and the original id. Now I'm moving them to S3 and want to update the filenames to work appropriately. I set up my Paperclip config like so:
:path => ":class/:attachment/:hash-:style.:extension",
:url => ":s3_domain_url",
:hash_secret => SECRET,
:hash_data => ":class/:attachment/:id/:updated_at"
I updated the original records' filenames to be unique and moved the files over to my S3 instance. Unfortunately, I am now unable to pull the files down from S3, and I think it is because Paperclip is using the wrong path for the filenames: one based on the path default that is now set in my config file. I want to update my files' file_name fields so that the path is correct for the new files and I can download them appropriately. Is there a way to call Paperclip's hashing function directly, based on my secret and hash_data, so I can update those file_name fields and pull those records again? Everything uploaded since the move from my original servers works appropriately.
Say you have a model User with an attachment named profile_pic.
Go into the Rails console (e.g. rails c) and get an object for the model that has the attachment, e.g. u = User.find(100).
Now type u.profile_pic.url to get the URL, or u.profile_pic_file_name to get the filename.
To see the effect of other options (for example your old options) you can do:
p = u.profile_pic # gets the paperclip attachment for profile_pic
puts p.url # gets the current url
p.options.merge!(url: '/blah/:class/:attachment/:id_partition/:style/:filename')
puts p.url # now shows url with the new options
Similarly p.path will show the local file path with whatever options you pick.
Long story short, something like:
User.where('created_at < some_date').map do |x|
  "#{x.id} #{x.profile_pic_file_name} #{x.profile_pic.path}"
end
should give you what you want :)
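If you specifically want the hash value itself (the question asks about calling Paperclip's hashing function directly), the attachment object also exposes a hash_key method built from :hash_secret and :hash_data. A sketch, assuming a Paperclip version that ships this method:

u = User.find(100)
# Same HMAC digest that Paperclip substitutes for :hash in :path
u.profile_pic.hash_key(:original)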

Carrierwave + Fog + S3 remove file without going through a model

I am building an application that has a chat component. The application allows users to upload files to the chat. The chat is all JavaScript, but I wanted to use CarrierWave for the uploads because I am using it elsewhere in the application. I am handling the uploads through AJAX so that I can get into Rails land and let CarrierWave take over.
I have been able to get the chat to successfully upload files to the correct location in my S3 bucket. The thing I can't figure out is how to delete the files. Here is the code that uploads the files - this is the method that is called from the route the AJAX call hits.
def upload
  file = File.open(params[:file_0].tempfile)
  uploader = ChatUploader.new
  uploader.store!(file)
end
There is little to no documentation for CarrierWave on how to upload files without going through a model, and basically no documentation on how to remove files without going through a model. I assume it is possible, though - I just need to know what to call. So I guess my question is: how do I delete files?
UPDATE (11/23)
I got the code to save and delete files from S3 using these methods:
# code to save the file
def upload
  file = File.open(params[:file_0].tempfile)
  uploader = ChatUploader.new
  uploader.store!(file)
  uploader.store_path
end

# code to remove files
def remove_file
  file = params[:file]
  uploader = ChatUploader.new
  uploader.retrieve_from_store!(file)
  uploader.remove!
end
My only issue now is that the filename of the uploaded file is not correct. Every file is saved with a name like "RackMultipart" followed by numbers that look like a date, time, and identifier (example: RackMultipart20141123-17740-1tq4j1g). I need to use the original filename instead, plus maybe a timestamp for uniqueness.
I believe it has something to do with these two lines:
file = File.open(params[:file_0].tempfile)
and
uploader.store!(file)
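One likely fix (a sketch, not taken from the thread): pass the uploaded file object from params straight to store! instead of re-opening its tempfile, since CarrierWave derives the stored filename from whatever responds to original_filename:

def upload
  uploader = ChatUploader.new
  # params[:file_0] is an ActionDispatch::Http::UploadedFile; it responds to
  # #original_filename, so CarrierWave keeps the real name instead of the
  # RackMultipart tempfile name
  uploader.store!(params[:file_0])
  uploader.store_path
end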

Paperclip rename uploaded files by user

Is it possible to allow the user to rename an uploaded file?
If there is a share link, will it be automatically updated? I am not able to do this since I can't first figure out how to rename the file.
You can rename the files and then change the record file name. For instance, based on this answer, you can do:
(record.image.styles.keys + [:original]).each do |style|
  path = record.image.path(style)
  FileUtils.move(path, File.join(File.dirname(path), new_file_name))
end
record.image_file_name = new_file_name
record.save
If you're using Amazon S3, you can do the equivalent move per style (style here is the same loop variable as above):
AWS::S3::S3Object.move_to record.image.path(style), new_file_path, record.image.bucket_name
Check this out: Paperclip renaming files after they're saved

Paperclip interpolation for filename db/actual mis-match

I'm making a small app to upload plain text files using Paperclip. I have an Upload model that has a document attachment. I want to rename the uploaded file so that it is the same as Upload.title.
I've used a Paperclip interpolation to do this.
# config/initializers/paperclip.rb
Paperclip.interpolates('upload_title') do |attachment, style|
  attachment.instance.title.parameterize
end

# app/models/upload.rb
has_attached_file :document,
  :url => "/:attachment/:id/:upload_title.:extension",
  :path => ":rails_root/public/:attachment/:id/:upload_title.:extension"
However, the file itself is renamed but the document_file_name in the database remains as it was.
I've made a test app and uploaded it to GitHub here.
Here I create a new Upload and attach the file "Original File Name.txt"
garethrees.co.uk/misc/new.JPG
Here you see the new Upload created, still with the original file name.
garethrees.co.uk/misc/created.JPG
And also in the database, the document_file_name remains the same as it was.
garethrees.co.uk/misc/db.JPG
However, in the actual filesystem the document is renamed.
garethrees.co.uk/misc/finder.JPG
I really need both records to match as I need to use the Paperclip path in order for users to download the files.
Thanks
Create a callback for after_document_post_process and set document_file_name yourself to the title plus the extension.
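A minimal sketch of that callback (the method name and extension handling are assumptions):

class Upload < ActiveRecord::Base
  # has_attached_file :document, ... as above

  after_document_post_process :sync_document_file_name

  private

  # Keep the stored document_file_name in line with the :upload_title interpolation
  def sync_document_file_name
    extension = File.extname(document_file_name)
    document.instance_write(:file_name, "#{title.parameterize}#{extension}")
  end
end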
