How to send file to user with carrierwave? - ruby-on-rails

Here's my old code that sends a file to the browser:
def show
  send_file File.join(Rails.root, 'tmp', 'price.xls')
end
But recently I found out that the tmp folder can't be used as persistent storage on Heroku, so I decided to move the file to AWS S3.
Here's what I've got so far:
def show
  uploader = PriceUploader.new
  uploader.retrieve_from_store!('price.xls')
end
Now, how do I send the file to the browser?
Update:
I intentionally didn't mount the uploader.
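For reference, a standalone (unmounted) uploader along these lines could be defined roughly as follows. This is only a sketch: the real PriceUploader isn't shown in the question, and the store_dir is an assumption.

class PriceUploader < CarrierWave::Uploader::Base
  storage :fog

  # where the file lives in the S3 bucket (assumed layout)
  def store_dir
    'uploads/prices'
  end
end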

Figured it out.
def show
  uploader = PriceUploader.new
  uploader.retrieve_from_store!('price.xls')
  # pull the stored file into CarrierWave's local cache so it has a local path
  uploader.cache_stored_file!
  send_file uploader.file.path
end

In my case:
# find uploader ...
send_file(uploader.path,
          filename: uploader.filename,
          type: "application/<some-type>")

Related

How do I open a CSV file that I uploaded with Carrierwave and Fog to Amazon S3?

I have a model called client_billing_file where I use Carrierwave to upload a CSV file like this:
mount_uploader :billing_file_name, UsageFileUploader
and I schedule a job to run 5 minutes after committing the creation of a new record:
after_commit :generate_usage_file, on: :create

def generate_usage_file
  Resque.enqueue_in(5.minutes, GenerateUsageFileQueue, id, admin.email)
end
This is my background job:
def self.perform(client_billing_file_id, email)
  cbf = ClientBillingFile.find(client_billing_file_id)
  filepath = cbf.billing_file_name.current_path
  csv_file = CSV.read(filepath, headers: true)
  # ...
end
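For context, the GenerateUsageFileQueue referenced in the enqueue call would typically be a plain Resque job class wrapping that perform method. A minimal sketch, with the queue name being an assumption:

class GenerateUsageFileQueue
  @queue = :usage_files # assumed queue name

  # self.perform(client_billing_file_id, email) as shown above
end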
This is working in my development and testing environments, but it fails when I try to open the CSV file in the staging environment (where it actually uploads the file to the S3 bucket). I checked the bucket and the file is getting uploaded to the specified directory correctly, but for some reason the job is throwing the following error:
Exception: Errno::ENOENT
Error: No such file or directory @ rb_sysopen - my_path/my_file.csv
Versions:
Ruby 2.6.6
Rails 4.2.11
Carrierwave 0.8.0
Fog 1.38.0
I tried Jared Beck's idea and it's working now. Basically, I added this condition to my background job:
if Rails.env.production? || Rails.env.staging?
  url = cbf.billing_file_name.url
  cbf.billing_file_name.download!(url)
end
So the final code looks like this:
def self.perform(client_billing_file_id, email)
  cbf = ClientBillingFile.find(client_billing_file_id)

  if Rails.env.production? || Rails.env.staging?
    url = cbf.billing_file_name.url
    cbf.billing_file_name.download!(url)
  end

  filepath = cbf.billing_file_name.current_path
  csv_file = CSV.read(filepath, headers: true)
  # ...
end
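An alternative sketch, consistent with the accepted answer to the first question above: instead of re-downloading the file by URL, let CarrierWave pull the stored file into its local cache with cache_stored_file!, which also works with :file storage in development.

def self.perform(client_billing_file_id, email)
  cbf = ClientBillingFile.find(client_billing_file_id)
  uploader = cbf.billing_file_name

  # With fog storage, current_path points at CarrierWave's local cache,
  # so materialize the stored file there before reading it
  uploader.cache_stored_file!

  csv_file = CSV.read(uploader.current_path, headers: true)
  # ...
end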

Download an Active Storage attachment to disk

The guide says that I can save an attachment to disk to run a process on it, like this:
message.video.open do |file|
  system '/path/to/virus/scanner', file.path
  # ...
end
My model has an attachment defined as:
has_one_attached :zip
And then in the model I have defined:
def process_zip
  zip.open do |file|
    # process the zip file
  end
end
However, I am getting an error on the zip.open call:
private method `open' called
How can I save the zip locally for processing?
As an alternative in Rails 5.2 you can do this:
def process_zip
  # Download the zip file to a temp dir
  zip_path = "#{Dir.tmpdir}/#{zip.filename}"
  File.open(zip_path, 'wb') do |file|
    file.write(zip.download)
  end

  Zip::File.open(zip_path) do |zip_file|
    # process the zip file
    # ...
    puts "processing file #{zip_file}"
  end
end
That’s an edge guide (note edgeguides.rubyonrails.org in the URL); it applies to the master branch of the rails/rails repository on GitHub. The latest changes in master haven’t been included in a released version of Rails yet.
You’re likely using Rails 5.2. Use edge Rails to take advantage of ActiveStorage::Blob#open:
gem "rails", github: "rails/rails"

Carrierwave upload and access file on S3

I am struggling to access files on S3 with Carrierwave.
In my uploader file doc_uploader.rb I have the following code
storage :file

def store_dir
  "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
end
to upload the "doc" model, defined as follows:
class Doc < ActiveRecord::Base
  belongs_to :user
  mount_uploader :doc, DocUploader
end
To access the uploaded file I have the following code in a controller:
@doc = current_user.docs.order("created_at").last # last file uploaded by user
io = open("#{Rails.root}/public" + @doc.doc.url)
Everything works perfectly locally. Now I want to move my files to S3. In the uploader I use fog, replacing
storage :file
with
storage :fog
I adjust my config file carrierwave.rb and uploading works perfectly. However, to access the file I try to use
@doc = current_user.docs.order("created_at").last
io = open("#{@doc.doc.url}")
and I get the following error
No such file or directory @ rb_sysopen - /uploads/doc/doc/11/the_uploaded_file.pdf
Could anyone give me the right syntax to access the file on S3 please? Thanks.
When accessing the asset through the console, it gives you only the path; you might need to append the protocol and host to @doc.doc.url, something like:
io = open("http://example.com#{@doc.doc.url}")
Or you can set the asset host in the environment config you need, though this is not strictly necessary:
config.asset_host = 'http://example.com'
This only applies if you are using the console; in a web view it will not be an issue, since CarrierWave seems to handle it.
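If what you actually need is the file's contents rather than a URL, another option is to let CarrierWave read the stored file for you, which behaves the same with :file and :fog storage. A sketch using the uploader's read helper (delegated to the stored file):

@doc = current_user.docs.order("created_at").last
# read returns the stored file's bytes, whether it lives on local disk or on S3
io = StringIO.new(@doc.doc.read)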

Delete original file after using carrierwave gem to upload

I am using carrierwave to upload and process a file. After the process is done I want to be able to delete the original file. I put
after :store, :unlink_original

def unlink_original(file)
  return unless delete_original_file
  file.delete if version_name.blank?
end
in my uploader. I also added
class CarrierWave::Uploader::Base
  add_config :delete_original_file
end

CarrierWave.configure do |config|
  config.delete_original_file = true
end
to my config/initializers/carrierwave.rb.
The original file is still in the directory along with the processed file. How would I go about deleting this file the right way after carrierwave is done with it?

Download a Carrierwave upload from S3

I'd like to download an image that was uploaded to S3 using carrierwave. The image is on the Card model, mounted as an uploader. I saw this answer, but had trouble getting that solution to work. My code is:
# download image from S3
uploader = card.image # image is the mounted uploader
uploader.retrieve_from_store!(File.basename(card.image.url))
uploader.cache_stored_file!
That last line throws: "... caused an exception (undefined method `body' for nil:NilClass)..."
My carrierwave config looks like:
# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.storage = :fog
  config.cache_dir = "#{Rails.root}/tmp/upload"
  ...
end
Thanks apneadiving. It was as easy as:
image = MiniMagick::Image.open(card.image.to_s)
image.write(somepath)
I have tried this in Rails 5 to download a file from AWS S3.
def download
  image = card.image
  # check that the file actually exists on AWS S3
  if image.try(:file).try(:exists?)
    data = open(image.url)
    send_data data.read, type: data.content_type, x_sendfile: true
  end
end
I hope this helps everyone.
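A variation on that action which streams the bytes through CarrierWave's read helper instead of opening the URL with open-uri might look like this. A sketch; taking the content type from the stored file object is an assumption about your setup:

def download
  uploader = card.image
  if uploader.file&.exists?
    # read pulls the file's contents through CarrierWave, whatever the storage backend
    send_data uploader.read,
              filename: File.basename(uploader.path),
              type: uploader.file.content_type
  end
end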
