Rails - Paperclip - Support for Multi File Uploads

I have Paperclip installed on my Rails 3 app and can upload a file - wow, that was fun and easy!
The challenge now is allowing a user to upload multiple objects, whether by clicking "select files" and choosing more than one, or by clicking a "more" button and getting another file upload field.
I can't find any tutorials or gems that support this out of the box. Shocking, I know...
Any suggestions or solutions? It seems like a common need.
Thanks

Okay, this is a complex one but it is doable. Here's how I got it to work.
On the client side I used http://github.com/valums/file-uploader, a javascript library which allows multiple file uploads with progress-bar and drag-and-drop support. It's well supported, highly configurable and the basic implementation is simple:
In the view:
<div id='file-uploader'><noscript><p>Please Enable JavaScript to use the file uploader</p></noscript></div>
In the js:
var uploader = new qq.FileUploader({
  element: $('#file-uploader')[0],
  action: 'files/upload',
  onComplete: function(id, fileName, responseJSON) {
    // callback
  }
});
When handed files, FileUploader posts them to the server as an XHR request where the POST body is the raw file data, while the headers and filename are passed in the URL string (this is the only way to upload a file asynchronously via JavaScript).
This is where it gets complicated: since Paperclip has no idea what to do with these raw requests, you have to catch them and convert them back to standard files (preferably before they hit your Rails app) so that Paperclip can work its magic. This is done with some Rack middleware which creates a new Tempfile (remember: Heroku is read-only):
# Embarrassing note: This code was adapted from an example I found somewhere online
# if you recognize any of it please let me know so I can pass on credit.
module Rack
  class RawFileStubber

    def initialize(app, path=/files\/upload/) # change for your route, careful.
      @app, @path = app, path
    end

    def call(env)
      if env["PATH_INFO"] =~ @path
        convert_and_pass_on(env)
      end
      @app.call(env)
    end

    def convert_and_pass_on(env)
      tempfile = env['rack.input'].to_tempfile
      fake_file = {
        :filename => env['HTTP_X_FILE_NAME'],
        :type     => content_type(env['HTTP_X_FILE_NAME']),
        :tempfile => tempfile
      }
      env['rack.request.form_input'] = env['rack.input']
      env['rack.request.form_hash'] ||= {}
      env['rack.request.query_hash'] ||= {}
      env['rack.request.form_hash']['file'] = fake_file
      env['rack.request.query_hash']['file'] = fake_file
      if query_params = env['HTTP_X_QUERY_PARAMS']
        require 'json'
        params = JSON.parse(query_params)
        env['rack.request.form_hash'].merge!(params)
        env['rack.request.query_hash'].merge!(params)
      end
    end

    def content_type(filename)
      case type = (filename.to_s.match(/\.(\w+)$/)[1] rescue "octet-stream").downcase
      when %r"jp(e|g|eg)"        then "image/jpeg"
      when %r"tiff?"             then "image/tiff"
      when %r"png", "gif", "bmp" then "image/#{type}"
      when "txt"                 then "text/plain"
      when %r"html?"             then "text/html"
      when "js"                  then "application/js"
      when "csv", "xml", "css"   then "text/#{type}"
      else 'application/octet-stream'
      end
    end
  end
end
Later, in application.rb:
config.middleware.use 'Rack::RawFileStubber'
Then in the controller:
def upload
  @foo = ModelWithPaperclip.create({ :img => params[:file] })
end
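For reference, the action: 'files/upload' URL in the JavaScript needs a matching route; a minimal sketch, assuming the upload action above lives in a FilesController (the controller name is an assumption, not from the answer):
# config/routes.rb -- hypothetical route for the uploader's POST target
post 'files/upload' => 'files#upload'
The plugin also looks at the JSON it gets back in onComplete, so rendering something like render :json => { :success => true } from the action is a sensible default.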
This works reliably, though it can be a slow process when uploading a lot of files simultaneously.
DISCLAIMER
This was implemented for a project with a single, known & trusted back-end user. It almost certainly has some serious performance implications for a high traffic Heroku app and I have not fire tested it for security. That said, it definitely works.

The method Ryan Bigg recommends is here:
https://github.com/rails3book/ticketee/commit/cd8b466e2ee86733e9b26c6c9015d4b811d88169
https://github.com/rails3book/ticketee/commit/982ddf6241a78a9e6547e16af29086627d9e72d2
The file-uploader recommendation by Daniel Mendel is really great. It's a seriously awesome user experience, like Gmail drag-and-drop uploads. Someone wrote a blog post about how to wire it up with a rails app using the rack-raw-upload middleware, if you're interested in an up-to-date middleware component.
http://pogodan.com/blog/2011/03/28/rails-html5-drag-drop-multi-file-upload
https://github.com/newbamboo/rack-raw-upload
http://marc-bowes.com/2011/08/17/drag-n-drop-upload.html
There's also another plugin that's been updated more recently which may be useful
jQuery-File-Upload
Rails setup instructions
Rails setup instructions for multiples
And another one (Included for completeness. I haven't investigated this one.)
PlUpload
plupload-rails3
These questions are highly related
Drag-and-drop file upload in Google Chrome/Chromium and Safari?
jQuery Upload Progress and AJAX file upload

I cover this in Rails 3 in Action's Chapter 8. I don't cover uploading to S3 or resizing images however.
Recommending you buy it based solely on it fixing this one problem may sound a little biased, but I can just about guarantee it'll answer other questions you have down the line. It takes a Behaviour Driven Development approach as one of its main themes, introducing you to Rails features during the development of an application. This shows you not only how to build an application, but also how to keep it maintainable.
As for the resizing of images after they've been uploaded, Paperclip's got pretty good documentation on that. I'd recommend having a read and then asking another question on SO if you don't understand any of the options / methods.
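For instance, resizing usually just means passing ImageMagick geometry strings to :styles; a small sketch (the style names and sizes here are made up):
has_attached_file :photo,
  :styles => { :thumb => "100x100#", :medium => "300x300>" }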
And as for S3 uploading, you can do this:
has_attached_file :photo, :styles => { ... }, :storage => :s3
You'd need to configure Paperclip::Storage::S3 with your S3 details to set it up, and again Paperclip's got some pretty awesome documentation for this.
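A minimal sketch of that setup, assuming your credentials live in a YAML file (the bucket name and paths are placeholders, not from the answer above):
# in the model
has_attached_file :photo,
  :styles => { :thumb => "100x100#" },
  :storage => :s3,
  :s3_credentials => "#{Rails.root}/config/s3.yml"

# config/s3.yml would then contain keys like:
#   access_key_id: YOUR_ACCESS_KEY
#   secret_access_key: YOUR_SECRET_KEY
#   bucket: your-bucket-name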
Good luck!

Related

Rails app (production) interfacing with another rails app on same machine

I created a small application that generates reports (mainly PDFs) from an sqlite3 database exported from GNUCash. I separated the app so it could possibly be shared with other VFW posts that are currently using GNUCash or stuck with a paper general ledger. I call it a quasi-API in that most reports can be generated in both HTML and PDF, or it can just send data. Before I separated it, I had tried a multiple-database approach, adding the sqlite3 database to my database.yml along with my main PostgreSQL database. It worked, but it made sharing difficult and I was a little skeptical about the approach.
All is fine with PDFs (Prawn generated). I use RestClient to get the reports from the pdf server running on the same machine on a different port. For example, to generate a General Ledger, this controller GET action is called:
def ledger_pdf
  resp = RestClient.get "http://localhost:8600/ledgers/#{params[:id]}/ledger_pdf"
  send_data resp.body, filename: "Ledger#{params[:id]}",
                       type: "application/pdf",
                       disposition: "inline"
end
On the pdf server, this action responds
def ledger_pdf
  pdf = Pdf::Ledger.new(view_context, params)
  send_data pdf.render, filename: "Ledger#{params[:id]}",
                        type: "application/pdf",
                        disposition: "inline"
end
Getting the HTML version became a little more challenging! On the pdf server I have a one page menu that lists all the reports available, some by date and others with a collection route. This was for the 'sharing' version where someone only needs the reports and does not have another rails application that does other stuff. The server could be stand alone running on some box.
Rather than getting the data and creating a page on the main application from it, I pulled the menu from the pdf server in an iframe tag. I thought this was working great until I discovered that, in a view:
<iframe src="http://localhost:8600/ledgers/" width="100%" height="100%" id="rails_iframe">error!</iframe>
was calling localhost:8600 on my machine (which was running a development version and fooled me into thinking it was working!) rather than on the production server. After some research I discovered that you can't do that. I guess the RestClient call to localhost:8600 is hidden from the browser, but the iframe call is not.
I guess I can get the HTML versions using the same RestClient approach and return text/html and use nokogiri to get the body/html and yield it somehow, if I knew how to do that.
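For reference, a rough sketch of that Nokogiri idea (the URL and action name are placeholders):
def ledger_menu
  resp = RestClient.get "http://localhost:8600/ledgers"
  # keep only the markup inside <body> so it can be embedded in our own layout
  @menu_html = Nokogiri::HTML(resp.body).at('body').inner_html
end
# in the view: <%= @menu_html.html_safe %>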
The html.slim is really simple and I could just duplicate it on my main server after getting the data, but then I'd have two code bases to maintain.
Another approach would be to make it a real API and make it public so that the iframe approach would work, something I was trying to avoid (authentication and all that).
I guess my question is: does anyone have another approach, or can you refine or give me pointers on my approaches?
You just need to add another action to your 'pdf' server which will render the menu:
def menu
  render json: { first: 'first/report', second: 'second/report' }
end
And then just fetch that menu from your main server (which is facing users):
def menu
  resp = RestClient.get ...
  # parse resp.body to get the menu structure
  # convert the JSON response from the pdf server to HTML with your ERB template
  @menu_items = ...
end
Then just add a route on your main server to pass report requests through to the collection server as well, something like:
get "reports/:collection/:date/:report", to: 'reports#fetch'
# then in your controller
def fetch
resp = RestClient.get "...#{params[:collection]}/#{params[:date]}/#{params[:report]}
...
end
Personally, I wouldn't use a Rails server for PDF generation. I would just use a cron job that runs a Ruby script to generate the PDFs and then uploads them to S3 or some other cloud storage. My Rails server would then just return the link to that S3 file.
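A rough sketch of that kind of standalone script (the gems, region, and bucket name are assumptions, not from the answer above):
#!/usr/bin/env ruby
# generate_ledger.rb -- hypothetical script run from cron
require 'date'
require 'prawn'
require 'aws-sdk-s3'

pdf_path = "/tmp/ledger-#{Date.today}.pdf"

# build the PDF (report contents omitted)
Prawn::Document.generate(pdf_path) do
  text "General Ledger #{Date.today}"
end

# upload to S3; the Rails app only needs the bucket/key to build a link
s3 = Aws::S3::Resource.new(region: 'us-east-1')
s3.bucket('my-report-bucket').object(File.basename(pdf_path)).upload_file(pdf_path)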

Uploading a file in Rails

I'm new to Rails, and I'm writing a RESTful website using the CRUD technique. So far I have created three pages, all of which allow the user to create, edit, and delete a row in the database. However, my fourth page will need to include a file upload form, and a) I don't know how the filesystem works with Rails, so I don't know where files should be stored. The file would be around 100kb and couldn't be stored in temporary storage because it will be constantly downloaded. And b) I don't know how to write the file to disk.
It would be great if you could tell me how to do what I mentioned above - create an upload input on an input form, and then write the file to a filepath in a separate directory.
Update 2018
While everything written below still holds true, Rails 5.2 now includes active_storage, which allows stuff like uploading directly to S3 (or other cloud storage services), image transformations, etc. You should check out the rails guide and decide for yourself what fits your needs.
While there are plenty of gems that solve file uploading pretty nicely (see https://www.ruby-toolbox.com/categories/rails_file_uploads for a list), rails has built-in helpers which make it easy to roll your own solution.
Use the file_field-form helper in your form, and rails handles the uploading for you:
<%= form_for @person do |f| %>
  <%= f.file_field :picture %>
<% end %>
You will have access in the controller to the uploaded file as follows:
uploaded_io = params[:person][:picture]
File.open(Rails.root.join('public', 'uploads', uploaded_io.original_filename), 'wb') do |file|
  file.write(uploaded_io.read)
end
It depends on the complexity of what you want to achieve, but this is totally sufficient for easy file uploading/downloading tasks. This example is taken from the rails guides, you can go there for further information: http://guides.rubyonrails.org/form_helpers.html#uploading-files
Sept 2018
For anyone checking this question recently, Rails 5.2+ now has ActiveStorage by default & I highly recommend checking it out.
Since it is part of core Rails 5.2+, it is very well integrated and has excellent capabilities out of the box. The other well-known gems like CarrierWave, Shrine, Paperclip, ... are still great, but this one offers very good features that are worth considering for any new Rails project.
Paperclip team deprecated the gem in favor of the Rails ActiveStorage.
Here is the github page for the ActiveStorage & plenty of resources are available everywhere
Also I found this video to be very helpful to understand the features of Activestorage
There is a nice gem especially for uploading files: CarrierWave. If the wiki does not help, there is a nice RailsCast about the best way to use it. Summarizing: there is a file field type in Rails forms, which invokes the file upload dialog. You can use it, but the 'magic' is done by the CarrierWave gem.
I don't know what you mean by "how to write to a file", but I hope this is a nice start.
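A minimal CarrierWave setup looks roughly like this (the uploader and attribute names are illustrative):
# app/uploaders/picture_uploader.rb
class PictureUploader < CarrierWave::Uploader::Base
  storage :file # files end up under public/uploads/... by default
end

# app/models/person.rb
class Person < ActiveRecord::Base
  mount_uploader :picture, PictureUploader
end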
Okay. If you do not want to store the file in the database but in the application itself, like assets (a custom folder), you can define a non-DB attribute with attr_accessor :document and use form_for / f.file_field to get the file.
In the controller:
@person = Person.new(person_params)
Here person_params returns the whitelisted params[:person] (define it yourself).
Save the file like this:
dir = "#{Rails.root}/app/assets/custom_path"
FileUtils.mkdir_p(dir) unless File.directory?(dir)
filename = @person.document.original_filename # the uploaded file from the form
File.copy_stream(@person.document.tempfile, "#{dir}/#{filename}")
Note: add this path to .gitignore, and if you want to serve the file again, add the path to the application's asset paths in application.rb.
Whenever the form reads the file field, the upload is stored in a tmp folder; later you can store it wherever you like. I gave an example of storing it under assets.
Note: storing files like this will increase the size of the application; it is better to store them in the database using Paperclip.
In your initializer (config/initializers/carrierwave.rb):
if Rails.env.development? || Rails.env.test?
  CarrierWave.configure do |config|
    config.storage = :file
    config.root = "#{Rails.root}/public"
    config.enable_processing = false if Rails.env.test?
  end
end
Use this to store uploads on the local filesystem while running locally.

Rails Functional Test Case and Uploading Files to ActionDispatch::Http::UploadFile

I am adding tests to a Rails app that stores files remotely. I'm using the default Rails functional tests. How can I add file uploads to them? I have:
test "create valid person" do
post(:create, :person => { :avatar => fixture_file_upload('avatar.jpeg') })
end
This for some reason uploads a Tempfile and causes the AWS/S3 gem to fail with:
NoMethodError: undefined method `bytesize' for Tempfile
Is there any way that I can get the test to use an ActionDispatch::Http::UploadedFile and behave more like it does when testing with a web browser? Is fixture_file_upload the way to test uploading files to a controller? If so, why doesn't it work like the browser?
As a note, I really don't want to switch testing frameworks. Thanks!
I use the s3 gem instead of the aws/s3 gem. The main reasons for this are that aws/s3 has no support for European buckets and its development seems to have stopped.
If you want to test file uploads, then using the fixture_file_upload method is correct; it maps directly to Rack::Test::UploadedFile.new (which you can use directly if the test file isn't in the fixtures folder).
But I've also noticed that the behavior of Rack::Test::UploadedFile objects isn't exactly the same as that of ActionDispatch::Http::UploadedFile objects (that's the class of files uploaded through the browser). The basic methods (original_filename, read, size, ...) all work, but there are some differences when working with the file method. So limit your controller to these methods and all will be fine.
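A controller sketch that sticks to those shared methods (the storage call is a placeholder):
def create
  avatar = params[:person][:avatar]
  # works the same for Rack::Test::UploadedFile (tests) and
  # ActionDispatch::Http::UploadedFile (real browser uploads)
  filename = avatar.original_filename
  contents = avatar.read
  # hand filename/contents to whatever stores the avatar here
  head :ok
end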
Another possible solution is to create an ActionDispatch::Http::UploadedFile object yourself and use that, like so:
upload = ActionDispatch::Http::UploadedFile.new({
  :filename => 'avatar.jpeg',
  :type     => 'image/jpeg',
  :tempfile => File.new("#{Rails.root}/test/fixtures/avatar.jpeg")
})

post :create, :person => { :avatar => upload }
I'd recommend using mocks.
A quick google search reveals:
http://www.ibm.com/developerworks/web/library/wa-mockrails/index.html
You should be able to create an object that will respond to the behaviors you want it to. Mocks are mostly used in a unit test environment so you can test your code in isolation, whereas integration tests are supposed to fully exercise the entire stack. However, I can see that in this case it'd be useful to mock out the S3 service because it costs money.
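For instance, with Mocha (an assumption; any mocking library works) you could stub the aws/s3 call so the test never hits the network; the exact method to stub depends on how your code calls the gem:
# in the functional test, before the post
AWS::S3::S3Object.stubs(:store).returns(true)

post(:create, :person => { :avatar => fixture_file_upload('avatar.jpeg') })
assert_response :success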
I'm not familiar with the AWS/S3 gem, but it seems that you probably aren't using the :avatar param properly. bytesize is defined on String in Ruby 1.9. What happens if you call read on the uploaded file before you pass it into AWS/S3?

How to integrate Paperclip and Mimetype-fu

First, a little background, because there is a lot of interaction going on: I'm grabbing emails via Fetcher, and processing them using MMS2R to extract the attachments. These attachments are generally going to be PDF files or MS Word documents, so you'd expect that their content-type would be application/pdf and application/msword respectively, but unfortunately it appears that many mail programs do not do this.
Instead, the attachments are application/x-pdf and application/x-doc. I need these to be set correctly so that scribd-fu will properly iPaper the documents. Now, mimetype-fu will manage to figure out the proper content type, but for the life of me I just can't figure out how to properly set the content type of the Paperclip'd attachment.
Here's a snippet of the code:
mms.process do |media_type, files|
  # go through each file
  files.each do |filename|
    # if it's a format we support, create a record
    if media_type =~ /pdf/ # just pdfs for now, to reduce confusion
      File.open(filename) do |tempfile|
        # Somewhere in here I'd like to change filename.content_type
        # to the proper type using mimetype-fu
        # except doing tempfile.content_type = whatever doesn't seem to work.
        thing = Thing.new
        thing.document = tempfile
        thing.save!
      end
    end
  end
end
Any help would be appreciated, because I've been beating my head against a wall trying all manner of things to try to get this working. I've tried these links already either without success or without grokking what needs doing:
http://gist.github.com/55009/
http://railsforum.com/viewtopic.php?id=27448
http://github.com/dbackeus/paperclip/commit/a514bd03664fc6a764787f59c3169397336702b1
Thanks greatly!
Can you just do
thing.document_content_type = whatever
or are you doing your scribd-fu in document= or something?
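If that attribute is writable, combining it with mimetype-fu could look roughly like this (a sketch; it assumes mimetype-fu's File.mime_type? helper and the Thing model from the question):
File.open(filename) do |tempfile|
  thing = Thing.new
  thing.document = tempfile
  # overwrite the content type Paperclip detected with mimetype-fu's guess
  thing.document_content_type = File.mime_type?(tempfile)
  thing.save!
end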

Rails Binary Stream support

I'm going to be starting a project soon that requires support for large-ish binary files. I'd like to use Ruby on Rails for the webapp, but I'm concerned with the BLOB support. In my experience with other languages, frameworks, and databases, BLOBs are often overlooked and thus have poor, difficult, and/or buggy functionality.
Does RoR support BLOBs adequately? Are there any gotchas that creep up once you're already committed to Rails?
BTW: I want to be using PostgreSQL and/or MySQL as the backend database. Obviously, BLOB support in the underlying database is important. For the moment, I want to avoid focusing on the DB's BLOB capabilities; I'm more interested in how Rails itself reacts. Ideally, Rails should be hiding the details of the database from me, and so I should be able to switch from one to the other. If this is not the case (ie: there's some problem with using Rails with a particular DB) then please do mention it.
UPDATE: Also, I'm not just talking about ActiveRecord here. I'll need to handle binary files on the HTTP side (file upload effectively). That means getting access to the appropriate HTTP headers and streams via Rails. I've updated the question title and description to reflect this.
As for streaming, you can do it all in an (at least memory-) efficient way. On the upload side, file parameters in forms are abstracted as IO objects that you can read from; on the download side, look into the form of render :text => that takes a Proc argument:
render :content_type => 'application/octet-stream', :text => Proc.new { |response, output|
  # do something that reads data and writes it to output
}
If your stuff is in files on disk, though, the aforementioned solutions will certainly work better.
+1 for attachment_fu
I use attachment_fu in one of my apps and MUST store files in the DB (for annoying reasons which are outside the scope of this convo).
The (one?) tricky thing dealing w/BLOB's I've found is that you need a separate code path to send the data to the user -- you can't simply in-line a path on the filesystem like you would if it was a plain-Jane file.
e.g. if you're storing avatar information, you can't simply do:
<%= image_tag @youruser.avatar.path %>
you have to write some wrapper logic and use send_data, e.g. (below is JUST an example w/attachment_fu, in practice you'd need to DRY this up)
send_data(@youruser.avatar.current_data, :type => @youruser.avatar.content_type,
          :filename => @youruser.avatar.filename, :disposition => 'inline')
Unfortunately, as far as I know attachment_fu (I don't have the latest version) does not do clever wrapping for you -- you've gotta write it yourself.
P.S.
Seeing your question edit - Attachment_fu handles all that annoying stuff that you mention -- about needing to know file paths and all that crap -- EXCEPT the one little issue when storing in the DB. Give it a try; it's the standard for rails apps. IF you insist on re-inventing the wheel, the source code for attachment_fu should document most of the gotchas, too!
You can use the :binary type in your ActiveRecord migration and also constrain the maximum size:
class BlobTest < ActiveRecord::Migration
  def self.up
    create_table :files do |t|
      t.column :file_data, :binary, :limit => 1.megabyte
    end
  end
end
ActiveRecord exposes the BLOB (or CLOB) contents as a Ruby String.
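A quick illustration of that, assuming a model backed by the files table above (the BinaryFile name and file name are made up):
# hypothetical model for the table created above
class BinaryFile < ActiveRecord::Base
  self.table_name = "files"
end

# writing: the binary column accepts an ordinary (binary) String
record = BinaryFile.create!(:file_data => File.binread("upload.bin"))

# reading: the contents come back as a String
data = BinaryFile.find(record.id).file_data
data.bytesize # => size of the original file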
I think your best bet is the attachment_fu plug-in:
http://github.com/technoweenie/attachment_fu/tree/master
UPDATE: Found some more info here http://groups.google.com/group/rubyonrails-talk/browse_thread/thread/a81beffb93708bb3
Look into the x_send_file plugin, too.
"The XSendFile plugin provides a simple interface for sending files via the X-Sendfile HTTP header. This enables your web server to serve the file directly from disk, instead of streaming it through your Rails process. This is faster and saves a lot of memory if you're using Mongrel. Not every web server supports this header. YMMV."
I'm not sure if it's usable with Blobs, it may just be for files on the file system. But you probably need something that doesn't tie up the web server streaming large chunks of data.
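For what it's worth, later Rails versions build the same idea into send_file; a sketch, assuming your web server (Apache with mod_xsendfile, or nginx) is configured to honor the header, and with a placeholder file path:
# config/environments/production.rb
config.action_dispatch.x_sendfile_header = "X-Sendfile"        # Apache
# config.action_dispatch.x_sendfile_header = "X-Accel-Redirect" # nginx

# in a controller action: Rails sets the header, the web server streams the file
def download
  send_file Rails.root.join("private", "big_file.bin"),
            :type => "application/octet-stream",
            :disposition => "attachment"
end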
