I have some template files I would like to use in my rails App. I was wondering where(under which directory) to put them given two scenarios:
They are private to my application (Only webmaster can delete, change them)
They are private to my application but also they can be managed by admins(deleted, modified)
Update after comments
Since you want to serve the files locally, just put them outside of the /public/ folder and outside of any of the /assets/ folders and you should be good. You can read more about the public and assets folders in the Rails guides, Section 2, "How to Use the Asset Pipeline". Let's say:
/private/
I believe send_file (Section 11 of the same guide, also used in the SO question linked in my original answer below) is still the way for you to provide access to files through a controller rather than statically. Adapted from the docs:
send_file("#{Rails.root}/private/#{filename}",
:filename => "#{filename}",
:type => "application/pdf", #for example if pdf
:disposition => 'inline') #send inline instead of attachment
Original answer for remote serving together with send_file below
Regarding 1) files private to the application
You can lock up these private files in a system like Amazon S3 that provides authorized access as Callmeed explains in this SO question. Then only your application will be able to authorize access to a file.
Regarding 2) also accessible to admins
The problem with just using part 1) is that it unlocks the files for a limited time period, during which I assume they are publicly available. So if you want to get around that, I think you need the solution from Pavel Shved, which is actually in the same SO question above.
In that solution, files are provided through a route/controller that provides the binary data of the file rather than using a URL that points to the file.
Combined solution
Read the file from S3 with only your application authorized to do that access (not opening it publicly). Then provide the data directly through the controller which can authorize whomever you want.
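A rough sketch of that combined approach, assuming the newer aws-sdk-s3 gem and hypothetical bucket and parameter names; swap in whatever authorization your app already uses:

require 'aws-sdk-s3'

class PrivateFilesController < ApplicationController
  before_action :authorize_download!   # your own authorization logic

  def show
    # Credentials come from ENV or an IAM role; nothing here is public.
    s3  = Aws::S3::Client.new(region: 'us-east-1')
    obj = s3.get_object(bucket: 'my-private-bucket', key: params[:filename])

    # Stream the bytes through the controller instead of handing out an S3 URL.
    send_data obj.body.read,
              filename:    params[:filename],
              type:        'application/pdf',
              disposition: 'inline'
  end
end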
Caveats
Providing binary data directly from the controller seems like it would kill the performance of the application if it is used often, but I've never tried it.
If you can find a simpler way to do part 1), part 2) will still work with that solution.
Related
I have a Rails 4 application that needs to use a number of Excel files representing rosters (20 or so, grouped by their own individual committee) that have to be read in and editable by the user. Pre-deploy, I had the system working perfectly: these files lived in public/rosters and could be referenced and edited by any authenticated user. Unfortunately, when I deployed to Heroku I could no longer do this.
I have been using an S3 bucket to host the other files necessary for this and other related apps, and it's been working wonderfully for what I've been using it for, so I decided to try it as a solution to this problem. Unfortunately, it appears I can only access the files the way I had been by making them publicly accessible, which is not something that I want to do.
So my question is this: what would be the best way to reference these files (ideally using my access_key_id and secret_access_key to authenticate) and allow a user to push changes that will overwrite the file on the S3 bucket?
You have to use aws-sdk-ruby to write files to S3; it authenticates with your access_key_id and secret_access_key. Check the gem's documentation. Hope this helps!
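For illustration, a minimal sketch with aws-sdk-ruby (the aws-sdk-s3 gem in its current packaging); bucket and key names here are placeholders:

require 'aws-sdk-s3'

# Credentials can also come from ENV or ~/.aws/credentials automatically.
s3 = Aws::S3::Resource.new(
  region:            'us-east-1',
  access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
  secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

roster = s3.bucket('my-roster-bucket').object('rosters/finance_committee.xlsx')

# Read the private file without ever making it public...
data = roster.get.body.read

# ...and push the user's edited version back, overwriting the file on S3.
roster.put(body: File.binread('/tmp/finance_committee_edited.xlsx'))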
I need to upload multiple files on my website.
But I don't just need a form for uploading multiple files; I need to upload whole directories.
How's this possible for the minimalist?
Yours, Joern.
According to my somewhat limited knowledge, this is not possible: only file transfer is possible, not directories.
Here are some workarounds, based on discussion on Velocity Reviews and another discussion:
upload a zip, which you unzip at the server side
upload directories over ftp (web page can be a front end to this)
upload files one by one
I would go for either zip or ftp. Note: someone might have produced a gem that enables uploading directories (I don't know of any such thing, but I'd be happy to find out if there is one).
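If you go the zip route, here is a rough server-side sketch assuming the rubyzip gem and a standard Rails multipart upload (the field name, paths, and the helper itself are made up):

require 'zip'        # rubyzip gem
require 'fileutils'

# uploaded_file is e.g. params[:archive], an ActionDispatch uploaded file.
def extract_upload(uploaded_file, destination = Rails.root.join('tmp', 'uploads'))
  FileUtils.mkdir_p(destination)

  Zip::File.open(uploaded_file.tempfile.path) do |zip|
    zip.each do |entry|
      target = File.join(destination, entry.name)    # NB: sanitize entry.name in real code
      FileUtils.mkdir_p(File.dirname(target))        # recreate the directory tree
      entry.extract(target) unless File.exist?(target)
    end
  end
end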
Adding another option to the list provided by Sorrow:
upload via REST/JSON
OK, this is a partial solution, but it does give you the opportunity to write a script that reads your directory and POSTs to your website.
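For example, a quick client-side sketch of that idea: walk a local directory and POST each file to a hypothetical /uploads endpoint as Base64-encoded JSON.

require 'net/http'
require 'json'
require 'base64'
require 'uri'

endpoint = URI('https://example.com/uploads')   # placeholder URL

Dir.glob('my_directory/**/*').select { |p| File.file?(p) }.each do |path|
  payload = {
    path:    path,                                   # keep the relative path so the server can rebuild the tree
    content: Base64.strict_encode64(File.binread(path))
  }
  Net::HTTP.post(endpoint, payload.to_json, 'Content-Type' => 'application/json')
end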
Basically, this is what my app does:
It sends an AJAX request.
The server creates a file.
The server sends back the URL of the file location.
The client side will attempt to create a dialog to download the file at that location (probably using a frame? I haven't got this far yet).
My question is, how do I dynamically route to the files I create so that they are accessible when users browse to them? If I don't add a route for them, users will get a 404 if they try to access the directory they're in.
The files are currently stored in a folder in public.
Would the best way to deal with this be to make the folder somehow not require a route, so that it can be browsed to directly, and then put an index page on it so users can't view the full list of files? If so, please let me know how I can accomplish this. On a side note, if you have an idea of how I can get JavaScript to display the download dialog, let me know.
It's Rails 3 by the way.
Thanks!
For a fully private set of files: choose a place for your files outside your public directory, then configure X-Sendfile support in your web server, and finally use send_file in your Rails application.
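A minimal sketch of that setup, assuming Apache with mod_xsendfile enabled and a hypothetical private/ directory under the Rails root:

# config/environments/production.rb
config.action_dispatch.x_sendfile_header = 'X-Sendfile'   # use 'X-Accel-Redirect' for nginx

# app/controllers/documents_controller.rb
class DocumentsController < ApplicationController
  before_action :require_login   # whatever authentication you already have

  def show
    # NB: sanitize params[:filename] in real code to avoid path traversal.
    path = Rails.root.join('private', params[:filename])
    send_file path, type: 'application/pdf', disposition: 'inline'
  end
end

With x_sendfile_header set, send_file only emits the header and lets the web server stream the file, so the Ruby process is freed immediately.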
I'm writing a Rails application that serves files stored on a remote server to the end user.
In my case the files are stored on S3, but the user requests the file via the Rails application (hiding the actual URL). If the file were on my server's local file system, I could use the Apache header X-Sendfile to free up the Ruby process for other requests while Apache took over the task of sending the file to the client. But in my case, where the file is not on the local file system but on S3, it seems that I'm forced to download it temporarily inside Rails before sending it to the client.
Isn't there a way for Apache to serve a "remote" file to the client that is not actually on the server itself? I don't mind if Apache has to download the file for this to work, as long as I don't have to tie up the Ruby process while it's going on.
Any suggestions?
Thomas, I have similar requirements/issues and I think I can answer your problem. First (and I'm not 100% sure you care about this part), hiding the S3 URL is quite easy, as Amazon allows you to point CNAMEs to your bucket and use a custom URL instead of the Amazon URL. To do that, you need to point your DNS to the correct Amazon URL. When I set mine up it was similar to this: files.domain.com points to files.domain.com.s3.amazonaws.com. Then you need to create the bucket with the name of your custom URL (files.domain.com in this example). How to call that URL will be different depending on which gem you use, but a word of warning: the attachment_fu plugin I was using was incorrectly sending me to files.domain.com/files.domain.com/name_of_file.... I couldn't find the setting to fix it, so a simple .sub in the S3 portion of the plugin fixed it.
On to your other questions. To execute some Rails code (like recording the hit in the db) before the download, you can simply do this:
def download
  file = File.find(...)   # look up the record for the requested file
  # code to record 'hit' to database
  redirect_to S3Object.url_for(file.filename,
                               bucket,
                               :expires_in => 3.hours)
end
That code will still cause the file to be served by S3, but it still gives you the ability to run some Ruby first. (Of course the above code won't work as is; you will need to point it to the correct file and bucket, and my Amazon keys are saved in a config file. The above also uses the syntax of the AWS::S3 gem - http://amazon.rubyforge.org/.)
Second, the Content-Disposition: attachment issue is a bit trickier. Hopefully your situation is a bit simpler than mine and the following solution can work. Assuming the object 'file' (in this example) is the correct S3 object, you can set the disposition to attachment with:
file.content_disposition = "attachment"
file.save
The above code can be executed after the file already exists on the S3 server (unlike some other headers and permissions), which is nice, and the header can also be set when you upload the file (syntax depends on your plugin). I'm still trying to find a way to tell S3 to send the file as an attachment only when requested (not every time); if you find that, please let me know your solution. I need to be able to download a file sometimes and embed it (an image, for example) into HTML at other times. I'm not using the above-mentioned redirect, but fortunately it seems that if you embed a file that carries the Content-Disposition: attachment header (in an HTML image tag, say), the browser still displays the image normally (though I haven't thoroughly tested that across enough browsers to send it into the wild).
Hope that helps! Good luck.
I'm maintaining a Rails app that has content in the public/ folder that will now need to be protected by a login. We're considering moving those folders of files into a path outside of public/ and writing a Rails controller to serve up the content.
Before we begin writing this, I was curious if anyone else has run into this sort of problem? I looked for some gems/plugins that might already do this but didn't find anything. Has anyone created a gem for this?
I've done this on a site where people pay to download certain files, and the files are stored in RAILS_ROOT/private. The first thing to know is that you want the web server to handle sending the file, otherwise your app will be held up transmitting large files and this will quickly bring your site to a halt if you have any kind of download volume. So, if you need to check authorization in a controller, then you also need a way to pass control of the download back to the web server. The best way of doing this (that I know of) is the X-Sendfile header, which is supported by Nginx, Apache (with module), and others. With X-Sendfile configured, when your web server receives a X-Sendfile header from your app, it takes over sending the file to the client.
Once you have X-Sendfile working for your web server, a private controller method like this is helpful:
##
# Send a protected file using the web server (via the x-sendfile header).
# Takes the absolute file system path to the file and, optionally, a MIME type.
#
def send_file(filepath, options = {})
  options[:content_type] ||= "application/force-download"
  response.headers['Content-Type']        = options[:content_type]
  response.headers['Content-Disposition'] = "attachment; filename=\"#{File.basename(filepath)}\""
  response.headers['X-Sendfile']          = filepath
  response.headers['Content-Length']      = File.size(filepath).to_s   # header values should be strings
  render :nothing => true
end
Then your controller action could look something like this:
##
# Private file download: check permission first.
#
def download
  product = Product.find_by_filename!(params[:filename])
  if current_user.has_bought?(product) or current_user.is_superuser?
    if File.exist?(path = product.filepath)
      send_file path, :content_type => "application/pdf"
    else
      not_found
    end
  else
    not_authorized
  end
end
Obviously your authorization method will vary and you'll need to change the headers if you're offering files other than PDFs or you want the file to be viewed in the browser (get rid of application/force-download content type).
You could use Amazon S3: your controllers can generate and serve up the URLs behind your secure area, and S3 also has a feature that makes resources available only for a certain amount of time once a URL is generated.
Check out this url: http://docs.amazonwebservices.com/AmazonS3/2006-03-01/index.html?RESTAuthentication.html
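For illustration, a rough sketch of generating such a time-limited URL with the current aws-sdk-s3 gem (bucket and key names are made up):

require 'aws-sdk-s3'

def download
  obj = Aws::S3::Resource.new(region: 'us-east-1')
           .bucket('my-secure-bucket')
           .object("protected/#{params[:filename]}")

  # Presigned URL that stops working 10 minutes after it is generated.
  redirect_to obj.presigned_url(:get, expires_in: 600)
end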
AFAIK, X-SendFile is not supported by nginx. Nginx has its own extension allowing this, called X-Accel-Redirect.
You will find more information about this here:
https://www.nginx.com/resources/wiki/start/topics/examples/xsendfile/
There is also a Rails plugin implementing this feature on GitHub: goncalossilva/X-Accel-Redirect
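A minimal sketch of how the pieces fit together, assuming a hypothetical internal location in your nginx config (the location name and paths are placeholders):

# nginx config (outside Rails):
#   location /protected_files/ {
#     internal;                        # only reachable via X-Accel-Redirect
#     alias /var/www/myapp/private/;
#   }

def download
  # Authorize here, then hand the actual transfer off to nginx.
  response.headers['X-Accel-Redirect']    = "/protected_files/#{params[:filename]}"
  response.headers['Content-Type']        = 'application/octet-stream'
  response.headers['Content-Disposition'] = "attachment; filename=\"#{params[:filename]}\""
  head :ok
end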
If you want to tie content delivery with your Rails authentication and authorization system, then you essentially have to put the content behind a controller.
If you are looking at a more simple login approach, you can handle it with HTTP Auth and settings in your hosting environment (using htaccess, for example).
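If you would rather keep that simple login inside the Rails app than in htaccess, here is a sketch using the built-in HTTP Basic helper (credentials and paths are placeholders):

class ProtectedFilesController < ApplicationController
  http_basic_authenticate_with name: 'admin', password: ENV.fetch('FILES_PASSWORD', 'changeme')

  def show
    # NB: sanitize params[:filename] in real code.
    send_file Rails.root.join('private', params[:filename])
  end
end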
Making the file available at an unpredictable URL is a simple solution currently used in some production systems.
E.g.: GitLab. An image was uploaded to an issue of a private repository, https://gitlab.com/cirosantilli/test-private/issues/1, yet anyone with the direct upload URL can still see it.
Note the unguessable 90574279de prefix automatically added to that URL.
Bitbucket (non-Rails) also uses this technique.
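A rough sketch of the same idea in a Rails app: store each upload under a random, unguessable prefix and build the public path from it (names are placeholders).

require 'securerandom'
require 'fileutils'

def store_upload(uploaded_file)
  token = SecureRandom.hex(5)                      # e.g. "90574279de"
  dir   = Rails.root.join('public', 'uploads', token)
  FileUtils.mkdir_p(dir)
  File.binwrite(dir.join(uploaded_file.original_filename), uploaded_file.read)

  "/uploads/#{token}/#{uploaded_file.original_filename}"   # the hard-to-guess URL
end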