File access control in Rails

I have a web application which allows users to upload files and share them with other people across the internet. Anyone who has been given access can download a file, but if the uploader hasn't specifically shared a file with someone, that person can't download it.
Since user permissions are controlled by Rails, each time someone downloads a file it is served to them by a Rails process. This is a serious bottleneck - Rails is needed for the file upload and for permissions, but it shouldn't be in the way, taking up memory, just so others can download files.
I would like to split the application onto different servers for the frontend, database, and file server. If a user goes to my site, they should be able to download a file directly from something like my-fileserver.domain.com/file/38183 instead of running it through Rails.
What is the best option for this? I would like to control file access at the database level, not the file system - but I don't want Rails taking up all of the memory on my system for such a simple process. Any ideas?
Edit:
One thing I may be able to do is load the list of files/permissions from MySQL into a Node.js app and have it grant access rights to the file server as a true/false response based on what the file server sends in. This still requires the file server to run a web server, however.
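A minimal sketch of that permission-check idea, written here as a Rack app rather than Node.js to stay in the thread's own language (the permissions table and the user_id/file_id parameters are assumptions, not from the question). The file server calls it and allows or denies the download based on the status code:

    # permission_check.ru - a hypothetical sketch; run with: rackup permission_check.ru
    require "rack"
    require "mysql2"

    DB = Mysql2::Client.new(host: "localhost", username: "app", database: "files")

    # Responds 200 if the user may read the file, 403 otherwise.
    run lambda { |env|
      params = Rack::Request.new(env).params
      # to_i guards against SQL injection for these numeric ids.
      rows = DB.query(
        "SELECT 1 FROM permissions " \
        "WHERE user_id = #{params['user_id'].to_i} AND file_id = #{params['file_id'].to_i} LIMIT 1"
      )
      status = rows.any? ? 200 : 403
      [status, { "Content-Type" => "text/plain" }, [status == 200 ? "ok" : "forbidden"]]
    }

Nginx's auth_request module, for instance, performs exactly this kind of per-request subrequest and serves the file from disk only if the check returns 2xx.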

Maybe you could generate a random URL for each file, and control access from a central system.
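A minimal sketch of that suggestion in Rails (the SharedFile model and token column are assumptions): give each file an unguessable token and let the file server key downloads off the token rather than a sequential id.

    require "securerandom"

    class SharedFile < ActiveRecord::Base
      before_create :assign_token

      # e.g. https://my-fileserver.domain.com/file/9f86d081884c7d65...
      def download_url
        "https://my-fileserver.domain.com/file/#{token}"
      end

      private

      # 32 hex characters of randomness - effectively unguessable.
      def assign_token
        self.token = SecureRandom.hex(16)
      end
    end

The central system can then revoke access by deleting or rotating the token, without the file server ever consulting the permissions database.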

Related

Can I transfer files from a local server to my Nextcloud server without using the internet and allow users to access them?(same computer)

I used Docker to set up a Nextcloud server for myself and my family.
Can I transfer files from a local server to my Nextcloud server without using the internet and allow users to access them?
I have discovered two strange things:
1. Placing files directly under a specific user's file path on the server does not let the user access them.
2. As long as I don't delete the files added by the user, the user can still accurately read the original content, even if I directly change the content of the files on the server.
Or is the user file path that I have in mind incorrect?
I think it's /var/www/html/data/"USERID"/files
I would like to know how to solve this, and I also want to know what causes these two problems.
Thank you so much.

Web application security concerns with user-uploaded files on Amazon S3?

Background
I have a web application where users may upload a wide variety of files (e.g. jpg, png, r, csv, epub, pdf, docx, nb, tex, etc). Currently, we whitelist exactly which file types a user may upload. This limitation is sometimes annoying for users (i.e. because they must zip disallowed files, then upload the zip) and for us (i.e. because users write to support asking for additional file types to be whitelisted).
Ideal Solution
Ideally, I'd love to whitelist files more aggressively. Specifically, I would like to (1) figure out which file types may be trusted and (2) whitelist them all. Having a larger whitelist would be more convenient for users and would reduce the number of support tickets (if only very slightly).
What I Know
I've done a few hours of research and have identified common problems (e.g. path traversal, placing assets in the root directory, htaccess vulnerabilities, failure to validate MIME type, etc). While this research has been interesting, my understanding is that many of these issues are moot (or considerably mitigated) if your assets are stored on Amazon S3 (or a similar cloud storage service) - which is how most modern web applications manage user-uploaded files.
Hasn't this question already been asked a zillion times?!
Please don't mistake this as a general "What are the security risks of user-uploaded content?" question. There are already many questions like that and I don't want to rehash that discussion here.
More specifically, my question is, "What risks, if any, exist given a conventional / modern web application setup?" In other words, I don't care about some old PHP app or vulnerabilities related to IE6. What should I be worried about, assuming files are stored in a cloud service like Amazon S3?
Context about infrastructure / architecture
So... To answer that, you'll probably need more context about my setup. That said, I suspect this is a relatively common setup and therefore hope the answers will be broadly useful to anyone writing a modern web application.
My stack
Ruby on Rails application, hosted on Heroku
Users may upload a variety of files (via Paperclip)
Server validates both MIME type and extension against a whitelist (see the sketch after this list)
Files are stored on Amazon S3 (with varying ACL permissions)
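A minimal sketch of what that whitelist validation looks like with Paperclip (the attachment name document and the exact type lists are assumptions, not the asker's real configuration):

    class Upload < ActiveRecord::Base
      has_attached_file :document

      # Whitelist by MIME type...
      validates_attachment_content_type :document,
        content_type: %w[image/jpeg image/png application/pdf text/csv]

      # ...and independently by extension, since the two can disagree.
      validates_attachment_file_name :document,
        matches: [/\.jpe?g\z/i, /\.png\z/i, /\.pdf\z/i, /\.csv\z/i]
    end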
When a user uploads a file...
I upload the file directly to S3 into a tmp folder, so it hasn't touched my server yet (see the sketch after these steps)
My server then downloads the file from the tmp folder on S3.
Paperclip runs validations and executes any processing (e.g. cutting thumbnails of images)
Finally, Paperclip places the file(s) back on S3 in their new, permanent location.
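For the direct-to-S3 step, a minimal sketch using the aws-sdk-s3 gem (the bucket name, key scheme, and region are assumptions; the asker's actual upload mechanism isn't specified):

    require "securerandom"
    require "aws-sdk-s3"

    s3  = Aws::S3::Resource.new(region: "us-east-1")
    obj = s3.bucket("my-app-uploads").object("tmp/#{SecureRandom.uuid}")

    # A short-lived URL the browser can PUT the raw file to directly,
    # so the upload never passes through the Rails dyno.
    upload_url = obj.presigned_url(:put, expires_in: 600)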
When a user downloads a file...
User clicks to download a file, which sends a request to my API (e.g. /api/article/123/download)
Internally, my API reads the file from S3 and then serves it to the user as an attachment (see the sketch after these steps)
Thus the file does briefly pass through my server (i.e. it is not merely a redirect)
From the user's perspective, the file is served from my API (i.e. the user has no idea the files live on S3)
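A minimal sketch of that download action (the model, permission check, and S3 names are assumptions):

    class Api::ArticlesController < ApplicationController
      def download
        article = Article.find(params[:id])
        head :forbidden and return unless article.readable_by?(current_user) # hypothetical check

        # Pull the object from S3 and relay it; the user only ever talks to our API.
        s3   = Aws::S3::Client.new(region: "us-east-1")
        file = s3.get_object(bucket: "my-app-uploads", key: article.s3_key)
        send_data file.body.read,
                  filename:    article.file_name,
                  disposition: "attachment"
      end
    end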
Questions
Given this setup, is it safe to whitelist a wide range of file types?
Are there some types of files that are always best avoided (e.g. JS files)?
Are there any glaring flaws in my setup? I suspect not, but if so, please alert me!

Is there a way to know if my user has finished a download?

For a project, I need to know when my user has finished downloading a file, so I can delete it from my remote server. Is there a way to do that?
There are a couple of ways of doing this, some more efficient than others, but here is what I've come up with.
Download through your application
If your application is downloading/passing the file through to the user you can trigger a function at the end of the stream to delete the file.
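A minimal sketch of that in Rails, streaming the file in chunks and deleting it once the response body has been fully consumed (the storage path and chunk size are assumptions):

    class DownloadsController < ApplicationController
      def show
        # NB: validate params[:id] in real code to avoid path traversal.
        path = Rails.root.join("storage", params[:id]).to_s
        headers["Content-Disposition"] = "attachment"

        self.response_body = Enumerator.new do |yielder|
          begin
            File.open(path, "rb") do |f|
              while (chunk = f.read(64 * 1024))
                yielder << chunk
              end
            end
          ensure
            # Runs after the whole body has been streamed (or the client drops).
            File.delete(path) if File.exist?(path)
          end
        end
      end
    end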
S3 Bucket Access Logging
S3 has server access logs (http://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html) that record information for each request. Depending on how your application is structured, you may be able to process these to see what's been accessed (see the sketch below).
There may be up to a 30-60 minute delay in log availability
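A minimal sketch of scanning those logs for downloads of one object (the bucket names, log prefix, and object key are assumptions; S3 access logs are plain space-delimited text, and pagination is omitted for brevity):

    require "aws-sdk-s3"

    s3 = Aws::S3::Client.new(region: "us-east-1")

    # Access logs are delivered as text objects into the logging
    # bucket/prefix configured on the source bucket.
    s3.list_objects_v2(bucket: "my-log-bucket", prefix: "s3-access/").contents.each do |log|
      body = s3.get_object(bucket: "my-log-bucket", key: log.key).body.read
      body.each_line do |line|
        # REST.GET.OBJECT is the operation field S3 writes for downloads.
        puts line if line.include?("REST.GET.OBJECT") && line.include?("files/38183.pdf")
      end
    end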
Other Options
There are some other options, though perhaps not ideal (without knowing the specifics of your application I don't know whether these are acceptable).
Use Object Expiration (http://docs.aws.amazon.com/AmazonS3/latest/dev/ObjectExpiration.html) - see the sketch after these options
Related SO question (PHP instead of RoR, but the concepts should apply): Resumable downloads when using PHP to send the file?
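For the Object Expiration option, a minimal sketch of a lifecycle rule that has S3 itself delete anything under a prefix about a day after creation (the bucket and prefix are assumptions):

    require "aws-sdk-s3"

    s3 = Aws::S3::Client.new(region: "us-east-1")

    # No application code runs at delete time; S3 expires the objects.
    s3.put_bucket_lifecycle_configuration(
      bucket: "my-app-uploads",
      lifecycle_configuration: {
        rules: [{
          id:         "expire-one-time-downloads",
          status:     "Enabled",
          filter:     { prefix: "one-time-downloads/" },
          expiration: { days: 1 }
        }]
      }
    )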

Serving files through controllers with partial download support

I need to serve files through Grails. Only users with permission have access, so I can't serve them with a static link to a container. The system is able to stream binary files to the client without problems, but now (for bandwidth/performance reasons on the client) I need to implement segmented or partial downloads in the controllers.
Is there a plugin or proven solution to this problem?
Maybe some kind of Tomcat/Apache plugin could restrict access to files with certain rules or temporary tickets, so I can delegate the "resume download" or "segmented download" problem to the container.
I also need to log and save stats on users' downloads.
I need good performance, so I think doing this in the controller is not a good idea.
Sorry for my bad English.
There is a plugin for Apache - https://tn123.org/mod_xsendfile/. It doesn't matter what you're running behind Apache in this case. Using this plugin, you respond with the special X-Sendfile header containing the path of the file to serve, and Apache takes care of the actual download for the current request.
If you're using Nginx, you have to use the X-Accel-Redirect header instead; see http://wiki.nginx.org/XSendfile
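The question is about Grails, but since this thread is Rails-centric, here is how the same header trick looks from Rails, which ships the Rack::Sendfile middleware for exactly this; a minimal sketch (the model and permission check are hypothetical):

    # config/environments/production.rb
    config.action_dispatch.x_sendfile_header = "X-Sendfile"        # Apache + mod_xsendfile
    # config.action_dispatch.x_sendfile_header = "X-Accel-Redirect" # Nginx

    # app/controllers/downloads_controller.rb
    class DownloadsController < ApplicationController
      def show
        download = Download.find(params[:id])                       # hypothetical model
        head :forbidden and return unless download.accessible_by?(current_user)

        # Rails only emits the header; Apache/Nginx streams the bytes,
        # including handling Range requests for resumed/partial downloads.
        send_file download.path, disposition: "attachment"
      end
    end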

mod_xsendfile alternatives for a shared hosting service without it

I'm trying to log download statistics for .pdfs and .zips (5-25MB) in a Rails app that I'm currently developing, and I just hit a brick wall: I found out our shared hosting provider doesn't support mod_xsendfile. The sources I've read state that without it, multiple simultaneous downloads could cause a DoS issue - something I'm definitely trying to avoid. I'm wondering if there are any alternatives to this method of serving files through Rails?
Well, how sensitive are the files you're storing?
If you hosted these files somewhere under your app's /public directory, you could just do a meta tag or JavaScript redirect to the public-facing URL of these files after your users hit some sort of controller action that updates your download statistics.
In this case, your users would probably need to get one of those "Your download should commence in a few moments" pages before the browser would start the file download.
Under this scenario, your Rails application won't be streaming the file out; your web server will, which gives you the same effect as X-Sendfile. On the other hand, this won't work very well if you need to control access to those downloadable files.
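A minimal sketch of that count-then-redirect approach (the DownloadStat model and the public/downloads layout are assumptions, and it only works because the files themselves are world-readable):

    class PublicDownloadsController < ApplicationController
      def show
        # NB: sanitize params[:file] in real code to avoid path traversal.
        DownloadStat.create!(file_name: params[:file]) # hypothetical stats model

        # The file lives under public/downloads/, so after this redirect
        # the web server - not Rails - streams the bytes.
        redirect_to "/downloads/#{params[:file]}"
      end
    end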
