S3-like signed URLs for nginx file serving - Docker

So, I have set up my project and I am planning to serve my files from my own server. Currently I have two folders from which I want users to access files: a media folder and a static folder. I want to set up signed URLs, just like S3 provides, so that nobody can brute-force their way through either folder and download all my files. I also want nginx to handle file uploads directly, rather than passing them through my application.

For the signed URLs: you need to generate them with your application. Then you have two options:
Serve the files with your application after validating the URL.
Serve the files with nginx but do the validation/authorization with your app (see the sketch after this list). This is what you need:
http://nginx.org/en/docs/http/ngx_http_auth_request_module.html#auth_request
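As a rough illustration of the nginx-side option, here is a hedged sketch, not a drop-in config. It assumes the app container is reachable as app:8000 on the Docker network and exposes a hypothetical /auth endpoint that returns 2xx when the signature parameters are valid and 401/403 otherwise; the volume paths are assumptions too:

```nginx
location /media/ {
    auth_request /auth;            # ask the app before serving anything
    alias /srv/media/;             # assumed volume mount for the media folder
}

location /static/ {
    auth_request /auth;
    alias /srv/static/;            # assumed volume mount for the static folder
}

location = /auth {
    internal;                      # not reachable from outside
    proxy_pass http://app:8000/auth;
    proxy_pass_request_body off;   # the auth decision needs no request body
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;  # carries the signature params
}
```

On the application side, an S3-style signature can be as simple as an HMAC over the path plus an expiry timestamp. A minimal Ruby sketch, assuming a shared secret in an environment variable and query parameters named expires and signature:

```ruby
require "openssl"
require "rack/utils"

SECRET = ENV.fetch("URL_SIGNING_SECRET")  # hypothetical env var

# Build an S3-style signed URL: path + expiry, authenticated with an HMAC.
def signed_url(path, expires_in: 300)
  expires   = Time.now.to_i + expires_in
  signature = OpenSSL::HMAC.hexdigest("SHA256", SECRET, "#{path}#{expires}")
  "#{path}?expires=#{expires}&signature=#{signature}"
end

# Validation done by the /auth endpoint that nginx's auth_request calls.
def valid_signature?(path, expires, signature)
  return false if expires.to_i < Time.now.to_i     # link has expired
  expected = OpenSSL::HMAC.hexdigest("SHA256", SECRET, "#{path}#{expires}")
  Rack::Utils.secure_compare(expected, signature)  # constant-time compare
end
```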
For the upload part the situation is similar; you can either:
Handle uploads with your app (see the sketch below).
Use nginx-upload-module and handle only the authorization with your app.
I would definitely recommend handling all uploads within your app; it is just simpler.
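If you do keep uploads in the app, a bare-bones sketch might look like this, assuming a Rails app for illustration (MEDIA_ROOT and the :file parameter name are assumptions):

```ruby
class UploadsController < ApplicationController
  MEDIA_ROOT = Rails.root.join("media")   # assumed upload destination

  def create
    upload = params.require(:file)                     # ActionDispatch::Http::UploadedFile
    name   = File.basename(upload.original_filename)   # strip any path components
    File.binwrite(MEDIA_ROOT.join(name), upload.read)  # write into the served folder
    head :created
  end
end
```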

Related

Serve audio file from GCS with Rails

I have audio files located on a private GCS bucket. I want to serve these audio files for users to listen to.
I can't use Active Storage for this, as these files are created and deleted outside of my Rails application.
I could download the files using the google-cloud-storage gem, which would cover authentication and file download. But if I understand correctly, I can only serve files from the public directory, so would I need to download them to Rails.public_path?
Furthermore, I really don't want to manage these files after downloading them - caching, deleting them after some time, etc.
What would be the best way to achieve this?
The best option in my opinion would be to use the google-cloud-storage gem,
since both Google::Cloud::Storage::Bucket and Google::Cloud::Storage::File have the #signed_url method. This way you can find the relevant file(s), create a temporary URL, and send that URL to the client, which will be in charge of downloading the file directly (see the sketch below).
If you don't want the client to download the file directly from Google Cloud, you can download the file from GCS yourself and use #send_data or #send_file in the controller.
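A minimal sketch of both options. The gem's Google::Cloud::Storage.new, #bucket, #file, #signed_url, and #download calls are real; the project, bucket, and object names are hypothetical:

```ruby
require "google/cloud/storage"

storage = Google::Cloud::Storage.new(project_id: "my-project")  # hypothetical project
bucket  = storage.bucket("my-audio-bucket")                     # hypothetical bucket
file    = bucket.file("episodes/42.mp3")                        # hypothetical object key

# Option 1: hand the client a short-lived URL and let it fetch from GCS directly.
url = file.signed_url(method: "GET", expires: 300)   # expires after 300 seconds

# Option 2: proxy the bytes through Rails yourself, inside a controller action.
# (#download with no path returns an in-memory StringIO.)
# send_data file.download.read, type: "audio/mpeg", disposition: "inline"
```

Since the signed URL expires on its own, there is nothing to cache or clean up afterwards.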

Azure file storage/indexing solution

I'm developing a Web Application, and it is running as an Azure Web App. This application has a section in which a user can navigate a directory, and allows the user to open the files and browse sub-directories in said directory.
At the moment, the sub-directories and files are inside "~/Content/Documents", and I am browsing the directories by using the Directory.GetFiles() and Directory.GetDirectories() functions provided by System.IO.
The files in question are retrieved and downloaded several times a day, and there is no way to map the paths manually one by one, seeing as there is a large quantity and they are subject to change.
However, it has become inconvenient to store the files within the web directory. So my two questions are:
What Azure service can I use to store and retrieve my files?
and
Which of these services provides the ability to index/map a path, which would fit with my web-app?
Please note that the users do not have the ability to edit or otherwise upload any of the files, and there is therefore no need for the service to allow non-authenticated uploads.
The newish Azure File Storage feature can be used to store files in Azure Storage and make them accessible via an SMB file share. This allows legacy applications that require a traditional file share for saving and retrieving files to keep working: existing code (such as your System.IO calls) can point at the mounted share, without needing to completely rewrite the file storage code.
https://azure.microsoft.com/en-us/blog/azure-file-storage-now-generally-available/

Storing assets in the cloud and reading them securely

I am developing an iOS app that uses a large number of images needed for animations in short videos. I want to store my application assets as static files in the cloud and, once they are needed, download them using a secure API call (JSON, XML, or any other alternative for that matter).
I have checked Parse, Dropbox, iCloud, and Google Drive, but I am puzzled, since I only see instructions for dynamic data that lets users access content they have created, not static assets.
What would be the best option for that?
If you just want an easy way to serve static files, I would take a look at Amazon S3. You can upload files through the online console and then use the public URL to those files in your app. You can also use the S3 API to upload files through your web service or iOS app (see the sketch below).
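For instance, a hedged Ruby sketch of the web-service side with the aws-sdk-s3 gem (bucket and key names are hypothetical; credentials come from the usual AWS environment variables or shared config):

```ruby
require "aws-sdk-s3"

s3  = Aws::S3::Resource.new(region: "us-east-1")
obj = s3.bucket("my-app-assets").object("animations/intro/frame-001.png")

obj.upload_file("frame-001.png", acl: "public-read")  # publish a static asset
puts obj.public_url                                   # URL to embed in the app
```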
Hope this helps!
I'd go for Parse (basically because it is fast to learn and develop with); you can create a table with the images and restrict write permissions if you are afraid somebody could modify the table.
Another option you can check is the special Config table, which lets you upload custom files (e.g. zip files) and download them on demand.

Dynamically generate file package from S3 assets

I have a service set up where, when the user registers, they are able to download a file to their device. The file is dynamically generated from local information in our database, such as custom field information (username, email, web URL, etc.), combined with account-specific assets stored on S3 (avatar, icons, background art).
I'm not sure of the best way to handle these S3 files as part of the generation process.
Using Ruby's Tempfile class generates a file with a unique filename that doesn't match what we are expecting. Using Ruby's File class generates the files we want, but it litters the filesystem with files, and I worry it won't handle concurrent requests for the same assets properly (see the Tempfile sketch below). We're also using Heroku, and from what I read they tend to frown on that.
What's a best practice/recommended way to handle dynamically generating files based on a mix of local and remote assets and then presenting it to the user?
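On the Tempfile point above: Tempfile only appends a unique suffix; you still control the basename and extension, concurrent requests get distinct files, and the file is removed automatically, which suits Heroku's ephemeral filesystem. A minimal sketch (the S3 URL is hypothetical):

```ruby
require "tempfile"
require "open-uri"

Tempfile.create(["avatar", ".png"]) do |tmp|   # e.g. /tmp/avatar20240101-1-x7k2.png
  tmp.binmode
  tmp.write URI.open("https://my-bucket.s3.amazonaws.com/avatar.png").read
  tmp.flush
  # Assemble the package from tmp.path here. The filename the user sees is
  # chosen when you build the archive (or via send_data's :filename option),
  # so it never has to match the tempfile's unique on-disk name.
end   # the tempfile is deleted when the block exits
```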

Rails + Amazon S3 + Heroku: URLs to files on S3 are public; how to secure them, and how can an admin add files to users' folders?

I have a Rails app with a Dropbox-like feature set.
Each user has a login and password.
Each user can upload and download their own files.
On their index page they see all the files they have uploaded.
The URLs to the files are saved in the DB (within Heroku).
I have a few questions on how to approach some functionality that I'd like to add to the app.
1) I, as an admin, would like to add files to the users' folders, which will show up when the user logs into the app next time. Currently, even if I drop the files in the folders, users can't see them, because their index.html page pulls up only those files whose URLs are stored in the DB.
2) Currently file access is by URL, so it's public. This is a big problem. I would like to set up the app so that the URL is not public. Since I'm using Heroku I cannot store the files on the Heroku servers, and I wouldn't want to stream them into the app and then provide them to the user through Heroku. So what's the best way to serve them directly from S3 without revealing the URL?
Thanks for your help
I think the answer to 1) is to create an action that allows the admin to create a file object and associate it with a user.
As for 2) (and this should help with figuring out 1), incidentally), the Paperclip gem supports attaching files to a model, with an option to store the file on S3 and the ability to specify the URL to that file (see the sketch below).
Here's one of many related tutorials that walks through some considerations for protecting access to those files.
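A hedged sketch of that setup (the model, bucket, and env var names are assumptions; has_attached_file with S3 storage and #expiring_url are Paperclip's real API):

```ruby
class UserFile < ApplicationRecord
  belongs_to :user   # lets an admin create a record for any user (question 1)

  has_attached_file :file,
                    storage: :s3,
                    s3_permissions: :private,   # no public S3 URL (question 2)
                    s3_credentials: {
                      bucket:            ENV["S3_BUCKET"],
                      access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
                      secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
                    }
  do_not_validate_attachment_file_type :file    # sketch only; validate for real
end

# In a controller: authorize, then redirect to a short-lived signed S3 URL.
def download
  user_file = current_user.user_files.find(params[:id])
  redirect_to user_file.file.expiring_url(60)   # link is valid for 60 seconds
end
```

Because each index page entry now comes from a UserFile row, files the admin attaches show up for the user automatically, and the underlying S3 objects are never publicly addressable.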
