I have a service set up where, when a user registers, they can download a file to their device. The file is dynamically generated from local information in our database, such as custom field data (username, email, web URL, etc.), plus account-specific assets stored on S3 (avatar, icons, background art).
I'm not sure of the best way to handle these S3 files as part of the generation process.
Using Ruby's Tempfile class produces a file with a unique filename that doesn't match what we are expecting. Using Ruby's File class produces the files we want, but it litters the filesystem with files, and I worry it won't handle concurrent requests for the same assets properly. We're also on Heroku, which from what I've read tends to frown on writing to the filesystem.
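For illustration, this is the Tempfile behavior in question (a minimal sketch; the names are arbitrary):

require "tempfile"

# Tempfile appends a unique suffix, so the on-disk name never matches
# the asset name we actually want.
file = Tempfile.new(["avatar", ".png"])
file.path
# => e.g. "/tmp/avatar20240101-1234-1x2y3z.png", not "avatar.png"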
What's a best practice/recommended way to handle dynamically generating files based on a mix of local and remote assets and then presenting it to the user?
I have audio files located on a private GCS bucket. I want to serve these audio files for users to listen to.
I can't use Active Storage for this, as these files are created/deleted outside of my Rails application.
I could download the files using the google-cloud-storage gem; it would cover authentication and file download. But if I understand correctly, I can only serve files from the public directory, so would I need to download them to Rails.public_path?
Furthermore, I really don't want to manage these files after downloading them - caching, deleting them after some time, etc.
What would be the best way to achieve this?
The best option in my opinion would be to use the google-cloud-storage gem, since both Google::Cloud::Storage::Bucket and Google::Cloud::Storage::File have the #signed_url method. This way you can find the relevant file(s) you need and create a temporary URL, then send the URL to the client, which will be in charge of downloading the file directly.
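For example (a sketch only; the project, keyfile, bucket, and object names are placeholders):

require "google/cloud/storage"

# Placeholder credentials and names; substitute your own.
storage = Google::Cloud::Storage.new(
  project_id:  "my-project",
  credentials: "path/to/keyfile.json"
)
bucket = storage.bucket "my-audio-bucket"
file   = bucket.file "episodes/ep01.mp3"

# Temporary URL the client can fetch directly; expires in 5 minutes.
url = file.signed_url expires: 300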
If you don't want the client to download the file directly from Google Cloud, you can download the file from GCS yourself and use #send_data or #send_file in the controller.
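A rough sketch of that controller-side alternative (AudioController and params[:key] are hypothetical names; credentials as above):

class AudioController < ApplicationController
  def show
    file = bucket.file params[:key]   # a Google::Cloud::Storage::File
    send_data file.download.read,     # #download with no path returns a StringIO
              filename:    File.basename(file.name),
              type:        file.content_type,
              disposition: "inline"
  end

  private

  # Uses default credentials from the environment.
  def bucket
    @bucket ||= Google::Cloud::Storage.new.bucket "my-audio-bucket"
  end
end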
I am developing an iOS app that uses a large number of images needed for animations in short videos. I want to store my application assets as static files in the cloud and download them when needed using a secure API call (JSON, XML, or any other alternative for that matter).
What is the best option for this? I have checked Parse, Dropbox, iCloud, and Google Drive, but I am puzzled, since I only see instructions for dynamic data that lets users access content they have created, not for static assets.
What would be the best option?
If you just want an easy way to serve static files, I would take a look at Amazon S3. You can upload files through the online console and then get the public URLs to those files to use in your app. You can also use the S3 API to upload files through your web service or iOS app.
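For the web-service side, a rough sketch with the aws-sdk-s3 gem (shown in Ruby; the region, bucket, and key are made up):

require "aws-sdk-s3"

s3  = Aws::S3::Resource.new(region: "us-east-1")
obj = s3.bucket("my-app-assets").object("animations/frame_001.png")

# Upload a local file and make it publicly readable.
obj.upload_file("frame_001.png", acl: "public-read")
obj.public_url
# => "https://my-app-assets.s3.amazonaws.com/animations/frame_001.png"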
Hope this helps!
I'd go for Parse (basically because it is fast to learn and develop with). You can create a table with the images and change the write permissions if you are afraid somebody could modify the table.
Another option you can check is the special Config table, which lets you upload custom files (e.g. zip files) and download them on demand.
I'd like to give our business team the ability to edit certain pages and content themselves via a CMS solution in our grails application, and Weceem plugin seems like a good choice.
The potential showstopper I see is that it uses the local server file system for uploaded content, which is no good in a horizontally scaled cloud environment like ours (we run in AWS).
The question is: is it possible to tell Weceem to use the database to store binary/uploaded content, or (better yet) to override the content upload handlers to use Amazon S3 instead of the file system? (We already have code that uploads to S3 in our main app, so the question is just how to hook into Weceem.)
I assume that in such a situation it's possible to create your own content type (domain class) in your app that stores binary uploaded content. This class should be a subclass of org.weceem.content.WcmContent. Weceem includes a small example of storing such content; see the org.weceem.files.WcmContentFileDB class. There is also information on how to extend the plugin with a custom content type. I hope this information is helpful.
As for uploading: in Weceem we use the CKeditor plugin for uploading additional files/resources. org.weceem.files.WcmContentFile is also used; it stores files on the file system, and the files are uploaded to paths provided by the org.weceem.services.WcmContentRepositoryService.getUploadPath(...) method. This path is calculated from a configuration property provided in the application config (e.g. 'weceem.upload.dir'). I'm not sure you can hook in here.
I have a website that shows galleries. Users can upload their own content from the web (by entering a URL) or by uploading a picture from their computer.
I am storing the URL in the database, which works fine for the first use case, but I need to figure out where to store the actual images if a user uploads from their computer.
Is there any recommendation here or best practice on where I should store these?
Should I save them in the App_Data or Content folders? Should they not be stored with the website at all, since it's user content?
You should NOT store the user uploads anywhere they can be directly accessed by a known URL within your site structure. This is a security risk, as users could upload .htm and .js files. Even a file with the correct extension can contain malicious code that can be executed in the context of your site by an authenticated user, allowing server-side or client-side attacks.
See for example http://www.acunetix.com/websitesecurity/upload-forms-threat.htm and "What security issues appear when users can upload their own files?", which mention some of the issues you need to be aware of before you allow users to upload files and then present them for download within your site.
Don't put the files within your normal web site directory structure
Don't use the original file name the user gave you. You can add a Content-Disposition header with the original file name so they can download it again under the same name, but the path and file name on the server shouldn't be something the user can influence (see the sketch after this list).
Don't trust image files - resize them and offer only the resized version for subsequent download
Don't trust MIME types or file extensions; open the file and manipulate it to make sure it's what it claims to be.
Limit the upload size and time.
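To make the file-name and header advice concrete, here's a hedged Rails-flavored sketch (the Upload model and its fields are hypothetical; the same pattern applies in any framework):

# stored_name is a server-generated UUID, so the user-supplied name
# never influences the on-disk path.
def download
  upload = Upload.find(params[:id])
  path   = Rails.root.join("private_uploads", upload.stored_name)

  # The original name appears only in the Content-Disposition header.
  send_file path,
            filename:    upload.original_name,
            type:        upload.content_type,
            disposition: "attachment"
end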
If you have the resources to implement something like this, it is extremely beneficial to store all of this in Amazon S3.
Once you get the upload, you simply push it over to Amazon and pop the URL into your database, as you're doing with the other images. As mentioned above, it would probably be wise to open up the image and resize it before sending it over. This both checks that it is actually an image and makes sure you don't accidentally present a full camera-resolution image to an end user.
Doing this now will make it much, much easier if you ever have to migrate/failover your site and don't want to sync gigabytes of image assets.
One way is to store the image in a database table with a varbinary field.
Another way would be to store the image in the App_Data folder, and create a subfolder for each user (~/App_Data/[userid]/myImage.png).
For both approaches you'd need to create a separate action method that makes it possible to access the images.
When handling image uploads, you need to verify the content of the file before storing it; checking the file extension is not trustworthy.
Use the magic number method to verify the file content, which is an easy approach.
See the Stack Overflow post and the list of magic numbers.
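A minimal, framework-agnostic sketch of the magic-number check (shown in Ruby; only a few image signatures included):

# Known leading bytes ("magic numbers") for a few common image formats.
SIGNATURES = {
  "\x89PNG\r\n\x1a\n".b => "image/png",
  "\xFF\xD8\xFF".b      => "image/jpeg",
  "GIF87a".b            => "image/gif",
  "GIF89a".b            => "image/gif"
}.freeze

# Returns the detected MIME type, or nil if the header matches nothing.
def detect_image_type(path)
  header = File.binread(path, 8)
  SIGNATURES.each { |magic, type| return type if header&.start_with?(magic) }
  nil
end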
One way of saving the file is converting it to binary format and saving it in your database; the other method is using the App_Data folder.
Which storage option to use depends on your requirements. See this post as well.
Set an upload limit by adding the maxRequestLength attribute to the httpRuntime element in Web.Config like this, where the size of the file is specified in KB:
<httpRuntime maxRequestLength="51200" executionTimeout="3600" />
You can store your trusted data alongside (i.e. outside) the htdocs/www folder so that users cannot access it directly. You can also add .htaccess authentication for your trusted data (for .htaccess, you should keep your .htpasswd file alongside the htdocs/www folder, outside the web root) if you are using Apache.
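For example, a sketch of the .htaccess (the paths are placeholders):

AuthType Basic
AuthName "Protected uploads"
# .htpasswd sits alongside htdocs/www, outside the web root
AuthUserFile /path/next/to/htdocs/.htpasswd
Require valid-user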
I sort of want to do the reverse of this.
Instead of unzipping and adding the collection files to S3, I want to do the following on a user's request:
generate a bunch of XML files
zip the XML files with some images (pre-existing images hosted on S3)
download the zip
Does anybody know a good way of doing this? I think I could manage this no problem on a normal machine, but Heroku complicates things somewhat in that it has a read-only filesystem.
From the Heroku documentation on the read-only filesystem:
There are two directories that are writeable: ./tmp and ./log (under your application root). If you wish to drop a file temporarily for the duration of the request, you can write to a filename like #{RAILS_ROOT}/tmp/myfile_#{Process.pid}. There is no guarantee that this file will be there on subsequent requests (although it might be), so this should not be used for any kind of permanent storage.
You should be able to write your generated XML files to tmp/ fairly easily and keep track of the names, download and write the S3 files to the same directory, and (maybe?) invoke a zip command as long as the output is in tmp/, then serve the file to the browser with the correct MIME type to prompt a download. My only concerns would be how big the file is and whether Heroku has an undocumented limit on what they'll allow in the tmp directory. Especially since you are only performing this action for a one-time download within the duration of a single request, I think you have a good chance of being able to do it.
Edit: Looking around a bit, you might be able to use something like RubyZip to create your zip file if you want to avoid calling system commands.
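A rough sketch of the whole flow with rubyzip and the aws-sdk-s3 gem (the bucket, keys, and xml_documents hash are all placeholders):

require "zip"        # rubyzip gem
require "aws-sdk-s3"

def build_zip(xml_documents, image_keys)
  zip_path = Rails.root.join("tmp", "bundle_#{Process.pid}.zip").to_s
  s3 = Aws::S3::Client.new(region: "us-east-1")

  Zip::File.open(zip_path, Zip::File::CREATE) do |zip|
    # Write each generated XML document straight into the archive.
    xml_documents.each do |name, xml|
      zip.get_output_stream(name) { |entry| entry.write(xml) }
    end

    # Pull each image down from S3 and add it alongside the XML files.
    image_keys.each do |key|
      body = s3.get_object(bucket: "my-bucket", key: key).body.read
      zip.get_output_stream(File.basename(key)) { |entry| entry.write(body) }
    end
  end

  zip_path
end

# In the controller, prompt the browser download:
# send_file build_zip(docs, keys), type: "application/zip", filename: "bundle.zip"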