Paperclip: attach already uploaded files - ruby-on-rails

I'm wondering whether it's possible to attach already uploaded files to new ActiveRecord instances. For example, I have a lot of pictures uploaded and want to choose one of them. Is this even possible using Paperclip? I know CKEditor can handle it somehow when used with ActiveAdmin.

You can assign a File object opened from a local path accessible to the server; this triggers Paperclip's storage and processing just as a fresh upload would.
User.attribute_that_is_attachment = File.new('/local/path/to/file.txt')
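For instance, a minimal sketch assuming a User model with a Paperclip attachment named avatar and a previously uploaded file still present at a known local path (both names are hypothetical):

user = User.find(1)
# Reopen the already-stored file in binary mode and assign it;
# Paperclip runs its processors and storage as if it were a new upload
user.avatar = File.new('/local/path/to/file.png', 'rb')
user.save!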

Related

Upload files directly to S3 without using a form in Rails

For my Rails application, I download a bunch of files from a remote URL to my application. I would like to directly upload them to Amazon S3, without needing a form to do the upload, since I will temporarily cache the file I downloaded on the EC2 instance.
I would also like to retain the links to the files I uploaded so I can download them later.
I am essentially reposting the files I downloaded.
I looked around, but most of the solutions seem to involve a user uploading through a form to S3.
Is there a direct upload solution?
You can upload directly to S3 using the AWS SDK for Ruby. The easiest way is:
require 'aws-sdk'

# Build an S3 resource handle in the bucket's region,
# then upload straight from a local file path
s3 = Aws::S3::Resource.new(region: 'us-west-2')
obj = s3.bucket('bucket-name').object('key')
obj.upload_file('/path/to/source/file')
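Since the question also asks about retaining links for later download, the object's URL can be read back after the upload; a hedged note (public_url only works for publicly readable objects):

url = obj.public_url                              # stable URL for public objects
url = obj.presigned_url(:get, expires_in: 3600)   # time-limited link for private objects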
Or you can find a couple of other options here.
You can also use EvaporateJS to achieve this, and send an AJAX request to record the file name in the database after each upload completes. Although the JavaScript exposes a few details, your bucket is not open to abuse, because the S3 service enforces a bucket policy and CORS configuration.
Just change <AllowedOrigin>*</AllowedOrigin> to <AllowedOrigin>specificwebsite.com</AllowedOrigin> in production.
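The same CORS restriction can also be applied programmatically with the AWS SDK for Ruby; a hedged sketch, with the bucket name and origin as placeholders:

require 'aws-sdk'

s3 = Aws::S3::Client.new(region: 'us-west-2')
# Allow browser uploads from one origin only, instead of *
s3.put_bucket_cors(
  bucket: 'bucket-name',
  cors_configuration: {
    cors_rules: [{
      allowed_origins: ['https://specificwebsite.com'],
      allowed_methods: ['PUT', 'POST'],
      allowed_headers: ['*']
    }]
  }
)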

Rails s3_direct_upload without file field

My website generates a file in javascript (audio recording) and I then want it to be uploaded to Amazon S3.
I first managed to get the uploading part working by sending the generated file to my server, where it is uploaded. However I would like now to upload the file directly to S3, without going through my server.
So I started to use the s3_direct_upload gem, which works great when using a file_field. However, my file is generated by the JavaScript, and:
- The value of a file field has to be set by the user, for security reasons
- I do not want the user to have to interact with the upload
I tried to play with the S3Uploader class and add the data directly, without any success so far; it seems I am not using the correct method.
Does anyone have any idea how to achieve an S3 direct upload without a file field?
Thanks
Never mind, I found out that the S3Uploader class used by the s3_direct_upload gem has the same methods as jQuery-File-Upload, from which it is derived.
So one can call:
$("#s3_uploader").fileupload('send', {files: [f]});
and the File object f will be uploaded directly to S3.

Custom filepath on server parse.com

I'm working with parse.com for my back end. I'm wondering if there's a way for files to be saved into subfolders. For example, my file is currently saved with a URL like this:
http://files.parsetfss.com/bb2767e6-fc18-4ff5-a071-199803c9aac2/tfss-d056e28e-1e02-49dd-930b-e46790a2e38d-Drums.png
Is there a way I can get it to look like this instead:
http://files.parsetfss.com/bb2767e6-fc18-4ff5-a071-199803c9aac2/tfss-d056e28e-1e02-49dd-930b-e46790a2e38d/Drums.png
and have the same identifier (tfss-d056e28e-1e02-49dd-930b-e46790a2e38d) apply to each row?
The reason I need this is that I'm actually uploading HTML files, and they can't find their assets if the assets get renamed...
Have a look at the Cloud Hosting documentation here:
https://parse.com/docs/hosting_guide
Basically, whatever files and folders you put in the "public" folder will be publicly available.
You can use it for files you want to share normally, instead of the approach you described in your question, which is for files you attach to objects.

Configuring Multiple store directories - Carrierwave s3 upload

I have an application with posts to which we upload photos. I have implemented an S3 upload module using CarrierWave and Fog integration, which works. But when images are uploaded along with their versions, the original file gets stored in the same directory.
Is there any way to configure a separate folder inside my bucket so that original images are stored in one place and the resized versions in another?
I also searched and learned that operating with multiple buckets is not yet possible with CarrierWave.
Kindly direct me on this. Thanks in advance.
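One common approach, sketched here as a hedged example (the uploader name and paths are hypothetical): CarrierWave implements versions as subclasses of the uploader, so store_dir can be overridden inside a version block to send resized copies to a different folder in the same bucket.

class PhotoUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick
  storage :fog

  # Originals live under uploads/originals/...
  def store_dir
    "uploads/originals/#{model.id}"
  end

  version :thumb do
    process resize_to_fit: [200, 200]

    # Versions are subclasses of the uploader, so overriding
    # store_dir here affects only the :thumb files
    def store_dir
      "uploads/versions/#{model.id}"
    end
  end
end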

Where is the best place to save images from user uploads

I have a website that shows galleries. Users can add their own content from the web (by entering a URL) or by uploading a picture from their computer.
I am storing the URL in the database, which works fine for the first use case, but I need to figure out where to store the actual images when a user uploads from their computer.
Is there any recommendation here or best practice on where I should store these?
Should I save them in the appdata or content folders? Should they not be stored with the website at all because it's user content?
You should NOT store user uploads anywhere they can be directly accessed by a known URL within your site structure. This is a security risk: users could upload .htm and .js files, and even a file with the correct extension can contain malicious code that executes in the context of your site for an authenticated user, allowing server-side or client-side attacks.
See for example http://www.acunetix.com/websitesecurity/upload-forms-threat.htm and What security issues appear when users can upload their own files?, which mention some of the issues you need to be aware of before you allow users to upload files and then present them for download within your site.
- Don't put the files within your normal web site directory structure.
- Don't use the original file name the user gave you. You can add a Content-Disposition header with the original file name so they can download it under the same name, but the path and file name on the server shouldn't be something the user can influence (see the sketch after this list).
- Don't trust image files: resize them and offer only the resized version for subsequent download.
- Don't trust MIME types or file extensions; open the file and manipulate it to make sure it's what it claims to be.
- Limit the upload size and time.
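A minimal sketch of the random-name-plus-Content-Disposition idea, written here in Rails like the rest of this page; the Upload model, its stored_name/original_name columns, and the storage path are all hypothetical:

class UploadsController < ApplicationController
  STORAGE_ROOT = '/srv/uploads'.freeze  # outside the web root

  def create
    file = params.require(:file)
    stored_name = SecureRandom.uuid     # user input never influences the path
    File.open(File.join(STORAGE_ROOT, stored_name), 'wb') { |f| f.write(file.read) }
    Upload.create!(stored_name: stored_name, original_name: file.original_filename)
    head :created
  end

  def show
    upload = Upload.find(params[:id])
    # Content-Disposition restores the original name on download
    # without letting it touch the server-side path
    send_file File.join(STORAGE_ROOT, upload.stored_name),
              filename: upload.original_name,
              disposition: 'attachment'
  end
end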
If you have the resources to implement something like this, it is extremely beneficial to store all this stuff in Amazon S3.
Once you receive the upload, you simply push it over to Amazon and put the URL in your database, as you're doing with the other images. As mentioned above, it would be wise to open the image and resize it before sending it over; this both checks that it really is an image and makes sure you don't accidentally serve a full camera-resolution image to an end user.
Doing this now will make it much, much easier if you ever have to migrate/failover your site and don't want to sync gigabytes of image assets.
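A hedged sketch of that resize-then-upload flow, assuming the mini_magick and aws-sdk gems; the bucket name, region and paths are placeholders:

require 'mini_magick'
require 'aws-sdk'
require 'securerandom'

image = MiniMagick::Image.open('/tmp/upload.jpg')
image.resize '1600x1600>'   # shells out to ImageMagick; raises if the file isn't a real image
image.write '/tmp/upload-resized.jpg'

s3 = Aws::S3::Resource.new(region: 'us-west-2')
obj = s3.bucket('bucket-name').object("images/#{SecureRandom.uuid}.jpg")
obj.upload_file('/tmp/upload-resized.jpg')
# keep obj.public_url (or the object key) in the database alongside the other image URLs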
One way is to store the image in a database table with a varbinary field.
Another way would be to store the image in the App_Data folder, and create a subfolder for each user (~/App_Data/[userid]/myImage.png).
For both approaches you'd need to create a separate action method that makes it possible to access the images.
When accepting image uploads you need to verify the content of the file before storing it; checking the file extension alone is not trustworthy.
An easy way is the magic-number method: compare the file's leading bytes against known signatures.
See the Stack Overflow post and the list of magic numbers.
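The magic-number idea is language-agnostic; a minimal sketch in Ruby, covering only the PNG and JPEG signatures, so extend the table as needed:

MAGIC_NUMBERS = {
  'image/png'  => "\x89PNG\r\n\x1A\n".b,  # 8-byte PNG signature
  'image/jpeg' => "\xFF\xD8\xFF".b        # JPEG start-of-image marker
}.freeze

def detect_type(path)
  header = File.binread(path, 8)          # only the leading bytes are needed
  match = MAGIC_NUMBERS.find { |_type, magic| header.start_with?(magic) }
  match && match.first
end

detect_type('upload.png')  # => "image/png", regardless of the file extension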
One way of saving the file is converting it to binary and saving it in the database; another is using the App_Data folder.
Which storage option fits depends on your requirements. See this post as well.
Set an upload limit by adding the maxRequestLength property to Web.config like this, where the file size is specified in KB:
<httpRuntime maxRequestLength="51200" executionTimeout="3600" />
You can store your trusted data alongside the htdocs/www folder (not inside it) so that users cannot access it directly. You can also add .htaccess authentication on your trusted data (for .htaccess, keep your .htpasswd file alongside, not inside, htdocs/www) if you are using Apache.
