I am having a problem integrating Paperclip with a non-US S3 server. Paperclip seems to assume that the S3 server is in the US and returns a URL like http://s3.amazonaws.com/path/to/my/file.
My question is: how do I change it to point to a non-US S3 server (Singapore, for example)? The files are uploading; I just need Paperclip to return a correct path.
Using:
paperclip-2.4.5
aws-s3-0.6.2
Tian Wei, check this out:
http://techspry.com/ruby_and_rails/amazons-s3-european-buckets-and-paperclip-in-rails-3/
But a better answer might be to just use a US bucket and avoid the hassle.
I'm wondering how to handle, in a Rails API, the upload of a "big" file (around 100 MB). Is it possible to stream the Rack tempfile to S3 during the upload?
I'm creating a new service that just needs to receive POST requests with parameters and files, ask the original app whether everything is OK, and then process them. Lots of blog posts say that Ruby is not the best language for this; switching languages is not a problem.
So I would like to know whether building a Rails API that receives POSTs, returns statuses, and communicates with the other Rails app is a good approach.
Thanks in advance
Yes, it's possible. Amazon has its own AWS SDK gem, which provides the S3 functionality.
s3 = Aws::S3::Client.new
Here's how to get started using it.
This question however is a little off-topic because you show no proof of trial and error.
I'm working on a Ruby on Rails web app. The user can upload files and they get stored to Amazons S3. For file uploads I use the paperclip gem.
How can I encrypt files with AES256 before they get saved? I know S3 has server side encryption, but that doesn't really work for me because I'm opening the site in the mobile app and would like to handle decryption on the client.
I know I can use the paperclip processors or the before_post_process methods but how can I get the file that is being uploaded and change it?
Look at this paperclip recipe on asynchronous upload to S3. You could use that and then change the callback code to:
def upload_to_s3
  self.remote_avatar = encrypt(local_avatar.to_file)
  self.local_avatar = nil
  self.save!
end
where encrypt is your AES-256 function.
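If it helps, here is a minimal sketch of what that encrypt method might look like using Ruby's stdlib OpenSSL (AES-256-CBC assumed; the key constant is a placeholder, and real key management, plus getting the key to your mobile client for decryption, is up to you):

```ruby
require "openssl"

# Placeholder key: in a real app, load this from secure configuration.
KEY = OpenSSL::Cipher.new("aes-256-cbc").random_key

# AES-256-CBC encryption; a random IV is generated per call and
# prepended to the ciphertext so the client can decrypt.
def encrypt(data)
  cipher = OpenSSL::Cipher.new("aes-256-cbc")
  cipher.encrypt
  cipher.key = KEY
  iv = cipher.random_iv
  iv + cipher.update(data) + cipher.final
end

# Mirror-image decryption: split off the 16-byte IV, then decrypt.
def decrypt(blob)
  cipher = OpenSSL::Cipher.new("aes-256-cbc")
  cipher.decrypt
  cipher.key = KEY
  cipher.iv  = blob[0, 16]
  cipher.update(blob[16..-1]) + cipher.final
end
```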
It may be worth looking into this add-on gem for CarrierWave if you are not set on Paperclip; it might save you some time.
I'm storing images in Amazon S3 using Fog and Carrierwave. It returns a url like bucket.s3.amazonaws.com/my_image.jpg.
DNS entries have been set up so that images.mysite.com points to bucket.s3.amazonaws.com.
I want to adjust my views and APIs to use the images.mysite.com/my_image.jpg URL. CarrierWave, however, only spits out the Amazon-based one. Is there a simple way to tell CarrierWave and/or Fog to use a different host than usual for its URLs? If not, how would I modify the uploader to produce it?
Come to find out that, as of June 6th, 2012, Amazon AWS does not support custom SSL certs, which makes this a moot point.
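For what it's worth, before hitting the SSL limitation the host mapping itself could be done with CarrierWave's asset_host setting (named fog_host in older CarrierWave versions), which overrides the bucket.s3.amazonaws.com host in generated URLs. A sketch of the initializer, with placeholder bucket and credentials:

```ruby
# config/initializers/carrierwave.rb
# Placeholder credentials/bucket; adjust to your app.
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider:              "AWS",
    aws_access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
  }
  config.fog_directory = "bucket"
  # Use the CNAME'd host instead of bucket.s3.amazonaws.com
  # (on older CarrierWave versions this option is config.fog_host).
  config.asset_host = "http://images.mysite.com"
end
```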
I read here that Heroku doesn't allow you to store photos on their servers, and that people use the CarrierWave gem with Amazon S3 to store photos. However, I just watched Ryan Bates's CarrierWave RailsCast, and he also mentions that CarrierWave has a remote URL option whereby it will, in his words, "download" the photo from a URL and display it on your site. Does this mean that the photo stays on the remote server and just gets presented by CarrierWave on the Heroku site? I assume CarrierWave isn't somehow attempting to transfer the image at the URL to the new server?
Might be a stupid question but I don't know a lot about servers (or anything :)))
The remote URL option for CarrierWave gives the user a different way of providing the picture to your server: instead of uploading the picture file directly, the user may give a URL where the picture is (say, on a Flickr account, or something). When this is provided to the application using CarrierWave, the picture is downloaded from the third-party location (given by the URL) to the application server -- just as if the user had uploaded it directly -- and then stored to Amazon's S3.
Server side is Rails.
Client side is Flash, users will upload directly to S3
I need a flexible way to generate S3 policy files, base64 encode them, and then distribute the resulting signed policy to the client.
Is there a good library/gem for this, or do I need to roll my own?
I'll be using paperclip to store the file, as per:
http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
I've had a look at:
https://github.com/geemus/fog
https://github.com/jnicklas/carrierwave
https://github.com/marcel/aws-s3
These look like they'll help me get bits done, but I can't tell if they'll help me generate flexible policies.
EDIT: Going to give the "Generate an upload signature..." bit here a shot:
http://www.kiakroas.com/blog/44/
Here is a sample project for how to do this using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
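For reference, generating and signing the policy itself needs only the Ruby stdlib. Here is a sketch using the legacy signature-v2 scheme that S3's browser-based POST uploads expected at the time; the secret key, bucket name, key prefix, and size limit are all placeholders:

```ruby
require "base64"
require "json"
require "openssl"
require "time"

# Placeholder secret; in a real app, load from secure configuration.
AWS_SECRET_KEY = "secret"

# Build a base64-encoded S3 POST policy and its HMAC-SHA1 signature
# (the pair the Flash client submits alongside the file).
def s3_upload_policy(bucket:, key_prefix:, expires_in: 3600)
  policy = {
    "expiration" => (Time.now.utc + expires_in).iso8601,
    "conditions" => [
      { "bucket" => bucket },
      ["starts-with", "$key", key_prefix],
      { "acl" => "private" },
      ["content-length-range", 0, 10 * 1024 * 1024] # placeholder cap
    ]
  }
  encoded   = Base64.strict_encode64(JSON.generate(policy))
  signature = Base64.strict_encode64(
    OpenSSL::HMAC.digest("sha1", AWS_SECRET_KEY, encoded)
  )
  { policy: encoded, signature: signature }
end
```

Returning that hash as JSON from a controller action gives the Flash client everything it needs for the form fields of the POST to S3.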