I have a Rails application and use the fog gem to upload files to cloud storage (Rackspace). So far I have been successfully uploading local files to cloud storage.
@service = Fog::Storage.new(options)
directory = @service.directories.new :key => 'test'
directory.files.create :key => path, :body => file, :content_type => content_type
I have a new requirement now. I want to be able to take a remote link (public URL) and have the file uploaded to cloud storage. Is there a way to achieve this without downloading it locally or loading the whole thing in memory?
I'm looking for something like this:
directory.files.create :key => path, :body => 'url-to-remote-file', :content_type => content_type
A stream-based approach would also be quite helpful.
Thanks.
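(An illustrative aside, not from the original thread: fog will stream any :body that responds to #read, so one pragmatic sketch hands it an IO from open-uri instead of a fully loaded string. Note that open-uri spools large responses to a tempfile, so this avoids holding the whole file in memory, though not necessarily a temporary copy on disk. The URL below is a placeholder.)
require 'open-uri'
# Hypothetical sketch: stream a remote file straight into cloud storage via fog.
open('http://example.com/url-to-remote-file') do |remote|
  directory.files.create(
    :key          => path,
    :body         => remote,  # an IO; fog reads it in chunks rather than all at once
    :content_type => remote.content_type
  )
end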
I'm using Paperclip to attach avatars to User profiles in my Rails application. I followed the instructions on the Paperclip GitHub page to initialize it and attach it to my app.
I have an image at public/images/medium/missing.png, and in both cases (when I upload or when I fall back on the default) I get no image. I've checked my directory and there is an image where it says it is looking, but it does not grab it. Additionally, when I have tried uploading images, I know the image is uploaded correctly because calling the User in the Rails console shows all the information properly attached.
I am calling the image in my view like:
<%= image_tag(@blog.user.avatar.url(":medium"), :class => "image-circle avatar") %>
my Paperclip declaration in the User model looks like the following:
has_attached_file :avatar,
  :styles => { :medium => "300x300>", :thumb => "100x100#" },
  :default_url => "/public/images/:style/missing.png"
validates_attachment_content_type :avatar, :content_type => /\Aimage\/.*\Z/
Really not sure what is going on. The route errors that appear when I inspect the improperly loaded image point directly to the image on my local server, and the fact that it can't grab either the missing image or the uploaded file also has me at a loss. Any help would be super appreciated!
And for good measure the output when I examine a user with an uploaded avatar:
avatar_file_name: "11390219_10206114805925816_6595348111261743639_n.j...", avatar_content_type: "image/jpeg", avatar_file_size: 101926, avatar_updated_at: "2015-07-10 18:51:44">
Thanks in advance!
EDIT
This is the URL that is providing the 404 error:
http://localhost:3000/images/medium/missing.png
while in my local directory it lives at "root/public/images/medium/missing.png".
Not sure how it's not grabbing it, unless I am just missing something really obvious somewhere. (I tried hard-coding the public in there as well, but to no avail.)
EDIT
There is the possibility that you're simply not serving the static assets; add:
config.serve_static_assets = true
to your development.rb
ORIGINAL POST
In your application.rb (or an environment-specific file), you should have a config.paperclip_defaults = { ... }. Here is the link in the docs: https://github.com/thoughtbot/paperclip#defaults
Here is an example one, using fog:
config.paperclip_defaults = {
  :storage => :fog,
  :fog_credentials => {
    :provider => "Local",
    :local_root => "#{Rails.root}/public"
  },
  :fog_directory => "",
  :fog_host => "localhost:3000"
}
Do you have something like that in your application? I just tested on an app of mine, and I was able to upload an image, but not to see it, without the paperclip_defaults hash. Also, don't forget to restart your app after you update the config files. I hope this helps!
Have you tried killing the quotes around the image style? Around :medium?
<%= image_tag(@blog.user.avatar.url(:medium), :class => "image-circle avatar") %>
As shown here in the Paperclip docs:
https://github.com/thoughtbot/paperclip#show-view
Same problem; I solved it by reinstalling ImageMagick with brew.
Details: in case you're using ImageMagick, try updating it, and if it asks you to link it, try this:
brew link --overwrite imagemagick
It worked for me. Hope it's helpful.
I'm writing a Rails 3 app that uses Paperclip to transcode a video file attachment into a bunch of other formats, and then to store the resulting files. It all works fine for local storage, but I am trying to make it work using Paperclip's Fog support to store files in a bucket on our own Ceph cluster. However, I can't seem to find the right configuration options to make Fog talk to my Ceph server.
Here is a snippet from my Rails class:
has_attached_file :videofile,
  :storage => :fog,
  :fog_credentials => { :aws_access_key_id => 'xxx', :aws_secret_access_key => 'xxx', :provider => 'AWS' },
  :fog_public => true,
  :url => ":id/:filename",
  :fog_directory => 'replay',
  :fog_host => 'my-hostname'
Writes using this setup fail because Paperclip attempts to save to Amazon S3 rather than the host I've provided. I have a non-Rails / non-Paperclip toy script working just fine:
conn = Fog::Storage.new({
  :aws_access_key_id => 'xxx',
  :aws_secret_access_key => 'xxx',
  :host => 'my-hostname',
  :path_style => true,
  :provider => "AWS",
})
This correctly connects to my local Ceph server. So I suspect there is something I'm not configuring in Paperclip properly - but what?
Here's the relevant hunk from fog.rb that I think is causing the connection to only go to AWS:
def host_name_for_directory
  if @options[:fog_directory].to_s =~ Fog::AWS_BUCKET_SUBDOMAIN_RESTRICTON_REGEX
    "#{@options[:fog_directory]}.s3.amazonaws.com"
  else
    "s3.amazonaws.com/#{@options[:fog_directory]}"
  end
end
The error was just from an improperly configured Ceph cluster. For anyone who finds this thread, Paperclip will work out of the box as long as you:
1. have your wildcard DNS set up properly for your Ceph frontend;
2. have Ceph configured to recognize it as such;
3. pass :host in :fog_credentials, which should be the FQDN of the Ceph frontend;
4. set :fog_host, which apparently needs to be the URL for your bucket, e.g. https://bucket.ceph-server.foobar.com.
I don't think it is documented anywhere that you can use :host, but it works; see the sketch below.
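Putting those points together, the resulting Paperclip options might look roughly like this sketch (the hostnames and bucket URL are placeholders, not from the original post):
has_attached_file :videofile,
  :storage => :fog,
  :fog_credentials => {
    :provider              => 'AWS',
    :aws_access_key_id     => 'xxx',
    :aws_secret_access_key => 'xxx',
    :host                  => 'ceph-server.foobar.com',  # FQDN of the Ceph frontend
    :path_style            => true
  },
  :fog_directory => 'replay',
  :fog_host      => 'https://replay.ceph-server.foobar.com'  # URL for your bucket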
Our users have two ways of uploading images: one is through a simple HTML form, and the other is through an iPhone app called Aurigma. We use Paperclip to process the images and store them on S3. Images that are uploaded with Aurigma end up having the wrong content type, which causes them to open as an application.
I tried two solutions:
before_save :set_content_type

def set_content_type
  self.image.instance_write(:content_type, "image/png")
end
And:
before_post_process :set_content_type

def set_content_type
  self.image.instance_write(:content_type, MIME::Types.type_for(self.image_file_name).to_s)
end
It seems as if both solutions are ignored.
I'm using Paperclip version 3.0.2 and Aurigma version 1.3, and I'm uploading a screenshot from my iPhone. This is my Paperclip configuration:
has_attached_file :image, {
  :convert_options => { :all => '-auto-orient' },
  :styles => {
    :iphone3 => "150x150",
    :web => "300x300"
  },
  :storage => :s3,
  :bucket => ENV['S3_BUCKET'],
  :s3_credentials => {
    :access_key_id => ENV['S3_KEY'],
    :secret_access_key => ENV['S3_SECRET']
  },
  :path => "/pictures/:id/:style.:extension",
  :url => "/pictures/:id/:style.:extension"
}
I just answered a similar question.
You need to do a copy to itself or use a pre-signed URL with the content type specified in the query string.
Using the AWS SDK for Ruby and url_for:
object = bucket.objects['myobject']
url = object.url_for(:read, :response_content_type => "image/png")
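For the copy-to-itself option, here is a rough sketch with the same v1 SDK. Hedge: the option names (:content_type, :metadata_directive) are assumptions to verify against your SDK version, and the key is a placeholder.
# Hypothetical sketch: fix an object's content type in place by copying it onto itself.
object = bucket.objects['pictures/1/original.png']  # placeholder key
object.copy_to(object.key,
               :content_type       => 'image/png',
               :metadata_directive => 'REPLACE')  # replace the stored metadata instead of copying it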
As far as I understand, first you upload all the files from client devices to your own server (through Aurigma Up), and then these files are uploaded to Amazon S3. I had a similar problem trying to change the content type on the client device. This is not possible. You should send the files to your server and then change the content type before uploading them to S3.
Well, my problem is that I'm using send_data in my Rails 3 application to send the user a file from AWS S3 with something like:
Base.establish_connection!( :access_key_id => 'my_key', :secret_access_key => 'my_super_secret_key')
s3File = S3Object.find dir+filename, "my_unique_bucket"
send_data(open(s3File.url).read, :filename => filename, :disposition => 'attachment')
but it seems like the browser is buffering the file: the buffering takes as long as the file size would suggest, and only after it finishes does the download dialog appear, with the download itself taking no time. What I need is for the user to view the download process as normal; as it is, they won't know what's happening, with only the loader on the browser's tab:
They'd rather see a download progress, I guess, to figure out there's something happening there.
Is there any way I can do this with send_data?
It's not the browser that's buffering/delaying, it's your Ruby server code.
You're downloading the entire file from S3 before sending it back to the user as an attachment.
It may be better to serve this content to your user directly from S3 using a redirect. Here's a link on building temporary access URLs that allow a download with a given token for a short period of time:
http://docs.amazonwebservices.com/AmazonS3/latest/dev/S3_QSAuth.html
Base.establish_connection!( :access_key_id => 'my_key', :secret_access_key => 'my_super_secret_key')
s3File = S3Object.find dir+filename, "my_unique_bucket"
redirect_to s3File.url(:expires_in => 30)
Set Your Content Disposition
You'll need to set the content disposition of the S3 URL for it to download instead of opening up in the browser. Here is my basic implementation:
Think of attachment as your s3file.
In your attachment.rb
def download_url
  # `s3_key` is a hypothetical placeholder for wherever this attachment's key in the bucket is stored
  s3_object = AWS::S3.new.buckets['bucket_name'].objects[s3_key]
  s3_object.url_for(:read,
                    expires_in: 60.minutes,
                    use_ssl: true,
                    response_content_disposition: "attachment; filename='#{file_name}'").to_s
end
In your views
<%= link_to 'Download Avicii by Avicii', attachment.download_url %>
Thanks to guilleva for his guidance.
I keep getting a broken pipe error after uploading an mp3 with Paperclip to S3. What did I do wrong?
Model
has_attached_file :mp3,
  :storage => :s3,
  :path => 'mp3/:class/:id/:style.:extension',
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :bucket => 'cobras-production',
  :url => ':s3_domain_url'
Controller
def create
  @track = Track.new(params[:track])
  if @track.save
    redirect_to(@track, :notice => 'Track was successfully created.')
  else
    render :action => "new"
  end
end
I think there may be an issue with non-US bucket locations.
I have two applications set up to run on Heroku and was running into the issue you mention. When I changed my bucket location to US, the Paperclip lib worked perfectly with the exact same file.
Were you using Singapore or Tokyo as your bucket location?
https://github.com/marcel/aws-s3/issues/#issue/4
This explains the issue better.
In my case it was because I chose a new (as of now) AWS region 'Oregon'.
When I switched back to US Standard for my bucket, I had no problems.
It might be worth pointing out that buckets are not created automatically on demand; you have to create them yourself. If you're using the aws-s3 gem, the command for that is:
AWS::S3::Bucket.create("cobras-production")
For future googlers: I had the same issue; the reason was the wrong time on my computer, which was included in the request. Amazon's servers compared my time with theirs, which caused the error.