What are some possible explanations for image upload with Paperclip working on my local machine but not when deployed to Heroku?
When it's deployed to Heroku, the image won't save.
As far as I know you can't write anything persistent to Heroku's file system (the dyno filesystem is ephemeral), so I am assuming that is your problem. It makes sense to use something like Amazon S3 for image storage. Take a look at this: Amazon S3 in Heroku
Once you have configured your S3 bucket, you want to change Paperclip's has_attached_file to something like this:
has_attached_file :my_picture,
  :styles => { :medium => "275x275>" },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :path => "user/:attachment/:style/:id.:extension"
Where s3.yml is the configuration file in which you define your access keys, bucket, and so on.
It should look something like this:
production:
  access_key_id: [Your Key]
  secret_access_key: [Your Secret]
  bucket: [Your bucket name]
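On Heroku you may prefer to keep the keys out of the repo entirely; :s3_credentials also accepts a hash, so you can read them from config vars instead of the YAML file. A minimal sketch, assuming config vars named S3_KEY and S3_SECRET (set with heroku config:add, as shown further down this page) and a placeholder bucket name:
has_attached_file :my_picture,
  :styles => { :medium => "275x275>" },
  :storage => :s3,
  :s3_credentials => {
    :access_key_id     => ENV['S3_KEY'],          # assumed config var name
    :secret_access_key => ENV['S3_SECRET'],       # assumed config var name
    :bucket            => "my-production-bucket"  # placeholder
  },
  :path => "user/:attachment/:style/:id.:extension"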
Here's another guide/article written by one of Paperclip's developers; it explains in detail how to integrate Paperclip with Heroku and S3.
I am trying to host my website, which has Paperclip attachment images, on AWS S3 with the fog gem. But my fog directory takes the wrong path: it appends my local file system path to it.
This is my code:
class RealEstate < ActiveRecord::Base
  has_attached_file :image,
    :storage => :fog,
    :fog_credentials => "#{Rails.root}/config/s3.yml",
    :fog_directory => "#{Rails.root}/config/fog.yml"
end
If I define the bucket name directly it works, but then I can't use a different bucket per environment:
:fog_directory => "development_bucket_name" # works fine, but can't use a different bucket per environment
This is my fog.yml:
development:
  fog_directory: development_bucket
staging:
  fog_directory: testing_bucket
production:
  fog_directory: production_bucket
The path it creates is:
https://s3.amazonaws.com//home/Desktop//config/fog.yml/real_estate/image/000/000/185/original/4bec7.png?1396429186
Paperclip has no idea that the string you're passing is a path to a config file - it's expecting the actual bucket name.
You need to parse the YAML file yourself and extract the bucket name from it. For example:
directories = YAML.load(File.read(Rails.root.join('config', 'fog.yml')))

has_attached_file :image,
  :storage => :fog,
  :fog_credentials => "#{Rails.root}/config/s3.yml",
  :fog_directory => directories[Rails.env]['fog_directory']
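Put together in the model from the question, that might look like the following (just a sketch; the constant name is only for illustration):
class RealEstate < ActiveRecord::Base
  # Load the per-environment bucket names once, when the class is loaded
  FOG_DIRECTORIES = YAML.load(File.read(Rails.root.join('config', 'fog.yml')))

  has_attached_file :image,
    :storage => :fog,
    :fog_credentials => "#{Rails.root}/config/s3.yml",
    :fog_directory => FOG_DIRECTORIES[Rails.env]['fog_directory']
end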
How is it possible to allow users to upload images on a website, but have the actual uploading done completely on Amazon's servers (so as not to burden your own servers with upload throughput)?
Can someone explain how this is performed?
I.e. when a user wants to upload an image, instead of the file streaming to my server and then from my server to Amazon's S3 service, it bypasses my server altogether and goes straight to Amazon.
You can check out these docs provided by Amazon.
You can implement the process by using a SWF uploader, or this gem.
CarrierWave can be used with CarrierWaveDirect to upload images directly to S3. This will also allow you to process the image in a background job.
However, if you want to completely eliminate both the upload and processing burden from your dynos, check out Cloudinary which is unique in that it does all image processing on their servers as well as providing storage for them.
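Whichever library you pick, the basic idea is the same: your app only signs the upload request, and the browser sends the file bytes straight to S3 using that signature. A rough sketch of generating a signed upload URL server-side, assuming the newer aws-sdk-s3 gem (not the aws-s3 gem mentioned elsewhere on this page) and placeholder bucket/region values:
require 'aws-sdk-s3'
require 'securerandom'

# Credentials and region come from the usual AWS environment variables / config.
s3  = Aws::S3::Resource.new(region: 'us-east-1')                                # placeholder region
obj = s3.bucket('my-upload-bucket').object("uploads/#{SecureRandom.uuid}.jpg")  # placeholder bucket/key

# Hand this URL to the browser; it PUTs the file directly to S3 (sending a
# matching Content-Type header), so the bytes never touch your own dynos.
upload_url = obj.presigned_url(:put, expires_in: 15 * 60, content_type: 'image/jpeg')
Gems like CarrierWaveDirect and the uploader gems linked above wrap this signing step (typically as a signed POST form) so you don't have to build it by hand.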
If you're using Paperclip, can't you just do the following?
Create an s3.yml file in config:
development:
  bucket: bucket-dev
  access_key_id: xxx
  secret_access_key: xxx
test:
  bucket: bucket-test
  access_key_id: xxx
  secret_access_key: xxx
production:
  bucket: bucket-pro
  access_key_id: xxx
  secret_access_key: xxx
#paperclip
has_attached_file :photo,
  :styles => {
    :thumb => "100x100#",
    :small => "400x400>"
  },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :path => "/:style/:id/:filename"
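For what it's worth, when you pass a path like "#{RAILS_ROOT}/config/s3.yml" to :s3_credentials, Paperclip loads the file and, if it finds a top-level key matching the current environment, uses just that section. That is what lets a single s3.yml drive a different bucket per environment. Roughly (a sketch of the behaviour, not Paperclip's actual source):
creds = YAML.load_file("#{Rails.root}/config/s3.yml")
creds = creds[Rails.env] || creds  # fall back to the whole hash if there is no per-env section
# creds now holds the bucket / access_key_id / secret_access_key for this environment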
Can't find a way to get S3 to work with Spree. There seem to be a few gems for that, but they don't seem to work for me.
Running Rails 3.1.1 with Spree 0.70.3.
I am running Rails 3.0.10 and Spree 0.60 and was able to get Spree to use S3 storage instead of writing to the public folder of the app by doing the following; the process should be similar for your versions.
Add the aws-s3 gem to your Gemfile:
gem 'aws-s3'
Run bundle install, and after doing that create a YAML file in the config directory called s3.yml. It should look something like this:
development: &DEFAULTS
  bucket: "YOUR_BUCKET"
  access_key_id: "YOUR_ACCESS_KEY"
  secret_access_key: "YOUR_ACCESS_SECRET"
test:
  <<: *DEFAULTS
  bucket: "YOUR_BUCKET"
production:
  <<: *DEFAULTS
  bucket: "YOUR_BUCKET"
You can specify individual credentials per environment if you like, but since mine all use the same S3 account I opted to set defaults.
After that you are going to have to override the Image model, or make a decorator for it, which tells Paperclip to use S3 and parses the YAML file you created for credentials.
The part you want to override is this:
has_attached_file :attachment,
  :styles => { :mini => '48x48>', :small => '200x100>', :product => '240x240>', :large => '600x600>' },
  :default_style => :small,
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :url => "/assets/products/:id/:style/:basename.:extension",
  :path => ":rails_root/public/assets/products/:id/:style/:basename.:extension"
You can change these properties as needed, but what's important is that you specify :storage and :s3_credentials.
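If you go the decorator route instead of copying the whole model, a sketch could look like the one below. The file name, location, and how Spree loads it vary between Spree versions, so treat those as assumptions; the :path here also drops the :rails_root/public prefix, since with :storage => :s3 the path is a key inside the bucket rather than a location on disk.
# app/models/image_decorator.rb -- file name/location and how it gets loaded
# depend on your Spree version (assumption)
Image.class_eval do
  has_attached_file :attachment,
    :styles => { :mini => '48x48>', :small => '200x100>', :product => '240x240>', :large => '600x600>' },
    :default_style => :small,
    :storage => :s3,
    :s3_credentials => "#{Rails.root}/config/s3.yml",
    :path => "assets/products/:id/:style/:basename.:extension"
end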
In the current version of Spree, you can set these values in the admin tools. But if you prefer to maintain them in code without overriding the Image model, you can set them in config/initializers/spree.rb. If you do, make sure not to also edit them via the admin portal.
S3_CONFIG = YAML.load_file("#{Rails.root}/config/s3.yml")[Rails.env]
Spree.config do |config|
  config.attachment_styles = ActiveSupport::JSON.encode({
    "mini"    => "100x100>",
    "small"   => "200x200>",
    "medium"  => "400x600>",
    "product" => "400x600>",
    "large"   => "600x600>",
    "xl"      => "800x800>",
    "xxl"     => "1200x1200>",
  })

  # AWS S3
  config.use_s3 = true
  config.s3_bucket = S3_CONFIG['bucket']
  config.s3_access_key = S3_CONFIG['access_key_id']
  config.s3_secret = S3_CONFIG['secret_access_key']
  config.attachment_url = 'products/:id/:style/:basename.:extension'
  config.attachment_path = 'products/:id/:style/:basename.:extension'
end
You can also try the BitNami Spree AMIs at http://bitnami.org/stack/spree. Regards.
I'm pretty new to RoR. I've recently deployed an app on Heroku and have tried to add an attachment feature to the app via Paperclip.
I've followed all the steps in adding aws-s3 to my app. Here was my initial code:
user.rb (model)
has_attached_file :avatar,
  :styles => { :small => "70x70>" },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :path => ":attachment/:id/:style/:basename.:extension"

validates_attachment_size :avatar, :less_than => 1.megabytes
validates_attachment_content_type :avatar, :content_type => ['image/jpeg', 'image/png']
s3.yml (file is located in the config folder); note: all of these buckets exist on my AWS S3 account
development:
  bucket: my_avatar-dev
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
test:
  bucket: myapp_avatar-test
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
production:
  bucket: myapp_avatar-pro
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
Gemfile
gem 'aws-s3'
When running this configuration, I would get a 500 error page when loading my app. Running the Heroku logs showed the following error: AWS::S3::MissingAccessKey (You did not provide both required access keys.
So I followed some advice and defined the key and secret key as environment variables on Heroku, using the following command:
heroku config:add S3_KEY=amazonaccesskey S3_SECRET=amazon_secret_key
I then added an initializer that checks the environment and loads the credentials either from those ENV keys or from the .yml file; the code is as follows:
initializers/s3.rb
if Rails.env == "production"
  # set credentials from the ENV hash
  S3_CREDENTIALS = { :access_key_id => ENV['S3_KEY'], :secret_access_key => ENV['S3_SECRET'], :bucket => "myapp_avatar-pro" }
else
  # get credentials from the YML file
  S3_CREDENTIALS = Rails.root.join("config/s3.yml")
end
The user.rb model was then updated to the following:
has_attached_file :avatar, :storage => :s3, :s3_credentials => S3_CREDENTIALS
I then deployed to Heroku and tested the app, but I still keep getting the same error (500 page) and error message: AWS::S3::MissingAccessKey (You did not provide both required access keys.
How is this possible if I have defined the variables on Heroku? Is there something I am missing? Is it possible it's something with the gem? Also, I'm using HAML for styling; not sure that matters at all, but just in case it does. I'm quite lost, so any help would be greatly appreciated. Thank you so much!
Having just worked through the same problem and trawled a number of similar posts, I found that all of the configurations in the above answer (declaring the hashes in the model, using the .yml, or using the initializer) work fine from my dev machine and on Heroku, as long as the S3 bucket is of the US Standard type. The choice is just about how DRY you want to be.
When I originally set S3 up, I used a European bucket. This gave me the spurious error message:
AWS::S3::MissingAccessKey (You did not provide both required access keys.
I note from the AWS site (http://docs.amazonwebservices.com/general/latest/gr/index.html?rande.html) that AWS uses a specific endpoint address for each region to reduce latency, and I am guessing (because I am a novice coder) that US Standard is either the default or hard-coded into the aws-s3 gem. (Maybe someone can edit this up into a more complete answer?)
I solved this problem by passing the credentials as a hash directly:
:s3_credentials => {
  :access_key_id => 'mykey',
  :secret_access_key => 'mykey'
}
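If you'd rather not hardcode the keys in the model, the same hash can read the Heroku config vars from the question above (S3_KEY / S3_SECRET), which is essentially what the production branch of the initializer already does:
:s3_credentials => {
  :access_key_id => ENV['S3_KEY'],
  :secret_access_key => ENV['S3_SECRET']
}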
First off, thanks so much for taking the time to read my question! It appears that I'm having trouble implementing AWS S3 on my web app. I have a RoR app deployed on Heroku and I'd like to allow users to upload a profile picture to their profile.
I've installed ImageMagick and Paperclip to handle the attachments. Then someone informed me that Heroku does not keep uploaded files and that I'll need to use AWS S3. That made sense. So I signed up for AWS and added the following code to my project:
user.rb (model)
has_attached_file :avatar,
  :styles => { :small => "70x70>" },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :path => ":attachment/:id/:style/:basename.:extension"

validates_attachment_size :avatar, :less_than => 1.megabytes
validates_attachment_content_type :avatar, :content_type => ['image/jpeg', 'image/png']
s3.yml (file is located in the config folder); note: all of these buckets exist on my AWS S3 account
development:
  bucket: my_avatar-dev
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
test:
  bucket: myapp_avatar-test
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
production:
  bucket: myapp_avatar-pro
  access_key_id: amazonaccesskey
  secret_access_key: amazon_secret_access_key
Gemfile
gem 'aws-s3'
When I run the application in the development environment (localhost) everything functions properly. I checked AWS and my uploads appear there; however, when I deploy my app to Heroku it breaks. To elaborate, the app loads the sign-in screen, but as soon as a user signs in, the application breaks and redirects to the error 500 page: "We're sorry, but something went wrong. We've been notified about this issue and we'll take a look at it shortly."
When I remove the following code (and replace :path with some other value) and re-deploy, the app loads, but obviously it no longer stores uploads on AWS S3.
:storage => :s3,
:s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
:path => ":attachment/:id/:style/:basename.:extension"
As I mentioned, I'm pretty new to rails... so I'm not sure what I'm doing wrong. Was I supposed to link s3.yml somewhere else, a route or something? Maybe it's something that I need to do while deploying? I'd like to thank anyone who can help me, and I am thankful for your time!