Store files to Amazon S3 in Rails? - ruby-on-rails

I have a working Heroku app. But since Heroku doesn't provide persistent file storage, I'd like to use Amazon S3.
I found the Heroku tutorial https://devcenter.heroku.com/articles/s3, but it seems confusing to me and maybe a little complicated.
Right now I use the carrierwave gem for storing files.
So maybe you can give me a small and simple example of the code you use to store files to Amazon's file storage?
UPDATE:
I found this code. Is it really this simple: just a few lines of CarrierWave config, add a gem called fog, and that's all? Or will I need something else?
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => "YOUR AMAZON ACCESS KEY",
    :aws_secret_access_key => "YOUR AMAZON SECRET KEY",
    :region                => 'us-west-1' # Change this for a different AWS region. Default is 'us-east-1'.
  }
  config.fog_directory = "bucket-name"
end
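That is most of it, but not quite all: the initializer above only supplies the credentials and bucket. You also need the fog gem in your Gemfile and storage :fog in the uploader itself, otherwise CarrierWave keeps writing to local disk. A minimal sketch (the uploader name AvatarUploader is just an example):

# Gemfile
gem 'carrierwave'
gem 'fog'

# app/uploaders/avatar_uploader.rb (hypothetical uploader name)
class AvatarUploader < CarrierWave::Uploader::Base
  storage :fog  # send files to S3 via fog instead of the default :file storage

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end

With the uploader mounted on a model (mount_uploader :avatar, AvatarUploader), saving a record uploads the file to the configured bucket.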

Related

Upload files to Amazon S3 in Rails with carrierwave: "hostname does not match the server certificate" error

I followed this tutorial:
http://lifesforlearning.com/uploading-images-with-carrierwave-to-s3-on-rails/
I had a working CarrierWave uploader that was storing files to disk.
What I did, step by step:
1) Added the fog gem, then ran bundle install and bundle update.
2) In config/initializers I created an r3.rb file with this:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'mykey',
    :aws_secret_access_key => 'mysecretkey',
    :region                => 'us-west-2' # Change this for a different AWS region.
  }
  config.fog_directory = "bucket-main"
end
I ran rails s and tried to save a photo, but my bucket stayed empty, so the files must still be going to my disk.
What do I do now?
Update: I changed storage to fog.
Here is my PhotoUploader class code:
# encoding: utf-8
class PhotoUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
And now I get this error:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not
match the server certificate (OpenSSL::SSL::SSLError)
I eventually solved my problem by running
bundle update fog
and
bundle update carrierwave
Try adding :path_style to your fog_credentials and setting the fog_directory:
config.fog_credentials = {
  ...
  :path_style => true
}
config.fog_directory = 'bucket-main'
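With :path_style => true, fog builds path-style URLs such as https://s3.amazonaws.com/bucket-main/key instead of virtual-hosted-style URLs such as https://bucket-main.s3.amazonaws.com/key, so the hostname always matches Amazon's certificate regardless of the bucket name.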
I just spent a few hours tracking down the cause of this error, which I was also getting:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not match the server certificate (OpenSSL::SSL::SSLError)
The odd thing is how the bucket name is repeated twice in the hostname. It turned out I had configured the wrong region name. Notice in your config.fog_credentials you have
:region => 'us-west-2'
...but the hostname in the exception has s3-us-west-1? If your bucket is in one AWS region, but you configure a different region in your Fog credentials, Fog will try to follow a redirect from AWS, and somehow the bucket name gets doubled up in this situation. Fog produces a warning about the redirect, but Carrierwave ends up hiding this from you.
Set :region in your Fog credentials to where the bucket actually is in AWS, and the "does not match the server certificate" exception will stop happening.
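If you're not sure which region a bucket actually lives in, you can ask S3 directly. A quick check from a Rails console, using the fog gem and the credentials from the initializer above (a sketch, not part of the original answer):

require 'fog'

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'mykey',
  :aws_secret_access_key => 'mysecretkey'
)

# GetBucketLocation returns the region constraint; an empty value means us-east-1
puts storage.get_bucket_location('bucket-main').body['LocationConstraint']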

How to make a file uploaded to S3 private

I have a Rails app in which employers can upload files for a freelancer to work on. I am using Amazon S3 to store the files. The problem is that Amazon S3 assigns the file a URL that anyone who has it can use to access the file. Employers will often upload private files that only the freelancer should be able to see. How do I make it so that when an employer uploads a file, only the freelancer can see it?
Here is the file uploader code:
CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS'],
    :aws_secret_access_key => ENV['AWS_SECRET']
  }
  config.fog_directory = ENV['S_BUCKET']
end
Use the config.fog_public = false option to make the files private, and fog_authenticated_url_expiration (time in seconds) to add a TTL to each file's URL. See the fog storage module for more info: https://github.com/carrierwaveuploader/carrierwave/blob/master/lib/carrierwave/storage/fog.rb
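A minimal sketch of that configuration, extending the initializer above (the expiration value is illustrative):

CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS'],
    :aws_secret_access_key => ENV['AWS_SECRET']
  }
  config.fog_directory = ENV['S_BUCKET']
  config.fog_public    = false # upload objects with a private ACL
  config.fog_authenticated_url_expiration = 600 # signed URLs valid for 10 minutes
end

With fog_public set to false, calling url on the mounted uploader returns a signed, expiring URL instead of a permanent public one, so only users your app chooses to show the link to (e.g. the freelancer) can fetch the file, and only until the URL expires.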

403 Error When Uploading Image on Heroku to S3 with Carrierwave and Fog

Everything works as expected locally, but once I push to Heroku I can no longer upload images.
The error I get from the heroku logs is:
Excon::Errors::Forbidden (Expected(200) <=> Actual(403 Forbidden)
The XML response contains: <Code>AccessDenied</Code><Message>Access Denied</Message>
My fog.rb:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV["ACCESS_KEY_ID"],
    :aws_secret_access_key => ENV["SECRET_ACCESS_KEY"]
    # :region => 'eu-west-1'
  }
  # Required for Heroku
  config.cache_dir = "#{Rails.root}/tmp/uploads"
  config.fog_directory = ENV["BUCKET_NAME"]
end
My Uploader:
class ImageUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
Heroku has the correct environment variables set; I used the figaro gem. I also set them manually after I got the 403 the first few times, to make sure figaro hadn't introduced any errors.
I thought this might be a problem with the region, but my bucket is in the US and the CarrierWave documentation says the default is us-east-1.
What is causing the issue on Heroku but not locally?
Forbidden may mean an issue with the configured directory rather than with the credentials. Are you using the same BUCKET_NAME value both locally and on Heroku? I've certainly tried to use a bucket that I had not yet created, which can also give this error. So checking that the value is what you expect, and that the bucket already exists, are a couple of good starting points. Happy to discuss and keep helping if that doesn't solve it for you, though.
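One way to check both at once is from heroku run console, talking to S3 with the same credentials the app uses (a sketch with the fog gem; directories.get returns nil when the bucket doesn't exist and raises Excon::Errors::Forbidden when the credentials can't access it):

require 'fog'

storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ENV['ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['SECRET_ACCESS_KEY']
)

puts storage.directories.get(ENV['BUCKET_NAME']).inspect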

carrierwave not recognizing keys

I just pulled an older version of a Rails project I'm working on from GitHub and am having problems with CarrierWave for image upload. I used the figaro gem to store my secret keys so they were not in the files I pulled down (figaro puts the keys in an application.yml file that gets listed in .gitignore). So I added the figaro configuration, but CarrierWave still refuses to work. I even tried putting the keys into the CarrierWave configuration directly to see if it was something with figaro, but no luck.
My config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_KEY'],
    :region                => 'us-east-1'
  }
  config.fog_directory = 'bucketname'
  config.fog_public    = true
  config.fog_attributes = { 'Cache-Control' => 'max-age=315576000' }
end
I am pretty sure that my keys are being stored properly in my development environment, but I have no idea why carrierwave is not working like it did before.
Thanks!
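One quick way to rule figaro in or out is to check the environment directly in rails console (the variable names are taken from the initializer above):

ENV['AWS_KEY_ID']     # should return your key, not nil
ENV['AWS_SECRET_KEY'] # should return your secret, not nil

If either comes back nil, figaro isn't loading application.yml as expected (for example, keys nested under the wrong environment name), and CarrierWave ends up configured with nil credentials.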

carrierwave + amazon s3 socket error

My problem is really similar to this one:
rails + carrierwave + fog + S3 socket error
I was experimenting with regions, but without luck.
I'm trying to use carrierwave + fog + Amazon S3 as described on the wiki and in a few similar questions here on Stack Overflow. So, here are my files:
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_directory = 'folder_name_here' # my bucket name
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'access_key_here',  # copied from Amazon
    :aws_secret_access_key => 'secret_key_here',  # copied from Amazon
    :region                => 'eu-west-1'         # my bucket's region is set to Ireland
  }
end
My uploader has its storage parameter set to :fog.
Now, when I try to save my model, I get the following error:
getaddrinfo: Name or service not known (SocketError)
Do you have any idea what I might be doing wrong? If any further info is needed, please just let me know.
Did you restart your server after adding the carrierwave and fog gems?
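Also worth checking: getaddrinfo failures mean the hostname fog is trying to reach doesn't resolve in DNS at all. A quick test from IRB (the bucket name is the placeholder from the config above; s3-eu-west-1.amazonaws.com is the endpoint fog uses for eu-west-1 buckets):

require 'resolv'

# Raises Resolv::ResolvError if the name does not resolve
puts Resolv.getaddress('folder_name_here.s3-eu-west-1.amazonaws.com')

If that also fails, the problem is network or DNS (proxy, offline machine) rather than carrierwave; if it succeeds, double-check that the bucket name contains only characters that are valid in a hostname.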
