Fog issue using IAM profile and fetching URLs from AWS - ruby-on-rails

Using Fog with AWS instance profiles, and after 3 days my S3 URLs are no longer working. I'm getting fresh URLs, but the error returned from AWS is The provided token has expired. Restarting the application gets everything working again, and no errors other than the one from AWS are present.
I have read that switching to keys should fix my issue, but I was hoping to keep my IAM profile. Has anyone run into this?
My CarrierWave config is below; I am using CarrierWave version 0.9.0 and Fog 1.28.0.
CarrierWave.configure do |config|
  fog_credentials = {
    :provider => 'AWS',
    :region => 'us-east-1',
    :path_style => true,
    :host => 's3-external-1.amazonaws.com' # routes all requests to the Northern Virginia datacenter
  }
  if defined?(Settings.use_iam_profile) && Settings.use_iam_profile
    fog_credentials[:use_iam_profile] = true
  else
    fog_credentials[:aws_access_key_id] = Settings.s3_access_key
    fog_credentials[:aws_secret_access_key] = Settings.s3_secret_key
  end
  config.fog_credentials = fog_credentials
  config.fog_directory = Settings.s3_bucket_name # required
  config.fog_public = false # optional, defaults to true
  config.root = File.join(Rails.root, 'private')
end

Update: this link got updated. Basically, the issue was that while the signing token was being refreshed correctly when downloading files with Fog, it wasn't being refreshed when signing an S3 URL.
There was a pull request made on Fog to fix this issue.
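Until a fixed Fog version is in place, one possible workaround is to rebuild the Fog connection shortly before the instance-profile token would expire, so URL signing always uses fresh temporary credentials. The sketch below is a hypothetical helper, not part of Fog or CarrierWave; the six-hour token lifetime, the safety margin, and the RefreshingStorage name are all assumptions.

```ruby
# Hypothetical workaround, not part of Fog or CarrierWave: rebuild the Fog
# connection shortly before the instance-profile token would expire, so that
# URL signing always uses fresh temporary credentials. The six-hour lifetime
# and the five-minute safety margin are assumptions.
class RefreshingStorage
  TOKEN_LIFETIME = 6 * 3600 # assumed lifetime of instance-profile credentials
  SAFETY_MARGIN  = 300      # rebuild five minutes early

  def initialize(&builder)
    @builder = builder # block that creates a fresh Fog connection
    @connection = nil
    @built_at = nil
  end

  # Returns a connection, rebuilding it when the current one is near expiry.
  def connection(now = Time.now)
    if @connection.nil? || now - @built_at > TOKEN_LIFETIME - SAFETY_MARGIN
      @connection = @builder.call
      @built_at = now
    end
    @connection
  end
end
```

In a setup like the one above, the builder block would call Fog::Storage.new(:provider => 'AWS', :use_iam_profile => true, :region => 'us-east-1'), and signed URLs would be generated from the connection this helper returns.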

Related

Upload files to amazon aws3 rails with carrierwave error hostname does not match the server certificate

I followed this tutorial:
http://lifesforlearning.com/uploading-images-with-carrierwave-to-s3-on-rails/
I had a working CarrierWave uploader which was storing files to disk.
What I did, step by step:
1) added the fog gem and ran bundle install and bundle update
2) in config/initializers I created an r3.rb file with this:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'mykey',
    :aws_secret_access_key => 'mysecretkey',
    :region => 'us-west-2' # Change this for a different AWS region.
  }
  config.fog_directory = "bucket-main"
end
I ran rails s and tried to save a photo, but my bucket is empty, so the files must still be stored on my disk.
What do I do now?
Update: I changed storage to fog.
Here is my photouploader class code:
# encoding: utf-8
class PhotoUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
And now I get this error:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not
match the server certificate (OpenSSL::SSL::SSLError)
I eventually solved my problem by updating:
bundle update fog
and
bundle update carrierwave
Try adding path_style to your fog_credentials, and set the fog_directory:
config.fog_credentials = {
  ...
  :path_style => true
}
config.fog_directory = 'bucket-main'
I just spent a few hours tracking down the cause of this error, which I was also getting:
hostname "bucket-main.bucket-main.s3-us-west-1.amazonaws.com" does not match the server certificate (OpenSSL::SSL::SSLError)
The odd thing is how the bucket name is repeated twice in the hostname. It turned out I had configured the wrong region name. Notice in your config.fog_credentials you have
:region => 'us-west-2'
...but the hostname in the exception has s3-us-west-1? If your bucket is in one AWS region, but you configure a different region in your Fog credentials, Fog will try to follow a redirect from AWS, and somehow the bucket name gets doubled up in this situation. Fog produces a warning about the redirect, but Carrierwave ends up hiding this from you.
Set :region in your Fog credentials to the region where the bucket actually lives in AWS, and the "does not match the server certificate" exception will stop happening.
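A quick way to confirm where a bucket actually lives is S3's GetBucketLocation call (in Fog, something like connection.get_bucket_location('bucket-main'), assuming the fog-aws request of that name). One quirk to be aware of: the API returns an empty LocationConstraint for us-east-1 and the legacy value EU for eu-west-1. A small helper to normalize the result (my own illustration, not a Fog method):

```ruby
# Normalize S3's LocationConstraint to a usable region name.
# GetBucketLocation returns "" (or nil) for us-east-1 and the
# legacy value "EU" for buckets created in eu-west-1.
def s3_location_to_region(location_constraint)
  case location_constraint
  when nil, '' then 'us-east-1'
  when 'EU'    then 'eu-west-1'
  else location_constraint
  end
end
```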

Store file to amazon s3 rails?

I have a working Heroku app, but since Heroku doesn't provide persistent file storage, I'd like to use Amazon S3.
I found the Heroku tutorial https://devcenter.heroku.com/articles/s3 but it seems confusing to me and maybe a little complicated.
Right now I use the carrierwave gem for storing files.
So maybe you can give me a small and simple example of the code you use to store files to Amazon S3?
UPDATE:
I found this code (is it really that simple: just a few lines for CarrierWave, add a gem called fog, and that's all? Or will I need something else?):
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => "YOUR AMAZON ACCESS KEY",
    :aws_secret_access_key => "YOUR AMAZON SECRET KEY",
    :region => 'us-west-1' # Change this for a different AWS region. Default is 'us-east-1'
  }
  config.fog_directory = "bucket-name"
end
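That initializer is most of it, but two more pieces are needed: the uploader class must use Fog storage, and the model must mount the uploader. A minimal sketch, with placeholder class and attribute names:

```ruby
# app/uploaders/avatar_uploader.rb
class AvatarUploader < CarrierWave::Uploader::Base
  storage :fog # store via Fog/S3 instead of the local filesystem
end

# app/models/user.rb
class User < ActiveRecord::Base
  mount_uploader :avatar, AvatarUploader # adds user.avatar, user.avatar.url, etc.
end
```

With those in place, assigning a file to user.avatar and saving the record uploads it to the configured bucket, and user.avatar.url returns the S3 URL.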

403 Error When Uploading Image on Heroku to S3 with Carrierwave and Fog

Everything is working as expected locally. Once I push to heroku I can no longer upload images.
The error code I get from heroku logs is:
Excon::Errors::Forbidden (Expected(200) <=> Actual(403 Forbidden)
The XML response contains: <Code>AccessDenied</Code><Message>Access Denied</Message>
My fog.rb:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => ENV["ACCESS_KEY_ID"],
    :aws_secret_access_key => ENV["SECRET_ACCESS_KEY"]
    #:region => 'eu-west-1'
  }
  # Required for Heroku
  config.cache_dir = "#{Rails.root}/tmp/uploads"
  config.fog_directory = ENV["BUCKET_NAME"]
end
My Uploader:
class ImageUploader < CarrierWave::Uploader::Base
  storage :fog

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
Heroku has the correct environment variables - I used the figaro gem. I also set them manually after I got the 403 the first few times to make sure figaro had no errors.
I thought this might be a problem with the region, but my bucket is in the US and the CarrierWave documentation says the default is us-east-1.
What is causing the issue on Heroku but not locally?
Forbidden may mean an issue with the configured directory (rather than the credentials). Are you using the same BUCKET_NAME value both locally and on Heroku? I've certainly tried to use a bucket that I had not yet created (which might also give this error). So checking that the value is what you expect (and that the bucket already exists) are a couple of good starting points. Happy to discuss and continue helping if that doesn't solve it for you.
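One way to separate a bad bucket name from bad credentials is to query the bucket directly from heroku run console, using the same ENV values the app sees. A sketch under the assumption that Fog's directories.get returns nil when the bucket does not exist (a Forbidden error at this step would instead point at the credentials):

```ruby
# Run inside `heroku run console` so the dyno's real ENV is used.
require 'fog'

connection = Fog::Storage.new(
  :provider => 'AWS',
  :aws_access_key_id => ENV['ACCESS_KEY_ID'],
  :aws_secret_access_key => ENV['SECRET_ACCESS_KEY']
)

bucket = connection.directories.get(ENV['BUCKET_NAME'])
if bucket.nil?
  puts "Bucket #{ENV['BUCKET_NAME'].inspect} not found under these credentials"
else
  puts "Bucket found; credentials and bucket name look OK"
end
```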

carrierwave not recognizing keys

I just pulled an older version of a project I'm working on in Rails from GitHub and am having problems with CarrierWave for image upload. I used the figaro gem to store my secret keys, so they were not in the files I pulled down (figaro puts the keys in an application.yml file that is listed in .gitignore). So I added the figaro configuration, but CarrierWave still refuses to work. I even tried putting the keys into the CarrierWave configuration directly, to see if it was something with figaro, but no luck.
My config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => ENV['AWS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_KEY'],
    :region => 'us-east-1'
  }
  config.fog_directory = 'bucketname'
  config.fog_public = true
  config.fog_attributes = { 'Cache-Control' => 'max-age=315576000' }
end
I am pretty sure that my keys are being stored properly in my development environment, but I have no idea why carrierwave is not working like it did before.
Thanks!
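Before digging into CarrierWave itself, it is worth confirming that Figaro actually loaded the keys into ENV in this environment. A tiny helper (my own, for illustration) that reports which expected keys are absent or blank; the key names match the initializer above:

```ruby
# Return the names that are absent or blank in the given environment hash.
# Works with ENV or any hash-like object responding to [].
def missing_keys(env, names)
  names.reject { |name| env[name] && !env[name].empty? }
end

# In `rails console`:
#   missing_keys(ENV, %w[AWS_KEY_ID AWS_SECRET_KEY])
# An empty array means Figaro loaded both keys.
```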

carrierwave + amazon s3 socket error

My problem is really similar to this one:
rails + carrierwave + fog + S3 socket error
I was experimenting with regions, but without luck.
I'm trying to use carrierwave + fog + Amazon S3 as described on the wiki and in a few similar questions here on Stack Overflow. So, here are my files:
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_directory = 'folder_name_here' # my bucket name
  config.fog_credentials = {
    :provider => 'AWS',
    :aws_access_key_id => 'access_key_here', # copied from amazon
    :aws_secret_access_key => 'secret_key_here', # copied from amazon
    :region => 'eu-west-1' # my bucket has region set to Ireland
  }
end
My uploader has its storage parameter set to :fog.
Now, when I try to save my model, I get the following error:
getaddrinfo: Name or service not known (SocketError)
Do you have any idea what I might be doing wrong? If any further info is needed, please let me know.
Did you restart your server after adding the carrierwave and fog gems?
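For what it's worth, getaddrinfo: Name or service not known means DNS could not resolve the S3 hostname that Fog built, which usually comes down to a malformed :region value (or no network access at all). As an illustration of why the region string matters, here is roughly how the legacy region-specific S3 endpoint is derived; this is my own sketch of the naming scheme in use at the time, not Fog's actual code:

```ruby
# Legacy S3 endpoint naming: us-east-1 uses the bare hostname, while other
# regions get an "s3-<region>" prefix. A typo like 'eu-west1' therefore
# produces a hostname that does not exist in DNS, causing a SocketError.
def s3_endpoint(region)
  region == 'us-east-1' ? 's3.amazonaws.com' : "s3-#{region}.amazonaws.com"
end
```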