Carrierwave getaddrinfo: Name or service not known - ruby-on-rails

I'm currently fighting to get S3 uploads to work via Carrierwave, Carrierwave-aws & Figaro.
But I keep getting
SocketError in OffersController#create
getaddrinfo: Name or service not known
I've tried changing the asset host to '127.0.0.1', but that still produces the same error.
carrierwave.rb
CarrierWave.configure do |config|
  config.storage = :aws
  config.aws_bucket = ENV.fetch('S3_BUCKET_NAME')
  config.aws_acl = 'public-read'

  # Optionally define an asset host for configurations that are fronted by a
  # content host, such as CloudFront.
  config.asset_host = 'localhost'

  # The maximum period for authenticated_urls is only 7 days.
  config.aws_authenticated_url_expiration = 60 * 60 * 24 * 7

  # Set custom options such as cache control to leverage browser caching
  config.aws_attributes = {
    expires: 1.week.from_now.httpdate,
    cache_control: 'max-age=604800'
  }

  config.aws_credentials = {
    access_key_id: ENV.fetch('AWS_ACCESS_KEY_ID'),
    secret_access_key: ENV.fetch('AWS_SECRET_ACCESS_KEY'),
    region: ENV.fetch('AWS_REGION') # Required
  }
end
gemfile
# Figaro
gem "figaro"
# Carrierwave Integration
gem 'carrierwave'
# Carrierwave AWS
gem 'carrierwave-aws'
Any help on this would be fantastic.

Try removing config.asset_host = 'localhost' from your CarrierWave.configure block. The setting is optional and is mainly used to point at a third-party asset host such as CloudFront.
So remove config.asset_host = 'localhost' and you are done.
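For concreteness, a minimal corrected initializer with the asset_host line dropped (a sketch; the env var names follow the question's own config):

```ruby
# config/initializers/carrierwave.rb — sketch, asset_host removed
CarrierWave.configure do |config|
  config.storage = :aws
  config.aws_bucket = ENV.fetch('S3_BUCKET_NAME')
  config.aws_acl = 'public-read'
  # No config.asset_host here — let carrierwave-aws build the S3 URLs itself.
  config.aws_credentials = {
    access_key_id: ENV.fetch('AWS_ACCESS_KEY_ID'),
    secret_access_key: ENV.fetch('AWS_SECRET_ACCESS_KEY'),
    region: ENV.fetch('AWS_REGION')
  }
end
```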

Related

Retrieving Carrierwave images with S3 signature ultra SLOW

I'm using Carrierwave + S3 to store my records' images in S3.
The problem is that when I retrieve 250 records from a JSON file, loading gets ultra slow because each version of each image of each record has to be signed individually:
"url":"https://xxxxxx.s3.eu-west-3.amazonaws.com/uploads/product/images/1156/1.jpg?X-Amz-Algorithm=AWS4-HMAC-SHA256\u0026X-Amz-Credential=xxxxx-west-3%2Fs3%2Faws4_request\u0026X-Amz-Date=20220921T134602Z\u0026X-Amz-Expires=604800\u0026X-Amz-SignedHeaders=host\u0026X-Amz-Signature=xxxxxx
How can I retrieve the images quickly, without signing each one on every request?
My Carrierwave config file:
CarrierWave.configure do |config|
  config.storage = :aws
  config.aws_bucket = ENV['S3_BUCKET_NAME'] # for AWS-side bucket access permissions config, see section below
  config.aws_acl = 'private'

  # Optionally define an asset host for configurations that are fronted by a
  # content host, such as CloudFront.
  # The maximum period for authenticated_urls is only 7 days.
  config.aws_authenticated_url_expiration = 60 * 60 * 24 * 7

  # Set custom options such as cache control to leverage browser caching.
  # You can use either a static Hash or a Proc.
  config.aws_attributes = -> { {
    expires: 1.week.from_now.httpdate,
    cache_control: 'max-age=604800'
  } }

  config.aws_credentials = {
    access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
    region: ENV['AWS_REGION'], # Required
    stub_responses: Rails.env.test? # Optional, avoid hitting S3 actual during tests
  }

  # Optional: Signing of download urls, e.g. for serving private content through
  # CloudFront. Be sure you have the `cloudfront-signer` gem installed and
  # configured:
  # config.aws_signer = -> (unsigned_url, options) do
  #   Aws::CF::Signer.sign_url(unsigned_url, options)
  # end
end
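One workaround worth considering (a sketch, not from the thread itself): since the signed URLs above are valid for 7 days, the URL for each image version can be memoized until its signature expires, so each record is signed at most once per expiry window. `sign_url` below is a hypothetical stand-in for CarrierWave's real signed-URL call:

```ruby
# Minimal sketch: memoize signed URLs until their signature expires.
class SignedUrlCache
  def initialize(ttl_seconds)
    @ttl = ttl_seconds
    @cache = {} # key => [url, expires_at]
  end

  # Returns the cached URL if still valid, otherwise runs the block
  # (the real signing call) and caches its result.
  def fetch(key)
    url, expires_at = @cache[key]
    return url if url && Time.now < expires_at
    url = yield
    @cache[key] = [url, Time.now + @ttl]
    url
  end
end

cache = SignedUrlCache.new(60 * 60 * 24 * 7)
calls = 0
# Hypothetical signer standing in for the uploader's signed-URL method.
sign_url = ->(key) { calls += 1; "https://bucket.s3.amazonaws.com/#{key}?X-Amz-Signature=abc" }

2.times { cache.fetch('uploads/1.jpg') { sign_url.call('uploads/1.jpg') } }
puts calls # => 1 — the second lookup hit the cache
```

In a Rails app the same idea could be backed by Rails.cache instead of an in-memory Hash, keyed on the file path and version.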

Carrierwave JSON response contains relative url but I want absolute url

Carrierwave is returning a JSON response like this:
"url": "/mys3bucket/uploads/entrees/photo/32/4c312e9aed37a59319096a03_1.jpg",
I need the absolute url. Images are hosted on Amazon S3. How can I get the absolute url?
My temporary hack is to add following to Carrierwave initializer:
config.asset_host = "s3.#{ENV.fetch('AWS_REGION')}.amazonaws.com/mybucket"
CarrierWave uses the combination of the filename and the settings
specified in your uploader class to generate the proper URL. This
allows you to easily swap out the storage backend without making any
changes to your core application.
That said, you cannot store the full URL. You can set CarrierWave's asset_host config setting based on the environment.
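An environment-dependent asset_host could be sketched like this (the production host follows the questioner's temporary hack; the local host value is a placeholder assumption):

```ruby
# config/initializers/carrierwave.rb — hypothetical environment-based asset_host
CarrierWave.configure do |config|
  config.asset_host =
    if Rails.env.production?
      "https://s3.#{ENV.fetch('AWS_REGION')}.amazonaws.com/mybucket"
    else
      'http://localhost:3000' # placeholder for local development
    end
end
```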
What storage are you using in production? Here is my configuration, and it works very well. Hope it helps.
CarrierWave.configure do |config|
  config.root = Rails.root
  if Rails.env.production?
    config.storage = :fog
    config.fog_credentials = {
      provider: "AWS",
      aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],
      aws_secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"],
      region: ENV["S3_REGION"]
    }
    config.fog_directory = ENV["S3_BUCKET_NAME"]
    # config.asset_host = ENV["S3_ASSET_HOST"]
  else
    config.storage = :file
    # config.asset_host = ActionController::Base.asset_host
  end
end

Heroku asset_sync gem not uploading to s3

My asset_sync gem does not upload to s3 when I precompile assets
asset_sync.rb
if defined?(AssetSync)
  AssetSync.configure do |config|
    config.fog_provider = 'AWS'
    config.aws_access_key_id = ENV['AWS_ACCESS_KEY_ID']
    config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
    # To use AWS reduced redundancy storage.
    # config.aws_reduced_redundancy = true
    config.fog_directory = ENV['FOG_DIRECTORY']
    # Invalidate a file on a cdn after uploading files
    # config.cdn_distribution_id = "12345"
    # config.invalidate = ['file1.js']
    # Increase upload performance by configuring your region
    config.fog_region = 'ap-southeast-2'
    #
    # Don't delete files from the store
    # config.existing_remote_files = "keep"
    #
    # Automatically replace files with their equivalent gzip compressed version
    # config.gzip_compression = true
    #
    # Use the Rails generated 'manifest.yml' file to produce the list of files to
    # upload instead of searching the assets directory.
    # config.manifest = true
    #
    # Fail silently. Useful for environments such as Heroku
    # config.fail_silently = true
  end
end
production.rb
config.assets.enabled = true
config.assets.digest = true
config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com"
config.assets.initialize_on_precompile = true
application.rb
config.assets.enabled = true
config.assets.digest = true
When I precompile, I'm not even getting any message to show it is uploading.
Is there a reason why this is happening?
Maybe you have:
group :assets do
  gem 'asset_sync'
end
You should move that gem out of that group.
I know this issue is old, but I just inherited a Heroku app, and while reading up on using asset_sync with Heroku I came across this, which states that ENV variables are not available when compiling assets on Heroku.

asset_sync not uploading to S3

I'm having trouble uploading my assets to S3 with asset_sync.
production.rb
config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}.s3-eu-west-1.amazonaws.com"
config.assets.digest = true
config.assets.enabled = true
config.assets.initialize_on_precompile = true
asset_sync.rb
if defined?(AssetSync)
  AssetSync.configure do |config|
    config.fog_provider = 'AWS'
    config.aws_access_key_id = ENV['AWS_ACCESS_KEY_ID']
    config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
    config.fog_directory = ENV['FOG_DIRECTORY']
    config.fog_region = 'eu-west-1'
  end
end
heroku config
AWS_ACCESS_KEY_ID: XXX
AWS_SECRET_ACCESS_KEY: XXX
FOG_DIRECTORY: bucket_name
FOG_PROVIDER: AWS
FOG_REGION: 'eu-west-1'
$ export
declare -x AWS_ACCESS_KEY_ID="XXX"
declare -x AWS_SECRET_ACCESS_KEY="XXX"
declare -x FOG_DIRECTORY="bucket_name"
declare -x FOG_PROVIDER="AWS"
http://blog.firmhouse.com/complete-guide-to-serving-your-rails-assets-over-s3-with-asset_sync
Once pushed to Heroku, assets point to //bucket_name.s3-eu-west-1.amazonaws.com/assets/icons/name_xxxxxxxxxx.png,
but when running $ rake assets:precompile the files do not get uploaded to S3 and only get precompiled locally. Any ideas? Thanks a lot.
EDIT:
I've just changed the Gemfile from :
group :assets do
gem 'asset_sync'
end
to the top level of the Gemfile:
gem 'asset_sync'
and now I get the warning message: [WARNING] fog: followed redirect to bucket_name.s3-external-3.amazonaws.com, connecting to the matching region will be more performant.
I think I can figure that out, but only the CSS files get uploaded, not the JS files and images.
Your bucket_name needs to be the name of the bucket you actually have on S3.
You should change these commands:
Heroku
FOG_DIRECTORY: your_bucket_real_name
Local
declare -x FOG_DIRECTORY=your_bucket_real_name
Also, you should change this in your production.rb file:
config.action_controller.asset_host = "//#{ENV['FOG_DIRECTORY']}.s3.amazonaws.com"
I think that will resolve your issue. I'm using S3 on EU-West with exactly the same setup (bar the differences I've cited), and it's working for the most part :)
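In concrete terms (the bucket name below is a placeholder for your real S3 bucket name):

```shell
# Heroku: point FOG_DIRECTORY at the real bucket
heroku config:set FOG_DIRECTORY=your-real-bucket-name

# Local shell: mirror the same value
export FOG_DIRECTORY=your-real-bucket-name
```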

Carrierwave Upload with Amazon S3 - 403 Forbidden Error

I am attempting to use Carrierwave with Amazon S3 in my Rails app, and I keep getting the error
"Excon::Errors::Forbidden (Expected(200) <=> Actual(403 Forbidden)."
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.
I also receive the warning
"[WARNING] fog: the specified s3 bucket name() is not a valid dns name, which will negatively impact performance. For details see: http://docs.amazonwebservices.com/AmazonS3/latest/dev/BucketRestrictions.html"
config/initializers/carrierwave.rb:
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key: ENV["AWS_ACCESS_KEY"]
  }
  config.fog_directory = ENV["AWS_BUCKET"]
end
My bucket name is "buildinprogress"
I've double checked that my access key ID and access key are correct.
How can I fix this error??
It is a problem with fog/Excon, which kept throwing random errors for me too.
My fix was to remove gem 'fog' and replace it with gem 'carrierwave-aws' instead.
Then, in your *_uploader.rb, change
storage :fog to storage :aws
and update your carrierwave.rb file, e.g.:
CarrierWave.configure do |config|
  config.storage = :aws # required
  config.aws_bucket = ENV['S3_BUCKET'] # required
  config.aws_acl = 'public-read'
  config.aws_credentials = {
    access_key_id: ENV['S3_KEY'], # required
    secret_access_key: ENV['S3_SECRET'] # required
  }
  config.aws_attributes = {
    'Cache-Control' => "max-age=#{365.day.to_i}",
    'Expires' => 'Tue, 29 Dec 2015 23:23:23 GMT'
  }
end
For more info check out the carrierwave-aws GitHub page