Why is ActiveStorage with S3 raising InvalidBucketName?

I've just added ActiveStorage to a Rails app using S3. My first use of Active Storage on a staging environment raises:
Aws::S3::Errors::InvalidBucketName: The specified bucket is not valid.
The bucket already exists on S3 and ActiveStorage has the correct bucket name, region, etc. defined in storage.yml.
ActiveStorage has been given AWS IAM credentials that have full S3 access.
The bucket has been configured for CORS:
<CORSRule>
  <AllowedOrigin>*</AllowedOrigin>
  <AllowedMethod>GET</AllowedMethod>
  <MaxAgeSeconds>3000</MaxAgeSeconds>
  <AllowedHeader>Authorization</AllowedHeader>
</CORSRule>
<CORSRule>
  <AllowedOrigin>*</AllowedOrigin>
  <AllowedMethod>PUT</AllowedMethod>
  <AllowedMethod>POST</AllowedMethod>
  <AllowedHeader>*</AllowedHeader>
</CORSRule>
Why am I getting an InvalidBucketName error? What other things should I look at?
# storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

amazon:
  service: S3
  access_key_id: <%= ENV.fetch('AWS_ACCESS_KEY_ID') %>
  secret_access_key: <%= ENV.fetch('AWS_SECRET_ACCESS_KEY') %>
  region: us-east-1
  bucket: <%= ENV.fetch('AWS_S3_BUCKET') %>
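An aside that isn't in the original question: unlike NoSuchBucket, InvalidBucketName means S3 rejected the bucket name string itself (stray whitespace or quotes, underscores, uppercase, or a blank value), which usually points at the environment variable on staging holding something other than what you expect. A minimal console sketch for checking that, reusing the names above:

# Rails console on staging -- diagnostic sketch, not from the original post
bucket = ENV.fetch('AWS_S3_BUCKET')
bucket.inspect   # exposes stray quotes, whitespace, or a trailing newline
# S3 bucket names are 3-63 chars: lowercase letters, digits, dots, hyphens
bucket.match?(/\A[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]\z/)

# What Active Storage actually resolved from storage.yml
ActiveStorage::Blob.service.bucket.name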

Related

Can't retrieve file from S3 with Rails Active Storage

I am trying to upload a photo to S3 using Rails Active Storage.
I can attach a photo:
user.photo.attach(io: File.open('spec/images/filename.png'), filename: 'filename.png')
I can save the user, get the photo's service URL, and see the file both in the browser and in my bucket:
user.photo.service_url
However, if I restart my console and try to get the service url, I receive the following error:
Module::DelegationError (service_url delegated to attachment, but attachment is nil)
Here are my bucket settings:
storage.yml:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
application.rb:
config.active_storage.service = :amazon
user.rb:
has_one_attached :photo
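(An aside that isn't part of the original question: this DelegationError just means photo has no attachment on the reloaded record, and a common cause is attaching to a record that was never saved, or whose transaction rolled back. A minimal console sketch for checking persistence, reusing the names above:)

# Diagnostic sketch -- assumes the User model from the question
user = User.new
user.photo.attach(io: File.open('spec/images/filename.png'), filename: 'filename.png')
user.save!                 # attachments on a new record persist only on save

user = User.find(user.id)  # simulates restarting the console
user.photo.attached?       # true only if the attachment row was persisted
user.photo.service_url     # Rails <= 6.0; on 6.1+ this is user.photo.url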
I am also having trouble using public: true in the storage.yml file.
I receive the following error if I try to set the config:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
  public: true
I also wanted to upload my files to AWS S3 and have them publicly available.
I ran into this issue as well and found the following:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
comes from this file in the aws-sdk-ruby gem. As per the error message, the aws-sdk-ruby gem does not support the public: true option.
I used the following work-around (special thanks to this article):
I updated my storage.yml to:
public_amazon:
  service: S3
  access_key_id: some_key
  secret_access_key: some_secret
  bucket: some-bucket-name
  region: some-region
  upload:
    acl: "public-read"
The above sets the uploaded file permissions to be public.
Retrieve the public URL like this:
user.photo.attach(params[:file])
url = user.photo.service.send(:object_for, user.photo.key).public_url
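An aside beyond the original answer: Rails 6.1 added first-class support for public services to Active Storage, so on 6.1+ the public: true option from the question works as written and no ACL workaround is needed (the bucket still has to permit public reads):

# config/storage.yml -- Rails 6.1+ only; earlier versions raise the
# "invalid configuration option `:public'" error shown above
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
  public: true

With that in place, user.photo.url returns a permanent public URL instead of a signed one.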

How to Set Up Transfer Acceleration for S3 on Ruby on Rails Active Storage

An S3 bucket with Transfer Acceleration enabled allows faster uploads and downloads. The setup should be simple in theory:
The endpoint changes from:
mybucket.s3.us-east-1.amazonaws.com
to:
mybucket.s3-accelerate.amazonaws.com
In a Rails config/storage.yml, the entry built from environment variables looks like this:
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
The problem is that the pattern is different, so I can't just change AWS_REGION; the regional endpoint has an extra .s3 segment that the accelerate endpoint doesn't.
How do I implement S3 Transfer Acceleration with Rails and Active Storage?
I had to add this below the bucket...:
use_accelerate_endpoint: true
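For reference, a sketch of the full entry: any extra keys under an S3 service in storage.yml are forwarded to the aws-sdk-s3 client, which is where use_accelerate_endpoint is understood. Transfer Acceleration must also be enabled on the bucket itself in the AWS console.

amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
  use_accelerate_endpoint: true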

How to change the Watir Chrome pref default_directory to Amazon S3?

storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['S3_BUCKET_NAME'] %>
config/environments/development.rb
# Store uploaded files on the local file system (see config/storage.yml for options)
config.active_storage.service = :amazon
Watir Chrome download
prefs = {
  download: {
    prompt_for_download: false,
    default_directory: '/path/to/dir'
  }
}
b = Watir::Browser.new :chrome, options: { prefs: prefs }
My question: how do I set default_directory to Amazon S3 in the prefs for the Watir Chrome download?
Is this correct?
prefs = {
  download: {
    prompt_for_download: false,
    default_directory: amazon
  }
}
Need help :)
If you specify just a directory name, Chrome creates that directory relative to the current working directory. Give the full path there and it will work; I'm using this and it works perfectly for me. Note that default_directory must be a local path, not an S3 location; see the sketch below for getting the downloaded file into S3 afterwards.
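To make that concrete, a minimal sketch of the whole round trip, assuming the aws-sdk-s3 gem and hypothetical bucket and file names: Chrome downloads to a local directory, and a separate step uploads the file to S3.

require 'watir'
require 'aws-sdk-s3'

download_dir = '/tmp/downloads'   # must be an existing local path
prefs = {
  download: {
    prompt_for_download: false,
    default_directory: download_dir
  }
}
browser = Watir::Browser.new :chrome, options: { prefs: prefs }
# ... drive the browser until the file lands in download_dir ...

# Push the downloaded file to S3 (bucket and key are hypothetical)
s3 = Aws::S3::Resource.new(region: ENV['AWS_REGION'])
s3.bucket(ENV['S3_BUCKET_NAME'])
  .object('watir-downloads/report.csv')
  .upload_file(File.join(download_dir, 'report.csv'))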

Using multiple buckets with ActiveStorage

Does anyone know if there is a way to configure custom buckets for specific attachments?
class MyModel < ApplicationRecord
  ...
  has_one_attached :logo, bucket: 'custom_bucket'
  ...
end
Although there isn't a way to use specific "buckets", one can pretty easily add multiple Active Storage configurations for multiple buckets (I believe this was introduced in Rails 6.1):
https://edgeguides.rubyonrails.org/active_storage_overview.html#attaching-files-to-records
For example, you might have an "amazon_s3_cold" and an "amazon_s3_hot"; they have the same configuration aside from the bucket. You can then configure your buckets accordingly on AWS.
# config/storage.yml
amazon_s3_hot:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-1
  bucket: my_hot_bucket

amazon_s3_cold:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-1
  bucket: my_cold_bucket

# models
class User < ApplicationRecord
  has_one_attached :avatar, service: :amazon_s3_hot
end

class DocumentRecord < ApplicationRecord
  has_one_attached :document_upload, service: :amazon_s3_cold
end
Note - hot/cold doesn't apply to the question directly, but provides some context. Hot/cold storage is a concept pertaining to cloud storage services that trades off costs for access frequencies.
You could follow a pattern similar to how a traditional database.yml inherits settings, which is just YAML anchors. My storage.yml looks somewhat like this, which lets me store each Active Storage attachment type in its own folder.
The S3 service, which is what powers the DigitalOcean provider, requires a bucket name; I've specified a 'default' one, but you could call it 'all' or 'general' and then override only the ones you care about.
(storage.yml)
do: &do
  service: S3
  endpoint: <%= Rails.application.credentials.dig(:digitalocean, :endpoint) %>
  access_key_id: <%= Rails.application.credentials.dig(:digitalocean, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:digitalocean, :secret_access_key) %>
  region: 'nyc3'
  bucket: default

do_user_uploads:
  <<: *do
  bucket: user_uploads

(user.rb)
has_one_attached :upload, service: :do_user_uploads
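An aside not in the original answer: the <<: *do merge key expands so that do_user_uploads inherits every key from the do anchor and overrides only bucket; the effective entry is equivalent to:

do_user_uploads:
  service: S3
  endpoint: <%= Rails.application.credentials.dig(:digitalocean, :endpoint) %>
  access_key_id: <%= Rails.application.credentials.dig(:digitalocean, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:digitalocean, :secret_access_key) %>
  region: 'nyc3'
  bucket: user_uploads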
Hope that helps, I came here looking for the same answer!
There isn’t, sorry. Active Storage is designed for use with a single bucket.

Rails 4 Dragonfly Gem with Amazon S3 Direct Upload

I'm trying to figure out how to incorporate S3 with Dragonfly. The gem works locally, but because I'm using Heroku the files get deleted... so I'm thinking I need to store them on S3.
I'm using this gem: dragonfly-s3_data_store where my dragonfly.rb looks like this:
/config/initializers/dragonfly.rb
require 'dragonfly'
require 'dragonfly/s3_data_store'

# Configure
Dragonfly.app.configure do
  plugin :imagemagick
  url_format "/media/:job/:name"
  datastore :s3,
            bucket_name: 'thelab',
            access_key_id: '{my access key}',
            secret_access_key: '{my secret key}',
            region: "us-west-1"
end
... #more code
This is my error when I'm trying to upload an image:
hostname "thelab.thelab.s3-us-west-1.amazonaws.com" does not match the server certificate (OpenSSL::SSL::SSLError)
I'm not sure what's causing the issue, but also on my console, I see this popping up:
[fog][WARNING] fog: followed redirect to thelab.s3-us-west-2.amazonaws.com, connecting to the matching region will be more performant
There's an inconsistency here... I've also tried switching the region to us-west-2, but I still get the same errors and warnings.
form:
= simple_form_for(resource, as: resource_name, url: registration_path(resource_name), html: { method: :put }) do |f|
  = f.file_field :image
  = f.button :submit, "Save changes"
And of course, the model, in this case user.rb, has this:
dragonfly_accessor :image
Am I missing anything else that needs to be configured?
Also, my CORS configuration for the thelab bucket:
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
