Using multiple buckets with ActiveStorage - ruby-on-rails

Does anyone know if there is a way to configure custom buckets for specific attachments?
class MyModel < ApplicationRecord
  ...
  has_one_attached :logo, bucket: 'custom_bucket'
  ...
end

Although there isn't a per-attachment bucket option, you can fairly easily define multiple Active Storage services, one per bucket, and choose a service per attachment (I believe this was introduced in Rails 6.1):
https://edgeguides.rubyonrails.org/active_storage_overview.html#attaching-files-to-records
For example, you might define an "amazon_s3_hot" and an "amazon_s3_cold" service that share every setting apart from the bucket. You can then configure the two buckets accordingly on AWS.
# config/storage.yml
amazon_s3_hot:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-1
  bucket: my_hot_bucket

amazon_s3_cold:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-1
  bucket: my_cold_bucket
# app/models
class User < ApplicationRecord
  has_one_attached :avatar, service: :amazon_s3_hot
end

class DocumentRecord < ApplicationRecord
  has_one_attached :document_upload, service: :amazon_s3_cold
end
Note: hot/cold doesn't apply to the question directly, but provides some context. Hot/cold storage is a cloud storage concept that trades lower storage costs for lower access frequency.

You could follow a pattern similar to how a traditional database.yml file inherits settings, which is just YAML anchors. My storage.yml looks roughly like the following, which lets me store each Active Storage attachment type in its own bucket.
The S3 service, which is what powers the DigitalOcean provider, requires a bucket name. I've specified 'default', but you could call it 'all' or 'general' and then override only the services you care about.
(storage.yml)
do: &do
  service: S3
  endpoint: <%= Rails.application.credentials.dig(:digitalocean, :endpoint) %>
  access_key_id: <%= Rails.application.credentials.dig(:digitalocean, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:digitalocean, :secret_access_key) %>
  region: 'nyc3'
  bucket: default

do_user_uploads:
  <<: *do
  bucket: user_uploads
(user.rb)
has_one_attached :upload, service: :do_user_uploads
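To sanity-check the inheritance, the same anchor/merge-key mechanism can be exercised with Ruby's stdlib YAML parser (placeholder values, credentials omitted):

```ruby
require "yaml"

yaml = <<~YML
  do: &do
    service: S3
    region: nyc3
    bucket: default
  do_user_uploads:
    <<: *do
    bucket: user_uploads
YML

# Aliases must be allowed explicitly so the &do anchor can be reused.
config = YAML.safe_load(yaml, aliases: true)
merged = config["do_user_uploads"]

puts merged["service"]  # => S3
puts merged["region"]   # => nyc3
puts merged["bucket"]   # => user_uploads
```

do_user_uploads picks up service and region from the &do anchor, while its own bucket line overrides the inherited value.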
Hope that helps, I came here looking for the same answer!

There isn’t, sorry. Active Storage is designed for use with a single bucket.

Related

What is the best way to add an Active Storage representation to a mailer?

I would like to use an Active Storage representation in a mailer.
rails_blob_path(@post.photos.first.img).variant(resize: "300x300") doesn't work, and all the links generated through rails_blob_path(@post.photos.first.img) expire in 5 minutes.
Is there a way to generate permanent, long-lived URLs?
Rails.application.routes.url_helpers.rails_blob_url(@post.photos.first.img.variant(resize: "300x300"), only_path: true)
returns NoMethodError: undefined method `signed_id'
Rails 6.1 introduced support for public storage. For example:
s3_public:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:s3, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:s3, :secret_access_key) %>
  bucket: bucket_name
  public: true
You can set the storage service on a per-attachment basis in case you don't want everything public:
has_one_attached :pdf, service: :s3_public

Can't retrieve file from S3 with Rails Active Storage

I am trying to upload a photo to S3 using Rails Active Storage.
I can attach a photo:
user.photo.attach(io: File.open('spec/images/filename.png'), filename: 'filename.png')
I can save the user and I can get the photo service url and see it in the browser, and in my bucket:
user.photo.service_url
However, if I restart my console and try to get the service url, I receive the following error:
Module::DelegationError (service_url delegated to attachment, but attachment is nil)
Here are my bucket settings:
storage.yml:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
application.rb:
config.active_storage.service = :amazon
user.rb:
has_one_attached :photo
I am also having trouble using public: true in the storage.yml file.
I receive the following error if I try to set the config:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-east-2
  bucket: <%= Rails.application.credentials.dig(:aws, :bucket) %>
  public: true
I also wanted to upload my files to AWS S3 and have them publicly available.
I ran into this issue as well and found the following:
ArgumentError (Cannot load `Rails.config.active_storage.service`:)
invalid configuration option `:public'
comes from this file in the aws-sdk-ruby gem. As per the error message, the aws-sdk-ruby gem does not support the public: true option.
I used the following work-around (special thanks to this article):
I updated my storage.yml to:
public_amazon:
  service: S3
  access_key_id: some_key
  secret_access_key: some_secret
  bucket: some-bucket-name
  region: some-region
  upload:
    acl: "public-read"
The above sets the uploaded file permissions to be public.
Retrieve the public URL like this:
user.photo.attach(params[:file])
url = user.photo.service.send(:object_for, user.photo.key).public_url

How to Setup Transfer Acceleration for S3 on Ruby on Rails Active Storage

An S3 bucket with Transfer Acceleration enabled lets you upload and download faster. The setup should be simple in theory:
The endpoint changes from:
mybucket.s3.us-east-1.amazonaws.com
to:
mybucket.s3-accelerate.amazonaws.com
In Ruby on Rails config/storage.yml the environment variables look like this:
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
The problem is that the hostname pattern is different, so I can't just change AWS_REGION; the .s3 part becomes .s3-accelerate. How do I implement S3 Transfer Acceleration with Ruby on Rails and Active Storage?
I had to add this below the bucket...:
use_accelerate_endpoint: true
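Putting the answer together, the full service entry might look like the sketch below (same environment variables as in the question). As far as I know, extra keys in storage.yml are passed through to the underlying AWS SDK client, and Transfer Acceleration additionally requires a DNS-compliant bucket name without dots:

```yaml
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['AWS_REGION'] %>
  bucket: <%= ENV['AWS_BUCKET'] %>
  use_accelerate_endpoint: true
```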

Can I add Shrine upload credentials to the model

I have a multi-tenant site built on Rails 5. Each tenant adds their own S3 credentials, so any uploads that happen on a tenant's site get uploaded to that tenant's own S3 account.
The problem I have at the moment is that Shrine only seems to let me set S3 credentials in the initializer. This works great, but I would like to set them from the model so that I can dynamically populate the S3 credentials depending on which tenant is active at the time. Does anyone know a way Shrine can help me?
I managed to do this with paperclip but it came with other problems such as background processing etc.
You could define all the storages in the initializer:
Shrine.storages = {
  first_storage: Shrine::Storage::S3.new(
    bucket: "my-first-bucket",   # required
    region: "eu-west-1",         # required
    access_key_id: "abc",
    secret_access_key: "xyz"
  ),
  second_storage: Shrine::Storage::S3.new(
    bucket: "my-second-bucket",  # required
    region: "eu-east-1",         # required
    access_key_id: "efg",
    secret_access_key: "uvw"
  )
}
Note: this is not the complete storages setup; both the :cache and the :store storages should also be defined.
And then use them in the models:
class Photo
  include ImageUploader::Attachment(:image)
end

photo = Photo.new
photo.image_attacher.upload(io, :first_storage)
photo.image_attacher.upload(other_io, :second_storage)
See Shrine attacher's doc page and source code
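For the dynamic, per-tenant case the question actually asks about, one option is to register storages lazily at runtime instead of enumerating every bucket in the initializer. The sketch below is gem-free for illustration: Tenant is a stand-in for your tenant model, STORAGES plays the role of the Shrine.storages hash (which is a plain Hash), and the literal Hash marks where a real Shrine::Storage::S3.new(...) call would go:

```ruby
# Runtime registry of per-tenant storages. In a real app this would be
# Shrine.storages itself; a local Hash keeps the sketch runnable without gems.
STORAGES = {}

# Hypothetical tenant record holding its own S3 credentials.
Tenant = Struct.new(:slug, :bucket, :region, :access_key_id, :secret_access_key)

def storage_for(tenant)
  # Build the storage the first time a tenant needs it, then memoize it
  # under a unique key. The Hash stands in for Shrine::Storage::S3.new(**opts).
  STORAGES[:"s3_#{tenant.slug}"] ||= {
    bucket: tenant.bucket,
    region: tenant.region,
    access_key_id: tenant.access_key_id,
    secret_access_key: tenant.secret_access_key
  }
end

acme = Tenant.new("acme", "acme-uploads", "eu-west-1", "abc", "xyz")
puts storage_for(acme)[:bucket]  # => acme-uploads
```

With a real Shrine storage in place of the Hash, the attacher could then upload with something like photo.image_attacher.upload(io, storage_key) once the key is registered in Shrine.storages.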

How to change S3 bucket URL to bucket first then url second

I am creating a Rails API app using the Paperclip and aws-sdk gems.
The app saves the URL as a string. The URL saved is the following:
http://s3.amazonaws.com/S3_BUCKET_/profiles/avatars/000/000/001/original/avatar.png?1457514823
I can't open the above image, because the working URL, taken from S3, is the following:
http://S3_BUCKET_.s3.amazonaws.com/profiles/avatars/000/000/001/original/avatar.png?1457514823
See how the bucket comes first? But the URL saved in the database has the bucket second. How do I change the saved URL to have the bucket first?
config/initializers/paperclip.rb
Paperclip::Attachment.default_options.update(
  default_url: "https://#{Rails.application.secrets.bucket}.s3-ap-southeast-2.amazonaws.com/" \
               "/profiles/avatars/default/missing.jpg")
config/aws.yml
development: &defaults
  access_key_id: s3_access_key
  secret_access_key: s3_secret_key
  s3_region: ap-southeast-2

test:
  secret_access_key: s3_secret_key

staging:
  <<: *defaults
  access_key_id: s3_access_key
  secret_access_key: <%= ENV["SECRET_KEY_BASE"] %>

production:
  <<: *defaults
  access_key_id: s3_access_key
  secret_access_key: <%= ENV["SECRET_KEY_BASE"] %>
profile.rb (the model with the attachment):
require "base64"

class Profile < ActiveRecord::Base
  belongs_to :user
  validates :user, presence: true
  has_attached_file :avatar, styles: { thumb: "100x100>" }
  validates_attachment_content_type :avatar, content_type: /image/i

  def avatar_url
    avatar && avatar.url
  end

  def avatar_base64=(image_base64)
    file = Paperclip.io_adapters.for(image_base64)
    file.original_filename = file.content_type.sub("image/", "avatar.")
    self.avatar = file
  end
end
You can set the URL style in config/initializers/paperclip.rb like this:
Paperclip::Attachment.default_options[:url] = ':s3_domain_url'
Or you can configure directly in your environment configuration, i.e. config/environments/production.rb:
config.paperclip_defaults = {
  storage: :s3,
  url: ':s3_domain_url',
  ...
}
It's important to note that :s3_domain_url is a string, not a symbol.
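The difference between the two URL styles can be sketched in plain Ruby (bucket name and key are placeholders); :s3_domain_url tells Paperclip to build the second, bucket-first form:

```ruby
bucket = "my-bucket"
key    = "profiles/avatars/000/000/001/original/avatar.png"

# Path-style: the bucket appears in the path, after the S3 host.
path_style   = "https://s3.amazonaws.com/#{bucket}/#{key}"

# Domain-style (virtual-hosted): the bucket comes first, as a subdomain.
domain_style = "https://#{bucket}.s3.amazonaws.com/#{key}"

puts path_style    # => https://s3.amazonaws.com/my-bucket/profiles/avatars/000/000/001/original/avatar.png
puts domain_style  # => https://my-bucket.s3.amazonaws.com/profiles/avatars/000/000/001/original/avatar.png
```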
