I am using ActiveStorage for uploading PDFs and images. The PDFs need to be stored locally because of some privacy concerns, while the images need to be stored using Amazon S3. However, it looks like ActiveStorage only supports setting one service type per environment (unless you use the mirror functionality, which doesn't do what I need it to in this case).
Is there a way to use different service configs within the same environment? For example, if a model has_one_attached :pdf, it uses the local service:
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
And if another model has_one_attached :image, it uses the amazon service:
amazon:
  service: S3
  access_key_id: ""
  secret_access_key: ""
Rails 6.1 now supports this.
As per this article, you can specify the service to use for each attachment:
class MyModel < ApplicationRecord
  has_one_attached :private_document, service: :disk
  has_one_attached :public_document, service: :s3
end
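The symbols passed to service: just have to match keys in config/storage.yml, so with the configs from the question you could write service: :local and service: :amazon instead, or rename the entries to match the example above (the names and values below are only illustrative):
# config/storage.yml
disk:
  service: Disk
  root: <%= Rails.root.join("storage") %>

s3:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_ACCESS_KEY"] %>
  region: us-east-1
  bucket: my-bucket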
ActiveStorage is great, but if you're in need of multiple service types per environment it currently won't work for you (as George Claghorn mentioned above). If you need an alternate option, I solved this problem by using Shrine.
The trick is to set up multiple 'stores' in your initializer:
# config/initializers/shrine.rb
require "shrine"
require "shrine/storage/file_system"
require "shrine/storage/s3"

Shrine.storages = {
  cache: Shrine::Storage::FileSystem.new('storage', prefix: 'uploads/cache'),
  pdf_files: Shrine::Storage::FileSystem.new('storage', prefix: 'uploads'),
  images: Shrine::Storage::S3.new(**s3_options) # s3_options: bucket, region, credentials, etc.
}
And then use the default_storage plugin in each uploader (which you connect to a given model). Note that it won't work unless you specify the default_storage in both uploaders:
class PdfFileUploader < Shrine
  plugin :default_storage, cache: :cache, store: :pdf_files
end

class ImageFileUploader < Shrine
  plugin :default_storage, cache: :cache, store: :images
end
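To hook each uploader up to a model, include its Attachment module (a minimal sketch assuming Shrine 3.x and hypothetical Document and Photo models, each with the *_data text column Shrine expects):
class Document < ApplicationRecord
  include PdfFileUploader::Attachment(:pdf)     # reads/writes the pdf_data column, stores to :pdf_files
end

class Photo < ApplicationRecord
  include ImageFileUploader::Attachment(:image) # reads/writes the image_data column, stores to :images
end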
Sorry, I’m afraid Active Storage doesn’t support this.
I need to upload a video to a 3rd party via API.
Using the API I have requested this "Upload Location", which is valid for 15 minutes. Now I need to upload my Active Storage video directly to this remote upload location. This remote location is not managed by me.
I have read the official documentation, but it's not clear how I can replace the default upload location URL with this one.
Doc: https://edgeguides.rubyonrails.org/active_storage_overview.html#direct-uploads
Upload location:
{"uploadLocation"=>"https://storage-3rd-party.s3.eu-west-1.amazonaws.com/staging/folderr/12345/xxxx?X-Amz-
Security-Token=xxx...."}
If I'm understanding correctly, you're going to want to implement a custom ActiveStorage::Service for this 3rd-party API. Behind the scenes, Rails invokes url_for_direct_upload to get the URL you want to customize.
You should be able to get something close to working if you implement a new service like so:
class ThirdPartyStorageService < ActiveStorage::Service
  def url_for_direct_upload(key, expires_in:, content_type:, content_length:, checksum:)
    ThirdPartyAPI.get_upload_location(...)
  end

  # Implement other abstract methods...
end
You then need to add your service in config/storage.yml:
third_party:
  service: ThirdPartyStorageService
  # username: ...
  # password: ...
  # other config...
And then you can set it up to be used in a specific model, or globally.
# app/models/my_model.rb
class MyModel < ApplicationRecord
  has_one_attached :file, service: :third_party
end
# or config/application.rb
config.active_storage.service = :third_party
It's a bit of work, but I think this should set you up for success! Make sure to read the docs on ActiveStorage::Service and you can look at the implementations for Azure, AWS and Google storage services for inspiration if you aren't sure how to implement a certain method.
For a school project, I'm working on a Rails app which "sells" pics of kittens. I picked 10 pictures of cats online; they are currently on my computer. I'm using PostgreSQL for the DB. I have a class/model Item which represents the kitten photos.
What I'm looking for is a way to attach a kitten photo to each Item record while generating fake data through seeds.rb loops; the photos should then be stored in an AWS S3 bucket that is already created (it's called catz-temple). I have my S3 access and secret keys in a .env file, and I have already modified my storage.yml file like so:
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: eu-central-1
  bucket: catz-temple
I found out there is a gem called aws-sdk-ruby, but I just can't figure out what approach I should take here.
For now, I have just made my bucket public and use each photo's URL directly, but that is neither an API-based nor a secure approach...
Thank you all
Start by following the guides for configuring Active Storage and S3. Then set up the attachment on your model.
class Kitteh < ApplicationRecord
  has_one_attached :photo
end
With ActiveStorage you can directly attach files to records by passing an IO object:
photos = Dir.glob(Rails.root.join('path/to/the/images', '*.{jpg,gif,png}').to_s)

100.times do |n|
  path = photos.sample
  File.open(path) do |file|
    Kitteh.new(name: "Kitteh #{n}") do |k|
      k.photo.attach(
        io: file,
        filename: File.basename(path)
      )
    end.save!
  end
end
This example creates 100 records with a random image selected from a directory on your hard drive and will upload it to the storage you have configured.
A user uploads a document and it gets stored in Azure with Active Storage. The next step is for the backend to process it, so I have a service object for that. That means I need to download the file from Azure to the tmp folder within the Rails app. How do I download the file? I cannot use rails_blob_url because it is not available in a service object, only in controllers and views.
When I still used Paperclip I did something like this:
require 'open-uri'
file = Rails.root.join('tmp', user.attachment_file_name)
name = user.attachment_file_name
download = open(user.attachment.url)
download_result = IO.copy_stream(download, file)
How can I do something similar with ActiveStorage?
You can use ActiveStorage::Blob#open:
Downloads the blob to a tempfile on disk. Yields the tempfile.
Given this example from the guides:
class User < ApplicationRecord
  has_one_attached :avatar
end
You can do this with:
user.avatar.open do |tempfile|
  # do something with the file
end
If it's has_many_attached, you of course need to loop through the attachments.
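If you specifically need the file inside the app's tmp folder, as in the Paperclip snippet above, one option (a sketch assuming the has_one_attached :avatar example) is to download the blob and write it there yourself:
path = Rails.root.join('tmp', user.avatar.filename.to_s)
File.binwrite(path, user.avatar.download) # loads the whole file into memory before writing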
See:
Active Storage Overview
I'm using Active Storage to store files in a Rails 5.2 project. I've got files saving to S3, but they save with random string filenames and directly to the root of the bucket. I don't mind the random filenames (I actually prefer it for my use case) but would like to keep different attachments organized into folders in the bucket.
My model uses has_one_attached :file. I would like to specify to store all these files within a /downloads folder within S3 for example. I can't find any documentation regarding how to set these paths.
Something like has_one_attached :file, folder: '/downloads' would be great if that's possible...
The ultimate solution is to add an initializer. You can add a prefix based on an environment variable or your Rails.env:
# config/initializers/active_storage.rb
Rails.configuration.to_prepare do
  ActiveStorage::Blob.class_eval do
    before_create :generate_key_with_prefix

    def generate_key_with_prefix
      self.key = if prefix
        File.join prefix, self.class.generate_unique_secure_token
      else
        self.class.generate_unique_secure_token
      end
    end

    def prefix
      ENV["SPACES_ROOT_FOLDER"]
    end
  end
end
This works perfectly. Other people suggest using Shrine instead.
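If you prefer the Rails.env option mentioned at the top of this answer over an environment variable, the prefix method could fall back to the environment name (a small variation, not part of the original workaround):
def prefix
  ENV.fetch("SPACES_ROOT_FOLDER", Rails.env) # use the env var if set, otherwise e.g. "production"
end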
Credit for this great workaround goes to: https://dev.to/drnic/how-to-isolate-your-rails-blobs-in-subfolders-1n0c
As of now ActiveStorage doesn't support that kind of functionality. Refer to this link. has_one_attached just accepts name and dependent.
Also, in one of the GitHub issues, the maintainer clearly mentioned that they have no plans to implement something like this.
The workaround I can imagine is uploading the file from the front-end and then writing a service that updates the key field on the active_storage_blobs record.
There is no official way to change the path, which is determined by ActiveStorage::Blob#key. The source code is:
def key
  self[:key] ||= self.class.generate_unique_secure_token
end
And ActiveStorage::Blob.generate_unique_secure_token is:
def generate_unique_secure_token
  SecureRandom.base36(28)
end
So a workaround is to override the key method like the following:
# config/initializers/active_storage.rb
ActiveSupport.on_load(:active_storage_blob) do
  def key
    self[:key] ||= "my_folder/#{self.class.generate_unique_secure_token}"
  end
end
Don't worry, this will not affect existing files. But be careful: Active Storage is still quite new and its source code keeps changing. When upgrading your Rails version, remember to check whether this patch still works correctly.
You can read ActiveStorage source code from here: https://github.com/rails/rails/tree/master/activestorage
Solution using Cloudinary service
If you're using Cloudinary you can set the folder on storage.yml:
cloudinary:
  service: Cloudinary
  folder: <%= Rails.env %>
With that, Cloudinary will automatically create folders based on your Rails env.
This is a long-standing issue with Active Storage that the Cloudinary team seems to have worked around. Thanks for the amazing work ❤️
Another variation on the key override looks up the owning record through active_storage_attachments before building the key:
# config/initializers/active_storage.rb
ActiveSupport.on_load(:active_storage_blob) do
  def key
    sql_find_record_id = "select * from active_storage_attachments where blob_id = #{self.id}"
    active_storage_attachment = ActiveRecord::Base.connection.select_one(sql_find_record_id)
    # record_id is the id of the record this blob is attached to via has_one_attached
    record_id = active_storage_attachment['record_id']
    self[:key] = "my_folder/#{self.class.generate_unique_secure_token}"
    self.save
    self[:key]
  end
end
Active Storage doesn't have a path/folder feature by default, but you can work around it by passing the key explicitly when attaching:
model.file.attach(key: "downloads/filename", io: File.open(file), content_type: file.content_type, filename: "#{file.original_filename}")
Doing this stores the key with the path prefix you want, so the file is uploaded to exactly that place in the S3 subdirectory.
I've been implementing an Active Storage Google strategy on Rails 5.2. At the moment I am able to upload files using the Rails console without problems; the only thing I am missing is a way to specify a directory inside a bucket. Right now I am uploading as follows:
bk.file.attach(io: File.open(bk.source_dir.to_s), filename: "file.tar.gz", content_type: "application/x-tar")
The configuration in my storage.yml:
google:
  service: GCS
  project: my-project
  credentials: <%= Rails.root.join("config/myfile.json") %>
  bucket: bucketname
But my bucket contains different directories, such as bucketname/department1. I've been through the documentation and have not found a way to specify further directories, and all my uploads end up directly under bucketname.
Sorry, I’m afraid Active Storage doesn’t support that. You’re intended to configure Active Storage with a bucket it can use exclusively.
Maybe you can try metaprogramming, something like this:
Create config/initializers/active_storage_service.rb to add a set_bucket method to ActiveStorage::Service:
module Methods
  def set_bucket(bucket_name)
    # update the configured bucket name
    config[:bucket] = bucket_name
    # reset the memoized bucket so the new name takes effect
    @bucket = client.bucket(bucket_name, skip_lookup: true)
  end
end

ActiveStorage::Service.class_eval { include Methods }
Update your bucket before uploading or downloading files
ActiveStorage::Blob.service.set_bucket "my_bucket_name"
bk.file.attach(io: File.open(bk.source_dir.to_s), filename: "file.tar.gz", content_type: "application/x-tar")