I am running a Ruby on Rails website and am currently using Rails' ActiveStorage to store my images and videos.
I am using an AWS-compatible Space for storage (DigitalOcean), and they recently rolled out support for custom CDNs. Meaning, instead of referencing my-space.nyc3.digitalocean.com, I would reference assets.akinyele.ca.
Everything has been set up on my DigitalOcean dashboard, but I was wondering if I could use assets.akinyele.ca with ActiveStorage instead.
I have tried not specifying a bucket; that failed immediately because the ActiveStorage API requires that field and uses it to build the storage service's URL. I also tried setting the endpoint to assets.akinyele.ca, but that gave me my-space.assets.akinyele.ca.
This is what part of my config looks like:

# config/storage.yml
local: # ...
development: # ...

# This is what I need to replace, and what I am using right now.
amazon:
  service: S3
  access_key_id: <%= ENV["TANOSHIMU_SPACE_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["TANOSHIMU_SPACE_SECRET_ACCESS_KEY"] %>
  region: nyc3
  bucket: my-space
  endpoint: 'https://nyc3.digitaloceanspaces.com'
Thank you.
You can try to override the url method of ActiveStorage::Service::S3Service.
P.S. Use bucket: '' in your config/storage.yml.
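If you do override url, the core of the change is just swapping the host on the URL the service builds. A minimal sketch of that host-rewriting logic in plain Ruby (the CDN host is the one from the question; the cdn_url helper name is made up for illustration):

```ruby
require "uri"

# Custom CDN domain from the question; swap in your own.
CDN_HOST = "assets.akinyele.ca"

# Take the URL the S3 service generated and point it at the CDN host
# instead of the Space's own endpoint. The path and any query string
# (e.g. signed parameters) are left untouched.
def cdn_url(service_url)
  uri = URI.parse(service_url)
  uri.host = CDN_HOST
  uri.to_s
end

cdn_url("https://my-space.nyc3.digitaloceanspaces.com/key/file.png")
# => "https://assets.akinyele.ca/key/file.png"
```

In an actual override you would apply this rewrite to the result of the original url method, e.g. from an initializer that patches ActiveStorage::Service::S3Service.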
Related
For a school project, I'm working on a Rails app which "sells" pics of kittens. I picked 10 pictures of cats online; they are currently on my computer. I'm using PostgreSQL for the DB. I have a class/model Item which represents the kitten photos.
What I'm looking for is a way, when generating fake data through seeds.rb loops, to attach a kitten photo to each Item, which will then be stored in an AWS S3 bucket that is already created (it's called catz-temple). I have my two S3 keys (access and secret) in a .env file, and I have already modified my storage.yml file like so:
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: eu-central-1
  bucket: catz-temple
I found out there is a gem called aws-sdk-ruby, but I just can't figure out what approach I should take here.
For now, I have just put my bucket in public access and use each photo's URL directly, but that is neither an API-based nor a secure approach...
Thank you all
Start by following the guides for configuring Active Storage and S3. Then set up the attachment on your model:
class Kitteh < ApplicationRecord
  has_one_attached :photo
end
With ActiveStorage you can directly attach files to records by passing an IO object:
photos = Dir.glob(Rails.root.join('path/to/the/images', '*.{jpg,gif,png}'))

100.times do |n|
  path = photos.sample
  File.open(path) do |file|
    Kitteh.new(name: "Kitteh #{n}") do |k|
      k.photo.attach(
        io: file,
        filename: File.basename(path)
      )
    end.save!
  end
end
This example creates 100 records with a random image selected from a directory on your hard drive and will upload it to the storage you have configured.
I have a Rails 5.2 API set up and have followed the documentation on how to attach images to a model object - that's all working fine. The problem I'm having is I want to return in a JSON object the attachment's public URL so that I can use that URL as the source in an <img src... in my React front end. Is there a way to return the actual URL from the AWS S3 bucket, where the image would show up if clicked on?
Right now, I've tried rails_blob_path and service_url, and I do get URLs in return, but neither of them actually renders the image the way I'd hoped. Any workarounds for this?
Again, just want the attachment's actual public URL from s3 so I can plug it in to the src attribute inside an <img> and have it display. Thanks.
My development.rb file configures config.active_storage.service = :amazon.
My storage.yml file has amazon configured like so:
amazon:
  service: S3
  access_key_id: <%= Rails.application.secrets.amazon[:access_key] %>
  secret_access_key: <%= Rails.application.secrets.amazon[:secret_key] %>
  region: us-east-2
  bucket: my_bucket_name_here
ActiveStorage 5.2.2
Rails 5.2.2
You should be able to use rails_blob_url or rails_blob_path to create a link to the actual file.
https://edgeguides.rubyonrails.org/active_storage_overview.html#linking-to-files
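For the JSON case in the question, a hedged sketch of returning the URL from a controller (assumes a hypothetical Post model with has_one_attached :image, and that default_url_options provides a host):

```ruby
# app/controllers/posts_controller.rb (sketch)
class PostsController < ApplicationController
  def show
    post = Post.find(params[:id])
    render json: {
      id: post.id,
      # rails_blob_url returns a URL on your app that redirects to a
      # short-lived signed S3 URL, so it is safe to drop into <img src>.
      image_url: rails_blob_url(post.image)
    }
  end
end
```

Because the redirect target is signed and expiring, this avoids making the bucket public.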
I've been implementing the Active Storage Google strategy on Rails 5.2. At the moment I am able to upload files using the rails console without problems; the only thing I am missing is whether there is a way to specify a directory inside a bucket. Right now I am uploading as follows:
bk.file.attach(io: File.open(bk.source_dir.to_s), filename: "file.tar.gz", content_type: "application/x-tar")
The configuration on my storage.yml
google:
  service: GCS
  project: my-project
  credentials: <%= Rails.root.join("config/myfile.json") %>
  bucket: bucketname
But in my bucket there are different directories, such as bucketname/department1. I have been through the documentation and have not found a way to specify subdirectories, so all my uploads end up at the top level of the bucket.
Sorry, I’m afraid Active Storage doesn’t support that. You’re intended to configure Active Storage with a bucket it can use exclusively.
Maybe you can try metaprogramming, something like this:
Create config/initializers/active_storage_service.rb to add a set_bucket method to ActiveStorage::Service:
module Methods
  def set_bucket(bucket_name)
    # Update the configured bucket name
    config[:bucket] = bucket_name
    # Update the current bucket object
    @bucket = client.bucket(bucket_name, skip_lookup: true)
  end
end

ActiveStorage::Service.class_eval { include Methods }
Update your bucket before uploading or downloading files
ActiveStorage::Blob.service.set_bucket "my_bucket_name"
bk.file.attach(io: File.open(bk.source_dir.to_s), filename: "file.tar.gz", content_type: "application/x-tar")
I am using ActiveStorage for uploading PDFs and images. The PDFs need to be stored locally because of some privacy concerns, while the images need to be stored using Amazon S3. However, it looks like ActiveStorage only supports setting one service type per environment (unless you use the mirror functionality, which doesn't do what I need it to in this case).
Is there a way to use different service configs within the same environment? For example, if a model has_one_attached :pdf, it uses the local service:
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
And if another model has_one_attached :image, it uses the amazon service:
amazon:
  service: S3
  access_key_id: ""
  secret_access_key: ""
Rails 6.1 now supports this.
As per this article, you can specify the service to use for each attachment:
class MyModel < ApplicationRecord
  has_one_attached :private_document, service: :disk
  has_one_attached :public_document, service: :s3
end
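For the service: options above to resolve, config/storage.yml needs services whose names match. A sketch (bucket name and region here are placeholders, not from the question):

```yaml
disk:
  service: Disk
  root: <%= Rails.root.join("storage") %>

s3:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_ACCESS_KEY"] %>
  region: us-east-1
  bucket: my-public-bucket
```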
ActiveStorage is great, but if you're in need of multiple service types per environment it currently won't work for you (as George Claghorn mentioned above). If you need an alternate option, I solved this problem by using Shrine.
The trick is to set up multiple 'stores' in your initializer:
# config/initializers/shrine.rb
Shrine.storages = {
  cache: Shrine::Storage::FileSystem.new('storage', prefix: 'uploads/cache'),
  pdf_files: Shrine::Storage::FileSystem.new('storage', prefix: 'uploads'),
  images: Shrine::Storage::S3.new(**s3_options)
}
And then use the default_storage plugin in each uploader (which you connect to a given model). Note that it won't work unless you specify the default_storage in both uploaders:
class PdfFileUploader < Shrine
  plugin :default_storage, cache: :cache, store: :pdf_files
end

class ImageFileUploader < Shrine
  plugin :default_storage, cache: :cache, store: :images
end
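To connect an uploader to a model, Shrine provides an Attachment module you include. A sketch assuming a hypothetical Document model backed by a pdf_file_data text column:

```ruby
class Document < ApplicationRecord
  # Adds #pdf_file, #pdf_file=, and #pdf_file_url accessors,
  # persisted through the pdf_file_data column.
  include PdfFileUploader::Attachment(:pdf_file)
end
```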
Sorry, I’m afraid Active Storage doesn’t support this.
My application.rb:
S3_CREDENTIALS = YAML.load(File.read(File.expand_path(Rails.root.join("config", "s3_credentials.yml"))))["production"]

# AWS::S3::Base.establish_connection! S3_CREDENTIALS['connection']
AWS::S3::Base.establish_connection!(
  :access_key_id     => S3_CREDENTIALS['access_key_id'],
  :secret_access_key => S3_CREDENTIALS['secret_access_key'],
  :persistent        => true # from http://www.ruby-forum.com/topic/110842
)
s3_credentials.yml:
production: &defaults
  access_key_id: <%=ENV['AWS_ACCESS_KEY_ID']%>
  secret_access_key: <%=ENV['AWS_SECRET_ACCESS_KEY']%>
  persistent: true
I launch my web server and in my customers_controller I check for the connection:
Rails.logger.info("S3 service connected? " + AWS::S3::Base.connected?.to_s)
And the log says it's connected:
S3 service connected? true
So I know the following:
My env variables are correct.
My initializer and yml files are correct.
Gem is fine
So then I do something simple like this in the controller:
@documents = AWS::S3::Service.buckets
I reload the page and here we go:
AWS::S3::InvalidAccessKeyId in CustomersController#edit
The AWS Access Key Id you provided does not exist in our records.
THE KICKER:
When I put my secret key and access key in plain text into application.rb (getting rid of the s3_credentials.yml file and putting the keys directly on the appropriate lines), I don't get the error.
Why is it that using environment variables with the aws-s3 gem fails spectacularly in the API methods but connects just fine? The keys can't be good and bad at the same time. It wouldn't connect at all if the keys were wrong, correct?
What I found after hours is that there is something inherently broken in the version of the aws-s3 gem that I'm using. Period. There is no solution other than to fix the gem, and I don't have time to go that far. I checked the source and went to the connection.rb module to find where it pulls in the keys. It uses its own extract_keys! method, and after poking at that for a while I gave up.
I ended up using the aws-sdk gem. Config is a bit different than with aws-s3, but same ENV vars and it is all working now.
I submitted a request to the owner of the aws-s3 gem to take a look. But I don't know how active that gem is any longer.
aws-sdk is overkill when all you need is some simple access to S3 stuff, but whatever. It works.
Here is some of the config just in case someone finds it useful in the future:
aws.yml:
development: &defaults
  access_key_id: <%=ENV['AWS_ACCESS_KEY_ID']%>
  secret_access_key: <%=ENV['AWS_SECRET_ACCESS_KEY']%>
  persistent: true
  # bucket: YOUR BUCKET
  max_file_size: 10485760
  acl: public-read
NOTE: The <%= %> syntax above doesn't have spaces. Normally I format it with spaces. This might sound crazy to some, but if you put spaces between the tag and the value inside, you will get an error from Amazon saying some crazy stuff about only requiring a single space. This drove me nuts. A leading or trailing space shouldn't matter! But it did. I took the spaces out and it started working. Call me crazy. I spent hours battling this so you wouldn't have to!
application.rb (portion):
# Establish a base connection to Amazon S3
S3_CREDENTIALS = YAML.load(File.read(File.expand_path(Rails.root.join("config", "aws.yml"))))["development"]

AWS.config(
  :access_key_id     => S3_CREDENTIALS['access_key_id'],
  :secret_access_key => S3_CREDENTIALS['secret_access_key']
)
And of course export AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to your environment.
On a Mac, just run export AWS_ACCESS_KEY_ID=YOUR_KEY_GOES_HERE AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY_GOES_HERE and hit return.
On Heroku:
heroku config:set AWS_ACCESS_KEY_ID=YOUR_KEY_GOES_HERE AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY_GOES_HERE
Can't do anything for you if you're using Windows.
You're right that it would never work if the key were just wrong. Have you checked that S3_CREDENTIALS['access_key_id'] and ENV['AWS_ACCESS_KEY_ID'] actually contain what you think they contain? My first guess is that S3_CREDENTIALS['access_key_id'] ends up containing the literal string <%=ENV['AWS_ACCESS_KEY_ID']%> (not interpolated), since you're not processing the YAML file as ERB before loading it.
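A minimal sketch of the difference, running the file contents through ERB before YAML parses them (the heredoc stands in for the real s3_credentials.yml; the key value is made up):

```ruby
require "erb"
require "yaml"

# Stand-in for config/s3_credentials.yml; the key name is from the question.
raw = <<~YML
  production:
    access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
YML

ENV["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLE"

# Without ERB, YAML hands back the literal tag text as a string:
YAML.load(raw)["production"]["access_key_id"]
# => "<%= ENV['AWS_ACCESS_KEY_ID'] %>"

# With ERB, the env var is interpolated before YAML parses it:
YAML.load(ERB.new(raw).result)["production"]["access_key_id"]
# => "AKIAEXAMPLE"
```

So replacing YAML.load(File.read(...)) with YAML.load(ERB.new(File.read(...)).result) in application.rb should make the environment-variable version behave like the hard-coded one.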