I am having an issue trying to deploy my app to Heroku with Rails Active Storage. In development I have no issues using
config.active_storage.service = :local
and all works as it should.
However, in my production.rb file I have set config.active_storage.service = :amazon
and followed the setup guides.
My storage.yml is as follows:
test:
service: Disk
root: <%= Rails.root.join("tmp/storage") %>
local:
service: Disk
root: <%= Rails.root.join("storage") %>
amazon:
service: S3
access_key_id: <%= ENV['S3_ACCESS_KEY'] %>
secret_access_key: <%= ENV['S3_SECRET_ACCESS_KEY'] %>
region: <%= ENV['S3_REGION'] %>
bucket: <%= ENV['S3_BUCKET_NAME'] %>
When I run git push heroku master, the app will deploy, but the following error will appear in the logs: "Detecting rails configuration failed".
I am unable to open the app, and the Heroku log will display the following error: "Missing configuration for the :amazon Active Storage service. Configurations available for [:test]".
This same error occurs if I change
config.active_storage.service = :amazon
to
config.active_storage.service = :local
however if I change it to
config.active_storage.service = :test
the app will deploy without error and I am able to open the app and upload files as expected.
I have trawled the web but haven't seen anyone else with this error, so any comments or thoughts are appreciated.
Thanks in advance.
The problem is the indentation in storage.yml.
The test: and local: entries are indented one space, while amazon: is indented two spaces.
I had the same problem, and the extra space was hard to notice. Whitespace is significant in YAML, but Ruby developers aren't used to indentation mattering.
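For reference, a correctly indented storage.yml keeps every service name flush left, with its options nested consistently underneath (the ENV variable names below match the question; the rest is a sketch):
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
amazon:
  service: S3
  access_key_id: <%= ENV['S3_ACCESS_KEY'] %>
  secret_access_key: <%= ENV['S3_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['S3_REGION'] %>
  bucket: <%= ENV['S3_BUCKET_NAME'] %>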
In my Ruby on Rails 6.1.3.2 application, I'm trying to use the has_one_attached service option to upload one of the model's attachments to a separate S3 bucket.
Here is what my storage.yml looks like:
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_I'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: us-east-2
  bucket: <%= ENV['AWS_BUCKET'] %>
amazon_logos_images:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_I'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: us-east-2
  bucket: <%= ENV['AWS_LOGOS_BUCKET'] %>
In my production.rb I have the following Active Storage configuration:
config.active_storage.service = :amazon
My Logo model looks like this:
class Logo < ApplicationRecord
  has_one_attached :image, service: :amazon_logos_images
end
Unfortunately, when I create a new Logo record, the image is uploaded to the amazon bucket instead of amazon_logos_images. Any idea why the service option is ignored by the has_one_attached method?
If you're coming from Rails 6.0:
If your project was already using Active Storage before the upgrade to Rails 6.1, run rake app:update (and the resulting migration) to make sure the service_name column is added to the internal ActiveStorage::Blob model.
https://blog.saeloun.com/2020/02/03/rails-allows-configure-service-for-attachments-to-activestorage.html
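For reference, the 6.1 upgrade adds roughly the following migration (a sketch from memory, not necessarily the exact generated file):
class AddServiceNameToActiveStorageBlobs < ActiveRecord::Migration[6.0]
  def up
    unless column_exists?(:active_storage_blobs, :service_name)
      add_column :active_storage_blobs, :service_name, :string

      # Backfill existing blobs with the service that was configured at upgrade time
      if (configured_service = ActiveStorage::Blob.service.name)
        ActiveStorage::Blob.unscoped.update_all(service_name: configured_service)
      end

      change_column :active_storage_blobs, :service_name, :string, null: false
    end
  end

  def down
    remove_column :active_storage_blobs, :service_name
  end
end
Then run rails db:migrate so the column actually exists before relying on per-attachment services.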
Otherwise, check that AWS_LOGOS_BUCKET is being populated properly (i.e. doesn't point to the wrong bucket).
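A quick sanity check from the app itself (a sketch; assumes the app boots locally with the same environment variables):
bin/rails runner 'puts ENV["AWS_LOGOS_BUCKET"].inspect'
bin/rails runner 'puts Logo.last.image.blob.service_name'  # which service the last attachment actually used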
Are you using direct uploads? If so, Rails 6.1 doesn't support direct uploads to multiple services; you need to upgrade to 7.0.
Any idea why I'm getting these errors on my localhost when I send an inbound email?
It doesn't seem to be part of the code that I can fix. Maybe it's a settings issue?
http://localhost:3000/rails/conductor/action_mailbox/inbound_emails
I've just experienced this in an application that didn't use active_storage before adding in action_mailbox (which depends on active_storage).
The solution that worked for us was to specify an active_storage config like:
config/storage.yml
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
And reference which Active Storage config to load in the environment:
config/environments/development.rb
Rails.application.configure do
  # ... other stuff
  config.active_storage.service = :local
end
I have some APIs I'm working on that use Active Storage for image uploading. Everything works fine locally, but when I deploy my app to Heroku the other APIs are fine, yet when I try to upload an image it throws a 500 error.
My storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
S3:
  service: BetterS3
  root: 'directory'
amazon:
  service: S3
  access_key_id: 'abc'
  secret_access_key: 'abc'
  region: 'xyz'
  bucket: 'xyz'
I also added these lines to my production.rb and staging.rb:
Rails.application.configure do
  config.active_storage.service = :amazon
end
I also installed Active Storage and added the following gem:
gem 'aws-sdk-s3', '~> 1'
Note: my app is dockerized, but I haven't pushed the Docker image to Heroku. I'm not sure whether the issue is just that the gem wasn't installed properly on Heroku; maybe I'm wrong.
I've gone through every stackoverflow question regarding this error:
https://duckduckgo.com/?q=rails+Missing+host+to+link+to
All the posts mention the same solution, which is to add the config in the environment file you're working on. In my case, I added to my development.rb:
config.active_storage.service = :local
config.action_mailer.default_url_options = { host: "localhost", port: "3000" }
MyApp::Application.default_url_options = MyApp::Application.config.action_mailer.default_url_options
Rails.application.routes.default_url_options = MyApp::Application.config.action_mailer.default_url_options
But I still get the infamous error message:
Missing host to link to! Please provide the :host parameter, set default_url_options[:host], or set :only_path to true
It appears in the following places, for example if I try to open a file that I uploaded locally:
open(file.service_url)
or if I try to access the files from ActiveAdmin (I called the model "Attachments" and I'm using ActiveStorage)
column(:file) {|a| link_to a.file.filename, a.file.service_url}
I also tried passing "host" in an options hash to the "link_to" and "open" calls above. I also tried "only_path".
Nothing works.
Any help would be appreciated!
P.S.: my active storage config:
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>
amazon:
  service: S3
  access_key_id: S3_ACCESS_KEY_ID
  secret_access_key: S3_SECRET_ACCESS_KEY
  bucket: S3_BUCKET
  region: S3_REGION
UPDATE
Trying to use rails_representation_url, but getting an error: undefined method 'variation' for ActiveStorage::Attached
class Attachment < ApplicationRecord
  include Rails.application.routes.url_helpers
  has_one_attached :file
  ....

  # Downloads the attached file into a Tempfile and yields it to the block
  def with_uploaded_file
    tempfile = Tempfile.open([file.filename.to_s, File.extname(file.filename.to_s)]) do |file_temp|
      file_temp.binmode unless file.content_type =~ /text/
      require 'open-uri'
      # file_temp.write(open(file.service_url).read)
      file_temp.write(open(rails_representation_url(file, only_path: true)).read)
      file_temp
    end
    begin
      yield(tempfile)
    ensure
      tempfile.unlink
    end
  end
end
I upgraded Rails from 5.1 to 5.2 and had the same problem.
Solution: https://github.com/rails/rails/issues/32866
So when you have your Active Storage variant, instead of
variant.service_url
do
rails_representation_url(variant, only_path: true)
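For a plain attachment rather than a variant (as with the Attachment model above), the analogous helpers are rails_blob_path and rails_blob_url; a minimal sketch, assuming the routes URL helpers are included as in the question:
# Inside the Attachment model, which already includes Rails.application.routes.url_helpers
def file_path
  rails_blob_path(file, only_path: true)              # relative path, no host required
end

def file_url
  rails_blob_url(file, host: "http://localhost:3000") # absolute URL, host supplied explicitly
end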
I'm setting up a new Rails 5.2 app utilising Active Storage and using AWS to host images in production.
However, I'm having an issue with the app reading the credentials:
2018-07-06T08:11:52.625415+00:00 app[web.1]: ! Unable to load application: Aws::Sigv4::Errors::MissingCredentialsError: Cannot load `Rails.config.active_storage.service`:
2018-07-06T08:11:52.625432+00:00 app[web.1]: missing credentials, provide credentials with one of the following options:
2018-07-06T08:11:52.625435+00:00 app[web.1]: - :access_key_id and :secret_access_key
2018-07-06T08:11:52.625437+00:00 app[web.1]: - :credentials
2018-07-06T08:11:52.625479+00:00 app[web.1]: - :credentials_provider
This is an existing S3 bucket, for which I created a new user just for this app. I'm happy with the CORS configuration etc.
The user is set up under the S3FullAccess group.
I've edited the credentials in my app via $EDITOR="atom --wait" rails credentials:edit
The contents of the file:
aws:
  access_key_id: [my access key]
  secret_access_key: [my secret key]

# Used as the base secret for all MessageVerifiers in Rails, including the one protecting cookies.
secret_key_base: [my secret key base]
I appreciate this is in YAML format; I have played with using one space and one tab on the keys, but it doesn't seem to make a difference.
When I save and close the file, the terminal writes New credentials encrypted and saved.
I also have gem 'aws-sdk-s3', '~>1', require: false installed.
And config/storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>
local:
  service: Disk
  root: <%= Rails.root.join("storage") %>
# Use rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: eu-west-2
  bucket: [mybucket]
Any suggestions on what I might be doing wrong?
I think you're missing the master.key file on your server. Check your local repo for config/master.key (this file is added to your .gitignore by default).
Add this file to your server or set ENV["RAILS_MASTER_KEY"].
This worked for me on Heroku: in "Settings > Config vars" add a RAILS_MASTER_KEY key, with the contents of your config/master.key file (from your Rails app) as the value.
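Equivalently, from the command line (assuming the Heroku CLI is installed and you're in the app's directory):
heroku config:set RAILS_MASTER_KEY="$(cat config/master.key)"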
Go into config/environments/development.rb and make sure you have this:
config.active_storage.service = :local
In config/environments/production.rb you should have
config.active_storage.service = :amazon
amazon is for Amazon S3. It can be changed to whichever storage service you want to use. See the Rails docs for more info on storage services and Active Storage.
In Rails 5.2, do the following:
Step 1. In config/storage.yml add
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: ap-south-1
  bucket: my-bucket
Step 2:
Copy config/credentials.yml.example to config/credentials.yml
and add the following in config/credentials.yml
development:
  AWS_ACCESS_KEY_ID: YOUR-KEY
  AWS_SECRET_ACCESS_KEY: YOUR-SECRET
credentials.yml is already added to .gitignore by default.
Step 3:
In application.rb
Uncomment the following:
# Load ENV variables from credentials.yml file
config.before_configuration do
  env_file = File.join(Rails.root, 'config', 'credentials.yml')
  YAML.load(File.open(env_file))[Rails.env].each do |key, value|
    ENV[key.to_s] = value
  end if File.exists?(env_file)
end
Restart the server and try to upload again.
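To verify the upload path end to end, a quick console check can help (a sketch; the model, attachment name and file are placeholders):
record = User.first
record.avatar.attach(io: File.open("tmp/test.png"), filename: "test.png", content_type: "image/png")
record.avatar.attached?   # => true once the blob has been stored on S3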
Another way of solving this issue (worked for me)
Run rake secret in the console,
copy the generated key,
go to config/ and open application.rb,
and inside the application class add: config.secret_key_base = "paste the output of rake secret"
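For clarity, that line goes inside the application class in config/application.rb (MyApp is a placeholder for your app's module name):
module MyApp
  class Application < Rails::Application
    config.secret_key_base = "paste-the-output-of-rake-secret-here"
  end
end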
I had the same error. In my case the problem was neither with the configs nor with master.key. Starting the Redis server fixed the error. On macOS:
$> redis-server