I'm setting up a new Rails 5.2 app using Active Storage, with AWS hosting the images in production.
However, I'm having an issue with the app reading the credentials:
2018-07-06T08:11:52.625415+00:00 app[web.1]: ! Unable to load application: Aws::Sigv4::Errors::MissingCredentialsError: Cannot load `Rails.config.active_storage.service`:
2018-07-06T08:11:52.625432+00:00 app[web.1]: missing credentials, provide credentials with one of the following options:
2018-07-06T08:11:52.625435+00:00 app[web.1]: - :access_key_id and :secret_access_key
2018-07-06T08:11:52.625437+00:00 app[web.1]: - :credentials
2018-07-06T08:11:52.625479+00:00 app[web.1]: - :credentials_provider
This is an existing S3 bucket, for which I created a new user just for this app. I'm happy with the CORS configuration etc.
The user is set up under the S3FullAccess group.
I've edited the credentials in my app via $EDITOR="atom --wait" rails credentials:edit
The contents of the file:
aws:
  access_key_id: [my access key]
  secret_access_key: [my secret key]

# Used as the base secret for all MessageVerifiers in Rails, including the one protecting cookies.
secret_key_base: [my secret key base]
I appreciate this is in YAML format; I have played with using one space and one tab to indent the keys, but this doesn't seem to make a difference.
When I save and close the file, the terminal writes New credentials encrypted and saved.
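As a sanity check, the saved values should be readable from a local Rails console with the standard credentials accessor (a quick sketch):
Rails.application.credentials.dig(:aws, :access_key_id)      # => "[my access key]" if the YAML nesting is right
Rails.application.credentials.dig(:aws, :secret_access_key)  # a nil here would point to an indentation problem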
I also have gem 'aws-sdk-s3', '~>1', require: false installed.
And config/storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

# Use rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: eu-west-2
  bucket: [mybucket]
Any suggestions on what I might be doing wrong?
I think you're missing the master.key file on your server. Check your local repo for config/master.key (this file is added to your .gitignore by default).
Add this file to your server or set ENV["RAILS_MASTER_KEY"].
This worked for me on Heroku: in "Settings > Config vars", add a RAILS_MASTER_KEY key with the contents of your config/master.key file (from your Rails app) as the value.
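As a quick way to confirm the key is actually reaching the server, something like this can be run from a Rails console there (a minimal sketch, assuming the default Rails 5.2 encrypted credentials setup):
# Checks whether a master key is available and whether the credentials decrypt.
key_available = ENV["RAILS_MASTER_KEY"].present? ||
                File.exist?(Rails.root.join("config", "master.key"))
if key_available
  # A wrong key will raise an InvalidMessage error instead of returning nil.
  puts Rails.application.credentials.dig(:aws, :access_key_id) ? "aws credentials readable" : "aws keys not found in credentials"
else
  puts "no master key -- set RAILS_MASTER_KEY or add config/master.key"
end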
Go into config/environments/development.rb and make sure you have this:
config.active_storage.service = :local
in config/environments/production.rb you should have
config.active_storage.service = :amazon
amazon is for Amazon S3. It can be changed to whichever storage service you want to use. See the Rails docs for more info on storage services and Active Storage.
In Rails 5.2, do the following:
Step 1. In config/storage.yml add
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: ap-south-1
  bucket: my-bucket
Step 2:
Copy config/credentials.yml.example to config/credentials.yml
and add the following in config/credentials.yml
development:
  AWS_ACCESS_KEY_ID: YOUR-KEY
  AWS_SECRET_ACCESS_KEY: YOUR-SECRET
credentials.yml is already added to .gitignore by default.
Step 3:
In application.rb
Uncomment the following:
# Load ENV variables from credentials.yml file
config.before_configuration do
  env_file = File.join(Rails.root, 'config', 'credentials.yml')
  if File.exist?(env_file)
    YAML.load(File.open(env_file))[Rails.env].each do |key, value|
      ENV[key.to_s] = value
    end
  end
end
Restart the server and try to upload again.
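Once the hook above has run, the ENV keys referenced in storage.yml should be populated; a quick console check (a sketch, using the variable names from steps 1 and 2):
# In a Rails console, after restarting with the hook in place:
puts ENV["AWS_ACCESS_KEY_ID"].nil?      # should print false
puts ENV["AWS_SECRET_ACCESS_KEY"].nil?  # should print false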
Another way of solving this issue (this worked for me):
Run rake secret in the console
Copy the key
Go to config and open application.rb
Inside the class, add: config.secret_key_base = "paste the output of rake secret"
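For context, a minimal sketch of where that line sits in config/application.rb ("MyApp" is a placeholder for your own application module):
# config/application.rb (sketch -- "MyApp" is a placeholder)
module MyApp
  class Application < Rails::Application
    config.load_defaults 5.2
    # Hard-codes the secret; the output of `rake secret` goes here.
    config.secret_key_base = "paste the output of rake secret"
  end
end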
I had the same error. In my case the problem was neither with the configs nor with master.key. Starting the Redis server fixed the error. On macOS:
$> redis-server
Related
I have some APIs I'm working on, and I'm using Active Storage for image uploads. Everything works fine locally, but when I deploy my app to Heroku the other APIs are fine, while trying to upload an image throws a 500 error.
My storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

S3:
  service: BetterS3
  root: 'directory'

amazon:
  service: S3
  access_key_id: 'abc'
  secret_access_key: 'abc'
  region: 'xyz'
  bucket: 'xyz'
I also added these lines to my production.rb and staging.rb:
Rails.application.configure do
  config.active_storage.service = :amazon
end
I also installed Active Storage and added
gem 'aws-sdk-s3', '~> 1'
Note: my app is dockerized, but I have not pushed the Docker image to Heroku. I'm not sure whether the problem is just that the gem wasn't installed properly on Heroku; maybe I'm wrong.
My app deploys to Heroku but crashes every time, and I don't know why. I have set up CarrierWave, fog, and AWS for an app in production on Heroku before just fine. I tried to follow the same steps and am getting an H10 error code. In the Rails console it specifically says:
/app/vendor/bundle/ruby/2.3.0/gems/activestorage-5.2.1/lib/active_storage/engine.rb:76:in
`block (2 levels) in ': Couldn't find Active Storage
configuration in /app/config/storage.yml (RuntimeError)
storage.yml
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

# Use rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
# amazon:
amazon:
  service: S3
  access_key_id: "S3_KEY"
  secret_access_key: "S3_SECRET"
  region: "us-east-1"
  bucket: "books4reviews"
production.rb
config.active_storage.service = :amazon
carrierwave.rb
CarrierWave.configure do |config|
  config.fog_provider = 'fog/aws'
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['S3_KEY'],
    aws_secret_access_key: ENV['S3_SECRET'],
    region: 'us-east-1'
  }
  config.fog_directory = 'books4reviews'
  config.fog_public = false
  config.storage = :fog
end
puma.rb
threads_count = ENV.fetch("RAILS_MAX_THREADS") { 5 }
threads threads_count, threads_count
port ENV.fetch("PORT") { 3000 }
environment ENV.fetch("RAILS_ENV") { "development" }
plugin :tmp_restart
Procfile
web: bundle exec puma -C config/puma.rb
avatar_uploader.rb
class AvatarUploader < CarrierWave::Uploader::Base
  # Include RMagick or MiniMagick support:
  # include CarrierWave::RMagick

  # Choose what kind of storage to use for this uploader:
  include CarrierWave::MiniMagick

  storage :fog

  process resize_to_fit: [500, 500]

  version :small do
    process resize_to_fill: [200, 200]
  end

  version :medium do
    # change the word 'fit' to 'fill'
    process resize_to_fill: [400, 600]
  end

  version :large do
    process resize_to_fill: [1000, 1000]
  end

  version :thumb do
    process resize_to_fill: [50, 50]
  end

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  def extension_white_list
    %w(jpg jpeg gif png)
  end
end
I've set the env variables for the AWS credentials in my Heroku config vars from the terminal. Can you tell me why I'm getting this Active Storage error? Thanks.
I had this same issue when deploying a recently upgraded Rails app. The application had been upgraded from Rails 5 to Rails 6. However, when I tried deploying to Heroku, I got the error below:
2021-02-12T17:32:33.404828+00:00 app[web.1]: ! Unable to load application: RuntimeError: Couldn't find Active Storage configuration in /app/config/storage.yml
2021-02-12T17:32:33.404874+00:00 app[web.1]: bundler: failed to load command: puma (/app/vendor/bundle/ruby/2.7.0/bin/puma)
2021-02-12T17:32:33.404958+00:00 app[web.1]: RuntimeError: Couldn't find Active Storage configuration in /app/config/storage.yml
Here's how I fixed it:
I checked the config directory of my application and realized that it had no config/storage.yml file. All I had to do was create the file and copy in the vanilla template that comes with Rails 6 applications:
test:
  service: Disk
  root: <%= Rails.root.join("tmp/storage") %>

local:
  service: Disk
  root: <%= Rails.root.join("storage") %>

# Use rails credentials:edit to set the AWS secrets (as aws:access_key_id|secret_access_key)
# amazon:
#   service: S3
#   access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
#   secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
#   region: us-east-1
#   bucket: your_own_bucket

# Remember not to checkin your GCS keyfile to a repository
# google:
#   service: GCS
#   project: your_project
#   credentials: <%= Rails.root.join("path/to/gcs.keyfile") %>
#   bucket: your_own_bucket

# Use rails credentials:edit to set the Azure Storage secret (as azure_storage:storage_access_key)
# microsoft:
#   service: AzureStorage
#   storage_account_name: your_account_name
#   storage_access_key: <%= Rails.application.credentials.dig(:azure_storage, :storage_access_key) %>
#   container: your_container_name

# mirror:
#   service: Mirror
#   primary: local
#   mirrors: [ amazon, google, microsoft ]
This time when I deployed, everything worked fine.
Note: you can modify the file content based on your own storage configuration.
That's all. I hope this helps.
This may not fix your issue, but I had ".yaml" instead of ".yml" because I had to create the file "/config/storage.yml" manually and made a typo.
Hope this helps someone, as I couldn't find many results on this error.
FYI, I think the generator didn't create the storage.yml file because I was originally on Rails 5.1 and then upgraded to 5.2.
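A one-line check that the file Rails expects actually exists (assuming the default config/storage.yml path):
# Prints false if the file was accidentally created as storage.yaml, or is missing entirely.
puts File.exist?(Rails.root.join("config", "storage.yml"))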
I am having an issue trying to deploy my app to Heroku with Rails Active Storage. In development I have no issues using
config.active_storage.service = :local
and all works as it should.
However in my production.rb file I have set config.active_storage.service = :amazon
and followed the set up guides.
My storage.yml is as follows:
 test:
   service: Disk
   root: <%= Rails.root.join("tmp/storage") %>
 local:
   service: Disk
   root: <%= Rails.root.join("storage") %>
  amazon:
    service: S3
    access_key_id: <%= ENV['S3_ACCESS_KEY'] %>
    secret_access_key: <%= ENV['S3_SECRET_ACCESS_KEY'] %>
    region: <%= ENV['S3_REGION'] %>
    bucket: <%= ENV['S3_BUCKET_NAME'] %>
When I run git push heroku master, the app will deploy, but the following error will appear in the logs: "Detecting rails configuration failed".
I am unable to open the app, and the Heroku log displays the following error: "Missing configuration for the :amazon Active Storage service. Configurations available for [:test]".
This same error occurs if I change
config.active_storage.service = :amazon
to
config.active_storage.service = :local
however if I change it to
config.active_storage.service = :test
the app will deploy without error and I am able to open the app and upload files as expected.
I have trawled the web but haven't seen anyone else with this error, so any comments or thoughts are appreciated.
Thanks in advance.
The problem is the indentation in storage.yml.
test: and local: are indented one space, and amazon: is indented two spaces.
I had the same problem, and it was hard to notice the extra space. In YAML the whitespace matters, but Ruby devs are not accustomed to significant whitespace.
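One way to catch this kind of indentation mistake is to parse the file the same general way Rails does (ERB first, then YAML) and inspect the top-level keys; a small sketch, assuming the file lives at config/storage.yml:
require "erb"
require "yaml"

# A mis-indented service will either be missing from the top-level keys or raise a syntax error.
raw = File.read("config/storage.yml")
puts YAML.load(ERB.new(raw).result).keys.inspect   # expect ["test", "local", "amazon"]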
I'm currently trying to configure Paperclip with the newest suggested aws-sdk gem.
In my s3.yml file I have something like this:
development:
  bucket: newmeeter-dev
  access_key_id: ENV['S3_KEY']
  secret_access_key: ENV['S3_SECRET']
But it is not recognizing the ENV variables. I'm getting the following error
AWS::S3::Errors::InvalidAccessKeyId in PhotosController#create
The AWS Access Key Id you provided does not exist in our records.
If I put both the access key and secret directly into the file, it works perfectly. At the same time, when I print both ENV variables in the views or in the console, I can see their values fine.
I don't get why it isn't recognizing them.
Solved!
I found the answer to this question here:
Ruby on Rails: Can you put Ruby code in a YAML config file?
Solution: the YAML file is run through ERB before being parsed, so Ruby code inside ERB tags is evaluated.
Printing ENV variables inside <%= and %> works.
access_key_id: <%= ENV['S3_KEY'] %>
secret_access_key: <%= ENV['S3_SECRET'] %>
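A tiny illustration of the order of operations, i.e. ERB substitutes the value before YAML ever parses the document (a sketch with a hypothetical ENV value):
require "erb"
require "yaml"

ENV["S3_KEY"] = "AKIA-EXAMPLE"   # hypothetical value, just for this demo

yaml = <<~YML
  development:
    access_key_id: <%= ENV['S3_KEY'] %>
YML
puts YAML.load(ERB.new(yaml).result)   # => {"development"=>{"access_key_id"=>"AKIA-EXAMPLE"}}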
I have a Rails app running on Heroku. I am using Paperclip for some simple image uploads (user avatars and a few other things), with S3 set as my backend. Everything seems to be working fine, except that when trying to push to S3 I get the following error:
The AWS Access Key Id you provided does not exist in our records.
Thinking I had mis-pasted my access key and secret key, I tried again; still no luck. Thinking maybe it was just a bad key, I deactivated it and generated a new one. Still no luck.
For both keys I have used the S3 browser app on OS X and have been able to connect, view my current buckets, and add/delete buckets. Is there something I should be looking out for? I have my application's S3 and Paperclip set up like so:
development:
  bucket: (unique name)
  access_key_id: ENV['S3_KEY']
  secret_access_key: ENV['S3_SECRET']

test:
  bucket: (unique name)
  access_key_id: ENV['S3_KEY']
  secret_access_key: ENV['S3_SECRET']

production:
  bucket: (unique_name)
  access_key_id: ENV['S3_KEY']
  secret_access_key: ENV['S3_SECRET']
has_attached_file :cover,
  :styles => {
    :thumb => "50x50"
  },
  :storage => :s3,
  :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
  :path => ":class/:id/:style/:filename"
EDIT NOTE: ENV['S3_KEY'] and ENV['S3_SECRET'] are environment variables on Heroku. I have also tried using my keys directly, and it still doesn't work.
Note: I just added the (unique name) bits; those aren't actually there. I have also verified the bucket names, but I don't even think it's getting that far. I also have my Heroku environment vars set up correctly, and have them set up in dev.
You aren't setting a bucket. It's in your s3.yml file, but you aren't reading that value from your call to has_attached_file.
Paperclip S3 docs:
http://rubydoc.info/gems/paperclip/Paperclip/Storage/S3#s3_protocol-instance_method
Also, pay attention to those people who are telling you not to use an s3.yml file with Heroku. It's a waste, just an added abstraction that buys you nothing. You already have your ENV set up with the values you need, so use them.
I've done this before where I don't want to push an s3.yml file to Heroku, but I do want to use one for test and development. In an initializer you can do something like this:
# If an s3.yml file exists, use the key, secret key, and bucket values from there.
# Otherwise, pull them from the environment.
S3 = {}  # plain hash to hold the settings (assumed here; define it wherever suits your app)

if File.exist?("#{Rails.root}/config/s3.yml")
  s3_config = YAML.load_file("#{Rails.root}/config/s3.yml")
  S3[:key]    = s3_config[Rails.env]['key']
  S3[:secret] = s3_config[Rails.env]['secret']
  S3[:bucket] = s3_config[Rails.env]['bucket']
else
  S3[:key]    = ENV['S3_KEY']
  S3[:secret] = ENV['S3_SECRET']
  S3[:bucket] = ENV['S3_BUCKET']
end
Then when you're setting up Paperclip in your model, you reference the value like this:
...
:s3_credentials => {
  :access_key_id     => S3[:key],
  :secret_access_key => S3[:secret]
},
:bucket => S3[:bucket]
Obviously, this means that you do not want to have your s3.yml file in your git repository (which really, you shouldn't anyway).
I kept getting the same AWS::S3::InvalidAccessKeyId error, and had a very similar s3.yml file. As x1a4 recommended, I used ERB in my yaml file and it worked. Here's what it looks like now:
# myapp/config/s3.yml
development: &DEFAULTS
  bucket: myapp_dev
  access_key_id: <%= ENV['S3_KEY'] %>
  secret_access_key: <%= ENV['S3_SECRET'] %>

test:
  <<: *DEFAULTS
  bucket: myapp_test

production:
  <<: *DEFAULTS
  bucket: myapp

staging:
  <<: *DEFAULTS
  bucket: myapp_staging
I guess this might be a tad too indirect for some folks, but it seemed like the cleanest implementation to me.
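To double-check that the &DEFAULTS anchor and the <<: merge keys expand the way you expect, the parsed result can be inspected (a small sketch, assuming the file above):
require "erb"
require "yaml"

# On Ruby 3.1+ / Psych 4 you may need YAML.unsafe_load here because of the aliases.
config = YAML.load(ERB.new(File.read("config/s3.yml")).result)
puts config["production"]["bucket"]         # => "myapp" (the explicit key overrides the merged one)
puts config["production"]["access_key_id"]  # inherited from the &DEFAULTS anchor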
Your s3.yml file is actually using the literal strings ENV['S3_KEY'] and ENV['S3_SECRET'] as the auth info for S3. They are not being evaluated as Ruby code.
There are at least a couple of things you can do short of putting the actual keys into the YAML file. You can look into enabling ERB in your YAML configs, or just not use a YAML file for your credentials at all: you're already pulling from the environment in every one of your Rails environments, so in your case the YAML file is just a useless extra layer of indirection.
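For example, skipping the YAML file entirely would look something like this in the model (a sketch; the option layout follows the has_attached_file calls shown earlier in the thread, and the ENV names are the ones already set on Heroku):
has_attached_file :cover,
  :storage => :s3,
  :s3_credentials => {
    :access_key_id     => ENV['S3_KEY'],
    :secret_access_key => ENV['S3_SECRET']
  },
  :bucket => ENV['S3_BUCKET'],
  :path   => ":class/:id/:style/:filename"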