Where should server environment variables be stored in Ruby on Rails?

Where should I store keys specific to my development, test, and production servers in my Ruby project? For example, where should I store my development-specific Amazon S3 secret and key? In my config/development.rb file? One issue I see with that: if the file were part of a public GitHub project, it would be visible to everyone.
Thanks!

You store environment-specific settings in config/environments/development.rb, config/environments/test.rb and config/environments/production.rb respectively.
However, if your files will be stored in a public Git repo, you do not want to hardcode any sensitive information into them. The best options are YAML files listed in your .gitignore, or environment variables set in your shell. I prefer the latter, like this:
PAPERCLIP_OPTIONS = {
  storage: :s3,
  bucket: ENV['S3_BUCKET'],
  s3_credentials: {
    access_key_id: ENV['S3_KEY'],
    secret_access_key: ENV['S3_SECRET']
  }
}
You then just set the environment variables on the system that is running the app.
If you use the yaml config file method, you must add the sensitive config files to your .gitignore file. Otherwise they will still be uploaded to your public repo.
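A minimal sketch of the ENV approach in plain Ruby (the variable names match the snippet above; the value here is a placeholder):

```ruby
# In the shell that runs the app you would export the variables first, e.g.
#   export S3_BUCKET=my-bucket S3_KEY=... S3_SECRET=...
# Setting one in-process here only to keep the example self-contained:
ENV['S3_BUCKET'] = 'my-bucket'

# ENV.fetch raises KeyError for an unset variable, so a forgotten or
# typo'd variable fails fast at boot instead of passing nil to AWS later.
bucket = ENV.fetch('S3_BUCKET')
```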

If you look at the config directory, you will see database.yml, a YAML file containing database credential information. You could do the same for your cloud environments:
development:
  server: xiy-234
  username: hello
  password: 1325abc
production:
  ...

You can put that information inside a .yml file in the config directory.
For instance:
production:
  access_key_id: xxx
  secret_access_key: xx
  bucket: xxx
development:
  access_key_id: xxx
  secret_access_key: xxx
  bucket: xxx
staging:
  access_key_id: xxx
  secret_access_key: xxx
  bucket: xxx
Once this is done, store those keys in a Hash like so:
APIS_CONFIG = {'amazons3' => YAML.load_file("#{RAILS_ROOT}/config/amazons3.yml")[Rails.env]}
(You can put the previous line of code inside a .rb file in the config/initializers directory.)
Note that you might find this Railscast interesting.
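The per-environment lookup above can be sketched in plain Ruby outside of Rails (RAILS_ROOT is the legacy constant; Rails.root is the modern equivalent, and the file contents here are placeholders):

```ruby
require 'yaml'

# Stand-in for config/amazons3.yml; in Rails you would use
# YAML.load_file("#{Rails.root}/config/amazons3.yml") instead.
yaml = <<~YML
  development:
    access_key_id: dev-key
    secret_access_key: dev-secret
    bucket: dev-bucket
  production:
    access_key_id: prod-key
YML

env = 'development' # Rails.env in a real app
APIS_CONFIG = { 'amazons3' => YAML.safe_load(yaml)[env] }
```

Each environment then sees only its own slice of the file, so the same initializer works in development, staging, and production.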

Related

Rails Active Storage push to DigitalOcean Spaces

Hi, I'm trying to get Active Storage to push to a DigitalOcean Space. However, I'm finding that the push URL is being changed to amazonaws.com even though I've pointed the endpoint at DigitalOcean.
Here is what I have in storage.yml:
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: sfo2
  bucket: redacted_bucket_name
  endpoint: https://sfo2.digitaloceanspaces.com
When I try to upload a file, I get the following error:
Aws::Errors::NoSuchEndpointError (Encountered a `SocketError` while attempting to connect to:
https://redacted_bucket_name.s3.sfo2.amazonaws.com/a8278561714955c23ee99
In my Gemfile I have: gem 'aws-sdk-s3'
I've followed the directions found here, and I'm still getting the error. Is it possible that there's a new way to do this?
I just set something like this up myself a few days ago. If you check the URL https://redacted_bucket_name.s3.sfo2.amazonaws.com/a8278561714955c23ee99, it's different from the endpoint you actually set up, https://redacted_bucket_name.sfo2.amazonaws.com/a8278561714955c23ee99
The error is being caused by an invalid endpoint you're hitting; the s3 right before the .sfo2 is throwing the endpoint off. Did you happen to add s3 to your Spaces config? Check your Spaces dashboard and try to get the endpoint set up properly.
I had this same challenge when working on a Rails 6 application on Ubuntu 20.04.
Here's how I fixed mine:
Firstly, create Spaces access keys in your DigitalOcean console. This link should help - DigitalOcean Spaces API
Secondly, add a new configuration for DigitalOcean Spaces in your config/storage.yml file, just after the local storage definition:
# Values come from environment variables (set via Figaro or dotenv, as described below)
digital_ocean:
  service: S3
  access_key_id: <%= ENV['SPACES_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['SPACES_SECRET_ACCESS_KEY'] %>
  region: <%= ENV['SPACES_REGION'] %>
  bucket: <%= ENV['SPACES_BUCKET_NAME'] %>
  endpoint: <%= ENV['SPACES_ENDPOINT'] %>
Note: You can give your entry any name, say digital_ocean_spaces or something else. I named mine digital_ocean.
Thirdly, modify the config.active_storage.service configuration in the config/environments/production.rb file from:
config.active_storage.service = :local
to
config.active_storage.service = :digital_ocean
Finally, specify these environment variables in your config/application.yml file (if you're using the Figaro gem) or your .env file (if you're using the dotenv gem). In my case I was using the dotenv gem, so my .env file looked like this:
SPACES_ACCESS_KEY_ID=E4TFWVPDBLRTLUNZEIFMR
SPACES_SECRET_ACCESS_KEY=BBefjTJTFHYVNThun7GUPCeT2rNDJ4UxGLiSTM70Ac3NR
SPACES_REGION=nyc3
SPACES_BUCKET_NAME=my-spaces
SPACES_ENDPOINT=https://nyc3.digitaloceanspaces.com
That's all.
I hope this helps
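For context, the dotenv gem essentially reads KEY=value lines from .env into ENV at boot. A rough sketch of that behavior in plain Ruby (the real gem also handles quoting, comments, and export prefixes; the values here are placeholders):

```ruby
# Placeholder contents that would live in a .env file:
env_file = <<~ENVFILE
  SPACES_REGION=nyc3
  SPACES_BUCKET_NAME=my-spaces
ENVFILE

env_file.each_line do |line|
  key, value = line.strip.split('=', 2)
  ENV[key] ||= value if key && value # don't clobber variables the real shell set
end
```

Once loaded, the `<%= ENV['SPACES_REGION'] %>` lookups in storage.yml resolve normally.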

Rails Generate controller aws error missing bucket name

I am attempting to create a Users controller in my Ruby on Rails project, which I have also configured with Heroku and an AWS S3 bucket. I set my .env and my heroku local with the S3_BUCKET, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY. I also set my config/initializers/aws.rb file to look like this:
Aws.config.update({
  region: 'us-east-1',
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
})
S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
I have bundle installed the aws gem like this:
gem 'aws-sdk', '~> 3'
However when I run the command
rails g controller Users new
I get the following error in my terminal:
aws-sdk-s3/bucket.rb:658:in `extract_name': missing required option :name (ArgumentError)
I looked at that file and it is trying to find the S3 bucket name, but I have set that already in .env and heroku local. Is there some other place where this needs to be set? None of the guides I have read mention this error.
Hi, please check whether you have specified the right credentials and bucket name. Also, make sure you have provided the right region. Try the code below:
s3 = Aws::S3::Resource.new(
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY']),
  region: 'us-west-1'
)
obj = s3.bucket(ENV['S3_BUCKET']).object('key')
If you want to upload a file or something:
obj.upload_file(file, acl: 'public-read')
This would help you; I have used it like this in my project.
1. Create the file aws.rb in your config/initializers folder.
2. Then copy the code below:
S3Client = Aws::S3::Client.new(
  access_key_id: 'ACCESS_KEY_ID',
  secret_access_key: 'SECRET_ACCESS_KEY',
  region: 'REGION'
)
That's all, this works.
Happy coding :)
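A likely root cause of the original error: initializers run for every rails command, including generators, so if S3_BUCKET isn't exported in the shell running `rails g`, ENV['S3_BUCKET'] is nil and the SDK raises "missing required option :name". A plain-Ruby sketch of the failure mode and a guard (no AWS calls involved; 'missing-bucket' is just an illustrative fallback):

```ruby
ENV.delete('S3_BUCKET') # simulate a shell where the variable was never exported

bucket_name = ENV['S3_BUCKET']                         # nil -- the value the SDK rejects
safe_name   = ENV.fetch('S3_BUCKET', 'missing-bucket') # explicit fallback makes the gap visible
```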

Travis CI deploy to S3 bucket not working with secure keys

I have a static website and I'm trying to use Travis CI to migrate content to the S3 bucket where I'm hosting the website each time I commit changes to GitHub. To support this, I have the following .travis.yml file:
language: python
python: '2.7'
install: true
script: true
deploy:
  provider: s3
  access_key_id: XXXXX
  secret_access_key: YYYYY
  bucket: thug-r.life
  skip_cleanup: true
  region: us-east-1
  local_dir: public
which works fine. Except I have my secret in plain text on GitHub in a public repo. So... that's bad. Travis CI has a section on encrypting keys (https://docs.travis-ci.com/user/encryption-keys/), which I followed. Using the CLI tool:
travis encrypt secret_access_key="YYYYY" --add
which inserts at the bottom of my file
env:
  global:
    secure: ZZZZZ
So I tried to modify my original file to look like
deploy:
  secret_access_key:
    secure: ZZZZZ
But then Travis CI complains: 'The request signature we calculated does not match the signature you provided.'
So I tried encrypting without quotes
travis encrypt secret_access_key=YYYYY --add
and using the output in the same way.
How am I supposed to include the encrypted key?
All of the examples in the Travis CI help on encrypting keys (https://docs.travis-ci.com/user/encryption-keys/) were of the form:
travis encrypt SOMEVAR="secretvalue"
which, it states, encrypts the key as well as the value. So, taking the output of the above encryption and using it as above:
deploy:
  secret_access_key:
    secure: ZZZZZ
decrypts to be
deploy:
  secret_access_key: secret_access_key: YYYYY
which is what was causing the errors. Instead, what I ended up doing that worked was:
travis encrypt "YYYYY" --add
and used it in the .travis.yml file as
deploy:
  secret_access_key:
    secure: ZZZZZ
which ended up being accepted.
tl;dr: Don't include the key name when encrypting the secret_access_key.

Cannot upload files from S3 Development Bucket in Rails App

When running my Rails app in development mode from Nitrous.io, I cannot access the development bucket I set up on AWS S3. The upload button opens my personal computer's file browser, from where I don't want to load files. (Even when I try to load files from my computer, I get a long error message stating "The request signature we calculated does not match the signature you provided. Check your key and signing method.")
I think I don't have AWS S3 configured properly.
Currently, I have one IAM user, which I've assigned AdministratorAccess. Also, I am using the proper AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in my application.yml file. In fog.rb I have it read from the environment.
I should add too that I'm currently enrolled in a web development apprenticeship program.
Sorry for not showing my files
Here's my application.yml with the sensitive data taken out:
SENDGRID_PASSWORD: alphanumeric
SENDGRID_USERNAME: -------@heroku.com
AWS_ACCESS_KEY_ID: alphanumeric
AWS_SECRET_ACCESS_KEY: alphanumeric
development:
  AWS_BUCKET: vmanamino-bloccit-development
production:
  AWS_BUCKET: vmanamino-bloccit-production
development:
  secret_key_base: alphanumeric
test:
  secret_key_base: alphanumeric
Here's my fog.rb file which reads the values from the environment
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider: 'AWS',
    aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key: ENV['AWS_ACCESS_KEY_ID'],
  }
  config.fog_directory = ENV['AWS_BUCKET']
  config.fog_public = true
end
You're using the AWS_ACCESS_KEY_ID environment variable for both the access key and the secret access key whereas the latter should of course be using ENV['AWS_SECRET_ACCESS_KEY']
I didn't realize I needed to enclose the KEY ID and the SECRET KEY in quotes. Once I did that, I got it to work; I can upload images from my computer to S3. Also, I didn't understand the assignment perfectly: I thought my app would upload images from S3. Now the error raised earlier makes sense: I upload from my computer, sending the image to S3.
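The quoting detail matters because YAML reinterprets some unquoted scalars. A sketch with Ruby's YAML library (Psych):

```ruby
require 'yaml'

# Unquoted, YAML 1.1 scalars such as yes/no/on/off parse as booleans,
# and a value containing ": " would break the document entirely.
unquoted = YAML.safe_load('key: yes')['key']
# Quoted, the same text stays a plain string:
quoted = YAML.safe_load('key: "yes"')['key']
```

So keys and secrets that happen to contain YAML-significant characters are safest quoted.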

Rails: Paypal configuration file and figaro environment variables

I'm using the gem 'paypal-sdk-adaptivepayments' to integrate PayPal in my Rails app. The configuration file is paypal.yml:
development:
  # Credentials for Classic APIs
  username: ENV["PAYPAL_CLASSIC_USERNAME_DEV"]
  password: ENV["PAYPAL_CLASSIC_PASSWORD_DEV"]
  signature: ENV["PAYPAL_CLASSIC_SIGNATURE_DEV"]
  app_id: ENV["PAYPAL_CLASSIC_APP_ID_DEV"]
  http_timeout: 30
  # Mode can be 'live' or 'sandbox'
  mode: sandbox
test:
  <<: *default
production:
  <<: *default
  #mode: live
Because this information is secret, I want to use another gem called Figaro that externalizes these variables. I've done this for other configuration files in my app, but it doesn't work with paypal.yml. I know it doesn't work because when I put the real information in the paypal.yml file, it works:
development:
  # Credentials for Classic APIs
  username: *******@yahoo.com
  password: *******
  signature: ******
  app_id: ******
  http_timeout: 30
  # Mode can be 'live' or 'sandbox'
  mode: sandbox
test:
  <<: *default
production:
  <<: *default
  #mode: live
Has anyone used Figaro with this file? Is there any other option to "secretize" this information in Rails?
Thanks in advance!
Okay, what you should do is replace your YAML config file with an initializer:
# config/initializers/paypal.rb
PayPal::SDK.configure(
  username: ENV["PAYPAL_CLASSIC_USERNAME_DEV"],
  password: ENV["PAYPAL_CLASSIC_PASSWORD_DEV"],
  signature: ENV["PAYPAL_CLASSIC_SIGNATURE_DEV"],
  app_id: ENV["PAYPAL_CLASSIC_APP_ID_DEV"],
  http_timeout: 30
)
Then define these variables in a yaml file such as application.yml:
# config/application.yml
PAYPAL_CLASSIC_USERNAME_DEV: yourpaypalusername
PAYPAL_CLASSIC_PASSWORD_DEV: yourpaypalpassword
PAYPAL_CLASSIC_SIGNATURE_DEV: yourpaypaysignature
PAYPAL_CLASSIC_APP_ID_DEV: yourappid
Lastly assuming you are using git for your version control, go into your gitignore and tell it to ignore the yaml file with your secret info:
# .gitignore
/config/application.yml
This will properly keep your environment variables out of your code base when you push it to GitHub or another Git repo.
Edit: To use different environment keys you merely specify the keys for different environments in your yaml file:
# application.yml
production:
  PAYPAL_CLASSIC_USERNAME_DEV: yourpaypalusername
  PAYPAL_CLASSIC_PASSWORD_DEV: yourpaypalpassword
  PAYPAL_CLASSIC_SIGNATURE_DEV: yourpaypaysignature
  PAYPAL_CLASSIC_APP_ID_DEV: yourappid
That being said, I highly suspect what you are doing is a violation of 12-factor best practices. This YAML file is not a part of your application - when you push your code up to a server like Heroku, this file is not supposed to go with it. The YAML file with your keys should only ever exist on your local machine. It should not exist on GitHub, nor on your production server. To read more about this, check out this link: http://12factor.net/config
So how do you tell a server like Heroku what these config variables are? This will vary based on what service provider you use, but is typically done via the command line tools or through an admin management page. e.g., for Heroku you would run:
heroku config:set PAYPAL_CLASSIC_USERNAME_DEV=putyourusernamehere
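Why the original paypal.yml silently failed: YAML does not evaluate Ruby, so ENV["..."] is read as a literal string unless the file is passed through ERB first. A sketch (PAYPAL_USER is a shortened hypothetical variable name for the demo):

```ruby
require 'yaml'
require 'erb'

# Plain YAML: the ENV[...] text is just a string, never a lookup.
plain = YAML.safe_load('username: ENV["PAYPAL_USER"]')['username']

# Passed through ERB first (the trick Rails uses for database.yml),
# the interpolation actually runs before YAML parses the result.
ENV['PAYPAL_USER'] = 'someone@example.com'
via_erb = YAML.safe_load(ERB.new('username: <%= ENV["PAYPAL_USER"] %>').result)['username']
```

This is also why moving the values into a Ruby initializer, as above, works: Ruby code evaluates ENV lookups natively.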
