Initializing Fog in the development environment

I deploy my Rails app to Heroku, and I've set my AWS access keys on the server as environment variables. However, to test my application in the development environment, I need to initialize them somewhere on my local machine. So I decided to do the following.
config/initializers/init_aws_locally.rb
ENV['AWS_ACCESS_KEY_ID'] = 'my key'
ENV['AWS_SECRET_ACCESS_KEY'] = 'my secret key'
This file is added to .gitignore.
However, when I upload a file in the development environment, I get this error message:
Missing required arguments: aws_access_key_id, aws_secret_access_key
I think I somehow overlooked a simple step to include my AWS keys in my development environment. But I'm not sure of the reason for the error when I've already initialized the keys.
For your reference, I'm using carrierwave, S3, and Fog.
config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',                        # required
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],     # required
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'], # required
    :region                => 'us-east-1'                   # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'd'    # required
  config.fog_public    = true   # optional, defaults to true
end
Thank you. I appreciate your help!

Your initializers will be run in alphabetical order. See the docs:
If you have any ordering dependency in your initializers, you can
control the load order by naming. For example, 01_critical.rb will be
loaded before 02_normal.rb.
The problem you're experiencing is that your fog.rb initializer runs before your init_aws_locally.rb one (because f comes before i), so ENV['AWS_ACCESS_KEY_ID'] has not been defined yet when you set fog_credentials.
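The simplest fix is to rename the initializers so the credentials file sorts first. A minimal sketch (the numeric prefixes are illustrative):

# config/initializers/01_init_aws_locally.rb -- now loads before the Fog config
ENV['AWS_ACCESS_KEY_ID']     ||= 'my key'
ENV['AWS_SECRET_ACCESS_KEY'] ||= 'my secret key'

# config/initializers/02_fog.rb -- ENV is populated by the time this runs

Using ||= instead of = also means these local values never clobber real environment variables if the file ever runs somewhere they are already set.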

I'd avoid putting any credentials in code. It's a terrible idea, and Heroku has the right approach. So what I do is use RVM and put a .rvmrc file in my project folder. I put .rvmrc in .gitignore as well.
Then you edit .rvmrc to have
export AWS_ACCESS_KEY_ID="BLAH"
and so on. Any time I "cd" into this directory, RVM sets up my environment for that project. (If you're not using RVM, there are other alternatives out there.) Then just run
rails s
and it will have all the environment variables you set in your .rvmrc script. No need for an initializer or development-only YAML config files that you keep out of source control. In my experience this is the simplest solution.

I went to my shell and typed:
$ echo $AWS_SECRET_ACCESS_KEY
and it came back blank. It turns out I had recently moved to a new virtual machine and forgotten to add this to the .bashrc file. It's worth checking the shell environment just in case. Once I added those two export lines to my .bashrc, everything was happy again.
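If the shell looks right but the app still complains, a quick check from a Rails console shows what the Ruby process actually received (a sketch; nil means the variable was never exported to that process):

# $ rails c
ENV['AWS_ACCESS_KEY_ID']      # => nil means the app never saw the export
ENV['AWS_SECRET_ACCESS_KEY']  # => should return your secret key string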


Fog-Google, Unable to Push to Heroku

So I am trying to set up file uploading in my production environment. I am currently using CarrierWave along with Fog-Google. I have no issues storing files locally, as I do not use Fog in development. However, I am currently trying to test the file-uploading functionality in production, but I cannot even push my app up to Heroku.
Here's a snippet of the error I'm receiving when attempting to push to Heroku.
[fog][WARNING] Unrecognized arguments: google_storage_secret_access_key_id, google_storage_secret_access_key
rake aborted!
ArgumentError: Invalid keyfile or passphrase
Now, I am relatively new to setting up secret ENV IDs and all of that, so instead I'll just say what I know and what I have done, to be sure I did everything correctly.
As I am currently using the Cloud9 IDE, in my .bashrc file I have
export GOOGLE_STORAGE_ACCESS_KEY_ID=XXXXX
export GOOGLE_STORAGE_SECRET_ACCESS_KEY=XXXXX
export GOOGLE_STORAGE_BUCKET_NAME=XXXXX
In my /config/initializers/carrierwave.rb
require 'carrierwave'

CarrierWave.configure do |config|
  config.fog_provider = 'fog/google' # required
  config.fog_credentials = {
    provider: 'Google',
    google_storage_access_key_id: ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
    google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['GOOGLE_STORAGE_BUCKET_NAME']
end
and in my /config/initializers/fog.rb
GoogleStorage = Fog::Storage.new(
  provider: 'Google',
  google_project: 'XXXX',
  google_client_email: 'XXXXXX',
  google_key_location: Rails.root.join('private', 'google-cloud-service-key.p12'),
  google_storage_secret_access_key_id: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID'],
  google_storage_secret_access_key: ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
As mentioned, I am actually quite new to all of this, so I've tried my best to follow the documentation on both Fog's and CarrierWave's GitHub pages.
As far as I know, I should use .bashrc to store my secret keys and then call them using ENV['SECRET_KEY_NAME']. I've set up both the carrierwave.rb and fog.rb files in the initializers folder, so I'm not quite sure what I'm missing.
Additionally, I have also tried doing heroku config:set GOOGLE_STORAGE_SECRET_ACCESS_KEY_ID=XXXXXX, but that didn't seem to work either.
I'm not quite sure what to do now, what may be causing the error when attempting to push to Heroku, much less whether any of this even works in production.
EDIT:
I think the error largely comes from the fog.rb file, so I amended it to the following:
GoogleStorage = Fog::Storage::Google.new(
  google_project: 'XXX',
  google_client_email: 'XXX',
  google_json_key_location: '~/.fog/XXXX.json'
)
Now when I try pushing to Heroku, the error I get instead is
Errno::ENOENT: No such file or directory # rb_sysopen - /app/.fog/XXX.json
Just to share: I created a .fog folder under the ~ directory, and inside the .fog folder I added the private JSON key.
All help and advice would be greatly appreciated. Thank you!
So I've solved the problem and managed to successfully push my code to Heroku.
[Kindly note that this does not mean it functions perfectly in production; it just means I am now able to push to Heroku without any errors.]
There were 2 main errors:
1) Not including my private JSON key file in my app.
2) An error in the config/initializers/fog.rb file.
To solve the first issue, I created a .fog folder in my app and added the private JSON key file from Google's Cloud Platform.
Next, I amended the code in config/initializers/fog.rb to:
GoogleStorage = Fog::Storage::Google.new(
  :google_project                   => 'XXXX',
  :google_client_email              => 'XXXXX',
  :google_json_key_location         => Rails.root.join('.fog/XXXX'),
  :google_storage_access_key_id     => ENV['GOOGLE_STORAGE_ACCESS_KEY_ID'],
  :google_storage_secret_access_key => ENV['GOOGLE_STORAGE_SECRET_ACCESS_KEY']
)
I was then able to successfully push my code to Heroku.
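In hindsight the earlier Errno::ENOENT makes sense: on a Heroku dyno the home directory is /app, so '~/.fog/XXXX.json' points at a path that was never deployed, while Rails.root.join resolves inside the app itself. A defensive variant of that initializer (a sketch; 'google-key.json' is a hypothetical placeholder name) fails fast with a clear message when the key file is missing:

# config/initializers/fog.rb -- filename below is illustrative
key_path = Rails.root.join('.fog', 'google-key.json')
raise "Google JSON key not found at #{key_path}" unless File.exist?(key_path)

GoogleStorage = Fog::Storage::Google.new(
  :google_project           => 'XXXX',
  :google_client_email      => 'XXXXX',
  :google_json_key_location => key_path
)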

Refresh .bashrc with new env variables

I have a production server running our Rails app, with ENV variables in there, formatted correctly. They show up in rails c, but we have an issue getting them recognized in the running instance of the app.
Running Puma and nginx on an Ubuntu box.
What needs to be restarted every time we change .bashrc? This is what we do:
1. Edit .bashrc
2. . .bashrc
3. Restart puma
4. Restart nginx
Still not recognized in the app, even though the variables show up in rails c. What are we missing?
edit:
Added the env variables to /etc/environment, based on suggestions from other posts saying that .bashrc only applies to specific shell sessions and that this could have an effect. Supposedly /etc/environment is available to all users, so here is mine. Still having the same issues:
They show up fine in rails c
They show up fine when I echo them in the shell
They do not show up in the application
export G_DOMAIN=sandboxbaa3b9cca599ff0.mailgun.org
export G_EMAIL=mailgun@sandboxbaa3ba3806d5b499ff0.mailgun.org
export GEL=support@xxxxxx.com
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
edit:
In the app I output G_DOMAIN and G_EMAIL in plain HTML (this works in development with dotenv, but does not work once pushed to the Ubuntu production server):
ENV TEST<BR>
G_DOMAIN: <%= ENV['G_DOMAIN'] %><br>
G_EMAIL:<%= ENV['G_EMAIL'] %>
However, the following env variables are available (set in both .bashrc and /etc/environment, the same as the variables shown above), because our images work fine and upload to S3 with no issue in production.
production.rb
# Configuration for Amazon S3
:provider => 'AWS',
:aws_access_key_id => ENV['AWS_ACCESS_KEY_ID'],
:aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
edit 2: could this have anything to do with this Puma issue?
https://github.com/puma/puma/commit/a0ba9f1c8342c9a66c36f39e99aeaabf830b741c
I was having a problem like this, also. For me, this only happens when I add a new environment variable.
Through this post and after some more googling, I've come to understand that the restart command for Puma (via the capistrano-puma gem) might not see new environment variables, because the process forks itself when restarting rather than killing itself and being started again (this is part of keeping the servers responsive during a deploy).
The linked post suggests using a YAML file that's only stored on your production server (read: NOT in source control) rather than relying on your deploy user's environment variables. This is how you can achieve it:
Insert this code in your Rails app's config/application.rb file:
config.before_configuration do
  env_file = File.join(Rails.root, 'config', 'local_env.yml')
  if File.exist?(env_file)
    YAML.load(File.open(env_file)).each do |key, value|
      ENV[key.to_s] = value
    end
  end
end
Add this code to your Capistrano deploy script (config/deploy.rb)
desc "Link shared files"
task :symlink_config_files do
on roles(:app) do
symlinks = {
"#{shared_path}/config/local_env.yml" => "#{release_path}/config/local_env.yml"
}
execute symlinks.map{|from, to| "ln -nfs #{from} #{to}"}.join(" && ")
end
end
before 'deploy:assets:precompile', :symlink_config_files
Profit! With the code from step 1, your Rails application will load any keys you define in your server's Capistrano directory's ./shared/config/local_env.yml file into the ENV hash, and this happens before other config files like secrets.yml or database.yml are loaded. The code in step 2 makes sure the file in ./shared/config/ on your server is symlinked into current/config/ (where the code from step 1 expects it) on every deploy.
Example local_env.yml:
SECRET_API_KEY_DONT_TELL: 12345abc6789
OTHER_SECRET_SHH: hello
Example secrets.yml:
production:
  secret_api_key: <%= ENV["SECRET_API_KEY_DONT_TELL"] %>
  other_secret: <%= ENV["OTHER_SECRET_SHH"] %>
This guarantees that your environment variables are found, by not really using environment variables. It seems like a workaround, but basically we're just using the ENV object as a convenient global variable.
(The Capistrano syntax might be a bit old, but what's here works for me in Rails 5... I did have to update a couple of things from the linked post to get it working, and I'm new to Capistrano. Edits welcome.)

Amazon S3 settings for Development and Production

I am using CarrierWave and Fog to upload images and process them to Amazon S3.
Below are my Fog settings.
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'AKIAJ23D1I25B2P2HX6A',
    :aws_secret_access_key => 'WV64nQAd111+ZelqKgffrzvViG0lEeTTnEOonXHkg'#,
    #:region => 'us-west-2'
  }
  config.fog_directory = "<TESTING>"
end
Is there a way to have two settings that get used in the production and development environments, so that we don't mess with the files in production and can freely delete the ones in development?
Yes, you can use different settings by specifying them in the particular environment. I am using this in my project, but what I have done is use different directories for both: in production.rb I use project_directory, and in development.rb I use project_dev_directory. I have specified the same credentials in both environments. If you need to specify different settings, you can do that too. Hope this helps.
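A minimal sketch of that idea (the directory names come from the answer above; everything else is illustrative). A single initializer can also switch the bucket on Rails.env:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
  # Separate buckets keep development deletes away from production files.
  config.fog_directory = Rails.env.production? ? 'project_directory' : 'project_dev_directory'
end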

Heroku S3 Env variables

I'm trying to use CarrierWave to upload images to S3. It works locally, but when I deploy to Heroku I get the following error:
ArgumentError: Missing required arguments: aws_access_key_id, aws_secret_access_key
The keys are definitely set, because I can see them when I run heroku config.
I've searched every answer I could find on Stack Overflow, and I went through every answer on the first 3 pages of Google. None of them have worked.
I know the uploading works, so it's not the code that's the problem. What settings or variables do I have to set to make this work?
Please help; I can't move forward with my app until this is done (so I can deploy to Heroku again without it being stopped by this error).
Some info:
Environment Variables
You've got a problem with how you're calling your environment variables in Heroku. ENV vars are basically variables stored in the OS / environment, which means you've got to set them for each environment in which you deploy your application.
heroku config should show the ENV vars you've set. If you don't see ENV['AWS_ACCESS_KEY'] etc., it means you've not set them correctly, which as explained is as simple as running heroku config:add YOUR_ENV_VAR=VAR
Figaro
I wanted to recommend using Figaro for this.
This is a gem which basically stores local ENV vars in config/application.yml. This allows you to store ENV variables locally; but more importantly, it allows you to sync them with Heroku using this command:
rake figaro:heroku
This will set your env vars on Heroku, allowing you to use them with CarrierWave as recommended in the other answers.
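For reference, a minimal config/application.yml sketch for Figaro (the key names are illustrative; keep this file out of source control):

AWS_ACCESS_KEY_ID: "my key"
AWS_SECRET_ACCESS_KEY: "my secret key"

Figaro loads these into ENV locally, and rake figaro:heroku pushes the same keys to your Heroku config.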
It sounds like you have set the ENV variables on Heroku, but you need to hook those up to CarrierWave.
In config/initializers/fog.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => Rails.configuration.s3_access_key_id,
    :aws_secret_access_key => Rails.configuration.s3_secret_access_key
  }
end
In your environments/<environment>.rb file
Rails.application.configure do
  config.s3_access_key_id     = ENV['S3_ACCESS_KEY_ID']
  config.s3_secret_access_key = ENV['S3_SECRET_ACCESS_KEY']
end
This sets your Rails config from the ENV variables on Heroku, making them available as Rails.configuration.<key>.

Making ENV variables accessible in development

When storing sensitive credentials, I normally create a YAML file and load it like so in my development.rb:
APP_CONFIG = YAML.load_file("#{Rails.root}/config/config.yml")[Rails.env]
I can then access it like so:
APP_CONFIG["google_secret"]
The problem is Heroku doesn't like this, so I need to set ENV variables locally to make integration easier. So I have created an env.rb file like so:
ENV['google_key'] = 'xxx'
ENV['google_secret'] = 'xxx'
ENV['application_key'] = 'xxx'
and to access it I thought I could use
x = ENV['application_key']
But it's not finding the variable. How do I load the variables in the development environment?
Thanks
You should put the env.rb file in the initializers folder. You can add env.rb to your .gitignore if you don't want to push it to Heroku.
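A minimal sketch of that initializer (the ||= guard is an extra precaution, so the file never overrides variables that are already set in the real environment):

# config/initializers/env.rb -- keep out of source control via .gitignore
if Rails.env.development?
  ENV['google_key']      ||= 'xxx'
  ENV['google_secret']   ||= 'xxx'
  ENV['application_key'] ||= 'xxx'
end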
Have you considered using Figaro to do this? Figaro was inspired by Heroku's secret-key application configuration, so it's really easy to make secret ENV variables in development accessible in a Heroku production environment.
I wrote up an answer on this Stack Overflow thread about hiding secret info in Rails (using Figaro) that can hopefully serve as a reference for you as well.
