I have my Rails (4.2) app running through Passenger (5.0.28) + Apache (2.4.7) on an Ubuntu (14.04) system, with Ruby (2.3.0) managed by rbenv. I deploy with Capistrano (3.4.0).
All my environment variables are set in a very simple profile.d script.
#!/bin/sh
export VAR1=VAL1
export VAR2=VAL2
This works like a charm. My app's ENV has all the correct variables, secrets.yml is properly populated... everything works EXCEPT when deploying with Capistrano over SSH.
In my deploy.rb I have the following, which I think is relevant:
set :ssh_options, {
  forward_agent: true,
  paranoid: true,
  keys: "~/.ssh/id_rsa.pub"
}
The Capistrano docs being incredibly limited, and SSH/server config not being my strong point, I can't seem to figure out why my ENV variables aren't seen by Capistrano. If I run puts ENV.inspect during the deploy flow, I get things such as "TERM_PROGRAM"=>"Apple_Terminal" and my local machine's user info and whatnot. Why isn't Capistrano using the remote environment? How can I amend my configuration, either server side or in my deploy script, to fix this?
Thanks for the help.
First I think some clarification of terminology and Capistrano's execution model is needed.
Capistrano is a program that runs on your local machine. So ENV within Capistrano sees your local environment, not the server's. There is no way for Capistrano to "see" the remote ENV with plain Ruby code because the Ruby code that makes up Capistrano is not executing there.
What Capistrano does do is use SSH to send commands to the server to be executed there. Commands like mkdir, bundle install, and so on.
To see this illustrated, add a Capistrano task to your deployment flow that does this:
task :puts_remote_env do
  on roles(:all) do
    remote_env = capture("env")
    puts remote_env
  end
end
This will run the env command on the remote server, capture the result, and print it to your console.
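You can run it on its own with cap production puts_remote_env (assuming a production stage), or hook it into the deploy flow with, for example, after "deploy:updated", :puts_remote_env; the task name here simply matches the snippet above.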
I hope this makes it more clear how Capistrano works.
So, as you can see from the puts_remote_env output, the variables defined in your profile.d script are not there. Why?
It is because Capistrano is using a non-login, non-interactive SSH session. In that SSH session, your profile.d script is not being evaluated. This is explained in detail in the Capistrano FAQ: http://capistranorb.com/documentation/faq/why-does-something-work-in-my-ssh-session-but-not-in-capistrano/
You need to find another way to set those variables, other than a profile.d script.
You could specify them in your Capistrano configuration itself (e.g. production.rb) like this:
set :default_env, { var1: "val1", var2: "val2" }
Capistrano will then explicitly set up that environment when it executes SSH commands.
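For illustration, here is a sketch using the variable names from your profile.d script; the exact quoting of the generated command depends on your SSHKit version:

# config/deploy.rb or config/deploy/production.rb
set :default_env, { "VAR1" => "VAL1", "VAR2" => "VAL2" }

# Capistrano (via SSHKit) then wraps each remote command roughly as:
#   ( export VAR1="VAL1" VAR2="VAL2" ; /usr/bin/env bundle exec rake db:migrate )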
Or you could use a tool like dotenv, which allows Rails to read variables from a special file instead of relying on the execution environment.
Or you could experiment with different dot file locations to see if there are some that are still evaluated even in a non-login, non-interactive session. On Ubuntu, I've had success exporting variables at the very top of ~/.bashrc.
Related
I'm deploying my rails app with Capistrano. I want to save some API keys on the server as an environment variable. Those API keys should be accessible to my rails app that is deployed with Capistrano. Those API keys should also be accessible to a separate ruby file that is run as a daemon.
Setting the API keys in environment variables seems like the ideal solution; however, I can't access them in my Rails app with ENV["SOME_KEY"].
According to this post, because Capistrano runs a non-interactive, non-login shell, ~/.bashrc and ~/.bash_profile are not loaded. The flowchart suggests that I should use $BASH_ENV.
Can I just add my API keys to $BASH_ENV and access them in my Rails app and in the daemonized Ruby file with ENV["SOME_KEY"]?
I'm also thinking of just adding the API keys to a file somewhere on the server, symlinking it into the Ruby daemon's directory and the Rails app's directory, and just opening and reading it. Would this be possible?
There are a few ways that work well with Capistrano, in my experience.
rbenv-vars
If you use Ruby via rbenv on your server, then you are in luck. There is an rbenv plugin called rbenv-vars that automatically injects environment variables into any Ruby process, which includes your Rails app. Just add your variables to ~/.rbenv/vars on the server using KEY=value syntax. That's it.
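For example, ~/.rbenv/vars might contain (placeholder names and values):

SOME_API_KEY=abc123
SECRET_KEY_BASE=def456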
dotenv
The dotenv gem is a similar solution, but it works as a gem you add to your Rails app and doesn't require rbenv or any other supporting tools. Add dotenv-rails to your Gemfile and deploy. Dotenv will automatically look for a .env.production file in the root of your Rails app. For Capistrano, create a .env.production file on the server inside Capistrano's shared directory, and then add .env.production to :linked_files so that every deploy links to it. Declare your variables using KEY=value syntax.
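A minimal sketch of the Capistrano side, assuming Capistrano 3.x:

# config/deploy.rb
append :linked_files, ".env.production"
# or, on Capistrano older than 3.5:
# set :linked_files, fetch(:linked_files, []).push(".env.production")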
.bashrc
Declare your variables with export KEY=value syntax at the very top of the ~/.bashrc file on the server. On Ubuntu, this file is evaluated even during a non-interactive SSH session. Just make sure you place the declarations at the top, before this case statement:
# If not running interactively, don't do anything
case $- in
*i*) ;;
*) return;;
esac
CentOS may be a different story, so YMMV.
Some time ago I made a Capistrano plugin, capistrano-env_config, for managing and syncing environment variables across a Capistrano cluster. It works by modifying the /etc/environment file, which makes the variables available throughout the system. It's easy to use and is similar to how you set environment variables with the Heroku toolbelt. Here are some examples:
cap env:list
cap env:get[VARIABLE_NAME, VARIABLE_NAME, ...]
cap env:unset[VARIABLE_NAME, VARIABLE_NAME, ...]
cap env:set[VARIABLE_NAME=VALUE, VARIABLE_NAME=VALUE, ...]
cap env:sync
I set up a CentOS 7 VM using this tutorial (Passenger Standalone) and RVM. I am deploying the Rails app via Capistrano.
https://www.phusionpassenger.com/library/walkthroughs/deploy/ruby/ownserver/standalone/oss/install_language_runtime.html
Everything seems to work, except that no matter where I set environment variables, ENV["myvar"] can't be read in Rails.
I've tried export myvar=test while SSHed in as the "deployer" user as well as root. I've also tried adding it to .bashrc. If I log in as deployer and do the following:
#symlink to current capistrano deploy
cd ~/rails/railsapp/current/
rails r "puts ENV['myvar']"
It gives me the correct ENV output. However, if I try to output ENV['myvar'] from the Rails app actually deployed via Capistrano, I get nothing.
Where am I supposed to set these ENV vars? I know the ENV vars in rails are done correctly, because the app deployed to heroku, as well as on my dev machine, correctly output ENV['myvar'].
Generally, your setup should work. Since version 4, standalone Passenger should inherit all variables defined in the shell startup scripts. There is nice documentation about environment variables in the various scenarios related to using Passenger.
I would check two things:
That your .bashrc is actually loaded from .profile. If it weren't, your variables would be loaded only in an interactive shell but not in Passenger, which would explain the behavior you describe when you log in as the deployer user (a typical ~/.profile snippet is shown below). Let me quote from the doc:
Make sure your ~/.bashrc is actually included by your ~/.profile, which might not be the case if you created the user with useradd instead of adduser for example.
Also, take a look at this section of the docs and check that you obey the conditions upon which Passenger actually passes the environment vars to the application.
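For comparison, the stock Ubuntu ~/.profile sources ~/.bashrc with a block like the following; if your deployer user's ~/.profile is missing it (for instance because the user was created with useradd), adding it should restore the expected behavior:

# ~/.profile
if [ -n "$BASH_VERSION" ]; then
    # include .bashrc if it exists
    if [ -f "$HOME/.bashrc" ]; then
        . "$HOME/.bashrc"
    fi
fi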
I am running a Rails application on Ubuntu 14.04 LTS 64-bit and I am unable to access my app's environment variables.
In OpsWorks App panel, I set my environment variables, say:
MYKEY: 1234
Then I save and deploy my app again to make these visible.
In my Rails app, or in the Rails console, I get nil:
$ bundle exec rails c production
>ENV["MYKEY"]
=> nil
I have tried restarting the server. I'm not sure what I am missing; I have been using environment variables in other services.
How can I trace where these should be set?
OpsWorks stores environment variables in different places depending on what kind of app you're deploying. For Rails / Passenger they should be saved in the Apache config file #{your_app_name}.conf. (Source)
This means they aren't available in your normal shell environment.
I know the Node.js recipes stored everything in an /srv/www/#{app_name}/shared/app.env file... which is then sourced to pull in the environment to run the Node server. This implementation detail also meant you could write shell scripts that sourced that app.env file, then called some Node script or whatever.
Of course, Rails isn't Node. I have no idea if the environmental variables are also stored somewhere else or not: a quick look at the Rails recipes in the OpsWorks cookbooks didn't find anything obvious, but maybe I missed something.
Depending on the amount of modifications you have going on in your OpsWorks cookbook, you could create a deploy recipe that does something like this:
application_environment_file do
  user deploy[:user]
  group deploy[:group]
  path ::File.join(deploy[:deploy_to], "shared")
  environment_variables deploy[:environment_variables]
end
(maybe adjusting the path)
Then to run your console, when you're SSHed into the server, do something like
sudo source /srv/www/my_app_name/shared/app.env; bundle exec rails console -e production or whatever.
The AWS OpsWorks console lets you declare environment variables, but to make them available to our Rails app we need to use a Chef cookbook recipe plus some precautions.
In a nutshell, we use the config/secrets.yml file combined with a config/application.yml file, the Figaro gem, and a Chef cookbook recipe.
The Chef cookbook recipe reads the variables defined in the OpsWorks console and makes them available to the Rails app by writing the config/application.yml file.
I have published a detailed guide that explains exactly how to do it. Link here.
These are the core points that I covered:
Use the config/secrets.yml file (added in Rails 4.1)
Use the Figaro gem to load variables into the environment
Declare environment variables inside the AWS OpsWorks console
Use a custom Chef recipe to create a config/application.yml file that Figaro will use to make the variables available (a rough sketch follows below)
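As a rough sketch of the recipe idea (the resource paths and the template name are illustrative; the linked guide has the full version):

# Custom Chef recipe: write the OpsWorks environment variables into
# config/application.yml so Figaro can load them
node[:deploy].each do |app_name, deploy|
  template "#{deploy[:deploy_to]}/current/config/application.yml" do
    source "application.yml.erb"
    owner deploy[:user]
    group deploy[:group]
    variables(environment_variables: deploy[:environment_variables])
  end
end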
I (with some help from Bruno at the AWS Pop-up Loft in NYC) added some custom Chef code inside the after_restart.rb deploy hook. Simply add a folder called "deploy" to your app's root directory and inside it add "after_restart.rb". In it....
Chef::Log.info("Running deploy/after_restart.rb")
contents = []
node[:deploy].each do |application, deploy|
deploy[:environment_variables].each do |key, value|
contents << "export #{key}=\"'#{value}'\""
end
end
Chef::Log.info("Adding the environment variables to /etc/profile.d/startup_env_config.sh")
bash "create_startup_env_config.sh" do
user "root"
cwd "/etc/profile.d"
code <<-EOH
echo \''#{contents.join(" ")}\'' > startup_env_config.sh
source startup_env_config.sh
cd #{release_path}
EOH
end
And that's it. If you update the environment variables inside the OpsWorks panel remember to restart your instances.
The Rails 4 application is deployed to the production server using Capistrano 3. The delayed_job daemon starts. When a job is created, it does not recognize environment variables set in /etc/profile when read via ENV (returns nil). Any thoughts?
Moved the environment variables from /etc/profile to /etc/environment and that seemed to have fixed the problem. Maybe that's the way I was supposed to do it in the first place...I'm not a Linux expert.
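Note that /etc/environment is not a shell script: it is read by pam_env, so entries are plain KEY=value lines with no export keyword and no variable expansion, e.g. (illustrative values):

MYVAR=some_value
ANOTHER_VAR=another_value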
I've got a staging server with both standard Ruby and Ruby Enterprise installed. As standard Ruby refuses to install a critical gem, I need to set $PATH so that ruby/gem/rake/etc. always refer to the REE versions. And since I use Capistrano to deploy to our machines, I need to do it in Capistrano.
How can I set an environment variable once, and have it persist throughout the Capistrano session?
1) It's easy to do in bashrc files, but Capistrano doesn't read bashrc files.
2) I'd use Capistrano's
default_environment['PATH'] = 'Whatever'
but Capistrano uses these environment variables like
env PATH=Whatever command arg ...
and they're lost whenever another shell is spun up within the executable passed to env. Like when you use sudo. Which is kinda important:
[holt@Michaela trunk]$ env VAR=hello ruby -e "puts ENV['VAR']"
hello
[holt@Michaela trunk]$ env VAR=hello sudo ruby -e "puts ENV['VAR']"
nil
3) And I can't use the bash export command, as these are lost too - Capistrano seems to start up a new shell for each command (or something like that):
cap> export MYVAR=12
[establishing connection(s) to xxx.xxx.xxx.xxx]
cap> echo $MYVAR
** [out :: xxx.xxx.xxx.xxx]
cap>
4) I've tried messing with Capistrano's :shell and :pty options as well (and in combination with the other approaches), but no luck there, either.
So - what's the right way to do this? This seems like such a basic task that there should be a really simple way to accomplish it, but I'm out of ideas. Anyone?
Thanks in advance!
I have exactly the same problem, but I think this solution is better:
set :default_environment, {
  'env_var1' => 'value1',
  'env_var2' => 'value2'
}
This works for me like a charm.
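For what it's worth, :default_environment is the Capistrano 2 setting; in Capistrano 3 the equivalent is set :default_env, { ... } as shown in an earlier answer. Either way the variables are added to every command Capistrano runs, so they survive the fresh shell that is started for each command.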
If you need to set a variable on the remote host other than PATH, you should know that sshd only allows certain /etc/profile or ~/.bashrc environment variables by default, for security reasons. As Lou said, you can either do cap shell and use the cap> printenv command, or you can do cap COMMAND=printenv invoke in one command.
If you see the variable when you ssh into the remote shell normally, but you don't see it in the cap printenv command, here's one solution:
Set PermitUserEnvironment yes in your remote server's /etc/ssh/sshd_config file, and restart sshd
Edit the ~/.ssh/environment file for the remote user you are ssh'ing in as, and put your variable(s) there as VARIABLE=value
Now those should show up when you do cap COMMAND=printenv invoke
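For example (illustrative variable name and value):

# /etc/ssh/sshd_config
PermitUserEnvironment yes

# ~/.ssh/environment for the deploy user
MY_SECRET=abc123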
I think you have in fact 2 problems:
1) You want to change the PATH on your remote host(s).
Alter/set the path in your .bashrc on your remote host(s) and run cap> printenv. If your path is right, go to #2; otherwise, try adding export BASH_ENV=~/.bashrc to your /etc/profile (be careful: ~/.bashrc will then be run for every non-interactive shell for all users).
2) You want sudo to keep your PATH
Run visudo on your remote host(s) and add:
Defaults exempt_group = "<your_user>"
I needed to set an environment variable for a specific task to work. The "run" command allows you to pass options which include :env:
run "cmd", :env => { 'name' => 'value' }
In my case, I wanted to add the environment variable to a task that I didn't write, so I used default_run_options which is used by all invocations of run. I added this to the top of my Capfile:
default_run_options[:env] = { 'name' => 'value' }
I tried unsuccessfully to use @brian-deterling's technique, which is pretty commonly used by others who have discussed this... Maybe I'm doing something wrong, but meanwhile I found the dotenv-rails gem, and it worked very nicely for loading values out of a .env file in my project root.
The instructions in their GitHub repo are pretty straightforward. I added the Dotenv.load call to my config/application.rb.
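A minimal sketch of that, assuming the dotenv gem is already in the Gemfile:

# config/application.rb, near the top (before the Application class is defined)
require 'dotenv'
Dotenv.load   # reads KEY=value pairs from .env in the app root into ENV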