Can't find which *FILE* saves the environment variables for csh - environment-variables

When I run the set command in csh, it shows me a list of the variables I currently have in my session.
I was wondering where they are set/saved and couldn't find the location.
I have looked in ~/.cshrc and ~/.cshrc.myusername, and in neither file did I see any of the environment variables that set shows.
Where are they?

In memory. Each instance of csh will get a new copy of your default environment. Setting variables at the command prompt does not persist them anywhere for future sessions.
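If you want a variable to exist in every future session, one common approach (a sketch, not part of the original answer; MY_VAR and its value are placeholders) is to set it in ~/.cshrc, which csh reads each time it starts:
# ~/.cshrc -- read by every new csh instance
setenv MY_VAR "some value"
setenv PATH "${PATH}:/opt/mytool/bin"
Shells that are already open won't see the change until you run source ~/.cshrc in them.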

Related

set environment variable from .txt file in Ubuntu 18.04

I need to set more than 100 environment variables from file.txt. The variables in the text file look like this:
env=BACKTORY_AUTHENTICATION_MASTER_KEY=058f04d8ea6545sdf65sde99e49
env=BACKTORY_AUTHENTICATION_CLIENT_KEY=5a3ba2f0e4b0a24sdfsd4ffb4
env=BACKTORY_MASTER_ACCESS_TOKEN=my_token
.
.
.
Is there any way to set these variables automatically?
My second question is: I once set these variables manually, one by one, with export variable=value, but now I can't see any of them when I use printenv. Is every environment variable deleted after restarting Ubuntu?
Thanks.
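One way to do this (a sketch, not taken from an answer in this thread), assuming every line of file.txt has the exact form env=KEY=VALUE:
# load_env.sh -- source it in the current shell (". ./load_env.sh") so the
# exports affect that shell rather than a throwaway child process
while IFS= read -r line; do
    export "${line#env=}"    # strip the leading "env=" and export KEY=VALUE
done < file.txt
As for the second question: yes, variables exported at a prompt live only in that shell session. To keep them across reboots, put the export lines (or a line that sources this script) in a startup file such as ~/.profile.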

Get environment variable from Dockerfile or docker-compose.yml

I tried to reference the variable in docker-compose.yml as ${NODE_ENV}, but it doesn't work.
Also, I don't want to pass any parameters on my commands. I have already defined an environment variable on my system and I'd like to pick it up from either of these two files.
The solution was running export NODE_ENV=development again. I was losing this env var every time I closed the terminal.
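In other words, docker-compose expands ${NODE_ENV} from the environment of the shell that invokes it, so the variable has to exist in every new terminal. A sketch of making that permanent, assuming a bash shell that reads ~/.bashrc:
echo 'export NODE_ENV=development' >> ~/.bashrc    # available in every new terminal
. ~/.bashrc                                        # reload it in the current terminal
docker-compose config                              # prints the resolved file, with ${NODE_ENV} substituted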

activating conda env vs calling python interpreter from conda env

What exactly is the difference between these two operations?
source activate python3_env && python my_script.py
and
~/anaconda3/envs/python3_env/bin/python my_script.py ?
It appears that activating the environment adds some directories to $PATH, but the second method still seems to find all the modules installed in python3_env. Is there anything else going on under the hood?
You are correct, activating the environment adds some directories to the PATH environment variable. In particular, this will allow any binaries or scripts installed in the environment to be run first, instead of the ones in the base environment. For instance, if you have installed IPython into your environment, activating the environment allows you to write
ipython
to start IPython in the environment, rather than
/path/to/env/bin/ipython
In addition, environments may have scripts that add or edit other environment variables that are executed when the environment is activated (see the conda docs). These scripts can make arbitrary changes to the shell environment, including even changing the PYTHONPATH to change where packages are loaded from.
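If you want to see for yourself what activation changes, a quick sketch (the environment name python3_env is taken from the question):
env | sort > /tmp/env_before
source activate python3_env            # or: conda activate python3_env
env | sort > /tmp/env_after
diff /tmp/env_before /tmp/env_after    # typically PATH, CONDA_* variables, plus anything set by the env's activate.d scripts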
Finally, I wrote a very detailed answer about what exactly is happening in the code here: Conda: what happens when you activate an environment? That may or may not still be up-to-date, though. The relevant part of the answer is:
...the build_activate method adds the prefix to the PATH via the _add_prefix_to_path method. Finally, the build_activate method returns a dictionary of commands that need to be run to "activate" the environment.
And another step deeper... The dictionary returned from the build_activate method gets processed into shell commands by the _yield_commands method, which are passed into the _finalize method. The activate method returns the value from running the _finalize method which returns the name of a temp file. The temp file has the commands required to set all of the appropriate environment variables.
Now, stepping back out, in the activate.main function, the return value of the execute method (i.e., the name of the temp file) is printed to stdout. This temp file name gets stored in the Bash variable ask_conda back in the _conda_activate Bash function, and finally, the temp file is executed by the eval Bash function.
So you can see, depending on the environment, running conda activate python3_env && python my_script.py and ~/anaconda3/envs/python3_env/bin/python my_script.py may give very different results.

PyCharm not updating with environment variables

When I use vim to update my environment variables (in ~/.bashrc), PyCharm does not get the updates right away. I have to shut down the program, source ~/.bashrc again, and re-open PyCharm.
Is there any way to have PyCharm source the changes automatically (or without shutting down)?
When any process is created, it inherits the environment variables from its parent process (the O.S. itself in your case). If you change the environment variables at the parent level, the child process is not aware of it.
PyCharm allows you to change the environment variables from the Run\Debug Configuration window.
Run > Edit Configurations > Environment Variables
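To see the inheritance rule in action, a small sketch (assuming the pycharm.sh launcher is on your PATH; API_KEY is a placeholder):
export API_KEY=abc123     # set in the parent shell
pycharm.sh &              # this PyCharm instance inherits API_KEY=abc123
export API_KEY=changed    # the PyCharm that is already running still sees abc123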
In my case PyCharm does not pick up env variables from .bashrc even after restarting.
PyCharm maintains its own set of environment variables, and those aren't sourced from the shell.
It seems that if PyCharm is launched from a virtualenv or a shell containing said variables, it will load with them; however, this is not dynamic.
The answer linked below has a settings.py script for the virtualenv to update and maintain settings. Whether this completely solves your question or not, I'm not sure.
Pycharm: set environment variable for run manage.py Task
I recently discovered a workaround on Windows. Close PyCharm, copy the command that the shortcut uses to launch PyCharm, and rerun it in a new terminal window (cmd, cmder, etc.):
C:\
λ "C:\Program Files\JetBrains\PyCharm 2017.2.1\bin\pycharm64.exe"
I know this is very late, but I encountered this issue as well and found the accepted answer tedious as I had a lot of saved configurations already.
The solution a co-worker told me about is to add the environment variables to ~/.profile instead. I then had to restart my Linux machine, and PyCharm picked up the new values. (On OS X, I only needed to source ~/.profile and restart PyCharm completely.)
One thing to be aware of is that another coworker said PyCharm looks at ~/.bash_profile, so if you have that file, you need the environment variables added there instead.
In case you are using the "sudo python" technique, be aware that sudo does not pass the environment variables through by default.
To correctly pass on the environment variables defined in the PyCharm launch configuration, use the -E switch:
sudo -E /path/to/python/executable "$@"
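For context, a hedged sketch of how that line is typically used: saved as a small wrapper script that the run configuration points to as the interpreter (the python path is a placeholder):
#!/bin/bash
# wrapper: forward PyCharm's environment (-E) and all arguments ("$@") to a root python
sudo -E /path/to/python/executable "$@"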
This is simply how environment variables work. If you change them you have to re-source your .bashrc (or whatever file the environment variables are located in).
from dotenv import load_dotenv
load_dotenv(override=True)
Python-dotenv can interpolate variables using POSIX variable expansion.
With load_dotenv(override=True) or dotenv_values(), the value of a variable is the first of the values defined in the following list:
Value of that variable in the .env file.
Value of that variable in the environment.
Default value, if provided.
Empty string.
With load_dotenv(override=False), the value of a variable is the first of the values defined in the following list:
Value of that variable in the environment
Value of that variable in the .env file.
Default value, if provided.
Empty string.
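A quick way to see the difference between override=True and override=False from a shell, assuming python-dotenv is installed and using a placeholder variable named GREETING:
echo 'GREETING=from_dotenv' > .env
export GREETING=from_shell
# override=True: the value from the .env file wins
python3 -c "import os; from dotenv import load_dotenv; load_dotenv(override=True); print(os.environ['GREETING'])"     # prints from_dotenv
# override=False (the default): the value already in the environment wins
python3 -c "import os; from dotenv import load_dotenv; load_dotenv(override=False); print(os.environ['GREETING'])"    # prints from_shell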

Use Environment Variable in the same process that assigned it

I have an installer that assigns an environment variable using the setx command.
Afterwards, that installer invokes a command line that uses this environment variable, but in that context the variable is still empty.
If I invoke the command line independently, the variable is read properly.
Why is that, and how can I overcome it?
I've experimented extensively with SETX. Variables set via SETX cannot be seen in the process or script that sets them, unless you programmatically re-read the pertinent Registry key.
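A minimal sketch of that on the Windows command line (MY_VAR is a placeholder; per-user variables written by setx land under HKCU\Environment):
setx MY_VAR "hello"
rem The current cmd session still does not see it (this prints the literal %MY_VAR%):
echo %MY_VAR%
rem Re-read the value setx just wrote to the registry:
reg query HKCU\Environment /v MY_VAR
rem Processes started after this point (e.g. a new cmd window) will see MY_VAR.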
