How to remove an environment variable from ld.config in Solaris

I have just added a permanent environment variable to /var/ld/ld.config and /var/ld/64/ld.config using the crle command:
crle -E VAR=VALUE -u
crle -E VAR=VALUE -u -64
How can I remove only this variable from ld.config while leaving the rest of the parameters intact?

crle lets you add to an existing config file with -u, but it doesn't give you a way to selectively remove things from one. The two operations aren't symmetrical, and the ability to remove selected attributes could fairly be called missing functionality. However, crle config files are not expected to be manipulated this way very often; the far more common case is to create one and later remove it entirely, and config files in general are rarely used at all.
It's pretty easy to work around this. The goal is to remove one attribute without having to remember all the others; they simply carry over. The information crle displays for a given config file includes the command that would recreate it, so you can approximate selective removal by cutting and pasting that displayed command, minus the part you no longer want.
For instance, let's create a config file with 2 environment variables:
% crle -c ld.config -E VAR1=v1 -E VAR2=v2
Later, if I want to remove VAR1 from the existing config file, I would use 'crle -c file' to display the current contents:
% crle -c ld.config
Configuration file [version 5]: ld.config
Platform: 32-bit LSB 80386
Default Library Path: /lib:/usr/lib (system default)
Trusted Directories: /lib/secure:/usr/lib/secure (system default)
Environment Variables:
VAR1=v1 (permanent)
VAR2=v2 (permanent)
Command line:
crle -c ld.config -E VAR1=v1 -E VAR2=v2
That last line shows me everything I need to reproduce the config file; I re-run it without the part I no longer want:
% crle -c ld.config -E VAR2=v2
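The same workflow applies to the system-wide files from the question. A minimal sketch, assuming VAR=VALUE is the entry to drop and KEEP=1 stands in for whatever other settings the display reports (hypothetical placeholder):
% crle
% crle -64
Each invocation prints the current contents of /var/ld/ld.config and /var/ld/64/ld.config respectively, including their "Command line:" entries. Re-run each displayed command without the -E VAR=VALUE argument, for example:
% crle -c /var/ld/ld.config -E KEEP=1
% crle -64 -c /var/ld/64/ld.config -E KEEP=1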

Related

Read ENV variable value within the ENV File only [duplicate]

I am using a base.env as an env_file for several of my docker services. In this base.env I have several parts of the environment variables that repeat throughout the file. For example, the port and IP are the same for three different environment variables.
I would like to specify these once as environment variables and reuse them to fill out the other environment variables.
Here is base.env:
### Kafka
# kafka's port is 9092 by default in the docker-compose file
KAFKA_PORT_NUMBER=9092
KAFKA_TOPIC=some-topic
KAFKA_IP=kafka
KAFKA_CONN: //$KAFKA_IP:$KAFKA_PORT_NUMBER/$KAFKA_TOPIC
# kafka topic that is to be created. Note that ':1:3' should remain the same.
KAFKA_CREATE_TOPICS=$KAFKA_TOPIC:1:3
# the url for connecting to kafka
KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://$KAFKA_IP:$KAFKA_PORT_NUMBER
I have tried writing
KAFKA_CONN: //$${KAFKA_IP}:$${KAFKA_PORT_NUMBER}/$${KAFKA_TOPIC}
in the environment section of the appropriate service in the docker-compose.yml, but this gets interpreted as a literal string in the container.
Is there a way to do what I want in the base.env file?
Thank you for your help!
You can actually do it like this (at least with the vlucas/dotenv package for PHP; I'm not sure about others, so please check yourself):
MAIL_NAME=${MAIL_FROM}
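Applied to the question's base.env, that nested-variable syntax would look like the sketch below. Whether the references actually get expanded depends on the library that reads the file (docker-compose's env_file does not expand them, as the next answer explains):
KAFKA_PORT_NUMBER=9092
KAFKA_TOPIC=some-topic
KAFKA_IP=kafka
KAFKA_CONN=//${KAFKA_IP}:${KAFKA_PORT_NUMBER}/${KAFKA_TOPIC}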
There is no way to do this in an env_file, because it is not processed by a shell: the variables are not expanded and substituted into later values; each line is read literally.
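A quick way to see that literal behaviour is to inspect a variable inside a container started from an env file (a hypothetical two-line file, not the question's base.env):
printf 'BASE=foo\nDERIVED=$BASE/bar\n' > demo.env
docker run --rm --env-file demo.env busybox sh -c 'echo "$DERIVED"'
# prints: $BASE/bar -- the reference is passed through as-is, not expanded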
I used $ in Node.js and React.js, and both worked:
POSTGRES_PORT=5432
DATABASE_URL="postgresql://root@localhost:${POSTGRES_PORT}/dbname"
and in React:
REACT_APP_DOMAIN=domain.com
#API Configurations
REACT_APP_API_DOMAIN=$REACT_APP_DOMAIN
I know that I am a little late to the party, but I had the same question and found a way to do it. There is a package called env-cmd, which allows you to use a .js file as an .env file. The file simply needs to export an object whose keys are your environment variable names and whose values are, well, the values. This lets you run JavaScript before the environment variables are exported, and thus use environment variables to set others.
I temporarily managed to deal with this by creating a script that replaces env-file variables with values from another env file, like so:
.env.baseurl:
BASEURL1=http://127.0.0.1
BASEURL2=http://192.168.1.10
.env.uris.default:
URI1=${BASEURL1}/uri1
URI2=${BASEURL2}/uri2
URI3=${BASEURL2}/uri3
convert-env.sh:
#!/bin/bash
# To allow using sed correctly from same file multiple times
cp ./.env.uris.default ./.env.uris
# Go through each variable in .env.baseurl and store them as key value
for VAR in $(cat ./.env.baseurl); do
  key=$(echo "$VAR" | cut -d "=" -f1)
  value=$(echo "$VAR" | cut -d "=" -f2)
  # Replace env vars by values in ./.env.uris; use '|' as the sed delimiter
  # so the slashes in the URL values do not break the expression
  sed -i "s|\${$key}|$value|g" ./.env.uris
done
Then you can run docker run to start the container and load it with your env vars (resolved from .env.baseurl into .env.uris):
docker run -d --env-file "./.env.uris" <image>
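For reference, the generated .env.uris should then contain the expanded values derived from the two sample files above:
URI1=http://127.0.0.1/uri1
URI2=http://192.168.1.10/uri2
URI3=http://192.168.1.10/uri3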
This is not the best solution but helped me for now.
Using Next.js, in the .env.local file I have the following variables:
NEXT_PUBLIC_BASE_URL = http://localhost:5000
NEXT_PUBLIC_API_USERS_URL_REGISTER = ${NEXT_PUBLIC_BASE_URL}/api/users/register
It works well; the variable NEXT_PUBLIC_BASE_URL is reused inside NEXT_PUBLIC_API_USERS_URL_REGISTER.
There is a simple way to do this; you just need to run:
env >>/root/.bashrc && source /root/.bashrc
This appends all current environment variables to /root/.bashrc and re-sources it, so any references that were not expanded when the env-file was passed get expanded by the shell.
You can use something like this: ${yourVar}
KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://${KAFKA_IP}:${KAFKA_PORT_NUMBER}
I tested this in a PHP / Laravel .env and it works fine.

Replace string placeholder with value in sh file

I have to say that in a Windows/PowerShell environment I would have done this immediately, but since I have to execute this shell script inside a Linux docker image, I need your help.
I have a node.js env file where I store my environment variables so the Node.js app can use them later. I've set some placeholders and I need to replace them by substituting the values passed as arguments to the docker run command.
The content of the .env file is
NodePort={NodePort}
DBServer={DBServer}
DBDatabaseName={DBDatabaseName}
DBUser={DBUser}
DBPassword={DBPassword}
DBEncrypt={DBEncrypt}
RFIDNodeUrlRoot={RFIDNodeUrlRoot}
RFIDStartMethod={RFIDStartMethod}
RFIDStopMethod={RFIDStopMethod}
RFIDGetTagsMethod={RFIDGetTagsMethod}
I don't know the best approach to open the file, replace the placeholders with values from environment variables, and save it again.
Can anyone please help me?
Thanks
You can use envsubst, which is part of the gettext-base package.
see:
https://stackoverflow.com/a/14157575/2087704
https://unix.stackexchange.com/a/294400/193945
.env.temp
NodePort=${NodePort} # notice the `$` before `{}`
DBServer=${DBServer}
..
Assuming you are setting environment variables with
docker run -e "NodePort=8080" -e "DBServer=foo"
Inside that container you will have to use some entrypoint.sh script to run:
envsubst \$NodePort,\$DBServer,.. < .env.temp > .env
then start your app, passing the rendered .env to your Node.js app.
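A minimal entrypoint sketch putting this together (the file names entrypoint.sh, .env.temp and server.js are assumptions for illustration, not from the question; extend the variable list with the remaining names from the question's .env):
#!/bin/sh
# Render .env from the template, substituting only the listed variables,
# then start the Node.js app.
envsubst '$NodePort $DBServer $DBDatabaseName $DBUser $DBPassword' < .env.temp > .env
exec node server.js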
As an alternative you can also use sed to edit .env directly, though the quoting can be harder to follow:
subst_env() {
  eval val="\$$1"              # expand the named environment variable into val
  sed -i "s%\$$1%${val}%g" $2  # use % as the sed delimiter to avoid escaping slashes in URLs
}
subst_env 'ENV_DOCKER_DOMAIN' .env
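For example, if .env contained a hypothetical line API_URL=$ENV_DOCKER_DOMAIN/api and ENV_DOCKER_DOMAIN were exported as https://example.com, the call above would rewrite the file in place to:
API_URL=https://example.com/api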

terraform doesn't load environment variables set in fish

In the root folder of my project next to main.tf, I have a script called load_env.fish containing these two lines:
set -U AWS_SHARED_CREDENTIALS_FILE "~/path/to/file"
set -U AWS_PROFILE "my_profile"
I run that, then I run the command terraform import foo bar. It gives me Access Denied.
However, if I use bash instead of fish, and I set up the same environment variables, then terraform import foo bar works.
And I can even get it to work in fish if I do this:
from bash, set up environment variables
start the fish shell from bash
now in the fish shell, run terraform import foo bar
So,
Why does it work if I use bash and not fish? And why does it work in fish if the fish shell is opened from a bash shell that has the correct environment variables set?
How can I use terraform in the fish shell without having to open nested bash and fish shells?
Universal variables are shared between all fish sessions, but they are not automatically exported to subprocesses.
I simply changed all instances of set -U ... to set -Ux ... and everything worked.
EDIT: After seeing KurtisRader's comment concerning the downside of set -Ux and reading a bit more, I realize now that fish has the source command just like bash. So, inside the script I can just use
set -x foo bar
Then I can
$ source load_env.fish
instead of just
$ ./load_env.fish


PBSPro qsub output error file directed to path with jobid in name

I'm using PBSPro and am trying to use the qsub command line to submit a job, but I can't seem to get the output and error files named the way I want. Currently I'm using:
qsub -N ${subjobname_short} \
-o ${path}.o{$PBS_JOBID} -e ${path}.e${PBS_JOBID}
... submission_script.sc
Where $path=fulljobname (i.e. more than 15 characters)
I'm aware that $PBS_JOBID won't be set until after the job is submitted...
Any ideas?
Thanks
The solution I came up with was following the qsub command with a qalter command like so:
jobid=$(qsub -N ${subjobname_short} submission_script.sc)
qalter -o ${path}.o${jobid} -e ${path}.e${jobid} ${jobid}
This way, PBS Pro does not need to resolve the variables, as it failed to do so in our install (this may be a configuration issue).
If you want the ${PBS_JOBID} to be resolved by PBSPro, you need to escape it on the command line:
qsub -o \$PBS_JOBID
Otherwise, bash will attempt to resolve $PBS_JOBID before it gets to the qsub command. I don't know if $subjobname_short and $path are actual environment variables or ones you want PBS to resolve, but if you want PBS to resolve them you'll also need to escape them or place them inside the job script.
NOTE: I also notice that your -o argument says {$PBS_JOBID} and I'm pretty sure you want ${PBS_JOBID}. I don't know if that's a typo in the question or what you tried to pass to qsub.
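Putting that together, a sketch of the escaped form (whether PBS Pro resolves it can depend on the installation, as the first answer notes; ${subjobname_short} and ${path} are the shell variables from the question):
qsub -N ${subjobname_short} \
-o ${path}.o\$PBS_JOBID -e ${path}.e\$PBS_JOBID \
... submission_script.sc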
