I wonder if there is a way for a dbt macro to access local environment variables.
I tried the following syntax, but it didn't work:
{{ env.MY_ENV_VARIABLE }}
and
{{ env_var('MY_ENV_VARIABLE')}}
Is there a way to access local environment variables?
This syntax worked for me for assigning the environment variable to a local variable.
{% set my_env_variable = env_var('MY_ENV_VARIABLE') %}
The issue was that I hadn't properly exported the environment variable.
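For example (a minimal sketch; the variable name and value are placeholders), exporting the variable in the same shell session before invoking dbt makes it visible to env_var():
export MY_ENV_VARIABLE=some-value
dbt run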
Problem
Some library I use requires the case-sensitive environment variable QXToken.
When I create a Codespaces secret, the environment variable is only available in uppercase (QXTOKEN), because secret names are case-insensitive. Therefore I want to copy the secret stored in QXTOKEN into the environment variable QXToken.
I tried to do that in the devcontainer.json:
{
...
"remoteEnv": {
"QXAuthURL": "https://auth.quantum-computing.ibm.com/api",
"QXToken": "${secrets.QXTOKEN}"
},
"updateContentCommand": "env; export QXToken=$QXTOKEN; env",
"postCreateCommand": "env; export QXToken=$QXTOKEN; env",
"postStartCommand": "env; export QXToken=$QXTOKEN; env",
"postAttachCommand": "env; export QXToken=$QXTOKEN; env"
}
But remoteEnv cannot access the Codespaces secrets via ${secrets.QXTOKEN} the way GitHub Actions can, and none of updateContentCommand, postCreateCommand, postStartCommand, or postAttachCommand saved the environment variable persistently for the user.
Using the env command, I can see from the logs that the environment variables were set, but by the next command they are gone.
Even though postCreateCommand can access the Codespaces secrets according to the documentation, I was not able to set environment variables for later use.
For now I only see the following environment variables, but I am missing QXToken:
$ env | grep QX
QXAuthURL=https://auth.quantum-computing.ibm.com/api
QXTOKEN=***
Question
Is there a best practice to reuse codespaces secrets inside devcontainer.json and make them available as environment variables in the codespace?
GitHub Codespaces secrets are available via localEnv, a special variable in devcontainer.json that provides access to environment variables on the host machine. Therefore, you can set the environment variable QXToken with ${localEnv:QXTOKEN} inside devcontainer.json.
Furthermore, if you want to set an environment variable pointing to a path inside your repo you can use ${containerWorkspaceFolder}/path/inside/your/repo.
"remoteEnv": {
// Use a GitHub Codespaces secret:
"QXToken": "${localEnv:QXTOKEN}",
// Point to a path inside your repo:
"QISKIT_SETTINGS": "${containerWorkspaceFolder}/.qiskit/settings.conf"
}
For more details on the available variables in devcontainer.json have a look at the documentation.
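For example, once the codespace has started, you can verify the result from the terminal (QXToken being the variable set above):
env | grep QXToken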
I am using a base.env as an env_file for several of my Docker services. In this base.env, several parts of the environment variables repeat throughout the file. For example, the port and IP are the same for three different environment variables.
I would like to specify these once as environment variables and reuse them to fill out the other environment variables.
Here is base.env:
### Kafka
# kafka's port is 9092 by default in the docker-compose file
KAFKA_PORT_NUMBER=9092
KAFKA_TOPIC=some-topic
KAFKA_IP=kafka
KAFKA_CONN: //$KAFKA_IP:$KAFKA_PORT_NUMBER/$KAFKA_TOPIC
# kafka topic that is to be created. Note that ':1:3' should remain the same.
KAFKA_CREATE_TOPICS=$KAFKA_TOPIC:1:3
# the url for connecting to kafka
KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://$KAFKA_IP:$KAFKA_PORT_NUMBER
I have tried writing
KAFKA_CONN: //$${KAFKA_IP}:$${KAFKA_PORT_NUMBER}/$${KAFKA_TOPIC}
in the environment section of the appropriate service in the docker-compose.yml, but this gets interpreted as a literal string in the container.
Is there a way to do what I want in the base.env file?
Thank you for your help!
You can actually do it like this (at least with the vlucas/dotenv package for PHP; I'm not sure about other loaders, so please check yours):
MAIL_NAME=${MAIL_FROM}
Read more about it here
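Applied to the base.env from the question, that would look like this (a sketch that assumes your loader expands ${VAR} references the way vlucas/dotenv does; plain docker --env-file passes values through literally):
KAFKA_PORT_NUMBER=9092
KAFKA_IP=kafka
KAFKA_TOPIC=some-topic
KAFKA_CONN=//${KAFKA_IP}:${KAFKA_PORT_NUMBER}/${KAFKA_TOPIC}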
There is no way to do this in an env_file since it is not run as a bash command. This means that the variable is not created and then concatenated into the next variable it appears in. The values are just read in as they are in the env_file.
I used $ in Node.js and React.js, and both worked:
POSTGRES_PORT=5432
DATABASE_URL="postgresql://root@localhost:${POSTGRES_PORT}/dbname"
and in react
REACT_APP_DOMAIN=domain.com
#API Configurations
REACT_APP_API_DOMAIN=$REACT_APP_DOMAIN
I know that I am a little late to the party, but I had the same question and found a way to do it. There is a package called env-cmd, which allows you to use a .js file as an .env file. The file simply needs to export an object whose keys are your environment variable names and whose values are, well, the values. This lets you run JavaScript before the environment variables are exported, and thus use environment variables to set others.
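A minimal usage sketch (the file name .env.js and the node entry point index.js are just assumed examples):
npx env-cmd -f ./.env.js node index.js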
I temporarily managed to deal with this by creating a script that replaces one env file's vars with another env file's vars, like so:
.env.baseurl:
BASEURL1=http://127.0.0.1
BASEURL2=http://192.168.1.10
.env.uris.default:
URI1=${BASEURL1}/uri1
URI2=${BASEURL2}/uri2
URI3=${BASEURL2}/uri3
convert-env.sh:
#!/bin/bash
# Start from a fresh copy so sed can be re-run against the same source file
cp ./.env.uris.default ./.env.uris
# Go through each variable in .env.baseurl and split it into key and value
for VAR in $(cat ./.env.baseurl); do
  key=$(echo "$VAR" | cut -d "=" -f1)
  value=$(echo "$VAR" | cut -d "=" -f2)
  # Replace ${key} placeholders in ./.env.uris with the value
  # (use | as the sed delimiter because the values contain slashes)
  sed -i "s|\${$key}|$value|g" ./.env.uris
done
Then you can run the docker run command to start the container and load it with your env vars (expanded from .env.baseurl into .env.uris):
docker run -d --env-file "./.env.uris" <image>
This is not the best solution, but it helped me for now.
Using Next.js, in the .env.local file I have the following variables:
NEXT_PUBLIC_BASE_URL = http://localhost:5000
NEXT_PUBLIC_API_USERS_URL_REGISTER = ${NEXT_PUBLIC_BASE_URL}/api/users/register
It works well; the variable NEXT_PUBLIC_BASE_URL is reused inside NEXT_PUBLIC_API_USERS_URL_REGISTER.
There is a simple way to do this; you will just need to run:
env >>/root/.bashrc && source /root/.bashrc
This will append all current environment variables to /root/.bashrc and re-source it, expanding any that were not expanded when the env-file was passed.
You can use something like ${yourVar}:
KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://${KAFKA_IP}:${KAFKA_PORT_NUMBER}
I tested this in a PHP / Laravel .env file and it works fine.
I am currently using an Ansible script to deploy a docker-compose file (using the docker_service module), which sets a series of environment variables that are read by the .NET Core service running inside the Docker container, like this:
(...)
environment:
- Poller:Username={{ poller_username }}
- Poller:Password={{ poller_password }}
(...)
The variables for poller_username and poller_password are being loaded from an Ansible Vault (which will be moved to a Hashicorp Vault eventually), and are interpolated into the file with no problem.
However, I have come across a scenario where this logic fails: the user has a '$' in the middle of his password. This means that instead of being set to 'abc$123', the environment variable is set to 'abc', causing my application to fail.
When I output the password with an Ansible debug task, it prints to the console correctly, but if I run docker exec <container_name> env I see the wrong password.
Is there a Jinja filter I can use to ensure the password is compliant with docker-compose standards? It doesn't seem viable to me to guarantee the password will never have a $.
EDIT: {{ poller_password | replace("$","$$") }} works, but this isn't a very elegant solution to have in, potentially, every variable I use in the docker-compose module.
For this particular scenario, the {{ poller_password | replace("$","$$") }} solution seems to be inevitable. Thankfully, it appears to be the only case that requires this caution.
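In the compose template from the question, that looks like the following (a sketch reusing the snippet above; docker-compose treats $$ as a literal $):
environment:
  - Poller:Username={{ poller_username }}
  - Poller:Password={{ poller_password | replace("$","$$") }}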
I had a similar situation (it was not a $ but some other character) and ended up using
something: !unsafe "{{ variable }}"
I couldn't find a better way.
I am following this tutorial http://pjambet.github.com/blog/direct-upload-to-s3/ and am pretty much stuck on setting the AWS variables. The tutorial says I have to set the variables as below:
export S3_BUCKET=<YOUR BUCKET>
export AWS_ACCESS_KEY_ID=<YOUR KEY>
export AWS_SECRET_KEY_ID=<YOUR SECRET KEY>
But it doesn't mention where to set those variables.
Could anyone point me in the right direction? Thanks
Just type them at the command line, using the export command.
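For example (the values are placeholders), run these in the same terminal session before starting your app, or add them to your ~/.bashrc so they persist across sessions:
export S3_BUCKET=my-bucket
export AWS_ACCESS_KEY_ID=my-access-key
export AWS_SECRET_KEY_ID=my-secret-key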