I followed this tutorial http://pjambet.github.com/blog/direct-upload-to-s3/ and am pretty much stuck on setting the AWS variables. The tutorial says I have to set the variables as below:
export S3_BUCKET=<YOUR BUCKET>
export AWS_ACCESS_KEY_ID=<YOUR KEY>
export AWS_SECRET_KEY_ID=<YOUR SECRET KEY>
But it doesn't mention where to set those variables.
Could anyone point me in the right direction? Thanks
Just type them on the command line, using the export command.
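For example, a minimal sketch (the bucket name and key values below are placeholders you would replace with your own):
export S3_BUCKET=my-upload-bucket
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_KEY_ID=your-secret-key
# verify the variable is visible in the current shell
echo $S3_BUCKET
Note that variables exported this way only last for the current shell session; to keep them across sessions, append the same export lines to ~/.profile or ~/.bashrc.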
Related
I am not able to pass ${environment} in the Vault secret path for reading the values.
Maybe the secret is getting initialized before the variables are set.
Kindly help, as I'm not able to read environment-specific values from the same Vault repo.
It worked pretty nicely for me using a choice parameter in a parameterized build. I think your issue is in the Vault path you used (vault/secret/$environment); the correct path in your case is probably just "secret/$environment". Does your secrets engine really start with "vault"?
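As a quick sanity check (a sketch, assuming the KV secrets engine is mounted at secret/ and $environment expands to something like "dev"), you can confirm the path from a shell:
vault kv get "secret/${environment}"
# if that fails, list the mounts to see what the engine is actually called
vault secrets list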
Just FYI, if you define the variable in "Jenkins > Manage Jenkins > Configure System > Environment variables" it'll work too.
Problem
Some library I use requires the case sensitive environment variable QXToken.
When I create a Codespaces secret, the environment variable is only available in uppercase (QXTOKEN), as the secrets are case-insensitive. Therefore I want to copy the secret stored in QXTOKEN to the environment variable QXToken.
I tried to do that in the devcontainer.json:
{
  ...
  "remoteEnv": {
    "QXAuthURL": "https://auth.quantum-computing.ibm.com/api",
    "QXToken": "${secrets.QXTOKEN}"
  },
  "updateContentCommand": "env; export QXToken=$QXTOKEN; env",
  "postCreateCommand": "env; export QXToken=$QXTOKEN; env",
  "postStartCommand": "env; export QXToken=$QXTOKEN; env",
  "postAttachCommand": "env; export QXToken=$QXTOKEN; env"
}
But remoteEnv cannot access the Codespaces secrets via ${secrets.QXTOKEN} the way GitHub Actions can, and none of updateContentCommand, postCreateCommand, postStartCommand and postAttachCommand saved the environment variable persistently for the user.
Using the env command, I can see from the logs that the environment variable has been set, but by the next command it is already gone.
Even though postCreateCommand is able to access the Codespaces secrets according to the documentation, I was not able to set environment variables for later use.
For now I only see the following environment variables, but I am missing QXToken:
$ env | grep QX
QXAuthURL=https://auth.quantum-computing.ibm.com/api
QXTOKEN=***
Question
Is there a best practice to reuse codespaces secrets inside devcontainer.json and make them available as environment variables in the codespace?
The GitHub Codespaces secrets are available via localEnv, a special variable in devcontainer.json that provides access to environment variables on the host machine. Therefore, you can set the environment variable QXToken with ${localEnv:QXTOKEN} inside devcontainer.json.
Furthermore, if you want to set an environment variable pointing to a path inside your repo you can use ${containerWorkspaceFolder}/path/inside/your/repo.
"remoteEnv": {
// Use a GitHub Codespaces secret:
"QXToken": "${localEnv:QXTOKEN}",
// Point to a path inside your repo:
"QISKIT_SETTINGS": "${containerWorkspaceFolder}/.qiskit/settings.conf"
}
For more details on the available variables in devcontainer.json have a look at the documentation.
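After rebuilding the container you can verify that both variables are visible inside the codespace, for example (assuming the remoteEnv block above):
env | grep -E 'QXToken|QISKIT_SETTINGS'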
I have a DAG that uses environment variables. The environment variables are set in /etc/default/airflow-scheduler:
export MY_KEY=1234
But when I echo MY_KEY in the DAG, it doesn't print anything.
I checked the Airflow scheduler's environment variables and verified that MY_KEY existed. The command I used to verify that was:
cat /proc/process_id_of_airflow_scheduler/environ
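Side note: /proc/<pid>/environ is NUL-separated, so it is easier to read if you translate the separators to newlines first, e.g.:
cat /proc/process_id_of_airflow_scheduler/environ | tr '\0' '\n' | grep MY_KEY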
Can anyone advise me on how to solve this problem?
Thanks
I found the answer. I added the key to /etc/environment and now it is working.
If you are using CentOS/Red Hat, the key should rather be added to /etc/sysconfig/airflow, or to /etc/default/airflow on Debian/Ubuntu. See this answer.
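For reference, /etc/environment is read by pam_env and takes plain KEY=value lines rather than export statements, e.g.:
MY_KEY=1234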
I am pretty new to setting up remote servers, but I was playing around today and was hoping I could leverage a cloud-config file during setup to set a few environment variables as the server spins up.
How can I set my environment variables programmatically when spinning up a machine on Digital Ocean? The key is that I want to automate the setup and avoid interactively defining these variables.
Thanks in advance.
This is what I did for Ubuntu:
write_files:
  - path: /etc/environment
    content: |
      FOO="BAR"
    append: true
There are a couple of ways to do this, although cloud-init doesn't have a built-in resource type for environment variables.
Depending on your OS, use a write_files section to output the env vars you want to the appropriate file. For CoreOS, you'd do something like:
write_files:
  - path: "/etc/profile.env"
    append: true
    content: |
      export MY_VAR="foo"
For Ubuntu, use /etc/environment, or a user's profile, etc.
Another way to do it would be to leverage Cloud Init's support for Chef, and use that tool to set the variables when the profile is applied.
Do you need the environment variable to be permanent, or just for the execution of a single command/script?
If it's for a single command, you can do this:
FOO=${BAR} sh ./your_script.sh
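If you do need it to persist instead (a sketch, assuming a login shell that reads ~/.profile):
echo 'export FOO="bar"' >> ~/.profile
# takes effect in new login shells, or immediately after:
. ~/.profile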
I am developing a Rails 4 app on Cloud9 (c9.io). When I put SECRET="geheim" in the config file, it works fine. I tried setting an environment variable using
echo "export SECRET=geheim" >> ~/.profile
and then using ENV['SECRET'] in the config file, but it doesn't work. When I type printenv SECRET in the console, it returns nothing, meaning the variable is not set. How can I fix this? Thanks.
You can add environment variables on Cloud9 only if you are using the run panel to run your application. In the run panel there's an ENV button at the far right side where you can set your environment variables.
Here's some documentation about setting up your run command:
https://docs.c9.io/v1.0/docs/run-an-application
Unfortunately, this doesn't work if you're running your app from the terminal, as Cloud9 doesn't seem to support environment variables directly from the terminal.
In the Linux terminal:
% export <env-variable-name>=<env-variable-value>
For example, setting an AWS-S3 bucket:
% export PHOTOS_BUCKET='s3://edx-photo-lab/photos/'
source: https://docs.aws.amazon.com/cloud9/latest/user-guide/env-vars.html
The above solution has the problem that the variables are cleared when the Cloud9 EC2 instance is rebooted.
To persist the variables, you must add the export statement to your ~/.bashrc. You can use vim to edit it:
% sudo vim ~/.bashrc
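For example, to persist the bucket variable from above (same placeholder path):
echo "export PHOTOS_BUCKET='s3://edx-photo-lab/photos/'" >> ~/.bashrc
# reload the shell configuration so the variable is available immediately
source ~/.bashrc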