We have a Docker container, and the program that runs inside it expects a number of environment variables (envvars) to be set. There are several ways to do this, and I was curious what the correct way is:
1- Put the envvars in a config repo, add it as a material, and use docker's --env-file flag to pass the file (see the sketch after this list). Cons: seems like overkill for passing 10 envvars.
2- Define the envvars in the job configuration tab and export them during docker build. Cons: every new job would have to set the envvars manually, and adding an envvar would require changing each job in one place.
3- Define the envvars in the job configuration tab and pass them during docker run using the -e flag (also sketched below). Cons: every new job would have to set the envvars manually, and adding an envvar would require changing each job in two places.
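For illustration, here is roughly what options 1 and 3 look like on the command line (the image and file names are made up):

    # option 1: read NAME=value lines from a file
    docker run --env-file ./app.env my-image

    # option 3: pass each variable explicitly
    docker run -e DB_HOST="$DB_HOST" -e DB_PORT="$DB_PORT" my-image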
Is there any other way?
There's another option: write a small shell script that passes along to docker all environment variables that start with a certain prefix.
For example, you could make it turn DOCKER_A=a and DOCKER_B=b into -e A=a -e B=b, and call it as
docker $(./munge_env_vars) ...
Then you can put all your environment variables in one place (either in the GoCD config, or in a shell script under version control that you can source), and there's no need to modify two places when you add another env variable.
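A minimal sketch of such a script (the name munge_env_vars and the DOCKER_ prefix are just the examples from above; values containing spaces would need extra quoting):

    #!/bin/sh
    # Print "-e NAME=value" for every environment variable whose
    # name starts with DOCKER_, with the prefix stripped off.
    env | while IFS='=' read -r name value; do
      case "$name" in
        DOCKER_*) printf ' -e %s=%s' "${name#DOCKER_}" "$value" ;;
      esac
    done

With DOCKER_A=a and DOCKER_B=b set, $(./munge_env_vars) expands to -e A=a -e B=b.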
Related
If there is a Jenkins shell script build step, environment variables are set so that, for example, echoing $WORKSPACE shows the current working directory. I am writing a custom plugin that can also execute shell commands, but WORKSPACE is not set when this plugin executes.
I can see that Jenkins might set all those env variables only just prior to executing the shell commands the Jenkins project specifies, so that they would not already be set when a custom plugin executes. If that is the case, it would seem there is no way to access those env variables.
But if there is a way to obtain the values of those env variables, that would be useful.
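For reference, this is the kind of shell build step being described; WORKSPACE and BUILD_NUMBER are among the variables Jenkins injects before such a step runs:

    # Executed as an "Execute shell" build step; these variables
    # are already set by Jenkins when the script starts.
    echo "workspace: $WORKSPACE"
    echo "build number: $BUILD_NUMBER"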
I am using Bamboo to build and deploy my Docker container. My code uses environment variables. I am using a shell script to set the values of those variables, with the values hardcoded in the .sh file. Ideally, I would like the values of those environment variables to be passed through Bamboo variables. One option is to generate a shell script during the Bamboo build plan and call that shell script from the startup file. Is there a better option for setting system environment variables using Bamboo variables?
When adding the Docker task in the Plan configuration, you have the option to pass environment variables.
For example, if your Dockerfile has an ENV variable test_db_pass, you should put the following in the Docker task's "Container environment variables" field: test_db_pass=${bamboo.test_db_pass}
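As a sketch, the two sides of that mapping would look like this (test_db_pass is the example name from above; the default value is made up):

    # Dockerfile
    ENV test_db_pass=changeme

    # Bamboo Docker task, "Container environment variables" field
    test_db_pass=${bamboo.test_db_pass}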
It is possible to define either plan or global variables in Bamboo.
You can then use them in your build.
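For example, Bamboo exposes plan variables to Script tasks as environment variables with a bamboo_ prefix, so a plan variable named my_var (a made-up name) can be read like this:

    # inside a Bamboo Script task
    echo "deploying with $bamboo_my_var"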
It's in the documentation:
https://confluence.atlassian.com/bamboo/defining-plan-variables-289276859.html
I want to run a shell command in a sh step that depends on some environment variables. Now I know that there is the environment directive. If I declare all the environment variables there, the command works just fine.
However, I have several pipeline scripts that all run this command with the same environment variables. So instead of declaring these variables in each script, I want to declare them only once. I tried setting environment variables for the jenkins user, but didn't succeed. Finally, I declared them system-wide (in /etc/environment), only to find that they're still not present for the command run in the shell step.
I conclude from this that Jenkins runs shell-step commands in a clean environment, ignoring variables I may have declared. My question is: how can I set environment variables for the shell step across all pipeline scripts in Jenkins?
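One workaround, as a sketch (the file path is made up): keep the declarations in a single file and source it at the start of each sh step, so there is still only one place to edit:

    # shared-env.sh, kept in one place on the agent or in version control
    export MY_VAR_1=value1
    export MY_VAR_2=value2

    # at the top of the sh step in each pipeline
    . /var/jenkins_home/shared-env.sh
    my_command_that_needs_the_vars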
I wonder if this is already part of the system...
I need the current GitLab user id and email ($GITLAB_USER_ID, $GITLAB_USER_EMAIL) injected into the execution of the Docker image (to later configure the git repository).
Is there a magic way to do this, or should I explicitly write the export commands into my .gitlab-ci.yml file (in a before_script, for example)?
Thanks.
I got my answer by trying the env command in a build.
So yes, all job variables are available in the Docker execution environment.
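For example, a job's script section can verify and use them directly, with no export needed; GITLAB_USER_ID and GITLAB_USER_EMAIL are predefined CI variables:

    # run inside the job's script section
    env | grep '^GITLAB_USER'
    git config --global user.email "$GITLAB_USER_EMAIL"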
I am trying to set up Jenkins to continually check out and build code and verify that the code is compilable.
Our build system works like this: we have several different .bat files that set up environment variables for different build configurations, and then we execute gmake to actually build the code.
When I created the Jenkins job, in the Build part of the job I set up two "Execute Windows batch command" steps: one that calls the script to set up the env variables, and one that runs gmake to build the code.
The problem is that when the gmake step runs, all the environment variables are forgotten. How can I prevent the env variables from being cleared?
Tx
What if you set it up to call only one bat file instead? That one file can then call the two you're currently calling with Jenkins.
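The variables disappear because each "Execute Windows batch command" step runs in its own cmd.exe process, so variables set in one step don't survive into the next. A sketch of such a wrapper (both script names are placeholders):

    rem build_all.bat -- run as a single Jenkins batch step.
    rem "call" runs the script in the same cmd session, so the
    rem variables it sets are still visible when gmake runs.
    call setup_env.bat
    gmake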