I was running the following command:
docker run -e 'SOME_ENV_VARIABLE=somevalue' somecommand
And this did not pass the env variable to somecommand. (Note: SOME_ENV_VARIABLE and somevalue did not contain unescaped single quotes; they were uppercase snake case and lowercase respectively, with no special characters.)
Later, I tried:
docker run -e "SOME_ENV_VARIABLE=somevalue" somecommand
And this somehow worked (note the double quotes).
Why is this? I have seen documentation with single quotes.
I tried this on Windows.
Quoting is handled by your shell, and since you are on Windows, that is cmd.exe. From what I understand, cmd.exe does not treat single quotes as quoting characters, so it passes the argument to docker with the quotes still attached.
The single-quoted form would work on Linux, though (where the shell is usually bash). That is the environment most people run Docker in, which is why it shows up in a lot of documentation.
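In other words, something like the following sketch (reusing the names from the question, typed at a cmd.exe prompt):

REM cmd.exe leaves single quotes in place, so docker receives the literal
REM argument 'SOME_ENV_VARIABLE=somevalue' and the variable is not set as intended:
docker run -e 'SOME_ENV_VARIABLE=somevalue' somecommand

REM cmd.exe strips the double quotes, so docker receives SOME_ENV_VARIABLE=somevalue:
docker run -e "SOME_ENV_VARIABLE=somevalue" somecommand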
Related
I'm trying to create a script that starts a docker container and mounts a local folder that contains spaces in the name. I can get it to work fine when I run a *.bat file with the docker run command:
docker run -p 8081:8081 -v "C:\Test Folder With Blanks":/zzz myimage jupyter lab --notebook-dir=/zzz --ip=0.0.0.0 --port=8081 --allow-root
But when I try to do the same in a Powershell script file, I get an error:
$CMD = 'docker run -p 8081:8081 -v "C:\Test Folder With Blanks":/zzz myimage jupyter lab --notebook-dir=/zzz --ip=0.0.0.0 --port=8081 --allow-root'
Invoke-Expression $CMD
docker: invalid reference format.
See 'docker run --help'.
I'm on Win10 and running Powershell in Visual Studio Code IDE.
Thanks for ideas.
First, the obligatory warning: Unless you have a command line stored as a single string somewhere and you either fully control or trust the content of the string, Invoke-Expression should generally be avoided.
You're seeing an inconsistency in how PowerShell treats compound tokens composed of directly concatenated quoted and unquoted parts.
Specifically, the argument "C:\Test Folder With Blanks":/zzz is unexpectedly broken in two, i.e. passed as two arguments.
The workaround is to quote the entire argument, i.e.
"C:\Test Folder With Blanks:/zzz"
Note: I'm assuming that docker doesn't actually require partial quoting in its arguments, which it shouldn't; however, there are high-profile CLIs on Windows that do, notably msiexec.
Alternatively, use an expression enclosed in (...) to compose your string; e.g.
("C:\Test Folder With Blanks" + ':/zzz')
There's no good reason to do so in this case, but it could be helpful if you need string interpolation in one part of your string ("..."), but not in another ('...').
General caveats:
Compared to cmd.exe and also POSIX-compatible shells such as bash, PowerShell has several additional metacharacters, notably # (at the start of a token), { / }, and ;. Therefore, you cannot always expect command lines written for these shells to work as-is in PowerShell.
As of PowerShell 7.2.2, passing arguments to external programs (such as docker) is fundamentally broken with respect to arguments that have embedded " characters and empty-string arguments - see this answer.
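To illustrate the first of these caveats: ; is a statement separator in PowerShell, so an argument containing it must be quoted (a small sketch in the style of the examples further below):

# ';' ends the PowerShell statement here, so cmd only receives 'echo a',
# and PowerShell then tries to run 'b' as a separate command:
PS> cmd /c echo a;b
# Quoting the token makes PowerShell pass a;b through as a single argument:
PS> cmd /c echo 'a;b'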
The general pattern of the inconsistency, as of PowerShell 7.2.2, is as follows:
If an argument:
starts with a quoted token - whether single- ('...') or double-quoted ("...") -
and has additional characters directly following the closing quote,
the first such character starts a separate argument.
E.g. "foo":bar / "foo"=bar / "foo"'bar' are passed as separate arguments foo and :bar / foo and =bar / foo and bar, respectively.
In other words:
You cannot compose a single string argument from a mix of quoted and unquoted / differently quoted tokens if the first token is quoted.
Conversely, it does work if the first token is unquoted, including an unquoted simple variable reference such as $HOME.
# OK: First token is unquoted.
PS> cmd /c echo foo"bar"'baz'last
foobarbazlast
# !! BROKEN: First token is quoted.
# !! Because each successive token is quoted too,
# !! each becomes its own argument.
PS> cmd /c echo 'foo'"bar"'baz'last
foo bar baz last
GitHub issue #6467 discusses this inconsistency; however, it has been closed, because the behavior is - surprisingly - considered by design.
This does not happen if the first token is unquoted; however, there are related bugs involving arguments that start with unquoted tokens and are similarly broken in two, because they look like named arguments to PowerShell (a PowerShell concept that doesn't apply when calling external programs):
GitHub issue #11646: an argument such as -foo=1,2 breaks parsing.
GitHub issue #6291: an argument such as -foo=bar.baz is broken in two at the (first) "." character.
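A quick sketch of the second of these, based purely on the behavior described in the issue (as of the affected versions):

# PowerShell breaks the token at the first '.', so the external program
# receives -foo=bar and .baz as two separate arguments:
PS> cmd /c echo -foo=bar.baz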
I have a variable in my jboss cli file, let's call it ${FOO}.
I can see that the variable is set in my Dockerfile, if I:
RUN echo ${FOO}
I see the value printed.
I have a foo.cli file which contains a line similar to:
/system-property=foo:add(value="${FOO}")
and when I run jboss-cli.sh foo.cli
I get:
Cannot resolve expression ${FOO}.
Is there a way to pass a variable from Docker to the file argument to jboss-cli.sh ?
I've tried removing the quotes around ${FOO} also in the system-property line but no luck.
This is due to the fact that the CLI does not treat values the same as names. In short, operation names, parameter names, header names and values, command names as well as command argument names are resolved automatically. Parameter and argument values aren’t. To fix this, we need to set resolve-parameter-values to true in jboss-cli.xml.
For example:
sed -i "s/<resolve-parameter-values>false<\/resolve-parameter-values>/\
<resolve-parameter-values>true<\/resolve-parameter-values>/" \
$JBOSS_HOME/bin/jboss-cli.xml
Note that the command above is meant to be a one-off solution. From the maintenance perspective, using sed for XML is not a good idea in general. If you’re automating this, consider using XPath instead.
Note: This functionality requires WildFly 8.0.0.CR1 or newer.
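In the Dockerfile context of the question, the same change could be applied while building the image, along these lines (a sketch assuming $JBOSS_HOME is already set in the image):

# Enable resolution of parameter/argument values in the CLI before running foo.cli:
RUN sed -i "s/<resolve-parameter-values>false<\/resolve-parameter-values>/<resolve-parameter-values>true<\/resolve-parameter-values>/" $JBOSS_HOME/bin/jboss-cli.xml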
You might need to set <resolve-parameter-values>true</resolve-parameter-values> in jboss-cli.xml:
$ sed -i "s/<resolve-parameter-values>false<\/resolve-parameter-values>/\
<resolve-parameter-values>true<\/resolve-parameter-values>/" \
$JBOSS_HOME/bin/jboss-cli.xml
src: https://mirocupak.com/using-environment-variables-in-jboss-cli/
These are good hints for solving the problem, but the sed command does not work this way on macOS, because the -i option is handled differently there. More information here:
sed command with -i option failing on Mac, but works on Linux
On Mac this worked for me:
sed -i "" "s/false</resolve-parameter-values>/true</resolve-parameter-values>/" $JBOSS_HOME/bin/jboss-cli.xml
Unfortunately this way of using sed does not work on RHEL for example.
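A variant that tends to work with both GNU sed (Linux/RHEL) and BSD sed (macOS) is to attach a backup suffix directly to -i, at the cost of leaving a backup file behind (a sketch; the .bak suffix is arbitrary):

sed -i.bak "s/false<\/resolve-parameter-values>/true<\/resolve-parameter-values>/" $JBOSS_HOME/bin/jboss-cli.xml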
I need to pass -Dfml.queryResult=confirm to the executable of an otherwise working docker container (on the docker run line), but all I get is: -Dfml.queryResult=confirm: invalid syntax ... I've tried with quotes, no quotes, single quotes, escaping it, everything I can think of, and I've put it behind -c as suggested here, but I always get an error. Am I missing something silly?
Let us say that I have an environment variable PO, with value 1. If I use the Linux echo command I get:
>echo $PO
1
However, if I use TCL and exec, I do not get interpolation:
>exec echo "\$PO"
$PO
Now, if I do something more elaborate, using regsub to replace every ${varname} with [ lindex [ array get env varname ] 1 ] and then applying subst, it works:
>subst [ regsub -all {\$\{(\S+?)\}} "\${PO}/1" "\[ lindex \[ array get env \\1 \] 1 \]" ]
1/1
I have some corner cases, sure. But why is the exec not giving back what the shell would do?
why is the exec not giving back what the shell would do?
Because exec is not a shell.
When you do echo $PO from a shell, echo is not responsible for resolving the value. It is the shell that converts $PO to the value 1 before calling echo; echo never sees $PO when it is invoked from the shell.
If you are trying to emulate what the shell does, then you need to do the same work as the shell (or invoke an actual shell to do the work for you).
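For example, if you just want the value from within Tcl, read it from Tcl's global env array instead of expecting echo to resolve it (a minimal sketch):

# Tcl substitutes $::env(PO) itself before exec runs, so echo receives the value 1.
exec echo $::env(PO)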
Tcl is a lot more careful about where it does interpolation than Unix shells normally are. It keeps environment variables out of the way so that you don't trip over them by accident, and does far less processing when it invokes a subprocess. This is totally by design!
As much as possible (with a few exceptions) Tcl passes the arguments to exec through to the subprocesses it creates. It also has standard mechanisms for quoting strings so that you can control exactly what substitutions happen before the arguments are actually passed to exec. This means that when you do:
exec echo "\$PO"
Tcl applies its normal substitution rules and ends up dispatching these exact words: exec, echo, and $PO. That calls into the exec command, which launches the echo program with the single argument $PO, and echo prints exactly that. (Shells usually substitute the value first.) If you'd instead done:
exec echo {$PO}
you would have got the same effect. Or even if you'd done:
exec {*}{echo $PO}
You still end up feeding the exact same characters into exec as its arguments. If you want to run the shell on it, you should explicitly ask for it:
exec /bin/sh -c {echo $PO}
The bit in the braces there is a full (small) shell script, and will be evaluated as such. And you could do this even:
exec /bin/sh -c {exec echo "$PO"}
It's a bit of a useless thing to do but it works.
You can also do whatever substitutions you want from your own code. My current favourite from Tcl 8.7 (in development) is this:
exec echo [regsub -all -command {\$(\w+)} "\$PO" {apply {{- name} {
global env
return $env($name)
}}}]
OK, total overkill for this but since you can use any old complex RE and script to do the substitutions, it's a major power tool. (You can do similar things with string map, regsub and subst in older Tcl, but that's quite a bit harder to do.) The sky and your imagination are the only limits.
Travis CI's user documentation has a section on how to escape secure environment variables. It seems not to work for the '$' symbol. Is there anything special that needs to be done for the '$' symbol?
I set up this example. In .travis.yml:
travis encrypt "FAKE_PASSWORD=H3llo\\#Worl\\$d" -a
In my script I echo the variable and get:
fake password is H3llo#Worl
It appears that $d is being replaced with nothing. How can I fix this?
The problem is that when running travis encrypt, the $ symbol needs to be escaped for that command as well as for the later use of the variable. With two backslashes (\\) only one backslash ends up in the variable, and $d is still expanded by bash. Using three backslashes fixes the issue.
travis encrypt "FAKE_PASSWORD=H3llo\\#Worl\\\$d" -a
\\ creates a single backslash and \$ creates a $ symbol that is not expanded by bash. When Travis runs the bash command to create the variable, it looks like:
FAKE_PASSWORD=H3llo\#Worl\$d
This is the form bash expects when a literal $ appears in a variable's value.
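As a quick local sanity check (a sketch using only bash, independent of Travis itself), you can reproduce the two rounds of processing:

# Round 1: what the double-quoted string given to 'travis encrypt' becomes.
printf '%s\n' "FAKE_PASSWORD=H3llo\\#Worl\\\$d"    # prints FAKE_PASSWORD=H3llo\#Worl\$d
# Round 2: what bash produces when that assignment is evaluated later.
FAKE_PASSWORD=H3llo\#Worl\$d
echo "$FAKE_PASSWORD"                              # prints H3llo#Worl$d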