What does the USER keyword mean in Ubuntu - docker

I just recently started using the Windows Subsystem for Linux. I was trying to install Angular and ran into an error. I found a potential solution, but I don't understand part of it. In the script below, what do the keywords USER, ENV, and RUN mean, and what are they called? I tried running "USER node" and I got an error.
USER node
RUN mkdir /home/node/.npm-global
ENV PATH=/home/node/.npm-global/bin:$PATH
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
RUN npm install -g @angular/cli
Here is the entire answer in case you need more context https://github.com/angular/angular-cli/issues/7389

USER sets the username to use when executing the commands that follow later in the Dockerfile. See the Dockerfile docs.

That is not a script. Those directives have no meaning in Ubuntu.
That is a Dockerfile. It is used by Docker to build images.
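To make that concrete, here is a minimal sketch of how those instructions might sit inside a complete Dockerfile and how the file would then be used. The node:18 base image and the image name are assumptions for illustration, not part of the linked answer.
# In a Dockerfile (base image chosen only as an example)
FROM node:18
USER node
RUN mkdir /home/node/.npm-global
ENV NPM_CONFIG_PREFIX=/home/node/.npm-global
ENV PATH=/home/node/.npm-global/bin:$PATH
RUN npm install -g @angular/cli

# From the command line: build an image from the Dockerfile, then start a container
docker build -t my-angular-image .
docker run -it my-angular-image bash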

Related

How do I run a script file in Windows?

I am trying to build Pyodide from source on Windows. In their documentation they recommend using Docker. From the documentation:
1. Install Docker
2. From a git checkout of Pyodide, run ./run_docker or ./run_docker --pre-built
3. Run make to build.
I don't understand how to run ./run_docker.
I don't even know exactly what the file is. Is it a shell script?
Combining your question, "How do I run a script file in Windows?", with the information provided (you want to run a file called run_docker from the Pyodide project), you should get started by installing the Windows Subsystem for Linux version 2 (WSL). After you install WSL, open a command prompt and run bash to enter the Ubuntu Linux distribution. From there, follow the steps for building on Linux. When you run into a problem, you can search the internet for solutions related to "Linux" or "Ubuntu".
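Once you are inside WSL, the steps from the Pyodide documentation become ordinary Linux shell commands. A rough sketch, assuming a fresh checkout (the clone URL and directory are placeholders; adjust them to your setup):
# Inside the WSL/Ubuntu shell
git clone https://github.com/pyodide/pyodide.git
cd pyodide
./run_docker    # run_docker is a shell script in the repository; it starts the build container
make            # inside the container, build Pyodide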

Docker RUN instruction to source bash profile

I run some installation scripts via Docker; they change ~/.bashrc, but then I need to source it to use the installed commands in the RUN instructions below.
Tried the obvious RUN . ~/.bashrc and got a /bin/sh: 13: /root/.bashrc: shopt: not found error.
Tried RUN . ~/.profile and got mesg: ttyname failed: Inappropriate ioctl for device.
I do not want to use ENV instructions. The point of having external installation scripts is to use them in non-Docker environments, for example when running unit tests locally. ENV instructions would duplicate environment setup that is already done in the installation scripts.
You should not try to set up shell dotfiles in Docker. Many typical paths do not run them at all; for example
# In a Dockerfile
CMD ["some", "command", "here"]
# From the command line
docker run myimage some command here
The Docker environment is fundamentally different from a standalone Linux system. Beyond shell dotfiles not being read, a "home directory" isn't really a Docker concept, and if you have a multi-part process, on Docker it's standard to run each part in a separate container, whereas on standalone Linux you could use the init system to keep all of the parts running together. If you're expecting things to work exactly the same with exactly the same installation scripts, a virtual machine would be a better technological match for what you're attempting.
("Inappropriate ioctl for device" also suggests that there are things in the dotfiles that strongly expect to be run from an actual terminal, which you don't necessarily have at docker build time.)
My generic advice here is:
If possible, install things in the "system" directories within the image and avoid needing custom environment variable settings. (Don't use a version manager like nvm or rvm; don't use a Python virtual environment.)
If you do have to set environment variables, ENV is the way to do it.
If you really can't do either of the above, you can set environment variables in an ENTRYPOINT script before launching the main process (see the sketch after this list); but if it's important to you that variables show up in docker inspect or docker exec shells, they won't be set there.
(Also remember that each RUN command launches a new container with a totally new shell environment. You can RUN . .profile; foo, but the environment variable settings won't carry through to the next RUN line.)
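For the third option, a minimal sketch of an ENTRYPOINT wrapper, assuming a hypothetical entrypoint.sh and installation script path:
#!/bin/sh
# entrypoint.sh (hypothetical name): source the project's installation script so its
# environment variables exist in this shell, then hand off to the container's main command.
. /opt/myapp/install-env.sh
exec "$@"

# In the Dockerfile
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
CMD ["some", "command", "here"]
With this layout the variables exist for the main process at run time, but as noted above they will not show up in docker inspect.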

Apache PredictionIO - Docker run failed

I have been trying to follow the http://predictionio.apache.org/install/install-docker/ tutorial. I have successfully built the Docker image; however, when I try docker run I get the Can't open /etc/predictionio/pio-env.sh error.
docker build -t predictionio/pio pio
docker run -ti predictionio/pio
PS: If I comment out the last line, CMD ["sh", "/usr/bin/pio_run"], I can build and run the Docker image successfully. I can also open the file from the Docker bash.
I think you need to grant permissions to execute this file. Add the following line at the end of your Dockerfile:
RUN chmod +x pio_run.sh
Also, you might need to change CMD to ENTRYPOINT, like the following:
ENTRYPOINT ["sh","/usr/bin/pio_run.sh"]
Your output states you are running Windows. Did you use the default command prompt or the Docker terminal? I had the same error messages in the past on Windows, but they mysteriously disappeared after I tried the tutorial again. I am not sure what I did differently, except that I may have used the Docker terminal instead of the default command prompt...
Could you also try using docker-compose instead of plain docker commands, as described in the tutorial?
Ensure your storage (PostgreSQL, MySQL or Elasticsearch) is running before starting PIO as well.
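If the tutorial's checkout ships a docker-compose file, starting everything through Compose looks roughly like this (the directory layout is an assumption; check the tutorial for the exact file names):
# From the directory containing the tutorial's docker-compose.yml;
# Compose brings up the storage containers alongside PIO
docker-compose up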
Just resolved it on my machine.
When you cloned the repository on Windows, git converted the end-of-line symbols from Unix style (\n) to Windows style (\r\n).
You need to open the file C:\wherever-you-cloned-pio-repository\predictionio\docker\pio\pio_run and change the line endings back (e.g. using Visual Studio Code or Notepad++). Then you need to rebuild the image and it should work.
Also, for the future, you may want to disable the automatic conversion: Disable git EOL Conversions
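One way to do that is git's core.autocrlf setting; a minimal sketch, applied globally here, though it can also be set per repository:
# Tell git not to convert LF to CRLF on checkout (global setting shown as an example)
git config --global core.autocrlf false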

Dockerd with Chocolatey

I am using Chocolatey to install Docker.
When I originally ran the following command:
choco install docker
and then ran the docker --version command, everything went as expected:
Docker version 17.10.0-ce, build f4ffd25
When I try to run the dockerd command, however, it shows as not being part of my PATH:
'dockerd' is not recognized as an internal or external command,
Looking at the PATH variable and navigating to where Chocolatey stores the executables, dockerd.exe is not present while docker.exe is. Am I missing something in instructing Chocolatey to add dockerd?
The reason I need the dockerd executable is so that I can limit the number of concurrent downloads, as shown in the Docker documentation.
This is a decision that the package maintainer(s) for Docker have made. If you have a look here:
https://chocolatey.org/packages/docker#files
You will see that there is a dockerd.exe.ignore file. This file instructs Chocolatey explicitly not to create what is referred to as a shim file, which is what would make dockerd work from the command line in the same way docker does.
My best suggestion would be to reach out to the maintainers of that package to ask them why this was done, and to perhaps get it changed. You can do this by clicking on the Contact Maintainers link on this page:
https://chocolatey.org/packages/docker
As a workaround, you could add the following path to your Windows PATH environment variable:
C:\ProgramData\chocolatey\lib\docker\tools\docker
which would allow dockerd to be found.
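For a single command-prompt session, that workaround can be sketched like this; the path matches the one above, and the --max-concurrent-downloads value is only an example of the setting mentioned in the question:
rem From a Windows command prompt (affects the current session only)
set PATH=%PATH%;C:\ProgramData\chocolatey\lib\docker\tools\docker
rem dockerd supports limiting concurrent image downloads, e.g.:
dockerd --max-concurrent-downloads 3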

Docker run hello-world not working

I've installed Docker following exactly the documentation on the website, but when I try to run docker run hello-world, I get the following output from the terminal:
Saved file tree to doc-filelist.js
Copied JS to doc-script.js
Compiled CSS to doc-style.css
Does anybody have an idea what is going wrong?
Is it possible that you have docker.js installed locally?
The output from your command looks like the docker.js executable is being called instead of the Docker container engine.
The log messages that you showed can be found in the docker.js documentation, so it looks like you're running that instead of the one you want.
If this is the issue, run npm uninstall -g docker.
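Before uninstalling anything, you can check which executable "docker" actually resolves to (inside WSL or any Unix-like shell):
# See which "docker" is first on the PATH
which docker
# Check whether the npm "docker" package (docker.js) is installed globally
npm ls -g docker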
I was experiencing the same issue, and it turned out to be caused by running nvm (node version manager). When I used nvm to run Node, Docker was not recognized. However, when I removed nvm, Docker worked as expected. I'm sure there's a workaround to make nvm and Docker work together, and I'll look into that later so I can continue using both tools.
