VSCode combine remote ssh and remote containers - docker

On my office desktop machine, I'm running a Docker container that accesses the GPU. Since I'm working from home, I'm connected to my office desktop over SSH in VS Code via the Remote-SSH plugin, which works really well. However, I would like to additionally connect via Remote-Containers to that running container in order to debug the code running inside it. I haven't managed to get this working yet.
Does anyone know whether this is possible at all and, if so, how to get it done?

Install and activate an SSH server in the container.
Expose the SSH port via Docker.
Create a user with a home directory and password in the container.
(Install the Remote-SSH extension for VS Code and) set up the SSH connection within the Remote extension in VS Code, adding a config entry:
Host <host>-docker
Hostname your.host.name
User userIdContainer
Port exposedSshPortInContainer
Connect in VS Code.
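A minimal sketch of those steps, assuming a Debian/Ubuntu-based image (the user devuser, the password, the image name, and the published host port 2222 are placeholders; adapt to your image and GPU setup):

# inside the running container (e.g. via docker exec -it <container> bash): install sshd and create a user
apt-get update && apt-get install -y openssh-server
useradd -m -s /bin/bash devuser && echo 'devuser:changeme' | chpasswd
service ssh start

# on the office desktop, when starting a fresh container, publish the SSH port
docker run -d --gpus all -p 2222:22 my-gpu-image

With this, the Port in the SSH config entry above would be 2222 (the published host port) and Hostname would be the office desktop's address.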
Note: answer provided by the OP in the question section.

Related

How to connect a Dev Container to another Container?

For this question I'm working with Prisma's dev container: https://github.com/prisma/prisma/tree/main/.devcontainer
Once I open that repo inside a container using the Remote-Containers plugin in Visual Studio Code and run some Jest tests that rely on the Docker services defined in the https://github.com/prisma/prisma/tree/main/docker folder, I get a "can't connect to database" error for all databases...
It's as if the dev container had no idea those services exist... On my PC, looking at Docker Desktop, I see the services up and running, but the dev container can't... Why?
I find it weird that I would have to change any settings, since these files are from the Prisma repo; they are supposed to be ready for action once downloaded... right?
Assuming the Docker network driver is bridge (the default):
If the script is running a line like the following to read the connection string from the environment in your dev container:
const connectionString = (
  process.env.TEST_MYSQL_URI_MIGRATE || 'mysql://root:root@localhost:3306/tests-migrate'
).replace('tests-migrate', 'tests-migrate-dev')
The localhost in the connection string refers to localhost inside your dev container, not on your host machine.
You should access the localhost of your host machine instead.
The fix is to set the TEST_MYSQL_URI_MIGRATE environment variable instead, for example:
TEST_MYSQL_URI_MIGRATE=mysql://root:root@host.docker.internal:3306/tests-migrate
For details on how to access the localhost of the host machine, please read this question.
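One way to set it, sketched here on the assumption that the tests simply read process.env (the value matches the fix above; containerEnv is the standard devcontainer.json mechanism for container-wide environment variables):

// .devcontainer/devcontainer.json
{
  "containerEnv": {
    "TEST_MYSQL_URI_MIGRATE": "mysql://root:root@host.docker.internal:3306/tests-migrate"
  }
}

Alternatively, export the variable in the dev container's terminal before running the Jest tests.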

VS Code: connect a docker container in a remote server

I want to work in a container in a remote server.
But it doesn't work.
Environment:
Local: Windows 10
Local Terminal for ssh: WSL in Windows 10
Server: Ubuntu 18.04
I checked these two articles.
https://code.visualstudio.com/docs/remote/containers-advanced
https://code.visualstudio.com/docs/containers/ssh
I followed these steps.
I installed the [Remote Development] extension in VS Code.
Remote-SSH: Connect to Host. It works fine.
I installed the [Docker] extension on the remote server.
Now I can see my containers and images in the Docker tab.
I clicked one container, clicked [Attach Visual Studio Code], and it says "There are no running containers to attach to."
I resolved this problem by switching to the remote server's Docker context on my local machine:
docker context create some-context-label --docker "host=ssh://user@remote_server_ip"
docker context use some-context-label
docker ps
# A list of remote containers on my local machine! It works!
After that:
Connect via Remote-SSH to the container server
Right-click the relevant container -> "Attach Visual Studio Code"
That works for me.
(Note: One would think that I should be able to just use my local VSCode (skipping step 1) to connect to said remote container after switching my local context, but VSCode complains "Failed to connect. Is Docker running?" in the Docker pane.)
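Two small usage notes, not from the original answer: the ssh:// context only works if plain ssh user@remote_server_ip already succeeds non-interactively (e.g. via keys loaded in your ssh-agent), and you can switch back to your local daemon at any time with:

docker context use default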
I solved this issue using SSH tunneling, following the steps found in https://florian-kriegel.de/blog/?p=234
Summarizing:
Set (or add) "docker.host": "tcp://localhost:23750" in settings.json in VSCode.
Open an SSH tunnel like this on your local machine, replacing the user and hostname with those of the remote machine (where the Docker daemon is running):
ssh -NL localhost:23750:/var/run/docker.sock user@hostname
Now, in the Docker tab, you will be able to see and attach to containers on the remote machine.
Note that the Remote-SSH extension is not used in this case.
This might sound very strange, but for me, I had to open a folder on the remote SSH server prior to using the Remote Containers extension in VS Code. If I didn't do that, then it would constantly try to find the docker service running locally, even though the terminal tab was connected to the remote SSH server.
This seems very weird, because if you're connected via SSH in VS Code, the extension should assume you're trying to attach to the container on the remote server. You shouldn't have to open a remote folder first.
By "opening a folder" on the remote server, the Remote Containers extension was then able to attach VS Code to the container running on the remote SSH server. I didn't have to do any of the steps in any of those articles. Just use Remote SSH to connect VS Code remotely, open a folder, and then use Remote Containers.
Solution using the "Remote SSH" and "Remote Explorer" extensions in Visual Studio Code.
Following the steps above (https://stackoverflow.com/a/61728799/11687201), I figured out how to make use of the Remote SSH and Remote Explorer extensions. The first step is the same as above:
Open the settings.json file in VSCode: press F1, select ">Preferences: Open Settings (JSON)", and add/edit the following line: "docker.host": "tcp://localhost:23750"
Open the SSH config file: click on the "Remote Explorer" extension, then click the "Configure" button under "SSH Targets".
Add the following line to your SSH connection entry:
LocalForward localhost:23750 /var/run/docker.sock
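For illustration, the complete entry in ~/.ssh/config might then look roughly like this (host alias, hostname, and user are placeholders):

Host my-remote-docker-host
    HostName remote.example.com
    User myuser
    LocalForward localhost:23750 /var/run/docker.sock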
Remark: Previously I used the solution described earlier in this thread (https://stackoverflow.com/a/61728799/11687201). I had to reboot both the local and the remote machine before the solution described below worked.
Afterwards, I have to use multiple VSCode windows:
Local Machine: Start VSCode and use the "Remote Explorer" to connect to the remote machine using a new VSCode window
VSCode window connected to the remote machine (SSH)
→ start up the Docker container of your choice
(I was not able to use "Attach Visual Studio Code" from this VSCode window)
VSCode window connected to the local machine
→ Click on the "Docker" extension; the Docker containers running on the remote machine are listed. Attach VSCode to a running container using one of the following options:
Right-click on the desired container and choose "Attach Visual Studio Code"
Press F1, choose ">Remote-Containers: Attach to Running Container...", and select the container of your choice
A third VSCode window will open, attached to the Docker container.
Pros and cons of this solution
(+) Using the "Remote Explorer" extension I can directly connect and open a previously used project folder on my remote machine with one click
(-) 3 VSCode windows (local machine, remote ssh and remote container) are needed instead of 2 VSCode windows
Do you see an error message like the following?
Failed to connect. Is Docker running?
Error: connect EACCES /var/run/docker.sock
It's because VSCode uses /var/run/docker.sock on the remote host to communicate with the Docker service.
There are two methods.
Method 1 (secure; requires a reboot or logging out): After executing following code of dockerode npm getting error "connect EACCES /var/run/docker.sock" on ubuntu 14.04
Method 2 (instant effect; use it only if you're not dealing with a production server):
Run the following command in the SSH console:
sudo chmod o+rw /var/run/docker.sock
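Method 1 usually boils down to adding your user to the docker group on the remote host; a sketch (the change only takes effect after logging out and back in, or rebooting):

# on the remote host
sudo groupadd docker          # only if the group does not exist yet
sudo usermod -aG docker $USER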
For some reason, this problem is fixed for me when I open a folder in the remote window before trying to attach to a container.
I found Daniel's answer really helpful, but it didn't work for me as-is, so here are my two cents.
TL;DR
Create a new Docker context for the remote machine where the remote container is running:
docker context create some-context-label --docker "host=ssh://user@remote_server_ip"
docker context use some-context-label
Just open VS Code, go to the Docker tab (you should have the extension installed), and you'll see all the running containers from the remote context you just created.
Right-click on the desired container and choose "Attach Visual Studio Code".
You can also use the Remote Explorer tab; just select Containers from the dropdown at the top left.
Why not SSH into the remote host
When attaching Visual Studio Code to a container, you can check the logs by clicking the notification "Setting up Remote-Containers (show log)" at the bottom left. There, you can see that:
...
[26154 ms] Start: Run: ssh some-remote-host /bin/sh
[26160 ms] Start: Run in host: id -un
Here, my guess is that it's trying to SSH to the remote host from itself, since we already connected via Remote-SSH.
If you can reach the remote node running the Docker engine via SSH, why do you need yet another SSH server inside the container? From the host running your container, it is possible and safe to use a TTY, i.e. attach.
I don't think it is a good idea to run SSHD inside the container, although it is possible. To be useful, SSHD has to listen on a non-conflicting port in every container. Otherwise, two containers that happen to expose the same port on the same node will conflict, like any other services running on the same node.
Of course, ports can be randomized using the -P option, but that is not very convenient. It is also less convenient to manage keys and users at the container level than at the host level, where all the machinery is provided by the host software.
Loading every container with SSHD also increases the container size. In Kubernetes, every container is reachable without any SSHD running inside it via the path Pod -> Container, because the Pod has an IP and containers are attachable by ID, just as with Docker host -> container.
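For example, a plain shell attached from the host is usually all you need; a generic sketch (the container name is a placeholder):

ssh user@docker-host
docker exec -it my-container /bin/sh    # or /bin/bash if the image has it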
Step 1 - Docker daemon on the remote machine
Make sure your remote Docker daemon can accept connections from your host.
For testing purposes, I use the following setup on the remote machine to force the Docker daemon to listen on port 4243 on all IPs; beware, this is not secure.
There is no support for reading a file from /etc/sysconfig or elsewhere to modify the command line. Fortunately, systemd gives us the tools we need to change this behavior.
The simplest solution is probably to create the file /etc/systemd/system/docker.service.d/docker-external.conf (the exact filename doesn't matter; it just needs to end with .conf) with the following contents:
[Service]
ExecStart=
ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:4243 -H unix:///var/run/docker.sock
And then:
systemctl daemon-reload
systemctl restart docker
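To sanity-check that the daemon now answers on that port (not part of the original steps; the hostname is a placeholder):

curl http://remote-host:4243/version
# or, using the Docker CLI from another machine
docker -H tcp://remote-host:4243 ps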
Step 3 - Opening Docker Ports Using FirewallD
firewall-cmd --permanent --zone=public --change-interface=docker0
firewall-cmd --permanent --zone=public --add-port=4243/tcp
firewall-cmd --reload
Step 4 - Set (or add) "docker.host": "tcp://localhost:4243" in settings.json in VSCode.
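For reference, the corresponding settings.json entry would look something like this (localhost assumes VS Code is already connected to the remote machine via Remote-SSH or that the port is tunnelled; otherwise use the remote host's address):

{
    "docker.host": "tcp://localhost:4243"
}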

How to SSH in to different containers in Multi Container Azure App Service

I want to SSH into my containers created in an Azure App Service. These are Linux-based containers, and I used Docker Compose to deploy them to Azure App Service.
I have followed the article to enable SSH. For one of the containers (Container A) I am able to SSH (exposed ports 2222 and 80 for this). But I would like to SSH into other containers (Container B) too. I have exposed another port, 2223, for Container B and followed the same steps in that document. When I try to access it using the command ssh root@172.xx.x.x -p 2223, I get the error Connection Refused. But the command ssh root@172.xx.x.x -p 2222 works for Container A, and I am able to see the dotnet process for the API running in Container A when I run the top command.
This is still not supported as of 2020. For further info, you can also check the GitHub repo for App Service.
https://github.com/Azure/app-service-linux-docs
For SSH in a multi-container App Service, you can't select a specific container to SSH into; it's not possible as of today. The front-facing container is always the one picked for SSH, which is usually the container on port 80, or, if none, the first container in the YAML file.
https://github.com/Azure/app-service-linux-docs/blob/master/how_multicontainer_webapp_determine_web_container.md
https://feedback.azure.com/forums/169385-web-apps/suggestions/34743265-support-ssh-to-specific-container-in-multi-contain
This is the thread from the product owner in late 2018. It seems they still haven't approved that functionality.
After further research, I found out that we cannot SSH into a specific container in a multi-container setup. Currently, Azure supports SSH to the public-facing container only. Based on this link, it is planned by the Azure App Service team!

Trying to get Xdebug session initiated in a docker inside a VM to reach my remote computer

I have a Docker container running my PHP app.
This container needs to run inside a VM in a remote datacenter.
I work from a computer that can connect to the mentioned VM.
My intention is to have the Xdebug session that is initiated inside the docker reach my computer (more precisely my PHPStorm).
Both the container and the VM are running CentOS (company-approved/installed images).
The development computer runs macOS.
I am able to use ssh remote forward (aka: tunnel) to forward any requests from the VM to my computer.
I want to either:
- be able to open a tunnel from my computer directly to the Docker container in the VM
- or be able to extend the current tunnel from the VM into the container.
I have found no way to do the first and have run into a lot of issues trying to do the second.
Any suggestions?
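For context, the remote forward mentioned above typically looks something like this (a sketch only; port 9003 assumes Xdebug 3 defaults, and the hostname is a placeholder):

# run on the development computer: connections to port 9003 on the VM
# are forwarded back to PHPStorm listening on 9003 locally
ssh -R 9003:localhost:9003 user@remote-vm

The open problem in the question is then making that forwarded port on the VM reachable from inside the container as well.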

Debugging in and deploying to containers on a remote server in IntelliJ IDEA

IntelliJ IDEA (and PyCharm, among others) supports remote deployment, debugging, and execution via "Tools → Deployment". This also allows running a remote SDK, so the workflow is identical to local development.
This works until development is containerized. In this case, you have to execute (run or debug) inside a container on a remote server.
For Docker containers:
Deployment is simple: Set up SFTP to the remote server and automatically upload files there. Files are stored in folders. Folders are attached to Docker containers as volumes. Restart the app inside the container.
Setting up a remote SDK is less clear, because this SDK lives inside the remote container. IntelliJ IDEA has a Docker plugin that supports remote SDKs from Docker containers:
I guess I should set up a new Docker server by connecting IDEA to the remote Docker daemon via TCP socket.
Several sources explain how to configure the remote API at various stages:
Put Docker on a network socket: How do I enable the remote API for dockerd
Protect the socket: Protect the Docker daemon socket
Open it to the external world: How to open a specific port such as 9090 in Google Compute Engine
Add the server:socket to the new SDK configuration in the picture above.
Where can I get a more detailed guide on connecting IDEA to a remote Docker daemon? For example, where do I get the certificate, what ports should I open on the remote machine, and how do I set it up securely if the remote server is an AWS/GCP machine?
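For illustration, the "Protect the Docker daemon socket" guide referenced above amounts to starting the daemon with TLS enabled, roughly like this (certificate paths are placeholders and have to be generated beforehand):

dockerd \
    --tlsverify \
    --tlscacert=/etc/docker/ca.pem \
    --tlscert=/etc/docker/server-cert.pem \
    --tlskey=/etc/docker/server-key.pem \
    -H tcp://0.0.0.0:2376 \
    -H unix:///var/run/docker.sock

IDEA would then be pointed at tcp://<server>:2376, with the client CA and certificates configured in its Docker connection settings.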

Resources