I'm trying to set up VSCode so I can work on a project that resides inside a Docker container. There's a recently published extension, Remote Development, which seems to enable just that.
I followed the detailed official instructions on creating .devcontainer/devcontainer.json and setting up the remote by running Remote-Containers: Reopen Folder in Container. However, even with the official/provided containers and settings, I get the error:
Setting up container for folder: /home/ilijas/<path_to>/workspace
Error: (HTTP code 500) server error - linux spec user: unable to find user ilijas: no matching entries in passwd file
at /home/ilijas/.vscode-insiders/extensions/ms-vscode-remote.remote-containers-0.53.0/dist/extension.js:1:151013
at /home/ilijas/.vscode-insiders/extensions/ms-vscode-remote.remote-containers-0.53.0/dist/extension.js:1:150976
at m.buildPayload (/home/ilijas/.vscode-insiders/extensions/ms-vscode-remote.remote-containers-0.53.0/dist/extension.js:1:150986)
at IncomingMessage.<anonymous> (/home/ilijas/.vscode-insiders/extensions/ms-vscode-remote.remote-containers-0.53.0/dist/extension.js:1:150486)
at IncomingMessage.emit (events.js:187:15)
at endReadableNT (_stream_readable.js:1090:12)
at process._tickCallback (internal/process/next_tick.js:63:19)
In my first attempts I tried to mount a local workspace to the remote one; however, since I couldn't resolve this user-not-found error, I removed all of the user-related arguments from the Docker settings, just to get one dummy container working. I had no success. I know this is a fresh extension, but still, I hope someone can help.
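For completeness, my .devcontainer/devcontainer.json is essentially the sample from the docs; roughly the following, where the image and extension are placeholders rather than my exact setup:

{
    "name": "Sample",
    "image": "node:10",
    "extensions": ["dbaeumer.vscode-eslint"]
}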
Essentially, removing all the previous docker containers solved the issue.
Reference GitHub issue:
The container has a label with the folder as the value, so it can be found again. When you close the window, the container is only stopped, not removed, for later use. (You could have some changes inside the container you want to keep. Also: Reusing an existing container is slightly faster.)
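In practice that means something like the following from a host terminal (docker container prune removes all stopped containers, so use docker rm <id> instead if you want to keep any of them):

docker ps -a              # lists the stopped containers left over from earlier sessions
docker container prune    # removes all stopped containers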
Related
I ran docker container prune, but after attaching VSCode to my new container, I still see the old container in the Explorer pane in VSCode, and I do not see the new container there.
And if I try to open a new terminal (Terminal > New Terminal), I see:
The terminal process failed to launch: Starting directory (cwd)
"/path/inside/my/old/container" does not exist.
Whenever I create a new container based on the same image (drupal:latest), VSCode tries to open the old container based on this image, even if I give the new container a different name.
VSCode also sometimes shows the error "Workspace does not exist" when I attach it to the new container.
After attaching to the new container, I needed to do...
File > Open Folder
And then, in the Open Folder UI, I needed to edit the old path there, which still referred to the old container, and then click OK.
For example, in the new container the path /opt exists, so I entered that path and clicked OK. This causes the Explorer to refresh; the old container disappears, and now I can access the new container. I can also open the terminal now.
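To double-check from the host which container is actually running and whether the path you plan to open exists in it, something like this works (the container name is a placeholder):

docker ps                                      # confirm the new container is up
docker exec -it <new-container-name> ls /opt   # verify the folder you intend to open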
I succeeded in connecting to a remote server configured with Docker through VSCode. However, the Remote Explorer in VSCode fetched the list of containers from the past: looking at that list, they are clearly containers made from images I downloaded a few days ago. I don't know why this is happening.
I am new to coding in containers using Docker. I am trying to open a container inside VSCode in order to work in that environment. However, every time I open the container, it errors out.
I am using VSCode's Docker extension to open the Dockerfile in the cloned repository.
After trying to launch the file in the container, it errors out and gives me this error code:
I tried rebuilding the container to see if that would work, but it still gives me the same error.
I am really new to working with containers, so can anyone help me?
I'm using a container for a Tensorflow-GPU environment to avoid the hassle of setting one up manually, and I was following this guide: https://code.visualstudio.com/docs/remote/containers
I've set up the container and installed the necessary extensions, and then I run the "Open Folder in Container" command. It works fine, but none of my files get linked to the new working area inside the container.
I felt like the guide was saying that I should get access to all my existing files and folders for the project inside the container.
Is this not how this works? What is the normal way of linking a project from the host system into the container?
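What I expected is something equivalent to a plain Docker bind mount, i.e. roughly the following, where the paths are just placeholders for my setup:

docker run --rm -it -v /path/to/my/project:/workspace tensorflow/tensorflow:latest-gpu bash
# inside the container, /workspace would then show the host project files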
EDIT: This is what I get when I open my container with none of my files present
If I run Docker (Docker for Desktop, 2.0.0.3 on Windows 10), then access to internal infrastructure and containers is fine. I can easily do
docker pull internal.registry:5005/container:latest
But once I enable Kubernetes there, I completely lose access to the internal infrastructure: [Errno 113] Host is unreachable appears in Kubernetes itself, or connect: no route to host from Docker.
I have tried several things, including switching the NAT from DockerNAT to Default Switch. That doesn't take effect without a restart, and a restart changes it back to DockerNAT, so no luck there. This option also seems not to work.
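For what it's worth, this is roughly how I reproduce the failure from inside the cluster (the registry address is the internal one from above, so treat it as a placeholder):

kubectl run nettest --rm -it --image=busybox --restart=Never -- sh
# then, inside the pod:
wget -qO- http://internal.registry:5005/v2/_catalog   # fails with "no route to host" once Kubernetes is enabled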
Let's start from the basics, from the official documentation:
Please make sure you meet all the prerequisites and that all the other instructions were followed.
You can also use this guide. It has more detailed information pointing to what might have gone wrong in your case.
If the above doesn't help, there are a few other things to consider:
In case you are using a virtual machine, make sure that the IP you are referring to is the one of the Docker engine's host and not the one on which the client is running.
Try adding tmpnginx in docker-compose.
Try deleting the pki directory in C:\ProgramData\DockerDesktop (first stop Docker, delete the directory, and then start Docker again; a sketch of the commands follows below). The directory will be recreated and the k8s-app=kube-dns labels should work fine.
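On my machine that boils down to roughly the following from an elevated command prompt; com.docker.service is the service name Docker Desktop registers here, so adjust it (or simply quit and restart Docker Desktop from the tray) if yours differs:

net stop com.docker.service
rmdir /s /q C:\ProgramData\DockerDesktop\pki
net start com.docker.service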
Please let me know if that helped.
I am trying to run Hyperledger's BYFN tutorial on a Win10 Home machine using Docker Toolbox, with VirtualBox 5.2.4. I am using the default image for the VirtualBox VM.
I have set up a shared folder (not in C:/Users, but on my other drive) and it seems to be functioning correctly: changes I make from either Windows or the docker-machine are reflected in both places as intended. I successfully generate the network artifacts using "./byfn -m generate", but I get an error when trying to bring it up with "./byfn up".
What happens is that, as far as I can see from the logs, all the containers get brought up correctly, but for some reason the volumes of the cli container are not attached correctly (I think). When byfn.sh finishes I get the following error:
When I ssh into the cli container, I can see the channel-artifacts, crypto, and scripts folders, but their contents don't seem to match the volumes: section of the docker-compose file. First, the scripts folder is empty (whereas the docker-compose file specifies that a bunch of files should be mounted there), so I get the above error. Second, channel-artifacts contains only one directory named genesis.block, which should actually be a file. And in the crypto folder there are just a bunch of directories.
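For reference, this is roughly how I compared what the cli container actually has mounted with what the compose file asks for; the in-container path is the one from the compose file in my checkout, and the shared-folder path is a placeholder:

docker inspect cli --format '{{json .Mounts}}'                                    # shows the host Source behind each mount
docker exec cli ls /opt/gopath/src/github.com/hyperledger/fabric/peer/scripts    # empty in my case
docker-machine ssh default ls /path/to/shared/folder/scripts                     # the files are present on the VM side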
As you might have guessed, I'm pretty new at docker, so this might be intended behavior, but I'm still getting an error.
Please let me know if I can provide additional information. Thanks in advance.