Manage VSCode Remote Container as Docker-Compose Service

For the development of my Python project I have set up a Remote Development Container. The project uses, for example, MariaDB and RabbitMQ. Until recently I built and started the containers for those services outside of VSCode. A few days ago, I reworked the project using Docker Compose so that I can manage the other containers using the remarkable Docker extension (Id: ms-azuretools.vscode-docker, Version: 1.22.0). That is working fine, apart from one issue I cannot figure out:
I can start all containers using compose up; however, the Python project's Remote Development Container does not stay up. Currently, I open the project folder in a second VSCode window and use the "Reopen in Container" command.
It would be nicer if the Python project container stayed up so that I could simply use the "Attach Visual Studio Code" command from the Docker extension's Containers menu.
I am wondering if there is something I can add to .devcontainer.json or some other configuration file to realize this scenario.
Any help is much appreciated!
If it helps, I can post the docker-compose.yml, Dockerfiles, or the .devcontainer.json; please let me know what is required.
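A commonly used wiring for this scenario looks roughly as follows (a sketch; the service, image, and path names are placeholders, not taken from the project): override the Python service's command so the container stays up, and point .devcontainer.json at that compose service.

docker-compose.yml:

services:
  app:
    build: .
    command: sleep infinity    # no-op process keeps the dev container running
    volumes:
      - .:/workspace
  mariadb:
    image: mariadb
  rabbitmq:
    image: rabbitmq

.devcontainer.json:

{
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "shutdownAction": "none"
}

With "shutdownAction": "none", closing the VS Code window leaves the compose stack up, so the container stays available for "Attach Visual Studio Code".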

Related

"Building Image" task hangs in VS Code Dev Container when using a large directory

I'm using Visual Studio Code on a Windows machine. I'm trying to set up a Python Dev Container using a directory that contains a large set of CSV files (about 200GB). When I click to launch the remote container in Visual Studio Code, the application hangs with the message "Starting Dev Container (show log): Building image".
I've been looking through the docs, and having read the Advanced Container Configuration I've tried modifying the devcontainer.json file by adding workspaceMount and workspaceFolder entries:
"workspaceMount" : "source=//c/path/to/folder,target=/workspace,type=bind,consistency=delegated"
"workspaceFolder" : "/workspace"
But to no avail. Is there a solution to launching Dev Containers on Windows using folders which contain large files?
I had a slightly different problem, but the solution might help you or someone else. I was trying to run docker-compose inside a docker-in-docker image (provided by vscode). In my case, my container was able to start, but nothing inside the container was able to run.
To solve my issue, I updated vscode, and now there is a new option Remote-Containers: Clone Repository in Container Volume.... If your code is in a git repo, you can do this:
(The individual steps were illustrated with screenshots in the original answer.)
Follow the steps vscode presents and you should have your repository in the container as a volume. It reduced my build times from about 30 minutes to 3 minutes (within the running container), because I brought the data into the container after it was up and running.
Assuming the 200GB is ignored by your .gitignore, what you could try is to copy the 200GB worth of CSV files into the container once it has started. I thought this would help because I did a similar thing, bringing in all my node_modules after running the container.
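A sketch of that copy step, assuming a container named dev-container and an existing target directory /workspace/data (both placeholders):

# copy the CSV data into the running container after it is up
docker cp ./csv-data/. dev-container:/workspace/data

Since docker cp works against a running container, the 200GB never has to be part of the image build context or the initial bind mount.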

Open VS Code from inside a docker container

Is it possible to run code someFile.js from inside a docker container, and have it open in VS Code?
Why do I want to do this? Because vue dev tools allows you to open a vue component from within the browser. This is especially helpful for new devs that want to quickly track down components and open them in the editor.
Unfortunately, since my dev server is running inside a docker container, this functionality doesn't work: the editor is opened from within the dev server.
Might be worth noting, I'm using Visual Studio Code Remote - Containers.
So, to narrow the question further:
How can I launch VS Code from a docker container, so that vue dev tools can open the file in my local editor?
Yes, if you don't mind running your vue tools inside the docker container as well. You have to set up a .devcontainer.json file specifying the Dockerfile, image, or docker-compose file to use to build the container. It will create the container for you and automatically mount your project directory by default, but there are plenty of alternative configuration options as well.
This means you'd open VS Code and basically your whole IDE would be in the docker container. You could call vue tools from the VS Code terminal, including calls to code.
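A minimal .devcontainer.json for such a setup might look like this (the Dockerfile path and post-create command are placeholders):

{
  "name": "vue-app",
  "build": { "dockerfile": "Dockerfile" },
  "postCreateCommand": "npm install"
}

Once the folder is reopened in the container, the integrated terminal runs inside it, so calls to code there open files in the connected VS Code window.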
I've been doing this with some tensorflow stuff for the last 6 weeks or so. It was a little confusing at first, but now I really like it.
One challenge I've encountered so far is that if you are deploying your image as a deliverable, using a container as a dev environment can cause some dev-tool creep into the image (for example, tools in your Dockerfile that you need during development but don't want in the deployed image). There are probably good ways to deal with this, but I haven't explored them all yet.
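One option for limiting that creep (a sketch of a common technique, which I haven't fully explored) is a multi-stage Dockerfile, with the dev container targeting a tool-heavy stage while the deliverable is built from a lean one:

# shared base; source is bind-mounted in the dev container, copied in prod
FROM node:18 AS base
WORKDIR /app
COPY package*.json ./

FROM base AS dev
RUN npm install            # includes devDependencies, linters, debuggers

FROM base AS prod
RUN npm install --omit=dev # production dependencies only
COPY . .
CMD ["node", "server.js"]

The dev container can then be pointed at the dev stage, e.g. via "build": { "dockerfile": "Dockerfile", "target": "dev" } in devcontainer.json.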
Another note: I can't seem to find the docs, but I think the recommended way is to use WSL2-backed Docker, and then do all your docker mounting and docker client invocations from the WSL2 filesystem rather than from Windows. I guess that because WSL2 and Docker share the same VM, mounted file systems are faster between WSL2 and Docker than between Windows and Docker. This has worked well for me so far.
I've managed to adapt this dockerized version of VS Code to our restrictive runtime environment (Openshift), although it does assume connection to the internet, so extensions and Intellisense ML model had to be preinstalled:
https://hub.docker.com/r/codercom/code-server
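A typical invocation for that image looks something like the following (exact flags and the default port vary by image version; check the Docker Hub page above):

docker run -it -p 127.0.0.1:8080:8080 -v "$PWD:/home/coder/project" codercom/code-server

VS Code is then served in the browser at http://127.0.0.1:8080 with the mounted project opened in the editor.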

Best practice for spinning up container-based (development) environments

OCI containers are a convenient way to package a suitable toolchain for a project, so that development environments are consistent and new project members can start quickly by simply checking out the project and pulling the relevant containers.
Of course, I am not talking about projects that simply need a C++ compiler or Node.js. I am talking about projects that need specific compiler packages that don't work with anything newer than Fedora 22, projects with special tools that need to be installed manually into strange places, working on multiple projects whose tools are not co-installable, and the like. For these kinds of things it is easier to have a container than to follow twenty installation steps and then pray that the bits left over from the previous project don't break things for you.
However, starting a container with a compiler to build a project requires quite a few options on the docker (or podman) command line. Besides the image name, usually (see the sketch after this list):
- a mount of the project working directory
- a user id (because the container should access the mounted files as the user running it)
- if the tool needs access to some network resources, it might also need:
  - some credentials, via environment or otherwise
  - the ssh agent socket (mount and environment variable)
- if the build process involves building docker containers:
  - the docker socket (mount); buildah may work without special setup, though
- and if it is a graphical tool (e.g. an IDE):
  - the X socket mount and environment variable
  - --ipc host to make shared memory work
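Pulled together, an ad-hoc wrapper covering the list above tends to look like this (a sketch; the image name is a placeholder, and individual mounts can be dropped when not needed):

#!/bin/sh
# Run the project toolchain container with the usual development plumbing:
# working-directory mount, matching uid/gid, ssh agent, docker socket, X11.
docker run --rm -it \
  -v "$PWD":/workspace -w /workspace \
  --user "$(id -u):$(id -g)" \
  -v "$SSH_AUTH_SOCK":/ssh-agent -e SSH_AUTH_SOCK=/ssh-agent \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY="$DISPLAY" \
  --ipc=host \
  registry.example.com/project/toolchain "$@"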
And then it can get more complicated due to other factors. E.g., if the developers are in different departments and don't have access to the same docker repository, their images may be named differently, because docker does not support symbolic names for repositories (podman does, though).
Is there some standard(ish) way to handle these options or is everybody just using ad-hoc wrapper scripts?
I use the Visual Studio Code Remote - Containers extension to connect the source code to a Docker container that holds all the tools needed to build the code (e.g. npm modules, Ruby gems, ESLint, Node.js, Java). The container contains all the "tools" used to develop/build/test the source code.
Additionally, you can put the VSCode extensions into the Docker image to keep the VSCode IDE tooling portable as well.
https://code.visualstudio.com/docs/remote/containers#_managing-extensions
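For example, a fragment like this in devcontainer.json installs extensions into the container automatically (the IDs shown are just examples):

"extensions": [
  "dbaeumer.vscode-eslint",
  "ms-python.python"
]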
You can provide a Dockerfile in the source code for newcomers to build the Docker image themselves or attach VSCode to an existing Docker container.
If you need to run a server inside the Docker container for testing purposes, you can expose a port on the container via VSCode, and start hitting the server inside the container with a browser or cURL from the host machine.
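In devcontainer.json that can be as simple as the fragment below (the port number is an example); the server then responds on the host, e.g. to curl http://localhost:3000:

"forwardPorts": [3000]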
Be aware of the known limitations of the Visual Studio Code Remote - Containers extension. The one that impacts me the most is the beta support for Alpine Linux. I have often noticed that some of the popular Docker Hub images are based on Alpine.

open a file in docker container with vscode

I need to edit a file in a docker container. As there is no advanced IDE inside the container, I wonder whether VS Code or some other IDE can reach into the container and let me edit files just as I would outside the container.
With the May 2nd, 2019 announcement of "Remote Development with VS Code", you actually can, using the Remote - Containers extension:
The Remote - Containers extension lets you use a Docker container as a full-featured development environment.
Containers make a great development environment because you can:
Develop with a consistent and easily reproducible toolchain and on the same operating system you are deploying to.
Quickly swap between different, isolated development environments and safely make updates without worrying about impacting your local machine.
Make it easy for new team members / contributors to get up and running in a consistent development environment.
VS Code's Docker integration makes it possible to open a folder from inside a running docker container.
The Docker extension for VS Code (ms-azuretools.vscode-docker) allows you to right-click on any running container in the list and choose 'Attach Visual Studio Code'; you will then be able to open any folder inside the Docker container in a new VS Code window. I presume you must have the Remote - Containers extension installed as well.
Start the container by mapping the host path to a container path using -v:
docker run -v ~/yourlocalpath:/dockerpath <image>
Changes you make in this folder on the host will be reflected inside the container.

Remote debugging of Docker containers using IntelliJ

I have a number of Docker containers (10), each running a Java service, that together make up my system. To create these containers I use a couple of docker-compose files. Using the Docker Integration plugin for IntelliJ, I can now spin up these services on my remote server using the Docker-compose option (the images used are built outside of IntelliJ, using Gradle). Here are the steps I have taken to achieve this:
I have added a Docker server using the Docker Machine option to connect to the remote Docker daemon (message says Connection Successful).
I have added a new Docker Compose configuration, using the server, specifying my compose files, and the services I want to start.
Now that I have the system controlled through IntelliJ, I have been trying to figure out how to attach the remote debugger to each of these services so that IntelliJ will hit my breakpoints.
Will I need to add the JVM args (-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005) to each service (container) and add the usual remote debug configuration for each service? Do I need to use a different address for each service? If so, how do I add these args? Surely with the Docker Integration plugin, there is an easier way to do this.
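Concretely, the kind of wiring I imagine would look like this in a compose file (service names and host ports are placeholders): each container keeps the same internal debug address, the per-service distinction happens in the host port mapping, and one remote debug run configuration per service points at the corresponding host port.

services:
  service-a:
    environment:
      JAVA_TOOL_OPTIONS: "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
    ports:
      - "5005:5005"
  service-b:
    environment:
      JAVA_TOOL_OPTIONS: "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
    ports:
      - "5006:5005"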
IntelliJ Idea v2018.1.5 (Community Edition)
Docker Integration v181.5087.20
