Can't run dependent project - docker

I'm working on an IntelliJ IDEA project. My main program depends on a database server inside a Docker container, so to run the project I first run the "docker-compose" run configuration ("RNAO Services" on the attached screen) and only after that my main configuration ("RNAO Server").
What I want is to run docker-compose automatically before my main configuration starts.
I have tried adding the docker-compose configuration to the "Before launch" list of the main configuration, after the "Build" step, but the run then just freezes on start. I think the reason is that the docker-compose configuration is never shown as running (no red "stop" button appears), even though the containers actually are running (I have access to the database inside the container).
Here is how the project looks when I start the docker-compose configuration:

The IDE will not run the configuration until all tasks in the Before Launch section have terminated. There is a related feature request about this: IDEA-112294 Before Launch -> Run Another Configuration Should Support Non-Terminating Before Launch Configurations.
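A workaround consistent with that limitation (my own sketch, not from the ticket) is to run compose detached from a Before Launch "Run External Tool" step instead, since the wrapper below returns once the containers are up. The host, port, and use of nc are assumptions; adjust them to your compose file:

#!/bin/sh
# start-deps.sh: bring the compose services up detached, then exit,
# so the IDE's Before Launch task can terminate.
docker-compose up -d
# Optionally block until the database accepts connections.
until nc -z localhost 5432; do sleep 1; done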

Related

"Building Image" task hangs in VS Code Dev Container when using a large directory

I'm using Visual Studio Code on a Windows machine. I'm trying to set up a Python Dev Container using a directory that contains a large set of CSV files (about 200GB). When I click to launch the remote container in Visual Studio Code, the application hangs saying "Starting Dev Container (show log): Building image".
I've been looking through the docs and, having read the Advanced Container Configuration page, I've tried modifying the devcontainer.json file by adding workspaceMount and workspaceFolder entries:
"workspaceMount" : "source=//c/path/to/folder,target=/workspace,type=bind,consistency=delegated"
"workspaceFolder" : "/workspace"
But to no avail. Is there a solution to launching Dev Containers on Windows using folders which contain large files?
I had a slightly different problem, but the solution might help you or someone else. I was trying to run docker-compose inside a docker-in-docker image (provided by vscode). In my case, my container was able to start, but nothing inside the container was able to run.
To solve my issue, I updated vscode, and there is now a new option Remote-Containers: Clone Repository in Container Volume.... If your code is a git repo, you can do this:
Follow the steps vscode gives you and you should have your repository in the container as a volume. It reduced my build times from about 30 minutes to 3 minutes (within the running container), because I brought stuff into the container after it was up and running.
Assuming the 200GB is ignored by your .gitignore, what you could try is, once the container has started, copying the 200GB worth of CSV files into the container. I thought this would help because I did a similar thing by bringing in all my node_modules after running the container.
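For the copy-in idea, a minimal sketch (the container name and paths are hypothetical):

# Copy the large data set from the host into the already running container.
docker cp ./csv-data my-dev-container:/workspace/csv-data

Because docker cp runs after the image is built, the 200GB never enters the build context or the bind mount.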

VScode remote containers - how to view the dockerised service console output?

This is a follow-on from this question (none of the current answers seem to hit the nail on the head).
VScode's default behaviour for starting a remote vscode session (using VScode Remote-Containers) seems to be:
Run the project's docker-compose file
If the user selects show log (a UI popup) during the build, open a VScode terminal session called Dev Containers reflecting Docker's build logging, supplemented with VSCode Remote-Containers logging. This output ends after the build completes.
If the user didn't select show log, and later opens a VScode terminal after the build completes, just start a new bash session within the container. No other VScode terminal sessions exist.
Launch a VScode session from inside the now-running container
From the user's perspective, the container is running, but the output happening inside the container seems inaccessible (even if the docker-compose command did not use daemon mode).
So, how can the user now view the console output that is happening inside the container?
If I am reading correctly, the VScode Remote-Containers documentation seems to suggest overriding the default behaviour, i.e.:
suppress the docker-compose command that would otherwise have started the service, and instead apply some dummy command to keep the container alive upon creation (sketched below), then
manually start the service (using debug mode, or via a VScode terminal) from inside the remote session. This reveals the output, but within an accessible VSCode terminal session.
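A minimal sketch of that dummy-command override in the docker-compose file the dev container uses (the service name web is hypothetical; the sleep loop is the pattern the Remote-Containers docs suggest):

services:
  web:
    # Keep the container alive without starting the real service;
    # the service is then started manually from a VSCode terminal.
    command: /bin/sh -c "while sleep 1000; do :; done"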
Is there no way to:
A) Start the services via system terminal (e.g. docker-compose up), and then start a VSCode remote session in this already running container*, or
B) Access the service's output without having to override as above (the override seems hacky)?
*This would be ideal. The Remote-Containers "Attach to Running Container..." command sounds close to this, but it seems to open in a directory I don't recognise, and doesn't seem to be inside the container.
Option A seems to be achievable by
Starting the service in terminal (docker-compose up)
In vscode, using the Remote-Containers "Remote Explorer" UI (not the cmd+P "Attach to container" commands) to select the running container's working directory. Right click > "Open in container". This doesn't actually open a new container; it opens the directory from within the already running container.
OR (thanks @cybercoder)
Letting vscode start the services
In a separate terminal: docker logs -f container_name OR docker-compose logs -f

How to make VSCode run custom script when attaching to a running remote container

I have a running Docker container and would like to use the VSCode remote container plugin to attach to it.
Is it possible to have VSCode run a script when it attaches? Some custom actions are required to setup the container. These actions cannot be baked into the Dockerfile/Image.
Is it possible to configure the docker exec arguments when attaching to a running container? (This is possible for docker run via .devcontainer when creating new containers, but I haven't found anything about docker exec regarding already running containers.)
There is a "postAttachCommand" that lets you execute a custom command after the vscode attached to the running container.
However my preference would be to use a login shell, for that there is an undocumented property called
"userEnvProbe": "loginInteractiveShell"
The GitHub issue below explains this parameter (this is also where I learnt about it):
https://github.com/microsoft/vscode-remote-release/issues/3585
userEnvProbe and postAttachCommand are per Docker container; you have to add them to the "Container Configuration File". Hover your mouse over the tip of the red arrow (in the screenshot) and you will see a settings icon; pressing it gives you access to the "Container Configuration File".
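Put together, the container configuration file might look like this (the script path is a hypothetical placeholder):

{
    // Runs after vscode attaches; put your setup actions here.
    "postAttachCommand": "/usr/local/bin/setup-container.sh",
    // Undocumented: resolve the environment via a login interactive shell.
    "userEnvProbe": "loginInteractiveShell"
}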
For further customization, there is a great GitHub page that explains what else you can do to customize the way you execute Docker commands:
https://github.com/microsoft/vscode-docker/issues/1596

How to view docker logs from vscode remote container?

I'm currently using vscode's remote containers extension with a .devcontainer.json file that points to my docker-compose.yml file.
Everything works fine and my docker-compose start command gets run (which launches a web server), but I haven't found a way to quickly see the logs from the web server. Has anyone found a way to view the docker log output automatically once vscode connects to the remote container?
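For reference, the relevant part of my setup looks roughly like this (the file layout and service name are stand-ins):

{
    // .devcontainer.json: point the dev container at the compose file
    "dockerComposeFile": "docker-compose.yml",
    // the compose service vscode attaches to (the one running the web server)
    "service": "web",
    "workspaceFolder": "/workspace"
}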
I know as an alternative I could remove my container's start command and, after vscode connects, manually open a terminal and start the web server, but I'm hoping there's an easier way.
Thanks in advance!
I'm not using remote containers, just local ones, so I'm not sure if this applies, but for locally running containers you can go to the "Docker" tab (you need to install the official Microsoft Docker VS Code plugin), where you can see your running containers. Just right-click on the container you want the logs for and select "View Logs":
You'll see a new "Task" appear in the Terminal pane that will show all your docker logs:
This question is really old and I'm not sure if this option was available at the time, but just open the Command Palette (F1) and select/find "Remote-Containers: Show Log".
You will now see the log of your container in the terminal.
You can open the command palette and search for: Remote Explorer: Focus on containers view. You should see a sidebar of containers, if you right click your container you can view logs.
I use VS Code's builtin terminal to see the live logs of the docker container that VS Code is connected to.
When VS Code is connected to the docker container, you can open the builtin terminal using the View > Terminal menu option. You should see an existing terminal labeled Dev Containers.
Maybe this is too late, but for others, this is how I do it.
First, instead of logging to stdout, I redirect all of the output into one single file and then use the tail command to stream the output to the terminal instead.
For example, I am using Go here:
// Requires "os", "log", and "github.com/sirupsen/logrus" to be imported.
logFile, err := os.OpenFile(logFileName, os.O_WRONLY|os.O_CREATE, 0755)
if err != nil {
    log.Fatal("Failed to open the log file")
}
// Send all logrus output to the file instead of stdout.
logrus.SetOutput(logFile)
Once that's done, I open up my terminal and run the following command:
$ tail -f {logFileName}
That's one way to do it I guess, but I sure hope VSCode can come up with a better solution.
In the Remote Explorer tab you can see all your docker containers. Under "Dev Containers" is the container for the service specified in devcontainer.json; the rest are in "Other Containers." Simply right click on the container you're interested in and click "Show Container Log." You'll see the full output of the command for that service, just like in an interactive terminal - not a docker build log!
Note I am using a local development container and did not test with remote containers but I'm guessing it's the same.

Docker: Run command while another command is running

I need to configure a program running in a Docker container. To achieve that, the program must be running (and provide an open port) so that the administration program can connect to the running process. Unfortunately there is no simple editable config file, so this is the only way. The RUN command is obviously not the right one because it does not provide a running instance after Docker moves on to the next command. The best way would be to do this while building the Docker image, but if it has to be done during container start it would be OK as well. But there is (as far as I know) also no easy way to run multiple commands on startup. Does anyone have an idea how to do that?
To make it a bit more clear, here is a simple example from my Dockerfile:
# this command should start the application which has to be configured
RUN /usr/local/server/server.sh
# I tried this command alternatively because the shell script is blocking
RUN nohup /usr/local/server/server.sh &
# this is the command which starts an administration program which connects to the running instance started above
RUN /usr/local/administration/adm [some configuration parameters...]
# afterwards the server process can be stopped
Downloading the complete program directory containing the correct state could be a solution too, but then the configuration could no longer be changed easily in the Dockerfile, which would have been convenient.
A Dockerfile is supposed to be a sequential list of instructions to produce an image. The image should contain your application's code, and all of its installable dependencies.
Each RUN instruction gets executed as its own container. Once the command that you run completes, any changed files get committed as a new image layer.
Trying to run a process in the background will cause the command you are running to return immediately. Once that happens, the container is considered stopped, and the Dockerfile's next instruction will be executed in a new, separate container.
If you really need two processes running, you will need to combine them into a single command that you can pass to one RUN instruction, as sketched below.
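Using the paths from the question, a sketch of such a combined instruction (the fixed sleep is a crude stand-in for a proper readiness check):

# Start the server in the background, configure it, then stop it,
# all within a single RUN so both processes share one container.
RUN /usr/local/server/server.sh & SERVER_PID=$! \
    && sleep 10 \
    && /usr/local/administration/adm [some configuration parameters...] \
    && kill $SERVER_PID

The & backgrounds the server within the shell that RUN starts, so the admin tool runs against it in the same layer; the kill lets the RUN command finish and the layer commit.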
