I've been using the Tabnine VS Code extension regularly for a while now, and it's been very helpful so far. However, I'm having trouble getting it to work inside a DevContainer.
Outside the DevContainer, Tabnine works perfectly fine.
Inside the DevContainer, however, even though I've installed Tabnine as an extension there, no suggestions appear, typing tabnine::config does nothing, and there's no Tabnine icon in the Activity Bar.
I've tried forwarding port 5555, but VS Code automatically changes it to port 5556, and it doesn't fix the issue.
Is there anything I can do to get TabNine working in my DevContainer? I'm still very much a beginner, so I apologize if I'm missing something obvious.
If you are using an M1 chip, try forcing the Docker image to build with the flag --platform linux/amd64. Tabnine will then run just fine.
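For a dev container, the forced build might look like this (the image tag and build context are placeholders for your own setup):

```shell
# On an Apple Silicon (M1) Mac, build the dev container image as x86-64
# so Tabnine's native binary can run; Docker Desktop emulates the
# architecture, which is slower but works.
docker build --platform linux/amd64 -t my-devcontainer .
```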
You don't need to install the Tabnine extension in the devcontainer. Install it only in the host instance, and the devcontainer will do the port forwarding for you automatically.
Tested with Tabnine v3.5.45.
I've got two containers running via a docker-compose file. One is calling the other using 172.17.0.1 (instead of localhost). This has been working fine for a long time, but has suddenly stopped working.
I suspect it might be some new anti-something-ware installed by the company that is blocking it, but I don't know how to figure out whether that is the case. What can I do to debug this?
I know that I can use host.docker.internal, but I want this to work out of the box on Linux as well (and know why it has stopped working on Windows).
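Some generic checks may help narrow it down (the service port 8080 is a placeholder; the iptables check assumes a Linux host):

```shell
# From inside the calling container: is the bridge gateway reachable at all?
ip route | grep default          # shows the default gateway the container uses
curl -v http://172.17.0.1:8080/  # hypothetical port of the other service

# On the host: confirm 172.17.0.1 is still the docker0 bridge gateway.
docker network inspect bridge --format '{{ (index .IPAM.Config 0).Gateway }}'

# On a Linux host: look for firewall rules that might now drop bridge traffic.
sudo iptables -L -n | grep -iE 'drop|reject'
```

As a side note, containers started from the same docker-compose file share a network on which each service is reachable by its service name, which avoids depending on 172.17.0.1 and works the same on Linux and Windows.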
I am a developer who regularly uses Ubuntu 20.04 LTS for development. I never install packages like Node, PHP, or Python in the OS itself; I use Docker for that purpose. VS Code is my editor, and the Remote - Containers extension lets me develop and debug inside the Docker container.
Right now I am in the process of moving my development to a Windows environment, and I wanted to follow a similar workflow there too. Unfortunately, I am facing a few issues, such as file changes not being detected (when running npm serve in Angular and React projects).
https://github.com/microsoft/WSL/issues/4739
https://www.reddit.com/r/bashonubuntuonwindows/comments/c48yej/wsl_2_react_not_reloading_with_file_changes/
I have tried different methods to solve the issue, such as:
- using WSL2, running Docker inside it, and serving from the container
- using just Docker and serving the code from inside the container
Regardless of the method, file changes are not detected inside Docker.
Trust me, I have gone through many bizarre terms like inotify, increasing the number of watchers, etc. Nothing helped.
Is there a developer out there following a similar practice in a Windows environment? (docker + windows)
Any help is highly appreciated.
I suggest moving the files to the WSL2 file system rather than keeping them on the Windows one.
WSL2 'sees' the Windows file system through a mount at /mnt/c.
Move out of it, e.g. to your home directory (cd ~), and I think your files will be watched normally.
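A minimal sketch of the move (a temporary directory stands in for the real /mnt/c project path so the commands can be run anywhere; in a real WSL2 session the source would be something like /mnt/c/Users/<you>/projects/my-app):

```shell
# Stand-in for the project folder on the Windows mount:
SRC=$(mktemp -d)
echo 'console.log("hello");' > "$SRC/index.js"

# Destination inside the WSL2 (ext4) file system, where inotify works:
DEST="$HOME/my-app"
mkdir -p "$DEST"
cp -r "$SRC/." "$DEST"

ls "$DEST/index.js"   # the project now lives where file watching is native
```

After the move, run npm serve from the WSL2 path and the watcher should pick up changes.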
At first VS Code was running perfectly, but after some days it suddenly stopped working.
I am connected to my Wi-Fi, but VS Code shows
I have uninstalled VS Code and all the related directories, but it didn't help.
Help would be appreciated.
I had the same issue for some time. It was an issue with a proxy I had used before.
Solution:
Go to the Environment Variables settings in Windows and remove the variable containing the proxy address.
Good luck!
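From a shell, the equivalent check and cleanup look roughly like this (these are the conventional variable names; in PowerShell you would use Remove-Item Env:HTTP_PROXY instead):

```shell
# List any proxy-related variables still set in the environment:
env | grep -i proxy || echo "no proxy variables set"

# Clear them for the current session before launching VS Code:
unset HTTP_PROXY HTTPS_PROXY http_proxy https_proxy
```

Note that variables set through the Windows Environment Variables dialog come back in new sessions until they are removed there as well.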
This is the same problem discussed in
Cannot connect to X server using docker on OS X - Part II
but never resolved.
On macOS El Capitan, I'm running the OpenFOAM binary under Docker, as there is no native version. I want to use ParaView to view the results. I cannot run the ParaView supplied inside OpenFOAM, since I get
paraview: cannot connect to X server
The advice here
Cannot connect to X server using docker on OSX
is to install ParaView separately and run it from a normal terminal. This did not work, as a normal terminal cannot "see" the files inside the Docker container. And I cannot run the suggested
open -a paraview foam.foam
inside the docker terminal, as for some reason it does not recognize the "open" command.
Perhaps some Docker expert can help?
The team at OpenFOAM.org recently worked hard on this issue and released a good solution for macOS users (http://openfoam.org/download/4-1-macos/).
Unfortunately, they reached the conclusion that going through X in Docker, although it worked, was not a real solution, because it freezes most of the time and is really not convenient.
I understand from the question that the files cannot be seen from the macOS terminal (confirming this would require knowing which version was used). In the Docker release of OpenFOAM 4.1 (see the link above), the launch script takes care of sharing the files, so you can access them directly and open them with ParaView in a regular terminal.
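Under the volume-sharing approach, the commands might look roughly like this (the image name and paths are hypothetical, not the actual OpenFOAM script):

```shell
# Share the run directory with the host so a native macOS ParaView
# can open the results written inside the container:
docker run -it -v "$HOME/OpenFOAM-runs":/home/openfoam/run my-openfoam-image

# Then, from a regular macOS terminal (not the Docker one):
open -a ParaView "$HOME/OpenFOAM-runs/cavity/foam.foam"
```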
Changed to: hot loading does not work in Docker, and it looks like it is a Docker issue.
Following this (React with webpack) or this (React hot loader) on the local host machine, they work fine, and to me they work the same; I still don't get why you would install React hot loader.
But running them in a container, updating/"hot loading" does not work in either of them. So this might be a question for a Docker expert?
As described on GitHub, you can do this:
watchOptions: {
poll: true
}
Or, in package.json, replace --watch with --watch --watch-poll.
If you are just looking for a proper file-watching solution on a Mac with Docker, check out docker-osx-dev. It uses boot2docker behind the scenes but adds rsync support. I tried it, and it works great for file changes.
I found a workaround: I run a reverse proxy (nginx) in a container. The proxy forwards requests back to my main host computer (a Mac) on a port. This gives me hot loading, and most importantly, I have no cross-domain issues even though my database runs in yet another container.
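The proxy side of that workaround can be sketched as an nginx fragment like this (the dev-server port 3000 and the Docker-for-Mac host name host.docker.internal are assumptions; substitute your own values):

```nginx
# Container-side nginx: forward requests back to the webpack dev server
# running on the host Mac, so hot loading keeps working.
server {
    listen 80;

    location / {
        proxy_pass http://host.docker.internal:3000;  # host-side dev server
        proxy_set_header Host $host;
        # WebSocket upgrade headers so the HMR socket survives the proxy:
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```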