Connect to a WebLogic server running inside a Docker container from IntelliJ IDEA - docker

I have a WebLogic server running in Docker. The Dockerfile can be found here: https://github.com/oracle/docker-images/tree/master/OracleWebLogic/dockerfiles/12.1.3
I also enabled WebLogic integration in IntelliJ IDEA and created the following run configuration:
I'm running it on Windows 7, so 192.168.99.100 is the default IP address assigned to the VirtualBox VM that the Docker engine runs in. Port 20180 is the result of port mapping.
When I click Test Connection, it times out with an exception:
Error connecting to the Application Server: com.intellij.javaee.process.common.WrappedException: java.io.IOException: t3://192.168.99.100:20180: Bootstrap to: docker.homecredit.net/192.168.99.100:20180' over: 't3' got an error or timed out while trying to connect to docker.homecredit.net/192.168.99.100:20180
What should I do to make it work?
Also, why am I required to specify an Application Server when I connect to a remote server? (I chose the remote run configuration type.)
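For reference, a sketch of the kind of port mapping involved (the container port and image tag here are assumptions based on the linked Dockerfile, which exposes the admin server on 7001):

```shell
# Map host port 20180 to the container's WebLogic admin (t3) port 7001.
# IntelliJ's t3://192.168.99.100:20180 can only work if 20180 actually
# forwards to the port the admin server listens on inside the container.
docker run -d -p 20180:7001 --name wls 1213-domain

# Confirm what the mapping really is:
docker port wls
```

If `docker port` shows a different container port behind 20180, the t3 bootstrap will time out exactly as in the error above.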

Related

How to connect a Dev Container to another Container?

For this question I'm working with Prisma's dev container: https://github.com/prisma/prisma/tree/main/.devcontainer
Once I open that repo inside a container using the Remote Containers plugin in Visual Studio Code and run some Jest tests that rely on the Docker services defined in the https://github.com/prisma/prisma/tree/main/docker folder, I get a "can't connect to database" error for all databases.
It's as if the dev container had no idea those services exist. On my PC, Docker Desktop shows the services up and running, but the dev container can't see them. Why?
I find it weird that I would have to change any settings, since these files come from the Prisma repo; they are supposed to be ready for action once downloaded, right?
Assuming the Docker network driver is bridge (the default): suppose the script is running this line to build the connection string in your devcontainer, as below.
const connectionString = (
  process.env.TEST_MYSQL_URI_MIGRATE || 'mysql://root:root@localhost:3306/tests-migrate'
).replace('tests-migrate', 'tests-migrate-dev')
The localhost in the connection string means localhost inside your devcontainer, not your host machine.
You should access the host machine's localhost instead.
The fix is to set the TEST_MYSQL_URI_MIGRATE environment variable, like
TEST_MYSQL_URI_MIGRATE=mysql://root:root@host.docker.internal:3306/tests-migrate
For details on how to access the host machine's localhost, please read this question.
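A minimal way to apply that fix before running the tests (a sketch; the URI is the one from this answer, and on plain Linux you may also need to start the devcontainer with --add-host=host.docker.internal:host-gateway, since that hostname is only built in on Docker Desktop):

```shell
# Point the tests at the host's MySQL instead of the devcontainer's
# own loopback interface.
export TEST_MYSQL_URI_MIGRATE='mysql://root:root@host.docker.internal:3306/tests-migrate'
echo "$TEST_MYSQL_URI_MIGRATE"
```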

Docker containerized Zabbix server monitoring the same host running the Zabbix server: connection refused

I'm running a dockerized version of Zabbix server (CentOS 6.0) on my host. I'm also running zabbix-agent2 on this host, configured with the server IP address 127.0.0.1.
When I go into the Zabbix frontend web interface, I get these errors on the detected host:
When the IP is at the default value 127.0.0.1:10050:
Get value from agent failed: cannot connect to [[127.0.0.1]:10050]: [111] Connection refused
When I change the default value to 172.17.0.1:10050 (the container naturally can't reach the host's localhost):
Get value from agent failed: ZBX_TCP_READ() failed: [104] Connection reset by peer
When I go to the host and ping or traceroute it, it works well.
When I go to the host and try "Detect operating system", I get the error:
Cannot execute script.
sh: sudo: command not found
What can I do to make the host work properly?
I have tried to check these posts:
Access localhost from docker container
(when I use 172.17.0.1).
I have tried to use network_mode: host, but it conflicts with the network defined in the docker-compose.yml.
I have tried this solution: How to use the host network, and any other user-defined network together in Docker-Compose?
but it doesn't work either.
The port in docker-compose is well defined and mapped (10051:10051).
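One way to let the server container reach the host's agent without network_mode: host is the host-gateway alias; a sketch (service and image names are assumptions, requires Docker 20.10+):

```yaml
services:
  zabbix-server:
    image: zabbix/zabbix-server-pgsql
    ports:
      - "10051:10051"
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

In the frontend, point the host's agent interface at host.docker.internal:10050, and make sure zabbix-agent2 on the host listens on 0.0.0.0 (or at least the docker0 address) rather than only 127.0.0.1; a loopback-only listener is exactly what produces "[111] Connection refused" when the container tries 172.17.0.1. The "sudo: command not found" error is a separate issue: the default "Detect operating system" script runs nmap via sudo, and the server image ships without sudo.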

gRPC in docker container can't connect to services on host machine

I have slightly modified this example: https://github.com/grpc/grpc-web/tree/master/net/grpc/gateway/examples/echo. I am running Envoy in a Docker container with port 8080 exposed (running this proxy server is required because the browser can't speak directly to a backend gRPC service). I am running all the services on my localhost (the host machine of the Envoy Docker container). However, I cannot seem to connect Envoy in the Docker container to the services running on the host machine.
I compiled grpc_cli in the container, and when I run grpc_cli ls 192.168.1.10:9000 (the host's LAN IP address and the port the service is running on), I get
root@bdc9ac396a87:~/grpc# ./bins/opt/grpc_cli ls 192.168.1.10:9000
Received an error when querying services endpoint.
ServerReflectionInfo rpc failed. Error code: 14, message: failed to connect to all addresses, debug info: {"created":"@1569023274.866465052","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3876,"referenced_errors":[{"created":"@1569023274.866463178","description":"failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":395,"grpc_status":14}]}
I get an almost identical error when I use the IP address of the docker0 interface, which should also provide a connection to the host machine.
root@bdc9ac396a87:~/grpc# ./bins/opt/grpc_cli ls 172.17.0.1:9000
Received an error when querying services endpoint.
ServerReflectionInfo rpc failed. Error code: 14, message: failed to connect to all addresses, debug info: {"created":"@1569022455.801913949","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3876,"referenced_errors":[{"created":"@1569022455.801910006","description":"failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":395,"grpc_status":14}]}
However, running a simple http server from the host with
python -m http.server
I can run the following commands from the container just fine:
wget 172.17.0.1:8000/test.txt // works
wget 192.168.1.10:8000/test.txt // works
A client on the host (not in the container) connects and works just fine with the service, so it's not a server problem.
Does Docker block certain types of traffic? I noticed that in the example the server was placed in another Docker container, and that worked (it also worked locally for me), but I'd prefer to have my services running on my host machine while I build and test them. Is there a setting somewhere to enable gRPC from a container to a service on the host machine?
Docker version 1.13.1, build 47e2230/1.13.1
Fedora 29
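A plausible explanation for the asymmetry, offered as a guess: python -m http.server binds to 0.0.0.0 by default, while many gRPC examples bind to localhost:9000, which is only reachable over the host's loopback interface and is therefore invisible from docker0 or the LAN. On Fedora there is a second common cause: firewalld may drop traffic arriving from the docker0 bridge. Both are quick to check on the host:

```shell
# 1. What address is the gRPC service bound to? "127.0.0.1:9000"
#    means loopback-only; rebind the server to 0.0.0.0:9000.
ss -tlnp | grep ':9000'

# 2. If the bind address is fine, try trusting the bridge interface
#    in firewalld (Fedora's default firewall):
sudo firewall-cmd --zone=trusted --add-interface=docker0
```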

Docker: How to connect to locally available servers from within docker

I run Docker on Windows. I have a Docker container running a Python application that needs a database connection.
Installing a DB on my machine and connecting to it via "docker.for.win.localhost" from inside my container works fine.
Now I want to connect to a database running on a server that is available over my local network. I can't seem to connect to it from inside my Docker container, and I don't quite understand how to proxy the server to my container. The error indicates that it can't establish a connection to the server:
(psycopg2.OperationalError) could not connect to server: No route to host
Is the server running on host "XX.XXX.XX.XX" and accepting
TCP/IP connections on port 5555?
I'm sure this is supposed to work somehow, right?
You can add the host's IP to the container:
docker run --add-host="yourhost:IPOFTHEHOST"
and yourhost will then resolve to that host from inside the container.
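To make that concrete (the hostname and IP below are placeholders, not values from the question):

```shell
# --add-host writes an entry into the container's /etc/hosts, so the
# app can address the DB server by a stable name:
docker run --add-host="dbserver:192.0.2.10" my-python-app
# Inside the container, a DSN such as
# postgresql://user:pass@dbserver:5555/db now resolves to 192.0.2.10.
```

A "No route to host" error is also worth checking on the server side: PostgreSQL must listen on its LAN interface (listen_addresses in postgresql.conf), and the port must be open in that machine's firewall.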

Debugging in and deploying to containers on a remote server in IntelliJ IDEA

IntelliJ IDEA (and PyCharm, among others) supports remote deployment, debugging, and execution in "Tools → Deployment". This allows running a remote SDK as well, so the workflow is identical to local development.
This works until development is containerized. In this case, you have to execute (run or debug) inside a container on a remote server.
For Docker containers:
Deployment is simple: set up SFTP to the remote server and automatically upload files there. The files are stored in folders, the folders are attached to Docker containers as volumes, and the app is restarted inside the container.
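That workflow can be sketched as follows (paths and names are assumptions):

```shell
# Attach the SFTP target folder on the remote host into the container
# as a volume, so IDE uploads land directly in the running container:
docker run -d --name app -v /srv/app:/app my-image
# After each sync from the IDE:
docker restart app
```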
Setting up the remote SDK is less clear, because this SDK is inside the remote container. IntelliJ IDEA has a Docker plugin that supports remote SDKs from Docker containers:
I guess I should set up a new Docker server by connecting IDEA to the remote Docker daemon via a TCP socket.
Several sources explain how to configure the remote API at various stages:
Put Docker on a network socket: How do I enable the remote API for dockerd
Protect the socket: Protect the Docker daemon socket
Open it to the external world: How to open a specific port such as 9090 in Google Compute Engine
Add the server:socket to the new SDK configuration in the picture above.
Where can I get a more detailed guide on connecting IDEA to the remote Docker? For example, where do I get the certificate, what ports should I open on the remote machine, and how to set it up securely if the remote server is an AWS/GCP machine?
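As a starting point, the daemon side of that setup is usually an /etc/docker/daemon.json like the following sketch (the paths are conventions, not mandated): it keeps the local Unix socket, adds a TLS-verified TCP listener on the customary port 2376, and rejects clients whose certificate is not signed by your CA.

```json
{
  "hosts": ["unix:///var/run/docker.sock", "tcp://0.0.0.0:2376"],
  "tlsverify": true,
  "tlscacert": "/etc/docker/certs/ca.pem",
  "tlscert": "/etc/docker/certs/server-cert.pem",
  "tlskey": "/etc/docker/certs/server-key.pem"
}
```

The certificates are self-generated with openssl per the "Protect the Docker daemon socket" guide linked above (note that on systemd distros the unit file's own -H flag must be removed, or the daemon won't start with "hosts" in daemon.json). On AWS/GCP, open 2376 in the security group or firewall rule only to your own IP, and give IDEA the client CA, certificate, and key when defining the Docker server as tcp://your-remote-host:2376.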
