Docker Compose port mapping: 127.0.0.1 refused to connect

I am currently working with Docker and a simple Flask website to which I want to send images. The app runs on port 8080, but the port mapping from the container to the host is not working, as I am unable to connect. Could someone explain what I am doing wrong?
docker-compose.yml
version: "2.3"
services:
dev:
container_name: xvision-dev
build:
context: ./
dockerfile: docker/dev.dockerfile
working_dir: /app
volumes:
- .:/app
- /path/to/images:/app/images
ports:
- "127.0.0.1:8080:8080"
- "8080"
- "8080:8080"
dev.dockerfile
FROM tensorflow/tensorflow:latest
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
RUN apt update && apt install -y python-tk
EXPOSE 8080
CMD ["python", "-u", "app.py"]
app.py
import os
from flask import Flask

APP = Flask(__name__)

@APP.route('/test', methods=['GET'])
def hello():
    return "Hello world!"

def main():
    """Start the script."""
    APP.json_encoder = Float32Encoder  # custom encoder defined elsewhere in the project
    APP.run(host="127.0.0.1", port=os.getenv('PORT', 8080))
I start the container with docker-compose up, which prints: Running on http://127.0.0.1:8080/ (Press CTRL+C to quit).
But when I send a GET request to 127.0.0.1:8080/test, there is no response.
I have also tried docker-compose run --service-ports dev, as some people have suggested online, but that says there is no service dev.
Can someone tell me what I am doing wrong?

Use:
APP.run(host="0.0.0.0", port=os.getenv('PORT', 8080))
Inside the container, 127.0.0.1 is the container's own loopback interface, so a server bound to it can never be reached through a published port; binding to 0.0.0.0 makes Flask listen on all interfaces, including the one Docker forwards to. On the compose side, using only:
ports:
  - "8080:8080"
is enough; the other two entries are redundant.
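
Putting the two changes together, a minimal sketch (the rest of app.py and the compose file stay as above; the int() cast is my addition, since environment variables come through as strings):
app.py:
APP.run(host="0.0.0.0", port=int(os.getenv('PORT', 8080)))  # bind all interfaces; cast PORT to int
docker-compose.yml:
ports:
  - "8080:8080"
After docker-compose up, the route should answer from the host:
curl http://127.0.0.1:8080/test
which should print Hello world!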

Related

Access to API documentation on Docker

I have a FastAPI app, and when I run it with uvicorn I am able to open its API documentation at localhost/docs.
When I run this FastAPI app in Docker, I am not able to see the API documentation.
Do I need to add an extra Docker container for the API documentation?
Here is my docker-compose file:
version: "3.7"
services:
web:
build: ui
ports:
- 80:80
depends_on:
- api
api:
build: app
environment:
- PORT=80
ports:
- 8020:80
and I run it with docker-compose up --build.
I tried localhost/docs and localhost/swagger/index.html, but I was not able to see the API documentation in Docker.
Here is my FastAPI Dockerfile:
FROM python:3.9
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt
COPY ./ /app
WORKDIR /app
CMD ["uvicorn", "api:app", "--host", "0.0.0.0", "--port", "8020"]
The problem is mismatched ports: the CMD starts uvicorn on port 8020 inside the container, but the compose file maps host port 8020 to container port 80 (and the PORT=80 environment variable is ignored, because the port is hard-coded in the CMD). Point the mapping at the port uvicorn actually listens on:
ports:
  - 8020:8020
then you can see the docs at
http://localhost:8020/docs
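
Alternatively, since the compose file already sets PORT=80 and maps host port 8020 to container port 80, you could keep that mapping and make uvicorn honor the variable instead. A sketch of the last line of the Dockerfile, using the shell form of CMD because the exec form does not expand environment variables:
CMD uvicorn api:app --host 0.0.0.0 --port ${PORT:-80}
Either way the docs end up at http://localhost:8020/docs; pick one fix, not both.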

Run commands on docker container and sync automatically with host

I Dockerized a MENN (Next.js) stack app, and now everything works fine. I run into issues when I need to install npm packages. Let me first show you the structure.
src/server/Dockerfile
FROM node:14-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install -qyg nodemon@2.0.7
RUN npm install -qy
COPY . .
CMD ["npm", "run", "dev"]
src/client/Dockerfile
FROM node:14-alpine
WORKDIR /usr/app
COPY package*.json ./
RUN npm install -qy
COPY . .
CMD ["npm", "run", "dev"]
src/docker-compose.yml
version: "3"
services:
client:
build:
context: ./client
dockerfile: Dockerfile
ports:
- 3000:3000
networks:
- mern-network
volumes:
- ./client/src:/usr/app/src
- ./client/public:/usr/app/public
depends_on:
- server
environment:
- REACT_APP_SERVER=http://localhost:5000
- CHOKIDAR_USEPOLLING=true
command: npm run dev
stdin_open: true
tty: true
server:
build:
context: ./server
dockerfile: Dockerfile
ports:
- 5000:5000
networks:
- mern-network
volumes:
- ./server/src:/usr/app/src
depends_on:
- db
environment:
- MONGO_URL=mongodb://db:27017
- CLIENT=http://localhost:3000
command: /usr/app/node_modules/.bin/nodemon -L src/index.js
db:
image: mongo:latest
ports:
- 27017:27017
networks:
- mern-network
volumes:
- mongo-data:/data/db
networks:
mern-network:
driver: bridge
volumes:
mongo-data:
driver: local
Now, if I install any package using the host machine, package.json is updated as expected, and if I run
docker-compose build
package.json is also updated inside the container, which is fine. But I feel this breaks the whole point of having the app Dockerized: if multiple developers need to work on this app and they all need Node/npm installed on their machines, what is the point of using Docker other than for deployments? So what I do right now is:
sudo docker exec -it cebc4bcd9af6 sh   # log into the server container
and run a command, e.g.
npm i express
It installs the package and updates the container's package.json, but the host package.json is not updated, and if I run the build command again all changes are lost, since the Dockerfile copies the host's source code into the container. Is there a way to synchronize the container and the host, so that installing a package inside the container also updates the host files? That way I would not need Node/npm installed locally, which fulfills the purpose of having the app Dockerized.
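
One common way to get that synchronization, sketched here as the usual pattern rather than something from this thread: bind-mount the whole service directory so package.json lives on the host, keep node_modules in an anonymous volume so the host does not shadow the container's installed modules, and run npm through the service. For the server service the volumes would become:
volumes:
  - ./server:/usr/app            # whole project dir, package.json included
  - /usr/app/node_modules        # anonymous volume keeps modules container-side
Then install packages via the service name instead of a raw container ID:
docker-compose exec server npm i express
npm writes package.json and package-lock.json through the bind mount, so the host copies stay in sync, while the modules themselves live only inside the container.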

Docker does not build and run first service when there are two services in docker-compose

docker-compose up -d works fine when I have only the postgres service in the docker-compose.yml code below. But once I add the python service, the postgres container is never run even though its image is built. docker container ls -a shows that it does not exist.
version: '3'
services:
  postgres:
    build:
      context: .
      dockerfile: Dockerfile.postgres
    restart: always
    container_name: test_postgres
    ports:
      - "5431:5432"
  # Once this python service is added, the postgres does not run.
  python:
    depends_on:
      - postgres
    build:
      context: .
      dockerfile: Dockerfile.python
    restart: on-failure:10
    container_name: test_python
    ports:
      - "8001:8000"
I haven't been able to find clear information on why this happens. Some solutions mention that version 3 no longer uses depends_on; I thought this might be the issue, so I removed it and added restart: on-failure:10, but it made no difference.
If I run docker-compose up -d with just the postgres service first, then add the python service to the same docker-compose.yml file and run it again, both images are built and both containers run properly.
Not sure if necessary but here are the Dockerfiles for the services:
Dockerfile.postgres:
FROM postgres
WORKDIR /docker-entrypoint-initdb.d
ENV POSTGRES_DB test_postgres
ENV POSTGRES_PASSWORD 1234
COPY init.sql /docker-entrypoint-initdb.d
EXPOSE 5432
Dockerfile.python:
FROM python:latest
RUN mkdir /code
WORKDIR /code
COPY ./backend/ /code
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
RUN python manage.py migrate
RUN python manage.py loaddata customers
EXPOSE 8000
CMD python manage.py runserver 0.0.0.0:8000
What am I doing wrong?
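
A likely cause, offered as an educated guess rather than a confirmed answer: RUN python manage.py migrate and RUN python manage.py loaddata execute at image build time, when the postgres service is not reachable (image builds do not join the compose network). If the python image build fails, docker-compose up -d aborts before creating any containers, which would explain why test_postgres never runs even though its image was built. Deferring those steps to container start avoids this; a sketch of the tail of Dockerfile.python:
EXPOSE 8000
# run migrations and fixtures at start-up, when the database is reachable
CMD python manage.py migrate && \
    python manage.py loaddata customers && \
    python manage.py runserver 0.0.0.0:8000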

Hot reloading of Gatsby doesn't work inside docker for Windows

I have set up Gatsby to work inside a Docker container and it works perfectly fine, except for hot reloading.
I tried something like gatsby develop --host 0.0.0.0 --port 8080, but it doesn't hot-reload; I have to restart the container manually.
In your docker-compose file you need to add the CHOKIDAR_USEPOLLING environment variable, which makes the file watcher poll for changes (change events from the host do not reach the container on Docker for Windows):
docker-compose.yml
version: '3'
services:
  gatsby-app:
    build:
      context: ./
      dockerfile: Dockerfile
    image: gatsby-app
    container_name: gatsby-app
    working_dir: /app
    volumes:
      - /app/node_modules
      - ./app:/app
    ports:
      - 80:8000
      - 81:9000
    environment:
      - NODE_ENV=development
      - GATSBY_WEBPACK_PUBLICPATH=/
      - CHOKIDAR_USEPOLLING=1
Your Dockerfile should be:
Dockerfile
FROM node:latest
EXPOSE 8000
RUN npm install -g gatsby-cli yarn
WORKDIR /app
COPY ./app/package.json .
RUN yarn install && yarn cache clean
CMD ["yarn", "develop", "-H", "0.0.0.0", "-p", "8000"]

Docker on Windows 10: D: drive not shared

I have a Django REST project which I am Dockerizing.
My Dockerfile:
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY . /code/
RUN pip install -r requirements.txt
And docker-compose:
version: '3'
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
I first ran docker-compose build, which was successful. I then ran docker-compose up, which gives the error: ERROR: for web Cannot create container for service web: D: drive is not shared. Please share it in Docker for Windows Settings.
How do I fix this?
You just need to enable the drive for sharing in the Docker Desktop settings (Settings → Resources → File Sharing).
