Streamlit in Docker does not open via the internal URL, only via localhost - docker

Here's the reproducible example
Dockerfile
FROM python:3.8
WORKDIR /app
RUN pip install streamlit
ENTRYPOINT ["streamlit", "run", "app.py"]
Docker Commands used
docker build -t streamlit-app:latest .
docker run -ti streamlit-app:latest
Weirdly enough, it works via the network URL provided by Streamlit on my Ubuntu system's Docker install, but on my M1 Mac I have to use localhost:8501.
Does that have something to do with the issue?
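One thing worth checking (an assumption, since the run command above doesn't publish any port): the "Network URL" Streamlit prints is the container's internal IP, which is routable from the host on native Linux Docker but not on Docker Desktop for Mac, where containers run inside a VM. A minimal sketch publishing Streamlit's default port so localhost works on both platforms:

```shell
docker run -p 8501:8501 streamlit-app:latest
```

With the port published, http://localhost:8501 should work on both machines; the internal network URL working on Ubuntu is a Linux-only side effect, not something to rely on.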

Related

docker container stops after docker run

I have a Dockerfile which, when built and run, stops. I am trying to run both client and server in one Docker container. A docker-compose solution is already in place and working fine; please advise how to keep the container up and running using docker run. Thanks!
Here is my docker file, package.json and screenshot of folder structure.
DockerFile contents:
FROM node:14.14.0-alpine
RUN apk update && apk add bash
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
WORKDIR /app
EXPOSE 3000
EXPOSE 4565
CMD ["npm","run","prebuild"]
docker build command:
docker build -t sample .
docker run command:
docker run -d -it --name sm -v `pwd`:/app sample
Package.json:
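A container exits as soon as its CMD finishes, so if `npm run prebuild` only builds and returns, the container stops immediately. A minimal sketch of the Dockerfile, assuming package.json defines a long-running `start` script (an assumption, since the file is only shown as a screenshot):

```Dockerfile
FROM node:14.14.0-alpine
RUN apk update && apk add bash
SHELL ["/bin/bash", "-o", "pipefail", "-c"]
WORKDIR /app
EXPOSE 3000
EXPOSE 4565
# Run a long-lived process as CMD so the container stays up;
# a build-only script exits and takes the container down with it.
CMD ["npm", "run", "start"]
```

If the build step is still needed, it can run first, e.g. `CMD ["sh", "-c", "npm run prebuild && npm run start"]`.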

Unable to Run the Fast API Application deployed in Docker Container

I have created a Docker image based on the following Dockerfile
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.7
COPY . /usr/app/
EXPOSE 80
WORKDIR /usr/app/
RUN pip install -r requirements.txt
CMD ["uvicorn", "app_Testing_Praveen:app", "--host", "0.0.0.0", "--port", "80"]
following the documentation available at
https://fastapi.tiangolo.com/deployment/docker/
After running the command
docker run -p 80:80 image_name
My Docker image is running and reports the address 0.0.0.0:80, but I am not able to find the absolute link to open the application. I know that, due to virtualization, Docker has a different external IP address.
I found that IP on my Docker network interface as the "docker subnet mask", but that value also does not open the application in the browser.
My Docker version is Docker version 20.10.5, build 55c4c88, and I am running this on Windows.
You reach services inside Docker containers via the IP of the host machine.
So you either access your service at http://localhost:80 or, from another machine, at http://&lt;docker_host_ip&gt;:80.
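The 0.0.0.0 in the logs is a bind address, not a URL to visit: a server bound to 0.0.0.0 listens on all interfaces and is therefore reachable through localhost. A small self-contained Python sketch (using the stdlib http.server rather than uvicorn, purely for illustration) demonstrates this:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from the container"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the example quiet

# Bind to 0.0.0.0 (all interfaces); port 0 lets the OS pick a free port.
server = HTTPServer(("0.0.0.0", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The same socket is reachable via the loopback name "localhost".
with urllib.request.urlopen(f"http://localhost:{port}/") as resp:
    status = resp.status
    body = resp.read()
server.shutdown()
print(status, body)
```

Inside the container the FastAPI app behaves the same way: `--host 0.0.0.0` plus `-p 80:80` is exactly what makes http://localhost:80 work from the Windows host.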

I am able to run the container in docker but unable to view in the browser

I am new to Docker. First, I created a Dockerfile within the source code location.
Here is my Dockerfile
FROM nginx:latest
RUN mkdir /app
COPY . /app
EXPOSE 8000
Then I built an image using: docker build -t mywebapp:v1 .
and I ran the container using the following command:
docker run -d -p 8000:8000 mywebapp:v1
The problem is: the container is running on port 8000, but I am unable to view it in the browser at
http://192.168.13.135:8000
Please help me out with this problem so I can view it in the browser.
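One likely cause (an assumption, since no nginx config is shown): the nginx:latest image listens on port 80 inside the container. EXPOSE 8000 only documents a port, and -p 8000:8000 maps host port 8000 to container port 8000, where nothing is listening. A sketch mapping host port 8000 to nginx's actual port:

```shell
docker run -d -p 8000:80 mywebapp:v1
```

Note also that COPY . /app puts the files where nginx does not serve from by default; copying the site into /usr/share/nginx/html (nginx's default document root) would be needed for the content to appear.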

How to connect your pytest container to appium container running in same docker machine using --link command

I am running Appium in a container, and I am able to run my tests in a container as well. Both run on the same Docker machine, but I am not able to point my tests at the Appium container. I tried running the tests with --link, but it is not working.
running my appium container with command:
docker run -d -p 32769:4723 --privileged --name appium_server_v1 appium/appium
exposing port 32769 for pytest to consume
using host = 0.0.0.0 and port = 32769 in my desired capabilities
I am running my pytest tests using command:
docker run -it --link appium_server_v1:appium/appium --name uitests_v1 uitests
uitests is my image which contain my tests
I have build it with a Dockerfile whose contents are:
FROM python:alpine3.7
WORKDIR .
COPY . .
RUN pip install --trusted-host pypi.python.org -r requirements.txt
EXPOSE 80
CMD ["pytest"]
I am using mac os
I am able to run the tests when they run locally and the Appium server runs in a container on a Docker machine, using HOST = 192.168.99.100 and PORT = 32769.
=========================================================================================== test session starts ============================================================================================
platform linux -- Python 3.7.2, pytest-4.3.0, py-1.8.0, pluggy-0.9.0
rootdir: /, inifile:
plugins: metadata-1.8.0, html-1.20.0
collecting ...
These are the pytest logs; the run does not proceed past this point, and nothing appears in the Appium server logs.
I am using a real device, connected via adb (host and port) through the Appium container. I expect my tests to run on the real device; they run when started locally, but not when I dockerise the tests.
The --link option is deprecated. You should create a network and connect the containers to it:
docker network create mynet
docker container run -d -p 32769:4723 --privileged --network=mynet --name appium_server_v1 appium/appium
docker container run -it --network=mynet --name uitests_v1 uitests
And then you can connect to your appium server using appium_server_v1:4723 from within the uitests_v1 container.
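In the test code, the server URL would then point at the container name rather than 0.0.0.0 or the published port. A hypothetical sketch (the capability values and the /wd/hub suffix are assumptions based on the setup above):

```python
# The container name acts as a DNS alias on the shared network,
# and the internal port 4723 is used, not the published 32769.
APPIUM_HOST = "appium_server_v1"
APPIUM_PORT = 4723

desired_caps = {
    "platformName": "Android",   # placeholder values
    "deviceName": "real-device",
}
server_url = f"http://{APPIUM_HOST}:{APPIUM_PORT}/wd/hub"
print(server_url)
```

This URL is what would be passed to the Appium client's Remote constructor in the pytest code.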
The solution @michalk gave worked. I have edited my Dockerfile a little:
# Use an official Python runtime as a parent image
FROM python:alpine3.7
# Set the working directory to /app
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
# Install any needed packages specified in requirements.txt
RUN pip install --trusted-host pypi.python.org -r requirements.txt
# Make port 80 available to the world outside this container
EXPOSE 80
# Run pytest when the container launches
CMD ["pytest","/app/"]
and I removed all the .pyc files, to make the tests run properly in the Docker container, using the command:
find . -name "*.pyc" -exec rm -f {} \;

Docker Desktop Community for Windows | Container Caching

Does the Docker Desktop Community version for Windows cache containers?
I was removing some of my containers and then composing them again for a Python 3/Flask/Angular 7 application, and it brought them up quickly without installing dependencies. I had to remove the containers and then restart my machine for it to build them again.
I was running this command:
docker-compose up --build
Yes, I have a docker-compose.yml. I also have a Dockerfile with commands to install the dependencies.
FROM python:3.7
RUN mkdir -p /var/www/flask
# Update working directory
WORKDIR /var/www/flask
# Copy everything from this directory to the server/flask docker container
COPY . /var/www/flask/
# Give execute permission to the file below, so that the script can be
# executed by Docker.
RUN chmod +x /var/www/flask/entrypoint.sh
# Install the Python libraries
RUN pip3 install --no-cache-dir -r requirements.txt
# Copy uwsgi.ini
COPY ./uwsgi.ini /etc/uwsgi.ini
EXPOSE 5000
# Run server
CMD ["./entrypoint.sh"]
I also tried the following commands:
docker system prune
docker-compose up --build --force-recreate
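What looks like "container caching" is usually Docker's image build cache: unchanged layers (such as the RUN pip3 install step) are reused on rebuild, so dependencies are not reinstalled. A sketch of forcing a rebuild without cached layers, assuming the compose file from the question:

```shell
# Rebuild every image without reusing cached layers
docker-compose build --no-cache
# Recreate the containers from the freshly built images
docker-compose up --force-recreate
```

Note that docker system prune alone does not remove the build cache of images still in use; rebuilding with --no-cache is the direct way to bypass it.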
