I am running an nginx application using Docker. My nginx application creates some files in the Docker container, and I can see those files in the directory. I tried to read those files from my Flask application, but I cannot, since my Flask application runs in another Docker container.
Is there a way to read files from a Docker container inside a Flask application running on localhost or in another Docker container?
You can explore a Docker container's file system via:
docker exec -it [containerId] bash
You can also try docker cp.
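For example, a minimal sketch (the container names nginx_app and flask_app, the image my-flask-image, and the paths are assumptions, not from the question): copy the generated files to a host directory that the Flask container also mounts, so the Flask app can read them.
# Copy the generated files out of the nginx container to a host directory
docker cp nginx_app:/usr/share/nginx/generated /srv/shared
# Run the Flask container with that host directory mounted, so the app can read the files
docker run -d --name flask_app -v /srv/shared:/shared my-flask-image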
Related
I am trying to migrate most of my server apps to Docker. I basically have Apache2 running on my host and some PHP-based web apps as Docker containers using FPM.
As far as I know, only the *.php files are served through the Docker container, so this configuration must be added to Apache2:
ProxyPassMatch "^/phpMyAdmin/(.*\.php)$" "fcgi://localhost:9000/var/www/html/$1"
So you can't serve any static files (CSS, JavaScript) through the FPM container. Therefore I usually mount a host directory into the container like so:
docker run -v /var/www/phpMyAdmin:/var/www/html -p 9000:9000 -d phpmyadmin:fpm-alpine
But as soon as I add the mount (-v), the container's "/var/www/html" directory is empty. I checked with:
docker exec -it phpmyadmin /bin/sh
It seems like the whole phpMyAdmin installation wasn't extracted or got deleted/overwritten. This approach did work for other containers (postfixadmin, roundcube), so I have no idea what is going on or what I'm doing wrong.
How am I supposed to serve the static files from the FPM Docker container through my Apache2 host? I didn't find any example; the ones I found only use nginx as the server or Docker Compose.
Best regards,
Billie
For example, my local setup runs a server (listening on :1111) and a Docker container (started with -p 5555:5555). The goal is for the server to be able to send requests to the Docker container as well (on :5555).
I think the typical way to deploy a server is to wrap it in a Docker image and run that image in the cloud. How can I do the same thing, but have my custom Docker container started inside the server automatically (e.g., by adding a docker run command to a Dockerfile)?
I am creating a Spring Boot monitoring agent that collects Docker metrics. The agent can be attached through a POM dependency to any client Spring Boot application that runs inside a Docker container.
In the agent, I am trying to programmatically run docker stats.
But it fails to execute because the Docker container doesn't have the Docker client installed in it.
So how can I run Docker commands in a Docker container? Please note, I can't make changes to the client's Dockerfile.
You can execute Docker commands within the container by mounting the Docker socket into the container.
Run the container and mount docker.sock in the following manner:
docker run -v /var/run/docker.sock:/var/run/docker.sock ...
In short, you have to mount docker.sock in order to run Docker commands within the container.
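With the socket mounted, the container doesn't even need the Docker CLI: you can query the Docker Engine API directly over the socket, for example with curl (a sketch, assuming curl is available inside the container; the container ID is a placeholder):
# List running containers via the Engine API over the mounted socket
curl --unix-socket /var/run/docker.sock http://localhost/containers/json
# Get a one-shot stats snapshot for one container (the data behind docker stats)
curl --unix-socket /var/run/docker.sock "http://localhost/containers/<container-id>/stats?stream=false"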
How can I access a path inside a container from docker-machine? I have the docker-machine IP and I want to connect remotely into a Docker container, e.g.:
When I connect with ssh docker@5.5.5.5, all the files belong to the docker-machine, but I want to connect to a Docker container via SSH.
When I use the command docker exec -u 0 -it test bash, all files from the image are there, but I want to access them over SSH through docker-machine.
How can I do it?
This is tricky, as Docker is designed to run a single process in the foreground, and the container dies when that process completes. This means Docker containers don't run anything beyond what you define in the Dockerfile or docker-compose.yml.
What you can try is using a docker-compose.yml file to expose port 22 to the outside world (this can also be done on the command line or in the Dockerfile). This is NOT guaranteed to work, as it requires the image to run an SSH daemon, and in most cases the container only runs one process.
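If you want to try it anyway, a minimal sketch (my-ssh-image is a hypothetical image that actually runs an SSH daemon; the host port is arbitrary):
# Publish the container's SSH port on the host
docker run -d --name ssh-test -p 2222:22 my-ssh-image
# Connect through the docker-machine IP from the question
ssh -p 2222 root@5.5.5.5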
If you're looking to persist files used by containers, so that a re-deployed container picks up where it left off, you can mount a folder from the host machine into the container as a volume.
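For example (a sketch; the host path, container path, and image name are placeholders):
# Bind-mount a host folder into the container; files written there survive re-deploys
docker run -d --name web -v /srv/app-data:/var/lib/app my-image
# Removing and re-creating the container re-uses the same host folder
docker rm -f web
docker run -d --name web -v /srv/app-data:/var/lib/app my-image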
I have a container with code in it. This container runs on a production server. How can I share the folder with code in this container to my local machine? Maybe with a Samba server, and then mount (cifs) this folder with code on my machine? Maybe some examples...
Using
docker cp <containerId>:/file/path/within/container /host/path/target
you can copy some data out of the container. If the data in the container and on your machine need to be constantly in sync, I suggest you use a data volume to share a directory from your server with the container. This directory can then be shared from the server to your local machine with any method (e.g. sshfs).
The docker documentation about Manage data in containers shows how to add a volume:
$ docker run -d -P --name web -v /webapp training/webapp python app.py
With a host path before the colon (e.g. -v /path/on/your/server:/webapp), the data from that location on your server will then be available in the Docker container at /webapp.
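To then get that server directory onto your local machine, you could for example mount it over SSH with sshfs, as mentioned above (a sketch; user, host, and paths are placeholders):
# Mount the server directory that backs the container volume onto the local machine
mkdir -p ~/webapp
sshfs user@production-server:/srv/webapp ~/webapp
# Unmount when done (Linux; use umount ~/webapp on macOS)
fusermount -u ~/webapp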