I need to run a Docker container with its logs redirected to a file in a shared location. I searched a lot but didn't find a solution for this.
what I tried is:
Run the docker container first
docker run --name sample -d -p 8083:8080 xxxxx.yyyy.zzz/test/test-application:latest
then I ran
docker run -v /home/ubuntu/logs:/opt/logs sample
but when I check the /home/ubuntu/logs folder, nothing is there.
Can anyone help me on this?
Thanks in advance,
You need to mount the external directory where you want the logs stored when you run the container.
docker run --name sample -d -p 8083:8080 -v /home/ubuntu/logs:/opt/logs xxxxx.yyyy.zzz/test/test-application:latest
Alternatively, you can redirect the container's stdout/stderr to a file with a command such as
docker logs sample > /home/ubuntu/logs/sample.log
(note the redirect target must be a file, not the /home/ubuntu/logs directory itself).
see also
Redirect application logs to docker logs
and
how to redirect docker logs to a single file?
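Putting the two approaches together, a minimal sketch (assuming the application inside the container writes its log files to /opt/logs, as implied by the question; the file name sample-stdout.log is just an illustration):

```shell
# Start the container with the host directory mounted over the path
# the application writes its logs to (assumed here to be /opt/logs)
docker run --name sample -d -p 8083:8080 \
  -v /home/ubuntu/logs:/opt/logs \
  xxxxx.yyyy.zzz/test/test-application:latest

# Anything the app writes under /opt/logs now appears on the host
ls /home/ubuntu/logs

# Separately, capture whatever the container prints to stdout/stderr
docker logs sample > /home/ubuntu/logs/sample-stdout.log 2>&1
```

The key point is that the -v mount must be given on the same docker run that starts the application; a second docker run creates a separate container.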
Related
I have deployed the tensorflow-serving Docker image with the path to my TF model.
The REST API (/predict) works as expected. The problem is that, to use the model in a production environment, the logs need to be exported to Kibana.
Here is the current command to run the Docker container: sudo docker run -h 0.0.0.0 -p 80:8501 --mount type=bind,source=/pathto model/somemodel/,target=/models/somemodel -e MODEL_NAME=somemodel -t tensorflow/serving:2.8.2
I tried printing the logs with docker logs <dockerid>
Please let me know how to obtain error and request logs from the Docker container.
Things I have tried:
adding -e TF_CPP_MAX_VLOG_LEVEL=4 to the docker run command
adding -e TF_CPP_VMODULE=http_server=3 to the docker run command
The command-line argument mentioned in this discussion doesn't seem to work: https://stackoverflow.com/a/64046022/7382421
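One common pattern (an assumption on my part, not something confirmed in this thread) is to keep the verbosity env vars from the question and persist the container's stdout/stderr to a file that a log shipper such as Filebeat can forward to Kibana. The model path and log location below are placeholders:

```shell
# Run TF Serving detached, with the C++ logging env vars from above
sudo docker run -d --name tfserving -p 80:8501 \
  --mount type=bind,source=/path/to/somemodel/,target=/models/somemodel \
  -e MODEL_NAME=somemodel \
  -e TF_CPP_MAX_VLOG_LEVEL=4 \
  -e TF_CPP_VMODULE=http_server=3 \
  -t tensorflow/serving:2.8.2

# Stream the container's logs into a file a shipper can watch
mkdir -p /var/log/tfserving
docker logs -f tfserving >> /var/log/tfserving/serving.log 2>&1 &
```

Whether the request-level detail you need actually appears depends on what TF Serving itself emits at these verbosity levels.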
How to have live log from a docker container?
After many sleepless nights and many, many internet searches, an answer finally came to me in a dream, I would say - Hey! Just detach the dumb container!
The easiest way, if you have the container running and sending logs to stdout, is to use docker attach. Here is an example of running a container and attaching to it to see the output:
$ docker run -d --name topdemo ubuntu /usr/bin/top -b
$ docker attach topdemo
Basically, the syntax of the command is: docker attach [OPTIONS] <CONTAINER>
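Extending the example above, one useful option is --sig-proxy=false (a standard docker attach flag), which lets you leave the live log view without killing the container:

```shell
# Start a container that continuously writes to stdout
docker run -d --name topdemo ubuntu /usr/bin/top -b

# Attach to watch the output; with --sig-proxy=false, pressing Ctrl-C
# detaches your terminal instead of stopping the container
docker attach --sig-proxy=false topdemo

# The container is still running afterwards
docker ps --filter name=topdemo
```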
After searching quite a lot(!), I am using it like this:
NUM=`docker run -d --user 1013830000 -p 4567:4567 container_test:v1`
docker logs -f $NUM
docker stop $NUM
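The three lines above can be wrapped into a small script; the image name comes from the answer, and the trap is my addition so the container is stopped even if the log tail is interrupted:

```shell
#!/bin/sh
# Start the container detached and capture its ID
NUM=$(docker run -d --user 1013830000 -p 4567:4567 container_test:v1)

# Ensure the container is stopped however the script exits
trap 'docker stop "$NUM"' EXIT

# Stream the live log until interrupted (Ctrl-C)
docker logs -f "$NUM"
```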
I'm trying to develop a Plone project with Docker. I have used this official image of Plone 5.2.0, and the image builds and runs perfectly with:
$ docker build -t plone-5.2.0-official-img .
$ docker run -p 8080:8080 -it plone-5.2.0-official-cntr
But Plone restarts from scratch each time I run the Docker container, asking me to create the project again.
Could anybody help me with this?
Thanks in advance.
You can also use a volume for the data, like:
$ docker run -p 8080:8080 -it -v plone-data:/data plone-5.2.0-official-cntr
The next time you run a new container, it will re-use the previous data.
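A short sketch of that named-volume workflow (volume and image names are taken from this thread; the container names plone1/plone2 are just illustrations):

```shell
# First run: Docker creates the named volume "plone-data" automatically
docker run -d --name plone1 -p 8080:8080 -v plone-data:/data plone-5.2.0-official-cntr

# ...create your Plone site, then stop and remove the container
docker stop plone1 && docker rm plone1

# A brand-new container mounting the same volume sees the existing site
docker run -d --name plone2 -p 8080:8080 -v plone-data:/data plone-5.2.0-official-cntr

# The volume persists independently of any container
docker volume inspect plone-data
```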
If this helps:
Volumes are the Docker way to persist data. You can read up on them here.
When running the container, just add a -v option and specify the path where you want your data stored.
$ docker run -p "port:port" -it -v "host-path:container-path" <image>
This is expected behavior, because docker run starts a new container, which doesn't have the state from your previous container.
You can use docker start CONTAINER, which will resume that container with the state from its previous run.
https://docs.docker.com/engine/reference/commandline/start/
A more common approach is to use docker-compose.yml and docker-compose up -d, which will, in most cases, reuse previous state.
https://docs.docker.com/compose/gettingstarted/
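A minimal sketch of that approach for the Plone case (the image and volume names follow this thread; the compose file itself is an illustration, not taken from the official docs):

```shell
# Write a minimal docker-compose.yml
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  plone:
    image: plone-5.2.0-official-cntr
    ports:
      - "8080:8080"
    volumes:
      - plone-data:/data
volumes:
  plone-data:
EOF

# Bring the service up detached; the state in the named volume is reused
# across "docker-compose down" / "docker-compose up" cycles
docker-compose up -d
```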
When I execute commands in Ubuntu 18:
cd ~/r-projects
docker run -d -v $PWD:/home/rstudio rocker/rstudio
docker creates rstudio container accessible in localhost:8787. But I can't see the content of the $PWD inside RStudio session. When I save files in RStudio session and then restart the container those files persist, but I can not find them in the host using locate command. It seems that $PWD is not mounted but docker uses another folder to preserve RStudio state.
This is strange behavior. What I really want is to link a folder on the host to RStudio inside the Docker container. What am I doing wrong?
Official instructions did not help me.
Please provide the correct command.
I resolved the issue:
docker run -d -p 8787:8787 -e PASSWORD=123 -v $PWD:/home/rstudio rocker/rstudio
The problem was that I had been executing the commands inside a Kubernetes cluster.
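A quick way to verify the bind mount took effect, using standard Docker commands (the container name rstudio and the file hello.txt are just illustrations):

```shell
# Run RStudio with the current directory bind-mounted (command from the answer)
docker run -d --name rstudio -p 8787:8787 -e PASSWORD=123 \
  -v "$PWD":/home/rstudio rocker/rstudio

# Confirm the mount from outside the container
docker inspect -f '{{ range .Mounts }}{{ .Source }} -> {{ .Destination }}{{ "\n" }}{{ end }}' rstudio

# Create a file inside the container and check it appears on the host
docker exec rstudio touch /home/rstudio/hello.txt
ls "$PWD"/hello.txt
```

If the inspect output shows no bind mount, the docker CLI is not talking to the daemon you think it is, which is exactly what happens inside a Kubernetes cluster.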
I've used Docker to install Couchbase on my Ubuntu machine using (https://hub.docker.com/r/couchbase/server/). The docker run command is as follows:
docker run -d --name db -p 8091-8094:8091-8094 -p 11210:11210 -v /home/dockercontent/couchbase:/opt/couchbase/var couchbase
Everything works perfectly fine. My application connects, and I'm able to insert, update, and query Couchbase. Now I'm trying to debug a situation where Couchbase is running on my co-developer's machine, who has the same installation, i.e., Couchbase on Docker set up using the link above. To do that, I wanted to run cbbackup against his installation, so I ran the following command, which is a variation of the one from that link:
bash -c "clear && docker exec -it couch-db sh"
Can anyone please help me with the location of /opt/couchbase/bin in this setup? I believe this is where I can find cbbackup, cbrestore, and cbtransfer, which I can then use to back up and restore data from my colleague's machine.
Thanks,
Abhi.
When you run the command
docker run -d --name db -p 8091-8094:8091-8094 -p 11210:11210 -v /home/dockercontent/couchbase:/opt/couchbase/var couchbase
you're pulling a Docker image and spawning a Docker container.
Please read more about Docker and containerization.
In order to run cbbackup you need to log into your docker container.
Follow these steps:
Retrieve the container-id:
$ docker ps -a
Look for the CONTAINER ID for IMAGE NAME=couchbase
Login to the container using the command:
$ docker exec -it <container-id> bash
Change to the /opt/couchbase/bin directory:
$ cd /opt/couchbase/bin
You'll find cbbackup binary in this directory.
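From that directory, a backup invocation might look like the following. The cluster URL, credentials, and backup path are placeholders, and the cluster-URL/backup-directory form is the classic cbbackup syntax; double-check the flags against your Couchbase version:

```shell
# Inside the container, from /opt/couchbase/bin:
# back up the local cluster into /tmp/cb-backup
./cbbackup http://localhost:8091 /tmp/cb-backup \
  -u Administrator -p password

# Back on the host: copy the backup out of the container
# (<container-id> as found via "docker ps -a")
docker cp <container-id>:/tmp/cb-backup ./cb-backup
```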