Custom logs in Docker

I need to get the logs from the running container persisted on the host, so we don't lose the logs when the container restarts.
The entries that go to the standard Apache logs are handled fine with the --log-driver=syslog --log-opt syslog-tag="app_name" run options. However, each application also has a custom debug.log output.
I tried using the --log-opt syslog-address=unix://infra/py/appinstance/app/log/debug.log run parameter, but that doesn't work. I would like to plug the debug logs into the standard syslog, but I don't see how to do it. Any ideas?

The docker run --log-driver option specifies where to send your Docker container's log. The log we are talking about here is the one that you get from the docker logs command.
The content of that log is gathered from the container's process standard output and error output.
The debug.log file you are mentioning isn't sent to the standard or error output and as such won't be handled by Docker.
You have at least two options to persist those debug messages:
Write to stdout or stderr
You can make your application write its debug messages to the standard or error output instead of to the debug.log file. This way those debug messages will be handled by Docker and, given the --log-driver=syslog option, will be persisted by your host's syslog service.
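For example, a common trick (it's what the official nginx image does for its access and error logs) is to symlink the log file to the container's stdout in your Dockerfile. A minimal sketch, assuming /infra/py/appinstance/app/log/debug.log is the path your application writes to:
RUN ln -sf /dev/stdout /infra/py/appinstance/app/log/debug.log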
Mount a volume
You can also use the docker run -v option to mount a directory from your Docker host into your container.
Then configure your application so that it writes the debug.log file on that mount point.
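For example, assuming your debug.log lives under /infra/py/appinstance/app/log inside the container (the host path below is illustrative):
docker run -v /var/log/appinstance:/infra/py/appinstance/app/log --log-driver=syslog --log-opt syslog-tag="app_name" your_image
The debug.log file then ends up in /var/log/appinstance on the host and survives container restarts.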

Related

Make Docker Logs Persistent

I have been using docker-compose to set up some Docker containers.
I am aware that the logs can be viewed using docker logs <container-name>.
All logs are being printed to STDOUT and STDERR when the containers are run, there is no log 'file' being generated in the containers.
But these logs (obtained from docker logs command) are removed when their respective containers are removed by commands like docker-compose down or docker-compose rm.
When the containers are created and started again there is a fresh set of logs. No logs from the previous 'run' are present.
I am curious if there is a way to somehow prevent the logs from being removed along with their containers.
Ideally I would like to keep all my previous logs even when the container is removed.
I believe you have two ways you can go:
Make containers log into files
You can reconfigure the applications inside the container to write into logfiles rather than stdout/stderr. As you put it, you'd like to keep the logs even when the container is removed. Therefore ensure the files are stored in a (bind) mounted volume.
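A minimal sketch of that bind mount in docker-compose terms, assuming your application can be pointed at /var/log/app inside the container (service name and paths are illustrative):
services:
  app:
    image: my_image
    volumes:
      - ./logs:/var/log/app
The log files then live in ./logs on the host and survive docker-compose down or docker-compose rm.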
Reconfigure docker to store logs
Reconfigure docker to use a different logging driver. This can be especially helpful as it saves you from having to change each and every container.
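For example, a sketch of /etc/docker/daemon.json that switches the default driver to syslog for all newly started containers (the address is illustrative; restart the Docker daemon after editing):
{
  "log-driver": "syslog",
  "log-opts": {
    "syslog-address": "udp://127.0.0.1:514"
  }
}
Depending on your Docker version, the docker logs command may no longer work for containers using a remote driver; the logs then live wherever the driver sends them.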

How to auto-remove Docker container while persisting logs?

Is there any way to use Docker's --rm option that auto-removes the container once it exits but allow the container's logs to persist?
I have an application that creates containers to process jobs, and then once all jobs are complete, the container exits and is deleted to conserve space. However, in case a bug caused the container's process to exit prematurely, I'd like to persist the log files so I can confirm it exited cleanly or diagnose a faulty exit.
However, the --rm option appears to remove the container's logs along with the container.
Log to somewhere outside of the container.
You could mount a host directory into your container, so the logs will be written to that host directory and kept after the container is removed.
Or you can mount a named volume in your container, which will persist after rm (see the sketch below).
Or you can setup rsyslog - or some similar log collection agent - to export your logs to a remote service. See https://www.simulmedia.com/blog/2016/02/19/centralized-docker-logging-with-rsyslog/ for more on this solution.
The first two are hacks, but they're easier to get up and running on your workstation/server. If this is all cloud hosted, there might be a decent log offloading option (CloudWatch on AWS) which saves you the hassle of configuring rsyslog.
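A minimal sketch of the named-volume variant, assuming the job writes its log files to /var/log/job inside the container (names are illustrative):
docker run --rm -v job_logs:/var/log/job my_job_image
--rm deletes the container and its anonymous volumes when it exits, but the named volume job_logs is kept, so you can read the files afterwards by mounting it into another container.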

Where does Docker save logs?

Docker seems to allow specifying any log driver of choice, either through /etc/docker/daemon.json or through options when running a container. Further, it allows specifying driver options too, but is it possible to specify the location where the logs themselves get stored? Or at least, can I find out where Docker is saving the logs even if the location is not customizable?
Reference: for example, consider the default driver, the JSON File logging driver.
Environments to consider: Ubuntu/CentOS/Windows etc., but I'm looking for a generic solution.
If you want to check the Docker daemon logs: on Linux distributions using systemd (such as recent Ubuntu and CentOS releases), you can read them with journalctl -u docker.service.
To check the logs of containers: in the case of the default json-file logging driver, you can get the logs using the command
docker logs container-id
Or get the location of a specific container's log file using docker inspect:
docker inspect --format='{{.LogPath}}' container-id
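On a typical Linux installation with the json-file driver this prints a path of the form:
/var/lib/docker/containers/<container-id>/<container-id>-json.log
(The prefix differs if you have changed Docker's data-root.)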
Hope this helps.

Is there a way to save Docker container logs automatically?

The application that I run in a container sends its logs to stdout and this can't be reconfigured. I need these logs to be written to a file to keep them. Is there a way to automatically redirect logs from stdout of a container to a file as soon as the container starts?
(I know about "docker logs" command, but it has to be controlled manually and it is no good if a container stops before logs are saved this way.)
Thanks in advance.
Modify the entrypoint with stdout and stderr redirection to a volume mount.
command > /volumemount/out 2>&1
Now everything that used to go to the container's stdout will end up in the file on the shared host volume. https://askubuntu.com/questions/625224/how-to-redirect-stderr-to-a-file/625230
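If you still want docker logs to keep working as well, a variant is to pipe through tee instead of redirecting; a sketch, assuming /volumemount is your mounted host directory:
command 2>&1 | tee /volumemount/out
Each line is then written both to the file on the volume and to the container's stdout, so Docker's logging driver still sees it.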

Parse Server Logs in Docker?

So I need to dockerize my Parse Server for a project and I'm a bit new to Docker.
I'm used to Heroku, where I could just use heroku logs or connect Papertrail to see the Parse logs to help debug things, but I have no clue how to see my Parse-specific logs when it's running in a Docker container.
I'm able to do a test curl and get data back, so I know it's working, but I have no idea how to find the log data.
Searching around really doesn't lead to any results that are specific to Docker. I also tried to figure out how to write to the logs folder, but the folder always seems empty.
Docker collects each container's logs from its stdout and stderr. As described in the Twelve-Factor App methodology (which originated at Heroku), an application should send its logs to stdout:
A twelve-factor app never concerns itself with routing or storage of its output stream. It should not attempt to write to or manage logfiles. Instead, each running process writes its event stream, unbuffered, to stdout. During local development, the developer will view this stream in the foreground of their terminal to observe the app’s behavior.
That being said, everything your container streams to stdout will be stored (with the default json-file driver) in the file /var/lib/docker/containers/<container id>/<container id>-json.log. You can find the container id using the docker ps command. You don't have to read that file directly; you can run docker logs <container-id> to see the logs stored there.
If you need the log files on disk, you can have the application store its logs inside the container and mount that directory onto your host machine to see the log files. You can do something like this:
docker run -v <host directory>:<log directory in container> ...
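For example, a sketch assuming your Parse Server is configured (via its logsFolder setting) to write log files to /parse-server/logs inside the container; both paths are illustrative:
docker run -d -v /var/log/parse:/parse-server/logs your-parse-image
The log files then show up in /var/log/parse on the host, while docker logs <container-id> still shows whatever Parse Server prints to stdout.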
Hope it helps.
