Is it possible to use multiple logging drivers for the same container, say fluentd and json-file?
Thank you.
As of 18.03, Docker Engine Enterprise (EE) supports dual logging, i.e. keeping a local copy of the logs alongside another configured log driver, but it is not in the Community Edition (CE):
https://docs.docker.com/ee/engine/release-notes/#18031-ee-1-2018-06-27
No, you can only specify a single logging driver per container.
To have separate sinks for your logs, you'd have to rely on something like Fluentd to receive the logs (or read the JSON log files) and configure a pipeline to distribute them.
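For example, a minimal fluent.conf sketch that fans each log event out to two sinks with Fluentd's copy output (the Elasticsearch host and the file path are placeholder assumptions):

<source>
  @type forward        # receives events from Docker's fluentd log driver
  port 24224
</source>

<match docker.**>
  @type copy           # duplicates each event to every <store> below
  <store>
    @type elasticsearch          # needs fluent-plugin-elasticsearch
    host elasticsearch.example   # placeholder host
    port 9200
  </store>
  <store>
    @type file                   # second sink: local files
    path /var/log/fluent/docker
  </store>
</match>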
Dual logging is available in Docker CE since version 20.10.1.
The feature was previously only available in Docker Enterprise since version 18.03.1-ee-1.
The official documentation chapter "Dual Logging" doesn't reflect this (as of 2021-01-04).
The feature has been open-sourced in pull request #40543 and was merged into master on 2020-02-27.
The related GitHub issue #17910 in moby/moby was closed with the following comment:
The upcoming Docker 20.10 release will come with the feature described above ("dual logging"), which uses the local logging driver as a ring-buffer, which makes docker logs work when using a logging driver that does not have "read" support (for example, logging drivers that send logs to a remote logging aggregator).
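To see dual logging in action on 20.10 or later, you can attach a driver without read support and still tail the container. A sketch (the fluentd address is a placeholder; fluentd-async avoids failing at start-up if it is unreachable):

docker run -d --name web \
  --log-driver fluentd \
  --log-opt fluentd-address=fluentd.example:24224 \
  --log-opt fluentd-async=true \
  nginx

# Works on 20.10+ even though the fluentd driver has no read support,
# because the engine keeps a local copy in a ring buffer:
docker logs web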
No, you can only specify a single logging driver, as stated in the official documentation:
You cannot specify more than one log driver.
The log-driver documentation indicates that too:
To configure the Docker daemon to default to a specific logging
driver, set the value of log-driver to the name of the logging driver
in the daemon.json file...
{
"log-driver": "syslog"
}
You can see that "log-driver" expects a string and not an array.
In fact, since Docker Engine Enterprise 18.03.1-ee-1, Docker has simply enabled a dual logging feature: you can configure any logging driver and still read the logs with docker logs.
For example, before that feature, specifying this driver in the daemon.json:
{
"log-driver": "syslog"
}
redirected the logs to a syslog server, but it also stopped Docker from publishing logs to the local logging driver.
Now that is no longer the case: the information is available in both destinations.
Starting with Docker Engine Enterprise 18.03.1-ee-1, you can use
docker logs to read container logs regardless of the configured
logging driver or plugin. This capability, sometimes referred to as
dual logging, allows you to use docker logs to read container logs
locally in a consistent format, regardless of the remote log driver
used, because the engine is configured to log information to the
“local” logging driver.
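That local copy lives in a ring buffer whose behavior can be tuned through the cache-* log options. A daemon.json sketch (the values shown are just the documented defaults):

{
  "log-driver": "syslog",
  "log-opts": {
    "cache-max-size": "20m",
    "cache-max-file": "5"
  }
}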
Related
I am trying to learn how to utilize the RClone Docker plugin to declutter my mounting strategy. Since most of my storage is remote and not on-device, I had previously just used bind mounts to the actual Linux mounts which were provided via RClone through fstab.
So in order to make that a little cleaner and store configurations better, I am largely using Docker Compose and now I am starting to add the RClone plugin to the configurations.
The problem is: how do I get the logs? So far, I couldn't interact with the RClone plugin at all aside from enabling or disabling it and setting a few default arguments; passing --log-file ... caused an error. However, docker logs is already a command, and I am pretty sure I should be able to query plugin logs too.
But how? I installed the plugin by aliasing it as rclone.
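One detail that may help: a managed plugin's stdout/stderr is forwarded to the Docker daemon's own log, so on a systemd host you can usually find the plugin's output there. A sketch, assuming systemd and the rclone alias:

# Plugin output ends up in the daemon's log on systemd hosts
journalctl -u docker.service | grep rclone

# Confirm the plugin is enabled and inspect its configuration
docker plugin ls
docker plugin inspect rclone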
I want to send logs to multiple locations from a Docker logging driver. Is it possible with any logging driver?
For PHP you can use Monolog.
Find Monolog here:
https://github.com/Seldaek/monolog
Monolog is not a driver, it's a PHP package.
It depends on your setup. Can you elaborate more?
I've installed both the agent and the piggyback plugin on the Docker node and created the hosts on the check_mk page, with the hostname pointed to the container ID, according to the https://mathias-kettner.com/cms_monitoring_docker.html documentation.
I can see the information for each running container but I can only see 3 services per container:
Check_MK
Check_MK Discovery
Docker container status
All other services shown on the documentation page and described as being automatically discovered are not shown.
Do you have any clue of what it might be?
I'm using Check_MK RAW v1.5.0p9.
If you could share what you configured for piggyback, that would be useful.
In the meantime, try this and share the output:
cmk -nvII hostname
-n - don't submit result to core
-v - verbose
-II - reinventory
I'm using Docker with my web service.
When I deploy using Docker, I lose some log files (nginx access log, service log, system log, etc.),
because the deployment brings the old container down and a new one up.
So I thought about this problem.
The logging server and the service server (for the API) must be separated!
I'm considering these methods:
First, using Logstash (in ELK) and attaching all my log files.
Second, using a batch system that moves the log files to another server every midnight.
Isn't that okay?
I'm hoping for a better answer.
Thanks.
There are several approaches that admins commonly use for container logging:
1) Mount the log directory to the host, so even if the container goes down and up, the logs persist on the host (see the sketch after this list).
2) An ELK server, using Logstash/Filebeat to tail the log files and push new log contents to the Elasticsearch server.
3) For application logs, e.g. in Maven-based projects, there are many plugins that push logs to a server.
4) A batch system, which is not recommended because if the container dies before midnight, the logs will be lost.
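For option 1, a minimal bind-mount sketch (the host path is a placeholder):

# Persist nginx logs on the host so they survive container down/up
docker run -d --name web \
  -v /var/log/myapp/nginx:/var/log/nginx \
  nginx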
I noticed that the fluentd driver uses the out_forward output to send logs, meaning all logs are sent in the clear. Is there a way to specify the output type? I'd like to be able to have Docker send logs with out_secure_forward instead.
Are there plans to enable more configuration? Should I use a different logging driver if I want security? Perhaps use the JSON file engine and then use fluentd to ship those securely?
IMO the best option to do what you want is:
introduce an additional Docker container (A) to run Fluentd in it
configure your Docker containers to send logs (via the fluentd log driver) to that container (A)
send these logs to another site from the Fluentd in container (A) by using secure_forward (see the sketch below)
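A sketch of that setup, assuming the fluent-plugin-secure-forward plugin is installed in container (A); all addresses, paths, and the shared key are placeholders:

# Container (A): Fluentd listening for the Docker fluentd log driver
docker run -d --name fluentd -p 24224:24224 fluent/fluentd

# Application containers ship their logs to (A)
docker run -d --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  nginx

And in container (A)'s fluent.conf, forward everything on with secure_forward:

<match **>
  @type secure_forward              # from fluent-plugin-secure-forward
  self_hostname fluentd-a           # placeholder
  shared_key some_shared_secret     # placeholder
  secure yes
  ca_cert_path /path/to/ca_cert.pem # placeholder certificate
  <server>
    host logs.example.com           # remote aggregation site (placeholder)
    port 24284
  </server>
</match>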