I am trying to use the syslog driver to collect logs, using the following command:
docker run --log-driver syslog --log-opt syslog-address=udp://localhost:514 --log-opt tag=expconf app-1
Where does the syslog write this log?
I want the logs to be written to a specific file/location. How can I achieve this?
I am running a new container using the following command:
docker run -d --log-driver=gelf --log-opt gelf-address=tcp://<my_log_server> nginx
Looking at the documentation, this should send the logs to my_log_server, and if I run the docker logs <my_container> command I should NOT see any logs.
But actually I do, and I don't want this.
Do you have any idea why this is happening?
I'm testing it with Docker version 20.10.8, build 3967b7d
According to the Docker documentation (https://docs.docker.com/config/containers/logging/configure/):
When using Docker Engine 19.03 or older, the docker logs command is only functional for the local, json-file and journald logging drivers. Docker 20.10 and up introduces “dual logging”, which uses a local buffer that allows you to use the docker logs command for any logging driver. Refer to reading logs when using remote logging drivers for details.
Hope it clarifies.
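To add to this: if you do not want the local copy that dual logging keeps, the cache can be disabled. Below is a minimal /etc/docker/daemon.json sketch, not a verified setup: the gelf address and port are placeholders, and the daemon must be restarted for it to take effect.

```json
{
  "log-driver": "gelf",
  "log-opts": {
    "gelf-address": "tcp://my-log-server:12201",
    "cache-disabled": "true"
  }
}
```

With the cache disabled, docker logs should again refuse to read from drivers that do not support reading, matching the pre-20.10 behavior.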
I have my logging driver set to journald. Does the log-level config in the daemon.json file impact logs when using a logging driver, or only the container logs when using docker logs <container_name>?
For example, docker and journald have documentation showing how to set log level/priority.
Docker's default setting is info: log-level: info.
With journald I can also use -p to set the log priority to info: journalctl -p info.
If my Docker logging driver is journald with the log priority set to info, do I even need to worry about setting log-level to info in the daemon.json file?
I think you may have confused three concepts: the logs of the Docker daemon, the logs of your container(s), and the logs displayed by the journalctl command.
The log-level setting in the daemon.json file affects only the logs of the Docker daemon itself.
The logs of your container(s) are determined solely by the application configuration inside each container.
The journalctl -p option ONLY affects the logs shown on your screen; -p just filters the output. Whatever level you specify, err or info, the logs have already been stored.
Hope this is helpful.
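To make the distinction concrete, here is a minimal /etc/docker/daemon.json sketch: log-driver sets the default logging driver for containers, while log-level applies only to the daemon's own messages. Both values below are just examples, not recommendations.

```json
{
  "log-driver": "journald",
  "log-level": "info"
}
```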
I used the below command to start the splunk server using Docker.
docker run -d -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_USER=root" -p "8000:8000" splunk/splunk
But when I open the URL localhost:8000, I get a "Server can't be reached" message.
What am I missing here?
I followed this tutorial: https://medium.com/@caysever/docker-splunk-logging-driver-c70dd78ad56a
Depending on your Docker version and host OS, you could be missing a port mapping for 8000 from the VirtualBox VM to the host.
This should not be needed if you are using Hyper-V (Windows host) or xhyve (Mac host), but it can still be needed with VirtualBox.
The link to the Docker image is https://hub.docker.com/r/splunk/splunk/, which gives details on pulling and running the image. According to that page, the right command is:
docker run -d -p 8000:8000 -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_PASSWORD=<password>" --name splunk splunk/splunk:latest
This works correctly for me. The image uses Ansible to do the configuration once the container has been created. If you do not specify a password, the respective Ansible task will fail and your container will not be configured.
To follow the progress of the container configuration, run this command afterwards (given that the name of your container is splunk):
docker logs -f splunk
Here you will be able to see Ansible's progress in configuring Splunk.
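If you prefer a declarative setup, the same run command can be expressed as a Compose file. This is only a sketch mirroring the image, port, and environment variables above; the password value is a placeholder you must replace.

```yaml
# docker-compose.yml - single-instance Splunk, mirroring the docker run command above
services:
  splunk:
    image: splunk/splunk:latest
    container_name: splunk
    ports:
      - "8000:8000"
    environment:
      SPLUNK_START_ARGS: "--accept-license"
      SPLUNK_PASSWORD: "changeme"  # placeholder - set your own
```

Then docker compose up -d followed by docker logs -f splunk works the same way.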
In case you are looking to create a clustered Splunk deployment, you might want to have a look at https://github.com/splunk/docker-splunk.
Hope this helps!
I need to forward docker logs to a ELK stack.
The administrator of the stack filters my logs according to the type parameter of the message. Right now I use Filebeat and have to set the document_type parameter so that the Logstash configuration filters my messages properly.
I am now trying to avoid using Filebeat, because I am going to instantiate my EC2 machines on demand and do not want to have to install Filebeat on each of them at runtime.
I have already seen that there is a syslog driver, among others. I set up the syslog driver, and the messages reach Logstash, but I cannot find how to set a value for document_type as I did in Filebeat. How can I send this metadata to Logstash using the syslog driver, or any other native Docker driver?
Thanks!
Can't you give your syslog output a tag like so:
docker run -d --name nginx --log-driver=syslog --log-opt syslog-address=udp://LOGSTASH_IP_ADDRESS:5000 --log-opt syslog-tag="nginx" -p 80:80 nginx
And then in your logstash rules:
filter {
  if "nginx" in [tags] {
    mutate {
      add_field => { "type" => "nginx" }
    }
  }
}
Can logs in a Docker container (say, logs located in /var/log/syslog) be shipped to Logstash without using any additional components such as Lumberjack or logspout?
I'm asking because I set up an environment and tried to make it work with syslog (so that syslog ships the logs from the Docker container to Logstash), but so far it isn't working. I'm wondering if there's something wrong with my logic.
There's no way for messages in /var/log/syslog to be magically routed to Logstash; something must be configured to forward them. You have a few options:
Configure your app to send log messages to stdout rather than to /var/log/syslog, and run logspout to collect stdout from all the running containers and send messages to your logstash endpoint.
Run a syslog daemon such as rsyslog inside your container and configure it to send messages to your Logstash endpoint.
Bind mount /dev/log from the host to your container by passing -v /dev/log:/dev/log to docker run when starting your container. On the host, configure your syslog daemon to send messages to logstash.
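For that last option, the host-side forwarding rule in rsyslog is a single line. This is a sketch assuming Logstash accepts syslog over TCP on port 5000; the hostname and port are placeholders.

```
# /etc/rsyslog.d/50-logstash.conf
# *.* matches every facility and priority; @@ means TCP (a single @ would be UDP)
*.* @@logstash.example.com:5000
```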
You could also use the Docker syslog driver to send logs straight from Docker containers to Logstash; you just have to add some parameters when you run your container:
https://docs.docker.com/engine/admin/logging/overview/#supported-logging-drivers
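On the Logstash side, the driver's messages can be received with the syslog input plugin. A minimal sketch, assuming port 5000:

```
input {
  syslog {
    port => 5000
  }
}
```

The plugin parses the syslog header, so a tag set with --log-opt tag=... should end up in the program field, which you can then match on in your filters.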