Logstash error: open /usr/share/logstash/config/logstash.yml: permission denied - docker

I have a logging repo that contains Kibana, Logstash, Elasticsearch, and a setup container. It runs well on one server, but when I try to run it on another server using Docker, it fails with "permission denied".
I ran docker compose logs -f for the setup container and it returned the following: Could not resolve host. Is Elasticsearch running?
Note: it builds correctly and shows "Started", but only Filebeat and setup are up; Logstash and Elasticsearch are not.
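A plausible cause of the "permission denied" on the new server is host file permissions on a bind-mounted config: the official Logstash image runs its process as uid 1000, so the mounted logstash.yml must be readable by that user. A minimal sketch, where the image tag, host paths, and service layout are assumptions rather than details from the repo:

```yaml
# sketch only: make the bind-mounted config readable by the container's
# logstash user (uid 1000); on the host, run:
#   chmod 644 logstash/config/logstash.yml
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.4
    volumes:
      # read-only mount; the host file must be world-readable or owned by uid 1000
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
```

On SELinux-enabled hosts, the mount may additionally need a `:Z` suffix so the container is allowed to read it.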

Related

docker compose logs follow not running

I am using Docker Compose on Mac, and since a recent Docker update, docker-compose logs -f displays the logs but does not follow them: once the logs are printed, the process stops.
I am only using Docker Compose locally.
Docker Compose version v2.0.0-beta.6
Docker version 20.10.7, build f0df350
From time to time when running docker-compose logs I get Error response from daemon: configured logging driver does not support reading. I don't know whether this is related.
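The "configured logging driver does not support reading" error typically appears when a container uses a log driver without read support (for example syslog or awslogs); docker-compose logs can only read and follow drivers such as json-file. A hedged sketch of pinning a readable driver per service (the service name and limits are illustrative, not from the question):

```yaml
# sketch: pin a readable log driver so `docker-compose logs -f` can follow
services:
  web:                  # illustrative service name
    image: alpine
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
```

Checking `docker inspect -f '{{.HostConfig.LogConfig.Type}}' <container>` on the affected container would confirm whether this is the issue.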

docker-compose up does not work with remote host (but docker itself does)

I would like to deploy an application to a remote server by using docker-compose with a remote context, following this tutorial.
The Dockerfile contains
FROM ubuntu
The docker-compose.yml contains
version: "3.8"
services:
  ubuntu_test:
    build: .
The remote context remote is set as ssh://root@host
When I run docker-compose --context remote up, it crashes with the following error message
runtime/cgo: pthread_create failed: Resource temporarily unavailable
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7fb21d93bfb7 m=3 sigcode=18446744073709551610
goroutine 0 [idle]:
runtime: unknown pc 0x7fb21d93bfb7
stack: frame={sp:0x7fb21aee9840, fp:0x0} stack=[0x7fb21a6ea288,0x7fb21aee9e88)
[...]
ERROR: Couldn't connect to Docker daemon at http+docker://ssh - is it running?
If it's at a non-standard location, specify the URL with the DOCKER_HOST environment variable.
What is already working
Copying the source code to the remote server, logging in and running docker-compose up
Unpacking docker-compose into the corresponding docker commands
docker --context remote build .: works
docker --context remote run ubuntu: works
Using docker-compose --context remote build on the local machine to build the images on the remote server
In summary, everything works except for docker-compose --context remote up, and I can't for the life of me figure out why. All I have is this cryptic error message (but Docker is obviously running on the remote server, otherwise docker with the remote context would fail too).
Edit: My problem can be reduced to: What is the difference between docker --context remote run ubuntu and docker-compose --context remote up (as defined in my case)?
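For what it's worth, one commonly reported cause of this kind of crash is that docker-compose over an SSH context opens many concurrent SSH sessions, which can exceed the remote sshd's default MaxSessions limit of 10 (plain docker opens far fewer, which would explain the difference). A hedged sketch of two workarounds; root@host comes from the question, while the MaxSessions value is an arbitrary choice:

```shell
# Option 1 (assumption: sshd session exhaustion is the culprit):
# raise MaxSessions on the remote host, then retry compose
ssh root@host "echo 'MaxSessions 100' >> /etc/ssh/sshd_config && systemctl restart sshd"

# Option 2: bypass the compose context machinery entirely and point
# docker-compose at the remote daemon through DOCKER_HOST
export DOCKER_HOST=ssh://root@host
docker-compose up
```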

Permission Denied on port when starting HDP Sandbox Proxy on Docker (Windows 10)

I am getting the following error when trying to start sandbox-proxy (proxy-deploy.sh) on docker.
I have tried reinstalling, rebooting, and checking the ports already in use with netstat -a -n. Nothing helped.
Error response from daemon: driver failed programming external connectivity on endpoint sandbox-proxy (b710798aa75668908d359602541ed4d8a3da4e4b8b2856f5e779453ea296aeef): Error starting userland proxy: Bind for 0.0.0.0:50111: unexpected error Permission denied
Error: failed to start containers: sandbox-proxy
Go to the location where you saved the Docker deployment scripts – refer to Deploy HDP Sandbox as an example. You will notice a new directory sandbox was created.
Edit file sandbox/proxy/proxy-deploy.sh
Modify the conflicting port (the first in each host:container pair). For example, change 6001:6001 to 16001:6001
Save/Exit the File
Run bash script: bash sandbox/proxy/proxy-deploy.sh
Repeat steps for continued port conflicts
More info : https://hortonworks.com/tutorial/sandbox-deployment-and-install-guide/section/3/#port-conflict
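The port-editing step above can also be scripted. This is a sketch that assumes the deploy script contains the literal mapping 6001:6001; substitute whichever port actually conflicts on your machine:

```shell
# rewrite the conflicting host port in the deploy script (a .bak backup is kept),
# then re-run the deployment
sed -i.bak 's/6001:6001/16001:6001/' sandbox/proxy/proxy-deploy.sh
bash sandbox/proxy/proxy-deploy.sh
```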

How to create rolling logs for Filebeat within a docker container

I'm new to log4j2 and the elastic stack.
I have a Filebeat Docker container that doesn't work exactly how I want, so I want to take a look at the logs. But when I run docker-compose logs I get a lot of debug messages and JSON objects; there is so much output that it's unreadable.
How can I create a log4j2 properties setup to produce rolling log files, maybe putting the old logs into monthly folders? And where do I put this log4j2.properties file?
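An aside on the log4j2 half of this question: Filebeat itself is written in Go and does not read log4j2.properties at all; log4j2 applies to the Java components of the stack, such as Logstash (whose config lives at /usr/share/logstash/config/log4j2.properties in the official image). A sketch of a time-based rolling setup, with all file names and paths as assumptions:

```properties
# sketch of a log4j2.properties with daily rolling files grouped into
# monthly folders; paths and names are illustrative
status = error

appender.rolling.type = RollingFile
appender.rolling.name = RollingFile
appender.rolling.fileName = logs/app.log
# %d{yyyy-MM} in the directory part puts rolled files into monthly folders
appender.rolling.filePattern = logs/%d{yyyy-MM}/app-%d{yyyy-MM-dd}.log.gz
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = %d %p %c - %m%n
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1

rootLogger.level = info
rootLogger.appenderRef.rolling.ref = RollingFile
```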
It's generating a lot of logs because you're running docker-compose logs, which fetches the logs of all containers in your Compose file.
What you want is probably:
docker logs <name-of-filebeat-container>. The name of the Filebeat container can be found by running docker ps.
docker compose logs <name-of-filebeat-service>. The name of the service can be found in your docker-compose.yml file.
Regarding the JSON outputs, you can query your Docker engine default logging driver with:
# docker info | grep 'Logging Driver'
Logging Driver: json-file
If your container has a different logging driver, you can check it with:
docker inspect -f '{{.HostConfig.LogConfig.Type}}' <name-or-id-of-the-container>
You can find all log drivers in this link
To run containers with a different log-driver you can do:
With docker run: docker run -it --log-driver <log-driver> alpine ash
With docker-compose:
logging:
  driver: syslog
  options:
    syslog-address: "tcp://192.168.0.42:123"
Regarding your log rotation question, I'd say the easiest way is to configure the syslog logging driver, point it at your local machine (or a syslog server), and then logrotate the files.
You can find several logrotate articles for Linux (which I assume you're using), for example this one
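If you go the syslog-plus-logrotate route, a minimal logrotate sketch might look like the following; the log path, file name, and schedule are assumptions about your syslog setup, not prescriptions:

```
# /etc/logrotate.d/docker-containers (hypothetical path)
/var/log/containers.log {
    monthly
    rotate 12
    compress
    missingok
    notifempty
}
```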

How to docker container logs to a local file?

I need to look into Docker logs from a few days ago, and checking with docker service logs SERVICE | grep WHAT_I_NEED takes forever, so I want to download the container logs from Docker Swarm and check them locally. I found that the container log path in Swarm can be found with:
docker inspect --format='{{.LogPath}}' $INSTANCE_ID
but I can't find a way to download the log from the location.
Running docker cp CONTAINER_ID:/var/lib/docker/containers/ABC/ABC-json.log ./ tells me that the path is not present. I understand that this path exists on the Swarm node, but then how do I get the log from the container itself? Or is there another way to copy this file directly to a local file?
Try running this one from your terminal:
docker logs your_container_name 2> file.log
This redirects the container's stderr stream to the local file file.log. Note that containers can log to both stdout and stderr; to capture both, use docker logs your_container_name > file.log 2>&1.
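Since the goal is logs "for some days ago", it may also help to narrow the time window server-side before grepping; docker service logs supports a --since flag. A sketch, where the service name and window are placeholders:

```shell
# fetch only the last 72 hours of logs, both streams, then search locally
docker service logs --since 72h my_service > service.log 2>&1
grep WHAT_I_NEED service.log
```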
