I've the following docker compose file. I'm trying to connect elastic search running in another machine to kibana.
version: '3.3'
services:
  kibana_ci:
    image: docker.elastic.co/kibana/kibana:6.3.2
    environment:
      ELASTICSEARCH_URL: http://my_domain:9200
    container_name: kibana_ci
    command: kibana
    ports:
      - "5601:5601"
But Kibana keeps trying to connect to the http://elasticsearch:9200/ URL. I've also tried the following options, which didn't work.
environment:
  - "ELASTICSEARCH_URL=http://my_domain:9200"

environment:
  - "KIBANA_ELASTICSEARCH_URL=http://my_domain:9200"

environment:
  KIBANA_ELASTICSEARCH_URL: http://my_domain:9200

environment:
  elasticsearch.url: http://my_domain:9200
How can I change the URL in the docker compose file (without overriding the kibana.yml file)?
This compose file works for me:
version: '3.3'
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:6.3.2
    environment:
      SERVER_NAME: kibana.example.org
      ELASTICSEARCH_URL: http://my_domain
You don't need to specify the default port 9200 explicitly.
kibana_1 | {"type":"log","@timestamp":"2018-09-20T16:58:31Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"Unable to revive connection: http://my_domain:9200/"}
kibana_1 | {"type":"log","@timestamp":"2018-09-20T16:58:31Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}
kibana_1 | {"type":"log","@timestamp":"2018-09-20T16:58:34Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"Unable to revive connection: http://my_domain:9200/"}
kibana_1 | {"type":"log","@timestamp":"2018-09-20T16:58:34Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}
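If my_domain is not resolvable from inside the container, those "Unable to revive connection" warnings will persist no matter which environment variable you set. As a sketch (the IP address here is a placeholder, not from the question), Compose's extra_hosts can map the name to the other machine's address:

```yaml
services:
  kibana_ci:
    image: docker.elastic.co/kibana/kibana:6.3.2
    environment:
      ELASTICSEARCH_URL: http://my_domain:9200
    extra_hosts:
      # placeholder IP; substitute the actual address of the Elasticsearch host
      - "my_domain:192.168.1.50"
```

You can verify resolution from inside the container before changing anything else.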
For those who face the same issue in Kibana 7.5: you have to use the ELASTICSEARCH_HOSTS environment variable instead of ELASTICSEARCH_URL, like below:
kibana:
  image: docker.elastic.co/kibana/kibana:7.5.2
  container_name: kibana
  environment:
    ELASTICSEARCH_HOSTS: http://es01:9200
  ports:
    - 5601:5601
  depends_on:
    - es01
  networks:
    - elastic
You can also consult the following link for the list of all available environment variables and how to set them up in a Docker environment:
https://www.elastic.co/guide/en/kibana/7.5/docker.html
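Since ELASTICSEARCH_HOSTS accepts a list, you can also point Kibana at several nodes at once; a sketch, assuming es01 and es02 are your Elasticsearch service names:

```yaml
kibana:
  image: docker.elastic.co/kibana/kibana:7.5.2
  environment:
    # JSON array syntax lets Kibana try multiple nodes
    ELASTICSEARCH_HOSTS: '["http://es01:9200","http://es02:9200"]'
```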
Related
I have an ElasticSearch cluster running somewhere, and I thought I would spin up a Kibana container on my local machine and connect to the cluster, but it's not working. It looks like it's looking for a local ES.
kibana_1 | {"type":"log","@timestamp":"2022-08-31T09:06:05Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"Unable to revive connection: http://elasticsearch:9200/"}
kibana_1 | {"type":"log","@timestamp":"2022-08-31T09:06:05Z","tags":["warning","elasticsearch","admin"],"pid":1,"message":"No living connections"}
This is the docker-compose.yml I'm using:
version: "3"
services:
  kibana:
    image: kibana:7.0.1
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_URL=https://esinstance.us-east-1.es.amazonaws.com/
      - ELASTICSEARCH_USERNAME=admin
      - ELASTICSEARCH_PASSWORD=pass123
You need to change the ELASTICSEARCH_URL environment variable to ELASTICSEARCH_HOSTS.
The docker-compose.yml file will then look like this:
version: "3"
services:
  kibana:
    image: kibana:7.0.1
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_HOSTS='["https://esinstance.us-east-1.es.amazonaws.com"]'
      - ELASTICSEARCH_USERNAME=admin
      - ELASTICSEARCH_PASSWORD=pass123
I ran a docker-compose file to set up Elasticsearch and Kibana on Ubuntu 18.04 LTS. The Kibana container is up and running just fine, but Elasticsearch goes down after about 10 seconds. I have restarted the containers and the Docker service several times and still got the same result. I've been on this all day and am hoping to get some help.
Docker Compose file:
version: "3.0"
services:
  elasticsearch:
    container_name: es-container
    image: docker.elastic.co/elasticsearch/elasticsearch:7.16.3
    environment:
      - xpack.security.enabled=true
      - xpack.security.audit.enabled=true
      - "discovery.type=single-node"
      - ELASTIC_PASSWORD=secretpassword
    networks:
      - es-net
    ports:
      - 9200:9200
  kibana:
    container_name: kb-container
    image: docker.elastic.co/kibana/kibana:7.16.3
    environment:
      - ELASTICSEARCH_HOSTS=http://es-container:9200
      - ELASTICSEARCH_USERNAME=elastic
      - ELASTICSEARCH_PASSWORD=secretpassword
    networks:
      - es-net
    depends_on:
      - elasticsearch
    ports:
      - 5601:5601
networks:
  es-net:
    driver: bridge
I also checked the logs on the es-container, and it displayed:
Created elasticsearch keystore in
/usr/share/elasticsearch/config/elasticsearch.keystore
Audit logging can only be enabled with a paid Elasticsearch subscription, and you don't provide any license information to your container.
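A minimal fix, assuming you don't hold a license, is simply to drop the audit flag (alternatively, xpack.license.self_generated.type=trial starts a 30-day trial that does permit it). A sketch of the corrected service:

```yaml
elasticsearch:
  container_name: es-container
  image: docker.elastic.co/elasticsearch/elasticsearch:7.16.3
  environment:
    - xpack.security.enabled=true
    # xpack.security.audit.enabled removed: it requires a paid or trial license
    - discovery.type=single-node
    - ELASTIC_PASSWORD=secretpassword
```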
I have the below docker-compose.yaml:
version: "3.9"
services:
  server:
    depends_on:
      - db
    build:
      context: .
    container_name: grpc-server
    hostname: grpc-server
    networks:
      - mynet
    ports:
      - 8080:8080
    deploy:
      restart_policy:
        condition: on-failure
  db:
    image: postgres
    container_name: postgres-db
    hostname: postgres
    networks:
      - mynet
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 5432:5432
networks:
  mynet:
    driver: bridge
However my server container logs are indicating it can't connect to the db.
[error] failed to initialize database, got error dial tcp 127.0.0.1:5432: connect: connection refused
I'm assuming I need to inject the db address into the server somehow via mynet?
It looks like your grpc-server container tries to connect to the database using the address 127.0.0.1:5432.
By default, docker compose creates a virtual network where each container is addressed using its service name. However, you've overridden that by specifying hostname: postgres for your database container.
So your grpc-server needs to connect to the database using the address postgres:5432 rather than 127.0.0.1:5432.
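How you pass that address in depends on your server code; one common pattern (DATABASE_URL is a hypothetical variable your server would have to read, not something from the question) is to inject it through the Compose environment:

```yaml
server:
  depends_on:
    - db
  environment:
    # hypothetical variable; the point is that the host part is "postgres"
    # (the db container's hostname), not 127.0.0.1
    DATABASE_URL: postgres://postgres:postgres@postgres:5432/postgres
```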
I'm using ecs-cli to deploy my docker-compose.yml to ECS with SSL support.
When I run the command, it shows me that the container is running, but when I browse to the URL I get a 404 error.
Why?
this is my docker-compose.yml:
version: '2'
services:
  tester-cluster:
    image: yeasy/simple-web:latest
    environment:
      VIRTUAL_HOST: mydomin.net
      LETSENCRYPT_HOST: mydomin.net
      LETSENCRYPT_EMAIL: mydomin@gmail.com
  nginx-proxy:
    image: jwilder/nginx-proxy
    ports:
      - '80:80'
      - '443:443'
    volumes:
      - '/etc/nginx/vhost.d'
      - '/usr/share/nginx/html'
      - '/var/run/docker.sock:/tmp/docker.sock:ro'
      - '/etc/nginx/certs'
  letsencrypt-nginx-proxy-companion:
    image: jrcs/letsencrypt-nginx-proxy-companion
    volumes:
      - '/var/run/docker.sock:/var/run/docker.sock:ro'
    volumes_from:
      - 'nginx-proxy'
You will have to set WORDPRESS_DB_HOST for the WordPress server as well. It will be something similar to the following:
WORDPRESS_DB_HOST: mysql:3306
Note that the host name is the name of the db container.
You can view container logs by running the following:
docker-compose logs -f -t
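For context, a sketch of what that looks like in a Compose file (the mysql and wordpress service names and the root password are assumptions for illustration):

```yaml
services:
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
  wordpress:
    image: wordpress
    environment:
      # the host part must match the db service/container name
      WORDPRESS_DB_HOST: mysql:3306
```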
I'm trying to configure docker-compose with Kibana and Elasticsearch, and I would like to know: do I need Logstash as well?
No. If you don't need the Logstash functionality, you don't need it.
A simple example with Elasticsearch and Kibana would be:
---
version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:$ELASTIC_VERSION
    volumes:
      - /usr/share/elasticsearch/data
    ports:
      - 9200:9200
  kibana:
    image: docker.elastic.co/kibana/kibana:$ELASTIC_VERSION
    links:
      - elasticsearch
    ports:
      - 5601:5601
Kibana credentials (if you are using version 5):
login: elastic
password: changeme
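Note that links is a legacy option; on current Compose versions the same setup can be sketched with the default network and depends_on instead (same images, with the Elasticsearch URL made explicit):

```yaml
version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:$ELASTIC_VERSION
  kibana:
    image: docker.elastic.co/kibana/kibana:$ELASTIC_VERSION
    depends_on:
      - elasticsearch
    environment:
      # the service name resolves on the default Compose network
      ELASTICSEARCH_URL: http://elasticsearch:9200
```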