I just want to know whether you have a tutorial for a Golang app that sends logs to Elasticsearch with Docker.
I want to send my logs over a TCP connection (with Logstash or Filebeat).
I would be very happy with any recommendation. Thanks!
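In case it helps, here is a minimal sketch of the Go side, assuming a Logstash container with a TCP input listening on port 5000 (the host name and port are assumptions; adjust to your setup):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net"
	"time"
)

func main() {
	// Connect to the Logstash TCP input (host/port are assumptions).
	conn, err := net.Dial("tcp", "localhost:5000")
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	entry := map[string]string{
		"@timestamp": time.Now().Format(time.RFC3339),
		"level":      "info",
		"message":    "hello from my Go app",
	}
	line, _ := json.Marshal(entry)
	// One JSON object per line pairs well with Logstash's json_lines codec.
	fmt.Fprintf(conn, "%s\n", line)
}
```

On the Logstash side this would pair with something like input { tcp { port => 5000 codec => json_lines } } and an elasticsearch output.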
I am getting data from a Kafka topic through Logstash and want to connect to an SSL-protected Elasticsearch (all of these run in Docker images). I have tested the connection with a non-SSL Elasticsearch and it worked fine, but with the SSL-protected one it does not.
Do you have any suggestions?
E.g., do I have to change my Logstash output configuration to connect to https://elasticsearch:9200?
Do I have to install X-Pack in my Logstash?
Thanks in advance!
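Not a definitive answer, but a sketch of what the Logstash output often looks like against a TLS-enabled Elasticsearch; the credentials and certificate path below are assumptions for illustration:

```
output {
  elasticsearch {
    hosts    => ["https://elasticsearch:9200"]
    # Credentials and CA path are assumptions; use your cluster's values.
    user     => "elastic"
    password => "changeme"
    ssl      => true
    cacert   => "/usr/share/logstash/config/certs/ca.crt"
  }
}
```

As for X-Pack: recent Logstash distributions bundle it, so a separate install is usually not needed.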
I have a webserver listening on some port. I dockerize this server and publish its port with the command:
docker run -p 8080:8080 image-tag
Now I write a short Java client socket connecting to localhost on this port (it is able to connect). However, when I read data from this socket via the readLine function, it always returns null, and it shouldn't. Can someone point me in some direction on how to troubleshoot this? Things I have tried:
The webserver and client work fine without Docker.
Using my Docker installation, I'm able to pull the getting-started app and it works fine (which means there is no problem with my Docker; it can still publish ports).
My image uses only openjdk:latest as the base image. Other than that, nothing special.
It is Linux Docker on a Windows host.
The port the webserver is running on is correct and the same as the published port.
I would be very happy if someone could help.
By making the server app inside the container listen on address 0.0.0.0 instead of localhost, I was able to solve the problem.
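For reference, a minimal sketch of the binding change (the port matches the -p 8080:8080 mapping above; the rest is an assumption about the server code):

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.ServerSocket;

public class Server {
    public static void main(String[] args) throws IOException {
        // Binding to 0.0.0.0 accepts connections from any interface,
        // including Docker's port forwarding; binding to localhost only
        // accepts connections originating inside the container itself.
        ServerSocket server = new ServerSocket(
                8080, 50, InetAddress.getByName("0.0.0.0"));
        System.out.println("Listening on " + server.getLocalSocketAddress());
    }
}
```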
I was trying to do a quick bootstrap to see some sample data in Elasticsearch.
Here is where you use Docker Compose to get an ES cluster:
https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html
Next I needed to get Logstash in place. I did that with: https://www.elastic.co/guide/en/logstash/current/docker-config.html
When I curl my host (curl localhost:9200), it gives me the sample connection string, so I can tell Elasticsearch is exposed. But when I run the Logstash Docker image from the page above, I noticed that during bootstrap it can't connect to localhost:9200.
I was thinking that the private network created for Elastic is fine for the cluster and that I didn't need to add Logstash to it. Do I have to do something different to get the default Logstash to talk to the default Elasticsearch in Docker?
I have been stuck on this for a while. My host system is Debian 9. I am trying to think of what the issues might be. I know that -p 9200:9200 would couple the ports together, but 9200 has already been claimed by ES, so I'm not sure how I should be handling things. I didn't see anything on the website, though, that says "To link the out-of-the-box Logstash to the out-of-the-box Elasticsearch you need to do X, Y, Z."
When I attempt to get a terminal into the Logstash container with -it, it just keeps bootstrapping Logstash and never gives me a shell to see what is going on from the inside.
What recommendations do you have?
Add --link your_elasticsearch_container_id:elasticsearch to the docker run command of Logstash. The Elasticsearch container will then be visible to Logstash under http://elasticsearch:9200, assuming you don't have TLS and the default port is used (which will be the case if you follow the docs you refer to). See the command sketch below.
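A sketch of the full command; the container name, pipeline path, and image tag are assumptions:

```
docker run --rm -it \
  --link your_elasticsearch_container_id:elasticsearch \
  -v "$PWD/pipeline/":/usr/share/logstash/pipeline/ \
  docker.elastic.co/logstash/logstash:7.6.0
```

Note that --link is a legacy feature; putting both containers on a user-defined Docker network (docker network create ...) gives the same name resolution and is the currently recommended approach.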
If you need Filebeat or Kibana in the next step, see this question I answered recently: https://stackoverflow.com/a/60122043/7330758
I'm trying to launch a Docker container that runs a Tornado app in Python 3.
It serves a few API calls and writes data to a RethinkDB service on the system. RethinkDB does not run inside a container.
The system it runs on is Ubuntu 16.04.
Whenever I tried to launch the container with docker-compose, it would crash saying the connection to localhost:28015 was refused.
I went researching the problem and realized that Docker has its own network and that external connections must be configured prior to launching the container.
I used this command from a question I found to make it work:
docker run -it --name "$container_name" -d -h "$host_name" -p 9080:9080 -p 1522:1522 "$image_name"
I've changed the container name, host name, ports and image name to fit my own application.
Now the container is not crashing, but I have two problems:
I can't reach it from a browser by pointing to https://localhost/login
I lose the docker-compose workflow. This is problematic if we want to add more services that talk to each other in the future.
So, how do I launch a container that can talk to my RethinkDB database without putting that DB into a container?
Please, let me know if you need more information to answer this question.
I'd appreciate your guidance in this.
The end result is that the container will serve requests coming over HTTPS.
For example, I have an endpoint called /getURL.
The request includes a token verified against the DB. The URL looks like this:
https://some-domain.com/getURL
After verification against the DB, it sends back a relevant response.
The container needs to be able to listen on 443 and also talk to the RethinkDB service on 28015.
(Since 443 and HTTPS involve certificates, I'd also appreciate a solution that handles this over plain HTTP on some random port, and I'll take it from there.)
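For context, a minimal sketch of what such a handler might look like; the table name, token parameter, response body, and port are assumptions, and it serves plain HTTP as mentioned above:

```python
import rethinkdb as r
import tornado.ioloop
import tornado.web

class GetURLHandler(tornado.web.RequestHandler):
    def get(self):
        token = self.get_argument("token")  # assumed parameter name
        # 28015 is RethinkDB's default client driver port.
        conn = r.connect(host="localhost", port=28015)
        try:
            # Assumed schema: a "tokens" table keyed by the token value.
            if r.table("tokens").get(token).run(conn) is None:
                self.set_status(403)
                self.write({"error": "invalid token"})
            else:
                self.write({"url": "https://example.com/resource"})
        finally:
            conn.close()

app = tornado.web.Application([("/getURL", GetURLHandler)])
app.listen(9080)  # plain HTTP on an arbitrary port, as suggested above
tornado.ioloop.IOLoop.current().start()
```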
Thanks!
P.S. The service works when I launch it without Docker from PyCharm; it's the Docker configuration I have problems with.
I found a solution.
I needed to add this so that the container could reach the RethinkDB service running on the host:
--network="host"
This works for me right now, but since it isn't the best solution, I won't mark it as the answer for now.
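If you want to keep docker-compose rather than a raw docker run, the same setting exists there; a minimal sketch (the service and image names are assumptions):

```yaml
version: "3"
services:
  api:
    image: my-tornado-app   # assumed image name
    # Share the host's network stack so localhost:28015 reaches RethinkDB.
    network_mode: host
```

Note that with network_mode: host, port mappings are ignored and the app's own listen ports are used directly. An alternative that avoids host networking is to point the app at the host's address on the Docker bridge (often 172.17.0.1) instead of localhost.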
First of all, if I left anything out please forgive me as this is my first post.
I have Docker running goaws, and I added a separate container running a Python daemon that I wrote. The Python daemon reads from the SQS endpoint I have subscribed to my SNS topic and does a POST to a webapp in another Docker container running Tomcat. All of this works perfectly in one docker-compose.yml. I can publish a message directly to my goaws SNS topic using the Python publish API, and I receive the output in Elasticsearch, which sits after my webapp. I view the Elasticsearch cluster in Kibana (yet another container I have running).
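For reference, publishing directly to goaws from Python works by overriding the endpoint; a sketch with boto3, where the container name, port, and topic ARN are assumptions (goaws listens on 4100 by default):

```python
import boto3

# goaws does not validate credentials; only endpoint_url matters here.
sns = boto3.client(
    "sns",
    region_name="us-east-1",
    endpoint_url="http://goaws:4100",  # assumed container name and port
    aws_access_key_id="dummy",
    aws_secret_access_key="dummy",
)

sns.publish(
    TopicArn="arn:aws:sns:us-east-1:000000000000:my-topic",  # assumed ARN
    Message="hello from the publisher",
)
```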
I wanted to take things a step further and add Logstash to the stack in Docker. I can't get the Logstash SNS output plugin to send a message to the goaws SNS topic. It wants to send it to sns.us-east-1.amazonaws.com, which I don't have credentials for. Does anyone have any idea what is causing this issue?