How to find source hostname with fluentd?

I'm looking for a way to send the source hostname to the fluentd destination server.
I was on logstash before, where we had an agent/server setup and variables to get the source hostname in the logstash server config file.
I'm looking for a similar way to do it with fluentd, but the only thing I found is to set the hostname in the source tag with "#{Socket.gethostname}". That way, though, I can't use the hostname in the path of the destination log file.
Based on the docs: http://docs.fluentd.org/articles/config-file#embedded-ruby-code
On the server side, this is what I would like to do:
<source>
  type forward
  port 24224
  bind 192.168.245.100
</source>

<match apache.access.*>
  type file
  path /var/log/td-agent/apache2/#{hostname}/access
</match>

<match apache.error.*>
  type file
  path /var/log/td-agent/apache2/#{hostname}/error
</match>
Could someone help me achieve something like this, please?
Thank you in advance for your time.

You can evaluate Ruby code with #{} inside a double-quoted string, so you can change it to:
path "/var/log/td-agent/apache2/#{hostname}/access"
(Note that the whole value has to be quoted, and that #{hostname} is evaluated when the config file is read, so on the server side this expands to the aggregator's hostname rather than the sender's.)
Refer to the docs: http://docs.fluentd.org/articles/config-file#embedded-ruby-code

You can try using the record_reformer plugin or the forest plugin.
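A sketch of the forest approach, based on fluent-plugin-forest's documented placeholders (the tag layout, paths, and client config are assumptions, not the asker's setup): embed the sender's hostname in the tag on the client, then expand it into the output path on the server.

# client side: embed the sender's hostname in the tag
# ("#{Socket.gethostname}" is evaluated once, when the config is read)
<source>
  type tail
  path /var/log/apache2/access.log
  tag "apache.access.#{Socket.gethostname}"
  format apache2
</source>

<match apache.access.*>
  type forward
  <server>
    host 192.168.245.100
    port 24224
  </server>
</match>

# server side: fluent-plugin-forest spawns one file output per tag,
# so ${tag_parts[2]} becomes the sending host's name in the path
# (assumes short hostnames without dots, since * matches a single tag part)
<match apache.access.*>
  type forest
  subtype file
  <template>
    path /var/log/td-agent/apache2/${tag_parts[2]}/access
  </template>
</match>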

Related

Exclude logs from fluentd using exclude directive not working

Trying to exclude logs using the grep filter's exclude directive.
<filter kubernetes.var.log.containers.**>
  @type grep
  <exclude>
    key kubernetes.pod_name
    pattern /^podname-*/
  </exclude>
</filter>
I tried different key names as well, e.g. container and namespace. I am trying to exclude logs from a certain pod using the pattern, but it's not working. I'm using a forward source to receive the logs.
I want to exclude logs from certain pods that start with the same name in /var/log/containers.
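For what it's worth, if the pod name sits under a nested kubernetes field (as the kubernetes_metadata filter produces), filter_grep needs record_accessor syntax to reach it; a sketch under that assumption:

<filter kubernetes.var.log.containers.**>
  @type grep
  <exclude>
    # $.kubernetes.pod_name uses record_accessor syntax to reach the nested field;
    # a plain "kubernetes.pod_name" key would look for a flat key of that literal name
    key $.kubernetes.pod_name
    # note: /^podname-*/ matches "podname" plus zero or more hyphens;
    # /^podname-/ is likely what was intended
    pattern /^podname-/
  </exclude>
</filter>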

Using fluentD to catch logs when the same file is created again

I have a log file that is continuously deleted and re-created with the same structure but different data.
I'd like to use fluentd to export that file when a new version of the file is created. I tried various sets of options, but it looks like fluentd misses the updates unless I manually add some lines to the file.
Is this a use case that is supported by default sources/parsers?
Here is the config file I use:
<source>
  @type tail
  tag file.keepalive
  open_on_every_update true
  read_from_head true
  encoding UTF-8
  multiline_flush_interval 1
  ...
</source>
Try the tail plugin, but instead of specifying a path to a file, specify a path to the parent directory, like dir/*: https://docs.fluentd.org/input/tail#path
Try adding a datetime to the filename every time you recreate it; this will force it to read all of the new file.
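A sketch combining both suggestions (the directory and pos_file paths are assumptions): watch the parent directory with a glob so each re-created file, ideally named with a datetime, is discovered as a new match.

<source>
  @type tail
  # glob on the parent directory instead of one fixed file name,
  # so a re-created file shows up as a newly discovered path
  path /path/to/dir/*
  pos_file /var/log/td-agent/keepalive.pos
  tag file.keepalive
  read_from_head true
  open_on_every_update true
  refresh_interval 1   # rescan the glob every second
  <parse>
    @type none
  </parse>
</source>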

How to configure fluentd to handle logs that sometimes don't exist?

I am adjusting our fluentd configuration to include a specific log file and send it to S3. The issue I am trying to wrap my head around is this: only some instance types in our datacenter will contain this specific log. Other instances will not (because they are not running the app that we are logging). How do you modify the configuration so that fluentd can handle the file existing or not existing?
So in the example input below, this log file will not be on every server instance; that is expected. Do we have to configure the security.conf file to look for this and skip it if missing? Or will fluentd just not include what it doesn't find?
## Inputs:
<source>
  @type tail
  path /var/log/myapp/myapp-scan.log.*
  pos_file /var/log/td-agent/myapp-scan.log.pos
  tag s3.system.security.myapp-scan
  format none
</source>
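For what it's worth, in_tail treats path as a glob that it keeps rescanning, so a path that currently matches nothing is not a startup error; on instances without the app the source simply never emits anything. A sketch making the rescan explicit (the interval value is an assumption, and 60 seconds is also the documented default):

<source>
  @type tail
  path /var/log/myapp/myapp-scan.log.*
  pos_file /var/log/td-agent/myapp-scan.log.pos
  tag s3.system.security.myapp-scan
  format none
  refresh_interval 60   # keep rescanning the glob; a missing file just never matches
</source>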

Get HostName in .jtl file in Jmeter

I am using Ant (build tool) to run JMeter functional scripts. I want to get the hostname or website name where all my JMeter scripts are running.
I checked the jmeter.properties file to make some changes, but no luck.
I fixed it and want to share the solution.
I uncommented this setting in jmeter.properties:
jmeter.save.saveservice.hostname=true
so that the hostname is written to the .jtl file; from the .jtl file I got it with the following XPath:
/testResults/httpSample/@Host
That's it. You can use this as an XSL variable for reporting or any other purpose.
You can use the following built-in functions in JMeter:
${__machineName} - to get the machine name
${__machineIP} - to get the IP address

Graylog: How to Import Apache LogFiles into the Graylog server

I have a specific need for knowing how to "import" log files I receive from anyone into Graylog. My need is not about 'sending' or configuring a collector that will be sending logs to Graylog.
I need to know if I can copy a TAR with logs onto the Graylog server and render its content via the Graylog web UI.
I have read many blogs, and I am having difficulty finding guidance for my specific need.
Your help is greatly appreciated
As far as I know it is not possible to import logs directly, but you can use fluentd (http://www.fluentd.org/guides/recipes/graylog2) to read log files.
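A minimal sketch of that fluentd route, assuming the fluent-plugin-gelf output plugin is installed (host names and paths are placeholders):

<source>
  @type tail
  path /var/log/apache2/access.log
  pos_file /var/log/td-agent/apache-access.pos
  tag apache.access
  format apache2
</source>

<match apache.access>
  @type gelf                 # from fluent-plugin-gelf
  host graylog.example.com   # your Graylog GELF input
  port 12201
  flush_interval 5s
</match>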
But if you want to send log files from Apache to Graylog directly, try this: add the following line to your apache2.conf:
LogFormat "{ \"version\": \"1.1\", \"host\": \"%V\", \"short_message\": \"%r\", \"timestamp\": %{%s}t, \"level\": 6, \"_user_agent\": \"%{User-Agent}i\", \"_source_ip\": \"%a\", \"_duration_usec\": %D, \"_duration_sec\": %T, \"_request_size_byte\": %O, \"_http_status\": %s, \"_http_request_path\": \"%U\", \"_http_request\": \"%U%q\", \"_http_method\": \"%m\", \"_http_referer\": \"%{Referer}i\" }" graylog2_access
and add the following line to your virtual host file:
CustomLog "|/bin/nc -u syslogserver.example.de 50520" graylog2_access
also take a look here: https://serverfault.com/questions/310695/sending-logs-to-graylog2-server
You could try the community edition of nxlog. With nxlog you can load your log files with im_file, parse them a bit, and get them into GELF format, which should make them easier to search in Graylog2. If you set SavePos and ReadFromLast to FALSE, it will ingest the entire log file any time you start nxlog, regardless of when the log happened, or even if it has been sent to Graylog2 before.
