I would like to modify the IIS logs before forwarding them to their destination.
I am currently parsing the IIS log with the xm_csv module, following the template.
UndefValue is disabled so that I don't get empty values.
How can I work with the data parsed by w3c_parser?
For example, I want to combine a value into a variable such as $request = '"' + $cs-method + ' ' + $cs-uri-stem + ' ' + $cs-version + '"'; but I get an error.
When I try to write a field from w3c_parser to $raw_event, I also get an error. Any other data is added without error.
For example, $raw_event = $c-ip -- error
$raw_event = $EventTime + ' ' + $http_host -- no error
Example error, logs, and config file below.
2022-03-23 16:49:56 ERROR Couldn't parse Exec block at C:\Program Files\nxlog\conf\nxlog.conf:59; couldn't parse statement at line 71, character 32 in C:\Program Files\nxlog\conf\nxlog.conf; syntax error, unexpected +, expecting (
2022-03-23 16:49:56 ERROR module 'iis_w3c' has configuration errors, not adding to route 'uds_to_file' at C:\Program Files\nxlog\conf\nxlog.conf:84
2022-03-23 16:49:56 ERROR route uds_to_file is not functional without input modules, ignored at C:\Program Files\nxlog\conf\nxlog.conf:84
2022-03-23 16:49:56 WARNING no routes defined!
2022-03-23 16:49:56 WARNING not starting unused module iis_w3c
2022-03-23 16:49:56 WARNING not starting unused module file
2022-03-23 16:49:56 INFO nxlog-ce-3.0.2272 started
Current log format:
date time s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs-version cs(User-Agent) cs(Cookie) cs(Referer) cs-host sc-status sc-bytes cs-bytes time-taken
2022-03-23 08:00:01 HOST.DOMAIN 99.XX.XX.4 GET /AnalyticsService - 443 XX.XX.XX.XXX HTTP/1.1 Zabbix - - site.host.domain 200 3918 144 4
Required log format:
$http_host $remote_addr $remote_user [$time_local] UNIX-TIME-$msec "$request" $status "$sent_http_content_type" $body_bytes_sent "$http_referer" "$http_user_agent" "$http_cookie" $request_time "$upstream_addr" NGINX-CACHE-$upstream_cache_status "$request_id" "$request_body"
host.domain 99.99.99.249 - [11/Mar/2022:20:09:56+0300] UNIX-TIME-1647018596.031 "GET /api/company.php?id=853747 HTTP/1.1" 200 "text/xml; charset=UTF-8" 1455 "-" "-" "20b6b325ea192383cb1244412247c5ea=3002538ef353c9daab4f742176a840; etpsid=f488b343a23d1a4a2332e089a0" 0.059 0.059 "10.10.10.111:80" NGINX-CACHE-- "d0b5ac12cf82671067aa5e6c5c" "-"
Panic Soft
#NoFreeOnExit TRUE
define ROOT C:\Program Files\nxlog
define CERTDIR %ROOT%\cert
define CONFDIR %ROOT%\conf\nxlog.d
define LOGDIR %ROOT%\data
define LOGFILE %LOGDIR%\nxlog.log
LogFile %LOGFILE%
Moduledir %ROOT%\modules
CacheDir %ROOT%\data
Pidfile %ROOT%\data\nxlog.pid
SpoolDir %ROOT%\data
<Extension _syslog>
    Module xm_syslog
</Extension>

<Extension fileop>
    Module xm_fileop
</Extension>

<Extension _charconv>
    Module xm_charconv
    AutodetectCharsets iso8859-2, utf-8, utf-16, utf-32
</Extension>

<Extension _exec>
    Module xm_exec
</Extension>

# Create the parse rule for IIS logs. You can copy these from the header of the IIS log file.
<Extension w3c_parser>
    Module xm_csv
    Fields $date, $time, $s-computername, $s-ip, $cs-method, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $cs-version, $cs(User-Agent), $cs(Cookie), $cs(Referer), $cs-host, $sc-status, $sc-bytes, $cs-bytes, $time-taken
    FieldTypes string, string, string, string, string, string, string, integer, string, string, string, string, string, string, string, integer, integer, integer, integer
    Delimiter ' '
    EscapeChar '"'
    QuoteChar '"'
    EscapeControl FALSE
    # UndefValue -
</Extension>

<Extension w3c_out>
    Module xm_csv
    Fields $http_host, $c-ip, $cs-username, $EventTime1, $sc-status, $Unix
    FieldTypes string, string, string, string, string, string
    Delimiter ' '
    # UndefValue -
    QuoteMethod None
</Extension>

<Input iis_w3c>
    Module im_file
    File 'C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log'
    SavePos TRUE
    <Exec>
        if $raw_event =~ /^#/ drop();
        else
        {
            w3c_parser->parse_csv();
            $EventTime = parsedate($date + " " + $time);
            $EventTime = $EventTime + (3 * 3600);
            $EventTime1 = strftime($EventTime, '[%d/%b/%Y:%H:%M:%S]');
            # $EventTime1 = '$EventTime1' + ' +0003]';
            $Unix = integer($EventTime);
            $Unix = 'UNIX-TIME-' + $Unix;
            $http_host = "site.host.domain";
            # $request = '"' + $cs-method + ' ' + $cs-uri-stem + ' ' + $cs-version + '"';
            # $request = $cs-method;
            w3c_out->to_csv();
        }
    </Exec>
</Input>

<Output file>
    Module om_file
    File 'C:\inetpub\logs\LogFiles\Parser\w3c.txt'
</Output>

<Route uds_to_file>
    Path iis_w3c => file
</Route>
Let's start with the NXLog language in the conf files. Dashes are not allowed in field names written in the plain $name form - you can check:
https://nxlog.co/docs/nxlog-ce/nxlog-reference-manual.html#lang_fields
Hence, you need to wrap such names in curly braces (${}). If I understand your issue correctly, this should address most of your problems.
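For example, a minimal sketch against the <Exec> block above (field names taken straight from your w3c_parser definition, wrapped in ${...}):

    # Field names containing dashes must be referenced with the ${} syntax:
    $request = '"' + ${cs-method} + ' ' + ${cs-uri-stem} + ' ' + ${cs-version} + '"';
    $raw_event = ${c-ip};

Names containing other special characters, such as cs(User-Agent), presumably need the same treatment.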
My overall goal is to determine whether or not the file control.lnk exists on the Windows administrator's desktop. The challenge (as I currently see it) is to get the complete path/file spec into win_stat as a valid path argument. Originally, I used variations of $env:USERPROFILE with \Desktop\control.lnk appended, but got "System.String is not path" errors (a dialog to that effect). Lastly, I tried to derive this path into an Ansible variable via a PowerShell task and then use that Ansible variable ...
- name: Get Profile path
  # This will get the logon user path; different than the ansible_environment path
  ansible.windows.win_powershell:
    script:
      $env:USERPROFILE
  register: profilePath

- name: Debug Play essentials
  debug:
    msg:
      - "UserProfile path: {{ profilePath.output}}"
... and this produces the path as I expect. However (in all cases), I still get errors of the form:
msg: argument for path is of type System.String and we were unable to convert to path: string value contains invalid path characters, cannot convert to path" }
Here's the win_stat task I've been trying, along with the path failures I've tried as comments ...
- name: Are Shortcuts installed
  ansible.windows.win_stat:
    # This is a typical attempt without the variable ...
    # path: C:\Users\Administrator.PAxAnsible\Desktop\Control.lnk
    # Using $env:USERPROFILE directly doesn't work; it is resolved as type System.String and not of type path | cannot convert ...
    # $env:USERPROFILE\Desktop\Control.lnk
    # '$env:USERPROFILE\Desktop\Control.lnk'
    # "$env:USERPROFILE\Desktop\Control.lnk" -invalid syntax; cannot commit
    # "$env:USERPROFILE\\Desktop\\Control.lnk"
    # '$env:USERPROFILE\\Desktop\\Control.lnk'
    # '"$env:USERPROFILE\\Desktop\\Control.lnk"'
    # $env:USERPROFILE\\Desktop\\Control.lnk
    # "$env:USERPROFILE/Desktop/Control.lnk"
    # $env:USERPROFILE/Desktop/Control.lnk
    # '"$env:USERPROFILE"/Desktop/Control.lnk' -string contains invalid path characters, cannot convert to path"
    # '$env:USERPROFILE + /Desktop/Control.lnk'
    # $env:USERPROFILE + '/Desktop/Control.lnk'
    # Template error ...
    # '"{{ $env:USERPROFILE}}"/Desktop/Control.lnk' -unexpected char '$' at 4.
    #
    # System.String issues while using a derived path (>> is how it's translated) ...
    # '{{ profilePath.output}}/Desktop/Control.lnk' >> "['C:\\\\Users\\\\Administrator']/Desktop/Control.lnk"
    # '{{ profilePath.output}}\\Desktop\\Control.lnk' >> "['C:\\\\Users\\\\Administrator']\\\\Desktop\\\\Control.lnk"
    # "{{ profilePath.output}}\Desktop\Control.lnk" >> syntax error; cannot commit
    # '"{{ profilePath.output}}\Desktop\Control.lnk"' >> "\"['C:\\\\Users\\\\Administrator']\\Desktop\\Control.lnk\""
    # "{{ profilePath.output}}\\Desktop\\Control.lnk" >> "['C:\\\\Users\\\\Administrator']\\Desktop\\Control.lnk"
    path: '"{{ profilePath.output}}"\\Desktop\\Control.lnk'
  register: Shortcuts
  when: AppServShortcuts != "none"
I see that the brackets bordering the output are not being removed when it is combined with the static text in the path argument. I suspect this will have to be handled before the output variable can be used as a component of the path argument. As for using $env:USERPROFILE directly as a path component, I haven't figured out a syntax that works.
I'm hoping someone out there can advise the correct syntax to make this win_stat query work ;) ...
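A hedged sketch of one possible fix (assumption: win_powershell returns output as a list, so the surrounding brackets come from rendering the whole list; indexing its first element should drop them):

- name: Are Shortcuts installed
  ansible.windows.win_stat:
    # profilePath.output is a list of output lines; take the first
    # element so the [...] brackets don't end up in the rendered path
    path: '{{ profilePath.output[0] }}\Desktop\Control.lnk'
  register: Shortcuts
  when: AppServShortcuts != "none"

It may also be worth trying path: '%USERPROFILE%\Desktop\Control.lnk', since Windows modules generally expand DOS-style environment variables in path arguments (another assumption to verify).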
I am trying to filter/mask logs by using Fluent Bit to change logs using Lua. The filter contains a simple substitution of digits to # using gsub. I am sending logs in JSON format, but Lua is not able to access the key value.
test.lua
function modifyRecord(tag, timestamp, record)
    local new_record = record
    old_message = record["message"]
    digits = "[%d]+"
    new_message = old_message:gsub(digits, "#")
    new_record["message"] = new_message
    return 1, timestamp, new_record
end
Log file – test.log
echo '{"Test": "Logs","message":"This is a message 123432"}' >> test.log
fluent-bit.conf
[SERVICE]
    # This is the main configuration block for fluent bit.
    # Ensure the following line exists somewhere in the SERVICE block
    Plugins_File plugins.conf

[INPUT]
    Name tail
    #Path /export/home/dummy/newrelic-infra/var/log/newrelic-infra/newrelic-infra.log
    Path /export/home/dummy/nrfb/test.log

[OUTPUT]
    Name newrelic
    Match *
    licenseKey <key>
    proxy <proxy>
    endpoint <endpoint>

[FILTER]
    Name modify
    Match *
    Add hostname <hostname>
    Add service_name newrelic-infra-fluent-bit

[FILTER]
    Name lua
    Match *
    script /export/home/dummy/nrfb/test.lua
    call modifyRecord
Fluent-bit logs
-bash-4.2$ ./fluent-bit -c ./fluent-bit.conf
Fluent Bit v1.6.3
* Copyright (C) 2019-2020 The Fluent Bit Authors
* Copyright (C) 2015-2018 Treasure Data
* Fluent Bit is a CNCF sub-project under the umbrella of Fluentd
* https://fluentbit.io
[2022/04/15 11:09:20] [ info] [engine] started (pid=11454)
[2022/04/15 11:09:20] [ info] [storage] version=1.0.6, initializing...
[2022/04/15 11:09:20] [ info] [storage] in-memory
[2022/04/15 11:09:20] [ info] [storage] normal synchronization mode, checksum disabled, max_chunks_up=128
[2022/04/15 11:09:20] [ info] [sp] stream processor started
[2022/04/15 11:09:20] [ info] [input:tail:tail.0] inotify_fs_add(): inode=194 watch_fd=1 name=/export/home/dummy/nrfb/test.log
[2022/04/15 11:09:49] [error] [filter:lua:lua.1] error code 2: /export/home/dummy/nrfb/test.lua:5: attempt to index global 'old_message' (a nil value)
I had a mistake in the config file:
[INPUT]
    Name tail
    #Path /export/home/dummy/newrelic-infra/var/log/newrelic-infra/newrelic-infra.log
    Path /export/home/dummy/nrfb/test.log
It should have had a parser set to parse JSON. Changing the input and adding a parser did the trick:
[INPUT]
    Name tail
    #Path /export/home/dummy/newrelic-infra/var/log/newrelic-infra/newrelic-infra.log
    Path /export/home/dummy/nrfb/test.log
    Parser docker

[PARSER]
    Name docker
    Format json
    Time_Key time
    Time_Format %Y-%m-%dT%H:%M:%S %z
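As a defensive extra (a sketch beyond the original fix): guarding the Lua filter against records that lack a "message" key avoids the nil-index error even if an unparsed line slips through. Returning 0 tells Fluent Bit to keep the record unmodified:

function modifyRecord(tag, timestamp, record)
    -- Skip records without a "message" key instead of indexing nil
    if record["message"] == nil then
        return 0, timestamp, record
    end
    record["message"] = record["message"]:gsub("[%d]+", "#")
    return 1, timestamp, record
end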
I run cmd.exe to move a file with Administrator rights:
ThisParams := '/K move ' + '"' + ThisSourceFile + '"' + ' ' + '"' + ATargetFile + '"';
Winapi.ShellAPI.ShellExecute(0, 'runas', 'cmd.exe', PChar(ThisParams), '', Winapi.Windows.SW_HIDE);
However, after execution the cmd.exe process (although invisible) remains active in memory and stays visible in Task Manager.
How can cmd.exe, in this case, be closed automatically after execution?
As documented, /K makes the command interpreter continue running after executing the passed command. You should instead use
/C Carries out the command specified by String and then stops.
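Applied to the code above, only the switch changes:

ThisParams := '/C move ' + '"' + ThisSourceFile + '"' + ' ' + '"' + ATargetFile + '"';
Winapi.ShellAPI.ShellExecute(0, 'runas', 'cmd.exe', PChar(ThisParams), '', Winapi.Windows.SW_HIDE);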
I have a script here that allows me to compress all the .mp4 files in a folder.
The output file is:
original_name.mp4.webm
I would like the output file to be original_name.webm.
How do I get rid of the .mp4 part?
I think I have to use something like .gsub(/ /, '\ ').
Please suggest.
I found it, sorry for the buzz. Here is the final code that removes ".mp4" and renames the output to ".webm":
Dir.glob("*.mp4") do |my_text_file|
puts ' --> converting: ' + my_text_file
puts "ffmpeg -i #{my_text_file.gsub(/ /, '\ ')} -b:v 640k #{my_text_file.gsub(/.mp4/, '')}.webm"
`ffmpeg -i #{my_text_file.gsub(/ /, '\ ')} -b:v 640k #{my_text_file.gsub(/.mp4/, '')}.webm`
end
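An equivalent spelling (a sketch; same behavior assumed) that sidesteps shell escaping entirely by using File.basename and passing ffmpeg its arguments as a list:

Dir.glob("*.mp4") do |f|
  out = File.basename(f, ".mp4") + ".webm"  # strip .mp4, append .webm
  puts " --> converting: #{f}"
  # system with an argument list needs no escaping of spaces in filenames
  system("ffmpeg", "-i", f, "-b:v", "640k", out)
end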
I'm using an online WLST script to configure the WebLogic server during the Docker image build. Basically, the Docker image build starts up WebLogic and executes the following script:
import os
import time
import getopt
import sys
import re
# Deployment Information
domainname = os.environ.get('DOMAIN_NAME', 'base_domain')
domainhome = os.environ.get('DOMAIN_HOME', '/u01/oracle/user_projects/domains/' + domainname)
cluster_name = os.environ.get("CLUSTER_NAME", "DockerCluster")
admin_name = os.environ.get("ADMIN_NAME", "AdminServer")
connect(username,password,server_url)
edit()
print ""
print "================== DataSource ==================="
startEdit()
# Create Datasource
# ==================
cd('/')
cmo.createJDBCSystemResource(dsname)
cd('/JDBCSystemResources/' + dsname + '/JDBCResource/' + dsname)
cmo.setName(dsname)
cd('/JDBCSystemResources/' + dsname + '/JDBCResource/' + dsname)
cd('JDBCDataSourceParams/' + dsname)
set('JNDINames', jarray.array([String(dsjndiname)], String))
cd('/JDBCSystemResources/' + dsname + '/JDBCResource/' + dsname)
cd('JDBCDriverParams/' + dsname)
cmo.setDriverName(dsdriver)
cmo.setUrl(dsurl)
set('PasswordEncrypted', encrypt(dspassword))
print 'create JDBCDriverParams Properties'
cd('Properties/' + dsname)
cmo.createProperty('user')
cd('Properties/user')
cmo.setValue(dsusername)
print 'create JDBCConnectionPoolParams'
cd('/JDBCSystemResources/' + dsname + '/JDBCResource/' + dsname)
cd('JDBCConnectionPoolParams/' + dsname)
set('TestTableName','SQL SELECT 1 FROM DUAL')
# Assign
# ======
#assign('JDBCSystemResource', dsname, 'Target', admin_name)
#assign('JDBCSystemResource', dsname, 'Target', cluster_name)
cd('/SystemResources/' + dsname)
set('Targets',jarray.array([ObjectName('com.bea:Name=' + targetname + ',Type=' + targettype)], ObjectName))
# Update Domain, Close It, Exit
# ==========================
#save()
activate()
print ""
#disconnect()
exit()
The problem is that the database host doesn't exist at build time, since it is the container name of another Docker container in the docker-compose environment. With this script, setting the target on the data source throws an exception because the host name can't be resolved, so the activate call fails, as do all the following WLST scripts that depend on the data source. Yet I don't want to set the target manually after the whole environment is up and running. How do I avoid the exception in this case?
Set the initial and the minimum capacity of the datasource to 0; this allows activation without a connection test and should avoid your error.
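A minimal sketch of where this fits in the script above (assuming the standard JDBCConnectionPoolParams attribute names; set them before activate()):

# Inside the existing JDBCConnectionPoolParams section:
cd('/JDBCSystemResources/' + dsname + '/JDBCResource/' + dsname)
cd('JDBCConnectionPoolParams/' + dsname)
set('InitialCapacity', 0)  # no connections are opened on activation
set('MinCapacity', 0)      # the pool may stay empty until first use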