Logstash twitter input unauthorized error - twitter

Does anyone have experience with the error below? Please help.
Logstash startup completed
exception=>Twitter::Error::Unauthorized, :backtrace=>["C:/logstash-1.5.1 ...
I'm using the twitter config below:
input {
  twitter {
    consumer_key => ""
    consumer_secret => ""
    oauth_token => ""
    oauth_token_secret => ""
    keywords => [""]
    full_tweet => true
  }
}
output {
  stdout { codec => dots }
  elasticsearch {
    host => "localhost:9200"
  }
}
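Twitter::Error::Unauthorized is raised by the twitter client inside the input plugin when the stream connection is rejected, which usually means the four OAuth values are empty, mistyped, or revoked; a badly skewed system clock can also invalidate the OAuth signature. A minimal sketch of the input with the credentials filled in (the placeholder strings are assumptions to be replaced with the consumer key/secret and access token/secret generated for your Twitter app):

input {
  twitter {
    # all four values must be non-empty credentials from your Twitter app
    consumer_key => "YOUR_CONSUMER_KEY"
    consumer_secret => "YOUR_CONSUMER_SECRET"
    oauth_token => "YOUR_ACCESS_TOKEN"
    oauth_token_secret => "YOUR_ACCESS_TOKEN_SECRET"
    # a non-empty keyword to track
    keywords => ["elasticsearch"]
    full_tweet => true
  }
}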

Related

Logstash container stopped because of an error creating action from filter

Hello, I'm new to Elasticsearch.
I'm working with log files coming from Filebeat and Logstash, and I'm trying to add a field "response_time" and then assign the difference between two timestamps to it.
So I created a Logstash filter and added it to the Logstash configuration file, but when I restarted the container I got the error below.
This is my logstash configuration file:
input {
  beats {
    port => 5044
  }
}
filter {
  json {
    source => "message"
  }
  ruby {
    code => "event.set('indexDay', event.get('[@timestamp]').time.localtime('+01:00').strftime('%Y%m%d'))"
  }
  aggregate {
    add_field => {
      "response_time" => "timestamp2-timestamp1"
    }
  }
  grok {
    match => ["message","%{LOGLEVEL:loglevel},%{DATESTAMP_RFC2822:timestamp},%{NOTSPACE:event_type},%{NUMBER:capture_res_id},%{NUMBER:capture_pid},%{NUMBER:mti},%{NUMBER:node_id}
      ,%{UUID:msg_uuid},%{NOTSPACE:module},%{NUMBER :respCode}"]
  }
  if [event_type] == "request_inc" {
    aggregate {
      msg_uuid => "%{UUID}"
      timestamp1 => event.get('DATESTAMP_RFC2822')
      code => "map['response_time'] = 0"
      map_action => "create"
    }
  }
  if [event_type] == "response_outg" {
    aggregate {
      msg_uuid => "%{UUID}"
      event_type => event.set('event_type')
      timestamp2 => "%{DATESTAMP_RFC2822}"
      code => "map['response_time']"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    template => "/usr/share/logstash/templates/testblogstash.template.json"
    template_name => "testblogstash"
    template_overwrite => true
    index => "testblogstash-%{indexDay}"
    codec => json
  }
  stdout {
    codec => rubydebug
  }
}
And this is an example of my log file:
{"log_level":"INFO","timestamp":"2021-12-15T16:06:24.400087Z","event_type":"s_tart","ca_id":"11","c_pid":"114","mti":"00","node_id":"00","msg_uuid":"1234","module":"cmde"}
{"log_level":"INFO","timestamp":"2021-12-15T16:06:31.993057Z","event_type":"e_nd","mti":"00","node_id":"00","msg_uuid":"1234","module":"PWC-cmde","respCode":"1"}
This is the error from docker logs:
[2022-06-01T14:43:24,529][ERROR][logstash.agent ] Failed to execute
action {:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
[A-Za-z0-9_-], [ \t\r\n], "#", "{", [A-Za-z0-9_], "}" at line
25, column 24 (byte 689) after filter {\r\n json {\r\n source =>
"message"\r\n }\r\n ruby {\r\n code => "event.set('indexDay',
event.get('[@timestamp]').time.localtime('+01:00').strftime('%Y%m%d'))"\r\n
}\r\n aggregate {\r\n add_field => {\r\n "response_time" =>
"timestamp2-timestamp1"\r\n\t\t }\r\n\t\t}\r\n grok {\r\n match =>
["message","%{LOGLEVEL:loglevel},%{DATESTAMP_RFC2822:timestamp},%{NOTSPACE:event_type},%{NUMBER:capture_res_id},%{NUMBER:capture_pid},%{NUMBER:mti},%{NUMBER:node_id}\r\n\t,%{UUID:msg_uuid},%{NOTSPACE:module},%{NUMBER
:respCode}"]}\r\n if [event_type] == "request_inc" {\r\n aggregate
{\r\n\t msg_uuid => "%{UUID}"\r\n\t timestamp1 => event",
:backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in
compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in initialize'",
"org/logstash/execution/JavaBasePipelineExt.java:72:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in
execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:383:in block
in converge_state'"]}
...
[2022-06-01T14:43:29,460][INFO ][logstash.runner ] Logstash shut down.
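The parser error points at the line timestamp1 => event.get('DATESTAMP_RFC2822'): the aggregate filter only accepts its documented options (task_id, code, map_action, end_of_task, timeout, and a few others), so settings like msg_uuid, timestamp1, timestamp2 and event_type are rejected, and a bare Ruby expression is not a valid option value in a Logstash config anyway. A hedged sketch of how the two aggregate blocks are commonly written for this request/response pattern, assuming msg_uuid is the field that correlates a request with its response and that the parsed time lives in the timestamp field (field names are taken from the posted grok pattern; the arithmetic is only illustrative):

filter {
  if [event_type] == "request_inc" {
    aggregate {
      task_id => "%{msg_uuid}"                               # correlate request and response by msg_uuid
      code => "map['request_ts'] = event.get('timestamp')"   # remember the request timestamp in the shared map
      map_action => "create"
    }
  }
  if [event_type] == "response_outg" {
    aggregate {
      task_id => "%{msg_uuid}"
      # difference in seconds between response and request timestamps
      code => "require 'time'; event.set('response_time', Time.parse(event.get('timestamp')) - Time.parse(map['request_ts']))"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}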

Logstash shuts down after start

I'm new to the ELK Stack and I can't figure out why Logstash keeps shutting down after I execute it. I'm trying to gather information from Twitter.
input {
  twitter {
    consumer_key => "XXX"
    consumer_secret => "XXX"
    oauth_token => "XXX"
    oauth_token_secret => "XXX"
    keywords => ["portugal", "game", "movie"]
    ignore_retweets => true
    full_tweet => true
  }
}
filter {}
output {
  stdout {
    codec => dots
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "twitterind"
  }
}
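Logstash exits like this when one of the plugins hits a fatal error at startup, most often rejected Twitter credentials (as in the question above) or an Elasticsearch it cannot reach on localhost:9200. One way to narrow it down is to temporarily strip the pipeline down to the twitter input and a rubydebug stdout output, so the console shows either incoming tweets or the failing plugin's error message; a minimal sketch of that cut-down pipeline (re-add the elasticsearch output once tweets appear):

input {
  twitter {
    consumer_key => "XXX"
    consumer_secret => "XXX"
    oauth_token => "XXX"
    oauth_token_secret => "XXX"
    keywords => ["portugal", "game", "movie"]
  }
}
output {
  # print whole events instead of dots so you can see tweets (or nothing) arriving
  stdout { codec => rubydebug }
}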

Stopping pipeline in logstash

A line of error output is printed when I try to run Logstash in Docker.
Here is the error:
[LogStash::Runner] WARN logstash.agent - stopping pipeline {:id=>"main"}
Below is my configuration when running Logstash in Docker:
input {
  jdbc {
    jdbc_driver_library => "/config-dir/mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://172.17.0.5:3306/data1"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * from COMPANY"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => "172.17.0.2:9200"
    "index" => "test-migrate"
    "document_type" => "data"
  }
}
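That WARN by itself is usually not a failure: without a schedule option, the jdbc input runs the SELECT once, emits the rows, and the pipeline stops because there is nothing left to do, which is exactly the "stopping pipeline" message. If the intent is to keep polling MySQL, the jdbc input's cron-style schedule option keeps it running; a minimal sketch, assuming a one-minute poll is acceptable:

input {
  jdbc {
    jdbc_driver_library => "/config-dir/mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://172.17.0.5:3306/data1"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * from COMPANY"
    # cron-style schedule: re-run the query every minute instead of exiting after one run
    schedule => "* * * * *"
  }
}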

Not able to add a proper rule for a Firebase custom claim

I have created an auth token using the Firebase documentation for Ruby, but when I try to access the custom claim in the database security rules it does not work.
My JWT creation code looks something like this:
$service_account_email = "service-account@my-project-abc123.iam.gserviceaccount.com"
$private_key = OpenSSL::PKey::RSA.new "-----BEGIN PRIVATE KEY-----\n..."
now_seconds = Time.now.to_i
payload = {:iss => $service_account_email,
           :sub => $service_account_email,
           :aud => "https://identitytoolkit.googleapis.com/google.identity.identitytoolkit.v1.IdentityToolkit",
           :iat => now_seconds,
           :exp => now_seconds + (60 * 60), # Maximum expiration time is one hour
           :uid => uid,
           :claims => {:modify_vessels => true}}
jwt_token = JWT.encode payload, $private_key, "RS256"
render json: { :status => "ok", :email => usr, :jwt_token => jwt_token, :uid => uid } and return
I have tried the rules below for accessing the claims object, but none of them work.
{
  "rules": {
    "vessels": {
      "$uid": {
        ".read": "auth.token.modify_vessels === true",
        ".write": "auth.token.modify_vessels === true"
      }
    }
  }
}
and
{
  "rules": {
    "vessels": {
      "$uid": {
        ".read": "auth.claims.modify_vessels === true",
        ".write": "auth.claims.modify_vessels === true"
      }
    }
  }
}
Please note: I can use auth.uid and it works.
Any help on this would be appreciated.

Logstash twitter config - search only people I follow?

Does anyone know how I can modify this Logstash config so that it only watches the feeds of the accounts my Twitter account follows?
input {
  twitter {
    # add your data
    consumer_key => ""
    consumer_secret => ""
    oauth_token => ""
    oauth_token_secret => ""
    full_tweet => true
    keywords => ["pizza"]
  }
}
output {
  elasticsearch_http {
    host => "localhost"
    index => "twitter"
    index_type => "tweet"
  }
}
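The logstash-input-twitter plugin has a follows option that streams tweets from an explicit list of numeric Twitter user IDs instead of (or alongside) keywords; it does not read your follow list for you, so the IDs of the accounts you follow have to be looked up and listed by hand. A minimal sketch, assuming the plugin version in use supports follows and with made-up IDs standing in for real ones:

input {
  twitter {
    consumer_key => ""
    consumer_secret => ""
    oauth_token => ""
    oauth_token_secret => ""
    full_tweet => true
    # numeric user IDs of the accounts whose tweets you want to stream (example values)
    follows => ["12345", "67890"]
  }
}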
