JSON Export of Packets Dissected by Custom Lua Dissector - lua

I have written a custom Wireshark dissector in Lua that successfully dissects packets as expected when it is installed.
When I attempt to 'Export Packet Dissections' > 'As JSON', however, all of the fields handled by my custom dissector are exported as follows:
"_ws.lua.text": ""
Here is a broader snippet:
"_ws.lua.fake": "",
"my_protocol": {
"_ws.lua.text": {
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": ""
},
"_ws.lua.text": {
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": "",
"_ws.lua.text": ""
},
"_ws.lua.text": ""
What do I need to do to get the fields handled by my custom dissector to be exported properly?

I had the same problem. I adjusted my dissectors and thus got rid of the "_ws.lua.text" entries, but not the "_ws.lua.fake" ones. My dissector used to look like this:
my_proto=Proto("my_proto", "My custom Protocol", "My custom Protocol")
*something something*
local my_proto_packet = tree:add(my_proto, buffer(),"My custom protocol");
value = buffer(curPos,4):uint();
local valueNode = my_proto_packet:add_le(buffer(curPos, 4), "value = " .. value)
The "value = 3.8"-String was displayed in Wireshark.
I added a ProtoField-variable and added it into the proto.fields-array. And then changed the valueNode definition so it now looks like this:
my_proto=Proto("my_proto", "My custom Protocol", "My custom Protocol")
local field_myproto_intfield = ProtoField.uint32("myproto.intfield", "Integer", base.DEC)
my_proto.fields = { field_myproto_intfield }
local my_proto_packet = tree:add(my_proto, buffer(),"My custom protocol");
*something something*
local valueNode = my_proto_packet:add(field_myproto_intfield, buffer(curPos, 4))
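For completeness, here is a minimal self-contained sketch of that pattern (not the original script; the field name, protocol name, and UDP port are made up for illustration). The key point is that every value is declared as a ProtoField, registered in my_proto.fields, and added to the tree through that field, so the JSON export gets real field names instead of "_ws.lua.text":
local my_proto = Proto("my_proto", "My custom Protocol")

-- declare and register every value that should survive the export as a ProtoField
local field_myproto_intfield = ProtoField.uint32("myproto.intfield", "Integer", base.DEC)
my_proto.fields = { field_myproto_intfield }

function my_proto.dissector(buffer, pinfo, tree)
    pinfo.cols.protocol = "MY_PROTO"
    local my_proto_packet = tree:add(my_proto, buffer(), "My custom protocol")
    -- exported as "myproto.intfield": "<value>" instead of "_ws.lua.text": ""
    my_proto_packet:add(field_myproto_intfield, buffer(0, 4))
end

-- hypothetical registration; hook the dissector in however the real script does
DissectorTable.get("udp.port"):add(9999, my_proto)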

Related

query string in requests method

I am trying to do a GET request through Airflow, and I get a 400.
My params dict:
[
  {
    "api_name": "",
    "api_conn_id": "",
    "endpoint": "",
    "request_parameters": {
      "q": "analytics",
      "pivot": "MEMBER_COUNTRY_V2",
      "dateRange.start.day": "15",
      "timeGranularity": "DAILY",
      "fields": "clicks,impressions,pivotValue,dateRange,totalEngagements,costInLocalCurrency,likes,shares,comments,landingPageClicks,companyPageClicks,oneClickLeads,follows,otherEngagements",
      "accounts[0]": ""
    },
    "script": ""
  }
]
My Airflow DAG:
for source in sources:
    task_api_call = HttpSensor(
        task_id='task_api_call',
        http_conn_id=source['api_conn_id'],
        endpoint=source['endpoint'],
        request_params=source['request_parameters'],
        dag=dag,
    )
My query string includes %2C when the GET request is made, and I am not able to keep the commas from being encoded. I guess that's the reason for the 400.
The URL being passed is:
https://api/v2/?q=analytics&pivot=endpoint&dateRange.start.day=15&dateRange.start.month=07&dateRange.start.year=2022&dateRange.end.day=15&dateRange.end.month=07&dateRange.end.year=2022&timeGranularity=DAILY&fields=clicks%2Cimpressions%2CpivotValue%2CdateRange%2CtotalEngagements%2CcostInLocalCurrency%2Clikes%2Cshares%2Ccomments%2ClandingPageClicks%2CcompanyPageClicks%2ConeClickLeads%2Cfollows%2CotherEngagements&accounts%5B0%5D=account
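Normally an encoded comma (%2C) is equivalent to a literal comma, but if the API really does reject it, one workaround sketch (not a confirmed fix) is to build the query string yourself with commas left unescaped and append it to the endpoint instead of passing request_params. The variable names follow the dict above; the HttpSensor import path depends on your Airflow version.
from urllib.parse import urlencode

# Airflow 1.10.x import path; on Airflow 2.x use airflow.providers.http.sensors.http
from airflow.sensors.http_sensor import HttpSensor

for source in sources:
    # urlencode still escapes '[' and ']', but safe=',' keeps commas literal
    query = urlencode(source['request_parameters'], safe=',')
    task_api_call = HttpSensor(
        task_id='task_api_call',
        http_conn_id=source['api_conn_id'],
        endpoint=source['endpoint'] + '?' + query,
        dag=dag,
    )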

How to update JSON payload when values are embedded in backslash in Groovy

In Groovy I have to update values in a JSON payload and make an API call. I am running into challenges while updating the payload because the fields are embedded in backslash-escaped strings. Is there a simpler way to directly update the servers in the payload below, i.e. update 1. JSON payload to 2. Updated JSON payload (updating the name and host values)?
1. JSON payload:
{
"environment": "dev",
"config": "Create",
"configType": "Server",
"ServerName": "",
"Servers": "[\\{\"name\":\"Server-test_1\",\"host\":\"test.com\",\"port\":\"443\",\"tls\":\"2-way\"}]",
"tsHost": "",
"tsPort": "",
"tsSSLOption": "1-way"
}
2. Updated JSON payload:
{
"environment": "dev",
"config": "Update",
"configType": "Server",
"ServerName": "",
"Servers": "[\\{\"name\":\"Server-test_2\",\"host\":\"test123.com\",\"port\":\"443\",\"tls\":\"2-way\"}]",
"tsHost": "",
"tsPort": "",
"tsSSLOption": "1-way"
}
Tried below (losing backslash in conversion process):
Code:
def json = $/ {
"environment": "dev",
"config": "Create",
"configType": "Server",
"ServerName": "",
"Servers": "[\\{\"name\":\"Server-test_1\",\"host\":\"test.com\",\"port\":\"443\",\"tls\":\"2-way\"}]",
"tsHost": "",
"tsPort": "",
"tsSSLOption": "1-way"
}
/$
def parser = new JsonSlurper()
def jsonResp = parser.parseText(json)
println(jsonResp.Servers)
jsonResp.Servers.name = "Server-test_2"
jsonResp.Servers.host = "test123.com"
Servers is a string in your initial JSON - you have to parse it first:
import groovy.json.*
def json = $/ {
"environment": "dev",
"config": "Create",
"configType": "Server",
"ServerName": "",
"Servers": "[{\"name\":\"Server-test_1\",\"host\":\"test.com\",\"port\":\"443\",\"tls\":\"2-way\"}]",
"tsHost": "",
"tsPort": "",
"tsSSLOption": "1-way"
}
/$
def parser = new JsonSlurper()
def jsonResp = parser.parseText(json)
println(jsonResp.Servers)
def servers = parser.parseText(jsonResp.Servers)
servers[0].name="Server-test_2"
servers[0].host="test123.com"
jsonResp.Servers = JsonOutput.toJson(servers)
json = JsonOutput.prettyPrint(JsonOutput.toJson(jsonResp))

Elastalert Rules for slack integration (message formatting and Attachments)

I'm trying to use message formatting in Slack. The Elastalert Testrule.yaml file is only partially being parsed. The Slack alert shows up with only the slack_alert_fields and alert_text fields. I want to send attachments in the alerts as well.
How can I use attachments or create buttons for Slack alerts?
es_host: elasticsearch
es_port: 9200
name: Test rule Alert
type: any
index: alerts-*
filter:
- term:
    alertType.keyword: "New alert created"
alert:
- "slack"
slack_alert_fields:
- title: Network Name
  value: networkName
  short: true
- title: Alert Type
  value: alertType
  short: true
slack_actions:
- name: "network url"
  text: "Network URL"
  type: "button"
  value: networkUrl
alert_text: |
  alertData : {0}
alert_text_type: alert_text_only
alert_text_args: ["alertData"]
attachments: [
  {
    "fallback": "Required plain-text summary of the attachment.",
    "color": "#37964f",
    "pretext": "New alert created",
    "title": alertData.reason,
    "fields": [
      {
        "title": "Network Name",
        "value": networkName,
        "short": true
      },
      {
        "title": "Timestamp",
        "value": timestamp,
        "short": true
      }
    ],
    "actions": [
      {
        "name": "network url",
        "text": "Network URL",
        "type": "button",
        "value": networkUrl
      },
      {
        "name": "org_url",
        "text": "Organization URL",
        "type": "button",
        "value": organizationUrl
      }
    ]
  }
]
slack_webhook_url:
- "https://hooks.slack.com/xxxxxxxxxxxxxxxxxxxxxxx"
Looking at the official documentation, it appears that Elastalert does not support adding custom Slack attachments to alerts; there is no property for it in the documentation.
In fact, it seems that alerts are already formatted as an attachment, which is why you can set a title and a title URL and define additional "fields" - something you can only do with attachments in Slack.
This also means that you cannot specify buttons for your alerts (which are a special kind of attachment in Slack).
If you need this functionality, I would suggest contacting the developers of Elastalert.
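For what it's worth, a few attachment-level things can be set through documented rule options rather than a raw attachments key. A small fragment as a sketch (these option names come from the Elastalert Slack alerter documentation as I recall it; verify them against your Elastalert version, and the link value is just an example):
slack_title: "New alert created"
slack_title_link: "https://example.com/alerts"
slack_msg_color: "good"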

How to convert my custom json to swagger 2.0 json

I'm new to Swagger documentation. We have an existing project developed in the Progress language for RESTful web services. The different resource-based URLs consume and produce application/json. The input and output JSON formats for one of our resource URLs are given below:
Request:
{
  "request": {
    "companyNumber": 5000,
    "operatorInit": "sys",
    "operatorPassword": "",
    "customerNumber": 101,
    "shipTo": "",
    "warehouse": "01",
    "productCode": "2-001",
    "crossReferenceFlag": false,
    "retrieveFlag": false,
    "tInbinlocation": {
      "t-inbinlocation": [
        {
          "binloc": "",
          "icswbinloc1fl": false,
          "icswbinloc2fl": false,
          "addrecordfl": false,
          "deleterecordfl": false,
          "charuser": "",
          "user1": "",
          "user2": "",
          "user3": "",
          "user4": "",
          "user5": "",
          "user6": 0,
          "user7": 0,
          "user8": null,
          "user9": null
        }
      ]
    },
    "tInfieldvalue": {
      "t-infieldvalue": [
        {
          "level": "",
          "lineno": 0,
          "seqno": 0,
          "fieldname": "",
          "fieldvalue": ""
        }
      ]
    }
  }
}
Response:
{
  "response": {
    "cErrorMessage": "",
    "crossReferenceProduct": "2-001",
    "crossReferenceType": "",
    "tOutbinlocation": {
      "t-outbinlocation": []
    },
    "tOutfieldvalue": {
      "t-outfieldvalue": []
    }
  }
}
How can I convert the above request and response JSON formats into the Swagger 2.0 JSON format?
Thanks!
Try using api-spec-converter.
This tool supports converting API descriptions between popular formats.
Supported formats:
* swagger_1
* swagger_2
* api_blueprint
* io_docs
* google
* raml
* wadl
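Note that api-spec-converter translates between existing API description formats; it does not generate a spec from raw example JSON. For a single endpoint like this it may be just as quick to hand-write the Swagger 2.0 definition. A heavily abbreviated sketch is below; the path name, title, and trimmed property lists are illustrative only, not taken from the actual API:
{
  "swagger": "2.0",
  "info": { "title": "Warehouse API", "version": "1.0" },
  "consumes": ["application/json"],
  "produces": ["application/json"],
  "paths": {
    "/binlocation": {
      "post": {
        "parameters": [
          {
            "name": "body",
            "in": "body",
            "required": true,
            "schema": { "$ref": "#/definitions/BinLocationRequest" }
          }
        ],
        "responses": {
          "200": {
            "description": "Successful response",
            "schema": { "$ref": "#/definitions/BinLocationResponse" }
          }
        }
      }
    }
  },
  "definitions": {
    "BinLocationRequest": {
      "type": "object",
      "properties": {
        "request": {
          "type": "object",
          "properties": {
            "companyNumber": { "type": "integer" },
            "operatorInit": { "type": "string" },
            "productCode": { "type": "string" }
          }
        }
      }
    },
    "BinLocationResponse": {
      "type": "object",
      "properties": {
        "response": {
          "type": "object",
          "properties": {
            "cErrorMessage": { "type": "string" },
            "crossReferenceProduct": { "type": "string" }
          }
        }
      }
    }
  }
}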

PhantomJS not running javascript when setup through capybara with poltergeist

The way I'm setting up the Capybara driver seems to be causing PhantomJS to not execute JavaScript. To confirm that this can work outside of Capybara/Poltergeist, I used raw watir-webdriver with the same PhantomJS installation, and it did run JavaScript.
So here is my code that creates the capybara driver using poltergeist:
require 'capybara'
require 'capybara/poltergeist'
include Capybara::DSL
Capybara.register_driver :poltergeist do |app|
  options = {
    :js_errors => true,
    :timeout => 120,
    :debug => false,
    :phantomjs_options => ['--load-images=no', '--disk-cache=false'],
    :inspector => true,
  }
  Capybara::Poltergeist::Driver.new(app, options)
end
Capybara.default_driver = :poltergeist
Capybara.javascript_driver = :poltergeist
Capybara.app_host = 'https://google.com'
visit('/')
fill_in('q', with: 'green cheese')
sleep 2
p all('span').map{|s| s.text}
When I run that I get output like:
["", "", "+You", "", "Search", "", "Images", "", "Maps", "", "Play", "", "YouTube", "", "News", "", "Gmail", "", "More", "More", "", "", "", "Sign in", "Sign in", "", "", "", "", "× A faster way to browse the web Install Google Chrome", "", "", "", "", "Advertising ProgramsBusiness Solutions+GoogleAbout Google © 2013 - Privacy & Terms"]
This makes it clear that the search suggestions are not showing up, which means JavaScript is not running. I've confirmed this using screenshots.
Now, when I run the following code that uses only watir-webdriver, it works:
require 'watir-webdriver'
b = Watir::Browser.new :phantomjs
b.goto 'https://google.com'
b.text_field(:name, 'q').set "green cheese"
sleep 2
p b.spans.map{|s| s.text }
b.close
and that gives the following output:
["", "", "+You", "", "Search", "", "Images", "", "Maps", "", "Play", "", "YouTube", "", "News", "", "Gmail", "", "More", "More", "", "", "", "Sign in", "Sign in", "", "", "", "", "×\nA faster way to browse the web\nInstall Google Chrome", "", "", "", "", "Advertising ProgramsBusiness Solutions+GoogleAbout Google\n© 2013 - Privacy & Terms", "green cheese", "green cheese strain", "green cheese weed", "green cheese enchiladas", "green cheesecake", "green cheese moon", "green cheese enchiladas recipe", "green cheese strain review", "green cheese media group", "green cheese penny", "", "", "", ""]
All that green cheese in there makes it clear that search suggestions are showing up, which means javascript must be running. Again, I have confirmed this using screenshots.
Can anyone help me understand how to configure Capybara/poltergeist so it works?
One final note: this has nothing to do with Rails. This is a standalone test suite, and you can find the full code for it here: https://github.com/rschultheis/rspec_capybara_starter
I was able to solve this issue by upgrading my PhantomJS from 1.9.8 to 2.1.1. For instructions on doing this, see http://phantomjs.org/download.html.
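If, after installing 2.1.1, Capybara still picks up the old binary from the PATH, Poltergeist can be pointed at the new executable explicitly via its :phantomjs option. A small sketch; the path below is only an example, adjust it to wherever phantomjs 2.1.1 actually lives:
require 'capybara'
require 'capybara/poltergeist'

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app,
    :js_errors => true,
    :phantomjs => '/usr/local/bin/phantomjs',  # example path to the 2.1.1 binary
    :phantomjs_options => ['--load-images=no', '--disk-cache=false']
  )
end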
