Filter a PCAP file using tshark: show ip source > ip destination : info in a txt file - wireshark

I need a tshark command so I can create a txt file containing ip source > ip destination : Info, in this order. I tried this command:
tshark -T fields -n -r "C:\Users\dell\Desktop\tracecomplete.pcap" -E separator=, -e ip.src -e ip.dst > "C:\Users\dell\Desktop\walima22.txt"
but I can't change the separator or show the Info column.

There are generally two solutions for printing specific column data: one using column specifiers and the other using fields, similar to what you have.
Using column specifiers:
Standard specifiers, as described by tshark.exe -G column-formats:
tshark.exe -n -r "C:\Users\dell\Desktop\tracecomplete.pcap" -o "gui.column.format:\"Source\",\"%s\",\"Destination\",\"%d\",\"Info\",\"%i\""
... or using custom columns for those fields that are supported:
tshark.exe -n -r "C:\Users\dell\Desktop\tracecomplete.pcap" -o "gui.column.format:\"Source\",\"%Cus:ip.src\",\"Destination\",\"%Cus:ip.dst\",\"Info\",\"%i\""
Using Fields:
tshark.exe -n -r "C:\Users\dell\Desktop\tracecomplete.pcap" -T fields -E separator=, -e ip.src -e ip.dst -e _ws.col.Info
But I can't change the separator.
You should be able to change it using the -E option. Refer to the tshark man page for more help with this option.
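Note that -E separator only accepts a single separator for all fields, so to get the mixed src>dst:Info layout from the question, one option is to post-process tshark's field output. A minimal sketch, where a printf with made-up addresses stands in for the real tshark pipe:

```shell
# The printf below stands in for:
#   tshark -n -r tracecomplete.pcap -T fields -E separator=/t \
#     -e ip.src -e ip.dst -e _ws.col.Info
# (the addresses and Info text are invented for illustration)
printf '192.168.1.5\t10.0.0.9\tEcho (ping) request\n' |
awk -F'\t' '{ printf "%s>%s:%s\n", $1, $2, $3 }'
# -> 192.168.1.5>10.0.0.9:Echo (ping) request
```

In practice you would replace the printf with the actual tshark command and redirect the awk output to the txt file.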

Related

How To Extract The Name of the Level 7 HTTP2 Application in Tshark

So by default, when you open a pcap in Wireshark, it shows the standard columns (Time, Source, Destination, Protocol, Info, and so on).
But I want to view the names of the websites, so I have to find the host (HTTP) or :authority (HTTP2) header and apply it as a column, so that the website name appears in my display. Once that is done, I can export the pcap as a CSV with the website names included.
My question is: how do I do this in tshark, especially for HTTP2? There's lots of information on how to do this for HTTP.
From the tshark man page:
-T ek|fields|json|jsonraw|pdml|ps|psml|tabs|text
...
fields The values of fields specified with the -e option, in a form
specified by the -E option. For example,
tshark -T fields -E separator=, -E quote=d
So in your case, you might use something like:
tshark -r Wednesday.pcap -Y http2 -T fields -E separator=, -E quote=d -e frame.number -e frame.time_relative -e ip.src -e ip.dst -e _ws.col.Protocol -e frame.len -e http2.headers.authority -e _ws.col.Info > Wednesday.csv

How to make a Portia spider run?

I cannot get my spiders to run.
I use one of the following commands:
docker run -i -t --rm -v /home/raphael/Documents/entreprise/portia/portia-master/test:/app/data/projects:rw -v /home/raphael/Documents/entreprise/portia/res:/mnt:rw -p 9001:9001 scrapinghub/portia \
portiacrawl /app/data/projects/Oscaro www.oscaro.com -o /mnt/Oscaro.jl
docker run -i -t --rm -v /home/raphael/Documents/entreprise/portia/portia-master/test:/app/data/projects:rw -v /home/raphael/Documents/entreprise/portia/res:/mnt:rw -p 9001:9001 scrapinghub/portia \
portiacrawl /app/data/projects/Oscaro
The console returns:
+ action=portiacrawl
+ shift
+ '[' -z portiacrawl ']'
+ case $action in
+ exec portiacrawl /app/data/projects/Oscaro www.oscaro.com -o /mnt/Oscaro.jl
Usage: portiacrawl <project dir/project zip> [spider] [options]
Allow to easily run slybot spiders on console. If spider is not given, print a
list of available spiders inside the project
Options:
-h, --help show this help message and exit
--settings=SETTINGS Give specific settings module (must be on python path)
--logfile=LOGFILE Specify log file
-a NAME=VALUE Add spider arguments
-s NAME=VALUE Add extra scrapy settings
-o FILE, --output=FILE
dump scraped items into FILE (use - for stdout)
-t FORMAT, --output-format=FORMAT
format to use for dumping items with -o (default:
jsonlines)
-v, --verbose more verbose
However, this seems to be a good adaptation of the documentation code:
docker run -i -t --rm -v <PROJECTS_FOLDER>:/app/data/projects:rw -v <OUPUT_FOLDER>:/mnt:rw -p 9001:9001 scrapinghub/portia \
portiacrawl /app/data/projects/PROJECT_NAME SPIDER_NAME -o /mnt/SPIDER_NAME.jl
I am completely new to docker, portia and scrapy.
I have trouble identifying the source of the problem.
By the way, I did not understand the solution proposed here:
https://emu.one/scrapy/823487/how-do-i-start-running-portia-spider-how-to-do-it.html
I do not know whether that solution applies to me, since it does not seem to use Docker.
I also have a question about the first part of the command. I would like to know what the following does:
-v /home/raphael/Documents/entreprise/portia/portia-master/test:/app/data/projects
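For what it's worth, docker's -v option bind-mounts a host directory into the container, using the form HOST_DIR:CONTAINER_DIR[:MODE], so the project folder on the host appears inside the container at /app/data/projects. A minimal sketch pulling the spec from the question apart with plain shell parameter expansion (no docker needed):

```shell
# -v takes HOST_DIR:CONTAINER_DIR[:MODE]; rw means the container may write back
spec='/home/raphael/Documents/entreprise/portia/portia-master/test:/app/data/projects:rw'
host=${spec%%:*}        # part before the first ':' -> directory on the host
rest=${spec#*:}
container=${rest%%:*}   # mount point inside the container
mode=${rest##*:}        # access mode (rw/ro)
echo "host=$host container=$container mode=$mode"
```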
Thank you in advance.
It was necessary to use a ~-based path. :(
docker run -i -t --rm -v ~/Documents/entreprise/portia/portia-master/test:/app/data/projects:rw -v ~/Documents/entreprise/portia/res:/mnt:rw -p 9001:9001 scrapinghub/portia \
portiacrawl /app/data/projects/Oscaro www.oscaro.com -o /mnt/Oscaro.jl

Multiple filter in tshark

The -Y, -2 and -R filter options in tshark are confusing in Wireshark version 2.x.
In version 1.8, we were able to apply multiple filters and save the filtered packets to a CSV file using the command below:
tshark.exe -r src.pcap -T fields -e frame.number -e frame.time -e frame.len -e ip.src -e ip.dst -e udp.srcport -e udp.dstport -E header=y -E separator=, -E quote=d -E occurrence=f -R (ip.src==x.x.x.x)&&(ip.dst==y.y.y.y) > filtered.csv
But this command does not work in versions 2.x. Please help if someone has applied multiple filters in the newer Wireshark versions.
You should be able to achieve what you want by replacing -R (ip.src==x.x.x.x)&&(ip.dst==y.y.y.y) with -Y "(ip.src==x.x.x.x)&&(ip.dst==y.y.y.y)".
On Windows 7, I had this working with Wireshark 2.2.1 by adding -2 and quoting the string that follows the -R option, like this:
tshark.exe -r mypcap.pcapng -T fields -2 -e frame.number -e frame.time -e frame.len -E header=y -E separator=, -E quote=d -E occurrence=f -R "(ip.src==192.168.1.20)&&(ip.dst==20.1.168.192)"
Not quoting the expression after -R results in the fields being printed while the shell evaluates the expression itself. If the expression happens to evaluate to TRUE, the filter appears to be recognized and a result is given; otherwise the filter term (e.g. ip.src) is evaluated as a command by the system, resulting in "command not recognized".
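The quoting matters because an unquoted (…)&&(…) is parsed by the shell itself (parentheses start subshells, && chains commands) before tshark ever sees it. A small sketch showing that the quoted expression reaches a program as one intact argument (count_args is a throwaway helper invented for this demo, not part of tshark):

```shell
# count_args just reports how many arguments it received and echoes them back
count_args() { echo "$# argument(s): $*"; }
count_args -R "(ip.src==192.168.1.20)&&(ip.dst==20.1.168.192)"
# -> 2 argument(s): -R (ip.src==192.168.1.20)&&(ip.dst==20.1.168.192)
```

Without the quotes, the same line would be split and partly executed by the shell instead of being handed to the program as a single filter string.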

How to add an extra column to Tshark's output (while keeping the default ones)?

I would like to add a Tshark column that tells me which type of ICMP packet has been captured, i.e. the icmp.type field.
While I still need the default columns, how can I make Tshark also show this one?
I've already seen the option to work with -T fields and -e, but then all the default columns are left out.
You can list the default columns yourself alongside the extra field, for instance:
tshark -i 1 -T fields -e frame.number -e frame.time -e eth.src -e eth.dst -e frame.protocols -e _ws.col.Protocol -e _ws.col.Info -e icmp.type -E header=y > output.csv
See tshark -h or the man-page for more information.
If you want to add something to the default summary output, you can also use:
-z proto,colinfo,filter,field
For example, something like:
-z proto,colinfo,tcp.seq,tcp.seq
will show this:
1 2018-10-10 10:39:54 192.168.0.10 -> 192.168.0.1 SSH 198 Encrypted response packet len=132 tcp.seq == 1

Grep with color and multiple excludes

I would like to grep through my code hierarchy for the term "x", but color the results and exclude annoying terms. Right now I do:
grep -Rn --color x * | grep -v -e html -e svn -e test -e doc -e y
The problem is that the matching color is lost because of the pipe. Is there any way to make this a single statement so that the coloring isn't lost?
Specify --color=always to preserve color formatting through pipes:
grep --color=always x * | grep -v -e html -e svn -e test -e doc -e y
And later on, if you happen to need to pipe the result into a file, here's a nifty sed script you can pipe your results through to remove the escape characters that produce the color:
sed -r "s/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g"
(Note that on OS X you need the -E option instead of -r.)
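As a quick self-contained check that the sed script really strips grep's color escapes, here is a runnable pipeline (the sample text is made up; the sed expression is the one from above):

```shell
# grep --color=always inserts ESC[...m / ESC[K sequences around the match;
# the sed script removes them, leaving plain text suitable for a file
printf 'foo x bar\n' |
grep --color=always x |
sed -r 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g'
# -> foo x bar
```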
You can try repeating the color search:
grep -Rn --color x * | grep -v -e html -e svn -e test -e doc -e y | grep --color x
