What is RSSI value in 802.11 packet - wireshark

I see two values of SSI Signal in an 802.11 packet when viewed in Wireshark. I would like to know which one is the correct RSSI value.
Information from wireshark:
SSI Signal: -40 dBm
SSI Noise: -100 dBm
Signal Quality: 64
Antenna: 0
SSI Signal: 60 dB
Also note that the SSI Signal (second time) is ((SSI Signal) - (SSI Noise)).
I am just confused about which one is correct. Also, the Wikipedia entry says that these implementations can be vendor-dependent, which confuses me even more.

Take my answer with a pinch of salt; this is what makes sense to me and need not be correct. If it makes sense to you, use it.
The first SSI Signal is a measurement of the Rx signal strength at/after the Rx antenna (the calculation is done at the ADC stage).
The SSI Noise is the noise at the ADC stage (probably measured noise).
The second SSI Signal is the SNR: SSI Signal - SSI Noise = -40 dBm - (-100 dBm) = 60 dB. Note that this difference is 60 dB, not dBm: subtracting two dBm values cancels the 1 mW reference, so the result is a pure ratio expressed in dB.
Neither of them is actually RSSI as per the IEEE definition. RSSI is defined to be a number between two values; it does not have a dBm unit, although a lot of popular apps now report it as a dBm value, which has led to significant confusion. Cisco uses values between 0 and 100, Atheros 0 to 127, etc. Going by that logic, the RSSI in this case would probably be the Signal Quality value, 64.
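If it helps, here is the arithmetic above as a minimal Java sketch (the -40/-100 values are simply the numbers from the capture in the question):

// Minimal sketch of the dBm arithmetic described above.
public class SnrExample {
    public static void main(String[] args) {
        double signalDbm = -40.0;             // "SSI Signal" from the radiotap header
        double noiseDbm = -100.0;             // "SSI Noise" from the radiotap header
        double snrDb = signalDbm - noiseDbm;  // dBm - dBm gives a ratio in dB
        System.out.println("SNR = " + snrDb + " dB");  // 60.0 dB, the second "SSI Signal"
    }
}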

I see two values of SSI Signal in an 802.11 packet when viewed in Wireshark.
It sounds as if the driver for the 802.11 adapter used to capture the packet is being weird and supplying both antenna signal strength in dBm and antenna signal strength in dB. What type of adapter was that, and what operating system is running on the machine that did the capture?
"dBm", as the link above indicates, is decibels relative to 1 milliwatt of power; "dB", as the other link above indicates, is decibels relative to some unspecified arbitrary reference. dBm tells you the actual signal power at the antenna; dB doesn't - you can only use dB values to compare with other dB values.
Neither of those is "RSSI" as defined by 802.11; that RSSI value is also arbitrary, but it's even more arbitrary - 802.11 doesn't even say what it measures, just that larger values correspond to stronger signals, and those values are vendor-dependent.
Also note that the SSI Signal (second time) is ((SSI Signal) - (SSI Noise)).
The writer of the driver for your adapter might not have properly read the Radiotap page about the "dB antenna signal" value (linked above), and might have thought it was supposed to be a signal-to-noise ratio, calculating it by subtracting the noise value from the signal value (decibels are a logarithmic scale, so the ratio of two powers is the difference between their decibel values). I would ignore that value and use "SSI Signal" in dBm as the indication of actual signal strength: -40 dBm = 100 nanowatts, at least as per the table in the Wikipedia article on dBm.

As of 06/10/2016, per http://www.radiotap.org/suggested-fields/RSSI,
RSSI is still only a "suggested" radiotap field, usable only with OpenBSD.
(I was trying to get the same info with an AirPcap and a Windows machine)

Related

UART transfer speed

I want to check whether my understanding is correct, but I cannot find any precise explanation or examples. Let's say I have UART communication set to 57600 bits/second and I am transferring 8-bit chars. Let's say I choose to have no parity; since I need one start bit and one stop bit, that means that transferring one char essentially requires transferring 10 bits. Does that mean that the transfer speed would be 5760 chars/second?
Your calculation is essentially correct.
But the 5760 chars/second would be the maximum transfer rate. Since it's an asynchronous link, the UART transmitter is allowed to idle the line between character frames.
IOW the baud rate only applies to the bits of the character frame.
The rate that characters are transmitted depends on whether there is data available to keep the transmitter busy/saturated.
For instance, if a microcontroller used programmed I/O (with either polling or interrupts) instead of DMA for UART transmitting, high-priority interrupts could stall the transmissions and introduce delays between frames.
Baud rate = 57600
Time for 1 bit: 1 / 57600 s ≈ 17.36 µs
Time for a frame with 10 bits: ≈ 173.6 µs
This means a maximum of 1 / 173.6 µs ≈ 5760 frames (characters) per second.
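As a quick sanity check, here is that arithmetic as a small Java sketch (the 57600 baud and 8-N-1 framing are the parameters from the question):

// Maximum character rate for an 8-N-1 UART frame at a given baud rate.
public class UartRate {
    public static void main(String[] args) {
        double baud = 57600.0;                         // bits per second on the wire
        int bitsPerFrame = 1 + 8 + 1;                  // start bit + 8 data bits + stop bit, no parity
        double bitTimeUs = 1e6 / baud;                 // ~17.36 us per bit
        double frameTimeUs = bitsPerFrame * bitTimeUs; // ~173.6 us per character frame
        double maxCharsPerSec = baud / bitsPerFrame;   // ~5760 characters per second, at best
        System.out.printf("bit: %.2f us, frame: %.1f us, max rate: %.0f chars/s%n",
                bitTimeUs, frameTimeUs, maxCharsPerSec);
    }
}

Remember that 5760 chars/s is an upper bound; any idle time between frames only lowers the effective rate.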

Frequency analysis of very short signal in GNU Octave

I have some very short signals from an oscilloscope (50k-200k samples) registered over a time span of about 2 ms. They are acoustic signals recorded from the spark of an ESD (electrostatic discharge).
I'd like to get some frequency data of that signal, in near-acoustic frequency range (up to about 30kHz) with as high time resolution as possible.
I have tried plotting a spectrogram (specgram in Octave) to view the signal, but the output is not really useful. Using specgram( x, N, fs );, where x is my signal with sampling rate fs, the plot starts at very high frequencies of about 500 MHz for low values of N. I get better frequency resolution for big N values (like 2^12-2^13), but then the window is so wide that I receive only 2 spectrum values over the whole signal length.
I understand that this may be a limitation of the Fourier transform, which is probably what the specgram function uses (actually, I don't know much about signal analysis).
Is there any other way to get frequency-versus-time information for that kind of signal? I've read something about wavelets, but when I tried using the dwt function of the signal package, I received this error:
error: 'wfilters' undefined near line 51 column 14
error: called from
dwt at line 51 column 12
Even if this worked, I am not so sure I'd know how to actually use the output of those wavelet functions ...
To get audio-frequency information from such a high sample rate, you will need to obtain a sample vector long enough to contain at least a few whole cycles at audio frequencies, e.g. many tens of milliseconds of contiguous samples, which may or may not be more than your scope can gather. To process this amount of data reasonably, you might low-pass filter the sample data so it contains just audio frequencies, and then resample it at a lower sample rate that is still above twice the filter's cut-off frequency. You will then end up with a much shorter sample vector to feed to an FFT for your audio spectrum analysis, as sketched below.
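A rough sketch of that filter-then-decimate step, written in Java for illustration (the class and method names are mine; the assumed 100 MHz capture rate is consistent with 200k samples over 2 ms, and the single-stage decimation by 1000 is purely illustrative - in practice a large decimation would be done in several stages, and in Octave the same idea is a low-pass FIR filter followed by downsampling):

// Low-pass filter a high-rate capture, then keep every M-th sample, so that an
// FFT or spectrogram at the new, much lower rate resolves the audio band.
public class DecimateSketch {

    // Windowed-sinc low-pass FIR coefficients (Hamming window), cutoff in Hz.
    static double[] lowPassTaps(int numTaps, double cutoffHz, double fsHz) {
        double fc = cutoffHz / fsHz;               // normalized cutoff (cycles per sample)
        int m = numTaps - 1;
        double[] h = new double[numTaps];
        double sum = 0;
        for (int n = 0; n < numTaps; n++) {
            double x = n - m / 2.0;
            double sinc = (x == 0) ? 2 * Math.PI * fc
                                   : Math.sin(2 * Math.PI * fc * x) / x;
            double window = 0.54 - 0.46 * Math.cos(2 * Math.PI * n / m);  // Hamming
            h[n] = sinc * window;
            sum += h[n];
        }
        for (int n = 0; n < numTaps; n++) h[n] /= sum;   // normalize to unity DC gain
        return h;
    }

    // Convolve with the filter and keep only every decim-th output sample.
    static double[] filterAndDecimate(double[] x, double[] h, int decim) {
        double[] y = new double[x.length / decim];
        for (int i = 0; i < y.length; i++) {
            double acc = 0;
            for (int k = 0; k < h.length; k++) {
                int idx = i * decim - k;
                if (idx >= 0) acc += h[k] * x[idx];
            }
            y[i] = acc;
        }
        return y;
    }

    public static void main(String[] args) {
        double fs = 100e6;                        // assumed scope sample rate (hypothetical)
        double[] capture = new double[200_000];   // placeholder for the captured samples
        double[] taps = lowPassTaps(801, 30e3, fs);                // 30 kHz cutoff, as in the question
        double[] audio = filterAndDecimate(capture, taps, 1000);   // new sample rate: 100 kHz
        System.out.println("decimated length: " + audio.length);   // 200 samples
    }
}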

Understanding ibeacon distancing

Trying to grasp a basic concept of how distancing with iBeacon (beacon / Bluetooth Low Energy / BLE) can work. Is there any true documentation on how far exactly an iBeacon can measure? Let's say I am 300 feet away... is it possible for an iBeacon to detect this?
Specifically for v4 & v5, and with iOS, but generally for any BLE device.
How do Bluetooth frequency & throughput affect this? Can beacon devices enhance or restrict the distance / improve upon the underlying BLE?
ie
|                | Range       | Freq        | T/sec       | Topo       |
|----------------|-------------|-------------|-------------|------------|
| Bluetooth v2.1 | Up to 100 m | < 2.481 GHz | < 2.1 Mbit  | scatternet |
| Bluetooth v4   | ?           | < 2.481 GHz | < 305 kbit  | mesh       |
| Bluetooth v5   | ?           | < 2.481 GHz | < 1306 kbit | mesh       |
The distance estimate provided by iOS is based on the ratio of the beacon signal strength (rssi) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in rssi at 1 meter away. Each beacon must be calibrated with this txPower value to allow accurate distance estimates.
While the distance estimates are useful, they are not perfect, and require that you control for other variables. Be sure you read up on the complexities and limitations before misusing this.
When we were building the Android iBeacon library, we had to come up with our own independent algorithm because the iOS CoreLocation source code is not available. We measured a bunch of rssi measurements at known distances, then did a best fit curve to match our data points. The algorithm we came up with is shown below as Java code.
Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.
protected static double calculateAccuracy(int txPower, double rssi) {
    if (rssi == 0) {
        return -1.0; // if we cannot determine accuracy, return -1.
    }
    double ratio = rssi * 1.0 / txPower;
    if (ratio < 1.0) {
        return Math.pow(ratio, 10);
    }
    else {
        double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
        return accuracy;
    }
}
Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV
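For a rough sense of scale: with the example numbers used further down in this thread (txPower = -59, rssi = -72), this formula returns an "accuracy" of roughly 4.3 m.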
I'm very thoroughly investigating the matter of accuracy/rssi/proximity with iBeacons, and I really, really think that all the resources on the Internet (blogs, posts on StackOverflow) get it wrong.
davidgyoung (accepted answer, > 100 upvotes) says:
Note that the term "accuracy" here is iOS speak for distance in meters.
Actually, most people say this, but I have no idea why! The documentation makes it very, very clear that for CLBeacon.accuracy:
Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.
Let me repeat: one sigma accuracy in meters. All of the top 10 pages on Google about the subject have the term "one sigma" only in quotations from the docs, but none of them analyses the term, which is the key to understanding this.
It is very important to explain what one sigma accuracy actually is. The following URLs are a good start: http://en.wikipedia.org/wiki/Standard_error, http://en.wikipedia.org/wiki/Uncertainty
In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.), and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:
mean (which is easy to understand; it's the value at which the peak of the curve occurs),
standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ a lot from each other, so the measurement is of poor quality.
One sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurement is X, and one sigma is σ, then 68% of all measurements will fall between X - σ and X + σ.
Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, then 68% of measurements were between 6 m and 14 m.
When we measure distance with beacons, we get an RSSI and a 1-meter calibration value, which allow us to estimate the distance in meters. But every measurement gives a different value, and the values form a Gaussian curve. One sigma (and thus accuracy) is the accuracy of the measurement, not the distance!
It may be misleading, because when we move the beacon further away, one sigma actually increases, because the signal is worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.
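To make "one sigma" concrete, here is a tiny Java sketch that takes a handful of made-up distance estimates for one and the same physical distance and reports the mean and the one-sigma interval:

// Mean and one-sigma (standard deviation) of repeated distance estimates.
// The sample values are invented purely to illustrate the 68% interval.
public class OneSigma {
    public static void main(String[] args) {
        double[] estimates = { 9.1, 11.8, 7.9, 10.4, 12.6, 8.7, 10.9, 9.6 };  // meters (hypothetical)

        double mean = 0;
        for (double d : estimates) mean += d;
        mean /= estimates.length;

        double variance = 0;
        for (double d : estimates) variance += (d - mean) * (d - mean);
        variance /= estimates.length;
        double sigma = Math.sqrt(variance);

        // Roughly 68% of the estimates should fall inside [mean - sigma, mean + sigma].
        System.out.printf("mean = %.1f m, one sigma = %.1f m, 68%% interval = [%.1f, %.1f] m%n",
                mean, sigma, mean - sigma, mean + sigma);
    }
}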
There is a blog post which thoroughly analyses the matter: http://blog.shinetech.com/2014/02/17/the-beacon-experiments-low-energy-bluetooth-devices-in-action/
The author's hypothesis is that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty, because when he increased the power to the maximum value, the accuracy value was very small for 1, 5 and even 15 meters, whereas before increasing the power, accuracy was quite close to the distance values. I personally think that behaviour is correct, because the higher the power level, the smaller the impact of interference. And it's strange that Estimote beacons don't behave this way.
I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think that we shouldn't ignore the "one sigma" term from the docs; I would like to start a discussion about it.
It may be possible that Apple's algorithm for accuracy just collects recent measurements and analyses their Gaussian distribution, and that's how it sets accuracy. I wouldn't exclude the possibility that they use info from the accelerometer to detect whether the user is moving (and how fast) in order to reset the previous distribution of distance values, because they have certainly changed.
The iBeacon output power is measured (calibrated) at a distance of 1 meter. Let's suppose that this is -59 dBm (just an example). The iBeacon will include this number as part of its LE advertisement.
The listening device (iPhone, etc.) will measure the RSSI of the device. Let's suppose, for example, that this is, say, -72 dBm.
Since these numbers are in dBm, the ratio of the power is actually the difference in dB. So:
ratio_dB = txCalibratedPower - RSSI
To convert that into a linear ratio, we use the standard formula for dB:
ratio_linear = 10 ^ (ratio_dB / 10)
If we assume conservation of energy, then the signal strength must fall off as 1/r^2. So:
power = power_at_1_meter / r^2, which means ratio_linear = power_at_1_meter / power = r^2. Solving for r, we get:
r = sqrt(ratio_linear)
In Javascript, the code would look like this:
function getRange(txCalibratedPower, rssi) {
    var ratio_db = txCalibratedPower - rssi;
    var ratio_linear = Math.pow(10, ratio_db / 10);
    var r = Math.sqrt(ratio_linear);
    return r;
}
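Plugging in the example numbers from above (txCalibratedPower = -59 dBm, rssi = -72 dBm): ratio_db = 13, ratio_linear ≈ 20, so r ≈ 4.5 meters.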
Note, that, if you're inside a steel building, then perhaps there will be internal reflections that make the signal decay slower than 1/r^2. If the signal passes through a human body (water) then the signal will be attenuated. It's very likely that the antenna doesn't have equal gain in all directions. Metal objects in the room may create strange interference patterns. Etc, etc... YMMV.
iBeacon uses Bluetooth Low Energy(LE) to keep aware of locations, and the distance/range of Bluetooth LE is 160ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).
Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.
A path-loss-based scheme like this is only approximate and is subject to variation from things like antenna angles, intervening objects, and presumably a noisy RF environment. By comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.
As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.
With multiple phones and beacons at the same location, it's going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
You're going to see lots of solutions simply instructing the user to "please hold your phone here", to reduce customer frustration.

Jitter calculation in Wireshark

I have a query regarding the Jitter calculation method in Wireshark.
Wireshark calculates jitter according to RFC3550 (RTP):
If Si is the RTP timestamp from packet i, and Ri is the time of arrival in RTP timestamp units for packet i, then for two packets i and j, D may be expressed as
D(i,j) = (Rj - Ri) - (Sj - Si) = (Rj - Sj) - (Ri - Si)
The interarrival jitter SHOULD be calculated continuously as each data packet i is received from source SSRC_n, using this difference D for that packet and the previous packet i-1 in order of arrival (not necessarily in sequence), according to the formula
J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16
Now, the absolute value of the inter-arrival difference is used here. My query is: why is the absolute value taken, when the difference could also be negative? I think that if we took the negative values into consideration as well, we would get a value closer to the actual jitter rather than the value we get at present.
Also, when we plot the jitter distribution graph using the above method, it won't be centered around zero, since we have made all the values positive, and that graph won't look realistic.
Can someone clarify my query?
Wikipedia has a good definition of jitter:
Jitter is the undesired deviation from true periodicity of an assumed periodic signal...
A jitter value of zero means the signal has no variation from the expected value. As the variation increases (the packets are getting bunched up and spread out) the jitter increases in magnitude.
The bunching and spreading are really the same effect; bunching in one place causes spreading in another so this 'bunching and spreading' doesn't have a direction, just a magnitude.
I hope this helps - it was the best explanation I could come up with.
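To tie this back to the formula quoted in the question, here is a minimal Java sketch of the RFC 3550 running estimator (the class and field names are illustrative; both inputs are in RTP timestamp units):

// Running interarrival-jitter estimator from RFC 3550:
//   D(i-1,i) = (R_i - R_{i-1}) - (S_i - S_{i-1})
//   J(i)     = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16
public class RtpJitter {
    private double jitter = 0;        // J, in RTP timestamp units
    private long prevArrival;         // R_{i-1}
    private long prevTimestamp;       // S_{i-1}
    private boolean hasPrev = false;

    public double update(long arrival, long rtpTimestamp) {
        if (hasPrev) {
            long d = (arrival - prevArrival) - (rtpTimestamp - prevTimestamp);
            jitter += (Math.abs(d) - jitter) / 16.0;
        }
        prevArrival = arrival;
        prevTimestamp = rtpTimestamp;
        hasPrev = true;
        return jitter;
    }
}

Because of the absolute value, J is a magnitude that smoothly tracks how far the inter-arrival spacing deviates from the ideal, which is exactly the "no direction, just a magnitude" point above.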

How to interpret wireshark signal strength

I have used Google for hours but cannot find a satisfying answer on how to interpret the captured signal strength given by the radiotap header. For instance, Wireshark shows me an SSI Signal of -52 dBm and I want to convert it to a linear representation/unit. For me, a sensible unit would be the signal power at the antenna in watts or mW. Is it possible to convert this -52 dBm to mW?
Some background information: I am implementing WLAN-based localisation and want to estimate the position of APs by combining some reference points and the measured signal strength. With the help of triangulation, this should produce a rough map of the environment.
As explained quite well on the Wikipedia page, you can convert from x dBm to P Watts by applying:
P = 10**((x-30)/10)
So in your case, -52 dBm = 6.3 nW, which seems about right.
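And as a small Java sketch of the same conversion (the -52 value is just the example from the question):

// Convert a dBm reading to power: P[W] = 10^((x - 30) / 10), or P[mW] = 10^(x / 10).
public class DbmToPower {
    public static void main(String[] args) {
        double dbm = -52.0;                           // example value from the question
        double watts = Math.pow(10, (dbm - 30) / 10); // ~6.3e-9 W (6.3 nW)
        double milliwatts = Math.pow(10, dbm / 10);   // ~6.3e-6 mW
        System.out.printf("%.1f dBm = %.3g W = %.3g mW%n", dbm, watts, milliwatts);
    }
}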
