How to interpret Wireshark signal strength - geolocation

I have googled for hours, but I cannot find a satisfying answer on how to interpret the captured signal strength given in the radiotap header. For instance, Wireshark shows me an SSI Signal of -52 dBm and I want to convert it to a linear representation/unit. For me, a sensible unit would be the signal power at the antenna in watts or milliwatts. Is it possible to convert this -52 dBm to mW?
Some background information: I am implementing WLAN-based localisation and want to estimate the positions of APs by combining some reference points and the measured signal strength. With the help of triangulation, this should produce a rough map of the environment.

As explained quite well on the Wikipedia page, you can convert from x dBm to P watts by applying:
P = 10**((x - 30) / 10)
So in your case, -52 dBm ≈ 6.3 nW, which seems about right.
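As a minimal sketch in Python (the function name is mine):

import math  # not strictly needed; the conversion is pure arithmetic

def dbm_to_watts(dbm):
    # dBm is decibels relative to 1 mW, hence the -30 to land in watts.
    return 10 ** ((dbm - 30) / 10)

print(dbm_to_watts(-52))        # ~6.31e-09 W, i.e. about 6.3 nW
print(dbm_to_watts(-52) * 1e3)  # the same power expressed in mW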

Related

fundamental frequency of female voice

According to what I have read on the internet, the normal range of the fundamental frequency of the female voice is 165 to 255 Hz.
I am using Praat, and also a Python library called Parselmouth, to get the fundamental frequency values of a female voice in an audio file (.wav). However, I got some values that are over 255 Hz (e.g. 400+ Hz, 500 Hz).
Is it normal to get big values like this?
It is possible, but unlikely, if you are trying to capture the fundamental frequency (F0) of a speaking voice. It seems likely that you are capturing a more easily resonating overtone (e.g. F1 or F2) instead.
My experiments with Praat give me the impression that with good parameters it will reliably extract F0.
What you'll want to do is to verify that by comparing the pitch curve with a spectrogram. Here's an example of a fitting made by Praat (female speaker):
You can see from the image that:
The most prominent frequency seems to be F2.
Around 200 Hz seems likely to be F0, since there is only noise below that (compared to before/after the segment).
Praat has calculated a good estimate of F0 for the voiced speech segments.
If, after a visual inspection, it seems that you are getting wrong results, you can try to tweak the parameters. Window length greatly affects the frequency resolution.
If you can't capture frequencies this low, you should try increasing the window length; the intuition is that it gives the algorithm a better chance of finding slowly changing periodic features in the data.
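For reference, here is a minimal Parselmouth sketch that constrains the pitch search range to plausible speaking-voice F0 values, so higher resonances are less likely to be picked up (the file name and the exact floor/ceiling values are just assumptions):

import parselmouth

snd = parselmouth.Sound("speaker.wav")  # hypothetical input file
# Limit the search range so resonances above ~300 Hz are not mistaken for F0.
pitch = snd.to_pitch(pitch_floor=100.0, pitch_ceiling=300.0)
f0 = pitch.selected_array['frequency']  # 0.0 marks unvoiced frames
voiced = f0[f0 > 0]
print(voiced.min(), voiced.mean(), voiced.max())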

Frequency analysis of very short signal in GNU Octave

I have some very short signals from an oscilloscope (50k-200k samples) registered over a span of about 2 ms. These are acoustic signals of an ESD (electrostatic discharge) spark.
I'd like to get some frequency data from that signal, in the near-acoustic frequency range (up to about 30 kHz), with as high a time resolution as possible.
I have tried plotting a spectrogram (specgram in Octave) to view the signal, but the output is not really useful. Using specgram( x, N, fs );, where x is my signal with sampling rate fs, I get a plot starting at very high frequencies of about 500 MHz for low values of N, and I get better frequency resolution for big N values (like 2^12-13), but then the window is so wide that I get only 2 spectrum values over the whole signal length.
I understand that this may be a limitation of the Fourier transform, which is probably what the specgram function uses (actually, I don't know much about signal analysis).
Is there any other way to get frequency information (as a function of time) from that kind of signal? I've read something about wavelets, but when I tried using the dwt function of the signal package, I received this error:
error: 'wfilters' undefined near line 51 column 14
error: called from
dwt at line 51 column 12
Even if this worked, I am not so sure I'd know how to actually use the output of those wavelet functions...
To get audio frequency information from such a high sample rate, you will need to obtain a sample vector long enough to contain at least a few whole cycles at audio frequencies, e.g. many tens of milliseconds of contiguous samples, which may or may not be more than your scope can gather. To process this amount of data reasonably, you might low-pass filter the sample data down to just the audio frequencies, and then resample it to a lower sample rate that is still above twice the filter cut-off frequency. You will then end up with a much shorter sample vector to feed an FFT for your audio spectrum analysis.
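A sketch of that pipeline in Python/SciPy (the sample rate, capture length, and decimation factors are my assumptions; scipy.signal.decimate applies the anti-alias low-pass filter for you):

import numpy as np
from scipy import signal

fs = 25_000_000                      # assumed scope sample rate: 25 MS/s
x = np.random.randn(int(fs * 0.05))  # stand-in for ~50 ms of contiguous samples

# Decimate by 250 in two stages (25 MS/s -> 1 MS/s -> 100 kS/s);
# decimate() low-pass filters before downsampling, so content above
# the new Nyquist frequency (50 kHz) is removed first.
y = signal.decimate(x, 25, ftype='fir')
y = signal.decimate(y, 10, ftype='fir')
fs_new = fs // 250

# A short-window spectrogram now gives usable time resolution
# in the 0-50 kHz band.
f, t, Sxx = signal.spectrogram(y, fs=fs_new, nperseg=256)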

Understanding iBeacon distancing

Trying to grasp a basic concept of how distancing with iBeacon (beacon / Bluetooth Low Energy / BLE) can work. Is there any authoritative documentation on how far an iBeacon can measure? Let's say I am 300 feet away... is it possible for an iBeacon to detect this?
Specifically for Bluetooth v4 and v5 and with iOS, but generally for any BLE device.
How do Bluetooth frequency and throughput affect this? Can beacon devices enhance or restrict the distance / improve upon the underlying BLE?
i.e.:
| Version        | Range       | Freq        | Throughput    | Topology   |
|----------------|-------------|-------------|---------------|------------|
| Bluetooth v2.1 | Up to 100 m | < 2.481 GHz | < 2.1 Mbit/s  | scatternet |
| Bluetooth v4   | ?           | < 2.481 GHz | < 305 kbit/s  | mesh       |
| Bluetooth v5   | ?           | < 2.481 GHz | < 1306 kbit/s | mesh       |
The distance estimate provided by iOS is based on the ratio of the beacon signal strength (rssi) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in rssi at 1 meter away. Each beacon must be calibrated with this txPower value to allow accurate distance estimates.
While the distance estimates are useful, they are not perfect, and require that you control for other variables. Be sure you read up on the complexities and limitations before misusing this.
When we were building the Android iBeacon library, we had to come up with our own independent algorithm because the iOS CoreLocation source code is not available. We measured a bunch of rssi measurements at known distances, then did a best fit curve to match our data points. The algorithm we came up with is shown below as Java code.
Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.
protected static double calculateAccuracy(int txPower, double rssi) {
    if (rssi == 0) {
        return -1.0; // if we cannot determine accuracy, return -1.
    }
    double ratio = rssi * 1.0 / txPower;
    if (ratio < 1.0) {
        // Closer than the 1 m calibration point.
        return Math.pow(ratio, 10);
    } else {
        // Best-fit power curve from our measured data points.
        double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
        return accuracy;
    }
}
Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV
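As a worked example (my numbers, not from the original data set): with txPower = -59 and rssi = -72, ratio = 72/59 ≈ 1.22, so calculateAccuracy returns roughly 0.89976 * 1.22^7.7095 + 0.111 ≈ 4.3 meters.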
I have investigated the matter of accuracy/rssi/proximity with iBeacons very thoroughly, and I really think that all the resources on the Internet (blogs, posts on StackOverflow) get it wrong.
davidgyoung (accepted answer, > 100 upvotes) says:
Note that the term "accuracy" here is iOS speak for distance in meters.
Actually, most people say this, but I have no idea why! The documentation makes it very clear that CLBeacon.accuracy:
Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.
Let me repeat: one sigma accuracy in meters. All of the top 10 pages in Google on the subject have the term "one sigma" only in quotations from the docs, but none of them analyses the term, which is core to understanding this.
It is very important to explain what one sigma accuracy actually is. The following URLs are a good start: http://en.wikipedia.org/wiki/Standard_error, http://en.wikipedia.org/wiki/Uncertainty
In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.), and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:
mean (which is easy to understand: it is the value at which the peak of the curve occurs).
standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ greatly from each other, so the measurement has bad quality.
one sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurements is X, and one sigma is σ, then 68% of all measurements will fall between X - σ and X + σ.
Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, it means that 68% of the measurements were between 6 m and 14 m.
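To make the 68% figure concrete, here is a quick numpy check (the numbers are chosen to match the example above):

import numpy as np

rng = np.random.default_rng(0)
# Simulated distance measurements: mean 10 m, one sigma 4 m.
measurements = rng.normal(loc=10.0, scale=4.0, size=100_000)
within_one_sigma = np.mean(np.abs(measurements - 10.0) <= 4.0)
print(within_one_sigma)  # ~0.68, i.e. about 68% fall between 6 m and 14 m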
When we measure distance with beacons, we get the RSSI and the 1-meter calibration value, which allow us to estimate the distance in meters. But every measurement gives a different value, and the values form a Gaussian curve. One sigma (the accuracy) is the accuracy of the measurement, not the distance!
It may be misleading, because when we move the beacon further away, one sigma actually increases, since the signal gets worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.
There is a blog post which thoroughly analyses the matter: http://blog.shinetech.com/2014/02/17/the-beacon-experiments-low-energy-bluetooth-devices-in-action/
The author has a hypothesis that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty because when he increased the power to the maximum value, the accuracy value was very small for 1, 5 and even 15 meters. Before increasing the power, accuracy was quite close to the distance values. I personally think this behaviour is correct, because the higher the power level, the smaller the impact of interference. And it is strange why Estimote beacons don't behave this way.
I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think that we shouldn't ignore the "one sigma" term from the docs; I would like to start a discussion about it.
It may be possible that Apple's algorithm for accuracy just collects recent measurements and analyses their Gaussian distribution, and that is how it sets accuracy. I wouldn't exclude the possibility that they use info from the accelerometer to detect whether the user is moving (and how fast) in order to reset the previous distribution of distance values, because those have certainly changed.
The iBeacon output power is measured (calibrated) at a distance of 1 meter. Let's suppose that this is -59 dBm (just an example). The iBeacon will include this number as part of its LE advertisement.
The listening device (iPhone, etc), will measure the RSSI of the device. Let's suppose, for example, that this is, say, -72 dBm.
Since these numbers are in dBm, the ratio of the power is actually the difference in dB. So:
ratio_dB = txCalibratedPower - RSSI
To convert that into a linear ratio, we use the standard formula for dB:
ratio_linear = 10 ^ (ratio_dB / 10)
If we assume conservation of energy, then the signal strength must fall off as 1/r^2. So:
power = power_at_1_meter / r^2. Solving for r, we get:
r = sqrt(ratio_linear)
In JavaScript, the code would look like this:
function getRange(txCalibratedPower, rssi) {
    var ratio_db = txCalibratedPower - rssi;
    var ratio_linear = Math.pow(10, ratio_db / 10);
    var r = Math.sqrt(ratio_linear);
    return r;
}
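With the example numbers above (txCalibratedPower = -59 dBm, rssi = -72 dBm): ratio_db = 13 dB, ratio_linear ≈ 20, so getRange returns about 4.5 meters.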
Note that if you're inside a steel building, there may be internal reflections that make the signal decay more slowly than 1/r^2. If the signal passes through a human body (water), it will be attenuated. It is very likely that the antenna doesn't have equal gain in all directions. Metal objects in the room may create strange interference patterns. Etc., etc... YMMV.
iBeacon uses Bluetooth Low Energy (LE) for location awareness, and the distance/range of Bluetooth LE is 160 ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).
Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.
A path-loss based scheme like this is only approximate and is subject to variation from things like antenna angles, intervening objects, and presumably a noisy RF environment. In comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.
As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.
With multiple phones and beacons at the same location, it is going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
You're going to see lots of solutions that simply instruct the user to "please hold your phone here" to reduce customer frustration.

What is the RSSI value in an 802.11 packet

I see two values for SSI signal in an 802.11 packet when viewed in Wireshark. I would like to know which one is the correct RSSI value.
Information from wireshark:
SSI Signal: -40 dBm
SSI Noise: -100 dBm
Signal Quality: 64
Antenna: 0
SSI Signal: 60 dB
Also note that the second SSI Signal is ((SSI Signal) - (SSI Noise)).
I am just confused about which one is correct. The Wikipedia entry also says these implementations can be vendor-dependent, so I am totally confused about which value to use.
Take my answer with a pinch of salt: this is what makes sense to me, but it need not be correct. If it makes sense to you, use it.
The first SSI Signal is a measurement of the Rx signal strength at/after the Rx antenna (the calculation is done at the ADC stage).
The SSI Noise is the noise at the ADC stage (probably measured noise).
The second SSI Signal is the SNR: SSI Signal - SSI Noise = (-40) - (-100) = 60. This difference is 60 dB, not dBm; the ratio of two powers expressed in dBm is a pure dB value, so just be sure to use dB as the unit.
Neither of them is actually RSSI per the IEEE definition. RSSI is defined as a unitless number within a vendor-specific range; it does not have a dBm unit, although a lot of popular apps now present it as a dBm value, which has led to significant confusion. Cisco uses values between 0-100, Atheros 0-127, etc. So, going by that logic, the RSSI in this case would probably be the Signal Quality value, 64.
I see two values of SSI signal in 802.11 packet when viewed in wireshark
It sounds as if the driver for the 802.11 adapter used to capture the packet is being weird and supplying both antenna signal strength in dBm and antenna signal strength in dB. What type of adapter was that, and what operating system is running on the machine that did the capture?
"dBm", as the link above indicates, is decibels relative to 1 milliwatt of power; "dB", as the other link above indicates, is decibels relative to some unspecified arbitrary reference. dBm tells you the actual signal power at the antenna; dB doesn't: you can only use dB values to compare with other dB values.
Neither of those is "RSSI" as defined by 802.11; that RSSI value is also arbitrary, but it is even more arbitrary: 802.11 doesn't even say what it measures, just that larger values correspond to stronger signals, and the values are vendor-dependent.
Also note that SSI signal(second time) is the ((SSI signal) - (SSI Noise))
The writer of the driver for your adapter might not have properly read the Radiotap page about the "dB antenna signal" field (linked above), and might have thought it was supposed to be a signal-to-noise ratio, calculated by subtracting the noise value from the signal value (decibels are a logarithmic scale, so the quotient of two values is the difference between the logarithms of those values). I would ignore that value and use "SSI Signal" as the indication of signal strength: -40 dBm = 100 nanowatts, at least per the table in the Wikipedia article on dBm.
As per the link (as of 2016-06-10) http://www.radiotap.org/suggested-fields/RSSI
RSSI is still a "suggested" field, only usable with the OpenBSD OS.
(I was trying to get the same info with an AirPcap adapter and a Windows machine.)

How to calculate distance from Wifi router using Signal Strength?

I would like to calculate the exact location of a mobile device inside a building (so no GPS access).
I want to do this using the signal strength (in dBm) of at least 3 fixed WiFi signals (3 fixed routers whose positions I know).
Google already does this, and I would like to know how they figure out the exact location based on this data.
Check this article for more details : http://www.codeproject.com/Articles/63747/Exploring-GoogleGears-Wi-Fi-Geo-Locator-Secrets
FSPL depends on two parameters: the first is the frequency of the radio signal; the second is the wireless transmission distance. The following formula reflects the relationship between them.
FSPL (dB) = 20log10(d) + 20log10(f) + K
d = distance
f = frequency
K= constant that depends on the units used for d and f
If d is measured in kilometers, f in MHz, the formula is:
FSPL (dB) = 20log10(d)+ 20log10(f) + 32.44
From the Fade Margin equation, Free Space Path Loss can be computed with the following equation.
Free Space Path Loss=Tx Power-Tx Cable Loss+Tx Antenna Gain+Rx Antenna Gain - Rx Cable Loss - Rx Sensitivity - Fade Margin
With the above two Free Space Path Loss equations, we can find out the Distance in km.
Distance (km) = 10^((Free Space Path Loss - 32.44 - 20log10(f)) / 20)
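In code, the forward formula looks like this (a minimal Python sketch; d in km, f in MHz; the function name is mine):

import math

def fspl_db(d_km, f_mhz):
    # Free-space path loss for distance in km and frequency in MHz.
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

print(fspl_db(0.1, 2412))  # ~80.1 dB at 100 m on 802.11 channel 1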
The Fresnel Zone is the area around the visual line-of-sight that radio waves spread out into after they leave the antenna. You want a clear line of sight to maintain strength, especially for 2.4GHz wireless systems. This is because 2.4GHz waves are absorbed by water, like the water found in trees. The rule of thumb is that 60% of Fresnel Zone must be clear of obstacles. Typically, 20% Fresnel Zone blockage introduces little signal loss to the link. Beyond 40% blockage the signal loss will become significant.
r = 17.32 * √(d / (4f))
d = distance [km]
f = frequency [GHz]
r = radius [m]
Source : http://www.tp-link.com/en/support/calculator/
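A one-liner for that Fresnel radius (d in km, f in GHz, result in meters; the function name is mine):

import math

def fresnel_radius_m(d_km, f_ghz):
    # First Fresnel zone radius at the midpoint of the link.
    return 17.32 * math.sqrt(d_km / (4 * f_ghz))

print(fresnel_radius_m(1.0, 2.4))  # ~5.6 m for a 1 km link at 2.4 GHz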
To calculate the distance you need the signal strength and the frequency of the signal. Here is the Java code:
public double calculateDistance(double signalLevelInDb, double freqInMHz) {
    double exp = (27.55 - (20 * Math.log10(freqInMHz)) + Math.abs(signalLevelInDb)) / 20.0;
    return Math.pow(10.0, exp);
}
The formula used is:
distance = 10 ^ ((27.55 - (20 * log10(frequency)) + |signalLevel|) / 20)
Example: frequency = 2412 MHz, signalLevel = -57 dBm, result = 7.000397427391188 m.
This formula is a transformed form of the Free Space Path Loss (FSPL) formula. Here the distance is measured in meters and the frequency in megahertz. For other units you have to use a different constant instead of 27.55. Read about the constants here.
For more information read here.
K = 32.44
FSPL = Ptx - CLtx + AGtx + AGrx - CLrx - Prx - FM
d = 10 ^ (( FSPL - K - 20 log10( f )) / 20 )
Here:
K - constant (32.44, when f in MHz and d in km, change to -27.55 when f in MHz and d in m)
FSPL - Free Space Path Loss
Ptx - transmitter power, dBm ( up to 20 dBm (100mW) )
CLtx, CLrx - cable loss at transmitter and receiver, dB ( 0, if no cables )
AGtx, AGrx - antenna gain at transmitter and receiver, dBi
Prx - receiver sensitivity, dBm ( down to -100 dBm (0.1pW) )
FM - fade margin, dB ( more than 14 dB (normal) or more than 22 dB (good))
f - signal frequency, MHz
d - distance, m or km (depends on value of K)
Note: there is an error in the formulas on the TP-Link support site (a missing ^).
Substitute Prx with received signal strength to get a distance from WiFi AP.
Example: Ptx = 16 dBm, AGtx = 2 dBi, AGrx = 0, Prx = -51 dBm (received signal strength), CLtx = 0, CLrx = 0, f = 2442 MHz (7th 802.11bgn channel), FM = 22. Result: FSPL = 47 dB, d = 2.1865 m.
Note: FM (fade margin) seems to be irrelevant here, but I'm leaving it because of the original formula.
You should take walls into account; the table at http://www.liveport.com/wifi-signal-attenuation may help.
Example: (previous data) + one wooden wall (5 dB, from the table). Result: FSPL = 47 - 5 = 42 dB, d = 1.23 m.
Also please note that antenna gain doesn't add power; it describes the shape of the radiation pattern (a donut in the case of an omnidirectional antenna, a zeppelin in the case of a directional antenna, etc.).
None of this takes into account signal reflections (I have no idea how to account for those). Noise is probably also missing. So this math may be good only for rough distance estimation.
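Here is the whole computation as a Python sketch, using the constants and example values from this answer (the function and variable names are mine):

import math

def distance_m(fspl_db, f_mhz):
    # Distance in meters from free-space path loss; K = -27.55 for m/MHz.
    return 10 ** ((fspl_db + 27.55 - 20 * math.log10(f_mhz)) / 20)

ptx, ag_tx, ag_rx = 16, 2, 0       # dBm, dBi, dBi
prx = -51                          # received signal strength, dBm
fm = 22                            # fade margin, dB (cable losses taken as 0)
fspl = ptx + ag_tx + ag_rx - prx - fm
print(fspl)                        # 47 dB
print(distance_m(fspl, 2442))      # ~2.19 m
print(distance_m(fspl - 5, 2442))  # ~1.23 m after one wooden wall (5 dB)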
The simple answer to your question would be triangulation, which is essentially the concept behind all GPS devices. I would give this article a read to learn more about how Google goes about doing this: http://www.computerworld.com/s/article/9127462/FAQ_How_Google_Latitude_locates_you_?taxonomyId=15&pageNumber=2.
From my understanding, they use a service similar to Skyhook, which is location software that determines your location based on your WiFi/cellphone signals. To achieve their accuracy, these services maintain large databases that store location information on cell towers and WiFi access points; they actually survey metropolitan areas to keep them up to date. For you to achieve something similar, I would assume you'd have to use a service like Skyhook; you can use their SDK ( http://www.skyhookwireless.com/location-technology/ ).
However, if you want to do something internal (like using your own routers' locations), then you'd likely have to create an algorithm that mimics triangulation. You'll have to find a way to get the signal_strength and mac_address of the device and use that information, along with the locations of your routers, to come up with the location. You can probably get information about devices connected to your routers by doing something similar to this ( http://www.makeuseof.com/tag/check-stealing-wifi/ ).
Distance (km) = 10^((Free Space Path Loss - 92.45 - 20log10(f)) / 20), where 92.45 is the constant for f in GHz and d in km.
In general, this is a really bad way of doing things due to multipath interference. This is definitely more of an RF engineering question than a coding one.
Tl;dr: the WiFi RF energy gets scattered in different directions after bouncing off walls, people, the floor, etc. There is no way of telling where you are by triangulation alone, unless you're in an empty room with the WiFi beacons placed in exactly the right spots.
Google is able to get away with this because they can essentially map where every WiFi SSID is to a GPS location whenever any Android user (who opts in to their service) walks into range. That way, the next time a user walks by, even without a perfect GPS signal, the Google mothership can tell where they are. Typically, they'll use that in conjunction with a crappy GPS signal.
What I have seen done is a grid of Zigbee or BTLE devices. If you know where these are laid out, you can use the combined RSS to figure out which ones you're closest to, and go from there.
All you guys need is to learn to navigate with tools that predate GPS: something like a sextant, octant, backstaff, or an astrolabe.
If you receive the signal at 3 different known locations, you only need to measure the signal strength at each and form ratios between those locations; it is then a simple triangle calculation (a² + b² = c²). The stronger the signal strength, the closer the device is to the receiver.
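A least-squares trilateration sketch along those lines, assuming each receiver has already converted signal strength into a distance estimate (all coordinates and distances here are made up):

import numpy as np

# Known receiver positions (m) and estimated distances to the device (m).
rx = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
d = np.array([7.1, 7.1, 7.1])

# Subtracting the first circle equation from the others linearizes the system:
# 2*(xi - x0)*x + 2*(yi - y0)*y = d0^2 - di^2 + (xi^2 + yi^2) - (x0^2 + y0^2)
A = 2 * (rx[1:] - rx[0])
b = d[0] ** 2 - d[1:] ** 2 + np.sum(rx[1:] ** 2, axis=1) - np.sum(rx[0] ** 2)
pos, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pos)  # ~[5.0, 5.0]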
