I am wondering if the magnitude value of a CLLocationAccuracy object represents distance in meters. It is described like this in the Xcode docs:
Description: The magnitude of this value. // "this value" being the accuracy value?
For any value x, x.magnitude.sign is .plus. If x is not NaN, x.magnitude is the absolute value of x. The global abs(_:) function provides more familiar syntax when you need to find an absolute value. In addition, because abs(_:) always returns a value of the same type, even in a generic context, using the function instead of the magnitude property is encouraged.
Listing 1
let targetDistance: Double = 5.25
let throwDistance: Double = 5.5
let margin = targetDistance - throwDistance
// margin == -0.25
// margin.magnitude == 0.25

// Use 'abs(_:)' instead of 'magnitude'
print("Missed the target by \(abs(margin)) meters.")
// Prints "Missed the target by 0.25 meters."
How does the magnitude relate to distance, if at all? I can see that it is different from the raw proximity value because it goes higher than 3.
For example (from console output of testing):
[CLBeacon (uuid:12345678-B644-4520-8F0C-720EAF059935, major:1, minor:3, proximity:3 +/- 5.39m, rssi:-71)] //a beacon that is being ranged
major: 1
minor: 3
accuracy: 5.38695083568272
2.64546246474154 --- magnitude
You can see the accuracy is 5.3869... and the proximity value is 3...and magnitude is 2.6454 - how do they all relate?
The accuracy value attempts to estimate the distance from the beacon in meters. This was confirmed in a private Apple forum several years ago by an Apple support engineer, and it is consistent with my testing. The docs for the property say it is the "one sigma horizontal accuracy in meters where the measuring device's location is referenced at the beaconing device." This simply means that iOS's best guess is that your distance from the beacon is about the accuracy value in meters.
The proximity value is simply an enumeration that represents the following values and their raw integer equivalents:
unknown 0
immediate 1
near 2
far 3
These are basically distance "buckets" derived from the accuracy field, as sketched below. An accuracy of 0-0.5 m will give you an immediate proximity. An accuracy value of 0.5-3 m will give you a near proximity, and an accuracy value > 3 m will give you far. Unknown is returned if the accuracy cannot be computed (it typically returns -1 in this case.)
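A minimal Swift sketch of that bucketing (the thresholds are the observed approximations above, not a documented Apple algorithm, and the function name is made up):

import CoreLocation

// Map an accuracy estimate (meters) to the proximity bucket it appears to drive.
// Thresholds are approximations from observation, not documented by Apple.
func proximityBucket(forAccuracy accuracy: CLLocationAccuracy) -> CLProximity {
    switch accuracy {
    case ..<0:   return .unknown   // accuracy is -1 when it cannot be computed
    case ..<0.5: return .immediate
    case ..<3.0: return .near
    default:     return .far
    }
}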
The documentation shown in the question for "magnitude" is about a mathematical function related to absolute value. It has nothing to do with beacons and is not related to accuracy and proximity of a CLBeacon.
Accuracy simply gives you a value for how accurate the distance data is.
At least for CLLocation we use different constants, such as kCLLocationAccuracyKilometer, among others. Docs at: CLLocationManager.desiredAccuracy
Proximity is an obvious one: how close you are to the beacon (proximity: 3 +/- 5.39m), somewhere between 3 and 5.4 meters.
Magnitude in general terms is simply an absolute value, but I am unsure how it fits into your problem... pretty sure it's useless here unless someone can correct me.
First time posting, thanks for the great community!
I am using AudioKit and trying to add frequency weighting filters to the microphone input and so I am trying to understand the values that are coming out of the AudioKit AKFFTTap.
Currently I am just trying to print the FFT buffer converted into dB values:
for i in 0..<self.bufferSize {
    let db = 20 * log10((self.fft?.fftData[Int(i)])!)
    print(db)
}
I was expecting values ranging in the range of about -128 to 0, but I am getting strange values of nearly -200dB and when I blow on the microphone to peg out the readings it only reaches about -60. Am I not approaching this correctly? I was assuming that the values being output from the EZAudioFFT engine would be plain amplitude values and that the normal dB conversion math would work. Anyone have any ideas?
Thanks in advance for any discussion about this issue!
You need to add up all of the values from self.fft?.fftData (consider changing negative values to positive before adding) and then convert that sum to decibels.
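A minimal sketch of that idea, assuming fftData holds linear amplitude values as in the question (the sample numbers are made up):

import Foundation

// Sum the magnitudes of all FFT bins, then convert the total to decibels.
// Stand-in values for self.fft?.fftData, invented for illustration.
let fftData: [Double] = [0.001, 0.004, 0.002]

let totalAmplitude = fftData.map { abs($0) }.reduce(0, +)
let db = 20 * log10(totalAmplitude)
print(db) // ≈ -43 dB for these sample values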
The values in the array correspond to the values of the bins in the FFT. Having a single bin contain a magnitude value close to 1 would mean that a great amount of energy is in that narrow frequency band e.g. a very loud sinusoid (a signal with a single frequency).
Normal sounds, such as the one caused by you blowing on the mic, spread their energy across the entire spectrum, that is, in many bins instead of just one. For this reason, usually the magnitudes get lower as the FFT size increases.
A magnitude of -40 dB in a single bin is quite loud. If you play a pure tone, you should see a clear peak in one of the bins.
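For a sense of scale: a linear bin magnitude of just 0.01 already corresponds to 20 * log10(0.01) = -40 dB.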
I am a master's student working with localization in VANETs.
At the moment I am working on a trilateration method based on RSSI for Cooperative Positioning (CP).
I am considering the analogue model: Simple Path Loss Model.
But I have some doubts about how to calculate the distance correctly for a given PHY model.
I spent some time (one day) reading some papers by Dr. Sommer about the PHY models included in Veins.
Would anyone help me with this?
I need a way to:
1) Measure the power at the receiver when it receives a beacon (I found this in the Decider class).
In Decider80211p, the received power can be obtained with this line in the method Decider80211p::processSignalEnd(AirFrame* msg):
double recvPower_dBm = 10*log10(signal.getReceivingPower()->getValue(start));
2) Apply an RSSI formula appropriate to the PHY model in order to obtain a distance estimate between transmitter and receiver.
3) Associate this measurement (distance by RSSI) with the Wave Short Message to be delivered to the AppLayer of the receiver (which is measuring the RSSI).
After reading the papers "On the Applicability of Two-Ray Path Loss Models for Vehicular Network Simulation" and "A Computationally Inexpensive Empirical Model of IEEE 802.11p Radio Shadowing in Urban Environments", and investigating how it works in the Veins project, I noticed that each analogue model has its own path loss model with its own variables to describe the model.
For example, for the SimplePathLossModel we have these variables defined in the AnalogueModels folder of the Veins modules:
lambda = 0.051 m (the wavelength for the IEEE 802.11p CCH center frequency of 5.890 GHz)
a constant alpha = 2 (default value used)
a distance factor given by pow(sqrDistance, -pathLossAlphaHalf) / (16.0 * M_PI * M_PI);
I found one formula for indoor environments at this link, but I am in doubt whether it is applicable to vehicular environments.
Any clarification is welcome. Thanks a lot.
Technically, you are correct. Indeed, you could generate a simple look-up table: have one vehicle drive past another, record distances and RSSIs, and you have a table that can map RSSI to mean distance (without needing to know how the TX power, antenna gains, path loss model, fading models, etc. are configured).
In the simplest case, if you assume that antennas are omnidirectional, that path loss follows the Friis transmission equation, that no shadow fading occurs, and that fast fading is negligible, your table will be perfect.
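Under exactly those assumptions the model can also be inverted analytically. A minimal sketch using the SimplePathLossModel parameters from the question (lambda = 0.051 m, alpha = 2); the function name and the example dBm values are illustrative:

import Foundation

// Invert the free-space path loss Pr/Pt = lambda^2 / (16 * pi^2 * d^alpha)
// to estimate the distance d. Assumes unit antenna gains, no shadowing,
// and no fast fading.
func estimateDistance(txPower_dBm: Double, rxPower_dBm: Double,
                      lambda: Double = 0.051, alpha: Double = 2.0) -> Double {
    let pathLoss_dB = txPower_dBm - rxPower_dBm        // Pt/Pr in dB
    let lossLinear = pow(10.0, pathLoss_dB / 10.0)     // Pt/Pr as a linear ratio
    return pow(lossLinear * lambda * lambda / (16.0 * Double.pi * Double.pi),
               1.0 / alpha)
}

// Example: 20 dBm transmit power, -80 dBm received power
print(estimateDistance(txPower_dBm: 20.0, rxPower_dBm: -80.0)) // ≈ 406 m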
In a more complicated case, where your simulation also includes probabilistic fast fading (say, a Nakagami model), shadow fading due to radio obstacles (buildings), etc., your table will still be roughly correct, but less so.
It is important to consider a real-life application, though. Consider whether your algorithm still works if conditions change (a more reflective road surface changing reflection parameters, buildings blocking more or less power, antennas with non-ideal or even unknown gain characteristics, etc.).
Trying to grasp a basic concept of how distancing with iBeacon (beacon / Bluetooth Low Energy / BLE) can work. Is there any real documentation on exactly how far an iBeacon can measure? Let's say I am 300 feet away... is it possible for an iBeacon to detect this?
Specifically for v4 & v5, and with iOS, but generally any BLE device.
How do Bluetooth frequency & throughput affect this? Can beacon devices enhance or restrict the distance / improve upon the underlying BLE?
i.e.:

| Version        | Range       | Freq        | T/sec       | Topo       |
|----------------|-------------|-------------|-------------|------------|
| Bluetooth v2.1 | Up to 100 m | < 2.481 GHz | < 2.1 Mbit  | scatternet |
| Bluetooth v4   | ?           | < 2.481 GHz | < 305 kbit  | mesh       |
| Bluetooth v5   | ?           | < 2.481 GHz | < 1306 kbit | mesh       |
The distance estimate provided by iOS is based on the ratio of the beacon signal strength (rssi) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in rssi at 1 meter away. Each beacon must be calibrated with this txPower value to allow accurate distance estimates.
While the distance estimates are useful, they are not perfect, and require that you control for other variables. Be sure you read up on the complexities and limitations before misusing this.
When we were building the Android iBeacon library, we had to come up with our own independent algorithm because the iOS CoreLocation source code is not available. We measured a bunch of rssi measurements at known distances, then did a best fit curve to match our data points. The algorithm we came up with is shown below as Java code.
Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.
protected static double calculateAccuracy(int txPower, double rssi) {
    if (rssi == 0) {
        return -1.0; // if we cannot determine accuracy, return -1.
    }

    double ratio = rssi * 1.0 / txPower;
    if (ratio < 1.0) {
        return Math.pow(ratio, 10);
    }
    else {
        double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
        return accuracy;
    }
}
Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV
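Since this thread is mostly about iOS, here is an unofficial Swift port of the same curve fit (the constants are taken directly from the Java version above):

import Foundation

// Unofficial Swift port of the Java curve-fit formula above.
// txPower is the calibrated RSSI at 1 m; rssi is the current measurement.
func calculateAccuracy(txPower: Int, rssi: Double) -> Double {
    guard rssi != 0 else {
        return -1.0 // if we cannot determine accuracy, return -1
    }
    let ratio = rssi / Double(txPower)
    if ratio < 1.0 {
        return pow(ratio, 10)
    }
    return 0.89976 * pow(ratio, 7.7095) + 0.111
}

// Example: txPower -59 dBm, measured RSSI -72 dBm -> roughly 4 m
print(calculateAccuracy(txPower: -59, rssi: -72))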
I'm very thoroughly investigating the matter of accuracy/rssi/proximity with iBeacons and I really think that all the resources on the Internet (blogs, posts on StackOverflow) get it wrong.
davidgyoung (accepted answer, > 100 upvotes) says:
Note that the term "accuracy" here is iOS speak for distance in meters.
Actually, most people say this, but I have no idea why! The documentation makes it very clear that CLBeacon.accuracy:
Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.
Let me repeat: one sigma accuracy in meters. All of the top 10 pages on Google on the subject have the term "one sigma" only in quotations from the docs, but none of them analyses the term, which is core to understanding this.
It is very important to explain what one sigma accuracy actually is. The following URLs are a good place to start: http://en.wikipedia.org/wiki/Standard_error, http://en.wikipedia.org/wiki/Uncertainty
In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.) and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:
mean (which is easy to understand; it's the value at which the peak of the curve occurs),
standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ very much from each other, so the measurement has bad quality.
One sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurement is X, and one sigma is σ, then 68% of all measurements will be between X - σ and X + σ.
Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, then 68% of measurements were between 6 m and 14 m.
When we measure distance with beacons, we get the RSSI and the 1-meter calibration value, which allow us to estimate the distance in meters. But every measurement gives a different value, and the values form a Gaussian curve. And one sigma (and accuracy) is the accuracy of the measurement, not the distance!
It may be misleading, because when we move the beacon further away, one sigma actually increases because the signal is worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.
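To make "one sigma" concrete, here is a small sketch that computes the mean and the one-sigma spread of repeated distance measurements (the sample values are invented):

import Foundation

// Repeated distance estimates for one stationary beacon (invented values).
let samples: [Double] = [9.2, 11.5, 8.7, 10.4, 12.1, 9.8, 10.6]

let mean = samples.reduce(0, +) / Double(samples.count)
let variance = samples.map { pow($0 - mean, 2) }.reduce(0, +) / Double(samples.count - 1)
let oneSigma = sqrt(variance)

// Roughly 68% of measurements should fall within mean ± oneSigma.
print("mean ≈ \(mean) m, one sigma ≈ \(oneSigma) m")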
There is a blog post which thoroughly analyses the matter: http://blog.shinetech.com/2014/02/17/the-beacon-experiments-low-energy-bluetooth-devices-in-action/
The author has a hypothesis that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty because when he increased power to the maximum value, the accuracy value was very small for 1, 5 and even 15 meters. Before increasing power, accuracy was quite close to the distance values. I personally think that this is correct, because the higher the power level, the smaller the impact of interference. And it's strange why Estimote beacons don't behave this way.
I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think that we shouldn't ignore the "one sigma" term from the docs; I would like to start a discussion about it.
It may be possible that Apple's algorithm for accuracy just collects recent measurements and analyses their Gaussian distribution, and that's how it sets accuracy. I wouldn't exclude the possibility that they use info from the accelerometer to detect whether the user is moving (and how fast) in order to reset the previous distribution of distance values, because they have certainly changed.
The iBeacon output power is measured (calibrated) at a distance of 1 meter. Let's suppose that this is -59 dBm (just an example). The iBeacon will include this number as part of its LE advertisement.
The listening device (iPhone, etc.) will measure the RSSI of the device. Let's suppose, for example, that this is -72 dBm.
Since these numbers are in dBm, the ratio of the power is actually the difference in dB. So:
ratio_dB = txCalibratedPower - RSSI
To convert that into a linear ratio, we use the standard formula for dB:
ratio_linear = 10 ^ (ratio_dB / 10)
If we assume conservation of energy, then the signal strength must fall off as 1/r^2. So:
power = power_at_1_meter / r^2. Solving for r, we get:
r = sqrt(ratio_linear)
In Javascript, the code would look like this:
function getRange(txCalibratedPower, rssi) {
    var ratio_db = txCalibratedPower - rssi;
    var ratio_linear = Math.pow(10, ratio_db / 10);
    var r = Math.sqrt(ratio_linear);
    return r;
}
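For the example numbers above (txCalibratedPower = -59 dBm, RSSI = -72 dBm), ratio_db is 13 dB, ratio_linear is about 20, and r comes out to roughly 4.5 meters.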
Note that if you're inside a steel building, there may be internal reflections that make the signal decay slower than 1/r^2. If the signal passes through a human body (water) then the signal will be attenuated. It's very likely that the antenna doesn't have equal gain in all directions. Metal objects in the room may create strange interference patterns. Etc, etc... YMMV.
iBeacon uses Bluetooth Low Energy (LE) to keep aware of locations, and the distance/range of Bluetooth LE is about 160 ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).
Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.
A path-loss based scheme like this is only approximate and is subject to variation with things like antenna angles, intervening objects, and presumably a noisy RF environment. In comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.
As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.
With multiple phones and beacons at the same location, it's going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
You're going to see lots of solutions simply instructing the user to "please hold your phone here", to reduce customer frustration.
I have been googling for hours but I cannot find a satisfying answer on how to interpret the captured signal strength given by the radiotap header. For instance, Wireshark shows me an SSI Signal of -52 dBm and I want to convert it to a linear representation/unit. For me, a sensible unit would be the signal power at the antenna in watts or mW. Is it possible to convert this -52 dBm to mW?
Some background information: I am implementing WLAN-based localisation and want to estimate the positions of APs by combining some reference points and the measured signal strength. With the help of triangulation, this should produce a rough map of the environment.
As explained quite well on the Wikipedia page, you can convert from x dBm to P Watts by applying:
P = 10**((x-30)/10)
So in your case, -52 dBm = 6.3 nW, which seems about right.
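In code, the same conversion looks like this (a quick sketch; the function name is just for illustration):

import Foundation

// Convert a dBm reading to watts: P = 10^((x - 30) / 10)
func dBmToWatts(_ dBm: Double) -> Double {
    pow(10.0, (dBm - 30.0) / 10.0)
}

let watts = dBmToWatts(-52.0)
print(watts)          // ≈ 6.3e-9 W (6.3 nW)
print(watts * 1000.0) // ≈ 6.3e-6 mW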