iPhone SDK: CLLocationAccuracy. What constants map to what positioning technology?

With respect to the CLLocationManager docs:
Constant values you can use to specify the accuracy of a location.
extern const CLLocationAccuracy kCLLocationAccuracyBestForNavigation;
extern const CLLocationAccuracy kCLLocationAccuracyBest;
extern const CLLocationAccuracy kCLLocationAccuracyNearestTenMeters;
extern const CLLocationAccuracy kCLLocationAccuracyHundredMeters;
extern const CLLocationAccuracy kCLLocationAccuracyKilometer;
extern const CLLocationAccuracy kCLLocationAccuracyThreeKilometers;
Given that, I have the following questions.
What positioning method (GPS, cell tower, or Wi-Fi) corresponds to each accuracy level?
Does the iPhone SDK utilize the Skyhook Wireless API?
For kCLLocationAccuracyBestForNavigation, there is a note stating the phone must be plugged in. Is this enforced, or is it just warning the developer that the battery is likely to drain quickly from using the GPS receiver?
Thanks in advance.

Cell tower triangulation is used in all accuracy modes, I believe, to speed up an initial fix. See Wikipedia.
I have been playing around with kCLLocationAccuracyBestForNavigation myself. The answer to the plugged-in question is clearly no, it is not enforced. It works perfectly on battery anyway, and over a few hours of data gathering I cannot tell the difference in power consumption (but then I'm doing some CPU-intensive tasks anyway...). My guess from reading the docs is that it uses some Kalman filtering with, e.g., accelerometer data to reduce the number of "off-points".
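For reference, here is a minimal Swift sketch of requesting this accuracy level through the standard CoreLocation API (delegate wiring and Info.plist usage strings omitted):
import CoreLocation
// Request the highest accuracy tier. Apple recommends it only while
// plugged in, but it is not enforced; it works on battery as well.
let manager = CLLocationManager()
manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
manager.requestWhenInUseAuthorization()
manager.startUpdatingLocation()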

Related

Is my experiment suggesting the ESP8266 NodeMCU 12E only has a maximum ADC sampling rate of 6 kHz?

As my first IoT type of project, I tried to hook up a pre-amplified electret microphone to an ESP8266 NodeMCU 12E development board, then stream that audio to another computer over the internet. I tried various protocols and approaches to streaming the audio data, but my best effort still resulted in distorted audio on the receiving end.
I began troubleshooting my poor signal quality by investigating the sampling rate of the analogRead(A0). I wrote the code below to show me how much time it takes to sample 1000 analog data points:
#include <ESP8266WiFi.h>

const char* ssid = "___";
const char* pass = "___";

char pay_load[1000] = {0};
int pay_load_length = sizeof(pay_load) / sizeof(pay_load[0]);

void setup()
{
  Serial.begin(115200);
  Serial.println(0); // start
  WiFi.mode(WIFI_STA);
  WiFi.begin(ssid, pass);
}

void loop()
{
  unsigned long start = micros(); // micros() returns unsigned long, not int
  for (int i = 0; i < pay_load_length; i++) {
    int analog = analogRead(A0);
    (void)analog; // silence the unused-variable warning
  }
  Serial.println(micros() - start); // elapsed microseconds for 1000 reads
}
The serial monitor printed values of about 166000, plus or minus about 100. Does this mean it takes about 166 µs to get a single value from analogRead(A0)? That would mean that, as it stands now, I have a maximum analog-to-digital sampling rate of about 6 kHz, and that this low rate is a major contributor to my poor signal quality? (When I increase the pay_load length, the for-loop execution time increases linearly.)
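For reference, the arithmetic behind that estimate (assuming the loop overhead is negligible):
t_sample ≈ 166,000 µs / 1000 reads = 166 µs per read
f_sample ≈ 1 / 166 µs ≈ 6.0 kHz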
I just want to make sure my experiment above is not misleading me into thinking it is difficult for the ESP8266 to capture (and eventually transmit) stereo-quality audio.
Related note: I've been trying to figure out how to write an ISR using a hardware timer, because others on Stack Overflow suggested that could help me sample analog data points at a higher rate and without jitter. But I feel like I'm getting lost in a maze: I have FreeRTOS, which doesn't support the hardware timer, so now I have to research how to get the NONOS SDK instead, and I'm not sure what challenge lies after that. Is getting an Arduino with an Ethernet shield just easier?

How do magnitude, accuracy, and proximity relate to each other with regard to CLLocationAccuracy

I am wondering if the magnitude value of a CLLocationAccuracy object represents distance in meters. It is described like this in Xcode docs:
Description: The magnitude of this value. // "this value" being the accuracy value?
For any value x, x.magnitude.sign is .plus. If x is not NaN, x.magnitude is the absolute value of x. The global abs(_:) function provides more familiar syntax when you need to find an absolute value. In addition, because abs(_:) always returns a value of the same type, even in a generic context, using the function instead of the magnitude property is encouraged.
Listing 1
let targetDistance: Double = 5.25
let throwDistance: Double = 5.5
let margin = targetDistance - throwDistance
// margin == -0.25
// margin.magnitude == 0.25

// Use 'abs(_:)' instead of 'magnitude'
print("Missed the target by \(abs(margin)) meters.")
// Prints "Missed the target by 0.25 meters."
How does the magnitude relate to distance, if at all? I can see that it is different from the raw proximity value because it goes higher than 3.
For example (from console output of testing):
[CLBeacon (uuid:12345678-B644-4520-8F0C-720EAF059935, major:1, minor:3, proximity:3 +/- 5.39m, rssi:-71)] //a beacon that is being ranged
major: 1
minor: 3
accuracy: 5.38695083568272
2.64546246474154 --- magnitude
You can see the accuracy is 5.3869... and the proximity value is 3...and magnitude is 2.6454 - how do they all relate?
The accuracy value attempts to estimate the distance from the beacon in meters. This was confirmed in a private Apple forum several years ago by an Apple support engineer, and it is consistent with my testing. The docs for the property say it is the "one sigma horizontal accuracy in meters where the measuring device's location is referenced at the beaconing device." This simply means that the best guess of iOS is that your distance from the beacon is about the accuracy value in meters.
The proximity value is simply an enumeration that represents the following values and their raw integer equivalents:
unknown 0
immediate 1
near 2
far 3
These are basically distance "buckets" derived from the accuracy field. An accuracy of 0-0.5 meters will give you an immediate proximity, an accuracy of 0.5-3 meters will give you near, and an accuracy > 3 meters will give you far. Unknown is returned if the accuracy cannot be computed (accuracy typically returns -1 in this case).
The documentation shown in the question for "magnitude" is about a mathematical function related to absolute value. It has nothing to do with beacons and is not related to accuracy and proximity of a CLBeacon.
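To see all three values side by side, here is a minimal Swift sketch using the standard CLLocationManagerDelegate ranging callback (manager setup omitted):
import CoreLocation
// Reads the fields discussed above from each ranged CLBeacon.
func locationManager(_ manager: CLLocationManager,
                     didRangeBeacons beacons: [CLBeacon],
                     in region: CLBeaconRegion) {
    for beacon in beacons {
        let meters = beacon.accuracy  // one-sigma distance estimate in meters; -1 if unknown
        switch beacon.proximity {     // coarse bucket derived from accuracy
        case .immediate: print("immediate, ~\(meters) m")
        case .near:      print("near, ~\(meters) m")
        case .far:       print("far, ~\(meters) m")
        default:         print("unknown distance")
        }
        // magnitude is just the absolute value of the Double; nothing beacon-specific.
        print(meters.magnitude)
    }
}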
Accuracy is simply giving you a value for how accurate the distance data is.
At least for CLLocationManager we use different params such as kCLLocationAccuracyKilometer, among others. Docs at: CLLocationManager.desiredAccuracy
Proximity is an obvious one: how close you are to the beacon (proximity: 3 +/- 5.39m), between 3 and 5.4 meters.
Magnitude in general terms is simply an absolute value, but I am unsure how it fits into your problem... pretty sure it's useless unless someone can correct me.

AudioKit - Issues getting specific frequencies (above 20kHz)

I got a small problem using the AudioKit framework:
----> I can't get the AudioKit framework to pick up frequencies above or below specific limits (below 100 Hz and above 20 kHz).
Edit: I've tested some frequency-tracker apps on my iOS device combined with some online tone-generator tools to check whether my iPhone microphone is able to pick up frequencies above 20000 Hz... And it is.
But using the default frequency-tracker code snippets, frequencies above 20000 Hz cannot be picked up by AudioKit.
mic = AKMicrophone()
tracker = AKFrequencyTracker(mic)
silence = AKBooster(tracker, gain: 0)
AKSettings.audioInputEnabled = true
AudioKit.output = silence
try AudioKit.start()
print(tracker.frequency)
---> Is this is a limitation of the AudioKit framework, a limitation of the default iOS-device settings- or is there maybe another way to achieve getting those frequencies?
This is a limitation in AudioKit. The frequency tracker is based on a Soundpipe opcode called ptrack, which is in turn based on Csound's ptrack. You could modify the parameters to try to give you more precision in the area of your concern, but if you want better results both low and high, that will be very processor intensive. Perhaps using two "banded" trackers for low and high would be better. There are always choices to be made, and AudioKit's frequency tracker is geared towards typical audible ranges.
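A rough Swift sketch of the "banded trackers" idea (this assumes the AudioKit 4 API; the center frequencies and bandwidths are illustrative, not tuned values):
import AudioKit
let mic = AKMicrophone()
// Low band: isolate content around/below 100 Hz before tracking.
let lowBand = AKBandPassButterworthFilter(mic, centerFrequency: 60, bandwidth: 80)
let lowTracker = AKFrequencyTracker(lowBand)
// High band: isolate content near the ~20 kHz ceiling before tracking.
let highBand = AKBandPassButterworthFilter(mic, centerFrequency: 21_000, bandwidth: 2_000)
let highTracker = AKFrequencyTracker(highBand)
// Keep both branches alive in the signal chain, silenced.
AKSettings.audioInputEnabled = true
AudioKit.output = AKMixer(AKBooster(lowTracker, gain: 0),
                          AKBooster(highTracker, gain: 0))
try AudioKit.start()
// Each tracker now reports a frequency within its own band.
print(lowTracker.frequency, highTracker.frequency)
Whether the high tracker can actually report anything above 20 kHz still depends on the session sample rate, so treat this as a starting point rather than a fix.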

Numeric distance from ibeacon

How can I get a numeric distance from an iBeacon:
NSString *proximity;
switch (beacon.proximity) {
    case CLProximityNear:
        proximity = @"Near";
        break;
    case CLProximityImmediate:
        proximity = @"Immediate";
        break;
    case CLProximityFar:
        proximity = @"Far";
        break;
    case CLProximityUnknown:
    default:
        proximity = @"Unknown";
        break;
}
I want to have values like 2.4m
There's a great answer here on Stack Overflow: What are the nominal distances for iBeacon. The simple answer is: there is no numerical value you can easily extrapolate. The longer answer is that to get a numerical value you'd either need multiple iBeacons, or a lot of luck, to generate a figure that's accurate.
If you're using Estimote beacons, they have a distance property which automatically calculates the distance between the device and the beacon.
There is no simple and reliable way to do this. You can reckon it based on the proximity value (immediate, near, far). Moreover, you can try to play with RSSI (which is changing all the time, sometimes quite rapidly) and compare it with TxPower, which is the measured signal strength at a distance of 1 meter from the beacon. As I mentioned at the beginning, it won't be a reliable method, but you can try how well it works for you. I do not expect those values to change in a linear way, but maybe you will find a method that works well in your case.
The accuracy parameter returns a numeric distance in meters.
Proximity values like Immediate, Near, Far, and Unknown are predefined by Apple on the basis of RSSI (received signal strength indicator), accuracy (distance in meters), and Tx power (measured signal strength at a 1 m distance from the beacon).
If you want to create your own algorithm you can always use those parameters, but they are not reliable, as RSSI fluctuates too much.
So you have to test for the optimum values of these parameters in your surroundings.
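As a minimal Swift illustration of reading that numeric value (assuming beacon is a CLBeacon delivered by the ranging delegate callback):
// accuracy is the distance estimate in meters; -1 means it could not be computed.
let distance = beacon.accuracy
if distance >= 0 {
    print(String(format: "%.1fm", distance)) // e.g. "2.4m"
} else {
    print("Unknown")
}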

Understanding ibeacon distancing

Trying to grasp a basic concept of how distancing with iBeacon (beacon / Bluetooth Low Energy / BLE) can work. Is there any true documentation on how far exactly an iBeacon can measure? Let's say I am 300 feet away... is it possible for an iBeacon to detect this?
Specifically for Bluetooth v4 and v5, and with iOS, but generally for any BLE device.
How do Bluetooth frequency and throughput affect this? Can beacon devices enhance or restrict the distance, or improve upon the underlying BLE? E.g.:
|                | Range       | Freq        | T/sec       | Topo       |
|----------------|-------------|-------------|-------------|------------|
| Bluetooth v2.1 | Up to 100 m | < 2.481 GHz | < 2.1 Mbit  | scatternet |
| Bluetooth v4   | ?           | < 2.481 GHz | < 305 kbit  | mesh       |
| Bluetooth v5   | ?           | < 2.481 GHz | < 1306 kbit | mesh       |
The distance estimate provided by iOS is based on the ratio of the beacon signal strength (rssi) over the calibrated transmitter power (txPower). The txPower is the known measured signal strength in rssi at 1 meter away. Each beacon must be calibrated with this txPower value to allow accurate distance estimates.
While the distance estimates are useful, they are not perfect, and require that you control for other variables. Be sure you read up on the complexities and limitations before misusing this.
When we were building the Android iBeacon library, we had to come up with our own independent algorithm because the iOS CoreLocation source code is not available. We measured a bunch of rssi measurements at known distances, then did a best fit curve to match our data points. The algorithm we came up with is shown below as Java code.
Note that the term "accuracy" here is iOS speak for distance in meters. This formula isn't perfect, but it roughly approximates what iOS does.
protected static double calculateAccuracy(int txPower, double rssi) {
    if (rssi == 0) {
        return -1.0; // if we cannot determine accuracy, return -1.
    }

    double ratio = rssi * 1.0 / txPower;
    if (ratio < 1.0) {
        return Math.pow(ratio, 10);
    }
    else {
        double accuracy = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
        return accuracy;
    }
}
Note: The values 0.89976, 7.7095 and 0.111 are the three constants calculated when solving for a best fit curve to our measured data points. YMMV
I'm very thoroughly investigating the matter of accuracy/RSSI/proximity with iBeacons, and I really think that all the resources on the Internet (blogs, posts on Stack Overflow) get it wrong.
davidgyoung (accepted answer, > 100 upvotes) says:
Note that the term "accuracy" here is iOS speak for distance in meters.
Actually, most people say this, but I have no idea why! The documentation makes it very clear what CLBeacon.accuracy is:
Indicates the one sigma horizontal accuracy in meters. Use this property to differentiate between beacons with the same proximity value. Do not use it to identify a precise location for the beacon. Accuracy values may fluctuate due to RF interference.
Let me repeat: one sigma accuracy in meters. The top 10 pages on Google on the subject have the term "one sigma" only in the quotation from the docs, but none of them analyses the term, which is core to understanding this.
It is very important to explain what one sigma accuracy actually is. The following URLs are a good start: http://en.wikipedia.org/wiki/Standard_error, http://en.wikipedia.org/wiki/Uncertainty
In the physical world, when you make a measurement, you always get different results (because of noise, distortion, etc.), and very often the results form a Gaussian distribution. There are two main parameters describing a Gaussian curve:
the mean (which is easy to understand: it's the value at which the peak of the curve occurs);
the standard deviation, which says how wide or narrow the curve is. The narrower the curve, the better the accuracy, because all results are close to each other. If the curve is wide and not steep, it means that measurements of the same phenomenon differ very much from each other, so the measurement has bad quality.
One sigma is another way to describe how narrow or wide the Gaussian curve is.
It simply says that if the mean of the measurement is X, and one sigma is σ, then 68% of all measurements will be between X - σ and X + σ.
Example: we measure distance and get a Gaussian distribution as a result. The mean is 10 m. If σ is 4 m, it means that 68% of measurements were between 6 m and 14 m.
When we measure distance with beacons, we get the RSSI and the 1-meter calibration value, which allow us to estimate distance in meters. But every measurement gives a different value, and the values form a Gaussian curve. And one sigma (and accuracy) is the accuracy of the measurement, not the distance!
It may be misleading, because when we move the beacon further away, one sigma actually increases, because the signal is worse. But with different beacon power levels we can get totally different accuracy values without actually changing the distance. The higher the power, the smaller the error.
There is a blog post which thoroughly analyses the matter: http://blog.shinetech.com/2014/02/17/the-beacon-experiments-low-energy-bluetooth-devices-in-action/
The author's hypothesis is that accuracy is actually distance. He claims that beacons from Kontakt.io are faulty, because when he increased the power to the max value, the accuracy value was very small for 1, 5, and even 15 meters. Before increasing the power, accuracy was quite close to the distance values. I personally think that behavior is correct, because the higher the power level, the smaller the impact of interference. And it's strange that Estimote beacons don't behave this way.
I'm not saying I'm 100% right, but apart from being an iOS developer I have a degree in wireless electronics, and I think that we shouldn't ignore the "one sigma" term from the docs. I would like to start a discussion about it.
It may be that Apple's algorithm for accuracy just collects recent measurements and analyses their Gaussian distribution, and that's how it sets accuracy. I wouldn't exclude the possibility that it uses info from the accelerometer to detect whether the user is moving (and how fast), in order to reset the previous distribution of distance values, because they have certainly changed.
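To make that hypothesis concrete, here is a purely illustrative Swift sketch (not Apple's actual algorithm) of estimating a mean and one-sigma spread from a window of recent distance samples:
import Foundation
// Illustrative only: mean and one-sigma spread of a sample window.
func oneSigma(of samples: [Double]) -> (mean: Double, sigma: Double)? {
    guard samples.count > 1 else { return nil }
    let mean = samples.reduce(0, +) / Double(samples.count)
    let variance = samples.map { ($0 - mean) * ($0 - mean) }
                          .reduce(0, +) / Double(samples.count - 1)
    return (mean, variance.squareRoot())
}
// Hypothetical window of recent distance estimates, in meters.
if let stats = oneSigma(of: [9.2, 10.5, 11.1, 8.7, 10.4]) {
    // About 68% of the samples fall within mean ± sigma.
    print("mean \(stats.mean) m, one sigma \(stats.sigma) m")
}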
The iBeacon output power is measured (calibrated) at a distance of 1 meter. Let's suppose that this is -59 dBm (just an example). The iBeacon will include this number as part of its LE advertisement.
The listening device (iPhone, etc), will measure the RSSI of the device. Let's suppose, for example, that this is, say, -72 dBm.
Since these numbers are in dBm, the ratio of the power is actually the difference in dB. So:
ratio_dB = txCalibratedPower - RSSI
To convert that into a linear ratio, we use the standard formula for dB:
ratio_linear = 10 ^ (ratio_dB / 10)
If we assume conservation of energy, then the signal strength must fall off as 1/r^2. So:
power = power_at_1_meter / r^2
Since ratio_linear is power_at_1_meter / power, this gives ratio_linear = r^2. Solving for r:
r = sqrt(ratio_linear)
In Javascript, the code would look like this:
function getRange(txCalibratedPower, rssi) {
    var ratio_db = txCalibratedPower - rssi;
    var ratio_linear = Math.pow(10, ratio_db / 10);
    var r = Math.sqrt(ratio_linear);
    return r;
}
Note that if you're inside a steel building, there may be internal reflections that make the signal decay more slowly than 1/r^2. If the signal passes through a human body (water), it will be attenuated. It's very likely that the antenna doesn't have equal gain in all directions. Metal objects in the room may create strange interference patterns. Etc., etc... YMMV.
iBeacon uses Bluetooth Low Energy (LE) to advertise its presence, and the range of Bluetooth LE is about 160 ft (http://en.wikipedia.org/wiki/Bluetooth_low_energy).
Distances to the source of iBeacon-formatted advertisement packets are estimated from the signal path attenuation calculated by comparing the measured received signal strength to the claimed transmit power which the transmitter is supposed to encode in the advertising data.
A path-loss-based scheme like this is only approximate and is subject to variation from things like antenna angles, intervening objects, and presumably a noisy RF environment. In comparison, systems really designed for distance measurement (GPS, radar, etc.) rely on precise measurements of propagation time, in some cases even examining the phase of the signal.
As Jiaru points out, 160 ft is probably beyond the intended range, but that doesn't necessarily mean that a packet will never get through, only that one shouldn't expect it to work at that distance.
With multiple phones and beacons at the same location, it's going to be difficult to measure proximity with any high degree of accuracy. Try using the Android "b and l bluetooth le scanner" app to visualize the signal strength (distance) variations for multiple beacons, and you'll quickly discover that complex, adaptive algorithms may be required to provide any form of consistent proximity measurement.
You're going to see lots of solutions simply instructing the user to "please hold your phone here" to reduce customer frustration.
