Location with iBeacon - iOS

I am using an iBeacon and, using triangulation or trilateration (or something similar), I want to be able to determine an exact (or fairly accurate) distance between the iBeacon and the user's device (in feet, metres, etc.). What is the best way to do this, and how would I do it?
I forgot to mention: I understand that it is possible to find proximity (i.e. near, immediate, far), but as mentioned, ideally I am looking for an accurate distance (maybe by combining the RSSI, accuracy, and proximity values).

For this you should use the RSSI (Received Signal Strength Indication) of the iBeacon. The signal strength indicates how close or far it is from you. But the problems are that:
Every beacon's RSSI-to-distance relationship can differ, so the same RSSI does not imply the same distance or accuracy for every beacon.
If the beacon is behind a wall or any static obstacle, the RSSI-to-distance relationship breaks down.
Therefore, instead of triangulation or trilateration, you should go for fingerprinting. This works better than the other techniques here.
Set up your space with its obstacles in place all around you (fingerprinting captures their effect on the signal).
Make reference points on your map.
Calibrate your app at each reference point, i.e. record the signal strengths from at least the 3 nearest iBeacons and save them against that reference point.
Do this for all the other reference points.
(If you can) Do this two or three times, take the average, and store it in your database.
Now you have a map of calibrated reference points. (This handles the differing RSSI-to-distance behaviour of every beacon.)
Now, from any position, compare the live readings against your stored fingerprints and pick the closest matching reference point; that gives you the user's approximate location. A rough sketch of this matching step is below.
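As a rough illustration of the matching step, here is a minimal sketch, assuming you have already stored the calibrated fingerprints. The types and names (Fingerprint, nearestReferencePoint) are made up for the example and are not part of any framework:

```swift
import Foundation

// A stored reference point: the averaged RSSI per beacon recorded during calibration.
struct Fingerprint {
    let name: String                    // e.g. "Entrance", "Desk 3"
    let rssiByBeacon: [String: Double]  // beacon identifier -> calibrated RSSI
}

/// Returns the stored reference point whose calibrated RSSI values are
/// closest (in "signal space") to the live readings.
func nearestReferencePoint(liveRSSI: [String: Double],
                           fingerprints: [Fingerprint]) -> Fingerprint? {
    return fingerprints.min { a, b in
        signalDistance(liveRSSI, a.rssiByBeacon) < signalDistance(liveRSSI, b.rssiByBeacon)
    }
}

private func signalDistance(_ live: [String: Double], _ reference: [String: Double]) -> Double {
    // Compare only beacons present in both samples; fingerprints with no overlap rank last.
    let common = Set(live.keys).intersection(reference.keys)
    guard !common.isEmpty else { return .greatestFiniteMagnitude }
    let sumOfSquares = common.reduce(0.0) { sum, id in
        let diff = live[id]! - reference[id]!
        return sum + diff * diff
    }
    return (sumOfSquares / Double(common.count)).squareRoot()
}
```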
If you are using Google Maps, the lat/long it provides goes to six decimal places, i.e. about 0.11 meters (one degree of latitude is roughly 111 km, so 10^-6 degrees is about 0.11 m), which I think is pretty accurate even inside a room.
I guess this helps :)
Please mark this the right answer if it works.

In iOS, the Core Location beacon information you get when you range a beacon includes both a "proximity" value (far/near/immediate) and an "accuracy" reading, which is actually an approximate distance in meters.
In order for the distance reading to be as accurate as possible, you should really calibrate your beacons. To do that, you put the beacon exactly 1 meter from the receiver and take a reading. The receiver gives you a "measured power" reading, which you then set on the transmitter. The measured power reading is used in calculating the distance reading.
Distance readings are very approximate, and are subject to interference from the surroundings.
The Apple sample app "AirLocate" shows working code for calibrating a beacon, and I believe it also displays distance readings.
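For reference, here is a minimal ranging sketch using the standard Core Location API (iOS 13+). The UUID is a placeholder for your own beacon UUID:

```swift
import CoreLocation

// Ranges beacons matching a UUID and logs proximity, accuracy (Apple's rough
// distance estimate in meters; -1 means unknown) and RSSI for each one.
class BeaconRanger: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying beaconConstraint: CLBeaconIdentityConstraint) {
        for beacon in beacons {
            print("proximity: \(beacon.proximity.rawValue), " +
                  "accuracy: \(beacon.accuracy) m, rssi: \(beacon.rssi) dBm")
        }
    }
}
```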

Related

How to achieve a more accurate distance from device to Beacon?

I am sorry if this has been asked in one shape or form already. I have started working with beacons in Xcode (Swift), using CoreLocation. I really need a more accurate determination of the distance between the device and a beacon though. So far I have been using the standard proximity region values (Far, Near, and Immediate), however this just isn't cutting it at all. It seems far too unstable for the solution I am looking for - which is a simple one at best.
My scenario:
I need to display notifications, adverts, images etc to the users device when they are approximately 4 meters away from the beacon. This sounds simple enough, but when I found out that the only real solutions there are for beacons are those aforementioned proximity regions, I started to get worried because I need to only display to devices that are 3-5 meters away, no more.
I am aware of the accuracy property of the CLBeacon class, however Apple state it should not be used for accurate positioning of beacons, which I believe is what I am trying to achieve.
Is there a solution to this? Any help is appreciated!
Thanks,
Olly
There are limitations of physics when it comes to estimating distance with Bluetooth radio signals. Radio noise, signal reflections, and obstructions all affect the ability to estimate distance based on radio signal strength. It's OK to use beacons for estimating distance, but you must set your expectations appropriately.
Apple's algorithms in CoreLocation take a running average of the measured signal strength over 20 seconds or so, then come up with a distance estimate in meters that is put into the CLBeacon accuracy field. The value of this field is then used to derive the proximity field (0.5 meters or less means immediate, 0.5-3 meters means near, etc.).
When Apple recommends against using the accuracy field, it is simply trying to protect you against unrealistic expectations. This will never be an exact estimate in meters. Best results will come with a phone out of a pocket, with no obstructions between the beacon and the phone, and the phone relatively stationary. Under best conditions, you might expect to get distance estimates of +/- 1 meter at close distances of 3 meters or less. The further you get away, the more variation you will see.
You have to decide if this is good enough for your use case. If you can control the beacons there are a few things you can do to make the results as good as possible:
Turn the beacon transmitter power setting up as high as possible. This gives you a higher signal to noise ratio, hence better distance estimates.
Turn the advertising rate up as high as possible. This gives you more statistical samples, hence better distance estimates.
Place your beacons in locations where there will be as few obstructions as possible.
Always calibrate your beacon after making changes like those above. Calibration involves measuring the signal level at 1 meter and storing it as a calibration constant inside the beacon. Consult your beacon manufacturer's instructions for how to do this calibration; a sketch of how that constant feeds into a distance estimate is below.
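Once you have a calibration constant (the measured power at 1 m), a common way to turn an RSSI reading into a rough distance is the log-distance path-loss model. This is only a sketch of that model, not Apple's exact algorithm; the path-loss exponent n is an assumption (about 2 in free space, higher indoors):

```swift
import Foundation

// Rough distance estimate from RSSI using the calibrated measured power
// (the RSSI at 1 m). The path-loss exponent n is environment-dependent.
func estimatedDistance(rssi: Int, measuredPower: Int, pathLossExponent n: Double = 2.5) -> Double? {
    guard rssi != 0 else { return nil }   // 0 means "no reading"
    return pow(10.0, Double(measuredPower - rssi) / (10.0 * n))
}

// Example: a measured power of -59 dBm and a current RSSI of -75 dBm
// gives roughly 4-5 meters with n = 2.5.
```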

What's the difference between distance from CLLocationManager and CMPedometer

I'm writing a running app according to an online tutorial on http://www.raywenderlich.com/97944/make-app-like-runkeeper-swift-part-1.
In the tutorial, the distance the user has run since the start is calculated from the two latest recorded locations using the "distanceFromLocation" method of CLLocation. However, CMPedometer also provides distance data that can be retrieved directly. So which one should I use, and why?
Thanks
CMPedometer relies on the motion tracking chips built into modern iPhones to measure steps and distance travelled by the owner of the device. It is able to estimate the number of steps taken using motion data, and extrapolate the distance traveled by the user using step counts and estimated stride length. If a distance estimate is good enough for your purposes, then CMPedometer is an easy, power efficient solution to tracking distance travelled.
On the other hand, if you would like the reported distance to be as accurate as possible, you should use CLLocation and calculate the distance between each location the user travels through on their workout. This requires more complex code and an accurate GPS signal. As an added benefit, you'll be able to use the location data to, for instance, draw a map of where the user ran on their workout.
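A rough sketch of the two approaches, assuming the usual authorization and Info.plist setup is already in place; the class name and the 20 m accuracy threshold are arbitrary choices for the example:

```swift
import CoreMotion
import CoreLocation

// CMPedometer reports an estimated distance directly; with Core Location you
// accumulate the distance between consecutive GPS fixes yourself.
class DistanceTracker: NSObject, CLLocationManagerDelegate {
    private let pedometer = CMPedometer()
    private let locationManager = CLLocationManager()
    private var lastLocation: CLLocation?
    private(set) var gpsDistance: CLLocationDistance = 0

    // 1. Pedometer-based distance (steps * estimated stride, computed by iOS).
    func startPedometerTracking() {
        guard CMPedometer.isDistanceAvailable() else { return }
        pedometer.startUpdates(from: Date()) { data, _ in
            if let meters = data?.distance?.doubleValue {
                print("Pedometer distance: \(meters) m")
            }
        }
    }

    // 2. GPS-based distance, summed between consecutive location fixes.
    func startGPSTracking() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Skip fixes with poor horizontal accuracy to reduce noise.
        for location in locations where location.horizontalAccuracy >= 0 && location.horizontalAccuracy < 20 {
            if let last = lastLocation {
                gpsDistance += location.distance(from: last)
            }
            lastLocation = location
        }
    }
}
```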

Determining Distance and Direction of iDevice from Beacon

I am developing an iOS application in which I need to know the exact distance and direction of the device from the beacon. I am using an Estimote beacon.
I have used iOS's CLLocation as well as Estimote's framework, but both of them give an incorrect value for the distance. Moreover, the values fluctuate a lot, and the beacon even goes into an unknown state (accuracy -1.000) much of the time.
I have also tried to use the formula given here:
Understanding ibeacon distancing
but in iOS, it seems there is no way to get the txPower or measured power of the beacon.
I have searched a lot, but nowhere have I found a satisfactory way to measure the distance accurately.
Is there any other way to accurately find the distance and direction of the iOS device from the beacon?
The distance is computed by comparing the received signal strength (RSSI) with the advertised transmitted power (txPower) of the beacon, as the signal strength in theory is inversely proportional to the square of the distance.
But there are lots of other things that can affect RSSI, including obstacles, orientation of the antennas, and possibly multi-path (reflections). So it's difficult to accurately measure distance based on this information.
Another way of measuring distance is using round-trip-time (RTT): you send something to the beacon, and you measure how long it takes to come back. But this requires a fixed response time, and on this sort of scale (meters), there are probably enough variable delays here and there that it might severely affect the calculation.
Direction would require either triangulation or multiple directional antennas, and I don't believe either is available in this scenario.
In short, you can get a rough idea of the distance (which is why it's good for proximity alerts), but accurate distance or direction would require different technologies.
Why do you need them? There may be alternatives based on your specific scenario.
EDIT
If you have a large number of beacons around, and you know their exact positions, it might be possible to pull off the following:
use at least 3 beacon distances to compute your approximate position by trilateration (a minimal sketch of this is below)
from there, as you know the position of the beacons, you can compute the distance and direction of any of the beacons (or anything else, really)
Of course, depending on the actual accuracy of the beacon distance measurement provided by the SDK, the result might be more or less accurate. The more beacons you have, the more precise you should be able to get (by picking only those that return a distance, or by eliminating those that are not "compatible" with the others when computing solutions).
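For illustration, here is a hypothetical 2D trilateration sketch with exactly three beacons at known positions. In practice the noisy distance estimates mean the three circles rarely intersect exactly, so a real implementation would filter the distances and/or do a least-squares fit over more than three beacons:

```swift
import Foundation

// A beacon at a known position (meters, in a local coordinate frame) with an
// estimated distance to the device.
struct BeaconFix {
    let x: Double, y: Double
    let distance: Double
}

// Subtracting the circle equation of beacon 1 from those of beacons 2 and 3
// yields two linear equations a*x + b*y = c and d*x + e*y = f, solved here
// with Cramer's rule. Returns nil if the beacons are (nearly) collinear.
func trilaterate(_ b1: BeaconFix, _ b2: BeaconFix, _ b3: BeaconFix) -> (x: Double, y: Double)? {
    let a = 2 * (b2.x - b1.x), b = 2 * (b2.y - b1.y)
    let c = b1.distance * b1.distance - b2.distance * b2.distance
          - b1.x * b1.x + b2.x * b2.x - b1.y * b1.y + b2.y * b2.y
    let d = 2 * (b3.x - b1.x), e = 2 * (b3.y - b1.y)
    let f = b1.distance * b1.distance - b3.distance * b3.distance
          - b1.x * b1.x + b3.x * b3.x - b1.y * b1.y + b3.y * b3.y

    let det = a * e - b * d
    guard abs(det) > 1e-9 else { return nil }
    return (x: (c * e - b * f) / det, y: (a * f - c * d) / det)
}
```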
Even having 3 or more beacons with fixed positions, you still won't be able to get very accurate positioning without some serious and complex noise reduction. That's because radio waves are prone to diffraction, multipath propagation, interference and absorption - mostly by metal objects and water (which is why human bodies are strong signal blockers). Even the phone's alignment (antenna position) can have a significant impact on the proximity readings. Therefore, without implementing algorithms for noise reduction, trilateration can give you an accuracy of about 5 meters.
You can find some examples in Obj-C (https://github.com/MatVre/MiBeaconTrilaterationDemo) and Swift (https://github.com/a34729t/TriangulatorSwift) and check how they work for you.
Cheers.

Determine the exact distance of the Beacon

Can we determine the exact distance of the beacon from the iOS app using the properties below?
proximity
accuracy
rssi
If so, how can we achieve it?
Thanks,
accuracy is an estimate of the distance (in meters) between your device and the beacon. It isn't really reliable. In fact, determining an exact distance would require accounting for everything that creates interference or attenuates the signal, which isn't possible.
Accuracy is reasonably reliable within about 1 meter, but the value may fluctuate, and the error grows with distance.
You can also estimate the distance from the rssi, calibrated for your actual environment.
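Since the accuracy value fluctuates, one simple option is to smooth it over a short window of recent readings before acting on it. This is only a sketch; the window size and the decision to skip unknown readings (-1) are assumptions:

```swift
import CoreLocation

// Keeps the last few CLBeacon.accuracy readings and averages them.
final class DistanceSmoother {
    private var recent: [Double] = []
    private let windowSize = 10

    /// Feed the latest accuracy reading; returns the smoothed distance in
    /// meters, or nil if there are no valid readings yet.
    func add(_ accuracy: CLLocationAccuracy) -> Double? {
        guard accuracy >= 0 else { return smoothed }   // -1 means "unknown", skip it
        recent.append(accuracy)
        if recent.count > windowSize { recent.removeFirst() }
        return smoothed
    }

    private var smoothed: Double? {
        guard !recent.isEmpty else { return nil }
        return recent.reduce(0, +) / Double(recent.count)
    }
}
```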

iOS Calculating the vertical movement distance of the device

I'm looking to create a function for my app which records the distance travelled in the vertical plane. More specifically, I want to record how far the device has been 'dropped' - this could mean dropped at arm's length onto the floor or dropped slowly with the user as they go down ten floors in an elevator. I'm looking for advice on the best way to calculate this with a relatively high level of accuracy.
I've read a little on the difficulty in accurately measuring distance travelled using core motion - especially as I need it to work even if the device rotates during the movement. From what I've researched it seems as though it would be impossible, or at least very difficult, to achieve this using core motion.
Would I be able to achieve this effect with Core Location instead? I've seen posts about calculating lateral distance, as in during a car journey, but nothing about vertical distance.
Is it as simple as 'startingAltitude - endingAltitude = distanceTravelled'?
If so - how accurate is the altitude measurement of Core Location and how could I get started with this behaviour? I'm fairly new to iOS programming and would appreciate any pointers on the most appropriate method of achieving the function I want.
Thanks
There are serious limitations to both approaches.
Using an accelerometer to measure distance travelled requires very precise and accurate real-time measurement of acceleration. Any error in acceleration reading leads to error in your velocity calculation, which makes your location reading drift from the real location. Drift gets worse over time, to the point where the error swamps the actual location reading.
Based on my testing, the altitude reading on iOS GPS devices is really bad; +/- 100 meters or more is not uncommon. Indoors, GPS readings tend to get even worse, and the altitude reading is bad enough to start with.
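If you still want to try the altitude-difference approach, this sketch reads the altitude along with its verticalAccuracy so you can see how trustworthy each reading is (expect tens of meters of error, or worse indoors). The class name is made up for the example:

```swift
import CoreLocation

// Records the first valid altitude as the start and reports the vertical
// distance travelled since then, along with the reported vertical accuracy.
class VerticalDropTracker: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private var startingAltitude: CLLocationDistance?

    func start() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // verticalAccuracy <= 0 means the altitude reading is invalid.
        guard let location = locations.last, location.verticalAccuracy > 0 else { return }
        print("altitude: \(location.altitude) m, +/- \(location.verticalAccuracy) m")
        if let start = startingAltitude {
            let verticalDistance = start - location.altitude   // positive when the device moved down
            print("vertical distance so far: \(verticalDistance) m")
        } else {
            startingAltitude = location.altitude
        }
    }
}
```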
