How high is GeoPositionAccuracy.High? - geolocation

When working with the Windows Phone Location API, I am trying to gauge the distance between two points that are only, say, inches or feet apart. Accuracy is therefore very important.
What is the difference between GeoPositionAccuracy.Default and GeoPositionAccuracy.High? Is the difference related to the number of decimal values? If that is the case, how many decimal values are assigned for GeoPositionAccuracy.Default and GeoPositionAccuracy.High?

GeoPositionAccuracy just tells Windows Phone whether you need an accurate position or not. You get to make that choice because higher accuracy uses more battery, so it's better not to use it if you're just trying to figure out which town the user is currently in. GeoPositionAccuracy.Default probably doesn't even use the GPS, but alternative localization methods instead.
For the actual accuracy of the position you get, you can check the HorizontalAccuracy property of the GeoCoordinate. It gives you the error margin in meters.
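As a platform-agnostic sketch of how an app might use that error margin (the `Fix` record and the 5 m threshold below are hypothetical, not the Windows Phone API), a task that needs fine-grained distances could discard fixes whose reported error is too large:

```python
from dataclasses import dataclass

@dataclass
class Fix:
    latitude: float
    longitude: float
    horizontal_accuracy_m: float  # reported error margin in meters

def usable_fixes(fixes, max_error_m=5.0):
    """Keep only the fixes whose reported error is within the tolerance."""
    return [f for f in fixes if f.horizontal_accuracy_m <= max_error_m]

fixes = [Fix(51.5, -0.12, 3.0), Fix(51.5, -0.12, 48.0)]
good = usable_fixes(fixes)   # only the 3 m fix survives
```
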

GeoPositionAccuracy.Default uses a combination of the following:
Assisted GPS (aka cell tower triangulation)
Wi-Fi Triangulation (aka IP lookup)
GeoPositionAccuracy.High uses the methods above, in addition to:
GPS
Assisted GPS is battery friendly and locks onto the location fast, since it uses the cell radio, which is already on. It can be accurate to the city block or building level in urban areas, but accuracy degrades in rural areas where cell tower density decreases. It also works well inside buildings.
GPS is slower to acquire a satellite lock and consumes more battery power, since the GPS radio must be turned on as needed. Once locked on, it's the most accurate positioning method, to within just a few feet. It's virtually useless inside buildings, since it needs line of sight to the sky. Dense urban areas like Manhattan can also cause GPS inaccuracies.
Wi-Fi / IP lookup is the least accurate, since the IP address of a Wi-Fi hotspot can sometimes be registered at a different address.

I don't believe it makes any guarantees on the accuracy, it just allocates more resources towards acquiring a more accurate position (by providing more power to GPS module for instance).


How to achieve a more accurate distance from device to Beacon?

I am sorry if this has been asked in one form or another. I have started working with beacons in Xcode (Swift), using CoreLocation. I really need a more accurate measure of the distance between the device and a beacon, though. So far I have been using the standard proximity region values (Far, Near, and Immediate), but this just isn't cutting it: it seems far too unstable for the solution I am looking for, which is a simple one at best.
My scenario:
I need to display notifications, adverts, images, etc. on the user's device when they are approximately 4 meters away from the beacon. This sounds simple enough, but when I found out that the only real solutions for beacons are the aforementioned proximity regions, I started to get worried, because I need to display only to devices that are 3-5 meters away, no more.
I am aware of the accuracy property of the CLBeacon class; however, Apple states it should not be used for the accurate positioning of beacons, which I believe is what I am trying to achieve.
Is there a solution to this? Any help is appreciated!
Thanks,
Olly
There are limitations of physics when it comes to estimating distance with Bluetooth radio signals. Radio noise, signal reflections, and obstructions all affect the ability to estimate distance based on radio signal strength. It's OK to use beacons for estimating distance, but you must set your expectations appropriately.
Apple's algorithms in CoreLocation take a running average of the measured signal strength over 20 seconds or so, then come up with a distance estimate in meters that is put into the CLBeacon accuracy field. The results of this field are then used to come up with the proximity field. (0.5 meters or less means immediate, 0.5-3 meters means near, etc.)
When Apple recommends against using the accuracy field, it is simply trying to protect you against unrealistic expectations. This will never be an exact estimate in meters. Best results will come with a phone out of a pocket, with no obstructions between the beacon and the phone, and the phone relatively stationary. Under best conditions, you might expect to get distance estimates of +/- 1 meter at close distances of 3 meters or less. The further you get away, the more variation you will see.
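The pipeline described above can be sketched roughly like this (the measured power, path-loss exponent, and bucket thresholds are illustrative guesses, not Apple's actual values):

```python
# Average recent RSSI samples, convert to a distance estimate with a
# log-distance path-loss model, then bucket into proximity zones.

def estimated_distance(rssi_avg, measured_power=-59, path_loss_exp=2.0):
    """Log-distance path-loss model: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((measured_power - rssi_avg) / (10 * path_loss_exp))

def proximity(distance_m):
    if distance_m < 0.5:
        return "immediate"
    if distance_m < 3.0:
        return "near"
    return "far"

samples = [-62, -60, -58, -61, -59]        # recent RSSI readings (dBm)
rssi_avg = sum(samples) / len(samples)     # running average smooths the noise
zone = proximity(estimated_distance(rssi_avg))
```
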
You have to decide if this is good enough for your use case. If you can control the beacons there are a few things you can do to make the results as good as possible:
Turn the beacon transmitter power setting up as high as possible. This gives you a higher signal to noise ratio, hence better distance estimates.
Turn the advertising rate up as high as possible. This gives you more statistical samples, hence better distance estimates.
Place your beacons in locations where there will be as few obstructions as possible.
Always calibrate your beacon after making the changes like above. Calibration involves measuring the signal level at 1 meter and storing this as a calibration constant inside the beacon. Consult your beacon manufacturer instructions for details of how to do this calibration.
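In outline, the calibration step averages readings taken at exactly 1 meter and stores the result as the beacon's "measured power" constant, which later feeds the distance formula (a simple mean; real beacon firmware and SDKs may use a different procedure):

```python
def calibrate_measured_power(rssi_samples_at_1m):
    """Average RSSI measured at exactly 1 m becomes the calibration constant."""
    return sum(rssi_samples_at_1m) / len(rssi_samples_at_1m)

def distance_from_rssi(rssi, measured_power, path_loss_exp=2.0):
    """Log-distance path-loss model using the stored calibration constant."""
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exp))

measured_power = calibrate_measured_power([-58, -60, -59, -61, -62])
# By construction, an RSSI equal to the calibration constant maps to 1 m.
```
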

Does altitude variance affect geofencing?

I need to implement an auto check-in feature for managing office attendance. Will the accuracy of geofencing be affected if the office is situated on the 50th floor, i.e. does altitude variance affect the accuracy of geofencing?
Altitude is not an issue for the geofence itself (geofencing is based on lat/long coordinates, which are the same regardless of the floor). The issue is going to be GPS accuracy. Because GPS signals cannot penetrate walls, the GPS coordinates will nearly always show you outside of the building, not inside it, let alone identify which office room someone has entered.
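A minimal sketch of why the floor doesn't matter: a circular geofence check compares only the 2-D great-circle distance to the fence radius, so altitude never enters the test (coordinates below are arbitrary examples):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/long points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    # No altitude term anywhere: the 50th floor and the lobby test identically.
    return haversine_m(lat, lon, fence_lat, fence_lon) <= radius_m
```
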
If this is acceptable accuracy level for your project, then just GPS works. If you need to see that the person has actually entered the building or the specific office, you will need to utilize beacons.
(Disclosure: I work for Proximi.io, a technology-agnostic positioning platform)

Determining Distance and Direction of iDevice from Beacon

I am developing an iOS application in which I need to know the exact distance and direction of the device from the beacon. I am using Estimote beacon.
I have used iOS's CLLocation as well as Estimote's framework, but both of them give an incorrect value for the distance. Moreover, the values fluctuate a lot; the beacon even goes into an unknown state (accuracy -1.000) a lot of the time.
I have also tried to use the formula given here:
Understanding ibeacon distancing
but in iOS, it seems there is no way to get the txPower or measured power of Beacon.
I have searched a lot, but nowhere have I found a satisfactory way to measure the distance accurately.
Is there any other way to accurately find the distance and direction of an iOS device from a beacon?
The distance is computed by comparing the received signal strength (RSSI) with the advertised transmitted power (txPower) of the beacon, as the signal strength in theory is inversely proportional to the square of the distance.
But there are lots of other things that can affect RSSI, including obstacles, orientation of the antennas, and possibly multi-path (reflections). So it's difficult to accurately measure distance based on this information.
Another way of measuring distance is using round-trip-time (RTT): you send something to the beacon, and you measure how long it takes to come back. But this requires a fixed response time, and on this sort of scale (meters), there are probably enough variable delays here and there that it might severely affect the calculation.
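A quick back-of-the-envelope sketch shows why RTT ranging is so sensitive at this scale: light travels about 30 cm per nanosecond, so a few nanoseconds of variable delay already shift the estimate by meters (the fixed-delay parameter below is a hypothetical knob, not part of any beacon protocol):

```python
C = 299_792_458.0  # speed of light, m/s

def rtt_distance_m(rtt_s, fixed_response_delay_s=0.0):
    """Distance = c * (round-trip time minus the responder's fixed delay) / 2."""
    return C * (rtt_s - fixed_response_delay_s) / 2

# A mere 10 ns error in the assumed response delay shifts the estimate
# by roughly 1.5 m, which is already the size of the distances involved.
error_m = rtt_distance_m(10e-9)
```
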
Direction would require either triangulation or multiple directional antennas; I don't believe either is available in this scenario.
In short, you can get a rough idea of the distance (which is why it's good for proximity alerts), but accurate distance or direction would require different technologies.
Why do you need them? There may be alternatives based on your specific scenario.
EDIT
If you have a large number of beacons around, and you know their exact positions, it might be possible to pull off the following:
use at least 3 beacon distances to compute your exact position by triangulation
from there, as you know the position of the beacons, you can compute the distance and direction of any of the beacons (or anything else, really)
Of course, depending on the actual accuracy of the beacon distance measurement provided by the SDK, the result might be more or less accurate. The more beacons you have, the more precise you should be able to get (by picking only those that return a distance, or by eliminating those that are not "compatible" with the others when computing solutions).
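The position-from-three-distances step can be sketched as the algebraic intersection of three circles (planar coordinates for simplicity; real deployments need a least-squares fit over many noisy distances):

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for the point at distance r1, r2, r3 from three known positions."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting pairs of circle equations yields two linear equations in x, y.
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# Beacons at (0,0), (4,0), (0,4); distances measured from the point (1,1).
pos = trilaterate((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5)
```
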
Even with 3 or more beacons at fixed positions, you still won't be able to get very accurate positioning without some serious and complex noise reduction. That's because radio waves are prone to diffraction, multipath propagation, interference, and absorption, mostly by metal objects and water particles (human bodies are therefore strong signal blockers). Even the phone's alignment (antenna position) can have a significant impact on the proximity readings. Therefore, without implementing algorithms for noise reduction, trilateration can give you an accuracy of about 5 meters.
You can find some examples in Obj-C (https://github.com/MatVre/MiBeaconTrilaterationDemo) and Swift (https://github.com/a34729t/TriangulatorSwift) and check how they work for you.
Cheers.

Location with iBeacon

I am using an iBeacon, and using triangulation and trilateration (or something similar), want to be able to locate an exact (or fairly accurate) distance between the iBeacon and user's device (in feet/metres/e.t.c). What is the best way to do this, and how would I do this?
I forgot to mention: I understand that it is possible to find proximity (i.e near, immediate, far, etc.), however as mentioned, ideally I am looking to find an accurate distance (maybe by combining RSSI, accuracy, and proximity values).
For this you should use the RSSI (Received Signal Strength Indication) of the iBeacon. The signal strength indicates how close or far the beacon is from you. But the problems are that:
Every beacon's RSSI-to-distance ratio might differ in accuracy.
If the beacon is behind a wall or any static obstacle, the RSSI-to-distance ratio will not hold.
Therefore, instead of triangulation or trilateration, you should go for fingerprinting. This will work better than the rest of the techniques.
Place obstacles all around you.
Make reference points on your map.
Calibrate your app at each location, i.e. get the signal strengths from at least the 3 nearest iBeacons and save them against that reference point.
Do this for all the other reference points.
(If you can) Do this twice or thrice, take the average, and store it in a database.
Now you have a map of calibrated reference points. (This handles the different RSSI-to-distance ratios of all the beacons.)
Now, whenever you are at any position, compare its readings with the stored points and you will find the nearest reference point.
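The matching step can be sketched as a nearest-neighbour search in RSSI space (the reference points and readings below are made up for illustration):

```python
# Each reference point stores the RSSI (dBm) seen from the three nearest
# beacons during calibration; at runtime we pick the reference point whose
# stored vector is closest to the live readings (squared Euclidean distance).

fingerprints = {
    "doorway":   (-55, -70, -80),
    "desk":      (-72, -58, -75),
    "storeroom": (-80, -76, -54),
}

def closest_reference(live_rssi):
    def dist_sq(stored):
        return sum((a - b) ** 2 for a, b in zip(stored, live_rssi))
    return min(fingerprints, key=lambda name: dist_sq(fingerprints[name]))
```
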
If you are using Google Maps, the lat/long it provides goes to six decimal places, i.e. about 0.11 meters, which I think is pretty accurate even within a room.
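That figure checks out: one degree of latitude spans roughly 111,320 m, so the sixth decimal place resolves about 0.11 m:

```python
METERS_PER_DEGREE_LAT = 111_320.0  # approximate; varies slightly with latitude

def lat_resolution_m(decimal_places):
    """Ground distance represented by the last decimal place of a latitude."""
    return METERS_PER_DEGREE_LAT * 10 ** (-decimal_places)

res = lat_resolution_m(6)   # roughly 0.11 m
```
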
I guess this helps :)
Please mark this as the right answer if it works.
In iOS the Core Location beacon information you get when you range a beacon includes both a "proximity" value (far/near/immediate) and an "accuracy" reading, which is actually approximate distance, in meters.
In order for the distance reading to be as accurate as possible, you should really calibrate your beacons. To do that, you put the beacon exactly 1 meter from the receiver and take a reading. The receiver gives you a "measured power" reading, which you then set on the transmitter. The measured power reading is used in calculating the distance reading.
Distance readings are very approximate, and are subject to interference from the surroundings.
The Apple sample app "AirLocate" shows working code for calibrating a beacon, and I believe it also displays distance readings.

ios google maps accuracy

I have a view that displays the user's current location using Google Maps, and the route to their destination.
The problem: the user's location is off the road most of the time. I can't put the app in the App Store like this; it will get bad reviews.
I checked the Google SDK for iOS; is there any property for accuracy?
like: self.googleMap.accuracy = bestForNavigation
Are there any tweaks or properties to set that improve user location accuracy?
How do the maps apps on the App Store display the user's location as accurately as the Google app?
In CLLocationManager you can set desiredAccuracy to kCLLocationAccuracyBestForNavigation. However, GPS is still never perfect; you may get anywhere from 5 to 100 m of error depending on signal quality, sky view (canyons, cities), etc.
Another source of error to watch out for: make sure your GPS data is in the same datum (i.e. WGS-84) as the map and the road network data. Different datums can add small (or large) errors.
To compensate for inherent GPS and mapping error, most turn-by-turn navigation apps use what we call "snap to road". We compute which roads the user is near, and IF the GPS location is within 30 m of the road (see note 1) AND the course (or heading, see note 2) is within +/- 25 degrees of the road direction, THEN we "snap" to the road. That means we change the location and heading of the displayed location dot so that it shows the user exactly on the road (we compute the nearest point on the road from the GPS point) and heading along the road path (we select the road heading that is closest). This requires detailed road geometry data, including curves, and some fun calculations, but it works really well once you get it.
If they are further off the road or not aligned with the road heading then we show their actual GPS location. This works really well but it requires that you have the road network geometry available (or at least their route to destination geometry) so you can make these checks.
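Under the stated thresholds, the snap decision can be sketched in planar coordinates like this (a single straight road segment; real implementations search an entire road network and work in geodetic coordinates):

```python
import math

def nearest_point_on_segment(p, a, b):
    """Project p onto segment a-b, clamped to the segment's endpoints."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)

def heading_diff(h1, h2):
    """Smallest angle between two compass headings, in degrees."""
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)

def snap_to_road(p, heading, seg_a, seg_b, road_heading,
                 max_dist_m=30.0, max_heading_deg=25.0):
    q = nearest_point_on_segment(p, seg_a, seg_b)
    dist = math.hypot(p[0] - q[0], p[1] - q[1])
    if dist <= max_dist_m and heading_diff(heading, road_heading) <= max_heading_deg:
        return q, road_heading   # snapped onto the road
    return p, heading            # too far off, or misaligned: show the raw fix
```
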
Note 1: we select a tolerance based on the reported horizontalAccuracy from the CLLocation we get.
Note 2: we blend the compass heading (corrected) and the course (from GPS) to decide the user's actual heading. Below about 8 kph we predominantly use the compass heading; above that we mainly use the GPS course (it's more accurate). We also use the GPS course to determine the compass error and correct it. This allows us to show accurate headings even when stopped at a light or at very slow parking-lot speeds.
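The speed-dependent blend might be sketched like this (the weights and the 8 kph cutoff are illustrative; real code must also handle the 359°/0° wrap-around, which this naive blend ignores):

```python
def blended_heading(compass_deg, gps_course_deg, speed_kph):
    """Weight GPS course heavily when moving fast, compass when slow."""
    w_gps = 0.9 if speed_kph >= 8 else 0.1
    # Naive linear blend; breaks near the 359/0 degree boundary.
    return w_gps * gps_course_deg + (1 - w_gps) * compass_deg
```
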
There are limits to GPS accuracy.
If an app has access to the underlying map data, and there is an assumption that the location should be on a road, then it could fudge the user's location. But what if I am parked beside the road, or not in a vehicle?
From gps.gov: the GPS signal in space will provide a "worst case" pseudorange accuracy of 7.8 meters at a 95% confidence level.
But it is possible for receivers to do better than this, especially if they are stationary. In the stationary case, location averaging can help substantially, but the fact that the receiver is stationary would have to be known to the device. It is also possible for receivers to do much worse. So, 10 to 20 meters is probably a safe assumption.
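The location-averaging idea is simply the mean of many noisy fixes; for a stationary receiver the noise tends to cancel and the average converges toward the true position (the sample coordinates below are made up):

```python
def average_fix(fixes):
    """Mean of (lat, lon) fixes; valid for a stationary receiver only."""
    lats = [lat for lat, lon in fixes]
    lons = [lon for lat, lon in fixes]
    return sum(lats) / len(lats), sum(lons) / len(lons)

# Noisy fixes scattered around the true position (51.5, -0.12):
fixes = [(51.50001, -0.11998), (51.49999, -0.12002), (51.50000, -0.12000)]
lat, lon = average_fix(fixes)
```
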
