CLRegion hidden buffer? - ios

After coming across this question, I am concerned that there may not be an answer, but I will hope anyway.
I have set up a few geofences (most small and one large). I am using the simulator, and I have logged the radius of the large CLRegion: it reports a radius of 10881.98m around a certain coordinate. But when I simulate the geolocation to a point 11281.86m away from that same coordinate, it does not trigger the locationManager:didExitRegion: delegate method for the large region.
While the large region will not trigger locationManager:didExitRegion:, I have confirmed that the smaller regions will trigger the delegate method every time. Is there a reason why this is not firing? Is there a distance buffer around a region? Is it documented somewhere?
Any help would be great.
EDIT: From testing, I need to cut down the radius by around 45.28% in order to have the geofence trigger. Obviously this is not a great solution, as it is very imprecise and it goes against the whole idea of geofencing.
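For reference, a minimal Swift sketch of the kind of setup described above (the coordinate and identifier here are made up; the radius is clamped to CLLocationManager's maximumRegionMonitoringDistance, since registering a larger radius causes monitoring to fail):

```swift
import CoreLocation

let manager = CLLocationManager()

// Hypothetical centre and radius, mirroring the large fence in the question.
let center = CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03)
var radius: CLLocationDistance = 10_881.98

// Radii above the system maximum cause monitoringDidFail(for:withError:)
// to be called, so clamp before registering the region.
radius = min(radius, manager.maximumRegionMonitoringDistance)

let region = CLCircularRegion(center: center, radius: radius, identifier: "largeFence")
region.notifyOnExit = true
manager.startMonitoring(for: region)
```

The delegate still needs to implement locationManager(_:didExitRegion:) to receive the exit event.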

My guess is that this is an issue unique to the simulator. While CLRegion does not technically have a buffer or padding, the OS takes substantially longer to determine that you have physically left the geofence area. On fences of that size, I would imagine it could take even longer. On smaller regions, 100-200 m, I've seen it take several minutes of driving, and easily 300-400 m beyond the boundary, before triggering an event. From what the Apple engineer told me at WWDC 2013, the OS takes its time in determining that you left. It is also harder for the system to determine that you left because of its reliance on cell tower triangulation and known WiFi networks. It needs to go well beyond the known networks before it can safely trigger the exit event.
I know it isn't an exact answer, but hopefully you'll understand a bit more about how they work under the hood and what Apple's expectation of them is. Good luck.

Related

Filtering GPS jitter when standing still?

I am working on an iOS app with tracking. I have implemented Kalman smoothing in order to present a pleasing path. This is working pretty well at this point.
I am having a bit of trouble dealing with the user-not-moving case, though. When the user IS moving, we get very good reads back from the CLLocationManager. And even when a reading is a bit off, the Kalman algorithm takes care of it.
When standing still, the CLLocationManager delegate is still receiving "accurate" locations. They have good accuracy and no unbelievable speed. Looking at the screen with human eyes, it's clear that the user is standing still, with all these points just scattered around: some very close and a few of them far out.
I have tried setting the CLLocationManager property pausesLocationUpdatesAutomatically, but it doesn't seem to be working that well. It doesn't always stop when it should, and there has been difficulty restarting the tracking again once the antennas are powered down.
So I'm looking to keep the tracking on the whole time, but I want to filter out the jitter in post-processing. I would determine programmatically that the user is stopped and discard (or ignore) all locations until the user is moving again.
I'm not really sure how to go about this, what algorithm is appropriate to achieve something like this?
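One simple approach (not from the original thread, just an illustrative Swift sketch): keep an "anchor" location and throw away new fixes that stay within the combined horizontal accuracy of the anchor, treating the user as stopped until a fix clearly escapes that radius. The threshold choice is a guess and would need tuning against real traces.

```swift
import CoreLocation

/// Naive stationary filter: drop fixes that stay inside the jitter radius
/// of the last accepted ("anchor") location.
final class JitterFilter {
    private var anchor: CLLocation?

    func accept(_ location: CLLocation) -> Bool {
        guard let anchor = anchor else {
            self.anchor = location
            return true
        }
        let distance = location.distance(from: anchor)
        // Use the worse of the two accuracy values as the noise radius.
        let threshold = max(anchor.horizontalAccuracy, location.horizontalAccuracy)
        if distance <= threshold {
            return false          // still inside the jitter radius: ignore
        }
        self.anchor = location    // moved beyond the noise: accept and re-anchor
        return true
    }
}
```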

Is there a way to calculate small distances with CoreMotion?

Is it possible to calculate small distances with CoreMotion?
For example, a user moves their iOS device up or down, or left and right, while holding the device in front of them (landscape).
EDIT
Link as promised...
https://www.youtube.com/watch?v=C7JQ7Rpwn2k position stuff starts at about 23 minutes in.
His summary...
The best thing to do is to try and not use position in your app.
There is a video that I will find to show you. But short answer... No. The margin for error is too great and the integration that you have to do (twice) just amplifies this error.
At best you will end up with the device telling you it is slowly moving in one direction all the time.
At worst it could think it's hurtling around the planet.
2020 Update
So, iOS has added the Measure app, which does what the OP wanted. It uses a combination of the accelerometer, gyroscope and magnetometer in the phone, along with ARKit, to get the external reference that I was talking about in this answer.
I'm not 100% certain, but if you wanted to do something like the OP was asking, you should be able to dig into ARKit and find some APIs in there that do what you want.
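As an illustration of that suggestion (a hedged sketch, not part of the original answer): ARKit's world tracking exposes the camera transform on every frame, from which you can read the device's position in metres relative to the session origin. Because it is visual-inertial odometry rather than double integration of acceleration, it avoids the drift described above.

```swift
import ARKit

final class PositionTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Camera transform relative to the session's world origin, in metres.
        let t = frame.camera.transform.columns.3
        let position = SIMD3<Float>(t.x, t.y, t.z)
        print("Device position:", position)
    }
}
```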

Determining the height of iPhone from ground level

I am making an iOS application that needs to determine whether the person is sitting or standing. I wanted to know if there is any method to detect automatically that the person is sitting or standing. We can get the height above sea level with the help of CLLocationManager, so, in the same way, can we get the height of the iPhone above ground level somehow?
This is not possible for the following reasons:
1. The phone can tell you its height above sea level, the accuracy of which has a larger margin of error than the difference between a sitting and a standing person.
2. Even if 1. did not apply, and you knew the precise height of the ground at your current location and the additional height of the phone, this would still be meaningless, as it doesn't take into account buildings, the height of the person, their posture and so forth.
You may have more luck using the motion coprocessor on newer models, you could assume that a standing person moves about more than a sitting person, or something. Or accelerometer readings to detect changes of position. But altitude is definitely not the way to go.
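To illustrate the motion-coprocessor idea (a sketch I'm adding, not part of the original answer): CoreMotion's CMMotionActivityManager classifies the user as stationary, walking, running and so on, which is much closer to what is being asked than altitude.

```swift
import CoreMotion

// Sketch only: infer activity from the motion coprocessor.
let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.walking {
            print("Probably standing / moving about")
        } else if activity.stationary {
            print("Stationary - could be sitting or standing still")
        }
    }
}
```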
You cannot find out from altitude whether a person is standing or sitting.
The accuracy of GPS is much too low, at best about 6 m for altitude.
But if you are really clever you could try other approaches:
- Use the acceleration sensor: a standing person might move a bit more than a sitting one, or move differently. [Sorry, I did not see that user jrturton has written the same, but this indicates that it might work.]
- Sitting persons often type on the keyboard. You can measure that with the accelerometer, by frequency analysis after doing an FFT.
- Walking persons: a person who walks does not sit. Detect typical walking steps, with the accelerometer or even with an iOS API that is new in iOS 7 (I remember there is a step counter); see the sketch after this answer.
None of these are accurate detections, but they may raise the probability of detecting a sitting person.
If you get that to work, I will have major respect. Post an update if you succeed.
Expect 2.5 to 3.5 full-time working months to get that to work (in some cases).
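For the step-counter suggestion above, here is a rough sketch using CMPedometer (the successor to the iOS 7 CMStepCounter); an ongoing stream of steps strongly suggests the user is not sitting. This is my illustration, not code from the answer.

```swift
import CoreMotion

let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    // Stream step counts from now on; steady increases imply walking.
    pedometer.startUpdates(from: Date()) { data, error in
        guard let data = data, error == nil else { return }
        print("Steps since start: \(data.numberOfSteps)")
    }
}
```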

iOS location significant change accuracy and distance

I want to know the accuracy and the distance filter of the low-power significant-change location service (i.e. if I use startMonitoringSignificantLocationChanges, how accurate is it, and what distance counts as a significant change)?
I need some experimental (non-documentation) info from real apps.
I had a chance to speak with the Apple Location Engineers at WWDC this past year and this is how it was explained to me.
The significant location change is the least accurate of all the location monitoring types. It only gets its updates when there is a cell tower transition or change. This can mean a varying level of accuracy and updates based on where the user is. City area, more updates with more towers. Out of town, interstate, fewer towers and changes.
This is also the hardest location type to test, since you can't use the simulator either. I'm not sure if they have fixed it to work with GPX files in 6.0, but the significant location change API did not work at all in the simulator prior to iOS 6.
I have tried to avoid using the significant location change for many of these reasons. Sometimes it can't be helped. I ended up using the region monitoring APIs, as they are far more accurate and just as good on battery life. Hope this helps.
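For completeness, a minimal Swift sketch of the significant-change setup being discussed (my illustration; expect the delivered locations to be very coarse, per the answers here):

```swift
import CoreLocation

final class SignificantChangeMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()   // background monitoring requires "Always"
        manager.startMonitoringSignificantLocationChanges()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // horizontalAccuracy here is typically very coarse (cell-tower based).
        print(locations.last?.horizontalAccuracy ?? -1)
    }
}
```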
From the Apple documentation:
This interface delivers new events only when it detects changes to the device’s associated cell towers, resulting in less frequent updates and significantly lower power usage.
There doesn't appear to be much more specific information available about the exact accuracy, so I would assume you have accuracy roughly equivalent to the approximate distance between cell towers in the area that the iOS device is currently located in (which is shorter in more highly populated areas).
I had to build an app back then that uses cell tower significant location changes.
Short answer: very inaccurate.
I was clearly crossing the boundaries of my region.
From what we observe in our app, it can be a few hundred metres to a few kilometres off. Our testing was in the city area, with cell towers in suburbs parallel to the train tracks and other suburban cell towers.
Pretty rough.
It was consistent most of the time. I noticed that every time I was about to go into the tunnel to the underground train station, it would fire off the 3 region-crossing notifications that I had set up for the CBD city area.
I'm using Xcode 4.6.2, and you can indeed simulate significant location change on this simulator.
In the iOS Simulator, the menu entries you need are Debug->Location->Freeway Drive.
Caveats (I welcome being told I'm wrong):
1. After a long while, there seem to be no more significant location change events.
2. You can only drive a pre-defined route in the general Cupertino/SF area. If all you care about is significant location change, that's fine.
Be careful: although you can access the speed property of the location received from the significant location update, it's useless! The simulator actually gives the speed, but on real devices the speed is not available, because a location derived from cell towers does not include the actual speed (unlike GPS). More than that, as said before, the location itself is very inaccurate; it can be a few km off.
Be aware of that.
The only way to get the speed is to take two CLLocations and compute it manually, as in the sketch below.
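A minimal Swift sketch of that manual computation (my illustration): distance between the two fixes divided by the time between their timestamps. With cell-tower errors of hundreds of metres, the result is only a rough figure.

```swift
import CoreLocation

func estimatedSpeed(from previous: CLLocation, to current: CLLocation) -> CLLocationSpeed {
    let distance = current.distance(from: previous)                        // metres
    let elapsed = current.timestamp.timeIntervalSince(previous.timestamp)  // seconds
    guard elapsed > 0 else { return 0 }
    return distance / elapsed                                              // m/s
}
```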

iOS Dev: Map Offset in China

I made a very simple app in which I can drop a pin right onto the location I am standing at (just a beginner's exercise). But I found a problem.
I swear I was not moving, nor did the device think I was moving. I directly use the geolocation to set the pin, but the pin and the current-location blue dot are hundreds of meters apart.
(By the way, the blue dot showed my real location at the time.)
This is a famous problem with Google Maps on iOS in China. Putting aside the complicated issue of so-called national security, where I want help is: what should we do as developers? Technically, is there a way, in code, to figure out exactly what the offset is and correct it?
Does anyone have any idea?
At what time did you place the pin? iOS has up to three sources of location data (cell tower triangulation, Wi-Fi sniffing and GPS) and will keep you up to date with the most accurate. So often you get a not-very-accurate location, then a more accurate location, then an even more accurate one.
If you have an MKMapView open, then something you can do is key-value observe its userLocation property rather than starting any sort of CLLocationManager. That way you'll always be updated with whatever the map view has decided is the current location, meaning that you don't need to try to match your logic to its.
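As a sketch of that idea (my illustration, using the MKMapViewDelegate callback as the delegate-based equivalent of observing userLocation): let the map view hand you whatever it currently believes the user location is, and place the pin from that.

```swift
import MapKit

final class MapController: NSObject, MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, didUpdate userLocation: MKUserLocation) {
        guard let coordinate = userLocation.location?.coordinate else { return }
        // Drop the pin at the map view's own idea of the current position,
        // so the pin and the blue dot come from the same source.
        let pin = MKPointAnnotation()
        pin.coordinate = coordinate
        mapView.addAnnotation(pin)
    }
}
```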
I did some research on the offset, but haven't gotten a satisfying result yet. The added offset is deterministic, i.e. given a location, the deviated location is fixed. So my goal is to get the deviation function, f(p)=p', where both p and p' are 2D points. You can check here if you are interested.
