I am making an iOS application that needs to determine whether the person is sitting or standing. I wanted to know if there is any method to detect this automatically. We can get the height above sea level with the help of CLLocationManager, so, in the same way, can we get the height of the iPhone above ground level?
This is not possible for the following reasons:
1. The phone can tell you its height above sea level, but the margin of error is larger than the difference in height between a sitting and a standing person.
2. Even if point 1 did not apply, and you knew the precise height of the ground at your current location and the additional height of the phone above it, this would still be meaningless, as it doesn't take into account buildings, the height of the person, their posture and so forth.
You may have more luck using the motion coprocessor on newer models; you could assume that a standing person moves about more than a sitting person, or something along those lines. Or use accelerometer readings to detect changes of position. But altitude is definitely not the way to go.
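If it helps, here is a minimal sketch of that idea using CMMotionActivityManager (the public interface to the motion coprocessor). Note that it can only distinguish stationary from moving, so "sitting vs. standing still" remains a guess:

```swift
import CoreMotion

// Sketch: use the motion coprocessor via CMMotionActivityManager to tell
// "stationary" apart from "moving". It cannot distinguish sitting from
// standing still; that part remains a heuristic.
let activityManager = CMMotionActivityManager()

func startActivityMonitoring() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.walking || activity.running {
            print("User is moving, so definitely not sitting")
        } else if activity.stationary {
            print("User is stationary: could be sitting or standing still")
        }
    }
}
```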
You cannot find out from altitude whether a person is standing or sitting.
GPS accuracy is far too low: at best it is about 6 m for altitude.
But if you are really clever you could try other approaches:
- Use the acceleration sensor: a standing person might move a bit more than a sitting one, or move differently. [Sorry, I did not see that user jrturton had already suggested the same; but the overlap indicates that this might work.]
- Sitting people often type on a keyboard. You can measure that with the accelerometer, doing frequency analysis with an FFT.
- Walking people: a person who is walking is not sitting. Detect typical walking steps with the accelerometer, or with the step-counting API that is new in iOS 7.
None of these are accurate detections on their own, but they may raise the probability of correctly detecting a sitting person.
If you get that to work, I will have major respect. Post an update if you succeed.
Expect 2.5 to 3.5 months of full-time work to get that working (in some cases).
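If anyone wants to try the walking-detection part, here is a rough sketch using CMPedometer (the modern successor to the iOS 7 step-counting API mentioned above); the "any steps at all" check is a deliberately simple assumption:

```swift
import CoreMotion

// Sketch: if steps are being counted, the user is walking and therefore
// not sitting. CMPedometer is the successor to the iOS 7 step counter.
let pedometer = CMPedometer()

func startStepMonitoring() {
    guard CMPedometer.isStepCountingAvailable() else { return }
    pedometer.startUpdates(from: Date()) { data, error in
        guard let data = data, error == nil else { return }
        if data.numberOfSteps.intValue > 0 {
            print("Steps detected: the user is not sitting")
        }
    }
}
```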
Related
Is it possible to calculate small distances with CoreMotion?
For example, the user moves the iOS device up and down, or left and right, while holding the device in front of them (landscape).
EDIT
Link as promised...
https://www.youtube.com/watch?v=C7JQ7Rpwn2k position stuff starts at about 23 minutes in.
His summary...
The best thing to do is to try and not use position in your app.
There is a video that I will find to show you. But short answer... No. The margin for error is too great and the integration that you have to do (twice) just amplifies this error.
At best you will end up with the device telling you it is slowly moving in one direction all the time.
At worst it could think it's hurtling around the planet.
2020 Update
So, iOS has added the Measure app, which does what the OP wanted. It uses a combination of the accelerometer, gyroscope and magnetometer in the phone, along with ARKit, to get the external reference that I was talking about in this answer.
I'm not 100% certain, but if you wanted to do something like what the OP was asking, you should be able to dig into ARKit and find some APIs in there that do what you want.
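To give an idea of what that looks like (a sketch, not a recommendation): ARKit reports the device's position relative to where the session started, which is the external reference I mentioned:

```swift
import ARKit

// Sketch: read the device's position (in metres) relative to the point
// where the AR session started. Requires a device with ARKit support
// and camera access; drift is still possible over time.
final class PositionTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let translation = frame.camera.transform.columns.3
        print("Device position (m): x=\(translation.x) y=\(translation.y) z=\(translation.z)")
    }
}
```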
After coming across this question, I am concerned that there may not be an answer to it, but I will hold out hope anyway.
I have set up a few geofences (most small, and one large). I am using the simulator, and I have logged the radius of the large CLRegion: it reports a radius of 10881.98 m around a certain coordinate. But when I simulate the geolocation to 11281.86 m away from that same coordinate, the locationManager:didExitRegion: delegate method does not trigger for the large region.
While the large region will not trigger locationManager:didExitRegion:, I have confirmed that the smaller regions trigger the delegate method every time. Is there a reason why this is not firing? Is there a distance buffer around a region? Is it documented somewhere?
Any help would be great.
EDIT: From testing, I need to cut down the radius by around 45.28% in order to have the geofence trigger. Obviously this is not a great solution, as it is very imprecise and it goes against the whole idea of geofencing.
My guess is that this is an issue unique to the simulator. While CLRegion does not technically have a buffer or padding, the OS takes substantially longer to determine that you have physically left the geofence area. On fences of that size, I would imagine it could take even longer. On smaller regions, 100-200 m, I've seen it take several minutes of driving, and easily 300-400 m, before triggering an event. From what an Apple engineer told me at WWDC 2013, the OS takes its time in determining that you left. It is also harder for the system to determine that you left because of its reliance on cell tower triangulation and known Wi-Fi networks. It needs to go well beyond the known networks before it can safely trigger the exit event.
I know it isn't an exact answer, but hopefully you'll understand a bit more how they work under the hood and what Apple's expectation of them is. Good luck.
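For reference, a minimal monitoring setup for a large fence looks like this (the identifier and radius are placeholders). One thing worth double-checking with very large regions is that the requested radius does not exceed maximumRegionMonitoringDistance, because monitoring fails in that case:

```swift
import CoreLocation

// Minimal region-monitoring sketch; the identifier and radius are
// placeholders. Radii above maximumRegionMonitoringDistance cause
// monitoring to fail rather than being clamped.
let locationManager = CLLocationManager()

func monitorLargeRegion(around center: CLLocationCoordinate2D) {
    let requested: CLLocationDistance = 10_881.98
    let radius = min(requested, locationManager.maximumRegionMonitoringDistance)
    let region = CLCircularRegion(center: center, radius: radius, identifier: "largeFence")
    region.notifyOnExit = true
    locationManager.startMonitoring(for: region)
}
```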
I have a requirement mentioned below:
Already have a floor plan map image
First detect current location on floor
Then select the destination location using floor plan map image
Now application should provide direction & distance for that source to destination path
This is like how Google directions work, but it requires an in-house (indoor) map.
For example,
- Current position of user is: At his desk
- Where is Meeting Room #11
- So application should provide direction and distance updates on the map/floor plan image.
Any kind of suggestions/help would be great.
Thanks in advance
Couple of points...
You could create various audio files and play them as waypoints based on routing. Same principle as 'turn right at the next light'.
Definitely want to set your accuracy to kCLLocationAccuracyBest (see the sketch after these points), but this will still probably only get you accuracy of around +/- 10 meters at best.
Do a floor plan overlay using MKOverlayView.
If you are indoors, the iPhone uses cell towers or Wi-Fi for a location fix. This might be a problem for you because, if you are looking to map multiple floors, only GPS can give you altitude readings - ground floor, second floor, etc...
I don't want to pour cold water on your idea but I have not heard of anyone successfully doing an indoor navigation app on an iPhone using standard stuff. If you really wanted to move forward on this project, your best accuracy might be using indoor bluetooth transmitters as navigational beacons...?
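For the accuracy point above, the basic configuration is just this (a sketch; authorization handling and delegate callbacks are omitted):

```swift
import CoreLocation

// Sketch of the kCLLocationAccuracyBest configuration mentioned above.
// Even at best accuracy, expect errors of several metres indoors.
let locationManager = CLLocationManager()

func startIndoorTracking(delegate: CLLocationManagerDelegate) {
    locationManager.delegate = delegate
    locationManager.desiredAccuracy = kCLLocationAccuracyBest
    locationManager.distanceFilter = kCLDistanceFilterNone
    locationManager.requestWhenInUseAuthorization()
    locationManager.startUpdatingLocation()
}
```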
What you want is path planning on the map, is that right? If so, there are lots of algorithms you can use. Choose a block size based on your map and resolution needs, divide the map into blocks, and mark each block as navigable or not. Then, starting from the first block and heading in the direction of the destination block, check whether each neighbouring block is blocked or not, and keep going until you reach the destination block (or determine that it is not reachable). A rough sketch of this is below.
That's a pseudo-implementation; you have several options for doing it, if I understand your needs correctly.
(I don't know your hardware; as said by others, with plain GPS and indoor navigation, assuming a 15 m resolution is a good balance between optimistic and pessimistic signal estimates. If it's for robot navigation, GPS is not a good approach, but the algorithm still is.)
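Here's a rough sketch of that block-based search as a breadth-first search over a boolean grid (all names and the grid layout are assumptions; swap in A* if you need weighted paths):

```swift
// Breadth-first search over a walkable/blocked grid, as described above.
// `walkable[y][x] == true` means the block can be crossed. Returns the
// sequence of blocks from start to goal, or nil if the goal is unreachable.
struct Cell: Hashable {
    let x: Int
    let y: Int
}

func shortestPath(in walkable: [[Bool]], from start: Cell, to goal: Cell) -> [Cell]? {
    var cameFrom: [Cell: Cell] = [:]
    var visited: Set<Cell> = [start]
    var queue: [Cell] = [start]

    while !queue.isEmpty {
        let current = queue.removeFirst()
        if current == goal {
            // Walk back through cameFrom to reconstruct the path.
            var path = [current]
            var node = current
            while let previous = cameFrom[node] {
                path.append(previous)
                node = previous
            }
            return Array(path.reversed())
        }
        for (dx, dy) in [(1, 0), (-1, 0), (0, 1), (0, -1)] {
            let next = Cell(x: current.x + dx, y: current.y + dy)
            guard next.y >= 0, next.y < walkable.count,
                  next.x >= 0, next.x < walkable[next.y].count,
                  walkable[next.y][next.x],
                  !visited.contains(next) else { continue }
            visited.insert(next)
            cameFrom[next] = current
            queue.append(next)
        }
    }
    return nil
}
```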
I want to know the accuracy and the distance filter of the low-power significant-change location service (i.e. if I use startMonitoringSignificantLocationChanges, how accurate is it, and what distance counts as a significant change)?
I need some experimental (not just documentation) info from real-world apps.
I had a chance to speak with the Apple Location Engineers at WWDC this past year and this is how it was explained to me.
The significant location change is the least accurate of all the location monitoring types. It only gets its updates when there is a cell tower transition or change. This means a varying level of accuracy and update frequency based on where the user is: in a city area, more towers mean more updates; out of town or on the interstate, fewer towers mean fewer changes.
This is also the hardest location type to test, since you can't use the simulator either. I'm not sure if they have fixed it to work with GPX files for 6.0, but the significant location change API did not work at all in the simulator prior to iOS 6.
I have tried to avoid using the significant location change for many of these reasons. Sometimes it can't be helped. I ended up using the region monitoring APIs, as they are far more accurate and just as good on battery life. Hope this helps.
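For completeness, both services are started from the same CLLocationManager; a minimal sketch (the region identifier and radius are placeholders):

```swift
import CoreLocation

// Sketch: starting the significant-change service versus monitoring a
// circular region with the same CLLocationManager.
let locationManager = CLLocationManager()

func startSignificantChangeUpdates() {
    // Low power, driven by cell-tower changes; accuracy can be hundreds
    // of metres to a few kilometres, as discussed above.
    locationManager.startMonitoringSignificantLocationChanges()
}

func monitorRegion(around center: CLLocationCoordinate2D) {
    // Region monitoring is generally more accurate and similar on battery.
    // The identifier and radius are placeholders.
    let region = CLCircularRegion(center: center, radius: 500, identifier: "myRegion")
    locationManager.startMonitoring(for: region)
}
```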
From the Apple documentation:
This interface delivers new events only when it detects changes to the device's associated cell towers, resulting in less frequent updates and significantly lower power usage.
There doesn't appear to be much more specific information available about the exact accuracy, so I would assume you have accuracy roughly equivalent to the approximate distance between cell towers in the area that the iOS device is currently located in (which is shorter in more highly populated areas).
I had to build an app a while back that uses cell-tower significant location changes.
Short answer: very inaccurate.
I was clearly crossing the boundaries of my region.
From what we observed in our app, it can be a few hundred metres to a few kilometres off. Our testing was in the city area, with cell towers in suburbs parallel to the train tracks and other suburban cell towers.
Pretty rough.
It was consistent most of the time. I noticed that every time I was about to go into the tunnel to the underground train station, it would fire off the three region-crossing notifications that I had set up for the CBD city area.
I'm using Xcode 4.6.2, and you can indeed simulate significant location change on this simulator.
In the iOS Simulator, the menu entries you need are Debug->Location->Freeway Drive.
Caveats (I welcome being told I'm wrong):
1. After a long while, there seem to be no more significant location change events.
2. You can only drive a pre-defined route in the general Cupertino/SF area. If all you care about is significant location change, that's fine.
Be careful: although you can access the speed property of the location received from a significant location update, it's useless! The simulator actually provides the speed, but on real devices the speed is not available, because a location derived from cell towers does not include the actual speed (unlike GPS). More than that, as said before, the location itself is very inaccurate; it can be a few km off.
Be aware of that.
The only way to get the speed is to keep two CLLocation fixes and compute the speed manually, as in the sketch below.
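Something along these lines (a sketch; it assumes the two fixes carry sensible timestamps):

```swift
import CoreLocation

// Sketch: derive speed manually from two fixes, since `speed` is not
// populated for cell-tower based locations.
func speed(from older: CLLocation, to newer: CLLocation) -> CLLocationSpeed? {
    let interval = newer.timestamp.timeIntervalSince(older.timestamp)
    guard interval > 0 else { return nil }
    let distance = newer.distance(from: older)   // metres
    return distance / interval                   // metres per second
}
```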
I am developing an app that uses the user's location to be displayed on a map with other users.
I want to ensure that all users have a bit of privacy when it comes to their location being displayed openly to other users, so I am hoping to set their location with a specified offset (let's say 1 mile) and display the "edited" location to all other users, while still showing the "exact" location to the current user.
Example - If I am looking at the map, I want my "user location" (the blue dot) to be fairly exact, while all other users will see my location slightly offset from the real location.
What is the best way to achieve this?
I think the question you actually want the answer to is this:
How do I convert the user's location into an "approximate location" in a way that preserves the user's privacy?
It's not an easy problem:
Offsetting by a specific distance doesn't work:
There's a trivial attack if the direction is fixed.
If the direction does not change often enough, then the attacker only needs to wait to identify what looks like a road.
If the direction changes too often, then they'll tend to form a 1-mile circle around the target's house/work.
Offsetting by a random distance/direction doesn't work; the attacker just needs to collect enough samples; the clusters will likely be centered on the target's home/work.
Quantizing to a grid naively (e.g. "X is within this grid square") will tell you when the target crosses a grid boundary. This is especially bad if the target lives on a grid boundary.
Here's something that works a little better, but will still (eventually) give away the user's location:
Pick an (approximately) 1-mile grid. For a "square" grid, you could use the Peirce quincuncial projection (there are four points of infinite distortion, but you can place those all at sea; it looks like you can limit distortion on land to a factor of 2). There are also projections onto a cube and, for a triangular grid, an icosahedron.
When you first need to report the user's location, give the nearest point on the grid. Also pick a threshold distance between 1 and 2 grid "squares", or so.
While the user is within the threshold distance of the center of the grid square, continue to report the same grid square. Otherwise, repeat.
It'll still eventually be obvious if the user happens to live on a grid boundary. There are various ways to attempt to fix this problem (e.g. a bias to reporting grid squares you've reported before), but these will eventually fail.
This seems a lot like trying to remove a digital watermark (the user's actual location) by using lossy compression (the approximation process) while producing an output image/audio (approximate location) that sounds/looks like the original. (The analogy works a little better if you treat the "watermark" as the user's daily habits, which will be visible in the output unless you know exactly what those habits are and can remove them.)
Or in signal processing terms: A low SNR simply means you have to listen for longer to extract the signal.
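If you still want to try it, here is a minimal sketch of the grid idea, using a plain latitude/longitude grid instead of the projections above (the cell size and threshold are arbitrary assumptions):

```swift
import CoreLocation

// Sketch of the grid-snapping idea above: quantise to a coarse grid and
// keep reporting the previous cell until the user is clearly beyond a
// threshold (hysteresis), so hovering on a boundary leaks less.
// A plain lat/long grid is used here instead of the projections above.
struct ApproximateLocator {
    let cellSize: CLLocationDegrees = 0.015      // roughly 1 mile of latitude
    private var lastReported: CLLocationCoordinate2D? = nil

    mutating func approximate(_ actual: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
        // Nearest point on the grid to the actual position.
        let snapped = CLLocationCoordinate2D(
            latitude: (actual.latitude / cellSize).rounded() * cellSize,
            longitude: (actual.longitude / cellSize).rounded() * cellSize)

        if let last = lastReported {
            let threshold = 1.5 * cellSize * 111_000   // rough metres per degree
            let distance = CLLocation(latitude: actual.latitude, longitude: actual.longitude)
                .distance(from: CLLocation(latitude: last.latitude, longitude: last.longitude))
            if distance < threshold { return last }    // stay on the previous grid point
        }
        lastReported = snapped
        return snapped
    }
}
```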
Are you showing everyone else as a pin? It might be strange to show a pin at an exact location when the other user isn't actually there. For example, if someone was a mile north and you showed their pin at the same location as the current user. Maybe you should display the other users with an MKOverlay circle, and then use some calculation based on a userID to shift it slightly off centre, so that people can't work out that it is always shifted 500 m east and thereby easily see where other users really are.
Whether or not you change the display, the code you seek is here: Get the GPS coordinate given the current location, bearing and distance
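For reference, the technique in that link boils down to the spherical destination-point formula; here is a sketch (the Earth radius constant and the idea of deriving a per-user bearing from a stable ID are assumptions):

```swift
import CoreLocation

// Sketch: offset a coordinate by a distance (metres) along a bearing
// (degrees), assuming a spherical Earth. A bearing derived from a stable
// per-user ID keeps the offset consistent between sessions.
func coordinate(from start: CLLocationCoordinate2D,
                distanceMeters: Double,
                bearingDegrees: Double) -> CLLocationCoordinate2D {
    let earthRadius = 6_371_000.0
    let angular = distanceMeters / earthRadius
    let bearing = bearingDegrees * .pi / 180
    let lat1 = start.latitude * .pi / 180
    let lon1 = start.longitude * .pi / 180

    let lat2 = asin(sin(lat1) * cos(angular) + cos(lat1) * sin(angular) * cos(bearing))
    let lon2 = lon1 + atan2(sin(bearing) * sin(angular) * cos(lat1),
                            cos(angular) - sin(lat1) * sin(lat2))
    return CLLocationCoordinate2D(latitude: lat2 * 180 / .pi,
                                  longitude: lon2 * 180 / .pi)
}
```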