I want to calculate the total distance covered by a person using Core Location. When the user presses a start button, the app shows the calculated distance on the iPhone screen and keeps updating it at regular intervals. The problem is that even when the person hasn't moved at all, it still shows some distance. Can anyone help me?
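A common cause of this phantom distance is plain GPS jitter: while you stand still, successive fixes scatter inside the accuracy radius, and naively summing them accumulates distance. Below is a minimal sketch of the usual mitigation, assuming the standard CLLocationManager setup (Info.plist usage keys, authorization request) is done elsewhere; the 20 m accuracy cutoff and 10 m distance filter are assumptions you would tune.

import CoreLocation

final class DistanceTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var lastLocation: CLLocation?
    private(set) var totalDistance: CLLocationDistance = 0

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = 10  // ignore movements smaller than 10 m (an assumption)
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // Discard stale or inaccurate fixes; jitter while standing still
            // typically shows up as low-accuracy readings.
            guard location.horizontalAccuracy > 0,
                  location.horizontalAccuracy < 20,
                  abs(location.timestamp.timeIntervalSinceNow) < 5 else { continue }

            if let last = lastLocation {
                let delta = location.distance(from: last)
                // Only count movement larger than the reported accuracy radius,
                // so noise while stationary does not accumulate.
                if delta > location.horizontalAccuracy {
                    totalDistance += delta
                }
            }
            lastLocation = location
        }
    }
}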
I just want some hints on how to create an iOS app that does the following.
When the user is at point X, they tap a start button; the app starts a timer and tracks their movement. The user will be on a horse and needs to ride in a full circle. When the user comes back to point X, the app should draw the route taken on the horse.
The aim is to ride in as complete a circle as possible. I want to make this app to practice and see how close to a circle I ride.
I looked at the GPS locator, but I am not sure whether it will give me good enough results, because the circle I ride can be as small as 60 m or less in radius.
I don't know if iOS GPS can be that accurate. I read an article on motion sensors and how to track rotation and acceleration, but I am not quite sure how to use that to my advantage.
I just need some tips, like which API to use, etc.
Using the Standard Positioning Service one can achieve 15-meter horizontal accuracy 95% of the time. This means that 95% of the time, the coordinates you read from your GPS receiver display will be within 15 meters of your true position on the earth.
More information: click here
To integrate a map and draw a path from the user's current position, Google Maps is a good option on iOS.
For small ranges where you need accurate results, use an indoor positioning system (IPS).
For more information about indoor positioning systems, see http://developer.estimote.com/
and the GitHub iOS demo: get code
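Whichever map provider you pick, drawing the route itself is a small amount of code. Here is a sketch using Apple's MapKit (Google Maps has an equivalent GMSPolyline), assuming you are already collecting coordinates as the user moves:

import MapKit

// Append each new fix to a coordinate list and redraw the overlay.
func drawRoute(on mapView: MKMapView, coordinates: [CLLocationCoordinate2D]) {
    mapView.removeOverlays(mapView.overlays)
    let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
    mapView.addOverlay(polyline)
}

// In the map view's delegate, return a renderer for the polyline:
func mapView(_ mapView: MKMapView,
             rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    guard let polyline = overlay as? MKPolyline else {
        return MKOverlayRenderer(overlay: overlay)
    }
    let renderer = MKPolylineRenderer(polyline: polyline)
    renderer.strokeColor = .systemBlue
    renderer.lineWidth = 3
    return renderer
}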
Is it possible to detect user movement via the gyroscope? In general I need to calculate the user's movement without GPS. I don't need the user's coordinates, just something like this: 100 meters straight, turn left, 50 meters straight, etc.
A gyroscope measures changes in spatial orientation, so it can't tell you how far the user travelled. It can't even reliably tell you that the user has turned right, since if the user also rotates the device left, you'll learn nothing.
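To make that concrete, here is a minimal Core Motion sketch; everything the gyroscope hands you is angular, so there is no displacement to sum:

import CoreMotion

let motionManager = CMMotionManager()

// The gyroscope reports angular velocity (rad/s) around the device's own
// axes. There is no distance anywhere in this data.
if motionManager.isGyroAvailable {
    motionManager.gyroUpdateInterval = 0.1
    motionManager.startGyroUpdates(to: .main) { data, error in
        guard let rate = data?.rotationRate else { return }
        print("rotation rate: x=\(rate.x) y=\(rate.y) z=\(rate.z) rad/s")
    }
}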
I downloaded the IndoorAtlas iPhone SDK and generated path maps and test paths for my venue. The SDK navigates me perfectly while I am moving from one place to another, but when I stop moving it produces scattered output with a position radius of 10 to 25. I am expecting precise coordinates in both of the above cases in my project.
Is there any way to get more precision?
IndoorAtlas technology is using the history of magnetic field observations for computing the precise location. This means that the device needs to move some distance in order to collect enough data to converge to a correct location estimate, i.e., to have a location fix. We are constantly improving our service to decrease the time needed for the first location fix.
If you experience your position moving after you've already stopped walking, please contact support@indooratlas.com with details of your application and the venue where this is experienced and we'll look into it. Thanks!
I am making an iOS application which needs to determine whether the person is sitting or standing. I wanted to know whether there is any way to detect this automatically. Since we can get the height above sea level with the help of CLLocationManager, can we similarly get the height of the iPhone above ground level in any way?
This is not possible, for the following reasons:
1. The phone can tell you its height above sea level, but the margin of error is larger than the difference between a sitting and a standing person.
2. Even if 1 did not apply, and you knew the precise height of the ground at your current location and the additional height of the phone, this would still be meaningless, as it doesn't take into account buildings, the height of the person, their posture and so forth.
You may have more luck using the motion coprocessor on newer models: you could assume that a standing person moves about more than a sitting one, or use accelerometer readings to detect changes of position. But altitude is definitely not the way to go.
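A sketch of that motion-coprocessor idea using CMMotionActivityManager (requires an M7 chip or later); note that it classifies gross activity, so "stationary" covers both sitting and standing still:

import CoreMotion

let activityManager = CMMotionActivityManager()

// The M-series coprocessor classifies gross activity (stationary, walking,
// running, ...). "Stationary" is only a weak proxy for "sitting": a person
// standing still is also stationary.
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.stationary {
            print("not moving (sitting or standing still)")
        } else if activity.walking || activity.running {
            print("moving, so presumably not sitting")
        }
    }
}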
You cannot find out from altitude whether a person is standing or sitting.
The accuracy of GPS is much too low, at best around 6 m for altitude.
But if you are really clever you could try other approaches:
- Use the acceleration sensor: a standing person might move a bit more than a sitting one, or move differently. (Sorry, I did not see that user jrturton has written the same, but this indicates that it might work.)
- Sitting persons often type on a keyboard. You can measure that with the accelerometer, by frequency analysis after doing an FFT.
- Walking persons: a person who walks does not sit. Detect typical walking steps with the accelerometer, or even with an iOS API that is new in iOS 7 (I remember there is a step counter); see the sketch after this list.
None of these is an accurate detection on its own, but each may raise the probability of detecting a sitting person.
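For the walking-steps item above, a sketch using CMPedometer, the modern successor to the iOS 7 step counter (CMStepCounter). Fresh steps arriving implies the user is not sitting; no steps proves nothing on its own:

import CoreMotion

let pedometer = CMPedometer()

// Steps arriving in near-real time imply the user is walking, hence not
// sitting. The absence of steps is inconclusive.
if CMPedometer.isStepCountingAvailable() {
    pedometer.startUpdates(from: Date()) { data, error in
        guard let data = data else { return }
        print("steps since start: \(data.numberOfSteps)")
    }
}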
If you get that to work, I will have major respect. Post an update if you succeed.
Expect 2.5 to 3.5 full-time working months to get that to work (in some cases).
I am developing an app that displays the user's location on a map along with other users.
I want to ensure that all users have a bit of privacy when it comes to their location being displayed openly to other users, so I am hoping to set their location with a specified offset (let's say 1 mile) and display the "edited" location to all other users, while still showing the "exact" location to the current user.
Example: if I am looking at the map, I want my own "user location" (the blue dot) to be fairly exact, while all other players see my location slightly offset from the real one.
What is the best way to achieve this?
I think the question you actually want the answer to is this:
How do I convert the user's location into an "approximate location" in a way that preserves the user's privacy?
It's not an easy problem:
Offsetting by a specific distance doesn't work:
- There's a trivial attack if the direction is fixed.
- If the direction does not change often enough, the attacker only needs to wait to identify what looks like a road.
- If the direction changes too often, the reports will tend to form a 1-mile circle around the target's house/work.
Offsetting by a random distance/direction doesn't work either: the attacker just needs to collect enough samples, and the clusters will likely be centered on the target's home/work (a quick simulation of this attack follows this list).
Quantizing to a grid naively (e.g. "X is within this grid square") will tell you when the target crosses a grid boundary. This is especially bad if the target lives on a grid boundary.
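To see concretely why the random-offset scheme fails, here is a small simulation of the averaging attack; the coordinates are a hypothetical home location:

import Foundation

// Each report is the true position plus a random direction and distance,
// yet the mean of many reports converges on the true position.
let trueLat = 51.5074, trueLon = -0.1278   // hypothetical home location
let offsetDegrees = 0.0145                 // roughly 1 mile of latitude

var sumLat = 0.0, sumLon = 0.0
let samples = 10_000
for _ in 0..<samples {
    let angle = Double.random(in: 0..<(2 * Double.pi))
    let distance = Double.random(in: 0...offsetDegrees)
    sumLat += trueLat + distance * cos(angle)
    sumLon += trueLon + distance * sin(angle)
}
print("estimated: \(sumLat / Double(samples)), \(sumLon / Double(samples))")
// Prints a value very close to (51.5074, -0.1278).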
Here's something that works a little better, but will still (eventually) give away the user's location:
Pick an (approximately) 1-mile grid. For a "square" grid you could use the Peirce quincuncial projection (there are four points of infinite distortion, but you can put those all at sea; it looks like you can limit distortion on land to a factor of 2). There are also projections onto a cube and, for a triangular grid, an icosahedron.
When you first need to report the user's location, give the nearest point on the grid. Also pick a threshold distance of between 1 and 2 grid "squares", or so.
While the user is within the threshold distance of the center of that grid square, continue to report the same grid square. Otherwise, repeat.
It'll still eventually be obvious if the user happens to live on a grid boundary. There are various ways to attempt to fix this problem (e.g. a bias to reporting grid squares you've reported before), but these will eventually fail.
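A sketch of the scheme above, simplified onto a plain latitude/longitude grid rather than the Peirce quincuncial projection (acceptable away from the poles, where such a grid's distortion is modest); the grid size and threshold factor here are assumptions:

import CoreLocation

final class ApproximateLocationReporter {
    private let gridSize = 0.0145            // ~1 mile of latitude, an assumption
    private let thresholdFactor = 1.5        // between 1 and 2 grid "squares"
    private var reported: CLLocationCoordinate2D?

    func approximate(_ actual: CLLocationCoordinate2D) -> CLLocationCoordinate2D {
        // Keep reporting the old square while still near its center.
        if let current = reported {
            let dLat = abs(actual.latitude - current.latitude)
            let dLon = abs(actual.longitude - current.longitude)
            if max(dLat, dLon) < thresholdFactor * gridSize {
                return current
            }
        }
        // Otherwise snap to the nearest grid point and remember it.
        let snapped = CLLocationCoordinate2D(
            latitude: (actual.latitude / gridSize).rounded() * gridSize,
            longitude: (actual.longitude / gridSize).rounded() * gridSize)
        reported = snapped
        return snapped
    }
}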
This seems a lot like trying to remove a digital watermark (the user's actual location) by using lossy compression (the approximation process) while producing an output image/audio (approximate location) that sounds/looks like the original. (The analogy works a little better if you treat the "watermark" as the user's daily habits, which will be visible in the output unless you know exactly what those habits are and can remove them.)
Or in signal processing terms: A low SNR simply means you have to listen for longer to extract the signal.
Are you showing everyone else as a pin? It might be strange to show a pin at an exact location when the other user isn't actually there, for example if someone a mile north were shown at the same location as the current user. Maybe you should display the other users with an MKOverlay circle instead, and use some calculation based on a userID to shift it slightly off centre, so that people can't work out that it is always shifted, say, 500 m east and thus easily see where people really are.
Whether or not you change the display, the code you seek is here: Get the GPS coordinate given the current location, bearing and distance
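For reference, the standard great-circle "destination point" formula that the linked answer is built on, as a self-contained sketch:

import Foundation
import CoreLocation

// Where you end up after travelling `distanceMeters` from `start` on a
// constant initial bearing (degrees clockwise from north).
func coordinate(from start: CLLocationCoordinate2D,
                distanceMeters: Double,
                bearingDegrees: Double) -> CLLocationCoordinate2D {
    let earthRadius = 6_371_000.0                 // mean Earth radius in metres
    let angular = distanceMeters / earthRadius    // angular distance
    let bearing = bearingDegrees * .pi / 180
    let lat1 = start.latitude * .pi / 180
    let lon1 = start.longitude * .pi / 180

    let lat2 = asin(sin(lat1) * cos(angular) +
                    cos(lat1) * sin(angular) * cos(bearing))
    let lon2 = lon1 + atan2(sin(bearing) * sin(angular) * cos(lat1),
                            cos(angular) - sin(lat1) * sin(lat2))

    return CLLocationCoordinate2D(latitude: lat2 * 180 / .pi,
                                  longitude: lon2 * 180 / .pi)
}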