I'm trying to build a web app specifically for iOS that relies on accelerometer data for navigation (a site that you could, in theory, move through spatially, from page to page).
For example, taking a step (or moving the device) forward would take you to one web page or URL, and moving left, right, or backwards would each take you to their own unique URLs. Any ideas on how to make this happen?
I spent quite a lot of time looking into this for an experiment I was writing.
The trouble is that it is very hard, verging on impossible, to get an accurate (or even approximate) location in space from your device.
The device measures acceleration, which has to be integrated twice to get back to a position.
You can use that to estimate position relative to a starting point, but the next problem is the noise in the accelerometer readings.
The closest I got to making it work was to smooth out the accelerometer noise and then integrate back to estimate position. But because of the noise, I found that the device would constantly think it was moving in one direction.
After a number of days of trying different methods I determined that it really isn't possible without external tracking of the device.
What you can do is use the orientation of the device to navigate, i.e. tilt forwards, tilt backwards, tilt left, tilt right to do different things.
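In a native iOS app that would be Core Motion; a web app would use the browser's equivalent DeviceOrientation events. A minimal native sketch of the tilt idea, where the ~20° threshold and the pitch/roll-to-direction mapping are assumptions you would tune:

```swift
import CoreMotion

// Tilt-navigation sketch: map device attitude (pitch/roll) to four "directions".
let motionManager = CMMotionManager()

func startTiltNavigation() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let threshold = 0.35            // radians, roughly 20 degrees
        if attitude.pitch > threshold {
            // tilted towards you: e.g. navigate "backwards"
        } else if attitude.pitch < -threshold {
            // tilted away from you: navigate "forwards"
        } else if attitude.roll > threshold {
            // tilted right: navigate "right"
        } else if attitude.roll < -threshold {
            // tilted left: navigate "left"
        }
    }
}
```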
Related
I just want some hints on how I can create an iOS app that does the following.
When the user is at point X, they tap a start button, the app starts a timer, and it tracks their movement. The user is on a horse and needs to ride in a full circle. When they come back to point X, the app should draw the route they took on the horse.
The aim is to ride in a complete circle. I want to make this app to practice and see how close to a circle I ride.
I looked at the GPS locator, but I am not sure whether it will give me accurate enough results, because the circle I ride can be as small as 60 m or less in radius.
I don't know if the iOS GPS can be this accurate. I read an article on the motion sensors and how to track rotation and acceleration.
But I am not quite sure how to use that to my advantage.
I just need some tips like which API to use etc.
Using the Standard Positioning Service one can achieve 15-meter horizontal accuracy 95% of the time. This means that 95% of the time, the coordinates you read from your GPS receiver display will be within 15 meters of your true position on the earth.
To integrate a map and draw a path from the current position, Google Maps is a good option to integrate into an iOS app.
For a small range like this, to get accurate results, use an indoor positioning system (IPS).
For more information about indoor positioning systems, see http://developer.estimote.com/ and their iOS demo code on GitHub.
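As a rough illustration of the tracking-and-drawing part, here is a minimal sketch using Apple's Core Location and MapKit (rather than the Google Maps SDK mentioned above). The class name, the 20 m accuracy cut-off, and the 2 m distance filter are assumptions, and you still need the usual location-usage keys in Info.plist:

```swift
import MapKit
import CoreLocation

// Records GPS fixes while riding and draws the route as a polyline when stopped.
class RideTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private(set) var coordinates: [CLLocationCoordinate2D] = []
    weak var mapView: MKMapView?

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.distanceFilter = 2                  // metres between updates
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func stop() {
        manager.stopUpdatingLocation()
        drawRoute()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Keep only reasonably accurate fixes; consumer GPS is typically 5-15 m off.
        let good = locations.filter { $0.horizontalAccuracy > 0 && $0.horizontalAccuracy < 20 }
        coordinates.append(contentsOf: good.map { $0.coordinate })
    }

    private func drawRoute() {
        guard coordinates.count > 1, let mapView = mapView else { return }
        let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
        mapView.addOverlay(polyline)    // render it in mapView(_:rendererFor:) with an MKPolylineRenderer
    }
}
```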
I am working on an iOS app with tracking. I have implemented Kalman smoothing in order to present a pleasing path. This is working pretty well at this point.
I am having a bit of trouble dealing with the user-not-moving case, though. When the user IS moving we get very good readings back from the CLLocationManager, and even when a reading is a bit off the Kalman algorithm takes care of it.
When standing still, the CLLocationManager delegate is still receiving "accurate" locations. They have good accuracy and not an unbelievable speed. But looking at the screen with human eyes, it's clear that the user is standing still, with all these points just scattered around: some points very close, and a few of them far out.
I have tried setting the CLLocationManager property pausesLocationUpdatesAutomatically, but it doesn't seem to work that well. It doesn't always stop when it should, and there has been difficulty restarting the tracking again once the antennas are powered down.
So I'm looking to keep the tracking on the whole time, but I want to filter out the jitter in post-processing: determine programmatically that the user has stopped, and discard (or ignore) all locations until the user is moving again.
I'm not really sure how to go about this. What algorithm is appropriate to achieve something like this?
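One possible heuristic (separate from the Kalman smoother, and certainly not the only option): treat the user as stopped when the last N fixes all fall within a small radius of their centroid, and drop fixes while in that state. The window size and radius below are assumptions to tune against the accuracy you actually see:

```swift
import CoreLocation

// Declares the user stationary when the last `windowSize` fixes all lie within
// `stationaryRadius` metres of their centroid.
struct StationaryFilter {
    private var window: [CLLocation] = []
    let windowSize = 8
    let stationaryRadius: CLLocationDistance = 10   // metres

    mutating func isStationary(after location: CLLocation) -> Bool {
        window.append(location)
        if window.count > windowSize { window.removeFirst() }
        guard window.count == windowSize else { return false }

        // Centroid of the window (a plain average is fine over a few metres).
        let lat = window.map { $0.coordinate.latitude }.reduce(0, +) / Double(window.count)
        let lon = window.map { $0.coordinate.longitude }.reduce(0, +) / Double(window.count)
        let centroid = CLLocation(latitude: lat, longitude: lon)

        // Stationary if every fix in the window is near the centroid.
        return window.allSatisfy { $0.distance(from: centroid) < stationaryRadius }
    }
}
```

You would feed each incoming CLLocation through this before the Kalman step and skip points while it reports stationary.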
I downloaded the IndoorAtlas iPhone SDK and also generated path maps and test paths for my venue. The SDK navigates me perfectly when I am moving from one place to another, but when I stop moving it generates scattered output with a position radius of 10 to 25. I am expecting precise coordinates in both of the above cases in my project.
Is there any way to get more precision?
IndoorAtlas technology is using the history of magnetic field observations for computing the precise location. This means that the device needs to move some distance in order to collect enough data to converge to a correct location estimate, i.e., to have a location fix. We are constantly improving our service to decrease the time needed for the first location fix.
If you experience your position moving after you've already stopped walking, please contact support@indooratlas.com with details of your application and the venue where this is experienced, and we'll look into it. Thanks!
Is it possible to calculate small distances with CoreMotion?
For example, a user moves their iOS device up or down, or left and right, while holding the device in front of them (landscape).
EDIT
Link as promised...
https://www.youtube.com/watch?v=C7JQ7Rpwn2k position stuff starts at about 23 minutes in.
His summary...
The best thing to do is to try and not use position in your app.
There is a video that I will find to show you. But short answer... No. The margin for error is too great and the integration that you have to do (twice) just amplifies this error.
At best you will end up with the device telling you it is slowly moving in one direction all the time.
At worst it could think it's hurtling around the planet.
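To see why, here is a naive double-integration sketch (the axis choice and the 100 Hz update rate are arbitrary). Run it with the phone lying still on a desk and the reported position will still wander, because noise and bias in the acceleration get integrated twice:

```swift
import CoreMotion

// Naive dead reckoning along the device x axis: integrate acceleration to get
// velocity, then integrate velocity to get position. Error grows roughly with t².
let motionManager = CMMotionManager()
var velocity = 0.0              // m/s
var position = 0.0              // m
var lastTimestamp: TimeInterval?

func startNaiveTracking() {
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }

        let dt = motion.timestamp - last
        let ax = motion.userAcceleration.x * 9.81   // userAcceleration is in g
        velocity += ax * dt                         // first integration
        position += velocity * dt                   // second integration
        print("estimated x position: \(position) m")
    }
}
```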
2020 Update
So, iOS has added the Measure app, which does what the OP wanted. It uses a combination of the accelerometer, gyroscope, and magnetometer in the phone, along with ARKit, to get the external reference I was talking about in this answer.
I'm not 100% certain, but if you wanted to do something like the OP was asking, you should be able to dig into ARKit and find some APIs in there that do what you want.
👍🏻
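For example, a minimal world-tracking sketch along these lines (the class name is made up) gives you the camera's position in metres relative to where the session started:

```swift
import ARKit

// World tracking: ARKit fuses the motion sensors with the camera, which provides
// the external reference that pure accelerometer integration lacks.
class PositionTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The translation column of the camera transform is the position in metres.
        let t = frame.camera.transform.columns.3
        print("camera position: x=\(t.x) y=\(t.y) z=\(t.z)")
    }
}
```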
I have a requirement mentioned below:
Already have a floor plan map image
First detect current location on floor
Then select the destination location using floor plan map image
Now the application should provide direction & distance for the source-to-destination path
This is like how Google directions work, but it requires an in-house map.
For example,
- Current position of user is: At his desk
- Where is Meeting Room #11
- So application should provide direction and distance updates on the map/floor plan image.
Any kind of suggestions/help would be great.
Thanks in advance
Couple of points...
You could create various audio files and play them as waypoints based on routing. Same principle as 'turn right at the next light'.
Definitely want to set your accuracy to: kCLLocationAccuracyBest. But this will still probably only get you accuracy of around +/- 10 meters at best.
Do a floor plan overlay using MKOverlayView (see the sketch after these points).
If you are indoors, the iPhone uses cell towers or Wi-Fi for a location fix. This might be a problem for you because if you are looking to map multiple floors, only GPS can give you altitude readings - ground floor, second floor, etc...
I don't want to pour cold water on your idea, but I have not heard of anyone successfully doing an indoor navigation app on an iPhone using standard stuff. If you really wanted to move forward on this project, your best accuracy might be using indoor Bluetooth transmitters as navigational beacons...?
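For the overlay point, here is a sketch of one common approach: a custom MKOverlay that covers a fixed region, plus a renderer that draws the floor plan image into it (using the modern MKOverlayRenderer in place of MKOverlayView). The metre-based sizing and the class names are assumptions you would calibrate to your building:

```swift
import MapKit
import UIKit

// An overlay whose bounding rect covers the building footprint.
class FloorPlanOverlay: NSObject, MKOverlay {
    let coordinate: CLLocationCoordinate2D
    let boundingMapRect: MKMapRect
    let image: UIImage

    init(image: UIImage, center: CLLocationCoordinate2D,
         widthMeters: Double, heightMeters: Double) {
        self.image = image
        self.coordinate = center
        let centerPoint = MKMapPoint(center)
        let pointsPerMeter = MKMapPointsPerMeterAtLatitude(center.latitude)
        let width = widthMeters * pointsPerMeter
        let height = heightMeters * pointsPerMeter
        self.boundingMapRect = MKMapRect(x: centerPoint.x - width / 2,
                                         y: centerPoint.y - height / 2,
                                         width: width, height: height)
        super.init()
    }
}

// Draws the floor plan image into the overlay's bounding rect.
class FloorPlanRenderer: MKOverlayRenderer {
    override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in context: CGContext) {
        guard let floorPlan = overlay as? FloorPlanOverlay,
              let cgImage = floorPlan.image.cgImage else { return }
        let drawRect = rect(for: floorPlan.boundingMapRect)
        context.saveGState()
        // Core Graphics draws images flipped relative to UIKit, so flip the context.
        context.translateBy(x: 0, y: drawRect.maxY)
        context.scaleBy(x: 1, y: -1)
        context.draw(cgImage, in: CGRect(x: drawRect.minX, y: 0,
                                         width: drawRect.width, height: drawRect.height))
        context.restoreGState()
    }
}
```

You would add the overlay with mapView.addOverlay(_:) and return a FloorPlanRenderer from the map view delegate's mapView(_:rendererFor:).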
What you want is path-planning on the map, is that it? If so, there are lots of algorithms you can use. You can choose a block size based on your map and resolution needs, divide the map into blocks, and mark each block as navigable or not. Then, starting from the first block and heading in the direction of the destination block, check whether the neighbouring block is blocked or not, and keep going until you reach the destination block (or don't, if it's not reachable).
That's a pseudo-implementation (a sketch follows below); you have some options for doing it, if I understand your needs.
(I don't know your hardware; as said by others, with simple GPS and indoor navigation, assuming a 15 m resolution is a good balance between an optimistic and a pessimistic signal. If it's for robot navigation, it's not a good approach in GPS terms, but the algorithm is.)
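A rough sketch of that block-based idea, using a plain breadth-first search over a walkable/blocked grid. The grid itself, and how you map floor-plan coordinates onto cells, are up to you:

```swift
// Breadth-first search over a grid of walkable / blocked cells.
struct GridPathfinder {
    let walkable: [[Bool]]                      // walkable[row][col]

    func shortestPath(from start: (row: Int, col: Int),
                      to goal: (row: Int, col: Int)) -> [(row: Int, col: Int)]? {
        let rows = walkable.count
        let cols = walkable.first?.count ?? 0
        guard rows > 0, cols > 0 else { return nil }

        var cameFrom: [[(Int, Int)?]] = Array(repeating: Array(repeating: nil, count: cols), count: rows)
        var visited: [[Bool]] = Array(repeating: Array(repeating: false, count: cols), count: rows)
        var queue = [start]
        visited[start.row][start.col] = true

        while !queue.isEmpty {
            let current = queue.removeFirst()
            if current == goal {
                // Walk back through cameFrom to rebuild the path.
                var path = [current]
                var node = current
                while let previous = cameFrom[node.row][node.col] {
                    node = (row: previous.0, col: previous.1)
                    path.append(node)
                }
                return Array(path.reversed())
            }
            // Try the four neighbouring blocks.
            for (dr, dc) in [(-1, 0), (1, 0), (0, -1), (0, 1)] {
                let r = current.row + dr
                let c = current.col + dc
                guard r >= 0, r < rows, c >= 0, c < cols,
                      !visited[r][c], walkable[r][c] else { continue }
                visited[r][c] = true
                cameFrom[r][c] = (current.row, current.col)
                queue.append((row: r, col: c))
            }
        }
        return nil                              // destination is not reachable
    }
}
```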