I am a student, and I am developing an iOS app to track indoor position.
My idea is that, starting from a given reference point (a known position), I use the inertial sensors in my iPhone (accelerometer, gyroscope, etc.) to track the phone as it moves, and display where the user is going on an indoor map (a simple floor plan).
But the problem is that I have no idea how to combine these sensors to get an actual position.
Does anyone have experience with indoor positioning systems using inertial sensors that they can share with me?
Thank you so much.
One solution is to use Bluetooth beacons.
They connect to your iPhone over Bluetooth, and based on their signal strength you can estimate the distance to each one of them, and from those distances estimate your indoor position.
Read more: Indoor Positioning
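For concreteness, here is a minimal sketch of ranging beacons with Core Location, using the classic (pre-iOS 13) ranging API. The proximity UUID and region identifier are placeholders for your own beacons:

```swift
import CoreLocation

class BeaconRanger: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    // Placeholder proximity UUID; use the one your beacons advertise.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "B9407F30-F5F8-466E-AFF9-25556B57FE6D")!,
        identifier: "my-beacons")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()  // needs the usage key in Info.plist
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            // `accuracy` is a rough distance estimate in meters (-1 if unknown).
            // It fluctuates heavily, so treat it as input to a filter, not truth.
            print("\(beacon.major)/\(beacon.minor): ~\(beacon.accuracy) m, rssi \(beacon.rssi)")
        }
    }
}
```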
Related
I am writing an app that will determine the angle at which the iOS device is tilted off vertical. Specifically, a window will come up with crosshairs (similar to a rifle scope). When the target object is placed in the center of the crosshairs, I would like to get a reading of the up or down angle as referenced from the device; I suppose it would be data similar to a surveyor's transit. It just needs to be accurate to half a degree. I've read about the accelerometer and gyroscope sensors. Both seem relevant, but I'm not sure which is the best way to go. Any insights would be appreciated. Thanks.
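In case it helps, here is a minimal Core Motion sketch for reading tilt. CMDeviceMotion fuses the accelerometer and gyroscope, which tends to be steadier than reading either sensor alone. This assumes the device is held roughly upright in portrait; whether it actually holds half a degree depends on sensor calibration:

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startTiltUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // attitude.pitch is rotation about the device's x-axis in radians:
        // 0 when flat on a table, about .pi / 2 when upright in portrait.
        let degreesOffVertical = 90.0 - attitude.pitch * 180.0 / .pi
        print(String(format: "%.1f degrees off vertical", degreesOffVertical))
    }
}
```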
Many AR applications allow the user to select a 3D object and display it in the phone. If the user walks into a different area, the 3D object still remains at the same location. How can I achieve this? Can GPS solve this problem? GPS is not accurate indoors, though.
You can do it with the ARKit, ARCore, or Vuforia SDKs. In ARKit and ARCore you can anchor objects to physical locations and they will stay there even if you walk into a different room. However, you might notice some drift in ARCore because the device's environmental understanding changes over time, or you might lose tracking at some point. With Vuforia you can use extended tracking to track objects, but it works a bit differently than ARCore and ARKit: you have to use a ground plane or smart terrain to fully utilize extended tracking in your situation.
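As a minimal ARKit illustration of anchoring (the `sceneView` parameter and the plane hit-test are assumptions for this sketch): an ARAnchor fixes a pose in world space, and the session keeps it in place for as long as tracking holds.

```swift
import ARKit
import UIKit

// Drop an anchor at a tapped screen point; `sceneView` is an ARSCNView
// already running a world-tracking session.
func placeAnchor(at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Hit-test against detected planes to find a real-world position.
    guard let result = sceneView.hitTest(screenPoint,
                                         types: .existingPlaneUsingExtent).first
    else { return }
    let anchor = ARAnchor(transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
    // Attach your 3D content to this anchor in the ARSCNViewDelegate
    // callback renderer(_:nodeFor:).
}
```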
I am currently developing an augmented reality Android application in which I would like to display the discharge data of a river, along with the river name, as augmented features. However, I would like to show the augmented data only if the user is facing the device camera towards the river, and not in the opposite direction.
How should I implement this?
I thought there could be two ways:
feature detection: but I do not know if it would work, as the feature here (the river) is quite dynamic.
something to do with the orientation of the phone with respect to the real world. However, I do not really have an idea of how to implement this.
I think the best way to implement this is GPS-based augmented reality. Basically, you attach the feature to a GPS location, and when the user holds the camera in the direction of that GPS location while close to it, the detection happens (see the sketch after the links below). You should definitely not go for image-based feature detection.
You may follow the links below:
https://www.youtube.com/watch?v=X6djed8e4n0
http://wirebeings.com/markerless-gps-ar.html
I hope this helps.
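Here is a rough sketch of the heading check described above, written in Swift for brevity (the same math applies on Android with its Location and sensor APIs). The river coordinate and the 30-degree tolerance are placeholder assumptions:

```swift
import CoreLocation

// Great-circle initial bearing from point a to point b, in degrees 0..360.
func bearing(from a: CLLocationCoordinate2D, to b: CLLocationCoordinate2D) -> Double {
    let lat1 = a.latitude * .pi / 180, lat2 = b.latitude * .pi / 180
    let dLon = (b.longitude - a.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
}

// Show the river overlay only when the compass heading points toward the river.
func shouldShowOverlay(user: CLLocationCoordinate2D,
                       heading: CLHeading,
                       river: CLLocationCoordinate2D) -> Bool {
    var diff = abs(heading.trueHeading - bearing(from: user, to: river))
    if diff > 180 { diff = 360 - diff }  // shortest angular difference
    return diff < 30                     // within +/-30 degrees of the river
}
```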
Is it possible to find an accurate position of a person in a room using multiple Bluetooth beacons set up around the room through proximity detection? I've done some research, and it says it is difficult to detect proximity with a Bluetooth sensor, but can I overcome this by using multiple Bluetooth beacons? The point is that you can use multiple Bluetooth beacons to triangulate positions, and we'll be developing an iOS app that tells the user (blind or disabled) where obstacles are. I'm wondering if this idea is doable and am curious how I can achieve accurate proximity sensing with Bluetooth beacons.
Thank you.
For anyone who thinks of this idea in the future: Bluetooth really isn't ideal, but there are other ways, such as using other types of sensors.
I am building an indoor navigation application using iBeacons. For that I am using the accuracy (distance estimate) given by the beacon, but it changes rapidly. Since that value changes, the X and Y coordinates of the user's location, which have to be calculated from it, also vary even when I am standing still. Please help me keep the accuracy value constant when I am not moving.
Thanks in advance.
I suggest you read the following article about experience with two positioning algorithms, trilateration and nonlinear regression: R/GA Tech Blog
You will find the complete iOS app from these guys, implementing both algorithms, on GitHub.
The app is very helpful for understanding the difficulties of indoor navigation and for experimenting with them.
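For intuition about what the trilateration algorithm is doing, here is a minimal 2D sketch (my own illustration, not the code from the article): subtracting the circle equations pairwise turns the problem into a 2x2 linear system that Cramer's rule solves.

```swift
struct Beacon2D {
    let x, y: Double        // known beacon position (meters)
    let distance: Double    // estimated distance to the phone (meters)
}

// Solve (x - xi)^2 + (y - yi)^2 = ri^2 for three beacons by linearization.
func trilaterate(_ b1: Beacon2D, _ b2: Beacon2D, _ b3: Beacon2D) -> (x: Double, y: Double)? {
    let a11 = 2 * (b2.x - b1.x), a12 = 2 * (b2.y - b1.y)
    let a21 = 2 * (b3.x - b1.x), a22 = 2 * (b3.y - b1.y)
    let c1 = b1.distance * b1.distance - b2.distance * b2.distance
           - b1.x * b1.x + b2.x * b2.x - b1.y * b1.y + b2.y * b2.y
    let c2 = b1.distance * b1.distance - b3.distance * b3.distance
           - b1.x * b1.x + b3.x * b3.x - b1.y * b1.y + b3.y * b3.y
    let det = a11 * a22 - a12 * a21
    guard abs(det) > 1e-9 else { return nil }  // collinear beacons: no unique fix
    return ((c1 * a22 - c2 * a12) / det, (a11 * c2 - a21 * c1) / det)
}
```

With noisy distance estimates the three circles rarely intersect in a single point, which is why the article also explores the nonlinear regression approach.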
Also please note: Apple announced indoor positioning with the Core Location framework in iOS 8 at WWDC 2014, but a couple of months later they stopped the open program. There was a lot of rush around the new feature, and Apple then decided to offer the program only to big companies. You can register for it here.
It is important to understand Apple's strategy: iBeacon technology is for proximity and advertising, in contrast to the Core Location framework's indoor positioning features in iOS 8. The first is just an addition to the second, not a replacement.
There is also an interesting article on the Estimote blog about the physics of beacon tech. The useful part for you begins with the sentence "When we started building it, we were experimenting with a method called trilateration."
Indoor positioning using beacons is extremely hard, precisely because of the fluctuations in the distance (accuracy) estimates. You could try some averaging and smoothing algorithms, but that's just the beginning of implementing reliable, beacon-based indoor positioning.
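As a starting point for the smoothing mentioned above, a simple exponential moving average over the reported distances already removes much of the jitter. The `alpha` value here is an arbitrary choice (lower is smoother but laggier); a 1D Kalman filter is the usual next step:

```swift
final class DistanceSmoother {
    private var smoothed: Double?
    private let alpha: Double

    init(alpha: Double = 0.2) { self.alpha = alpha }

    // Feed in each raw `accuracy` reading; returns the smoothed distance.
    func update(with rawDistance: Double) -> Double? {
        guard rawDistance >= 0 else { return smoothed }  // CLBeacon reports -1 when unknown
        smoothed = smoothed.map { alpha * rawDistance + (1 - alpha) * $0 } ?? rawDistance
        return smoothed
    }
}
```

Keep one smoother per beacon and feed it the `accuracy` values from the ranging callback before computing your X and Y coordinates.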
Estimote is working on a ready-made library for indoor location with beacons: https://github.com/Estimote/iOS-Indoor-SDK; you might want to give it a try. It only works with Estimote beacons, though.