I am writing an app that will determine the angle at which the iOS device is tilted off vertical. Specifically, a window will come up with cross hairs (similar to a rifle scope). When the target object is placed in the center of the cross hairs, I would like to get a reading of the up or down angle as referenced from the device. I suppose it would be data similar to a surveyor's transit. It just needs to be accurate to 1/2 degree. I've read about the accelerometer and gyroscope sensors. Both seem relevant, but I'm not sure which is the best way to go. Any insights would be appreciated. Thanks.
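For what it's worth, here is a minimal sketch of the gravity-based approach via CMDeviceMotion. It assumes you sight along the back camera, i.e. along the device's -Z axis; the function name is illustrative:

```swift
import CoreMotion
import Foundation

let motionManager = CMMotionManager()

// Reports the elevation of the back camera's line of sight (the device's -Z axis)
// relative to the horizontal: 0° = level, +90° = straight up, -90° = straight down.
func startElevationUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let g = motion?.gravity else { return }   // fused gravity, device coordinates, in g
        let magnitude = (g.x * g.x + g.y * g.y + g.z * g.z).squareRoot()
        let elevationRadians = asin(max(-1.0, min(1.0, g.z / magnitude)))
        print(String(format: "elevation: %.2f°", elevationRadians * 180.0 / .pi))
    }
}
```

CMDeviceMotion already fuses the accelerometer and gyroscope for you, so for an essentially static reading the gravity vector should resolve half a degree comfortably; averaging a few samples helps if the hand is not perfectly steady.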
Using iOS Core Motion I am reading the g-force applied to a device; however, if the device is not completely level, the readings are not as accurate.
Say I want the force applied on the Y axis, but the phone is not 100% flat. Is there a calculation that can be done using the other axes to correct the figure and account for the slight angle?
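One approach, as a sketch rather than a definitive answer: CMDeviceMotion already separates gravity from userAcceleration, and because both are reported in the same device coordinates, you can project the user acceleration onto the gravity direction to get the tilt-compensated vertical component no matter how the phone is held:

```swift
import CoreMotion
import Foundation

let motionManager = CMMotionManager()

// Logs the component of user-generated acceleration along the world vertical,
// independent of how the device is tilted. Positive values mean acceleration upward.
func startVerticalAccelerationUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        let g = motion.gravity            // gravity in device coordinates, in g
        let a = motion.userAcceleration   // total acceleration minus gravity, device coordinates, in g
        let gMagnitude = (g.x * g.x + g.y * g.y + g.z * g.z).squareRoot()
        // Project the user acceleration onto the (downward) gravity direction;
        // negate so that upward acceleration is positive.
        let vertical = -(a.x * g.x + a.y * g.y + a.z * g.z) / gMagnitude
        print(String(format: "vertical acceleration: %.3f g", vertical))
    }
}
```

If you want the acceleration along a fixed horizontal direction instead, you can transform userAcceleration into the reference frame using attitude.rotationMatrix; the projection above has the advantage of not depending on any matrix convention.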
I'm currently developing an iPhone app (on iPhone 5, iOS 7, Xcode 5) which requires a very accurate determination of the current attitude. The "attitude" of CMDeviceMotion does not fulfil these requirements because Apple's sensor fusion algorithm seems to rely too much on the gyroscope, which drifts away rather fast (in my experience). That's why I decided to read out the bare sensor data and later combine it in a sensor fusion algorithm of my own.
When asking for magnetometer data one has two possibilities:
1. via CMMagnetometerData in CMMotionManager
2. via CMCalibratedMagneticField in CMDeviceMotion, about which Apple says
The CMCalibratedMagneticField returned by this property gives you the total magnetic field in the device’s vicinity without device bias. Unlike the magneticField property of the CMMagnetometer class, these values reflect the earth’s magnetic field plus surrounding fields, minus device bias.
In principle (2.) is exactly what I want.
There is a very simple test of whether magnetometer data is calibrated properly. For simplicity one can restrict oneself to two dimensions. When the device lies on its back, the combination B_x^2 + B_y^2 must be constant, independent of the direction the device is pointing in. It must simply equal the horizontal component of the Earth's magnetic field (assuming no other fields in the vicinity of the device). Thus, when performing a 360-degree turn of the device lying on its back, the measured data B_y over B_x should trace out a circle. See here for details.
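For reference, a minimal sketch of this check using the calibrated field from CMDeviceMotion; it assumes the device lies flat on its back while being turned, and the names are illustrative:

```swift
import CoreMotion
import Foundation

let motionManager = CMMotionManager()

// Logs sqrt(B_x^2 + B_y^2) of the calibrated field. With the device lying flat on its back,
// this value should stay (roughly) constant while the device is turned through 360 degrees.
func startHorizontalFieldLogging() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 10.0
    // A reference frame with a corrected Z axis makes deviceMotion.magneticField
    // carry the calibrated values (in microteslas).
    motionManager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical,
                                           to: .main) { motion, _ in
        guard let calibrated = motion?.magneticField,
              calibrated.accuracy != .uncalibrated else { return }
        let bx = calibrated.field.x
        let by = calibrated.field.y
        let horizontal = (bx * bx + by * by).squareRoot()
        print(String(format: "B_x: %6.1f  B_y: %6.1f  horizontal: %6.1f µT", bx, by, horizontal))
    }
}
```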
Now the point: the data of CMCalibratedMagneticField does NOT result in a circle!
Does anyone have an explanation for that? Or does anyone know how the CMCalibratedMagneticField comes about? Is the magnetometer calibrated in the sense of the link above when performing the "eight-shaped" movement of the device, or what is that movement good for?
By the way, why the "eight-shaped" movement and not flipping the device around its three axes, which would allow a calibration as described in the link above?
I would be very glad for any clarification on this issue... Thanks!
There is a problem with the magnetometer in iOS 7; it has an error of ±7°. Try using the 7.1 beta version.
EDIT
The magnetometer has zero drift over time, but is pretty inaccurate for sudden changes in position. The accelerometer and gyroscope, on the other hand, adjust quickly to sudden changes but, being inertial sensors, they lose accuracy over time.
So when CMCalibratedMagneticField tries to compensate for your rotational motion it uses data from the gyroscope and accelerometer. This is when the accelerometer and gyroscope's ±7° error creeps in and throws your circle off track. Check this answer and this Wikipedia article for more info.
As regards the figure of eight:
Both do the same thing: they orient the "north" of your device in each direction in the hope of cancelling out magnetic interference. Flipping your device about all three axes would work better, but it is harder to perform and not as easily understood by the user.
Hope this helps.
Right now I am developing an app for testing the human eye by reading letters and symbols; for that, the user has to maintain a distance of 2 feet from the device. So I need to detect the distance between the human face and the iOS device using the front camera.
Regarding this I have some doubts to clarify:
For detecting the human face I planned to use the Core Image framework. With that, is it possible to detect the human face in the background, without showing a camera UI?
For calculating the distance I planned to use the formula below, where the focal length and sensor height are in millimetres, the real height of the object is in millimetres, and the camera frame height and the object's height in the image are in pixels:
distance = focal length * real height of object * camera frame height / (object's height in the image * sensor height)
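For illustration, that formula translates into something like the sketch below; the parameter names and the numbers in the example are assumptions, not values from any particular device:

```swift
// Pinhole-camera distance estimate.
// focalLength and sensorHeight are in millimetres, the two pixel heights are in pixels,
// realObjectHeight is in millimetres; the result is in millimetres.
func estimateDistance(focalLength: Double,
                      realObjectHeight: Double,
                      framePixelHeight: Double,
                      objectPixelHeight: Double,
                      sensorHeight: Double) -> Double {
    return (focalLength * realObjectHeight * framePixelHeight)
         / (objectPixelHeight * sensorHeight)
}

// Hypothetical example: 2.2 mm focal length, 2.7 mm sensor height, a 960 px high frame,
// and a face of real height ~200 mm that spans 300 px of the frame:
let distanceMM = estimateDistance(focalLength: 2.2,
                                  realObjectHeight: 200,
                                  framePixelHeight: 960,
                                  objectPixelHeight: 300,
                                  sensorHeight: 2.7)
// distanceMM ≈ 521 mm, i.e. roughly half a metre.
```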
And I have seen a few apps in the App Store that use the back camera to calculate the distance between the device and an object, so I am a little confused about whether it is possible to do this with the front camera.
Please help me with how to achieve this, or tell me whether this is the right way or not.
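For the face-detection part, here is a sketch of how the front camera could feed Core Image's face detector with no visible camera UI. The class, queue, and property names are illustrative; the app still needs camera permission (NSCameraUsageDescription), and the frame orientation from the front camera may need to be passed to the detector:

```swift
import AVFoundation
import CoreImage
import CoreMedia
import Foundation

final class FaceHeightReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let session = AVCaptureSession()
    private let detector = CIDetector(ofType: CIDetectorTypeFace,
                                      context: nil,
                                      options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    func start() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "face.detection"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        // No preview layer is attached, so nothing is shown on screen.
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        guard let face = detector?.features(in: image).first as? CIFaceFeature else { return }

        // face.bounds.height is the face height in pixels; this is the value to plug into
        // the distance formula above, together with the frame height.
        let faceHeightPixels = Double(face.bounds.height)
        let frameHeightPixels = Double(image.extent.height)
        print("face height: \(faceHeightPixels) px in a \(frameHeightPixels) px frame")
    }
}
```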
I'm working on an app that allows the user to rotate the iOS device like a steering wheel. I'm interested in getting a rough approximation of the degrees of rotation (it doesn't have to be accurate at all). Is there an API for this?
Yes. It's called Core Motion.
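For a rough reading, something like the sketch below works without any calibration. It reads the fused gravity vector from CMDeviceMotion and assumes the phone is held facing the user, roughly upright, like a wheel (names are illustrative):

```swift
import CoreMotion
import Foundation

let motionManager = CMMotionManager()

// Approximate steering-wheel angle in degrees: 0° with the phone upright,
// positive when rotated clockwise (from the user's point of view), negative counter-clockwise.
func startSteeringUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let g = motion?.gravity else { return }
        // Angle of the device's Y axis away from "up", measured in the screen plane.
        let angleDegrees = atan2(g.x, -g.y) * 180.0 / .pi
        print(String(format: "steering angle: %.0f°", angleDegrees))
    }
}
```

Because this only uses the fused gravity estimate, it does not drift over time, which is usually good enough for a steering-wheel style control.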