I'm working on an app that lets the user rotate the iOS device like a steering wheel. I'm interested in getting a rough approximation of the degrees of rotation (it doesn't have to be accurate at all). Is there an API for this?
Yes. It's called Core Motion.
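For a steering-wheel gesture, one simple approach is to derive the angle from the fused gravity vector. A minimal sketch, assuming the device is held upright facing the user (the sign convention may need flipping for your setup):

```swift
import CoreMotion
import Foundation

let manager = CMMotionManager()
manager.deviceMotionUpdateInterval = 1.0 / 30.0
manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let g = motion?.gravity else { return }
    // Held upright in portrait, gravity ≈ (0, -1, 0); rotating the
    // device like a wheel swings gravity between the x and y axes.
    let degrees = atan2(g.x, -g.y) * 180.0 / .pi
    print(String(format: "wheel angle ≈ %.0f°", degrees))
}
```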
Related
I am writing an app that will determine the angle at which the iOS device is tilted off vertical. Specifically, a window will come up with crosshairs (similar to a rifle scope). When the target object is placed in the center of the crosshairs, I would like to get a reading of the up or down angle as referenced from the device. I suppose the data would be similar to a surveyor's transit. It just needs to be accurate to 1/2 degree. I've read about the accelerometer and gyroscope sensors. Both seem relevant, but I'm not sure which is the best way to go. Any insights would be appreciated. Thanks.
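For reference, a minimal sketch of reading that tilt via Core Motion's fused attitude, assuming its accuracy is sufficient for the half-degree requirement (worth verifying on a real device):

```swift
import CoreMotion
import Foundation

let manager = CMMotionManager()
manager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let attitude = motion?.attitude else { return }
    // pitch is 0 when the device lies flat and ±90° when vertical,
    // so the angle off vertical is 90° minus the absolute pitch.
    let pitchDegrees = attitude.pitch * 180.0 / .pi
    let offVertical = 90.0 - abs(pitchDegrees)
    print(String(format: "tilt off vertical: %.1f°", offVertical))
}
```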
I'm new to Core Motion and I'm very confused. Can somebody please explain what these inputs measure and how they can be useful in simple terms?
Accelerometers measure acceleration, including the constant pull of gravity, by "feeling" the force applied to the device. That force can be described as the rate at which the device speeds up or slows down, hence the name of this sensor.
Gyroscopes measure changes in rotation by virtue of a suspended element that reports its rotation relative to the device. As the device rotates, this suspended element doesn't rotate with it, so its readings tell you how far the phone has turned.
Magnetometers get their idea of rotational position from the Earth's magnetic field, the same north/south field that compasses use to find the poles. This data is used primarily to help the gyroscope, because gyroscopes suffer from drift and inertia.
Combined, and when filtered well (which Apple does for you with Core Motion), the information from these sensors gives you all the movement of a phone.
So you can know if the user is swinging the phone around like a table tennis bat, steering it like a Wii Remote in a Mario-style game, or simply walking.
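As a rough sketch of how those three sensors are exposed through CMMotionManager (update queues chosen arbitrarily here):

```swift
import CoreMotion

let manager = CMMotionManager()
manager.startAccelerometerUpdates(to: .main) { data, _ in
    if let a = data?.acceleration { print("accel (g):", a.x, a.y, a.z) }
}
manager.startGyroUpdates(to: .main) { data, _ in
    if let r = data?.rotationRate { print("gyro (rad/s):", r.x, r.y, r.z) }
}
manager.startMagnetometerUpdates(to: .main) { data, _ in
    if let m = data?.magneticField { print("mag (µT):", m.x, m.y, m.z) }
}
```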
I'm currently developing an iPhone app (on iPhone 5, iOS 7, Xcode 5) which requires a very accurate determination of the current attitude. The attitude of CMDeviceMotion does not fulfil these requirements because Apple's sensor fusion algorithm seems to rely too much on the gyroscope, which (in my experience) drifts away rather fast. That's why I decided to read out the bare sensor data and combine it later in a sensor fusion algorithm of my own.
When asking for magnetometer data one has two possibilities:
1. via CMMagnetometerData in CMMotionManager
2. via CMCalibratedMagneticField in CMDeviceMotion, about which Apple says:
The CMCalibratedMagneticField returned by this property gives you the total magnetic field in the device’s vicinity without device bias. Unlike the magneticField property of the CMMagnetometer class, these values reflect the earth’s magnetic field plus surrounding fields, minus device bias.
In principle, option (2) is exactly what I want.
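In code, the two variants might be read like this (a sketch; note that the calibrated field is only populated when device motion runs with a reference frame that uses the magnetometer):

```swift
import CoreMotion

let manager = CMMotionManager()

// 1. Raw magnetometer data, including device bias:
manager.startMagnetometerUpdates(to: .main) { data, _ in
    guard let b = data?.magneticField else { return }
    print("raw:", b.x, b.y, b.z)
}

// 2. Calibrated field via device motion, with device bias removed:
manager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical, to: .main) { motion, _ in
    guard let calibrated = motion?.magneticField,
          calibrated.accuracy != .uncalibrated else { return }
    print("calibrated:", calibrated.field.x, calibrated.field.y, calibrated.field.z)
}
```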
There is a very simple test of whether magnetometer data is calibrated properly. For simplicity, one can restrict oneself to two dimensions. When the device lies on its back, the combination B_x^2 + B_y^2 must be constant, independent of the direction the device is pointing. Its square root must simply equal the horizontal component of the Earth's magnetic field (assuming no other fields in the vicinity of the device). Thus, when performing a 360-degree turn of the device lying on its back, the measured data, plotted as B_y against B_x, should trace out a circle. See here for details.
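A sketch of that test in code: log the horizontal magnitude while slowly turning the flat-lying device through 360 degrees; with good calibration the value should stay roughly constant:

```swift
import CoreMotion
import Foundation

let manager = CMMotionManager()
manager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical, to: .main) { motion, _ in
    guard let f = motion?.magneticField.field else { return }
    // sqrt(B_x^2 + B_y^2): should be ~constant if calibration is good.
    let horizontal = (f.x * f.x + f.y * f.y).squareRoot()
    print(String(format: "|B_xy| = %.1f µT", horizontal))
}
```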
Now the point: the data of CMCalibratedMagneticField does NOT result in a circle!
Does anyone have an explanation for that? Or does anyone know how CMCalibratedMagneticField is computed? Is the magnetometer calibrated, in the sense of the link above, when performing the "eight-shaped" movement of the device? If not, what is that movement good for?
By the way, why the "eight-shaped" movement, and not flipping the device around its three axes, which would allow a calibration as described in the link above?
I would be very glad for any clarification on this issue... Thanks!
There is a problem with the magnetometer in iOS 7: it has an error of ±7°. Try using the 7.1 beta version.
EDIT
The magnetometer has zero drift over time but is pretty inaccurate for sudden changes in position. The accelerometer and gyroscope, on the other hand, adjust quickly to sudden changes, but, being inertial sensors, they lose accuracy over a period of time.
So when CMCalibratedMagneticField tries to compensate for your rotational motion, it uses data from the gyroscope and accelerometer. This is when the accelerometer and gyroscope's ±7° error creeps in and throws your circle off track. Check this answer and this Wikipedia article for more info.
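To make the tradeoff concrete, here is a toy one-axis complementary filter (not Apple's actual fusion algorithm, just an illustration of blending a fast-but-drifting gyro with a slow-but-drift-free magnetometer heading; angle wrap-around is ignored for brevity):

```swift
// Toy illustration only; not Apple's actual sensor fusion.
final class HeadingFilter {
    private var heading = 0.0   // fused heading estimate, radians
    private let alpha = 0.98    // how much to trust the gyro each step

    func update(gyroRateZ: Double, magHeading: Double, dt: Double) -> Double {
        // Integrate the gyro for responsiveness, then nudge the result
        // toward the magnetometer's absolute (but noisy) heading.
        let gyroEstimate = heading + gyroRateZ * dt
        heading = alpha * gyroEstimate + (1 - alpha) * magHeading
        return heading
    }
}
```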
As regards the figure-of-eight movement:
Both do the same thing: they point the device's "north" in each direction, in the hope of cancelling out magnetic interference. Flipping your device along all three axes works better, but it is harder to perform and not as easily understood by the user.
Hope this helps.
I found this on Stack Overflow:
"You will probably need to use quaternions for composing rotations, if you are not doing so already. This avoids the problem of gimbal lock which you can get when orienting a camera by rotation around the 3 axes."
But how do I use the quaternion from the motion manager in OpenGL? The code was first based on pitch and yaw only. Now I want to use roll as well, so you can use the gyroscope to look around. Could anybody help me with this one?
Thank you.
My advice: don't use Euler angles. Just track orientation with vectors (forward, up, and right) for the object or camera.
To rotate, just apply relative rotations to the current forward, up, and right vectors, like "rotate right 5 degrees".
Quaternions involve excessive operations, and physics doesn't work via yaw, pitch, and roll; those are merely measurements that capture an orientation, not how things get oriented.
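That said, to answer the original question: a common way to use the motion manager's quaternion in OpenGL is to expand it into a rotation matrix. A sketch, assuming OpenGL's column-major convention (CMAttitude's rotationMatrix property is a ready-made alternative):

```swift
import CoreMotion

// Expand a unit quaternion into a column-major 4x4 matrix for OpenGL.
func glMatrix(from q: CMQuaternion) -> [Float] {
    let (x, y, z, w) = (Float(q.x), Float(q.y), Float(q.z), Float(q.w))
    return [
        1 - 2*(y*y + z*z), 2*(x*y + w*z),     2*(x*z - w*y),     0,  // column 0
        2*(x*y - w*z),     1 - 2*(x*x + z*z), 2*(y*z + w*x),     0,  // column 1
        2*(x*z + w*y),     2*(y*z - w*x),     1 - 2*(x*x + y*y), 0,  // column 2
        0,                 0,                 0,                 1,  // column 3
    ]
}
// Usage sketch: feed the result to glLoadMatrixf or a shader uniform.
```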
I have a first-generation iPad, but I need to develop an application which requires gyroscope sensor feeds. How can I simulate a gyroscope, as if I were rotating my iPad?
I've created a sensor simulator for iOS. You can check it out here.
I don't think we can do this.
If we could, then what would be the point of putting a gyroscope in the iPhone 4/4S and iPad in the first place?
If you accept an imperfect simulation, you can use the accelerometer to detect rotation around one axis only. Say, for example, you lay the iPad down with the screen facing up (so the y-axis is now horizontal); if you then rotate the iPad around the y-axis, you can detect the rotation from the changes in the x and z values.
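A sketch of that workaround (the sign convention depends on which way you tilt):

```swift
import CoreMotion
import Foundation

let manager = CMMotionManager()
manager.accelerometerUpdateInterval = 1.0 / 30.0
manager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // Screen up at rest reads gravity ≈ (0, 0, -1) g. Rotating about
    // the y-axis shifts gravity between x and z; atan2 recovers the angle.
    let degrees = atan2(a.x, -a.z) * 180.0 / .pi
    print(String(format: "rotation about y ≈ %.1f°", degrees))
}
```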