Use Magnetometer and Accelerometer to calculate accurate Yaw - iOS

I'm currently using CMMotionManager's attitude to get roll, pitch, and yaw. However, after running for a few minutes, the yaw drifts and becomes inaccurate.
I've read that there is a way to calculate yaw using a combination of the accelerometer and magnetometer that keeps it accurate by compensating for the constant drift, but I haven't found a working formula yet.
Here is the relevant part of my code for getting motion updates:
motionManager.deviceMotionUpdateInterval = 0.1
motionManager.showsDeviceMovementDisplay = true
motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                       to: OperationQueue.current!) { deviceMotion, error in
    if let deviceMotion = deviceMotion {
        let roll = deviceMotion.attitude.roll
        let pitch = deviceMotion.attitude.pitch
        let yaw = deviceMotion.attitude.yaw // gets drifted over time
        ...
    }
}
Any idea?
UPDATE
I was able to compensate for the yaw drift using the following approach:
Create an initial reference reading (fixed world-frame reference) using the magnetometer:
m_w = (m_x, m_y, m_z)
Take the current magnetometer reading (also in the fixed world frame):
n_w = (n_x, n_y, n_z)
Project both readings onto the x-z plane and convert them to angles:
a = atan2(m_z, m_x)
b = atan2(n_z, n_x)
The yaw drift can then be calculated as follows:
y_d = (a - b) * 180 / PI
Now subtract the result from the current yaw reading.
Voilà!
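For reference, here is a minimal Swift sketch of that compensation. The YawDriftCompensator type and its names are illustrative assumptions, not part of Core Motion, and it works in radians throughout, so the 180 / PI conversion above is folded away.

import CoreMotion

// Keeps the first magnetometer sample as a fixed world-frame reference (m_w)
// and measures the angular offset of each later sample (n_w) against it.
struct YawDriftCompensator {
    private var reference: CMMagneticField?   // m_w, captured once at start

    mutating func correctedYaw(rawYaw: Double, magnetometer current: CMMagneticField) -> Double {
        guard let m = reference else {
            reference = current               // first sample becomes the fixed reference
            return rawYaw
        }
        let a = atan2(m.z, m.x)               // angle of the reference reading on the x-z plane
        let b = atan2(current.z, current.x)   // angle of the current reading
        let drift = a - b                     // y_d, in radians
        return rawYaw - drift                 // subtract the drift from the raw yaw
    }
}

Feeding deviceMotion.magneticField.field and attitude.yaw into this on each update keeps the reported yaw anchored to the initial magnetometer reading.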

Don't you mean gyro? CMMotionManager has a mode where it fuses three inputs: the accelerometer, the gyroscope, and the magnetometer.
For this you have to set it up for device motion:
Device motion. Call the startDeviceMotionUpdates(using:) or startDeviceMotionUpdates() method to begin updates and periodically access CMDeviceMotion objects by reading the deviceMotion property. The startDeviceMotionUpdates(using:) method (new in iOS 5.0) lets you specify a reference frame to be used for the attitude estimates.
https://developer.apple.com/documentation/coremotion/cmmotionmanager
However, you must be aware that this increases the battery consumption quite a bit.

Related

Transforming Pitch and Roll values in iOS

I have been trying to compare pitch and roll patterns (2-second recordings) between readings.
What I have done is record the pitch and roll values for 10 seconds. During those 10 seconds, a similar motion is applied to the device.
When I plot these on a graph, the values are the same.
This only happens when the motion is applied to the device from the same orientation/position. For example, the device is lying on a table and a motion is applied to it; all the readings are the same.
But if the device is rotated 180° counterclockwise, the readings are inverted.
Is there a way I can get the same readings for every position, perhaps by applying some transformation formula? I have done it for the acceleration values using pitch, roll, and yaw, but I don't know how to achieve this for pitch and roll themselves.
Basically, what I want to achieve is that the pitch and roll values should be independent of yaw.
Here is the plot of the pitch values; all the readings are the same except the two starting from 1.5 in the graph. Those were the two times when the device was rotated 180° counterclockwise.
UPDATE:
I tried storing the CMAttitude in NSUserDefaults and then applying multiplyByInverseOfAttitude:, but the graph plot is still inverted.
CMAttitude *currentAtt = motion.attitude;
if (firstAttitude) // firstAttitude is the stored CMAttitude
{
    [currentAtt multiplyByInverseOfAttitude:firstAttitude];
}
CMQuaternion quat = currentAtt.quaternion;

Comparison: TYPE_ROTATION_VECTOR with Complementary Filter

I have been working on orientation estimation, and I need to estimate the correct heading while walking in a straight line. After hitting some roadblocks, I started from the basics again.
I have implemented a complementary filter from here, which uses the gravity vector obtained from Android (not the raw acceleration), raw gyro data, and raw magnetometer data. I also apply a low-pass filter to the gyro and magnetometer data and use that as input.
The output of the complementary filter is Euler angles, and I am also recording TYPE_ROTATION_VECTOR, which outputs the device orientation as a 4D quaternion.
So I thought I would convert the quaternions to Euler angles and compare them with the Euler angles obtained from the complementary filter. The output of the Euler angles is shown below when the phone is kept stationary on a table.
As can be seen, the yaw values are off by a huge margin.
What am I doing wrong in this simple case where the phone is stationary?
Then I walked around my living room and got the following output.
The shape of the complementary filter's output looks very good and is very close to Android's, but the values are off by a huge margin.
Please tell me what I am doing wrong.
I don't see any need to apply a low-pass filter to the Gyro. Since you're integrating the gyro to get rotation, it could mess everything up.
Be aware that TYPE_GRAVITY is a composite sensor reading synthesized from gyro and accel inside Android's own sensor fusion algorithm. Which is to say that this has already been passed through a Kalman filter. If you're going to use Android's built-in sensor fusion anyway, why not just use TYPE_ROTATION_VECTOR?
Your angles are in radians by the looks of it, and the error in the first set wasn't too far from 90 degrees. Perhaps you've swapped X and Y in your magnetometer inputs?
Here's the approach I would take: first write a test that takes accel and mag and synthesizes Euler angles from them. Ignore the gyro for now. Walk around the house and confirm that it does the right thing, but is jittery.
Next, slap an aggressive low-pass filter on your algorithm, e.g.
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
Confirm that this still works. It should be much less jittery but also sluggish.
Finally, add gyro and make a complementary filter out of it.
dt = time_since_last_gyro_update;
yaw += gyroData[2] * dt; // test: might need to subtract instead of add
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
The key thing is to test every step of the way as you develop your algorithm, so that when a mistake happens, you'll know what caused it.
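Here is a self-contained Swift sketch of the two blending steps above; the type and parameter names are illustrative, and computeFromAccelMag is assumed to exist elsewhere. The gyro term gives a smooth short-term estimate, and the accel/mag yaw pulls it back toward an absolute, drift-free heading.

// Complementary filter for yaw: gyro integration blended with an
// absolute yaw measurement derived from accelerometer + magnetometer.
final class ComplementaryYawFilter {
    private(set) var yaw = 0.0    // fused yaw estimate, radians
    private let factor = 0.2      // weight of the accel/mag measurement; tune between 0 and 1

    // gyroZ: rotation rate about the vertical axis, rad/s
    // dt: seconds since the previous gyro sample
    // accelMagYaw: yaw computed from accelerometer + magnetometer, radians
    func update(gyroZ: Double, dt: Double, accelMagYaw: Double) {
        let gyroYaw = yaw + gyroZ * dt                        // propagate with the gyro
        yaw = accelMagYaw * factor + gyroYaw * (1 - factor)   // blend in the absolute measurement
    }
}

With factor = 0.2 the gyro dominates sample to sample, while the accel/mag yaw slowly corrects the accumulated drift.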

Differences between using AccelerometerData and gyroData to get phone absolute rotation?

I'm getting the phone's absolute rotation (minus the z-axis) using the phone's accelerometer with something like this:
motionManager.startAccelerometerUpdates()
...
if let data = motionManager.accelerometerData {
    let x = data.acceleration.x
    let y = data.acceleration.y
}
I know it can also be obtained using motionManager.startGyroUpdates() as stated in this answer:
Obtain absolute rotation using CMDeviceMotion?
I'd like to know what the differences are between using the accelerometer and the gyroscope for this goal. Is one faster, more precise, or less resource-hungry than the other?
The accelerometer measures linear acceleration, i.e. the rate of change of velocity, along the x, y, and z axes. It is mostly used for translation along those axes.
The gyroscope measures the rotational rate of change about the x, y, and z axes.
In your case, if you want the most precise data for rotation, you really should be using the gyroscope and its associated data.
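As a minimal sketch (assumed setup, not from the original answer), the raw gyro rotation rates can be read from the same CMMotionManager. Note these are angular velocities in rad/s, so obtaining an absolute rotation still requires integrating them over time, or simply using the fused deviceMotion attitude instead:

import CoreMotion

let motionManager = CMMotionManager()
motionManager.gyroUpdateInterval = 0.1
motionManager.startGyroUpdates()
// ... later, e.g. on each display-link tick:
if let data = motionManager.gyroData {
    let rx = data.rotationRate.x   // rotation rate about x, rad/s
    let ry = data.rotationRate.y   // about y
    let rz = data.rotationRate.z   // about z
}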

Gyro angle way off on iOS

When I start the motion manager, keeping the phone basically still in my hand, I get erroneous values for the attitude. To get the rotation value, I use the CMAttitude object:
CMDeviceMotionHandler motionHandler = ^(CMDeviceMotion *motion, NSError *error) {
    [self calculateNewPosition:motion];
    _rotationMatrix = [self rotationToMat:[motion attitude].rotationMatrix];
};
[_motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                    toQueue:_motionQueue
                                                withHandler:motionHandler];
Now, I know there is noise in the measurements from such tiny gyros, and the gravity vector probably needs calibration, but this seems to be off by too much. After 0.5-1 seconds, the rotation values go from 0 to over 20°?! Two examples for roll, pitch, and yaw:
-1.001736 22.637596 -0.197573
-0.095075 29.075712 -0.014112
If it were the position drifting when I use double integration, I would understand, but the rotation comes directly from the sensors?
Do you have any idea why this happens?
Looking at the pictures in this question: Gyroscope on iPhone, and testing a bit more, I just realized that the values are zero for a little while when the gyro starts; then, according to my configured coordinate frame (Z vertical), the values are adjusted to the current position of the phone.
So if I started with the phone in my hand at a 20° pitch, the value for pitch will be 0 for a short time and then jump to 20°. This means I have to wait for the rotation matrix to become non-zero before I start tracking rotations.
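A minimal sketch of that wait, under the poster's observation that Core Motion reports an all-zero rotation matrix until the attitude has settled (the helper name is illustrative):

import CoreMotion

// Returns true once the attitude estimate is live; an all-zero matrix is
// treated as "not ready yet", per the observation above.
func attitudeHasSettled(_ m: CMRotationMatrix) -> Bool {
    return m.m11 != 0 || m.m12 != 0 || m.m13 != 0 ||
           m.m21 != 0 || m.m22 != 0 || m.m23 != 0 ||
           m.m31 != 0 || m.m32 != 0 || m.m33 != 0
}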

iOS: Can I get the pitch/yaw/roll from accelerometer data?

I want to find the pitch, yaw, and roll on an iPad 1. Since there is no deviceMotion facility, can I get this data from the accelerometer? I assume I can compare the vector it returns against a reference vector, i.e. gravity.
Does iOS detect when the device is still and then take that as the gravity vector, or do I have to do that myself?
Thanks.
It's definitely possible to calculate pitch and roll from accelerometer data, but yaw requires more information (a gyroscope for sure, though a compass could possibly be made to work).
For an example, look at Hungry Shark for iOS. Based on how their tilt-calibration UI works, I'm pretty sure they're using the accelerometer instead of the gyroscope.
Also, here are some formulas I found in a blog post from Taylor-Robotic for calculating pitch and roll:
Now that we have the 3 outputs expressed in g, we should be able to calculate the pitch and the roll. This requires two further equations.
pitch = atan(x / sqrt(y^2 + z^2))
roll = atan(y / sqrt(x^2 + z^2))
This will produce the pitch and roll in radians; to convert them into friendly degrees we multiply by 180, then divide by PI.
pitch = (pitch * 180) / PI
roll = (roll * 180) / PI
The thing I'm still looking for is how to calibrate the pitch and roll values based on how the user is holding the device. If I can't figure it out soon, I may open up a separate question. Good Luck!
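A minimal Swift sketch (the function name is an assumption) applying the formulas above to a CMAccelerometerData sample; gravity is the reference vector here, so the result is only meaningful while the device is roughly still:

import CoreMotion

// Computes pitch and roll in degrees from a raw accelerometer sample,
// using gravity as the reference vector.
func pitchAndRollDegrees(from data: CMAccelerometerData) -> (pitch: Double, roll: Double) {
    let x = data.acceleration.x
    let y = data.acceleration.y
    let z = data.acceleration.z
    let pitch = atan(x / sqrt(y * y + z * z))    // radians
    let roll  = atan(y / sqrt(x * x + z * z))    // radians
    return (pitch * 180 / .pi, roll * 180 / .pi) // convert to friendly degrees
}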
