Transforming Pitch and Roll values in iOS

I have been trying to compare pitch and roll patterns (2-second recordings).
I recorded pitch and roll values for 10 seconds, applying a similar motion to the device several times during that window.
When I plot these on a graph, the values match.
However, this only holds when the motion is applied with the device in the same orientation/position. For example, if the device is lying on a table and a motion is applied, all the readings match.
But if the device is rotated 180 deg. counterclockwise, the readings are inverted.
Is there a way to get the same readings for every position, perhaps by applying some transformation formula? I have done this for acceleration values using pitch, roll, and yaw, but I don't know how to achieve it for pitch and roll themselves.
Basically, what I want is for the pitch and roll values to be independent of yaw.
Here is the plot for the pitch values: all the readings match except the two starting from 1.5 in the graph. Those were the two recordings where the device was rotated 180 deg. counterclockwise.
UPDATE:
I tried storing the first CMAttitude in NSUserDefaults and then applying multiplyByInverseOfAttitude:. But the graph plot is still inverted.
CMAttitude *currentAtt = motion.attitude;
if (firstAttitude) // firstAttitude is the stored CMAttitude
{
    // Express the current attitude relative to the stored reference attitude
    [currentAtt multiplyByInverseOfAttitude:firstAttitude];
}
CMQuaternion quat = currentAtt.quaternion;
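One way to get pitch and roll that are independent of yaw is to derive them from the gravity vector that CMDeviceMotion already provides: gravity is expressed in device coordinates, so rotating the device about the vertical axis leaves it unchanged. A minimal sketch, assuming a CMDeviceMotion sample called motion; the atan2 formulas mirror the accelerometer tilt equations quoted in the last answer below, and the signs may need flipping to match your convention:

#import <CoreMotion/CoreMotion.h>
#include <math.h>

// Hypothetical helper: yaw-independent tilt angles from the gravity vector.
// motion.gravity is in device coordinates, so these values are unaffected
// by rotation about the vertical (yaw) axis.
static void tiltFromGravity(CMDeviceMotion *motion, double *pitch, double *roll)
{
    CMAcceleration g = motion.gravity;
    // Pitch: rotation about the device's lateral (x) axis.
    *pitch = atan2(-g.y, sqrt(g.x * g.x + g.z * g.z));
    // Roll: rotation about the device's longitudinal (y) axis.
    *roll = atan2(-g.x, -g.z);
}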

Related

Euler angles to rotation matrix manual transformation for iOS devices

This is a small amount of background and an introduction to the problem:
I have some functionality in my motion- and location-based iOS app which needs a rotation matrix as an input. Some graphical output depends on this matrix, and it changes with every movement of the device. This is the part of the code that does that:
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                   toQueue:motionQueue
                                               withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // get and process matrix data
}];
With this API, only 4 reference frames are available:
XArbitraryZVertical
XArbitraryCorrectedZVertical
XMagneticNorthZVertical
XTrueNorthZVertical
I need a different reference, e.g. a gyroscope value instead of north, and these frames cannot offer exactly what I want.
To reach my goal, I use the following structure:
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryCorrectedZVertical
                                                   toQueue:motionQueue
                                               withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // get Euler angles and transform them into a rotation matrix
}];
You may ask why I do not use the built-in rotation matrix. The answer is simple: I need to build a kind of custom reference frame, and I can do that by feeding in modified angle values.
The problem:
To get a rotation matrix from Euler angles, we build a matrix for each angle and then multiply them. In the 3D case there is one matrix per axis (three in total), and the output depends on the order of multiplication: XYZ is not equal to ZYX. Wikipedia lists 12 variants, and I do not know which one iOS uses. I need to know in which order to multiply the matrices, and which angle corresponds to which axis (for example, X - roll, Y - pitch, Z - yaw).
Apple solved this problem years ago, but I do not have access to the .m files, so I do not know which multiplication order is the right one for an iOS device.
A similar question was published here, but the order from the math example in that solution does not work for me.
Regarding which angles relate to which axis, see:
https://developer.apple.com/documentation/coremotion/getting_processed_device-motion_data/understanding_reference_frames_and_device_attitude
Regarding the rotation order for calculating the rotation matrix from Euler angles (pitch, roll, yaw):
Short answer: ZXY is the rotation order on iOS.
I kept searching for this answer too and got tired; I'm not sure why it isn't documented somewhere easy to look up. So I decided to collect empirical data and test which rotation order best matches the reported values. My results are below.
Methodology:
Wrote a small iPhone app that returns quaternion values and the corresponding pitch, roll, and yaw angles
Computed pitch, roll, and yaw values from the quaternions for various rotation orders (XYZ, XZY, YZX, YXZ, ZYX, ZXY)
Calculated the RMS error with respect to the pitch, roll, and yaw values reported by iOS device motion, and identified the order with the least error
Results:
The rotation orders ZYX and ZXY both returned values very close to the iOS-reported values. However, the error for ZXY was roughly 46-597x lower than for ZYX in every case. Hence I believe ZXY is the rotation order.
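To make that concrete, here is a minimal sketch of building the rotation matrix in ZXY order, assuming the usual iOS axis mapping (pitch about x, roll about y, yaw about z), i.e. R = Rz(yaw) * Rx(pitch) * Ry(roll). Verify it against CMAttitude's rotationMatrix rather than taking it as Apple's exact implementation:

#include <math.h>

// Rotation matrix (row-major 3x3) from Euler angles, ZXY order:
// R = Rz(yaw) * Rx(pitch) * Ry(roll), angles in radians.
static void rotationMatrixZXY(double pitch, double roll, double yaw,
                              double R[3][3])
{
    double cx = cos(pitch), sx = sin(pitch); // rotation about x
    double cy = cos(roll),  sy = sin(roll);  // rotation about y
    double cz = cos(yaw),   sz = sin(yaw);   // rotation about z

    // The product Rz * Rx * Ry, expanded by hand.
    R[0][0] = cz * cy - sz * sx * sy;  R[0][1] = -sz * cx;  R[0][2] = cz * sy + sz * sx * cy;
    R[1][0] = sz * cy + cz * sx * sy;  R[1][1] =  cz * cx;  R[1][2] = sz * sy - cz * sx * cy;
    R[2][0] = -cx * sy;                R[2][1] =  sx;       R[2][2] = cx * cy;
}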

Comparison: TYPE_ROTATION_VECTOR with Complementary Filter

I have been working on orientation estimation, and I need to estimate the correct heading while walking in a straight line. After hitting some roadblocks, I started from the basics again.
I have implemented a complementary filter from here, which uses the gravity vector obtained from Android (not raw acceleration), raw gyro data, and raw magnetometer data. I am also applying a low-pass filter to the gyro and magnetometer data and using that as input.
The output of the complementary filter is Euler angles, and I am also recording TYPE_ROTATION_VECTOR, which reports device orientation as a 4D quaternion.
So I thought I would convert the quaternions to Euler angles and compare them with the Euler angles obtained from the complementary filter. The Euler angle output when the phone is kept stationary on a table is shown below.
As can be seen, the yaw values are off by a huge margin.
What am I doing wrong in this simple case where the phone is stationary?
Then I walked around my living room and got the following output.
The shape of the complementary filter output looks very good, and it is very close to Android's. But the values are off by a huge margin.
What am I doing wrong?
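For reference, a minimal sketch of one common quaternion-to-Euler conversion (the ZYX, yaw-pitch-roll convention; Android's rotation vector is a unit quaternion with components x, y, z, w). The convention must match the one used by the complementary filter, otherwise the angles will disagree even when both orientations are correct:

#include <math.h>

// Unit quaternion (w, x, y, z) to Euler angles, ZYX convention, radians.
static void quaternionToEulerZYX(double w, double x, double y, double z,
                                 double *yaw, double *pitch, double *roll)
{
    *yaw   = atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    *pitch = asin(2.0 * (w * y - z * x));  // clamp the argument to [-1, 1] in production
    *roll  = atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y));
}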
I don't see any need to apply a low-pass filter to the gyro. Since you're integrating the gyro to get rotation, filtering it could mess everything up.
Be aware that TYPE_GRAVITY is a composite sensor reading synthesized from gyro and accel inside Android's own sensor fusion algorithm. Which is to say that this has already been passed through a Kalman filter. If you're going to use Android's built-in sensor fusion anyway, why not just use TYPE_ROTATION_VECTOR?
Your angles are in radians by the looks of it, and the error in the first set wasn't too far from 90 degrees. Perhaps you've swapped X and Y in your magnetometer inputs?
Here's the approach I would take: first write a test that takes accel and mag readings and synthesizes Euler angles from them. Ignore the gyro for now. Walk around the house and confirm that it does the right thing, but is jittery.
Next, slap an aggressive low-pass filter on your algorithm, e.g.
yaw0 = yaw;                               // previous filtered yaw
yaw = computeFromAccelMag();              // new estimate, in radians
factor = 0.2;                             // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1 - factor); // blend new estimate with history
Confirm that this still works. It should be much less jittery but also sluggish.
Finally, add gyro and make a complementary filter out of it.
dt = time_since_last_gyro_update;
yaw += gyroData[2] * dt;                  // integrate gyro; test: might need to subtract instead of add
yaw0 = yaw;
yaw = computeFromAccelMag();              // yaw in radians
factor = 0.2;                             // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1 - factor); // blend accel/mag estimate with gyro-propagated yaw
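Putting those steps together, a self-contained sketch in C; computeFromAccelMag is the same hypothetical helper used in the pseudocode above, and both the gyro sign and the filter factor need experimentation as noted:

#include <math.h>

// Assumed to exist elsewhere: yaw (radians) from accelerometer + magnetometer alone.
extern double computeFromAccelMag(void);

static double yaw = 0.0; // filter state, radians

// Call once per gyro sample: gyroZ is the angular rate about z (rad/s),
// dt the time since the previous sample (s).
static void updateYaw(double gyroZ, double dt)
{
    const double factor = 0.2;            // 0..1; weight of the accel/mag estimate
    yaw += gyroZ * dt;                    // gyro prediction (may need a sign flip)
    double yawAM = computeFromAccelMag(); // jittery but drift-free estimate
    yaw = yawAM * factor + yaw * (1.0 - factor);
}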
The key thing is to test every step of the way as you develop your algorithm, so that when a mistake happens, you'll know what caused it.

AVMetadataFaceObject Precision

I'm trying to use AVMetadataFaceObject to get the yaw and roll of a face in a video. From what I can tell, the yaw is reported in 45-degree increments and the roll in 30-degree increments.
Is there a way to increase this precision?
(Code as seen in Proper usage of CIDetectorTracking).
You can get the rectangles of the eyes and calculate the angle yourself. You should investigate the changes made here in iOS 7, as there are many improvements in this area.
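A minimal sketch of that idea using Core Image's face detector, which exposes eye positions (unlike the coarse AVMetadataFaceObject values): the roll is just the angle of the line between the eyes. This is an illustration of the suggestion, not the metadata API itself:

#import <CoreImage/CoreImage.h>
#include <math.h>

// Hypothetical helper: estimate face roll (radians) from a CIFaceFeature.
static double faceRollFromFeature(CIFaceFeature *face)
{
    if (!face.hasLeftEyePosition || !face.hasRightEyePosition) {
        return 0.0; // no eye data available; fall back to zero
    }
    CGPoint l = face.leftEyePosition;
    CGPoint r = face.rightEyePosition;
    // Angle of the inter-eye line relative to horizontal.
    return atan2(r.y - l.y, r.x - l.x);
}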

Sphero concept of "Shake"?

What would be the best way to tell that the user is shaking the Sphero?
I need to differentiate between the user tilting the Sphero left/right/up/down and shaking it rapidly a few times in any direction.
Is there a sample project that would be good to look at?
If you're collecting the filtered accelerometer values as well as the "IMU" values, the accelerometer values are best for detecting shaking, while the IMU values (roll, pitch, yaw) are best for detecting tilt.
If you don't care which axis the shaking is on, combine the axes by taking the square root of the sum of their squares: sqrt(x^2 + y^2 + z^2) > 2000. This gives you the magnitude of the acceleration vector. It's a good value for "general acceleration-ness", and it's great for detecting shaking.
If you want to isolate which axis it is being shaken on, then for each axis check whether its absolute acceleration is above a threshold: abs(x) > 2000, since the positive or negative value of a single axis is its own vector magnitude.
Then, just use the IMU data's roll, pitch and yaw values to determine the tilt of the Sphero.
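A minimal sketch of that shake test, assuming raw accelerometer components in the same units as the 2000 threshold above (the actual Sphero SDK callback types are not shown here):

#include <math.h>
#include <stdbool.h>

// Detect a shake from raw accelerometer components; tune the threshold.
static bool isShaking(double x, double y, double z)
{
    double magnitude = sqrt(x * x + y * y + z * z); // overall acceleration
    return magnitude > 2000.0;
}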

iOS: Can I get the pitch/yaw/roll from accelerometer data?

I want to find out the pitch, yaw, and roll on an iPad 1. Since there is no deviceMotion facility, can I get this data from the accelerometer? I assume I can compare the vector it returns against a reference vector, i.e. gravity.
Does iOS detect when the device is still and take that as the gravity vector, or do I have to do that myself?
Thanks.
It's definitely possible to calculate pitch and roll from accelerometer data, but yaw requires more information (a gyroscope for sure, though a compass could possibly be made to work).
For an example, look at Hungry Shark for iOS. Based on how their tilt-calibration UI works, I'm pretty sure they're using the accelerometer instead of the gyroscope.
Also, here are some formulas I found in a blog post from Taylor-Robotic for calculating pitch and roll:
Now that we have 3 outputs expressed in g, we should be able to calculate the pitch and the roll. This requires two further equations.
pitch = atan(x / sqrt(y^2 + z^2))
roll  = atan(y / sqrt(x^2 + z^2))
This will produce the pitch and roll in radians; to convert them into friendly degrees we multiply by 180, then divide by PI.
pitch = (pitch * 180) / PI
roll  = (roll * 180) / PI
The thing I'm still looking for is how to calibrate the pitch and roll values based on how the user is holding the device. If I can't figure it out soon, I may open a separate question. Good luck!
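A minimal sketch of those formulas using the plain accelerometer API that an iPad 1 supports (no deviceMotion). atan2 is used instead of atan so the axis signs are preserved; treat this as an illustration of the equations above rather than a calibrated solution:

#import <CoreMotion/CoreMotion.h>

// Sketch: pitch and roll (degrees) from raw accelerometer samples.
CMMotionManager *manager = [[CMMotionManager alloc] init];
manager.accelerometerUpdateInterval = 1.0 / 60.0;
[manager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                              withHandler:^(CMAccelerometerData *data, NSError *error) {
    if (!data) return;
    double x = data.acceleration.x;
    double y = data.acceleration.y;
    double z = data.acceleration.z;
    // Same equations as quoted above, with atan2 to keep quadrant information.
    double pitch = atan2(x, sqrt(y * y + z * z)) * 180.0 / M_PI;
    double roll  = atan2(y, sqrt(x * x + z * z)) * 180.0 / M_PI;
    NSLog(@"pitch: %.1f  roll: %.1f", pitch, roll);
}];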
