When I start the motion manager, keeping the phone basically still in my hand, I get erroneous values for the attitude. To get the rotation value, I use the CMAttitude object:
CMDeviceMotionHandler motionHandler = ^(CMDeviceMotion *motion, NSError *error) {
    [self calculateNewPosition:motion];
    _rotationMatrix = [self rotationToMat:[motion attitude].rotationMatrix];
};
[_motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical
                                                    toQueue:_motionQueue
                                                withHandler:motionHandler];
Now, I know there is noise in the measurements of these tiny gyros, and the gravity vector probably needs calibration, but this seems far too much off: after 0.5-1 seconds, the rotation values go from 0 to over 20°. Two examples for roll, pitch and yaw:
-1.001736 22.637596 -0.197573
-0.095075 29.075712 -0.014112
If it were the position drifting, since I use double integration, I would understand, but the rotation coming directly from the sensors?
Do you have any idea why this happens?
I just realized, after looking at the pictures in this question (Gyroscope on iPhone) and testing a bit more, that the values are zero for a little while when the gyro starts; then, according to my configured reference frame (Z vertical), the values are adjusted to the current position of the phone.
So if I start with the phone in my hand at a 20° pitch, the pitch value will be 0 for a short time and then jump to 20°. This means I have to wait for the rotation matrix to become non-zero before I start tracking rotations.
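A minimal Swift sketch of that wait. Since a "zero rotation" attitude corresponds to the identity rotation matrix, the check below looks for a departure from the identity; the tolerance, the hasValidAttitude flag and the manager/queue names are illustrative, not from the original code:

import CoreMotion

let motionManager = CMMotionManager()
let motionQueue = OperationQueue()
var hasValidAttitude = false

motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: motionQueue) { motion, error in
    guard let motion = motion else { return }
    let m = motion.attitude.rotationMatrix

    // Skip the first samples: right after starting, the attitude is still the
    // identity (all rotations zero) until the filter locks onto the actual
    // orientation of the device.
    if !hasValidAttitude {
        let isIdentity = abs(m.m11 - 1) < 1e-6 && abs(m.m22 - 1) < 1e-6 && abs(m.m33 - 1) < 1e-6
            && abs(m.m12) < 1e-6 && abs(m.m13) < 1e-6 && abs(m.m23) < 1e-6
        if isIdentity { return }
        hasValidAttitude = true
    }

    // From here on, the rotation matrix reflects the real device attitude.
    // ... track rotations ...
}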
I'm currently using the CMMotionManager attitude to get roll, pitch and yaw. However, after running for a few minutes, the yaw drifts and becomes inaccurate.
I read that there is a way to calculate the yaw using a combination of the accelerometer and magnetometer that keeps it accurate by compensating for the constant drift; however, I haven't found a working formula yet.
Here is a little part of my code that I use for getting motion updates.
motionManager.deviceMotionUpdateInterval = 0.1
motionManager.showsDeviceMovementDisplay = true
motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: OperationQueue.current!)
{ deviceManager, error in
    if let deviceManager = deviceManager {
        let roll = deviceManager.attitude.roll
        let pitch = deviceManager.attitude.pitch
        let yaw = deviceManager.attitude.yaw // Drifts over time
        ...
Any idea?
UPDATE
I could compensate for the yaw drift using the following formula (a sketch in code follows these steps):
Create an initial reference point (fixed world-frame reference) using the magnetometer:
m_w = (m_x, m_y, m_z)
Take the current magnetometer reading (also a fixed world-frame reference):
n_w = (n_x, n_y, n_z)
Convert the projected readings to angles:
a = atan2(m_z, m_x)
b = atan2(n_z, n_x)
The yaw drift can then be calculated as follows:
y_d = (a - b) * 180 / PI
Now subtract this result from the current yaw.
Voilà!
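A minimal Swift sketch of those steps, under the assumption that the reference and current vectors come from CMDeviceMotion.magneticField.field (how the "fixed world frame" vectors are obtained, and the variable names, are illustrative, not from the original post):

import CoreMotion
import Foundation

// Magnetometer vector captured once at start-up (the reference point m_w).
var referenceField: CMMagneticField?

func driftCompensatedYawDegrees(from motion: CMDeviceMotion) -> Double {
    let current = motion.magneticField.field          // n_w
    if referenceField == nil { referenceField = current }
    guard let reference = referenceField else { return motion.attitude.yaw * 180 / .pi }

    // Convert the projected readings (x-z plane) to angles.
    let a = atan2(reference.z, reference.x)
    let b = atan2(current.z, current.x)

    // Yaw drift in degrees, as in the formula above.
    let yawDrift = (a - b) * 180 / .pi

    // Subtract the drift from the yaw reported by the attitude (converted to degrees).
    let yawDegrees = motion.attitude.yaw * 180 / .pi
    return yawDegrees - yawDrift
}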
Don't you mean gyro? CMMotionManager has a mode in which it fuses three inputs: the accelerometer, the gyroscope and the magnetometer.
For this you have to set it up for "Device motion":
Device motion. Call the startDeviceMotionUpdates(using:) or startDeviceMotionUpdates() method to begin updates and periodically access CMDeviceMotion objects by reading the deviceMotion property. The startDeviceMotionUpdates(using:) method (new in iOS 5.0) lets you specify a reference frame to be used for the attitude estimates.
https://developer.apple.com/documentation/coremotion/cmmotionmanager
However, you must be aware that this increases the battery consumption quite a bit.
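A short Swift sketch of what that fused "device motion" mode gives you in a single callback; the reference frame and property names are standard Core Motion, the rest is illustrative:

import CoreMotion

let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.showsDeviceMovementDisplay = true   // prompts for compass calibration if needed
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                           to: OperationQueue.main) { motion, error in
        guard let motion = motion else { return }
        // Fused output from accelerometer + gyroscope + magnetometer:
        let attitude = motion.attitude          // roll, pitch, yaw / quaternion / rotation matrix
        let rotationRate = motion.rotationRate  // gyro data with bias removed
        let gravity = motion.gravity            // gravity vector from sensor fusion
        let field = motion.magneticField.field  // calibrated magnetic field
        _ = (attitude, rotationRate, gravity, field)
    }
}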
Here is a little background and an introduction to the problem:
I have some functionality in my motion- and location-based iOS app that needs a rotation matrix as an input. Some graphical output depends on this matrix, and it changes with every movement of the device. This is the part of the code that does that:
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                   toQueue:motionQueue
                                               withHandler:^(CMDeviceMotion *motion, NSError *error) {
                                                   // get and process matrix data
                                               }];
In this structure only 4 frames are available:
XArbitraryZVertical
XArbitraryCorrectedZVertical
XMagneticNorthZVertical
XTrueNorthZVertical
I need another reference, e.g. a gyroscope value instead of north, and these frames cannot offer exactly what I want.
In order to reach my goal, I use the following structure:
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryCorrectedZVertical
                                                   toQueue:motionQueue
                                               withHandler:^(CMDeviceMotion *motion, NSError *error) {
                                                   // get Euler angles and transform them into a rotation matrix
                                               }];
You may ask why I do not use the built-in rotation matrix. The answer is simple: I need to build my own kind of reference frame, and I can do that by feeding in modified angle values.
The problem:
In order to get a rotation matrix from Euler angles, we build a matrix for each angle and then multiply them. In the 3D case there is one matrix per axis (three in total), and the matrices are multiplied together. The problem is that the output depends on the order of multiplication: XYZ is not equal to ZYX. Wikipedia tells me there are 12 variants, and I do not know which one is the right one for the iOS implementation. I need to know in which order to multiply them. In addition, I need to know which angle corresponds to which axis, for example X - roll, Y - pitch, Z - yaw.
Actually, this problem was solved by Apple years ago, but I do not have access to the .m files, so I do not know which order of multiplication is the right one for an iOS device.
A similar question was published here, but the order from the math example in that solution does not work for me.
Regarding which angle relates to which axis, see this:
https://developer.apple.com/documentation/coremotion/getting_processed_device-motion_data/understanding_reference_frames_and_device_attitude
Regarding rotation order for calculating rotation matrix & Euler angles (Pitch, Roll, Yaw)
Short Answer: ZXY is the rotation order on iOS.
I kept searching for this answer too and got tired; I am not sure why this is not documented somewhere easy to look up. So I decided to collect empirical data and test which rotation order best matches the values. My results are below.
Methodology:
Wrote a small iPhone App to return quaternion values & corresponding pitch, roll, yaw angles
Computed pitch, roll, yaw values from the quaternions for various rotation orders (XYZ, XZY, YZX, YXZ, ZYX, ZXY)
Calculated the RMS error with respect to the pitch, yaw, roll values reported by iOS device motion, and identified the rotation order with the least error.
Results:
Rotation orders ZYX and ZXY both returned values very close to the iOS-reported values. However, the error for ZXY was roughly 46-597x lower than for ZYX in every case. Hence I believe ZXY is the rotation order.
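For reference, here is a minimal Swift sketch of composing per-axis rotation matrices in that ZXY order, i.e. R = Rz(yaw) · Rx(pitch) · Ry(roll) for column vectors, with pitch about X, roll about Y and yaw about Z as in the Apple reference-frame documentation linked above. Treat it as an illustration to compare against CMAttitude.rotationMatrix yourself, not as a statement of Apple's internal convention:

import simd
import Foundation

// Basic right-handed rotation matrices about each axis (angles in radians).
func rotX(_ a: Double) -> simd_double3x3 {   // pitch
    simd_double3x3(rows: [SIMD3(1, 0, 0),
                          SIMD3(0, cos(a), -sin(a)),
                          SIMD3(0, sin(a),  cos(a))])
}
func rotY(_ a: Double) -> simd_double3x3 {   // roll
    simd_double3x3(rows: [SIMD3( cos(a), 0, sin(a)),
                          SIMD3( 0,      1, 0),
                          SIMD3(-sin(a), 0, cos(a))])
}
func rotZ(_ a: Double) -> simd_double3x3 {   // yaw
    simd_double3x3(rows: [SIMD3(cos(a), -sin(a), 0),
                          SIMD3(sin(a),  cos(a), 0),
                          SIMD3(0,       0,      1)])
}

// ZXY order: roll (Y) applied first, then pitch (X), then yaw (Z).
func rotationMatrixZXY(pitch: Double, roll: Double, yaw: Double) -> simd_double3x3 {
    rotZ(yaw) * rotX(pitch) * rotY(roll)
}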
I have been trying to compare the pitch and roll reading patterns (a 2-second recording).
What I have done is record pitch and roll values for 10 seconds. During those 10 seconds, a similar motion is applied to the device each time.
When I plot these on a graph, the values come out the same.
This only happens when the motion is applied with the device in the same orientation/position. For example, the device is lying on the table and a motion is applied to it; all the readings are the same.
But if the device is rotated 180° counter-clockwise, the readings are inverted.
Is there a way I can get the same readings for every position, by applying some transformation formula? I have done this for the acceleration values using pitch, roll and yaw, but I don't know how to achieve it for pitch and roll themselves.
Basically, what I want is for the pitch and roll values to be independent of yaw.
Here is the plot for the pitch values: all the readings are the same except the two starting from 1.5 in the graph. Those were the two times when the device was rotated 180° counter-clockwise.
UPDATE:
I tried storing the CMAttitude in NSUserDefaults and then applying multiplyByInverseOfAttitude:, but the plot is still inverted.
CMAttitude *currentAtt = motion.attitude;
if (firstAttitude) // firstAttitude is the stored CMAttitude
{
    [currentAtt multiplyByInverseOfAttitude:firstAttitude];
}
CMQuaternion quat = currentAtt.quaternion;
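For comparison, here is a minimal Swift sketch of the relative-attitude technique the update above is attempting, with the reference attitude captured from a live CMDeviceMotion sample at the start of the recording rather than restored from NSUserDefaults (variable names and the reference frame are illustrative):

import CoreMotion

let motionManager = CMMotionManager()
var referenceAttitude: CMAttitude?

motionManager.deviceMotionUpdateInterval = 0.1
motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { motion, _ in
    guard let motion = motion else { return }

    // Capture the attitude of the very first sample as the reference frame.
    if referenceAttitude == nil {
        referenceAttitude = motion.attitude
        return
    }

    // Express the current attitude relative to that reference, so pitch/roll/yaw
    // are measured from the starting orientation.
    let relative = motion.attitude
    if let reference = referenceAttitude {
        relative.multiply(byInverseOf: reference)
    }
    let pitch = relative.pitch
    let roll = relative.roll
    _ = (pitch, roll)
}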
I need to measure the distance of a wall from the user. When the user opens the camera and points it at any surface, I need to get the distance. I have read this link: Is it possible to measure distance to object with camera?, and I used the code for finding the iPhone camera angle from here: http://blog.sallarp.com/iphone-accelerometer-device-orientation.
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Get the current device angle
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float angle = atan2(yy, xx);
}
d = h * tan(angle)
But nothing happens in the NSLog output or in the camera.
In the comments, you shared a link to a video: http://youtube.com/watch?v=PBpRZWmPyKo.
That app is not doing anything particularly sophisticated with the camera, but rather appears to calculate distances using basic trigonometry, and it accomplishes this by constraining the business problem in several critical ways:
First, the app requires the user to specify the height at which the phone's camera lens is being held.
Second, the user is measuring the distance to something sitting on the ground and aligning the bottom of that to some known location on the screen (meaning you have a right triangle).
Those two constraints, combined with the accelerometer and the camera's lens focal length, would allow you to calculate the distance.
If your target cross-hair is in the center of the screen, it greatly simplifies the problem: it becomes a matter of simple trigonometry, i.e. your d = h * tan(angle).
BTW, the "angle" code in the question appears to measure the rotation about the z-axis, the clockwise/counter-clockwise rotation as the device faces you. For this problem, though, you want to measure the rotation of the device about its x-axis, the forward/backward tilt. See https://stackoverflow.com/a/16555778/1271826 for example of how to capture the device orientation in space. Also, that answer uses CoreMotion, whereas the article referenced in your question is using an API that has since been deprecated.
The only way this would be possible is if you could read out the setting of the auto-focus mechanism in the lens. To my knowledge this is not possible.
I want my app to be able to detect a device rotation while the device is held horizontally. I could use readings from a compass, but I believe yaw values from the gyroscope would be more accurate. Given yaw readings, what would be the best algorithm to determine a 360° rotation (either clockwise or counter-clockwise)? And it has to be a full 360°, not just turning the phone 180° in one direction and back 180° in the opposite direction.
You would use Core Motion to get the rotation about the vertical axis and add up the deltas between rotation events. Every time a delta larger than some minimal value goes in a different direction than the previous one, you reset your starting point. Then, when you arrive at either plus or minus 360 degrees from that start point, you have the full rotation.
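A minimal Swift sketch of that idea, assuming the yaw values come from CMDeviceMotion.attitude; the unwrapping of the ±π jump, the threshold value and the type name are illustrative choices:

import CoreMotion

final class FullTurnDetector {
    private var previousYaw: Double?
    private var accumulated = 0.0          // signed rotation since the start point, in radians
    private let minimalDelta = 0.01        // deltas smaller than this are treated as noise

    /// Feed successive yaw values (radians); returns true when a full 360° turn is completed.
    func add(yaw: Double) -> Bool {
        defer { previousYaw = yaw }
        guard let previous = previousYaw else { return false }

        // Unwrap the jump at +/-pi so the delta is always the short way round.
        var delta = yaw - previous
        if delta > .pi { delta -= 2 * .pi }
        if delta < -.pi { delta += 2 * .pi }

        // A significant delta in the opposite direction: reset the start point.
        if abs(delta) > minimalDelta, accumulated != 0, delta.sign != accumulated.sign {
            accumulated = 0
        }

        accumulated += delta
        if abs(accumulated) >= 2 * .pi {   // plus or minus 360 degrees reached
            accumulated = 0
            return true
        }
        return false
    }
}

// Usage with device motion updates:
let detector = FullTurnDetector()
let motionManager = CMMotionManager()
motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { motion, _ in
    guard let motion = motion else { return }
    if detector.add(yaw: motion.attitude.yaw) { print("Full 360° rotation detected") }
}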
Here is an idea assuming that you can obtain the readout in short intervals, and that the yaw can be zeroed at a specific start point. This is different from the other answer, which detects full circles from a continuously adapted start point.
In this approach, we keep comparing the current yaw to the previous yaw, asking whether a checkpoint at 180 degrees (PI) has been passed. Initially, the checkpoint flag cp_pi is NO, and passing the checkpoint in either direction toggles its state. Note that yaw changes its sign in two places: at the zero point, and again where PI wraps to -PI.
Assuming your object has two properties that persist between ticks of the detector, BOOL cp_pi; and float prev_yaw;, the sign change gives a d_yaw smaller than PI when crossing 0 and larger than PI when crossing the opposite end of the circle. When crossing the opposite end, we toggle cp_pi. When cp_pi is YES while crossing 0, we are guaranteed to have passed a full circle, since otherwise cp_pi would have been toggled back to NO:
- (void)tick
{
    float yaw = [self zeroedYaw];

    // Skip the tick entirely if yaw sits exactly on one of the checkpoints.
    if ((fabs(yaw) == PI) || (yaw == 0.0f)) return;

    // The sign of yaw changed since the last tick, so a checkpoint was crossed.
    if (yaw * prev_yaw < 0)
    {
        float d_yaw = fabs(yaw - prev_yaw);
        if (d_yaw > PI)
        {
            // Crossed the PI / -PI boundary: toggle the halfway flag.
            cp_pi = !cp_pi;
        }
        else if (cp_pi)
        {
            // Crossed 0 with the halfway flag set: a full circle was completed.
            // fire detection event
        }
    }
    prev_yaw = yaw;
}
Note that in order to make our life easier, we skip the detector function entirely if yaw is sitting right on one of the checkpoints.