Is the iPad accelerometer normalized to gravity? - ios

While porting my Android app to iOS I was confused by one thing, so I want to find out: am I right, or does my code work wrong? On Android, the device accelerometer returns values in physical units: m/s². On the iPad I get a total force of approximately 1.0 for a still device (and I expected 9.8). My first explanation is that the returned value is normalized to 9.8, so I must multiply it by 9.8 to get the real force. My second idea is that my code is totally wrong, but that's hard to believe.

From the Docs on CMAcceleration:

CMAcceleration
The type of a structure containing 3-axis acceleration values.

typedef struct {
    double x;
    double y;
    double z;
} CMAcceleration;

X-axis acceleration in G's (gravitational force).
Y-axis acceleration in G's (gravitational force).
Z-axis acceleration in G's (gravitational force).
A G is a unit of gravitation force equal to that exerted by the earth's gravitational field (9.81 m/s²).
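In other words, CoreMotion readings are already expressed in multiples of g, so multiplying by roughly 9.81 recovers m/s². A minimal Swift sketch of that conversion (the motionManager instance, the 60 Hz rate, and the 9.81 constant are illustrative choices, not taken from the question):

import CoreMotion
import Foundation

let motionManager = CMMotionManager()
let standardGravity = 9.81 // m/s² per g; illustrative constant

if motionManager.isAccelerometerAvailable {
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // CoreMotion reports in g's; multiply to get m/s².
        let x = a.x * standardGravity
        let y = a.y * standardGravity
        let z = a.z * standardGravity
        // For a device at rest the magnitude should come out close to 9.81.
        let magnitude = (x * x + y * y + z * z).squareRoot()
        print(String(format: "x: %.2f  y: %.2f  z: %.2f  |a|: %.2f m/s²", x, y, z, magnitude))
    }
}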

Related

How to get the actual acceleration via CMMotionManager in iOS

I am getting x, y, and z device acceleration using startDeviceMotionUpdates(), and reading the userAcceleration data structure with a Timer. The Apple documentation states:
The total acceleration of the device is equal to gravity plus the acceleration the user imparts to the device.
The values I am getting, even if I jerk the phone around, are at most 5.7nnnn on the X axis, for example. Now, if gravitational acceleration is 9.81 m/sec squared, what does the value 5.7nnnn represent in m/sec squared? That is, how do I get the actual m/sec squared value from the raw axis values that userAcceleration gives? And how does one interpret the difference between acceleration and deceleration?
CoreMotion (CM) outputs acceleration in g's, so you need to multiply the values by ~9.81 m/s² to get m/s².
Also, CM acceleration readings are reversed compared to a more conventional accelerometer; i.e., when the device is stationary on a table, CM measures approximately -1.0 on the z axis, while a conventional accelerometer would measure approximately +9.81 m/s² (note that the former value is negative, while the latter is positive). So, if you multiply the CM readings by -9.81, you get intuitive results: a positive value along an axis means acceleration and a negative value means deceleration.
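A rough Swift sketch of that conversion, polling deviceMotion with a Timer as the question describes (the 0.1 s interval and the standardGravity constant are illustrative assumptions):

import CoreMotion
import Foundation

let motionManager = CMMotionManager()
let standardGravity = 9.81 // m/s² per g

motionManager.startDeviceMotionUpdates()

// Poll the latest sample on a timer, as in the question.
// The run loop keeps the scheduled timer alive.
let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
    guard let motion = motionManager.deviceMotion else { return }
    let ua = motion.userAcceleration // in g's, gravity already removed
    let g = motion.gravity           // in g's, roughly (0, 0, -1) lying face up
    // Multiply by standard gravity to get m/s².
    let ax = ua.x * standardGravity
    let ay = ua.y * standardGravity
    let az = ua.z * standardGravity
    print(String(format: "user accel: %.2f, %.2f, %.2f m/s²  gravity z: %.2f g",
                 ax, ay, az, g.z))
}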

Comparison: TYPE_ROTATION_VECTOR with Complementary Filter

I have been working on orientation estimation, and I need to estimate the correct heading while walking in a straight line. After facing some roadblocks, I started from the basics again.
I have implemented a Complementary Filter from here, which uses the Gravity vector obtained from Android (not raw acceleration), raw gyro data, and raw magnetometer data. I also apply a low-pass filter to the gyro and magnetometer data and use them as input.
The output of the Complementary filter is Euler angles, and I am also recording TYPE_ROTATION_VECTOR, which outputs the device orientation as a 4D quaternion.
So I thought I would convert the quaternions to Euler angles and compare them with the Euler angles obtained from the Complementary filter. The Euler angle output is shown below for the phone kept stationary on a table.
As can be seen, the Yaw values are off by a huge margin.
What am I doing wrong in this simple case, when the phone is stationary?
Then I walked around my living room and got the following output.
The shape of the Complementary filter output looks very good and is very close to that of Android, but the values are off by a huge margin.
Please tell me what I am doing wrong.
I don't see any need to apply a low-pass filter to the Gyro. Since you're integrating the gyro to get rotation, it could mess everything up.
Be aware that TYPE_GRAVITY is a composite sensor reading synthesized from gyro and accel inside Android's own sensor fusion algorithm. Which is to say that this has already been passed through a Kalman filter. If you're going to use Android's built-in sensor fusion anyway, why not just use TYPE_ROTATION_VECTOR?
Your angles are in radians by the looks of it, and the error in the first set wasn't too far from 90 degrees. Perhaps you've swapped X and Y in your magnetometer inputs?
Here's the approach I would take: first write a test that takes accel and mag readings and synthesizes Euler angles from them, ignoring the gyro for now. Walk around the house and confirm that it does the right thing, but is jittery.
Next, slap an aggressive low-pass filter on your algorithm, e.g.
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
Confirm that this still works. It should be much less jittery but also sluggish.
Finally, add gyro and make a complementary filter out of it.
dt = time_since_last_gyro_update;
yaw += gyroData[2] * dt; // test: might need to subtract instead of add
yaw0 = yaw;
yaw = computeFromAccelMag(); // yaw in radians
factor = 0.2; // between 0 and 1; experiment
yaw = yaw * factor + yaw0 * (1-factor);
The key thing is to test every step of the way as you develop your algorithm, so that when a mistake happens, you'll know what caused it.

How to measure user distance from wall

I need to measure the distance of a wall from the user. When the user opens the camera and points it at any surface, I need to get the distance. I have read the linked question Is it possible to measure distance to object with camera? and I used the code for finding the iPhone camera angle from here: http://blog.sallarp.com/iphone-accelerometer-device-orientation.
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Get the current device angle
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float angle = atan2(yy, xx);
}
d = h * tan(angle)
But nothing happens in the NSLog output or with the camera.
In the comments, you shared a link to a video: http://youtube.com/watch?v=PBpRZWmPyKo.
That app is not doing anything particularly sophisticated with the camera; rather, it appears to calculate distances using basic trigonometry, and it accomplishes this by constraining the business problem in several critical ways:
First, the app requires the user to specify the height at which the phone's camera lens is being held.
Second, the user is measuring the distance to something sitting on the ground and aligning the bottom of that to some known location on the screen (meaning you have a right triangle).
Those two constraints, combined with the accelerometer and the camera's lens focal length, would allow you to calculate the distance.
If your target cross-hair is in the center of the screen, the problem is greatly simplified and becomes a matter of simple trigonometry, i.e. your d = h * tan(angle).
BTW, the "angle" code in the question measures rotation about the z-axis, i.e. the clockwise/counter-clockwise rotation as the device faces you. For this problem, though, you want to measure the rotation of the device about its x-axis, the forward/backward tilt. See https://stackoverflow.com/a/16555778/1271826 for an example of how to capture the device orientation in space. Also, that answer uses CoreMotion, whereas the article referenced in your question uses an API that has since been deprecated.
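A rough Swift/CoreMotion sketch of that trigonometry (the cameraHeight value, the 30 Hz rate, and the use of attitude.pitch for the forward/backward tilt are assumptions for illustration; exactly which angle you feed into tan depends on how the device is held and on your reference frame):

import CoreMotion
import Foundation

let motionManager = CMMotionManager()
let cameraHeight = 1.5 // metres; in the linked app this is supplied by the user

motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let attitude = motion?.attitude else { return }
    // Forward/backward tilt of the device, in radians (assumed to be pitch here).
    let tilt = attitude.pitch
    // With the cross-hair centred on the point where the wall meets the
    // floor, the right triangle gives d = h * tan(angle).
    let distance = cameraHeight * tan(tilt)
    print(String(format: "tilt: %.1f°  distance: %.2f m", tilt * 180 / .pi, distance))
}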
The only way this would be possible is if you could read out the setting of the auto-focus mechanism in the lens. To my knowledge this is not possible.

UIAcceleration value range

The Apple Developer guide seems to imply that UIAccelerationValue ranges between double values of -1.0 and +1.0.
I have logged the values from a real device whilst "shaking" my iPod touch with crazy gestures and got x values above 2.0 (e.g. +2.1, -2.1) and NO y values above 2.0.
Could anyone explain this?
Has anyone identified the MAX and MIN values for UIAccelerationValue?
My take on this is that Apple has implemented some algorithm that normalizes the readings against the force of gravity, so that 1.0 corresponds to roughly one standard gravity (about 9.8 m/s²).
Any other guesses?
You may be misunderstanding a part of the documentation. Nowhere does it say that the value ranges between -1.0 and +1.0, as far as I can see. It says that:
The device accelerometer reports values for each axis in units of g-force, where a value of 1.0 represents acceleration of about +1 g along a given axis. When a device is laying still with its back on a horizontal surface, each acceleration event has approximately the following values: x: 0, y: 0, z: -1.
"g" is used in a particular technical sense here; 1 g is one standard gravity; a phone accelerating faster than this will register readings higher than 1. Violent shaking in the hand is easily enough to cause acceleration and deceleration values higher than 9.8m/s2.

iOS Accelerometer/Gyroscope Question

I want to write an app that displays the device's angle relative to some reference (the bottom of the phone).
For example, if I'm holding the phone at a 45-degree angle, I want to display "45 degrees" on the screen. If the user holds the phone at 45 degrees and rotates it around an axis going from the ear piece to the home button, I want to display that angle (between 0 and 180 degrees).
I've implemented the accelerometer and I get the x, y, z values; however, how do I convert them? I know they are in G's (1 G, 0.9 G, -0.5 G on the respective axes), but what's the conversion? Am I even on the right track? Should I be using the gyroscope instead?
Thanks.
This question has an example. You can use atan2(y, x) and convert from radians to degrees with * (180/M_PI).
For any real arguments x and y not both equal to zero, atan2(y, x) is the angle in radians between the positive x-axis of a plane and the point given by the coordinates (x, y) on it.
- Wikipedia article on atan2
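For example, in Swift (the sample x and y values here are made up; which accelerometer axes you pass to atan2 depends on which rotation you want to measure):

import Foundation

// Made-up accelerometer components, in g's, for the plane of interest.
let x = -0.707
let y = 0.707

let radians = atan2(y, x)            // angle from the positive x-axis, in radians
let degrees = radians * 180.0 / .pi  // same conversion as * (180 / M_PI)
print(degrees)                       // ≈ 135 for this sample input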
If you can rely on gyroscope support, I'd recommend using it, because you can get the (Euler) angles directly without any calculations. See iOS - gyroscope sample and follow the links inside.
Don't use UIAccelerometer because it will be deprecated soon. The newer CoreMotion framework is always the better choice, even for old devices.
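A minimal sketch of that approach with CoreMotion's device motion (Swift; treating roll as the rotation about the device's long, ear-piece-to-home-button axis is my reading of the CMAttitude conventions, so verify it on a device):

import CoreMotion
import Foundation

let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // CMAttitude reports Euler angles in radians; convert to degrees.
        let pitch = attitude.pitch * 180 / .pi // tilt about the lateral (x) axis
        let roll  = attitude.roll  * 180 / .pi // rotation about the long (y) axis
        print(String(format: "pitch: %.0f°  roll: %.0f°", pitch, roll))
    }
}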
