iOS Accelerometer/Gyroscope Question

I want to write an app that displays the phone's angle of rotation measured from some reference (the bottom of the phone).
For example, if I'm holding the phone at a 45 degree angle, I want to display 45 degrees on the screen. If the user holds the phone at 45 degrees and rotates it around an axis running from the ear piece to the home button, I want to display that angle (between 0 and 180 degrees).
I've implemented the accelerometer and I get the x, y, z values. How do I convert them, though? I know they are in G's (e.g. 1 G, 0.9 G, -0.5 G on the respective axes), but what's the conversion? Am I even on the right track? Should I be using the gyroscope instead?
Thanks.

This question has an example: you can use atan2(y, x) and convert the result from radians to degrees by multiplying by 180/M_PI.
For any real arguments x and y not both equal to zero, atan2(y, x) is the angle in radians between the positive x-axis of a plane and the point given by the coordinates (x, y) on it.
- Wikipedia article on atan2
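
For example, a minimal sketch of that calculation (the x and y values here are hypothetical accelerometer readings in G's, standing in for whatever your delegate receives):

#include <math.h>

// Hypothetical accelerometer readings in G's.
double x = 0.707;
double y = 0.707;

// atan2(y, x) is the angle of the point (x, y) from the positive x-axis, in radians.
double radians = atan2(y, x);
double degrees = radians * (180.0 / M_PI);   // 45 degrees for these readings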

If you can rely on gyroscope support, I'd recommend using it, because you get the (Euler) angles directly without any calculation. See iOS - gyroscope sample and follow the links inside.
Don't use UIAccelerometer, because it will be deprecated soon. The newer CoreMotion framework is always the better choice, even for old devices.
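
For instance, a minimal CoreMotion sketch (keep the CMMotionManager alive, e.g. in a property, for as long as you want updates):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.isDeviceMotionAvailable) {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMDeviceMotion *motion, NSError *error) {
        // attitude delivers the Euler angles directly, in radians.
        double toDegrees = 180.0 / M_PI;
        NSLog(@"roll %.1f  pitch %.1f  yaw %.1f",
              motion.attitude.roll * toDegrees,
              motion.attitude.pitch * toDegrees,
              motion.attitude.yaw * toDegrees);
    }];
}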

Related

How to measure user distance from wall

I need to measure the distance from the user to a wall. When the user opens the camera and points it at any surface, I need to get the distance. I have read Is it possible to measure distance to object with camera? and I used the code for finding the iPhone camera angle from http://blog.sallarp.com/iphone-accelerometer-device-orientation.
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // Current device angle about the z-axis, from the x/y gravity components.
    float xx = -[acceleration x];
    float yy = [acceleration y];
    float angle = atan2(yy, xx);
    NSLog(@"angle = %f radians", angle);
}
d = h * tan(angle)
But nothing happens in the NSLog output or the camera view.
In the comments, you shared a link to a video: http://youtube.com/watch?v=PBpRZWmPyKo.
That app is not doing anything particularly sophisticated with the camera; rather, it appears to calculate distances using basic trigonometry, and it accomplishes this by constraining the problem in several critical ways:
First, the app requires the user to specify the height at which the phone's camera lens is being held.
Second, the user is measuring the distance to something sitting on the ground, aligning the bottom of that object with a known location on the screen (meaning you have a right triangle).
Those two constraints, combined with the accelerometer and the camera's lens focal length, would allow you to calculate the distance.
If your target cross-hair is in the center of the screen, the problem is greatly simplified and becomes a matter of simple trigonometry, i.e. your d = h * tan(angle).
BTW, the "angle" code in the question measures rotation about the z-axis, the clockwise/counter-clockwise rotation as the device faces you. For this problem, though, you want the rotation of the device about its x-axis, the forward/backward tilt. See https://stackoverflow.com/a/16555778/1271826 for an example of how to capture the device's orientation in space. Note also that that answer uses CoreMotion, whereas the article referenced in your question uses an API that has since been deprecated.
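
To make that concrete, here is a rough sketch of the trigonometry using CoreMotion's gravity vector; cameraHeight is an assumed, user-supplied value, the cross-hair is assumed centered, and the motion manager must be kept alive elsewhere:

#import <CoreMotion/CoreMotion.h>

double cameraHeight = 1.5; // metres; in a real app the user supplies this

CMMotionManager *motionManager = [[CMMotionManager alloc] init];
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                   withHandler:^(CMDeviceMotion *motion, NSError *error) {
    CMAcceleration g = motion.gravity; // gravity in device coordinates, in G's
    double norm = sqrt(g.x * g.x + g.y * g.y + g.z * g.z);
    // The back camera looks along the device's -z axis, so its angle from the
    // downward vertical is acos(-g.z / |g|).
    double angle = acos(-g.z / norm);
    double distance = cameraHeight * tan(angle); // d = h * tan(angle)
    NSLog(@"distance along the floor: %.2f m", distance);
}];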
The only way this would be possible is if you could read out the setting of the auto-focus mechanism in the lens. To my knowledge this is not possible.

iPhone augmented reality Euler angles rotation – roll issue

I’m working on an iOS augmented reality application.
It is location-based, not marker-based.
I use the GPS, compass and accelerometers to get latitude, longitude, altitude and the three Euler angles: yaw, pitch and roll. Using NSLog() I've confirmed that those six variables contain valid data.
My application shows some 3d objects over the camera view.
It works fine as long as I use everything but the roll angle.
If I add that third angle, the rotation applied to my OpenGL world is wrong. I do it this way in the main OpenGL draw method:
glRotatef(pitch, 1, 0, 0);
glRotatef(yaw, 0, 1, 0);
//glRotatef(roll, 0, 0, 1);
I think there is something wrong with this approach, but I am certainly not a specialist. Maybe I should build some sort of single rotation matrix rather than three separate ones?
Maybe that's not easily possible? After all, most desktop video games (FPS and the like) just let the user change the yaw and the pitch using the mouse, so only two angles, not three. But unlike the mouse, which is a 2D device, a phone used for augmented reality can rotate through any combination of angles.
Then again, none of the AR tutorials I have seen online handle ‘roll’ properly: ‘rolling’ your phone would either completely mess up the AR overlay or, with some roll-compensation strategy, do nothing at all.
So my question is, assuming I have my 3 Euler angles using the phone sensors, how should I apply them to my 3d opengl view?
I think you're likely talking about gimbal lock.
The essence of the problem is that Euler rotations are applied in a sequence. For example, you rotate around x, then around y, then around z. But one axis can always become ambiguous, because a preceding rotation can move it onto a different axis.
Suppose the rotation were 0 degrees around x, 90 degrees around y, then 20 degrees around z. You do the x rotation and nothing changes. You do the y rotation and everything moves 90 degrees. But now you've moved the z axis onto where the x axis was previously, so the z rotation will appear to be around x.
No matter what most people's instincts tell them, there's no way to avoid the problem. The kneejerk reaction is to always rotate around the global axes rather than the local ones. That doesn't resolve the problem; it just reverses the order: the z rotation could then turn the y rotation, which has already occurred, into an x rotation.
You're right that you should aim to create a unique description of rotation separated from measuring angles.
For augmented reality it's actually not all that difficult.
The accelerometer tells you which way down is. The compass tells you which way north is. The two may not be orthogonal, though: the field the compass measures lies parallel to the floor at the equator and becomes exactly parallel to the accelerometer's gravity vector at the poles.
So:
just accept the accelerometer vector as down;
get the cross product of down and the compass vector to get your side vector; it should point east-west, along a line of latitude;
then get the cross product of your side vector and your down vector to get a north vector that is suitably perpendicular.
You could equally use the dot product to remove the portion of the compass vector that lies in the direction of gravity, and take the cross product from there.
You'll want to normalise everything.
That gives you three basis vectors, so just put them directly into a matrix. No further work required.
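
A sketch of those steps with GLKit's vector math (downRaw and compassRaw are placeholders for your raw accelerometer and magnetometer vectors in device coordinates):

#import <GLKit/GLKit.h>

GLKVector3 down  = GLKVector3Normalize(downRaw);
GLKVector3 side  = GLKVector3Normalize(GLKVector3CrossProduct(down, compassRaw));
GLKVector3 north = GLKVector3Normalize(GLKVector3CrossProduct(side, down));

// The three orthonormal vectors go straight into the rotation matrix as its
// rows (or columns, depending on which way your convention multiplies).
GLKMatrix4 basis = GLKMatrix4Make(side.x,  down.x,  north.x, 0.0f,
                                  side.y,  down.y,  north.y, 0.0f,
                                  side.z,  down.z,  north.z, 0.0f,
                                  0.0f,    0.0f,    0.0f,    1.0f);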

Convert world to object coordinates

The iPhone gyroscope provides rotation data relative to some reference attitude, and that reference doesn't change (unless you multiply the attitude). Let's say I face the wall using my iPhone camera, then rotate 45 degrees left (roll += PI/4).
Now, if I lift the phone towards the ceiling, both yaw and pitch change, since the coordinate space is fixed (world coordinate space; it doesn't move or rotate with the phone). Is there a way to determine this angle (the one between the floor plane and the camera direction vector), given roll, yaw and pitch?
Edit: Instead of opening another question I'll try here. Luc's solution works, but how do I get the other two angles of rotation? I've read the information at the posted link, but it's been years since I studied linear algebra. This might be more of a math question than a programming one, actually.
I don't really code for iPhone so I'll trust you on the "real world coordinates" frame.
In that case, you want the dot product of the two z-axis vectors. That gives you the cosine of the angle you're looking for, which is pretty much the answer. Since an angle between planes only really makes sense as a value between 0° and 90°, that cosine actually contains all the information you need.
And there is no LaTeX formatting here, otherwise I'd go into a bit more detail, but read this page if you're interested. I'll just include the final result: the rotation matrix for your three rotations (yaw α about z, pitch β about y, roll γ about x) is

R = | cos α cos β    cos α sin β sin γ - sin α cos γ    cos α sin β cos γ + sin α sin γ |
    | sin α cos β    sin α sin β sin γ + cos α cos γ    sin α sin β cos γ - cos α sin γ |
    | -sin β         cos β sin γ                        cos β cos γ                     |

Now the z-axis vector of the horizontal plane is (0,0,1) (read this as a column vector), and rotated by this matrix it becomes simply the matrix's third column.
So we take the dot product of that third column with our (0,0,1) vector and get cos(β)cos(γ), which is cos(pitch)*cos(roll).
In conclusion, the angle between your planes is arccos(cos(pitch)*cos(roll)). This value tells you how much your iPhone is inclined, though not in which direction. You can work the direction out from the values of the vector (the rightmost column of the matrix) we spoke of.
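
A possible CoreMotion sketch of that final formula (again, keep a strong reference to the manager in real code):

#import <CoreMotion/CoreMotion.h>

CMMotionManager *manager = [[CMMotionManager alloc] init];
[manager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                             withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // Inclination of the device from the horizontal plane, 0..90 degrees.
    double inclination = acos(cos(motion.attitude.pitch) * cos(motion.attitude.roll));
    NSLog(@"inclination: %.1f degrees", inclination * 180.0 / M_PI);
}];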

Is iPad accelerometer normalized to gravity

While porting my Android app to iOS I was confused by one thing, so I want to find out whether I'm right or my code is wrong. On an Android device the accelerometer returns values in physical units, m/s². On iPad I get a total force of approximately 1.0 for a device at rest (where I expected 9.8). My first explanation is that the returned value is normalized to 9.8, so I must multiply it by 9.8 to get the real force. My second idea is that my code is totally wrong, but that's hard to believe.
From the Docs on CMAcceleration:
CMAcceleration
The type of a structure containing 3-axis acceleration values.
typedef struct {
    double x;
    double y;
    double z;
} CMAcceleration;
X-axis acceleration in G's (gravitational force).
Y-axis acceleration in G's (gravitational force).
Z-axis acceleration in G's (gravitational force).
A G is a unit of gravitational force equal to that exerted by the earth's gravitational field (9.81 m/s²).
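
So on iOS you multiply each component by 9.81 yourself if you want SI units; a small sketch (the helper name is mine, not Apple's):

#import <CoreMotion/CoreMotion.h>

// Convert a CMAcceleration reading from G's to m/s².
static CMAcceleration AccelerationInSI(CMAcceleration g) {
    const double kMetersPerSecondSquaredPerG = 9.81;
    CMAcceleration a = {
        g.x * kMetersPerSecondSquaredPerG,
        g.y * kMetersPerSecondSquaredPerG,
        g.z * kMetersPerSecondSquaredPerG
    };
    return a;
}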

Blackberry Device Movement Angle Difference

I am working on a BlackBerry application in which I need to retrieve the angle difference when the device moves, i.e. the difference between the angle at which the movement starts and the angle at which it ends. It must reach 25 degrees to call some function.
In simple words, call a function when the device moves by 25 degrees.
Please read the AccelerometerSensor docs; it is available in API 4.7.0 and higher. All the data you can retrieve is described in the AccelerometerData class: orientation and acceleration (gravity data).
How to get an angle from gravity-sensor data is described in more detail in the JavaME docs, in the "Mobile Sensor API" section:
If the phone was placed flat, the accelerometer would tell us that the acceleration along the z-axis (up and down) is about 1000 (this value represents 1 G). The accelerations along the X and Y axes (sideways) would be about 0, since the phone is sitting still and gravity only works downwards. Flipping the phone over with the screen facing down, the accelerometer would give us the value of -1000 on the Z-axis. Standing on its side, it would give us a value of 1000 or -1000 along either the X- or the Y-axis, depending on which side you put it on. Putting the phone at a 45-degree angle along the X-axis would give us a value of ±707 on the Z-axis and ±707 on the Y-axis, since gravity cannot affect either axis with its full force (you can easily calculate what the value should be at a certain angle for each axis using the sine and cosine functions). Using the values from the X and Y-axis from the accelerometer, we can determine the position of the phone at any time, and then use that value to move our ship to avoid the incoming asteroids.
So, having accelerometer data for all three axes, we can work out the horizontal angle of the device.
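
The arithmetic in that quote is language-agnostic; a small C-style sketch on the same scale, where 1 G reads as 1000 (the readings here are hypothetical):

#include <math.h>

double y = 707.0; // hypothetical Y-axis reading
double z = 707.0; // hypothetical Z-axis reading

// Tilt about the X-axis, from the Y and Z components of gravity.
double tiltDegrees = atan2(y, z) * 180.0 / M_PI; // 45 degrees for these readings

Store the angle when the movement starts, then call your function once the current angle differs from the stored one by 25 degrees or more.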
