Global Rotation - Delphi

In GLScene we have three parameters (RollAngle, PitchAngle and TurnAngle) for rotation around the local orientation. In the image below, how can I rotate the cube around the global orientation (orange axis)?

You would need to convert the axis-angle rotation to Euler angles. Here is a link explaining the process in some detail, with code:
http://www.euclideanspace.com/maths/geometry/rotations/conversions/angleToEuler/index.htm
From the article:
yaw   = atan2(y * sin(angle) - x * z * (1 - cos(angle)), 1 - (y^2 + z^2) * (1 - cos(angle)))
pitch = asin(x * y * (1 - cos(angle)) + z * sin(angle))
roll  = atan2(x * sin(angle) - y * z * (1 - cos(angle)), 1 - (x^2 + z^2) * (1 - cos(angle)))
EDIT: Renamed the variables to be consistent with the pitch, yaw, roll naming convention.
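To make that concrete, here is a minimal C++ sketch of the conversion (the same arithmetic ports directly to Delphi). axisAngleToEuler is an illustrative name; the axis (x, y, z) is assumed to be normalized, and the singularities at pitch = ±90° are ignored, as discussed in the linked article:

#include <cmath>

struct Euler { double yaw, pitch, roll; };

// Convert an axis-angle rotation (unit axis x,y,z; angle in radians)
// to yaw/pitch/roll using the formulas above.
Euler axisAngleToEuler(double x, double y, double z, double angle) {
    double s = std::sin(angle), c = std::cos(angle), t = 1.0 - c;
    Euler e;
    e.yaw   = std::atan2(y * s - x * z * t, 1.0 - (y * y + z * z) * t);
    e.pitch = std::asin(x * y * t + z * s);
    e.roll  = std::atan2(x * s - y * z * t, 1.0 - (x * x + z * z) * t);
    return e;
}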

Maybe you could use a "DummyCube" object as a parent. Then you can first rotate the cube inside the dummy cube, and then rotate the DummyCube itself.

This is a dirty cheat, but if the object is at the origin (0,0,0) and there is only one object in the scene, you could swing the camera (and light source) around the object instead of rotating the object.

Related

Determining the rotation around each axis from OpenCV rotation vector

I'm trying to better understand the calibrateCamera and solvePnP functions in OpenCV, specifically the rotation vectors returned by these functions, which I believe are axis-angle rotation vectors (NOT, as I had initially thought, yaw/pitch/roll angles). I would like to know the rotation around the x, y and z axes of my checkerboard image. The OpenCV functions return a rotation vector in the form rot = [a, b, c].
Using this answer
as a guide, I calculate the angle theta with theta = sqrt(a^2 + b^2 + c^2) and the rotation axis v = [a/theta, b/theta, c/theta].
Then I take these values and use the axis-angle to Euler conversion on euclideanspace.com, shown here:
heading  = atan2(y * sin(angle) - x * z * (1 - cos(angle)), 1 - (y^2 + z^2) * (1 - cos(angle)))
attitude = asin(x * y * (1 - cos(angle)) + z * sin(angle))
bank     = atan2(x * sin(angle) - y * z * (1 - cos(angle)), 1 - (x^2 + z^2) * (1 - cos(angle)))
I'm using one of the example OpenCV checkerboard images (Left01.jpg), shown below (note the frame axes in the upper left corner, with red = x, green = y, blue = z).
Using this image I get a rotation vector from calibrateCamera of [0.166,0.294,0.014]
Running these values through the calculations discussed and converting to degrees I get:
heading = 16.7 deg
attitude = 1.7 deg
bank = 9.3 deg
I believe these correspond to yaw, pitch and roll? The 16.7-degree heading seems high looking at the image, but it's hard to tell. Does this make sense? What would be the correct way to figure out the Euler angles (the angles around each axis) given the OpenCV rotation vector? Snippets of my code are shown below.
double RMSError = calibrateCamera(
    objectPointsArray,
    imagePointsArray,
    img.size(),
    intrinsics,
    distortion,
    rotation,
    translation,
    CALIB_ZERO_TANGENT_DIST |
    CALIB_FIX_K3 | CALIB_FIX_K4 | CALIB_FIX_K5 |
    CALIB_FIX_ASPECT_RATIO);

Mat rvec = rotation.at(0);

// Try to get the rotation angles here:
// https://stackoverflow.com/questions/12933284/rodrigues-into-eulerangles-and-vice-versa
float theta = sqrt(pow(rvec.at<double>(0), 2) + pow(rvec.at<double>(1), 2) + pow(rvec.at<double>(2), 2));
Mat axis = (Mat_<double>(1, 3) << rvec.at<double>(0) / theta, rvec.at<double>(1) / theta, rvec.at<double>(2) / theta);
float x_ = axis.at<double>(0);
float y_ = axis.at<double>(1);
float z_ = axis.at<double>(2);

// This is yaw, pitch, roll respectively... maybe
float heading  = atan2(y_ * sin(theta) - x_ * z_ * (1 - cos(theta)), 1 - (pow(y_, 2) + pow(z_, 2)) * (1 - cos(theta)));
float attitude = asin(x_ * y_ * (1 - cos(theta)) + z_ * sin(theta));
float bank     = atan2(x_ * sin(theta) - y_ * z_ * (1 - cos(theta)), 1 - (pow(x_, 2) + pow(z_, 2)) * (1 - cos(theta)));

float headingDeg  = heading * (180 / M_PI);
float attitudeDeg = attitude * (180 / M_PI);
float bankDeg     = bank * (180 / M_PI);
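As an aside, a shorter route that avoids hand-written axis-angle formulas is to let OpenCV build the rotation matrix and decompose that instead. A hedged sketch (rvecToEulerXYZ is an illustrative name; it assumes the R = Rz*Ry*Rx convention and ignores the gimbal-lock case where cos(pitch) is near zero):

#include <opencv2/calib3d.hpp>
#include <cmath>

// Convert an OpenCV rotation vector to Euler angles via the rotation matrix.
cv::Vec3d rvecToEulerXYZ(const cv::Mat& rvec) {
    cv::Mat R;
    cv::Rodrigues(rvec, R);  // 3x1 axis-angle -> 3x3 rotation matrix
    double sy = std::sqrt(R.at<double>(0, 0) * R.at<double>(0, 0) +
                          R.at<double>(1, 0) * R.at<double>(1, 0));
    double x = std::atan2(R.at<double>(2, 1), R.at<double>(2, 2)); // about x (roll)
    double y = std::atan2(-R.at<double>(2, 0), sy);                // about y (pitch)
    double z = std::atan2(R.at<double>(1, 0), R.at<double>(0, 0)); // about z (yaw)
    return cv::Vec3d(x, y, z);  // radians
}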

Error in radial distortion based on (zoomed in) OpenCV-style camera parameters

Our AR device is based on a camera with pretty strong optical zoom. We measure the distortion of this camera using classical camera-calibration tools (checkerboards), both through OpenCV and the GML Camera Calibration tools.
At higher zoom levels (I'll use 249 out of 255 as an example) we measure the following camera parameters at full HD resolution (1920x1080):
fx = 24545.4316
fy = 24628.5469
cx = 924.3162
cy = 440.2694
For the radial and tangential distortion we measured 4 values:
k1 = 5.423406
k2 = -2964.24243
p1 = 0.004201721
p2 = 0.0162647516
We are not sure how to interpret (read: implement) those extremely large values for k1 and k2. Using OpenCV's classic "undistort" operation to rectify the image with these values seems to work well. Unfortunately, it is (much) too slow for real-time use.
The thumbnails below look similar; clicking them will display the full-size images, where you can spot the difference:
Camera footage
Undistorted using OpenCV
That's why we want to take the opposite approach: leave the camera footage distorted and apply a matching distortion to our 3D scene using shaders. Following the OpenCV documentation, and this accepted answer in particular, the distorted position for a corner point (0, 0) would be:
// To relative coordinates
double x = (point.X - cx) / fx; // -960 / 24545 = -0.03911
double y = (point.Y - cy) / fy; // -540 / 24628 = -0.02193
double r2 = x*x + y*y; // 0.002010
// Radial distortion
// -0.03911 * (1 + 5.423406 * 0.002010 + -2964.24243 * 0.002010 * 0.002010) = -0.039067
double xDistort = x * (1 + k1 * r2 + k2 * r2 * r2);
// -0.02193 * (1 + 5.423406 * 0.002010 + -2964.24243 * 0.002010 * 0.002010) = -0.021906
double yDistort = y * (1 + k1 * r2 + k2 * r2 * r2);
// Tangential distortion
... left out for brevity
// Back to absolute coordinates.
xDistort = xDistort * fx + cx; // -0.039067 * 24545.4316 + 924.3162 = -34.6002 !!!
yDistort = yDistort * fy + cy; // -0.021906 * 24628.5469 + 440.2694 = -99.2435 !!!
These large pixel displacements (34 and 100 pixels at the upper left corner) seem overly warped and do not correspond with the undistorted image OpenCV generates.
So the specific question is: what is wrong with the way we interpreted the values we measured, and what should the correct code for distortion be?
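For reference, here is a hedged sketch of the full distortion model from the OpenCV documentation, including the tangential p1/p2 terms elided above (Pt and distortPoint are illustrative names; k3 is taken as zero since only four coefficients were calibrated):

#include <cmath>

struct Pt { double x, y; };

// Apply the OpenCV radial + tangential distortion model to one pixel.
Pt distortPoint(Pt p, double fx, double fy, double cx, double cy,
                double k1, double k2, double p1, double p2) {
    // To normalized camera coordinates
    double x = (p.x - cx) / fx;
    double y = (p.y - cy) / fy;
    double r2 = x * x + y * y;

    // Radial term (k3 assumed zero in the 4-coefficient calibration)
    double radial = 1.0 + k1 * r2 + k2 * r2 * r2;

    // Radial plus tangential distortion, per the OpenCV documentation
    double xD = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x);
    double yD = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y;

    // Back to pixel coordinates
    return Pt{ xD * fx + cx, yD * fy + cy };
}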

Rotation angles from Quaternion

I have a 3D scene in which I position a few objects on an imaginary sphere; now I want to rotate them with device motion.
I use a spherical coordinate system and calculate each position on the sphere like below:
x = ρ * sin(φ) * cos(θ)
y = ρ * sin(φ) * sin(θ)
z = ρ * cos(φ)
I also use angles (from 0 to 2π, i.e. 2 * M_PI) to perform the horizontal rotation (in the z-x plane).
Everything works perfectly until I want to use the quaternion from the motion matrix.
I can extract values like pitch, yaw and roll:
GLKQuaternion quat = GLKQuaternionMakeWithMatrix4(motionMatrix);
CGFloat adjRoll = atan2(2 * (quat.y * quat.w - quat.x * quat.z), 1 - 2 * quat.y * quat.y - 2 * quat.z * quat.z);
CGFloat adjPitch = atan2(2 * (quat.x * quat.w + quat.y * quat.z), 1 - 2 * quat.x * quat.x - 2 * quat.z * quat.z);
CGFloat adjYaw = asin(2 * quat.x * quat.y + 2 * quat.w * quat.z);
or alternatively:
CMAttitude *currentAttitude = [MotionDataProvider sharedProvider].attitude; //from CoreMotion
CGFloat roll = currentAttitude.roll;
CGFloat pitch = currentAttitude.pitch;
CGFloat yaw = currentAttitude.yaw;
(The values I get differ between these two methods.)
The problem is that pitch, yaw and roll in this form are not applicable to my scheme.
How can I convert pitch/yaw/roll, the quaternion, or the motion matrix into the z-x angles required by my rotation model? Am I approaching this correctly, or have I missed something fundamental?
How can I get the rotation around the y axis from the rotation matrix/quaternion received from CoreMotion, zeroing out the current z and x, so the displayed object is rotated only around the y axis?
I'm using iOS, by the way, but I guess that is not important here.
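One way to approach the last question is a swing-twist style projection: for a unit quaternion (w, x, y, z), the component of the rotation about the y axis is (w, 0, y, 0) renormalized, and its angle is the heading for a z-x rotation model. A hedged C++ sketch (yawAboutY is an illustrative name; a unit quaternion is assumed):

#include <cmath>

// Extract only the rotation about the y axis (the "twist") from a unit
// quaternion given as (w, x, y, z); returns the twist angle in radians.
double yawAboutY(double w, double x, double y, double z) {
    double len = std::sqrt(w * w + y * y);
    if (len < 1e-9) return 0.0;  // rotation has no component about y
    return 2.0 * std::atan2(y / len, w / len);
}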

Using CoreMotion / CoreLocation to compute north heading from back of phone

I want to get the azimuth from the back of the phone (-Z axis) for an augmented reality app. My application only runs in Landscape Right. Testing this on iPhone 5S.
Currently, I'm using the following approach:
CoreLocation heading base on back camera (Augmented reality)
I have 2 problems with this approach:
If I'm pointing the back of the device towards north so that I read 0 degrees, then rotate it a full 360 degrees clockwise (yaw), I end up at -20 degrees. Counterclockwise rotations add 20 degrees. The pattern repeats: rotating 720 degrees from 0 yields -40 degrees, and so on. Worse, even if I don't perform clean rotations but instead move the phone chaotically (spinning, shaking, etc.) and end up in the same spot where I started, I can't predict what value it will show.
The other problem is what I believe is called gyro drift: if I don't move the device at all, the value slowly changes over time, by roughly 0.1 degrees every few seconds, sometimes in one direction, sometimes the other, until at some point it stops.
The problem is, I don't have the mathematical background to account for these effects. It's especially problematic that I can't seem to compute the rotation matrix from the yaw/pitch/roll of deviceMotion.attitude. I tried:
float w = motion.attitude.yaw;
float v = motion.attitude.pitch;
float u = motion.attitude.roll;
r.m11 = cos(v) * cos(w);
r.m12 = sin(u) * sin(v) * cos(w) - cos(u) * sin(w);
r.m13 = sin(u) * sin(w) + cos(u) * sin(v) * cos(w);
r.m21 = cos(v) * sin(w);
r.m22 = cos(u) * cos(w) + sin(u) * sin(v) * sin(w);
r.m23 = cos(u) * sin(v) * sin(w) - sin(u) * cos(w);
r.m31 = -sin(v);
r.m32 = sin(u) * cos(v);
r.m33 = cos(u) * cos(v);
I've tried every Tait–Bryan rotation order (u-v-w, u-w-v, v-u-w, v-w-u, w-v-u, w-u-v); some of them came close, but still not close enough.
From my observations, the magneticHeading from CLLocationManager seems much more accurate than the heading computed from CMMotionManager, but again, even if I had the correct angle, I don't know where I should start to get the equivalent angle in a different coordinate reference frame. Any help would be greatly appreciated.
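For the geometric part of the question, one plausible starting point is to read the azimuth of the device's -Z axis straight from the rotation matrix rather than going through yaw/pitch/roll at all. A hedged sketch, assuming R maps device coordinates into an east-north-up reference frame (which in CoreMotion depends on the reference frame you start device-motion updates with):

#include <cmath>

// Azimuth of the back camera (device -Z axis), in degrees clockwise from
// north, given a device-to-world rotation matrix R (world: x east, y north,
// z up). The -Z direction in world space is minus the third column of R.
double azimuthOfBackCamera(const double R[3][3]) {
    double east  = -R[0][2];
    double north = -R[1][2];
    double az = std::atan2(east, north) * 180.0 / M_PI;
    return az < 0.0 ? az + 360.0 : az;  // map into [0, 360)
}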

Algorithm for creating a circular path around a center mass?

I am attempting to simply make objects orbit around a center point, e.g.
The green and blue objects represent objects which should keep their distance to the center point, while rotating, based on an angle which I pass into method.
I have attempted to write a function in Objective-C, but it doesn't work right without a static number (it rotates around the center, but not from the object's true starting point or distance).
-(void) rotateGear: (UIImageView*) view heading:(int)heading
{
    // int distanceX = 160 - view.frame.origin.x;
    // int distanceY = 240 - view.frame.origin.y;
    float x = 160 - view.image.size.width / 2 + (50 * cos(heading * (M_PI / 180)));
    float y = 240 - view.image.size.height / 2 + (50 * sin(heading * (M_PI / 180)));
    view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
My magic numbers 160 and 240 are the center of the canvas onto which I'm drawing the images. 50 is a static number (and the problem) which lets the function work partially correctly, but without maintaining the object's starting position or correct distance. I don't know what to put there, unfortunately.
heading is a parameter that passes in an angle in degrees, from 0 to 359. It is calculated by a timer and incremented outside of this class.
Essentially, I would like to be able to drop any image onto my canvas and, based on the image's starting point, have it rotate around the center of my circle. That means if I were to drop an image at point (10,10), the distance to the center of the circle would persist, using (10,10) as the starting point. The object would rotate 360 degrees around the center and return to its original starting point.
The expected result would be, for instance, to pass (10,10) into the method at zero degrees and get back, say, (15,25) (not real numbers) at 5 degrees.
I know this is very simple (and this problem description is entirely overkill), but I'm going cross-eyed trying to figure out where I'm hosing things up. I don't care what language your examples use, if any; I'll be able to decipher your meaning.
Failure Update
I've gotten further, but I still cannot get the right calculation. My new code looks like the following, with heading set to 1 degree:
-(void) rotateGear: (UIImageView*) view heading:(int)heading
{
    float y1 = view.frame.origin.y + (view.frame.size.height / 2); // 152
    float x1 = view.frame.origin.x + (view.frame.size.width / 2);  // 140.5
    float radius = sqrtf(powf(160 - x1, 2.0f) + powf(240 - y1, 2.0f)); // 90.13

    // I know that I need to calculate 90.13 pixels from my center, at 1 degree.
    float x = 160 + radius * (cos(heading * (M_PI / 180.0f))); // 250.12
    float y = 240 + radius * (sin(heading * (M_PI / 180.0f))); // 241.57

    // The numbers are very skewed.
    view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
I'm getting results that are nowhere close to where the point should be. The problem is with the assignment of x and y. Where am I going wrong?
You can find the distance of the point from the centre pretty easily:
radius = sqrt((160 - x)^2 + (240 - y)^2)
where (x, y) is the initial position of the centre of your object. Then just replace 50 with the radius.
http://en.wikipedia.org/wiki/Pythagorean_theorem
You can then figure out the initial angle using trigonometry (tan = opposite / adjacent, so draw a right-angled triangle using the centre mass and the centre of your orbiting object to visualize this):
angle = arctan((y - 240) / (x - 160))
if x > 160, or:
angle = arctan((y - 240) / (x - 160)) + 180
if x < 160
http://en.wikipedia.org/wiki/Inverse_trigonometric_functions
Edit: bear in mind I don't actually know any Objective-C but this is basically what I think you should do (you should be able to translate this to correct Obj-C pretty easily, this is just for demonstration):
// Your object gets created here somewhere
float x1 = view.frame.origin.x + (view.frame.size.width/2); // 140.5
float y1 = view.frame.origin.y + (view.frame.size.height/2); // 152
float radius = sqrtf(powf(160 - x1 ,2.0f) + powf(240 - y1, 2.0f)); // 90.13
// Calculate the initial angle here, as per the first part of my answer
float initialAngle = atan((y1 - 240) / (x1 - 160)) * 180.0f / M_PI;
if(x1 < 160)
initialAngle += 180;
// Calculate the adjustment we need to add to heading
int adjustment = (int)(initialAngle - heading);
So we only execute the code above once (when the object gets created). We need to remember radius and adjustment for later. Then we alter rotateGear to take an angle and a radius as inputs instead of heading (this is much more flexible anyway):
-(void) rotateGear: (UIImageView*) view radius:(float)radius angle:(int)angle
{
float x = 160 + radius * (cos(angle * (M_PI / 180.0f)));
float y = 240 + radius * (sin(angle * (M_PI / 180.0f)));
view.frame = CGRectMake(x, y, view.image.size.width, view.image.size.height);
}
And each time we want to update the position we make a call like this:
[objectName rotateGear:view radius:radius angle:(adjustment + heading)];
Btw, once you manage to get this working, I'd strongly recommend converting all your angles so you're using radians all the way through, it makes it much neater/easier to follow!
The formula for x and y coordinates of a point on a circle, based on radians, radius, and center point:
x = cos(angle) * radius + center_x
y = sin(angle) * radius + center_y
You can find the radius with HappyPixel's formula.
Once you figure out the radius and the center point, you can simply vary the angle to get all the points on the circle that you'd want.
If I understand correctly, you want to do InitObject(x, y), followed by UpdateObject(angle), where angle sweeps from 0 to 360 (but use radians instead of degrees for the math). So you need to track the angle and radius for each object, as sketched below:
InitObject(x, y)
    relative_x = x - center.x
    relative_y = y - center.y
    object.radius = sqrt(relative_x^2 + relative_y^2)
    object.initial_angle = atan2(relative_y, relative_x)
And
UpdateObject(angle)
    newangle = (object.initial_angle + angle) % (2 * PI)
    object.x = cos(newangle) * object.radius + center.x
    object.y = sin(newangle) * object.radius + center.y
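A compilable C++ rendering of that sketch (the center coordinates and names are placeholders; angles are in radians):

#include <cmath>

struct Orbiter { double radius = 0.0, initialAngle = 0.0, x = 0.0, y = 0.0; };

const double kCenterX = 160.0, kCenterY = 240.0;

// Record radius and start angle once, when the object is dropped.
void initObject(Orbiter& o, double dropX, double dropY) {
    double rx = dropX - kCenterX;
    double ry = dropY - kCenterY;
    o.radius = std::sqrt(rx * rx + ry * ry);
    o.initialAngle = std::atan2(ry, rx);  // atan2 handles rx == 0 safely
}

// Move the object to the point at (initial angle + sweep angle) on its circle.
void updateObject(Orbiter& o, double angle) {
    double a = std::fmod(o.initialAngle + angle, 2.0 * M_PI);
    o.x = std::cos(a) * o.radius + kCenterX;
    o.y = std::sin(a) * o.radius + kCenterY;
}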
dx = dropx - centerx;          // target - source
dy = -(dropy - centery);       // minus = invert screen coords to Cartesian coords
radius = sqrt(dy*dy + dx*dx);  // faster if your compiler's optimizer is bad
if (dx == 0) dx = 0.000001;    // hack/patch/fudge/nudge*
angle = atan(dy/dx);           // set this as the start angle for the angle incrementer
Then go with the code you have and you'll be fine. You seem to be calculating the radius from the current position each time, though? Like the angle, this should only be done once, when the object is dropped; otherwise the radius might not stay constant.
*Instead of the dx = 0 fudge above: if you need better than 1/100-degree precision for the start angle, handle the three special cases of dx = 0 properly, or use a polar arctangent (atan2).
