OpenCV: How to use Euler angles to determine camera orientation

I have a question about how to use Euler angles to determine the camera's orientation.
First, I called the solvePnP function and got the two outputs "rvec" and "tvec". Then I used Rodrigues to convert rvec to the rotation matrix "R". After that, I calculated the Euler angles with the function below:
// Euler angles in degrees (theta_x = rotation around the X axis, etc.);
// they are shared with calCamPose() below.
float theta_x = 0.0f, theta_y = 0.0f, theta_z = 0.0f;

void getEulerAngles(cv::Mat matrix)
{
    assert(isRotationMatrix(matrix));
    float sy = sqrt(matrix.at<double>(0,0)*matrix.at<double>(0,0) +
                    matrix.at<double>(1,0)*matrix.at<double>(1,0));
    bool singular = sy < 1e-6;
    if (!singular)
    {
        theta_x = atan2(matrix.at<double>(2,1), matrix.at<double>(2,2));
        theta_y = atan2(-matrix.at<double>(2,0), sy);
        theta_z = atan2(matrix.at<double>(1,0), matrix.at<double>(0,0));
    }
    else
    {
        theta_x = atan2(-matrix.at<double>(1,2), matrix.at<double>(1,1));
        theta_y = atan2(-matrix.at<double>(2,0), sy);
        theta_z = 0.0f;
    }
    // convert from radians to degrees
    theta_x = theta_x * 180.0 / CV_PI;
    theta_y = theta_y * 180.0 / CV_PI;
    theta_z = theta_z * 180.0 / CV_PI;
}
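For reference, rvec, tvec and R come from calls roughly like the ones below (the point sets, camera matrix and distortion coefficients are placeholders for my own data):

std::vector<cv::Point3f> objectPoints;   // known 3D points in the world frame (placeholder)
std::vector<cv::Point2f> imagePoints;    // their detected 2D projections (placeholder)
cv::Mat cameraMatrix, distCoeffs;        // intrinsics from calibration (placeholder)
cv::Mat rvec, tvec, R;
cv::solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);
cv::Rodrigues(rvec, R);                  // rotation vector -> 3x3 rotation matrix
getEulerAngles(R);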
I know that a different rotation order gives a different result. So if I want to get the camera's orientation, what rotation order should I choose?

I think I know how to solve this problem now. The rotation order is z-y-x: you just need to rotate your "tvec" around the Z axis, then around the Y axis, and finally around the X axis, and remember to use the negative Euler angles.
Here is my code:
void calCamPose(cv::Mat t)
// the order of rotation is z-y-x
{
    // solvePnP normally returns tvec as a 3x1 CV_64F matrix, so read its elements explicitly
    cv::Point3f tvec(t.at<double>(0), t.at<double>(1), t.at<double>(2));
    // cos/sin expect radians, so convert the (negated) Euler angles back from degrees
    float rz = -theta_z * CV_PI / 180.0f;
    float ry = -theta_y * CV_PI / 180.0f;
    float rx = -theta_x * CV_PI / 180.0f;
    float x1   = cos(rz)*tvec.x - sin(rz)*tvec.y;
    float y1   = sin(rz)*tvec.x + cos(rz)*tvec.y;   // first rotation, around Z
    float outx = cos(ry)*x1 + sin(ry)*tvec.z;
    float z2   = cos(ry)*tvec.z - sin(ry)*x1;       // second rotation, around Y (same right-handed convention as outx)
    float outy = cos(rx)*y1 - sin(rx)*z2;
    float outz = cos(rx)*z2 + sin(rx)*y1;           // third rotation, around X
    cv::Point3f cam_pose(outx, outy, outz);
    Debug("Cam_Pose");
    Debug(cam_pose);
}
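The camera position can also be computed directly from R and tvec without going through Euler angles. Under solvePnP's convention (x_cam = R * X_world + tvec) the camera centre in world coordinates is -R^T * tvec; a minimal OpenCV sketch (R and tvec as above):

// Camera centre in world coordinates: C = -R^T * tvec
cv::Mat camPos = -R.t() * tvec;   // 3x1; R from cv::Rodrigues(rvec, R), tvec from cv::solvePnP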

Related

ARKit - Why is the camera's viewMatrix position changing when the device is rotated?

When I rotate my test device about a certain axis, the camera's viewMatrix x-axis position value (column 3, row 1) changes significantly: on the order of a meter of translation along the x-axis when the device's rotation is changed by only 180 degrees. If I rotate a full 360 degrees, the x position returns to its starting 0 degree value. I am not translating the device at all (ignoring minor human error).
Is this a bug or configuration setup issue? Can someone explain why the x-axis position would change when only rotating the device? Is anyone else seeing this?
Here is the basic code setup:
@property (nonatomic, strong) ARWorldTrackingConfiguration *arSessionConfiguration;
@property (nonatomic, strong) ARSession *arSession;
- (void)setup
{
self.arSessionConfiguration = [ARWorldTrackingConfiguration new];
self.arSessionConfiguration.worldAlignment = ARWorldAlignmentGravity;
self.arSessionConfiguration.planeDetection = ARPlaneDetectionHorizontal;
self.arSession = [ARSession new];
self.arSession.delegate = self;
[self.arSession runWithConfiguration:self.arSessionConfiguration];
}
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame
{
matrix_float4x4 viewMatrix = [frame.camera viewMatrixForOrientation:UIInterfaceOrientationPortrait];
NSLog(@"%0.2f, %0.2f, %0.2f", viewMatrix.columns[3][0], viewMatrix.columns[3][1], viewMatrix.columns[3][2]);
}
I am testing on a 10.5" iPad Pro running the latest iOS 11 beta.
This is due to a misunderstanding: the 4th column of the view matrix is not the camera's position.
This is because the view matrix is the inverse of the camera's transformation matrix, i.e. multiplying it by a world point transforms that point to the local basis of the camera.
For a camera with rotation matrix R (3x3) and position c, multiplying its view matrix V (4x4) with a point p is equivalent to:

V * p = R^T * (p - c)

From this we can deduce that the view matrix has the following construction:

V = [ R^T   -R^T * c ]
    [ 0         1    ]
Therefore, to obtain the actual position of the camera, we must multiply the 4th column by minus the transpose (equivalently, the inverse) of the top-left 3x3 sub-matrix of the view matrix.
I.e., something like:
matrix_float3x3 topLeftSubMtx = /* top-left 3x3 of viewMatrix: columns[0..2], rows [0..2] */;
vector_float4 rightColumn = viewMatrix.columns[3];
float positionX = -simd_dot(topLeftSubMtx.columns[0], rightColumn.xyz);
// similarly for Y and Z, using topLeftSubMtx.columns[1] and columns[2]
(Apologies as I don't know objective-C or ARKit specifics; hopefully you get the gist of this pseudo-ish code)
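Another route, if a small matrix inverse per frame is acceptable (an untested sketch using Apple's simd functions): since the view matrix is the inverse of the camera's transform, inverting it gives that transform back, and its 4th column is the camera position:

#include <simd/simd.h>

matrix_float4x4 cameraTransform = simd_inverse(viewMatrix); // camera-to-world transform
vector_float4 cameraPosition = cameraTransform.columns[3];  // (x, y, z, 1)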
You can get the current transform of the camera for each frame with the following delegate method:
func session(_ session: ARSession, didUpdate frame: ARFrame)
{
let currentTransform = frame.camera.transform
}
The camera's world-space position is then the translation part of that transform (currentTransform.columns.3), since the transform is the inverse of the view matrix.

2D augmented reality with sensor issue

I'm making a "geolocational AR app" in which I use Paint to draw a bitmap on my screen, using the sensors to translate the canvas so that the image is only shown at a specific point.
I've got it mostly working; however, the image shakes while it is displayed.
I really want the effect that the following video presents:
https://www.youtube.com/watch?v=8U3vWETmk2U
I've implemented a low-pass filter to ease the situation, but it is still not as steady as the footage in the video.
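(For reference, such a low-pass filter is just exponential smoothing, filtered = filtered + alpha * (raw - filtered), applied to the raw accelerometer and compass readings, where alpha is a small tuning constant between 0 and 1.)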
This is how I achieve AR movement
float dx = (float) ( (canvas.getWidth()/ horizontalFOV) * (Math.toDegrees(orientation[0])-curBearingTo));
float dy = (float) ( (canvas.getHeight()/ verticalFOV) * Math.toDegrees(orientation[1]));
canvas.translate(0.0f, 0.0f-dy);
canvas.translate(0.0f-dx, 0.0f);
The curBearingTo is the result of the Android location API bearingTo(location, destination).
And this is how I get the orientation matrix:
if (lastAccelerometer != null && lastCompass != null) {
boolean gotRotation = SensorManager.getRotationMatrix(rotation,
identity, lastAccelerometer, lastCompass);
if (gotRotation) {
// remap such that the camera is pointing straight down the Y
// axis
SensorManager.remapCoordinateSystem(rotation,
SensorManager.AXIS_X, SensorManager.AXIS_Z,
cameraRotation);
// orientation vector
SensorManager.getOrientation(cameraRotation, orientation);
}
}
Any suggestions?

How can I calculate point of UV texture pressed on object?

How can I calculate the point on a UV texture that was pressed on the object?
For example: I have a ball textured with an Earth UV map. When I press on a city, I'd like to get the position of that city on the Earth bitmap.
I'm going to try to explain :)
I have a code:
bool draw;
int old_position_X;
int old_position_Y;
void __fastcall TForm1::Image3D(TObject *Sender, TShiftState Shift, float X,
float Y, TVector3D &RayPos, TVector3D &RayDir)
{
if (Shift.Contains(ssLeft))
{
if (draw==true)
{
TVector3D HitPos;
Image3D->Context->Pick(X, Y, TProjection::pjCamera, RayPos, RayDir);
RayCastPlaneIntersect(RayPos, RayDir, Image3D->AbsolutePosition, Image3D->AbsoluteDirection, HitPos) ;
HitPos.X -= Image3D->Position->X;
HitPos.Y -= Image3D->Position->Y;
int w=Image3D->Bitmap->Width;
int h=Image3D->Bitmap->Height;
int x=(w/Image3D->Width)*(HitPos.X+Image3D->Width/2.0);
int y=(h/Image3D->Height)*(HitPos.Y+Image3D->Height/2.0);
Image3D->Bitmap->Canvas->BeginScene();
Image3D->Bitmap->Canvas->Stroke->Kind=TBrushKind::bkSolid;
Image3D->Bitmap->Canvas->Stroke->Color=claRed;
Image3D->Bitmap->Canvas->DrawLine(TPointF(old_position_X,old_position_Y),TPointF(x,y),1.0);
Image3D->Bitmap->Canvas->EndScene();
old_position_X=x;
old_position_Y=y;
}
}
else
{
draw=false;
}
}
I can zoom, rotate and move the Image3D, and that code lets me paint on the Image3D.
By the way, I don't understand why I have to divide the Image3D width and height by 2, but it works :) I don't understand the relationship between a 3D object's values (scale, position, etc.) and pixels, especially between the X, Y, Z scale and the Width/Height of 3D objects, or between the size of the texture and the scale of the 3D object.
And now I'd like to do the same on imported models. How do I calculate that position on the texture?
I don't expect exact code, but I would appreciate guidance, example code, etc.
Anybody?
The usual way to do that is:
Calculate which triangle you have hit, using the ray.
Get the UV coordinates of its three vertices and interpolate.
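For step 2, a minimal sketch of the interpolation (types and names here are illustrative, not FireMonkey API: P is the ray/triangle hit point, A, B, C the triangle's vertices, and uvA, uvB, uvC their texture coordinates):

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns the texture coordinate at hit point P inside triangle ABC
Vec2 interpolateUV(Vec3 P, Vec3 A, Vec3 B, Vec3 C, Vec2 uvA, Vec2 uvB, Vec2 uvC)
{
    // barycentric coordinates of P with respect to A, B, C
    Vec3 v0 = sub(B, A), v1 = sub(C, A), v2 = sub(P, A);
    float d00 = dot(v0, v0), d01 = dot(v0, v1), d11 = dot(v1, v1);
    float d20 = dot(v2, v0), d21 = dot(v2, v1);
    float denom = d00 * d11 - d01 * d01;
    float v = (d11 * d20 - d01 * d21) / denom;
    float w = (d00 * d21 - d01 * d20) / denom;
    float u = 1.0f - v - w;
    // weighted average of the vertex UVs
    return { u*uvA.u + v*uvB.u + w*uvC.u,
             u*uvA.v + v*uvB.v + w*uvC.v };
}

Multiplying the resulting UV by the bitmap's Width and Height then gives the pixel to look up (or draw on), just as in the plane case above.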

Calculate the new camera coordinates after a 90 degree rotation in an isometric 2D projection

I made a 2D isometric renderer. It works fine, but now I want to show my scene from 4 different points of view (NE, NW, SE, SW); however, after a 90° rotation my camera cannot keep the center of my scene on screen.
What's working:
I calculate the new projection of the scene to match the new viewpoint (x, y, z in my world).
I reorganise the parts of my scene (chunks) to draw them in the correct order.
I reorganise the 'tiles' of the 'chunks' to draw them in the correct order.
I can keep the correct center with a 180 degree rotation.
What's not working:
I cannot find the correct translation to apply to my camera after a 90 degree rotation.
What I know:
To keep the same center after a 180° rotation with my camera I have to do this:
camera.Position -= new Vector2(2 * camera.Position.X + camera.Width, 2 * camera.Position.Y + camera.Height);
[Illustration]
If the center of your map is the origin (0,0,0), this gets easy:
First you store your default camera position in a Vector3 CameraOffset, then you calculate the position using a rotation matrix. 90° in radians is half Pi, so we will use PiOverTwo. We will also use an enum to decide which direction to point, so you can say
Camera.Orientation = Orientation.East;
and the camera should fix itself :)
public enum Orientation
{
North, East, South, West
}
in camera:
public Vector3 Position { get; protected set; }
Vector3 _CameraOffset = new Vector3(0, 20, 20);
public Vector3 CameraOffset
{
get
{
return _CameraOffset;
}
set
{
_CameraOffset = value;
UpdateOrientation();
}
}
Orientation _Orientation = Orientation.North;
public Orientation Orientation
{
get
{
return _Orientation;
}
set
{
_Orientation = value;
UpdateOrientation();
}
}
private void UpdateOrientation()
{
Position = Vector3.Transform(CameraOffset, Matrix.CreateRotationY(MathHelper.PiOverTwo * (int)Orientation));
}
If you want a gliding movement between positions, I think I can help too ;)
If your camera does not focus on Vector3.Zero and should not rotate around it, you just need to change:
Position = Vector3.Transform(CameraOffset, Matrix.CreateRotationY(MathHelper.PiOverTwo * (int)Orientation));
to:
Position = Vector3.Transform(CameraOffset, Matrix.CreateRotationY(MathHelper.PiOverTwo * (int)Orientation) * Matrix.CreateTranslation(FocusPoint));
Here, FocusPoint is the point in 3D that you rotate around (your worlds center). And now you also know how to let your camera move around, if you call UpdateOrientation() in your Camera.Update() ;)
EDIT: So sorry, I totally missed the point that you are using 2D. I'll be back later to see if I can help :P

XNA - controlling an object with keyboard input

OK, so I have a ship which moves up and down along a fixed axis regardless of where the ship is facing.
How do I make the ship move in the direction it's facing? I.e. if my ship is facing east, the up key makes it go north rather than east.
Your question isn't very clear - I will assume you're using models and matrices (as opposed to SpriteBatch or something else). So, making a guess - I'd say that the order of your matrix operations is incorrect.
This answer to a similar question may help.
Each matrix operation happens around the origin. So if you're doing your rotation after you move your ship into position, your rotation will also effectively "rotate" the direction of movement.
The easiest way is to make an angle and a speed variable, so that when you press left and right you change the angle, and when you press up and down you change the speed of your ship.
KeyboardState ks;
float speed = 0;
float angle = 0;
protected override void Update(GameTime gameTime)
{
ks = Keyboard.GetState();
if(ks.IsKeyDown(Keys.Up)) speed += 10;
if (ks.IsKeyDown(Keys.Down)) speed -= 10;
if (ks.IsKeyDown(Keys.Right)) angle += 10;
if (ks.IsKeyDown(Keys.Left)) angle -= 10;
}
You need to have a direction vector, like this:
Vector3 direction = Vector3.Transform(Vector3.Forward, Matrix.CreateFromYawPitchRoll(yaw, pitch, roll));
Next, get your velocity vector
Vector3 velocity = direction * speed;
And move your ship
float time = (float)gameTime.ElapsedGameTime.TotalSeconds;
position += velocity * time;
In this example, yaw is the angle; keep pitch and roll at 0.