Retrieve ARVector3 from touch on cameraView - iOS

I've been playing around with the enhanced samples and read the full SDK documentation, but I can't figure out this problem. I'm trying to convert a touch on self.cameraView to an ARVector3 so I can move the 3D model to that position. At the moment I'm trying to convert the CGPoint from the tap gesture into the world ARNode, but no luck so far.
float x = [gesture locationInView:self.cameraView].x;
float y = [gesture locationInView:self.cameraView].y;
CGPoint gesturePoint = CGPointMake(x, y);
// Ask the arbiTrack world to map the screen point into the AR world
ARVector3 *newPosition = [arbiTrack.world nodeFromViewPort:gesturePoint];
// Attach an image node at the computed position
ARNode *touchNode = [ARNode nodeWithName:@"touchNode"];
ARImageNode *targetImageNode = [[ARImageNode alloc] initWithImage:[UIImage imageNamed:@"drop.png"]];
[touchNode addChild:targetImageNode];
[targetImageNode scaleByUniform:1];
touchNode.position = newPosition;
[arbiTrack.world addChild:touchNode];
This results in the following situation (screenshot omitted):
And seen on my iPhone (screenshot omitted):
Why does a touch in the upper left corner of self.cameraView map to a point that is so close by? I actually want to tap on the screen and get an X, Y, Z (ARVector3) coordinate back.
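For reference, the usual technique behind this kind of conversion is to unproject the touch into a ray and intersect that ray with a ground plane. The sketch below is a generic GLKit illustration of that idea, not the Kudan SDK API; the function name, matrices, and viewport are all assumptions standing in for whatever the SDK exposes.

#import <GLKit/GLKit.h>

// Generic sketch: unproject a touch to a near/far ray and intersect it
// with the plane y = 0 to get a world-space point.
GLKVector3 WorldPointForTouch(CGPoint touch, CGSize viewSize,
                              GLKMatrix4 modelView, GLKMatrix4 projection) {
    bool success = false;
    int viewport[4] = {0, 0, (int)viewSize.width, (int)viewSize.height};
    // UIKit's origin is top-left; GL's is bottom-left, so flip y
    float flippedY = viewSize.height - touch.y;
    GLKVector3 nearPt = GLKMathUnproject(GLKVector3Make(touch.x, flippedY, 0.0f),
                                         modelView, projection, viewport, &success);
    GLKVector3 farPt = GLKMathUnproject(GLKVector3Make(touch.x, flippedY, 1.0f),
                                        modelView, projection, viewport, &success);
    GLKVector3 dir = GLKVector3Subtract(farPt, nearPt);
    // Ray/plane intersection at y = 0 (assumes the ray is not parallel to the plane)
    float t = -nearPt.y / dir.y;
    return GLKVector3Add(nearPt, GLKVector3MultiplyScalar(dir, t));
}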

Related

SceneKit - zoom in/out to selected node of scene

I have a scene in which a human body is displayed. I want to zoom in to a specific body part when the user taps on it.
I changed the position of the camera to the position of the node, but it doesn't point exactly at it.
I also need to keep the selected part in the center of the screen when zoomed in.
How can I accomplish this zoom in / out?
I solved my problem by moving the camera instead of scaling the model. I got the tap point from the gesture recognizer, which gives the point of touch in view coordinates.
Then I converted the view coordinates to scene coordinates:
CGPoint p = [gestureRecognize locationInView:scnView];
NSArray *hitResults = [scnView hitTest:p options:nil];
// Project the world origin to find the screen-space depth (z) to unproject at
SCNVector3 projectedOrigin = [scnView projectPoint:SCNVector3Zero];
SCNVector3 vector = SCNVector3Make(p.x, p.y, projectedOrigin.z);
// Unproject the touch back into world coordinates at that depth
SCNVector3 worldPoint = [scnView unprojectPoint:vector];
and then positioned the camera at the worldPoint.
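For the "keep the selected part centered" requirement, a look-at constraint can hold the camera on the tapped node while it moves. This is a hedged sketch; cameraNode and targetNode are assumed references, not names from the question.

// Keep the camera aimed at the tapped node while repositioning it
SCNLookAtConstraint *aim = [SCNLookAtConstraint lookAtConstraintWithTarget:targetNode];
cameraNode.constraints = @[aim];
cameraNode.position = worldPoint; // worldPoint from the unprojection above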
To reposition it along the Z axis, you multiply the node's current transform matrix by a new translation matrix:
var node = childNode.transform
var translation = SCNMatrix4MakeTranslation(1.0, 1.0, adjustedZValue)
var newTrans = SCNMatrix4Mult(node, translation)
childNode.transform = newTrans
Edit: I had some names mixed up. Here it is a bit cleaned up and more "Swifty":
let transform = childNode.transform
let adjustedZValue = Float32(3)
let translation = SCNMatrix4MakeTranslation(1.0, 1.0, adjustedZValue)
let newTrans = SCNMatrix4Mult(transform, translation)
childNode.transform = newTrans
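If a smooth zoom is wanted rather than an instant jump, the position change can be wrapped in a transaction. A minimal sketch, assuming cameraNode is the node holding the scene's SCNCamera and worldPoint comes from the unprojection above:

[SCNTransaction begin];
[SCNTransaction setAnimationDuration:0.5];
cameraNode.position = worldPoint; // animates the camera to the tapped point
[SCNTransaction commit];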

GLKit Object not rotating properly

I have a 3D object.
I am rotating it with touches like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    CGPoint lastLoc = [touch previousLocationInView:self.view];
    CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
    float rotX = -1 * GLKMathDegreesToRadians(diff.y / 2.0);
    float rotY = GLKMathDegreesToRadians(diff.x / 3.0);
    bool isInvertible;
    // Transform the world z axis into model space before rotating around it
    GLKVector3 yAxis = GLKMatrix4MultiplyAndProjectVector3(GLKMatrix4Invert(_rotMatrix, &isInvertible), GLKVector3Make(0, 0, 1));
    _rotMatrix = GLKMatrix4Rotate(_rotMatrix, rotY, yAxis.x, yAxis.y, yAxis.z);
    GLKVector3 xAxis = GLKVector3Make(1, 0, 0);
    _rotMatrix = GLKMatrix4Rotate(_rotMatrix, rotX, xAxis.x, xAxis.y, xAxis.z);
}
and setting the matrices like this:
_modelViewMatrix = GLKMatrix4Identity;
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, 0.0f, -60.0f);
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, 5.5f, -4.0f);
// I know I can do this in a single call.
// Rotation steps: the first is so the air conditioner faces the camera,
// the second is the transform matrix coming from the finger gestures.
// Turning off the 90-degree rotation:
_modelViewMatrix = GLKMatrix4RotateX(_modelViewMatrix, GLKMathDegreesToRadians(90.0f));
_modelViewMatrix = GLKMatrix4Multiply(_modelViewMatrix, _rotMatrix);
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, -5.5f, +4.0f);
self.reflectionMapEffect.transform.modelviewMatrix = _modelViewMatrix;
I am translating the model-view matrix to the object's centre, rotating it, then translating back, then translating -60 on Z. But every time I try it, the object rotates around the same vector. I think the object has its own centre and is rotating around both its own centre and the scene's centre.
How can I change the object's centre in code, or how can I rotate this object properly?
The way matrix multiplication works, the transforms are applied in the object's own basis. You can imagine it as looking from a first-person perspective (from the object/model, that is). If you first move the object (translate) and then rotate, the object will still be at the same position but facing a different direction, so it will not orbit. If you swap the operations, rotating first and then moving, it will orbit (while rotating as well). For instance, if you rotate the model to face your right and then translate forward, it will appear to have translated to your right. So a true orbit consists of first rotating by some angle, then translating by the radius, and then rotating by the same negative angle. Again, try looking at it from the model's perspective.
I hope this helps, as you did not explain what exactly it is you want or need to accomplish.
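To make that recipe concrete, here is a minimal GLKit sketch (the function name and the choice of the Y axis are illustrative assumptions): rotate by the angle, translate out by the radius, then rotate back by the negative angle, which orbits the origin instead of spinning in place.

#import <GLKit/GLKit.h>

// Orbit around the world Y axis: rotate, translate out, counter-rotate
static GLKMatrix4 OrbitAroundY(float angleRadians, float radius) {
    GLKMatrix4 m = GLKMatrix4Identity;
    m = GLKMatrix4RotateY(m, angleRadians);          // face the new direction
    m = GLKMatrix4Translate(m, 0.0f, 0.0f, radius);  // step out by the radius
    m = GLKMatrix4RotateY(m, -angleRadians);         // undo the facing change
    return m;
}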

Get angle between imageview and touch

I have an image view at position (x: 138, y: 107), which isn't in the center of the screen. Now I want to calculate the angle between this point and a touch point, relative to the horizontal line, but I don't know how to do this.
Can anyone tell me more about this?
You can do something like this, where startPoint and endPoint are the two positions (for example, the image view's position and the touch location).
Example:
CGPoint endPoint = CGPointMake(50, 100);
CGPoint startPoint = CGPointMake(100, 100);
// atan2(dy, dx) measures the angle from the horizontal axis in radians;
// multiplying by 180/M_PI converts it to degrees. Note that UIKit's y axis
// points downward, so positive angles turn clockwise on screen.
float angleVal = atan2f(endPoint.y - startPoint.y, endPoint.x - startPoint.x) * 180.0f / M_PI;
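As a hedged follow-up, the same angle (kept in radians, before the degree conversion) can drive a rotation transform directly; imageView here is an assumed reference to the view from the question, not a name from the original answer.

// Hypothetical usage: rotate the image view to face the touch point
float angleRad = atan2f(endPoint.y - startPoint.y, endPoint.x - startPoint.x);
imageView.transform = CGAffineTransformMakeRotation(angleRad);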

rotation along x and y axis

I'm using GLKit along with PowerVR library for my opengl-es 2.0 3D app. The 3D scene loads with several meshes, which simulate a garage environment. I have a car in the center of the garage. I am trying to add touch handling to the app, where the user can rotate the room around (e.g., to see all 4 walls surrounding the car). I also want to allow a rotation on the x axis, though limited to a small range. Basically they can see from a little bit of the top of the car to just above the floor level.
I am able to rotate on the Y or on the X, but not both. As soon as I rotate on both axes, the car is thrown off-axis; it isn't level with the camera anymore. I wish I could explain this better, but hopefully you guys will understand.
Here is my touches implementation:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
CGPoint lastLoc = [touch previousLocationInView:self.view];
CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
float rotX = -1 * GLKMathDegreesToRadians(diff.x / 4.0);
float rotY = GLKMathDegreesToRadians(diff.y / 5.0);
PVRTVec3 xAxis = PVRTVec3(1, 0, 0);
PVRTVec3 yAxis = PVRTVec3(0,1,0);
PVRTMat4 yRotMatrix, xRotMatrix;
// create rotation matrices with angle
PVRTMatrixRotationXF(yRotMatrix, rotY);
PVRTMatrixRotationYF(xRotMatrix, -rotX);
_rotationY = _rotationY * yRotMatrix;
_rotationX = _rotationX * xRotMatrix;
}
Here's my update method:
- (void)update {
// Use the loaded effect
m_pEffect->Activate();
PVRTVec3 vFrom, vTo, vUp;
VERTTYPE fFOV;
vUp.x = 0.0f;
vUp.y = 1.0f;
vUp.z = 0.0f;
// We can get the camera position, target and field of view (fov) with GetCameraPos()
fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);
/*
We can build the world view matrix from the camera position, target and an up vector.
For this we use PVRTMat4LookAtRH().
*/
m_mView = PVRTMat4::LookAtRH(vFrom, vTo, vUp);
// rotate the camera based on the users swipe in the X direction (THIS WORKS)
m_mView = m_mView * _rotationX;
// Calculates the projection matrix
bool bRotate = false;
m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, bRotate);
}
I've tried multiplying the new X rotation matrix into the current scene rotation first and then the new Y rotation matrix second. I've tried the reverse, thinking the order of multiplication was my problem, but that didn't help. Then I tried adding the new X and Y rotation matrices together before multiplying into the current rotation, but that didn't work either. I feel that I'm close, but at this point I'm just out of ideas.
Can you guys help? Thanks. -Valerie
Update: In an effort to solve this, I'm trying to simplify it a little. I've updated the above code, removing any limit in the range of the Y rotation. Basically I calculate the X and Y rotation based on the user swipe on the screen.
If I understand this correctly, I think I want to rotate the View matrix (camera/eye) with the calculation for the _rotationX.
I think I need to use the World matrix (origin 0,0,0) for the _rotationY calculation. I'll try and get some images of exactly what I'm talking about.
Wahoo, got this working! I rotated the view matrix (created by the LookAt method) with the X rotation matrix. I rotated the model-view matrix with the Y rotation matrix.
Here's the modified Update method:
- (void)update {
PVRTVec3 vFrom, vTo, vUp;
VERTTYPE fFOV;
// We can get the camera position, target and field of view (fov) with GetCameraPos()
fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);
/*
We can build the world view matrix from the camera position, target and an up vector.
For this we use PVRTMat4LookAtRH().
*/
m_mView = PVRTMat4::LookAtRH(vFrom, vTo, PVRTVec3(0.0f, 1.0f, 0.0f));
// rotate on the X axis (finger swipe Y direction)
m_mView = m_mView * _rotationY;
// Calculates the projection matrix
m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, false);
}
Here's the modified touch moved method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
CGPoint lastLoc = [touch previousLocationInView:self.view];
CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
float rotX = -1 * GLKMathDegreesToRadians(diff.x / 2.5);
float rotY = GLKMathDegreesToRadians(diff.y / 2.5);
PVRTMat4 rotMatrixX, rotMatrixY;
// create rotation matrices with angle
PVRTMatrixRotationYF(rotMatrixX, -rotX);
PVRTMatrixRotationXF(rotMatrixY, rotY);
_rotationX = _rotationX * rotMatrixX;
_rotationY = _rotationY * rotMatrixY;
}
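As a hedged footnote on the limited X-axis range mentioned in the question: instead of accumulating _rotationY incrementally, one can accumulate the pitch angle in a separate variable, clamp it, and rebuild the matrix from the clamped value. _pitchAngle and the bounds below are assumptions, not part of the original code.

// Inside touchesMoved:, replacing the incremental _rotationY update.
// Clamp the accumulated pitch (radians; bounds are example values).
_pitchAngle = MAX(-0.15f, MIN(0.35f, _pitchAngle + rotY));
// Rebuild the X-axis rotation matrix from the clamped angle.
PVRTMatrixRotationXF(_rotationY, _pitchAngle);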

Moving the subview after rotating it doesn't work correctly

I have added a UIImageView as a subview of a UIView, and then I move this subview with the following code in touchesBegan.
CGPoint newTouch = [[touches anyObject] locationInView:self];
CGPoint lastTouch = [[touches anyObject] previousLocationInView:self];
float xDif = newTouch.x - lastTouch.x;
float yDif = newTouch.y - lastTouch.y;
translate = CGAffineTransformMakeTranslation(xDif, yDif);
[self setTransform: CGAffineTransformConcat([self transform], translate)];
then I rotate this subview with the following code.
self.transform=CGAffineTransformRotate(self.transform, RADIANS(180));
At this stage, all is well.
But when I try once more to move my subview, it moves in the opposite direction, i.e. when I want to move it upwards it moves downwards.
Any ideas or suggestions?
Why are you moving the subview using a translate transform? Just update the center property. The problem with using a translate transform is that if the view is rotated, it will translate in a direction relative to the current rotation (which, as you have discovered, may not be what you expected). If you insist on using the translate transform (again, why?), then you need to make sure the view is rotated back to 0 (neutral), do your translation of position, then rotate back to the desired rotation.
Order of transforms matters.
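A minimal sketch of that suggestion, assuming the code lives in the draggable view itself as in the question: measure the touches in the superview's coordinate space (self's own space rotates along with its transform) and move the center property directly.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Superview coordinates are unaffected by this view's rotation
    CGPoint newTouch = [[touches anyObject] locationInView:self.superview];
    CGPoint lastTouch = [[touches anyObject] previousLocationInView:self.superview];
    // Shift the center by the drag delta; no transform concatenation needed
    self.center = CGPointMake(self.center.x + (newTouch.x - lastTouch.x),
                              self.center.y + (newTouch.y - lastTouch.y));
}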
