Calculating the swipe distance using touchesBegan and touchesEnded - ios

I am able to find the x,y location on the iPhone screen using a method similar to: How to get a CGPoint from a tapped location?. However, since touchesBegan and touchesEnded return void, I am unable to return their locations directly and use them in a different method to calculate the distance. The distance method I am trying to use is:
- (CGFloat)calculateDistanceBetweenPoint1:(CGPoint)start point2:(CGPoint)end
{
    CGFloat dx = end.x - start.x;
    CGFloat dy = end.y - start.y;
    return sqrt(dx * dx + dy * dy);
}
where end is the coordinate captured in touchesEnded and start is the coordinate captured in touchesBegan. Once I find the distance in points I will then convert it to inches or centimeters. Any suggestions?
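A common approach (a rough sketch, untested) is to stash the starting point in an instance variable when touchesBegan fires and compute the distance when touchesEnded fires; the _startPoint ivar below is an assumption, not part of the original code. Note that locationInView: returns points, not pixels, so converting to inches or centimeters would additionally require the device's points-per-inch, which is not exposed by a public API.

// Assumed instance variable to carry the start location between callbacks:
// CGPoint _startPoint;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    _startPoint = [touch locationInView:self.view];   // remember where the swipe began
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint endPoint = [touch locationInView:self.view];
    CGFloat distance = [self calculateDistanceBetweenPoint1:_startPoint point2:endPoint];
    NSLog(@"Swipe distance: %f points", distance);
}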

Related

Objective C compare two CGPoint to see if they are close?

So I currently get the location of a touch by using
CGPoint location = [touch locationInView:self.view];
Now what I want to do is check the location of the next touch to see if the two locations are close, say within 25 points on the x or y axis.
There are a few posts that show how to compare whether two touches are equivalent, but is there a way to calculate the distance between multiple points? Any info would be awesome.
To estimate the distance between two CGPoints, you can use the simple Pythagorean formula:
CGFloat dX = (p2.x - p1.x);
CGFloat dY = (p2.y - p1.y);
CGFloat distance = sqrt((dX * dX) + (dY * dY));
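To apply this to the question's use case, you could keep the previous touch location around and compare the distance against the 25-point threshold. A rough sketch, assuming you are inside a touch handler where touch is available and lastTouchLocation is a property you maintain yourself:

CGPoint p1 = self.lastTouchLocation;                    // stored from the previous touch (assumed property)
CGPoint p2 = [touch locationInView:self.view];
CGFloat distance = hypot(p2.x - p1.x, p2.y - p1.y);     // hypot() does the Pythagorean math for you
if (distance <= 25.0) {
    // the two touches are "close" by the question's definition
}
self.lastTouchLocation = p2;                            // remember for the next comparison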

How to set specific areas of uiimage to process touch gestures?

I'm making a photo hunt style app. I've got a number of X-rays and I need to set specific areas of the UIImage to process touch events as correct and others as incorrect.
I understand that I can use the code below to get the tap location in the image view but how do I declare an area on the image view as correct and compare it to the tap location value?
CGPoint tapLocation = [gesture locationInView:self.imagePlateA];
Any help much appreciated!
So you have to programmatically create "regions" and, after you get the tap point, test whether it falls inside one of them. For example:
//Get the tap location
CGPoint tapLocation = [gesture locationInView:self.imagePlateA];
if ([self checkIfTap:tapLocation inRegionWithCenter:CGPointMake(someX, someY) radius:radius]) {
    //YAY WE'RE WITHIN THE BOUNDS OF A CIRCLE AT POINT (someX, someY)
    //THAT HAS A RADIUS OF radius
}
and the checkIfTap:inRegionWithCenter:radius: method can be defined like this:
- (BOOL)checkIfTap:(CGPoint)tapLocation inRegionWithCenter:(CGPoint)center radius:(CGFloat)radius {
    CGFloat dx = tapLocation.x - center.x;
    CGFloat dy = tapLocation.y - center.y;
    //Pythagorean theorem
    if (sqrt(dx * dx + dy * dy) < radius) {
        return YES;
    } else {
        return NO;
    }
}
If the correct locations in the image are CGRects rather than circular regions around points, you could use CGRectContainsPoint() instead.
CGGeometry Reference
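For instance, the rectangular case could look like the sketch below; correctRegion and its coordinates are made up for illustration and are not from the original post:

// Hypothetical rectangular "correct" area, in the image view's coordinate space
CGRect correctRegion = CGRectMake(120.0, 80.0, 60.0, 40.0);
CGPoint tapLocation = [gesture locationInView:self.imagePlateA];
if (CGRectContainsPoint(correctRegion, tapLocation)) {
    // the tap landed inside the rectangular target area
}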

How does anchorPoint influence center in a UIView?

I'm working on an app that lets the user resize and rotate a photo using UIGestureRecognizers. I have this code which adjusts the anchorPoint based on where the user is applying touches (to make it look like they're scaling the image at the point where their fingers actually are):
- (void)adjustAnchorPointForGestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
{
    UIView *gestureRecognizerView = gestureRecognizer.view;
    CGPoint locationInView = [gestureRecognizer locationInView:gestureRecognizerView];
    CGPoint locationInSuperview = [gestureRecognizer locationInView:gestureRecognizerView.superview];
    gestureRecognizerView.layer.anchorPoint = CGPointMake(locationInView.x / gestureRecognizerView.bounds.size.width, locationInView.y / gestureRecognizerView.bounds.size.height);
    gestureRecognizerView.center = locationInSuperview;
}
Later on, I simply want to calculate the origin based on the center and bounds with this code:
CGRect transformedBounds = CGRectApplyAffineTransform(view.bounds, view.transform);
CGPoint origin = CGPointMake(view.center.x - (transformedBounds.size.width * view.layer.anchorPoint.x), view.center.y - (transformedBounds.size.height * view.layer.anchorPoint.y));
And it's coming out incorrectly (I'm comparing against the frame value which ironically is supposed to be invalidated but actually does have the correct value).
So all in all I'm wondering, what am I not taking into account here? How is the anchorPoint influencing the center in a way I'm not able to determine?
I think the problem is that the origin you are calculating is not really an origin, but rather an offset of the origin of your transformedBounds rect.
I haven't fully tested it, but if you do something like this you should get the correct frame:
CGRect transformedBounds = CGRectApplyAffineTransform(view.bounds, view.transform);
CGSize originOffset = CGSizeMake(
    view.center.x - (transformedBounds.size.width * view.layer.anchorPoint.x),
    view.center.y - (transformedBounds.size.height * view.layer.anchorPoint.y));
transformedBounds.origin.x += originOffset.width;
transformedBounds.origin.y += originOffset.height;

How can I move UIView with touch keeping it inside of a circle?

I want to move a UIView inside of a circle. The UIView should be able to move to any point inside the circle, but it must not cross the circle's border line. I am calculating the distance between the circle's center and the touch point:
var distance = sqrt(
    pow((touchPoint.x - selfCenter.x), 2) + pow((touchPoint.y - selfCenter.y), 2)
)
And I limit the UIView's movement so that it cannot leave the circle:
if distance <= radius {
    theUIView.center = touchPoint
}
The problem starts here: if the touch moves outside the circle, the UIView gets stuck at the border, inside the circle. That is why I am trying to write an else statement; this is as far as I have gotten:
if distance <= radius {
    theUIView.center = touchPoint
} else {
    theUIView.center = CGPointMake(
        touchPoint.x / distance * radius,
        touchPoint.y / distance * radius
    )
}
The question is: how can I keep the UIView inside the circle while the touches keep moving? A hint would be great.
There are similar questions here (like this one) but they did not help.
Your else case looks wrong. If you want to "project" a point outside of the circle onto the circle boundary, then it should be:
if distance <= radius {
    theUIView.center = touchPoint
} else {
    theUIView.center = CGPointMake(
        selfCenter.x + (touchPoint.x - selfCenter.x) / distance * radius,
        selfCenter.y + (touchPoint.y - selfCenter.y) / distance * radius
    )
}
This works because dividing (touchPoint - selfCenter) by distance gives a unit vector pointing from the center toward the touch; scaling it by radius and adding it back to selfCenter lands exactly on the circle's boundary. Remark: the distance can be computed more easily using the hypot() function:
var distance = hypot(touchPoint.x - selfCenter.x, touchPoint.y - selfCenter.y)

rotation along x and y axis

I'm using GLKit along with PowerVR library for my opengl-es 2.0 3D app. The 3D scene loads with several meshes, which simulate a garage environment. I have a car in the center of the garage. I am trying to add touch handling to the app, where the user can rotate the room around (e.g., to see all 4 walls surrounding the car). I also want to allow a rotation on the x axis, though limited to a small range. Basically they can see from a little bit of the top of the car to just above the floor level.
I am able to rotate on the Y OR on the X, but not both. As soon as I rotate on both axes, the car is thrown off-axis. The car isn't level with the camera anymore. I wish I could explain this better, but hopefully you guys will understand.
Here is my touches implementation:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    CGPoint lastLoc = [touch previousLocationInView:self.view];
    CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);

    float rotX = -1 * GLKMathDegreesToRadians(diff.x / 4.0);
    float rotY = GLKMathDegreesToRadians(diff.y / 5.0);

    PVRTVec3 xAxis = PVRTVec3(1, 0, 0);
    PVRTVec3 yAxis = PVRTVec3(0, 1, 0);

    PVRTMat4 yRotMatrix, xRotMatrix;
    // create rotation matrices with angle
    PVRTMatrixRotationXF(yRotMatrix, rotY);
    PVRTMatrixRotationYF(xRotMatrix, -rotX);

    _rotationY = _rotationY * yRotMatrix;
    _rotationX = _rotationX * xRotMatrix;
}
Here's my update method:
- (void)update {
    // Use the loaded effect
    m_pEffect->Activate();

    PVRTVec3 vFrom, vTo, vUp;
    VERTTYPE fFOV;
    vUp.x = 0.0f;
    vUp.y = 1.0f;
    vUp.z = 0.0f;

    // We can get the camera position, target and field of view (fov) with GetCameraPos()
    fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);

    /*
        We can build the world view matrix from the camera position, target and an up vector.
        For this we use PVRTMat4LookAtRH().
    */
    m_mView = PVRTMat4::LookAtRH(vFrom, vTo, vUp);

    // rotate the camera based on the user's swipe in the X direction (THIS WORKS)
    m_mView = m_mView * _rotationX;

    // Calculate the projection matrix
    bool bRotate = false;
    m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, bRotate);
}
I've tried multiplying the new X rotation matrix into the current scene rotation first, and then multiplying in the new Y rotation matrix second. I've tried the reverse of that, thinking the order of multiplication was my problem. That didn't help. Then I tried adding the new X and Y rotation matrices together before multiplying them into the current rotation, but that didn't work either. I feel that I'm close, but at this point I'm just out of ideas.
Can you guys help? Thanks. -Valerie
Update: In an effort to solve this, I'm trying to simplify it a little. I've updated the above code, removing any limit in the range of the Y rotation. Basically I calculate the X and Y rotation based on the user swipe on the screen.
If I understand this correctly, I think I want to rotate the View matrix (camera/eye) with the calculation for the _rotationX.
I think I need to use the World matrix (origin 0,0,0) for the _rotationY calculation. I'll try and get some images of exactly what I'm talking about.
Wahoo, got this working! I rotated the view matrix (created by the LookAt method) with the X rotation matrix, and I rotated the model-view matrix with the Y rotation matrix.
Here's the modified Update method:
- (void)update {
    PVRTVec3 vFrom, vTo, vUp;
    VERTTYPE fFOV;

    // We can get the camera position, target and field of view (fov) with GetCameraPos()
    fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);

    /*
        We can build the world view matrix from the camera position, target and an up vector.
        For this we use PVRTMat4LookAtRH().
    */
    m_mView = PVRTMat4::LookAtRH(vFrom, vTo, PVRTVec3(0.0f, 1.0f, 0.0f));

    // rotate on the X axis (finger swipe Y direction)
    m_mView = m_mView * _rotationY;

    // Calculate the projection matrix
    m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, false);
}
Here's the modified touch moved method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    CGPoint lastLoc = [touch previousLocationInView:self.view];
    CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);

    float rotX = -1 * GLKMathDegreesToRadians(diff.x / 2.5);
    float rotY = GLKMathDegreesToRadians(diff.y / 2.5);

    PVRTMat4 rotMatrixX, rotMatrixY;
    // create rotation matrices with angle
    PVRTMatrixRotationYF(rotMatrixX, -rotX);
    PVRTMatrixRotationXF(rotMatrixY, rotY);

    _rotationX = _rotationX * rotMatrixX;
    _rotationY = _rotationY * rotMatrixY;
}
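The original question also mentioned wanting to limit the X-axis rotation to a small range. One untested idea (my own sketch, not part of the answer above; _pitchAngle is an assumed float instance variable and the clamp range is arbitrary) is to accumulate and clamp the pitch angle inside touchesMoved, rebuilding _rotationY from the clamped absolute angle instead of multiplying incremental rotations into it:

// Inside touchesMoved:, replacing the "_rotationY = _rotationY * rotMatrixY;" line.
// Assumed ivar: float _pitchAngle;  // accumulated rotation about the X axis, in radians
_pitchAngle += rotY;
_pitchAngle = MAX(-0.3f, MIN(_pitchAngle, 0.3f));   // clamp to roughly +/- 17 degrees (arbitrary range)
PVRTMatrixRotationXF(_rotationY, _pitchAngle);      // rebuild the matrix from the clamped angle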
