Rotating UIView (Again) - ios

I'm encountering some problems with rotating a UIView again. This time, I'm trying to rotate one UIView at 1/12 the speed of another UIView that I'm rotating at normal speed. However, when I try to accomplish this, the UIView that's supposed to move slower behaves like this:
1st Update
https://www.youtube.com/watch?v=wj3nRJo5CMM&feature=youtu.be
2nd Update
https://www.youtube.com/watch?v=YLRkUzXSDtQ&feature=youtu.be
Here's my code:
- (void)rotateHand:(UIPanGestureRecognizer *)panGesture {
    if (panGesture.state == UIGestureRecognizerStateBegan) {
        CGPoint touchPoint = [panGesture locationInView:[self view]];
        float dx = touchPoint.x - minHandContainer.center.x;
        float dy = touchPoint.y - minHandContainer.center.y;
        arcTanMin = atan2(dy, dx);
        arcTanHour = atan2(hourHand.center.x - minHandContainer.center.x, hourHand.center.y - minHandContainer.center.y);
        if (arcTanMin < 0) {
            arcTanMin = 2 * M_PI + arcTanMin;
        }
        if (arcTanHour < 0) {
            arcTanHour = 2 * M_PI + arcTanMin;
        }
        NSLog(@"arcTanMin %f", arcTanMin);
        startTransformMin = minHandContainer.transform;
        startTransformHour = hourHandContainer.transform;
    }
    if (panGesture.state == UIGestureRecognizerStateChanged) {
        CGPoint pt = [panGesture locationInView:[self view]];
        float dx = pt.x - minHandContainer.center.x;
        float dy = pt.y - minHandContainer.center.y;
        float ang = atan2(dy, dx);
        if (ang < 0) {
            ang = 2 * M_PI + ang;
        }
        float angleDifferenceM = arcTanMin - ang;
        float angleDifferenceH = arcTanHour + angleDifferenceM * (1.0 / 12.0);
        NSLog(@"angleDiffM %f", angleDifferenceM);
        NSLog(@"angleDiffH %f", angleDifferenceH);
        minHandContainer.transform = CGAffineTransformRotate(startTransformMin, -angleDifferenceM);
        hourHandContainer.transform = CGAffineTransformRotate(startTransformHour, -angleDifferenceH);
    }
}

It appears that you're using arcTanMin as the starting reference angle for both the minute hand and hour hand. So when you make the jump across the x-axis, both angleDifferenceM and angleDifferenceH are making the jump (which is why at the moment of the jump the angle of the hour hand to the y-axis is the same as the angle of the minute hand to the x-axis), but angleDifferenceH doesn't need to make the jump. Change this:
float angleDifferenceH = angleDifferenceM * (1.0/12.0);
to
float angleDifferenceH = arcTanHour + angleDifferenceM * (1.0/12.0);
with an appropriate starting value for arcTanHour.

Related

UITapGesture to drag view but limiting the bounds of which it can be dragged

I currently have 2 circles. One big circle and one little circle. The little circle has a tap gesture recognizer that allows it to be dragged by the user. I would like the little circle's center to go no further than the big circle's radius. I have 4 auto layout constraints on the inner circle. 1 for fixed width, 1 for fixed height, 1 for distance from center for x, and 1 for distance from center for y. Here is how I am going about this:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateChanged) {
        CGPoint translation = [recognizer translationInView:self.view];
        CGFloat x = recognizer.view.center.x + translation.x;
        CGFloat y = recognizer.view.center.y + translation.y;
        CGPoint desiredPoint = CGPointMake(x, y);
        // Check if the point the user is trying to reach is outside the radius of the outer circle.
        // If it is, clamp the inner circle's center to the edge of the outer circle at the same angle.
        if ([self distanceBetweenStartPoint:self.outerCircleView.center endPoint:desiredPoint] > self.outerCircleRadius) {
            CGFloat angle = [self angleBetweenStartPoint:self.outerCircleView.center endPoint:desiredPoint];
            desiredPoint = [self findPointFromRadius:self.outerCircleRadius startPoint:self.outerCircleView.center angle:angle];
        }
        // Adjust the constraints to move the inner circle.
        self.innerCircleCenterXConstraint.constant += desiredPoint.x - recognizer.view.center.x;
        self.innerCircleCenterYConstraint.constant += desiredPoint.y - recognizer.view.center.y;
        [recognizer setTranslation:CGPointMake(0.0, 0.0) inView:self.view];
    }
}
- (CGFloat)distanceBetweenStartPoint:(CGPoint)startPoint endPoint:(CGPoint)endPoint {
    CGFloat xDif = endPoint.x - startPoint.x;
    CGFloat yDif = endPoint.y - startPoint.y;
    // Pythagorean theorem
    return sqrt((xDif * xDif) + (yDif * yDif));
}
- (CGPoint)findPointFromRadius:(CGFloat)radius startPoint:(CGPoint)startPoint angle:(CGFloat)angle {
    CGFloat x = radius * cos(angle) + startPoint.x;
    CGFloat y = radius * sin(angle) + startPoint.y;
    return CGPointMake(x, y);
}
- (CGFloat)angleBetweenStartPoint:(CGPoint)startPoint endPoint:(CGPoint)endPoint {
    CGPoint originPoint = CGPointMake(endPoint.x - startPoint.x, endPoint.y - startPoint.y);
    return atan2f(originPoint.y, originPoint.x);
}
This works almost perfectly. The problem is that I then try to find the percentage the user has moved towards the outside of the circle. So I use the distanceBetweenStartPoint: (center of the outer circle) endPoint: (center of the inner circle) method and divide the result by the radius of the outer circle. This should give a value of 1 when the circle has been dragged as far to one side as it can go. Unfortunately I am getting values like 0.9994324 or 1.000923. What could be causing this? Thanks for any insight!
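One pragmatic way to read the percentage back (a sketch, assuming a self.innerCircleView outlet for the small circle, which the snippet above doesn't show) is to clamp the ratio to [0, 1], since the atan2/cos/sin round trip in findPointFromRadius: only lands on the rim to within floating-point precision:
// Sketch: percentage of the way to the rim, clamped to absorb sub-point
// floating-point error from the clamp-to-circle trigonometry.
- (CGFloat)dragPercentage {
    CGFloat distance = [self distanceBetweenStartPoint:self.outerCircleView.center
                                              endPoint:self.innerCircleView.center];
    CGFloat ratio = distance / self.outerCircleRadius;
    return MIN(MAX(ratio, 0.0), 1.0);
}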

Pinching/Panning a CCNode in Cocos2d 3.0

I want to zoom a CCNode in and out by pinching and panning the screen. The node has a very large background, but only a portion of it is shown on the screen. The node also contains other sprites.
What I have done so far: first I register a UIPinchGestureRecognizer:
UIPinchGestureRecognizer *pinchRecognizer = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinchFrom:)];
[[[CCDirector sharedDirector] view] addGestureRecognizer:pinchRecognizer];
-(void)handlePinchFrom:(UIPinchGestureRecognizer *)pinch
{
    if (pinch.state == UIGestureRecognizerStateEnded) {
        prevScale = 1;
    }
    else {
        CGFloat dscale = [self scale] - prevScale + pinch.scale;
        if (dscale > 0)
        {
            deltaScale = dscale;
        }
        CGAffineTransform transform = CGAffineTransformScale(pinch.view.transform, deltaScale, deltaScale);
        [pinch.view setTransform:transform];
        // [_contentNode setScale:deltaScale];
        prevScale = pinch.scale;
    }
}
The problem is that it scales the whole UIView, not the CCNode. I have also tried setting the scale of my _contentNode directly.
EDIT: I have also tried this:
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)pinch
{
    if (pinch.state == UIGestureRecognizerStateBegan || pinch.state == UIGestureRecognizerStateChanged) {
        CGPoint midpoint = [pinch locationInView:[CCDirector sharedDirector].view];
        CGSize winSize = [CCDirector sharedDirector].viewSize;
        float x = midpoint.x / winSize.width;
        float y = midpoint.y / winSize.height;
        _contentNode.anchorPoint = CGPointMake(x, y);
        float scale = [pinch scale];
        _contentNode.scale *= scale;
        pinch.scale = 1;
    }
}
But it zooms from the bottom left of the screen.
I had the same problem. I use a CCScrollView that contains a CCNode larger than the device screen. I want to scroll and zoom it, but the node shouldn't scroll out of the screen or scale down smaller than the screen. So I created my own subclass of CCScrollView, where I handle the pinch. It has a few small glitches, but on the whole it works fine.
When the pinch begins, I set the anchor point of my node to the pinch location in node space. Then I need to shift the node's position proportionally to the change in anchor point, so that moving the anchor point doesn't change the node's location in the view:
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        _previousScale = self.contentNode.scale;
    }
    else if (recognizer.state == UIGestureRecognizerStateBegan) {
        float X = [recognizer locationInNode:self.contentNode].x / self.contentNode.contentSize.width;
        float Y = [recognizer locationInNode:self.contentNode].y / self.contentNode.contentSize.height;
        float positionX = self.contentNode.position.x + self.contentNode.boundingBox.size.width * (X - self.contentNode.anchorPoint.x);
        float positionY = self.contentNode.position.y + self.contentNode.boundingBox.size.height * (Y - self.contentNode.anchorPoint.y);
        self.contentNode.anchorPoint = ccp(X, Y);
        self.contentNode.position = ccp(positionX, positionY);
    }
    else {
        CGFloat scale = _previousScale * recognizer.scale;
        if (scale >= maxScale) {
            self.contentNode.scale = maxScale;
        }
        else if (scale <= [self minScale]) {
            self.contentNode.scale = [self minScale];
        }
        else {
            self.contentNode.scale = scale;
        }
    }
}
I also need to change the CCScrollView's min and max scroll so that my node never scrolls out of view. The default anchor point is (0, 1), so I shift the min and max scroll proportionally to the new anchor point:
- (float)maxScrollX
{
    if (!self.contentNode) return 0;
    float maxScroll = self.contentNode.boundingBox.size.width - self.contentSizeInPoints.width;
    if (maxScroll < 0) maxScroll = 0;
    return maxScroll - self.contentNode.boundingBox.size.width * self.contentNode.anchorPoint.x;
}
- (float)maxScrollY
{
    if (!self.contentNode) return 0;
    float maxScroll = self.contentNode.boundingBox.size.height - self.contentSizeInPoints.height;
    if (maxScroll < 0) maxScroll = 0;
    return maxScroll - self.contentNode.boundingBox.size.height * (1 - self.contentNode.anchorPoint.y);
}
- (float)minScrollX
{
    float minScroll = [super minScrollX];
    return minScroll - self.contentNode.boundingBox.size.width * self.contentNode.anchorPoint.x;
}
- (float)minScrollY
{
    float minScroll = [super minScrollY];
    return minScroll - self.contentNode.boundingBox.size.height * (1 - self.contentNode.anchorPoint.y);
}
UIGestureRecognizer doesn't have a locationInNode: method, so I added one via a category. It just returns the touch location in node space:
#import "UIGestureRecognizer+locationInNode.h"
#implementation UIGestureRecognizer (locationInNode)
- (CGPoint) locationInNode:(CCNode*) node
{
CCDirector* dir = [CCDirector sharedDirector];
CGPoint touchLocation = [self locationInView: [self view]];
touchLocation = [dir convertToGL: touchLocation];
return [node convertToNodeSpace:touchLocation];
}
- (CGPoint) locationInWorld
{
CCDirector* dir = [CCDirector sharedDirector];
CGPoint touchLocation = [self locationInView: [self view]];
return [dir convertToGL: touchLocation];
}
#end
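For completeness, the header that the #import above refers to only needs to declare the two methods; a minimal sketch:
// UIGestureRecognizer+locationInNode.h (sketch)
#import <UIKit/UIKit.h>
#import "cocos2d.h"

@interface UIGestureRecognizer (locationInNode)

- (CGPoint)locationInNode:(CCNode *)node;
- (CGPoint)locationInWorld;

@end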

GLKMathUnproject : detecting a tap on an object

So it seems that GLKit has GLKMathUnproject, which can be used to work out where you clicked in 3D space (awesome).
However, I can't get it to work. I have copied in a few different examples and it still does not detect a tap on my cube at (0, 0, 0). I basically step along the ray in a for loop and check whether it hits my cube.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *allTouches = [event allTouches];
    UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
    CGPoint touch1Point = [touch1 locationInView:self.view];
    GLKVector3 window_coord = GLKVector3Make(touch1Point.x, touch1Point.y, 0.0f);
    bool result;
    GLint viewport[4] = {};
    glGetIntegerv(GL_VIEWPORT, viewport);
    GLKVector3 near_pt = GLKMathUnproject(window_coord, _baseModelViewMatrix, _projectionMatrix, &viewport[0], &result);
    window_coord = GLKVector3Make(touch1Point.x, touch1Point.y, 1.0f);
    GLKVector3 far_pt = GLKMathUnproject(window_coord, _baseModelViewMatrix, _projectionMatrix, &viewport[0], &result);
    // need to get z=0 from
    // assumes near and far are on opposite sides of z=0
    float z_magnitude = fabs(far_pt.z - near_pt.z);
    float near_pt_factor = fabs(near_pt.z) / z_magnitude;
    float far_pt_factor = fabs(far_pt.z) / z_magnitude;
    GLKVector3 final_pt = GLKVector3Add(GLKVector3MultiplyScalar(near_pt, far_pt_factor), GLKVector3MultiplyScalar(far_pt, near_pt_factor));
    float xDif = (final_pt.x - near_pt.x) / 1000;
    float yDif = (final_pt.y - near_pt.y) / 1000;
    float zDif = (final_pt.z - near_pt.z) / 1000;
    for (int i = 0; i < 100; i++)
    {
        if ((near_pt.x + (xDif * i)) > self.cube.position.x - self.cube.scale.x && (near_pt.x + (xDif * i)) < self.cube.position.x + self.cube.scale.x &&
            (near_pt.y + (yDif * i)) > self.cube.position.y - self.cube.scale.y && (near_pt.y + (yDif * i)) < self.cube.position.y + self.cube.scale.y &&
            (near_pt.z + (zDif * i)) > self.cube.position.z - self.cube.scale.z && (near_pt.z + (zDif * i)) < self.cube.position.z + self.cube.scale.z)
        {
            NSLog(@"%f %f %f", final_pt.x, final_pt.y, final_pt.z);
            NSLog(@"Hit cube");
        }
    }
}
I found the answer here
Updating OpenGL ES Touch Detection (Ray Tracing) for iPad Retina?
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint tapLoc = [recognizer locationInView:self.view];
    tapLoc.x *= [UIScreen mainScreen].scale;
    tapLoc.y *= [UIScreen mainScreen].scale;
    bool testResult;
    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);
    float uiKitOffset = 113; // Need to factor in the height of the nav bar + the height of the tab bar at the bottom in the storyboard.
    GLKVector3 nearPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3] + uiKitOffset) * -1, 0.0), modelViewMatrix, projectionMatrix, &viewport[0], &testResult);
    GLKVector3 farPt = GLKMathUnproject(GLKVector3Make(tapLoc.x, (tapLoc.y - viewport[3] + uiKitOffset) * -1, 1.0), modelViewMatrix, projectionMatrix, &viewport[0], &testResult);
    farPt = GLKVector3Subtract(farPt, nearPt);
    ....
}
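The rest of that answer is trimmed above; for illustration only, one way the hit test could continue from there (a sketch, not the linked answer's code, assuming self.cube.position and self.cube.scale are GLKVector3 properties describing an axis-aligned box, as in the question) is a standard ray/box slab test:
// Sketch: slab test of the ray (origin nearPt, direction rayDir) against the cube's
// axis-aligned bounding box. Continues inside handleTap: after the subtraction above.
GLKVector3 rayDir = farPt;                      // farPt already holds (farPt - nearPt)
GLKVector3 boxMin = GLKVector3Subtract(self.cube.position, self.cube.scale);
GLKVector3 boxMax = GLKVector3Add(self.cube.position, self.cube.scale);

float origin[3] = { nearPt.x, nearPt.y, nearPt.z };
float dir[3]    = { rayDir.x, rayDir.y, rayDir.z };
float bMin[3]   = { boxMin.x, boxMin.y, boxMin.z };
float bMax[3]   = { boxMax.x, boxMax.y, boxMax.z };

float tMin = -1.0e30f, tMax = 1.0e30f;          // effectively -inf / +inf
BOOL hit = YES;
for (int axis = 0; axis < 3; axis++) {
    if (fabsf(dir[axis]) < 1e-6f) {
        // Ray is parallel to this pair of slabs: miss if the origin lies outside them.
        if (origin[axis] < bMin[axis] || origin[axis] > bMax[axis]) { hit = NO; break; }
    } else {
        float t1 = (bMin[axis] - origin[axis]) / dir[axis];
        float t2 = (bMax[axis] - origin[axis]) / dir[axis];
        tMin = MAX(tMin, MIN(t1, t2));
        tMax = MIN(tMax, MAX(t1, t2));
        if (tMin > tMax) { hit = NO; break; }   // slab intervals no longer overlap: miss
    }
}
if (hit && tMax >= 0) {
    NSLog(@"Hit cube");
}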

iOS using accelerometer to move an object within a circle

I am trying to use the accelerometer to move an image within a circle. The issue I am having is that when the image hits the edge of the circle, it just jumps to the other side of the circle. My code is below:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    //NSLog(@"x : %g", acceleration.x);
    //NSLog(@"y : %g", acceleration.y);
    //NSLog(@"z : %g", acceleration.z);
    delta.x = acceleration.x * 10;
    delta.y = acceleration.y * 10;
    joypadCap.center = CGPointMake(joypadCap.center.x + delta.x, joypadCap.center.y - delta.y);
    distance = sqrtf(((joypadCap.center.x - 160) * (joypadCap.center.x - 160)) +
                     ((joypadCap.center.y - 206) * (joypadCap.center.y - 206)));
    //NSLog(@"Distance : %f", distance);
    touchAngle = atan2(joypadCap.center.y, joypadCap.center.x);
    NSLog(@"Angle : %f", touchAngle);
    if (distance > 50) {
        joypadCap.center = CGPointMake(160 - cosf(touchAngle) * 50, 206 - sinf(touchAngle) * 50);
    }
}
I was having the same issue when attempting to implement a circular spirit level using CMDeviceMotion. I found it was an issue with the coordinates passed to atan2(y,x). This function requires cartesian coordinates, with (0,0) in the centre of the view. However, the screen coordinates have (0,0) in the top left corner. I created methods to convert a point between the two coordinate systems, and now it's working well.
I put up a sample project here on github, but here's the most important part:
float distance = sqrtf(((point.x - halfOfWidth) * (point.x - halfOfWidth)) +
                       ((point.y - halfOfWidth) * (point.y - halfOfWidth)));
if (distance > maxDistance)
{
    // Convert point from screen coordinate system to cartesian coordinate system,
    // with (0,0) located in the centre of the view
    CGPoint pointInCartesianCoordSystem = [self convertScreenPointToCartesianCoordSystem:point
                                                                                  inFrame:self.view.frame];
    // Calculate angle of point in radians from centre of the view
    CGFloat angle = atan2(pointInCartesianCoordSystem.y, pointInCartesianCoordSystem.x);
    // Get new point on the edge of the circle
    point = CGPointMake(cosf(angle) * maxDistance, sinf(angle) * maxDistance);
    // Convert back to screen coordinate system
    point = [self convertCartesianPointToScreenCoordSystem:point inFrame:self.view.frame];
}
And:
- (CGPoint)convertScreenPointToCartesianCoordSystem:(CGPoint)point
                                             inFrame:(CGRect)frame
{
    float x = point.x - (frame.size.width / 2.0f);
    float y = (point.y - (frame.size.height / 2.0f)) * -1.0f;
    return CGPointMake(x, y);
}

- (CGPoint)convertCartesianPointToScreenCoordSystem:(CGPoint)point
                                             inFrame:(CGRect)frame
{
    float x = point.x + (frame.size.width / 2.0f);
    float y = (point.y * -1.0f) + (frame.size.height / 2.0f);
    return CGPointMake(x, y);
}

UIView touch location coordinates

What are the coordinates used in UIViews and their corresponding superviews? I have this code with which I would like to detect a 'corridor' where the user can touch, similar to this image: http://img17.imageshack.us/img17/4416/bildschirmfoto20100721u.png
This is the code I have:
CGPoint touch = [recognizer locationInView:[shuttle superview]];
CGPoint centre = shuttle.center;
int outerRadius = shuttle.bounds.size.width / 2;
int innerRadius = (shuttle.bounds.size.width / 2) - 30;
if ((touch.x < outerRadius && touch.y < outerRadius)) {
    NSLog(@"in outer");
    if (touch.x > innerRadius && touch.y > innerRadius) {
        NSLog(@"in corridor");
    }
}
The radii are approximately 500 and 600, and the touch x and y are 100 and 200...
Thus, the NSLog "in corridor" never gets called.
Thanks
Your condition is wrong. According to it, the corridor is a square with its center at (0, 0) instead of shuttle.center. Try:
CGFloat dx = touch.x - centre.x;
CGFloat dy = touch.y - centre.y;
CGFloat r2 = dx * dx + dy * dy;
if (r2 < outerRadius * outerRadius) {
    NSLog(@"in outer");
    if (r2 > innerRadius * innerRadius) {
        NSLog(@"in corridor");
    }
}
instead.
Even if the corridor really is meant to be a square, you should test fabs(dx) and fabs(dy), not touch.x and touch.y.
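For reference, the square-corridor variant that last sentence describes would look something like this (a sketch; note the inner test uses ||, since a point is in the corridor as soon as it leaves the inner square on either axis):
// Square "corridor" variant: compare the absolute offsets from the centre.
CGFloat dx = touch.x - centre.x;
CGFloat dy = touch.y - centre.y;
if (fabs(dx) < outerRadius && fabs(dy) < outerRadius) {
    NSLog(@"in outer");
    if (fabs(dx) > innerRadius || fabs(dy) > innerRadius) {
        NSLog(@"in corridor");
    }
}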
