touchesMoved:withEvent: and UIRotationGestureRecognizer don't work together - iOS

I'm using these two methods to move and rotate a UIView. Both methods work separately, but if I rotate and then move the UIView, it disappears.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGRect rect = self.aView.frame;
    UITouch *touch = [touches anyObject];
    CGPoint pPoint = [touch previousLocationInView:self.view];
    CGPoint cPoint = [touch locationInView:self.view];
    float deltaX = cPoint.x - pPoint.x;
    float deltaY = cPoint.y - pPoint.y;
    rect.origin.x = rect.origin.x + deltaX;
    rect.origin.y = rect.origin.y + deltaY;
    self.aView.frame = rect;
}

- (void)rotate:(UIRotationGestureRecognizer *)recognizer {
    CGFloat rotation = angle + recognizer.rotation;
    NSLog(@"%f", angle * 180 / M_PI);
    self.aView.transform = CGAffineTransformMakeRotation(rotation);
    if (recognizer.state == UIGestureRecognizerStateEnded)
        angle = rotation;
}

Gesture recognizers take priority over touchesMoved:, so it's hard to use both with the same view.
Use a UIPanGestureRecognizer instead of touchesMoved: to handle dragging the UIView. You can then get the UIPanGestureRecognizer and UIRotationGestureRecognizer to cooperate with one another by implementing the
gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:
method, which is defined in the UIGestureRecognizerDelegate protocol.
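For reference, here is a minimal sketch of that setup. It assumes the view controller adopts UIGestureRecognizerDelegate and owns the self.aView and angle ivar from the question; the pan: handler name and wiring are illustrative, not from the original post.

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    UIRotationGestureRecognizer *rotate =
        [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotate:)];
    pan.delegate = self;
    rotate.delegate = self;
    [self.aView addGestureRecognizer:pan];
    [self.aView addGestureRecognizer:rotate];
}

// Allow the pan and rotation recognizers to run at the same time.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

// Drag by moving the center, then reset the translation so each callback
// delivers only the incremental movement.
- (void)pan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    CGPoint center = recognizer.view.center;
    center.x += translation.x;
    center.y += translation.y;
    recognizer.view.center = center;
    [recognizer setTranslation:CGPointZero inView:self.view];
}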

Related

UIButton - move and scale

I have a UIButton that I've created programmatically. It doesn't actually have to be a UIButton; I just need a way to mark some area above the image.
So the features I need are moving the object and resizing it. For this I have two methods:
- (void)objMove:(id)sender withEvent:(UIEvent *)event
{
    UIControl *control = sender;
    UITouch *t = [[event allTouches] anyObject];
    CGPoint pPrev = [t previousLocationInView:control];
    CGPoint p = [t locationInView:control];
    CGPoint center = control.center;
    center.x += p.x - pPrev.x;
    center.y += p.y - pPrev.y;
    control.center = center;
}

- (void)objScale:(UIPinchGestureRecognizer *)recognizer
{
    UIView *pinchView = recognizer.view;
    CGRect bounds = pinchView.bounds;
    CGPoint pinchCenter = [recognizer locationInView:pinchView];
    pinchCenter.x -= CGRectGetMidX(bounds);
    pinchCenter.y -= CGRectGetMidY(bounds);
    CGAffineTransform transform = pinchView.transform;
    transform = CGAffineTransformTranslate(transform, pinchCenter.x, pinchCenter.y);
    CGFloat scale = recognizer.scale;
    transform = CGAffineTransformScale(transform, scale, scale);
    transform = CGAffineTransformTranslate(transform, -pinchCenter.x, -pinchCenter.y);
    pinchView.transform = transform;
    recognizer.scale = 1.0;
}
Scale works fine. Moving also looks fine until I change the size of the object: when I enlarge the object it moves slower than the finger, and vice versa, when the object is smaller than the original it moves faster than the finger. Why does it work like this?
I think you should get startPoint and startCenter in
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // get startPoint and startCenter here
}

- (void)objMove:(id)sender withEvent:(UIEvent *)event
{
    UIControl *control = sender;
    UITouch *t = [[event allTouches] anyObject];
    CGPoint p = [t locationInView:control];
    // Offset the center captured at touch-begin by the distance travelled
    // since the start point, instead of mutating the stored value each event.
    CGPoint center = startCenter;
    center.x += p.x - startPoint.x;
    center.y += p.y - startPoint.y;
    control.center = center;
}
Change your code like this; maybe it works.
In your code, center is the current center, p is the current point, and pPrev is the previous point. Adding the previous-to-current delta to the current center on every event is what goes wrong.
You should work with the distance relative to the starting point and starting center, not with a delta that changes on every event.
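A different fix, not from the answer above but a common one: [t locationInView:control] reports points in the control's own (scaled) coordinate system, so once the transform changes, one point of finger movement no longer equals one point of screen movement. Measuring the touch in the untransformed superview avoids that. A minimal sketch, assuming objMove:withEvent: is wired to the control's touch-drag events as in the question:

- (void)objMove:(id)sender withEvent:(UIEvent *)event
{
    UIControl *control = sender;
    UITouch *t = [[event allTouches] anyObject];

    // Measure in the superview, whose coordinate system is unaffected
    // by the control's scale transform.
    CGPoint pPrev = [t previousLocationInView:control.superview];
    CGPoint p = [t locationInView:control.superview];

    CGPoint center = control.center;
    center.x += p.x - pPrev.x;
    center.y += p.y - pPrev.y;
    control.center = center;
}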

CGAffineTransformMakeRotation goes the other way after 180 degrees (-3.14)

So,
I am trying to do a very simple 2D disc rotation that follows the user's touch, just like a DJ turntable or something.
It is working, but there is a problem: after a certain amount of rotation it starts going backwards. That amount is 180 degrees, or, as I saw while logging the angle, -3.14 (pi).
I was wondering how I can achieve an infinite loop, meaning the user can keep rotating in either direction just by sliding a finger.
A second question: is there any way to speed up the rotation?
Here is my code right now:
#import <UIKit/UIKit.h>

@interface Draggable : UIImageView {
    CGPoint firstLoc;
    UILabel *fred;
    double angle;
}
@property (assign) CGPoint firstLoc;
@property (retain) UILabel *fred;
@end

@implementation Draggable
@synthesize fred, firstLoc;

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    angle = 0;
    if (self) {
        // Initialization code
    }
    return self;
}

- (void)handleObject:(NSSet *)touches
           withEvent:(UIEvent *)event
              isLast:(BOOL)lst
{
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    CGPoint curLoc = [touch locationInView:self];

    float fromAngle = atan2(firstLoc.y - self.center.y,
                            firstLoc.x - self.center.x);
    float toAngle = atan2(curLoc.y - (self.center.y + 10),
                          curLoc.x - (self.center.x + 10));
    float newAngle = angle + (toAngle - fromAngle);
    NSLog(@"%f", newAngle);

    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(newAngle);
    self.transform = cgaRotate;

    if (lst)
        angle = newAngle;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    firstLoc = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:NO];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:YES];
}
@end
And in the ViewController:
UIImage *tmpImage = [UIImage imageNamed:@"theDisc.png"];
CGRect cellRectangle = CGRectMake(-1, self.view.frame.size.height,
                                  tmpImage.size.width, tmpImage.size.height);
dragger = [[Draggable alloc] initWithFrame:cellRectangle];
[dragger setImage:tmpImage];
[dragger setUserInteractionEnabled:YES];
dragger.layer.anchorPoint = CGPointMake(.5, .5);
[self.view addSubview:dragger];
I am open to new/cleaner/more correct ways of doing this too.
Thanks in advance.
Flip the angle if it's below -180 or above 180 degrees. Consider the following touchesMoved implementation:
@implementation RotateView

#define DEGREES_TO_RADIANS(angle) ((angle) / 180.0 * M_PI)

CGFloat angleBetweenLinesInDegrees(CGPoint beginLineA, CGPoint endLineA,
                                   CGPoint beginLineB, CGPoint endLineB)
{
    CGFloat a = endLineA.x - beginLineA.x;
    CGFloat b = endLineA.y - beginLineA.y;
    CGFloat c = endLineB.x - beginLineB.x;
    CGFloat d = endLineB.y - beginLineB.y;
    CGFloat atanA = atan2(a, b);
    CGFloat atanB = atan2(c, d);
    // convert radians to degrees
    return (atanA - atanB) * 180 / M_PI;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint curPoint = [[touches anyObject] locationInView:self];
    CGPoint prevPoint = [[touches anyObject] previousLocationInView:self];

    // calculate rotation angle between two points
    CGFloat angle = angleBetweenLinesInDegrees(self.center, prevPoint, self.center, curPoint);

    // Flip
    if (angle > 180) {
        angle -= 360;
    } else if (angle < -180) {
        angle += 360;
    }

    self.layer.transform = CATransform3DRotate(self.layer.transform, DEGREES_TO_RADIANS(angle), .0, .0, 1.0);
}
@end
When dragging around the outer bounds of the view, it will rotate it continuously like a spinning wheel. Hope it helps.
You have some problems here:
1-)
CGPoint curLoc = [touch locationInView:self];
and
firstLoc = [touch locationInView:self];
You are transforming your view and then asking for the location of a touch in it. You cannot get the correct location of a touch in a rotated view.
Take the locations in something that is not transformed (for example self.superview, after putting the view in a container).
2-)
cellRectangle = CGRectMake(-1, self.view.frame.size.height, tmpImage.size.width, tmpImage.size.height);
You are placing your Draggable instance off the screen by passing self.view.frame.size.height as the CGRect's y parameter.
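A rough sketch of both fixes together; this is my reading of the answer, not the answerer's code, and the corrected frame values are only one possibility:

// 1) In Draggable, measure touches in the untransformed superview
//    (curLoc in handleObject:withEvent:isLast: should be changed the same way).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    firstLoc = [touch locationInView:self.superview];
}

// 2) In the view controller, give the disc a frame that is actually on screen,
//    for example anchored at the bottom edge instead of just below it.
CGRect cellRectangle = CGRectMake(0,
                                  self.view.frame.size.height - tmpImage.size.height,
                                  tmpImage.size.width,
                                  tmpImage.size.height);
dragger = [[Draggable alloc] initWithFrame:cellRectangle];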

Panning a subview of UIScrollView after zooming in

I have added a subview to a UIScrollView. When I zoom into the scroll view I want to pan around the subview.
In touchesBegan: I get the initial location of the touch, and then in touchesMoved: I determine how much to move the subview. It works perfectly when zoomScale is 1.0. However, when the scroll view is zoomed, the pointer "breaks out" of the subview it is intended to move (in the illustration, the pointer location is shown as a marquee tool).
The center of the view should be at the pointer location, not at its current position! The px and py variables ensure that wherever the subview is clicked, the position of the pointer relative to it stays the same while dragging.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    location.x = location.x * self.zoomScale;
    location.y = location.y * self.zoomScale;
    px = location.x;
    py = location.y;
    if ([touch view] == rotateView) {
        self.scrollEnabled = NO;
        return;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    location.x = location.x * self.zoomScale;
    location.y = location.y * self.zoomScale;
    if ([touch view] == rotateView) {
        rotateView.center = CGPointMake(rotateView.center.x + (location.x - px),
                                        rotateView.center.y + (location.y - py));
        px = location.x;
        py = location.y;
        return;
    }
}
Instead of the approach you're taking, make the subview another UIScrollView and let it handle the panning.
(You may wish to set scrollEnabled = NO on your subview until zooming has occurred.)
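A rough sketch of that structure, with my own guesses at the wiring; the innerScroll property, the placement code, and the zoom threshold are illustrative (self is the outer, zooming scroll view, as in the question's subclass):

// During setup: wrap rotateView in an inner scroll view that owns the panning.
- (void)setUpInnerScrollView {
    self.innerScroll = [[UIScrollView alloc] initWithFrame:rotateView.frame];
    self.innerScroll.contentSize = rotateView.bounds.size;
    self.innerScroll.scrollEnabled = NO;      // off until the user zooms in
    rotateView.frame = self.innerScroll.bounds;
    [self.innerScroll addSubview:rotateView];
    [self addSubview:self.innerScroll];
}

// In the outer scroll view's delegate, enable panning once zoomed in.
- (void)scrollViewDidZoom:(UIScrollView *)scrollView {
    self.innerScroll.scrollEnabled = (scrollView.zoomScale > 1.0);
}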

UIImage which follows touches

I'm trying to draw a ruler that follows the touches on the screen. On the first touch I set the first point, and on every following touch the ruler follows the finger on the screen, resizing and rotating around the first point depending on where the finger is.
I tried to do this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    self.regleFirstPoint = p;
    UIImageView *img = [[UIImageView alloc] initWithImage:self.regleImg];
    img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, 0, self.regleImg.size.height);
    [self.view addSubview:img];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    // Width
    float deltaY = p.y - self.regleFirstPoint.y;
    float deltaX = p.x - self.regleFirstPoint.x;
    float width = sqrt((deltaX * deltaX) + (deltaY * deltaY));
    // Angle
    float angleInRadians = atanf(deltaY / deltaX);
    float angleInDegrees = angleInRadians * 180 / M_PI; // JUST FOR INFO
    NSLog(@"angle : %f / %f", angleInRadians, angleInDegrees);
    // Resizing image
    UIImageView *img = [self.regles lastObject];
    img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, width, self.regleImg.size.height / 2);
    img.center = self.regleFirstPoint;
    CGAffineTransform transform = CGAffineTransformIdentity;
    img.transform = CGAffineTransformRotate(transform, angleInRadians);
}
The ruler doesn't follow the finger correctly; I think I missed something. What's wrong with my code?
EDIT: I also tried this after some research:
// Resizing images
img.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, largeur, self.regleImg.size.height/2);
[img.layer setAnchorPoint:CGPointMake(self.regleFirstPoint.x / img.bounds.size.width, self.regleFirstPoint.y / img.bounds.size.height)];
img.transform = CGAffineTransformRotate(img.transform, angleInRadians);
This is just a question of order:
Set transform of your view to identity
Change the frame of your view
Finally, apply your transform
Here is an updated piece of code; I just used a UIView instead of your image:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Hold first touch
    self.regleFirstPoint = [touch locationInView:self.view];
    // Reset view / image
    _rulerView.transform = CGAffineTransformIdentity;
    _rulerView.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, 0, CGRectGetHeight(_rulerView.frame));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    // Compute width
    float deltaX = p.x - self.regleFirstPoint.x;
    float deltaY = p.y - self.regleFirstPoint.y;
    float width = sqrt((deltaX * deltaX) + (deltaY * deltaY));
    // Compute angle
    float angleInRadians = atan2(deltaY, deltaX);
    NSLog(@"angle (rad) : %f", angleInRadians);
    NSLog(@"angle (deg) : %f", angleInRadians * 180 / M_PI);
    // First reset the transformation to identity
    _rulerView.transform = CGAffineTransformIdentity;
    // Set anchor point
    _rulerView.layer.anchorPoint = CGPointMake(0.0f, 0.5f);
    // Resizing view / image
    _rulerView.frame = CGRectMake(self.regleFirstPoint.x, self.regleFirstPoint.y, width, CGRectGetHeight(_rulerView.frame));
    // Reset the layer position to the first point
    _rulerView.layer.position = self.regleFirstPoint;
    // Apply rotation transformation
    _rulerView.transform = CGAffineTransformMakeRotation(angleInRadians);
}
Hope that helps.
Cyril

UIImageView rotation animation in the touched direction

I have a UIImageView containing an image of an arrow. When the user taps on the UIView, this arrow should point in the direction of the tap while maintaining its position; only its transform should change. I have implemented the following code, but it is not working as expected. I have added a screenshot: when I touch a point in the upper left, the arrow direction should be as shown, but that is not what happens.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    imageViews.transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rangle11));
    previousTouchedPoint = touchedPoint;
}

- (CGFloat)pointPairToBearingDegrees:(CGPoint)startingPoint secondPoint:(CGPoint)endingPoint
{
    CGPoint originPoint = CGPointMake(endingPoint.x - startingPoint.x, endingPoint.y - startingPoint.y); // translate so the starting point is the origin
    float bearingRadians = atan2f(originPoint.y, originPoint.x); // get bearing in radians
    float bearingDegrees = bearingRadians * (180.0 / M_PI); // convert to degrees
    bearingDegrees = (bearingDegrees > 0.0 ? bearingDegrees : (360.0 + bearingDegrees)); // correct discontinuity
    return bearingDegrees;
}
I assume you want the arrow image to point wherever you touch. I tried it, and this is what I could come up with. I put in an image view with an arrow pointing upwards (I haven't tried starting from any other position; the log gives correct angles), and on touching different locations it rotates to point at the touched location. Hope it helps (tried some old math :-)).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    CGFloat angle = [self getAngle:touchedPoint];
    imageView.transform = CGAffineTransformMakeRotation(angle);
}

- (CGFloat)getAngle:(CGPoint)touchedPoints
{
    CGFloat x1 = imageView.center.x;
    CGFloat y1 = imageView.center.y;
    CGFloat x2 = touchedPoints.x;
    CGFloat y2 = touchedPoints.y;
    CGFloat x3 = x1;
    CGFloat y3 = y2;
    CGFloat oppSide = sqrtf(((x2 - x3) * (x2 - x3)) + ((y2 - y3) * (y2 - y3)));
    CGFloat adjSide = sqrtf(((x1 - x3) * (x1 - x3)) + ((y1 - y3) * (y1 - y3)));
    CGFloat angle = atanf(oppSide / adjSide);
    // Quadrant identification
    if (x2 < imageView.center.x) {
        angle = 0 - angle;
    }
    if (y2 > imageView.center.y) {
        angle = M_PI / 2 + (M_PI / 2 - angle);
    }
    NSLog(@"Angle is %2f", angle * 180 / M_PI);
    return angle;
}
-anoop4real
Given what you told me, I think the problem is that you are not resetting your transform in touchesBegan. Try changing it to something like this and see if it works better:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    imageViews.transform = CGAffineTransformIdentity;
    imageViews.transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rangle11));
    previousTouchedPoint = touchedPoint;
}
Do you need the line to "remove the discontinuity"? atan2f() returns values between -π and +π; won't those work directly with CATransform3DMakeRotation()?
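For illustration, a simplified version of that helper along these lines, returning radians directly; this is a hypothetical sketch, not code from the question or answers:

// Bearing in radians, in atan2f's native -π to +π range, usable directly
// in a rotation transform without converting to degrees.
- (CGFloat)bearingRadiansFrom:(CGPoint)startingPoint to:(CGPoint)endingPoint
{
    return atan2f(endingPoint.y - startingPoint.y,
                  endingPoint.x - startingPoint.x);
}

// e.g. imageViews.transform =
//          CGAffineTransformMakeRotation([self bearingRadiansFrom:imageViews.center
//                                                               to:touchedPoint]);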
What you need is for the arrow to point at the last tapped point. To simplify and test, I have used a tap gesture (but it's similar to touchesBegan:withEvent:).
In the viewDidLoad method, I register the gesture :
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped:)];
[self.view addGestureRecognizer:tapGesture];
[tapGesture release];
The method called on each tap:
- (void)tapped:(UITapGestureRecognizer *)gesture
{
    CGPoint imageCenter = mFlecheImageView.center;
    CGPoint tapPoint = [gesture locationInView:self.view];
    double deltaY = tapPoint.y - imageCenter.y;
    double deltaX = tapPoint.x - imageCenter.x;
    double angleInRadians = atan2(deltaY, deltaX) + M_PI_2;
    mFlecheImageView.transform = CGAffineTransformMakeRotation(angleInRadians);
}
One key point is the + M_PI_2, because UIKit coordinates have their origin at the top-left corner (while in trigonometry we use a bottom-left origin).
