UISwipeGestureRecognizer action fires even though it is added inside an if/else statement - ios

I am attempting to run an action in iOS 7 Sprite Kit when the user swipes the left half of the screen.
To accomplish this, I loop over the touch events in touchesBegan:withEvent: and use an if statement to check whether the touch location is in the left half of the view's bounds. The if statement itself executes properly (the else branch logs the expected message when the touch starts on the right half of the screen). The action triggered by the UISwipeGestureRecognizer, however, is called regardless of where the touch starts. I have included a code sample below.
Is there a reason this is not working as expected?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    for (UITouch *touch in touches) {
        CGPoint touchLocation = [touch locationInView:self.view];
        NSLog(@"Touch location: %f, %f", touchLocation.x, touchLocation.y);
        if (touchLocation.x < self.view.bounds.size.width / 2) {
            UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedRight)];
            swipeRight.numberOfTouchesRequired = 1;
            swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
            [self.view addGestureRecognizer:swipeRight];
            UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedLeft)];
            swipeLeft.numberOfTouchesRequired = 1;
            swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
            [self.view addGestureRecognizer:swipeLeft];
            UISwipeGestureRecognizer *swipeUp = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedUp)];
            swipeUp.numberOfTouchesRequired = 1;
            swipeUp.direction = UISwipeGestureRecognizerDirectionUp;
            [self.view addGestureRecognizer:swipeUp];
        }
        else {
            NSLog(@"Touches were on the right!");
        }
    }
}
-(void)screenSwipedRight
{
    CGFloat percentToRight = 1 - (self.playerOne.position.x / self.view.bounds.size.width);
    NSTimeInterval timeToRight = self.horizontalRunSpeed * percentToRight;
    NSLog(@"Percent to right = %f", percentToRight);
    NSLog(@"Time to right = %f", timeToRight);
    SKAction *moveNodeRight = [SKAction moveToX:self.view.bounds.size.width - self.playerOne.size.width duration:timeToRight];
    [self.playerOne runAction:[SKAction sequence:@[moveNodeRight]]];
}

Is there a reason this is not working as expected?
Yes. You are adding a new set of swipe gesture recognizers every time a touch begins on the left half of the screen, you never remove them, and you never restrict the conditions under which the recognizers are allowed to begin.
This should fix your issue:
Delete your implementation of touchesBegan:withEvent:.
Add your gesture recognizers in viewDidLoad (a sketch of this setup follows the code below).
Set delegate = self for all of the gesture recognizers.
Add this code:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint touchLocation = [gestureRecognizer locationInView:self.view];
    NSLog(@"Touch location: %f, %f", touchLocation.x, touchLocation.y);
    BOOL shouldBegin = (touchLocation.x < self.view.bounds.size.width / 2);
    return shouldBegin;
}
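For steps 2 and 3, a minimal sketch of the viewDidLoad setup might look like the following, reusing the selector names from the question; it assumes the class adopts UIGestureRecognizerDelegate.
- (void)viewDidLoad {
    [super viewDidLoad];
    // Create each recognizer once, pointing at the existing action methods.
    UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedRight)];
    swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
    UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedLeft)];
    swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
    UISwipeGestureRecognizer *swipeUp = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(screenSwipedUp)];
    swipeUp.direction = UISwipeGestureRecognizerDirectionUp;
    for (UISwipeGestureRecognizer *swipe in @[swipeRight, swipeLeft, swipeUp]) {
        swipe.numberOfTouchesRequired = 1;
        // The delegate is what causes gestureRecognizerShouldBegin: above to be consulted.
        swipe.delegate = self;
        [self.view addGestureRecognizer:swipe];
    }
}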

Related

Gesture recognizer with Voice Over active

I developed an application that allows the user to draw a finger signature on a canvas.
This feature is implemented using a UIPanGestureRecognizer with a target action that draws a line in a UIView, but when “Voice Over” is active the gesture recognizer action is no longer triggered.
Gesture initialization code
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
pan.maximumNumberOfTouches = pan.minimumNumberOfTouches = 1;
[self addGestureRecognizer:pan];
Gesture action code
- (void)pan:(UIPanGestureRecognizer *)pan {
    CGPoint currentPoint = [pan locationInView:self];
    CGPoint midPoint = midpoint(previousPoint, currentPoint);
    if (pan.state == UIGestureRecognizerStateBegan)
    {
        [path moveToPoint:currentPoint];
    }
    else if (pan.state == UIGestureRecognizerStateChanged)
    {
        [path addQuadCurveToPoint:midPoint controlPoint:previousPoint];
    }
    previousPoint = currentPoint;
    [self setNeedsDisplay];
}
Is there any way to draw a line in a view using a gesture while “Voice Over” is active?
Thanks and regards!
I resolved my problem by setting both the isAccessibilityElement and accessibilityTraits properties on the canvas UIView:
canvasView.isAccessibilityElement = YES;
canvasView.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction;

Dragging coordinates with CGPoint

I want to get the coordinates of the user's finger while dragging. I tried this code, but it says the coordinates are always {0, 0}. What's wrong?
- (IBAction)Drag {
    UIPanGestureRecognizer *Recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragged)];
    [self.view addGestureRecognizer:Recognizer];
}
-(void)dragged {
    UITouch *touch;
    CGPoint location = [touch locationInView:touch.view];
    NSLog(@"%@", NSStringFromCGPoint(location));
}
I also tried NSLog(@"%.2f %.2f", location.x, location.y); and got the same result.
Thanks
That's expected: you're using touch without ever having assigned a value to it.
The action for a gesture recognizer takes a parameter, which is the recognizer itself, and the recognizer has a locationInView: method, so you should use that. You also need to check the state of the recognizer. Finally, you probably don't want to add the gesture recognizer only when you need it; just add it once from the start.
// probably in your viewDidLoad
UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                             action:@selector(panGestureRecognizerAction:)];
[self.view addGestureRecognizer:recognizer];

- (void)panGestureRecognizerAction:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan ||
        recognizer.state == UIGestureRecognizerStateChanged)
    {
        CGPoint location = [recognizer locationInView:recognizer.view];
        NSLog(@"%@", NSStringFromCGPoint(location));
    }
}

TouchesMoved only moves a little

I am trying to allow the user to drag a label around the screen, but in the simulator it only moves a little each time I touch somewhere on the screen. It jumps to the location and drags slightly, but then it stops dragging and I have to touch a different location to get it to move again. Here is my code from my .m file.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *Drag = [[event allTouches] anyObject];
    firstInitial.center = [Drag locationInView:self.view];
}
My ultimate goal is to be able to drag three different labels on the screen, but I'm just trying to tackle this problem first. I would greatly appreciate any help!
Thanks.
Try using a UIGestureRecognizer instead of -touchesMoved:withEvent:, and implement something similar to the following code.
// Inside viewDidLoad
UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragonMoved:)];
panGesture.minimumNumberOfTouches = 1;
[self addGestureRecognizer:panGesture];
//**********
- (void)dragonMoved:(UIPanGestureRecognizer *)gesture {
    CGPoint touchLocation = [gesture locationInView:self];
    static UIView *currentDragObject;
    if (UIGestureRecognizerStateBegan == gesture.state) {
        for (DragObect *dragView in self.dragObjects) {
            if (CGRectContainsPoint(dragView.frame, touchLocation)) {
                currentDragObject = dragView;
                break;
            }
        }
    } else if (UIGestureRecognizerStateChanged == gesture.state) {
        currentDragObject.center = touchLocation;
    } else if (UIGestureRecognizerStateEnded == gesture.state) {
        currentDragObject = nil;
    }
}

Is it possible to get the x and y coordinates of a touch?

Is it possible to get the x and y coordinates of a touch? If so, could someone please provide a very simple example where the coordinates are just logged to the console?
Using touchesBegan Event
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"Touch x : %f y : %f", touchPoint.x, touchPoint.y);
}
This event is triggered when a touch starts.
Using Gesture
Register your UITapGestureRecognizer in the viewDidLoad method:
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapGestureRecognizer:)];
    [self.view setUserInteractionEnabled:YES];
    [self.view addGestureRecognizer:tapGesture];
}
Setting up the tapGestureRecognizer function
// Tap GestureRecognizer function
- (void)tapGestureRecognizer:(UIGestureRecognizer *)recognizer {
    CGPoint tappedPoint = [recognizer locationInView:self.view];
    CGFloat xCoordinate = tappedPoint.x;
    CGFloat yCoordinate = tappedPoint.y;
    NSLog(@"Touch Using UITapGestureRecognizer x : %f y : %f", xCoordinate, yCoordinate);
}
Sample Project
First you need to add a gesture recognizer to the view you want.
UITapGestureRecognizer *myTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(myTapRecognizer:)];
[self.myView setUserInteractionEnabled:YES];
[self.myView addGestureRecognizer:myTap];
Then in the gesture recognizer method you make a call to locationInView:
- (void)myTapRecognizer:(UIGestureRecognizer *)recognizer
{
    CGPoint tappedPoint = [recognizer locationInView:self.myView];
    CGFloat xCoordinate = tappedPoint.x;
    CGFloat yCoordinate = tappedPoint.y;
}
You may want to take a look at Apple's UIGestureRecognizer Class Reference.
Here's a very basic example (place it inside your view controller):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    NSLog(@"%@", NSStringFromCGPoint(currentPoint));
}
This triggers every time the touch moves. You can also use touchesBegan:withEvent:, which triggers when a touch starts, and touchesEnded:withEvent:, which triggers when a touch ends (i.e. a finger is lifted).
You can also do this using a UIGestureRecognizer, which in many cases is more practical.
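For reference, here is a minimal touchesEnded:withEvent: sketch, following the same pattern as the touchesMoved:withEvent: example above, that logs the point where the finger lifts:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Log the final location of the touch when the finger is lifted.
    CGPoint endPoint = [touch locationInView:self.view];
    NSLog(@"Touch ended at %@", NSStringFromCGPoint(endPoint));
}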

Dragging Multiple Images

I am trying to develop an analysing app that determines if you are "clever".
What it involves is taking a picture of yourself and dragging points onto your face where the nose, mouth, and eyes are. However, the code I have tried does not work:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == eye1)
    {
        eye1.center = location;
    }
    else if ([touch view] == eye2)
    {
        eye2.center = location;
    }
    else if ([touch view] == nose)
    {
        nose.center = location;
    }
    else if ([touch view] == chin)
    {
        chin.center = location;
    }
    else if ([touch view] == lip1)
    {
        lip1.center = location;
    }
    else if ([touch view] == lip2)
    {
        lip2.center = location;
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
What is happening? When I just have a single image it works, but that is not helpful for me. What can I do to make it work? The spots start at the bottom of the screen in a "toolbar" and the user then drags them onto the face. I want the finished result to look something like this:
There are two basic approaches:
You can use the various touches methods (e.g. touchesBegan, touchesMoved, etc.) in your controller or the main view, or you can use a single gesture recognizer on the main view. In this approach, in touchesBegan or, if using a gesture recognizer, when the state is UIGestureRecognizerStateBegan, you determine the locationInView: of the superview and then test whether the touch is over one of your views with CGRectContainsPoint, passing each view's frame as the first parameter and the location as the second.
Having identified the view where the gesture began, then in touchesMoved or, for a gesture recognizer, when the state is UIGestureRecognizerStateChanged, you move that view based upon the translationInView:. A rough sketch of this approach follows.
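The sketch below assumes a single pan recognizer attached to the main view, and uses hypothetical property names: dragObjects for the array of draggable views and currentDragView for the one currently being moved.
- (void)handleSinglePan:(UIPanGestureRecognizer *)gesture
{
    CGPoint location = [gesture locationInView:self.view];
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        // Hit-test: remember which draggable view (if any) the gesture started over.
        for (UIView *candidate in self.dragObjects) {
            if (CGRectContainsPoint(candidate.frame, location)) {
                self.currentDragView = candidate;
                break;
            }
        }
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        // Move the view by the accumulated translation, then reset the translation
        // so each callback applies only the incremental delta.
        CGPoint translation = [gesture translationInView:self.view];
        self.currentDragView.center = CGPointMake(self.currentDragView.center.x + translation.x,
                                                  self.currentDragView.center.y + translation.y);
        [gesture setTranslation:CGPointZero inView:self.view];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        self.currentDragView = nil;
    }
}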
Alternatively (and easier IMHO), you can create individual gesture recognizers that you attach to each of the subviews. This latter approach might look like the following. For example, you first add your gesture recognizers:
NSArray *views = @[eye1, eye2, lip1, lip2, chin, nose];
for (UIView *view in views)
{
    view.userInteractionEnabled = YES;
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [view addGestureRecognizer:pan];
}
Then you implement a handlePanGesture method:
- (void)handlePanGesture:(UIPanGestureRecognizer *)gesture
{
    CGPoint translation = [gesture translationInView:gesture.view];
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        gesture.view.transform = CGAffineTransformMakeTranslation(translation.x, translation.y);
        [gesture.view.superview bringSubviewToFront:gesture.view];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        gesture.view.transform = CGAffineTransformIdentity;
        gesture.view.center = CGPointMake(gesture.view.center.x + translation.x, gesture.view.center.y + translation.y);
    }
}
