UIPanGestureRecognizer sometimes not working on iOS 7 - ios

I'm getting intermittent reports from users on iOS 7 saying that the UIPanGestureRecognizer stops working on certain views every once in a while. They're supposed to be able to swipe a view to the right/left, but it just breaks and doesn't work for some unknown reason. Force quitting the app and relaunching it fixes the problem.
This problem never happened on iOS 6. And I don't have any code that disables the gesture recognizer at any time, apart from a gestureRecognizerShouldBegin: delegate method that forces the gesture to recognize only horizontal pans:
- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isMemberOfClass:[UIPanGestureRecognizer class]]) {
        CGPoint translation = [gestureRecognizer translationInView:[self superview]];
        if (fabsf(translation.x) > fabsf(translation.y)) {
            if (translation.x > 0)
                return YES;
        }
    }
    return NO;
}
Did anything change in the UIPanGestureRecognizer (or just the plain UIGestureRecognizer) that could be causing this problem?

I think I finally solved this issue. Apparently iOS 7 handles gestures in subviews differently than iOS 6 and earlier did. To handle this, Apple introduced a new delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
If you return YES, that should get your gesture recognizer to work. I've implemented it and haven't had any issues so far (though admittedly this was a rare bug that I could never reliably reproduce, so it's possible that it just hasn't recurred yet).
For more information, see
https://stackoverflow.com/a/19892166/1593765.
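For reference, a minimal sketch of that delegate method, returning YES unconditionally as described above (in a real app you might want to inspect which otherGestureRecognizer is asking before agreeing):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Returning YES asks otherGestureRecognizer (e.g. a system gesture
    // added in iOS 7) to wait until our recognizer fails before it begins.
    return YES;
}

Remember to assign the delegate on your pan recognizer, or this method will never be consulted.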

Why would you return NO from gestureRecognizerShouldBegin: just because the initial movement is vertical?
Since the gesture is made by a user's finger (not by a machine), there will always be some randomness in its movement due to the inaccuracy of a moving finger. gestureRecognizerShouldBegin: is called just after the user touches the screen, and the translation you get may be only a few pixels. Your recognizer will fail if the user, when putting a finger on the screen, moves it 2 pixels up, even if they then move it 200 pixels to the right.
This shouldn't cause the gesture recognizer to be permanently disabled, but you should look into it as well, since it may confuse users when their gestures go unrecognized for seemingly no reason.
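One common way to make the check more forgiving (a sketch, untested against the original code) is to look at the pan's velocity rather than its first few pixels of translation, since velocity reflects the overall direction of the movement and smooths out touch-down jitter:

- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isMemberOfClass:[UIPanGestureRecognizer class]]) {
        // velocityInView: captures the dominant direction of the drag,
        // so a 2 px upward wobble at touch-down won't fail the gesture.
        CGPoint velocity = [gestureRecognizer velocityInView:[self superview]];
        return velocity.x > fabsf(velocity.y); // rightward and mostly horizontal
    }
    return NO;
}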

Related

UIScreenEdgePanGestureRecognizer delays UIButtons events

I ran into a pretty annoying issue here involving UIScreenEdgePanGestureRecognizer and UIButton interactions.
Here's the context: I am developing an iPad app. The view of my UIViewController contains a lot of UIButtons, both at its very top and bottom. I realized after a while that, when pressed at specific locations, the UIButtons weren't behaving as usual. Taking the top row as an example: when a button is pressed at a location very close to the edge of the screen, the event linked to that button takes around 0.5 seconds to be triggered. I can also see it visually, as the button takes time to become highlighted.
What the cause is: I did a bit of research, and from what I could read in this post there is apparently a "conflict between the possibility of a gesture directed at your app and a gesture directed at the system". Basically, as the user taps near the edges of the screen, the device waits to see whether the user will swipe down or up (using a UIScreenEdgePanGestureRecognizer, for example), and that is most likely why I get this delay with my UIButtons.
Now: All the posts regarding this issue are a little outdated. I tried different workarounds and "not-that-convenient" solutions,
e.g. subclassing my UIButtons to allow the touch to "bypass" the UIScreenEdgePanGestureRecognizer with the following method:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL pointInside = [super pointInside:point withEvent:event];
    for (UIView *view in self.subviews) {
        if (CGRectContainsPoint(view.frame, point)) {
            pointInside = YES;
        }
    }
    return pointInside;
}
But in vain; it still doesn't fix the issue.
I am wondering if anyone has come up with an easy solution to this? Or maybe Apple shipped a fix?
Any suggestions / solutions / comments would be greatly appreciated, folks!

iOS: Swipe up gesture recognizer not working

My table view has a text field above it, and whenever the text field is focused, a swipe gesture recognizer is registered. When the swipe gesture is recognized, the keyboard is dismissed. The code works for all gestures except the swipe-up gesture. This is my code:
swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(dismissKeyboard)];
[swipe setDirection:UISwipeGestureRecognizerDirectionUp];
Can someone please let me know if there is any problem?
If all the other gestures work, there is probably no logic problem.
Check for spelling errors.
Then reapply the swipe gesture and check everything again (outlets, etc.).
I don't know about this case, but I know that when I've had gestures on a custom container view and then added a child view with its own gestures, I've had to iterate through the child's gestures and tell them to require my gestures to fail (i.e. mine take precedence). I've done this with scroll views successfully:
for (UIGestureRecognizer *gesture in self.scrollView.gestureRecognizers) {
    [gesture requireGestureRecognizerToFail:myGesture];
}
The only times I've had problems with that are views like UITextView which remove and add gestures as you go in and out of edit mode, so that's a hassle.
Also, while I tried this with standard gestures, I've subsequently shifted to custom gestures that I've programmed to fail as quickly as possible (check the start location and fail immediately if it won't support the direction my gesture requires, rather than waiting for a bunch of touchesMoved calls to come to the same conclusion). If you don't want to interfere with the child view's gestures, be as aggressive as possible in letting yours fail. Maybe this isn't an issue with a swipe gesture, but it's a possible consideration if you find that your gestures end up noticeably changing the behavior of the child view.
But I suspect you'll probably just have to figure out which views have the gestures that are interfering with yours and make them require yours to fail first.
Any chance you're colliding with one of the scrollview's gestures? It doesn't seem likely if your other gestures are working, but it might be worth at least trying the gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: method in the UIGestureRecognizerDelegate protocol.

ios: pinchGesture that only happen once

In iOS 6, how do I add a pinch gesture that is detected only once?
I have a UIView to which I add a pinch gesture recognizer:
[self addPinchGestureRecognizersToView:self.view];
Then I attach an action to this pinch to bring up a UIView. The problem is that when I pinch, the event fires several times, which makes the view controller call addSubview as many times as the event occurs.
So how can I limit it to firing just once, or remove the recognizer the moment it detects a pinch? I tried:
[self.view removeGestureRecognizer:UIPinchGestureRecognizer];
But I got a compile error.
Thanks for all the suggestions. I just thought of the simplest solution: add a BOOLEAN to check. The rest works like a charm.
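For the record, that BOOL guard can be as simple as the sketch below; didShowOverlay and overlayView are hypothetical names standing in for whatever state and view the app actually uses:

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    // A pinch is a continuous gesture, so this handler fires repeatedly.
    // React only once, when the gesture first transitions to Began.
    if (recognizer.state == UIGestureRecognizerStateBegan && !self.didShowOverlay) {
        self.didShowOverlay = YES;              // hypothetical BOOL property
        [self.view addSubview:self.overlayView]; // hypothetical view to present
    }
}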
You should know that a pinch gesture is a continuous gesture; that is to say, it can be recognized many times during a single touch sequence.
If you want it recognized only once, you can remove it the first time it is recognized. The reason you get a compile error is that you need to 'remember' your gesture recognizer and remove that instance later.
[self.view removeGestureRecognizer:UIPinchGestureRecognizer];
This method call is invalid: UIPinchGestureRecognizer is a class, not an instance. You have to replace it with the actual recognizer you added.
for (UIGestureRecognizer *recognizer in [self.view.gestureRecognizers copy]) {
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [self.view removeGestureRecognizer:recognizer];
    }
}

Finger Tracking and Gesture Recognizing

So I'm developing an app where all my gestures are being recognized. My problem comes when I attempt to add UIImageViews wherever the finger touches the screen.
These views are supposed to follow the finger, which they do, but I believe they are swallowing the touches, preventing the gestures from being recognized. I have tried:
[_fingerHolder1 setUserInteractionEnabled:NO];
[_fingerHolder2 setUserInteractionEnabled:NO];
But it doesn't seem to change anything.
I am adding these views in the ccTouchesBegan/Moved/Ended methods, whereas the gestures are recognized in their respective handlers.
I have looked at using UIPanGestureRecognizer, but I'm having trouble recognizing the swipes as well as setting the coordinates of the finger-tracking UIImageViews while doing this. Should I experiment with this more, or is there a different solution?
Any help is appreciated!
The UIImageViews will receive and process the touches, so they will not be forwarded to the cocos2d OpenGL view (also a UIView).
To make this work you need to create a subclass of UIImageView and override each touches… method and manually forward the event to cocos2d's view, here's the example for touchesBegan:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}
Use this UIImageView subclass in place of the original ones you use currently.
That will make regular cocos2d touch events work, and it should also make UIGestureRecognizers behave as expected if you've added those to cocos2d's view.
If I understand what you need (please correct me if I'm wrong), you want to move some UIViews when a drag (pan) event is detected, but you also add UIImageViews when the user touches the screen, and these swallow the touches.
You should set UIImageView.userInteractionEnabled = YES (it defaults to NO for image views); basically, every view that should detect touches must have userInteractionEnabled = YES.
If you want to ignore touches on some subviews, implement the
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
method of UIGestureRecognizerDelegate.
For handling different types of gestures simultaneously, implement the method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

UITapGestureRecognizer giving invalid values for locationInView:

I have tracked down the problem: locationInView: on the UITapGestureRecognizer always gives me a CGPoint of (0,-67) in portrait and (0,268) in landscape. (If I don't cast the sender as UITapGestureRecognizer, I occasionally get (0,180) in landscape.)
This problem does not exist in the iOS 5 Simulator. It happens often in the iOS 6 Simulator and almost 90% of the time on an iOS 6 device.
My guess now is that the gesture recognizer is no longer valid by the time it calls the action method... but that doesn't make sense, as it would mean we always need to call locationInView: in the delegate methods...
Details:
What I'm trying to do:
Recognize a tap gesture on an MKMapView, convert that point to a coordinate, and display it as an annotation.
What I did:
In the action method of the gesture recognizer..
CLLocationCoordinate2D coordinate = [self.mapView convertPoint:[(UITapGestureRecognizer *)sender locationInView:self.mapView] toCoordinateFromView:self.mapView];
I did do introspection to make sure that the sender is indeed a UITapGestureRecognizer.
I also tried returning YES from (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer, but that does not help.
What should happen:
The coordinate should be corresponding to the point tapped on the map.
What did happen:
The coordinate is always slightly far off to the left.
Update:
Update: So... I have tested all the delegate methods in <UIGestureRecognizerDelegate>. Both in the action method, as above, and in gestureRecognizer:shouldReceiveTouch:, the gesture recognizer gives an invalid position (0,-64) for locationInView:. (It was (0,-67) as stated above, but became (0,-64) after I updated Xcode to the latest version a few minutes ago, lol.) However, in gestureRecognizerShouldBegin: and gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: it gives the correct CGPoint.
My question is: is this the intended behavior? Or did I mess something up? Otherwise it means I need to fire my action in one of the delegate methods, as I do need the correct position from the gesture recognizer. This option doesn't sound very right, though...
Sorry for the trouble, guys; I have found the cause.
I had simply left the locationInView: call inside a performBlock: of an NSManagedObjectContext. In hindsight this is obvious: UIGestureRecognizer is a UI object, and Rule #1 in iOS is "UI stuff belongs on the main thread." This immediately explains the inconsistent behavior of locationInView: and why it is more likely to succeed earlier in the recognition process.
Lesson learned, again: read "gesture recognizer" as UIGestureRecognizer. "UI".
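The fix, then, is to read the location on the main thread before hopping onto the context's queue, passing only plain values into the block. A sketch under the same setup as the question (managedObjectContext is assumed to be the context whose performBlock: was the culprit):

- (void)handleTap:(UITapGestureRecognizer *)sender {
    // Query the recognizer on the main thread, where UIKit objects live...
    CGPoint point = [sender locationInView:self.mapView];
    CLLocationCoordinate2D coordinate =
        [self.mapView convertPoint:point toCoordinateFromView:self.mapView];

    // ...and hand the captured value, not the recognizer, to the block.
    [self.managedObjectContext performBlock:^{
        // use `coordinate` here; make no UIKit calls inside this block
    }];
}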
