UITapGestureRecognizer giving invalid values for locationInView: - ios

I have tracked down the problem to locationInView: on the UITapGestureRecognizer always giving me a CGPoint of (0,-67) in portrait and (0,268) in landscape. (If I don't cast it as UITapGestureRecognizer, I occasionally get (0,180) in landscape.)
This problem does not exist in the iOS 5 Simulator. It happens often in the iOS 6 Simulator and almost 90% of the time on an iOS 6 device.
My guess now is that the gesture recognizer is no longer valid by the time it calls the action method, but that doesn't make sense, as it would mean we always need to call locationInView: in the delegate methods.
Details:
What I'm trying to do:
Recognize a tap gesture on an MKMapView, convert that point to a coordinate, and display it as an annotation.
What I did:
In the action method of the gesture recognizer:
CLLocationCoordinate2D coordinate = [self.mapView convertPoint:[(UITapGestureRecognizer *)sender locationInView:self.mapView] toCoordinateFromView:self.mapView];
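For reference, the surrounding action method looks roughly like this (a sketch; the handleTap: selector and the MKPointAnnotation usage are illustrative, not necessarily the original code):

- (void)handleTap:(UITapGestureRecognizer *)sender
{
    // A discrete recognizer sends its action once it has recognized (Ended).
    if (sender.state != UIGestureRecognizerStateEnded) {
        return;
    }

    CGPoint tapPoint = [sender locationInView:self.mapView];
    CLLocationCoordinate2D coordinate = [self.mapView convertPoint:tapPoint
                                              toCoordinateFromView:self.mapView];

    // Drop a simple annotation at the tapped coordinate.
    MKPointAnnotation *annotation = [[MKPointAnnotation alloc] init];
    annotation.coordinate = coordinate;
    [self.mapView addAnnotation:annotation];
}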
I did do introspection to make sure that the sender is indeed a UITapGestureRecognizer.
I also tried returning YES from - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer, but that does not help.
What should happen:
The coordinate should be corresponding to the point tapped on the map.
What did happen:
The coordinate is always off to the left of the tapped point.
Update:
So, I have tested with all the delegate methods in <UIGestureRecognizerDelegate>. Both in the action method as above and in – gestureRecognizer:shouldReceiveTouch:, the gesture recognizer gives an invalid position of (0,-64) for locationInView:. (It was (0,-67) as stated above, but became (0,-64) after I updated Xcode to the latest version a few minutes ago, lol.) However, in – gestureRecognizerShouldBegin: and – gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:, it gives the correct CGPoint.
My question is: is this the intended behavior, or did I mess something up? Otherwise it means I need to fire my action in one of the delegate methods, as I do need the correct position from the gesture recognizer. That option doesn't sound very right, though.

Sorry for the trouble, guys; I have found the cause.
I had simply left the locationInView: call inside a performBlock: of an NSManagedObjectContext. That explains everything: UIGestureRecognizer is a UI object, and Rule #1 in iOS is "UI stuff belongs only on the main thread." It also explains the inconsistent behavior of locationInView: and why it was more likely to succeed at the earlier stages.
Lesson learned, again: Read "gesture recognizer" as UIGestureRecognizer. "UI".
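Concretely, the fix is just to read locationInView: on the main thread (where the action method runs) and pass only plain values into the Core Data block. A sketch, with illustrative method and property names:

- (void)handleTap:(UITapGestureRecognizer *)sender
{
    // Main thread: safe to touch the gesture recognizer and the map view here.
    CGPoint tapPoint = [sender locationInView:self.mapView];
    CLLocationCoordinate2D coordinate = [self.mapView convertPoint:tapPoint
                                              toCoordinateFromView:self.mapView];

    [self.managedObjectContext performBlock:^{
        // Core Data work only; no UIGestureRecognizer or other UIKit calls in here.
        // ... store the coordinate on a managed object ...
    }];
}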

Related

Accurate start position for UIPanGestureRecognizer?

I am using a UIPanGestureRecognizer to implement drag and drop. When the drag starts I need to identify the object that is being dragged. However, the objects are relatively small, and if the user doesn't hit the object right in its centre, it isn't dragged.
The problem is that when the gesture handler is first called with the state UIGestureRecognizerStateBegan, the finger has already moved several pixels, and so [UIPanGestureRecognizer locationInView:] returns that point, which is not where the gesture truly started. That makes sense as it can only recognize a pan after a few pixels of movement. However, I need the absolute start of the gesture, not the position after the gesture has first been recognized.
I'm thinking that maybe I need to implement a tap gesture recognizer as well, purely to capture the first touch. But that seems like a hack for what is not an unusual requirement. Is there no other way of getting that first touch from within the pan gesture recognizer?
The UIGestureRecognizerDelegate protocol provides the methods gestureRecognizerShouldBegin: and gestureRecognizer:shouldReceiveTouch:, which can help you evaluate the touches before the pan has transitioned to UIGestureRecognizerStateBegan.
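As a rough sketch (assuming the view controller is the pan recognizer's delegate; the ivar and helper names are made up), you can record the very first touch location in gestureRecognizer:shouldReceiveTouch: and use it once the pan begins:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
    // Called before the pan has moved at all, so this is the true start point.
    _initialTouchPoint = [touch locationInView:self.view];
    return YES;
}

- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // Pick the dragged object from the original touch point, not from
        // locationInView:, which is already a few pixels into the pan.
        [self startDraggingObjectAtPoint:_initialTouchPoint]; // hypothetical helper
    }
    // ... handle Changed / Ended as usual ...
}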

ios UIGestureRecognizer for longpress, then swipe

I'm using a UIGestureRecognizer to recognize a single tap, a double tap and a longpress.
What I would like to do is also recognize a longpress, then swipe to either left or right.
Would this be possible given that I'm already consuming a long press? I'm confused on this one and would appreciate pointers on how to do it.
Thanks
Just tried this out myself and it seems as if the UILongPressGestureRecognizer will transition to its end state as soon as the UISwipeGestureRecognizer begins. Just make sure shouldRecognizeSimultaneouslyWithGestureRecognizer: returns YES for this gesture combination.
You'd need to use two gesture recognizers, track the state of the long-press one (so you know when its callback says it has ended), and then act on the swipe/pan gesture that follows it.
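A rough sketch of that setup, assuming the view controller conforms to UIGestureRecognizerDelegate and keeps a BOOL ivar _longPressActive (handler and ivar names are illustrative):

- (void)viewDidLoad
{
    [super viewDidLoad];

    UILongPressGestureRecognizer *longPress =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleLongPress:)];
    UISwipeGestureRecognizer *swipe =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionLeft | UISwipeGestureRecognizerDirectionRight;
    longPress.delegate = self;
    swipe.delegate = self;
    [self.view addGestureRecognizer:longPress];
    [self.view addGestureRecognizer:swipe];
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES; // allow the long press and the swipe to run together
}

- (void)handleLongPress:(UILongPressGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        _longPressActive = YES;   // a long press is in flight
    } else if (recognizer.state == UIGestureRecognizerStateEnded ||
               recognizer.state == UIGestureRecognizerStateCancelled) {
        _longPressActive = NO;
    }
}

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer
{
    if (_longPressActive) {
        // long-press-then-swipe detected: react here
    }
}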

UIPanGestureRecognizer sometimes not working on iOS 7

I'm getting intermittent reports from users on iOS 7 saying that the UIPanGestureRecognizer stops working on certain views every once in a while. They're supposed to be able to swipe a view to the right/left, but it just breaks and doesn't work for some unknown reason. Force quitting the app and relaunching it fixes the problem.
This problem never happened on iOS 6, and I don't have any code that disables the gesture recognizer at any time, besides the gestureRecognizerShouldBegin: delegate method that forces the gesture to recognize only horizontal pans:
- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer isMemberOfClass:[UIPanGestureRecognizer class]]) {
        CGPoint translation = [gestureRecognizer translationInView:[self superview]];
        if (fabsf(translation.x) > fabsf(translation.y)) {
            if (translation.x > 0)
                return YES;
        }
    }
    return NO;
}
Did anything change in the UIPanGestureRecognizer (or just the plain UIGestureRecognizer) that could be causing this problem?
I think I finally solved this issue. Apparently iOS 7 handles gestures in subviews differently than iOS 6 and earlier did. To handle this, Apple added a new delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
If you return YES, that should get your gesture recognizer to work. I've implemented it and haven't had any issues so far (though admittedly this was a rare bug that I could never reliably reproduce, so it's possible that it just hasn't recurred yet).
For more information, see
https://stackoverflow.com/a/19892166/1593765.
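In practice, the pan recognizer's delegate can just return YES from the new method (a minimal sketch):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Returning YES means this recognizer must fail before the other one can
    // begin, which effectively gives it first crack at the touches.
    return YES;
}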
Why would you return NO from the gesture recognizer just because, in gestureRecognizerShouldBegin:, the movement is only vertical?
Since it's a gesture made by the user's finger (and not by a machine), there will always be some randomness in its movement due to the inaccuracy of a moving finger. gestureRecognizerShouldBegin: is called just after the user touches the screen, and the translation you get might be just a few pixels. Your recognizer will fail if the user, for example, moves the finger 2 pixels up while putting it on the screen, even if he then moves it 200 pixels to the right.
This shouldn't cause the gesture recognizer to be permanently disabled, but you should look into it as well, since it might confuse users when their gestures are not recognized for seemingly no reason.
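If that turns out to be part of the problem, one way to make the check more forgiving (a sketch, not the original code; the 1.5 factor is arbitrary) is to insist on rightward movement while tolerating some vertical drift:

- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gestureRecognizer
{
    if (![gestureRecognizer isMemberOfClass:[UIPanGestureRecognizer class]]) {
        return NO;
    }
    CGPoint translation = [gestureRecognizer translationInView:[self superview]];
    // Accept rightward pans even if the first few points drift vertically,
    // instead of requiring x to strictly dominate y.
    return translation.x > 0 && fabsf(translation.y) <= fabsf(translation.x) * 1.5f;
}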

ios: pinchGesture that only happen once

In iOS 6, how do I add a pinch gesture that is detected only once?
I have a UIView to which I add a pinch gesture recognizer:
[self addPinchGestureRecognizersToView:self.view];
Then I attach a function to this pinch to call out a UIView. The problem is that when I pinch, the event fires several times, which makes the view controller call addSubview just as many times.
So how can I limit it to one time, or remove the recognizer the moment it detects a pinch? I tried:
[self.view removeGestureRecognizer:UIPinchGestureRecognizer];
But I got a compile error.
Thanks for all the suggestions. I just thought of the simplest solution: add a BOOL to check. The rest works like a charm.
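For reference, the flag-based handler can look roughly like this (a sketch; hasHandledPinch and overlayView are hypothetical properties). Checking for UIGestureRecognizerStateBegan alone already limits it to once per pinch; the BOOL keeps it to once overall:

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    if (self.hasHandledPinch) {
        return; // already reacted once
    }
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        self.hasHandledPinch = YES;
        [self.view addSubview:self.overlayView]; // add the view exactly once
    }
}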
You should know that a pinch gesture is a continuous gesture; that is to say, it can be recognized many times during one touch sequence.
If you want to recognize it only once, you can remove the recognizer the first time it fires. The reason you get a compile error is that you need to 'remember' your gesture recognizer so you can remove it later.
[self.view removeGestureRecognizer:UIPinchGestureRecognizer];
This method call is invalid: UIPinchGestureRecognizer is a class, not an instance. You have to replace it with the actual recognizer you added.
for (UIGestureRecognizer *recognizer in [self.view.gestureRecognizers copy]) {
    if ([recognizer isKindOfClass:[UIPinchGestureRecognizer class]]) {
        [self.view removeGestureRecognizer:recognizer];
    }
}

Custom UIGestureRecognizer Not Working As Expected

I have a UITableView which I present in a UIPopoverController. The table view presents a list of elements that can be dragged and dropped onto the main view.
When the user begins a pan gesture that is principally vertical at the outset, I want the UITableView to scroll as usual. When it's not principally vertical at the outset, I want the application to interpret this as a drag-and-drop action.
My unfortunately lengthy journey down this path compelled me to create a custom UIGestureRecognizer. In an attempt to get the basics right, I left this custom recognizer as an empty implementation at first, one that merely calls the super version of each of the five methods Apple says should be overridden:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)reset;
This results in nothing happening, i.e. the custom gesture's action method is never called, and the table view scrolls as usual.
For my next experiment, I set the gesture's state to UIGestureRecognizerStateBegan in the touchesBegan method.
This caused the gesture's action method to fire, making the gesture appear to behave just like the standard UIPanGestureRecognizer. This obviously suggested I was responsible for managing the gesture's state.
Next up, I set the gesture's state to UIGestureRecognizerStateChanged in the touchesMoved method. Everything still fine.
Now, instead, I tried setting the gesture's state to UIGestureRecognizerStateFailed in the touchesMoved method. I was expecting this to terminate the gesture and restore the flow of events to the table view, but it didn't. All it did was stop firing the gesture's action method.
Lastly, I set the gesture's state to UIGestureRecognizerStateFailed in the touchesBegan method, immediately after I had set it to UIGestureRecognizerStateBegan.
This causes the gesture to fire its action method exactly once, then pass all subsequent events to the table view.
So... sorry for such a long question... but why, if I cause the gesture to fail in the touchesBegan method (after first setting the state to UIGestureRecognizerStateBegan), does it redirect events to the table view as expected, yet if I try the same technique in touchesMoved (the only place I can detect that a move is principally vertical), the redirection doesn't occur?
Sorry for making this more complicated than it actually was. After much reading and testing, I've finally figured out how to do this.
First, creating the custom UIGestureRecognizer was one of the proper solutions to this issue, but when I made my first test of the empty custom recognizer, I made a rookie mistake: I forgot to call [super touches...:touches withEvent:event] for each of the methods I overrode. This caused nothing to happen, so I set the state of the recognizer to UIGestureRecognizerStateBegan in touchesBegan, which did result in the action method being called once, thus convincing me I had to explicitly manage states, which is only partially true.
In truth, if you create an empty custom recognizer and call the appropriate super method in each method you override, your program will behave as expected. In this case, the action method will get called throughout the dragging motion. If, in touchesMoved, you set the recognizer's state to UIGestureRecognizerStateFailed, the events will bubble up to the superview (in this case a UITableView), also as expected.
The mistake I made, and I think others might make, is thinking there is a direct correlation between setting the gesture's state and the chronology of the standard methods you override when subclassing a gesture recognizer (touchesBegan, touchesMoved, etc.). There isn't, or at least it's not an exact mapping. You're better off letting the base behavior work as is and only intervening where necessary. So, in my case, once I determined the user's drag was principally vertical, which I could only do in touchesMoved, I set the gesture recognizer's state to UIGestureRecognizerStateFailed in that method. This took the recognizer out of the picture and automatically forwarded a full set of events to the encompassing view.
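Put together, the recognizer subclass can look roughly like this (a sketch under the approach described above; the class name is made up, and UIGestureRecognizerSubclass.h is what lets you assign to state):

#import <UIKit/UIGestureRecognizerSubclass.h> // required to assign self.state

@interface DragGestureRecognizer : UIGestureRecognizer // illustrative name
@end

@implementation DragGestureRecognizer {
    CGPoint _startPoint;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    _startPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    CGPoint p = [[touches anyObject] locationInView:self.view];
    CGFloat dx = p.x - _startPoint.x;
    CGFloat dy = p.y - _startPoint.y;

    if (self.state == UIGestureRecognizerStatePossible) {
        if (fabs(dy) > fabs(dx)) {
            // Principally vertical: fail, so the table view scrolls as usual.
            self.state = UIGestureRecognizerStateFailed;
        } else {
            // Principally horizontal: treat as the start of a drag-and-drop.
            self.state = UIGestureRecognizerStateBegan;
        }
    } else if (self.state == UIGestureRecognizerStateBegan ||
               self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateChanged;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateBegan ||
        self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateEnded;
    } else {
        self.state = UIGestureRecognizerStateFailed;
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateBegan ||
        self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateCancelled;
    } else {
        self.state = UIGestureRecognizerStateFailed;
    }
}

@end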
For the sake of brevity, I've left out a ton of other stuff I learned through this exercise, but I'd like to point out that, of the six or seven books I consulted, Matt Neuburg's Programming iOS 4 provided by far the best explanation of this subject. I hope that referral is allowed on this site. I am in no way affiliated with the author or publisher, just grateful for an excellent explanation!
That probably happens because responders expect to see an entire touch from beginning to end, not just part of one. Often, -touchesBegan:... sets up some state that's then modified in -touchesMoved..., and it really wouldn't make sense for a view to get a -touchesMoved... without having previously received -touchesBegan.... There's even a note in the documentation that says, in part:
All views that process touches, including your own, expect (or should expect) to receive a full touch-event stream. If you prevent a UIKit responder object from receiving touches for a certain phase of an event, the resulting behavior may be undefined and probably undesirable.
