Conflict with a UIPanGestureRecognizer on a UITableView's superview - ios

I'm trying to build something roughly similar to the drawer menu in Apple Maps on iOS.
In this Xcode project I attach a UIPanGestureRecognizer to the view controller's view and, as the pan progresses, move a UITableView (with scrolling disabled) vertically.
The issue is that every time a pan ends, didSelectRow is only called after a second tap somewhere on the UITableView. Of course I'd like it to be called on the first tap.
The funny thing is that the bug does not happen if I enable the table's scrolling and have shouldRecognizeSimultaneouslyWith return true in the gesture recognizer's delegate.
The other funny thing is that something very similar seems to happen in Apple Maps itself, if you try pulling the drawer up while your finger is resting on a recent-location entry in the list inside the drawer.
Thanks for your help!

I don't fully understand what you are describing, but I think the main problem is the responder chain. When you use a UIPanGestureRecognizer and the UITableView has isScrollEnabled = false, the pan gesture recognizer is the first to receive the touch, and the system waits for it to fail (or for the event to go unhandled) before passing the touch along to the next receiver, which is the UITableView. That is why it takes so long for didSelectRow to be called.
I suggest you create a new UIView, insert it into the view controller in the storyboard or nib, keep the UITableView outside of that UIView, and attach the UIPanGestureRecognizer to the new view. That way the two no longer conflict: the system can detect when the drag starts in the new UIView and call only the pan gesture recognizer, and when the touch is in the UITableView it will call didSelectRow.
Best regards.
Write back if this does not resolve it.
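To illustrate the suggestion, here is a minimal sketch of that setup; the names (DrawerViewController, drawerHandleView, drawerTopConstraint) are assumptions for the example, not taken from the original project:

import UIKit

class DrawerViewController: UIViewController {

    // Assumed outlets: a grab-handle view that sits above the table,
    // and a constraint that drives the drawer's vertical position.
    @IBOutlet private var drawerHandleView: UIView!
    @IBOutlet private var drawerTopConstraint: NSLayoutConstraint!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the pan to the handle view only, not to the controller's whole view,
        // so touches that start on the table view are never claimed by the pan.
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        drawerHandleView.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        if gesture.state == .changed {
            drawerTopConstraint.constant += translation.y
            gesture.setTranslation(.zero, in: view)
        }
    }
}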

Related

UIScrollView pass event to child chain on WillEndDragging

Edit: I am editing my initial question (see below for history) as I am getting new information.
I figured out that when the swipe motion starts from inside the button bounds, we never receive TouchesEnded or TouchesCancelled, only TouchesMoved. However, it would be great if I could react in WillEndDragging. Is it possible to cancel a gesture on WillEndDragging and also pass that cancellation down the children chain?
History:
I am using Xamarin Forms and I have the following issue:
I have custom controls inside native scrolling views, like ScrollView or CollectionView, that remain in a "clicked" state after the finger enters them and then initiates a scroll gesture.
I had a similar issue on UWP in the past and managed to solve it with the UIElement.PointerCaptureLost event.
Sorry if I am wasting your time on trivial stuff, but I am really stuck and I greatly appreciate your help.
I have tried different approaches suggested, including setting DelaysContentTouches to NO, and playing around with CanCancelContentTouches and overriding TouchesShouldCancelInContentView to always return NO, in a ScrollView custom renderer.
I have had a read of
Allow UIScrollView and its subviews to both respond to a touch
and
UIScrollView sending touches to subviews
Maybe the accepted answer here helps, but I am not sure how to get the tag of my custom view.
What I am expecting is for my custom controls to receive a cancelled touch event (or something similar), as happens on both Android and Windows.
This was easier than it looked. I solved it by adding a UIGestureRecognizerDelegate to my UIGestureRecognizer class; in the delegate I overrode ShouldRecognizeSimultaneously to return true.
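In UIKit terms the fix looks roughly like this; the sketch below is plain Swift rather than the Xamarin C# used above, and the class name is made up for the example:

import UIKit

// Assign an instance of this as the delegate of the custom control's gesture recognizer.
final class SimultaneousGestureDelegate: NSObject, UIGestureRecognizerDelegate {

    // Let the custom recognizer run alongside the scroll view's pan recognizer
    // instead of waiting for it to fail.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}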

UITextInteraction + UIScrollView

I have a custom text input view that adopts the UITextInput protocol. This view is embedded within a UIScrollView. The UITextInput-adopting view has a UITextInteraction added to it. When I drag the selection handles to the edges (top or bottom) of the currently visible area, the view should scroll and select text automatically (like UITextView). Does anybody know how to achieve this?
I was hoping that UITextInteraction might inform me via its delegate of necessary events, but it does not (or even takes care of this functionality automatically).
I have tried to intercept closestPosition(to point: CGPoint) -> UITextPosition?, which is called whenever the user touches the UITextInput-adopting view, so I can use it to track the dragging of a selection handle. Once the user reaches the top of the view, I scroll up. However, I cannot detect when the user lets go of the handle (when the touch ends). When that happens, the scroll view should stop scrolling; in my case, it keeps scrolling to the top.
I also tried to intercept selectionRects(for range: UITextRange) -> [UITextSelectionRect], but it is only called sporadically during scrolling.
I also cannot detect touchesEnded(); UITextInteraction seems to block the call. Further, I cannot implement my own pan gesture, since UITextInteraction blocks that as well during a selection operation.
Has anybody successfully used UITextInteraction? It seems very premature at this stage.
Here's the way I got this info out of UITextInteraction for the text insertion point placement. I have not yet attempted to track the selection handles, but I will update the answer if I manage it.
UITextInteraction has a property called gesturesForFailureRequirements. If you dump the contents of these you'll notice one named UIVariableDelayLoupeGesture, which is a subclass of UILongPressGestureRecognizer. This is the one we want for when the user "picks up" the cursor with their finger.
I add my own target/selector to this gesture recognizer when adding my text interaction, like so:
// Find the long-press recognizer that picks up the cursor and add our own target/action to it.
for gesture in interaction.gesturesForFailureRequirements {
    if gesture.isKind(of: UILongPressGestureRecognizer.self) {
        gesture.addTarget(self, action: #selector(longPressEdgeScrollGesture(_:)))
    }
}
Then in longPressEdgeScrollGesture you can get the coordinates of the cursor in your scroll view to activate your edge scrolling timer as necessary
@objc private func longPressEdgeScrollGesture(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.numberOfTouches > 0 else {
        return
    }
    let location = gesture.location(in: gesture.view)
    print(location)
    // Start/stop your edge scrolling as required
}
I found a solution to my original problem. It is somewhat of a work around, but hopefully Apple will improve UITextInteraction.
Trying to intercept any functions in UITextInput led nowhere. Thankfully some very clever people on Twitter figured out what to do. You need to intercept the right UIGestureRecognizer that is added automatically, much like Simeon explained in his answer. The recognizer in question is called UITextRangeAdjustmentGestureRecognizer, but you cannot find it via gesturesForFailureRequirements. You need to become a delegate of the gestures, and then you can find the mentioned gesture via the shouldRecognizeSimultaneouslyWith otherGestureRecognizer delegate method.
Once you add a target + action to that gesture you can observe the dragging and handle edge scrolling.
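A rough sketch of that workaround follows. UITextRangeAdjustmentGestureRecognizer is a private class, so it has to be matched by its class-name string; treat this as an assumption-laden illustration of the approach described above, not public API:

import UIKit

// A small helper that becomes the delegate of UITextInteraction's gestures and
// hooks the (private) range-adjustment recognizer once it shows up.
final class SelectionHandleObserver: NSObject, UIGestureRecognizerDelegate {

    var onDrag: ((CGPoint) -> Void)?            // called with the drag location
    private var hookedRecognizer: UIGestureRecognizer?

    func attach(to interaction: UITextInteraction) {
        // Becoming the delegate is what makes the callback below fire.
        for gesture in interaction.gesturesForFailureRequirements {
            gesture.delegate = self
        }
    }

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        // The range-adjustment recognizer is private; match it by class name.
        if hookedRecognizer == nil,
           NSStringFromClass(type(of: otherGestureRecognizer)) == "UITextRangeAdjustmentGestureRecognizer" {
            otherGestureRecognizer.addTarget(self, action: #selector(handleRangeAdjustment(_:)))
            hookedRecognizer = otherGestureRecognizer
        }
        return true
    }

    @objc private func handleRangeAdjustment(_ gesture: UIGestureRecognizer) {
        // Start/stop edge scrolling based on where the handle is being dragged.
        onDrag?(gesture.location(in: gesture.view))
    }
}

Whether taking over the delegate of those recognizers interferes with the interaction's own behaviour is something to verify; the thread presents this as a workaround, not a supported path.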

Get UITextView Gesture (To Identify Location of Tap/LongPress)

I'm rather confident that [editable] UITextViews become first responder when a long press or tap gesture occurs within the scroll view. I want to identify where in the view this touch occurred. Digging through the documentation and source code didn't yield much. I might be going about this wrong. My concern is a race condition if I just add my own tap recognizer (how can I be sure it is called before the textView's delegate methods?).
For practical clarification: from a delegate function (editingDidBegin) I want to call one of two similar functions, depending on whether the touch was in the left or right half of the text view.
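Since there is no accepted answer here, the following is only a sketch of one way to sidestep the ordering concern (my own suggestion, not from the thread): record the touch location during hit-testing, which runs before first-responder changes and delegate callbacks, then branch on it once editing begins. All names are illustrative:

import UIKit

final class TouchTrackingTextView: UITextView {
    // Last point touched inside the text view, in its own coordinate space.
    private(set) var lastTouchPoint: CGPoint?

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        lastTouchPoint = point
        return super.hitTest(point, with: event)
    }
}

final class EditorViewController: UIViewController, UITextViewDelegate {

    func textViewDidBeginEditing(_ textView: UITextView) {
        guard let tracking = textView as? TouchTrackingTextView,
              let point = tracking.lastTouchPoint else { return }
        if point.x < textView.bounds.midX {
            // call the "left half" function here
        } else {
            // call the "right half" function here
        }
    }
}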

Touch Down firing in a weird way

I'm adding a touch down action to a UITextField (actually it's a subclass, but I think that might not be important). I created a simple view controller, added this textbox to it, and wired up the event to println("Hello").
When I quickly tap the item (both in simulator, and on my phone) it works perfectly and says hello!
I then created a UITableViewController subclass, and in one of the static cells I added the same textbox.
In this case, when I quickly tap the textbox nothing happens! When I actually hold down the mouse or my finger for about 1/2 a second, it works. But not if I quickly tap it.
This is different from the previous textbox, which always works perfectly no matter how fast I tap it.
Are there some problems with different events being intercepted, or something of that sort?
I even went so far as to add a tap gesture recognizer to both the table cell and the textbox, but neither works unless I hold it down (the table cell action won't even fire unless I click off the textbox and into the cell proper, of course).
Thanks so much; this is very strange.
UIButton not showing highlight on tap in iOS7
and
iOS - Delayed "Touch Down" event for UIButton in UITableViewCell
have a lot of information about this. Apparently there is a delay for UITableViewCells that can be avoided by taking some of the approaches above.
I'll post the solution that works for me once I work on it. Thanks!
EDIT OP DID DELIVER!! (lol sorry)
In iOS 8, table cells no longer contain the internal UIScrollView that used to delay the touches, so what you can do instead is something like this in your viewDidLoad:
for subview in self.tableView.subviews {
    if let scroll = subview as? UIScrollView {
        // Disable the touch delay on the table view's internal scroll view.
        scroll.delaysContentTouches = false
        break
    }
}
See how we're iterating over self.tableView's subviews, and as soon as we hit a UIScrollView we set delaysContentTouches to false and stop. This worked for me on both the simulator and on my phone.
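As a side note (my assumption, not part of the original answer): UITableView is itself a UIScrollView subclass, so on newer iOS versions, where the internal wrapper scroll view no longer exists, the same effect may be achieved by setting the property on the table view directly:

// Worth trying on newer iOS versions, where the table view has no wrapper scroll view.
tableView.delaysContentTouches = false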

Prevent reordering of elements in gestureRecognizers array

I'm experiencing a bug in my app that is causing gestures to stop working that I previously added to a UITextField via addGestureRecognizer:. Essentially, I add a tap and long press gesture recognizer to the UITextField (which already has 7 gesture recognizers applied from iOS). When logging self.textField.gestureRecognizers, it shows the existing 7 gestures and then the two I added at the end of the array. The gestures work just like I expected.
However, when I present a modal view controller and then dismiss it, my two gestures stop working on the text field. I'm not sure exactly why, but the view does disappear and the text field resigns first responder (the keyboard is always up when the modal VC is presented), which may be related. I discovered the gestures aren't removed from the text field, but the order of the gestures in the array has changed: my custom gestures are now located at index 0 and 1 instead of 7 and 8. I believe the 7 default gestures are conflicting with/overriding my custom ones (I assume later placement in the array overrides those before it), which explains why they stop working even though they're still applied.
My questions are:
- Do you know why it is reordering the elements in self.textField.gestureRecognizers?
- How do I prevent that from occurring to ensure my custom gestures always work, without breaking the default gestures for UITextField?
My current solution is to add the two gestures the first time and store the resulting array of (9) gestures; then in viewDidAppear I set the gestureRecognizers array (yes, it is settable) back to my stored array. This guarantees the array will be the 7 built-in gestures followed by my two custom gestures, in that order. But I discovered my gestures then override the default gestures (the ones that bring up the popup to Cut, Copy, etc.), so I have to reset the gestures back to the default 7 after my custom gesture occurs (which is fine; I only need to trigger the action a single time after recognizing my custom gesture). Simple enough to do: I store the original gestures in a property as well. But this doesn't feel like the best solution. I'd prefer to figure out the cause and address that, or go about the situation differently, instead of duct-taping the code together.
My first solution was to always add my two gestures in viewDidAppear
viewDidAppear: is called when your view controller's view first appears, but it is also called again later when the presented view controller is dismissed.
Thus you are adding the gesture recognizers twice.
The simplest solution is to use a BOOL instance variable (we call this a "flag") which you set to YES the first time and test afterwards:
if (!self.addedGestures) {
    self.addedGestures = YES;
    // ... add them! ...
}
Now you will only add them once.
(On the other hand it might be argued that if you care about the order of the gesture recognizers in the array you are already doing something wrong. Use delegate methods to resolve conflicts between gesture recognizers - that's what they are for.)
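As a sketch of what that delegate-based setup could look like (the class name, outlet, and the blanket `return true` are illustrative; tune the delegate logic to the actual conflict):

import UIKit

final class FormViewController: UIViewController, UIGestureRecognizerDelegate {

    @IBOutlet private var textField: UITextField!   // illustrative outlet

    override func viewDidLoad() {
        super.viewDidLoad()

        let customTap = UITapGestureRecognizer(target: self, action: #selector(handleCustomTap(_:)))
        let customLongPress = UILongPressGestureRecognizer(target: self, action: #selector(handleCustomLongPress(_:)))
        customTap.delegate = self
        customLongPress.delegate = self
        textField.addGestureRecognizer(customTap)
        textField.addGestureRecognizer(customLongPress)
    }

    // Allow the custom gestures to recognize together with the text field's
    // built-in ones, so their position in the gestureRecognizers array no longer matters.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }

    @objc private func handleCustomTap(_ gesture: UITapGestureRecognizer) { /* ... */ }
    @objc private func handleCustomLongPress(_ gesture: UILongPressGestureRecognizer) { /* ... */ }
}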
