iOS: Detecting touch down, segue, touch up

User puts her finger on the screen. This triggers a UITouchEvent, phase Began, which calls the touchesBegan:withEvent: method in controllerA, which performs a segue from controllerA to controllerB.
User lifts her finger off the screen. This triggers a UITouchEvent, phase Ended, which calls some callback method.
Question: What and where is this callback method? It's not in controllerA, and it's not in controllerB. From what I can tell, it's not in any view. But it exists.

To clarify, here's what's going on (according to @switz):
In response to -touchesBegan:withEvent:, a view controller is presented
modally via a segue
When the user lifts up their finger, the view controller should be dismissed.
The question is how to react to the finger being lifted, since
-touchesEnded:withEvent: is not invoked.
The short answer is the presented view controller needs to use the "Over Full
Screen" modalPresentationStyle instead of the default "Full Screen" style
(this can either be specified as the presentation style of the segue, or if
that's "Default" then the presentation style of the presented view controller).
The long answer requires a brief overview of how touch handling works. This
explanation ignores gesture recognizers:
When a touch begins, it's delivered to the "topmost" view that contains the
touch point. From there it gets passed along the responder chain until some
object decides to handle the touch (which is signified by implementing
-touchesBegan:withEvent: and not calling super).
Subsequent changes to the touch (e.g. moved, ended, canceled) are delivered back
to the same view that accepted the touch. The view will continue to receive the
touch events until the touch finishes or cancels.
A touch is canceled either when the application moves to the background (because
e.g. a phone call came in), or when a UIKit class like UIScrollView decides
that it needs to take over touch handling (because the finger moved far enough
that it looks like the user wants to scroll). There's also some funny stuff here
with UIScrollView.delaysContentTouches, but that can be ignored.
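As a minimal sketch of that flow (current Swift syntax; the class name is illustrative), a view that claims touches by not calling super looks like this:

import UIKit

class TouchTrackingView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Not calling super: this view handles the touch itself.
        print("touch began")
    }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touch moved")
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Delivered back to this same view, as long as it stays in the window.
        print("touch ended")
    }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Sent when the app backgrounds or something like a scroll view takes over.
        print("touch cancelled")
    }
}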
But there's a wrinkle, something that isn't documented: touch delivery only
happens so long as the view remains associated with the window. If the view that
is considered "topmost" (the view that is associated with the UITouch) is
removed from the window, then the touch is considered to have vanished and,
importantly, no events for that touch are delivered again, to anyone. This is
true even if the view in question is not the object handling touches.
And that final wrinkle is the cause for this problem. Because the default
"Full Screen" presentation style actually removes the old view controller's
view from the window, the touch handling immediately stops. However, the "Over
Full Screen" presentation style does not remove it; it merely covers up the old
view with the new one. "Over Full Screen" is typically used when the presented view
controller is not fully opaque, but in this case we're using it so touch
handling isn't interrupted.
But that's not all. There's another problem here, which is when the view that's
being touched lives inside a UIScrollView (one that either is scrollable or
always bounces). In that case, even with "Over Full Screen", you'll find that,
while the touch events continue to be delivered, moving your finger around a bit
suddenly causes the touch to be canceled. This is because the UIScrollView
doesn't know it's covered up and has decided that the user is actually trying to
scroll. This causes it to cancel the touch.
There is a solution to this, though. It's kind of ugly, but the solution is to
immediately cancel any scrolling on any enclosing scroll view when performing
the segue. This can be done with code like the following:
class ViewController: UIViewController {
    // this is called from -touchesBegan:withEvent: from a child view
    // the child view is `sender`
    func touchDown(sender: UIView) {
        var view = sender.superview
        while view != nil {
            if let scrollView = view as? UIScrollView {
                // Toggle the panGestureRecognizer's enabled state (off, then back
                // to its previous value) to immediately cause it to fail.
                let enabled = scrollView.panGestureRecognizer.enabled
                scrollView.panGestureRecognizer.enabled = false
                scrollView.panGestureRecognizer.enabled = enabled
            }
            view = view?.superview
        }
        // `identifier` is the identifier of the segue to perform
        performSegueWithIdentifier(identifier, sender: self)
    }
    // ...
}
Of course, no discussion of touch handling would be complete without gesture
recognizers. Gesture recognizers change pretty much everything about touch
handling. They get first dibs on any touches, and they can interrupt view touch
handling at any time. For example, the UIScrollView's UIPanGestureRecognizer
is what is used for scrolling, and when it moves into the "began" state (because
the user has moved their finger enough), that's what causes the touch to be
canceled.
So given this, really the best solution here is to not implement
-touchesBegan:withEvent: at all, but to use a gesture recognizer. The easiest
solution here is to use a UILongPressGestureRecognizer with
minimumPressDuration set to 0 and allowableMovement set to some
ridiculously high value (since you don't want movement to cancel the touch). I'm
recommending this because UILongPressGestureRecognizer is a continuous
recognizer, meaning it will send events for Began, Moved, and Ended, and with
the recommended settings, it will send them in response to the touch beginning,
moving, and ending. What's more, once your recognizer starts handling the touch,
this automatically prevents any other recognizers (such as the scroll view's pan
recognizer) from "taking over" and canceling the touch.
Note that if you're attaching your gesture recognizer to the scrollView itself
(e.g. a UITableView) but only want to respond to touches in certain locations
(such as on a row), then you'll need to restrict the recognizer. You can use the
delegate method gestureRecognizer(_:shouldReceiveTouch:) to do this, something
like this:
func gestureRecognizer(recognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    // if you might be the delegate of multiple recognizers, check for that
    // here. This code will assume `recognizer` is the correct recognizer.
    // We're also assuming, for the purposes of this code, that we're a
    // UITableViewController and want to only capture touches on rows in the
    // first section.
    let touchLocation = touch.locationInView(self.tableView)
    if let indexPath = self.tableView.indexPathForRowAtPoint(touchLocation) {
        if indexPath.section == 0 {
            // we're on one of the special rows
            return true
        }
    }
    return false
}
This way the recognizer won't prevent the tableView's panGestureRecognizer
from scrolling when the user touches elsewhere on the table.

Related

UITextInteraction + UIScrollView

I have a custom text input view that adopts the UITextInput protocol. This view is embedded within a UIScrollView. The UITextInput-adopting view includes UITextInteraction. When I drag the selection handles to the edges (top or bottom) of the current visible area, the view should scroll and select text automatically (like UITextView). Does anybody know how to achieve this?
I was hoping that UITextInteraction might inform me of the necessary events via its delegate (or even take care of this functionality automatically), but it does not.
I have tried to intercept closestPosition(to point: CGPoint) -> UITextPosition?, which is called whenever the user touches the UITextInput-adopting view. Therefore, I can use it to track the dragging operation of a selection handle. Once the user reaches the top of the view, I scroll up. However, I cannot detect when the user lets go of the handle (when the touch ends). When this happens, the scrollView should stop scrolling. In my case, the scroll view keeps scrolling to the top.
I also tried to intercept selectionRects(for range: UITextRange) -> [UITextSelectionRect], but it is only called sporadically during scrolling.
I also cannot detect touchesEnded(). The UITextInteraction seems to block the call. Further, I cannot implement my own pan gesture. UITextInteraction blocks this as well during a selection operation.
Has anybody successfully used UITextInteraction? It seems very premature at this stage.
Here's the way I got this info out of UITextInteraction for the text insertion point placement. I have not yet attempted to track the selection handles, but will update the answer if I manage it.
UITextInteraction has a property called gesturesForFailureRequirements. If you dump the contents of these you'll notice one named UIVariableDelayLoupeGesture, which is a subclass of UILongPressGestureRecognizer. This is the one we want for when the user "picks up" the cursor with their finger.
I add my own target/selector to this gesture recogniser when adding my text interaction like so:
for gesture in interaction.gesturesForFailureRequirements {
    if gesture.isKind(of: UILongPressGestureRecognizer.self) {
        gesture.addTarget(self, action: #selector(longPressEdgeScrollGesture(_:)))
    }
}
Then in longPressEdgeScrollGesture you can get the coordinates of the cursor in your scroll view to activate your edge scrolling timer as necessary
@objc private func longPressEdgeScrollGesture(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.numberOfTouches > 0 else {
        return
    }
    let location = gesture.location(in: gesture.view)
    print(location)
    // Start/stop your edge scrolling as required
}
I found a solution to my original problem. It is somewhat of a work around, but hopefully Apple will improve UITextInteraction.
Trying to intercept any functions in UITextInput led nowhere. Thankfully some very clever people on Twitter figured out what to do. You need to intercept the right UIGestureRecognizer that is added automatically, much like Simeon explained in his answer. The recognizer in question is called UITextRangeAdjustmentGestureRecognizer. But you cannot find it via gesturesForFailureRequirements. You need to become a delegate of the gestures, and then you can find the mentioned gesture via the delegate method gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:).
Once you add a target + action to that gesture you can observe the dragging and handle edge scrolling.
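As a rough sketch of that discovery step (the private class name comes from this answer and may change between iOS releases; exactly which recognizers you need to be the delegate of, and what the delegate method should return, are assumptions here):

import UIKit

final class SelectionHandleObserver: NSObject, UIGestureRecognizerDelegate {
    var onDrag: ((UIGestureRecognizer) -> Void)?

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        // The range-adjustment recognizer is private, so match it by class name.
        if String(describing: type(of: otherGestureRecognizer)) == "UITextRangeAdjustmentGestureRecognizer" {
            // UIKit ignores duplicate target/action pairs, so adding here repeatedly is safe.
            otherGestureRecognizer.addTarget(self, action: #selector(rangeAdjustmentChanged(_:)))
        }
        // Returning true here is an assumption; adjust to whatever your setup needs.
        return true
    }

    @objc private func rangeAdjustmentChanged(_ gesture: UIGestureRecognizer) {
        onDrag?(gesture)  // start or stop edge scrolling based on gesture.location(in:)
    }
}

You still need to install this object as the delegate of the relevant gesture recognizers before the delegate callback will fire.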

Duplicate UIScrollView cancel behavior with UIPanGestureRecognizer

My app uses a paged horizontal scroll view. Each page has UIControls which the user can tap.
UIScrollView does a good job of handling cancellation of touches and swipes. If the user starts swiping fast enough, it's always a swipe. If the user touches down long enough to activate the highlighted state, the scroll view doesn't attempt to swipe.
I'm trying to duplicate this behavior with a UIPanGestureRecognizer subclass so that I can respond to downward swipes within my scrollview. However, I can't get the gesture to cancel in the event of the UIControls getting highlighted.
So far I've done the following:
self.refreshGesture.cancelsTouchesInView = YES;
self.refreshGesture.delaysTouchesBegan = NO;
self.refreshGesture.delaysTouchesEnded = NO;
This seems to duplicate the way UIScrollView passes touches to views, but it doesn't duplicate the way that UIScrollView's pan gesture recognizer gets cancelled. self.refreshGesture is always triggered no matter how slowly the user swipes, or what the state of the UIControls is.
I've tried setting the delegate on my gesture, and this may be the way to go. But I haven't found a combination that works. For example, just checking if the touch starts within a UIControl cancels too frequently. I've also tried overriding gestureRecognizerShouldBegin in my controls, but this seems like a hack and has far reaching implications (interferes with UITextView's gestures, for example).
In this GIF, you can see that the control activates on touch, and the scrollview cancels scroll if that happens. But my downward pan gesture is not cancelled in the same manner:
I wasn't able to duplicate this exactly, but there are two possibilities suggested by WWDC 2014 #235.
Add a transparent scrollview over your main content and move its gesture recognizer onto your root view. This is what I did. It let me use UIScrollViewDelegate, which ended up being sufficient.
Use a "timeout" gesture recognizer. The video suggests requiring the timeout gesture to fail, but in my case it worked better to use a long press gesture and cancel my pan if the long press fired. 0.1 seconds seemed to work better than their suggested 0.15 seconds.

Using Swipe Gesture and Touches Began/Moved/Ended at the same time

I'm trying to use a swipe gesture along with some logic in touches began/moved/ended. Ideally, it would be good if:
User swipes left/right, touches began/moved/ended logic is not called (or cancelled).
For all other cases, touches began/moved/ended logic is called as usual.
Is this possible?
I tried adding the following (based on process both touch event and gesture recognizer) but touches moved/ended is still called:
leftSwipeGestureRecognizer.delaysTouchesBegan = true
self.leftSwipeGestureRecognizer.cancelsTouchesInView = false
Should be:
self.leftSwipeGestureRecognizer.cancelsTouchesInView = true
This means: touches are cancelled if the gesture is recognized; otherwise, touches began/moved/ended are called as usual.
From documentation:
When this property is YES (the default) and the receiver recognizes
its gesture, the touches of that gesture that are pending are not
delivered to the view and previously delivered touches are cancelled
through a touchesCancelled:withEvent: message sent to the view. If a
gesture recognizer doesn’t recognize its gesture or if the value of
this property is NO, the view receives all touches in the multi-touch
sequence.
In this case I would create a custom UIGestureRecognizer for the new behaviour in touches began/moved/ended (useful link here). Then I would set the delegate for both the swipe and custom recognizers and implement the gestureRecognizer:shouldRequireFailureOfGestureRecognizer: method to fulfill the requirements (link to documentation), as in the sketch below.
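A rough sketch of that delegate setup (current Swift syntax; customTouchRecognizer is a hypothetical stand-in for whatever recognizer drives the touches began/moved/ended logic):

import UIKit

class SwipeAwareViewController: UIViewController, UIGestureRecognizerDelegate {
    let leftSwipeGestureRecognizer = UISwipeGestureRecognizer()   // the swipe from the question
    let customTouchRecognizer = UILongPressGestureRecognizer()    // hypothetical custom recognizer

    override func viewDidLoad() {
        super.viewDidLoad()
        leftSwipeGestureRecognizer.direction = .left
        leftSwipeGestureRecognizer.delegate = self
        customTouchRecognizer.delegate = self
        view.addGestureRecognizer(leftSwipeGestureRecognizer)
        view.addGestureRecognizer(customTouchRecognizer)
    }

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRequireFailureOf otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        // Make the custom recognizer wait for the swipe to fail, so a recognized
        // left swipe never also triggers the touch logic.
        return gestureRecognizer === customTouchRecognizer
            && otherGestureRecognizer === leftSwipeGestureRecognizer
    }
}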

How to make UIView stop receiving touch events?

I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to do an action once the user's finger has reached it. I've got a lot of views and so the typical thing to do, where I'd iterate over each view to see if a touch is inside of its bounds, is a no-go - there's just too much lag. Is there any other way to get touch events from one view to another (that is beside the first one)? I thought maybe there is some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch - if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, if the user does not lift their finger up and moves it, the views around the first view do not register the touch. Only if the user lifts his finger and then puts it back down will the surrounding views register the touch.
So my question is, lets say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2 - what do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the userInteractionEnabled property of the UIView to NO:
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
Ok, I figured out what was going on. Thing is, I have my views as subviews of a scrollview, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews - but apparently the scrollview was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems: for example, if I touched two views one after the other, both would select and highlight, but if I took the first finger off the screen both views would deselect. This was not the desired behavior.
So what I did was subclass the scrollview and override the touch handling methods to send the events to its next responder, which is its superview (the view where I'm doing my touch handling). Now it works!
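A minimal sketch of that idea (current Swift syntax; the original was presumably written in Objective-C):

import UIKit

class TouchForwardingScrollView: UIScrollView {
    // Forward every touch event to the next responder (the superview), which is
    // where the touch handling actually lives.
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        next?.touchesBegan(touches, with: event)
    }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        next?.touchesMoved(touches, with: event)
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        next?.touchesEnded(touches, with: event)
    }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        next?.touchesCancelled(touches, with: event)
    }
}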

How to get stepper and longpress to coexist?

I tried setting up a view with a longpress gesture and a stepper configured for continuous updates. With the longpress, the continuous feature of the stepper does not occur. For now, I've disabled the longpress. I guess I don't need it. But for future reference, how would I allow for both to coexist?
Just to be clear, here is the way the screen was set up when I tried this.
App was set up with a simple view controller.
A subview was added to this view (could have been a controller, but I just made it a UIView).
Several labels and stepper were added to this subview.
The steppers were wired up as outlets and actions.
A longpress recognizer was added to the main view in IB.
For completeness, a tap gesture was also added to the main view in IB.
Taps on the main view function as expected. Taps on the steppers function as expected. Longpress on the main view functions as expected. Longpress on the stepper does not.
I modified the code called by the longpress to check for the frame of the subview and not act if the touch location was within that rectangle, but that didn't make a difference. I did not try getting the longpress to fail in that situation, but I suppose I'll try that next. EDIT: OK, maybe not. There doesn't seem to be an API for that. However, there is this kludge, which I'm not going to try.
Attached is a screen shot from profiler with an inverted call tree so you can see what each item is being called by.
darkStepped: is the IBAction that is called by the stepper. If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
The stack trace reveals that the stepper's _updateCount method is dispatched through a timer.
This could be related to the fact that a stepper has an "autoIncrement" mode where, as long as your keep it pressed, it will update at a given (varying) rate. So, instead of simply calling _updateCount, the stepper sets up a timer to handle this behaviour.
For whatever reason the timer is used, the timer explains why you do not see the gesture recogniser in the stack trace.
In your case what happens is that the stepper gets the touches, handles them, and does not forward them to any gesture recognisers attached to the parent view.
This can be explained as follows, although this snippet does not explicitly mention a long press recogniser in relation to a UIStepper control:
According to Apple Docs:
Interacting with Other User Interface Controls
In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button’s parent view, and the user taps the button, then the button’s action method receives the touch event instead of the gesture recognizer. This applies only to gesture recognition that overlaps the default action for a control, which includes:
A single finger single tap on a UIButton, UISwitch, UIStepper, UISegmentedControl, and UIPageControl.
...
If you have a custom subclass of one of these controls and you want to change the default action, attach a gesture recognizer directly to the control instead of to the parent view. Then, the gesture recognizer receives the touch event first. As always, be sure to read the iOS Human Interface Guidelines to ensure that your app offers an intuitive user experience, especially when overriding the default behavior of a standard control.
So, it seems you can attach the gesture recogniser directly to the control (possibly you need to subclass UIStepper for this to work, I am not really sure how to interpret the last paragraph). Hopefully this will not disable the basic workings of the stepper (but maybe it will).
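A sketch of that suggestion (hypothetical names; whether the stepper's built-in autorepeat keeps working is exactly the open question above):

import UIKit

class StepperViewController: UIViewController {
    @IBOutlet var stepper: UIStepper!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach the long press directly to the control instead of the parent view,
        // so the recognizer receives the touch first.
        let press = UILongPressGestureRecognizer(target: self, action: #selector(stepperLongPressed(_:)))
        stepper.addGestureRecognizer(press)
    }

    @objc func stepperLongPressed(_ press: UILongPressGestureRecognizer) {
        // React to the long press on the stepper here.
    }
}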
After carefully reviewing Apple's docs again, I've found the solution. I added the view controller as the delegate of the longpress gesture recognizer:
self.longPress.delegate = self;
(and, of course, added <UIGestureRecognizerDelegate> to the interface), and then added this method to the view controller:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Determine if the touch is inside the custom subview
    if (gestureRecognizer == self.longPress) {
        CGPoint touchLocation = [touch locationInView:self.view];
        if (CGRectContainsPoint(self.antControl.frame, touchLocation)) {
            return NO;
        }
    }
    return YES;
}
This way the gesture recognizer doesn't even get called when the longpress occurs within the frame of self.antControl, which is the subview mentioned in the question.
