UITextInteraction + UIScrollView

I have a custom text input view that adopts the UITextInput protocol. This view is embedded within a UIScrollView. The UITextInput-adopting view includes a UITextInteraction. When I drag the selection handles to the edges (top or bottom) of the currently visible area, the view should scroll and select text automatically (like UITextView does). Does anybody know how to achieve this?
I was hoping that UITextInteraction would inform me of the necessary events via its delegate, but it does not (nor does it take care of this functionality automatically).
I have tried to intercept closestPosition(to point: CGPoint) -> UITextPosition?, which is called whenever the user touches the UITextInput-adopting view, so I can use it to track the dragging of a selection handle. Once the user reaches the top of the view, I scroll up. However, I cannot detect when the user lets go of the handle (when the touch ends). At that point the scroll view should stop scrolling; in my case, it keeps scrolling to the top.
I also tried to intercept selectionRects(for range: UITextRange) -> [UITextSelectionRect], but it is only called sporadically during scrolling.
I also cannot detect touchesEnded(); UITextInteraction seems to block the call. Further, I cannot implement my own pan gesture recognizer, as UITextInteraction blocks this as well during a selection operation.
Has anybody successfully used UITextInteraction? It seems very premature at this stage.

Here's the way I got this info out of UITextInteraction for the text insertion point placement. I have not yet attempted to track the selection handles, but I will update the answer if I manage it.
UITextInteraction has a property called gesturesForFailureRequirements. If you dump the contents of it, you'll notice one named UIVariableDelayLoupeGesture, which is a subclass of UILongPressGestureRecognizer. This is the one we want for when the user "picks up" the cursor with their finger.
I add my own target/selector to this gesture recogniser when adding my text interaction like so:
for gesture in interaction.gesturesForFailureRequirements {
    if gesture.isKind(of: UILongPressGestureRecognizer.self) {
        gesture.addTarget(self, action: #selector(longPressEdgeScrollGesture(_:)))
    }
}
Then, in longPressEdgeScrollGesture, you can get the coordinates of the cursor in your scroll view to activate your edge-scrolling timer as necessary:
@objc private func longPressEdgeScrollGesture(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.numberOfTouches > 0 else {
        return
    }
    let location = gesture.location(in: gesture.view)
    print(location)
    // Start/stop your edge scrolling as required
}
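For reference, here is a rough sketch of what such an edge-scrolling timer could look like. scrollView, edgeInset and scrollStep are assumed properties of the owning view, and updateEdgeScrolling would be called with the location from the gesture handler above:

private var edgeScrollTimer: Timer?

private func updateEdgeScrolling(for location: CGPoint) {
    let visibleTop = scrollView.contentOffset.y
    let visibleBottom = visibleTop + scrollView.bounds.height
    let nearTop = location.y < visibleTop + edgeInset
    let nearBottom = location.y > visibleBottom - edgeInset
    guard nearTop || nearBottom else {
        // Finger is away from the edges: stop scrolling.
        edgeScrollTimer?.invalidate()
        edgeScrollTimer = nil
        return
    }
    guard edgeScrollTimer == nil else { return }
    edgeScrollTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { [weak self] _ in
        guard let self = self else { return }
        var offset = self.scrollView.contentOffset
        offset.y += nearTop ? -self.scrollStep : self.scrollStep
        self.scrollView.contentOffset = offset
    }
}

Also invalidate the timer when the gesture ends or is cancelled; otherwise the scroll view keeps scrolling, which is exactly the problem described in the question.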

I found a solution to my original problem. It is somewhat of a workaround, but hopefully Apple will improve UITextInteraction.
Trying to intercept any functions in UITextInput led nowhere. Thankfully, some very clever people on Twitter figured out what to do: you need to intercept the right UIGestureRecognizer, which is added automatically, much like Simeon explained in his answer. The recognizer in question is called UITextRangeAdjustmentGestureRecognizer. But you cannot find it via gesturesForFailureRequirements. You need to become a delegate of the interaction's gestures, and then you can find the mentioned gesture via the shouldRecognizeSimultaneouslyWith otherGestureRecognizer delegate method.
Once you add a target + action to that gesture you can observe the dragging and handle edge scrolling.
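Here is a minimal sketch of that approach. It assumes interaction is your UITextInteraction and that taking over the delegate of its public gestures is acceptable in your setup; since UITextRangeAdjustmentGestureRecognizer is a private class, it is matched by its class name here:

for gesture in interaction.gesturesForFailureRequirements {
    gesture.delegate = self
}

func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                       shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    let name = NSStringFromClass(type(of: otherGestureRecognizer))
    if name.contains("RangeAdjustment") {
        // Duplicate target/action pairs are ignored, so this is safe to repeat.
        otherGestureRecognizer.addTarget(self, action: #selector(rangeAdjusted(_:)))
    }
    return true
}

@objc private func rangeAdjusted(_ gesture: UIGestureRecognizer) {
    switch gesture.state {
    case .changed:
        // Track the drag and start edge scrolling near the top/bottom edge.
        break
    case .ended, .cancelled, .failed:
        // Stop the edge-scrolling timer so the scroll view comes to rest.
        break
    default:
        break
    }
}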

Related

How to use custom controls on Mapbox MGLMapView

I want to implement KCFloatingActionButton (or something similar) on a Mapbox MGLMapView but can't get the button action to work. The button recognises the initial touch and the colour changes, but nothing else happens. I'm not sure if the map is stealing touches or somehow has higher precedence.
Inspecting the view hierarchy shows the UIButton and associated UIViews are above the map, so I'm at a loss. I thought it might have been the use of views in the button as well as the actual UIButton object, but I'm able to attach a gesture recognizer to a simple UIView on the map no problem (except the map still registers drags and taps through the view). Also, vanilla UIButton objects work fine.
EDIT: Just swapped view to MKMapView and button works fine.
So you learn something new every day. Just wish it was yesterday.
Superviews contain subviews; in my case the MGLMapView contains a KCFloatingActionButton (itself deriving from UIView). Superviews can have gesture recognizers attached; in my case the map view has the usual taps and swipes normally associated with maps. The default behaviour is for those gestures not to be blocked by subviews, so the mere presence of the button doesn't stop the superview from recognising its gesture.
By overriding UIView's gestureRecognizerShouldBegin(_:) in my button's class:
override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
    return false
}
I can selectively control how my 'button' handles gestures coming from the superview. As I read it, UIButton defaults to this behaviour for single taps, but my UIView-based button doesn't.
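For a more selective version, a sketch like this (in the button subclass) would block only the map's single taps over the button while still letting pans and pinches through:

override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
    // Block single-tap recognizers from the superview so the button's own
    // action fires, but let pan/pinch gestures pass through to the map.
    if let tap = gestureRecognizer as? UITapGestureRecognizer, tap.numberOfTapsRequired == 1 {
        return false
    }
    return super.gestureRecognizerShouldBegin(gestureRecognizer)
}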

Get UITextView Gesture (To Identify Location of Tap/LongPress)

I'm rather confident [editable] UITextViews become first responder when a long press or tap gesture occurs within the scroll view. I want to identify where in the view this touch occurred. Digging through the documentation and source code didn't yield much. I might be going about this wrong. My concern is a race condition if I just add my own tap recognizer (how can I be sure it is called before the text view's delegate methods?).
For practical clarification: I want to call one of two similar functions from a delegate method (editingDidBegin), depending on whether the touch was in the left or right half of the text view.
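One possible approach (an untested sketch): record the touch location in the gesture delegate's shouldReceive callback, which fires when the touch is first delivered, before any recognition or delegate methods run, which avoids the race. handleLeftTap and handleRightTap are hypothetical names:

final class TapLocatingTextViewDelegate: NSObject, UIGestureRecognizerDelegate, UITextViewDelegate {
    private weak var textView: UITextView?
    private var lastTouchLocation: CGPoint?

    init(textView: UITextView) {
        self.textView = textView
        super.init()
        textView.delegate = self
        let tap = UITapGestureRecognizer(target: self, action: #selector(tapped(_:)))
        tap.cancelsTouchesInView = false   // don't steal the touch from the text view
        tap.delegate = self
        textView.addGestureRecognizer(tap)
    }

    // Called on touch delivery, before the text view's own recognizers act.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
        lastTouchLocation = touch.location(in: textView)
        return true
    }

    // Let the text view's recognizers keep working alongside this one.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }

    @objc private func tapped(_ gesture: UITapGestureRecognizer) {}

    func textViewDidBeginEditing(_ textView: UITextView) {
        guard let location = lastTouchLocation else { return }
        if location.x < textView.bounds.midX {
            // handleLeftTap()   // hypothetical
        } else {
            // handleRightTap()  // hypothetical
        }
    }
}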

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that, I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way that I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, as now it has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to make the subviews of the drawer disabled while panning. Essentially, loop through the drawer's subviews and disable interaction on them:
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
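In Swift, that second suggestion might look like the sketch below; drawerPanChanged and contentView are assumed names for the drawer's pan handler and its content container:

@objc private func drawerPanChanged(_ gesture: UIPanGestureRecognizer) {
    switch gesture.state {
    case .began, .changed:
        // While the drawer is being dragged, don't let a second finger start
        // gestures on the half-visible content underneath.
        contentView.isUserInteractionEnabled = false
    case .ended, .cancelled, .failed:
        contentView.isUserInteractionEnabled = true
    default:
        break
    }
}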

iOS: Detecting touch down, segue, touch up

User puts her finger on the screen. This triggers a UITouchEvent, phase Began, which calls the touchesBegan:withEvent: method in controllerA, which performs a segue from controllerA to controllerB.
User lifts her finger off the screen. This triggers a UITouchEvent, phase Ended, which calls some callback method.
Question: What and where is this callback method? It's not in controllerA, and it's not in controllerB. From what I can tell, it's not in any view. But it exists.
To clarify, here's what's going on (according to @switz):
In response to -touchesBegan:withEvent:, a view controller is presented
modally via a segue
When the user lifts up their finger, the view controller should be dismissed.
The question is how to react to the finger being lifted, since
-touchesEnded:withEvent: is not invoked.
The short answer is the presented view controller needs to use the "Over Full
Screen" modalPresentationStyle instead of the default "Full Screen" style
(this can either be specified as the presentation style of the segue, or if
that's "Default" then the presentation style of the presented view controller).
The long answer requires a brief overview of how touch handling works. This
explanation ignores gesture recognizers:
When a touch begins, it's delivered to the "topmost" view that contains the
touch point. From there it gets passed along the responder chain until some
object decides to handle the touch (which is signified by implementing
-touchesBegan:withEvent: and not calling super).
Subsequent changes to the touch (e.g. moved, ended, canceled) are delivered back
to the same view that accepted the touch. The view will continue to receive the
touch events until the touch finishes or cancels.
A touch is canceled either when the application moves to the background (because
e.g. a phone call came in), or when a UIKit class like UIScrollView decides
that it needs to take over touch handling (because the finger moved far enough
that it looks like the user wants to scroll). There's also some funny stuff here
with UIScrollView.delaysContentTouches, but that can be ignored.
But there's a wrinkle, something that isn't documented: touch delivery only
happens so long as the view remains associated with the window. If the view that
is considered "topmost" (the view that is associated with the UITouch) is
removed from the window, then the touch is considered to have vanished and,
importantly, no events for that touch are delivered again, to anyone. This is
true even if the view in question is not the object handling touches.
And that final wrinkle is the cause for this problem. Because the default
"Full Screen" presentation style actually removes the old view controller's
view from the window, the touch handling immediately stops. However, the "Over
Full Screen" presentation style does not remove it, it merely covers up the old
view with the one. "Over Full Screen" is typically used when the presented view
controller is not fully opaque, but in this case we're using it so touch
handling isn't interrupted.
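For example, in the same Swift 2 style as the code below (checking the segue identifier is left out for brevity):

override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    // Keep the presenting view in the window so touch delivery continues.
    segue.destinationViewController.modalPresentationStyle = .OverFullScreen
}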
But that's not all. There's another problem here, which is when the view that's
being touched lives inside a UIScrollView (one that either is scrollable or
always bounces). In that case, even with "Over Full Screen", you'll find that,
while the touch events continue to be delivered, moving your finger around a bit
suddenly causes the touch to be canceled. This is because the UIScrollView
doesn't know it's covered up and has decided that the user is actually trying to
scroll. This causes it to cancel the touch.
There is a solution to this, though. It's kind of ugly, but the solution is to
immediately cancel any scrolling on any enclosing scroll view when performing
the segue. This can be done with code like the following:
class ViewController: UIViewController {
    // this is called from -touchesBegan:withEvent: from a child view
    // the child view is `sender`
    func touchDown(sender: UIView) {
        var view = sender.superview
        while view != nil {
            if let scrollView = view as? UIScrollView {
                // toggle the panGestureRecognizer enabled state to immediately
                // cause it to fail.
                let enabled = scrollView.panGestureRecognizer.enabled
                scrollView.panGestureRecognizer.enabled = false
                scrollView.panGestureRecognizer.enabled = enabled
            }
            view = view?.superview
        }
        performSegueWithIdentifier(identifier, sender: self)
    }

    // ...
}
Of course, no discussion of touch handling would be complete without gesture
recognizers. Gesture recognizers change pretty much everything about touch
handling. They get first dibs on any touches, and they can interrupt view touch
handling at any time. For example, the UIScrollView's UIPanGestureRecognizer
is what is used for scrolling, and when it moves into the "began" state (because
the user has moved their finger enough), that's what causes the touch to be
canceled.
So given this, really the best solution here is to not implement
-touchesBegan:withEvent: at all, but to use a gesture recognizer. The easiest
solution here is to use a UILongPressGestureRecognizer with
minimumPressDuration set to 0 and allowableMovement set to some
ridiculously high value (since you don't want movement to cancel the touch). I'm
recommending this because UILongPressGestureRecognizer is a continuous
recognizer, meaning it will send events for Began, Moved, and Ended, and with
the recommended settings, it will send them in response to the touch beginning,
moving, and ending. What's more, once your recognizer starts handling the touch,
this automatically prevents any other recognizers (such as the scroll view's pan
recognizer) from "taking over" and canceling the touch.
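The setup might look roughly like this (a sketch in the same Swift 2 style; handlePress: is an assumed name):

let recognizer = UILongPressGestureRecognizer(target: self, action: "handlePress:")
recognizer.minimumPressDuration = 0
recognizer.allowableMovement = CGFloat.max   // don't let movement cancel the touch
view.addGestureRecognizer(recognizer)

func handlePress(recognizer: UILongPressGestureRecognizer) {
    switch recognizer.state {
    case .Began:
        // touch down: present the view controller
        break
    case .Ended, .Cancelled:
        // touch up: dismiss it
        break
    default:
        break
    }
}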
Note that if you're attaching your gesture recognizer to the scrollView itself
(e.g. a UITableView) but only want to respond to touches in certain locations
(such as on a row), then you'll need to restrict the recognizer. You can use the
delegate method gestureRecognizer(_:shouldReceiveTouch:) to do this, something
like this:
func gestureRecognizer(recognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    // if you might be the delegate of multiple recognizers, check for that
    // here. This code will assume `recognizer` is the correct recognizer.
    // We're also assuming, for the purposes of this code, that we're a
    // UITableViewController and want to only capture touches on rows in the
    // first section.
    let touchLocation = touch.locationInView(self.tableView)
    if let indexPath = self.tableView.indexPathForRowAtPoint(touchLocation) {
        if indexPath.section == 0 {
            // we're on one of the special rows
            return true
        }
    }
    return false
}
This way the recognizer won't prevent the tableView's panGestureRecognizer
from scrolling when the user touches elsewhere on the table.

UIPanGestureRecognizer.maximumNumberOfTouches not respected in nested scroll views?

I have a root UIScrollView that only scrolls vertically; this scroll view represents the rows of my jagged grid. I have configured this scroll view's pan gesture recognizer with two touches for both the minimum and maximum number of touches required.
Inside this scrollview I have one or more UIScrollView instances that only scrolls horizontally, these scrollviews each represent a single row in my jagged grid view. I have configured the pan gesture recognizers for all of these scroll views for one touch minimum, and two touches maximum.
So far it works: I get a nice jagged grid view where I can scroll vertically between rows, and horizontally to scroll each row independently. I have intentionally set the minimum number of touches to 2 so as not to interfere with scrolling if I add, for example, a UITableView as a subview for any cell within this jagged grid view (cell == a position defined by a row and a column in that row).
Using a UITableView as a cell works; the table view itself behaves as expected. But scrolling with two fingers also scrolls inside the table view, not the root scroll view for scrolling vertically between rows.
I have tried configuring the table view's pan gesture recognizer to allow a maximum of one touch, in the hope that two-finger touches would be ignored. This does not work; the maximumNumberOfTouches property of the table view's pan gesture recognizer seems to be ignored.
What could I have done wrong?
A screen shot displaying the layout to clarify what I have done:
Multiple levels of scrolling tend to get tricky, and I don't know for sure, but I think Apple does not encourage this. Even so, I still think it's possible. It may be that vertical scrolling in the table view gets mixed up with the scroll view's vertical scrolling, or something else.
Try checking if the delegates for the gesture recognizers are correctly set.
Another way around this is:
- having a scroll view with buttons, from which you can open popovers with custom controllers (insert whatever you want there).
- creating a big UITableViewController and setting the cells' contents as scroll views, etc. I think you could get the same result.
My advice is not to get stuck on just one method when there could be others that are simpler and more intuitive.
Table views on scroll views are generally not a great idea. When a table view receives touches, even if it doesn't need to do anything with them, it won't send them to its superview.
You might want to try either of these 2 things:
In your table view, send the touches to your superview manually and let it handle them appropriately. I've seen this method used in one of my side projects, but I'm not able to post an example of it at this time.
The second might be easier to implement. Since UITableView is a subclass of UIScrollView, you can rely on delaysContentTouches on those table views. This property delays delivery of the touch-down event on the table view until it can determine whether scrolling is the intent, as described in the Apple docs: http://developer.apple.com/library/ios/#documentation/uikit/reference/UIScrollView_Class/Reference/UIScrollView.html#//apple_ref/occ/cl/UIScrollView
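A short sketch of that second option, assuming nestedTableViews is your collection of row table views (the property defaults to true, so this mainly matters if it was turned off somewhere):

for tableView in nestedTableViews {
    // Delay touch-down delivery until the scroll views have had a chance
    // to claim the drag as a scroll.
    tableView.delaysContentTouches = true
}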
Let me know if either of the 2 ways works for you, I'm quite curious about this subject generally.
But why don't you try some tricks rather than implementing all such changes:
1) By default, disable scrolling on the table view when the view is created.
2) Once the view has been generated, use the gestures to recognize whether the user is scrolling with a single touch or multiple touches, and whether the touch is in the child scroll view. Based on the view's tag and the gesture, you can then enable scrolling on the table view.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Get the tag of the scroll view.
    // Check whether it is the parent or the child scroll view.
    // If it's the child, enable scrolling on the table view.
}

- (void)tapAction:(UIGestureRecognizer *)gestureRecognizer
{
    // CGPoint point = [gestureRecognizer locationInView:gestureRecognizer.view];
    // [[[gestureRecognizer.view subviews] objectAtIndex:0] removeFromSuperview];
    // imageContent = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 200, 250)];
    // [imageContent setImage:[UIImage imageNamed:@"Default.png"]];
    NSLog(@"You tapped @ this: %ld", (long)gestureRecognizer.view.tag);
    // Recognize the gestures
}
There may be some unnecessary code since there is no code snippet in your question. I gave it a try and showed a trick that might work for you and solve the problem. ;)
Try this link and pay attention to how they solve nested views.
Remember the best practices for handling multitouch events:
When handling events, both touch events and motion events, there are a few recommended techniques and patterns you should follow.
Always implement the event-cancellation methods.
In your implementation, you should restore the state of the view to what it was before the current multitouch sequence, freeing any transient resources set up for handling the event. If you don’t implement the cancellation method your view could be left in an inconsistent state. In some cases, another view might receive the cancellation message.
If you handle events in a subclass of UIView, UIViewController, or (in rare cases) UIResponder,
You should implement all of the event-handling methods (even if only as null implementations); a minimal sketch follows this list.
Do not call the superclass implementation of the methods.
If you handle events in a subclass of any other UIKit responder class,
You do not have to implement all of the event-handling methods.
But in the methods you do implement, be sure to call the superclass implementation. For example,
[super touchesBegan:theTouches withEvent:theEvent];
Do not forward events to other responder objects of the UIKit framework.
The responders that you forward events to should be instances of your own subclasses of UIView, and all of these objects must be aware that event-forwarding is taking place and that, in the case of touch events, they may receive touches that are not bound to them.
Custom views that redraw themselves in response to events should only set drawing state in the event-handling methods and perform all of the drawing in the drawRect: method.
Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let UIKit handle responder-chain traversal.
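A minimal sketch of those rules for a UIView subclass (all four handlers implemented, no super calls):

class TouchHandlingView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Set up any transient state for this multitouch sequence.
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Update drawing state here; do the actual drawing in draw(_:).
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Commit the result of the touch sequence.
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Restore the view to its pre-touch state and free any transient
        // resources set up for handling the event.
    }
}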
