How to use custom controls on Mapbox MGLMapView - iOS

I want to implement KCFloatingActionButton (or something similar) on a Mapbox MGLMapView but can't get the button action to work. The button recognises the initial touch, the colour changes, but nothing else happens. I'm not sure if the map is stealing touches or has higher precedence somehow.
Inspecting the view hierarchy shows the UIButton and associated UIViews are above the map so I'm at a loss. I thought it might have been the use of views in the button as well as the actual UIButton object but I'm able to attach a gestureRecognizer to a simple UIView on the map no problem (except the map still registers drags and taps through the view). Also vanilla UIButton objects work fine.
EDIT: Just swapped view to MKMapView and button works fine.

So you learn something new every day. Just wish it was yesterday.
Superviews contain subviews. In my case the MGLMapView contains a KCFloatingActionButton (itself deriving from UIView). Superviews can also have gesture recognizers attached; in my case the mapView has the usual taps and swipes normally associated with maps. The default behaviour is for those gestures not to be blocked by subviews, so the mere presence of the button doesn't stop the superview from recognising its gestures.
By overriding UIView's gestureRecognizerShouldBegin(_:) in the button subclass:
override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
    return false
}
I can selectively control how my 'button' handles gestures coming from the superview. As I read it, UIButton defaults to this behaviour for single taps, but my UIView-based button doesn't.
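For illustration, here's a minimal sketch of blocking the map's gestures selectively rather than unconditionally, assuming a UIView-based floating button; the class name is illustrative, not from KCFloatingActionButton:
import UIKit

final class FloatingButtonView: UIView {
    override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        // Block recognisers attached to the map (or any superview) for touches
        // landing on the button, but let the button's own recognisers run.
        if gestureRecognizer.view !== self {
            return false
        }
        return super.gestureRecognizerShouldBegin(gestureRecognizer)
    }
}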

Related

UITextInteraction + UIScrollView

I have a custom text input view that adopts the UITextInput protocol. This view is embedded within a UIScrollView. The UITextInput-adopting view includes UITextInteraction. When I drag the selection handles to the edges (top or bottom) of the currently visible area, the view should scroll and select text automatically (like UITextView). Does anybody know how to achieve this?
I was hoping that UITextInteraction might inform me of the necessary events via its delegate, or even take care of this functionality automatically, but it does neither.
I have tried to intercept closestPosition(to point: CGPoint) -> UITextPosition?, which is called whenever the user touches the UITextInput-adopting view, so I can use it to track the dragging of a selection handle. Once the user reaches the top of the view, I scroll up. However, I cannot detect when the user lets go of the handle (when the touch ends). When this happens, the scroll view should stop scrolling; in my case, it keeps scrolling to the top.
I also tried to intercept selectionRects(for range: UITextRange) -> [UITextSelectionRect], but it is only called sporadically during scrolling.
I also cannot detect touchesEnded(); UITextInteraction seems to block the call. Further, I cannot implement my own pan gesture. UITextInteraction blocks this as well during a selection operation.
Has anybody successfully used UITextInteraction? It seems very premature at this stage.
Here's the way I got this info out of UITextInteraction for the text insertion point placement. I have not yet attempted to track the selection handles, but will update the answer if I manage it.
UITextInteraction has a property called gesturesForFailureRequirements. If you dump the contents of these you'll notice one named UIVariableDelayLoupeGesture, which is a subclass of UILongPressGestureRecognizer. This is the one we want, for when the user "picks up" the cursor with their finger.
I add my own target/selector to this gesture recogniser when adding my text interaction, like so:
for gesture in interaction.gesturesForFailureRequirements {
    if gesture.isKind(of: UILongPressGestureRecognizer.self) {
        gesture.addTarget(self, action: #selector(longPressEdgeScrollGesture(_:)))
    }
}
Then in longPressEdgeScrollGesture you can get the coordinates of the cursor in your scroll view to start or stop your edge-scrolling timer as necessary:
@objc private func longPressEdgeScrollGesture(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.numberOfTouches > 0 else {
        return
    }
    let location = gesture.location(in: gesture.view)
    print(location)
    // Start/stop your edge scrolling as required
}
I found a solution to my original problem. It is somewhat of a workaround, but hopefully Apple will improve UITextInteraction.
Trying to intercept any functions in UITextInput led nowhere. Thankfully some very clever people on Twitter figured out what to do: you need to intercept the right UIGestureRecognizer that is added automatically, much like Simeon explained in his answer. The recogniser in question is called UITextRangeAdjustmentGestureRecognizer, but you cannot find it via gesturesForFailureRequirements. You need to become a delegate of the interaction's gestures, and then you can find the mentioned recogniser via the delegate method gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:).
Once you add a target + action to that gesture you can observe the dragging and handle edge scrolling.
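Here's a rough sketch of that approach, assuming your view has been installed as the delegate of the interaction's gesture recognisers; the view and handler names are illustrative, and the private class is matched by name since it isn't public API:
import UIKit

final class SelectionTrackingView: UIView, UIGestureRecognizerDelegate {
    private var observedGestures = Set<UIGestureRecognizer>()

    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        // The range-adjustment recogniser is private, so match its class by name.
        if String(describing: type(of: otherGestureRecognizer)) == "UITextRangeAdjustmentGestureRecognizer",
           observedGestures.insert(otherGestureRecognizer).inserted {
            otherGestureRecognizer.addTarget(self, action: #selector(handleRangeAdjustment(_:)))
        }
        return true
    }

    @objc private func handleRangeAdjustment(_ gesture: UIGestureRecognizer) {
        switch gesture.state {
        case .changed:
            // Drag location of the selection handle; begin edge scrolling near the edges.
            let location = gesture.location(in: self)
            print(location)
        case .ended, .cancelled:
            // The handle was released: stop any edge-scroll timer here.
            break
        default:
            break
        }
    }
}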

Disable UIPageViewController scrolling when touches start on UISwitch contained in one of the pages

I have a UIPageViewController where one page is a Settings screen. On that screen, there is a UISwitch to toggle a setting on or off.
While I've noticed many people tap a UISwitch to toggle it like I do, I've observed that some users slide a UISwitch to toggle it.
Attempting to slide the UISwitch can cause problems when it's on a UIViewController that's part of a UIPageViewController, since sliding the switch can start sliding the UIPageViewController as if the user wants to change pages.
This behavior feels very broken and inconsistent. It seems that if the user touches the switch, but hesitates briefly before sliding, the touches are registered by the UISwitch and it works as the user expects. But if the user touches the UISwitch and immediately starts sliding, the UIPageViewController gets the touches instead. There seems to be a very fine line (a hesitation threshold) between the UISwitch getting the touches or not.
How would you solve this problem?
Here's an example that starts with simple taps to toggle the UISwitch, and then shows some of the different ways to try to slide the UISwitch that lead to inconsistent results.
Possible Solutions & Issues
One way I've considered resolving this is to detect touches anywhere on the Settings UIViewController, and if the touches begin somewhere in the frame of the UISwitch, prevent the UIPageViewController from sliding.
My worry is that simply disabling sliding for the UIPageViewController would not guarantee the touches are passed to the UISwitch. That would mean a user might try to slide the UISwitch and the UIPageViewController would not slide, but the UISwitch would also not respond to the touches, making it still seem broken and inconsistent.
To start exploring this possible solution, I've tried detecting touches by overriding the touchesBegan method like this:
override func touchesBegan(_ touches: Set<UITouch>,
                           with event: UIEvent?) {
    print("touchesBegan")
    super.touchesBegan(touches, with: event)
}
I've tried detecting touches this way on both the UIPageViewController and the UIViewController of the specific Settings page, but neither gets called. I've also tried setting view.isUserInteractionEnabled = true on the views, and different combinations of true or false on the different UIViewControllers, but still can't seem to detect touches.
I've also considered ditching the whole UIPageViewController paradigm for this app and making the Settings screen a modal instead, since that would remove the need for this kind of complicated solution. But there are other advantages to the UIPageViewController paradigm in this app, so I want to explore whether it can be kept. Ultimately, that seems like a better idea than weird workarounds, but I wanted to post this anyway in case someone else experiences the same issues or anyone has other possible solutions.
UPDATE: SOLUTION
I spent a ton of time experimenting based on the answers that were submitted, while also learning a lot about how iOS works, so thank you to everyone who answered.
Ultimately, Leon's answer set me down the path to a simple solution using a UIPanGestureRecognizer. It's very similar to what I tried with a UISwipeGestureRecognizer based on Carl's answer, which ultimately didn't produce the same results.
I ended up subclassing UISwitch and adding a UIPanGestureRecognizer that does nothing. This makes it behave exactly as I want: any time a user starts sliding the switch the UIPageViewController does not pan.
class NoPanSwitch: UISwitch {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // A pan recognizer with no action: it claims pans that begin on the
        // switch, so the UIPageViewController's pan never sees them.
        let recognizer = UIPanGestureRecognizer(target: self, action: nil)
        addGestureRecognizer(recognizer)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
UIPageViewController has a gestureRecognizers property for the gesture recognisers controlling interactions within a page. I would try adding a pan gesture recogniser to the UISwitch and adding it to the page controller. The effect I would expect is that the page controller's pan gesture recogniser will require the recognisers in that array to fail before recognising a page pan, so the scrolling doesn't happen when you flick the switch. If it doesn't work out like that, you might want to swap the page controller for a scroll view with a stack view or collection view containing your controllers. Then you'll have full control over the scroll view's panGestureRecognizer to achieve the above behaviour.
Can you subclass the UISwitches you are using? This sounds more like a job for that class, overriding gestureRecognizerShouldBegin() to return NO for a swipe gesture recognizer moving horizontally. The documentation for that method mentions that UISlider uses it for the same purpose; UISwitch should probably do the same.
Another option might be to create a custom UIGestureRecognizer which, via delegate or the UIGestureRecognizerSubclass methods, forces any other UISwipeGestureRecognizer to fail if it starts over a UISwitch, and is itself set with cancelsTouchesInView to false (and does nothing when it "succeeds").
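A minimal sketch of the subclassing idea; it assumes the page controller's scrolling is driven by a horizontal pan recogniser, and the class name is illustrative:
import UIKit

final class SwipeFriendlySwitch: UISwitch {
    override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        // Refuse horizontal pans from other views (e.g. the page controller)
        // that begin on the switch, so a slide is never stolen from it.
        if let pan = gestureRecognizer as? UIPanGestureRecognizer, pan.view !== self {
            let translation = pan.translation(in: self)
            if abs(translation.x) > abs(translation.y) {
                return false
            }
        }
        return super.gestureRecognizerShouldBegin(gestureRecognizer)
    }
}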
Just try this one and see whether it works the way you want or not.
Detect the touches on the UISwitch: when the UISwitch's touchesBegan fires, set the delegate and dataSource of the UIPageViewController to nil.
On touchesEnded, set the dataSource and delegate of the UIPageViewController again.
This way you will be able to interact with your UISwitch as you want.
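A rough sketch of this suggestion, assuming the switch is given a weak reference to the page view controller (all names here are illustrative); note that a nil dataSource disables UIPageViewController's swipe navigation:
import UIKit

final class PageBlockingSwitch: UISwitch {
    weak var pageViewController: UIPageViewController?
    private var savedDataSource: UIPageViewControllerDataSource?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Suspend page swiping while the switch is being touched.
        savedDataSource = pageViewController?.dataSource
        pageViewController?.dataSource = nil
        super.touchesBegan(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        pageViewController?.dataSource = savedDataSource
        super.touchesEnded(touches, with: event)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        pageViewController?.dataSource = savedDataSource
        super.touchesCancelled(touches, with: event)
    }
}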

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that, I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to be able to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way that I can think to do this is to wire all of the gesture handlers to the ViewController and let it do everything, but to me that breaks encapsulation, as it now has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when the drawer is open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to make the subviews of the drawer disabled while panning. Essentially, loop through the drawer's subviews and disable interaction on them:
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
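A minimal sketch of that second suggestion in the drawer's pan handler; the drawer class and its contentView are illustrative names:
import UIKit

final class DrawerView: UIView {
    let contentView = UIView()  // the embedded content, e.g. the map

    override init(frame: CGRect) {
        super.init(frame: frame)
        addSubview(contentView)
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handleDrawerPan(_:)))
        addGestureRecognizer(pan)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handleDrawerPan(_ gesture: UIPanGestureRecognizer) {
        switch gesture.state {
        case .began, .changed:
            // Mid-drag: keep a second finger from reaching the half-visible content.
            contentView.isUserInteractionEnabled = false
        case .ended, .cancelled, .failed:
            contentView.isUserInteractionEnabled = true
        default:
            break
        }
        // ... update the drawer's position from gesture.translation(in: self) ...
    }
}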

iOS - Filtering and forwarding touches to subviews

The application I'm building has a full-screen MKMapView, with another UIView subclass placed over it, also full-screen and completely transparent. I would like the UIView subclass to handle single-touch gestures, such as taps and single-finger drags, and ignore anything else. This would allow the MKMapView to be interacted with by other means, especially panning/scrolling with two fingers, with the 3D functions disabled.
My issue here is that MKMapView does not use the touchesXXX:withEvent: methods for its user interaction. So, I can't detect touch count in those methods on the view and forward to the map. Likewise, the hitTest:withEvent: method can't be used to determine which view handles the touches, because the UIEvent object there returns an empty set of touches.
I've considered letting all touches forward through the view and using a gesture recognizer to handle events, but I really need the single touch/drag on the overlay view to have no effect on the map view.
Is there a way to accomplish this filtering based on the number of touches? Or a way to disable the single touch gestures on the map view?
The solution to this is actually very simple.
1. Give the map view a parent view that it fills completely.
2. Give the parent view pan and tap gesture recognizers configured to respond only to one-finger touches.
3. On the MKMapView, set the scrollEnabled property to NO (the "Allows Scrolling" checkbox in IB).
The gesture recognizers allow you to get the gestures, and setting scrollEnabled to NO prevents the MapView from swallowing the pan gestures.
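A sketch of those three steps in code; the controller name and handler methods are illustrative:
import UIKit
import MapKit

final class MapContainerViewController: UIViewController {
    private let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        mapView.isScrollEnabled = false  // step 3: stop the map swallowing pans
        view.addSubview(mapView)         // step 1: a parent view the map fills

        // Step 2: one-finger recognizers on the parent view.
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        pan.maximumNumberOfTouches = 1
        view.addGestureRecognizer(pan)

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        view.addGestureRecognizer(tap)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) { /* single-finger drag */ }
    @objc private func handleTap(_ gesture: UITapGestureRecognizer) { /* single tap */ }
}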
Sample project here: https://github.com/Linux-cpp-lisp/sample-no-gesture-mapview

How to get stepper and longpress to coexist?

I tried setting up a view with a longpress gesture and a stepper configured for continuous updates. With the longpress, the continuous feature of the stepper does not occur. For now, I've disabled the longpress. I guess I don't need it. But for future reference, how would I allow for both to coexist?
Just to be clear, here is the way the screen was set up when I tried this.
App was set up with a simple view controller.
A subview was added to this view (could have been a controller, but I just made it a UIView).
Several labels and stepper were added to this subview.
The steppers were wired up as outlets and actions.
A longpress recognizer was added to the main view in IB.
For completeness, a tap gesture was also added to the main view in IB.
Taps on the main view function as expected. Taps on the steppers function as expected. Longpress on the main view functions as expected. Longpress on the stepper does not.
I modified the code called by the longpress to check the frame of the subview and not act if the touch location was within that rectangle, but that didn't make a difference. I did not try getting the longpress to fail in that situation, but I suppose I'll try that next. EDIT: OK, maybe not. There doesn't seem to be an API for that. However, there is this kludge, which I'm not going to try.
Attached is a screenshot from the profiler with an inverted call tree so you can see what each item is being called by.
darkStepped: is the IBAction that is called by the stepper. If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
"If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?"
The stack trace reveals that the stepper's _updateCount method is dispatched through a timer.
This could be related to the fact that a stepper has an "autoIncrement" mode where, as long as your keep it pressed, it will update at a given (varying) rate. So, instead of simply calling _updateCount, the stepper sets up a timer to handle this behaviour.
For whatever reason the timer is used, the timer explains why you do not see the gesture recogniser in the stack trace.
In your case, what happens is that the stepper gets the touches, handles them, and does not forward them to any gesture recognisers attached to it.
This can be explained by the following excerpt from Apple's docs, although it does not explicitly mention a long press recogniser in relation to a UIStepper control:
Interacting with Other User Interface Controls
In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button’s parent view, and the user taps the button, then the button’s action method receives the touch event instead of the gesture recognizer. This applies only to gesture recognition that overlaps the default action for a control, which includes:
A single finger single tap on a UIButton, UISwitch, UIStepper, UISegmentedControl, and UIPageControl.
...
If you have a custom subclass of one of these controls and you want to change the default action, attach a gesture recognizer directly to the control instead of to the parent view. Then, the gesture recognizer receives the touch event first. As always, be sure to read the iOS Human Interface Guidelines to ensure that your app offers an intuitive user experience, especially when overriding the default behavior of a standard control.
So, it seems you can attach the gesture recogniser directly to the control (possibly you need to subclass UIStepper for this to work, I am not really sure how to interpret the last paragraph). Hopefully this will not disable the basic workings of the stepper (but maybe it will).
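For illustration, a minimal sketch of attaching the recogniser directly to the control, per the quoted docs; the names here are illustrative, and as noted above this may interfere with the stepper's continuous-update behaviour:
import UIKit

final class StepperViewController: UIViewController {
    private let stepper = UIStepper()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(stepper)
        // Attach the recogniser to the control itself rather than the parent
        // view, so it receives the touch ahead of the control's default action.
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handleLongPress(_:)))
        stepper.addGestureRecognizer(longPress)
    }

    @objc private func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        // Long-press handling here.
    }
}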
After carefully reviewing Apple's docs again, I've found the solution. I added the view controller as the delegate of the longpress gesture recognizer:
self.longPress.delegate = self;
(and, of course, added <UIGestureRecognizerDelegate> to the interface), and then added this method to the view controller:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Determine if the touch is inside the custom subview
    if (gestureRecognizer == self.longPress) {
        CGPoint touchLocation = [touch locationInView:self.view];
        if (CGRectContainsPoint(self.antControl.frame, touchLocation)) {
            return NO;
        }
    }
    return YES;
}
This way the gesture recognizer doesn't even get called when the longpress occurs within the frame of self.antControl, which is the subview mentioned in the question.
