Pan gesture won't forward touch to Mapbox MGLMapView - iOS

I am trying to achieve something very basic with an MGLMapView from the Mapbox iOS SDK. I render an MGLMapView with scrolling enabled so the user can move the map with a pan gesture; I would like to detect when the map has been moved, so I attached a pan gesture recognizer to the view.
Please note that I am using the interface builder to do so, as you can see on the screenshot below.
I linked the gesture recognizer to an IBAction, which is indeed triggered any time one tries to move the map. However, the map does not move, i.e. the touch event is no longer forwarded to the MGLMapView. Of course I unchecked the 'Cancels touches in view' option of my pan gesture recognizer. Just to be sure, I also linked the pan gesture recognizer to an IBOutlet in my code so I could set its cancelsTouchesInView property to false, but it does not change anything.
I tried to add a tap gesture recognizer (2 touches) in a similar way and it works fine, i.e. the associated IBAction is triggered and the touch is forwarded to the map view (the map view zooms in on double tap).
What did I miss here with the pan gesture recognizer?
Thanks a lot for your help.

So I used a different approach to achieve what I wanted: rather than attaching a new pan gesture recognizer to the map view, I attach a new target to the existing pan gesture recognizer.
for gestureRecognizer in self.mapView.gestureRecognizers! {
    if gestureRecognizer is UIPanGestureRecognizer {
        gestureRecognizer.addTarget(self, action: #selector(self.panOnMap))
        break
    }
}
Then both default panning and my method panOnMap are called.
I am still not fully satisfied with this solution since it looks more like a hack. Furthermore, I noticed that there are two pan gesture recognizers attached to the map view, and I am not sure which one I should target.

Related

Executing pan gesture recognizer ONLY after long press recognizer has fired

I'm trying to implement a drag and drop UI for my UIView using the pan gesture recognizer. I have that piece of code working, but now I want to execute the drag and drop logic only AFTER the user has long pressed on my to-be-dragged view.
I'm implementing the code from this question:
Recognize long press and pan gesture recognizers together, but it's not exactly what I want. Any ideas?
Set up your view controller as the delegate of the pan gesture recognizer.
Implement the gestureRecognizerShouldBegin(_:) method. Return false until after the long press gesture recognizer fires.
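A minimal sketch of the delegate approach described above; the property and handler names (longPressFired, handleLongPress) are assumptions for illustration:

```swift
import UIKit

class DragViewController: UIViewController, UIGestureRecognizerDelegate {
    // Tracks whether the long press has fired yet.
    var longPressFired = false

    @objc func handleLongPress(_ sender: UILongPressGestureRecognizer) {
        switch sender.state {
        case .began:
            longPressFired = true
        case .ended, .cancelled, .failed:
            longPressFired = false
        default:
            break
        }
    }

    // Set as the pan recognizer's delegate: the pan may only begin
    // once the long press has fired.
    func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
        if gestureRecognizer is UIPanGestureRecognizer {
            return longPressFired
        }
        return true
    }
}
```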
Found another post whose title was a bit misleading so I didn't look into it too much the first time.
Combine longpress gesture and drag gesture together
It turns out, UILongPressGestureRecognizer can already give me the drag and drop effect that I want. That means I do NOT need the UIPanGestureRecognizer at all. I just reused the pan gesture's selector/handler for the long press gesture. The long press gesture doesn't have the pan's translation property, so I use
myView.center = sender.location(in: myView.superview)
to achieve the same dragging effect.
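The approach above can be sketched as a single long press handler; myView and handleLongPress are assumptions for illustration:

```swift
import UIKit

class DragViewController: UIViewController {
    let myView = UIView()

    @objc func handleLongPress(_ sender: UILongPressGestureRecognizer) {
        switch sender.state {
        case .began, .changed:
            // A long press has no translation property, so follow the
            // touch location directly to get the same dragging effect.
            myView.center = sender.location(in: myView.superview)
        default:
            break
        }
    }
}
```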

How many types of gestures does iOS support?

How many types of gestures can iOS recognize, and how do we handle a gesture when it occurs?
Language: Swift / Objective-C
Platform: Xcode
There are seven types of gestures supported in iOS:
Tap Gesture Recognizer
Pinch Gesture Recognizer
Rotation Gesture Recognizer
Swipe Gesture Recognizer
Pan Gesture Recognizer
Screen Edge Pan Gesture Recognizer
Long Press Gesture Recognizer
Basically it comes down to the built-in UIGestureRecognizer subclasses, which are:
Tapping (any number of taps) - UITapGestureRecognizer
Pinching in and out (for zooming a view) - UIPinchGestureRecognizer
Panning or dragging - UIPanGestureRecognizer
Swiping (in any direction) - UISwipeGestureRecognizer
Rotating (fingers moving in opposite directions) - UIRotationGestureRecognizer
Long press (also known as “touch and hold”) - UILongPressGestureRecognizer
If this isn't what you need, you can create a subclass of UIGestureRecognizer and come up with your own solution. You can find everything you need on this topic in the Apple docs.
Since iOS 9 there's a "Peek and Pop" action, aka hard press, aka 3D Touch - some might consider this a gesture too, but it's a little bit more complicated. You can find some info here.

iOS - Filtering and forwarding touches to subviews

The application I'm building has a full-screen MKMapView, with another UIView subclass placed over it, full-screen as well and completely transparent. I would like for the UIView subclass to handle single touch gestures, such as taps and single finger drags, and ignore anything else. This would allow the MKMapView to be interacted with using other means, especially panning/scrolling with two fingers by disabling 3D functions.
My issue here is that MKMapView does not use the touchesXXX:withEvent: methods for its user interaction. So, I can't detect touch count in those methods on the view and forward to the map. Likewise, the hitTest:withEvent: method can't be used to determine which view handles the touches, because the UIEvent object there returns an empty set of touches.
I've considered letting all touches forward through the view and using a gesture recognizer to handle events, but I really need the single touch/drag on the overlay view to have no effect on the map view.
Is there a way to accomplish this filtering based on the number of touches? Or a way to disable the single touch gestures on the map view?
The solution to this is actually very simple.
Give the map view a parent view that it fills completely
Give the parent view pan and tap gesture recognizers configured to only respond to one finger touches
On the MKMapView, set the scrollEnabled property to NO (the "Allows Scrolling" checkbox in IB)
The gesture recognizers allow you to get the gestures, and setting scrollEnabled to NO prevents the MapView from swallowing the pan gestures.
Sample project here: https://github.com/Linux-cpp-lisp/sample-no-gesture-mapview
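The three steps above can be sketched in code rather than IB; the class and handler names are assumptions for illustration:

```swift
import MapKit
import UIKit

class MapContainerViewController: UIViewController {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Step 1: the parent view (self.view) fully contains the map.
        mapView.frame = view.bounds
        // Step 3: stop the map view from swallowing pan gestures.
        mapView.isScrollEnabled = false
        view.addSubview(mapView)

        // Step 2: a pan recognizer on the parent, one-finger touches only.
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
        pan.minimumNumberOfTouches = 1
        pan.maximumNumberOfTouches = 1
        view.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ sender: UIPanGestureRecognizer) {
        // Handle the single-finger drag yourself here.
    }
}
```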

How to get stepper and longpress to coexist?

I tried setting up a view with a longpress gesture and a stepper configured for continuous updates. With the longpress, the continuous feature of the stepper does not occur. For now, I've disabled the longpress. I guess I don't need it. But for future reference, how would I allow for both to coexist?
Just to be clear, here is the way the screen was set up when I tried this.
App was set up with a simple view controller.
A subview was added to this view (could have been a controller, but I just made it a UIView).
Several labels and steppers were added to this subview.
The steppers were wired up as outlets and actions.
A longpress recognizer was added to the main view in IB.
For completeness, a tap gesture was also added to the main view in IB.
Taps on the main view function as expected. Taps on the steppers function as expected. Longpress on the main view functions as expected. Longpress on the stepper does not.
I modified the code called by the long press to check the frame of the subview and do nothing if the touch location was within that rectangle, but that didn't make a difference. I did not try making the long press fail in that situation, but I suppose I'll try that next. EDIT: OK, maybe not. There doesn't seem to be an API for that. However, there is a kludge that I'm not going to try.
Attached is a screen shot from profiler with an inverted call tree so you can see what each item is being called by.
darkStepped: is the IBAction that is called by the stepper. If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
If the stepper were triggered by a gesture recognizer, wouldn't I expect to see the gesture recognizer in the call tree?
The stack trace reveals that the stepper's _updateCount method is dispatched through a timer.
This could be related to the fact that a stepper has an "autoIncrement" mode where, as long as your keep it pressed, it will update at a given (varying) rate. So, instead of simply calling _updateCount, the stepper sets up a timer to handle this behaviour.
For whatever reason the timer is used, the timer explains why you do not see the gesture recogniser in the stack trace.
In your case, what happens is that the stepper gets the touches, handles them, and does not forward them to any gesture recognizers attached to it.
This can be explained as follows, although this snippet does not explicitly mention a long press recogniser in relation to a UIStepper control:
According to Apple Docs:
Interacting with Other User Interface Controls
In iOS 6.0 and later, default control actions prevent overlapping gesture recognizer behavior. For example, the default action for a button is a single tap. If you have a single tap gesture recognizer attached to a button’s parent view, and the user taps the button, then the button’s action method receives the touch event instead of the gesture recognizer. This applies only to gesture recognition that overlaps the default action for a control, which includes:
A single finger single tap on a UIButton, UISwitch, UIStepper, UISegmentedControl, and UIPageControl.
...
If you have a custom subclass of one of these controls and you want to change the default action, attach a gesture recognizer directly to the control instead of to the parent view. Then, the gesture recognizer receives the touch event first. As always, be sure to read the iOS Human Interface Guidelines to ensure that your app offers an intuitive user experience, especially when overriding the default behavior of a standard control.
So, it seems you can attach the gesture recognizer directly to the control (possibly you need to subclass UIStepper for this to work; I am not really sure how to interpret the last paragraph). Hopefully this will not disable the basic workings of the stepper (but maybe it will).
After carefully reviewing Apple's docs again, I found the solution. I added the view controller as the delegate of the long press gesture recognizer:
self.longPress.delegate = self;
(and, of course, added <UIGestureRecognizerDelegate> to the interface), and then added this method to the view controller:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Determine if the touch is inside the custom subview
    if (gestureRecognizer == self.longPress) {
        CGPoint touchLocation = [touch locationInView:self.view];
        if (CGRectContainsPoint(self.antControl.frame, touchLocation)) {
            return NO;
        }
    }
    return YES;
}
This way the gesture recognizer doesn't even get called when the longpress occurs within the frame of self.antControl, which is the subview mentioned in the question.

How to differentiate between user swipe and tap action?

I am developing an app in which I have a view that contains a subview.
I want to track both swipe and tap actions such as a single click.
Actions should be tracked only when the user touches within my subview. When the user taps I want to perform one action, when the user swipes I want perform another.
For tracking the swipe, I implemented a swipe gesture recognizer and it is working fine. But I don't know how to track the tap action. Please guide me on how to achieve this.
The main thing is, when I tap it should call tap action only and vice versa.
You can use UITapGestureRecognizer for tap gestures.
"UITapGestureRecognizer is a concrete subclass of UIGestureRecognizer that looks for single or multiple taps. For the gesture to be recognized, the specified number of fingers must tap the view a specified number of times."
This class includes the numberOfTapsRequired ("The number of taps for the gesture to be recognized.") and numberOfTouchesRequired ("The number of fingers required to tap for the gesture to be recognized.") properties, where you can set exactly how you want it to react to user action.
In this case, as you only want it to be activated when tapped once, the default settings for both these properties (both have default values of 1) should be fine.
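A minimal sketch, assuming a view controller with a subView property (the names subView and handleTap are assumptions for illustration):

```swift
import UIKit

class TapAndSwipeViewController: UIViewController {
    let subView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        tap.numberOfTapsRequired = 1      // default: recognize a single tap
        tap.numberOfTouchesRequired = 1   // default: one finger
        subView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        // Tap action here; the swipe recognizer handles swipes separately.
    }
}
```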
The best place to get this information is the Defining How Gesture Recognizers Interact section of the Event Handling Guide for iOS:
When a view has multiple gesture recognizers attached to it, you may want to alter how the competing gesture recognizers receive and analyze touch events. By default, there is no set order for which gesture recognizers receive a touch first, and for this reason touches can be passed to gesture recognizers in a different order each time. You can override this default behavior to:
Specify that one gesture recognizer should analyze a touch before another gesture recognizer.
Allow two gesture recognizers to operate simultaneously.
Prevent a gesture recognizer from analyzing a touch.
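The three options above map onto concrete UIGestureRecognizer APIs; a sketch, where the class and recognizer names are assumptions for illustration:

```swift
import UIKit

class GestureCoordinator: NSObject, UIGestureRecognizerDelegate {
    func configure(tap: UITapGestureRecognizer, swipe: UISwipeGestureRecognizer) {
        // 1. One recognizer analyzes the touch before another:
        //    the tap only fires after the swipe has failed.
        tap.require(toFail: swipe)
        tap.delegate = self
        swipe.delegate = self
    }

    // 2. Allow two gesture recognizers to operate simultaneously.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }

    // 3. Prevent a gesture recognizer from analyzing a touch:
    //    return false here to keep the touch away from the recognizer.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldReceive touch: UITouch) -> Bool {
        return true
    }
}
```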
