iOS - forward touches to another view

A similar SO question exists for this problem, but unfortunately no suitable answers were provided.
I have a google maps view (GMSMapView) that is entirely covered by a transparent sibling view that acts as a container for thumbnail images. The thumbnails are child views of the container view, not the map view. These child views are randomly scattered about the map and therefore partially hide portions of the map's surface.
Tapping on one of these thumbnails triggers a segue to a different VC that shows a zoomed view of the image.
The problem:
Given that these thumbnails lie on top of the map, they prevent the normal map gestures from occurring if the gesture intersects one of the thumbnails. For example, if a user wishes to pinch-zoom, rotate or pan the map and one of his/her fingers begins overtop of a thumbnail, the touches are intercepted by the thumbnail.
Non-starters:
Obviously, I can't set userInteractionEnabled to false on a thumbnail because I need to detect tap gestures to trigger the segue.
I don't think I can customize the responder chain using UIView's hitTest:withEvent: and pointInside:withEvent: methods on the thumbnail views, because they are not in the same branch of the view hierarchy as the map view, AND the dispatching logic depends on the type of gesture (which I don't think is available at that point: touchesBegan, etc. are only called once the appropriate view has been chosen to receive the event). Please correct me if I'm wrong...
Attempted Solution:
Given the above, the strategy I'm attempting is to overlay all other views in the view controller with a transparent "touch interceptor view". This view's only purpose is to receive all touch messages -- by overriding touchesBegan(), touchesMoved(), and touchesEnded() -- and dispatch them to other views as appropriate.
In other words, depending on the type of gesture recognized (tap vs. anything else), I could call the appropriate target view's touchesBegan(), touchesMoved(), and touchesEnded() methods directly (the target being either one of the thumbnails or the map), forwarding the touches and event parameters.
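For reference, the interceptor is roughly the following (a minimal Swift sketch; targetView and the tap-vs-everything-else routing are stand-ins for the dispatching logic described above):

import UIKit

// Transparent view sitting above everything; it receives the raw touches
// and forwards them to a chosen target (a thumbnail or the map).
class TouchInterceptorView: UIView {

    // Placeholder: in practice this would be chosen per gesture.
    weak var targetView: UIView?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        targetView?.touchesBegan(touches, with: event)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        targetView?.touchesMoved(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        targetView?.touchesEnded(touches, with: event)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        targetView?.touchesCancelled(touches, with: event)
    }
}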
Unfortunately, while this works when the target view is a simple UIView, it seems most UIView subclasses (including GMSMapView) don't allow forwarding of touch events in this manner, as described in the following article (see the section "A Tempting Non-Solution").
Any ideas would be greatly appreciated.

Related

Order of UIGestureRecognizer touchesBegan(_:with:) and point(inside:with:)

The Question
When the user taps on a view, which of these two functions is called first: touchesBegan(_:with:) or point(inside:with:)?
The Context
I want to subclass PKCanvasView (which inherits from UIScrollView) to allow interaction through the view (i.e. pass interaction to the view below and disable interaction with the PKCanvasView) when the touch point falls outside the UIBezierPath of every stroke on the canvas.
This is easy enough by overriding point(inside:with:). My issue is that I only want to pass interaction to the view below if the touch's UITouch.TouchType is not .pencil, so that the user can draw with the Apple Pencil and interact with the view below using their finger.
The only way I think I can get this information is by also overriding touchesBegan(_:with:), where I can access the event and its touch type, and then somehow make that value readable inside point(inside:with:).
However, that all relies on extracting the UITouch.TouchType information before I check whether the touch point overlaps any PKStroke paths.
So: Is touchesBegan(_:with:) called before point(inside:with:)?
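For what it's worth, a minimal Swift sketch of the approach described above (assuming the cached touch type from touchesBegan(_:with:) is available early enough, which is exactly what the question hinges on, and approximating the stroke's UIBezierPath with its renderBounds):

import PencilKit
import UIKit

class PassThroughCanvasView: PKCanvasView {

    // Cached from the most recent touch; whether this is populated before
    // point(inside:with:) runs is the open question above.
    private var lastTouchType: UITouch.TouchType?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        lastTouchType = touches.first?.type
        super.touchesBegan(touches, with: event)
    }

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Pencil touches always stay with the canvas so drawing keeps working.
        if lastTouchType == .pencil {
            return true
        }
        // Finger touches belong to the canvas only when they land on a stroke;
        // otherwise they fall through to the view below.
        return strokeContains(point)
    }

    // Rough hit test against the drawn strokes (renderBounds, not the exact path).
    private func strokeContains(_ point: CGPoint) -> Bool {
        return drawing.strokes.contains { $0.renderBounds.contains(point) }
    }
}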

iOS - Filtering and forwarding touches to subviews

The application I'm building has a full-screen MKMapView, with another UIView subclass placed over it, full-screen as well and completely transparent. I would like the UIView subclass to handle single-touch gestures, such as taps and single-finger drags, and ignore anything else. This would allow the MKMapView to be interacted with by other means, especially panning/scrolling with two fingers (with the 3D functions disabled).
My issue here is that MKMapView does not use the touchesXXX:withEvent: methods for its user interaction. So, I can't detect touch count in those methods on the view and forward to the map. Likewise, the hitTest:withEvent: method can't be used to determine which view handles the touches, because the UIEvent object there returns an empty set of touches.
I've considered letting all touches forward through the view and using a gesture recognizer to handle events, but I really need the single touch/drag on the overlay view to have no effect on the map view.
Is there a way to accomplish this filtering based on the number of touches? Or a way to disable the single touch gestures on the map view?
The solution to this is actually very simple.
1. Give the map view a parent view that it fills completely.
2. Give the parent view pan and tap gesture recognizers configured to respond only to one-finger touches.
3. On the MKMapView, set the scrollEnabled property to NO (the "Allows Scrolling" checkbox in IB).
The gesture recognizers allow you to get the gestures, and setting scrollEnabled to NO prevents the map view from swallowing the pan gestures.
Sample project here: https://github.com/Linux-cpp-lisp/sample-no-gesture-mapview
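A minimal Swift sketch of that setup (a plain container view owning the map; the handlers are placeholders):

import MapKit
import UIKit

class MapContainerViewController: UIViewController {

    private let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. The map completely fills its parent (self.view acts as the container).
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        // 3. Stop the map from swallowing one-finger pans; two-finger
        //    gestures (zoom, rotate) still reach the map directly.
        mapView.isScrollEnabled = false
        view.addSubview(mapView)

        // 2. One-finger pan and tap recognizers on the parent view.
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        pan.minimumNumberOfTouches = 1
        pan.maximumNumberOfTouches = 1
        view.addGestureRecognizer(pan)

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tap.numberOfTouchesRequired = 1
        view.addGestureRecognizer(tap)
    }

    @objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
        // Single-finger drag handling for the overlay goes here.
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        // Single-finger tap handling for the overlay goes here.
    }
}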

Can two UICollectionViews respond to a single gesture?

I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen - both of them when it's a horizontal drag and only one of them when it's a vertical drag, a little like some media centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
I have a pan gesture recogniser on the View Controller with a Delayed Begin that can detect the vertical or horizontal movement before events are sent through to the views.
I know that simply forwarding events from my parent view's touchesBegan: etc. won't work because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually but I'd prefer to have the natural drag, swipe and snap paging behaviour for the Collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.
The short answer is that you can't; not easily, anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within another. The containing one is limited to horizontal scrolling and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and, since a collection view is a subclass of scroll view, that controller also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery was in my cellForItemAtIndexPath: method: when I instantiate the container cell, I need to register the .xibs for reuse (each overlay is different), then use the remembered overlay page and issue both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly.
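In Swift terms, that cellForItemAt logic looks roughly like this (ContainerCell, the reuse identifiers, and the item counts are hypothetical stand-ins for my actual classes):

import UIKit

// Hypothetical cell class from the container .xib, exposing the embedded
// overlay collection view as an outlet.
class ContainerCell: UICollectionViewCell {
    @IBOutlet weak var overlayCollection: UICollectionView!
}

class PagesViewController: UIViewController, UICollectionViewDataSource, UICollectionViewDelegate {

    @IBOutlet weak var containerCollection: UICollectionView!
    var containerNumber = 0
    var overlayNumber = 0

    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return collectionView == containerCollection ? 5 : 3   // placeholder counts
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        guard collectionView == containerCollection else {
            // Cells for the embedded (vertical) overlay collection.
            return collectionView.dequeueReusableCell(withReuseIdentifier: "OverlayCell", for: indexPath)
        }

        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "ContainerCell",
                                                      for: indexPath) as! ContainerCell
        // Each overlay is different, so register its nib for reuse here.
        cell.overlayCollection.register(UINib(nibName: "OverlayCell", bundle: nil),
                                        forCellWithReuseIdentifier: "OverlayCell")
        cell.overlayCollection.dataSource = self
        cell.overlayCollection.delegate = self

        // Restore the remembered overlay page so the embedded collection appears correctly.
        cell.overlayCollection.reloadData()
        cell.overlayCollection.scrollToItem(at: IndexPath(item: overlayNumber, section: 0),
                                            at: .centeredVertically, animated: false)
        return cell
    }
}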
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and in the case of the container collection the overlay collection IBOutlet).
And not a gesture recogniser in sight.

Hacking the iOS UI responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a main view containing SubView(A) and SubView(B), each of which has a number of subviews (1, 2, 3, ...):
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
It happens that I want to switch between A and B with a two-finger pan.
I have tried to attach a UIPanGestureRecognizer to MainView but the scrollviews cancel the touches and it only works sometimes.
I need a consistent method to first capture the touches, detect whether the gesture is a two-finger pan, and only then decide whether to pass the touches down (or up... I'm not sure) the responder chain.
I tried to create a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but couldn't piece together a solution to this one from theirs.
If anyone could shed some light on this, that would be great, as I'm getting desperate.
You can create a top-level view to capture the touches and their coordinates, then check whether the touch coordinates fall inside the subviews. You can do that with the
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
function, where rect is the frame of the view and point is the location of the touch.
Please note that frames and touch locations are relative to their superviews, so you need to convert them to the coordinate system of the app window.
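A minimal Swift sketch of that idea (observedViews is an assumed list of the views to check; frame.contains is the Swift counterpart of CGRectContainsPoint):

import UIKit

// Transparent top-level view that inspects where touches land.
class TouchCaptureView: UIView {

    // The subviews of interest, assumed to be assigned by the owning controller.
    var observedViews: [UIView] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let window = window, let touch = touches.first else { return }

        // Convert the touch location into the window's coordinate system...
        let pointInWindow = touch.location(in: window)

        for view in observedViews {
            guard let superview = view.superview else { continue }
            // ...and each view's frame as well, so the comparison is consistent.
            let frameInWindow = superview.convert(view.frame, to: window)
            if frameInWindow.contains(pointInWindow) {
                // The touch began inside this view; decide how to dispatch it.
                print("Touch began inside \(view)")
            }
        }
    }
}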
Or maybe this will be more helpful:
Receiving touch events on more then one UIView simultaneously

Passing touch events to appropriate sibling UIViews

I'm trying to handle touch events with touchesBegan in an overlay to a parent UIView, but also allow the touch input to pass through to the sibling UIViews underneath. I expected there would be some straightforward way to process a touch event and then say "now send it to the next responder as if this one didn't exist", but all I can find is the nextResponder method, which appears to give back the parent of my overlay view. That parent then doesn't really pass the event on to the next sibling of the overlay view, so I'm stuck on what seems like a simple task: one that's usually accomplished with a touch callback returning true or false to indicate whether to keep processing down the widget hierarchy.
Am I missing something obvious?
Late answer, but I think you would be better off overriding hitTest:withEvent: instead of touchesBegan. It seems to me that touchesBegan is a pretty "high-level" method that is there to do just one simple thing, so at that level you cannot alter whether the event is propagated further. The right place to do that is hitTest:withEvent:.
Also have a look at this S.O. answer for more details about this point.
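As a rough illustration of that point, here is a Swift sketch of an overlay that observes the touch location in hitTest(_:with:) and then lets hit-testing continue to the sibling views underneath (assuming the overlay itself has no interactive subviews):

import UIKit

class ObservingOverlayView: UIView {

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let hitView = super.hitTest(point, with: event)
        if hitView == self {
            // Do whatever processing the overlay needs with `point` here, then
            // return nil so the superview keeps hit-testing the siblings below.
            return nil
        }
        return hitView
    }
}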
I understand the desired behavior you're looking for, Joey. I haven't found anything in the API that supports this automatic messaging-up-the-chain behavior with sibling views.
What I originally wrote below was with respect to just informing a parent UIView about a touch. This still applies, but I believe you need to take it a step further and have the parent UIView use the hit-testing technique that Sergio described on each of its subviews that are siblings of the overlay, and have the parent UIView manually invoke a "do something" method on each of its subviews that pass the hit test. Each of those sibling views can return a BOOL value indicating whether to stop informing other siblings or continue the chain.
If you find yourself using this pattern a lot, consider adding a category method on UIView that encapsulates the hit testing and asking views to perform a selector.
My Original Answer
With a little bit of manual work, you can wire this together yourself. I've had to do this, and it worked for me, because I had an oft-repeated use case (an overlay view on a button), where it made sense to create some custom classes. If your situation is similar, one of these techniques will suffice.
Option 1:
If the overlay doesn't need to do anything but look pretty, have it opt out of touch handling completely with userInteractionEnabled = NO. This will make it so that the touch event goes to its parent UIView (the one it is an overlay to).
Option 2:
Have the overlay absorb the touch event (as it would by default), and then invoke a method on the parent UIView indicating that a touch or certain gesture was recognized, and here's what it is. This way, the UIView behind the overlay still gets to act on the touch recognition, even if someone else did the interception.
Option 2 is more of a fit for simple UIControlEvent types, like UIControlEventTouchDown and UIControlEventTouchUpInside. In my case (a custom UIButton subclass with a custom overlay view on top of it), I wire the button's touch-down and touch-up events to two separate methods. These fire if a touch-down or touch-up-inside event occurs on the button itself, but they are also hooks I can invoke from the overlay view if I need to simulate that a button press occurred.
Depending on your needs, you could define a known protocol between the overlay and its parent UIView, or just have the overlay test the UIView informally with a respondsToSelector: check before invoking performSelector: on it with the custom method you want called (the one that would have fired automatically if the UIView weren't covered by an overlay).
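For example, a small Swift sketch of the "known protocol" variant (the protocol name and method are made up for illustration):

import UIKit

// Hypothetical protocol the covered view adopts.
protocol OverlayTouchForwarding: AnyObject {
    func overlayDidRecognizeTap(at point: CGPoint)
}

class ForwardingOverlayView: UIView {

    // The view underneath that should still hear about touches.
    weak var coveredView: (UIView & OverlayTouchForwarding)?

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        guard let touch = touches.first, let covered = coveredView else { return }
        // The overlay absorbs the touch, then tells the covered view what happened.
        covered.overlayDidRecognizeTap(at: touch.location(in: covered))
    }
}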
