Hacking the iOS UI responder chain

I'm facing a delicate problem handling touch events. This is probably not a usual thing to do, but I think it is possible. I just don't know how...
I have a MainView containing SubView(A) and SubView(B), each with a number of sub-subviews 1, 2, 3, 4, 5, ...
MainView
    SubView(A)
        1
        2
        3
    SubView(B)
        1
        2
        3
Some of these sub-subviews (1, 2, 4) are scroll views.
I want to switch between A and B with a two-finger pan.
I have tried attaching a UIPanGestureRecognizer to MainView, but the scroll views cancel the touches and it only works sometimes.
I need a consistent way to capture the touches first, detect whether they form a two-finger pan, and only then decide whether to pass them down (or up... I'm not sure) the responder chain.
I tried creating a top-level view to handle that, but I can't get the touches to pass through that view.
I have found a lot of people with similar problems, but none of their solutions solved mine.
If anyone could shed some light on this, that would be great, as I'm getting desperate.

You can create a top-level view that captures the touches and their coordinates, then check whether each touch location lies inside one of the subviews. You can do that using
BOOL CGRectContainsPoint(CGRect rect, CGPoint point)
where rect is the view's frame and point is the touch location.
Note that frames and touch locations are relative to their superviews, so you need to convert both into the coordinate system of the app's window before comparing them.
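As a rough sketch of that idea (the class and property names below are only illustrative, not from the question): a top-level view converts each touch and each target view's bounds into window coordinates before testing containment.

@interface TouchOverlayView : UIView
@property (nonatomic, weak) UIView *subViewA; // assumed reference to SubView(A)
@end

@implementation TouchOverlayView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];

    // Convert both the touch location and the target view's bounds into the
    // window's coordinate system so they can be compared directly.
    CGPoint pointInWindow = [touch locationInView:self.window];
    CGRect frameInWindow = [self.subViewA convertRect:self.subViewA.bounds
                                               toView:self.window];

    if (CGRectContainsPoint(frameInWindow, pointInWindow)) {
        // The touch started inside SubView(A): decide here whether to keep it
        // (e.g. the start of a two-finger pan) or let it fall through.
    }
}

@end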
Or maybe this will be more helpful:
Receiving touch events on more than one UIView simultaneously

Related

Proper UIGestureRecognizer and Delegate design

This is a pretty hypothetical question just to understand proper design, but let's say I have two custom UIViews.
One of them is essentially a container that I'll call a drawer. Its purpose is to hide and show content. It's a lot like the notification center on iOS, where you swipe to pull it open and flick it back up to close it. It's a generic container that can contain any other UIView. It has a UIPanGestureRecognizer to track the finger that's pulling it open/closed. It might also have a UISwipeGestureRecognizer to detect a "flick".
The other view is a custom map widget that has UIPan/Rotation/Pinch GestureRecognizers.
I think the drawer view should be the UIGestureRecognizerDelegate for the Pan/Swipe GestureRecognizers so that it can prevent touches from being delivered unless the user is grabbing "the handle".
My first instinct is for the map to be the UIGestureRecognizerDelegate of the pan/rotation/pinch gestures so that it can allow them to all run simultaneously.
The problem I'm having is that I really don't want the map to receive any touches or begin recognizing gestures until the drawer is completely open. I'd like to enforce this behavior automatically in the drawer itself so that it works for all subviews right out of the box.
The only way I can think of to do this is to wire all of the gesture handlers to the view controller and let it do everything, but to me that breaks encapsulation: now it has to know that the map gestures need to run simultaneously, that the drawer should only get touches on its handle, and that the map should only get touches when it's open.
What are some ways of doing this where the logic can stay in the Views where I think it belongs?
I would do something like this to disable the drawer's subviews while panning: essentially, loop through the drawer's subviews and disable interaction on them.
[self.subviews enumerateObjectsUsingBlock:^(UIView *subview, NSUInteger idx, BOOL *stop) {
    subview.userInteractionEnabled = NO;
}];
And something similar again for when you want to re-enable user interaction on the subviews.
This should already Just Work™. A gesture recogniser is attached to a view; when a continuous gesture is recognised, all subsequent touches associated with that gesture are associated with that view.
So in your case, when the drawer pan is recognised, no touches associated with that pan should ever cause behaviour in your map view's pan/pinch/rotation gestures (unless you explicitly specify that they should using the appropriate delegate methods).
Or do you mean that you want to prevent the user from, halfway through opening the drawer, using another finger (i.e. another gesture) to start scrolling the (half-visible) map? If so, you should just set userInteractionEnabled on the drawer's contentView (or equivalent) to NO at UIGestureRecognizerStateBegan/Changed and YES again at UIGestureRecognizerStateEnded/Cancelled.
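For that second case, a minimal sketch of the pan handler might look like the following, assuming the drawer exposes a contentView property (a hypothetical name) and the recognizer is wired to handlePan::

- (void)handlePan:(UIPanGestureRecognizer *)pan {
    switch (pan.state) {
        case UIGestureRecognizerStateBegan:
        case UIGestureRecognizerStateChanged:
            // Block new touches on the content while the drawer is moving.
            self.contentView.userInteractionEnabled = NO;
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
        case UIGestureRecognizerStateFailed:
            // Allow touches again once the drawer has settled.
            self.contentView.userInteractionEnabled = YES;
            break;
        default:
            break;
    }
}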

How to make UIView stop receiving touch events?

I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to perform an action once the user's finger has reached it. I've got a lot of views, so the typical approach, where I'd iterate over each view to see if a touch is inside its bounds, is a no-go - there's just too much lag. Is there any other way to get touch events from one view to another (other than the first one)? I thought maybe there was some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch - if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, if the user does not lift their finger up and moves it, the views around the first view do not register the touch. Only if the user lifts his finger and then puts it back down will the surrounding views register the touch.
So my question is, lets say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2 - what do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the userInteractionEnabled property of the UIView to NO.
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
Ok, I figured out what was going on. The thing is, I have my views as subviews of a scroll view, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews - but apparently the scroll view was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems; for example, if I touched two views one after the other, both would select and highlight, but if I took the first finger off the screen, both views would deselect. This was not the desired behavior.
So what I did was subclass the scroll view and override the touch-handling methods to forward the events to its next responder, which is its superview - the view where I'm doing my touch handling. Now it works!
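A forwarding subclass along those lines might be sketched like this (the class name is illustrative, and it assumes scrolling is disabled so the scroll view isn't swallowing the touches into its own gesture handling):

@interface TouchForwardingScrollView : UIScrollView
@end

@implementation TouchForwardingScrollView

// Forward every touch event up to the superview so the view doing the custom
// touch handling sees all touches, not just the first one.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.superview touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.superview touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.superview touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [self.superview touchesCancelled:touches withEvent:event];
}

@end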

Dragging a UIView like Facebook's slide-out menu

I know this has probably been asked before, but I've seen many approaches and I don't know which is best for me, so please don't send me a link to another post unless it addresses my problem directly.
I have a controller with a UIView at the top (like a header). This header is bigger than it seems, because it is partially hidden above the top of the screen. On that view I have a UIButton that, on touch-up-inside, shows the entire header view, and tapping it again returns it to its starting position (changing the frame with an animation). I also want to be able to drag the view, but only changing its position on the y axis (dragging up and down). I was thinking of adding the drag-inside/outside events to the button, but that doesn't give me the position of the finger, and I also need to know when the user releases the drag so the view can animate to one of its two possible states (shown or partially hidden). Is this a touchesBegan / touchesMoved / touchesEnded thing? If it is, please provide a code example. I also want to do the same with another view on the left side, which moves on the x axis. Any help is appreciated. Or maybe it can be done with the drag events if I can just save a CGPoint of the last touch; maybe that's better. Any other suggestions are welcome.
Look at using a UIPanGestureRecognizer to detect the touch movements. Use the gesture's translationInView: to set the view's y position. The translation is the total movement since the start of the gesture, so you don't need to remember and accumulate the offset position yourself.
The main thing to worry about while implementing this is bounding the y position of the view, so that no matter how far the user drags, the view won't go too high or too low on the screen.
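A minimal sketch of that approach inside a view controller, assuming a headerView outlet, a headerStartY property that remembers the frame at the start of the gesture, and two resting positions kHiddenY and kOpenY (all hypothetical names and values):

static const CGFloat kHiddenY = -200.0; // partially hidden position (assumed value)
static const CGFloat kOpenY   =    0.0; // fully visible position (assumed value)

- (void)handleHeaderPan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:self.view];

    if (pan.state == UIGestureRecognizerStateBegan) {
        // Remember where the header was when the drag started.
        self.headerStartY = self.headerView.frame.origin.y;
    } else if (pan.state == UIGestureRecognizerStateChanged) {
        // Move only on the y axis and clamp so the view can't be dragged too far.
        CGFloat newY = self.headerStartY + translation.y;
        newY = MAX(kHiddenY, MIN(kOpenY, newY));

        CGRect frame = self.headerView.frame;
        frame.origin.y = newY;
        self.headerView.frame = frame;
    } else if (pan.state == UIGestureRecognizerStateEnded ||
               pan.state == UIGestureRecognizerStateCancelled) {
        // Snap to whichever resting position is closer when the finger lifts.
        CGFloat midpoint = (kHiddenY + kOpenY) / 2.0;
        CGFloat targetY = (self.headerView.frame.origin.y > midpoint) ? kOpenY : kHiddenY;
        [UIView animateWithDuration:0.25 animations:^{
            CGRect frame = self.headerView.frame;
            frame.origin.y = targetY;
            self.headerView.frame = frame;
        }];
    }
}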
Use a UIPanGestureRecognizer; it's a class dedicated to handling exactly this kind of drag/pan gesture.
Everything is described in Apple's documentation, including examples, so you should find your answer there.
There is also some sample code in the Apple Developer Library that shows how to use gesture recognizers, if needed.

A last touch event does not arrive if the view goes off screen programmatically

I have a situation where I apply an effect to a UIView when a touch begins and reverse that effect when that touch ends. So basically I am implementing the touchesBegan, touchesEnded and touchesCancelled methods of UIView.
But the problem is that when the view goes off screen, i.e. when it or one of its parents gets removed from its superview, it does not get any more touch events. Is there any way to deliver this "last" touchesEnded event to the view? Alternatively, if the UIView gets notified about becoming invisible, I could use that event for the same purpose.
OK, I am going to move the answers from the comments into the original question to make a good summary of the important points.
The reason I am tracking touch events is that I want to apply some nice effects, such as glowing, when a touch starts and remove those effects when the touch ends.
The reason I cannot simulate touchesEnded when removing those views is that I do not remove them directly; instead I remove one of their ancestor views. I cannot keep track of ancestor views all the way up to the UIWindow - I think that is technically impossible. Instead, the framework should provide this as an event, I think.
I solved my problem by overriding the -(void)willMoveToWindow:(UIWindow *)newWindow method and checking whether newWindow is nil.
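In code, that override could look roughly like this (the effect-removal method is a hypothetical stand-in for whatever reverses the touch-start effect):

- (void)willMoveToWindow:(UIWindow *)newWindow {
    [super willMoveToWindow:newWindow];

    if (newWindow == nil) {
        // The view (or one of its ancestors) is leaving the window, so no
        // further touch events will arrive. Treat this as the "last"
        // touchesEnded and reverse the touch-start effect here.
        [self removeGlowEffect]; // hypothetical method
    }
}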

UITouch & UIEvents: fighting the framework?

Imagine a view with, say, 4 subviews, next to each other but non-overlapping.
Let's call them view#1 ... view#4.
All 5 of these views are my own UIView subclasses (yes, I've read Event Handling as well as the iOS Event Guide, this SO question and this one, not answered yet).
When the user touches one of them, UIKit hit-tests it and delivers subsequent events to that view: view#1.
Even when the finger goes outside view#1, over, say, view#3.
Even though this "drag" is now over view#3, view#1 still receives touchesMoved, but view#3 receives nothing.
I want view#3 to start responding to the touches, maybe with a "touchEntered" of my own, together with possibly a "touchExited" on view#1.
How would I go about this?
I can see two approaches:
1. side-step the problem and do all the touch handling in the parent view whenever I detect a touchesMoved outside of view#1's bounds, or
2. transfer control to the parent view, telling it to "redispatch". It is not very clear how such redispatching would work, though.
For solution #2, what confuses me is not the forwarding per se, but how to find the UIView I want to forward to. I can obviously loop through the parent's subviews until I find one whose bounds/frame contains the touch, but I am wondering if I am missing something that Apple has already provided and that I cannot relate to this problem.
Any idea?
I have done this, but I used CALayers instead of sub-UIViews. That way, there are no worries about the subviews catching/redispatching events to the parent UIView. You might not be able to do that, but it does simplify things. My solution tended to use CGRectContainsPoint() a lot.
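Sketched roughly, that layer-based variant has the parent view own plain CALayers and test each touch against their frames (the reaction inside the loop is left open):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];

    // Sublayer frames are expressed in the parent layer's (i.e. this view's)
    // coordinate system, so they can be tested against the touch directly.
    for (CALayer *layer in self.layer.sublayers) {
        if (CGRectContainsPoint(layer.frame, point)) {
            // The finger is currently over this layer; react accordingly.
        }
    }
}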
You may want to read Event Handling again, as it comes pretty close to answering your question:
A touch object...is associated with its hit-test view for its lifetime, even if the touch represented by the object subsequently moves outside the view.
Given that, if you want to accomplish your goal of having different views react to the user's finger crossing over them, and if you want to do it within the touch-handling mechanism provided by UIView, you should go with your first approach: have the parent view handle the touch. The parent can use -hitTest:withEvent: or -pointInside:withEvent: as it's tracking a touch to determine if the touch is in one of the subviews, and if so can send an appropriate message.
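A sketch of that first approach (TrackableView, touchEntered/touchExited, and the trackedSubview property are all hypothetical names mirroring the question's wording):

// Sketch only: a hypothetical subview class exposing enter/exit hooks.
@interface TrackableView : UIView
- (void)touchEntered;
- (void)touchExited;
@end

// In the parent view, which owns the touch for its whole lifetime:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];

    // Find the subview currently under the finger, if any.
    TrackableView *current = nil;
    for (TrackableView *subview in self.subviews) {
        CGPoint local = [self convertPoint:point toView:subview];
        if ([subview pointInside:local withEvent:event]) {
            current = subview;
            break;
        }
    }

    // Notify on transitions; messaging nil is a harmless no-op.
    if (current != self.trackedSubview) { // trackedSubview is an assumed property
        [self.trackedSubview touchExited];
        [current touchEntered];
        self.trackedSubview = current;
    }
}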
