How to respond to a touch event that begins outside a view and is dragged into it in iOS

I want to respond to this kind of touch event on a view: a touch that begins outside the view and is then dragged into it. I have tried the UIControlEvent options such as UIControlEventTouchDragEnter and UIControlEventTouchDragInside, as well as gesture recognizers, and found no way to do this directly.
Finally, I implemented it my own way: I created a custom subclass of UIView, overrode - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event, and forwarded touch events up the responder chain. In the touchesBegan, touchesMoved, and touchesEnded methods of the parent view, I use the location of the UITouch object to detect a touch that went down outside the view and was then dragged into it.
I am not satisfied with this approach. Can anyone suggest a more efficient and elegant way to accomplish this? Thank you very much for your time and consideration.

a touch that begins outside the view and is then dragged into it
You can never do this more "elegantly" because if your initial touch down is outside the view, then it is not associated with this view and never will be. Whatever view the touch's initial hit test associates it with, that is the view that will always be this touch's view throughout the gesture, and touch events will be sent only to that view. The default definition of hit testing is that the view that the initial touch is inside is the hit-test view, and that, by hypothesis, is not your view.
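Given that constraint, tracking the touch in a common superview, as the asker already does, is essentially the standard workaround. A minimal sketch of the superview's touchesMoved: handler, assuming a hypothetical targetView property referencing the view of interest:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];
    // targetView is a hypothetical property; its frame is expressed in
    // this superview's coordinate system, matching the points above.
    BOOL wasInside = CGRectContainsPoint(self.targetView.frame, previous);
    BOOL isInside = CGRectContainsPoint(self.targetView.frame, current);
    if (!wasInside && isInside) {
        // The touch went down elsewhere and has just dragged into targetView.
        NSLog(@"drag-enter detected");
    }
}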

Related

What's the relation between first responder and hitTest methods?

I understand how the system finds the view that handles touch events by calling the following methods on a view and its subviews:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
But I don't understand the role of the first responder in this mechanism.
Does firstResponder represent the starting point of the hitTest traversal?
I would recommend a complete reading of the first article, Using Responders and the Responder Chain to Handle Events, in Apple's documentation collection Touches, Presses, and Gestures.
Short answer:
Touch events are delivered directly to the first responder.
When your app receives an event, UIKit automatically directs that event to the most appropriate responder object, known as the first responder.
First responder is determined by hit-testing.
UIKit uses view-based hit-testing to determine where touch events occur. Specifically, UIKit compares the touch location to the bounds of view objects in the view hierarchy. The hitTest(_:with:) method of UIView traverses the view hierarchy, looking for the deepest subview that contains the specified touch, which becomes the first responder for the touch event.
If the first responder does not handle the event, the event is then passed from responder to responder in the active responder chain.
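To make the traversal concrete, here is a rough sketch of what the default hitTest(_:with:) is commonly described as doing; this is an approximation for illustration, not Apple's actual implementation:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Hidden, non-interactive, or nearly transparent views are skipped.
    if (!self.userInteractionEnabled || self.hidden || self.alpha < 0.01) {
        return nil;
    }
    // A view that doesn't contain the point can't be (or contain) the hit view.
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Check subviews front to back (the last subview is frontmost).
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit != nil) {
            return hit;
        }
    }
    // No subview claimed the point, so this view is the deepest hit.
    return self;
}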
There's not a lot of relationship between them, except that the result of a hit test might cause the window to make the hit view the firstResponder.
firstResponder is all about keyboard events and, at least on macOS, menu item actions and commands like Cut, Copy, Paste, Undo, and so on.
When a keyboard event is received by the app from the Window Server, it goes to the firstResponder. If it's not interested in it, then it goes up the chain to nextResponder until it exhausts the responder chain. On macOS there are related but separate concepts of the mainWindow and keyWindow. They are usually the same, but can be different. If they are different the responder chain first starts with the keyWindow, and when that chain is exhausted, it then goes to the mainWindow. Then the application gets a crack at it. Then the application's delegate. Then if it's a document based app, the document and then the document's delegate.
On iOS, I'm a little fuzzy on the exact details, but it's similar. Actually I think it's simpler, because you don't have multiple windows.
Hit testing, on the other hand, is all about the view hierarchy. The app finds which window (on macOS) the hit occurs in, then proceeds down to its immediate subviews, and then down their subviews, and so on, until it finds the leaf view that was hit.

What gets called instead of - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event when Accessibility (VoiceOver) is on

Recently I put a breakpoint in a UIView's method
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
}
and checked whether execution stops there when a user taps on the UIView while VoiceOver is on, but the breakpoint was never hit. Does anyone know what gets called instead, and how the touch can be intercepted?
The standard hitTest mechanism is not used when VoiceOver is on. Instead, UIView has an _accessibilityHitTest:withEvent: method, but unlike macOS, it is private and can't easily be overridden or called.
Similar to hitTest, _accessibilityHitTest uses _accessibilityPointInside:withEvent:, which, in turn, calls pointInside:withEvent: (which is public).
First of all, note that users must double-tap to "activate" or "tap" a view when VoiceOver is enabled. If you still aren't hitting hitTest:…, then break on accessibilityActivate(). This is the default accessibility action triggered by a double-tap. You may also be interested in the activationPoint, which is the default location of the simulated touch VoiceOver emits upon activation. Note that the activation point isn't relevant to all VoiceOver interactions (e.g. adjustable controls).
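Since pointInside:withEvent: is public and, per the above, sits on both the normal and the VoiceOver hit-test paths, overriding it is one supported interception point. An illustrative sketch that enlarges a view's hit area:

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Grow the tappable region by 20 points on each side; both the standard
    // hit test and the accessibility hit test consult this method.
    CGRect expanded = CGRectInset(self.bounds, -20.0, -20.0);
    return CGRectContainsPoint(expanded, point);
}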
The hit-test view is given the first opportunity to handle a touch event. If the hit-test view cannot handle an event, the event travels up that view’s chain of responders as described in “The Responder Chain Is Made Up of Responder Objects” until the system finds an object that can handle it. Please look at this.

How to make UIView stop receiving touch events?

I'm working on an app where the user is expected to rapidly touch and swipe across multiple UIViews, each of which is supposed to perform an action once the user's finger reaches it. I've got a lot of views, so the typical approach, iterating over each view to see whether a touch is inside its bounds, is a no-go: there's just too much lag. Is there any other way to deliver touch events from one view to another (that is, to views besides the first one)? I thought maybe there was some way to cancel the touch event, but I've searched and so far have come up empty.
One of the big problems I have is that if I implement my touch handling in my view controller, touchesBegan only fires for the first touch: if the user touches something and then, without moving the first finger, taps on something else, that tap is not recorded in either touchesBegan or touchesMoved. But if I implement my touch handling in the UIViews themselves, once a view registers a touch, and the user moves the finger without lifting it, the views around the first view do not register the touch. Only if the user lifts the finger and puts it back down will the surrounding views register it.
So my question is: let's say I have two views side by side, my touch handling code is implemented in the views, and I put my finger down on view 1. I then slide my finger over to view 2. What do I need to do to make view 2 register that touch, which started in view 1 and never "ended"?
Set the view's userInteractionEnabled property to NO:
view.userInteractionEnabled = NO;
UIView has the following property:
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled;
OK, I figured out what was going on. The thing is, my views are subviews of a scroll view, which is itself a subview of my main view. With scrollEnabled = NO, I could touch my subviews, but apparently the scroll view was only forwarding me the initial touch event, and all subsequent touches were treated as part of that initial event. Because of that, I had many weird problems: for example, touching two views one after the other would select and highlight both, but if I took the first finger off the screen, both views would deselect. This was not the desired behavior.
So what I did is subclass the scroll view and override the touch handling methods to forward the events to its next responder, which is its superview: the view where I'm doing my touch handling. Now it works!
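For reference, a minimal sketch of such a subclass (ForwardingScrollView is a name invented here; the asker's actual code isn't shown). Each override hands the touches to nextResponder, which for a view is its superview:

#import <UIKit/UIKit.h>

@interface ForwardingScrollView : UIScrollView
@end

@implementation ForwardingScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesCancelled:touches withEvent:event];
}

@end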

Custom UIGestureRecognizer Not Working As Expected

I have a UITableView which I present in a UIPopoverController. The table view presents a list of elements that can be dragged and dropped onto the main view.
When the user begins a pan gesture that is principally vertical at the outset, I want the UITableView to scroll as usual. When it's not principally vertical at the outset, I want the application to interpret this as a drag-and-drop action.
My unfortunately lengthy journey down this path has compelled me to create a custom UIGestureRecognizer. In an attempt to get the basics right, I left this custom recognizer as an empty implementation at first, one that merely calls the super version of each of the five methods Apple says should be overridden:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)reset;
This results in nothing happening, i.e. the custom gesture's action method is never called, and the table view scrolls as usual.
For my next experiment, I set the gesture's state to UIGestureRecognizerStateBegan in the touchesBegan method.
This caused the gesture's action method to fire, making the gesture appear to behave just like the standard UIPanGestureRecognizer. This obviously suggested I was responsible for managing the gesture's state.
Next up, I set the gesture's state to UIGestureRecognizerStateChanged in the touchesMoved method. Everything still fine.
Now, instead, I tried setting the gesture's state to UIGestureRecognizerStateFailed in the touchesMoved method. I was expecting this to terminate the gesture and restore the flow of events to the table view, but it didn't. All it did was stop firing the gesture's action method.
Lastly, I set the gesture's state to UIGestureRecognizerStateFailed in the touchesBegan method, immediately after I had set it to UIGestureRecognizerStateBegan.
This causes the gesture to fire its action method exactly once, then pass all subsequent events to the table view.
So... sorry for such a long question... but why, if I cause the gesture to fail in the touchesBegan method (after first setting the state to UIGestureRecognizerStateBegan), does it redirect events to the table view as expected, yet when I try the same technique in touchesMoved (the only place I can detect that a move is principally vertical), the redirection doesn't occur?
Sorry for making this more complicated than it actually was. After much reading and testing, I've finally figured out how to do this.
First, creating the custom UIGestureRecognizer was one of the proper solutions to this issue, but when I made my first test of the empty custom recognizer, I made a rookie mistake: I forgot to call [super touches...:touches withEvent:event] for each of the methods I overrode. This caused nothing to happen, so I set the state of the recognizer to UIGestureRecognizerStateBegan in touchesBegan, which did result in the action method being called once, thus convincing me I had to explicitly manage states, which is only partially true.
In truth, if you create an empty custom recognizer and call the appropriate super method in each method your override, your program will behave as expected. In this case, the action method will get called throughout the dragging motion. If, in touchesMoved, you set the recognizer's state to UIGestureRecognizerStateFailed, the events will bubble up to the super view (in this case a UITableView), also as expected.
The mistake I made and I think others might make is thinking there is a direct correlation between setting the gesture's state and the chronology of the standard methods when you subclass a gesture recognizer (i.e. touchesBegan, touchesMoved, etc.). There isn't - at least, it's not an exact mapping. You're better off to let the base behavior work as is, and only intervene where necessary. So, in my case, once I determined the user's drag was principally vertical, which I could only do in touchesMoved, I set the gesture recognizer's state to UIGestureRecognizerStateFailed in that method. This took the recognizer out of the picture and automatically forwarded a full set of events to the encompassing view.
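As a concrete illustration of that approach, here is a hedged sketch of such a recognizer (HorizontalDragRecognizer and the 8-point threshold are inventions here, not the asker's code). Note that assigning to state requires importing UIGestureRecognizerSubclass.h:

#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface HorizontalDragRecognizer : UIGestureRecognizer
@property (nonatomic) CGPoint startPoint;
@end

@implementation HorizontalDragRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.startPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint current = [[touches anyObject] locationInView:self.view];
    CGFloat dx = fabs(current.x - self.startPoint.x);
    CGFloat dy = fabs(current.y - self.startPoint.y);
    if (self.state == UIGestureRecognizerStatePossible) {
        // Wait for enough movement to judge the drag's direction.
        if (dx < 8.0 && dy < 8.0) {
            return;
        }
        if (dy > dx) {
            // Principally vertical: fail, so the table view scrolls as usual.
            self.state = UIGestureRecognizerStateFailed;
        } else {
            self.state = UIGestureRecognizerStateBegan;
        }
    } else if (self.state == UIGestureRecognizerStateBegan ||
               self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateChanged;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateBegan ||
        self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateEnded;
    } else {
        self.state = UIGestureRecognizerStateFailed;
    }
}

@end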
For the sake of brevity, I've left out a ton of other stuff I learned through this exercise, but would like to point out that, of the six or seven books I consulted on the subject, Matt Neuburg's Programming iOS 4 provided the best explanation of this subject by far. I hope that referral is allowed on this site. I am in no way affiliated with the author or publisher, just grateful for an excellent explanation!
That probably happens because responders expect to see an entire touch from beginning to end, not just part of one. Often, -touchesBegan:... sets up some state that's then modified in -touchesMoved..., and it really wouldn't make sense for a view to get a -touchesMoved... without having previously received -touchesBegan.... There's even a note in the documentation that says, in part:
All views that process touches, including your own, expect (or should expect) to receive a full touch-event stream. If you prevent a UIKit responder object from receiving touches for a certain phase of an event, the resulting behavior may be undefined and probably undesirable.

iPhone: Multiple touches on a UIButton are not recognized

I have a UIButton (a subclass of one, actually) that interacts with the user via the touchesBegan: and touchesMoved: methods.
What I would like is for the user to be able to press down the button, drag their finger away, and have a second finger touch the button (all while the first finger has never left the screen).
Problem is, the second touch event never triggers touchesBegan: unless the first finger has been released.
Is there some way to override this, or am I trying to do the impossible?
Have you tried setting multipleTouchEnabled to YES?
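Assuming that property is what's needed here (button standing in for the asker's UIButton subclass instance), it's a one-liner:

button.multipleTouchEnabled = YES;

By default it is NO, in which case a view receives only the first touch in a multi-touch sequence and ignores additional fingers.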
If the interactions use touchesBegan: and touchesMoved:, then use a UIView instead of a UIButton. A button is a UIControl, and the way to interact with a UIControl is
- (void)addTarget:(id)target action:(SEL)action forControlEvents:(UIControlEvents)controlEvents.
I'm not sure these two ways of receiving events mix well.
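For example, the drag-related control events from the question at the top map onto this pattern; the handler name here is illustrative:

// During setup, e.g. in viewDidLoad:
[button addTarget:self
           action:@selector(dragEntered:)
 forControlEvents:UIControlEventTouchDragEnter];

// Fires when a touch the button is already tracking drags
// from outside its bounds back inside.
- (void)dragEntered:(UIButton *)sender {
    NSLog(@"touch dragged into the button");
}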
