What is first responder? - iOS

According to Apple's documentation:
When your app receives an event, UIKit automatically directs that
event to the most appropriate responder object, known as the first
responder.
The same documentation explains how the first responder is determined:
The hitTest:withEvent: method of UIView traverses the view hierarchy,
looking for the deepest subview that contains the specified touch,
which becomes the first responder for the touch event.
What I don't understand is why there is a property of UIResponder called isFirstResponder, and why becomeFirstResponder exists. Shouldn't the first responder be determined dynamically by UIKit based on the location of the specific touch event?
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
The only way I can resolve this confusion is if all these methods are relevant only to events such as shake, remote-control, and editing-menu events. But the documentation is not clear about it.

What I don't understand is why there is a property of UIResponder called firstResponder?
There isn't. UIResponder does not have a public property named firstResponder.
And why becomeFirstResponder exists.
The main use of becomeFirstResponder is to programmatically choose which text field gets keyboard events.
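For example, a minimal sketch of that use (nameField is a hypothetical outlet, assumed to be wired up elsewhere):

import UIKit

class LoginViewController: UIViewController {
    @IBOutlet var nameField: UITextField!   // hypothetical text field

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Give the field keyboard focus without the user tapping it;
        // this also brings up the keyboard.
        nameField.becomeFirstResponder()
    }
}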
Shouldn't the first responder be determined dynamically by UIKit based on the location of the specific touch event?
There are more kinds of events than touch events. For example, there are keyboard events and motion events. The first responder tracked by UIKit is for non-touch events. In other systems, this concept is usually called the “focus” or more specifically the “keyboard focus”. But (in iOS) the first responder can be a view that doesn't respond to keyboard events.
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
That's ok, because touch events don't really start at the first responder. They start at the view returned by -[UIView hitTest:withEvent:].
The only way I can resolve this confusion is if all these methods are relevant only to events such as shake, remote-control, and editing-menu events. But the documentation is not clear about it.
There are more kinds of non-touch events that start with the first responder, but aside from that, you have resolved it correctly.
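Since shake events are one of those non-touch cases, here is a minimal sketch of opting in to them. UIKit delivers motion events to the first responder, so a view controller must both return true from canBecomeFirstResponder and actually become first responder:

import UIKit

class ShakeViewController: UIViewController {
    // Opt in: UIResponder's default is false.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()   // shake events start at the first responder
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        if motion == .motionShake {
            print("shake detected")
        }
    }
}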

This is not a "quick answer" topic -- your best bet is to do some searching and read through several articles about it.
But, briefly...
.becomeFirstResponder() is often used to activate text fields without requiring the user to tap in the field. A common case is multiple text fields (a fill-out-the-form type of interface), where you would automatically "jump" to the next field based on input:
myTextField.becomeFirstResponder()
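A sketch of that "jump to the next field" pattern, assuming the view controller is the UITextFieldDelegate of two hypothetical fields:

import UIKit

class FormViewController: UIViewController, UITextFieldDelegate {
    @IBOutlet var firstNameField: UITextField!   // hypothetical outlets
    @IBOutlet var lastNameField: UITextField!

    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        if textField == firstNameField {
            lastNameField.becomeFirstResponder()   // jump to the next field
        } else {
            textField.resignFirstResponder()       // last field: dismiss the keyboard
        }
        return true
    }
}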
Again, as you've already seen from glancing at the docs, there is much more to it than that... but far too much for an answer here.

Related

What's the relation between first responder and hitTest methods?

I understand how the system finds the view that handles touch events by calling the following methods on a view and its subviews:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
But I don't understand the role of first responder in this mechanism.
Does firstResponder represent the starting point of the hitTest traversal?
I would recommend a complete reading of the first article, Using Responders and the Responder Chain to Handle Events, in Apple's documentation section Touches, Presses, and Gestures.
Short answer:
Touch events are delivered directly to the first responder.
When your app receives an event, UIKit automatically directs that event to the most appropriate responder object, known as the first responder.
First responder is determined by hit-testing.
UIKit uses view-based hit-testing to determine where touch events occur. Specifically, UIKit compares the touch location to the bounds of view objects in the view hierarchy. The hitTest(_:with:) method of UIView traverses the view hierarchy, looking for the deepest subview that contains the specified touch, which becomes the first responder for the touch event.
If the first responder does not handle the event, the event is then passed from responder to responder in the active responder chain.
There's not a lot of relationship between them, except that the result of hit test might cause the window to make the hit view become firstResponder.
firstResponder is all about keyboard events and, at least on macOS, menu item actions and commands like cut, copy, paste, undo etc...
When a keyboard event is received by the app from the Window Server, it goes to the firstResponder. If it's not interested in it, then it goes up the chain to nextResponder until it exhausts the responder chain. On macOS there are related but separate concepts of the mainWindow and keyWindow. They are usually the same, but can be different. If they are different the responder chain first starts with the keyWindow, and when that chain is exhausted, it then goes to the mainWindow. Then the application gets a crack at it. Then the application's delegate. Then if it's a document based app, the document and then the document's delegate.
On iOS, I'm a little fuzzy on the exact details, but it's similar. Actually I think it's simpler, because you don't have multiple windows.
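On iOS you can see the chain-walking in miniature with UIResponder's next property. For example, this helper (my own sketch, not a UIKit API) finds the first responder in the chain willing to perform an editing action such as copy::

import UIKit

// Walk the responder chain from a starting responder, looking for the
// first object that can perform a given action (roughly what UIKit does
// when it resolves targets for commands like cut/copy/paste).
func firstTarget(for action: Selector, startingAt responder: UIResponder?) -> UIResponder? {
    var current = responder
    while let r = current {
        if r.canPerformAction(action, withSender: nil) {
            return r
        }
        current = r.next   // next is nextResponder's Swift name
    }
    return nil   // chain exhausted; nothing handles the action
}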
Hit testing on the other hand is all about the view hierarchy. So the app finds which window (on macOS) the hit occurs in, then from there it proceeds down to its immediate subviews, and then down their subviews, etc... until it finds a leaf view that is hit.
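That traversal behaves roughly like this simplified reimplementation (a sketch only; the real hitTest(_:with:) also accounts for transforms and other details):

import UIKit

extension UIView {
    // Simplified model of hitTest(_:with:): depth-first, front-to-back,
    // returning the deepest subview that contains the point.
    func simplifiedHitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        guard isUserInteractionEnabled, !isHidden, alpha >= 0.01,
              self.point(inside: point, with: event) else {
            return nil   // this branch of the hierarchy can't be hit
        }
        for subview in subviews.reversed() {   // last subview is frontmost
            let converted = subview.convert(point, from: self)
            if let hit = subview.simplifiedHitTest(converted, with: event) {
                return hit
            }
        }
        return self   // no subview claimed the point, so this view is the hit view
    }
}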

UIKit: Initial first responder?

When an app launches, what is the initial first responder?
According to the UIKit docs the first responder can be set with the becomeFirstResponder message. However, if this message isn't sent, what is the initial first responder? The UIApplication? The key window?
Also, is there a property anywhere which points to the current first responder?
In both macOS and iOS, each window has its own responder (or, to be more precise, each window IS a responder -- UIWindow descends from UIResponder), which means that each window can have its own first responder. On macOS, there can be many open windows (each one with a first responder), and under iOS, there is usually one UIWindow displayed at any one time.
Each window will have a first responder (whether the window itself, or a text field which is receiving keyboard events, or whatever). You can query each window's responder chain by walking it via the nextResponder API.
I'm probably simplifying things a little too much, but for the sake of a nice, simple summarized answer I hope this helps. Here is more information about the iOS responder chain, which shows how an initial view (e.g. the first responder) gets an event and, if it can't handle it, the event gets passed up to parent views, to the window, and to the application.
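As for the second question: there is no public property that points at the current first responder, but a common sketch is to search a window's view hierarchy for the view that reports isFirstResponder:

import UIKit

// Recursively search a view hierarchy for the current first responder.
// Returns nil if no view in this hierarchy currently has that status.
func findFirstResponder(in view: UIView) -> UIView? {
    if view.isFirstResponder {
        return view
    }
    for subview in view.subviews {
        if let responder = findFirstResponder(in: subview) {
            return responder
        }
    }
    return nil
}

Calling findFirstResponder(in: window) would return, say, the text field that currently owns the keyboard, or nil if nothing is first responder in that window.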

Presentation overlay for iOS

When I demo my touch apps to remote teams, the people on the other end don't know where I am touching. To remedy this, I have been working on an event-intercepting view/window that can display touches over applications. No matter how many variations on nextResponder I call, I am unable to react to the touch and pass it along to the controllers underneath. Specifically, scroll views don't react, nor do buttons.
Is there a way to take an event, get its position, then pass it along to whatever component would have been responding to it initially (the controller underneath)?
Update:
I am making some progress with a UIView. The new view is always returning NO to pointInside. This works great for when the touch starts, but it doesn't track moves or releases. Is there a strategy for adding gesture recognizers to the touch in order to track its event lifecycle?
Joe
You could try creating your own subclass of UIApplication that overrides sendEvent:. Your implementation should call [super sendEvent:event] as well as process the event as needed.
Update your main.m and pass the name of your custom UIApplication class as the 3rd parameter to the call of UIApplicationMain.
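A minimal sketch of that approach (in a Swift project you would pass the class name to UIApplicationMain from main.swift rather than main.m):

import UIKit

class TouchTrackingApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        // Inspect touches before they are delivered normally.
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches {
                let location = touch.location(in: touch.window)
                print("touch phase \(touch.phase.rawValue) at \(location)")
            }
        }
        // Always forward the event, or the app stops receiving input.
        super.sendEvent(event)
    }
}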
After some more due diligence, I found my oversight. In the layer that was on top and displaying the touches, user interaction needed to be set to false. Once I set that to false, I was able to use that layer for display while catching events on the layers below. The project still isn't done but I am one step closer.
Take care,
Joe

UITouch & UIEvents: fighting the framework?

Imagine a view with, say, 4 subviews, next to each other but non overlapping.
Let's call them view#1 ... view#4
All 5 such views are my own UIView subclasses (yes, I've read: Event Handling as well as iOS Event Guide and this SO question and this one, not answered yet)
When the user touches one of them, UIKit hit-tests it and delivers subsequent events to that view: view#1.
Even when the finger goes outside view#1, over say view#3.
Even if this "drag" is now over view#3, view#1 still receives touchesMoved, but view#3 receives nothing.
I want view#3 to start replying to the touches. Maybe with a "touchesEntered" of my own, together with possibly a "touchesExited" on view#1.
How would I go about this?
I can see two approaches:
1. side-step the problem and do all the touch handling in the parent view whenever I detect a touchesMoved outside of view#1's bounds, or
2. transfer to the parent view, telling it to "redispatch". It is not very clear how such redispatching would work, though.
For solution #2, where I am getting confused is not the forwarding per se, but how to find the UIView I want to forward to. I can obviously loop through the parent's subviews until I find one whose bounds/frame contains the touch, but I am wondering if I am missing something that Apple has already provided but that I cannot relate to this problem.
Any idea?
I have done this, but I used CALayers instead of sub-UIViews. That way, there are no worries about the subviews catching/redispatching events to the parent UIView. You might not be able to do that, but it does simplify things. My solution tended to use CGRectContainsPoint() a lot.
You may want to read Event Handling again, as it comes pretty close to answering your question:
A touch object...is associated with its hit-test view for its
lifetime, even if the touch represented by the object subsequently
moves outside the view.
Given that, if you want to accomplish your goal of having different views react to the user's finger crossing over them, and if you want to do it within the touch-handling mechanism provided by UIView, you should go with your first approach: have the parent view handle the touch. The parent can use -hitTest:withEvent: or -pointInside:withEvent: as it's tracking a touch to determine if the touch is in one of the subviews, and if so can send an appropriate message.
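A sketch of that parent-view approach (touchEntered/touchExited are hypothetical methods of my own, not UIKit API):

import UIKit

class TrackableView: UIView {
    func touchEntered() { /* e.g. highlight */ }      // hypothetical
    func touchExited()  { /* e.g. un-highlight */ }   // hypothetical
}

class TrackingParentView: UIView {
    private var currentTarget: TrackableView?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        // Which subview is the finger over right now?
        let target = subviews
            .compactMap { $0 as? TrackableView }
            .first { $0.frame.contains(point) }
        if target !== currentTarget {
            currentTarget?.touchExited()
            target?.touchEntered()
            currentTarget = target
        }
    }
}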

Passing touch events to appropriate sibling UIViews

I'm trying to handle touch events with touchesBegan in an overlay to a parent UIView, but also allow the touch input to pass through to sibling UIViews underneath. I expected there would be some straightforward way to process a touch event and then say "now send it to the next responder as if this one didn't exist", but all I can find is the nextResponder method, which appears to give back the parent of my overlay view. That parent then doesn't really pass it on to the next sibling of the overlay view, so I'm stuck, uncertain how to do what seems like a simple task: one that is usually accomplished with a touch callback whose True or False return value tells the framework whether to keep processing down the widget hierarchy.
Am I missing something obvious?
Late answer, but I think you would be better off overriding hitTest:withEvent: instead of touchesBegan. It seems to me that touchesBegan is a pretty "high-level" method that is there to do just one simple thing, so you cannot alter, at that level, whether the event is propagated further. The right place to do that is hitTest:withEvent:.
Also have a look at this S.O. answer for more details about this point.
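For example, an overlay that wants to observe touches while letting them fall through to the sibling views underneath might override hitTest along these lines (a sketch; adapt as needed):

import UIKit

class PassThroughOverlay: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let hit = super.hitTest(point, with: event)
        // Record or display the touch location here if needed, then
        // return nil when the overlay itself was hit so UIKit keeps
        // searching the views behind it.
        return hit === self ? nil : hit
    }
}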
I understand the desired behavior you're looking for, Joey - I haven't found something in the API that supports this automatic messaging-up-the-chain behavior with sibling views.
What I originally wrote below was with respect to just informing a parent UIView about a touch. This still applies, but I believe you need to take it a step further and have the parent UIView use the hit-testing technique that Sergio described on each of its subviews that are siblings of the overlay, and have the parent UIView manually invoke a "do something" method on each of its subviews that pass the hit test. Each of those sibling views can return a BOOL value on whether to abort informing other siblings or continue the chain.
If you find yourself using this pattern a lot, consider adding a category method on UIView that encapsulates the hit testing and asking views to perform a selector.
My Original Answer
With a little bit of manual work, you can wire this together yourself. I've had to do this, and it worked for me, because I had an oft-repeated use case (an overlay view on a button), where it made sense to create some custom classes. If your situation is similar, one of these techniques will suffice.
Option 1:
If the overlay doesn't need to do anything but look pretty, have it opt out of touch handling completely with userInteractionEnabled = NO. This will make it so that the touch event goes to its parent UIView (the one it is an overlay to).
Option 2:
Have the overlay absorb the touch event (as it would by default), and then invoke a method on the parent UIView indicating that a touch or certain gesture was recognized, and here's what it is. This way, the UIView behind the overlay still gets to act on the touch recognition, even if someone else did the interception.
With Option 2, it's more a fit for simple UIControlEvent types, like UIControlEventTouchDown and UIControlEventTouchUpInside. In my case (a custom UIButton subclass with a custom overlay view on top of it), I'll wire touch down and touch up events on the button to two separate methods. These fire if a touch down or touch up inside event occurs on the button itself. But, they are also hooks I can invoke from the overlay view if I need to simulate that a button press occurred.
Depending on your needs, you could have a known protocol between the overlay and its parent UIView, or just have the overlay test the UIView informally, with a respondsToSelector: check before invoking performSelector: on it with the custom method you want called, i.e. the one that would have fired automatically if the UIView weren't covered by an overlay.
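In Swift terms, that informal check might look like this (overlayTapped: is a hypothetical selector the parent may or may not implement):

import UIKit

class OverlayView: UIView {
    // Hypothetical action the parent view can optionally implement.
    private let parentAction = Selector(("overlayTapped:"))

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        // Informal protocol: forward only if the superview handles it.
        if let parent = superview, parent.responds(to: parentAction) {
            parent.perform(parentAction, with: self)
        }
    }
}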
