When an app launches, what is the initial first responder?
According to the UIKit docs the first responder can be set with the becomeFirstResponder message. However, if this message isn't sent, what is the initial first responder? The UIApplication? The key window?
Also, is there a property anywhere which points to the current first responder?
In both macOS and iOS, each window is a UIResponder (UIWindow descends from UIResponder), which means that each window can have its own first responder. On macOS there can be many open windows (each with its own first responder); on iOS, there is usually only one UIWindow displayed at any one time.
Each window will have a first responder (whether the window itself, a text field that is receiving keyboard events, or whatever). You can query a window's responder chain by walking it via the nextResponder property.
I'm probably simplifying things a little too much, but for the sake of a nice, simple summarized answer I hope this helps. There is more information about the iOS responder chain in Apple's documentation, which shows how an initial view (e.g. the first responder) receives an event and, if it can't handle it, passes it up to parent views, to the window, and to the application.
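The nextResponder walk described above can be sketched without UIKit. Here `Responder` is a toy stand-in for UIResponder, with its `next` property playing the role of the real `next`/`nextResponder`; the names are illustrative, not official API:

```swift
// Toy model of walking a responder chain (UIResponder.next in UIKit),
// kept UIKit-free so it can run anywhere.
class Responder {
    let name: String
    let next: Responder?   // nil at the top of the chain
    init(name: String, next: Responder? = nil) {
        self.name = name
        self.next = next
    }
}

// Collect the names of every responder from `r` up to the chain's end.
func chain(from r: Responder) -> [String] {
    var names: [String] = []
    var current: Responder? = r
    while let c = current {
        names.append(c.name)
        current = c.next
    }
    return names
}
```

In real UIKit code you would start from any UIResponder (a view, the window) and follow `next` the same way.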
According to Apple's documentation:
When your app receives an event, UIKit automatically directs that event to the most appropriate responder object, known as the first responder.
The same documentation explains how the first responder is determined:
The hitTest:withEvent: method of UIView traverses the view hierarchy, looking for the deepest subview that contains the specified touch, which becomes the first responder for the touch event.
What I don't understand is why there is a property of UIResponder called isFirstResponder, and why becomeFirstResponder exists. Should not the first responder be determined dynamically by UIKit based on the location of the specific touch event?
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
The only way I can think of to resolve this confusion is if all these methods are relevant only to events such as shake, remote-control, and editing-menu events. But the documentation is not clear about it.
What I don't understand is why there is a property of UIResponder called firstResponder?
There isn't. UIResponder does not have a public property named firstResponder.
And why becomeFirstResponder exists.
The main use of becomeFirstResponder is to programmatically choose which text field gets keyboard events.
Should not the first responder be determined dynamically by UIKit based on the location of the specific touch event?
There are more kinds of events than touch events. For example, there are keyboard events and motion events. The first responder tracked by UIKit is for non-touch events. In other systems, this concept is usually called the “focus” or more specifically the “keyboard focus”. But (in iOS) the first responder can be a view that doesn't respond to keyboard events.
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
That's ok, because touch events don't really start at the first responder. They start at the view returned by -[UIView hitTest:withEvent:].
The only way I can think of to resolve this confusion is if all these methods are relevant only to events such as shake, remote-control, and editing-menu events. But the documentation is not clear about it.
There are more kinds of non-touch events that start with the first responder, but aside from that, you have resolved it correctly.
This is not a "quick answer" topic -- your best bet is to do some searching and read through several articles about it.
But, briefly...
becomeFirstResponder() is often used to activate text fields without requiring the user to tap in the field. A common case is multiple text fields (a fill-out-the-form type of interface), where you would automatically "jump" to the next field based on input:
myTextField.becomeFirstResponder()
Again, as you've already seen from glancing at the docs, there is much more to it than that... but far too much for an answer here.
I understand how the system finds the view that handles touch events by calling the following methods on a view and its subviews:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
But I don't understand the role of first responder in this mechanism.
Does firstResponder represent the start point of the hitTest traversal?
I would recommend a complete reading of the first article, Using Responders and the Responder Chain to Handle Events, in Apple's documentation on Touches, Presses, and Gestures.
Short answer:
Touch events are delivered directly to the first responder.
When your app receives an event, UIKit automatically directs that event to the most appropriate responder object, known as the first responder.
First responder is determined by hit-testing.
UIKit uses view-based hit-testing to determine where touch events occur. Specifically, UIKit compares the touch location to the bounds of view objects in the view hierarchy. The hitTest(_:with:) method of UIView traverses the view hierarchy, looking for the deepest subview that contains the specified touch, which becomes the first responder for the touch event.
If the first responder does not handle the event, the event is then passed from responder to responder in the active responder chain.
There's not a lot of relationship between them, except that the result of hit test might cause the window to make the hit view become firstResponder.
firstResponder is all about keyboard events and, at least on macOS, menu item actions and commands like cut, copy, paste, undo etc...
When a keyboard event is received by the app from the Window Server, it goes to the firstResponder. If it's not interested in it, then it goes up the chain to nextResponder until it exhausts the responder chain. On macOS there are related but separate concepts of the mainWindow and keyWindow. They are usually the same, but can be different. If they are different the responder chain first starts with the keyWindow, and when that chain is exhausted, it then goes to the mainWindow. Then the application gets a crack at it. Then the application's delegate. Then if it's a document based app, the document and then the document's delegate.
On iOS, I'm a little fuzzy on the exact details, but it's similar. Actually I think it's simpler, because you don't have multiple windows.
Hit testing, on the other hand, is all about the view hierarchy. The app finds which window (on macOS) the hit occurs in, then proceeds down to its immediate subviews, then down their subviews, etc., until it finds a leaf view that is hit.
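That descent can be modeled without UIKit. The `View`, `Rect`, and coordinate handling below are simplified stand-ins for UIView's hitTest(_:with:) / pointInside(_:with:) pair, not real API:

```swift
// Toy model of UIKit's hit-test descent, UIKit-free so it runs anywhere.
struct Rect {
    var x, y, w, h: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px < x + w && py >= y && py < y + h
    }
}

final class View {
    let name: String
    let frame: Rect            // in the parent's coordinate space
    var subviews: [View] = []  // last element is frontmost, as in UIKit
    init(_ name: String, _ frame: Rect) {
        self.name = name
        self.frame = frame
    }

    // Point is in this view's own coordinate space.
    func hitTest(_ px: Double, _ py: Double) -> View? {
        // pointInside check: bail out if the point misses this view.
        guard Rect(x: 0, y: 0, w: frame.w, h: frame.h).contains(px, py) else {
            return nil
        }
        // Check subviews front-to-back; first hit wins.
        for sub in subviews.reversed() {
            if let hit = sub.hitTest(px - sub.frame.x, py - sub.frame.y) {
                return hit
            }
        }
        // No subview contains the point: this view is the deepest hit.
        return self
    }
}
```

The recursion returns the deepest subview containing the point, which is exactly the view UIKit delivers the touch to.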
I'm having trouble identifying why a UIButton (specifically a UIBarButtonItem, the menu button in my case) gets the accessibility focus when a UIViewController changes.
The problem is it "cuts" the announcement being read and takes the focus at no need.
So my questions:
How do I find "who" is giving a UI object an accessibility focus. (I tried to override the becomeFirstResponder - it never gets called).
How do I select, programmatically, which UI object gets the accessibility focus now.
Thanks!
To select which object becomes first responder, you can simply call becomeFirstResponder() on the UIResponder object you want to be the focus.
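If the goal is specifically to move VoiceOver's accessibility focus (rather than the keyboard focus), posting an accessibility notification is usually the more direct tool. A hedged sketch; `elementToFocus` is a placeholder for whatever view or accessibility element should receive focus:

```swift
import UIKit

// Ask VoiceOver to move its focus to a specific element.
// .layoutChanged suits in-place updates; .screenChanged is typically
// posted when a whole new screen (e.g. a new view controller) appears.
func moveAccessibilityFocus(to elementToFocus: Any?) {
    UIAccessibility.post(notification: .screenChanged, argument: elementToFocus)
}
```

This also interrupts any in-progress announcement, which matches the "cuts the announcement" symptom: some other code (or UIKit's default screen-changed handling) is likely posting one of these notifications with the bar button as the argument.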
I've recently discovered a problem in my app that only seems to occur in iOS 10 where the system keyboard does not display when programmatically triggering a text field to become first responder inside of a completion handler -- specifically the completion handler I get back from a Touch ID attempt.
The crazy part of this issue is, even though the keyboard is not shown, the area on the iPhone where the keyboard normally would be is still responding to touch inputs as if the user is typing on the keyboard!
After doing a lot of investigation and debugging into the issue, I stumbled across the fact that the hidden property is set to YES on the private UIRemoteKeyboardWindow that gets created after becomeFirstResponder is invoked on the text field. In other situations where I bring up the keyboard, the value of that hidden property is set to NO.
Has anybody else run into this problem in iOS 10? If so, anybody found a solution to this? I tried manually setting the hidden value to YES on the window instance but that had no effect on it. I'm pretty much grasping at straws at this point.
Attachments:
Here's the output of the windows from the UIApplication instance when the text field becomes first responder outside of the Touch ID completion handler (pay close attention to UIRemoteKeyboardWindow):
And when the UITextField becomes the first responder inside the Touch ID handler...
First Update
So I had not considered calling becomeFirstResponder on the main thread, as some have pointed out, but unfortunately that did not resolve the issue -- however, I did make some additional discoveries. The hidden-window issue seems to stem from outputting the details of the UIApplication instance's windows immediately after issuing the becomeFirstResponder action. After doing that, I set a breakpoint on the UITextField editing callback and proceeded to interact with the (invisible) keyboard -- and when I output the window details, it doesn't seem like the hidden property is ever set to YES (which may rule out that property as the cause of the issue), but I still have an invisible keyboard! I started debugging the view hierarchy, and below is a screenshot of what it looks like when I examine the keyboard window:
Hopefully you guys can see what I discovered here which is that the keys are present but there appears to be some white view blocking them from sight. The thing is, I don't even see those white views on my app screen. I just see what normally sits behind the keyboard when it's present.
As a temporary workaround, calling becomeFirstResponder after a delay fixed this; however, I'm not happy with the hacky solution.
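One way to package that delay workaround, sketched with Dispatch only; the function name and the 0.1 s value are illustrative assumptions, not from the original post:

```swift
import Dispatch

// Defer the becomeFirstResponder() call so it runs after the Touch ID
// completion handler has fully unwound, instead of synchronously inside
// it. The delay value is arbitrary; even 0 (the next main-queue turn)
// may be enough.
func focusAfterDelay(_ delay: Double = 0.1,
                     on queue: DispatchQueue = .main,
                     _ makeFirstResponder: @escaping () -> Void) {
    queue.asyncAfter(deadline: .now() + delay, execute: makeFirstResponder)
}

// In the app (hypothetical names):
// focusAfterDelay { textField.becomeFirstResponder() }
```

The `queue` parameter defaults to the main queue, since UIKit calls must happen there.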
It looks like the issue occurs in other scenarios too -- the keyboard can be invisible even if you select the text field manually after cancelling the Touch ID alert.
UIKit text input components, such as UITextView and UITextField, have an inputView property for adding a custom keyboard. There are two questions I have relating to this.
If the keyboard is currently visible and the property is set to a new input view, nothing happens. Resigning and regaining first responder status refreshes the input and displays the new view. Is this the best way to do it? If so it might answer my bigger question:
Is it possible to animate the transition between two input views?
From the UIResponder docs:
Responder objects that require a custom view to gather input from the user should redeclare this property as readwrite and use it to manage their custom input view. When the receiver subsequently becomes the first responder, the responder infrastructure presents the specified input view automatically. Similarly, when the view resigns its first responder status, the responder infrastructure automatically dismisses the specified view.
So unfortunately the answer to 1 is Yes and 2 is No.
Actually there is a method to do it cleanly: UIResponder's reloadInputViews, available since iOS 3.2!
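A hedged sketch of what that looks like in practice; `textField` and `customKeyboard` are hypothetical names:

```swift
import UIKit

// Swap the input view while the field stays first responder.
// reloadInputViews() refreshes the presented keyboard in place,
// with no resign/become round trip.
func swapInputView(on textField: UITextField, to customKeyboard: UIView) {
    textField.inputView = customKeyboard
    textField.reloadInputViews()
}
```

This avoids the resign-then-become dance from the question, though the swap itself is still not animated by the system.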
I think you can animate it with some extra work:
Create a clear background window of a higher UIWindowLevel than the keyboard window.
Add your custom keyboard there and animate its frame into place.
Then set it as your text input's inputView and refresh the first responder as you do.
Your custom keyboard will change its parent view from your custom window to the keyboard one, but hopefully the user won't notice ;)