I'm having trouble identifying why a UIButton (specifically a UIBarButtonItem, the menu button in my case) gets accessibility focus when the UIViewController changes.
The problem is that it "cuts off" the announcement being read and takes focus when there is no need for it.
So my questions:
How do I find out "who" is giving a UI object accessibility focus? (I tried overriding becomeFirstResponder - it never gets called.)
How do I programmatically select which UI object gets accessibility focus?
Thanks!
To select which object becomes first responder, you can simply call becomeFirstResponder() on the UIResponder object you want to have the focus.
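A minimal sketch of that, assuming a hypothetical text field in the view controller being shown:

    import UIKit

    class DetailViewController: UIViewController {
        // Hypothetical outlet; any UIResponder subclass works the same way.
        @IBOutlet weak var nameField: UITextField!

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // Explicitly choose which object takes the focus when this
            // controller appears, instead of letting it land elsewhere.
            nameField.becomeFirstResponder()
        }
    }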
According to Apple's documentation:
When your app receives an event, UIKit automatically directs that
event to the most appropriate responder object, known as the first
responder.
The same documentation explains how the first responder is determined:
The hitTest:withEvent: method of UIView traverses the view hierarchy,
looking for the deepest subview that contains the specified touch,
which becomes the first responder for the touch event.
What I don't understand is why there is a property of UIResponder called firstResponder? And why becomeFirstResponder exists. Should not the first responder be determined dynamically by UIKit based on location of the specific touch event?
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
The only way I can think of to resolve this confusion is if all these methods are relevant only to events of the type of shake, remote control and editing menu. But the documentation is not clear about it.
What I don't understand is why there is a property of UIResponder called firstResponder?
There isn't. UIResponder does not have a public property named firstResponder.
And why becomeFirstResponder exists.
The main use of becomeFirstResponder is to programmatically choose which text field gets keyboard events.
Should not the first responder be determined dynamically by UIKit based on location of the specific touch event?
There are more kinds of events than touch events. For example, there are keyboard events and motion events. The first responder tracked by UIKit is for non-touch events. In other systems, this concept is usually called the “focus” or more specifically the “keyboard focus”. But (in iOS) the first responder can be a view that doesn't respond to keyboard events.
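For instance, shake (motion) events are delivered starting at the first responder, so an object that wants them has to opt in; a minimal sketch, assuming a hypothetical view controller:

    import UIKit

    class ShakeViewController: UIViewController {
        // Motion events start at the first responder, so this controller
        // must be able (and willing) to become it.
        override var canBecomeFirstResponder: Bool { return true }

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            becomeFirstResponder()
        }

        override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
            if motion == .motionShake {
                print("device was shaken")
            }
        }
    }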
Additionally, canBecomeFirstResponder returns NO for UIView, which is clearly incorrect since views do handle touch events.
That's ok, because touch events don't really start at the first responder. They start at the view returned by -[UIView hitTest:withEvent:].
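A small debugging sketch of that, assuming a hypothetical container view, which shows hit-testing picking the touch's target independently of the first responder:

    import UIKit

    class DebugContainerView: UIView {
        // UIKit finds the touch's target by hit-testing the view hierarchy,
        // not by consulting the first responder.
        override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
            let target = super.hitTest(point, with: event)
            print("touch at \(point) goes to \(String(describing: target))")
            return target
        }
    }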
The only way I can think of to resolve this confusion is if all these methods are relevant only to events of the type of shake, remote control and editing menu. But the documentation is not clear about it.
There are more kinds of non-touch events that start with the first responder, but aside from that, you have resolved it correctly.
This is not a "quick answer" topic -- your best bet is to do some searching and read through several articles about it.
But, briefly...
.becomeFirstResponder() is often used to activate text fields without requiring the user to tap in the field. A common case is a form-style interface with multiple text fields, where you automatically "jump" to the next field based on input:
myTextField.becomeFirstResponder()
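For example, a minimal sketch of that "jump to the next field" flow, assuming two hypothetical fields that share a delegate:

    import UIKit

    class FormViewController: UIViewController, UITextFieldDelegate {
        @IBOutlet weak var usernameField: UITextField!
        @IBOutlet weak var passwordField: UITextField!

        // Return in the first field moves focus to the next one;
        // Return in the last field dismisses the keyboard.
        func textFieldShouldReturn(_ textField: UITextField) -> Bool {
            if textField == usernameField {
                passwordField.becomeFirstResponder()
            } else {
                textField.resignFirstResponder()
            }
            return true
        }
    }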
Again, as you've already seen from glancing at the docs, there is much more to it than that... but far too much for an answer here.
I've recently discovered a problem in my app that only seems to occur in iOS 10 where the system keyboard does not display when programmatically triggering a text field to become first responder inside of a completion handler -- specifically the completion handler I get back from a Touch ID attempt.
The crazy part of this issue is, even though the keyboard is not shown, the area on the iPhone where the keyboard normally would be is still responding to touch inputs as if the user is typing on the keyboard!
After doing a lot of investigation and debugging into the issue, I stumbled across the fact that the hidden property is set to YES on the private UIRemoteKeyboardWindow that gets created after becomeFirstResponder is invoked on the text field. In other situations where I bring up the keyboard, the value of that hidden property is set to NO.
Has anybody else run into this problem in iOS 10? If so, has anybody found a solution to this? I tried manually setting the hidden value to NO on the window instance, but that had no effect. I'm pretty much grasping at straws at this point.
Attachments:
Here's the output of the windows from the UIApplication instance when the text field becomes first responder outside of the Touch ID completion handler (pay close attention to UIRemoteKeyboardWindow):
And when the UITextField becomes the first responder inside the Touch ID handler...
First Update
So I had not previously considered whether becomeFirstResponder was being done on the main thread, which some have pointed out, but unfortunately that did not resolve the issue -- however, I did make some additional discoveries. The hidden-window observation seems to come from outputting the details of the UIApplication instance's windows immediately after issuing the becomeFirstResponder action. After doing that, I set a breakpoint on the UITextField editing callback and proceeded to interact with the (invisible) keyboard -- and when I output the window details, it doesn't seem like the hidden property is ever set to YES (which can possibly rule out that property being the cause of the issue), but I still have an invisible keyboard! I started debugging the view hierarchy, and below is a screenshot of what it looks like when I examine the keyboard window:
Hopefully you guys can see what I discovered here which is that the keys are present but there appears to be some white view blocking them from sight. The thing is, I don't even see those white views on my app screen. I just see what normally sits behind the keyboard when it's present.
As a temporary workaround, calling becomeFirstResponder after a delay fixed this; however, I'm not happy with the hacky solution.
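A sketch of that workaround, assuming the Touch ID flow from the question (the 0.5-second delay is arbitrary):

    import LocalAuthentication
    import UIKit

    func authenticateThenFocus(_ textField: UITextField) {
        let context = LAContext()
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock") { success, _ in
            guard success else { return }
            // The reply block runs on a private queue; hop back to main and
            // add a short delay to dodge the invisible-keyboard behaviour.
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                textField.becomeFirstResponder()
            }
        }
    }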
Looks like the issue occurs in different scenarios too - the keyboard can be invisible even if you select the textField manually after cancelling the Touch ID alert.
I'm working on a custom keyboard for iOS which will have its own search field (similar to what PopKey implements).
My keyboard's textfield is able to take focus with becomeFirstResponder and I'm able to give it up by using resignFirstResponder. However, after I resign focus, the host app has a difficult time retaking focus despite touches on its form. The app's textfield will still show the text cursor blinking.
Any ideas? Thanks
The solution is a hack; as of right now you can't really give the host app its focus back. Here are the steps (a rough sketch follows the list):
- Subclass UITextField and, in its delegate, implement textFieldShouldBeginEditing to return NO.
- Add a BOOL property isSelected that gets set to YES in touchesBegan (not to be confused with the default selected property).
- In your keyboard's keyPressed method, if searchField.isSelected, manipulate searchField.text; otherwise, manipulate the textDocumentProxy as normal.
- Add a clear button and a method that wipes searchField.text and resets searchField.isSelected, allowing any further keystrokes to return to the textDocumentProxy.
- Add an animation that replicates the blinking text cursor.
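A rough sketch of those steps (names like KeyboardSearchField and keyPressed are hypothetical, and the flag is renamed isActiveTarget so it doesn't collide with UIControl's selected/isSelected):

    import UIKit

    class KeyboardSearchField: UITextField {
        // Set when the user taps the in-keyboard search field.
        var isActiveTarget = false

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            isActiveTarget = true
            super.touchesBegan(touches, with: event)
        }
    }

    class KeyboardViewController: UIInputViewController, UITextFieldDelegate {
        let searchField = KeyboardSearchField()

        override func viewDidLoad() {
            super.viewDidLoad()
            searchField.delegate = self
            // ...add searchField, key buttons, and a clear button to the keyboard view
        }

        // Never let the system begin editing our field; key presses are routed manually.
        func textFieldShouldBeginEditing(_ textField: UITextField) -> Bool {
            return false
        }

        // Called from your own key buttons (hypothetical).
        func keyPressed(_ character: String) {
            if searchField.isActiveTarget {
                searchField.text = (searchField.text ?? "") + character
            } else {
                textDocumentProxy.insertText(character)
            }
        }

        // Wired to the clear button; later keystrokes go back to the host app.
        @objc func clearTapped() {
            searchField.text = ""
            searchField.isActiveTarget = false
        }
    }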
UIKit text input components, such as UITextView and UITextField, have a property inputView to add a custom keyboard. There are two questions I have relating to this.
If the keyboard is currently visible and the property is set to a new input view, nothing happens. Resigning and regaining first responder status refreshes the input and displays the new view. Is this the best way to do it? If so it might answer my bigger question:
Is it possible to animate the transition between two input views?
From the UIResponder docs:
Responder objects that require a custom view to gather input from the user should redeclare this property as readwrite and use it to manage their custom input view. When the receiver subsequently becomes the first responder, the responder infrastructure presents the specified input view automatically. Similarly, when the view resigns its first responder status, the responder infrastructure automatically dismisses the specified view.
So unfortunately the answer to 1 is Yes and 2 is No.
Actually there is a method to do it cleanly: UIResponder's reloadInputViews, available from iOS 3.2!
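For reference, a minimal sketch of that approach (the view names are hypothetical):

    import UIKit

    // Swap the custom input view in place while the keyboard is up, without
    // resigning and re-becoming first responder.
    func swapInputView(for textView: UITextView, to customKeyboard: UIView) {
        textView.inputView = customKeyboard
        textView.reloadInputViews()   // refreshes the presented input view
    }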
I think you can animate it with some extra work (a rough sketch follows the steps):
- Create a clear background window at a higher UIWindowLevel than the keyboard window.
- Add your custom keyboard there and animate its frame into place.
- Then set it as your text input's inputView and refresh the first responder as you do.
Your custom keyboard will change its parent view from your custom window to the keyboard one, but hopefully the user won't notice ;)
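A very rough sketch of those steps; note that the system keyboard's exact window level isn't public, so using a level just above .alert is an assumption, and textField / customKeyboard are hypothetical names:

    import UIKit

    func animateInCustomKeyboard(for textField: UITextField, customKeyboard: UIView) {
        // 1. A clear overlay window above the keyboard window (level is assumed).
        let overlay = UIWindow(frame: UIScreen.main.bounds)
        overlay.windowLevel = UIWindow.Level(rawValue: UIWindow.Level.alert.rawValue + 1)
        overlay.backgroundColor = .clear
        overlay.isHidden = false

        // 2. Start the custom keyboard offscreen and slide it into place.
        customKeyboard.frame.origin.y = overlay.bounds.height
        overlay.addSubview(customKeyboard)

        UIView.animate(withDuration: 0.25, animations: {
            customKeyboard.frame.origin.y = overlay.bounds.height - customKeyboard.frame.height
        }, completion: { _ in
            // 3. Hand the view over to the responder infrastructure, which
            //    re-parents it into the keyboard window.
            textField.inputView = customKeyboard
            textField.reloadInputViews()
            overlay.isHidden = true
        })
    }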
This is one of those "it was working a while ago" troubleshooting efforts.
I'm working on the document preview view controller, which contains a scroll view, which itself contains subclasses of UIView that represent each document. I'm modeling this pretty closely on how Keynote handles its document preview, except I build my scroll view horizontally and with paging. But the standard user experience is present: a long press on a document icon causes all document icons to start jiggling, the nav bar has + and Edit buttons, etc.
The issue at hand is that when you tap on the name of a document, I hide all the others, move the one being edited front and center, build a new text edit field, add it as a subview atop the real name label, and set it as first responder; but the
[editNameTextField setClearButtonMode:UITextFieldViewModeWhileEditing];
while correctly showing in the edit field, is not taking any action when the user taps on the clear button.
I can't figure out what I may have done to cause this to stop working -- it had been working!
My first thought was that somehow my instance of this subclass is no longer the delegate for this text edit field. To try and confirm/deny that, I usurped a tap on the image view of the document preview to compare the delegate property to self, and it passes.
    // Sanity check: is the text field still around, and are we still its delegate?
    if (editNameTextField) {
        NSLog(@"editNameTextField is still active");
        if ([editNameTextField.delegate isEqual:self]) {
            NSLog(@"we're still the delegate for the editNameTextField");
        }
    }
Editing the text within the edit field works fine. Pressing the Return/Done key correctly sends the delegate message textFieldShouldReturn:
While investigating this I implemented the delegate method textFieldShouldClear: just to write a log message if the method gets called (and return YES of course). It never gets called.
My next thought was that perhaps a subview had covered up the area where the clear button sits. So I implemented textFieldShouldBeginEditing: and used the opportunity to bring my text field to the front. That didn't change anything either. I set a debugger breakpoint there to play a sound when it was called, and it got called, so I know my text edit field is frontmost.
I have only one troubleshooting strategy remaining: go backwards through snapshots until it starts working again. Before doing that I thought I'd see if any of the more experienced folks out here have any suggestions of what to try next.
Where are you adding the textfield? As a subview of the scrollView? If you added the textfield and it is out of bounds of its parent view, it won't receive any touches.
You can try not calling becomeFirstResponder and see if tapping it will show the keyboard. Another possible error might be that the parent view of the UITextField has userInteractionEnabled = NO.
Without seeing more code I'm afraid I cannot offer more solutions.
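A quick diagnostic sketch of the two checks suggested above (written in Swift for brevity; adapt to Objective-C as needed):

    import UIKit

    func diagnoseUntappableField(_ field: UITextField) {
        var ancestor = field.superview
        while let view = ancestor {
            // Touches can't reach the field if an ancestor has interaction disabled...
            if !view.isUserInteractionEnabled {
                print("user interaction disabled on \(view)")
            }
            // ...or if the field lies outside that ancestor's bounds.
            let frameInAncestor = field.convert(field.bounds, to: view)
            if !view.bounds.intersects(frameInAncestor) {
                print("field is outside the bounds of \(view)")
            }
            ancestor = view.superview
        }
    }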