I'm adding VoiceOver support to my app. So far, so good, but I'd really like to be able to specify which element is the first one spoken after a UIAccessibilityScreenChangedNotification. I haven't seen a way to do this. Making something the summary element doesn't really seem to do it. Am I missing something?
This has always been perfectly possible to do.
Just write something along the lines of:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification,
                                    self.myFirstElement);
}
This works for both UIAccessibilityScreenChangedNotification and UIAccessibilityLayoutChangedNotification.
Now for Swift 5:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    UIAccessibility.post(notification: UIAccessibility.Notification.screenChanged,
                         argument: myFirstElement)
}
As far as I know, there is no API that specifies a reading order, other than the Summary Element value on startup - that is by design.
So you would have to test the default order of the UIKit elements and of any custom controls, because it depends on your design. You can also mark items as non-accessible so they won't be read, leave accessible elements to be read by default, and group accessible elements into containers to better control your intended interactions (see the sketch below). I don't know if making the item selected will help.
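For illustration, here is a minimal sketch of that kind of setup, assuming a hypothetical QuizViewController with decorativeImageView, headerLabel, and detailLabel subviews:
import UIKit

class QuizViewController: UIViewController {
    // Hypothetical subviews; any UIView works the same way.
    let decorativeImageView = UIImageView()
    let headerLabel = UILabel()
    let detailLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Purely decorative content is skipped by VoiceOver entirely.
        decorativeImageView.isAccessibilityElement = false

        // The containing view dictates the order in which its children are read.
        view.accessibilityElements = [headerLabel, detailLabel]
    }
}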
I take it you are already using the Accessibility Inspector to test your application before testing on iOS.
If you are needing some background on the subject, Rune's Working With VoiceOver Support and Gemmell's Accessibility for Apps may be worth reading.
What about using UIAccessibilityAnnouncementNotification?
This technique worked for me.
VoiceOver will announce the value of the first element in the accessibleElements array. This can be sorted to suit your needs.
Related
I am working on an app and trying to make it as accessible as possible. I am trying to move focus to a certain element once an action takes place. I was curious about the difference between these two functions:
UIAccessibilityFocusedElement vs. UIAccessibilityPostNotification
If someone could explain the difference between the two it would be greatly appreciated.
UIAccessibilityPostNotification is used to change things: for example, moving focus to an element, but also pausing and resuming an assistive technology, like this:
UIAccessibility.post(notification: .pauseAssistiveTechnology, argument: UIAccessibility.AssistiveTechnologyIdentifier.notificationSwitchControl)
UIAccessibility.post(notification: .resumeAssistiveTechnology, argument: UIAccessibility.AssistiveTechnologyIdentifier.notificationSwitchControl)
It can also announce something:
UIAccessibility.post(notification: .announcement, argument: "Say something")
or refresh focus after accessibility scroll
UIAccessibility.post(notification: .pageScrolled, argument: nil)
On the other hand, UIAccessibilityFocusedElement can't change anything. It just returns the currently focused element (or nil), like this:
UIAccessibility.focusedElement(using: UIAccessibility.AssistiveTechnologyIdentifier.notificationVoiceOver)
On a side note: for now, the only assistive technology that can be paused or resumed is notificationSwitchControl; trying that with VoiceOver causes crashes.
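For completeness, a small sketch of the read-only side, assuming a myButton view that exists somewhere in the hierarchy:
// Query (not change) the element VoiceOver currently has focused.
if let focused = UIAccessibility.focusedElement(
    using: UIAccessibility.AssistiveTechnologyIdentifier.notificationVoiceOver) as? UIView,
   focused === myButton {
    // myButton currently has the VoiceOver cursor on it.
}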
If you are trying to move focus to an element after an action or a screen change, I think you should take a look at:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, element_to_be_focused);
Should be posted when a new view appears that encompasses a major portion of the screen.
or
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, element_to_be_focused);
Should be posted when the layout of a screen changes, for example when an individual element appears or disappears.
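For example, a minimal sketch of the second case, assuming a hypothetical errorLabel that the action has just unhidden:
func showValidationError(_ message: String) {
    errorLabel.text = message
    errorLabel.isHidden = false

    // Only part of the layout changed, so post layoutChanged and pass
    // the element that should receive VoiceOver focus next.
    UIAccessibility.post(notification: .layoutChanged, argument: errorLabel)
}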
I'm currently trying to make my game more accessible by adding VoiceOver support. Everything is working fine on iOS, but I'm having some trouble with the watchOS version. I need a way to find out whether VoiceOver is currently enabled, so I can remove certain image-based questions from the game. So is there anything like:
UIAccessibilityIsVoiceOverRunning()
in WatchKit?
And also, is it possible to move the accessibility focus to a certain element? Something comparable to:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, view);
Thanks, Klemens
To check if it's running:
let isVoiceOverOn: Bool = WKAccessibilityIsVoiceOverRunning()
if isVoiceOverOn {
    // do some VoiceOver stuff
} else {
    // do some stuff that does not make sense for VoiceOver
}
To find out when VoiceOver starts and stops, observe:
WKAccessibilityVoiceOverStatusChanged
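For example (a sketch; the GameInterfaceController name is made up, and wrapping the constant in NSNotification.Name is my assumption, so double-check the exact type in your SDK):
import WatchKit

class GameInterfaceController: WKInterfaceController {
    override func awake(withContext context: Any?) {
        super.awake(withContext: context)

        NotificationCenter.default.addObserver(
            forName: NSNotification.Name(WKAccessibilityVoiceOverStatusChanged),
            object: nil,
            queue: .main
        ) { _ in
            // Re-check the state whenever it flips and adjust the game accordingly.
            let voiceOverOn = WKAccessibilityIsVoiceOverRunning()
            print("VoiceOver running: \(voiceOverOn)")
        }
    }
}
Remember to remove the observer when the controller goes away.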
I have a custom UITableView subclass in which I override +accessInstanceVariablesDirectly to return NO in order to ensure attributes with no setter cannot be set using KVC.
When removing this table view from the view hierarchy, the app crashes - sometimes - and now for the weird part: only if Accessibility is enabled! (i.e. the Accessibility Inspector is visible, or you have Accessibility enabled on a physical device)
If I do not override +accessInstanceVariablesDirectly, everything works fine. I figured maybe UITableView relies on accessing some instance variables directly - but then what is the point of this method, if I can break superclasses by using it? Is there a way to specify this behavior per-attribute, like +automaticallyNotifiesObserversForKey:? However I am baffled by the fact that this issue only exists when Accessibility is enabled.
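For reference, the override being described is essentially this (a sketch, written in Swift with a made-up class name):
import UIKit

class MyTableView: UITableView {
    // Forbid KVC from falling back to direct ivar access for keys
    // that have no setter.
    override class var accessInstanceVariablesDirectly: Bool {
        return false
    }
}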
I tried analyzing the project with various Instruments, but without success.
You can find a minimal project reproducing the issue here. I would greatly appreciate any pointers on why this is happening or how to achieve what I want nonetheless.
This issue appears to be fixed in iOS 9.
I've been developing a custom keyboard for iOS 8, but stumbled upon a problem trying to send images using the keyboard. I did some research and it seems like there isn't a simple way to do this with UITextDocumentProxy because only NSStrings are allowed.
Am I either overlooking any simple ways to use a custom keyboard to send images and/or is there any way to work around this issue?
Thanks in advance
Apparently, you are not the only person to try a keyboard like this. If you look at the animated GIF on the site, the keyboard uses copy/paste to add the image to the messages.
The UIKeyInput Protocol, which is inherited by UITextDocumentProxy, has the following function:
func insertText(_ text: String) //Swift
- (void)insertText:(NSString *)text //ObjC
These only take a String or an NSString, as you already know. Because there are no other methods to insert content, it can be assumed that, currently, there is no other way.
Right now, I would suggest that you adopt a similar approach. When the user taps on the image, add it to the UIPasteboard. Maybe present a little banner on top of the keyboard saying, "You can now paste" or something, but that would be up to you to decide.
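A minimal sketch of that approach, assuming the keyboard calls a copyImageToPasteboard helper when one of its image keys is tapped (keyboard extensions can only reach the general pasteboard when the user has granted Full Access):
import UIKit

func copyImageToPasteboard(_ image: UIImage) {
    // The host app cannot receive the image through insertText(_:),
    // so hand it over via the pasteboard instead.
    UIPasteboard.general.image = image
    // From here the user pastes the image into the conversation themselves.
}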
I have an app with a custom text editor that implements the UITextInput protocol. In iOS 6, Apple added one new required method to the protocol:
- (NSArray *)selectionRectsForRange:(UITextRange *)range
I've implemented this, but I can't seem to find a way to trigger it. At least in the way my app works, it seems to never get called by the text system. Does anyone know what it's used for?
This method is only used by subclasses of UITextView. This is the only method that would give you the system selection and loupe. This is what I was told at WWDC.
I am working on my own DTRichTextEditor as well, and I implemented it nevertheless; maybe one day we will get the selection/loupe for our own UIViews that are not derived from UITextView.
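In case it is useful, here is a sketch of what such an implementation can look like (SimpleSelectionRect and boundingRect(for:) are made-up names; the layout logic is up to your editor):
import UIKit

class SimpleSelectionRect: UITextSelectionRect {
    private let _rect: CGRect
    init(rect: CGRect) {
        _rect = rect
        super.init()
    }
    override var rect: CGRect { return _rect }
    override var writingDirection: NSWritingDirection { return .leftToRight }
    override var containsStart: Bool { return false }
    override var containsEnd: Bool { return false }
    override var isVertical: Bool { return false }
}

// In the custom UITextInput editor:
func selectionRects(for range: UITextRange) -> [UITextSelectionRect] {
    // Map the range onto the rectangles the editor uses to lay out that
    // text; a single bounding rect is enough to satisfy the API.
    return [SimpleSelectionRect(rect: boundingRect(for: range))]
}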