Apple Watch v1.01 Voice Over Accessibility of WKInterfaceTable rows doesn't work - ios

I'm trying to get a simple VoiceOver setup working in my Watch app. I use a WKInterfaceTable whose rows have multiple elements within them. At the moment VoiceOver just steps through each individual element, reading them out.
I want to set isAccessibilityElement to YES on each of my rows so that the row itself becomes VoiceOver-selectable and hides its child elements.
Unfortunately, this just doesn't seem to work. The row controllers have to inherit from NSObject, not WKInterfaceObject, and the isAccessibilityElement property seems to be ignored. I set it, along with a label, on each row controller, but VoiceOver continues to select the children of the row and ignores any of the accessibility setup I've got on the row controller itself.
I've seen a WWDC video this year talking about this, so I'm presuming it all works as a WatchKit 2.0 feature. Has anyone had any success getting their WatchKit 1.0 apps to have anything other than the most basic built-in VoiceOver accessibility?
Thanks for your time

Instead of setting the accessibilityHint and accessibilityLabel on the row itself, connect the main Group to your Cell NSObject class as well and set the accessibility properties on this group:
@IBOutlet var mainGroup: WKInterfaceGroup!
and
let row = workoutTypeSelectionTable.rowController(at: index) as! WorkoutSelectionCell
row.mainGroup.setAccessibilityHint("Double tap to start.")
row.mainGroup.setAccessibilityLabel("Start a new \(disciplineTitle) session.")
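Putting the pieces together, a minimal sketch of the row controller class (the class and outlet names here follow the snippet above and are otherwise assumptions):

```swift
import WatchKit

// Hypothetical row controller matching the snippet above. Row controllers
// inherit from NSObject, so accessibility is configured on the WKInterface
// outlets (here, the row's main group) rather than on the controller itself.
class WorkoutSelectionCell: NSObject {
    @IBOutlet var mainGroup: WKInterfaceGroup!
    @IBOutlet var titleLabel: WKInterfaceLabel!
}
```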


changing the accessibility label in control center (remote command center)

I've got a simple app that plays a radio station. I've added an MPRemoteCommandCenter to let the user control the audio via Control Center.
That's all working fine.
However, I want to change the controls' accessibility labels, and this is the part where things don't work as expected.
I've set up my remoteCommandCenter as follows:
let remoteCommandCenter = MPRemoteCommandCenter.shared()
Then, I added controls and handlers:
remoteCommandCenter.playCommand.isEnabled = true
remoteCommandCenter.playCommand.addTarget(self, action: #selector(ExternalPlaybackController.handleExternalPlayPauseCommandEvent(_:)))
And then, I want to add some accessibility label:
remoteCommandCenter.playCommand.accessibilityLabel = "Play radio"
This is where things don't work. If I step through in the debugger, that line executes, yet the label doesn't change. What am I doing wrong?
Can you even change the accessibility labels of the remoteCommandCenter?
Can you even change the accessibility labels of the remoteCommandCenter?
I've never worked with this kind of component, but I think it is ignored by VoiceOver because the screen reader doesn't recognize it as an accessibility element.
In my view, your code compiles with no problems because the accessibility properties belong to the UIAccessibility informal protocol, which means it's well recognized as code.
However, it's not interpreted by VoiceOver as information to be read out, because your element isn't a kind of UIKit control.
I suggest creating a UIAccessibilityElement for your playCommand so you can customize its behaviour as you wish. The Apple doc states:
The UIAccessibility informal protocol is also implemented by the UIAccessibilityElement class, which represents custom user interface objects. If you create a completely custom UIView subclass, you might need to create an instance of UIAccessibilityElement to represent it. In this case, you would support all the UIAccessibility properties to correctly set and return the accessibility element’s properties.
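Following that suggestion, here is a sketch of wrapping a custom control in a UIAccessibilityElement. Note this is illustrative only: the view and element names are assumptions, and it is not a confirmed fix for MPRemoteCommandCenter, whose Control Center UI is rendered by the system, outside your app's accessibility tree.

```swift
import UIKit

// Sketch: expose a custom label to VoiceOver for an in-app player view by
// vending a UIAccessibilityElement. All names here are hypothetical.
final class PlayerView: UIView {
    private lazy var playElement: UIAccessibilityElement = {
        let element = UIAccessibilityElement(accessibilityContainer: self)
        element.accessibilityLabel = "Play radio"
        element.accessibilityTraits = .button
        // For a real view, keep this frame in sync in layoutSubviews().
        element.accessibilityFrameInContainerSpace = bounds
        return element
    }()

    override var accessibilityElements: [Any]? {
        get { [playElement] }
        set { }
    }
}
```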

How to get VoiceOver to announce section labels in iOS?

In the iPhone Weather app, when using VoiceOver, I noticed that tapping into a section for the first time announces the section.
For example, in iOS 9, tapping any item in the middle strip for the first time will announce "Hourly forecasts" before continuing to describe the element you tapped on. Tapping anything else in the strip will not announce "Hourly forecasts".
Tapping anything in the bottom table will announce "Daily forecasts" before continuing to describe the element you tapped on. Tapping anything else in this table will not be prefixed with "Daily forecasts".
Is there a simple API to name sections of your app? Or do you have to do this manually by tracking the voiceover cursor and dynamically changing your label? (Does this even work? Can you change the accessibilityLabel after something is tapped but before it is read?)
There are two approaches I can think of:
Subclass UITableViewCell and override accessibilityLabel:
- (NSString *)accessibilityLabel
{
    NSString *voiceOverString;
    // Prepend the section title to voiceOverString, then append the element's value.
    return voiceOverString;
}
Alternatively, per the example linked in the Apple docs, you can set the accessibilityLabel of the cell from cellForRowAtIndexPath. That example is for the Weather app itself.
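The label-composition logic behind that approach can be sketched as a small helper; whether the section prefix should be spoken is tracked by the caller (for example, per section in cellForRowAtIndexPath). The function name and label texts are illustrative:

```swift
// Sketch: prefix the row's text with its section title only the first time
// an element in that section is focused. Names here are assumptions.
func spokenLabel(sectionTitle: String, rowText: String,
                 includeSectionPrefix: Bool) -> String {
    includeSectionPrefix ? "\(sectionTitle), \(rowText)" : rowText
}
```

In cellForRowAtIndexPath you would assign the result to the cell's accessibilityLabel.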
Is there a simple API to name sections of your app?
It seems like the most appropriate reference is Apple's Accessibility Programming Guide.
And its API, Apple's UIAccessibility Documentation.
Setting the shouldGroupAccessibilityChildren property seems like the best way to accomplish your goal. The linked API describes it as,
A Boolean value indicating whether VoiceOver should group together the elements that are children of the receiver, regardless of their positions on the screen. Setting the value of this property to YES on the parent view of the items in the vertical columns causes VoiceOver to respect the app’s grouping and navigate them correctly.
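A minimal sketch of applying that property, assuming hypothetical parent views for the two column groups:

```swift
import UIKit

final class WeatherViewController: UIViewController {
    // Hypothetical parent views of the two column groups.
    @IBOutlet var hourlyStrip: UIView!
    @IBOutlet var dailyTable: UITableView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Keep VoiceOver navigation grouped within each column rather than
        // ordered purely by on-screen position.
        hourlyStrip.shouldGroupAccessibilityChildren = true
        dailyTable.shouldGroupAccessibilityChildren = true
    }
}
```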
Things to keep in mind:
Is the target element an accessibility element? (You can check using the isAccessibilityElement property; standard UIKit controls and views implement the UIAccessibility protocol by default.)
If so, you just need to set its accessibility attributes.
If not, first set view.isAccessibilityElement = true
The accessibilityLabel property identifies the element
The accessibilityHint property describes the action triggered by an element
You can set accessibility attributes in the storyboard
Or, you can set accessibility attributes in the implementation of your view subclass
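The checklist above, as a short sketch for a hypothetical custom view:

```swift
import UIKit

// Hypothetical custom view that VoiceOver would otherwise skip.
let badgeView = UIView()
badgeView.isAccessibilityElement = true            // expose it to VoiceOver
badgeView.accessibilityLabel = "Unread messages"   // identifies the element
badgeView.accessibilityHint = "Double tap to open the inbox."  // describes the action
```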

Swipe to switch focused accessibility element iOS

When the user has VoiceOver on, in certain apps a one-finger swipe to the right or the left changes the focused accessibility element and speaks it (for example, the App Store Top Charts view). I would like to have this in my own app (which uses a storyboard).
I can think of several ways to do this myself with a swipe gesture recognizer and a list of accessibility elements in order, but it seems like there must be a way to do this in the accessibility API. However, my research has turned up nothing.
Is this a built in feature? If so, how can I add it in my storyboard or in code?
Edit:
Per advice from one of the answers, I have implemented the UIAccessibilityContainer methods for my view. Here is the code.
- (NSInteger)accessibilityElementCount {
    return 4;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    return [@[self.menuButton, self.firstButton, self.secondButton, self.thirdButton] objectAtIndex:index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    return [@[self.menuButton, self.firstButton, self.secondButton, self.thirdButton] indexOfObject:element];
}
The view I am having this issue with is defined in an Interface Builder storyboard. As you can no doubt infer from the code, it has a menu button and three other buttons as subviews.
What you are describing is the built-in behavior for VoiceOver and can't be changed on a per-app basis.
If you want to modify the order elements are focused, look at the UIAccessibilityContainer protocol for iOS 7 or accessibilityElements property of NSObject for iOS 8. If you don't want to implement either of those, you can also simply set accessibilityElementsHidden to YES for elements you want VoiceOver to ignore.
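For the iOS 8+ route, a sketch of the same ordering using the accessibilityElements property (the view and outlet names mirror the question's code and are otherwise assumptions):

```swift
import UIKit

// Sketch: set the VoiceOver focus order explicitly instead of overriding
// the three UIAccessibilityContainer methods.
final class MenuView: UIView {
    @IBOutlet var menuButton: UIButton!
    @IBOutlet var firstButton: UIButton!
    @IBOutlet var secondButton: UIButton!
    @IBOutlet var thirdButton: UIButton!

    override func awakeFromNib() {
        super.awakeFromNib()
        accessibilityElements = [menuButton!, firstButton!,
                                 secondButton!, thirdButton!]
    }
}
```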
I have fixed the problem by adding accessibility labels to the buttons in the storyboard. Because voice over already spoke their label correctly, I had not bothered to do so before.

UIAccessibility - containers

There's a "Containers" rotor option in Voiceover which allows the user to quickly navigate through "high level" sections of the screen via single finger swipe up and swipe down actions. For example, the Calendar app has three high level items: navbar, contents and toolbar.
My app uses custom UIView subclasses and, no matter what I try to do, all my views seem to belong to a single container. I can't split them into logical sections. I tried putting them in separate views implementing the UIAccessibilityContainer protocol and setting a few of the accessibility properties on the parent views.
Does anyone know how to create multiple containers?
I did some digging on this issue and think it's a private trait Apple is using. First, I noticed the only containers recognized are standard UIKit objects like UITableView, UITabBar, UINavigationBar, etc. So next I used the debugger to inspect the accessibility traits of these components: they're all 0x200000000000. Just to be sure I hadn't missed a UIAccessibilityTrait, I checked all of their values; none of them match. Furthermore, if you set your view's accessibility traits to this mysterious value, it'll work just like you want! I tried to determine where this constant is defined but didn't have much luck. If you want to dig further, it looks like Apple stores accessibilityTraits using an NSObject category that uses associated objects with a constant named AXTraitsIdentifier.
Practically speaking, you could do something like the code below, but since it's not defined in a public API, its behaviour could change in the future:
//Note the navBar has to be run through a voice over pass before the value is set :( or you can just directly set the value to 0x200000000000.
myContainerView.accessibilityTraits = navBar.accessibilityTraits;
I'd love to hear if anyone else has info on this. So far I haven't found an ideal solution.
I have been able to make the views in my app reachable by single finger swipe up and swipe down actions when the "Containers" rotor option is selected by setting the accessibilityContainerType property of my views to semanticGroup.
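A sketch of that approach, assuming one parent view per logical section (the view names are illustrative):

```swift
import UIKit

// Sketch (iOS 13+): mark each logical section as a semantic group so the
// "Containers" rotor can jump between them with vertical swipes.
let headerView = UIView()
let contentView = UIView()
let toolbarView = UIView()

[headerView, contentView, toolbarView].forEach {
    $0.accessibilityContainerType = .semanticGroup
}
```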

How can I add to the iOS VoiceOver rotor for a custom view?

Recently, I've been working to get my application functioning well with VoiceOver. Generally it's been simple and straightforward, but there are some behaviors from system apps that I'd like to emulate, and I'm having a hard time locating the API to set things up.
In particular, I'm interested in adding a couple of options to the VoiceOver "rotor" and responding to them when the user increases and decreases the value. However, despite the fact that apps like Apple's Maps app add items to the rotor and are able to respond, I can't figure out how to do so for my app.
Has anyone succeeded in doing this? And if so, how?
With iOS 8, you can use the accessibilityCustomActions property to return an array of UIAccessibilityCustomAction objects, representing the actions you'd like to present "rotor-style".
UPDATE: iOS 10 finally adds the ability to add custom rotor items to VoiceOver (not the same thing as the "Actions" rotor item): just add an array of UIAccessibilityCustomRotor objects to the accessibilityCustomRotors property of the appropriate container view.
OLD ANSWER:
There is currently no API to add your own rotor items. You can only implement how some of the existing rotor items work:
"Adjust value" - here you should return the UIAccessibilityTraitAdjustable trait from accessibilityTraits and then implement the accessibilityIncrement/accessibilityDecrement methods
"Headings" - you mark some views as UIAccessibilityTraitHeader; those are then the views the user moves through when they rotate to "Headings" and flick up/down
OLD UPDATE: "Actions" - see UIAccessibilityCustomAction
I guess you should file a radar if you need to add custom items to rotor.
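For the iOS 10+ custom rotor mentioned in the update, here is a sketch of a rotor that cycles through a container's heading subviews. The rotor name, the container, and the search logic are all assumptions for illustration:

```swift
import UIKit

// Sketch: a custom rotor that moves focus between subviews marked with the
// .header trait. Attach the result to the container's
// accessibilityCustomRotors array.
func makeSectionsRotor(for container: UIView) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Sections") { predicate in
        let headings = container.subviews.filter {
            $0.accessibilityTraits.contains(.header)
        }
        guard !headings.isEmpty else { return nil }
        // Find the currently focused heading, then step forward or back.
        let current = predicate.currentItem.targetElement as? UIView
        var index = current.flatMap { headings.firstIndex(of: $0) } ?? -1
        index += (predicate.searchDirection == .next) ? 1 : -1
        guard headings.indices.contains(index) else { return nil }
        return UIAccessibilityCustomRotorItemResult(
            targetElement: headings[index], targetRange: nil)
    }
}

// Usage:
// containerView.accessibilityCustomRotors = [makeSectionsRotor(for: containerView)]
```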