Swipe to switch focused accessibility element on iOS

When the user has VoiceOver turned on, in certain apps a one-finger swipe to the right or the left moves the focused accessibility element and speaks it (for example, in the App Store top charts view). I would like to have this in my own app (which uses a storyboard).
I can think of several ways to do this myself with a swipe gesture recognizer and a list of accessibility elements in order, but it seems like there must be a way to do this in the accessibility API. However, my research has turned up nothing.
Is this a built in feature? If so, how can I add it in my storyboard or in code?
Edit:
Per advice from one of the answers, I have implemented the UIAccessibilityContainer protocol for my view. Here is the code.
- (NSInteger)accessibilityElementCount {
    return 4;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    return [@[self.menuButton, self.firstButton, self.secondButton, self.thirdButton] objectAtIndex:index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    return [@[self.menuButton, self.firstButton, self.secondButton, self.thirdButton] indexOfObject:element];
}
The view I am having this issue with is defined in an Interface Builder storyboard. As you can no doubt infer from the code, it has a menu button and three other buttons as subviews.

What you are describing is the built-in behavior for VoiceOver and can't be changed on a per-app basis.
If you want to modify the order in which elements are focused, look at the UIAccessibilityContainer protocol on iOS 7, or the accessibilityElements property of NSObject on iOS 8. If you don't want to implement either of those, you can also simply set accessibilityElementsHidden to YES for elements you want VoiceOver to ignore.
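If it helps, here is a minimal sketch of the iOS 8 route, done in the view controller that owns the buttons (it replaces the three container methods shown in the question rather than adding to them; the outlet names are assumed from the question):

- (void)viewDidLoad {
    [super viewDidLoad];
    // iOS 8+: spell out the VoiceOver focus order explicitly.
    self.view.accessibilityElements = @[self.menuButton,
                                        self.firstButton,
                                        self.secondButton,
                                        self.thirdButton];
    // To have VoiceOver ignore everything inside a purely decorative container:
    // self.decorativeView.accessibilityElementsHidden = YES; // hypothetical view
}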

I have fixed the problem by adding accessibility labels to the buttons in the storyboard. Because VoiceOver already spoke their titles correctly, I had not bothered to do so before.
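For anyone who prefers to do the same in code rather than in the storyboard, the equivalent is a one-liner per button (the label text below is just an example):

// Equivalent of filling in the Accessibility "Label" field in the storyboard.
self.menuButton.accessibilityLabel = @"Menu";         // placeholder text
self.firstButton.accessibilityLabel = @"First option"; // placeholder text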

Related

How to get VoiceOver to announce section labels in iOS?

In the iPhone Weather app, when using VoiceOver, I noticed that tapping into a section for the first time announces the section.
For example, in iOS 9, tapping on any item in the middle strip for the first time will announce "Hourly forecasts" before continuing to describe the element you tapped on. Tapping anything else in the strip will not announce "Hourly forecasts".
Tapping on anything in the bottom table will announce "Daily forecasts" before continuing to describe the element you tapped on. Tapping anything else in this table will not prefix with "Daily forecasts".
Is there a simple API to name sections of your app? Or do you have to do this manually by tracking the VoiceOver cursor and dynamically changing your label? (Does this even work? Can you change the accessibilityLabel after something is tapped but before it is read?)
There are two approaches, I guess. The first is subclassing UITableViewCell and overriding accessibilityLabel:
- (NSString *)accessibilityLabel
{
    NSMutableString *voiceOverString = [NSMutableString string];
    // Append the section title to voiceOverString, then the element's own value.
    return voiceOverString;
}
See Apple's docs: the second approach is to set the accessibilityLabel of the cell from tableView:cellForRowAtIndexPath:. The example there is for the Weather app itself.
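A rough sketch of that second approach, using hypothetical helpers for the row text and the section titles:

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell" forIndexPath:indexPath];
    cell.textLabel.text = [self textForRowAtIndexPath:indexPath];           // hypothetical helper
    NSString *sectionTitle = [self sectionTitleForIndex:indexPath.section]; // hypothetical helper, e.g. "Hourly forecasts"
    // Prefix the spoken description with the section name, like the Weather app does.
    cell.accessibilityLabel = [NSString stringWithFormat:@"%@, %@", sectionTitle, cell.textLabel.text];
    return cell;
}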
Is there a simple API to name sections of your app?
It seems like the most appropriate reference is Apple's Accessibility Programming Guide.
And its API, Apple's UIAccessibility Documentation.
Setting the shouldGroupAccessibilityChildren property seems like the best way to accomplish your goal. The linked API describes it as:
A Boolean value indicating whether VoiceOver should group together the elements that are children of the receiver, regardless of their positions on the screen. Setting the value of this property to YES on the parent view of the items in the vertical columns causes VoiceOver to respect the app’s grouping and navigate them correctly.
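For the Weather-style layout described above, a minimal sketch might look like this (hourlyStripView and dailyForecastTable are made-up names for the two sections):

// Group each section's elements under its container so VoiceOver navigates them together.
self.hourlyStripView.shouldGroupAccessibilityChildren = YES;   // the middle strip
self.dailyForecastTable.shouldGroupAccessibilityChildren = YES; // the bottom table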
Things to keep in mind:
Is the target element an accessibility element? You can check using the isAccessibilityElement property; standard UIKit controls and views adopt the UIAccessibility protocol by default.
If so, you just need to set its accessibility attributes.
If not, you need to set it yourself: view.isAccessibilityElement = true.
The accessibilityLabel property identifies the element.
The accessibilityHint property describes the action an element triggers.
You can set accessibility attributes in the storyboard,
or you can set them in the implementation of your view subclass, as sketched below.
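A minimal sketch of the in-code route, assuming a custom UIView subclass (the label and hint strings are placeholders):

- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        self.isAccessibilityElement = YES;                            // expose the custom view to VoiceOver
        self.accessibilityLabel = @"Temperature dial";                // what the element is (placeholder text)
        self.accessibilityHint = @"Adjusts the target temperature";   // what it does (placeholder text)
    }
    return self;
}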

Accessibility on UIStepper

I know that VoiceOver can be a bag of hurt when used with a UIStepper, because the UIButtons it contains can't be customised. However, I use this control to change a value displayed in a label.
I don't want to insert a new control just for VoiceOver, and subclassing a control like UIStepper doesn't seem like a good solution. Any ideas for implementing VoiceOver with this interface?
You could have a view which wraps your label(s) and stepper together and which deals with the accessibility. The subviews are all disabled for accessibility, and the wrapper view presents the text in the labels and the current value of the stepper, and provides a swipe-based interface to increment and decrement the stepper. So, overall, the wrapper view would work like a slider.
An interface containing a stepper can be nicely vocalized and presented by VoiceOver using the adjustable trait.
A complete explanation, including illustrations and code snippets (Objective-C and Swift), is detailed here.
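A minimal sketch of that adjustable approach, assuming a wrapper view with stepper and valueLabel subviews (all names and strings are placeholders):

@interface StepperWrapperView : UIView
@property (nonatomic, strong) UIStepper *stepper;
@property (nonatomic, strong) UILabel *valueLabel;
@end

@implementation StepperWrapperView

- (BOOL)isAccessibilityElement {
    return YES; // the wrapper is presented as a single element, hiding its subviews
}

- (UIAccessibilityTraits)accessibilityTraits {
    return UIAccessibilityTraitAdjustable; // lets VoiceOver users swipe up/down to adjust
}

- (NSString *)accessibilityLabel {
    return @"Quantity"; // placeholder description of what the stepper controls
}

- (NSString *)accessibilityValue {
    return self.valueLabel.text; // the value VoiceOver speaks after the label
}

- (void)accessibilityIncrement {
    self.stepper.value += self.stepper.stepValue;
    [self.stepper sendActionsForControlEvents:UIControlEventValueChanged];
}

- (void)accessibilityDecrement {
    self.stepper.value -= self.stepper.stepValue;
    [self.stepper sendActionsForControlEvents:UIControlEventValueChanged];
}

@end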

Prevent UISegmentedControl segment selection on focus on tvOS

I'm working on a simple UI in a tvOS app and I'm facing a strange problem.
When a UISegmentedControl gets focused, you can move the focus around and it automatically changes the selected segment. But what I'm looking for is a way to change the selected segment only when the user taps it, not when they focus it.
Any idea?
Thanks in advance.
You need to have your own internal variable for the selected segment and only change its value when the select button is pressed (which you can detect using a gesture recognizer). When the segmented control loses focus (detectable in didUpdateFocus), you assign the value of your internal variable to the control's selected index.
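A rough sketch of this suggestion as a UISegmentedControl subclass (all names are made up; see also the caveat about focus updates in the last answer of this section):

@interface LockedSegmentedControl : UISegmentedControl
@property (nonatomic, assign) NSInteger committedIndex;
@end

@implementation LockedSegmentedControl

- (void)awakeFromNib {
    [super awakeFromNib];
    self.committedIndex = self.selectedSegmentIndex;
    // Only commit a segment when the Siri Remote's select button is pressed.
    UITapGestureRecognizer *select = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                             action:@selector(commitSelection)];
    select.allowedPressTypes = @[@(UIPressTypeSelect)];
    [self addGestureRecognizer:select];
}

- (void)commitSelection {
    self.committedIndex = self.selectedSegmentIndex;
}

- (void)didUpdateFocusInContext:(UIFocusUpdateContext *)context
       withAnimationCoordinator:(UIFocusAnimationCoordinator *)coordinator {
    [super didUpdateFocusInContext:context withAnimationCoordinator:coordinator];
    if (context.nextFocusedView != self) {
        // Focus left the control: revert to the last committed segment.
        self.selectedSegmentIndex = self.committedIndex;
    }
}

@end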
You need to subclass UISegmentedControl then override didUpdateFocusInContext. In the "Custom Class" field in IB use the name of your custom class.
You can subclass UISegmentedControl and disable the behavior by defining:
@objc func _selectFocusedSegment() {
    print("select focused segment")
}
Beware that this solution is a hack. As far as I know there is no good, clean way to accomplish what you want short of steering clear of UISegmentedControl.
Also know that when a UISegmentedControl 'changes focus' between segments, it does not actually change focus. So hooking into focus updates like Nostradamus is suggesting will not work. To the focus engine UISegmentedControl behaves like a single large focusable element, not like a group of focusable segments. You can see this for yourself by debug inspecting a UIFocusUpdateContext on focusing towards or away from a UISegmentedControl.
I stumbled onto _selectFocusedSegment by defining a UISegmentedControl subclass and debug logging the various NSObject.perform methods, among others. My intent was to reverse engineer how UISegmentedControl retains a sticky last focused item, which is quite difficult to do on Apple TV. I was not able to find out exactly how UISegmentedControl manages focus, but I was able to find the answer to your question along the way.

UIAccessibility - containers

There's a "Containers" rotor option in Voiceover which allows the user to quickly navigate through "high level" sections of the screen via single finger swipe up and swipe down actions. For example, the Calendar app has three high level items: navbar, contents and toolbar.
My app uses custom UIView subclasses and, no matter what I try to do, all my views seem to belong to a single container. I can't split them into logical sections. I tried putting them in separate views implementing the UIAccessibilityContainer protocol and setting a few of the accessibility properties on the parent views.
Does anyone know how to create multiple containers?
I did some digging on this issue and think it's a private trait Apple is using. First I noticed the only containers recognized are standard UIKit-type objects like UITableViews, UITabBars, UINavigationBars, etc. So next I used the debugger to inspect the value of the accessibility traits for these components. They're all 0x200000000000. Just to be sure I didn't miss a UIAccessibilityTrait, I checked all of their values. None of them match the value. Furthermore, if you set your view's accessibility traits to this mysterious value, it'll work just like you want! I tried determining the location of this constant but didn't have much luck. If you want to do more digging, it looks like Apple stores accessibilityTraits using an NSObject category that uses associated objects with some constant value named AXTraitsIdentifier.
Practically speaking, you could do something like the code below, but since it's not defined in a public API, its functionality could change in the future.
// Note: the navBar has to be run through a VoiceOver pass before the value is set :( or you can just set the value directly to 0x200000000000.
myContainerView.accessibilityTraits = navBar.accessibilityTraits;
I'd love to hear if anyone else has info on this. So far I haven't found an ideal solution.
I have been able to make the views in my app reachable via single-finger swipe up and swipe down actions when the "Containers" rotor option is selected, by setting the accessibilityContainerType property of my views to semanticGroup.
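For reference, a minimal sketch of that approach (the view names are placeholders; the semanticGroup value requires a recent iOS, 13+ as far as I recall):

// Each logical section becomes its own "container" for the VoiceOver rotor.
self.headerView.accessibilityContainerType = UIAccessibilityContainerTypeSemanticGroup;
self.contentView.accessibilityContainerType = UIAccessibilityContainerTypeSemanticGroup;
self.toolbarView.accessibilityContainerType = UIAccessibilityContainerTypeSemanticGroup;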

How can I add to the iOS VoiceOver rotor for a custom view?

Recently, I've been working to get my application functioning well with VoiceOver. Generally it's been simple and straightforward, but there are some behaviors from system apps that I'd like to emulate, and I'm having a hard time locating the API to set things up.
In particular, I'm interested in adding a couple of options to the VoiceOver "rotor" and responding to them when the user increases and decreases the value. However, despite the fact that apps like Apple's Maps app add items to the rotor and are able to respond, I can't figure out how to do so for my app.
Has anyone succeeded in doing this? And if so, how?
With iOS 8, you can use the -accessibilityCustomActions method to return an array of UIAccessibilityCustomAction objects, representing the actions you'd like to present "rotor-style".
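As a sketch, a custom action exposed from a view or cell subclass might look like this (the action name and handler are placeholders):

- (NSArray<UIAccessibilityCustomAction *> *)accessibilityCustomActions {
    UIAccessibilityCustomAction *deleteAction =
        [[UIAccessibilityCustomAction alloc] initWithName:@"Delete" // placeholder name spoken in the Actions rotor
                                                   target:self
                                                 selector:@selector(handleDeleteAction)];
    return @[deleteAction];
}

- (BOOL)handleDeleteAction {
    // Perform the action here; return YES on success so VoiceOver reports it succeeded.
    return YES;
}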
UPDATE: iOS 10 finally adds the ability to add custom rotor items to VoiceOver (not the same thing as the "Actions" rotor item): just add an array of UIAccessibilityCustomRotor objects to the accessibilityCustomRotors property of the appropriate container view.
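Here is a hedged sketch of such a rotor on a view controller, cycling through a hypothetical self.resultViews array:

- (void)viewDidLoad {
    [super viewDidLoad];
    // iOS 10+: a custom rotor that jumps between the views in self.resultViews.
    UIAccessibilityCustomRotor *rotor = [[UIAccessibilityCustomRotor alloc]
        initWithName:@"Results" // the name VoiceOver speaks in the rotor; placeholder
        itemSearchBlock:^UIAccessibilityCustomRotorItemResult *(UIAccessibilityCustomRotorSearchPredicate *predicate) {
            NSArray<UIView *> *items = self.resultViews; // hypothetical array of elements
            UIView *current = (UIView *)predicate.currentItem.targetElement;
            NSUInteger index = current ? [items indexOfObject:current] : NSNotFound;
            BOOL forward = (predicate.searchDirection == UIAccessibilityCustomRotorDirectionNext);
            NSInteger next = (index == NSNotFound) ? 0 : (NSInteger)index + (forward ? 1 : -1);
            if (next < 0 || next >= (NSInteger)items.count) {
                return nil; // no further item in that direction
            }
            return [[UIAccessibilityCustomRotorItemResult alloc] initWithTargetElement:items[next]
                                                                           targetRange:nil];
        }];
    self.view.accessibilityCustomRotors = @[rotor];
}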
OLD ANSWER:
There is currently no API to add your own rotor items. You can only implement how some of the existing rotor items work:
"Adjust value" - here you should return UIAccessibilityTraitAdjustable trait for accessibilityTraits and then implement the accessibilityIncrement/accessibilityDecrement methods
"Headings" - you mark some views as UIAccessibilityTraitHeader, then those should be the view the user moves through when the user rotates to "Headings" and flicks up/down
OLD UPDATE: "Actions" - see UIAccessibilityCustomAction
I guess you should file a radar if you need to add custom items to the rotor.

Resources