I have a custom control to increment and decrement values. Now that I've added support for VoiceOver, I've stumbled upon a problem.
My customView has the accessibility trait .adjustable and I implemented the correct methods for increasing and decreasing the values.
However, a VoiceOver user can also double-tap on that view to activate it. The problem is that this triggers a gesture which is irrelevant to VoiceOver users.
Is there a way to prevent an adjustable accessibility view from being activated so that the element is only adjustable, not double-tappable like a button?
There are two important APIs to know about when a double-tap occurs:
• accessibilityActivate
• accessibilityActivationPoint
In your case, you could just return true from an accessibilityActivate override, and if that's not enough, also provide a CGPoint for the accessibilityActivationPoint that triggers nothing (this depends on your custom control and its neighborhood).
Otherwise, use the accessibilityElementIsFocused instance method to know whether you can trigger actions, as this complete example shows.
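A minimal sketch of the first option, assuming the control is a UIView subclass:

override func accessibilityActivate() -> Bool {
    // Telling VoiceOver the activation was handled stops the simulated
    // double-tap from reaching the view's gesture recognizers.
    return true
}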
I ended up using UIAccessibility.isVoiceOverRunning to stop any tasks that would be triggered by a double-tap on that specific element.
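For reference, a sketch of that guard, with handleTap(_:) standing in for whatever gesture handler the view uses:

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    // VoiceOver users adjust the element with flicks up/down,
    // so skip the tap behaviour entirely while VoiceOver runs.
    guard !UIAccessibility.isVoiceOverRunning else { return }
    // ... normal tap handling ...
}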
Related
I've created a custom view that acts like a UISlider - there is a "track", and a handle to "grab" to change the value. For particular reasons, I can't just make it a subclass of UISlider. I'm trying to make this slider as accessible as possible with VoiceOver. I have accessibilityIncrease and accessibilityDecrease on my custom view that handle single finger drag up and single finger drag down. This changes the value of the slider by 10% at a time.
However, I'd like to allow more fine-grained control, just like a non-VoiceOver slider. By default, UISlider supports double tap and hold, and you can drag up/down to "pan" the slider. I'd like to add exactly that to my custom view, but I can't find the correct incantation to handle the double-tap-and-hold gesture.
Is there something I can do to mimic the double tap and hold gesture from UISlider on my custom view?
Thanks very much!!!
If you want to implement this kind of new gesture for VoiceOver users, just forget it.
The recommended gesture for this kind of UI control is definitely the adjustable implementation you apparently already have (see the sketch at the end of this answer).
I don't think it's a good idea to try and implement new VoiceOver gestures in an application: its users have their habits, and they may be totally lost with your custom control if they cannot handle it. You could add a hint to explain it, but that's definitely not what I recommend anyway.
Otherwise, you could take a look at the pass-through concept introduced in the What's New in Accessibility WWDC 2017 video, which deals with the same idea but for a panning gesture...
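For completeness, a minimal sketch of the adjustable pattern, assuming a custom view with a hypothetical value property stepped 10% at a time:

class SliderView: UIView {
    var value: Double = 0.5 {
        didSet { accessibilityValue = "\(Int(value * 100)) percent" }
    }

    override var isAccessibilityElement: Bool {
        get { return true }
        set { }
    }

    override var accessibilityTraits: UIAccessibilityTraits {
        get { return .adjustable }
        set { }
    }

    override func accessibilityIncrement() {
        value = min(1, value + 0.1)   // one-finger flick up
    }

    override func accessibilityDecrement() {
        value = max(0, value - 0.1)   // one-finger flick down
    }
}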
How does the UIActionSheet's hit detection work? When a user selects an option and then moves their finger to another option, the other option is highlighted, as seen in the GIFs below. The detection also knows when a user is scrolling.
So this is achieved by listening for multiple UIControlEvents. Chances are you're used to listening for touchUpInside as this is standard for UIButton behaviour. But there are plenty more besides that. A full list and documentation can be found here.
In your case, you want to listen for touchDragEnter and touchDown, making the callbacks from these invoke some code that changes the background colour of your button.
You should also listen for touchDragExit and touchUpInside to return the background colour to normal.
Additionally, you should run the action code in touchUpInside.
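Putting that together, a sketch of the wiring, assuming a UIButton named optionButton and hypothetical handler names:

optionButton.addTarget(self, action: #selector(highlightOption), for: [.touchDown, .touchDragEnter])
optionButton.addTarget(self, action: #selector(unhighlightOption), for: [.touchDragExit, .touchUpInside])
optionButton.addTarget(self, action: #selector(selectOption), for: .touchUpInside)

@objc func highlightOption(_ sender: UIButton) {
    sender.backgroundColor = .lightGray   // the finger is over this option
}

@objc func unhighlightOption(_ sender: UIButton) {
    sender.backgroundColor = .clear       // the finger left, or the touch ended
}

@objc func selectOption(_ sender: UIButton) {
    // perform the option's action
}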
I hope this clears things up!
UIAlertController keeps its stack of items in a UIStackView subview; this stack view is placed on a view of type _UIInterfaceActionRepresentationsSequenceView that has three gesture recognizers with these types:
• UIScrollViewDelayedTouchesBeganGestureRecognizer
• UIScrollViewPanGestureRecognizer
• _UIDragAutoScrollGestureRecognizer
You can inspect this with Xcode's built-in user interface inspector (Debug View Hierarchy).
I think custom handlers on these recognizers provide this drag-and-highlight function: UIAlertController's internal logic handles the touches, hit-tests the subviews in the stack, and sets the highlighted property to YES for the item under the user's finger and to NO for the others.
In our current UI, next to certain labels, we have a help-tip button that when clicked, explains the details of what the label references. As such, VoiceOver identifies these two items as separate accessibility items.
However, when using accessibility, we're hoping we can just do everything in the label itself. This way, when the label gets focused, the user will hear 'Account value, $20' (the accessibilityLabel), then 'double-tap for help' (the accessibilityHint).
However, unlike a button, a label doesn't have an action associated with it, so I'm not sure how to wire up actually triggering the accessibility gesture indicating I want to do something.
Short of converting all of our labels over to buttons, is there any way to listen to the accessibility 'action' method on our labels?
My current work-around is to make only the Help-tip buttons accessible, then move all the relevant information to their accessibility properties, but that seems like code smell as it's easy for a developer to miss that when updating the code.
In your UILabel subclass, override accessibilityActivate() and implement whatever double-tapping should do:
override func accessibilityActivate() -> Bool {
// do things...
return true
}
If the action can fail, return false in those instances.
Have you tried adding a UITapGestureRecognizer to the labels?
Something like:
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapResponse(_:)))
tapGesture.numberOfTapsRequired = 1
sampleLabel.isUserInteractionEnabled = true
sampleLabel.addGestureRecognizer(tapGesture)

@objc func tapResponse(_ recognizer: UITapGestureRecognizer) {
    print("tap")
}
OK, this was easier than I thought. To make a UILabel respond to accessibility actions the way a button does, you simply attach a UITapGestureRecognizer. The accessibility framework activates it just as it would on any other UIView.
let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(labelTapped))
testLabel.isUserInteractionEnabled = true
testLabel.addGestureRecognizer(tapGestureRecognizer)
Once you do that, your label will respond to accessibility actions.
Group your label and your hint button as one unique accessible element.
Once done, you can use:
The accessibilityActivationPoint property to define the hint button to be triggered when the double tap occurs.
The accessibilityActivate method to indicate the action to be done when you double tap your new element.
Given your environment, I don't recommend implementing a custom action for such a simple use case... the two solutions above should do the job.
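A minimal sketch of the grouping plus the first option, assuming a containerView that wraps the label and the help button (all names are placeholders):

containerView.isAccessibilityElement = true
containerView.accessibilityLabel = "Account value, $20"
containerView.accessibilityHint = "Double-tap for help"

// Redirect the double-tap to the help button's on-screen position.
let buttonFrame = UIAccessibility.convertToScreenCoordinates(helpButton.bounds, in: helpButton)
containerView.accessibilityActivationPoint = CGPoint(x: buttonFrame.midX, y: buttonFrame.midY)

For the second option, subclass the container view and override accessibilityActivate() as shown earlier.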
Absolutely! You can do this by using UIAccessibilityCustomActions on the accessibility element rather than tap gesture recognizers. Accessibility operates differently from normal touch handling: a single tap while the VoiceOver focus is on an element will not give you the result it gives a sighted user, and a tap recognizer will not let you expose multiple actions on the same accessibility element.
At their recent WWDC, Apple put out an excellent video explaining how to add UIAccessibilityCustomActions to any kind of accessibility element. If you start this video 33 minutes in, you will be able to see how this is implemented.
Once in place, your VoiceOver users will be able to swipe up and down through the actions and select the one that suits their intentions, thereby permitting multiple actions to be accessible from the same UILabel.
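A minimal sketch, assuming a hypothetical showHelp handler on the view controller:

accountLabel.isAccessibilityElement = true
accountLabel.accessibilityCustomActions = [
    UIAccessibilityCustomAction(name: "Show help", target: self, selector: #selector(showHelp))
]

@objc func showHelp() -> Bool {
    // present the help-tip content; return false if that fails
    return true
}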
I found that the touchDown event is kind of slow, in that it requires a firm, fairly long touch and does not respond to a light tap. Why is that?
Whereas, touchesBegan responds just when I need it to, i.e. responds even to very light, quick touches. But that's not an event but a method that can be overridden.
Problem is, touchesBegan apparently requires me to either 1) subclass a label (I need to respond to touching a label), or 2) analyze the event to figure out whether it came from the right label. I am wondering whether it's a code smell and whether there should be an event for simple touch.
Try adding a UITapGestureRecognizer to your label.
First of all, allow the label to handle user interaction:
label.isUserInteractionEnabled = true
Then assign the tap gesture to the label. Note that a discrete recognizer like a tap only calls its handler once it reaches the .ended state; if you want the instant the finger lands, use a UILongPressGestureRecognizer with minimumPressDuration set to 0 and check for .began in the handler (see the sketch below).
The cool thing about this approach is that you can use one handler for all of your labels. Inside the handler, you can get the touched label like this:
let label = tapRecognizer.view as! UILabel
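A sketch of the zero-duration long-press variant, with handlePress(_:) as a hypothetical handler name:

let press = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
press.minimumPressDuration = 0   // reports .began the instant the finger lands
label.isUserInteractionEnabled = true
label.addGestureRecognizer(press)

@objc func handlePress(_ recognizer: UILongPressGestureRecognizer) {
    guard recognizer.state == .began else { return }
    let label = recognizer.view as! UILabel
    print("touched \(label.text ?? "")")
}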
"Code smell"? No, it's a user interface smell. A user interface stink.
If you make a button in your user interface behave differently from buttons in any other application, people will hate your app. Do what people are used to.
When VoiceOver is enabled, I'd like to find out if the user is performing the left/right flick action while a UIButton is selected.
There are a few methods that tell you when a specific element has received or lost focus:
• accessibilityElementDidLoseFocus
• accessibilityElementDidBecomeFocused
But there is nothing within UIAccessibilityAction to help you find out whether the user attempted a flick left or right.
Is there a way to find out what the user is attempting to do?
No. You should not attempt to override the left and right VoiceOver swipe gestures. If you need to adjust a value by swiping, consider implementing a custom control with the trait UIAccessibilityTraitAdjustable. If you need to support direct gesture interaction, adopt UIAccessibilityTraitAllowsDirectInteraction.
Edit: To answer your question, you might be able to watch focus changes, issue a screen-change notification, return new children, and focus the first. Please see my comment below about why this may be undesirable.
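A hedged sketch of the focus-watching part of that idea, assuming a UIButton subclass:

class FocusAwareButton: UIButton {
    override func accessibilityElementDidBecomeFocused() {
        super.accessibilityElementDidBecomeFocused()
        // the VoiceOver cursor landed here; this is where you could post
        // a .screenChanged notification and redirect focus
    }

    override func accessibilityElementDidLoseFocus() {
        super.accessibilityElementDidLoseFocus()
        // the cursor moved away; one cause is a left/right flick, but
        // the API does not tell you which gesture moved it
    }
}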