I am working on an iOS application related to VoiceOver. My question is: when VoiceOver is enabled, how can I get the swipe gestures left, right, up and down? What are the functions for detecting these in Swift?
First of all, you need to let VoiceOver know about your view (or another element). So if you are in a view controller, this should work: self.view.isAccessibilityElement = true
Second, you need to let VoiceOver know that your view will handle user interactions on its own: self.view.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction. After that, your view should start receiving gesture notifications.
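To make that concrete, here is a minimal sketch assuming a plain UIViewController; the handleSwipe(_:) handler and the gesture recognizers are my own additions, and the trait constant is the pre-iOS 12 spelling used in this answer.

import UIKit

class GesturesViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Expose the view to VoiceOver and let it handle touches directly.
        view.isAccessibilityElement = true
        view.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction

        // With direct interaction enabled, ordinary gesture recognizers fire
        // even while VoiceOver is running.
        let directions: [UISwipeGestureRecognizerDirection] = [.left, .right, .up, .down]
        for direction in directions {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        if recognizer.direction == .left {
            print("swiped left")
        } else if recognizer.direction == .right {
            print("swiped right")
        } else if recognizer.direction == .up {
            print("swiped up")
        } else {
            print("swiped down")
        }
    }
}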
Here's another relevant answer: https://stackoverflow.com/a/20712889/2219578
It isn't possible to catch the left, right, up and down VoiceOver swipe gestures: I've seen neither a protocol nor any kind of notification for this.
However, you can detect a scrolling action and be notified when an element gains or loses the VoiceOver focus.
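As an illustration of those two hooks, here is a rough sketch using the accessibilityScroll(_:) action and the UIAccessibilityFocus callbacks; PagingCanvasView is a hypothetical view of my own.

import UIKit

class PagingCanvasView: UIView {

    // Called when a VoiceOver user performs the three-finger scroll gesture.
    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        switch direction {
        case .left:
            print("VoiceOver scrolled left")
        case .right:
            print("VoiceOver scrolled right")
        case .up:
            print("VoiceOver scrolled up")
        case .down:
            print("VoiceOver scrolled down")
        default:
            return false
        }
        // Tell VoiceOver the scroll succeeded so it gives its usual feedback.
        UIAccessibilityPostNotification(UIAccessibilityPageScrolledNotification, nil)
        return true
    }

    // Called when the VoiceOver cursor lands on this element.
    override func accessibilityElementDidBecomeFocused() {
        super.accessibilityElementDidBecomeFocused()
        print("VoiceOver focused this view")
    }
}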
Related
I've created a custom view that acts like a UISlider - there is a "track", and a handle to "grab" to change the value. For particular reasons, I can't just make it a subclass of UISlider. I'm trying to make this slider as accessible as possible with VoiceOver. I have implemented accessibilityIncrement() and accessibilityDecrement() on my custom view, which handle the single-finger swipe up and swipe down. These change the value of the slider by 10% at a time.
However, I'd like to allow more fine-grained control, just like a non-VoiceOver slider. By default, UISlider supports double-tap and hold, and you can then drag up/down to "pan" the slider. I'd like to add exactly that to my custom view, but I can't find the correct incantation to handle the double-tap-and-hold gesture.
Is there something I can do to mimic the double tap and hold gesture from UISlider on my custom view?
Thanks very much!!!
If you want to implement this kind of new gesture for VoiceOver users, just forget it.
The recommended approach for this kind of UI control is definitely the adjustable implementation, which you have apparently already done.
I don't think it's a good idea to try to implement new VoiceOver gestures in an application: its users have their habits and may be completely lost with your custom control if they cannot handle it, unless you add a hint to explain it, but that's definitely not what I recommend anyway.
Otherwise, you could take a look at the pass-through concept introduced in the What's New in Accessibility WWDC 2017 video, which deals with the same idea but for a panning gesture...
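For reference, here is a minimal sketch of the adjustable pattern recommended above; CustomSlider, value and step are hypothetical names, and the step size is where finer-grained control would be exposed.

import UIKit

class CustomSlider: UIControl {

    var value: Float = 0.5
    var step: Float = 0.1   // a smaller step gives finer-grained VoiceOver control

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityTraits = UIAccessibilityTraitAdjustable
        accessibilityLabel = "Level"
        accessibilityValue = "\(Int(value * 100)) percent"
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // VoiceOver: single-finger swipe up while the slider is focused.
    override func accessibilityIncrement() {
        value = min(value + step, 1)
        accessibilityValue = "\(Int(value * 100)) percent"
        sendActions(for: .valueChanged)
    }

    // VoiceOver: single-finger swipe down while the slider is focused.
    override func accessibilityDecrement() {
        value = max(value - step, 0)
        accessibilityValue = "\(Int(value * 100)) percent"
        sendActions(for: .valueChanged)
    }
}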
I'm trying to make a musical keyboard UI element accessible, just like GarageBand does it. In other words, on first touch the user is told by VoiceOver that they are touching a musical keyboard, and from that point every tap on the musical keyboard view plays notes, with no further VoiceOver interruptions until the user touches outside of the musical keyboard's frame.
I have a UICollectionView where each cell represents a musical key, and when the user taps on it notes are played as expected. However, I have trouble getting this to work the way GarageBand does. For the UICollectionView object, I've set accessibilityLabel and have set accessibilityTraits to UIAccessibilityTraitAllowsDirectInteraction. But that doesn't seem to work. It doesn't play any notes when VoiceOver is on. On the first tap VoiceOver announces whatever the accessibilityLabel is set to and then just beeps on every tap.
I have a custom UIGestureRecognizer subclass that I use for the collection view cell tap detection. Do I need to do something special under these circumstances?
Any ideas? Do I need to be doing anything else?
Figured it out. Not sure what the reasoning is, however.
I embedded the musical keyboard UICollectionView in another view and made that view accessible with the UIAccessibilityTraitAllowsDirectInteraction trait. Now it works as expected.
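Here is a rough sketch of that wrapping, assuming a hypothetical keyboardCollectionView property for the existing musical-keyboard collection view:

import UIKit

class KeyboardViewController: UIViewController {

    // The existing musical-keyboard collection view (hypothetical name).
    let keyboardCollectionView = UICollectionView(frame: .zero,
                                                  collectionViewLayout: UICollectionViewFlowLayout())

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wrap the collection view in a plain container and give the container
        // the direct-interaction trait, as described above.
        let container = UIView(frame: CGRect(x: 0, y: 200, width: view.bounds.width, height: 180))
        container.isAccessibilityElement = true
        container.accessibilityLabel = "Musical keyboard"
        container.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction
        view.addSubview(container)

        keyboardCollectionView.frame = container.bounds
        container.addSubview(keyboardCollectionView)
    }
}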
I'm trying to design a custom alert view (a "properly designed" one, not a "hack"). The view should attach itself to the top of the keyboard, sliding up with the keyboard (if there is an alert) or being hidden (if there is no alert).
The view should always "stick" to the keyboard... including, for instance, when the keyboard hides. In that case, the view should slide right down, out of sight, along with the keyboard.
Here's an example of what I'm trying to achieve (with an active alert):
I originally thought about subclassing UIAlertView, but it looks like that is not recommended. And, after experimenting a bit, this is clearly a tricky task: I've got an alert that shows up, but it has problems staying in sync with the keyboard, and I haven't found a way to make it "track" the motion of the keyboard smoothly.
Any ideas?
You can achieve this with the inputAccessoryView property of UITextField and UITextView. See the "Custom Views for Data Input" chapter in Apple's Text Programming Guide for iOS for more information.
For example, a very simple red bar above the keyboard can be added with the following code:
// A 44-point-tall red bar that will sit directly above the keyboard.
let keyboardAlertView = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 44))
keyboardAlertView.backgroundColor = UIColor.red
textField.inputAccessoryView = keyboardAlertView
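For the show/hide behaviour described in the question, one option (a sketch of my own, not from the original answer) is to attach or detach the accessory view and then call reloadInputViews() so the change takes effect while the text field is still first responder:

import UIKit

// Hypothetical helper: pass the alert view to show it above the keyboard,
// or nil to hide it again.
func setAlert(_ alertView: UIView?, on textField: UITextField) {
    textField.inputAccessoryView = alertView
    textField.reloadInputViews()   // rebuilds the input views if the field is first responder
}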
When VoiceOver is enabled, I'd like to find out if the user is performing the left/right flick action while a UIButton is selected.
There are a few methods that tell you when a specific element has received or lost focus:
accessibilityElementDidLoseFocus
accessibilityElementDidBecomeFocused
But there is nothing within UIAccessibilityAction that helps detect whether the user attempted a flick left or right.
Is there a way to find out what the user is attempting to do?
No. You should not attempt to override the left and right VoiceOver swipe gestures. If you need to adjust a value with swiping, consider implementing a custom control with the trait UIAccessibilityTraitAdjustable. If you need to support direct gesture interaction, adopt UIAccessibilityTraitAllowsDirectInteraction.
Edit: To answer your question, you might be able to watch focus change, issue a screen change notification, return new children, and focus the first. Please see my comment below about why this may be undesirable.
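Here is a rough sketch of the workaround described in that edit, with the caveat above that it may be undesirable; FlickableButton and nextElement are hypothetical names:

import UIKit

class FlickableButton: UIButton {

    // Hypothetical: the element that focus should jump to after this one.
    weak var nextElement: UIView?

    // Called when the VoiceOver cursor lands on this button.
    override func accessibilityElementDidBecomeFocused() {
        super.accessibilityElementDidBecomeFocused()
        // Announce a screen change so VoiceOver re-roots focus on another element.
        if let next = nextElement {
            UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, next)
        }
    }

    // Called when the VoiceOver cursor moves away from this button.
    override func accessibilityElementDidLoseFocus() {
        super.accessibilityElementDidLoseFocus()
        print("VoiceOver cursor left the button")
    }
}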
When VoiceOver is active on an iOS device, the single-finger swipe (left or right) gesture allows users to browse the different elements in the view. Is there a way to detect if a user used the single-finger swipe gesture when using VoiceOver?
You might be asking either of two things:
1. You want to know when the VoiceOver user successfully issued the single-finger swipe left/right gesture to VoiceOver. VoiceOver will process ("steal") the gesture from your code and do its thing (move the VoiceOver cursor to the next/previous element). The closest you can get is to receive notifications for a UIView when the VoiceOver cursor lands on it or leaves it (see the UIAccessibilityFocus protocol).
2. You want to make part of your UI not subject to VoiceOver gestures (VoiceOver will not process ("steal") gestures in this area) so that you can detect the gestures yourself (including the single-finger swipe left/right) in the standard way and handle them however you want in your app. For that, you must add the UIAccessibilityTraitAllowsDirectInteraction trait to the accessibilityTraits property of the relevant UIView (see the UIAccessibility protocol for more details). A prominent example of where this is used is GarageBand for iOS: the piano keyboard and drums have this trait so that VoiceOver users can play the instruments without turning VoiceOver off.
I ended up creating a category/extension on UIView and overriding accessibilityElementDidBecomeFocused().
Here I can get a global hook, which gets called every time the VoiceOver focus changes.
Swift example:
extension UIView {

    // MARK: Accessibility

    // Global hook: called whenever the VoiceOver cursor lands on any UIView.
    override public func accessibilityElementDidBecomeFocused() {
        super.accessibilityElementDidBecomeFocused()
        // Sends an empty event through the application as a signal that focus changed.
        UIApplication.shared.sendEvent(UIEvent())
    }
}