Stop VoiceOver reading a text - iOS

I have a long text on my view, when I tap on it VoiceOver reads the text.
Is there a default behavior to stop VoiceOver reading?
If not, is there a way to do it programmatically? For example, when the view receives a tap.
Thanks in advance.

Without knowing the content or the interface it is difficult to give a solid answer to this question, but one way to approach this problem is to try not to think of the experience of a VoiceOver user and that of any other user as different experiences in the first place.
If you don't want VoiceOver users to repeatedly hear a long string of text you probably are also making the assumption that other users are going to be skipping over it after they have read it once as well.
Consider altering your interface so that the information is only presented once in a flow or is only presented when the user needs it and requests it, like contextual help.
Again, not knowing the interface or the purpose of the text makes it hard to answer this question directly, but I generally find that building one interface to be inclusive of everyone often shows that what might be perceived as just an accessibility concern is actually a broader user experience concern, not something confined to the VoiceOver interface.
I hope that helps a little bit.
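That said, if a programmatic interruption is still wanted, one commonly suggested approach is to post an accessibility announcement when the view is tapped: announcements interrupt the current utterance by default, so posting an empty one effectively cuts the ongoing speech short (whether this behaves identically on every iOS version is not guaranteed). A minimal Swift sketch, with an illustrative class and handler name:

import UIKit

final class LongTextView: UIView {
    // Hypothetical handler wired to a UITapGestureRecognizer on this view.
    @objc func stopVoiceOverSpeech() {
        // Announcements interrupt the current utterance by default, so an empty
        // announcement silences whatever VoiceOver is currently reading.
        UIAccessibility.post(notification: .announcement, argument: "")
    }
}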

Related

iOS accessibility: what are the pros/cons for hardcoding "double tap to activate" as hint?

iOS has built-in support for accessibility, for UIButtons it reads the title of the button followed by a hint "double tap to activate" (by default). Sometimes we are required to make a non-UIButton control behaving similar to UIButton in terms of accessibility, so we would set its accessibility trait to button and hardcode "double tap to activate" for accessibilityHint.
I don't like altering system behaviours, and I've seen accessibility users who prefer single tap instead of double tap (there's an option they can set), although I haven't checked whether, if they opt for single tap instead of double tap, the system hint becomes "single tap to activate".
What is the common practice regarding accessibility support for a non-UIButton control that is tappable? Thanks!
I've seen accessibility users who prefer single tap instead of double tap (there's an option they can set)
I'm really curious to know how that is possible using VoiceOver, because a single tap with one finger moves the accessibility focus. In the UIButton documentation, Apple states: 🤓
VoiceOver speaks the value of the title [...] when a user taps the button once.
Would you mind detailing the way to activate this option you mentioned because I'm really astonished, please? 🤔
What is the common practice regarding accessibility support for a non-UIButton control that is tappable?
Using a hint is a very good practice to provide useful information to the user, but this information mustn't be crucial for using the element, because accessibility hints may be deactivated in the device settings. 😰
Generally speaking, this kind of element must be read out in such a way that its purpose and how to use it are clear enough for any user: that's what traits are made for. 👍
Many traits are well known and give rise to different actions, like adjustable values, custom actions and rotor items.
Besides, it's also possible to use the accessibilityActivate() method to define what a one-finger double tap does on an accessible element. 🤯
The way you want to vocally expose the possible actions on a tappable control depends on the content of your application.
Finally, keep in mind that a hardcoded hint must be understood as supplementary information but definitely not as essential information, because the user can deactivate it: designing with accessibility in mind is very important when building an app. 😉
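To make those points concrete, here is a minimal Swift sketch (class and label names are purely illustrative) of a non-UIButton control exposed with the button trait and an accessibilityActivate() override, instead of a hardcoded hint:

import UIKit

// A tappable view that VoiceOver treats as a button, without hardcoding any hint.
final class TappableCardView: UIView {

    var onTap: (() -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Order summary"   // illustrative label
        accessibilityTraits = .button          // VoiceOver supplies its own default hint
        addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap)))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handleTap() {
        onTap?()
    }

    // Invoked when a VoiceOver user double-taps the focused element.
    override func accessibilityActivate() -> Bool {
        onTap?()
        return true   // true means the activation was handled
    }
}

This way the "double tap to activate" wording stays under the system's control, so it also follows whatever hint-related settings the user has chosen.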

Customising the Decimal pad in my app

Swift...
So I've got an existing app and I'm working on its appearance. The current task is customising the decimal pad that pops up when the user hits a textField.
I've looked around on how to make it, but it always seems that you have to go into the iPhone/iPad settings and add the custom keyboard.
eg. This StackOverFlow Question
and they all seem to point to this same tutorial..
iOS 8: Creating a Custom Keyboard
My problem is that I don't want the user to have to go into settings.
So the question is....IS THIS POSSIBLE?
The following pic is what I want to use. I have made this in an XIB file by adding a keyboard extension target, which creates a new folder with KeyboardViewController.swift, Info.plist and NumPad.xib. Though I think I'm on the wrong track; can someone point me in the right direction, please?
Also, does anyone know the exact dimensions this view should be, assuming what I'm asking is in fact possible? Let me know if I'm not being clear enough!
NumPad.xib(pic)
Many many thanks,
Steve
SOLUTION: Thanks to Andrea for correcting my search keywords. It led me to this Stack Question, which hopefully sends some others who have mistaken custom keyboards for custom input views to the correct end of the internet!
Sure, it is possible without going into Settings, but they are called custom input views.
You should look into inputView; here is what Apple says about them: Custom views for data input.
Basically, when the user taps a text field, instead of loading the usual keyboard it loads an inputView that you specify. Pay attention: the term "custom keyboard" is misleading, and if you google for tutorials you'll most probably find links like the ones you found.
For a practical example check this tutorial or this one; it's a little bit old, but the principles are still the same.
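For illustration, a minimal Swift sketch of the idea; NumPadView stands in for whatever view you load from NumPad.xib, and the height is just a placeholder, since the keyboard area is sized from the view's frame:

import UIKit

// Placeholder for the custom keypad; in practice it would come from NumPad.xib
// and would insert text into the active field (e.g. via a delegate or UIKeyInput).
final class NumPadView: UIView { }

final class AmountViewController: UIViewController {

    @IBOutlet private weak var amountTextField: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Any UIView assigned to inputView replaces the system keyboard when the
        // text field becomes first responder; no trip to Settings is required.
        let numPad = NumPadView(frame: CGRect(x: 0, y: 0, width: view.bounds.width, height: 260))
        amountTextField.inputView = numPad
    }
}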

UIAccessibilityTraitAllowsDirectInteraction and VoiceOver: issue or bug in iOS?

This is a quite strange behavior that has 'persecuted' me since iOS 7.0 :) I hope someone of you can help me this time! As you probably know, when you are using VoiceOver your gestures are totally different from the 'normal' way. When you need to bypass VoiceOver for a specific view, you can set its accessibility traits to UIAccessibilityTraitAllowsDirectInteraction. When the view has this trait set, the user can interact with it as usual (as if VoiceOver were not active in that particular view).
Quite often it happens that this ability is randomly lost, so VoiceOver acts in its normal way.
Has any one of you encountered this problem? Did you solve it? Fortunately, turning VoiceOver off and on seems to temporarily solve the issue (until the next time it happens).
Any idea? Thank you very much
I've seen this with other things as well. For example, notifications can be spotty, particularly Screen Changed or Content Changed notifications. I believe this happens as a result of turning VoiceOver on and off. For example, if you were to turn VoiceOver on, leave it running, and open your application as a user would, you would never experience these issues.
However, if you use the VoiceOver shortcut, or interrupt the application, re-install, and restart while using Xcode, you can disrupt VoiceOver's connection to the application. It doesn't bind correctly. So simple things like navigation work fine, but advanced features like notifications (and perhaps some of the more complicated traits) don't work.
Essentially, I would classify this as a bug, but a bug that only shows itself when you use VoiceOver in a way that only a developer would use it.
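For reference, the trait in question is normally applied like this (a minimal Swift sketch; the class name and label are illustrative):

import UIKit

// A view (for example a drawing or signature canvas) that should receive raw
// touches even while VoiceOver is running.
final class SignatureCanvasView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Signature area"
        // Direct interaction: VoiceOver passes touches straight through to this view.
        accessibilityTraits = .allowsDirectInteraction
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}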

Actual difference between UIAccessibilityLayoutChangedNotification and UIAccessibilityScreenChangedNotification?

I’m trying to ascertain what exactly happens differently when posting a UIAccessibilityLayoutChangedNotification, and a UIAccessibilityScreenChangedNotification. From what I can see, I can use them interchangeably everywhere and nothing different happens.
The Apple documentation simply says to use LayoutChanged when (for example) an element has been hidden or shown, and to use ScreenChanged if the entire screen changes, but I’m interested in what THEY do when I provide this information, and what I should see differently when using one or the other.
Can anyone give a clear explanation of implementation differences between the two?
These two notifications are for dynamic content on views, and communicating these changes to VoiceOver for screenreader users. There is little difference between these two notifications, except for their default behavior, and the silly little "boop beep" for ScreenChange notifications.
In both instances, the argument
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, arg);
represents either a string to be read out or an on-screen element that VoiceOver will shift its focus to. In the event of dramatic context changes, it is important to send focus to a place that makes sense, or to announce that such changes have taken place. Either approach is acceptable from an accessibility point of view, though I prefer approaches that involve the least amount of change possible. In the event of simple layout changes, it is almost always best just to announce the context change and leave focus where it was. Sometimes, though, the element that caused the context change is hidden, and then it is clearly necessary to direct VoiceOver to highlight new content, because the default behavior in this case is undefined, or perhaps deterministic, but determined by a framework that knows absolutely nothing about your app!
The difference between the two events, given that they both do exactly the same thing, is in their default behavior. If you supply nil to UIAccessibilityLayoutChangedNotification, it is as if you have done nothing. If you supply a nil argument to UIAccessibilityScreenChangedNotification, it will send focus to the first object in your view hierarchy that is marked as an accessibility element, once all view hierarchy changes and drawing are complete.
UIAccessibilityLayoutChangedNotification
A good use case example for UIAccessibilityLayoutChangedNotification is for dynamic forms. You want to let users know that, based on decisions they've made in the form, new options are available. For example, if in a form you select that you are a Veteran, additional areas of the form may pop up to provide more input, but these areas may have been hidden to other users who did not care about them. So you could shift focus to these elements after user interaction:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, firstNewFormElement);
Which would shift focus to the provided element and announce its accessibilityLabel.
Or just tell them that the new form elements are there:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"Veterans form elements available");
Which would leave focus where it is, but VoiceOver would announce "Veterans form elements available".
Note: This particular behavior is bugged on my iPad (8.1.2).
Or finally you could do this:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
Which does absolutely nothing :). Seriously, I don't even think the a11y framework backend cares. This particular line of code is a complete waste!
UIAccessibilityScreenChangedNotification
A good use case example for the UIAccessibilityScreenChangedNotification is customized tabbed browsing situations, when the entire screen, with the exception of your navigation area, changes. You want to let VoiceOver know that essentially the entire screen has changed, but NOT to focus the first element (your first tab); instead, focus the first content element.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstNonGlobalNavElement);
Which would play the "boop beep" sound and then shift focus to just beneath your global navigation bar. Or you could do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, @"You're on a new tab");
Which would wait for the new tab to load, play the "beep boop" sound, announce "You're on a new tab" in VoiceOver, then shift focus to the first element on the screen and announce that element's accessibilityLabel. (PHEW! That's a lot! This is jarring for screen reader users. Avoid this scenario unless absolutely necessary.)
And finally you can of course do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Which is equivalent to:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstA11yElement);
Both of which will play the "beep boop" sound, shift VoiceOver focus to the first element on the screen, and then announce it.
Finally
In a comment somebody mentioned caching, and I occasionally comment in my answer about things the A11y Backend may or may not care about. While it is certainly possible that there is some backend magic happening, I don't believe in either of these circumstances, the back end cares at all. The reason I say this is because:
If you've ever used the UIAccessibilityContainer protocol, you can watch as your container of views gets queried. There is no caching going on. Even the accessibilityElementCount property gets pinged each time VoiceOver changes focus to a new accessibility element within your container. Then it goes through the process of checking which element it is on, asking for the next element, and so on. It is designed at its core to handle dynamic situations. If you were to insert a new element into your container after interaction, it would still go through all of these queries and be just fine about it!
Furthermore, if you override the properties of the UIAccessibility protocol in order to provide dynamic hints and labels, you can also see that these functions get called every time! As such, I believe that the a11y framework backend gleans ABSOLUTELY ZERO information from these notifications. The only information VoiceOver needs to do its job is its currently focused accessibility element and this element's accessibility container. The notifications are simply there for you to make your app more usable for VoiceOver users.
Imagine if this weren't the case how many times Safari would post these notifications!!!! :)
These particular statements can only be confirmed by someone with backend knowledge of the framework, who works with the code, and should be viewed as conjecture. It could be the case that this is highly version/implementation dependent. Definitely open to discussion on these points! The rest of this post is pretty concrete.
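If you want to observe that querying yourself, a small logging container makes it visible. The sketch below is illustrative (the class name and element content are made up); it simply prints every container query VoiceOver makes as focus moves:

import UIKit

// Logs every accessibility-container query so you can watch VoiceOver re-ask for
// the elements as focus moves, with no caching in between.
final class LoggingContainerView: UIView {

    private lazy var elements: [UIAccessibilityElement] = {
        let element = UIAccessibilityElement(accessibilityContainer: self)
        element.accessibilityLabel = "Example element"   // illustrative content
        element.accessibilityFrameInContainerSpace = bounds
        return [element]
    }()

    override var isAccessibilityElement: Bool {
        get { false }   // a container is not itself an accessibility element
        set { }
    }

    override func accessibilityElementCount() -> Int {
        print("accessibilityElementCount queried")
        return elements.count
    }

    override func accessibilityElement(at index: Int) -> Any? {
        print("accessibilityElement(at: \(index)) queried")
        return elements[index]
    }

    override func index(ofAccessibilityElement element: Any) -> Int {
        print("index(ofAccessibilityElement:) queried")
        guard let element = element as? UIAccessibilityElement,
              let index = elements.firstIndex(of: element) else { return NSNotFound }
        return index
    }
}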
For Your Reference
Most of this comes from experience working with the frameworks, but here is a useful reference if you wish to dig further.
https://developer.apple.com/documentation/uikit/accessibility/uiaccessibility
https://developer.apple.com/documentation/uikit/uiaccessibilitylayoutchangednotification
https://developer.apple.com/documentation/uikit/uiaccessibilityscreenchangednotification
And finally, an open source repo of the silly little app I put together to test all this stuff.
https://github.com/chriscm2006/IOS-A11y-Api-Test
UIAccessibilityScreenChangedNotification is to indicate that the whole screen has changed and VoiceOver should reset.
UIAccessibilityLayoutChangedNotification is to indicate that one or more, but not all, elements on the screen have changed.
Post UIAccessibilityScreenChangedNotification when your UI changes dramatically, usually when the user moves into a different part of your app (navigates to a different screen). VoiceOver notifies the user with a tone, clears its caches, and does other preparations to deal with a new set of accessibility data. This notification alerts VoiceOver that the screen has changed and that there may be new elements on screen, so VoiceOver will rebuild its index of accessibility elements.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Post UIAccessibilityLayoutChangedNotification if some part of your UI changes, but the user hasn't necessarily jumped to an entirely different part of your app. (Example: in the iTunes Store app, tapping the price label ($0.99, etc.) next to a song changes it to a "Buy" button.) This notification tells VoiceOver to re-read the current state of all accessible items that are on screen, and by doing this it figures out what has changed and informs the user of those changes. It alerts VoiceOver that the layout has changed and that its current index is out of date because the items on the screen have reordered themselves.
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);

Disable iOS "Perspective Compensation" programmatically?

Maybe I'm just searching for the wrong term, but I've been able to find very little information on this subject, and I think it could be a problem for my app.
A while back, there was an article on the accuracy of the touch screens on iOS devices, and it seemed quite poor compared to other phones. Here is a link to a post about it:
http://forums.macrumors.com/showthread.php?t=1660713
Anyway, many of the commenters referred to "perspective compensation" as a cause for the inaccuracy. Basically, they are saying that iOS intentionally registers touches above the actual point of contact to compensate for the typical viewing angle of the user or for the angle of their finger or something like that. I have found that there is some credibility to that claim myself by doing as one of the commenters suggested and trying to use my iPhone upside down. I did find that it was difficult to touch things in some cases, and I have also noticed this problem in one of the apps I'm developing.
So, in case you want to skip all that rambling above, here is why it's a problem for me:
I am developing an app that is intended to be used by two people at the same time. The iPhone or iPad is placed on a surface between two people who are sitting across from one another, and they are instructed to quickly and accurately touch items on their respective halves of the screen competitively. What the article's comments made me suspect might happen, and what I have also found in practice is that the person using the phone upside down will have trouble touching buttons and dots on their first try. I've also tested slowly with a stylus and found that the touchable area of a button does indeed extend below a button, or above the button for the person using the phone upside down, hence the discrepancy and problem/disadvantage for that person.
So finally, if you want to skip that also, here is my question: Can "perspective compensation"(if that's what it's called) be disabled programmatically, and can it be done for specific views of an app? Have any of you noticed this and dealt with it in an app of yours?
While I have found that "perspective compensation" does seem to be occurring, I have not found any official documentation of it, and therefore have no idea how or if it can be disabled. When I search for "perspective compensation," the only results I find are links to the same article and comments.
I can't help but expect that this may have been asked before or is solvable with a simple checkbox, but perhaps for lack of the correct term to use, I have been unable to find any leads.
Thanks in advance for any of your solutions or suggestions!
This can't be done with the current SDK. All we have access to is the touch location, which is at a single point. Other search terms you might try are "digitizer" or "raw touch data", but there is definitely no check box or simple option.
To implement this, you will have to compensate for the touch location yourself. You'll need to play around with a compensating offset value for the upside-down buttons. Hit testing on views is probably the best place to do this, then your buttons can just respond to events as normal.
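As a starting point, one way to sketch that suggestion is a full-screen container whose hit testing nudges the touch point for the upside-down half of the screen. The class name, the direction of the nudge, and the offset value below are all assumptions to be tuned by experiment, not known-good values:

import UIKit

// Full-screen container for a two-player, face-to-face layout. Touches aimed at
// the top half (the upside-down player's controls) are nudged before hit testing
// to counteract the bias introduced by the system's touch-point adjustment.
final class TwoPlayerBoardView: UIView {

    // Assumption: the sign and size of this offset must be found by experiment.
    private let upsideDownCompensation: CGFloat = 6.0

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        var adjusted = point
        if point.y < bounds.midY {
            // Upside-down player's half of the screen.
            adjusted.y += upsideDownCompensation
        }
        return super.hitTest(adjusted, with: event)
    }
}

Because only the hit-tested point is shifted, the buttons themselves stay unmodified and respond to events as normal, which is what the answer recommends.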
