I’m trying to ascertain what exactly happens differently when posting a UIAccessibilityLayoutChangedNotification versus a UIAccessibilityScreenChangedNotification. From what I can see, I can use them interchangeably everywhere and nothing different happens.
The Apple documentation simply says to use LayoutChanged when (for example) an element has been hidden or shown, and to use ScreenChanged if the entire screen changes, but I’m interested in what THEY do when I provide this information, and what I should see differently when using one or the other.
Can anyone give a clear explanation of implementation differences between the two?
These two notifications exist to communicate dynamic content changes in your views to VoiceOver for screen reader users. There is little difference between them, except for their default behavior and the silly little "boop beep" that accompanies ScreenChanged notifications.
In both instances, the argument
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, arg);
represents a string to be read out, or an on-screen element to which VoiceOver will shift its focus. In the event of dramatic context changes, it is important to send focus to a place that makes sense, or to announce that such changes have taken place. Either approach is acceptable from an accessibility point of view, though I prefer approaches that involve the least amount of change possible. In the event of simple layout changes, it is almost always best to just announce the context change and leave focus where it was. Sometimes, though, the element that caused the context change is hidden, and then it is clearly necessary to direct VoiceOver to highlight new content, because the default behavior in this case is undefined, or perhaps deterministic, but determined by a framework that knows absolutely nothing about your app!
The difference between the two notifications, given that they otherwise do exactly the same thing, is in their default behavior. If you supply nil to UIAccessibilityLayoutChangedNotification, it is as if you have done nothing. If you supply a nil argument to UIAccessibilityScreenChangedNotification, it will send focus to the first object in your view hierarchy that is marked as an accessibility element, once all view hierarchy changes and drawing are complete.
UIAccessibilityLayoutChangedNotification
A good use case example for UIAccessibilityLayoutChangedNotification is dynamic forms. You want to let users know that, based on decisions they've made in the form, new options are available. For example, if in a form you select that you are a Veteran, additional areas of the form may pop up to provide more input, but these areas may have been hidden from other users who did not care about them. So you could shift focus to these elements after user interaction:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, firstNewFormElement);
Which would shift focus to the provided element and announce its accessibilityLabel.
Or just tell them that the new form elements are there:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"Veterans form elements available");
Which would leave focus where it is, but VoiceOver would announce "Veterans form elements available".
Note: This particular behavior is bugged on my iPad (8.1.2).
Or finally you could do this:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
Which does absolutely nothing :). Seriously, I don't even think the a11y framework backend cares. This particular line of code is a complete waste!
UIAccessibilityScreenChangedNotification
A good use case example for the UIAccessibilityScreenChangedNotification is a customized tabbed browsing situation, when the entire screen, with the exception of your navigation area, changes. You want to let VoiceOver know that essentially the entire screen changed, but NOT to focus the first element (your first tab); instead, focus the first content element.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstNonGlobalNavElement);
Which would play the "boop beep" sound and then shift focus to just beneath your global navigation bar. Or you could do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, @"You're on a new tab");
Which would wait for the new tab to load, play the "beep boop" sound, announce "You're on a new tab" in voiceover, then shift focus to the first element on the screen, then announce the accessibilityLabel for that element. (PHEW! That's a lot! This is jarring for screen reader users. Avoid this scenario, unless absolutely necessary).
And finally you can of course do this:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Which is equivalent to:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstA11yElement);
Both of which will play the "beep boop" sound, shift VoiceOver focus to the first element on the screen, and then announce it.
Finally
In a comment somebody mentioned caching, and I occasionally comment in my answer about things the A11y backend may or may not care about. While it is certainly possible that there is some backend magic happening, I don't believe that in either of these circumstances the backend cares at all. The reason I say this is:
If you've ever used the UIAccessibilityContainer protocol, you can watch as your container of views gets queried. There is no caching going on. Even the accessibilityElementCount property gets pinged each time VoiceOver changes focus to a new accessibility element within your container. Then it goes through the process of checking which element it is on, asking for the next element, and so on. It is designed at its core to handle dynamic situations. If you were to insert a new element into your container after interaction, it would still go through all of these queries and be just fine with it. Furthermore, if you override the properties of the UIAccessibility protocol in order to provide dynamic hints and labels, you can see that those getters get called every time as well! As such, I believe that the A11y framework backend gleans ABSOLUTELY ZERO information from these notifications. The only information VoiceOver needs to do its job is its currently focused accessibility element and that element's accessibility container. The notifications are simply there for you to make your app more usable for VoiceOver users.
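For illustration, here is a minimal, hypothetical container (all names made up) that makes those repeated queries visible: put logging in the UIAccessibilityContainer methods and watch them fire every time VoiceOver moves focus.

#import <UIKit/UIKit.h>

@interface MyContainerView : UIView
@property (nonatomic, strong) NSArray *containedAccessibilityElements;
@end

@implementation MyContainerView

- (BOOL)isAccessibilityElement {
    // The container itself is not an element; its children are.
    return NO;
}

- (NSInteger)accessibilityElementCount {
    NSLog(@"accessibilityElementCount queried");
    return self.containedAccessibilityElements.count;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    NSLog(@"accessibilityElementAtIndex: %ld", (long)index);
    return self.containedAccessibilityElements[index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    NSLog(@"indexOfAccessibilityElement:");
    return [self.containedAccessibilityElements indexOfObject:element];
}

@end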
Imagine if this weren't the case how many times Safari would post these notifications!!!! :)
These particular statements can only be confirmed by someone with backend knowledge of the framework, who works with the code, and should be viewed as conjecture. It could be the case that this is highly version/implementation dependent. Definitely open to discussion on these points! The rest of this post is pretty concrete.
For Your Reference
Most of this comes from experience working with the frameworks, but here are some useful references if you wish to dig further.
https://developer.apple.com/documentation/uikit/accessibility/uiaccessibility
https://developer.apple.com/documentation/uikit/uiaccessibilitylayoutchangednotification
https://developer.apple.com/documentation/uikit/uiaccessibilityscreenchangednotification
And finally, an open source repo of the silly little app I put together to test all this stuff.
https://github.com/chriscm2006/IOS-A11y-Api-Test
UIAccessibilityScreenChangedNotification is to indicate that the whole screen has changed and VoiceOver should reset.
UIAccessibilityLayoutChangedNotification is to indicate that one or more, but not all, elements on the screen have changed.
Post UIAccessibilityScreenChangedNotification when your UI changes dramatically, usually when a user moves into a different part of your app (navigates to a different screen). VoiceOver notifies the user with a tone, clears its caches, and does other preparations to deal with a new set of accessibility data. It alerts VoiceOver that the screen has changed and there may be new elements on screen, so VoiceOver will rebuild its index of accessibility elements.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
Post UIAccessibilityLayoutChangedNotification if some part of your UI changes, but the user hasn't necessarily jumped to an entirely different part of your app. (Example: in the iTunes Store app, tapping the price label ($0.99, etc.) next to a song changes it to a "Buy" button.) This notification tells VoiceOver to re-read the current state of all accessible items that are on screen; by doing this it figures out what has changed and informs the user of those changes. It alerts VoiceOver that the layout has changed and that its current index is out of date because the items on the screen have reordered themselves.
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
Related
I am writing an iOS app (just iOS for now, so I don't need to consider other platforms at the moment) in React Native.
I have a screen in my app that is a ScrollView of items retrieved from a server, and I'd like to mark each item as "read" as it passes out of the top of the screen for the first time (i.e. mark-as-read-on-scroll type functionality). Once an item has been marked as read, you can mark it as unread, but only through other actions not related to scrolling.
I cannot for the life of me figure out a good way to do this. Ideally the items themselves would be able to detect whether or not they have disappeared off the top of the screen and just update the server that way, but I can't seem to find if that's possible (I easily could have missed something in the docs but I don't think so).
At the moment my solution is to calculate how far down the ScrollView is, divide that by the height of each item (which is static for now... I don't know what I'll do when it becomes not static, if ever), and that's how many items I need to mark as read. At that point I do logic to determine if the local item has already been marked as read, and if not I update the local item and send an update to the server.
A previous solution was to just update the server on each item, but that seemed like it got out of hand too quickly because you can scroll pretty fast and each item needs to be marked as read accurately.
The server api calls are idempotent, so sending multiple updates for the same item, while not great, is also not the end of the world. Also, I am running this in the emulator on my Mac and I haven't yet tested it with a real device (I have one, but I am still in kind of early stages of development).
I am happy to provide any other information needed!
The onViewableItemsChanged prop will return a list of items whose visibility has changed. Keep in mind that this visibility is decided by the viewabilityConfig prop.
https://reactnative.dev/docs/sectionlist#onviewableitemschanged
I'm trying to make my app more accessible and so far the standard accessibility things like labels and hints are doing wonders. I'm hitting a problem however with dynamically updating content that's displayed in a UITableView.
Each row of the table updates every second or so, but if I try to set each cell's accessibilityLabel at that point, VoiceOver keeps interrupting itself: as the selected label's contents change, the system just starts reading the label content from the beginning again. (An odd quirk: VoiceOver sometimes works correctly for the first cell that was selected, but upon selecting a new cell this bug returns.)
I've tried to see if there's any way to understand whether VoiceOver is currently speaking, but as far as I can see there is only a notification posted when VoiceOver finishes an announcement:
UIAccessibilityAnnouncementDidFinishNotification
There's no equivalent notification for when VoiceOver begins. So there's no way for my TableViewController to know that VoiceOver is currently active and that it shouldn't update any accessibilityLabels.
I'd hoped I could at least detect that one of my TableView cells was the selected accessibilityElement using the
accessibilityElementIsFocused
method. However in all my testing I've not been able to see this reliably fire for a custom UITableViewCell.
I also tried implementing the getter for accessibilityLabel for my custom cell hoping this may work, but sadly the same behaviour occurs.
The only solution I'm left with is a user-configurable frequency for dynamic content accessibility updates (say 5, 10, or 20 seconds), which would block me from updating my label until I know the last changed content would definitely have been read out. Even this could be interrupted: if the user selected a cell, say, 8 seconds after the last update (2 seconds before a 10-second limit), the label would update and the VoiceOver reading would restart.
Has anyone any ideas on how best to handle this dynamically updating content? I'm presuming the table view cells are complicating matters a little, but in general I just don't understand how Apple expects you to handle dynamic content. All it needs to solve this is another notification
UIAccessibilityAnnouncementDidStartNotification
Or even better a method to enquire as to whether VoiceOver is currently active. But I don't seem to be able to find any!
Thanks for your time, would really appreciate any tips on this. Cheers!
You want to do two things. First, take advantage of the "Updates Frequently" trait (UIAccessibilityTraitUpdatesFrequently). This should improve the behavior of the app while the content is updating.
This should help a lot. You also need to provide a way for the user to halt the updating content. Independent of whether you do the above, this is an absolute requirement to satisfy WCAG 2.0 guideline 2.2.2.
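A minimal sketch of the first point, assuming a plain UITableViewCell and an illustrative configuration method (neither comes from the question):

// Mark the cell as frequently updating so VoiceOver coalesces the changing
// value instead of restarting the announcement every time it ticks over.
- (void)configureCell:(UITableViewCell *)cell withValue:(NSString *)value {
    cell.textLabel.text = value;
    cell.accessibilityLabel = value;
    cell.accessibilityTraits = cell.accessibilityTraits | UIAccessibilityTraitUpdatesFrequently;
}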
I'm building an iOS app with bilingual content; the user will be able to switch between languages at any point and the content will be updated to the selected language. What is the best way to keep track of all of my UIView components to facilitate the language switch?
The options I see are:
Make each element that could possibly change a property of my ViewController;
Give each element a tag and grab the elements with viewWithTag when required;
Throw out the whole container view and rebuild it from scratch.
To be honest, none of these 3 options sound ideal; are there other options I haven't thought of? What is considered the best way to keep track of multiple elements?
Thanks,
Edit 3 Mar 2014
More details.
The app will have two versions of all the dynamic content, French and English. Only one language will be displayed at a time, but the user will be able to switch languages at any time and all of the displayed content should be updated.
App description.
The app is part of an installation and will be run in kiosk mode with a landscape orientation. The left-hand 2/3 of the screen will be a horizontal scroll view through which the user can page through the content. When the user sees content that they wish to investigate further, a vertical scroll view will slide up from the bottom and fill the right-hand 1/3 of the screen. The user will then be able to page through details on the content. It is this dynamic detail content that I am currently working on changing according to language.
What I'd do is subclass all "basic" components (UILabel, UIButton, and others) and make them respond to a global, custom NSNotification sent by your controllers when the user switches languages, using the global notification center ([NSNotificationCenter defaultCenter]). That way, every component gets a chance to update.
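A rough sketch of that idea, assuming a custom notification name and per-language .lproj string tables (both assumptions, not from the question):

#import <UIKit/UIKit.h>

// Hypothetical notification name; post it (with the new language code as
// the object) from whatever controller handles the language switch.
static NSString * const MYLanguageDidChangeNotification = @"MYLanguageDidChangeNotification";

@interface MYLocalizedLabel : UILabel
@property (nonatomic, copy) NSString *textKey; // key into your Localizable.strings
@end

@implementation MYLocalizedLabel

- (instancetype)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(languageDidChange:)
                                                     name:MYLanguageDidChangeNotification
                                                   object:nil];
    }
    return self;
}

- (void)languageDidChange:(NSNotification *)note {
    // Assumes the new language code ("en" / "fr") travels as the notification
    // object, and that you ship en.lproj / fr.lproj string tables.
    NSString *path = [[NSBundle mainBundle] pathForResource:note.object ofType:@"lproj"];
    NSBundle *bundle = path ? [NSBundle bundleWithPath:path] : [NSBundle mainBundle];
    self.text = [bundle localizedStringForKey:self.textKey value:nil table:nil];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end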
In a recent localisation attempt we just made every visible string an outlet and defined them with NSLocalizedString; then we had plist dictionaries for each language. It takes a while to get all set up, but it is relatively easy to maintain once it's there!
Not saying that's the best way, but that's what we did!
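A minimal sketch of that setup (the outlet and key names are only illustrative):

// Every visible string is an outlet; its text is assigned through
// NSLocalizedString, which resolves against the per-language string tables.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.titleLabel.text = NSLocalizedString(@"welcome_title", @"Title on the welcome screen");
    self.subtitleLabel.text = NSLocalizedString(@"welcome_subtitle", @"Subtitle on the welcome screen");
}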
https://developer.apple.com/internationalization/
That link may help you! :)
I have a long text on my view; when I tap on it, VoiceOver reads the text.
Is there a default behavior to stop VoiceOver reading?
If not, is there a way to do it programmatically, for example when the view receives a tap?
Thanks in advance.
Without knowing the content or the interface it is difficult to give a solid answer to this question, but one way to approach this problem is to try not to think of the experiences of a VoiceOver user and any other user as different experiences in the first place.
If you don't want VoiceOver users to repeatedly hear a long string of text you probably are also making the assumption that other users are going to be skipping over it after they have read it once as well.
Consider altering your interface so that the information is only presented once in a flow or is only presented when the user needs it and requests it, like contextual help.
Again, not knowing the interface or the purpose of the text makes it hard to answer this question directly but I generally find that building one interface to be inclusive of everyone often helps to point out that what might be perceived as just an Accessibility concern is actually a broader user experience concern and not just confined to the VoiceOver interface.
I hope that helps a little bit.
This may not be a unique question, but I don't know how to phrase it because my Google skills are insufficient.
I am writing an app framework and I am spending a lot of time writing a system to store the positions and other properties of UIView elements on the screen.
What I want to know is whether something like this already exists, since even though my system works well, I am concerned about memory usage with a large number of elements.
Basically, I subclassed UIControl and added a "state" property that stores position, alpha, colorscheme data and all forms of transforms that apply to a certain state in the interface. This is akin to actors positioned on a stage. When a scene change happens (a button is pressed, or something) the actors (UIViews) know exactly where to be and how to look from stored data.
This means that with a single button press and NSNotification, I can broadcast a simple integer that identifies the state to be in, and all necessary properties will be animated accordingly.
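Roughly, the pattern looks like this (every name and dictionary key below is hypothetical):

#import <UIKit/UIKit.h>

// Hypothetical notification: its object is an NSNumber identifying the scene/state.
static NSString * const MYSceneDidChangeNotification = @"MYSceneDidChangeNotification";

@interface MYActorControl : UIControl
// Keyed by @(stateId); each value stores the properties for that state,
// e.g. @{ @"frame" : [NSValue valueWithCGRect:someRect], @"alpha" : @1.0 }.
@property (nonatomic, strong) NSMutableDictionary *stateTable;
@end

@implementation MYActorControl

- (instancetype)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        _stateTable = [NSMutableDictionary dictionary];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(sceneDidChange:)
                                                     name:MYSceneDidChangeNotification
                                                   object:nil];
    }
    return self;
}

- (void)sceneDidChange:(NSNotification *)note {
    NSDictionary *state = self.stateTable[note.object]; // note.object is @(stateId)
    if (!state) { return; }
    [UIView animateWithDuration:0.3 animations:^{
        self.frame = [state[@"frame"] CGRectValue];
        self.alpha = [state[@"alpha"] floatValue];
    }];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

@end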
Am I wasting my time? Does something like this already exist? It does not appear to be included in the tools that Apple provides.