iOS - Accessibility methods - moving focus

I am working on an app and trying to make it as accessible as possible. I am trying to move focus to a certain element once an action takes place. I was curious about the difference between these two functions:
UIAccessibilityFocusedElement vs. UIAccessibilityPostNotification
If someone could explain the difference between the two it would be greatly appreciated.

UIAccessibilityPostNotification is used to change things (not only the focused element, but also pausing and resuming assistive technologies), like this:
UIAccessibility.post(notification: .pauseAssistiveTechnology, argument: UIAccessibility.AssistiveTechnologyIdentifier.notificationSwitchControl)
UIAccessibility.post(notification: .resumeAssistiveTechnology, argument: UIAccessibility.AssistiveTechnologyIdentifier.notificationSwitchControl)
It can also announce something:
UIAccessibility.post(notification: .announcement, argument: "Say something")
or refresh focus after an accessibility scroll:
UIAccessibility.post(notification: .pageScrolled, argument: nil)
On the other hand, UIAccessibilityFocusedElement can't change anything. It just returns the currently focused element (or nil), like this:
UIAccessibility.focusedElement(using: UIAccessibility.AssistiveTechnologyIdentifier.notificationVoiceOver)
On a side note: for now, the only assistive technology that can be paused or resumed is notificationSwitchControl; trying that with VoiceOver causes crashes.
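To make the difference concrete, here is a minimal sketch (statusLabel is a placeholder view of my own, not from the question) that reads the current focus with one API and moves it with the other:
// Read the element VoiceOver is currently focused on (may be nil).
let focused = UIAccessibility.focusedElement(using: UIAccessibility.AssistiveTechnologyIdentifier.notificationVoiceOver)
// Move VoiceOver focus to statusLabel if it is not already there.
if (focused as? UIView) !== statusLabel {
    UIAccessibility.post(notification: .layoutChanged, argument: statusLabel)
}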

If you are trying to move focus to an element after an action or a screen change, you should probably take a look at:
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, element_to_be_focused);
Should be posted when a new view appears that encompasses a major portion of the screen.
or
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, element_to_be_focused);
Should be posted when the layout of a screen changes, for example when an individual element appears or disappears.
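For reference, the same two calls in modern Swift (elementToFocus is a placeholder for whatever view or accessibility element you want VoiceOver to land on):
// Post after a major screen change, or after a smaller layout change.
UIAccessibility.post(notification: .screenChanged, argument: elementToFocus)
UIAccessibility.post(notification: .layoutChanged, argument: elementToFocus)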

Related

Two custom frameworks using the same framework inside them cause an issue when used in an application: "One of the two will be used. Which one is undefined" [duplicate]

I'm using Xcode 13 and making a demo with Core Data.
objc[6188]: Class _PathPoint is implemented in both
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIKitCore.framework/UIKitCore
(0x114a8fa78) and
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/TextInputUI.framework/TextInputUI
(0x12cd4a8b0). One of the two will be used. Which one is undefined.
objc[6188]: Class _PointQueue is implemented in both
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/UIKitCore.framework/UIKitCore
(0x114a8fa50) and
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/TextInputUI.framework/TextInputUI
(0x12cd4a8d8). One of the two will be used. Which one is undefined.
Apple developer Quinn “The Eskimo!” (Developer Technical Support, Apple) answered this question here:
This is not an error per se. Rather, it’s the Objective-C runtime telling you that:
Two frameworks within your process implement the same class (well, in this case classes, namely _PathPoint and _PointQueue).
The runtime will use one of them, choosing it in an unspecified way.
This can be bad but in this case it’s not. Both of the implementations are coming from the system (well, the simulated system) and thus you’d expect them to be in sync and thus it doesn’t matter which one the runtime uses.
So, in this specific case, these log messages are just log noise.
This error usually happens when you add a gesture recognizer inside a view that also has a gesture recognizer of its own. In your case, you may have added the text field as a subview of a view that has some sort of gesture recognizer or scrolling behavior, so when you tap on that text field it does not know which gesture to trigger. Look into your implementation and figure out whether the text view is inside another view that has a gesture.
In my case, I created a view which had a gesture recognizer, and this view had a UITextView() as a subview. So when the parent view was tapped and the gesture was used, I would get this error. I solved it by disabling user interaction on my textView.
textView.isUserInteractionEnabled = false
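As a rough sketch of that setup (the names are mine, not from the original project): a container view owns a tap gesture recognizer and also hosts a UITextView, and disabling user interaction on the text view resolves the ambiguity.
import UIKit

final class ExampleViewController: UIViewController {
    private let container = UIView()
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // The container handles taps itself...
        container.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(containerTapped)))
        // ...and also hosts a text view, which competes for the same touches.
        container.addSubview(textView)
        view.addSubview(container)
        // If the text view only displays static text, turning off its user
        // interaction lets the container's gesture recognizer win every tap.
        textView.isUserInteractionEnabled = false
    }

    @objc private func containerTapped() {
        // Handle the tap on the container.
    }
}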
I got this error when using Rosetta to open Xcode 13.1.
After disabling the option "open with Rosetta" those errors were gone.
I was getting the same error. I had a hyperlink in a textView
--> e.g. "I’ve read the Privacy Notice ..."
Then I realized that the URL behind the hyperlink must be a valid, properly formed (lowercase, ASCII-only) URL. My hyperlink was:
wrong --> "https://www.AçıkRıza.com"
correct --> "https://www.acikriza.com"
I was then able to build it successfully.
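If it helps, here is a small sketch of what the fixed link could look like; the string and the range are illustrative, not taken from the original code:
import UIKit

let text = NSMutableAttributedString(string: "I’ve read the Privacy Notice")
// Attach the lowercase, ASCII-only URL to the words "Privacy Notice".
if let url = URL(string: "https://www.acikriza.com") {
    text.addAttribute(.link, value: url, range: NSRange(location: 14, length: 14))
}
let textView = UITextView()
textView.attributedText = text
textView.isEditable = false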

VoiceOver: UIAccessibilityLayoutChangedNotification not working

Using VoiceOver, UIAccessibility.post(notification: .layoutChanged, argument: someView) just re-announces the currently focused element instead of moving focus and announcing the accessibilityLabel of someView. Even calling UIAccessibility.post(notification: .layoutChanged, argument: "what the heck") does nothing and just re-announces the currently focused element, when it should announce the string passed in as the argument according to the docs. I'm currently running Xcode 11.3.1 on the simulator. Tried on a physical device as well and same problem. Any help would be gladly appreciated :)
I figured out that this post function only actually moves focus to the passed-in view when running on a physical device; it seems to be broken on the simulator.
It may be that the target element someView is not an Accessibility Element. Check if someView.isAccessibilityElement is set to true.
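A minimal sketch of that check, assuming someView is the view you want VoiceOver to land on and the label is a placeholder:
// VoiceOver can only move focus to views that are accessibility elements.
someView.isAccessibilityElement = true
someView.accessibilityLabel = "Results updated" // placeholder label
UIAccessibility.post(notification: .layoutChanged, argument: someView)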

In React Native is there a way to recognize stylus (pen) vs touch (finger) event?

I'm working on an RN application that has one screen with a list of "drawable" areas in it. So this screen should be scrollable AND drawable.
What I'm trying to do is find a way to distinguish touch events coming from fingers (these will be used to scroll and should not draw) from stylus events via Apple Pencil (these will be used to draw and should not scroll).
In both Gesture Responder and PanResponder there are events being passed on each move. Each of those events (along with the nativeEvent) contains the type property. However, it's always null for me on both the simulator and a device.
Is there any way to recognize a move event as a finger vs stylus?
We had a similar requirement for one of our projects, and what we did was use a Pressable component, to which a handlePress function was passed as a prop.
This function accepted the GestureResponderEvent as event callback argument.
By using the event.nativeEvent.altitudeAngle property that was added recently, we were able to detect Apple Pencil touches.
function handlePress(event: GestureResponderEvent) {
  // @ts-expect-error React Native types do not include altitudeAngle
  const isPencilTouch = !!event.nativeEvent.altitudeAngle;
  ...
}

Appium: How to wait for new element with same name as visible one to appear on screen?

We're using Appium with iOS Simulator and test functions written in Java.
We have an iOS app with screen 1 containing a UICollectionView, and tell Appium to click on one of its elements.
This opens screen 2 (the scrolling animation takes about 500 ms), which also contains a UICollectionView. I want to find out the size of the UICollectionView on the second screen with Appium.
The problem is that Appium is too fast and executes the findElements() method directly after the click, which causes it to find the UICollectionView of the first screen.
clickOnElementOnFirstScreen();
webDriver.findElements( By.className( "UIACollectionCell" ) ).size();
// is supposed to find the UICollection view on the second screen,
// but actually finds the UICollection view on the first screen
Appium provides several waiting functions. However, as far as I can see, all of them are intended to be used in this fashion:
"wait until element at location X / with name X becomes visible"
If I try to use these waiting functions, they don't wait at all because they immediately find the UICollection view of the first screen, which has the same location and name as the one on the second screen.
The only solution I have found is to use Thread.sleep:
Thread.sleep(1000);
webDriver.findElements( By.className( "UIACollectionCell" ) ).size();
But we don't want to use Thread.sleep in code that will run on the client's server on hundreds of tests.
We might be able to modify the App and enter metadata into the views so that Appium is able to distinguish them, but this situation occurs in several places and the App is being programmed by the client, so we want to avoid this too.
What is a simple and safe way to wait for the new screen to appear, without modifying the code of the iOS App?
I have found only a dirty workaround for this issue.
static waitFor(Duration duration) {
    try {
        WebDriverWait wait = new WebDriverWait(mobileDriver, duration.standardSeconds)
        // Wait for an element that can never exist, so the wait always
        // runs for the full timeout before throwing.
        wait.until(visibilityOfElementLocated(By.xpath("//Fail")))
    } catch (Exception e) {
        // Swallow the timeout exception; the goal was only to wait.
    }
}
Another workaround/solution that has been posted on the Appium forums is:
First search for some other element that distinguishes the second screen from the first screen; once that is visible, it's safe to search for the originally desired element.

Initial VoiceOver selection

I'm adding VoiceOver support to my app. So far, so good, but I'd really like to be able to specify which element is the first one spoken after a UIAccessibilityScreenChangedNotification. I haven't seen a way to do this. Making something the summary element doesn't really seem to do it. Am I missing something?
This has always been perfectly possible to do.
Just write something along the lines of:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification,
                                    self.myFirstElement);
}
This works for both UIAccessibilityScreenChangedNotification and UIAccessibilityLayoutChangedNotification.
And for Swift 5:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    UIAccessibility.post(notification: UIAccessibility.Notification.screenChanged,
                         argument: myFirstElement)
}
I don't think there is an API value that specifies a reading order, other than using the Summary Element value on startup; that is by design.
So you would have to test the default order for the UIKit elements and any custom controls, because it depends on your design. You can also mark items as non-accessible so they won't be read, leave accessible elements to be read by default, and use containers of accessibility elements to better control your intended interactions. I don't know if making the item selected will help.
I take it you are already using the Accessibility Inspector to test your application before testing on iOS.
If you need some background on the subject, Rune's Working With VoiceOver Support and Gemmell's Accessibility for Apps may be worth reading.
What about using UIAccessibilityAnnouncementNotification?
This technique worked for me.
VoiceOver will announce the value of the first element in the accessibilityElements array. This array can be ordered to suit your needs.
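For example, a minimal sketch (the container and subview names are placeholders) that fixes the order on a container view:
// VoiceOver will visit headerLabel first, then detailLabel, then actionButton.
containerView.accessibilityElements = [headerLabel, detailLabel, actionButton]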
