How to dismiss the UIActivityViewController during a UI test with Xcode 11 & iOS 13

Apple has re-designed the share sheet that appears, which has now broken my UI tests.
I have attempted to record a new UI test through Xcode, but as soon as I tap on the dismiss button, the test terminates so I have not been able to capture the event.
Ultimately, I just want to know how I can access the gray 'X' shown with the arrow below:

I have just tested this with Xcode 13 and found that the original answer no longer works. However, I am keeping it below for posterity and for those using previous versions of Xcode.
Xcode 13
I have tested this with Xcode 13.0 and verified it works for iPhone and iPad:
let activityListView = app.otherElements.element(matching: .other,
                                                 identifier: "ActivityListView")
XCTAssertTrue(activityListView.waitForExistence(timeout: 2.0))
activityListView.buttons["Close"].tap()
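For context, here is a minimal sketch of how this fits into a complete test; the "Share" button identifier and the test class name are assumptions, not something from the original answer:
import XCTest

final class ShareSheetTests: XCTestCase {
    func testDismissShareSheet() {
        let app = XCUIApplication()
        app.launch()

        // Hypothetical button that presents the UIActivityViewController.
        app.buttons["Share"].tap()

        // Same dismissal as above.
        let activityListView = app.otherElements.element(matching: .other,
                                                         identifier: "ActivityListView")
        XCTAssertTrue(activityListView.waitForExistence(timeout: 2.0))
        activityListView.buttons["Close"].tap()
    }
}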
Previous versions
After some trial and error, I was able to locate where my specific elements were with the following:
app.otherElements.element(boundBy: 1).buttons.element(boundBy: 0).tap()
Using app.otherElements.element(boundBy: 1) would identify the share sheet for me. I had attempted to locate it through accessibility identifiers, but I could not find one that worked, including previously valid ones used in iOS 12 and below.
Please note that, based on the layout of your screen, the index value may differ from what I am seeing.
Next, .buttons.element(boundBy: 0).tap() was used to locate the Close button. I again attempted to use identifiers, but could not find anything that represented the button.
When I attempted to discern additional information through the console while testing, I would always wind up crashing the test. This result was surprising, as I was able to query these elements with Xcode 10.
Ultimately, I would like to find working identifier values so that I can have something that works reliably across products, without the trial and error to find the share sheet's index value.
For iPad
The following will dismiss the popover for an iPad:
app.otherElements["PopoverDismissRegion"].tap()
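If you want a single code path for both device families, here is a minimal sketch of a shared helper, assuming the identifiers above ("ActivityListView", "Close", "PopoverDismissRegion") still apply; the function name is my own:
import XCTest

/// Dismisses the share sheet: taps Close if the activity list is found, otherwise
/// falls back to tapping the popover dismiss region used on iPad.
func dismissShareSheet(in app: XCUIApplication) {
    let activityListView = app.otherElements["ActivityListView"]
    if activityListView.waitForExistence(timeout: 2.0) {
        activityListView.buttons["Close"].tap()
    } else if app.otherElements["PopoverDismissRegion"].exists {
        app.otherElements["PopoverDismissRegion"].tap()
    }
}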

Related

Unexpected number of splitViewControllers when targeting an iPhone

I'm learning to use a split view controller in an app. I ask the table view to reload its data every time I add something new. My code and settings seem fine, but it does not work on an iPhone. After I asked someone, he told me to try it on an iPad device, and there it works.
Finally, I found where the problem is, but I don't know how to fix it. I added a line that prints the count of splitView.viewControllers. The iPad simulator returns 2, which is what I am expecting, but the iPhone device returns 1 without any other changes. So, what's wrong, and how can I solve it? Thank you very much.
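For reference, a minimal sketch of the check described in the question (the class name and where the print lives are assumptions); on iPhone the split view controller is collapsed in compact width, which is why it reports only one child:
import UIKit

class MasterViewController: UITableViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // On the iPad simulator this typically prints 2 (primary + secondary);
        // on an iPhone the split view is collapsed, so it prints 1.
        print("viewControllers count:", splitViewController?.viewControllers.count ?? 0)
    }
}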

Seeing small dashes under Button's title in Xcode 8.1

I was creating a simple app in Xcode 8.1 with Swift 3 and I got this problem (see the picture): dashes under the button titles.
How can I fix this issue?
Thanks.
Nirmit Dagly (https://stackoverflow.com/users/3401707/nirmit-dagly) has given the exact solution and it works perfectly.
He says: "You need to check the Button Shapes setting on your iPhone. Go to Settings -> General -> Accessibility -> Button Shapes. If it is enabled, disable it and run the app again. That will hide the underlines on the buttons."
I thank him for his helpful answer and have republished it here for the benefit of others.
Thank you again, Nirmit. I would still prefer an independent solution that does not depend on the settings of the user's phone.

How can you see the XCUIElement tree?

Background:
I'm experimenting with UI-level testing in iOS 9.0 with the Xcode GM.
Question:
Is there a command in the Xcode GM that will allow you to see a 'tree' of accessible elements and their relationships? Something similar to the 'page' command in Appium?
Ideally I would be able to run a command in the debugger that would give me a list of elements available for selection/manipulation. Currently you can use debugDescription on a single XCUIElement but that only gives you info for that element.
Set a breakpoint where you would like to see the tree... then in the debugger type:
po print(XCUIApplication().debugDescription)
That prints out everything XCUITesting has access to. You can also just throw that into your test:
func testTreeExample() {
    XCUIApplication().buttons["login"].tap()
    print(XCUIApplication().debugDescription)
    XCUIApplication().buttons["next"].tap()
    print(XCUIApplication().debugDescription)
}
That way, if you are having trouble finding something, you can have it automatically print out what the app sees right after you do something.
This isn't exactly what you're asking for, but Xcode’s Accessibility Inspector makes it much easier to look at your view hierarchy in terms of what elements are accessible via Identifiers. (N.B. It's not the "Label" in IB's Accessibility panel that matters, it's the "Identifier" field.):
In Xcode 7.2, open Xcode->Open Developer Tool->Accessibility Inspector. (You may need to give the app permission to run in System Preferences.) Then launch your iOS app from Xcode and hover over any UI element in the SIMULATOR. You’ll see comprehensive information about the element type, description, hierarchy, etc.
Anytime you record UI actions and the output doesn't look right, use the tool to figure out what accessibility descriptions need to be added, changed, or removed. (I spent a couple days trying to get a deeply embedded UISegmentedControl to change via the UI Test harness, and the problem became obvious once I figured out how to use the Accessibility Inspector tool.)
Thanks to the folks at shinobicontrols.com for the great tip!
I would suggest choosing from the menu bar: Debug > View Debugging > Capture View Hierarchy when running in debug. Not only do you get a visual representation of the views, but the left-side debug navigator also shows the hierarchy. This may not be one-for-one with UI Testing's perspective, but it can be very helpful. Hope that helps.
The way Appium does this is by using Facebook's WebDriverAgent.
As far as I can tell, the way they do it, essentially, is starting from the root application element and collecting information about each child, then recursing.
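If you want something similar inside an XCUITest itself, here is a rough sketch of that recursive idea (the helper name and output format are my own, and resolving every child this way is slow):
import XCTest

// Recursively prints each element's type, identifier, and label, indenting one level per depth.
func dumpTree(_ element: XCUIElement, indent: String = "") {
    print("\(indent)type=\(element.elementType.rawValue) id='\(element.identifier)' label='\(element.label)'")
    let children = element.children(matching: .any)
    for index in 0..<children.count {
        dumpTree(children.element(boundBy: index), indent: indent + "  ")
    }
}

// Usage inside a test: dumpTree(XCUIApplication())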
What about http://fbidb.io?
With the idb ui describe-all command you get the accessibility information for all the elements on screen (not the entire app): https://fbidb.io/docs/commands#accessibility-info
Put a breakpoint in any of your tests, then just do po XCUIApplication(), and that will print out the whole app's accessibility hierarchy in an easy-to-read tree format.

WatchKit reloadRootControllersWithNames causing error, with pageController or after push/pop

I have a basic WatchKit app that loads a page-based navigation of 3 interface controllers. This works well, but I'd then like to trigger an action to remove the page control and essentially revert back to the original InterfaceController that was present when the app first loaded.
// load page-based control with 3 views. this works OK
[WKInterfaceController reloadRootControllersWithNames:@[@"pageController1", @"pageController2", @"pageController3"]
                                             contexts:@[@"data1", @"data2", @"data3"]];
// attempt to reload the original interface controller, identified by its storyboard ID
[WKInterfaceController reloadRootControllersWithNames:@[@"myInterfaceController"] contexts:@[@{}]];
The page-based navigation is removed and the original navigation loads after a short spinner. However, it fails to function correctly, and the original actions result in this error:
Extension[6766:123665] *********** ERROR
-[SPRemoteInterface _interfaceControllerClientIDForControllerID:] clientIdentifier for interfaceControllerID:(null) not found
Is there a better way to cleanly reload the original InterfaceController?
EDIT, 2/19
It seems there are some other actions that cause this error too. For instance, if I segue to a second InterfaceController and then popController to get back, the error often appears. It is always related to a secondary call to this function.
[WKInterfaceController reloadRootControllersWithNames: contexts:]
EDIT2, 3/18
As previously mentioned, this is reproducible 100% of the time by doing the seguePush, the popController, then attempting to reloadRootControllersWithNames.
If the seguePush/popController is not done beforehand, then the reloadRootControllersWithNames will work fine.
This situation seems to be in addition to the multi->single-multi instance of this bug.
This is actually not a bug because according to Apple:
You cannot combine hierarchical and page-based interface styles. At design time, you must choose the style that best suits your app’s content and design for that style.
So unfortunately, we can't mix Hierarchical and Page-based navigation patterns within the same Watch app.
Just one of many limitations we have to deal with when developing apps for Apple Watch.
This is a bug in WatchKit in Xcode 6.2 Beta 5. Please dupe the following radar on Apple's Bug Reporting System to help raise the priority of getting this fixed.
In the meantime, a workaround I've found is on the dev forums. What you can do is add a dummy interface controller to any single-interface-controller page set so you always have two. This will fix the error until Apple gets the bug fixed (hopefully in Beta 6). Please dupe!
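A minimal Swift sketch of that dummy-controller idea (both storyboard identifiers are assumptions):
import WatchKit

// Always pass at least two controller names, even when only one page is really needed,
// so the single-controller case that triggers the error is avoided.
WKInterfaceController.reloadRootControllers(withNames: ["myInterfaceController", "dummyController"],
                                            contexts: nil)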
I was able to solve my instance of this problem by not using popController on a pushed view controller. Instead, I use reloadRootControllersWithNames in place of popController.
How this allows both push and paging, via an example:
1. Push a view controller.
2. reloadRootControllersWithNames to return to the original controller. (The transition is not quite as animated, but it is sufficient.)
3. Create a page-based view controller.
4. reloadRootControllersWithNames to return to the original controller.
5. Repeat 1 or 3 as needed.
This eliminates the error at the cost of non-animated popControllers, and allows partial pushing and paging. It would not allow more complex push navigation though.
There may be a better method of navigating to a sub interface controller without a push call, but I'm not aware of it on the watch yet.
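A minimal Swift sketch of that replacement (the class name, action name, and storyboard identifier are assumptions):
import WatchKit

class DetailInterfaceController: WKInterfaceController {
    @IBAction func backToRoot() {
        // Used in place of popController(), as described above, so a later
        // reloadRootControllers(withNames:contexts:) call does not hit the error.
        WKInterfaceController.reloadRootControllers(withNames: ["myInterfaceController"],
                                                    contexts: nil)
    }
}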
None of the answers above worked for me. This problem began when I changed the icon names for the app and the watch app name. I solved it like this:
1. Click on your Watch app target > Capabilities > make sure App Groups is ON.
2. Make sure the App Group is selected.
3. Click on the circled refresh arrow icon (this will apparently just refresh the setting if you already had it).
4. Repeat steps 1-3 for your Watch App Extension target too.
5. Click on the Scheme button (to the right of the Stop button) and click Edit Scheme.
6. Click Run > Info.
7. In Executable, select your target. (It should already be selected, but opening this window seems to refresh the option and wipe the error.)
Apparently all these things above are not updated automatically when you change the icon name (Target names) and you have to go to those menus and open them to refresh them manually. Shame on Apple perhaps?

Dismiss alert on initial launch on iOS simulator

I'm working on the automated UI tests for my app and I'm having trouble when trying to set up the environment for running the tests. The plan is roughly this:
build the application
shutdown simulator if running
erase the simulator to make a clean install
install my app on the simulator
run UIAutomation tests
Everything is working, except that when the application is launched by Instruments to execute the tests, an alert appears asking whether the user allows notifications. This is all as expected, but I can't find a way to get rid of the alert.
Things I have already tried:
creating onAlert as the first thing in my test script, in case the alert appears before my alert callback is defined
delaying the target by 5 seconds, in case the tests actually run before the UI of the app is visible in the simulator
I also went through all the permutations of the above that can be found on SO; I never get my onAlert callback invoked, no matter what I do. So another thing I tried was:
trying to dismiss the alert with AppleScript
The script I wrote:
tell application "System Events"
    tell process "iOS Simulator"
        set allUIElements to entire contents of window 1
        repeat with anElement in allUIElements
            try
                log anElement
            end try
        end repeat
    end tell
end tell
and it displays:
static text “MyApp” Would Like to Send You Notifications of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
static text Notifications may include alerts, sounds, and icon badges. These can be configured in Settings. of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
UI element 3 of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
Looks like the buttons are placed inside "UI element 3", but I can't retrieve any elements from inside it, let alone click on it. So I checked with Accessibility Manager:
It sits there as one of the children; the other ones are the notification title and message. But when I go to that element, it is highlighted and I see this:
It is identified as a generic element and it doesn't have any children...
The interesting thing is that when I choose the OK button in the Accessibility Inspector, I can actually see it's a child of the window, yet it is never listed:
Can someone please shed some light on what is going on here? How can I press that button with AppleScript?
If you are doing automation using Instruments, you will need to register a callback (onAlert) to perform any action on alerts.
But the problem in your case is that the alert appears before your script actually starts executing, and at that time no callback is registered for the alert.
So the alert can only be handled if it appears with a delay of around 10 seconds after you start the application. But that can only be controlled through the source code, not through your automation code.
So the only option left is to manually dismiss the alert once the fresh application is installed.
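For what it's worth, this is not the UIAutomation approach described above, but if the newer XCUITest framework is an option, the same notification prompt can be handled with an interruption monitor; a minimal sketch (the button title and monitor description are assumptions):
import XCTest

final class LaunchTests: XCTestCase {
    func testFirstLaunchDismissesNotificationPrompt() {
        let app = XCUIApplication()

        // The handler runs when a system alert blocks the next interaction.
        let token = addUIInterruptionMonitor(withDescription: "Notification permission") { alert in
            let allow = alert.buttons["Allow"]
            if allow.exists {
                allow.tap()
                return true    // alert handled
            }
            return false
        }

        app.launch()
        // The monitor only fires when the test next interacts with the app.
        app.tap()

        removeUIInterruptionMonitor(token)
    }
}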
I am also facing the same problem and found it to be a limitation of the tool.
There are too many limitations of this tool, and that's why I shifted to UFT.
I had a similar problem. I just wanted the position of the last control on the alert, so I came up with the following piece of code:
on get_simulator_last_object_rect(simulator_index)
    tell application "System Events"
        set ProcessList to (unix id of processes whose name is "iOS Simulator")
        set myProcessId to item simulator_index of ProcessList
        tell window 1 of (processes whose unix id is myProcessId)
            -- Forcefully query the UI elements so that the Accessibility hierarchy is built
            UI elements
            -- Then wait briefly to let the Accessibility view hierarchy become ready
            delay 0.5
            set lowest_label_lowest_position to 0
            set _x to 0
            set _y to 0
            set _width to 0
            set _height to 0
            repeat with element in UI elements
                set {_x, _y} to position of element
                set {_width, _height} to size of element
                set current_control_lowest_position to _y + _height
                if current_control_lowest_position > lowest_label_lowest_position then set lowest_label_lowest_position to current_control_lowest_position - _height / 2
            end repeat
            return {{_x, _y}, {_width, lowest_label_lowest_position}}
        end tell
    end tell
end get_simulator_last_object_rect
I have a desktop app that controls my actions. I use this AppleScript in my desktop app to get the frame of the last control. Now that I have the frame, I activate the simulator, create a mouse event, and perform a click on the frame.
Although I have not tried it yet, I am pretty sure you can also create mouse events from AppleScript and click on the frame (or its center).
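A minimal macOS sketch of that mouse-event idea in Swift (it assumes you have already converted the returned rect to a screen coordinate and brought the simulator to the front):
import CoreGraphics

// Synthesizes a left mouse click at the given screen coordinate.
func click(at point: CGPoint) {
    let down = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                       mouseCursorPosition: point, mouseButton: .left)
    let up = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                     mouseCursorPosition: point, mouseButton: .left)
    down?.post(tap: .cghidEventTap)
    up?.post(tap: .cghidEventTap)
}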
Hope this helps.
Thanks,
RKS
