I have freshly added a bunch of key commands (Cmd-..., etc) in the app by implementing
-(NSArray<UIKeyCommand*>*)keyCommands;
on a UIViewController subclass. Everything works wonderfully when manually tested in the app. The problem is how to UI test this in the iOS simulator.
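For reference, the implementation looks roughly like this (a minimal sketch; the specific input and action are placeholders):

import UIKit

class MyViewController: UIViewController {
    // Sketch: expose Cmd-N as a key command; the real app defines several of these.
    override var keyCommands: [UIKeyCommand]? {
        return [
            UIKeyCommand(input: "n",
                         modifierFlags: .command,
                         action: #selector(createNewItem))
        ]
    }

    @objc private func createNewItem() {
        // Invoked when the user presses Cmd-N on a hardware keyboard.
    }
}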
I don't seem to be able to invoke these commands using a method on XCUIElement. According to its documentation, there appears to be only one text input method on iOS:
- (void)typeText:(NSString *)text;
with no (apparent) way to bless the input character with a key modifier flag (XCUIKeyModifierFlags) for the Cmd, Alt, etc. keys.
The method
- (void)typeKey:(NSString *)key modifierFlags:(XCUIKeyModifierFlags)flags;
appears to be macOS only. It would be a shame to provide these commands but not be able to test them in our UI testing suite.
Any ideas that can help me make some progress would be hugely welcome.
Keyboard input in the iOS Simulator requires a couple of settings. Make sure the Simulator is the frontmost app, then set both of these from its menu bar:
I/O -> Input -> Send Keyboard Input to Device
I/O -> Keyboard -> Connect Hardware Keyboard
Related
I've taken over the work on an iOS app, and I've managed to work quite well with it so far, adding new functionality despite not being a trained iOS developer. However, I've hit a patch where I simply cannot get the keyboard to show on screen when I tap on a UITextField. There are areas of the app where it works, but in any new areas I add it simply will not work. Is there a standard bit of code that controls showing the keyboard when you tap a text field?
Need help
The keyboard opens automatically unless you have forced it not to.
You can check the following (see screenshots):
Enabled is checked
User Interaction Enabled is checked
If you are testing on the Simulator, try Cmd+K (Toggle Software Keyboard)
the textFieldShouldBeginEditing delegate method returns YES:
- (BOOL)textFieldShouldBeginEditing:(UITextField *)textField {
    return YES;
}
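If all of those check out, you can probe the usual suspects in code; a minimal Swift sketch (the helper name is illustrative):

import UIKit

// Sketch: verify the common blockers, then force the keyboard up.
func debugKeyboard(for textField: UITextField) {
    print("enabled:", textField.isEnabled)                    // must be true
    print("interaction:", textField.isUserInteractionEnabled) // must be true on the field
                                                              // and on every superview
    print("canBecomeFirstResponder:", textField.canBecomeFirstResponder)
    textField.becomeFirstResponder() // shows the keyboard if nothing blocks it
}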
Are you running on a simulator or a real device?
I'm using Appium on macOS with an iPhone 5S running iOS 9.2.
When I try to hide the keyboard with the method:
driver.hideKeyboard();
nothing happens and the application crashes.
Need help please
Thanks
Ohad
If your application crashes when you try to hide the keyboard, you are likely looking at a real bug, and understanding and fixing its cause should be the top priority. Your statement
driver.hideKeyboard();
is good enough for what you want out of the execution, assuming the driver used is AppiumDriver or one of its subclasses.
Also, if you are sure that the keyboard is displayed and you can even hide it manually, a forced way to do that is, as @Gaurav suggested in the comments, the following code:
driver.navigate().back();
If you're uncertain whether the keyboard is visible, you can give this a try:
driver.getKeyboard();
driver.hideKeyboard();
Is it possible to perform UI tests on Action Extension targets? I am unable to create a UI testing target with the Action Extension as the "Target to be Tested." I am trying to load the Action Extension from within Safari (or Photos, although Safari, or both, is preferred).
If I record my interactions I can get as far as:
app.icons["Safari"].tap()
I can then manually add:
XCUIDevice.sharedDevice().pressButton(.Home)
before the generated code, but it doesn't work as expected (the simulator is left on the home screen).
I have also tried:
UIApplication.sharedApplication().openURL(NSURL(string: "https://google.com")!)
but that also doesn't open Safari.
I'm not even sure if I'll be able to interact in an automated way with the Action Extension if it does get launched, but hopefully it'll be possible.
So, it's possible to switch apps with XCUITest, but it's undocumented. If you check out Facebook's WebDriverAgent, they did a header dump and made a helper for launching from the springboard. You can call:
XCUIApplication* safari = [[XCUIApplication alloc] initWithPrivatePath:nil bundleID:@"com.apple.mobilesafari"];
[safari launch];
And then interact with Safari just like you do your app. However, I've run into a similar problem where XCUITest won't actually launch the extension itself. Once open (i.e. you tap physically on the extension button while the test is running), the test runner works perfectly, and you can interact with your extension in the same context as your app. However, having the test runner tap to launch the extension does nothing. I've also got an Apple Dev Forum question going on this topic.
Update:
It turns out that if you use the app to press the screen at the location of the button, the extension will load and you can interact with it! Note that the API for tapping a coordinate is very wonky: the x and y are multipliers of the frame of the element you created the coordinate from. Relevant sample code:
// app is your XCUIApplication
// In this case we are tapping in the horizontal middle and at the y coordinate 603 (this is for a 6+ screen size)
XCUICoordinate* coordinateOfRowThatLaunchesYourExtension = [app coordinateWithNormalizedOffset:CGVectorMake(0.5, 603.0 / 736.0)];
[coordinateOfRowThatLaunchesYourExtension tap];
This will press the button for your extension in the action sheet, after Apple's extension picker has been invoked. For whatever reason / bug in XCUITest simply pressing your app in the action sheet doesn't work:
[app.sheets.staticTexts[@"MyApp"] tap];
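For readers on newer toolchains: since Xcode 9 this no longer requires private API, because XCUIApplication gained a public initializer that takes a bundle identifier. A minimal Swift sketch:

import XCTest

// Xcode 9+: launching another app is public API.
let safari = XCUIApplication(bundleIdentifier: "com.apple.mobilesafari")
safari.launch() // terminates any running instance, then launches Safari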
Background:
I'm experimenting with UI-level testing in iOS 9.0 with Xcode GM.
Question:
Is there a command in Xcode GM that will allow you to see a 'tree' of accessible elements and their relationships? Something similar to the 'page' command in Appium?
Ideally I would be able to run a command in the debugger that would give me a list of elements available for selection/manipulation. Currently you can use debugDescription on a single XCUIElement but that only gives you info for that element.
Set a breakpoint where you would like to see the tree... then in the debugger type:
po print(XCUIApplication().debugDescription)
That prints out everything XCUITesting has access to. You can also just throw that into your test:
func testTreeExample() {
    XCUIApplication().buttons["login"].tap()
    print(XCUIApplication().debugDescription)
    XCUIApplication().buttons["next"].tap()
    print(XCUIApplication().debugDescription)
}
That way, if you are having trouble finding something, you can have it automatically print out what the app sees right after you do something.
This isn't exactly what you're asking for, but Xcode’s Accessibility Inspector makes it much easier to look at your view hierarchy in terms of what elements are accessible via Identifiers. (N.B. It's not the "Label" in IB's Accessibility panel that matters, it's the "Identifier" field.):
In Xcode 7.2, open Xcode->Open Developer Tool->Accessibility Inspector. (You may need to give the app permission to run in System Preferences.) Then launch your iOS app from Xcode and hover over any UI element in the SIMULATOR. You’ll see comprehensive information about the element type, description, hierarchy, etc.
Anytime you record UI actions and the output doesn't look right, use the tool to figure out what accessibility descriptions need to be added, changed, or removed. (I spent a couple days trying to get a deeply embedded UISegmentedControl to change via the UI Test harness, and the problem became obvious once I figured out how to use the Accessibility Inspector tool.)
Thanks to the folks at shinobicontrols.com for the great tip!
I would suggest choosing from the menu bar: Debug > View Debugging > Capture View Hierarchy when running in debug. Not only do you get a way of visually representing the views, but the left-side debug navigator also shows the hierarchy. This may not be one-for-one with UI Testing's perspective, but it can be very helpful. Hope that helps.
The way Appium does this is using Facebook's WebDriverAgent.
As far as I can tell, they essentially start from the root application element, collect information about each child, and then recurse.
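As a rough illustration of that traversal idea using XCUITest's own API (a sketch, not WebDriverAgent's actual code):

import XCTest

// Collect each child of an element, print it, then recurse.
// Note: walking a whole app this way is slow; it re-queries at every level.
func dumpTree(_ element: XCUIElement, depth: Int = 0) {
    let indent = String(repeating: "  ", count: depth)
    print("\(indent)type=\(element.elementType.rawValue) label='\(element.label)'")
    for child in element.children(matching: .any).allElementsBoundByIndex {
        dumpTree(child, depth: depth + 1)
    }
}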
What about http://fbidb.io?
With the idb ui describe-all command you get the accessibility information of all the elements on screen (not the entire app): https://fbidb.io/docs/commands#accessibility-info
Put a breakpoint in any of your tests, then just do: po XCUIApplication() and that will print out the whole app's accessibility hierarchy in an easy-to-read tree format.
I'm working on the automated UI tests for my app and I'm having trouble when trying to set up the environment for running the tests. The plan is roughly this:
build the application
shutdown simulator if running
erase the simulator to make a clean install
install my app on the simulator
run UIAutomation tests
Everything is working, except that when the application is launched by Instruments to execute the tests, an alert appears asking whether the user allows notifications. This is all as expected, but I can't find a way to get rid of the alert.
Things I have already tried:
creating onAlert as the first thing in my test script, in case the alert appears before my alert callback is defined
delaying the target by 5 seconds, in case the tests actually run before the app's UI is visible in the simulator
I also went through all the permutations of the above that can be found on SO; I never get my onAlert callback invoked, no matter what I do. So another thing I tried was:
trying to dismiss the alert with AppleScript
The script I wrote:
tell application "System Events"
    tell process "iOS Simulator"
        set allUIElements to entire contents of window 1
        repeat with anElement in allUIElements
            try
                log anElement
            end try
        end repeat
    end tell
end tell
and it displays:
static text “MyApp” Would Like to Send You Notifications of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
static text Notifications may include alerts, sounds, and icon badges. These can be configured in Settings. of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
UI element 3 of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
Looks like the buttons are placed inside "UI element 3", but I can't retrieve any elements from inside it, let alone click on it. So I checked with the Accessibility Inspector:
It sits there as one of the children; the other ones are the notification title and message. But when I go to that element, it is highlighted and I see this:
It is identified as a generic element, and it doesn't have any children...
The interesting thing is when I choose the OK button in the Accessibility Inspector, I can actually see it's a child of the window, yet it is never listed:
Can someone please shed some light on what is going on here? How can I press that button with Applescript?
If you are doing automation using Instruments, you need to register a callback (onAlert) to perform any action on alerts.
The problem in your case is that the alert appears before your script actually starts executing, and at that moment no alert callback is registered.
So the alert can be handled only if it appears with a delay of around 10 seconds after the application starts, and that can only be controlled through the app's source code, not by your automation code.
So the only option left is to dismiss the alert manually once the fresh application is installed.
I am also facing the same problem and found it to be a limitation of the tool.
There are too many limitations of this tool, and that's why I shifted to UFT.
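For what it's worth, Apple's newer XCUITest framework handles exactly this case with UI interruption monitors. A minimal sketch, assuming the alert's button is labeled "OK" (the label varies by iOS version and permission type):

import XCTest

class PermissionAlertTests: XCTestCase {
    func testDismissNotificationAlert() {
        let app = XCUIApplication()

        // Register the handler before the alert can appear.
        addUIInterruptionMonitor(withDescription: "Notification permission") { alert in
            let ok = alert.buttons["OK"] // button label is an assumption
            if ok.exists {
                ok.tap()
                return true // handled
            }
            return false
        }

        app.launch()
        // Monitors fire only on the next interaction with the app.
        app.tap()
    }
}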
I had a similar problem. I just wanted the position of the last control on the alert, so I came up with the following piece of code:
on get_simulator_last_object_rect(simulator_index)
    tell application "System Events"
        set ProcessList to (unix id of processes whose name is "iOS Simulator")
        set myProcessId to item simulator_index of ProcessList
        tell window 1 of (processes whose unix id is myProcessId)
            -- Force the UI elements to be read so that the Accessibility hierarchy is built
            UI elements
            -- Then wait briefly so that the Accessibility view hierarchy is ready
            delay 0.5
            set lowest_label_lowest_position to 0
            set _x to 0
            set _y to 0
            set _width to 0
            set _height to 0
            repeat with element in UI elements
                set {_x, _y} to position of element
                set {_width, _height} to size of element
                set current_control_lowest_position to _y + _height
                if current_control_lowest_position > lowest_label_lowest_position then set lowest_label_lowest_position to current_control_lowest_position - _height / 2
            end repeat
            return {{_x, _y}, {_width, lowest_label_lowest_position}}
        end tell
    end tell
end get_simulator_last_object_rect
I have a desktop app to control my actions. I use this AppleScript in my desktop app to get the frame of the last control. Now that I have the frame, I activate the Simulator, create a mouse event, and perform a click on the frame.
Although I have not tried it yet, I am pretty sure you can also create mouse events from AppleScript and click on the frame, or on its center.
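If the desktop app is native, one way to synthesize that click is with CGEvent; a minimal Swift sketch (macOS; point is the screen coordinate to click, e.g. the center of the frame returned by the script above):

import CoreGraphics

// Post a left mouse down/up pair at the given screen point.
func click(at point: CGPoint) {
    let down = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                       mouseCursorPosition: point, mouseButton: .left)
    let up = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                     mouseCursorPosition: point, mouseButton: .left)
    down?.post(tap: .cghidEventTap)
    up?.post(tap: .cghidEventTap)
}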
Hope this helps.
Thanks,
RKS