Is it possible to perform UI tests on Action Extension targets? I am unable to create a UI testing target with the Action Extension as the "Target to be Tested." I am trying to load the Action Extension from within Safari (or Photos, although Safari, or both, is preferred).
If I record my interactions I can get as far as:
app.icons["Safari"].tap()
I can then manually add:
XCUIDevice.sharedDevice().pressButton(.Home)
before the generated code, but it doesn't work as expected (the simulator is left on the home screen).
I have also tried:
UIApplication.sharedApplication().openURL(NSURL(string: "https://google.com")!)
but that also doesn't open Safari.
I'm not even sure if I'll be able to interact with the Action Extension in an automated way if it does get launched, but hopefully it'll be possible.
So, it's possible to switch apps with XCUITest, but it's undocumented. If you check out Facebook's WebDriverAgent, they did a header dump and made a helper for launching from the springboard. You can call:
XCUIApplication* safari = [[XCUIApplication alloc] initWithPrivatePath:nil bundleID:@"com.apple.mobilesafari"];
[safari launch];
And then interact with Safari just like you do your app. However, I've run into a similar problem where XCUITest won't actually launch the extension itself. Once open (i.e. you tap physically on the extension button while the test is running), the test runner works perfectly, and you can interact with your extension in the same context as your app. However, having the test runner tap to launch the extension does nothing. I've also got an Apple Dev Forum question going on this topic.
Update:
It turns out that if you use the app to press the screen at the location of the button, the extension will load and you can interact with it! Note that the API for tapping a coordinate is a bit wonky: the x and y values are normalized multipliers of the frame of the element you created the coordinate from. Relevant sample code:
// app is your XCUIApplication
// In this case we are tapping in the horizontal middle and at the y coordinate 603 (this is for a 6+ screen size)
XCUICoordinate* coordinateOfRowThatLaunchesYourExtension = [app coordinateWithNormalizedOffset:CGVectorMake(0.5, 603.0 / 736.0)];
[coordinateOfRowThatLaunchesYourExtension tap];
This will press the button for your extension in the action sheet, after Apple's extension picker has been invoked. For whatever reason (possibly a bug in XCUITest), simply tapping your app's entry in the action sheet doesn't work:
[app.sheets.staticTexts[@"MyApp"] tap];
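For completeness, here is a Swift sketch of the same workaround, assuming Xcode 9 or later where XCUIApplication(bundleIdentifier:) is public API; the normalized offset is the same 6+ value used above and would need adjusting per device:

import XCTest

class ActionExtensionUITests: XCTestCase {
    func testLaunchExtensionFromSafari() {
        // Launch Safari directly from the UI test (public API since Xcode 9).
        let safari = XCUIApplication(bundleIdentifier: "com.apple.mobilesafari")
        safari.launch()

        // ...open the page and invoke the share sheet / extension picker here...

        // Tap the row where the extension sits. The offset is normalized
        // against Safari's frame and matches the 6+ value used above.
        let extensionRow = safari.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 603.0 / 736.0))
        extensionRow.tap()
    }
}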
I have freshly added a bunch of key commands (Cmd-..., etc) in the app by implementing
-(NSArray<UIKeyCommand*>*)keyCommands;
on a UIViewController subclass. Everything works wonderfully when manually tested in the app. The problem is how to UI test this in the iOS simulator.
I don't seem to be able to invoke these commands using a method on XCUIElement. According to its documentation, there seems to be only one text input method on iOS:
- (void)typeText:(NSString *)text;
with no (apparent) way to bless the input character with a key modifier flag (XCUIKeyModifierFlags) for Cmd, Alt, etc keys.
The method
- (void)typeKey:(NSString *)key modifierFlags:(XCUIKeyModifierFlags)flags;
appears to be macOS only. It would be a shame to provide these commands but not be able to test them in our UI testing suite.
Any ideas that can help me make some progress would be hugely welcome.
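For reference, the kind of implementation being tested is roughly the following; a minimal sketch, where the class name, input, and selector are illustrative rather than taken from the original code:

import UIKit

class DocumentListViewController: UIViewController {
    // Cmd-N as an example key command; the real app defines several of these.
    override var keyCommands: [UIKeyCommand]? {
        return [UIKeyCommand(input: "n", modifierFlags: .command, action: #selector(newDocument))]
    }

    @objc func newDocument() {
        // Handle Cmd-N here.
    }
}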
Keyboard input in the iPhone Simulator requires some setup. Make sure the Simulator is the frontmost application, then configure the following from its menu bar:
I/O -> Input -> Send Keyboard Input to Device
I/O -> Keyboard -> Connect Hardware Keyboard
So I'm testing a web app for which I have to open a new tab, switch to it, do some input, then switch back to the first tab, and do this more than once.
I try to open the new tab this way:
((JavascriptExecutor) AppiumTestBase.getDriver()).executeScript("window.open('http://google.com', '_blank')");
This causes Safari's popup permission alert to appear.
But I'm not able to accept it through automation with Appium. Things that I have tried:
Using the following capabilities: "safariAllowPopups" and "autoAcceptAlerts"
Changing the corresponding settings for Safari in the iOS Simulator
Calling .switchTo().alert().accept(); I also waited for the popup to appear.
Switching to the native context before accepting the popup
Clicking the pop up by name .findElement(By.name("Allow")).click();
What I have not tried:
Tapping on the screen at the pop-up's coordinates. I didn't try this since I'm not sure how to get the position of the "Allow" button.
And my other question is how would I switch between two tabs? I haven't tried anything yet, but research would suggest that I try it with window contexts.
Some other information:
Currently testing with an iPhone SE iOS 9.3 Simulator; the solution should work for several configurations
Appium is on the most recent version
The class "AppiumTestBase"s only purpose is to set capabilities and initialize the AppiumDriver
Please try these capabilities; after I changed to them, there were no more popups from Safari:
desired_capabilities = automationName: XCUITest, browserName: safari, platformName: iOS, platformVersion: 11.1, deviceName: iPhone 6, nativeWebTap: True, safariIgnoreFraudWarning: True
I tried the solution suggested by Atthaboon Sanurt, but it didn't help.
There was no alert, but the new window/tab didn't open either.
Here is where the problem is reported:
https://github.com/appium/appium/issues/6864
So far it looks like there is no solution and no plans to fix it.
I'm getting into UI testing, and for a couple of days now the UI tests have refused to start properly. I set up a simple test that taps a button, and when I run it, it hangs while launching the app, before the test even begins.
Note that it always hangs for exactly one minute and then proceeds with the test correctly.
If I delete the app from the Simulator device, or clear the entire Simulator's Content and Settings, then the test runs successfully and instantly on the first run. It hangs each time after that until I delete again. This is not great either, as I end up getting new Location approval prompts each time which might interfere with the app.
What's going on here?
t = 0.00s Start Test
t = 0.00s Set Up
t = 0.00s Launch com.domain.appName
2015-10-06 11:59:24.493 XCTRunner[66707:4085844] Continuing to run tests in the background with task ID 1
t = 0.92s Waiting for accessibility to load
t = 60.92s Wait for app to idle
... rest of test runs immediately
I am also facing this issue, but only occasionally. Re-attempting or rebooting the simulator fixes it, but only temporarily.
The answer at https://forums.developer.apple.com/thread/15780 worked for me:
The cause was pointing the launch screen at a storyboard that is also used in code and has outlets connected to a UIViewController subclass. These outlets can't be resolved by SpringBoard while it generates a launch image, and it seems to fail over and over before timing out after 60 seconds.
One solution is to clear out the launch screen setting.
Another solution is to create and add a launch screen to your project by following the instructions at https://developer.apple.com/library/ios/documentation/IDEs/Conceptual/AppDistributionGuide/ConfiguringYourApp/ConfiguringYourApp.html#//apple_ref/doc/uid/TP40012582-CH28-SW4 reproduced below:
Choose File > New > File.
Under iOS, select User Interface.
Click Launch Screen and click Next.
Enter a filename in the Save As text field, and click Create.
Configure your launch screen file using basic UIKit views, such as UIImageView and UILabel objects, and use Auto Layout constraints.
To set the launch screen file:
If necessary, open the “App Icons and Launch Images” section of the General pane.
From the Launch Screen File pop-up menu, choose a launch screen file.
I'm working through this tutorial and it works just fine in a simulator, except I don't understand how the methods are being called. The today view widget displays fine, but when I add breakpoints to the methods (e.g. viewDidLoad, widgetPerformUpdateWithCompletionHandler) the breakpoints never seem to be hit.
I'm trying to figure this out because I've added extra code (e.g. NSLog calls to display some values within the methods) but do not see any output from them.
Can someone explain why the breakpoints are not working? I'm guessing that it has something to do with the extension methods executing in the 'background', but I am not sure.
Thanks
You can get the breakpoints you set in your today extension to be hit by following the procedure below:
1) Set breakpoint in your extension code (viewDidLoad is a good option to test)
2) Launch your app as you normally would by selecting your app's target and hitting run.
3) Make sure that your extension is installed in the today view (open the today view and hit the edit button to add it if it is not)
4) Close the today view.
5) In Xcode select your today extension target and press the run button. You will be prompted to choose an app to run. Select "Today".
6) You should see the today window appear on the simulator (this also works on the device). Your breakpoint will be hit.
NOTE: You may hit an exception breakpoint in your app prior to the today extension launching because your app is sent to the background. If this happens just skip over the breakpoint and you will hit the breakpoint in your extension as expected. This procedure also allows you to see console statements from your extension.
I'm not sure if this is related to your specific situation, but in an app where I have 3 extensions, I noticed that breakpoints don't work in 2 of them.
I noticed, however, that the breakpoints start working if I run (from Xcode) the extension on which the breakpoints work and then access one of the other extensions (from inside the Photos app, in my case). For some reason, running the other extensions from Xcode would not trigger the breakpoints.
I'm working on the automated UI tests for my app and I'm having trouble when trying to set up the environment for running the tests. The plan is roughly this:
build the application
shutdown simulator if running
erase the simulator to make a clean install
install my app on the simulator
run UIAutomation tests
Everything is working, except that when the application is launched by Instruments to execute the tests, an alert appears asking whether the user allows notifications. This is all as expected, but I can't find a way to get rid of the alert.
Things I have already tried:
creating onAlert as the first thing in my test script, in case the alert appears before my alert callback is defined
delaying the target by 5 seconds, in case the tests actually start running before the UI of the app is visible in the simulator
I also went through all the permutations of the above that can be found on SO; my onAlert callback is never invoked, no matter what I do. So another thing I tried was:
trying to dismiss the alert with AppleScript
The script I wrote:
tell application "System Events"
tell process "iOS Simulator"
set allUIElements to entire contents of window 1
repeat with anElement in allUIElements
try
log anElement
end try
end repeat
end tell
end tell
and it displays:
static text “MyApp” Would Like to Send You Notifications of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
static text Notifications may include alerts, sounds, and icon badges. These can be configured in Settings. of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
UI element 3 of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
Looks like the buttons are placed inside the "UI element 3" but I can't retrieve any elements from inside it, let alone clicking on it. So I checked with Accessibility Manager:
It sits there as one of the children; the other ones are the notification title and message. But when I go to that element, it is highlighted and I see this:
It is identified as a generic element, and it doesn't have any children...
The interesting thing is that when I choose the OK button in the Accessibility Inspector, I can actually see it's a child of the window, yet it is never listed:
Can someone please shed some light on what is going on here? How can I press that button with Applescript?
If you are doing automation using Instruments, you will need to register a callback (onAlert) to perform any action on alerts.
But the problem in your case is that the alert appears before your script actually starts executing, and at that time no callback is registered for the alert.
So the alert can only be handled if it appears with a delay of around 10 seconds after the application starts. But this can only be controlled through the app's source code, not by your automation code.
So the only remaining option is to manually dismiss the alert once the fresh application is installed.
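If changing the app is an option, one way to implement the source-side delay mentioned above is to defer the permission request itself. A minimal sketch, assuming an iOS 10+ app using the UserNotifications framework (an app from the era of the question would delay registerUserNotificationSettings(_:) in the same way):

import Foundation
import UserNotifications

// Delay the notification permission request so that the automation script's
// onAlert callback has time to register before the system alert appears.
func requestNotificationPermissionAfterDelay(seconds: TimeInterval = 10) {
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
        UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, error in
            // React to the user's choice here.
        }
    }
}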
I am also facing the same problem and found it to be a limitation of the tool.
There are too many limitations of this tool, and that's why I shifted to UFT.
I had a similar problem. I just wanted the position of the last control on the alert, so I came up with the following piece of code:
on get_simulator_last_object_rect(simulator_index)
    tell application "System Events"
        set ProcessList to (unix id of processes whose name is "iOS Simulator")
        set myProcessId to item simulator_index of ProcessList
        tell window 1 of (processes whose unix id is myProcessId)
            -- Forcefully print the UI elements so that the Accessibility hierarchy is built
            UI elements
            -- Then wait briefly so that the Accessibility view hierarchy is ready
            delay 0.5
            set lowest_label_lowest_position to 0
            set _x to 0
            set _y to 0
            set _width to 0
            set _height to 0
            repeat with element in UI elements
                set {_x, _y} to position of element
                set {_width, _height} to size of element
                set current_control_lowest_position to _y + _height
                if current_control_lowest_position > lowest_label_lowest_position then set lowest_label_lowest_position to current_control_lowest_position - _height / 2
            end repeat
            return {{_x, _y}, {_width, lowest_label_lowest_position}}
        end tell
    end tell
end get_simulator_last_object_rect
I have a desktop app to control my actions. I use this AppleScript in my desktop app to get the frame of the last control. Once I have the frame, I activate the simulator, create a mouse event, and perform a click on the frame.
Although I have not tried it yet, I am pretty sure that you can also create mouse events from AppleScript and click on the frame / center of the frame.
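For example, a macOS helper (the desktop app mentioned above, or any small tool) could post the click with Quartz events once the rect returned by the script has been converted to a screen point; a minimal sketch, with that conversion left to the caller:

import CoreGraphics

// Post a left mouse click at a point in screen coordinates.
// The point would be derived from the rect returned by the AppleScript above.
func click(at point: CGPoint) {
    let down = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                       mouseCursorPosition: point, mouseButton: .left)
    let up = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                     mouseCursorPosition: point, mouseButton: .left)
    down?.post(tap: .cghidEventTap)
    up?.post(tap: .cghidEventTap)
}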
Hope this helps.
Thanks,
RKS