How to detect the Flutter red screen in Appium?

I am using Appium Inspector to write a script for mobile testing for a school project.
My problem is that I don't know how to detect the red error screen produced by a Flutter app.
This image is just an example of what I mean:
I want my script to fail the test whenever the result is this red error screen.
But in Appium Inspector the red screen is just a plain Android view with no access to the displayed text, so how can I write an automated test for it if I don't have a unique identifier for these screens?
These are the attributes visible to me in Appium Inspector for the red screen shown:
elementId: 00000000-0000-07a4-0000-006700000011
index: 1
package: com.example.sw_code
class: android.view.View
text: (empty)
resource-id: (empty)
checkable: false
checked: false
clickable: false
enabled: true
focusable: false
focused: false
long-clickable: false
password: false
scrollable: false
selected: false
bounds: [0,294][1440,2579]
displayed: true

Flutter UI attributes are not visible to Appium's UiAutomator2 or XCUITest drivers, which we typically use to automate Android and iOS applications respectively.
For this reason, the Appium folks have created a separate driver for testing Flutter-based applications. Check it out here: https://github.com/appium-userland/appium-flutter-driver
Note that this driver is in an experimental phase, so you will need to weigh the pros and cons before going with it.
The other option is to use Flutter's own flutter driver (link: https://docs.flutter.dev/testing), but be aware that it requires writing tests in Dart.
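If you have to stay on the plain UiAutomator2 driver, a workaround is a heuristic check rather than a unique identifier: in the attribute dump above, the red screen shows up as a bare android.view.View with an empty text and resource-id that covers almost the whole window. Below is a minimal sketch of that idea, assuming the Java client and an already-started session; the XPath and the 0.8 coverage threshold are my own assumptions, not a documented way of detecting the error screen:
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.Dimension;
import org.openqa.selenium.Rectangle;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

// Heuristic: a bare android.view.View with no text that fills most of the
// window is treated as the Flutter red error screen.
public static boolean looksLikeFlutterErrorScreen(WebDriver driver) {
    Dimension window = driver.manage().window().getSize();
    List<WebElement> views = driver.findElements(
            By.xpath("//android.view.View[not(@text) or @text='']"));
    for (WebElement view : views) {
        Rectangle r = view.getRect();
        double coverage = (double) (r.getWidth() * r.getHeight())
                / (window.getWidth() * window.getHeight());
        if (coverage > 0.8) {   // arbitrary threshold, tune it for your app
            return true;
        }
    }
    return false;
}
You could then fail the test whenever this returns true. It can obviously give false positives on screens that legitimately consist of one large unlabeled view, so the Flutter driver remains the more robust option.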

Related

In Appium, is it possible to check that an app's dark mode feature is enabled?

In the app, I am able to go to Settings and turn on the Dark Mode feature.
I want to check that this feature is working and write a test validating that the app went into Dark Mode successfully.
If you know the particular attribute (text, background color, etc.) of an element that changes, you can use the Get Element Attribute command:
MobileElement element = (MobileElement) driver.findElementByAccessibilityId("SomeAccessibilityID");
String contentDesc = element.getAttribute("content-desc");
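For example, here is a rough sketch of how such a check could look, assuming the same driver setup as above and JUnit's Assert; the accessibility id and the attribute are placeholders for whatever actually changes in your app when Dark Mode is on:
// Read the attribute before toggling Dark Mode, toggle it in Settings,
// then verify the attribute changed. The id and attribute name are hypothetical.
MobileElement screen = (MobileElement) driver.findElementByAccessibilityId("SettingsRoot");
String before = screen.getAttribute("content-desc");
// ... drive the UI to enable Dark Mode here ...
String after = screen.getAttribute("content-desc");
Assert.assertNotEquals("attribute should change when Dark Mode is enabled", before, after);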

How to use Appium to select the elements from a searchview dropdownlist?

As you can see in this picture, I can't locate the id or XPath of the item I want to click. I'm using AndroidDriver.
I have tried the following code:
AndroidElement searchView = androidDriver.FindElementByAndroidUIAutomator("new UiScrollable(new UiSelector()).scrollIntoView(text(\"Tribology Testing\"));");
but I still can't get Tribology Testing to be selected.
We cannot help you come up with the correct element locator without seeing your application layout; a screenshot doesn't help at all.
You can see the layout by invoking the AndroidDriver.getPageSource() function: it returns the whole UI hierarchy of the current screen, from which you can build the corresponding selector (see the sketch after the list below).
Other options are:
Android Device Monitor (part of the Android SDK)
Layout Inspector (part of Android Studio)
The XPath feature of Appium Studio
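As a rough illustration of the getPageSource() approach (written against the Java client and assuming the same androidDriver as in the question; the text value is just the one from the screenshot):
// Dump the current UI hierarchy so you can see the real class names,
// resource-ids and text values, then build a locator from them.
String pageSource = androidDriver.getPageSource();
System.out.println(pageSource);

// Once you know how the entry is exposed, a UiScrollable selector like this
// one (note the nested UiSelector inside scrollIntoView) is a common way to
// scroll to it and select it:
AndroidElement item = androidDriver.findElementByAndroidUIAutomator(
        "new UiScrollable(new UiSelector().scrollable(true))"
        + ".scrollIntoView(new UiSelector().text(\"Tribology Testing\"))");
item.click();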

How can you see the XCUIElement tree?

Background:
I'm experimenting with UI-level testing in iOS 9.0 with the Xcode GM.
Question:
Is there a command in the Xcode GM that will allow you to see a 'tree' of accessible elements and their relationships? Something similar to the 'page' command in Appium?
Ideally I would be able to run a command in the debugger that would give me a list of elements available for selection/manipulation. Currently you can use debugDescription on a single XCUIElement but that only gives you info for that element.
Set a breakpoint where you would like to see the tree, then in the debugger type:
po print(XCUIApplication().debugDescription)
That prints out everything XCUITesting has access to. You can also just throw that into your test:
func testTreeExample() {
    XCUIApplication().buttons["login"].tap()
    print(XCUIApplication().debugDescription)
    XCUIApplication().buttons["next"].tap()
    print(XCUIApplication().debugDescription)
}
That way, if you are having trouble finding something, you can have it automatically print out what the app sees right after you do something.
This isn't exactly what you're asking for, but Xcode’s Accessibility Inspector makes it much easier to look at your view hierarchy in terms of what elements are accessible via Identifiers. (N.B. It's not the "Label" in IB's Accessibility panel that matters, it's the "Identifier" field.):
In Xcode 7.2, open Xcode->Open Developer Tool->Accessibility Inspector. (You may need to give the app permission to run in System Preferences.) Then launch your iOS app from Xcode and hover over any UI element in the SIMULATOR. You’ll see comprehensive information about the element type, description, hierarchy, etc.
Anytime you record UI actions and the output doesn't look right, use the tool to figure out what accessibility descriptions need to be added, changed, or removed. (I spent a couple days trying to get a deeply embedded UISegmentedControl to change via the UI Test harness, and the problem became obvious once I figured out how to use the Accessibility Inspector tool.)
Thanks to the folks at shinobicontrols.com for the great tip!
I would suggest choosing Debug > View Debugging > Capture View Hierarchy from the menu bar when running in debug. Not only do you get a way of visually representing the views, but the left-side debug navigator also shows the hierarchy. This may not be one-for-one with UI Testing's perspective, but it can be very helpful. Hope that helps.
The way Appium does this is using Facebook WebdriverAgent.
As far as I can tell, they essentially start from the root application element, collect information about each child, and then recurse down the tree.
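Purely as an illustration of that start-at-the-root-and-recurse idea (this is not WebDriverAgent's actual code), the same kind of walk can be done on the Appium/Java side over the XML that getPageSource() returns; the "name" attribute is what the XCUITest driver usually exposes, but treat the details as assumptions, and note that exception handling is omitted:
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.openqa.selenium.WebDriver;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

// Print every element's tag and accessibility name, indenting by depth.
static void printTree(Element element, String indent) {
    System.out.println(indent + element.getTagName() + "  name=" + element.getAttribute("name"));
    NodeList children = element.getChildNodes();
    for (int i = 0; i < children.getLength(); i++) {
        Node child = children.item(i);
        if (child instanceof Element) {
            printTree((Element) child, indent + "  ");
        }
    }
}

// Parse the page source and start the recursion from the root element.
static void dumpHierarchy(WebDriver driver) throws Exception {
    Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(
                    driver.getPageSource().getBytes(StandardCharsets.UTF_8)));
    printTree(doc.getDocumentElement(), "");
}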
What about http://fbidb.io?
With the idb ui describe-all command you get the accessibility information of all the elements on the screen (not the entire app): https://fbidb.io/docs/commands#accessibility-info
Put a breakpoint in any of your tests, then just do po XCUIApplication() in the debugger, and it will print out the whole app's accessibility hierarchy in an easy-to-read tree format.

Dismiss alert on initial launch on iOS simulator

I'm working on the automated UI tests for my app and I'm having trouble when trying to set up the environment for running the tests. The plan is roughly this:
build the application
shutdown simulator if running
erase the simulator to make a clean install
install my app on the simulator
run UIAutomation tests
Everything is working, except that when the application is launched by Instruments to execute the tests, an alert appears asking whether the user allows notifications. This is all as expected, but I can't find a way to get rid of the alert.
Things I have already tried:
creating onAlert as the first thing in my test script, in case the alert appears before my alert callback is defined
delaying the target by 5 seconds, in case the tests actually run before the UI of the app is visible in the simulator
I also went through all the permutations of the above that can be found on SO; I never get my onAlert callback invoked, no matter what I do. So another thing I tried was:
dismissing the alert with AppleScript
The script I wrote:
tell application "System Events"
    tell process "iOS Simulator"
        set allUIElements to entire contents of window 1
        repeat with anElement in allUIElements
            try
                log anElement
            end try
        end repeat
    end tell
end tell
and it displays:
static text “MyApp” Would Like to Send You Notifications of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
static text Notifications may include alerts, sounds, and icon badges. These can be configured in Settings. of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
UI element 3 of window iOS Simulator - iPhone 6 - iPhone 6 / iOS 8.1 (12B411) of application process iOS Simulator
It looks like the buttons are placed inside "UI element 3", but I can't retrieve any elements from inside it, let alone click on it. So I checked with Accessibility Manager:
It sits there as one of the children; the other ones are the notification title and message. But when I go to that element, it is highlighted and I see this:
It is identified as a generic element and it doesn't have any children...
The interesting thing is that when I choose the OK button in the Accessibility Inspector, I can actually see it is a child of the window, yet it is never listed:
Can someone please shed some light on what is going on here? How can I press that button with Applescript?
If you are doing automation using Instruments, you will need to register a callback (onAlert) to perform any action on alerts.
The problem in your case is that the alert appears before your script actually starts executing, and at that point no callback is registered for the alert.
So the alert can only be handled if it appears with a delay of around 10 seconds after you start the application, and that can only be controlled from the app's source code, not from your automation code.
The only option left, then, is to manually dismiss the alert once the fresh application is installed.
I am also facing the same problem and found it to be a limitation of the tool.
There are too many limitations of this tool, and that's why I shifted to UFT.
I had a similar problem. I just wanted the position of the last control on the alert, so I came up with the following piece of code:
on get_simulator_last_object_rect(simulator_index)
    tell application "System Events"
        set ProcessList to (unix id of processes whose name is "iOS Simulator")
        set myProcessId to item simulator_index of ProcessList
        tell window 1 of (processes whose unix id is myProcessId)
            -- Forcefully read the UI elements so that the Accessibility hierarchy is built
            UI elements
            -- Then wait briefly so that the Accessibility view hierarchy is ready
            delay 0.5
            set lowest_label_lowest_position to 0
            set _x to 0
            set _y to 0
            set _width to 0
            set _height to 0
            repeat with element in UI elements
                set {_x, _y} to position of element
                set {_width, _height} to size of element
                set current_control_lowest_position to _y + _height
                if current_control_lowest_position > lowest_label_lowest_position then set lowest_label_lowest_position to current_control_lowest_position - _height / 2
            end repeat
            return {{_x, _y}, {_width, lowest_label_lowest_position}}
        end tell
    end tell
end get_simulator_last_object_rect
I have a desktop app to control my actions. I use this AppleScript in my desktop app to get the frame of the last control. Now that I have the frame, I create a mouse event and perform a click on the frame after activating the Simulator.
Although I have not tried it yet, I am pretty sure that you can also create mouse events from AppleScript and click on the frame, or on the centre of the frame (a sketch of the same idea follows below).
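As a sketch of the same click done from a desktop Java helper (java.awt.Robot) instead of AppleScript; frameX, frameY, frameWidth and frameHeight are hypothetical variables holding the rect returned by the handler above, and exception handling is omitted:
import java.awt.Robot;
import java.awt.event.InputEvent;

// Move the system mouse cursor to the centre of the control's frame and click.
Robot robot = new Robot();
int clickX = frameX + frameWidth / 2;
int clickY = frameY + frameHeight / 2;
robot.mouseMove(clickX, clickY);
robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
Remember that the Simulator window must be frontmost (activated) before the click is sent.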
Hope this helps.
Thanks,
RKS

How to click/tap on content drawn using core graphics in appium for iOS

I am new to Appium and use it for iOS native app automation.
Appium version : 1.2.0
Webdriver : Selenium
Testing framework : JUnit
Programming language : Java
I have an iOS application where the contents are drawn using Core Graphics. I need to click/tap on certain content (a word) using Appium at specified coordinates.
My question is: how can I tap on content drawn using Core Graphics?
For example, how can I click on "Start" using Appium with coordinates, where the origin is x=0, y=102 and the size is height=305, width=320?
The content is present within this area and must be scrolled to reach other content.
I tried .tap and .press of TouchAction(); the test case passed, but nothing was actually clicked.
Please help me solve this.
Thanks in advance.
This is the general gist of what you need to do to tap something (my attempt at a Java translation of my Python code):
WebElement application = driver.findElement(By.xpath("//UIAApplication[1]"));
TouchAction action = new TouchAction(driver);
action.tap(application, x, y).perform();   // x and y are offsets within the application element
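Applied to the numbers in the question (a region with origin x=0, y=102 and size width=320, height=305), a hedged example would tap the centre of that region; note that with the older java-client the offsets are relative to the element's top-left corner, which for the root application element coincides with screen coordinates:
// Centre of the drawn region: (0 + 320/2, 102 + 305/2) ≈ (160, 254)
WebElement canvas = driver.findElement(By.xpath("//UIAApplication[1]"));
new TouchAction(driver).tap(canvas, 160, 254).perform();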
When testing drawing/tapping code, I like to point it at a sample drawing app like this one, which lets me see exactly what's going on.
