I managed to open the Control Center of the device, but I cannot identify the buttons; I need the Wi-Fi one specifically. I tried with the recorder and it is identified as
app.scrollViews.otherElements.scrollViews.otherElements.switches["Wi-Fi"]
but when I run the test again, it fails because it does not find the element.
I also tried to find it as other kinds of elements (buttons or the various bar elements), but nothing works. I also tried to identify it simply by its label using app.buttons["Wi-Fi"], still with no results.
Does anyone know a solution for this?
With Xcode 9 the Control Center is now accessible (it is controlled by the Springboard). Right now this only works on a physical device, because the Xcode 9 beta simulators don't have a Control Center; that may change when Xcode 9 is officially released. For now you have to use a real device.
This test opens Control Center and taps the Wi-Fi button:
func testSwitchOffWiFi() {
    let app = XCUIApplication()
    // Control Center belongs to the Springboard, not the app under test
    let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
    app.launch()

    // Open Control Center by dragging up from the bottom edge of the screen
    let coord1 = app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.99))
    let coord2 = app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
    coord1.press(forDuration: 0.1, thenDragTo: coord2)

    // The Wi-Fi toggle is exposed by the Springboard as a switch
    let wifiButton = springboard.switches["wifi-button"]
    wifiButton.tap()
}
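If you only want to turn Wi-Fi off when it is currently on, you can check the switch's value first. A small sketch; the "1"/"0" string values are an assumption about how the Springboard exposes the toggle's state:

```swift
import XCTest

// Sketch: only toggle the switch when Wi-Fi is currently on.
// Assumption: the Springboard reports the switch value as "1"/"0".
func switchOffWiFiIfNeeded(in springboard: XCUIApplication) {
    let wifiButton = springboard.switches["wifi-button"]
    if (wifiButton.value as? String) == "1" {  // assumed: "1" means on
        wifiButton.tap()
    }
}
```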
That's rather easily done using the Settings app...
First, you can tell whether airplane mode is on by querying the status bar:
XCUIElement* airplaneModeIcon = app.windows.otherElements[@"Airplane mode on"];
const bool isAirplaneModeEnabled = airplaneModeIcon.exists;
If you then find that you really need to switch airplane mode on or off, launch the Settings app:
XCUIApplication* settings = [[XCUIApplication alloc] initWithBundleIdentifier:@"com.apple.Preferences"];
[settings launch];
XCUIElement* airplaneModeCell = settings.tables.cells[@"Airplane Mode"];
// Do what you have to do with the cell...
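For reference, here is a Swift sketch of the same flow. The "Airplane Mode" cell identifier is taken from the answer above, and the "1"/"0" switch values are an assumption; verify both against your device's Settings app:

```swift
import XCTest

// Sketch: drive the Settings app to set airplane mode to a desired state.
func setAirplaneMode(enabled: Bool) {
    let settings = XCUIApplication(bundleIdentifier: "com.apple.Preferences")
    settings.launch()

    // The Airplane Mode row is exposed as a switch inside a table cell
    let airplaneSwitch = settings.tables.cells["Airplane Mode"].switches.firstMatch
    let isOn = (airplaneSwitch.value as? String) == "1"  // assumed "1"/"0" values
    if isOn != enabled {
        airplaneSwitch.tap()
    }
}
```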
Here's an adaptation of joern's answer for iPhones with a notch. Also, we don't need to refer to the app under test at all; the Springboard is enough.
func toggleWiFi() {
    let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")

    // Expand Control Center by dragging down from the top-right edge
    let start = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.01))
    let end = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.2))
    start.press(forDuration: 0.1, thenDragTo: end)

    // Perform the action
    let wifiButton = springboard.switches["wifi-button"]
    wifiButton.tap()

    // Hide Control Center by tapping an empty area
    let empty = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.1))
    empty.tap()
}
The Control Centre is outside the scope of your application under test and therefore cannot be accessed by your UI tests.
To disable Wi-Fi you would need to physically disconnect the device from the network, as it is not possible to disconnect from Wi-Fi programmatically.
Related
I'm trying to change the exposure in my camera app according to a certain point of the image.
I'm using the following code, which is triggered when the user taps the screen. For now I simply try to expose to the center.
@IBAction func didTap() {
    if captureDevice.isExposurePointOfInterestSupported {
        try! captureDevice.lockForConfiguration()
        captureDevice.exposurePointOfInterest = CGPoint(x: 0.5, y: 0.5)
        captureDevice.exposureMode = .continuousAutoExposure
        captureDevice.unlockForConfiguration()
    }
}
But nothing happens.
captureDevice.isExposurePointOfInterestSupported is true. The captureDevice is currently .builtInDualCamera.
This code is in a simple camera test app based on sample code. It shows the live camera image on screen.
Has anyone got exposurePointOfInterest working on iOS 14.4?
What could I be missing?
I actually ran into this issue yesterday. Turns out there's a problem with using exactly (0.5, 0.5). When I use (0.51, 0.51) it works every time 🤷
extension AVCaptureDevice {
    func change(_ block: (AVCaptureDevice) -> ()) {
        try! self.lockForConfiguration()
        block(self)
        self.unlockForConfiguration()
    }
}

@objc func handleTap() {
    device.change {
        $0.exposurePointOfInterest = CGPoint(x: 0.51, y: 0.51)
        $0.exposureMode = .autoExpose
    }
}
Update
It may also be worth noting that, although it's a point-specified exposure, the region around that point still has to be large enough to trigger an exposure adjustment. Let's call this the trigger region.
From what I understand from my tests, the point (0.5, 0.5) has a special effect on the trigger region's size. Whenever this point is used as the exposurePointOfInterest, the trigger region is rather large, regardless of whether exposureMode is .continuousAutoExposure or .autoExpose.
You can get an idea of the size of this region by using the following code, pointing your phone at a bright area (like a lamp), and seeing how close you have to get until a tap adjusts the exposure. You'll find that the exposure does adjust, but you have to get rather close.
@objc func handleTap() {
    device.change {
        $0.exposurePointOfInterest = CGPoint(x: 0.5, y: 0.5)
        $0.exposureMode = .autoExpose
    }
}
Or, you could skip the tap and just keep the properties exposureMode and exposurePointOfInterest at their default values of .continuousAutoExposure and (0.5, 0.5). Or you could use the native camera app and see when it automatically adjusts the exposure. The results are the same.
Now, if you were to set the exposurePointOfInterest to a value close to but not equal to the midpoint, say (0.51, 0.51), you'll find that the trigger region becomes much, much smaller.
You could also use .continuousAutoExposure and call this only once; you'll find that the automatic exposure adjustments are a lot more sensitive, as the trigger region is a lot smaller:
override func viewDidLoad() {
    super.viewDidLoad()
    device.change {
        $0.exposurePointOfInterest = CGPoint(x: 0.51, y: 0.51)
        $0.exposureMode = .continuousAutoExposure
    }
}
To get an idea of the size of this smaller region, open the native camera app and tap somewhere to focus/expose at that point. You'll see a small bounding box. That's pretty much the size of the trigger region.
Say you have a tap like so:
@objc func handleTap() {
    device.change {
        $0.exposurePointOfInterest = CGPoint(x: 0.51, y: 0.51)
        $0.exposureMode = .autoExpose
    }
}
If nothing happens, the region is not large enough, and you should be able to reproduce the same no-effect in the native camera app when you try to tap to expose at that point.
Side Note
Your didTap() method sets the default values, so it is essentially a no-op.
If you want to adjust exposure on a tap, use .autoExpose if the point is always the same. Don't use .continuousAutoExposure, because that keeps adjusting the exposure all the time, not just on the tap. It only makes sense to use it if the tap will change the point.
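If the tap is supposed to change the point, you can convert the tap's view location into the capture device's coordinate space with AVCaptureVideoPreviewLayer. A minimal sketch, assuming `previewLayer` and `device` are your existing preview layer and capture device, and reusing the `change` extension from above:

```swift
import AVFoundation
import UIKit

@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    let layerPoint = recognizer.location(in: recognizer.view)
    // Convert from layer coordinates to the device's normalized (0...1) space
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)
    device.change {
        $0.exposurePointOfInterest = devicePoint
        $0.exposureMode = .autoExpose
    }
}
```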
Simply put, I wish to have an automated test as part of my UI test suite that can scroll the map. I am not concerned about the location, I just need to move it from its original position.
Why?
Two reasons:
The UI updates once the user interacts with the map. I wish to validate these changes
While I can easily verify this on a device, I also want to include automated screenshots via fastlane. Having a test perform this makes that possible
What have I tested so far?
I found the following from a related issue and tested without success:
let map = app.maps.element
let start = map.coordinate(withNormalizedOffset: CGVector(dx: 200, dy: 200))
let end = map.coordinate(withNormalizedOffset: CGVector(dx: 250, dy: 250))
start.press(forDuration: 0.01, thenDragTo: end)
I can confirm that the map element is correctly set and contains the expected information.
I can also confirm that the coordinates I am using fall within the bounds of the map on the screen. I have also tested with a wide range of other values just in case.
I'm not concerned about how it is moved, or where it is moved to. All I need is to replicate a user moving the map by 1 point.
coordinate(withNormalizedOffset:) works a bit differently: the vector is multiplied by the size of your element.
From Apple's docs:
The coordinate’s screen point is computed by adding normalizedOffset
multiplied by the size of the element’s frame to the origin of the
element’s frame.
That means that if you want to start dragging at the center of your map and then drag it a bit, you have to use it like this:
let map = app.maps.element
let start = map.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
let end = map.coordinate(withNormalizedOffset: CGVector(dx: 0.6, dy: 0.6))
start.press(forDuration: 0.01, thenDragTo: end)
This puts the start coordinate at 0.5 * map.frame.width and 0.5 * map.frame.height and the end coordinate at 0.6 * map.frame.width and 0.6 * map.frame.height
When you run the UITest with this you'll see that it drags the map.
With your parameters it puts the start coordinate at 200 * map.frame.width and 200 * map.frame.height, which is far outside the screen, so no dragging occurs.
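If you prefer to think in points rather than normalized fractions, a small hypothetical helper can start from the element's center and drag by an absolute point offset (the helper name and defaults are my own, not part of XCTest):

```swift
import XCTest

extension XCUIElement {
    /// Hypothetical helper: drag from the element's center by an absolute
    /// point offset instead of a normalized fraction of its frame.
    func drag(fromCenterBy offset: CGVector, duration: TimeInterval = 0.01) {
        let start = coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
        let end = start.withOffset(offset)
        start.press(forDuration: duration, thenDragTo: end)
    }
}

// Usage: move the map roughly 20 points down and to the right
// app.maps.element.drag(fromCenterBy: CGVector(dx: 20, dy: 20))
```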
Since iOS 11, XCUITest can no longer find hit points for UIImages, which means you cannot tap an image or drag a touch to it using press(forDuration:thenDragTo:).
There is a workaround for tapping an image that works (tapping on coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))). The same approach does not work for the thenDragTo method, because it expects an XCUIElement.
Does anyone have an idea how to get the thenDragTo method to work (preferably without having to edit production code)?
Thanks in advance
In my tests with Xcode 9.2 it accepts an XCUICoordinate:
extension XCUICoordinate {
    open func press(forDuration duration: TimeInterval, thenDragTo otherCoordinate: XCUICoordinate)
}
I am able to use it like this:
let fromCoordinate = contentElement.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
let toCoordinate = fromCoordinate.withOffset(CGVector(dx: 0, dy: 260))
fromCoordinate.press(forDuration: 0.01, thenDragTo: toCoordinate)
I want to use Xcode UI tests with Fastlane Snapshot to take screenshots of a Cordova app. Because my entire app is just a web view, all the Xcode UI test helper methods become irrelevant; I just want to tap on specific points, e.g. tap(x: 10, y: 10) should produce a tap at the point {10px; 10px}.
That's probably very simple, but I can't figure out how to do it.
Thanks.
You can tap a specific point with the XCUICoordinate API. Unfortunately you can't just say "tap 10,10" referencing a pixel coordinate. You will need to create the coordinate with a relative offset to an actual view.
We can use the mentioned web view to interact with the relative coordinate.
let app = XCUIApplication()
let webView = app.webViews.element
let coordinate = webView.coordinateWithNormalizedOffset(CGVector(dx: 10, dy: 10))
coordinate.tap()
Side note, but have you tried interacting with the web view directly? I've had a lot of success using app.links["Link title"].tap() or app.staticTexts["A different link title"].tap(). Here's a demo app I put together demonstrating interacting with a web view.
Update: As Michal W. pointed out in the comments, you can now tap a coordinate directly, without worrying about normalizing the offset.
let normalized = webView.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))
let coordinate = normalized.withOffset(CGVector(dx: 10, dy: 10))
coordinate.tap()
Notice that I pass 0,0 to the normalized vector and then the actual point, 10,10, to the second call.
@joe To go a little further off of Joe Masilotti's approach, I put mine in an extension and gave prepositional phrases to the global and local params.
func tapCoordinate(at xCoordinate: Double, and yCoordinate: Double) {
    let normalized = app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))
    let coordinate = normalized.withOffset(CGVector(dx: xCoordinate, dy: yCoordinate))
    coordinate.tap()
}
By giving the parameters identifiable names, the call site is easy to understand, for example:
tapCoordinate(at: 100, and: 200)
I found Laser's answer to work fine with Xcode 11, but made a few tweaks to easily integrate it into my testing.
extension XCUIApplication {
    func tapCoordinate(at point: CGPoint) {
        let normalized = coordinate(withNormalizedOffset: .zero)
        let offset = CGVector(dx: point.x, dy: point.y)
        let coordinate = normalized.withOffset(offset)
        coordinate.tap()
    }
}
Now, when I need to tap on a given location, I just provide a CGPoint and call this against my XCUIApplication like so:
let point = CGPoint(x: xCoord, y: yCoord)
app.tapCoordinate(at: point)
<something>.coordinate(withNormalizedOffset: CGVector.zero).withOffset(CGVector(dx: 10, dy: 60)).tap()
Pass .zero as the normalized vector and then the actual point (10, 60).
I have an app with a Today Widget, and I would like to perform some UI testing on it.
I found a way to open the Today/Notifications panel. It seems easy:
let statusBar = XCUIApplication().statusBars.elementBoundByIndex(0)
statusBar.swipeDown()
But then I can't find a way to do anything useful. It is possible to record UI interactions in the Today/Notifications panel, but the resulting code can't reproduce my actions.
First you need to open the Today View, which you can do like this:
let app = XCUIApplication()
// Open Notification Center
let bottomPoint = app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 2))
app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0)).press(forDuration: 0.1, thenDragTo: bottomPoint)
// Open Today View
let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
springboard.scrollViews.firstMatch.swipeRight()
Then, to access everything you need, just use springboard, for example:
let editButton = springboard.buttons["Edit"]
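Putting the pieces together, a full test could look like the sketch below. The "Edit" label comes from the answer above; any other labels you query are your widget's own identifiers, so adjust them to match:

```swift
import XCTest

func testTodayWidget() {
    let app = XCUIApplication()
    app.launch()

    // Pull down Notification Center from the top of the screen
    let bottomPoint = app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 2))
    app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))
        .press(forDuration: 0.1, thenDragTo: bottomPoint)

    // Swipe right to reveal the Today View
    let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
    springboard.scrollViews.firstMatch.swipeRight()

    // Interact with the widget through the springboard
    let editButton = springboard.buttons["Edit"]
    XCTAssertTrue(editButton.waitForExistence(timeout: 5))
    editButton.tap()
}
```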
There's a similar problem testing extensions. I've found that you must tap the element at where it is on the screen, rather than the element itself, in order to drive the interaction. I haven't tested this with your scenario, but I haven't found anything un-tappable via this method yet.
Here is a Swift example of tapping the "X" button on the Springboard for an app icon, which similarly cannot be tapped via typical interaction:
let iconFrame = icon.frame // App icon on the springboard
let springboardFrame = springboard.frame // The springboard (homescreen)
icon.pressForDuration(1.3) // tap and hold
// Tap the little "X" button at approximately where it is. The X is not exposed directly
springboard.coordinateWithNormalizedOffset(CGVectorMake((iconFrame.minX + 3) / springboardFrame.maxX, (iconFrame.minY + 3) / springboardFrame.maxY)).tap()
By getting the frame of the superview and the subview, you can calculate where on the screen the element should be. Note that coordinateWithNormalizedOffset takes a vector in the range [0,1], not a frame or pixel offset. Tapping the element itself at a coordinate doesn't work, either, so you must tap at the superview / XCUIApplication() layer.
More generalized example:
let myElementFrame = myElement.frame
let appFrame = XCUIApplication().frame
let middleOfElementVector = CGVectorMake(myElementFrame.midX / appFrame.maxX, myElementFrame.midY / appFrame.maxY)
// Tap the element from the app level at the given coordinate
XCUIApplication().coordinateWithNormalizedOffset(middleOfElementVector).tap()
If you need to access the Springboard layer and go outside your application, you can do so with:
let springboard = XCUIApplication(privateWithPath: nil, bundleID: "com.apple.springboard")
springboard.resolve()
But you'll need to expose some private XCUITest methods with Objective-C:
@interface XCUIApplication (Private)
- (id)initPrivateWithPath:(id)arg1 bundleID:(id)arg2;
@end

@interface XCUIElement (Private)
- (void)resolve;
@end