Since iOS 11, XCUITest is no longer able to find hitpoints for UIImages, which means an image can neither be tapped nor used as the target of a drag via press(forDuration:thenDragTo:).
There is a working workaround for tapping an image (using tap() on coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))). The same approach does not work for the thenDragTo part, because it expects an XCUIElement.
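For reference, the tap workaround looks roughly like this (the image query and identifier are just examples):
// Tap via a coordinate resolved from the image instead of tapping the element itself.
let image = XCUIApplication().images["someImage"] // "someImage" is a placeholder identifier
image.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0)).tap()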
Does anyone have an idea how to get the thenDragTo method to work (preferably without having to edit production code)?
Thanks in advance
It accepts an XCUICoordinate in my tests with Xcode 9.2:
extension XCUICoordinate {
    open func press(forDuration duration: TimeInterval, thenDragTo otherCoordinate: XCUICoordinate)
}
I am able to use it like this:
let fromCoordinate = contentElement.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
let toCoordinate = fromCoordinate.withOffset(CGVector(dx: 0, dy: 260))
fromCoordinate.press(forDuration: 0.01, thenDragTo: toCoordinate)
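Applying that to the image-drag case from the question, a sketch could look like this (the accessibility identifiers are just placeholders):
// Build both endpoints as XCUICoordinates so no hitpoint lookup on the image is needed.
// "draggableImage" and "dropTarget" are placeholder identifiers.
let app = XCUIApplication()
let start = app.images["draggableImage"].coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
let end = app.otherElements["dropTarget"].coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
start.press(forDuration: 0.5, thenDragTo: end)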
I want 8 functions to be called at random. I know the functions work individually, but when I try to run them randomly with this code, nothing appears in the Simulator...
func randomizeBuildingFunction() {
    let wait = SKAction.wait(forDuration: 6)
    let randomBuildings = [createBuildingOne, createBuildingTwo, createBuildingThree, createBuildingFour,
                           createBuildingFive, createBuildingSix, createBuildingSeven, createBuildingEight]
    let useRandomResult = SKAction.run {
        let randomResult = Int(arc4random_uniform(UInt32(randomBuildings.count)))
        return randomBuildings[randomResult]()
    }
    SKAction.repeatForever(SKAction.sequence([useRandomResult, wait]))
}
randomizeBuildingFunction()
This code is in the didMove(to:) method. Here is an example of one of the functions, since they are all essentially the same apart from slight interval modifications and texture changes.
func createBuildingOne() {
    let one = SKSpriteNode(imageNamed: "BuildingOne.png")
    one.anchorPoint = CGPoint(x: 0.5, y: 0.5)
    one.physicsBody = SKPhysicsBody(rectangleOf: one.size)
    one.physicsBody?.affectedByGravity = false
    one.physicsBody?.contactTestBitMask = ColliderType.Buildings.rawValue
    one.physicsBody?.categoryBitMask = ColliderType.Buildings.rawValue
    one.physicsBody?.collisionBitMask = ColliderType.Buildings.rawValue
    self.addChild(one)
    one.zPosition = 2
    one.position = CGPoint(x: self.frame.width * 0.5, y: -self.frame.height * 0.5 + one.size.height / 1.5)
    let moveLeft = SKAction.moveBy(x: -self.frame.width - one.size.width, y: 0, duration: 6)
    one.run(SKAction.sequence([moveLeft, SKAction.removeFromParent()]))
}
Is there something wrong with my randomizing code at the top? The app builds fine and there aren't any errors...
Thanks in advance!
Thanks for the comment @Knight0fDragon! Yes, you are right: after hours of wondering what I was missing, and a little help from a friend of mine who is an Apple developer, I finally realized that I had to call the run block somehow...
Hence the answer to my question was these two little lines of code:
let useFunctionForever = SKAction.repeatForever(SKAction.sequence([useRandomResult, wait]))
run(useFunctionForever)
I had to run the repeatForever action, which would then call and run my block. I already had the repeatForever action, but my mistake was expecting it to run automatically when the function was called.
So my issue wasn't really with arc4random_uniform, but with the repeatForever part of my code.
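For completeness, here is roughly what the corrected function looks like with that fix applied (the building functions are unchanged from above):
func randomizeBuildingFunction() {
    let wait = SKAction.wait(forDuration: 6)
    let randomBuildings = [createBuildingOne, createBuildingTwo, createBuildingThree, createBuildingFour,
                           createBuildingFive, createBuildingSix, createBuildingSeven, createBuildingEight]
    let useRandomResult = SKAction.run {
        let randomResult = Int(arc4random_uniform(UInt32(randomBuildings.count)))
        randomBuildings[randomResult]()
    }
    // The fix: actually run the repeatForever action on the scene instead of only creating it.
    let useFunctionForever = SKAction.repeatForever(SKAction.sequence([useRandomResult, wait]))
    run(useFunctionForever)
}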
The company that I work for has both an iOS and a tvOS version of an app. The app draws a script at the bottom for the user to keep up during a workout, like so:
This has worked fine for more than a year in any app targeted for iOS. However, when targeting tvOS with the same drawing code, instead of drawing the way I would expect, the tvOS view keeps what I'll call "traces" of every line drawn so far, rather than showing only the line for the current time.
The drawing code is as follows:
func addIndicatorLine(_ context: CGContext?, rect: CGRect, start: CGPoint, color: UIColor = UIColor.white) {
    guard let context = context else { return }
    // Flip the coordinate system so y = 0 is at the bottom of the rect
    context.translateBy(x: rect.origin.x, y: rect.height)
    context.scaleBy(x: 1.0, y: -1.0)
    context.setLineWidth(1.0)
    context.setStrokeColor(color.cgColor)
    context.move(to: CGPoint(x: start.x, y: 0))
    context.addLine(to: CGPoint(x: start.x, y: rect.size.height - 5))
    context.strokePath()
    // Undo the transforms so subsequent drawing is unaffected
    context.scaleBy(x: 1.0, y: -1.0)
    context.translateBy(x: -rect.origin.x, y: -rect.height)
}
Am I missing something obvious, or is there something different I have to do for the tvOS version of the drawing code?
After a couple of days of trying to figure this out, thanks to an answer (https://stackoverflow.com/a/43898524/2820553) that pointed me to a line in the docs, I realized I needed to give my view a background color, otherwise drawing errors may occur:
If the view’s opaque property is also set to YES, the backgroundColor property of the view must not be nil or drawing errors may occur.
Simply setting the view's background to a clear color solved this issue. Amazing that it didn't manifest in iOS but did in tvOS. Hope this helps anyone else looking for something similar.
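For reference, the fix is essentially one line of setup in the custom drawing view (ScriptView is just a placeholder name):
import UIKit

class ScriptView: UIView { // placeholder name for the custom drawing view
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Per the UIView docs: with isOpaque set to true, backgroundColor must not be nil,
        // or drawing errors may occur. A clear color is enough to satisfy that.
        backgroundColor = .clear
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        backgroundColor = .clear
    }
}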
I managed to open the Control Center of the device, but I cannot identify the buttons; I need the Wi-Fi one specifically. I tried with the recorder and it is identified as
app.scrollViews.otherElements.scrollViews.otherElements.switches["Wi-Fi"]
but when I try to run the test again, it fails because it does not find the element.
I also tried to find it as other kinds of elements (buttons or all kinds of bar elements), but nothing works. I also tried to identify it simply by its label, using app.buttons["Wi-Fi"], and still got no results.
Does anyone know a solution for this?
With Xcode 9 the Control Center is now accessible (Springboard controls it). Right now this is only possible on a physical device, because the Xcode 9 beta simulators don't have a Control Center. Maybe that will be fixed when Xcode 9 is officially released; for now you have to use a real device.
This test opens the Control Center and taps the Wi-Fi button:
func testSwitchOffWiFi() {
    let app = XCUIApplication()
    let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
    app.launch()

    // open control center
    let coord1 = app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.99))
    let coord2 = app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
    coord1.press(forDuration: 0.1, thenDragTo: coord2)

    let wifiButton = springboard.switches["wifi-button"]
    wifiButton.tap()
}
That's rather easily done using the Settings app...
First, you can check whether the airplane mode icon is present by querying the status bar.
XCUIElement *airplaneModeIcon = app.windows.otherElements[@"Airplane mode on"];
const BOOL isAirplaneModeEnabled = airplaneModeIcon.exists;
If after that you realize that you really need to set airplane mode on or off, you need to launch the Settings app.
XCUIApplication *settings = [[XCUIApplication alloc] initWithBundleIdentifier:@"com.apple.Preferences"];
[settings launch];
XCUIElement *airplaneModeCell = settings.tables.cells[@"Airplane Mode"];
// Do what you have to do with the cell...
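If your test target is otherwise Swift, the same idea can be sketched like this (the labels come from the Objective-C snippet above; the switch lookup and its value handling are assumptions and may differ between iOS versions):
import XCTest

/// Checks the status bar icon, then toggles airplane mode in the Settings app if needed.
func setAirplaneMode(_ enabled: Bool, appUnderTest app: XCUIApplication) {
    let airplaneModeIcon = app.windows.otherElements["Airplane mode on"]
    guard airplaneModeIcon.exists != enabled else { return } // already in the desired state

    let settings = XCUIApplication(bundleIdentifier: "com.apple.Preferences")
    settings.launch()
    let airplaneModeSwitch = settings.tables.cells["Airplane Mode"].switches.firstMatch
    airplaneModeSwitch.tap()
}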
Here's an adaptation of joern's answer for iPhones with a notch. Also, we don't need to refer to the app under test at all; Springboard is enough.
func toggleWiFi() {
    let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")

    // expand control center
    let start = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.01))
    let end = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.2))
    start.press(forDuration: 0.1, thenDragTo: end)

    // perform the action
    let wifiButton = springboard.switches["wifi-button"]
    wifiButton.tap()

    // hide control center
    let empty = springboard.coordinate(withNormalizedOffset: CGVector(dx: 0.9, dy: 0.1))
    empty.tap()
}
The Control Center is outside the scope of your application under test and therefore cannot be accessed by your UI tests.
To disable Wi-Fi, you need to physically disconnect the device from the Internet, as it's not possible to disconnect from Wi-Fi programmatically.
I'm looking to move an image (shape) back and forth across the screen forever.
I want the image to stop when you tap on the screen.
I found some code, but it's all in Objective-C, and even when I tried to convert it to Swift I couldn't figure it out.
Any help would be appreciated.
If you are using SpriteKit then you can create actions that move the sprite from the right to the left and back forever.
let moveLeft = SKAction.move(to: CGPoint(x: leftPosition, y: yPosition), duration: duration)
let moveRight = SKAction.move(to: CGPoint(x: rightPosition, y: yPosition), duration: duration)
sprite.run(SKAction.repeatForever(SKAction.sequence([moveLeft, moveRight])))
Then to stop it from moving you can use this code:
sprite.removeAllActions()
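And to stop it when the screen is tapped, as asked, you can remove the actions from the scene's touch handler (a minimal sketch, assuming sprite is a property of the scene):
// Stops the back-and-forth movement on the first tap anywhere in the scene.
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    sprite.removeAllActions()
}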
I want to use Xcode UI tests with the Fastlane Snapshot to make screenshots of the Cordova app. Basically, as my entire app is just a web view, all the Xcode UI test helper methods become irrelevant, and I just want to tap on specific points, e.g. tap(x: 10, y: 10) should produce a tap at the point {10px; 10px}.
That's probably very simple, but I can't figure out how to do it.
Thanks.
You can tap a specific point with the XCUICoordinate API. Unfortunately you can't just say "tap 10,10" referencing a pixel coordinate. You will need to create the coordinate with a relative offset to an actual view.
We can use the mentioned web view to interact with the relative coordinate.
let app = XCUIApplication()
let webView = app.webViews.element
let coordinate = webView.coordinate(withNormalizedOffset: CGVector(dx: 10, dy: 10))
coordinate.tap()
Side note, but have you tried interacting with the web view directly? I've had a lot of success using app.links["Link title"].tap() or app.staticTexts["A different link title"].tap(). Here's a demo app I put together demonstrating interacting with a web view.
Update: As Michal W. pointed out in the comments, you can now tap a coordinate directly, without worrying about normalizing the offset.
let normalized = webView.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))
let coordinate = normalized.withOffset(CGVector(dx: 10, dy: 10))
coordinate.tap()
Notice that I pass 0,0 to the normalized vector and then the actual point, 10,10, to the second call.
@joe To go a little further off of Joe Masilotti's approach, I put mine in an extension and gave prepositional phrases to the global and local params.
func tapCoordinate(at xCoordinate: Double, and yCoordinate: Double) {
    let normalized = app.coordinate(withNormalizedOffset: CGVector(dx: 0, dy: 0))
    let coordinate = normalized.withOffset(CGVector(dx: xCoordinate, dy: yCoordinate))
    coordinate.tap()
}
By giving the parameters identifiable names, the call site is easy to understand, for example:
tapCoordinate(at: 100, and: 200)
I found Laser's answer to work fine with Xcode 11, but made a few tweaks to easily integrate it into my testing.
extension XCUIApplication {
    func tapCoordinate(at point: CGPoint) {
        let normalized = coordinate(withNormalizedOffset: .zero)
        let offset = CGVector(dx: point.x, dy: point.y)
        let coordinate = normalized.withOffset(offset)
        coordinate.tap()
    }
}
Now, when I need to tap on a given location, I just provide a CGPoint and call this against my XCUIApplication like so:
let point = CGPoint(x: xCoord, y: yCoord)
app.tapCoordinate(at: point)
<something>.coordinate(withNormalizedOffset: CGVector.zero).withOffset(CGVector(dx: 10, dy: 60)).tap()
Pass .zero as the normalized offset and then the actual point (10, 60) to withOffset().