Performing a multi-finger swipe using XCUITest

I am working with an app that supports multi-finger swipe gestures (two-finger down swipe, etc.), and would like to simulate these in XCUITests. I see that XCUIElement contains a bunch of functions like swipeUp() and swipeLeft(), however they all seem to be single-finger only. I don't see any other APIs that look like they would allow simulating a two-finger down swipe, for instance.
Does anyone know of a way to do this?

Unfortunately, there isn't a solution for this until Apple provides a dedicated call for it; the existing gesture methods are all synchronous, so you can't simulate individual fingers doing different things at the same time.
There are pinch and rotate functions, but they can't be made to do what you're looking for.
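For reference, here is a short sketch of the closest calls XCUIElement does expose (the "map" identifier is a placeholder); none of them can be combined into a multi-finger swipe:

```swift
import XCTest

func exerciseBuiltInGestures(app: XCUIApplication) {
    let map = app.otherElements["map"] // hypothetical element identifier

    // Single-finger swipes: one synchronous finger, fixed trajectory.
    map.swipeUp()
    map.swipeLeft()

    // Multi-touch calls that do exist, but none of them is a swipe:
    map.twoFingerTap()
    map.tap(withNumberOfTaps: 1, numberOfTouches: 2)
    map.pinch(withScale: 0.5, velocity: -1.0)   // two fingers moving together
    map.rotate(.pi / 2, withVelocity: 1.0)      // two fingers rotating
}
```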

Related

Custom gestures from raw touch events on iOS

How do I receive raw touch events on iOS for creating my own gestures using my own state machine? I'm looking for the analogue of Android's MotionEvent for multi-touch gestures. I need to be able to draw on the background box, but I do not need OS components on this box.
iOS provides a gesture recognizer for defining custom gestures, and it defines UITouch objects that are delivered to the application. However, the former is incapable of implementing the complexity of my system, particularly at the efficiency I need, and the latter does preprocessing, such as determining the tapCount. My system must decide for itself whether a touch is a tap or not.
Additionally, I need to ascertain for myself whether multiple simultaneously-touching fingers form a gesture or not. I can't have iOS interpreting four-finger gestures that are input (exclusively) on top of my viewing area, though I could live with iOS interpreting five-finger gestures. It's okay for iOS to preempt control from my view if any finger of the gesture starts outside of the view.
The application is a gesturing alternative to the keyboard, so it would ideally be able to operate in the keyboard area, in case this affects the answer.
Is this possible on iOS? What's the approach? I'd like to write a simple program outputting touch events to prove it can be done. Thanks for your help!
UPDATE: I was largely able to answer the question by playing with the Multitouch Visualizer app. Each tap is registered as it occurs, and gestures of 4 fingers or fewer appear to be received without iOS interfering. iOS takes over for some 5 finger gestures, but only if they satisfy certain (unknown) initial characteristics. I don't know how this is done, but I imagine via UITouch events.
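To address the "simple program outputting touch events" part: a plain UIView subclass can receive raw multi-touch input by overriding the UIResponder touch methods, with no gesture recognizers involved. A minimal sketch (the view and the logging are illustrative):

```swift
import UIKit

// A view that reports raw touches instead of relying on gesture recognizers.
final class RawTouchView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Without this, UIKit delivers only one touch at a time.
        isMultipleTouchEnabled = true
    }

    required init?(coder: NSCoder) { fatalError("not used") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        report("began", touches)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        report("moved", touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        report("ended", touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // iOS cancels your touches when a system gesture takes over.
        report("cancelled", touches)
    }

    private func report(_ phase: String, _ touches: Set<UITouch>) {
        for touch in touches {
            print(phase, touch.location(in: self))
        }
    }
}
```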

How can I "mock" the UI of an iOS app?

I am trying to take screenshots of my iOS app. Before taking a screenshot, I need to get the app to an appropriate state. To get to an appropriate state, a lot of swiping is required.
This would have been fine if I had actual devices, but I don't. So I need to perform swipes on a simulator using a trackpad. I find this very hard, and sometimes I can't swipe properly, so the gesture is not recognised.
I thought of using the UI Testing library to programmatically perform swipes. However, my app is actually a game in which random events happen, and writing code to handle these random events would not be worth the time. It would be best if I were in control of the swiping.
I have also thought of adding buttons on the UI of the app. When they are pressed a swipe is simulated. Then I can just click those buttons instead of swiping with my trackpad, which is way easier. However, these buttons will then appear on the screenshot, which I obviously don't want users to see.
Also note that I can't use a tap gesture recogniser as a replacement for the swipe gesture recognisers, because I need to detect swipes in all four directions and do different things depending on the direction.
Essentially, how can I perform a "swipe" more easily on the simulator? It would be great if I could do this by pressing keys on my keyboard, or maybe there is a feature in Xcode that allows me to do this which I am not aware of?
I suggest you automate the UI test.
Recording a test from live actions is a standard Xcode UI testing feature, and Fastlane is the icing on the cake: it has the tools to automatically run a UI test and capture screenshots at every device resolution. You can record your actions as a UI test and play them back.
Check it out here:
Fastlane Screenshot
Even if you do not wish to use Fastlane, you can record the gestures in a UI test and have it pause.
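Without Fastlane, a plain XCTest UI test can drive the swipes and save a screenshot; here's a minimal sketch (the particular swipe sequence is a placeholder for whatever gets your game into the right state):

```swift
import XCTest

final class ScreenshotTests: XCTestCase {
    func testCaptureAfterSwipes() {
        let app = XCUIApplication()
        app.launch()

        // Drive the app to the desired state with programmatic swipes.
        app.swipeUp()
        app.swipeLeft()

        // Attach a screenshot to the test results.
        let attachment = XCTAttachment(screenshot: app.screenshot())
        attachment.lifetime = .keepAlways
        add(attachment)
    }
}
```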

Xcode 7 UI Testing not recording swipes

I have a collection view in my app, which is inside a table view cell (yes, I know it's weird), and I want to UI-test the scrolling of the collection view. I use the recording button for this; however, Xcode identifies the swiping on my collection view as taps. If I manually change the generated UI Test code from [collectionview tap] to [collectionview swipeLeft], it works, but Xcode won't generate the code for swiping automatically.
What could be the problem?
Xcode only recognises a gesture as a swipe if your trajectory with the gesture is fast, straight and true to the (up/down/left/right) direction you are swiping in.
My guess is that this is to avoid recording drag or tap-and-hold gestures as swipes, since those are unsupported by the recording tool; if you were going for either of those, a tap would be the closest recorded match.
As a workaround, take note of where you expected a swipe, and substitute the correct gesture manually (as you have been doing) whenever a swipe isn't recorded.
I believe you should file a bug with Apple and include a sample project.
It may be hard for the recording system to differentiate between a tap, a long press, and a swipe. While I've found recording of tap events to be reliable, I end up manually typing any steps involving swipes or typeText. Generally I use the UI test recording feature to help identify particular elements, which I then work with in code to specify the user interactions and asserts.
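A manually written swipe step might look like the following sketch (the element query and the "Second Page" label are hypothetical; use whatever the recorder identified in your app):

```swift
import XCTest

final class CollectionSwipeTests: XCTestCase {
    func testSwipeToSecondPage() {
        let app = XCUIApplication()
        app.launch()

        // Use the recorder to discover the element query, then type the swipe yourself.
        let collectionView = app.collectionViews.element(boundBy: 0)
        collectionView.swipeLeft()

        // Assert on whatever the swipe should reveal.
        XCTAssertTrue(app.staticTexts["Second Page"].exists)
    }
}
```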
If you want to create a sample project on github or somewhere with your collectionView-inside-tableViewCell configuration, I'd be willing to take a look.
EDIT: After trying your example project, I was sometimes able to get Xcode to record swipeLeft and swipeRight events from the first cell. It's not the most elegant approach, but with the trackpad on my MacBook Air, I start a horizontal swipe with one finger without pressing the mouse button, and then press the button with another finger while the first finger is still swiping.
There were some instances when Xcode recorded this simply as a tap, and at least one instance where it recorded a twoFingerTap.

automating touch events ios

For automated testing reasons, I want to generate low-level touch events programmatically. We are using MonkeyTalk, which abstracts to a higher level in many cases, such that only taps seem to work properly when the app does not use gesture recognizers. I am trying to fix the swipe functionality, but am having trouble understanding what exactly I need to send to the touchesMoved method (touchesEnded and touchesBegan are relatively straightforward).
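As a hint at what touchesMoved(_:with:) consumes: each call delivers the same UITouch instances as touchesBegan, with their phase set to .moved and their locations updated in place. A receiving-side sketch, to show the state a synthetic event would have to reproduce:

```swift
import UIKit

// What touchesMoved(_:with:) receives: the *same* UITouch objects from
// touchesBegan, mutated in place with phase == .moved and new locations.
final class SwipeTracingView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let from = touch.previousLocation(in: self)
            let to = touch.location(in: self)
            // A synthetic swipe needs to deliver a sequence of these small
            // per-frame deltas, not a single jump from start to end.
            print("moved \(from) -> \(to)")
        }
    }
}
```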

Can an iOS View handle gestures and direct touch events at the same time?

I'm new to iOS development and instinctively turn towards handling touch events because my code is cross-platform and this maps more closely to other input devices like a mouse. But obviously for multi-touch, it is neater to just use built-in gesture functionality.
However, can one do both: track a single touch directly as a kind of cursor, while also supporting pinch, rotate, etc.?
Yes, both can happen at the same time, but they can (and by default do) interact.
A gesture recognizer has properties such as cancelsTouchesInView that influence which touch events are sent on to the view; see also delaysTouchesBegan, which influences which touches are delivered and when.
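A minimal sketch of one way to combine the two, assuming a custom view that tracks one finger as a cursor while a pinch recognizer runs alongside it:

```swift
import UIKit

final class CursorAndPinchView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch))
        // Keep delivering touches to touchesBegan/Moved even while the
        // recognizer is tracking, so the "cursor" never freezes.
        pinch.cancelsTouchesInView = false
        pinch.delaysTouchesBegan = false
        addGestureRecognizer(pinch)
    }

    required init?(coder: NSCoder) { fatalError("not used") }

    @objc private func handlePinch(_ pinch: UIPinchGestureRecognizer) {
        print("pinch scale:", pinch.scale)
    }

    // Track a single touch directly, like a cursor.
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let touch = touches.first {
            print("cursor at", touch.location(in: self))
        }
    }
}
```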
