Custom gestures from raw touch events on iOS

How do I receive raw touch events on iOS so I can build my own gestures with my own state machine? I'm looking for the analogue of Android's MotionEvent for multi-touch gestures. I need to be able to draw on the background view, but I don't need any OS components on it.
iOS provides a gesture recognizer for defining custom gestures, and it delivers UITouch objects to the application. However, the former is incapable of implementing the complexity of my system, particularly at the efficiency I need, and the latter arrives preprocessed, e.g. with the tapCount already determined. My system must decide for itself whether a touch is a tap or not.
Additionally, I need to ascertain for myself whether multiple simultaneously-touching fingers form a gesture or not. I can't have iOS intercepting four-finger gestures performed (exclusively) on top of my view, though I could live with iOS interpreting five-finger gestures. It's okay for iOS to preempt control from my view if any finger of the gesture starts outside of it.
The application is a gesturing alternative to the keyboard, so it would ideally be able to operate in the keyboard area, in case this affects the answer.
Is this possible on iOS? What's the approach? I'd like to write a simple program outputting touch events to prove it can be done. Thanks for your help!
UPDATE: I was largely able to answer the question by playing with the Multitouch Visualizer app. Each tap is registered as it occurs, and gestures of four fingers or fewer appear to be received without iOS interfering. iOS takes over some five-finger gestures, but only if they satisfy certain (unknown) initial characteristics. I don't know how this is done, but I imagine via UITouch events.
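
For reference, here is a minimal sketch of the kind of test program described above, assuming a plain UIView subclass is enough: it enables multi-touch and overrides the four responder methods, logging each raw event. TouchLoggingView and its log helper are names invented here; a real implementation would feed the events into its own state machine instead of printing.

```swift
import UIKit

// A plain UIView subclass that receives raw multi-touch events,
// roughly analogous to consuming Android's MotionEvent stream.
class TouchLoggingView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Required to receive more than one UITouch at a time.
        isMultipleTouchEnabled = true
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        log("began", touches)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        log("moved", touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        log("ended", touches)
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        // iOS cancels touches it claims for system gestures; a custom
        // state machine must treat this as a distinct terminal state.
        log("cancelled", touches)
    }

    private func log(_ phase: String, _ touches: Set<UITouch>) {
        for touch in touches {
            let p = touch.location(in: self)
            print("\(phase): (\(p.x), \(p.y)) at \(touch.timestamp)")
        }
    }
}
```

Note that the system may still claim some multi-finger system gestures (e.g. the five-finger ones observed above) before the view sees them, delivering touchesCancelled instead.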

Related

Performing a multi-finger swipe using XCUITest

I am working with an app that supports multi-finger swipe gestures (two-finger down swipe, etc.) and would like to simulate them in XCUITest. I see that XCUIElement contains a bunch of functions like swipeUp() and swipeLeft(), however they all seem to be single-finger only. I don't see any other APIs that look like they would allow simulating a two-finger down swipe, for instance.
Does anyone know of a way to do this?
Unfortunately, there isn't a solution for this until Apple provides a dedicated API for it; the existing gesture methods are all synchronous, so you can't simulate individual fingers doing different things at the same time.
There are pinch and rotate functions, but they can’t be made to do what you’re looking for.
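
For illustration, a sketch of the multi-finger calls XCUIElement does offer (the "canvas" accessibility identifier is hypothetical); none of them composes into an arbitrary two-finger swipe:

```swift
import XCTest

class MultiFingerGestureTests: XCTestCase {
    func testAvailableMultiFingerAPIs() {
        let app = XCUIApplication()
        app.launch()

        let canvas = app.otherElements["canvas"]       // hypothetical identifier
        canvas.twoFingerTap()                          // two-finger tap
        canvas.pinch(withScale: 0.5, velocity: -1.0)   // pinch in
        canvas.rotate(.pi / 2, withVelocity: .pi)      // two-finger rotate
    }
}
```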

Detect palm touching/releasing the iphone screen

Implementing a sort of 'distress call' button, which should work as follows:
User starts application and covers a screen with a palm of a hand
Some time passes, user may introduce additional touches during that time or remove some of the existing (but not all of them), location/shape of touches may change
When user releases a hand (i.e. removes last touch) a distress signal is emitted by the app
Basically, the app should register two events: (1) the screen is touched, (2) all touches are released.
I'm trying to use the touchesBegan/touchesEnded methods, and they work for small-area touches (fingertips), but on touching the screen with a full palm, or even just the palm edge, touchesCancelled gets triggered immediately while the hand is still on the screen. Obviously no other events are emitted when the hand is released afterwards.
I tried subclassing UIWindow and UIApplication and overriding sendEvent in those but got no additional info - large area touches are triggering touch begin and immediately touch cancel, releasing hand afterwards emits nothing. In some cases large area touches fire no events at all, not even the touchesBegan. Basically, iOS doesn't let me work with a very basic scenario - detecting just the fact of screen touch/release.
Is there any way to query the screen touch state directly and not work with responder chain? Or suppress the cancellation event from firing? Or maybe I'm missing something?
Unfortunately, as of right now, no solution exists.
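
For completeness, a sketch of the diagnostic approach mentioned above: a UIWindow subclass whose sendEvent override logs every touch phase. It only confirms what the system actually delivers, including the immediate .cancelled phase on large contacts; it does not suppress the cancellation:

```swift
import UIKit

// A UIWindow subclass that logs every touch phase passing through
// sendEvent before forwarding the event for normal delivery.
class DebugWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches {
                print("phase: \(touch.phase.rawValue), location: \(touch.location(in: self))")
            }
        }
        super.sendEvent(event) // keep normal event delivery intact
    }
}
```

To use it, the app would have to install DebugWindow as its main window (e.g. from the app or scene delegate).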

Do iOS apps need to be updated for the iPhone X's 120Hz touch array?

The iPhone X has a 120Hz touch array. Do I need to update my app to support this faster touch array, especially if my app supports drawing?
TLDR: No, you don’t need to update your app to support 120Hz touch delivery on iPhone X.
However, if you have an app that benefits from precise touch handling, like a drawing app, you can take advantage of 120Hz touch delivery to improve your user experience. And you may already have for iPad Pro — read on for details.
Apple’s iOS Device Compatibility Reference talks about this a bit, if obliquely. The Touch Input table in that doc shows that iPhone X has a touch sample rate higher than its touch delivery rate, just like the first couple models of iPad Pro. (It’s also like how any iPad Pro gets Apple Pencil touches at 240Hz but delivers events only at 60Hz or 120Hz.)
Further down, it says:
When the capture rate is higher than the delivery rate, multiple events are coalesced into one touch event whose location reflects the most recent touch. However, the additional touch information is available for apps that need more precision.
To get the extra touch information, ask the UIEvent object passed to your touch handler (touchesBegan, touchesMoved, or touchesEnded) for its coalescedTouches(for:), passing in the UITouch you received.
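
For example, a sketch of reading coalesced touches in a drawing view; appendToStroke is a hypothetical stand-in for your rendering code:

```swift
import UIKit

class CanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }
        // coalescedTouches(for:) returns the intermediate samples merged
        // into this delivered event; fall back to the delivered touch.
        let samples = event.coalescedTouches(for: touch) ?? [touch]
        for sample in samples {
            appendToStroke(at: sample.preciseLocation(in: self))
        }
    }

    private func appendToStroke(at point: CGPoint) {
        // Hypothetical: add the point to the current stroke and redraw.
    }
}
```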
Apple has a couple of articles that go into more detail on coalesced touches:
Getting High-Fidelity Input with Coalesced Touches
Implementing Coalesced Touch Support in an App
Also, if you're doing anything with coalesced touches, you can probably benefit from handling predicted touches as well. Apple has a few articles about that too, and some sample code that uses both:
Minimizing Latency with Predicted Touches
Incorporating Predicted Touches into an App
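
Extending the CanvasView sketch above: predictedTouches(for:) supplies provisional future samples. drawProvisional is hypothetical, and whatever it draws should be discarded and replaced with real samples on the next event:

```swift
import UIKit

class PredictiveCanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }
        for sample in event.coalescedTouches(for: touch) ?? [touch] {
            appendToStroke(at: sample.preciseLocation(in: self))      // committed ink
        }
        for predicted in event.predictedTouches(for: touch) ?? [] {
            drawProvisional(at: predicted.preciseLocation(in: self))  // provisional ink
        }
    }

    private func appendToStroke(at point: CGPoint) { /* hypothetical */ }
    private func drawProvisional(at point: CGPoint) { /* hypothetical */ }
}
```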
In short, if you’ve been optimizing your apps for faster (finger) touch handling and Apple Pencil on iPad Pro, you also benefit from faster touch handling on iPhone X.
If you don't do anything, you're just fine — only certain kinds of interaction are really improved by custom touch-handling code, like drawing apps. And most likely Apple has optimized a bunch of the system touch-handling code (scroll views, gesture recognizers, the new swipe-to-Home and app-switching gestures, etc.), so your app benefits from those for free.

automating touch events ios

For automated-testing purposes, I want to generate low-level touch events programmatically. We are using MonkeyTalk, which abstracts to a higher level in many cases, such that only taps seem to work properly when the app does not use gesture recognizers. I am trying to fix the swipe functionality, but I am having trouble understanding what exactly I need to send to the touchesMoved method (touchesEnded and touchesBegan are relatively straightforward).
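
One technique seen in test harnesses, offered here only as an unsupported sketch: subclass UITouch, override the read-only members your handlers consume, and call the responder methods directly. The key detail for touchesMoved is identity: the set must contain the same UITouch instances across began/moved/ended, with the phase and location updated between calls. FakeTouch and simulateSwipe are names invented here:

```swift
import UIKit

// Not officially supported: a UITouch subclass whose phase and
// location can be set by the test harness.
class FakeTouch: UITouch {
    var currentPoint: CGPoint = .zero
    var currentPhase: UITouch.Phase = .began

    override var phase: UITouch.Phase { return currentPhase }
    override func location(in view: UIView?) -> CGPoint { return currentPoint }
}

// Drives a minimal swipe by invoking the responder methods directly.
func simulateSwipe(on view: UIView, from start: CGPoint, to end: CGPoint) {
    let touch = FakeTouch()
    touch.currentPoint = start
    view.touchesBegan([touch], with: nil)

    // Reuse the same touch object; only its state changes between calls.
    touch.currentPhase = .moved
    touch.currentPoint = end
    view.touchesMoved([touch], with: nil)

    touch.currentPhase = .ended
    view.touchesEnded([touch], with: nil)
}
```

Code under test that reads other UITouch properties (timestamp, the UIEvent, etc.) would need those overridden or supplied as well.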

Can an iOS View handle gestures and direct touch events at the same time?

I'm new to iOS development and instinctively turn towards handling touch events because my code is cross-platform and this maps more closely to other input devices like a mouse. But obviously for multi-touch, it is neater to just use built-in gesture functionality.
However can one do both - track a single touch directly as a kind of cursor, while also supporting pinch, rotate, etc?
Yes, both can happen at the same time, but they can (and by default do) interact.
A gesture recognizer has properties such as cancelsTouchesInView that influence which events are sent on to the view; see also delaysTouchesBegan, which influences which events are delivered and when.
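
As a sketch of the combination (HybridView is a hypothetical name): a pinch recognizer handles two-finger scaling while the touches overrides keep tracking a single touch as a cursor. Setting cancelsTouchesInView to false keeps the raw touches flowing even after the pinch is recognized:

```swift
import UIKit

class HybridView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        let pinch = UIPinchGestureRecognizer(target: self,
                                             action: #selector(handlePinch(_:)))
        // By default, recognition would cancel the view's touches;
        // disable that so both paths receive events.
        pinch.cancelsTouchesInView = false
        addGestureRecognizer(pinch)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        print("pinch scale: \(gesture.scale)")
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Track one touch directly, mouse-cursor style.
        if let touch = touches.first {
            print("cursor at \(touch.location(in: self))")
        }
    }
}
```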
