How to capture mouse events on iPadOS

How do we capture mouse events on iPadOS using Swift? For example: mouse clicks, scrolls, and moves (x, y position).
I've seen plenty of material for macOS but not for iPadOS. Can someone please shed some light on how to capture mouse events on iPadOS devices? The requirement is that I connect a mouse to an iPad over Bluetooth and programmatically track mouse movement, click events, and scroll events.

Mouse clicks are passed in via touchesBegan as a UITouch with a type of .indirectPointer. Add UIApplicationSupportsIndirectInputEvents to your Info.plist file to receive these.
Mouse scrolling can be detected by adding a UIPanGestureRecognizer with allowedTouchTypes set to an empty array and allowedScrollTypesMask set to .all. The event information is sent to the target and selector you assign to the gesture recognizer: the gesture state reflects the trackpad state, and translation(in:) provides the scroll offset.
As far as I'm aware, the mouse position, and therefore mouse movement, cannot be captured directly.
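A minimal sketch combining both pieces, assuming iOS 13.4 or later and UIApplicationSupportsIndirectInputEvents in the Info.plist (the class name is illustrative):

    import UIKit

    // Logs indirect-pointer clicks and mouse/trackpad scrolls.
    class MouseEventViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            // Scroll: a pan recognizer that ignores finger touches entirely
            // and listens only for scroll events.
            let scrollRecognizer = UIPanGestureRecognizer(target: self,
                                                          action: #selector(handleScroll(_:)))
            scrollRecognizer.allowedTouchTypes = []           // no direct touches
            scrollRecognizer.allowedScrollTypesMask = .all    // discrete and continuous scrolling
            view.addGestureRecognizer(scrollRecognizer)
        }

        // Clicks arrive as UITouch values with type .indirectPointer.
        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            for touch in touches where touch.type == .indirectPointer {
                print("mouse click at \(touch.location(in: view))")
            }
        }

        @objc private func handleScroll(_ recognizer: UIPanGestureRecognizer) {
            // translation(in:) accumulates the scroll offset since the gesture began.
            let offset = recognizer.translation(in: view)
            print("scroll state \(recognizer.state.rawValue), offset \(offset)")
        }
    }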

Related

Detect palm touching/releasing the iPhone screen

Implementing a sort of 'distress call' button which should work as follows:
User starts the application and covers the screen with the palm of a hand
Some time passes; the user may introduce additional touches during that time or remove some of the existing ones (but not all of them), and the location/shape of touches may change
When the user releases the hand (i.e. removes the last touch), a distress signal is emitted by the app
Basically, the app should register two events: (1) the screen is touched, (2) all touches are released
I'm trying to use the touchesBegan/touchesEnded methods, and those work for small-area touches (fingertips), but on touching the screen with a full palm, or even just the palm's edge, touchesCancelled gets triggered immediately while the hand is still on the screen. Obviously no other events are emitted upon hand release afterwards.
I tried subclassing UIWindow and UIApplication and overriding sendEvent in both, but got no additional information: large-area touches trigger touch begin and immediately touch cancel, and releasing the hand afterwards emits nothing. In some cases large-area touches fire no events at all, not even touchesBegan. Basically, iOS doesn't let me handle a very basic scenario: detecting just the fact of a screen touch/release.
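For reference, a minimal diagnostic version of that sendEvent override (the class name is illustrative), logging every touch phase the window actually sees:

    import UIKit

    // Diagnostic sketch: a UIWindow subclass that logs every touch phase
    // delivered to the window, including cancellations.
    class TouchLoggingWindow: UIWindow {
        override func sendEvent(_ event: UIEvent) {
            if event.type == .touches, let touches = event.allTouches {
                for touch in touches {
                    print("phase: \(touch.phase), location: \(touch.location(in: self))")
                }
            }
            super.sendEvent(event)
        }
    }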
Is there any way to query the screen touch state directly rather than working with the responder chain? Or to suppress the cancellation event from firing? Or maybe I'm missing something?
Unfortunately, as of right now, no solution exists.

iOS Magic Keyboard APIs

Is it possible to intercept the trackpad events from the Magic Keyboard (with trackpad) in an iOS app and process them myself, hiding the pointer and not delivering those events to the UI, or is that not possible?
Think of a game, for instance, where the user moves a finger around on the trackpad to move a character around on screen, but you wouldn't want to show the pointer circle, and the location on the screen doesn't matter, nor does whether or where they click; you are just using the change in position as they move their finger on the trackpad.
Thanks
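One possible direction, not given in the question: on iOS 14 and later the GameController framework exposes raw mouse and trackpad deltas through GCMouse, and a UIPointerInteraction can hide the system pointer over a view. A sketch under those assumptions:

    import UIKit
    import GameController

    // Sketch: raw trackpad deltas via GCMouse (iOS 14+) with the system
    // pointer hidden over this view. GCMouse.current is nil until a mouse
    // or trackpad connects; a real app should also observe the
    // GCMouseDidConnect notification.
    class TrackpadGameViewController: UIViewController, UIPointerInteractionDelegate {

        override func viewDidLoad() {
            super.viewDidLoad()

            // Hide the pointer while it hovers over this view.
            view.addInteraction(UIPointerInteraction(delegate: self))

            // Movement deltas arrive even though no pointer is drawn.
            GCMouse.current?.mouseInput?.mouseMovedHandler = { _, deltaX, deltaY in
                print("trackpad delta: (\(deltaX), \(deltaY))")
            }
        }

        func pointerInteraction(_ interaction: UIPointerInteraction,
                                styleFor region: UIPointerRegion) -> UIPointerStyle? {
            .hidden()
        }
    }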

Custom gestures from raw touch events on iOS

How do I receive raw touch events on iOS for creating my own gestures using my own state machine? I'm looking for the analogue of Android's MotionEvent for multi-touch gestures. I need to be able to draw on the background box, but I do not need OS components on this box.
iOS provides a gesture recognizer for defining custom gestures, and it defines UITouch objects that are delivered to the application. However, the former is incapable of implementing the complexity of my system, particularly at the efficiency I need, and the latter does preprocessing, such as determining the tapCount. My system must decide for itself whether a touch is a tap or not.
Additionally, I need to determine for myself whether multiple simultaneously-touching fingers form a gesture or not. I can't have iOS interpreting four-finger gestures input (exclusively) on top of my viewing area, though I could live with iOS interpreting five-finger gestures. It's okay for iOS to preempt control from my view if any finger of the gesture starts outside of the view.
The application is a gesturing alternative to the keyboard, so it would ideally be able to operate in the keyboard area, in case this affects the answer.
Is this possible on iOS? What's the approach? I'd like to write a simple program outputting touch events to prove it can be done. Thanks for your help!
UPDATE: I was largely able to answer the question by playing with the Multitouch Visualizer app. Each tap is registered as it occurs, and gestures of 4 fingers or fewer appear to be received without iOS interfering. iOS takes over for some 5 finger gestures, but only if they satisfy certain (unknown) initial characteristics. I don't know how this is done, but I imagine via UITouch events.
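A minimal sketch of receiving raw touches on a view and feeding them into a custom state machine (GestureStateMachine is a hypothetical stand-in for the asker's own recognizer):

    import UIKit

    // A view that forwards raw touch events, with no gesture recognizers
    // attached, to a custom state machine.
    class RawTouchView: UIView {

        let stateMachine = GestureStateMachine()

        override init(frame: CGRect) {
            super.init(frame: frame)
            isMultipleTouchEnabled = true   // deliver all concurrent touches
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isMultipleTouchEnabled = true
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine.consume(touches, in: self)
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine.consume(touches, in: self)
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine.consume(touches, in: self)
        }

        override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
            stateMachine.consume(touches, in: self)
        }
    }

    // Hypothetical stand-in for the custom recognizer.
    final class GestureStateMachine {
        func consume(_ touches: Set<UITouch>, in view: UIView) {
            for touch in touches {
                print("\(touch.phase) at \(touch.location(in: view))")
            }
        }
    }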

How to implement a draggable scrubber like Podcast app when VoiceOver is running?

Apple's Podcast app has an interesting feature available when VoiceOver is running.
When a user double-taps and holds the scrubber, dragging left or right adjusts the scrubber position.
How is this done? I've tried allowing direct interaction with the scrubber via UIAccessibilityTraitAllowsDirectInteraction, but you lose the ability to drag and scrub to a position if your finger goes outside of the scrubber's bounds.
The solution I found is not to set UIAccessibilityTraitAllowsDirectInteraction but UIAccessibilityTraitAdjustable. This also allows an accessibility user to swipe up and down to adjust the scrubber.
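A sketch of that adjustable-trait approach (the class name and step size are illustrative):

    import UIKit

    // A custom scrubber whose value VoiceOver users adjust by swiping
    // up or down, with no need to keep a finger inside its bounds.
    class AccessibleScrubber: UIView {

        var value: Float = 0.5 {
            didSet { accessibilityValue = "\(Int(value * 100)) percent" }
        }

        override init(frame: CGRect) {
            super.init(frame: frame)
            isAccessibilityElement = true
            accessibilityTraits = .adjustable
            accessibilityLabel = "Playback position"
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) has not been implemented")
        }

        override func accessibilityIncrement() {
            value = min(value + 0.05, 1.0)   // VoiceOver swipe up
        }

        override func accessibilityDecrement() {
            value = max(value - 0.05, 0.0)   // VoiceOver swipe down
        }
    }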

Two fingers touching a UIWebView make my app crash

UPDATE
I'm looking for the best way to detect more than one finger on a particular view at a time; in this case the view is a UITextView. I'm not detecting taps or pinches, just the fact that more than one touch is happening. I set textView.multipleTouchEnabled = NO, I didn't add a gesture recognizer anywhere, I didn't configure zooming on the UITextView, and I didn't override any of the touches methods in that controller.
The problem is that when I touch the app on the device with something like a pinch, or with more than one finger, the app crashes with no error log in the debugger console. I've tried to debug it, but I don't know what I'm looking for.
UPDATE 2
I was wrong about the object receiving the user's touches: I thought it was the UITextView, but it is a UIWebView. So this problem happens when I touch the UIWebView with two fingers. I have forwarded the touch events using this suggestion, but I'm still confused about how to clear up this crash in my app.
Can somebody help me??
Thank You
Regards,
Risma
Check the view's zoom level: by default its minimum and maximum zoom scales are both 1.
For a pinch gesture to zoom, you have to set the scale factors yourself.
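A sketch of that suggestion, shown with WKWebView since UIWebView is long deprecated (the scrollView properties are the same):

    import WebKit

    // By default minimumZoomScale and maximumZoomScale are both 1, so a
    // two-finger pinch has no zoom range to operate on.
    let webView = WKWebView(frame: .zero)
    webView.scrollView.minimumZoomScale = 1.0
    webView.scrollView.maximumZoomScale = 4.0   // give the pinch gesture a range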
