Is it possible to intercept the trackpad events from the Magic Keyboard (with trackpad) in an iOS app and process them myself, hiding the pointer and keeping the events from reaching the UI, or is that not possible?
Think of a game, for instance, where the user moves their finger around on the trackpad to move a character around on screen. You wouldn't want to show the pointer circle, and the pointer's on-screen location doesn't matter, nor does if or where they click; you are just using the change in position as the finger moves on the trackpad.
Thanks
Implementing a sort of 'distress call' button which should work as follows:
User starts the application and covers the screen with the palm of a hand
Some time passes; the user may introduce additional touches during that time or remove some of the existing ones (but not all of them), and the location/shape of the touches may change
When the user releases the hand (i.e. removes the last touch), a distress signal is emitted by the app
Basically, the app should register two events: (1) the screen is touched, (2) all touches are released
I'm trying to use the touchesBegan/touchesEnded methods, and those work for small-area touches (fingertips), but on touching the screen with a full palm, or even just the palm edge, touchesCancelled gets triggered immediately while the hand is still on the screen. Obviously, no other events are emitted upon hand release afterwards.
I tried subclassing UIWindow and UIApplication and overriding sendEvent in both, but got no additional information: large-area touches trigger a touch begin and immediately a touch cancel, and releasing the hand afterwards emits nothing. In some cases large-area touches fire no events at all, not even touchesBegan. Basically, iOS doesn't let me handle a very basic scenario: detecting just the fact of a screen touch/release.
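For reference, here is roughly the kind of sendEvent override I tried (a minimal sketch; the class name is illustrative, not my exact code):

```swift
import UIKit

// Log every touch phase that passes through the window,
// without consuming anything.
class TouchLoggingWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        for touch in event.allTouches ?? [] {
            switch touch.phase {
            case .began:     print("touch began")
            case .ended:     print("touch ended")
            case .cancelled: print("touch cancelled") // fires immediately for palm-sized contacts
            default:         break
            }
        }
        super.sendEvent(event)
    }
}
```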
Is there any way to query the screen's touch state directly instead of working with the responder chain? Or to suppress the cancellation event from firing? Or maybe I'm missing something?
Unfortunately, as of right now, no solution exists.
How do we capture mouse events on iPadOS using Swift? E.g. mouse click, scroll, move (x, y position), etc.
I've seen plenty for macOS but not for iPadOS. Can someone please help shed some light on how to capture mouse events on iPadOS devices? The requirement is that I will connect a mouse to an iPad over Bluetooth, and I should be able to programmatically track the mouse movement, click events, and scroll events.
Mouse clicks are passed in via touchesBegan as a UITouch with a type of .indirectPointer. Add UIApplicationSupportsIndirectInputEvents to your Info.plist file to receive these.
Mouse scroll can be detected by adding a UIPanGestureRecognizer with allowedTouchTypes set to an empty array and, if needed, allowedScrollTypesMask set to .all. The event information is sent to the target and selector you assign to the gesture recognizer; the gesture's state reflects the trackpad state, and translation(in:) provides the scroll offset.
As far as I'm aware, mouse position, and therefore mouse movement, cannot be captured directly.
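Putting those pieces together, a minimal sketch (assuming iOS 13.4+ and UIApplicationSupportsIndirectInputEvents set in Info.plist; the class name is illustrative):

```swift
import UIKit

class MouseEventView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Scroll detection: a pan recognizer that ignores finger touches
        // and accepts only scroll events.
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handleScroll(_:)))
        pan.allowedTouchTypes = []          // ignore direct touches
        pan.allowedScrollTypesMask = .all   // discrete (wheel) and continuous (trackpad)
        addGestureRecognizer(pan)
    }

    required init?(coder: NSCoder) { fatalError("not implemented") }

    // Mouse clicks arrive in touchesBegan as indirect-pointer touches.
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches where touch.type == .indirectPointer {
            print("mouse down at \(touch.location(in: self))")
        }
        super.touchesBegan(touches, with: event)
    }

    @objc private func handleScroll(_ gesture: UIPanGestureRecognizer) {
        // translation(in:) accumulates the scroll offset while the gesture
        // is active; gesture.state tracks began/changed/ended.
        print("scroll offset: \(gesture.translation(in: self))")
    }
}
```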
I need to track if the user is touching the screen across multiple views and separate screens throughout my app. I have already created a window delegate method to receive every type of touch event. The problem is that if I keep one finger on the first screen and tap a button with a separate finger that takes me to the second screen, when I release my original finger no event is fired.
The problem is the same if you hold your finger on the screen before the app loads and then release it once the app has started: no event is fired for the touch ending.
I presume there is some built-in mechanism in iOS that requires a new touch to begin once the app has launched or the screen/view has changed, so releasing a finger that was already touching does nothing.
Is there a way to detect this? I really need to keep an accurate count of the touches on the screen across the different screens/views within my app, especially when the user takes their finger(s) off the screen.
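For reference, my window-level tracking looks roughly like this (a sketch with illustrative names, not my exact code):

```swift
import UIKit

// Keep every live touch by identity, treating cancelled
// touches like ended ones.
class TouchCountingWindow: UIWindow {
    private var activeTouches = Set<UITouch>()

    override func sendEvent(_ event: UIEvent) {
        for touch in event.allTouches ?? [] {
            switch touch.phase {
            case .began:
                activeTouches.insert(touch)
            case .ended, .cancelled:
                activeTouches.remove(touch)
            default:
                break
            }
        }
        print("touches on screen: \(activeTouches.count)")
        super.sendEvent(event)
    }
}
```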
I'm developing an iOS keyboard extension that uses scroll gestures on the keyboard. Sometimes while using the keyboard I accidentally pull up Control Center, and my keyboard stops working properly. Is there any way to detect when Control Center becomes visible or invisible?
You can't do it directly. The most you can know is that your app was deactivated and then activated again. That could be because of Control Center, Notification Center, an incoming phone call, or the user going into the app switcher and coming back again...
Here is a possible workaround you can try:
AAWindow is a UIWindow subclass that enables behavior like adaptive round corners and detecting when Control Center is opened. It probably does what you want: you simply subscribe to an NSNotification and can react to the user opening Control Center. Detailed instructions and setup are on GitHub:
https://github.com/aaronabentheuer/AAWindow
[AAWindow: The way this is accomplished is by using a combination of NSTimer and overriding sendEvent in UIWindow to receive all touches without blocking them. So you basically receive all touches and check whether they are near the lower edge of the screen; if so, you start a timer for half a second, and if applicationWillResignActive is called while that timer is running, you can be almost certain that Control Center was opened. The time has to vary if there's no status bar, because then the app is in fullscreen and it can take the user up to 3 seconds to launch Control Center.]
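A minimal sketch of that heuristic (not AAWindow's actual code; the class name, notification name, edge threshold, and timing are illustrative):

```swift
import UIKit

// Watch for touches near the bottom edge, then treat a deactivation
// that follows within half a second as Control Center opening.
class ControlCenterProbeWindow: UIWindow {

    static let controlCenterProbablyOpened =
        Notification.Name("ControlCenterProbablyOpened")

    private var edgeTouchDeadline: Date?
    private var observer: NSObjectProtocol?

    override init(frame: CGRect) {
        super.init(frame: frame)
        observer = NotificationCenter.default.addObserver(
            forName: UIApplication.willResignActiveNotification,
            object: nil, queue: .main) { [weak self] _ in
                guard let self = self,
                      let deadline = self.edgeTouchDeadline,
                      Date() < deadline else { return }
                NotificationCenter.default.post(
                    name: Self.controlCenterProbablyOpened, object: self)
        }
    }

    required init?(coder: NSCoder) { fatalError("not implemented") }

    deinit {
        if let observer = observer {
            NotificationCenter.default.removeObserver(observer)
        }
    }

    override func sendEvent(_ event: UIEvent) {
        // Inspect every touch without blocking it.
        for touch in event.allTouches ?? [] where touch.phase == .began {
            if touch.location(in: self).y > bounds.height - 20 {
                edgeTouchDeadline = Date().addingTimeInterval(0.5)
            }
        }
        super.sendEvent(event)
    }
}
```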
Hope this helps you figure out the exact solution to your problem.
I have a DirectX C++ game. I use my own mouse icon and capture the mouse and keyboard when the game initialises, but the problem is that if I minimise the game and select another window, for example to skip a song in my media player, the mouse no longer works when I go back to my game screen.
As far as I understand it, I need to re-capture the mouse every time the application gains focus, but how do I do this?
Can I simply reuse the same mouse code from the initialisation, and if so, where do I put it so that it runs when the application regains focus?
FYI, my game runs in both windowed mode and full screen; would this make a difference?
Thank you
First of all, I would advise using the Windows cursor API instead of drawing the mouse yourself. It will be much more responsive (not suffering from low frame rates, etc.) and it is much easier to handle. You can use animated cursors this way too.
For capturing the mouse, you can only really capture it while a mouse button is pressed. If no button is pressed and the mouse is moved outside the application window, you lose the capture.
Why do you even need to capture the mouse? You get WM_MOUSEMOVE messages etc. when the mouse is not captured.