Intercepting all user touches for an iOS app without overriding sendEvent

I'd like to intercept all touch events occurring in an app without interfering with the app's functionality (e.g. after doing something with a touch event, pass it on to its rightful receiver). Is there any way this can be done other than
overriding sendEvent: in a UIApplication subclass,
using method swizzling on UIView to get my own touchesBegan: etc. called,
putting a UIView on top of all other views in the app, catching all touches and forwarding them,
so that it could be used as reliable external framework functionality?
EDIT:
Adding a gesture recognizer to the application window would work - is there any other way (just out of curiosity)?
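For reference, here is a minimal sketch of the gesture-recognizer approach mentioned in the edit: a UIGestureRecognizer subclass (TouchObserver is a made-up name) that only observes touches, attached to the window with cancelsTouchesInView disabled so it never interferes with normal delivery.

#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface TouchObserver : UIGestureRecognizer
@end

@implementation TouchObserver

// The recognizer never transitions out of the "possible" state, so it never
// claims the touches; it just gets to see them.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches began: %@", touches);
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches ended: %@", touches);
}

@end

// Attach it once, e.g. in application:didFinishLaunchingWithOptions::
// TouchObserver *observer = [[TouchObserver alloc] init];
// observer.cancelsTouchesInView = NO;   // don't swallow the touches
// observer.delaysTouchesBegan = NO;
// observer.delaysTouchesEnded = NO;
// [self.window addGestureRecognizer:observer];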

Related

canPerformAction:withSender: method called on tap in iOS 9

One thing I observed in iOS 9.0 is that when I tap on a button or a table view, the canPerformAction:withSender: method is called with a sender of type UIButton. I am using this method to prepare my customized option menu.
I did not observe this in previous iOS versions. Can anyone point me to the relevant API changes? I went through the overall iOS changes, but I did not find the behaviour mentioned above in the change log or change history.
Per Apple Documentation,
iOS 3.0 introduced system capabilities for generating motion events, specifically the motion of shaking the device. The event-handling methods for these kinds of events are motionBegan:withEvent:, motionEnded:withEvent:, and motionCancelled:withEvent:. Additionally for iOS 3.0, the canPerformAction:withSender: method allows responders to validate commands in the user interface while the undoManager property returns the nearest NSUndoManager object in the responder chain.
So, all UIResponder subclasses are entitled to receive a callback for canPerformAction:withSender:. You should use the sender parameter to do the handling in this method.
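A minimal sketch of that kind of handling, assuming a UIResponder subclass (the UITextView subclass name MyTextView and the copy:-only policy are just for illustration) that inspects the sender before deciding:

#import <UIKit/UIKit.h>

@interface MyTextView : UITextView
@end

@implementation MyTextView

- (BOOL)canPerformAction:(SEL)action withSender:(id)sender {
    // On iOS 9 the sender may be a UIButton (or another control), so branch
    // on its type instead of assuming a fixed sender.
    if ([sender isKindOfClass:[UIButton class]]) {
        return NO; // suppress the menu for button-originated queries
    }
    // Only allow copy: in the customised menu; defer everything else.
    if (action == @selector(copy:)) {
        return YES;
    }
    return [super canPerformAction:action withSender:sender];
}

@end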

How do we use Watchkit Touch Events?

I want to use touch events in my app. I know gesture recognisers cannot be used in WatchKit. Is it possible to use functions like touchesBegan:, touchesMoved:, etc.?
An Apple Watch app uses the WatchKit framework; UIKit touch events are not applicable here.
An alternative is to use the force touch event, which triggers the context menu (if available):
Instead of just tapping items on the screen, pressing the screen with a small amount of force activates the context menu (if any) associated with the current interface controller.
There is no API like touchesBegan: or touchesMoved:.
The only thing you can do to respond to a button event is to use an IBAction.
A little late to the party, but it's possible to use SceneKit with WatchKit and SceneKit allows you to add gesture handlers. Please see the Apple example project here:
https://developer.apple.com/library/content/samplecode/WatchPuzzle/Introduction/Intro.html#//apple_ref/doc/uid/TP40017284-Intro-DontLinkElementID_2
Edit: It looks like Tap, Swipe, Long Press and Pan gesture recognizers can be added to any view added to the InterfaceController.
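A rough sketch of that watchOS 3+ gesture support, assuming a WKTapGestureRecognizer has been added to a group in the storyboard and wired to this action (PuzzleInterfaceController and handleTap: are made-up names):

#import <WatchKit/WatchKit.h>

@interface PuzzleInterfaceController : WKInterfaceController
@end

@implementation PuzzleInterfaceController

- (IBAction)handleTap:(WKTapGestureRecognizer *)sender {
    // locationInObject reports where the tap landed in the object's bounds.
    CGPoint location = [sender locationInObject];
    NSLog(@"tapped at (%.1f, %.1f)", location.x, location.y);
}

@end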

Detect iOS8 Reachability Gesture

Is there any way to detect the new Reachability gesture of iOS 8 in Objective-C?
The gesture is activated by double-tapping the Touch ID (Home) button on the iPhone 6 and iPhone 6 Plus.
There are no public APIs for it.
There are two related private API methods on UIApplication I can find (using either of these should get your app rejected from the App Store):
_setReachabilitySupported:, which presumably would enable/disable Reachability (like Spotlight)
_deactivateReachability, which would return the view to the normal place on the screen
I don't see anything that informs your application that the user has performed the gesture, however.
You could also experiment with subclassing UIWindow and overriding setFrame:. Set a breakpoint in this method, and if it fires when you enable Reachability, you can look at the stack trace for more information.
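A quick sketch of that experiment, assuming you swap your key window's class for a UIWindow subclass (KeyWindowProbe is a made-up name):

#import <UIKit/UIKit.h>

@interface KeyWindowProbe : UIWindow
@end

@implementation KeyWindowProbe

- (void)setFrame:(CGRect)frame {
    // Set a breakpoint here, or dump the call stack, and then trigger
    // Reachability to see whether and from where this gets called.
    NSLog(@"setFrame: %@\n%@", NSStringFromCGRect(frame), [NSThread callStackSymbols]);
    [super setFrame:frame];
}

@end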

iOS jailbroken device: intercept and block the swipe from down gesture to call the Control Center

What I need is to intercept and block the swipe-up-from-the-bottom gesture that opens Control Center while the user is using an app (i.e. not in SpringBoard), and replace it with a new action. When the user is in SpringBoard everything must behave as default, so he must still be able to open Control Center.
Which method must I hook to intercept the Control Center call under my condition (the user is not in SpringBoard)?
Thanks
From browsing iOS 6 headers, it seems SBPanGestureRecognizer inherits from SBGestureRecognizer. SBGestureRecognizer has these methods:
- (void)touchesEnded:(struct __SBGestureContext *)arg1;
- (void)touchesMoved:(struct __SBGestureContext *)arg1;
- (void)touchesBegan:(struct __SBGestureContext *)arg1;
I'd look into hooking one of those.
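A very rough Logos sketch of that idea; every name here (SBPanGestureRecognizer, __SBGestureContext, _accessibilityFrontMostApplication) is private API and version-dependent, so treat them as assumptions to verify against the headers for your target iOS version:

#import <UIKit/UIKit.h>

struct __SBGestureContext;

@interface SpringBoard : UIApplication
- (id)_accessibilityFrontMostApplication; // assumed to return nil on the home screen
@end

%hook SBPanGestureRecognizer

- (void)touchesBegan:(struct __SBGestureContext *)context {
    id frontmostApp = [(SpringBoard *)[UIApplication sharedApplication] _accessibilityFrontMostApplication];
    if (frontmostApp != nil) {
        // An app is frontmost: swallow the gesture so Control Center never
        // opens, and trigger the replacement action here instead.
        return;
    }
    %orig; // on the home screen, keep the default behaviour
}

%end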

Getting touches at launch on iOS

On Mac, one can always get the location of the mouse "outside the event stream" (i.e. even if you've not subscribed to any delegate methods for mouseUp: et al.) by calling [NSEvent mouseLocation].
Is there any way on iOS to get current touch events without listening to touchesBegan:?
I ask because there is at least one situation in which touchesBegan is not called: at app launch, if a touch is already in progress, touchesBegan is never called. In fact, neither are any of the touch-related UIEvent methods called as near as I can tell (nor UIApplication's / UIWindow's sendEvent:, apparently).
I would like to vary the behavior of my app slightly based on whether a touch is in progress at launch. Is there any way to detect an in-progress touch at app launch?
This cannot be done. The simple reason: The touch events don't belong to your app. Each touch belongs to some UI element (or responder). As you already know, this element gets the began, moved, ended, cancelled messages.
This is even true within a properly programmed app: All events regarding one touch are delivered to the very same object. After all, how would another object know what to do with that event, and how should the first object properly finish its expected behavior?
While you can (or could, but probably shouldn't) find a work around within your app, there's just no way for cross-app-touch passings.
And on the Mac you may query the mouse position, but in normal application flow there'll always be a mouse down before you get a mouse up event.
To be honest, I don't see any reason why this would be needed anyway... oh wait... I could split my app icon into several areas... not sure if it would already break privacy laws, though, if you get to know where the user has his icon on screen.
I think you could simply "extend" the application launch. When I had time-consuming tasks during my application launch, I used to show the same splash screen with a UIActivityIndicatorView while the work was being carried out.
You could simply create an NSTimer, wait for about 2 seconds and, during that time, check for touches while the splash screen is still showing.
To do this, in application:didFinishLaunchingWithOptions:, push a view controller that looks exactly like the splash screen and check for touches in that view controller. After those 2 seconds, proceed with normal initialisation. This behaviour also helps if you have time-consuming tasks during initialisation.
I know it's a workaround, but my guess is that it is not possible to check for touches earlier because the application runs on the main thread and the touches are also processed on the main thread. It could also be because no view controller or UIWindow is initialised and ready to listen for touches yet.
Hope it helps.
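A small sketch of that workaround (SplashViewController and the 2-second delay are illustrative; install it as the root view controller from application:didFinishLaunchingWithOptions:):

#import <UIKit/UIKit.h>

@interface SplashViewController : UIViewController
@property (nonatomic) BOOL touchSeenDuringLaunch;
@end

@implementation SplashViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Make the view look exactly like the launch image.
    self.view.backgroundColor = [UIColor whiteColor];
    // Continue with the normal launch after ~2 seconds.
    [NSTimer scheduledTimerWithTimeInterval:2.0
                                     target:self
                                   selector:@selector(finishLaunch)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // A touch that starts during this window is recorded here.
    self.touchSeenDuringLaunch = YES;
}

- (void)finishLaunch {
    // Swap in the real root view controller and pass the flag along.
    NSLog(@"touch during launch: %d", self.touchSeenDuringLaunch);
}

@end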
You might try handling the hitTest:withEvent: instead.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
According to the Apple documentation, it "Returns the farthest descendant of the receiver in the view hierarchy (including itself) that contains a specified point."
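A minimal sketch of that idea, assuming a UIWindow subclass (PassthroughWindow is a made-up name) so every touch in the app passes through one hitTest:withEvent: override; the hit view is returned unchanged, so normal delivery is not affected:

#import <UIKit/UIKit.h>

@interface PassthroughWindow : UIWindow
@end

@implementation PassthroughWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    // Observe the point/event here, then pass the result on untouched.
    NSLog(@"hit test at %@ -> %@", NSStringFromCGPoint(point), hitView);
    return hitView;
}

@end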
