Detect iOS 8 Reachability Gesture - ios

Is there any way to detect the new Reachability gesture of iOS 8 in Objective-C?
The gesture is activated by double-tapping the Touch ID button on the iPhone 6 and iPhone 6 Plus.

There are no public APIs for it.
There are two related private API methods on UIApplication that I can find (using either of these should get your app rejected from the App Store):
_setReachabilitySupported:, which presumably enables or disables Reachability (like Spotlight)
_deactivateReachability, which would return the view to its normal place on the screen
I don't see anything that informs your application that the user has performed the gesture, however.
You could also experiment with subclassing UIWindow and overriding setFrame:. Set a breakpoint in this method, and if it fires when you trigger Reachability, you can look at the stack trace for more information.
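A minimal sketch of that experiment, assuming the subclass is installed as the app's key window (the class name is invented; setFrame: is the real override):

#import <UIKit/UIKit.h>

@interface ReachabilityProbeWindow : UIWindow
@end

@implementation ReachabilityProbeWindow

// Put a breakpoint inside this method, trigger Reachability, and inspect
// the stack trace to see what drives the frame change.
- (void)setFrame:(CGRect)frame {
    NSLog(@"Window frame changing to %@", NSStringFromCGRect(frame));
    [super setFrame:frame];
}

@end

To put it in play, return an instance of this class from your app delegate's window property instead of a plain UIWindow.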

Related

iOS 9 event for lightly double tapping home button i.e. Reachability? [duplicate]


canPerformAction:withSender: method called on tap in iOS 9

One thing I observed in iOS 9.0 is that when I tap on a button or a table view, canPerformAction:withSender: is called with the sender as a UIButton. I am using this method to prepare my customized option menu.
I did not observe this in previous iOS versions. Can anyone point me to the relevant API change? I went through the overall iOS changes, but I did not find this mentioned in any change log or change history.
Per the Apple documentation:
iOS 3.0 introduced system capabilities for generating motion events, specifically the motion of shaking the device. The event-handling methods for these kinds of events are motionBegan:withEvent:, motionEnded:withEvent:, and motionCancelled:withEvent:. Additionally for iOS 3.0, the canPerformAction:withSender: method allows responders to validate commands in the user interface, while the undoManager property returns the nearest NSUndoManager object in the responder chain.
So all UIResponder subclasses are entitled to receive a callback for canPerformAction:withSender:. You should use the sender parameter to decide how to handle it, as in the sketch below.
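A minimal sketch, assuming a hypothetical UITextView subclass that only offers copy and paste in its menu (the class name and menu choices are illustrative; the override itself is the standard UIResponder API):

#import <UIKit/UIKit.h>

@interface RestrictedTextView : UITextView
@end

@implementation RestrictedTextView

- (BOOL)canPerformAction:(SEL)action withSender:(id)sender {
    // On iOS 9 this can also fire for non-menu senders such as UIButton;
    // defer to the default behavior for those.
    if ([sender isKindOfClass:[UIButton class]]) {
        return [super canPerformAction:action withSender:sender];
    }
    // Only copy: and paste: appear in the customized menu.
    return (action == @selector(copy:) || action == @selector(paste:));
}

@end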

How can I detect from an iOS keyboard extension if the user swiped up Control Center?

I am developing an iOS keyboard extension, and I use scroll gestures on the keyboard. Sometimes while using the keyboard I swipe up Control Center, and my keyboard stops working correctly. Is there any way to detect whether Control Center has become visible or invisible?
You can't do it directly. The most you can know is that your app was deactivated and then activated again. It could be because of the control center, it could be because of the notification center, it could be because a phone call came in, it could be because the user went into the app switcher and came back again...
Here is a possible workaround you can try:
AAWindow is a UIWindow subclass that enables behavior like adaptive round corners and detecting when Control Center is opened. It probably does what you want: you simply subscribe to an NSNotification and can react to the user opening Control Center. Detailed instructions and setup are on GitHub:
https://github.com/aaronabentheuer/AAWindow
[AAWindow: This is accomplished by combining an NSTimer with an override of sendEvent: in UIWindow, which receives all touches without blocking them. You receive every touch and check whether it is near the lower edge of the screen; if so, you start a half-second timer, and if applicationWillResignActive is called while that timer is running, you can be almost certain that Control Center was opened. The interval has to vary when there is no status bar, because the app is then in fullscreen and it can take the user up to 3 seconds to open Control Center.]
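If you want to prototype the heuristic without pulling in the library, here is a condensed sketch of the same idea (the class name, notification name, and 20-point edge threshold are assumptions, and a timestamp stands in for the NSTimer):

#import <UIKit/UIKit.h>

static NSString * const CCProbeDidDetectControlCenter = @"CCProbeDidDetectControlCenter";

@interface CCProbeWindow : UIWindow
@property (nonatomic, strong) NSDate *lastBottomEdgeTouch;
@end

@implementation CCProbeWindow

- (void)sendEvent:(UIEvent *)event {
    // Observe every touch without consuming it.
    for (UITouch *touch in [event allTouches]) {
        if (touch.phase == UITouchPhaseBegan &&
            [touch locationInView:self].y > CGRectGetMaxY(self.bounds) - 20.0) {
            self.lastBottomEdgeTouch = [NSDate date];
        }
    }
    [super sendEvent:event];
}

// Call this from applicationWillResignActive: in the app delegate.
- (void)noteWillResignActive {
    if (self.lastBottomEdgeTouch &&
        -[self.lastBottomEdgeTouch timeIntervalSinceNow] < 0.5) {
        // Deactivation right after a bottom-edge touch: Control Center
        // is the most likely cause.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:CCProbeDidDetectControlCenter object:self];
    }
}

@end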
Hope this helps you figure out the exact solution to your problem.

Intercepting all user touches for an iOS app without overriding sendEvent

I'd like to intercept all touch events occurring in an app without interfering with the app's functionality (e.g. after doing something with the touch event, pass it on to the rightful receiver). Is there any way it can be done other than:
overriding sendEvent: in a UIApplication subclass
using method swizzling on UIView to get my own touchesBegan: etc. called
putting a UIView on top of all other views in the app, catching all touches and passing them through
so that it could be used as reliable external-framework functionality?
EDIT:
Adding a gesture recognizer to the application window would work (see the sketch below). Is there any other way, just out of curiosity?
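A minimal sketch of the gesture-recognizer approach, assuming you can reach the application's window (TouchObserver is an invented name; the rest is standard UIKit):

#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h> // required to set self.state

@interface TouchObserver : UIGestureRecognizer
@end

@implementation TouchObserver

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        NSLog(@"Touch began at %@",
              NSStringFromCGPoint([touch locationInView:self.view]));
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Fail once the sequence ends so the recognizer resets; since it never
    // recognizes, it never cancels or delays the touches it observes.
    self.state = UIGestureRecognizerStateFailed;
}

@end

Attach it once, e.g. in application:didFinishLaunchingWithOptions::

TouchObserver *observer = [[TouchObserver alloc] init];
observer.cancelsTouchesInView = NO; // never steal touches from views
[self.window addGestureRecognizer:observer];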

iOS jailbroken device: intercept and block the swipe-up gesture that opens Control Center

What I need is to intercept and block the swipe-up gesture that opens Control Center while the user is using an app (not in SpringBoard), and replace it with a new action. When the user is in SpringBoard everything must behave as by default, so he must still be able to open Control Center.
Which method must I hook to intercept the Control Center call under my condition (the user is not in SpringBoard)?
Thanks
From browsing the iOS 6 headers, it seems SBPanGestureRecognizer inherits from SBGestureRecognizer. SBGestureRecognizer has these methods:
- (void)touchesEnded:(struct __SBGestureContext *)arg1;
- (void)touchesMoved:(struct __SBGestureContext *)arg1;
- (void)touchesBegan:(struct __SBGestureContext *)arg1;
I'd look into hooking one of those.
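For example, a hypothetical Logos (Theos) hook might look like the sketch below. Everything SpringBoard-related here is private and version-dependent; __SBGestureContext is opaque and is only passed through, and the frontmost-app check is left as a stub:

// Tweak.xm

static BOOL userIsInApp(void) {
    // Stub: replace with a frontmost-app check appropriate to your
    // iOS version (SpringBoard's private classes change between releases).
    return NO;
}

%hook SBPanGestureRecognizer

- (void)touchesBegan:(struct __SBGestureContext *)context {
    if (userIsInApp()) {
        // Swallow the gesture and run the replacement action here,
        // so Control Center never opens inside apps.
        return;
    }
    %orig; // in SpringBoard: default behavior, Control Center still works
}

%end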
