Overriding sendEvent in custom UIApplication to detect hardware keyboard events

I am developing an iPad app that needs to read input from a hardware keyboard. A primary user will be touching the screen normally while another user controls certain aspects of the app from a nearby Bluetooth keyboard that is paired with the iPad.
Overriding the keyCommands property in UIResponder worked perfectly until now, but when we moved the app to Cocos2d (which uses its own responder chain), all of the keyCommands handling stopped working.
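For reference, the keyCommands setup was along these lines (a minimal sketch; the key input and handler selector are illustrative):
- (BOOL)canBecomeFirstResponder {
    return YES; // required for this responder to receive key commands
}

- (NSArray *)keyCommands {
    // Map the space bar (no modifiers) to an illustrative handler
    return @[[UIKeyCommand keyCommandWithInput:@" "
                                 modifierFlags:0
                                        action:@selector(handleSpaceKey:)]];
}

- (void)handleSpaceKey:(UIKeyCommand *)command {
    NSLog(@"Space pressed on the hardware keyboard");
}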
I tried subclassing UIApplication with an overridden sendEvent: method, something as simple as this:
#import "MyUIApplication.h"
#implementation MyUIApplication // subclass of UIApplication
-(void)sendEvent:(UIEvent *)event {
[super sendEvent:event];
NSLog(#"Event detected");
}
As far as I can tell, this successfully detects all events except for hardware keyboard events, which appear to be totally ignored. Is there some way to detect these events without using keyCommands and UIKeyCommands?
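For completeness, the subclass is installed by naming it as the principal class in main.m. A minimal sketch, assuming the class names above (AppDelegate is the usual template name):
#import <UIKit/UIKit.h>
#import "MyUIApplication.h"
#import "AppDelegate.h"

int main(int argc, char *argv[]) {
    @autoreleasepool {
        // Passing the UIApplication subclass name here is what makes
        // the overridden sendEvent: actually run
        return UIApplicationMain(argc, argv,
                                 NSStringFromClass([MyUIApplication class]),
                                 NSStringFromClass([AppDelegate class]));
    }
}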

Related

iOS 9 event for lightly double tapping home button i.e. Reachability? [duplicate]


How to stop 3D Touch from interrupting other touch events?

I have a view in the current UIViewController that draws freehand as the touch moves across the screen, using the touchesMoved: event.
Then I added a subview to that view and bound a UITapGestureRecognizer (with numberOfTapsRequired set to 2) to the subview.
When a double tap is detected, I move a UIImageView to the tapped position.
But when I try to draw again, something goes wrong: the drawn line is no longer smooth (some segments don't display).
Because of 3D Touch on the iPhone 6s and 6s Plus, I can't detect the tapCount in touchesEnded:. What should I do?
The best way is to stop 3D Touch from interfering while your app is running.
As per the Apple documentation, see here: Apple Docs
A user can turn off 3D Touch while your app is running, so read this property as part of your implementation of the traitCollectionDidChange: delegate method.
To ensure that all your users can access your app’s features, branch your code depending on whether 3D Touch is available. When it is available, take advantage of 3D Touch capabilities. When it is not available, provide alternatives such as by employing touch and hold, implemented with the UILongPressGestureRecognizer class.
Refer to iOS Human Interface Guidelines for ideas on how to enhance your app’s interactions for users with 3D Touch-capable devices while not leaving your other users behind.
BEST CODE:
@interface ViewController () <UIViewControllerPreviewingDelegate>

@property (nonatomic, strong) UILongPressGestureRecognizer *longPress;

@end
For backward compatibility I’ll also add a long press gesture recogniser here. Should our sample app be run on a device without 3D Touch support, at least the preview can be brought up via a long press gesture.
We’ll check if 3D Touch is available using something like the following method. To complete the setup, I’ve also included a custom initialiser for the long press gesture.
- (void)check3DTouch {
    // register for 3D Touch (if available)
    if (self.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
        [self registerForPreviewingWithDelegate:(id)self sourceView:self.view];
        NSLog(@"3D Touch is available! Hurra!");
        // no need for our alternative anymore
        self.longPress.enabled = NO;
    } else {
        NSLog(@"3D Touch is not available on this device. Sniff!");
        // handle a 3D Touch alternative (long gesture recognizer)
        self.longPress.enabled = YES;
    }
}

- (UILongPressGestureRecognizer *)longPress {
    if (!_longPress) {
        _longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                                   action:@selector(showPeek)];
        [self.view addGestureRecognizer:_longPress];
    }
    return _longPress;
}
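Since the user can toggle 3D Touch in Settings while the app is running (as the documentation quoted above notes), it is worth re-running the check whenever the trait collection changes. A minimal sketch building on the check3DTouch method above:
- (void)traitCollectionDidChange:(UITraitCollection *)previousTraitCollection {
    [super traitCollectionDidChange:previousTraitCollection];
    // Re-evaluate 3D Touch availability if the user toggled it in Settings
    [self check3DTouch];
}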

Detect iOS8 Reachability Gesture

Any ways to detect the new Reachability gesture of iOS8 in Objective-C?
The gesture is activated by double-tapping the Touch ID button on the iPhone 6 and iPhone 6 Plus.
There are no public APIs for it.
There are two related private API methods on UIApplication I can find (using either of these should get your app rejected from the App Store):
_setReachabilitySupported:, which presumably would enable or disable Reachability (like Spotlight)
_deactivateReachability, which would return the view to its normal place on the screen
I don't see anything that informs your application that the user has performed the gesture, however.
You could also experiment with subclassing UIWindow and overriding setFrame:. Set a breakpoint in this method, and if it fires when you enable Reachability, you can look at the stack trace for more information.
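A minimal sketch of that UIWindow experiment (the class name is illustrative, and this only gives you a place to break and inspect the stack trace; it is not a supported detection API):
#import <UIKit/UIKit.h>

@interface InspectionWindow : UIWindow // hypothetical debugging subclass
@end

@implementation InspectionWindow

- (void)setFrame:(CGRect)frame {
    [super setFrame:frame];
    // Break here; if this fires when Reachability slides the window down,
    // the stack trace may show what triggered it
    NSLog(@"Window frame changed to %@", NSStringFromCGRect(frame));
}

@end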

Intercepting all user touches for an iOS app without overriding sendEvent

I'd like to intercept all touch events occurring in an app without interfering with the app's functionality (e.g., after doing something with the touch event, pass it on to the rightful receiver). Is there any way this can be done, other than:
overriding sendEvent in the UIApplication class by subclassing it
using method swizzling on UIView to get my own touchesBegan etc. methods called
putting a UIView on top of all other views in the app, catching all touches and passing them through
so that it could serve as reliable, reusable framework functionality?
EDIT:
Adding a gesture recognizer to the application window would work - is there any other way (just out of curiosity)?
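For reference, a window-level recognizer can observe touches without interfering as long as it never cancels or delays them. A minimal sketch (the handler name is illustrative, and the installing object would need to adopt UIGestureRecognizerDelegate):
// After the window is created, e.g. in application:didFinishLaunchingWithOptions:
UILongPressGestureRecognizer *observer =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(sawTouch:)];
observer.minimumPressDuration = 0;    // fire on touch-down rather than after a delay
observer.cancelsTouchesInView = NO;   // never steal touches from the app
observer.delaysTouchesBegan = NO;
observer.delaysTouchesEnded = NO;
observer.delegate = self;
[self.window addGestureRecognizer:observer];

// Let the app's own recognizers keep working alongside the observer
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
    return YES;
}

- (void)sawTouch:(UIGestureRecognizer *)recognizer {
    NSLog(@"Touch observed at %@",
          NSStringFromCGPoint([recognizer locationInView:self.window]));
}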

Detecting touch activity in an iOS app

I need to store the current NSDate value in the AppDelegate whenever I recognize that the user has used the app.
But how do I define that the user is using the app?
I tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event on my base UIViewController, but this only detects touches on non-clickable areas, so I don't capture anything when the user taps a button (or any clickable component, actually).
By using the app I mean pretty much any interaction with it, like touching the screen while the app is active.
What would be the way of doing that?
You can do this from the AppDelegate. Whenever the app is launched, it will call applicationDidBecomeActive:, and with this you will be able to tell that the user launched the app or brought it back up.
You can then log this however you want and use it however you want.
UPDATE
If you are specifically looking to detect every time someone touches inside your app, you may want to just make sure that every button tap, table scroll, etc. calls back to a logging class and does whatever you need. And you can still keep that touchesBegan method to get all the other touches.
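A minimal sketch of the AppDelegate part (the property name is illustrative):
@interface AppDelegate ()
@property (nonatomic, strong) NSDate *lastActiveDate; // hypothetical property
@end

@implementation AppDelegate

- (void)applicationDidBecomeActive:(UIApplication *)application {
    // Record when the app was launched or brought back to the foreground
    self.lastActiveDate = [NSDate date];
    NSLog(@"App became active at %@", self.lastActiveDate);
}

@end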
The answer is right here:
Listen to all touch events in an iOS app
I subclassed UIWindow and captured every touch in the app by overriding - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event.
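A minimal sketch of that approach (the class name is illustrative; hitTest:withEvent: may be called more than once per touch, but logging there observes activity without consuming anything):
#import <UIKit/UIKit.h>

@interface TouchLoggingWindow : UIWindow // hypothetical subclass
@end

@implementation TouchLoggingWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Note the touch location, then let normal hit-testing proceed
    NSLog(@"Touch activity at %@", NSStringFromCGPoint(point));
    return [super hitTest:point withEvent:event];
}

@end
For this to take effect, the app's main window has to be created as an instance of this subclass.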
