Detecting touch activity in an iOS app

I need to store an NSDate value (the current time) in the AppDelegate when I recognize that the user has used the app.
But how do I define that the user is using the app?
I tried -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event on my base UIViewController, but this only detects touches on non-clickable areas, so I don't capture anything when the user taps a button (or any clickable component, actually).
By "using the app" I mean pretty much any interaction with it, like touching the screen while the app is active.
What would be the way of doing that?

You can do this from the AppDelegate. Whenever the app is launched, it will call applicationDidBecomeActive:, and with this you will be able to tell that the user launched the app or brought it back to the foreground.
You can then log this however you want and use it however you want.
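A minimal sketch of that, assuming a hypothetical lastActivityDate property on the AppDelegate:
- (void)applicationDidBecomeActive:(UIApplication *)application {
    // Called on launch and on every return to the foreground; record the
    // moment as the latest known user activity.
    self.lastActivityDate = [NSDate date];
}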
UPDATE
If you are specifically looking to detect every time someone touches inside your app, you may want to make sure that every button tap, table scroll, etc. calls back to a logging class and does whatever you need. And you can still keep that touchesBegan method to catch all other touches.

The answer is right here:
Listen to all touch events in an iOS app
I overrode the UIWindow class and captured every touch in the app using - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
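For reference, a minimal sketch of that approach, assuming a hypothetical TouchObservingWindow subclass installed as the app's key window:
#import <UIKit/UIKit.h>

@interface TouchObservingWindow : UIWindow
@end

@implementation TouchObservingWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Runs for every touch routed through the window, including taps on
    // buttons and other interactive controls. It can fire more than once
    // per touch, so treat it as an activity signal, not an exact touch count.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"UserDidTouchScreenNotification"
                                                        object:nil];
    return [super hitTest:point withEvent:event];
}

@end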

Related

Retrieve UIPreviewInteraction associated touch

I'm using the new UIPreviewInteraction API and want to know the location where the user lifts up his finger.
Basically the flow that I want to create is:
User 3D touches on something.
Bubbles appear around his finger.
User slides his finger on top of one of them.
Lifts his finger and the app registers his selection.
The UIPreviewInteraction API doesn't have a reference to the UITouch that initiated the interaction.
Is there another way to get it?
Alright, turns out I was going about it the wrong way.
I was trying to receive touch updates in the previewInteraction:didUpdatePreviewTransition:ended: method.
But there you'll only receive updates while the 'peek' is happening. After it ends, you no longer receive any updates.
However, if you implement the previewInteraction:didUpdateCommitTransition:ended: delegate method, you'll continue receiving touch location updates there using this code:
CGPoint touchPoint = [previewInteraction locationInCoordinateSpace:previewInteractionView];
You can find sample code from the Apple WWDC video here
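Put together, a minimal sketch of that delegate method, assuming the controller conforms to UIPreviewInteractionDelegate; previewInteractionView and the two bubble helpers are hypothetical names:
- (void)previewInteraction:(UIPreviewInteraction *)previewInteraction
 didUpdateCommitTransition:(CGFloat)transitionProgress
                     ended:(BOOL)ended {
    // Unlike the preview variant, this keeps reporting after the 'peek' ends.
    CGPoint touchPoint = [previewInteraction locationInCoordinateSpace:self.previewInteractionView];
    if (ended) {
        // Finger lifted: register whichever bubble is under the final point.
        [self selectBubbleAtPoint:touchPoint]; // hypothetical helper
    } else {
        // Finger still down: track it so the hovered bubble can highlight.
        [self highlightBubbleAtPoint:touchPoint]; // hypothetical helper
    }
}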

Overriding sendEvent in custom UIApplication to detect hardware keyboard event

I am developing an iPad app that needs to read input from a hardware keyboard. A primary user will be touching the screen normally while another user controls certain aspects of the app from a nearby Bluetooth keyboard that is paired with the iPad.
Overriding the keyCommands property in UIResponder has worked perfectly until now. But when we moved the app to Cocos2d (which uses its own responder chain) all the keyCommands stuff stopped working.
I tried subclassing UIApplication with an overridden sendEvent: method, something as simple as this:
#import "MyUIApplication.h"
#implementation MyUIApplication // subclass of UIApplication
-(void)sendEvent:(UIEvent *)event {
[super sendEvent:event];
NSLog(#"Event detected");
}
As far as I can tell, this successfully detects all events except for hardware keyboard events, which appear to be totally ignored. Is there some way to detect these events without using keyCommands and UIKeyCommands?
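For what it's worth, a subclass like this only takes effect when it is named in main.m; a minimal sketch:
#import <UIKit/UIKit.h>
#import "MyUIApplication.h"
#import "AppDelegate.h"

int main(int argc, char *argv[]) {
    @autoreleasepool {
        // Passing the subclass name as the third argument makes UIKit
        // instantiate MyUIApplication instead of plain UIApplication.
        return UIApplicationMain(argc, argv,
                                 NSStringFromClass([MyUIApplication class]),
                                 NSStringFromClass([AppDelegate class]));
    }
}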

Prevent Alert View from cancelling Long Press Gesture

I have a UILongPressGestureRecognizer that detects 3-second presses in order to put the app into edit mode. Sometimes an alert view can pop up during that time, due to other things going on in the app. When that happens, the user could abandon the long press and handle the alert. But it must also be possible for the user to keep pressing, and the app needs to go into edit mode even with the alert still there (the app will then dismiss the alert itself).
However, when the UIAlertView pops up, the OS automatically cancels all ongoing gestures, thereby cutting the long press short. The gesture recognizer receives a touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event message. By setting a breakpoint in Xcode I can clearly see that it is caused by the UIAlertView popping up.
Does anybody know whether this can be prevented, or know a work-around?
This is for iOS 7, if it makes a difference.

Getting touches at launch on iOS

On the Mac, one can always get the location of the mouse "outside the event stream" (i.e., even if you haven't subscribed to any delegate methods for mouseUp: et al.) by calling [NSEvent mouseLocation].
Is there any way on iOS to get the current touches without listening for touchesBegan:?
I ask because there is at least one situation in which touchesBegan: is not called: at app launch, if a touch is already in progress, touchesBegan: is never called. In fact, as near as I can tell, neither are any of the other touch-related UIEvent methods (nor, apparently, UIApplication's / UIWindow's sendEvent:).
I would like to vary the behavior of my app slightly based on whether a touch is in progress at launch. Is there any way to detect an in-progress touch at app launch?
This cannot be done. The simple reason: the touch events don't belong to your app. Each touch belongs to some UI element (or responder). As you already know, that element gets the began, moved, ended, and cancelled messages.
This is even true within a properly programmed app: all events regarding one touch are delivered to the very same object. After all, how would another object know what to do with that event, and how should the first object properly finish its expected behavior?
While you can (or could, but probably shouldn't) find a workaround within your app, there's just no way to pass touches across apps.
And on the Mac you may query the mouse position, but in the normal application flow there will always be a mouse-down before you get a mouse-up event.
To be honest, I don't see any reason why this would be needed anyway... oh wait... I could split my app icon into several areas... not sure if it would already break privacy laws, though, if you got to know where the user keeps your icon on screen.
I think you could simply "extend" the application launch. When I had time-consuming tasks during my application launch, I used to show the same splash screen with a UIActivityIndicatorView while the work was being carried out.
You could simply create an NSTimer, wait for about 2 seconds, and check for touches during this time, while the splash screen is still showing.
To do this, in application:didFinishLaunchingWithOptions:, push a view controller that looks exactly like the splash screen and check for touches in that view controller. After those 2 seconds, proceed with normal initialisation. This behaviour also helps if you have time-consuming tasks during initialisation. A sketch follows below.
I know it's a workaround, but my guess is that it is not possible to check for touches at launch because the application runs on the main thread and touches are also processed on the main thread. It could also be because there are no view controllers or UIWindow initialised and ready to listen for touches yet.
Hope it helps.
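A minimal sketch of that workaround, with hypothetical names (SplashViewController stands in for the splash-screen lookalike):
#import <UIKit/UIKit.h>

@interface SplashViewController : UIViewController
@property (nonatomic) BOOL touchDetected;
@end

@implementation SplashViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Keep the fake splash up for ~2 seconds while watching for touches.
    [NSTimer scheduledTimerWithTimeInterval:2.0
                                     target:self
                                   selector:@selector(finishLaunch)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.touchDetected = YES; // a touch arrived during the extended launch
}

- (void)finishLaunch {
    // Proceed with normal initialisation here, branching on self.touchDetected.
}

@end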
You might try handling the hitTest:withEvent: instead.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
According to the Apple documentation, it "Returns the farthest descendant of the receiver in the view hierarchy (including itself) that contains a specified point."
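For example, a minimal sketch in a custom root view (a hypothetical subclass); every touch destined for a descendant passes through this lookup first:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hit = [super hitTest:point withEvent:event];
    if (hit) {
        // Log which view is about to receive the touch at this point.
        NSLog(@"Touch at %@ will be delivered to %@",
              NSStringFromCGPoint(point), hit);
    }
    return hit;
}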

How to enable my screen to respond to a UITouch only after the user is prompted with a message in my iOS app?

I need to enable my screen to respond to a user's touch, but only after they are prompted by a message that appears via an MBProgressHUD. I realize that I need to implement the
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
method, but as I said, I need to do this only when the user has been prompted with a message, and not before. Can this be done, and if so, how?
Just use a Boolean flag called userHasRespondedToMessage. Set it to NO until the user has responded to the message. Then, in your touches method, if the Boolean is NO, just return instead of handling the touch. Or use any other way you want to enable or disable touches.
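A minimal sketch of that flag approach, assuming userHasRespondedToMessage is a property on the view controller:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.userHasRespondedToMessage) {
        return; // ignore touches until the user has responded to the message
    }
    // Handle the touch normally from here on.
}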
That depends on your case, but here is the general way to handle it. When you need to enable/disable touches in a view, you enable/disable them like this:
BOOL wantWhatTouch = NO; // or YES, depending on your case
[self.view setUserInteractionEnabled:wantWhatTouch];
The code above will disable touches on your view.
So, if you now want to enable touches again, just do it at the moment you remove/hide your prompt message. Say you use a button to remove/hide your alert; then just do:
[self.view setUserInteractionEnabled:YES];
and you'll find your touches working again.
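Tied to the HUD specifically, a minimal sketch assuming the classic MBProgressHUD API and that the controller is set as the MBProgressHUDDelegate:
- (void)showPromptMessage {
    // Disable touches while the HUD message is up.
    self.view.userInteractionEnabled = NO;
    MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
    hud.delegate = self;
    [hud hide:YES afterDelay:2.0];
}

// MBProgressHUDDelegate callback: the message is gone, so touches may resume.
- (void)hudWasHidden:(MBProgressHUD *)hud {
    self.view.userInteractionEnabled = YES;
}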
