Why Does The iPad Become Nonresponsive After UIGestureRecognizer is Added to UIWindow? - ios

Background: I need to look at every UITouch that happens in an application and see if the UITouch is NOT inside a certain UIImageView.
WildcardGestureRecognizer looked very promising and I tried it. (Code is here):
How to intercept touches events on a MKMapView or UIWebView objects?
It worked great for a sandbox/quickly-created application. However that application didn't reflect the complexity that the actual target project has. The target project has a Table View Controller and more.
After adding the WildcardGestureRecognizer to the more involved iPad application, I see that none of the other controls work once the gesture recognizer is added and a single tap happens.
Here's some code where I was playing with the idea. Again, the sandbox code does not yet have controls on it (such as a Table View Controller or even a UIButton) to see if they work after adding the gesture recognizer to the UIWindow.
Should I pick something other than the UIWindow to add the gesture recognizer to or am I going to run into the same problem regardless? Is there a better way?
sandbox code: https://github.com/finneycanhelp/GestureKata
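One thing worth checking before abandoning the UIWindow approach: by default a recognizer cancels the touches it recognizes, which would explain the dead controls. A hedged sketch (WildcardGestureRecognizer and the window come from the question; the flag values are the assumption):

```objc
// Keep the window-level recognizer purely observational so it never
// cancels or delays the touches destined for the controls underneath.
WildcardGestureRecognizer *tapInterceptor = [[WildcardGestureRecognizer alloc] init];
tapInterceptor.cancelsTouchesInView = NO;
tapInterceptor.delaysTouchesBegan = NO;
tapInterceptor.delaysTouchesEnded = NO;
[self.window addGestureRecognizer:tapInterceptor];
```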

You might want to try another approach: create a UIWindow subclass, use that in your XIB and override hitTest:withEvent:. It returns the view that has been "selected" to receive the touch events.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v = [super hitTest:point withEvent:event];
    if (v == myImageView) {
        // Do something. Maybe return nil to prevent the touch from getting
        // sent to the image view.
    }
    return v;
}
Overriding this method can also be helpful when debugging.

Related

Handling a touch event across multiple subviews

I am new to iOS development. I have a custom-drawn view composed of multiple subviews covering the target area on screen. Specifically, this is a board game like chess, where I use a view for each square. The squares are created as subviews of a UIView, and there is one UIViewController for this. From what I read, I have to implement touchesBegan:, touchesEnded:, etc. in my UIView to handle these, but none of these methods are getting called. I added them to the base view and all the subviews. So:
How do I simulate these touch events in the iOS Simulator? A mouse click does not trigger touchesBegan:/touchesEnded: on any view.
Ideally I would like to handle these in the UIViewController, because I want to run the touch through some logic. Is that possible? How do I achieve it?
Please refer to THIS.
It is a tutorial in Apple's sample code and describes how to handle touches very nicely.
Just run the sample code and go through the description; you will get a clear idea of how touches work in iOS.
Turns out that when I add the view programmatically rather than through the storyboard, the userInteractionEnabled property is set to NO by default. After setting it to YES, the touches* methods get called in the view.
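A minimal sketch of that fix (the class and view names are hypothetical):

```objc
// Views created in code may need user interaction enabled explicitly
// before touchesBegan:/touchesEnded: will fire.
SquareView *square = [[SquareView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
square.userInteractionEnabled = YES;
[boardView addSubview:square];
```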
Check this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (![touch.view isKindOfClass:[yourView class]])
    {
        // The touch started outside yourView; handle that case here.
    }
}
Hope this will help.

iOS - View that handle taps, but let swipes go through to the superview

I have an app with quite a complex UI: there's a big UIView called the cover, with a UITableView underneath it. The tableView is configured with a tableHeaderView of the same height as the cover. As the tableView scrolls up, the cover moves up the screen (with various fancy animations) using a UIScrollViewDelegate. To allow users to scroll the tableView by swiping the cover, I've overridden the - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event method to always return NO.
I've now added some UIButton views to the cover. I've managed to make them respond to taps by changing the way I've overriden the pointInside method like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL isInside = [_directionsButton pointInside:[_directionsButton convertPoint:point fromView:self]
                                         withEvent:event];
    return isInside;
}
The only problem now is that if you start a swipe gesture on the button, it's caught by the button and the tableView doesn't scroll. I want to be able to ignore swipe gestures on the button (so really let them pass to the view below).
Normally, I would just make the cover view the tableHeaderView, which seems to handle this kind of behaviour really well. However, I can't do this here, due to some unique animations done on the cover as the table scrolls.
Have you tried identifying the gestures using gesture recognisers, with an action method that is called when the specified gesture is detected?
Please check this link; it may help you with that.
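One way to sketch the idea (hedged; _directionsButton is from the question, everything else is an assumption) is to handle taps on the button with a tap recognizer that doesn't cancel other touches, so a swipe that starts on the button can still reach the table's pan recognizer:

```objc
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(directionsTapped:)];
tap.cancelsTouchesInView = NO;  // let non-tap touches continue past the button
[_directionsButton addGestureRecognizer:tap];
```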

Need to block all touches except for specific one in an overlay view for iOS

I am trying to set up a tutorial type class that presents an overlay view and requires actions from the user before continuing. I currently have a hierarchy set up as follows.
UIWindow
|---- UIViewController
| |---- UIViewA (View performing tutorial action on)
| |---- UIViewB
|
|---- UIViewT (tutorial overlay)
|---- CGRect (defined by UIViewA)
During the tutorial, views will get dragged around, new views will be created, etc., which is why I added the tutorial overlay view to the UIWindow. This way I don't have to mess with the view hierarchy within the view controller, as suggested in many places on SO. The purpose of the overlay window is to block all actions except for the required action expected by the tutorial.
Currently the tutorial overlay view is implemented as follows
@interface ZSOverlayView : UIView <UIGestureRecognizerDelegate>
@property (nonatomic, assign) CGRect activeRegion;
@end

@implementation ZSOverlayView
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return !CGRectContainsPoint(_activeRegion, point);
}
@end
Where activeRegion is the CGRect defined by UIViewA. This successfully blocks all unwanted events and gestures from making it through the overlay, outside of the activeRegion. In this specific case UIViewB does not get the event or gestures.
The problem is that I only want a single gesture to make it through to UIViewA, not all of them. For example, if UIViewA has double-tap, pan, and custom gesture recognizers, I may want only the double tap active, or only the custom gesture, or perhaps both. The tutorial doesn't know what gestures the view has, so it needs a generic way of passing along the needed ones and blocking the ones that aren't. Currently none of the gestures are blocked. Even with flags in place (which I currently have) that determine which gestures should make it through, I still run into the problem of how to block specific ones and let others through.
I'm unsure how to proceed because the tutorial overlay is not the delegate of any of the gesture recognizers, nor do I want it to be because by taking over as the delegate the tutorial might remove special conditions specified by the existing delegates.
Any ideas how to proceed to get the functionality I'm looking for?
I don't really like the solution, but the best answer was given by Lyndsey Scott in the comments.
If I'm understanding correctly, you could set the UIGestureRecognizerDelegates then just use conditionals in the delegate methods to specify what to do when the gesture view is the tutorial window vs when the gesture view is the main window.
I would have preferred not to rely on this method since I was trying to have my tutorial library do all of the work but since there hasn't been an answer in a while, I just wanted to throw it out there that this worked.
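A sketch of what that conditional delegate method might look like (hedged; tutorialActive and allowedRecognizer are hypothetical properties, not from the question):

```objc
// UIGestureRecognizerDelegate: gate each recognizer while the tutorial is up.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)recognizer
       shouldReceiveTouch:(UITouch *)touch {
    if (self.tutorialActive) {
        // Only the single gesture the tutorial step expects may proceed.
        return recognizer == self.allowedRecognizer;
    }
    return YES;  // normal behaviour outside the tutorial
}
```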
Have you tried to just use -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event ?
If you want to block touch events for something, you can just do it here, like:
Prevent touch events for a view
Call a helper method to determine which view(s) can be touched
etc.
Edit: Better late than never. I actually ran into needing to do this (again...) and (yet again) found the answer I was referring to in the comments below =]
Anyway, using touchesBegan:, you could do this to obtain all gesture recognizers that are receiving the touch (or touches) you are looking for:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // In this example, I'm only actually looking for (and caring about) a single touch.
    if ([touches count] == 1) {
        UITouch *touch = (UITouch *)[touches anyObject];
        // So here they are now
        NSArray *gestureRecognizersForTouch = [touch.gestureRecognizers copy];
        // ... whatever else.
    }
}
At this point, you can either remove the recognizers, set a value to a property they all have access to in your object, submit a notification, etc. etc. etc.

UIScrollView Subclass never receives touchesBegan message for swipes

I am trying to make a scroll view only scrollable on a certain region. To do this, I am subclassing UIScrollView and overriding touchesBegan (similar to this question).
Here's my (pretty simple) code.
.h
@interface SuppressableScrollView : UIScrollView
@end
.m
#import "SuppressableScrollView.h"

@implementation SuppressableScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan touches=%@ event=%@", touches, event);
    [super touchesBegan:touches withEvent:event];
}

@end
touchesBegan is only being called for touches that UIScrollView doesn't normally consume (like taps). Any idea how to intercept all of the touches?
I think I am missing a concept somewhere.
I was recently looking into something similar for UITableView. UITableView is an extension of UIScrollView. Digging around inside it, I discovered that there are four gesture recognisers attached to the UIScrollView to pick up swipes and other things. I would suggest dumping out the gestureRecognizers property to see whether any are being created automatically (which I think they are). In that case the only option I can think of is to remove them, but then the scroll view will not respond to gestures.
So perhaps you need to look at those gesture recognisers, and at the gesture-recogniser delegate methods you can use, to see if there is a better place to hook in.
P.S. Gesture recognisers will automatically start swallowing events once they recognise a gesture in progress.
If the frame size is greater than the content size, your touchesBegan: method may not fire.
Since it's working only for taps, my guess is that the content size of the scroll view is not set properly.
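A quick sketch of that check (the sizes are made up): the scroll view only engages its pan handling when there is something to scroll, so make sure contentSize exceeds the frame.

```objc
SuppressableScrollView *scroll =
    [[SuppressableScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
scroll.contentSize = CGSizeMake(320, 1000);  // taller than the frame, so it scrolls
```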

UIView touch handling behavior changed with Xcode 4.2?

I upgraded my iPad to 5.0 a couple days ago, and upgraded Xcode to 4.2 at the same time so I could continue to test my apps. Now I am having problems with touch code in several apps that worked with previous versions of Xcode.
I subclassed UIImageView to add some dragging features by overriding touchesBegan: and touchesMoved:. I did not override touchesEnded: in the subclass, but handled that in the view controller for the view that contains the image view.
I pulled the subclassed UIImageView into a new project for testing, and have narrowed down the issue to the fact that the parent UIView (the template created by Xcode) does not seem to be forwarding touch events to the view controller (also created by Xcode).
If I add this to my subclass:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [self.nextResponder touchesEnded:touches withEvent:event];
}
and this to my parent view's view controller:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ViewController");
}
when I let go of the image I am dragging around the screen, I get the "touches ended event in ImageToDrag", but not the log from the view controller.
However, if I intentionally skip over the view by doing this in the subclassed view:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touches ended event in ImageToDrag");
    [[self.nextResponder nextResponder] touchesEnded:touches withEvent:event];
}
then I get both log entries.
The only explanation I can come up with is that for some reason, UIView is consuming the touchesEnded event and not passing it to the view controller.
I have verified that exclusiveTouch is set to NO, and userInteractionEnabled is set to YES on the parent view and the subclassed UIImageView.
I have also tested compiling for iOS 5.0 and iOS 4.2, and deploying the test app to both an iOS 5 iPad and iOS 4.3.1 iPad.
The only way I have been able to get the touch event to the viewController is by skipping over the view and using the double nextResponder in the subclass. Although that method functions, it seems like a hack to me and I'm sure it will come back to bite me later.
Has anybody else seen this behavior? Is there any way for me to find out what the UIView is doing with my touch events?
Thanks,
Dan
I've been trying to track down a similar issue for the last few hours. I finally managed to solve it with the help of this post:
Actually it looks like I just managed to solve it, using the hint from
https://devforums.apple.com/message/519690#519690
Earlier, I just
forwarded the touchesEnded event to self.nextResponder. When I added
touchesBegan, Moved, Cancelled handlers with similar implementations
as the touchesEnded, the event seems to bubble up to the root view
controller.
So I guess on iOS5, views discard touchesEnded events
when they have not seen the relevant touchesBegan.
I didn't need to add Moved/etc.; I just forwarded touchesBegan: as well, and then touchesEnded: started working again!
Some touch handling did change in iOS 5.0, especially if you re-link your application against the 5.0 SDK.
There's a section on UIView's touch-handling methods that says this:
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
So if you do one, you need to do them all. I know UIKit started taking steps to make sure this was the case in 5.0.
So I'd start there - override all the methods on your view and see what happens.
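Following that advice, a sketch of the UIImageView subclass overriding and forwarding all four methods (forwarding to nextResponder is the assumption, matching the asker's existing code):

```objc
// Forward the whole touch sequence; iOS 5 appears to discard a
// touchesEnded: for which it never saw the matching touchesBegan:.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesEnded:touches withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.nextResponder touchesCancelled:touches withEvent:event];
}
```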