UIScrollView Subclass never receives touchesBegan message for swipes - ios

I am trying to make a scroll view only scrollable on a certain region. To do this, I am subclassing UIScrollView and overriding touchesBegan (similar to this question).
Here's my (pretty simple) code.
.h
@interface SuppressableScrollView : UIScrollView
@end
.m
#import "SuppressableScrollView.h"
@implementation SuppressableScrollView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan touches=%@ event=%@", touches, event);
    [super touchesBegan:touches withEvent:event];
}

@end
touchesBegan is only being called for touches that UIScrollView doesn't normally consume (like taps). Any idea how to intercept all of the touches?
I think I am missing a concept somewhere.

I was recently looking into something similar for UITableView. UITableView is an extension of UIScrollView. Digging around inside it, I discovered that there are four gesture recognisers attached to the UIScrollView to pick up swipes and other things. I would suggest dumping out the gestureRecognizers property to see whether any are being created automatically (which I think they are). In that case the only option I can think of is to remove them, but then the scroll view will not respond to gestures at all.
So perhaps you need to look at those gesture recognisers and the gesture recogniser delegate methods to see if there is a better place to hook in.
P.S. Gesture recognisers will automatically start swallowing events once they recognise a gesture in progress.
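A rough sketch of that idea, assuming you only want to restrict where scrolling may start rather than remove the recognisers: log the built-in recognisers, then veto the scroll view's own pan gesture outside a region by overriding the UIView hook gestureRecognizerShouldBegin:. The scrollableRegion property is a hypothetical name, not something from the question, and is expressed in the scroll view's content coordinates.
@interface SuppressableScrollView : UIScrollView
@property (nonatomic, assign) CGRect scrollableRegion; // hypothetical: region where scrolling is allowed
@end

@implementation SuppressableScrollView

- (void)awakeFromNib {
    [super awakeFromNib];
    // Dump the recognizers UIScrollView creates for itself.
    NSLog(@"gestureRecognizers=%@", self.gestureRecognizers);
}

// UIView hook: lets the view veto its own gesture recognizers before they begin.
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer == self.panGestureRecognizer) {
        CGPoint location = [gestureRecognizer locationInView:self];
        return CGRectContainsPoint(self.scrollableRegion, location);
    }
    return [super gestureRecognizerShouldBegin:gestureRecognizer];
}

@end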

If the frame size is greater than the content size, your touchesBegan method may not fire.
Since it's working only for taps, my guess is that the content size of the scroll view is not set properly.
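As a sanity check, a minimal sketch of giving the scroll view a content size larger than its frame so that scrolling (and the pan gesture) can engage at all; the sizes here are arbitrary:
SuppressableScrollView *scrollView =
    [[SuppressableScrollView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
// contentSize must exceed the frame in at least one dimension,
// otherwise the scroll view has nothing to scroll.
scrollView.contentSize = CGSizeMake(320, 1200);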

Related

iOS - View that handle taps, but let swipes go through to the superview

I have an app with quite a complex UI, there's a big UIView called the cover with a UITableView underneath it. The tableView is configured with a tableHeaderView of the same height as the cover. As the tableView scrolls up, the cover moves up the screen (with various fancy animations) using a UIScrollViewDelegate. To allow users to scroll the tableView by swiping the cover, I've overridden the - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event method to always return false.
I've now added some UIButton views to the cover. I've managed to make them respond to taps by changing the way I've overridden the pointInside method, like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL isInside = [_directionsButton pointInside:[_directionsButton convertPoint:point fromView:self] withEvent:event];
    return isInside;
}
The only problem now is that if you start a swipe gesture on the button, it's caught by the button and the tableView doesn't scroll. I want to be able to ignore swipe gestures on the button (so really let them pass to the view below).
Normally, I would just make the cover view the tableHeaderView, which seems to handle this kind of behaviour really well. However, I can't do this here, due to some unique animations done on the cover as the table scrolls.
Have you tried identifying the gestures using gesture recognizers, with an action method that is called when the specified gesture is detected?
Please check this link; it may help you with that.
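One way to read that suggestion, sketched below: keep the cover's pointInside: returning NO everywhere (so swipes always fall through to the table view), and detect taps with a UITapGestureRecognizer instead, firing the button manually when a tap lands over it. directionsButton comes from the question; coverView, tableView, and handleTap: are hypothetical names, and attaching the recognizer to the table view is an assumption.
// In the view controller that owns both views:
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    [self.tableView addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)tap {
    // Convert the tap into the button's coordinate space and forward it.
    CGPoint point = [tap locationInView:self.directionsButton];
    if ([self.directionsButton pointInside:point withEvent:nil]) {
        [self.directionsButton sendActionsForControlEvents:UIControlEventTouchUpInside];
    }
    // Swipes never trigger a tap recognizer, so the table view's own
    // pan gesture keeps scrolling as before.
}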

Need to block all touches except for specific one in an overlay view for iOS

I am trying to set up a tutorial type class that presents an overlay view and requires actions from the user before continuing. I currently have a hierarchy set up as follows.
UIWindow
|---- UIViewController
| |---- UIViewA (View performing tutorial action on)
| |---- UIViewB
|
|---- UIViewT (tutorial overlay)
|---- CGRect (defined by UIViewA)
During the tutorial, views will get dragged around, new views will be created, etc., which is why I added the tutorial overlay view to the UIWindow. This way I don't have to mess with the view hierarchy within the view controller, as suggested in many places on SO. The purpose of the overlay window is to block all actions except for the required action expected by the tutorial.
Currently the tutorial overlay view is implemented as follows
@interface ZSOverlayView : UIView <UIGestureRecognizerDelegate>
@property (nonatomic, assign) CGRect activeRegion;
@end

@implementation ZSOverlayView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    return !CGRectContainsPoint(_activeRegion, point);
}

@end
Where activeRegion is the CGRect defined by UIViewA. This successfully blocks all unwanted events and gestures from making it through the overlay, outside of the activeRegion. In this specific case UIViewB does not get the event or gestures.
The problem is that I only want a single gesture to make it through for UIViewA, not all of them. For example, if UIViewA has a double tap, a pan, and a custom gesture, I may want only the double tap to be active, or perhaps only the custom gesture, or perhaps both. The tutorial doesn't know what gestures the view has, so it needs a generic way of passing along the needed ones and blocking the ones that aren't. Currently none of the gestures are blocked. Even though I have flags in place that determine which gestures should be able to make it through, I am still running into problems blocking specific ones while letting others through.
I'm unsure how to proceed because the tutorial overlay is not the delegate of any of the gesture recognizers, nor do I want it to be because by taking over as the delegate the tutorial might remove special conditions specified by the existing delegates.
Any ideas how to proceed to get the functionality I'm looking for?
I don't really like the solution, but the best answer was given by Lyndsey Scott in the comments.
If I'm understanding correctly, you could set the UIGestureRecognizerDelegates then just use conditionals in the delegate methods to specify what to do when the gesture view is the tutorial window vs when the gesture view is the main window.
I would have preferred not to rely on this method, since I was trying to have my tutorial library do all of the work, but since there hasn't been an answer in a while, I just wanted to note that this worked.
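A sketch of what that conditional-delegate idea might look like, on whichever object acts as the recognizers' delegate; tutorialIsActive and allowedGestureRecognizer are hypothetical properties the tutorial would set, not part of the original answer:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    // While the tutorial overlay is up, only the single whitelisted
    // recognizer on UIViewA is allowed to see touches.
    if (self.tutorialIsActive) {
        return gestureRecognizer == self.allowedGestureRecognizer;
    }
    // Otherwise defer to whatever behaviour the app normally wants.
    return YES;
}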
Have you tried just using - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event?
If you want to block touch events for something, you can do it there, for example:
Prevent touch events for a view
Call a helper method to determine which view(s) can be touched
etc.
Edit: Better late than never. I actually ran into needing to do this (again...) and (yet again) found the answer I was referring to in the comments below =]
Anyway, using touchesBegan, you can do the following to obtain all gesture recognizers that are receiving the touch (or touches) you are looking for:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // In this example, I'm only actually looking for (and caring) about a single touch.
    if ([touches count] == 1) {
        UITouch *touch = (UITouch *)[touches anyObject];
        // So here they are now
        NSArray *gestureRecognizersForTouch = [touch.gestureRecognizers copy];
        // ... whatever else.
    }
}
At this point, you can remove the recognizers, set a value on a property they all have access to in your object, post a notification, and so on.
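Continuing that sketch, one option for the tutorial case: inside the if block above, disable every recognizer attached to the touch except the one the tutorial allows (allowedGestureRecognizer is a hypothetical property, not from the original answer):
for (UIGestureRecognizer *recognizer in touch.gestureRecognizers) {
    if (recognizer != self.allowedGestureRecognizer) {
        recognizer.enabled = NO; // re-enable these when the tutorial step ends
    }
}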

Touch Event in Scrollview ios

Scrolling is not stopping when I touch the contact labels. How can I add this feature to this open-source project?
https://www.cocoacontrols.com/controls/scroller
If I touch the background, it works perfectly. I would like to have the same behaviour for the contact labels too.
Basically, it uses a scroll view and there is an animation while scrolling. I cannot make it stop when I touch the labels.
Any help is welcome.
Though I am unfamiliar with the scroller project, maybe this can at least get you on the right path.
The likely reason touching the contacts isn't stopping the scrolling is that the labels are receiving their own touch events for their own purpose, which is probably the desired behavior, since you would want to touch one of the contacts and have it do something. Because the touch events are intercepted by that view for that reason, you may not be able to interact with the scroll view with the same touch.
You may need to set the userInteractionEnabled property of the view surrounding each contact to false until the scrollview has stopped scrolling. There are several ways you could do this, but this might be enough to get you started on a good solution.
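A rough sketch of that idea using UIScrollViewDelegate callbacks in the scroll view's delegate; self.contactLabels is a hypothetical array standing in for the views surrounding each contact:
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView {
    for (UIView *label in self.contactLabels) {
        label.userInteractionEnabled = NO; // while scrolling, let touches reach the scroll view
    }
}

- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView {
    for (UIView *label in self.contactLabels) {
        label.userInteractionEnabled = YES; // labels become tappable again once scrolling stops
    }
}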
My situation may be similar to yours.
I built a scroll view in a storyboard, with a content view added to the scroll view. All of my UI components were placed in that content view, including two text fields. I wanted to override the - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event method and end editing in this view.
However, overriding the method in the scroll view's superview was of little help. But when I subclassed the content view and overrode that method in the subclass, everything worked.
So, based on my experience: subclass the content view and override - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, and handle the logic at the view level.
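A minimal sketch of that approach; ContentView is a hypothetical class name for the view placed inside the scroll view:
@interface ContentView : UIView
@end

@implementation ContentView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    // Dismiss the keyboard for any text field inside this view.
    [self endEditing:YES];
}

@end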

Finger Tracking and Gesture Recognizing

So I'm developing an app where all my gestures are being recognized. My problem comes when I attempt to add UIImageViews wherever the finger touches the screen.
These views are supposed to follow the finger, which they do, but the problem is that I believe they are swallowing the touches and not allowing the gestures to be recognized. I have tried:
[_fingerHolder1 setUserInteractionEnabled:NO];
[_fingerHolder2 setUserInteractionEnabled:NO];
But it doesn't seem to change anything.
I am adding these to the view in the ccTouchesBegan/Moved/Ended methods, whereas the gestures are being recognized in their respective handlers.
I have looked at using UIPanGestureRecognizer, but I'm having some trouble recognizing the swipes as well as setting the coordinates for the UIImageViews of the finger trackers while doing this. Should I experiment with this more, or is there a different solution?
Any help is appreciated!
The UIImageView will receive and process touches, hence they will not be forwarded to the cocos2d OpenGL view (also a UIView).
To make this work you need to create a subclass of UIImageView, override each touches… method, and manually forward the event to cocos2d's view. Here's the example for touchesBegan:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}
Use this UIImageView subclass in place of the original ones you use currently.
That will make regular cocos2d touch events work, and it should also make UIGestureRecognizers behave as expected if you've added those to cocos2d's view.
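For reference, a sketch of the full forwarding subclass the answer describes, with all four touches… methods forwarded; TouchForwardingImageView is a hypothetical class name, and the import assumes the usual cocos2d umbrella header:
#import "cocos2d.h"

@interface TouchForwardingImageView : UIImageView
@end

@implementation TouchForwardingImageView

// Forward every touch phase to cocos2d's OpenGL view so its own touch
// handling and any attached gesture recognizers still see the events.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [[CCDirector sharedDirector].view touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [[CCDirector sharedDirector].view touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [[CCDirector sharedDirector].view touchesCancelled:touches withEvent:event];
}

@end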
If I understand what you need (please correct me if I'm wrong), you want to move some UIViews when a drag (pan) event is detected, but you also add UIImageViews when the user touches the screen, and this disables the touches.
You should set UIImageView.userInteractionEnabled = YES (by default it is set to NO); basically, every view that should detect touches should have userInteractionEnabled = YES.
If you want to ignore some touches on some subviews you should implement:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch method of UIGestureRecognizerDelegate.
For allowing different gestures to be recognized at the same time, you should implement the method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
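A sketch of the shouldReceiveTouch: idea applied to this question, in whichever object you make the recognizers' delegate (each recognizer's delegate must be set to it); _fingerHolder1 and _fingerHolder2 come from the question:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    // Don't let the recognizer track touches that begin on the finger-tracking
    // image views; those views only visualize the finger position.
    if (touch.view == _fingerHolder1 || touch.view == _fingerHolder2) {
        return NO;
    }
    return YES;
}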

Why Does The iPad Become Nonresponsive After UIGestureRecognizer is Added to UIWindow?

Background: I need to look at every UITouch that happens in an application and see if the UITouch is NOT inside a certain UIImageView.
WildcardGestureRecognizer looked very promising and I tried it. (Code is here):
How to intercept touches events on a MKMapView or UIWebView objects?
It worked great for a sandbox/quickly-created application. However that application didn't reflect the complexity that the actual target project has. The target project has a Table View Controller and more.
After adding the WildcardGestureRecognizer to the more involved iPad application, I see that none of the other controls work once the gesture recognizer is added and a single tap happens.
Here's some code where I was playing with the idea. Again, the sandbox code does not yet have controls on it (such as a Table View Controller or even a UIButton) to see if they work after adding the gesture recognizer to the UIWindow.
Should I pick something other than the UIWindow to add the gesture recognizer to or am I going to run into the same problem regardless? Is there a better way?
sandbox code: https://github.com/finneycanhelp/GestureKata
You might want to try another approach: create a UIWindow subclass, use that in your XIB and override hitTest:withEvent:. It returns the view that has been "selected" to receive the touch events.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *v;
    v = [super hitTest:point withEvent:event];
    if (v == myImageView) {
        // Do something. Maybe return nil to prevent the touch from getting
        // sent to the image view.
    }
    return v;
}
Overriding this method can also be helpful when debugging.
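Fleshing that out slightly for the original goal (detecting touches that are NOT inside a certain UIImageView): a sketch of the window subclass, where TouchInspectingWindow and myImageView are hypothetical names. In the XIB, set the window object's custom class to this subclass (or create it in code in the app delegate).
@interface TouchInspectingWindow : UIWindow
@property (nonatomic, weak) UIImageView *myImageView; // the view to watch
@end

@implementation TouchInspectingWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    // Check descendancy rather than pointer equality, since hitTest usually
    // returns a deep subview rather than the image view itself.
    if (self.myImageView && ![hit isDescendantOfView:self.myImageView]) {
        NSLog(@"touch landed outside the image view");
    }
    return hit;
}

@end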
