I want both my UIScrollView and its subviews to receive all touch events inside the subview. Each can respond in its own way.
Alternatively, if tap gestures were forwarded to subviews, all would be well.
A lot of people are struggling in this general area. Here are a few of the many related questions:
How does UIScrollView steal touches from its subviews
How to steal touches from UIScrollView?
How to Cancel Scrolling in UIScrollView
Incidentally, if I override hitTest:withEvent: in the scroll view, I do see the touches as long as userInteractionEnabled is YES. But that doesn't really solve my problem, because:
1) At that point, I don't know if it's a tap or not.
2) Sometimes I need to set userInteractionEnabled to NO.
EDIT: To clarify, yes, I want to treat taps differently from pans. Taps should be handled by subviews. Pans can be handled by the scroll view in the usual way.
First, a disclaimer. If you set userInteractionEnabled to NO on the UIScrollView, no touch events will be passed to the subviews. So far as I'm aware, there's no way around that, with one exception: intercept touch events on the superview of the UIScrollView and specifically pass those events on to the subviews of the UIScrollView. To be honest, though, I don't know why you would want to do this. If you want to disable specific UIScrollView functionality (like...well, scrolling), you can do that easily enough without disabling user interaction.
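For example, a quick sketch of what that looks like (assuming a scrollView property; the point is only which flag gets flipped):
// Subviews still receive touches; only the scrolling behaviour is turned off.
self.scrollView.scrollEnabled = NO;

// By contrast, this would cut off touch delivery to the scroll view AND all of its subviews:
// self.scrollView.userInteractionEnabled = NO;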
If I understand your question, you need tap events to be processed by the UIScrollView and passed to the subviews? In any case (whatever the gesture is), I think what you're looking for is the protocol method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: in the protocol UIGestureRecognizerDelegate. In your subviews, whatever gesture recognizers you have, set a delegate (probably whatever class is setting up the UIGestureRecognizer in the first place) on the gesture recognizer. Override the above method and return YES. Now this gesture will be recognized along with any other recognizers that might have 'stolen' the gesture (in your case, a tap). Using this method you can even fine-tune your code to only send certain kinds of gestures to the subviews or send the gesture only in certain situations. It gives you a lot of control. Just be sure to read about the method, especially this part:
This method is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other gesture recognizer from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition; returning NO, on the other hand, is not guaranteed to prevent simultaneous recognition because the other gesture recognizer's delegate may return YES.
Of course, there's a caveat: this only applies to gesture recognizers. So you may still have problems if you're trying to use touchesBegan:, touchesEnded:, etc. to process the touches. You can, of course, use hitTest: to send raw touch events on to the subviews, but why? Why process the events using those methods in UIView, when you can attach a UIGestureRecognizer to a view and get all of that functionality for free? If you need touches processed in a way that no standard UIGestureRecognizer can provide, subclass UIGestureRecognizer and process the touches there. That way you get all the functionality of a UIGestureRecognizer along with your own custom touch processing. I really think Apple intended for UIGestureRecognizer to replace most (if not all) of the custom touch processing code that developers use on UIView. It allows for code reuse and it's a lot easier to deal with when mitigating what code processes what touch event.
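For what it's worth, a minimal sketch of the delegate approach described above might look like this (SubviewController, subviewOfScrollView, and handleTap: are placeholder names, not anything from the question):
#import <UIKit/UIKit.h>

@interface SubviewController : UIViewController <UIGestureRecognizerDelegate>
// Hypothetical subview that lives inside the scroll view.
@property (nonatomic, strong) UIView *subviewOfScrollView;
@end

@implementation SubviewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    tapRecognizer.delegate = self; // so the delegate callback below is consulted
    [self.subviewOfScrollView addGestureRecognizer:tapRecognizer];
}

// Let the tap be recognized even while the scroll view's own recognizers are active.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Subview tapped at %@", NSStringFromCGPoint([recognizer locationInView:recognizer.view]));
}

@end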
I don't know if this can help you, but I had a similar problem, where I wanted the scroll view to handle double-taps but forward single taps to subviews. Here is the code used in a CustomScrollView:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Coordinates
    CGPoint point = [touch locationInView:[self.subviews objectAtIndex:0]];
    // One tap, forward
    if (touch.tapCount == 1) {
        // for each subview
        for (UIView *overlayView in self.subviews) {
            // Forward to my subclass only
            if ([overlayView isKindOfClass:[OverlayView class]]) {
                // translate coordinate
                CGPoint newPoint = [touch locationInView:overlayView];
                //NSLog(@"%@", NSStringFromCGPoint(newPoint));
                BOOL isInside = [overlayView pointInside:newPoint withEvent:event];
                // if subview is hit
                if (isInside) {
                    // Forwarding
                    [overlayView touchesEnded:touches withEvent:event];
                    break;
                }
            }
        }
    }
    // double tap: handle zoom
    else if (touch.tapCount == 2) {
        if (self.zoomScale == self.maximumZoomScale) {
            [self setZoomScale:[self minimumZoomScale] animated:YES];
        } else {
            CGRect zoomRect = [self zoomRectForScrollView:self withScale:self.maximumZoomScale withCenter:point];
            [self zoomToRect:zoomRect animated:YES];
        }
        [self setNeedsDisplay];
    }
}
Of course, the actual code should be adapted, but at this point you have all the information you need to decide whether to forward the event. You might also need to implement this in other methods, such as touchesMoved:withEvent:.
Hope this can help.
I was having this same problem, but with a scrollview that was inside UIPageViewController, so it had to be handled slightly differently.
By changing the cancelsTouchesInView property to false for each recognizer on the UIScrollView, I was able to receive touches on buttons inside the UIPageViewController.
I did so by adding this code into viewDidLoad:
guard let recognizers = self.pageViewController.view.subviews[0].gestureRecognizers else {
    print("No gesture recognizers on scrollview.")
    return
}

for recognizer in recognizers {
    recognizer.cancelsTouchesInView = false
}
If what you need is to distinguish between a tap and a scroll, you can test whether the touches have moved: if touchesMoved: is never called, you can assume it is a tap.
At that point you can set a boolean to indicate whether a movement occurred and use that Boolean as a condition in your other methods.
I am on the road but if that's what you need I will be able to explain better later.
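To illustrate the idea, here is a rough sketch (TapAwareView and the _touchMoved flag are made-up names):
#import <UIKit/UIKit.h>

// Rough sketch: track whether the touch moved, and treat it as a tap only if it didn't.
@interface TapAwareView : UIView
@end

@implementation TapAwareView {
    BOOL _touchMoved;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _touchMoved = NO;
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    _touchMoved = YES; // any movement means this is a drag/scroll, not a tap
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!_touchMoved) {
        NSLog(@"Treating this touch as a tap");
    }
    [super touchesEnded:touches withEvent:event];
}

@end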
A hackish way to achieve your objective - not 100% exact - is to subclass the UIWindow and override the - (void)sendEvent:(UIEvent *)event;
A quick example:
in SecondResponderWindow.h header
//SecondResponderWindow.h
@protocol SecondResponderWindowDelegate
- (void)userTouchBegan:(id)tapPoint onView:(UIView*)aView;
- (void)userTouchMoved:(id)tapPoint onView:(UIView*)aView;
- (void)userTouchEnded:(id)tapPoint onView:(UIView*)aView;
@end

@interface SecondResponderWindow : UIWindow
@property (nonatomic, retain) UIView *viewToObserve;
@property (nonatomic, assign) id <SecondResponderWindowDelegate> controllerThatObserves;
@end
in SecondResponderWindow.m
//SecondResponderWindow.m
- (void)forwardTouchBegan:(id)touch onView:(UIView*)aView {
    [controllerThatObserves userTouchBegan:touch onView:aView];
}

- (void)forwardTouchMoved:(id)touch onView:(UIView*)aView {
    [controllerThatObserves userTouchMoved:touch onView:aView];
}

- (void)forwardTouchEnded:(id)touch onView:(UIView*)aView {
    [controllerThatObserves userTouchEnded:touch onView:aView];
}

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event];
    if (viewToObserve == nil || controllerThatObserves == nil) return;

    NSSet *touches = [event allTouches];
    UITouch *touch = [touches anyObject];
    if ([touch.view isDescendantOfView:viewToObserve] == NO) return;

    CGPoint tapPoint = [touch locationInView:viewToObserve];
    NSValue *pointValue = [NSValue valueWithCGPoint:tapPoint];

    if (touch.phase == UITouchPhaseBegan)
        [self forwardTouchBegan:pointValue onView:touch.view];
    else if (touch.phase == UITouchPhaseMoved)
        [self forwardTouchMoved:pointValue onView:touch.view];
    else if (touch.phase == UITouchPhaseEnded)
        [self forwardTouchEnded:pointValue onView:touch.view];
    else if (touch.phase == UITouchPhaseCancelled)
        [self forwardTouchEnded:pointValue onView:touch.view];
}
It doesn't conform 100% to what you were expecting, because your second responder view does not handle the touch event natively via -touchesBegan: and the like; it has to implement the SecondResponderWindowDelegate instead. However, this hack does allow you to handle touch events on additional responders.
This method is inspired by and extended from MITHIN KUMAR's TapDetectingWindow
Suppose I have a UIView named Cake. Cake has a gesture recognizer.
Now, suppose I have a UIButton named Bob.
I add Cake as a subview to Bob:
[Bob addSubview: Cake];
Now, Bob, the UIButton, no longer responds to the touch-up-inside control event.
I want Cake to be able to handle the touch while Bob simultaneously handles the touch as well. Currently, Cake can handle the touch, but Bob lazily does nothing.
Things I have tried:
Setting cancelsTouchesInView of Cake's gesture recognizer to NO
Implementing the UIGestureRecognizerDelegate protocol for Cake's gesture recognizer and always returning YES from the shouldRecognizeSimultaneouslyWithGestureRecognizer: method
Subclassing UIGestureRecognizer and calling [self.view.nextResponder touchesSomething:touches withEvent:event]; in each of the touchesSomething (touchesBegan, touchesEnded, etc.) methods (I've also confirmed that the next responder IS IN FACT the UIButton that is supposed to handle the control events)
Not using a gesture recognizer and instead just using the touchesSomething methods in the UIView (Cake) + passing through the touchesSomething calls to all of super, self.superview, self.nextResponder and more.
Does anyone know a good way to make this work?
My suggestion is the following:
Make sure that you set userInteractionEnabled of the subview to NO! Now the button's action should be called correctly.
Now add the event parameter to the action:
- (IBAction)pressButton:(UIButton *)sender forEvent:(UIEvent *)event
Inside the action, check whether the press was inside the subview:
- (IBAction)pressButton:(UIButton *)sender forEvent:(UIEvent *)event {
    // get location
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // check position
    if (CGRectContainsPoint(self.subview.frame, location)) {
        // call selector like the gesture recognizer here
    }
}
What I'm trying to make, Cake, is a view that can be placed as a subview into any button without additional setup - it's a decorative view of sorts. The gesture recognizer of Cake is there to make a small animation.
Then you're going about this all wrong. Take away the gesture recognizer of Cake; you don't need it. You're trying to get Cake to respond to Bob being pressed. But that's easy; Bob's a button! The button already tells you everything that's happening — it's being highlighted etc. So all you need is a UIButton subclass that tells Cake when to do its animation.
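For example, a rough sketch of that idea (CakeView, animateForPress, and CakeButton are illustrative names, not anything from the question):
#import <UIKit/UIKit.h>

// Placeholder for the decorative subview ("Cake").
@interface CakeView : UIView
- (void)animateForPress;
@end

@implementation CakeView
- (void)animateForPress {
    // Placeholder animation: a quick fade and back.
    [UIView animateWithDuration:0.2 animations:^{ self.alpha = 0.5; }
                     completion:^(BOOL finished) { self.alpha = 1.0; }];
}
@end

// A button ("Bob") that tells its decorative subview when it is pressed.
@interface CakeButton : UIButton
@end

@implementation CakeButton
- (void)setHighlighted:(BOOL)highlighted {
    [super setHighlighted:highlighted];
    if (highlighted) {
        for (UIView *subview in self.subviews) {
            if ([subview isKindOfClass:[CakeView class]]) {
                [(CakeView *)subview animateForPress];
            }
        }
    }
}
@end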
In Xcode 5.1 I have created a simple test app for iPhone:
The structure is: scrollView -> contentView -> imageView -> image 1000 x 1000 on the top.
And on the bottom of the single view app I have seven draggable custom UIViews.
The dragging is implemented in Tile.m with touchesXXXX methods.
My problem is: once I add a draggable tile to the contentView in my ViewController.m file - I can not drag it anymore:
- (void)handleTileMoved:(NSNotification *)notification {
    Tile *tile = (Tile *)notification.object;
    //return;

    if (tile.superview != _scrollView && CGRectIntersectsRect(tile.frame, _scrollView.frame)) {
        [tile removeFromSuperview];
        [_contentView addSubview:tile];
        [_contentView bringSubviewToFront:tile];
    }
}
The touchesBegan isn't called for the Tile anymore, as if the scrollView were masking that event.
I've searched around and there was a suggestion to extend the UIScrollView class with the following method (in my custom GameBoard.m):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    NSLog(@"%s: %hhd", __PRETTY_FUNCTION__,
          [result.superview isKindOfClass:[Tile class]]);
    self.scrollEnabled = ![result.superview isKindOfClass:[Tile class]];
    return result;
}
Unfortunately this doesn't help and prints 0 in debugger.
The problem is, partly, because user interactions are disabled on the content view. However, enabling user interactions disables scrolling as the view captures all touches. So here is the solution. Enable user interactions in storyboard, but subclass the content view like so:
@interface LNContentView : UIView
@end

@implementation LNContentView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    return result == self ? nil : result;
}

@end
This way, hit test passes only if the accepting view is not self, the content view.
Here is my commit:
https://github.com/LeoNatan/ios-newbie
The reason Tile views don't get touches is that the scroll view's pan gesture recogniser consumes the events. What you need is to attach a UIPanGestureRecognizer to each of your tiles and configure them as follows:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)]; // handle drag in pan: method
[tile addGestureRecognizer:pan];

UIPanGestureRecognizer *scrollPan = self.scrollView.panGestureRecognizer;
[scrollPan requireGestureRecognizerToFail:pan];
Here you let the scroll view's pan gesture recogniser know that you only wish scrolling to happen if none of the tiles are being dragged.
I've checked the approach — it does work indeed. Regarding your code, you'll need to handle all touches in the gesture recogniser rather than Tile view because touch events may be consumed/delayed by hit-tested view's gesture recogniser before they reach the view itself. Please refer to UIGestureRecognizer documentation to learn more about the topic.
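For completeness, a rough sketch of what the pan: handler attached to a tile might look like (this is just one way to apply the drag; none of it comes from the original code):
// Drag the tile by applying the recogniser's translation to its center.
- (void)pan:(UIPanGestureRecognizer *)recognizer {
    UIView *tile = recognizer.view;
    CGPoint translation = [recognizer translationInView:tile.superview];
    tile.center = CGPointMake(tile.center.x + translation.x,
                              tile.center.y + translation.y);
    // Reset so the next callback delivers an incremental translation.
    [recognizer setTranslation:CGPointZero inView:tile.superview];
}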
It looks as if one of the views in the hierarchy is capturing the events.
Have a look at the section "The Responder Chain Follows a Specific Delivery Path" of the Apple docs here.
Edit:
Sorry, I was writing from memory. This is how I resolved a similar issue in an app of mine:
I use a UITapGestureRecognizer in the view(s) where I want to detect the touch, and implement the following method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
The touches set contains the UITouch objects for the event; each touch's view property tells you which view received it.
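A minimal sketch of that approach (someSubview and handleTap: are placeholder names):
// Attach a tap recognizer to the view you want to detect the touch in.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
tap.cancelsTouchesInView = NO; // let the touch continue to the view as well
[someSubview addGestureRecognizer:tap];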
I would like to detect the (initial) touch position in my UIScrollView when the user starts dragging. I have googled this issue and many seem to struggle with it. Now, while I still can't wrap my head around why Apple would not let users access touch information in a scroll view, I'll have to find a solution by myself. However, all my tries have failed, so I would like to ask you.
Here is what I thought would work:
I set up a UIPanGestureRecognizer like this in my UIScrollView subclass and add it to its gesture recognizers:
UIPanGestureRecognizer *tapDrag = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(touchedAndDragged:)];
tapDrag.cancelsTouchesInView = NO;
tapDrag.delegate = self;
[self addGestureRecognizer:tapDrag];
And the corresponding method:
- (void)touchedAndDragged:(UIPanGestureRecognizer *)t {
    CGPoint loc = [t locationInView:self];
    // do something with location (that is exactly what I need)
    // ...
    // Now DISABLE and forward touches to scroll view, so that it scrolls normally
    t.enabled = NO;
    /****
     ?????
     *****/
}
As indicated by the comments, I would like to disable the pan gesture recognizer after I have the point (while STILL dragging!) and "pass" the touches to my scroll view, so that the user can scroll normally. Is that feasible at all? Is there any other solution?
Well, UIScrollViews already have a built-in pan gesture recognizer that you can tap into. Usage would be as simple as setting your class as your scroll view's delegate (to utilize scrollViewWillBeginDragging:) and using UIPanGestureRecognizer's -locationInView: to determine the touch location.
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView
{
    CGPoint location = [scrollView.panGestureRecognizer locationInView:scrollView];
    NSLog(@"%@", NSStringFromCGPoint(location));
}
Why don't you grab the start location in -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event? That would be more convenient and efficient.
I have a subclass of UIView on top of a UITableView. I am using the UITableView to display some data and, at the same time, I would like to overlay an animation that follows the finger (for instance, leaving a trail).
If I get it right, I need the touch events to be handled both by the UIView subclass and the UITableView. How can I do that?
Is it possible to have, e.g., touchesMoved: triggered on the UIView subclass and then on the UITableView?
Thank you so much for any help.
The way I have solved this problem is in a way that is not that clean, but it works. Please let me know if there's a better way to do this.
I have overridden hitTest for my custom UIView so that it directs touches to the UITableView underneath. Then in the UITableView I am handling the gestures through touchesBegan, touchesMoved, etc. There I am also calling touchesBegan on the UIView.
In this way touches are handled by two views.
The reason why I am not doing it the other way around (having the UIView's touchesBegan call the UITableView's touchesBegan) is that gesture recognizers on the UITableView would not work.
UIView subclass' hitTest
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // tView is the UITableView subclass instance
    CGPoint tViewHit = [tView convertPoint:point fromView:self];
    if ([tView pointInside:tViewHit withEvent:event]) return tView;
    return [super hitTest:point withEvent:event];
}
UITableView subclass's touchesBegan
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:touch.view];
    // ....
    // view is the UIView's subclass instance
    [view touchesBegan:touches withEvent:event];
}
No, you can't do it implicitly. The Event Delivery chapter says:
The window object uses hit-testing and the responder chain to find the view to receive the touch event. In hit-testing, a window calls hitTest:withEvent: on the top-most view of the view hierarchy; this method proceeds by recursively calling pointInside:withEvent: on each view in the view hierarchy that returns YES, proceeding down the hierarchy until it finds the subview within whose bounds the touch took place. That view becomes the hit-test view.
So, when the window finds the touched view, that view is the one that receives the touches. Only one view can handle touches at any given moment.
But if you need to handle the event for the UITableView, then handle it in the UIView! You can convert the touched point to the required coordinate space with the convertPoint: and convertRect: methods, add a subview to the UITableView and move it depending on the coordinates, and a lot of other things.
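For instance, a minimal sketch of converting a point received in the overlay view's touchesBegan:withEvent: into the table view's coordinate space (overlayView and tableView are placeholder names):
// Inside the overlay view's touch handling, translate the touch location
// into the table view's coordinate space before using it.
CGPoint pointInOverlay = [[touches anyObject] locationInView:overlayView];
CGPoint pointInTable = [overlayView convertPoint:pointInOverlay toView:tableView];
NSIndexPath *indexPath = [tableView indexPathForRowAtPoint:pointInTable];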
UITableView relays unhandled touch events to UIView. (Google "responder chain")
UITableView Documentation
So, you can handle your touch events in the UIView only. In your UIView:
touchesBegan: - do initialization stuff
touchesMoved: - draw the trail on the UIView (use timers / a delayed response to remove points so that it looks like a trail)
touchesEnded: - do the remaining stuff
Hope this helps.
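A rough sketch of that outline, assuming the overlay is a plain UIView subclass (fading out small dot subviews is just one way to fake a trail):
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface TrailOverlayView : UIView
@end

@implementation TrailOverlayView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];

    // Drop a small dot at the finger position...
    UIView *dot = [[UIView alloc] initWithFrame:CGRectMake(point.x - 4, point.y - 4, 8, 8)];
    dot.backgroundColor = [UIColor redColor];
    dot.layer.cornerRadius = 4;
    [self addSubview:dot];

    // ...and fade it out shortly afterwards so the dots read as a trail.
    [UIView animateWithDuration:0.5 animations:^{ dot.alpha = 0; }
                     completion:^(BOOL finished) { [dot removeFromSuperview]; }];
}

@end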
I'm trying to distinguish between horizontal swiping / panning and vertical scrolling in a UITableView. The behavior I'm looking to imitate (in a sense) is that of the Twitter iPad app, which has multiple UITableViews that can be moved horizontally on the screen. If I slide my finger left or right on one of these UITableViews, the view itself moves horizontally. If I swipe vertically, the view scrolls as expected.
I'm having trouble figuring out the correct way to implement this behavior. I've seen some tutorials on this which involve adding touch event handlers in the UITableViewCells, and overriding hitTest in the UITableView to appropriately route events depending on which direction the gesture is moving. I've implemented some of these techniques, but none of them work particularly well.
Does anyone know the correct way to implement this sort of behavior? Conditionally performing actions on a UITableView dependent on the direction of the user's finger movement?
Thanks.
I've been struggling with a similar problem for days, and I've gone through several potential solutions. I've found the best and also the simplest solution to be subclassing UIGestureRecognizer to handle horizontal movement and attaching it to your UITableViews.
The way it works is that it intercepts any touch events before they go to the UITableView (also UIScrollView). The UITableView, being a subclass of UIScrollView, has a custom UIPanGestureRecognizer built in which detects dragging and scrolls its view accordingly. By adding your own subclass of UIGestureRecognizer, you can get the touches before the UIScrollView's gesture recognizer does. If your recognizer sees that the user is dragging horizontally, it should change its state in an overridden touchesMoved: method to UIGestureRecognizerStateBegan. Otherwise, it sets its state to UIGestureRecognizerStateCancelled, which lets the underlying UIScrollView handle the touches instead.
Here's what my UIGestureRecognizer subclass looks like:
#import <UIKit/UIGestureRecognizerSubclass.h>

@interface TableViewCellPanGestureRecognizer : UIGestureRecognizer
{
    CGPoint startTouchPoint;
    CGPoint currentTouchPoint;
    BOOL isPanningHorizontally;
}

- (void)reset;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;

@end

@implementation TableViewCellPanGestureRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    startTouchPoint = [[touches anyObject] locationInView:nil];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    currentTouchPoint = [[touches anyObject] locationInView:nil];

    if ( !isPanningHorizontally ) {
        float touchSlope = fabsf((currentTouchPoint.y - startTouchPoint.y) / (currentTouchPoint.x - startTouchPoint.x));
        if ( touchSlope < 1 ) {
            self.state = UIGestureRecognizerStateBegan;
            isPanningHorizontally = YES;
            [self.view touchesCancelled:touches withEvent:event];
        } else {
            self.state = UIGestureRecognizerStateCancelled;
            [self.view touchesCancelled:touches withEvent:event];
        }
    } else {
        self.state = UIGestureRecognizerStateChanged;
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    self.state = UIGestureRecognizerStateCancelled;
    [self.view touchesCancelled:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    self.state = UIGestureRecognizerStateCancelled;
}

- (void)reset
{
    [super reset];
    startTouchPoint = CGPointZero;
    currentTouchPoint = CGPointZero;
    isPanningHorizontally = NO;
}

@end
Then I have a subclassed UITableView that attaches the recognizer to itself and implements an action method to trigger horizontal movement of individual rows:
In my UITableView init:
horizontalPanGesture = [[TableViewCellPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleHorizontalDrag:)];
[self addGestureRecognizer:horizontalPanGesture];
[horizontalPanGesture release];
And the action method:
- (void)handleHorizontalDrag:(UIGestureRecognizer *)gesture
{
    UIGestureRecognizerState state = gesture.state;

    // Set the touched cell
    if (!touchedCell) {
        NSIndexPath *indexPathAtHitPoint = [self indexPathForRowAtPoint:[gesture locationInView:self]];
        id cell = [self cellForRowAtIndexPath:indexPathAtHitPoint];
        touchedCell = cell;
        startTouchPoint = [gesture locationInView:touchedCell];
    }

    if ( state == UIGestureRecognizerStateBegan || state == UIGestureRecognizerStateChanged ) {
        // move your views horizontally
    } else if ( state == UIGestureRecognizerStateEnded || state == UIGestureRecognizerStateCancelled ) {
        touchedCell = nil;
    }
}
The above gets the current cell being touched within the table view, and then applies horizontal movements to it as the user drags left or right. However, if my gesture recognizer determines that the touches are meant to scroll vertically, it just cancels itself and the following touches are sent on to the UITableView to initiate vertical scrolling automatically.
This setup seems to be much simpler than overriding hitTest and doing all sorts of touch event trickery within the UITableView itself. It simply makes an immediate determination about the direction of the touch movement. You'll want to read up on UIGestureRecognizer - specifically about how it should be subclassed. You need to make sure to forward certain touch events like touchesCancelled: to the UITableView, as the UITableView's built-in panGestureRecognizer won't be handling these events as it normally does. Obviously, you'll want to move entire table views and not individual cells, but it should be pretty straightforward.
This solution, as simple as it is, took me a while to get exactly right for my needs. I am pretty new to iOS development, so I had to spend a lot of time reading about and tinkering with gesture recognizers and scroll views to figure this out.
I have never done this myself, but since UITableView is a subclass of UIScrollView, a UITableView's delegate is also a UIScrollViewDelegate. So in your UITableViewController subclass, you should be able to use the UIScrollViewDelegate methods and intercept the scrolls, making sure to also call the super method.
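A minimal sketch of that idea, assuming a UITableViewController subclass (which is already the table view's delegate); note that the superclass may not implement these optional delegate methods at all, so this sketch leaves out the super call:
// UITableViewController already conforms to UIScrollViewDelegate,
// so scroll callbacks for the table view land here.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    NSLog(@"table view scrolled to offset %@",
          NSStringFromCGPoint(scrollView.contentOffset));
}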
If you want the UITableViews to be placed "side by side" and when you swipe horizontally you expect them to all move together horizontally at the same time, (like a photo gallery with UITableViews instead of images) you can do the following:
Use a UIScrollView and add the UITableViews as the UIScrollView's subviews. You should set the scrollview's contentSize like this:
CGRect bounds = scrollView.bounds;
scrollView.contentSize=CGSizeMake(bounds.size.width * kNumOfTableViews, bounds.size.height);
so that the UIScrollView scrolls horizontally and not vertically.
You may also want to use
scrollView.pagingEnabled=YES;
depending on the desirable behaviour.
The UITableViews will respond in the normal way if you slide your finger vertically, and you will be able to change between UITableViews by sliding your finger horizontally.
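A rough sketch of laying the table views out side by side inside the scroll view (kNumOfTableViews is carried over from the snippet above; the rest is illustrative):
// Place each table view in its own "page" of the scroll view.
CGRect bounds = scrollView.bounds;
for (NSInteger i = 0; i < kNumOfTableViews; i++) {
    UITableView *tableView =
        [[UITableView alloc] initWithFrame:CGRectOffset(bounds, i * bounds.size.width, 0)
                                     style:UITableViewStylePlain];
    tableView.dataSource = self; // assumes self implements UITableViewDataSource
    [scrollView addSubview:tableView];
}
scrollView.contentSize = CGSizeMake(bounds.size.width * kNumOfTableViews, bounds.size.height);
scrollView.pagingEnabled = YES;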
For more details about how to do this efficiently, you can look at the WWDC 2010 video Session 104 - Designing Apps with Scroll Views and check out the source code from here: http://developer.apple.com/library/ios/#samplecode/PhotoScroller/ . This session describes how to slide between images; instead of images, you will use UITableViews.
However, if you want each UITableView to be able to move horizontally independently and maybe overlap with another one as in the twitter app for iPad, this solution will not work for you, at least not out of the box.