Stopping MKMapView's subviews from propagating touches

When I add a subview to an MKMapView or to any of its descendant views, touches pass right through and affect the map.
This question ultimately relates to a bug I fixed in my custom callout, which adds a view to an annotation view; the callout should not be dismissed when the callout itself is touched. But for brevity I'll use a smaller example that demonstrates the problem.
By the way, setting exclusiveTouch and userInteractionEnabled doesn't do anything.
Let's take a UIView and attach it to the mapView:
UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
view.backgroundColor = [UIColor blackColor];
[mapView addSubview:view];
And let's add a gesture to the map as well
[mapView addGestureRecognizer:[[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLongPress:)]];
And just have handleLongPress print something: NSLog(@"longpress");. Now run the program and observe how long-pressing the black UIView triggers the long press. If we add an annotation that shows a callout and select it, touching the black UIView will dismiss it. And if we scroll the map so that the annotation is under the black UIView, we'll be able to select the annotation. Now here's my solution: we tag the view that we're going to add to the map view with SOME_TAG and override the hitTest: method for MKMapView:
@interface MKMapView (HackySolution)
@end

@implementation MKMapView (HackySolution)

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *view = [super hitTest:point withEvent:event];
    // block touch propagation
    if (view.tag == SOME_TAG) {
        view = nil;
    }
    return view;
}

@end
Now here's the deal: I may have solved my problem, but I really don't know why touches on the UIView are being passed right through to the map. It doesn't make sense for a touch on a subview to be passed on to its parent view. Maybe I'm not understanding how touches work. I mean, if I place a button on a normal UIView and place another UIView on top of it, completely obscuring it, the button wouldn't get the touch, right? I'm starting to second-guess myself. :/
I can only guess that MKMapView is wired to do this, but I'm unsure. I thought there was a scroll view somewhere within MKMapView, but I ran through all the subviews and all their descendants and didn't find anything.
Is there a better way to solve this problem or is this the best it gets?
Any insight into the situation would be greatly appreciated.

Related

SubView not Considered a part of MainView

I have many subviews in my UIView, and many of them have UIButtons in them. One of the subviews, _bottomView (frame (0, 519, 320, 49)), has a problem: it does not recognise tap events on the buttons placed inside it.
I tried placing a UIButton covering the entire _bottomView, and the tap event from that button (testButton) is not recognised either.
I tried adding a tap recogniser to the code; taps from every point EXCEPT the points within _bottomView are recognised. Tap recogniser code below:
UITapGestureRecognizer *gr = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
[self.view addGestureRecognizer:gr];
_bottomView.userInteractionEnabled = YES;
- (void)handleGesture:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint p = [gestureRecognizer locationInView:self.view];
    NSLog(@"got a tap in the region i care about");
}
I tried [self.view addSubview:_bottomView]; and that didn't help either. What could be the issue?
If your view is not receiving touches:
Make sure that your UIView has userInteractionEnabled set to YES.
Make sure that the view is within the bounds of its superview, and that the superview is within the bounds of the window. You can print the frames of views using NSLog(@"%@", NSStringFromCGRect(view.frame));
Nothing helped. I had to delete the entire View Controller and redo the whole thing. That obviously solved the problem.

Dragging views on a scroll view: touchesBegan is received, but no touchesEnded or touchesCancelled

As an iOS programming newbie I am struggling with a word game for iPhone.
The app structure is: scrollView -> contentView -> imageView -> image 1000 x 1000 (here fullscreen):
I think I have finally understood how to use a UIScrollView with Auto Layout enabled in Xcode 5.1:
I just specify enough constraints (dimensions 1000 x 1000 and also 0 to the parent) for the contentView and this defines the _scrollView.contentSize (I don't have to set it explicitly) - after that my game board scrolls and zooms just fine.
However I have troubles with my draggable letter tiles implemented in Tile.m.
I use touchesBegan, touchesMoved, touchesEnded, touchesCancelled and not gesture recognizers (as often suggested by StackOverflow users), because I display a larger letter tile image with a shadow (the bigImage) on touchesBegan.
My dragging is implemented in the following way:
In touchesBegan I remove the tile from the contentView (and add it to the main app view) and display bigImage with a shadow.
In touchesMoved I move the tile.
In touchesEnded or touchesCancelled I display smallImage with a shadow again and either add the tile back to the contentView or leave it in the main view (if the tile is at the bottom of the app).
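Roughly, the three steps above might look like this in Tile.m. This is only a sketch of what the question describes: the imageView, bigImage, and smallImage property names are assumptions, as is using the root view controller's view as the "main app view".

```objc
// Sketch of the drag steps (Tile is the question's custom UIView;
// imageView/bigImage/smallImage are assumed property names).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UIView *appView = self.window.rootViewController.view;
    CGPoint center = [self.superview convertPoint:self.center toView:appView];
    [self removeFromSuperview];     // detach from the contentView...
    [appView addSubview:self];      // ...and drag in the main app view
    self.center = center;
    self.imageView.image = self.bigImage;   // larger image with shadow
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    self.center = [[touches anyObject] locationInView:self.superview];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.imageView.image = self.smallImage; // back to the small image
    // ...either re-add the tile to the contentView or leave it in the main view
}
```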
My problem:
Mostly this works, but sometimes (often) I see that only touchesBegan was called, but the other touchesXXXX methods are never called:
2014-03-22 20:20:20.244 ScrollContent[8075:60b] -[Tile touchesBegan:withEvent:]: Tile J 10 {367.15002, 350.98877} {57.599998, 57.599998}
Instead the scrollView is scrolled by the finger, underneath the big tile.
This results in many big tiles with shadows sitting on the screen of my app, while the scroll view is being dragged underneath them:
How to fix this please?
I know for sure that my structure of the app (with custom UIViews dragged in/out of a UIScrollView) is possible - by looking at popular word games.
I use tile.exclusiveTouch = YES and a custom hitTest method for the contentView - but this doesn't help:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    return result == self ? nil : result;
}
UPDATE 1:
I've tried adding the following code to handleTileTouched:
_contentView.userInteractionEnabled = NO;
_scrollView.userInteractionEnabled = NO;
_scrollView.scrollEnabled = NO;
and then set it back to YES in handleTileReleased of ViewController.m - but this does not help and also looks more like a hack to me.
UPDATE 2:
Having read probably everything related to UIScrollView, hitTest:withEvent: and pointInside:withEvent: - on the web (for ex. Hacking the responder chain and Matt Neuburg's Programming iOS book), StackOverflow and Safari, it seems to me, that a solution would be to implement the hitTest:withEvent: method for the main view of my app:
If a Tile object is hit, it should be returned. Otherwise - the scrollView should be returned.
Unfortunately, this doesn't work - I am probably missing something minor.
And I am sure that a good solution exists - by studying popular word games for iOS. For example dragging and placement of letter tiles works very smooth in Zynga's Words with Friends ® app and in the screenshots below you can see them probably using UIScrollView (the scroll bars are visible in the corner) and displaying a tile shadow (probably in touchesBegan method):
UPDATE 3:
I've created a new project to test the gesture recognizer suggested by TomSwift, and it shows the problem I have with gesture recognizers: the tile size changes too late. It happens when the user starts moving the tile, not at the moment he touches it:
The problem here is that removing a view from the view hierarchy confuses the system, and the touch is lost. It is the same issue either way (internally, gesture recognizers use the same touchesBegan: API).
https://github.com/LeoNatan/ios-newbie/commit/4cb13ea405d9f959f4d438d08638e1703d6c0c1e
(I created a pull request.)
What I changed was to not remove the tile from the content view when touches begin, but only when touches end or are cancelled. But this creates a problem: when dragging to the bottom, the tile is hidden below the view (because the scroll view clips to its bounds). So I create a cloned tile, add it as a subview of the view controller's view, and move it together with the original tile. When touches end, I remove the cloned tile and place the original where it should go.
This is because the bottom bar is not part of the scrollview hierarchy. If it was, the entire tile cloning would not be necessary.
I also streamlined the positioning of tiles quite a bit.
You could set the scroll view's userInteractionEnabled to NO while you are dragging the tile, and set it back to YES when the tile drag ends.
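A minimal sketch of that suggestion, assuming the tile notifies the view controller when dragging starts and ends (handleTileTouched/handleTileReleased are the method names used in the question's update):

```objc
// Disable scrolling for the duration of a tile drag (sketch; _scrollView
// is the view controller's outlet from the question).
- (void)handleTileTouched:(Tile *)tile {
    _scrollView.scrollEnabled = NO;   // keep the scroll view from stealing the pan
}

- (void)handleTileReleased:(Tile *)tile {
    _scrollView.scrollEnabled = YES;
}
```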
You should really try using a gesture recognizer instead of the raw touchesBegan/touchesMoved. I say this because UIScrollView is using gesture recognizers and by default these will cede to any higher-level gesture recognizer that is running.
I put together a sample that has a UIScrollView with an embedded UIImageView. As with your screenshot, below the scrollView I have some UIButton "Tiles", which I subclassed as TSTile objects. The only reason I did this was to expose some NSLayoutConstraints to access/alter their height/width (since you're using auto layout vs. frame manipulation). The user can drag tiles from their starting place into the scroll view.
This seems to work well; I didn't hook up the ability to drag a tile once it is re-parented in the scrollview. But that shouldn't be too hard. For that you might consider placing a long-tap gesture recognizer in each tile, then when it fires you would turn off scrolling in the scrollview, such that the top-level pan gesture recognizer would kick in.
Or, you might be able to subclass the UIScrollView and intercept the UIScrollView's pan-gesture-recognizer delegate callbacks to hinder panning when the user starts from a tile.
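If the tiles live inside the scroll view, one documented hook for keeping it from cancelling their touches is UIScrollView's touchesShouldCancelInContentView:. A sketch, reusing the TSTile class from this answer; treat it as an illustration of the API rather than a tested solution:

```objc
// UIScrollView subclass that never cancels touches that started on a tile,
// so the tile keeps receiving touchesMoved:/touchesEnded:.
// delaysContentTouches should be NO if touchesBegan must fire immediately.
@interface TSTileScrollView : UIScrollView
@end

@implementation TSTileScrollView

- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:[TSTile class]]) {
        return NO;   // let the tile handle the drag instead of scrolling
    }
    return [super touchesShouldCancelInContentView:view];
}

@end
```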
@interface TSTile : UIButton
// hook these up to width/height constraints in your storyboard!
@property (nonatomic, readonly) IBOutlet NSLayoutConstraint *widthConstraint;
@property (nonatomic, readonly) IBOutlet NSLayoutConstraint *heightConstraint;
@end

@implementation TSTile
@synthesize widthConstraint, heightConstraint;
@end

@interface TSViewController () <UIScrollViewDelegate, UIGestureRecognizerDelegate>
@end

@implementation TSViewController
{
    IBOutlet UIImageView *_imageView;
    TSTile *_dragTile;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    UIPanGestureRecognizer *pgr = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    pgr.delegate = self;
    [self.view addGestureRecognizer:pgr];
}

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return _imageView;
}

- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint pt = [gestureRecognizer locationInView:self.view];
    UIView *v = [self.view hitTest:pt withEvent:nil];
    return [v isKindOfClass:[TSTile class]];
}

- (void)pan:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint pt = [gestureRecognizer locationInView:self.view];
    switch (gestureRecognizer.state)
    {
        case UIGestureRecognizerStateBegan:
        {
            NSLog(@"pan start!");
            _dragTile = (TSTile *)[self.view hitTest:pt withEvent:nil];
            [UIView transitionWithView:self.view
                              duration:0.4
                               options:UIViewAnimationOptionAllowAnimatedContent
                            animations:^{
                                _dragTile.widthConstraint.constant = 70;
                                _dragTile.heightConstraint.constant = 70;
                                [self.view layoutIfNeeded];
                            }
                            completion:nil];
            break;
        }
        case UIGestureRecognizerStateChanged:
        {
            NSLog(@"pan!");
            _dragTile.center = pt;
            break;
        }
        case UIGestureRecognizerStateEnded:
        {
            NSLog(@"pan ended!");
            pt = [gestureRecognizer locationInView:_imageView];
            // reparent:
            [_dragTile removeFromSuperview];
            [_imageView addSubview:_dragTile];
            // animate:
            [UIView transitionWithView:self.view
                              duration:0.25
                               options:UIViewAnimationOptionAllowAnimatedContent
                            animations:^{
                                _dragTile.widthConstraint.constant = 40;
                                _dragTile.heightConstraint.constant = 40;
                                _dragTile.center = pt;
                                [self.view layoutIfNeeded];
                            }
                            completion:^(BOOL finished) {
                                _dragTile = nil;
                            }];
            break;
        }
        default:
            NSLog(@"pan other!");
            break;
    }
}
@end
I also think you should use a UIGestureRecognizer, and more precisely a UILongPressGestureRecognizer on each tile that, once recognised, will handle the pan.
For fine grained control you can still use the recognizers' delegate.

Draggable UIView stops posting touchesBegan after being added to UIScrollView

In Xcode 5.1 I have created a simple test app for iPhone:
The structure is: scrollView -> contentView -> imageView -> image 1000 x 1000 on the top.
And on the bottom of the single view app I have seven draggable custom UIViews.
The dragging is implemented in Tile.m with touchesXXXX methods.
My problem is: once I add a draggable tile to the contentView in my ViewController.m file - I can not drag it anymore:
- (void)handleTileMoved:(NSNotification *)notification {
    Tile *tile = (Tile *)notification.object;
    //return;
    if (tile.superview != _scrollView && CGRectIntersectsRect(tile.frame, _scrollView.frame)) {
        [tile removeFromSuperview];
        [_contentView addSubview:tile];
        [_contentView bringSubviewToFront:tile];
    }
}
touchesBegan isn't called for the Tile anymore, as if the scroll view were masking that event.
I've searched around and there was a suggestion to extend the UIScrollView class with the following method (in my custom GameBoard.m):
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    NSLog(@"%s: %hhd", __PRETTY_FUNCTION__,
          [result.superview isKindOfClass:[Tile class]]);
    self.scrollEnabled = ![result.superview isKindOfClass:[Tile class]];
    return result;
}
Unfortunately this doesn't help and prints 0 in debugger.
The problem is, partly, because user interactions are disabled on the content view. However, enabling user interactions disables scrolling as the view captures all touches. So here is the solution. Enable user interactions in storyboard, but subclass the content view like so:
@interface LNContentView : UIView
@end

@implementation LNContentView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    return result == self ? nil : result;
}

@end
This way, hit test passes only if the accepting view is not self, the content view.
Here is my commit:
https://github.com/LeoNatan/ios-newbie
The reason the Tile views don't get touches is that the scroll view's pan gesture recogniser consumes the events. What you need is to attach a UIPanGestureRecognizer to each of your tiles and configure them as follows:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)]; // handle the drag in the pan: method
[tile addGestureRecognizer:pan];

UIPanGestureRecognizer *scrollPan = self.scrollView.panGestureRecognizer;
[scrollPan requireGestureRecognizerToFail:pan];
Here you let the scroll view's pan gesture recogniser know that you only wish scrolling to happen if none of the tiles is being dragged.
I've checked the approach — it does indeed work. Regarding your code, you'll need to handle all touches in the gesture recogniser rather than in the Tile view, because touch events may be consumed or delayed by the hit-tested view's gesture recogniser before they reach the view itself. Please refer to the UIGestureRecognizer documentation to learn more about the topic.
It looks as if one of the views in the hierarchy is capturing the events.
Have a look at the section "The Responder Chain Follows a Specific Delivery Path" of the Apple docs.
Edit:
Sorry, I was writing from memory. This is how I resolved a similar issue in an app of mine:
I use a UITapGestureRecognizer in the view(s) where I want to detect the touch. In a UITapGestureRecognizer subclass you can override the following method (declared in UIGestureRecognizerSubclass.h):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
The touches set contains the UITouch objects being delivered, and each touch's view property tells you which view received the event.
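To make the override concrete: it requires a recognizer subclass and the UIGestureRecognizerSubclass.h header. A minimal sketch (the class name is mine):

```objc
#import <UIKit/UIGestureRecognizerSubclass.h>

// Logs which view each touch was hit-tested to, before the tap is recognised.
@interface LoggingTapRecognizer : UITapGestureRecognizer
@end

@implementation LoggingTapRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    for (UITouch *touch in touches) {
        NSLog(@"touch began in view: %@", touch.view);
    }
}

@end
```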

How to allow sibling UIViews to handle different gestures?

I have a UIView which has two subviews, one is a UIScrollView and the other is a container view for a few other subviews. The container view is covering the scroll view completely.
Views that need to handle gestures:
UIScrollView - should handle the default pinch and pan gestures
Container view - none
Container view subviews - should handle tap gesture
Now, in order for the tap gestures to be handled by the container view's subviews, I implemented pointInside:withEvent: for the container view: if it detects that the point is inside one of its subviews, it returns YES. This works fine. The problem is that when I pinch or pan and my finger initially touches one of the container view's subviews, it doesn't work. When I pinch or pan on an empty area of the container view, it works as it should.
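The pointInside:withEvent: described above presumably looks something like this (the subview iteration is my assumption based on the description):

```objc
// Container view: claim a touch only when it lands on one of our subviews,
// so touches on empty areas fall through to the scroll view underneath.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *subview in self.subviews) {
        if (CGRectContainsPoint(subview.frame, point)) {
            return YES;
        }
    }
    return NO;
}
```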
Any suggestions how to make it work?
EDIT:
I've implemented hitTest:withEvent: for the main view and got the same behavior.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitTestView;
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        hitTestView = [subview hitTest:[self convertPoint:point toView:subview]
                             withEvent:event];
        if (hitTestView && ![hitTestView isKindOfClass:[ContainerView class]])
        {
            break;
        }
    }
    return hitTestView;
}
The bottom line: the question here is how one view can handle only some gestures and pass the others on, so that an underlying view can handle them.
I've read quite a lot about the subject and tried different approaches but couldn't find a straightforward solution to what seems like a pretty common issue.
You don't actually need to handle pinch and pan gestures on the UIScrollView manually; that happens automatically.
For the container view's subviews you can use a UITapGestureRecognizer. For each view whose taps you need to handle, use:
UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTapFirstSubview:)];
[firstSubview addGestureRecognizer:tapRecognizer];
Handler method:
- (void)handleTapFirstSubview:(UITapGestureRecognizer *)tapRecognizer
{
    // handle tap here
}

Enlarge UIButton's hit-test area for a button with background image, imageView and label in it

It may seem like a duplicate, but I can't actually find a good answer for my concrete situation.
I have a button with a background image (1 pt wide, stretched) and an icon-like image inside, along with a text label. Its height is 33 pt, and I need to make its hit-test area 44 pt high.
I saw two solutions, but neither of them works for me.
The first solution is to enlarge the frame and adjust the image so it has some padding. That is not acceptable for me, because I have both a background image and an image inside the button.
The second solution is to subclass UIButton (which is absolutely acceptable) and override - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event. I did that, but when I put breakpoints in this method and tried to tap in the desired area, nothing happened (the breakpoints were hit when I tapped inside the button's frame).
Are there any other solutions? What could be possibly wrong with the second solution?
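One thing worth checking about the second solution: with default hit-testing, a superview only forwards a point to a subview's hitTest: if the point lies inside the superview's own bounds, so an override on the button may never even be asked about the enlarged area. A common variant is to override pointInside:withEvent: on a UIButton subclass and grow the bounds there. A sketch (the 44 pt figure is from the question; the class name is mine):

```objc
// UIButton subclass whose tappable area is vertically grown to 44 pt.
// Note: the point must still fall inside the superview's bounds,
// or the superview's default hitTest: will never reach this button.
@interface BigHitButton : UIButton
@end

@implementation BigHitButton

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    CGFloat extra = MAX(0, 44.0 - CGRectGetHeight(self.bounds));
    CGRect hitArea = CGRectInset(self.bounds, 0, -extra / 2.0);
    return CGRectContainsPoint(hitArea, point);
}

@end
```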
If you create the view programmatically, you can instantiate a UIView that is 44px high, add a tap gesture recognizer to it, and add your UIButton as a subview inside this view:
CGRect bigFrame = CGRectMake(0, 0, 100, 44);
UIView *big = [[UIView alloc] initWithFrame:bigFrame];
UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doPress)];
[big addGestureRecognizer:tapRecognizer];
UIButton *yourButton = ...;
[big addSubview:yourButton];
[self.view addSubview:big];
Then implement the selector as:
- (void)doPress {
    [yourButton sendActionsForControlEvents:UIControlEventTouchUpInside];
}
In that way, taps on the outer view will be interpreted as presses of your button.
