I have a bit of a complex situation involving multiple gestures. Basically, I want to have one container UIScrollView that only scrolls left-to-right if the touches are within a specific area. If they are not within the area, the UIScrollView passes those touches on to child UIViews that exist side-by-side inside the UIScrollView (think of it like a panel navigation).
I have the UIScrollView containing the UIViews working fine. I subclassed UIScrollView and added the panning restriction via touchesBegan:/touchesMoved:/touchesEnded:/touchesCancelled:. Everything works except when the child UIView is a UITableView. At that point my parent UIScrollView never gets these touch events and so can never restrict the panning properly.
Anyone have any ideas for how to accomplish this?
Thanks!
The way to do this is to subclass the subviews that are eating the touch events and not letting the UIScrollView get them. Then, override the pointInside:withEvent: method (with an appropriate exception for any UI that you still want to work). For example:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Confine the offending control to a certain area
    CGRect frame = CGRectMake(0, 0,
                              self.frame.size.width,
                              self.frame.size.height - 100.0);

    // Except for subview buttons (or some other UI element)
    if ([self depthFirstButtonTest:self pointInside:point withEvent:event])
    {
        return YES;
    }
    return CGRectContainsPoint(frame, point);
}
- (BOOL)depthFirstButtonTest:(UIView *)view pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    for (UIView *subview in view.subviews)
    {
        if ([self depthFirstButtonTest:subview pointInside:point withEvent:event])
        {
            return YES;
        }
    }

    // Is it a button? If so, perform normal testing on it
    if ([view isKindOfClass:[UIButton class]])
    {
        CGPoint pointInButton = [view convertPoint:point fromView:self];
        if ([view pointInside:pointInButton withEvent:event])
        {
            return YES;
        }
    }
    return NO;
}
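If the panels eating the touches are table views (the case described in the question), the same idea applies directly: subclass UITableView and put the pointInside:withEvent: override there. A minimal sketch, assuming a hypothetical RestrictedTableView class and a 100 pt strip at the top that should belong to the outer scroll view (both the class name and the strip are just for illustration):

// RestrictedTableView.h (hypothetical name)
#import <UIKit/UIKit.h>

@interface RestrictedTableView : UITableView
@end

// RestrictedTableView.m
@implementation RestrictedTableView

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Work in bounds coordinates so the check stays correct after the table
    // has been scrolled (a scroll view's bounds origin tracks its content offset).
    CGRect claimed = self.bounds;
    claimed.origin.y += 100.0;       // leave the top 100 pt strip to the outer scroll view
    claimed.size.height -= 100.0;
    return CGRectContainsPoint(claimed, point);
}

@end

Touches in the top strip then fall through to the containing scroll view, which can pan the panels; touches anywhere else behave like normal table touches.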
I am trying to animate this view, ControlsView, up on touchUpInside of the UIButton, which is the caret character inside the white square in the attached image. When the button is hit, a delegate method is fired and the controlsView is animated up. The problem is that because the UIButton is outside the bounds of controlsView, it does not receive touch events.
I have thought about this a lot and read up on some potential candidate solutions, such as detecting the touch event on a superview by overriding hitTest:withEvent:. However, the UIButton is actually a subview of CockPitView, which is a subview of ControlsView, which is a subview of MainView. MainView, it seems, is the rectangle whose coordinates the UIButton truly lies in. So would I override hitTest there and pass the touch info to CockPitView, where I could then have my button trigger its action callback?
Has anyone encountered this or a similar problem and have any advice on how to proceed?
You should override hitTest:withEvent: in your custom view, CockPitView:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (!self.clipsToBounds && !self.hidden && self.alpha > 0) {
        for (UIView *subview in self.subviews.reverseObjectEnumerator) {
            CGPoint subPoint = [subview convertPoint:point fromView:self];
            UIView *result = [subview hitTest:subPoint withEvent:event];
            if (result != nil) {
                return result;
            }
        }
    }
    return nil;
}
CockPitView has to be a subclass of UIView
You may want to use -pointInside:withEvent:. In your button's superview, i.e., ControlsView, override the method like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // self.button is the button outside the bounds of your ControlsView
    if (CGRectContainsPoint(self.button.bounds, [self convertPoint:point toView:self.button])) {
        return YES;
    }
    return [super pointInside:point withEvent:event];
}
By doing this, your ControlsView claims that the points inside the bounds of your button should be treated like the points inside your ControlsView.
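For this to compile, the ControlsView class needs a reference to that button. A minimal sketch of the wiring, where the button property name is just an assumption:

// ControlsView.h
#import <UIKit/UIKit.h>

@interface ControlsView : UIView
// The caret button that sits visually outside this view's bounds
@property (nonatomic, weak) IBOutlet UIButton *button;
@end

If the button is created in code rather than in a nib, assign it once both views exist, e.g. controlsView.button = caretButton;.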
I have the following structure:
UIView
- UIView2 with a Button
- CollectionView, which overlaps UIView2 and has a contentInset
I can't get the Button to receive touch events.
I tried pointInside: with a check for the clear color (the collectionView's backgroundColor is set to clear color).
But I also need scrolling to keep working on the blue part (UIView2).
So when I touch the collectionView, I need the UIButton to receive the touch as well.
Is it possible?
Why does your collection view need to overlap the button? Anyway, you can try overriding hitTest:withEvent: in the containing UIView like this:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    CGPoint pointInUIView2 = [UIView2 convertPoint:point fromView:self];
    if ([UIView2 pointInside:pointInUIView2 withEvent:event])
    {
        return UIView2;
    }
    else
    {
        return [super hitTest:point withEvent:event];
    }
}
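Note that UIView2 in the snippet above has to be something the container can reach, e.g. an ivar or property of the view whose hitTest:withEvent: is being overridden. A sketch of that wiring, with assumed names:

@interface ContainerView : UIView
@property (nonatomic, weak) UIView *UIView2;                   // the blue view that holds the button
@property (nonatomic, weak) UICollectionView *collectionView;  // the overlapping collection view
@end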
I have a view with a subview. When a button in the subview is tapped, the subview expands outside the bounds of its superview, presenting a couple of other buttons. However, I cannot find a way to interact with them.
I found code at Apple's site:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Convert the point to the target view's coordinate system.
    // The target view isn't necessarily the immediate subview.
    CGPoint pointForTargetView = [self.targetView convertPoint:point fromView:self];
    if (CGRectContainsPoint(self.targetView.bounds, pointForTargetView)) {
        // The target view may have its own view hierarchy,
        // so call its hitTest method to return the right hit-test view.
        return [self.targetView hitTest:pointForTargetView withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}
However, I cannot understand how I should use it so that my subview will recognize the touches.
Any help would be greatly appreciated.
You need to subclass UIView (or whichever class you need) and override that method. Then create an object of that subclass and use it in place of the plain view; it will then recognize the touches.
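A minimal sketch of what that might look like, reusing Apple's hitTest:withEvent: from above; the class and property names here are assumptions:

// ExpandingContainerView.h (hypothetical name)
#import <UIKit/UIKit.h>

@interface ExpandingContainerView : UIView
// The subview that expands outside this view's bounds
@property (nonatomic, weak) UIView *targetView;
@end

// ExpandingContainerView.m
@implementation ExpandingContainerView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    CGPoint pointForTargetView = [self.targetView convertPoint:point fromView:self];
    if (CGRectContainsPoint(self.targetView.bounds, pointForTargetView)) {
        return [self.targetView hitTest:pointForTargetView withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}

@end

Use ExpandingContainerView as the class of the view that owns the expanding subview (in code or in Interface Builder), point targetView at that subview, and the buttons it presents outside the bounds should start receiving taps.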
I've implemented a paged scroll according to this technique (iOS develop. How to extend UIScrollView's scroll event responding area?) and it works just as intended.
The view that I'm scrolling contains a couple of buttons, and I want to be able to tap not only those that are centered/paged into the scrollview but also those to the left and to the right of it. I cannot find any way to solve this, but I'm not really an iOS Jedi yet; hoping one of you is, though :)
So as you can see from the screenshot, the UIScrollView is about a third of the width of the window, while its contentSize is much larger (about 1500 px) and contains a lot of buttons added programmatically. The cool thing with this solution, and the part that actually works, is that the buttons:
1) are paged into the scrollview
2) are visible outside the scrollview (since "clip subviews" is unchecked for the scrollview)
3) are clickable when visible inside the uiscrollview.
BUT what doesn't work is simply this:
- the buttons currently outside of the window do not receive "their" clicks when you tap on them; the events are instead forwarded to the underlying view (the white part of the window).
So, I finally managed to solve this puzzle, and the solution is divided into two parts. The problem was, as you may recall, that the click events did not travel to the buttons that were (visibly) outside the UIScrollView. It turned out that the clicks were captured by the underlying view, and that it is possible to steer them toward their intended target by bending the hit-testing rules a bit regarding who got hit, thereby tricking the events into being passed where you want them. Not really sure if this is how it should be done, but it solved my problem... :)
1) First one must override the following method in the bottom view
so that it returns the scrollview instead of itself when appropriate.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *view = [super hitTest:point withEvent:event];
    if (view == self)
        return [self scrollView];
    return view;
}
2) The scrollView must override TWO methods to hand over the clicks to its contained objects.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *view = [super hitTest:point withEvent:event];
    // Always return us.
    return view;
}
and
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // We want EVERYTHING!
    return YES;
}
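Note that [self scrollView] in step 1 assumes the bottom view keeps a reference to the paging scroll view; a sketch of that wiring (the property name is an assumption):

@interface BottomView : UIView
@property (nonatomic, weak) IBOutlet UIScrollView *scrollView;   // the paging scroll view to redirect hits to
@end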
Thanks a lot for your comments and willingness to help.
I hope this helps!
Inspired by the answer @tommys mentioned, it turns out that by overriding the hitTest:withEvent: method of a UIView and returning the scrollView instead, you can effectively hand that UIView's swipes over to the scrollView.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *view = [super hitTest:point withEvent:event];
    // Returning the scroll view here hands the swipe over to it
    if (view == self)
        return [self scrollView];
    return view;
}
So this UIView acts like an extended scroll area for the scrollView; that is the idea here. If you lay the UIView over the scrollView as a mask the same size as the window, then swiping anywhere inside the window makes the scrollView scroll.
Here is the example, ExtensionScrollArea
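A rough sketch of that setup, assuming the overlay class is called ExtensionScrollAreaView and exposes the scrollView property used above (both names are assumptions):

// In the view controller that owns the paging scroll view:
ExtensionScrollAreaView *extensionArea =
    [[ExtensionScrollAreaView alloc] initWithFrame:self.view.bounds];  // window-sized mask
extensionArea.backgroundColor = [UIColor clearColor];
extensionArea.scrollView = self.pagingScrollView;   // swipes on the mask are redirected here
[self.view addSubview:extensionArea];               // sits on top, covering the whole window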
Here's my version:
hit test in container
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (CGRectContainsPoint(self.frame, point) && !self.hidden)   // <-- *
    {
        if (!CGRectContainsPoint(scrollView.frame, point))
            return scrollView;
    }
    return [super hitTest:point withEvent:event];
}
(*) This marked line is important if you are moving about or otherwise hiding your view, for instance if you have multiple views, each with their own scrollviews. If you don't have this line, you may be directing all your touches to an off-screen scrollview!
override in scrollview
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    return YES;
}
(in the hitTest of the container, you can exclude additional frames within the if statement for default behaviour) :)
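For example, to keep default behaviour for a toolbar sitting on top of the container, you might add one more check (the toolbar property here is just an assumption for illustration):

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (CGRectContainsPoint(self.frame, point) && !self.hidden
        && !CGRectContainsPoint(self.toolbar.frame, point))   // let the toolbar hit-test normally
    {
        if (!CGRectContainsPoint(scrollView.frame, point))
            return scrollView;
    }
    return [super hitTest:point withEvent:event];
}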
The title is hard to word. The main case is like this:
UIView *superView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)];
UIView *subView = [[UIView alloc] initWithFrame:CGRectMake(-200, -200, 400, 400)];

UITapGestureRecognizer *tapGesture =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
[subView addGestureRecognizer:tapGesture];
[superView addSubview:subView];
OK, you will find that the tap gesture takes effect when you tap inside the area (0, 0, 200, 200), but if you tap the point (-150, -150) the tap gesture does not fire.
I don't know whether it is the tap landing outside the superView's bounds that causes this problem or not.
Anyone have any idea how to fix this?
To allow subviews lying outside of the superview to respond to touch, override hitTest:withEvent: of the superview.
Documentation on Event Delivery
Touch events. The window object uses hit-testing and the responder chain to find the view to receive the touch event. In hit-testing, a window calls hitTest:withEvent: on the top-most view of the view hierarchy; this method proceeds by recursively calling pointInside:withEvent: on each view in the view hierarchy that returns YES, proceeding down the hierarchy until it finds the subview within whose bounds the touch took place. That view becomes the hit-test view.
1) Create a subclass of UIView.
2) Override hitTest:withEvent:.
3) Use this UIView subclass for the superview.
4) Add the method below in the subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSEnumerator *reverseE = [self.subviews reverseObjectEnumerator];
    UIView *iSubView;

    while ((iSubView = [reverseE nextObject])) {
        UIView *viewWasHit = [iSubView hitTest:[self convertPoint:point toView:iSubView] withEvent:event];
        if (viewWasHit) {
            return viewWasHit;
        }
    }
    return [super hitTest:point withEvent:event];
}
Note: a reverse enumerator is used since subviews are ordered from back to front and we want to test the frontmost view first.
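A sketch of using it for the example in the question, assuming the subclass is called OverflowSuperview:

OverflowSuperview *superView =
    [[OverflowSuperview alloc] initWithFrame:CGRectMake(0, 0, 400, 400)];
UIView *subView = [[UIView alloc] initWithFrame:CGRectMake(-200, -200, 400, 400)];

UITapGestureRecognizer *tapGesture =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapAction:)];
[subView addGestureRecognizer:tapGesture];
[superView addSubview:subView];
// A tap at (-150, -150) in superView's coordinates can now reach subView,
// provided the point still lies inside superView's ancestors (e.g. the window),
// because the overridden hitTest:withEvent: also tests subviews lying outside
// superView's bounds.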
The only workaround I've found for a case like that is to use, as the main view, a view that is transparent to touches. In that case the inner view will respond to touches, since it fits within the bounds of the main view. In the class I've put together from different examples found on the net, I can control the level of "touch visibility" like so:
fully visible - all of the touches end up in the view.
only subviews - the view itself is invisible to touches, but its subviews get their touches.
fully invisible - pretty self-explanatory, I think :)
I didn't try to use it with gesture recognizers, but I don't think there will be any problem, as it works perfectly with regular touches.
The code is simple...
TransparentTouchView.h
#import <UIKit/UIKit.h>

typedef enum {
    TransparencyTypeNone = 0,   // act like a usual UIView
    TransparencyTypeContent,    // only content (subviews) gets touches
    TransparencyTypeFull        // fully transparent to touches
} TransparencyType;

@interface TransparentTouchView : UIView {
    TransparencyType _transparencyType;
}

@property (nonatomic, assign) TransparencyType transparencyType;

@end
TransparentTouchView.m
#import "TransparentTouchView.h"
#implementation TransparentTouchView
#synthesize
transparencyType = _transparencyType;
- (id)initWithFrame:(CGRect)frame{
self = [super initWithFrame:frame];
if (self) {
// Initialization code
self.backgroundColor = [UIColor clearColor];
}
return self;
}
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
// UIView will be "transparent" for touch events if we return NO
switch (_transparencyType) {
case TransparencyTypeContent:
for(UIView* subview in self.subviews){
CGPoint p = [subview convertPoint:point fromView:self];
if([subview pointInside:p withEvent:event]){
return YES;
}
}
return NO;
break;
case TransparencyTypeFull:
return NO;
default:
break;
}
return YES;
}
#end
I believe you can adapt it to your needs.
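For example, to get the "only subviews" behaviour (the view and button names here are just assumptions):

TransparentTouchView *overlay =
    [[TransparentTouchView alloc] initWithFrame:self.view.bounds];
overlay.transparencyType = TransparencyTypeContent;   // only subviews receive touches
[overlay addSubview:someButton];                       // someButton still gets its taps
[self.view addSubview:overlay];                        // touches elsewhere fall through the overlay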