A problem in the tableview section headview with hitTest method - ios

I have a custom lightGray view. The lightGray view has a button property; the button's backgroundColor is red.
I override the hitTest: method in the lightGray view:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (!self.isUserInteractionEnabled || self.isHidden || self.alpha <= 0.01) {
        return nil;
    }
    if ([self pointInside:point withEvent:event]) {
        return self.testButton;
    }
    return nil;
}
With the cursor at the point shown in the image above, clicking there makes the button respond. But if I click in the left area of the lightGray view, the red button doesn't respond.
What's the reason?
My real problem is the same issue in a table view section header view; I wrote this demo to reproduce it and ask this question.

I was able to reproduce this behaviour even without a table view, just a custom view, and it really does look strange: if we return the button from hitTest:, the button's action fires not over the whole area of the gray view, but only within some distance around the button.
I added a test view with a tap gesture recognizer to compare against the button's behaviour, and it works as expected. So as a workaround you can replace the button with a plain view plus a gesture recognizer.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (!self.isUserInteractionEnabled || self.isHidden || self.alpha <= 0.01) {
        return nil;
    }
//    if (CGRectContainsPoint(self.bounds, point)) {
//        return self.testButton; // fires only at some distance
//    }
    if (CGRectContainsPoint(self.bounds, point)) {
        return self.testView; // fires for every point inside gray view
    }
    return [super hitTest:point withEvent:event];
}
- (IBAction)testButtonTouchUpInside:(UIButton *)sender
{
    NSLog(@"BTN HIT"); // Not OK
}

- (IBAction)testViewTap:(UITapGestureRecognizer *)sender
{
    NSLog(@"VIEW TAP"); // OK
}

Related

UIView alpha 0 but still receive touch events?

I am trying to hide a UIView but still have it receive touch events. I am setting the alpha to 0 like so:
mainView.alpha = 0.0
I've also tried setting this to true, but it doesn't make a difference:
mainView.userInteractionEnabled = true
Trouble is it no longer receives touch events when I do this. How can I enable it to receive touch events but be effectively hidden?
Set your UIView's backgroundColor to UIColor.clear.
mainView.backgroundColor = .clear
There are two ways to handle this:
The hard one - instead of setting alpha on the view, set it on all its subviews and make all the content in the view invisible (e.g. change background to clear color).
The easy one - let another view handle the events. Just add another transparent view with the same position and size (which is easy using constraints) which will handle the events.
By the way, if you check the documentation for method hitTest:withEvent: which is used to handle touch events, it says that views with alpha lower than 0.01 won't receive touches.
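A minimal sketch of the "easy" option above, assuming a view controller with a mainView outlet; touchProxy and proxyTapped: are illustrative names, not from the original:

```
// Add an invisible sibling view over mainView to receive touches
// while mainView itself stays at alpha 0.
UIView *touchProxy = [[UIView alloc] init];
touchProxy.backgroundColor = [UIColor clearColor];
touchProxy.translatesAutoresizingMaskIntoConstraints = NO;
[self.view addSubview:touchProxy];

// Pin the proxy to mainView's edges so it always matches its frame.
[NSLayoutConstraint activateConstraints:@[
    [touchProxy.topAnchor constraintEqualToAnchor:self.mainView.topAnchor],
    [touchProxy.bottomAnchor constraintEqualToAnchor:self.mainView.bottomAnchor],
    [touchProxy.leadingAnchor constraintEqualToAnchor:self.mainView.leadingAnchor],
    [touchProxy.trailingAnchor constraintEqualToAnchor:self.mainView.trailingAnchor],
]];

// Let the proxy handle taps instead of the hidden view.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(proxyTapped:)];
[touchProxy addGestureRecognizer:tap];
```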
Hide all subviews of the view and set the backgroundColor to UIColor clearColor
Are you looking for a transparent view?
Just create a TransparantView subclass of UIView and copy the code below. The view will be transparent but still receive touches. You may need to change some of the code, because my code handles its subviews; the key is overriding pointInside: to meet your goals.
If you just want a transparent view that lets the views underneath it receive the touch events passed through it, you cannot add any subviews to the transparent view.
For extended reading, learn about the responder chain.
@implementation TransparantView

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    return self;
}

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *view in self.subviews) {
        if (!view.hidden && view.alpha > 0 && view.userInteractionEnabled &&
            [view pointInside:[self convertPoint:point toView:view] withEvent:event])
            return YES;
    }
    return NO;
}

@end
UPDATE:
It seems iOS will not dispatch events to a view while its alpha is set to 0. However, the transparent view shows nothing on screen, just as if its alpha were 0. I have updated the code to fix a bug:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    for (UIView *view in self.subviews) {
        if (!view.hidden && view.alpha > 0 && view.userInteractionEnabled &&
            [view pointInside:[self convertPoint:point toView:view] withEvent:event])
            return YES;
    }
    if (!self.hidden && self.alpha >= 0.0 && self.userInteractionEnabled) {
        return YES;
    } else {
        return NO;
    }
}

@end

iOS - UIScrollView hitTest doesn't contain touches

I have a UIScrollView with UIViews inside it. The UIScrollView has two-finger scroll enabled. The UIViews each have a panGestureRecognizer. I want the following functionality:
If two-touch pan --> scroll.
If single touch pan && touch on a UIView --> fire UIView's panGestureRecognizer.
I want to do this by overriding UIScrollView's hitTest. If the number of touches is greater than 1, return the UIScrollView so it may scroll. If the number of touches is 1, return the normal hitTest result so the UIView's panGestureRecognizer may fire. But my UIScrollView's hitTest code never sees any touches! (Although two-finger scrolling works, the hitTest event has no touches.)
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSSet *touches = [event touchesForView:self];
    NSLog(@"%@", touches);
    if ([touches count] > 1)
    {
        return self;
    }
    UIView *usualView = [super hitTest:point withEvent:event];
    return usualView;
}
hitTest: is a low-level overridable method for handling touches, or better said, for determining the destination of touches. You cannot know the number of touches there; the event parameter is useless for that. Instead, you get called twice for each touch, meaning that for a two-finger touch you get called four times.
It is suitable only for determining the destination of touches, not for detecting the number of touches or gestures.
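Since hitTest: cannot count touches, the one- vs. two-finger split described in the question is usually handled through gesture recognizer configuration instead. A sketch, assuming a scroll view and a subview with its own pan handler (subviewPanned: and someSubview are illustrative names):

```
// Restrict the scroll view's built-in pan recognizer to two fingers,
// so single-finger pans fall through to the subviews.
scrollView.panGestureRecognizer.minimumNumberOfTouches = 2;
scrollView.panGestureRecognizer.maximumNumberOfTouches = 2;

// The subview's own recognizer only tracks single-finger pans.
UIPanGestureRecognizer *viewPan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(subviewPanned:)];
viewPan.maximumNumberOfTouches = 1;
[someSubview addGestureRecognizer:viewPan];
```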
Default implementation looks something like:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01 ||
        ![self pointInside:point withEvent:event] || ![self _isAnimatedUserInteractionEnabled]) {
        return nil;
    } else {
        for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
            UIView *hitView = [subview hitTest:[subview convertPoint:point fromView:self] withEvent:event];
            if (hitView) {
                return hitView;
            }
        }
        return self;
    }
}

hitTest:withEvent: is calling sendActionsForControlEvents twice

I have a simple UIButton subclass and I just want to extend the touch area of the button without actually increasing the frame (this is because I want the highlight and selected backgrounds to be the size of the original frame).
Here is what I have added to my UIButton subclass:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    int errorMargin = 20;
    CGRect largerFrame = CGRectMake(0 - (errorMargin / 2), 0 - (errorMargin / 2),
                                    self.frame.size.width + errorMargin,
                                    self.frame.size.height + errorMargin);
    if (CGRectContainsPoint(largerFrame, point)) {
        NSLog(@"Sending Action");
        [self sendActionsForControlEvents:UIControlEventTouchUpInside];
        return self;
    } else {
        return nil;
    }
}
This works great; however, it calls the required action twice. So, for instance, a "tab" button is tapped once but tabs over twice.
I have found a few questions related to a UIControl and the beganTracking method, but I can't get it to work with a UIButton.
Any ideas on how to properly implement this with a UIButton? Thanks!
By returning self for the larger area, you fool the hit-test logic into thinking the button is larger than it is. That alone causes the button to trigger the action, so there is no need to call sendActionsForControlEvents in the hitTest: method; that second, manual send is what fires the action twice.
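A sketch of the corrected override per that advice, with the manual send removed (the 20pt margin is carried over from the question's code):

```
// Expanded hit area around the button's bounds. Returning self is
// enough: UIKit delivers the touch and fires the action once.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGFloat errorMargin = 20.0;
    CGRect largerFrame = CGRectInset(self.bounds, -errorMargin / 2, -errorMargin / 2);
    return CGRectContainsPoint(largerFrame, point) ? self : nil;
}
```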

Some buttons fail hitTest

My interface sometimes has buttons around its periphery. Areas without buttons accept gestures.
GestureRecognizers are added to the container view, in viewDidLoad. Here’s how the tapGR is set up:
UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(playerReceived_Tap:)];
[tapGR setDelegate:self];
[self.view addGestureRecognizer:tapGR];
In order to prevent the gesture recognizers from intercepting button taps, I implemented shouldReceiveTouch to return YES only if the view touched is not a button:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
       shouldReceiveTouch:(UITouch *)touch {
    // Get the topmost view that contains the point where the gesture started.
    // (Buttons are topmost, so if they were touched, they will be returned as viewTouched.)
    CGPoint pointPressed = [touch locationInView:self.view];
    UIView *viewTouched = [self.view hitTest:pointPressed withEvent:nil];
    // If that topmost view is a button, the GR should not take this touch.
    if ([viewTouched isKindOfClass:[UIButton class]])
        return NO;
    return YES;
}
This works fine most of the time, but there are a few buttons that are unresponsive. When these buttons are tapped, hitTest returns the container view, not the button, so shouldReceiveTouch returns YES and the gestureRecognizer commandeers the event.
To debug, I ran some tests...
The following tests confirmed that the button was a sub-subview of the container view, that it was enabled, and that both button and the subview were userInteractionEnabled:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
       shouldReceiveTouch:(UITouch *)touch {
    // Test that hierarchy is as expected: containerView > vTop_land > btnSkipFwd_land.
    for (UIView *subview in self.view.subviews) {
        if ([subview isEqual:self.playComposer.vTop_land])
            printf("\nViewTopLand is a subview."); // this prints
    }
    for (UIView *subview in self.playComposer.vTop_land.subviews) {
        if ([subview isEqual:self.playComposer.btnSkipFwd_land])
            printf("\nBtnSkipFwd is a subview."); // this prints
    }
    // Test that problem button is enabled.
    printf("\nbtnSkipFwd enabled? %d", self.playComposer.btnSkipFwd_land.enabled); // prints 1
    // Test that all views in hierarchy are interaction-enabled.
    printf("\nvTopLand interactionenabled? %d", self.playComposer.vTop_land.userInteractionEnabled); // prints 1
    printf("\nbtnSkipFwd interactionenabled? %d", self.playComposer.btnSkipFwd_land.userInteractionEnabled); // prints 1
    // etc
}
The following test confirms that the point pressed is actually within the button’s frame.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
       shouldReceiveTouch:(UITouch *)touch {
    CGPoint pointPressed = [touch locationInView:self.view];
    CGRect rectSkpFwd = self.playComposer.btnSkipFwd_land.frame;
    // Get the pointPressed relative to the button's frame.
    CGPoint pointRelSkpFwd = CGPointMake(pointPressed.x - rectSkpFwd.origin.x,
                                         pointPressed.y - rectSkpFwd.origin.y);
    printf("\nIs relative point inside skipfwd? %d.",
           [self.playComposer.btnSkipFwd_land pointInside:pointRelSkpFwd withEvent:nil]); // prints 1
    // etc
}
So why is hitTest returning the container view rather than this button?
SOLUTION: The one thing I wasn't testing was that the intermediate view, vTop_land, was framed properly. It looked OK because it had an image that extended across the screen -- past the bounds of its frame (I didn't know this was possible). The frame was set to portrait width, rather than landscape width, so buttons on the far right were out of zone.
Hit testing is not reliable in most cases, and it is generally not advisable to use it alongside gesture recognizers.
Why don't you setExclusiveTouch:YES on each button? That should make sure the buttons are always chosen.

UIView: Why Does A Subview Outside Its Parent's Extent Not Receive Touches?

I have a simple, trivial UIView parent/child hierarchy. One parent (UIView). One child (UIButton). The parent's bounds are smaller than its child's bounds, so a portion of the child extends beyond the bounding box of its parent.
Here's the problem: those portions of the child outside the bbox of the parent do not receive touches. Only tapping within the bbox of the parent allows the child button to receive touches.
Can someone please suggest a fix/workaround?
UPDATE
For those following this question, here is the solution I implemented as a result of @Bastian's most excellent answer:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL isInside = [super pointInside:point withEvent:event];
    // identify the button view subclass
    UIButton *b = (UIButton *)[self viewWithTag:3232];
    CGPoint inButtonSpace = [self convertPoint:point toView:b];
    BOOL isInsideButton = [b pointInside:inButtonSpace withEvent:nil];
    if (isInsideButton) {
        return isInsideButton;
    }
    return isInside;
}
The problem is the responder chain. When you touch the display, hit testing walks down from the parents to the children.
So when you touch the screen, the parent sees that the touch is outside its own bounds, and the children are never even asked.
The function that does this is hitTest:. If you have your own UIView subclass, you can override it and return the button yourself:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
Per Apple’s own documentation, the simplest and most reliable way I have found to do this is to override hitTest:withEvent: in the superclass of your clipped view to look like the following:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Convert the point to the target view's coordinate system.
    // The target view isn't necessarily the immediate subview.
    CGPoint pointForTargetView = [self.targetView convertPoint:point fromView:self];
    if (CGRectContainsPoint(self.targetView.bounds, pointForTargetView)) {
        // The target view may have its own view hierarchy,
        // so call its hitTest method to return the right hit-test view.
        return [self.targetView hitTest:pointForTargetView withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}
Precondition:
You have a UIButton (named button1) inside a UIView (named container), and button1 is partially outside the container's bounds.
Problem:
The part of button1 outside the container does not respond to clicks.
Solution:
subclass your container from UIView:
class Container: UIView {
    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let closeButton = viewWithTag(10086) as! UIButton //<-- note the tag value
        if closeButton.pointInside(convertPoint(point, toView: closeButton), withEvent: event) {
            return true
        }
        return super.pointInside(point, withEvent: event)
    }
}
Don't forget to give your button1 a tag of 10086
@dugla, thank you for the question!
@Bastian and @laoyur, thanks to you for the answers!
Swift 4
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    if yourChildView.point(inside: convert(point, to: yourChildView), with: event) {
        return true
    }
    return super.point(inside: point, with: event)
}
I had the exact same problem at hand. You only need to override:
-(BOOL) pointInside:(CGPoint)point withEvent:(UIEvent *)event
Here is working code for a custom UIView subclass created solely to be tappable for its out-of-bounds children:
@implementation ARQViewToRightmost

// We should not need to override this one !!!
/*
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return self;
    return nil;
}
*/

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return YES; // This answer makes hitTest continue recursively probing the view's subviews for a hit
    return NO;
}

// Check whether the point is within the bounds of the view. We've extended the bounds to the max right.
// Make sure you do not have any sibling UIViews to the right of this one, as they will never receive events.
- (BOOL)isInside:(CGPoint)point
{
    CGFloat maxWidth = CGFLOAT_MAX;
    CGRect rect = CGRectMake(0, 0, maxWidth, self.frame.size.height);
    if (CGRectContainsPoint(rect, point))
        return YES;
    return NO;
}

@end