iOS - UIScrollView hitTest doesn't contain touches

I have a UIScrollView with UIViews inside it. The UIScrollView has two-finger scroll enabled. The UIViews each have a panGestureRecognizer. I want the following functionality:
If two-touch pan --> scroll.
If single-touch pan && touch on a UIView --> fire that UIView's panGestureRecognizer.
I want to do this by overriding UIScrollView's hitTest. If the number of touches is greater than 1, return the UIScrollView so it can scroll. If the number of touches is 1, return the normal hitTest result so the UIView's panGestureRecognizer can fire. But my UIScrollView's hitTest code never sees any touches! (Two-finger scrolling works, but the event passed to hitTest never contains any touches.)
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSSet *touches = [event touchesForView:self];
    NSLog(@"%@", touches);
    if ([touches count] > 1)
    {
        return self;
    }
    UIView *usualView = [super hitTest:point withEvent:event];
    return usualView;
}

hitTest is a low-level overridable method for handling touches, or better said, for determining the destination view of a touch. You cannot know the number of touches here - the event parameter is of no use for that. Instead, you get called twice for each touch, meaning that for a two-finger touch you get called four times.
It is not suitable for counting touches or detecting gestures; it only determines which view a touch is delivered to.
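A more conventional way to get the behaviour described in the question is to skip hitTest entirely and configure the gesture recognizers instead: require two fingers on the scroll view's built-in pan recognizer and one finger on each subview's recognizer. A sketch, where handlePan: is an assumed action method:

```
// In the scroll view's setup code: scroll only with two fingers.
self.scrollView.panGestureRecognizer.minimumNumberOfTouches = 2;

// On each subview: pan only with a single finger.
// handlePan: is a hypothetical action method on your controller.
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
pan.maximumNumberOfTouches = 1;
[subview addGestureRecognizer:pan];
```

With this setup the system arbitrates between the recognizers by touch count, so neither view needs a custom hitTest.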
Default implementation looks something like:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01
        || ![self pointInside:point withEvent:event]
        || ![self _isAnimatedUserInteractionEnabled]) {
        return nil;
    } else {
        for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
            UIView *hitView = [subview hitTest:[subview convertPoint:point fromView:self] withEvent:event];
            if (hitView) {
                return hitView;
            }
        }
        return self;
    }
}


A problem in the tableview section headview with hitTest method

I have a custom lightGray view. The lightGray view has a button property (testButton); the button's backgroundColor is red.
I override the hitTest method in the lightGray view.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (!self.isUserInteractionEnabled || self.isHidden || self.alpha <= 0.01) {
        return nil;
    }
    if ([self pointInside:point withEvent:event]) {
        return self.testButton;
    }
    return nil;
}
As a result, with the cursor at the point shown in the image above, clicking triggers the button's action. But if I click in the left area of the lightGray view, the red button doesn't respond.
What's the reason?
My real problem is the same issue in a table view section header view; I wrote this demo to reproduce it and ask this question.
I was able to reproduce this behaviour even without a table view, just with a custom view, and it really does look strange - if we return the button from hitTest, the button's action fires not across the whole area of the gray view, but only within some distance around the button.
I added a test view with a tap gesture recognizer to compare against the button's behaviour, and it works as expected. So as a workaround you may replace the button with a plain view plus a gesture recognizer.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if (!self.isUserInteractionEnabled || self.isHidden || self.alpha <= 0.01) {
        return nil;
    }
    // if (CGRectContainsPoint(self.bounds, point)) {
    //     return self.testButton; // fires only at some distance
    // }
    if (CGRectContainsPoint(self.bounds, point)) {
        return self.testView; // fires for every point inside gray view
    }
    return [super hitTest:point withEvent:event];
}

- (IBAction)testButtonTouchUpInside:(UIButton *)sender
{
    NSLog(@"BTN HIT"); // Not OK
}

- (IBAction)testViewTap:(UITapGestureRecognizer *)sender
{
    NSLog(@"VIEW TAP"); // OK
}

Swipe doesn't work properly in a uiscrollview

I have a UIScrollView called templateView. I must add a swipe gesture to it so the user can swipe left/right to see other templates. The problem is that most of the time the user can't swipe easily, because the view scrolls up/down instead of swiping to another view. The user's finger has to move almost strictly horizontally to swipe to another page, and this isn't acceptable from a user-experience perspective.
Any idea how to handle such cases? Is there a way to specify an angle tolerance for detecting the swipe gesture? Or is there a way to build a custom UIGestureRecognizer that detects oblique movement at a specific angle?
Thanks in advance.
Try implementing this UIGestureRecognizerDelegate method. It is called when recognition of a gesture by either gestureRecognizer or otherGestureRecognizer would block the other gesture recognizer from recognizing its gesture. Note that returning YES is guaranteed to allow simultaneous recognition.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Reference: UIGestureRecognizer Protocol
Do not forget to assign the delegate when you initialize your swipe gesture.
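Wiring up the delegate might look like this (templateView is from the question; handleSwipe: is an assumed action method, and self must conform to UIGestureRecognizerDelegate):

```
UISwipeGestureRecognizer *swipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionLeft;
swipe.delegate = self; // without this, the delegate method above is never called
[templateView addGestureRecognizer:swipe];
```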
UPDATE 1 CREATING YOUR OWN GESTURE
You can always subclass UIGestureRecognizer and implement the touchesBegan, touchesMoved, and touchesEnded methods - manually managing the states of the gesture depending on your own needs.
I am posting some sample code of a custom EdgeGestureRecognizer implementation for better understanding.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];

    // If not a single finger, fail immediately.
    if ([touches count] != 1)
    {
        self.state = UIGestureRecognizerStateFailed;
        return;
    }

    // Put your own logic here. For instance, register the first
    // touch location; it will help you calculate the angle later.
    UITouch *touch = touches.anyObject;
    self.startPoint = [touch locationInView:self.view];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateFailed) return;

    UITouch *touch = touches.anyObject;
    self.previousPoint = self.currentPoint;
    self.previousPointTime = self.currentPointTime;
    self.currentPoint = [touch locationInView:self.view];
    self.currentPointTime = touch.timestamp;

    if (self.state == UIGestureRecognizerStatePossible)
    {
        CGPoint translate = CGPointMake(self.currentPoint.x - self.startPoint.x,
                                        self.currentPoint.y - self.startPoint.y);
        // See if we've moved the necessary minimum distance.
        if (sqrt(translate.x * translate.x + translate.y * translate.y) >= self.minimumRecognitionDistance)
        {
            // Recognize if the angle is roughly horizontal, otherwise fail.
            double angle = atan2(translate.y, translate.x);
            if ([self isAngleCloseEnough:angle])
                self.state = UIGestureRecognizerStateBegan;
            else
                self.state = UIGestureRecognizerStateFailed;
        }
    }
    else if (self.state == UIGestureRecognizerStateBegan)
    {
        self.state = UIGestureRecognizerStateChanged;
    }
}
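The sample references an isAngleCloseEnough: helper and omits the end of the touch sequence. A minimal sketch of both, assuming a fixed 30-degree tolerance around horizontal (both the helper's body and the tolerance are my assumptions, not from the original answer):

```
// Hypothetical helper: YES if the angle is within ~30 degrees of horizontal
// in either direction (left or right swipe).
- (BOOL)isAngleCloseEnough:(double)angle
{
    double tolerance = 30.0 * M_PI / 180.0;
    return fabs(angle) < tolerance || fabs(angle) > M_PI - tolerance;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    // End the gesture if it was in progress, otherwise fail.
    if (self.state == UIGestureRecognizerStateBegan ||
        self.state == UIGestureRecognizerStateChanged)
        self.state = UIGestureRecognizerStateEnded;
    else
        self.state = UIGestureRecognizerStateFailed;
}
```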

Get current view from finger swipe in iOS SDK

In my application I have multiple small views joined together to form a big canvas. I'm properly getting the touch began/moved/ended events for each of those views separately. What I want now is this: if I touch view1 and drag my finger out of view1 and into the territory of view2 without lifting it, I want view2 to somehow get a notification that I'm now in it. Thanks.
I was able to do it using touchesMoved method. Here's the code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    CGPoint nowPoint = [touches.anyObject locationInView:self.view];
    NSLog(@"%f, %f", nowPoint.x, nowPoint.y);
    NSArray *viewsToCheck = [self.view subviews];
    for (UIView *v in viewsToCheck)
    {
        if ([v isKindOfClass:[CharacterTile class]])
        {
            if (CGRectContainsPoint(v.frame, nowPoint))
            {
                CharacterTile *ctTemp = (CharacterTile *)v;
                // Perform your work with the subview.
            }
        }
    }
}
where CharacterTile is the class of the subviews added to self.view.
CGRectContainsPoint tells you whether the point touched by the user is inside a view's frame or not.

iOS: Objective How To Get One Image At A Time To Glow, As Finger Is Dragged Over Each Of A Series Of Images

Objective: To get one image at a time to glow, as finger is dragged over each of a series of images.
In an attempt to figure this out on my own, I tried to update the alpha of the touched image to 1.00 while setting all others in the series to alpha 0.25 as the finger drags over each individual image via touchesMoved. However, my methods below did not produce the desired result.
Artwork for Glow overlay for each of the eight images is created in viewDidLoad using this pattern:
- (void)viewDidLoad
{
    Glow *imageOne = [[Glow alloc] initWithNibName:@"ImageOne"
                                            bundle:[NSBundle mainBundle]];
    self.glowOneView = imageOne;
    [imageOne release];
    [self.glowOneView setTag:101];
    [self.glowOneView setAlpha:0.25];
    [self.glowOneView setCenter:CGPointMake(160, 135)];
    [self.view insertSubview:self.glowOneView atIndex:11];
}
(repeating the above pattern to uniquely create each of the remaining eight images).
touchesMoved pattern looks like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Touches Began");
    UITouch *touch = [touches anyObject];
    if ([touch view] == glowOneView) {
        [glowOneView setAlpha:1.00];
        [glowTwoView setAlpha:0.25];
        [glowThreeView setAlpha:0.25];
        [glowFourView setAlpha:0.25];
        [glowFiveView setAlpha:0.25];
        [glowSixView setAlpha:0.25];
        [glowSevenView setAlpha:0.25];
        [glowEightView setAlpha:0.25];
        NSLog(@"Began Button One");
    }
    else if ([touch view] == glowTwoView) {
        [glowOneView setAlpha:0.25];
        [glowTwoView setAlpha:1.00];
        [glowThreeView setAlpha:0.25];
        [glowFourView setAlpha:0.25];
        [glowFiveView setAlpha:0.25];
        [glowSixView setAlpha:0.25];
        [glowSevenView setAlpha:0.25];
        [glowEightView setAlpha:0.25];
        NSLog(@"Began Button Two");
    }
...
(repeating the above pattern to uniquely handle each of the remaining eight images).
The code snippets above glow the first image touched, but as you drag around the view, touchesMoved won't update the alpha for any subsequent image the finger is dragged over until the touch is released and a new touch is initiated.
The Console shows the touchesMoved NSLog from the first image touched only, and continually repeats that NSLog for as long as the finger is dragged, no matter which image is subsequently under the finger.
I would really appreciate your advice and any example that replaces or updates the above methods to produce the desired result of getting one image at a time to glow, as a finger is dragged over each of a series of images.
Thank you,
I've implemented something similar to what you describe. Assuming this code is in the view that contains your Glow views, you need to do a couple of things: 1) have your container view intercept touch events by returning itself from hitTest:withEvent:, and 2) in touchesMoved:withEvent:, call UIView's implementation of hitTest:withEvent: to figure out which view the touch is in.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Intercept all touches inside the container.
    if ([self pointInside:point withEvent:event]) {
        return self;
    }
    return nil;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Figure out which view the touch is in.
    UIView *view = [super hitTest:[[touches anyObject] locationInView:self] withEvent:nil];
    // Highlight the Glow subview under the touch.
    if ([view isKindOfClass:[Glow class]]) {
        [view setAlpha:1.0];
    }
    // Un-highlight all other Glow subviews.
    for (UIView *subview in [self subviews]) {
        if (subview != view && [subview isKindOfClass:[Glow class]]) {
            [subview setAlpha:0.25];
        }
    }
}
You'll probably want to also implement touchesBegan:withEvent: to highlight the view before the touch starts moving and touchesCancelled:withEvent: and touchesEnded:withEvent: to un-highlight all.
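A minimal sketch of those remaining handlers, under the same assumptions (the container view owns the Glow subviews; unhighlightAllGlowViews is a hypothetical helper, not from the original answer):

```
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Reuse the same hit-testing logic to highlight immediately.
    [self touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self unhighlightAllGlowViews];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self unhighlightAllGlowViews];
}

// Hypothetical helper: dim every Glow subview.
- (void)unhighlightAllGlowViews {
    for (UIView *subview in [self subviews]) {
        if ([subview isKindOfClass:[Glow class]]) {
            [subview setAlpha:0.25];
        }
    }
}
```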

UIView. Why Does A Subview Outside Its Parent's Extent Not Receive Touches?

I have a simple - trivial - UIView parent/child hierarchy. One parent (UIView). One child (UIButton). The parent's bounds are smaller than its child's bounds, so that a portion of the child extends beyond the bounding box of its parent.
Here's the problem: those portions of the child outside the bbox of the parent do not receive touches. Only tapping within the bbox of the parent allows the child button to receive touches.
Can someone please suggest a fix/workaround?
UPDATE
For those following this question, here is the solution I implemented as a result of @Bastian's most excellent answer:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL isInside = [super pointInside:point withEvent:event];
    // Identify the button view subclass.
    UIButton *b = (UIButton *)[self viewWithTag:3232];
    CGPoint inButtonSpace = [self convertPoint:point toView:b];
    BOOL isInsideButton = [b pointInside:inButtonSpace withEvent:nil];
    if (isInsideButton) {
        return isInsideButton;
    }
    return isInside;
}
The problem is the responder chain. When you touch the display, hit-testing goes down from the parents to the children.
So when you touch the screen, the parent sees that the touch is outside of its own bounds, and the children are not even asked.
The function that does this is hitTest. If you have your own UIView subclass, you can override it and return the button yourself.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
Per Apple's own documentation, the simplest and most reliable way I have found to do this is to override hitTest:withEvent: in the superview of your clipped view to look like the following:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Convert the point to the target view's coordinate system.
    // The target view isn't necessarily the immediate subview.
    CGPoint pointForTargetView = [self.targetView convertPoint:point fromView:self];
    if (CGRectContainsPoint(self.targetView.bounds, pointForTargetView)) {
        // The target view may have its own view hierarchy,
        // so call its hitTest method to return the right hit-test view.
        return [self.targetView hitTest:pointForTargetView withEvent:event];
    }
    return [super hitTest:point withEvent:event];
}
Precondition:
You have a UIButton (named button1) inside a UIView (named container), and button1 is partially outside the container's bounds.
Problem:
The part of button1 outside the container does not respond to clicks.
Solution:
subclass your container from UIView:
class Container: UIView {
    override func pointInside(point: CGPoint, withEvent event: UIEvent?) -> Bool {
        let closeButton = viewWithTag(10086) as! UIButton // <-- note the tag value
        if closeButton.pointInside(convertPoint(point, toView: closeButton), withEvent: event) {
            return true
        }
        return super.pointInside(point, withEvent: event)
    }
}
Don't forget to give your button1 a tag of 10086
@dugla, thank you for the question!
@Bastian and @laoyur, thanks to you for the answers!
Swift 4
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    if yourChildView.point(inside: convert(point, to: yourChildView), with: event) {
        return true
    }
    return super.point(inside: point, with: event)
}
I had the same exact problem at hand. You only need to override:
-(BOOL) pointInside:(CGPoint)point withEvent:(UIEvent *)event
Here is working code for a custom UIView subclass created solely to be tappable for its out-of-bounds children.
@implementation ARQViewToRightmost

// We should not need to override this one !!!
/*
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return self;
    return nil;
}
*/

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if ([self isInside:point])
        return YES; // This answer makes hitTest continue recursively probing the view's subviews for the hit.
    return NO;
}

// Check to see if the point is within the bounds of the view. We've extended the bounds to the max right.
// Make sure you do not have any sibling UIViews to the right of this one, as they will never receive events.
- (BOOL)isInside:(CGPoint)point
{
    CGFloat maxWidth = CGFLOAT_MAX;
    CGRect rect = CGRectMake(0, 0, maxWidth, self.frame.size.height);
    if (CGRectContainsPoint(rect, point))
        return YES;
    return NO;
}

@end
