Finger Tracking and Gesture Recognition - iOS

So I'm developing an app where all my gestures are being recognized. My problem comes when I attempt to add UIImageViews wherever a finger touches the screen.
These views are meant to follow the finger, which they do, but I believe they are swallowing the touches and not allowing the gestures to be recognized. I have tried:
[_fingerHolder1 setUserInteractionEnabled:NO];
[_fingerHolder2 setUserInteractionEnabled:NO];
But it doesn't seem to change anything.
I am adding these to the view in the ccTouchesBegan/Moved/Ended methods, whereas the gestures are being recognized in their respective handlers.
I have looked at using UIPanGestureRecognizer, but I'm having some trouble recognizing the swipes as well as setting the coordinates of the UIImageViews for the finger trackers while doing this. Should I experiment with this more, or is there a different solution?
Any help is appreciated!

The UIImageViews will receive and process the touches, so they will not be forwarded to the cocos2d OpenGL view (which is also a UIView).
To make this work, you need to create a subclass of UIImageView, override each touches… method, and manually forward the event to cocos2d's view. Here's the example for touchesBegan:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}
Use this UIImageView subclass in place of the original ones you use currently.
That will make regular cocos2d touch events work, and it should also make UIGestureRecognizers behave as expected if you've added those to cocos2d's view.
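For completeness, here is a fuller sketch of such a subclass (the class name TouchForwardingImageView is purely illustrative, and it assumes cocos2d's headers are available), forwarding all four touches… methods in the same way:
#import "cocos2d.h"

@interface TouchForwardingImageView : UIImageView
@end

@implementation TouchForwardingImageView

// Forward every touch phase to cocos2d's OpenGL view so the scene
// and any gesture recognizers attached to it still see the touches.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [[CCDirector sharedDirector].view touchesCancelled:touches withEvent:event];
}

@end
Instances of this subclass can then be added in ccTouchesBegan/Moved/Ended exactly as the original UIImageViews were.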

If I understand what you need (please correct me if I'm wrong), you want to move some UIViews when a drag (pan) gesture is detected, but you also add UIImageViews when the user touches the screen, and this blocks the touches.
You should set UIImageView.userInteractionEnabled = YES (for UIImageView it is NO by default); basically, every view that should detect touches needs userInteractionEnabled = YES.
If you want to ignore touches on some subviews, you should implement the
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
method of UIGestureRecognizerDelegate.
To allow different gesture recognizers to work at the same time, implement the method:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
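For example, a minimal sketch of that first delegate method (the _fingerHolder1 and _fingerHolder2 properties are taken from the question, and it assumes the recognizer's delegate is set to the class that owns them), which makes the recognizer ignore touches that start on those subviews:
// Make the recognizer ignore touches that begin on the finger-tracking views.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if (touch.view == _fingerHolder1 || touch.view == _fingerHolder2) {
        return NO; // this recognizer skips touches on the tracker views
    }
    return YES;
}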

Related

Make a UITapGestureRecognizer not consume taps that are not relevant to a view

I've set a UITapGestureRecognizer on a view. Only specific areas within the view need to trigger the view's behavior. I do that by checking whether the exact location of the touch is on actual actionable areas in the view. In cases where the touch is not in these areas, I'd like the tap to propagate upwards in the view hierarchy.
I've tried using the gesture delegate's gestureRecognizer:shouldReceiveTouch: and testing there whether the touch is relevant to the view, but I cannot perform the action there, as it fires on touch, not on tap. I could just perform the relevancy test in shouldReceiveTouch: and perform the action only if the tap action handler gets called, but I find that awkward. Is there a more elegant way to tell iOS that my gesture recognizer decided not to consume the tap?
Based on our discussion, you can evaluate using a combination of the if-else check in the attributedString touch handler and the gestureRecognizer:shouldReceiveTouch: delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if ([touch.view isDescendantOfView:interestingView]) {
        return NO;
    }
    return YES;
}
Hope this helps. If not, then again, I'll reiterate: it would be better if you could share some related code.
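If the actionable areas are not separate subviews but regions inside a single view, a similar sketch is to test the touch location directly in the delegate, so taps outside those regions are never consumed by the recognizer and propagate up the view hierarchy. The actionableRects array (NSValue-wrapped CGRects) is a hypothetical property used only for illustration:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    CGPoint location = [touch locationInView:gestureRecognizer.view];
    // actionableRects: hypothetical NSArray of NSValue-wrapped CGRects
    // describing the areas that should trigger the tap action.
    for (NSValue *rectValue in self.actionableRects) {
        if (CGRectContainsPoint(rectValue.CGRectValue, location)) {
            return YES; // relevant area: let the recognizer handle the tap
        }
    }
    return NO; // not relevant: the tap is not consumed and propagates upwards
}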

Detecting a second touch while holding a first touch in Cocos2d v3 iOS

I'm wanting to be able to detect the following in Cocos2d v3:
A touch is initiated and held, then a second touch occurs somewhere else on the screen. Think of holding with one finger, and tapping with a second.
I've tried to use - (void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event but this is only called the first time the second touch occurs, not subsequently.
To be clear, if I hold a touch on the screen and then tap somewhere else, the above method is called. But if I continue holding the first touch and then tap a second time, the above method is not called.
Also, touchBegan: is only called when the first touch occurs (i.e. the initial holding touch) and touchEnded: is only called when all touches are removed, including the initial holding touch.
I'd like to know:
1) How to recognise the above gesture in Cocos2d v3?
2) If 1) isn't possible, would there be a way to do it with my own gesture recogniser, and how would I implement my own gesture recogniser into Cocos2d v3?
Turns out Cocos2d v3 only responds to a single touch by default.
The solution:
self.multipleTouchEnabled = TRUE;
This means now every new touch will call:
-(void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
And every time a finger is lifted from the screen it will call:
-(void) touchEnded:(UITouch *)touch withEvent:(UIEvent *)event
Even if there are other touches continuing.
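A minimal sketch of the "hold one finger, tap with another" detection built on the methods above; the activeTouches set is a hypothetical NSMutableSet ivar added just for this sketch, and the touchCancelled: override assumes the v3 responder API mirrors touchEnded: (worth verifying against your Cocos2d version):
// In the node's init: enable touches and multi-touch, and create the set.
//   self.userInteractionEnabled = YES;
//   self.multipleTouchEnabled   = YES;
//   _activeTouches = [NSMutableSet set]; // hypothetical ivar

-(void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_activeTouches addObject:touch];
    if (_activeTouches.count > 1) {
        // A new finger went down while at least one other is still held:
        // this is the "hold with one finger, tap with another" case.
    }
}

-(void) touchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_activeTouches removeObject:touch];
}

-(void) touchCancelled:(UITouch *)touch withEvent:(UIEvent *)event
{
    // Treat a cancelled touch like a lifted finger.
    [_activeTouches removeObject:touch];
}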
If you use void HelloWorld::ccTouchesBegan(CCSet *pTouches, CCEvent *pEvent) (the cocos2d-x variant), you can get the touch count using pTouches->count();

Handling a touch event across multiple subviews

I am new to iOS development. I have a custom-drawn view which is composed of multiple subviews covering the target area on screen. Specifically, this is a board game like chess where I use a view for each square. The squares are created as subviews on a UIView, and there is one UIViewController for this. From what I read, I have to implement touchesBegan, touchesEnded, etc. in my UIView to handle these, but none of these methods are getting called. I added these methods on the base view and on all the subviews. So -
How do I simulate these touch events in the iOS Simulator? A mouse click is not triggering the touchesBegan/touchesEnded methods on any view.
Ideally I would like to handle these in the UIViewController because I want to run the touch through some logic. Is that possible? How do I achieve it?
Please refer to THIS.
It is a tutorial in Apple's sample code that describes how to handle touches very nicely.
Just run the sample code and go through the description; you will get a clear idea of how touches work in iOS.
Turns out that when I add the view programmatically and not through the storyboard, the userInteractionEnabled property is set to NO by default. After setting it to YES, the touches methods get called in the view.
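A quick sketch of that fix when the squares are added in code (squareFrame and boardView are illustrative names, not from the original project):
// Squares added programmatically should explicitly allow user interaction;
// UIImageView in particular defaults to userInteractionEnabled = NO.
UIView *squareView = [[UIView alloc] initWithFrame:squareFrame];
squareView.userInteractionEnabled = YES;
[boardView addSubview:squareView];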
Check this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (![touch.view isKindOfClass:[yourView class]])
    {
        // The touch did not start on the view you care about;
        // ignore it or forward it from here.
    }
}
Hope this will help.
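As for handling the touches in the UIViewController, one sketch (not the only approach): UIViewController is itself a UIResponder, so if the square subviews don't override the touches… methods themselves, the events travel up the responder chain to the controller, where you can run the game logic. The tag-based square lookup below is an illustrative assumption:
// In the UIViewController. Assumes each square view's tag encodes its
// board position (purely for illustration).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIView *tappedSquare = touch.view; // the hit-tested square subview
    if (tappedSquare != self.view) {
        NSLog(@"Touched square with tag %ld", (long)tappedSquare.tag);
        // Run the board-game logic for this square here.
    }
}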

iOS - View that handle taps, but let swipes go through to the superview

I have an app with quite a complex UI, there's a big UIView called the cover with a UITableView underneath it. The tableView is configured with a tableHeaderView of the same height as the cover. As the tableView scrolls up, the cover moves up the screen (with various fancy animations) using a UIScrollViewDelegate. To allow users to scroll the tableView by swiping the cover, I've overridden the - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event method to always return false.
I've now added some UIButton views to the cover. I've managed to make them respond to taps by changing the way I've overridden the pointInside method, like this:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    BOOL isInside = [_directionsButton pointInside:[_directionsButton convertPoint:point fromView:self] withEvent:event];
    return isInside;
}
The only problem now is that if you start a swipe gesture on the button, it's caught by the button and the tableView doesn't scroll. I want to be able to ignore swipe gestures on the button (so really let them pass to the view below).
Normally, I would just make the cover view the tableHeaderView, which seems to handle this kind of behaviour really well. However, I can't do this here, due to some unique animations done on the cover as the table scrolls.
Did you try identifying the gestures using gesture recognisers and defining an action method to be called when the specified gesture is detected?
Please check this link; it may help you with that.
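Building on that suggestion, one possible sketch: go back to returning NO unconditionally from pointInside: (so every touch falls through the cover and swipes reach the table view's pan recognizer), and instead attach a UITapGestureRecognizer to the table view that hit-tests the button's frame when a discrete tap is recognized. The handleCoverTap: and directionsButtonTapped names are assumptions for this sketch:
// In the view controller that owns the cover and the table view.
- (void)viewDidLoad
{
    [super viewDidLoad];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleCoverTap:)];
    tap.cancelsTouchesInView = NO; // don't interfere with normal table touches
    [self.tableView addGestureRecognizer:tap];
}

- (void)handleCoverTap:(UITapGestureRecognizer *)recognizer
{
    // Did the tap land on the button drawn on the cover?
    CGPoint pointInButton = [recognizer locationInView:_directionsButton];
    if ([_directionsButton pointInside:pointInButton withEvent:nil]) {
        [self directionsButtonTapped]; // hypothetical action method
    }
}
Since only a recognized tap triggers the action, a swipe that starts over the button is never claimed by it and scrolls the table as usual.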

UIScrollView Subclass never receives touchesBegan message for swipes

I am trying to make a scroll view only scrollable on a certain region. To do this, I am subclassing UIScrollView and overriding touchesBegan (similar to this question).
Here's my (pretty simple) code.
.h
@interface SuppressableScrollView : UIScrollView
@end
.m
#import "SuppressableScrollView.h"
@implementation SuppressableScrollView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touchesBegan touches=%@ event=%@", touches, event);
    [super touchesBegan:touches withEvent:event];
}
@end
touchesBegan is only being called for touches that UIScrollView doesn't normally consume (like taps). Any idea how to intercept all of the touches?
I think I am missing a concept somewhere.
I was recently looking into something similar for UITableViews. UITableView is an extension of UIScrollView. In digging around inside it, I discovered that there are 4 gesture recognisers attached to the UIScrollView to pick up swipes and other things. I would suggest dumping out the gestureRecognizers property to see if any are being created automatically (which I think they are). In that case, the only option I can think of is to remove them, but then the scroll view will not respond to gestures.
So perhaps you need to look at those gesture recognisers and the gesture recogniser delegates you can use to see if there is a better place to hook into.
P.S. Gesture recognisers will automatically start swallowing events once they recognise a gesture in progress.
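As a sketch of where to hook in: UIScrollView acts as the delegate of its own panGestureRecognizer, so a subclass can override gestureRecognizerShouldBegin: and refuse the pan when the touch starts outside the allowed area. The scrollableRegion property is an assumption for illustration, and the delegate behaviour is worth verifying on your target iOS version:
// In SuppressableScrollView: only allow scrolling when the pan starts
// inside scrollableRegion (a hypothetical CGRect property in the scroll
// view's own coordinate space).
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer == self.panGestureRecognizer) {
        CGPoint location = [gestureRecognizer locationInView:self];
        if (!CGRectContainsPoint(self.scrollableRegion, location)) {
            return NO; // the swipe started outside the allowed region
        }
    }
    return YES;
}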
If the frame size is greater than the content size, your touchesBegan method may not fire.
Since it's working only for taps, my guess is that the contentSize of the scroll view is not set properly.
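For reference, a quick sanity check with illustrative numbers only: the contentSize must exceed the scroll view's bounds before there is anything to scroll.
// Illustrative values: content taller than the frame so vertical
// scrolling (and the pan handling that goes with it) can kick in.
scrollView.frame = CGRectMake(0, 0, 320, 480);
scrollView.contentSize = CGSizeMake(320, 1200);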
