I'm trying to enable touch for layers as many other people have suggested online:
hudLayer = [[[CCLayer alloc] init] autorelease];
[self addChild:hudLayer z:3];
gameLayer = [[[CCLayer alloc] init] autorelease];
[self addChild:gameLayer z:1];
gameLayer.isTouchEnabled = YES;
rileyLayer = [[[CCLayer alloc] init]autorelease];
[self addChild:rileyLayer z:2];
pauseMenu = [[[CCLayer alloc] init] autorelease];
[self addChild:pauseMenu z:4];
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:hudLayer priority:0 swallowsTouches:YES];
and my touch methods are here:
- (BOOL)ccTouchBegan:(NSSet *)touch withEvent:(UIEvent *)event {
return TRUE;
}
- (void)ccTouchEnded:(NSSet *)touch withEvent:(UIEvent *)event {
if (!paused) {
ratMove = 0;
}
}
However, this continually throws the error: Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Layer#ccTouchBegan override me'
The only reason I could find for this error online is not implementing the ccTouchBegan method at all, but I am implementing it. Does anyone know any other reason for this error to appear?
Subclass CCLayer for your HUD layer, then implement the touch methods inside it.
Since you add your HUD layer as a targeted delegate, it must implement at least ccTouchBegan:withEvent:. So if you want your HUD to be a targeted delegate, make a CCLayer subclass and implement the methods from the targeted touch delegate protocol there.
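A minimal sketch of such a subclass (assuming cocos2d 2.x, where the dispatcher hangs off CCDirector; the class name is illustrative):

```objc
// HudLayer.h
#import "cocos2d.h"

@interface HudLayer : CCLayer <CCTargetedTouchDelegate>
@end

// HudLayer.m
#import "HudLayer.h"

@implementation HudLayer

- (void)onEnter {
    [super onEnter];
    // Register this layer itself as a targeted touch delegate.
    [[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self
                                                              priority:0
                                                       swallowsTouches:YES];
}

- (void)onExit {
    // Unregister, otherwise the dispatcher keeps a reference to this layer.
    [[[CCDirector sharedDirector] touchDispatcher] removeDelegate:self];
    [super onExit];
}

// Note the UITouch (not NSSet) parameter: this is the targeted signature.
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    return YES; // claim the touch
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // handle the tap here
}

@end
```

On cocos2d 1.x the registration call is [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:...] instead.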
Your method does not have the appropriate signature. Try:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
// your stuff here
}
If you want multi-touch handling (which your NSSet-based signature implies), you should register with addStandardDelegate: instead of as a targeted touch delegate.
Edit: and now in Objective-C:
[[CCDirector sharedDirector].touchDispatcher addStandardDelegate:self priority:0];
There are two protocols implemented by the touch dispatcher. You are currently registering as a targeted touch delegate but implementing the delegate methods of the standard delegate. Use the line above to register if you want to keep your methods.
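For instance, keeping your NSSet-based methods would look roughly like this (note the plural ccTouches... names; paused and ratMove are from your own code):

```objc
// Register as a standard (multi-touch) delegate, e.g. in the layer's init:
[[CCDirector sharedDirector].touchDispatcher addStandardDelegate:self priority:0];

// Then rename your methods to the plural, NSSet-based standard forms:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // handle the beginning of touches
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!paused) {
        ratMove = 0;
    }
}
```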
Edit 2: and now the exact syntax of the protocols, straight from cocos2d's code. As you can see, there is no ccTouchBegan taking an NSSet (your signature), but instead ccTouchesBegan. Whichever handling method you prefer (targeted or standard), you must conform to the protocols below.
@protocol CCTargetedTouchDelegate
/** Return YES to claim the touch.
 @since v0.8
 */
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event;
@optional
// touch updates:
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event;
- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event;
- (void)ccTouchCancelled:(UITouch *)touch withEvent:(UIEvent *)event;
@end
/**
 CCStandardTouchDelegate.
 This type of delegate is the same one used by CocoaTouch. You will receive all the events (Began, Moved, Ended, Cancelled).
 @since v0.8
 */
@protocol CCStandardTouchDelegate <NSObject>
@optional
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
@end
Related
I use the following code to add a gesture recognizer to the top-right side of the UINavigationBar, but it fires if I tap anywhere on the navbar. How am I supposed to restrict the gesture to the top-right corner?
- (void)handleGestureForTopRightBarButtonItem:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint p = [gestureRecognizer locationInView:self.navigationController.navigationBar];
    CGRect barBounds = self.navigationController.navigationBar.bounds;
    if (CGRectContainsPoint(CGRectMake(barBounds.size.width - 20, 0, 100, barBounds.size.height), p)) {
        NSLog(@"got a tap on the right side in the region i care about");
    } else {
        NSLog(@"got a tap on the right side, but not where i need it");
    }
}
Since a UIGestureRecognizer reports to a single target object, there are a couple of ways to solve this. UIGestureRecognizer was not meant to be stacked multiple times on the same view; if you do that, you will very likely drain more energy than you need, on top of the lost CPU power and all the comparison code needed to distinguish the running recognizers. But it would work.
a) write code that compares its coordinates and expected values and if they match in the range you want do your actions.
b) create another object that is living only in the coordinates you want and has it own UIGestureRecognizer. Not ideal, as written above.
c) use the power of UIControl: controls are UIViews, which in turn are UIResponders.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (void)endTrackingWithTouch:(nullable UITouch *)touch withEvent:(nullable UIEvent *)event; // touch is sometimes nil if cancelTracking calls through to this.
- (void)cancelTrackingWithEvent:(nullable UIEvent *)event;
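For example, a hypothetical UIControl subclass (the class name and corner size are illustrative) could use these tracking hooks to react only to touches in one corner:

```objc
#import <UIKit/UIKit.h>

// Illustrative subclass: tracks only touches that start in its top-right corner.
@interface CornerControl : UIControl
@end

@implementation CornerControl

- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint p = [touch locationInView:self];
    CGRect corner = CGRectMake(self.bounds.size.width - 44.0, 0, 44.0, 44.0);
    return CGRectContainsPoint(corner, p); // refuse touches outside the corner
}

- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    [super endTrackingWithTouch:touch withEvent:event];
    // Fire the control's actions just like a button would.
    [self sendActionsForControlEvents:UIControlEventTouchUpInside];
}

@end
```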
d) use the power of UIView without a UIGestureRecognizer. By the way, this approach basically works on CALayers as well, although they do not have the sendAction methods, since they are not UIControls.
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
BOOL inside = [super pointInside:point withEvent:event];
if (inside && event.type == UIEventTypeTouches) {
//[super sendActionsForControlEvents:UIControlEventTouchDown];
}
return inside;
}
e) code some class that inherits from UIResponder (which is basically what UIControls do) and use its API instead, so you can make use of the touch coordinates as well.
-(void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
[self someCoRoutineWithTouch:touch];
}
//[super touchesBegan:touches withEvent:event];
}
-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
//do some stuff per touch
}
//[super touchesMoved:touches withEvent:event];
}
-(void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
//do some stuff per touch
}
//[super touchesEnded:touches withEvent:event];
}
-(void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
[self touchesEnded:touches withEvent:event];
}
Just don't forget that UIGestureRecognizer is meant for gestures, which are made of touches (even multiple touches), and not primarily for catching touches in general, despite the many examples that use one instead of coding a proper UIControl. Also don't forget that at a device's edges, gesture recognition is limited by the nature of finger size.
I searched but do not quite understand why we can't detect a UITouch on a UITableView. What I have right now is a view controller with a table view located in its view. Please look at the picture below for reference.
In the implementation class, I set a breakpoint in each of the UITouch methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
I noticed that these breakpoints are hit if and only if you touch outside of the table view (the orange area).
I do not get it. I thought UITableView is a subclass of UIScrollView, which is a subclass of UIView, which is a subclass of UIResponder. That means the touch methods should be invoked (correct me if I am wrong).
All comments are welcomed and appreciated.
Rather than tampering with the table view, use a gesture recognizer. You can act as the delegate to ensure that all interactions work concurrently and enable and disable the gestures if / as required.
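A sketch of that approach (the selector name is illustrative; the view controller is assumed to conform to UIGestureRecognizerDelegate):

```objc
// In viewDidLoad: attach a tap recognizer to the table view.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(tableTapped:)];
tap.delegate = self;            // so the delegate callback below is consulted
tap.cancelsTouchesInView = NO;  // let the table still receive the touch
[self.tableView addGestureRecognizer:tap];

// UIGestureRecognizerDelegate: let the tap run alongside the table's
// built-in recognizers (pan, cell selection, etc.).
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
    return YES;
}
```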
You can detect the touches methods in the UITableView by subclassing it like this.
I tested it and it printed "Test" successfully:
//TestTable.h
#import <UIKit/UIKit.h>
@interface TestTable : UITableView
@end

//TestTable.m
#import "TestTable.h"
@implementation TestTable
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Test");
    // forward to super so the table view keeps working normally
    [super touchesBegan:touches withEvent:event];
}
@end
Tables utilize scroll views to handle panning, which use a pan gesture recognizer. Why not just tap into that?
CGPoint location = [self.tableView.panGestureRecognizer locationInView:self.tableView];
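For example, since UITableViewDelegate conforms to UIScrollViewDelegate, you can read the finger position from that recognizer in a scroll callback (a sketch):

```objc
// This can live in the same view controller that serves as the
// table view's delegate.
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView {
    CGPoint location = [self.tableView.panGestureRecognizer
                            locationInView:self.tableView];
    NSLog(@"drag started at %@", NSStringFromCGPoint(location));
}
```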
If you wish to detect the touches on a UITableView, create a subclass of UITableView and implement the UIResponder method canBecomeFirstResponder.
@interface MyTableView : UITableView
@end

@implementation MyTableView
- (BOOL)canBecomeFirstResponder {
    return YES;
}

// then implement all the other touch-related methods; remember to call super
// in these methods so the events are correctly forwarded to other responders
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
}
@end
In an iOS app, I have a UIWebView which I use to display some contents and to dismiss this UIWebView I use the following mechanism:
I put a DoubleTapView (subclass of UIView) on top of the UIWebView.
This DoubleTapView catches the double-tap events and executes the adequate code to dismiss itself as well as the UIWebView.
Up to this point it works fine.
Here is the problem:
All the events that the UIWebView is interested in (scrolling the contents, tapping links, etc.) don't go through; they are blocked by the DoubleTapView. How can I solve this?
In fact, for what I just described I could have used a UIView instead of a DoubleTapView.
But I intended to subclass UIView with DoubleTapView and then implement:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
or
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event;
to solve my problem, but I must be doing something wrong, because it does not work (at least not the way I did it).
Here is some code I wrote to put the parts together:
webView = [[UIWebView alloc] initWithFrame:varFrame];
webView.delegate = self;
quitGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(quitWeb:)];
quitGesture.numberOfTapsRequired = 2;
varFrame.origin = CGPointZero;
quitSensor = [[DoubleTapView alloc] initWithFrame:varFrame];
quitSensor.userInteractionEnabled = YES;
[quitSensor addGestureRecognizer:quitGesture];
[webView addSubview:quitSensor];
and here is what I tried when implementing the methods for DoubleTapView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[[self superview] touchesBegan:touches withEvent:event];
}
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
[[self superview] motionBegan:motion withEvent:event];
}
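As an alternative sketch (not part of the original setup): attaching the recognizer directly to the web view avoids the overlay entirely, since a gesture recognizer observes touches without blocking the view's other events:

```objc
// No DoubleTapView overlay at all: put the double-tap recognizer
// straight on the web view. Scrolling and link taps keep working.
UITapGestureRecognizer *quitGesture =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(quitWeb:)];
quitGesture.numberOfTapsRequired = 2;
[webView addGestureRecognizer:quitGesture];
```

Note that UIWebView has its own internal double-tap (zoom) recognizers, so you may also need to act as the recognizer's delegate and allow simultaneous recognition.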
I'm trying to implement ccTouchesBegan in my GameScene.
I've set isTouchEnabled = YES. I'm also calling addStandardDelegate on the touchDispatcher. Finally, in my AppDelegate I have [glView setMultipleTouchEnabled:YES].
However, ccTouchesBegan is never called.
What am I doing wrong?
Solved it!
I was registering touches on a previous layer, and that layer wasn't being deallocated because you have to "un-register" it in the onExit method.
Long story short: touchesBegan was not being called on my GameLayer because it was being swallowed by another layer.
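The fix, roughly, is to unregister in the old layer's onExit (assuming cocos2d 2.x's dispatcher on CCDirector):

```objc
- (void)onExit {
    // Unregister from the dispatcher; otherwise it retains the layer
    // and keeps routing (and possibly swallowing) touches to it.
    [[[CCDirector sharedDirector] touchDispatcher] removeDelegate:self];
    [super onExit];
}
```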
Create one dummy application and try this:
@protocol CCStandardTouchDelegate <NSObject>
@optional
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
@end
Maybe it will help you.
I subclassed the UIWindow, but for some reason - (void)sendEvent:(UIEvent *)event gets called twice for any interaction with the screen. Any ideas why that would happen?
For debugging purposes, subclass the window (of the app delegate) and override the sendEvent: method:
- (void)sendEvent:(UIEvent *)event
{
    NSLog(@"%@", event);
    [super sendEvent:event];
}
Most probably, you will notice the events responsible for touchesBegan and touchesEnded (for tapping). This can be tested by subclassing the view and overriding the touch-related methods.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesBegan");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesMoved");
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesEnded");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesCancelled");
}
Also, try a drag/swipe on the view to notice the change in the count of events sent to the view :)
sendEvent: gets called once when a finger goes down and once when all fingers are up, which is why a single tap produces two calls.
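You can confirm this by logging the touch phases inside the override (a sketch):

```objc
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        for (UITouch *touch in [event allTouches]) {
            // One call arrives with phase Began, another with phase Ended:
            // that is the pair of sendEvent: calls seen for a single tap.
            NSLog(@"touch phase: %ld", (long)touch.phase);
        }
    }
    [super sendEvent:event];
}
```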