Since CCTargetedTouchDelegate was removed in cocos2d 3.0, how can I handle touch swallowing?
I have a small sprite on top of a canvas node (a layer in 2.x) and need to give the sprite a higher touch priority than the canvas. When the user touches within the sprite's bounding box, the touch should be swallowed; otherwise the canvas should respond to it.
In cocos2d 3.0, touches are handled in reverse z-order. This means that if your sprite is rendered on top of the canvas node, it already receives the touch first and has a chance to swallow it.
However, for the sprite to receive and swallow the touch you should follow these steps:
1. Create a separate class for your sprite and inherit it from CCSprite.
2. Set self.userInteractionEnabled to YES in the init method of this class.
3. Add an empty touchBegan: method.
This will swallow the touch: without a call to [super touchBegan:...] inside your touchBegan: method, the touch is not passed on to the underlying nodes.
The default implementation in CCSprite (and all the way up to CCNode) does call [super touchBegan:...], which is why you need to create a subclass and override this behaviour.
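For illustration, a minimal sketch of such a subclass (the class name SwallowSprite is made up):
// SwallowSprite.m
@implementation SwallowSprite

- (instancetype)init
{
    if ((self = [super init]))
    {
        // Opt in to touch handling for this node
        self.userInteractionEnabled = YES;
    }
    return self;
}

// Empty override: the touch is claimed here and never
// reaches the canvas node underneath
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
}

@end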
In case you do need to pass touches to underlying nodes in some cases, you can write something like this:
-(void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (_passToUnderlyingNode == YES)
    {
        //passed to canvas node
        [super touchBegan:touch withEvent:event];
    }
    else
    {
        //swallowed
    }
}
I'm trying to determine if a point is inside a node that's been rotated. I've tried the following code in my scene:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch begins */
    UITouch *touch = [touches anyObject];
    if ([self.cardNode containsPoint:[touch locationInNode:self.cardNode.parent]]) {
        NSLog(@"HIT");
    }
}
But it seems that the node's frame is still the same as when it's not rotated. In other words, points outside of the rotated node (which would be inside the node when it's not rotated) are considered to be inside it.
I've also tried using CGRectContainsPoint which produced the same results.
How can I determine if a point is in a rotated node?
If you have two textures, you can use the .contactTestBitMask property on the two physics bodies; when they overlap you get a notification and can do whatever you want (note that this only produces a notification, no actual collision response is required). Read up on the physicsBody property and the didBeginContact: method.
Because the physics bodies are created from the textures, rotating a node rotates its body along with its texture, so this gives you exactly what you want.
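A rough sketch of that setup in Objective-C, assuming a SpriteKit scene that conforms to SKPhysicsContactDelegate; the node names and category masks are placeholders:
static const uint32_t kCardCategory  = 0x1 << 0;
static const uint32_t kOtherCategory = 0x1 << 1;

// Somewhere in the scene's setup:
self.physicsWorld.contactDelegate = self;

cardNode.physicsBody = [SKPhysicsBody bodyWithTexture:cardNode.texture size:cardNode.size];
cardNode.physicsBody.categoryBitMask = kCardCategory;
cardNode.physicsBody.contactTestBitMask = kOtherCategory;
cardNode.physicsBody.collisionBitMask = 0; // notification only, no collision response

otherNode.physicsBody.categoryBitMask = kOtherCategory;

// Called as soon as the two bodies start overlapping
- (void)didBeginContact:(SKPhysicsContact *)contact
{
    NSLog(@"Contact between %@ and %@", contact.bodyA.node, contact.bodyB.node);
}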
But if you want to check whether your touch is inside a particular node, you can use
yourTouch.locationInNode(SOME_NODE)
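For example, to make a hit test that respects rotation you can ask for the touch location in the node's own (rotated) coordinate space and compare it against a rect built from the node's size; a rough sketch, assuming cardNode is an SKSpriteNode:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    // Location expressed in cardNode's local, rotated coordinate space
    CGPoint local = [touch locationInNode:self.cardNode];

    // Local bounds of the sprite, taking its anchor point into account
    CGSize size = self.cardNode.size;
    CGPoint anchor = self.cardNode.anchorPoint;
    CGRect localBounds = CGRectMake(-size.width * anchor.x,
                                    -size.height * anchor.y,
                                    size.width,
                                    size.height);

    if (CGRectContainsPoint(localBounds, local))
    {
        NSLog(@"HIT");
    }
}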
Hope it helps!
I am unable to figure out why. Could someone please point out where the bug is?
Xcode 6.1.1
Cocos2d 3.1.0
I used a breakpoint to see whether the touch method is getting called or not.
It's never called when I test and touch on the device.
I used this line below in the main method:
self.userInteractionEnabled = YES;
Also, [super onEnter] is called below the main method.
Make certain your object instance has a content size set in onEnter. Touch may be enabled, but with a (0,0) content size, touches are not dispatched. Also, you must have a touchBegan: method in your code. The following lines are pretty much 'boilerplate' for my UI-bearing classes:
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
}

- (void)onEnter {
    [super onEnter];
    self.userInteractionEnabled = YES;

    // replace the following two lines by whatever works for your
    // specifics, but make certain that you have them and that
    // the object's geometry falls in the display screen.
    // setPositionInPointsW is in my own CCNode category, not in cocos2d
    [self setPositionInPointsW:self.viewPort.origin];
    self.contentSizeInPoints = self.viewPort.size;
}
How can I get the touch location on a sprite's body so I know what force to apply to it?
For instance, if the sprite and body are rectangular and the player touches the top half of the rectangle, I'd want to apply a downward force; if the player touches the middle of the rectangle, I don't want to apply any force at all.
I can work out how to check whether the player's finger touched the object, but I'm not sure how to go about calculating where on the object's body the touch landed.
It would be better if you showed your code, but generally you can get the touch location inside the sprite using either the locationInNode: or convertToNodeSpace: method, depending on what you already have.
Something like this:
-(void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    //Check if you touched the sprite in case
    //you're handling touches in your scene.
    //If you subclassed CCSprite and are handling touches there,
    //you don't have to do anything here.
    CCSprite *yourSprite = ...;

    //Touch location in the sprite's own coordinate space
    CGPoint touchLocation = [touch locationInNode:yourSprite];

    //Compare against the sprite's content size (node space),
    //not its boundingBox, which is expressed in the parent's space
    float halfHeight = yourSprite.contentSize.height * 0.5f;
    if (touchLocation.y >= halfHeight)
    {
        //Upper part
    }
    else
    {
        //lower part
    }
}
Note that you can handle touches in the scene, in which case you first need to check whether you touched your sprite at all, or you can subclass CCSprite and implement the touchBegan: method to get notified when the player touches your sprite (then you don't need to check anything and yourSprite becomes self).
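If you then want to turn that into a force, here is a rough continuation of the sketch above, assuming the sprite has a CCPhysicsBody attached and that your cocos2d version exposes applyImpulse: on it (the impulse value is arbitrary):
if (touchLocation.y >= halfHeight)
{
    // Touched the upper half: push the body downward
    [yourSprite.physicsBody applyImpulse:ccp(0.0f, -50.0f)];
}
else
{
    // Touched the lower half or the middle band: apply nothing,
    // or an upward impulse, depending on your rules
}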
I have a view where I'm drawing lines. When I draw a line with two or more fingers, there is some weird behaviour. That's why I want to disable multi-touch on this view.
I tried :
self.drawingView.multipleTouchEnabled = NO;
self.drawingView.exclusiveTouch = YES;
But it has no effect, and my touches methods are still called.
Ideally, when I try to draw with two fingers, nothing should happen. Is there a solution?
Thanks :)
In your touches methods (Began/Moved), check how many touches are on screen; if there is only one touch, handle it, otherwise pass it along. Example for touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ((touches.count == 1) && ([event allTouches].count == 1)) {
        // handle single finger touch moves here
        ....
    } else {
        // If more than one touch, pass it along
        [super touchesMoved:touches withEvent:event];
    }
}
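The same filter can be applied in touchesBegan: so a second finger never starts a line in the first place; a minimal sketch along the same lines:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ((touches.count == 1) && ([event allTouches].count == 1)) {
        // start the new line for a single-finger touch here
    } else {
        // More than one finger: ignore it / pass it along
        [super touchesBegan:touches withEvent:event];
    }
}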
I've been busy for a few days trying to figure out how to handle touch in my Cocos2d project. The situation is a bit different than normal. I have a few different game layers that have items on them that I need to control with touch:
ControlLayer: Holds the game controls (movement, action button). This layer is on top.
GameplayLayer: Holds the game objects (CCSprites). This layer is directly beneath the ControlLayer.
Now my touches work fine in the ControlLayer; I can move my playable character around and make him jump and do other silly stuff. Yet I cannot figure out how to implement touches for some of my CCSprites.
The information I've gathered so far makes me think I need to get all my touch input from the control layer. Then I somehow need to 'cascade' the touch information to the GameplayLayer so I can handle the input there. Another option would be to get the CGRect information from my sprites by creating an array of pointers to the objects that should be touchable, and use that information in the ControlLayer to check, for each item in the list, whether it was touched.
What is the best option, and how do I implement it? I'm kind of new to programming with Cocoa and Objective-C, so I'm not really sure what the best option is for this language, nor how to access a sprite's CGRect information ([mySpriteName boundingBox]) in another class than the layer it is rendered in.
At the moment the only way I'm sure would work is to create duplicate CGRects for each CCSprite's position so I can check them, but I know this is not the right way to do it.
What I have so far (to test) is this:
ControlLayer.m
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGRect rect = CGRectMake(0.0f, 0.0f, 100.0f, 100.0f);

    //Tried some stuff here to see if I could get a sprite by tag name
    //so I could use its bounding box, but that didn't work

    // Check for touch with specific location
    if (CGRectContainsPoint([tree boundingBox], location)) {
        CCLOG(@"CGRect contains the location, touched!");
    }
    CCLOG(@"Layer touched at %@", NSStringFromCGPoint(location));
}
Thanks in advance for helping me!
The easiest and simplest way to solve your problem, IMO, is to use ccTouchBegan/Moved/Ended instead of ccTouchesBegan/Moved/Ended. Meaning, you handle a single touch at a particular moment, so you avoid getting confused over multiple touches; plus, the most important feature of ccTouchBegan is that a CCLayer can 'consume' the touch and stop it from propagating to the next layers. More explanation after the code samples below.
Here are the steps. Implement this set of methods in every CCLayer subclass that should handle touch events:
First, register with CCTouchDispatcher:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
Next, implement ccTouchBegan; the example below is from a game I've created (some parts omitted, of course):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (scene.state != lvlPlaying) {
        // don't accept touch if not playing
        return NO;
    }
    CGPoint location = [self convertTouchToNodeSpace:touch];
    if (scene.mode == modePlaying && !firstTouch) {
        if (CGRectContainsPoint(snb_putt.sprite.boundingBox, location)) {
            touchOnPutt = touch.timestamp;
            // do stuff
            // return YES to consume the touch
            return YES;
        }
    }
    // default to not consume touch
    return NO;
}
Finally, implement ccTouchMoved and ccTouchEnded like their ccTouches* counterparts, except that they handle a single touch instead of a set of touches. The touch passed to these methods is restricted to the one consumed in ccTouchBegan, so there is no need to do any validation in them.
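Minimal stubs for those two methods, just to illustrate the signatures (the bodies are up to your game logic):
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    // touch is guaranteed to be the one consumed in ccTouchBegan
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // ... drag / aim logic ...
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // ... release / fire logic ...
}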
Basically this is how it works. A touch event is passed by CCScene to each of its CCLayers one by one based on z-ordering (i.e., starting from the top layer and going down to the bottom layer), until one of the layers consumes the touch. So if a layer at the top (e.g. the control layer) consumes the touch, the touch won't be propagated to the next layer (e.g. the object layer). This way each layer only has to worry about itself when deciding whether to consume the touch or not. If it decides the touch cannot be used, it simply doesn't consume it (returns NO from ccTouchBegan) and the touch automatically propagates down the layers.
Hope this helps.