I've been busy for a few days trying to figure out how to handle touch in my Cocos2d project. The situation is a bit different than usual: I have a few different game layers with items on them that I need to control with touch:
ControlLayer: Holds the game controls
(movement, action button). This layer is on top.
GameplayLayer: Holds the game objects
(CCSprites). This layer is directly beneath the ControlLayer.
Touches work fine in the ControlLayer: I can move my playable character around, make him jump, and do other silly stuff. Yet I cannot figure out how to implement touches for some of my CCSprites.
The information I've gathered so far makes me think I need to get all my touch input from the control layer. Then I somehow need to 'cascade' the touch information down to the GameplayLayer so I can handle the input there. Another option would be to get the CGRect information from my sprites by creating an array of pointers to the objects that should be touchable, and use that information in the ControlLayer to check whether each item in that list was touched.
What is the best option to do this, and how do I implement it? I'm kind of new to programming with Cocoa and Objective-C, so I'm not really sure what the best option is for this language, or how to access a sprite's CGRect information ([mySpriteName boundingBox]) from a class other than the layer it is rendered in.
At the moment the only way I'm sure would work is to create duplicate CGRects for each CCSprite position so I can check them, but I know this is not the right way to do it.
What I have so far (to test) is this:
ControlLayer.m
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGRect rect = CGRectMake(0.0f, 0.0f, 100.0f, 100.0f);
    // Tried some stuff here to see if I could get a sprite by tag so I could use its bounding box, but that didn't work

    // Check for a touch at a specific location
    if (CGRectContainsPoint([tree boundingBox], location)) {
        CCLOG(@"CGRect contains the location, touched!");
    }
    CCLOG(@"Layer touched at %@", NSStringFromCGPoint(location));
}
Thanks in advance for helping me!
The easiest and simplest way to solve your problem, IMO, is to use ccTouchBegan/Moved/Ended instead of ccTouchesBegan/Moved/Ended. That way you handle a single touch at a particular moment, so you avoid getting confused over multiple touches. More importantly, with ccTouchBegan a CCLayer can 'consume' the touch and stop it from propagating to the next layers. More explanation after the code samples below.
Here are steps to do it. Implement these sets of methods in all CCLayer subclasses that should handle touch events:
First, register with CCTouchDispatcher:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
Next, implement ccTouchBegan; the example below is from a game I've created (some parts omitted, of course):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (scene.state != lvlPlaying) {
        // don't accept the touch if not playing
        return NO;
    }
    CGPoint location = [self convertTouchToNodeSpace:touch];
    if (scene.mode == modePlaying && !firstTouch) {
        if (CGRectContainsPoint(snb_putt.sprite.boundingBox, location)) {
            touchOnPutt = touch.timestamp;
            // do stuff

            // return YES to consume the touch
            return YES;
        }
    }
    // default to not consuming the touch
    return NO;
}
And finally, implement ccTouchMoved and ccTouchEnded like their ccTouches* counterparts, except that they handle a single touch instead of a set of touches. The touch passed to these methods is restricted to the one consumed in ccTouchBegan, so there is no need to do validation in these two methods.
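For completeness, a minimal sketch of what those two methods could look like (the bodies are placeholders; touchOnPutt is the variable from the example above):

- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    // this is guaranteed to be the touch consumed in ccTouchBegan
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // drag/update the object that was grabbed in ccTouchBegan using location
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // finish the interaction started in ccTouchBegan
    touchOnPutt = 0;
}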
Basically this is how it works: a touch event is passed by the CCScene to each of its CCLayers one by one based on z-ordering (i.e. starting from the top layer down to the bottom layer), until one of the layers consumes the touch. So if a layer at the top (e.g. the control layer) consumes the touch, it won't be propagated to the next layer (e.g. the object layer). This way each layer only has to decide for itself whether to consume the touch or not. If it decides the touch cannot be used, it simply does not consume it (returns NO from ccTouchBegan) and the touch automatically propagates down the layers.
Hope this helps.
Since CCTargetedTouchDelegate has been removed in cocos2d 3.0, I would like to know how I can handle touch swallowing.
I have a small sprite on top of a canvas node (a layer in 2.x) and need to set the priority of the sprite higher than the canvas. When a user touches within the sprite's bounding box, the touch should be swallowed; otherwise the canvas should respond to it.
In cocos2d 3.0, touches are handled in reverse z-order. This means that if your sprite is rendered on top of the canvas node, it already receives the touch notification first and has a chance to swallow it.
However, to receive and swallow the touch by the sprite you should follow these steps:
1. Create a separate class for your sprite and inherit it from CCSprite.
2. Set self.userInteractionEnabled to YES in the init method of this class.
3. Add an empty touchBegan: method.
This will swallow the touch, because without calling [super touchBegan:...] in the touchBegan: method, you won't pass it to the underlying nodes.
The default implementation of touchBegan: in CCSprite (and all the way up to CCNode) calls [super touchBegan:...], which is why you need to create a subclass and override this behaviour.
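As an illustration, a minimal version of such a subclass could look like this (SwallowingSprite is just a made-up name):

// SwallowingSprite.h
@interface SwallowingSprite : CCSprite
@end

// SwallowingSprite.m
@implementation SwallowingSprite

- (id)init {
    if ((self = [super init])) {
        // make cocos2d 3.0 deliver touches to this node
        self.userInteractionEnabled = YES;
    }
    return self;
}

// empty override: by not calling [super touchBegan:...] the touch
// is swallowed and never reaches the canvas node underneath
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
}

@end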
In case you do need to pass touches to underlying nodes in some cases, you can write something like this:
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    if (_passToUnderlyingNode == YES)
    {
        // pass to the canvas node
        [super touchBegan:touch withEvent:event];
    }
    else
    {
        // swallowed
    }
}
In iOS 7's SpriteKit framework, I am attempting to build a simple game for the purposes of learning the framework. One area I am tripping over a little bit is how to detect a specific node when there are multiple nodes overlapping under a touch. Let me give an example:
In a basic chess training game, I can drag a piece forward one tile, but what happens after that is dependent on what other nodes are in that space. I want to know which tile the touch is on, regardless of any other nodes which happen to also be on that tile node. The problem I am running into is that the touch seems to detect the uppermost node. So my question would be:
What is the recommended solution for detecting the tile node? I was thinking about using zPosition in some way but I have yet to determine how to do that. Any suggestions?
Another approach would be to detect ALL nodes under a touch. Is there a way to grab all nodes and put them in an array?
Iterate through the nodes at the touched point:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    NSArray *nodes = [self nodesAtPoint:location];
    for (SKNode *node in nodes) {
        // go through the nodes; get the zPosition if you want
        CGFloat nodePos = node.zPosition;
        // or check the node against your known nodes
        if ([node.name isEqualToString:@"myNode1"]) {
            //...
        }
        if ([node.name isEqualToString:@"myNode2"]) {
            //...
        }
    }
}
You could calculate the tile at a given point, since you would know the frame of the board and therefore the size of each tile.
E.g. assume you have a board whose frame is {10, 50, 160, 160}; for an 8x8 board each tile would therefore be 20x20. If you have a touch at point {x, y}, the index of the column touched is (x-10)/20 and the index of the row is (y-50)/20. Oh, and you would probably need to use floorf in that calculation too.
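For example, a small sketch of that calculation inside touchesBegan:withEvent: (assuming the {10, 50, 160, 160} board frame and an 8x8 board, i.e. 20x20 tiles):

CGPoint location = [touch locationInNode:self];

// offset by the board's origin {10, 50}, then divide by the tile size
NSInteger column = (NSInteger)floorf((location.x - 10.0f) / 20.0f);
NSInteger row    = (NSInteger)floorf((location.y - 50.0f) / 20.0f);

// ignore touches that land outside the 8x8 board
if (column >= 0 && column < 8 && row >= 0 && row < 8) {
    // look up your tile node for (column, row), e.g. from an array you maintain
}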
Alternatively, and to actually answer your question, you could use the nodesAtPoint: method to get all the nodes at a given point :)
I have a custom view that has multiple subviews. They are all circles on the screen, sort of like three wheels of different radii on top of each other. I'm trying to make them receive UITouch events correctly so they spin with the finger. Since the shapes are actually squares on the screen, when a bigger one flips and its touchable area enters the frame of a circle above it, it becomes untouchable.
So, I created another subview on top of others that will calculate the distance of the touch point to the center and distribute the touch event accordingly. I can think of several ways of doing it, but I was wondering what would be the most elegant, and most correct way of handling a situation like this.
This is what I've done so far: my custom view has a delegate, and that delegate is assigned to my main viewController. I have three protocol methods in my custom view, one for each of the three wheels. I'm passing the touch and event along according to the UITouch point, but I'm not sure how I should actually send this data to the views that are supposed to receive it. They are all custom UIControl objects, and they all handle touches via -beginTrackingWithTouch:withEvent:. Since this is a private method, I cannot access it from my viewController. Should I make this method public and access it from the viewController, or is there a more correct way of handling this?
Edit: added the code:
This is how I distribute the touch in the custom UIView object. The calculations work fine.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Distribute the touches according to the touch location.
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    // calculations for the circles
    CGFloat xDistance = (point.x - BIGGEST_CIRCLE_RADIUS);
    CGFloat yDistance = (point.y - BIGGEST_CIRCLE_RADIUS);
    CGFloat distance = sqrtf((xDistance * xDistance) + (yDistance * yDistance));

    // Check to see if the point is in one of the circles, starting from the innermost circle.
    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        [self.delegate smallestCircleReceivedTouch:touch withEvent:event];
    } else if (distance < MIDDLE_CIRCLE_RADIUS) {
        [self.delegate middleCircleReceivedTouch:touch withEvent:event];
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        [self.delegate biggestCircleReceivedTouch:touch withEvent:event];
    } else {
        return;
    }
}
The delegate is the viewController and the circles are custom UIControls. They handle the touch like this:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchPoint = [touch locationInView:self];
    {....}
    return YES;
}
These work fine in themselves, but I'm not sure how should I connect the delegate method to the touch handling of each custom UIControl. Should I call their -beginTrackingWithTouch:withEvent: from the viewController, or should I make them implement the protocol of the customView? Or is there some other way to handle this properly?
Even though I have not tried it, it should not be necessary to do your own calculation; -hitTest:withEvent: should work fine for you.
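For instance, a rough sketch of overriding -hitTest:withEvent: in the container view could look like this (self.smallestCircle, self.middleCircle and self.biggestCircle are assumed properties pointing at the three UIControl wheels; the radius constants come from the question's code):

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    CGFloat xDistance = point.x - BIGGEST_CIRCLE_RADIUS;
    CGFloat yDistance = point.y - BIGGEST_CIRCLE_RADIUS;
    CGFloat distance = sqrtf(xDistance * xDistance + yDistance * yDistance);

    // return the control that should receive the touch;
    // UIKit will then call its tracking methods for you
    if (distance <= SMALLEST_CIRCLE_RADIUS) {
        return self.smallestCircle;
    } else if (distance <= MIDDLE_CIRCLE_RADIUS) {
        return self.middleCircle;
    } else if (distance <= BIGGEST_CIRCLE_RADIUS) {
        return self.biggestCircle;
    }
    return nil; // outside all circles: let the touch fall through
}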
I am using touchesMoved with a coordinate system to detect and respond to user touches within certain areas of the screen. For example, if I have a virtual keyboard and the user swipes across the keys, it reads the coordinates and responds:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 90 && point.x > 20)
    {
        // do something
    }
}
...However, the problem is, if the user slowly drags across the keys, or the border between keys, the method is triggered several times in a row, playing the piano key sound in a stutter.
How can I prevent this stutter? I think setting a minimum delay of 0.25 seconds between successive triggers of the same if statement would help. Also, this delay should only apply to that specific if statement: I want the user to be able to drag across the keys quickly and trigger different keys' if statements as quickly as they want.
Does anyone know how to code something like this?
Try this:
BOOL _justPressed; // Declare this in your @interface
...
- (void)unsetJustPressed {
    _justPressed = NO;
}
Then, in your touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_justPressed) {
        // A key was just pressed, so do nothing.
        return;
    }
    else {
        _justPressed = YES;
        // Do stuff here
        [self performSelector:@selector(unsetJustPressed)
                   withObject:nil
                   afterDelay:0.25];
    }
}
This way, you set _justPressed to YES every time touchesMoved:withEvent: is called (or within a specific conditional in there, depending on what you want to do), and you use performSelector:withObject:afterDelay: to set _justPressed back to NO after a certain time period. You can then check whether _justPressed is YES when touchesMoved: is called to find out whether it was called recently.
Remember, you don't have to return from the method as in the example above; you can simply use _justPressed to check whether you should play the sound while still performing your other actions. The example is just to give you a basic idea of what to do.
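For example, a variant along those lines could look like this (playKeySound and updateHighlightForPoint: are made-up helpers standing in for your own code):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];

    // always do the cheap, non-audible work
    [self updateHighlightForPoint:point];

    // only the sound is gated behind the delay
    if (!_justPressed) {
        _justPressed = YES;
        [self playKeySound];
        [self performSelector:@selector(unsetJustPressed)
                   withObject:nil
                   afterDelay:0.25];
    }
}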
In my application I have many UIButtons dynamically added to the view, and I use the method below to drag them around the view.
// for the drag action
[btnTarget addTarget:self action:@selector(wasDragged:withEvent:)
    forControlEvents:UIControlEventTouchDragInside];

- (void)wasDragged:(UIButton *)button withEvent:(UIEvent *)event
{
    // get the touch
    UITouch *touch = [[event touchesForView:button] anyObject];
    // get the delta
    CGPoint previousLocation = [touch previousLocationInView:button];
    // frame of button changed here
}
I want to stop the dragging action if the dragged button intersects with any other one. I know that I can use a for loop like the one below to check whether any UIButton is intersecting:
for (UIButton *btn in [[button superview] subviews])
{
    // check if the btn frame intersects with any others; if so, break out of the loop
}
I want to know if there is some other way, as the approach above will get slower if the subview count increases to a large number.
Edit: the UIButtons are dynamically added to the UIView (but the total number of subviews won't exceed 120).
Try brute force first. You might be surprised at how well it does. (But remember that a loop through [[button superview] subviews] will contain the button itself, so it will always stop because the button intersects itself. Be sure to exclude the button).
Optimize after you have something working that is demonstrably slow with real data.
If that's really the case, there's a whole lot of algorithmic work done on this problem, which can be summarized as preprocessing the data into structures that allow cheaper initial tests to reject distant objects. This is a good SO answer on the topic, referring to this article.
I don't think the number of subviews that can fill the screen (without intersections) is too large. So use the function:
bool CGRectIntersectsRect (
CGRect rect1,
CGRect rect2
);
to detect whether the frame of the dragged button intersects with another subview.
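Combining that with the brute-force loop from the earlier answer, a simple sketch could be as follows (the helper name is made up):

- (BOOL)button:(UIButton *)button intersectsAnySiblingIn:(UIView *)container
{
    for (UIView *other in container.subviews) {
        // skip the dragged button itself, otherwise it always "intersects"
        if (other == button) {
            continue;
        }
        if (CGRectIntersectsRect(button.frame, other.frame)) {
            return YES;
        }
    }
    return NO;
}

If this returns YES inside wasDragged:withEvent:, you can simply skip applying the new frame to the button.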