I'd really like to get to the bottom of why this code causes intermittent response to touch input... Even with NSLog as the first instruction...
I've just made a new project in cocos2d with Box2D for a game that at this stage needs to do just a few simple things...
A basket appears in the centre of the screen. It must be a b2Fixture and fall onto a surface. Then, if the user touches the screen, I want the basket to zoom to the touch point, and from there the user can drag it around the screen.
When the user lets go, the basket drops... I have this working right now...
However the BUG is that touching the screen doesn't always work... It intermittently responds to touches, and therefore intermittently calls the methods.
As you will see below, I have used NSLog to check when each method is being called. The result is that sometimes you have to lift your finger off the screen and put it back on several times, and then, seemingly at random, it will decide to run the code...
Here's what I've got...
My touch methods....
-(void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"Multi Touch Moved...");
if (_mouseJoint == NULL) return;
UITouch *myTouch = [touches anyObject];
CGPoint location = [myTouch locationInView:[myTouch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
b2Vec2 locationWorld = b2Vec2(location.x/PTM_RATIO, location.y/PTM_RATIO);
_mouseJoint->SetTarget(locationWorld);
}
-(void)ccTouchCancelled:(UITouch *)touch withEvent:(UIEvent *)event
{
NSLog(#"\nThe touch was CANCELED...");
}
-(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
NSLog(#"Single Touch Moved...");
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *myTouch = [touches anyObject];
CGPoint location = [myTouch locationInView:[myTouch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
b2Vec2 locationWorld = b2Vec2(location.x/PTM_RATIO, location.y/PTM_RATIO);
NSLog(#"\n\nTouch did begin...");
if (_mouseJoint != NULL)
{
_mouseJoint->SetTarget(locationWorld);
NSLog(#"The IF statment was met...");
return;
}
NSLog(#"The IF statment was NOT met...Running _mouseJoint setup...");
b2MouseJointDef md;
md.bodyA = _groundBody; // static ground body the joint is anchored to
md.bodyB = _body; // the basket body being dragged
md.target = _body->GetPosition();
md.collideConnected = true;
md.maxForce = 100000000.0f * _body->GetMass(); // large enough to track the finger closely
_mouseJoint = (b2MouseJoint *)_world->CreateJoint(&md);
_body->SetAwake(true); // wake the body so it responds immediately
_mouseJoint->SetTarget(locationWorld);
}
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
if (_mouseJoint != NULL) {
_world->DestroyJoint(_mouseJoint);
_mouseJoint = NULL;
}
}
And this is the interface that some of the code refers to...
@interface HelloWorldLayer : CCLayerColor
{
b2World *_world;
b2Body *_body;
b2Fixture *_bodyFix;
b2MouseJoint *_mouseJoint;
b2Body *_groundBody;
}
The only idea another member helped me determine is that, because I'm working within a single scene and I have/had a CCMenu and a CCLabelTTF on the screen, the CCMenu may still be intercepting touches. If so, how can I destroy the CCMenu after my animation has finished?
Pressing the only button item simply moves the title and label (the CCMenuItem) off the screen vertically with a CCMoveTo... But the object still exists...
The problem here was that my CCMenu was still receiving touches (I assume in the parts of the screen where the CCMenuItems were placed).
The solution was to declare my CCMenu in my @implementation rather than in my init scene setup.
@implementation HelloWorldLayer
{
CCLabelTTF *label;
CCLabelTTF *startTheGameLabel;
CCSprite *theCattery;
CCMenu *_menu;
}
Then in my init, rather than declaring it there, I simply assign to the one in the @implementation.
-(id) init
{
if( (self=[super initWithColor:ccc4(50, 180, 220, 255)]) )
{
//.. Other "irrelevant to question" scene setup stuff here...//
// Start the game button
startTheGameLabel = [CCLabelTTF labelWithString:@"Save Some Kitties!" fontName:@"Zapfino" fontSize:20];
CCMenuItemLabel *startTheGameLabelItem = [CCMenuItemLabel itemWithLabel:startTheGameLabel target:self selector:@selector(startGameStub:)];
// Push the menu
_menu = [CCMenu menuWithItems: startTheGameLabelItem, nil];
[self addChild:_menu];
//.. Other "irrelevant to question" scene setup stuff here...//
}
return self;
}
This gives me access to the CCMenu throughout my class, so I can later disable touch input once the user has made a selection.
-(void) startGameStub:(id)sender
{
CGSize size = [[CCDirector sharedDirector] winSize];
// Clear the labels off the screen
CCMoveTo *moveTheTitleLabelAction = [CCMoveTo actionWithDuration:1.0 position:ccp(label.position.x, size.height + label.boundingBox.size.height)];
CCMoveTo *moveTheStartLabelAction = [CCMoveTo actionWithDuration:1.0 position:ccp(startTheGameLabel.position.x, size.height + startTheGameLabel.boundingBox.size.height)];
// Commit actions
[label runAction:moveTheTitleLabelAction];
[startTheGameLabel runAction:moveTheStartLabelAction];
// LIFE SAVING MESSAGE!!!
[_menu setTouchEnabled:NO]; // This is what fixes the problem
}
Related
Okay, so I have two scenes: TitleScene and GameScene. Obviously there is a lot of stuff in GameScene, and that resulted in something very annoying. I have a play button in the TitleScene, and whenever that gets pressed I present the GameScene.
I do that like this:
In TitleScene.m :
-(void)didMoveToView:(SKView *)view {
if(!self.contentCreated) {
[self createSceneContents];
self.contentCreated = YES;
}
}
-(void)createSceneContents {
//among other TitleScene stuff
self.gameScene = [GameScene sceneWithSize:self.frame.size];
self.gameScene.scaleMode = SKScaleModeResizeFill;
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];
if ([self.playButton containsPoint:location]) {
[self.view presentScene:self.gameScene];
}
}
The problem is that whenever I press the play button, there are one to two seconds of lag before transitioning to the next scene.
I create the GameScene just like the TitleScene, with a lot of stuff in the createSceneContents method. I know it is not the simulator's fault, because I tested it on my iPhone 6.
Can I pre-load all the GameScene's stuff when the app is first opened, or is there another way to stop this lag?
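One approach, sketched below under assumptions not in the question (the atlas name "GameArt" is a placeholder for whatever atlases GameScene actually uses): since first-presentation lag in SpriteKit is often texture decoding, you could preload the GameScene's texture atlases while the title scene is still on screen, via SKTextureAtlas's preloadTextureAtlases:withCompletionHandler:.

-(void)didMoveToView:(SKView *)view {
    if (!self.contentCreated) {
        [self createSceneContents];
        self.contentCreated = YES;

        // Hypothetical atlas name; substitute the atlases GameScene really uses.
        SKTextureAtlas *gameAtlas = [SKTextureAtlas atlasNamed:@"GameArt"];
        [SKTextureAtlas preloadTextureAtlases:@[gameAtlas] withCompletionHandler:^{
            // Textures are now decoded and resident, so presentScene: later
            // should not stall while loading them.
            NSLog(@"Game atlases preloaded");
        }];
    }
}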
I have an NSMutableArray filled with different sprites. These sprites are all on the screen. How can I detect if a touch is landing on one of these sprites, and then do something if a touch on the sprite has occurred?
This is what I have now,
CGPoint touchLocation = [touch locationInNode:_physicsNode];
if(CGRectContainsPoint((starInArray.boundingBox), touchLocation)) {
Instead of (starInArray.boundingBox), I want to be able to say something like (anyObjectInMyArray.boundingBox).
Any way to go about this?
Thanks!
Something along the lines of this should work.
- (BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
CGPoint touchLocation = [self convertTouchToNodeSpace:touch];
for (CCSprite *star in starInArray)
{
if (CGRectContainsPoint(star.boundingBox, touchLocation))
{
//Do Something
return YES; // claim the touch so it isn't passed on
}
}
return NO; // no star was hit; let other handlers take the touch
}
I would like some of my Sprite Kit nodes to behave like UIButtons. I tried 2 approaches:
1) Use touchesBegan: - this works if a user is careful, but it seems to fire multiple times, faster than I can disable interaction, resulting in a button that can be activated multiple times:
spriteNode.userInteractionEnabled = YES;
//causes the following to fire when node receives touch
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
self.userInteractionEnabled = NO;
DLog(#"Do node action");
self.userInteractionEnabled = YES;
}
2) I switched to Kobold-Kit as a layer on top of iOS Sprite Kit. One of the things it allows me to do is add button-like behavior to any node. However, I'm running into an issue where, if I have 2 buttons stacked on top of each other, tapping the top button activates both. Boolean flags can prevent repeated interactions in this case. I tracked the issue of stacked buttons firing together to this call within KKScene:
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
[super touchesBegan:touches withEvent:event];
for (id observer in _inputObservers)
{
if ([observer respondsToSelector:@selector(touchesBegan:withEvent:)])
{
[observer touchesBegan:touches withEvent:event];
}
}
}
The scene simply forwards touches to every observer node within the scene. This causes button behaviors that are stacked on top of each other to fire together.
Is there a way or example that shows how to properly arrange sprite nodes to allow behavior similar to UIButton, where I can have only the top button activate, and each activation disables the button for a short time afterwards?
@property (nonatomic, strong) UITapGestureRecognizer *tapGesture;
self.tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapGestureStateChanged:)];
self.tapGesture.delegate = self;
[self.view addGestureRecognizer:self.tapGesture];
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
if (gestureRecognizer == self.tapGesture) {
return CGRectContainsPoint(self.neededSprite.frame, [self convertPoint:[gestureRecognizer locationInView:self.view] toNode:self]);
}
return YES;
}
- (void)tapGestureStateChanged:(UITapGestureRecognizer *)sender
{
if (sender.state == UIGestureRecognizerStateRecognized) {
/* do stuff here */
}
}
The SpriteKit Programming Guide suggests using a SKNode name to gate the behavior that you want to achieve when touched. The example in the section called "Using Actions to Animate Scenes" overrides the touchesBegan:withEvent: method and only runs when the name is not nil. When you're done, just reset the name so that you can catch the next touch.
- (void)touchesBegan:(NSSet *) touches withEvent:(UIEvent *)event {
SKNode *helloNode = [self childNodeWithName:@"helloNode"];
if (helloNode != nil) {
helloNode.name = nil;
SKAction *moveUp = [SKAction moveByX: 0 y: 100.0 duration: 0.5];
SKAction *zoom = [SKAction scaleTo: 2.0 duration: 0.25];
SKAction *pause = [SKAction waitForDuration: 0.5];
SKAction *fadeAway = [SKAction fadeOutWithDuration: 0.25];
SKAction *remove = [SKAction removeFromParent];
SKAction *moveSequence = [SKAction sequence:@[moveUp, zoom, pause, fadeAway, remove]];
[helloNode runAction: moveSequence];
}
}
Another idea would be to move the re-enabling of user interaction to touchesEnded:.
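A minimal sketch of that idea for the node subclass from approach 1, assuming (as with UIKit views) that an in-flight touch keeps delivering its remaining events to the node that began it:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.userInteractionEnabled = NO; // block any further taps immediately
    NSLog(@"Do node action");
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.userInteractionEnabled = YES; // sequence finished; accept the next tap
}

-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.userInteractionEnabled = YES; // also re-enable if the system cancels the touch
}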
I have a problem as follows. I've developed an iOS app; to be concise, it has a UIViewController as parent with a button, and the UIViewController pops up a transparent UIView as a mask. When I tap on the UIView (exactly within the underlying button's boundary), the button does not receive any event (such as "touch up inside"). How can the button get the "touch up inside" event through the transparent UIView above it?
This is not possible directly, as the triggered event will not reach the button while it is not visible to touches (i.e. another view is completely covering the button and blocking its interaction with the user).
But I can give you a workaround.
1. Declare a BOOL variable in your UIViewController.
2. Implement the touch methods as shown below.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch=[touches anyObject];
CGPoint p=[touch locationInView:self.view];
if(CGRectContainsPoint(button.frame, p) && !boolVariable) {
boolVariable = YES;
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch=[touches anyObject];
CGPoint p=[touch locationInView:self.view];
// If the below condition is true, the user lifted the finger at the same location as the button (touchesMoved and touchesCancelled were not called in between), so the event behaves just like touchUpInside
if(CGRectContainsPoint(button.frame, p) && boolVariable) {
boolVariable = NO;
[Here you can call the method which you wanted to call on touchUpInside of the button];
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
boolVariable = NO;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
boolVariable = NO;
}
I was not able to test the above code in Xcode, but I think it will work.
Note: The frame of the button should be with respect to the UIViewController.
Hope this helps you out :)
UIView instances do not listen for these touches. Callbacks for events such as "touch up inside" are only sent to subclasses of UIControl.
The most basic concrete subclass is UIButton.
Without code or more details about your app's setup, it's difficult to give better advice.
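That said, one common technique (a sketch, not from this thread; PassthroughView and passthroughFrame are hypothetical names) is to make the transparent overlay ignore touches that land over the button, so UIKit delivers them to the button underneath and "touch up inside" fires normally:

@interface PassthroughView : UIView
// Set this to the button's frame, converted to the overlay's coordinate space.
@property (nonatomic, assign) CGRect passthroughFrame;
@end

@implementation PassthroughView
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Returning NO makes this view transparent to the touch, so hit-testing
    // continues to the views behind it (including the button).
    if (CGRectContainsPoint(self.passthroughFrame, point)) {
        return NO;
    }
    return [super pointInside:point withEvent:event];
}
@end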
I have subclassed a UIView that already handles single touches and drags. I want to enhance the interaction of this view so that, while dragging, if the user touches with a second finger (anywhere else in the view), then the system prints a message. I've made a stab at the code:
In my header file I've declared:
NSString *key; // This unique key identifies the first touch
My .m file I have:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UITouch *t in touches) {
if (key == nil)
{
key = [[[NSValue valueWithPointer:t] description] copy];
}
if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
{
NSLog(#"calling parent to handle single touch");
[super touchesBegan:[NSSet setWithObject:t] withEvent:event];
}
else
{
[self twoTouchDetected];
}
}
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UITouch *t in touches) {
if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
{
[super touchesMoved:[NSSet setWithObject:t] withEvent:event];
}
}
}
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UITouch *t in touches) {
if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
{
[super touchesEnded:[NSSet setWithObject:t] withEvent:event];
key = nil;
}
}
}
Unfortunately there are issues with this implementation. The first time (while dragging with one finger) I touch with the second finger, the system registers it immediately. However, the second time I touch with a second finger (while still dragging with the first), the second touch does not register until the first finger is lifted. The events from the second finger are backed up...
What is also strange is that sometimes, the parent gets called with touch data from the 2nd finger and not the 1st.
It turns out my code worked, but the problem was that I had subclassed an object belonging to the Core Plot framework. That framework does weird things to its objects, and as a result the touches were coming back in the wrong order.
I created an empty project to receive touches and everything came out great.
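For reference, a minimal sketch of that kind of empty-project test view (an assumption, not the original harness). One detail worth noting in any multi-touch test: multipleTouchEnabled defaults to NO on UIView, so a second finger's events are never delivered unless you enable it.

@interface MultiTouchTestView : UIView
@end

@implementation MultiTouchTestView
- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.multipleTouchEnabled = YES; // off by default; required for a second finger
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"%lu new touch(es), %lu active in total",
          (unsigned long)touches.count,
          (unsigned long)event.allTouches.count);
}
@end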