I am using the ccTouchesBegan and ccTouchesEnded methods to register touches. Everything was fine until I placed some buttons (CCMenuItems) on my node. Now, when I put my finger down on a button and then move it to any other place on the screen, the ccTouchesEnded method is not called. What am I doing wrong? How can I detect touches on a button?
some code:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:[touches anyObject]];
    // here I check whether the touch is in the right place
    if ([self ptInRect:location :CGRectMake(0, center.y - 160, winSize.width, 40)]) {
        dragBeginLocation = location;
    }
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:[touches anyObject]];
    if (ABS(location.x - dragBeginLocation.x) < 8) {
        NSLog(@"TAP");
    } else {
        NSLog(@"SWIPE");
    }
}
So, when I begin a touch on a button sprite and release it on the background, I get nothing in the console.
The buttons are in a CCMenu, which sits above the background in z-order.
I'm new to Xcode and I'm developing a small tool. Basically it has a circle image that rotates in a view. That image has two dots that should sit on the circle's radials. When the image is in its initial position we can drag the dots with a touch and they follow the fingertip. But once we rotate the image and then drag the dots, they move erratically on screen! If the rotation is 180 degrees, the dots move in the opposite direction of the drag!
Any help is appreciated. Many thanks in advance.
Rotation.m
@implementation Rotation

@synthesize rotation = rotation_;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([[event touchesForGestureRecognizer:self] count] > 1) {
        [self setState:UIGestureRecognizerStateFailed];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([self state] == UIGestureRecognizerStatePossible) {
        [self setState:UIGestureRecognizerStateBegan];
    } else {
        [self setState:UIGestureRecognizerStateChanged];
    }

    UITouch *touch = [touches anyObject];
    UIView *view = [self view];
    CGPoint center = CGPointMake(CGRectGetMidX([view bounds]), CGRectGetMidY([view bounds]));
    CGPoint currentTouchPoint = [touch locationInView:view];
    CGPoint previousTouchPoint = [touch previousLocationInView:view];
    CGFloat angleInRadians = atan2f(currentTouchPoint.y - center.y, currentTouchPoint.x - center.x)
                           - atan2f(previousTouchPoint.y - center.y, previousTouchPoint.x - center.x);
    [self setRotation:angleInRadians];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([self state] == UIGestureRecognizerStateChanged) {
        [self setState:UIGestureRecognizerStateEnded];
    } else {
        [self setState:UIGestureRecognizerStateFailed];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self setState:UIGestureRecognizerStateFailed];
}

@end
Make sure that you set the anchor point of the layer to
circleView.layer.anchorPoint = CGPointMake(.5, .5);
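For completeness, here is a minimal sketch of how the recognizer above might be wired up, assuming Rotation subclasses UIGestureRecognizer (its use of setState: and view suggests it does) and that circleView is an ivar or property for the rotating image view from the question. The per-event rotation it reports is accumulated into the view's transform:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Rotate around the centre of the circle image.
    circleView.layer.anchorPoint = CGPointMake(.5, .5);

    // Attach the custom recognizer and point it at an action handler.
    Rotation *rotationRecognizer =
        [[Rotation alloc] initWithTarget:self action:@selector(handleRotation:)];
    [circleView addGestureRecognizer:rotationRecognizer];
}

- (void)handleRotation:(Rotation *)recognizer
{
    // The recognizer reports the angle moved since the previous touch event,
    // so accumulate it into the view's current transform.
    circleView.transform = CGAffineTransformRotate(circleView.transform,
                                                   recognizer.rotation);
}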
How do I constantly get the location of the user's touch? I'm working on controllers for a game, and as it stands I must lift my finger and individually place it on the new control to make it work. It seems that the touchLocation variable only updates on each tap, and I want it to update constantly, so that even if I slide my finger around it will know the location and update accordingly. Here is the relevant code. Thank you!
-(void)update:(CCTime)delta {
    if ([playerDirection isEqual:@"left"]) {
        _player.position = ccp(_player.position.x - 2, _player.position.y);
    }
    if ([playerDirection isEqual:@"right"]) {
        _player.position = ccp(_player.position.x + 2, _player.position.y);
    }
    if ([direction isEqual:@"up"]) {
        _enemy.position = ccp(_enemy.position.x, _enemy.position.y + 2);
    }
    if ([direction isEqual:@"down"]) {
        _enemy.position = ccp(_enemy.position.x, _enemy.position.y - 2);
    }
    if (_enemy.position.y >= self.contentSize.height) {
        direction = @"down";
    }
    if (_enemy.position.y <= 2) {
        direction = @"up";
    }
    if (isTouching) {
        _player.position = ccp(_player.position.x + 2, _player.position.y);
    }
}
// -----------------------------------------------------------------------
#pragma mark - Touch Handler
// -----------------------------------------------------------------------
-(void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchLocation = [touch locationInView:[touch view]];
    touchLocation = [[CCDirector sharedDirector] convertToGL:touchLocation];
    if (touchLocation.x > 400) {
        playerDirection = @"right";
    }
    if (touchLocation.x < 200) {
        playerDirection = @"left";
    }

    CGPoint touchLoc = [touch locationInNode:self];
    //isTouching = true;

    // Log touch location
    CCLOG(@"Move sprite to @ %@", NSStringFromCGPoint(touchLoc));

    // Move our sprite to touch location
    //CCActionMoveTo *actionMove = [CCActionMoveTo actionWithDuration:1.0f position:touchLoc];
    //[_sprite runAction:actionMove];
}

- (void)touchEnded:(UITouch *)touch withEvent:(UIEvent *)event
{
    playerDirection = @"none";
}
I see you've got touchesBegan and touchesEnded... but you're missing touchesMoved.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        // do whatever you want
    }
}
Hope that helps, good luck!
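Note that the question's code uses the cocos2d 3.x single-touch callbacks (touchBegan:withEvent: takes a UITouch), so the matching "moved" callback there would be touchMoved:withEvent:. A minimal sketch under that assumption, reusing the playerDirection variable and the 400/200 screen split from the question:

// cocos2d 3.x single-touch variant: called continuously while the finger slides.
- (void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchLocation = [touch locationInView:[touch view]];
    touchLocation = [[CCDirector sharedDirector] convertToGL:touchLocation];

    // Re-evaluate the control region on every move so the direction
    // updates without lifting the finger.
    if (touchLocation.x > 400) {
        playerDirection = @"right";
    } else if (touchLocation.x < 200) {
        playerDirection = @"left";
    }
}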
To get touches you have to enable them in the -(id)init method.
You can enable touch handling with
self.isTouchEnabled = YES;
and after this you can receive touches in the following touch-handling methods:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
Hope this helps you.
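The ccTouches* methods above belong to the older cocos2d 1.x/2.x CCLayer API. If you are on cocos2d 3.x, which the question's touchBegan:(UITouch *)/CCTime signatures suggest, the flag is different; a minimal sketch under that assumption:

- (id)init {
    if ((self = [super init])) {
        // cocos2d 3.x: opt this node in to the single-touch callbacks
        // (touchBegan:withEvent:, touchMoved:withEvent:, and so on).
        self.userInteractionEnabled = YES;
    }
    return self;
}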
I have been looking at how to invoke different touch methods depending on where the user touches the screen. I presume this should be fairly simple in Sprite Kit, but I can't find anything on it. Currently I am using the following to make the character jump.
How would I use this method only when the left or the right side of the screen is touched? So a right-side touch calls one method and a left-side touch calls another. Thanks.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch begins */
    if (isJumping == NO) {
        isJumping = YES;
        SKSpriteNode *charNode = (SKSpriteNode *)[self childNodeWithName:@"character"];
        [charNode runAction:jumpAnimation];
        [charNode runAction:jumpMovement];
    }
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self.scene];
    if (touchLocation.x < self.size.width / 2)
        NSLog(@"left side");
    if (touchLocation.x > self.size.width / 2)
        NSLog(@"right side");
}
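To have each side call its own method, as the question asks, the NSLog calls can simply be replaced with method calls. A minimal sketch, where -jumpLeft and -jumpRight are hypothetical methods standing in for whatever your two actions should be:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInNode:self.scene];

    // Dispatch to a different method depending on which half was touched.
    if (touchLocation.x < self.size.width / 2) {
        [self jumpLeft];   // hypothetical left-side action
    } else {
        [self jumpRight];  // hypothetical right-side action
    }
}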
Okay, basically what this code currently does is drag an image up and down along the Y axis depending on where the user drags it, and then it returns to its original position. My problem is that if someone does not touch the exact center of the UIImageView and starts dragging, the view jolts (it's not smooth): wherever the touch lands on the UIImageView, the view jumps a little so that its center sits directly under the finger.
I was thinking about using an animation just to move the image where it needs to go, or is there another way?
I apologize if this is an inefficient way to do this. I'm fairly new to the iOS world.
Here's what I have:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Gets location of UIImageView.
    self.originalFrame = self.foregroundImage.frame;
}

// This method is used for moving the UIImageView along the y axis depending on touch events.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == self.foregroundImage) {
        CGPoint location = [touch locationInView:self.view];
        location.x = self.foregroundImage.center.x;
        self.foregroundImage.center = location;
    }
}

// This method sets the UIImageView back to its original position.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGRect newFrame = self.foregroundImage.frame;
    newFrame.origin.y = self.originalFrame.origin.y;
    [UIView animateWithDuration:1.1 animations:^{
        self.foregroundImage.frame = newFrame;
    }];
}
You also need to save the initial touch location in touchesBegan, relative to the parent view. Then you can move the frame by the difference between the previous location and the new location, so the image doesn't snap to the finger. Please see the following code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count == 1)
    {
        UITouch *touch = [touches anyObject];
        self.touchLocation = [touch locationInView:self.view];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count == 1)
    {
        UITouch *touch = [touches anyObject];
        CGPoint newTouchLocation = [touch locationInView:self.view];
        if (touch.view == self.foregroundImage)
        {
            /* Determine the difference between the last touch locations */
            CGFloat deltaX = newTouchLocation.x - self.touchLocation.x;
            CGFloat deltaY = newTouchLocation.y - self.touchLocation.y;

            /* Offset the foreground image */
            self.foregroundImage.center
                = CGPointMake(self.foregroundImage.center.x + deltaX,
                              self.foregroundImage.center.y + deltaY);
        }

        /* Keep track of the new touch location */
        self.touchLocation = newTouchLocation;
    }
}
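Since the question only moves the image along the Y axis, you would probably apply only the vertical delta and leave the X coordinate alone. A minimal sketch of that variant, under the same assumptions (the touchLocation and foregroundImage properties) as the answer above:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count == 1)
    {
        UITouch *touch = [touches anyObject];
        CGPoint newTouchLocation = [touch locationInView:self.view];
        if (touch.view == self.foregroundImage)
        {
            /* Apply only the vertical movement, keeping the original X */
            CGFloat deltaY = newTouchLocation.y - self.touchLocation.y;
            self.foregroundImage.center
                = CGPointMake(self.foregroundImage.center.x,
                              self.foregroundImage.center.y + deltaY);
        }
        self.touchLocation = newTouchLocation;
    }
}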
I have two custom UIButtons.
@interface myButton : UIButton
I override several methods, like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // do something
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    [self.nextResponder touchesMoved:touches withEvent:event];

    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    if (!CGRectContainsPoint(self.bounds, touchPoint)) {
        [self touchesCancelled:touches withEvent:event];
    }
    return;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // do something
}

-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    return;
}
What I want is this: when I touch one button, I want to be able to keep moving my touch onto the other button. I tried to call touchesCancelled when the touch moves outside the bounds of the first button I touched. I "thought" that when the touch then moves onto another button it would start a new touch event, but it doesn't work like that.
Did I do something wrong? Or is the touchesCancelled method just not meant to be used like this?
Thanks in advance!
Your suspicions are correct; this is not how touchesCancelled:withEvent: was intended to be used. From the documentation:
This method is invoked when the Cocoa Touch framework receives a system interruption requiring cancellation of the touch event; for this, it generates a UITouch object with a phase of UITouchPhaseCancel. The interruption is something that might cause the application to be no longer active or the view to be removed from the window.
Your responder will receive the touches-cancelled event if the user gets an incoming phone call or SMS, their alarm goes off, etc. It should be used to clean up any state information that was established in the other touch-event methods.
It seems as though you want to change the responder associated with the touch events mid-touch, i.e. when you drag the touch out of the bounds of the first button and into the second button, the second button should become the responder receiving the touch events. Unfortunately, that is not the way responders are designed to work. Once a UIView is identified as the responder, as returned by hitTest:withEvent:, it remains the UIView that receives the touch events for that touch.
A possible workaround to achieve what you want is to handle the touch events (touchesBegan:withEvent:, touchesMoved:withEvent:, etc.) in a superview that contains both of your buttons. Then your superview will receive the touch events and you can take action depending on which button's frame the touch is within:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint touchLocation = [touch locationInView:self.view];
        UIButton *button;
        if (CGRectContainsPoint(self.button1.frame, touchLocation))
        {
            button = self.button1;
            NSLog(@"touchesBegan: First Button");
        }
        else if (CGRectContainsPoint(self.button2.frame, touchLocation))
        {
            button = self.button2;
            NSLog(@"touchesBegan: Second Button");
        }
        // Do something with button...
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint touchLocation = [touch locationInView:self.view];
        UIButton *button;
        if (CGRectContainsPoint(self.button1.frame, touchLocation))
        {
            button = self.button1;
            NSLog(@"touchesMoved: First Button");
        }
        else if (CGRectContainsPoint(self.button2.frame, touchLocation))
        {
            button = self.button2;
            NSLog(@"touchesMoved: Second Button");
        }
        // Do something with button...
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        CGPoint touchLocation = [touch locationInView:self.view];
        UIButton *button;
        if (CGRectContainsPoint(self.button1.frame, touchLocation))
        {
            button = self.button1;
            NSLog(@"touchesEnded: First Button");
        }
        else if (CGRectContainsPoint(self.button2.frame, touchLocation))
        {
            button = self.button2;
            NSLog(@"touchesEnded: Second Button");
        }
        // Do something with button...
    }
}
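One caveat with this workaround (an assumption on my part about your view setup): if the buttons themselves remain interactive, a touch that begins inside a button is delivered to the button, not to the containing view, so the code above never sees it. If the buttons are only needed for their appearance and frames, and all touch handling happens in the superview, you can switch off their user interaction so hitTest:withEvent: falls through to the container:

// Let the containing view receive touches that start on the buttons.
self.button1.userInteractionEnabled = NO;
self.button2.userInteractionEnabled = NO;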