touchesCancelled triggered when using three fingers - iOS

I have a simple app that prints the coordinates of all touch events.
In the simulator it works great.
On my device (an iPhone) it works great with up to two fingers, but when I tap with three fingers in a fast sequence, the touchesCancelled event is triggered.
Could someone please explain this to me?
This is the logging code (in case the problem lies there); it sits in my UIView subclass.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Began %f %f", location.x, location.y);
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Moved %f %f", location.x, location.y);
    }
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Ended %f %f", location.x, location.y);
    }
}

-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Phase: Touches cancelled");
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Cancelled %f %f", location.x, location.y);
    }
}
An example sequence looks like this:
Began 38.000000 263.000000
Began 173.500000 238.500000
Moved 41.500000 263.000000
Phase: Touches cancelled <<<< third touch
Cancelled 41.500000 263.000000
Cancelled 173.500000 238.500000
Thank you.
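A minimal diagnostic sketch, not an answer: it assumes the handlers live in a plain UIView subclass and simply confirms multi-touch is switched on while logging how many touches the event is tracking, so you can see whether the third finger ever reaches the view before the sequence is cancelled. On a device, system features that claim three-finger gestures (for example the accessibility Zoom gestures, if enabled) are one possible source of a cancellation that never shows up in the simulator.

// Diagnostic sketch only. multipleTouchEnabled is presumably already YES
// (two fingers work), but it is cheap to set explicitly; it defaults to NO.
- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.multipleTouchEnabled = YES;
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // touches holds only the touches that changed in this callback;
    // [event allTouches] holds every touch the event is currently tracking.
    NSLog(@"Began: %lu new, %lu total",
          (unsigned long)[touches count],
          (unsigned long)[[event allTouches] count]);
}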

Related

Touch event in Vuforia iOS app

I am creating an augmented reality iOS app using Vuforia and have integrated the Vuforia SDK in my project. I need to show some objects over the target image while scanning it, and that works fine. I also need to show some messages on the screen when the user touches any of the objects. How can I identify which object the user has touched? And how do the touch events work when the device zooms in and out? Please help me.
Try combining the ImageTargets and Dominoes samples. Touch handling in the Dominoes sample starts in EAGLView.mm:
// Pass touch events through to the Dominoes module
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_DOWN, 0, location.x, location.y);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_CANCEL, 0, location.x, location.y);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_UP, 0, location.x, location.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_MOVE, 0, location.x, location.y);
}
Study how they handle these touches and do the same in your project. For more reference, see the Vuforia developer forum: https://developer.vuforia.com/forum/ios/getting-touch-event-3d-model

moving 2 objects at the same time

In my current iOS project, I have dedicated one side of the screen to one object and the other side to the other object, so that if you swipe your finger across one side of the screen the designated object moves. However, I want to make it possible to move both objects at the same time with independent movements, and I cannot figure out how to do so. Below is my current code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (location.x <= 270) {
        [Person setCenter:CGPointMake(location.x, Person.center.y)];
    }
    else {
        [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (location.x <= 270) {
        [Person setCenter:CGPointMake(location.x, Person.center.y)];
    }
    else {
        [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
    }
}
You should handle all of the touches delivered in the touches set: loop through the UITouch objects and handle each one.
Edit: here is your code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in [event allTouches]) {
        CGPoint location = [touch locationInView:touch.view];
        if (location.x <= 270) {
            [Person setCenter:CGPointMake(location.x, Person.center.y)];
        }
        else {
            [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in [event allTouches]) {
        CGPoint location = [touch locationInView:touch.view];
        if (location.x <= 270) {
            [Person setCenter:CGPointMake(location.x, Person.center.y)];
        }
        else {
            [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
        }
    }
}
If you move the -touchesBegan and -touchesMoved code into the Person view class, rather than the view/view controller class it is currently in, then those views can handle touches independently of each other and simultaneously.
Edit: more info:
Currently you are handling touch events (in, I'm guessing, a UIViewController) using the code you pasted above. If you move that code into the Person class you have created, you can make the code simpler and achieve the result you want: each Person decides whether it is being touched, and where, and moves itself accordingly.
In your Person.m file, add this code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.superview];
    [self moveToLocation:location];
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.superview];
    [self moveToLocation:location];
}

-(void)moveToLocation:(CGPoint)location {
    CGFloat halfOfPersonWidth = self.bounds.size.width / 2.0f;
    CGFloat halfOfSuperviewWidth = self.superview.bounds.size.width / 2.0f;

    // Stop Person from going off the left screen edge
    if ((location.x - halfOfPersonWidth) <= 0.0f) {
        return;
    }
    // Stop Person from going off the right screen edge
    else if ((location.x + halfOfPersonWidth) >= self.superview.bounds.size.width) {
        return;
    }

    if (self.center.x < halfOfSuperviewWidth) {
        // Person is on the left of the screen and should not move to the right side
        if ((location.x + halfOfPersonWidth) > halfOfSuperviewWidth) {
            return;
        }
    } else {
        // Person is on the right of the screen and should not move to the left side
        if ((location.x - halfOfPersonWidth) < halfOfSuperviewWidth) {
            return;
        }
    }

    // If we have made it this far, move the Person (on the x axis only)
    [self setCenter:CGPointMake(location.x, self.center.y)];
}
Now in your view controller class you can delete the original -touchesBegan and -touchesMoved code that you pasted here, or just comment it out (wrap it in /* ... */) if you want to be cautious.
If you build and run, you should be able to move each Person view around as before, but if you put a finger on each Person view at the same time you can move them simultaneously.
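One setup detail worth checking, stated as an assumption rather than part of the answer above: touches are only delivered to views whose userInteractionEnabled is YES. Plain UIView subclasses have it on by default, but if Person subclasses UIImageView it defaults to NO, and neither handler above would ever fire. A minimal sketch in the view controller, using hypothetical outlet names person1 and person2:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Only needed if Person subclasses UIImageView, where interaction is off by default.
    self.person1.userInteractionEnabled = YES;
    self.person2.userInteractionEnabled = YES;
    // Make sure neither view claims touches exclusively, so both can be dragged at once.
    self.person1.exclusiveTouch = NO;
    self.person2.exclusiveTouch = NO;
}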

Sprite Kit physics causes too much shaking

In my app, once the game pieces are stacked to a certain height, the screen starts to move down to allow more room for building. The problem is that every time the screen moves down, the game pieces shake so much that they topple over.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        SKSpriteNode *gamePiece = [self pickObject];
        gamePiece.position = location;
        gamePiece.physicsBody.dynamic = NO;
        [self addChild:gamePiece];
        _currentTouch = touch;
        currentGamePiece = gamePiece;

        CGPoint touchLocation = [touch locationInNode:self.scene];
        if (touchLocation.y > 350) {
            _bg.position = CGPointMake(_bg.position.x, _bg.position.y - 2);
        }
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch moves */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        if ([touch isEqual:_currentTouch]) {
            currentGamePiece.position = location;
        }

        CGPoint touchLocation = [touch locationInNode:self.scene];
        if (touchLocation.y > 350) {
            _bg.position = CGPointMake(_bg.position.x, _bg.position.y - 2);
        }
    }
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    /* Called when a touch ends */
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        if ([touch isEqual:_currentTouch]) {
            currentGamePiece.position = location;
            currentGamePiece.physicsBody.dynamic = YES;
        }
    }
}
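No answer is shown for this one, so purely as a hedged sketch: if the stacked pieces are children of _bg (an assumption about your scene graph), the instant 2-point jumps inside every touch callback feed straight into the physics simulation. Animating the shift instead, and only when a scroll is not already running, may soften the disturbance:

// Sketch: smooth the scroll with an action instead of snapping the position.
// Assumes _bg is the node the stacked pieces live under; adjust to your scene graph.
- (void)scrollDownIfNeededForTouch:(UITouch *)touch
{
    CGPoint touchLocation = [touch locationInNode:self.scene];
    if (touchLocation.y > 350 && ![_bg actionForKey:@"scroll"]) {
        [_bg runAction:[SKAction moveByX:0 y:-2 duration:0.1] withKey:@"scroll"];
    }
}

Each of the three touch handlers above could then call this helper instead of setting _bg.position directly.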

move a sprite with touchesMoved

I am moving a sprite using the touchesMoved method. Currently the sprite jumps to the point where the screen is touched, but I want the sprite to move only when it is touched directly.
My code:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        CGPoint newPosition = CGPointMake(location.x, self.size.height / 2);
        self.sprite.position = newPosition;
    }
}
Check whether the touch location is inside the sprite, like this:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint positionInScene = [touch locationInNode:self];
    // frame is expressed in the parent's coordinate space, so this check
    // assumes the sprite is a direct child of self (the scene).
    if (CGRectContainsPoint(self.sprite.frame, positionInScene)) {
        CGPoint newPosition = CGPointMake(positionInScene.x, self.size.height / 2);
        self.sprite.position = newPosition;
    }
}
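An equivalent check, as a sketch under the same assumption that the sprite is a direct child of the node handling the touch, is SKNode's containsPoint:, which tests a point given in the parent's coordinate system:

if ([self.sprite containsPoint:positionInScene]) {
    // Only follow the finger while it is over the sprite.
    self.sprite.position = CGPointMake(positionInScene.x, self.size.height / 2);
}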

How to get touchesMoved called for 2 different UIImageViews

Hello, I have 2 UIImageViews in a UIView.
Once I touch a UIImageView, the touchesBegan method gets called, and once I drag on it, touchesMoved is called. But at the same time, touchesMoved for the second UIImageView is also called.
Can you please help me get a separate touchesMoved event for each of the UIImageViews?
This is my code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if (CGRectContainsPoint(iv1.frame, currentPoint))
        NSLog(@"iv1 Begin");
    if (CGRectContainsPoint(iv2.frame, currentPoint))
        NSLog(@"iv2 Begin");
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if (CGRectContainsPoint(iv1.frame, currentPoint))
        NSLog(@"iv1 Moved");
    if (CGRectContainsPoint(iv2.frame, currentPoint))
        NSLog(@"iv2 Moved");
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if (CGRectContainsPoint(iv1.frame, currentPoint))
        NSLog(@"iv1 End");
    if (CGRectContainsPoint(iv2.frame, currentPoint))
        NSLog(@"iv2 End");
}
You can use two outlets linked to the two views; this is the code to recognize which view was touched inside the touch methods:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *t = [touches anyObject];
    touchedView = t.view;
    if (t.view == view1) {
        // do something with view1
    } else if (t.view == view2) {
        // do something with view2
    }
}
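Two assumptions worth making explicit, not part of the answer above: for t.view to ever be one of the image views, each UIImageView needs userInteractionEnabled set to YES (it is NO by default on UIImageView); and because a UITouch keeps reporting the view it started in, the same t.view check also works in touchesMoved, so each finger keeps driving its own image view. A sketch reusing the question's iv1/iv2 names:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // UIImageView has user interaction disabled by default.
    iv1.userInteractionEnabled = YES;
    iv2.userInteractionEnabled = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        CGPoint p = [t locationInView:self.view];
        // t.view stays the view the touch began in for the whole gesture.
        if (t.view == iv1) {
            iv1.center = p;
        } else if (t.view == iv2) {
            iv2.center = p;
        }
    }
}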
