SpriteKit strange touch coordinates behavior - iOS

Touch coordinates in SpriteKit seem to have a strange negative offset on the Y axis.
This is how I get touch locations in my scene:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    NSLog(@"Touch location: %@", NSStringFromCGPoint(location));
}
If you touch the very bottom of the screen, the lowest Y coordinate value you can get is about 8.5 points, and at the top of the screen there is a noticeable area that produces identical maximum Y coordinate values.
Has anybody encountered the same problem? Is it a bug in SpriteKit or am I doing something wrong?
I have uploaded a demo project on GitHub to illustrate the issue: https://github.com/mirmanov/SKTouchDemo
NOTE: You can reproduce this behavior only on a real device. On iOS simulator everything seems to work properly.

Related

SKAction moveTo goes up when should go down

I was following the Sprite Kit tutorial here to create a simple shooter, where you make a spaceship that shoots lasers at asteroids.
I want to make the lasers (each laser is an SKSpriteNode) move to the point where I click. I am getting the touch correctly within the method touchesBegan:withEvent:. However, when I set up an SKAction on the SKSpriteNode, it moves in the y direction OPPOSITE of where I click. I.e., imagine the window has width (x) 500 and height (y) 400. When I touch the screen at the coordinate (300, 100), the laser appears to move to the coordinate (300, 300).
I've verified that the coordinates in touchLocation are correct.
FYI I have only used the iPhone simulator for this - but that shouldn't matter, should it?
Relevant code snippet:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    SKSpriteNode *shipLaser = [_shipLasers objectAtIndex:_nextShipLaser];
    shipLaser.position = CGPointMake(_ship.position.x + shipLaser.size.width, _ship.position.y);
    shipLaser.hidden = NO;
    [shipLaser removeAllActions];
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    SKAction *laserMoveAction = [SKAction moveTo:touchLocation duration:0.5];
    SKAction *laserDoneAction = [SKAction runBlock:^{
        shipLaser.hidden = YES;
    }];
    SKAction *moveLaserActionWithDone = [SKAction sequence:@[laserMoveAction, laserDoneAction]];
    [shipLaser runAction:moveLaserActionWithDone withKey:@"laserFired"];
}
EDIT: I wonder if it might have to do with the fact that Sprite Kit's coordinate system originates at the bottom left, while UIKit's originates at the top left?
You want the touchLocation in the SKScene, not the UIView.
Change:
CGPoint touchLocation = [touch locationInView:self.view];
to:
CGPoint touchLocation = [touch locationInNode:self];
The problem was that Sprite Kit's coordinate system originates at the bottom left, unlike UIKit's coordinate system, which originates at the top left. So the coordinates of the touch event were in UIKit's coordinate system, while the SKNode was moving in Sprite Kit's. Once I understood this difference it was pretty easy to fix.

How to rotate something using SpriteKit relative to user touch coordinates

I want to rotate a gun based on the user dragging their finger on the screen. I figure I will need my Cartesian points in polar coordinates, and a long-press gesture, both of which I have. I am just wondering how I would go about programming this? Sorry, I'm really new to Sprite Kit; I have read all of Apple's documentation and I'm having a hard time finding this. My anchor point is (0, 0).
- (void)shootBullets:(UILongPressGestureRecognizer *)gestureRecognizer
{
    double radius;
    double angle;
    SKSpriteNode *gun = [self newGun];
}
I figured out how to get the angle, but now my zRotation won't work. I NSLog it and the angles are right, but when I set gun.zRotation and tap, nothing happens. Please help; I'm getting really frustrated.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    SKSpriteNode *gun = [self newGun];
    self.gunRotation = atanf(location.y / location.x) / 0.0174532925;
    gun.zRotation = self.gunRotation;
    NSLog(@"%f", gun.zRotation);
}
This is a very old question, but here is what I did - hopefully it will help someone out there:
#import "MyScene.h"
#define SK_DEGREES_TO_RADIANS(__ANGLE__) ((__ANGLE__) * 0.01745329252f) // PI / 180
Then add this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint positionInScene = [touch locationInNode:self];
        float deltaX = positionInScene.x - gun.position.x;
        float deltaY = positionInScene.y - gun.position.y;
        float angle = atan2f(deltaY, deltaX);
        gun.zRotation = angle - SK_DEGREES_TO_RADIANS(90.0f);
    }
}
Check out the answers I got to this question about how to implement a rotary knob; people have already figured out all the geometry. All you need to do is implement a transparent rotary knob, get its angle, and pass it to your Sprite Kit object using a rotate action.

Looking for an alternative to touchesMoved to detect any touch event within a defined area?

I have a virtual keyboard in my app with 6 keys, and the whole thing is just an image implemented with UIImageView. I determined the exact x and y coordinates that correspond to the image of each 'key' and used the following code to respond to a user interacting with the keyboard:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 72 && point.x > 65) {
        NSLog(@"Key pressed");
    }
    // Repeat per key...
}
However, I have realized that this method is not very smart, because changing orientation (portrait to landscape) or changing devices will break my x and y coordinates and therefore cause problems.
So, I am looking for an alternative to specifying absolute x and y values, and to using touchesMoved in general. Ideally, it would be a button with specific settings that would call its method if it was tapped, or if the user dragged their finger into the area of the button (even very slowly - I used swipe detection before and it required too exaggerated a movement).
Is it possible to set up a button to call its method if tapped, or if a touch event started outside of the button and then proceeded into it? If not, what are my alternatives?
Thanks SE!
You need to get the winSize property, which will fix the problem you are having with screen sizes.
CGSize size = [[CCDirector sharedDirector]winSize];
I believe you are using Cocos2D? If so, you can use this size property instead of hard-coding numbers. :)
To convert your point, use:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector]convertToGL:location];
To see if it's within the bounds/box of the button you could try:
if (CGRectContainsPoint([self.myButton boundingBox], location)) {
    // Execute method
}
This is all assuming you are using Cocos2D and a CCSprite for your button.
This should work on any screen size and portrait or landscape :)

read x,y pixel touched ipad/iphone?

Is there any way to detect which pixels you are touching while keeping your hand/finger on the screen (iPhone/iPad)? Essentially, drawing the shape of my hand (not as detailed as a fingerprint).
Thanks.
What you want to achieve is sadly not possible. The current devices can only detect up to 11 touches as points (more info in this post). There is no way to get the real touch area or the true touched pixels.
If you are looking for the coordinate point of touch use following code.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentTouchPosition = [touch locationInView:self.view];
    NSLog(@"%f,%f", currentTouchPosition.x, currentTouchPosition.y);
}

touchesMoved continued point?

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    NSLog(@"%f %f", currentPoint.x, currentPoint.y);
}
I want to develop a paint app for my iPad.
When I use this code and drag a finger to paint a line, it prints
(x,1),(x,3),(x,6),(x,7),(x,12),(x,15),(x,18)....
but I expected it to print
(x,1),(x,2),(x,3),(x,4),(x,5),(x,6),(x,7),(x,8),(x,9),(x,10),(x,11),(x,12),(x,13),(x,14),(x,15),(x,16),(x,17),(x,18)....
Can touchesMoved not deliver continuous coordinates?
It depends on the speed of the swipe.
If you swipe really slowly you'll probably get (x,1),(x,2),(x,3),(x,4),(x,5),(x,6),(x,7),(x,8),(x,9),(x,10), but if you swipe fast you can get as few as (x,1),(x,5),(x,10).
If you are developing a paint app you will have to check whether the user has lifted their finger and, if not, draw the line segment between successive points.
Good luck!
