Screen touch locations on iPad 3

I'm writing an OpenGL game for iPhone & iPad, controlled by screen touches. It's a universal app, so I have code in touchesBegan to convert any touch location into the range 0.0-1.0 (one side of the screen to the other, which is how the rest of the app needs to get the touch info). The code that does this is:
CGPoint touchLocationPx;
GLfloat xx;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchLocationPx = [touch locationInView:self]; // touchLocationPx.x & .y are in points, not pixels...
    xx = (GLfloat)touchLocationPx.x / (GLfloat)(self.frame.size.width); // self.frame.size.width is also in points,
                                                                        // so xx is in the range 0.0 to 1.0...
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Set xx to a dummy value that will not be processed by the rest of the app...
    xx = -999.999;
}
It relies on [touch locationInView:self] and self.frame.size.width always being in the same units (points rather than pixels, as I understand it), regardless of whether it's a non-retina screen, a retina screen with [UIScreen mainScreen].scale = 2.0, or an iPad in x1 or x2 mode. (My own test device is an iPhone 4S, and if I monitor screen size and touch locations, everything is reported in points (480, 320) as expected.)
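For reference, a minimal diagnostic version of the same conversion (a sketch only; contentScaleFactor and [UIScreen mainScreen].scale are standard UIKit properties, and the logging is just there to confirm on hardware that everything really is reported in points):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    // Both locationInView: and frame are in points, so xx should stay in 0.0-1.0
    // on every device; the scale factors are logged purely for diagnosis.
    NSLog(@"touch %@ in frame %@ (contentScaleFactor %.1f, screen scale %.1f)",
          NSStringFromCGPoint(p), NSStringFromCGRect(self.frame),
          self.contentScaleFactor, [UIScreen mainScreen].scale);
    xx = (self.frame.size.width > 0.0) ? (GLfloat)(p.x / self.frame.size.width) : -999.999f;
}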
However, I've since heard that on the iPad 3 the controls were unresponsive. Are screen size and touch location handled differently on this device? Or could something else be causing my code to convert the touch location incorrectly?
(On the simulator all device types work fine, including the retina iPad. I know the simulator will never exactly match a real device, but something as fundamental as how touch locations are reported should be modelled properly there. Any idea why the real device fails?)
Thanks,
Sam.

Related

SpriteKit strange touch coordinates behavior

Touch coordinates in SpriteKit seem to have a strange negative offset on the Y axis.
This is how I get touch locations in my scene:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    NSLog(@"Touch location: %@", NSStringFromCGPoint(location));
}
If you touch the very bottom of the screen, the lowest Y coordinate value you can get is around 8.5 points, and at the top of the screen there is a noticeable area that produces identical maximum Y coordinate values.
Has anybody encountered the same problem? Is it a bug in SpriteKit or am I doing something wrong?
I have uploaded a demo project on GitHub to illustrate the issue: https://github.com/mirmanov/SKTouchDemo
NOTE: You can reproduce this behavior only on a real device. On iOS simulator everything seems to work properly.
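One way to narrow this down is to compare view-space and scene-space coordinates, which can diverge when the scene size or scaleMode does not match the view. A minimal diagnostic sketch, assuming it is added to the same SKScene subclass as above:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint viewPoint = [touch locationInView:self.view];   // UIKit view coordinates (points)
    CGPoint scenePoint = [touch locationInNode:self];       // SpriteKit scene coordinates
    // If self.size differs from the view size, or the scaleMode crops/letterboxes the
    // scene, the scene-space Y range will not reach the physical edges of the screen.
    NSLog(@"view: %@  scene: %@  scene size: %@  scaleMode: %d",
          NSStringFromCGPoint(viewPoint), NSStringFromCGPoint(scenePoint),
          NSStringFromCGSize(self.size), (int)self.scaleMode);
}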

Location wrong in touchesBegan

So I am using GLKViewController and GLKView. I am debugging my app on an iPhone 5s, and if I check drawableHeight and drawableWidth for the view they show as 1136 and 640, which is correct for the iPhone 5s. However, when I use the following code
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    startTime = CFAbsoluteTimeGetCurrent();
    self.lastGestureTime = startTime;
    active = true;
    UITouch *touch = [touches anyObject];
    CGPoint _location = [touch locationInView:self.view];
    // ...
and tap the bottom-right of the screen, I get the location as (316, 533), which is approximately half of what I expect the coordinates to be. The location anywhere on the screen comes out at exactly half of what I expect. My UI displays correctly, so the buttons are all in the right place, but the location from locationInView: comes out wrong (I think it is exactly half of what it should be). Any help would be great.
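If the intention is to compare the touch against drawableWidth/drawableHeight, note that GLKView reports those in pixels while locationInView: reports points, so a conversion like the sketch below may be all that's needed (assuming self.view is the GLKView; contentScaleFactor is a standard UIView property):

// locationInView: is in points; the drawable is in pixels, so scale by the view's
// contentScaleFactor (2.0 on an iPhone 5s) before comparing against drawable sizes.
CGFloat scale = self.view.contentScaleFactor;
CGPoint pixelLocation = CGPointMake(_location.x * scale, _location.y * scale);
// pixelLocation is now on the same 640 x 1136 grid as drawableWidth / drawableHeight.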

Movement of object only when one half of the screen is touched

In my current Objective-C project I am coding a mechanic where dragging your finger on one half of the screen moves one object in direct correlation, and dragging on the other half moves the other object in direct correlation (but not the first). When I test my project, the first object moves perfectly on its half of the screen, but the second object does not move when the other half of the screen is touched.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (location.x <= 259)
        [Person setCenter:CGPointMake(location.x, Person.center.y)];
    if (location.y > 289)
        [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (location.x <= 259)
        [Person setCenter:CGPointMake(location.x, Person.center.y)];
    if (location.y > 289)
        [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
}
Your objects should be moving just fine. However, if you're trying to split your screen in half (as you said in your question), then the if statements in your touch delegate methods are rather odd. Consider which "Person" object will be moved depending on where you touch the screen (assuming a 4-inch screen in portrait).
With those conditions there is a section of screen that will not move either object, another (rather large) section that will move both of your objects, and only a very small area that will move Person1 by itself.
If you're wanting to assign half of your screen to each object, I would suggest doing something like this to split the top and bottom half of the screen:
if (location.y <= self.view.frame.size.height / 2)
{
    // move Person
}
else
{
    // move Person1
}
You could obviously modify that to split the left/right halves of the screen.
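For example, a left/right version might look like this (a sketch following the same pattern, assuming Person should track the left half and Person1 the right half):

if (location.x <= self.view.frame.size.width / 2)
{
    // left half: move Person
    [Person setCenter:CGPointMake(location.x, Person.center.y)];
}
else
{
    // right half: move Person1
    [Person1 setCenter:CGPointMake(location.x, Person1.center.y)];
}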
If you're still having issues moving both objects, make sure that they're hooked up to the view in Interface Builder if they're IBOutlets.

Looking for an alternative to touchesMoved to detect any touch event within a defined area?

I have a virtual keyboard in my app with 6 keys, and the whole thing is just an image implemented with UIImageView. I determined the exact x and y coordinates that correspond to the image of each 'key' and used the following code to respond to a user interacting with the keyboard:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 72 && point.x > 65)
    {
        NSLog(@"Key pressed");
    }
    // Repeat per key...
}
However, I have realized that this method is not very smart, because changing orientation (portrait to landscape) or changing devices will invalidate my hard-coded x and y coordinates and therefore cause problems.
So, I am looking for an alternative to specifying the absolute x and y values, and using touchesMoved in general. Ideally, it would be a button with specific settings that would call its method if it was tapped, or if the user dragged their finger into the area of the button (even if very slowly - I used swipe detection before and it required too much of an exaggerated movement).
Is it possible to set up a button to call its method if tapped, or if a touch event started outside the button and then proceeded into it? If not, what are my alternatives?
Thanks SE!
You need to get the winSize property, which will fix the problem you are having with screen sizes.
CGSize size = [[CCDirector sharedDirector] winSize];
I do believe you are using Cocos2D? If so you can use this size property instead of hard coding numbers. :)
To convert your point, use:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
To see if it's within the bounds/box of the button you could try:
if (CGRectContainsPoint([self.myButton boundingBox], location))
{
    // Execute method
}
This is all assuming you are using Cocos2D and a CCSprite for your button.
This should work on any screen size and portrait or landscape :)
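Putting those pieces together, a minimal sketch of the whole handler (assuming Cocos2D 2.x, that myButton is the CCSprite for one key, and that keyPressed is a hypothetical method of your own):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    // Convert from UIKit view coordinates to Cocos2D GL coordinates.
    location = [[CCDirector sharedDirector] convertToGL:location];

    // Hit-test the key sprite instead of hard-coding point ranges.
    if (CGRectContainsPoint([self.myButton boundingBox], location))
    {
        [self keyPressed]; // hypothetical handler for this key
    }
}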

rotate an image with touch

I am making an app that sets a sleep timer with a clock. Basically, it is a clock with a single hand that the user can drag to set a sleep time. I tried to rotate the image with UITouch, but it rotates from the middle and I want it to rotate from the tip. Secondly, I want the image to rotate only when the user is touching the tip of the image, but in my project the image also rotates when the user touches any other part of the screen. Also, I want to rotate the image in both directions, but in my project it only moves clockwise due to this method:
image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(1));
Can anybody give me hints or suggestions on how this can be done?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    touch = [[event allTouches] anyObject];
    touchLocation = [touch locationInView:touch.view];
    NSLog(@"began");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // get touch event
    [image.layer setAnchorPoint:CGPointMake(0.0, 0.0)];
    if ([touch view] == image) {
        image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(1));
        //image.center = touchLocation;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"end");
}
In detail, you can create a custom rotate view, then:
1: In the delegate method "touchesBegan", get the initialPoint of the finger and the initialAngle.
2: During "touchesMoved", get the newPoint of the finger:
CGPoint newPoint = [[touches anyObject] locationInView:self];
[self pushTouchPoint:newPoint date:[NSDate date]];
double angleDif = [self angleForPoint:newPoint] - [self angleForPoint:initialPoint];
self.angle = initialAngle + angleDif;
[[imageView layer] setTransform:CATransform3DMakeRotation(angle, 0, 0, 1)];
3: Finally, in "touchesEnded" you can calculate the final angular velocity.
If anything is confusing, you can write back for more detail.
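This answer assumes custom helpers such as angleForPoint: and pushTouchPoint:date:. A minimal sketch of what angleForPoint: might look like (the helper name and the idea of measuring the angle around the view's centre with atan2 are assumptions, not part of the original answer):

// Hypothetical helper: angle (in radians) of a touch point around the view's centre.
- (double)angleForPoint:(CGPoint)point
{
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    return atan2(point.y - center.y, point.x - center.x);
}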
To rotate it the other way, just pass a negative angle:
image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(-1));
And to rotate from the tip you should use the anchor point (as you did in your code, but it needs to be set on the image view's layer, i.e. image.layer.anchorPoint).
More on the subject: setAnchorPoint for UIImage?
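A minimal sketch of moving the anchor point without the view visually jumping (the (0.5, 1.0) value is an assumption about where the hand pivots in the image, and this is done once before any rotation transform is applied; compensating the layer's position when the anchor point changes is the standard Core Animation pattern):

// Move the anchor point to the bottom-centre of the hand image and compensate the
// layer's position so the image view stays where it is on screen.
CGPoint newAnchor = CGPointMake(0.5, 1.0); // assumed pivot at the bottom of the image
CGRect bounds = image.bounds;
CGPoint oldAnchor = image.layer.anchorPoint;
CGPoint position = image.layer.position;
position.x += (newAnchor.x - oldAnchor.x) * bounds.size.width;
position.y += (newAnchor.y - oldAnchor.y) * bounds.size.height;
image.layer.position = position;
image.layer.anchorPoint = newAnchor;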
I tried to rotate the image with UITouch, but it rotates from the middle and I want it to rotate from the tip.
I don't know of any way in the SDK to rotate an element around its extreme end. If you want that effect, use a clock-hand image that is twice the length, with the extra half transparent. It is not a direct solution, but a workaround.
Also, I want to rotate the image in both directions, but in my project it moves only clockwise due to this method.
As per iOS developer documentation, a positive angle value specifies counterclockwise rotation and a negative value specifies clockwise rotation.
