It seems that there is no guarantee about the order in which touches are returned by a UIGestureRecognizer's -locationOfTouch:inView: method. Specifically:
for (int i = 0; i < [recognizer numberOfTouches]; i++) {
    CGPoint point = [recognizer locationOfTouch:i inView:self];
    NSLog(@"[%d]: (%.1f, %.1f)", i, point.x, point.y);
}
One would think that the point at index 0 would be the first UITouch that began (or the first that was released), but quite often the order of two touches is mixed up. Does anyone know how to determine the order of those touches? Unfortunately there is no access to the UITouch objects themselves (with their timestamps).
Also, the documentation makes no guarantee that the touches from -locationOfTouch:inView: will always be in a reliable order. Can anyone confirm or deny this?
You could try setting the recognizer's delegate property and implementing -gestureRecognizer:shouldReceiveTouch:. It should be called once per touch, in sequence, so you can hold on to the touches in the order they arrive.
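A rough sketch of that idea (the orderedTouches array is my assumption, not part of the recognizer API):

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch
{
    // Touches are delivered here one at a time, so appending them
    // preserves the order in which they began.
    [self.orderedTouches addObject:touch]; // orderedTouches: hypothetical NSMutableArray property
    return YES;
}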
It seems to me that you want to track, say, two fingers of two hands independently, instead of recognizing a concrete gesture, which is the goal of a UIGestureRecognizer, as the name says. If that is what you want, I would rather implement the UIResponder methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEnded:touches withEvent:event];
}
With this approach you really get a set of UITouch objects and can do advanced tracking.
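For example (a sketch, assuming the view itself handles the touches), the moved handler could distinguish fingers by the identity of each UITouch, since a touch object stays the same for the duration of a finger's contact:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    for (UITouch *touch in touches) {
        // The UITouch pointer is stable for the life of the touch,
        // so it can serve as a key for per-finger state.
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        CGPoint p = [touch locationInView:self];
        NSLog(@"finger %@ moved to (%.1f, %.1f)", key, p.x, p.y);
    }
}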
Why not iterate through the UITouches sorted by timestamp?
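That works once you are inside the UIResponder callbacks, where the UIEvent (and thus the UITouch objects) is available; note it is not available from a gesture recognizer's action method, which is the original complaint. A sketch:

// Sort the current event's touches by the time they occurred.
NSArray *byTime = [[event allTouches].allObjects
    sortedArrayUsingComparator:^NSComparisonResult(UITouch *a, UITouch *b) {
        return [@(a.timestamp) compare:@(b.timestamp)];
    }];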
Related
I use the following code to add a gesture recognizer to the top-right side of a UINavigationBar, but the handler fires if I tap anywhere on the nav bar. How am I supposed to restrict the gesture to the top-right corner?
- (void)handleGestureForTopRightBarButtonItem:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint p = [gestureRecognizer locationInView:self.navigationController.navigationBar];
    // Measure against the bar itself, since p is in the bar's coordinates.
    CGSize barSize = self.navigationController.navigationBar.bounds.size;
    if (CGRectContainsPoint(CGRectMake(barSize.width - 20, 0, 100, barSize.height), p)) {
        NSLog(@"got a tap on the right side in the region I care about");
    } else {
        NSLog(@"got a tap on the right side, but not where I need it");
    }
}
Since a UIGestureRecognizer reports to a class object, there are a couple of ways to solve this. UIGestureRecognizer was not meant to be stacked multiple times on the same view; if you do so, you will very likely drain more energy than you need, apart from the loss of CPU power and the pile of comparison code needed to distinguish all the running recognizers. But it would work.
a) Write code that compares the touch coordinates to the expected values, and if they match the range you want, perform your actions.
b) Create another object that lives only in the coordinates you want and has its own UIGestureRecognizer. Not ideal, as written above.
c) Use the power of UIControl, which is a UIView subclass and therefore also a UIResponder; override its tracking methods (a sketch follows the declarations below):
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (void)endTrackingWithTouch:(nullable UITouch *)touch withEvent:(nullable UIEvent *)event; // touch is sometimes nil if cancelTracking calls through to this.
- (void)cancelTrackingWithEvent:(nullable UIEvent *)event;
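For example, a minimal sketch of such a control (the CornerControl name is hypothetical):

@interface CornerControl : UIControl
@end

@implementation CornerControl

- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint p = [touch locationInView:self];
    NSLog(@"tracking began at (%.1f, %.1f)", p.x, p.y);
    return YES; // YES keeps continueTrackingWithTouch:withEvent: coming
}

- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    [super endTrackingWithTouch:touch withEvent:event];
    NSLog(@"tracking ended"); // touch may be nil if tracking was cancelled
}

@end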
d) Use the power of UIView without a UIGestureRecognizer. (Much the same hit-testing idea works on CALayers via -containsPoint:, but layers are not UIControls and have no sendAction machinery.)
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL inside = [super pointInside:point withEvent:event];
    if (inside && event.type == UIEventTypeTouches) {
        //[super sendActionsForControlEvents:UIControlEventTouchDown];
    }
    return inside;
}
e) Code some class that inherits from UIResponder, which is basically what UIControls do, and use its API instead, so you can make use of the touch coordinates as well.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self someCoRoutineWithTouch:touch];
    }
    //[super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        //do some stuff per touch
    }
    //[super touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        //do some stuff per touch
    }
    //[super touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
Just don't forget that UIGestureRecognizer is meant for gestures that are made of touches, even multiple touches, but not for catching touches in general, even though in a lot of examples recognizers are used instead of coding a proper UIControl. Also don't forget that near a device's edges, gesture recognition is limited by the nature of finger size.
I'm building a game where the goal is basically to hold a button pressed for a long time.
The game starts with the "touchesBegan" call and ends on "touchesEnded".
I have an issue where, at some very specific times, this call is not made. After some searching I figured out I'm not the only one with this problem:
https://discussions.apple.com/thread/1507669?tstart=0
There is a known problem where "touchesEnded" is sometimes not called.
So the workaround I was thinking of is setting a timer and checking every once in a while whether a finger is pressing, and where exactly on the screen.
Problem is, I only know about these methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
And none of them serves this use.
Is there a way of getting the current touches on screen?
Thanks
touchesCancelled:withEvent:
When a touch begins, touchesBegan:withEvent: is called. After that, at some point in the future, either touchesEnded:withEvent: or touchesCancelled:withEvent: will be called. You can, for example, create a method touchesEndedOrCancelled:withEvent: and call it from both touchesEnded and touchesCancelled.
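A minimal sketch of that pattern:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEndedOrCancelled:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEndedOrCancelled:touches withEvent:event];
}

- (void)touchesEndedOrCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // cleanup common to both the ended and cancelled cases goes here
}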
To get the current touches you need to keep track of them on your own. Just create a mutable set ivar, e.g. NSMutableSet *_allTouches, initialize it up front (say, in the initializer), and update it from the touch callbacks:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [_allTouches unionSet:touches];
    // ...
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [_allTouches minusSet:touches];
    // ...
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [_allTouches minusSet:touches];
    // ...
}
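From there, the timer-based workaround from the question could poll the tracked set. A sketch, assuming _allTouches was created up front (e.g. in initWithFrame:) and that startPollingTouches (a hypothetical name) is called when the game begins:

- (void)startPollingTouches
{
    [NSTimer scheduledTimerWithTimeInterval:0.5
                                     target:self
                                   selector:@selector(checkTouches:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)checkTouches:(NSTimer *)timer
{
    for (UITouch *touch in _allTouches) {
        CGPoint p = [touch locationInView:self];
        NSLog(@"finger still down at (%.1f, %.1f)", p.x, p.y);
    }
}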
I can't figure out why this code for forwarding a UIButton's touches stopped working in iOS 5:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    [self.nextResponder touchesEnded:touches withEvent:event];
}
When I log the next responder's touch methods, I can see that touchesMoved is only forwarded once and touchesEnded isn't forwarded at all. This is very unexpected behavior, as all touches are forwarded in iOS 4. Any help would be appreciated.
While commenters were able to get this code to work fine in their tests, I was unsuccessful in getting it to work reliably in iOS 5. My solution was to use delegation to forward the touches to the interested party. Not elegant or preferred, but so far it has worked fine without any gotchas.
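A sketch of what that delegation could look like (the protocol, class, and property names here are hypothetical, not from the original answer):

@protocol TouchForwardingDelegate <NSObject>
- (void)forwardedTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
@end

@interface ForwardingButton : UIButton
@property (nonatomic, weak) id<TouchForwardingDelegate> touchDelegate;
@end

@implementation ForwardingButton

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    // Hand the touches to the interested party explicitly instead of
    // relying on the responder chain.
    [self.touchDelegate forwardedTouchesEnded:touches withEvent:event];
}

@end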
I subclassed UIWindow, but for some reason - (void)sendEvent:(UIEvent *)event gets called twice for every interaction with the screen. Any ideas why that would happen?
For debugging purposes, subclass the window (of the app delegate) and override the sendEvent: method:
- (void)sendEvent:(UIEvent *)event
{
    NSLog(@"%@", event);
    [super sendEvent:event];
}
Most probably, you will notice the two events are the ones responsible for touchesBegan and touchesEnded (for a tap). This can be verified by subclassing the view and overriding the touch-related methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesBegan");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesMoved");
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesEnded");
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touchesCancelled");
}
Also, try a drag/swipe on the view to notice the change in the count of events sent to it :)
sendEvent: gets called once when the first finger goes down and again when all fingers are up.
I am detecting touches on my UIView.
In some situations I want to be able to cancel the touches so that touchesEnded won't get called. But no matter what, touchesEnded always gets called.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (SOMETHING_SPECIAL)
    {
        [super touchesCancelled:touches withEvent:event];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // I don't want it to get here if the touches were cancelled. How can I do this?
}
In my touchesEnded, how can I determine whether the touches were cancelled or not?
touchesEnded will always get called whether your touches were cancelled or not, so I would suggest instead putting that exact check:
if (SOMETHING_SPECIAL)
{
}
In your:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
To get the touchesCancelled event, implement this method from UIResponder: touchesCancelled:withEvent:.