I have subclassed a UIView that already handles single touches and drags. I want to enhance the interaction of this view so that, while dragging, if the user touches with a second finger (anywhere else in the view), then the system prints a message. I've made a stab at the code:
In my header file I've declared:
NSString *key; // This unique key identifies the first touch
In my .m file I have:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        if (key == nil)
        {
            key = [[[NSValue valueWithPointer:t] description] copy];
        }
        if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
        {
            NSLog(@"calling parent to handle single touch");
            [super touchesBegan:[NSSet setWithObject:t] withEvent:event];
        }
        else
        {
            [self twoTouchDetected];
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
        {
            [super touchesMoved:[NSSet setWithObject:t] withEvent:event];
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        if ([key isEqualToString:[[NSValue valueWithPointer:t] description]])
        {
            [super touchesEnded:[NSSet setWithObject:t] withEvent:event];
            key = nil;
        }
    }
}
Unfortunately there are issues with this implementation. The first time I touch with a second finger (while dragging with the first), the system registers it immediately. However, the second time I touch with a second finger (while still dragging with the first), the second touch does not register until the first finger is lifted. The events from the second finger are backed up...
What is also strange is that sometimes the parent gets called with touch data from the second finger and not the first.
It turns out my code worked, but the problem was that I subclassed an object belonging to the Core Plot framework. That framework does weird things to its objects, and as a result the touches were coming back in the wrong order.
I created an empty project to receive touches and everything came out great.
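Incidentally, the string key is not strictly required for this kind of bookkeeping. Here is a minimal sketch of the same idea using a plain pointer comparison; it assumes an __unsafe_unretained ivar, since Apple advises never retaining UITouch objects:

__unsafe_unretained UITouch *firstTouch; // not retained; identifies the first finger

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        if (firstTouch == nil) {
            firstTouch = t; // remember the first finger down
        }
        if (t == firstTouch) {
            [super touchesBegan:[NSSet setWithObject:t] withEvent:event];
        } else {
            [self twoTouchDetected]; // a second finger arrived mid-drag
        }
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        if (t == firstTouch) {
            [super touchesEnded:[NSSet setWithObject:t] withEvent:event];
            firstTouch = nil; // ready for the next drag
        }
    }
}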
I use the following code to add a gesture recognizer to the top right side of a UINavigationBar, but I get a result if I tap anywhere on the navbar. How am I supposed to make the gesture apply only to the top right corner?
- (void)handleGestureForTopRightBarButtonItem:(UIGestureRecognizer *)gestureRecognizer {
    UINavigationBar *bar = self.navigationController.navigationBar;
    CGPoint p = [gestureRecognizer locationInView:bar];
    // Region anchored near the bar's right edge
    if (CGRectContainsPoint(CGRectMake(bar.bounds.size.width - 20, 0, 100, bar.bounds.size.height), p)) {
        NSLog(@"got a tap on the right side in the region i care about");
    } else {
        NSLog(@"got a tap on the right side, but not where i need it");
    }
}
As a UIGestureRecognizer reports to a single target object, there are a couple of ways to solve this. UIGestureRecognizer was not meant to be stacked multiple times on the same view; if you do so, you will very likely drain more energy than you need, on top of the CPU cost and all the comparison code that has to distinguish the running recognizers. But it would work.
a) Write code that compares the touch coordinates to the values you expect and, if they fall in the range you want, performs your action.
b) Create another object that lives only in the coordinates you want and has its own UIGestureRecognizer. Not ideal, as written above.
c) Use the power of UIControl, which is a UIView subclass and therefore also a UIResponder:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(nullable UIEvent *)event;
- (void)endTrackingWithTouch:(nullable UITouch *)touch withEvent:(nullable UIEvent *)event; // touch is sometimes nil if cancelTracking calls through to this.
- (void)cancelTrackingWithEvent:(nullable UIEvent *)event;
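For example, a minimal sketch of option c), assuming a UIControl subclass that should only react to touches near its top-right corner (the rectangle values are illustrative):

// Only begin tracking when the touch lands in a hot area near the
// control's top-right corner; returning NO stops the tracking sequence.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint p = [touch locationInView:self];
    CGRect hotArea = CGRectMake(self.bounds.size.width - 100.0, 0.0,
                                100.0, self.bounds.size.height);
    return CGRectContainsPoint(hotArea, p);
}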
d) Use the power of UIView without a UIGestureRecognizer. By the way, this basically works on CALayers as well, except that they do not have the sendAction methods because they are not UIControls.
-(BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    BOOL inside = [super pointInside:point withEvent:event];
    if (inside && event.type == UIEventTypeTouches) {
        //[super sendActionsForControlEvents:UIControlEventTouchDown];
    }
    return inside;
}
e) Code a class that inherits from UIResponder, which is basically what UIControl does, and use its API instead so you can make use of the touch coordinates as well.
-(void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        [self someCoRoutineWithTouch:touch];
    }
    //[super touchesBegan:touches withEvent:event];
}

-(void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        //do some stuff per touch
    }
    //[super touchesMoved:touches withEvent:event];
}

-(void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        //do some stuff per touch
    }
    //[super touchesEnded:touches withEvent:event];
}

-(void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [self touchesEnded:touches withEvent:event];
}
Just don't forget that UIGestureRecognizer is basically for gestures made of touches, even multiple touches, but not primarily for catching touches in general, despite being used that way in a lot of examples instead of coding a proper UIControl. Also, don't forget that near a device's edges the recognition of gestures is limited by the nature of finger size.
I have a custom view which is not being deallocated. I dismiss the controller when the close button is pressed. If I only press the button, the view is deallocated fine. But if I press the button with one finger while another finger is touching the view, it is not deallocated on dismiss, only on the next touch event.
It's a UITouch that is keeping a reference to my view and not releasing it. How can I fix this?
Here is my code for my close action:
- (IBAction)closePressed:(UIButton *)sender {
    NSLog(@"Close pressed");
    if (self.loader)
        [self.loader cancelJsonLoading];
    [self.plView quit];
    [self dismissViewControllerAnimated:YES completion:nil];
}
Did you try to call:
[self.view resignFirstResponder];
That should cancel all pending UITouches.
If this doesn't work, you can keep track of your touches yourself:
Define an NSMutableSet where you store the current touches:
NSMutableSet *_currentTouches;
In your init method:
_currentTouches = [[NSMutableSet alloc] init];
And implement:
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [_currentTouches unionSet:touches]; // record new touches
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [_currentTouches minusSet:touches]; // remove ended touches
}

- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [_currentTouches minusSet:touches]; // remove cancelled touches
}
Then, when you need to clean your touches (when you release your view for instance):
- (void)cleanCurrentTouches {
    [self touchesCancelled:_currentTouches withEvent:nil];
    [_currentTouches removeAllObjects];
}
It is, I think, a bit hacky, but the doc says:
"When an object receives a touchesCancelled:withEvent: message it should clean up any state information that was established in its touchesBegan:withEvent: implementation."
I am making a custom iOS keyboard and have a UIControl subclass to represent my button. I am trying to get the same behaviour as the normal iOS keyboard:
User begins touch on one button
User drags over other buttons (need to detect this so they can highlight/dehighlight accordingly)
Register the actual keyboard "press" when the user lifts their finger; that is, the touch ends
I am testing using the touch tracking methods like this:
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    [super beginTrackingWithTouch:touch withEvent:event];
    NSLog(@"Begin for %@", [self label]);
    return YES;
}

- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    [super continueTrackingWithTouch:touch withEvent:event];
    NSLog(@"Continue for %@", [self label]);
    return YES;
}

- (void)endTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    [super endTrackingWithTouch:touch withEvent:event];
    NSLog(@"End for %@", [self label]);
}
These methods are all called, except they are only ever called on the UIControl where the touch began.
What is the best way to recognise touches coming and going across all my buttons? Do I have to do it all via the parent view?
I'll select a better answer if offered... but in case anybody finds this by search, I've managed to get what I need this way:
Set userInteractionEnabled to NO for the button class (UIControl subclass)
Don't override any touch methods in the button class
Implement touchesBegan:withEvent:, touchesMoved:withEvent: and touchesEnded:withEvent: in the view controller (a sketch of these handlers follows after this list)
On each event, extract the location from the UITouch object
Iterate over all of the button subviews and find the one containing the touch location:
- (MyButton *)buttonForTouch:(UITouch *)touch
{
    // Location in keyboardView's coordinate space; the buttons are direct subviews
    CGPoint location = [touch locationInView:keyboardView];
    for (MyButton *button in buttons) {
        if (CGRectContainsPoint([button frame], location)) {
            return button;
        }
    }
    return nil;
}
Having determined which button the user is interacting with, make the view controller send messages to the relevant buttons to adjust their appearance
If appropriate, keep a reference to the UITouch instance in touchesBegan:withEvent: so you can be sure that you're tracking the same one in the other methods
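Putting those steps together, a rough sketch of the view controller's handlers could look like this; currentButton, keyboardView, buttons, and registerPressForButton: are assumed names, not part of the original code:

// Track which key is under the finger; the actual "press" fires on touch-up.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    currentButton = [self buttonForTouch:[touches anyObject]];
    currentButton.highlighted = YES;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    MyButton *button = [self buttonForTouch:[touches anyObject]];
    if (button != currentButton) {
        currentButton.highlighted = NO; // finger left the previous key
        button.highlighted = YES;       // highlight the key under the finger
        currentButton = button;
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    currentButton.highlighted = NO;
    [self registerPressForButton:currentButton]; // assumed helper
    currentButton = nil;
}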
I think you should have a single big UIControl with different subviews (like UIButton) that tracks touches by itself, as you are already doing, but works out which subview to highlight depending on the touch position.
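A rough sketch of that idea, with illustrative key subviews and highlight colors:

// One big UIControl tracks the touch itself and highlights whichever
// key subview currently sits under the finger.
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint p = [touch locationInView:self];
    for (UIView *key in self.subviews) {
        BOOL under = CGRectContainsPoint(key.frame, p); // frames are in our coordinates
        key.backgroundColor = under ? [UIColor lightGrayColor] : [UIColor whiteColor];
    }
    return YES; // keep tracking
}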
I'm having trouble getting a UIView to respond the way I want with multiple touches. Basically, certain UITouches are in UITouchPhaseBegan but never make it to UITouchPhaseEnded or UITouchPhaseCancelled. Here's the code I'm using to handle touches, which is called from touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent: and touchesCancelled:withEvent:. If I put one finger down, then another, move them, and release them simultaneously, the NSLog output is sometimes "Began! Began! Ended!" rather than "Began! Began! Ended! Ended!". Are these touches getting lost somewhere? How can I keep track of them?
- (void)handleTouchEvent:(UIEvent *)event {
    for (UITouch *touch in [event allTouches]) {
        if (touch.phase == UITouchPhaseBegan) {
            NSLog(@"Began!");
            if (![m_pCurrentTouches containsObject:touch])
                [m_pCurrentTouches addObject:touch];
            uint iVoice = [m_pCurrentTouches indexOfObject:touch];
            CGPoint location = [touch locationInView:self];
            m_pTouchPad->SetTouchPoint(location.x, location.y, iVoice);
            m_pTouchPad->SetIsTouching(true, iVoice);
        }
        else if (touch.phase == UITouchPhaseMoved) {
            uint index = [m_pCurrentTouches indexOfObject:touch];
            CGPoint location = [touch locationInView:self];
            m_pTouchPad->SetTouchPoint(location.x, location.y, index);
        }
        else if (touch.phase == UITouchPhaseEnded || touch.phase == UITouchPhaseCancelled) {
            uint index = [m_pCurrentTouches indexOfObject:touch];
            [m_pCurrentTouches removeObject:touch];
            NSLog(@"Ended!");
            m_pTouchPad->SetIsTouching(false, index);
        }
    }
}
EDIT:
I'm offering a bounty because I really want a good solution to this. To summarize: I need a system where every touch that begins also ends, so if a user puts down one finger and then another elsewhere, I can see both touches begin, and by the time there are no fingers in contact with the device anymore, I have seen both touches end.
Am I pursuing the wrong strategy to achieve this?
One event can report many touches. So you are sometimes getting "Ended!" once, because only one event arrived and only one touch event handler call was made - but it reported both touches ending. If you are manually handling multiple simultaneous touches (fingers), it is up to you to track each touch individually and check every touch in every event to see how many of your touches are being reported and decide what to do.
Apple has sample code showing how to do this by maintaining a CFDictionaryRef:
http://developer.apple.com/library/IOs/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/MultitouchEvents/MultitouchEvents.html#//apple_ref/doc/uid/TP40009541-CH3-SW7
(Scroll down to the section called "Handling Multitouch Events".)
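The gist of that approach, as a minimal sketch; the activeTouches dictionary is an assumed NSMutableDictionary ivar, keyed by non-retained touch references:

// Give each touch its own entry, so an event that reports several ending
// touches removes every one of them individually.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        NSValue *key = [NSValue valueWithNonretainedObject:touch];
        activeTouches[key] = [NSValue valueWithCGPoint:[touch locationInView:self]];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) { // may contain more than one ending touch
        [activeTouches removeObjectForKey:[NSValue valueWithNonretainedObject:touch]];
    }
}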
I just tried your code, and it has some problems.
I sometimes get "Began Began Began End End" for two fingers because touchesBegan:withEvent: gets called twice: the first call reports one began touch and the second call reports two began touches.
I don't know why you didn't split the method and put the code into the touchesBegan, touchesMoved, and touchesEnded methods, but either way you should use the touches passed in as the argument instead of [event allTouches].
- (void)handleTouches:(NSSet *)touches {
    for (UITouch *touch in touches) {
        if (touch.phase == UITouchPhaseBegan) {
            NSLog(@"Began!");
        }
        else if (touch.phase == UITouchPhaseMoved) {
        }
        else if (touch.phase == UITouchPhaseEnded || touch.phase == UITouchPhaseCancelled) {
            NSLog(@"Ended!");
        }
    }
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleTouches:touches];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleTouches:touches];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleTouches:touches];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleTouches:touches];
}
It seems that there is no guarantee about the order in which touches appear when looping over a UIGestureRecognizer's locationOfTouch:inView: method. Specifically:
for (int i = 0; i < [recognizer numberOfTouches]; i++) {
    CGPoint point = [recognizer locationOfTouch:i inView:self];
    NSLog(@"[%d]: (%.1f, %.1f)", i, point.x, point.y);
}
One would think that the point with index 0 would be the first UITouch, or the first that was released, but quite often the order of two touches is mixed up. Does anyone know how to test for the order of those events? Unfortunately there is no access to the UITouch objects themselves (with their timestamps).
Also, no guarantee is made in the documentation that the touches from -locationOfTouch:inView: will always be in a reliable order. Can anyone confirm or deny this?
You could try setting the recognizer's delegate property and implementing -gestureRecognizer:shouldReceiveTouch:. It should be called sequentially, and you can then hold on to the touches.
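A minimal sketch of that suggestion; orderedTouches is an assumed NSMutableArray ivar:

// The delegate callback fires once per touch, in arrival order, so the
// touches can be recorded before the recognizer processes them.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    [orderedTouches addObject:touch];
    return YES;
}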
It seems to me that you want to track, say, two fingers of two hands independently, rather than recognize a concrete gesture, which is the goal of a UIGestureRecognizer, as the name says. If that's what you want, I'd rather implement the UIResponder methods:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
}

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
}

-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesEnded:touches withEvent:event];
}
With this approach you really get a set of UITouch objects and can do advanced tracking.
Why not iterate through the UITouches sorted by timestamp?
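Inside the UIResponder callbacks above, that could look roughly like this (a sketch; UITouch exposes a timestamp property):

// Order the event's touches by timestamp before processing them.
NSArray *sorted = [[event.allTouches allObjects]
    sortedArrayUsingComparator:^NSComparisonResult(UITouch *a, UITouch *b) {
        if (a.timestamp < b.timestamp) return NSOrderedAscending;
        if (a.timestamp > b.timestamp) return NSOrderedDescending;
        return NSOrderedSame;
    }];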