I need to track multiple touch events in an iOS MonoTouch application. I have overridden the TouchesBegan, TouchesCancelled, TouchesMoved and TouchesEnded methods.
What I need to figure out now is how to iterate through the touches (there will be more than one) in each of those overrides and match them up, so that I can tell when a particular touch begins and handle it separately from the others. The user may put a finger down, at which point I will start a timer that does something if that finger stays down.
If during that time the user puts down another finger, I will want to start a second timer for that one, separate from the first.
I am pretty sure I can figure out a way to store my timers and such. What I can't figure out is how to iterate through the touch events that the NSSet contains in each of the overrides, and then how to uniquely identify them BETWEEN the overrides.
I am assuming that a TouchesBegan touch in the NSSet will match up with a TouchesMoved, TouchesCancelled or TouchesEnded touch in the NSSets that those overrides get as well.
Is that true? If so, how do I get at each one and uniquely ID them so I can match them up?
Here is a good example of MonoGame's use of TouchesBegan etc: https://github.com/mono/MonoGame/blob/develop/MonoGame.Framework/iOS/iOSGameView_Touch.cs
UITouch also has a timestamp field you could use to differentiate touches. I think you should store them in a dictionary to get the functionality you mention.
Here is the class reference for UITouch: http://developer.apple.com/library/ios/#DOCUMENTATION/UIKit/Reference/UITouch_Class/Reference/Reference.html
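To make the dictionary idea concrete, here is a minimal sketch in MonoTouch C#. It relies on the fact that UIKit keeps the same UITouch instance alive for a given finger across TouchesBegan/Moved/Ended/Cancelled, so the touch's Handle can serve as a stable key; if I recall correctly, the MonoGame source linked above keys its touches the same way. The class and field names here are just illustrative, not anything from your project.

using System;
using System.Collections.Generic;
using MonoTouch.Foundation;
using MonoTouch.UIKit;

public class TouchTrackingView : UIView
{
    // Per-finger state, keyed by the native UITouch handle, which stays the
    // same for the whole began/moved/ended sequence of that finger.
    class TouchInfo
    {
        public DateTime StartedAt;
        public NSTimer Timer;   // whatever timer you start for this finger
    }

    readonly Dictionary<IntPtr, TouchInfo> activeTouches = new Dictionary<IntPtr, TouchInfo> ();

    public override void TouchesBegan (NSSet touches, UIEvent evt)
    {
        base.TouchesBegan (touches, evt);
        foreach (var touch in touches.ToArray<UITouch> ()) {
            // Start whatever timer you need for this finger here and keep it
            // in the per-touch record so it can be cancelled later.
            activeTouches [touch.Handle] = new TouchInfo { StartedAt = DateTime.Now };
        }
    }

    public override void TouchesMoved (NSSet touches, UIEvent evt)
    {
        base.TouchesMoved (touches, evt);
        foreach (var touch in touches.ToArray<UITouch> ()) {
            TouchInfo info;
            if (activeTouches.TryGetValue (touch.Handle, out info)) {
                // Same finger that began earlier; update whatever state you track.
                Console.WriteLine ("Finger {0} moved to {1}", touch.Handle, touch.LocationInView (this));
            }
        }
    }

    public override void TouchesEnded (NSSet touches, UIEvent evt)
    {
        base.TouchesEnded (touches, evt);
        RemoveTouches (touches);
    }

    public override void TouchesCancelled (NSSet touches, UIEvent evt)
    {
        base.TouchesCancelled (touches, evt);
        RemoveTouches (touches);
    }

    void RemoveTouches (NSSet touches)
    {
        foreach (var touch in touches.ToArray<UITouch> ()) {
            TouchInfo info;
            if (activeTouches.TryGetValue (touch.Handle, out info)) {
                if (info.Timer != null)
                    info.Timer.Invalidate ();
                activeTouches.Remove (touch.Handle);
            }
        }
    }
}

The same Handle shows up again in TouchesEnded/TouchesCancelled, which is how each finger's record (and its timer) gets cleaned up.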
Related
Is there a way to start a UIPanGestureRecognizer if the finger is already pressed at the time the object is instantiated?
I have a situation where, when a user holds their finger on the screen, I create a UIView under their finger.
I want them to be able to drag that view around, and as such I have put a UIPanGestureRecognizer on the UIView.
The problem is that I have to take my finger off and put it back down to get the UIPanGestureRecognizer to start up. I need it to start from an already pressed state.
Do you know how I can activate a UIPanGestureRecognizer from an already pressed state, i.e. can I get the touch that's already active at the time of instantiation and pass it along?
You can do it, but the UIPanGestureRecognizer will need to exist already on the view behind the view you create (and you will then have to adjust your calculations based on this; not difficult).
The reason is that, under the circumstances you describe, the touch does not belong to the UIView you create - it belongs to the UIView behind it, the one that the user was originally touching. And given the nature of iOS touch delivery, you can't readily change that. So it will be simpler to let that view, the actual original touch view, do the processing of this touch.
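For what it's worth, a rough MonoTouch/C# sketch of that arrangement (containerView and draggedView are hypothetical fields, standing in for the view behind and the view you create under the finger):

// The recognizer lives on the view the finger is already touching, so it
// starts tracking without the finger being lifted.
void AttachDragHandler ()
{
    var pan = new UIPanGestureRecognizer (g => {
        if (draggedView != null)
            draggedView.Center = g.LocationInView (containerView);  // keep the new view under the finger
    });
    containerView.AddGestureRecognizer (pan);
}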
I think Matt's solution is best so I am going to mark it as correct.
However, my code structure wasn't going to allow me to implement it cleanly. Compounding the issue, the object doing the listening was listening with a UILongPressGestureRecognizer.
So my solution was as follows:
1. Create a callback in my ViewController that handles the longGestureOverride call.
2. Add a callback to the object listening for the long gesture that calls the longGestureOverride callback and passes along the point.
3. Manually move the object based on the point passed back.
4. If the user lifts their finger, disable the longGestureOverride callback and begin using the UIPanGestureRecognizer inside the new object (a rough sketch follows below).
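As a sketch of that hand-off in MonoTouch/C# (the names here are mine, not from the original code): the listening view exposes a callback that reports the long-press location, and clears it once the finger lifts so the pan recognizer on the new view can take over.

using System;
using System.Drawing;
using MonoTouch.UIKit;

// Hypothetical view that listens for the long press and forwards its location.
public class LongPressListenerView : UIView
{
    // Set by the view controller; receives the point to move the new object to.
    public Action<PointF> LongGestureOverride;

    public LongPressListenerView ()
    {
        var longPress = new UILongPressGestureRecognizer (g => {
            if (LongGestureOverride != null)
                LongGestureOverride (g.LocationInView (this));

            // Once the finger lifts, stop overriding; from here on the
            // UIPanGestureRecognizer inside the new object does the dragging.
            if (g.State == UIGestureRecognizerState.Ended)
                LongGestureOverride = null;
        });
        AddGestureRecognizer (longPress);
    }
}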
I'm a bit new to programming, but I am creating an app for iOS in which there will be up to four objects that need to be moved separately from each other. So I need to know how to create 4 different areas on the screen that the user can touch and swipe in order to drag each object.
Sorry if this question was asked before! I didn't find the answer, but I may have just been searching the wrong way.
Gesture recognizers: you can verify the touch area in the selector.
Create 4 separate gesture recognizers, one on each of the 4 objects.
In each one's handler, check the touch location, verify it, and respond accordingly (see the sketch below).
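A minimal sketch of that idea in classic MonoTouch/C# (where points are System.Drawing.PointF), assuming the four objects are UIViews collected in a hypothetical draggableViews array:

foreach (var view in draggableViews) {
    var target = view;   // capture the view for the lambda
    var pan = new UIPanGestureRecognizer (g => {
        // Each object has its own recognizer, so this handler only sees
        // touches that started on `target`; read g.LocationInView (...) here
        // if you also want to restrict the touchable area further.
        var translation = g.TranslationInView (target.Superview);
        target.Center = new PointF (target.Center.X + translation.X,
                                    target.Center.Y + translation.Y);
        g.SetTranslation (PointF.Empty, target.Superview);
    });
    target.UserInteractionEnabled = true;
    target.AddGestureRecognizer (pan);
}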
I think this issue deserves its own question. Using the code attached to my solution to another problem, I discovered the issue described here.
I have the main view controller set as a UIGestureRecognizerDelegate, and I implement touchesBegan, touchesMoved, touchesEnded, and touchesCancelled. I wrote my solution assuming that for every touch object with a touchesBegan event there would be a touchesEnded or touchesCancelled event for that same object. I'm finding that not to be the case, though.
Scenario:
The following events happen in this order:
1. User starts gesture 1, touching the screen and sliding the finger.
2. User starts gesture 2, touching the screen at a different location.
3. User continues to slide both fingers at their respective parts of the screen.
4. User lifts the finger off the screen for gesture 2.
5. User continues gesture 1.
6. User lifts the finger off the screen for gesture 1.
Using NSLog to capture the details of the touch event, I find that a separate touch object is used for gesture 1 and gesture 2. But while touchesBegan, touchesMoved, and touchesEnded are all called for gesture 1, only touchesBegan and touchesMoved are called for gesture 2. The event touchesCancelled is not called for it either.
So how can I tell when gesture 2 finishes if touchesEnded and touchesCancelled are not called?
Edit: I found another post with similar symptoms. Most of my subviews are created programmatically, though. I'll try what was suggested there for the others. I'm skeptical it is the same issue, though, since in my testing, the touch locations are not anywhere near the other views.
Another edit: Following the recommendation in the link posted in my previous edit, I looked at the subviews, and one had user interaction checked. After I unchecked it, the behavior is slightly different. Now the second touch isn't noted at all in any of the touch events. I must be missing something basic. The main view, and the view with user interaction checked, by the way, both occupy the same space (one encapsulates the other).
My original assumption, that each touch has its own object that starts with touchesBegan and ends with either touchesEnded or touchesCancelled, I think is correct; it is with my current implementation, anyway. I originally wasn't seeing a second touch because Multiple Touch was not enabled for the view I was working with. I enabled that, per the suggestion in the comments. After that, I was able to see some, but not all, touch events for the second touch. The reason I was sometimes missing the second touch was that I had a subview with user interaction enabled; apparently it was commandeering the touches. I unchecked that and was then able to see the touch objects.
I then switched from tracking the touches by coordinates to tracking them by touch ID, and was able to track the complete lifespan of all touches. Tracking by coordinates didn't work because, for the second touch, the touchesEnded coordinates were identical to the last ones reported in touchesMoved, rather than the previous location in touchesEnded matching the touch location in touchesMoved as it did for the first touch. If this sounds confusing, just track the touches by touch ID instead of by coordinates.
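In MonoTouch/C# terms the two settings mentioned above look roughly like this (the view names are hypothetical); the per-touch bookkeeping itself can then be keyed by touch ID, as in the dictionary sketch earlier in this thread.

// Without this the view only ever reports a single touch at a time.
touchView.MultipleTouchEnabled = true;

// An overlapping subview with user interaction enabled will swallow the
// second touch before it reaches the view doing the tracking.
overlayView.UserInteractionEnabled = false;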
How about you put something like this in your touchesMoved method:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSArray *touchData = @[touches, event];

    // Restart the timer on every move; it only fires if 0.1 s goes by
    // without another touchesMoved call.
    [self.timer invalidate];
    self.timer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                  target:self
                                                selector:@selector(touchesFinishedWithoutCallback:)
                                                userInfo:touchData
                                                 repeats:NO];
}
The touchesFinishedWithoutCallback: method will only get called when touchesMoved stops getting called.
Needs elaborating for multiple touch, but could be a solution?
I have a situation where I apply an effect to a UIView when a touch begins and reverse that effect when that touch ends. So basically I am tracking the touchesBegan, touchesEnded and touchesCancelled methods of the UIView.
But the problem is that when the view goes off the screen, i.e. when it or one of its parents gets removed from its superview, it does not get any more touch events. Is there any way to deliver this "last" touchesEnded event to the view? Maybe if the UIView got notified about becoming invisible, I could also use that event for this purpose.
OK, I am going to move the answers from the comments into the original question to make a good summary of the important points.
The reason I am tracking touch events is that I want to apply some nice effects, such as glowing when a touch starts, and remove those effects when the touch ends.
The reason why I cannot simulate touchesEnded when removing those views is that I do not remove them directly; instead I remove one of their ancestor views. I cannot keep track of ancestor views all the way up to the UIWindow; I think that is technically impossible. Instead, the framework should provide this as an event, I think.
I solved my problem by overriding the -(void)willMoveToWindow:(UIWindow *)newWindow method and checking whether newWindow is nil.
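In MonoTouch this corresponds to overriding WillMoveToWindow; a minimal sketch (the effect method is a placeholder):

public class GlowingView : UIView
{
    public override void WillMoveToWindow (UIWindow window)
    {
        base.WillMoveToWindow (window);

        // A null window means this view (or one of its ancestors) is leaving
        // the screen, so no further touch events will arrive. Treat it as the
        // missing "last" touchesEnded and undo the effect here.
        if (window == null)
            RemoveGlowEffect ();
    }

    void RemoveGlowEffect ()
    {
        // Placeholder for reversing whatever effect TouchesBegan applied.
    }
}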
I'd like to implement multitouch, and I was hoping to get some sanity checks from the brilliant folks here. :)
From what I can tell, my strategy for detecting and tracking multitouch is going to be to use the touchesBegan, touchesMoved and touchesEnded methods, and to use the allTouches method of the event parameter to get visibility into all the relevant touches at any particular time.
I was thinking I'd essentially use previousLocationInView as a way of linking the touches that come in with new events to the currently active touches. For example, if there is a touchesBegan for a touch at x,y = 10,14, then in the next message I can use a touch's previous location to know which earlier touch it is tied to, as a way of keeping track of one finger's continuous motion, and so on. Does this make sense? If it does, is there a better way to do it? I cannot hold onto UITouch or UIEvent pointers as a way of identifying touches with previous touches, so I cannot go that route. All I can think to do is tie them together via their previousLocationInView values (and to know which touches are 'new').
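For illustration, here is a sketch of that linking idea in MonoTouch/C# (trackedFingers and its record type are hypothetical; each record only needs a LastLocation point). Each incoming touch's PreviousLocationInView is matched against the last point stored for each tracked finger:

public override void TouchesMoved (NSSet touches, UIEvent evt)
{
    base.TouchesMoved (touches, evt);
    foreach (var touch in touches.ToArray<UITouch> ()) {
        var previous = touch.PreviousLocationInView (this);
        var current = touch.LocationInView (this);

        // Find the tracked finger whose last known point matches this touch's
        // previous location, then advance it to the current point.
        foreach (var finger in trackedFingers) {
            if (finger.LastLocation == previous) {
                finger.LastLocation = current;
                break;
            }
        }
    }
}

Note that an earlier answer in this thread found coordinate matching unreliable for the second touch, so treat this purely as an illustration of the questioner's proposed approach rather than a recommendation.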
You might want to take a look at gesture recognizers. From Apple's docs,
You could implement the touch-event handling code to recognize and handle these gestures, but that code would be complex, possibly buggy, and take some time to write. Alternatively, you could simplify the interpretation and handling of common gestures by using one of the gesture recognizer classes introduced in iOS 3.2. To use a gesture recognizer, you instantiate it, attach it to the view receiving touches, configure it, and assign it an action selector and a target object. When the gesture recognizer recognizes its gesture, it sends an action message to the target, allowing the target to respond to the gesture.
See the article on Gesture Recognizers and specifically the section titled "Creating Custom Gesture Recognizers." You will need an Apple Developer Center account to access this.