I am trying to achieve a certain behavior in which an object executes a method when a tap passes into the bounds of the object, and then another when the tap leaves its bounds. Since this is a difficult scenario to put into words, I drew a picture:
Which type of ControlEvent is this, and if it doesn't exist, are there other accepted methods of getting this to work?
Thanks,
James
There isn't any control event for this that I know of. You will have to implement it yourself. In the view controller whose view is the superview of your button, implement the following methods, something like this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
// Touches began: check the position of the touch. Is it outside the button?
// Note the answer in a BOOL.
...
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
// Was the first touch outside the button? If so, check whether the current touch
// is now inside the button, and note that.
...
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
// Touch is ending: is the touch outside the button, and did it cross through earlier?
// If so, the "drag through" event is successful.
...
}
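A fuller sketch of that idea might look something like the following. The myButton outlet, the startedOutsideButton and crossedThroughButton flags, and the -dragThroughRecognized method are all names assumed for illustration, not UIKit API, and it assumes the button is a direct subview of self.view:
// In a class extension of the view controller:
@property (nonatomic) BOOL startedOutsideButton;   // touch began outside the button
@property (nonatomic) BOOL crossedThroughButton;   // touch entered the button at some point
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Remember whether the touch started outside the button's frame
    self.startedOutsideButton = !CGRectContainsPoint(self.myButton.frame, point);
    self.crossedThroughButton = NO;
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Started outside and now inside: the touch has passed into the button's bounds
    if (self.startedOutsideButton && CGRectContainsPoint(self.myButton.frame, point)) {
        self.crossedThroughButton = YES;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Entered the button earlier and is now ending outside it again
    if (self.crossedThroughButton && !CGRectContainsPoint(self.myButton.frame, point)) {
        [self dragThroughRecognized];   // hypothetical handler for the "drag through" case
    }
}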
Hope this helps.
Related
I'm developing an app for iOS, and I want the user to be able to select some stuff on the screen and then, at the end of the selection, hold their finger for a second to trigger some other event.
I have already created a lot of actions for when the user marks stuff by swiping around. For this I have been using:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
//Stuff here
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
//Stuff here
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
//Stuff here
}
I don't care whether these methods are used or whether it's done with a UIGestureRecognizer, but I simply can't find an efficient and simple way to implement this.
Any suggestions?
You could start an NSTimer in touchesBegan if the finger is within the target area. If touchesEnded is called before the timer has fired, or some condition is met within touchesMoved, cancel the timer; otherwise let it fire.
You'll probably need to use a timer as well as a pan gesture recognizer. When the finger reaches a target area, start the timer. If the finger moves out of the area, invalidate the timer. If the timer fires, it's a long press.
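A minimal sketch of that timer approach, built on the touch methods already in use above (targetArea, holdTimer, and -longHoldFired are assumed names, not part of UIKit):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(self.targetArea, point)) {
        // Finger is inside the target area: start a one-second timer
        self.holdTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                          target:self
                                                        selector:@selector(longHoldFired)
                                                        userInfo:nil
                                                         repeats:NO];
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Finger left the target area before the timer fired: cancel it
    if (!CGRectContainsPoint(self.targetArea, point)) {
        [self.holdTimer invalidate];
        self.holdTimer = nil;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Finger lifted before the timer fired: cancel it
    [self.holdTimer invalidate];
    self.holdTimer = nil;
}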
I have two custom views as subviews of a UIView. I'd like to take every interaction that happens on either of the two subviews and apply the same thing to the other subview. This includes all possible interactions like zooming, panning, rotating, etc. What is the easiest/cleanest way to implement that?
If you are only interested in recognizing specific gestures such as zooming, panning, or long presses, I would recommend using the UIGestureRecognizer subclasses, which allow you to listen for specific types of gestures. Add them all to one view, and whenever one of them is triggered, perform the action on both views (such as increasing the scale of the views or moving the content).
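For instance, a single pinch recognizer attached to the first view could drive the scale of both views. Here firstView and secondView are assumed outlets; the same pattern would apply to pan or rotation recognizers:
// In the containing view controller
- (void)viewDidLoad {
    [super viewDidLoad];
    UIPinchGestureRecognizer *pinch =
        [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handlePinch:)];
    [self.firstView addGestureRecognizer:pinch];
}
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    // Apply the same scale change to both subviews, then reset the recognizer's scale
    CGFloat scale = recognizer.scale;
    self.firstView.transform  = CGAffineTransformScale(self.firstView.transform, scale, scale);
    self.secondView.transform = CGAffineTransformScale(self.secondView.transform, scale, scale);
    recognizer.scale = 1.0;
}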
Another similar option is to implement the touch handling methods of UIView and respond to each touch event directly, mimicking the actions on both views. Those methods are:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
//called when a touch or touches begin
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
//called when a touch or touches move
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
//called when a touch or touches end
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
//called when a touch or touches are interrupted in some way, such as a phone call
}
You can read more about those methods in the UIResponder class reference.
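As a rough illustration of that second approach, limited to panning, a subview could mirror its own drags onto a sibling (mirrorView is an assumed property wired up to point at the other subview):
// In a UIView subclass: mirror simple drags onto a sibling view.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current  = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    CGPoint delta = CGPointMake(current.x - previous.x, current.y - previous.y);
    // Move this view and the mirrored view by the same amount
    self.center = CGPointMake(self.center.x + delta.x, self.center.y + delta.y);
    self.mirrorView.center = CGPointMake(self.mirrorView.center.x + delta.x,
                                         self.mirrorView.center.y + delta.y);
}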
OTOH, if you want to actually synthesize the touch events, you will have to do some hacking. Here is a tutorial on how to extend the functionality of the UITouch class so that you can simulate a touch programmatically. This is not a solution you should use if you plan to submit this app to the App Store, since it uses private APIs.
I'm having an issue with handling more than one touch through the touchesBegan/Moved/Ended methods of UIViewController. I'm also seeing the same behaviour in a cocos2d app (using ccTouchesBegan/Moved/Ended) so I think this question can be applied to all touch handling in iOS. I've put the code that I'm using below, followed by the results that I'm seeing.
All methods are implemented on a UIViewController subclass.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"Touches Began");
[self logTouchesFor: event];
[super touchesEnded: touches withEvent: event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"Touches Moved");
[self logTouchesFor: event];
[super touchesEnded: touches withEvent: event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(#"Touches Ended");
[self logTouchesFor: event];
[super touchesEnded: touches withEvent: event];
}
-(void)logTouchesFor:(UIEvent *)event
{
int count = 1;
for (UITouch *touch in event.allTouches)
{
CGPoint location = [touch locationInView: self.view];
NSLog(#"%d: (%.0f, %.0f)", count, location.x, location.y);
count++;
}
}
Now for the interesting results...
Single Touches Work as Expected
Let's say I touch the screen with my thumb. I see in the output window that touchesBegan has been called as expected. I move my thumb around and touchesMoved gets called. Then I lift my thumb off the screen and touchesEnded gets called. All of this is as expected; I'm including it in the question as a control case, just to be clear that my view controller is receiving touch events and that I haven't missed a vc.view.userInteractionEnabled = YES anywhere.
The Second Touch Doesn't Cause touchesBegan, touchesMoved or touchesEnded to be called
This is the most interesting one. Let's say I touch the screen with my thumb (touchesBegan is called) and hold it still on the screen. Then I touch somewhere else on the screen with my index finger, whilst keeping my thumb in the same place. TouchesBegan is not called. Then let's say I move my index finger whilst keeping my thumb absolutely still (this can be tricky but it is possible). TouchesMoved is not called. Then, I lift my index finger off the screen. TouchesEnded is not called. Finally, I move my thumb and touchesMoved is called. Then I lift my thumb from the screen and touchesEnded is called.
Just to be clear: I've set self.view.multipleTouchEnabled = YES in my viewDidLoad method.
Information About the Second Touch is Available, Providing the First Touch Moves
This time I do something very similar to the example immediately above. I touch the screen with my thumb, then my index finger, whilst keeping my thumb still. TouchesBegan is called when my thumb hits the screen, but not when my index finger does. Now I move my thumb, and touchesMoved is called. Not only that, but there are two touches in the event.allTouches set (and yes, the second touch is where I would expect it to be). This means that the system is aware that I have touched the screen a second time, but I am not being notified through the touch handling methods on my view controller.
How Can I be Notified About Changes to the Second Touch?
I'd really like to be able to respond to changes in the location or state of the second touch as they happen, rather than when the first touch also changes. A lot of the time this won't be an issue because it's very hard to change one touch without impacting on the other, but I have at least one situation where it can be an issue. Am I missing something obvious? Has anyone else noticed this behaviour or had issues with it?
In case it's relevant, I'm using an iPhone 3GS running iOS 5.1.
Ok, I started painstakingly rebuilding my project, adding in the files one at a time, and I think I've got it...
There was a subview of the view in question which had userInteractionEnabled == YES. Once I set this to NO, my view controller started getting touchesBegan/Moved/Ended calls for each touch. Presumably, the second touch was being claimed by the subview and not making it to my view controller, but I have no idea why this would only happen with the second touch and not the first.
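A minimal sketch of that fix, assuming the offending subview is exposed as overlaySubview (a hypothetical name):
- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.multipleTouchEnabled = YES;             // allow more than one touch to be reported
    self.overlaySubview.userInteractionEnabled = NO;  // stop the subview from claiming touches
}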
I haven't figured out what's going on with the cocos2d project yet, but presumably it's a different issue as there are no subviews or nodes that could be doing the same thing.
With UITapGestureRecognizer, when the user taps in and then out, do we get two different events for this?
I have put a UITapGestureRecognizer on one of my views. When the user taps in, I need to change the color of the view, and when the user taps out (i.e. removes his finger from the point), the color should change back to the original color. I am able to change the color on tap in, but not on tap out.
Any advice?
You shouldn't (can't?) use gesture recognizers for that, since they are built to handle raw touch events and parse them into gestures. The "tap in" event isn't a gesture by itself though.
Use these:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
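A minimal sketch of that approach in a UIView subclass; touchesEnded is added here to catch the normal finger lift, and the two colors are placeholder assumptions:
// Change color while a finger is down, restore it on lift or cancel.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.backgroundColor = [UIColor redColor];      // assumed "tapped" color
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.backgroundColor = [UIColor whiteColor];    // assumed original color
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.backgroundColor = [UIColor whiteColor];    // restore on interruption as well
}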
I have a slider element that I want to detect a touchesEnded action on.
I'm trying to add the touchesEnded action programmatically:
[self.slider touchesEnded:(NSSet *] touches withEvent:event];
The problem is I have no idea where to get the touches variable from (I'm assuming the event param is the function I wish to call after the touch has ended).
Create a subclass of UISlider. Implement the
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
method in the subclass. Delegate the touch event to your view.
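A rough sketch of that subclass; the MySliderTouchDelegate protocol, touchDelegate property, and sliderTouchesEnded: method are assumptions for illustration, not UIKit API:
// Assumed custom delegate protocol.
@protocol MySliderTouchDelegate <NSObject>
- (void)sliderTouchesEnded:(UISlider *)slider;
@end
@interface MySlider : UISlider
@property (nonatomic, weak) id<MySliderTouchDelegate> touchDelegate;
@end
@implementation MySlider
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];   // keep the slider's normal behaviour
    [self.touchDelegate sliderTouchesEnded:self];   // notify whoever needs the event
}
@end
If subclassing feels heavy, adding a target for UIControlEventTouchUpInside and UIControlEventTouchUpOutside on the slider is another way to be told when the touch ends.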