I have a layer on which I placed a yellow sprite and a child menu (a "Main Menu" button) to look like a modal dialog. There are also a number of buttons and menus below that layer. What I want to achieve is that whenever this layer is presented, touch events are not propagated to the other items. This is what I've done in the layer class:
- (id)init {
...
// I was hoping this would take the touch and swallow it, so a tap on the
// button is not propagated further, because its priority value is the
// lowest (in cocos2d, lower priority values are handled first)
[_menu setHandlerPriority:INT_MIN+1];
...
}
-(void)registerWithTouchDispatcher {
// With this I was hoping that if I tap anywhere else, my layer class would capture the touch first
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:INT_MIN+2 swallowsTouches:YES];
}
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
// so I simply drop any tap on my layer
return NO;
}
Probably I'm missing something, but with this, the buttons below my layer still get the touch, so the modal concept doesn't work.
How can I make all child items of my yellow sprite touchable and everything else not touchable?
Your return in ccTouchBegan is the issue.
Your layer will only swallow touches when ccTouchBegan tells it to (returns YES).
Normally you would check whether the touch is within the bounds of the layer and return YES only then, but in this case, always returning YES will swallow all touches (unless there is another delegate with priority INT_MIN or INT_MIN+1).
Edit: and a second point - make sure you're enabling touches in the first place:
[self setTouchEnabled:YES];
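Putting the two fixes together, a minimal sketch (using the cocos2d targeted-delegate API already shown in the question) could look like:

```objc
-(void)registerWithTouchDispatcher {
    // Register ahead of everything else so this layer sees touches first
    [[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self
                                                              priority:INT_MIN+2
                                                       swallowsTouches:YES];
}

-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Returning YES claims the touch, so with swallowsTouches:YES it never
    // reaches lower-priority delegates (the buttons under the layer)
    return YES;
}
```

The layer's own menu still receives touches because its priority (INT_MIN+1) is handled before the layer's (INT_MIN+2).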
Related
I have a UIViewController subclass whose view will generally contain some number of UIButtons and other interactive elements, which may have one or more gesture recognizers attached to them.
What I'm trying to do is provide some visual feedback in the event that the user taps on a part of the screen that is not interactive. In other words: when the user taps the screen anywhere, if and only if no other control in the view responds to the touch event (including if it's, say, the start of a drag), then I want to fire off a method based on the location of the tap.
Is there a straightforward way to do this that would not require my attaching any additional logic to the interactive elements in the view, or that would at least allow me to attach such logic automatically by traversing the view hierarchy?
You can override the pointInside:withEvent: method of the container view and return NO if there is an interactive element under the tapped location, and return YES otherwise. The case where you return YES will correspond to tapping on an empty location.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Check for interactive elements with CGRectContainsPoint,
    // e.g. CGRectContainsPoint(oneOfTheSubviews.frame, point)
    for (UIView *subview in self.subviews) {
        if (CGRectContainsPoint(subview.frame, point))
            return NO; // an interactive element is under the tap
    }
    return YES; // empty location
}
A good alternative is using the hitTest:withEvent: method as in the example below:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != someView) return nil;
    return hitView;
}
[super hitTest:point withEvent:event] will return the deepest view in that view's hierarchy that was tapped.
So you can check the type of hitView, and if it corresponds to one of your interactive subviews, return it. If it is equal to self, then it means that there isn't a subview under the tapped location, which in your case means that the user tapped on an empty area. In that case you'll do whatever you want to do and then return nil.
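Following that description literally (treating a hit on the container itself as a tap on an empty area), the override could look like this sketch, where handleEmptyAreaTap: is a hypothetical method of your own:

```objc
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView == self) {
        // No interactive subview under the touch: give visual feedback
        // at the tapped location and eat the touch
        [self handleEmptyAreaTap:point];
        return nil;
    }
    // An interactive subview was hit; let it handle the event normally
    return hitView;
}
```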
I have a custom UITableViewCell with a DrawingView as a subview. Whenever the user tries to draw something on that view, the touch events are forwarded to the underlying scroll view (the UITableView) and the view scrolls. How can I disable the forwarding of the touch/scroll events to the scroll view, so that the user can draw on the DrawingView?
Any ideas? I tested the exclusiveTouch property and methods like hitTest: and touchesBegan: to capture the events and stop the scrolling, but nothing helped. Thanks for helping!
The caveat here is that a 'drawing' motion could very easily be interpreted as a scrolling motion.
What you need to do is override pointInside on your cell.
Effectively:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    if (CGRectContainsPoint(drawingView.frame, point)) {
        // Use the point to do the drawing
        [drawingView drawAtPoint:point];
        // Disable scrolling for good measure
        // (UITableView is itself a UIScrollView)
        self.tableView.scrollEnabled = NO;
        return NO;
    }
    // Enable scrolling
    self.tableView.scrollEnabled = YES;
    return [super pointInside:point withEvent:event];
}
What this means is that as long as the user touches your cell inside the drawingView, scrolling won't happen.
If you're looking to scroll and draw at the same time from within the drawingView, that's going to be a lot kludgier to pull off.
See if this works. You may have to do some extra work like forwarding the point to your drawingView to draw something at the point.
Be careful: even while your finger stays on the same point, the pointInside method can be called multiple times, so take care to handle duplicate events.
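One way to guard against those duplicate calls is to remember the last point you forwarded and skip repeats. In this sketch, lastDrawnPoint is a hypothetical CGPoint ivar on the cell:

```objc
if (CGRectContainsPoint(drawingView.frame, point)) {
    // Only forward the point if it actually changed since the last call
    if (!CGPointEqualToPoint(point, lastDrawnPoint)) {
        lastDrawnPoint = point;
        [drawingView drawAtPoint:point];
    }
    self.tableView.scrollEnabled = NO;
    return NO;
}
```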
I'm trying to call an action with the UIControlEventTouchDown event for a simple UIButton placed in the bottom-left corner of a UIViewController (which is pushed with a UINavigationController).
I created a storyboard to push the view controller with the button, and added actions for the button to trace touch events.
- (IBAction)touchUpInside:(id)sender {
    NSLog(@"touchUpInside");
}
- (IBAction)touchDown:(id)sender {
    NSLog(@"touchDown");
}
I also added touchesBegan to trace whether it is called.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touchesBegan");
}
Now, with this setup, I see strange behaviour. There are touch areas on the left (width = 13) and in the bottom-left (width = 50, height = 50) which respond differently to touches. If you touch inside those areas, -touchesBegan is not called on touch down, as it would be with normal behaviour; it is called only after touch up.
I believe the left area is used by UINavigationController for the interactive pop of a pushed UIViewController. So, two questions here.
What functionality is the bottom-left area responsible for?
How and where can I change this behaviour to pass the touch event to the UIButton (for example, if I want the UIButton to respond to a long-press event when I am pressing in the "red" area)?
I had this same problem, and I fixed it by disabling the "swipe to go back" (technically called "interactive pop gesture" in UIKit) feature introduced in iOS 7.
Sample code to disable interactive pop gesture:
if ([self.navigationController respondsToSelector:@selector(interactivePopGestureRecognizer)]) {
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}
I believe this is due to the interactive pop gesture recognizer consuming/delaying touch events near the left edge of the screen (because a swipe to go back starts from the left edge) and thus causing the touch events to not be delivered to controls that are situated near the left edge of the view.
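If you only want the gesture disabled while this particular screen is visible, a common pattern (sketch, assuming the same respondsToSelector: guard as above) is to toggle it in the appearance callbacks so other screens keep swipe-to-go-back:

```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Disable edge-swipe pop while this screen is on top
    self.navigationController.interactivePopGestureRecognizer.enabled = NO;
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Restore the default behaviour for the rest of the stack
    self.navigationController.interactivePopGestureRecognizer.enabled = YES;
}
```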
I'm having an issue with handling more than one touch through the touchesBegan/Moved/Ended methods of UIViewController. I'm also seeing the same behaviour in a cocos2d app (using ccTouchesBegan/Moved/Ended) so I think this question can be applied to all touch handling in iOS. I've put the code that I'm using below, followed by the results that I'm seeing.
All methods are implemented on a UIViewController subclass.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Touches Began");
    [self logTouchesFor:event];
    [super touchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Touches Moved");
    [self logTouchesFor:event];
    [super touchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Touches Ended");
    [self logTouchesFor:event];
    [super touchesEnded:touches withEvent:event];
}
-(void)logTouchesFor:(UIEvent *)event
{
    int count = 1;
    for (UITouch *touch in event.allTouches)
    {
        CGPoint location = [touch locationInView:self.view];
        NSLog(@"%d: (%.0f, %.0f)", count, location.x, location.y);
        count++;
    }
}
Now for the interesting results...
Single Touches Work as Expected
Let's say I touch the screen with my thumb. I see in the output window that touchesBegan has been called as expected. I move my thumb around and touchesMoved gets called. Then I lift my thumb off the screen and touchesEnded gets called. All this is as expected, I'm including it in the question as a control case - just to be clear that my view controller is receiving touch events and I haven't missed a vc.view.userInteractionEnabled = YES anywhere.
The Second Touch Doesn't Cause touchesBegan, touchesMoved or touchesEnded to be called
This is the most interesting one. Let's say I touch the screen with my thumb (touchesBegan is called) and hold it still on the screen. Then I touch somewhere else on the screen with my index finger, whilst keeping my thumb in the same place. TouchesBegan is not called. Then let's say I move my index finger whilst keeping my thumb absolutely still (this can be tricky but it is possible). TouchesMoved is not called. Then, I lift my index finger off the screen. TouchesEnded is not called. Finally, I move my thumb and touchesMoved is called. Then I lift my thumb from the screen and touchesEnded is called.
Just to be clear: I've set self.view.multipleTouchEnabled = YES in my viewDidLoad method.
Information About the Second Touch is Available, Providing the First Touch Moves
This time I do something very similar to the example immediately above. I touch the screen with my thumb, then index finger whilst keeping my thumb still. TouchesBegan is called when my thumb hits the screen, but not my index finger. Now I move my thumb, and touchesMoved is called. Not only that, but there are two touches in the event.allTouches array (and yes, the second touch is where I would expect it to be). This means that the system is aware that I have touched the screen a second time, but I am not being notified through the touch handling methods on my view controller.
How Can I be Notified About Changes to the Second Touch?
I'd really like to be able to respond to changes in the location or state of the second touch as they happen, rather than when the first touch also changes. A lot of the time this won't be an issue because it's very hard to change one touch without impacting on the other, but I have at least one situation where it can be an issue. Am I missing something obvious? Has anyone else noticed this behaviour or had issues with it?
In case it's relevant, I'm using an iPhone 3GS running iOS 5.1.
OK, I started painstakingly rebuilding my project, adding the files back one at a time, and I think I've got it...
There was a subview of the view in question which had userInteractionEnabled == YES. Once I set this to NO, my view controller started getting touchesBegan/Moved/Ended calls for each touch. Presumably, the second touch was being claimed by the subview and not making it to my view controller, but I have no idea why this would only happen with the second touch and not the first.
I haven't figured out what's going on with the cocos2d project yet, but presumably it's a different issue as there are no subviews or nodes that could be doing the same thing.
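To track down which subview is claiming touches, a small debug helper that walks the view hierarchy can be useful. This is a hypothetical sketch, not part of the original answer:

```objc
// Recursively log the interaction settings of a view and its subviews,
// e.g. call [self logInteractiveSubviewsOf:self.view indent:@""]
- (void)logInteractiveSubviewsOf:(UIView *)view indent:(NSString *)indent
{
    NSLog(@"%@%@ userInteractionEnabled=%d multipleTouchEnabled=%d",
          indent, NSStringFromClass([view class]),
          view.userInteractionEnabled, view.multipleTouchEnabled);
    for (UIView *subview in view.subviews) {
        [self logInteractiveSubviewsOf:subview
                                indent:[indent stringByAppendingString:@"  "]];
    }
}
```

Any subview with userInteractionEnabled == YES that covers the touch area is a candidate for intercepting the second touch.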
I have some labels in my app like this...
What I need to do is: when the user taps a label, show that label's name at the bottom of the screen. It works fine when tapping each label separately. But I also want to show the change when the user taps one label and then moves his finger onto another label. That is, once he has pressed on the screen, wherever his finger moves, I want to trace those locations and show the changes. How can I do this? Please explain briefly.
Thanks in advance.
By default, touch events are only sent to the view they started in. So the easiest way to do what you're trying to do would be to put all of your labels in a container view that intercepts the touch events, and let the container view decide how to handle the events.
First create a UIView subclass for the container and intercept the touch events by overriding hitTest:withEvent::
-(UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // intercept touches
    if ([self pointInside:point withEvent:event]) {
        return self;
    }
    return nil;
}
Set that custom class as the class of the container view. Then, implement the various touches*:withEvent: methods on your container view. In your case, something like this should work:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // determine which view is under the touch
    UIView *view = [super hitTest:[[touches anyObject] locationInView:self] withEvent:nil];
    // get that label's text and set it on the indicator label
    if (view != nil && view != self) {
        if ([view respondsToSelector:@selector(text)]) {
            // update the text of the indicator label
            [[self indicatorLabel] setText:[(UILabel *)view text]];
        }
    }
}
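So that the label also updates on the initial press, before the finger has moved, you can give touchesBegan: the same behaviour. A minimal sketch that simply reuses the lookup above:

```objc
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Treat the initial press like a move so the indicator updates immediately
    [self touchesMoved:touches withEvent:event];
}
```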