My interface sometimes has buttons around its periphery. Areas without buttons accept gestures.
GestureRecognizers are added to the container view, in viewDidLoad. Here’s how the tapGR is set up:
UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(playerReceived_Tap:)];
[tapGR setDelegate:self];
[self.view addGestureRecognizer:tapGR];
To prevent the gesture recognizers from intercepting button taps, I implemented gestureRecognizer:shouldReceiveTouch: to return YES only if the view touched is not a button:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
shouldReceiveTouch:(UITouch *)touch {
// Get the topmost view that contains the point where the gesture started.
// (Buttons are topmost, so if they were touched, they will be returned as viewTouched.)
CGPoint pointPressed = [touch locationInView:self.view];
UIView *viewTouched = [self.view hitTest:pointPressed withEvent:nil];
// If that topmost view is a button, the GR should not take this touch.
if ([viewTouched isKindOfClass:[UIButton class]])
return NO;
return YES;
}
This works fine most of the time, but there are a few buttons that are unresponsive. When these buttons are tapped, hitTest returns the container view, not the button, so shouldReceiveTouch returns YES and the gestureRecognizer commandeers the event.
To debug, I ran some tests...
The following tests confirmed that the button was a sub-subview of the container view, that it was enabled, and that both the button and its superview had userInteractionEnabled set:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
shouldReceiveTouch:(UITouch *)touch {
// Test that hierarchy is as expected: containerView > vTop_land > btnSkipFwd_land.
for (UIView *subview in self.view.subviews) {
if ([subview isEqual:self.playComposer.vTop_land])
printf("\nViewTopLand is a subview."); // this prints
}
for (UIView *subview in self.playComposer.vTop_land.subviews) {
if ([subview isEqual:self.playComposer.btnSkipFwd_land])
printf("\nBtnSkipFwd is a subview."); // this prints
}
// Test that problem button is enabled.
printf("\nbtnSkipFwd enabled? %d", self.playComposer.btnSkipFwd_land.enabled); // prints 1
// Test that all views in hierarchy are interaction-enabled.
printf("\nvTopLand interactionenabled? %d", self.playComposer.vTop_land.userInteractionEnabled); // prints 1
printf("\nbtnSkipFwd interactionenabled? %d", self.playComposer.btnSkipFwd_land.userInteractionEnabled); // prints 1
// etc
}
The following test confirms that the point pressed is actually within the button’s frame.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gr
shouldReceiveTouch:(UITouch *)touch {
CGPoint pointPressed = [touch locationInView:self.view];
CGRect rectSkpFwd = self.playComposer.btnSkipFwd_land.frame;
// Get the pointPressed relative to the button's frame.
CGPoint pointRelSkpFwd = CGPointMake(pointPressed.x - rectSkpFwd.origin.x, pointPressed.y - rectSkpFwd.origin.y);
printf("\nIs relative point inside skipfwd? %d.", [self.playComposer.btnSkipFwd_land pointInside:pointRelSkpFwd withEvent:nil]); // prints 1
// etc
}
So why is hitTest returning the container view rather than this button?
SOLUTION: The one thing I wasn't testing was whether the intermediate view, vTop_land, was framed properly. It looked OK because its image extended across the screen, past the bounds of its frame (I didn't know that was possible). The frame was set to the portrait width rather than the landscape width, so buttons on the far right were outside it.
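In hindsight, a check like the following would have caught this. It's only a sketch, using the properties from the code above; the key fact is that the default hitTest:withEvent: returns nil for any subtree whose view fails pointInside:, so a button drawn outside its superview's frame is unreachable even though it is visible:

```objc
// Debug check (sketch): convert the button's frame into the container's
// coordinate system and verify every ancestor's frame actually encloses it.
CGRect buttonRectInContainer =
    [self.playComposer.vTop_land convertRect:self.playComposer.btnSkipFwd_land.frame
                                      toView:self.view];
if (!CGRectContainsRect(self.playComposer.vTop_land.frame, buttonRectInContainer)) {
    NSLog(@"btnSkipFwd_land lies outside vTop_land's frame -- hitTest: will miss it.");
}
```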
hitTest: is not reliable in many cases like this, and it is generally not advisable to use it alongside gesture recognizers.
Why don't you setExclusiveTouch:YES for each button? This should make sure that the buttons are always chosen.
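For example, a sketch of that suggestion, assuming the buttons are direct subviews of self.view as in the question:

```objc
// exclusiveTouch makes a button the sole receiver of the touches it is
// tracking, so the container's gesture recognizer can't steal taps on it.
for (UIView *subview in self.view.subviews) {
    if ([subview isKindOfClass:[UIButton class]]) {
        subview.exclusiveTouch = YES;
    }
}
```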
Related
I am working on an iOS map app that includes an interactive map. The interactive map is a subclass of UIImageView placed on a scroll view. My view hierarchy is shown below:
When the user taps some part of the map, the ViewController performs an animated segue (like a zoom-in to that area of the map). I can start the segue from any point on the screen, but to do it properly I need the exact coordinates of the user's tap relative to the screen itself. Since the ImageView sits on top of the ScrollView, it uses a different coordinate system, larger than the screen. It doesn't matter which area of the map was tapped; what matters is the tapped CGPoint on the (physical) screen.
ImageView uses its own code to get coordinates of a tap:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    // cancel previous touch ended event
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    CGPoint touchPoint = [[touches anyObject] locationInView:self];
    NSValue *touchValue = [NSValue valueWithCGPoint:touchPoint];
    // perform new one
    [self performSelector:@selector(_performHitTestOnArea:)
               withObject:touchValue
               afterDelay:0.1];
}
If I attach a gesture recognizer, it works, but then the ImageView doesn't receive any touches and therefore can't trigger the segue.
The gesture recognizer code I attempted to use:
UITapGestureRecognizer *rec = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapRecognized:)];
[someView addGestureRecognizer:rec];
[rec release];
// elsewhere
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
if(recognizer.state == UIGestureRecognizerStateRecognized)
{
CGPoint point = [recognizer locationInView:recognizer.view];
// again, point.x and point.y have the coordinates
}
}
So, is there any way to get the two coordinates in their different reference systems, or to make these recognizers work simultaneously without interfering with each other?
Solved
I use this code to convert the touched point from one view's reference system to another:
CGPoint pointInViewCoords = [self.parentView convertPoint:self.imageView.touchPoint fromView:self.imageView];
Where self.parentView is "View" on hierarchy image - with the size of the screen.
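Note that a single recognizer can also report the tap in both reference systems directly, because locationInView: takes the target view as a parameter. A sketch, reusing the tapRecognized: handler and the self.imageView / self.parentView properties from above:

```objc
- (void)tapRecognized:(UITapGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateRecognized) {
        // Point in the image view's (map) coordinate system.
        CGPoint mapPoint = [recognizer locationInView:self.imageView];
        // The same touch in the screen-sized parent view's coordinate system.
        CGPoint screenPoint = [recognizer locationInView:self.parentView];
        NSLog(@"map: %@  screen: %@",
              NSStringFromCGPoint(mapPoint), NSStringFromCGPoint(screenPoint));
    }
}
```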
On the iPad, when you put your finger outside the top or bottom edge of the screen and then drag it onto the screen, a menu is revealed. How can I implement that?
There is a gesture recognizer class specifically for this, introduced in iOS 7: UIScreenEdgePanGestureRecognizer. The documentation for it is here. Check it out.
To test this in the simulator, just start the drag from near the edge (~15 points).
Also, you will have to create a gesture recognizer for each edge. You can't OR edges together, so UIRectEdgeAll won't work.
There is a simple example here. Hope this helps!
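A minimal sketch of the setup, with one recognizer per edge as noted above (handleEdgePan: is a hypothetical handler):

```objc
- (void)viewDidLoad
{
    [super viewDidLoad];
    // One recognizer per edge -- edges cannot be OR'd together.
    for (NSNumber *edge in @[@(UIRectEdgeTop), @(UIRectEdgeBottom)]) {
        UIScreenEdgePanGestureRecognizer *edgePan =
            [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self
                                                              action:@selector(handleEdgePan:)];
        edgePan.edges = [edge unsignedIntegerValue];
        [self.view addGestureRecognizer:edgePan];
    }
}
```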
Well, you can do something like this. This example covers the case where you want your pan gesture to work only when the user swipes 20 px in from the right-hand side of the screen.
First of all, add the gesture to your window:
- (void)addGestures {
if (!_panGesture) {
_panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
[_panGesture setDelegate:self];
[self.view addGestureRecognizer:_panGesture];
}
}
After adding it, check whether the touch you received belongs to the pan gesture, and then perform your action accordingly:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
CGPoint point = [touch locationInView:self.view];
if (gestureRecognizer == _panGesture) {
return [self slideMenuForGestureRecognizer:gestureRecognizer withTouchPoint:point];
}
return YES;
}
Here is how you can check whether your touch is contained in the region where you want it to be
-(BOOL)isPointContainedWithinBezelRect:(CGPoint)point {
CGRect leftBezelRect;
CGRect tempRect;
    // this will be the width between CGRectMaxXEdge and the screen offset, thus identifying the region
    CGFloat bezelWidth = 20.0;
CGRectDivide(self.view.bounds, &leftBezelRect, &tempRect, bezelWidth, CGRectMaxXEdge);
return CGRectContainsPoint(leftBezelRect, point);
}
I have a UIView with two gesture recognizers. Both recognize tap with two fingers: one for the upper half of the screen, the other for the bottom of the screen.
In that UIView, I have 4 buttons that cover the entire screen (each button is a quarter of the screen).
I'm using the gesture recognizer to detect when the user presses 2 buttons at the same time, and I still want to recognize the normal touches on the buttons.
I've set up everything, and it works fine. However, when pressing with just one finger, the shadow on the button appears on Touch Up, not on Touch Down, and it feels weird. I've tried changing delaysTouchesBegan with no success.
Is there a way to have both behaviours? Detect the touches with two fingers, but keep the normal behaviour when there's only one finger? Otherwise, can I force the pressed state of a UIButton?
Here's how I setup my gestures :
-(void)initGestureRecognition{
handClapTapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handClapDetected:)];
handClapTapGestureRecognizer.numberOfTouchesRequired = 2;
handClapTapGestureRecognizer.numberOfTapsRequired = 1;
handClapTapGestureRecognizer.cancelsTouchesInView = YES;
[self.gestureRecognitionView addGestureRecognizer:handClapTapGestureRecognizer];
handClapTapGestureRecognizer.delegate = self;
jumpTapGestureRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(jumpDetected:)];
jumpTapGestureRecognizer.numberOfTouchesRequired = 2;
jumpTapGestureRecognizer.numberOfTapsRequired = 1;
jumpTapGestureRecognizer.cancelsTouchesInView = YES;
[self.gestureRecognitionView addGestureRecognizer:jumpTapGestureRecognizer];
jumpTapGestureRecognizer.delegate = self;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer{
return NO;
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch{
if ([gestureRecognizer isEqual:handClapTapGestureRecognizer] && [touch locationInView:self.view].y > self.view.frame.size.height/2)
return NO;
if ([gestureRecognizer isEqual:jumpTapGestureRecognizer] && [touch locationInView:self.view].y < self.view.frame.size.height/2)
return NO;
return YES;
}
I know my problem is similar to this one: UIButton inside a view that has a UITapGestureRecognizer. The difference is that in my case the behaviour is OK, and I'm just trying to get the shadow on the button on Touch Down rather than on Touch Up.
Thanks
Could you manually set [button setHighlighted:YES] when the tap gesture is first recognized and its location matches the button's, and then set it back to NO when the gesture ends?
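A sketch of that idea, assuming self.buttons is a hypothetical array holding the four buttons as direct subviews of self.view; locationOfTouch:inView: gives each finger's position separately, which matters for a two-finger tap landing on two different buttons:

```objc
- (void)handClapDetected:(UITapGestureRecognizer *)gesture
{
    // Highlight whichever buttons the individual touches landed on.
    for (NSUInteger i = 0; i < gesture.numberOfTouches; i++) {
        CGPoint point = [gesture locationOfTouch:i inView:self.view];
        for (UIButton *button in self.buttons) { // hypothetical array of the 4 buttons
            if (CGRectContainsPoint(button.frame, point)) {
                button.highlighted = YES;
                // A tap ends immediately, so clear the highlight a moment later.
                dispatch_after(dispatch_time(DISPATCH_TIME_NOW,
                                             (int64_t)(0.15 * NSEC_PER_SEC)),
                               dispatch_get_main_queue(),
                               ^{ button.highlighted = NO; });
            }
        }
    }
}
```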
I have the following problem.
I am using a UILongPressGestureRecognizer to put a UIView into a "toggle mode". If the UIView is in "toggle mode" the user is able to drag the UIView around the screen. For dragging the UIView around the screen I am using the methods touchesBegan, touchesMoved and touchesEnded.
It works, but I have to lift my finger and press again in order to drag, because touchesBegan has already been called and is not called again, so I can't drag the UIView around the screen in one motion.
Is there any way to manually call touchesBegan after the UILongPressGestureRecognizer has been triggered? (The UILongPressGestureRecognizer changes a BOOL value, and touchesBegan only works if this BOOL is set to YES.)
UILongPressGestureRecognizer is a continuous gesture recognizer, so rather than resorting to touchesMoved or UIPanGestureRecognizer, just check for UIGestureRecognizerStateChanged, e.g.:
- (void)viewDidLoad
{
[super viewDidLoad];
UILongPressGestureRecognizer *gesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
[self.view addGestureRecognizer:gesture];
}
- (void)handleGesture:(UILongPressGestureRecognizer *)gesture
{
CGPoint location = [gesture locationInView:gesture.view];
if (gesture.state == UIGestureRecognizerStateBegan)
{
// user held down their finger on the screen
// gesture started, entering the "toggle mode"
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
// user did not lift finger, but now proceeded to move finger
// do here whatever you wanted to do in the touchesMoved
}
else if (gesture.state == UIGestureRecognizerStateEnded)
{
// user lifted their finger
// all done, leaving the "toggle mode"
}
}
I would suggest you use UIPanGestureRecognizer, as it is the recommended gesture for dragging.
You can configure the minimum and maximum number of touches required for panning using the following properties:
maximumNumberOfTouches
minimumNumberOfTouches
You can handle the states Began, Changed, and Ended, for example to run an animation in the required states.
Use the method below to translate the point into the UIView in which you want it:
- (void)setTranslation:(CGPoint)translation inView:(UIView *)view
Example:
Use an instance variable to retain the original frame; capture it in UIGestureRecognizerStateBegan. While the state is UIGestureRecognizerStateChanged, you can read the translation:
-(void)panningMyView:(UIPanGestureRecognizer *)panGesture {
    if (panGesture.state == UIGestureRecognizerStateBegan) {
        // retain the original frame
    } else if (panGesture.state == UIGestureRecognizerStateChanged) {
        CGPoint translatedPoint = [panGesture translationInView:self.view];
        // here you get your new drag points
    }
}
Based on the velocity of the drag, you can also add an animation, for example to show a UIView bouncing:
- (CGPoint)velocityInView:(UIView *)view
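Putting the pieces together, a minimal drag handler might look like this. It is only a sketch: originalCenter is an assumed instance variable captured when the pan begins, and the spring parameters are arbitrary:

```objc
- (void)panningMyView:(UIPanGestureRecognizer *)panGesture
{
    if (panGesture.state == UIGestureRecognizerStateBegan) {
        originalCenter = panGesture.view.center; // retain the original position
    } else if (panGesture.state == UIGestureRecognizerStateChanged) {
        // Move the view by the accumulated translation since the pan began.
        CGPoint t = [panGesture translationInView:self.view];
        panGesture.view.center = CGPointMake(originalCenter.x + t.x,
                                             originalCenter.y + t.y);
    } else if (panGesture.state == UIGestureRecognizerStateEnded) {
        // Use the release velocity to decide how hard to bounce back.
        CGPoint v = [panGesture velocityInView:self.view];
        [UIView animateWithDuration:0.3
                              delay:0
             usingSpringWithDamping:0.5
              initialSpringVelocity:fabs(v.y) / 1000.0
                            options:0
                         animations:^{ panGesture.view.center = self->originalCenter; }
                         completion:nil];
    }
}
```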
The moment I receive touchesBegan, I want to removeFromSuperview the view that was touched, add it as a subview of a new parent view, and then continue to receive touches. However, I am finding that sometimes this does not work. Specifically, touchesMoved and touchesEnded are never called.
Is there a trick for making this work correctly? This is for implementing a drag and drop behavior, where the view is initially inside a scroll view.
Thanks.
Instead of:
[transferView removeFromSuperview];
[newParentView addSubview:transferView];
Use only:
[newParentView addSubview:transferView];
The documentation states: "Views can have only one superview. If view already has a superview and that view is not the receiver, this method removes the previous superview before making the receiver its new superview."
Therefore there is no need to call removeFromSuperview, because it is handled by addSubview:. I have noticed that removeFromSuperview ends any current touches without calling touchesEnded. If you use only addSubview:, touches are not interrupted.
You need to process your touches in the superview instead of in the view that you want switched out. This will allow you to switch out the view without losing your touch events. When you do this, though, you'll have to test yourself whether the touch is occurring in the specific subview you want switched out. This can be done many ways, but here are some methods to get you started:
Converting Rects/Point to another view:
[view convertRect:rect toView:subview];
[view convertPoint:point toView:subview];
Here are some methods to test if the point is located in the view:
[subView hitTest:point withEvent:nil];
CGRectContainsPoint(subview.frame, point); //No point conversion needed
[subView pointInside:point withEvent:nil];
In general, it's better to use UIGestureRecognizers. For example, if you were using a UIPanGestureRecognizer, you would create a method that the gesture recognizer can call and in that method you do your work. For example:
- (void) viewPanned:(UIPanGestureRecognizer *)pan{
if (pan.state == UIGestureRecognizerStateBegan){
CGRect rect = subView.frame;
newView = [[UIView alloc] initWithFrame:rect]; // newView is an instance variable so later states can reference it
[subView removeFromSuperview];
[self addSubview:newView];
} else if (pan.state == UIGestureRecognizerStateChanged){
CGPoint point = [pan locationInView:self];
newView.center = point;
} else {
//Do cleanup or final view placement
}
}
Then you init the recognizer, assign it to the target (usually self) and add it:
[self addGestureRecognizer:[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(viewPanned:)]];
Now self (which would be the superview managing its subviews) will respond to pan motions.