I have a map drawn with OpenGL ES, and I have a pan gesture recognizer with maximumNumberOfTouches set to 1 for panning around the map, plus a pinch gesture recognizer for zooming. I want panning to start as soon as I'm done zooming (when one finger is lifted off the screen), but the pan gesture recognizer doesn't kick in until the pinch gesture recognizer finishes, which is only when it detects there are no fingers left on the screen. Any ideas?
It may be possible to allow both gestures to be active via the delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // The two recognizers using the delegate (scale and rotate) should both be active.
    return YES;
}
Then keep a BOOL that tracks whether the user is zooming, and don't allow the code in the pan gesture handler to execute while that BOOL is YES. You can update the BOOL by checking the number of touches in the touchesMoved:withEvent: method (or perhaps some other UIGestureRecognizer method).
I think this should work; I do something similar in an app that allows scaling, rotating, and dragging, where dragging is only allowed when the user isn't scaling/rotating.
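For example, here is a minimal sketch of that idea. It flips the flag from the pinch handler's state rather than from touchesMoved:withEvent:, and the isZooming property and handler names are placeholders, not anything from your project:

// Assumed property on the view controller: @property (nonatomic) BOOL isZooming;

- (void)handlePinch:(UIPinchGestureRecognizer *)pinch
{
    if (pinch.state == UIGestureRecognizerStateBegan) {
        self.isZooming = YES;
    } else if (pinch.state == UIGestureRecognizerStateEnded ||
               pinch.state == UIGestureRecognizerStateCancelled) {
        self.isZooming = NO;
    }
    // ... apply the zoom to the map here ...
}

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    if (self.isZooming) {
        return; // ignore pan updates while the pinch is still active
    }
    // ... apply the pan to the map here ...
}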
~Good Luck
EDIT: I'll make it clearer what I am trying to achieve.
I have some annotations on a MKMapView that I want to make easy to drag. The standard Apple way to move the annotations is to tap once then quickly tap and hold and drag. The users of the app have complained that it is too difficult to do this.
So technically what I'd like to do is to use a pan gesture. The problem is that MKMapView is also using the pan gesture to move the map about.
What I would like to do is, when the user does the pan gesture, check if the pan gesture starts really close to an annotation; if so, then let the pan gesture handler move the annotation. I have this part working.
But if the pan gesture was not close to an annotation, then pass the gesture on to the MKMapView for it to be handled there.
// EDIT END
I've a method to handle a pan gesture. This gets called as I would expect when there is a pan gesture on the MKMapView. Sometimes I would not want to handle the gesture in my method but to pass the gesture through to the MKMapView to pan/drag the map around like normal.
Here is an outline of what I've got so far.
The pan gesture is being handled by the method:
-(void)panGesture:(UIPanGestureRecognizer*)sender
Depending on some logic I would like to pass this gesture through to the MKMapView (self.mapView).
Can anybody share the code to do this please?
I tried [self.mapView gestureRecognizerShouldBegin:sender]; but nothing happened from this call.
- (void)addPanGesture
{
    self.panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panGesture:)];
    self.panGesture.delegate = self;
    [self.panGesture setMinimumNumberOfTouches:1];
    [self.panGesture setMaximumNumberOfTouches:1];
    [self.mapView addGestureRecognizer:self.panGesture];
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    BOOL result = NO;
    if ((gestureRecognizer == self.panGesture) && [[otherGestureRecognizer view] isDescendantOfView:[gestureRecognizer view]])
    {
        result = YES;
    }
    return result;
}
- (void)panGesture:(UIPanGestureRecognizer *)sender
{
    // some logic to see if we will handle the gesture in this method or pass the gesture on to the MKMapView
    return;
}
The standard Apple way to move the annotations is to tap once then quickly tap and hold and drag. The users of the app have complained that it is too difficult to do this.
I don't think so... :(
Here is my take: it's actually a bad idea to use a pan gesture when Apple already provides a default drag-and-drop facility. That said, if you go with the approach mentioned in your question, you will need to add a separate pan gesture recognizer to each and every annotation displayed on the map. Give each one its own tag and point them all at the same gesture handler method, so you can easily manage them by tag. Then, when the end user taps or touches any annotation, its handler will be called and you can identify the touched annotation by its tag, and write your drag-and-drop code for that annotation only. Not sure, but this might solve your problem.
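A rough, untested sketch of that per-annotation setup (addPanRecognizersToAnnotationViews and handleAnnotationPan: are placeholder names; it assumes the annotation views are already on screen):

- (void)addPanRecognizersToAnnotationViews
{
    NSInteger tag = 0;
    for (id<MKAnnotation> annotation in self.mapView.annotations) {
        MKAnnotationView *annotationView = [self.mapView viewForAnnotation:annotation];
        if (annotationView == nil) {
            continue; // this annotation is not currently on screen
        }
        annotationView.tag = tag++;

        UIPanGestureRecognizer *pan =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleAnnotationPan:)];
        [annotationView addGestureRecognizer:pan];
    }
}

// One handler for every annotation; the view's tag identifies which one is being dragged.
- (void)handleAnnotationPan:(UIPanGestureRecognizer *)pan
{
    NSInteger tag = pan.view.tag;
    NSLog(@"Dragging annotation view with tag %ld", (long)tag);
    // ... update that annotation's coordinate here ...
}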
I agree completely with iPatel: If possible one should use the standard Apple way, since users are used to it.
But if you really want to implement your own solution, maybe the following might work (not tested):
The MapView has a built-in PanGestureRecognizer (called builtInPGR here). I assume you added your own PanGestureRecognizer (called ownPGR here) to this MapView.
Now the following function in the UIGestureRecognizerDelegate protocol controls whether two gesture recognisers should recognise simultaneously (see the docs):
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                       shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool
The default implementation returns false.
You mentioned that you can drag your annotations using ownPGR. This means that builtInPGR does not recognise the pan gesture simultaneously with ownPGR.
If you now implement this delegate function and return true, ownPGR and builtInPGR should work simultaneously: builtInPGR would move the map, and ownPGR would check whether a region close to an annotation has been tapped and, if so, would move the annotation.
Now, this delegate function also gives you a pointer to builtInPGR: the otherGestureRecognizer.
So if ownPGR decides to drag an annotation because it started close to one, you can set the isEnabled property of builtInPGR to false. This makes builtInPGR transition to a cancelled state, and the map is no longer dragged.
Of course, you have to set builtInPGR.isEnabled back to true when ownPGR finishes dragging the annotation.
Once more: I don't know if this works, but it might. And I strongly recommend using the standard approach!
EDIT: I'm sorry, I just realised that you wanted the answer in Obj-C. But, using the docs (just switch to Obj-C), it should be easy to translate.
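For reference, a rough (untested) Obj-C sketch of the idea, using the panGesture from the question. mapPanRecognizer and annotationToDrag are assumed properties, and annotationNearPoint: is a hypothetical hit-test helper you would write yourself:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Remember the map's built-in pan recognizer when the delegate hands it to us.
    if (gestureRecognizer == self.panGesture &&
        [otherGestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        self.mapPanRecognizer = (UIPanGestureRecognizer *)otherGestureRecognizer;
    }
    return YES;
}

- (void)panGesture:(UIPanGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateBegan) {
        CGPoint point = [sender locationInView:self.mapView];
        self.annotationToDrag = [self annotationNearPoint:point]; // your own hit test
        // If we grabbed an annotation, disable the map's own pan so the map stays put.
        self.mapPanRecognizer.enabled = (self.annotationToDrag == nil);
    }
    else if (sender.state == UIGestureRecognizerStateChanged && self.annotationToDrag != nil) {
        // ... move self.annotationToDrag to the new coordinate here ...
    }
    else if (sender.state == UIGestureRecognizerStateEnded ||
             sender.state == UIGestureRecognizerStateCancelled) {
        self.annotationToDrag = nil;
        self.mapPanRecognizer.enabled = YES; // let the map pan again
    }
}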
I want to implement back navigation using a long press followed by a swipe to the left, without lifting the finger, but the swipe gesture isn't recognised if I don't lift the finger after the long press.
I also implemented the following delegate method, but it doesn't produce the desired result. Any thoughts?
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    if (gestureRecognizer == _longPress && otherGestureRecognizer == _swipe) {
        return YES;
    }
    if (gestureRecognizer == _swipe && otherGestureRecognizer == _longPress) {
        return YES;
    }
    return NO;
}
Edit:
- The long press gesture fires a method which changes the background color of the current UIViewController's view (I added it just to see whether it fires).
- The swipe gesture fires a method that calls popViewControllerAnimated:.
Don't use two different gesture recognisers, because this is really one gesture. You should create a custom gesture recognizer subclass to encode your logic, so it's a single logical gesture for you to add and for the user to execute.
Inside your gesture recognizer I'd have a small state machine so you know when the touch starts, when the long press time is up, and whether the user has actually swiped far enough. From each state you're only looking for one thing to happen; if anything else happens, you know it's a failure and the gesture can fail out.
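A rough, untested sketch of such a subclass. The class name, the 0.5-second hold time, and the 10/80-point thresholds are all placeholders:

#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>

typedef NS_ENUM(NSInteger, PressSwipePhase) {
    PressSwipePhaseIdle,
    PressSwipePhasePressing, // finger down, waiting for the hold time to elapse
    PressSwipePhaseArmed     // hold time elapsed, waiting for the leftward swipe
};

@interface PressThenSwipeLeftGestureRecognizer : UIGestureRecognizer
@end

@implementation PressThenSwipeLeftGestureRecognizer {
    PressSwipePhase _phase;
    CGPoint _startPoint;
    NSTimeInterval _startTime;
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    if (touches.count != 1 || _phase != PressSwipePhaseIdle) {
        self.state = UIGestureRecognizerStateFailed;
        return;
    }
    _phase = PressSwipePhasePressing;
    _startPoint = [[touches anyObject] locationInView:self.view];
    _startTime = [NSDate timeIntervalSinceReferenceDate];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    NSTimeInterval held = [NSDate timeIntervalSinceReferenceDate] - _startTime;

    if (_phase == PressSwipePhasePressing) {
        // Moving too far before the hold time is up means this is not our gesture.
        if (held < 0.5 && (fabs(point.x - _startPoint.x) > 10.0 || fabs(point.y - _startPoint.y) > 10.0)) {
            self.state = UIGestureRecognizerStateFailed;
        } else if (held >= 0.5) {
            _phase = PressSwipePhaseArmed;
            _startPoint = point; // measure the swipe from here
        }
    } else if (_phase == PressSwipePhaseArmed) {
        // Recognise once the finger has travelled far enough to the left.
        if (_startPoint.x - point.x > 80.0) {
            self.state = UIGestureRecognizerStateRecognized;
        }
    }
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    // Lifting the finger before the swipe completed means the gesture failed.
    if (self.state == UIGestureRecognizerStatePossible) {
        self.state = UIGestureRecognizerStateFailed;
    }
}

- (void)reset
{
    [super reset];
    _phase = PressSwipePhaseIdle;
}

@end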
I have a rectangle (a UIView subclass). This rectangle has a UITapGestureRecognizer, a UIPanGestureRecognizer, a UIPinchGestureRecognizer and a UIRotationGestureRecognizer attached.
These gesture recognizers are used to move the view, zoom it and rotate it.
However there is a conflict between the recognizers since rotating doesn't work while panning and so on...
I tried the following
[self.pincher requireGestureRecognizerToFail:self.panner];
With this, rotating works while panning. How can I extend that so that pinching works as well?
Edit:
What's working:
Every gesture recognizer works on its own. But in combination there is trouble, or let's say I want a different behavior ;) If I pan an object and then touch it with a second finger and pinch, I want to zoom the object; this is not working.
However, starting rotating with the second finger works with the code line above.
The UIGestureRecognizerDelegate protocol declares the method gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:.
You can implement this method and return YES to have your recognizers work at the same time:
self.pincher.delegate = self;
self.panner.delegate = self;
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
EDIT:
I've written a simple view controller that implements simultaneous scale, rotate, and pan on a UIView; you can check out this gist.
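Not the gist itself, but a minimal version of the idea (the handler names are placeholders; each recognizer's delegate is set to the view controller):

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    // Move the view by the pan translation, then reset so the translation stays incremental.
    CGPoint translation = [pan translationInView:pan.view.superview];
    pan.view.center = CGPointMake(pan.view.center.x + translation.x,
                                  pan.view.center.y + translation.y);
    [pan setTranslation:CGPointZero inView:pan.view.superview];
}

- (void)handlePinch:(UIPinchGestureRecognizer *)pinch
{
    pinch.view.transform = CGAffineTransformScale(pinch.view.transform, pinch.scale, pinch.scale);
    pinch.scale = 1.0; // reset so the scale stays incremental
}

- (void)handleRotation:(UIRotationGestureRecognizer *)rotation
{
    rotation.view.transform = CGAffineTransformRotate(rotation.view.transform, rotation.rotation);
    rotation.rotation = 0.0; // reset so the rotation stays incremental
}

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}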
How can I know when the finger is down and when it is up with a UITapGestureRecognizer?
The documentation says I should only handle UIGestureRecognizerStateEnded as a tap, so I assumed there is a UIGestureRecognizerStateBegan when the finger goes down, but all I get is UIGestureRecognizerStateEnded.
The code I use to register the recognizer is:
[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)]
UITapGestureRecognizer is a discrete gesture recognizer, and therefore never transitions to the began or changed states. From the UIGestureRecognizer Class Reference:
Discrete gestures transition from Possible to either Recognized (UIGestureRecognizerStateRecognized) or Failed (UIGestureRecognizerStateFailed), depending on whether they successfully interpret the gesture or not. If the gesture recognizer transitions to Recognized, it sends its action message to its target.
(Remembering of course that UIGestureRecognizerStateRecognized == UIGestureRecognizerStateEnded).
The docs are saying that you should check the state of a tap gesture recognizer to see that it is in its ended state, before you fire your code to say that it has been recognized. They are not saying that the tap gesture actually transitions to the began or changed states (although I admit that the docs are a little misleading in the language used!).
If you want to check for the finger down event for a tap gesture recognizer, I would recommend just using touchesBegan:withEvent:, since this is what you are really after anyway.
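For example, a minimal sketch in a UIView subclass (TouchReportingView is just a placeholder name):

@interface TouchReportingView : UIView
@end

@implementation TouchReportingView

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    NSLog(@"finger down"); // fires as soon as the finger touches the view
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    NSLog(@"finger up"); // fires when the finger is lifted
}

@end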
You could implement the delegate method -gestureRecognizer:shouldReceiveTouch:, which is called before the recognizer receives a new touch.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    NSLog(@"Hello from press down");
    return YES;
}
I need to have a view where users can draw multiple rectangles. Now I need to be able to move those rectangles using a pan gesture. The problem I am having is that I can move any single rectangle with the pan gesture recognizer with no problem, but when I use two fingers to pan two rectangles simultaneously, it doesn't work.
It seems to me that the problem is that the target/action I specified for the pan gesture gets fired only once.
My view adopts the UIGestureRecognizerDelegate protocol and defines this delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
Thanks in advance!
Add a gesture recognizer to each created rectangle view individually by making a method, like this:
- (void)addGestureRecognizersToPiece:(UIView *)piece
{
    // Each piece gets its own pan recognizer, so two pieces can be dragged at the same time.
    // handlePiecePan: is a placeholder for your own pan handler method.
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                          action:@selector(handlePiecePan:)];
    pan.delegate = self;
    [piece addGestureRecognizer:pan];
}
You can loop over all the pieces to add the gesture recognizers at once, or, as you create each rectangle, just call this method right after creating it, passing the new rectangle as the piece parameter (a quick loop sketch follows below).
That should help.
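A quick usage sketch, assuming the rectangle views are kept in an array property (self.pieces is just a placeholder name):

for (UIView *piece in self.pieces) {
    [self addGestureRecognizersToPiece:piece];
}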