UISwipeGestureRecognizer to override all others? - ios

I have a pretty standard application that uses gesture recognizers in various places. I’m trying to add an above-all three-finger UISwipeGestureRecognizer that can be performed anywhere in the app, similar to Apple’s four-finger system gestures. This works fine in some views, but if there’s another swipe recognizer beneath it, that one triggers instead of the new one.
I’d like this new three-finger swipe to be given priority at all times. I’ve added it to my root view controller’s view, but other recognizers still seem to intercept it at times.
Is there an easier way to do this than going through and requiring all other recognizers to fail?

You can use the requireGestureRecognizerToFail: method to filter out unwanted gestures.
Apple doc.
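A minimal sketch of that setup for the question’s three-finger swipe (handleThreeFingerSwipe: and someSubview are placeholders; note that you still have to visit each conflicting recognizer, which is what the asker hoped to avoid):

```objc
UISwipeGestureRecognizer *threeFingerSwipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleThreeFingerSwipe:)];
threeFingerSwipe.numberOfTouchesRequired = 3;
[self.view addGestureRecognizer:threeFingerSwipe];

// Every other swipe recognizer must now wait for the three-finger swipe to fail.
for (UIGestureRecognizer *other in someSubview.gestureRecognizers) {
    [other requireGestureRecognizerToFail:threeFingerSwipe];
}
```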

Related

Swift - Performance of `UIGestureRecognizer` when having many of them

I've been thinking about this for quite some time now and I haven't found a suitable answer.
How performant are UIGestureRecognizers in Swift/iOS development?
Let me explain by giving you a theoretical example:
You have an app on the iPad Pro (big screen, much space) and there you have maybe dozens of different views and buttons and so on. For whatever reason you need every one of these views and buttons to be moveable/clickable/resizable/...
What's better?
Adding one (or multiple) UIGestureRecognizer(s) to each view (which results in many active gesture recognizers and many small, specific handling methods [maybe grouped for each type of view])
Adding one single recognizer to the superview (which results in one active gesture recognizer and a big handling method that needs to cycle through the subviews and determine which one has been tapped)
I guess the first one is the simplest, but is it slower than the second one? I'm not sure about that. My gut feeling is that having that many UIGestureRecognizers can't be a good solution.
But either way, the system has to cycle through everything (in the worst case), be it many recognizers or many subviews. I'm curious about this.
Thank you
Let's look at your question in terms of the gesture-recognition flow: to pass an event to the right gesture recognizer, the system walks the view tree to find the deepest view responsible for that touch, via one specific UIView method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
Adding one (or multiple) UIGestureRecognizer(s) to each view
This is the way I recommend. The system does most of the work for you and protects you from mistakes that are very difficult to debug later - trust me. It is the right approach for UI, especially when you have multiple different gestures on different parts of the screen. In my particular case I have a large video player UI with around 20 gesture recognizers on one screen and it feels fine - no lag or frame drops. This approach is simple and self-describing, and I recommend implementing it in a storyboard or xib: you can open Interface Builder at any time and see at a glance which recognizer to update to change the UI's behaviour. The speed of this approach is guaranteed by the system.
Adding one single recognizer to the superview
This approach could be used when a single simple gesture applies to many views (more than ~20). This could happen if you are implementing, for example, a game where the user picks up and places bricks of different shapes. It is not suitable for common UI tasks. Its speed depends on your implementation, and based on the question itself I do not recommend it. This approach is a design decision, not a speed optimization.
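For comparison, the single-recognizer variant might look roughly like this (BrickView and pickUpBrick: are hypothetical):

```objc
// One tap recognizer on the superview; the handler hit-tests to find the target.
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:recognizer.view];
    // hitTest:withEvent: walks the subview tree for us.
    UIView *target = [recognizer.view hitTest:point withEvent:nil];
    if ([target isKindOfClass:[BrickView class]]) {
        [self pickUpBrick:(BrickView *)target];
    }
}
```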

Slide Effect for iOS

I'm new to developing iOS apps, and I've successfully implemented a swipe gesture recognizer.
What I was wondering is whether there is an easy-to-use recognizer, like the swipe gesture, that would let you implement the home-screen page-turning effect, but just on a small view in the view controller?
If you're unclear on what effect I mean: when you look at the iPhone's home screen, you can drag your finger and it responds instantly (unlike a swipe), and it also has a spring feel to it. Is this an effect I can use, or do I have to program it manually? If so, is there a tutorial that explains this?
Thanks,
I hope my question makes sense.
Have a look at UIPanGestureRecognizer:
https://developer.apple.com/library/ios/documentation/uikit/reference/UIPanGestureRecognizer_Class/Reference/Reference.html
UIPanGestureRecognizer is a concrete subclass of UIGestureRecognizer
that looks for panning (dragging) gestures. The user must be pressing
one or more fingers on a view while they pan it. Clients implementing
the action method for this gesture recognizer can ask it for the
current translation and velocity of the gesture.
A panning gesture is continuous. It begins
(UIGestureRecognizerStateBegan) when the minimum number of fingers
allowed (minimumNumberOfTouches) has moved enough to be considered a
pan. It changes (UIGestureRecognizerStateChanged) when a finger moves
while at least the minimum number of fingers are pressed down. It ends
(UIGestureRecognizerStateEnded) when all fingers are lifted.
Clients of this class can, in their action methods, query the
UIPanGestureRecognizer object for the current translation of the
gesture (translationInView:) and the velocity of the translation
(velocityInView:). They can specify the view whose coordinate system
should be used for the translation and velocity values. Clients may
also reset the translation to a desired value.
Edit: The spring feeling part you would need to implement yourself. Since iOS 7 there is UIKit Dynamics, which contains different behaviors; for what you describe you may need UIGravityBehavior and maybe UICollisionBehavior. Look at the WWDC 2013 videos on this topic; I think you will find some examples there.
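A sketch of the pan side, with the spring done via UIView's iOS 7 spring-animation API rather than UIDynamics (pageView and restingCenter are hypothetical):

```objc
- (void)handlePan:(UIPanGestureRecognizer *)pan {
    CGPoint translation = [pan translationInView:self.view];
    // Drag the small "page" view horizontally with the finger.
    self.pageView.center = CGPointMake(self.pageView.center.x + translation.x,
                                       self.pageView.center.y);
    [pan setTranslation:CGPointZero inView:self.view]; // keep deltas incremental

    if (pan.state == UIGestureRecognizerStateEnded) {
        // Spring back (or on to the next page) when the finger lifts.
        [UIView animateWithDuration:0.5
                              delay:0
             usingSpringWithDamping:0.7
              initialSpringVelocity:0
                            options:0
                         animations:^{ self.pageView.center = self.restingCenter; }
                         completion:nil];
    }
}
```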

Prevent reordering of elements in gestureRecognizers array

I'm experiencing a bug in my app that is causing gestures to stop working that I previously added to a UITextField via addGestureRecognizer:. Essentially, I add a tap and long press gesture recognizer to the UITextField (which already has 7 gesture recognizers applied from iOS). When logging self.textField.gestureRecognizers, it shows the existing 7 gestures and then the two I added at the end of the array. The gestures work just like I expected.
However, when I present a modal view controller and then dismiss it, my two gestures stop working on the text field. I'm not sure exactly why, but the view does disappear and the text field resigns first responder (the keyboard is always up when the modal VC is presented), which may be related. I discovered the gestures aren't removed from the text field, but the order of the gestures in the array has changed: my custom gestures are now at indexes 0 and 1 instead of 7 and 8. I believe the 7 default gestures are conflicting with/overriding my custom ones (I assume later placement in the array overrides earlier placement), which explains why they stop working even though they're still attached.
My questions are:
- Do you know why it is reordering the elements in self.textField.gestureRecognizers?
- How do I prevent that from occurring to ensure my custom gestures always work, without breaking the default gestures for UITextField?
My current solution is to add the two gestures for the first time then store the array of total (9) gestures, then in viewDidAppear I change the gestureRecognizers array (yes it is settable) to my stored array. This guarantees the array will be the 7 built-in gestures followed by my two custom gestures in that order. But I discovered my gestures are overriding the default gestures (that bring up the popup to Cut, Copy, etc), so I have to reset the gestures back to the default 7 after my custom gesture occurs (which is just fine - I only need to trigger the action a single time after recognizing my custom gesture). Simple enough to do - I store the original gestures in a property as well. But this doesn't feel like the best solution. I'd prefer to figure out the cause and address that or go about the situation differently instead of duct-taping the code together.
My first solution was to always add my two gestures in viewDidAppear
viewDidAppear: is called when your view controller's view first appears, but it is also called again later when the presented view controller is dismissed.
Thus you are adding the gesture recognizers twice.
The simplest solution is to use a BOOL instance variable (we call this a "flag") which you set to YES the first time and test afterwards:
if (!self.addedGestures) {
    self.addedGestures = YES;
    // ... add them! ...
}
Now you will only add them once.
(On the other hand it might be argued that if you care about the order of the gesture recognizers in the array you are already doing something wrong. Use delegate methods to resolve conflicts between gesture recognizers - that's what they are for.)
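A sketch of that delegate-based approach (iOS 7+; you can only set the delegate on your own recognizers, so customTap here is illustrative):

```objc
// Set self as the delegate of your own recognizers: customTap.delegate = self;
// Returning YES makes `other` (e.g. a built-in text-field recognizer) require
// our recognizer to fail first, regardless of its position in the array.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)other {
    return YES;
}
```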

iOS: Swipe up gesture recognizer not working

My table view has a text field above it, and whenever the text field is focussed, a swipe gesture is registered. When the swipe gesture is recognized, the keyboard is dismissed. The code works for all gestures except swipe up. This is my code:
swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(dismissKeyboard)];
[swipe setDirection:UISwipeGestureRecognizerDirectionUp];
Can someone please let me know if there is any problem?
If all the other gestures work, there is probably no logic problem.
Check for spelling errors.
Then reapply the swipe gesture and check everything (outlets etc.).
I don't know about this case, but I know that when I've had gestures on a custom container view and then added a child view with its own gestures, I've had to iterate through the child's gestures and tell them to require my gestures to fail (i.e. mine take precedence). I've done this with scroll views successfully:
for (UIGestureRecognizer *gesture in self.scrollView.gestureRecognizers)
{
    [gesture requireGestureRecognizerToFail:myGesture];
}
The only times I've had problems with that are views like UITextView which remove and add gestures as you go in and out of edit mode, so that's a hassle.
Also, while I tried this with standard gestures, I've subsequently shifted to custom gestures that I've programmed to fail as quickly as possible (check the start location and fail immediately if it can't support the direction my gesture requires, rather than waiting for a bunch of touchesMoved calls to come to the same conclusion). If you don't want to interfere with the child view's gestures, be as aggressive as possible in letting yours fail. Maybe this isn't an issue with a swipe gesture, but it's a possible consideration if you find that your gestures end up changing the behavior of the child view noticeably.
But I suspect you'll probably just have to figure out which views have the gestures that are interfering with yours and make them require yours to fail first.
Any chance you're colliding with one of the scrollview's gestures? It doesn't seem likely if your other gestures are working, but it might be worth at least trying the gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: method in the UIGestureRecognizerDelegate protocol.
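A minimal sketch of that delegate method (set swipe.delegate = self first; whether it resolves this particular case is untested):

```objc
// Allow the swipe to be recognized together with the scroll view's pan,
// instead of the two competing and one silently losing.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
    return YES;
}
```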

Gestures that steal touches like iOS multitasking swipe

I know what I want to do, but I'm stumped as to how to do it: I want to implement something like the iOS multitasking gestures. That is, I want to "steal" touches from any view inside my view hierarchy if the number of touches is greater than, say, two. Of course, the gestures are not meant to control multitasking, it's just the transparent touch-stealing I'm after.
Since this is a fairly complex app (which makes extensive use of view controller containment), I want this to be transparent to the views that it happens to (i.e. I want to be able to display arbitrary views and hierarchies, including UIScrollViews, MKMapViews, UIWebViews etc. without having to change their implementation to play nice with my gestures).
Just adding a gestureRecognizer to the common superview doesn't work, as subviews that are interaction enabled eat all the touches that fall on them.
Adding a visually transparent UI-enabled view as a sibling (but in front) of the main view hierarchy also doesn't work, since now this view eats all the touches. I've experimented with reimplementing touchesBegan: etc. in the touchView, but forwarding the touches to nextResponder doesn't work, because that'll be the common superview, in effect funnelling the touches right around the views that are supposed to be receiving them when the touchView gives them up.
I am sure I'm not the only one looking for a solution for this, and I'm sure there are smarter people than me that have this already figured out. I even suspect it might not actually be very hard, and just maybe my brain won't see the forest for the trees today. I'm thankful for any helpful answers anyway :)
I would suggest you try method swizzling, reimplementing touchesBegan: on UIView. I think the best way is to store the number of touches in a static shared variable (so that each view can increment/decrement this value). It's just a very simple idea; take it with a grain of salt.
Hope this helps.
Ciao! :)
A possible, but potentially dangerous (if you aren't careful) approach is to subclass your application's UIWindow and override the sendEvent: method.
As this method is called for each touch event received by the app, you can inspect the event and then decide to call [super sendEvent:] (if the touch is not filtered), not call it (if the touch is filtered), or defer the call if you are still recognizing the touch.
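A rough sketch of that idea (StealingWindow and the three-touch threshold are made up for illustration; deferring events correctly takes more bookkeeping than shown here):

```objc
@interface StealingWindow : UIWindow
@end

@implementation StealingWindow
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches && event.allTouches.count >= 3) {
        // Steal: handle the touches here and don't forward them.
        // Beware: views that already received touchesBegan: for these touches
        // will never see touchesEnded:, so real code must cancel or defer
        // carefully.
        return;
    }
    [super sendEvent:event]; // unfiltered events proceed normally
}
@end
```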
Another possibility is to play with the hitTest:withEvent: method, but this would require your stealing view to be placed properly in the view hierarchy, and I think it doesn't fit well when you have many view controllers. I believe the previous solution is more general-purpose.
Actually, adding a gesture recognizer on the common superview is the right way to do this. But it sounds like you may need to set either delaysTouchesBegan or cancelsTouchesInView (or both) to ensure that the gesture recognizer handles everything before letting touches through to the child views.
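For example (handleSteal: and containerView are illustrative):

```objc
UIPanGestureRecognizer *steal =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleSteal:)];
steal.minimumNumberOfTouches = 3;  // only multi-finger gestures are stolen
steal.cancelsTouchesInView = YES;  // cancel touches already sent to subviews
steal.delaysTouchesBegan = YES;    // hold touches back until recognition fails
[self.containerView addGestureRecognizer:steal];
```

With these set, subviews only receive the touches once the recognizer has given up on them.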