I've found mixed answers to the question of whether we should manually remove a gesture recognizer or not. Can anyone provide a better understanding of this?
This says Yes: https://forums.xamarin.com/discussion/16970/gesturerecognizer-should-manually-remove
This says No: Do I need to release a gesture recognizer?
Suggestions please.
If you are NOT talking about using Xamarin then:
No, you don't; the answer in the second link you posted is right. The first link is talking about Xamarin, where the same rules don't apply.
This is how you attach a gesture recognizer.
https://developer.apple.com/documentation/uikit/uiview/1622496-addgesturerecognizer
Under the "Discussion" part you can see this statement:
The view establishes a strong reference to the gesture recognizer.
Whenever you see this kind of statement, it implies: "This object will keep my added object alive, since it strongly references it." Thus, once the owning object is deallocated, the added object goes with it.
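For what it's worth, a minimal sketch of the usual pattern (the class and method names here are purely illustrative): the view keeps the recognizer alive, and there is normally no need to remove it manually before the view goes away.

```swift
import UIKit

final class ProfileViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        // The view establishes a strong reference to `tap`, so the recognizer
        // stays alive for as long as the view does.
        view.addGestureRecognizer(tap)
        // No matching removeGestureRecognizer(_:) call is required; when the
        // view is deallocated, its strong reference to the recognizer goes with it.
    }

    @objc private func handleTap() {
        print("Tapped")
    }
}
```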
I currently have a UIView, call it (A), that comes from a 3rd-party library.
Physically tapping to trigger its onTapGesture would be simple, but the problem is that (A) sits in a different view hierarchy from the one I own; the reason is that I apply transforms to my hierarchy separately from (A).
I'm planning to use a UIButton that will programmatically trigger the onTapGesture of the UIView in question. Is there a way to do this?
I do not have access to this 3rd party library's selector for onTap.
This is a very late answer to your question. If you are still working on this, maybe my answer will be helpful, or maybe someone else will find it useful.
The gesture doesn't expose its target and action. In this case, creating a fake touch programmatically may help. Since you have the UIView, you can easily get that view's window. With the window, work out the x,y coordinate and create a fake UITouch event. I had a similar kind of problem, and creating a fake touch worked for me. This is the link I referred to for creating fake touches.
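For illustration, here is the public-API part of that idea as a small sketch: finding the view's window and the point, in window coordinates, where a synthetic touch would need to land. Actually building and delivering the fake UITouch requires private API or a helper like the one described in the linked article, so that part is left out; the function name below is hypothetical.

```swift
import UIKit

// Returns the window hosting the third-party view and the point, in window
// coordinates, at which a synthetic touch would need to be delivered.
func syntheticTouchTarget(for thirdPartyView: UIView) -> (window: UIWindow, point: CGPoint)? {
    guard let window = thirdPartyView.window else { return nil }   // view must be on screen
    let center = CGPoint(x: thirdPartyView.bounds.midX, y: thirdPartyView.bounds.midY)
    // Convert the view's centre into the window's coordinate space, which
    // accounts for any transforms applied between the two hierarchies.
    let pointInWindow = thirdPartyView.convert(center, to: window)
    return (window, pointInWindow)
}
```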
Brent Simmons wrote in a blog post that tap gesture recognizers, presumably on a UIView, are less accessible than UIButtons. I'm trying to learn my way around making my app accessible, and I was curious if anyone could clarify what makes that less accessible than a UIButton, and what makes an element "accessible" to begin with?
For more customizability I was planning to build a button made up of a UIView and tap gesture recognizers with some subviews, but now I'm not so sure. Is it possible to make a UIView as accessible as a UIButton?
Accessible in this context most likely refers to UI elements that can be used with Apple's accessibility features, such as VoiceOver (see the example below).
For example, a visually impaired person will not be able to see your view or subviews, or buttons for that matter; but the accessibility software "VoiceOver" built into every iOS device will read to her/him the kind of object and its title, something like "Button: Continue" (if the button title is "Continue").
You can see that most likely the tap gesture recognizer will not be read by VoiceOver and thus be less "accessible".
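To answer the second part of the question: you can make a UIView-based control behave much like a UIButton for VoiceOver by exposing it as an accessibility element with the .button trait. A minimal sketch, where the class, label, and selector names are just placeholders:

```swift
import UIKit

final class SignupViewController: UIViewController {
    private let continueView = UIView()   // the custom UIView-based "button"

    override func viewDidLoad() {
        super.viewDidLoad()
        continueView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(didTapContinue))
        )
        // Expose the plain view to VoiceOver as if it were a button.
        continueView.isAccessibilityElement = true
        continueView.accessibilityLabel = "Continue"   // what VoiceOver reads aloud
        continueView.accessibilityTraits = .button     // announced as a button
        view.addSubview(continueView)
    }

    @objc private func didTapContinue() {
        // Handle the tap.
    }
}
```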
I have a pretty standard application that uses gesture recognizers in various places. I'm trying to add an app-wide three-finger UISwipeGestureRecognizer that can be performed anywhere in the app, similar to Apple's four-finger gestures. This works fine in some views, but if there's another swipe recognizer beneath it, that one is triggered instead of the new one.
I’d like this new three-finger swipe to be given priority at all times – I’ve added it to my root view controller’s view, but it still seems to bleed through at times.
Is there an easier way to do this than going through and requiring all other recognizers to fail?
You can use the requireGestureRecognizerToFail: method to make the unneeded recognizers wait until your three-finger swipe has failed.
Apple doc.
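A sketch of how that might look in Swift, where require(toFail:) is the modern spelling of requireGestureRecognizerToFail:; the recognizer names and the helper function are assumptions, not part of the original question:

```swift
import UIKit

// Make every existing swipe recognizer wait until the app-wide three-finger
// swipe has failed, so the three-finger gesture always wins.
func prioritize(_ threeFingerSwipe: UISwipeGestureRecognizer,
                over localRecognizers: [UIGestureRecognizer]) {
    threeFingerSwipe.numberOfTouchesRequired = 3
    for recognizer in localRecognizers {
        recognizer.require(toFail: threeFingerSwipe)
    }
}
```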
My question is simple. Is it possible to change the gesture for selecting a cell to be a double-tap instead of the default? If so, what would be the general approach for doing so?
Thanks in advance.
You can use gesture recognizers to capture gesture events.
I've never seen it done in a main interface; I'd be concerned that Apple would reject it as noncompliant with the HIG.
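With that caveat in mind, a minimal sketch of the general approach, assuming a plain UITableViewController (the class and method names are illustrative): disable the default single-tap selection and respond to a double-tap recognizer instead.

```swift
import UIKit

final class DoubleTapTableViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.allowsSelection = false   // suppress the default single-tap selection

        let doubleTap = UITapGestureRecognizer(target: self, action: #selector(handleDoubleTap(_:)))
        doubleTap.numberOfTapsRequired = 2
        tableView.addGestureRecognizer(doubleTap)
    }

    @objc private func handleDoubleTap(_ recognizer: UITapGestureRecognizer) {
        let point = recognizer.location(in: tableView)
        guard let indexPath = tableView.indexPathForRow(at: point) else { return }
        // Respond to the double-tap as the "selection" of this row,
        // e.g. push a detail view controller for indexPath.
        print("Double-tapped row \(indexPath.row) in section \(indexPath.section)")
    }
}
```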
I'd like to implement multitouch, and I was hoping to get some sanity checks from the brilliant folks here. :)
From what I can tell, my strategy to detect and track multitouch is going to be to use the touchesBegan, touchesMoved, and touchesEnded methods and the allTouches property of the event parameter to get visibility into all relevant touches at any particular time.
I was thinking I'd essentially use previousLocationInView: as a way of linking the touches that come in with new events to the currently active touches. For example, if there is a touchesBegan for a touch at x,y = 10,14, then in the next message I can use a touch's previous location to know which active touch it is tied to, as a way of keeping track of one finger's continuous motion, etc. Does this make sense? If it does make sense, is there a better way to do it? I cannot hold onto UITouch or UIEvent pointers as a way of matching touches with previous touches, so I cannot go that route. All I can think to do is tie them together via their previousLocationInView: values (and so know which touches are 'new').
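For illustration only, a rough sketch of the correlation idea described above, with hypothetical names; note that matching points exactly is fragile, which is one reason to consider the gesture-recognizer route suggested in the answer below.

```swift
import UIKit

// Rough sketch of tracking individual fingers without retaining UITouch
// objects: each active finger's last known point is stored, and
// previousLocation(in:) is used to work out which stored finger a touch in
// touchesMoved/touchesEnded belongs to. Exact CGPoint matching is brittle,
// but it illustrates the idea described above.
final class MultiTouchTrackingView: UIView {
    private var activeFingers: [Int: CGPoint] = [:]   // finger id -> last known point
    private var nextFingerID = 0

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // receive more than one touch at a time
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isMultipleTouchEnabled = true
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            activeFingers[nextFingerID] = touch.location(in: self)
            nextFingerID += 1
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let previous = touch.previousLocation(in: self)
            // The stored finger whose point matches previousLocation(in:) is
            // the one that moved; update its stored point.
            if let id = activeFingers.first(where: { $0.value == previous })?.key {
                activeFingers[id] = touch.location(in: self)
            }
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let previous = touch.previousLocation(in: self)
            if let id = activeFingers.first(where: { $0.value == previous })?.key {
                activeFingers.removeValue(forKey: id)
            }
        }
    }
}
```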
You might want to take a look at gesture recognizers. From Apple's docs,
You could implement the touch-event handling code to recognize and handle these gestures, but that code would be complex, possibly buggy, and take some time to write. Alternatively, you could simplify the interpretation and handling of common gestures by using one of the gesture recognizer classes introduced in iOS 3.2. To use a gesture recognizer, you instantiate it, attach it to the view receiving touches, configure it, and assign it an action selector and a target object. When the gesture recognizer recognizes its gesture, it sends an action message to the target, allowing the target to respond to the gesture.
See the article on Gesture Recognizers and specifically the section titled "Creating Custom Gesture Recognizers." You will need an Apple Developer Center account to access this.
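As a rough sketch of what a custom recognizer can look like in Swift (the class name, threshold, and behavior are illustrative, not taken from the article): a discrete recognizer that fires once every finger on the view has moved a minimum distance. The UIKit.UIGestureRecognizerSubclass import is what lets a subclass set state and override the touch methods.

```swift
import UIKit
import UIKit.UIGestureRecognizerSubclass   // allows overriding touch methods and setting `state`

/// Hypothetical discrete recognizer: succeeds once every finger currently on
/// the view has moved at least `minimumDistance` points from where it began.
final class MultiFingerDragGestureRecognizer: UIGestureRecognizer {
    var minimumDistance: CGFloat = 40

    // Start point per touch, keyed by touch identity; the touches themselves
    // are not retained.
    private var startPoints: [ObjectIdentifier: CGPoint] = [:]

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        for touch in touches {
            startPoints[ObjectIdentifier(touch)] = touch.location(in: view)
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        guard state == .possible, !startPoints.isEmpty, let allTouches = event.allTouches else { return }
        let allMovedFarEnough = allTouches.allSatisfy { touch in
            guard let start = startPoints[ObjectIdentifier(touch)] else { return false }
            let current = touch.location(in: view)
            return hypot(current.x - start.x, current.y - start.y) >= minimumDistance
        }
        if allMovedFarEnough {
            state = .recognized   // discrete gesture: fire the action once
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        if state == .possible { state = .failed }   // a finger lifted before recognition
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        if state == .possible { state = .failed }
    }

    override func reset() {
        super.reset()
        startPoints.removeAll()
    }
}
```

Attaching it works exactly like any built-in recognizer: create it, give it a target and action, and call addGestureRecognizer(_:) on the view that should receive the touches.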