As far as I observed, a tap gesture recognizer cancels its touches when the finger moves. Here is a screenshot from the Event Handling Guide:
I have already run some tests with the following code. No luck, it still cancels...
@IBAction func dosomething(sender: UITapGestureRecognizer) {
    // called by the gesture recognizer
}

@IBOutlet var red: UIView!
@IBOutlet var green: UIView!
@IBOutlet var yellow: UIView!
@IBOutlet var white: UIView!

// Set the delegate when the outlet gets set.
@IBOutlet var tapGesture: UITapGestureRecognizer! {
    didSet { tapGesture.delegate = self }
}
// delegate method gestureRecognizer(_:shouldReceiveTouch:)
func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
    if touch.view == red {
        println("ez")
        return true
    } else if touch.view == green {
        println("bp")
        return false
    } else {
        view.backgroundColor = UIColor.clearColor()
        return true
    }
}
So if I'm right, when touchesMoved: is called the tap gesture recognizer must have failed, but according to the figure it does not. What am I missing here?
When a user taps or drags their finger across a screen in an iPhone application, two things happen:
The view receives one or more UITouch objects
A UIGestureRecognizer may grab the UITouch object(s) and try to infer what the user is doing (pan in/out, scroll up/down, swipe between pages in a browser, etc.)
A UIGestureRecognizer will not know what the user is doing until the user has finished touching the screen. The iPhone knows the user is done when the UITouch objects are no longer part of the current view.
The thing is, a UIGestureRecognizer doesn't "fail"; rather, it has no value or meaning until the user's touches are cancelled or are no longer part of the view. This can be seen in the fourth column from the right of Figure 1-6. Another problem is that a tap is not what you think it is: it's just a single touch, and the touch ends as soon as it begins. If you're trying to swipe or drag something across a screen, you need a different UIGestureRecognizer subclass such as UIPanGestureRecognizer, UILongPressGestureRecognizer, UISwipeGestureRecognizer, etc.
If you want to see this in action, try doing the following in an iOS Single View Application:
Make a ViewController.swift file.
Make your ViewController a subclass of UIViewController.
Adopt the UIGestureRecognizerDelegate protocol.
Add some UITapGestureRecognizer objects and link them to a single view in your storyboard.
Set each UIGestureRecognizer object's delegate to the UIViewController (self).
Check when a touch begins and ends by adding some println statements to the touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:, and touchesCancelled:withEvent: methods.
You can learn more about how these methods work in the iOS Developer Library directly.
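The logging from step 6, written with the modern Swift spellings of those methods, might look like this minimal sketch (the class name is illustrative):

```swift
import UIKit

// Minimal sketch: log each touch phase to watch when the system delivers
// began/moved/ended/cancelled events to the view controller's view.
class TouchLoggingViewController: UIViewController {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touchesBegan (\(touches.count) touch(es))")
        super.touchesBegan(touches, with: event)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touchesMoved")
        super.touchesMoved(touches, with: event)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touchesEnded")
        super.touchesEnded(touches, with: event)
    }

    // When an attached recognizer claims the gesture, the view typically
    // receives touchesCancelled instead of touchesEnded.
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("touchesCancelled")
        super.touchesCancelled(touches, with: event)
    }
}
```

Running this and dragging a finger across the view makes the began/moved/cancelled sequence from the figure visible in the console.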
Related
I have a problem where I have a UIScrollView and a Header (UIView) inside my main View and my Header is over my UIScrollView as such:
UIView.
|
|- UIScrollView.
|
|- Header. (UIView)
I want my header to detect taps on it, but I also want my scroll view to scroll when I drag over my header. Right now that is not possible because my header sits on top of the scroll view and blocks the scroll.
To sum up, I want my Header to detect taps but forward scrolls to my UIScrollView.
To tackle this problem I tried multiple things, and here are some of them:
Adding a UIPanGestureRecognizer to my Header so it is able to detect dragging
Adding a UITapGestureRecognizer to my Header so it is able to detect tapping
Setting isUserInteractionEnabled = false when dragging begins so the gesture can be passed to the next UIResponder which in this case is my UIScrollView
Setting isUserInteractionEnabled = true once my dragging has finished so it can again detect tapping
This is the code snippet:
override func viewDidLoad() {
    super.viewDidLoad()
    myScreenEdgePanGestureRecognizer = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
    let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
    headerView.addGestureRecognizer(myScreenEdgePanGestureRecognizer)
    headerView.addGestureRecognizer(tapGestureRecognizer)
}
@objc func handlePan(_ sender: UIPanGestureRecognizer) {
    print("dragging")
    if headerView.isUserInteractionEnabled {
        headerView.isUserInteractionEnabled = false
    }
    if sender.state == .began {
    } else if sender.state == .ended {
        headerView.isUserInteractionEnabled = true
    }
}

@objc func handleTap(_ sender: UITapGestureRecognizer) {
    print("tapped")
}
At this point I see dragging and tapping being detected just fine, but for some reason isUserInteractionEnabled = false does not seem to change how the view behaves. The code acts as if isUserInteractionEnabled were always true, no matter what.
Things that I have also tried besides this:
overriding the hitTest function in UIButton
overriding the touchesBegan, touchesMoved, and touchesEnded methods
overriding the next variable to return the scroll view as the next UIResponder
setting the isExclusiveTouch property on UIButton
changing isUserInteractionEnabled in every way possible
I was struggling with this problem too; you should try the UIGestureRecognizerDelegate methods, which allow you to handle simultaneous gestures.
Set your gesture recognizers' delegates, e.g. tapGestureRecognizer.delegate = self
Make your ViewController conform to this protocol, e.g.
extension YourViewController: UIGestureRecognizerDelegate {}
Implement this function:
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    return true
}
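Putting the three steps together for the header case might look like this sketch (headerView and the handler name are assumptions carried over from the question):

```swift
import UIKit

class HeaderViewController: UIViewController {
    @IBOutlet weak var headerView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tap.delegate = self  // step 1: connect the recognizer's delegate
        headerView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        print("header tapped")
    }
}

// Steps 2 and 3: conform to the protocol and allow simultaneous recognition,
// so the tap recognizer no longer forces competing recognizers to fail.
extension HeaderViewController: UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}
```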
I have the following case: parentView has its own gestureRecognizerA, and a subview subView which has its own UITapGestureRecognizer.
Is there any way to tell parentView that it should pass the touch events recognized in gestureRecognizerA to subView if these touch events are in subView's bounds?
gestureRecognizerA is very specific: it is a custom gesture recognizer for recognizing a circular motion. This recognition should happen on all areas of parentView. However, when that same gesture recognizer recognizes a tap, it should pass that tap to subView.
You can easily identify the point of the tap.
For example, say you have a tap gesture in the parent class:
let tapGR = UITapGestureRecognizer(target: self, action: #selector(tapped))
view.addGestureRecognizer(tapGR)
@objc func tapped(gr: UITapGestureRecognizer) {
    let loc: CGPoint = gr.location(in: gr.view)
    // insert your touch-based code here
}
Inside the tapped method you can identify the location where the tap happened, so by checking the tap location against the subview's frame you can verify whether the tap happened inside the subview.
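A minimal sketch of that check (the frame values and names are illustrative, and this assumes subView is a direct subview of the recognizer's view so their coordinate spaces line up):

```swift
import UIKit

class ParentViewController: UIViewController {
    // Illustrative subview; in practice this would be your real subView.
    let subView = UIView(frame: CGRect(x: 50, y: 50, width: 100, height: 100))

    @objc func tapped(gr: UITapGestureRecognizer) {
        // Location in the parent view's coordinate space.
        let loc: CGPoint = gr.location(in: gr.view)
        if subView.frame.contains(loc) {
            print("tap inside subView")  // hand the tap to subView's handling here
        } else {
            print("tap outside subView")
        }
    }
}
```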
It seems like you just want both of those gesture recognizers to work simultaneously. Just implement UIGestureRecognizerDelegate for your parentView and make it tapGestureRecognizer's and gestureRecognizerA's delegate. Then implement an optional method there:
// MARK: - UIGestureRecognizerDelegate
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    return true
}
That way a tap in subView can be detected even while a circular motion is being performed within parentView.
UPDATE: With gesture recognizers, "forwarding touches" simply means calling a method of another recognizer's target, passing the recognizer that is doing the forwarding as its parameter.
For instance, tapGestureRecognizer fires viewWasTapped(_ sender: UITapGestureRecognizer) when a tap is detected. Now, when your gestureRecognizerA wants to forward its events to tapGestureRecognizer, it simply does so by calling:
subView.viewWasTapped(self.gestureRecognizerA)
With an obvious change to the method itself:
func viewWasTapped(_ sender: UIGestureRecognizer) {
    // ...
}
This works for UITapGestureRecognizer. The sender can be any other UIGestureRecognizer and you'd still have almost all the information to resolve a tap gesture there.
I have a UITextView to which I have attached a gesture recognizer as follows:
let testTapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(textTextViewTapped(gestureRecognizer:)))
testTapGestureRecognizer.cancelsTouchesInView = false
testTapGestureRecognizer.delaysTouchesBegan = false
testTapGestureRecognizer.delaysTouchesEnded = false
textTextView?.addGestureRecognizer(testTapGestureRecognizer)
The selector mentioned above is as follows:
@objc func textTextViewTapped(gestureRecognizer: UIGestureRecognizer) {
    print("testTextViewTapped called.")
}
Every time I tap the UITextView, I can see the message above printed on the console. However, the keyboard doesn't appear any more.
I found Apple's doc confusing here:
Here, it says that
A gesture recognizer doesn’t participate in the view’s responder
chain.
which I interpret as meaning that gestures are also sent to the view and up the chain, as normal.
Later on the same page, it says,
If a gesture recognizer recognizes its gesture, the remaining touches
for the view are cancelled.
which means that if an attached gesture recognizer recognizes its gesture, it prevents the remaining touches from being delivered to the view it is attached to. Further, the page specifies three properties that should be able to stop the gesture recognizer from doing that. I have set all three of them to false in my code, as shown above.
What is actually happening here and how do I allow the UITextView to interpret the touches normally and also be able to use a gesture recognizer?
You could use the UIGestureRecognizerDelegate to make the UITapGestureRecognizer work alongside the regular UITextView behaviour:
class TestViewController: UIViewController {
    @IBOutlet weak var textView: UITextView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(tap))
        tapGestureRecognizer.delegate = self
        textView.addGestureRecognizer(tapGestureRecognizer)
    }

    @objc private func tap() {
        print("tap")
    }
}
extension TestViewController: UIGestureRecognizerDelegate {
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}
The UITextView probably has its own private gesture recognizer to handle when a user taps on it. When this happens it makes the text view the first responder, which causes the keyboard to appear. Gesture recognizers can force other gesture recognizers to fail when they recognize their gesture. (See the docs) Perhaps this is what is happening when you add your tap gesture. It recognizes the tap and thus forces other gestures to fail, which prevents the text view from becoming the first responder.
The best solution is to follow the answer from this question (as @FrancescoDeliro mentioned in the comments) and use the delegate calls to know when editing begins/ends.
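That delegate-based approach is sketched below; textViewDidBeginEditing/textViewDidEndEditing are standard UITextViewDelegate callbacks (the class and outlet names are illustrative):

```swift
import UIKit

// Sketch: observe editing via UITextViewDelegate instead of attaching a
// competing tap recognizer, so the keyboard behaviour is left intact.
class NotesViewController: UIViewController, UITextViewDelegate {
    @IBOutlet weak var textView: UITextView!

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.delegate = self
    }

    // Fired when the text view becomes first responder (keyboard appears).
    func textViewDidBeginEditing(_ textView: UITextView) {
        print("began editing")
    }

    // Fired when the text view resigns first responder.
    func textViewDidEndEditing(_ textView: UITextView) {
        print("ended editing")
    }
}
```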
I have 3 buttons in my UI that will ultimately perform similar to keys on a piano. Any of the 3 buttons may be tapped to perform that particular button's action, but the user can also slide their finger off onto the next button to perform that button's action.
I did some searching and it looks like a touchesBegan method detecting the location of the tap is best. My attempt is below:
@IBOutlet weak var button1: UIButton!
@IBOutlet weak var button2: UIButton!
@IBOutlet weak var button3: UIButton!
(I don't have any action events tied to the 3 buttons, because I thought touchesBegan will cover that.)
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    let touch: UITouch = event!.allTouches()!.first!
    let location: CGPoint = touch.locationInView(touch.view!)
    if CGRectContainsPoint(button1.frame, location) {
        print("button 1")
    } else if CGRectContainsPoint(button2.frame, location) {
        print("button 2")
    } else if CGRectContainsPoint(button3.frame, location) {
        print("button 3")
    }
}
Nothing prints to the console when I test this. I tested both tapping and swiping across buttons. Should I be using UIViews instead of UIButtons? Should I be using actions on the buttons? This is for Swift 2.0. Thank you.
EDIT
I have a working prototype much closer to how I envisioned this functioning. This thread, Using iOS pan gesture to highlight buttons in grid, pointed me in the direction of using a pan gesture on the buttons' superview.
var touchPoint = CGPoint()
var myGesture = UIPanGestureRecognizer()
viewDidLoad:
myGesture = UIPanGestureRecognizer(target: self, action: Selector("panDetected:"))
myGesture.maximumNumberOfTouches = 1
view.addGestureRecognizer(myGesture)
func panDetected(sender: UIPanGestureRecognizer) {
    touchPoint = sender.locationInView(self.view)
    if CGRectContainsPoint(button1.frame, touchPoint) {
        button1Func()
    } else if CGRectContainsPoint(button2.frame, touchPoint) {
        button2Func()
    } else if CGRectContainsPoint(button3.frame, touchPoint) {
        button3Func()
    }
}
The above DOES work; however, button1Func/button2Func/button3Func all contain an animation block. While the finger is dragged within a button, the method fires every time movement is detected, but I need it to happen only once. I tried wrapping this in a state == .Began check, but that prevents any functionality once the finger leaves the first tapped button.
Is there any way I can fire those methods (button1Func/button2Func/button3Func) just ONCE inside the pan gesture, once the finger is inside the bounds of each of the 3 buttons? I will be happy to clarify this if it's confusing. THANKS.
There are a whole bunch of control events you can tie to IBActions. Take a look at the UIControlEvents enum. You probably want to add actions for UIControlEventTouchDragInside, UIControlEventTouchDragEnter and a few others.
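Wiring those control events in code might look like the following sketch, using the modern Swift spellings of the constants (.touchDown, .touchDragEnter). The outlet and action names follow the question; note that which button receives .touchDragEnter depends on which control began tracking the touch:

```swift
import UIKit

class PianoViewController: UIViewController {
    @IBOutlet weak var button1: UIButton!
    @IBOutlet weak var button2: UIButton!
    @IBOutlet weak var button3: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        for button in [button1, button2, button3] {
            // .touchDown fires once when a touch starts on the button;
            // .touchDragEnter fires once each time a tracked finger crosses in,
            // so the action is not repeated for every movement inside the button.
            button?.addTarget(self,
                              action: #selector(keyPressed(_:)),
                              for: [.touchDown, .touchDragEnter])
        }
    }

    @objc func keyPressed(_ sender: UIButton) {
        print("key pressed: \(sender.currentTitle ?? "?")")
    }
}
```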
I'm creating a resume app (mostly for fun) and can't seem to get the tap gesture recognizer to work for me. What I'd like is to have a single label display the information and have it change depending on which title label they tap below. Here is the code I've written:
import UIKit
class WorkHistoryViewController: UIViewController {
// MARK: Properties
@IBOutlet weak var jobOne: UILabel!
@IBOutlet weak var jobTwo: UILabel!
@IBOutlet weak var jobThree: UILabel!
@IBOutlet weak var jobFour: UILabel!
@IBOutlet weak var jobFive: UILabel!
@IBOutlet weak var workHistoryDescriptionLabel: UILabel!
let tapRec = UITapGestureRecognizer()
override func viewDidLoad() {
super.viewDidLoad()
tapRec.addTarget(self, action: "tappedLabel")
jobOne.addGestureRecognizer(tapRec)
jobTwo.addGestureRecognizer(tapRec)
jobThree.addGestureRecognizer(tapRec)
jobFour.addGestureRecognizer(tapRec)
jobFive.addGestureRecognizer(tapRec)
}
// MARK: Methods
func tappedLabel() {
workHistoryDescriptionLabel.text = "It worked!"
}
}
What's happening is that the last label to have addGestureRecognizer() called in viewDidLoad() is the only one that works. If I comment out the last line then only the label above it works. I also tried to enable user interaction on each of the labels programmatically and on the attributes inspector and neither changed anything.
As per Apple's Event Handling Guide:
Gesture Recognizers Are Attached to a View
Every gesture recognizer is
associated with one view. By contrast, a view can have multiple
gesture recognizers, because a single view might respond to many
different gestures. For a gesture recognizer to recognize touches that
occur in a particular view, you must attach the gesture recognizer to
that view. When a user touches that view, the gesture recognizer
receives a message that a touch occurred before the view object does.
As a result, the gesture recognizer can respond to touches on behalf
of the view.
So you need to create multiple instances of UITapGestureRecognizer and attach one to each view, even if they perform the same action.
For instance,
let tapRecOne = UITapGestureRecognizer()
tapRecOne.addTarget(self, action: "tappedLabel")
jobOne.addGestureRecognizer(tapRecOne)
let tapRecTwo = UITapGestureRecognizer()
tapRecTwo.addTarget(self, action: "tappedLabel")
jobTwo.addGestureRecognizer(tapRecTwo)
and so on.
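Since all the labels share one action, the same thing can be written as a loop. A sketch in modern Swift (#selector instead of the string selector above); each label still gets its own fresh recognizer, and note that UILabel has user interaction disabled by default:

```swift
import UIKit

class WorkHistoryViewController: UIViewController {
    @IBOutlet weak var jobOne: UILabel!
    @IBOutlet weak var jobTwo: UILabel!
    @IBOutlet weak var jobThree: UILabel!
    @IBOutlet weak var workHistoryDescriptionLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        // One recognizer per label: a recognizer attaches to exactly one view.
        for label in [jobOne, jobTwo, jobThree] {
            let tap = UITapGestureRecognizer(target: self, action: #selector(tappedLabel))
            label?.addGestureRecognizer(tap)
            label?.isUserInteractionEnabled = true  // labels ignore touches by default
        }
    }

    @objc func tappedLabel() {
        workHistoryDescriptionLabel.text = "It worked!"
    }
}
```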
You need to create a new tap gesture object for each label to get this working. You can refer to showing images in a UIScrollView horizontally on iPad and getting the tag of an image on tap.
Here is Apple's documentation as defined here.
A gesture recognizer operates on touches hit-tested to a specific view and all of that view’s subviews. It thus must be associated with that view. To make that association you must call the UIView method addGestureRecognizer:. A gesture recognizer doesn’t participate in the view’s responder chain.
That means you need a separate UIGestureRecognizer instance for each view. That should sort out the issue.
Xcode has a small icon (you can see it in this screenshot) that you can click on while your app is running. It's invaluable when trying to understand what is happening with UI issues.