I recently updated to iOS 10 and Swift 3. Now, whenever I run my app on an iOS 10 device (an iPhone 6s in my case), there is a randomly occurring delay in the touchesBegan function. Sometimes this is accompanied by the following error message:
"Gesture: Failed to receive system gesture state notification before
next touch"
Interestingly, my older iPhone 5s running iOS 9 does not reproduce this problem.
Usually strange UI delays on iOS have to do with threads. UI-related methods on iOS and macOS are all meant to be called from the main thread only. Because the APIs are becoming highly asynchronous, with many methods taking a block callback, it's easy to accidentally do something UI-related from a background thread.
Make sure that any UI-related activity (updating an image or a text field, reloading a table view, etc.) is always done from the main thread. If you're doing UI work within a block, you can wrap the UI parts in dispatch_async(dispatch_get_main_queue(), ...), or its Swift 3 equivalent, DispatchQueue.main.async.
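For example, in Swift 3 that looks like this (a minimal sketch; url, imageView, and the surrounding network call are placeholders):

URLSession.shared.dataTask(with: url) { data, _, _ in
    guard let data = data, let image = UIImage(data: data) else { return }
    // This completion handler runs on a background queue;
    // hop back to the main queue before touching any UIKit object.
    DispatchQueue.main.async {
        self.imageView.image = image
    }
}.resume()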
In my experience, an inadvertent background-thread call to one of these methods can have the effect of "poisoning the well," causing subsequent weirdness such as the delayed touch events you're seeing.
I had the same issue where I simulate a touch and hold to steer a car right and left. Here was what fixed it for me.
Changing the following from:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let location = touch.location(in: self)
        let node: SKNode = atPoint(location)
        if node == self.player1CarRight {
            self.player1CarRotateRightTouching = true
        }
    }
}
To:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touch = touches.first!
    let touchLocation = touch.location(in: self)
    if player1CarRight.contains(touchLocation) {
        self.player1CarRotateRightTouching = true
    }
}
Same change in touchesEnded:
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touch = touches.first!
    let touchLocation = touch.location(in: self)
    if player1CarRight.contains(touchLocation) {
        self.player1CarRotateRightTouching = false
    }
}
I ran into this behavior today, but perhaps for a different reason. My scenario involved having a UITapGestureRecognizer on a superview of the view that implements touchesBegan. Since the UITapGestureRecognizer was recognizing the tap, and because the default for cancelsTouchesInView on a tap recognizer is true, once the recognizer recognized the touch, the views didn't get any of the touch callbacks. However, if I touched and held, the UITapGestureRecognizer failed to recognize the touch, and then the views got the callbacks. This results in what looks like a delayed call to touchesBegan.
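If that's the cause, one fix (a sketch; tapRecognizer stands for whatever recognizer is attached to the superview) is to stop it from cancelling the views' touches:

// With this set, the views still receive touchesBegan/Moved/Ended
// even after the tap recognizer successfully recognizes.
tapRecognizer.cancelsTouchesInView = false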
I also noticed as part of this investigation that UIButtons get special treatment here. If a UITapGestureRecognizer is in play, it will not consider touches on UIButtons.
Related
I'm trying to use a gesture recognizer to allow the user to resize a view on the screen by dragging it with one finger; as such, I want to track all changes to the touch. I initially tried to use a PanGestureRecognizer for this, but I've run into an issue when the user touches the view with multiple fingers. I only want to track the first touch (i.e., the first finger the user places on the screen during a gesture), but I can't find a way to prevent additional touches from interfering with my ability to track the first. My initial solution was to simply set
recognizer.maximumNumberOfTouches = 1
However, this causes the entire gesture to be cancelled in the event of a second touch (I want to continue tracking the first touch in this case). But, of course, not setting this value to 1 causes the gesture to actually track multiple touches. How can I make a gesture recognizer that tracks all changes to only the first touch in the gesture, and isn't cancelled by additional touches on the screen?
From your comments and answer, it sounds like your use of a pan gesture recognizer was always wrong and you were doing something wrong in your action handler. No correct action handler for a pan gesture recognizer would ever use location(in:) for anything. location(in:) gives you a centroid of all touches, which is exactly what you say you don't want — and in any case there is no need to consult it.
If your goal is to drag a view, you use translation(in:).
If your goal is to track the actual location of a particular touch, you use location(ofTouch:in:).
If you discover that a touch is being delivered and you don't want it tracked, call ignore(_:for:).
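For example, a minimal drag handler built on translation(in:) might look like this (handlePan is a hypothetical action method wired to a UIPanGestureRecognizer on the view being dragged):

@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    guard let draggedView = recognizer.view else { return }
    // The pan's cumulative offset since it began (not a touch location).
    let translation = recognizer.translation(in: draggedView.superview)
    draggedView.center = CGPoint(x: draggedView.center.x + translation.x,
                                 y: draggedView.center.y + translation.y)
    // Zero it out so the next callback delivers an incremental delta.
    recognizer.setTranslation(.zero, in: draggedView.superview)
}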
Here is a UIGestureRecognizer subclass that will track all changes to the first touch the user places on the screen at the start of a gesture, and will ignore all other fingers without cancelling the gesture. Note that if you're using functions/properties of the gesture recognizer (such as location(in:)), you'll probably have to override those as well.
import UIKit
// Needed to set `state` on a UIGestureRecognizer subclass in Swift.
import UIKit.UIGestureRecognizerSubclass

class FirstTouchGestureRecognizer: UIGestureRecognizer {
    private var firstTouch: UITouch? = nil

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        // Only the very first touch of the gesture starts tracking;
        // touches that arrive later are simply never recorded.
        if firstTouch == nil {
            firstTouch = touches.first!
            self.state = .began
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        // Report a change only when the tracked touch is among the moved ones.
        if let first = firstTouch, touches.contains(first) {
            self.state = .changed
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        if let first = firstTouch, touches.contains(first) {
            firstTouch = nil
            self.state = .ended
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        // Without this, a cancelled first touch would leave the recognizer stuck.
        if let first = firstTouch, touches.contains(first) {
            firstTouch = nil
            self.state = .cancelled
        }
    }
}
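Hypothetical usage from a view controller (the selector name is made up):

import UIKit

class DemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let recognizer = FirstTouchGestureRecognizer(target: self,
                                                     action: #selector(handleFirstTouch(_:)))
        view.addGestureRecognizer(recognizer)
    }

    @objc func handleFirstTouch(_ recognizer: FirstTouchGestureRecognizer) {
        // Caveat from above: the inherited location(in:) reports the centroid
        // of all touches, so override it (or expose firstTouch) if you need
        // the first touch's own position.
        if recognizer.state == .changed {
            print("gesture changed at \(recognizer.location(in: recognizer.view))")
        }
    }
}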
I am developing a keyboard extension for iOS. On iOS 9 the keys react immediately, except for keys along the left edge of the keyboard. Those react with around a 0.2 second delay. The reason is that the touches are simply delivered to the UIView that is the root view of my keyboard with this delay. On iOS 8 there is no such delay.
My guess is that this delay is caused by some logic that is supposed to recognize the gesture for opening the running-apps screen. That is fine, but the delay on a keyboard is unacceptable. Is there any way to get those events without the delay? Perhaps just setting delaysTouchesBegan to false on some UIGestureRecognizer?
This is for anyone using later versions of iOS (it's working on iOS 9 and 10 for me). My issue was caused by the swipe-to-go-back gesture interfering with my touchesBegan method: it prevented it from firing on the very left edge of the screen until either the touch ended or the system recognized that the movement was not the swipe-to-go-back gesture.
In your viewDidLoad function in your controller, simply put:
self.navigationController?.interactivePopGestureRecognizer?.delaysTouchesBegan = false
The official solution since iOS 11 is to override preferredScreenEdgesDeferringSystemGestures in your UIInputViewController.
https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
However, it doesn't seem to work on iOS 13, at least. As far as I understand, that happens because preferredScreenEdgesDeferringSystemGestures is not honored when overridden inside a UIInputViewController.
When you override this property in a regular view controller, it works as expected:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.left, .bottom, .right]
}
That's not the case for UIInputViewController, though.
UPD: It appears gesture recognizers still get the .began state update without the delay. So, instead of following the rather messy solution below, you can add a custom gesture recognizer to handle touch events.
You can quickly test this by adding a UILongPressGestureRecognizer with minimumPressDuration = 0 to your control view.
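A sketch of that quick test (KeyControl is a made-up UIControl subclass standing in for your control view):

import UIKit

class KeyControl: UIControl {
    override init(frame: CGRect) {
        super.init(frame: frame)
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handlePress(_:)))
        // Zero duration: .began fires on initial contact, ahead of the
        // system-deferred touchesBegan near the screen edges.
        press.minimumPressDuration = 0
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePress(_ recognizer: UILongPressGestureRecognizer) {
        switch recognizer.state {
        case .began:
            isHighlighted = true    // touch-down effect, without the delay
        case .ended, .cancelled, .failed:
            isHighlighted = false
        default:
            break
        }
    }
}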
Another solution:
My original workaround was calling touch-down effects inside hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?, which is called even when touch delivery is delayed for the view.
You have to ignore the "real" touch down event when it fires, about 0.4 s later or simultaneously with the touch up inside event. Also, it's probably better to apply this hack only when the tested point falls within the ~20 pt lateral margins.
So, for example, for a view whose width equals the screen width, the implementation may look like this:
let edgeProtectedZoneWidth: CGFloat = 20

override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let result = super.hitTest(point, with: event)
    guard result == self else {
        return result
    }
    // Only apply the workaround inside the lateral margins where the
    // system defers touch delivery.
    if point.x < edgeProtectedZoneWidth || point.x > bounds.width - edgeProtectedZoneWidth {
        if !alreadyTriggeredFocus {
            isHighlighted = true
        }
        triggerFocus()
    }
    return result
}

private var alreadyTriggeredFocus: Bool = false

@objc override func triggerFocus() {
    // Ignore the delayed "real" touch down if we already fired from hitTest.
    guard !alreadyTriggeredFocus else { return }
    super.triggerFocus()
    alreadyTriggeredFocus = true
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesCancelled(touches, with: event)
    alreadyTriggeredFocus = false
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    alreadyTriggeredFocus = false
}
...where triggerFocus() is the method you call on the touch down event. Alternatively, you may override touchesBegan(_:with:).
If you have access to the view's window property, you can access these system gesture recognizers and set delaysTouchesBegan to false.
Here's sample code in Swift that does that:
if let window = view.window,
   let recognizers = window.gestureRecognizers {
    recognizers.forEach { r in
        // Add a condition here to only affect the recognizers you need to
        r.delaysTouchesBegan = false
    }
}
Also relevant: UISystemGateGestureRecognizer and delayed taps near bottom of screen
I'm trying to implement 2 distinct touch-based behaviours. The first behaviour's logic is like this:
override func touchesMoved(touches: Set<NSObject>, withEvent event: UIEvent) {
    let touch = touches.first as! UITouch
    let touchLocation = touch.locationInNode(self)
    moveViewTo(touchLocation) // moves my view to touchLocation
}
Now, for my second behaviour, I want to rotate the view based on where I have my finger on the screen. For that, I tried this:
override func touchesMoved(touches: Set<NSObject>, withEvent event: UIEvent) {
    let touch = touches.first as! UITouch
    let touchLocation = touch.locationInNode(self)
    rotateViewTo(touchLocation) // rotates my view to face touchLocation
}
Now, I want these two distinct behaviours to work concurrently. In particular, I want the first touch to change the position of the view, and the second touch to rotate the view.
Rotating is possible only if there are two touches in the view, and changes direction based on the second touch.
Is there any way to differentiate which of the touches is first and which is second? I can't find a way to tell from the touches: Set<NSObject> parameter, because a Set is by nature an unordered collection.
I believe your only option is to store your first touch in a var. My Swift isn't strong enough to write out the code for you, but here are the steps.
Loop through your touches instead of just grabbing the first one.
Check if there is a firstTouch (your var for tracking the first touch).
If there isn't a firstTouch, assign the first touch and do your position logic.
If there is a firstTouch, check whether each touch is the firstTouch. If it isn't, do your rotation logic.
On touches ended and cancelled, loop through each touch. If it is the firstTouch, set your var to nil.
Hopefully that gets you going in the right direction; a rough sketch of those steps follows below.
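A sketch in Swift 3 syntax, assuming the SpriteKit setup from the question (moveViewTo and rotateViewTo are the asker's own helpers; untested):

private var firstTouch: UITouch?

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        if firstTouch == nil {
            // No tracked touch yet: claim this one and drive position.
            firstTouch = touch
            moveViewTo(touch.location(in: self))
        } else {
            // Every later touch drives rotation instead.
            rotateViewTo(touch.location(in: self))
        }
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches {
        if touch === firstTouch {
            moveViewTo(touch.location(in: self))
        } else {
            rotateViewTo(touch.location(in: self))
        }
    }
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Release the tracked touch when that finger lifts.
    for touch in touches where touch === firstTouch {
        firstTouch = nil
    }
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches where touch === firstTouch {
        firstTouch = nil
    }
}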
I am making a SpriteKit game where, in order to begin the game, you need to hold two separate spots (SKShapeNodes) for 3 seconds. If you lift either finger or move either finger off a node, the game will not start. I have it working fine with 1 spot, but when I try to do 2 spots, I'm stuck. What is the simplest way to detect the 2 correct touches on the correct nodes?
This doesn't seem like a very uncommon situation, so if anyone knows the best way to handle this, I would appreciate the help.
Swift preferred, also.
Set multipleTouchEnabled to true and use the touchesForView: method.
You can get more specific information on multi-touch from Apple's Multitouch Events documentation.
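Note that multi-touch is off by default; a minimal sketch of enabling it from a SpriteKit scene (Swift 3 spelling) is:

override func didMove(to view: SKView) {
    // Without this, only the first finger's touches are delivered.
    view.isMultipleTouchEnabled = true
}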
The main idea is to get all of the touches whenever the user does anything, and process them all at the same time.
So, add 3 handlers to your Scene class:
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    checkTouches((event?.allTouches())!)
}

override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    checkTouches((event?.allTouches())!)
}

override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
    checkTouches((event?.allTouches())!)
}
In the checkTouches function you will see all touches with updated properties (like positions etc).
Simple example:
func checkTouches(touches: Set<UITouch>) {
    // Iterate over every touch currently on the screen.
    for touch in touches {
        let touchLocation = touch.locationInNode(self)
        let node = nodeAtPoint(touchLocation)
        // Your check goes here, e.g. compare node.name against your two
        // spots and track how long each has been held ("spot1"/"spot2"
        // are placeholder names).
        if node.name == "spot1" || node.name == "spot2" {
            // ...
        }
    }
}
Using this approach you will be able to handle any changes simultaneously.
E.g. the user may touch your node, then move the finger outside it, and then move back to the node.
Enjoy!
Hi, I am currently moving a sprite by checking if the button was pressed in touchesBegan and then updating the position in update. I am doing this so that the user does not need to keep pressing the move up/down/left/right buttons over and over. The problem is that sometimes the sprite does not stop moving, which I'm sure is due to this. Does anyone know a better solution? For brevity I will show you the way I am taking care of the up button.
override func touchesBegan(touches: NSSet, withEvent event: UIEvent) {
    for touch: AnyObject in touches {
        // Get the location of the touch in this scene
        let location = touch.locationInNode(self)
        // Check if the location of the touch is within the button's bounds
        if upButton.containsPoint(location) {
            upButtonPressed = true
        }
    }
}

override func update(currentTime: CFTimeInterval) {
    if upButtonPressed == true {
        ball.position.y += 3
    }
}
I kept it simple here, but I do have all the conditions to stop movement in my code. I am simply wondering if there is an easier way to do this, maybe with a long press gesture recognizer?