Everyone knows that when you drag outside a UIButton, it doesn't cancel the highlight state right away by default: UIControlEventTouchDragExit only triggers once the touch is about 70 points away from the button. I want that distance to be 0. So, after searching for a solution, I tried to create a subclass like this:
import UIKit

class UINewButton: UIButton {

    override func continueTrackingWithTouch(touch: UITouch, withEvent event: UIEvent?) -> Bool {
        print("here")

        let touchOutside = !CGRectContainsPoint(self.bounds, touch.locationInView(self))
        if touchOutside {
            let previousTouchInside = CGRectContainsPoint(self.bounds, touch.previousLocationInView(self))
            if previousTouchInside {
                print("Sending UIControlEventTouchDragExit")
                self.sendActionsForControlEvents(.TouchDragExit)
                self.highlighted = false
                self.selected = false
            } else {
                print("Sending UIControlEventTouchDragOutside")
                self.sendActionsForControlEvents(.TouchDragOutside)
            }
        } else {
            let previousTouchOutside = !CGRectContainsPoint(self.bounds, touch.previousLocationInView(self))
            if previousTouchOutside {
                print("Sending UIControlEventTouchDragEnter")
                self.sendActionsForControlEvents(.TouchDragEnter)
            } else {
                print("Sending UIControlEventTouchDragInside")
                self.sendActionsForControlEvents(.TouchDragInside)
            }
        }
        return super.continueTrackingWithTouch(touch, withEvent: event)
    }
}
and created a button like this in a UIViewController:
@IBOutlet var confirmButton: UINewButton!
I assumed that when a UIButton is touched and dragged, the functions would be called in this sequence:
beginTrackingWithTouch (when touched) -> continueTrackingWithTouch (when dragged) -> endTrackingWithTouch (when the touch ends)
But here is the weird part: even though I override continueTrackingWithTouch, it never gets called. The console never shows the "here" I print inside it, and the trigger distance remains the default 70. How come?
I tried overriding the three functions mentioned above, returning true where needed.
What did I miss?
After reading this article: UIControlEventTouchDragExit triggers when 100 pixels away from UIButton
It still didn't help :( (plus it's written in Objective-C...)
Isn't the 70pt distance a property somewhere that I can just change? (And how can I see the original implementation, by the way? There is no detail in the Apple Developer Documentation...)
Should I use button.addTarget in the UIViewController instead? But that seems like just another way of doing the same thing.
Here is another question:
If I want to cancel the highlight state when dragged outside the button, is this right?
self.highlighted = false
self.selected = false
I don't know which one is the right one, so I set both.
Please help! I'm just a newbie in Swift, but I have been stuck on this problem for 3 days. QQ
In Swift 3 the function signature has changed. It's now:
func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool
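For reference, here is a minimal Swift 3 sketch of the same subclass. It is untested and simply mirrors the Swift 2 attempt above (with the drag-enter case also restoring the highlight, which the original didn't do); note that the super call may still emit UIControl's own drag events based on the default ~70pt extended zone:

import UIKit

class UINewButton: UIButton {

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        let isInside = bounds.contains(touch.location(in: self))
        let wasInside = bounds.contains(touch.previousLocation(in: self))

        if wasInside && !isInside {
            // Crossed the edge going out: fire the exit event with zero tolerance
            sendActions(for: .touchDragExit)
            isHighlighted = false
        } else if !wasInside && isInside {
            sendActions(for: .touchDragEnter)
            isHighlighted = true
        } else if isInside {
            sendActions(for: .touchDragInside)
        } else {
            sendActions(for: .touchDragOutside)
        }
        // Note: super may still send UIControl's own drag events
        // using the default ~70pt extended zone.
        return super.continueTracking(touch, with: event)
    }
}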
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let positionInScene = touch.location(in: self)
    let touchedNode = self.atPoint(positionInScene)

    if let name = touchedNode.name {
        if name == "leftbutton" {
            print("left button stopped")
            touchedNode.run(buttonStoppedPressAction)
            player?.removeAllActions()
        }
        if name == "rightbutton" {
            print("right button stopped")
            touchedNode.run(buttonStoppedPressAction)
            player?.removeAllActions()
        }
    }
}
Here I have code so that when the user lifts their finger off one of the buttons, it stops the action, but only if they lift their finger inside the button. So if they press a button and begin to move their finger somewhere else on the screen while continuously pressing down, the button will not stop executing its code. Thank you for any help.
Essentially, you should check the touch location at touch down and compare it to the location at touch up. If the touch is no longer in the area of your button, you cancel all effects, as in the sketch below.
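For example, a minimal sketch of that idea at the scene level (assuming the leftbutton/rightbutton names, player, and buttonStoppedPressAction from your code, and that the buttons are direct children of the scene):

// Remember which button, if any, the touch started on
private var pressedButtonName: String?

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    pressedButtonName = atPoint(touch.location(in: self)).name
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let name = pressedButtonName,
          name == "leftbutton" || name == "rightbutton" else { return }
    // Stop no matter where the finger lifted, inside or outside the button
    childNode(withName: name)?.run(buttonStoppedPressAction)
    player?.removeAllActions()
    pressedButtonName = nil
}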
First, though, a point. It seems like you are handling button logic at the SKScene level, which is what tutorials often tell you to do. However, this may not be the best approach. The risks here, in addition to a cluttered mess of an SKScene, come from handling multiple objects and how they react to touch events, plus the additional complexity of multitouch (if allowed).
Years ago, when I started with SpriteKit, I felt like this was a huge pain. So I made a button class that handles all the touch logic independently (and sends signals back to the parent when something needs to happen). Benefits: no needless clutter, no trouble distinguishing between objects, and the ability to determine multitouch allowances per node.
What I do in my class to check that the touch hasn't left the button before touch up is store the size of the button area (as a parameter of the object) and the touch position within it. Simple simple.
In fact, it has baffled me forever that Apple didn't just provide a rudimentary SKButton class by default. Anyhow, I think you might want to consider this approach. At least for me it saves sooo much time every day, and I've shipped multiple successful apps with the same custom button class.
EDIT: Below is my barebones Button class.
import SpriteKit

class Button: SKNode {

    private var background: SKSpriteNode?
    private var icon: SKNode?
    private var tapAction: () -> Void = {}

    override init() {
        super.init()
        isUserInteractionEnabled = true
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        isUserInteractionEnabled = true
    }

    // MARK: Switches

    public func switchButtonBackground(buttonBackgroundSize: CGSize, buttonBackgroundColor: SKColor) {
        // Replace any existing background; it also defines the touch area
        background?.removeFromParent()
        background = SKSpriteNode(color: buttonBackgroundColor, size: buttonBackgroundSize)
        addChild(background!)
    }

    public func switchButtonIcon(_ buttonIcon: SKNode) {
        // Remove the old icon from the node tree before installing the new one
        icon?.removeFromParent()
        icon = buttonIcon
        addChild(buttonIcon)
    }

    public func switchButtonTapAction(_ buttonTapAction: @escaping () -> Void) {
        tapAction = buttonTapAction
    }

    // MARK: Touch events

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Only fire the action if the finger lifted inside the button area
        guard let touch = touches.first,
              let background = background,
              background.frame.contains(touch.location(in: self)) else { return }
        tapAction()
    }
}
And then you create the Button object by first initializing it, assigning it a background using a size and color, then assigning it an icon, assigning it a function to run when tapped, and finally adding it as a child of the scene.
let icon = SKNode()
let size = CGSize(width: 20.0, height: 20.0)
let button = Button()
button.switchButtonBackground(buttonBackgroundSize: size, buttonBackgroundColor: .clear)
button.switchButtonIcon(icon)
button.switchButtonTapAction(buttonPressed)
addChild(button)
The background defines the touch area for the button, and you can either give it a color or make it .clear. The icon is meant to hold any text or images you want on top of the button; just package them into an SKNode and you're good to go. If you want to run a function with a parameter as the tap action, you can just wrap it in a closure, as shown below.
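For example (a sketch, where stopPlayer(named:) stands in for whatever parameterized function you want to run):

button.switchButtonTapAction {
    stopPlayer(named: "leftbutton")  // hypothetical parameterized call wrapped in a closure
}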
Hope that helps! Let me know if you need any further help :).
I have a view in my storyboard whose alpha is set to 0 by default. In certain cases the Swift file sets the alpha to 1, so it's either hidden or not. Before, this view just contained 2 labels. I'm trying to add 2 buttons to the view.
For some reason the buttons aren't tappable at all. Normally when you tap a button, it changes color slightly before you release, or while you hold it down; but that behavior isn't happening for some reason, and the function connected to the button isn't being called at all.
It seems like an issue where something is overlapping or sitting on top of the button. The button is totally visible and enabled and everything, but not tappable. I tried Debug View Hierarchy, but everything looks correct in that view.
Any ideas why this might be happening?
EDIT: I tried making a class with the following code and, in Interface Builder, setting the container view to that class.
class AnotherView: UIView {

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Report a hit only when the point lands on an interactive subview
        for view in self.subviews {
            if view.isUserInteractionEnabled, view.point(inside: self.convert(point, to: view), with: event) {
                return true
            }
        }
        return false
    }
}
Go with the hitTest(_:with:) method. When we call super.hitTest(point, with: event), the super call returns nil because user interaction is disabled. So, instead, we check whether the touch point is on the UIButton; if it is, we return the UIButton object. This sends the message to the selector of the UIButton object.
class AnotherView: UIView {

    @IBOutlet weak var button: UIButton!

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let view = super.hitTest(point, with: event)
        // If the touch landed on the button, route it there explicitly
        if self.button.frame.contains(point) {
            return button
        }
        return view
    }

    @IBAction func buttonTapped(sender: UIButton) {
    }
}
I have a number of UIButton instances in a UIViewController, and I want to perform a couple of actions when any of these buttons is pressed with pressure (all the way down); I don't know the exact term here (force touch maybe?).
So when a UIButton is pressed hard, I want to give haptic feedback via vibration, change the button's image source, and do some other stuff. Then when the pressure is released, I want to restore the button's image to the normal state and do some more stuff.
What is the easiest way to do this?
Should I make my own custom UIButton like the one below, or are there methods that can be overridden for 3D Touch "pressed" and "released"?
This is my custom UIButton code so far. Should I determine, by trial and error, what the threshold force should be? Also, how do I change the image source for each button in the easiest way possible?
import AudioToolbox
import UIKit

class CustomButton: UIButton {

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        for touch in touches {
            print("% Touch pressure: \(touch.force / touch.maximumPossibleForce)")
            if touch.force > valueThatIMustFindOut {
                AudioServicesPlayAlertSound(SystemSoundID(kSystemSoundID_Vibrate))
                // change image source
                // call external function
            }
        }
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        print("Touches End")
        // restore image source
        // call external function
    }
}
Please note that I am new to Swift, so I would like to use the graphical interface in Xcode to create the user interface as much as possible; I would like to avoid creating the UI from code.
As for force touch: you need to detect whether it's available first:
func is3dTouchAvailable(traitCollection: UITraitCollection) -> Bool {
    return traitCollection.forceTouchCapability == UIForceTouchCapability.available
}

if is3dTouchAvailable(traitCollection: self.view!.traitCollection) {
    //...
}
and then in, e.g., touchesMoved it will be available as touch.force and touch.maximumPossibleForce:
func touchMoved(touch: UITouch, toPoint pos: CGPoint) {
    let location = touch.location(in: self)
    let node = self.atPoint(location)
    //...
    if is3dTouchEnabled {
        bubble.setPressure(pressurePercent: touch.force / touch.maximumPossibleForce)
    } else {
        // ...
    }
}
Here's a more detailed example with code samples:
http://www.mikitamanko.com/blog/2017/02/01/swift-how-to-use-3d-touch-introduction/
It's also good practice to react to such "force touches" with haptic/taptic feedback, so the user feels the touch:
let generator = UIImpactFeedbackGenerator(style: .heavy)
generator.prepare()
generator.impactOccurred()
you might want to take a look at this post for details:
http://www.mikitamanko.com/blog/2017/01/29/haptic-feedback-with-uifeedbackgenerator/
I'm creating a custom keyboard for iOS. I have 4 rows of keys; each key has two actions: Touch Down to highlight the button, and Touch Up Inside to unhighlight it after 0.4 seconds.
But at the left edge of the screen there is a zone where the Touch Down event of any button is delayed by about a quarter of a second before the highlight shows.
See the image
So to see the highlighted version, I had to hold the button, or swipe right from it. The buttons are all the same, no difference at all. When I switch from letters to symbols, this left edge zone causes the same delay. I tried moving all the keys to the right, about 20px, and they worked fine, without delay. OK, stuck them back to the edge, and the delay came back too. Then I noticed that pressing a button on its right edge, about 1-2 pixels in, produced no delay at all. So it seems like the problem is in this left edge zone of the screen specifically.
By the way, I am running this app on my 5S; I've tried running it on my friend's 5C, same problem. But when I run it in the simulator, there is no such delay.
Use the new iOS 11 feature to solve this problem definitively.
var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { get }
Documentation
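For example, overriding it in the view controller that hosts the keys might look like this (a sketch; deferring all edges here, though you can pick specific ones):

override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    // Ask the system to defer its edge gestures so key touches arrive immediately
    return .all
}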
I'm also creating a custom keyboard, and as far as I understand, this happens because preferredScreenEdgesDeferringSystemGestures does not work properly when overridden inside UIInputViewController, at least on iOS 13.
When you override this property in a regular view controller, it works as expected:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.left, .bottom, .right]
}
That's not the case for UIInputViewController, however.
UPD: It appears gesture recognizers still get the .began state update without the delay. So, instead of following the rather messy solution below, you can add a custom gesture recognizer to handle touch events.
You can quickly test this by adding a UILongPressGestureRecognizer with minimumPressDuration = 0 to your control view, as in the sketch below.
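For instance, a quick sketch of that test (keyView and handlePress are hypothetical names here):

let recognizer = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
recognizer.minimumPressDuration = 0
keyView.addGestureRecognizer(recognizer)

@objc func handlePress(_ recognizer: UILongPressGestureRecognizer) {
    switch recognizer.state {
    case .began:
        print("touch down fires immediately, without the edge delay")  // apply touch down effects here
    case .ended, .cancelled:
        print("touch up")  // apply touch up effects here
    default:
        break
    }
}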
Another solution:
My original workaround was triggering the touch down effects inside hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?, which is called even when the touches are delayed for the view.
You then have to ignore the "real" touch down event when it fires, about 0.4s later or simultaneously with the touch up inside event. Also, it's probably better to apply this hack only when the tested point falls inside the ~20pt lateral margins.
So, for example, for a view whose width equals the screen width, the implementation may look like:
let edgeProtectedZoneWidth: CGFloat = 20

override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let result = super.hitTest(point, with: event)
    guard result == self else {
        return result
    }
    // Only apply the workaround inside the edge zones where touches get deferred
    if point.x < edgeProtectedZoneWidth || point.x > bounds.width - edgeProtectedZoneWidth {
        if !alreadyTriggeredFocus {
            isHighlighted = true
        }
        triggerFocus()
    }
    return result
}

private var alreadyTriggeredFocus: Bool = false

@objc override func triggerFocus() {
    // Ignore the delayed "real" touch down if we already fired from hitTest
    guard !alreadyTriggeredFocus else { return }
    super.triggerFocus()
    alreadyTriggeredFocus = true
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesCancelled(touches, with: event)
    alreadyTriggeredFocus = false
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    alreadyTriggeredFocus = false
}
...where triggerFocus() is the method you call on touch down event. Alternatively, you may override touchesBegan(_:with:).
I am developing a keyboard extension for iOS. On iOS 9 the keys react immediately, except for keys along the left edge of the keyboard: those react with around a 0.2 second delay. The reason is that the touches are simply delivered with this delay to the UIView that is the root view of my keyboard. On iOS 8 there is no such delay.
My guess is that this delay is caused by some logic that is supposed to recognize the gesture for opening the "running apps screen". That is fine, but the delay on a keyboard is unacceptable. Is there any way to get those events without the delay? Perhaps just setting delaysTouchesBegan to false on some UIGestureRecognizer?
This is for anyone using later versions of iOS (this works on iOS 9 and 10 for me). My issue was caused by the swipe-to-go-back gesture interfering with my touchesBegan method: it prevented the method from firing on the very left edge of the screen until either the touch ended or the system recognized that the movement was not a swipe-to-go-back gesture.
In your viewDidLoad function in your controller, simply put:
self.navigationController?.interactivePopGestureRecognizer?.delaysTouchesBegan = false
The official solution since iOS 11 is overriding preferredScreenEdgesDeferringSystemGestures in your UIInputViewController.
https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
However, it doesn't seem to work on iOS 13, at least. As explained in the identical answer above, preferredScreenEdgesDeferringSystemGestures works as expected when overridden in a regular view controller, but not inside UIInputViewController. See that answer for the workarounds: a custom gesture recognizer (recognizers still get the .began state update without the delay, e.g. a UILongPressGestureRecognizer with minimumPressDuration = 0), or triggering the touch down effects from hitTest(_:with:).
If you have access to the view's window property, you can access these system gesture recognizers and set delaysTouchesBegan to false.
Here's sample code in Swift that does that:
if let window = view.window,
   let recognizers = window.gestureRecognizers {
    recognizers.forEach { r in
        // add a condition here to only affect the recognizers you need to
        r.delaysTouchesBegan = false
    }
}
Also relevant: UISystemGateGestureRecognizer and delayed taps near bottom of screen