How to remap a touch to another window? (iOS)

I have an iPad with an external touchscreen. I am trying to remap the coordinate system of the external touchscreen to the UIWindow that is shown there.
I am getting the touches on the UIViewController of the UIWindow that is displayed on the iPad, as if I were touching the iPad. I do get them with the touch type .stylus, which is how I can distinguish them from actual iPad touches (I will show different views on the iPad and the external screen). I am using the following code:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touchesOnExternalWindow = touches.filter({ $0.type == .stylus })
    let touchesOnIpad = touches.subtracting(touchesOnExternalWindow)
    if !touchesOnIpad.isEmpty {
        super.touchesBegan(touchesOnIpad, with: event)
    }
    if !touchesOnExternalWindow.isEmpty {
        // screenBounds holds the external screen's bounds (defined elsewhere).
        guard let externalWindow = UIApplication.shared.windows.first(where: { $0.screen.bounds == screenBounds }) else {
            fatalError("Touching the external display without an external display is not supported!")
        }
        externalWindow.rootViewController?.view.touchesBegan(touchesOnExternalWindow, with: event)
    }
}
I am trying to pass the touches along to the second UIWindow, as if I were touching there. But that does not seem to work.
How can I touch a view programmatically? I am testing now with a button on the second screen, but it will need to work with SceneKit views as well.

You can have only one first responder at a time, and it can pass events to only one next responder.
So if you want to handle the stylus touches in the external window, you need to place this code inside your iPad UIViewController's next responder (it's not clear from the question which object that is).
In the next responder of the iPad UIViewController, place this code:
override var next: UIResponder? {
    // The next responder in the UIResponder chain.
    // It receives the responder events (like touches) that the current
    // responder (your iPad window's UIViewController's next responder) didn't handle.
    guard let externalWindow = UIApplication.shared.windows.first(where: { $0.screen.bounds == screenBounds }) else {
        fatalError("Touching the external display without an external display is not supported!")
    }
    return externalWindow
}

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touchesOnExternalWindow = touches.filter({ $0.type == .stylus })
    let touchesOnIpad = touches.subtracting(touchesOnExternalWindow)
    if !touchesOnIpad.isEmpty {
        // Handle the touches on the iPad here, instead of passing them to the next responder.
    }
    // Pass the stylus touches to the next responder (the external window).
    if !touchesOnExternalWindow.isEmpty {
        super.touchesBegan(touchesOnExternalWindow, with: event)
    }
}
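If forwarding the UITouch objects doesn't work (UIKit generally won't deliver synthetic touches you re-send yourself), an alternative is to remap the touch location manually and hit-test the external window directly. A minimal sketch, assuming externalWindow is resolved as in the snippets above and that the iPad view should map proportionally onto the external screen:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    for touch in touches where touch.type == .stylus {
        // Location in the iPad window's coordinate space.
        let location = touch.location(in: nil)
        // Hypothetical remapping: scale proportionally into the external window.
        let remapped = CGPoint(x: location.x * externalWindow.bounds.width / view.bounds.width,
                               y: location.y * externalWindow.bounds.height / view.bounds.height)
        // Act directly on the view under the remapped point; a SceneKit view
        // would need its own hit testing via SCNView.hitTest(_:options:).
        if let target = externalWindow.hitTest(remapped, with: event) as? UIControl {
            target.sendActions(for: .touchUpInside)
        }
    }
}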

Related

iOS Swift 4 Screen game swipe boxes

I am creating an app where the user has to swipe all the boxes off the screen. The goal is to keep swiping until every box has been swiped.
So my questions are:
1. Is it better to create the boxes using a stack view, or to draw them manually by coordinates on the screen?
2. How do I detect whether the user has swiped through the boxes (using a UIGestureRecognizer)?
Note: when the user swipes through a box, the swiped box turns a different color.
Both a stack view and manual layout should work nicely. I would go with manual layout in this case, but that is just a preference because it gives you more control; the downside is that you need to reposition the boxes when the screen size changes. A third option is a collection view.
The gesture recognizer should be pretty straightforward. You just add it to the superview of these cells and check the location when the touch starts and when it moves. A pan gesture seems the most appropriate, but it will not detect the user simply tapping the screen. That may be a feature, but if you want to handle all touches you should either use a long-press gesture with zero press duration (it makes little sense, I know, but it works; see the sketch after the onDrag snippet below), or simply override the touch methods:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        handleDrag(at: touch.location(in: viewWhereAllMiniViewsAre))
    }
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        handleDrag(at: touch.location(in: viewWhereAllMiniViewsAre))
    }
}

func handleDrag(at location: CGPoint) {
    // TODO: handle the nodes
}
The gesture recognizer action would look something like this:

@objc func onDrag(_ sender: UIGestureRecognizer) {
    switch sender.state {
    case .began, .changed, .ended, .cancelled:
        handleDrag(at: sender.location(in: viewWhereAllMiniViewsAre))
    case .possible, .failed:
        break
    }
}
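For completeness, a sketch of how the recognizer might be attached (this wiring is my assumption; viewWhereAllMiniViewsAre is the same container view as above). A long-press recognizer with zero press duration catches plain taps as well as drags:

// e.g. in viewDidLoad
let drag = UILongPressGestureRecognizer(target: self, action: #selector(onDrag(_:)))
drag.minimumPressDuration = 0  // fires immediately, so taps are caught too
viewWhereAllMiniViewsAre.addGestureRecognizer(drag)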
Now all you need is your data source. An array of all of your items should be enough, like:

static let rows: Int = 10
static let columns: Int = 10

var nodes: [Node] = {
    // Note: if Node is a reference type, use map so each slot gets its own instance:
    // (0..<rows * columns).map { _ in Node() }
    return Array<Node>(repeating: Node(), count: LoginViewController.rows * LoginViewController.columns)
}()
And a list of all of your mini views:

var nodeViews: [UIView] = { /* ... position them, or get them from a stack view or a collection view ... */ }()
Now the touch-handling implementation:

func handleDrag(at location: CGPoint) {
    nodeViews.enumerated().forEach { index, view in
        if view.frame.contains(location) {
            view.backgroundColor = UIColor.green
            nodes[index].selected = true
        }
    }
}
This is just an example, an easy one, and rather a bad one from a maintenance perspective. In general I would rather make the node view a custom UIView subclass with a reference to its node, and have the node report through a delegate when its selection state changes (see the sketch after the allGreen snippet below).
This way you get a much cleaner solution when handling touches:
func handleDrag(at location: CGPoint) {
    // Assumes nodeViews is [NodeView], where each view holds a reference to its node.
    nodeViews.first(where: { $0.frame.contains(location) })?.node.selected = true
}
Checking if all boxes are green is then just:

var allGreen: Bool {
    return !nodes.contains(where: { $0.selected == false })
}
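A minimal sketch of that delegate-based design (all names here are illustrative, not from the original answer):

import UIKit

protocol NodeDelegate: AnyObject {
    func nodeDidChangeSelection(_ node: Node)
}

class Node {
    weak var delegate: NodeDelegate?
    var selected = false {
        didSet { if selected != oldValue { delegate?.nodeDidChangeSelection(self) } }
    }
}

class NodeView: UIView, NodeDelegate {
    let node: Node

    init(node: Node, frame: CGRect) {
        self.node = node
        super.init(frame: frame)
        node.delegate = self
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // The node reports selection changes; the view only reacts to them.
    func nodeDidChangeSelection(_ node: Node) {
        backgroundColor = node.selected ? .green : .lightGray
    }
}

With this split, the touch-handling code never mutates views directly, which keeps the selection logic easy to test.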

UIButton touchDragEnter and touchDragExit called too often

How can I keep a UIButton's .touchDragEnter and .touchDragExit events from rapid firing? This question demonstrates the issue perfectly, but the only answer does not describe how to work around it. I'm trying to animate a button when the user's finger is on the button, and animate it again when their finger slides off. Are there better ways to do this? If not, how should I stop my animation code from firing multiple times when the user's finger is right on the boundary between an .enter and an .exit state?
You could instead track the location of the touch point itself and determine when it moves in and out of the button:
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let point = touch.location(in: self)
        // moving into the button
        if button.frame.contains(point) && !wasInButton {
            // trigger animation
            wasInButton = true
        }
        // moving out of the button
        if !button.frame.contains(point) && wasInButton {
            // trigger animation
            wasInButton = false
        }
    }
}
wasInButton could be a Boolean variable that is set to true when a touch lands inside the button's frame:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let point = touch.location(in: self)
        if button.frame.contains(point) {
            wasInButton = true
            // trigger animation
        } else {
            wasInButton = false
        }
    }
}
This would require you to subclass the button's superview. And since you might not want to animate as soon as the point leaves the button's frame (because the user's finger or thumb would still be covering most of the button), you could instead do the hit test in a larger frame that encapsulates your button, as sketched below.
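A minimal sketch of that larger hit zone (the 20pt margin and the reuse of the wasInButton flag from above are assumptions on my part):

let hitSlop: CGFloat = 20  // hypothetical margin around the button

var expandedFrame: CGRect {
    // Negative insets grow the frame in every direction.
    return button.frame.insetBy(dx: -hitSlop, dy: -hitSlop)
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let isInside = expandedFrame.contains(touch.location(in: self))
    // Animate only on actual transitions, so the animation cannot rapid-fire.
    if isInside != wasInButton {
        wasInButton = isInside
        // trigger the enter/exit animation here
    }
}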

Call hitTest inside of touchesMoved

I have a UIView that sits on top of all other views and overrides hitTest(_:with:) so that it always returns itself:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    return self
}
Then, when I perform some operations using the points from touchesBegan(), I need to pass the hit test on to the views below our UIView:
override public func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    // Do some operations
    // ...
    // then pass touch event handling to the views below, or change hitTest()
}
So basically, on the top UIView I'm overriding the touchesBegan(), touchesMoved(), and touchesEnded() methods. I need to handle the touches, perform some operations, and then, if needed, pass them on to the views below. Is that possible?
It is probably simpler and better to solve your problem differently.
UIKit delivers a touch event by sending it to the window (the root of the view hierarchy) in a sendEvent(_:) message. The window's sendEvent(_:) method is responsible for finding the gesture recognizers interested in the touches, and sending the appropriate touchesBegan, touchesMoved, etc. messages to the recognizers and/or the hit view.
This means that you can subclass UIWindow and override sendEvent(_:) to get a look at every touch event in the window, before the event reaches any gesture recognizers or views, without overriding any view's hitTest(_:with:) method. Then you pass the event along to super.sendEvent(event) for normal routing.
Example:
class MyWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches {
            if let count = event.allTouches?.filter({ $0.phase == .began }).count, count > 0 {
                print("window found \(count) touches began")
            }
            if let count = event.allTouches?.filter({ $0.phase == .moved }).count, count > 0 {
                print("window found \(count) touches moved")
            }
            if let count = event.allTouches?.filter({ $0.phase == .ended }).count, count > 0 {
                print("window found \(count) touches ended")
            }
            if let count = event.allTouches?.filter({ $0.phase == .cancelled }).count, count > 0 {
                print("window found \(count) touches cancelled")
            }
        }
        super.sendEvent(event)
    }
}
You can use this window subclass in your app by initializing your app delegate's window outlet to an instance of it, like this:
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow? = MyWindow()
    // other app delegate members...
}
Note that UIKit uses hitTest(_:with:) to set the view property of a touch when the touch begins, before it delivers the touch-began event to the window. UIKit also sets each touch's gestureRecognizers property to the set of recognizers that might want the touch (recognizer state .possible) or are actively using the touch (states .began, .changed, .ended, or .cancelled) before passing the event to the window's sendEvent(_:). So your sendEvent(_:) override can look at each touch's view property if it needs to know where the touch is going.
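For example, a sketch along the lines of the MyWindow example above (this part is my own illustration, not from the original answer):

class RoutingWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if let touches = event.allTouches {
            for touch in touches where touch.phase == .began {
                // touch.view is the hit-test view UIKit chose before delivery.
                print("touch began over \(String(describing: touch.view))")
            }
        }
        super.sendEvent(event)  // normal routing
    }
}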

UIButtons in specific zone of the screen make delay on Touch Down event

I'm creating a custom keyboard for iOS. I have 4 rows of keys; each key has two actions: Touch Down to highlight the button, and Touch Up Inside to unhighlight the button after 0.4 seconds.
But at the left edge of the screen there is a zone where the Touch Down event of any button is delayed by about a quarter of a second before the highlight appears.
So to see the highlighted version, I had to hold the button or swipe right from it. The buttons are identical, with no difference at all. When I switch from letters to symbols, this left-edge zone causes the same delay. I tried moving all the keys about 20px to the right, and they worked fine, without delay. Moving them back to the edge brought the delay back as well. Then I noticed that pressing a button on its right edge, about 1-2 pixels in, produced no delay at all. So it seems the problem lies in this left-edge zone of the screen specifically.
By the way, I am running this app on my 5S; I tried running it on my friend's 5C and saw the same problem. But when I run it in the simulator, there is no such delay.
Use the new iOS 11 API to solve this problem:

var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { get }

Documentation: https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
I'm also creating a custom keyboard and, as far as I understand, this happens because preferredScreenEdgesDeferringSystemGestures does not work properly when overridden inside UIInputViewController, at least on iOS 13.
When you override this property in a regular view controller, it works as expected:

override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.left, .bottom, .right]
}
That, however, is not the case for UIInputViewController.
Update: it appears that gesture recognizers still get the .began state update without the delay. So, instead of following the rather messy solution below, you can add a custom gesture recognizer to handle touch events.
You can quickly test this by adding a UILongPressGestureRecognizer with minimumPressDuration = 0 to your control view.
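A sketch of that quick test (keyView and the print statements standing in for highlight handling are illustrative assumptions):

import UIKit

class KeyViewController: UIInputViewController {
    let keyView = UIView()  // stand-in for one of your key views

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(keyView)
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handlePress(_:)))
        press.minimumPressDuration = 0
        keyView.addGestureRecognizer(press)
    }

    @objc func handlePress(_ sender: UILongPressGestureRecognizer) {
        switch sender.state {
        case .began:
            print("touch down")  // fires immediately, without the edge-deferral delay
        case .ended, .cancelled:
            print("touch up")    // unhighlight the key here
        default:
            break
        }
    }
}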
Another solution:
My original workaround was to trigger the touch-down effects inside hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView?, which is called even while touches are being delayed for the view.
You then have to ignore the "real" touch-down event when it fires, about 0.4s later or simultaneously with the touch-up-inside event. It's also probably better to apply this hack only when the tested point lies within the ~20pt lateral margins.
So, for example, for a view whose width equals the screen width, the implementation might look like this:
let edgeProtectedZoneWidth: CGFloat = 20

override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    let result = super.hitTest(point, with: event)
    guard result == self else {
        return result
    }
    if point.x < edgeProtectedZoneWidth || point.x > bounds.width - edgeProtectedZoneWidth {
        if !alreadyTriggeredFocus {
            isHighlighted = true
        }
        triggerFocus()
    }
    return result
}

private var alreadyTriggeredFocus: Bool = false

@objc override func triggerFocus() {
    guard !alreadyTriggeredFocus else { return }
    super.triggerFocus()
    alreadyTriggeredFocus = true
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesCancelled(touches, with: event)
    alreadyTriggeredFocus = false
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    alreadyTriggeredFocus = false
}
...where triggerFocus() is the method you call on the touch-down event. Alternatively, you may override touchesBegan(_:with:), as sketched below.
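A sketch of that touchesBegan(_:with:) alternative, assuming the same alreadyTriggeredFocus flag and triggerFocus() method as above:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesBegan(touches, with: event)
    // The guard inside triggerFocus() prevents double-firing when
    // hitTest(_:with:) has already triggered the focus effect.
    triggerFocus()
}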

Touch events are delayed near left screen edge on iOS 9 only. How to fix it?

I am developing a keyboard extension for iOS. On iOS 9 the keys react immediately, except for the keys along the left edge of the keyboard. Those react with around a 0.2 second delay. The reason is that the touches are simply delivered with this delay to the UIView that is the root view of my keyboard. On iOS 8 there is no such delay.
My guess is that this delay is caused by some logic that is supposed to recognize the gesture for opening the "running apps" screen. That is fine, but the delay on a keyboard is unacceptable. Is there any way to get those events without the delay? Perhaps just setting delaysTouchesBegan to false on some UIGestureRecognizer?
This is for anyone using later versions of iOS (it is working on iOS 9 and 10 for me). My issue was caused by the swipe-to-go-back gesture interfering with my touchesBegan method: it prevented the method from firing on the very left edge of the screen until either the touch ended or the system recognized that the movement was not a swipe-to-go-back gesture.
In your view controller's viewDidLoad, simply put:

self.navigationController?.interactivePopGestureRecognizer?.delaysTouchesBegan = false
The official solution since iOS 11 is overriding preferredScreenEdgesDeferringSystemGestures of your UIInputViewController:
https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
However, it doesn't seem to work on iOS 13, at least not when overridden inside UIInputViewController (in a regular view controller it works as expected). The workarounds are the same as in the answer to the previous question above: add a custom gesture recognizer such as a UILongPressGestureRecognizer with minimumPressDuration = 0 (recognizers still receive the .began state update without the delay), or trigger the touch-down effects from hitTest(_:with:) as shown there.
If you have access to the view's window property, you can access these system gesture recognizers and set delaysTouchesBegan to false.
Here's sample code in Swift that does that:
if let window = view.window,
   let recognizers = window.gestureRecognizers {
    recognizers.forEach { r in
        // add a condition here to only affect the recognizers you need to
        r.delaysTouchesBegan = false
    }
}
Also relevant: UISystemGateGestureRecognizer and delayed taps near bottom of screen
