Disable iOS Reachability Swipe Gesture in an iOS game

I have an iOS game with a few controls near the bottom of the screen that may be swiped. When a player swipes down and their finger slides off the bottom of the screen, the Reachability accessibility gesture is also triggered. This slides the whole screen down, moving those controls off the page and hiding half of the game. Obviously this is not the player's intention, and avoiding it requires them to be very precise with their swipes, which isn't intuitive or fun.
On iPhones with rounded screen corners, the controls sit roughly 100pt from the bottom of the screen to leave space for the home indicator, which helps prevent this issue in many situations; on squared-corner devices they are much closer, at about 10pt.
In my rudimentary testing, I've discovered that Reachability is triggered even by a swipe that starts as high as 300pt up the screen, as long as it continues all the way to the bottom edge. So raising my controls isn't a solution: that would put them dead center on the screen (blocking the focus of the game) and out of comfortable finger reach on some phones.
Since Reachability has no use in my game (there are no controls in the upper third of the screen, precisely so players can keep their hands in the lower part of the screen), I'd really like a way to prevent this. Ideally there would be some way to inform the system it is unnecessary during gameplay, so I could still allow it in non-gameplay menus - but I may be dreaming with that part.
I also don't think it's a great solution to ask users to disable this system-wide, as the conflict is my app's and that would require them to change their behavior everywhere else.
Is there any guidance, examples, or advice on how to handle conflicts with this specific accessibility gesture?

You don't want to disable it; you want to defer it.
See https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
To use this, override the preferredScreenEdgesDeferringSystemGestures property (it is a property, not a method) to tell the system which screen edges need their gestures delayed.
In your case:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.bottom]
}
If you are doing this in a dynamic fashion, you will also need to call setNeedsUpdateOfScreenEdgesDeferringSystemGestures() to notify iOS that your rules have changed.
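For example, if you only want the deferral during gameplay, here is a minimal sketch (isGameplayActive is a hypothetical flag - wire it to your own game state):

import UIKit

class GameViewController: UIViewController {
    // Hypothetical flag, toggled when gameplay starts and ends.
    var isGameplayActive = false {
        didSet { setNeedsUpdateOfScreenEdgesDeferringSystemGestures() }
    }

    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        // Delay the bottom-edge system gesture only while playing.
        return isGameplayActive ? [.bottom] : []
    }
}

Note that this defers rather than disables: the system gesture still fires if the player performs it again deliberately.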

Related

tvOS focus engine on a 90 degree rotated TV?

Our company is using a TV in portrait orientation hooked up to an Apple TV running our own custom app to serve as a status board. This is purely an internal, hacked-together app - no worries about sending to the App Store.
To avoid things being rendered sideways, we have a base class view controller doing a 90 degree CGAffineTransform on the view (and all other view controllers in the project inherit from this base class):
class PortraitViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Rotate 90 degrees counterclockwise so content draws upright
        // on the portrait-mounted TV.
        view.transform = CGAffineTransform(rotationAngle: -.pi / 2)
    }
}
This works great for showing images, text, videos, custom UI controls, etc. However, the focus engine does not rotate with the view, and because it expects the TV is still being shown in landscape orientation, the Apple TV remote gestures end up 90 degrees off from what we want. Here's an example:
Here, we would want swiping right/left on the remote to move the focus of the segmented control between the two segments. But because the Apple TV thinks it's being shown in landscape mode, it thinks the segmented control is oriented vertically, and swiping up/down moves the focus of the segments.
Does anyone know if there's a way to convince the focus engine not to rotate alongside the view, or alternatively, a different way to display the view in portrait mode without rotating the view?
This is terrible, but the only way I can think of to achieve what you want is to take over the focus system entirely: make your root window/view focusable so it hijacks all input events, listen to raw touch/press events, and manage your own focus state from them. You can still use preferredFocusEnvironments + setNeedsFocusUpdate() to leverage the system UI and animations for focusing/unfocusing elements, but you would need to take full ownership of how focus shifts in response to user input and how focus update hints are forwarded for parallax effects.
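As a minimal sketch of that idea (class and method names are hypothetical; your own input handling picks the target, then asks the system to re-evaluate focus so you keep the standard focus UI and animations):

import UIKit

class ManualFocusViewController: UIViewController {
    // Whichever view our custom input handling last chose as the target.
    private var currentTarget: UIView?

    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        // Point the focus engine at our chosen view, if any.
        if let target = currentTarget { return [target] }
        return super.preferredFocusEnvironments
    }

    // Call this from your raw gesture handling, after remapping swipe
    // directions by 90 degrees to match the rotated UI.
    func moveFocus(to view: UIView) {
        currentTarget = view
        setNeedsFocusUpdate()   // schedule a focus re-evaluation
        updateFocusIfNeeded()   // apply it immediately
    }
}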
Overriding sendEvent(_ event: UIEvent) at the UIApplication or UIWindow level to transform input events into portrait coordinates before passing them on looks great on paper and almost works, except that it is impossible to generate or modify touch events programmatically through public API. Since you are not concerned about App Store viability, though, maybe you can find a hack to do it anyway: subclassing UITouch/UIPress, or using performSelector to call private or undocumented methods?
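For reference, a minimal sketch of that interception point (observation only - public API gives no way to rewrite a UITouch's location before forwarding it):

import UIKit

class PortraitWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if let touches = event.allTouches {
            for touch in touches {
                // Read-only: the location in the unrotated (landscape) frame.
                let location = touch.location(in: self)
                print("raw touch at \(location)")
            }
        }
        super.sendEvent(event)  // forward the event unmodified
    }
}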

iPhone X home indicator behavior

I am trying to emulate a behavior of the home indicator on iPhone X but can't figure out how. In some apps the home indicator goes dim, and you have to swipe it once to restore normal behavior. I have found a UIViewController property to hide the indicator, but that isn't what I am looking for. In Clash Royale and Clash of Clans, for example, the home indicator dims; when you swipe up on it the indicator gets brighter, and if you swipe again it activates Home. Hiding the indicator with the API I found just makes it behave strangely.
This is the API I am using, but it doesn't behave like what I have seen in other apps. With auto-hide on, the indicator disappears; when you swipe, it immediately invokes the home action. That is no good, because the whole point is to prevent inadvertent swipes from going to the home screen:
override var prefersHomeIndicatorAutoHidden: Bool {
    return true
}
The behavior I want is for the indicator to dim, then activate (get brighter) on a first swipe up without going home, and only trigger Home on a second swipe. This behavior is consistent across Supercell apps, but perhaps it isn't built in.
To see the difference, compare one of those Supercell apps (on an iPhone X) with an app that only sets the property.
I researched the question more and finally found the answer in this article: iPhone X: Dealing with Home Indicator
The key part is below (I changed the article's .top to .bottom, since that is where the home indicator lives):
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return .bottom
}
What that does is defer the home action: the user performs the gesture once to activate the home indicator, then a second time to actually invoke Home. Ironically, now that I have found this I probably won't use it; I will just leave enough extra room at the bottom. My problem isn't with the gesture but with the indicator covering my content (which probably needs a UI update, but I don't have time for that now).
Hopefully, someone else will find this useful since this behavior is pretty cool but difficult to discover.

Get rid of pull-down arrows from top and bottom of iPhone screen

I am developing an app with lots of gestural interaction; there are interactive touch areas in all parts of the screen. The app is an interactive synthesizer, not some picture-sharing network that tightly follows the Human Interface Guidelines.
Whenever I interact with a gesture input near the top or bottom of the screen, the arrows that signal the OS pull-down screens appear. Is there any way to turn this off?
The short answer is that this is not possible.
The best you can do is warn users and ask them to go to Settings and turn off Control Center access within apps.

How to increase tappable area of top navbar buttons in iOS PhoneGap/Cordova apps

I have been developing hybrid apps on iOS, and the most glaring problem I am having is that the button emulating the native back button on the top navbar has a much smaller tappable area.
This may be because the button sits on the top edge of the screen, where the webview doesn't interpret taps as intended for the webview - perhaps they are swallowed by the status bar.
I have even enlarged the padding on the button element to the point where it takes up the whole top-left corner of the screen, and it still won't register a tap unless you are aiming at least 3.5mm below the top of the webview. In a native app you can aim 0mm away from the edge and it registers.
This may not seem that bad; however, to a long-term iOS user that 3.5mm is very apparent, and their mental model of where a touch should register makes them immediately think the app is broken rather than that they tapped the wrong area.
I am interested in any other information regarding ways to minimize this discrepancy between native and hybrid, or proposed solutions/information leading to a better understanding on why this occurs.
I am using Cordova / PhoneGap and Kendo Mobile to implement the app.

Qt screen orientation change

I'm using the Qt 5.1 beta on iOS and deploying my app on an iPad.
The problems I am having concern how touch events are sensed and processed by Qt. As long as I keep the iPad oriented straight (i.e. front camera up), everything works fine. In that configuration, if I touch the screen, the coordinates of the touch point sensed through mousePressEvent(QMouseEvent *e) indicate that, as expected, the origin of the coordinate system is in the upper left corner of the screen.
When I turn my iPad, say to the left so that the camera is on the left, my UI correctly rotates so that my buttons stay aligned with the screen. However, the origin of the touch coordinate system does not change, so it is now in the lower left corner of the screen. Because of this, the buttons' touch areas behave as if they had rotated along with the device (even though the buttons are rendered aligned): tapping a visible button doesn't register, but tapping where it would be had the UI not rotated does.
Does anyone have any idea what might cause this or how it could be fixed?
I haven't included any code because I don't know what would be of interest and my app is already quite big - please ask if you'd like to see some.
Thanks,
Corneliu
File a bug report about it with the Qt developers (the "trolls"), and check first whether one already exists:
http://bugreports.qt-project.org/
In the meantime, you could push your coordinates through a mapping function based on the orientation of the device, as sketched below.
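For instance, a minimal C++ sketch for the "turned left" case you describe, where the reported origin ends up at the physical lower-left (mapToRotatedUi and portraitWidth are hypothetical names; verify the axes against your device before relying on it):

#include <QPointF>

// Remap a point reported in the unrotated frame (origin at the physical
// lower-left, x pointing up, y pointing right) into the rotated UI's
// frame (origin at the physical upper-left, x right, y down).
// portraitWidth is the screen's original portrait width.
QPointF mapToRotatedUi(const QPointF &p, qreal portraitWidth)
{
    return QPointF(p.y(), portraitWidth - p.x());
}

You could apply this in your event handlers before hit-testing, switching between this mapping and the identity based on the orientation reported by QScreen::orientation().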
Also, calling adjustSize() on widgets tends to fix sizing and positioning with layouts.
Hope that helps.
