tvOS focus engine on a 90 degree rotated TV? - ios

Our company is using a TV in portrait orientation hooked up to an Apple TV running our own custom app to serve as a status board. This is purely an internal, hacked-together app - no worries about sending to the App Store.
To avoid things being rendered sideways, we have a base class view controller doing a 90 degree CGAffineTransform on the view (and all other view controllers in the project inherit from this base class):
class PortraitViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.transform = CGAffineTransform(rotationAngle: -1 * CGFloat.pi / 2)
    }
}
This works great for showing images, text, videos, custom UI controls, etc. However, the focus engine does not rotate with the view; because it still assumes the TV is in landscape orientation, the Apple TV remote gestures end up 90 degrees off from what we want. Here's an example:
Here, we would want swiping right/left on the remote to move the focus of the segmented control between the two segments. But because the Apple TV thinks it's being shown in landscape mode, it thinks the segmented control is oriented vertically, and swiping up/down moves the focus of the segments.
Does anyone know if there's a way to convince the focus engine not to rotate alongside the view, or alternatively, a different way to display the view in portrait mode without rotating the view?

This is terrible, but the only way I can think of to achieve what you want is to take over the focus system entirely: make your root window/view focusable so it hijacks all input events, listen to the raw input events yourself, and manage your own focus state from those raw touch/press events as needed. You can still use preferredFocusEnvironments + setNeedsFocusUpdate to leverage all of the system UI and animation for focusing/unfocusing elements, but you would need to take full ownership of how focus shifts in response to user input and of forwarding focus update hints for parallax effects.
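A minimal sketch of that direction, in the spirit of the answer above: the focusTargets array, the index-based navigation, and the specific swipe-to-direction mapping are all illustrative assumptions, and you would still need to keep the root view focusable so it keeps receiving remote gestures.

// Sketch only: a hypothetical manual focus manager for the rotated UI.
// focusTargets and the simple index-based navigation are illustrative
// assumptions, not part of the original answer.
class ManualFocusViewController: PortraitViewController {

    // Controls we manage focus for, in visual (portrait) order.
    var focusTargets: [UIView] = []
    private var focusedIndex = 0

    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        // Steer the system focus engine toward whichever element we chose.
        guard focusTargets.indices.contains(focusedIndex) else {
            return super.preferredFocusEnvironments
        }
        return [focusTargets[focusedIndex]]
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Listen to raw swipes from the remote so we can remap them 90 degrees.
        for direction: UISwipeGestureRecognizer.Direction in [.up, .down, .left, .right] {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc private func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
        // The remote reports directions in landscape space; rotate them into
        // portrait space. With the -90 degree transform, a landscape "up"
        // swipe corresponds to moving one way in the rotated UI (adjust the
        // mapping to match your actual rotation and layout).
        switch gesture.direction {
        case .up:   moveFocus(by: -1)
        case .down: moveFocus(by: 1)
        default:    break // handle .left/.right if your layout needs them
        }
    }

    private func moveFocus(by delta: Int) {
        let newIndex = focusedIndex + delta
        guard focusTargets.indices.contains(newIndex) else { return }
        focusedIndex = newIndex
        // Ask the focus engine to re-read preferredFocusEnvironments and animate.
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
    }
}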
Overriding sendEvent(_ event: UIEvent) at the UIApplication or UIWindow level to transform input events into portrait coordinates before passing them on to the system seems like a great idea on paper, and it almost works - except that there is no public API to generate or modify touch events programmatically. Since you are not concerned about App Store viability, though, maybe you can find a hack to generate or modify touch events: subclassing UITouch/UIPress, or using performSelector to call private or undocumented methods?
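For reference, the interception point itself looks roughly like this (a sketch only; it shows where you would hook in, not how to rewrite the touches, since that part has no public API):

// Sketch only: a window subclass that sees every event before UIKit does.
// Rewriting the touch coordinates here would require private API.
class PortraitWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            // You can inspect the touches here, but UITouch exposes no public
            // setter for its location, so you cannot rotate them before
            // forwarding them to the rest of the system.
            _ = touches
        }
        super.sendEvent(event)
    }
}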

Related

Disable iOS Reachability Swipe Gesture in iOS game

I have an iOS game with a few controls near the bottom of the screen that may be swiped. When a player swipes down and their finger slides off the bottom of the screen, the Reachability accessibility gesture is also triggered. This slides the screen down, moving those controls off the page and hiding half of the game. Obviously this is not the player's intention, and it forces them to be very precise with their swipes, which isn't very intuitive or fun.
On the rounded-corner iPhones, the controls sit roughly 100pt from the bottom of the screen to leave space for the home indicator, which helps prevent this issue in many situations, but on squared devices they are much closer, at about 10pt:
In my rudimentary testing, I've discovered that even if a swipe starts as high as 300pt up the screen and continues all the way to the bottom edge, Reachability will still be triggered. So raising my controls higher isn't a solution, since that puts them dead center on the screen (blocking the focus of the game) and out of comfortable reach on some phones.
Since Reachability doesn't have any use in my game (there are no controls in the upper third of the screen for the purpose of keeping your hand(s) in the lower part of the screen) I'd really like a way to prevent this. Ideally, some way to inform the system it is unnecessary during gameplay, so I can allow it during non-gameplay menus - but I may be dreaming with that part.
I also don't think it's a great solution to ask a user to disable this system-wide, as the conflict is my app's, and doing so would require them to change their behavior everywhere else.
Is there any guidance, examples, or advice on how to handle conflicts with this specific accessibility gesture?
You do not want to disable it, you want to defer it.
See https://developer.apple.com/documentation/uikit/uiviewcontroller/2887512-preferredscreenedgesdeferringsys
To use this, override preferredScreenEdgesDeferringSystemGestures and return the screen edges whose system gestures you want deferred.
In your case:
override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
    return [.bottom]
}
Now if you are doing this in a dynamic fashion, you are also going to need to call setNeedsUpdateOfScreenEdgesDeferringSystemGestures() to notify iOS that your rules are changing.
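A sketch of the dynamic version, assuming a hypothetical isPlaying flag that your game toggles when gameplay starts and stops:

class GameplayViewController: UIViewController {

    // Hypothetical flag; set it when entering and leaving gameplay.
    var isPlaying = false {
        didSet {
            // Tell iOS that the deferral preference may have changed.
            setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        }
    }

    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        // Only defer the bottom-edge system gesture while gameplay is active,
        // so the gesture still behaves normally in the menus.
        return isPlaying ? [.bottom] : []
    }
}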

Can a UIViewController opt-out, access or influence multitasking modes on iPad?

iOS 9 introduced multitasking to the iPad. These modes consist of:
Slide over
Split view
Picture in picture
In these questions, I'm not considering picture in picture (PiP) as that is a niche case which only applies to video playback from a limited number of sources - whereas slide over and split view apply to all view controllers in the app.
Apple's documentation is quite light in this area, and things get particularly complicated when developing frameworks that get integrated into other people's apps, over which I have no control and for which I cannot opt out by enabling "Requires Full Screen".
1. Is there a way for a UIViewController itself to declare that it requires full screen? - I know that an app can require full screen, but can this be set per view controller? (...having said that, I'm not sure what would happen in split view when that view controller was displayed...? Or perhaps it could just stop the splitting occurring whilst that view controller is on-screen...?)
2. From an app which is in split view/slide over, is it possible to present a view controller full screen?
3. Is there a high-level API to detect whether the app is currently in split view/slide over or full screen? - I mean, yes, I could check the view dimensions and compare them to the actual screen dimensions (a rough sketch of that check follows this list), but that seems clunky - is there an easy way to do this?
4. Is there a way to prevent slide over overlaying the UIViewController, or at least detecting whether there is currently a slide over in place?
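For what it's worth, a minimal sketch of the size-comparison check mentioned in question 3 - a heuristic only, not an official multitasking API:

import UIKit

extension UIViewController {
    // Heuristic only: treat the app as full screen when its window covers
    // the whole screen; otherwise assume Split View or Slide Over.
    var isRunningFullScreen: Bool {
        guard let window = view.window else { return true }
        return window.bounds.size == window.screen.bounds.size
    }
}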

How should I change my OpenGL app to use the iOS 11 safe area?

I maintain an OpenGL app that's been running on iOS since 2010. It uses the full screen and hides the status bar. It launches without any .nib file and creates an OpenGL view & controller that, in turn, displays all app content.
What changes do I need to make so the app will work on iPhone X using the new 'safe area' layout design? Presumably the only real change is just creating my "EAGL" surface/view with the same dimensions and location as the safe area instead of the entire screen?
How you respect the safe area in a "fullscreen" app (like most GL, Metal, etc games) is really two questions: one of design, and one of implementation. (But it's easier to tackle them in the reverse of that order, so here goes...)
Making fullscreen OpenGL views
If you have a fullscreen view (e.g. the window's root view controller) and you just set its layerClass to CAEAGLLayer (as is par for the course in most OpenGL ES work), you get a view that covers the entire 1125 x 2436 rectangle of the iPhone X screen. (Be sure to set the scale, too, so you actually get all those pixels... 375 x 812 @ 1x scale probably looks hideous on that screen.)
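A minimal sketch of that kind of view, assuming the rest of your existing EAGL context/framebuffer setup stays as it is:

import UIKit
import QuartzCore

// Sketch of a fullscreen GL-backed view; the framebuffer/context setup
// from your existing code is omitted.
class FullScreenGLView: UIView {

    // Back the view with a CAEAGLLayer so OpenGL ES can render into it.
    override class var layerClass: AnyClass {
        return CAEAGLLayer.self
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Match the screen's native scale so you render all 1125 x 2436
        // pixels instead of a blurry 375 x 812 buffer at 1x.
        contentScaleFactor = UIScreen.main.nativeScale
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}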
That's probably the user experience you want for your app/game (and it's the one Apple encourages)... your 3D content extends all the way to the edges of the screen, around the curves at the bottom and the 🤘 at the top. That makes a much nicer UX than leaving black borders around all your content.
Designing fullscreen content for iPhone X
On the other hand, the existing design of how your OpenGL content appears may or may not fit well with the curiously shaped screen of iPhone X. If you have anything along the very top that the user is expected to see, it'll be obscured behind the camera/sensor/speaker cutout. Similarly, if you have anything important at the bottom, its edges will be cut off behind the curved corners.
In that case, you'll want to leave the unimportant parts of your fullscreen content (like a game's view of a 3D gameplay world) fullscreen, but inset any important content like UI overlays or interactive 3D elements. As for how you might do that, there's a couple of feasible approaches, with tradeoffs:
Hard-code the iPhone X obstruction dimensions, detect when you're running on iPhone X, and fix your layout accordingly. This is straightforward, but not robust. If Apple decides to change the way software UI elements around screen edges (like the swipe-to-home indicator) work, or makes iPhone XI (or X2? or XX?) next year with a slightly different shape, you'll need to update again to adapt.
Use the Safe Area guides even though you're not using UIKit or Auto Layout to draw/position onscreen content. Ask the view for its safeAreaLayoutGuide and convert that guide's layoutFrame to whatever coordinate system you use for positioning the content you draw with OpenGL. This is a little more work, but it ensures that your app is ready for any curveballs Apple throws in the future.
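A sketch of that second option, assuming you position overlays in pixel coordinates with a lower-left origin (the usual GL convention); adapt the conversion to whatever coordinate system your renderer actually uses:

import UIKit

// Converts the view's safe area into a pixel rectangle with a lower-left
// origin, suitable for placing GL-drawn UI overlays.
func safeAreaRectInPixels(for view: UIView) -> CGRect {
    let guide = view.safeAreaLayoutGuide.layoutFrame   // points, UIKit coordinates
    let scale = view.contentScaleFactor
    let viewHeightInPixels = view.bounds.height * scale
    return CGRect(
        x: guide.minX * scale,
        // Flip vertically: UIKit's origin is top-left, GL's is bottom-left.
        y: viewHeightInPixels - guide.maxY * scale,
        width: guide.width * scale,
        height: guide.height * scale
    )
}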
One more thing...
It uses the full screen and hides the status bar.
When designing for iPhone X, it's worth rethinking whether a "fullscreen" app should hide the status bar. On other iOS devices, showing the status bar means taking away useful space from your app's content. But on iPhone X, most apps don't have anything useful they can do with those "devil horn" corners anyway — your user might appreciate still being able to see the clock, battery, etc.

Hybrid application with iPhone?

I want to create a simple game, and as I understand it OpenGL will make that happen - but could I build the menu, high score list, and everything except the game itself with regular UIKit views in Xcode?
For instance, on Windows Phone (where I'm coming from) you can mix XAML and DirectX: the menu in XAML/C# and the game itself in DirectX.
Yes. The main view element in iOS is UIView, and you can use one to present OpenGL content. That means you can overlay it with any other views, add subviews, put it in a superview, or have multiple views with OpenGL content, and all the usual events such as touches still work. In summary, rendering OpenGL into an iOS UIView simply replaces the visual content of that view and leaves the rest of its functionality as it is.
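As a rough illustration of that layering (using GLKView here just for brevity; the control names and layout are made up):

import UIKit
import GLKit

class GameViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // The OpenGL ES view that renders the game itself.
        let context = EAGLContext(api: .openGLES2)!
        let glView = GLKView(frame: view.bounds, context: context)
        glView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(glView)

        // Ordinary UIKit controls layered on top for the menu / high scores.
        let menuButton = UIButton(type: .system)
        menuButton.setTitle("Menu", for: .normal)
        menuButton.frame = CGRect(x: 20, y: 40, width: 120, height: 44)
        view.addSubview(menuButton)
    }
}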

Airplay: Mirror subview on external window

A design/architectural question on AirPlay.
I have setup an external display in the AppDelegate:
UIScreen *externalScreen = UIScreen.screens.lastObject;
CGRect externalScreenFrame = externalScreen.bounds;
self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreenFrame];
self.externalWindow.screen = externalScreen;
self.externalWindow.backgroundColor = [UIColor redColor];
Works fine, TV shows an empty screen in red.
Now I have a ViewController with a bunch of subviews, and one view should be shown on the device and the external screen. If I try this in ViewController.m:
[_appDelegate.externalWindow addSubview:self.deviceAndTVView];
deviceAndTVView will only show on the external screen, not on the device anymore.
What I would need is to have deviceAndTVView on the device, updating itself on touches / user interaction, and mirror those updates on the external screen.
What is the right path to accomplish that?
Thanks for reading!
m
The technology called AirPlay mirroring is poorly named. It actually operates in two modes: one where the entire iOS device is mirrored to the AirPlay device, and another where, once the mirroring AirPlay device is connected, the developer has a second UIWindow/UIScreen to work with.
You are using the latter mode, which is often referred to as "mirroring", but really you have a completely separate window/screen to manage; there should be better terminology for this mode of operation.
What you describe doing above is basically moving a UIView from the device window to the AirPlay window, and it's working exactly as it should!
There is no technical way for you to have a single instance of a UIView show on both of these windows - it will exist in one UIView hierarchy or the other, but not both at the same time. In other words, if you want the same thing to show on both screens, you need to create two instances of the same UIView, and add them respectively to the two windows, and then update both of them as they change.
While this may not be the super-convenient "mirroring" you were expecting, it's probably a good thing, because your UIView may have a different aspect ratio on the device than it does on the AirPlay screen. By having two different views showing the same content, you can adjust the sizing of the AirPlay view to make the best use of that screen's available resolution.
I can think of a couple of ways of doing this. Have you looked at using KVO for this? Both the local and external views could observe whatever model or controller is driving their content.
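A sketch of the KVO idea in Swift, with a made-up ScoreModel: you would create one ScoreView for the device window and one for the external window, and both stay in sync by observing the same model.

import UIKit

// Hypothetical shared model; both screens observe it.
class ScoreModel: NSObject {
    @objc dynamic var score = 0
}

class ScoreView: UIView {
    private let label = UILabel()
    private var observation: NSKeyValueObservation?

    init(frame: CGRect, model: ScoreModel) {
        super.init(frame: frame)
        label.frame = bounds
        label.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        addSubview(label)
        // Each instance (device window and external window) updates itself
        // whenever the shared model changes.
        observation = model.observe(\.score, options: [.initial, .new]) { [weak self] model, _ in
            self?.label.text = "Score: \(model.score)"
        }
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}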
