A design/architectural question on AirPlay.
I have set up an external display in the AppDelegate:
UIScreen *externalScreen = UIScreen.screens.lastObject;
CGRect externalScreenFrame = externalScreen.bounds;
self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreenFrame];
self.externalWindow.screen = externalScreen;
self.externalWindow.backgroundColor = [UIColor redColor];
self.externalWindow.hidden = NO; // a UIWindow is not shown until it is unhidden
Works fine; the TV shows an empty red screen.
Now I have a ViewController with a bunch of subviews, and one view should be shown on both the device and the external screen. If I try this in ViewController.m:
[_appDelegate.externalWindow addSubview:self.deviceAndTVView];
deviceAndTVView then shows only on the external screen, and no longer on the device.
What I would need is to have deviceAndTVView on the device, updating itself on touches / user interaction, and mirror those updates on the external screen.
What is the right way to accomplish that?
Thanks for reading!
m
The technology called AirPlay mirroring is poorly named. It actually operates in two modes: one where the entire iOS device is mirrored to the AirPlay device, and another where, once the mirroring AirPlay device is connected, the developer has two UIWindow/UIScreen pairs to work with.
You are using the latter mode, which is often referred to as "mirroring", but really you have a completely separate window/screen to manage; there should be better terminology for this mode of operation.
What you describe doing above is basically moving a UIView from the device window to the AirPlay window, and it's working exactly as it should!
There is no technical way for you to have a single instance of a UIView show on both of these windows - it will exist in one UIView hierarchy or the other, but not both at the same time. In other words, if you want the same thing to show on both screens, you need to create two instances of the same UIView, and add them respectively to the two windows, and then update both of them as they change.
While this may not be the super-convenient "mirroring" you were expecting, it's probably a good thing, because your UIView may have a different aspect ratio on the device than it does on the AirPlay device. By having two different views showing the same content, you can adjust the sizing of the AirPlay view to make the best use of the external display's available resolution.
I can think of a couple of ways of doing this. Have you looked at using KVO? Both the local and external views could observe whatever model or controller is driving their content.
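For example, here is a minimal Swift sketch of the KVO approach; the DrawingModel class, its strokeCount property, and the bind(to:) helper are all hypothetical names for illustration, not an existing API:

import UIKit

// Hypothetical shared model; one instance drives both screens.
class DrawingModel: NSObject {
    @objc dynamic var strokeCount = 0
}

class MirroredView: UIView {
    private var observation: NSKeyValueObservation?

    func bind(to model: DrawingModel) {
        // Each view instance redraws whenever the shared model changes.
        observation = model.observe(\.strokeCount, options: [.initial, .new]) { [weak self] _, _ in
            self?.setNeedsDisplay()
        }
    }
}

// One view per window, same model behind both:
// let model = DrawingModel()
// deviceView.bind(to: model)    // added to the device window
// externalView.bind(to: model)  // added to appDelegate.externalWindow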
Our company is using a TV in portrait orientation hooked up to an Apple TV running our own custom app to serve as a status board. This is purely an internal, hacked-together app - no worries about sending to the App Store.
To avoid things being rendered sideways, we have a base class view controller doing a 90 degree CGAffineTransform on the view (and all other view controllers in the project inherit from this base class):
class PortraitViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.transform = CGAffineTransform(rotationAngle: -CGFloat.pi / 2)
    }
}
This works great for showing images, text, videos, custom UI controls, etc. However, the focus engine does not rotate with the view, and because it still assumes the TV is in landscape orientation, the Apple TV remote gestures end up 90 degrees off from what we want. Here's an example:
Here, with a segmented control on screen, we would want swiping right/left on the remote to move focus between the two segments. But because the Apple TV thinks it is displaying in landscape, it treats the segmented control as vertically oriented, and swiping up/down moves focus between the segments instead.
Does anyone know if there's a way to convince the focus engine to rotate along with the view, or alternatively, a different way to display the view in portrait mode without rotating it?
This is terrible, but the only way I can think of to achieve what you want is to take over the focus system entirely: make your root window/view focusable so it hijacks all input, listen for the raw touch/press events, and manage your own focus system from those events as needed. You can still use preferredFocusEnvironments plus setNeedsFocusUpdate to leverage all the system UI and animation for focusing/unfocusing elements, but you would need to take full ownership of how focus shifts in response to user input and how focus-update hints are forwarded for parallax effects.
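A rough Swift sketch of that approach, assuming a flat list of focusable views; focusableViews, focusedIndex, and moveFocus(by:) are hypothetical names, and touchpad swipes (as opposed to edge clicks, which arrive as arrow presses) would still need their own gesture-recognizer handling:

import UIKit

class FocusHijackingViewController: UIViewController {

    // Our own ordered list of focusable children (hypothetical bookkeeping).
    var focusableViews: [UIView] = []
    private var focusedIndex = 0

    // Route system focus to whatever we currently consider focused.
    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        focusableViews.isEmpty ? [] : [focusableViews[focusedIndex]]
    }

    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        guard let press = presses.first else { return }
        // The UI is rotated -90°, so "up/down" on the remote corresponds
        // to "left/right" in our portrait layout.
        switch press.type {
        case .upArrow:   moveFocus(by: -1)
        case .downArrow: moveFocus(by: +1)
        default:         super.pressesBegan(presses, with: event)
        }
    }

    private func moveFocus(by delta: Int) {
        let next = focusedIndex + delta
        guard focusableViews.indices.contains(next) else { return }
        focusedIndex = next
        // Re-query preferredFocusEnvironments so the system focus
        // appearance and animations still do the heavy lifting.
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
    }
}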
Overriding sendEvent(_:) at the UIApplication or UIWindow level to transform input events into portrait coordinates before passing them on to the system seems like a great idea on paper, and it almost works, except that it is impossible to generate or modify touch events programmatically with public API. Since you are not concerned about App Store viability, though, maybe you can hack together a way to generate or modify touch events: subclassing UITouch/UIPress? Using performSelector to call private or undocumented methods?
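For reference, a sketch of where that hook lives; with public API you can only observe events here, not rewrite UITouch coordinates, which is exactly where the approach stalls:

import UIKit

class InterceptingWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        if let touches = event.allTouches {
            for touch in touches {
                // Read-only: we can inspect the landscape-space location...
                _ = touch.location(in: self)
                // ...but there is no public setter to remap it to portrait.
            }
        }
        super.sendEvent(event)
    }
}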
I have 2 views in my app: my root view has a button. When you tap that button, you get sent to another view. In that view, I just want to display a completely black screen, as if the iPhone were turned off. What object in Interface Builder should I use for this? Sorry if it's a beginner question.
I don't think it's possible to turn the LEDs all the way off. You can, however, make it look like they are off by turning the brightness all the way down in code, like this:
[[UIScreen mainScreen] setBrightness:0.0];
If you enable proximity monitoring, the OS will turn the screen off whenever the proximity sensor is covered.
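A one-line Swift sketch of that, using UIDevice's documented proximity API:

import UIKit

// With proximity monitoring on, the OS blanks the screen whenever the
// proximity sensor is covered (e.g. the phone is held to the ear).
UIDevice.current.isProximityMonitoringEnabled = true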
Otherwise, the only way you can turn it off is by calling undocumented methods (and risking App Store rejection).
Turn off display in iPhone OS (iOS)
As the title describes, I am having a big "what the * is this" moment with my app at the moment. It seems I can't get any control over the orientation in the different slides.
I can only manage the orientation in one way, via the Info.plist file. The problem is that the Info.plist sets the orientation for the whole app, and I am not interested in that. In some slides I want to allow landscape left/right and in others only portrait, and that is not doable via the Info.plist.
I have tried my best to understand the problem, but I can't say I have had any big "aha moment" so far. I am using a UINavigationBar and a tab bar in my iOS app, which may be part of the problem. How can I make the app listen to the code in each file so I can manage the orientation locally?
Are you using iOS 6? If so, the -shouldAutorotateToInterfaceOrientation: method was deprecated.
You now have to override -supportedInterfaceOrientations and -preferredInterfaceOrientationForPresentation methods in order to manage screen orientation. You can do this globally or within individual view controllers.
See the UIViewController class reference for more details.
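A minimal Swift sketch of the per-controller version (modern Swift exposes these as overridable properties rather than methods). Note that when a controller lives inside a UINavigationController or UITabBarController, the system asks the container for its supported orientations, so the container may need to forward these values:

import UIKit

// One screen locked to portrait; another controller could return
// .landscape instead, so each screen manages its own orientation.
class PortraitOnlyViewController: UIViewController {
    override var shouldAutorotate: Bool { true }
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask { .portrait }
    override var preferredInterfaceOrientationForPresentation: UIInterfaceOrientation { .portrait }
}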
I am planning to use a couple of picker controls and segmented controls (as a kind of control panel) embedded in a container view controller to control the contents of a second child view controller. However, according to Apple's iOS Human Interface Guidelines, on the iPad a picker may not be presented on the main screen and must be presented in a popover:
On iPad, present a picker only within a popover. A picker is not suitable for the main screen.
How strict is this rule?
Would it be ok in my case to have the picker on the main screen in order to provide the required interactivity?
Clearly the answer to your question really depends on Apple, not on anything we developers might think or say. Your screen design looks very reasonable to me, but the issue is really whether Apple will approve it.
It might be better to ask if anyone has had an app approved with pickers outside of popovers.
Alternatively, you could ask if anyone has had an app rejected for using pickers outside popovers.
Apple rules and common sense have a high degree of overlap, but where they differ the only thing that matters is Apple. Either ask them, or just submit your app for approval and see.
Good luck!
Here is my experience regarding pickers.
I had an app (say appAA) approved which used picker views in a modal view: three pickers, one after another.
I had another app (say appBB) which basically copied the modal view from appAA. This app was rejected because of a functional problem that I could not reproduce. I disputed it in the Resolution Center, and then the reviewer rejected it again with an additional reason, saying the pickers were not presented in a popover!
I guess I will have to put the pickers into a popover if I want appBB to be approved, since the reviewer mentioned it.
I think the answer is simply: how ugly does it make the UI look?
The left/right edges of the picker were originally intended to be flush with the edges of the iPhone screen (and it was hardcoded to be 320 px wide, which failed to work sensibly in landscape mode). On the iPad, the popover lets it sit flush with the popover's frame. The obvious way to achieve this effect without using a popover is to draw your own frame, instead of leaving a flat grey background between the pickers.
However, the easiest (and, IMO, better-looking) solution is to take the picker background (as a 1 px-wide image) and use a UIImageView to stretch it behind all the pickers so they look like part of one continuous bar. Remember to check both retina and non-retina versions.
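Something like this Swift sketch, where "picker-background" is a hypothetical 1 pt-wide slice of the picker's background art and the frame values are illustrative:

import UIKit

// Stretch the thin slice horizontally behind a row of pickers so they
// read as one continuous bar.
let background = UIImageView(image: UIImage(named: "picker-background"))
background.frame = CGRect(x: 0, y: 100, width: 480, height: 216) // spans all pickers
background.contentMode = .scaleToFill
// containerView.insertSubview(background, belowSubview: leftmostPicker)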
I'm refactoring one of my apps from the iPhone to the iPad, and this has resulted in the removal of tabs, as I've been able to combine functionality onto one screen and use popovers to let the user select things that previously required a new tab.
I'm basically left with 2 tabs now. One (best viewed in landscape) shows a map of the world with some overlays drawn on it, plus an indication of where you are. The second is a data display with a few graphs, which is best viewed in portrait.
I note what Apple says about requiring apps to run in all orientations on the iPad, and of course I could do this and keep my 2 tab bar buttons to switch views.
HOWEVER
In this case, there is one view that is best suited to landscape and one that is best suited to portrait. Would it be appropriate (or even permissible to Apple) UX design to drop the tab bar and switch views on an orientation change instead?
From a user perspective, you wouldn't need to switch back and forth much; you tend to use the landscape view to change location (if you need to) and then work mainly in the portrait view - so I don't think it would be frustrating, and dropping the tabs seems to make more sense to me.
What do you think? Any best practice in these situations?
Roger
London
I would say that the best practice is not to restrict the orientation of views.
The central idea here is not to force the user to hold the device a certain way. For example, a lot of people use iPads in a stand or holder and input with a keyboard. Do you want to force your users to stop and physically adjust the device in the holder/stand before they can read the view in the other orientation? Other people simply prefer holding the device one way or the other and lock the orientation (I do that a lot). Forcing users to change from their preferred device orientation won't win you happy customers.
Apple will not penalize you for a non-standard UI unless it reflects badly on the device itself. As long as end users can tell it's your app's non-standard behavior, Apple does not care. However, in my experience, end users tend to interpret non-standard interfaces as flawed or broken because they don't understand them.
In this case, if I launch your app for the first time, how am I going to know that changing orientation switches to another view altogether, with another data set? Nothing in the standard UI teaches me to expect that. I would have to discover it by trial and error. If I have the orientation locked, not even trial and error will help. At that point, I might well conclude that the app is broken.
You could try adding instructions, but just the thought that they might be necessary is a red flag for a potentially poor UI.