I've created an iPad app that contains a slideshow; when the user taps the slideshow, they can enter some information.
What I'd like to do now is display the slideshow contents on a TV by connecting the TV and iPad through AirPlay (or a cable if possible, but that only seems to mirror things).
Can this be done? Can the slideshow run on both the TV and the iPad, so that when the user taps the slideshow on the iPad the credentials input screen appears on the iPad, while the TV keeps showing the underlying slideshow rather than the credentials?
How can this be done in iOS? Is it possible to display only a portion of the application on the TV, rather than mirroring the entire application?
You can write the app to handle two UIScreens using AirPlay and an Apple TV, then set a separate root view controller for the TV's UIScreen and for the iPad's UIScreen. Then display the image or slideshow in the TV's view controller and drive it from the events of your iPad's view controller!
AMENDED AFTER CLIF'S COMMENT:
So, first, in your app delegate's didFinishLaunchingWithOptions (or didFinishLaunching), set up a notification observer so you are told when a screen connects:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(screenDidConnect:) name:UIScreenDidConnectNotification object:nil];
Then you need to keep a reference to your separate window and push controllers to it as you would any other window.
- (void)myScreenInit:(UIScreen *)connectedScreen
{
    // Initialise the TV window the first time an external screen appears.
    if (!windowTV)
    {
        windowTV = [[UIWindow alloc] initWithFrame:connectedScreen.bounds];
        windowTV.backgroundColor = [UIColor clearColor];
        windowTV.screen = connectedScreen;
        windowTV.hidden = NO;
    }
}

- (void)setTvController:(UIViewController *)mynewViewController
{
    // Assigning a new root view controller replaces (and releases) the old one.
    windowTV.rootViewController = mynewViewController;
}

- (void)screenDidConnect:(NSNotification *)notification
{
    [self myScreenInit:[notification object]];
    // Now push whatever you want on the TV via -setTvController:
}
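To answer the original question about keeping the slideshow on the TV while the credentials form appears only on the iPad, a rough sketch might look like this (SlideshowViewController and CredentialsViewController are placeholder names for illustration, not classes from the question):

- (void)showSlideshowOnTV
{
    // Called once the external screen has connected and windowTV exists.
    [self setTvController:[[SlideshowViewController alloc] init]];
}

- (void)slideshowTappedOnIPad
{
    // Present the credentials form from the iPad window's root view controller.
    // windowTV.rootViewController is left untouched, so the TV keeps the slideshow.
    CredentialsViewController *credentials = [[CredentialsViewController alloc] init];
    [self.window.rootViewController presentViewController:credentials
                                                  animated:YES
                                                completion:nil];
}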
There appears to be a bug in iOS 5.0 which makes this tricky: you have to enable mirroring from the running task bar (scrolling all the way to the left) before a second screen is detected via the API. I've posted details in my question here: How to use iOS 5+ AirPlay for a second screen
I am using a custom AVPlayerLayer to display a simple video. I am trying to add AirPlay support, but the button, when tapped, does not show anything.
self.player.allowsExternalPlayback = true
...
let airplayButton = MPVolumeView(frame: self.airplayButtonPlaceholder!.bounds)
airplayButton.showsRouteButton = true
airplayButton.showsVolumeSlider = false
self.airplayButtonPlaceholder?.addSubview(airplayButton)
self.airplayButtonPlaceholder?.backgroundColor = UIColor.clear
When I run my code (on a real device), I see the button, but when I tap on it, nothing happens. What could be causing this? Is it because I am using custom AVPlayerLayer and AVPlayer?
EDIT:
When I turn on mirroring through the control center, I can touch the button and it displays the pop up. What's going on?
Nothing happens because you haven't properly configured this "new window".
There are two ways to display content using AirPlay.
Mirroring
Doesn't need any configuration.
Note: You don’t need to do anything to make mirroring happen. In iOS 5.0 and later, mirroring—that is, displaying the same content on both the host device and the external display—occurs by default when the user selects an AirPlay video output device.
Extra Window
(check the Apple guide here)
The steps, as described by Apple, are:
At app startup, check for the presence of an external display and register for the screen connection and disconnection notifications.
When an external display is available—whether at app launch or while your app is running—create and configure a window for it.
Associate the window with the appropriate screen object, show the second window, and update it normally.
Here is the code taken from the apple docs for a quick reference.
- Create a New Window If an External Display Is Already Present
- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
        // Get the screen's bounds so that you can create a window of the correct size.
        CGRect screenBounds = secondScreen.bounds;

        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        // Show the window.
        self.secondWindow.hidden = NO;
    }
}
- Register for Connection and Disconnection Notifications
- (void)setUpScreenConnectionNotificationHandlers
{
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];

    [center addObserver:self selector:@selector(handleScreenDidConnectNotification:)
                   name:UIScreenDidConnectNotification object:nil];
    [center addObserver:self selector:@selector(handleScreenDidDisconnectNotification:)
                   name:UIScreenDidDisconnectNotification object:nil];
}
- Handle Connection and Disconnection Notifications
- (void)handleScreenDidConnectNotification:(NSNotification *)aNotification
{
    UIScreen *newScreen = [aNotification object];
    CGRect screenBounds = newScreen.bounds;

    if (!self.secondWindow)
    {
        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = newScreen;

        // Set the initial UI for the window.
    }
}

- (void)handleScreenDidDisconnectNotification:(NSNotification *)aNotification
{
    if (self.secondWindow)
    {
        // Hide and then delete the window.
        self.secondWindow.hidden = YES;
        self.secondWindow = nil;
    }
}
EDIT:
When using an AVPlayerViewController, AirPlay support is already implemented automatically, as described in the documentation here.
AVPlayerViewController automatically supports AirPlay, but you need to perform some project and audio session configuration before it can be enabled in your application.
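For reference, the audio session part of that configuration usually looks something like the sketch below; treat it as an assumption-based sketch rather than the exact steps from the docs. The background-mode part ("Audio, AirPlay, and Picture in Picture") is a project setting, not code.

#import <AVFoundation/AVFoundation.h>

// Sketch of the audio session setup commonly paired with AirPlay playback.
- (void)configureAudioSessionForAirPlay
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];

    // Playback category keeps audio going over AirPlay and in the background.
    if (![session setCategory:AVAudioSessionCategoryPlayback error:&error]) {
        NSLog(@"Failed to set audio session category: %@", error);
    }
    if (![session setActive:YES error:&error]) {
        NSLog(@"Failed to activate audio session: %@", error);
    }
}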
I am trying to change the screenshot that iOS automatically captures when the app enters the background.
However, I am not entirely sure exactly when this screenshot is taken.
For example:
If you pull down the notification bar while in the app, the following method is called:
- (void) applicationWillResignActive:(UIApplication *) application {
}
Also, if you double-tap the home button while in the app, the same method is called. In addition, if an alert view is shown, applicationWillResignActive is called.
But in both of these cases
- (void) applicationDidEnterBackground:(UIApplication *) application {
}
is not called.
So my question is: is a screenshot captured after the call to applicationWillResignActive even if the application does not enter the background? Or does iOS only capture a screenshot after applicationDidEnterBackground?
Look at the official doc - "Preventing Sensitive Information From Appearing In The Task Switcher".
It says applicationDidEnterBackground: should be used for this purpose.
You can look over here; in summary, change the view before returning from applicationDidEnterBackground:
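A minimal sketch of that approach, assuming an app delegate with a hypothetical coverView property:

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    // Cover the UI before iOS takes the task-switcher snapshot.
    // coverView is a hypothetical UIView property on the app delegate.
    self.coverView = [[UIView alloc] initWithFrame:self.window.bounds];
    self.coverView.backgroundColor = [UIColor whiteColor];
    [self.window addSubview:self.coverView];
}

- (void)applicationWillEnterForeground:(UIApplication *)application
{
    // Remove the cover as soon as the app comes back.
    [self.coverView removeFromSuperview];
    self.coverView = nil;
}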
Yes.
- (void) applicationWillResignActive:(UIApplication *) application {
}
is called when you pull down the notification bar or even when you double-click the home button. You have to do something here to prevent your sensitive information from being captured by the OS. One workaround might be:
Set a blurry screen overlay before the app goes in the background
Once the app becomes active remove this overlay
Something like this:
-(void)applicationWillResignActive:(UIApplication *)application
{
    imageView = [[UIImageView alloc] initWithFrame:[self.window frame]];
    [imageView setImage:[UIImage imageNamed:@"blurryImage.png"]];
    [self.window addSubview:imageView];
}
And then remove the overlay when the application becomes active again:
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    if (imageView != nil) {
        [imageView removeFromSuperview];
        imageView = nil;
    }
}
I need to display a popup or something like a splash screen every time I start my app. The content of this popup must be taken from an external web source (something like a jpg, png or pdf).
The purpose of this popup is to warn users about news and special offers. The popup should disappear after a certain time (or when a button is pressed).
From what I've read in other threads, UIPopoverController seems helpful for what I need (as I read in this class reference), but I'm afraid that class is mainly meant for presenting a choice in response to a button press.
Why can you not simply add a webview to the screen when the app opens?
Like:
in appDelegate:
UIWebView *popover;

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    UIWindow *win = [[UIApplication sharedApplication] keyWindow];
    popover = [[UIWebView alloc] initWithFrame:win.bounds];
    // ... load content ...
    [win addSubview:popover];
    [self performSelector:@selector(dismissPopover) withObject:nil afterDelay:3];
}

-(void)dismissPopover
{
    [popover removeFromSuperview];
}
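If the content is a plain image (jpg/png) rather than a page, a similar overlay can be built with a UIImageView. Here is a rough sketch; the method name and the 3-second delay are made up for illustration:

// Hypothetical sketch: fetch a remote image and show it as a temporary overlay.
- (void)showSplashFromURL:(NSURL *)imageURL inWindow:(UIWindow *)window
{
    UIImageView *splashImageView = [[UIImageView alloc] initWithFrame:window.bounds];
    splashImageView.contentMode = UIViewContentModeScaleAspectFill;
    [window addSubview:splashImageView];

    [[[NSURLSession sharedSession] dataTaskWithURL:imageURL
                                 completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (data) {
                splashImageView.image = [UIImage imageWithData:data];
            }
            // Remove the overlay after a few seconds either way.
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3 * NSEC_PER_SEC)),
                           dispatch_get_main_queue(), ^{
                [splashImageView removeFromSuperview];
            });
        });
    }] resume];
}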
I have an app with a tab bar controller at the root. The "home view" is a 3D rendered screen in OpenGL. There are certain 3D objects that you can click on that need to display a video. The video should be fullscreen and fade in and out.
To do this, I made the HomeViewController create an MPMoviePlayerViewController, assigned it a URL, and then presented it from the tab bar controller. (I would have presented it from the HomeViewController, but for some reason it didn't change its orientation properly; I'm sure it's related to all of the custom 3D stuff, and I didn't program it, so I just worked around it by presenting it from a more senior view.)
(Note that I am presenting the MPMoviePlayerViewController modally (not using the built-in presentModalMovieViewController or whatever) because Apple forces the transition to be the wacky screen-shift, and I wanted the dissolve.)
Now, that works fine. The modal window dissolves in, the video plays. You can play and pause it, fast forward, hit "Done" and the modal window goes away. Voila.
Now, here comes the totally weird bug: if you don't tap the video player and let the controls fade out (as they do after a second or two), the user cannot bring them back by tapping. It seems like the video controller stops responding to user input after that fade. Again, it works fine before they fade away. But after that point, I have to wait for the video to play all the way through (at which point the modal window does, in fact, go away).
For reference, here is the relevant code to the modal video player:
-(void) startVideoWithURL:(NSURL *)videoURL {
    if (!self.outsideMoviePlayerViewController) {
        self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
    }

    if (videoURL) {
        [self stopAnimation];

        self.outsideMoviePlayerViewController.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
        self.outsideMoviePlayerViewController.moviePlayer.contentURL = videoURL;
        [[[AppController instance] getCustomTabBarViewController] presentModalViewController:self.outsideMoviePlayerViewController animated:YES];

        // Move observation of the dismiss from the MoviePlayerViewController to self.
        [[NSNotificationCenter defaultCenter] removeObserver:self.outsideMoviePlayerViewController
                                                        name:MPMoviePlayerPlaybackDidFinishNotification
                                                      object:self.outsideMoviePlayerViewController.moviePlayer];

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(movieFinishedCallback:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.outsideMoviePlayerViewController.moviePlayer];
    }
}

-(void) movieFinishedCallback:(NSNotification *)aNotification {
    // Summary: Restart 3D animation, unregister from notifications, kill the modal video.
    [self startAnimation];

    MPMoviePlayerController *moviePlayer = [aNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:moviePlayer];

    [[[AppController instance] getCustomTabBarViewController] dismissModalViewControllerAnimated:YES];
}
The only other reference I could find to an issue like this was some archived post on the Apple Support Communities, here:
https://discussions.apple.com/thread/2141156?start=0&tstart=0
In this thread, the issue poster figures it out himself, and states that the issue was resolved. Here's his explanation:
The problem occurs when CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, TRUE). After i changed to CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0.002, TRUE), the playback control can appears/disappear by tapping the screen.
Unfortunately, I'm not programming a game and nobody on my dev team calls CFRunLoopRunInMode anywhere in the code. The closest thing I found to this was in the animation code (in the same ViewController):
- (void)startAnimation
{
    if (!animating)
    {
        NSLog(@"startAnimation called");

        CADisplayLink *aDisplayLink = [[UIScreen mainScreen] displayLinkWithTarget:self selector:@selector(drawFrame)];
        [aDisplayLink setFrameInterval:animationFrameInterval];
        [aDisplayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        self.displayLink = aDisplayLink;

        animating = TRUE;
    }
}
If anyone has any insights on what could be causing this, I would appreciate it. I figured that, at the very least, even if I figure it out myself tonight, this problem could go up on Stack Overflow and be archived for the sake of posterity.
Cheers.
I figured out what was causing this problem.
I noticed that the first video did play, while successive ones did not. I moved this code:
if (!self.outsideMoviePlayerViewController) {
    self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
}
so that the creation of the outsideMoviePlayerViewController happens inside the next block, like so:
if (videoURL) {
    self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
    [self stopAnimation];
    self.outsideMoviePlayerViewController.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    self.outsideMoviePlayerViewController.moviePlayer.contentURL = videoURL;
Now a new controller is created each time I play a video instead of recycling one controller, and the bug went away. I'm not 100% sure why this fixed it, because there are a variety of things that occur when you display a modal view controller.
The bottom line is probably that I should have done this in the first place as part of the lazy-loading paradigm, instead of trying to keep the controller in memory.
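For completeness, a sketch of that lazy-loading pattern, reusing the names from the code above: drop the reference when playback ends so the controller is deallocated instead of being recycled for the next video.

-(void) movieFinishedCallback:(NSNotification *)aNotification {
    [self startAnimation];

    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:[aNotification object]];
    [[[AppController instance] getCustomTabBarViewController] dismissModalViewControllerAnimated:YES];

    // Release the controller; startVideoWithURL: will create a fresh one next time.
    self.outsideMoviePlayerViewController = nil;
}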
I'm trying to enable AirPlay support in my app. I'm not doing video; I want to use the external display as a "second display".
Here's my problem: if I choose "AppleTV" from my AirPlay button, my app doesn't get notified. The only time my app -does- get notified is when I leave my app, go to the OS-level AirPlay button, choose "AppleTV" there and turn on Mirroring. If I turn mirroring off, my app is then told that the external display is gone.
So:
Why doesn't my app get notified when I pick an external display from within my app?
Why does my app get notified of the presence of an external display when I turn mirroring on... and not before? I'm obviously misunderstanding something, but it would seem like turning mirroring on should inform my app that the external display is gone (rather than now available), since the OS should now be using that external display for mirroring.
Code sample below. Thanks in advance for any help!
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.

    // Is there already an external screen?
    if (UIScreen.screens.count > 1)
    {
        [self prepareExternalScreen:UIScreen.screens.lastObject];
    }

    // Tell us when an external screen is added or removed.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidConnect:) name:UIScreenDidConnectNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidDisconnect:) name:UIScreenDidDisconnectNotification object:nil];

    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;

    // Add AirPlay control to view controller.
    MPVolumeView *airplayButtonView = [[MPVolumeView alloc] init];
    airplayButtonView.frame = CGRectMake(300, 300, 50, 50);
    airplayButtonView.backgroundColor = [UIColor blackColor];
    airplayButtonView.showsVolumeSlider = NO;
    airplayButtonView.showsRouteButton = YES;
    [self.viewController.view addSubview:airplayButtonView];

    [self.window makeKeyAndVisible];
    return YES;
}
#pragma mark - External screen handling
- (void)externalScreenDidConnect:(NSNotification *)notification
{
    [self prepareExternalScreen:notification.object];
}

- (void)externalScreenDidDisconnect:(NSNotification *)notification
{
    // Don't need these anymore.
    self.externalWindow = nil;
}

- (void)prepareExternalScreen:(UIScreen *)externalScreen
{
    NSLog(@"PREPPING EXTERNAL SCREEN.");

    self.viewController.view.backgroundColor = [UIColor blueColor];

    CGRect frame = externalScreen.bounds;
    self.externalWindow = [[UIWindow alloc] initWithFrame:frame];
    self.externalWindow.screen = externalScreen;
    self.externalWindow.hidden = NO;
    self.externalWindow.backgroundColor = [UIColor redColor];
}
That's correct, unfortunately. The secondary display (the AirPlay screen) is only available with mirroring.
Here is an application that shows how to implement this:
https://github.com/quellish/AirplayDemo
Looking at your code, you SHOULD be getting the UIScreenDidConnectNotification when a user goes to the AirPlay menu and turns on mirroring while your app is active. The "AirPlay button", whether an MPVolumeView or a movie controller, does not control mirroring (and thus the external display functionality). Video and audio out are unfortunately separate from mirroring, and mirroring can only be turned on or off using the system-wide mirroring UI.
Bottom line: You can't turn on that AirPlay screen from within your app.
Finally found the answer: you must have mirroring enabled in order to get the new-screen notification, but then you should overwrite the mirrored screen with your own second-screen content. Very confusing!
See this example:
UIScreen screens always return 1 screen
Now, the worst part. You can add an AirPlay button inside your app using this:
MPVolumeView *volumeView = [[MPVolumeView alloc] init];
[view addSubview:volumeView];
However you cannot enable mirroring from this picker! And there is no programmatic way to turn on mirroring.
How can I turn on AirPlay Screen Mirroring on the iPhone 4S programmatically
So apparently the only way to have a second-screen experience is to instruct your users how to turn on AirPlay from the multitasking bar and make sure they turn mirroring on.
It doesn't seem possible from inside the app, unfortunately. Only AirPlay audio can be turned on from inside the app, as far as I know. Here's an example app using the second screen with OpenGL and sound: http://developer.apple.com/library/ios/samplecode/GLAirplay/Introduction/Intro.html