MPVolumeView AirPlay button only responds to touches when mirroring is on - iOS

I am using a custom AVPlayerLayer to display a simple video. I am trying to add AirPlay support, but the button, when tapped, does not show anything.
self.player.allowsExternalPlayback = true
...
let airplayButton = MPVolumeView(frame: self.airplayButtonPlaceholder!.bounds)
airplayButton.showsRouteButton = true
airplayButton.showsVolumeSlider = false
self.airplayButtonPlaceholder?.addSubview(airplayButton)
self.airplayButtonPlaceholder?.backgroundColor = UIColor.clear
When I run my code (on a real device), I see the button, but when I tap it, nothing happens. What could be causing this? Is it because I am using a custom AVPlayerLayer and AVPlayer?
EDIT:
When I turn on mirroring through Control Center, I can touch the button and it displays the pop-up. What's going on?

Nothing happens because you haven't properly configured this "new window".
There are two ways to display content using AirPlay.
Mirroring
Doesn't need any configuration.
Note: You don’t need to do anything to make mirroring happen. In iOS
5.0 and later, mirroring—that is, displaying the same content on both the host device and the external display—occurs by default when the
user selects an AirPlay video output device.
Extra Window
(check the Apple guide here)
The steps, as described by Apple, are:
At app startup, check for the presence of an external display and register for the screen connection and disconnection notifications.
When an external display is available—whether at app launch or while your app is running—create and configure a window for it.
Associate the window with the appropriate screen object, show the second window, and update it normally.
Here is the code, taken from the Apple docs, for quick reference.
- Create a New Window If an External Display Is Already Present
- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
        // Get the screen's bounds so that you can create a window of the correct size.
        CGRect screenBounds = secondScreen.bounds;

        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        // Show the window.
        self.secondWindow.hidden = NO;
    }
}
- Register for Connection and Disconnection Notifications
- (void)setUpScreenConnectionNotificationHandlers
{
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];

    [center addObserver:self selector:@selector(handleScreenDidConnectNotification:)
                   name:UIScreenDidConnectNotification object:nil];
    [center addObserver:self selector:@selector(handleScreenDidDisconnectNotification:)
                   name:UIScreenDidDisconnectNotification object:nil];
}
- Handle Connection and Disconnection Notifications
- (void)handleScreenDidConnectNotification:(NSNotification *)aNotification
{
    UIScreen *newScreen = [aNotification object];
    CGRect screenBounds = newScreen.bounds;

    if (!self.secondWindow)
    {
        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = newScreen;

        // Set the initial UI for the window.
    }
}

- (void)handleScreenDidDisconnectNotification:(NSNotification *)aNotification
{
    if (self.secondWindow)
    {
        // Hide and then delete the window.
        self.secondWindow.hidden = YES;
        self.secondWindow = nil;
    }
}
EDIT:
When using an AVPlayerViewController, this is already implemented automatically, as described in the documentation here.
AVPlayerViewController automatically supports AirPlay, but you need to
perform some project and audio session configuration before it can be
enabled in your application.
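For reference, the audio session part of that configuration typically looks something like this (a minimal sketch, assuming you have also enabled the "Audio, AirPlay, and Picture in Picture" background mode for your target; the method name is mine):
#import <AVFoundation/AVFoundation.h>

- (void)configureAudioSessionForAirPlay
{
    // AVAudioSessionCategoryPlayback keeps audio playing when the screen
    // locks and makes the session eligible for AirPlay routing.
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    if (![session setCategory:AVAudioSessionCategoryPlayback error:&error] ||
        ![session setActive:YES error:&error]) {
        NSLog(@"Audio session setup failed: %@", error);
    }
}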

Related

iOS 11: Is it possible to block screen recording?

I have an app which plays video, and I don't want people to use the new iOS 11 feature to record these videos and make them public. That feature is described here.
I could not find any documentation regarding an option for my app to prevent users from recording it.
Can anybody please guide me to anything related to this?
Thank you!
I am publishing here the official response from Apple Developer Technical Support (DTS):
While there is no way to prevent screen recording, as part of iOS 11, there are new APIs on UIScreen that applications can use to know when the screen is being captured:
UIScreen.isCaptured Instance Property
UIScreenCapturedDidChange Notification Type Property
The contents of a screen can be recorded, mirrored, sent over AirPlay, or otherwise cloned to another destination. UIKit sends the UIScreenCapturedDidChange notification when the capture status of the screen changes.
The object of the notification is the UIScreen object whose isCaptured property changed. There is no userInfo dictionary. Your application can then handle this change and prevent your application content from being captured in whatever way is appropriate for your use.
HTH!
The feature is available on iOS 11 and above. It's best to register for the notification inside didFinishLaunchingWithOptions.
Objective-C syntax
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    if (@available(iOS 11.0, *)) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(screenCaptureChanged)
                                                     name:UIScreenCapturedDidChangeNotification
                                                   object:nil];
    }
    return YES;
}

- (void)screenCaptureChanged {
    if (@available(iOS 11.0, *)) {
        // Check whether a screen recorder is currently running.
        BOOL isCaptured = [[UIScreen mainScreen] isCaptured];
        if (isCaptured) {
            UIView *colourView = [[UIView alloc] initWithFrame:self.window.frame];
            colourView.backgroundColor = [UIColor blackColor];
            colourView.tag = 1234;
            colourView.alpha = 0;
            [self.window makeKeyAndVisible];
            [self.window addSubview:colourView];
            // Fade in the view.
            [UIView animateWithDuration:0.5 animations:^{
                colourView.alpha = 1;
            }];
        } else {
            // Grab a reference to our coloured view.
            UIView *colourView = [self.window viewWithTag:1234];
            // Fade the coloured view away from the main view.
            [UIView animateWithDuration:0.5 animations:^{
                colourView.alpha = 0;
            } completion:^(BOOL finished) {
                // Remove when finished fading.
                [colourView removeFromSuperview];
            }];
        }
    } else {
        // Fallback on earlier versions.
        // Grab a reference to our coloured view.
        UIView *colourView = [self.window viewWithTag:1234];
        if (colourView != nil) {
            // Fade the coloured view away from the main view.
            [UIView animateWithDuration:0.5 animations:^{
                colourView.alpha = 0;
            } completion:^(BOOL finished) {
                // Remove when finished fading.
                [colourView removeFromSuperview];
            }];
        }
    }
}
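One caveat: UIScreenCapturedDidChangeNotification only fires when the capture state changes, so a recording that is already in progress when your app launches won't trigger it. It may be worth checking the current state once up front, e.g. at the end of didFinishLaunchingWithOptions (a small sketch):
if (@available(iOS 11.0, *)) {
    // Covers the case where the screen is already being captured at launch.
    if ([UIScreen mainScreen].isCaptured) {
        [self screenCaptureChanged];
    }
}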

"Tap to focus" to "auto focus" when content in camera view changed. Logic like in the stock Camera app or the UIImagePickerController for iOS?

How do I automatically switch from "tap to focus" at a specific point of interest (POI) back to "auto focus" when focus is lost because the contents of the view changed? If you watch the focus behavior in the stock Camera app or UIImagePickerController: after you tap to focus on some area and then move the phone away, the camera automatically switches back to continuous auto focus at the center of the screen.
I need some more flexibility than what a UIImagePickerController can provide, so I need to use AVFoundation to mimic the UIImagePickerController behavior first...
This sounded very complex to me at first... but it turned out to be super simple; Apple already did 99% of the work for us. All you need to do is turn on "subjectAreaChangeMonitoringEnabled" and observe "AVCaptureDeviceSubjectAreaDidChangeNotification"! From the iOS 6.1 docs:
The value of this property indicates whether the receiver should
monitor the video subject area for changes, such as lighting changes,
substantial movement, and so on. If subject area change monitoring is
enabled, the capture device object sends an
AVCaptureDeviceSubjectAreaDidChangeNotification whenever it detects a
change to the subject area, at which time an interested client may
wish to re-focus, adjust exposure, white balance, etc.
Before changing the value of this property, you must call
lockForConfiguration: to acquire exclusive access to the device’s
configuration properties. If you do not, setting the value of this
property raises an exception. When you are done configuring the
device, call unlockForConfiguration to release the lock and allow
other devices to configure the settings.
You can observe changes to the value of this property using key-value
observing.
(Even better, you don't need to handle many corner cases. What if the device is in the middle of adjusting focus at a POI when the content changes? You don't want the device to fall back to auto focus at the center; you want the focus action to finish. The "subject area did change" notification is only triggered after the focus is done.)
Here are some sample code snippets from my project. (The structure follows the official AVFoundation example AVCam, so you can drop them in easily and try it out):
// CameraCaptureManager.m

// KVO key path used below (assumed to be the device's adjustingFocus property).
static NSString * const keyPathAdjustingFocus = @"adjustingFocus";

@property (nonatomic, strong) AVCaptureDevice *backFacingCamera;
@property (nonatomic, strong) id subjectAreaDidChangeObserver;

- (id)init
{
    self = [super init];
    if (self) {
        // TODO: more of your setup code for the AVFoundation capture session
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionBack) {
                self.backFacingCamera = device;
            }
        }

        NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
        void (^subjectAreaDidChangeBlock)(NSNotification *) = ^(NSNotification *notification) {
            if (self.videoInput.device.focusMode == AVCaptureFocusModeLocked) {
                // All you need to do is set continuous focus at the center.
                // This is the same behavior as in the stock Camera app.
                [self continuousFocusAtPoint:CGPointMake(.5f, .5f)];
            }
        };
        self.subjectAreaDidChangeObserver = [notificationCenter addObserverForName:AVCaptureDeviceSubjectAreaDidChangeNotification
                                                                             object:nil
                                                                              queue:nil
                                                                         usingBlock:subjectAreaDidChangeBlock];

        [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
        [self addObserver:self forKeyPath:keyPathAdjustingFocus options:NSKeyValueObservingOptionNew context:NULL];
    }
    return self;
}
- (void)dealloc
{
    // Remove the observers when done.
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self.subjectAreaDidChangeObserver];
    [self removeObserver:self forKeyPath:keyPathAdjustingFocus];
}
- (BOOL)setupSession
{
    BOOL success = NO;
    if ([self.backFacingCamera lockForConfiguration:nil]) {
        // Turn on subject area change monitoring.
        self.backFacingCamera.subjectAreaChangeMonitoringEnabled = YES;
        [self.backFacingCamera unlockForConfiguration];
    }
    // TODO: Set up the session, add inputs, etc...
    return success;
}
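The snippet above calls a continuousFocusAtPoint: helper that isn't shown. Here is a minimal sketch of what such a helper might look like, built on the standard AVCaptureDevice focus/exposure API (the body is my assumption, not the original author's code):
- (void)continuousFocusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = self.backFacingCamera;
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Re-center continuous autofocus on the given point of interest.
        if (device.isFocusPointOfInterestSupported &&
            [device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            device.focusPointOfInterest = point;
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
        // Do the same for exposure.
        if (device.isExposurePointOfInterestSupported &&
            [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        [device unlockForConfiguration];
    }
}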
I just saw the comment on @Xiaochao Yang's answer and want to add some explanation of CGPointMake(.5f, .5f). According to Apple's API, the CGPoint you pass to the camera is in the range {0,0} to {1,1}, so CGPointMake(.5f, .5f) means the center of the picture area.
This property represents a CGPoint where {0,0} corresponds to the top left of the picture area, and {1,1} corresponds to the bottom right in landscape mode with the home button on the right—this applies even if the device is in portrait mode
From AVCaptureDevice Class Reference
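For the reverse direction, i.e. turning a user's tap into that {0,0} to {1,1} space for tap-to-focus, AVCaptureVideoPreviewLayer can do the conversion for you via captureDevicePointOfInterestForPoint: (iOS 6+). A small sketch; previewLayer and focusAtPoint: stand in for your own preview layer and focus method:
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    // Convert the tap from view coordinates into the device's
    // point-of-interest space, accounting for video gravity and orientation.
    CGPoint tapPoint = [recognizer locationInView:recognizer.view];
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:tapPoint];
    [self focusAtPoint:devicePoint];
}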

MPMoviePlayer stops responding to touch

I have an app with a tab bar controller at the root. The "home view" is a 3D rendered screen in OpenGL. There are certain 3D objects that you can click on that need to display a video. The video should be fullscreen and fade in and out.
To do this, I made the HomeViewController create a MPMoviePlayerViewController, assigned it a URL and then presented it from the tab bar controller. (I would have presented it from the HomeViewController but for some reason it didn't change its orientation properly--I'm sure it's related to all of the custom 3D stuff, and I didn't program it, so I just did a workaround by displaying it from a senior view.)
(Note that I am presenting the MPMoviePlayerViewController modally (not using the built-in presentModalMovieViewController or whatever) because Apple forces the transition to be the wacky screen-shift, and I wanted the dissolve.)
Now, that works fine. The modal window dissolves in, the video plays. You can play and pause it, fast forward, hit "Done" and the modal window goes away. Voila.
Now, here comes the totally weird bug: if you don't tap the video player and let the controls fade out (as they do after a second or two), you cannot bring them back by tapping. It seems like the video controller stops responding to user input after that fade. Again, it works fine before the controls fade away, but after that point I have to wait for the video to play all the way through (at which point the modal window does, in fact, go away).
For reference, here is the relevant code to the modal video player:
- (void)startVideoWithURL:(NSURL *)videoURL
{
    if (!self.outsideMoviePlayerViewController) {
        self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
    }
    if (videoURL) {
        [self stopAnimation];
        self.outsideMoviePlayerViewController.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
        self.outsideMoviePlayerViewController.moviePlayer.contentURL = videoURL;
        [[[AppController instance] getCustomTabBarViewController] presentModalViewController:self.outsideMoviePlayerViewController animated:YES];

        // Move observation of the dismiss from the MPMoviePlayerViewController to self.
        [[NSNotificationCenter defaultCenter] removeObserver:self.outsideMoviePlayerViewController
                                                        name:MPMoviePlayerPlaybackDidFinishNotification
                                                      object:self.outsideMoviePlayerViewController.moviePlayer];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(movieFinishedCallback:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.outsideMoviePlayerViewController.moviePlayer];
    }
}

- (void)movieFinishedCallback:(NSNotification *)aNotification
{
    // Summary: Restart 3D animation, unregister from notifications, kill the modal video.
    [self startAnimation];
    MPMoviePlayerController *moviePlayer = [aNotification object];
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:moviePlayer];
    [[[AppController instance] getCustomTabBarViewController] dismissModalViewControllerAnimated:YES];
}
The only other reference I could find to an issue like this was some archived post on the Apple Support Communities, here:
https://discussions.apple.com/thread/2141156?start=0&tstart=0
In this thread, the issue poster figures it out himself, and states that the issue was resolved. Here's his explanation:
The problem occurs when CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0, TRUE). After i changed to CFRunLoopRunInMode(kCFRunLoopDefaultMode, 0.002, TRUE), the playback control can appears/disappear by tapping the screen.
Unfortunately, I'm not programming a game and nobody on my dev team calls CFRunLoopRunInMode anywhere in the code. The closest thing I found to this was in the animation code (in the same ViewController):
- (void)startAnimation
{
    if (!animating)
    {
        NSLog(@"startAnimation called");
        CADisplayLink *aDisplayLink = [[UIScreen mainScreen] displayLinkWithTarget:self selector:@selector(drawFrame)];
        [aDisplayLink setFrameInterval:animationFrameInterval];
        [aDisplayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        self.displayLink = aDisplayLink;

        animating = TRUE;
    }
}
If anyone has any insights on what could be causing this, I would appreciate it. I figured that, at the very least, even if I figure it out myself tonight, this problem could go up on Stack Overflow and be archived for the sake of posterity.
Cheers.
I figured out what was causing this problem.
I noticed that the first video did play, while successive ones did not. I moved this code:
if (!self.outsideMoviePlayerViewController) {
    self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
}
This way, the outsideMoviePlayerViewController is created inside the next block, like so:
if (videoURL) {
    self.outsideMoviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:nil];
    [self stopAnimation];
    self.outsideMoviePlayerViewController.modalTransitionStyle = UIModalTransitionStyleCrossDissolve;
    self.outsideMoviePlayerViewController.moviePlayer.contentURL = videoURL;
    // ... rest of the method unchanged
}
Now a new controller is created each time I play a video, instead of recycling the same controller, and the bug went away. I'm not 100% sure why this happened, because a variety of things occur when you display a modal view controller.
The bottom line is probably that I should have done this in the first place as part of the lazy-loading paradigm, instead of trying to keep the controller in memory.

iOS AirPlay: my app is only notified of an external display when mirroring is ON?

I'm trying to enable AirPlay support in my app. I'm not doing video; I want to use the external display as a "second display".
Here's my problem: if I choose "AppleTV" from my AirPlay button, my app doesn't get notified. The only time my app does get notified is when I leave my app, go to the OS-level AirPlay button, choose "AppleTV" there and turn on Mirroring. If I turn mirroring off, my app is then told that the external display is gone.
So:
1. Why doesn't my app get notified when I pick an external display from within my app?
2. Why does my app get notified of the presence of an external display when I turn mirroring on...and not before? I'm obviously misunderstanding something, but it would seem like turning mirroring on should inform my app that the external display is gone (rather than now available, since the OS should now be using that external display for mirroring).
Code sample below. Thanks in advance for any help!
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.

    // Is there already an external screen?
    if (UIScreen.screens.count > 1)
    {
        [self prepareExternalScreen:UIScreen.screens.lastObject];
    }

    // Tell us when an external screen is added or removed.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidConnect:) name:UIScreenDidConnectNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(externalScreenDidDisconnect:) name:UIScreenDidDisconnectNotification object:nil];

    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;

    // Add AirPlay control to view controller.
    MPVolumeView *airplayButtonView = [[MPVolumeView alloc] init];
    airplayButtonView.frame = CGRectMake(300, 300, 50, 50);
    airplayButtonView.backgroundColor = [UIColor blackColor];
    airplayButtonView.showsVolumeSlider = NO;
    airplayButtonView.showsRouteButton = YES;
    [self.viewController.view addSubview:airplayButtonView];

    [self.window makeKeyAndVisible];
    return YES;
}

#pragma mark - External screen handling

- (void)externalScreenDidConnect:(NSNotification *)notification
{
    [self prepareExternalScreen:notification.object];
}

- (void)externalScreenDidDisconnect:(NSNotification *)notification
{
    // Don't need these anymore.
    self.externalWindow = nil;
}

- (void)prepareExternalScreen:(UIScreen *)externalScreen
{
    NSLog(@"PREPPING EXTERNAL SCREEN.");
    self.viewController.view.backgroundColor = [UIColor blueColor];

    CGRect frame = externalScreen.bounds;
    self.externalWindow = [[UIWindow alloc] initWithFrame:frame];
    self.externalWindow.screen = externalScreen;
    self.externalWindow.hidden = NO;
    self.externalWindow.backgroundColor = [UIColor redColor];
}
That's correct, unfortunately. The secondary display (the AirPlay screen) is only available with mirroring.
Here is an application that shows how to implement this:
https://github.com/quellish/AirplayDemo
Looking at your code, you SHOULD be getting the UIScreenDidConnectNotification when a user goes to the AirPlay menu and turns on mirroring while your app is active. The "AirPlay button", whether an MPVolumeView or a movie controller, does not control mirroring (and thus the external display functionality). Video and audio out are unfortunately separate from mirroring, and mirroring can only be turned on or off using the system-wide mirroring UI.
Bottom line: You can't turn on that AirPlay screen from within your app.
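As an aside, once the external screen does show up, you can tell whether it is currently being used for mirroring by checking UIScreen's mirroredScreen property (available since iOS 4.3); a small sketch:
- (BOOL)isScreenMirroring:(UIScreen *)screen
{
    // mirroredScreen points at the screen being mirrored (normally the
    // main screen) and is nil when the screen is not mirroring anything.
    return screen.mirroredScreen == [UIScreen mainScreen];
}
Assigning your own UIWindow to that screen, as in the code above, replaces the mirrored content with your second-screen UI.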
Finally found the answer: you must have mirroring enabled in order to get the new-screen notification, but then you should overwrite the screen with your second-screen content. Very confusing!
See this example:
UIScreen screens always return 1 screen
Now, the worst part. You can add an AirPlay button inside your app using this:
MPVolumeView *volumeView = [[MPVolumeView alloc] init];
[view addSubview:volumeView];
However, you cannot enable mirroring from this picker! And there is no programmatic way to turn on mirroring.
How can I turn on AirPlay Screen Mirroring on the iPhone 4S programmatically
So apparently the only way to have a second-screen experience is to instruct your users to turn on AirPlay from the multitasking bar and make sure they turn mirroring on.
It doesn't seem to be possible from inside the app, unfortunately. Only AirPlay audio can be turned on from inside the app, as far as I know. Here's an example app using the second screen with OpenGL and sound: http://developer.apple.com/library/ios/samplecode/GLAirplay/Introduction/Intro.html

iOS app which runs on two screen (no mirroring)

I've created an iPad app which contains a slideshow; when the slideshow is tapped, the user can enter some information.
What I'd like to do now is display the slideshow contents on a TV when connecting the TV and iPad through AirPlay (or a cable if possible, but that only seems to mirror things).
Can this be done? Can we have the slideshow run on the TV and also on the iPad, so that when the user taps the slideshow on the iPad the credentials input screen shows on the iPad, while the TV still shows the underlying slideshow and not the credentials?
How can this be done in iOS? Is it possible to display only a portion of the application on the TV, i.e. not mirror the entire application?
You can write the app to handle two UIScreens using AirPlay and an Apple TV, then set a separate root view controller for the TV UIScreen and for the iPad UIScreen. Then display the image or slideshow in the TV's view controller and drive it from the events of your iPad's view controller!
AMENDED AFTER CLIF'S COMMENT:
So, first, in your app delegate, in didFinishLaunchingWithOptions or didFinishLaunching, register for the screen-did-connect notification.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(screenDidConnect:) name:UIScreenDidConnectNotification object:nil];
Then you need to keep a reference to your separate window and push controllers to it as you would any other window.
- (void)myScreenInit:(UIScreen *)connectedScreen
{
    // Initialise the TV window.
    if (!windowTV)
    {
        CGRect frame = connectedScreen.bounds;
        windowTV = [[UIWindow alloc] initWithFrame:frame];
        windowTV.backgroundColor = [UIColor clearColor];
        [windowTV setScreen:connectedScreen];
        windowTV.hidden = NO;
    }
}

- (void)setTvController:(UIViewController *)mynewViewController
{
    // Swap in the new root view controller for the TV window.
    // (Pre-ARC code: release the old controller after the swap.)
    UIViewController *oldController = windowTV.rootViewController;
    windowTV.rootViewController = mynewViewController;
    [oldController removeFromParentViewController];
    [oldController release];
}

- (void)screenDidConnect:(NSNotification *)notification
{
    [self myScreenInit:[notification object]];
    // ...then set whichever controller should drive the TV via setTvController:.
}
There appears to be a bug in iOS 5.0 which makes this tricky: you have to enable mirroring from the running-apps task bar (scrolling all the way to the left) before a second screen is detected via the API. I've posted details in my question here: How to use iOS 5+ AirPlay for a second screen
