How to differentiate Apple TV from other devices in AirPlay? - iOS

I am trying to connect to an Apple TV through AirPlay. The issue is that when some other external device connects, for example a Bluetooth accessory, my app also treats it as a connected device. I want to enable my feature only when an Apple TV is connected, so I need to identify which kind of device is on the route.
How can I identify whether it is an Apple TV or some other device?
This is how I create the custom AirPlay button:
for (UIView *subview in volumeView.subviews) {
    if ([subview isKindOfClass:[UIButton class]]) {
        self.airplayButton = (UIButton *)subview;
        self.airplayButton.frame = CGRectMake(0, 0, 30, 23);
        self.airplayButton.backgroundColor = [UIColor clearColor];
        [self.airplayButton addObserver:self
                             forKeyPath:@"alpha"
                                options:NSKeyValueObservingOptionNew
                                context:nil];
    }
}
The problem is that the button's alpha changes whenever any device gets connected, not only when it is an Apple TV.

I've looked into this before; there's no readily available way of determining whether the attached device is an Apple TV. The AirPlay picker does make this distinction, but the code behind it doesn't seem to be exposed.
The best you can do is monitor for additional screens being added or removed, and then show your external content only when the connected screen has the capabilities you need.
I have read somewhere that you can query the capabilities of an AirPlay device and use that information to detect an Apple TV, but unfortunately I cannot find the reference at the moment. If I do find it I'll add a comment.
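One related check that is reliable is telling an AirPlay route apart from a Bluetooth one via AVAudioSession. Here is a minimal sketch (the helper name is mine, not from your code); it won't confirm the receiver is an Apple TV, but it does filter out Bluetooth accessories:
#import <AVFoundation/AVFoundation.h>

// Sketch: returns YES when the current audio route goes over AirPlay.
// It cannot tell an Apple TV apart from other AirPlay receivers, but it
// excludes Bluetooth (A2DP/HFP) and wired outputs.
- (BOOL)isAirPlayRouteActive
{
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *output in route.outputs) {
        if ([output.portType isEqualToString:AVAudioSessionPortAirPlay]) {
            return YES;
        }
    }
    return NO;
}
Calling this from your alpha observer, together with a [[UIScreen screens] count] > 1 check, narrows things down to an AirPlay receiver that also provides a second screen, which in practice usually means an Apple TV.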
For now, your best option would be to use the concepts described in this guide.
The code provided is in Objective-C, but it's easily converted to Swift. Here is the main part you should be looking at:
- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];

        // Get the screen's bounds so that you can create a window of the correct size.
        CGRect screenBounds = secondScreen.bounds;

        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        // Show the window.
        self.secondWindow.hidden = NO;
    }
}
Like I said, you can extend this to check that the external screen supports certain resolutions, so you can rule out devices that won't support your UI.
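For example, something along these lines (the 720p threshold is just an assumption about what my own external UI needs, not an Apple TV detection API):
// Rough sketch: treat the external screen as usable only if it offers at
// least a 720p mode. This is a capability check, not a definitive
// "this is an Apple TV" test.
- (BOOL)externalScreenSupportsHD:(UIScreen *)screen
{
    for (UIScreenMode *mode in screen.availableModes) {
        if (mode.size.width >= 1280.0 && mode.size.height >= 720.0) {
            return YES;
        }
    }
    return NO;
}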
Some additional resources: https://developer.apple.com/airplay/

Related

iOS YTPlayerView force video quality

I am currently using the iOS-youtube-player-helper library in our application. There is a view controller with a YTPlayerView that has an aspect ratio of 16:9, which means it takes up only part of the screen. The video always loads in medium quality, and no matter what I try I cannot get it to play in 720p or 1080p. I am certain those qualities are available; the YTPlayerView is simply forcing the quality based on the video player's height. Because this is a library and not a direct iframe embed, I cannot use the "vq" parameter (specifying vq in playerVars does not seem to work), and setting the quality to small and then changing it later does not work either (refer to this issue on GitHub).
Given that I cannot make the YTPlayerView fill the whole screen because of UI design constraints, is it possible to force the YTPlayerView to play in at least 720p? (Workarounds, changing the library code, ...)
Because this app will be on the App Store (and of course we don't want any legal disputes with Google either), please don't suggest libraries that are against the YouTube terms of service, such as XCDYouTubeKit.
Many thanks!
I've found a workaround and it works well for me.
First of all, the problem depends on the size of the web view constructed inside the YTPlayerView. For example, if you have a 320x200 player view, trying to force the video to 720hd doesn't work, because the YouTube iframe player re-selects a quality to match the player size (in this case small quality, because you have 320x200).
You can see this SO answer, which explains the issue.
When you have imported the YTPlayerView class into your project, you have two files: YTPlayerView.h and YTPlayerView.m.
YTPlayerView.m (updated to also work on iPads)
I changed the function where the web view is initialized so that it uses a custom size (4K resolution), and at the end I added the possibility to scale the contents and restore the original frame, like this:
- (UIWebView *)createNewWebView {
    CGRect frame = CGRectMake(0.0, 0.0, 4096.0, 2160.0); // 4K resolution
    UIWebView *webView = [[UIWebView alloc] initWithFrame:frame];
    //UIWebView *webView = [[UIWebView alloc] initWithFrame:self.bounds];
    webView.autoresizingMask = (UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight);
    webView.scrollView.scrollEnabled = NO;
    webView.scrollView.bounces = NO;

    if ([self.delegate respondsToSelector:@selector(playerViewPreferredWebViewBackgroundColor:)]) {
        webView.backgroundColor = [self.delegate playerViewPreferredWebViewBackgroundColor:self];
        if (webView.backgroundColor == [UIColor clearColor]) {
            webView.opaque = NO;
        }
    }

    webView.scalesPageToFit = YES;

    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        // iPads don't seem to apply scalesPageToFit here, so zoom the scroll view out manually.
        CGSize contentSize = webView.scrollView.contentSize;
        CGSize viewSize = self.bounds.size;
        float scale = viewSize.width / contentSize.width;
        webView.scrollView.minimumZoomScale = scale;
        webView.scrollView.maximumZoomScale = scale;
        webView.scrollView.zoomScale = scale;
        // Center the web view after scaling.
        [webView setFrame:CGRectMake(0.0, self.frame.origin.y / 3, 4096.0, 2160.0)];
    } else {
        webView.frame = self.bounds;
    }

    [webView reload];
    return webView;
}
Hope this helps anyone trying to use the original YouTube player in their project.
P.S.: All my tests were done in a Swift project with the following vars:
var playerVars:Dictionary =
["playsinline":"1",
"autoplay":"1",
"modestbranding":"1",
"rel":"0",
"controls":"0",
"fs":"0",
"origin":"https://www.example.com",
"enablejsapi":"1",
"iv_load_policy":"3",
"showinfo":"0"]
Using the functions:
self.playerView.load(withVideoId: "Rk6_hdRtJOE", playerVars: self.playerVars)
and the following method to force the resolution:
self.playerView.loadVideo(byId: "Rk6_hdRtJOE", startSeconds: 0.0, suggestedQuality: YTPlaybackQuality.HD720)
About iPads:
As reported by Edward in the comments, there was a small problem on iPads: those devices don't seem to apply scalesPageToFit. The goal is to check whether the device is an iPad and, if so, scale (zoom out) the scroll view to fit the small view bounds.
I've tested this on my iPad Air and it works. Let me know.

Is it possible to remove the (i) info button and place custom video controls over a 360 video in Google VR?

The title says it all! But to be more clear, please check this screenshot. This is 360 video playback using Google VR (https://developers.google.com/vr/ios/). I want to know if it is possible to remove the little (i) info button and instead overlay our own set of video controls.
Google allows you to create a custom GVRView which doesn't have the (i) icon, but it involves writing your own OpenGL code for displaying the video.
A hack that works on v0.9.0 is to find the instance of QTMButton:
let videoView = GVRVideoView(frame: self.view.bounds)

for subview in videoView.subviews {
    let className = String(subview.dynamicType)
    if className == "QTMButton" {
        subview.hidden = true
    }
}
It is a hack though so it might have unintended consequences and might not work in past or future versions.
Well, I have an answer to my own question: the (i) button cannot be removed, at least not for now. Check this answer:
Hi. The (i) is intentional and designed to let users and other developers understand the feature. It links to a Google help center article. We do not currently allow developers to disable it.
https://github.com/googlevr/gvr-ios-sdk/issues/9#issuecomment-208993643
The same hack in Objective-C, removing the button instead of hiding it:
GVRVideoView *videoView = [[GVRVideoView alloc] initWithFrame:CGRectMake(0, 0, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height)];

for (UIView *view in videoView.subviews) {
    if ([view isKindOfClass:[UIButton class]]) {
        if ([NSStringFromClass([view class]) isEqualToString:@"QTMButton"]) {
            [view removeFromSuperview];
        }
    }
}

Re-enable mirroring on iOS

In my iOS app I need to display custom content on an external display (using AirPlay) as well as mirror some screens on the TV.
For presenting custom content I use code from the Multiple Display Programming Guide for iOS and it works well: while my iPad is in 'mirror' AirPlay mode I'm able to show my content on the TV. However, the documentation says:
To re-enable mirroring after displaying unique content, simply remove the window you created from the appropriate screen object.
And this part isn't working at all. I just cannot destroy the window that I use to display content on the external screen. Here's the code:
- (void)destroySecondWindow {
    if (secondWindow) {
        for (UIView *view in secondWindow.subviews) {
            [view removeFromSuperview];
        }
        secondWindow.backgroundColor = [UIColor clearColor];
        secondWindow.hidden = YES;
        // Hide and then delete the window.
        [secondWindow removeFromSuperview];
        secondWindow = nil;
    }
}
Since the unique content should be displayed only while one particular view controller is visible, I'm trying to destroy the external window like this:
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self destroySecondWindow];
}
Here's how I create second window:
- (void)createSecondWindowForScreen:(UIScreen *)screen {
    if (screen == nil || secondWindow != nil) {
        return;
    }
    CGRect screenBounds = screen.bounds;
    secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
    secondWindow.backgroundColor = [UIColor blueColor];
    secondWindow.screen = screen;
    [secondWindow setHidden:NO];
}
So the question is: does anybody know how to re-enable screen mirroring after displaying unique content on TV?
Thanks in advance!

Underscan issue with iPhone 5 Lightning HDMI adapter

I am trying to mirror what's on my iPhone to an external display, but I can't remove the underscan. All my searches for an answer lead back to overscanCompensation. Since this is a property of UIScreen, I'm not sure where to set it, given that I'm not explicitly creating a new UIScreen now that mirroring is automatic.
Auto-mirroring through Lightning HDMI adapter screenshot:
http://blog.axelgimenez.net/dp7-capture-underscan-issue-1.jpg
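For reference, the only place I can see to set it in the automatic-mirroring case is on the external UIScreen object itself once it shows up (for example from UIScreenDidConnectNotification). A sketch of that placement, which in my testing still did not remove the underscan:
// Sketch: apply overscan compensation to the mirrored external screen as
// soon as it connects. In my tests this did not remove the underscan either.
[[NSNotificationCenter defaultCenter] addObserverForName:UIScreenDidConnectNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    UIScreen *externalScreen = (UIScreen *)note.object;
    externalScreen.overscanCompensation = UIScreenOverscanCompensationInsetBounds;
}];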
The other option is to create a window for the external UIScreen when a new display is detected. Please see the code below, and note that I did use overscanCompensation = 3. Unfortunately, it doesn't work for me; I've tried all the other overscanCompensation values with no luck.
- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
        secondScreen.overscanCompensation = 3;

        // Size the window from the screen's last available mode.
        UIScreenMode *screenMode = [[secondScreen availableModes] lastObject];
        CGRect screenBounds = (CGRect){ .size = screenMode.size };

        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        self.secondWindow.backgroundColor = [UIColor redColor];

        // Show the window.
        self.secondWindow.hidden = NO;
    }
}
Here's the screenshot of the display that the code above creates:
http://blog.axelgimenez.net/dp7-capture-underscan-issue-2.jpg
The same underscan issue appears. Any help is greatly appreciated.
FYI: using the Lightning to HDMI adapter, I connected my iPhone 5 to a SmallHD DP7 monitor, which lets me take screen captures. The underscan also appears when I plug into my TV.

iOS AirPlay Second Screen Tutorial

I am looking at adding AirPlay capabilities to one of my view controllers. The view controller just shows a UIWebView. What I want to do is add a button that will mirror this content to an Apple TV. I know system-wide mirroring can be done, but it doesn't fill the entire screen; there are black bars all around. I have been searching online, but almost everything I have found dates back to iOS 5 and is out of date. Could someone point me to a tutorial or drop-in library that would help? I just need to mirror the content of one view full-screen on the Apple TV.
So far, here is what I have done, but I believe it only creates the second window without putting anything on it.
In the AppDelegate, I create a property for it:
@property (nonatomic, retain) UIWindow *secondWindow;
In the AppDelegate's application:didFinishLaunchingWithOptions: I run:
NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
[center addObserver:self
           selector:@selector(handleScreenDidConnectNotification:)
               name:UIScreenDidConnectNotification
             object:nil];
[center addObserver:self
           selector:@selector(handleScreenDidDisconnectNotification:)
               name:UIScreenDidDisconnectNotification
             object:nil];
Then in AppDelegate I have:
- (void)handleScreenDidConnectNotification:(NSNotification *)aNotification
{
    UIScreen *newScreen = [aNotification object];
    CGRect screenBounds = newScreen.bounds;

    if (!self.secondWindow)
    {
        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = newScreen;

        // Set the initial UI for the window.
    }
}

- (void)handleScreenDidDisconnectNotification:(NSNotification *)aNotification
{
    if (self.secondWindow)
    {
        // Hide and then delete the window.
        self.secondWindow.hidden = YES;
        self.secondWindow = nil;
    }
}
In the view controller from which I would like to mirror the web view to the Apple TV, I have:
- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];

        // Get the screen's bounds so that you can create a window of the correct size.
        CGRect screenBounds = secondScreen.bounds;

        appDelegate.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        appDelegate.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        // Show the window.
        appDelegate.secondWindow.hidden = NO;

        NSLog(@"YES");
    }
}
I got all this from here. However, that's all it shows, so I'm not sure how to get the content onto that screen.
Depending on what’s going on in your web view, you’ll either have to make a second one pointed at the same page or move your existing one to the new window. Either way, you treat the second window pretty much the same as you do your app’s main window—add views to it and they should show up on the second display.
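For the "second web view" variant, here is a minimal sketch building on the checkForExistingScreenAndInitializeIfPresent method from the question (the root view controller and the currentURL property are placeholders I'm assuming, not part of the original code):
// Sketch: give the external window a root view controller whose view is a
// second web view loading the same URL as the on-device one.
UIViewController *externalVC = [[UIViewController alloc] init];
UIWebView *externalWebView = [[UIWebView alloc] initWithFrame:screenBounds];
[externalWebView loadRequest:[NSURLRequest requestWithURL:self.currentURL]]; // placeholder URL property
externalVC.view = externalWebView;

appDelegate.secondWindow.rootViewController = externalVC;
appDelegate.secondWindow.hidden = NO;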
I assume you've seen it, but this is the only sample project I could find: https://github.com/quellish/AirplayDemo/
Here are some related questions that might be worth reading:
does anyone know how to get started with airplay?
Airplay Mirroring + External UIScreen = fullscreen UIWebView video playback?
iOS AirPlay: my app is only notified of an external display when mirroring is ON?
Good luck!
There are only two options for doing AirPlay 'mirroring' at the moment: system-wide mirroring and completely custom mirroring. Since system-wide mirroring is not a solution for you, you'll have to go down the path you already identified in your code fragments.
As Noah pointed out, this means providing the content for the second screen yourself, the same way you provide it for the internal display. As I understand it, you want to show the same data/website as on the internal display, but present it differently in the remote view/web view (e.g. a different aspect ratio). One approach is to have one web view follow the other in a master/slave setup: you monitor changes (like the user scrolling) in the master and propagate them to the slave. A second approach is to render the original web view's contents to a buffer and draw that buffer (or part of it) in a 'dumb' UIView. This is a bit faster, because the website does not have to be loaded and rendered twice.
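A rough sketch of the render-to-buffer approach, assuming an externalImageView property that lives on the second window and a timer or CADisplayLink that drives the refresh (the names are mine, not from the question):
// Sketch: periodically snapshot the on-device web view and show the image
// in a plain UIImageView on the external window.
- (void)mirrorWebViewToExternalDisplay
{
    UIGraphicsBeginImageContextWithOptions(self.webView.bounds.size, YES, 0.0);
    [self.webView drawViewHierarchyInRect:self.webView.bounds afterScreenUpdates:NO];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    self.externalImageView.image = snapshot; // assumed UIImageView on the second window
}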
