Blocking HDMI output of video - iOS

For legal reasons we need to prevent users from playing the video in our app on an external screen (TV, monitor). I know how to stop AirPlay output, but we need to stop HDMI output as well. Does anybody know if we can do this, or perhaps whether we can detect HDMI output and stop the video playing altogether?

You can detect external HDMI screens with: [[UIScreen screens] count] (it is greater than 1 when one is attached).
Then you can get the external screen instance with: UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
Finally, you can create a new UIWindow, initialize it with the external screen's bounds, and assign the external screen to its .screen property. You can then add your own views to this UIWindow instance.
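To go further and stop playback when a cable is attached, here is a minimal sketch using the screen-connection notifications; self.moviePlayer is a placeholder for whatever player your app uses:

- (void)startWatchingForExternalScreens {
    // Fires whenever an external display (e.g. an HDMI adapter) is connected.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(externalScreenDidConnect:)
                                                 name:UIScreenDidConnectNotification
                                               object:nil];
    // Handle the case where the cable was already plugged in at launch.
    if ([[UIScreen screens] count] > 1) {
        [self blockExternalPlayback];
    }
}

- (void)externalScreenDidConnect:(NSNotification *)note {
    [self blockExternalPlayback];
}

- (void)blockExternalPlayback {
    [self.moviePlayer stop]; // or pause, show an alert, etc.
}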

Related

Embedded YouTube videos not playing in WKWebView while the exact same videos work in UIWebView

When I load a web page into a UIWebView (even a www.youtube.com page) that contains an embedded YouTube video, it plays just fine on iOS 7 or iOS 8. But when I switch to WKWebView (iOS 8) and load the same web page, the YouTube video doesn't play at all on the first click, then plays for a second or two on subsequent clicks of play (on the YouTube player), as if it is running out of buffer space and pausing until I hit play again, only to play for another second.
From Safari this (or any other) YouTube video plays from start to finish on the same LAN. Likewise, as I mentioned earlier, UIWebView has no problem on iOS 7 or iOS 8, so it is not a problem with my wireless data throughput or connection.
Is there something I need to do to keep the player going? Maybe increase the streaming video buffer space for WKWebViews?
OK, I figured out why this is happening, and also why no one else was affected by it. WKWebView, and indeed iOS 8, handles several things that are not deprecated yet behave differently in iOS 8, and this is one of them.
My app samples audio from the microphone in real time. This was fine in prior versions of iOS, but apparently in iOS 8, when a WKWebView starts a video stream, the audio sampling can kill it (god only knows precisely why).
However, I need to be able to sample audio while using WKWebView, but I don't necessarily need to do it while a video is playing. So the challenge will be to figure out when the user has clicked on a YouTube, Vimeo, or apparently any other streaming video, and pause my audio sampler while the video is streaming. At the moment I don't know how to do that.
I also discovered another difference between iOS 8 and prior versions when using UIView animations. Prior to iOS 8 you could animate a swap between images in the UIWebView, the animation persisted only for the duration you set, AND you didn't have to enable animations:
[UIView beginAnimations:@"fade1" context:nil];
[UIView setAnimationDuration:1.0];
[UIView setAnimationsEnabled:YES];
// ... animated property changes (e.g. view.alpha) go here ...
[UIView commitAnimations];
Now in iOS 8, apparently you have to set a listener and disable animations after the animation finishes; otherwise your UIView animation is applied to all sorts of unexpected, and even native, behaviors, like finger-scrolling a native list or when the user vertically scrolls a web page in the UIWebView. Disable the animations and these strange effects on other objects disappear.
[UIView setAnimationsEnabled:NO];
And now I am totally confused about what the commit does. It doesn't seem to be needed in iOS 8 anymore.
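For reference, a minimal sketch of that listener approach using the begin/commit animation delegate API; the selector name here is my own:

[UIView beginAnimations:@"fade1" context:nil];
[UIView setAnimationDuration:1.0];
[UIView setAnimationDelegate:self];
[UIView setAnimationDidStopSelector:@selector(fadeDidStop:finished:context:)];
// ... animated property changes go here ...
[UIView commitAnimations];

- (void)fadeDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context {
    // Turn animations back off so the fade doesn't bleed into native scrolling.
    [UIView setAnimationsEnabled:NO];
}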
To know when the video is playing, you can try this solution from Paulo Fierro.
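In case that link goes away: the gist of that approach, as I understand it, is that the fullscreen video player lives in its own UIWindow, so you can watch for that window appearing and disappearing. The handler and sampler method names below are placeholders:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(videoWindowDidAppear:)
                                             name:UIWindowDidBecomeVisibleNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(videoWindowDidHide:)
                                             name:UIWindowDidBecomeHiddenNotification
                                           object:nil];

- (void)videoWindowDidAppear:(NSNotification *)note {
    if (note.object != self.view.window) { // the player's window, not ours
        [self pauseAudioSampler];
    }
}

- (void)videoWindowDidHide:(NSNotification *)note {
    if (note.object != self.view.window) {
        [self resumeAudioSampler];
    }
}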

Difference between AirPlay mirroring vs without mirroring

I am trying to use AirPlay with an Apple TV. I found that when I just play video with AVPlayer without mirroring, I can still play it full screen. However, the screen count is only 1 (the iPad itself).
If I turn on mirroring, the screen count is 2 (one for the iPad and one for the external monitor). I thought that without mirroring the screen count should also be two, so I am confused about that. I would like to know more about the difference between AirPlay with mirroring and without mirroring.
NSArray *screens = [UIScreen screens]; // to count the screens
The difference is simple.
Mirroring will duplicate everything on your screen and display it on another screen. This is used for things like showing off a photo gallery to a group of people or something like this.
If mirroring is turned off then the Apple TV acts as an external display. This is used in games like Real Racing 3, where you can play the game on a TV and use your iPhone (iPad) as a controller. The TV and the iPhone will have different things on their screens.
Feel like chiming in as Fogmeister's answer is not all that accurate.
You can easily use mirroring AND have different content on the Apple TV screen. It is, as far as I've been able to find out, the only way that is supported by any of Apple's public APIs at the moment. A solution has been detailed here, among other places.
The idea is to hijack the external window and then give it a viewController which you control (like any other):
if ([[UIScreen screens] count] > 1) {
    // Grab the external (AirPlay-mirrored) screen.
    UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
    _secondWindow = [[UIWindow alloc] initWithFrame:secondScreen.bounds];
    self.secondWindow.screen = secondScreen;
    // Give the window its own view controller, like any other window.
    _externalViewController = [[YourExternalViewControllerClass alloc] init];
    self.secondWindow.rootViewController = self.externalViewController;
    self.secondWindow.hidden = NO;
}
In the above example the _secondWindow and _externalViewController instances are properties of the viewController setting up the device view.

How to display overlay on Apple TV via AirPlay

I am developing an iOS app that displays a video, e.g., a football game, on Apple TV via AirPlay. I want to display additional information, e.g., player stats, on the big screen while the video is playing.
I am aware of the Redfin approach, where they require the user to turn on AirPlay mirroring first. Unfortunately, this is not acceptable for us; we want it to be obvious to users how to show the video.
We are currently presenting an AirPlay Route button before displaying the video to allow the user to set it up using the following code.
self.airPlayPicker = [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 50, 50)];
self.airPlayPicker.showsVolumeSlider = NO;
self.airPlayPicker.showsRouteButton = YES;
[self.view addSubview:self.airPlayPicker];
The Route button will show when there is an Apple TV around, allowing the user to turn it on. We then present the video with MPMoviePlayerController.
When AirPlay is turned on and the video is playing, in code, I see only one UIScreen, but two UIWindows. But both UIWindows have the same dimensions as the iPhone. When I add a subview to either UIWindow, the subview always shows up on the iPhone.
Has anyone figured out how to present an overlay on top of the video on Apple TV? How do I even find the view object where the video is hosted?
I am aware that MPMoviePlayerController is built on top of AVPlayer. Would using AVPlayer give us better control of the UI?
As far as I know, this shouldn't be possible. When using AirPlay without mirroring, only the URL of the video is sent to the Apple TV. It is then up to the Apple TV to actually play the media.
Mirroring is the way to do it.
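For completeness, here is a rough sketch of that mirroring route, assuming an AVPlayer-based player (which also answers the question about AVPlayer giving better control of the UI than MPMoviePlayerController); tvWindow, player, and statsView are hypothetical properties:

#import <AVFoundation/AVFoundation.h>

if ([[UIScreen screens] count] > 1) {
    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];
    self.tvWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.tvWindow.screen = externalScreen;

    // With AVPlayer you own the layer, so you can put views on top of it.
    AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    videoLayer.frame = self.tvWindow.bounds;
    [self.tvWindow.layer addSublayer:videoLayer];

    // statsView is any UIView holding the player stats overlay.
    [self.tvWindow addSubview:self.statsView];
    self.tvWindow.hidden = NO;
}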

How do I remove AirPlay from a video player for the iPad?

So I have this player that I am using to play video locally. I need to be able to AirPlay-mirror the screen to a TV, but every time I set AirPlay to mirror, the player just sends the whole video to the TV instead of simply mirroring the screen.
https://github.com/NOUSguide/NGMoviePlayer
I contacted the developers about how to disable the AirPlay part of it. They told me to remove the AirPlay layer from the player. Can someone point me in the right direction for getting this done?
If you want to disable AirPlay, just set the allowsAirPlay property to NO on your MPMoviePlayerController object.
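For example (the avPlayer reference below is an assumption; NGMoviePlayer is AVFoundation-based, so check how it exposes its underlying player):

// On MPMoviePlayerController, which this answer refers to:
moviePlayer.allowsAirPlay = NO;

// On an AVPlayer (what NGMoviePlayer wraps), the iOS 6+ equivalent is:
avPlayer.allowsExternalPlayback = NO;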
You can access the mirrored screen like this:
UIScreen *screen = [[UIScreen screens] objectAtIndex:1];
Before doing so, ensure that the screens array contains more than one object.
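That is, something along these lines:

if ([[UIScreen screens] count] > 1) {
    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];
    // mirroredScreen is non-nil only when this screen is mirroring another one.
    if (externalScreen.mirroredScreen != nil) {
        // safe to treat externalScreen as the mirrored display here
    }
}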

How do I access video output capabilities on iOS?

Applications such as Netflix have the ability to play video out of the dock connector when the video connector is hooked up. Do applications have the ability to put arbitrary data on video out? Or is the video player the only component on iOS that can do so? If the former is possible, how can my app access video output?
To draw to video out, place a window on the screen returned by [[UIScreen screens] objectAtIndex:1]. Set window.screen = theScreen on a window you have already created to assign it to that screen.
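A minimal sketch, where contentView stands in for whatever arbitrary view hierarchy you want to draw:

if ([[UIScreen screens] count] > 1) {
    UIScreen *tvScreen = [[UIScreen screens] objectAtIndex:1];
    UIWindow *tvWindow = [[UIWindow alloc] initWithFrame:tvScreen.bounds];
    tvWindow.screen = tvScreen;        // assign the screen before showing the window
    [tvWindow addSubview:contentView]; // contentView: any view you want on the TV
    tvWindow.hidden = NO;              // keep a strong reference to tvWindow, or it will be deallocated
}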
