Xcode iOS external display mirroring

I'm working at a local newspaper, and I have built a simple video-uploading app that we are going to use inside our organization to make it easier to upload videos to our servers.
My issue is that when I demonstrate it for 100+ people I would like to use a projector via an HDMI adapter. So I bought the adapter for my iPhone, and it works great except for one thing: when I edit the clip before uploading it (with the simple trimming UI that is part of UIImagePickerController), the editing tools are not mirrored to the HDMI screen. The projector just shows the video in fullscreen, so I can't demonstrate the buttons and tools for editing. Everything else mirrors perfectly by default. Is there some way to force it to truly mirror instead? I have tried reading Apple's official iOS documentation about this, but only found information about video playback.
The code for the camera and the editing:
// Shows the controls for moving & scaling pictures, or for
// trimming movies. To instead hide the controls, use NO.
cameraUI.allowsEditing = YES;
cameraUI.cameraFlashMode = UIImagePickerControllerCameraFlashModeAuto;
cameraUI.videoQuality = UIImagePickerControllerQualityTypeHigh;
cameraUI.delegate = delegate;
// Show camera view controller.
[controller presentViewController:cameraUI animated:NO completion:nil];

The best way to go, and also the recommended one, is to use an Apple TV with AirPlay:
http://www.apple.com/airplay/
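If an Apple TV isn't available, another option is to drive the HDMI adapter directly. UIKit reports a connected adapter as a second UIScreen, and a UIWindow placed on that screen replaces the default mirroring, so you decide exactly what the projector shows. A rough sketch (it assumes an externalWindow property, and EditingDemoViewController is a placeholder for your own controller; note that UIImagePickerController's built-in trim UI is system-owned, so you would have to rebuild those editing controls yourself to show them externally):

- (void)setUpExternalScreenIfPresent
{
    // A connected HDMI adapter shows up as a second UIScreen.
    if ([UIScreen screens].count < 2) {
        return;
    }
    UIScreen *externalScreen = [UIScreen screens][1];
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen;
    // Whatever is installed here is what the projector displays.
    self.externalWindow.rootViewController = [[EditingDemoViewController alloc] init];
    self.externalWindow.hidden = NO;
}

You can also observe UIScreenDidConnectNotification to handle the adapter being plugged in after launch.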

Related

Video Preview Layer Running While in Split-View?

I currently have an app that displays the front facing camera atop a video preview layer. By default in iOS 9, the preview layer is interrupted/paused and will not resume until split-view is dismissed. Based on the nature of the app, maintaining the running camera preview layer while multitasking is essential.
Is there any way to force the capture session to continue previewing while in split view?
Update: It seems as if Apple does not allow any sort of camera use while the device has more than one application open. You can, however, invoke UIImagePickerController to take a photo while in split-view. Of course, this solution only allows you to snap a single photo, and nothing more. Hope this helps someone!
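For reference, the UIImagePickerController fallback mentioned above is plain, standard API; a minimal sketch:

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    // Single photo capture only, per the update above.
    picker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
    picker.delegate = self; // UINavigationControllerDelegate & UIImagePickerControllerDelegate
    [self presentViewController:picker animated:YES completion:nil];
}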

How to display overlay on Apple TV via AirPlay

I am developing an iOS app that displays a video, e.g., a football game, on Apple TV via AirPlay. I want to display additional information, e.g., player stats, on the big screen while the video is playing.
I am aware of the Redfin approach, where they require the user to turn on AirPlay mirroring first. Unfortunately, this is not acceptable for us; we want it to be obvious to users how to show the video.
We are currently presenting an AirPlay Route button before displaying the video to allow the user to set it up using the following code.
self.airPlayPicker = [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 50, 50)];
self.airPlayPicker.showsVolumeSlider = NO;
self.airPlayPicker.showsRouteButton = YES;
[self.view addSubview:self.airPlayPicker];
The Route button will show when there is an Apple TV around, allowing the user to turn it on. We then present the video with MPMoviePlayerController.
When AirPlay is turned on and the video is playing, in code, I see only one UIScreen, but two UIWindows. But both UIWindows have the same dimensions as the iPhone. When I add a subview to either UIWindow, the subview always shows up on the iPhone.
Has anyone figured out how to present an overlay on top of the video on Apple TV? How do I even find the view object where the video is hosted?
I am aware that MPMoviePlayerController is built on top of AVPlayer. Would using AVPlayer give us better control of the UI?
As far as I know, this shouldn't be possible. When using AirPlay without mirroring, only the URL of the video is sent to the Apple TV. It is then up to the Apple TV to actually play the media.
Mirroring is the way to do it.
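If you do accept mirroring, the mirrored Apple TV also appears as a second UIScreen, and a UIWindow attached to that screen replaces the plain mirror image; that is the usual way to get the video plus a custom overlay onto the TV. A sketch (StatsOverlayViewController is a placeholder for a controller hosting your player view and stats view):

// With AirPlay mirroring enabled, the Apple TV appears as a second screen.
if ([UIScreen screens].count > 1) {
    UIScreen *tvScreen = [UIScreen screens][1];
    UIWindow *tvWindow = [[UIWindow alloc] initWithFrame:tvScreen.bounds];
    tvWindow.screen = tvScreen;
    // Host the movie player's view plus your stats view here;
    // this content replaces the plain mirror image on the TV.
    tvWindow.rootViewController = [[StatsOverlayViewController alloc] init];
    tvWindow.hidden = NO;
    self.tvWindow = tvWindow; // keep a strong reference, or the window is deallocated
}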

NetStream HTTP video not playing on iOS device

I am trying to play a video on iPad; my code is below:
public function init_RTMP():void
{
    videoURL = "http://rest************_iphone_high.mp4";
    vid = new Video();
    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onConnectionStatus);
    nc.connect(null); // null = progressive HTTP playback rather than an RTMP server
}

private function onConnectionStatus(e:NetStatusEvent):void
{
    if (e.info.code == "NetConnection.Connect.Success")
    {
        trace("Creating NetStream");
        netStreamObj = new NetStream(nc);
        metaListener = new Object();
        metaListener.onMetaData = received_Meta;
        netStreamObj.client = metaListener;
        netStreamObj.play(videoURL);
        vid.attachNetStream(netStreamObj);
        addChild(vid);
    }
}
When I play it on my system it works fine, but when I build an iOS app from it and install it on the device, it shows a blank white screen.
If anyone has had the same problem, or has any idea, please share it with me.
As VC.One pointed out, AIR for iOS does not play most (but not all; it will occasionally play a very specific encode type) H.264-encoded videos. There are three solutions:
1. As VC.One said, encode as FLV. I would not recommend this. FLV is not hardware accelerated (unless things have changed recently and I have not seen the updates) and will run entirely off the CPU, meaning your app will run slowly and eat battery much faster than normal.
2. Use StageWebView, in which case you just plug the URL of the video in and it plays using the native video player. The downside is that you cannot skin the player and you cannot control it: once it begins playing, you have no control over it except for unloading the page. It works very well, however, and is fairly easy to implement, though the video will appear on top of the stage (it is not in the Display List).
3. The last option is StageVideo. This plays videos using the native framework, so you can easily play H.264 and it will be hardware accelerated. Additionally, it is just a NetStream player, so you have full control over it. Best of all, it has no chrome, so you can build a player around the video screen. However, like StageWebView, StageVideo is not in the Display List; but unlike StageWebView, it is rendered directly on the stage, below everything else, so the app itself will cover the video. You can get around this by creating a class to mask your app around the video, but it is incredibly difficult to pull off properly. It took me about 12 hours to create my StageVideo player and the masking class, plus another half day later on fixing issues with the masking class and how it handles DPI changes (hint: do NOT set applicationDPI if you are using Flex).
As always, make sure your AIR SDK is up to date as well. 3.5-3.7 have all added a ton of new features and bug fixes for iOS applications, so updating to AIR 3.7 might actually solve your issue or make it less of a problem (I don't think it will, but it is always worth a shot, right?)
See this link:
Netstream video not playing on iPad
Basically it was fixed by encoding the video file as FLV not MP4.

Capture front-facing camera - Phonegap?

How can I capture video from the front-facing camera using PhoneGap?
I found navigator.device.capture.captureVideo(captureSuccess, captureError, {limit:2}); in PhoneGap's online API docs, but I don't see anything about using the front camera instead of the rear one. Is it possible?
I think after the camera opens up, you can choose the front camera. Currently I don't think there is an option to open the front camera by default.
If your smartphone has front and back cameras, when you use
navigator.device.capture.captureVideo(captureSuccess, captureError, {limit:2});
the capture camera will open with a switch button automatically.
I was playing around with PhoneGap today since I had an idea for an app. For this idea I needed the front-facing camera to be selected as default as well. After trying different things, I found a workaround which selects the front-facing camera by default. [NOTE: this is a dirty fix; I've got no clue what happens on devices without a front-facing camera!]
In Xcode (or whatever editor you use) open [ProjectName]/plugins/CDVCapture.m and locate both captureImage and captureVideo. Both functions/commands have a line saying
pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
Add this line below:
pickerController.cameraDevice = UIImagePickerControllerCameraDeviceFront;
And both Video and Image capturing will select the front facing camera by default when opening up the camera app. However, the user still has the possibility to switch cameras.
Here's a pastebin with my full CDVCapture.m file: http://pastebin.com/kkkyiPdn
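To address the "no clue what happens on devices without a front-facing camera" caveat, the added line could be wrapped in UIImagePickerController's standard availability check:

pickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
// Only default to the front camera when the device actually has one.
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    pickerController.cameraDevice = UIImagePickerControllerCameraDeviceFront;
}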

iOS - How to play a video with transparency?

I recorded a video with a blue screen. We have the software to convert that video to a transparent background. What's the best way to play this video overlaid on a custom UIView? Anytime I've seen videos on the iPhone, it always launches the player interface. Is there any way to avoid this?
Don't know if anyone is still interested in this besides me, but I'm using GPUImage and the Chromakey filter to achieve this^^ https://github.com/BradLarson/GPUImage
EDIT: example code of what I did (may be dated now):
-(void)AnimationGo:(GPUImageView*)view {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"test" withExtension:@"mov"];
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    filter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [movieFile addTarget:filter];
    GPUImageView *imageView = (GPUImageView *)view;
    [imageView setBackgroundColorRed:0.0 green:0.0 blue:0.0 alpha:0.0];
    imageView.layer.opaque = NO;
    [filter addTarget:imageView];
    [movieFile startProcessing];
    // to loop
    [imageView setCompletionBlock:^{
        [movieFile removeAllTargets];
        [self AnimationGo:view];
    }];
}
I may have had to modify GPUImage a bit, and it may not work with the latest version of GPUImage but that's what we used
You'll need to build a custom player using AVFoundation.framework and then use a video with an alpha channel. The AVFoundation framework allows much more robust handling of video without many of the limitations of the MPMedia framework. Building a custom player isn't as hard as people make it out to be. I've written a tutorial on it here:
http://www.sdkboy.com/?p=66
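The core of such a player is only a few lines of AVFoundation; a minimal chromeless sketch (videoURL and containerView are assumed to already exist in your code):

#import <AVFoundation/AVFoundation.h>

// Chromeless playback: no system player UI, and the layer can sit in any view.
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = containerView.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[containerView.layer addSublayer:playerLayer];
[player play];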
I'm assuming what you're trying to do is actually remove the blue screen from your video in real time: you'll need to play the video through OpenGL, run pixel shaders on the frames, and finally render everything using an OpenGL layer with a transparent background.
See the Capturing from the Camera using AV Foundation on iOS 5 session from WWDC 2011, which explains techniques for doing exactly that (watch the chroma key demo at 9:00). Presumably the source can be downloaded, but I can't find the link right now.
GPUImage would work, but it is not perfect, because an iOS device is not the place to do your video processing. You should do all your processing on the desktop using a professional video tool that handles chroma keying, then export a video with an alpha channel. Then import the video into your iOS application bundle as described at playing-movies-with-an-alpha-channel-on-the-ipad. There are a lot of quality and load-time issues you can avoid by making sure your video is properly turned into an alpha-channel video before it is loaded onto the iOS device.
The only way to avoid the player interface is to roll your own video player, which is pretty difficult to do right. You can insert a custom overlay on top of the player interface to make it look like the user is still in your app, but you don't actually have control of the view. You might want to try playing your transparent video in the player interface and see whether it shows up as transparent, and check whether the player has a property for its background color; you would want to set that to be transparent too.
--Mike
I'm not sure the iPhone APIs will let you have a movie view over the top of another view and still have transparency.
You can't avoid launching the player interface if you want to use the built-in player.
Here's what I would try:
1. Get the new window that is created by MPMoviePlayerController (see here).
2. Explore the view hierarchy in that window, using [window subviews] and [view subviews] (a helper for this is sketched below).
3. Try to figure out which one of those views is the actual player view.
4. Try to insert a view BEHIND the player view (sendSubviewToBack).
5. Find out whether the player supports transparency or not.
I don't think you can do much better than this without writing your own player, and I have no idea if this method would work or not.
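For step 2, a quick recursive dump makes the hierarchy much easier to inspect; a small debugging helper along these lines:

// Debug helper: recursively log a view tree to help spot the actual player view.
static void DumpViewHierarchy(UIView *view, NSString *indent)
{
    NSLog(@"%@%@ frame=%@", indent, NSStringFromClass([view class]),
          NSStringFromCGRect(view.frame));
    for (UIView *subview in view.subviews) {
        DumpViewHierarchy(subview, [indent stringByAppendingString:@"  "]);
    }
}

// Usage: dump every window once the movie player is on screen.
for (UIWindow *window in [UIApplication sharedApplication].windows) {
    DumpViewHierarchy(window, @"");
}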
Depending on the size of your videos and what you're trying to do, you could also try messing around with animated GIFs.
If you can extract the frames from your video and save them as images, then your video can be reproduced by swapping the images fast enough that it looks like a video.
In the image I uploaded the frames have different names, but if you name your images frame1, frame2, frame3..., you can load them in a loop.
I have never tried it; I just know it works for simple animations. Hope it works.
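On iOS, this flip-book approach maps directly onto UIImageView's built-in animation support, and PNG frames keep their transparency. A minimal sketch (the frame names and the count of 30 are assumptions):

// Flip-book playback: load frame1.png ... frame30.png and animate them.
NSMutableArray *frames = [NSMutableArray array];
for (NSInteger i = 1; i <= 30; i++) { // 30 frames is an assumption
    [frames addObject:[UIImage imageNamed:[NSString stringWithFormat:@"frame%ld", (long)i]]];
}

UIImageView *animationView = [[UIImageView alloc] initWithFrame:self.view.bounds];
animationView.animationImages = frames;
animationView.animationDuration = 1.0; // seconds for one full pass
animationView.animationRepeatCount = 0; // 0 = loop forever
[self.view addSubview:animationView];
[animationView startAnimating];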
