Multiple Video, Same Screen - iOS

Is it possible to play more than one video at the same time / on the same screen on iOS?
(E.g. Picture-in-Picture mode or split screen.)

A step-by-step guide with full source code on how to do this: http://iosguy.com/2012/01/11/multiple-video-playback-on-ios/

You can't do this using MPMoviePlayerController. The documentation states that clearly:
Although you can create multiple MPMoviePlayerController objects and
present their views in your interface, only one movie player at a time
can play its movie.
But I think you can do this with the lower-level AVFoundation framework, although I have never tried it. See this.

To bring the answers up to date with today's SDKs: with the SDK for iOS 5 (and probably back to iOS 4) you can use AVFoundation to play video in an AVPlayerLayer and add a number of such layers to a single view. I've confirmed that three separate WQVGA H.264 video streams can play on the same view at the same time, under both iOS 4.3.5 on an iPhone 3GS and iOS 5.0 on an iPhone 4S.
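As a minimal Swift sketch of that AVFoundation approach (the view controller name, resource names, and layout are placeholders, not code from the linked guide): one full-screen AVPlayerLayer plus a small picture-in-picture layer on top of it.

import AVFoundation
import UIKit

// Minimal sketch: a full-screen player layer with a small PiP layer over it.
class PictureInPictureViewController: UIViewController {

    // Keep strong references, otherwise the players are deallocated.
    private var mainPlayer: AVPlayer?
    private var pipPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        mainPlayer = addPlayerLayer(resource: "mainClip", frame: view.bounds)
        pipPlayer = addPlayerLayer(resource: "pipClip",
                                   frame: CGRect(x: 20, y: 40, width: 160, height: 90))
        mainPlayer?.play()
        pipPlayer?.play()
    }

    private func addPlayerLayer(resource: String, frame: CGRect) -> AVPlayer? {
        // The bundled .mp4 files are placeholders for your own assets.
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mp4") else { return nil }
        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = frame
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        return player
    }
}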

Related

Output UIView as video stream

Is it possible to stream the contents of a UIView as a direct video stream in Swift? I am not really looking for a "view screenshotting" approach that then assembles a video; that solution is possible, but the frame rate is far from ideal.
Update: maybe using OpenGL view?
1. View screenshots: What are you currently using as a timing mechanism?
I believe that if you use CADisplayLink you can get a better frame rate. In my project I can get ~15-20 fps live streaming of a full-screen video view on an iPhone 7 Plus.
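A minimal sketch of that CADisplayLink-driven screenshot approach; ViewFrameGrabber and its onFrame callback are illustrative names, not a system API, and you would feed the captured images into your own streaming pipeline.

import UIKit

// Snapshot a view on every display link tick and hand the image to a callback.
final class ViewFrameGrabber: NSObject {

    private let targetView: UIView
    private var displayLink: CADisplayLink?

    /// Called once per captured frame.
    var onFrame: ((UIImage) -> Void)?

    init(view: UIView) {
        self.targetView = view
        super.init()
    }

    func start(framesPerSecond: Int = 20) {
        let link = CADisplayLink(target: self, selector: #selector(captureFrame))
        link.preferredFramesPerSecond = framesPerSecond   // iOS 10+
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func captureFrame() {
        let bounds = targetView.bounds
        let image = UIGraphicsImageRenderer(bounds: bounds).image { _ in
            // drawHierarchy is usually faster than layer.render(in:) for on-screen views.
            targetView.drawHierarchy(in: bounds, afterScreenUpdates: false)
        }
        onFrame?(image)
    }
}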
2. Using ReplayKit: I don't think I need to rewrite the introduction in my own words, because Apple's docs are quite clear:
Record or stream video from the screen, and audio from the app and
microphone.
Using the ReplayKit framework, users can record video from the screen,
and audio from the app and microphone. They can then share their
recordings with other users through email, messages, and social media.
You can build app extensions for live broadcasting your content to
sharing services. ReplayKit is incompatible with AVPlayer content.
The frame rate is considerably higher than drawing screenshots of views, but currently it only supports capturing the whole screen.
So if you want to capture just one view, maybe think about it this way: crop the buffer of each output CMSampleBufferRef frame down to that view's rect (sketched below).
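For reference, a sketch of buffer-level capture with RPScreenRecorder.startCapture (iOS 11+); the cropping itself is left as a comment, since how you trim and forward the pixel data depends on your pipeline.

import ReplayKit

// Receive whole-screen frames as CMSampleBuffers from ReplayKit.
func startScreenCapture() {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }

    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        if let error = error {
            print("Capture error: \(error)")
            return
        }
        switch bufferType {
        case .video:
            // Full-screen video frame: crop the underlying pixel buffer to your
            // view's frame here, then encode or forward it.
            break
        case .audioApp, .audioMic:
            // App or microphone audio, if you need it.
            break
        @unknown default:
            break
        }
    }, completionHandler: { error in
        if let error = error {
            print("Could not start capture: \(error)")
        }
    })
}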
Edit: if it's about mirroring a view to an external screen, then there are other solutions besides ReplayKit or view screenshots.

How to implement screen recording with audio in iOS programmatically?

I have a requirement for screen recording with audio as well. I have done some Googling and learned how screen recording can be implemented, but I am wondering how to save audio while recording the screen.
Is it possible to merge the video and audio and then save the final result to disk?
But I am not sure whether it will be feasible, because the audio and video frames may drift out of sync.
For screen recording I found a link to ScreenCaptureView, which actually lets you save the screen recording.
On iOS 9 there is ReplayKit, a framework that can be used to record the screen in video games. It seems you can also use it for general screen capture; a minimal sketch follows.
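This is a sketch of ReplayKit recording with microphone audio enabled (the RecordingController class name is just illustrative); stopRecording hands back an RPPreviewViewController from which the user can trim, save, or share the result.

import ReplayKit
import UIKit

// Record the screen plus microphone audio with ReplayKit.
final class RecordingController: NSObject, RPPreviewViewControllerDelegate {

    private let recorder = RPScreenRecorder.shared()

    func startRecording() {
        recorder.isMicrophoneEnabled = true   // capture mic along with the screen
        recorder.startRecording { error in
            if let error = error {
                print("Could not start recording: \(error)")
            }
        }
    }

    func stopRecording(presentingFrom viewController: UIViewController) {
        recorder.stopRecording { previewController, error in
            if let error = error {
                print("Could not stop recording: \(error)")
                return
            }
            if let previewController = previewController {
                previewController.previewControllerDelegate = self
                viewController.present(previewController, animated: true)
            }
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}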
For lower platforms it's kind of a big deal: video screen capture exists, but only through a private framework (IOSurface, I guess). There are some workarounds, as in this project; basically it grabs single screenshots and appends them to a movie file, without audio (sketched below).
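A rough Swift sketch of that screenshot-append workaround using AVAssetWriter, video only, no audio; ViewMovieWriter is an illustrative name and the settings are assumptions, not the linked project's actual code.

import AVFoundation
import UIKit

// Snapshot a view repeatedly and append the frames to an AVAssetWriter.
final class ViewMovieWriter {

    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameCount: Int64 = 0
    private let fps: Int32 = 20

    init(outputURL: URL, size: CGSize) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,   // iOS 11+; use AVVideoCodecH264 on older SDKs
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true

        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)
            ])

        writer.add(input)
        guard writer.startWriting() else {
            throw writer.error ?? NSError(domain: "ViewMovieWriter", code: -1, userInfo: nil)
        }
        writer.startSession(atSourceTime: .zero)
    }

    /// Snapshot the view (call on the main thread) and append it as the next frame.
    func appendFrame(of view: UIView) {
        guard input.isReadyForMoreMediaData, let pool = adaptor.pixelBufferPool else { return }

        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        if let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                   width: CVPixelBufferGetWidth(buffer),
                                   height: CVPixelBufferGetHeight(buffer),
                                   bitsPerComponent: 8,
                                   bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                   space: CGColorSpaceCreateDeviceRGB(),
                                   bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            // Core Graphics uses a flipped coordinate system compared to UIKit.
            context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(buffer)))
            context.scaleBy(x: 1, y: -1)
            view.layer.render(in: context)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        let time = CMTime(value: frameCount, timescale: fps)
        if adaptor.append(buffer, withPresentationTime: time) {
            frameCount += 1
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}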

Swift - Slow Motion Camera in App

I've been learning Swift and iOS development and am interested in making an application that uses the camera on the back of the iPhone to shoot slow-motion video and then lets the user use a slider to move through the video normally, but also to move frame by frame. Of course, this would be controlled by a button that switches the user from "normal navigation" to frame-by-frame mode. Can anyone point me in the right direction? Are there any resources where I can read about using the "slo-mo" feature in a custom app? Thanks.
If you don't need the user to save the slow-mo video, what you could do is shoot a normal video and then have the app play it back in slow motion. You can do all of this using the AVFoundation framework. Here are some links to help you out (with a minimal playback sketch after them):
Taking Control of the iPhone Camera in iOS 8
Objective-C: How to do slow-motion video in iOS (yes, this is for Objective-C, but you can change the code to Swift or just copy and paste it into a new Objective-C file)
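As a rough sketch of the playback side, assuming you already have a recorded clip: AVPlayer rates below 1.0 give slow motion, and AVPlayerItem.step(byCount:) gives frame-by-frame navigation. The resource name and control wiring here are placeholders.

import AVFoundation
import UIKit

// Play a recorded clip at reduced speed and step through it frame by frame.
class SlowMoPlaybackViewController: UIViewController {

    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let url = Bundle.main.url(forResource: "capture", withExtension: "mov") else { return }
        let item = AVPlayerItem(url: url)
        let player = AVPlayer(playerItem: item)

        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        view.layer.addSublayer(layer)

        self.playerItem = item
        self.player = player
    }

    // "Normal navigation": scrub with a slider whose value runs from 0 to 1.
    @objc func sliderChanged(_ slider: UISlider) {
        guard let item = playerItem else { return }
        let seconds = CMTimeGetSeconds(item.duration) * Double(slider.value)
        let target = CMTime(seconds: seconds, preferredTimescale: 600)
        player?.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)
    }

    // Slow motion: rates between 0 and 1 play slower than real time.
    @objc func playSlowMotion() {
        player?.rate = 0.25
    }

    // Frame-by-frame mode: pause, then step one frame at a time.
    @objc func stepForwardOneFrame() {
        player?.pause()
        playerItem?.step(byCount: 1)   // use -1 to step backwards
    }
}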

AVComposition breaks on Airplay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't comment yet, so I had to post this as an answer, although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening for UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen, and if there is, I simply move the AVPlayer's layer to that screen while displaying a simple message on the device's screen saying that content is being played on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification (see the sketch below).
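A minimal sketch of that approach: listen for UIScreen connect/disconnect notifications and host the AVPlayerLayer in a window on the external screen yourself (ExternalDisplayController is an illustrative name, not the answerer's actual code).

import AVFoundation
import UIKit

// Move video playback onto an external/AirPlay screen by hand.
final class ExternalDisplayController {

    let player = AVPlayer()
    private var externalWindow: UIWindow?
    private var observers: [NSObjectProtocol] = []

    init() {
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil,
            queue: .main,
            using: { [weak self] note in
                if let screen = note.object as? UIScreen {
                    self?.moveVideo(to: screen)
                }
            }))
        observers.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil,
            queue: .main,
            using: { [weak self] _ in
                // Drop the external window and fall back to the device's own screen.
                self?.externalWindow = nil
            }))
    }

    func moveVideo(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen

        let host = UIViewController()
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = window.bounds
        host.view.layer.addSublayer(playerLayer)

        window.rootViewController = host
        window.isHidden = false
        externalWindow = window

        // On the device itself you would now show a "Playing on <AirPlay device>" message.
    }
}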
That was fine until I noticed that the composition plays back really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering the resolution, lowering the renderScale, and so on, but with no success.
One thing that bothers me even more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, it closes and starts rendering the project; after that it plays it, I think, using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So, for the two questions:
I don't see why you would have to get rid of the player.
Yes, it can be played, but with a different technique and, obviously, with some issues.
In the app's .plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
(in the raw plist this is the UIBackgroundModes key with the value audio).
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

Show two videos at the same time on the same iOS screen [duplicate]

This question already has answers here: Multiple Video, Same Screen (3 answers). Closed 9 years ago.
I need to show two different videos at the same time within my iOS app. One video needs to be playing in the top half of the screen, and a separate video needs to be playing on the bottom half of the screen. I don't need to have the stop/play/ffw/rew buttons enabled - just to show the videos playing. I have some of my own buttons to stop and start playback, which should affect both videos.
I tried using two MPMoviePlayerController views, but only one will play at any one time. I read that it's possible to have two playing in separate view controllers - if so, do I simply need to create two UIViewControllers and add them to the parent view controller?
Thanks for your help guys!
That is not possible at any multimedia framework level.
AVFoundation (AVPlayer) and the MediaPlayer framework (MPMoviePlayerController) both support only a single video playback at a time.
Update
The situation has changed. AVFoundation does in fact permit multiple movies to play at the same time; MediaPlayer still enforces the single-playback-at-a-time restriction.
You can use AVPlayer to drive multiple instances of AVPlayerLayer. These may be added to the layer hierarchy, so it is entirely possible to have multiple videos on one screen at the same time.
Here is an overview of one approach with accompanying sample code.
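And a minimal sketch for this particular layout, assuming two bundled clips (the names and view controller are placeholders): one AVPlayerLayer in the top half, one in the bottom half, with your own buttons driving both players together.

import AVFoundation
import UIKit

// Two independent players, laid out as top/bottom halves, started and
// stopped together from the app's own buttons.
class SplitScreenViewController: UIViewController {

    private var topPlayer: AVPlayer?
    private var bottomPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let half = view.bounds.height / 2
        topPlayer = addPlayerLayer(resource: "topVideo",
                                   frame: CGRect(x: 0, y: 0, width: view.bounds.width, height: half))
        bottomPlayer = addPlayerLayer(resource: "bottomVideo",
                                      frame: CGRect(x: 0, y: half, width: view.bounds.width, height: half))
    }

    private func addPlayerLayer(resource: String, frame: CGRect) -> AVPlayer? {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mp4") else { return nil }
        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = frame
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        return player
    }

    // Wire these to your own buttons so both videos start and stop together.
    @objc func playBoth()  { topPlayer?.play();  bottomPlayer?.play()  }
    @objc func pauseBoth() { topPlayer?.pause(); bottomPlayer?.pause() }
}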

Resources