I am new to iOS app development. I am trying to capture video using AVFoundation, and I am successful in this. But when I tried to play the video back using MPMoviePlayerController, I ran into too many issues, so I am trying to play it using AVPlayer instead.
But with AVPlayer there are two approaches: AVPlayerLayer and AVPlayerViewController.
I tried searching for information about both, but I didn't find any particular reason to choose one over the other.
Can anyone suggest which is better to use?
AVPlayerViewController is an all-in-one solution. You set up your AVPlayer with a video and present the player controller. It handles all the playback and has its own controls baked in (I'm sure you've seen this in other apps). It is the simplest way to show a video.
AVPlayerLayer is for when you want to add some customization, like adding your own controls or extra views, or not making the video full screen.
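If it helps, here is a minimal sketch of both approaches in Objective-C (videoURL and videoContainerView are placeholders you'd supply yourself):

#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>   // AVPlayerViewController lives in AVKit

// Option 1: AVPlayerViewController - built-in transport controls, no extra work.
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = player;
[self presentViewController:playerVC animated:YES completion:^{
    [player play];
}];

// Option 2: AVPlayerLayer - you decide the size and placement and build your own controls.
AVPlayer *customPlayer = [AVPlayer playerWithURL:videoURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:customPlayer];
playerLayer.frame = self.videoContainerView.bounds;
[self.videoContainerView.layer addSublayer:playerLayer];
[customPlayer play];

With the second option you are responsible for your own play/pause/seek UI, which is exactly the trade-off described above.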
Goal:
To add a YouTube-like preview feature when the user seeks manually using the player's seek bar.
From what I understand so far, I will have to add an "I-Frame only playlist" to my stream to enable trick play, but I am not able to figure out how to use it to show the preview view on the video player.
Other solutions I considered:
AVAssetImageGenerator: It does not work on streams. Explained here.
This says that if my .m3u8 file contains an "I-Frame only playlist", AVAssetImageGenerator will start returning snapshots, but even if it does, generating thumbnails for a complete one-hour video upfront is just not optimal.
AVPlayerItemVideoOutput: This also seems like a very brute-force way to approach the problem, as I need thumbnails for almost the complete video.
Current player implementation:
I have added an AVPlayerLayer as a sublayer of my view controller's view and added custom controls on top of it.
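Roughly, the setup looks like this (simplified; streamURL and customControlsView are placeholders):

AVPlayer *player = [AVPlayer playerWithURL:streamURL];           // the HLS (.m3u8) URL
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];

// Custom controls (play/pause button, seek bar, etc.) sit in a view above the layer.
[self.view addSubview:self.customControlsView];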
I am thinking of using something like this https://github.com/pbs/iframe-playlist-generator to add the I-Frame playlist.
PS: I am new to this, so if I have made any wrong assumption, please let me know.
Also, any links or references to some reading material I can use to dive in deeper are appreciated. Thanks.
I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them (a simplified sketch of how it's built follows these notes).
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
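For reference, the composition is built roughly like this (heavily simplified; the time values are placeholders and the track insertion is omitted):

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
// ...segments from the different source files are inserted into videoTrack here...

// Cross-fade: ramp this clip's opacity down over a one-second window.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[layerInstruction setOpacityRampFromStartOpacity:1.0
                                    toEndOpacity:0.0
                                       timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(4.0, 600),
                                                                 CMTimeMakeWithSeconds(1.0, 600))];

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
instruction.layerInstructions = @[layerInstruction];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[instruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = videoTrack.naturalSize;

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
item.videoComposition = videoComposition;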
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't comment yet, so I have to post this as an answer, although it might not fully address the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening for UIScreen connection notifications.
I have to say it all worked pretty well. I first check whether there is more than one screen, and if so, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that the content is being played on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it's not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
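A rough sketch of what I do (simplified; externalWindow and player are strongly held properties, and error handling is omitted):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Watch for an external screen (e.g. an AirPlay display) becoming available.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screenDidConnect:)
                                                 name:UIScreenDidConnectNotification
                                               object:nil];
}

- (void)screenDidConnect:(NSNotification *)notification {
    UIScreen *externalScreen = notification.object;

    // Put a window on the external screen and move the player layer there.
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen;

    AVPlayerLayer *externalLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    externalLayer.frame = self.externalWindow.bounds;
    [self.externalWindow.layer addSublayer:externalLayer];
    self.externalWindow.hidden = NO;

    // On the device itself, show a "Playing on <AirPlay device name>" message instead.
}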
That was fine until I noticed that the composition plays back really choppily on the external display, even when it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering the resolution, lowering the renderScale, and so on, but with no success.
One thing that bothers me even more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition to display it), then right after tapping the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, some issues.
In the app's .plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
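In the raw plist source, that Xcode-friendly name corresponds to the UIBackgroundModes key with an audio entry:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>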
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!
I found a lot of posts about playing an mp4 in MPMoviePlayerController (which can be in a smaller view on screen) or playing a YouTube video in YTMoviePlayer (fullscreen). I'd like to know if there is some way to play a YouTube video in a specific view, like Apple's example code for MPMoviePlayerController does:
https://developer.apple.com/library/ios/#samplecode/MoviePlayer_iPhone/Introduction/Intro.html
Thank you
You should try LBYoutubeView. It allows you to play a video in a specific view frame.
If you're using an iPhone, there's no way of using the MPMoviePlayerController other than fullscreen. With the iPad, it's a different story: you can embed the player in another view.
My iPad app has the option to play videos. I use the MPMoviePlayerViewController class to play my videos.
My question is: if I want to play the videos on an attached external monitor, how do I keep the playback controls on the iPad like YouTube does? If I add the view of the MPMoviePlayerViewController's player to the external screen's hierarchy, I can play the video fine, but I now have no control over it. Is there a way to move or duplicate the view where the controls lie and place it on a view which resides on the iPad?
I'm not aware of an officially supported way of pulling out the original UI in this way. The MPMoviePlayerViewController only exposes the MPMoviePlayerController object it uses via its moviePlayer property. The MPMoviePlayerController in turn only exposes view and backgroundView, which aren't helpful for such a purpose. You could in theory inspect the subviews of the movie player's main view, find the playback controls, and try to move them to the other screen. I have a feeling this will not end well though, as they're anything but static. You also never know what will happen in later iOS versions, or whether they'll allow your hack onto the App Store. It's probably less trouble to just re-do the UI yourself.
Actually controlling the video playback programmatically is straightforward - the view controller's moviePlayer implements the MPMediaPlayback protocol.
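For example (playerVC here stands for your MPMoviePlayerViewController instance):

// All of these come from the MPMediaPlayback protocol, so playback can be driven
// from code regardless of which screen the player's view is attached to.
MPMoviePlayerController *moviePlayer = playerVC.moviePlayer;
[moviePlayer play];
[moviePlayer pause];
moviePlayer.currentPlaybackTime = 30.0;   // seek to the 30-second mark
moviePlayer.currentPlaybackRate = 2.0;    // fast-forward at 2x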
In my iPhone app I have designed a custom video player. Currently it is very basic, with just play, pause, and stop buttons,
but I would like the user to be able to scrub (I think that's the right word) the video like you can with Apple's original media player.
So, for instance, I would like to be able to take a UISlider and have it control the current position of the video's playback, if you get what I mean. Oh, and in case you're curious, the way I pause/play/stop the video is by using this simple piece of code: [self.theMovie play]; [self.theMovie stop]; [self.theMovie pause]; The trouble is I don't know how to scrub the video.
Any help appreciated.
I've asked the same question here: customize quicktime iphone, and here: MPVideoPlayer add/remove buttons.
It seems that you have two possibilities:
You can add your view over the main window. A sample can be found here: MoviePlayer Sample
You can iterate through the views, find the one you need, and add/remove views. I don't know yet how much Apple likes/dislikes this method.
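If you go with the first possibility (your own controls over the main window), scrubbing basically comes down to wiring a UISlider to currentPlaybackTime. A rough sketch, assuming an iOS version where MPMoviePlayerController adopts MPMediaPlayback, and with theMovie and slider as placeholder properties:

// Call this once the movie's duration is known, e.g. when
// MPMovieDurationAvailableNotification fires (duration is 0 before that).
- (void)configureScrubber {
    self.slider.minimumValue = 0.0;
    self.slider.maximumValue = self.theMovie.duration;
    [self.slider addTarget:self
                    action:@selector(sliderChanged:)
          forControlEvents:UIControlEventValueChanged];
}

// Jump playback to wherever the slider was dragged.
- (void)sliderChanged:(UISlider *)sender {
    self.theMovie.currentPlaybackTime = sender.value;
}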