AVPlayer with Streaming videos or NSFileHandle - iOS

In my app I need to play multiple videos one after another. Currently I am streaming the videos using AVPlayer, but it seems very laggy; the videos freeze quite often. I'm wondering if downloading the files with NSFileHandle would provide a better user experience with less lagging. But I'm worried about memory issues.
Does anyone have a recommendation on which approach is more efficient? Or, for example, how Snapchat plays such a large number of videos so smoothly? Thanks.

To control the playback of assets, you use an AVPlayer object. During playback, you can use an AVPlayerItem instance to manage the presentation state of an asset as a whole, and an AVPlayerItemTrack object to manage the presentation state of an individual track. To display video, you use an AVPlayerLayer object.
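For the "multiple videos one after another" case, here is a minimal sketch of how those pieces fit together (the URLs and class name below are placeholders). AVQueuePlayer is an AVPlayer subclass that takes several AVPlayerItems and starts buffering upcoming ones, which can help with back-to-back playback:

```swift
import AVFoundation
import UIKit

final class PlaylistViewController: UIViewController {

    // Placeholder URLs; substitute your own remote video URLs.
    private let videoURLs = [
        URL(string: "https://example.com/clip1.mp4")!,
        URL(string: "https://example.com/clip2.mp4")!
    ]

    private var player: AVQueuePlayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        // One AVPlayerItem per asset; AVQueuePlayer plays them back to back
        // and can begin buffering the next item while the current one plays.
        let items = videoURLs.map { AVPlayerItem(url: $0) }
        player = AVQueuePlayer(items: items)

        // AVPlayerLayer displays the video frames.
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        playerLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(playerLayer)

        player.play()
    }
}
```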

Related

Playing Two AVPlayers with two remote videos in sync [iOS]

As titled, I'm currently working on an app that needs to play two videos hosted on a server. The thing is, I need to play these two videos in sync, which means the two AVPlayers start the videos at exactly the same time, and whenever one of the AVPlayers is paused due to buffering, the other AVPlayer needs to be paused as well, then both resumed after buffering, and vice versa.
I've looked around but can't seem to find a solution for this case.
I need to stream the videos; my business requirements don't allow downloading both videos first.
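There doesn't seem to be an official sample for this, but one rough approach (the URLs below are placeholders, and this keeps the players roughly in step rather than frame-accurate) is to observe each AVPlayerItem's isPlaybackLikelyToKeepUp and only let both players run while both items have enough buffer. For a frame-accurate start you would additionally look at AVPlayer's setRate(_:time:atHostTime:).

```swift
import AVFoundation

// A rough sketch: pause both streaming players whenever either one is likely
// to stall, and resume both once both have buffered enough again.
final class SyncedPlayback {

    // Placeholder URLs; substitute the two remote streams.
    private let playerA = AVPlayer(url: URL(string: "https://example.com/a.m3u8")!)
    private let playerB = AVPlayer(url: URL(string: "https://example.com/b.m3u8")!)
    private var observations = [NSKeyValueObservation]()

    func start() {
        for player in [playerA, playerB] {
            guard let item = player.currentItem else { continue }
            // Re-evaluate whenever either item's buffering state changes.
            observations.append(item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { [weak self] _, _ in
                self?.updatePlaybackState()
            })
        }
        updatePlaybackState()
    }

    private func updatePlaybackState() {
        let bothReady = [playerA, playerB].allSatisfy {
            $0.currentItem?.isPlaybackLikelyToKeepUp == true
        }
        if bothReady {
            playerA.play()
            playerB.play()
        } else {
            playerA.pause()
            playerB.pause()
        }
    }
}
```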

How can we overcome the missing muted/volume property on HTML5 video on UiWebView/iOS Safari?

As many hybrid app developers know, Apple has decided to disallow setting the volume property of HTML5 video elements in JavaScript. The same applies to the muted property. The concept of muted videos that autoplay when scrolled into view, with the option of unmuting on tap, is growing increasingly popular (pioneered by Vine, Facebook, etc.). I'm trying to find a way around this limitation in design. From what I've been able to read on the subject, there isn't any hack or solution that solves this design requirement of mine.
Here are my thoughts so far:
I could split the audio from the video into a separate stream, sync its current time with the video, and call play() when the user taps. However, iOS Safari/UIWebView does not support simultaneous audio/video streams. Thus, this is simply not an option.
I could encode two videos, one with sound and one without. I could then swap the src on tap. However, this requires reloading the entire stream and also nearly doubles the amount of data required. The latency is noticeable. Thus, this won't be a viable solution.
I could embed a native AVPlayer class element in the webview. However, this would be an overlay and would not be manageable from within the webview. Custom controls and UI interaction from within the DOM would not be possible. Thus, this is not an option.
I could simply disable the output of the app and dynamically switch it on whenever the user taps a video element. However, to my knowledge this is not possible. I could show the native software volume slider, but that would defeat the purpose of this whole thing.
Do you have any suggestions or ways around this limitation?
I managed to find an acceptable solution. I split the videos into three files: one without audio, one without video, and one with both video and audio for desktop browsers/Android.
It seems like running simultaneous streams CAN work as long as they don't conflict with each other, which basically means a separate audio track and a video with no audio channels play just fine in unison.

Switch between audio tracks in MPMoviePlayerController

In the app I'm currently working on I have some videos with multiple audio tracks (in different languages), and I'd like to be able to switch between these tracks programmatically. There is a button that allows the user to switch between them manually (tapping it presents this screen: http://i.stack.imgur.com/mkUTZ.png), so I assumed this wouldn't be too hard. However, after two hours of research I haven't found anything useful. There's this StackOverflow question that suggests it's impossible to play video with multiple audio streams in the first place, but apparently that no longer holds true. There are also some examples with AVFoundation (I haven't really looked into them), but none with MPMoviePlayer or MPMoviePlayerController.
So I'd like to know if my goal is achievable with stock MPMoviePlayerController and if not, what are the good alternatives to it.
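As far as I know, MPMoviePlayerController doesn't expose a public API for switching audio tracks, which is presumably why the examples are all AVFoundation-based. For reference, a sketch of that route (the language-matching helper below is just an illustration):

```swift
import AVFoundation

// A sketch of the AVFoundation route: alternative audio tracks are exposed
// as an AVMediaSelectionGroup on the asset, and the selection is made per item.
func selectAudioTrack(withLanguage languageCode: String, for playerItem: AVPlayerItem) {
    let asset = playerItem.asset
    guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else {
        return // This asset has no alternative audio tracks.
    }
    // Pick the first option whose locale matches, e.g. "de" or "fr".
    let matching = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: Locale(identifier: languageCode))
    if let option = matching.first {
        playerItem.select(option, in: group)
    }
}
```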

multiple MPMoviePlayerController on a tableview

I have multiple MPMoviePlayerController on a UITableView (on different sections).
I know that only one can play at any given time, but the thing is that if a different player was in "pause" mode, it gets stuck and I need to re-init it.
I could do a sophisticated [tableview reload] on everything except the current cell, but that seems cycle-consuming and idiotic (and it's not that simple to reload everything but the current cell).
Is there a better way? Maybe a third-party open-source package that handles this nicely?
OK, first of all, why do you need multiple MPMoviePlayerController instances if you only play one video at a time? You could create a single MPMoviePlayerController instance and use it to play all the videos one by one.
Also, I think AVPlayer with AVPlayerLayer would be a more extensible solution for playing videos on iOS. Take a look at the AVFoundation framework reference for more information about AVPlayer here.
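For what it's worth, here is a rough sketch of that single-player idea with AVPlayer (the class name and URL handling are simplified placeholders): keep one player and one layer around, and only swap the AVPlayerItem when the user taps a different cell.

```swift
import AVFoundation
import UIKit

// A rough sketch: one AVPlayer/AVPlayerLayer is reused across table cells,
// and only the AVPlayerItem is swapped when another row should start playing.
final class SharedVideoPlayer {

    static let shared = SharedVideoPlayer()

    let player = AVPlayer()
    let playerLayer: AVPlayerLayer

    private init() {
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspectFill
    }

    // Move the layer into the tapped cell's container view and start the new video.
    func play(url: URL, in containerView: UIView) {
        playerLayer.removeFromSuperlayer()
        playerLayer.frame = containerView.bounds
        containerView.layer.addSublayer(playerLayer)
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}
```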
Good Luck!

Need YouTube audio to continue to play when iOS app enters background

My app has a UIWebView that plays specific YouTube content within the app. Some of these videos can be lengthy and are more audio-centric (e.g. lectures and/or music). I'd like users of the app to be able to leave the app but have the audio of the YouTube video continue to play, without needing to hit a resume or play button. How can this be done? I'm willing to abandon the UIWebView if there is a better alternative.
Thank You.
I'm working on the same trouble. The problem comes from the MPMoviePlayer, which is what plays the movie on iOS.
An article by Matt Gallagher says:
"This pause is sent from the CALayer displaying the video frames. This is a private class for an MPMoviePlayerController and is your own AVPlayerLayer for an AVPlayer."
source - http://www.cocoawithlove.com/2011/04/background-audio-through-ios-movie.html
This article is very old (2011), but maybe the problem is similar.
Good luck
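If it helps, the usual prerequisite for keeping any audio going in the background (in addition to whatever the web view itself does) is the "audio" entry under UIBackgroundModes in Info.plist plus an audio session configured for playback. A minimal sketch, assuming a modern Swift setup:

```swift
import AVFoundation

// A minimal sketch: configure the shared audio session for background playback.
// This also requires the "audio" entry under UIBackgroundModes in Info.plist.
func enableBackgroundAudio() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Could not configure audio session: \(error)")
    }
}
```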
