We're working on an app that needs to play back video files via AVPlayer. The files need to be stored on the user's device, but must also play back while they're still downloading.
At the moment we've built a download module that uses the ASIHTTPRequest library to get the video files via PHP (we don't want the media to be linkable via public URLs) and write them to disk asynchronously.
We've then set up an AVPlayer as per the AV Foundation Programming Guide: loading the file with an AVURLAsset, making an AVPlayerItem from the asset, building the AVPlayer with the item, and attaching the player to an AVPlayerLayer. The player runs fine with a fully downloaded video file, and it also plays a progressively downloading file perfectly well in the simulator.
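For reference, the setup is essentially the standard chain, sketched here in Swift (`partiallyDownloadedURL` and `containerView` are placeholders for our downloader's output file and the hosting view):

```swift
import AVFoundation
import UIKit

// Placeholder for the file our download module is writing to disk.
let partiallyDownloadedURL = URL(fileURLWithPath: "/path/to/partial/video.mp4")
// Placeholder for whatever view hosts the video.
let containerView = UIView(frame: CGRect(x: 0, y: 0, width: 320, height: 180))

// File -> AVURLAsset -> AVPlayerItem -> AVPlayer -> AVPlayerLayer.
let asset = AVURLAsset(url: partiallyDownloadedURL)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: playerItem)

let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = containerView.bounds
containerView.layer.addSublayer(playerLayer)
player.play()
```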
Unfortunately, on an actual device the player's behavior changes: it seemingly loads the video once and never goes back to disk for new data. The result is that it plays video and audio only up to the point the download had reached when the asset was loaded (e.g. if 2 MB of data are on disk when the player is created, it will only play that 2 MB worth of content). Because it has the video's header, the player happily believes it's playing for the full duration of the video, but no further video is rendered to screen.
The last wrinkle is that if the video is inserted into an AVComposition and the AVPlayer is created with that, the progressive video will play fine. This would be a fine solution (and is necessary for our app anyway on occasion) except that the client for our app requires that the video be playable on an Apple TV via AirPlay, which AVCompositions are incapable of.
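For completeness, the composition route we're using looks roughly like this (a sketch with error handling left to the caller; the file URL is a placeholder):

```swift
import AVFoundation

// Wrap the partial file's asset in a mutable composition and build the
// player item from the composition instead of the raw AVURLAsset.
func playerItemViaComposition(for fileURL: URL) throws -> AVPlayerItem {
    let asset = AVURLAsset(url: fileURL)
    let composition = AVMutableComposition()
    // With a partial file, asset.duration only covers what's on disk so far.
    try composition.insertTimeRange(
        CMTimeRange(start: .zero, duration: asset.duration),
        of: asset,
        at: .zero
    )
    return AVPlayerItem(asset: composition)
}
```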
Does anyone know if there is a way to play progressively downloading video files using an AVPlayer built from AVURLAssets? Is there a way to force the player/playerItem to read from disk with an AVURLAsset the way it seems to do with an AVComposition instead of seemingly caching the video in memory?
Thanks!
I don't have a solution that just makes it work with AVURLAssets, but I use a slightly different approach. We bundle our app with CocoaHTTPServer and play the not-yet-fully-downloaded video files through an HTTP request against the local server.
The server knows the total length of the file and can decide, by looking at the HTTP headers, which part of the file is being requested, and then either load it from disk or from the remote source.
While developing this there were always three initial requests: one for the first two bytes of the file, one for a larger chunk from the beginning of the file, and one for a chunk right at the end of the file. That's why it was always necessary to load at least the last part directly from the remote server, since the player needs it right from the start. I would guess the same happens for local files, so the player reads the last bytes of the partial file (which aren't the right last bytes) and won't play past that length.
You would have to subclass HTTPConnection and write your own HTTPResponse class, using the provided "HTTPAsyncFileResponse" as a reference.
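The core decision that response class has to make looks roughly like this (a hypothetical sketch; none of these names are CocoaHTTPServer API, they're placeholders for your own plumbing):

```swift
import Foundation

struct ByteRange {
    let offset: Int64
    let length: Int64
}

// For every byte range the player asks for, either serve it from the
// partially downloaded file or proxy that range from the remote server.
func data(for range: ByteRange,
          bytesOnDisk: Int64,
          readFromDisk: (ByteRange) -> Data,
          fetchFromRemote: (ByteRange) -> Data) -> Data {
    if range.offset + range.length <= bytesOnDisk {
        // Already downloaded (usually ranges near the start of the file).
        return readFromDisk(range)
    } else {
        // Not downloaded yet - e.g. the initial request for the end of the
        // file, where the MP4 moov atom often lives.
        return fetchFromRemote(range)
    }
}
```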
I hope this gives you an idea how to accomplish this with a different approach.
Related
I have a list of videos in my iOS app, which I fetch from my API. The list contains video nodes with an mp4 video file URL on the server, and as the user scrolls from one video to the next, the next video starts and the earlier video pauses. This all works well, but I need a way to cache the mp4 videos to disk, such that when the user seeks, I first try serving the requested chunk from my disk cache, and if it isn't cached yet, I load the chunk from the network and cache the chunks to disk while they're being played. I also need a way to pre-cache the next videos: for example, when the user starts playing the first video, I need to start caching the second video, so that when the second video starts playing it plays from the cache up to the point it has been cached, and then seamlessly hops over to fetching from the server (caching those responses as well, of course). How do I go about this?
There's no automated way to do this built into the framework. You'll use background NSURLSessions with download tasks and delegate methods.
You can monitor the progress of the downloads, access the temporary files while they download, pause, resume, and cancel them.
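A minimal sketch of that setup, assuming a hypothetical `VideoPrefetcher` wrapper (the session identifier and cache paths are placeholders):

```swift
import Foundation

final class VideoPrefetcher: NSObject, URLSessionDownloadDelegate {
    // One background session for all video downloads.
    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.video-prefetch")
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    // Kick off a download, e.g. for the next video in the list.
    func prefetch(_ url: URL) {
        session.downloadTask(with: url).resume()
    }

    // Progress: called repeatedly while bytes arrive.
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didWriteData bytesWritten: Int64, totalBytesWritten: Int64,
                    totalBytesExpectedToWrite: Int64) {
        guard totalBytesExpectedToWrite > 0 else { return }
        let fraction = Double(totalBytesWritten) / Double(totalBytesExpectedToWrite)
        print("progress:", downloadTask.originalRequest?.url?.lastPathComponent ?? "?", fraction)
    }

    // Completion: move the temporary file into your own cache before returning.
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(location.lastPathComponent)
        try? FileManager.default.moveItem(at: location, to: destination)
    }
}
```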
Currently in our iOS application we download the video and then play it, which takes a lot of time and hurts the user experience. We now want to stream the video directly, or play it while it is downloading.
I have tried using AVAssetResourceLoaderDelegate. It doesn't play the video at all and reports the MIME type as text/html, perhaps because I am loading from a secured URL that requires custom headers.
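The wiring I have is roughly the following (simplified; the `streaming://` scheme and the video URL are made-up placeholders, and my header handling is elided):

```swift
import AVFoundation

final class SecureLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Tell the player what it is actually getting; if the server answers
        // with text/html (e.g. a login page) playback never starts.
        loadingRequest.contentInformationRequest?.contentType = "public.mpeg-4"
        loadingRequest.contentInformationRequest?.isByteRangeAccessSupported = true

        // ...perform the authenticated request for the byte range described by
        // loadingRequest.dataRequest, then call
        // loadingRequest.dataRequest?.respond(with: data) and
        // loadingRequest.finishLoading().
        return true
    }
}

// The delegate only fires for non-standard URL schemes, so the real https://
// URL is swapped for a custom one.
let asset = AVURLAsset(url: URL(string: "streaming://example.com/video.mp4")!)
let loader = SecureLoader() // keep a strong reference; the asset holds it weakly
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "resource-loader"))
let playerItem = AVPlayerItem(asset: asset)
```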
Can someone suggest a good way to stream the video, or play it while it is downloading? I'd also be glad if someone could suggest a way to stream videos directly from the URL.
I'm working on an app where I can play an HLS m3u8 playlist for a streaming radio station (audio only) without any problem using an instance of AVPlayer. Using Charles I can see the playlist being updated properly at a normal pace (every 9-10 seconds, i.e. one media segment file at a time). When I perform a seekToTime: (back in time), the player successfully plays the stream from where I want, but in Charles I can see it start downloading a huge number of media segment files, consuming a lot of data. It seems the player downloads all the media segment files up to that time and then resumes the normal behaviour.
I understand that the correct behaviour would be to download the media segment file for the time I'm seeking to, start playing it, and then keep downloading 1 or 2 media segment files every 9-10 seconds, as it does when I play the stream without timeshifting.
My question is whether this is normal behaviour, or whether something could be wrong with my m3u8 playlist or my client implementation. Could anyone help me clarify this?
UPDATED: I can confirm this doesn't happen in iOS 7, so it seems to be a bug introduced by iOS 8.
I've been told by Apple that this is not a bug, but a feature. They've made the buffer bigger since iOS 8.
Problem:
I need to download an mp4 video from a media server and simultaneously feed the localhost HTTP server (GCDWebServer) to enable video streaming.
What I have tried so far:
I am able to download a video file from the media server and then provide the localhost URL for streaming the video.
But the download takes a lot of time. So I wanted to know whether there is a way to segment the video on the fly and keep updating the m3u8 file as each segment becomes ready (see the sketch below).
Why download at all? Basically, I need to deal with a custom-encrypted video file, so I need to download it, decrypt the stream, and then feed it to my local HTTP server.
I hope the requirements are clear.
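On the playlist side, regenerating the m3u8 as each segment lands is just text rewriting; a rough sketch (segment names, durations, and paths are placeholders):

```swift
import Foundation

// Rewrite an event-style HLS playlist each time a new decrypted segment is ready.
// The player keeps polling the playlist until #EXT-X-ENDLIST appears.
func writePlaylist(segments: [(name: String, duration: Double)],
                   finished: Bool,
                   to url: URL) throws {
    var lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        "#EXT-X-TARGETDURATION:\(Int((segments.map { $0.duration }.max() ?? 10).rounded(.up)))",
        "#EXT-X-MEDIA-SEQUENCE:0"
    ]
    for segment in segments {
        lines.append("#EXTINF:\(segment.duration),")
        lines.append(segment.name)
    }
    if finished {
        // Only close the playlist once the final segment has been produced.
        lines.append("#EXT-X-ENDLIST")
    }
    try lines.joined(separator: "\n").write(to: url, atomically: true, encoding: .utf8)
}
```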
I am trying to get MPMoviePlayerController to play incomplete video files. I want to use this so a user can download part of a movie and play it offline. I am using ASIHTTP so I can resume downloads, but if I try to play the temporary file the player does nothing and I get no errors. I also registered for the MPMoviePlayerContentPreloadDidFinishNotification notification, but it never gets dispatched. When the file has finished downloading I can play it correctly.
Is it possible to somehow play incomplete video files? Alternatives to MPMoviePlayerController are also welcome if they offer a solution.
I needed to download the first part of the mp4 file and the last part before I was able to play the video. So I now download the first MB of the file, then the last MB, then the middle part, and I can play the file while it is still incomplete.
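In practice this boils down to ordinary HTTP byte-range requests, roughly like the following (sketched with URLSession rather than ASIHTTP; `videoURL` and the 1 MB sizes are placeholders matching what I described):

```swift
import Foundation

// Placeholder for the remote MP4.
let videoURL = URL(string: "https://example.com/video.mp4")!

// Fetch one byte range of the file, e.g. "bytes=0-1048575" for the first MB.
func fetchRange(of url: URL, range: String, completion: @escaping (Data?) -> Void) {
    var request = URLRequest(url: url)
    request.setValue(range, forHTTPHeaderField: "Range")
    URLSession.shared.dataTask(with: request) { data, _, _ in
        completion(data)
    }.resume()
}

// First MB (the movie header for "fast start" files):
fetchRange(of: videoURL, range: "bytes=0-1048575") { _ in /* write at offset 0 */ }
// Last MB (where the moov atom often lives otherwise):
fetchRange(of: videoURL, range: "bytes=-1048576") { _ in /* write at the end of the file */ }
// ...then fill in the middle before or while playing.
```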