Does looping a remotely fetched video in AVPlayer cause a redownload? - ios

I'm using a GitHub repo to play back video in an app, specifically Player. I'm trying to better understand the code and AVFoundation in general:
If I create an AVAsset with an NSURL pointing to a video on a remote server and hand it to the AVPlayer's AVPlayerItem, is it streaming the data from the remote URL? My guess is that this is true for the first play (and that the video isn't downloaded all at once and then played; please correct me if I'm wrong).
And if I then continuously loop the video (by calling seekToTime with kCMTimeZero once it has ended), am I causing the AVPlayer/asset to re-stream/re-download the file every time it loops? Or is it cached until the AVPlayer/asset is released?
If anyone could help me answer or point me to the right Apple docs, I would appreciate it! Thanks!
Another similar (?) question suggested AVAssetResourceDownloader, but I'm not looking to download the file to local disk (if that's what it does).

You don't re-download the file; you fill the AVPlayer buffer (a sort of cache).
If you seek back to zero, you don't download the file again, since you still have the buffer.
You can compare the AVPlayer buffer to YouTube's.
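As a concrete illustration, here is a minimal Swift sketch of that loop (the URL is a placeholder; in Swift, the seekToTime/kCMTimeZero pair becomes seek(to: .zero)):

import AVFoundation

// Placeholder remote URL; substitute your own video.
let url = URL(string: "https://example.com/video.mp4")!
let player = AVPlayer(playerItem: AVPlayerItem(url: url))

// When playback reaches the end, seek back to zero and resume.
// The seek is served from AVPlayer's buffer while the data is still
// cached, rather than triggering a fresh download.
// Keep a reference to the token so the observer can be removed later.
let loopObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: player.currentItem,
    queue: .main
) { _ in
    player.seek(to: .zero)
    player.play()
}

player.play()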

Related

How can I stream MP4 videos from S3 without AVPlayer downloading the files before playing them?

I have a lot of long (45-90 minute) MP4 videos in a public S3 bucket and I want to play them in my iOS app using AVPlayer.
I am using AVPlayerViewController to play them, but I have to wait several minutes before they start because the whole video is downloaded rather than streamed.
I am caching the videos locally, so this only happens the first time, but I would love to stream them so the user doesn't have to wait for an entire download.
Some people point to CloudFront for streaming videos, but from the documentation I've read, that is only necessary when many people are streaming the same file. I'm building an MVP, so I only need a simple solution.
Is there any way to stream an MP4 video from an S3 bucket with AVPlayerViewController without it fully downloading the file before playing it to the user?
TLDR
AVPlayer does not support 'streaming' (HTTP range requests) as you would define it, so either use an alternative video player that does, or use a real media streaming protocol like HLS, which AVPlayer supports and which starts the video before downloading it all.
CloudFront is great for delivery in general but is not strictly needed - you may have seen it mentioned because of CloudFront RTMP distributions, but those have since been discontinued.
Detailed Answer
S3 supports a concept called byte-range fetches using HTTP range requests - you can verify this by making a HEAD request to your video file & checking that the Accept-Ranges header exists with a value of bytes (i.e. not 'none').
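If you'd rather verify this from Swift than from the command line, here is a quick sketch with URLSession (the bucket URL is a placeholder):

import Foundation

// Placeholder object URL; replace with your own S3 file.
var request = URLRequest(url: URL(string: "https://my-bucket.s3.amazonaws.com/video.mp4")!)
request.httpMethod = "HEAD"

URLSession.shared.dataTask(with: request) { _, response, _ in
    if let http = response as? HTTPURLResponse {
        // S3 should report "bytes", meaning byte-range fetches are supported.
        print("Accept-Ranges:", http.value(forHTTPHeaderField: "Accept-Ranges") ?? "none")
    }
}.resume()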
Load your MP4 file in the browser & notice that playback can start as soon as you click play. You can also jump to the end of the file without having downloaded the entire video. HTTP range requests are what make this work: small chunks of the video are downloaded as & when the user reaches that part of the video. This saves the file server & the user bandwidth while providing a much better experience than having the client download the entire file.
The server needs to support byte-range fetches in the first place before the client can decide whether to make range requests. The key is that, once the server supports them, it is up to the HTTP client to decide whether to fetch the data in chunks or all in one go.
This isn't really 'streaming' as you know it & are referring to in your question; it is more 'downloading the video from the server in chunks and playing it back' using HTTP 206 Partial Content responses.
You can see this in the Network tab of your browser as a series of 206 responses when seeking within the video. The entire video is never downloaded, yet playback continues from whichever position you skip to.
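To reproduce a 206 from code, here is a sketch that requests only the first kilobyte (placeholder URL again):

import Foundation

var request = URLRequest(url: URL(string: "https://my-bucket.s3.amazonaws.com/video.mp4")!)
// Ask the server for the first 1024 bytes only.
request.setValue("bytes=0-1023", forHTTPHeaderField: "Range")

URLSession.shared.dataTask(with: request) { data, response, _ in
    if let http = response as? HTTPURLResponse {
        // A server that honours the Range header answers 206 Partial Content.
        print("Status:", http.statusCode)          // expected: 206
        print("Bytes received:", data?.count ?? 0) // expected: 1024
    }
}.resume()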
The problem with AVPlayer
Unfortunately, AVPlayer does not support 'streaming' using HTTP range requests & HTTP 206 Partial Content responses. I've verified this manually by creating a demo iOS app in Xcode.
This has nothing to do with S3 - if you stored these files on any other cloud provider or file server, you'd see that the file is still fully loaded before playing.
The possible solutions
Now that the problem is clear, there are two solutions.
Using an alternative video player
The easiest solution is to use an alternative video player that does support byte-range fetches. I'm not an expert in iOS development, so I sadly can't recommend one, but I'm sure there's a popular library the industry prefers over the built-in AVPlayer. This would give you your (extremely common) definition of 'streaming'.
Using a video streaming protocol
However, if you must use AVPlayer, the solution is to implement true media streaming with a video streaming protocol - true streaming also lets you leverage features like adaptive bitrate switching, live audio switching, licensing, etc.
There are quite a few of these protocols available like DASH (Dynamic Adaptive Streaming over HTTP), SRT (Secure Reliable Transport) & last but not least, HLS (HTTP Live Streaming).
Today, the most widely used streaming protocol on the internet is HLS, created by Apple themselves (hey, maybe the reason for not supporting range requests is to push you towards the protocol). Apple's own documentation is wonderful for delving deeper if you are interested.
Without getting too deep into protocol details, HLS generally allows playback to start more quickly, makes seeking much faster & delivers video as it is being watched, for a true streaming experience.
To go ahead with HLS:
Use AWS Elemental MediaConvert to convert your MP4 file to HLS format - the output will be one or more .M3U8 manifest files plus the .ts media segment files
Upload the resulting output to S3
Point AVPlayer to the .M3U8 file
let url = URL(string: "https://ermiya.s3.eu-west-1.amazonaws.com/videos/video1playlist.m3u8")!
let asset = AVURLAsset(url: url)
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
Enjoy near-instant loading of the video
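For orientation, a minimal VOD playlist of the kind such a conversion produces looks roughly like this (segment names and durations are purely illustrative, not actual MediaConvert output):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:6.0,
segment00000.ts
#EXTINF:6.0,
segment00001.ts
#EXT-X-ENDLIST

AVPlayer reads this manifest, fetches the short .ts segments in order & can begin playback once the first segment or two have arrived.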
CloudFront
As for Amazon CloudFront, it isn't required per se & S3 is sufficient in this case, but a quick Google search will surface plenty of benefits it provides, especially caching, which can help you save on S3 costs later on.
Conclusion
I would go with converting to HLS if you can, as it opens up more possibilities down the line & is a better true streaming experience in general; given the AVPlayer restrictions on iOS, though, an alternative video player will work just as well.
Whether to use CloudFront or not will depend on your user base, your usage of S3 and other factors.
As you're building an MVP, I would recommend just doing a batch conversion of your MP4 files to HLS format & not using CloudFront, which would add extra complexity to your cloud configuration.
Like @ErmiyaEskandary said, you could just use HLS to solve your problem, which is probably a good idea, but you should not have to wait for the entire MP4 file to download before playing it with AVPlayer. The issue is actually not with AVPlayer or byte-range requests at all, but with how your MP4 files are formatted.
Your MP4 files may be configured incorrectly for streaming. MP4s have a metadata section called the MOOV atom. By default, many encoders put it at the end of the file, in which case the player has to download the entire file before it can begin playing.
For streaming use cases, the MOOV atom needs to be at the front of the file. The player then only has to buffer the MOOV atom, and it can begin playing the video as the data loads.
You can use ffmpeg with the faststart flag enabled to move the MOOV atom to the beginning of the file.
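Assuming ffmpeg is installed, the usual invocation is a stream copy with the faststart flag, so nothing is re-encoded (file names are illustrative):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4

The -c copy leaves the audio and video streams untouched; only the container is rewritten with the MOOV atom up front.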

Why does video start buffering with an offline server?

I am implementing download-and-play functionality so videos work in offline mode. I am using NexPlayer with GCDWebServer; my videos are encoded and need to sync with the server. I am using GCDWebServer for offline mode, but after some playback the video starts buffering.
So my question is: this is an offline server and we already have all the data, so why is it buffering? I don't understand it. Please suggest something, or can I use another server instead of GCDWebServer?
Sorry for the late reply, but I found the solution to my problem.
I had set some buffering values on NexPlayer as if it were playing online video, but since the file is stored locally, we don't want any buffering thresholds interrupting playback.
So I just removed them (simply set them to 0) and it works perfectly, as expected.
No buffering, happy life :)

Single AVPlayer with both streaming and non-streaming content

I'm building a video player that should handle both streaming and non-streaming content and I want it to be playable with AirPlay.
I'm currently using multiple AVPlayer instances (one for each clip), and it works okay, but the problem is it doesn't give a very smooth experience when using AirPlay. The interface jumps back and forth between clips when switching AVPlayers, so I would like to migrate to a single AVPlayer. This seems like a trivial task, but I haven't yet found a way to do it.
This is what I've tried so far:
Using a single AVPlayer with multiple AVPlayerItems and switching between them using replaceCurrentItemWithPlayerItem (a short sketch of this approach follows the list). This works fine when switching between streaming->streaming or non-streaming->non-streaming clips, but AVPlayer doesn't seem to accept replacements from streaming->non-streaming or vice versa. Basically, nothing happens when I try to switch.
Using an AVQueuePlayer with multiple AVPlayerItems fails for the same reason as above.
Using a single AVPlayer with a single AVPlayerItem based on an AVMutableComposition asset. This doesn't work because streaming content is not allowed in an AVMutableComposition (and AVURLAssets created from a streaming URL don't have any AVAssetTracks, which are required).
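For reference, a minimal sketch of the item-swapping approach from the first attempt (URLs are placeholders; replaceCurrentItemWithPlayerItem is replaceCurrentItem(with:) in Swift). It works when both items are the same kind of content; as described above, nothing happens when the replacement crosses the streaming/non-streaming boundary:

import AVFoundation

let player = AVPlayer()

let fileItem = AVPlayerItem(url: URL(string: "https://example.com/clip.mp4")!)
let hlsItem = AVPlayerItem(url: URL(string: "https://example.com/live/playlist.m3u8")!)

player.replaceCurrentItem(with: fileItem)
player.play()

// Later, swapping items. Fine between items of the same kind, but in the
// scenario above the swap silently fails across the content-type boundary.
player.replaceCurrentItem(with: hlsItem)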
So is there anything I am missing? Any other suggestion on how to accomplish this?
I asked Apple's Technical Support about this and got the answer that it's currently not possible to avoid the short jump back to the menu interface, and that no version of AVPlayer supports mixing streaming and non-streaming content.
Full response:
This is in response to your question about how to avoid the short jump back to the main interface when switching AVPlayers or AVPlayerItems for different media items while playing over AirPlay.
The issue here is the same with AVPlayer and AVQueuePlayer: no instance of AVPlayer (regardless of which particular class) can currently play both streaming and non-streaming content; that is, you can't mix HTTP Live Streaming media (e.g. .m3u8) with non-streaming media (a file-based resource such as an .mp4 file).
And with regard to AVMutableComposition, it does not allow streaming content.
Currently, there is no support for "seamless" video playback across multiple items. I encourage you to file an enhancement request for this feature using the Apple Bug Reporter (http://developer.apple.com/bugreporter/).
AVComposition is probably the best option at present for "seamless" playback. However, it has the limitation just described where streaming content is not allowed.

How can I stream a movie in iOS and playback from the filesystem later?

I've got an app that currently ships with all the videos it can play embedded in it. This doesn't scale well, and unless you want to watch all the movies, it wastes disk space. It also makes upgrading the app less appealing, because you have to re-download all the movies.
What I would like to do is download the movie on the fly, play it back while downloading, and then if it's successfully downloaded, save it to the file system so that next time they want to watch it, it streams from the local file.
I can do whatever is needed to the video, but currently I'm serving it as an .mp4 file from Amazon S3, with a mimetype of video/mp4, so the first half of my issue works fine: the movie downloads, and MPMoviePlayerController will start playing it as soon as it thinks it has downloaded "enough."
Is there any way to tap into the cache of that video file so that I can save it and control how long it resides on the filesystem? This seems like it would be the easiest approach.
I am targeting iOS 5+6, but if the only solution available required iOS 6, I would consider it also. Thanks!
UPDATE: Using AFNetworking, I am now halfway there, I think. I am downloading the video file from the server and listening for download progress. Once I see that 25% of the video has been downloaded, I start playback of the local file using an MPMoviePlayerController.
The main issue I'm running into now is that playback gets screwed up. It goes along fine: 25% downloaded, playback starts... the download continues normally... then the file finishes downloading completely, and shortly thereafter the video freezes. The onscreen playback timer still indicates that playback is ongoing and I don't see any "playback finished" notifications, but the video is frozen. My guess, based on this behavior, is that the initial buffer for video playback was used up, and the player isn't detecting that more of the video is now available on disk.
Is there any way to interact with MPMoviePlayerController to let it know periodically to refresh the buffer it's playing out of? Or some other way to handle this situation?
UPDATE: Make sure to see the newer answer from @TomHamming.
I have yet to find a conclusive answer, but at this time I believe the answer is: you can't reliably do this. At least not without a lot of work which seems too much like a hack. I filed a feature request with Apple as it really seems like this should be possible with some adjustments to MPMoviePlayerController.
I will go over the variety of things I tried or considered, and the results I encountered.
Pass MPMoviePlayerController a URL to your movie file, which allows it to stream, and then pull the file out of the cache it was saved into and move it to your local Documents folder. Won't work, as of iOS 6. I filed a feature request with Apple, but as it stands now there's no way to get your hands on the file it downloads, AFAIK.
Start downloading the movie file with NSURLConnection (or something like AFNetworking), and then when a "decent amount" has been downloaded to the device, pass the file URL to MPMoviePlayerController and let it stream from disk. Sort of works, but not well. Three problems:
It's really hard to know when to start playing the file. I haven't figured out the algorithm Apple uses, and so I always erred on the side of caution, waiting for 25% to be downloaded before playing.
The MPMoviePlayerController interface gives no indication that the movie is being streamed, as it does when the player itself handles the download over the network. It appears to the user that the file is fully downloaded when it really isn't.
And most importantly, MPMoviePlayerController seems to handle a partially downloaded file poorly. I experienced playback problems once the file finished downloading, or when the player caught up with the amount downloaded, and never found a graceful way to handle these situations.
Same procedure as above, but use AVFoundation classes to more finely control the playback process, and avoid the issues described above regarding playback stopping, etc. Might work, but I want all the features of MPMoviePlayerController. Re-implementing MPMoviePlayerController myself just to get this one feature seems like a waste of time.
Same procedure as #1 above, but run a small web server in your app to stream the video from disk to MPMoviePlayerController, with the hope that streaming would behave more like it does when the file comes directly from an external web server. Works, but results were still sporadic and performance seemed to suffer. I did my test with CocoaHTTP. I decided against this approach because it just felt like a terrible hack.
Run a lightweight HTTP proxy, thus intercepting the downloaded movie file data as it gets streamed from the internet into your MPMoviePlayerController. Not sure if this works or not. I was not able to test this yet, as I have not found a lightweight HTTP proxy written in Objective-C, and at this point don't feel like implementing one just to try this experiment. It seems like the next easiest of all these hacks to implement -- if you don't have to write the proxy!
At this point I've decided to go the less-hacky, but also less user-friendly route of simply downloading the file completely, and then passing it to MPMoviePlayerController, until a better solution comes along.
You can do this as of iOS 10 with AVAssetDownloadTask. See this WWDC 2016 session and this documentation.
Alternatively, if your movie isn't DRM'd, you can do it with AVAssetResourceLoaderDelegate, which effectively lets you give an AVPlayer an arbitrary stream of bytes. See this walkthrough.
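For completeness, here is a minimal sketch of the AVAssetDownloadTask route (the session identifier, title and URL are placeholders; the API is designed for HLS assets):

import AVFoundation

// Asset downloads require a background session configuration.
let config = URLSessionConfiguration.background(withIdentifier: "com.example.assetDownload")
let session = AVAssetDownloadURLSession(configuration: config,
                                        assetDownloadDelegate: nil,
                                        delegateQueue: .main)

let asset = AVURLAsset(url: URL(string: "https://example.com/movie/playlist.m3u8")!)

// In a real app, pass an AVAssetDownloadDelegate above to receive progress
// and the on-disk location of the finished download.
let task = session.makeAssetDownloadTask(asset: asset,
                                         assetTitle: "Movie",
                                         assetArtworkData: nil,
                                         options: nil)
task?.resume()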

How do you write audio to the first frame with AVAssetWriter while capturing video/audio on iOS?

Long story short, I am trying to implement a naive solution for streaming video from the iOS camera/microphone to a server.
I am using AVCaptureSession with audio and video AVCaptureOutputs, and then using AVAssetWriter/AVAssetWriterInput to capture video and audio in the captureOutput:didOutputSampleBuffer:fromConnection: method and write the result to a file.
To make this a stream, I am using an NSTimer to break the video into one-second file chunks (by hot-swapping in a different AVAssetWriter with a different outputURL) and uploading these to a server over HTTP.
This is working, but the issue I'm running into is this: the beginning of each .mp4 file always appears to be missing audio in the first frame, so when the video files are concatenated on the server (using ffmpeg) there is a noticeable audio skip at the joins between files. The video is just fine - no skipping.
I tried many ways of making sure there were no CMSampleBuffers dropped and checked their timestamps to make sure they were going to the right AVAssetWriter, but to no avail.
Checking the AVCam example with AVCaptureMovieFileOutput and the AVCaptureLocation example with AVAssetWriter, it appears the files they generate do the same thing.
Maybe there is something fundamental I am misunderstanding here about the nature of audio/video files, as I'm new to video/audio capture - but I thought I'd check before trying to work around this by learning to use ffmpeg to fragment the stream, as some seem to do (if you have any tips on this, too, let me know!). Thanks in advance!
I had the same problem and solved it by recording audio with a different API, Audio Queue. This seems to solve it; you just need to take care of timing in order to avoid audio delay.
