I have an HD video that I am streaming to an iOS app. I want to give the user the ability to cap the maximum stream quality (low, medium, high), since the video is several GB when streaming at the max bit rate. Along the same lines, I would like to automatically choose a setting based on cellular vs. Wi-Fi connection, for the obvious data-cap reasons.
I have no problem getting the current bit rate by accessing the AVPlayerItemAccessLogEvent, but am lost when it comes to forcing a lower quality stream.
Is this even possible with HLS? Thanks!
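(For what it's worth, detecting cellular vs. Wi-Fi is the part I'm less worried about; a rough sketch with NWPathMonitor, where the quality mapping is purely illustrative, would be something like this.)

import Network

// Rough sketch: watch the network path and note whether we're on cellular so a
// lower quality cap can be chosen automatically.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    let isCellular = path.usesInterfaceType(.cellular)
    // Illustrative mapping: cap at "medium" on cellular, "high" on Wi-Fi.
    print(isCellular ? "cellular: cap at medium" : "wifi: cap at high")
}
monitor.start(queue: DispatchQueue(label: "network.monitor"))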
If you are using AVPlayer, the right way should be
preferredPeakBitRate
From the Apple doc here: "The desired limit, in bits per second, of network bandwidth consumption for this item."
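A minimal sketch of wiring that up to a quality picker might look like this; the bit-rate values per tier are assumptions, not Apple recommendations, and should come from your own encoding ladder:

import AVFoundation

// Sketch: map a user-selected quality tier to a bit-rate ceiling.
enum StreamQuality: Double {
    case low = 350_000
    case medium = 800_000
    case high = 1_800_000
}

func applyCap(_ quality: StreamQuality, to playerItem: AVPlayerItem) {
    // The default of 0 means no cap; any positive value limits the variants
    // AVPlayer will pick.
    playerItem.preferredPeakBitRate = quality.rawValue
}

On Wi-Fi you could leave preferredPeakBitRate at 0 (no cap) and only apply a tier when the connection is cellular.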
It's not exactly dynamic, but I did solve this problem by creating four different m3u8 playlists. I labeled each playlist to represent a stream quality (low, medium, high, extreme), and the user selects one based on the desired max quality. The extreme playlist includes the URLs of all qualities; the high playlist has fewer URLs than the extreme, the medium fewer than the high, and the low fewer than the medium. Whenever the user selects a different quality, I just switch the base stream playlist to the respective quality playlist URL.
Here is a simple example of the four different playlists.
HLS_Movie_Extreme.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
stream-0-64000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=350000
stream-1-350000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
stream-2-800000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000
stream-3-1200000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1800000
stream-4-1800000/prog_index.m3u8
HLS_Movie_High.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
stream-0-64000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=350000
stream-1-350000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
stream-2-800000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000
stream-3-1200000/prog_index.m3u8
HLS_Movie_Medium.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
stream-0-64000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=350000
stream-1-350000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
stream-2-800000/prog_index.m3u8
HLS_Movie_Low.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
stream-0-64000/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=350000
stream-1-350000/prog_index.m3u8
Like I said, it's not dynamic, but you could use various techniques to detect the user's network connection and point to the desired quality playlist if needed. For me, it was sufficient to get the user's preference and adjust the stream accordingly.
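A rough sketch of the switch itself, using the playlist names above (preserving playback position exactly is left out):

import AVFoundation

// Sketch: swap the player item for the multivariant playlist that matches the
// selected quality. File names follow the example playlists above.
enum PlaylistQuality: String {
    case low = "HLS_Movie_Low"
    case medium = "HLS_Movie_Medium"
    case high = "HLS_Movie_High"
    case extreme = "HLS_Movie_Extreme"
}

func switchStream(on player: AVPlayer, to quality: PlaylistQuality, baseURL: URL) {
    let resumeTime = player.currentTime()
    let url = baseURL.appendingPathComponent("\(quality.rawValue).m3u8")
    player.replaceCurrentItem(with: AVPlayerItem(url: url))
    // Jump back to roughly where the previous item left off.
    player.seek(to: resumeTime)
    player.play()
}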
iOS (seemingly especially iOS 15) isn't always the best at taking an HLS m3u8 playlist with multiple playback quality options and selecting the best one to play. For instance, even with a great internet connection, iOS will often pick one of the lower qualities of the available options and take quite a while to transition to the higher quality one. Users aren't too happy about this, understandably, since I can paste the direct link to a high quality video stream from the playlist (without audio) into Safari and it loads instantly when I do it manually, versus the slower AVPlayer implementation.
Is it possible to use AVAssetResourceLoader to intercept the HLS playlist and strip out a few of the lower quality options, and thus keep iOS from favoring them?
I noticed in this question on the Apple Developer forums that an employee indicated it might be possible; however, even alongside Apple's demo code I've been unable to find a clear way to do this.
An example might be the following HLS manifest:
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-STREAM-INF:CLOSED-CAPTIONS=NONE,BANDWIDTH=308000,AVERAGE-BANDWIDTH=279000,RESOLUTION=162x288,CODECS="avc1.42001e"
HLS_224.m3u8
#EXT-X-STREAM-INF:CLOSED-CAPTIONS=NONE,BANDWIDTH=522000,AVERAGE-BANDWIDTH=481000,RESOLUTION=180x320,CODECS="avc1.42001e"
HLS_270.m3u8
#EXT-X-STREAM-INF:CLOSED-CAPTIONS=NONE,BANDWIDTH=969000,AVERAGE-BANDWIDTH=886000,RESOLUTION=244x432,CODECS="avc1.4d001e"
HLS_360.m3u8
#EXT-X-STREAM-INF:CLOSED-CAPTIONS=NONE,BANDWIDTH=1529000,AVERAGE-BANDWIDTH=1388000,RESOLUTION=360x640,CODECS="avc1.4d001f"
HLS_540.m3u8
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=20113,RESOLUTION=162x288,CODECS="avc1.42001e",URI="HLS_224-iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=29898,RESOLUTION=180x320,CODECS="avc1.42001e",URI="HLS_270-iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=51637,RESOLUTION=244x432,CODECS="avc1.4d001e",URI="HLS_360-iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=79210,RESOLUTION=360x640,CODECS="avc1.4d001f",URI="HLS_540-iframe.m3u8"
Wherein I'd like to strip out the 224 and 270 options and leave the 360 and 540 options. This is of course with the assumption that the HLS playlist is hosted on a server I don't control, so I can't edit the files directly.
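The kind of interception I have in mind would look roughly like the untested sketch below. Everything in it is my guess at how this would have to work: the asset would be created with a custom scheme (e.g. filtered:// instead of https://) so the resource loader delegate is consulted, the real playlist is fetched and filtered, and the kept child playlist URIs are rewritten to absolute https URLs so everything downstream loads normally.

import AVFoundation

// Untested sketch: serve AVPlayer a filtered copy of the multivariant playlist.
// The custom scheme, the 900 kb/s cutoff, and the parsing are all illustrative.
final class FilteringLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate {
    private let realPlaylistURL: URL          // the actual https:// playlist
    private let minimumBandwidth = 900_000    // drop variants below this

    init(realPlaylistURL: URL) {
        self.realPlaylistURL = realPlaylistURL
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        URLSession.shared.dataTask(with: realPlaylistURL) { data, _, error in
            guard let data = data, let playlist = String(data: data, encoding: .utf8) else {
                loadingRequest.finishLoading(with: error)
                return
            }
            let filtered = Self.strippingLowVariants(from: playlist,
                                                     below: self.minimumBandwidth,
                                                     baseURL: self.realPlaylistURL)
            loadingRequest.dataRequest?.respond(with: Data(filtered.utf8))
            loadingRequest.finishLoading()
        }.resume()
        return true
    }

    // Keep #EXT-X-STREAM-INF entries at or above the cutoff, skip the URI line
    // that follows a dropped entry, and make the kept URIs absolute.
    private static func strippingLowVariants(from playlist: String, below cutoff: Int, baseURL: URL) -> String {
        var kept: [String] = []
        var skipNextURI = false
        for line in playlist.components(separatedBy: .newlines) {
            if line.hasPrefix("#EXT-X-STREAM-INF") {
                let bandwidthString = line.components(separatedBy: ",")
                    .first { $0.hasPrefix("BANDWIDTH=") }?
                    .replacingOccurrences(of: "BANDWIDTH=", with: "") ?? "0"
                skipNextURI = (Int(bandwidthString) ?? 0) < cutoff
                if !skipNextURI { kept.append(line) }
            } else if !line.hasPrefix("#") && !line.isEmpty {
                if !skipNextURI {
                    kept.append(URL(string: line, relativeTo: baseURL)?.absoluteString ?? line)
                }
                skipNextURI = false
            } else {
                kept.append(line)
            }
        }
        return kept.joined(separator: "\n")
    }
}

I realize the #EXT-X-I-FRAME-STREAM-INF lines (whose URIs live in an attribute) and content-information requests aren't handled here, which is part of why I'm unsure whether this is the intended approach.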
The answers on the Apple Developer forum are a little bit old. The good news is that iOS 15 introduced some new APIs for working with bit rates in multivariant playlists. When loading the video, use AVURLAsset and AVAssetResourceLoader as you suggested. After the asset has loaded, you can see which playlist variant the system selected: get all variants from AVURLAsset.variants, then find the currently selected one by running the following code against the AVPlayerItem:
// Match the average bit rate reported in the access log against the asset's
// variants (from AVURLAsset.variants) to find the one currently playing.
if let lastEvent = self.playerItem?.accessLog()?.events.last {
    let selectedBitRate = lastEvent.indicatedAverageBitrate
    selectedAverageBitRate = selectedBitRate
    selectedIndex = variants.firstIndex(where: { ($0.averageBitRate ?? 0) == selectedBitRate })
}
For setting playlist variant preferences you can use:
playerItem?.variantPreferences
On the other hand, if you want to limit the peak bit rate, use the following:
playerItem?.preferredPeakBitRate
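As a rough sketch of how those pieces might be combined to cap quality (the 1.6 Mb/s cutoff and the function name are just placeholders):

import AVFoundation

// Sketch (iOS 15+): look at the variants the asset advertises and use the
// highest acceptable peak bit rate as the player item's ceiling.
func makeCappedItem(url: URL, maxPeakBitRate: Double = 1_600_000) async throws -> AVPlayerItem {
    let asset = AVURLAsset(url: url)
    let variants = try await asset.load(.variants)

    let item = AVPlayerItem(asset: asset)
    if let cap = variants.compactMap({ $0.peakBitRate })
                         .filter({ $0 <= maxPeakBitRate })
                         .max() {
        item.preferredPeakBitRate = cap
    }
    return item
}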
Hello Friends,
I am working on an OTT platform app, and I need video to play very smoothly without any delay, with Snapchat and Instagram as the reference. I am using Cloudinary for uploading videos and everything works well, but the first time a video is opened AVPlayer takes 1-2 seconds to start playback, which is a problem for me. Once a video has played, coming back to the same video starts it much faster, with a delay of at most half a second.
From what I have learned through various blogs and Stack Overflow answers, I gather this is AVPlayer's default buffering time, and that it depends on the video duration and on fetching video information such as the title and other metadata. But I don't need that information anywhere.
I tried setting the AVPlayer property automaticallyWaitsToMinimizeStalling = false, but still no luck.
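Roughly what I'm doing now (the preferredForwardBufferDuration line and its value are just experiments of mine):

import AVFoundation

// Roughly my current setup; the one-second buffer value is just an experiment.
let videoURL = URL(string: "http://res.cloudinary.com/dtzhnffrp/video/upload/v1621013678/1on1/bgasthklqvixukocv6xy.mov")!
let item = AVPlayerItem(url: videoURL)
item.preferredForwardBufferDuration = 1   // ask for a smaller initial buffer

let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = false
player.play()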
I tried a few solutions from Stack Overflow posts, but didn't have any success, for example:
How to reduce iOS AVPlayer start delay
Here is a demo video link which you can try: http://res.cloudinary.com/dtzhnffrp/video/upload/v1621013678/1on1/bgasthklqvixukocv6xy.mov
If you can suggest what I can use on an OTT platform to play video smoothly, I would be really grateful.
Thanks In Advance
Most streaming services use ABR (adaptive bit rate) streaming, which creates multiple resolution/bit-rate copies of the video and breaks each into chunks of, typically, 2-10 seconds.
One benefit of ABR is that, to speed up playback start-up, the video can start on a lower bit rate rendition and then 'step up' to higher bit rates as playback proceeds.
You can often see this on popular streaming services where you will see the video quality is lower when the video starts and improves after a short time.
See here for more on ABRs: https://stackoverflow.com/a/42365034/334402
This requires you to do work on the server side to prepare the video for HLS and DASH streaming, the two most common ABR streaming protocols.
Typically dedicated streaming servers, or a combination of encoders and packagers, are used to prepare and serve the ABR streams. There are also cloud services, for example AWS Media Services or Azure Media Services, which allow on-demand streaming models.
You can make the videos smaller either by reducing their dimensions or by compressing them more. Both of these have the effect of lowering startup time, but they sacrifice quality in exchange.
Cloudinary will create ABR versions for you, but the last I checked, you pay for each version created.
I heard in a WWDC video that the player measures the speed of previous HLS downloads to pick which rendition to use, but how does it choose which one to use at the very start? Is the download speed of the file listing the renditions, or the download speed of a specific rendition's file, used at all? I want to make sure that I'm not tricking the video player into using too high quality a rendition by loading metadata files instantly from the cache.
It picks the first entry:
The first entry in the variant playlist will be played at the initiation of a stream and is used as part of a test to determine which stream is most appropriate. The order of the other streams is irrelevant. Therefore, the first bit rate in the playlist should be the one that most clients can sustain.
From the Bit rate recommendations section of Apple's Technical Note TN2224:
Best Practices for Creating and Deploying HTTP Live Streaming Media for the iPhone and iPad
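In practice that just means ordering the multivariant playlist deliberately; a sketch (bandwidth values and paths are placeholders) where the first entry is a mid-tier rendition most clients can sustain:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000
stream-800k/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=350000
stream-350k/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1800000
stream-1800k/prog_index.m3u8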
In simple terms how does video on demand and streaming video work over P2P? I assume videos are cut up into small pieces (a few seconds each) and these pieces are transferred in chunks. As soon as a user is finished watching a chunk, it is deleted from their computer. Wouldn't this mean if no user on the network was currently watching a certain instance (chunk/time slice?) of the video then it's permanently lost? If no, how does VoD over P2P work? If you store all the chunks then it's exactly the same as normal file sharing with P2P.
Let me know if any parts of the question are unclear and I'll try to improve it.
P2P Live: each user downloads and simultaneously uploads chunks for other users who watch the same stream. More users means better quality.
source: P2P TV - Wikipedia
P2P VOD: this is more challenging to achieve since, as you noticed, there's less simultaneity in the way users watch the video. In this case each user is expected to contribute a reasonable amount of disk space to store chunks for other users. The strategies concerning what to store in each user's cache are subject to ongoing research.
If you search for P2P VOD you will find a lot of white papers presenting different approaches. There are too many links to list here.
We have an FFMPEG stream being streamed to mobile devices. We're using the HTML5 <video src="..." webkit-playsinline> tag to display the video inline (inside a real-time streaming app). We've managed to reduce the delay at the FFMPEG end down to the minimum but there's still a lag at the iOS end, where the player presumably buffers for a couple of seconds.
Is there any way to reduce the client-side delay?
We need as close to real-time as possible and skipping is acceptable.
If you are using an HTML5 video tag then the iOS device will use QuickTime to play back the video. Apple offers no control over internal mechanisms like buffer settings for its QuickTime player. For a project on Apple TV I even worked with a guy in Cupertino at Apple, and they just won't allow any access to the information you would require on their device.
Typically if you use HLS:
Is this a real-time delivery system?
No. It has inherent latency corresponding to the size and duration of the media files containing stream segments. At least one segment must fully download before it can be viewed by the client, and two may be required to ensure seamless transitions between segments. In addition, the encoder and segmenter must create a file from the input; the duration of this file is the minimum latency before media is available for download. Typical latency with recommended settings is in the neighborhood of 30 seconds.
What is the latency?
Approximately 30 seconds, with recommended settings. See question #15.
For a live streaming scenario on iOS you are better off tuning the streaming chain before the actual player:
capture -> transcoding -> upload -> streaming server -> delivery -> playback
Using ffmpeg you can tune for zero-latency streaming at the transcoding level, which I understand you have done. After that, using a well-established streaming server like Wowza and CDN delivery will help you get there (of course at a certain cost, and assuming you need a streaming server, which you may not).
If you go all native for your iOS app you may look at MPMoviePlayerController. I have no experience with native app code on iOS, so I will let you decide whether it is worth the time (still, I doubt it will be possible because of the underlying QuickTime/HLS layer).
I also came across this which sounds interesting but I have not tested it and even with such an approach you will face limitations.
Even if it may not be the answer you were looking for I hope this helps.