Adaptive Bitrate Streaming (ABR) - system-design

How does an "Adaptive Bit Rate Streaming" works? For instance, how does Netflix or Youtube manages to continue playing the video from that very timestamp with a different resolution? How do they get to know about the bandwidth or network speed of any client? What if a particular video resolution was not available at the client's nearest OCA or CDN?

Related

AVFoundation, limiting video streaming playback to WiFi only

I'm developing a player using AVFoundation. I have no control over the source videos, but they certainly violate Apple's requirements:
https://developer.apple.com/streaming/
REQUIREMENTS FOR APPS
...If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)...
I'm talking MP4 videos that are 100 MB+ for 3-to-5-minute clips.
Again, zero control over the source material, but I have to play them.
Looking at AVFoundation, none of the classes I'm using (AVPlayer, AVPlayerItem, AVQueuePlayer, and so on) has a property similar to NSURLSessionConfiguration's .allowsCellularAccess (at least none that I can see).
So the agreement with the client was to limit streaming to WiFi only, but at the moment I see no way to enforce that.
Any pointers on how to get around it?
Any help is MUCH appreciated.
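
One workaround (not discussed further in the thread) is to gate playback on the current interface with SCNetworkReachability before handing the URL to AVPlayer. A sketch, with a hypothetical reachability host, that does not yet handle mid-stream interface changes:

#import <SystemConfiguration/SystemConfiguration.h>

// Returns YES when the device is online via WiFi rather than cellular (WWAN).
static BOOL isOnWiFi(void) {
    SCNetworkReachabilityRef reachability =
        SCNetworkReachabilityCreateWithName(NULL, "example.com"); // hypothetical host
    if (reachability == NULL) return NO;
    SCNetworkReachabilityFlags flags = 0;
    Boolean ok = SCNetworkReachabilityGetFlags(reachability, &flags);
    CFRelease(reachability);
    if (!ok) return NO;
    return (flags & kSCNetworkReachabilityFlagsReachable) &&
          !(flags & kSCNetworkReachabilityFlagsIsWWAN);
}

A real implementation would also observe reachability changes and pause the player if the connection drops to cellular mid-playback.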

How to avoid audio only screen when using Apple's HTTP Live Streaming for video?

I'm working with Apple's HTTP Live Streaming protocol, which (when submitting apps to the App Store) requires that there be an audio-only stream as part of the multiplex. As a result, the first segment (10 seconds) is always audio-only and the default audio-only placeholder is shown instead of the beginning of the video stream, regardless of the amount of bandwidth actually available.
I know I can show a static image instead, but I'm wondering if there's a way to delay the start of the stream until it's determined whether there's enough bandwidth to go straight to the video stream.
The order of your bitrates in the manifest file is key, as the device tries to play the bitrate variants in the order listed. We recommend listing the bitrates from highest to lowest, so the video doesn't start off at the worst bitrate and switch up only once iOS has detected sufficient bandwidth; see the sketch below.
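
For illustration only (URIs hypothetical), a master playlist ordered that way looks like:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1600000
high/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000
mid/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio/prog_index.m3u8

The player attempts the first entry first, so the audio-only variant required for App Store review is still present as a fallback but is no longer what playback starts with.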

How does tvtak work for real-time content recognition for TV broadcast channel?

tvtak is a platform for TV content recognition.
It can auto-recognize real-time broadcasts and offline advertising video.
Tvtak's core technology is presumably real-time image matching between a front-end image (from the TV viewer's phone camera) and backend frame images (from real-time capture of the broadcast frames).
The questions are:
1. How does tvtak get the real-time broadcast channel streams? We know the channels are encrypted by the cable operator! Does tvtak need to cooperate with the cable operator, or do they get the video from some free internet broadcast stream?
2. What might tvtak's matching algorithm be?
3. How does tvtak get the electronic program guide (EPG) for all channels?
http://www.tvtak.com/developers.html says they take the real-time streams and index them on the fly:
Live TV – In the back office, TvTak indexes real-time broadcast TV channels in multiple countries. Video is not recorded but only analyzed in real time to produce the reference matching identifiers.
Cues for Pre-recorded Clips – for ad spots, movie trailers, or any other pre-recorded content, reference IDs can be generated in advance.
I doubt anyone will tell you the matching algorithm, but a common family of techniques is sketched below.
Why do they need the EPG? They have the live streams, which include things like the programme name as metadata (I assume!)
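
Purely as a flavor of how frame fingerprint matching can work (a textbook average-hash toy, not tvtak's actual method): downscale each frame to an 8x8 grayscale thumbnail, threshold every pixel against the mean to get a 64-bit fingerprint, and compare fingerprints by Hamming distance.

#include <stdint.h>

// Toy fingerprint: 1 bit per pixel of an 8x8 grayscale thumbnail,
// set when the pixel is brighter than the frame's mean.
static uint64_t averageHash(const uint8_t pixels[64]) {
    unsigned sum = 0;
    for (int i = 0; i < 64; i++) sum += pixels[i];
    uint8_t mean = (uint8_t)(sum / 64);
    uint64_t hash = 0;
    for (int i = 0; i < 64; i++)
        if (pixels[i] > mean) hash |= (1ULL << i);
    return hash;
}

// Lower Hamming distance means more similar frames; a match is a
// distance below some tuned threshold.
static int hammingDistance(uint64_t a, uint64_t b) {
    return __builtin_popcountll(a ^ b);
}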

How do you display the bitrate of the currently playing HTTP live stream?

I'm building some pages that use HTTP Live Streaming for the iPad. For some reason the video appears in very low quality, and I'm wondering if the iPad is not accurately detecting the available bandwidth.
http://m.wgbh.org/Apps/Explore/2012/1/preview_AmericasTestKitchen.cfm
...is an example. On the iPad the video quality is very poor, whereas the source file looks great. I know the iPad chooses the quality based on the available bandwidth, but even on a very fast WiFi connection it seems to pick the 110 kbps stream. The video itself is served from Amazon S3, so I know it's not a network issue.
Is there a way to expose the decision the device is making about which bitrate stream to play? Is it possible to display the bitrate of the current HTTP Live Stream on the page itself?
You can read the bitrate decisions back from the player item's access log:
// Inspect the item's access log: one event per variant played.
AVPlayerItem *thisItem = self.player.currentItem;
for (AVPlayerItemAccessLogEvent *event in [[thisItem accessLog] events]) {
    // indicatedBitrate: the bitrate advertised for the variant in the playlist.
    NSLog(@"indicated bitrate is %f", [event indicatedBitrate]);
    // observedBitrate: the network throughput the client actually measured.
    NSLog(@"observed bitrate is %f", [event observedBitrate]);
}
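
To keep a display current as the player switches variants, you can also observe new access-log entries and re-read the log in the handler (the handler selector here is hypothetical):

// Fires whenever a new entry lands in the item's access log,
// e.g. after the player switches to a different variant.
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(accessLogUpdated:) // hypothetical handler
           name:AVPlayerItemNewAccessLogEntryNotification
         object:thisItem];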

How to use HTTP Live Streaming protocol in iPhone SDK 3.0

I have developed an iPhone application and submitted it to the App Store, but it was rejected for the reason below.
Thank you for submitting your yyyyyyyy application. We have reviewed your application and have determined that it cannot be posted to the App Store at this time because it is not using the HTTP Live Streaming protocol to broadcast streaming video. HTTP Live Streaming is required when streaming video feeds over the cellular network, in order to have an optimal user experience and utilize cellular best practices. This protocol automatically determines bandwidth available to users and adjusts the bandwidth appropriately, even as bandwidth streams change. This allows you the flexibility to have as many streams as you like, as long as 64 kbps is set as the baseline feed.
In my app I have to stream prerecorded M4V and MP3 files from my server. I used MPMoviePlayerController to stream and play those videos/audio.
How do I implement the HTTP Live Streaming protocol in my app? Can I also get some sample code?
Thanks in advance!
There are many documents about Apple's HTTP Live Streaming:
HTTP Live Streaming Overview
IETF HTTP Live Streaming Internet-Draft
There are many encoder devices that claim to support this protocol, e.g. Inlet's Spinnaker, acquired by Cisco and renamed the Cisco Media Processor family.
For a software solution, take a look at Wowza.
Also note this warning from Apple's documentation:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming.
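
On the device itself there is nothing extra to implement: MPMoviePlayerController plays an HLS stream as soon as you hand it the variant playlist URL, and the OS does the adaptive switching. A sketch, with a hypothetical URL:

// Client side: point the player at the .m3u8 playlist just like any movie URL.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/video/prog_index.m3u8"];
MPMoviePlayerController *player =
    [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
[player play];

The real work is on the server: the media has to be segmented and the playlists generated, e.g. with Apple's mediafilesegmenter tool or one of the encoders mentioned above.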
