AVFoundation, limiting video streaming playback to WiFi only - ios

I'm developing a player using AVFoundation. I have no control over the source videos, but they certainly violate:
https://developer.apple.com/streaming/
REQUIREMENTS FOR APPS
...If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)...
I'm talking about MP4 videos that are 100 MB+ for 3-to-5-minute clips.
Again, 0 control over the source material but I have to play them.
Looking at AVFoundation, none of the classes I'm using (AVPlayer, AVPlayerItem, AVQueuePlayer, and so on) has a property similar to NSURLSessionConfiguration's .allowsCellularAccess (at least that I can see).
So the agreement with the client was to limit streaming to Wi-Fi only, but at the moment I see no way to force that.
Any pointers on how to get around it?
Any help is MUCH appreciated.
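For what it's worth, one workaround is to fetch the media yourself through an AVAssetResourceLoaderDelegate (registered for a custom URL scheme) and do the actual downloading with your own URLSession, since URLSessionConfiguration does expose allowsCellularAccess; on iOS 10 and later you can also pass AVURLAssetAllowsCellularAccessKey directly in the AVURLAsset options. A minimal sketch of the session side (the helper name is mine):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Sketch: a session configuration that refuses to touch cellular.
// In the player you would feed data fetched with this session to an
// AVAssetResourceLoaderDelegate (behind a custom URL scheme), or, on
// iOS 10+, skip all of this and create the asset with
//   AVURLAsset(url: url, options: [AVURLAssetAllowsCellularAccessKey: false])
func wifiOnlySessionConfiguration() -> URLSessionConfiguration {
    let config = URLSessionConfiguration.default
    config.allowsCellularAccess = false  // Wi-Fi (or wired) only
    return config
}
```

With this configuration, any URLSessionTask that can only be satisfied over cellular fails instead of transferring data.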

Related

Can I stream an MP4 video of duration > 10 minutes on iOS

I'm building an app that is playing different streaming videos. The file that I'm playing in my AVPlayer object is an MP4 file.
Reading through the App Store Review Guidelines I just noticed that the rule 2.5.7 says:
Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
What does this mean exactly? Can I stream an MP4 video which is longer than 10 minutes?
If your MP4 video is less than 10 minutes then presumably you can just put it on a server somewhere and have the player download the file (progressive download) - you don't need to use a streaming protocol like HLS. However, if your video is more than 10 minutes then you must use HLS. This means segmenting your video into chunks and creating a playlist for them. You can do this with Apple's streaming tools - such as mediafilesegmenter - or you can use ffmpeg to segment your videos.
That guideline is for cellular networks only, so it doesn't apply if the user is connected via wifi. Take a look at Apple's recommendations for encoding your video(s) for HLS.
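The segmenting step described above produces a media playlist alongside the chunks. A minimal sketch of what that playlist looks like (segment names and durations are made up; ffmpeg can generate one with something like `ffmpeg -i input.mp4 -codec copy -f hls -hls_time 10 -hls_list_size 0 prog.m3u8`):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:8.5,
segment2.ts
#EXT-X-ENDLIST
```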
It explicitly says you CAN play a video longer than 10 minutes. However, the guidelines say your app will be rejected if it doesn't meet the requirements stated.
Admittedly I've never attempted a 10-minute video playback, but the documentation seems to imply your media must allow 192 kbps playback (presumably for cellular data plans) and also must conform to the HTTP Live Streaming protocol.
Here is the technical documentation Apple provides on HTTP Live streaming
https://developer.apple.com/library/ios/technotes/tn2224/_index.html
Best of luck! Please let me know if I can help with anything more specific :)

AVPlayer seekToTime downloads an insane amount of media segment files, consuming a lot of data

I'm working on an app where I can play an HLS m3u8 playlist of a streaming radio (audio only) without any problem using an instance of AVPlayer. Using Charles I can see how the playlist is updated properly at a normal pace (every 9-10 seconds, fetching one media segment file). When I perform a seekToTime: (back in time), the player succeeds in playing the stream from where I want, but in Charles I observe the player start downloading a huge number of media segment files, consuming a lot of data. It seems that the player downloads all the media segment files up to that time and then resumes its normal behaviour.
I understand that the correct behaviour would be to download the media segment file for the time I'm seeking to, start playing it, and then steadily download one or two media segment files every 9-10 seconds, as it does when I play the stream without timeshift.
My question is whether this is normal behaviour, or if something could be wrong with my m3u8 playlist or the client implementation. Can anyone help me clarify this?
UPDATED: I can confirm this doesn't happen in iOS 7, so it seems to be a bug introduced by iOS 8.
I've been told by Apple that this is not a bug, but a feature. They've made the buffer bigger since iOS 8.

HLS streaming, advices about segment size configuration

We are developing a mobile application which will need to play 10-second videos.
The first version will only support iOS (iPhone & iPad). To have a good quality on all devices we will use Adaptive Streaming.
I thoroughly read the Apple HLS documentation and it seems that 10 seconds is a good tradeoff for the size of the HLS segments.
So if we use the default 10 seconds, segmentation is not really useful in our case (each video fits in a single segment).
As we are in a mobile app with very small videos, I'm wondering if, for some devices or network conditions, changing this 10 s "default" to a smaller value could be better?
Is it possible to speed up the starting of the video by lowering this value?
I suppose the 10-second "default" we find everywhere is a good choice and advice for videos which have to be played "everywhere" (desktop, smartphone, tablet), but perhaps another value would be more appropriate for smartphones only?
Finally do you think that in our case HLS is not a good choice and that simply using progressive download of an MP4 video is better?
Thanks in advance for your responses.
MP4 will be better. Adaptive streaming works well for long content, but very poorly for short videos. The player will not have enough time to adapt, and your viewers will almost always just see the default quality.
Apple only requires HLS for videos that are longer than X seconds (I can't remember exactly what X is, but it is larger than 10)
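To make the startup question concrete: Apple's player has commonly been observed to buffer around three segments before starting playback (an observation, not a documented guarantee), so the segment duration directly scales the startup buffer:

```swift
// Rough startup-buffer estimate for HLS playback.
// Assumption (not a spec guarantee): the player wants ~3 segments
// buffered before it starts rendering.
func startupBufferSeconds(segmentDuration: Double, segmentsBuffered: Int = 3) -> Double {
    return segmentDuration * Double(segmentsBuffered)
}

let withTenSecondSegments = startupBufferSeconds(segmentDuration: 10)  // 30.0 s
let withTwoSecondSegments = startupBufferSeconds(segmentDuration: 2)   //  6.0 s
```

For a 10-second clip, the 10 s default means a single segment, so there is nothing to adapt between, which is exactly why the answer above leans toward plain MP4.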

How to avoid audio only screen when using Apple's HTTP Live Streaming for video?

I'm working with Apple's HTTP Live Streaming protocol which, when submitting apps to the App Store, requires that there be an audio-only stream as part of the multiplex. As a result, the first segment (10 seconds) is always audio only and the image below is shown instead of the beginning of the video stream, regardless of the amount of bandwidth that's actually available.
I know I can show a static image instead but I'm wondering if there's a way to delay the stream starting until it's determined if there's enough bandwidth to go straight to the video stream.
The order of your bitrates in the manifest file is key, as the device tries to play the bitrate segments in order. We recommend listing the bitrates from highest to lowest to avoid starting the video off with the worst bitrate and then switching only once iOS has detected sufficient bandwidth.
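In a master playlist that ordering is simply the order of the `#EXT-X-STREAM-INF` entries; the player fetches segments from the first listed variant before it has any bandwidth measurement. A sketch with made-up bandwidths and filenames:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1800000,RESOLUTION=1280x720
video_1800k.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=640x360
video_600k.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio_only_64k.m3u8
```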

Streaming audio: HTTP Live Streaming a must for app store approval?

I'm on the eve of submitting an app to the store that streams audio over cellular and Wi-Fi, and realizing that the app may be in danger of getting rejected.
The app is for a radio station with an existing streaming architecture, and setting up the HTTP Live Streaming protocol would add a fifth and sixth stream to the mix -- potentially a very complicated setup. So, to minimize complexity on the station's end, the app code currently uses the iphone_radio open-source library to get the streams to work. According to the creator of that library, it is used in an app that is in the store, Radio Javan.
A quick Google finds many different rejection cases for video streaming, but few if any for audio. Apple's policy on HTTP Live Streaming is not very clear about audio:
If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)

If your app uses HTTP Live Streaming over cellular networks, you are required to provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).

These requirements apply to iOS apps submitted for distribution in the App Store for use on Apple products. Non-compliant apps may be rejected or removed, at the discretion of Apple.
One line that jumps out, though, is the 64 Kbps. The current streams are 128 Kbps, though dropping them down to 64 Kbps is relatively trivial compared to switching them to HTTP Live Streaming.
Is it even worth submitting the app to the store as-is (128 Kbps streams), or am I pretty much guaranteed to get rejected for not using the Live Streaming protocol? What about if I drop the streams down to 64 Kbps?
From App Store Review Guidelines :
9.3 Audio streaming content over a cellular network may not use more than 5MB over 5 minutes
9.4 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 64 kbps HTTP Live stream.
https://developer.apple.com/appstore/resources/approval/guidelines.html#media-content
So the App Store Guidelines don't set a requirement for an AUDIO stream to include a 64 kbps HTTP Live stream.
According to this, there isn't even a requirement to use HLS unless the audio stream uses more than 5 MB of data over 5 minutes.
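Whether a given bitrate trips the 5 MB / 5 minute rule in 9.3 is quick arithmetic (treating 1 MB as 1,000,000 bytes, since Apple doesn't specify decimal vs. binary, and ignoring container and HTTP overhead):

```swift
// Data transferred by a constant-bitrate stream over a window, in megabytes.
// kbps = kilobits per second; 1 MB = 1_000_000 bytes (an assumption --
// Apple doesn't say whether the 5 MB limit is decimal or binary).
func megabytes(kbps: Double, minutes: Double) -> Double {
    return kbps * 1000.0 / 8.0 * minutes * 60.0 / 1_000_000.0
}

let at128 = megabytes(kbps: 128, minutes: 5)  // 4.8 MB -- just under the limit
let at64  = megabytes(kbps: 64,  minutes: 5)  // 2.4 MB
```

So the 128 kbps streams in the question sit at roughly 4.8 MB per 5 minutes: under the stated limit, but with essentially no headroom once overhead is counted; dropping to 64 kbps halves that.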
I have an app in the store with HTTP Live Streaming for audio only. No problems from Apple on approval, but we use 48 kbps streams only.