In our app, we have an HLS index file that contains multiple HLS streams at different bandwidths. Our lowest-bandwidth stream is audio only.
We want to show a message to our users when playback has degraded to the audio-only stream.
Is there a way to detect when we have switched to an audio-only stream?
Thanks!
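Assuming the app plays the stream with AVPlayer (the question does not say which player is used), one possible sketch is to watch the player item's access log and presentation size. The threshold constant and function name below are made up for illustration; set the threshold just above the BANDWIDTH of the audio-only variant in your master playlist.

#import <AVFoundation/AVFoundation.h>

// Made-up threshold: anything advertised below this is treated as the audio-only variant.
static const double kAudioOnlyBitrateThreshold = 100000.0;

static void ObserveAudioOnlySwitch(AVPlayerItem *item) {
    [[NSNotificationCenter defaultCenter]
        addObserverForName:AVPlayerItemNewAccessLogEntryNotification
                    object:item
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        // The latest access-log event describes the variant currently being played.
        AVPlayerItemAccessLogEvent *event = item.accessLog.events.lastObject;
        BOOL lowBitrate = event.indicatedBitrate > 0 &&
                          event.indicatedBitrate < kAudioOnlyBitrateThreshold;
        // presentationSize is CGSizeZero when the current variant carries no video.
        BOOL noVideo = CGSizeEqualToSize(item.presentationSize, CGSizeZero);
        if (lowBitrate || noVideo) {
            NSLog(@"Playback has degraded to the audio-only stream");
            // Show the user-facing message here.
        }
    }];
}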
I am streaming an m3u8 URL in an iOS application.
I am referring to Apple's HLS Catalog sample application.
I am able to stream the URL and also download it for offline use.
Playing back the offline copy also works.
Is it possible to stream the video and save it only partially (e.g., if the user stops the video, I need to keep the portion that was downloaded before the user cancelled)?
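A rough sketch of keeping the partial download, assuming the AVAssetDownloadTask approach used by the HLS Catalog sample (iOS 10+). The PartialDownloader class, session identifier, asset title, and user-defaults key are all made up for illustration. As far as I can tell, when the task is cancelled the delegate still reports the location of the partially downloaded asset package, so persisting that location keeps the part downloaded before the user stopped.

#import <AVFoundation/AVFoundation.h>

@interface PartialDownloader : NSObject <AVAssetDownloadDelegate>
@property (nonatomic, strong) AVAssetDownloadURLSession *downloadSession;
@property (nonatomic, strong) AVAssetDownloadTask *task;
@end

@implementation PartialDownloader

- (void)startDownloadForURL:(NSURL *)streamURL {
    // AVAssetDownloadURLSession requires a background session configuration.
    NSURLSessionConfiguration *config =
        [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"hls-download"];
    self.downloadSession = [AVAssetDownloadURLSession sessionWithConfiguration:config
                                                         assetDownloadDelegate:self
                                                                 delegateQueue:[NSOperationQueue mainQueue]];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:streamURL options:nil];
    self.task = [self.downloadSession assetDownloadTaskWithURLAsset:asset
                                                          assetTitle:@"My Stream"
                                                    assetArtworkData:nil
                                                             options:nil];
    [self.task resume];
}

// Called even when the task is cancelled: 'location' points at the partially
// downloaded asset package; persist it instead of deleting it to keep the partial copy.
- (void)URLSession:(NSURLSession *)session
 assetDownloadTask:(AVAssetDownloadTask *)assetDownloadTask
didFinishDownloadingToURL:(NSURL *)location {
    [[NSUserDefaults standardUserDefaults] setObject:location.relativePath
                                              forKey:@"partialAssetPath"];
}

// Cancelling stops the download but leaves whatever has been fetched so far on disk.
- (void)stop {
    [self.task cancel];
}

@end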
This question is in regard to DailyMotion's hls.js API.
My goal is to save on data usage when not connected to WiFi by playing only the audio portion of an HLS video stream.
I have looked at similar questions for other APIs but have not found anything relevant to the hls.js API.
Details:
I tested my live stream HLS file on your demo page. It identified 1 audio track and displayed it in the Audio Track Controls. At the bottom of this post I am including the format of my HLS file with identifying info changed.
Question:
Will the hls.js API allow me to force playback to audio only once I have detected the lack of a WiFi connection? What setting or command would I use to do that? Alternatively, can I force playback to the lowest resolution?
Thanks,
RKern
HLS File Format:
#EXTM3U
#EXT-X-VERSION:5
#EXT-UPLYNK-LIVE
#EXT-X-START:TIME-OFFSET=0.00
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aac",NAME="unspecified",LANGUAGE="en",AUTOSELECT=YES,DEFAULT=YES
#UPLYNK-MEDIA0:416x234x30,baseline-13,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=416x234,BANDWIDTH=471244,CODECS="mp4a.40.5,avc1.42000d",FRAME-RATE=30.000,AUDIO="aac",AVERAGE-BANDWIDTH=411975
http://content-ause1.uplynk.com/channel/test/d.m3u8?pbs=test
#UPLYNK-MEDIA0:704x396x30,main-30,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=704x396,BANDWIDTH=873267,CODECS="mp4a.40.5,avc1.4d001e",FRAME-RATE=30.000,AUDIO="aac",AVERAGE-BANDWIDTH=688830
http://content-ause1.uplynk.com/channel/test/e.m3u8?pbs=test
#UPLYNK-MEDIA0:896x504x30,main-31,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=896x504,BANDWIDTH=1554841,CODECS="mp4a.40.5,avc1.4d001f",FRAME-RATE=30.000,AUDIO="aac",AVERAGE-BANDWIDTH=1171051
http://content-ause1.uplynk.com/channel/test/f.m3u8?pbs=test
#UPLYNK-MEDIA0:1280x720x30,main-31,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=1280x720,BANDWIDTH=3328000,CODECS="mp4a.40.5,avc1.4d001f",FRAME-RATE=30.000,AUDIO="aac",AVERAGE-BANDWIDTH=2414865
http://content-ause1.uplynk.com/channel/test/g.m3u8?pbs=test
#UPLYNK-MEDIA0:192x108x15,baseline-11,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=192x108,BANDWIDTH=136226,CODECS="mp4a.40.5,avc1.42000b",FRAME-RATE=15.000,AUDIO="aac",AVERAGE-BANDWIDTH=120009
http://content-ause1.uplynk.com/channel/test/b.m3u8?pbs=test
#UPLYNK-MEDIA0:256x144x30,baseline-12,2x48000
#EXT-X-STREAM-INF:PROGRAM-ID=1,RESOLUTION=256x144,BANDWIDTH=259601,CODECS="mp4a.40.5,avc1.42000c",FRAME-RATE=30.000,AUDIO="aac",AVERAGE-BANDWIDTH=232565
http://content-ause1.uplynk.com/channel/test/c.m3u8?pbs=test
I'm making an iOS video player application that gets a whole bunch of URLs of videos from a web service and plays them in an MPMoviePlayerViewController.
Most of the videos are just .mp4 or .mov files, but some of them are live HTTP streams. All I have is a URL, which doesn't indicate whether it points to a file or a stream.
Since I'm using MPMovieControlStyleNone and my own video controls, I need to be able to detect whether a video is a file or a live stream so I can adjust my controls.
Is there any way I can detect that?
Use an HTTP HEAD request. If the Content-Type is application/vnd.apple.mpegurl, then it's an HTTP Live Streaming (HLS) stream.
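For reference, a sketch of that check with NSURLSession; the helper name is illustrative, and since some servers report HLS playlists as application/x-mpegURL rather than application/vnd.apple.mpegurl, both MIME types are accepted here.

#import <Foundation/Foundation.h>

static void CheckIfURLIsHLSStream(NSURL *url, void (^completion)(BOOL isHLSStream)) {
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    request.HTTPMethod = @"HEAD";
    NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithRequest:request
          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
              NSString *mimeType = response.MIMEType.lowercaseString;
              BOOL isHLS = [mimeType isEqualToString:@"application/vnd.apple.mpegurl"] ||
                           [mimeType isEqualToString:@"application/x-mpegurl"];
              dispatch_async(dispatch_get_main_queue(), ^{
                  completion(isHLS);   // e.g. switch the custom controls into "live" mode
              });
          }];
    [task resume];
}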
I have a test application that uses AVPlayer to play video specified by an m3u8 HLS playlist. The playlist specifies several alternate audio streams, similar to the "Listing 10" sample playlist provided by Apple, found here: http://developer.apple.com/library/ios/#technotes/tn2288/_index.html#//apple_ref/doc/uid/DTS40012238-CH1-ALTERNATE_MEDIA The app needs to be able to switch among the alternate audio streams while the video is playing. For example, the app should be able to switch among the English, French, and Spanish audio streams when the user taps buttons in the app during playback.
Which AVFoundation classes and methods would be used by the AVPlayer and its related objects to switch among the audio streams that are specified in the m3u8 playlist? I have looked at the AVFoundation class documentation but do not see how to do this.
A link to some sample code that shows how to do this would be great. I have been searching the web for this information without success. Thanks for any help with this.
For m3u8 playback with AVPlayer, it looks like you cannot use an AVAsset to construct an AVPlayerItem. You need to construct the AVPlayerItem from the URL directly. After instantiating an AVPlayer with this AVPlayerItem and KVO-observing its @"status" property, the asset is available via [[avPlayerInstance currentItem] asset] once the status is AVPlayerStatusReadyToPlay. This is described on page 20 of the AV Foundation Programming Guide.
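Here is a minimal sketch of that setup; the PlayerController class name and the playlist URL are placeholders, not part of the original answer.

#import <AVFoundation/AVFoundation.h>

@interface PlayerController : NSObject
@property (nonatomic, strong) AVPlayer *avPlayerInstance;
@end

@implementation PlayerController

- (void)startPlayback {
    NSURL *url = [NSURL URLWithString:@"https://example.com/master.m3u8"];
    // Build the AVPlayerItem from the URL directly rather than from an AVAsset.
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    self.avPlayerInstance = [AVPlayer playerWithPlayerItem:item];
    [self.avPlayerInstance addObserver:self
                            forKeyPath:@"status"
                               options:NSKeyValueObservingOptionNew
                               context:NULL];
    [self.avPlayerInstance play];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSKeyValueChangeKey, id> *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"status"] &&
        self.avPlayerInstance.status == AVPlayerStatusReadyToPlay) {
        // The asset and its media selection groups are now safe to query.
        NSLog(@"asset: %@", [[self.avPlayerInstance currentItem] asset]);
    }
}

@end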
To switch the audio to one of the alternates, use:
AVMediaSelectionGroup *audioSelectionGroup = [[[avPlayerInstance currentItem] asset] mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible];
NSLog(@"audioSelectionGroup: %@", audioSelectionGroup);
// [audioSelectionGroup options] // Array of the options in the group above.
And then select the AVMediaSelectionOption (the audio track you want) with:
[[avPlayerInstance currentItem] selectMediaOption:avMediaSelectionOptionInstance inMediaSelectionGroup:audioSelectionGroup];
The same approach works for video.
This is described in the "Selection of audio and subtitle media according to language and other criteria" section of the AV Foundation Release Notes for iOS 5 (3rd section).