Selecting other qualities in an HLS live stream - iOS

I am trying to develop an iOS application that plays a stream using the HLS protocol. Since HLS is adaptive by default, it automatically selects the most suitable quality for my connection, and I cannot access the other qualities from my application. Is there a way to access the four streams from an m3u8 link? I am using Objective-C.

m3u8 is a text-based playlist format, so you can simply download the master playlist and parse the variant streams it describes.
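For illustration, a minimal Objective-C sketch of that approach (the synchronous I/O and bare-bones parsing are for brevity only): it collects the variant URIs that follow each #EXT-X-STREAM-INF tag in the master playlist.

```objc
// A minimal sketch, not production code: synchronously fetch the master
// playlist and collect the variant-stream URLs (the line that follows
// each #EXT-X-STREAM-INF tag). Assumes the playlist is UTF-8 text.
#import <Foundation/Foundation.h>

NSArray<NSURL *> *VariantURLsForMasterPlaylist(NSURL *masterURL) {
    NSString *playlist = [NSString stringWithContentsOfURL:masterURL
                                                  encoding:NSUTF8StringEncoding
                                                     error:NULL];
    NSMutableArray<NSURL *> *variants = [NSMutableArray array];
    BOOL expectURI = NO;
    for (NSString *line in [playlist componentsSeparatedByCharactersInSet:
                            [NSCharacterSet newlineCharacterSet]]) {
        if ([line hasPrefix:@"#EXT-X-STREAM-INF"]) {
            expectURI = YES;               // the next non-tag line is a variant URI
        } else if (expectURI && line.length > 0 && ![line hasPrefix:@"#"]) {
            // Variant URIs may be relative to the master playlist URL.
            [variants addObject:[NSURL URLWithString:line relativeToURL:masterURL]];
            expectURI = NO;
        }
    }
    return variants;
}
```

Each returned URL is itself a playable m3u8, so handing one of them to AVPlayer locks playback to that single quality.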

Related

Chromecast with AVPlayer iOS

I'm playing videos using AVPlayer in my iOS application, and now I want to add Chromecast support.
1. As per this link, the Chromecast button appears while a video is playing. Is it the same case with AVPlayer?
2. As per Apple's requirements, my videos are encoded and delivered in the m3u8 format. Can Chromecast play that?
Well, you can check the Google Cast documentation; it includes API libraries and sample application code to help your applications go big. These APIs are documented in the API references, and the sample code is discussed in the Sender Applications and Receiver Applications overviews.
To answer whether you can play the m3u8 format on Chromecast: first, check the Supported Media for Google Cast page, which lists all the media facilities and types the Google Cast platform supports.
Note that some of these require additional coding or the Media Player Library; see Receiver Applications for more information about developing your receiver application to support these media types. A hedged loading sketch follows the links below.
For more information, check these SO questions:
ChromeCast doesnt play HLS in .m3u8 format
Streaming .m3u8 format using Chromecast
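If the receiver supports the format, loading the stream from the sender side is short. Here is a hedged Objective-C sketch; the API names are taken from the Google Cast iOS sender SDK v4 (adjust for your SDK version), and the URL is a placeholder:

```objc
// Hedged sketch using the Google Cast iOS sender SDK v4 (API names are
// assumed from that SDK; the URL is a placeholder).
#import <GoogleCast/GoogleCast.h>

- (void)castHLSStream {
    NSURL *streamURL = [NSURL URLWithString:@"https://example.com/master.m3u8"];
    GCKMediaInformationBuilder *builder =
        [[GCKMediaInformationBuilder alloc] initWithContentURL:streamURL];
    builder.streamType = GCKMediaStreamTypeBuffered;
    builder.contentType = @"application/x-mpegURL"; // HLS MIME type

    // AVPlayer keeps handling local playback; the Cast SDK drives the remote device.
    GCKCastSession *session =
        [GCKCastContext sharedInstance].sessionManager.currentCastSession;
    [session.remoteMediaClient loadMedia:[builder build]];
}
```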

Live video streaming formats supported in iOS programming

I am working on live video streaming in an iPad app. I have done it previously using MPMediaPlayer with HTTP servers.
From the link here I understand that "iOS devices support HTTP progressive download for .mp4 files; the server could simply be Apache or Nginx. The user experience is quite similar to HTTP Live Streaming. RTSP is also possible: you can port live555 to the iOS platform as the RTSP client and use DarwinStreamingServer as the RTSP server."
But my client has provided video streaming URLs in UDP format (e.g. udp://225.X.X.X:XXXXX) and also suggested I refer to link1 and link2 in support of his statement that this will work in iOS for live video streaming.
But I am unable to relate the links he provided to the requirement.
My doubts are:
What formats does iOS support for live video streaming?
Is this UDP link of any use to me for video streaming on iOS?
On WiFi, iOS supports everything, because you have access to raw sockets and to h.264 decoding via VideoToolbox, so any protocol can be implemented even if there is no out-of-the-box support. If you want the video to work over a cellular network, it MUST use HLS; there are no other options. (Unless you are Facebook, in which case Apple will grant exceptions to this policy.)
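To illustrate the VideoToolbox route mentioned above, here is a hedged sketch of the decode entry point, assuming your custom protocol has already delivered the h.264 SPS/PPS parameter sets; error handling is omitted for brevity.

```objc
// Hedged sketch: build a format description from SPS/PPS and create a
// VideoToolbox decompression session for the incoming h.264 stream.
#import <VideoToolbox/VideoToolbox.h>

static void DidDecodeFrame(void *decompressionOutputRefCon,
                           void *sourceFrameRefCon,
                           OSStatus status,
                           VTDecodeInfoFlags infoFlags,
                           CVImageBufferRef imageBuffer,
                           CMTime presentationTimeStamp,
                           CMTime presentationDuration) {
    // Hand the decoded CVImageBufferRef to your renderer here.
}

static VTDecompressionSessionRef MakeDecoder(const uint8_t *sps, size_t spsSize,
                                             const uint8_t *pps, size_t ppsSize) {
    CMVideoFormatDescriptionRef format = NULL;
    const uint8_t *const paramSets[2] = { sps, pps };
    const size_t paramSizes[2] = { spsSize, ppsSize };
    CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault, 2, paramSets, paramSizes,
        4 /* AVCC length-prefix size */, &format);

    VTDecompressionOutputCallbackRecord callback = { DidDecodeFrame, NULL };
    VTDecompressionSessionRef session = NULL;
    VTDecompressionSessionCreate(kCFAllocatorDefault, format,
                                 NULL, NULL, &callback, &session);
    CFRelease(format);
    return session; // feed CMSampleBuffers via VTDecompressionSessionDecodeFrame
}
```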

Google Cast Media Player Library - for streaming from Local Device

Despite reading the documentation, it is not clear to me exactly what the "Google Cast Media Player Library" is and whether it is the route I need to take for my Chromecast app.
What I am trying to achieve is to play media from my local iOS device on Chromecast. My main aim is to play users' videos and photos, not necessarily DRM media.
Up till now I have been doing this by exporting the AVAsset and then passing its file address to a simple HTTP server. This seems horribly inefficient, and I thought I could use AVAssetReader to pass a stream to Chromecast. During my research I came across the terms:
MPEG-DASH
SmoothStreaming
HTTP Live Streaming (HLS)
But I do not understand whether I need such complex implementations.
I find the name Google Cast Media Player Library very ambiguous, and there is no concise explanation of what it is.
https://developers.google.com/cast/docs/player
This is a piece of the definition given there:
... It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.
I hope this is not ambiguous: if your media has encryption and/or you are dealing with adaptive streams of the types specified (HLS, ...), then this library can help you. If you are playing a simple mp4 or showing images, you don't need it.
There are plenty of posts in this forum on how to cast local media; it amounts to embedding a tiny web server in your sender app and then sending the URL of the media (now exposed through your embedded web server) to the Chromecast, and having your receiver show or play that media item via that URL. A sketch of this approach follows below.
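As one hedged illustration of that approach, the sketch below uses the third-party GCDWebServer library (an assumption; any embedded HTTP server works) to expose a local file, then loads the resulting URL with the Cast v4 sender API as in the earlier snippet. The port, file path, and content type are placeholders.

```objc
// Hedged sketch of the embedded-web-server approach, assuming GCDWebServer.
#import <GCDWebServer/GCDWebServer.h>
#import <GoogleCast/GoogleCast.h>

- (void)castLocalFileAtPath:(NSString *)path {
    // In real code, keep a strong reference (e.g. an ivar) so the server
    // outlives this method.
    GCDWebServer *server = [[GCDWebServer alloc] init];
    // Serve the file's directory; range requests let the receiver seek.
    [server addGETHandlerForBasePath:@"/"
                       directoryPath:[path stringByDeletingLastPathComponent]
                       indexFilename:nil
                            cacheAge:0
                  allowRangeRequests:YES];
    [server startWithPort:8080 bonjourName:nil];

    // serverURL resolves to http://<device-ip>:8080/, reachable by the Chromecast.
    NSURL *mediaURL = [server.serverURL
        URLByAppendingPathComponent:[path lastPathComponent]];

    GCKMediaInformationBuilder *builder =
        [[GCKMediaInformationBuilder alloc] initWithContentURL:mediaURL];
    builder.contentType = @"video/mp4"; // a plain mp4 needs no Media Player Library
    [[GCKCastContext sharedInstance].sessionManager
        .currentCastSession.remoteMediaClient loadMedia:[builder build]];
}
```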

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC in one direction, i.e., the iOS device will send a video stream to the server/PC but not the opposite.
After some googling and documentation browsing, it appears there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad does the streaming. From what appears on Apple's website, I understand that HLS is to be used from an encoding perspective server-side and a decoding perspective iOS-side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation and capture camera frames, compress them frame by frame, and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library; you can find it at github.com/jgh-/VideoCore.
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
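For reference, a usage sketch based on the VCSimpleSession header linked above; the RTMP URL and stream key are placeholders, and the import path depends on how you integrate the pod.

```objc
// Usage sketch for VideoCore's simplified API (inside a view controller).
#import "VCSimpleSession.h"

VCSimpleSession *session =
    [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720)
                                     frameRate:30
                                       bitrate:1000000
                       useInterfaceOrientation:NO];
session.previewView.frame = self.view.bounds;   // local camera preview
[self.view addSubview:session.previewView];
[session startRtmpSessionWithURL:@"rtmp://example.com/live"   // placeholder
                    andStreamKey:@"myStreamKey"];             // placeholder
```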

How to convert a 'Smooth Streaming' URL to Apple HLS format?

In an iOS app for live streaming, I am getting a response from the server like this (you can check):
http://4a75a0cce3694e29bc670b3d574fec92.cloudapp.net/push.isml/manifest
which is a Smooth Streaming file.
How can I play this file in my iOS app? Is there any runtime converter to convert this file to Apple HLS? Or any player for Smooth Streaming, like:
OSMF plugin for iOS
HTML5 player
Silverlight plugin for iOS
Update:
We have not tried much with the Azure framework. We set up the IIS server and got the live streaming working, playing the stream via an HTML5 video tag in a web view, by following this link:
http://www.hanselman.com/blog/CommentView.aspx?guid=86968cd5-feeb-47f2-b02e-1eb4fa556379#commentstart
With that we were able to play live streaming on our iOS devices. Still, I would be happy if I could configure the Windows Azure framework to do the same.
You can use Windows Azure Media Services to convert Smooth Streaming media content into Apple HLS content. Either you use the Management Portal to upload your media assets and then encode them with the preset "Playback on iOS devices and PC/MAC", or you use the REST API for Windows Azure Media Services at runtime.
With the REST API you can use the Windows Azure Media Packager to encode Smooth Streaming content for HLS. Find a sample configuration for that task here.
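Once the content has been repackaged as HLS, playback in the iOS app is a plain AVPlayer call. A minimal sketch, with a placeholder URL standing in for the HLS locator that Media Services produces:

```objc
// Minimal playback sketch; the URL is a placeholder for the HLS locator
// returned by Media Services after packaging (inside a view controller).
#import <AVFoundation/AVFoundation.h>

NSURL *hlsURL = [NSURL URLWithString:@"https://example.cloudapp.net/push.isml/playlist.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:hlsURL];
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = self.view.bounds;
[self.view.layer addSublayer:layer];
[player play];
```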
