Despite reading the documentation, it is still not clear to me exactly what the "Google Cast Media Player Library" is and whether it is the route I need to take for my Chromecast app.
What I am trying to achieve is to play media from my local iOS device on Chromecast. My main aim is to play users' videos and photos, not necessarily DRM media.
Up till now I have been doing this by exporting the AVAsset and then passing the file address to a simple HTTP server. This seems horribly inefficient, and I thought I could use AVAssetReader to pass a stream to Chromecast instead. During my research I came across the terms:
MPEG-DASH
SmoothStreaming
HTTP Live Streaming (HLS)
But I do not understand whether I need such complex implementations.
I find the name "Google Cast Media Player Library" to be very ambiguous, and there is no concise explanation of what it is.
https://developers.google.com/cast/docs/player
This is a piece of the definition given there:
... It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.
I hope this is not ambiguous: if your media is encrypted and/or you are dealing with adaptive streams of the types specified (HLS, MPEG-DASH, Smooth Streaming), then this library can help you. If you are playing a simple mp4 or showing images, you don't need this library.
There are plenty of posts in this forum on how to cast local media; it amounts to embedding a tiny web server in your sender app and then sending the URL of the media (now exposed through your embedded web server) to the Chromecast, and having your receiver show or play that media item via that URL.
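A minimal sketch of that approach, assuming the open-source GCDWebServer library for the embedded server and a v4-style Google Cast iOS SDK on the sender side; the directory path and file name are placeholders:

```swift
import GCDWebServer
import GoogleCast

// Serve a local directory over HTTP so the Chromecast can fetch files from it.
let mediaDirectory = NSTemporaryDirectory() // placeholder: wherever you exported the AVAsset
let server = GCDWebServer()
server.addGETHandler(forBasePath: "/",
                     directoryPath: mediaDirectory,
                     indexFilename: nil,
                     cacheAge: 3600,
                     allowRangeRequests: true) // range requests let the receiver seek
server.start(withPort: 8080, bonjourName: nil)

// Build a media item pointing at the locally served file and load it on the Cast session.
let mediaURL = server.serverURL!.appendingPathComponent("movie.mp4") // placeholder file name
let builder = GCKMediaInformationBuilder(contentURL: mediaURL)
builder.contentType = "video/mp4"
builder.streamType = .buffered
GCKCastContext.sharedInstance().sessionManager
    .currentCastSession?.remoteMediaClient?
    .loadMedia(builder.build())
```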
Related
In my app, I want to display some mp4 tutorial videos for the user using AVPlayerViewController. I upgraded to a Pro account on Vimeo, enabling me to use the direct links to my videos.
It gives you a couple of options for which kind of link to use: high def, standard def, and HTTP Live Streaming. I'm a little confused about which to use. My videos aren't live streamed, but I see that HTTP Live Streaming can dynamically adjust the size of the file according to the user's internet connection.
I don't know much about video; does HTTP Live Streaming make sense here if I'm not streaming anything live, or should I just have the user download the entire video?
It's a bit of a misnomer - HTTP Live Streaming (HLS) is just the name of the protocol and is not necessarily used for streaming of live content.
HLS is simply a method used for serving the best quality video file (pre-recorded/pre-saved) for the given viewing environment.
Apple's HLS documentation is found here: https://developer.apple.com/streaming/
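From the app's point of view there is nothing special to do for HLS: AVPlayer accepts an .m3u8 URL just like a plain file URL and handles the quality switching itself. A minimal sketch, with a hypothetical Vimeo-style link:

```swift
import AVKit
import AVFoundation

// Inside a UIViewController. The HLS link is hypothetical; AVPlayer treats it
// like any other URL and picks the best variant for the current connection.
let hlsURL = URL(string: "https://player.vimeo.com/external/12345.m3u8")!
let playerViewController = AVPlayerViewController()
playerViewController.player = AVPlayer(url: hlsURL)
present(playerViewController, animated: true) {
    playerViewController.player?.play()
}
```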
I'm playing videos using AVPlayer in my iOS application, and now want to add Chromecast support.
1- As per this link, we can view the Chromecast button when a video is playing. Is it the same case with AVPlayer?
2- As per Apple's requirements, my videos are encoded and in m3u8 format. Can we play that on Chromecast?
Well, you can check this Google Cast documentation; it includes API libraries and sample application code to help your applications go big. These APIs are documented in the API references, and the sample code is discussed in the Sender Applications and Receiver Applications overviews.
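Regarding question 1, the Cast button is not tied to AVPlayer at all; with a v4-style Cast iOS SDK you add it to your own UI, for example in the navigation bar. A minimal sketch:

```swift
import GoogleCast
import UIKit

// Inside your player view controller: show the standard Cast button.
// It becomes active when Cast devices are discovered on the network.
let castButton = GCKUICastButton(frame: CGRect(x: 0, y: 0, width: 24, height: 24))
castButton.tintColor = .gray
navigationItem.rightBarButtonItem = UIBarButtonItem(customView: castButton)
```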
To answer the question of whether you can play the m3u8 format on Chromecast: first, check this Supported Media for Google Cast to learn all the supported media types and facilities in Google Cast.
Note that some of these require additional coding or the Media Player Library. See Receiver Applications for more information about developing your receiver application to support these media types.
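Assuming your m3u8 URL is among the supported formats, loading it from an iOS sender is the same call as for an mp4, just with the HLS content type; a sketch with a hypothetical URL:

```swift
import GoogleCast

// Hypothetical HLS manifest URL. Note the receiver needs CORS headers from
// the media server to play HLS; missing CORS is a common stumbling block.
let hlsURL = URL(string: "https://example.com/video/master.m3u8")!
let builder = GCKMediaInformationBuilder(contentURL: hlsURL)
builder.contentType = "application/x-mpegurl" // HLS manifest MIME type
builder.streamType = .buffered
GCKCastContext.sharedInstance().sessionManager
    .currentCastSession?.remoteMediaClient?
    .loadMedia(builder.build())
```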
For more information, check these SO questions:
ChromeCast doesnt play HLS in .m3u8 format
Streaming .m3u8 format using Chromecast
I am new to live streaming of data. I have been exploring the web for how to live stream video. I am an iOS developer and I want to develop an app that streams video.
I am clear on the fundamentals of live video streaming. I have learned that I will need a streaming media server to feed the stream to viewers, and that the viewer needs a player which decodes the data and synchronizes the audio/video streams.
Now, Wowza is one kind of streaming media server that is often recommended. But I have the following questions:
(1) Why a media server? Why can't we have our own media server? What does a media server actually do that makes its role necessary?
(2) In my app, I will have to integrate a library for encoding and feeding the stream to a streaming server like Wowza. But how would it be fed to the streaming server?
(3) How will my server communicate with a streaming server like Wowza?
(4) How will Wowza feed the stream to the receiving side, i.e. the user who has an iPhone and needs to see a live stream?
(5) What should be at the receiving side? What will decode the stream and play it in AVPlayer?
I need to develop a streaming app with good quality, so I would rather first understand the flow of data and then start.
It would be great if someone gave a graphical representation of the data flow.
Thanks a lot in advance!
Let me quickly share my understanding of your questions:
1a. Why a media server?
You could write your own software for distributing the stream data to all the players as well, but in that case you would need to implement various transport protocols, and you would end up implementing a fairly big piece of software: your own home-grown media server.
1b. What does a media server actually do that makes its role necessary?
The role of the media server is to receive the live stream from a source and handle its distribution to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally, the media server can change the way the video or audio is encoded (transcoding), or produce streams at different resolutions and qualities and provide the players with the list of available qualities in the form of a manifest file (e.g. an m3u8 or smil file) so they can do so-called adaptive streaming.
Another typical use case of media servers is serving non-live video files to players from disk, as well as recording live streams, and so on. If you look at the feature list of popular media servers, you'll see that they really do many things, so practically this is something you probably want to get out of the box rather than implement yourself.
2. In my app, I will have to integrate a library for encoding and feeding the stream to a streaming server like Wowza. But how would it be fed to the streaming server?
You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). It's best to google for tutorials to see what this looks like in code.
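To make the codec-vs-container distinction concrete, here is a minimal sketch on iOS using AVAssetWriter, which encodes H.264 video and AAC audio into an MPEG-4 container; for an RTMP push you would hand the encoded frames to a streaming library instead, and the output URL is a placeholder:

```swift
import AVFoundation

// Configure an AVAssetWriter: H.264 + AAC are the codecs, MP4 is the container.
func makeWriter(outputURL: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264, // H.264 video codec
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720,
    ]
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    videoInput.expectsMediaDataInRealTime = true

    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC, // AAC audio codec
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2,
    ]
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    audioInput.expectsMediaDataInRealTime = true

    writer.add(videoInput)
    writer.add(audioInput)
    return writer
}
```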
3. How will my server communicate with a streaming server like Wowza?
The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.
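As an illustration, here is roughly what publishing over RTMP looks like with the open-source HaishinKit library (one of several options); the URL and stream key are placeholders, and exact API names vary between library versions:

```swift
import HaishinKit
import AVFoundation

// Connect to the media server's RTMP endpoint and publish a named stream.
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

// Attach the live capture devices as the stream's sources.
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: .back))

connection.connect("rtmp://wowza.example.com/live") // placeholder ingest URL
stream.publish("myStreamKey")                       // placeholder stream key
```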
4. How will Wowza feed the stream to the receiving side, i.e. the user who has an iPhone and needs to see a live stream?
The player software will initiate the communication with Wowza. This is again protocol-dependent, but in case you are using HLS, the player will use the HTTP protocol to find out the URLs of the consecutive video chunks that it will progressively download and display to the user.
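As a toy illustration of that chunk-discovery step (real players such as AVPlayer do all of this internally; the playlist URL is hypothetical):

```swift
import Foundation

// Fetch an HLS media playlist and list the segment URLs a player would
// download in order. Purely illustrative.
let playlistURL = URL(string: "https://wowza.example.com/live/stream.m3u8")!
do {
    let playlist = try String(contentsOf: playlistURL, encoding: .utf8)
    let segments = playlist
        .split(separator: "\n")
        .map(String.init)
        .filter { !$0.isEmpty && !$0.hasPrefix("#") } // lines not starting with # are segment URIs
    for segment in segments {
        // Relative segment URIs are resolved against the playlist's own URL.
        if let segmentURL = URL(string: segment, relativeTo: playlistURL) {
            print("next chunk:", segmentURL.absoluteURL)
        }
    }
} catch {
    print("failed to fetch playlist:", error)
}
```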
5. What should be at the receiving side? What will decode the stream and play it in AVPlayer?
It's not clear whether the app you are developing is the broadcaster side or the player side, but generally on the player side you need a library that can pull the stream from the media server with the protocol/transport/codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
I am not going to draw this, but imagine three boxes connected with arrows, and that's the data flow: from the encoder to the streaming server and finally to the player. That's it, I guess. :-)
The docs are a little hard to parse here. I was wondering if there is any way to:
1. Stream YouTube Live into an iOS app, without significant (or any) YouTube branding.
2. Stream from an iOS device as a broadcast stream for YouTube Live.
My initial Googling turned up mixed responses. I was hoping to see an example of this if it's possible, or save myself some time if it's not.
Suppose I have a person on AT&T next to a person on Verizon streaming content, and I want to make both appear as a single uninterrupted stream, switching back and forth. Does YouTube or a library do anything to facilitate this?
Streaming from an iOS device is no different than streaming from any other device. You would have to write an H.264 encoder and RTMP packetizer and send the video to your YouTube stream object's ingestionAddress. Outlining the details of the encoder beyond that is too broad for Stack Overflow, but I highly recommend looking at the VideoCore iOS project.
As far as branding goes, the only way to play back YouTube content in an iOS app without breaking YouTube's terms of service is to play the video in a UIWebView or via YouTube's iOS player helper library (which is just a web view with some playback interfaces).
There is no way to completely remove YouTube branding from the IFrame player. However, there are branding options you can toggle using the modestbranding flag on the player. See the IFrame docs here.
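A minimal sketch of that embed approach using WKWebView and the modestbranding player parameter mentioned above; the video ID is a placeholder:

```swift
import UIKit
import WebKit

final class PlayerViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        view.addSubview(webView)

        // modestbranding=1 hides the YouTube logo in the control bar; some
        // branding still appears. playsinline=1 avoids forced fullscreen.
        let embedURL = URL(string:
            "https://www.youtube.com/embed/VIDEO_ID?modestbranding=1&playsinline=1")! // placeholder ID
        webView.load(URLRequest(url: embedURL))
    }
}
```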
I am new to multimedia and iOS programming, and I came across WebORB while googling; it provides an RTMP library for iOS. It doesn't clearly state whether it can be used to stream live video through a media server like Red5.
If anyone has used it, please let me know whether it can be used to stream live video from an iPhone to a media server, and where it fits in the whole setup.
Does it act like a server itself, sitting between a media server and the iPhone application, or does it also come with its own media server?
I would also like some links to tutorials that can help me start the real coding pertaining to RTMP streaming to a media server.
Thanks.
The short answer is yes, the RTMP library for iOS can be used with Red5, FMS, WebORB, etc. The library is not a server itself but a client: it establishes the RTMP connection to the server and encodes the stream before sending it to the server.
As I recall, the library distribution contains some examples demonstrating how streaming works. Unfortunately, the official site doesn't show any examples related to streaming, but the available examples can be useful for starting work with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date: http://www.themidnightcoders.com/fileadmin/docs/ios/.