I've made an application to cast HLS streams on a Chromecast.
It works well with VOD (non-live) streams, but not with a LIVE stream.
So here is my question: can Chromecast play LIVE streams?
Yes, it can, and many Chromecast applications already do. You may want to use our Media Player Library (MPL), or use your own player. You may need to write a custom receiver if the Styled/Default receiver does not do what you need.
I'm playing videos using AVPlayer in my iOS application, and now want to add Chromecast support.
1. As per this link, we can show the Chromecast button when a video is playing. Is that also the case with AVPlayer?
2. As per Apple's requirements, my videos are encoded and are in m3u8 (HLS) format. Can we play that on Chromecast?
You can check the Google Cast documentation; it includes API libraries and sample application code to help you build your application. These APIs are documented in the API references, and the sample code is discussed in the Sender Applications and Receiver Applications overviews.
To answer whether you can play the m3u8 format on Chromecast: first, check the Supported Media for Google Cast page, which lists all the supported media facilities and types.
Note that some of these require additional coding or the Media Player Library. See Receiver Applications for more information about developing a receiver application that supports these media types.
For more information, check these SO questions:
ChromeCast doesnt play HLS in .m3u8 format
Streaming .m3u8 format using Chromecast
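To make the supported-media point concrete, here is a small sketch (in Python, purely as a language-agnostic illustration; the helper name and the mapping table are my own, not part of any Cast SDK) of the MIME contentType a sender typically declares alongside a media URL, including the HLS playlist type asked about above:

```python
# Hypothetical helper: map a media URL's extension to the MIME
# contentType a Cast sender declares when loading media.
# HLS playlists are declared as "application/x-mpegurl" (also seen
# as "application/vnd.apple.mpegurl"); adaptive formats like HLS
# and DASH generally need a receiver that supports them, e.g. one
# built on the Media Player Library.
import os

CAST_CONTENT_TYPES = {
    ".mp4":  "video/mp4",
    ".webm": "video/webm",
    ".m3u8": "application/x-mpegurl",   # HLS playlist
    ".mpd":  "application/dash+xml",    # MPEG-DASH manifest
    ".mp3":  "audio/mp3",
    ".jpg":  "image/jpeg",
    ".png":  "image/png",
}

def cast_content_type(url: str) -> str:
    """Return the MIME contentType to send alongside a media URL."""
    path = url.split("?", 1)[0]          # drop any query string
    ext = os.path.splitext(path)[1].lower()
    return CAST_CONTENT_TYPES.get(ext, "application/octet-stream")
```

For example, `cast_content_type("http://example.com/live/stream.m3u8")` returns `"application/x-mpegurl"`, which tells the receiver it is dealing with an HLS stream rather than a plain media file.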
I'm trying to receive a live RTP audio stream on my iPhone, but I don't know how to start. I'm looking for samples but can't find them anywhere.
I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or A-law. This app works as an audio server that serves any incoming connection with that stream. I've already developed an Android app that receives the stream, and it works, so I want to replicate that functionality on iOS. In Android we have the "android.net.rtp" package to manage this and transmit or receive data streams over the network.
Is there any kind of equivalent package for iOS to implement this? Could you give me any kind of reference / sample to do this, or just tell me where to start?
You can look at the library HTTPLiveStreaming, but its protocol may not be a standard one. You can also check my fork, aelam/HTTPLiveStreaming-1; I'm still working on it, but its output can already be played by ffplay.
Check the file rtp.c in FFmpeg; I think it will help.
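To make this more concrete, here is a sketch of the two pieces such a receiver needs, shown in Python as a language-agnostic reference to port to iOS (the function names are illustrative, not from any Apple framework): parsing the fixed RTP header defined in RFC 3550, and decoding a G.711 µ-law payload byte into a linear 16-bit PCM sample.

```python
# Sketch: parse an RTP packet (RFC 3550) and decode G.711 mu-law audio.
import struct

def parse_rtp(packet: bytes):
    """Return (header_dict, payload) for one RTP packet."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    if b0 >> 6 != 2:
        raise ValueError("not an RTP version 2 packet")
    csrc_count = b0 & 0x0F
    offset = 12 + 4 * csrc_count          # skip any CSRC entries
    if b0 & 0x10:                          # header extension present
        ext_words = struct.unpack("!H", packet[offset + 2:offset + 4])[0]
        offset += 4 + 4 * ext_words
    header = {
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,        # 0 = PCMU (mu-law), 8 = PCMA (A-law)
        "sequence": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }
    return header, packet[offset:]

def mulaw_to_pcm16(byte: int) -> int:
    """Decode one G.711 mu-law byte to a signed 16-bit PCM sample."""
    u = ~byte & 0xFF                       # mu-law bytes are stored inverted
    exponent = (u >> 4) & 0x07
    mantissa = u & 0x0F
    magnitude = (((mantissa << 3) + 0x84) << exponent) - 0x84
    return -magnitude if u & 0x80 else magnitude
```

On iOS you would read the UDP datagrams with something like CFSocket or NWConnection, run each one through this kind of parsing, and feed the decoded PCM samples to an audio queue.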
Despite reading the documentation, it is not clear to me exactly what the "Google Cast Media Player Library" is and whether it is the route I need to take for my Chromecast app.
What I am trying to achieve is to play media from my local iOS device on a Chromecast. My main aim is to play users' videos and photos, not necessarily DRM-protected media.
Up till now I have been doing this by exporting the AVAsset and then passing the file address to a simple HTTP server. This seems horribly inefficient, and I thought I could use AVAssetReader to pass a stream to the Chromecast instead. During my research I came across the terms:
MPEG-DASH
SmoothStreaming
HTTP Live Streaming (HLS)
But I do not understand whether I need such complex implementations.
I find the name "Google Cast Media Player Library" very ambiguous, and there is no concise explanation of what it is.
https://developers.google.com/cast/docs/player
This is a piece of the definition given there:
... It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.
I hope this is not ambiguous: if your media is encrypted and/or you are dealing with adaptive streams of the types listed (HLS, etc.), then this library can help you. If you are playing a simple mp4 or showing images, you don't need it.
There are plenty of posts on this forum about casting local media; it amounts to embedding a tiny web server in your sender app, then sending the URL of the media (now exposed through your embedded web server) to the Chromecast and having your receiver show or play that media item via that URL.
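The embedded-server idea can be sketched like this (shown with the Python standard library purely to illustrate the structure; in an iOS sender you would use an embedded HTTP server library such as GCDWebServer, but the flow is the same):

```python
# Sketch: expose a local media directory over HTTP so that a Cast
# device on the same network can fetch files from it by URL.
import http.server
import socket
import threading
from functools import partial

def start_media_server(directory: str, port: int = 0):
    """Serve `directory` over HTTP; returns (server, base_url).

    port=0 lets the OS pick a free port."""
    handler = partial(http.server.SimpleHTTPRequestHandler,
                      directory=directory)
    server = http.server.ThreadingHTTPServer(("0.0.0.0", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    # Best-effort LAN IP discovery: the Chromecast must be able to
    # reach this address; 127.0.0.1 only works for local testing.
    try:
        probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        probe.connect(("8.8.8.8", 80))  # no packets are actually sent
        host = probe.getsockname()[0]
        probe.close()
    except OSError:
        host = "127.0.0.1"
    return server, f"http://{host}:{server.server_address[1]}/"
```

The sender would then ask the Cast device to load, say, `base_url + "movie.mp4"` with the matching contentType, and the Chromecast fetches the file directly from your device over the local network.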
I'm developing a website that needs an external RTMP stream.
I'm using jwplayer to run the stream using Flash (examples and information here).
My problem is that the stream does not work on iOS.
Can somebody suggest a solution?
iOS does not support the RTMP protocol; you have to use an HTTP-based protocol instead, i.e., HTTP Live Streaming (HLS).
I'm going to develop an application whose main task would be to send video frames captured from the device camera to a server. The server uses a protocol over TCP. I heard that Apple restricts developers from using any video streaming protocol except HTTP Live Streaming. Is this information correct? Will there be any problems getting my app approved in the App Store?
After some digging into the topic, I found some info; here is another topic. And for video broadcasting from the device, we cannot use HLS.