I am interested in playing m3u8 video streams on a Chromecast device. From studying the docs, I understand that it is not strictly necessary to write a Custom Receiver; the Default Receiver or Styled Media Receiver should be enough. But some servers hosting HLS video have a problem with CORS.
What are the options for solving this problem so that any m3u8 stream (from any server) can be played?
Use a CORS proxy, or something else?
You cannot get around the CORS requirement, so your options are limited; the proxy approach seems to be your only option. Note that your options are the same regardless of whether you use a custom receiver or the Default/Styled one; even if you write a custom receiver, you will run into the very same CORS requirement.
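For illustration, a minimal sketch of such a proxy in TypeScript on Node.js 18+ could look like the following; the port and the "url" query parameter are assumptions made for the sketch, not an established convention:

// cors-proxy.ts - sketch of a CORS proxy for HLS playlists/segments (assumes Node.js 18+ for global fetch)
import * as http from "http";

const PORT = 8080; // hypothetical port

http.createServer(async (req, res) => {
  // Expect requests of the form /?url=<encoded upstream URL>
  const target = new URL(req.url ?? "/", `http://localhost:${PORT}`).searchParams.get("url");
  if (!target) {
    res.writeHead(400);
    res.end("missing ?url= parameter");
    return;
  }
  try {
    const upstream = await fetch(target); // fetch the playlist or segment server-side, where CORS does not apply
    res.writeHead(upstream.status, {
      "Content-Type": upstream.headers.get("content-type") ?? "application/octet-stream",
      "Access-Control-Allow-Origin": "*", // the header the Chromecast receiver needs to see
    });
    res.end(Buffer.from(await upstream.arrayBuffer()));
  } catch {
    res.writeHead(502);
    res.end("upstream fetch failed");
  }
}).listen(PORT);

Keep in mind that an m3u8 playlist references further URLs (variant playlists and segments), so a real proxy would also have to rewrite those URIs inside the playlist to point back through the proxy.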
I am trying to implement adaptive bit rate with AVPlayer, but I don't know how to switch between the low and high streams. I am a bit confused and have a few questions:
Is it the sole responsibility of the server to implement HLS on its side, OR does the client also have to do something about it, OR does the client handle it automatically?
I am getting the following URLs from the server; can someone tell me how to switch between them based on network speed, and what other steps are involved?
{
"VideoStreamUrl": "http://50.7.149.74:1935/pitvlive/aplus3.stream/playlist.m3u8?",
"VideoStreamUrlLow": "http://50.7.149.74:1935/pitvlive/aplus3_240p.stream/playlist.m3u8?",
"VideoStreamUrlHD": null
}
AVPlayer supports HLS natively, so you shouldn't need to do anything extra to support this.
The framework will automatically switch between the low and high streams according to the currently available bandwidth, so you don't actually need to pick a stream yourself. Note that this automatic switching happens between the variants listed in a single master playlist; AVPlayer will not jump between two separately loaded playlist URLs like the ones above.
Not sure if this is something obvious or not. After creating a YouTube LiveBroadcast, binding it to a LiveStream with a specific CDN format (let's say "720p"), and transitioning the broadcast from "ready" to "live" ... how can I change the stream quality without having to create a new broadcast?
Trying to unbind the current stream - an exception is returned; the stream cannot be unbound.
Trying to bind the broadcast to another stream - the same exception as above.
In addition, the support pages for YouTube live streaming suggest that "ingest settings cannot be modified after the broadcast has started". They say nothing about the actual API not supporting this, but it looks like a major limitation somewhere deeper. I had thought it only applied to the web Live Control Room.
I need this functionality so that I can change the stream quality when a user switches from WiFi to mobile data. Currently, streaming RTMP data at a different resolution than the one the LiveStream CDN format is configured for results in health errors and encoding artifacts on YouTube's side. As suggested by the support pages, creating a "1080p" live stream ("maximum expected resolution") should work, but when that stream receives a 720p or 480p feed, then, depending on whether it has already started or not, it either doesn't start at all or degrades to a gray scene with high-pitched audio (my stream is sent correctly, since I can output it to a dozen other targets, such as MP4, FLV, and other RTMP servers).
Solution?
I would like to know if it is possible to cast the audio taken directly from the microphone of an iOS device to the receiver (in a live way).
I've downloaded all the git example projects, and all of them use a "loadMedia" method to start the casting. Here is an example from one of them:
- (NSInteger)loadMedia:(GCKMediaInformation *)mediaInfo
autoplay:(BOOL)autoplay
playPosition:(NSTimeInterval)playPosition;
Can I follow this approach to do what I want? If so, what's the expected delay?
Thanks a lot
Echo is likely if the device (iOS, Android, or Chrome) is in range of the speakers. That said:
Pick a fast codec that is supported, such as CELT/Opus or Vorbis.
I haven't tried either of these, but they should be possible:
1. Implement your own protocol using CastChannel that passes the binary data. You'll want to do some simple conversion of the stream from binary to something a bit more friendly. Take a look at Intro to Web Audio for using AudioContext (see the sketch after this list).
2. Set up a trivial server on your device to stream from, then tell the Receiver to just access that local server.
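To make option 1 more concrete, here is a rough TypeScript sketch of the receiver side, using the v2 Receiver API's message bus together with Web Audio. The namespace and the wire format (base64-encoded mono 16-bit PCM at 44.1 kHz) are assumptions invented for this sketch:

// receiver-audio.ts - hypothetical custom receiver that plays PCM chunks sent over a CastChannel
declare const cast: any; // provided by the Cast Receiver SDK loaded on the receiver page

const ctx = new AudioContext();
let playAt = 0; // time at which the next chunk should start

function playPcmChunk(base64: string): void {
  // Decode base64 -> bytes -> 16-bit signed PCM samples (assumed wire format)
  const bytes = Uint8Array.from(atob(base64), (c) => c.charCodeAt(0));
  const pcm = new Int16Array(bytes.buffer);
  // Copy into a Web Audio buffer, converting to floats in [-1, 1]
  const buffer = ctx.createBuffer(1, pcm.length, 44100); // mono at 44.1 kHz, assumed
  const channel = buffer.getChannelData(0);
  for (let i = 0; i < pcm.length; i++) channel[i] = pcm[i] / 32768;
  // Schedule chunks back to back for gapless playback
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.connect(ctx.destination);
  playAt = Math.max(playAt, ctx.currentTime);
  src.start(playAt);
  playAt += buffer.duration;
}

const manager = cast.receiver.CastReceiverManager.getInstance();
const bus = manager.getCastMessageBus("urn:x-cast:com.example.micaudio"); // hypothetical namespace
bus.onMessage = (event: any) => playPcmChunk(event.data);
manager.start();

The sender would capture microphone buffers on iOS, base64-encode them, and send them on the same namespace; the latency will depend mainly on how large you make the chunks and on the network.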
According to the Chromecast Developers page, Chromecast supports the Smooth Streaming container, which I believe uses video chunks with the .ismv extension. I am having problems getting those video files to play.
If I am not mistaken, Chrome/Chromecast's implementation of the video tag only supports .mp4 and .webm files, so using cast.MediaLoadRequest (in a Chrome sender app) would not work if you pass it a URL for a manifest file or an .ismv container.
It does seem possible to write code that stitches together MPEG-DASH chunks with the MediaSource API, based on an MPEG-DASH manifest file. However, Chrome's implementation of the MediaSource spec does not appear to support .ismv chunks, and therefore offers no means to play Smooth Streaming video.
Assuming you parsed a manifest file to get the Smooth Streaming video chunks, how would it be possible for Chromecast to play .ismv h.264 containers, such as the ones that can be found here? Or does Chrome not support .ismv files? If so, what Smooth Streaming containers does Chrome/Chromecast support?
Chromecast supports MPEG-DASH and Smooth Streaming. See more detail here:
https://developers.google.com/cast/supported_media_types
We'll provide some code snippets for Smooth Streaming soon. Stay tuned.
The default Receiver provided does not support SmoothStreaming (nor MPEG-DASH).
You'll need to code your own receiver to do so.
See https://stackoverflow.com/a/17978070/2665789 for a little more help.
Hopefully Google posts working samples of Live streaming soon!
You can throw Smooth Streaming at some sample receivers provided by Google.
The cast-custom-receiver and the Cast-Media-Player-Library-Sample support SS with PlayReady encryption out of the box.
Well, you need to do some tricks, like changing the URL suffix from "ism/" to "ism/Manifest", and it just works. You'll need to do the same in the [cast-sender-tool-chrome], adding the file extension to the list of three inside the main html file.
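For reference, a custom receiver built on the Media Player Library looks roughly like the following sketch; the manifest URL is a placeholder and the setup mirrors the Cast-Media-Player-Library-Sample mentioned above, so treat it as an outline rather than verified code:

// receiver-smooth.ts - hypothetical custom receiver using the Cast Media Player Library (MPL)
declare const cast: any; // cast.player.api.* is provided by the MPL script included on the receiver page

const mediaElement = document.getElementById("video") as HTMLVideoElement;
const url = "http://example.com/video.ism/Manifest"; // placeholder; note the "ism/Manifest" suffix discussed above

const host = new cast.player.api.Host({ mediaElement: mediaElement, url: url });
host.onError = (errorCode: number) => console.error("player error", errorCode);

// MPL ships protocol handlers for Smooth Streaming, MPEG-DASH, and HLS
const protocol = cast.player.api.CreateSmoothStreamingProtocol(host);
const player = new cast.player.api.Player(host);
player.load(protocol, 0); // start playback at position 0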
I am trying to set up a few IP cameras for a client.
However, I am having trouble getting the stream from the camera onto a webpage and playing it through a player. I have tried different players such as WMP and VLC (can't get it to work). I am now trying to use jPlayer, which would be great if I could get it working, as it also works on phones (or so it says).
What I have done so far is go into my router (Thomson TG585 v8) and set up port forwarding.
I have opened port 554 on TCP/UDP, which in the camera settings is the RTSP port.
I have set my camera to a static IP and am using No-IP for the DDNS.
rtsp://thepolishedknob.servebeer.com/h264/1/media.amp
This is the URL I am trying to use, but I cannot get the stream into the player. I know the stream works outside my LAN, as I connected to it through VLC last night.
If anyone can help me it would be greatly appreciated, as I have been working on this for over a week and have been getting nowhere.
So if you notice that I have missed anything out or done something wrong, please let me know.
There is no universal way to do the embedding. Typically the solution is one of the following:
1. As IP cameras come with a web interface, they already provide some way to present video on a web page. Often this is an ActiveX-control-based solution, with the respective browser limitations. You can check the HTML and duplicate the code; it may also be described in the vendor's documentation.
2. As you discovered, the stream is RTSP, and hopefully valid RTSP without tricks, so there might be a third-party "player" solution to present the stream on the webpage.
3. You might want to re-encode the stream into another format (with VLC or another application), such as M-JPEG, which is more browser-friendly; a sample command is sketched below. You'll find a lot of discussions of this, e.g. http://forum.videolan.org/viewtopic.php?f=16&t=57715
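As an illustration of option 3, something along these lines should relay the camera's RTSP feed as M-JPEG over HTTP using VLC's command-line interface (the output port and path are placeholders, and the exact option syntax varies between VLC versions):

cvlc rtsp://thepolishedknob.servebeer.com/h264/1/media.amp --sout '#transcode{vcodec=MJPG}:standard{access=http,mux=mpjpeg,dst=:8081/stream.mjpg}'

Most browsers can render such a multipart M-JPEG stream directly in a plain img tag pointed at that URL, which sidesteps the player problem entirely.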