There is a web page with a Flash stream on it. I want to download this stream and forward it to another streaming server, and where possible replace the audio stream (e.g. with a translation), but without recompressing the video stream. The usual way to do this at the moment is to capture and rebroadcast the Flash player view from the web page, which is obviously suboptimal: the video has to be recompressed, which makes the quality noticeably worse and loads the CPU.
Does anyone have an idea how to do this? VLC seems to be able to act as a relay, but it also appears not to support RTMP at all.
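Conceptually, what I'm after is something along these lines (a rough sketch that drives ffmpeg from Python; the URLs and the audio file are placeholders, and it assumes an ffmpeg build with RTMP support):

    import subprocess

    # Placeholders: source/destination RTMP URLs and the replacement audio track.
    SRC = "rtmp://source-server/app/streamname"
    DST = "rtmp://target-server/app/streamname"
    NEW_AUDIO = "translated_audio.mp3"

    subprocess.run([
        "ffmpeg",
        "-i", SRC,                    # original stream (needs RTMP support in ffmpeg)
        "-i", NEW_AUDIO,              # replacement audio, e.g. a translation
        "-map", "0:v", "-map", "1:a", # keep the original video, take audio from the second input
        "-c:v", "copy",               # video is passed through without recompression
        "-c:a", "aac",                # only the new audio track gets encoded
        "-f", "flv",                  # RTMP expects an FLV container
        DST,
    ], check=True)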
If you're prepared to do this programmatically, you can use crtmpserver (C++) or red5 (Java) with any RTMP client; otherwise this question doesn't belong on SO.
I want to be able to load only the audio stream of a YouTube video and process it (EQ, effects, etc.) through a graph of Web Audio nodes.
Is this doable? Is there any open-source work out there doing that?
Thanks in advance for any and all responses.
No, because you can't get at audio streams cross-domain. (That is, if your code could be hosted on YouTube.com, sure, but not from mydomain.com.)
The reason for this (you CAN do it if CORS is set up, but it isn't on YouTube) is that if you could get at the audio stream, you could make a bit-for-bit copy of the data. Just as with images, they don't want to leak the raw data.
Not sure if this is something obvious or not. After creating a YouTube LiveBroadcast, binding it to a LiveStream with a specific CDN format (let's say "720p"), and transitioning the broadcast from "ready" to "live" ... how can I change the stream quality without having to create a new broadcast?
Trying to unbind the current stream returns an exception: the stream cannot be unbound.
Trying to bind the broadcast to another stream returns the same exception.
In addition, the support pages for YouTube live streaming suggest that "ingest settings cannot be modified after the broadcast has started". They say nothing about the actual API not supporting this, but it looks like a limitation from somewhere deeper; I had assumed it only applied to the web Live Control Room.
I need this functionality so that I can change the stream quality when a user switches from WiFi to mobile data. Currently, streaming RTMP data at a different resolution than what the LiveStream CDN format is configured for results in health errors and encoding artifacts on YouTube's side. As the support pages suggest, creating a "1080p" live stream ("maximum expected resolution") should work, but when that stream receives a 720p or 480p feed, then depending on whether the broadcast has started or not, it either doesn't start at all or degrades to a gray scene with high-pitched audio (my stream is sent correctly, since I can output it to a dozen other destinations, such as MP4, FLV, and other RTMP servers).
Solution?
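For reference, this is roughly the call I'm attempting with the google-api-python-client (the IDs are placeholders, and creds is assumed to be an already-authorized OAuth2 credential):

    from googleapiclient.discovery import build

    # creds: an already-authorized OAuth2 credential object (not shown here).
    youtube = build("youtube", "v3", credentials=creds)

    BROADCAST_ID = "..."   # placeholder: the live broadcast created earlier
    NEW_STREAM_ID = "..."  # placeholder: a liveStream created with a different CDN resolution

    # This is the call that fails once the broadcast is live. Omitting streamId
    # is, as I understand it, how an unbind is expressed, and that fails the same way.
    youtube.liveBroadcasts().bind(
        id=BROADCAST_ID,
        part="id,contentDetails",
        streamId=NEW_STREAM_ID,
    ).execute()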
Several people have tried to cache pre-loaded video data using AVPlayer or MPMoviePlayerController, for example:
Caching with AVPlayer and AVAssetExportSession
Access the data of AVPlayer when playing video from HTTP Live Streaming
The most straightforward approach would seem to be using AVAssetExportSession on the player's currentItem, but nobody seems to be able to get it to work.
My question is: is it possible to transparently proxy the video requests on the device, with an embedded HTTP server backed by a disk-based cache? (A rough sketch of what I mean follows below.)
I can run an embedded web server (GCDWebServer), so my questions are:
Will caching screw up AVPlayer's bandwidth-optimization code that tries to select the highest-bandwidth stream possible? If this is an issue, I can control the stream so it only provides one option.
Is disk performance sufficient to provide an improvement over the network? It would seem like it obviously would be, but I've seen a variety of articles around the web talking about how slow disk I/O is on iOS.
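For concreteness, the kind of transparent caching proxy I have in mind looks roughly like this (a minimal sketch in plain Python, just to illustrate the idea; on the device it would be built on GCDWebServer, and the upstream URL is a placeholder):

    import hashlib
    import os
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    UPSTREAM = "https://example-cdn.invalid"   # placeholder: the real HLS origin
    CACHE_DIR = "/tmp/hls-cache"

    class CachingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            os.makedirs(CACHE_DIR, exist_ok=True)
            # Cache key derived from the request path; one file per playlist/segment.
            cached = os.path.join(CACHE_DIR, hashlib.sha1(self.path.encode()).hexdigest())
            if os.path.exists(cached):
                with open(cached, "rb") as f:
                    body = f.read()
            else:
                with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                    body = resp.read()
                with open(cached, "wb") as f:
                    f.write(body)
            # NOTE: this caches everything, including .m3u8 playlists, which only
            # makes sense for non-live content; a live playlist must be refetched.
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # The player would be pointed at http://127.0.0.1:8080/... instead of the CDN.
        HTTPServer(("127.0.0.1", 8080), CachingProxy).serve_forever()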
Thanks!
For HTTP Live Streaming:
If the embedded web server is to host the media segment files that comprise the HTTP live stream feed, then the files would need to be downloaded to the device already, unless you reconfigure the webserver to act as a proxy.
In either case, it seems a simpler way would be to download and parse the index file (typically prog_index.m3u8) to get the list of media segment files and then just initiate the download of each one.
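A minimal sketch of that parse-and-download approach (the index URL is a placeholder, and it assumes the segment URIs in the playlist are relative paths):

    import urllib.parse
    import urllib.request

    # Placeholder index URL for the HLS feed.
    INDEX_URL = "http://example.invalid/stream/prog_index.m3u8"

    playlist = urllib.request.urlopen(INDEX_URL).read().decode("utf-8")

    # In an M3U8 playlist, lines that don't start with '#' are media segment URIs
    # (usually relative .ts paths).
    segment_uris = [line.strip() for line in playlist.splitlines()
                    if line.strip() and not line.startswith("#")]

    for uri in segment_uris:
        segment_url = urllib.parse.urljoin(INDEX_URL, uri)
        data = urllib.request.urlopen(segment_url).read()
        with open(uri.split("/")[-1], "wb") as f:   # save each segment locally
            f.write(data)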
I am streaming some FTA channels from a DVB-S2 quad tuner card:
http://www.tbsdtv.com/products/tbs6985-dvb-s2-quad-tuner-pcie-card.html
using MediaPortal:
http://www.team-mediaportal.com/
I then get an RTSP URL from MediaPortal for the channel I am timeshifting, and with VLC I can send that stream to a media server (FMS) to get HLS, HDS, RTMP, and RTSP. I have 3 servers running erlyvideo (flussonic), so they take care of the delivery.
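For reference, the relay leg of that pipeline (RTSP from MediaPortal pushed to the RTMP ingest) amounts to something like the sketch below, written here with ffmpeg driven from Python purely as an illustration (the URLs are placeholders, and "-c copy" only works if the channel's codecs are already FLV-compatible, e.g. H.264/AAC; otherwise transcoding is unavoidable):

    import subprocess

    RTSP_IN = "rtsp://127.0.0.1:554/stream2.0"       # placeholder: URL handed out by MediaPortal
    RTMP_OUT = "rtmp://my-fms-server/live/channel1"  # placeholder: FMS / erlyvideo ingest point

    subprocess.run([
        "ffmpeg",
        "-rtsp_transport", "tcp",   # TCP is usually more robust than UDP for relaying
        "-i", RTSP_IN,
        "-c", "copy",               # pass audio/video through untouched (no transcode)
        "-f", "flv",                # RTMP expects an FLV container
        RTMP_OUT,
    ], check=True)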
I want some alternate solution besides that. I have tried a few methods to work this out, including:
VLC
IPTVL
Dvbdream
but the quality is only better when I stream something as a file; only FMLE works well with a live stream, and for that we can only use DirectShow-enabled devices like
http://www.viewcast.com/products/osprey-cards
I am doing this on Windows.
If someone has any more methods, or wants to share their setup, please do so.
I was wondering whether I can use the HTTP protocol to acquire an image stream from an RTSP camera. I am currently using the VLC Media ActiveX plugin to connect to and view the RTSP stream, but I would like to eliminate the ActiveX control and move to a rawer level of image acquisition. I recall seeing somewhere that it's possible to get these images using HTTP. I'd like to use the Indy TIdHTTP component to connect to the camera and acquire the images. I'm also assuming this would need some sort of speed control, such as a delay between requests. However, it's also my understanding that these RTSP cameras have pre-defined frame rates, which clients using the standard RTSP protocol are supposed to follow.
Many cameras will allow you to grab snapshots with a URL that might look like:
http://user:password@camera/snapshot.jpg
For a proper stream, you would need to use RTSP (there are Delphi RTSP clients), tunnelled over HTTP if your device supports the application/x-rtsp-tunnelled content type, or another stream type your device supports.
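For example, polling such a snapshot URL over plain HTTP could look like this (a sketch in Python for illustration only; the camera address, credentials, and path are placeholders and vary by vendor, and the same GET with basic authentication can be done with Indy's TIdHTTP):

    import time
    import requests

    # Placeholders: the actual snapshot path differs per camera vendor.
    SNAPSHOT_URL = "http://192.168.1.64/snapshot.jpg"
    AUTH = ("user", "password")

    for i in range(10):                       # grab ten frames, roughly one per second
        resp = requests.get(SNAPSHOT_URL, auth=AUTH, timeout=5)
        resp.raise_for_status()
        with open(f"frame_{i:03d}.jpg", "wb") as f:
            f.write(resp.content)
        time.sleep(1.0)                       # crude rate control between requests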