I am trying to find a way to transcode an RTSP stream to HTTP (for iOS) so that I can view the RTSP stream on an iPad. The video is embedded in our SaaS web view, and launching a third-party player is not a possibility.
I found Sirannon, which according to the documentation can do this, no problem.
However, I am puzzled about how to actually execute it.
Our RTSP stream is: rtsp://xxx.xxx.xxx.xxx:554/ch0_unicast_firststream
There is no .sdp file or anything, and VLC can play it fine.
But if I open a browser and attempt to open http://localhost:8080/RTSP-proxy/192.168.33.216/ch0_unicast_firststream
or
http://localhost:8080/RTSP-proxy/192.168.33.216:554/ch0_unicast_firststream
it gives me this error:
[1516250] Warning: core.HTTP-server: Handling RuntimeError: Could not
guess container type for
URL(/RTSP-proxy/192.168.33.216/ch0_unicast_firststream)
(core.HTTP-server.session-42)
So far I haven't found any good examples using Sirannon. I am also open to using VLC, but again, I do not know if or how to do an RTSP-to-HTTP conversion with VLC.
With Wowza we are able to preemptively start the root stream ahead of the HTTP stream request.
With a delay of about 3 seconds, the client is able to connect nicely. Then we monitor the client list to see if it is empty; if so, we kill the stream.
I've been digging into this for personal use and have gotten the same error. After some investigation (notably in src/Communicator/HTTP/HTTPSession.cpp), I've made some progress.
While it's not explicitly in the documentation, it seems that you have to append the container type to the application name in the proxy URL, just as if you were streaming a file. As such:
http://localhost:8080/RTSP-proxy#[CONTAINER_TYPE]/192.168.33.216:554/ch0_unicast_firststream
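Regarding the VLC option mentioned in the question: VLC can do this kind of restreaming from the command line with a --sout chain. Below is a minimal sketch, spawning VLC from Java; it assumes VLC is on the PATH, uses the camera address from the question, and re-serves the feed as MPEG-TS over HTTP on port 8080 (untested here, adjust codecs to taste):

// Spawn VLC to restream the RTSP feed as MPEG-TS over HTTP.
ProcessBuilder pb = new ProcessBuilder(
        "vlc", "-I", "dummy",                 // headless interface
        "rtsp://192.168.33.216:554/ch0_unicast_firststream",
        // transcode to H.264 + MP3, then serve as an MPEG-TS stream on port 8080
        "--sout", "#transcode{vcodec=h264,acodec=mp3}:std{access=http,mux=ts,dst=:8080/stream}");
pb.inheritIO();                               // forward VLC's console output for debugging
Process vlc = pb.start();

Note that Mobile Safari on the iPad will not play a plain HTTP MPEG-TS stream; for that target you would want VLC's HLS output (access=livehttp) instead, so treat this only as a demonstration of the --sout mechanics.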
I'm trying to stream audio from a Twilio call to a browser.
I want to use Twilio Media Streams, which send base64-encoded audio at an 8000 Hz sample rate in audio/x-mulaw format (according to this: https://www.twilio.com/docs/voice/twiml/stream).
I tried playing back the audio in a browser using audioContext.decodeAudioData, but I am getting an exception:
DOMException: The buffer passed to decodeAudioData contains an unknown content type.
I think I need to resample the data and add a header or something of that sort, but I'm unable to figure it out.
Any help would be much appreciated.
Twilio developer evangelist here.
I have not tried this myself, so I can only give some pointers. You are right that the audio from Twilio is coming as audio/x-mulaw and that browsers do not support this format.
There is a good set of documentation on MDN about web audio codecs and containers. CanIUse has data on browser support for common formats (e.g. mp3).
So you will need to resample the audio into a supported format before you send it to the browser. I don't have any suggestions for tools for that, particularly as I don't know what you are building your server in.
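For illustration only, if your server happened to be Java, the conversion can be done with the standard javax.sound.sampled classes: wrap the decoded bytes in a µ-law AudioInputStream, convert to 16-bit PCM, and write it out as WAV, which every browser can play. A minimal sketch (payload stands for one base64 media payload from the Twilio stream):

import java.io.*;
import java.util.Base64;
import javax.sound.sampled.*;

public class MulawToWav {
    // Convert one base64-encoded mu-law payload (8 kHz, mono) into a WAV file.
    static byte[] toWav(String payload) throws IOException {
        byte[] ulaw = Base64.getDecoder().decode(payload);
        // Twilio Media Streams: 8-bit mu-law, 8000 Hz, one channel.
        AudioFormat ulawFormat =
                new AudioFormat(AudioFormat.Encoding.ULAW, 8000f, 8, 1, 1, 8000f, false);
        // Target: 16-bit signed little-endian PCM at the same rate.
        AudioFormat pcmFormat =
                new AudioFormat(AudioFormat.Encoding.PCM_SIGNED, 8000f, 16, 1, 2, 8000f, false);
        AudioInputStream ulawStream =
                new AudioInputStream(new ByteArrayInputStream(ulaw), ulawFormat, ulaw.length);
        AudioInputStream pcmStream = AudioSystem.getAudioInputStream(pcmFormat, ulawStream);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        AudioSystem.write(pcmStream, AudioFileFormat.Type.WAVE, out);
        return out.toByteArray();
    }
}

Served with Content-Type: audio/wav, the result plays directly in the browser and also decodes fine with decodeAudioData.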
Alternatively, if you need the audio from a call in a browser, have you considered using Twilio Client to dial into the call?
I'm trying to get a livestream working on YouTube. I want to stream 360° content with H264 video and AAC audio. The stream is started with the YouTube Live API from my mobile app, and librtmp is used to deliver video and audio packets. I easily get to the point where the livestream health is good and my broadcast and stream are bound successfully.
However, when I try to transition to "testing" like this:
YoutubeManager.this.youtube.liveBroadcasts().transition("testing", liveBroadcast.getId(), "status").execute();
I get stuck on the "testStarting" status every time (100% reproducible), while I expect it to change to "testing" after a few seconds to allow me to change it to "live".
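For reference, I read the status back with the same client, roughly like this (simplified):

// Poll the broadcast's lifecycle status while waiting for the transition.
LiveBroadcastListResponse response = youtube.liveBroadcasts()
        .list("status")
        .setId(liveBroadcast.getId())
        .execute();
String lifeCycleStatus = response.getItems().get(0)
        .getStatus().getLifeCycleStatus();    // stays at "testStarting"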
I don't know what's going on, as everything seems to be fine in the YouTube Live Control Room, but the encoder won't start.
Is this a common issue? Is there a way to access the encoder logs? If you need more information, feel free to ask.
Regards.
I found a temporary fix!
I noticed two things:
When the autostart option was on, the stream changed its state to "liveStarting" as soon as I stopped sending data. This suggested that the encoder was trying to start but was too slow to do so before some other data packet was received (I guess).
When I tried to stream to the "Stream now" URL, as #noogui suggested, it worked! So I checked what the difference was between the Stream now and event configurations.
It turned out I just had to activate the low latency option, as is done by default in the Stream now configuration.
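If you create the broadcast through the API, that checkbox corresponds to the enableLowLatency field on the broadcast's contentDetails (at least in the Java client version I'm using, so check against yours), and you can set it at insert time. A sketch:

// Request low-latency streaming when inserting the broadcast.
LiveBroadcastContentDetails contentDetails = new LiveBroadcastContentDetails();
contentDetails.setEnableLowLatency(true);     // the "low latency" option
liveBroadcast.setContentDetails(contentDetails);
youtube.liveBroadcasts()
        .insert("snippet,status,contentDetails", liveBroadcast)
        .execute();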
I consider it a temporary fix because I don't really know why the encoder isn't starting otherwise, and because it doesn't work with the autostart option... So I hope it won't break again if YouTube makes another change to their encoder.
So, if you have to work with the YouTube API, good luck!
Is it possible to create a live event by simply using a video file instead of a web camera? I don't see an option like this in live event creation.
For doing this directly on YouTube: no.
For doing this by encoding a video file and pushing it to YouTube in real time: yes.
How?
Try Wirecast Play. It's just like a live-feed console, but free with some limits. Other RTMP tools may also work; one of them is ffmpeg. I have tried it before and can confirm it works, but it's a backend with only a command line. For more functionality you need a front-end app (you can stream/pipe to ffmpeg); see the sketch after the link below.
For more about ffmpeg's RTMP support, read this:
https://www.ffmpeg.org/ffmpeg-protocols.html#rtmp
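For a rough idea of the pipeline, this is the kind of command a front-end app can run: spawn ffmpeg and let it push the file to YouTube's ingest point in real time. A sketch in Java, where input.mp4 is a placeholder for your file and STREAM_KEY is the stream key from your event's ingestion settings:

// Push a local video file to YouTube's RTMP ingest in real time.
// "-re" makes ffmpeg read the input at its native frame rate (live pacing).
ProcessBuilder pb = new ProcessBuilder(
        "ffmpeg", "-re", "-i", "input.mp4",
        "-c:v", "libx264", "-c:a", "aac",     // YouTube live expects H.264 + AAC
        "-f", "flv",                          // RTMP carries an FLV-muxed stream
        "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY");
pb.inheritIO();                               // show ffmpeg's log on the console
Process ffmpeg = pb.start();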
Several people have tried to cache pre-loaded video data using AVPlayer or MPMoviePlayerController, for example:
Caching with AVPlayer and AVAssetExportSession
Access the data of AVPlayer when playing video from HTTP Live Streaming
The most straightforward approach would seem to be using AVAssetExportSession on the player's currentItem, but nobody seems to be able to get it to work.
My question is: is it possible to transparently proxy the video requests on the device, with an embedded HTTP server backed by a disk-based cache?
I can run an embedded web server (GCDWebServer), so my questions are:
Will caching screw up AVPlayer's bandwidth-optimization code that tries to select the highest-bandwidth stream possible? If this is an issue, I can control the stream so it only provides one option.
Is disk performance sufficient to provide an improvement over the network? It would seem like it obviously would be, but I've seen a variety of articles around the web talking about how slow disk I/O is on iOS.
Thanks!
For HTTP Live Streaming:
If the embedded web server is to host the media segment files that comprise the HTTP Live Streaming feed, then the files would already need to be downloaded to the device, unless you reconfigure the web server to act as a proxy.
In either case, it seems that a simpler way would be to download and parse the index file (typically prog_index.m3u8) to get the list of media segment files and then just initiate a download of each one.
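The index is just a text playlist, so the parsing step is tiny in any language. A sketch of the logic in Java for concreteness (it ports directly to Objective-C or Swift): keep every non-tag line and resolve it against the playlist's own URL, since segment entries may be relative.

import java.net.URI;
import java.util.ArrayList;
import java.util.List;

public class M3U8 {
    // Extract absolute media-segment URLs from an HLS media playlist body.
    static List<URI> segmentUris(URI playlistUri, String playlistBody) {
        List<URI> segments = new ArrayList<>();
        for (String line : playlistBody.split("\r?\n")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue; // skip tags/comments
            segments.add(playlistUri.resolve(line));  // handles relative and absolute URIs
        }
        return segments;
    }
}

Each resolved URL can then be fetched into the disk cache, and the embedded server can serve a rewritten index that points at the local copies.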
There is a web page with a Flash stream on it. I want to download and forward this stream to another streaming server and, when possible, replace the audio stream (e.g., with a translation), but without recompressing the video stream. The usual way to do this at the moment is to capture and rebroadcast the Flash player's view from the web page, which is obviously suboptimal because the video needs to be recompressed, making the quality notably worse and loading the CPU.
Does anyone have an idea how to do this? VLC seems to be able to act as a relay, but it also seems not to support RTMP at all.
If you're ready to do this programmatically, you can use crtmpserver (C++) or Red5 (Java) with any RTMP client; otherwise this question doesn't belong on SO.