I'm trying to stream audio from a Twilio call to a browser.
I want to use Twilio Media Streams, which send base64-encoded audio/x-mulaw data at an 8000 Hz sample rate (according to this: https://www.twilio.com/docs/voice/twiml/stream).
I tried playing back the audio in a browser using audioContext.decodeAudioData, but I am getting an exception:
DOMException: The buffer passed to decodeAudioData contains an unknown content type.
I think I need to resample the data and add a header or something of that sort but I'm unable to figure it out.
Any help would be much appreciated.
Twilio developer evangelist here.
I have not tried this myself, so I can only give some pointers. You are right that the audio from Twilio is coming as audio/x-mulaw and that browsers do not support this format.
There is a good set of documentation on MDN about web audio codecs and containers. CanIUse has data on browser support for common formats (e.g. mp3).
So you will need to resample the audio into a supported format before you send it to the browser. I don't have any suggestions for tools for that, particularly as I don't know what you are building your server in.
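That said, mu-law is simple enough that one option is to skip decodeAudioData entirely and decode the samples yourself on the client. Here is a rough, untested TypeScript sketch, assuming the base64 media.payload from each Media Streams WebSocket message is forwarded to the page as-is:

    // Untested sketch: decode Twilio's 8000 Hz audio/x-mulaw payloads by hand and
    // play them with the Web Audio API (the browser resamples to the context rate).
    const audioCtx = new AudioContext();
    let playbackTime = 0; // schedule chunks back to back

    // G.711 mu-law byte -> linear PCM sample (roughly -32124..32124)
    function muLawToPcm(muLawByte: number): number {
      const u = ~muLawByte & 0xff;
      let t = ((u & 0x0f) << 3) + 0x84;
      t <<= (u & 0x70) >> 4;
      return (u & 0x80) ? (0x84 - t) : (t - 0x84);
    }

    // payload is the base64 string from a "media" event on the Media Streams WebSocket
    function playMuLawChunk(payload: string): void {
      const bytes = Uint8Array.from(atob(payload), (c) => c.charCodeAt(0));
      if (bytes.length === 0) return;

      const samples = new Float32Array(bytes.length);
      for (let i = 0; i < bytes.length; i++) {
        samples[i] = muLawToPcm(bytes[i]) / 32768; // normalize to -1..1
      }

      const buffer = audioCtx.createBuffer(1, samples.length, 8000); // mono, 8 kHz
      buffer.copyToChannel(samples, 0);

      const source = audioCtx.createBufferSource();
      source.buffer = buffer;
      source.connect(audioCtx.destination);

      playbackTime = Math.max(playbackTime, audioCtx.currentTime);
      source.start(playbackTime);
      playbackTime += buffer.duration;
    }

If you would rather convert on the server instead, the same mu-law expansion applies there; you would just emit WAV/PCM (or a compressed format) that decodeAudioData understands.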
Alternatively, if you need the audio from a call in a browser, have you considered using Twilio Client to dial into the call?
Related
Not sure if this is something obvious or not. After creating a YouTube LiveBroadcast, binding it to a LiveStream with a specific CDN format (let's say "720p"), and transitioning the broadcast from "ready" to "live" ... how can I change the stream quality without having to create a new broadcast?
Trying to unbind the current stream returns an exception: the stream cannot be unbound.
Trying to bind the broadcast to another stream returns the same exception.
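For reference, the rebind attempt boils down to a single call against the YouTube Data API v3 liveBroadcasts.bind endpoint, roughly like the sketch below (the access token and IDs are placeholders, and omitting streamId is how I attempt the unbind):

    // Sketch of the bind/unbind attempt against liveBroadcasts.bind (placeholders throughout).
    async function rebindBroadcast(accessToken: string, broadcastId: string, newStreamId?: string) {
      const url = new URL("https://www.googleapis.com/youtube/v3/liveBroadcasts/bind");
      url.searchParams.set("id", broadcastId);
      url.searchParams.set("part", "id,contentDetails");
      if (newStreamId) url.searchParams.set("streamId", newStreamId); // no streamId = unbind
      const res = await fetch(url.toString(), {
        method: "POST",
        headers: { Authorization: `Bearer ${accessToken}` },
      });
      if (!res.ok) {
        // Both the unbind and the rebind fail here once the broadcast is live.
        throw new Error(`bind failed: ${res.status} ${await res.text()}`);
      }
      return res.json();
    }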
In addition, the support pages for YouTube live streaming suggest that "ingest settings cannot be modified after the broadcast has started". They say nothing about the API itself not supporting this, but it looks like a major limitation somewhere deeper; I had assumed it only applied to the web Live Control Room.
I need this functionality so that I can change the stream quality when a user switches from WiFi to mobile data. Currently, streaming RTMP data at a different resolution than what the LiveStream CDN format is configured for results in health errors and encoding artifacts on YouTube's side. As suggested by the support pages, creating a "1080p" live stream ("maximum expected resolution") should work, but when that stream receives a 720p or 480p feed it either doesn't start at all or goes to a gray scene with high-pitched audio, depending on whether it was already started (my stream is sent correctly, since I can output it to a dozen other outputs, like MP4, FLV, and other RTMP servers).
Solution?
Context
Most RTP streams (e.g. from an IP camera) need some information from an SDP in order to be decoded.
The SDP is usually fetched just in time, typically from an RTSP URL, although other means are possible (e.g. HTTP).
Specific case
We have a situation where an RTP stream (from a camera, sent over UDP at all times whether anyone is listening or not) will be played using VLC, but providing VLC with an RTSP URL to fetch the SDP just in time is not an option.
There is an RTSP service, but we need to query it in advance and dump the resulting SDP file so we can feed it to VLC later. Doing an RTSP query just in time is pointless anyway, since the stream exists at all times.
How to do that with VLC?
Search before you post
Of course I've been searching Google, the VideoLAN wiki and StackExchange.
Information is difficult to find because when people talk about streaming, RTSP and RTP, they are generally using VLC to generate an RTP stream, or outputting an SDP that VLC generates because it does the encoding, etc.
That's not the case here. The SDP to dump comes from the server via a single RTSP query.
Question
Basically, I'm looking for a command-line like:
vlc --sout...something...rtsp://sourceIP:Port/...something...out...myfile.sdp
That would dump the SDP into myfile.sdp.
Then, later, running VLC with myfile.sdp as its argument is expected to play the stream.
We did not find a solution using VLC alone (I even looked a little at the VLC source code). So we used a somewhat "brute force" solution, but hey, it works.
What we do at configuration time is ask VLC to play the stream once while Wireshark captures packets with the filter rtsp and sdp. One packet appears containing the SDP data we want. We select it, use "extract selected bytes to ...", and save it to a file whose name ends in .sdp.
That gives us a file containing the SDP information we want. Job done.
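If you would rather not involve Wireshark, the same DESCRIBE exchange can be scripted directly and the body dumped to a file. A rough sketch in TypeScript for Node, assuming the camera requires no RTSP authentication; host, port and path are placeholders:

    // Sketch: send one RTSP DESCRIBE and save the SDP body that comes back.
    import * as net from "net";
    import * as fs from "fs";

    const host = "sourceIP";                         // placeholder
    const port = 554;
    const rtspUrl = `rtsp://${host}:${port}/stream`; // placeholder path

    const socket = net.createConnection(port, host, () => {
      socket.write(
        `DESCRIBE ${rtspUrl} RTSP/1.0\r\n` +
        "CSeq: 1\r\n" +
        "Accept: application/sdp\r\n" +
        "\r\n"
      );
    });

    let response = "";
    socket.on("data", (chunk) => {
      response += chunk.toString("utf8");
      // The SDP body starts after the blank line that ends the RTSP headers.
      const headerEnd = response.indexOf("\r\n\r\n");
      if (headerEnd !== -1 && response.includes("v=0")) {
        fs.writeFileSync("myfile.sdp", response.slice(headerEnd + 4));
        socket.end();
      }
    });

With an authenticating camera you would have to add the usual Basic/Digest handling, at which point Wireshark starts looking attractive again.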
From what I have gathered so far, Apple provides tools to make a Mac act as an HTTP Live Streaming server. But my goal is different: I want to make iDevices act as the HTTP Live Streaming server (for the local network only).
Can it be done at all?
Yes and no. Apple does not provide a way to stream encoded media data, so that part is 100% up to you. Also, Apple does not provide a way to access encoded frames directly (i.e. you can easily get an encoded file or the raw frames, but not easily get the encoded frames themselves). So you need to develop a way to get these encoded frames out of the files for streaming, or encode the raw frames on the fly.
It may or may not fit your use case, but if you first write the streamer portion, you should be able to save small/short clips to disk and stream them out as they are created, with minimal overall latency.
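To make the short-clips idea concrete: the serving side is mostly just regenerating a live HLS media playlist as each clip lands on disk; the segmenting/encoding itself still has to come from your own capture pipeline. A sketch of that playlist generation in TypeScript (the segment names and the 4-second target duration are assumptions):

    // Sketch: build a sliding-window HLS media playlist for segments already on disk.
    function buildLivePlaylist(
      segmentNames: string[],   // e.g. the 3-5 most recent clips
      mediaSequence: number,    // sequence number of the first listed segment
      targetDuration = 4        // seconds per clip (assumption)
    ): string {
      const lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        `#EXT-X-TARGETDURATION:${targetDuration}`,
        `#EXT-X-MEDIA-SEQUENCE:${mediaSequence}`,
      ];
      for (const name of segmentNames) {
        lines.push(`#EXTINF:${targetDuration.toFixed(1)},`);
        lines.push(name);
      }
      // No #EXT-X-ENDLIST while live, so players keep re-polling the playlist.
      return lines.join("\n") + "\n";
    }

    // Example: buildLivePlaylist(["seg102.ts", "seg103.ts", "seg104.ts"], 102)

Serve that text as application/vnd.apple.mpegurl next to the clips from a small embedded HTTP server on the device.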
I am trying to find a way to transcode an RTSP stream to HTTP (for iOS) so that I can view the RTSP stream on an iPad. The video is embedded in our SaaS web view, and launching a third-party player is not a possibility.
I found sirannon, which according to the documentation can do this no problem.
However, I am puzzled about how to actually execute it.
Our RTSP stream is: rtsp://xxx.xxx.xxx.xxx:554/ch0_unicast_firststream
There is no .sdp file or anything, and VLC can play it fine.
But if I open a browser and attempt to open http://localhost:8080/RTSP-proxy/192.168.33.216/ch0_unicast_firststream
or
http://localhost:8080/RTSP-proxy/192.168.33.216:554/ch0_unicast_firststream
it gives me this error
[1516250] Warning: core.HTTP-server: Handling RuntimeError: Could not
guess container type for
URL(/RTSP-proxy/192.168.33.216/ch0_unicast_firststream)
(core.HTTP-server.session-42)
So far I haven't found any good examples of using sirannon. I am also open to using VLC, but again, I don't know if or how to do an RTSP to HTTP conversion with VLC.
With Wowza we are able to preemptively start the root stream ahead of the HTTP stream request.
With a delay of about 3 seconds, the client is able to connect nicely. We then monitor the client list to see if it is empty; if so, we kill the stream.
I've been looking into this for personal use and got the same error. After some digging (notably in src/Communicator/HTTP/HTTPSession.cpp), I've made some progress.
While it's not explicitly in the documentation, it seems that you have to add the container type to the app type in the proxy, just as if you were streaming a file. As such:
http://localhost:8080/RTSP-proxy#[CONTAINER_TYPE]/192.168.33.216:554/ch0_unicast_firststream
There is a web page with a Flash stream on it. I want to download this stream and forward it to another streaming server, and where possible replace the audio stream (e.g. with a translation), but without recompressing the video stream. The usual way to do this at the moment is to capture and rebroadcast the Flash player view from the web page, which is obviously suboptimal because the video needs to be recompressed, making the quality notably worse and loading the CPU.
Does anyone have an idea how to do this? VLC seems to be able to act as a relay, but it also appears not to support RTMP at all.
If you're prepared to do this programmatically, you can use crtmpserver (C++) or red5 (Java) with any RTMP client; otherwise this question doesn't belong on SO.