I want to create an app which plays videos using the RTP protocol.
I just need to know if anyone can provide any resources/websites, as to where I can find information on how to use the RTP protocol.
What about the RFCs?
RTP RFC 3550:
http://www.ietf.org/rfc/rfc3550.txt
Also you should take a look at the RTCP protocol, which provides control and quality feedback for an RTP stream. RTCP itself is specified alongside RTP in RFC 3550; the companion RFC 3605 covers the RTCP attribute in SDP:
http://www.ietf.org/rfc/rfc3605.txt
For implementing an RTP server in C++ you could take a look at JRTPLIB:
http://research.edm.uhasselt.be/~jori/page/index.php?n=CS.Jrtplib
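To get a feel for what RFC 3550 specifies, here is a minimal sketch (in Python, purely as an illustration of the wire format) of parsing the 12-byte fixed RTP header. Real packets may also carry CSRC entries and a header extension, which this ignores:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the 12-byte fixed RTP header (RFC 3550, section 5.1)."""
    if len(packet) < 12:
        raise ValueError("packet shorter than the fixed RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for RFC 3550 streams
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # e.g. 0 = PCMU, 96+ = dynamic
        "sequence_number": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }
```

The payload (what a depacketizer hands to the codec) starts after the header, any CSRC list, and any extension.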
I'm trying to stream audio from a Twilio call to a browser.
I want to use Twilio Media Streams, which sends base64-encoded audio/x-mulaw data at an 8000 Hz sample rate (according to this: https://www.twilio.com/docs/voice/twiml/stream)
I tried playing back the audio in a browser using audioContext.decodeAudioData but I am getting an exception:
DOMException: The buffer passed to decodeAudioData contains an unknown content type.
I think I need to resample the data and add a header or something of that sort but I'm unable to figure it out.
Any help would be much appreciated
Twilio developer evangelist here.
I have not tried this myself, so I can only give some pointers. You are right that the audio from Twilio is coming as audio/x-mulaw and that browsers do not support this format.
There is a good set of documentation on MDN about web audio codecs and containers. CanIUse has data on browser support for common formats (e.g. mp3).
So you will need to resample the audio into a supported format before you send it to the browser. I don't have any suggestions for tools for that, particularly as I don't know what you are building your server in.
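For what it's worth, audio/x-mulaw is G.711 μ-law: each byte expands to one signed 16-bit PCM sample, so converting to a supported format can start with a plain decode like this sketch (Python for illustration; the bias and shift constants come from the G.711 companding algorithm):

```python
MULAW_BIAS = 0x84  # 132, per G.711

def mulaw_decode_byte(b: int) -> int:
    """Decode one 8-bit mu-law byte to a signed 16-bit PCM sample."""
    b = ~b & 0xFF                      # mu-law bytes are transmitted inverted
    sign = b & 0x80
    exponent = (b >> 4) & 0x07
    mantissa = b & 0x0F
    magnitude = (((mantissa << 3) + MULAW_BIAS) << exponent) - MULAW_BIAS
    return -magnitude if sign else magnitude

def mulaw_to_pcm16(data: bytes) -> bytes:
    """Decode a mu-law byte string to little-endian 16-bit PCM."""
    out = bytearray()
    for byte in data:
        out += mulaw_decode_byte(byte).to_bytes(2, "little", signed=True)
    return bytes(out)
```

The resulting PCM could then be wrapped in a WAV header server-side, or scaled by 1/32768 into Float32 samples and fed to the Web Audio API directly instead of going through decodeAudioData.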
Alternatively, if you need the audio from a call in a browser, have you considered using Twilio Client to dial into the call?
I am trying to stream RTP packets (carrying audio) from an RTP URL, e.g. rtp://@225.0.0.0
After a lot of research on this, I have managed to stream the URL on my device and play it with https://github.com/maknapp/vlckitSwiftSample.
This only plays the streamed data; it does not have any function to store the data.
From research and other sources I didn't find much clear, simple information on how to receive packets over RTP and store them on an iOS device.
I have tried the following links:
https://github.com/kewlbear/FFmpeg-iOS-build-script
https://github.com/chrisballinger/FFmpeg-iOS
These two do not even compile due to Pod issues, and other projects or guides only give references for RTSP streams instead of RTP streams.
If anyone can give us guidance or any idea of how to implement this, it will be appreciated.
First and foremost, you need to understand how this works.
The sender, i.e. the creator of the RTP stream, is probably doing the following:
Uses a source for the data: in the case of audio, this could be the microphone, audio samples, or a file
Encodes the audio using an audio codec such as AAC or Opus.
Uses an RTP packetizer to create RTP packets from the encoded audio frames
Uses a transport layer such as UDP to send these packets
Protocols such as RTSP provide the necessary signaling to describe the stream. Usually RTP by itself isn't enough, as things such as congestion control, feedback, and dynamic bit rate are handled with the help of RTCP.
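The packetizer step above can be sketched as follows (Python, just to illustrate the mechanics): build the 12-byte fixed header from RFC 3550 and append one encoded frame as the payload. A real packetizer additionally follows the payload-format RFC for the specific codec (fragmentation rules, codec-specific headers), which this skips:

```python
import struct

def build_rtp_packet(payload: bytes, seq: int, timestamp: int,
                     ssrc: int, payload_type: int, marker: bool = False) -> bytes:
    """Prepend the fixed RTP header (version 2, no padding/extension/CSRCs)."""
    byte0 = 2 << 6                                   # V=2, P=0, X=0, CC=0
    byte1 = (0x80 if marker else 0x00) | (payload_type & 0x7F)
    header = struct.pack("!BBHII", byte0, byte1,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + payload
```

The sequence number increments by one per packet, while the timestamp advances in units of the codec's clock rate (e.g. +160 per 20 ms packet at 8000 Hz).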
Anyway, in order to store the incoming stream, you need to do the following:
Use an RTP depacketizer to get the encoded audio frames out of it. You can write your own or use a third-party implementation. In fact, ffmpeg is a big framework which has all the necessary code for most codecs and protocols. However, for your case, find a simple RTP depacketizer. There may be payload headers specific to a particular codec, so make sure you refer to the correct RFC.
Once you have access to the encoded frames, you can write them into a media container such as M4A or Ogg, depending on the audio codec used in the stream.
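As an illustration of the container step: if the frames end up as raw PCM (e.g. G.711 decoded to 16-bit samples), the simplest container to write by hand is WAV; compressed codecs like AAC need a proper muxer (e.g. ffmpeg) to produce M4A or Ogg. A Python sketch using the standard-library wave module:

```python
import io
import wave

def pcm16_to_wav(pcm: bytes, sample_rate: int = 8000, channels: int = 1) -> bytes:
    """Wrap raw little-endian 16-bit PCM samples in a WAV container."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(channels)
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(sample_rate)
        w.writeframes(pcm)
    return buf.getvalue()
```

The returned bytes can be written straight to a .wav file on the device.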
In order to play the stream, you need to do the following:
Use an RTP depacketizer to get the encoded audio frames out of it. You can write your own or use a third-party implementation. In fact, ffmpeg is a big framework which has all the necessary code for most codecs and protocols. However, for your case, find a simple RTP depacketizer.
Once you have access to the encoded frames, use an audio decoder (available as a library) to decode them, or check whether your platform supports that codec directly for playback.
Once you have access to the decoded frames, on iOS you can use AVFoundation to play them.
If you are looking for an easy way to do it, maybe use a third-party implementation such as http://audiokit.io/
I would like to provide a stream of images via RTSP using Indy 10 components. I don't need to know all the individual requests and all, that's all covered separate from what I need. But what Indy component should I use and how should I use it? This stream will not consist of sound, only images.
Note that RTSP is very similar to HTTP, but with a different structure.
Indy does not have any RTSP or RTP/RTCP components, so you will have to implement those protocols from scratch. RTSP is a textual-based protocol, so you can use TIdCmdTCPServer, though it may be better to derive from TId(Custom)TCPServer and override its DoExecute() method to avoid duplicated code (reading headers, processing URLs, etc), like TIdHTTPServer does. As for the images, you can use TIdUDPClient to send the RTP/RTCP packets as needed.
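Since RTSP requests look like HTTP requests (a request line plus CRLF-delimited headers), the parsing a DoExecute() override would do boils down to something like this sketch (shown in Python only to illustrate the wire format; in Delphi you would read the same lines from the connection's IOHandler):

```python
def parse_rtsp_request(raw: str):
    """Split an RTSP request head into (method, url, version, headers)."""
    head = raw.split("\r\n\r\n", 1)[0]          # drop any body
    lines = head.split("\r\n")
    method, url, version = lines[0].split(" ", 2)
    headers = {}
    for line in lines[1:]:
        if line:
            name, _, value = line.partition(":")
            headers[name.strip()] = value.strip()
    return method, url, version, headers
```

The methods to handle for a minimal server are OPTIONS, DESCRIBE, SETUP, PLAY, and TEARDOWN; the Transport header of SETUP tells you which client ports to send the RTP/RTCP packets to.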
I have an RTP stream generated by ffmpeg; now I need to restream it to RTSP with live555.
There is a way to restream from one RTSP source to another RTSP server: the LIVE555 Proxy Server.
But how do I modify the Proxy Server to restream an RTP source to RTSP (I can get the SDP description)?
i.e.
source stream: rtp://192.168.1.10:55555
restream to: rtsp://:554/stream1
Any advice will be appreciated. Thanks!
OK, I found the solution.
Inherit from the OnDemandServerMediaSubsession class, and use MPEG4ESVideoRTPSource and MPEG4VideoStreamDiscreteFramer to take the RTP input.
Then make a small change to DynamicRTSPServer by replacing the createNewSMS function.
The RTP proxy now works!
I was wondering if I can use the HTTP protocol to acquire an image stream from an RTSP camera? I am currently using the VLC Media ActiveX Plugin to connect to and view the RTSP stream, but I would like to eliminate the ActiveX control and move to a more raw level of image acquisition. I recall seeing somewhere that it's possible to get these images using HTTP. I'd like to use the Indy TIdHTTP component to connect to the camera and acquire the image. I'm also assuming this would need some sort of speed control, such as a delay in between requests. However, it's also my understanding that these RTSP cameras have pre-defined frame rates, which clients using the standard RTSP protocol are supposed to follow.
Many cameras will allow you to grab snapshots with a URL that might look like:
http://user:password@camera/snapshot.jpg
For a proper stream, you would need to use RTSP (there are Delphi RTSP clients), tunnelling over HTTP if your device supports the application/x-rtsp-tunnelled content type, or another stream type your device supports.
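The snapshot-URL approach is just HTTP polling at a fixed interval. A sketch of the pacing logic (Python here purely for illustration; with Indy you would call TIdHTTP.Get inside the same loop), where the fetch callable stands in for the HTTP GET of the camera's snapshot URL:

```python
import time
from typing import Callable, List

def poll_snapshots(fetch: Callable[[], bytes], interval_s: float,
                   count: int) -> List[bytes]:
    """Grab `count` frames via `fetch` (e.g. an HTTP GET of the camera's
    snapshot URL), sleeping between requests to cap the request rate."""
    frames: List[bytes] = []
    for i in range(count):
        if i:
            time.sleep(interval_s)   # delay between requests, not before the first
        frames.append(fetch())
    return frames
```

Keep in mind that many cameras throttle or reject rapid snapshot requests, so the interval matters in practice, and this approach cannot match the camera's native RTSP frame rate.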