Proxy RTP source stream to RTSP via live555

I have an RTP stream generated by ffmpeg, and now I need to restream it to RTSP with live555.
There is a way to restream from one RTSP source to another RTSP server: the LIVE555 Proxy Server.
But how can the Proxy Server be modified to restream an RTP source to RTSP (I can get the SDP description)?
i.e.
source stream: rtp://192.168.1.10:55555
restream to: rtsp://:554/stream1
Any advice will be appreciated. Thanks!

OK, I found the solution.
Inherit from the OnDemandServerMediaSubsession class, and use MPEG4ESVideoRTPSource and MPEG4VideoStreamDiscreteFramer to read the RTP input.
Then make a small change to DynamicRTSPServer by replacing the createNewSMS function.
This RTP proxy now works!
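A minimal sketch of the subsession described above, assuming the live555 headers are available. The class name, the listening port, and the payload type 96 / 90000 Hz timestamp frequency are illustrative and must match the source SDP; this is not the original poster's exact code:

```cpp
// Sketch only: subclass OnDemandServerMediaSubsession so that the stream
// source is an RTP receiver instead of a file reader. Names are hypothetical.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

class RTPProxySubsession : public OnDemandServerMediaSubsession {
public:
  static RTPProxySubsession* createNew(UsageEnvironment& env, portNumBits rtpPort) {
    return new RTPProxySubsession(env, rtpPort);
  }
protected:
  RTPProxySubsession(UsageEnvironment& env, portNumBits rtpPort)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/),
      fRTPPort(rtpPort) {}

  // Receive the incoming RTP stream and wrap it in a discrete framer:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; a guess
    struct in_addr addr; addr.s_addr = INADDR_ANY;
    Groupsock* gs = new Groupsock(envir(), addr, Port(fRTPPort), 255);
    RTPSource* src = MPEG4ESVideoRTPSource::createNew(
        envir(), gs, 96 /*payload type*/, 90000 /*timestamp frequency*/);
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), src);
  }

  // Re-send what was received as MPEG-4 ES over the server's own RTP sink:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
                                          rtpPayloadTypeIfDynamic);
  }
private:
  portNumBits fRTPPort;
};
```

An instance of this subsession would then be added to a ServerMediaSession inside the replaced createNewSMS function.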

Related

How to receive RTP Packets which are Streaming from RTP URL in iOS Device? (e.g. rtp://#225.0.0.0)

I am trying to receive RTP packets (carrying audio) from an RTP URL, e.g. rtp://#225.0.0.0.
After a lot of research I have managed to play the URL on my device using https://github.com/maknapp/vlckitSwiftSample.
This only plays the streamed data, but it has no function to store the data.
From research and other sources I did not find much clear, simple information on how to receive packets over RTP and store them on an iOS device.
I have tried with following link.
https://github.com/kewlbear/FFmpeg-iOS-build-script
https://github.com/chrisballinger/FFmpeg-iOS
Neither of these even compiles, due to CocoaPods issues; other projects and guides only cover RTSP streams instead of RTP streams.
If anyone can give us guidance or an idea of how to implement this, it will be appreciated.
First and foremost, you need to understand how this works.
The sender, i.e. the creator of the RTP stream, is probably doing the following:
- Uses a source for the data: in the case of audio, this could be the microphone, audio samples, or a file
- Encodes the audio using an audio codec such as AAC or Opus
- Uses an RTP packetizer to create RTP packets from the encoded audio frames
- Uses a transport layer such as UDP to send these packets
Protocols such as RTSP provide the necessary signaling to describe the stream. RTP alone usually isn't enough, as things such as congestion control, feedback, and dynamic bit rate are handled with the help of RTCP.
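The packetization step above can be sketched against the fixed 12-byte RTP header from RFC 3550. The function name and the payload-type/SSRC values below are illustrative, not part of any particular library:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Build a minimal RTP packet: the fixed 12-byte header (RFC 3550, section 5.1)
// followed by one encoded audio frame. No CSRC list, no header extension.
std::vector<uint8_t> make_rtp_packet(uint16_t seq, uint32_t timestamp,
                                     uint32_t ssrc, uint8_t payload_type,
                                     const std::vector<uint8_t>& frame) {
    std::vector<uint8_t> pkt(12 + frame.size());
    pkt[0] = 0x80;                           // V=2, P=0, X=0, CC=0
    pkt[1] = payload_type & 0x7F;            // M=0, 7-bit payload type
    pkt[2] = seq >> 8;                pkt[3] = seq & 0xFF;
    pkt[4] = timestamp >> 24;         pkt[5] = (timestamp >> 16) & 0xFF;
    pkt[6] = (timestamp >> 8) & 0xFF; pkt[7] = timestamp & 0xFF;
    pkt[8] = ssrc >> 24;              pkt[9] = (ssrc >> 16) & 0xFF;
    pkt[10] = (ssrc >> 8) & 0xFF;     pkt[11] = ssrc & 0xFF;
    std::memcpy(pkt.data() + 12, frame.data(), frame.size());
    return pkt;
}
```

Each successive packet increments seq by one and advances timestamp by the number of samples covered by the frame.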
Anyway, in order to store the incoming stream, you need to do the following:
- Use an RTP depacketizer to extract the encoded audio frames from the packets. You can write your own or use a third-party implementation. ffmpeg is a big framework that has the necessary code for most codecs and protocols, but for your case a simple RTP depacketizer is enough. There may be a payload header specific to the codec, so make sure you refer to the correct RFC.
- Once you have access to the encoded frames, you can write them into a media container such as m4a or ogg, depending on the audio codec used in the stream.
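A depacketizer's header handling can be sketched as a minimal parser for the fixed RFC 3550 header. The names are hypothetical, and header extensions and codec-specific payload headers are deliberately left out:

```cpp
#include <cstddef>
#include <cstdint>

// Parsed fields of an RTP fixed header (RFC 3550, section 5.1).
struct RtpHeader {
    uint8_t  payload_type;
    uint16_t seq;
    uint32_t timestamp;
    uint32_t ssrc;
    size_t   header_len;   // offset where the encoded frame begins
};

// Returns false for anything that is not a plausible RTP packet.
// Header extensions (X bit set) are not handled in this sketch.
bool parse_rtp(const uint8_t* p, size_t len, RtpHeader& h) {
    if (len < 12) return false;
    if ((p[0] >> 6) != 2) return false;   // version must be 2
    if (p[0] & 0x10) return false;        // extension present: not handled here
    uint8_t cc = p[0] & 0x0F;             // number of CSRC entries
    h.payload_type = p[1] & 0x7F;
    h.seq       = (uint16_t)((p[2] << 8) | p[3]);
    h.timestamp = ((uint32_t)p[4] << 24) | (p[5] << 16) | (p[6] << 8) | p[7];
    h.ssrc      = ((uint32_t)p[8] << 24) | (p[9] << 16) | (p[10] << 8) | p[11];
    h.header_len = 12 + 4u * cc;
    return len >= h.header_len;           // encoded frame starts at p + header_len
}
```

The bytes from header_len onward are what goes into the container (possibly after stripping a codec-specific payload header).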
In order to play the stream, you need to do the following:
- Use an RTP depacketizer to extract the encoded audio frames, exactly as above.
- Once you have access to the encoded frames, use an audio decoder (available as a library) to decode them, or check whether your platform supports that codec directly for playback.
- Once you have decoded frames, on iOS you can use AVFoundation to play them.
If you are looking for an easy way to do it, maybe use a third-party implementation such as http://audiokit.io/

Use VLC to fetch SDP file once using RTSP

Context
Most RTP streams (e.g. from an IP camera) need some information from an SDP to be decodable.
The SDP is usually fetched just in time, typically from an RTSP URL, but other means are possible (e.g. HTTP).
Specific case
We have a situation where an RTP stream (from a camera, sent over UDP at all times whether anyone is listening or not) will be played using VLC, but giving VLC an RTSP URL to fetch the SDP just in time is not an option.
There is an RTSP service, yet we need to query it in advance and dump the resulting SDP file so we can feed it to VLC later. Doing an RTSP query just in time is pointless anyway, since the stream exists at all times.
How to do that with VLC?
Search before you post
Of course I've searched Google, the VideoLAN wiki, and StackExchange.
Information is difficult to find because when people talk about streaming, RTSP, and RTP, they are generally using VLC to generate an RTP stream, or to output an SDP that VLC itself generates because it does the encoding, etc.
That's not the case here. The SDP to dump comes from the server with a single RTSP query.
Question
Basically, I'm looking for a command-line like:
vlc --sout...something...rtsp://sourceIP:Port/...something...out...myfile.sdp
That would dump the SDP into myfile.sdp.
Then, later, running VLC with myfile.sdp as its argument is expected to play the stream.
We did not find a solution using VLC alone (I even looked a little at the VLC source code), so we used a somewhat brute-force solution, but hey, it works.
What we do at configuration time is ask VLC to play the stream once, while Wireshark captures packets with the filters rtsp and sdp. One packet appears containing the SDP data we want. We select it, use "Extract selected bytes to...", and save to a file with a name ending in .sdp.
That gives us a file containing the SDP information we want. Job done.
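For completeness: the single RTSP query mentioned above is a DESCRIBE request, so the SDP can also be dumped without Wireshark by speaking a few lines of RTSP over a TCP connection to the camera. A sketch of the request itself (the helper name is hypothetical; the SDP arrives in the body of the reply):

```cpp
#include <string>

// Build a minimal RTSP DESCRIBE request (RFC 2326). Sent over TCP to the
// camera's RTSP port, the reply body after the headers is the SDP to save.
std::string make_describe_request(const std::string& rtsp_url) {
    return "DESCRIBE " + rtsp_url + " RTSP/1.0\r\n"
           "CSeq: 1\r\n"
           "Accept: application/sdp\r\n"
           "User-Agent: sdp-dump\r\n"
           "\r\n";
}
```

This skips authentication; a camera requiring it would answer 401 and expect a second request with credentials.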

DVB-S streaming from source to media server

I am streaming some FTA channels from a
http://www.tbsdtv.com/products/tbs6985-dvb-s2-quad-tuner-pcie-card.html
using MediaPortal
http://www.team-mediaportal.com/
Then I get the RTSP URL from MediaPortal for the channel I timeshift,
and with VLC I can send that stream to a media server (FMS) to get HLS, HDS, RTMP, and RTSP.
I have 3 servers running erlyvideo (flussonic),
so they take care of the delivery.
I want some alternative solution besides that.
I have tried some methods to work this out,
including
VLC
IPTVL
DVBDream
but the quality is better when I stream something as a file; only FMLE works well with a live stream, and for that we can only use DirectShow-enabled devices like
http://www.viewcast.com/products/osprey-cards
I am doing this on Windows.
If someone has more methods or wants to share their approach, please do.

How to intercept packets from RTP-stream on loopback and access the data through C++-code?

I want to be able to intercept the packets of this stream and access the data from C++ code.
How do I do this in C++?
The RTP media stream is sent using this server: link
I will then FEC-encode the packets, send them over the network, FEC-decode them on the receiver side, and pass the data stream to an RTCP client.
Evaluate some open-source stacks for media streams (live555, openRTSP, VLC, MPlayer). With any of these you can do the following:
- install
- instrument
- build
- run with sample streams
- observe the debug and/or logger output
For example, in the live555 source,
./live/liveMedia/MultiFramedRTPSource.cpp:MultiFramedRTPSource::~MultiFramedRTPSource()
you will find the frame handlers.
For another example, see here for the log of a player (VLC) getting a stream (RTSP/RTP) available from any YouTube entry. If you want, you can drill into any YouTube videoID for a related streaming link; then you could use that link as a source in your testing.
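Before reaching for a full media stack, note that the interception itself needs nothing more than a UDP socket bound to the loopback address. The following self-contained sketch (function name hypothetical, POSIX sockets) sends one fake RTP packet to itself and reads it back, which is exactly what a receiver listening on 127.0.0.1 would see:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>

// Bind a UDP socket on loopback, send a fake RTP packet to ourselves,
// receive it, and return its RTP sequence number (or -1 on error).
int loopback_rtp_roundtrip() {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) return -1;

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = 0;                         // let the OS pick a free port
    if (bind(fd, (sockaddr*)&addr, sizeof addr) < 0) { close(fd); return -1; }
    socklen_t alen = sizeof addr;
    getsockname(fd, (sockaddr*)&addr, &alen);  // learn the chosen port

    // Fake packet: 12-byte RTP header (V=2, PT=96, seq=42) + 4 payload bytes.
    uint8_t pkt[16] = {0x80, 96, 0, 42, 0,0,0,0, 0,0,0,0, 1,2,3,4};
    sendto(fd, pkt, sizeof pkt, 0, (sockaddr*)&addr, sizeof addr);

    uint8_t buf[1500];
    ssize_t n = recvfrom(fd, buf, sizeof buf, 0, nullptr, nullptr);
    close(fd);
    if (n < 12 || (buf[0] >> 6) != 2) return -1;  // sanity-check RTP version
    return (buf[2] << 8) | buf[3];                // sequence number
}
```

In the real setup you would bind to the port named in the stream's SDP instead of an OS-chosen one, and feed the bytes after the 12-byte header to your FEC encoder.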

How to use the RTP protocol in iOS

I want to create an app that plays videos using the RTP protocol.
I just need to know if anyone can provide any resources/websites where I can find information on how to use the RTP protocol.
What about the RFCs?
RTP, RFC 3550:
http://www.ietf.org/rfc/rfc3550.txt
You should also take a look at RTCP for controlling an RTP stream. Note that RTCP itself is specified together with RTP in RFC 3550; RFC 3605 covers the RTCP attribute in SDP:
http://www.ietf.org/rfc/rfc3605.txt
For implementing an RTP server in C++ you could take a look at JRTPLIB:
http://research.edm.uhasselt.be/~jori/page/index.php?n=CS.Jrtplib
