Real-time Transport Protocol not used by YouTube? - network-programming

I came across the article about the Real-time Transport Protocol on Wikipedia, which mentions the following:
"RTP is used extensively in communication and entertainment systems that involve streaming media"
I was curious about this protocol and wanted to see it in Wireshark. I thought youtube.com might be using RTP when playing videos, but I was surprised to see that only TCP packets are sent while a video is playing.
Can someone please point me to another free website that uses RTP, so that I can see it in Wireshark? (I actually want to explore network optimization opportunities in my server applications by using RTP, since it is OK to lose a few packets.)

According to the book Computer Networks, RTP is carried as the payload of UDP (or TCP).
[Picture from the book not included here; it illustrates RTP carried as the payload of UDP.]
According to Wireshark's wiki, only RTP over UDP can be detected by Wireshark. (Thanks to Ralf.)

YouTube uses HTTP, AFAIK. Also, keep in mind that RTP can be sent over UDP as well as TCP.
An RTSP server can be used to start an RTP media session. I don't know of any public servers, but another option would be to download the live555 RTSP server; there are also some example media files available for it. Then all you need to do is build the media server application as well as the openRTSP client, and use the client app to connect to the server for the stream. The client can request RTP over UDP, TCP, etc.
Alternatively, you could use Darwin Streaming Server as an RTSP server.
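If you just want to see the protocol in a Wireshark capture without building a full client, here is a minimal sketch in Python that speaks just enough RTSP to get a response out of a locally running server; the host, port and stream URL are assumptions you would replace with whatever live555 or Darwin instance you set up:

    # Minimal sketch of the text-based RTSP handshake that precedes the RTP packets.
    # Start a local RTSP server (e.g. the live555 test server) first; the URL below
    # is a placeholder for whatever test stream it serves.
    import socket

    HOST, PORT = "127.0.0.1", 554                      # 554 is the default RTSP port
    URL = "rtsp://127.0.0.1:554/test.264"              # hypothetical stream path

    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(("OPTIONS " + URL + " RTSP/1.0\r\nCSeq: 1\r\n\r\n").encode())
        print(s.recv(4096).decode())                   # server lists the methods it supports

        s.sendall(("DESCRIBE " + URL + " RTSP/1.0\r\nCSeq: 2\r\n"
                   "Accept: application/sdp\r\n\r\n").encode())
        print(s.recv(4096).decode())                   # SDP describing the RTP media

Capturing while this runs (or while openRTSP plays the stream) will show the RTSP exchange over TCP and, once a real client sets up a session, the RTP packets themselves, typically over UDP.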

Related

What is the major role of a Streaming Media Server?

I am new to live streaming of data. I have been exploring the web for how to live stream video. I am an iOS developer and I want to develop an app that streams video.
I am clear about the fundamentals of live video streaming. I have learned that I will need a streaming media server that feeds the stream to the viewer. I also learned that the viewer has to have a player that decodes the data and synchronizes the audio/video streams.
Now, Wowza is a kind of streaming media server that is often recommended. But I have the following questions:
(1) Why a media server? Why can't we have our own media server? What does a media server actually do that makes its role necessary?
(2) In my app, I will have to integrate a library for encoding and then feed the stream to a streaming server like Wowza. But how is it fed to the streaming server?
(3) How will my server communicate with a streaming server like Wowza?
(4) How will Wowza feed the stream to the receiving side, i.e. the user who has an iPhone and needs to see the live stream?
(5) What should be at the receiving side? What will decode the stream and play it in AVPlayer?
I need to develop a streaming app with good quality, so I would rather first understand the flow of data and then start.
It would be great if someone could give a graphical representation of the data flow.
Thanks a lot in advance!
Let me quickly share my understanding in response to your questions:
1a. Why a media server?
You could write your own software for distributing the stream data to all the players as well, but in that case you would need to implement various transport protocols and you would end up with a fairly big piece of software: your home-grown media server.
1b. What does a media server actually do that makes its role necessary?
One way to see the role of the media server is that it receives the live stream from a source and handles the distribution of that stream to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally, the media server can change the way the video or the audio is encoded (transcoding), or produce streams at different resolutions and qualities and provide the players with the list of available qualities in the form of a manifest file (e.g. an m3u8 or SMIL file) so they can do so-called adaptive streaming.
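As a hedged illustration, such a manifest (an HLS master playlist in this case) is just a short text file listing the available renditions; the bandwidths, resolutions and paths below are made up:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
    low/chunklist.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
    hd/chunklist.m3u8

The player picks one of the listed variant playlists based on the measured bandwidth and can switch between them as conditions change.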
Another typical use case of media servers is serving non-live video files to players from disk, as well as recording live streams, and so on. If you look at the feature list of popular media servers, you'll see that they really do many things, so practically this is something you probably want to get out of the box rather than implement yourself.
2. In my app, I will have to integrate a library for encoding and feed the stream to a streaming server like Wowza. But how is it fed to the streaming server?
You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then you need to choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). It's best if you google for tutorials to see what this looks like in code.
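For a rough idea of the moving parts, here is an illustrative sketch that encodes a file to H.264/AAC and pushes it to an RTMP ingest point by shelling out to ffmpeg; the Wowza host and stream name are placeholders, and a real iOS app would use a capture/encoder library on the device rather than ffmpeg:

    # Illustrative only: encode to H.264/AAC and publish over RTMP via ffmpeg.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-re", "-i", "input.mp4",   # read the source at its native frame rate
        "-c:v", "libx264",          # H.264 video
        "-c:a", "aac",              # AAC audio
        "-f", "flv",                # RTMP carries an FLV container
        "rtmp://your-wowza-host:1935/live/myStream",   # placeholder ingest URL
    ])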
3. How will my server communicate with a streaming server like Wowza?
The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.
4. How will Wowza feed the stream to the receiving side, i.e. the user who has an iPhone and needs to see the live stream?
The player software will initiate the communication with Wowza. This is again protocol dependent, but in case you are using HLS, the player will use HTTP to find out the URLs of the consecutive video chunks that it will progressively download and display to the user.
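As a rough sketch of that download loop (the playlist URL is a placeholder, and a real player also refreshes the playlist and handles timing, buffering and decryption):

    # Rough sketch of what an HLS player does under the hood: fetch the media
    # playlist (chunk list) over HTTP, then download each listed segment in turn.
    import urllib.parse
    import urllib.request

    PLAYLIST = "http://example.com/live/myStream/chunklist.m3u8"   # placeholder URL

    with urllib.request.urlopen(PLAYLIST) as resp:
        lines = resp.read().decode().splitlines()

    # Non-comment lines in a media playlist are the segment URIs.
    for uri in (l for l in lines if l and not l.startswith("#")):
        with urllib.request.urlopen(urllib.parse.urljoin(PLAYLIST, uri)) as seg:
            chunk = seg.read()      # in a real player this is handed to the decoder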
5. What should be at the receiving side? What will decode the stream and play it in AVPlayer?
It's not clear whether the app you are developing is the broadcaster side or the player side. But generally on the player side you need to find a library that is able to pull the stream from the media server with the protocol/transport/codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
I am not going to draw this, but imagine three boxes connected with arrows and that's the data flow: from encoder to streaming server and finally to player. That's it, I guess. :-)

Integrating PubNub WebRTC SDK for iOS

I am stuck integrating the PubNub WebRTC SDK into an iOS application.
It's a JavaScript SDK. How do I integrate this with my iOS app?
Thanks in advance.....
This does not directly answer the Objective-C implementation question, but it might help with understanding the overall solution and the role that PubNub plays.
Why PubNub? - Signaling
WebRTC is not a standalone API; it needs a signaling service to coordinate communication. Metadata needs to be exchanged between callers before a connection can be established. This metadata includes information such as:
Session control messages to open and close connections
Error messages
Codecs/Codec settings, bandwidth and media types
Keys to establish a secure connection
Network data such as host IP and port
Once signaling has taken place, video/audio/data is streamed directly between clients using WebRTC's PeerConnection API. This direct peer-to-peer connection allows you to stream high-bandwidth data, such as video, robustly. HTML5Rocks provides a great guide on all things WebRTC (no need to read it, as it is summarized below).
PubNub makes this signaling incredibly simple, and in addition, gives you the power to do so much more with your WebRTC applications.
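To make that concrete, the payloads relayed through the signaling channel are typically small JSON messages carrying the SDP and the ICE candidates; a hypothetical, truncated sketch:

    { "type": "offer",  "sdp": "v=0\r\no=- 46117 2 IN IP4 127.0.0.1 ..." }
    { "type": "answer", "sdp": "v=0\r\n..." }
    { "candidate": "candidate:1 1 udp 2122260223 192.0.2.15 49203 typ host",
      "sdpMid": "audio", "sdpMLineIndex": 0 }

Which channel you publish these on is up to your application; the point is only that both peers must see each other's offer/answer and candidates before media can flow.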
What PubNub is Not
PubNub is not a server for WebRTC. A signaling service specifies the ICE servers that the video chat can stream through. Public STUN servers provided by Google can be used, but they are not very reliable. STUN or TURN servers are required to traverse a firewall, or else the chat will fail. Many services provide the "total package" of signaling and server in one; that is not PubNub. Our audience is people who want to build their own, more custom service.
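For illustration, the ICE server list handed to a peer connection usually looks roughly like the snippet below; the Google STUN address is a commonly cited public server, and the TURN entry is a placeholder you would replace with your own or a hosted server:

    { "iceServers": [
        { "urls": "stun:stun.l.google.com:19302" },
        { "urls": "turn:turn.example.com:3478", "username": "user", "credential": "secret" }
    ] }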
XirSys
XirSys already has a WebRTC-PubNub demo using Rails on their GitHub. They host STUN and TURN servers catering to the needs of WebRTC.
Open Source
There are a few open source STUN and TURN server projects that can be downloaded and hosted with ease:
Amazon AWS VM: Pre-made ready to deploy
RFC5766 TURN: Google Code, TURN server
One-to-many: instructions on an MCU for one-to-many media servers, necessary for large group chats and streams with hundreds of users or more.
So as you can see, we do not provide audio/video streaming services, but if you are building this solution, PubNub is a necessary piece to tie it all together with the signaling protocol.
AndroidRTC
And here is a PubNub AndroidRTC example built by our interns.

Bulletproof HTTP Monitor for iOS

I'm using Charles Proxy and Wireshark to monitor http(s) traffic from various iOS apps I'm using on my iPhone. These apps require me to set the HTTP Proxy under the iOS Wifi settings (let's call these the Proxy Settings).
My business needs to see ALL URLs that are being called from my phone. From all apps. All URLs, not some of them.
Now Charles and Wireshark both work fine and I can see a ton of traffic coming from my phone.
However, I can't help but wonder whether I might be missing some HTTP calls, perhaps calls that don't use the Cocoa/Core Foundation libraries as the basis for their networking.
For instance, I could write my own HTTP library on top of raw TCP/IP, and those calls would bypass the Proxy Settings.
So my question is: what is the likelihood that some apps are using custom-rolled HTTP libraries and side-stepping my Proxy Settings? Or worse, that they're using raw TCP/IP to communicate with a server? I know it's possible, but do any APIs work this way? Does anyone do it?
I found the answer: use mitmproxy in transparent mode. The phone's proxy setting is not used. It is harder to set up because it needs work on the router, but it reliably captures every packet on ports 80 and 443 regardless of proxy settings.
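For reference, a hedged sketch of that setup as I recall it from the mitmproxy documentation (verify the exact flags against the docs for your mitmproxy version): run mitmproxy in transparent mode on a Linux box acting as the Wi-Fi gateway and redirect ports 80 and 443 to it:

    mitmproxy --mode transparent --showhost
    # on the gateway, send web traffic to mitmproxy's listening port (8080 by default)
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80  -j REDIRECT --to-port 8080
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 443 -j REDIRECT --to-port 8080

For HTTPS you still need the mitmproxy CA certificate installed and trusted on the iPhone, and apps that pin their certificates will refuse the connection.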
Assuming that you are able to keep your device tethered, you may be able to use the pcap service to monitor all traffic. According to the following paper (2014), the pcap service is running on every iOS device:
"Identifying back doors, attack points, and surveillance mechanisms in iOS devices"
You should be able to connect to it via usbmuxd. I'm not sure whether there is a pre-rolled client for the pcap service. There is a list of services supported by libimobiledevice here. Pcap is not on that list.
Alternatively, you can use Wireshark to capture all traffic on your Wi-Fi network.

Customizing RTSP client library for Wowza

I have downloaded a library from this link; it works fine for live streaming within a local network. I am not able to customize this library to communicate with a Wowza server. Please guide me if anyone knows how.
Or else suggest some other open-source iOS client library (the encoding format should be H.264, MPEG-2, or MPEG-4) to communicate with a Wowza server.
I recommend you install Wowza locally on your host, turn on the maximum debug level, and try to establish the session with your client. Then, in the log, you will see what is going on. In my experience, a standard Wowza setup behaves in a pretty predictable way and understands H.264 video plus AAC sound. I implemented an RTSP client for Android and can send you negotiation logs, etc. You, on the other side, could also grab the log of how the session is established to check whether the command order is OK, everything is sane, and what the responses are.
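For comparison, a rough sketch of the command order you would expect to see in such a log when a client publishes a live stream over RTSP (the URL is a placeholder; playback sessions use DESCRIBE/PLAY instead of ANNOUNCE/RECORD):

    OPTIONS  rtsp://wowza-host/live/myStream RTSP/1.0    (query supported methods)
    ANNOUNCE rtsp://wowza-host/live/myStream RTSP/1.0    (client sends the SDP: H.264 + AAC)
    SETUP    rtsp://wowza-host/live/myStream/trackID=0   (one SETUP per track)
    RECORD   rtsp://wowza-host/live/myStream RTSP/1.0    (start pushing RTP)
    TEARDOWN rtsp://wowza-host/live/myStream RTSP/1.0    (end the session)

If Wowza rejects the session, the response code to one of these requests usually tells you which step is off.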

Which connection class type should we use when connecting to a server to stream an audio file?

I need to connect to a server where an MP3 file resides and stream it. If I am not using an RTSP connection, what do you suggest I use? Can we use RTSP in this case?
Is it fine to use an HTTP connection? I used that, but it seems to take a long time to actually connect to the server. Can I improve the performance by using any other connection class?
Thanks
HTTP over Wi-Fi, Direct TCP, or WAP2 is the best connection method to use for streaming audio if you want to reduce the number of intermediate proxies and carrier networks. Transports such as BES and BIS go through third-party infrastructure (the enterprise's in the case of BES and RIM's in the case of BIS), which adds another point of failure in the path. Not only that, but I have asked RIM employees directly what their thoughts were on streaming media over BIS, and their short but sweet response is "don't". They don't want the extra traffic going over their network. I have heard from the BlackBerry forums that large HTTP transfers aren't very reliable over BIS anyway. In a similar way, BES admins probably don't like apps that try to stream a lot of media through their servers either.
That leaves Wi-Fi, WAP2 and Direct TCP. Wi-Fi is a no-brainer for devices that have it (and users who are connected), but remember most CDMA devices don't have Wi-Fi, so only a small percentage of users may have it. WAP2 is nice in that it doesn't require manual "APN" configuration; however, in my experience not all carriers are set up for WAP2. So you may want to try that first and, if it doesn't work (i.e. there are no WAP2 service records, or connections over WAP2 fail), use Direct TCP.
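For context, on BlackBerry Java the transport is usually selected by appending a connection suffix to the URL passed to Connector.open(). The suffixes below are from memory and should be checked against the BlackBerry documentation for your OS version:

    http://example.com/song.mp3;interface=wifi        (force the Wi-Fi radio)
    http://example.com/song.mp3;deviceside=true       (Direct TCP; may also need carrier APN settings)
    http://example.com/song.mp3;ConnectionUID=<uid>   (WAP2, using the UID of the carrier's WAP2 service record)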
