Red5 live stream always drops frames - ActionScript

I have a live streaming site built on a Red5 server, and when I subscribe, the live video seems to drop frames.
I am using the oflaDemo for my RTMP application.
Here is the ActionScript I use for the camera settings:
Broadcast
// setup cam
cam = Camera.get();
// setting dimensions and framerate
cam.setMode(320, 240, 15, false);
// set quality: (bandwidth, quality); 0 for both lets each vary as needed
cam.setQuality(0, 0);
// transmit a full keyframe every 48th frame
cam.setKeyFrameInterval(48);
The subscribe side uses the same settings as the broadcast side.
Can anyone give me a hand?

In general, dropped frames (or gaps in the audio signal) are a sign of bandwidth issues.
These are our settings; frames are never dropped (as long as the bandwidth is okay):
cam.setMode(160, 120, 30, true);
cam.setQuality(0,90);
We have never used "cam.setKeyFrameInterval(48);". I would not set this value at all, and I would not force any particular value on the video compression. Do you have a reason to think 48 is a good value for it?
Sebastian

Related

Creating an RTSP client for live audio and video broadcasting in Objective-C

I am trying to create an RTSP client which live-broadcasts audio and video. I modified the iOS code at http://www.gdcl.co.uk/downloads.htm and am able to broadcast the video to the server properly. But now I am facing issues with the audio part. In the linked example, the code is written in such a way that it writes the video data to a file, then reads the data from the file and uploads the video NALU packets to the RTSP server.
For the audio part, I am not sure how to proceed. Right now, what I have tried is to get the audio buffer from the mic and broadcast it to the server directly by adding RTP headers and ALU, but this approach is not working properly: the audio starts lagging behind, and the lag increases with time. Can someone let me know if there is a better approach to achieve this, with lip-synced audio/video?
Are you losing any packets on the client? If so, you need to leave "space." If you receive packets 1, 2, 3, 4, 6, 7, you need to leave space for the missing packet (5); see the sketch below.
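To make the "leave space" idea concrete, here is a minimal sketch in Python; the packet size, the 16-bit sample format, and the function name are illustrative assumptions, not part of any particular RTP stack:

# Fill gaps in a playout buffer with silence so lost packets keep their slot.
SAMPLES_PER_PACKET = 960  # assumed: 20 ms packets at 48 kHz

def fill_playout_buffer(packets):
    """packets: list of (seq, pcm_bytes) tuples, sorted, possibly with gaps."""
    out = bytearray()
    expected = packets[0][0]
    for seq, pcm in packets:
        # Insert silence for each missing sequence number so the timeline
        # keeps its shape instead of compressing when a packet is lost.
        while expected < seq:
            out += b"\x00" * (SAMPLES_PER_PACKET * 2)  # 16-bit mono silence
            expected += 1
        out += pcm
        expected += 1
    return bytes(out)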
The other possibility is what is known as a clock-drift problem. The clocks (crystals) on your client and server are not perfectly in sync with each other.
This can be caused by the environment, temperature changes, etc.
Let's say that in a perfect world your server is producing 20 ms audio packets at 48000 Hz and your client is playing them back at a sample rate of 48000 Hz. Realistically, your client and server are not at exactly 48000 Hz: your server might be at 48000.001 and your client at 47999.9998, so your server might be delivering faster than your client, or vice versa. You would either consume packets too fast and underrun the buffer, or lag too far behind and overflow the client buffer. In your case, it sounds like the client is playing back too slowly and gradually lagging behind the server. You might only lag a couple of milliseconds per minute, but the issue keeps compounding, and it will end up looking like a badly dubbed 1970s Kung Fu movie.
In other devices, there is often a common clock line to keep things in sync: video camera clocks, MIDI clocks, multitrack recorder clocks.
When you deliver data over IP, there is no common clock shared between a client and a server, so your issue is one of syncing clocks between disparate devices with no shared reference. I have successfully solved this problem using this general approach:
A) Let the client count the rate of packets that come in over a period of time.
B) Let the client count the rate that the packets are consumed (played back).
C) Adjust the sample rate of the client based on A and B.
So your client requires that you adjust the sample rate of the playback: yes, you play it back slightly faster or slower. Note that the rate change will be very, very subtle. You might set the sample rate to 48000.0001 Hz instead of 48000 Hz; the difference in pitch would be undetectable by humans, as it amounts to only a fraction of a cent. This is a very simplified description of the approach; there are many other nuances and edge cases that must be considered when developing such a control system. You don't just set it and forget it: you need a control system to manage the playback.
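A minimal Python sketch of that A/B/C loop; the class name, the measurement-window interface, and the gain constant are assumptions for illustration, and a real control system would add smoothing and clamping on top:

NOMINAL_RATE = 48000.0

class PlaybackRateController:
    def __init__(self, gain=0.05):
        self.gain = gain            # small gain keeps corrections subtle
        self.rate = NOMINAL_RATE    # current playback sample rate in Hz

    def update(self, samples_received, samples_played, window_seconds):
        # (A) rate at which samples arrive over the network
        arrival_rate = samples_received / window_seconds
        # (B) rate at which samples are consumed by playback
        playback_rate = samples_played / window_seconds
        # (C) nudge the playback sample rate toward the arrival rate,
        # e.g. 48000.0 -> 48000.0001, far below an audible pitch change
        self.rate += self.gain * (arrival_rate - playback_rate)
        return self.rate

You would call update() once per measurement window (say, every few seconds) and feed the returned rate to your resampler.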
An interesting test to demonstrate this is to take two devices with the exact same file. A long recording (say 3 hours) is best. Start them at the same time. After 3 hours of playback, you will notice that one is ahead of the other.
This post explains that it is NOT a trivial task to stream audio and video.

How to calculate Online Video Gaming Bandwidth Requirement

Imagine I have a server which has the hardware to play games on it and streams the game video to the client; the client also sends controls (keyboard & mouse) to the server (like NVIDIA GRID).
How can I calculate the approximate bandwidth required on the client?
(The game video should be uncompressed.)
For example:
- 1080p @ 120 fps
- 2K @ 60 fps
I just want to know how to calculate (the math behind it) the bandwidth requirement (video + controls). I know uncompressed video will require huge bandwidth.
Uncompressed:
1920 × 1080 pixels × 3 bytes (24-bit color) = 6,220,800 bytes per frame; at 60 fps that is 373,248,000 bytes/sec for the video alone.
That is nearly 3 Gb/sec, assuming raw datagrams.
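The same arithmetic for the two cases in the question, as a small Python sketch; 3 bytes per pixel assumes 24-bit RGB, and "2K" is taken as 2560x1440 here, an assumption since the term is used loosely:

def uncompressed_bps(width, height, fps, bytes_per_pixel=3):
    # pixels per frame * bytes per pixel * frames per second * 8 bits
    return width * height * bytes_per_pixel * fps * 8

for name, w, h, fps in [("1080p @ 120fps", 1920, 1080, 120),
                        ("2K @ 60fps", 2560, 1440, 60)]:
    print(f"{name}: {uncompressed_bps(w, h, fps) / 1e9:.2f} Gb/s")
# 1080p @ 120fps: 5.97 Gb/s
# 2K @ 60fps: 5.31 Gb/s

The control traffic (keyboard and mouse events) is a few bytes per event and is negligible next to the video.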

DirectX.Capture FrameRates

I'm using the DirectX.Capture library to save video from a webcam to an AVI. I need the saved video to have 50 fps or more, but when I use this:
capture.FrameRate = 59.994;
the FrameRate doesn't change at all. It was 30 before that line, and after that line it stays at 30. I tried other values, even 20 and 10, and nothing changes.
What else should I do to be able to change that value? Or is it something related to my hardware, and I can hope it works on another machine?
Please help me, I don't know what to do.
Thanks
The source material (video, app, etc.) is probably only being updated at 30 fps, either because that is how the video codec or app behaves, or because you have vsync turned on in the target app (check the vsync settings; it might be forced by the video card drivers if there is hardware acceleration). DirectX.Capture probably clamps to the highest framerate available from the source.
If you really want to make the video 50 fps, capture it at its native rate (30/29.97) and resample it with some other software, as sketched below (note that this is a destructive operation, since 50 is not a clean multiple of 30). This will be no different from what DirectX.Capture would do if you could force it to 50 fps (even though that is nonsensical, given that the source material is at a lower framerate). FYI, most video files are between 25 and 30 fps.
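As a hedged illustration of that resampling step, here is a sketch with OpenCV in Python (standing in for "some other software"); the file names and the XVID codec are assumptions. It duplicates frames by timestamp, which is why the 30-to-50 conversion comes out uneven:

import cv2

src = cv2.VideoCapture("input.avi")            # assumed input file
src_fps = src.get(cv2.CAP_PROP_FPS) or 30.0    # fall back to 30 if unreported
w = int(src.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(src.get(cv2.CAP_PROP_FRAME_HEIGHT))
dst = cv2.VideoWriter("output.avi", cv2.VideoWriter_fourcc(*"XVID"),
                      50.0, (w, h))

written, t = 0, 0.0
ok, frame = src.read()
while ok:
    # emit the current source frame for every 1/50 s tick it covers
    while written / 50.0 < t + 1.0 / src_fps:
        dst.write(frame)
        written += 1
    t += 1.0 / src_fps
    ok, frame = src.read()
src.release()
dst.release()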

Capturing a Multicast UDP Video stream using OpenCV

I have a multicast UDP video stream that I need my OpenCV (Emgu) 2.4.x app to capture and process (the "client").
On the client, I can capture the stream using VLC (udp://xx.yy.zz.aaa:1234). However, my app fails to capture this UDP stream. My code is quite simple:
Capture cap = new Capture("udp://#212.1.1.1:1234");
P.S. I have tried with and without the #, and also tried rtp on that address. No luck :-/
Does OpenCV directly allow "capture" of UDP streams, or do I need to run VLC on the client to re-stream the video as RTP or HTTP or something else?
Thanks.
I finally figured this out and am sharing in the hope that it might help others.
Capture cap = new Capture("udp://#212.1.1.1:1234");
Don't forget the # symbol!
The capture is successfully created on the UDP stream; however, accessing the capture properties makes it throw an exception, and that causes the error.
Long story short, the UDP stream does not appear to carry the device properties, so you might need to obtain those elsewhere or hard-code them.
One other thing of note: since the reported FPS (frames per second) is unreliable, if not outright incorrect, you might need to make the FPS adjustable, especially if you are polling the stream in a loop.
HTH
// legacy OpenCV C API: open the stream and pull frames from it
IplImage* frame;
CvCapture* pCapture;
pCapture = cvCaptureFromFile("udp://ip:port/path");
// grab and decode the next frame (owned by the capture; do not free it)
frame = cvQueryFrame(pCapture);
This will also do the job in case you don't have the videoInput libraries.
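For completeness, the same idea with the modern cv2 API in Python, including the FPS fallback suggested above; the fallback value of 25 is an assumption, and the multicast address is the one from the question:

import cv2

cap = cv2.VideoCapture("udp://@212.1.1.1:1234")  # keep the @/# prefix if your backend needs it (see above)
if not cap.isOpened():
    raise RuntimeError("could not open UDP stream")

fps = cap.get(cv2.CAP_PROP_FPS)
if fps <= 0 or fps != fps:  # 0 or NaN when the stream carries no metadata
    fps = 25.0              # assumed fallback; make this configurable

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("udp stream", frame)
    if cv2.waitKey(int(1000 / fps)) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()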

MJPEG Stream Information

I am receiving an MJPEG stream from my camera. When I look at the video data with a hex editor, it seems that it doesn't contain any streaming information; I just see one raw JPEG after another, but no information about the framerate etc.
Is the lack of any meta information normal for MJPEG, or is it just related to the camera I am using? If there is no information about the stream, how can a player know how fast to play the video?
The lack of metadata is normal. IP cameras typically send MJPEG as just that: one JPEG image after another as a stream. That is the most basic valid MJPEG file. If you were to take a bunch of JPEGs, cat them together into one giant file, and feed it to ffmpeg, it would see it as a valid MJPEG-format file. Some cameras add an additional header to carry audio data, but it is not needed for the stream to be considered valid Motion JPEG.
Many cameras include a header like X-Framerate in the HTTP headers when the stream is initially sent, or you can set it as part of the camera configuration. However, when a camera sends only JPEGs, there is no way to tell from the stream itself what the framerate is.
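To illustrate why a bare concatenation of JPEGs is already a valid stream, here is a naive Python sketch that recovers individual frames purely from the JPEG SOI/EOI markers; it assumes no embedded thumbnails (those carry their own EOI marker), and it recovers no timing at all, which is exactly the point:

SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start/end-of-image markers

def split_mjpeg(data):
    """Yield individual JPEG frames from a raw MJPEG byte stream."""
    pos = 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            return
        end = data.find(EOI, start)
        if end == -1:
            return  # incomplete trailing frame
        yield data[start:end + 2]
        pos = end + 2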
To add to what has already been answered: an IP camera is a live video source, and frames are typically presented as soon as they arrive from the camera. Rarely does an IP camera attach extra per-frame information beyond the frame size (some don't even do that; they send only data and separators). Still, some do attach timestamps and extra data such as motion detection state.
Most IP cameras don't produce a constant frame rate; the frame rate can vary, especially dropping in low-light conditions. It is the responsibility of the receiving side to attach per-frame timestamps when multiplexing the data into a container format. The timestamp might be recovered from metadata (which rarely exists) or, more frequently, the receiver stamps each frame with the local receive time.
This is how a player can play the video sequence back at the proper rate. A live feed is typically presented on a "show the received frame as soon as possible" basis.
Normally, MJPEG data is sent within a streaming media wrapper such as AVI or MOV (QuickTime). The wrapper format contains the framerate and information about any optional audio data.
