smeared/corrupted capture of RTSP streams - opencv

I'm using Emgu CV 2.4.10 to create an RTSP stream viewer that will eventually be used with IP cameras. Since I don't have the cameras yet, I'm testing by using VLC (the Windows GUI) to create the stream from a video file:
:sout=#duplicate{dst=rtp{sdp=rtsp://:8554/stream},dst=display} :sout-all :sout-keep
I'm doing all of this testing on localhost.
Here's my capture code:
private void ProcessFrame(object sender, EventArgs arg) {
    try {
        frame = _capture.QueryFrame();
        pictureBox1.Image = frame.ToBitmap();
    }
    catch (Exception ex) {
        MessageBox.Show(ex.Message);
    }
}
This method is called via this event handler:
_capture = new Capture("rtsp://localhost:8554/stream");
Application.Idle += ProcessFrame;
_capture.Start();
The capture is corrupted by random occurrences of "smearing" that always appear in the lower portion of the frame.
I've seen several others online report this problem as recently as last December, but no solution has been found that works for me:
http://workingwithcomputervision.blogspot.co.uk/2012/06/issues-with-opencv-and-rtsp.html
EMGU QueryFrame returns "streaky" Image over RTSP
http://www.emgu.com/forum/viewtopic.php?f=7&t=4882&p=10110&hilit=rtsp#p10069
To narrow down the problem, I've run ffplay from the command line and the capture is perfect. I've run another instance of VLC to capture the RTSP stream and it displays perfectly. So this appears to be a problem in OpenCV/Emgu CV.
On a whim, I changed VLC to stream over HTTP:
:sout=#duplicate{dst=http{mux=ffmpeg{mux=flv},dst=:8080/stream},dst=display} :sout-all :sout-keep
This displays fine in my code, but at a noticeably lower frame rate that won't work for my application. I'd really appreciate any tips for fixing this problem. Thanks.

I don't know if you solved your problem, but I suggest you not do your processing in the Application.Idle event. Instead, use a thread: create another thread and do your processing in it. Example C# code:
Thread t = new Thread(() =>
{
    while (true)
    {
        frame = _capture.QueryFrame();
        pictureBox1.Image = frame.ToBitmap();
    }
});
t.IsBackground = true;
t.Start();
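Note that assigning pictureBox1.Image from a worker thread is technically a cross-thread UI update in WinForms. Below is a minimal, untested sketch of the same idea that marshals the update back to the UI thread and disposes the previous bitmap; it assumes the _capture and pictureBox1 fields from the question.
// Sketch only: grab frames on a background thread, but hand the finished
// bitmap back to the UI thread before touching the PictureBox.
// Assumes the _capture and pictureBox1 fields from the question.
private Thread _grabThread;
private volatile bool _running;

private void StartGrabbing()
{
    _running = true;
    _grabThread = new Thread(() =>
    {
        while (_running)
        {
            var frame = _capture.QueryFrame();   // blocks until a frame arrives
            if (frame == null)
                continue;                        // stream hiccup; try again

            var bitmap = frame.ToBitmap();
            pictureBox1.BeginInvoke((Action)(() =>
            {
                var old = pictureBox1.Image;
                pictureBox1.Image = bitmap;
                if (old != null)
                    old.Dispose();               // avoid leaking GDI+ bitmaps
            }));
        }
    });
    _grabThread.IsBackground = true;
    _grabThread.Start();
}
Stopping is then just a matter of setting _running = false before closing the form.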

Related

opencv videocapture fail to read frame from rtsp

I'm getting an error when reading frames from the RTSP stream of a Hikvision camera.
Here is my code to read it:
public void readImage() {
    VideoCapture capture = new VideoCapture(streamUrl);
    if (capture.isOpened()) {
        Mat frame = new Mat();
        while (true) {
            if (capture.read(frame)) {
                System.out.println("frame read");
            } else {
                System.out.println("failed to read frame");
            }
        }
    }
}
With the above code I can read frames successfully if the stream resolution is low, e.g. 704x576, but if the resolution is high or I run some parallel tasks then the capture fails to read frames. Once the capture has failed in the read loop, terminating all the other tasks doesn't help; it keeps failing until I recreate the capture object. What should I do now? (This happens on both OpenCV 2.4 and OpenCV 3.2.)
You may want to release memory after use.
Put the code frame.dispose(); after the end of the while loop

Windows 10 IoT Raspberry Pi 2: DHT22/AM2302

I just wanted to start gaining experience with the DHT22/AM2302 (a temperature and humidity sensor), but I have no idea how to initialize it and get data from it... I tried to use GpioPin:
gpioController = GpioController.GetDefault();
if (gpioController == null)
{
    Debug.WriteLine("GpioController Initialization failed.");
    return;
}
sensorPin = gpioController.OpenPin(7); // Exception is thrown here
sensorPin.SetDriveMode(GpioPinDriveMode.Input);
Debug.WriteLine(sensorPin.Read());
but get the exception: "A resource required for this operation is disabled."
After that I took a look at a library for Unix-like systems and found this:
https://github.com/technion/lol_dht22/blob/master/dht22.c
But I have no idea how to implement that in C# on Windows 10. Does anyone have an idea or experience with this?
Thank you very much in advance!
UPDATE:
I got the hint that there is no GPIO pin 7, and that's true, so I tried again, but the GPIO output seems to be just HIGH or LOW... So I have to use I2C or SPI... Based on this project, I decided to try it with SPI: http://microsoft.hackster.io/windowsiot/temperature-sensor-sample and I'm making progress... The difficulty now is translating the C library linked above to the C# SDK in order to receive the right data...
private async void InitSPI()
{
    try
    {
        var settings = new SpiConnectionSettings(SPI_CHIP_SELECT_LINE);
        settings.ClockFrequency = 500000;
        settings.Mode = SpiMode.Mode0;
        string spiAqs = SpiDevice.GetDeviceSelector(SPI_CONTROLLER_NAME);
        var deviceInfo = await DeviceInformation.FindAllAsync(spiAqs);
        SpiDisplay = await SpiDevice.FromIdAsync(deviceInfo[0].Id, settings);
    }
    catch (Exception ex)
    {
        Debug.WriteLine("SPI Initialization failed: " + ex.Message);
    }
}
To be clear, this doesn't work very well: it works exactly once after booting the Raspberry Pi 2 and starting/remote-debugging the application, but after exiting and restarting the application, the SPI initialization fails.
I'm now working on reading the data from the pin and will show some code in a future update. Any comments, answers and/or advice are still welcome.
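One possible explanation for the works-only-once behaviour (an assumption, not something confirmed in the question): the previous app instance may still be suspended after you stop remote debugging and therefore still holds the SPI device, so the next InitSPI call fails. A small, untested sketch that releases the device when the app suspends, assuming the SpiDisplay field from InitSPI above:
// Sketch: release the SPI device on suspend so a relaunched or redeployed
// instance can open it again. Assumes the SpiDisplay field from InitSPI above;
// hook this up once, e.g. in the page constructor.
Application.Current.Suspending += (s, e) =>
{
    if (SpiDisplay != null)
    {
        SpiDisplay.Dispose();   // frees the SPI controller for the next run
        SpiDisplay = null;
    }
};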
The DHT22 requires very precise timing. Although the Raspberry Pi with Windows 10 IoT Core is extremely fast, it is running an operating system where other things need to happen, so unless you write some sort of low-level driver (not in C#) you won't be able to generate the timings necessary to communicate with a DHT22.
What I do is use a cheap Arduino Mini Pro (about $5) whose sole purpose is to handle the timing-critical communication with the sensor, and then set up some sort of communication channel (I2C, serial) between the Arduino Mini Pro and the Raspberry Pi to pull the data from the Arduino.
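If you go that route, the Raspberry Pi side can stay in C#, e.g. via the Windows.Devices.I2c API. Below is a minimal, untested sketch; the slave address (0x40) and the 4-byte payload layout are invented for illustration and have to match whatever protocol you implement in the Arduino sketch (for example with the Wire library).
// Sketch only (needs the Windows.Devices.I2c and Windows.Devices.Enumeration
// namespaces): read humidity/temperature from an Arduino acting as an I2C
// slave at a hypothetical address 0x40 that replies with 4 bytes:
// humidity*10 and temperature*10, each as a big-endian 16-bit value.
private I2cDevice _dhtBridge;

private async Task InitI2cAsync()
{
    var settings = new I2cConnectionSettings(0x40);           // hypothetical slave address
    string aqs = I2cDevice.GetDeviceSelector("I2C1");          // the Pi 2's I2C bus
    var devices = await DeviceInformation.FindAllAsync(aqs);
    _dhtBridge = await I2cDevice.FromIdAsync(devices[0].Id, settings);
}

private void ReadDht22()
{
    var buffer = new byte[4];
    _dhtBridge.Read(buffer);                                   // blocking 4-byte read

    double humidity = ((buffer[0] << 8) | buffer[1]) / 10.0;
    double temperature = ((buffer[2] << 8) | buffer[3]) / 10.0;
    Debug.WriteLine("Humidity: " + humidity + " %, temperature: " + temperature + " C");
}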

Receiving "End of file" while streaming RTSP on iOS

I'm using the ffmpeg library to stream RTSP from an IP camera on the local network. The streaming works fine with the code below.
The only problem is that the stream seems to stop after some time. On further debugging I found out that I'm receiving an "End of file", and that's why the loop is breaking.
while (!playerShouldStop) // && (av_read_frame(pFormatCtx, &pkt1) >= 0))
{
    int ret = av_read_frame(pFormatCtx, &pkt1);
    NSLog(@"av read frame returned = %s", av_err2str(ret));
    if (ret >= 0)
    {
        // process video
    }
    else
        break;
}
The log says:
av read frame returned = End of file
I downloaded Wireshark to check what RTSP packets I'm getting, but that was no help either.
First of all, is it normal to receive EOF on a live stream (which is not supposed to end)?
Secondly, calling av_read_frame() again and again doesn't help either, but when I restart the entire method (right from avformat_open_input) then it works; it's just that the streaming isn't smooth and comes to a pause every now and then.
OK... it seems to work without the EOF when I open the stream with AVDictionary options:
AVDictionary *opts = 0;
int ret = av_dict_set(&opts, "rtsp_transport", "tcp", 0);
// Open video file and read header information into pFormatCtx
if (avformat_open_input(&pFormatCtx, filename, NULL, &opts) != 0)
{
    NSLog(@"Error opening video file.");
    return;
}
av_dict_free(&opts);
Still, any proper explanation to this would be helpful.
I have met the same problem with av_read_frame() returning EOF (End Of File) while decoding a realtime stream. Finally I found out when this happens: it's because I had set AVFormatCtx.interrupt_callback.callback and the timeout it enforced was too small (this callback can prevent av_read_frame() from blocking forever). So when the callback signals a timeout, av_read_frame() returns EOF. I hope what I ran into helps you.

WebKit Audio distorts on iOS 6 (iPhone 5) first time after power cycling

I've been struggling with an elusive audio distortion bug using webkitAudioContext in HTML5 under iOS 6. It can happen in other circumstances, but the only way I can get 100% repro is on the first visit to my page after power cycling the device. It seems like if you visit any audio-capable page prior to visiting this one, the problem will not occur.
The distortion only happens to audio generated by webkitAudioContext.decodeAudioData() and then played through webkitAudioContext.createBufferSource(). Audio playback of webkitAudioContext.createMediaElementSource() will not distort.
Am I missing some initialisation step? Here's the code and HTML in its entirety that I submitted to Apple as a bug report (but have received no reply):
<!DOCTYPE html>
<html>
<head>
<script type="text/javascript">
var buffer = null;
var context = null;
var voice = null;
function load_music(file) {
context = new webkitAudioContext();
voice = context.createBufferSource();
var request = new XMLHttpRequest();
request.onload = function() {
context.decodeAudioData(request.response, function(result) {
buffer = result;
document.getElementById("start").value = "Start";
});
};
var base = window.location.pathname;
base = base.substring(0, base.lastIndexOf("/") + 1);
request.open("GET", base + file, true);
request.responseType = "arraybuffer";
request.send(null);
}
function start_music() {
if (!buffer) {
alert("Not ready yet");
return;
}
voice.buffer = buffer;
voice.connect(context.destination);
voice.noteOn(0);
document.getElementById("compare").style.display = "block";
}
</script>
</head>
<body onload="load_music('music.mp3')">
<p>This is a simple demo page to reproduce a <strong>webkitAudio</strong>
problem occurring in Safari on iOS 6.1.4. This is a stripped down demo
of a phenomenon discovered in our HTML5 game under development,
using different assets.</p>
<p><u>Steps to reproduce:</u></p>
<ol>
<li>Power cycle <strong>iPhone 5 with iOS 6.1.4</strong>.</li>
<li>Launch Safari immediately, and visit this page.</li>
<li>Wait for "Loading..." below to change to
"Start".</li>
<li>Tap "Start".</li>
</ol>
<p><u>Issue:</u></p>
<p>Audio will be excessively distorted and play at wrong pitch. If
another audio-enabled web site is visited before this one, or this
site is reloaded, the audio will fix. The distortion only happens on
the first visit after cold boot. <strong>To reproduce the bug, it is
critical to power cycle before testing.</strong></p>
<p>This bug has not been observed on any other iOS version (e.g. does
not occur on iPad Mini or iPod 5 using iOS 6.1.3).</p>
<input id="start" type="button" value="Loading..." onmousedown="start_music()" />
<span id="compare" style="display:none;"><p>Direct link to audio file, for
comparison.</p></span>
</body>
</html>
Note: The body text suggests this only occurs on iOS 6.1.4, but I mean to say that the problem only occurs upon power cycling in this situation. I've experienced the problem on the iPad Mini under 6.1.3, too, but not upon power cycling.
Edit: a few things I've tried... Deferring the creation of the buffer source makes no difference. Using different transcoders to generate the .mp3 file it plays makes no difference. Playing throwaway silence as the first sound makes no difference as the distortion continues for every decodeAudioData sound until the page reloads. If createMediaElementSource and createBufferSource sources are mixed in the same page, only the createBufferSource audio (using decodeAudioData) will distort. When I check the request.response.byteLength in the failure case and the non-failure case, they are the same, suggesting the XMLHttpRequest is not returning incorrect data, though I would think corruption of the data would damage the MP3 header and render the file unplayable anyway.
There is one observable difference between the failure condition and the non-failure condition. The read-only value context.sampleRate will be 48000 in the failure state and 44100 in the non-failure state. (Yet the failure state sounds lower pitch than the non-failure state.) The only thing that occurs to me is a hack wherein I refresh the page via JavaScript if 48000 is detected on a browser that should be reporting 44100, but that's serious userAgent screening and not very future proof, which makes me nervous.
I have been having similar problems, even on iOS 9.2.
Even without a <video> tag, playback is distorted when first playing audio on the page after cold boot. After a reload, it works fine.
The initial AudioContext seems to default to 48 kHz, which is where distortion is happening (even with our audio at 48 kHz sample rate). When playback is working properly, the AudioContext has a sample rate of 44.1 kHz.
I did find a workaround: it is possible to re-create the AudioContext after playing an initial sound. The newly-created AudioContext seems to have the correct sample rate. To do this:
// inside the click/touch handler
var playInitSound = function playInitSound() {
    var source = context.createBufferSource();
    source.buffer = context.createBuffer(1, 1, 48000);
    source.connect(context.destination);
    if (source.start) {
        source.start(0);
    } else {
        source.noteOn(0);
    }
};
playInitSound();
if (context.sampleRate === 48000) {
    context = new AudioContext();
    playInitSound();
}
I found a related bug with HTML5 video and think I discovered the root of the problem.
I noticed that if you play a video using a <video> tag, it sets the context.sampleRate value to whatever the video's audio was encoded at. It seems as if iOS Safari has one global sampleRate that it uses for everything. To see this, try the following:
// Play a video with audio encoded at 44100 Hz
video.play();
// This will console log 44100
var ctx = new webkitAudioContext();
console.log(ctx.sampleRate);
// Play a video with audio encoded at 48000 Hz
video2.play();
// This will console log 48000
var ctx = new webkitAudioContext();
console.log(ctx.sampleRate);
This global sample rate appears to persist across page loads and is shared between tabs and browser instances. So, playing a youtube video in another tab could break all your decoded audio.
The audio becomes distorted when it is decoded at one sample rate and played at another one.
1. Decode audio and store the buffer.
2. Do something to change the sample rate, such as playing a video or audio file.
3. Play the buffer (distorted).
I don't know why it's happening after a cold start. If I had to guess, it's that Safari doesn't initialize this global sample rate until you try to use it.
The problem is still there on iOS 7, so I don't think a fix is coming anytime soon. In the meantime we're stuck with hacks like checking for a changed sample rate.
There is an npm package available to work around this:
https://github.com/Jam3/ios-safe-audio-context
npm install ios-safe-audio-context

MediaElement.Stop doesn't work when playing a live streaming source in windows store app

I want to make a Windows Store app play a live streaming source. The source plays fine, but I can't stop it from playing once it has begun. When I call Stop() on the instance of Windows.UI.Xaml.Controls.MediaElement, nothing happens.
Below is my code:
public MainPage()
{
    this.InitializeComponent();
    this.mediaplayer.AutoPlay = true;
    this.mediaplayer.Source = new Uri("mms://somedomain/mylive");
}
...
void StopButton_Click(object sender, RoutedEventArgs e)
{
    // I can reach here when I set a breakpoint
    this.mediaplayer.Stop();
}
I just came across the same problem... and was able to solve it by calling MediaElement.Pause() instead.
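In addition to Pause(), clearing the source is a common way to make a live stream actually stop downloading. A small, untested sketch, assuming the same mediaplayer element from the question:
void StopButton_Click(object sender, RoutedEventArgs e)
{
    // For a live stream, Pause() halts playback but may keep the connection
    // and buffering alive; clearing Source tears the stream down completely.
    this.mediaplayer.Pause();
    this.mediaplayer.Source = null;
}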
