How to detect connection loss in vlcj - VLC

I am playing network streams in my vlcj application. Sometimes the network connection is lost, and I want to detect those moments in my vlcj application. I tried the error event, but it is not fired; it only fires when there is no network connection and I try to start playing a stream. When the network goes away mid-stream, the last frame of the video freezes and nothing else happens. How can I detect that playback has stopped because the network connection was lost?

There is no LibVLC API to detect that condition, and so there is no way to do it with vlcj either.
You could conceivably capture the native log and parse it looking for those errors, but that's a really poor approach frankly.

You should check the media state: either poll it from a separate thread in a loop with a delay, or subscribe to an event. (I use LibVLCSharp in C#, where Media raises an event when its state has changed; there it is the Media.EventManager.StateChanged event.)
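For vlcj itself, a minimal sketch of the polling variant might look like the following. This assumes vlcj 4.x, where the current state and playback position are exposed via mediaPlayer.status(); the StreamWatchdog class, the one-second interval, and the frozen-position heuristic are my own choices, not part of vlcj:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.base.State;

// Polls a vlcj media player from its own thread and fires a callback
// when playback errors out, ends, or silently stalls.
class StreamWatchdog {

    private final ScheduledExecutorService executor =
            Executors.newSingleThreadScheduledExecutor();
    private final AtomicLong lastTime = new AtomicLong(-1L);

    void watch(MediaPlayer mediaPlayer, Runnable onConnectionLost) {
        executor.scheduleAtFixedRate(() -> {
            State state = mediaPlayer.status().state();
            long time = mediaPlayer.status().time(); // playback position in ms
            if (state == State.ERROR || state == State.ENDED) {
                onConnectionLost.run();
            } else if (state == State.PLAYING && time == lastTime.get()) {
                // The position stopped advancing while nominally PLAYING:
                // for a network stream this usually means the source is gone.
                onConnectionLost.run();
            }
            lastTime.set(time);
        }, 1, 1, TimeUnit.SECONDS);
    }
}

The frozen-position check is there precisely for the case from the question, where the state never changes and the last frame just stays on screen.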

Related

Send audio buffer in parts to a server from an iOS device's microphone

I am trying to build an iOS application that streams audio coming directly from the input (or mic) of a device. What I am thinking is that every certain period of time, I'd have to send the audio buffer to the server, so that the server sends it to another client that might want to listen. I am planning to use WebSockets for the server-side implementation.
Is there a way to grab just a specific chunk of the buffer from the input (mic) of the iOS device and send it to the server while the user speaks the next bit, and so on and so forth? I am thinking that I could start an AVAudioRecorder, perhaps with AVAudioEngine, and record every second or half second, but I think that would create too much delay and possibly lose parts of the stream in the transition process.
Is there a better way to accomplish this? I am really interested in understanding the science behind it. If this is not the best approach please tell me which one it is and maybe a basic idea for its implementation or something that could point me in the right direction.
I found the answer to my own question!! The answer lies in the AVFoundation framework, specifically AVCaptureAudioDataOutput and its delegate, which will send you a buffer as soon as the input source captures it.

Youtube encoder won't start for live streams

I'm trying to get a livestream working on youtube. I want to stream 360° content with H264 video and AAC audio. The stream is started with the youtube live api from my mobile app and librtmp is used to deliver video and audio packets. I easily get to the point where the livestream health is good and my broadcast and stream are bound successfully.
However, when I try to transition to "testing" like this:
YoutubeManager.this.youtube.liveBroadcasts().transition("testing", liveBroadcast.getId(), "status").execute();
I get stuck on the "startTesting" status every time (100% reproducible), while I expect it to change to testing after a few seconds so that I can then change it to live.
I don't know what's going on as in the youtube live control room everything seems to be fine but the encoder won't start.
Is it a common issue? Is there a way to access the encoder logs? If you need more information, feel free to ask me.
Regards.
I found a temporary fix!
I noticed 2 things:
When the autostart option was on, the stream changed its state to startLive as soon as I stopped sending data. This suggested that the encoder was trying to start, but was too slow to do so before some other data packet was received (I guess).
When I tried to stream to the "Stream now" URL, as #noogui suggested, it worked! So I checked what the difference was between the "Stream now" and event configurations.
It turned out I just had to activate the low latency option, as is done by default in the "Stream now" configuration.
I consider it a temporary fix because I don't really know why the encoder isn't starting otherwise, and because it doesn't work with the autostart option... So I hope it won't break again if YouTube makes another change to their encoder.
So, if you have to work with the Youtube api, good luck guys !
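If you create the broadcast through the API rather than the control room, the low latency option can be requested programmatically as well. A sketch using the Java client (assuming the google-api-services-youtube model classes and an already-authorized youtube object; note that older client versions take the part argument as a comma-separated String, newer ones as a List<String>):

import com.google.api.services.youtube.YouTube;
import com.google.api.services.youtube.model.LiveBroadcast;
import com.google.api.services.youtube.model.LiveBroadcastContentDetails;

// Ask for low latency up front, mirroring the "Stream now" default.
LiveBroadcastContentDetails contentDetails = new LiveBroadcastContentDetails();
contentDetails.setEnableLowLatency(true);

LiveBroadcast broadcast = new LiveBroadcast();
broadcast.setContentDetails(contentDetails);
// ... set the snippet (title, scheduledStartTime) and status as usual ...

LiveBroadcast created = youtube.liveBroadcasts()
        .insert("snippet,contentDetails,status", broadcast)
        .execute();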

When is the NSStream NSStreamEventHasSpaceAvailable event called?

I can't really understand this event.
I'm hoping that it is called when the sending queue (or something similar internal structure) is done sending previously written packets.
Is it a correct assumption?
I'm working on a video streamer over Multipeer Connectivity, and I want to use this event to decide whether I should drop a camera frame (if there is no NSStreamEventHasSpaceAvailable) or submit it to the NSOutputStream.
Imagine a Bluetooth connection, where I really need to drop a lot of camera frames instead of submitting every frame to the NSStream.
The NSStreamEventHasSpaceAvailable event indicates that you can write (at least one byte!) to the stream without blocking. It does not mean that previously written data has been completely delivered to the other endpoint of the connection.
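The same distinction exists in other socket APIs. As a purely illustrative sketch (Java NIO here, not the iOS stack in question), "writable" only ever means the local send buffer can accept more bytes; whether earlier bytes have reached the peer is a separate matter:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.SocketChannel;

// Drop-or-submit logic: assumes the channel is non-blocking and was
// registered with the selector for SelectionKey.OP_WRITE.
class FramePump {
    // Returns true if the frame was submitted, false if it was dropped.
    static boolean submitOrDrop(Selector selector, SocketChannel channel,
                                ByteBuffer frame) throws IOException {
        selector.selectNow();                  // non-blocking readiness check
        SelectionKey key = channel.keyFor(selector);
        if (key != null && key.isValid() && key.isWritable()) {
            channel.write(frame);              // may still be a partial write
            return true;
        }
        return false;                          // no space: drop this camera frame
    }
}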

Trying to use "The Amazing Audio Engine" and "AudioStreamer" in the same app

In the app I'm currently working on, there is a "studio" where you can make some sound effects, and for that I'm using "The Amazing Audio Engine".
There is also an option to listen to songs by streaming.
Unfortunately The Amazing Audio Engine doesn't contain "streaming" functionality, so I'm using the "AudioStreamer" class.
I don't know why, but the two don't work well together for me.
Each of them alone works great, but the moment I play some audio with The Amazing Audio Engine, stop, move to streaming, and then move back to the audio engine, the sound doesn't play any more! No sound!
I already checked that I call "stop" on each class and set it to nil.
I allocate each of them afresh every time before playing.
I'm out of options, and thinking maybe it has something to do with Core Audio, which both of them use?
Any help would be much appreciated
Thanks
EDIT
What I found is that this happens only when I use the "stop" method of "AudioStreamer"!
Can anyone explain why?
EDIT 2
Found the answer!
This was solved by commenting this out:
/*
while (state != AS_INITIALIZED)
{
    [NSThread sleepForTimeInterval:0.1];
}
*/
// And adding this:
AudioQueueStart(audioQueue, NULL);
to the "stop" method. Still, I do not really understand why...
It takes some time after calling an audio stop method or function for it to really stop all the audio units (while the buffers get emptied by the hardware, etc.). You often can't restart audio until after this short delay.

delphi, indy10 tcp audio streaming

I am trying to make an application that uses video/audio streaming through a TCP connection. I have already done the video streaming with the Indy 10 components (IdTCPServer and IdTCPClient); is it possible to do the same thing with audio?
Sure.
TCP is just a data channel. It is totally agnostic to what kind of data is transferred over it: HTML pages, programs, video, audio - whatever.
However, "streaming" usually means "near real time". If some frames of video or audio do not arrive within a few seconds, they had better be skipped and forgotten, and newer audio or video played instead. You would not want your Skype conversation to suddenly freeze for a minute and then play back that whole minute to you, just because of a few seconds of network jam. You'd rather lose a word or two and then either recover from context or ask the other person to repeat. Thus TCP, with its built-in retransmissions and usually not very large buffers, is not a perfect choice for multimedia streaming. Usually UDP plus application-implemented integrity control is a better choice for it, as sketched below.
I believe you need to use the VFW unit. With AVIStream, you join video + sound into a compressed stream.
