I would like to put things more clearly as follows. I have Windows 7 as the client and Windows Server 2008 as the terminal server. When I connect to the terminal server remotely using mstsc and select the option to play audio at the server machine, I need it to play on the local client machine instead.
Initially I found that the server does not allow audio files to be played. So how do I implement the audio interface so that each application would render to it, and it would eventually stream the audio over a virtual channel to the local machine, where I can play it?
I implemented remote audio playback for Windows 7 and Vista using the Core Audio APIs. It works fine for me, with low latency and high resolution, but on Windows Server the audio interface is not there. So how do I implement it so that any application playing audio would render to that audio interface (audio device)?
Could you please suggest an example of how to implement the MMDevice audio interface so that it responds to all playing applications?
Thanks
While making a broadcast with the iOS SDK, everything works fine and viewers can hear what the broadcaster is saying.
But when I then start an mp3 song from the broadcaster's phone library, the viewer should be able to hear both what the broadcaster is saying and the song being played; instead, no audio at all reaches the viewer once I start the song.
Ideally, the broadcaster's audio should mix with the song, and the viewer should be able to hear both.
I'm using the code below to play the song.
Please let me know what I should do, as I want to play an audio file in the background while I am streaming.
Yes, it's expected behavior. When playing the mp3 file you're setting the shared audio session instance to the playback category, and that closes the microphone:
try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
After that you need to switch the session back to a recording-capable configuration to let it capture audio from the microphone again. The category should be set to .playAndRecord, and the mode can be set to .voiceChat.
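A minimal sketch of that configuration, assuming the broadcast SDK captures through the shared session (the .mixWithOthers option is an assumption on my part, added so the song's playback and the microphone capture can coexist; verify it against your SDK's requirements):

import AVFoundation

do {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord keeps the microphone open while the mp3 plays back;
    // .voiceChat tunes the session for two-way voice;
    // .mixWithOthers (assumption) lets the song mix with other audio instead of silencing it.
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.mixWithOthers])
    try session.setActive(true)
} catch {
    print("Failed to configure AVAudioSession: \(error)")
}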
I have to add live video streaming to one of my apps; I am using Wowza Streaming Cloud for this and driving it through the REST API to fit my requirements.
The flow I used is:
Create a stream
Start the stream
Poll the status until it is "started"
Once the status is "started", start broadcasting video.
It goes well sometimes, but sometimes when I try to broadcast, even after starting the stream, it says:
Status is idle, An error occurred when trying to connect to host: (code = 15)
I also see a green screen on the player side, and the video there is not continuous; it keeps stuttering.
For the player I used the code provided in the sample app.
For broadcasting I used the GoCoder SDK, where I set all of the properties like host address, port, app name, stream name, etc.
Do I have to set a bitrate or anything else?
Where am I going wrong?
That error occurs when the entry point itself is open for more than 20 minutes without a connection. Once the API returns an IP for the stream, you can connect to it right away. The errors you're getting show idle due to lack of connection, and it sounds like the timing between starting the event, checking the event, and then finally connecting is hitting this restriction.
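In other words, connect as soon as the state flips to "started" rather than leaving the entry point open. Below is a rough sketch of that polling step against the Streaming Cloud REST API; the /live_streams/{id}/state path, the wsc-api-key and wsc-access-key headers, and the JSON shape are from the v1 API as I recall it, so treat them as assumptions and verify against the current docs:

import Foundation

// Poll the stream's state and fire the callback the moment it reports "started",
// so the broadcaster connects well inside the entry point's 20-minute idle window.
func connectWhenStarted(streamId: String, apiKey: String, accessKey: String,
                        onStarted: @escaping () -> Void) {
    let url = URL(string: "https://api.cloud.wowza.com/api/v1/live_streams/\(streamId)/state")!
    var request = URLRequest(url: url)
    request.setValue(apiKey, forHTTPHeaderField: "wsc-api-key")
    request.setValue(accessKey, forHTTPHeaderField: "wsc-access-key")

    URLSession.shared.dataTask(with: request) { data, _, _ in
        let json = data.flatMap { try? JSONSerialization.jsonObject(with: $0) } as? [String: Any]
        let state = (json?["live_stream"] as? [String: Any])?["state"] as? String
        if state == "started" {
            onStarted()   // start the GoCoder broadcast immediately
        } else {
            // Not ready yet: poll again shortly instead of waiting a long, fixed delay.
            DispatchQueue.global().asyncAfter(deadline: .now() + 2) {
                connectWhenStarted(streamId: streamId, apiKey: apiKey,
                                   accessKey: accessKey, onStarted: onStarted)
            }
        }
    }.resume()
}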
Is there any way to send a microphone audio stream to the service side in real time?
I am using a WCF service in the middle layer, where I convert audio to text using System.Speech. It works fine if I send a wav file as a MemoryStream, but how is that possible in a live scenario using the microphone?
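On the WCF side this generally means a streaming-capable contract (e.g. TransferMode.Streamed) rather than one buffered MemoryStream; on the capture side, whatever the platform, the pattern is to pull small PCM buffers off the microphone and push them to the service as chunks. Here is a sketch of that capture loop in Swift using AVAudioEngine; the endpoint URL and the sendChunk helper are hypothetical, and System.Speech will still expect PCM in a format it supports:

import AVFoundation

// Hypothetical uploader: POSTs one captured chunk to the service endpoint.
func sendChunk(_ data: Data) {
    var request = URLRequest(url: URL(string: "https://example.com/speech/chunk")!)   // placeholder
    request.httpMethod = "POST"
    URLSession.shared.uploadTask(with: request, from: data).resume()
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Tap the microphone: the block fires repeatedly with small PCM buffers,
// which is what makes the live scenario possible versus one big wav file.
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    if let channel = buffer.floatChannelData?[0] {
        let data = Data(bytes: channel,
                        count: Int(buffer.frameLength) * MemoryLayout<Float>.size)
        sendChunk(data)
    }
}

do {
    try engine.start()
} catch {
    print("Could not start audio engine: \(error)")
}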
We are using FFmpeg for real-time streaming on iPhone.
For testing purposes we have hosted a video on a local server. When we try to start the stream, avformat_open_input always returns error code -5.
I have read somewhere that this is an I/O error, but the same URL works fine in VLC media player.
If we start streaming in VLC first and then play it on the iPhone, it works fine.
I can't see where I am going wrong; the IP address and the network connection are both fine.
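Error -5 is AVERROR(EIO) (POSIX EIO is 5), i.e. FFmpeg hit an I/O failure while opening the input. Two things worth checking from code are that the network layer is initialized before opening a network URL, and what av_strerror actually reports. A sketch, assuming FFmpeg's C headers are exposed to Swift through a bridging header and with a placeholder URL:

import Foundation

avformat_network_init()   // must be called before opening network inputs

var ctx: UnsafeMutablePointer<AVFormatContext>? = nil
let url = "http://192.168.1.10:8080/test/stream.m3u8"   // placeholder for your local server URL

let err = avformat_open_input(&ctx, url, nil, nil)
if err < 0 {
    // Translate the numeric code (-5) into FFmpeg's human-readable message.
    var buf = [CChar](repeating: 0, count: 128)
    av_strerror(err, &buf, numericCast(buf.count))
    print("avformat_open_input failed (\(err)): \(String(cString: buf))")
} else {
    avformat_close_input(&ctx)
}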
I'm using MPMoviePlayerController to stream audio from a server, but after playing for more than two minutes the audio starts stopping and resuming a lot. I'm streaming several files one after another, so because of the interruptions some of the audio files get skipped, with these two console messages:
Took background task assertion (38) for playback stall
Ending background task assertion (38) for playback stall
I'm losing a lot of tracks because of this error.
At first I thought it was a memory issue, but the console shows that every time I lose a track it prints those messages.
Check your network connectivity and the stream's encoding.
That console output says pretty much exactly what your problem is: the stream runs dry of content and cannot keep playing without interruption.
Either your network connection is unstable, or the content is encoded at bitrates far too high for your network connection.
For clarification: even if your local internet peering offers high bandwidth, you should still check the bandwidth along the entire route. For example, you could download the streamed files via your browser to test the throughput.
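If you would rather measure that from code than from the browser, here is a rough throughput probe; the URL is a placeholder for one of your streamed files:

import Foundation

// Download one of the streamed files and report the effective throughput.
// If this lands below the stream's encoded bitrate, playback stalls are expected.
let fileURL = URL(string: "https://example.com/audio/track01.mp3")!   // placeholder
let start = Date()

URLSession.shared.dataTask(with: fileURL) { data, _, error in
    guard let data = data, error == nil else {
        print("Download failed: \(String(describing: error))")
        return
    }
    let seconds = Date().timeIntervalSince(start)
    let kbps = Double(data.count) * 8.0 / 1000.0 / seconds
    print(String(format: "Effective throughput: %.0f kbit/s", kbps))
}.resume()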
Are you trying it on a simulator or a device? It may be a simulator issue.
Also, on a device, try streaming over multiple networks, e.g. LTE and Wi-Fi, and see if there is any difference.