We are creating an audio/video application that uses WebRTC. The problem is that we are able to show the stream spectrum for the local stream, but not for the remote one.
// set up an analyser
var analyser = audioCtx.createAnalyser();
analyser.smoothingTimeConstant = 0.0;
analyser.fftSize = 1024;
// get the average for the first channel
var array = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(array);
var average = getAverageVolume(array);
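For reference, getAverageVolume (not shown above) is essentially a mean over the frequency bins, something like:

// assumed helper: average the byte frequency bins into a single level
function getAverageVolume(array) {
  var sum = 0;
  for (var i = 0; i < array.length; i++) {
    sum += array[i];
  }
  return sum / array.length;
}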
For the local stream we get frequency values in the array, but for the remote stream the array contains only zeros.
Any help would be greatly appreciated.
A similar issue is described at https://code.google.com/p/chromium/issues/detail?id=241543
It seems there is no specific solution, as this is a browser issue.
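For reference, the wiring being attempted looks roughly like this (a sketch; remoteStream is a placeholder for the stream delivered by the peer connection's onaddstream/ontrack callback):

// connect the remote WebRTC stream to the same analyser
var source = audioCtx.createMediaStreamSource(remoteStream);
source.connect(analyser);
// On Chrome versions affected by the bug above, this source yields
// all-zero frequency data even though the wiring is correct.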
I'm streaming video from a URL stored in Firebase Storage, and I'm using the following code to stream the video with ExoPlayer:
BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
LoadControl loadControl = new CustomLoadControl();
exoPlayer = ExoPlayerFactory.newSimpleInstance(SafetyTVHomeActivity.this, trackSelector, loadControl);
Uri videoUri = Uri.parse(videourl);
DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
MediaSource mediaSource = new ExtractorMediaSource(videoUri, dataSourceFactory, extractorsFactory, null, null);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
exoPlayer.prepare(mediaSource, false, false);
exoPlayer.seekTo(0, 0);
Everything is fine and the video streams. But the problem I'm facing is that the initial load time before the video starts is too long (5+ seconds). I want to reduce the initial loading time to 0-2 seconds. Is there a way to achieve this with ExoPlayer?
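(For context, the usual lever for startup latency in ExoPlayer 2.x is the LoadControl's buffer thresholds. The sketch below lowers bufferForPlaybackMs so playback can start with less media buffered; the values are illustrative, and this is not necessarily what the CustomLoadControl above does.)

// sketch: trade rebuffering safety for a faster start (ExoPlayer 2.x)
DefaultLoadControl lowLatencyLoadControl = new DefaultLoadControl.Builder()
        .setBufferDurationsMs(
                2000,   // minBufferMs: keep at least 2 s buffered
                15000,  // maxBufferMs: stop buffering beyond 15 s
                500,    // bufferForPlaybackMs: start after only 0.5 s of media
                1000)   // bufferForPlaybackAfterRebufferMs
        .createDefaultLoadControl();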
I also tried using a DASH media source in ExoPlayer, using the code below:
Uri videoUri = Uri.parse(videourl);
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory(Util.getUserAgent(SafetyTVHomeActivity.this, "app-name"));
MediaSource mediaSource = new DashMediaSource.Factory(dataSourceFactory).createMediaSource(videoUri);
exoPlayer = ExoPlayerFactory.newSimpleInstance(this);
exoPlayer.prepare(mediaSource);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
I used the same Firebase Storage URL in the DASH media source, but I get the following error:
ExoPlayerImplInternal: Source error.
com.google.android.exoplayer2.ParserException: org.xmlpull.v1.XmlPullParserException: Unexpected token (position:TEXT G#��B�%���������...#2:79 in java.io.InputStreamReader#c587547) at com.google.android.exoplayer2.source.dash.manifest.DashManifestParser.parse(DashManifestParser.java:105) at........
Could anyone please help me work around this?
My main objective is to stream video from a URL with an initial load time of 0-2 seconds (the way TikTok does it). Any help would be really appreciated.
I'm trying to convert a MIDI file to an Audio File (.m4a) in Swift.
Right now I'm using MIKMIDI as a tool to sequence and play back MIDI files, but it does not include the ability to save the playback into a file. MIKMIDI's creator outlines the process to do this here. In an attempt to capture and save the output to an audio file, I've followed this example to try to replace the MIKMIDI graph's RemoteIO node with a GeneralIO node in Swift. When I try to save the output to a file using AudioUnitRender and ExtAudioFileWrite, both return error -50 (kAudio_ParamError).
var channels = 2
var buffFrames = 512
var bufferList = AudioBufferList.allocate(maximumBuffers: 1)
for i in 0...bufferList.count-1 {
    var buffer = AudioBuffer()
    buffer.mNumberChannels = 2
    buffer.mDataByteSize = UInt32(buffFrames * sizeofValue(AudioUnitSampleType))
    buffer.mData = calloc(buffFrames, sizeofValue(AudioUnitSampleType))
    bufferList[i] = buffer

    result = AudioUnitRender(generalIOAudioUnit, &flags, &inTimeStamp, busNum, UInt32(buffFrames), bufferList.unsafeMutablePointer)
    inTimeStamp.mSampleTime += 1
    result = ExtAudioFileWrite(extAudioFile, UInt32(buffFrames), bufferList.unsafeMutablePointer)
}
What is causing error -50, and how can I resolve it to render the MIDI (offline) to .m4a files?
UPDATE: I have resolved the ExtAudioFileWrite error -50 by changing mNumberChannels and channels to 1. Now I get a one-second audio file with noise. AudioUnitRender still returns error -50.
There are a couple of problems with your code:
your AudioBufferList doesn't agree with the client format, try
let bufferList = AudioBufferList.allocate(maximumBuffers: Int(clientFormat.mChannelsPerFrame))
you're replacing the wrong node from the AUGraph, and connecting the remaining node to itself, resulting in an infinite loop on AudioUnitRender.
But the main problem is that you are not implementing the solution the author suggested. You want to call AudioUnitRender with sample timestamps, faster than realtime, but the author said no: you'll have to manually convert sample time to host time and implement the better part of a MIDI player if you want that.
So you could do that (it sounds hard), or file a feature request, or record to file in realtime as you listen to the music, by adding a render notification to the graph's remote IO audio unit with AudioUnitAddRenderNotify and writing the samples out during the kAudioUnitRenderAction_PostRender phase.
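A sketch of that last approach, assuming remoteIOUnit is your graph's output unit and extAudioFile is an ExtAudioFileRef already opened for writing (both names hypothetical):

import AudioToolbox

// prime the async writer once, off the render thread, before starting playback
ExtAudioFileWriteAsync(extAudioFile, 0, nil)

let renderNotify: AURenderCallback = { inRefCon, ioActionFlags, _, _, inNumberFrames, ioData in
    // write only after the unit has produced audio for this slice
    if ioActionFlags.pointee.contains(.unitRenderAction_PostRender), let ioData = ioData {
        let file = ExtAudioFileRef(inRefCon) // the file passed as refCon below
        ExtAudioFileWriteAsync(file, inNumberFrames, ioData) // safe on the render thread
    }
    return noErr
}
AudioUnitAddRenderNotify(remoteIOUnit, renderNotify, UnsafeMutableRawPointer(extAudioFile))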
I'm trying to use Dart to get an OGG file to loop using the HTML5 <audio> element. Does anyone have a working example of this? I'm specifically having trouble getting the audio to loop.
I was not able to get a fully controlled loop using the HTML5 AudioElement; sometimes the loop option simply didn't work, sometimes there was a gap, and sometimes patterns would overlap.
I had better chance using WebAudio using something like:
source = audioContext.createBufferSource();
source.buffer = buffer;
gainNode = audioContext.createGain();
gainNode.gain.value = 1;
source.connectNode(gainNode);
gainNode.connectNode(audioContext.destination);
// play it now in loop
source.start(audioContext.currentTime);
source.loop = true;
I was not able to load the source buffer from the HTML audio element, which could have been a solution for the CORS issues I had; the samples were loaded using HTTP requests instead.
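A sketch of such a loader (using dart:web_audio's Future-based decodeAudioData; the URL must be CORS-accessible):

import 'dart:html';
import 'dart:typed_data';
import 'dart:web_audio';

Future<AudioBuffer> loadBuffer(AudioContext audioContext, String url) async {
  // fetch the raw OGG bytes, then let WebAudio decode them
  final request = await HttpRequest.request(url, responseType: 'arraybuffer');
  return audioContext.decodeAudioData(request.response as ByteBuffer);
}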
I created a DartPad example that demonstrates looping using both the AudioElement native loop feature and WebAudio:
https://dartpad.dartlang.org/879424bca794c63698b0
This query is regarding the PortAudio framework. A little background before I ask the question: I am working on an application in PortAudio to output audio through a multichannel (8-channel) device. However, the device I am using does not expose itself as a single 8-channel device; instead it shows up in my device list as four stereo devices. While searching for a way to handle this, I learned that the WinMME host API in PortAudio supports multiple devices.
I went through the appropriate header file ("pa_win_wmme.h") and followed the suggestions there, but I get the 'Invalid device' error (error number -9996) after calling Pa_OpenStream(). The header file does specify the right parameters to use when configuring multiple devices to avoid this error, but in spite of following them, I still get it.
So I wanted to know if anybody has faced a similar issue and whether I have missed/wrongly configured anything.
I am pasting the required snippets of code below for reference:
PaStreamParameters outputParameters;
PaWinMmeStreamInfo wmmeStreamInfo;
PaWinMmeDeviceAndChannelCount wmmeDeviceAndNumChannels;
...
...
outputParameters.device = paUseHostApiSpecificDeviceSpecification;
outputParameters.channelCount = 8;
outputParameters.sampleFormat = paFloat32; /* 32 bit floating point processing */
outputParameters.hostApiSpecificStreamInfo = NULL;
wmmeStreamInfo.size = sizeof(PaWinMmeStreamInfo);
wmmeStreamInfo.hostApiType = paMME;
wmmeStreamInfo.version = 1;
wmmeStreamInfo.flags = paWinMmeUseMultipleDevices;
wmmeDeviceAndNumChannels.channelCount = 2;
wmmeDeviceAndNumChannels.device = 3;
wmmeStreamInfo.devices = &wmmeDeviceAndNumChannels;
wmmeStreamInfo.deviceCount = 4;
outputParameters.hostApiSpecificStreamInfo = &wmmeStreamInfo;
The device id = 3 was obtained through
Pa_GetHostApiInfo( Pa_HostApiTypeIdToHostApiIndex( paMME ) )->defaultOutputDevice
I hope I have made the query clear enough. I'll be happy to provide more details if required.
Thanks.
I finally figured out the mistake :-)
The configuration for multiple devices must be made as an array. For instance, in the above case,
wmmeDeviceAndNumChannels must be an array of 4, with each element's device field containing the device index of one of the 4 stereo devices. The channelCount of each element remains 2, while outputParameters.channelCount still has to be the aggregate number of channels, i.e. 8 (see the sketch below). With this I was able to run the application with a single stream, and of course without any errors related to an invalid device or invalid number of channels. :-)
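In code, roughly (a sketch; stereoDeviceIndex stands in for the four PaDeviceIndex values you'd query, e.g. via Pa_HostApiDeviceIndexToDeviceIndex):

/* one entry per stereo device, channelCount 2 each */
PaWinMmeDeviceAndChannelCount wmmeDevices[4];
for (int i = 0; i < 4; ++i) {
    wmmeDevices[i].device = stereoDeviceIndex[i];
    wmmeDevices[i].channelCount = 2;
}
wmmeStreamInfo.devices = wmmeDevices;
wmmeStreamInfo.deviceCount = 4;
outputParameters.channelCount = 8; /* aggregate across all four devices */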
Thanks.
Based on the code pasted above, it looks like you are trying to call open on a single 8-channel device. Instead you will have to get the Pa device index of all four devices and call open four times, once for each stereo device. You will then have four interleaved stereo streams to maintain. My guess is that changing channelCount = 8 to channelCount = 2 will allow the first stream to open.
Can anyone suggest how to handle a slow network when streaming video in a web view?
When the network is poor, a blank screen appears or the video doesn't stream.
Is there a way to detect this condition so that we can alert the user? (Apart from using private API.)
Perhaps the ifi_baudrate member of the if_data structure (declared in <net/if.h>) is what you need. If the baud rate is below some threshold value, you can show an alert.
Please see the following answer for how to obtain the if_data structure for a particular network interface:
https://stackoverflow.com/a/8014012/1310204
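In outline, that approach looks something like this on BSD-style systems (a rough sketch; error handling kept minimal):

#include <sys/types.h>
#include <sys/socket.h>
#include <ifaddrs.h>
#include <net/if.h>
#include <string.h>

/* return ifi_baudrate for the named interface, or -1 if unavailable */
long baudrate_for(const char *name) {
    struct ifaddrs *list, *ifa;
    long rate = -1;
    if (getifaddrs(&list) != 0) return -1;
    for (ifa = list; ifa; ifa = ifa->ifa_next) {
        if (ifa->ifa_addr && ifa->ifa_addr->sa_family == AF_LINK &&
            strcmp(ifa->ifa_name, name) == 0 && ifa->ifa_data) {
            rate = (long)((struct if_data *)ifa->ifa_data)->ifi_baudrate;
        }
    }
    freeifaddrs(list);
    return rate;
}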
You can easily detect the state of the network connection via the HTML5 networking API:
http://www.html5rocks.com/en/mobile/optimization-and-performance/#toc-network-detection
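For example (this reports connectivity only, not bandwidth):

// navigator.onLine plus the online/offline events cover basic reachability
function updateStatus() {
  console.log(navigator.onLine ? "online" : "offline");
}
window.addEventListener("online", updateStatus);
window.addEventListener("offline", updateStatus);
updateStatus();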
Also, if you want to test the network speed, set up a file of known size on your server and make an Ajax request for it, timing how long the download takes.
You can use something simple like:
var start = new Date();
$.get("someFile.jpg")
  .done(function () {
    // elapsed download time in milliseconds
    var elapsed = new Date() - start;
    // with the file's known size, bytes / seconds approximates bandwidth
    // (fileSizeInBytes is whatever size you gave the test file)
    var bytesPerSecond = fileSizeInBytes / (elapsed / 1000);
  });
Or dig into the HTML5 performance API:
http://www.html5rocks.com/en/tutorials/webperformance/basics/
If you're not using JavaScript, the same applies: just open a network connection with whatever is at your disposal, download a small file, and do the math ;-)