Web Audio API Memory Leaks on Mobile Platforms - iOS

I am working on an application that will use audio quite heavily, and I am in the research stage of deciding whether to use the Web Audio API on devices that can support it. I have put together a very simple test bed that loads an MP3 sprite file (~600 kB in size) and has a play button, a pause button, and a destroy button, which should in theory allow the GC to reclaim the memory used by the Web Audio API implementation. However, after loading and destroying roughly five times, iOS crashes with an out-of-memory exception.
I have profiled MobileSafari in Xcode Instruments, and MobileSafari does indeed continually eat up memory. Furthermore, the 600 kB MP3 turns out to use ~80-90 MB of memory once decoded.
My question is: when decoding audio data with the Web Audio API, why is the memory usage so large, and why is the memory never reclaimed? From my understanding, decoding is an asynchronous operation for the browser, so presumably it happens on a separate thread? Is it possible the browser's separate thread is never releasing the memory used during decoding?
My code is below; any help or explanation is greatly appreciated:
<!DOCTYPE html>
<html>
<head lang="en">
    <meta charset="UTF-8">
    <title>Web Audio Playground</title>
</head>
<body>
    <button id="load">Load</button>
    <button id="play">Play</button>
    <button id="pause">Pause</button>
    <button id="destroy">Destroy</button>
    <script type="application/javascript">
        (function () {
            window.AudioContext = window.AudioContext || window.webkitAudioContext;

            var loadButton = document.getElementById('load'),
                playButton = document.getElementById('play'),
                pauseButton = document.getElementById('pause'),
                destroyButton = document.getElementById('destroy'),
                audioContext = new window.AudioContext(),
                soundBuffer = null,
                soundSource = null;

            loadButton.addEventListener('click', function () {
                var request = new XMLHttpRequest();
                request.open('GET', 'live-sprite.mp3', true);
                request.responseType = 'arraybuffer';

                // Decode asynchronously
                request.onload = function () {
                    audioContext.decodeAudioData(request.response, function (buffer) {
                        soundBuffer = buffer;
                    });
                };
                request.send();
            });

            playButton.addEventListener('click', function () {
                soundSource = audioContext.createBufferSource();
                soundSource.buffer = soundBuffer;
                soundSource.connect(audioContext.destination);
                soundSource.start(0);
            });

            pauseButton.addEventListener('click', function () {
                if (soundSource) {
                    soundSource.stop(0);
                }
            });

            destroyButton.addEventListener('click', function () {
                if (soundSource) {
                    soundSource.disconnect(0);
                    soundSource = null;
                    soundBuffer = null;
                    alert('destroyed');
                }
            });
        })();
    </script>
</body>
</html>

I made a post on the SoundJS issue tracker about this, but I'll reiterate it here for anyone looking:
It seems that simply disconnecting and dereferencing the AudioBufferSourceNode object on iOS Safari isn't enough; you need to manually clear the reference to its buffer, or the buffer itself leaks. (This implies the AudioBufferSourceNode object itself leaks too, but we didn't hit that as a practical limit in our project.)
Unfortunately, to do this a one-sample scratch buffer needs to be created, since assigning null throws an exception. The statement must also be wrapped in try/catch, because Chrome and Firefox throw whenever .buffer is reassigned.
The solution that worked was:
var ctx = new AudioContext(),
    scratchBuffer = ctx.createBuffer(1, 1, 22050);

class WebAudioAdapter extends AudioAdapter {
    close() {
        if (this.__src) {
            this.__src.onended = null;
            this.__src.disconnect(0);
            try { this.__src.buffer = scratchBuffer; } catch (e) {}
            this.__src = null;
        }
    }
}
Hope this helps y'all too!

The memory usage is large because the Web Audio API decodes your small MP3 into 32-bit linear PCM, which works out to something on the order of 10 MB per minute per channel.
So a 4-minute stereo MP3 ends up being something like 80 MB once decoded.
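As a back-of-the-envelope check (assuming the decoder outputs 44.1 kHz float32 samples, which is typical):
// 44100 samples/s * 4 bytes * 60 s ≈ 10.6 MB per minute, per channel
var bytesPerMinutePerChannel = 44100 * 4 * 60;            // 10,584,000 bytes
var fourMinuteStereo = bytesPerMinutePerChannel * 4 * 2;  // ≈ 85 MB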
This memory can't be reclaimed for as long as your application holds on to the decoded AudioBuffer. So as long as you keep a reference to it (in your case, soundBuffer), that memory can't be released; if it were, you couldn't play back the audio.

Related

RTCPeerConnection replaceTrack only changing stream for the remote peer

I am new to RTCPeerConnection (WebRTC), so please bear with me.
So far I am able to replace tracks on the fly by switching between camera and screen sharing in my app. But testing in two browser tabs, I noticed that the newly replaced track's stream shows up only for the partner/remote peer, not in the initiator's tab. The initiator's tab just keeps showing the old stream even though the track has been replaced.
It would be nice if the initiator could also see what he or she is sharing. I have tried, but no luck so far. Looking for some assistance.
My code looks like:
function screenShare() {
    (async () => {
        try {
            await navigator.mediaDevices.getDisplayMedia({
                cursor: true
            }).then(stream => {
                // localStream = stream;
                let videoTrack = stream.getVideoTracks()[0];
                var sender = senders.find(function (s) {
                    return s.track.kind == videoTrack.kind;
                });
                sender.replaceTrack(videoTrack);
                videoTrack.onended = function () {
                    sender.replaceTrack(localStream.getTracks()[1]);
                };
            });
        } catch (err) {
            console.log('(async () =>: ' + err);
        }
    })();
}
Thanks in advance.
By design, replaceTrack swaps the track that the RTCPeerConnection sends to the remote peer. It does not touch the local video element; reset that element's srcObject to update the local preview.
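A minimal sketch of that fix inside the question's getDisplayMedia handler; localVideo here is an assumed reference to the initiator's own <video> element:
let videoTrack = stream.getVideoTracks()[0];
sender.replaceTrack(videoTrack);  // updates what the remote peer receives
localVideo.srcObject = stream;    // updates the initiator's local preview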

ASP.NET Core 3.1 Streaming Video Doesn't Work on iOS and Safari

I've seen numerous questions from people having trouble implementing video streaming for iOS and Safari. I have tried all of the common fixes I have read about in other questions, but I have not been successful.
I am trying to stream a video from an ASP.NET Core 3.1 controller. This works correctly in basically every browser except iOS and Safari, where the video simply doesn't load (the player just shows a timebar with length 0:00).
I have shown the controller action and the view that I am using to stream the video below. As you can see, I have enabled range processing, set the content length, and added the playsinline attribute, which cover the main issues people mention with iOS and Safari.
Can anyone see any other potential problem here?
public async Task<ActionResult> StreamVideo(int id)
{
    var range = HttpContext.Request.GetTypedHeaders()?.Range?.Ranges.FirstOrDefault();
    Stream data = await _service.DownloadVideo(id);

    if (range is null)
    {
        Response.ContentLength = data.Length;
    }
    else
    {
        Response.ContentLength = (range.To.HasValue ? range.To.Value + 1 : data.Length) - (range.From ?? 0);
    }

    return new FileStreamResult(data, "video/mp4") { EnableRangeProcessing = true };
}

<video style="max-height:100%; max-width: 100%;" controls="controls" preload="auto" playsinline>
    <source src="@Url.Action("StreamVideo", "Attachment", new { Id = id })" type="video/mp4">
</video>

Web Audio API not playing sound sample on device, but works in browser

I have an Ionic app that is a metronome. Using the Web Audio API, I made everything work with the oscillator feature, but after switching to a wav file, no audio plays on a real device (iPhone).
When testing in the browser using Ionic Serve (Chrome), the audio plays fine.
Here is what I have:
function snare(e) {
    var audioSource = 'assets/audio/snare1.wav';
    var request = new XMLHttpRequest();
    request.open('GET', audioSource, true);
    request.responseType = 'arraybuffer';

    // Decode asynchronously
    request.onload = function () {
        audioContext.decodeAudioData(request.response, function (theBuffer) {
            buffer = theBuffer;
            playSound(buffer);
        });
    };
    request.send();
}

function playSound(buffer) {
    var source = audioContext.createBufferSource();
    source.buffer = buffer;
    source.connect(audioContext.destination);
    source.start(0);
}
The audio sample is in www/assets/audio.
Any ideas where this could be going wrong?
I believe iOS devices require a user gesture of some sort before audio is allowed to play.
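A minimal sketch of that idea, reusing the question's audioContext and buffer: resume the context from inside a gesture handler before starting playback.
// Unlock audio on the first touch; iOS keeps an AudioContext suspended
// until it is resumed (or a node is started) inside a user-gesture handler.
document.addEventListener('touchend', function unlock() {
    document.removeEventListener('touchend', unlock);
    audioContext.resume().then(function () {
        playSound(buffer);
    });
}, false);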
It's July 2017, iOS 10.3.2, and we're still seeing this issue in Safari on iPhones. Interestingly, Safari on a MacBook is fine.
@Raymond Toy's general observation still appears to be true. But @padenot's approach (via https://gist.github.com/laziel/7aefabe99ee57b16081c) did not work for me in a situation where I wanted to play a sound in response to an external event/trigger.
Using the original poster's code, I've had some success with this:
var buffer; // added to make it work with the OP's code

// keep the original function snare()

function playSound() { // dropped the argument for simplicity
    var source = audioContext.createBufferSource();
    source.buffer = buffer;
    source.connect(audioContext.destination);
    source.start(0);
}

function triggerSound() {
    function playSoundIos(event) {
        document.removeEventListener('touchstart', playSoundIos);
        playSound();
    }

    if (/iPad|iPhone/.test(navigator.userAgent)) {
        document.addEventListener('touchstart', playSoundIos);
    } else { // Android etc., or Safari, but not on iPhone
        playSound();
    }
}
Now calling triggerSound() will produce the sound immediately on Android, and on iOS it will produce the sound once the page has been touched.
Still not ideal, but better than no sound at all...
I had a similar issue on current iOS (15). I tried to play base64-encoded binary data, which worked in all browsers but not on iOS.
Finally, reordering the statements solved my issue:
let buffer = Uint8Array.from(atob(base64), c => c.charCodeAt(0));
let context = new AudioContext();
let audioSource; // declared so the snippet also runs in strict mode

// these lines were within "play()" before
audioSource = context.createBufferSource();
audioSource.connect(context.destination);
audioSource.start(0);
// ---

context.decodeAudioData(buffer.buffer, play, (e) => {
    console.warn("error decoding audio", e);
});

function play(audioBuffer) {
    audioSource.buffer = audioBuffer;
}
Also see this commit in my project.
I assume that calling audioSource.start(0) inside the play() method was somehow too late, because it runs in a callback after context.decodeAudioData() and is therefore perhaps "too far away" from the user interaction for iOS's standards.

Performance issues with Web Audio API playing audio samples

I am creating a complex metronome app using Ionic and the Web Audio API.
At certain points the metronome could be playing 10+ "beats" a second, which means this function gets called 10+ times a second:
function playSound(e, name) {
    var buffer = audioBuffers[name];
    var source = audioContext.createBufferSource();
    var gain = audioContext.createGain();

    // Route the source through the gain node to the destination.
    source.connect(gain);
    gain.connect(audioContext.destination);
    gain.gain.value = 1;
    source.buffer = buffer;

    sched.nextTick(e.playbackTime, () => {
        source.start(0);
    });
}
The user can choose multiple samples, so I fetch them all once up front and store the buffers in an array to improve performance, instead of making an XMLHttpRequest() every time.
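A hedged sketch of that preload step (the sample names and paths are illustrative, and the promise form of decodeAudioData assumes a reasonably modern browser):
// Fetch and decode each sample once, keyed by name, before playback starts.
async function preloadSamples(names) {
    for (const name of names) {
        const response = await fetch('assets/audio/' + name + '.wav');
        const arrayBuffer = await response.arrayBuffer();
        audioBuffers[name] = await audioContext.decodeAudioData(arrayBuffer);
    }
}
preloadSamples(['kick', 'snare']); // hypothetical sample names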
The issue is that at these higher rates the playback becomes odd and sometimes goes out of sync. I am using https://github.com/mohayonao/web-audio-scheduler, which works nicely, so I know it's not a timing issue.
If I swap out the sample playback for a basic oscillator:
function oscillator(e) {
    const t0 = e.playbackTime;
    const t1 = t0 + 0.4;
    const osc = audioContext.createOscillator();
    const amp = audioContext.createGain();

    osc.frequency.value = 1000;
    osc.start(t0);
    osc.stop(t1);
    osc.connect(amp);

    amp.gain.setValueAtTime(1, t0);
    amp.gain.exponentialRampToValueAtTime(1e-6, t1);
    amp.connect(masterGain); // masterGain: a gain node created elsewhere in the app

    sched.nextTick(t1, () => {
        osc.disconnect();
        amp.disconnect();
    });
}
Performance is fine no matter the tempo. Are there any improvements I can make to the sample playback to help performance?
Your first function just uses source.start(0), which makes me think you're relying on setTimeout or setInterval to "schedule" the audio. The second one properly uses the Web Audio scheduler (start(t0)). See "A Tale of Two Clocks" for more: https://www.html5rocks.com/en/tutorials/audio/scheduling/.
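A minimal sketch of what that looks like for the sample version, mirroring the oscillator function (playSoundScheduled is a hypothetical name):
function playSoundScheduled(e, name) {
    var source = audioContext.createBufferSource();
    source.buffer = audioBuffers[name];
    source.connect(audioContext.destination);
    source.start(e.playbackTime); // schedule on the audio clock, not "now"
}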
What cwilso says is right. Use audioContext.currentTime and +/- offsets to determine the next setTimeout time manually, instead of with that scheduler library. Then everything should be fine.
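A hedged sketch of that look-ahead pattern, reusing playSoundScheduled from the previous answer (the tempo and sample name are illustrative):
var tempo = 120; // beats per minute
var nextBeatTime = audioContext.currentTime;

function schedulerLoop() {
    // Schedule every beat that falls inside the next 100 ms window,
    // then wake up again well before that window runs out.
    while (nextBeatTime < audioContext.currentTime + 0.1) {
        playSoundScheduled({ playbackTime: nextBeatTime }, 'snare');
        nextBeatTime += 60 / tempo;
    }
    setTimeout(schedulerLoop, 25);
}
schedulerLoop();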

iOS app freezes, not crashing when loading video in webview

I have an iPad app developed using a third-party tool called OpenPlug, which converts AS3 to C++ and from there exports to iOS. (Just to note: this is not a "native" app written by me in Obj-C in Xcode; I wrote AS3.)
Now I have this iPad application which displays pictures and video in a slideshow. For the video I'm using a WebView that loads an HTML page, where I change the src property of the video element to the location of a video file downloaded to my application's storage. This works fine, except that the application freezes after it has been running for a few hours (3-6).
I searched for this problem and tried the solution in "iOS Safari memory leak when loading/unloading HTML5 <video>", but that does not seem to change anything for me.
Since the application freezes (right before the HTML page needs to load a video) and does not crash, what does that mean? Do I need to destroy the video element? At first I created a new WebView for each video, but now I'm reusing the WebView and just changing the src property, which also does not help.
Can anyone shed some light on this? OpenPlug has discontinued its service and no longer offers support, but I think this is more of a webview/video problem on the iPad (?)
Important to note: the application is freezing but my iPad is not. The application does not generate a crash report and does not execute any code anymore (also no traces). When I press the Home button on my iPad and then tap the app icon, the application restarts.
Here is the code of my HTML page, which is refreshed every time a new video needs to start (webview.location = ...):
<html>
<head>
    <script>
        function videoEndedHandler() {
            var video = document.getElementById("videoPlayer");
            video.src = "";
            video.load();
            window.location.hash = "ended";
        }

        function videoErrorHandler() {
            window.location.hash = "error";
            var video = document.getElementById("videoPlayer");
            video.src = "";
            video.load();
        }

        var loop;

        function setup() {
            var video = document.getElementById("videoPlayer");
            video.addEventListener("error", videoErrorHandler, false);
            video.addEventListener("ended", videoEndedHandler, false);
            video.load();
            video.play();
            startHashLoop();
        }

        function startHashLoop() {
            if (window.location.hash == "#touched") {
                setAsPaused();
            }
            if (window.location.hash == "#paused") {
                // check image
                testImage("shouldResume.png?nocache=" + Math.random());
            }
            if (window.location.hash == "#resume") {
                var video = document.getElementById("videoPlayer");
                video.play();
            }
            loop = setTimeout(hashLoop, 500);
        }

        function testImage(url) {
            var img = new Image;
            img.addEventListener("load", isGood);
            img.addEventListener("error", isBad);
            img.src = url;
        }

        function isGood() {
            window.location.hash = "resume";
        }

        function isBad() {
            // alert("Image does not exist");
        }

        function hashLoop() {
            startHashLoop();
        }

        function setAsTouched() {
            window.location.hash = "touched";
        }

        function setAsPaused() {
            var video = document.getElementById("videoPlayer");
            video.pause();
            window.location.hash = "paused";
        }
    </script>
</head>
<body onload="setup();" style="background-color:#000000;">
    <video id="videoPlayer" style="top:0;left:0;position:absolute;" width="100%" height="100%"
           preload="auto" src="##VIDEO_URL##" autoplay="autoplay" webkit-playsinline></video>
</body>
</html>
It would be helpful if you'd post your entire code, but I think it's freezing because you're loading a video from the web on your main thread, which is also responsible for redrawing the UI. By loading a large video on that thread, you freeze the UI as well.
I would recommend moving the video-loading code onto a separate thread (use blocks if you're on iOS 5).
