AIR4: How to restore audio in iOS 8 (beta 5) after requesting mic access?

Problem: iOS 8 (beta 5) fades out all sound output in my app after I listen for microphone sample data, and never restores it, but only if sound has been played before requesting the sample data.
Another user noted that this behaviour stems from requesting microphone access. However, sound plays as expected in our released app on iOS 7 in both of the cases below, and the workaround of closing and reopening the app isn't viable, since microphone recording is a recurring part of our app.
Conditions:
Flex 4.6.0 & AIR 4.0
iPad 3 (MD335LL/A)
iOS 8 (12A4345D)
Both test cases assume that microphone permission has been granted.
Test case 0:
Play sound
Stop sound channel
Audio stops
Connect to microphone, remove listener once sample data received
Attempt to play sound
No audible output, nor does the sound complete event ever fire
Test case 1:
Connect to microphone, remove listener once sample data received
Sound plays without a problem and the sound complete event is fired
Sample code:
// Required imports (assumption: this fragment lives inside a class):
import flash.events.Event;
import flash.events.SampleDataEvent;
import flash.media.AudioPlaybackMode;
import flash.media.Microphone;
import flash.media.Sound;
import flash.media.SoundChannel;
import flash.media.SoundMixer;
import flash.net.URLRequest;
import flash.utils.setTimeout;

protected var _url:String = 'audio/organfinale.mp3';
protected var _sound:Sound;
protected var _soundChannel:SoundChannel;
protected var _microphone:Microphone;

public function init():void
{
    initSound();
}

/** SOUND **/

protected function initSound():void
{
    // Set up to mimic app
    SoundMixer.audioPlaybackMode = AudioPlaybackMode.MEDIA;
    _sound = new Sound( new URLRequest( _url ) );
    _sound.addEventListener( Event.COMPLETE, onSoundLoaded );
}

protected function onSoundLoaded( e:Event ):void
{
    _sound.removeEventListener( Event.COMPLETE, onSoundLoaded );
    switch ( 0 ) // change the constant to select a test case
    {
        // Will cut audio completely, preventing the sound complete event from dispatching
        case 0:
            playSound();
            setTimeout( initMicrophone, 1500 );
            break;
        // Works perfectly (because no overlap between sound/microphone?)
        case 1:
            initMicrophone();
            break;
    }
}

protected function playSound():void
{
    trace( 'Play sound' );
    if ( _soundChannel && _soundChannel.hasEventListener( Event.SOUND_COMPLETE ) )
        _soundChannel.removeEventListener( Event.SOUND_COMPLETE, onSoundCompleted );
    _soundChannel = _sound.play( 0 );
    _soundChannel.addEventListener( Event.SOUND_COMPLETE, onSoundCompleted, false, 0, true );
}

protected function onSoundCompleted( e:Event ):void
{
    trace( "Sound complete" );
}

/** MICROPHONE **/

protected function initMicrophone():void
{
    if ( Microphone.isSupported )
    {
        // Testing pre-emptive disposal of sound that might be conflicting with microphone in iOS 8
        if ( _soundChannel )
        {
            // Tested this, but it throws an error because sound is not streaming
            // _sound.close();
            _soundChannel.removeEventListener( Event.SOUND_COMPLETE, onSoundCompleted );
            _soundChannel.stop();
            _soundChannel = null;
            // Instead the sound will be cut abruptly, and will fail to dispatch its complete event
        }
        _microphone = Microphone.getMicrophone();
        _microphone.setSilenceLevel( 0 );
        // _microphone.setUseEchoSuppression( true );
        // _microphone.gain = 50;
        // _microphone.rate = 44;
        _microphone.addEventListener( SampleDataEvent.SAMPLE_DATA, onSampleDataReceived, false, 0, true );
    }
    else
    {
        trace( 'Microphone is not supported!' );
    }
}

protected function onSampleDataReceived( e:SampleDataEvent ):void
{
    trace( 'Sample data received' );
    _microphone.removeEventListener( SampleDataEvent.SAMPLE_DATA, onSampleDataReceived );
    _microphone = null;
    setTimeout( playSound, 1500 );
}
Notes:
I stopped the sound channel before adding the mic listener, assuming that might have been causing a conflict, but it made no difference.
I've tested using the same device/OS after compiling the app with Flex 4.6.0 & AIR 14.0, and the problem persists.
By comparison, testing this app on iOS 7 worked in both cases.
Any ideas?
Update 1: Bug has been logged here: https://bugbase.adobe.com/index.cfm?event=bug&id=3801262
Update 2: Bug is fixed as of AIR 15.0.0.274: http://labs.adobe.com/downloads/air.html

Related

Flutter Record (3.0.0) plugin no sound recorded in iOS sometimes

I tried recording audio using the Flutter Record plugin (3.0.0), and some users' recorded files (.aac) didn't contain any sound. The bitrate of those files was very low (4478). I haven't set any bitrate in the code, so it should be the plugin's default (128000).
Also, the reported issues all came from iPhone users running iOS 15.6.1 or above.
Any thoughts on what could go wrong here?
// Wrapped in an async method (name illustrative) so the fragment compiles:
Future<void> _startRecording() async {
  if (await _audioRecorder.hasPermission()) {
    await _audioRecorder.start(
        encoder: AudioEncoder.AAC, path: audioPath);
    bool isRecording = await _audioRecorder.isRecording();
    setState(() {
      if (isRecording) {
        _isRecording = true;
      }
      _recordDuration = 0;
    });
    _startTimer();
  }
}

onaudioprocess not called on iOS 11

I am trying to get audio capture from the microphone working in Safari on iOS 11, now that support has been added.
However, the onaudioprocess callback is never called. Here's an example page:
<html>
<body>
  <button onclick="doIt()">DoIt</button>
  <ul id="logMessages"></ul>
  <script>
    function debug(msg) {
      if (typeof msg !== 'undefined') {
        var logList = document.getElementById('logMessages');
        var newLogItem = document.createElement('li');
        if (typeof msg === 'function') {
          msg = msg.toString();
        } else if (typeof msg !== 'string') {
          msg = JSON.stringify(msg);
        }
        var newLogText = document.createTextNode(msg);
        newLogItem.appendChild(newLogText);
        logList.appendChild(newLogItem);
      }
    }

    function doIt() {
      var handleSuccess = function (stream) {
        var context = new AudioContext();
        var input = context.createMediaStreamSource(stream);
        var processor = context.createScriptProcessor(1024, 1, 1);
        input.connect(processor);
        processor.connect(context.destination);
        processor.onaudioprocess = function (e) {
          // Do something with the data, e.g. convert it to WAV
          debug(e.inputBuffer);
        };
      };
      navigator.mediaDevices.getUserMedia({audio: true, video: false})
        .then(handleSuccess);
    }
  </script>
</body>
</html>
On most platforms, you will see items being added to the messages list as the onaudioprocess callback is called. However, on iOS, this callback is never called.
Is there something else that I should do to try and get it called on iOS 11 with Safari?
There are two problems. The main one is that Safari on iOS 11 seems to automatically suspend new AudioContexts that aren't created in response to a tap. You can resume() them, but only in response to a tap.
(Update: Chrome mobile also does this, and Chrome desktop will have the same limitation starting in version 70 / December 2018.)
So, you have to either create it before you get the MediaStream, or else get the user to tap again later.
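If you take the second route, the resume() call itself must run inside the tap handler. A minimal sketch of that approach (the resumeButton element is an assumption, not part of the original page):

var AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext(); // created on page load, so iOS suspends it

// Hypothetical button; resume() only succeeds inside a tap handler.
document.getElementById('resumeButton').addEventListener('click', function () {
  if (context.state === 'suspended') {
    context.resume().then(function () {
      console.log('AudioContext resumed, state:', context.state);
    });
  }
});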
The other issue with your code is that AudioContext is prefixed as webkitAudioContext in Safari.
Here's a working version:
<html>
<body>
  <button onclick="beginAudioCapture()">Begin Audio Capture</button>
  <script>
    function beginAudioCapture() {
      var AudioContext = window.AudioContext || window.webkitAudioContext;
      var context = new AudioContext();
      var processor = context.createScriptProcessor(1024, 1, 1);
      processor.connect(context.destination);

      var handleSuccess = function (stream) {
        var input = context.createMediaStreamSource(stream);
        input.connect(processor);

        var receivedAudio = false;
        processor.onaudioprocess = function (e) {
          // This will be called multiple times per second.
          // The audio data will be in e.inputBuffer
          if (!receivedAudio) {
            receivedAudio = true;
            console.log('got audio', e);
          }
        };
      };

      navigator.mediaDevices.getUserMedia({audio: true, video: false})
        .then(handleSuccess);
    }
  </script>
</body>
</html>
(You can set the onaudioprocess callback sooner, but then you get empty buffers until the user approves of microphone access.)
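If you do attach the callback early, you can simply skip the empty buffers until real samples arrive; a minimal sketch, reusing the processor from the example above:

processor.onaudioprocess = function (e) {
  var samples = e.inputBuffer.getChannelData(0);
  var hasSignal = false;
  for (var i = 0; i < samples.length; i++) {
    if (samples[i] !== 0) {
      hasSignal = true;
      break;
    }
  }
  if (!hasSignal) {
    return; // still waiting for the user to grant microphone access
  }
  // ...process real audio here...
};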
Oh, and one other iOS bug to watch out for: Safari on the iPod touch (as of iOS 12.1.1) reports that it does not have a microphone (it does). So, getUserMedia will incorrectly reject with an Error: Invalid constraint if you ask for audio there.
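In practice that just means handling the rejection gracefully; a purely illustrative sketch:

navigator.mediaDevices.getUserMedia({audio: true, video: false})
  .then(function (stream) {
    // use the stream
  })
  .catch(function (err) {
    // On the affected iPod touch models this fires with
    // 'Invalid constraint' even though a microphone is present.
    console.error('getUserMedia rejected:', err.message);
  });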
FYI: I maintain the microphone-stream package on npm that does this for you and provides the audio in a Node.js-style ReadableStream. It includes this fix, if you or anyone else would prefer to use that over the raw code.
Tried it on iOS 11.0.1, and unfortunately this problem still isn't fixed.
As a workaround, I wonder if it makes sense to replace the ScriptProcessor with a function that takes the stream data from a buffer and then processes it every x milliseconds. But that's a big change to the functionality.
Just wondering... do you have the setting enabled in Safari's settings? It comes enabled by default in iOS 11, but maybe you disabled it without noticing.

AS3 MediaPromise for CameraUI doesn't get back to me on iOS

I need to capture an image using CameraUI in an AS3/AIR/Starling app. I get the CameraUI, it opens, the image gets selected, and I receive a MediaPromise object.
The samples in the documentation mention the following code to capture the image from an async media promise on iOS, but I never get a callback in either the onMediaLoaded function or the ioError function.
BTW, the log line "Asynchronous Mode Media Promise." is displayed when I choose a photo.
private function imageSelected(evt:MediaEvent):void
{
    this.feedbackText.text = "Image Selected\n";
    // Create a new imagePromise
    var imagePromise:MediaPromise = evt.data;
    // Open our data source
    dataSource = imagePromise.open();
    if (imagePromise.isAsync)
    {
        this.feedbackText.text += "Asynchronous Mode Media Promise.\n";
        var eventSource:IEventDispatcher = dataSource as IEventDispatcher;
        eventSource.addEventListener( Event.COMPLETE, onMediaLoaded );
        eventSource.addEventListener( IOErrorEvent.IO_ERROR, ioError );
    }
    else
    {
        this.feedbackText.text += "Synchronous Mode Media Promise.\n";
        readMediaData();
    }
}

// =======================================================================
// onMediaLoaded
// =======================================================================
protected function onMediaLoaded( event:Event ):void
{
    this.feedbackText.text += "Image Loaded.\n";
    readMediaData();
}

// ========================================================================
// ioError()
// ========================================================================
protected function ioError( event:IOErrorEvent ):void
{
    this.feedbackText.text = "IOError - Unable to process photo - " + event.text;
}
Test Device: iPhone 6 running iOS 10
AIR SDK: v22
Just found the problem with my code: I had imported starling.events.Event instead of flash.events.Event, so the "Event" type of my callback function's argument resulted in a "Type Coercion failed" error.

Security Sandbox Violation for ready YouTube video in my AIR app (iOS)

I'm trying to load movie trailers in my iOS AIR app. It works fine when I test on my computer, but the video won't play on an Apple device.
On my computer, the video plays, but this message is displayed in the output:
*** Security Sandbox Violation ***
SecurityDomain 'https://s.ytimg.com/yts/swfbin/player-vflm0X9AB/watch_as3.swf' tried to access incompatible context 'app:/Iphone3.swf'
Warning: Domain i.ytimg.com does not explicitly specify a meta-policy, but Content-Type of policy file https://i.ytimg.com/crossdomain.xml is 'text/x-cross-domain-policy'. Applying meta-policy 'by-content-type'.
Here's my code:
urlTrailer = "https://www.youtube.com/v/" + item.trailer + "?gl=BE&version=3";
// item.trailer is the videoID of the YouTube video

function showTrailer():void {
    var loader:Loader = new Loader();
    var player:Object;
    loader.contentLoaderInfo.addEventListener(Event.INIT, onLoaderInit);
    loader.load(new URLRequest(urlTrailer));

    function onLoaderInit(event:Event):void {
        addChild(loader);
        loader.content.addEventListener("onReady", onPlayerReady);
        loader.content.addEventListener("onError", onPlayerError);
        loader.content.addEventListener("onStateChange", onPlayerStateChange);
        loader.content.addEventListener("onPlaybackQualityChange",
            onVideoPlaybackQualityChange);
    }

    function onPlayerReady(event:Event):void {
        // Event.data contains the event parameter, which is the Player API ID
        trace("player ready:", Object(event).data);
        player = loader.content;
        player.y = 200;
        player.setSize(480, 360);
    }
}
I've tried to add this in my XML file:
<cross-domain-policy>
    <allow-access-from domain="*"/>
</cross-domain-policy>
But I still get the Security Sandbox Violation error.
What do you think could be the problem, and how can I solve it?
Try something like the code below (if it doesn't work, then try the code linked in my other answer):
function onLoaderInit(event:Event):void
{
    //addChild(loader);
    loader.content.addEventListener("onReady", onPlayerReady);
    loader.content.addEventListener("onError", onPlayerError);
    loader.content.addEventListener("onStateChange", onPlayerStateChange);
    loader.content.addEventListener("onPlaybackQualityChange",
        onVideoPlaybackQualityChange);
}

function onPlayerReady(event:Event):void
{
    // Event.data contains the event parameter, which is the Player API ID
    trace("player ready:", Object(event).data);
    player = loader.content;
    player.y = 200;
    player.setSize(480, 360);

    //# Alt method for adding to stage...
    //loader.width = 320; loader.height = 240;
    //loader.x = 30; loader.y = 100;
    addChild(loader);
}

Distorted audio in iOS 7.1 with WebAudio API

On iOS 7.1, I keep getting a buzzing/noisy/distorted sound when playing back audio using the Web Audio API. It sounds distorted like this, instead of normal like this.
The same files are fine when using HTML5 audio. It all works fine on desktop (Firefox, Chrome, Safari.)
EDIT:
The audio is distorted in the iOS Simulator on iOS 7.1, 8.1, and 8.2. The buzzing sound often starts before I even play anything back.
The audio is distorted on a physical iPhone running iOS 7.1, in both Chrome and Safari.
The audio is fine on a physical iPhone running iOS 8.1, in both Chrome and Safari.
i.e., the buzzing audio occurs on iOS 7.1 only.
Howler.js is not the issue. The problem is still there using pure JS like so:
var context;
var sound;
var extension = '.' + ( new Audio().canPlayType( 'audio/ogg' ) !== '' ? 'ogg' : 'mp3' );

/** Test for Web Audio API support **/
try {
    // still needed for Safari
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    // create an AudioContext
    context = new AudioContext();
} catch(e) {
    // API not supported
    throw new Error( 'Web Audio API not supported.' );
}

function loadSound( url ) {
    var request = new XMLHttpRequest();
    request.open( 'GET', url, true );
    request.responseType = 'arraybuffer';
    request.onload = function() {
        // request.response is encoded... so decode it now
        context.decodeAudioData( request.response, function( buffer ) {
            sound = buffer;
        }, function( err ) {
            throw new Error( err );
        });
    };
    request.send();
}

function playSound( buffer ) {
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect( context.destination );
    source.start( 0 );
}

loadSound( '/tests/Assets/Audio/En-us-hello' + extension );

$(document).ready(function(){
    $( '#clickme' ).click( function( event ) {
        playSound( sound );
    });
}); /* END .ready() */
A live version of this code is available here: Web Audio API - Hello world
Googling did not bring up any results about such a distorted-sound issue on iOS 7.1.
Has anyone else run into it? Should I file a bug report with Apple?
I believe the issue is caused by the audioContext.sampleRate property being reset, which seems to happen after the browser/OS plays something recorded at a different sampling rate.
I've devised the following workaround, which basically silently plays a short WAV file recorded at the sampling rate that the device currently plays back at:
"use strict";
var getData = function( context, filePath, callback ) {
var source = context.createBufferSource(),
request = new XMLHttpRequest();
request.open( "GET", filePath, true );
request.responseType = "arraybuffer";
request.onload = function() {
var audioData = request.response;
context.decodeAudioData(
audioData,
function( buffer ) {
source.buffer = buffer;
callback( source );
},
function( e ) {
console.log( "Error with decoding audio data" + e.err );
}
);
};
request.send();
};
module.exports = function() {
var AudioContext = window.AudioContext || window.webkitAudioContext,
context = new AudioContext();
getData(
context,
"path/to/short/file.wav",
function( bufferSource ) {
var gain = context.createGain();
gain.gain.value = 0;
bufferSource.connect( gain );
gain.connect( context.destination );
bufferSource.start( 0 );
}
);
};
Obviously, if some of the devices have different sampling rates, you would need to detect and use a specific file for every rate.
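For example, the file could be chosen from the context's reported rate; a minimal sketch, where the file names and the set of rates are assumptions:

var warmupFiles = {
    44100: "path/to/short-44100.wav",
    48000: "path/to/short-48000.wav"
};

var AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();

// Fall back to 44100 if the device reports an unexpected rate.
var filePath = warmupFiles[ context.sampleRate ] || warmupFiles[ 44100 ];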
It looks like iOS 6+ Safari defaults to a sample rate of 48000. If you type this into the developer console when you first open Mobile Safari, you'll get 48000:
var ctx = new window.webkitAudioContext();
console.log(ctx.sampleRate);
Further Reference: https://forums.developer.apple.com/thread/20677
Then, if you close that initial context on load with ctx.close(), the next context you create will use the sample rate most other browsers use (44100), and sound will play without distortion.
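A minimal sketch of that fix (assuming iOS Safari, where the constructor is prefixed):

var ctx = new window.webkitAudioContext();
console.log(ctx.sampleRate); // often 48000 right after boot
ctx.close();                 // dispose of the initial context

var fixed = new window.webkitAudioContext();
console.log(fixed.sampleRate); // 44100, and playback is undistorted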
Credit to this for pointing me in the right direction (and in case the above no longer works in the future): https://github.com/Jam3/ios-safe-audio-context/blob/master/index.js
The function, as of this post's date:
function createAudioContext (desiredSampleRate) {
  var AudioCtor = window.AudioContext || window.webkitAudioContext

  desiredSampleRate = typeof desiredSampleRate === 'number'
    ? desiredSampleRate
    : 44100
  var context = new AudioCtor()

  // Check if hack is necessary. Only occurs in iOS6+ devices
  // and only when you first boot the iPhone, or play an audio/video
  // with a different sample rate
  if (/(iPhone|iPad)/i.test(navigator.userAgent) &&
      context.sampleRate !== desiredSampleRate) {
    var buffer = context.createBuffer(1, 1, desiredSampleRate)
    var dummy = context.createBufferSource()
    dummy.buffer = buffer
    dummy.connect(context.destination)
    dummy.start(0)
    dummy.disconnect()

    context.close() // dispose old context
    context = new AudioCtor()
  }
  return context
}
