AudioKit cannot record microphone - ios

I'm using the latest version of AudioKit, 4.8. I have set up a simple audio chain to record some audio from the microphone, and it works great in the simulator, but when I switch to a physical device, nothing is recorded and it keeps printing this in the console:
AKNodeRecorder.swift:process(buffer:time:):137:Write failed: error -> Cannot complete process.(“com.apple.coreaudio.avfaudio”Error -50.)
So you see, nothing is recorded. Error -50 is the bad-parameter error, but I couldn't figure out what I'm missing. The audio session is set to playAndRecord and I DID request microphone permission. If I don't use AKNodeRecorder but install my own tap instead, the same error still shows up when I call AKAudioFile.write(from:).
Here's my code:
guard let file = try? AKAudioFile(forWriting: destinationURL,
                                  settings: [AVFormatIDKey: kAudioFormatMPEG4AAC,
                                             AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue]) else {
    return
}
microphone?.start()
let recorder = try? AKNodeRecorder(node: microphone, file: file)
try? recorder?.record()
What should I do?
P.S. Before I upgraded to AudioKit 4.8 I was on 4.7, and instead of giving me error -50, it simply crashed when I began recording on a physical device, with the exception below. All is fine in the simulator.
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: '[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10865 "(null)"'
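For reference, the tap-based variant mentioned above looks roughly like this (a sketch with hypothetical names; `microphone` and `file` are the same objects as in the snippet):

```swift
// Sketch of recording via a manual tap instead of AKNodeRecorder.
// `microphone` is the AKMicrophone and `file` the AKAudioFile from above.
let format = microphone!.avAudioNode.outputFormat(forBus: 0)
microphone!.avAudioNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    do {
        try file.write(from: buffer) // this is the call that fails with -50
    } catch {
        print("Write failed: \(error)")
    }
}
```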

Related

HTML5 audio stops within a second when playing on iOS [Ionic 5]

This is really frustrating. I have an Ionic app that needs to play audio, and we use HTML5 audio to play a radio stream from a URL like this:
audio = new Audio();
audio.src = url;
audio.play();
It works on Android and on all iOS simulators without any problem, but on a physical device playback starts and then stops within a second, and the logs show this error:
Error acquiring assertion: <NSError: 0x282cf67c0; domain: RBSAssertionErrorDomain; code: 2; reason:
"Required client entitlement is missing"> {
userInfo = {
RBSAssertionAttribute = <RBSLegacyAttribute: 0x1592432e0; requestedReason: MediaPlayback; reason:
MediaPlayback; flags: PreventTaskSuspend | PreventTaskThrottleDown |
WantsForegroundResourcePriority>;
}
The weird thing is that it sometimes works, but that has only happened on one or two occasions.
If anyone has any idea, please help me out.
Thanks.

Connecting bluetooth headphones while app is recording in the background causes the recording to stop

I am facing the following issue and hoping someone else encountered it and can offer a solution:
I am using AVAudioEngine to access the microphone. Until iOS 12.4, every time the audio route changed I was able to restart the AVAudioEngine graph to reconfigure it and ensure the input/output audio formats fit the new input/output route. Due to changes introduced in iOS 12.4 it is no longer possible to start (or restart for that matter) an AVAudioEngine graph while the app is backgrounded (unless it's after an interruption).
The error Apple now throws when I attempt this is:
2019-10-03 18:34:25.702143+0200 [1703:129720] [aurioc] 1590: AUIOClient_StartIO failed (561145187)
2019-10-03 18:34:25.702528+0200 [1703:129720] [avae] AVAEInternal.h:109 [AVAudioEngineGraph.mm:1544:Start: (err = PerformCommand(*ioNode, kAUStartIO, NULL, 0)): error 561145187
2019-10-03 18:34:25.711668+0200 [1703:129720] [Error] Unable to start audio engine The operation couldn’t be completed. (com.apple.coreaudio.avfaudio error 561145187.)
I'm guessing Apple closed a security vulnerability there. So now I removed the code to restart the graph when an audio route is changed (e.g. bluetooth headphones are connected).
It seems that when an I/O audio format changes (as happens when the user connects a Bluetooth device), an .AVAudioEngineConfigurationChange notification is fired to let the integrating app react to the change in format. This is really what I should have used to handle changes in I/O formats from the beginning, instead of brute-force restarting the graph. According to the Apple documentation: "When the audio engine's I/O unit observes a change to the audio input or output hardware's channel count or sample rate, the audio engine stops, uninitializes itself, and issues this notification." (see the docs here). But when this happens while the app is backgrounded, I am unable to start the audio engine with the correct audio I/O formats, because of point #1.
So bottom line, it looks like by closing a security vulnerability, Apple introduced a bug in reacting to audio I/O format changes while the app is backgrounded. Or am I missing something?
I'm attaching a code snippet to better describe the issue. For a plug-and-play AppDelegate see here - https://gist.github.com/nevosegal/5669ae8fb6f3fba44505543e43b5d54b.
class RCAudioEngine {

    private let audioEngine = AVAudioEngine()

    init() {
        setup()
        start()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(handleConfigurationChange),
                                               name: .AVAudioEngineConfigurationChange,
                                               object: nil)
    }

    @objc func handleConfigurationChange() {
        // Attempting to call start(),
        // or audioEngine.reset(), setup() and start(),
        // or any other combination that involves starting the audioEngine,
        // results in error 561145187.
        // Not calling start() doesn't return this error, but also doesn't
        // restart the recording.
    }

    public func setup() {
        // Set up nodes
        let inputNode = audioEngine.inputNode
        let inputFormat = inputNode.inputFormat(forBus: 0)
        let mainMixerNode = audioEngine.mainMixerNode

        // Mute output to avoid feedback
        mainMixerNode.outputVolume = 0.0

        inputNode.installTap(onBus: 0, bufferSize: 4096, format: inputFormat) { (buffer, _) -> Void in
            // Do audio conversion and use buffers
        }
    }

    public func start() {
        RCLog.debug("Starting audio engine")
        guard !audioEngine.isRunning else {
            RCLog.debug("Audio Engine is already running")
            return
        }

        do {
            audioEngine.prepare()
            try audioEngine.start()
        } catch {
            RCLog.error("Unable to start audio engine \(error.localizedDescription)")
        }
    }
}
I see only one fix that went into iOS 12.4, and I am not sure whether it causes this issue. From the release notes (https://developer.apple.com/documentation/ios_ipados_release_notes/ios_12_4_release_notes):
"Resolved an issue where running an app in iOS 12.2 or later under the Leaks instrument resulted in random numbers of false-positive leaks for every leak check after the first one in a given run. You might still encounter this issue in Simulator, or in macOS apps when using Instruments from Xcode 10.2 and later. (48549361)"
You can raise an issue with Apple if you are a registered developer; they may help if the defect is on their side.
You can also test with the upcoming iOS release (via the Apple beta program) to check whether your code works there.

AudioSession maximumInputNumberOfChannels returning 0

I made an iOS plugin that captures audio data and forwards it to a listener as a byte stream. It was working flawlessly in the simulator and on various devices, but on an iPhone 6 running iOS 11.3 it crashes during initialization. I've tracked the problem down to this code:
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setPreferredInputNumberOfChannels(1) // This is the line that is throwing
    try session.setPreferredIOBufferDuration(65)
} catch {
    print(error.localizedDescription) // Prints: The operation couldn’t be completed. (OSStatus error -50.)
    return -1
}
As the comment shows, the error is thrown by the call to session.setPreferredInputNumberOfChannels. Looking at the documentation, it says the call will throw if the requested number is greater than session.maximumInputNumberOfChannels, and judging from the error message, this seems to be the case: checking that value on this phone, it returns 0.
What would cause that value to be 0? As far as I can tell, I don't think it's a permissions issue, as I request microphone permission before the app reaches this point in the code. The only other thing I can think of is that the phone essentially has no microphone capabilities... but it's a phone, so the inclusion of a microphone seems fairly standard.
EDIT: I pulled out an iPad Air that's running iOS 12, and it's having the same issue.
I found the problem. I needed to call session.setActive(true) before trying to set the number of channels. I've never had to do that before, but I guess it's something you should do anyway, just in case.
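A minimal sketch of the corrected order described above (same values as in the question's snippet):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    // Activate the session first: maximumInputNumberOfChannels only reports
    // a meaningful (non-zero) value once the session is active.
    try session.setActive(true)
    try session.setPreferredInputNumberOfChannels(1) // no longer throws -50
    try session.setPreferredIOBufferDuration(65)
} catch {
    print(error.localizedDescription)
}
```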
AVAudioSession.sharedInstance()
you can change it anyway,
search it?

Crash when first playing AVPlayer video and then recording audio in AudioKit

I'm having trouble using AudioKit in an iOS app that also uses AVPlayer (for video). In my project I have one page with an AVPlayer video and a button that segues to a second page with record and play buttons that use AudioKit. If I don't start the video, the AKNodeRecorder works as expected, but if the video is played before recording, the app crashes with the following message:
2018-04-13 13:18:10.576116+0200 AVPlayer_vs_AudioKit[1854:580107] [mcmx] 338: input bus 0 sample rate is 0
2018-04-13 13:18:10.576361+0200 AVPlayer_vs_AudioKit[1854:580107] [avae] AVAEInternal.h:103:_AVAE_CheckNoErr:
[AVAudioEngineGraph.mm:1839:InstallTapOnNode: (err = AUGraphParser::InitializeActiveNodesInInputChain(ThisGraph, *inputNode)): error -10875
2018-04-13 13:18:10.576691+0200 AVPlayer_vs_AudioKit[1854:580107] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10875'
*** First throw call stack:
(0x183a3ed8c 0x182bf85ec 0x183a3ebf8 0x1893671a0 0x189383f58 0x189393410 0x18940c5e8 0x1893fc6e8 0x104e7db90 0x104d7d170 0x104d7d714 0x18d76e6c8 0x18d88f8a4 0x18d77477c 0x18d8aa1dc 0x18d7f1a48 0x18d7e68f8 0x18d7e5238 0x18dfc6c0c 0x18dfc91b8 0x18dfc2258 0x1839e7404 0x1839e6c2c 0x1839e479c 0x183904da8 0x1858e7020 0x18d8e578c 0x104d801a4 0x183395fc0)
libc++abi.dylib: terminating with uncaught exception of type NSException
I use AudioKit 4.2.3 from CocoaPods with Xcode 9.3.
My project can be downloaded here: https://www.dropbox.com/s/gxek91sccit88zv/AVPlayer_vs_AudioKit.zip?dl=0
Configuring AudioKit's idea of an audio session early on to include recording fixes the problem, e.g. in application(_:didFinishLaunchingWithOptions:):
try! AKSettings.setSession(category: .playAndRecord, options: 0)
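In the app delegate that looks roughly like this (a sketch; AKSettings.setSession is the AudioKit 4.x API, and a do/catch replaces the try! above):

```swift
import UIKit
import AudioKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Claim a play-and-record session before AVPlayer or any AudioKit
        // node touches the audio hardware for the first time.
        do {
            try AKSettings.setSession(category: .playAndRecord)
        } catch {
            print("Could not configure audio session: \(error)")
        }
        return true
    }
}
```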

Matching Input & Output Hardware Settings for AVAudioEngine

I am trying to build a very simple audio effects chain using Core Audio for iOS. So far I have implemented an EQ - Compression - Limiter chain which works perfectly fine in the simulator. However on device, the application crashes when connecting nodes to the AVAudioEngine due to an apparent mismatch in the input and output hardware formats.
'com.apple.coreaudio.avfaudio', reason: 'required condition is false:
IsFormatSampleRateAndChannelCountValid(outputHWFormat)'
Taking a basic example, my Audio Graph is as follows.
Mic -> Limiter -> Main Mixer (and Output)
and the graph is populated using
engine.connect(engine.inputNode!, to: limiter, format: engine.inputNode!.outputFormatForBus(0))
engine.connect(limiter, to: engine.mainMixerNode, format: engine.inputNode!.outputFormatForBus(0))
which crashes with the above exception. If I instead use the limiter's format when connecting to the mixer
engine.connect(engine.inputNode!, to: limiter, format: engine.inputNode!.outputFormatForBus(0))
engine.connect(limiter, to: engine.mainMixerNode, format: limiter.outputFormatForBus(0))
the application crashes with a kAudioUnitErr_FormatNotSupported error:
'com.apple.coreaudio.avfaudio', reason: 'error -10868'
Before connecting the audio nodes in the engine, the inputNode has 1 channel and a sample rate of 44.100Hz, while the outputNode has 0 channels and a sample rate of 0Hz (deduced using outputFormatForBus(0)). Could this be because no node is yet connected to the output mixer? Setting the preferred sample rate on AVAudioSession made no difference.
Is there something that I am missing here? I have microphone access (verified using AVAudioSession.sharedInstance().recordPermission()), and I have set the AVAudioSession category to record (AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryRecord)).
The limiter is an AVAudioUnitEffect initialized as follows:
let limiter = AVAudioUnitEffect(audioComponentDescription:
    AudioComponentDescription(
        componentType: kAudioUnitType_Effect,
        componentSubType: kAudioUnitSubType_PeakLimiter,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0))
engine.attachNode(limiter)
and engine is a global, class variable
var engine = AVAudioEngine()
As I said, this works perfectly fine using the simulator (and the Mac's default hardware), but continually crashes on various iPads on iOS 8 & iOS 9. I have a super basic example working which simply feeds the mic input through a player to the output mixer:
do {
    file = try AVAudioFile(forWriting: NSURL.URLToDocumentsFolderForName(name: "test", WithType: "caf")!,
                           settings: engine.inputNode!.outputFormatForBus(0).settings)
} catch {}
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
Here the inputNode has 1 channel and 44.100Hz sampling rate, while the outputNode has 2 channels and 44.100Hz sampling rate, but no mismatching seems to occur. Thus the issue must be the manner in which the AVAudioUnitEffect is connected to the output mixer.
Any help would be greatly appreciated.
This depends on some factors outside of the code you've shared, but it's possible you're using the wrong AVAudioSession category.
I ran into this same issue under slightly different circumstances. When I was using AVAudioSessionCategoryRecord as the AVAudioSession category, I hit the same error when attempting to connect an audio tap; not only did I receive that error, my AVAudioEngine inputNode reported an outputFormat with a 0.0 sample rate.
Changing the category to AVAudioSessionCategoryPlayAndRecord, I received the expected 44.100Hz sample rate and the issue was resolved.
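A sketch of that fix, following the question's Swift 2-era API (set the category before wiring the graph; node names are assumptions):

```swift
import AVFoundation

func configureAndStart() throws {
    // Set the category *before* connecting nodes: with the plain Record
    // category the input node can report a 0.0 sample-rate output format,
    // which makes IsFormatSampleRateAndChannelCountValid fail.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setActive(true)

    let engine = AVAudioEngine()
    let input = engine.inputNode!
    let hwFormat = input.outputFormatForBus(0) // now a valid 44.100Hz format
    engine.connect(input, to: engine.mainMixerNode, format: hwFormat)
    try engine.start()
}
```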
