Audio recording is not working during screen share with Zoom or other apps - iOS

I am trying to record voice with AVAudioRecorder. It works fine when screen sharing is not enabled, but when I share my device's screen with Zoom or any other app, the AVAudioSession fails to activate.
Here is the code I added for audio recording:
UIApplication.shared.beginReceivingRemoteControlEvents()
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
    try session.setActive(true)
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    audioRecorder = try AVAudioRecorder(url: getFileUrl(), settings: settings)
    audioRecorder.delegate = self
    audioRecorder.isMeteringEnabled = true
    audioRecorder.prepareToRecord()
    self.nextBtn.isHidden = true
} catch let error {
    print("Error \(error)")
}
When I hit the record button, it shows me the error NSOSStatusErrorDomain Code=561017449 "Session activation failed".
Here is a video showing the issue:
https://share.icloud.com/photos/0a09o5DCNip6Rx_GnTpht7K3A

I don't have the reputation to comment or I would. (Almost there lol!) Have you tried AVAudioSession.CategoryOptions.overrideMutedMicrophoneInterruption?
Edit
The more I looked into this, the more it seems that if Zoom is using the microphone hardware, the iPhone won't let your app record that stream. I think that's the idea behind AVAudioSession.sharedInstance() being a singleton. (For what it's worth, error code 561017449 is the four-character code '!pri', AVAudioSession.ErrorCode.insufficientPriority, which would fit: another app's session has priority.)
From the docs:
Type Property
overrideMutedMicrophoneInterruption: An option that indicates whether the system interrupts the audio session when it mutes the built-in microphone.
Declaration
static var overrideMutedMicrophoneInterruption: AVAudioSession.CategoryOptions { get }
Discussion
Some devices include a privacy feature that mutes the built-in
microphone at the hardware level in certain conditions, such as when
you close the Smart Folio cover of an iPad. When this occurs, the
system interrupts the audio session that’s capturing input from the
microphone. Attempting to start audio input after the system mutes the
microphone results in an error. If your app uses an audio session
category that supports input and output, such as playAndRecord, you
can set this option to disable the default behavior and continue using
the session. Disabling the default behavior may be useful to allow
your app to continue playback when recording or monitoring a muted
microphone doesn’t lead to a poor user experience. When you set this
option, playback continues as normal, and the microphone hardware
produces sample buffers, but with values of 0.
Important
Attempting to use this option with a session category that doesn’t
support audio input results in an error.
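Not verbatim from the docs or the answer: just a minimal sketch of how you might try that option. It requires an input-capable category such as playAndRecord, and it was introduced in iOS 14.5, so availability needs checking:

let session = AVAudioSession.sharedInstance()
do {
    if #available(iOS 14.5, *) {
        // Assumption: keep the session usable even while the system
        // mutes the built-in microphone.
        try session.setCategory(.playAndRecord,
                                options: [.defaultToSpeaker, .overrideMutedMicrophoneInterruption])
    } else {
        try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
    }
    try session.setActive(true)
} catch {
    print("Error \(error)")
}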

Related

I want to make sound effects without stopping the music playing in the background in another app

I am currently developing an application with SwiftUI.
There is a function that plays a sound effect on tap.
When I tested it on a physical device, the Spotify music playing in the background stopped. Is it possible to use AVFoundation to play sound effects without stopping the music? Also, if there is a better way to implement this, please let me know.
import Foundation
import AVFoundation

var audioPlayer: AVAudioPlayer?

func playSound(sound: String, type: String) {
    if let path = Bundle.main.path(forResource: sound, ofType: type) {
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: path))
            audioPlayer?.play()
        } catch {
            print("ERROR: Could not find and play the sound file.")
        }
    }
}
Set your AVAudioSession category options to .mixWithOthers.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.ambient, mode: .default, options: [.mixWithOthers])
    try audioSession.setActive(true) // When you're ready to play something
} catch {
    print("Failed to set audio session category.")
}
ambient indicates that sound is not critical to this application. The default category is soloAmbient, which is non-mixable. (It's possible you can just set the category to ambient here and you'll get mixWithOthers behavior for free; I don't use that category often, so I don't remember how much you get by default.) If sound is critical, see the .playback category.
As a rule, you set the category once in the app, and then set the session active or inactive as you need it. If I remember correctly, AVAudioPlayer will automatically activate the session, so you may not need to do that (I typically work at much lower levels, and don't always remember what the high-level tools do automatically). It is legal to change your category, however, if different features of your app have different needs.
For more details, see AVAudioSession.Category and AVAudioSession. There are many options in sound playback, and many of them are not obvious (and a few have very confusing names, I'm looking at you .allowBluetooth), so it's worth reading through the docs.
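To illustrate the configure-once advice above (not from the original answer; the helper name is invented for this sketch):

import AVFoundation

enum AudioConfig {
    // Hypothetical helper: call once, e.g. when the app launches.
    static func setUpOnce() {
        do {
            try AVAudioSession.sharedInstance()
                .setCategory(.ambient, mode: .default, options: [.mixWithOthers])
        } catch {
            print("Failed to set audio session category: \(error)")
        }
    }
}

// Per the note above, AVAudioPlayer reportedly activates the session itself
// on play(), so after this one-time setup the playSound(sound:type:) function
// from the question should mix with music from other apps.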

How to use the internal mic for input and Bluetooth for output

I'm currently trying to have my device record audio for a capture session through the device mic while sending audio output to a Bluetooth device (AirPods).
The reason I am doing this is that with Bluetooth headphones, and especially AirPods, the playback quality is horrible when the Bluetooth mic is active.
I tried using setPreferredInput, but it changes both input and output. Here's what I have so far:
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth, .mixWithOthers])
    print(session.currentRoute.outputs)
    try session.setAllowHapticsAndSystemSoundsDuringRecording(true)
    try session.setActive(true, options: .notifyOthersOnDeactivation)
    if let mic = session.availableInputs?.first(where: { $0.portType == AVAudioSession.Port.builtInMic }) {
        try session.setPreferredInput(mic)
    }
} catch let err {
    print("Audio session err", err.localizedDescription)
}
I also saw an old API that could have helped, but it seems to have been deprecated long ago (kAudioSessionProperty_OverrideCategoryEnableBluetoothInput for AudioSession).
There are other apps on the App Store that seem to have achieved this split recording, so it appears to be possible.
Get rid of allowBluetooth and use allowBluetoothA2DP. You also don't want defaultToSpeaker here.
"Allow Bluetooth" actually means "prefer HFP" which is why the audio is so bad. HFP is a low-bandwidth bidirectional protocol used generally for phone calls. The enum name is very confusing IMO. People get confused about it all the time.
A2DP is a high-bandwidth unidirectional protocol (it doesn't support a microphone). When you request that, the headset's microphone will be disabled, and you'll get the iPhone's microphone by default (provided there isn't some other wired microphone available, but that's very unlikely).
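A minimal sketch of the suggested configuration (not verbatim from the answer): A2DP carries output to the headset, and input falls back to the built-in mic:

do {
    let session = AVAudioSession.sharedInstance()
    // A2DP is output-only, so the AirPods mic stays disabled and input
    // falls back to the built-in microphone automatically.
    try session.setCategory(.playAndRecord, mode: .default,
                            options: [.allowBluetoothA2DP, .mixWithOthers])
    try session.setActive(true, options: .notifyOthersOnDeactivation)
    print(session.currentRoute.inputs, session.currentRoute.outputs)
} catch {
    print("Audio session error:", error)
}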

AVAudioSession setCategory .allowBluetooth causes crash

I'm writing an app that records user audio with AVAudioSession. All works well when I don't add Bluetooth to the options, but I'd like to be able to record with AirPods as well. When I add the .allowBluetooth option, session activation throws an error and recording no longer works.
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
} catch let error as NSError {
    print(error.localizedDescription)
    return
}
Any suggestions on this? I've looked through numerous SO posts related to the subject and found nothing that seems to solve my issue.
You are getting error code -50, which indicates invalid parameters.
This is because the .defaultToSpeaker option can only be used with the playAndRecord category:
You can set this option only when using the playAndRecord category. It's used to modify the category's routing behavior so that audio is always routed to the speaker rather than the receiver if no other accessories, such as headphones, are in use.
So either remove this option or use the playAndRecord category.
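A sketch of the corrected setup, assuming you also want playback (if you only record, keep .record and simply drop .defaultToSpeaker):

do {
    let session = AVAudioSession.sharedInstance()
    // .defaultToSpeaker is only valid together with .playAndRecord.
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
} catch let error as NSError {
    print(error.localizedDescription)
}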

Very quiet audio samples from AVCaptureMultiCamSession after disconnecting Lightning earphones

I've encountered an issue with some AVFoundation video and audio capture code that only happens when changing AVCaptureSession() to AVCaptureMultiCamSession(), with no other code changes.
It only reproduces intermittently, and the steps are:
1. Boot the app and capture session with no external mic connected.
2. Connect a pair of Apple Lightning earphones.
3. Start recording video and audio - the audio comes from the mic on the Lightning earphones as expected.
4. Unplug the Apple Lightning earphones.
The mic input switches to a built-in mic on the device; however, the average power level of the audio samples coming in via AVCaptureAudioDataOutputSampleBufferDelegate is now much quieter, by about 25-30 dB.
A graph of AVCaptureAudioChannel.averagePowerLevel values, observed inside AVCaptureAudioDataOutputSampleBufferDelegate.captureOutput(_:didOutput:from:), illustrates the drop.
The issue only reproduces intermittently; sometimes after unplugging the Lightning earphones the level coming from the built-in mic is completely normal. It only occurs when instantiating an AVCaptureMultiCamSession instead of an AVCaptureSession; the rest of the code is unchanged.
There is no handling of audio route changes in the app. It uses the default audio device, so the switch to the built-in mic is handled entirely by AVFoundation. The code for setting up the audio capture is:
guard let audioDevice = AVCaptureDevice.default(for: .audio) else {
    fatalError("could not create default audio device")
}
let audioIn = try! AVCaptureDeviceInput(device: audioDevice)
if self.session.canAddInput(audioIn) {
    self.session.addInput(audioIn)
}

let audioOut = AVCaptureAudioDataOutput()
let audioCaptureQueue = DispatchQueue(label: "com.test.capturepipeline.audio", attributes: [])
audioOut.setSampleBufferDelegate(self, queue: audioCaptureQueue)
if self.session.canAddOutput(audioOut) {
    self.session.addOutput(audioOut)
}
self.audioConnection = audioOut.connection(with: .audio)

if let aSettings = audioOut.recommendedAudioSettingsForAssetWriter(writingTo: .mov) {
    self.audioCompressionSettings = aSettings as? [String: Any]
}
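Not part of the original post, but one way to confirm what the route actually becomes at the moment the earphones are unplugged is to log AVAudioSession route changes; a small diagnostic sketch:

import AVFoundation

// Diagnostic only: print the change reason and the new input route
// every time the audio route changes (e.g. on unplugging earphones).
let observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { note in
    let reason = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt
    print("Route change reason:", reason ?? 0)
    print("Current inputs:", AVAudioSession.sharedInstance().currentRoute.inputs)
}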
One theory is that this is a bug in the audio beamforming that AVFoundation configures internally. It's strange how the audio is still there but is just very quiet. Maybe AVFoundation's weighted processing of audio samples from the multiple built-in microphones, done for beamforming, is somehow missing some audio data, resulting in a very quiet final signal?
Perhaps someone else has encountered similar sorts of problems with AVCaptureMultiCamSession? It's new in iOS 13 and is only supported on fairly recent hardware.
Any ideas?
Thanks!

Using the default audio output for AVAudioEngine in iOS

I'm trying to create a basic iOS application that lets me record audio and then play it back at different speeds and pitches.
I'm using AVAudioPlayer to play the recorded audio either slower or faster, and it works as expected. The sound is played back in my headphones.
I'm using AVAudioEngine to play the recorded audio at a different pitch, and it works, except that the audio is played back on my Thunderbolt Display speakers, not my headphones.
I've been going through the documentation to understand this behavior but have come up short. In my view controller's viewDidLoad method, I've set up the audio session as follows:
var error: NSError?
let session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayback, error: &error)
session.setActive(true, error: &error)
Further down, I've defined a function that plays the sound through the audio engine:
let playerNode = AVAudioPlayerNode()
audioEngine.attachNode(playerNode)
let changePitchNode = AVAudioUnitTimePitch()
changePitchNode.pitch = 1000
audioEngine.attachNode(changePitchNode)
audioEngine.connect(playerNode, to: changePitchNode, format: audioFile.processingFormat)
audioEngine.connect(changePitchNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
playerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
audioEngine.startAndReturnError(nil)
playerNode.play()
audioEngine and audioFile are declared at the class level. The whole project can be found at https://github.com/KevinSjoberg/pitch-perfect.
Can anyone shed some light on why it insists on playing the sound through my monitor speakers instead of my headphones?
I've figured it out. The reason this was happening was that I was using the Thunderbolt Display microphone to record my voice. For some unknown reason, this makes AVAudioEngine use the monitor speakers for output. Switching the input source to my MacBook Pro's microphone made everything work as expected.
I'm only guessing, but it does sound like a bug, so I've submitted a report to Apple.
