App doesn't route audio to headphones (initially) - iOS

I have a VoIP app implemented using the Sinch SDK and CallKit. Everything works fine except when the device has headphones plugged in: in that case, when the call starts, audio is still routed through the device's main speaker. If I unplug the headphones and plug them back in during the call, audio is then correctly routed to the headphones.
All I am doing is:
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
    guard let c = self.currentCall else {
        action.fail()
        return
    }
    c.answer()
    self.communicationClient.audioController().configureAudioSessionForCallKitCall()
    action.fulfill()
}
Shouldn't this be taken care of automatically by the OS?

It seems like the Sinch SDK overrides the output audio port. Try running this code just after the audio session has been configured:
do {
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)
} catch {
    print("overrideOutputAudioPort failed: \(error)")
}
If that doesn't work, try configuring the audio session yourself instead of relying on the Sinch SDK, if you can. Replace the configureAudioSessionForCallKitCall call with something like this:
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(
        .playAndRecord,
        mode: .voiceChat,
        options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
} catch {
    print("Unable to activate audio session: \(error)")
}
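Putting the two suggestions together, the CXAnswerCallAction handler from the question might look like the sketch below. This is only a sketch: currentCall and communicationClient are the names used in the question, and the manual session setup replaces the Sinch helper, so it assumes you no longer call configureAudioSessionForCallKitCall.

func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
    guard let call = self.currentCall else {
        action.fail()
        return
    }
    call.answer()

    let session = AVAudioSession.sharedInstance()
    do {
        // Voice-chat category/mode for a CallKit call.
        try session.setCategory(
            .playAndRecord,
            mode: .voiceChat,
            options: [.allowBluetooth, .allowBluetoothA2DP])
        // .none clears any speaker override so audio follows the
        // current route (e.g. plugged-in headphones).
        try session.overrideOutputAudioPort(.none)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }

    action.fulfill()
}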

Related

WebRTC audio call voice not heard on iOS with mobile data but works with a WiFi connection

This scenario fails every time on an audio call.
If the sender uses a WiFi network and the receiver uses a mobile data (3G/4G) network, the receiver can hear the sender's voice, but the sender cannot hear the receiver's voice.
The peer connection is established successfully on both the sender and receiver sides:
"peerConnection new connection state: checking"
"peerConnection new connection state: connected"
We are also using STUN and TURN servers, e.g. "stun:stun.l.google.com:19302" and config.iceServers = [RTCIceServer(urlStrings: ["turn:numb.viagenie.ca:3478"], username: "username#xyz.com", credential: "#password")].
Please find the remote audio stream code below:
private func configureAudioSession() {
    self.rtcAudioSession.lockForConfiguration()
    do {
        try self.rtcAudioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try self.rtcAudioSession.setMode(AVAudioSessionModeVoiceChat)
    } catch {
        print("Error setting configuration: \(error.localizedDescription)")
    }
    self.rtcAudioSession.unlockForConfiguration()
}
private func createMediaSenders() {
    let streamId = "XYZ"
    let audioTrack = self.createAudioTrack()
    self.peerConnection.add(audioTrack, streamIds: [streamId])
}
I'm following these three integration steps, and it works fine in my application (sketched after the steps):
1. Configure the WebRTC audio session to use manual audio and disable audio:
   RTCAudioSession.sharedInstance().useManualAudio = true
   RTCAudioSession.sharedInstance().isAudioEnabled = false
2. In your CXProvider delegate's provider(_:didActivate:) method, call RTCAudioSession.sharedInstance().audioSessionDidActivate with the AVAudioSession from the CXProvider, then enable audio: RTCAudioSession.sharedInstance().isAudioEnabled = true
3. In your CXProvider delegate's provider(_:didDeactivate:) method, call RTCAudioSession.sharedInstance().audioSessionDidDeactivate with the AVAudioSession from the CXProvider.
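Here is a minimal sketch of those three steps in a CXProviderDelegate, assuming the WebRTC iOS SDK's RTCAudioSession API; the class name and the rest of the delegate wiring are assumptions, only the calls named in the steps above are used.

import AVFoundation
import CallKit
import WebRTC

final class CallAudioController: NSObject, CXProviderDelegate {

    override init() {
        super.init()
        // Step 1: take manual control of WebRTC's audio session and keep audio disabled.
        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false
    }

    func providerDidReset(_ provider: CXProvider) { }

    // Step 2: hand CallKit's activated session to WebRTC, then enable audio.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = true
    }

    // Step 3: tell WebRTC the session was deactivated.
    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    }
}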

iOS - AudioKit Crashes when receiving a phone call

AudioKit 4.9.3
iOS 11+
I am working on a project where the user records on the device using the microphone, and recording continues even if the app is in the background. This works fine, but when receiving a phone call I get an AudioKit error. I assume it has something to do with the phone taking over the mic. Here is the error:
[avae] AVAEInternal.h:109
[AVAudioEngineGraph.mm:1544:Start: (err = PerformCommand(*ioNode,
kAUStartIO, NULL, 0)): error 561017449
AudioKit+StartStop.swift:restartEngineAfterRouteChange(_:):198:error
restarting engine after route change
Basically, everything that I have recorded up until that point is lost.
Here is my AudioKit setup code:
func configureAudioKit() {
    AKSettings.audioInputEnabled = true
    AKSettings.defaultToSpeaker = true
    do {
        try audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: AVAudioSession.CategoryOptions.mixWithOthers)
        try audioSession.setActive(true)
        audioSession.requestRecordPermission({ allowed in
            DispatchQueue.main.async {
                if allowed {
                    print("Audio recording session allowed")
                    self.configureAudioKitSession()
                } else {
                    print("Audio recording session not allowed")
                }
            }
        })
    } catch let error {
        print("Audio recording session not allowed: \(error.localizedDescription)")
    }
}
func configureAudioKitSession() {
    isMicPresent = AVAudioSession.sharedInstance().isInputAvailable
    if !isMicPresent {
        return
    }
    print("mic present and configuring audio session")
    mic = AKMicrophone()
    do {
        let _ = try AKNodeRecorder(node: mic)
        let recorderGain = AKBooster(mic, gain: 0)
        AudioKit.output = recorderGain
        //try AudioKit.start()
    } catch let error {
        print("configure audioKit error: ", error)
    }
}
And here is the code when tapping the record button:
do {
    audioRecorder = try AVAudioRecorder(url: actualRecordingPath, settings: audioSettings)
    audioRecorder?.record()
    //print("Recording: \(isRecording)")
    do {
        try AudioKit.start()
    } catch let error {
        print("Cannot start AudioKit", error.localizedDescription)
    }
} catch let error {
    print("Cannot start recording", error.localizedDescription)
}
Current audio settings:
private let audioSettings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
]
What can I do to ensure that I can get a proper recording, even when receiving a phone call? The error happens as soon as you receive the call - whether you choose to answer it or decline.
Any thoughts?
I've done work in this area; I'm afraid you cannot access the microphone(s) while a phone call or a VoIP call is in progress.
This is a basic privacy measure enforced by iOS for self-evident reasons.
AudioKit only handles basic route change handling for an audio playback app. We've found that once an app becomes sufficiently complex, the framework can't effectively predetermine the appropriate course of action when interruptions occur. So I would suggest turning off AudioKit's route change handling and responding to the notifications yourself, as sketched below.
Also, I would suggest putting the AudioKit activation code behind a button.
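If you go the manual route, here is a minimal sketch of observing the AVAudioSession interruption and route change notifications yourself; the class and method names are hypothetical, and you would fill in the stop/restart logic for your own recording chain.

import AVFoundation

final class RecordingInterruptionObserver: NSObject {

    func startObserving() {
        let center = NotificationCenter.default
        center.addObserver(self,
                           selector: #selector(handleInterruption(_:)),
                           name: AVAudioSession.interruptionNotification,
                           object: nil)
        center.addObserver(self,
                           selector: #selector(handleRouteChange(_:)),
                           name: AVAudioSession.routeChangeNotification,
                           object: nil)
    }

    @objc private func handleInterruption(_ notification: Notification) {
        guard let info = notification.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

        switch type {
        case .began:
            // A phone call (or other interruption) started: stop and save the
            // recording here so the audio captured so far is not lost.
            break
        case .ended:
            // Interruption ended: reactivate the session and restart the engine
            // yourself instead of relying on AudioKit's automatic handling.
            break
        @unknown default:
            break
        }
    }

    @objc private func handleRouteChange(_ notification: Notification) {
        // Inspect AVAudioSessionRouteChangeReasonKey and decide whether the
        // recording chain needs to be rebuilt or restarted.
    }
}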

AVFoundation adding audio input mutes audio playback

I have an app that records audio and video using AVFoundation. I want audio playback from other apps (or the system) to continue while my app is recording audio, but when I add the audio input to the session, audio that is already playing from another app gets muted.
The corresponding code looks like this:
// set up audio device
if m_audioenabled {
    print("Audio enabled")
    if self.m_audioDevice != nil {
        do {
            let audioinput = try AVCaptureDeviceInput(device: self.m_audioDevice!)
            if self.m_session.canAddInput(audioinput) {
                self.m_session.addInput(audioinput)
            }
        } catch _ {
            print("failed adding audio capture device as input to capture session")
        }
    }
    m_audioDataOutput = AVCaptureAudioDataOutput()
    m_audioDataOutput?.setSampleBufferDelegate(self, queue: self.m_captureSessionQueueAudio)
    if self.m_session.canAddOutput(m_audioDataOutput!) {
        self.m_session.addOutput(m_audioDataOutput!)
    }
}
If I comment out the call to canAddInput(...), the audio keeps playing; when I call it, audio playback gets muted.
How can I disable that behavior?
Please note that migrating to another audio API is not an option.
Sounds like your AVAudioSession is non-mixable (the default, I think). Try activating a mixable audio session before you set up the AVCaptureSession:
let session = AVAudioSession.sharedInstance()
try! session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.mixWithOthers])
try! session.setActive(true)
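The same idea with the current AVAudioSession API might look like this. This is only a sketch; the extra automaticallyConfiguresApplicationAudioSession line is an assumption about the m_session capture session from the question and is meant to keep it from replacing your category when the audio input is added.

let session = AVAudioSession.sharedInstance()
do {
    // A mixable play-and-record category lets other apps' audio keep playing.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.mixWithOthers])
    try session.setActive(true)
} catch {
    print("Failed to activate mixable audio session: \(error)")
}

// Optional: stop the AVCaptureSession from reconfiguring the audio session itself.
m_session.automaticallyConfiguresApplicationAudioSession = false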

How to disable speakerphone in TwilioVoice iOS (Twilio Programmable Voice)

When a call is in progress (sent or received), the speakerphone is already engaged, and the speakerphone button does not react to presses to turn it off.
Is there a way to toggle it in code or enable it to be toggled in the UI? I believe this UI is Apple's core audio phone call UI.
This happens with Twilio's quickstart demo code from here:
https://github.com/twilio/voice-quickstart-swift
Try the following to force output to the speaker:
do {
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(.speaker)
} catch {
    print("could not set output to speaker: \(error)")
}
SWIFT 3.0
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, mode: AVAudioSessionModeVoiceChat, options: .mixWithOthers)
try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)
try AVAudioSession.sharedInstance().setActive(true)
Override the output port to .none.
SWIFT 4
try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.voiceChat, options: .mixWithOthers)
try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)
try AVAudioSession.sharedInstance().setActive(true)
You can try the below code snippet:
func moveToVoiceCall() {
    audioDevice.block = {
        DefaultAudioDevice.DefaultAVAudioSessionConfigurationBlock()
        do {
            try AVAudioSession.sharedInstance().setMode(.voiceChat)
            try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)
        } catch {
            print(error)
        }
    }
    audioDevice.block()
}

AVAudioEngine stops device's background audio

I am using an AVAudioEngine object that has many AVAudioPlayerNodes attached to it. The audio all works fine, except it stops any audio that the iPhone is playing in the background (i.e. from iTunes or another music app).
When my app is opened, it stops any other background audio. Is there a way to allow background audio to continue to play? Even when my app is using AVAudioPlayerNodes to play audio itself?
The Music app has its own audio session, which stops your audio engine; I had that problem too. Restart your engine after the Music app takes over:
func stepB_startEngine() {
    if engine.isRunning == false {
        do {
            try engine.start()
        } catch let error {
            print(error)
        }
    }
}
Set up the audio session as well:
func setUpAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.ambient, options: .mixWithOthers)
        try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
    } catch let error {
        print(error)
    }
}
