My app allows use of HFP (Hands-Free Profile) for its spoken prompts (like a navigation app).
The function below, which sets up audio before using text-to-speech or AVAudioPlayer, has worked fairly well since iOS 9.x.
I haven't tested against a running podcast very often, so I'm not sure when things broke. The function below works perfectly if music is streaming from the phone over Bluetooth A2DP, or if music is playing on the car's FM radio (i.e. it will interrupt the radio, play the prompt, and then resume the radio). But it does NOT work if I'm streaming a podcast: the podcast pauses, silence plays where the prompt should be, and then the podcast resumes.
I recently checked Waze, Google Maps, and Apple Maps (all of which also offer this HFP option).
Waze is broken (but again, I don't test against podcasts often).
Google Maps still works perfectly.
Apple Maps is just weird: the HFP option is greyed out while streaming, and when it tries to pause and play it also fails.
But again, Google Maps works, so it can be done.
When I call setPreferredInput with the HFP route, my route change handler (also shown below) is NOT called if a podcast is streaming. If music is streaming, my route change handler is called and audio from my app comes over HFP correctly.
Background or Foreground doesn't matter.
Any suggestions to solve would be greatly appreciated.
func setupSound(_ activate: Bool)
{
    if !Settings.sharedInstance.soundOn && !Settings.sharedInstance.voiceOn
    {
        return
    }
    var avopts: AVAudioSessionCategoryOptions = [
        .mixWithOthers,
        .duckOthers,
        .interruptSpokenAudioAndMixWithOthers,
        .allowBluetooth
    ]
    if #available(iOS 10.0, *)
    {
        avopts.insert(.allowBluetoothA2DP)
        avopts.insert(.allowAirPlay)
    }
    var HFP = false
    if Settings.sharedInstance.HFPOn && callCenter.currentCalls == nil
    {
        do
        {
            if #available(iOS 10.0, *)
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, mode: AVAudioSessionModeSpokenAudio, options: avopts)
            }
            else
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: avopts)
                try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeSpokenAudio)
            }
        }
        catch
        {
            Settings.vprint("Failed Setup HFP Bluetooth Audio, ", error.localizedDescription)
        }
        if let inputs = AVAudioSession.sharedInstance().availableInputs
        {
            for route in inputs where route.portType == AVAudioSessionPortBluetoothHFP
            {
                do
                {
                    try AVAudioSession.sharedInstance().setPreferredInput(route)
                    // Only mark HFP active once setPreferredInput succeeds.
                    HFP = true
                    break
                }
                catch
                {
                    Settings.vprint("Failed Set Route HFP Bluetooth Audio, ", error.localizedDescription)
                }
            }
        }
    }
    lastHFPStatus = HFP
    if !HFP
    {
        var avopts: AVAudioSessionCategoryOptions = [
            .mixWithOthers
        ]
        if Settings.sharedInstance.duckingSoundOn
        {
            avopts.insert(.duckOthers)
            avopts.insert(.interruptSpokenAudioAndMixWithOthers)
        }
        if Settings.sharedInstance.speakerOnlyOn
        {
            avopts.insert(.defaultToSpeaker)
            do
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: avopts)
            }
            catch
            {
                Settings.vprint("Failed setCategory, ", error.localizedDescription)
            }
        }
        else
        {
            do
            {
                try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: avopts)
            }
            catch
            {
                Settings.vprint("Failed setCategory, ", error.localizedDescription)
            }
        }
    }
    do
    {
        try AVAudioSession.sharedInstance().setActive(activate, with: .notifyOthersOnDeactivation)
        if (Settings.sharedInstance.debugMask & 2) != 0
        {
            Settings.vprint(activate)
        }
    }
    catch
    {
        Settings.vprint("Could not setActive, ", error.localizedDescription)
    }
}
@objc func handleRouteChange(notification: Notification)
{
    Settings.vprint(notification)
    let currentRoute = AVAudioSession.sharedInstance().currentRoute
    for route in currentRoute.outputs
    {
        Settings.vprint("Change Output: ", route.portName)
    }
    for route in currentRoute.inputs
    {
        Settings.vprint("Change Input: ", route.portName)
    }
}
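For completeness, the handler above is assumed to be registered for the audio session's route change notification along these lines (a sketch; the selector matches the handler shown above, and the pre-iOS-12 notification name matches the Swift version of the rest of the code):

```swift
// Sketch: register for route change notifications so that
// handleRouteChange(notification:) above gets called.
NotificationCenter.default.addObserver(
    self,
    selector: #selector(handleRouteChange(notification:)),
    name: .AVAudioSessionRouteChange,   // AVAudioSession.routeChangeNotification on newer SDKs
    object: AVAudioSession.sharedInstance())
```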
Related
I am experiencing no haptic output while testing the code below on my physical device (iPhone XR). I followed Apple Developer's "Playing a Single-tap Haptic Pattern" article, as well as double-checked with various other articles on the internet, and I am confident I have implemented it correctly. Moreover, there are no errors that are caught when running. What could be the reason why my device is not outputting the haptic pattern?
Side Notes:
I can confirm that my phone does produce haptics for other apps, so it does not appear to be an issue with my physical device.
I also implement an AVAudioPlayer separately; I doubt that would interfere, but I thought I'd mention it just in case.
Any help would be much appreciated--thanks!
var hapticEngine: CHHapticEngine!
var hapticPlayer: CHHapticPatternPlayer!
override func viewDidLoad() {
    super.viewDidLoad()
    // Create and configure a haptic engine.
    do {
        hapticEngine = try CHHapticEngine()
    } catch let error {
        fatalError("Engine Creation Error: \(error)")
    }
    // The reset handler provides an opportunity to restart the engine.
    hapticEngine.resetHandler = {
        print("Reset Handler: Restarting the engine.")
        do {
            // Try restarting the engine.
            try self.hapticEngine.start()
            // Register any custom resources you had registered, using registerAudioResource.
            // Recreate all haptic pattern players you had created, using createPlayer.
        } catch let error {
            fatalError("Failed to restart the engine: \(error.localizedDescription)")
        }
    }
    // The stopped handler alerts engine stoppage.
    hapticEngine.stoppedHandler = { reason in
        print("Stop Handler: The engine stopped for reason: \(reason.rawValue)")
        switch reason {
        case .audioSessionInterrupt: print("Audio session interrupt")
        case .applicationSuspended: print("Application suspended")
        case .idleTimeout: print("Idle timeout")
        case .systemError: print("System error")
        @unknown default: print("Unknown error")
        }
    }
    // Create haptic dictionary
    let hapticDict = [
        CHHapticPattern.Key.pattern: [
            [
                CHHapticPattern.Key.event: [
                    CHHapticPattern.Key.eventType: CHHapticEvent.EventType.hapticTransient,
                    CHHapticPattern.Key.time: 0.001,
                    CHHapticPattern.Key.eventDuration: 1.0
                ]
            ]
        ]
    ]
    // Create haptic pattern from haptic dictionary, then add it to the haptic player
    do {
        let pattern = try CHHapticPattern(dictionary: hapticDict)
        hapticPlayer = try hapticEngine.makePlayer(with: pattern)
    } catch let error {
        print("Failed to create hapticPlayer: \(error.localizedDescription)")
    }
    // ...
}
// ...
func playHaptic() {
    //audioPlayer?.play()
    // Start Haptic Engine
    do {
        try hapticEngine.start()
    } catch let error {
        print("Haptic Engine could not start: \(error.localizedDescription)")
    }
    // Start Haptic Player
    do {
        try hapticPlayer.start(atTime: 0.0)
        print("Why")
    } catch let error {
        print("Haptic Player could not start: \(error.localizedDescription)")
    }
    // Stop Haptic Engine
    hapticEngine.stop(completionHandler: nil)
}
The problem is this line:
hapticEngine.stop(completionHandler: nil)
Delete it and all will be well. You're stopping your "sound" the very instant it is trying to get started. That doesn't make much sense.
(Of course I am also assuming that somewhere there is code that actually calls your playHaptic method. You didn't show that code, but I'm just guessing that you probably remembered to include some. Having made sure that happens, I ran your code with the stop line commented out, and I definitely felt and heard the "pop" of the taptic tap.)
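If you do want to stop the engine to save power, one option (a sketch, assuming the same `hapticEngine` and `hapticPlayer` properties as in the question) is to ask CoreHaptics to stop the engine only after all players have finished, rather than immediately:

```swift
// Sketch: play the pattern, then let the engine stop itself once
// all players are done, instead of calling stop() right away.
func playHapticThenStop() {
    do {
        try hapticEngine.start()
        try hapticPlayer.start(atTime: CHHapticTimeImmediate)
        // The engine stops only after the transient tap has played.
        hapticEngine.notifyWhenPlayersFinished { _ in
            return .stopEngine
        }
    } catch {
        print("Haptic playback failed: \(error.localizedDescription)")
    }
}
```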
AudioKit 4.9.3
iOS 11+
I am working on a project where the user records on the device using the microphone, and recording continues even while the app is in the background. This works fine, but when a phone call comes in I get an AudioKit error. I assume it has something to do with the phone taking over the mic. Here is the error:
[avae] AVAEInternal.h:109
[AVAudioEngineGraph.mm:1544:Start: (err = PerformCommand(*ioNode,
kAUStartIO, NULL, 0)): error 561017449
AudioKit+StartStop.swift:restartEngineAfterRouteChange(_:):198:error
restarting engine after route change
Basically, everything I have recorded up until that point is lost.
Here is my AudioKit setup code:
func configureAudioKit() {
    AKSettings.audioInputEnabled = true
    AKSettings.defaultToSpeaker = true
    do {
        try audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: AVAudioSession.CategoryOptions.mixWithOthers)
        try audioSession.setActive(true)
        audioSession.requestRecordPermission({ allowed in
            DispatchQueue.main.async {
                if allowed {
                    print("Audio recording session allowed")
                    self.configureAudioKitSession()
                } else {
                    print("Audio recording session not allowed")
                }
            }
        })
    } catch let error {
        print("Audio recording session not allowed: \(error.localizedDescription)")
    }
}
func configureAudioKitSession() {
    isMicPresent = AVAudioSession.sharedInstance().isInputAvailable
    if !isMicPresent {
        return
    }
    print("mic present and configuring audio session")
    mic = AKMicrophone()
    do {
        let _ = try AKNodeRecorder(node: mic)
        let recorderGain = AKBooster(mic, gain: 0)
        AudioKit.output = recorderGain
        //try AudioKit.start()
    } catch let error {
        print("configure audioKit error: ", error)
    }
}
And here is the code that runs when tapping the record button:
do {
    audioRecorder = try AVAudioRecorder(url: actualRecordingPath, settings: audioSettings)
    audioRecorder?.record()
    //print("Recording: \(isRecording)")
    do {
        try AudioKit.start()
    } catch let error {
        print("Cannot start AudioKit", error.localizedDescription)
    }
}
Current audio Settings:
private let audioSettings = [
    AVFormatIDKey : Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey : 44100,
    AVNumberOfChannelsKey : 2,
    AVEncoderAudioQualityKey : AVAudioQuality.medium.rawValue
]
What can I do to ensure that I can get a proper recording, even when receiving a phone call? The error happens as soon as you receive the call - whether you choose to answer it or decline.
Any thoughts?
I've done work in this area; I'm afraid you cannot access the microphone(s) while a phone call or a VoIP call is in progress.
This is a basic privacy measure that is enforced by iOS for self-evident reasons.
AudioKit handles only the basic route change handling for an audio playback app. We've found that when an app becomes sufficiently complex, the framework can't effectively predestine the appropriate course of action when interruptions occur. So, I would suggest turning off AudioKit's route change handling and respond to the notifications yourself.
Also, I would suggest putting the AudioKit activation code behind a button.
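Responding to the notifications yourself could look something like the sketch below, which observes the audio session's interruption notification; `stopRecording()` and `resumeRecording()` are hypothetical hooks into your own recorder, not AudioKit APIs:

```swift
import AVFoundation

// Hypothetical hooks into your recorder; the names are placeholders.
func stopRecording() { /* finalize and close the current audio file */ }
func resumeRecording() { /* start a fresh recording after the interruption */ }

// Sketch: observe audio session interruptions (e.g. an incoming phone call)
// yourself, instead of relying on automatic route change handling.
func observeInterruptions() {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main) { note in
        guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
        switch type {
        case .began:
            // The call is taking the mic: finalize the file so nothing is lost.
            stopRecording()
        case .ended:
            // Optionally resume into a new file once the call ends.
            resumeRecording()
        @unknown default:
            break
        }
    }
}
```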
I have a VoIP app implemented using the Sinch SDK and CallKit. Everything works fine, except when the device has headphones plugged in. In that case, when the call starts, audio is still routed through the main speaker of the device. If I unplug and plug the headphones back in during the call, audio is then correctly routed to the headphones.
All I am doing is
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
    guard let c = self.currentCall else {
        action.fail()
        return
    }
    c.answer()
    self.communicationClient.audioController().configureAudioSessionForCallKitCall()
    action.fulfill()
}
Shouldn't this be taken care of automatically by the OS?
It seems like the Sinch SDK overrides the output audio port. Try to run this code just after the audio session has been configured:
do {
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(.none)
} catch {
    print("OverrideOutputAudioPort failed: \(error)")
}
If it doesn't work, try to configure the audio session by yourself, instead of relying on Sinch SDK if you can. Replace the configureAudioSessionForCallKitCall call with something like this:
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(
        .playAndRecord,
        mode: .voiceChat,
        options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
} catch {
    print("Unable to activate audio session: \(error)")
}
In my application I am using MPMusicPlayerController.systemMusicPlayer for playing Apple Music songs, and it works fine. But when I play back a Spotify track using playSpotifyURI, it does not work. I have checked the logs but no error shows anywhere.
Scenario
Step 1. Play a track using playSpotifyURI. It plays fine:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Step 2. Stop the track using:
SPTAudioStreamingController.sharedInstance().setIsPlaying(false, callback: { (error) in
})
Step 3. Play an Apple Music song using MPMusicPlayerController.systemMusicPlayer:
func beginPlayback(itemID: String) {
    if musicPlayerController.playbackState == .playing {
        musicPlayerController.stop()
    }
    //musicPlayerController.setQueue(with: [itemID]) //1324456545
    musicPlayerController.setQueue(with: [itemID])
    musicPlayerController.prepareToPlay { (error) in
        print("prepareToPlay----------------")
    }
    musicPlayerController.play()
}
Step 4. Stop the Apple Music song using:
if musicPlayerController.playbackState == .playing {
    musicPlayerController.stop()
}
Step 5. Play a track again using playSpotifyURI with the code below. It does not play, and I can't find any error:
SPTAudioStreamingController.sharedInstance().playSpotifyURI(itemID, startingWith: 0, startingWithPosition: 0) { error in
    if error != nil {
        print("*** failed to play: \(String(describing: error))")
        return
    } else {
        print("Playing!!")
    }
}
Is there any issue in the above code? Please help me solve this. Any help will be appreciated.
I need to toggle between the iPhone's earpiece and bottom speaker during an audio call (using the TwilioVideo SDK for the connection).
My code:
let audioSession = AVAudioSession.sharedInstance()
do {
    if isSpeaker == false {
        try audioSession.overrideOutputAudioPort(.speaker)
        isSpeaker = true
    } else {
        try audioSession.overrideOutputAudioPort(.none)
        isSpeaker = false
    }
    try audioSession.setActive(true)
} catch {
    handleError(error.localizedDescription)
}
It runs without any exceptions, but it doesn't change the audio output.
Twilio developer evangelist here.
You should not use AVAudioSession APIs directly with Twilio Video. Instead, you should use the TVIAudioController and set the audioOutput property to one of the options enumerated in TVIAudioOutput.
TVIAudioController.sharedController().audioOutput = .TVIAudioOutputVideoChatSpeaker
Let me know if that helps.