AVAudioEngine() Playback Not Working - iOS

I'm trying to change the pitch of a sound using AVAudioEngine in Swift. This is my code:
func setUpEngine() {
    let fileString = NSBundle.mainBundle().pathForResource("400", ofType: "wav")
    let url = NSURL(fileURLWithPath: fileString!)
    do {
        try audioFile = AVAudioFile(forReading: url)
        print("done")
    }
    catch {
    }
}
var engine = AVAudioEngine()
var audioFile = AVAudioFile()
var audioPlayerNode = AVAudioPlayerNode()
var changeAudioUnitTime = AVAudioUnitTimePitch()

override func viewDidLoad() {
    setUpEngine()
    let defaults = NSUserDefaults.standardUserDefaults()
    audioPlayerNode.stop()
    engine.stop()
    engine.reset()
    engine.attachNode(audioPlayerNode)
    changeAudioUnitTime.pitch = 800
    engine.attachNode(changeAudioUnitTime)
    engine.connect(audioPlayerNode, to: changeAudioUnitTime, format: nil)
    engine.connect(changeAudioUnitTime, to: engine.outputNode, format: nil)
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    engine.startAndReturnError(nil)
    audioPlayerNode.play()
The rest of my code is below (I do close the brackets).
I found most of this code online and I get an error with the line
engine.startAndReturnError(nil)
'Value of type has no member'.
When I remove this line I get the following error:
AVAudioPlayerNode.mm:333: Start: required condition is false: _engine->IsRunning()
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: _engine->IsRunning()'
Any help would be greatly appreciated. I am using Swift in Xcode, in a Single View Application.

The error is that the engine is not running. You need to reorder your operations like this...
setUpEngine()
let defaults = NSUserDefaults.standardUserDefaults()
engine.attachNode(audioPlayerNode)
engine.attachNode(changeAudioUnitTime)
engine.connect(audioPlayerNode, to: changeAudioUnitTime, format: nil)
engine.connect(changeAudioUnitTime, to: engine.outputNode, format: nil)
changeAudioUnitTime.pitch = 800
engine.prepare()
try engine.start()
audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
audioPlayerNode.play()
Some time later...
engine.stop()

This is because you are running either an outdated version of Xcode or an incompatible SDK. I had this issue; in current Swift versions there is no such method. It was replaced by the throwing start().
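For reference, a minimal sketch of the modern equivalent (Swift 2 and later), reusing the property names from the question; start() throws, so it needs try inside a do/catch:

engine.prepare()
do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}
audioPlayerNode.play()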

Related

Audio is not playing Xcode 10.1 Swift 4

I finally started learning how to play sounds, but I'm not succeeding in playing them in my app. I'm still on Xcode 10.1/Swift 4 (not 4.2), as I'm not able to upgrade to Mojave/Catalina yet.
I read quite a few posts about .setCategory(.playback, mode: .default) carrying a bug in Xcode 10.1, but the solutions I found are for Swift 4.2. Also, category: and mode: are expected to be of type String, but in the docs the function is declared as:
func setCategory(_ category: AVAudioSession.Category,
                 mode: AVAudioSession.Mode,
                 options: AVAudioSession.CategoryOptions = []) throws
and they're not of type String. I'm a bit lost here.
My code doesn't produce any compile error, but at runtime I get this on the console:
AVAudioSessionUtilities.mm:106:getUInt32: -- Category Value Converter failed to find a match for string "ambient"
What am I missing here? Can you please point me in the right direction to understand this category problem?
As always thank you very much for your time and help.
This is the function that should play sounds:
static func playOnceSound(soundToPlay: String) {
    var player: AVAudioPlayer?
    guard let url = Bundle.main.url(forResource: soundToPlay, withExtension: "mp3") else { return }
    do {
        // try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default) // Error: Type 'String' has no member 'playback'
        try AVAudioSession.sharedInstance().setCategory("ambient", mode: "default", options: .defaultToSpeaker)
        try AVAudioSession.sharedInstance().setActive(true)
        /* The following line is required for the player to work on iOS 11. Change the file type accordingly. */
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileType.mp3.rawValue)
        /* iOS 10 and earlier require the following line:
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileTypeMPEGLayer3) */
        guard let player = player else { return }
        player.numberOfLoops = 1
        player.volume = 1.0
        player.play()
    } catch let error {
        print(error.localizedDescription)
    }
}
Have you tried passing ambient and default this way, instead of using strings? Also, try instantiating the audio session object before calling setCategory:
var audioSession: AVAudioSession?

func setup() {
    audioSession = AVAudioSession.sharedInstance()
    do {
        // audioSession is optional, so it needs optional chaining (or unwrapping) here
        try audioSession?.setCategory(AVAudioSession.Category.ambient, mode: .default, options: .defaultToSpeaker)
        try audioSession?.setActive(true)
    }
    catch {
        print("Failed to set category", error.localizedDescription)
    }
}
@IBAction func doPlay1(_ sender: AnyObject) {
    do {
        self.audioPlayer = try AVAudioPlayer(contentsOf: URL(string: url)!)
        self.audioPlayer.prepareToPlay()
        self.audioPlayer.delegate = self
        self.audioPlayer.play()
    } catch {
        print(error.localizedDescription)
    }
}
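If you are stuck on the Swift 4.0/4.1 toolchain, where the AVAudioSession.Category overload is unavailable, a sketch using the exported String constants instead of bare literals; the runtime converter appears to expect the full constant value (e.g. "AVAudioSessionCategoryAmbient"), which is why the literal "ambient" fails. (.defaultToSpeaker is documented only for the playAndRecord category, so it is omitted here.)

do {
    // AVAudioSessionCategoryAmbient is the exported String constant; passing it
    // instead of the bare literal "ambient" satisfies the category converter.
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient,
                                                    mode: AVAudioSessionModeDefault,
                                                    options: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session setup failed: \(error.localizedDescription)")
}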
So I just set it in the AppDelegate's didFinishLaunchingWithOptions as:
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch let error1 as NSError {
    error = error1
    print("could not set session. err: \(error!.localizedDescription)")
}
do {
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error1 as NSError {
    error = error1
    print("could not activate session. err: \(error!.localizedDescription)")
}
and rewrote the function as:
static func playOnceSound(soundToPlay: String) {
    if Sounds.player?.isPlaying == true { // optional chaining avoids a crash when no player exists yet
        Sounds.player!.stop()
    }
    let url = URL(fileURLWithPath: Bundle.main.path(forResource: soundToPlay, ofType: "mp3")!)
    var error: NSError?
    do {
        Sounds.player = try AVAudioPlayer(contentsOf: url)
    } catch let error1 as NSError {
        error = error1
        Sounds.player = nil
    }
    if let err = error {
        print("audioPlayer error \(err.localizedDescription)")
        return
    } else {
        Sounds.player!.prepareToPlay()
    }
    // a negative number means loop indefinitely
    Sounds.player!.numberOfLoops = 0
    Sounds.player!.volume = Sounds.volume
    Sounds.player!.play()
}
Thanks for your help, and I hope this will be of help to others.
Cheers.

How can I play multiple different audio files from the same AVAudioEngine? (Error: !nodeimpl->HasEngineImpl())

In my app I want to play a different audio file every time a different cell in the table view is pressed. Below is my implementation, but I keep receiving the error:
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: !nodeimpl->HasEngineImpl()'
I think this error means that it is crashing because the audio engine already contains the node. But I'm not sure how to fix it in my case.
var audioPlayerFile: AVAudioFile!
var audioEngine = AVAudioEngine()
var pitchPlayer = AVAudioPlayerNode()
var timePitch = AVAudioUnitTimePitch()
var delay = AVAudioUnitDelay()

override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
    pitchPlayer.stop()
    audioEngine.stop()
    audioEngine.reset()
    let filePathResource = myConversation.conversation[indexPath.row].fileName
    print("file:", filePathResource)
    if filePathResource != nil {
        setUpAudioFilePath(filePathRes: filePathResource!)
    }
    timePitch.pitch = 0
    delay.delayTime = 0
    // append more effects:
    audioEngine.connect(pitchPlayer, to: timePitch, format: audioPlayerFile.processingFormat)
    audioEngine.connect(timePitch, to: delay, format: audioPlayerFile.processingFormat)
    audioEngine.connect(delay, to: audioEngine.outputNode, format: audioPlayerFile.processingFormat)
    pitchPlayer.scheduleFile(audioPlayerFile, at: nil, completionHandler: nil)
    do { try audioEngine.start() }
    catch {
        print("Error: Starting Audio Engine")
    }
    pitchPlayer.play()
}

func setUpAudioFilePath(filePathRes: String) {
    if let filePath = Bundle.main.path(forResource: filePathRes, ofType: "m4a") {
        let filePathUrl = NSURL.fileURL(withPath: filePath)
        do { audioPlayerFile = try AVAudioFile(forReading: filePathUrl) }
        catch {
            print("Error: Reading Audio Player File")
        }
        audioEngine.attach(pitchPlayer)
        audioEngine.attach(timePitch)
        audioEngine.attach(delay)
    } else {
        print("Error: filePath is empty")
    }
}
I think you need a mixer for playing multiple audio files.
AVAudioEngine already has a mixer, mainMixerNode, which is implicitly connected to the output node.
Try this:
audioEngine.connect(pitchPlayer, to: timePitch, format: audioPlayerFile.processingFormat)
audioEngine.connect(timePitch, to: delay, format: audioPlayerFile.processingFormat)
audioEngine.connect(delay, to: audioEngine.mainMixerNode, format: audioPlayerFile.processingFormat)
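The !nodeimpl->HasEngineImpl() assertion itself fires when a node that already belongs to an engine is attached again, so it should also help to move the attach calls out of setUpAudioFilePath and run them exactly once, for example in viewDidLoad; a sketch under that assumption:

override func viewDidLoad() {
    super.viewDidLoad()
    // Attach each node exactly once; re-attaching on every row selection
    // is what trips the HasEngineImpl assertion.
    audioEngine.attach(pitchPlayer)
    audioEngine.attach(timePitch)
    audioEngine.attach(delay)
}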

How to set an AVAudio3DMixingRenderingAlgorithm for binaural mixing

I am making an iOS app and am trying to mix sounds into a binaural mix.
I have
var engine = AVAudioEngine()
var environmentMixer = AVAudioEnvironmentNode()

func initSound() {
    let player = AVAudioPlayerNode()
    let url = NSBundle.mainBundle().URLForResource("sound", withExtension: "wav")!
    let f = try! AVAudioFile(forReading: url)
    engine.attachNode(environmentMixer)
    let format = AVAudioFormat(standardFormatWithSampleRate: engine.outputNode.outputFormatForBus(0).sampleRate, channels: 2)
    engine.connect(environmentMixer, to: engine.outputNode, format: format)
    engine.attachNode(player)
    engine.connect(player, to: environmentMixer, format: f.processingFormat)
    //
    // somewhere here I guess I should set the rendering algorithm...
    //
    player.scheduleFile(f, atTime: nil, completionHandler: { print("done") })
    player.position = AVAudio3DPoint(x: 0.5, y: 0.25, z: 0)
    engine.prepare()
    do {
        try engine.start()
        player.play()
    } catch {}
}
but how do I set the environmentMixer to a different renderingAlgorithm?
This isn't it:
environmentMixer.renderingAlgorithm = AVAudio3DMixingRenderingAlgorithmHRTF
Xcode gives an error:
"Use of unresolved identifier 'AVAudio3DMixingRenderingAlgorithmHRTF'"
But what is?
(The above code snippet does play sound and places it somewhere in the stereo field... but it is clearly audible that it uses the default AVAudio3DMixingRenderingAlgorithmEqualPowerPanning)
I'm not sure what that algorithm should sound like, but you can set it like this:
environmentMixer.renderingAlgorithm = .HRTF
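If setting it on the environment node does not audibly change anything, note that renderingAlgorithm comes from the AVAudio3DMixing protocol and is normally set per source node feeding the environment node, and that the HRTF algorithm is only applied to mono inputs (stereo sources bypass 3D rendering). A sketch under those assumptions, using the names from the question:

// Connect the player with a mono format so the 3D algorithm is applied;
// stereo inputs to an AVAudioEnvironmentNode are passed through unprocessed.
let mono = AVAudioFormat(standardFormatWithSampleRate: f.processingFormat.sampleRate, channels: 1)
engine.connect(player, to: environmentMixer, format: mono)
player.renderingAlgorithm = .HRTF // set on the source node, not on the mixer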

Playing an audio file repeatedly with AVAudioEngine

I'm working on an iOS app with Swift and Xcode 6. What I would like to do is play an audio file using an AVAudioEngine, and up to this point everything works. But how can I play it without stopping, that is, so that when it ends playing it starts again?
This is my code:
/* ==================== CONFIGURES THE AVAUDIOENGINE ==================== */
audioEngine.reset() // Resets any previous configuration on the audio engine
let audioPlayerNode = AVAudioPlayerNode() // The node that will play the actual sound
audioEngine.attachNode(audioPlayerNode) // Attaches the node to the audio engine
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil) // Connects the applause playback node to the sound output
audioPlayerNode.scheduleFile(applause.applauseFile, atTime: nil, completionHandler: nil)
audioEngine.startAndReturnError(nil)
audioPlayerNode.play() // Plays the sound
Before telling me that I should use AVAudioPlayer for this: I can't, because later I will have to apply some effects and play three audio files at the same time, also repeatedly.
I found the solution in another question, asked and also self-answered by @CarveDrone, so I've just copied the code he used:
class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)
        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)
        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }
}
The only thing you have to change is the filePath constant. Here is the link to the original answer: Having AVAudioEngine repeat a sound
Swift 5 version, thanks to @Guillermo Barreiro:
var audioEngine: AVAudioEngine = AVAudioEngine()
var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

override func viewDidLoad() {
    super.viewDidLoad()
    guard let filePath: String = Bundle.main.path(forResource: "chimes", ofType: "wav") else { return }
    print("\(filePath)")
    let fileURL: URL = URL(fileURLWithPath: filePath)
    guard let audioFile = try? AVAudioFile(forReading: fileURL) else { return }
    let audioFormat = audioFile.processingFormat
    let audioFrameCount = UInt32(audioFile.length)
    guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }
    do {
        try audioFile.read(into: audioFileBuffer)
    } catch {
        print("Could not read file into buffer: \(error)")
    }
    let mainMixer = audioEngine.mainMixerNode
    audioEngine.attach(audioFilePlayer)
    audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
    try? audioEngine.start()
    audioFilePlayer.play()
    audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: AVAudioPlayerNodeBufferOptions.loops)
}
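If you need a finite number of repeats rather than an infinite loop, one option is to reschedule from the completion handler instead of passing .loops; a sketch, assuming audioFileBuffer has been stored in a property (the scheduleLoop helper is hypothetical, not part of the original answers):

func scheduleLoop(remaining: Int) {
    guard remaining > 0 else { return }
    audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: []) { [weak self] in
        // Fires once the buffer has been consumed; queue the next repeat.
        self?.scheduleLoop(remaining: remaining - 1)
    }
}

// Usage: call scheduleLoop(remaining: 3) before audioFilePlayer.play().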

Swift AVAudioEngine crash: player started when in a disconnected state

So my code below is supposed to replay the chimes.wav file over and over again, with a higher pitch, but crashes with the error at the bottom. Can anyone find what is causing this error?
import UIKit
import AVFoundation

class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        var timePitch = AVAudioUnitTimePitch()
        timePitch.pitch = 2000
        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)
        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.attachNode(timePitch)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
        audioEngine.startAndReturnError(nil)
        audioFilePlayer.play()
        audioFilePlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }
}
2014-11-10 18:34:37.746 windChimes[2350:108235] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'player started when in a disconnected state'
*** First throw call stack:
(0x185f01e48 0x1965f40e4 0x185f01d08 0x1848726c0 0x18489a33c 0x18489975c 0x10009e638 0x10009e858 0x18a6b0e84 0x18a6b0b94 0x18a853ad4 0x18a765310 0x18a7650dc 0x18a76505c 0x18a6ada2c 0x18a005994 0x18a000564 0x18a000408 0x189fffc08 0x189fff98c 0x189ff93bc 0x185eba14c 0x185eb70d8 0x185eb74b8 0x185de51f4 0x18ef7b5a4 0x18a716784 0x1000a54f8 0x1000a5538 0x196c62a08)
libc++abi.dylib: terminating with uncaught exception of type NSException
(lldb)
The statement "player started when in a disconnected state" indicates that there is a problem with the connection chain: either the nodes were not attached to the engine, or the nodes were not linked together properly. Because both the audioFilePlayer and timePitch nodes were attached, my impression is that the problem lies with these two lines:
audioEngine.connect(audioFilePlayer, to:mainMixer, format: audioFileBuffer.format)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
The connection should link all components together:
audioFilePlayer -> timePitch -> audioEngine.mainMixerNode (or outputNode)
So the connection should look like:
audioEngine.connect(audioFilePlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.outputNode, format: audioFile.processingFormat)
I hope this helps.
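Putting it together, a sketch of the corrected setup in modern Swift spelling (same names as the question; scheduling before play() is an assumption based on common practice, not part of the original answer):

audioEngine.attach(audioFilePlayer) // modern spelling of attachNode
audioEngine.attach(timePitch)
audioEngine.connect(audioFilePlayer, to: timePitch, format: audioFile.processingFormat)
audioEngine.connect(timePitch, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: .loops)
do {
    try audioEngine.start() // throwing replacement for startAndReturnError
    audioFilePlayer.play()
} catch {
    print("Engine failed to start: \(error)")
}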
This error can also happen if you connect more than one AVAudioPlayerNode to a node that only takes in one input.
For example:
let playerOne = AVAudioPlayerNode()
let playerTwo = AVAudioPlayerNode()
let reverbEffect = AVAudioUnitReverb()
engine.attach(playerOne)
engine.attach(playerTwo)
engine.attach(reverbEffect)
engine.connect(playerOne, to: reverbEffect, format: format)
engine.connect(playerTwo, to: reverbEffect, format: format)
engine.connect(reverbEffect, to: engine.outputNode, format: format)
An error will now be thrown if you try to play audio with playerOne, because it is no longer connected to any node (its output was implicitly disconnected when we called engine.connect(playerTwo, to: reverbEffect, format: format)).
The fix is simple; connect both your player nodes to an AVAudioMixerNode:
let playerOne = AVAudioPlayerNode()
let playerTwo = AVAudioPlayerNode()
let mixer = AVAudioMixerNode()
let reverbEffect = AVAudioUnitReverb()
engine.attach(playerOne)
engine.attach(playerTwo)
engine.attach(mixer)
engine.attach(reverbEffect)
engine.connect(playerOne, to: mixer, format: format)
engine.connect(playerTwo, to: mixer, format: format)
engine.connect(mixer, to: reverbEffect, format: format)
engine.connect(reverbEffect, to: engine.outputNode, format: format)
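This works because an AVAudioMixerNode has multiple input buses: each connect call to the mixer lands on its own bus, so neither player gets implicitly disconnected.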
