All I want to do (for now) is play a sound file using AVAudioEngine and AVAudioPlayerNode. Here's my code:
//Init engine and player
let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
//Hook them up
audioEngine.attach(audioPlayerNode)
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil) // < error
While executing the last line, an exception is thrown. Nothing is printed to the console and execution continues, but when I set an exception breakpoint I get the following in LLDB:
(lldb) po [$arg1 reason]
error: Execution was interrupted, reason: EXC_BAD_ACCESS (code=1, address=0xffffd593).
The process has been returned to the state before expression evaluation.
What can't be accessed here? I haven't even loaded a file yet... Thanks for any hints.
Environment
Xcode 8.2.1
iPhone 5 running on iOS 10.3.2 (14F89)
Edit
Here is some more contextual information. The code above is part of an iOS game built with SpriteKit and GameplayKit. It is located in a subclass of GKStateMachine, within a method called playSound. This method is invoked in the course of a touch event originating from my subclassed SKScene, called GameScene. Upon touchesBegan, the call is delegated to all entities with a TouchComponent that have a method with the same signature (touchesBegan). These components fire a touchDown event to their delegate, which in turn is my subclass of GKStateMachine, called GameStateMachine. If the touch event is correct with respect to the game rules, the score property of my GameStateMachine is incremented. Within the score setter, the final method playSound is called if the score increases. Here's a sequence diagram of what I just described:
Here's a working example of playing a local file resource in a playground:
import AVFoundation
import PlaygroundSupport
// prevent program from exiting immediately
PlaygroundPage.current.needsIndefiniteExecution = true
let fileURL = Bundle.main.url(forResource: "song", withExtension: "mp3")!
let file = try! AVAudioFile(forReading: fileURL)
let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
audioEngine.attach(audioPlayerNode)
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil)
audioPlayerNode.scheduleFile(file, at: nil, completionHandler: nil)
// need to start the engine before we play
try! audioEngine.start()
audioPlayerNode.play()
This assumes that song.mp3 exists in the playground resources directory.
Apparently, the exception thrown is normal. It occurs under any condition and in any environment I have tested, but does not interfere with normal functionality.
The reason no sound was played is that the AVAudioEngine and AVAudioPlayerNode objects were released as soon as the function returned, because they had no strong references keeping them alive. I fixed the issue by keeping those two objects as properties.
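As a minimal sketch of that fix, assuming a small wrapper class (the SoundPlayer name and the playSound(fileURL:) signature are illustrative, not the game's actual code):
import AVFoundation

// The engine and player node are stored properties, so they stay alive
// after playSound(fileURL:) returns instead of being released immediately.
final class SoundPlayer {
    private let audioEngine = AVAudioEngine()
    private let audioPlayerNode = AVAudioPlayerNode()

    init() {
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil)
    }

    func playSound(fileURL: URL) throws {
        let file = try AVAudioFile(forReading: fileURL)
        audioPlayerNode.scheduleFile(file, at: nil, completionHandler: nil)
        if !audioEngine.isRunning {
            try audioEngine.start()
        }
        audioPlayerNode.play()
    }
}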
Related
We have a requirement for audio processing on the output of AVSpeechSynthesizer. So we started using the write method of the AVSpeechSynthesizer class to apply processing on top of it. What we currently have:
var synthesizer = AVSpeechSynthesizer()
var playerNode: AVAudioPlayerNode = AVAudioPlayerNode()
func play(audioCue: String) {
    let utterance = AVSpeechUtterance(string: audioCue)
    synthesizer.write(utterance, toBufferCallback: { [weak self] buffer in
        // We do our processing here, including conversion from the pcmFormatFloat16
        // format to the pcmFormatFloat32 format, which is supported by AVAudioPlayerNode.
        self?.playerNode.scheduleBuffer(buffer as! AVAudioPCMBuffer, completionCallbackType: .dataPlayedBack)
    })
}
All of it was working fine before iOS 16 but with iOS 16 we started getting this exception:
[AXTTSCommon] TTSPlaybackEnqueueFullAudioQueueBuffer: error -66686 enqueueing buffer
Not sure what this exception means exactly, so we are looking for a way to address it, or maybe a better way of playing the buffers.
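For the format conversion mentioned in the code comment above, here is a minimal sketch using AVAudioConverter; the convertToFloat32 helper name is made up for illustration, and the actual processing step is left out:
import AVFoundation

// Hypothetical helper: convert a PCM buffer (e.g. a Float16 buffer delivered by
// the write callback) into a Float32 buffer that AVAudioPlayerNode can schedule.
func convertToFloat32(_ input: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let outputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                           sampleRate: input.format.sampleRate,
                                           channels: input.format.channelCount,
                                           interleaved: false),
          let converter = AVAudioConverter(from: input.format, to: outputFormat),
          let output = AVAudioPCMBuffer(pcmFormat: outputFormat,
                                        frameCapacity: input.frameCapacity) else {
        return nil
    }
    var error: NSError?
    var delivered = false
    _ = converter.convert(to: output, error: &error) { _, outStatus in
        // Hand the single input buffer to the converter exactly once.
        if delivered {
            outStatus.pointee = .noDataNow
            return nil
        }
        delivered = true
        outStatus.pointee = .haveData
        return input
    }
    return error == nil ? output : nil
}
The returned buffer could then be scheduled on the player node in the same way as in the original callback.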
UPDATE:
Created an empty project for testing, and it turns out that the write method, if called with an empty block, generates these logs:
Code I have used for the Swift project:
let synth = AVSpeechSynthesizer()
let myUtterance = AVSpeechUtterance(string: message)
myUtterance.rate = 0.4
synth.speak(myUtterance)
You can move let synth = AVSpeechSynthesizer() out of this method, declare it at the top of the class, and use it from there.
Settings to enable for Xcode 14 & iOS 16: If you are using Xcode 14 and iOS 16, it may be that no voice under Spoken Content has been downloaded, and you will get an error on the console saying the identifier, source, and content are nil. All you need to do is go to Settings -> Accessibility -> Spoken Content -> Voices, select any language, and download any voice. After this, run your code again and you will be able to hear the speech for the passed text.
It is working for me now.
I have an app used by people to receive orders; it needs to make a continuous sound until staff attend to it. It was working for two months, then it just started crashing a lot. For whatever reason, it runs fine on an iPad but not on iPhones running a recent operating system.
When this bit of code gets called it crashes:
guard let path = Bundle.main.path(forResource: "alert.mp3", ofType: nil) else { return }
let url = URL(fileURLWithPath: path)
do {
self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
} catch let err {
print("err: \(err)")
}
DispatchQueue.main.async {
self.alertSoundEffect.numberOfLoops = -1
self.alertSoundEffect.prepareToPlay()
self.alertSoundEffect.play()
}
The fix suggested online, declaring the alertSoundEffect variable like this:
private var alertSoundEffect : AVAudioPlayer!
has not worked at all.
I tried moving everything but the line:
self.alertSoundEffect.play()
to viewDidLoad as I thought maybe that code couldn't get called more than once, but it didn't help.
Specifically, Xcode highlights this line when it crashes:
self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
I tried the AVAudioPlayer initializer that takes a Data object as a parameter, and the one that also takes the type of audio file to be played, but that did not change anything.
When I try to use AVAudioPlayer's delegate and declare it like this:
self.alertSoundEffect.delegate = self
right before the first lines of code I shared above, Xcode highlights this line instead when it reliably crashes.
What else should I try?
I suppose your path is wrong.
Try this:
guard let path = Bundle.main.path(forResource: "alert", ofType: "mp3") else { return }
Also, if your audio file is short, like less than 30s, then try not to call self.alertSoundEffect.prepareToPlay(). Just call self.alertSoundEffect.play() right away.
Since iOS 13, this was causing a bug in my app, because I have notification sounds that are 3-10 seconds long.
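Putting the path fix and the strong reference together, a minimal sketch (the AlertSoundController wrapper and method names are illustrative; the property name follows the question's code):
import AVFoundation

final class AlertSoundController {
    // Keep a strong reference so the player is not released while it is looping.
    private var alertSoundEffect: AVAudioPlayer?

    func startAlert() {
        // Pass the name and extension separately, as suggested above.
        guard let path = Bundle.main.path(forResource: "alert", ofType: "mp3") else { return }
        let url = URL(fileURLWithPath: path)
        do {
            let player = try AVAudioPlayer(contentsOf: url)
            player.numberOfLoops = -1 // loop until stop() is called
            player.play()
            alertSoundEffect = player
        } catch {
            print("err: \(error)")
        }
    }

    func stopAlert() {
        alertSoundEffect?.stop()
        alertSoundEffect = nil
    }
}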
If you initialise your AVAudioPlayer like var wrongMusicPlayer: AVAudioPlayer = AVAudioPlayer() or wrongMusicPlayer = AVAudioPlayer() in any method, then remove that and just declare it as var wrongMusicPlayer: AVAudioPlayer!.
iOS 13.1 Crash in AVAudio Player
I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time.
I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While it did play multiple sounds simultaneously, it didn't seem like it was capable of starting two sounds at the same time (it seemed like it would delay the second sound just slightly after the first sound was started). Furthermore, if I pressed notes at a very fast rate, it seemed like the engine couldn't keep up, and later sounds would start well after I had pressed the later notes.
I am trying to solve this problem, and from the research I have done, it seems like using the AVAudioEngine to play sounds would be the best method, where I can set up the sounds in an array of buffers, and then have them play back from those buffers.
class ViewController: UIViewController
{
// Main audio engine and its corresponding mixer
var audioEngine: AVAudioEngine = AVAudioEngine()
var mainMixer = AVAudioMixerNode()
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
// Array of filepaths
let noteFilePath: [String] = [
Bundle.main.path(forResource: "note1", ofType: "wav")!,
Bundle.main.path(forResource: "note2", ofType: "wav")!,
Bundle.main.path(forResource: "note3", ofType: "wav")!]
// Array to store the note URLs
var noteFileURL = [URL]()
// One audio file per note
var noteAudioFile = [AVAudioFile]()
// One audio buffer per note
var noteAudioFileBuffer = [AVAudioPCMBuffer]()
override func viewDidLoad()
{
super.viewDidLoad()
do
{
// For each note, read the note URL into an AVAudioFile,
// setup the AVAudioPCMBuffer using data read from the file,
// and read the AVAudioFile into the corresponding buffer
for i in 0...2
{
noteFileURL.append(URL(fileURLWithPath: noteFilePath[i]))
// Read the corresponding url into the audio file
try noteAudioFile.append(AVAudioFile(forReading: noteFileURL[i]))
// Read data from the audio file, and store it in the correct buffer
let noteAudioFormat = noteAudioFile[i].processingFormat
let noteAudioFrameCount = UInt32(noteAudioFile[i].length)
noteAudioFileBuffer.append(AVAudioPCMBuffer(pcmFormat: noteAudioFormat, frameCapacity: noteAudioFrameCount)!)
// Read the audio file into the buffer
try noteAudioFile[i].read(into: noteAudioFileBuffer[i])
}
mainMixer = audioEngine.mainMixerNode
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...2
{
audioEngine.attach(audioFilePlayer[i])
audioEngine.connect(audioFilePlayer[i], to: mainMixer, fromBus: 0, toBus: i, format: noteAudioFileBuffer[i].format)
}
// Start the audio engine
try audioEngine.start()
// Setup the audio session to play sound in the app, and activate the audio session
try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.soloAmbient)
try AVAudioSession.sharedInstance().setMode(AVAudioSession.Mode.default)
try AVAudioSession.sharedInstance().setActive(true)
}
catch let error
{
print(error.localizedDescription)
}
}
func playSound(senderTag: Int)
{
let sound: Int = senderTag - 1
// Set up the corresponding audio player to play its sound.
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, options: .interrupts, completionHandler: nil)
audioFilePlayer[sound].play()
}
}
Each sound should be playing without interrupting the other sounds, only interrupting its own sound when that sound is played again. However, despite setting up multiple buffers and players, and assigning each one to its own bus on the audioEngine's mixer, playing one sound still stops any other sounds from playing.
Furthermore, while leaving out .interrupts does prevent sounds from stopping other sounds, these sounds won't play until the sound that is currently playing completes. This means that if I play note1, then note2, then note3, note1 will play, while note2 will only play after note1 finishes, and note3 will only play after note2 finishes.
Edit: I was able to get the audioFilePlayer to reset to the beginning again without using interrupt with the following code in the playSound function.
if audioFilePlayer[sound].isPlaying == true
{
audioFilePlayer[sound].stop()
}
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, completionHandler: nil)
audioFilePlayer[sound].play()
This still leaves me with figuring out how to play these sounds simultaneously, since playing another sound will still stop the currently playing sound.
Edit 2: I found the solution to my problem. My answer is below.
It turns out that having the .interrupts option wasn't the issue (in fact, this actually turned out to be the best way to restart the sound that was playing in my experience, as there was no noticeable pause during the restart, unlike with the stop() function). The actual problem that was preventing multiple sounds from playing simultaneously was this particular line of code.
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
What happened here was that each item of the array was being assigned the exact same AVAudioPlayerNode instance, so they were all effectively sharing the same AVAudioPlayerNode. As a result, the AVAudioPlayerNode functions were affecting all of the items in the array, instead of just the specified item. To fix this and give each item a distinct AVAudioPlayerNode instance, I ended up changing the above line so that it starts as an empty array of type AVAudioPlayerNode instead.
// One AVAudioPlayerNode per note
var audioFilePlayer = [AVAudioPlayerNode]()
I then added a line at the beginning of the second for-loop in viewDidLoad() that appends a new AVAudioPlayerNode to this array.
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...6
{
audioFilePlayer.append(AVAudioPlayerNode())
// audioEngine code
}
This gave each item in the array a distinct AVAudioPlayerNode instance. Playing a sound or restarting a sound no longer interrupts the other sounds that are currently being played. I can now play any of the notes simultaneously and without any noticeable latency between note press and playback.
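To make the reference-sharing concrete, here is a small standalone check (purely illustrative):
import AVFoundation

// Array(repeating:count:) evaluates its argument once, so every element
// ends up referencing the same AVAudioPlayerNode instance.
let shared = Array(repeating: AVAudioPlayerNode(), count: 3)
print(shared[0] === shared[1]) // true: one shared node

// Appending inside a loop creates a distinct node per element.
var separate = [AVAudioPlayerNode]()
for _ in 0..<3 {
    separate.append(AVAudioPlayerNode())
}
print(separate[0] === separate[1]) // false: independent nodes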
This is a continuation of the discussion here.
I'm building a voice recorder app for iOS in Swift, and I have a custom waveform graphic that I feed data to from an AKFFTTap object. I had a problem where the FFT starts generating all zeros after a while. In order to diagnose and solve this, I'm trying to re-initialize all the nodes and taps whenever the user starts recording (assuming that would solve the issue). Previously, AudioKit was initialized and started when the view was loaded, and that was it.
So, now I try to re-allocate everything on each recording, and it works, except that on every re-recording (so not the first one, but the ones after), the FFT doesn't work. This time it's consistent and reproducible.
So, here's what I'm doing, and if anyone can show me where I'm going wrong, I'll be very grateful:
When recording starts, I'm doing:
mic = AKMicrophone() //needs to be started
fft = AKFFTTap.init(mic) //will start when mic starts
//now, let's define a mixer, and add the mic node to it, and initialize the recorder to it
micMixer = AKMixer(mic)
recorder = try AKNodeRecorder(node: micMixer)
micBooster = AKBooster(micMixer, gain: 0)
AudioKit.output = micBooster
try AudioKit.start()
mic.start()
micBooster.start()
try recorder.record()
When recording stops:
//now go back deallocating stuff
recorder.stop()
micBooster.stop()
micMixer.stop()
mic.stop()
//now set player file to recorder file, since I want to play it later
do {
if let file = recorder.audioFile {
player = try AKAudioPlayer(file: file, looping: false, lazyBuffering: false, completionHandler: playingEnded)
try AudioKit.stop()
} else {
//handle no file error
}
}
catch {
//handle error
}
So, can anyone please help me figure out why the FFT doesn't work the second time around?
Thanks!
I'm new to Swift and iOS programming. I want to load an audio file in Swift 2 to do some signal processing. But I'm not able to load an audio file and store it as an array of floats. Can anybody give me a hint on literature, or even a working code example, on how to load an audio file? I searched the forums, but none of the old topics seems to do the trick.
Here's my Code:
import UIKit
import AVFoundation
var audio: AVAudioPlayer!
let path = NSBundle.mainBundle().pathForResource("nameOfFile", ofType:"wav")!
let url = NSURL(fileURLWithPath: path)
do {
let sound = try AVAudioPlayer(contentsOfURL: url)
audio = sound
} catch {
// couldn't load file
}
error message: fatal error: unexpectedly found nil while unwrapping an Optional value
Thank you in advance (and sorry for the possibly dumb question)
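For the stated goal of getting the samples as an array of floats, here is a minimal sketch using AVAudioFile and AVAudioPCMBuffer; it uses current Swift syntax rather than the Swift 2 API in the question, and "nameOfFile" is just the question's placeholder resource name:
import AVFoundation

// Read a bundled WAV file into a [Float] for signal processing.
func loadSamples(named name: String) -> [Float]? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "wav") else {
        return nil // avoids the force-unwrap crash seen above when the resource is missing
    }
    do {
        let file = try AVAudioFile(forReading: url)
        // processingFormat is de-interleaved Float32.
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: AVAudioFrameCount(file.length)) else {
            return nil
        }
        try file.read(into: buffer)
        guard let channelData = buffer.floatChannelData else { return nil }
        // Copy the first channel's samples into a Swift array.
        return Array(UnsafeBufferPointer(start: channelData[0],
                                         count: Int(buffer.frameLength)))
    } catch {
        return nil
    }
}

// Usage: let samples = loadSamples(named: "nameOfFile")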