AKFFTTap stops working when initialized a second time - AudioKit

This is a continuation of the discussion here.
I'm building a voice recorder app for iOS in Swift, and I have a custom waveform graphic that I feed with data from an AKFFTTap object. The problem was that the FFT started generating all zeros after a while. To diagnose and fix this, I'm trying to re-initialize all the nodes and taps whenever the user starts recording (assuming that would solve the issue). Previously, AudioKit was initialized and started when the view was loaded, and that was it.
So now I re-allocate everything on each recording, and it works, except that on every re-recording (not the first one, but every one after it) the FFT stops working. This time it's consistent and reproducible.
Here's what I'm doing; if anyone can show me where I'm going wrong, I'll be very grateful.
When recording starts, I'm doing:
mic = AKMicrophone() // needs to be started
fft = AKFFTTap(mic) // will start when the mic starts
// now, let's define a mixer, add the mic node to it, and initialize the recorder with it
micMixer = AKMixer(mic)
recorder = try AKNodeRecorder(node: micMixer)
micBooster = AKBooster(micMixer, gain: 0)
AudioKit.output = micBooster
try AudioKit.start()
mic.start()
micBooster.start()
try recorder.record()
When recording stops:
// now go back and tear everything down
recorder.stop()
micBooster.stop()
micMixer.stop()
mic.stop()
// now set the player's file to the recorder's file, since I want to play it back later
do {
    if let file = recorder.audioFile {
        player = try AKAudioPlayer(file: file, looping: false, lazyBuffering: false, completionHandler: playingEnded)
        try AudioKit.stop()
    } else {
        // handle missing-file error
    }
} catch {
    // handle error
}
So, can anyone please help me figure out why the FFT doesn't work the second time around?
Thanks!
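One thing that may be worth ruling out (an assumption, not something confirmed in this thread): AKFFTTap appears to install an AVAudioNode tap on bus 0 of its source node when it is created, and a bus can only hold one tap at a time, so a second AKFFTTap created against the same microphone may silently receive no data. A sketch of explicitly removing the old tap before re-allocating, assuming the tap really does live on bus 0 of mic.avAudioNode:

// Hypothetical teardown before re-creating the chain; assumes AKFFTTap
// installed its tap on bus 0 of the mic's underlying AVAudioNode.
mic.avAudioNode.removeTap(onBus: 0)

mic = AKMicrophone()
fft = AKFFTTap(mic) // re-create the tap against the fresh node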

Related

How to trigger events or callbacks at a specific point in an audio track?

I want to play an audio file (a WAV file, for example) and, at specific locations in the track, fire events or triggers that will control an external device.
My idea for now is to generate a MIDI track that plays in sync with the audio track, and when the MIDI track's notes are played, trigger events are generated that we can handle to do whatever we want.
Where I'm stuck right now is how to play the .mid file and generate events when its MIDI notes are played. I also want to play the WAV and the MIDI file in sync, but that is not what I am solving at this point.
I looked into AudioKit, but the examples seem out of date and the documentation isn't helping a lot.
Is MIDI the right approach for this? Is there an easier way on iOS where I don't have to use AudioKit and can just use something from AVFoundation?
I want to understand which tool is best for detecting when a MIDI note from the .mid file is played and handling that event.
My research has pointed me to AKAppleSequencer. What would help is a simple example that loads a MIDI file and then basically prints something when a note is played.
I came across these posts,
How to connect AKSequencer to a AKCallbackInstrument?
Play MIDI file together with wav AudioKit
but AKSequencer has since been replaced by AKAppleSequencer.
So I figured it out. The answer was basically in the posts above; I just updated the code so it uses AKAppleSequencer.
let sequencer = AKAppleSequencer(filename: "SaReGaMa") // the .mid file
let callbackInstr = AKMIDICallbackInstrument()
var player: AKPlayer!
func initializeSession() {
    callbackInstr.callback = myCallBack
    sequencer.setGlobalMIDIOutput(callbackInstr.midiIn)
    if let audioFile = try? AKAudioFile(readFileName: "SaReGaMa.wav") {
        player = AKPlayer(audioFile: audioFile)
        player.completionHandler = { print("Finished playing file") }
        player.buffering = .always
        AudioKit.output = player
        do {
            try AudioKit.start()
        } catch {
            print("Error starting audiokit, \(error)")
        }
    }
}
// The callback gets triggered when each midi note is played by the sequencer.
func myCallBack(a: UInt8, b: MIDINoteNumber, c: MIDIVelocity) {
    print(a, b, c)
}
// These functions let you control the playback.
func play() {
    player.play()
    sequencer.play()
}
func pause() {
    sequencer.stop()
    player.pause()
}
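For context, here is a hypothetical call site for the functions above (assuming they live in a view controller; the button actions are only illustrative):

override func viewDidLoad() {
    super.viewDidLoad()
    initializeSession()
}

// e.g. wired to play/pause buttons in the UI
@IBAction func playTapped(_ sender: UIButton) { play() }
@IBAction func pauseTapped(_ sender: UIButton) { pause() }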

How to play multiple sounds from buffers simultaneously using nodes connected to AVAudioEngine's mixer

I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time.
I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While it did play multiple sounds simultaneously, it didn't seem like it was capable of starting two sounds at the same time (it seemed like it would delay the second sound just slightly after the first sound was started). Furthermore, if I pressed notes at a very fast rate, it seemed like the engine couldn't keep up, and later sounds would start well after I had pressed the later notes.
I am trying to solve this problem, and from the research I have done, it seems like using the AVAudioEngine to play sounds would be the best method, where I can set up the sounds in an array of buffers, and then have them play back from those buffers.
class ViewController: UIViewController
{
    // Main audio engine and its corresponding mixer
    var audioEngine: AVAudioEngine = AVAudioEngine()
    var mainMixer = AVAudioMixerNode()

    // One AVAudioPlayerNode per note
    var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)

    // Array of file paths
    let noteFilePath: [String] = [
        Bundle.main.path(forResource: "note1", ofType: "wav")!,
        Bundle.main.path(forResource: "note2", ofType: "wav")!,
        Bundle.main.path(forResource: "note3", ofType: "wav")!]

    // Array to store the note URLs
    var noteFileURL = [URL]()
    // One audio file per note
    var noteAudioFile = [AVAudioFile]()
    // One audio buffer per note
    var noteAudioFileBuffer = [AVAudioPCMBuffer]()

    override func viewDidLoad()
    {
        super.viewDidLoad()
        do
        {
            // For each note, read the note URL into an AVAudioFile,
            // set up the AVAudioPCMBuffer using data read from the file,
            // and read the AVAudioFile into the corresponding buffer
            for i in 0...2
            {
                noteFileURL.append(URL(fileURLWithPath: noteFilePath[i]))
                // Read the corresponding URL into the audio file
                try noteAudioFile.append(AVAudioFile(forReading: noteFileURL[i]))
                // Read data from the audio file, and store it in the correct buffer
                let noteAudioFormat = noteAudioFile[i].processingFormat
                let noteAudioFrameCount = UInt32(noteAudioFile[i].length)
                noteAudioFileBuffer.append(AVAudioPCMBuffer(pcmFormat: noteAudioFormat, frameCapacity: noteAudioFrameCount)!)
                // Read the audio file into the buffer
                try noteAudioFile[i].read(into: noteAudioFileBuffer[i])
            }
            mainMixer = audioEngine.mainMixerNode
            // For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
            for i in 0...2
            {
                audioEngine.attach(audioFilePlayer[i])
                audioEngine.connect(audioFilePlayer[i], to: mainMixer, fromBus: 0, toBus: i, format: noteAudioFileBuffer[i].format)
            }
            // Start the audio engine
            try audioEngine.start()
            // Set up the audio session to play sound in the app, and activate the audio session
            try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.soloAmbient)
            try AVAudioSession.sharedInstance().setMode(AVAudioSession.Mode.default)
            try AVAudioSession.sharedInstance().setActive(true)
        }
        catch let error
        {
            print(error.localizedDescription)
        }
    }

    func playSound(senderTag: Int)
    {
        let sound: Int = senderTag - 1
        // Set up the corresponding audio player to play its sound.
        audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, options: .interrupts, completionHandler: nil)
        audioFilePlayer[sound].play()
    }
}
Each sound should play without interrupting the other sounds, only interrupting its own sound when that sound is played again. However, despite setting up multiple buffers and players, and assigning each one to its own bus on the audioEngine's mixer, playing one sound still stops any other sounds from playing.
Furthermore, while leaving out .interrupts does prevent sounds from stopping other sounds, these sounds won't play until the currently playing sound completes. This means that if I play note1, then note2, then note3, note1 plays, note2 only plays after note1 finishes, and note3 only plays after note2 finishes.
Edit: I was able to get the audioFilePlayer to reset to the beginning without using .interrupts by using the following code in the playSound function.
if audioFilePlayer[sound].isPlaying == true
{
    audioFilePlayer[sound].stop()
}
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, completionHandler: nil)
audioFilePlayer[sound].play()
This still leaves me with figuring out how to play these sounds simultaneously, since playing another sound will still stop the currently playing sound.
Edit 2: I found the solution to my problem. My answer is below.
It turns out that the .interrupts option wasn't the issue (in fact, in my experience it turned out to be the best way to restart a sound that was already playing, since there was no noticeable pause during the restart, unlike with the stop() function). The actual problem preventing multiple sounds from playing simultaneously was this particular line of code.
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
What happened here is that each item of the array was assigned the exact same AVAudioPlayerNode instance: Array(repeating:count:) evaluates its argument once and stores that single reference in every slot, so all of the items were effectively sharing one node. As a result, the AVAudioPlayerNode functions were affecting every item in the array instead of just the specified one. To fix this and give each item its own AVAudioPlayerNode, I changed the line above so that the property starts out as an empty array of AVAudioPlayerNode instead.
// One AVAudioPlayerNode per note
var audioFilePlayer = [AVAudioPlayerNode]()
I then added a line at the beginning of the second for-loop in viewDidLoad() to append a new AVAudioPlayerNode to this array on each iteration.
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...6
{
    audioFilePlayer.append(AVAudioPlayerNode())
    // audioEngine code
}
This gave each item in the array its own AVAudioPlayerNode instance. Playing or restarting a sound no longer interrupts the other sounds that are currently playing, and I can now play any of the notes simultaneously without any noticeable latency between the note press and playback.
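As a side note (not part of the original answer), an equivalent way to guarantee one distinct node per element is to build the array with map, since the closure is evaluated once per element:

// One AVAudioPlayerNode per note, each created individually
var audioFilePlayer: [AVAudioPlayerNode] = (0..<7).map { _ in AVAudioPlayerNode() }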

Reloading AKAudioFile in AKSequencer using AKCallbackInstrument.noteOff?

First, I created an AKMIDISampler to play an audio file, and then assigned it to an AKSequencer. The 'midi' file I used is just a 2-bar, single-track MIDI file containing one C3 note, exactly as long as the audio file I wanted to play. But when creating the AKAudioFile, I wanted to choose an mp3 file at random. I temporarily made 1.mp3, 2.mp3, and 3.mp3, as below.
let track = AKMIDISampler()
let sequencer = AKSequencer(filename: "midi")
try? track.loadAudioFile(AKAudioFile(readFileName: String(arc4random_uniform(3)+1) + ".mp3"))
sequencer.tracks[0].setMIDIOutput(track.midiIn)
// Tempo track I had to make to remove the sine wave
sequencer.tracks[1].setMIDIOutput(track.midiIn)
Then I applied some sequencer settings,
sequencer.setTempo(128.0)
sequencer.setLength(AKDuration(beats: 8))
sequencer.setLoopInfo(AKDuration(beats: 8), numberOfLoops: 4)
sequencer.preroll()
and assigned the AKMIDISampler to AudioKit.output, then called sequencer.play().
The sequencer playback was successful! It randomly loaded one of the three mp3 files, played 8 beats (2 bars), and looped exactly 4 times.
But my goal is to load a random mp3 file every time the loop repeats. It seems like the sequencer only plays the first assigned mp3 file when looping, and I am struggling to find a solution to this.
Perhaps I could use AKCallbackInstrument? Since I play the audio file through a MIDI note in this case, maybe I could call loadAudioFile again whenever the MIDI note goes off? That way I might loop the sequencer and play a random audio file on every loop. This is just an idea; right now it is hard for me to write it properly. I hope I am on the right track. It would be great if I could get some advice here. <3
You're definitely on the right track - you can easily get random audio files to loop at a fixed interval with AKSequencer + AKCallbackInstrument. But I wouldn't worry about trying to reload on the NoteOff message.
I would first load each mp3 into a separate player (e.g., AKAppleSampler) in an array (you could call it players, say) and create a method that triggers one of these players at random:
func playRandom() {
    let playerIndex = Int(arc4random_uniform(UInt32(players.count)))
    try? players[playerIndex].play()
}
When you create your sequencer, add a track and assign it to an AKCallbackInstrument. The callback function for this AKCallbackInstrument will call playRandom when it receives a noteOn message.
seq = AKSequencer()
track = seq.newTrack()!
callbackInst = AKCallbackInstrument()
track.setMIDIOutput(callbackInst.midiIn)
callbackInst.callback = { status, note, vel in
    guard status == .noteOn else { return }
    self.playRandom()
}
It isn't necessary to load the sequencer with a MIDI file. You could just add the triggering MIDI event directly to the track.
track.add(noteNumber: 48,                 // i.e., C3
          velocity: 127,
          position: AKDuration(beats: 0), // noteOn message here
          duration: AKDuration(beats: 8), // noteOff 8 beats later
          channel: 0)
Your problem with the sine wave is probably caused by an extra track (probably the tempo track) in the MIDI file you created that hasn't been assigned an output. You can avoid the problem altogether by adding the MIDI events directly.
In principle, you could use the callback to check for noteOff events and trigger code from the noteOff, but I wouldn't recommend it in your case. There is no good reason to re-use a single player for multiple audio files. Loading the file is where you are most likely to create an error: what happens if your file hasn't finished playing and you try to load another one? The resources needed to keep multiple players in memory are pretty trivial, so if you're going to play the same file more than once, it is cleaner and safer to load it once and keep the player in memory.
That was very helpful, c_booth! Thanks to you, I made huge progress today. Here's what I've written based on your advice. First, I made an array of AKPlayers containing 6 mp3 files. They're connected to an AKMixer, and then I created the sequencer and callback instrument. I added a track and a note to the sequencer, which calls the playRandom function on every noteOn:
let players: [AKPlayer] = {
    do {
        let filenames = ["a1.mp3", "a2.mp3", "a3.mp3", "b1.mp3", "b2.mp3", "b3.mp3"]
        return try filenames.map { AKPlayer(audioFile: try AKAudioFile(readFileName: $0)) }
    } catch {
        fatalError()
    }
}()

func playRandom() {
    let playerIndex = Int(arc4random_uniform(UInt32(players.count)))
    players[playerIndex].play()
}

func addTracks() {
    let track = sequencer.newTrack()!
    track.add(noteNumber: 48, velocity: 127, position: AKDuration(beats: 0), duration: AKDuration(beats: 16), channel: 0)
    track.setMIDIOutput(callbackInst.midiIn)
    callbackInst.callback = { status, note, vel in
        guard status == .noteOn else { return }
        self.playRandom()
    }
}

func sequencerSettings() {
    sequencer.setTempo(128.0)
    sequencer.setLength(AKDuration(beats: 16))
    sequencer.setLoopInfo(AKDuration(beats: 16), numberOfLoops: 4)
    sequencer.preroll()
}

func makeConnections() {
    players.forEach { $0 >>> mixer }
}

func startAudioEngine() {
    AudioKit.output = mixer
    do {
        try AudioKit.start()
    } catch {
        print(error)
        fatalError()
    }
}

func startSequencer() {
    sequencer.play()
}
This worked great. It randomly selects one of the 6 mp3 files (they are all the same length: 128 bpm, 16 beats). What I found strange, though, is that the first playback plays two audio files at once; it works fine from the second loop on. I changed the numberOfLoops setting, tried enableLooping(), etc., but it's still the same - two files play on the first pass. The track count is still 1, and only one AKPlayer is triggered, as you can see. Is there anything I can do about this?
Also, ultimately I'd like to have hundreds of mp3 files in the array, since what I'm trying to make is a sort of DJing app (something like an Ableton Live preset). Do you think it's a good idea to use AKPlayer, assuming this code will load mp3 files from the cloud and stream them to the user? Much appreciated. <3

Limit audio duration when played

I'm trying to build an iOS application in Swift 4 that plays different sounds. Part of what I need to do is limit the duration of the sound file being played, depending on some settings.
What I'd like to know is whether it's possible to set the duration of the sound before playing it, so that it stops after the set duration. If so, how?
I'm currently using AVAudioPlayer, but I don't know whether it can do this. My current code is shown below:
// resName set depending on settings
url = Bundle.main.url(forResource: resName, withExtension: "mp3")
do {
    audioPlayer = try AVAudioPlayer(contentsOf: url!)
    audioPlayer.prepareToPlay()
    audioPlayer.currentTime = 0
} catch let error as NSError {
    print(error.debugDescription)
}
audioPlayer.play()
Thanks in advance for the help :)
Well, since you are in control of playing the sounds, one way to deal with it is to start a Timer when you begin playing and stop the player after the given time period:
let timeLimit = 1.6 // get it from settings
player.play()
Timer.scheduledTimer(withTimeInterval: timeLimit, repeats: false) { (timer) in
    player.stop()
}
And in case you need to cancel it, you can keep a reference to the timer and cancel it using timer.invalidate().
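A minimal sketch of keeping that reference around (the stopTimer property and the method names here are illustrative, not from the original answer):

var stopTimer: Timer?

func playLimited(to timeLimit: TimeInterval) {
    audioPlayer.play()
    stopTimer?.invalidate() // cancel any previously scheduled stop
    stopTimer = Timer.scheduledTimer(withTimeInterval: timeLimit, repeats: false) { [weak self] _ in
        self?.audioPlayer.stop()
    }
}

func cancelTimeLimit() {
    stopTimer?.invalidate()
    stopTimer = nil
}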
As an audio engineer, I recommend editing your audio to be exactly how you want it to avoid any unwanted artifacts that can happen from coding.

Playing scheduled audio in the background

I am having a really difficult time with playing audio in the background of my app. The app is a timer that is counting down and plays bells, and everything worked using the timer originally. Since you cannot run a timer over 3 minutes in the background, I need to play the bells another way.
The user has the ability to choose bells and set the time for these bells to play (e.g. play bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc).
So far I have tried using notifications with DispatchQueue.main, and this works fine as long as the user does not pause the timer. If they re-enter the app and pause, though, I cannot seem to cancel or pause that queued work in any way.
Next I tried using AVAudioEngine and created a set of nodes. These play while the app is in the foreground but seem to stop upon backgrounding. Additionally, when I pause the engine and resume later, it doesn't pause the sequence properly: it squishes the bells into playing one right after the other, or not at all.
If anyone has any ideas on how to solve this, that would be great. Technically I could remove everything from the engine and recreate it from the paused time when the user pauses/resumes, but this seems quite costly, and it also doesn't solve the problem of the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under Background Modes in Capabilities.
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several more times to create the chain of nodes (or is that done incorrectly?). The code is kind of messy at the moment because I have been trying many different ways to get this to work.
func setupSounds() {
    if (attached) {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true
    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))
    var bell = ""
    do {
        try engine.start()
    } catch {
        return
    }
    if (currentSession.bellObject?.startBell != nil) {
        bell = (currentSession.bellObject?.startBell)!
        guard let url = Bundle.main.url(forResource: bell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        // AVAudioPCMBuffer's initializer is failable, so the result needs unwrapping
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(capacity))!
        do {
            try file.read(into: buffer)
        } catch {
            return
        }
        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)
        player.scheduleBuffer(buffer, at: futureTime, options: AVAudioPlayerNodeBufferOptions(rawValue: 0), completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
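One thing the code above never does is configure the shared AVAudioSession. In addition to the background-audio entitlement, audio generally keeps running in the background only while the session is active with the .playback category. A minimal sketch of that setup (where you call it is up to you, e.g. before starting the engine):

import AVFoundation

func configureBackgroundAudioSession() {
    do {
        // .playback lets audio continue when the app moves to the background,
        // in combination with the audio background mode already enabled.
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Failed to configure the audio session: \(error)")
    }
}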

Resources