I'm implementing an app using AudioKit that lets the user play a large number of audio files and switch between them at any time. Additionally, we need one audio file to play on a loop as ambient noise.
What is the correct way to dynamically change players in AudioKit's output? Or what's the proper way to implement this behavior?
AudioKit requires you to set its output, which can be, for example, an AKPlayer that plays one audio file or an AKMixer built from an array of AKPlayers. As far as I can tell, you cannot change the output once AudioKit has been started. So my current approach is to use an AKMixer with two AKPlayers - one for the current file and one for the ambient noise. When the user taps the 'Next' button I stop() AudioKit, recreate an AKMixer with a new player for the next song's audio file plus the ambient noise player, assign that to output, start() AudioKit, and play both players. This produces undesirable behavior: the ambient noise stops while switching songs, causing a brief pause.
If this is the correct approach, how can we update the output without stopping AudioKit? I wondered whether you could initialize the mixer with an array of players held in a strong property and simply add or remove players in that array. But that doesn't work - starting a new player throws the error "player started when in a disconnected state" because the node is not attached to the output.
I've created a sample project to demonstrate this behavior. On launch, the app plays drums as the ambient noise and waves as the current track. When you tap Next it switches to the next track (which in this demo is just the same audio file), and you can hear the drums stop and then resume, which is undesirable. The project's code is below:
import AudioKit

final class Maestro: NSObject {

    static let shared = Maestro()

    private var audioPlayer: AKPlayer?

    private var ambientPlayer: AKPlayer = {
        let player = AKPlayer(url: Bundle.main.url(forResource: "drums", withExtension: "wav")!)!
        player.isLooping = true
        return player
    }()

    private var mixer: AKMixer?

    private let audioFileURL = Bundle.main.url(forResource: "waves", withExtension: "mp3")!

    func play() {
        playNewPlayer(fileURL: audioFileURL)
    }

    func next() {
        // In the real app we'd play the next audio file in the playlist,
        // but for the demo we'll just play the same file.
        playNewPlayer(fileURL: audioFileURL)
    }

    private func playNewPlayer(fileURL: URL) {
        audioPlayer?.stop()
        audioPlayer = nil

        do {
            try AudioKit.stop()
        } catch {
            print("Maestro AudioKit.stop error: \(error)")
        }

        audioPlayer = AKPlayer(url: fileURL)!
        mixer = AKMixer([audioPlayer!, ambientPlayer])
        AudioKit.output = mixer

        do {
            try AudioKit.start()
        } catch {
            print("Maestro AudioKit.start error: \(error)")
        }

        if ambientPlayer.isPlaying {
            // Need to resume playback from the current position.
            let pos = ambientPlayer.currentTime
            ambientPlayer.stop()
            ambientPlayer.play(from: pos)
        } else {
            ambientPlayer.play()
        }
        audioPlayer?.play()
    }
}
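Edit: the direction I'm currently exploring is to keep one long-lived AKMixer as AudioKit.output, start AudioKit exactly once, and re-wire only the song player when the track changes. The sketch below assumes AudioKit 4's connection API (AKOutput.connect(to:) and disconnectOutput()) permits attaching and detaching nodes while the engine is running; I haven't verified that this avoids the disconnected-state error in every case.

import AudioKit

final class Maestro {

    static let shared = Maestro()

    // Long-lived mixer: assigned to AudioKit.output exactly once and never replaced.
    private let mixer = AKMixer()

    private var audioPlayer: AKPlayer?

    private let ambientPlayer: AKPlayer = {
        let player = AKPlayer(url: Bundle.main.url(forResource: "drums", withExtension: "wav")!)!
        player.isLooping = true
        return player
    }()

    private init() {
        // Build the graph once, before starting the engine.
        ambientPlayer.connect(to: mixer)
        AudioKit.output = mixer
        do {
            try AudioKit.start() // started once; never stopped on track changes
        } catch {
            print("Maestro AudioKit.start error: \(error)")
        }
        ambientPlayer.play()
    }

    func playNext(fileURL: URL) {
        // Swap only the song player; the ambient player keeps running untouched.
        audioPlayer?.stop()
        audioPlayer?.disconnectOutput()
        let newPlayer = AKPlayer(url: fileURL)!
        newPlayer.connect(to: mixer) // assumed safe while the engine is running
        newPlayer.play()
        audioPlayer = newPlayer
    }
}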
I want to play an audio file (a WAV file, for example) and at specific locations in the track I want to fire events or triggers that will control an external device.
My idea for now is to generate a MIDI track that plays in sync with the audio track; when the MIDI notes are played, trigger events are generated that we can handle to do whatever we want.
Where I am stuck right now is how to play the .mid file and generate events when its MIDI notes are played. I also want to play the WAV and the .mid file in sync, but that is not what I am solving at this point.
I looked into AudioKit, but the examples seem out of date and the documentation isn't helping a lot.
Is MIDI the right approach for this? Is there an easier way on iOS where I don't have to use AudioKit and can just use something from AVFoundation?
I want to understand which tool is best for detecting when a MIDI note from the .mid file is played and handling the event.
My research has pointed me to AKAppleSequencer. What would help is a simple example that loads a MIDI file and simply prints something when a note is played.
I came across these posts:
How to connect AKSequencer to a AKCallbackInstrument?
Play MIDI file together with wav AudioKit
but AKSequencer has since been replaced by AKAppleSequencer.
So I figured it out. The answer was basically in the posts above; I just updated the code to use AKAppleSequencer.
import AudioKit

let sequencer = AKAppleSequencer(filename: "SaReGaMa") // the .mid file
let callbackInstr = AKMIDICallbackInstrument()
var player: AKPlayer!

func initializeSession() {
    callbackInstr.callback = myCallBack
    sequencer.setGlobalMIDIOutput(callbackInstr.midiIn)
    if let audioFile = try? AKAudioFile(readFileName: "SaReGaMa.wav") {
        player = AKPlayer(audioFile: audioFile)
        player.completionHandler = { print("Finished playing file") }
        player.buffering = .always
        AudioKit.output = player
        do {
            try AudioKit.start()
        } catch {
            print("Error starting AudioKit, \(error)")
        }
    }
}

// The callback is triggered each time the sequencer plays a MIDI note.
func myCallBack(a: UInt8, b: MIDINoteNumber, c: MIDIVelocity) {
    print(a, b, c)
}

// These functions control playback.
func play() {
    player.play()
    sequencer.play()
}

func pause() {
    sequencer.stop()
    player.pause()
}
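One detail that may help anyone adapting this: the callback also fires for note-off messages (and, depending on the file, other channel messages), so you need to check the status byte if you only care about note-ons. A small sketch, assuming the first parameter is the raw MIDI status byte, as it appears to be above:

func myCallBack(status: UInt8, noteNumber: MIDINoteNumber, velocity: MIDIVelocity) {
    // The upper nibble of the status byte identifies the message type:
    // 0x9 = note on, 0x8 = note off. By MIDI convention, a note-on with
    // velocity 0 is also treated as a note off.
    let type = status >> 4
    if type == 0x9 && velocity > 0 {
        print("note on:", noteNumber, "velocity:", velocity)
    } else if type == 0x8 || (type == 0x9 && velocity == 0) {
        print("note off:", noteNumber)
    }
}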
I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time.
I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While this did play multiple sounds simultaneously, it didn't seem capable of starting two sounds at exactly the same time (the second sound was delayed slightly after the first sound started). Furthermore, if I pressed notes at a very fast rate, the engine couldn't seem to keep up, and later sounds would start well after I had pressed the corresponding notes.
I am trying to solve this problem, and from the research I have done, it seems like using AVAudioEngine would be the best method: I can set up the sounds in an array of buffers, and then have them play back from those buffers.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    // Main audio engine and its corresponding mixer
    var audioEngine: AVAudioEngine = AVAudioEngine()
    var mainMixer = AVAudioMixerNode()

    // One AVAudioPlayerNode per note
    var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)

    // Array of file paths
    let noteFilePath: [String] = [
        Bundle.main.path(forResource: "note1", ofType: "wav")!,
        Bundle.main.path(forResource: "note2", ofType: "wav")!,
        Bundle.main.path(forResource: "note3", ofType: "wav")!]

    // Array to store the note URLs
    var noteFileURL = [URL]()

    // One audio file per note
    var noteAudioFile = [AVAudioFile]()

    // One audio buffer per note
    var noteAudioFileBuffer = [AVAudioPCMBuffer]()

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            // For each note, read the note URL into an AVAudioFile,
            // set up the AVAudioPCMBuffer using data read from the file,
            // and read the AVAudioFile into the corresponding buffer
            for i in 0...2 {
                noteFileURL.append(URL(fileURLWithPath: noteFilePath[i]))
                // Read the corresponding URL into the audio file
                try noteAudioFile.append(AVAudioFile(forReading: noteFileURL[i]))
                // Read data from the audio file, and store it in the correct buffer
                let noteAudioFormat = noteAudioFile[i].processingFormat
                let noteAudioFrameCount = UInt32(noteAudioFile[i].length)
                noteAudioFileBuffer.append(AVAudioPCMBuffer(pcmFormat: noteAudioFormat, frameCapacity: noteAudioFrameCount)!)
                // Read the audio file into the buffer
                try noteAudioFile[i].read(into: noteAudioFileBuffer[i])
            }
            mainMixer = audioEngine.mainMixerNode
            // For each note, attach the corresponding node to the audioEngine,
            // and connect the node to the audioEngine's mixer
            for i in 0...2 {
                audioEngine.attach(audioFilePlayer[i])
                audioEngine.connect(audioFilePlayer[i], to: mainMixer, fromBus: 0, toBus: i, format: noteAudioFileBuffer[i].format)
            }
            // Start the audio engine
            try audioEngine.start()
            // Set up the audio session to play sound in the app, and activate the audio session
            try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.soloAmbient)
            try AVAudioSession.sharedInstance().setMode(AVAudioSession.Mode.default)
            try AVAudioSession.sharedInstance().setActive(true)
        } catch let error {
            print(error.localizedDescription)
        }
    }

    func playSound(senderTag: Int) {
        let sound: Int = senderTag - 1
        // Set up the corresponding audio player to play its sound
        audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, options: .interrupts, completionHandler: nil)
        audioFilePlayer[sound].play()
    }
}
Each sound should play without interrupting the other sounds, interrupting its own playback only when the same sound is played again. However, despite setting up multiple buffers and players, and assigning each one to its own bus on the audioEngine's mixer, playing one sound still stops any other sound that is playing.
Furthermore, while leaving out .interrupts does prevent sounds from stopping each other, these sounds won't play until the currently playing sound completes. This means that if I play note1, then note2, then note3, note1 plays immediately, note2 only plays after note1 finishes, and note3 only plays after note2 finishes.
Edit: I was able to get the audioFilePlayer to reset to the beginning without using .interrupts, with the following code in the playSound function.
if audioFilePlayer[sound].isPlaying == true {
    audioFilePlayer[sound].stop()
}
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, completionHandler: nil)
audioFilePlayer[sound].play()
This still leaves me with figuring out how to play these sounds simultaneously, since playing another sound still stops the currently playing sound.
Edit 2: I found the solution to my problem. My answer is below.
It turns out that the .interrupts option wasn't the issue (in fact, this turned out to be the best way to restart a sound that was already playing, as there was no noticeable pause during the restart, unlike with the stop() function). The actual problem that was preventing multiple sounds from playing simultaneously was this particular line of code:
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
What happened here was that every element of the array was assigned the exact same AVAudioPlayerNode instance, so they were all effectively sharing one player node. As a result, the AVAudioPlayerNode functions affected every item in the array instead of just the specified item. To fix this and give each item a distinct AVAudioPlayerNode, I changed the above line so that the array starts out as an empty array of type AVAudioPlayerNode instead.
// One AVAudioPlayerNode per note
var audioFilePlayer = [AVAudioPlayerNode]()
I then added a new line at the beginning of the second for-loop in viewDidLoad() to append a new AVAudioPlayerNode to this array on each iteration.
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...6 {
    audioFilePlayer.append(AVAudioPlayerNode())
    // audioEngine code
}
This gave each item in the array a distinct AVAudioPlayerNode. Playing a sound or restarting a sound no longer interrupts the other sounds that are currently being played. I can now play any of the notes simultaneously, without any noticeable latency between note press and playback.
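As a side note on the fix: any construction that creates a distinct node per element works. A more compact equivalent of the append loop, assuming the same seven players:

// Each closure invocation creates a fresh AVAudioPlayerNode, unlike
// Array(repeating:count:), which reuses a single instance seven times.
var audioFilePlayer: [AVAudioPlayerNode] = (0..<7).map { _ in AVAudioPlayerNode() }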
I’m having a problem using AKMIDISampler in my project when an AKMicrophone has been initialized. Along with correctly playing the woodblock sample when “play” is called on the sampler, the first time “play” is called a constant sine wave starts playing - and it never stops.
I’ve replicated the problem in the smallest amount of code below. The issue happens when the class is initialized and playTestSample() is then called.
Note that if all of the AKMicrophone-related code is commented out, the AKMIDISampler plays fine and the sine wave that currently haunts my dreams doesn’t happen.
(I’ve tried switching to AKSampler() just to see if the problem exists there too, but I haven’t been able to get any sound out of that.)
FYI: I have “App plays audio or streams audio/video using AirPlay” in the “Required background modes” in Info.plist - which is known to fix another sine wave issue.
Thank you very much for any assistance.
Btw: AudioKit rocks and has been a massive help on this project! :^)
AK 4.5.4
Xcode 10.1
import Foundation
import AudioKit
class AudioKitTESTManager {

    var mixer = AKMixer()
    var sampler = AKMIDISampler()
    var mic = AKMicrophone()
    var micMixer = AKMixer()
    var micBooster = AKBooster()

    init() {
        mixer = AKMixer(sampler, micBooster)

        do {
            let woodblock = try AKAudioFile(readFileName: RhythmGameConfig.woodblockSoundName)
            try sampler.loadAudioFiles([woodblock])
        } catch {
            print("Error loading audio files into sampler")
        }

        micMixer = AKMixer(mic)
        micBooster = AKBooster(micMixer)
        micBooster.gain = 0.0

        AudioKit.output = mixer
        AKSettings.playbackWhileMuted = true
        AKSettings.defaultToSpeaker = true
        AKSettings.sampleRate = 44100

        do {
            print("Attempting to start AudioKit")
            try AudioKit.start()
        } catch {
            AKLog("AudioKit did not start!")
        }
    }

    func playTestSample() {
        // You hear the sample, and a continuous sine wave starts playing
        // through the sampler mixer.
        try? sampler.play(noteNumber: 60, velocity: 90, channel: 1)
    }
}
Wheeew. I believe I've found a solution. Maybe it will help out someone else?
It seems that loading the files into the sampler AFTER AudioKit.start() fixes the Sine Wave of Terror!
// ...
do {
    print("Attempting to start AudioKit")
    try AudioKit.start()
} catch {
    AKLog("AudioKit did not start!")
}

do {
    let woodblock = try AKAudioFile(readFileName: RhythmGameConfig.woodblockSoundName)
    try sampler.loadAudioFiles([woodblock])
} catch {
    print("Error loading audio files into sampler")
}
I'm having a problem recording audio from the microphone of my test device to a .caf file in Swift, using Xcode 9.4.1 and the latest version of AudioKit. In a simple test where I send the audio straight from the microphone to the output via an AKBooster, it works just fine and I can hear the mic input coming out of the speakers. I'm more or less following this example, although again using a booster node instead of an oscillator.
The following is my code:
import AudioKit

class MicrophoneHandler {

    var microphone: AKMicrophone!
    var booster: AKBooster!
    var mixer: AKMixer!
    var recorder: AKNodeRecorder!
    var file: AKAudioFile!
    var player: AKAudioPlayer!

    init() {
        setupMicrophone()

        microphone = AKMicrophone()
        booster = AKBooster(microphone) // Stereo amplifier for the microphone
        mixer = AKMixer(booster)

        file = try! AKAudioFile() // File to store the recorder output
        player = try? AKAudioPlayer(file: file) // Player to play back the recorded audio file
        //player.looping = true

        recorder = try? AKNodeRecorder(node: mixer, file: file)
        try? recorder.record()

        sleep(5)

        let dur = String(format: "%0.3f seconds", recorder.recordedDuration)
        print("Stopped. (\(dur) recorded)")
        recorder.stop()

        //file.exportAsynchronously(name: "Test", baseDir: .documents, exportFormat: .caf) { [weak self] _, _ in
        //}

        //player.play()
        //AudioKit.output = player!
        //try? AudioKit.start()
    }

    func setupMicrophone() {
        // Initialise microphone settings.
        // Adapted from the AudioKit example code found here:
        // https://audiokit.io/examples/MicrophoneAnalysis
        AKSettings.bufferLength = .medium
        AKSettings.ioBufferDuration = 0.002 // TODO: experiment with this to control latency
        do {
            // Set the session type & allow streaming to Bluetooth devices
            try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
        } catch {
            AKLog("Could not set session category.")
        }
        AKSettings.defaultToSpeaker = true // Output to the speaker when audio input is enabled
    }
}
I have commented out the export code, as the problem doesn't appear to be there. The console displays the following:
AKMicrophone.swift:init():45:Mixer inputs 8
AKAudioPlayer.swift:updatePCMBuffer():533:AKAudioPlayer Warning: "BF848EC0-94F8-4E39-A211-784B001CED72.caf" is an empty file
2018-11-16 17:49:16.936169+0000 VoxBox[2258:6984570] Audio files cannot be non-interleaved. Ignoring setting AVLinearPCMIsNonInterleaved YES.
AKNodeRecorder.swift:record():104:AKNodeRecorder: recording
Stopped. (0.000 seconds recorded)
As you can see, the recorder appears not to be recording to the file for some reason. To my mind, my code should:
Initialise the microphone (including settings)
Route the microphone input through a booster followed by a mixer (mixing with an FX bank will happen later)
Create an empty .caf audio file to be written to
Set up a player to play this file when the time comes
Set up a recorder to record the output of the mixer node to the audio file
Record 5 seconds of microphone input to the audio file
Yet for some reason nothing is being recorded. Clearly I am missing something or have misunderstood how AKNodeRecorder works in this regard. I have read as many Stack Overflow questions on similar topics as I can, dug through the AudioKit documentation, and read a couple of examples from the AudioKit site, but nothing seems to address my particular problem.
Any help would be much appreciated.
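Edit: for reference, this is the minimal pattern I believe AKNodeRecorder expects - the output assigned and AudioKit started before record() is called, and the five-second wait dispatched asynchronously instead of blocking with sleep(). This is a sketch of my current understanding, not verified working code:

import Foundation
import AudioKit

class MicrophoneRecorder {

    var microphone: AKMicrophone!
    var booster: AKBooster!
    var mixer: AKMixer!
    var recorder: AKNodeRecorder!
    var file: AKAudioFile!

    init() throws {
        try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
        AKSettings.defaultToSpeaker = true

        microphone = AKMicrophone()
        booster = AKBooster(microphone)
        mixer = AKMixer(booster)
        file = try AKAudioFile()
        recorder = try AKNodeRecorder(node: mixer, file: file)

        // The engine must be running before recording starts, otherwise no
        // samples ever reach the recorder's tap and the recorded duration
        // stays at 0.000 seconds.
        AudioKit.output = mixer
        try AudioKit.start()
        try recorder.record()

        // Stop after 5 seconds without blocking the current thread.
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) { [weak self] in
            guard let self = self else { return }
            self.recorder.stop()
            let dur = String(format: "%0.3f seconds", self.recorder.recordedDuration)
            print("Stopped. (\(dur) recorded)")
        }
    }
}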
All of my other audio files play perfectly through the audio player. Most of the files are less than 5 seconds long, and the longest file that still plays is 23 seconds long. For some reason my 1:15-long file "sortSong" is completely silent when it is called.
Here is the code for my audio player:
import Foundation
import AVFoundation
import AudioToolbox

class AudioPlayer {

    static let sharedInstance = AudioPlayer()

    enum Sound: String {
        case sortSongPlay = "sortSong"
        case centerButtonRelease = "centerButtonRelease"
        case buttonTap = "tapSound"

        static var all: [Sound] {
            return [.centerButtonRelease, .buttonTap, .sortSongPlay]
        }

        var url: URL {
            return URL(fileURLWithPath: Bundle.main.path(forResource: self.rawValue, ofType: "mp3")!)
        }
    }

    private var soundMap = [String: SystemSoundID]()

    init() {
        // Register a system sound ID for each sound in the bundle.
        for sound in Sound.all {
            let soundUrl = sound.url
            var soundId: SystemSoundID = 0
            AudioServicesCreateSystemSoundID(soundUrl as CFURL, &soundId)
            soundMap[sound.rawValue] = soundId
        }
    }

    func play(sound: Sound) {
        AudioServicesPlaySystemSound(soundMap[sound.rawValue]!)
    }
}
The sound is played when this function is called in my view controller.
func successfulSort() {
    AudioPlayer.sharedInstance.play(sound: .sortSongPlay)
    rotateSortImageOne()
    rotateSortImageTwo()
}
This is the action that calls the successfulSort function:
if inputText == "hello" && seedArray == [1, 4, 1] {
    successfulSort()
}
If I simply change the case sortSongPlay to "shortSortSong" (the 23-second version), it plays just fine.
All of my sound files have their target membership checked for this project, and all of the files have the correct path. The audio plays in Interface Builder if I press the play button. There are no compiler errors or warnings, and the app never crashes; the audio for sortSong simply isn't playing when it is called in the app.
Here is a link to examples I have tried in my project. The first two sounds play silently in the app while the shorter sounds all play perfectly: https://soundcloud.com/spiffystache
What is causing this to be silent?
You should put a correct title on your question.
Your code does not use AVAudioPlayer; it uses System Sound Services.
The documentation clearly states its limitation:
You can use System Sound Services to play short (30 seconds or shorter) sounds.
I have never checked whether the limit is exactly 30 seconds, but it matches the description in your question.
Consider using AVAudioPlayer (as in your title) to play longer sounds.
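A minimal sketch of that approach, assuming the same sortSong.mp3 resource; note that the player must be kept in a strong property, because a local AVAudioPlayer would be deallocated before the sound finishes:

import AVFoundation

class LongSoundPlayer {

    // Strong reference: keeps the player alive for the full duration.
    private var player: AVAudioPlayer?

    func playSortSong() {
        guard let url = Bundle.main.url(forResource: "sortSong", withExtension: "mp3") else { return }
        do {
            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("AVAudioPlayer error: \(error)")
        }
    }
}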