AKMIDICallbackInstrument - callback NOT called - audiokit

How to Reproduce
I copied the "Callback Instrument" playground (which works) into a new project and installed AudioKit via CocoaPods (version 4.8).
I replaced the callback implementation with just a print() statement.
Open the workspace and run the project.
import UIKit
import AudioKit

class ViewController: UIViewController {

    var sequencer = AKAppleSequencer()
    var tempo = 120.0
    var division = 1

    var callbacker = AKMIDICallbackInstrument { statusByte, note, _ in
        print("Callback called")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        let clickTrack = sequencer.newTrack()
        for i in 0 ..< division {
            clickTrack?.add(noteNumber: 80,
                            velocity: 100,
                            position: AKDuration(beats: Double(i) / Double(division)),
                            duration: AKDuration(beats: Double(0.1 / Double(division))))
            clickTrack?.add(noteNumber: 60,
                            velocity: 100,
                            position: AKDuration(beats: (Double(i) + 0.5) / Double(division)),
                            duration: AKDuration(beats: Double(0.1 / Double(division))))
        }
        clickTrack?.setMIDIOutput(callbacker.midiIn)
        clickTrack?.setLoopInfo(AKDuration(beats: 1.0), numberOfLoops: 10)
        sequencer.setTempo(tempo)
        sequencer.play()
    }
}
What happens
The callback is not called (the print statement is never logged).
I can hear the added notes playing.
This code works in the example playground.

If you are hearing sounds but haven't connected your tracks to an audio-generating output, then you are probably hearing the default sampler. This will happen if you do not have audio enabled in 'Background Modes'. If you look at the console output, you should see a message instructing you to make sure that it is enabled; it is necessary with MusicSequence/AKAppleSequencer.
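A minimal sketch (AudioKit 4.x) of wiring the callback instrument into an explicit output chain and starting the engine, so the default sampler is not what you end up hearing; the AKMixer wrapping and the AudioKit.start() call are assumptions, not taken from the question:

// Route the callback instrument into an explicit output chain and start AudioKit.
let mixer = AKMixer(callbacker)
AudioKit.output = mixer
do {
    try AudioKit.start()
} catch {
    print("AudioKit failed to start: \(error)")
}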

Related

Why won't the sequencer in AudioKit play my drum sounds and why is the volume so low?

First of all, a great framework; it is singlehandedly allowing me to graduate from my master's program. Also, I'm a sponsor! Any help would be received with much gratitude. I can also push my repository to GitHub and share it for a closer look.
Anyway, here is my code
import Foundation
import AudioKit

class DrumSounds {

    let drums = AKMIDISampler()
    var currentBPM = 60
    var rideCymbalFile: AKAudioFile?
    var snareDrumFile: AKAudioFile?
    var bassDrumFile: AKAudioFile?
    var hiHatFile: AKAudioFile?
    let sequencer = AKAppleSequencer(filename: "4tracks")
    var booster = AKBooster()

    init() {
        do {
            try rideCymbalFile = AKAudioFile(readFileName: "rideCymbalSound.wav")
            try snareDrumFile = AKAudioFile(readFileName: "snareDrumSound.wav")
            try bassDrumFile = AKAudioFile(readFileName: "bassDrumSound.wav")
            try hiHatFile = AKAudioFile(readFileName: "hiHatSound.mp3")
            try drums.loadAudioFiles([rideCymbalFile!,
                                      snareDrumFile!,
                                      bassDrumFile!,
                                      hiHatFile!])
        } catch {
            print("error loading samples to drum object")
        }
        drums.volume = 1
        booster = AKBooster(drums)
        AudioKit.output = drums
        sequencer.clearRange(start: AKDuration(beats: 0), duration: AKDuration(beats: 100))
        sequencer.debug()
        sequencer.setGlobalMIDIOutput(drums.midiIn)
        sequencer.enableLooping(AKDuration(beats: 4))
        sequencer.setTempo(Double(currentBPM))
    }

    func playDrumSounds() {
        do {
            try AKSettings.setSession(category: .playAndRecord, with: AVAudioSession.CategoryOptions.defaultToSpeaker)
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(AVAudioSession.Category.playAndRecord)
            if !AKSettings.headPhonesPlugged {
                try session.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
            }
        } catch {
            print("error in settings.setSession")
        }
        sequencer.tracks[0].add(noteNumber: 0, velocity: 127, position: AKDuration(beats: 0), duration: AKDuration(beats: 1.0))
        sequencer.tracks[0].add(noteNumber: 0, velocity: 127, position: AKDuration(beats: 1), duration: AKDuration(beats: 1.0))
        sequencer.tracks[0].add(noteNumber: 0, velocity: 127, position: AKDuration(beats: 2), duration: AKDuration(beats: 1.0))
        sequencer.tracks[0].add(noteNumber: 0, velocity: 127, position: AKDuration(beats: 3), duration: AKDuration(beats: 1.0))
        sequencer.play()
    }
}
I figured it out by randomly stumbling upon a comment in another post. The volume is low because you need to enable "Audio, AirPlay, and Picture in Picture" in "Background Modes" under "Signing & Capabilities". Click the "+" button in the top left to add the capability.
As for playing the right drum sound: the right drum sound was, in fact, being played. However, I had set the MIDI note number too low, so it sounded like harsh static. If you're having this problem and have never worked with MIDI (like me), here is a description of MIDI note numbers: https://www.inspiredacoustics.com/en/MIDI_note_numbers_and_center_frequencies. The higher the number, the higher the frequency, so changing the MIDI note number changes the pitch at which your audio file plays back.
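For example (a hedged sketch, with the note number chosen purely for illustration), moving the notes up from 0 into a normal playing range lets the loaded sample sound at a usable pitch:

// Note number 0 (~8 Hz) pitches the sample far below its root and sounds like
// harsh static; something around 60 (middle C) plays it in a normal range.
let drumNote: MIDINoteNumber = 60
sequencer.tracks[0].add(noteNumber: drumNote,
                        velocity: 127,
                        position: AKDuration(beats: 0),
                        duration: AKDuration(beats: 1.0))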

iOS 12.4 AVAudioRecorder.record is returning false when app is in the background

First of all, this error only occurs on the latest iOS 12.4 release. The issue does NOT occur in the simulator and must be reproduced on a device. The problem is that the call to record on the AVAudioRecorder returns false once the app goes into the background. In all previous versions of iOS this did not happen. The Info.plist is updated with the NSMicrophoneUsageDescription key, and the app's capabilities include the Audio background mode.
I have written a small ViewController that shows the issue. Steps to recreate:
1) Update the info.plist file with the NSMicrophoneUsageDescription tag so that the app gets permission to use the microphone
2) Update the app capabilities to set Audio background mode
3) Run the application
4) Send the application to the background
5) The current recording will finish, but the call to start a new recording will fail.
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioRecorderDelegate {

    var recordLabel: UILabel!
    var recordingSession: AVAudioSession!
    var audioRecorder: AVAudioRecorder!
    var count: Int = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        recordingSession = AVAudioSession.sharedInstance()
        do {
            try recordingSession.setCategory(.playAndRecord, mode: .default)
            try recordingSession.setActive(true)
            recordingSession.requestRecordPermission() { [unowned self] allowed in
                DispatchQueue.main.async {
                    if allowed {
                        self.loadRecordingUI()
                    } else {
                        print("No permissions!!!")
                    }
                }
            }
        } catch {
            print("Exception in viewDidLoad!!!")
        }
    }

    func loadRecordingUI() {
        recordLabel = UILabel(frame: CGRect(x: 0, y: 0, width: 300, height: 21))
        recordLabel.center = CGPoint(x: 160, y: 285)
        recordLabel.textAlignment = .center
        recordLabel.text = "Waiting...."
        self.view.addSubview(recordLabel)
        setupRecorder()
        startRecording()
    }

    func setupRecorder() {
        let audioFilename = getDocumentsDirectory().appendingPathComponent("recording.m4a")
        let settings = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        do {
            audioRecorder = try AVAudioRecorder(url: audioFilename, settings: settings)
            audioRecorder.delegate = self
        } catch {
            print("Exception thrown in setupRecorder")
        }
    }

    func startRecording() {
        count += 1
        let ret = audioRecorder.record(forDuration: 10) // record for 10 seconds
        let txt = "Record returned " + ret.description + " for #\(count)"
        recordLabel.text = txt
        print(txt)
    }

    func getDocumentsDirectory() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        return paths[0]
    }

    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
        startRecording() // immediately start recording again
    }
}
Since the capability for the app to record in the background has been set, I expect the call to record on the AVAudioRecorder to return true
I filed feedback with Apple about this (audioRecorder.record() returns false while in the background) and got the answer that it is a new privacy protection restriction, i.e. they will not fix it:
"The behaviour mentioned is a change but is the correct behaviour going forward. Attempting to start audio recording in the background which had not been recording previously (and then interrupted by a call or Siri) will not work (essentially, trying to start recording from the background randomly will no longer work). This is a new privacy protection restriction introduced in 12.4."

AudioKit for iOS: Frequency Discrepancy on Simulator vs Device

I am using AudioKit to monitor frequency for a simple guitar tuner application, and I am seeing discrepancies in the reported frequency after updating from AudioKit ~4.2 to 4.4, Xcode 9.x to 10, and iOS 11 to 12. Before the updates I was getting correct frequency readings on my device. After updating, I get accurate results for a low E1 (82.4 Hz) in the simulator, but false readings on the device (alternating between roughly 23 and 47 kHz).
I have tried another device, but get the same results.
My viewDidLoad() setting up AudioKit is relatively simple, and I used the AudioKit playgrounds as a guideline:
override func viewDidLoad() {
    super.viewDidLoad()

    // Enable microphone tracking.
    AKSettings.audioInputEnabled = true
    let mic = AKMicrophone()
    let tracker = AKFrequencyTracker(mic)
    let silence = AKBooster(tracker, gain: 0)
    AudioKit.output = silence
    do {
        try AudioKit.start()
    } catch {
        print("AudioKit did not start!")
    }
    mic.start()
    tracker.start()

    // Track input frequency at 100 ms intervals.
    timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] (timer) in
        guard let this = self else { return }
        this.frequencyLabel.text = String(format: "Frequency: %.3f Hz", tracker.frequency)
        this.frequencyLabel.sizeToFit()
    }
}
As a side note, I am getting Objective-C runtime console output about AudioKit classes being implemented in two places. Could this be contributing to the issue?
objc[517]: Class AKRhodesPianoAudioUnit is implemented in both /private/var/containers/Bundle/Application/5A294050-2DB2-45C9-BB0A-3A0DE25E87C6/Tuner.app/Frameworks/AudioKitUI.framework/AudioKitUI (0x1058413f0) and /var/containers/Bundle/Application/5A294050-2DB2-45C9-BB0A-3A0DE25E87C6/Tuner.app/Tuner (0x104e177e8). One of the two will be used. Which one is undefined.
Any ideas? Thanks in advance!

AVAudioPlayerNode volume change is not applied immediately

Using AVFoundation and AVAudioPlayerNode to play sound with Xcode Version 9.2 (9C40b), deploying to iOS 11.2.
The problem is that when you change the volume, the change is not applied the first time you play the sound, and there are other odd effects.
To reproduce, start a new project in Xcode, select the iOS Game template, give it any name, and replace the code in GameScene.swift with:
import SpriteKit
import GameplayKit
import AVFoundation

class GameScene: SKScene {

    var entities = [GKEntity]()
    var graphs = [String: GKGraph]()

    let audioEngine = AVAudioEngine()
    let audioFilePlayer = AVAudioPlayerNode()
    var audioFile: AVAudioFile! = nil
    var audioFileBuffer: AVAudioPCMBuffer! = nil

    override func sceneDidLoad() {
        do {
            let path = Bundle.main.path(forResource: "sound", ofType: "caf")!
            let url = URL(fileURLWithPath: path)
            audioFile = try AVAudioFile(forReading: url)
            audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                               frameCapacity: UInt32(audioFile.length))
            try audioFile.read(into: audioFileBuffer!)
            audioEngine.attach(audioFilePlayer)
            audioEngine.connect(audioFilePlayer, to: audioEngine.mainMixerNode, format: audioFileBuffer?.format)
            try audioEngine.start()
        } catch {
            print(error)
        }

        let playAction = SKAction.sequence([
            SKAction.wait(forDuration: 3.0),
            SKAction.run { self.play(volume: 1.0) },
            SKAction.wait(forDuration: 1.0),
            SKAction.run { self.play(volume: 1.0) },
            SKAction.wait(forDuration: 1.0),
            SKAction.run { self.play(volume: 0.0) },
            SKAction.wait(forDuration: 1.0),
            SKAction.run { self.play(volume: 0.0) },
            SKAction.wait(forDuration: 1.0),
            SKAction.run { self.play(volume: 1.0) },
            SKAction.wait(forDuration: 1.0),
            SKAction.run { self.play(volume: 1.0) },
        ])
        self.run(playAction)
    }

    func play(volume: Float) {
        print("playing at \(volume)")
        audioFilePlayer.stop()
        audioFilePlayer.volume = volume
        audioFilePlayer.scheduleFile(audioFile, at: nil, completionHandler: nil)
        audioFilePlayer.play()
    }
}
Apologies for the sloppy optionals... Also, add a sound file called sound.caf to the project.
Console output is:
playing at 1.0
playing at 1.0
playing at 0.0
playing at 0.0
playing at 1.0
playing at 1.0
I would expect to hear: loud, loud, nothing, nothing, loud, loud.
I am actually hearing: loud, loud, loud, nothing, soft, loud
(the 'soft' is particularly weird)
I have also tried changing the master volume with:
audioEngine.mainMixerNode.outputVolume = volume
but the sounds are the same.
In older games I used OpenAL, but that requires Obj-C and gets pretty messy with bridging headers etc. Also, OpenAL is supposedly no longer supported. SKAction-based audio cannot handle lots of sounds repeating rapidly without glitches and scratches (it's a space shooter game...). The problem is the same in the simulator and on a device. Any help appreciated!
OK, I found the answer. The audio engine needs to be reset after volume changes in order for the changes to take effect immediately:
audioEngine.reset()
This play() function works:
func play(volume: Float) {
    print("playing at \(volume)")
    audioFilePlayer.volume = volume
    audioEngine.reset()
    audioFilePlayer.stop()
    audioFilePlayer.scheduleFile(audioFile, at: nil, completionHandler: nil)
    audioFilePlayer.play()
}
Yep, in case anyone else is interested: I also solved it by adding an AVAudioMixerNode between the input source and engine.mainMixerNode.
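A minimal sketch of that mixer-node workaround, reusing the names from the question above (the intermediate node and connection formats are assumptions); it replaces the direct player-to-mainMixerNode connection in sceneDidLoad:

// Route the player through its own mixer node instead of connecting it
// directly to mainMixerNode.
let playerMixer = AVAudioMixerNode()
audioEngine.attach(audioFilePlayer)
audioEngine.attach(playerMixer)
audioEngine.connect(audioFilePlayer, to: playerMixer, format: audioFileBuffer?.format)
audioEngine.connect(playerMixer, to: audioEngine.mainMixerNode, format: audioFileBuffer?.format)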

AVAudioUnitSampler generates sinewaves after headphones route change, iOS 11 iPhone

I'm facing a strange issue on iPhone (iOS 11) when using AVAudioUnitSampler.
Let's say I have an AVAudioUnitSampler initialised with a piano sound. Every time I connect or disconnect the headphones, I hear the piano sound plus a sine-wave tone added to it, and the tone gets louder the more times I connect/disconnect the headphones.
To me it feels as if, every time the headphones are plugged or unplugged, a new sampler audio unit were internally attached to the sound output (and, since it is uninitialised, it generates plain sine tones).
The following class already shows the problem. Note that I'm using AudioKit to handle MIDI signals and trigger the sampler (although on that end everything seems to work fine, i.e. startNote() and stopNote() get called properly):
import AVFoundation
import AudioKit

class MidiController: NSObject, AKMIDIListener {

    var midi = AKMIDI()
    var engine = AVAudioEngine()
    var samplerUnit = AVAudioUnitSampler()

    override public init() {
        super.init()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleRouteChange),
            name: .AVAudioSessionRouteChange,
            object: nil)
        midi.openInput()
        midi.addListener(self)
        engine.attach(samplerUnit)
        engine.connect(samplerUnit, to: engine.outputNode)
        startEngine()
    }

    func startEngine() {
        if !engine.isRunning {
            do {
                try self.engine.start()
            } catch {
                fatalError("couldn't start engine.")
            }
        }
    }

    @objc func handleRouteChange(notification: NSNotification) {
        let deadlineTime = DispatchTime.now() + .milliseconds(100)
        DispatchQueue.main.asyncAfter(deadline: deadlineTime) {
            self.startEngine()
        }
    }

    func receivedMIDINoteOn(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        if velocity > 0 {
            samplerUnit.startNote(noteNumber, withVelocity: 127, onChannel: 0)
        } else {
            samplerUnit.stopNote(noteNumber, onChannel: 0)
        }
    }

    func receivedMIDINoteOff(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        samplerUnit.stopNote(noteNumber, onChannel: 0)
    }
}
I have forked AudioKit and replaced the HelloWorld example with a minimal project with which I can reproduce this problem.
Also, I couldn't reproduce this problem on an iPad under either iOS 9.3 or 11, so this might be an iPhone-specific problem.
Any help or suggestion on how to continue debugging this would be very welcome; I'm quite puzzled by this and I'm not really an expert on iOS audio development.
Thanks!
You might try checking whether the engine is already running in handleRouteChange and, if it is, bounce it (stop and restart it) instead of just starting it. Let us know if that works.
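A hedged sketch of that suggestion, reusing the names from the class above (the 100 ms delay is kept from the original code):

@objc func handleRouteChange(notification: NSNotification) {
    let deadlineTime = DispatchTime.now() + .milliseconds(100)
    DispatchQueue.main.asyncAfter(deadline: deadlineTime) {
        // If the engine was already running when the route changed,
        // bounce it: stop it first, then start it again.
        if self.engine.isRunning {
            self.engine.stop()
        }
        self.startEngine()
    }
}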
