AudioKit / AVAudioEngine play nodes in sequence - buffer

I have some audio playing with AudioKit / AVAudioEngine and I would like to be able to play one node after another without any gaps. Here is what I have so far:
var file: AKAudioFile!
var node: AKPlayer!
var node1: AKPlayer!
var mixer: AKMixer!

func playInSequence() {
    do {
        file = try AKAudioFile(readFileName: "Song.m4a", baseDir: .resources)

        node = AKPlayer(audioFile: file)
        node.startTime = 10
        node.endTime = 20
        node.buffering = .dynamic
        node.prepare()
        node.completionHandler = {
            print("node complete")
        }

        node1 = AKPlayer(audioFile: file)
        node1.startTime = 20
        node1.endTime = 30
        node1.buffering = .dynamic
        node1.prepare()
        node1.completionHandler = {
            print("node1 complete")
        }

        mixer = AKMixer(node, node1)
        AudioKit.output = mixer
        try AudioKit.start()

        let future = AVAudioTime.now()
        node.play(at: future)
        let future2 = AVAudioTime.now() + 10
        node1.play(at: future2)
    } catch {
        print(error)
    }
}
Please forgive me if I'm doing it all wrong. I really don't know what I'm doing.
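No answer is recorded for this question, but one detail worth noting in the code above: the two play(at:) calls each read AVAudioTime.now() separately, so the 10-second offset is measured from two slightly different instants. A tighter hand-off (a sketch, untested, using the same AudioKit 4 AKPlayer API and AVAudioTime + seconds convenience as above) derives both start times from a single reference:

// One shared reference time; the second player starts exactly when the
// first segment ends (endTime - startTime = 20 - 10 = 10 seconds of audio).
let startTime = AVAudioTime.now() + 0.2                    // small lead time to prepare
node.play(at: startTime)
let handOff = startTime + (node.endTime - node.startTime)  // startTime + 10 s
node1.play(at: handOff)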

Related

How to modify Engine's output on iOS AudioKit

I want to apply different sound effects while playing music. This is my code:
class EffectConductor: ObservableObject, ProcessesPlayerInput {
    var engine = AudioEngine()
    var player = AudioPlayer()
    // var dryWetMixer: DryWetMixer
    var isPlaying = false

    init(path: String) {
        let mediaUrl = URL(fileURLWithPath: path)
        let file = try! AVAudioFile(forReading: mediaUrl)
        try! player.load(file: file, buffered: true)
        engine.output = player
    }

    @Published var effectType: Int = 0 {
        didSet {
            var node: Node? = nil
            switch effectType {
            case 1:
                node = AutoPanner(player)
            case 2:
                node = AutoWah(player)
            case 3:
                node = Compressor(player)
            case 4:
                node = DynamicRangeCompressor(player)
            case 5:
                node = Expander(player)
            case 6:
                node = Phaser(player)
            case 7:
                node = StringResonator(player)
            default:
                node = nil
            }
            if node == nil {
                print("effect nil")
                engine.output = player
            } else {
                engine.output = DryWetMixer(player, node!)
            }
        }
    }
}
When I call this code:
engine.start()
player.start()
the music plays. But when I click a button and set:
effectType = 2 // (or any other value)
to change the engine.output value, playback stops.
I tried this when clicking the button:
let progress = player.getCurrentTime()
if isPlaying {
    player.stop()
}
self.effectType = 3
if isPlaying {
    player.seek(time: progress)
    player.play()
}
But there is still a pause in playback. How can I solve this problem?
Thanks, everyone!
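No answer is recorded here, but one commonly suggested direction is to stop reassigning engine.output altogether: set it once to a Mixer and add or remove the effect chain as inputs to that mixer (Mixer.addInput/removeInput in AudioKit 5), so the engine's output connection never changes while audio is running. Below is a minimal sketch under that assumption; EffectSwitcher and select(_:) are hypothetical names, and the dry path stays audible alongside the wet one unless you add gain control.

import AVFoundation
import AudioKit

// Sketch (untested): engine.output is fixed on one Mixer; only the
// mixer's inputs change when the user picks a different effect.
class EffectSwitcher {
    let engine = AudioEngine()
    let player = AudioPlayer()
    let outputMixer = Mixer()          // engine.output points here permanently
    private var currentEffect: Node?

    init(file: AVAudioFile) throws {
        try player.load(file: file, buffered: true)
        outputMixer.addInput(player)   // dry path stays connected
        engine.output = outputMixer    // set once, never reassigned
        try engine.start()
    }

    // Hypothetical helper: pass e.g. AutoWah(player), or nil for no effect.
    func select(_ effect: Node?) {
        if let old = currentEffect {
            outputMixer.removeInput(old)   // drop the previous wet chain
        }
        currentEffect = effect
        if let effect = effect {
            outputMixer.addInput(effect)   // dry + wet are blended by the mixer
        }
    }
}

Whether a running engine tolerates this re-wiring completely glitch-free can depend on formats and OS version, so treat it as a starting point rather than a guaranteed fix.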

Set left and right headphone volume using two different sliders

I am generating a wave sound at different frequencies. The user should hear this sound through headphones only, and will set the left and right headphone volumes using two different sliders. To produce the wave sound I wrote the code below, which works perfectly.
The problem is: for the last 5 days I have been trying to set the volume for the left and right headphones separately, with no luck.
class Synth {
    // MARK: Properties
    public static let shared = Synth()

    public var volume: Float {
        set {
            audioEngine.mainMixerNode.outputVolume = newValue
        }
        get {
            audioEngine.mainMixerNode.outputVolume
        }
    }

    public var frequencyRampValue: Float = 0

    public var frequency: Float = 440 {
        didSet {
            if oldValue != 0 {
                frequencyRampValue = frequency - oldValue
            } else {
                frequencyRampValue = 0
            }
        }
    }

    private var audioEngine: AVAudioEngine

    private lazy var sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
        let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
        let localRampValue = self.frequencyRampValue
        let localFrequency = self.frequency - localRampValue
        let period = 1 / localFrequency
        for frame in 0..<Int(frameCount) {
            let percentComplete = self.time / period
            let sampleVal = self.signal(localFrequency + localRampValue * percentComplete, self.time)
            self.time += self.deltaTime
            self.time = fmod(self.time, period)
            for buffer in ablPointer {
                let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
                buf[frame] = sampleVal
            }
        }
        self.frequencyRampValue = 0
        return noErr
    }

    private var time: Float = 0
    private let sampleRate: Double
    private let deltaTime: Float
    private var signal: Signal

    // MARK: Init
    init(signal: @escaping Signal = Oscillator.square) {
        audioEngine = AVAudioEngine()
        let mainMixer = audioEngine.mainMixerNode
        let outputNode = audioEngine.outputNode
        let format = outputNode.inputFormat(forBus: 0)
        sampleRate = format.sampleRate
        deltaTime = 1 / Float(sampleRate)
        self.signal = signal
        let inputFormat = AVAudioFormat(commonFormat: format.commonFormat,
                                        sampleRate: format.sampleRate,
                                        channels: 1,
                                        interleaved: format.isInterleaved)
        audioEngine.attach(sourceNode)
        audioEngine.connect(sourceNode, to: mainMixer, format: inputFormat)
        audioEngine.connect(mainMixer, to: outputNode, format: nil)
        mainMixer.outputVolume = 0
        audioEngine.mainMixerNode.pan = 100 // this does not work
        //audioEngine.mainMixerNode.pan = 1.0 // this also does not work
        do {
            try audioEngine.start()
        } catch {
            print("Could not start engine: \(error.localizedDescription)")
        }
    }

    // This function will be called in the view controller to generate sound
    public func setWaveformTo(_ signal: @escaping Signal) {
        self.signal = signal
    }
}
With the above code I can hear the wave sound normally in both the left and right headphones.
I tried audioEngine.mainMixerNode.pan with values of 100 and -100, and also -1.0 and 1.0, but this made no change.
The allowable range for the pan value is [-1.0, 1.0]. The values you say you used are outside that range, so it's not surprising that they had no effect. Try 0.75 or -0.75 instead.
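Pan alone can't give two independent volume sliders, though. One way to get true per-ear control (a sketch, untested; the ramp and phase-wrap logic from the original render block is omitted for brevity) is to render the source node in a stereo, non-interleaved format and scale each channel with its own gain. leftGain and rightGain below are hypothetical properties driven by the two sliders.

// Hypothetical per-ear gains, set by the two sliders (0.0 ... 1.0).
var leftGain: Float = 1.0
var rightGain: Float = 1.0

private lazy var sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let sampleVal = self.signal(self.frequency, self.time)
        self.time += self.deltaTime
        // In a non-interleaved stereo format, buffer 0 is left, buffer 1 is right.
        for (channel, buffer) in ablPointer.enumerated() {
            let buf = UnsafeMutableBufferPointer<Float>(buffer)
            buf[frame] = sampleVal * (channel == 0 ? self.leftGain : self.rightGain)
        }
    }
    return noErr
}

// In init, connect with an explicit 2-channel format instead of channels: 1:
// let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: format.sampleRate, channels: 2)
// audioEngine.connect(sourceNode, to: mainMixer, format: stereoFormat)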

Audiokit AKSampler not playing sounds

I'm currently trying to get my AKSampler to play the sounds I send it, but I'm not having much luck getting any audio output. My AKMIDICallbackInstrument is properly logging the notes playing (although I'm seeing the print for each note twice). However, the call to my sampler is not producing any audio and I can't figure out why.
class Sequencer {
    var sampler: AKSampler
    var sequencer: AKAppleSequencer
    var mixer: AKMixer

    init() {
        sampler = AKSampler()
        sequencer = AKAppleSequencer()
        mixer = AKMixer(sampler)
        let midicallback = AKMIDICallbackInstrument()
        let url = Bundle.main.url(forResource: "UprightPianoKW-20190703", withExtension: "sfz")!
        let track = sequencer.newTrack()
        track?.setMIDIOutput(midicallback.midiIn)
        sampler.loadSFZ(url: url)
        // generate some notes and add them to the track
        generateSequence()
        midicallback >>> mixer
        AudioKit.output = mixer
        AKSettings.playbackWhileMuted = true
        AKSettings.audioInputEnabled = true
        midicallback.callback = { status, note, vel in
            guard let status = AKMIDIStatus(byte: status),
                  let type = status.type,
                  type == .noteOn else { return print("note off: \(note)") }
            print("note on: \(note)")
            self.sampler.play(noteNumber: note, velocity: vel)
        }
    }

    func play() {
        try? AudioKit.start()
        sequencer.rewind()
        sequencer.play()
        try? AudioKit.stop()
    }

    func stop() {
        sequencer.stop()
    }
}
You need to connect your sampler to the mixer:
sampler >>> mixer
FWIW, midicallback >>> mixer isn't necessary with AKAppleSequencer/AKMIDICallbackInstrument, although it would be with AKSequencer/AKCallbackInstrument.
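In context, the relevant wiring in init() would then be:

sampler >>> mixer         // route the sampler's audio into the mixer
AudioKit.output = mixer   // the mixer feeds the engine output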

No sound with AKSequencer and AKSampler chaining

I'm using AudioKit 4.9.5.
I'm trying to play scales using AKSequencer.
Here's how I'm using AudioKit.
The init:
AKAudioFile.cleanTempDirectory()
AKSettings.bufferLength = .medium
AKSettings.playbackWhileMuted = true
AKSettings.audioInputEnabled = true
tracker = AKFrequencyTracker(mic)
silence = AKBooster(tracker, gain: 0)
try? AKSettings.setSession(category: .playAndRecord,
                           with: [.defaultToSpeaker, .mixWithOthers])
mixer = AKMixer(silence, conductor.sampler)
AudioKit.output = mixer
Next to this, I'm initializing the conductor:
init() {
    let info = ProcessInfo.processInfo
    let begin = info.systemUptime
    let soundsFolder = Bundle.main.bundleURL.path
    AKSettings.bufferLength = .medium
    AKSettings.enableLogging = true
    // Signal Chain
    sampler = AKSampler()
    sampler.loadSFZ(path: soundsFolder, fileName: "Sax.sfz")
    sampler.attackDuration = 0.01
    sampler.decayDuration = 0.1
    sampler.sustainLevel = 0.8
    sampler.releaseDuration = 0.5
    sequencer = Sequencer(name: "Scale", targetNode: sampler)
    let elapsedTime = info.systemUptime - begin
    print("Time to setup sampler \(elapsedTime) seconds")
}
Finally, my custom sequencer:
self.name = name
self.targetNode = targetNode
self.track = AKSequencerTrack(targetNode: targetNode) //target node is my sampler
self.sequencer = AKSequencer(targetNode: targetNode) //target node is my sampler
And this is how I'm creating the tracks:
let newTrack = AKSequencerTrack(targetNode: targetNode)
for step in track.steps {
    for note in step.notes {
        newTrack.add(noteNumber: MIDINoteNumber(note.rawValue),
                     position: step.position,
                     duration: step.duration)
    }
}
self.track = sequencer.addTrack(for: newTrack)
sequencer.tempo = tempo.bpm
sequencer.length = newTrack.length
sequencer.loopEnabled = loopEnabled
I don't know why there's no sound. Maybe I'm missing something in the node chaining?
The tracks just need to be in the signal chain, as in my answer here: How to play MIDI with AudioKit's new AKSequencer
Here's sample pseudocode that works:
let sequencerMixer = AKMixer()
let sampler = AKSampler()
...
sequencer = AKSequencer(targetNode: sampler)
...
let newTrack = sequencer.addTrack(for: sampler)
newTrack >>> sequencerMixer
...
AudioKit.output = AKMixer(sequencerMixer, sampler)
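Put together, the whole chain might look like this (a sketch expanding the pseudocode above; the placement of try AudioKit.start() and the sequencer.play() call are assumptions):

let sampler = AKSampler()
let sequencerMixer = AKMixer()
let sequencer = AKSequencer(targetNode: sampler)

let newTrack = sequencer.addTrack(for: sampler)  // the track drives the sampler
newTrack >>> sequencerMixer                      // and must sit in the signal chain

AudioKit.output = AKMixer(sequencerMixer, sampler)
try AudioKit.start()
sequencer.play()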

Clicks / Distortion in AudioKit

When I have a bunch (20-40) of samples playing and overlapping each other simultaneously, the sound sometimes starts getting distorted, and then some waving, oscillating, and clicking begins. A similar sound happens when the app crashes while samples are playing: an abrupt, crunchy halt.
Notice the waviness begins between 0:05 and 0:10; nasty clicks start around 0:15.
Listen Here
How can I make it smoother? I am spawning AKPlayer objects (AudioKit 4.1) that play 4-8 second .wav files. Those go into AKBoosters, which go into AKMixers, which go into the final AKMixer for output.
Edit:
Many PenAudioNodes get plugged into the mixer of the AudioReceiver singleton.
Here's my AudioReceiver singleton:
class AudioReceiver {
    static var sharedInstance = AudioReceiver()

    private var audioNodes = [UUID: AudioNode]()
    private let mixer = AKMixer()
    private let queue = DispatchQueue(label: "audio-queue")

    // MARK: - Setup & Teardown
    init() {
        AudioKit.output = mixer // peakLimiter
        AudioKit.start()
    }

    // MARK: - Public
    func audioNodeBegan(_ message: AudioNodeMessage) {
        queue.async {
            var audioNode: AudioNode?
            switch message.senderType {
            case .pen:
                audioNode = PenAudioNode()
            case .home:
                audioNode = LoopingAudioNode(with: AudioHelper.homeLoopFile())
            default:
                break
            }
            if let audioNode = audioNode {
                self.audioNodes[message.senderId] = audioNode
                self.mixer.connect(input: audioNode.output)
                audioNode.start(message)
            }
        }
    }

    func audioNodeMoved(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.update(message)
            }
        }
    }

    func audioNodeEnded(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.stop(message)
            }
            self.audioNodes[message.senderId] = nil
        }
    }
}
Here's my PenAudioNode:
class PenAudioNode {
    fileprivate var mixer: AKMixer?
    fileprivate var playersBoosters = [AKPlayer: AKBooster]()
    fileprivate var finalOutput: AKNode?
    fileprivate let file: AKAudioFile = AudioHelper.randomBellSampleFile()

    // MARK: - Setup & Teardown
    init() {
        mixer = AKMixer()
        finalOutput = mixer!
    }
}

extension PenAudioNode: AudioNode {
    var output: AKNode {
        return finalOutput!
    }

    func start(_ message: AudioNodeMessage) {
    }

    func update(_ message: AudioNodeMessage) {
        if let velocity = message.velocity {
            let newVolume = Swift.min((velocity / 50) + 0.1, 1)
            mixer!.volume = newVolume
        }
        if let isClimactic = message.isClimactic, isClimactic {
            let player = AKPlayer(audioFile: file)
            player.completionHandler = { [weak self] in
                self?.playerCompleted(player)
            }
            let booster = AKBooster(player)
            playersBoosters[player] = booster
            booster.rampTime = 1
            booster.gain = 0
            mixer!.connect(input: booster)
            player.play()
            booster.gain = 1
        }
    }

    func stop(_ message: AudioNodeMessage) {
        for (_, booster) in playersBoosters {
            booster.gain = 0
        }
        DispatchQueue.global().asyncAfter(deadline: DispatchTime.now() + 1) {
            self.mixer!.stop()
            self.output.disconnectOutput()
        }
    }

    private func playerCompleted(_ player: AKPlayer) {
        playersBoosters.removeValue(forKey: player)
    }
}
This sounds like you are not releasing objects, and you are eventually overloading the audio engine with too many instances of processing nodes connected in the graph. In particular, not releasing AKBoosters will cause an issue like this. I can't really tell what your code is doing, but if you are spawning objects without releasing them properly, it will lead to garbled audio.
You want to conserve objects as much as possible and make sure you are using the absolute minimum number of AKNode-based processing nodes.
There are various ways to debug this, but you can start by printing out the current state of the AVAudioEngine:
print(AudioKit.engine.description)
That will show how many nodes you have connected in the graph at any given moment.
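Following that advice, one way to actually release each finished chain in the PenAudioNode above (a sketch, untested; it assumes AKNode.detach() is available in this AudioKit 4 version and reuses the question's playersBoosters dictionary):

private func playerCompleted(_ player: AKPlayer) {
    if let booster = playersBoosters.removeValue(forKey: player) {
        booster.disconnectOutput()  // pull the booster out of the mixer
        booster.detach()            // detach its AVAudioNode from the engine
    }
    player.detach()                 // release the player's node as well
}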
