Synchronizing AKPlayer with AKSamplerMetronome - AudioKit

I am trying to use AKSamplerMetronome as a master clock in my (sort of) multi-audio-file playback project. I want the AKPlayers to start in sync with the metronome's downbeat. Mixing AKPlayer and AKSamplerMetronome into AudioKit.output was successful; however, I am struggling to connect AKPlayer.start with AKSamplerMetronome.beatTime (or something else I haven't figured out) so that playback starts in sync with the metronome's downbeat (and repeats every time the metronome hits the downbeat). Here's what I've written:
class ViewController: UIViewController {
    let metronome = AKSamplerMetronome()
    let player = AKPlayer(audioFile: try! AKAudioFile(readFileName: "loop.wav"))
    let mixer = AKMixer()

    func startAudioEngine() {
        do {
            try AudioKit.start()
        } catch {
            print(error)
            fatalError()
        }
    }

    func makeConnections() {
        player >>> mixer
        metronome >>> mixer
        AudioKit.output = mixer
    }

    func startMetronome() {
        metronome.tempo = 120.0
        metronome.beatVolume = 1.0
        metronome.play()
    }

    func preparePlayer() {
        player.isLooping = true
        player.buffering = .always
        player.prepare()
        // I want AKPlayer to repeat based on the metronome's downbeat.
    }

    func startPlayer() {
        let startTime = AVAudioTime.now() + 0.25
        player.start(at: startTime)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        makeConnections()
        startAudioEngine()
        preparePlayer()
        startPlayer()
        startMetronome()
    }
}
My problem is that AKPlayer's start(at:) doesn't accept AKSamplerMetronome's properties, maybe because they aren't compatible with AVAudioTime? I tried something like:
let startTime = metronome.beatTime + 0.25
player.start(at: startTime)
But this doesn't compile ("cannot convert value of type 'Double' to expected argument type 'AVAudioTime?'"). It would be extremely helpful if someone could help me explore Swift/AudioKit. <3

You are calling the AVAudioTime playback function with a Double parameter, which is incorrect. If you want to start the AKPlayer with a seconds parameter, use player.play(when: time).
In general, you're close. This is how you do it:
let startTime: Double = 1
let hostTime = mach_absolute_time()
let now = AVAudioTime(hostTime: hostTime)
let avTime = now.offset(seconds: startTime)
metronome.setBeatTime(0, at: avTime)
player.play(at: avTime)
Basically you need to give a common clock to each unit (mach_absolute_time()), then use AVAudioTime to start them at the exact same time. The metronome.setBeatTime call tells the metronome to reset its 0 point to the passed-in avTime.
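To handle the repeat-on-downbeat part of the question, you can also compute the time of the next downbeat from the metronome's beatTime and tempo and schedule the player there. A minimal sketch, assuming a 4-beat bar (beatsPerBar is an illustrative constant, not AudioKit API) and AudioKit's AVAudioTime.offset(seconds:) extension:
// Sketch: schedule the player to start on the metronome's next downbeat.
let beatsPerBar = 4.0
let secondsPerBeat = 60.0 / metronome.tempo
// How many beats remain until the metronome next lands on beat 0
let beatsUntilDownbeat = beatsPerBar - metronome.beatTime.truncatingRemainder(dividingBy: beatsPerBar)
let now = AVAudioTime(hostTime: mach_absolute_time())
let downbeatTime = now.offset(seconds: beatsUntilDownbeat * secondsPerBeat)
player.play(at: downbeatTime)
Since player.isLooping is true, one scheduled start is enough to keep it repeating, but it will only stay aligned with the downbeat if the loop file is an exact number of bars long.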

Related

AudioKit AKSampler not playing sounds

I'm currently trying to get my AKSampler to play the sounds that I send it, but not having much luck getting audio to output. My AKMIDICallbackInstrument is properly logging the notes playing (although I'm seeing the print for each note twice...). However, the call to my sampler is not producing any audio, and I can't figure out why.
class Sequencer {
    var sampler: AKSampler
    var sequencer: AKAppleSequencer
    var mixer: AKMixer

    init() {
        sampler = AKSampler()
        sequencer = AKAppleSequencer()
        mixer = AKMixer(sampler)
        let midicallback = AKMIDICallbackInstrument()
        let url = Bundle.main.url(forResource: "UprightPianoKW-20190703", withExtension: "sfz")!
        let track = sequencer.newTrack()
        track?.setMIDIOutput(midicallback.midiIn)
        sampler.loadSFZ(url: url)
        // generate some notes and add them to the track
        generateSequence()
        midicallback >>> mixer
        AudioKit.output = mixer
        AKSettings.playbackWhileMuted = true
        AKSettings.audioInputEnabled = true
        midicallback.callback = { status, note, vel in
            guard let status = AKMIDIStatus(byte: status),
                let type = status.type,
                type == .noteOn else { return print("note off: \(note)") }
            print("note on: \(note)")
            self.sampler.play(noteNumber: note, velocity: vel)
        }
    }

    func play() {
        try? AudioKit.start()
        sequencer.rewind()
        sequencer.play()
        try? AudioKit.stop()
    }

    func stop() {
        sequencer.stop()
    }
}
You need to connect your sampler to the mixer:
sampler >>> mixer
FWIW, midicallback >>> mixer isn't necessary with AKAppleSequencer/AKMIDICallbackInstrument, although it would be with AKSequencer/AKCallbackInstrument.
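Put together, the relevant routing lines of the init become (a sketch; the rest of the setup is unchanged):
// Corrected routing sketch: the sampler's audio feeds the mixer, and the
// callback instrument drives the sampler through its callback alone.
sampler >>> mixer
AudioKit.output = mixer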

Swift: Trying to control time in AVAudioPlayerNode using UISlider

I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound.
To get the current time of the player I'm doing this:
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        get {
            if let nodeTime: AVAudioTime = self.lastRenderTime, let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
                return Double(playerTime.sampleTime) / playerTime.sampleRate
            }
            return 0
        }
    }
}
I have a slider that indicates the current time of the audio. When the user changes the slider value, on the .ended event I have to change the current time of the player to that indicated by the slider.
To do so:
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        if let nodetime = self.lastRenderTime {
            let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodetime)!
            let sampleRate = self.outputFormat(forBus: 0).sampleRate
            let newsampletime = AVAudioFramePosition(Int(sampleRate * Double(value)))
            let length = duration - value
            let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
            self.stop()
            if framestoplay > 1000 {
                self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
            }
        }
        self.play()
    }
}
However, my seekTo function is not working correctly (printing currentTime before and after the call always shows a negative value, around -0.02). What am I doing wrong, and is there a simpler way to change the currentTime of the player?
I ran into the same issue. Apparently framestoplay was always 0, which happened because of the sample rate: the value of playerTime.sampleRate was always 0 in my case. So,
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
must be replaced with
let framestoplay = AVAudioFrameCount(Float(sampleRate) * length)
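With that substitution, the whole function looks like this (a sketch under the same assumptions as the question's code):
extension AVAudioPlayerNode {
    // Sketch of the corrected seek: use the output format's sample rate
    // throughout, instead of playerTime.sampleRate, which can report 0.
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        let sampleRate = self.outputFormat(forBus: 0).sampleRate
        let newSampleTime = AVAudioFramePosition(sampleRate * Double(value))
        let framesToPlay = AVAudioFrameCount(Float(sampleRate) * (duration - value))
        self.stop()
        if framesToPlay > 1000 {
            self.scheduleSegment(audioFile, startingFrame: newSampleTime, frameCount: framesToPlay, at: nil, completionHandler: nil)
        }
        self.play()
    }
}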

Clicks / Distortion in AudioKit

When I have a bunch (20-40) of samples playing and overlapping each other simultaneously, it sometimes starts getting distorted, and then some waving, oscillating, and clicking begins to happen. A similar sound happens when samples are playing and the app crashes: an abrupt, crunchy halt.
Notice the waviness begins between 0:05 and 0:10; nasty clicks start around 0:15.
Listen Here
How can I make it smoother? I am spawning AKPlayer objects (from 4.1) that play 4-8 second .wav files. Those go into AKBoosters, which go into AKMixers, which go into the final AKMixer for output.
Edit:
Many PenAudioNodes get plugged into the mixer of the AudioReceiver singleton.
Here's my AudioReceiver singleton:
class AudioReceiver {
    static var sharedInstance = AudioReceiver()
    private var audioNodes = [UUID: AudioNode]()
    private let mixer = AKMixer()
    private let queue = DispatchQueue(label: "audio-queue")

    // MARK: - Setup & Teardown
    init() {
        AudioKit.output = mixer //peakLimiter
        AudioKit.start()
    }

    // MARK: - Public
    func audioNodeBegan(_ message: AudioNodeMessage) {
        queue.async {
            var audioNode: AudioNode?
            switch message.senderType {
            case .pen:
                audioNode = PenAudioNode()
            case .home:
                audioNode = LoopingAudioNode(with: AudioHelper.homeLoopFile())
            default:
                break
            }
            if let audioNode = audioNode {
                self.audioNodes[message.senderId] = audioNode
                self.mixer.connect(input: audioNode.output)
                audioNode.start(message)
            }
        }
    }

    func audioNodeMoved(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.update(message)
            }
        }
    }

    func audioNodeEnded(_ message: AudioNodeMessage) {
        queue.async {
            if let audioNode = self.audioNodes[message.senderId] {
                audioNode.stop(message)
            }
            self.audioNodes[message.senderId] = nil
        }
    }
}
Here's my PenAudioNode:
class PenAudioNode {
fileprivate var mixer: AKMixer?
fileprivate var playersBoosters = [AKPlayer : AKBooster]()
fileprivate var finalOutput: AKNode?
fileprivate let file: AKAudioFile = AudioHelper.randomBellSampleFile()
//MARK: - Setup & Teardown
init() {
mixer = AKMixer()
finalOutput = mixer!
}
}
extension PenAudioNode: AudioNode {
var output: AKNode {
return finalOutput!
}
func start(_ message: AudioNodeMessage) {
}
func update(_ message: AudioNodeMessage) {
if let velocity = message.velocity {
let newVolume = Swift.min((velocity / 50) + 0.1, 1)
mixer!.volume = newVolume
}
if let isClimactic = message.isClimactic, isClimactic {
let player = AKPlayer(audioFile: file)
player.completionHandler = { [weak self] in
self?.playerCompleted(player)
}
let booster = AKBooster(player)
playersBoosters[player] = booster
booster.rampTime = 1
booster.gain = 0
mixer!.connect(input: booster)
player.play()
booster.gain = 1
}
}
func stop(_ message: AudioNodeMessage) {
for (_, booster) in playersBoosters {
booster.gain = 0
}
DispatchQueue.global().asyncAfter(deadline: DispatchTime.now() + 1) {
self.mixer!.stop()
self.output.disconnectOutput()
}
}
private func playerCompleted(_ player: AKPlayer) {
playersBoosters.removeValue(forKey: player)
}
}
This sounds like you are not releasing objects and are eventually overloading the audio engine with too many processing nodes connected in the graph. In particular, not releasing AKBoosters will cause an issue like this. I can't really tell what your code is doing, but if you are spawning objects without releasing them properly, it will lead to garbled audio.
You want to conserve objects as much as possible and make sure you are using the absolute minimum amount of AKNode based processing.
There are various ways to debug this, but you can start by printing out the current state of the AVAudioEngine:
AudioKit.engine.description
That will show how many nodes you have connected in the graph at any given moment.
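One concrete leak to check in the code above: playerCompleted removes the booster from the dictionary, but nothing ever detaches the finished player's chain from the graph. A sketch of a fuller cleanup, reusing the disconnectOutput() call that stop(_:) already uses:
private func playerCompleted(_ player: AKPlayer) {
    // Sketch: detach the finished chain from the graph so the node count
    // doesn't grow without bound, then release the references.
    if let booster = playersBoosters[player] {
        booster.disconnectOutput()
    }
    player.disconnectOutput()
    playersBoosters.removeValue(forKey: player)
}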

Allowing background audio with Swift not working

I want to allow background audio while the app is not in focus. I currently have this code, which should allow that:
do {
    try AKSettings.setSession(category: .playback, with: .mixWithOthers)
} catch {
    print("error")
}
AKSettings.playbackWhileMuted = true
I also have the setting 'Audio, Airplay and Picture in Picture' enabled in capabilities settings. However, when I press the home button on my device the audio doesn't keep playing. What am I doing wrong? I am using AudioKit to produce sounds if that matters.
I am using a singleton to house all of the AudioKit components which I named AudioPlayer.swift. Here is what I have in my AudioPlayer.swift singleton file:
class AudioPlayer: NSObject {
    var currentFrequency = String()
    var soundIsPlaying = false
    var leftOscillator = AKOscillator()
    var rightOscillator = AKOscillator()
    var rain = try! AKAudioFile()
    var rainPlayer: AKAudioPlayer!
    var envelope = AKAmplitudeEnvelope()

    override init() {
        super.init()
        do {
            try AKSettings.setSession(category: .playback, with: .mixWithOthers)
        } catch {
            print("error")
        }
        AKSettings.playbackWhileMuted = true
        AudioKit.output = envelope
        AudioKit.start()
    }

    func setupFrequency(left: AKOscillator, right: AKOscillator, frequency: String) {
        currentFrequency = frequency
        leftOscillator = left
        rightOscillator = right
        let leftPanner = AKPanner(leftOscillator)
        leftPanner.pan = -1
        let rightPanner = AKPanner(rightOscillator)
        rightPanner.pan = 1

        // Set up rain and rainPlayer
        do {
            rain = try AKAudioFile(readFileName: "rain.wav")
            rainPlayer = try AKAudioPlayer(file: rain, looping: true, deferBuffering: false, completionHandler: nil)
        } catch { print(error) }

        let mixer = AKMixer(leftPanner, rightPanner, rainPlayer)

        // Put mixer in sound envelope
        envelope = AKAmplitudeEnvelope(mixer)
        envelope.attackDuration = 2.0
        envelope.decayDuration = 0
        envelope.sustainLevel = 1
        envelope.releaseDuration = 0.2

        // Start AudioKit stuff
        AudioKit.output = envelope
        AudioKit.start()
        leftOscillator.start()
        rightOscillator.start()
        rainPlayer.start()
        envelope.start()
        soundIsPlaying = true
    }
}
And here is an example of one of my sound effect view controllers, which references the AudioPlayer singleton to send it a certain frequency (I have about a dozen of these view controllers, each with its own frequency settings):
class CalmView: UIViewController {
    let leftOscillator = AKOscillator()
    let rightOscillator = AKOscillator()

    override func viewDidLoad() {
        super.viewDidLoad()
        leftOscillator.amplitude = 0.3
        leftOscillator.frequency = 220
        rightOscillator.amplitude = 0.3
        rightOscillator.frequency = 230
    }

    @IBAction func playSound(_ sender: Any) {
        if shared.soundIsPlaying == false {
            AudioKit.stop()
            shared.setupFrequency(left: leftOscillator, right: rightOscillator, frequency: "Calm")
        } else if shared.soundIsPlaying == true && shared.currentFrequency != "Calm" {
            AudioKit.stop()
            shared.leftOscillator.stop()
            shared.rightOscillator.stop()
            shared.rainPlayer.stop()
            shared.envelope.stop()
            shared.setupFrequency(left: leftOscillator, right: rightOscillator, frequency: "Calm")
        } else {
            shared.soundIsPlaying = false
            shared.envelope.stop()
        }
    }
}
I instantiated the AudioPlayer singleton in my ViewController.swift file.
It depends on when you are doing your configuration in relation to when AudioKit is started. If you're using AudioKit, you should use its AKSettings to manage your session category: not only the playback category but also the mixWithOthers option. By default, setSession does this:
/// Set the audio session type
@objc open static func setSession(category: SessionCategory,
                                  with options: AVAudioSessionCategoryOptions = [.mixWithOthers]) throws {
So you'd do something like this in your ViewController:
do {
    if #available(iOS 10.0, *) {
        try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP])
    } else {
        // Fallback on earlier versions
    }
} catch {
    print("Errored setting category.")
}
So I think it's a matter of getting that straight. It might also help to have inter-app audio set up. If you still have trouble, provide more information and I can help more, but this is as good an answer as I can muster based on the info you've given so far.
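For the background-audio goal specifically, here is a minimal ordering sketch based on the question's own singleton (it assumes the 'Audio, AirPlay and Picture in Picture' capability is enabled, which maps to the audio UIBackgroundMode in Info.plist): configure the session before starting AudioKit, and start the engine only once.
// Sketch: set the session category first, then start the engine once.
// .playback (without recording) is all that background playback needs here.
do {
    try AKSettings.setSession(category: .playback, with: [.mixWithOthers])
} catch {
    print("Errored setting category: \(error)")
}
AKSettings.playbackWhileMuted = true
AudioKit.output = envelope
AudioKit.start()
Repeated AudioKit.stop()/start() cycles from the view controllers (as in playSound above) can also interfere with keeping the session alive in the background.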

How to detect max dB Swift

I'm trying to detect dB on an iOS device; however, I am new to AVFoundation and can't really figure it out. I have come across this post: iOS - Detect Blow into Mic and convert the results! (swift), but it is not working for me.
My current code is this:
import Foundation
import UIKit
import AVFoundation
import CoreAudio

class ViewController: UIViewController {
    var recorder: AVAudioRecorder!
    var levelTimer = NSTimer()
    var lowPassResults: Double = 0.0

    override func viewDidLoad() {
        super.viewDidLoad()

        // Make an AudioSession, set it to PlayAndRecord, and make it active
        var audioSession: AVAudioSession = AVAudioSession.sharedInstance()
        audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: nil)
        audioSession.setActive(true, error: nil)

        // Set up the URL for the audio file
        var documents: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
        var str = documents.stringByAppendingPathComponent("recordTest.caf")
        var url = NSURL.fileURLWithPath(str as String)

        // Make a dictionary to hold the recording settings so we can instantiate our AVAudioRecorder
        var recordSettings: [NSObject: AnyObject] = [
            AVFormatIDKey: kAudioFormatAppleIMA4,
            AVSampleRateKey: 44100.0,
            AVNumberOfChannelsKey: 2,
            AVEncoderBitRateKey: 12800,
            AVLinearPCMBitDepthKey: 16,
            AVEncoderAudioQualityKey: AVAudioQuality.Max.rawValue
        ]

        // Declare a variable to store the returned error if we have a problem instantiating our AVAudioRecorder
        var error: NSError?

        // Instantiate an AVAudioRecorder
        recorder = AVAudioRecorder(URL: url, settings: recordSettings, error: &error)

        // If there's an error, print it; otherwise run prepareToRecord and meteringEnabled to turn on metering (must be run in that order)
        if let e = error {
            print(e.localizedDescription)
        } else {
            recorder.prepareToRecord()
            recorder.meteringEnabled = true
            // Start recording
            recorder.record()
            // Instantiate a timer to be called with whatever frequency we want to grab metering values
            self.levelTimer = NSTimer.scheduledTimerWithTimeInterval(0.02, target: self, selector: #selector(ViewController.levelTimerCallback), userInfo: nil, repeats: true)
        }
    }

    // This selector/function is called every time our timer (levelTimer) fires
    func levelTimerCallback() {
        // We have to update meters before we can get the metering values
        recorder.updateMeters()

        // Print to the console if we are beyond a threshold value. Here I've used -7
        if recorder.averagePowerForChannel(0) > -7 {
            print("Dis be da level I'm hearin' you in dat mic ")
            print(recorder.averagePowerForChannel(0))
            print("Do the thing I want, mofo")
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
I was building a movie-making app and learned something about metering sound level in dB. The raw value from recorder.averagePowerForChannel is not really the dB level of the sound; it reports a value in the range -160 to 0 (dBFS), so we need some modification to make the data more usable. I found a snippet that converts this raw value into a normalized level (sorry, I forget where I found it!). Here is the code:
/**
 Convert a raw dBFS value to a normalized level
 - author: RÅGE_Devil_Jåmeson
 - date: (2016-07-13) 20:07:03
 - parameter dBFSValue: raw value of averagePowerForChannel (-160...0)
 - returns: a normalized level in the range 0...1
 */
func dBFS_convertTo_dB(dBFSValue: Float) -> Float {
    var level: Float = 0.0
    let peak_bottom: Float = -60.0 // dBFS is -160...0, so this can be -80 or -60
    if dBFSValue < peak_bottom {
        level = 0.0
    } else if dBFSValue >= 0.0 {
        level = 1.0
    } else {
        let root: Float = 2.0
        let minAmp: Float = powf(10.0, 0.05 * peak_bottom)
        let inverseAmpRange: Float = 1.0 / (1.0 - minAmp)
        let amp: Float = powf(10.0, 0.05 * dBFSValue)
        let adjAmp: Float = (amp - minAmp) * inverseAmpRange
        level = powf(adjAmp, 1.0 / root)
    }
    return level
}
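A usage sketch from inside the question's levelTimerCallback(), averaging both channels since the recording is set up with 2 channels (the channel-1 call is an assumption based on that setting):
// Sketch: convert both channels' meter values and average them.
recorder.updateMeters()
let left = dBFS_convertTo_dB(dBFSValue: recorder.averagePowerForChannel(0))
let right = dBFS_convertTo_dB(dBFSValue: recorder.averagePowerForChannel(1))
let level = (left + right) / 2.0 // normalized 0...1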
I noticed that you are recording with 2 channels, so it will be a little different from my code. I hope this helps you out or gives you some ideas :D
LAST UPDATE: Changed the code to Swift.
