Swift - AudioKit Sequencer with Oscillator (AKOscillatorBank). Frequencies won't play at higher range (MIDI note 120+) - audiokit

I'm learning how to use AudioKit. I'm trying to play around with the sequencer and an oscillator. Everything is working dandy, but I noticed that when I feed a higher frequency (MIDI note) to an oscillator inside a sequencer track, it renders the same audio for that frequency and all higher ones. If I pass the same frequencies to the oscillator by itself, I can hear the variance.
My initial setup:
let oscillator = AKOscillatorBank()
let oscillatorTrackIndex = 0
let sequencer = AKAppleSequencer()
let midi = AKMIDI()
var scale: [Int] = []
let sequenceLength = AKDuration(beats: 8.0)

func setupTracks() {
    let midiNode = AKMIDINode(node: oscillator)
    _ = sequencer.newTrack()
    sequencer.setLength(sequenceLength)
    AudioKit.output = midiNode
    try! AudioKit.start()
    midiNode.enableMIDI(midi.client, name: "midiNode midi in")
    sequencer.setTempo(currentTempo) // currentTempo is defined elsewhere
    sequencer.enableLooping()
    sequencer.play()
}
My method:
func generateSequence(_ stepSize: Float = 1/4, clear: Bool = true) {
    if clear { sequencer.tracks[oscillatorTrackIndex].clear() }
    let numberOfSteps = Int(Float(sequenceLength.beats) / stepSize)
    for i in 0 ..< numberOfSteps {
        if i % 4 == 0 {
            sequencer.tracks[0].add(noteNumber: 140, velocity: 127,
                                    position: AKDuration(beats: Double(i)),
                                    duration: AKDuration(beats: 0.5))
        } else {
            sequencer.tracks[0].add(noteNumber: 200, velocity: 127,
                                    position: AKDuration(beats: Double(i)),
                                    duration: AKDuration(beats: 0.5))
        }
    }
}
As you can see, I'm using note numbers 140 and 200. When the sequencer plays these notes, they render the same audio. If I convert them with .midiNoteToFrequency() and feed the frequencies to the oscillator by itself, I can hear the difference.
Thanks!

In the MIDI spec, there are only 7 bits for the note number, allowing values between 0 and 127. Presumably, values outside of this range are clamped into it to avoid unexpected crashes (this is probably happening internally in Apple's MusicSequence, since I don't think AKAppleSequencer or AKMusicTrack do it explicitly).
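A minimal sketch of that constraint in practice: clamping out-of-range values yourself makes the behaviour explicit instead of relying on whatever the sequencer does internally (the clampedNote helper is hypothetical, not part of AudioKit):
func clampedNote(_ note: Int) -> MIDINoteNumber {
    // MIDI note numbers are 7-bit: anything outside 0...127 gets pinned to the edge
    return MIDINoteNumber(min(max(note, 0), 127))
}

// 140 and 200 both end up as 127, which is why they sound identical
sequencer.tracks[0].add(noteNumber: clampedNote(140), velocity: 127,
                        position: AKDuration(beats: 0),
                        duration: AKDuration(beats: 0.5))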

Related

Swift - AudioKit AKMidi to AKSequencer

I currently have an application which uses an AKKeyboard to create sounds with an oscillator. Whenever the keyboard is played, I also receive the MIDI data. What I would like to do is create a sequence in AKSequencer from the MIDI data I receive.
Any advice or pointers would be greatly appreciated, thank you.
Here is a partial amount of my code:
var bank = AKOscillatorBank()
var midi: AKMIDI!
let sequencer = AKSequencer()
let sequenceLength = AKDuration(beats: 8.0)

func configureBank() {
    AudioKit.output = bank
    do {
        try AudioKit.start()
    } catch {
        AKLog("AudioKit couldn't be started")
    }
    midi = AudioKit.midi
    midi.addListener(self)
    midi.openInput()
}
// AKKeyboard protocol methods
func noteOn(note: MIDINoteNumber) {
    let event = AKMIDIEvent(noteOn: note, velocity: 80, channel: 5)
    midi.sendEvent(event)
    bank.play(noteNumber: note, velocity: 100)
}

func noteOff(note: MIDINoteNumber) {
    let event = AKMIDIEvent(noteOff: note, velocity: 0, channel: 5)
    midi.sendEvent(event)
    bank.stop(noteNumber: note)
}

// AKMIDIListener protocol methods
func receivedMIDINoteOff(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
    print("ReceivedMIDINoteOff: \(noteNumber), velocity: \(velocity), channel: \(channel)")
}
You don't actually need to build the sequence directly from the AKMIDIEvents. Just query the sequencer's currentPosition when AKKeyboardView's noteOn and noteOff methods are called, and programmatically add events to a sequencer track based on it.
The process is basically identical to this (minus the final step, of course): https://stackoverflow.com/a/50071028/2717159
Edit - To get the noteOn and noteOff times and the duration:
// store notes and their start times in a dictionary:
var noteDict = [MIDINoteNumber: MIDITimeStamp]()

// when you get a noteOn, note the time
noteDict[currentMIDINote] = seq.currentPosition.beats

// when you get a noteOff
let endTime = seq.currentPosition.beats
if let startTime = noteDict[currentMIDINote] {
    let durationInBeats = endTime - startTime
    // use startTime, durationInBeats and currentMIDINote to add the event to a track
    noteDict[currentMIDINote] = nil
}
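Putting the pieces together, a minimal sketch of how that could look inside the keyboard callbacks, assuming the bank and sequencer from the question with a track already created at index 0 (names are illustrative):
var noteDict = [MIDINoteNumber: MIDITimeStamp]()

func noteOn(note: MIDINoteNumber) {
    bank.play(noteNumber: note, velocity: 100)
    noteDict[note] = sequencer.currentPosition.beats // remember the start time
}

func noteOff(note: MIDINoteNumber) {
    bank.stop(noteNumber: note)
    guard let startTime = noteDict[note] else { return }
    let durationInBeats = sequencer.currentPosition.beats - startTime
    sequencer.tracks[0].add(noteNumber: note, velocity: 100,
                            position: AKDuration(beats: startTime),
                            duration: AKDuration(beats: durationInBeats))
    noteDict[note] = nil
}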

AudioKit - Play sound files at specific position using sequencer

I'd like to use the AudioKit framework to generate a small sound sequence of some high and low sounds.
So what I'm starting with is a message that could look like this: "1100011010"
--> Every character should be looped through, and if its value is "1", AudioKit should play a (short) high-frequency sound; otherwise it should play a (short) lower-frequency sound.
Because a simple timer loop that triggers the .play() function every 0.15 s to run a 0.1 s sound (high/low) doesn't seem to be very accurate, I decided to use the AudioKit sequencer:
(o) audiokit:
enum Sequence: Int {
    case snareDrum
}

var snareDrum = AKSynthSnare()
var sequencer = AKSequencer()
var pumper: AKCompressor?
var mixer = AKMixer()

public init() {
    snareDrum >>> mixer
    pumper = AKCompressor(mixer)
    AudioKit.output = pumper
    AudioKit.start()
}

func setupTracks() {
    _ = sequencer.newTrack()
    sequencer.tracks[Sequence.snareDrum.rawValue].setMIDIOutput(snareDrum.midiIn)
    generateMessageSequence()
    sequencer.enableLooping()
    sequencer.setTempo(2000)
    sequencer.play()
}
(o) play:
var message="1100011010"
var counter=0
for i in message {
counter+=0.15
if (i=="1") {
// play high sound at specific position
}
else {
// play low sound at specific position
}
}
(o) play low sound at specific position:
sequencer.tracks[Sequence.snareDrum.rawValue].add(noteNumber: 20,
                                                  velocity: 10,
                                                  position: AKDuration(beats: counter),
                                                  duration: AKDuration(beats: 1))
My question: How is it possible to play local sound files at specific positions (position: AKDuration(beats: counter), as in the code above) instead of using default instruments like AKSynthSnare()?
You could create two tracks, each with an AKMIDISampler. One plays a 'low' sample, and the other plays a 'high' sample. Assign the high notes to the high track, and low notes to the low track.
let sequencer = AKSequencer()

let lowTrack = sequencer.newTrack()
let lowSampler = AKMIDISampler()
try! lowSampler.loadWav("myLowSoundFile")
lowTrack?.setMIDIOutput(lowSampler.midiIn)

let highTrack = sequencer.newTrack()
let highSampler = AKMIDISampler()
try! highSampler.loadWav("myHighSoundFile")
highTrack?.setMIDIOutput(highSampler.midiIn)

sequencer.setLength(AKDuration(beats: 4.0))
sequencer.enableLooping()
Then assign the high and low sounds to each track:
let message = "1100011010"
let dur = 4.0 / Double(message.count)
var position: Double = 0
for i in message {
position += dur
if (i == "1") {
highTrack?.add(noteNumber: 60, velocity: 100, position: AKDuration(beats: position), duration: AKDuration(beats: dur * (2/3)))
} else {
lowTrack?.add(noteNumber: 60, velocity: 100, position: AKDuration(beats: position), duration: AKDuration(beats: dur * (2/34)))
}
}
(I haven't run the code, but something like this should work)
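To actually hear it, the samplers still need to be routed to the output. A minimal sketch of that remaining wiring, assuming the names above:
AudioKit.output = AKMixer(lowSampler, highSampler) // route both samplers out
try! AudioKit.start()
sequencer.play()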

"lldb" crash with XCODE & ARKIT : 0 __ UpdateAudioTransform

I have written a game using ARKit that pops up random nodes (with drones) in the 3D space around the player, and I get an incomprehensible crash: the only thing in the console is "lldb", and the rest of the crash details are in the screenshot attached (I am still a newbie in Swift, so I'm not able to debug it).
The crash happens when there are a lot of nodes on the screen (maybe 20-30) and the FPS drops a lot, i.e. it happens mid-game.
Can someone point me in the right direction for this crash?
The part of the code that is, in my opinion, relevant is the function that spawns the random 3D nodes (they also have SCNActions attached that play sounds when they're tapped - perhaps this is relevant, as the debugger opens with that line highlighted when the crash occurs, as per the photo attached). In case it is also relevant, the program uses some SCNParticleSystem calls as well. Attaching the relevant code and the snapshot of the crash screen:
var droneSound: SCNAudioSource!
var playDroneSound: SCNAction!

override func viewDidLoad() {
    super.viewDidLoad()
    droneSound = SCNAudioSource(named: "Sounds/drone1.wav")!
    playDroneSound = SCNAction.playAudio(self.droneSound, waitForCompletion: true)
}
func startGame() {
    DispatchQueue.main.async {
        self.spawnTimer = Timer.scheduledTimer(timeInterval: TimeInterval(self.randomFloat(min: 2.5, max: 5)),
                                               target: self,
                                               selector: #selector(self.spawnDronesInterim),
                                               userInfo: nil,
                                               repeats: true)
    }
}

@objc func spawnDronesInterim() {
    for _ in 0...5 {
        spawnDrone()
    }
}
@objc func spawnDrone() {
    let newDroneScene = SCNScene(named: "Ar.scnassets/DroneScene.scn")!
    var newDroneNode = newDroneScene.rootNode.childNode(withName: "Drone", recursively: false)!
    newDroneNode.name = "Drone\(self.droneCounter)"
    newDroneNode = newDroneScene.rootNode.childNode(withName: "Drone\(self.droneCounter)", recursively: false)!
    newDroneNode.position.x = newDroneNode.presentation.position.x + self.randomFloat(min: -10, max: 10)
    newDroneNode.position.y = newDroneNode.presentation.position.y + self.randomFloat(min: -1, max: 5)
    newDroneNode.position.z = newDroneNode.presentation.position.z + self.randomFloat(min: -10, max: 10)
    let move1 = SCNAction.move(to: SCNVector3(self.randomFloat(min: -7, max: 7),
                                              self.randomFloat(min: 1, max: 5),
                                              self.randomFloat(min: -7, max: 7)), duration: 15)
    let disappearMove = SCNAction.move(to: SCNVector3(self.randomFloat(min: -10, max: 10),
                                                      self.randomFloat(min: 1, max: 5),
                                                      self.randomFloat(min: -10, max: 10)), duration: 3)
    let rotateAction = SCNAction.run { _ in
        let rotate = SCNAction.rotateBy(x: 0, y: CGFloat(360.degreesToRadians), z: 0, duration: 2)
        newDroneNode.runAction(rotate)
    }
    let removeIt = SCNAction.removeFromParentNode()
    // waitFive and waitTwo are SCNAction.wait actions defined elsewhere
    let sequence = SCNAction.sequence([waitFive, move1, rotateAction, waitFive, disappearMove, waitTwo, removeIt])
    newDroneNode.runAction(sequence)
    self.sceneView.scene.rootNode.addChildNode(newDroneNode)
    if self.droneCounter >= 5 {
        self.sceneView.scene.rootNode.childNode(withName: "Drone\(self.droneCounter)", recursively: true)!
            .runAction(SCNAction.repeatForever(self.playDroneSound))
    }
    self.droneCounter += 1
}
When the user taps one of the 3D nodes, this gets called:
func handleExplosion(node: SCNNode) {
    // playExplosionSound is an SCNAction defined elsewhere
    self.sceneView.scene.rootNode.childNode(withName: node.name!, recursively: true)!.runAction(self.playExplosionSound)
    node.opacity = 0
    print(self.sceneView.scene.rootNode.position)

    let confetti = SCNParticleSystem(named: "Ar.scnassets/confetti.scnp", inDirectory: nil)
    confetti?.loops = false
    confetti?.particleLifeSpan = 1.5
    confetti?.emitterShape = node.geometry
    let confettiNode = SCNNode()
    confettiNode.addParticleSystem(confetti!)
    confettiNode.position = node.presentation.position
    self.sceneView.scene.rootNode.addChildNode(confettiNode)

    let fire = SCNParticleSystem(named: "Ar.scnassets/fire", inDirectory: nil)
    fire?.loops = false
    fire?.particleLifeSpan = 0.1
    fire?.emitterShape = node.geometry
    let fireNode = SCNNode()
    fireNode.addParticleSystem(fire!)
    fireNode.position = node.presentation.position
    self.sceneView.scene.rootNode.addChildNode(fireNode)

    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        node.removeFromParentNode()
        self.killedLabel.text = "Killed: \(self.killedCount)"
        self.dronesOut -= 1
    }
}
Any pointer in the right direction for solving this crash would be greatly appreciated!

Apply amplitude modulation to pink noise operation

AudioKit provides documentation on creating white/pink noise with panning as follows:
let generator = AKOperationGenerator { _ in
    let white = AKOperation.whiteNoise()
    let pink = AKOperation.pinkNoise()
    let lfo = AKOperation.sineWave(frequency: 0.3)
    let balance = lfo.scale(minimum: 0, maximum: 1)
    let noise = mixer(white, pink, balance: balance)
    return noise.pan(lfo)
}
However, rather than panning, I'm looking to change the amplitude with the following parameters (from Sound Forge Pro):
// Amplitude modulation -> Sine
// 0.15 (s) -> modulation frequency
// Minimum amplitude: up to -30.0
// Stereo pan: up to 20
// Dry out: -30 dB
Is this possible using AudioKit?
You could use AKTremolo.
class ViewController: UIViewController {
    let whiteNoise = AKWhiteNoise()
    let tremolo = AKTremolo()
    let mixer = AKMixer()

    override func viewDidLoad() {
        super.viewDidLoad()
        whiteNoise >>> tremolo >>> mixer
        AudioKit.output = mixer
        try! AudioKit.start()
        tremolo.frequency = 0
        whiteNoise.start()

        let slider = AKSlider(property: "Tremolo") { value in
            self.tremolo.frequency = 100 * value
        }
        slider.frame = CGRect(x: 0, y: 100, width: view.bounds.width, height: 100)
        view.addSubview(slider)
    }
}
You can do amplitude modulation by using AKOperationEffect. For example:
let ampLFO = AKOperation.sineWave(frequency: freq, amplitude: 1.0) // freq is the modulation frequency in Hz
let output = AKOperationEffect(generator) { generator, _ in
    let lfo = max(ampLFO, 0) // half-wave rectify so the amplitude never goes negative
    return generator * lfo
}
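For the pink-noise case in the question, here is a minimal sketch done entirely inside an AKOperationGenerator. The 6.67 Hz LFO frequency is derived from the question's 0.15 s modulation period; the depth scaling is illustrative, not the exact Sound Forge settings:
let generator = AKOperationGenerator { _ in
    let pink = AKOperation.pinkNoise()
    let lfo = AKOperation.sineWave(frequency: 1.0 / 0.15) // 0.15 s period ≈ 6.67 Hz
    let depth = lfo.scale(minimum: 0, maximum: 1) // keep the amplitude non-negative
    return pink * depth
}
AudioKit.output = generator
try! AudioKit.start()
generator.start()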

How can I make an iOS device play music programmatically?

I'm trying to make my iPhone play a tune without using prerecorded files. What are my options here - AVAudioEngine, AudioKit? I've looked at them, but the learning curve is relatively steep for something I'm hoping is easy. They also seem like tools for creating sound effects given a PCM buffer window.
I'd like to be able to do something like
pitchCreator.play(["C4", "E4", "G4"], durations: [1, 1, 1])
preferably sounding like an instrument, or at least not like a pure sine wave.
EDIT: The code below has since been replaced by AudioKit.
To anyone wondering: I did make it work (kind of) using code similar to the one below.
class PitchCreator {
    var engine: AVAudioEngine
    var player: AVAudioPlayerNode
    var mixer: AVAudioMixerNode
    var buffer: AVAudioPCMBuffer

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        mixer = engine.mainMixerNode
        // capacity must be at least as large as the frame length used below
        buffer = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0), frameCapacity: 4096)
        buffer.frameLength = 4096
        engine.attachNode(player)
        engine.connect(player, to: mixer, format: player.outputFormatForBus(0))
    }

    func play(frequency: Float) {
        let signal = self.createSignal(frequency,
                                       amplitudes: [1.0, 0.5, 0.3, 0.1],
                                       bufferSize: Int(buffer.frameLength),
                                       sampleRate: Float(mixer.outputFormatForBus(0).sampleRate))
        for i in 0 ..< signal.count {
            buffer.floatChannelData.memory[i] = 0.5 * signal[i]
        }
        do {
            try engine.start()
            player.play()
            player.scheduleBuffer(buffer, atTime: nil, options: .Loops, completionHandler: nil)
        } catch {}
    }

    func stop() {
        engine.stop()
        player.stop()
    }

    // additive synthesis: sum the first few harmonics of the fundamental
    func createSignal(frequency: Float, amplitudes: [Float], bufferSize: Int, sampleRate: Float) -> [Float] {
        let π = Float(M_PI)
        let T = sampleRate / frequency // period of the fundamental, in samples
        var x = [Float](count: bufferSize, repeatedValue: 0.0)
        for k in 0 ..< x.count {
            for h in 0 ..< amplitudes.count {
                x[k] += amplitudes[h] * sin(2.0 * π * Float(h + 1) * Float(k) / T)
            }
        }
        return x
    }
}
But it didn't sound good enough, so I've gone with sampling the notes I need and just playing them with AVAudioPlayer instead.
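A minimal sketch of that sampled-note fallback, with one short sample per note played on demand (the "C4" file name is hypothetical):
import AVFoundation

var notePlayer: AVAudioPlayer?

func playSampledNote(named name: String) {
    // look up a bundled .wav sample for the requested note and play it
    guard let url = Bundle.main.url(forResource: name, withExtension: "wav") else { return }
    notePlayer = try? AVAudioPlayer(contentsOf: url)
    notePlayer?.play()
}

playSampledNote(named: "C4")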
