Really excited by all the additions in 4.6! After reviewing most of the changes, I did not see anything explicitly different in AKMIDICallbackInstrument; however, I cannot get the callback to work anymore. Here is my implementation:
var sequencer: AKSequencer = AKSequencer()
var callbackTrack: AKMusicTrack = AKMusicTrack()
var callbackInst: AKMIDICallbackInstrument = AKMIDICallbackInstrument()
---
public func setupSequencerWithBeats(beats: Int, bpm: Double) {
    print("Num beats: \(beats) | BPM: \(bpm)")
    sequencer.setTempo(bpm)
    callbackTrack = sequencer.newTrack()!
    callbackTrack.setMIDIOutput(callbackInst.midiIn)
    for i in 0 ..< beats {
        callbackTrack.add(noteNumber: MIDINoteNumber(60), velocity: 100, position: AKDuration(beats: Double(i)), duration: AKDuration(beats: 1))
    }
    callbackInst.callback = { status, noteNumber, velocity in
        // Using the new AKMIDIStatus object to unwrap the status and check if it's .noteOn
        if let midiStatus = AKMIDIStatus(byte: status), midiStatus.type != .noteOn {
            return
        }
        // just some delegates to other classes
        self.sequencerDelegate?.didReceiveCallbackFromSequencer(beatNumber: self.beatNumber)
        self.beatNumber += 1
    }
}
When I call sequencer.play(), callbackInst no longer fires the callback. My assumption is that something changed with the setMIDIOutput() method. If there is a better way to get a callback when a .noteOn event is fired in my sequencer, I would love to know. Thanks everyone!
Thanks to AudioKit contributor oettam for spotting a very small change in 4.6.0 that affected all MIDI components! This issue is fixed in 4.6.1. His commit: https://github.com/AudioKit/AudioKit/commit/dcfbbb98058425e43af23b9df69fd9794ecc34d5
Related
I use AudioKit 5 for iOS and want to play MIDI files (or single MIDI events) using sound fonts, but MIDISampler (and AppleSampler) seems to ignore the Sound Font's Release Time option. The option is needed to fade notes out slowly, but the sampler stops them immediately. It sounds really strange, especially for Sound Fonts like Strings, Violin, etc.
I use a Strings Sound Font and it plays great in Logic or Polyphone (I attached a screenshot from the Polyphone app showing that the Sound Font has Vol Env Release = 1.1, and if I change that value there it works as expected).
I also tried:
To play a MIDI file via AppleSequencer with MIDISampler connected
To play MIDI events manually added to the track of AppleSequencer
To load a Sound Font as a Melodic Sound Font
To replace MIDISampler with AppleSampler
But had no luck.
Below I attached a piece of my code that plays note on/off MIDI events manually:
import Foundation
import AudioKit

final class MySampler {
    let engine = AudioEngine()
    let strings = MIDISampler()

    init() {
        do {
            try strings.loadSoundFont("strings", preset: 0, bank: 0)
            engine.output = strings
            try engine.start()
            print("MIDI", "started")
        } catch {
            print("MIDI", error.localizedDescription)
        }
    }

    func playNote(note: MIDINoteNumber, velocity: MIDIVelocity) {
        strings.play(noteNumber: note, velocity: velocity, channel: 0)
    }

    func stopNote(note: MIDINoteNumber) {
        strings.stop(noteNumber: note, channel: 0)
    }
}
let mySampler = MySampler()
var currentNote: MIDINoteNumber = 0

func randomNote() -> MIDINoteNumber {
    currentNote = (48...48 + 24).randomElement() ?? 60
    return currentNote
}

func keyTouchDown() {
    mySampler.playNote(note: randomNote(), velocity: 112)
}

func keyTouchUp() {
    mySampler.stopNote(note: currentNote)
}
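A workaround I am considering is deferring the note-off myself to fake a release tail. A rough sketch (it only holds the voice longer before the hard cut, it does not reproduce the Sound Font's exponential fade, and DispatchQueue timing is not sample accurate):

func stopNote(note: MIDINoteNumber) {
    // Hypothetical workaround: hold the note for the Sound Font's 1.1 s
    // Vol Env Release before sending the real note-off. The voice sustains
    // at full level and then cuts, so this only approximates a release.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.1) { [weak self] in
        self?.strings.stop(noteNumber: note, channel: 0)
    }
}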
Thanks in advance for your help
I am attempting to combine operation-generated sounds into a single instrument. I am aware that it is possible to use operations as arguments to each other, but I'm trying to trigger two (or more) simultaneously in the same instrument if possible, so I'm trying to do so with a mixer. This is my instrument code:
public class LayeredInstrument: MIDIInstrument {
    var opGenOne = OperationGenerator {
        let volSlideCurve = Operation.exponentialSegment(trigger: Operation.trigger, start: 1, end: 0, duration: 0.09)
        return Operation.sawtooth(frequency: 880, amplitude: volSlideCurve)
    }

    var opGenTwo = OperationGenerator {
        let volSlideCurve = Operation.exponentialSegment(trigger: Operation.trigger, start: 1, end: 0, duration: 0.09)
        return Operation.square(frequency: 220, amplitude: volSlideCurve)
    }

    var mixer = Mixer()

    public init() {
        super.init()
        opGenOne.start()
        opGenTwo.start()
        mixer.start()
        mixer.addInput(Node(avAudioNode: opGenOne.avAudioNode))
        mixer.addInput(Node(avAudioNode: opGenTwo.avAudioNode))
        avAudioUnit = mixer.avAudioUnit
        avAudioNode = mixer.avAudioNode
    }

    public override func play(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        opGenOne.trigger()
        opGenTwo.trigger()
    }

    public func stop() {}
}
I have two questions:
1: How come I can take the avAudioUnit and avAudioNode of either of those two operations and use it for a voice, but I can't use the mixer (it is silent upon play() at the minute)? Is there a way to get this to work and hear both voices? (See the sketch after these questions.)
2: A question about operation triggering itself: is it necessary to stop a note played in an operation? If so, what's the best way of doing that? Possibly have some event at the end of a duration, or a class that monitors all notes played? It's just that the examples I've seen thus far have empty stop() methods.
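For reference, here is the kind of standalone wiring I would expect to work, as a sketch: it assumes OperationGenerator can feed a Mixer directly as a Node (in AudioKit 5 the operation API lives in the SporthAudioKit package, as far as I can tell), rather than wrapping its avAudioNode.

import AudioKit
import SporthAudioKit // assumption: OperationGenerator/Operation live here

let engine = AudioEngine()

let sawGen = OperationGenerator {
    Operation.sawtooth(frequency: 880, amplitude: 0.5)
}
let squareGen = OperationGenerator {
    Operation.square(frequency: 220, amplitude: 0.5)
}

// Pass the generators to the mixer as Nodes so the engine makes the
// connections, instead of wrapping their avAudioNodes by hand.
let mixer = Mixer(sawGen, squareGen)
engine.output = mixer

try engine.start()
sawGen.start()
squareGen.start()

If that standalone graph sounds, the open question is only how to hand the mixer over to MIDIInstrument.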
I'm trying to build a sequencer app on iOS. There's a sample on the Apple Developer website that makes an audio unit play a repeating scale, here:
https://developer.apple.com/documentation/audiotoolbox/incorporating_audio_effects_and_instruments
In the sample code, there's a file "SimplePlayEngine.swift", with a class "InstrumentPlayer" which handles sending MIDI events to the selected audio unit. It spawns a thread with a loop that iterates through the scale. It sends a MIDI Note On message by calling the audio unit's AUScheduleMIDIEventBlock, sleeps the thread for a short time, sends a Note Off, and repeats.
Here's an abridged version:
DispatchQueue.global(qos: .default).async {
    ...
    while self.isPlaying {
        // cbytes is set to MIDI Note On message
        ...
        self.audioUnit.scheduleMIDIEventBlock!(AUEventSampleTimeImmediate, 0, 3, cbytes)
        usleep(useconds_t(0.2 * 1e6))
        ...
        // cbytes is now MIDI Note Off message
        self.noteBlock(AUEventSampleTimeImmediate, 0, 3, cbytes)
        ...
    }
    ...
}
This works well enough for a demonstration, but it doesn't enforce strict timing, since the events will be scheduled whenever the thread wakes up.
How can I modify it to play the scale at a certain tempo with sample-accurate timing?
My assumption is that I need a way to make the synthesizer audio unit call a callback in my code before each render with the number of frames that are about to be rendered. Then I can schedule a MIDI event every "x" number of frames. You can add an offset, up to the size of the buffer, to the first parameter to scheduleMIDIEventBlock, so I could use that to schedule the event at exactly the right frame in a given render cycle.
I tried using audioUnit.token(byAddingRenderObserver: AURenderObserver), but the callback I gave it was never called, even though the app was making sound. That method sounds like the Swift version of AudioUnitAddRenderNotify, and from what I read here, that sounds like what I need to do: https://stackoverflow.com/a/46869149/11924045. Why wouldn't it be called? Is it even possible to make this "sample accurate" using Swift, or do I need to use C for that?
Am I on the right track? Thanks for your help!
You're on the right track. MIDI events can be scheduled with sample-accuracy in a render callback:
let sampler = AVAudioUnitSampler()
...
let renderCallback: AURenderCallback = {
    (inRefCon: UnsafeMutableRawPointer,
     ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
     inTimeStamp: UnsafePointer<AudioTimeStamp>,
     inBusNumber: UInt32,
     inNumberFrames: UInt32,
     ioData: UnsafeMutablePointer<AudioBufferList>?) -> OSStatus in
    if ioActionFlags.pointee.contains(.unitRenderAction_PreRender) {
        let sampler = Unmanaged<AVAudioUnitSampler>.fromOpaque(inRefCon).takeUnretainedValue()
        let bpm = 960.0
        let samples = UInt64(44100 * 60.0 / bpm) // assumes a 44.1 kHz sample rate
        let sampleTime = UInt64(inTimeStamp.pointee.mSampleTime)
        // MIDI Note On: status 0x90, note 64, velocity 127.
        // (Allocating in a render callback is not real-time safe; see the caveat below.)
        let cbytes = UnsafeMutablePointer<UInt8>.allocate(capacity: 3)
        defer { cbytes.deallocate() }
        cbytes[0] = 0x90
        cbytes[1] = 64
        cbytes[2] = 127
        // Fire a note at every frame that lands on a beat boundary,
        // offset by i frames into this render cycle.
        for i: UInt64 in 0..<UInt64(inNumberFrames) {
            if (sampleTime + i) % samples == 0 {
                sampler.auAudioUnit.scheduleMIDIEventBlock!(Int64(i), 0, 3, cbytes)
            }
        }
    }
    return noErr
}

AudioUnitAddRenderNotify(sampler.audioUnit,
                         renderCallback,
                         Unmanaged.passUnretained(sampler).toOpaque())
That used AURenderCallback and scheduleMIDIEventBlock. You can swap in AURenderObserver and MusicDeviceMIDIEvent, respectively, with similar sample-accurate results:
let audioUnit = sampler.audioUnit
let renderObserver: AURenderObserver = {
    (actionFlags: AudioUnitRenderActionFlags,
     timestamp: UnsafePointer<AudioTimeStamp>,
     frameCount: AUAudioFrameCount,
     outputBusNumber: Int) -> Void in
    if actionFlags.contains(.unitRenderAction_PreRender) {
        let bpm = 240.0
        let samples = UInt64(44100 * 60.0 / bpm) // assumes a 44.1 kHz sample rate
        let sampleTime = UInt64(timestamp.pointee.mSampleTime)
        for i: UInt64 in 0..<UInt64(frameCount) {
            if (sampleTime + i) % samples == 0 {
                // 144 == 0x90 (Note On, channel 0), note 64, velocity 127,
                // offset by i frames into this render cycle.
                MusicDeviceMIDIEvent(audioUnit, 144, 64, 127, UInt32(i))
            }
        }
    }
}
let _ = sampler.auAudioUnit.token(byAddingRenderObserver: renderObserver)
Note that these are just examples of how it's possible to do sample-accurate MIDI sequencing on the fly. You should still follow the rules of rendering to reliably implement these patterns.
Sample-accurate timing generally requires using the RemoteIO Audio Unit and manually inserting samples at the desired sample position in each audio callback block, using C code.
(A WWDC session on core audio a few years back recommended against using Swift in the audio real-time context. Not sure if anything has changed that recommendation.)
Or, for MIDI, use a precisely incremented time value in each successive scheduleMIDIEventBlock call, instead of AUEventSampleTimeImmediate, and set these calls up slightly ahead of time.
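A minimal sketch of that ahead-of-time pattern, assuming a fixed 44.1 kHz output sample rate (nextEventTime and scheduleNextNote are names I made up, not from the Apple sample); you would call this from a timer that runs slightly ahead of the audio clock:

import AudioToolbox

let sampleRate = 44_100.0
let bpm = 120.0
let samplesPerBeat = AUEventSampleTime(sampleRate * 60.0 / bpm)
// Absolute sample time of the next note; advance it by one beat per event.
var nextEventTime: AUEventSampleTime = 0

func scheduleNextNote(on audioUnit: AUAudioUnit) {
    let noteOn: [UInt8] = [0x90, 64, 127] // Note On, note 64, velocity 127
    // An explicit, precisely incremented sample time instead of
    // AUEventSampleTimeImmediate keeps successive events on the grid
    // even if this call runs slightly early.
    audioUnit.scheduleMIDIEventBlock?(nextEventTime, 0, 3, noteOn)
    nextEventTime += samplesPerBeat
}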
I am trying to build a sequencer that renders the notes from a MIDI file.
Currently I am using AudioKit for the music data processing. I would like to know how I can get the note data/events from the MIDI file with AudioKit.
I have tried using AKSequencer with its output set to an AKMIDINode to listen for the MIDI events, but I can't seem to get anything from it.
class CustomMIDINode: AKMIDINode {
    override init(node: AKPolyphonicNode) {
        print("Node create") // OK
        super.init(node: node)
    }

    func receivedMIDINoteOff(noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        print("midi note off") // Not printed
    }

    func receivedMIDISetupChange() {
        print("midi setup changed") // Not printed
    }

    override func receivedMIDINoteOn(_ noteNumber: MIDINoteNumber, velocity: MIDIVelocity, channel: MIDIChannel) {
        print("receivedMIDINoteOn") // Not printed
    }
}
func setupSynth() {
    oscBank.attackDuration = 0.05
    oscBank.decayDuration = 0.1
    oscBank.sustainLevel = 0.1
    oscBank.releaseDuration = 0.1
}

let seq = AKSequencer(filename: "music")
let oscBank = AKOscillatorBank()
var midi = AKMIDI()
let midiNode = CustomMIDINode(node: oscBank)

setupSynth()
midi.openInput()
midi.addListener(midiNode)

seq.tracks.forEach { track in
    track.setMIDIOutput(midiNode.midiIn)
}

AudioKit.output = midiNode
AudioKit.start()
seq.play()
Have you looked at any of the example AudioKit projects available for download? They are very useful for troubleshooting AK; I actually find the examples better than the documentation (as implementation isn't explained very well).
As for your question, you can add a MIDI listener to receive events; there is an example of this code in the Analog Synth X project, available here.
let midi = AKMIDI()
midi.createVirtualPorts()
midi.openInput("Session 1")
midi.addListener(self)
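For addListener(self) to do anything, self has to conform to AKMIDIListener and implement the callbacks you care about. A minimal sketch (MyConductor is a made-up name, and exact callback signatures vary a little across AudioKit 4.x versions):

class MyConductor: AKMIDIListener {
    // AKMIDIListener provides default implementations, so you only
    // need to implement the events you want to handle.
    func receivedMIDINoteOn(noteNumber: MIDINoteNumber,
                            velocity: MIDIVelocity,
                            channel: MIDIChannel) {
        print("note on: \(noteNumber), velocity: \(velocity)")
    }
}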
For a more worked bit of code, you can refer to this, although the code is likely out of date in parts.
Tony, is it that you aren’t receiving any MIDI events, or just the print statements?
I agree with Axemasta's response about adding AKMIDIListener to the class, along with checking out the MIDI code examples that come with AudioKit. This ROM Player example shows how to play external MIDI files with the AKMIDISampler node:
https://github.com/AudioKit/ROMPlayer
In order for the print to display, try wrapping it in a DispatchQueue.main.async so that it’s on the main thread. Here’s an AudioKit MIDI implementation question with a code example that I posted here:
AudioKit iOS - receivedMIDINoteOn function
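For example, something like this in your subclass (a sketch based on the signature in your post):

override func receivedMIDINoteOn(_ noteNumber: MIDINoteNumber,
                                 velocity: MIDIVelocity,
                                 channel: MIDIChannel) {
    // Hop to the main thread so the print (and any UI work) happens there.
    DispatchQueue.main.async {
        print("receivedMIDINoteOn: \(noteNumber)")
    }
}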
I hope this helps.
This is an AudioKit question:
I am really new to AudioKit and audio in general.
My question is: how could I use AudioKit to create a sound that changes as I move my phone around? I already know how to get the gyro information, so let's say I can take gyro values between 0 and 10, zero being no movement and 10 being a lot of movement of the phone. I want to translate that into a sound that corresponds to how hard/quickly the phone is being moved. To start, just move the sound higher in pitch as the speed increases, down to a low pitch at zero. Sounds easy, yes?
I'm just not experienced enough to know which AudioKit class to use or how to use it to achieve my results.
Thank you!
Michael
You have to write your own AKOperationGenerator.
enum PitchEnvVCOSynthParameter: Int {
    case frequency, gate
}

struct PitchEnvVCO {
    static var frequency: AKOperation {
        return AKOperation.parameters[PitchEnvVCOSynthParameter.frequency.rawValue]
    }
    static var gate: AKOperation {
        return AKOperation.parameters[PitchEnvVCOSynthParameter.gate.rawValue]
    }
}

extension AKOperationGenerator {
    var frequency: Double {
        get { return self.parameters[PitchEnvVCOSynthParameter.frequency.rawValue] }
        set(newValue) { self.parameters[PitchEnvVCOSynthParameter.frequency.rawValue] = newValue }
    }
    var gate: Double {
        get { return self.parameters[PitchEnvVCOSynthParameter.gate.rawValue] }
        set(newValue) { self.parameters[PitchEnvVCOSynthParameter.gate.rawValue] = newValue }
    }
}
and
let generator = AKOperationGenerator { parameters in
    let oscillator = AKOperation.squareWave(
        frequency: PitchEnvVCO.frequency
    )
    return oscillator
}
and then make your variable control the frequency
var vco1Freq: Double = 440.0 {
    didSet {
        generator.parameters[PitchEnvVCOSynthParameter.frequency.rawValue] = vco1Freq
    }
}
Fetch the gyro data and make it control your variable, as described here.
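A rough sketch of the CoreMotion side, assuming the 0-10 movement scale from your question (the 220-880 Hz mapping is an arbitrary choice of mine):

import CoreMotion

let motionManager = CMMotionManager()

func startGyroControl() {
    guard motionManager.isGyroAvailable else { return }
    motionManager.gyroUpdateInterval = 1.0 / 30.0
    motionManager.startGyroUpdates(to: .main) { data, _ in
        guard let rate = data?.rotationRate else { return }
        // Crude movement measure: total rotation speed in rad/s, clamped to 0...10.
        let speed = min(abs(rate.x) + abs(rate.y) + abs(rate.z), 10.0)
        // Map 0...10 onto 220...880 Hz; vco1Freq's didSet forwards it to the generator.
        vco1Freq = 220.0 + (speed / 10.0) * 660.0
    }
}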