I followed the two posts below to play back music notes using Apple's MIDI APIs, but what comes out does not sound like an instrument playing a note; it sounds like a tone generator producing the same frequency as the note.
Posts:
How do you change the music instrument for a MusicTrack?
Play musical notes in Swift Playground
My Code:
func playMusic() {
    var sequence: MusicSequence? = nil
    var musicSequence = NewMusicSequence(&sequence)

    var track: MusicTrack? = nil
    var musicTrack = MusicSequenceNewTrack(sequence!, &track)

    // Adding notes
    var time = MusicTimeStamp(0.0)
    for notee in TestOne().notes {
        var number = freqsScale[notee.frequency] ?? 0
        print(number)
        print(notee.distance)
        number += 11
        var note = MIDINoteMessage(channel: 0,
                                   note: UInt8(number),
                                   velocity: 64,
                                   releaseVelocity: 0,
                                   duration: notee.distance)
        musicTrack = MusicTrackNewMIDINoteEvent(track!, time, &note)
        time += Double(notee.distance)
    }

    var inMessage = MIDIChannelMessage(status: 0xB0, data1: 120, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track!, 0, &inMessage)
    inMessage = MIDIChannelMessage(status: 0xC0, data1: 48, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track!, 0, &inMessage)

    // Creating a player
    var musicPlayer: MusicPlayer? = nil
    var player = NewMusicPlayer(&musicPlayer)
    player = MusicPlayerSetSequence(musicPlayer!, sequence)
    player = MusicPlayerStart(musicPlayer!)
}
What am I doing wrong?
What is the correct way to set an instrument?
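As an aside, the `freqsScale` frequency-to-note lookup table can be replaced with the standard equal-temperament conversion. This is a self-contained sketch, independent of the dictionary used in the code above:

```swift
import Foundation

// Standard equal-temperament mapping: A4 = 440 Hz = MIDI note 69.
func midiNoteNumber(forFrequency hz: Double) -> Int {
    return Int((69.0 + 12.0 * log2(hz / 440.0)).rounded())
}

print(midiNoteNumber(forFrequency: 440.0))   // 69 (A4)
print(midiNoteNumber(forFrequency: 261.63))  // 60 (middle C)
```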
Related
I need to get the duration and end event of a MIDI file. I am using the code below to play the MIDI file. I tried but didn't find anything. Thanks in advance.
var s: MusicSequence?
NewMusicSequence(&s)
let midiFilePath = Bundle.main.path(forResource: "CCL-20180308-A-04", ofType: "mid")
let midiFileURL = URL(fileURLWithPath: midiFilePath ?? "")
MusicSequenceFileLoad(s!, midiFileURL as CFURL, MusicSequenceFileTypeID(rawValue: 0)!, [])
var p: MusicPlayer?
NewMusicPlayer(&p)
MusicPlayerSetSequence(p!, s)
MusicPlayerPreroll(p!)
MusicPlayerStart(p!)
usleep(3 * 100 * 100)
var now: MusicTimeStamp = 0
MusicPlayerGetTime(p!, &now)
This will work:
var s: MusicSequence!
NewMusicSequence(&s)
let midiFileURL = Bundle.main.url(forResource: "CCL-20180308-A-04", withExtension: "mid")!
MusicSequenceFileLoad(s!, midiFileURL as CFURL, .midiType, [])
var p: MusicPlayer!
NewMusicPlayer(&p)
MusicPlayerSetSequence(p, s)
MusicPlayerPreroll(p)
MusicPlayerStart(p)
var numTracks: UInt32 = 0
MusicSequenceGetTrackCount(s, &numTracks)
let length = (0..<numTracks).map { (index: UInt32) -> MusicTimeStamp in
    var track: MusicTrack?
    MusicSequenceGetIndTrack(s, index, &track)
    var size = UInt32(MemoryLayout<MusicTimeStamp>.size)
    var scratchLength = MusicTimeStamp(0)
    MusicTrackGetProperty(track!, kSequenceTrackProperty_TrackLength, &scratchLength, &size)
    return scratchLength
}.max() ?? 0

var lengthInSeconds = Float64(0)
MusicSequenceGetSecondsForBeats(s, length, &lengthInSeconds)

self.timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true, block: { (t) in
    var now: MusicTimeStamp = 0
    MusicPlayerGetTime(p, &now)
    var nowInSeconds = Float64(0)
    MusicSequenceGetSecondsForBeats(s, now, &nowInSeconds)
    print("\(nowInSeconds) / \(lengthInSeconds)")
})
The important piece you were missing was to get the total sequence length by finding the length of the longest track. You can get the length of a track using MusicTrackGetProperty() for the kSequenceTrackProperty_TrackLength property.
For what it's worth, CoreMIDI is gnarly enough, especially in Swift, that I think it's worth using a higher level API. Check out AVMIDIPlayer, which is part of AVFoundation. If you need something more sophisticated, you might check out MIKMIDI, which is an open source MIDI library that builds on Core MIDI but adds a ton of additional functionality, and is significantly easier to use. (Disclaimer: I'm the original author and maintainer of MIKMIDI.) With MIKMIDI, you'd do this:
let midiFileURL = Bundle.main.url(forResource: "CCL-20180308-A-04", withExtension: "mid")!
let sequence = try! MIKMIDISequence(fileAt: midiFileURL)
let sequencer = MIKMIDISequencer(sequence: sequence)
sequencer.startPlayback()
self.timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true, block: { (t) in
    let now = sequencer.timeInSeconds(forMusicTimeStamp: sequencer.currentTimeStamp, options: [])
    let length = sequence.durationInSeconds
    print("\(now) / \(length)")
})
Just a little bit simpler! Things get even more interesting if you're trying to do recording, more complex synthesis, routing MIDI to/from external devices, etc.
I'm trying to get something like this playground working on iOS:
http://audiokit.io/playgrounds/Analysis/Tracking%20Amplitude/
This is my view controller, where I use the mandolin physical model to create notes and then run an FFT tap and an amplitude tracker. But I get no values from them. You can see the output below:
var fft: AKFFTTap!
var amplitudeTracker: AKAmplitudeTracker!
override func viewDidLoad() {
    super.viewDidLoad()
    let mandolin = AKMandolin()
    mandolin.detune = 1
    mandolin.bodySize = 1
    let pluckPosition = 0.2
    let scale: [MIDINoteNumber] = [72, 74, 76, 77, 79, 81, 83, 84]
    let delay = AKDelay(mandolin)
    let mix = AKMixer()
    mix.connect(delay)
    let reverb = AKReverb(mix)
    amplitudeTracker = AKAmplitudeTracker(mix)
    fft = AKFFTTap(mix)
    AudioKit.output = reverb
    AudioKit.start()
    for note in scale {
        let note1: MIDINoteNumber = note
        let octave1: MIDINoteNumber = 4
        let course1 = 2
        let count = 25
        mandolin.fret(noteNumber: note1 + octave1, course: course1 - 1)
        mandolin.pluck(course: course1 - 1, position: pluckPosition, velocity: 127)
        print("playing note")
        let fftData = self.fft.fftData
        let lowMax = fftData[0 ... (count / 2) - 1].max() ?? 0
        let hiMax = fftData[count / 2 ... count - 1].max() ?? 0
        let hiMin = fftData[count / 2 ... count - 1].min() ?? 0
        let amplitude = Float(self.amplitudeTracker.amplitude * 65)
        print("amplitude \(amplitude)")
        print("lowMax \(lowMax)")
        print("hiMax \(hiMax)")
        print("hiMin \(hiMin)")
        sleep(1)
    }
}
This is the output I get when I run it:
2017-09-26 12:43:27.724706-0700 AK[9467:1161171] 957: AUParameterSet 2 (1/8): err -10867
2017-09-26 12:43:28.177699-0700 AK[9467:1161171] 957: AUParameterSet 2 (1/8): err -10867
playing note
amplitude 0.0
lowMax 0.0
hiMax 0.0
hiMin 0.0
playing note
amplitude 0.0
lowMax 0.0
hiMax 0.0
hiMin 0.0
...
The main problem here is that the tracker nodes are not part of the signal chain. AudioKit (and Apple's underlying AVAudioEngine) works on a pull model: audio is not pulled through a node unless a downstream node requests it. In practice, only the nodes upstream of the AudioKit.output node get bytes pulled through them.
Here, however, the reverb is made the output, so the tracker itself never receives any data. Changing it to AudioKit.output = amplitudeTracker will get data flowing through the node.
The amplitude tracker acts as a passthrough, so the audio still comes through as well. If you don't want the audio, route the tracker's output through a booster that lowers the volume to zero.
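To make the pull model concrete, here is a toy sketch in plain Swift (a hypothetical Node type, not the AudioKit API): a node produces data only when something downstream pulls it, which is why a tracker hanging off the side of the chain stays silent.

```swift
// Toy illustration of the pull model: a node renders only when pulled.
class Node {
    let name: String
    let input: Node?
    private(set) var rendered = false

    init(_ name: String, input: Node? = nil) {
        self.name = name
        self.input = input
    }

    // Pulling a node pulls its whole upstream chain; nodes off the chain never render.
    func pull() {
        input?.pull()
        rendered = true
    }
}

let source = Node("mandolin")
let mix = Node("mix", input: source)
let tracker = Node("amplitudeTracker", input: mix)
let reverb = Node("reverb", input: mix)

reverb.pull()            // reverb as output: only source -> mix -> reverb render
print(tracker.rendered)  // false: nothing downstream ever asked the tracker for data
tracker.pull()           // making the tracker the output pulls data through it
print(tracker.rendered)  // true
```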
I was getting this -10867 error while trying to reinitialize an AKSequencer variable that had a bunch of tracks/samplers/etc.
I stored them in arrays, called the following before reinitializing, and the -10867 errors went away:
private var samplers = [AKMIDISampler]()
private var tracks = [AKMusicTrack]()
private var mixer = AKMixer()
...
public func cleanSequencer() {
    for track in tracks {
        track.clear()
    }
    for sample in samplers {
        sample.disconnectOutput()
        sample.destroyEndpoint()
    }
    mixer.detach()
}
Hope this helps!
------- UPDATE: 01 -------
This produced some unexpected effects, mainly no sound being played after using this method.
But I'm now curious whether anyone knows why the -10867 errors went away, and the sound with them?
I have trouble getting an NSTimeInterval for system uptime after a reboot. This function gives me a wrong result: the boot time appears to reset when the device's battery is charged. The problem occurs on the iPhone 5s and the iPad 3 (the new iPad). How can I fix it?
I use this method and it works for me:
public static func uptime() -> (days: Int, hrs: Int, mins: Int, secs: Int) {
    var currentTime = time_t()
    var bootTime = timeval()
    var mib = [CTL_KERN, KERN_BOOTTIME]

    // NOTE: Use strideof(), NOT sizeof() to account for data structure
    // alignment (padding)
    // http://stackoverflow.com/a/27640066
    // https://devforums.apple.com/message/1086617#1086617
    var size = strideof(timeval)

    let result = sysctl(&mib, u_int(mib.count), &bootTime, &size, nil, 0)

    if result != 0 {
        #if DEBUG
            print("ERROR - \(__FILE__):\(__FUNCTION__) - errno = \(result)")
        #endif
        return (0, 0, 0, 0)
    }

    // Subtract the boot time from the current time to get the uptime,
    // then break it down into days/hours/minutes/seconds.
    time(&currentTime)
    var uptime = currentTime - bootTime.tv_sec
    let days = uptime / 86_400
    uptime %= 86_400
    let hrs = uptime / 3_600
    uptime %= 3_600
    let mins = uptime / 60
    let secs = uptime % 60
    return (days, hrs, mins, secs)
}
I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:
var cd: AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Effect),
    componentSubType: OSType(kAudioUnitSubType_NBandEQ),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)
// Once the graph has been opened get an instance of the equalizer
status = AUGraphNodeInfo(graph, self.MyAppNode, nil, &MyAppUnit)
println(status)
var eqFrequencies: [UInt32] = [ 32, 250, 500, 1000, 2000, 16000 ]
status = AudioUnitSetProperty(
self.MyAppUnit,
AudioUnitPropertyID(kAUNBandEQProperty_NumberOfBands),
AudioUnitScope(kAudioUnitScope_Global),
0,
eqFrequencies,
UInt32(eqFrequencies.count*sizeof(UInt32))
)
println(status)
status = AudioUnitInitialize(self.MyAppUnit)
println(status)
var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
AUGraphConnectNodeInput(graph, sourceNode, sourceOutputBusNumber, self.MyAppNode, 0)
AUGraphConnectNodeInput(graph, self.MyAppNode, 0, destinationNode, destinationInputBusNumber)
and then, to apply changes to the frequency gains, my code is as follows:
if (MyAppUnit == nil) { return }
else {
    var bandValue0: Float32 = tenBands.objectAtIndex(0) as! Float32
    var bandValue1: Float32 = tenBands.objectAtIndex(1) as! Float32
    var bandValue2: Float32 = tenBands.objectAtIndex(2) as! Float32
    var bandValue3: Float32 = tenBands.objectAtIndex(3) as! Float32
    var bandValue4: Float32 = tenBands.objectAtIndex(4) as! Float32
    var bandValue5: Float32 = tenBands.objectAtIndex(5) as! Float32

    AudioUnitSetParameter(self.MyAppUnit, 0, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue0, 0)
    AudioUnitSetParameter(self.MyAppUnit, 1, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue1, 0)
    AudioUnitSetParameter(self.MyAppUnit, 2, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue2, 0)
    AudioUnitSetParameter(self.MyAppUnit, 3, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue3, 0)
    AudioUnitSetParameter(self.MyAppUnit, 4, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue4, 0)
    AudioUnitSetParameter(self.MyAppUnit, 5, AudioUnitScope(kAudioUnitScope_Global), 0, bandValue5, 0)
}
Can anyone point out what I am doing wrong here? I think it is related to the second parameter of AudioUnitSetParameter. I have tried AudioUnitParameterID(0) and AudioUnitParameterID(kAUNBandEQParam_Gain + 1) for this value, but those don't seem to work at all. Any help is appreciated!
Adding a comment as an answer because comments are insufficient.
The following code is in Objective-C, but it should help identify your problem.
There are a number of places this might fail. Firstly, you should check the status of AudioUnitSetParameter, and indeed all the AudioUnit calls, as this will give you a clearer picture of where your code is failing.
I've done this successfully in Objective-C and have a test app I can make available if you need it, which shows the complete graph setup and sets the bands and gains by moving a slider ... back to your specific question. The following works just fine for me, and might help you rule out a particular section.
You can try to obtain the current "gain"; this will indicate whether your bands are set up correctly.
- (AudioUnitParameterValue)gainForBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterValue gain;
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;

    OSStatus status = AudioUnitGetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            &gain);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"gettingParamGainErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on getting EQ Gain! Status returned %d).", (int)status]
                                     userInfo:nil];
    }
    return gain;
}
Then setting the gain can be done in the following way:
- (void)setGain:(AudioUnitParameterValue)gain forBandAtPosition:(uint)bandPosition
{
    AudioUnitParameterID parameterID = kAUNBandEQParam_Gain + bandPosition;

    OSStatus status = AudioUnitSetParameter(equalizerUnit,
                                            parameterID,
                                            kAudioUnitScope_Global,
                                            0,
                                            gain,
                                            0);
    if (status != noErr) {
        @throw [NSException exceptionWithName:@"settingParamGainAudioErrorException"
                                       reason:[NSString stringWithFormat:@"OSStatus Error on setting EQ gain! Status returned %d).", (int)status]
                                     userInfo:nil];
    }
}
Finally, what value are you trying to set? The valid range (if I'm not mistaken) is -125.0 to 25.0.
Does anyone know how to implement this in Swift? The entire function call is:
glGetProgramInfoLog(
<#program: GLuint#>,
<#bufsize: GLsizei#>,
<#length: UnsafeMutablePointer<GLsizei>#>,
<#infolog: UnsafeMutablePointer<GLchar>#>)
I understand passing the pointers but not the buffer sizes. Android doesn't even have these parameters at all.
For anyone looking for an answer, you can use the following code,
where program is let program = glCreateProgram().
Swift 2
var message = [CChar](count: 256, repeatedValue: CChar(0))
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
print(String(UTF8String: message))
Swift 3
var message = [CChar](repeating: CChar(0), count: 256)
var length = GLsizei(0)
glGetProgramInfoLog(program, 256, &length, &message)
var s = String.init(utf8String: message)!
if s.characters.count > 0 { print("Shader compile log: \(s)") } // only prints if the log isn't empty
Try this (query the log length first so the buffer is sized correctly; note that glGetShaderInfoLog takes a shader object, not a program):
var length = GLsizei(0)
glGetShaderiv(yourShader, GLenum(GL_INFO_LOG_LENGTH), &length)
var log = Array<GLchar>(count: Int(length), repeatedValue: 0)
log.withUnsafeMutableBufferPointer { logPointer -> Void in
    glGetShaderInfoLog(yourShader, length, &length, logPointer.baseAddress)
    NSLog("Shader compile log: \n%@", String(UTF8String: logPointer.baseAddress)!)
}