MIDIPacketList does not change timestamp - iOS

I have this code from here (Using MIDIPacketList in swift), but I can't PM the user or comment on that question, so I will ask my own.
@ephemer, dude, if you see my question: I love your code for building a MIDIPacketList and it works very well, but when I change the timestamp, nothing happens. It should create some delay, but it behaves the same as a timestamp of 0.
Does anyone know how to fix this?
Also, how can I move the timestamp out of this extension and attach it to the MIDI event instead? I want to be able to change the timestamp for every MIDI event,
to have it here:
var packets = MIDIPacketList(midiEvents: [[0x90, 60, 100]])
public typealias MidiEvent = [UInt8]

extension MIDIPacketList {
    init(midiEvents: [MidiEvent]) {
        let timestamp = MIDITimeStamp(0) // do it now

        let totalBytesInAllEvents = midiEvents.reduce(0) { total, event in
            total + event.count
        }

        // Without this, we'd run out of space for the last few MidiEvents
        let listSize = MemoryLayout<MIDIPacketList>.size + totalBytesInAllEvents

        // CoreMIDI supports up to 65536 bytes, but in practical tests it seems
        // certain devices accept much less than that at a time. Unless you're
        // turning on / off ALL notes at once, 256 bytes should be plenty.
        assert(totalBytesInAllEvents < 256,
               "The packet list was too long! Split your data into multiple lists.")

        // Allocate space for a certain number of bytes
        let byteBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: listSize)

        // Use that space for our MIDIPacketList
        self = byteBuffer.withMemoryRebound(to: MIDIPacketList.self, capacity: 1) { packetList -> MIDIPacketList in
            var packet = MIDIPacketListInit(packetList)
            midiEvents.forEach { event in
                packet = MIDIPacketListAdd(packetList, listSize, packet, timestamp, event.count, event)
            }
            return packetList.pointee
        }

        byteBuffer.deallocate() // release the manually managed memory
    }
}
// To send, use this:
var packets = MIDIPacketList(midiEvents: [[0x90, 60, 100]])
MIDISend(clientOutputPort, destination, &packets) // note: &packets, not &packetList; the original snippet had a typo

I figured out how it should work, but it's still not quite right.
According to the Apple documentation, the timestamp must be based on mach_absolute_time().
So I used that, but I don't know how to work with mach_absolute_time() properly. If anyone knows, please tell me.
If I use var timestamp = mach_absolute_time() + 1000000000, it delays by about a minute or so; if I replace 1000000000 with any lower number, it makes no delay at all.
Does anyone know how to work with mach_absolute_time()?
I saw some code using mach_absolute_time(), but it was used as a timer; it is actually a counter that measures time since boot. But how do I produce a mach_absolute_time() value that works as a MIDI timestamp?

I found the answer.
Set the timestamp to:

var delay = 0.2
var timestamp: MIDITimeStamp = mach_absolute_time() + MIDITimeStamp(delay * 1_000_000_000)

Here delay is the number of seconds by which you want the MIDI message to be delayed.
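One caveat: mach_absolute_time() counts host-specific ticks, not necessarily nanoseconds, so delay * 1_000_000_000 only maps to seconds where one tick equals one nanosecond. On many ARM devices it does not, which may be why adding 1000000000 above delayed by roughly a minute rather than one second. A more robust sketch (my assumption of the intended conversion, using mach_timebase_info) would be:

import Darwin
import CoreMIDI

/// Converts a delay in seconds to a MIDITimeStamp relative to now,
/// using the host timebase instead of assuming 1 tick == 1 ns.
func midiTimestamp(afterSeconds delay: Double) -> MIDITimeStamp {
    var info = mach_timebase_info_data_t()
    mach_timebase_info(&info)
    let nanos = delay * 1_000_000_000
    // ticks = nanoseconds * denom / numer
    let ticks = nanos * Double(info.denom) / Double(info.numer)
    return mach_absolute_time() + MIDITimeStamp(ticks)
}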

Related

How to detect if Device is rebooted in iOS?

I need to identify whether the device has been rebooted.
Currently I save the boot time in the database and periodically check the time interval since the last boot, using the following code, as suggested on the Apple forums:
func bootTime() -> Date? {
    var tv = timeval()
    var tvSize = MemoryLayout<timeval>.size
    let err = sysctlbyname("kern.boottime", &tv, &tvSize, nil, 0)
    guard err == 0, tvSize == MemoryLayout<timeval>.size else {
        return nil
    }
    return Date(timeIntervalSince1970: Double(tv.tv_sec) + Double(tv.tv_usec) / 1_000_000.0)
}
But the problem with this is that even without a reboot, the tv.tv_sec value varies between calls by up to about 30 seconds (anywhere from 0 to 30 seconds).
Does anybody have any idea about this variation? Or is there another, more reliable way to identify a device reboot, with or without sysctl?
https://developer.apple.com/forums/thread/101874?answerId=309633022#309633022
Any pointer is highly appreciated.
I have searched SO, and all the answers point to the solution mentioned above, which has the issue I described. Please don't mark this as a duplicate.
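Given that jitter, one workaround (a sketch, not a definitive fix; the helper and its tolerance value are my own) is to treat two boot-time readings as the same boot whenever they differ by less than a tolerance comfortably above the observed ~30-second drift:

import Foundation

// Hypothetical helper: readings within `tolerance` seconds of each other
// are treated as belonging to the same boot.
func isSameBoot(saved: Date, current: Date, tolerance: TimeInterval = 60) -> Bool {
    return abs(saved.timeIntervalSince(current)) < tolerance
}

// Usage: flag a reboot only when the boot time moved by more than the tolerance.
// if let current = bootTime(), !isSameBoot(saved: savedBootTime, current: current) { /* rebooted */ }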

How to sync two music sequences accurately enough (AudioKit.AKAppleSequencer)?

I have two sequencers:

let sequencer1 = AKAppleSequencer(filename: "filename1")
let sequencer2 = AKAppleSequencer(filename: "filename2")

Both have the same BPM value.
When sequencer1 starts playing one MIDI track (playing it only once), I need sequencer2 to begin playing exactly when the first sequencer finishes. How can I achieve this?
Note that sequencer2 is looped.
Currently I have this approach, but it is not accurate enough:
let callbackInstrument = AKMIDICallbackInstrument(midiInputName: "callbackInstrument", callback: nil)
let callbackTrack = sequencer1.newTrack()!
callbackTrack.setMIDIOutput(callbackInstrument.midiIn)

let beatsCount = sequencer1.length.beats
callbackTrack.add(noteNumber: MIDINoteNumber(beatsCount),
                  velocity: 1,
                  position: AKDuration(beats: beatsCount),
                  duration: AKDuration(beats: 0.1))

callbackInstrument.callback = { status, _, _ in
    guard AKMIDIStatusType.from(byte: status) == .noteOn else { return }
    DispatchQueue.main.async { self.sequencer2.play() } // not accurate
}

let sampler = AKMIDISampler(midiOutputName: nil)
sequencer1.tracks[0].setMIDIOutput(sampler.midiIn)
Appreciate any thoughts.
Apple's MusicSequence, upon which AKAppleSequencer is built, always flubs the timing for the first 100 ms or so after it starts. It is a known issue in closed-source code and won't ever be fixed. Here are two possible ways around it.
Use the new AKSequencer. It might be accurate enough to make this work (but no guarantees). Here is an example of using AKSequencer with AKCallbackInstrument: https://stackoverflow.com/a/61545391/2717159
Use a single AKAppleSequencer, but place your 'sequencer2' content after the 'sequencer1' content. You won't be able to loop it automatically, but you can repeatedly re-write it from your callback function (or pre-write it 300 times or something like that). In my experience, there is no problem writing MIDI to AKAppleSequencer while it is playing. The sample app https://github.com/AudioKit/MIDIFileEditAndSync has examples of time shifting MIDI note data, which could be used to accomplish this.
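A rough sketch of that second approach, reusing the track-writing API from the question (loopedNotes, loopLengthInBeats, and track are hypothetical placeholders; the actual note extraction would come from something like the MIDIFileEditAndSync techniques):

// Pre-write the looped 'sequencer2' content into the same sequencer,
// shifted to start where the 'sequencer1' content ends.
let offset = sequencer1.length.beats
let repeats = 300 // pre-write enough loop iterations

for i in 0..<repeats {
    let loopStart = offset + Double(i) * loopLengthInBeats
    for note in loopedNotes {
        track.add(noteNumber: note.number,
                  velocity: note.velocity,
                  position: AKDuration(beats: loopStart + note.positionInBeats),
                  duration: AKDuration(beats: note.durationInBeats))
    }
}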

Modify Playback Tempo of an AVAudioSequencer / AVMusicTrack?

How does one programmatically change the tempo (in BPM) of an AVAudioSequencer that's been loaded from an existing MIDI file (i.e. using the following)?
try sequencer.load(from: fileURL, options: AVMusicSequenceLoadOptions.smfChannelsToTracks)
I know that the sequencer's tempoTrack property returns the AVMusicTrack controlling the tempo, but how does one then edit it to add/change tempo events? The Apple documentation simply says...
"The tempo track can be edited and iterated upon as any other track. Non-tempo events in a tempo track are ignored."
...but gives no further indication on how such editing would be done.
I know there's the rate property, but that just scales around a default value of 1.0; mapping it to BPM values would need some complex adjustments, and I don't think it would even be possible unless the file's original BPM is known at runtime.
Alternatively, is there a way to create a new AVMusicTrack from scratch, with a custom tempo, and make that the sequencer's tempoTrack?
The only way I managed this was to dip into the Audio Toolbox API momentarily.
This approach assumes that you instantiated your AVAudioSequencer with an AVAudioEngine via:
init(audioEngine engine: AVAudioEngine)
(1) After loading your midi file with AVAudioSequencer, get a pointer to the underlying MusicSequence from this property on AVAudioEngine
var musicSequence: MusicSequence? { get set }
(2) Get a pointer to the sequence's tempo track.
var tempoTrack: MusicTrack!
MusicSequenceGetTempoTrack(musicSequence, &tempoTrack)
(3) Remove all existing tempo information from the tempo track.
var iterator: MusicEventIterator!
NewMusicEventIterator(tempoTrack, &iterator)

var hasEvent = DarwinBoolean(false)
MusicEventIteratorHasCurrentEvent(iterator, &hasEvent)

while hasEvent.boolValue {
    var timeStamp = MusicTimeStamp()
    var eventType = MusicEventType()
    var data: UnsafeRawPointer? = nil
    var dataSize = UInt32()
    MusicEventIteratorGetEventInfo(iterator, &timeStamp, &eventType, &data, &dataSize)

    guard eventType == kMusicEventType_ExtendedTempo else {
        // Not a tempo event; skip it.
        MusicEventIteratorNextEvent(iterator)
        MusicEventIteratorHasCurrentEvent(iterator, &hasEvent)
        continue
    }

    // Remove the tempo event; deleting shifts the iterator to the next event.
    MusicEventIteratorDeleteEvent(iterator)
    MusicEventIteratorHasCurrentEvent(iterator, &hasEvent)
}

DisposeMusicEventIterator(iterator)
(4) Set the new tempo.
let bpm: Float64 = 92 // MusicTrackNewExtendedTempoEvent expects a Float64 BPM
let timeStamp = MusicTimeStamp(0)
MusicTrackNewExtendedTempoEvent(tempoTrack, timeStamp, bpm)
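Putting the steps together, a minimal sketch (assuming the AVAudioSequencer was created with init(audioEngine:) so the engine exposes the MusicSequence, and eliding the event-removal loop from step (3)):

import AVFoundation
import AudioToolbox

func setTempo(_ bpm: Float64, on engine: AVAudioEngine) {
    guard let musicSequence = engine.musicSequence else { return }
    var tempoTrack: MusicTrack!
    MusicSequenceGetTempoTrack(musicSequence, &tempoTrack)
    // ... remove existing kMusicEventType_ExtendedTempo events here, as in step (3) ...
    MusicTrackNewExtendedTempoEvent(tempoTrack, MusicTimeStamp(0), bpm)
}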

In Swift, how can I get an NSDate from a dispatch_time_t?

"Walltime" is a little-known time format used by Grand Central Dispatch. Apple talks about it here:
https://developer.apple.com/library/ios/documentation/Performance/Reference/GCD_libdispatch_Ref/
There are some things it's really handy for, but it's a sticky wicket: it's hard to make it play nicely with other time formats, which is what my question is about.
I can make a walltime by turning an NSDate into a timespec and then calling dispatch_walltime:
let now = NSDate().timeIntervalSince1970
let nowWholeSecsFloor = floor(now)
let nowNanosOnly = now - nowWholeSecsFloor
let nowNanosFloor = floor(nowNanosOnly * Double(NSEC_PER_SEC))
var thisStruct = timespec(tv_sec: Int(nowWholeSecsFloor),
                          tv_nsec: Int(nowNanosFloor))
let wallTime = dispatch_walltime(&thisStruct, 0)
But lord love a duck, I can't figure out how to get it back into an NSDate. Here's my try:
public func toNSDate(wallTime: dispatch_time_t) -> NSDate {
    let wallTimeAsSeconds = Double(wallTime) / Double(NSEC_PER_SEC)
    let date = NSDate(timeIntervalSince1970: wallTimeAsSeconds)
    return date
}
The resulting NSDate is not just off, but somewhat hilariously off, like five hundred years or something. As Martin R pointed out, the problem is that dispatch_time_t is an opaque value, with an undocumented representation of time.
Does anyone know how to do this?
EDIT: In case the process of creating the walltime is confusing, this is basically what's going on:
NSDate defines time with a Double, where everything after the decimal point is the fractional seconds. dispatch_walltime, which creates a walltime, defines time with a UInt64, so you have to convert between Double and UInt64 to use it. To do that conversion you need a timespec, which takes seconds and nanoseconds as separate arguments, each of which must be an Int.
A whole lotta convertin' going on!
The real answer is: you can't.
In the "time.h" header file it is stated:
/*!
 * @typedef dispatch_time_t
 *
 * @abstract
 * A somewhat abstract representation of time; where zero means "now" and
 * DISPATCH_TIME_FOREVER means "infinity" and every value in between is an
 * opaque encoding.
 */
typedef uint64_t dispatch_time_t;
So dispatch_time_t uses an undocumented "abstract" representation of time, which
may even change between releases.
That being said, let's have some fun and try to figure out what
a dispatch_time_t really is. So we have a look at "time.c", which contains the implementation of
dispatch_walltime():
dispatch_time_t
dispatch_walltime(const struct timespec *inval, int64_t delta)
{
    int64_t nsec;

    if (inval) {
        nsec = inval->tv_sec * 1000000000ll + inval->tv_nsec;
    } else {
        nsec = (int64_t)_dispatch_get_nanoseconds();
    }
    nsec += delta;
    if (nsec <= 1) {
        // -1 is special == DISPATCH_TIME_FOREVER == forever
        return delta >= 0 ? DISPATCH_TIME_FOREVER : (dispatch_time_t)-2ll;
    }
    return (dispatch_time_t)-nsec;
}
The interesting part is the last line: it takes the negative value of the
nanoseconds, and this value is cast back to an (unsigned) dispatch_time_t. There are also some special cases.
Therefore, to reverse the conversion, we have to negate the
dispatch_time_t and take that as nanoseconds:
public func toNSDate(wallTime: dispatch_time_t) -> NSDate {
    // Tricky part HERE:
    let nanoSeconds = -Int64(bitPattern: wallTime)

    // Remaining part as in your question:
    let wallTimeAsSeconds = Double(nanoSeconds) / Double(NSEC_PER_SEC)
    let date = NSDate(timeIntervalSince1970: wallTimeAsSeconds)
    return date
}
And indeed, this converts the walltime correctly back to the original
NSDate, at least when I test it in an OS X application.
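For example, a quick round trip combining the snippets above (a sketch using the same Swift 2 era API as the question):

// Round trip: NSDate -> walltime -> NSDate
let original = NSDate().timeIntervalSince1970
var ts = timespec(tv_sec: Int(floor(original)),
                  tv_nsec: Int(floor((original - floor(original)) * Double(NSEC_PER_SEC))))
let wallTime = dispatch_walltime(&ts, 0)
let recovered = toNSDate(wallTime)
print(recovered.timeIntervalSince1970 - original) // ~0, within nanosecond rounding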
But again: don't do it! You would be relying on an undocumented representation that could change between OS releases. There may also be special cases that are not handled in the above code.
The representation in the iOS runtime could also be different; I did not try that.
You have been warned!

AVAudioEngine: seek to a time in the song

I am playing a song using AVAudioPlayerNode and I am trying to control its playback time using a UISlider, but I can't figure out how to seek within the file using AVAudioEngine.
After MUCH trial and error I think I have finally figured this out.
First you need to calculate the sample rate of your file. To do this, get the last render time of your player node:
var nodetime: AVAudioTime = self.playerNode.lastRenderTime
var playerTime: AVAudioTime = self.playerNode.playerTimeForNodeTime(nodetime)
var sampleRate = playerTime.sampleRate
Then, multiply your sample rate by the new time in seconds. This will give you the exact frame of the song at which you want to start the player:
var newsampletime = AVAudioFramePosition(sampleRate * Double(Slider.value))
Next, calculate the number of frames left in the audio file:
var length = Float(songDuration!) - Slider.value
var framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
Finally, stop your node, schedule the new segment of audio, and start your node again!
playerNode.stop()

if framestoplay > 1000 {
    playerNode.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, atTime: nil, completionHandler: nil)
}

playerNode.play()
If you need further explanation I wrote a short tutorial here: http://swiftexplained.com/?p=9
For future readers, it is probably better to get the sample rate as:

playerNode.outputFormat(forBus: 0).sampleRate
Also take care when converting to AVAudioFramePosition: it is an integer, while the sample rate is a double. Without rounding, you may end up with undesirable results.
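For instance, a small sketch of that rounding point (using the question's Slider for the target position):

let sampleRate = playerNode.outputFormat(forBus: 0).sampleRate
let seconds = Double(Slider.value)
let startFrame = AVAudioFramePosition((sampleRate * seconds).rounded())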
P.S. The above answer assumes that the file you are playing has the same sample rate as the output format of the player, which may or may not be true.
