CoreMIDI and external USB controller - iOS

I'm using CoreMIDI successfully, but I also want to support an external USB MIDI interface.
I've tried an app called MIDI Monitor, which does find my USB interface when it's connected.
The problem is how to enable this interface from my own app. As the MIDIGetNumberOfExternalDevices documentation says, their "presence is completely optional": they exist only when a UI (such as Audio MIDI Setup) adds them.
How am I supposed to add them?
Best Regards.

"External devices" are not what you want. Those are the things that a user can create in Audio MIDI Setup in OS X, to represent a synthesizer or keyboard or other device that is connected to the computer via a MIDI cable. The system does not automatically create them. (It can't, because MIDI is terribly primitive and has no device discovery protocol.)
External devices are only for the user's benefit in naming and arranging things. They can't be used to do MIDI input or output. They're especially useless in iOS, since there's no Audio MIDI Setup app.
Instead, use MIDIGetNumberOfSources and MIDIGetSource to find sources of MIDI data.
To actually get input, use MIDIInputPortCreate to create an input port, then MIDIPortConnectSource to connect one or more sources to that port. Then your port's MIDIReadProc will be called when MIDI comes in.
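A minimal input sketch along those lines (the function names and the omission of error checking are mine):

    #include <CoreMIDI/CoreMIDI.h>

    // Called by CoreMIDI on a high-priority thread whenever MIDI arrives.
    static void MyReadProc(const MIDIPacketList *pktlist,
                           void *readProcRefCon, void *srcConnRefCon) {
        const MIDIPacket *packet = &pktlist->packet[0];
        for (UInt32 i = 0; i < pktlist->numPackets; i++) {
            // packet->data holds packet->length raw MIDI bytes.
            packet = MIDIPacketNext(packet);
        }
    }

    static void SetUpMIDIInput(void) {
        MIDIClientRef client;
        MIDIPortRef inPort;
        MIDIClientCreate(CFSTR("MyApp"), NULL, NULL, &client);
        MIDIInputPortCreate(client, CFSTR("Input"), MyReadProc, NULL, &inPort);

        // Connect every source that currently exists to our input port.
        for (ItemCount i = 0; i < MIDIGetNumberOfSources(); i++) {
            MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);
        }
    }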
Similarly, for output, you would use MIDIGetNumberOfDestinations and MIDIGetDestination to find destinations, create an output port using MIDIOutputPortCreate, and MIDISend to send data through a port to a destination.
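And a matching output sketch, reusing the client from above (again my names, error handling omitted; a timestamp of 0 means "send now"):

    // Send a note-on to every destination.
    static void SendNoteOn(MIDIClientRef client) {
        MIDIPortRef outPort;
        MIDIOutputPortCreate(client, CFSTR("Output"), &outPort);

        Byte buffer[64];
        MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
        MIDIPacket *packet = MIDIPacketListInit(pktlist);
        const Byte noteOn[3] = {0x90, 60, 100};  // note-on, middle C, velocity 100
        packet = MIDIPacketListAdd(pktlist, sizeof(buffer), packet, 0, 3, noteOn);

        for (ItemCount i = 0; i < MIDIGetNumberOfDestinations(); i++) {
            MIDISend(outPort, MIDIGetDestination(i), pktlist);
        }
    }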
For reference, see the MIDIServices documentation.

Related

Need help understanding ALSA drivers

I'm trying to debug the sound system on an Amlogic device. amixer and alsamixer aren't working as expected, and amixer can crash the system. What I'm struggling with is that the driver exposes its methods for accessing the hardware registers by constructing snd_kcontrol objects, as described in Writing an ALSA Driver on the ALSA website. But amixer cset calls snd_ctl_elem_write from control.c, which refers to element_write in a snd_ctl_t object.
I can't see any link between the defined snd_kcontrol objects and any snd_ctl_t object, so I can't see how amixer is supposed to write to the hardware. How is it normally done?
In user space, a control device is represented by snd_ctl_t, which contains the file handle of the device node. element_write points to snd_ctl_hw_elem_write(), which issues the SNDRV_CTL_IOCTL_ELEM_WRITE ioctl on that handle.
In the kernel, an opened device file is represented by a struct snd_ctl_file, which is linked to the struct snd_card. The ioctl handler looks up the card's snd_kcontrol that matches the element ID and calls the put callback the driver registered; that is where the hardware register actually gets written, and that is the link between the two object types.
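To make the user-space half concrete, here is a minimal sketch of the same path amixer cset takes; the card name "hw:0" and the control name are assumptions for illustration, so substitute a control that exists on your card:

    #include <alsa/asoundlib.h>

    int main(void) {
        snd_ctl_t *ctl;
        snd_ctl_elem_id_t *id;
        snd_ctl_elem_value_t *value;

        if (snd_ctl_open(&ctl, "hw:0", 0) < 0)
            return 1;

        snd_ctl_elem_id_alloca(&id);
        snd_ctl_elem_id_set_interface(id, SND_CTL_ELEM_IFACE_MIXER);
        snd_ctl_elem_id_set_name(id, "Master Playback Volume");  /* assumed control */

        snd_ctl_elem_value_alloca(&value);
        snd_ctl_elem_value_set_id(value, id);
        snd_ctl_elem_value_set_integer(value, 0, 50);  /* channel 0, raw value 50 */

        /* Dispatches through element_write -> snd_ctl_hw_elem_write() -> ioctl. */
        int err = snd_ctl_elem_write(ctl, value);
        snd_ctl_close(ctl);
        return err < 0;
    }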

Is it possible to cast the input from the phone microphone to the receiver?

I would like to know if it is possible to cast the audio taken directly from the microphone of an iOS device to the receiver, live.
I've downloaded all the Git example projects, and all of them use a "loadMedia" method to start the casting. Here is an example from one of them:
    - (NSInteger)loadMedia:(GCKMediaInformation *)mediaInfo
                  autoplay:(BOOL)autoplay
              playPosition:(NSTimeInterval)playPosition;
Can I follow this approach to do what I want? If so, what's the expected delay?
Thanks a lot
Echo is likely if the device (iOS, Android, or Chrome) is in range of the speakers. That said:
Pick a fast codec that is supported, such as CELT/Opus or Vorbis.
I haven't tried either of these, but they should be possible:
1. Implement your own protocol using CastChannel that passes the binary data. You'll want to do some simple conversion of the stream from binary to something a bit more friendly. Take a look at Intro to Web Audio for using AudioContext.
2. Set up a trivial server on your device to stream from, then tell the Receiver to just access that local server (see the sketch below).
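A sketch of option 2 using the loadMedia API quoted in the question; the local URL, content type, and the mediaControlChannel variable (a connected GCKMediaControlChannel) are all illustrative assumptions:

    #import <GoogleCast/GoogleCast.h>

    // Assumes a trivial HTTP server on the phone is already serving the
    // microphone stream at this hypothetical URL.
    GCKMediaInformation *mediaInfo =
        [[GCKMediaInformation alloc] initWithContentID:@"http://192.168.1.10:8080/mic.ogg"
                                            streamType:GCKMediaStreamTypeLive
                                           contentType:@"audio/ogg"
                                              metadata:nil
                                        streamDuration:0
                                            customData:nil];
    [mediaControlChannel loadMedia:mediaInfo autoplay:YES playPosition:0];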

How to synchronize audio playback on 2 or more iOS devices?

I would like to write a web application that allows me to sync audio playback of an MP3 down to ~50ms, or close enough that the human ear can't detect the difference.
The idea would be that two or more smartphones could each be paired to a bluetooth speaker, and two or more speakers would play the same audio at the exact same time.
How would you suggest I go about setting this up, both client-side and server-side? I'm planning to use Rails/Ruby for the backend and iOS/Objective-C for mobile dev.
I had thought of syncing to a global/atomic clock on the server, and having the server provide instructions to clients on when to start playing or jump into an already-playing track. My concern is that, if I want to stream the audio, it will be impossible to load a song into memory and start playback accurately at the millisecond level.
Thoughts?
The jitter in internet packet delivery will be too large, so forget about syncing over the internet. However, you could check the accuracy of NTP, which I guess the OS still uses (I know that older UNIXes used it) when you switch on automatic date/time in Settings, but my guess is that it won't be good enough either. The OS may also use other time sources like GPS; I don't know how iOS does it, but accuracy within 20 ms is not to be expected. You could create an experimental app to check it out.
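If you do experiment with NTP-style syncing, the core of it is a four-timestamp exchange; here is a minimal sketch of the offset and round-trip calculation (the struct and function names are mine):

    // t0: client send time   t1: server receive time
    // t2: server send time   t3: client receive time
    // All in seconds on the respective local clocks.
    typedef struct { double t0, t1, t2, t3; } ntp_exchange;

    // How far the client clock is behind the server clock.
    static double clock_offset(ntp_exchange e) {
        return ((e.t1 - e.t0) + (e.t2 - e.t3)) / 2.0;
    }

    // Round-trip network delay, excluding server processing time.
    static double round_trip_delay(ntp_exchange e) {
        return (e.t3 - e.t0) - (e.t2 - e.t1);
    }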
So, what's left is a sync closer to home, meaning between the devices directly. Of course you need to make sure that all devices have loaded (enough of) the song, and have preloaded it in AVAudioPlayer or whatever you're using, to be able to start playing immediately. (It may actually not be the best idea to use the higher-level AVAudioPlayer APIs, as they may give higher delays, and more importantly higher jitter, than lower-level APIs.)
Here are three ideas (one device needs to be the master that triggers the start of playback; the others are slaves waiting for the trigger), with a scheduling sketch after the list:
1. Use an audio trigger pulse, like a high tone of a defined length and frequency, then use an FFT to recognise this tone.
2. Connect the devices via GameKit Bluetooth and transmit the trigger over these connections.
3. Use the iPhone 4+ flash as a trigger: flash in a certain pattern. This would require you to sample the video data, which is quite doable and can be very fast.
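Whichever trigger you use, don't call play directly when it fires; schedule the start a fixed interval ahead on the audio hardware clock. A sketch using the AVAudioPlayer mentioned above (the 100 ms headroom is an assumption to tune):

    #import <AVFoundation/AVFoundation.h>

    // Call this when the trigger fires; 'player' already has the song loaded.
    static void startInSync(AVAudioPlayer *player) {
        [player prepareToPlay];         // prime the buffers for an immediate start
        NSTimeInterval headroom = 0.1;  // assumed value; tune experimentally
        [player playAtTime:player.deviceCurrentTime + headroom];
    }

Each device adds the same headroom to its own hardware clock, so the starts line up to within the jitter of the trigger's arrival rather than the jitter of the play call itself.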
I'm going with a solution that uses an atomic clock for synchronization, and an external service that allows server instructions/messages to be sent to all devices in close sync.

Midi Timing Issues with Delphi ASIO VST and MiniHost

I'm coming from a background of using MSC* MidiSequencer for a Delphi XE2 project, and I've been playing with DelphiASIOVST this weekend on the off chance the MIDI may be stable enough to use as my core MIDI engine while also allowing me to support VST plug-ins. I pulled the D16 trunk off the SVN and compiled effortlessly after a few path tweaks.
I understand a great deal of what I'm seeing, but I'm wondering if others have experienced issues with MIDI file playback in the MiniHost example application. Specifically, with a one-track melodic performance, it sounds like notes are getting skipped and/or playing back a bit late over other notes that are playing as they should. Basically it's hit or miss whether a note is even played at all.
I have numerous pro sequencers on my machine and the MIDI files are fine there; those sequencers also support VST with little to no problem. I also know the low-level MIDI file format and know the file structure is sound.
Can TMidiFile play directly to the computer's standard MIDI synth? I'm trying to rule out VST issues by getting a direct pipeline to the built-in synth. Barring that, has anyone seen these issues, or does anyone know of more/better examples of MIDI file to VST playback using this component set?
I use FL Studio with my MIDI gear, and odds are you need to turn down your buffer length so that there is little to no delay.
It's probably set by default to a mid-high range, which means you will almost certainly have a 1-1.5 second delay.
Don't turn it down too low, though, or you'll get a "trash can" sound where everything is hollow and robotic; keep playing keys while you're adjusting the setting.
Is the word clock functioning properly? Do you have the ability to drive off another MIDI clock source, just to test with?
Though you said "I have numerous pro sequencers on my machine and the MIDI files are fine there", you could also try the REAPER DAW (http://www.reaper.fm, works on Linux/BSD, Mac and Windows): import the MIDI directly into it, then set your default MIDI device to the one you wish to test with.
Check the MIDI overflow settings.
Ensure each of your MIDI devices has a unique ID.
Get a MIDI monitoring app like MIDI-OX (http://www.midiox.com/) to see real-time messages and data, and see where things are going.

Accessing an AR2112

This is a little off the beaten path. I've got a D-Link DWL-G520 card I'm using under OpenBSD, and it works fine. What I want to do is access the radio part of it. Why? I want to use it in a radio telescope. It's a 2.4 GHz receiver with an external antenna connector. I want to connect some coax, some amplifiers, and an old TV dish, and point the dish at the sky. It has an RSSI signal and variable RF gain (which it adjusts itself, from what I can find), so all I'd need to do is record those over time while pointed at a certain spot in the sky. I don't really need to control the frequency, since most natural events are broadband.
I'm poking through the OpenBSD ath driver, following nested structs, but I don't want any of the normal network stuff, which is most of what the driver does. dmesg identifies it as an AR5212, which according to the Atheros PDF is always paired with an AR2112 radio. Is there any easier way than wading through PCI stuff to see what my options are? I also need to turn the transmitter off so it doesn't fry my amps. Trying to find low-level documentation is about impossible from what I've seen. Ultimately I'd like to have this work with other WiFi cards too, but I'll start with this one. I've got a Cistron with an external antenna connector also.
Alan, ab1jx
