iOS MIDI: play a sound from bytes like (0x90, 30, 127)

I have a piano iOS application. I can use Core MIDI to receive messages from an external MIDI controller normally, but no sound is produced. So I want to turn MIDI bytes such as (0x90, 30, 127) into an audible note. Can someone answer my question? Thank you very much!

If you mean using the iOS device as a MIDI synthesizer (sound output from the iOS device), you need an AUGraph and an AudioUnit (in Xcode Help, search for "Sampler Unit Presets (LoadPresetDemo)").
Once you understand that sample, some tips from me:
* Define a custom structure that contains your AUGraph and AudioUnit.
* Pass a pointer to that structure as refCon in MIDIInputPortCreate (or as srcConnRefCon in MIDIPortConnectSource) so that you can call functions such as MusicDeviceMIDIEvent from your MIDIReadProc on its callback thread.
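That forwarding step can be sketched as follows. This is a minimal sketch, assuming you have already built an AUGraph containing an AUSampler (as in LoadPresetDemo) and stored its AudioUnit in a context object of your own; `SynthContext` is a hypothetical name, not an Apple type.

```swift
import AudioToolbox
import CoreMIDI

// Hypothetical container passed through Core MIDI's refCon pointer.
final class SynthContext {
    var graph: AUGraph?
    var samplerUnit: AudioUnit?   // the AUSampler node's AudioUnit
}

// MIDIReadProc runs on a high-priority Core MIDI thread; keep it lightweight.
let readProc: MIDIReadProc = { packetList, refCon, _ in
    guard let refCon = refCon else { return }
    let context = Unmanaged<SynthContext>.fromOpaque(refCon).takeUnretainedValue()
    guard let sampler = context.samplerUnit else { return }

    var packet = packetList.pointee.packet
    for _ in 0..<packetList.pointee.numPackets {
        // Forward simple channel-voice messages, e.g. (0x90, 30, 127) = note-on.
        MusicDeviceMIDIEvent(sampler,
                             UInt32(packet.data.0),
                             UInt32(packet.data.1),
                             UInt32(packet.data.2),
                             0)
        packet = MIDIPacketNext(&packet).pointee
    }
}

// When creating the input port, hand over the context pointer, e.g.:
// MIDIInputPortCreate(client, "in" as CFString, readProc,
//                     Unmanaged.passUnretained(context).toOpaque(), &inPort)
```

This sketch only forwards the first three bytes of each packet; a real handler should parse running status and multi-message packets properly.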

Related

AudioKit MIDI Events - basic questions, sending to external synth, not sure about handling

I had some more time to play around with AudioKit, yet I have a few questions about MIDI. I tried googling and I also checked the AKMIDI class reference, but this just further confused me.
Let's keep it simple: I have a UIButton that should give out a "C" to an external synth.
import AudioKit

let midiOut = AKMIDI() // not shown in my original snippet; lives on the view controller
midiOut.openOutput()
midiOut.sendEvent(AKMIDIEvent(noteOn: 72, velocity: 80, channel: 1))
sleep(1)
midiOut.sendEvent(AKMIDIEvent(noteOff: 72, velocity: 0, channel: 1))
Obviously this code is spread over the entire ViewController.swift and the sendEvent calls live inside an #IBAction, but for the sake of keeping it simple, I merged the lines here.
So my questions are:
Will this send out MIDI signals to external machines? (I ordered the Lightning Camera Connection Kit, but it will take a few more days to arrive, so I cannot test it yet.) I cannot get anything from the iPhone Simulator, so I take it, this is not working.
Will this send out MIDI signals over wireless MIDI as well? According to what I read on some Google group, you can have Wireless MIDI on macOS. Unfortunately I didn't find anything on if and how you can use this in AudioKit.
What about MIDI over Bluetooth? Will this all be handled by AudioKit automatically, or will I have to change anything? I read something about midi.createVirtualPorts(), but I am not sure what it actually does; the word "virtual" makes me believe it is for testing?
Sorry if these questions are ridiculously noobish, but I really am confused by the exact inner workings of this.
Thanks a lot in advance!
You can test your app with the Simulator as well. Just run your app in the Simulator, open the Audio MIDI Setup app on your Mac, go to the network settings, and connect your simulator.
You have full control over AKMIDI.
* If you want to open all available outputs, just call midi.openOutput().
* If you want to open a specific one, like a network session, mostly named "Session 1", call midi.openOutput(name: "Session 1").
* You can get all available destinations from the midi.destinationNames string array; if you want to show the user a MIDI destination picker, just open the chosen destination by name.
* To close one, call midi.endpoints.removeValue(forKey: "Session 1").
* For virtual outputs, call midi.createVirtualOutputPort(name: "App MIDI Out"), which is useful for sending MIDI to other apps running on your iOS device or Mac.
* Also, you can implement the AKMIDIListener protocol's receivedMIDISetupChange function to get notified when MIDI hardware or software becomes available or unavailable.
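Putting those bullet points together, a minimal sketch (assuming AudioKit 4's AKMIDI API; the session name "Session 1" depends on your Audio MIDI Setup, and "App MIDI Out" is just an example name):

```swift
import AudioKit

let midi = AKMIDI()

// Open one specific destination (a Mac network session here) by name.
midi.openOutput(name: "Session 1")

// Send a middle C on channel 1, then release it.
midi.sendEvent(AKMIDIEvent(noteOn: 60, velocity: 90, channel: 1))
midi.sendEvent(AKMIDIEvent(noteOff: 60, velocity: 0, channel: 1))

// Expose a virtual output so other apps on the same device can connect to us.
midi.createVirtualOutputPort(name: "App MIDI Out")

// Close the network session when done.
midi.endpoints.removeValue(forKey: "Session 1")
```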
Yes, it should. You've opened all outputs, so every destination should receive the events.
I'm not sure about wireless MIDI as there's nothing specifically done with that in mind in AudioKit. If it doesn't "just work" I can try to replicate.
You might have to open the Bluetooth connection explicitly with midi.openOutput(name: "Bluetooth").

Do we have a builtin Bluetooth connection listener in iOS?

We want to set a callback (or listener) to trigger some functionality when a paired Bluetooth device connects to the iPhone. I've found some related answers, but all of them seem quite old.
Do we have a builtin or external library in iOS that enables an app to automatically check if a Bluetooth device is connected or not?
Any input will be greatly appreciated!
You could observe the inputAvailable property of an AVAudioSession object via NSNotificationCenter.
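A sketch of that approach: listen for AVAudioSession's routeChangeNotification (a Bluetooth audio device appearing or disappearing triggers a route change) and then read isInputAvailable. This only covers Bluetooth *audio* devices, not arbitrary BLE peripherals.

```swift
import AVFoundation

// Observe audio route changes (e.g. a Bluetooth headset connecting).
let observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { _ in
    // isInputAvailable reflects whether any input route currently exists.
    let available = AVAudioSession.sharedInstance().isInputAvailable
    print("Audio route changed; input available: \(available)")
}
```

Remember to remove the observer (NotificationCenter.default.removeObserver(observer)) when you no longer need it.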

iOS 5/6: low volume after first usage of CoreAudio

I work on a VoIP app. The AudioSession's mode is set to kAudioSessionMode_VoiceChat.
For a call, I open a Core Audio AudioUnit with subtype kAudioUnitSubType_VoiceProcessingIO. Everything works fine. After the first call, I close the AudioUnit with AudioUnitUninitialize() and deactivate the audio session.
Now, however, it seems as if the audio device is not correctly released: the ringer volume is very low, the media player's volume is lower than usual. And for a subsequent call, I cannot activate kAudioUnitSubType_VoiceProcessingIO anymore. It works to create an AudioUnit with kAudioUnitSubType_RemoteIO instead, but also the call's volume is very low (both receiver and speaker).
This first occurred on iOS 5. With the iPhone 5 on iOS 6, it is even worse (the volume is even lower).
Has anyone seen this? Do I need to do more than AudioUnitUninitialize() to release the Voice Processing unit?
I've found the solution: I incorrectly used AudioUnitUninitialize() to free the audio component instance retrieved with AudioComponentInstanceNew(). The correct call is AudioComponentInstanceDispose().
Yes, you need to dispose of the AudioUnit when using the VoiceProcessingIO subtype. For some reason there is no problem when using the RemoteIO subtype. So whenever you get OSStatus -66635 (kAudioQueueErr_MultipleVoiceProcessors), check for missing AudioComponentInstanceDispose() calls.
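A sketch of the teardown order implied by the answers above (the function name tearDown is my own, not an Apple API):

```swift
import AudioToolbox

// Release a voice-processing I/O unit completely so the hardware
// voice processor is freed for the next call.
func tearDown(_ unit: AudioUnit) {
    AudioOutputUnitStop(unit)            // stop I/O first
    AudioUnitUninitialize(unit)          // release render resources
    AudioComponentInstanceDispose(unit)  // actually free the instance
}
```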

FMOD Voice Chat

I want to create a voice chat software using FMOD.
Now I can receive data from the microphone and play it back immediately.
It's also easy to send the sound data to another computer over the network.
But I don't know how to play the received data on the other computer with FMOD.
Can anyone help me?
When you receive the incoming sound data on the destination machine, you need to create a streaming buffer to play the audio. The simplest approach is to look at FMOD's usercreatedsound example. It shows how to create a custom stream and use the pcmreadcallback to populate the sound with data as needed.

How can I generate programmatically a MIDI event on iPad

I would like to test a MIDI app and want to generate some MIDI events without attaching a physical keyboard. Any hints?
If you are using CoreMIDI, set up your app to use MIDINetworkSession. Once you advertise your iPad over the network, use any MIDI sequencer etc. to connect to it and send messages over WiFi.
That way you can test without constantly unplugging/replugging things, and while still tethered to Xcode which is a huge bonus.
The other option would be to create an artificial MIDIPacketList and send it directly to your handler, but this is a lot less flexible.
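The artificial-packet option can be sketched like this, assuming your app routes incoming MIDI through a handler function of your own (the commented-out `handle` call is hypothetical):

```swift
import CoreMIDI

// Build a one-packet MIDIPacketList carrying a note-on (0x90, 60, 127)
// and feed it to your own handler, bypassing any hardware.
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
let bytes: [UInt8] = [0x90, 60, 127]   // note-on, middle C, full velocity
MIDIPacketListAdd(&packetList,
                  MemoryLayout<MIDIPacketList>.size,
                  packet,
                  0,                    // timestamp 0 = "now"
                  bytes.count,
                  bytes)

// Then call whatever your MIDIReadProc would call, e.g.:
// handle(&packetList)
```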
If you mean sending a MIDI event to an iPad, then you could use a simple program like Rondo to play a MIDI file to it.
I suppose you need some sample source code for generating MIDI events on iPad.
I found this one: RCTMidiLib. It is a wrapper around CoreMIDI, and it includes source code for sending and receiving MIDI events.
https://github.com/recotana/RCTMidiLib
I connected an iPad and a Mac wirelessly and successfully sent/received MIDI events using the test application on the iPad.
