I am trying to build a project that will let me use an iPhone connected to a Mac over USB as an audio output device. There are options to do this wirelessly, like "Airfoil", but they all introduce latency during playback.
When connected via USB, the iPhone appears in Audio MIDI Setup as a device, but it is currently only selectable as an input device. I believe it can be used as an external microphone or a MIDI input device. My hope is that the grayed-out output selection could be overridden through code. Is this even possible?
(Screenshot: Iphone_IO.png, the iPhone listed in Audio MIDI Setup with the output side grayed out)
From what I've read, AVAudioEngine might provide a solution to this. I have read through posts about listing all available audio devices by ID, e.g.:
list-all-available-audio-devices
From there, would I be able to set the audio output device to the iPhone? Or would this require a companion app running on the phone to route the audio as well? Any information would be helpful, thanks!
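To make the question concrete, here is the kind of thing I have pieced together so far from those posts. This is only a sketch in Swift against the Core Audio HAL; the name match on "iPhone" and the idea that the device might ever publish output streams are my assumptions, not anything the system promises.

import CoreAudio
import Foundation

func audioDevices() -> [AudioDeviceID] {
    var addr = AudioObjectPropertyAddress(mSelector: kAudioHardwarePropertyDevices,
                                          mScope: kAudioObjectPropertyScopeGlobal,
                                          mElement: kAudioObjectPropertyElementMaster)
    var size: UInt32 = 0
    AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject), &addr, 0, nil, &size)
    var ids = [AudioDeviceID](repeating: 0, count: Int(size) / MemoryLayout<AudioDeviceID>.size)
    AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject), &addr, 0, nil, &size, &ids)
    return ids
}

func deviceName(_ id: AudioDeviceID) -> String {
    var addr = AudioObjectPropertyAddress(mSelector: kAudioDevicePropertyDeviceNameCFString,
                                          mScope: kAudioObjectPropertyScopeGlobal,
                                          mElement: kAudioObjectPropertyElementMaster)
    var name: CFString = "" as CFString
    var size = UInt32(MemoryLayout<CFString>.size)
    AudioObjectGetPropertyData(id, &addr, 0, nil, &size, &name)
    return name as String
}

func hasOutputStreams(_ id: AudioDeviceID) -> Bool {
    var addr = AudioObjectPropertyAddress(mSelector: kAudioDevicePropertyStreams,
                                          mScope: kAudioDevicePropertyScopeOutput,
                                          mElement: kAudioObjectPropertyElementMaster)
    var size: UInt32 = 0
    AudioObjectGetPropertyDataSize(id, &addr, 0, nil, &size)
    return size > 0
}

// List everything the HAL sees, including the USB-attached iPhone.
for id in audioDevices() {
    print("\(id): \(deviceName(id)) hasOutput=\(hasOutputStreams(id))")
}

// Hypothetical: if the iPhone exposed output streams, this is how it would be made the
// system default output. In my case the output side is grayed out, so this likely never matches.
if let phone = audioDevices().first(where: { deviceName($0).contains("iPhone") && hasOutputStreams($0) }) {
    var devID = phone
    var addr = AudioObjectPropertyAddress(mSelector: kAudioHardwarePropertyDefaultOutputDevice,
                                          mScope: kAudioObjectPropertyScopeGlobal,
                                          mElement: kAudioObjectPropertyElementMaster)
    AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject), &addr, 0, nil,
                               UInt32(MemoryLayout<AudioDeviceID>.size), &devID)
}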
Related
Is it possible to identify the MIDI device(s) connected (on iOS) when using the Camera Connection Kit? I don't see a function for that, although you can specify a name when starting the MIDI output. I'm guessing there is a way to get a name somehow.
I'm assuming that if you're using class-compliant MIDI-to-USB converters, the MIDI devices will register automatically. There is nothing specific to the Camera Connection Kit.
It seems to me that real (hardware) (USB-)MIDI devices connected to Mac are not listed when running in the simulator. Is there an option to turn this functionality on?
For a concrete example, using e.g. MIKMIDI, MIKMIDIDeviceManager.shared().availableDevices returns 1 device with no model name or manufacturer, regardless of whether an actual physical device is plugged in or not (as shown in Audio MIDI Setup). I had similar results with MIDIFish.
'audio' background mode is enabled in the entitlements file.
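For reference, here is a bare CoreMIDI check (no wrapper library) that should show the same thing; this is just a sketch for comparing what the simulator and a real device report, and display names can be empty for virtual endpoints.

import CoreMIDI

var client = MIDIClientRef()
MIDIClientCreate("EnumClient" as CFString, nil, nil, &client)

func displayName(of object: MIDIObjectRef) -> String {
    var cfName: Unmanaged<CFString>?
    MIDIObjectGetStringProperty(object, kMIDIPropertyDisplayName, &cfName)
    return cfName.map { $0.takeRetainedValue() as String } ?? "(no name)"
}

// Compare the counts and names reported in the simulator vs. on a real device.
print("devices: \(MIDIGetNumberOfDevices()), sources: \(MIDIGetNumberOfSources())")
for i in 0..<MIDIGetNumberOfSources() {
    print("source \(i): \(displayName(of: MIDIGetSource(i)))")
}
for i in 0..<MIDIGetNumberOfDevices() {
    print("device \(i): \(displayName(of: MIDIGetDevice(i)))")
}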
On the other hand, I am planning to buy a Lightning-to-USB Camera Adapter for this purpose. However, I am not sure how I would debug such a configuration: if the iPhone's Lightning port is already occupied by the adapter, how would I connect Xcode on the Mac to the iPhone?
Any other idea? How do people typically do this?
Thank You.
I am looking to send MIDI signals through the lightning cable to the MIDI network on my laptop.
Is there any way to do this?
The MIDINetworkConnection class has a connectionWithHost method that allows Wi-Fi connections, but I'm looking for something faster, so I'm wondering whether the same can be done over the cable.
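For context, the Wi-Fi path I am referring to boils down to roughly this (a sketch; the host name, address, and port are placeholders, 5004 being the conventional RTP-MIDI port):

import CoreMIDI

// Enable the default network (RTP-MIDI over Wi-Fi) session and connect to a known host.
let session = MIDINetworkSession.default()
session.isEnabled = true
session.connectionPolicy = .anyone

let host = MIDINetworkHost(name: "My Mac", address: "192.168.1.10", port: 5004)
_ = session.addConnection(MIDINetworkConnection(host: host))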
macOS and iOS now do this natively with IDAM (inter-device audio and MIDI).
Attach an iOS device to your Mac and unlock it.
If prompted to trust, trust the host Mac.
Launch Audio MIDI Setup.
If it is not visible, show the Audio Devices window (Cmd + 1).
Click the "Enable" button next to the iOS device.
A MIDI device called "iPhone" or "iPad" will show up in the MIDI Studio window.
On the iOS device, route MIDI to the IDAM MIDI Host destination.
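As a rough sketch of that last routing step (my illustration, not part of the IDAM setup itself), the iOS side could locate the IDAM destination by name and send a test Note On like this; the exact display name may differ:

import CoreMIDI

var client = MIDIClientRef()
var outPort = MIDIPortRef()
MIDIClientCreate("IDAMClient" as CFString, nil, nil, &client)
MIDIOutputPortCreate(client, "IDAMOut" as CFString, &outPort)

for i in 0..<MIDIGetNumberOfDestinations() {
    let dest = MIDIGetDestination(i)
    var cfName: Unmanaged<CFString>?
    MIDIObjectGetStringProperty(dest, kMIDIPropertyDisplayName, &cfName)
    let name = cfName.map { $0.takeRetainedValue() as String } ?? ""
    guard name.contains("IDAM") else { continue }

    // One packet containing a Note On (channel 1, middle C, velocity 100).
    var noteOn: [UInt8] = [0x90, 60, 100]
    var packetList = MIDIPacketList()
    let packet = MIDIPacketListInit(&packetList)
    _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size, packet, 0, noteOn.count, &noteOn)
    MIDISend(outPort, dest, &packetList)
}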
I don't think you can do it natively, but there is a way to use the lightning cable as a network interface to send data to the computer.
The new version of touchAble does that, but it requires a server application that runs on the computer, talking to the device and opening a virtual MIDI port to route the MIDI messages to your DAW.
Depending on the nature of your project, you might also need to look into the Apple MFi program, if it is not out of scope. Or check that other discussion about USB communication.
We are building an iOS app that does basic speech recognition. Basically, the app counts the number of words you speak into the iOS device. The app works well when speaking into the standard microphone built into the iPhone. However, when connecting a wireless Bluetooth audio device, we are unable to use that Bluetooth device as a method for recording voice audio. We are using the following software and devices:
built for iOS 7.0/7.1 with the OpenEars library for speech recognition
we’re using the ZOMM Wireless Leash (http://www.zomm.com/ | http://www.amazon.com/ZOMM-Wireless-Bluetooth-Speakerphone-Black/dp/B003N3J6BU/ref=sr_1_1?ie=UTF8&qid=1409515088&sr=8-1)
Tried other Bluetooth devices with the same behavior (Bluedio 66i and Bluedio DF200)
Unable to capture audio in default Voice Memo app
As far as we know, this device simply uses the standard Bluetooth protocols; as we understand it, once the Bluetooth device is paired, iOS should automatically start accepting it as a device for recording/audio capture.
According to OpenEars, the Bluetooth audio devices should be picked up automatically (http://www.politepix.com/forums/topic/enabling-bluetooth-support/). Are we right in assuming this?
We used the VoiceMemo app (the voice recording app that ships with iOS) to test the Bluetooth device as a “control” experiment:
Pair the ZOMM with the iOS device
Open VoiceMemo
Select ZOMM as input device from within the VoiceMemo app
Start recording
Stop recording – no audio was captured
Unfortunately, this means that neither our app nor the standard voice recording app is able to use the Bluetooth device as a means of recording audio. Either way, it’s hard to say whether the device alone is the issue.
We’re curious to understand whether this is simply a hardware issue (meaning we would need a Bluetooth device that supports voice recording on iOS), or whether there is something we need to enable in the code for the app to start accepting the device as a recording input.
Also, more details about the ZOMM headset:
ZOMM specifications:
Bluetooth Wireless Compatibility:
This ZOMM device supports the following Bluetooth wireless protocols and profiles:
• Bluetooth core technology v2.1+EDR
• Hands-Free Profile (HFP) v1.5 headset role
• Headset Profile (HSP) v1.2 headset role
Bluetooth Wireless Interoperability:
This ZOMM device is designed to interoperate with all Bluetooth wireless products that support compatible profiles and roles, including:
• Bluetooth core technology v3.0, v2.1+EDR, v2.0+EDR, v1.2
• Bluetooth master and slave roles
• Bluetooth Hands-Free Profile (HFP) v1.5 and prior, headset (HS) role
• Bluetooth Headset Profile (HSP) v1.2 and prior, headset (HS) role
Any ideas on what we could do to resolve this issue and use Bluetooth together with the OpenEars library on iOS 7.1?
Thanks! Philip
Testing with Voice Memo is logical; however, the app may not be allowing the route to change when Bluetooth is connected. A detailed explanation is here: iOS: Using Bluetooth audio output (kAudioSessionProperty_OverrideCategoryEnableBluetoothInput) AudioSession.
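In modern AVAudioSession terms (a sketch; not the deprecated C AudioSession property named in that link, but its current equivalent as I understand it), the relevant switches would be roughly:

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // allowBluetooth permits the HFP (hands-free) route that headsets like the ZOMM use for recording.
    try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    try session.setActive(true)

    // If a Bluetooth HFP input is present, ask for it explicitly. Do this before
    // OpenEars starts listening (my assumption about the ordering).
    if let btInput = session.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(btInput)
    }
} catch {
    print("Audio session setup failed: \(error)")
}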
With OpenEars, I believe you can enable logging to track when the audio route changes, so you can verify via logging that it is listening where it should be, or not listening at all. I would suggest upgrading, as per this post: http://www.politepix.com/forums/topic/small-bug-when-running-on-ios-8/ . If you scroll to the bottom, you can see that Bluetooth should work now.
Another test worth running is SaveThatWave within OpenEars. I have not used it, but it should let you verify what you are actually listening to.
Is it possible to change the audio route at the iPhone's system level when headphones are plugged in or unplugged? I want the input and output audio to go only through the internal iPhone microphone and speaker, whether headphones are connected or not. I have tried kAudioSessionOverrideAudioRoute_Speaker; it works fine at the application level but not at the system level. It is an enterprise application. Any ideas? Please help me.
As per Apple's Audio Session Programming Guide, you can only force audio playback through the iPhone speaker at the app level, by setting kAudioSessionProperty_OverrideAudioRoute to kAudioSessionOverrideAudioRoute_Speaker. AFAIK it is not possible at the system level.
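For completeness, the app-level version with today's AVAudioSession API looks roughly like this (a sketch; whether the preferred-input request sticks while wired headphones are attached is ultimately up to the system, and nothing here changes the route for other apps):

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)

    // Modern equivalent of kAudioSessionOverrideAudioRoute_Speaker: force playback
    // to the built-in speaker for this app only.
    try session.overrideOutputAudioPort(.speaker)

    // Prefer the built-in microphone for input, if the system lists it.
    if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
        try session.setPreferredInput(builtInMic)
    }
} catch {
    print("Route override failed: \(error)")
}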