Interfacing with AirPods on iOS

I am trying to write an iOS app that takes the left and right audio from connected AirPods' microphones and plays the sound back in real time, after some processing.
Despite much searching on Apple's developer sites, and a few related but unhelpful Stack Overflow questions, I haven't found any information on how to connect to and talk to the AirPods.
How do you connect to these devices, and where is a good place to find information about the technology?
Thanks.
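For anyone landing here later: as far as the public APIs go, the AirPods do not expose their two microphones as separate left/right streams; over Bluetooth the mic arrives via HFP as a single mono channel. A minimal real-time passthrough sketch (untested, assuming AVFoundation; the class name is mine) looks like:

```swift
import AVFoundation

// Untested sketch: route the AirPods microphone back to the output in
// real time. Over Bluetooth, input arrives via HFP as ONE mono stream;
// the left and right bud microphones are not separately exposed.
final class MicPassthrough {
    private let engine = AVAudioEngine()

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        // .allowBluetooth enables the HFP route needed for Bluetooth input.
        try session.setCategory(.playAndRecord,
                                mode: .voiceChat,
                                options: [.allowBluetooth])
        try session.setActive(true)

        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        // Insert DSP nodes (e.g. an AVAudioUnitEQ) between the input and
        // the mixer to do the "some processing" step before playback.
        engine.connect(input, to: engine.mainMixerNode, format: format)
        try engine.start()
    }
}
```

Expect phone-call quality and noticeable latency on this route; for tighter control you would instead install a tap on the input node and feed processed buffers to an AVAudioPlayerNode.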

Related

AVAudioSession.RouteChangeReason not detecting connected bluetooth headset/earphones

I'm working on a voice-calling app for iOS in Swift, and I need auto-detection and auto-connection of wireless (Bluetooth) headsets/earpieces.
I'm using AVAudioSession.RouteChangeReason, but it doesn't detect when AirPods or another wireless headset is connected via Bluetooth. After connecting the device I can even see in the Bluetooth settings that it's connected, but the device is not listed in MPVolumeView, as you can see in the screenshot.
I've looked all over the internet for an answer, and even though there are a lot of similar questions here on Stack Overflow, none of them is specifically related to my case, so please shed some light on how to solve this.
I just reread the code and finally found what was wrong with it. The problem was that I was setting the category more than once, which for some reason prevented the wireless headset/earpiece from connecting directly to the app.
.allowBluetoothA2DP is not valid for .playAndRecord: you can't record over A2DP, so remove that option. Note that once you do this, if the user is listening to music, the quality is going to drop dramatically (it's possible that AirPods have a proprietary workaround for that; I haven't tried them in this mode). To record over Bluetooth you have to use HFP, which provides basically "telephone quality" audio.
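Putting the two answers together, a sketch along these lines (untested; API names as in current Swift) sets the category exactly once and then observes route changes:

```swift
import AVFoundation

// Untested sketch: configure the session ONCE, then watch route changes.
func configureSessionAndObserve() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth])   // HFP, not A2DP
    try session.setActive(true)

    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: session,
        queue: .main
    ) { note in
        guard let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
              let reason = AVAudioSession.RouteChangeReason(rawValue: raw)
        else { return }

        if reason == .newDeviceAvailable {
            // e.g. AirPods just connected; inspect the active route:
            for output in session.currentRoute.outputs {
                print("Routed to \(output.portName) (\(output.portType.rawValue))")
            }
        }
    }
}
```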

How do I detect the presence of an AirPlay audio-only device on the network?

I'm unable to find code online (yes, I've googled aplenty) detailing how to discover devices that support the audio-only version of AirPlay. There doesn't seem to be anything on Apple's developer site either; I spent an hour scouring Google and searching through Apple's documentation. I'm clearly missing something here.
I did find "Using External Display", but that only describes how to discover and connect to devices that support AirPlay mirroring: they are essentially recognized as another display, and you just begin drawing to it.
I also found "Providing an AirPlay Picker", but it only describes how to present an AirPlay picker when a device is available, not how to know in general when a device is available (I'd like to run other logic in my app based on audio-device availability).
My end goal is to prevent the user from trying to use an audio-only AirPlay device with my app, which requires video support. I plan to show a warning message when one or more audio-only devices and no video-capable devices are detected.
I'm specifically interested in code for audio-only devices, like the first-generation Apple TV, AirPort base stations, and AirPlay-enabled speakers. This is particularly important because Apple TVs don't look visibly different across generations, and a user might think they can use the app over AirPlay when in fact they cannot.
Can someone point me towards the right resource?
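This question predates it, but since iOS 11 AVRouteDetector can at least report whether any AirPlay route is reachable. Note that the public API still does not distinguish audio-only from video-capable endpoints, so this only gets you partway (untested sketch):

```swift
import AVFoundation

// Untested sketch: detect whether ANY wireless/AirPlay route is available.
// Capability filtering (audio-only vs. video) is not publicly exposed.
let detector = AVRouteDetector()
detector.isRouteDetectionEnabled = true   // detection costs battery; disable when idle

// multipleRoutesDetected is documented as key-value observable:
let observation = detector.observe(\.multipleRoutesDetected,
                                   options: [.initial, .new]) { det, _ in
    // true when playback could be routed somewhere other than this device
    print("Wireless routes available: \(det.multipleRoutesDetected)")
}
```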

How do apps like iBetterCharge and coconutBattery work?

I've been using iBetterCharge for some time and I quite like it, apart from the fact that its loud warning sound is sometimes annoying.
Anyway, I'm wondering how apps like iBetterCharge and coconutBattery work. I mean, how do they read the iPhone's battery level over a wireless network?
My personal research did not reveal much about reading the battery level of a device programmatically. However, according to the iBetterCharge FAQ, the app does not talk to the iPhone at all. They say it works with the data that iTunes itself gathers.
But how?
It uses the protocols that iTunes itself uses to sync with iOS devices; the FAQ tells you everything you need to know. Check out libimobiledevice: this library implements those protocols and can be used to retrieve battery info. I know it works over USB, but I don't know what state the network implementation is in (it's marked as a work in progress).
You can try the example here. You need to pass com.apple.mobile.battery as the domain to retrieve battery info.
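If it helps, the libimobiledevice command-line tools expose the same lookup. Something like this (assuming the tools are installed and a device is paired over USB):

```shell
# Query the battery domain of a paired device over USB.
ideviceinfo -q com.apple.mobile.battery

# The returned plist typically includes keys such as
# BatteryCurrentCapacity (a percentage) and BatteryIsCharging:
ideviceinfo -q com.apple.mobile.battery -k BatteryCurrentCapacity
```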

Transfer pictures from external camera circuit within my iOS app programmatically

I'm working on my senior engineering design project and I need your help! My iPhone app needs to receive images from an external camera circuit, which I built.
To interface my iPhone app with the camera circuit, I have looked into the following approaches:
Build a Bluetooth module into the camera circuit to transfer images to the iPhone
Use an Eye-Fi SD card to transfer images to my app somehow! link: http://www.eye.fi/products/iphone
Build a circuit that makes a wired connection to the iPhone through the 30-pin dock connector
Here are the problems I'm facing with each of these. My actual questions for you are highlighted in BOLD:
The iOS Bluetooth framework (4S only) supports only Low Energy devices. Looking at the modules out there, like this one, I doubt it will work for image transfer, which seems like a bulky task for Bluetooth Low Energy. I know there are jailbreak apps on the Cydia store that do classic Bluetooth transfers, but I was unable to find the private APIs for such a task. (NOTE: I'm making this app for my own purposes, so feel free to suggest any private/unofficial APIs.) Question #1: How can I interface with a regular Bluetooth device (not another iPhone) and transfer data?
The Eye-Fi card sounded amazing as a consumer, because the company has its own proprietary iPhone app to transfer the images from the Eye-Fi SD card. The problem is that I can't figure out how to interface with the Eye-Fi card in my own code. I researched the iOS CFNetwork framework but haven't had any luck. Question #2: How can I interface with the Eye-Fi card in my app?
Building a circuit seems simple enough with this development board, but I read somewhere that the iPhone may not recognize an "unregistered" accessory. I have a developer license but not an MFi license. Question #3: Do I need to be registered as an MFi developer to create and use this external accessory in my app for my own purposes?
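On Question #1: Core Bluetooth, the only non-MFi Bluetooth API, talks to Low Energy peripherals only, and BLE throughput is indeed marginal for image transfer. For completeness, a scan sketch (untested; class name is mine) looks like:

```swift
import CoreBluetooth

// Untested sketch: Core Bluetooth reaches only BLE peripherals. Classic
// (BR/EDR) devices require the External Accessory framework, i.e. MFi.
final class BLEScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            // nil = scan for all services; pass your camera's service UUID
            // here if it advertises one.
            central.scanForPeripherals(withServices: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print("Found \(peripheral.name ?? "unknown"), RSSI \(RSSI)")
    }
}
```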
You might try setting something up through a serial port, since joining the MFi program is closed to individuals. You could possibly use a connector like this one: http://www.amazon.com/neXplug-Ultra-Small-Micro-Adapter/dp/B0055PCVDO/ref=sr_1_1?ie=UTF8&qid=1339309918&sr=8-1
Apple's FAQ (mfi.apple.com/faq) recommends that individuals/hobbyists "use a third-party solution which will allow you to connect iOS devices to serial devices and to write iOS apps that communicate with these serial devices".
I am also working on an external camera that can hook up to the iPhone/iPad, and I will be using a serial port to get around the MFi requirement for external iPhone/iPad devices. Bluetooth is too complicated, and its data stream isn't big enough for pictures; the wired version will work much better.
I hope this helps and that your college term and project aren't already finished. Best of luck.
As T Reddy has already mentioned, if you want to create hardware that interfaces through the External Accessory framework, you have to sign up for the Apple MFi program, which you, as an individual, cannot do.
I'm not sure exactly how the Eye-Fi system works, but it sounds like it basically syncs the images to the company's server, and once you download their app, it syncs the photos for you.
Whether you are using Bluetooth or the 30-pin connector, there is no way to interface with an external device unless that device is MFi compliant and part of the MFi program. I suggest the following options for this dilemma:
If this is a "senior project" at a university, see if your university is part of MFi. Apple will not let individuals join the program, so to gain access you would have to go through another organization or possibly an educational institution. I don't know whether Apple has worked with schools in this regard, but you never know; it might be possible.
If your school isn't in the MFi program, you may want to consider rewriting your application for an Android device. Android devices are not locked down the way iOS devices are, so that may be a more reasonable approach.
I hate to bring bad news, but circumventing these hardware restrictions on an iOS device is effectively impossible. Your options are quite limited, and none of them is probably what you want or need to hear.

iPad/iPhone app for people with disabilities

I am developing an eBook reader app for the iPad. The app also includes features such as audio playback, video playback, and note-taking.
Now I want to add some features so that the app can help people who are visually impaired (blind) or hearing impaired (deaf).
I know there is an API called UIAccessibility for this specific purpose, but I could not find suitable sample code from Apple or from any other developer on this.
Right now I'm a bit confused about how to proceed. Can anyone suggest approaches, methods, or sample code so that I can move forward with my idea?
There are excellent WWDC session videos (from both 2009 and 2010) on using the UIAccessibility API, available through Apple's iOS developer portal.
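As a starting point, much of UIAccessibility is just a handful of properties on UIView. A minimal sketch for VoiceOver support (API names as in current Swift; the label strings are made up for illustration):

```swift
import UIKit

// Minimal sketch: expose a custom control to VoiceOver.
let playButton = UIView()
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play"                            // what it is
playButton.accessibilityHint = "Plays the current chapter aloud"  // what it does
playButton.accessibilityTraits = .button

// Announce dynamic changes (e.g. a note was saved) to VoiceOver users:
UIAccessibility.post(notification: .announcement, argument: "Note saved")
```

For hearing-impaired users the equivalent work is content-side: captions or transcripts for your audio and video rather than UIAccessibility calls.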
