Can I transfer an audio stream from one iOS device to another (for example, from a 4S to the new iPad) using the CoreBluetooth framework? Or maybe BLE is too slow for media streaming?
Bluetooth Low Energy (BLE) is not designed for streaming data!
If you want to stream, you must use Bluetooth 2.x+EDR and an appropriate profile. So if you want to stream audio, you need the Headset or A2DP profile.
The CoreBluetooth API only gives access to BLE devices.
Audio streaming won't work well, since BLE on iOS 5 can only send 20-byte packets, with a 37.5 ms delay between transfers. Streaming would be laggy and as good as useless. There is always the possibility of buffering the data, but in the end this is not a good way to stream audio.
|packet| --- 37.5ms --- |packet| --- 37.5ms --- |packet...
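Those figures make the bandwidth problem concrete. A quick back-of-the-envelope calculation in Swift (the 20-byte / 37.5 ms numbers come from the answer above; the 8 kHz mono PCM stream is just a hypothetical low-end comparison):

```swift
// BLE throughput on iOS 5, per the figures above:
let packetSize = 20.0          // bytes per packet
let interval = 0.0375          // seconds between packets
let bleBytesPerSecond = packetSize / interval   // roughly 533 bytes/s

// Even a low-quality telephony-grade stream needs far more:
let sampleRate = 8_000.0       // Hz, 8 kHz mono
let bytesPerSample = 2.0       // 16-bit PCM
let audioBytesPerSecond = sampleRate * bytesPerSample   // 16,000 bytes/s

// BLE delivers about 1/30 of what even that stream requires.
let shortfall = audioBytesPerSecond / bleBytesPerSecond
print(bleBytesPerSecond, audioBytesPerSecond, shortfall)
```

So even before codec overhead, BLE at these rates falls short by more than an order of magnitude; buffering only trades lag for a longer delay.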
Related
I have a device with a camera and I want to connect to it from my iPhone via Bluetooth, so the question: is it possible to send a real-time video stream over Bluetooth using Swift/Objective-C?
It is not possible. Bluetooth's transmission speed is not high enough to stream video in real time to another device. If it were audio, it would potentially be a different story.
You can use Bluetooth to transfer a video file to another device, but not to stream it, as far as I know.
I'm trying to determine how Apple AirPods pair and connect as seamlessly as they do, but I couldn't find any in-depth technical explanation, so I embarked on a journey to figure it out for myself. I used an Ellisys Explorer Bluetooth sniffer to capture both BLE and Bluetooth Classic packets from the AirPods and the iPhone I used to connect to them.
The issue is that I lack the background knowledge in Bluetooth to fully understand what I am looking at, so I'm hoping somebody can explain what appears in the sniffer snapshots below:
The picture below lists the BLE packets captured after the AirPods case has been opened but BEFORE connecting to the phone.
The picture below lists the Bluetooth Classic packets captured after the AirPods case has been opened but BEFORE connecting to the phone.
The picture below lists the Bluetooth Classic packets captured AFTER connecting to the phone, on top of the previous ones.
Note that there are no new BLE packets picked up after connecting.
The first picture shows that both earbuds are sending advertising packets.
Then one earbud pages the other and they exchange information.
Then the iPhone connects to one of the earbuds, just like a normal A2DP connection.
More captures taken while audio has just started playing would be helpful.
Before analysing packets you need to learn about the CoreBluetooth framework. CoreBluetooth deals with scanning for, connecting to, and reading and writing data on "Bluetooth Low Energy" (BLE) devices. BLE peripherals continuously broadcast a small packet of data when they are not connected to any central device.
The first image shows the data being broadcast by the BLE peripheral, in your case an AirPod.
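To see those advertising packets from the app side (rather than a sniffer), the usual CoreBluetooth pattern is to create a central manager and scan. A minimal sketch, with class and variable names that are illustrative, not Apple's:

```swift
import CoreBluetooth

// Minimal central that scans for advertising peripherals and logs
// each advertisement payload - roughly the app-side view of what
// the first sniffer capture shows.
final class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Scanning is only allowed once the radio reports powered on.
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        print(peripheral.identifier, RSSI, advertisementData)
    }
}
```

Note that iOS filters and rewrites some advertisement fields before handing them to apps, so the `advertisementData` dictionary will not match the raw sniffer bytes one-for-one.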
We have a VOIP app that generally transfers audio packets with a sample rate of 32 kHz. For what would seem to be a reasonable match, we've typically set the AVAudioSession's preferred sample rate to 32 kHz as well. On later iPhones (e.g. the iPhone XS) we've found the speakerphone no longer plays, or is garbled, when using a sample rate of 32 kHz. Yet the audio session seems to happily accept (with read-back confirmation) a preferredSampleRate of 32 kHz. I've read that the iPhone 6S codec (and perhaps later devices) only supports 48 kHz sample rates... but if that is the case, why wouldn't iOS override the setPreferredSampleRate?
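One way to detect the mismatch at runtime is to compare the session's actual `sampleRate` after activation, rather than trusting the preferred value. A sketch under that assumption (error handling trimmed; the 48 kHz comment reflects the behavior described above, not a documented guarantee):

```swift
import AVFoundation

// setPreferredSampleRate is only a hint; the value that matters is
// session.sampleRate after activation, which reflects what the
// hardware route actually granted.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .voiceChat, options: [])
try session.setPreferredSampleRate(32_000)
try session.setActive(true)

if session.sampleRate != 32_000 {
    // e.g. 48,000 on devices whose codec runs at 48 kHz only -
    // resample the VOIP packets rather than forcing 32 kHz.
    print("hardware granted \(session.sampleRate) Hz")
}
```

In other words, `preferredSampleRate` reads back whatever you asked for; only `sampleRate` tells you what you actually got.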
I have an app that used to use GKSession for voice communication between multiple devices. In particular, I used [GKVoiceChatService defaultVoiceChatService] to establish voice communications. Although deprecated back in iOS 7, this continued to work well until now: in iOS 9 this functionality works only over wifi, but not Bluetooth (which is a requirement for my app).
While trying to convert this to use MCSession, I stumbled at the part where audio streaming comes in. I can successfully connect peers over MCSession, but cannot figure out how to treat audio.
I have seen some tutorials on streaming camera video: multipeer connectivity for video streaming between iphones
And for streaming audio from a file: https://robots.thoughtbot.com/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity
However, I still cannot quite come up with a way to stream audio from the microphone. Anyone have any tips?
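One approach that follows from the linked tutorials: tap the microphone with AVAudioEngine and write the raw PCM buffers into an `OutputStream` opened over the MCSession. A hedged sketch - `session` and `peer` are assumed to be your already-connected `MCSession` and `MCPeerID`, the stream name "mic" is arbitrary, and the receiver must know the format (sample rate, Float32) to play the bytes back:

```swift
import AVFoundation
import MultipeerConnectivity

// Tap the mic and push raw PCM samples over a multipeer stream.
func startStreamingMic(over session: MCSession, to peer: MCPeerID) throws {
    let stream = try session.startStream(withName: "mic", toPeer: peer)
    stream.schedule(in: .main, forMode: .default)
    stream.open()

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Send the first channel's Float32 samples as raw bytes.
        guard let channel = buffer.floatChannelData?[0] else { return }
        let byteCount = Int(buffer.frameLength) * MemoryLayout<Float>.size
        channel.withMemoryRebound(to: UInt8.self, capacity: byteCount) {
            _ = stream.write($0, maxLength: byteCount)
        }
    }
    try engine.start()
}
```

In practice you would compress before sending (raw Float32 at 44.1 kHz is ~176 kB/s) and buffer writes off the audio thread, but this shows the basic plumbing from microphone tap to multipeer stream.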
I have a security cam which sends its video stream over 2.4 GHz to a receiver. I'd like to know whether it's possible to receive this signal on an iPhone and show the video stream. Since WiFi also transmits at 2.4 GHz, shouldn't the iPhone be able to receive that signal?
Security Cam: http://www.jay-tech.de/jaytech/servlet/frontend/content.html?articleOID=d583e45:-495a2735:120c7c04348:446c&keywordOID=d583e45:946c233:1182e6a651d:e4e.
My iPhone is an iPhone 5s on iOS 8.1.
If it's not possible on the iPhone, is it perhaps possible to catch the signal with some other device? These are the devices I could use:
Raspberry Pi, an old WiFi USB stick, an Arduino Uno, and a bunch of cables for TV/audio/video etc.
Thanks iComputerfreak
Sorry for my bad English, I'm German ;)
In short, no. To receive the signal, you'd need dedicated hardware that receives it and encodes it into a format the iPhone could understand. It's not possible to arbitrarily capture wireless signals on a particular frequency and decode them in software - not on an iPhone, anyway.
Your best solution would be to look for some external hardware which operates on the same frequency, and can encode the video signal over a wifi network - I'd be surprised if such a device doesn't exist, though it may not be cheap. The iPhone can then simply receive the encoded video via wifi and use it like any other video stream.