Does iOS delay peripheralManager:didReceiveWriteRequests: and peripheralManager:didReceiveReadRequest:?

I use an iPhone as a peripheral to communicate with a microcontroller (the BLE chip in question is the BGM113). After connecting, the MCU sends a couple of read and write requests for characteristics serially. Each request takes only a few ms on the MCU. On the iPhone side, responding to each request also takes only a few ms in the relevant methods (peripheralManager:didReceiveWriteRequests: and peripheralManager:didReceiveReadRequest:).
Still, I see a delay of roughly 500 ms between each request. I have a support request running with the Bluetooth chip manufacturer to clarify, but my gut feeling tells me that the fruity company is to blame...
Can anyone confirm such delays when reading or writing characteristics?
(More details: all characteristics are in the same service, reads and writes may happen on the same characteristic serially, and there are several characteristics that I operate on.)
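
For context, the peripheral-side handling described above boils down to something like the following (a minimal Swift sketch with one stored value; the class name and single characteristic are assumptions, the real app serves several characteristics):

import CoreBluetooth

class PeripheralHandler: NSObject, CBPeripheralManagerDelegate {
    var storedValue = Data()

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        // Set up the service and characteristics once state is .poweredOn.
    }

    func peripheralManager(_ peripheral: CBPeripheralManager,
                           didReceiveRead request: CBATTRequest) {
        request.value = storedValue
        peripheral.respond(to: request, withResult: .success) // answered within ms
    }

    func peripheralManager(_ peripheral: CBPeripheralManager,
                           didReceiveWrite requests: [CBATTRequest]) {
        for request in requests {
            storedValue = request.value ?? Data()
        }
        // Responding to the first request acknowledges the whole batch.
        peripheral.respond(to: requests[0], withResult: .success)
    }
}

Even with handlers this cheap, each round trip still has to wait for the next connection event, which is where the answer below comes in.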

Your delay will be between 1 and 2 times the connection interval, so set the connection interval to match your maximum delay requirement. Note, though, that the radio's energy consumption is proportional to the inverse of the connection interval: halving the interval roughly doubles the power draw.
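
A quick sanity check on the numbers above (an inference, not a measurement): with the delay $d$ bounded by

$T \le d \le 2T$

an observed $d \approx 500\,\mathrm{ms}$ implies a connection interval $T$ of roughly 250-500 ms, far longer than necessary, so negotiating a shorter interval is the lever to pull.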

Related

NRF51 - iOS BLE advertising interval - Increase speed of connection

I am looking at speeding up the connection time between my iOS application and the peripheral.
I have looked up Apple's documentation on the subject: https://developer.apple.com/library/content/qa/qa1931/_index.html
Originally (prior to reading the document above) I had the advertising interval set to 2 seconds as what I thought would be a good compromise between power consumption and connection time. Having read the documentation, I changed the interval to 1285 ms.
#define ADVERTISING_INTERVAL 2056 // in units of 0.625 ms: 2056 * 0.625 = 1285 ms
ble_obj.setAdvertisingInterval(ADVERTISING_INTERVAL);
The device is always discovered quickly by the app, but the problem comes when trying to connect.
However, I have seen no improvement in connection time between my application and the peripheral device. Connections can take anything from 3-4 seconds up to 30+ seconds.
Is there something I am missing, either on the peripheral or the central side?
The peripheral BT chip is the Nordic Semiconductor nRF51822.
On examining the device's advertising packets in the Nordic Semiconductor app, I can see that the advertising interval normally varies from 1275 ms to about 1295 ms (as expected, due to the random delay added to each advertising event?).
NOTE
I have also tried an advertising interval of 152.5 ms and am still not seeing any major improvement in connection speed. I am, obviously, seeing a marked improvement in discovery speed.
What you observe is normal. Don't expect fast connection setup with an advertising interval of more than a second.
Core Bluetooth uses a high duty-cycle scan window/interval for the first few seconds after a connection is initiated. If it doesn't connect in that time, it continues to scan with much more power-restrictive parameters.
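
To put numbers on this from the iOS side, a minimal sketch for timing connection setup (it scans for any peripheral here; a real app would filter by service UUID):

import CoreBluetooth

class ConnectionTimer: NSObject, CBCentralManagerDelegate {
    var central: CBCentralManager!
    var peripheral: CBPeripheral?   // must hold a strong reference
    var connectStart: Date?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        central.stopScan()
        self.peripheral = peripheral
        connectStart = Date()               // initiation begins here
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didConnect peripheral: CBPeripheral) {
        if let start = connectStart {
            print("Connection setup took \(Date().timeIntervalSince(start)) s")
        }
    }
}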

Android BLE advertising interval change

Since Android 5.0, the platform supports the peripheral role, which allows apps to broadcast advertisements. In my app's case, I need to broadcast every 20-30 ms, but I can't find any way to change the advertising interval. The default advertising interval is between 20 ms and 600 ms, which is totally unacceptable in my case.
You can't control the exact millisecond frequency, but you can experiment with the different settings from https://developer.android.com/reference/android/bluetooth/le/AdvertiseSettings.html, such as ADVERTISE_MODE_LOW_LATENCY (https://developer.android.com/reference/android/bluetooth/le/AdvertiseSettings.html#ADVERTISE_MODE_LOW_LATENCY).
ADVERTISE_MODE_LOW_LATENCY is the fastest mode available; it sends a packet at roughly a 100 ms interval.
If you need "faster" packet transfer, the workaround is to create multiple advertisers. With 5 such parallel advertisers you eventually get down to around 20-30 ms. Yes, it is possible, but remember that you are flooding the environment with 5x the data.
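
The arithmetic behind that workaround, assuming the $N$ advertisers end up roughly evenly staggered in time:

$T_{\mathrm{eff}} \approx T / N = 100\,\mathrm{ms} / 5 = 20\,\mathrm{ms}$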

Reducing connection interval from the default 30ms

I know the default connection interval for Core Bluetooth is 30 ms. I've read a couple of articles that claim the interval can be reduced below 30 ms by changing its min and max values, but I didn't see any explanation of how they changed those parameters. I am assuming this is all on the iOS end.
Currently I am working on a project where the iOS device sends packets to a Bluetooth LE device. When I was writing without response, a lot of packets were being dropped, so I added a handshake: once the Bluetooth device receives a packet, iOS sends the next one. Uploading a file this way takes a long time because of the 30 ms connection interval, which I am trying to reduce.
Any suggestions would be helpful
tl;dr: How do I change the connection interval on iOS?
Solution: After doing some research, I found there is no public API that allows an iOS device to request a connection interval change. On Android this is possible.
There is no API on iOS for an app acting as master (using CBCentralManager) to modify the initial connection parameters when connecting to a peripheral.
However, the slave can suggest new connection parameters using an L2CAP Connection Parameter Update Request (see Bluetooth 4.0 specification, Volume 3, Part A, Section 4.20), which iOS will accept if they are reasonable (see the Bluetooth Accessory Design Guidelines for Apple Products, section 3.6 "Connection Parameters"). Peripherals should do this because different operating systems have different default connection parameters that might not be optimal for a particular peripheral. For example, if you are implementing your peripheral on iOS or OS X, call -[CBPeripheralManager setDesiredConnectionLatency:forCentral:]. Or, if you are using the TI BLE stack to program a CC2540 or the like, call the function L2CAP_ConnParamUpdateReq.
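
For the iOS/OS X peripheral case, the call looks roughly like this (a sketch; here it is made when a central subscribes, and iOS treats the value only as a hint, not a guaranteed interval):

import CoreBluetooth

func peripheralManager(_ peripheral: CBPeripheralManager,
                       central: CBCentral,
                       didSubscribeTo characteristic: CBCharacteristic) {
    // Ask the system for a short connection interval for this central;
    // iOS picks the actual parameters.
    peripheral.setDesiredConnectionLatency(.low, for: central)
}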

Core Bluetooth Peripheral disconnects every 30 seconds

I have an app that acts as a Bluetooth LE peripheral. I have a single service with four characteristics. Two of the four are read/write only; the other two are configured as notify.
If I subscribe to one of the "notify" characteristics, the app will not disconnect until I do so manually; that works well.
My issue is: if I read or write to the other characteristics and am then inactive for around 30 seconds, the BLE connection to the peripheral is dropped. This may be a limitation set by Apple; I'm not sure.
Does anyone know of a solution to keep the peripheral connection active even when there are no subscribers and no read or write command has been received for 30 seconds?
This is a by-product of the BLE 4.0 specs. Bluetooth Low Energy is explicitly designed not to maintain a connection for long idle periods, which is what you are describing.
The only way to bypass this (beyond subscribing to a characteristic, as you have found) would be to modify the implementation of the BLE stack on the device you are connecting with, removing or lengthening the idle timeout to a point that you find satisfactory.
Even that may not help you, as both sides of the BLE connection negotiate these values and iOS may impose a maximum below your requested threshold.
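
If the central side is also under your control (assumed here to be another iOS device), keeping such a subscription open is a one-liner once characteristics are discovered:

import CoreBluetooth

func peripheral(_ peripheral: CBPeripheral,
                didDiscoverCharacteristicsFor service: CBService,
                error: Error?) {
    for characteristic in service.characteristics ?? []
        where characteristic.properties.contains(.notify) {
        // An open subscription keeps the link from being treated as idle.
        peripheral.setNotifyValue(true, for: characteristic)
    }
}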
In my case the reason was a mismatch in characteristic properties: I wrote data to a characteristic with the "with response" option, but the characteristic only supported "without response" writes.
The symptom: the write callback in the delegate never fires, because the BLE peripheral never sends a response.
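
A central-side guard against that mismatch might look like this (a sketch; send(_:to:on:) is a hypothetical helper):

import CoreBluetooth

func send(_ data: Data, to characteristic: CBCharacteristic,
          on peripheral: CBPeripheral) {
    // Request a response only if the characteristic supports it; writing
    // .withResponse to a write-without-response characteristic is what
    // leaves the delegate callback waiting forever.
    let type: CBCharacteristicWriteType =
        characteristic.properties.contains(.write) ? .withResponse : .withoutResponse
    peripheral.writeValue(data, for: characteristic, type: type)
}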

Max achievable polling frequency using Bluetooth LE GATT profile?

I am trying to understand BLE and GATT in more depth. My interest is in the maximum achievable number of reads you would be able to make per second over the GATT profile.
I am aware of some of the post made on this topic before, for instance:
Bluetooth Low Energy - updating a characteristic value repeatedly
However, I am trying to explain these results by looking at the BLE specification.
What is the relationship between connection events and GATT? Does each ATT read/write require a new connection event? If not, is it possible to say anything about how many ATT read/writes can be made per connection event?
Say I want to poll a BLE-connected light sensor for a single-byte value: what is the maximum rate in Hz I could achieve? Would it always be best to set the minimum connection interval as low as possible?
Would I be able to achieve better results using "GATT server notifications"? The BLE spec (Core_v4.0) says that "The master initiates the beginning of each connection event". Then how are GATT server notifications implemented? I would think they would require the server to initiate a connection event.
Finally, if anybody knows about any specific iOS-imposed limitations on the throughput I could achieve when polling a sensor intensively, I would love to hear about them.
I can answer a portion of those questions...
What is the relationship between connection events and GATT?
They're different levels of the protocol. You handle connections and connection events via HCI. GATT is something you use after you've connected.
Does each ATT read/write require a new connection event?
No. Once you're connected, you can do multiple reads/writes or other GATT commands.
If not, is it possible to say anything about how many ATT read/writes can be made per connection event?
I think the best method is to actually benchmark the speed yourself. However, the whole point of BLE is a reduction in power usage at the expense of speed; if you're concerned about speed, you probably shouldn't be using BLE. The whole point of notifications/indications is that you don't have to poll an attribute but only get a message when a certain event has occurred.
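
One crude way to run that benchmark with Core Bluetooth (a sketch; assumes an already-connected CBPeripheral whose delegate this is, and a readable characteristic): chain each read off the previous completion and count completions per second.

import CoreBluetooth

class ReadBenchmark: NSObject, CBPeripheralDelegate {
    var reads = 0
    var start = Date()

    func begin(_ characteristic: CBCharacteristic, on peripheral: CBPeripheral) {
        reads = 0
        start = Date()
        peripheral.readValue(for: characteristic)  // kick off the first read
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        reads += 1
        let elapsed = Date().timeIntervalSince(start)
        if elapsed >= 10 {
            print("~\(Double(reads) / elapsed) reads/s")  // achieved polling rate
        } else {
            peripheral.readValue(for: characteristic)     // chain the next read
        }
    }
}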
Say I want to poll a BLE connected light sensor for a single byte value, what would be the max Hz I could achieve? Would it always be best to set the minimum connection interval as low as possible?
See the two answers above.
Then how are GATT server notifications implemented?
Once you've established a GATT connection, there is two-way communication between the master and the slave device; either device can send events to the other. To use notifications, you set a bit on a particular attribute to say you want notifications for that information. Then, depending on how that notification works, you'll get events sent back to you whenever there's something to report. At the link layer the slave doesn't need to initiate anything: the master opens a connection event every connection interval anyway, and the slave simply sends any queued notification data during one of those events. I have a feeling that a lightbulb wouldn't have any sort of notification unless there's some interface on it besides the BLE connection. A typical application would be something like a thermometer, which would send a notification every time the temperature changed by 1 degree.
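
On the server side (sketched here with Core Bluetooth, where the stack handles the link-layer scheduling; the thermometer-style names are assumptions), pushing a notification after the client has set that bit is a single call:

import CoreBluetooth

// Assumes `manager` is a powered-on CBPeripheralManager and `temperature`
// is a CBMutableCharacteristic created with the .notify property.
func report(_ reading: UInt8, manager: CBPeripheralManager,
            temperature: CBMutableCharacteristic) {
    let queued = manager.updateValue(Data([reading]),
                                     for: temperature,
                                     onSubscribedCentrals: nil) // nil = all subscribers
    if !queued {
        // Transmit queue is full; retry from
        // peripheralManagerIsReady(toUpdateSubscribers:).
    }
}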
Conclusion:
If you're polling attributes, you're doing it wrong. It's possible that you have to do it "wrong" because the device didn't properly implement notifications in the way you need and you can't modify the device, but polling will ramp up battery usage significantly and you'll have lost the benefit of using BLE.
