I have a quick question regarding BLE packets with iOS.
Say I have an iPhone app that sends out BLE advertising packets once a second in the background. This same app is also configured to receive advertising packets in the background. Say there are 30 other devices running this same app within Bluetooth range of the original device. Is there a theoretical maximum to the number of packets any one device can receive within a given time interval? Could the app receive and process all the advertising packets from the 30 phones, or even 100 phones? Thanks for your help!
It depends on the peripherals' advertising speed/interval. If you have 20 BLE devices around an iOS device and start scanning, you will first discover the nearest devices and those whose advertising interval is fastest.
Well, BLE advertisements are transmitted at 1 Mbit/s, so each bit takes one microsecond on air.
In an advertising packet, the overhead on top of the advertising data is the 6-byte device address plus another 10 bytes for the preamble, access address, PDU header and CRC.
If two packets are transmitted by two devices at the same time, they will interfere, and possibly neither packet can be detected at all.
With this information you should be able to do the rest of the math yourself.
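As a rough back-of-the-envelope sketch of that calculation (the 31-byte payload is the legacy advertising maximum; the rest simply restates the figures above):

// Airtime of one advertising packet at 1 Mbit/s (1 microsecond per bit).
let advDataBytes = 31                 // maximum legacy advertising payload
let overheadBytes = 6 + 10            // device address + preamble/access address/header/CRC
let airtimeSeconds = Double(advDataBytes + overheadBytes) * 8 * 1e-6   // ~0.000376 s

// Idealised upper bound on packets per second on one advertising channel,
// ignoring collisions and iOS background scan duty cycling.
let maxPacketsPerSecond = 1.0 / airtimeSeconds                         // ~2660

So 30 (or even 100) phones advertising once per second occupy only a tiny fraction of the available airtime; collisions and the scanner's duty cycle, not raw radio capacity, are the practical limits.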
I have a custom BLE peripheral that can send a notification data packet to a central. The device sends packets of 234 bytes at a time, and the central is expected to register for notifications of characteristic updates on the device. The peripheral sends 234 bytes of data to the central, but the iOS device receives only 182 bytes in the didUpdateValueForCharacteristic callback.
On Android, the central software just works with no problems: the phone (central) receives all 234 bytes in a single notify event. The issue only appears with iOS devices.
Is there any configuration required for the iOS device to receive the full length of data from the BLE peripheral? Any help would be appreciated!
iOS devices have a maximum ATT_MTU of 185 bytes, which means you can send a maximum of 182 data bytes per notification (the other 3 bytes are the ATT header: a 1-byte opcode plus a 2-byte attribute handle). Originally, iOS devices only supported 158 bytes; this was later increased to 185.
The way ATT_MTU works is that there is a negotiation upon connection: the central sends its maximum ATT_MTU (for iPhones, 185) and the peripheral replies with its own ATT_MTU (in your case, 237), and the connection's ATT_MTU becomes the minimum of the two (i.e. 185). So to answer your question: no, there isn't a way to configure your iOS device to receive the full length of data, because this is a low-level setting that Apple doesn't expose.
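For what it's worth, here is a rough central-side sketch under the assumptions above (the 234-byte record size comes from your setup; the helper names are purely illustrative). The app can inspect the negotiated payload size but not change it, so the usual workaround is to have the peripheral firmware split the record into MTU-sized notifications and reassemble it in the app:

import CoreBluetooth

// The negotiated payload size can be inspected (but not changed) from the app:
// maximumWriteValueLength(for: .withoutResponse) is roughly ATT_MTU - 3.
func logNegotiatedPayloadSize(of peripheral: CBPeripheral) {
    print("Max payload per packet:", peripheral.maximumWriteValueLength(for: .withoutResponse))
}

// If the peripheral firmware splits its 234-byte record across notifications,
// the app can reassemble it. Call append(_:) from peripheral(_:didUpdateValueFor:error:).
final class NotificationReassembler {
    private var buffer = Data()
    private let expectedLength = 234          // total record size sent by the peripheral

    func append(_ chunk: Data) -> Data? {
        buffer.append(chunk)
        guard buffer.count >= expectedLength else { return nil }
        let record = Data(buffer.prefix(expectedLength))
        buffer.removeFirst(record.count)
        return record
    }
}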
Have a look at the following links for more information:-
iOS BLE Get Negotiate MTU
Maximizing BLE Throughput Part 2: Use Larger ATT_MTU
A Practical Guide to BLE Throughput
My Objective-C Core Bluetooth central, running on an iPhone, receives a "stream" of notifications* from my virtual BLE peripheral running on an iPad. But each notification is only 16 bytes.
* Not a byte stream, but a "stream" of continuous, contiguous sets of bytes, where each set (i.e. a packet) contains a single notification.
How can I increase each BLE notification size?
A larger size would reduce my back-end processing at the central.
As Paulw11 mentioned, the peripheral decides how much data it sends. However, if your objective is to reduce the back-end processing at the central by increasing the throughput, then there are several ways to achieve this, namely: decreasing the connection interval, increasing the ATT_MTU, using Data Length Extension, and (if both devices support Bluetooth 5 features) using the 2M PHY. The links below have more information on this (a peripheral-side sketch follows them):-
Maximising BLE Throughput: Everything You Need to Know
A Practical Guide to BLE Throughput
BLE 5.0 Max Byte Size
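Because the peripheral in your case is an iPad, one thing you control directly is how much data goes into each notification. A minimal sketch, assuming your CBPeripheralManagerDelegate already has the characteristic and payload at hand (their names here are placeholders): size each update to the subscribed central's limit instead of a fixed 16 bytes.

import CoreBluetooth

func sendPayload(_ payload: Data,
                 via manager: CBPeripheralManager,
                 characteristic: CBMutableCharacteristic,
                 to central: CBCentral) {
    let chunkSize = central.maximumUpdateValueLength   // largest value this central accepts per update
    var offset = 0
    while offset < payload.count {
        let end = min(offset + chunkSize, payload.count)
        let chunk = payload.subdata(in: offset..<end)
        // updateValue returns false when the transmit queue is full; a real app would
        // resume from peripheralManagerIsReady(toUpdateSubscribers:) instead of stopping.
        guard manager.updateValue(chunk, for: characteristic,
                                  onSubscribedCentrals: [central]) else { break }
        offset = end
    }
}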
After spending hours searching the web, I cannot find any documentation about the background BLE scanning rules used by iOS.
As it is not possible to set the scan window on iOS, I am looking for the rules Apple applies when iOS is scanning in the background.
Context
I am working on a wearable peripheral which can sometimes become disconnected when it is out of range of the phone. The goal is to reconnect quickly (in less than 5 s) when the peripheral is close enough to the phone again. The peripheral has battery constraints, so I cannot advertise every 20 ms forever; I am therefore looking for a clever way to reconnect my peripheral to the phone.
If I knew how the background scanning mode works, I would be able to define a smart advertising interval in order to save battery.
Use case
If my peripheral advertises every 1285 ms, how long will it take to be discovered by my iOS application running in background mode for 10 minutes?
Not sure exactly what your question is.
I suppose you have read Apple's "Bluetooth Accessory Design Guidelines for Apple Products"?
https://developer.apple.com/hardwaredrivers/BluetoothDesignGuidelines.pdf
In it, they state:
3.5 Advertising Interval
The advertising interval of the accessory should be carefully considered, because it affects the time to discovery and connect performance. For a battery-powered accessory, its battery resources should also be considered.
To be discovered by the Apple product, the accessory should first use the recommended advertising interval of 20 ms for at least 30 seconds. If it is not discovered within the initial 30 seconds, Apple recommends using one of the following longer intervals to increase chances of discovery by the Apple product:
152.5 ms
211.25 ms
318.75 ms
417.5 ms
546.25 ms
760 ms
852.5 ms
1022.5 ms
1285 ms
Note: Longer advertising intervals usually result in longer discovery and connect times.
Upon discovering the BLE device, iOS will notify apps that are looking for it (based on the advertised service UUID), which will then be able to connect to it.
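For the reconnection use case, the app side of this typically looks something like the following sketch (the service UUID is a placeholder; background scanning also requires the bluetooth-central background mode to be declared in Info.plist):

import CoreBluetooth

final class ReconnectManager: NSObject, CBCentralManagerDelegate {
    private let serviceUUID = CBUUID(string: "FFF0")   // placeholder: your accessory's service UUID
    private var central: CBCentralManager!
    private var wearable: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // In the background, scans must be filtered by service UUID and are duty-cycled by iOS.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        wearable = peripheral                          // keep a strong reference before connecting
        central.connect(peripheral, options: nil)
    }
}

An alternative that avoids scanning altogether is to call connect(_:options:) on an already-known peripheral up front; that request does not time out and completes whenever the device comes back into range.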
Apple recommend a 100 ms interval for iBeacons.
This (pretty old, from 2012) discussion states that:
the median discovery time when the phone is in standby is about 60 times the advertising interval. The 95-percentile discovery time when the phone is in standby mode is about 300 times the advertising interval
This (slightly more recent, but from Dec 2013) answer states that:
While scanning in the foreground will likely immediately discover a device advertising next to it, discovery in the background can take up to ~60 times longer.
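Applying those rules of thumb to the 1285 ms interval from the question gives a rough order-of-magnitude estimate (nothing more):

// Back-of-the-envelope estimate using the multipliers quoted above.
let advertisingInterval = 1.285                     // seconds
let medianDiscovery = 60 * advertisingInterval      // ~77 s (median, phone in standby)
let p95Discovery = 300 * advertisingInterval        // ~386 s (95th percentile)

A sub-5-second reconnection target therefore looks unrealistic at 1285 ms alone; a temporary burst at the recommended 20 ms interval is the usual compromise.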
There's a problem when the (average) advertising interval is close to an integer multiple of the scan interval: the discovery time can then grow without bound, i.e. the scanner may never see the advertisement at all, because the advertising event always falls outside the scan window.
The ADV-interval list from Apple's design guideline probably shows optimum values, but it does not tell you how to determine discovery times. That's a mess!
I'd even go further and say: if the smartphone manufacturer (Apple or any other) does not exactly specify the scan parameters (interval, window and possibly filter settings) for each power mode, then you're lost and cannot correctly estimate discovery performance.
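To make the aliasing effect concrete, here is a toy model with made-up numbers (the scan interval, window and phase below are purely illustrative; Apple's real background scan parameters are not documented, which is exactly the problem):

// Toy model: an advertiser transmits every advInterval seconds starting at advPhase;
// the scanner listens for scanWindow seconds at the start of every scanInterval.
func timeToDiscovery(advInterval: Double, scanInterval: Double,
                     scanWindow: Double, advPhase: Double,
                     limit: Double = 600) -> Double? {
    var t = advPhase
    while t < limit {
        let offsetInScanCycle = t.truncatingRemainder(dividingBy: scanInterval)
        if offsetInScanCycle < scanWindow { return t }   // advertisement lands inside a scan window
        t += advInterval
    }
    return nil                                           // never discovered within `limit` seconds
}

// Advertising interval an exact multiple of the scan interval: an unlucky phase is never seen.
print(timeToDiscovery(advInterval: 1.280, scanInterval: 0.640,
                      scanWindow: 0.030, advPhase: 0.100) as Any)   // nil
// A slightly different interval drifts through the window and is eventually discovered.
print(timeToDiscovery(advInterval: 1.285, scanInterval: 0.640,
                      scanWindow: 0.030, advPhase: 0.100) as Any)   // some finite time

(In practice the BLE specification adds a random 0–10 ms advDelay to every advertising event precisely to break up this kind of lock-step, but intervals near a multiple of the scan interval can still hurt discovery times.)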
I want to know, in regard to iOS, what the typical advertising rate is for peripherals running in the background. I know that it will vary depending on what you are doing in the foreground, but I do not know to what degree. I have read it changes only by tens of ms, but I have also read it can change by up to 4 seconds between advertisement packets. I can fit all of the data I need into one packet. I would like to know, between advertising this one packet in the background and nominal iPhone usage, how likely it is that my advertising rate will still be in the millisecond range.
Also, I have read that I can change this interval programmatically, but the examples only last for 30 seconds (I'm guessing due to the limited discoverability mode?). Is there a way I could keep re-triggering this advertising boost? Or maybe I could start as a central scanning in the background instead, and when I scan a certain packet I could switch to peripheral mode (since you can only do one or the other in the background), so my advertising boost could happen then. Then, when that thirty-second boost is over, I could switch back to central mode until I scan that certain packet again, right? This way I would only have a small window of poor advertising rate every 30 seconds. Is that feasible?
We are developing a BLE sensor peripheral to work with an iPad that requires the following throughput of data on a BLE notification characteristic (no acknowledgement), using a TI CC2541 BLE module and a custom profile:
One 20-byte packet (the standard GATT maximum) every 10 ms. Since we appear to have a limit of 4 packets per connection interval, this equates to one connection interval every 40 ms. The throughput required is 2,000 bytes per second; the TI website recommends the CC2541 BLE solution for sensor devices requiring this level of data throughput.
The profile for the BLE module is set with minimum and maximum connection intervals of 20 ms and 40 ms respectively, which should suffice. The "Bluetooth Accessory Design Guidelines for Apple Products" document suggests that the minimum and maximum connection intervals we set, as above, are correct. We are using the latest iPad and Apple tools for iOS 6 on a new Mac mini / MacBook.
With a simple test program on the iPad, we can get the link to work well sending 20-byte packets to the BLE peripheral at intervals of 20 ms. However, once we lower this to the required 10 ms, we start losing packets or getting corrupt packets. We have the FIFO-empty interrupt turned off so we can feed the BLE module's FIFO more quickly, and we are using the maximum baud rate of 230400 to send the 20-byte packets from the micro to the BLE TX FIFO.
We realise we are at the top end of the BLE transfer limit and of what is possible. Can anyone advise whether there is a solution for achieving 2,000 bytes per second throughput using the TI CC2541 BLE chip/module with an up-to-date iPad?
We use the TI CC2540 (BLE stack version 1.3.2) successfully with iPad/iPod/iPhone (iOS 6.x and 7.x). We currently send 75 notifications of 20 bytes per second => 1,500 bytes/second. But I have tried sending 125 notifications per second and that worked as well.
Of course, the more you send, the greater the likelihood of losing data, e.g. there is less time to resend a NACK'ed message.
I have experienced that iOS's BLE stack may enter a mode where it begins to NACK messages continuously. If this happens you will lose a lot of messages. I have reported a bug to Apple about this. (The problem seems to have been fixed in iOS 7.1 beta 3/4.)
I currently have:
// Minimum connection interval (units of 1.25ms, 80=100ms) if automatic parameter update request is enabled
#define DEFAULT_DESIRED_MIN_CONN_INTERVAL 10
// Maximum connection interval (units of 1.25ms, 800=1000ms) if automatic parameter update request is enabled
#define DEFAULT_DESIRED_MAX_CONN_INTERVAL 20
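// i.e. the peripheral requests a connection interval between 12.5 ms and 25 ms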
Yes, it doesn't conform with Apple's guidelines. But I believe they can be relaxed in our case.
UPDATE: I have also tried to use an iDevice as peripheral, i.e., BTLE between two iDevices. Here I have sent 150 messages per second without any problems.
Are you sending "write without response" commands? You can send 4 packets per connection event this way. Using your previous 20 ms connection interval, you would be sending 4 packets of 20 bytes every 0.02 seconds. Putting that together: 4 × 20 / 0.02 = 4,000 bytes per second, easily.
I highly doubt you are getting corrupt data. The link layer adds a CRC and 2 bits of "next expected" sequence information to BLE packets to ensure that a) all the bits are received correctly and b) packets were not sent out of order. The TI stack and iOS control the link layer, so I doubt you've botched that.
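As a rough sketch of what that looks like from the iPad side (purely illustrative; the peripheral and characteristic are assumed to come from your own CoreBluetooth delegate callbacks):

import CoreBluetooth

// Stream data chunks using "write without response".
func streamChunks(_ chunks: [Data],
                  to peripheral: CBPeripheral,
                  characteristic: CBCharacteristic) {
    for chunk in chunks {
        // On iOS 11+ this flag tells you whether the outgoing queue has room; a real app
        // would pause and resume from peripheralIsReady(toSendWriteWithoutResponse:)
        // rather than dropping the remaining data here.
        guard peripheral.canSendWriteWithoutResponse else { break }
        peripheral.writeValue(chunk, for: characteristic, type: .withoutResponse)
    }
}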
Here are a few observations on throughput that we found during our R&D on iPhone with BLE. The data below are based on write with response.
iPhone 8 (BLE 5.0) as central, Linux desktop (Ubuntu 16.04 with a BLE 4.0 dongle) as peripheral: MTU = 2048, throughput ≈ 2.5 KB/s.
iPhone 8 (BLE 5.0) as central, Android device with BLE 4.2 (Xiaomi Mi A1) as peripheral: MTU = 180, throughput ≈ 2.5 KB/s.
iPhone 8 (BLE 5.0) as central, iPhone 7 Plus (BLE 4.2) as peripheral: MTU = 512, throughput ≈ 7.1 KB/s.
iPhone 8 (BLE 5.0) as central, Samsung S8 (BLE 5.0) as peripheral: the Samsung S8 failed to work as a peripheral.
iPhone 8 (BLE 5.0) as central, iPhone 8 Plus (BLE 5.0) as peripheral: MTU = 512, throughput ≈ 15.5 KB/s.
As you can see, higher MTU values generally give higher throughput, but the MTU cannot be increased arbitrarily. The values above are the default maximum allowed MTU for each configuration. (MTU = Maximum Transmission Unit, i.e. the maximum number of bytes that can be sent in one write request.)
Comments on the above data are welcome.
iOS 7 seems to have made some optimizations regarding the throughput levels for BLE transfers. Try it again on an iOS 7 device.
You don't really pose a question, but I can verify that your desired limit of 2000 bytes/sec is possible.
Check out the selected answer on this forum post (http://e2e.ti.com/support/wireless_connectivity/f/538/p/353327/1244676.aspx#1244676) to see how we made it work.