UDPClient not receiving packets on HTC G2 - xamarin.android

I'm attempting to receive a UDP broadcast under Mono for Android, and I am seeing no data coming in. This is somewhat perplexing because it works fine on the Galaxy Tab 7 and Galaxy Tab 10 (Android v3.2) I have, but fails on an HTC G2 (Android v2.3.4).
The code is straightforward:
public void BeginDiscover()
{
    var packet = new DiscoverPacket();

    lock (m_syncRoot)
    {
        var localEndpoint = new IPEndPoint(m_local, 0);
        using (var udp = new UdpClient(localEndpoint))
        {
            var remoteEndpoint = new IPEndPoint(IPAddress.Broadcast, DiscoverPort);
            udp.Send(packet.Data, packet.Data.Length, remoteEndpoint);
            Thread.Sleep(100);
        }
    }
}
I have verified that the manifest includes this line:
<uses-permission android:name="android.permission.INTERNET" />
Though this is happening in a Debug build, where that permission should be set implicitly anyway.
Other very strange observations:
Again, this is working just fine on another type of device
The handler listening for UDP broadcasts (which is listening for the response) does see this outbound packet. The code for this listener is also straightforward:
private void Start()
{
    m_discoverListener = new UdpClient(DiscoverPort);
    m_discoverListener.BeginReceive(DiscoverCallback, m_discoverListener);
}

private void DiscoverCallback(IAsyncResult result)
{
    try
    {
        var ep = new IPEndPoint(IPAddress.Any, DiscoverPort);
        var data = m_discoverListener.EndReceive(result, ref ep);

        // filter out what we send
        var add = AddressWithoutPort(ep.Address);
        if (add == m_local.ToString()) return;

        // parse discover response
        // [clipped for clarity]
    }
    finally
    {
        m_discoverListener.BeginReceive(DiscoverCallback, m_discoverListener);
    }
}
Wireshark running on a separate PC on the same network does see the discover request packet (from above)
The "discovered" device is also seeing it, because Wireshark is also seeing the reply
The Android device UDP listener is not receiving the response packet
The only major difference between the devices that I can think of (other than different OEMs implementing the platform) is that the G2 has a built-in cellular radio and the Galaxy Tab does not. In my specific test case, I have no SIM card in the phone, though, so no cellular connection is being made. Note that the code above is explicitly using the local endpoint that is on the WiFi network.
Is there a known issue with UDP on the G2 specifically or generally on older implementations of the Android platform?

It took a bit of work as the UDP response in question is coming from a microcontroller on the device and I wanted to make absolutely certain that it wasn't an issue on the micro end (though I suspected it wasn't). I created a PC-based simulator for the microcontroller device that handles my Android UDP request and that sends back the exact same UDP response that the microcontroller does, then verified all of the traffic looks fine with Wireshark.
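For reference, the simulator boils down to a tiny UDP responder. Here is a minimal sketch of the idea (the port number and reply payload are placeholders, not the real protocol):
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class DiscoverSimulator
{
    const int DiscoverPort = 5000; // placeholder: use the real discover port

    static void Main()
    {
        using (var udp = new UdpClient(DiscoverPort))
        {
            udp.EnableBroadcast = true;
            Console.WriteLine("Simulator listening on UDP {0}...", DiscoverPort);
            while (true)
            {
                var remote = new IPEndPoint(IPAddress.Any, 0);
                var request = udp.Receive(ref remote); // blocks until a discover arrives
                Console.WriteLine("Discover from {0} ({1} bytes)", remote, request.Length);

                // Reply with the same bytes the real microcontroller would send.
                var reply = Encoding.ASCII.GetBytes("DISCOVER-RESPONSE"); // placeholder payload
                udp.Send(reply, reply.Length, remote);
            }
        }
    }
}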
The net result is that I see the exact same behavior with the simulator. The Galaxy Tab 7 and 10 devices receive the UDP response no problem. The HTC G2 never does. This leads me to conclude that one of the following is true:
a) The HTC G2 specifically has an implementation bug preventing it from receiving (or at least passing along) UDP broadcasts on the network
or
b) The older Android build has this bug.
Until I find different hardware with the same Android version as the G2 (v2.3), I can't tell which is the case. In either event, it's a bug that makes this (and potentially other) hardware unusable for my specific solution.

I have a couple of applications on the market based on UDP communication.
I have problems with HTC phones not receiving UDP broadcast packets sent from another device; if they are sent from the same device, the packets arrive.
So I think the problem is in HTC, and I found a possible solution online (even though I have not tried it):
http://www.flattermann.net/2010/09/fix-udp-broadcasts-on-htc-phones-running-stock-firmware/
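A related app-side workaround often suggested for this HTC behavior is to hold a Wi-Fi multicast lock while listening. A minimal Xamarin.Android sketch (my own, not from the linked post; it assumes CHANGE_WIFI_MULTICAST_STATE is added to the manifest alongside INTERNET):
using Android.App;
using Android.Content;
using Android.Net.Wifi;

// Some Wi-Fi stacks (HTC in particular) drop multicast/broadcast frames
// unless an app holds a MulticastLock.
var wifi = (WifiManager)Application.Context.GetSystemService(Context.WifiService);
var multicastLock = wifi.CreateMulticastLock("discovery");
try
{
    multicastLock.Acquire();   // ask the Wi-Fi stack to deliver broadcast frames
    BeginDiscover();           // send the discover packet; keep the lock held
    // ... wait here for responses while the lock is held ...
}
finally
{
    multicastLock.Release();
}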

ESP8266 5v Relay USB Disconnection issue

Issue
- When using the ESP8266 wired up in this way, it will randomly disconnect the USB interface when it powers the relay. It may then re-connect, but this is sporadic.
- The code can be viewed below; essentially, the relay is powered for 300 ms, then the loop waits 10 seconds.
Wiring Diagram https://i.stack.imgur.com/4mycx.png
Tests:
I have swapped out the relay, pump, and ESP8266, as well as re-wiring the circuit multiple times to check for a short. I also have an integer incrementing every loop cycle; when the ESP8266 is able to re-connect it prints this variable, which shows the board is not crashing:
Serial output
https://i.stack.imgur.com/ziM8g.png
I then modified the circuit so the 5 V power was not in parallel but came from two different power sources, one for the ESP8266 and one for the pump circuit; however, the same issue was observed:
Test Wiring Diagram https://i.stack.imgur.com/7S0aP.png
Question:
Why does the USB disconnect when sending the control signal to the relay?
Is there a way to mitigate this?
Code:
int relayInput = 5; // the input to the relay pin
int debug_test = 0;

void setup() {
  // put your setup code here, to run once:
  Serial.begin(115200);
  pinMode(relayInput, OUTPUT); // initialize pin as OUTPUT
}

void loop() {
  // put your main code here, to run repeatedly:
  debug_test++;
  Serial.println(debug_test);
  digitalWrite(relayInput, HIGH); // turn relay on
  Serial.println("Water on!");
  delay(300);
  digitalWrite(relayInput, LOW); // turn relay off
  Serial.println("Water off!");
  Serial.println("Waiting 10 seconds");
  delay(10000);
}
Parts:
Pump - https://www.ebay.co.uk/itm/Mini-Water-Pump-DC-3V-4-5V-Fish-Tank-Fountain-Aquarium-Submersible-White-Parts/174211676084?hash=item288fd337b4:g:128AAOSwfQteYWF3
ESP8266 - https://www.amazon.co.uk/gp/product/B07F5FJSYZ/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1
Relay - https://www.amazon.co.uk/gp/product/B07BVXT1ZK/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1
OK, so researching into this, it seems that when the pump is on it pulls more current than the PC's USB port can provide.
This will normally be connected to an external power source, which should supply enough current; however, I also wanted the flexibility to connect it to a PC with a serial connection to troubleshoot.
So in the end, something like this:
https://i.stack.imgur.com/MKD1h.png
You are driving a 5 V relay module with a 3.3 V output. This works for some people, but it depends on the relay module and the board, so this might be the problem. Alternatively, the relay may draw more than 12 mA, which is the maximum current the ESP8266's GPIO can deliver.
So I suggest you use an external power source for the relay and control it through the pin (D1 in your case).
Or just use a generic 5 V relay with an external 5 V power source and control it using a transistor; here is a circuit.
Additional information: https://electronics.stackexchange.com/questions/213051/how-do-i-use-a-5v-relay-with-a-3-3v-arduino-pro-mini?

Delphi - Is it possible to detect if the Screen monitor is ON or OFF by software? [duplicate]

Does anyone know if there is an API to get the current monitor state (on or off) in Windows (XP/Vista/2000/2003)?
All of my searches seem to indicate there is no real way of doing this.
This thread tries to use GetDevicePowerState which according to Microsoft's docs does not work for display devices.
In Vista I can listen to GUID_MONITOR_POWER_ON but I do not seem to get events when the monitor is turned off manually.
In XP I can hook into WM_SYSCOMMAND SC_MONITORPOWER, looking for status 2. This only works for situations where the system triggers the power off.
The WMI Win32_DesktopMonitor class does not seem to help out as well.
Edit: Here is a discussion on comp.os.ms-windows.programmer.win32 indicating there is no reliable way of doing this.
Anyone else have any other ideas?
GetDevicePowerState sometimes works for monitors. If it's present, you can open the \\.\LCD device. Close it immediately after you've finished with it.
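For completeness, here is a minimal sketch of that \\.\LCD probe (C# P/Invoke rather than Delphi, but the same two Win32 calls translate directly; since GetDevicePowerState is documented not to work for display adapters, treat any failure as "unknown" rather than "off"):
using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class LcdPowerProbe
{
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    static extern SafeFileHandle CreateFile(string name, uint access, uint share,
        IntPtr security, uint creation, uint flags, IntPtr template);

    [DllImport("kernel32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    static extern bool GetDevicePowerState(SafeFileHandle hDevice, out bool on);

    const uint GENERIC_READ = 0x80000000;
    const uint FILE_SHARE_READ_WRITE = 0x3;
    const uint OPEN_EXISTING = 3;

    // Returns true/false if the query succeeded, null if \\.\LCD is absent
    // or the query failed.
    public static bool? IsLcdOn()
    {
        using (var h = CreateFile(@"\\.\LCD", GENERIC_READ, FILE_SHARE_READ_WRITE,
                                  IntPtr.Zero, OPEN_EXISTING, 0, IntPtr.Zero))
        {
            if (h.IsInvalid) return null; // no \\.\LCD device on this machine
            bool on;
            return GetDevicePowerState(h, out on) ? on : (bool?)null;
        }
    }
}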
Essentially, you're out of luck—there is no reliable way to detect the monitor power state, short of writing a device driver and filtering all of the power IRPs up and down the display driver chain. And that's not very reliable either.
You could hook up a webcam, point it at your screen and do some analysis on the images you receive ;)
Before doing anything based on the monitor state, just remember that users can use a machine through remote desktop or other systems that don't require a monitor connected to the machine, so don't turn off any visualization based on the monitor state.
You can't.
It looks like all monitor power capabilities are tied to "power save mode".
After searching, I found code here that connects the SC_MONITORPOWER message to system values (post number 2).
I used the code to test whether the system values change when I manually switch off the monitor.
int main()
{
    for (; monitorOff() != 1; )
        Sleep(500);
    return 0;
}
The code never stops, no matter how long I leave the monitor switched off.
Here is the code of the monitorOff function:
int monitorOff()
{
    const GUID MonitorClassGuid =
        {0x4d36e96e, 0xe325, 0x11ce,
         {0xbf, 0xc1, 0x08, 0x00, 0x2b, 0xe1, 0x03, 0x18}};

    list<DevData> monitors;
    ListDeviceClassData(&MonitorClassGuid, monitors);

    list<DevData>::iterator it = monitors.begin(),
                            it_end = monitors.end();
    for (; it != it_end; ++it)
    {
        //it->PowerData.PD_PowerStateMapping
        if (it->PowerData.PD_MostRecentPowerState != PowerDeviceD0)
        {
            return 1;
        }
    }
    return 0;
}
Conclusion: when you manually switch off the monitor, you can't catch it through Windows (unless there is some unusual driver interface for it), because all of Windows' capabilities are tied to "power save mode".
In Windows XP or later you can use the IMSVidDevice Interface.
See
http://msdn.microsoft.com/en-us/library/dd376775(VS.85).aspx
(not sure if this works in Server 2003)
With Delphi code, you can detect invalid monitor geometry while standby is in progress. For example (using ShowMessage to display the bounds):
i := 0;
ShowMessage('Monitor' + IntToStr(i) + ': ' +
  IntToStr(Screen.Monitors[i].BoundsRect.Left) + ', ' +
  IntToStr(Screen.Monitors[i].BoundsRect.Top) + ', ' +
  IntToStr(Screen.Monitors[i].BoundsRect.Right) + ', ' +
  IntToStr(Screen.Monitors[i].BoundsRect.Bottom));
Results:
Monitor geometry before standby:
Monitor0: 0, 0, 1600, 900
Monitor geometry while standby in Delphi 7:
Monitor0: 1637792, 4210405, 31266576, 1637696
Monitor geometry while standby in Delphi XE:
Monitor0: 4211194, 40, 1637668, 1637693
This is a really old post, but if it can help someone: I have found a solution to detect whether a screen is available or not, the Connecting and Configuring Displays (CCD) API of Windows.
It's part of User32.dll, and the interesting functions are GetDisplayConfigBufferSizes and QueryDisplayConfig. They give us all the information that can be viewed in the Configuration Panel of Windows.
In particular, each PathInfo contains a TargetInfo property that has a targetAvailable flag. This flag seems to be correctly updated on all the configurations I have tried so far.
This allows you to know the state of every screen connected to the PC and to set their configurations.
Here is a CCD wrapper for .NET
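As a starting point, here is a minimal C# sketch of the first of those two calls (my own, not the linked wrapper); a full implementation must also marshal DISPLAYCONFIG_PATH_INFO and its nested structs for QueryDisplayConfig in order to reach the targetAvailable flag:
using System.Runtime.InteropServices;

static class Ccd
{
    const uint QDC_ONLY_ACTIVE_PATHS = 0x00000002;

    [DllImport("user32.dll")]
    static extern int GetDisplayConfigBufferSizes(uint flags,
        out uint numPathArrayElements, out uint numModeInfoArrayElements);

    // Returns how many active display paths Windows currently reports;
    // the path array itself comes from a subsequent QueryDisplayConfig call.
    public static uint ActivePathCount()
    {
        uint paths, modes;
        int err = GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, out paths, out modes);
        return err == 0 ? paths : 0; // 0 == ERROR_SUCCESS
    }
}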
If your monitor has some sort of built-in USB hub, you could try and use that to detect if the monitor is off/on.
This will of course only work if the USB hub doesn't stay connected when the monitor is considered "off".

Windows 10 IoT Raspberry Pi 2: DHT22/AM2302

I just wanted to start gaining experience with the DHT22/AM2302 (a temperature and humidity sensor), but I have no idea how to initialize it and read data from it. I tried to use GpioPin:
gpioController = GpioController.GetDefault();
if (gpioController == null)
{
    Debug.WriteLine("GpioController Initialization failed.");
    return;
}

sensorPin = gpioController.OpenPin(7); // the exception is thrown here
sensorPin.SetDriveMode(GpioPinDriveMode.Input);
Debug.WriteLine(sensorPin.Read());
but I get the exception: "A resource required for this operation is disabled."
After that I took a look at a library for Unix-like systems and found this:
https://github.com/technion/lol_dht22/blob/master/dht22.c
But I have no idea how to implement that in Visual C# on Windows 10. Does anyone have an idea or experience?
Thank you very much in advance!
UPDATE:
I got the hint that there is no GPIO pin 7, and this is true, so I re-tried it, but the GPIO output seems to be just HIGH or LOW. So I have to use I2C or SPI. Based on this project, I decided to try it out with SPI: http://microsoft.hackster.io/windowsiot/temperature-sensor-sample and I am making steps forward. The difficulty now is to translate the C library linked above to the C# SDK to receive the right data.
private async void InitSPI()
{
    try
    {
        var settings = new SpiConnectionSettings(SPI_CHIP_SELECT_LINE);
        settings.ClockFrequency = 500000;
        settings.Mode = SpiMode.Mode0;

        string spiAqs = SpiDevice.GetDeviceSelector(SPI_CONTROLLER_NAME);
        var deviceInfo = await DeviceInformation.FindAllAsync(spiAqs);
        SpiDisplay = await SpiDevice.FromIdAsync(deviceInfo[0].Id, settings);
    }
    catch (Exception ex)
    {
        Debug.WriteLine("SPI Initialization failed: " + ex.Message);
    }
}
To be clear, this does not work well: it works just once after booting the Raspberry Pi 2 and starting/remote-debugging the application, but after exiting and restarting the application, the SPI initialization fails.
I am now working on reading the data from the pin and will show some code in a future update. Any comments, answers and/or advice are still welcome.
The DHT22 requires very precise timing. Although the Raspberry Pi with Windows 10 IoT Core is extremely fast, it is an operating system where other things need to happen, so unless you write some sort of low-level driver (not C#), you won't be able to generate the timings necessary to communicate with a DHT22.
What I do is use a cheap Arduino Mini Pro (about $5) with the sole purpose of generating the correct timings toward the sensor, and then set up some sort of communication channel between the Arduino Mini Pro and the Raspberry Pi (I2C, serial) to pull the data from the Arduino.
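If you go that route, the Raspberry Pi side is straightforward in C#. A minimal Windows 10 IoT Core sketch (the 0x40 slave address and the 4-byte humidity/temperature wire format are my own assumptions; the Arduino sketch has to match them):
using System.Threading.Tasks;
using Windows.Devices.Enumeration;
using Windows.Devices.I2c;

static class DhtBridge
{
    public static async Task<I2cDevice> ConnectAsync()
    {
        var settings = new I2cConnectionSettings(0x40);   // assumed Arduino slave address
        string aqs = I2cDevice.GetDeviceSelector("I2C1"); // the Pi 2 exposes controller "I2C1"
        var devices = await DeviceInformation.FindAllAsync(aqs);
        return await I2cDevice.FromIdAsync(devices[0].Id, settings);
    }

    public static void ReadSample(I2cDevice arduino, out double humidity, out double temperature)
    {
        var raw = new byte[4];
        arduino.Read(raw); // assumed format: humidity*10 then temperature*10, big-endian
        humidity = ((raw[0] << 8) | raw[1]) / 10.0;
        temperature = ((raw[2] << 8) | raw[3]) / 10.0;
    }
}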

OSX Bluetooth LE Peripheral transfer rates are slow

Background Info:
I've implemented a Bluetooth LE Peripheral for OSX which exposes two characteristics (using CoreBluetooth). One is readable, and one is writable (both with indications on). I've implemented a Bluetooth LE Central on iOS which will read from the readable characteristic and write to the writable characteristic. I've set it up so that every time the characteristic value is read, the value is updated (in a way similar to this example). The transfer rates I get with this set up are pathetically slow (topping out at a measured sustained speed of roughly 340 bytes / second). This speed is the actual data, and not a measure including the packet details, ACKs and so on.
Problem:
This sustained speed is too slow. I've considered two solutions:
There's some parameter in CoreBluetooth that I've missed that will help me increase the speed.
I'll need to implement a custom Bluetooth LE service using the IOBluetooth classes instead of CoreBluetooth.
I believe I've exhausted option 1. I don't see any other parameters I can tweak. I'm limited to sending 20 bytes per message. Anything else and I get cryptic errors on the iOS device concerning Unknown Errors, Unlikely Errors, or the value being "Not Long". Since the demo project also indicates a 20 byte MTU, I'll accept that this likely isn't possible.
So I'm left with option 2. I'm trying to somehow modify the connection parameters for Bluetooth LE on OSX to hopefully allow me to increase the transfer speed (by setting the min and max conn intervals to be 20ms and 40ms respectively - as well as sending multiple BT packets per connection interval). It looks like providing my own SDP Service on IOBluetooth is the only way to achieve this on OSX. The problem with this is the documentation for how to do this is negligible to non-existent.
This tells me how to implement my own service (albeit using a deprecated API); however, it doesn't explain the required parameters for registering an SDP service. So I'm left wondering:
Where can I find the required parameters for this dictionary?
How do I define these parameters in a way to offer a Bluetooth LE service?
Is there any alternative to providing a Bluetooth LE Peripheral on OSX via another framework (Python library? Linux in a VM with access to the Bluetooth stack? I'd like to avoid this altogether.)
I decided my best course of action was to attempt to use Linux in a VM as there is more documentation available and access to the source code would hopefully guarantee that I could find a solution. For anyone who is also facing this problem, here's how you can issue a Connection Parameter Update Request on OS X (sort of).
Step 1
Install a Linux VM. I used Virtual Box with Linux Mint 15 (64-bit Cinnamon).
Step 2
Allow usage of the OS X Bluetooth device in your VM. Attempting to forward the Bluetooth USB Controller to your VM will give an error message. To allow this, you need to stop everything that is using the controller. On my machine, that included issuing the following commands from the command line:
sudo launchctl unload /System/Library/LaunchDaemons/com.apple.blued.plist
This will kill the OS X Bluetooth daemon. Attempting to kill blued from the Activity Monitor will just cause it to be automatically relaunched.
sudo kextunload -b com.apple.iokit.BroadcomBluetoothHostControllerUSBTransport
On my MacBook, I've got a Broadcom controller, and this is the kernel module that OS X uses for it. Don't worry about issuing these commands; to undo the changes, you can power down and reboot your machine (note: in some cases when the BT controller got into a bad state, I had to leave the machine powered down for ~10 seconds before rebooting to clear volatile memory).
If after running these two commands you still can't mount the BT controller, you can run kextstat | grep Bluetooth and see other Bluetooth related kernel modules and then try to unload them as well. I've got ones named IOBluetoothFamily and IOBluetoothSerialManager that don't need to be unloaded.
Step 3
Launch your VM and get your Linux BT stack. I checked out the bluez Git repo from here. I specifically grabbed the 5.14 release tag using git checkout tags/5.14 just to be sure it was at least a tagged version and less likely to be broken. 5.14 is the newest tag as of writing this answer.
Step 4
Build bluez. This was done using bootstrap, then configure, then make and make install. I used the --prefix=/opt/bluez flag on configure to prevent overwriting the install bluetooth stack. Also, I used the --enable-maintainer-mode configure flag for the reason stated in the next step. You also might need to use --disable-systemd to get it to configure. Bluez has a bunch of tools and utilities you can use for various things. In order to use the built Bluetooth daemon, you need to stop the system daemon using sudo service bluetooth stop. You can then launch the built one using sudo /opt/bluez/libexec/bluetooth/bluetoothd -n -d (this launches in non-daemon mode with debug output).
Step 5
Get your LE service running via bluez. You can look at bluez/plugins/gatt-example.c for how to do this. I directly modified this by removing the unnecessary code and using the battery service code as a template for my own service and characteristics. You need to recompile bluez to have this code added to the bluetooth daemon. One thing to note (which caused me a day or two of trouble getting this working) was that iOS caches the GATT service listing, and this is not read/refreshed on each connection. If you add a service or characteristic or change a UUID, you'll need to disable Bluetooth on your iOS device and then re-enable it. This is undocumented in Apple's docs, and there is no programmatic way to do it.
Step 6
Unfortunately, this is where things get tricky. Bluez doesn't have support built-in for issuing the Connection Parameters Update Request using any of its utilities. I had to write it myself. I'm currently seeing if they want my code to be included in the bluez stack. I can't post the code currently as I'd need to first see if the bluez devs are interested in the code and then get approval from my workplace to give the code. However, I can currently explain what I did to enable support.
Step 7
Prime yourself on the Bluetooth Standard. Any version 4.0 or greater will have the details you need. Read the following sections.
See Vol. 2, Part E, 4.1 for Host to Controller HCI flow.
See Vol. 2, Part E, 5.4.2 for HCI ACL Data Packet format.
See Vol. 3, Part A, 4 for Signalling Packet format.
See Vol. 3, Part A, 4.20 for Connection Parameter Update Request format.
You're basically going to need to write the code to format the packets and then write them to the hci device. The HCI ACL Data Packet header will contain 4 bytes. This is followed by 4 bytes for the Signalling command's length and channel id. This is then followed by your signal payload which in my case was 12 bytes (for the Connection Parameter Update Request).
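To make that layout concrete, here is the 20-byte packet written out, as a sketch in C# purely for illustration (the actual patch was C inside bluez); the field values follow the spec sections cited above, and all multi-byte fields are little-endian:
static class L2capPackets
{
    // Builds an HCI ACL packet carrying an L2CAP Connection Parameter Update
    // Request. Intervals are in 1.25 ms units; the timeout is in 10 ms units.
    static byte[] BuildConnParamUpdateRequest(ushort connHandle, byte identifier,
        ushort intervalMin, ushort intervalMax, ushort slaveLatency, ushort timeout)
    {
        const ushort LE_SIGNALING_CID = 0x0005;  // Vol 3, Part A: LE signaling channel
        const byte CONN_PARAM_UPDATE_REQ = 0x12; // Vol 3, Part A, 4.20

        var p = new byte[4 + 4 + 12];
        // HCI ACL header: 12-bit handle plus PB/BC flags, then data total length.
        p[0] = (byte)(connHandle & 0xFF);
        p[1] = (byte)((connHandle >> 8) & 0x0F); // PB = 00 (first packet), BC = 00
        p[2] = 16; p[3] = 0;                     // 4-byte L2CAP header + 12-byte payload
        // L2CAP header: payload length, then channel id.
        p[4] = 12; p[5] = 0;
        p[6] = (byte)(LE_SIGNALING_CID & 0xFF); p[7] = (byte)(LE_SIGNALING_CID >> 8);
        // Signaling command: code, identifier, length, then the four parameters.
        p[8] = CONN_PARAM_UPDATE_REQ;
        p[9] = identifier;
        p[10] = 8; p[11] = 0;
        WriteLe(p, 12, intervalMin);
        WriteLe(p, 14, intervalMax);
        WriteLe(p, 16, slaveLatency);
        WriteLe(p, 18, timeout);
        return p;
    }

    static void WriteLe(byte[] buf, int offset, ushort value)
    {
        buf[offset] = (byte)(value & 0xFF);
        buf[offset + 1] = (byte)(value >> 8);
    }
}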
You can then write them to the device similar to hci_send_cmd in bluez/lib/hci.c. I did each packet header as its own struct and wrote them each as iovecs to the device. I put my new function in the hci.c file and exposed it with a function prototype in bluez/lib/hci_lib.h. I then modified bluez/tools/hcitool.c to allow me to call this method from the command line. In my case, I made it so that the command was nearly identical to the lecup command as it requires the same parameters (lecup can't be used as it's meant to be called on the master side, not the slave).
Recompiled all of this and then, voila, I can use my new command on hcitool to send the parameters to the bluetooth controller. After sending my command, it then re-negotiates with the iOS device as expected.
Comments
This process is not for the faint of heart. Hopefully either this or some other method of setting the connection parameters will be added to bluez to simplify the process. Ideally, Apple will allow this via CoreBluetooth or IOBluetooth at some point as well (it may be possible already, but it is undocumented and difficult, and I gave up on the Apple libraries). I've journeyed down the rabbit hole and learned much more about the Bluetooth spec than I thought I'd have to just to change the connection parameters between a MacBook and an iPhone. Hopefully this will be helpful to somebody at some point (even if it's me checking back on how I did this).
I know I've left out a lot of details in this in order to keep it somewhat brief (i.e. usage on the bluez tools). Please comment if something isn't clear.
If you are implementing your Peripheral using CoreBluetooth, you can request somewhat customized connection parameters by calling -[CBPeripheralManager setDesiredConnectionLatency:forCentral:] to Low, Medium, or High (where Low latency means higher bandwidth). The documentation does not specify what this means, so we have to test it ourselves.
On an OSX Peripheral, when you set the desired latency to Low, the interval is still 22.5ms which is far from the minimum of 7.5ms.
On OSX Yosemite 10.10.4, this is what the CBPeripheralManagerConnectionLatency values mean:
Low: Min Interval: 18 (22.5ms), Max Interval: 18 (22.5ms), Slave Latency: 4 events, Timeout: 200 (2s).
Medium: Min Interval: 32 (40ms), Max Interval: 32 (40ms), Slave Latency: 6 events, Timeout: 200 (2s)
High: Min Interval: 160 (200ms), Max Interval: 160 (200ms), Slave Latency: 2 events, Timeout: 300 (3s)
Here is the code that I used to run a CBPeripheralManager on OSX. I used an Android device as central using BLE Explorer and dumped the Bluetooth traffic to a Btsnoop file.
// clang main.m -framework Foundation -framework IOBluetooth
#import <Foundation/Foundation.h>
#import <IOBluetooth/IOBluetooth.h>
@interface MyPeripheralManagerDelegate : NSObject <CBPeripheralManagerDelegate>
@property (nonatomic, assign) CBPeripheralManager* peripheralManager;
@property (nonatomic) CBPeripheralManagerConnectionLatency nextLatency;
@end

@implementation MyPeripheralManagerDelegate

+ (NSString*)stringFromCBPeripheralManagerState:(CBPeripheralManagerState)state {
    switch (state) {
        case CBPeripheralManagerStatePoweredOff:   return @"PoweredOff";
        case CBPeripheralManagerStatePoweredOn:    return @"PoweredOn";
        case CBPeripheralManagerStateResetting:    return @"Resetting";
        case CBPeripheralManagerStateUnauthorized: return @"Unauthorized";
        case CBPeripheralManagerStateUnknown:      return @"Unknown";
        case CBPeripheralManagerStateUnsupported:  return @"Unsupported";
    }
}

+ (CBUUID*)LatencyCharacteristicUuid {
    return [CBUUID UUIDWithString:@"B81672D5-396B-4803-82C2-029D34319015"];
}

- (void)peripheralManagerDidUpdateState:(CBPeripheralManager *)peripheral {
    NSLog(@"CBPeripheralManager entered state %@", [MyPeripheralManagerDelegate stringFromCBPeripheralManagerState:peripheral.state]);
    if (peripheral.state == CBPeripheralManagerStatePoweredOn) {
        NSDictionary* dict = @{CBAdvertisementDataLocalNameKey: @"ConnLatencyTest"};
        // Generated with uuidgen
        CBUUID *serviceUuid = [CBUUID UUIDWithString:@"7AE48DEE-2597-4B4D-904E-A3E8C7735738"];
        CBMutableService* service = [[CBMutableService alloc] initWithType:serviceUuid primary:TRUE];
        // value:nil makes it a dynamic-valued characteristic
        CBMutableCharacteristic* latencyCharacteristic = [[CBMutableCharacteristic alloc]
            initWithType:MyPeripheralManagerDelegate.LatencyCharacteristicUuid
              properties:CBCharacteristicPropertyRead
                   value:nil
             permissions:CBAttributePermissionsReadable];
        service.characteristics = @[latencyCharacteristic];
        [self.peripheralManager addService:service];
        [self.peripheralManager startAdvertising:dict];
        NSLog(@"startAdvertising. isAdvertising: %d", self.peripheralManager.isAdvertising);
    }
}

- (void)peripheralManagerDidStartAdvertising:(CBPeripheralManager *)peripheral
                                       error:(NSError *)error {
    if (error) {
        NSLog(@"Error advertising: %@", [error localizedDescription]);
    }
    NSLog(@"peripheralManagerDidStartAdvertising %d", self.peripheralManager.isAdvertising);
}

+ (CBPeripheralManagerConnectionLatency)nextLatencyAfter:(CBPeripheralManagerConnectionLatency)latency {
    switch (latency) {
        case CBPeripheralManagerConnectionLatencyLow:    return CBPeripheralManagerConnectionLatencyMedium;
        case CBPeripheralManagerConnectionLatencyMedium: return CBPeripheralManagerConnectionLatencyHigh;
        case CBPeripheralManagerConnectionLatencyHigh:   return CBPeripheralManagerConnectionLatencyLow;
    }
}

+ (NSString*)describeLatency:(CBPeripheralManagerConnectionLatency)latency {
    switch (latency) {
        case CBPeripheralManagerConnectionLatencyLow:    return @"Low";
        case CBPeripheralManagerConnectionLatencyMedium: return @"Medium";
        case CBPeripheralManagerConnectionLatencyHigh:   return @"High";
    }
}

- (void)peripheralManager:(CBPeripheralManager *)peripheral didReceiveReadRequest:(CBATTRequest *)request {
    if ([request.characteristic.UUID isEqualTo:MyPeripheralManagerDelegate.LatencyCharacteristicUuid]) {
        [self.peripheralManager setDesiredConnectionLatency:self.nextLatency forCentral:request.central];
        NSString* description = [MyPeripheralManagerDelegate describeLatency:self.nextLatency];
        request.value = [description dataUsingEncoding:NSUTF8StringEncoding];
        [self.peripheralManager respondToRequest:request withResult:CBATTErrorSuccess];
        NSLog(@"didReceiveReadRequest:latencyCharacteristic. Responding with %@", description);
        self.nextLatency = [MyPeripheralManagerDelegate nextLatencyAfter:self.nextLatency];
    } else {
        NSLog(@"didReceiveReadRequest: (unknown) %@", request);
    }
}
@end

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        MyPeripheralManagerDelegate *peripheralManagerDelegate = [[MyPeripheralManagerDelegate alloc] init];
        CBPeripheralManager* peripheralManager = [[CBPeripheralManager alloc] initWithDelegate:peripheralManagerDelegate queue:nil];
        peripheralManagerDelegate.peripheralManager = peripheralManager;
        [[NSRunLoop currentRunLoop] run];
    }
    return 0;
}

Blackberry's WLANInfo.getWLANState() Doesn't Return Correct Information

I'm working with the NetworkUtils.java class created by Sameer Nafdey in his blog post regarding acquiring a network connection within a BlackBerry application. However, I recently noticed that my application was using the cell network even when a WiFi connection was available. I realized this was the case when we tested the application on a Torch with no SIM card and the app failed. After some debugging I found that:
if (WLANInfo.getWLANState() == WLANInfo.WLAN_STATE_CONNECTED){...}
was returning false despite the fact that the WiFi network was set up correctly (I was able to use the web browser to visit Google). We had to return the Torch, but while debugging the app in the simulator I noticed that if WiFi was on but the data network was turned off, then this call would work correctly. However, I would then get an exception (java.io.IOException: Radio is off) when executing this block:
httpConnector = (HttpConnection)Connector.open(URL);
httpConnector.setRequestMethod(HttpConnection.GET);
httpConnector.setRequestProperty("Content-Type", "text/plain; charset=UTF-8");
in = httpConnector.openInputStream();
I've seen a lot of issues related to the Torch's WiFi connectivity problems but I'm currently concerned that this behavior may also be affecting other models. Anyone seen anything like this or have an idea of how to fix it?
You could try:
if (RadioInfo.areWAFsSupported(RadioInfo.WAF_WLAN)
    && (RadioInfo.getActiveWAFs() & RadioInfo.WAF_WLAN) != 0
    && CoverageInfo.isCoverageSufficient(1, RadioInfo.WAF_WLAN, false))
{ ... }
It's been working so far, on Blackberry OS 6.0 (Torch 9800). Tested on device and sim.
