Wi-Fi High Latency and Packet Loss - But No Interference

I am using two Wi-Fi access points (SSIDs: "ve-E" and "ve-E1"), both on 2.4 GHz.
After a thorough scan, I assigned them different channels (6 and 7) in order to avoid interference.
But occasionally I observe high latency and packet loss.
The figure below shows the latency (varying between 4 and 5000 ms!).
-- Can someone help me with any advanced techniques to find out where it is going wrong, so that I can reduce the packet loss? --
At the same instant I checked whether there was any interference; the scanner (image below) shows there is none.

Just saw this article:
"Bear in mind that tools like inSSIDer that plot which SSIDs cover which channels/frequencies at which power levels are designed to LOOK like RF spectrum analyzers, but they are not true RF spectrum analyzers. They take normal Wi-Fi network scan results lists and plot them. They are not capable of detecting or plotting non-Wi-Fi energy, so they'll never show the interference from microwave ovens, cordless phones, baby monitors, or any other non-Wi-Fi device that uses the 2.4GHz or 5GHz bands.
"
Ref: https://superuser.com/questions/973209/wi-fi-interference-and-loss-of-ssids-in-scans
Looks like my scanner might miss routers that have hidden their SSIDs, the nearby microwave, any hidden Bluetooth devices, etc.

Related

How can I estimate the power consumption of a BLE module?

I'm writing an iOS app for a device with a BLE module that advertises a few bytes of data on a consistent basis while it's connected. We are trying to estimate the power consumption of the BLE module so we can estimate the battery life for the device. I've scoured SO and Google looking for the appropriate way to estimate this, but I'm coming up empty. Is there a way to take the number of bytes being sent, multiply it by the frequency with which the data is sent, and come up with a rough approximation of power consumption?
A typical BLE SoC (i.e. an all-in-one application + radio chip) typically consumes:
A few hundred nA while in deep sleep,
2 to 10 µA while an RTC tracks time (needed between radio events while advertising or connected),
10 to 30 mA while the CPU or radio runs (computing data, TX, RX). RX and TX power consumption is roughly the same.
The life of a BLE peripheral basically consists of 3 main states:
Idle (not advertising, not connected). Most people would say the device is off. Unless it has a physical power switch, though, it still consumes a few hundred nanoamps.
Advertising (before a connection takes place). The peripheral needs to be running approximately 5 ms every 50 ms. This is when your device actually uses the most power, because advertising requires sending many packets, frequently. Average power consumption is in the 1-10 mA range.
Connected. Here, consumption is application-dependent. If the application is mostly idle, the peripheral must wake up periodically and send a packet each time to keep the connection alive. Even if the peripheral has nothing useful to send, an empty packet is still sent. Side effect: this means low duty cycle applications basically transmit their packets for free.
So, to actually answer your question:
the length of your payload is not a problem (as long as you keep your packets short): we're talking about transmitting for 1 µs more per bit, while the rest of the handling (waking up, receiving the master's packet, etc.) keeps the chip awake for at least 200 µs;
what you actually call "continuous" is the key point. Is it 5 Hz? 200 Hz? 3 kHz?
Let's say we send data at a 5 Hz rate. The estimate is around 5 connection events every second, roughly 2 ms of CPU + radio per connection event, so 10 ms of activity every second. Average consumption: about 200 µA (0.01 * 20 mA + 0.99 * 5 µA).
This calculation does not take some factors into account, though:
You should add the consumption of your sensors (gyros/accelerometers can eat a few mA),
You should consider on-board communication (I2C, SPI, etc.),
If your design actually uses two chips (one for the application, talking to a radio module), consumption will roughly double.
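
(A minimal sketch of the duty-cycle arithmetic above, in Python. The figures used here - 20 mA active, 5 µA sleep, 2 ms per connection event, a 220 mAh coin cell - are illustrative assumptions, not measurements of any particular chip.)

    # Back-of-the-envelope BLE current estimate. All constants are illustrative
    # assumptions (see text above), not data-sheet values for a specific SoC.

    def average_current_ma(event_rate_hz, event_duration_s=0.002,
                           active_ma=20.0, sleep_ma=0.005):
        """Duty-cycle-weighted average current in mA."""
        duty = min(event_rate_hz * event_duration_s, 1.0)
        return duty * active_ma + (1.0 - duty) * sleep_ma

    def battery_life_hours(capacity_mah, avg_ma):
        """Ideal battery life, ignoring self-discharge and peak-current effects."""
        return capacity_mah / avg_ma

    avg = average_current_ma(event_rate_hz=5)   # ~0.2 mA, i.e. ~200 µA as above
    life = battery_life_hours(220, avg)         # e.g. a CR2032-class coin cell
    print(f"average current: {avg * 1000:.0f} µA, battery life: {life / 24:.0f} days")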

Methods to avoid fluctuating signal strength in trilateration

I am trying to create a Wi-Fi trilateration project using 3 Raspberry Pis.
I can capture packets from all 3 Pis to a web server, but even if a mobile device stays in one spot, I still get wild fluctuations in the signal strength results.
I have been researching but can't really find a solution for getting a consistent signal strength.
I have seen the fingerprinting method, but I would like to avoid it, since it requires a lot of setup and has to be recalibrated when moved.
If you could point me in the right direction, I would appreciate it.
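
(For illustration only: a common first step is to smooth the raw RSSI stream before feeding it into trilateration, for example with a median filter followed by an exponential moving average. The sketch below shows that idea; the window size and smoothing factor are arbitrary assumptions, and smoothing only removes jitter, not multipath bias.)

    # Minimal sketch: smooth noisy RSSI readings before trilateration.
    # The window size and alpha are arbitrary assumptions, not tuned values.
    from collections import deque
    from statistics import median

    class RssiSmoother:
        def __init__(self, window=9, alpha=0.2):
            self.samples = deque(maxlen=window)  # sliding window for the median
            self.alpha = alpha                   # EMA smoothing factor
            self.ema = None

        def update(self, rssi_dbm):
            self.samples.append(rssi_dbm)
            med = median(self.samples)           # reject single-sample spikes
            if self.ema is None:
                self.ema = med
            else:
                self.ema = self.alpha * med + (1 - self.alpha) * self.ema
            return self.ema

    smoother = RssiSmoother()
    for raw in [-62, -61, -75, -60, -63, -90, -61, -62]:  # example noisy readings
        print(round(smoother.update(raw), 1))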

Creating synchronized stereo videos using webcams

I am using OpenCV to capture video streams from two USB webcams (Microsoft LifeCam Studio) in Ubuntu 14.04. I am using very simple VideoCapture code (source here) and am trying to at least view two videos that are synchronized against each other.
I used Android stopwatch apps (UltraChron Stopwatch Lite and Stopwatch Timer) on my Samsung Galaxy S3 mini and realized that my viewed images are out of sync (they show different times on the stopwatch).
The frames are in sync maybe 50% of the time. The frame time differences I get range from 0 to about 300 ms, with an average of about 120 ms. It seems that the amount of timeout used has very little effect on sync (it's the same for 1000 ms or 2000 ms). I tried minimizing the timeout (waitKey(1), the minimum for the OpenCV loop to work at all) and reading only every Xth iteration of the loop - this gave worse results than waitKey(1000). I run in Full HD, but lowering the resolution to 640x480 had no effect.
An ideal result would be a 100% synchronized stereo video stream at X FPS. As I said, so far I use OpenCV to view the video still images, but I do not mind using anything else to get the desired result (it can be on Windows too).
Thanks for help in advance!
EDIT: In my search for low-cost hardware I found that it is probably possible to do some commodity hardware hacking (link here) and inject a single clock signal into multiple camera modules simultaneously to get the desired sync. The guy who did that seems to have developed his GENLOCKed camera board (called NerdCam1) and even a synced stereo camera board that he now sells for about €200.
However, I have almost zero hardware hacking ability. Also, I am not sure whether such clock injection is possible for resolutions above the NTSC/PAL standard (as it seems to be an "analog" solution?). Also, I would prefer a variable-baseline option where the cameras are not soldered onto a single board.
It is not possible to stereo-sync two common webcams, because webcams lack the external trigger feature that lets one precisely sync multiple cameras using a common trigger signal. Such a trigger can be implemented in SW or HW, but the latter gives better precision. Webcams only support a "free-running" mode and let you stream at whatever FPS they support, but you cannot influence when exactly the frame integration/exposure is done.
There are USB cameras with a dedicated external trigger feature (usually scientific cameras, like Point Grey) - they are more expensive than webcams (starting at about $300 apiece) but can be synced. If you really are on a low budget, you can try to hack the PS3 Eye camera to get the external trigger feature.
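
(To illustrate what software-level "sync" can and cannot do: with OpenCV you can at least minimize the capture skew by calling grab() on both cameras back-to-back and only then decoding with retrieve(), rather than doing two full read() calls. A minimal sketch is below, with device indices 0 and 1 assumed; since the webcams still run free, this reduces the offset but cannot eliminate it.)

    # Minimal sketch: reduce (not eliminate) capture skew between two free-running
    # webcams by latching both frames back-to-back before decoding.
    # Device indices 0 and 1 are assumptions.
    import cv2

    cap_left = cv2.VideoCapture(0)
    cap_right = cv2.VideoCapture(1)

    while True:
        # grab() only latches the frame; the expensive decode happens in retrieve().
        ok_l = cap_left.grab()
        ok_r = cap_right.grab()
        if not (ok_l and ok_r):
            break
        _, frame_l = cap_left.retrieve()
        _, frame_r = cap_right.retrieve()
        cv2.imshow("left", frame_l)
        cv2.imshow("right", frame_r)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
            break

    cap_left.release()
    cap_right.release()
    cv2.destroyAllWindows()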

Capabilities of Samsung Galaxy S2 Wi-Fi network card

I am doing research, trying to find the distance between two Samsung Galaxy S2 phones over Wi-Fi by measuring RTT.
For that, in order to get the highest accuracy, I need to access the network PHY and see the exact time the packet left one phone and the exact time it arrives back, before it has been processed by the LAN card (again, I need very high accuracy).
Is this possible? Has anyone succeeded in accessing the LAN card's physical layer on the Samsung Galaxy S2?
BTW - my cell phones are "rooted".
Thanks in advance,
Tzach
It's not really feasible at the SW level. There are several factors that severely limit your ability to perform such measurements:
Data propagates at the speed of light; any timer you might have access to is no match for that resolution.
You need to do at least some processing to detect that the data has arrived; similarly, there is a delay from when you record the transmission start until the packet actually leaves for the air. These delays are much more significant than the actual travel time through the air.
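
(To put a number on the first point: one metre of extra distance changes the round-trip time by only a few nanoseconds, far below what app-level timers or SW timestamping can resolve. A rough illustration:)

    # Rough illustration: timer resolution needed for RTT-based ranging.
    C = 299_792_458.0  # speed of light in m/s

    def rtt_seconds(distance_m):
        """Ideal over-the-air round-trip time for a given distance."""
        return 2.0 * distance_m / C

    for d in (1, 10, 100):
        print(f"{d:>4} m  ->  {rtt_seconds(d) * 1e9:.1f} ns round trip")
    # One extra metre adds only ~6.7 ns to the RTT, while SW timestamping and
    # MAC/PHY processing jitter are in the microsecond range or worse.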
Also see a related question: Wi-Fi 802.11 speed depending on distance
Thanks a lot for your quick answer. I worked a lot on this at the SW level and, as you wrote, it is not feasible. My question is about testing this at the HW level ("talking to the PHY"). Do you know if it can be done with the Samsung Galaxy S2?

Monitor packet losses using Wireshark

I'm using Wireshark to monitor network traffic to test new software installed on a router. The router itself lets other networks (4G, mobile devices through USB, etc.) connect to it and enhance the speed of that router.
What I'm trying to do is disconnect the connected devices and discover whether there is any packet loss while doing this. I know I can simply use the "tcp.analysis.lost_segment" filter to track down lost packets, but how can I isolate the specific device that causes the packet loss? Or even know whether a loss was caused by a disconnected device?
Also, what is the most stable method to test this with? Downloading a big file? Streaming a video?
All input is greatly appreciated.
You can't detect lost packets solely with Wireshark or any other packet capture*.
Wireshark basically "records" what was seen on the line.
If a packet is lost, then by definition you will not see it on the line.
The * means I lied. Sort of. You can't detect them as such, but you can extremely strongly indicate them by taking simultaneous captures at/near both devices in the data exchange... and then comparing the two captures.
COMPUTER1<-->CAPTURE-MACHINE<-->NETWORK<-->CAPTURE-MACHINE<-->COMPUTER2
If you see the data leaving COMPUTER1 but never see it in the capture at COMPUTER2, there's your loss. (You could then move the capture machines one device closer on the network until you find the exact box/line losing your packets... or just analyze the devices in the network for e.g. configs, errors, etc.)
Alternatively, if you know exactly when the packet was sent, you could not prove but INDICATE its absence with a capture, covering a minute or two before and after the packet was sent, which does NOT contain that packet. Such an indicator may even stand on its own as sufficient to find the problem.
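
(For illustration, one way to do the two-capture comparison programmatically is to index the packets in each capture by fields that survive the path, e.g. source/destination addresses, the IP identification field and the TCP sequence number, and report anything seen at the sender but missing at the receiver. A minimal sketch with scapy follows; the file names sender.pcap and receiver.pcap are hypothetical, and keying on these fields assumes nothing on the path rewrites them, e.g. no NAT.)

    # Minimal sketch: find packets captured near the sender but missing near the
    # receiver. File names are hypothetical; keying on (src, dst, IP id, TCP seq)
    # assumes no NAT or other rewriting on the path between the capture points.
    from scapy.all import rdpcap, IP, TCP

    def packet_keys(pcap_file):
        keys = set()
        for pkt in rdpcap(pcap_file):
            if IP in pkt and TCP in pkt:
                ip, tcp = pkt[IP], pkt[TCP]
                keys.add((ip.src, ip.dst, ip.id, tcp.seq))
        return keys

    sent = packet_keys("sender.pcap")
    received = packet_keys("receiver.pcap")
    missing = sent - received
    print(f"{len(missing)} packets seen at the sender but not at the receiver")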
