Wi-Fi router (N600) emitting both 2.4 GHz and 5 GHz.
- If packet loss arises due to interference on the 2.4 GHz band,
- does that lead to instability of the router's own hardware, ultimately degrading signal quality for the devices connected to the 5 GHz antenna?
No. Issues on 2.4 GHz will not cause inferior service on 5 GHz for a dual-band Wi-Fi router.
2.4 GHz and 5 GHz can be considered two independent pipes going into the Wi-Fi router. They are two independent radios using different frequencies, so they won't interfere with each other at all.
If you are experiencing a problem on 5 GHz, I suggest changing the 5 GHz channel used. There are many smartphone applications that help you scan for Wi-Fi signals.
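If you'd rather scan from a laptop than a phone, a minimal sketch like this can list nearby networks and their channels. It assumes a Linux host with the `iw` tool installed and a wireless interface named `wlan0` (both assumptions; adjust for your system):

```python
# Minimal Wi-Fi scan sketch: lists SSIDs and their frequencies/bands.
# Assumes a Linux host with `iw` installed and an interface named "wlan0";
# scanning usually requires elevated privileges.
import subprocess

def scan_networks(interface="wlan0"):
    out = subprocess.run(
        ["iw", "dev", interface, "scan"],
        capture_output=True, text=True, check=True
    ).stdout
    networks, freq = [], None
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("freq:"):
            freq = int(float(line.split()[1]))     # frequency in MHz
        elif line.startswith("SSID:"):
            ssid = line.split("SSID:", 1)[1].strip()
            networks.append((ssid, freq))
    return networks

if __name__ == "__main__":
    for ssid, freq in scan_networks():
        band = "5 GHz" if freq and freq > 4000 else "2.4 GHz"
        print(f"{ssid or '<hidden>'}: {freq} MHz ({band})")
```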
Hope it helps
-- While my Wi-Fi router (with 5 GHz disabled) was facing interference on the 2.4 GHz band, with huge packet loss, I tried to change its channel but could not log in!
-- The admin page would not load when I tried to open it.
-- I started to ping its IP from a wired connection, and there I could see huge packet loss.
-- I am afraid that if I enable the 5 GHz band, the packet loss caused on 2.4 GHz will affect the devices connected to the same hardware via the 5 GHz band!
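To put a number on that packet loss from the wired side, a quick sketch like this can help (the router address 192.168.1.1 is only an assumption; substitute your own):

```python
# Rough packet-loss check over the wired link using the system `ping`.
# The router address below is an assumption; replace it with yours.
import subprocess

def packet_loss(host="192.168.1.1", count=50):
    # Linux ping syntax; on Windows use ["ping", "-n", str(count), host].
    result = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True
    )
    for line in result.stdout.splitlines():
        if "packet loss" in line:
            return line.strip()
    return "no summary line found"

if __name__ == "__main__":
    print(packet_loss())
```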
So I started making test streams on my YouTube channel using Streamlabs OBS.
I turned on performance mode and looked at the stream, but it was running at about 2 fps.
I looked at Task Manager and my network usage was NEVER going past 1%, and Streamlabs's network usage rarely went past 0.1 Mbps.
This happens with other things too, and I don't like it since it makes my internet so slow. Internet (if you're wondering): Verizon Fios, 5 GHz Wi-Fi connection.
The percentage simply shows the relation between the current network usage and the link speed of your network adapter. For example, if your link speed is 1 Gbps and you're transferring data at 10 Mbps, the network usage will be 1%.
When transferring data over the Internet, the percentage is generally not very useful, because the maximum speed is defined by your ISP and, in your case, is likely to be a lot lower than your adapter speed. Furthermore, your Internet speed can also be degraded by a poor Wi-Fi signal, by other users in the same network, etc.
What you should actually be looking at is the actual speed (in bit/s) at which you are sending data (look at the "Send" field in the performance tab of the task manager) and compare that to your Internet speed (which you can learn by doing a speed test).
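As a rough illustration of that arithmetic (the figures below are made-up example values, not measurements from your setup):

```python
# Illustration of how Task Manager's percentage relates to link speed.
# All figures are example values, not real measurements.
link_speed_mbps = 1000        # adapter link speed (1 Gbps)
upload_rate_mbps = 5          # what the stream is actually sending
isp_upload_mbps = 10          # upload speed from a speed test

adapter_usage = upload_rate_mbps / link_speed_mbps * 100
isp_usage = upload_rate_mbps / isp_upload_mbps * 100

print(f"Task Manager would show ~{adapter_usage:.1f}% network usage")
print(f"...but you are using ~{isp_usage:.0f}% of your actual upload capacity")
```

So a stream that looks like "1% network usage" against a gigabit adapter can still be saturating half (or all) of your real upload bandwidth.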
I have been looking into LoRaWAN for a low-cost waterproof asset tracker I am looking at building.
AFAIK, the primary benefits of LoRaWAN over, say, LTE-M or cellular are: no connectivity costs and potentially lower power consumption.
What I'm wondering is: why can't we use our own cellphones as the "base station" that the IoT device talks with? We can do this with Bluetooth and Wi-Fi, why not cell? Is it the LTE protocol that prevents it? Physics? What am I missing?
There are quite a few architectural reasons why peer-to-peer LTE isn't feasible, but the largest is probably that in LTE the uplink and downlink use different modulation techniques.
In the downlink (the connection from the base stations (eNodeBs) to the User Equipment (our mobile phones)), Orthogonal Frequency Division Multiple Access (OFDMA) is used; this means the phone listens on the RF interface for the OFDMA signal.
This works well; OFDMA is a great way of encoding data onto the air interface. But it has a very high peak-to-average power ratio, which means that if the UEs used OFDMA in the uplink (from the UE to the eNodeB) they would have awful battery life.
Instead, in the uplink LTE uses Single-Carrier Frequency Division Multiple Access (SC-FDMA), which is much more power efficient and lets you talk all day, so the eNodeBs listen on their RF interface for the SC-FDMA-modulated traffic.
This means our UEs (mobile phones) use one modulation scheme to send and a different one to receive, so they can't talk directly to one another: they can't send OFDMA-modulated data, only receive it, and vice versa.
Some more reading on OFDMA & SC-FDMA.
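To make the peak-to-average power point concrete, here is a small numerical sketch (toy waveforms with made-up parameters, not actual LTE numerology) comparing the PAPR of an OFDM-style multicarrier signal with a constant-envelope single-carrier one:

```python
# Toy peak-to-average power ratio (PAPR) comparison:
# OFDM-style multicarrier signal vs. a single-carrier QPSK signal.
# Simplified illustration only; real LTE uses different parameters.
import numpy as np

rng = np.random.default_rng(0)

def papr_db(signal):
    power = np.abs(signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

n_subcarriers = 256

# Random QPSK symbols on each subcarrier, combined via IFFT -> one OFDM symbol.
qpsk = (rng.choice([-1, 1], n_subcarriers)
        + 1j * rng.choice([-1, 1], n_subcarriers)) / np.sqrt(2)
ofdm_symbol = np.fft.ifft(qpsk) * np.sqrt(n_subcarriers)

# Single-carrier QPSK: one symbol at a time, constant envelope.
single_carrier = (rng.choice([-1, 1], n_subcarriers)
                  + 1j * rng.choice([-1, 1], n_subcarriers)) / np.sqrt(2)

print(f"OFDM PAPR:           {papr_db(ofdm_symbol):.1f} dB")    # typically ~8-12 dB
print(f"Single-carrier PAPR: {papr_db(single_carrier):.1f} dB")  # ~0 dB
```

The multicarrier signal's occasional large peaks force the power amplifier to back off, which is exactly the battery-life penalty the answer describes.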
The LTE relay interface introduced as part of Release 10 allows the deployment of relay nodes (a kind of low-cost eNB) that are fixed and that use in-band LTE to extend the coverage of standard eNodeBs by one hop, improve signal quality, and increase network capacity. Relays can be placed so that they convert a long single hop into two shorter hops.
However, the approach of using a UE as a relay has many challenges, because it would load the UE with functional changes across layers (MAC, PHY, RRC, NAS): it would have to take on additional functionality from relay nodes/eNBs, ranging from lower-layer signalling, coordination and mobility to forwarding. There might also be additional power consumption and antenna changes needed to support this, all of which would add to the cost of the UE.
I would like to connect two USB webcams to a Raspberry Pi and be able to get at least 1920 x 1080 frames at 10 fps using OpenCV. Has anyone done this and knows whether it is possible? I am worried that the Pi has only one USB bus (USB 2.0) and might run into a USB bandwidth problem.
Currently I am using an Odroid, which has both a USB 2.0 and a USB 3.0 bus, so I can connect one camera to each without any problem.
What I have found in the past is that no matter what bandwidth options you select in OpenCV, the cameras try to take up as much bandwidth as they want.
This has made multiple cameras on a single USB port a no-no.
That being said, this depends on your camera and is very likely worth testing. I regularly use Microsoft HD-3000 cameras and they do not like working on the same port, even on my beefy i7 laptop, because the limitation is in the USB host bandwidth, not processing power.
I have been through a similar development process in the past, though, and selected an Odroid XU4 because it has multiple USB hosts for the cameras. It also gives you a metric tonne more processing power and, more importantly, you can buy and use the on-board chip if you want to create a custom electronics design.
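For what it's worth, a quick test sketch along these lines can tell you whether the Pi's single USB 2.0 bus copes with your particular cameras. Device indices 0 and 1 and MJPG support are assumptions about your hardware:

```python
# Quick two-camera bandwidth test with OpenCV.
# Device indices 0/1 and MJPG support are assumptions about your cameras.
import time
import cv2

def open_camera(index, width=1920, height=1080, fps=10):
    cap = cv2.VideoCapture(index)
    # Requesting MJPG keeps the per-frame USB payload far smaller than raw YUYV.
    cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"MJPG"))
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    cap.set(cv2.CAP_PROP_FPS, fps)
    return cap

cams = [open_camera(0), open_camera(1)]
frames = [0, 0]
start = time.time()
while time.time() - start < 10:          # sample for 10 seconds
    for i, cap in enumerate(cams):
        ok, _ = cap.read()
        if ok:
            frames[i] += 1

elapsed = time.time() - start
for i, count in enumerate(frames):
    print(f"camera {i}: {count / elapsed:.1f} fps")
for cap in cams:
    cap.release()
```

If both cameras hold close to the requested frame rate, the bus is coping; if one (or both) collapses well below it, you are hitting the shared USB bandwidth limit rather than a CPU limit.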
I'm working on an engineering project where I want a go-kart to maintain a direct connection with a base station. The base and go-kart can be separated by about half a mile (with lots of obstacles in between), which is too far for Wi-Fi.
I'm thinking about using 3G/4G to directly connect the two. Does anyone have any resources or ideas that might help?
Or, alternatively, a better way to connect them? I'm just trying to send some sensor data (pretty low bandwidth) in real-time.
The biggest problem you face is finding radio spectrum that you are allowed to use. All 3G/4G spectrum is licensed to some firm, and they get really unhappy (e.g. have you hunted down and fined) when you transmit in their space.
I did find DASH7 which
is an open source wireless sensor networking standard … which operates in the 433 MHz unlicensed ISM band. DASH7 provides multi-year battery life, range of up to 2 km, indoor location with 1 meter accuracy, low latency for connecting with moving things, a very small open source protocol stack …
with a parts cost around US$ 10. This sounds like it satisfies your requirements and keeps the local constabulary from bothering you.
You could maybe use SMS, between a modem on the kart and a mobile phone or modem at the base.
A mobile data connection like a telephone call isn't possible directly between the two; you have to make a data connection from the kart to a server in your operator's core network, identified by the APN. Then you can access IP addresses as for a normal internet connection - so the base computer would have to be a web server.
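If you do go the SMS route, the modem side usually comes down to a few AT commands over a serial link. Here is a rough sketch; the serial port path, baud rate and destination number are placeholders, and your modem's exact AT dialect may differ:

```python
# Rough sketch of sending sensor readings by SMS through a serial-attached
# cellular modem using standard AT commands (AT+CMGF / AT+CMGS).
# Port name, baud rate and destination number below are placeholders.
import time
import serial  # pyserial

def send_sms(port, number, text, baud=115200):
    with serial.Serial(port, baud, timeout=2) as modem:
        modem.write(b"AT+CMGF=1\r")           # switch to SMS text mode
        time.sleep(0.5)
        modem.write(f'AT+CMGS="{number}"\r'.encode())
        time.sleep(0.5)
        modem.write(text.encode() + b"\x1a")  # Ctrl-Z terminates the message
        time.sleep(3)
        return modem.read_all().decode(errors="replace")

if __name__ == "__main__":
    reading = "speed=12.4kph heading=87"
    print(send_sms("/dev/ttyUSB0", "+15550100", reading))
```

Keep in mind SMS delivery is store-and-forward with no latency guarantee, so it suits occasional low-rate telemetry rather than real-time control.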
I have a project that uses a 16-port RocketPort Infinity to receive data from 6 different anemometers (wind speed measurement devices) (RS-422, 50 Hz, 38.4k baud, 47 bytes per record). When I use 32 Hz and 9600 baud, everything is fine; however, when I change to 50 Hz, some of the data isn't received. I tried using USB instead of the RocketPort Infinity with no luck.
So, apart from the anemometer failing, I suspect the following explanations for the data loss:
For the RocketPort Infinity, I opened all 16 ports but only connected 6 of them; I suspect the maximum data throughput is too high when I switch to 50 Hz.
The IRQ switch speed is too high for the com port to operate properly.
Is there any other possible reason? Please correct me if I'm mistaken.
Development environment of the receiver: Delphi 6 on Windows XP Professional 32-bit, with CPort 3.1.
The IRQ rate isn't that high and modern machines should have no trouble keeping up with it. I suspect the real problem is your app not processing the received bytes fast enough. Especially when your code also updates a UI in the same thread that receives the data.
Hard to give specific troubleshooting hints because you neither specify a language nor an operating system. But be sure to get your error handling correct. Distinguish between a buffer overflow (app not reading fast enough) and a character buffer overrun (driver not reading fast enough). On Windows that's CE_RXOVER and CE_OVERRUN.
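As an illustration of the "read in a dedicated thread, process elsewhere" idea, here is a sketch in Python with pyserial rather than Delphi/CPort, with "COM3" as a placeholder port name. A back-of-envelope check (assuming 8N1 framing) also shows the serial link itself is not the bottleneck:

```python
# Sketch of keeping the serial read loop in its own thread so UI or logging
# code never stalls the receiver. Python/pyserial rather than Delphi/CPort,
# and "COM3" is a placeholder port name.
# Back-of-envelope load check (assuming 8N1 framing):
#   47 bytes/record * 10 bits/byte * 50 records/s ≈ 23.5 kbit/s per port,
#   which fits comfortably inside 38,400 baud.
import queue
import threading
import serial  # pyserial

RECORD_SIZE = 47

def reader(port_name, out_queue, stop_event):
    with serial.Serial(port_name, 38400, timeout=0.1) as port:
        buffer = bytearray()
        while not stop_event.is_set():
            buffer += port.read(4096)          # drain whatever has arrived
            while len(buffer) >= RECORD_SIZE:
                out_queue.put(bytes(buffer[:RECORD_SIZE]))
                del buffer[:RECORD_SIZE]

records = queue.Queue()
stop = threading.Event()
t = threading.Thread(target=reader, args=("COM3", records, stop), daemon=True)
t.start()

# The main thread (UI, logging, plotting...) consumes records at its own pace.
for _ in range(10):
    print(records.get())
stop.set()
```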
Are there constraints on a serial port with more than 6 devices connected?
Yes, there are constraints. I assume that you have differential outputs and an I/O receiver with differential inputs. Please see Balanced differential signals. It is possible that the maximum voltage ratings of the receiver circuits are exceeded.
Each port's speed must match the corresponding device's speed. Please see the other criteria which must be matched.
The IRQ switch speed is too high for the com port to operate properly.
Why do you assume that it would be a problem with your IRQ switch speed? I would say rather that you have only scarce IRQ resources.