DroneKit control over Mission Planner - image processing

Hi, I'm working on a project that uses a Pixhawk 2.1 Cube connected to a Raspberry Pi 4B over the MAVLink protocol with the DroneKit library. Is it possible for DroneKit to pause a mission that Mission Planner started in AUTO mode, take over control of the drone, and then switch it to RTL mode afterwards?
I considered downloading the mission from Mission Planner, extracting the waypoints and commanding the Pixhawk from the Raspberry Pi, but that seems too heavy for the Pi, which also has to do image processing that takes up a lot of processing power.
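As far as I understand, the Pixhawk itself executes the AUTO mission, so the Pi would mainly have to change the flight mode over MAVLink. Roughly this is the flow I have in mind (a minimal, untested sketch; the connection string, baud rate and coordinates are placeholders for my setup):

    from dronekit import connect, VehicleMode, LocationGlobalRelative
    import time

    # Connect to the Pixhawk over the Pi's serial port (port and baud are assumptions)
    vehicle = connect('/dev/serial0', wait_ready=True, baud=921600)

    # ... the mission uploaded from Mission Planner is running in AUTO ...
    # ... the image-processing pipeline decides it needs to intervene ...

    # Pause the mission by switching to GUIDED and take over
    vehicle.mode = VehicleMode("GUIDED")
    while vehicle.mode.name != "GUIDED":
        time.sleep(0.2)

    # Fly to a point computed by the vision code (coordinates are placeholders)
    vehicle.simple_goto(LocationGlobalRelative(-35.3632, 149.1652, 20))
    time.sleep(30)  # do the image-processing-driven work here

    # Hand control back: return to launch
    vehicle.mode = VehicleMode("RTL")

    vehicle.close()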

Related

Multiple USB cameras on Raspberry Pi using OpenCV

I would like to connect 2 USB webcams to a Raspberry Pi and get at least 1920 x 1080 frames at 10 fps from each using OpenCV. Has anyone done this and knows whether it is possible? I am worried that the Pi has only one USB bus (USB 2.0) and might run into a USB bandwidth problem.
Currently I am using an ODROID, which has both a USB 2.0 and a USB 3.0 bus, so I can connect one camera to each without any problem.
What I have found in the past is that, no matter what bandwidth options you select in OpenCV, the cameras try to take up as much bandwidth as they want.
This has made multiple cameras on a single USB port a no-no for me.
That said, this depends on your camera and is very likely worth testing. I regularly use Microsoft HD-3000 cameras and they do not like sharing a port, even on my beefy i7 laptop, because the limitation is USB host bandwidth rather than processing power.
I went through a similar development process in the past and selected an ODROID-XU4 because it has multiple USB host controllers for the cameras. It also gives you a metric tonne more processing power and, more importantly, you can buy and use the on-board chip if you ever want to create a custom electronics design.
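If you do test it, the first thing I would try is forcing a compressed pixel format, since two uncompressed 1080p streams are what blow the USB 2.0 budget. A rough sketch of what I mean (the device indices and MJPG support are assumptions about your cameras):

    import cv2

    def open_camera(index, width=1920, height=1080, fps=10):
        cap = cv2.VideoCapture(index)
        # Ask the camera for MJPG so each stream needs far less USB bandwidth
        cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"MJPG"))
        cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
        cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
        cap.set(cv2.CAP_PROP_FPS, fps)
        return cap

    cams = [open_camera(0), open_camera(1)]   # /dev/video0 and /dev/video1

    while True:
        frames = []
        for cam in cams:
            ok, frame = cam.read()
            if not ok:  # a dropped frame here usually means bandwidth trouble
                frames = None
                break
            frames.append(frame)
        if frames is None:
            break
        # ... process both frames here ...

    for cam in cams:
        cam.release()

Whether the cameras actually honour the MJPG request is exactly the part that varies between models, which is why I would test with your specific hardware first.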

ALSA, TinyALSA support in Android Things for Raspberry Pi 0.5.1-devpreview

The 0.5.1-devpreview BSP for the RPi3 ships with libtinyalsa.so and libalsautils.so, but seemingly no adb shell command-line support for audio.
We are designing a custom audio board (with an audio processor) for use with Android Things and Raspberry Pi; under Raspbian we would typically use ALSA utilities and custom kernel drivers to access this board.
It is possible the default Android Things I2S peripheral drivers and Peripheral Manager support the stream interfaces we need (the same way the VoiceHat drivers were wrapped), but we have little to no information on the default drivers in the RPI3 BSP, and we don't have any information on how to override the default drivers in Android Things without a distro rebuild.
It seems silly to write a native C++ low-level peripheral driver when so many audio processor companies already provide ALSA-ready ASoC drivers for use in the device source tree.
Best practices for writing your own audio driver for Android Things?
The VoiceHat driver is one example of how to do a userspace audio driver.
If you're using a custom audio board, you should be aware of the audio chip the board uses. Looking at that chip's datasheet, you should be able to use the same peripheral I/O (UART, GPIO, I2C, SPI) to configure the connection and read/write data over the I2S bus.
In the Google Assistant sample, the app registers the VoiceHat at the beginning of the activity and unregisters it at the end of the activity.

Send data from PC to Raspberry Pi

I'm wondering if I can do something like this:
Do some image processing with OpenCV on my PC, do some math, and send the results to a Raspberry Pi that runs a PID controller and drives the servos, in real time.
Would UART be the best connection?
In principle you can use any available means of communication between the PC and the Raspberry Pi (UART, Ethernet, etc.).
However, you have to be careful about the timing requirements of the system you are controlling and check whether the communication rate is suitable.
For instance, a 9600 baud UART is fine for temperature control, as the dynamics of such systems are usually slow. If your servos control an inverted pendulum, however, that communication speed might make the system impossible to control.
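As a rough sketch of the UART route with pyserial (the port names, baud rate and one-value-per-line message format are just illustrative assumptions; the two halves of course run on different machines):

    import serial

    # --- PC side: send the latest result from the OpenCV/maths step ---
    pc_port = serial.Serial("COM3", 115200, timeout=0.05)   # port name is an assumption
    error_x = 12.5                                          # e.g. pixel offset of the target
    pc_port.write("{:.2f}\n".format(error_x).encode("ascii"))

    # --- Raspberry Pi side: read the value and feed it to the PID loop ---
    pi_port = serial.Serial("/dev/serial0", 115200, timeout=0.05)
    line = pi_port.readline()
    if line:
        error = float(line)
        # pid.update(error)  # hypothetical PID object that drives the servos

At 115200 baud a short line like this takes well under a millisecond to transmit, so for servo control the bottleneck will usually be the image processing on the PC rather than the link itself.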

Laptop to desktop memory RAM adapter reliability

I have recently come across an adapter that would allow me to use laptop memory in my desktop. See the item below:
http://www.amazon.co.uk/Laptop-Desktop-Adapter-Connector-Converter/dp/B009N7XX4Q/ref=sr_1_1?ie=UTF8&qid=1382361582&sr=8-1&keywords=Laptop+to+desktop+memory
Both the desktop and the laptop use DDR3.
My question is: are these adapters reliable?
I have 8 GB available and I was wondering if they could be put to use in my gaming rig.
The desktop is an i7 machine generally used for gaming and some basic development.
Based on how it looks, the adapter should be reliable. There is not much to it: it simply adapts the smaller laptop (SO-DIMM) module to the larger desktop (DIMM) slot, much like an A-to-B USB cable adapts one connector to another.
What you should also consider is whether both RAM types run at the same frequency, and possible heat issues: the laptop module needs more cooling than a desktop-sized one, because the same current flows through a physically smaller stick. Then again, the adapter board will spread and disperse some of that heat, so unless you run really intensive RAM workloads you should be fine. Do check the operating frequency of both, though. For example, if the laptop module is faster than the maximum your system supports, you won't get the extra performance and it will simply run at the system bus frequency; if it is slower, the memory bus will drop to the module's frequency.
To check the physical fit, use standard features of the module as a reference: measure them in the product image, scale against a known dimension, and compare with your system. The contacts or the locking notches are good references since their dimensions are standard across modules, or you can use the module length...

How to connect a Nexys2 FPGA with a camera? Driver issue

My project is to capture images and process them to move a wheelchair accordingly. I am using a Nexys2 FPGA board for this purpose. The Nexys2 has a USB port and the camera is also a USB camera, but I don't have Verilog drivers that would make the Nexys2 and the camera communicate with each other. Please help me; I'll be very grateful.
Well, if you manage to write a driver for a USB camera in Verilog, you can sell that for a lot of money :)
Sarcasm aside, there is NO WAY you can access a USB camera from Verilog alone, unless you have a USB host implemented in your FPGA, a CPU controlling it, and a software driver for that camera.
There are alternatives to this: you can buy a camera that has an FPGA-friendly connector, like this one:
5 Mega Pixel Digital Camera Package
which comes with Verilog code that you can use in your project.
Sadly, the USB port on the Digilent Nexys2 board does not have host capabilities; it can only act as a USB slave. The USB connection on the board is used for powering the board and for configuring the FPGA and other on-board peripherals.
The newer Nexys3 board has a second USB port, but it has the same issue in that it can only act in slave mode; due to its configuration it can only be used for mouse and keyboard input.
