How to load two images onto a Zynq ZedBoard - OpenCV

I'm trying to stitch two images on an FPGA using a Xilinx ZedBoard (Zynq-7000). I couldn't find any material on how to load two images onto the board and then get an output with the images placed side by side. Any leads are greatly appreciated.

That board has ARM processors, typically running Linux, so at least you won't have any problem getting images onto the board, whether over Gigabit Ethernet, on an SD card, or on a memory stick in the USB OTG port. You really don't want to implement that channel yourself on the FPGA side; it would just be a waste of time.
To process images using the FPGA part (assuming that's the point of the task), you'll have your FPGA hardware connected to the ARM system via an AXI interface, and you memory-map it into the Linux application's address space.
You don't need to store entire images in FPGA block RAM; the pictures will probably be too large to fit into the available FPGA resources, and the FPGA fabric can access the Linux-side memory (the large DDR SDRAM) through the high-performance AXI ports anyway.
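To make the memory-mapping step concrete, here is a minimal sketch of the Linux side, assuming the PL block exposes an AXI-Lite register window at an address chosen in the Vivado address editor. The base address, register offsets, and their meanings below are placeholders, not the actual method from the answer:

```cpp
// Minimal sketch: map a PL peripheral's AXI-Lite register window into a
// Linux process via /dev/mem (needs root). The base address 0x43C00000 is
// just a placeholder -- use whatever address your block design assigns.
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    const off_t  pl_base = 0x43C00000;   // hypothetical AXI-Lite base address
    const size_t map_len = 0x1000;       // one 4 KiB page of registers

    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0) { perror("open /dev/mem"); return 1; }

    void* regs = mmap(nullptr, map_len, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, pl_base);
    if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    // The registers now appear as ordinary memory; offsets depend on your IP.
    volatile uint32_t* r = static_cast<volatile uint32_t*>(regs);
    std::printf("reg[0] = 0x%08x\n", r[0]);   // e.g. a status register
    r[1] = 0x1;                               // e.g. a hypothetical start bit

    munmap(regs, map_len);
    close(fd);
    return 0;
}
```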

Related

How to access Xilinx Axi DMA from Linux?

I'm a software developer but I'm a newbie to embedded software development.
I have a Zynq UltraScale board that has an AXI DMA in its hardware design, and I want to access this DMA from Linux.
I know I should use the DMA Engine framework to access DMA in Linux, and I found the following file, which is the Xilinx DMA driver, but I can't add it to my Qt project without errors: I get header-file-not-found errors.
drivers/dma/xilinx/xilinx_dma.c
I have scattered pieces of information about the DMA driver, the device tree, and the DMA Engine, but I know nothing about how to put these together to access the hardware DMA.
I built a PetaLinux project and added the DMA Engine and the DMA test client to its kernel.
I don't know whether adding the DMA Engine to the PetaLinux project is enough or whether I need a driver as well.
I also don't know whether adding the hardware specification (the .xsa and .bit files) to the PetaLinux project is enough or whether I should add a device tree entry to my Linux for the DMA to be detected.
I'm looking for a step-by-step tutorial on how to set up Linux and Qt Creator for accessing the DMA,
or at least a clear roadmap to my target.
Thank you in advance.
First of all, you are facing errors when adding xilinx_dma.c to the Qt project because this file is meant to be compiled as part of the kernel or as a kernel module.
Adding the DMA Engine to PetaLinux is not enough to work with the DMA from user space. The DMA Engine only provides a standardized API that lets different DMAs be integrated into the kernel. You need to add a client driver as well. Xilinx, as far as I know, provides a simple client driver called the DMA Proxy driver. It also includes some simple examples that show how you can access the DMA from user space. However, if your application needs high bandwidth, you probably need to consider other options.
There is also an open-source client driver for the AXI DMA which achieves higher bandwidth than the DMA Proxy driver. Its user-space API also allows you to register a callback function that is called whenever a transaction finishes.
The third option is to implement the driver in user space. This can be done by defining the DMA as a UIO device in the device tree and accessing its register map directly from user space. In this case, you need to allocate some contiguous memory blocks in kernel space to avoid complications with the MMU, which cannot be dealt with from user space.
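As a rough illustration of the third (UIO) option, the user-space side could look something like the sketch below. The device node name and the register offset are assumptions that depend on your device tree and on the AXI DMA register map described in Xilinx PG021; this is not a complete client:

```cpp
// Sketch of the UIO approach: the AXI DMA is exposed as /dev/uioN (the exact
// node depends on your device tree) and its register map is mmap()ed directly
// into user space. Offsets follow Xilinx PG021; treat this as an illustration.
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    int fd = open("/dev/uio0", O_RDWR);          // node name is an assumption
    if (fd < 0) { perror("open /dev/uio0"); return 1; }

    const size_t map_len = 0x10000;              // 64 KiB register window
    void* base = mmap(nullptr, map_len, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);        // map 0 of the UIO device
    if (base == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    volatile uint32_t* dma = static_cast<volatile uint32_t*>(base);
    std::printf("MM2S_DMASR = 0x%08x\n", dma[0x04 / 4]);  // status register

    // A real client also needs a physically contiguous buffer (e.g. from a
    // reserved-memory region) whose physical address is written into the
    // DMA's source/destination address registers before starting a transfer.

    munmap(base, map_len);
    close(fd);
    return 0;
}
```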

Comparison between USB and Mini PCIe Interfaces

I'm deciding between the MiniPCIe and USB accelerators for a home Linux CCTV project. The host has both USB3 and a MiniPCIe socket. The host's physical environment will range from an ambient 20C up to a potential 35C (during the summer).
I'm struggling to determine the pros and cons for each. I have gotten this far, although many are guesses:
USB:
Supports Windows and MacOS as well as Linux
Appears to have greater mindshare/use/community support on the Internet
External so can be placed to optimise heat dissipation
Heatsink
Two manual performance modes; the highest requires an ambient temperature of at most 25C
Can use up to 4.5 W (900 mA @ 5 V)
Mini PCIe:
Cheaper (25%)
Lower power consumption (1.4W for 416 fps)
Automatic thermal throttling via driver
Relies on host system for active cooling
Will maintain max operation at 85C
There are probably many I've missed. In particular, I can't determine whether there are any limitations on throughput/capacity with USB vs PCIe. If there is no difference, then I suspect the USB form factor is the better option, if only for the mindshare, although the power usage/heat generated may be a concern.
To whittle this down to an actual question: in what cases would the Mini PCIe interface be the preferred option over the USB one?
If you are looking for a plug-and-play solution, then I definitely suggest the USB Accelerator. Overall, as long as you meet the system requirements it'll always work (maybe with some modifications to the standard Linux configs, like adding your user to the plugdev group, ...). Then the software for the CCTV is all up to you :)
The PCIe variants sometimes need extra work, like adding extra kernel arguments and modules to keep the PCIe modules happy. If you are looking to launch a large product where volume is expected, then it is worth investigating, since it's cheaper and more compact. However, power usage is something you must consider, as the USB Accelerator can use up to 900 mA, so that could be a factor.
May I ask what host you are trying to attach the accelerators to?

Multiple USB cameras on Raspberry Pi using OpenCV

I would like to connect 2 USB webcams to a Raspberry Pi and be able to get at least 1920 x 1080 frames at 10 fps using OpenCV. Has anyone done this and knows whether it is possible? I am worried that the Pi has only one USB bus (USB 2.0) and might run into a USB bandwidth problem.
Currently I am using an Odroid; it has a USB 2 and a USB 3 bus, so I can connect one camera to each without any problem.
What I have found in the past is that no matter what bandwidth options you select through OpenCV, the cameras try to take up as much bandwidth as they want.
This has led to multiple cameras on a single USB port being a no-no.
That being said, this depends on your cameras and is very much worth testing. I regularly use Microsoft HD-3000 cameras and they do not like working on the same port, even on my beefy i7 laptop. This is because the limitation is in the USB host bandwidth, not in processing power etc.
I had a similar development process to yours in the past, and selected an Odroid XU4 because it had multiple USB hosts for the cameras. It also means you have a metric tonne more processing power available and, more importantly, you can buy and use the on-board chip if you want to create a custom electronics design.
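If you do want to test two cameras on one Pi, a quick OpenCV sketch along these lines will show whether both streams actually run at the requested settings. The device indices and the MJPEG request are assumptions about your setup; requesting MJPEG is just one common way to reduce per-frame USB bandwidth compared to raw YUYV:

```cpp
// Rough test sketch: open two UVC cameras with OpenCV and request MJPEG at
// 1920x1080 / 10 fps, then count which frames arrive. Whether both streams
// run depends on the cameras and the USB host bandwidth.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::VideoCapture cam0(0), cam1(1);   // device indices are assumptions
    for (cv::VideoCapture* cam : {&cam0, &cam1}) {
        cam->set(cv::CAP_PROP_FOURCC, cv::VideoWriter::fourcc('M','J','P','G'));
        cam->set(cv::CAP_PROP_FRAME_WIDTH, 1920);
        cam->set(cv::CAP_PROP_FRAME_HEIGHT, 1080);
        cam->set(cv::CAP_PROP_FPS, 10);
    }
    if (!cam0.isOpened() || !cam1.isOpened()) {
        std::fprintf(stderr, "could not open both cameras\n");
        return 1;
    }
    cv::Mat f0, f1;
    for (int i = 0; i < 100; ++i) {
        bool ok0 = cam0.read(f0);
        bool ok1 = cam1.read(f1);
        std::printf("frame %3d: cam0=%s cam1=%s\n",
                    i, ok0 ? "ok" : "drop", ok1 ? "ok" : "drop");
    }
    return 0;
}
```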

Sending JPEG image into AXI4 stream and reading it back?

I'm doing an image processing project on a ZedBoard Zynq evaluation board, using the FPGA built into it. I have written the image processing block in HLS and created the IP with both input and output as AXI4-Stream interfaces of width 8.
How do I read a JPEG image on my PC, send it as an AXI4 stream to this IP block, and get the output back to show on my PC screen?
Are there any existing IPs which accomplish this?
P.S. The FPGA board is connected to my PC via a JTAG cable, in case that's relevant.
The exchange of image data between the programmable logic (PL) and the processing system (PS) of the Zynq can be established using direct memory access (DMA) or video direct memory access (VDMA).
This functionality is provided by Xilinx as an IP core. The IP core implements receiving and transmitting image data on the PL side as an AXI stream.
On the PS side, the DMA can be made accessible by using the Linux UIO framework. For this purpose you have to modify the device tree node of the DMA IP core in the device tree of the ARM core. Once this is done, the DMA is available under /dev/ in the Linux system.
It can then be mapped into user space using mmap(). When configuring the DMA, a memory area in the PS's RAM has to be assigned to it. This memory area is used to implement a so-called stream buffer. The DMA core uses this stream buffer to read or write image data. At the same time, a Linux application can access this memory area, which allows the exchange of data between the PS and the PL.
A detailed description of the individual registers and the configuration procedure can be found in Xilinx's AXI DMA/VDMA product guide.
Once the image data is available in user space, the Ethernet connection can be used to send the image to the host PC. The JTAG connection is not the proper way to exchange image data between a host PC and the ZedBoard.
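As a very rough sketch of that PS-side flow, assuming two reserved DDR regions are used as the DMA's transmit and receive stream buffers (the file name, addresses, and grayscale format below are placeholders, and the actual DMA programming is omitted):

```cpp
// Sketch: decode a JPEG to raw pixels with OpenCV, copy it into the mmap()ed
// buffer that the (V)DMA streams into the PL, then read the processed frame
// back from a second buffer. Addresses must match your reserved DMA memory.
#include <opencv2/opencv.hpp>
#include <cstdint>
#include <cstring>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    cv::Mat img = cv::imread("input.jpg", cv::IMREAD_GRAYSCALE); // 8-bit pixels
    if (img.empty()) return 1;
    const size_t frame_bytes = img.total();

    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0) return 1;

    // Hypothetical reserved DDR regions used as DMA stream buffers.
    const off_t tx_phys = 0x18000000, rx_phys = 0x18100000;
    auto* tx = static_cast<uint8_t*>(mmap(nullptr, frame_bytes,
                   PROT_READ | PROT_WRITE, MAP_SHARED, fd, tx_phys));
    auto* rx = static_cast<uint8_t*>(mmap(nullptr, frame_bytes,
                   PROT_READ | PROT_WRITE, MAP_SHARED, fd, rx_phys));
    if (tx == MAP_FAILED || rx == MAP_FAILED) return 1;

    std::memcpy(tx, img.data, frame_bytes);   // hand the frame to the PL side
    // ... program the DMA registers and wait for the transfer to complete ...

    cv::Mat out(img.rows, img.cols, CV_8UC1, rx);  // view of the result buffer
    cv::imwrite("output.png", out);                // or send it over Ethernet
    return 0;
}
```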

Laptop to desktop memory RAM adapter reliability

I have recently come across an adapter that would allow me to use laptop memory in my desktop. See the item below:
http://www.amazon.co.uk/Laptop-Desktop-Adapter-Connector-Converter/dp/B009N7XX4Q/ref=sr_1_1?ie=UTF8&qid=1382361582&sr=8-1&keywords=Laptop+to+desktop+memory
Both the desktop and the laptop use DDR3.
My question is, are these adapters reliable?
I have 8 GB available and I was wondering if they could be put to use in my gaming rig.
The desktop is an i7 machine generally used for gaming and some basic development.
The adapter should be reliable, judging by how it looks. There is not much to it; it only extends the small laptop module to the larger desktop form factor. You can make an analogy with A-to-B USB cables.
What you should also consider is whether both RAM devices use the same frequency, and possible heat issues, since you will have to cool the laptop memory more than if it were desktop-sized; the same current goes through a smaller area compared to desktop RAM modules. Then again, the extension board handles and disperses some of the heat, so unless you are doing some really intensive RAM operations you should be fine. You should also check the working frequency of both: if the laptop module is faster than the maximum your computer supports, you won't get the extra performance and the module will run at the system bus frequency, whereas if it is slower, the system bus will run at the module's frequency.
To check the physical fit, use standard features of the module as a reference for scaling: measure them in the product image and scale against a known item, then compare with your system. Use the contacts or the locking grooves for the scaling, since their dimensions are standard across modules, or use the module length.
