How to use the BeagleBone Black's PRUs?

I would like to use the PRUs of my BeagleBone Black. I've been following several tutorials from the internet without any significant success. Most documents are outdated because the bone_capemgr is no longer supported (at least from what I understood). So how can I use the PRUs in combination with one of the Linux systems provided by beagleboard.org?

A programmable real-time unit (PRU) is a fast (200-MHz, 32-bit) processor with single-cycle I/O access to a number of pins and full access to the internal memory and peripherals of the BeagleBone (http://beagleboard.org/pru).
There are many tutorials online for PRU coding on newer BeagleBone images.
I would suggest getting familiar with the PRU libraries and then trying some simple, recent blink code, or following an example of remoteproc usage, as in the sketch below.
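As a concrete starting point, here is a minimal sketch in C of driving the remoteproc interface that replaced the old bone_capemgr flow on recent kernels. The sysfs paths and the assumption that remoteproc1 maps to PRU0 depend on your kernel version and device tree, so check /sys/class/remoteproc/ on your image first; the firmware itself (e.g. a compiled blink program) must already be installed as /lib/firmware/am335x-pru0-fw.

#include <stdio.h>

/* Write a short string to a sysfs file; returns 0 on success. */
static int sysfs_write(const char *path, const char *value)
{
    FILE *f = fopen(path, "w");
    if (!f) { perror(path); return -1; }
    fputs(value, f);
    return fclose(f) == 0 ? 0 : -1;
}

int main(void)
{
    /* Assumption: remoteproc1 is PRU0 on this image. */
    if (sysfs_write("/sys/class/remoteproc/remoteproc1/firmware",
                    "am335x-pru0-fw") != 0)
        return 1;
    /* Writing "stop" to the same file halts the PRU again. */
    return sysfs_write("/sys/class/remoteproc/remoteproc1/state", "start") != 0;
}

The same two writes can of course be done from a shell with echo; the point is just the firmware-then-state sequence.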

Related

Formatting an eMMC to SD format

I've been working with a Micron BGA eMMC chip and prototyping a communication scheme with the eMMC chip inside an adapter board that connects to the GPIO pins of a TI microcontroller.
I've essentially created a communication scheme written in C code to walk through the initial handshake and initialization steps to get the eMMC to a Data Write/Read stage where I can write some small amounts of bytes to a part of the sector memory and read back the pattern I've written.
My next task is to format the eMMC with a filesystem such as FAT32, which is common on SD cards.
Does anyone know of any useful software or methods I could use to achieve this?
I've also seen that it's possible to format the eMMC from a Linux setup, but I have little experience with Linux.
Any insight from anyone with past experience on the topic would be greatly appreciated!
If your system runs Linux, that is the best option; it is easy to format an eMMC from Linux:
# mkfs.vfat -F 32 /dev/mmcblk1
(More commonly you would create a partition table first and then format a partition such as /dev/mmcblk1p1.)
Is your TI microcontroller running Linux? If not, it might be difficult to connect your BGA eMMC chip to a Linux system to format it.
Your second-best option is to use a library that already supports it, maybe something like FatFs (http://elm-chan.org/fsw/ff/00index_e.html) or thinfat32 (https://github.com/ryansturmer/thinfat32). There are several options; I have not used any of them. To use these filesystem layers, you have to implement the lower-level disk I/O API they expect, as in the sketch below.
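For example, with elm-chan's FatFs the lower-level API is the diskio layer. Here is a rough sketch of how the eMMC routines you already wrote might plug in; emmc_read_blocks, emmc_write_blocks, and EMMC_SECTOR_COUNT are hypothetical stand-ins for your code, and the exact diskio types (LBA_t vs. DWORD) vary between FatFs releases:

#include "ff.h"      /* FatFs public API */
#include "diskio.h"  /* FatFs low-level disk interface */

/* Hypothetical hooks into the eMMC driver you already have. */
extern int emmc_read_blocks(unsigned char *buf, unsigned long sector, unsigned count);
extern int emmc_write_blocks(const unsigned char *buf, unsigned long sector, unsigned count);
#define EMMC_SECTOR_COUNT 7634944UL  /* hypothetical: total 512-byte sectors */

DSTATUS disk_initialize(BYTE pdrv)
{
    (void)pdrv;
    return 0;  /* the eMMC is already up after your handshake code runs */
}

DSTATUS disk_status(BYTE pdrv)
{
    (void)pdrv;
    return 0;
}

DRESULT disk_read(BYTE pdrv, BYTE *buff, LBA_t sector, UINT count)
{
    (void)pdrv;
    return emmc_read_blocks(buff, sector, count) == 0 ? RES_OK : RES_ERROR;
}

DRESULT disk_write(BYTE pdrv, const BYTE *buff, LBA_t sector, UINT count)
{
    (void)pdrv;
    return emmc_write_blocks(buff, sector, count) == 0 ? RES_OK : RES_ERROR;
}

DRESULT disk_ioctl(BYTE pdrv, BYTE cmd, void *buff)
{
    (void)pdrv;
    switch (cmd) {
    case CTRL_SYNC:        return RES_OK;
    case GET_SECTOR_COUNT: *(LBA_t *)buff = EMMC_SECTOR_COUNT; return RES_OK;
    case GET_BLOCK_SIZE:   *(DWORD *)buff = 1;                 return RES_OK;
    default:               return RES_PARERR;
    }
}

With these in place, f_mkfs() can build the FAT volume directly on the target, and f_mount()/f_open() give you file-level access; the diskio template that ships with FatFs shows the exact signatures your release expects.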

Can I use eye-detecting OpenCV code on a microcontroller?

I want to do a project which uses eye tracking. Is it possible to port OpenCV code to a microcontroller?
I am new to OpenCV as well as microcontrollers, so can anyone tell me if it is possible to make code that works like this video:
http://www.youtube.com/watch?feature=endscreen&v=eBtpKAja-m0&NR=1
Q: Can I use eye-detecting OpenCV code on a microcontroller?
A: Yes, you can.
Q: Is it possible to port OpenCV code to a microcontroller?
A: OpenCV already runs on Linux and Android. The easiest approach, therefore, is to get hold of an embedded ARM device; there is a lot of help available for the 'OpenCV-ARM' combination.
BeagleBoard and Raspberry Pi are the cheapest embedded ARM devices available, at less than $150. Sometimes they come preloaded with a Linux boot image and OpenCV 2.0, so it is easy to run the executable that you created on your desktop system.
Be aware of the speed of the processor: if your algorithm is computationally intensive, you won't be quite satisfied with the output obtained on low-end embedded devices.
If an embedded ARM Linux board fits your definition of a microcontroller, then there is nothing to port.
http://www.google.com/search?q=opencv+arm
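To give a feel for the code involved, here is a minimal eye-detection sketch against OpenCV's legacy 2.x C API; the cascade path and camera index 0 are assumptions for your setup, and haarcascade_eye.xml ships in OpenCV's data/haarcascades directory:

#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <stdio.h>

int main(void)
{
    CvHaarClassifierCascade *cascade =
        (CvHaarClassifierCascade *)cvLoad("haarcascade_eye.xml", 0, 0, 0);
    CvMemStorage *storage = cvCreateMemStorage(0);
    CvCapture *capture = cvCaptureFromCAM(0);
    if (!cascade || !storage || !capture) {
        fprintf(stderr, "initialisation failed\n");
        return 1;
    }
    for (;;) {
        IplImage *frame = cvQueryFrame(capture);  /* owned by capture: do not release */
        if (!frame) break;
        cvClearMemStorage(storage);
        CvSeq *eyes = cvHaarDetectObjects(frame, cascade, storage,
                                          1.1, 3, CV_HAAR_DO_CANNY_PRUNING,
                                          cvSize(20, 20), cvSize(0, 0));
        for (int i = 0; i < (eyes ? eyes->total : 0); i++) {
            CvRect *r = (CvRect *)cvGetSeqElem(eyes, i);
            cvRectangle(frame, cvPoint(r->x, r->y),
                        cvPoint(r->x + r->width, r->y + r->height),
                        CV_RGB(0, 255, 0), 2, 8, 0);
        }
        cvShowImage("eyes", frame);
        if (cvWaitKey(10) >= 0) break;  /* any key quits */
    }
    cvReleaseCapture(&capture);
    return 0;
}

On a low-end ARM board, shrinking the frame with cvResize before detection is the usual first step to get the frame rate up.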

OpenCV with 2 cameras VC++

I am importing source code for stereo vision. The following code from the author works; it takes two camera sources. I currently have two different cameras and I receive images from both; each works on its own, but the program crashes at capture2. The interesting part is that if I change the order of the webcams (unplugging them and inverting the order), the first camera becomes the second one. Why doesn't it work? I also tested with Windows XP SP3 and Windows 7 x64: same problem.
//---------Starting WebCam----------
capture1 = cvCaptureFromCAM(1);
assert(capture1 != NULL);
cvWaitKey(100);
capture2 = cvCaptureFromCAM(2);
assert(capture2 != NULL);
Also, if I use -1 for the parameter, it just gives me the first camera (every time).
Is there any method to capture two cameras using the cvCaptureFromCAM function?
Firstly, the cameras are generally numbered from 0 - is that perhaps the problem?
Secondly, DirectShow with multiple USB webcams is notoriously unreliable on Windows. Sometimes it will work with two identical cameras, sometimes only if they are different.
You can also try a delay between initialising the cameras; sometimes one will lock the capture stream until it is sending data, preventing the other from being detected.
Often the drivers assume they are the only camera and make incorrect calls that lock up the entire capture graph. This isn't helped by it being extremely complicated to write correct drivers and DirectShow filters on Windows. A sketch combining these workarounds follows below.
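A minimal sketch combining these workarounds (0-based indices plus a delay between the two initialisations), in the same legacy C API as the question; the 500 ms figure is an arbitrary value, not something prescribed by OpenCV:

#include <assert.h>
#include <opencv/highgui.h>

int main(void)
{
    /* Camera indices start at 0, not 1. */
    CvCapture *capture1 = cvCaptureFromCAM(0);
    assert(capture1 != NULL);

    /* Give the first stream time to lock before opening the second. */
    cvWaitKey(500);

    CvCapture *capture2 = cvCaptureFromCAM(1);
    assert(capture2 != NULL);

    /* ... grab frames with cvQueryFrame(capture1) / cvQueryFrame(capture2) ... */

    cvReleaseCapture(&capture2);
    cvReleaseCapture(&capture1);
    return 0;
}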
Some motherboards cannot work with certain USB 2.0 cameras: one USB 2.0 camera can take 40-60% of a USB controller's bandwidth. The solution is to connect the second USB 2.0 camera through a separate PCI-to-USB controller.
Get two PS3 Eyes, around EUR 10 each, and the free codelaboratories.com SDK; this gets you support for up to two cameras using C, C#, Java, and AS3, including examples etc. You also get FIXED frame rates of up to 75 fps @ 640*480. Their free driver-only version 5.1.1.0177 provides a decent DirectShow component, but for a single camera only.
Comment for the rest: multi-cam DirectShow drivers should be a default for any manufacturer; not providing them is a direct failure to implement THE VERY BASIC PURPOSE AND FEATURE OF USB as an interface. It is also VERY EASY to implement, compared to implementing the driver itself for a particular sensor/chipset.
Alternatives that are confirmed to work in identical pairs (via DirectShow):
Microsoft Lifecam HD Cinema (use general UVC driver if you can, less limited fps)
Logitech Webcam Pro 9000 (not to be confused with QuickCam Pro 9000, which DOES NOT work)
Creative VF0220
Creative VF0330
Canyon WCAMN-1N
If you're serious about your work, get a pair of machine vision cameras for real PERFORMANCE. Among the cheapest on the market, with German engineering quality: CCD, CMOS, mono, colour, GigE (Ethernet), USB, FireWire, and an excellent range of dedicated drivers:
http://www.theimagingsource.com

What's the difference between AMD's APP SDK and (AMD) ATI's Stream Technology?

I'm working on a project that will use an AMD GPU for processing data. I noticed AMD has two different SDKs available on their website for using the GPU: ATI Stream Technology, and OpenCL™ with the AMD APP SDK. It looks like both support OpenCL, but I haven't found anything on the site explicitly pointing out why one would use one over the other. What's the difference between these two?
The AMD APP SDK is here: http://developer.amd.com/sdks/AMDAPPSDK/Pages/default.aspx
The website should also answer your question about the difference between Stream and APP:
AMD Accelerated Parallel Processing (APP) SDK (formerly ATI Stream)
It used to be called the AMD Stream SDK; they probably renamed it after adding support for more than FireStream hardware (namely, OpenCL).
Stream is the higher-level, AMD-specific project (hardware and software) that includes OpenCL as its current software implementation. Stream originally used the "Brook" language but switched to OpenCL in 2011. Since then OpenCL has become more popular (because it is a cross-platform standard that has been particularly well supported by Apple), and these days AMD doesn't seem to mention Stream much. You can see this in a link like http://www.amd.com/us/products/technologies/stream-technology/opencl/pages/opencl.aspx where OpenCL is a "child" of Stream (or in the menu on the left of that page, where the higher-level group is Stream; the other children are related to hardware).
In short, you want OpenCL. And despite the confusing mess that is AMD's site, their OpenCL implementation is pretty solid.
Hmmm, re-reading your question: you seem to say there are two separate SDKs. Did you actually drill down to two different packages? My understanding is that OpenCL is the Stream SDK. If you have found two different SDKs (that are both current), can you link to them?
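If you want to see what the installed runtime actually calls itself (useful given the Stream/APP renaming), here is a small sketch that enumerates OpenCL platforms with standard OpenCL 1.x host calls; on Apple systems the header is <OpenCL/opencl.h> instead:

#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint count = 0;

    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        fprintf(stderr, "no OpenCL platform found\n");
        return 1;
    }
    for (cl_uint i = 0; i < count; i++) {
        char name[256], vendor[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof name, name, NULL);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VENDOR, sizeof vendor, vendor, NULL);
        printf("platform %u: %s (%s)\n", i, name, vendor);
    }
    return 0;
}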

Can I use OpenCL in an application that I distribute to non-developer machines?

I recently started to learn how to use OpenCL to speed up parts of my code. So far the speed gain is impressive: in one case the code ran up to 50x faster than on the CPU. However, I wonder if I can start using this code in a production environment, because the first time I tried to run the example code, nothing worked. I was able to make it run by downloading the driver from the Nvidia OpenCL SDK download page (I have a GeForce GTX 260). It gave me a blue screen during installation, but after that I was able to run the example program and create my own code.
Does the fact that it didn't work "out of the box" for me mean that the mainstream drivers do not yet support it, despite the driver download page specifically saying that they do? What about ATI support? Will everyone have to download the special driver that gave me a blue screen on install?
In short, is OpenCL ready for production code?
If someone can give me some details, I'd like to know. Has anyone been able to run a simple program on a number of different devices without installing anything SDK-related?
You may find an accurate answer on the OpenCL forums on the Khronos Group message boards. The OpenCL work group hangs out there regularly.
"Has anyone been able to run a simple program on a number of different devices without installing anything SDK-related?"
Nope. For instance, on ATI GPUs, end-users need to install the ATI Stream SDK in order to run OpenCL code (just having an up-to-date graphics driver is not sufficient).
You may want to consider trying DirectCompute (Microsoft's version of GPU programming) or doing your OpenCL work on a Snow Leopard Mac. Those are the two ways (that I know of) that you can deliver a GPU programming solution to another user without any driver or other installation hassle.
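Whichever route you take, it is worth probing for a usable runtime at startup and falling back to your CPU path rather than crashing on machines without the driver. Here is a minimal sketch of that check with plain OpenCL 1.x calls (note that on Windows the OpenCL library itself must still be loadable, so in practice you may want to load it dynamically rather than link against it):

#include <stdio.h>
#include <CL/cl.h>

/* Returns 1 if an OpenCL runtime with a GPU device is available. */
static int have_opencl_gpu(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_uint n = 0;

    if (clGetPlatformIDs(1, &platform, &n) != CL_SUCCESS || n == 0)
        return 0;  /* no runtime/ICD installed at all */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, &n) != CL_SUCCESS || n == 0)
        return 0;  /* runtime present, but no usable GPU */
    return 1;
}

int main(void)
{
    if (have_opencl_gpu())
        printf("using the OpenCL path\n");        /* run the fast kernels */
    else
        printf("falling back to the CPU path\n"); /* keep the original code around */
    return 0;
}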
