I have compiled OpenCV in 32-bit mode on Mac OS X with QuickTime enabled and FFmpeg disabled.
My application is just a simple program that displays the webcam output in a window. It works perfectly with my built-in iSight camera, and the macam app works perfectly with my PS3 Eye, but I can't seem to get the two to work together.
The application works fine with the iSight:
capture = cvCaptureFromCAM(0);
but when I try to use the PS3 Eye:
capture = cvCaptureFromCAM(1);
every time I try to grab a frame from the camera:
frame = cvQueryFrame( capture );
I get this error message:
startNextBulkRead-ReadPipeAsync: Error: kIOUSBEndpointNotFound - Not found
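For reference, the whole capture loop is essentially this (a minimal sketch using the legacy C API, matching the snippets above; the window name and key handling are incidental):

#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <stdio.h>

int main() {
    CvCapture *capture = cvCaptureFromCAM(1);  // 1 = PS3 Eye here; 0 = built-in iSight
    if (!capture) {
        fprintf(stderr, "Could not open camera 1\n");
        return 1;
    }
    cvNamedWindow("webcam", CV_WINDOW_AUTOSIZE);
    for (;;) {
        IplImage *frame = cvQueryFrame(capture);  // this is the call that triggers the error
        if (!frame)
            break;  // grab failed
        cvShowImage("webcam", frame);
        if (cvWaitKey(10) == 27)  // Esc quits
            break;
    }
    cvReleaseCapture(&capture);
    cvDestroyWindow("webcam");
    return 0;
}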
I have been battling to get OpenCV working with my PS3 Eye webcam for over a week, but I just can't get it working. When I run the macam app, it captures from the camera perfectly, so this seems to be some sort of compatibility/configuration issue. Any help would be appreciated.
I'm facing the same problem (with Mac OS X 10.7.2 and the latest OpenCV). The behavior is totally erratic, but the error message appears far more often than a frame is successfully grabbed from the camera.
The situation for video capture on OS X is complicated, and we lack a decent cross-platform video capture library for real-time applications.
However, I advise you to go with openFrameworks, a collection of libraries used mainly for interactive art and prototyping. It provides a lot of tools you won't necessarily need if you just want to do some computer vision, but it bundles OpenCV and a decent video capture facility that, in my experiments, works correctly with the PS3 Eye cam (and at a pretty good framerate).
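A rough sketch of what capture looks like there (untested against your exact setup; device ID 1 is assumed to be the PS3 Eye, as in your OpenCV code):

#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;

    void setup() {
        grabber.setDeviceID(1);        // assumed index of the PS3 Eye
        grabber.initGrabber(640, 480); // request VGA capture
    }
    void update() {
        grabber.update();              // pull the next frame if one is available
    }
    void draw() {
        grabber.draw(0, 0);            // draw the most recent frame
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}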
I had the same problem and solved it. I did several things, so I don't know which one was the actual fix, but this is what I did: updated OS X to the latest version (10.7.3 in my case), installed the latest Xcode (4.3.2 here) along with the Apple dev tools (QuickTime support being the important part), and compiled OpenCV as 32-bit.
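For reference, a 32-bit QuickTime build can be configured with CMake along these lines (these are the standard OpenCV CMake flags of that era; run from a build directory inside the OpenCV checkout, and adjust the job count to taste):

cmake -D CMAKE_OSX_ARCHITECTURES=i386 -D WITH_QUICKTIME=ON -D WITH_FFMPEG=OFF ..
make -j4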
My SpriteKit playground book averaged 15 FPS on my MacBook Pro.
Do playgrounds run slower than iOS device simulations? If I run the same playground book on my iPad Pro, will the FPS be similarly limited? Will other apps open on my computer limit the speed of playgrounds?
EDIT:
Moving code such as subclasses and extensions into auxiliary code in the "Sources" folder of the playground book allows the simulation to run faster, because that code is compiled only once.
On the Mac, Xcode's "Playgrounds" are super useful for quick experiments but, due to their nature, terribly slow for "real" tasks.
If your code is more than a few pages long, and/or involves working with the UI, as you do with SpriteKit, the Playground may become really slow, sometimes even unresponsive.
"Playgrounds" are part of Xcode and run on top of the iOS simulator - that's how they display graphics and UI in the "Assitant Editor". The iOS simulator is not really known to be fast either.
On the other hand, "Swift Playgrounds" on iOS is a completely different app, even if it shares a lot with its Mac cousin.
Most importantly, it runs in iOS on the real device, with real hardware processing rather than simulation, which makes it ideal for SpriteKit, as Apple themselves often show in their demos.
I would therefore say that your code should indeed run faster/better/properly on the iPad version.
Of course I can't know for sure, since I haven't seen your code - you will probably be the one telling us later whether the iPad version made a difference.
I've hit a roadblock using GPUImage. I'm trying to apply a filter (SepiaFilter or OpacityFilter) to a prerecorded video. What I expect to see is the video played back with the filter applied. I followed the SimpleFileVideoFilter example for my code. What I ended up with is a video that QuickTime can't play (m4v extension) and a live preview of the rendering that is all skewed. I thought it was my code at first, so I ran the example app from the examples directory, and lo and behold I got the same issue. Is the library broken? I just refreshed from master on GitHub.
Thanks!
Here's a sample of the generated video output:
http://youtu.be/SDb9GfVf9Lc
No matter which filter is applied, the resulting videos all look similar (all skewed).
@Brad Larson (I hope you see this message), do you know what I might be doing wrong? I am using the latest Xcode and the latest GPUImage source. I also tried the latest from CocoaPods as well; both end up the same.
I assume you're trying to run this example via the Simulator. Movie playback in the Simulator has been broken for as long as I can remember. You need to run this on an actual device to get movie playback to work.
Unfortunately, one of the recent pull requests that I brought in appears to have introduced some crashing bugs even there, and I may need to revert those changes and figure out what went wrong. Even then, it's not an iOS version issue; it's a bug in a recent code addition. I haven't had the time to dig into it and fix it, though.
I downloaded this project to give it a try (http://www.hatzlaha.co.il/150842/Lucas-Kanade-Detection-for-the-iPhone). It also has a released version on the App Store.
When I downloaded the source and compiled it, I got compilation errors. I changed the compiler to LLVM GCC and it instantly compiled without any errors or warnings.
Further, here is what I did:
Downloaded the Lucas Kanade app onto my personal iPhone. It runs as expected (i.e. shows tracked points and the video output).
Deployed the above compiled app onto a company iPad. There it shows the tracked points (which means the application is getting the video frames) but is not able to display the video output.
General app flow:
Grab a frame from the camera.
Process the frame (track points).
Output the frame, so the screen looks like it's showing the camera feed, as expected (sketched in code after this list).
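In OpenCV terms, the processing step is pyramidal Lucas-Kanade optical flow; a minimal sketch of the per-frame loop (my reconstruction using the OpenCV 2.x-era API, not the project's actual code) would look like:

#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);                // 1. grab frames from the camera
    cv::Mat frame, gray, prevGray;
    std::vector<cv::Point2f> prevPts;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, CV_BGR2GRAY);
        if (prevPts.empty()) {
            // (Re)seed the points to track.
            cv::goodFeaturesToTrack(gray, prevPts, 100, 0.01, 10);
        } else {
            // 2. track last frame's points into the current frame
            std::vector<cv::Point2f> pts, kept;
            std::vector<uchar> status;
            std::vector<float> err;
            cv::calcOpticalFlowPyrLK(prevGray, gray, prevPts, pts, status, err);
            for (size_t i = 0; i < pts.size(); ++i) {
                if (status[i]) {
                    cv::circle(frame, pts[i], 3, cv::Scalar(0, 255, 0), -1);
                    kept.push_back(pts[i]);
                }
            }
            prevPts = kept;
        }
        gray.copyTo(prevGray);
        cv::imshow("tracking", frame);      // 3. output the annotated camera feed
        if (cv::waitKey(1) == 27) break;    // Esc quits
    }
    return 0;
}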
Device details:
iPhone: OS 5.1.1 (9B206); iPhone 4;
iPad: OS 5.1 (9B176); iPad 2;
Questions
Is the video output problem occurring because of the difference in OS versions, or because of the GCC compiler being used?
If it is a compiler problem, would it be appropriate to post the compile errors here for resolution, or do I need to start a separate question?
I know this info might not be enough, since there are a lot of unknowns, but trust me, getting into the details of the app would make for a really exhaustive problem description. Let me know what more info is needed to narrow down the solution and I'll update it here.
As you already said, the info is not enough, so I can only guess. Point-tracking projects generally use their own driver for camera access; that might be why it does not work with the LLVM compiler.
Just look in the project for any files specific to the iPhone; they are mostly .dat files. You may also find preprocessor macros in the project along the lines of #ifdef TARGET_OS_IPHONE (see the example below). In that case you can contact the producer and request a driver for the iPad.
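For example, platform-specific code is typically guarded like this (TARGET_OS_IPHONE comes from Apple's TargetConditionals.h; whether this project uses that exact macro is a guess):

#include <TargetConditionals.h>

#if TARGET_OS_IPHONE
// iOS-specific camera/driver code would live here
// (note: this macro covers the iPad as well as the iPhone)
#else
// simulator or desktop fallback
#endif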
I did not download and try the project myself, so this is just a guess.
My Corona app correctly shows my PNG images when it's running in the Android simulator on the PC. When I take the same code and compile it on the Mac, my button images work, but other PNGs don't. It's very simple code I'm working with at the moment, but before I post it, I wondered if there is something about PNGs on the Mac that I'm not aware of.
A path is a path, and if it works on PC it should just work on Mac. Right?
Not necessarily. I don't know about Corona in particular, but in general Windows is case-insensitive, while Linux is case-sensitive (and a Mac can be either, depending on how the filesystem was formatted). It's possible that an image called Foo.PNG referred to as 'foo.png' works on Windows but not on the Mac. It depends on how Corona is built.
So my advice is: check that you are using the same case in your code and in your filenames and extensions (I recommend all lowercase, but that is a personal preference).
If this doesn't solve your problem, it would be the time to show us some code.
It turns out that indexed PNGs aren't supported in Corona, which is what I had. They nevertheless worked in the simulator on the PC; when we brought the code over to the Mac they didn't work, as indeed they shouldn't. So we simply opened the images, re-saved them as RGB PNGs, and voila, all the images worked.
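If there are many images to convert, a batch tool can do the same re-save. For example, with ImageMagick (assuming you have it installed), the PNG24: prefix forces 8-bit RGB output:

convert indexed.png PNG24:rgb.png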
I'm working on a BlackBerry app which will be used in cafes and restaurants, and one of its features is QR code scanning.
Is there any way to make the camera autofocus the QR code?
Searching around, I found FocusControl, which looks like what I'm looking for. Unfortunately, it's only available since OS 5.0.
I wonder how to achieve the same thing on OS 4.5, 4.6, and 4.7.
Any suggestions?
You can't.
In OS versions prior to 5.0, you can launch the native camera app and listen for newly created image files (FileSystemJournalListener). This way, if the BlackBerry has autofocus, the image will have more definition.
However, there's no way to do this on every BlackBerry, unless you apply some image-processing algorithm of your own after taking the picture.