Cannot use cvSnakeImage in OpenCV 2.4 - opencv

I need to use the cvSnakeImage function for active contours, but apparently it is not available in OpenCV 2.4.
I have OpenCV 2.4 installed on my PC. I have already linked my VS2010 project to opencv_legacy242d.lib, but cvSnakeImage is still undefined.

Don't forget to:
#include <opencv2/legacy/legacy.hpp>
and make sure you are linking with either opencv_legacy242.lib or opencv_legacy242d.lib.
I don't have a Windows box at my disposal, but this worked on Mac OS X.
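For reference, here is a minimal sketch of how cvSnakeImage can be called once the legacy header is in place (the image name, contour initialisation, and weight values are placeholders of mine, not taken from the question):

#include <opencv2/legacy/legacy.hpp>   // cvSnakeImage lives here in 2.4
#include <opencv2/highgui/highgui.hpp>
#include <cmath>

int main()
{
    // "input.png" is a placeholder; use any single-channel image of your own.
    IplImage* gray = cvLoadImage("input.png", CV_LOAD_IMAGE_GRAYSCALE);
    if (!gray) return -1;

    // Initial contour: a rough circle around the centre of the image.
    const int n = 40;
    CvPoint pts[n];
    for (int i = 0; i < n; ++i)
    {
        double a = 2.0 * CV_PI * i / n;
        pts[i] = cvPoint(gray->width  / 2 + (int)(gray->width  / 3 * std::cos(a)),
                         gray->height / 2 + (int)(gray->height / 3 * std::sin(a)));
    }

    // Energy weights and termination criteria (the values here are arbitrary).
    float alpha = 0.1f, beta = 0.4f, gamma = 0.5f;
    cvSnakeImage(gray, pts, n, &alpha, &beta, &gamma, CV_VALUE,
                 cvSize(11, 11), cvTermCriteria(CV_TERMCRIT_ITER, 100, 0.1), 1);

    cvReleaseImage(&gray);
    return 0;
}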

Related

Does OpenCV need to be compiled before using it?

I'm trying to install OpenCV on Windows. Here is how I installed it:
Downloaded OpenCV 2.4.2.exe from SourceForge.
Unarchived it.
Opened Eclipse CDT.
Added C:/opencv/include/opencv to "Includes".
Added opencv_highgui, opencv_core, opencv_ml... to "Libraries".
Created a small project and compiled it.
The compiler complained about "opencv2/core/core_c.h: No such file or directory"...
I remember that when I installed OpenCV on Ubuntu, I did have to compile it (it took quite a while). Do I have to do the same thing on Windows, or is something else causing this error?
Thanks.
You should add the correct directories to your includes.
Since you only added C:/opencv/include/opencv, there is no way for the compiler to find C:/opencv/include/opencv2/core/core_c.h.
I believe you should add C:/opencv/include/ to the include directories as well.
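Once both the include path and the libraries are set up, a tiny test program like this one (just an illustration, not from the original post) should build and run:

#include <opencv2/core/core.hpp>       // found via C:/opencv/include
#include <opencv2/highgui/highgui.hpp>

int main()
{
    // Draw something trivial and show it, just to prove the setup works.
    cv::Mat img = cv::Mat::zeros(200, 200, CV_8UC3);
    cv::circle(img, cv::Point(100, 100), 50, cv::Scalar(0, 255, 0), 2);
    cv::imshow("test", img);
    cv::waitKey(0);
    return 0;
}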

Mono Android Monodroid Native Library DllNotFoundException

I am attempting to get a native C/C++ library working with Monodroid in the emulator, using DllImport. I am developing mainly in Windows/Visual Studio 2010.
I have built a native C/C++ library (ZeroMQ) using the Android NDK tools for both the x86 and armeabi platforms, using an Ubuntu virtual machine. Is this correct - x86 for the emulator and armeabi for the real device? (This is certainly the case on the iPhone/MonoTouch, though there it is a statically linked libzmq.a file and DllImport __Internal.)
I have added the x86 version of libzmq.so to my MonoDroid project under the directory structure lib\x86\libzmq.so
When I first attempted to build/deploy to the emulator, I got an error 'cannot determine abi type', so I added x86 to the <AndroidNativeLibrary Include="lib\x86\libzmq.so"> project item group. This then deployed.
I have a DllImport for the function to call: [DllImport("libzmq")].
I've tried libzmq.so, lib/x86/libzmq, lib/x86/libzmq.so, etc., but when I call the DllImported method (running in the emulator), I always get a DllNotFoundException.
Can anyone give me some direction?
EDIT: After reading another support answer stating that the emulator uses armeabi .so libraries, I removed the x86/libzmq.so and added my lib/armeabi/libzmq.so as an AndroidNativeLibrary instead. I also removed the Abi entry from the project file, and indeed the project built and deployed fine.
However, I still get a DllNotFoundException when I try to call a DllImported function. Any ideas?
Many thanks
I don't have any immediate ideas why it isn't working for you; [DllImport("zmq")] should work.
The SanityTests sample exercises the DllImport attribute.
The DllImport: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/Hello.cs#L240
The Android.mk to build libfoo.so: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/jni/Android.mk
Building libfoo.so by calling ndk-build: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/SanityTests.csproj#L82
Including libfoo.so into the .apk: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/SanityTests.csproj#L96
(This is a repeat of my reply to your email in the ZeroMQ mailing list).
This error could be caused by using an incorrect platform target in your Mono project. I'm not sure if Mono/MonoDevelop supports this, but you probably need to create an ARM platform target, as opposed to x86/x64/Any CPU.
You would experience the same errors if you tried to do P/Invoke interop between an x64-targeted assembly and an x86 native library, or vice versa.
If you're using the clrzmq bindings for bridging Mono and libzmq, you will need to create a new platform target for ARM and rebuild the bindings. The solution only defines x86 and x64 platform targets currently.
OK, I finally got it working! For whatever reason, ZeroMQ doesn't build correctly with the current/latest NDK r7. The instructions at http://www.zeromq.org/build:android use NDK r6. I downloaded an old version, NDK r5b (it was the easiest old version to find a download for), and rebuilt ZeroMQ with it. Result: it now works on the emulator and the phone!

OpenCV 2.3 in Embarcadero C++ Builder

When compiling an OpenCV 2.3 project in Builder, I get multiple errors starting with "_fm_atan2l is not a member of 'std'" and continuing with other math-related errors of that form. I also get "Multiple declaration of '_Ctraits::_Isnan(double)'" and other similar errors. This happens after I simply include the OpenCV header files, and thus seems unrelated to anything I have done in the application itself.
The only file I have included so far is "cv.h" from OpenCV's include directory. Am I doing something wrong already, or is there maybe something else I have to set up first?
You can download a simple project combining OpenCV 2.3.2 and C++ Builder XE2 from my site:
http://www.compvision.ru/forum/index.php?showtopic=763
The archive contains fixed headers for Builder and a lib converter.
There are also .lib files in the archive, but it is better to make them yourself from the original .lib files in your OpenCV distribution using the LibConverter.exe utility.
And there is one strange thing: some DLL files need to be renamed to something like .dl or .d. The compiled program will prompt you about it.
You can fix the OpenCV atan2 issue with bcc32 by including fastmath in the std namespace (for more info see: https://forums.embarcadero.com/message.jspa?messageID=363384), but more issues show up after that...
So far I have been unable to build OpenCV 2.3.1 with C++ Builder XE2 :(
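As a rough illustration of that suggestion (only a sketch of the idea described in the linked thread, not a verified fix), the workaround amounts to something like:

// Sketch only: pull the fastmath overload into std before any OpenCV header.
// The exact symbols bcc32 complains about may differ on your setup.
#include <fastmath.h>
namespace std { using ::_fm_atan2l; }
#include <cv.h>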

Where can I find the byteswap macros, e.g. Endian32_Swap(), in iOS 4.3?

I'm porting a project from Mac OS to iPhone OS. I can't seem to find Endian32_Swap() etc. anywhere.
Thanks,
oli
iOS doesn't have byteswap.h, Endian32_Swap(), etc.
I'm using
#include <libkern/OSByteOrder.h>
OSSwapInt32(x)
instead.
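For example, a quick sketch of how the replacement is used (the helper names are made up for illustration):

#include <libkern/OSByteOrder.h>
#include <stdint.h>

uint32_t alwaysSwap32(uint32_t value)
{
    return OSSwapInt32(value);               // unconditional 32-bit byte swap
}

uint32_t toBigEndian32(uint32_t hostValue)
{
    return OSSwapHostToBigInt32(hostValue);  // swaps only if the host is little-endian
}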

iPhone and OpenCV

I know that OpenCV was ported to Mac OS X; however, I did not find any info about a port to the iPhone.
I am not a Mac developer, so I do not know whether a Mac OS X port is enough for the iPhone.
Does anyone know more about this?
OpenCV does indeed work on the iPhone. Use the configure script here to compile the library: cross compiling for iphone
You just have to cross-compile it the same way you do your apps.
OpenCV now (since 2012) has an official port for the iPhone (iOS).
You can find all of OpenCV's releases here.
And find install instructions here:
Tutorials & introduction for the new version 3.x
The latest build script from Computer Vision Talks works great for Xcode 4 and iOS 4.3. I have tried the script myself and it is just awesome!
Here is OpenCV 2.0 on the iPhone: iphone opencv test
OpenCV is now available as a framework for iOS. Just drag and drop it into your project. It supports video capture too. See the article and get the example project here: opencv framework for ios
For the sake of transparency, I wrote this article and it is hosted on my company's website.
I haven't tried using OpenCV specifically, but I do dev for the iPhone and can say that most libraries I've tried that work on OS X DO NOT work on the iPhone out of the box. Some of them just needed a little tweaking/configuration to be done and then it was fine on the iPhone, but the reality is that the phone is missing quite a few backend components that OS X supports. Most complex libraries (OpenCV sounds like one of them) aren't going to work without a major effort - particularly since OpenCV sounds like it depends on several other external libraries as well...so those would have to be ported too.
All you need to do is generate an Xcode project for the OpenCV project using cmake or the cmake GUI tool.
Remember to set the option to generate an Xcode project instead of the default makefiles.
Open the generated project, change the base SDK to the iPhone SDK, and hit build!
Since OpenCV does not support iOS right now (but they have announced iPhone support for version 2.2), the highgui library won't compile, so if you need camera access you have to write it yourself.
Anyway, the other libraries should compile and work on the device. (Works for me.)
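For reference, generating the Xcode project from the command line looks roughly like this (the paths are placeholders, and details may vary with your CMake version):

cd /path/to/opencv            # placeholder path to the OpenCV sources
mkdir build-xcode && cd build-xcode
cmake -G Xcode ..             # "Xcode" generator instead of the default makefiles
open OpenCV.xcodeproj         # then switch the base SDK to the iPhone SDK and build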
The iPhone does support OpenCV. If you want to use it, first go to the best OpenCV on iPhone documentation on the web: Yoshimasa Niwa's
I used it and I already have an app on the AppStore that uses face detection and image processing: Flags&Faces. If you have any doubts, please contact me.
Note that OpenCV runs very fast on Intel chips, but the iPhone is ARM. Of course OpenCV is extremely useful, but it won't be that fast. Also, there's no way to get a live video stream on the iPhone, so all of the normal potential CV applications sort of lose their appeal, don't they?
You can also install OpenCV using a package manager like CocoaPods.
To quote the installation guide:
You want to add pod 'OpenCV', '~> 3.0' to your Podfile, similar to the following:
target 'MyApp' do
pod 'OpenCV', '~> 3.0'
end
Then run pod install in your terminal, or from CocoaPods.app.
Here's a modified script (based on the one from LambdaJive) that builds a universal OpenCV framework for the iPhone/iPhone Simulator - universal-i386arm opencv framework
The following post by Yoshimasa does indeed work with iOS and the iPhone 4, and it is able to access both the front and back cameras.
The link to the project is using opencv on iphone en
and the sample code is at webgit, which is linked from that article. I really encourage you to read the article before getting the source code.
A project utilizing OpenCV on the iPhone (Lucas-Kanade optical flow, to be exact). Source code is available, and the app is on the AppStore as well -
