I am developing an image processing application on CentOS with OpenCV, coding in C/C++. My intention is to have a single development codebase for Linux and iOS (iPad).
So if I start development in a Linux environment with OpenCV installed (in C/C++), can I use the same code on iOS without switching to Objective-C? I don't want to duplicate the effort for iOS and Linux, so how can I achieve this?
It looks like it's possible. Compiling and running C/C++ on iOS is no problem, but you'll need some Objective-C for the UI. When you pay some attention to the layering/abstraction of your modules, you should be able to share most/all core code between the platforms.
See my detailed answer to this question:
iOS:Retrieve rectangle shaped image from the background image
Basically you can keep most of your C++ code portable between platforms if you keep your user interface code separate. On iOS all of the UI should be pure Objective-C, while your OpenCV image processing can be pure C++ (which would be exactly the same on Linux). On iOS you would make a thin Objective-C++ wrapper class that mediates between the Objective-C side and the C++ side. All it really does is translate image formats between them and send data in and out of C++ for processing.
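As a rough illustration of that split, the portable core can be nothing more than a plain C++ translation unit like the sketch below (the file, namespace, and function names are made up for illustration; only the OpenCV calls are real). The Linux front end would call core::process() directly, while the iOS wrapper would convert UIImage to cv::Mat, call the same function, and convert the result back.

// image_core.cpp -- hypothetical portable core (names are illustrative).
// This file is shared verbatim between the Linux build and the iOS build;
// only the thin Objective-C++ wrapper (which converts UIImage <-> cv::Mat)
// differs per platform.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

namespace core {

// Pure C++ entry point: cv::Mat in, cv::Mat out, no platform-specific types.
cv::Mat process(const cv::Mat& input) {
    cv::Mat gray, edges;
    cv::cvtColor(input, gray, cv::COLOR_BGR2GRAY);  // OpenCV-only calls, fully portable
    cv::Canny(gray, edges, 50, 150);                // example processing step
    return edges;
}

}  // namespace core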
I have a couple of simple examples on GitHub you might want to take a look at: OpenCVSquares and OpenCVStitch. These are based on C++ samples distributed with OpenCV; you should compare the C++ in those projects with the original samples to see how much alteration was required (hint: not much).
Related
I'm at an intermediate level with Objective-C and I'm currently trying to make an app that controls a device using UPnP; the app is a control point and the device is a binary light switch.
For UPnP I have to parse XML, and I'm using the UPnPx library, but it seems a little old: I keep getting LLVM 5.0 errors and can't use Auto Layout. I've searched a lot for other libraries, but the one that seems easiest is UPnPx. Could you help me with some example code for a binary light switch, or with other libraries? Thanks!
As far as I know, there are a few well-known UPnP libraries that can be easily integrated into your iOS project.
CyberLink4C
It is implemented in C, but comes with an Objective-C wrapper.
Here are some examples: https://github.com/cybergarage/CyberLink4C/tree/master/examples.
Platinum
It is implemented in C++, also with an Objective-C wrapper.
Beware: it is a dual-licensed library, GPL or commercial; read http://www.plutinosoft.com/platinum/ for detailed information.
UPnPx
It is implemented in Objective-C, but its device-search performance is poor.
Is it possible to port a game, originally built for iOS (iPad version only, but it's not important), to Windows in an easy manner?
Assume I have all the source code of the original game. I also have experience with Objective-C as well as with C#.
If not, what would be the steps for this kind of sorcery? Where could I find appropriate tutorials or references? Or anything.
When I did it, it was from the ground up. I was able to use my Objective-C classes as guides, but I still had to write it line by line (nothing automatic).
There is, however, a good introduction to the differences from an iOS developer's perspective over on Jesse Liberty's site that I found helpful: http://jesseliberty.com/2010/08/23/i2w-an-iphone-developers-guide-to-creating-windows-phone-7-applications-tutorial/
I am new to both OpenCV and Android. I have to detect objects in my project, so I have decided to use ASIFT. However, the code they have given here is very lengthy: it contains lots of C files and has no OpenCV support.
Some searching on SO itself suggested that it is easier to connect the ASIFT code to the OpenCV library, but I can't figure out how to do that. Can anyone help me with a link, or by describing the steps I should follow to add ASIFT to my OpenCV library, which I can then use in my Android application?
Also, I would like to know which would be the more suitable option for my Android project (object detection): using the Android NDK along with JNI to call the C files, or using the Android SDK along with a binary package?
Finally, I solved my problem by using the source code given on the website of the ASIFT developers. I combined all the source files into my own library using make. I then called the required function from the library using JNI.
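For reference, the JNI glue can be a single thin C++ function exported with the JNI naming convention; a minimal sketch is below (the Java package, class, and native function names are hypothetical, as is run_asift standing in for the library's real entry point).

// asift_jni.cpp -- hypothetical JNI bridge; names are illustrative only.
#include <jni.h>

// Assumed entry point of the compiled ASIFT library (placeholder).
extern "C" int run_asift(const char* imagePath1, const char* imagePath2);

// Called from Java as: com.example.asift.AsiftDetector.match(path1, path2)
extern "C" JNIEXPORT jint JNICALL
Java_com_example_asift_AsiftDetector_match(JNIEnv* env, jobject /* this */,
                                           jstring img1, jstring img2) {
    const char* p1 = env->GetStringUTFChars(img1, nullptr);
    const char* p2 = env->GetStringUTFChars(img2, nullptr);
    int matches = run_asift(p1, p2);          // hand the paths to the native library
    env->ReleaseStringUTFChars(img1, p1);
    env->ReleaseStringUTFChars(img2, p2);
    return matches;                           // result goes back to the Java side
}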
It worked for me, but execution takes approximately 2 minutes on an Android device. Does anyone have ideas about ways to reduce the running time?
They used very simple and slow brute-force matching (just as a proof of concept). You can use the FLANN library and it will help a lot. http://docs.opencv.org/doc/tutorials/features2d/feature_flann_matcher/feature_flann_matcher.html
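Assuming a recent OpenCV (4.4+, where SIFT lives in the main features2d module), a minimal FLANN matching sketch in the spirit of that tutorial, with Lowe's ratio test to filter matches, could look like this (the image paths are placeholders):

// flann_match.cpp -- minimal FLANN-based matching sketch (paths are placeholders).
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <vector>

int main() {
    cv::Mat img1 = cv::imread("object.jpg", cv::IMREAD_GRAYSCALE);
    cv::Mat img2 = cv::imread("scene.jpg", cv::IMREAD_GRAYSCALE);

    // Any float-descriptor detector works with the default FLANN (KD-tree) index;
    // SIFT is used here purely for illustration.
    cv::Ptr<cv::SIFT> detector = cv::SIFT::create();
    std::vector<cv::KeyPoint> kp1, kp2;
    cv::Mat desc1, desc2;
    detector->detectAndCompute(img1, cv::noArray(), kp1, desc1);
    detector->detectAndCompute(img2, cv::noArray(), kp2, desc2);

    // Approximate nearest-neighbour matching instead of brute force.
    cv::FlannBasedMatcher matcher;
    std::vector<std::vector<cv::DMatch>> knn;
    matcher.knnMatch(desc1, desc2, knn, 2);

    // Lowe's ratio test keeps only distinctive matches.
    std::vector<cv::DMatch> good;
    for (const std::vector<cv::DMatch>& m : knn)
        if (m.size() == 2 && m[0].distance < 0.7f * m[1].distance)
            good.push_back(m[0]);

    return good.empty() ? 1 : 0;
}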
Can you please tell me if there is a Weka (machine learning library) port for iOS?
And if so, could you provide me with a download link?
The iOS agreement says:
"3.3.2 — An Application may not itself install or launch other executable code by any means, including without limitation through the use of a plug-in architecture, calling other frameworks, other APIs or otherwise. No interpreted code may be downloaded or used in an Application except for code that is interpreted and run by Apple’s Documented APIs and built-in interpreter(s)."
So you cannot launch a Java interpreter to use the WEKA libraries.
BUT... Google released a "Java to iOS Objective-C translator" a few days ago, and WEKA is an open-source project. So maybe you could download WEKA's (Java) code and translate it from Java to Objective-C in order to run WEKA's algorithms on iOS.
If you get it working, please let me know ;-)
Weka is written in Java. This means the likelihood of it being adapted to iOS is quite small.
Somebody correct me if I'm wrong.
Is there no way (at least no supported way) to create a View-based/Window-based iPhone app using the Corona SDK?
I say this mainly because I see no way to work with IBOutlets (or anything related to Interface Builder), which makes me believe Corona is not converting anything to Objective-C, but rather converts the Lua script to C/C++.
Thanks!
The latest feature in Corona (currently available only to subscribers) is Corona UI, which emulates most of the native UI components:
http://www.youtube.com/watch?v=9UHNSRilB-0
Note that I say "emulated." It's still not connecting to IBOutlets but it may accomplish what you want.
Corona UI supports most of the native UI components; you can also include widget_iOS for other native iOS components, or otherwise create your own custom objects.
You can achieve this using the Corona Enterprise edition. It has all the options for bridging between Lua and C/C++, Lua and Objective-C, or Lua and Java.
Corona Enterprise lets you work in the Xcode/Objective-C environment to implement the same functionality in Corona apps.
http://docs.coronalabs.com/native/enterprise/index.html
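The Corona Enterprise headers themselves are not shown here; the sketch below only illustrates the general Lua/C++ bridging idea using the plain Lua C API (the function names are illustrative).

// lua_bridge.cpp -- generic Lua C API sketch, not Corona-specific; names are illustrative.
extern "C" {
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>
}
#include <cstring>

// C++ function exposed to Lua: takes a string, returns its length.
static int l_nativeStrlen(lua_State* L) {
    const char* s = luaL_checkstring(L, 1);               // first argument from Lua
    lua_pushinteger(L, static_cast<lua_Integer>(std::strlen(s)));
    return 1;                                             // one return value on the stack
}

int main() {
    lua_State* L = luaL_newstate();
    luaL_openlibs(L);
    lua_register(L, "nativeStrlen", l_nativeStrlen);      // make it callable from Lua
    luaL_dostring(L, "print(nativeStrlen('corona'))");    // prints 6
    lua_close(L);
    return 0;
}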
Hope this helps.