iPhone and OpenCV - ios

I know that OpenCV was ported to Mac OS X, however I did not find any info about a port to the iPhone.
I am not a Mac developer, so I do not know whether a Mac OS X port is enough for the iPhone.
Does anyone know more about this than I do?

OpenCV does indeed work on the iPhone. Use the configure script here to compile the library: cross compiling for iphone
You just have to cross-compile it, the same way you cross-compile your apps.

OpenCV now (since 2012) has an official port for the iPhone (iOS).
You can find all of OpenCV's releases here.
And find install instructions here:
Tutorials & introduction for the new version 3.x

The latest build script from Computer Vision Talks works great for Xcode 4 and iOS 4.3. I have tried the script myself and it is just awesome!

Here is OpenCV 2.0 on the iPhone: iphone opencv test

OpenCV is now available as a framework for iOS. Just drag and drop into your project. It supports video capture too. See the article and get the example project here: opencv framework for ios
For the sake of transparency, I wrote this article and it is hosted on my company's website.

I haven't tried using OpenCV specifically, but I do develop for the iPhone and can say that most libraries I've tried that work on OS X DO NOT work on the iPhone out of the box. Some of them just needed a little tweaking/configuration and then ran fine on the iPhone, but the reality is that the phone is missing quite a few backend components that OS X supports. Most complex libraries (OpenCV sounds like one of them) aren't going to work without a major effort - particularly since OpenCV sounds like it depends on several other external libraries as well...so those would have to be ported too.

All you need to do is generate an Xcode project for OpenCV using cmake or the CMake GUI tool.
Remember to set the option to generate an Xcode project instead of the default Makefiles.
Open the generated project, change the base SDK to the iPhone SDK, and hit build!
Since OpenCV does not support iOS yet (iPhone support has been announced for version 2.2), the highgui library won't compile, so if you need camera access you have to write it yourself.
Anyway, the other libraries should compile and work on the device (they do for me).
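Since highgui's capture code is the missing piece, the practical consequence is that your own camera code has to hand pixels to OpenCV directly. Here is a minimal sketch of that handoff, assuming a reasonably recent OpenCV (2.4 or later for the cv::COLOR_* constants) and assuming you already receive a BGRA pixel buffer from whatever capture code you wrote; the function and variable names are made up for illustration:

    #include <opencv2/opencv.hpp>

    // Wrap a raw BGRA camera buffer in a cv::Mat and convert it to grayscale.
    // This cv::Mat constructor does not copy the pixels, it just points at
    // them, so the buffer must stay alive while 'bgra' is in use.
    cv::Mat grayFromBGRA(unsigned char* pixels, int width, int height, size_t bytesPerRow)
    {
        cv::Mat bgra(height, width, CV_8UC4, pixels, bytesPerRow);
        cv::Mat gray;
        cv::cvtColor(bgra, gray, cv::COLOR_BGRA2GRAY); // allocates a grayscale copy
        return gray;
    }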

The iPhone does support OpenCV. If you want to use it, first go to the best OpenCV-on-iPhone documentation on the web: Yoshimasa Niwa's.
I used it and I already have an app on the App Store that uses face detection and image processing: Flags&Faces. If you have any doubts, please contact me.

Note that OpenCV runs very fast on Intel chips, but the iPhone uses an ARM processor. Of course OpenCV is extremely useful, but it won't be that fast. Also, there's no way to get a live video stream on the iPhone, so all of the normal potential CV applications sort of lose their appeal, don't they?

You can also install OpenCV using a package manager like CocoaPods.
To quote the installation guide:
You want to add pod 'OpenCV', '~> 3.0' to your Podfile, similar to the following:

    target 'MyApp' do
      pod 'OpenCV', '~> 3.0'
    end

Then run pod install inside your terminal, or from CocoaPods.app.
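Once the pod is installed, a quick way to confirm the headers are reachable is to print the library version from any Objective-C++ (.mm) source file. This is just a sanity-check sketch; the function name is made up:

    #include <opencv2/core/version.hpp>
    #include <iostream>

    // Print the OpenCV version string baked into the headers (e.g. "3.0.0").
    void printOpenCVVersion()
    {
        std::cout << "OpenCV version: " << CV_VERSION << std::endl;
    }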

Here's a modified script (based on the one from LambdaJive) that builds a universal OpenCV framework for the iPhone/iPhone Simulator - universal-i386arm opencv framework

The following post by Yoshimasa does indeed work with iOS and the iPhone 4, and it is able to access both the front and back cameras.
The link to the project is using opencv on iphone en,
and the sample code is at webgit; both are linked from the article. I really encourage you to read the article before getting the source code.

A project utilizing OpenCV on the iPhone (Lucas-Kanade optical flow, to be exact). Source code is available, and the app is on the App Store as well -
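For context, Lucas-Kanade tracking in an app like that boils down to OpenCV's goodFeaturesToTrack and calcOpticalFlowPyrLK calls. A minimal sketch (the helper function is illustrative, not code from that project):

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Track feature points from the previous grayscale frame into the current one.
    std::vector<cv::Point2f> trackPoints(const cv::Mat& prevGray,
                                         const cv::Mat& gray,
                                         std::vector<cv::Point2f>& prevPts)
    {
        if (prevPts.empty()) {
            // Pick up to 100 strong corners to track if we have none yet.
            cv::goodFeaturesToTrack(prevGray, prevPts, 100, 0.01, 10);
            if (prevPts.empty()) return {};
        }

        std::vector<cv::Point2f> nextPts;
        std::vector<unsigned char> status;
        std::vector<float> err;
        cv::calcOpticalFlowPyrLK(prevGray, gray, prevPts, nextPts, status, err);

        // Keep only the points that were tracked successfully.
        std::vector<cv::Point2f> tracked;
        for (size_t i = 0; i < nextPts.size(); ++i) {
            if (status[i]) tracked.push_back(nextPts[i]);
        }
        return tracked;
    }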

Related

How to compile FileMQ in iOS?

I want to implement FileMQ for file transfer from iOS to Android in my iOS app.
I tried the steps given here, but they produce errors at many points. I also need to know how I should use the downloaded library.
How should I compile FileMQ for iOS and use it?
Any information in this regard is appreciated!
I downloaded a C version of the library from the link mentioned in the question and compiled it on a Linux machine. I made some changes in the headers to make them iOS-compatible, since they were generated for Linux. Now I am using the same copy in my iOS project.

Linking iOS project as a library

First of all, this question is purely theoretical; I know it is not something you should really do.
I was wondering whether it's possible to link an iOS project so that its source code can be used as a library in an OS X project, since an iOS project built in debug mode for the simulator produces code for the same architecture used in an OS X project. I know that frameworks like UIKit will not be available, but what if we link (again, I am being theoretical) against an OS X version of UIKit?

How to integrate a source compiled LLVM with Xcode?

As part of a research project at school, I'm exploring mobile specific energy optimizations and am building infrastructure to test these optimizations on a popular mobile platform. Given my background in LLVM, I have decided to setup the testing infrastructure around the iOS platform. I thought that since Xcode already uses LLVM under the hood, it should be easy to integrate a copy of LLVM compiled from source into the Xcode toolchain, but I haven't been able to find an option to accomplish it in Xcode yet. (I'm new to OSX and haven't worked with Xcode before)
Am I overlooking anything, or is such an integration not supported out of the box in Xcode?
It's for obfuscator-llvm, but it should work for a "normal" LLVM:
https://github.com/obfuscator-llvm/obfuscator/wiki/Installation#integration-into-xcode

Can someone please explain the differences between Cocos2d-Swift, SpriteBuilder, Xcode and CocoaPods?

I'm completely confused and I don't know where to start asking questions. I tried googling, but the terminology is confusing and I'm not sure what either of these things do (except for Xcode). Can someone explain like I'm 5?
I'm on the cocos2d-swift website and after reading the getting started section it says "From this point onwards, using SpriteBuilder is optional.". I don't know what they mean by that.
How do each of these correlate with each other?
Also, how is an API Documentation Browser and Code Snippet Manager useful to an everyday iOS Developer?
cocos2d-swift is a framework that enables you to build things like sprite-based games quickly.
SpriteBuilder is a tool that helps you build your own multilayered sprites (images and animations grouped into a single package -- i.e. Mario, a Goomba, a Fireflower fireball, etc.).
Xcode is a developer environment in which you write your source code, compile, distribute, and test.
CocoaPods is a tool that fetches and manages framework/SDK dependencies.
You would use CocoaPods to fetch the cocos2d-swift framework so that you could build a sprite-based game in Xcode using sprites you generated in SpriteBuilder.
Not sure what Cocos2d is, but Swift is the latest programming language from Apple for both OS X and iOS development.
http://en.wikipedia.org/wiki/Swift_(programming_language)
SpriteBuilder is a framework used to create games for iOS very quickly. Think of it as a game engine.
http://www.spritebuilder.com/about
Xcode is the IDE (integrated development environment) that you use when writing native OS X and iOS applications. It's awesome!
CocoaPods is a way to load in third-party libraries and frameworks without having to install them manually on your own. It also makes it very easy to keep the frameworks up to date. Pods also make your project more portable, as it's much easier to install an application with multiple dependencies via Pods.
http://cocoapods.org
A documentation browser is good if you want to have access to documentation while offline. However, I almost always use Google to find what I'm looking for regardless of what technology I'm working on. Google is just the best way to search.
Finally, I'd start off with this book. I read the first edition years ago, and it made things very easy for me to understand.
http://www.bignerdranch.com/we-write/ios-programming.html
Hope this helps!
Here are some basics:
Xcode (a program) - Most of your iOS development will happen here: coding, creating the app, etc.
Think of an SDK as a suite of commands or tools you can use - APIs (API: application programming interface).
Cocos2d (an SDK) - A game engine: a software development kit for creating games. You would pull this library of code and tools into Xcode to use it.
SpriteBuilder (an SDK) - A suite of tools for building games. Just like Cocos2d, you would pull this into Xcode to make use of it as you code.
CocoaPods - A tool for linking/loading SDKs into Xcode and easily updating them.
Moral of the story: Xcode is the software you will use for everything. Everything else is just additional libraries of code you can pull in.

Can OpenCV developed in C/C++ run on iOS?

I am developing an image processing application on CentOS with OpenCV, using C/C++. My intention is to have a single development platform for Linux and iOS (iPad).
So if I start development in a Linux environment with OpenCV installed (in C/C++), can I use the same code on iOS without going for Objective-C? I don't want to put in dual effort for iOS and Linux, so how can I achieve this?
It looks like it's possible. Compiling and running C/C++ on iOS is no problem, but you'll need some Objective-C for the UI. When you pay some attention to the layering/abstraction of your modules, you should be able to share most/all core code between the platforms.
See my detailed answer to this question:
iOS:Retrieve rectangle shaped image from the background image
Basically you can keep most of your CPP code portable between platforms if you keep your user interface code separate. On iOS all of the UI should be pure objective-C, while your openCV image processing can be pure C++ (which would be exactly the same on linux). On iOS you would make a thin ObjC++ wrapper class that mediates between Objective-C side and the C++ side. All it really does is translate image formats between them and send data in and out of C++ for processing.
I have a couple of simple examples on GitHub you might want to take a look at: OpenCVSquares and OpenCVStitch. These are based on C++ samples distributed with OpenCV - you should compare the C++ in those projects with the original samples to see how much alteration was required (hint: not much).
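To make the split concrete, here is a minimal sketch of the kind of pure C++ entry point you would keep platform-independent; the function name and the particular processing steps are just an illustration. Your Linux code and the thin Objective-C++ wrapper on iOS would both call it with a cv::Mat they prepared themselves:

    #include <opencv2/opencv.hpp>

    // Portable core: pure C++/OpenCV, no platform-specific includes.
    // Both the Linux build and the iOS Objective-C++ wrapper can call this.
    cv::Mat detectEdges(const cv::Mat& input)
    {
        cv::Mat gray, edges;
        cv::cvtColor(input, gray, cv::COLOR_BGR2GRAY);   // assumes a 3-channel BGR input
        cv::GaussianBlur(gray, gray, cv::Size(5, 5), 1.5);
        cv::Canny(gray, edges, 50, 150);
        return edges;
    }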
