Can't add the iOS OpenCV framework to a Qt project - ios

I'm trying to link the OpenCV framework that is available for iOS into a Qt 5.2 iOS project, so far without any useful result.
What works is to create the Qt project without any OpenCV framework reference, generate an Xcode project with qmake (using the command "qmake -spec macx-xcode qtopencv.pro"), and then manually add the iOS OpenCV framework to the Xcode project. The resulting app then runs perfectly with OpenCV support. But whenever I add a file to the project or change anything affecting the project structure, I have to run qmake and add the OpenCV framework manually again.
So I need a way to tell the Qt project to use the already existing OpenCV iOS framework automatically. After doing some research on qmake I found out that there is a way to add a Mac/iOS framework.
With the following lines in the Qt .pro file, the framework files seem to be added to the project (the OpenCV framework appears correctly in the Frameworks section of the qmake-generated Xcode project):
QMAKE_LFlags += -F"/Users/divdurch0/Desktop/qtopencv/OpenCV.framework"
LIBS += -framework "/Users/divdurch0/Desktop/qtopencv/OpenCV.framework"
But now the project does not compile, saying "ld: framework not found -L/Users/divdurch0/Qt5.2.1/ios/plugins/platforms". The mentioned path is not a framework, it is a library directory, and it is found correctly as a library path if I don't add the lines above - so it shouldn't be the problem.
If I change the second line, as described in several answers on the net, to
LIBS += -framework OpenCV
it says "framework OpenCV not found" and the framework files aren't added to the qmake-generated Xcode project.
I hope someone knows how to do this. There must be a way; maybe I am just using the wrong syntax. Any other way to add the OpenCV framework to an iOS Qt project would also be helpful.
Thanks.

Answered at http://qt-project.org/forums/viewthread/41530/#172320 by SGaist:
LIBS += \
-F /Users/divdurch0/Desktop/qtopencv \
-framework OpenCV
And you should be good to go
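For completeness, a fuller .pro sketch might look like the lines below. The framework directory is the one from the question; the extra QMAKE_CXXFLAGS line is an assumption on my part, added so the compiler can also find the framework headers and not just the linker symbols:
# Directory that contains OpenCV.framework (path taken from the question)
OPENCV_DIR = /Users/divdurch0/Desktop/qtopencv
# Tell the linker where to find the framework and link it
LIBS += -F$$OPENCV_DIR -framework OpenCV
# Pass the framework search path to the compiler as well (assumption, not part of the forum answer)
QMAKE_CXXFLAGS += -F$$OPENCV_DIR
After editing the .pro file, re-running "qmake -spec macx-xcode qtopencv.pro" should regenerate the Xcode project with the framework already referenced, so nothing has to be added by hand anymore.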

Related

How do I compile a library in Xcode using a makefile?

I need to compile a library written in C in Xcode. For this I need to use a makefile. How can I include a makefile in my project?
Any link on writing a makefile, or a sample makefile for running on the iOS simulator, would be helpful.
Also, if I use CMake, what commands do I use in the terminal to create a static library for the iOS simulator?
Thanks
First, your project generator is either Xcode or Make. You can't drive an Xcode project with a makefile.
If you want to generate an iOS library from C/C++ sources, take a look at this Google project. The project wiki explains how to use the iOS CMake toolchain. This will give you a .xcodeproj file. You can build it and then link to that library from other iOS projects. There's also this fork available on GitHub which you could take a look at.
If the target system is exclusively iOS, you could alternatively create a new iOS library project from scratch (no CMake) and throw in your sources manually.
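As a rough illustration of the toolchain route, the commands might look like the ones below; the location and name of the iOS toolchain file (and any platform variable it expects) differ between the Google Code project and its GitHub forks, so treat those parts as assumptions:
# From a build directory inside the library's source tree:
# generate an Xcode project for the library using an iOS CMake toolchain file
cmake .. -GXcode -DCMAKE_TOOLCHAIN_FILE=../toolchain/iOS.cmake
# build a static library for the simulator from the command line
xcodebuild -sdk iphonesimulator -configuration Release
The resulting .xcodeproj can then be added to your app's project and the static library linked from there, as described above.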

Compile FFmpeg as iOS 8 Framework

I successfully compiled FFmpeg with the iOS 8.2 SDK thanks to https://github.com/kewlbear/FFmpeg-iOS-build-script and the latest version of gas-preprocessor (https://github.com/libav/gas-preprocessor).
However, I would like to package the FFmpeg libraries as an iOS 8 dynamic framework due to legal constraints. I found resources on creating iOS 8 dynamic frameworks, but I cannot find any solution specific to FFmpeg.
Can anyone help me package these libraries?
Thanks
David
As far as I know, the FFmpeg-iOS repo on GitHub can build static libraries from the FFmpeg source code. But I have searched all over the network and nobody shows how to build it as dynamic libraries.
But I wonder if we can create a new Cocoa Touch framework project, drag all the header files and static libraries into it, declare the headers in the base .h file, then drag the framework project into an existing iOS project as a subproject, add it as an embedded framework, and compile the whole project.
The reason I use a subproject, instead of handing out a final .framework file, is that static symbols are only linked if they are actually used somewhere.
I will demonstrate this later. If anyone has a better idea, I would be grateful.
Edit:
After several days of research, I found that building a dynamic framework is not easy, but I found a workaround to achieve the goal:
Build the FFmpeg static libraries (see the sketch after these steps)
Create a new iOS dynamic framework project
Create a class that encapsulates the basic usage of FFmpeg, such as encoding/decoding video
Copy the static libraries into this dynamic framework project
Make sure your project builds without errors
Add this project as a subproject to your existing project
Add the dependency under Embedded Binaries and Linked Frameworks and Libraries
Build and run the main project
Open-source this project as LGPL 2.1+, the same as FFmpeg itself.
Though it is not perfect, at least it works, and it complies with FFmpeg's LGPL license.
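To make the first step concrete, a minimal sketch is below; the script name and the output folder come from the linked kewlbear repository and may differ in your checkout:
# Build the FFmpeg static libraries with the build script from the linked repo
./build-ffmpeg.sh
# Verify which architectures ended up in one of the fat static libraries
lipo -info FFmpeg-iOS/lib/libavcodec.a
The contents of FFmpeg-iOS/include and FFmpeg-iOS/lib are then what gets copied into the new dynamic framework project in the steps above.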

How can I make CorePlot work with Swift?

I read the tutorial on how to use core plot (http://www.raywenderlich.com/13269/how-to-draw-graphs-with-core-plot-part-1)
The project I am working on is written in swift.
I did everything to import the CorePlot library.
To make sure I did it right, I created a new Objective-C file and called
#import "CorePlot-CocoaTouch.h"
It worked in the test Objective-C file (it built without errors).
I then put the import statement in the bridging header file and got 27 errors, ALL saying '... the current deployment target does not support automated __weak'.
From this post (CorePlot in Swift can not call “numberForPlot”),
I know I can make core plot work in my swift project somehow.
Can anyone recommend something ?
Thank you for helping.
I followed these steps to import the library:
Static Library Install
You can also just copy the Core Plot library directly into your project in binary form.
Copy the CorePlotHeaders directory to your Xcode project
Copy the Core Plot library to your Xcode project.
Open your app's target Build Settings, and for Other Linker Flags include this:
-ObjC
(-all_load used to be required as a linker flag, but this is no longer needed in Xcode 4.2)
Add the QuartzCore framework to the project.
Add the Accelerate framework to the project (release 2.0 and later).
Change your C/C++ Compiler in the project build settings to LLVM GCC 4.2 or LLVM 1.6.
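As for the "__weak" errors themselves: that diagnostic usually means the deployment target of the target compiling Core Plot is below iOS 5, which is the minimum ARC needs for automated weak references. A quick way to check the effective setting from the terminal is sketched below (the project name is hypothetical):
# Print the deployment target Xcode will actually use for the build
xcodebuild -showBuildSettings -project MyApp.xcodeproj | grep IPHONEOS_DEPLOYMENT_TARGET
If it reports something below 5.0, raising it in the target's Build Settings is a reasonable first thing to try before digging further.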

Tesseract-OCR 3.02 with libc++

Xcode 4.6, iOS SDK 6.1, tesseract-ocr 3.02
Since the latest OpenCV versions are built using libc++, and tesseract-ocr is built using libstdc++, they can't be used together in one Xcode project.
So, I'm trying to build tesseract using libc++. Using the script here (updating the base SDK and deployment target to 6.1), tesseract builds just fine and works in my Xcode project once the C++ Standard Library is set to the compiler default. Then I tried altering the script to build it with libc++, according to the answer here: I changed CXX to point to clang++ and added -stdlib=libc++ to the CXXFLAGS.
The result is that the script succeeds and the libraries are built, but when choosing libc++ as the C++ Standard Library in Xcode, I get a lot of linker errors and the project fails to build. The new libraries still work when the standard library is set to the compiler default (just like when it was built regularly).
What am I missing?
Ok, so my problem was that after adding and removing references to libraries a few times in my project, I had quite a mess in my Library Search Paths. Plus, I didn't add the new "include" folder (created when building tesseract) to the User Header Search Paths.
So, just a quick recap, in order to build tesseract-ocr using libc++, so it can work along with newer OpenCV versions:
Download leptonica-1.69
Download tesseract 3.02
Arrange them in the folder structure explained in the original tutorial here
Download this script to the same folder.
Edit the script for your relevant IOS_BASE_SDK and IOS_DEPLOY_TGT.
Edit CXX to use clang++: CXX="/usr/bin/clang++"
Edit CXXFLAGS to use libc++ as the standard library: CXXFLAGS="$CFLAGS -stdlib=libc++" (both edits are sketched after these steps).
Use the script and build tesseract and leptonica.
Add these libraries to your Xcode project and change the "C++ Standard Library" setting to libc++.
Make sure your "Library Search Paths" setting is not pointing to any old tesseract libs.
Make sure your "User Header Search Paths" setting is pointing to the new "include" folder created when you built the new libs.
Now, when you try building your project, you'll have a few missing header files. Just copy them from the old "include" folder from tesseract and leptonica.
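For reference, the script edits from the steps above might look like this once applied (the variable names follow the linked script; the values shown are simply the ones for SDK 6.1 used here):
IOS_BASE_SDK="6.1"
IOS_DEPLOY_TGT="6.1"
# point the build at clang++ and libc++ instead of the default libstdc++
CXX="/usr/bin/clang++"
CXXFLAGS="$CFLAGS -stdlib=libc++"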
That's it. At this point, you'll have a project capable of using both the new OpenCV versions AND tesseract 3.02 together. If it's a new project, don't forget to edit your prefix header so that it includes OpenCV and Tesseract only when __cplusplus is defined, and rename any .m file using them to .mm.
Big thanks to this answer, which got me well on my way.
Tesseract-OCR-iOS has been updated to handle this issue (working in the same project as a libc++-compiled library, e.g. OpenCV). Don't forget these steps when installing it:
If you are using iOS 7 or greater, link the libstdc++.6.0.9.dylib library (your target => General => Linked Frameworks and Libraries => + => libstdc++.6.0.9)
Go to your project, click on the project and in the Build Settings tab add -lstdc++ to all the "Other Linker Flags" keys.
*Go to your project settings, and ensure that C++ Standard Library => Compiler Default. (thanks to https://github.com/trein)
Copy and import the tessdata folder from the Template Framework Project into the root of your project. It contains the "tessdata" files. You can add more tessdata files by copying them here.
*I had to set the C++ Standard Library to "libc++" in order for OpenCV to compile.

How do I add this C library to my iOS Xcode 4 project?

I'm trying to add the openjpeg library to my Xcode 4 project so that I can compress images taken by the iPhone's camera to JPEG 2000.
I built the static library (libopenjpeg.a) using CMake on OS X. (I'm guessing this may have been the first mistake - it probably needs to be built by Xcode so that it targets the iPhone architecture and not OS X.)
I have added the library in the Link Binary With Libraries phase of my target.
The project builds successfully but I can't seem to import any of the headers from the library into any of my Objective-C classes. I've tried manually adding the folder that contains the libopenjpeg header files to the User Header Search Path but that did not seem to do anything.
Any suggestions?
For the simplest solution, import the header files into your project's source.
You can still build it on the command line with CMake; you'd just have to modify the CMakeLists.txt file so the right flags are passed when compiling.
However, as Gavin indicates, it may be simpler just to drag the header and source files from the library into your Xcode project and forgo building a static library.
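If you do stay with CMake, a rough sketch of a device cross-compile is below; the SDK name handling and the output location of libopenjpeg.a depend on your CMake version and the project layout, so treat them as assumptions (a simulator build would use the iphonesimulator SDK and i386 instead):
mkdir build-ios && cd build-ios
# build a static library against the iOS SDK instead of the OS X one
cmake .. -DCMAKE_OSX_SYSROOT=iphoneos -DCMAKE_OSX_ARCHITECTURES=armv7 -DBUILD_SHARED_LIBS=OFF
make
# confirm the library contains an iPhone slice rather than an OS X one
find . -name "libopenjpeg.a" -exec lipo -info {} \;
Once lipo reports the right architecture, linking the .a and adding its headers to the User Header Search Paths (as described in the question) should let the imports resolve.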
