I have to make an app in Objective-C (iOS) that handles a video stream from an IP camera.
After a lot of research I still have no idea where to start :(
The RTSP protocol is difficult, so I looked for a library and I found these.
The site where I downloaded it says that the library is for macOS. I'd like to know if it is possible to add a .so library to my Xcode/iOS project, and if so, how?
Or do you have other solutions for RTSP streaming on iOS?
Sorry for my bad English.
Thanks in advance.
No, this is not possible: the .so file is a binary compiled for the x86 (x86_64) architecture, while iOS devices run on an ARM architecture.
What I am trying to do:
Develop an enterprise-level iOS application that uses FFmpeg for video processing.
What I have done so far:
Created a Linux-based sample program with FFmpeg and made it work, and learnt how to use FFmpeg. I have already found instructions for building the FFmpeg packages for iOS.
What help I need:
Does Apple allow FFmpeg-based applications in the iOS App Store?
As there is no official iOS support from the FFmpeg community, how reliable is "FFMPEG-IOS"? I don't want to run into problems in the future, especially when Apple releases a new OS version, or hit problems that occur with FFmpeg only on iOS.
I believe several users here have apps in the App Store that are compiled and linked with ffmpeg. I personally am going to submit my app within the next month. I anticipate that it will be accepted.
For iOS, you cannot dynamically link. You must statically link. Therefore, the ffmpeg libraries will be part of your app. It would be highly unlikely that a future iOS update would break the code. Your app is more likely to break for some other reason unrelated to ffmpeg, e.g., a UI change that Apple makes.
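(As an aside, because the libraries are baked into your binary, you can log at runtime exactly which FFmpeg build shipped with the app; log_ffmpeg_version below is just an illustrative helper name:)

#include <libavcodec/avcodec.h>
#include <stdio.h>

/* avcodec_version() reports the library actually linked in;
   LIBAVCODEC_VERSION_INT is the header version compiled against.
   With a static link the two should always match. */
void log_ffmpeg_version(void)
{
    printf("libavcodec runtime: %u, headers: %u\n",
           avcodec_version(), (unsigned)LIBAVCODEC_VERSION_INT);
}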
The requirement for static linking means that you must understand the ffmpeg licensing situation carefully. I am not a lawyer, and this is not legal advice. You should consult a lawyer for real legal advice. Some people interpret the LGPL to mean that static linking is OK as long as you do not modify the ffmpeg source code and you distribute the ffmpeg source code (e.g., provide it for download on your server) as well as the static library (.a) files used in building your app. You must also credit the ffmpeg project for your use of their code. More information: http://ffmpeg.org/legal.html
Given the cross-platform nature of potrace (http://potrace.sourceforge.net/), is it possible to compile it for iOS? If so, how?
Yes, it is possible to compile potrace for iOS.
As a matter of fact, most open-source libraries that use the standard (GNU) configure tools and compile on Mac OS X will easily compile on iOS, because they are likely free of platform-specific code (e.g., Linuxisms) and the standard configure tools allow cross-compilation.
You can compile these libraries in the shell by pointing configure to the proper toolchain and SDKs. Paths changed with Mavericks and Xcode 5, but fortunately automated scripts exist for more popular libraries such as expat.
The proposed solution is based on the x2on/expat-ios project on GitHub. Their script fixes expat's config.sub, which doesn't know about the arm64 target; potrace 1.11's config.sub has the same limitation. However, a simpler approach is to download a more recent version of config.sub. Unlike expat, potrace doesn't seem to need any patch to compile for iOS.
Full script build-potrace.sh is available here:
https://gist.github.com/pguyot/dce18af64a71b93c0204
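Once the static library is built, it links into an iOS target like any other C library. Below is a minimal sketch against potracelib's documented C API; the all-black 64x64 bitmap is made up for illustration, and real code would fill bm.map from an actual image:

#include <stdlib.h>
#include <string.h>
#include "potracelib.h"

/* Trace a dummy all-black bitmap; on success st->plist holds the
   traced vector paths. bm.dy is the scanline width in words, and
   each pixel is one bit, most significant bit first. */
int trace_demo(void)
{
    int w = 64, h = 64;
    int dy = (w + 8 * (int)sizeof(potrace_word) - 1) / (8 * (int)sizeof(potrace_word));
    potrace_bitmap_t bm = { w, h, dy, NULL };
    potrace_param_t *param = potrace_param_default();
    potrace_state_t *st;

    bm.map = malloc((size_t)dy * h * sizeof(potrace_word));
    memset(bm.map, 0xff, (size_t)dy * h * sizeof(potrace_word));  /* all black */

    st = potrace_trace(param, &bm);
    if (st && st->status == POTRACE_STATUS_OK) {
        /* ... walk st->plist here to consume the traced curves ... */
    }

    if (st) potrace_state_free(st);
    potrace_param_free(param);
    free(bm.map);
    return 0;
}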
Please note that potrace is licensed under the GPLv2. You might want to check the question "Is it legal to publish iOS apps under the GNU GPLv3 open-source license?".
I'm using iOS 6.1 and Xcode 4.6.
I have a problem: OpenCV needs to be compiled with libc++ (LLVM C++11), while Tesseract 3.02.03 needs to be compiled with the "default compiler".
How can I overcome this problem? At this point I can compile and link only if I comment out either the OpenCV code or the OCR code; I cannot make them both work together.
Any ideas?
I am by no means an expert with C++ but I had the same problem and by some trial and error and lots of internet searching I think I managed to solve it.
As I understand it, the problem is that OpenCV and Tesseract are built against different standard libraries: the latest OpenCV is built with libc++, while Tesseract is built with libstdc++.
The solution is to rebuild one of them so they both use the same standard library. I decided to recompile Tesseract and followed the instructions found here, which reference a build script used to build the library.
I modified this script (again, by trial and error; I'm not really sure this is the best way) to use the clang++ compiler (CXX="/usr/bin/clang++") and libc++ (CXXFLAGS="$CFLAGS -stdlib=libc++"), and it compiles (albeit with some warnings). You may also need to copy some headers, as the script doesn't seem to copy them all.
Then just use this library instead of the downloaded one in your iOS project (remember to switch back to libc++ in the build settings) and everything will compile and link just fine.
So far it seems to work properly at runtime.
Hi all!
I know there are a lot of questions here about FFmpeg on iOS, but no answer fits my case :(
Something strange happens every time I try to link FFmpeg into my project, so please help me!
My task is to write a video-chat application for iOS that uses the RTMP protocol to publish and read a video stream to/from a custom Flash Media Server.
I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library I found.
Many problems appeared when I began researching it, but later I understood how it should work.
Now my application can read a live FLV video stream (from a URL) and send it back to the channel.
My trouble now is sending video FROM the camera.
The basic sequence of operations, as I understand it, should be the following:
Using AVFoundation, with the pipeline Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter, I write the captured video to a file (if needed, I can describe this flow in more detail, but in the context of the question it is not important). This flow is necessary for hardware-accelerated conversion of the live camera video into the H.264 codec, but the result is in the MOV container format. (This step is completed.)
I read this temporary file as each sample is written, and obtain the byte stream of video data (H.264-encoded, in a QuickTime container). (This step is already completed.)
I need to convert the video data from the QuickTime container format to FLV, all in real time (packet by packet).
Once I have the video-data packets in FLV container format, I will be able to send them over RTMP using rtmplib (see the sketch right after this list).
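For context, the publishing side with librtmp (which I assume is the "rtmplib" meant here) could look roughly like this; publish() and the idea of passing one buffer of complete FLV tags per call are illustrative:

#include <librtmp/rtmp.h>

/* Publish FLV tag data to an RTMP server. RTMP_Write() expects the
   buffer to contain complete FLV tags. */
int publish(const char *url, const char *flv_data, int size)
{
    RTMP *r = RTMP_Alloc();
    RTMP_Init(r);
    if (!RTMP_SetupURL(r, (char *)url)) goto fail;
    RTMP_EnableWrite(r);  /* must be set before connecting to publish */
    if (!RTMP_Connect(r, NULL) || !RTMP_ConnectStream(r, 0)) goto fail;

    if (RTMP_Write(r, flv_data, size) <= 0) goto fail;

    RTMP_Close(r);
    RTMP_Free(r);
    return 0;
fail:
    RTMP_Close(r);
    RTMP_Free(r);
    return -1;
}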
Now, the most complicated part for me is step 3.
I think I need to use the ffmpeg libraries for this conversion (libavformat). I even found source code showing how to decode H.264 data packets from a MOV file (looking into libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Once that works, I will need to encapsulate the packets into FLV (using ffmpeg, or manually by adding FLV headers to the H.264 packets; that is not a problem and is easy, if I am correct).
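For reference, a rough sketch of what step 3 could look like with the libavformat API of that era (mov_to_flv is an illustrative name; error handling is stripped, and several of these calls have since been renamed in newer FFmpeg releases):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/mathematics.h>

/* Remux H.264 packets from a QuickTime/MOV input into FLV without
   re-encoding. Both containers carry H.264 in the same "avcC" form,
   so no bitstream filter is needed. */
int mov_to_flv(const char *in_path, const char *out_path)
{
    AVFormatContext *in = NULL, *out = NULL;
    AVStream *ist, *ost;
    AVPacket pkt;
    int video_index = -1;
    unsigned i;

    av_register_all();  /* still required in FFmpeg versions of that era */

    if (avformat_open_input(&in, in_path, NULL, NULL) < 0) return -1;
    if (avformat_find_stream_info(in, NULL) < 0) return -1;

    for (i = 0; i < in->nb_streams; i++)
        if (in->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
            video_index = i;
    if (video_index < 0) return -1;
    ist = in->streams[video_index];

    /* Create the FLV output and copy the codec parameters over. */
    avformat_alloc_output_context2(&out, NULL, "flv", out_path);
    ost = avformat_new_stream(out, NULL);
    avcodec_copy_context(ost->codec, ist->codec);
    ost->codec->codec_tag = 0;  /* let the FLV muxer pick its own tag */

    avio_open(&out->pb, out_path, AVIO_FLAG_WRITE);
    avformat_write_header(out, NULL);

    while (av_read_frame(in, &pkt) >= 0) {
        if (pkt.stream_index == video_index) {
            /* Rescale timestamps from the MOV time base to the FLV one. */
            pkt.pts = av_rescale_q(pkt.pts, ist->time_base, ost->time_base);
            pkt.dts = av_rescale_q(pkt.dts, ist->time_base, ost->time_base);
            pkt.stream_index = 0;
            av_interleaved_write_frame(out, &pkt);
        }
        av_free_packet(&pkt);  /* av_packet_unref() in newer FFmpeg */
    }

    av_write_trailer(out);
    avio_close(out->pb);
    avformat_free_context(out);
    avformat_close_input(&in);
    return 0;
}

For real-time RTMP you would open the output on an rtmp:// URL instead of a file, or hand each rewritten packet to rtmplib yourself.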
FFmpeg has great documentation and is a very powerful library, and I don't think using it will be a problem. BUT the problem here is that I cannot get it working in an iOS project.
I have spent 3 days reading documentation and Stack Overflow and googling the answer to the question "How to build FFmpeg for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))
I tried many different build scripts and configure options, but when I build FFmpeg I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify the armv6 arch in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)
So I cannot build from these sources, and I decided to find a prebuilt FFmpeg built for the armv7, armv6 and i386 architectures.
I downloaded the iOS Comm Lib by MidnightCoders from GitHub; it contains an example of FFmpeg usage along with prebuilt .a files for avcodec, avformat and the other FFmpeg libraries.
I checked their architectures:
iMac-2:MediaLibiOS root# lipo -info libavformat.a
Architectures in the fat file: libavformat.a are: armv6 armv7 i386
And I found that they are appropriate for me!
When I add these libraries and headers to the Xcode project, it compiles fine (I don't even get warnings like "Library is compiled for another architecture"), and I can use the structures from the headers. But when I try to call a C function from libavformat (av_register_all()), the linker shows the error "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe the symbols were missing from the library, so I tried to list them:
root# nm -arch armv6 libavformat.a | grep av_register_all
00000000 T _av_register_all
Now I am stuck: I don't understand why Xcode cannot see these symbols, and I cannot move forward.
Please correct me if my understanding of the flow for publishing an RTMP stream from iOS is wrong, and help me build and link FFmpeg for iOS.
I have the iOS 5.1 SDK and Xcode 4.2.
I thought I'd write an answer to this question, even though it is old, because there is no accepted answer.
According to the explanation given in the question, I assume he was able to compile the FFmpeg source correctly. If anyone is wary of doing this, there is an already-compiled version that you can download from here.
Anyway, I believe the error is caused by not setting the Header Search Paths in your PROJECT's Build Settings.
What you can do is add your Xcode project path to the Header Search Paths key and set it to recursive.
And I believe you have to link all three libraries mentioned below, not just the first two.
libz.dylib
libbz2.dylib
libiconv.dylib
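One more thing worth checking (my assumption, since the symptom matches): if the file that calls av_register_all() is Objective-C++ (.mm), the FFmpeg headers must be wrapped in extern "C". FFmpeg's headers don't do this themselves, so without the wrapper the compiler looks for C++-mangled names and the linker reports exactly this kind of "Symbol(s) not found" error:

#ifdef __cplusplus
extern "C" {  /* FFmpeg is plain C; prevent C++ name mangling */
#endif
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#ifdef __cplusplus
}
#endif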
To compile FFmpeg for the iOS 5.1 SDK with the App Store version of Xcode (which changes the script paths), you can use this script.
It worked for me. Also be sure to add these to your project's "Link Binary With Libraries" phase:
libz.dylib
libbz2.dylib
I'm also trying to get the H.264 byte stream from the .mov file while it's being written, so I'd be glad for any tips on step (2), thanks :)