Hi all!
I know there are a lot of questions here about FFMPEG on iOS, but none of the answers fit my case :(
Something strange happens every time I try to link FFMPEG into my project, so please help me!
My task is to write a video-chat application for iOS that uses the RTMP protocol for publishing and reading a video stream to/from a custom Flash Media Server.
I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library I could find.
Many problems appeared when I began researching it, but later I understood how it should work.
Now, with the help of my application, I can read a live FLV video stream (from a URL) and send it back to the channel.
My trouble now is sending video FROM the camera.
The basic sequence of operations, as I understand it, should be the following:
1. Using AVFoundation, via the sequence Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter, I write the camera output to a file. (If you need, I can describe this flow in more detail, but in the context of the question it is not important.) This flow is necessary for hardware-accelerated conversion of the live camera video into H.264, but the result is in the MOV container format. (This step is completed.)
2. I read this temporary file as each sample is written and obtain the byte stream of video data (H.264 encoded, in a QuickTime container). (This step is already completed.)
3. I need to convert the video data from the QuickTime container format to FLV, and all of it in real time, packet by packet. (See the command-line sketch after this list.)
4. Once I have the packets of video data in the FLV container format, I will be able to send the packets over RTMP using rtmplib.
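For reference, step 3 is a remux rather than a re-encode; the ffmpeg command-line equivalent (not the library calls I actually need) would look something like this hedged sketch, where the file names and the RTMP URL are placeholders and any audio track is assumed to be AAC, since FLV cannot carry arbitrary codecs:

    # remux the H.264 stream out of the QuickTime/MOV container into FLV,
    # copying the packets without re-encoding
    ffmpeg -i input.mov -c copy -f flv output.flv

    # or push the remuxed stream straight to an RTMP server (placeholder URL);
    # -re reads the input at its native frame rate
    ffmpeg -re -i input.mov -c copy -f flv rtmp://example.com/app/streamName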
Now the most complicated part for me is step 3.
I think I need to use the ffmpeg libraries for this conversion (libavformat). I even found source code showing how to decode H.264 data packets from a MOV file (looking into libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Once that is done, I will need to pack the packets into FLV (using ffmpeg, or manually by adding FLV headers to the H.264 packets; that is not a problem and is easy, if I am correct).
FFMPEG has great documentation and is a very powerful library, and I don't think using it will be a problem. BUT the problem is that I cannot get it working in an iOS project.
I have spent 3 days reading the documentation, Stack Overflow, and Google results for "How to build FFMPEG for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))
I tried many different build scripts and configure files, but when I build FFMPEG, I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify the armv6 arch in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)
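For what it's worth, x86 output usually means configure silently fell back to the host compiler. A hedged sketch of a cross-compile invocation for the Xcode 4.2 / iOS SDK 5.1 era follows; the /Developer paths are assumptions and have to match the actual installation:

    ./configure \
        --enable-cross-compile \
        --arch=arm \
        --target-os=darwin \
        --cc=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc \
        --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk \
        --extra-cflags="-arch armv7" \
        --extra-ldflags="-arch armv7"
    make
    # the result should now report armv7 instead of x86
    lipo -info libavcodec/libavcodec.a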
So, unable to build the sources myself, I decided to find a prebuilt FFMPEG built for the armv7, armv6, and i386 architectures.
I downloaded the iOS Comm Lib from MidnightCoders on GitHub; it contains an example of FFMPEG usage and prebuilt .a files for avcodec, avformat, and the other FFMPEG libraries.
I checked their architectures:
iMac-2:MediaLibiOS root# lipo -info libavformat.a
Architectures in the fat file: libavformat.a are: armv6 armv7 i386
And I found that it suits my needs!
When I tried to add these libraries and headers to the Xcode project, it compiled fine (without even a warning like "Library is compiled for another architecture"), and I can use the structures from the headers. But when I try to call a C function from libavformat (av_register_all()), the linker shows me the error "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe the symbols were missing from the lib, and tried to list them:
root# nm -arch armv6 libavformat.a | grep av_register_all
00000000 T _av_register_all
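(The equivalent check against the armv7 slice, the architecture named in the linker error, would be the following; if that slice is intact, nm should print _av_register_all there too:)

    nm -arch armv7 libavformat.a | grep av_register_all
    # or pull the slice out entirely and inspect it on its own
    lipo libavformat.a -thin armv7 -output libavformat-armv7.a
    nm libavformat-armv7.a | grep av_register_all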
Now I am stuck here: I don't understand why Xcode cannot see these symbols, and I cannot move forward.
Please correct me if I am wrong in my understanding of the flow for publishing an RTMP stream from iOS, and help me build and link FFMPEG for iOS.
I have the iOS 5.1 SDK and Xcode 4.2.
I thought I would write an answer for this question, even though it is old, because there is no accepted answer.
According to the explanation given in the question, I assume he was able to compile the FFmpeg source correctly. If anyone is afraid of doing this, there is an already-compiled version that you can download from here.
Anyway, I believe the error is caused by not setting the Header Search Paths in your PROJECT's Build Settings.
What you can do is add your Xcode project path to the Header Search Paths key and set it to recursive.
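As a hedged sketch in xcconfig syntax (the path is an assumption; the trailing /** marks the search path as recursive):

    HEADER_SEARCH_PATHS = $(SRCROOT)/**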
And I believe you have to link all three libraries mentioned below, not just the first two (an xcconfig sketch follows the list).
libz.dylib
libbz2.dylib
libiconv.dylib
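Equivalently, a hedged sketch of pulling those in through linker flags instead of the Link Binary with Libraries phase (xcconfig syntax):

    // the same three libraries as -l flags
    OTHER_LDFLAGS = -lz -lbz2 -liconv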
To compile FFMpeg for iOS SDK 5.1 with the App Store version of Xcode (which changes the script paths), you can use this script.
It worked for me. Also be sure to add these to your project's "Link Binary with Libraries" phase:
libz.dylib
libbz2.dylib
I'm also trying to get the H.264 byte stream from the .mov file while it's being written,
so I'd be glad for any tips there on step (2), thanks :)
Some iOS 9 devices in the wild seem to crash with an error message that I only receive from the very basic crash reporting in Xcode:
dyld: malformed mach-o: load commands size (16464) > 16384
Unfortunately that's all the info I get. I can't even debug or reproduce locally.
Can anyone hint me into the right direction here?
It occurred after updating my CocoaPods, so I guess one of them (or one of their dependencies) misbehaves.
After some investigation of my Mach-O binary, I saw that the sizeofcmds really is 16464.
If I understand correctly, there seems to be a load command size limit of 16384, can anyone confirm this?
Does that mean I should remove dylibs and everything should be fine?
At WWDC18 I went to an Apple engineer who works on dyld. Here's what he had to say:
The dyld code is downloadable from https://opensource.apple.com (the one relevant to us can be found inside macOS 10.12)
For iOS 9 the maximum size of load commands is indeed 16k, i.e. 1 memory page. (There's no way around it; this is imposed by the OS itself. For customer service, telling people to update to iOS 10 would be viable, since all devices that run iOS 9 can, except for the iPhone 4S.)
Since iOS 10 the maximum size of load commands is 32k
The majority of the size of the load commands is determined by the strings (paths) of the frameworks (use the command otool -L to see them)
Possible solutions:
Use fewer libraries (that was our go-to solution thus far, but we will change to umbrella libraries (see below))
Shorten names (might break the header lookup of CocoaPods; maybe use header maps to fix that inside the Xcode build process → maybe more (high-level) info in the WWDC18 session "Behind the Scenes of the Xcode Build Process")
Try to build static archives for the libraries (they should not have dynamic resources; otherwise add copy phases and figure out where the resources live)
Build frameworks that re-export other frameworks (umbrella frameworks); a sketch follows this list. Use -reexport-l as a linker flag (not done often) → adds some runtime overhead when starting the app and also uses a bit more memory (man ld → for info on re-exports)
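A hedged sketch of the umbrella idea, assuming two placeholder sub-frameworks Foo and Bar (-reexport_library is documented in man ld):

    # link an umbrella dylib that re-exports the symbols of two frameworks
    clang -dynamiclib \
        -o Umbrella.framework/Umbrella \
        -install_name @rpath/Umbrella.framework/Umbrella \
        -Wl,-reexport_library,Frameworks/Foo.framework/Foo \
        -Wl,-reexport_library,Frameworks/Bar.framework/Bar
    # the app then links only Umbrella, so its own load commands carry one
    # LC_LOAD_DYLIB path string instead of many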
The engineer recommended filing a bug report via bugreport.apple.com, because in the future even hitting the 32k limit is possible.
I found a solution that will (at least temporarily) work for me - but I still encourage everyone to provide a real solution and more detailed insights.
Anyway:
Extract your binary (a shell equivalent follows the steps):
Archive in Xcode
Export as an IPA
Rename YourApp.ipa to YourApp.zip and extract it
Navigate to the subfolder Payload to find YourApp.app
Right-click YourApp.app and choose Show Package Contents
Copy your binary YourApp (no file extension) to a different location
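A hedged shell equivalent of those steps, with placeholder names:

    unzip YourApp.ipa -d extracted
    cp extracted/Payload/YourApp.app/YourApp ./YourApp-binary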
Investigate your Mach-O binary:
Run otool -f on your binary
Note the align values listed for both architectures, which for me say 2^14 (16384). This seems to be the threshold for the size of the load commands.
Run otool -l on your binary
You'll see the different architectures and their load commands listed, as well as their sizeofcmds (size of load commands).
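(A hedged shortcut: sizeofcmds also appears directly in the Mach header, so for one slice you can read it with otool -h; the arch and binary names are placeholders:)

    otool -arch arm64 -h YourApp-binary
    # the header line lists ncmds and sizeofcmds; compare sizeofcmds to 16384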
Now the funny thing:
For arm64, the sizeofcmds (16464) was larger than the align (16384), while it wasn't for armv7.
Now, I haven't found enough documentation on this, but I assume that align represents a threshold that the load-command size should not exceed, and also that it adjusts automatically (since we definitely don't have that many frameworks in our app, there must be apps with more).
So I guess the error came from this unlikely case: the sizeofcmds differed between the architectures AND one of them was actually valid (so the align was not automatically adjusted).
Please correct me if I'm wrong, I am just assuming here and I really really want to understand why this happens.
Solve the issue
Remove frameworks until you are under the sizeofcmds for both architectures.
I know this is not scalable; we were lucky (and stupid) that we still had one barely used framework in there that we could easily remove.
Fortunately this only seems to be an issue on iOS 9 and will therefore lose relevance over the next months. Nevertheless, we should find out if I'm right.
Investigation ideas
My assumption that the align is automatically adjusted could be investigated by adding more and more frameworks to see if it actually does adjust.
If so, adding frameworks would also solve the original issue - not nice, but at least slightly more scalable.
Sidenote
I don't feel like I've shed enough light on the origins of this issue, and I've made a lot of assumptions.
While my solution works, I really hope you feel encouraged to investigate this as well and give a better answer.
So here's the problem:
The Mach-O header size is expected to be 16k (optimized for the platform's page size). In the reference by rachit it's basically the same thing, but the limit is 32K. Both are correct in that this is a hard limit of dyld, the loader.
The total size of your load commands exceeds this maximum. Removing frameworks and libraries works for you because that removes LC_LOAD_DYLIB commands (and there is no reason why you'd need so many frameworks anyway). Instead of removing frameworks, build your app from the ground up starting with the core frameworks, and add one only when a linker error tells you it is needed.
By the way, 'align' has nothing to do with this. Alignment refers to the fat (universal) architecture slices and doesn't have anything to do with the Mach-O load commands.
I was able to resolve this for my team after reviewing the output of otool -l. It turned out we had the same directory included in our framework search paths 5x, causing our dylibs to be added as rpaths 5x.
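A hedged way to spot that kind of duplication, with a placeholder binary name; each duplicated search path shows up as its own LC_RPATH load command:

    otool -l YourApp-binary | grep -A2 LC_RPATH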
I have to make an app in Objective-C (iOS) to handle a video stream from an IP camera.
After lots of research I have no idea where to start :(
The RTSP protocol is difficult, so I looked for a library, and I
found these.
The site where I downloaded it says that the library is for macOS. I'd like to know whether it is possible to add a .so library to my Xcode/iOS project, and if so, how to do it?
Or do you have other solutions for RTSP streaming on iOS?
Sorry for my bad English.
Thanks in advance.
No, this is not possible: the .so file is a binary compiled for the x86 (x86_64) architecture,
while iOS devices run on an ARM architecture.
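A hedged way to verify this yourself, with a placeholder file name; file prints the architecture the binary was built for:

    file librtsp.so
    # prints e.g. "Mach-O 64-bit dynamically linked shared library x86_64"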
I'm very interested in using GStreamer's iOS framework (http://docs.gstreamer.com/display/GstSDK/Installing+for+iOS+development) for video streaming, but when I add the framework to a blank project and add a few lines of code to take advantage of its powerful features, the final IPA is 27 MB. That is just way too big to be adding to my project. What is the best way to strip it down to the bare necessities, as I'm sure I'm only using a small percentage of the code included in the SDK?
Here's a pic showing the package contents of the IPA:
Thanks!
In gst_ios_main.h you can disable all the plugins that you don't need (make sure to enable linker optimizations so that unused code is removed). If that's not enough, you can build your own stripped-down version of the iOS binaries with http://cgit.freedesktop.org/gstreamer/cerbero/ (you need to remove things from the .package and .recipe files to build only what you need). Just disabling things in gst_ios_main.h should be enough in 99% of the cases, though.
Note that by default you'll build applications for multiple architectures, so the resulting application will be rather large. Depending on your use case you can drop some architectures, as sketched below.
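For example (a hedged sketch; the slice name and framework path are placeholders), lipo can strip an architecture you don't ship from a fat binary:

    lipo GStreamer.framework/GStreamer -remove i386 \
        -output GStreamer-thin
    lipo -info GStreamer-thin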
On another note, gstreamer.com provides a completely outdated version of GStreamer and is in no way related to the GStreamer project. The official website is http://gstreamer.freedesktop.org .
SDKs have their source code encapsulated away from you, the user. You get access only to the header files, so you just can't extract a class from the SDK, because you don't have access to the implementation files.
Of course, if the library is open source you can attempt to isolate one class, but sometimes everything is so deeply connected that it is close to impossible.
I am trying to use features of the FFMPEG library (such as libavcodec.a, libavformat.a, libavutil.a and libswresample.a), but I am confused about how to add the FFMPEG library to my project.
WHY the FFMPEG library? => Because in my project I want to play a live URL stream. This URL points to a Windows Media Audio file (.wma), and since iOS has no direct support for '.wma' files, I need to convert this live URL stream to a format supported by iOS devices. For this I am using the RadioTunes SDK. All is well and good except that I don't know how to install the FFMPEG library.
There are many questions related to mine, but none of them help me:
How to build and link FFMPEG to iOS?
ffmpeg use on iOS
I downloaded the FFMPEG library from here for 10.5.x.
I don't totally understand what you mean by "how to add the FFMPEG library to my project" but I think you want to build the FFMPEG library and use it in your project?
If that's the case you have to do the following:
Download the FFMPEG sources and compile them for the architectures you want to support (armv6, armv7, arm64, ...).
Put the libraries for the different architectures together using lipo, which will give you the universal library; a sketch follows below.
Add the universal library to your project together with the needed FFMPEG header files.
After those three steps you can use the FFMPEG library in your code.
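A hedged sketch of step 2, assuming per-architecture build directories (the paths are placeholders):

    # glue the per-architecture builds into one universal (fat) library
    lipo -create armv7/libavcodec.a arm64/libavcodec.a \
        -output universal/libavcodec.a
    # verify the slices that ended up inside
    lipo -info universal/libavcodec.a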
Unfortunately, step 1 is kind of a pain. You will have to dig a bit through tutorials to get the FFMPEG sources to compile. There are several Stack Overflow questions that might help you:
Build FFmpeg with xCode 5
FFMPEG iOS 7 Library
Installing ffmpeg ios libraries armv7, armv7s, i386 and universal on Mac with 10.8
The last link seems to be the most promising one. Good luck!
I have to decompile an iOS .app file, insert my code, and then repackage it back into an IPA file.
Can you please suggest some pointers on how to do it?
You simply can't do this.
Once the app is compiled down to machine code, the best you can get from reverse engineering it is assembly, and unless you are willing to write your fix in assembly, I don't see how you are going to integrate your code.
Also, the code signing will be invalidated by doing this.
Unless you have valid provisioning set up on your machine, you can't repackage the app with the original code signing.
Try to get the source or similar source to write an app with the functionality you need.
Yes, you can, if the device is jailbroken and the signature check is removed.
Here is one case:
Is it possible to edit and recompile an iOS Binary?
In the reply they suggest that the modifications were made directly on the binary (by replacing some functions with others of the same size). However, I believe you can do more advanced things.
You could, for instance, include the assembly of the application in your own Objective-C code using Xcode:
Compiling Assembly (.S) file alongside iPhone project
Or directly compile the modified assembly into the binary (Mach-O) and then repackage:
How to create an iPhone mach-o file?
Maybe the GNU ARM or LLVM toolchain can assist you in doing this: Compile, Assemble and Disassemble Using the LLVM Tool Chain.
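For the disassembly side, a hedged starting point (placeholder app name):

    # dump a symbolic disassembly of the app's __TEXT section
    otool -tv YourApp.app/YourApp > YourApp.asm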
These are just some approaches which I'm currently investigating. It is not straightforward, so any other know-how on the topic would be much appreciated.