I'm using Qt 5.5 for iOS development.
I'm wondering how to find and open a file on an iOS device for reading and writing using Qt 5.5. As far as I know, there is no exposed file tree structure on iOS. When I download a picture, for example, I don't even know where it is located, but I can see it in apps.
Can anyone help? Thanks very much.
I am no expert with Qt, but I believe you need the QStandardPaths class.
iOS is no different from any other platform in that it stores files in certain pre-defined locations.
SWF files compiled for Flash and AIR (desktop) can easily be decompiled with off-the-shelf software.
However, SWF files compiled for iOS are different. The header is FWS, yet the data in the file won't decompile in any decompiler.
I am guessing that something happens during compilation for iOS. Does anyone know what this is?
Thanks
To the best of my knowledge, you won't be able to do this.
A compiled AIR for iOS file is cross-compiled to Objective-C. Apple does not allow for virtualized languages in the App Store (which is how AIR for Desktop and Android work), so this was Adobe's solution. They take your AS3/Flex and translate it to Objective-C at compile-time so that at run-time, you are actually running native code.
So you would have to decompile the Objective-C first. Then you would have to reverse-engineer the entire AIR compiler and write a way to re-translate the decompiled Objective-C back to AS3.
For the record, questions about decompiling binaries are generally frowned upon on SO.
Did you try the Sothink SWF Decompiler?
There is also a Mac version that can help you out, here.
What I am trying to do:
Develop an enterprise-level iOS application that uses FFmpeg for video processing.
What I have done so far:
Created a Linux-based sample program with FFmpeg and made it work, and learnt how to use FFmpeg. I have already found the build instructions for building the FFmpeg packages for iOS.
What help I need:
Does Apple allow FFmpeg-based applications in the iOS App Store?
As there is no official iOS support from the FFmpeg community, how reliable is "FFMPEG-IOS"? I don't want to run into problems in the future, especially when Apple releases a new version of the OS, or if a problem turns out to be specific to FFmpeg on iOS.
I believe several users here have apps in the App Store that are compiled and linked with ffmpeg. I personally am going to submit my app within the next month. I anticipate that it will be accepted.
For iOS, you cannot dynamically link. You must statically link. Therefore, the ffmpeg libraries will be part of your app. It would be highly unlikely that a future iOS update would break the code. Your app is more likely to break for some other reason unrelated to ffmpeg, e.g. a UI change, that Apple makes.
The requirement for static linking means that you must understand the ffmpeg licensing situation carefully. I am not a lawyer, and this is not legal advice. You should consult a lawyer for real legal advice. Some people interpret the LGPL to mean that static linking is OK as long as you do not modify the ffmpeg source code and you distribute the ffmpeg source code (e.g., provide it for download on your server) as well as the static library (.a) files used in building your app. You must also credit the ffmpeg project for your use of their code. More information: http://ffmpeg.org/legal.html
I built a project for sending and receiving audio data and intend to use the RTP protocol. So I'm trying to compile JRTPLIB, which I downloaded from the web. It is written in C++, but I have no idea how to build it; I am new to this.
Does anyone know how to compile JRTPLIB so that it can be used on iOS? I would be very grateful for detailed steps.
Hi all!
I know there are a lot of questions here about FFmpeg on iOS, but no answer fits my case :(
Something strange happens every time I try to link FFmpeg into my project, so please help me!
My task is to write a video-chat application for iOS that uses the RTMP protocol for publishing and reading a video stream to/from a custom Flash Media Server.
I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library.
Many problems appeared when I began researching it, but later I understood how it should work.
Now, with the help of my application, I can read a live stream of FLV video (from a URL) and send it back to the channel.
My trouble now is in sending video FROM the camera.
The basic sequence of operations, as I understand it, should be the following:
1. Using AVFoundation, with the sequence Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter, I write the video to a file. (If needed, I can describe this flow in more detail, but in the context of the question it is not important.) This flow is necessary for hardware-accelerated conversion of live video from the camera into the H.264 codec, but the result is in the MOV container format. (This step is completed.)
2. I read this temporary file as each sample is written, and obtain the stream of bytes of video data (H.264-encoded, in a QuickTime container). (This step is already completed.)
3. I need to convert the video data from the QuickTime container format to FLV, all in real time (packet by packet).
4. Once I have the packets of video data in the FLV container format, I will be able to send the packets over RTMP using rtmplib.
Now, the most complicated part for me, is step 3.
I think I need to use the ffmpeg libraries (libavformat) for this conversion. I even found source code showing how to decode H.264 data packets from a MOV file (looking into libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Once that is done, I will need to encode the packets into FLV (using ffmpeg, or manually by adding FLV headers to the H.264 packets; if I am correct, that is easy and not a problem).
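For the manual route, the FLV framing itself is simple enough to sketch. This is a minimal illustration of building the 11-byte FLV tag header that precedes each video packet, with the field layout taken from the FLV specification (the payload still needs an AVC packet header and the H.264 data in the form the spec expects, which this sketch does not cover):

```cpp
// Sketch: build the 11-byte FLV tag header that precedes each packet.
// Layout (FLV spec): TagType (1 byte, 9 = video), DataSize (3 bytes,
// big-endian), Timestamp (3 bytes BE) + TimestampExtended (1 byte,
// bits 24-31), StreamID (3 bytes, always 0). After the payload, FLV
// expects a 4-byte PreviousTagSize equal to 11 + DataSize.
#include <cstdint>
#include <vector>

std::vector<uint8_t> flvVideoTagHeader(uint32_t dataSize, uint32_t timestampMs)
{
    std::vector<uint8_t> h(11, 0);
    h[0] = 9;                           // TagType: 9 = video
    h[1] = (dataSize >> 16) & 0xFF;     // DataSize, 24-bit big-endian
    h[2] = (dataSize >> 8) & 0xFF;
    h[3] = dataSize & 0xFF;
    h[4] = (timestampMs >> 16) & 0xFF;  // Timestamp, lower 24 bits
    h[5] = (timestampMs >> 8) & 0xFF;
    h[6] = timestampMs & 0xFF;
    h[7] = (timestampMs >> 24) & 0xFF;  // TimestampExtended (bits 24-31)
    // h[8..10] stay 0: StreamID is always 0
    return h;
}
```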
FFmpeg has great documentation and is a very powerful library, and I don't think using it will be a problem. BUT the problem is that I cannot get it working in an iOS project.
I have spent 3 days reading documentation and Stack Overflow and googling the answer to the question "How to build FFmpeg for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))
I tried many different build scripts and configure files, but when I build FFmpeg I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify armv6 in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)
So I cannot build from source, and decided to find a prebuilt FFmpeg for the armv7, armv6, and i386 architectures.
I downloaded the iOS Comm Lib from MidnightCoders on GitHub; it contains an example of FFmpeg usage, including prebuilt .a files for avcodec, avformat, and other FFmpeg libraries.
I checked their architectures:
iMac-2:MediaLibiOS root# lipo -info libavformat.a
Architectures in the fat file: libavformat.a are: armv6 armv7 i386
And I found that it is appropriate for me!
When I added these libraries and headers to my Xcode project, it compiled fine (I don't even get warnings like "Library is compiled for another architecture"), and I can use structures from the headers. But when I try to call a C function from libavformat (av_register_all()), the linker shows me the error "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe there were no symbols in the lib, and tried to list them:
root# nm -arch armv6 libavformat.a | grep av_register_all
00000000 T _av_register_all
Now I am stuck here. I don't understand why Xcode cannot see these symbols, and I cannot move forward.
Please correct me if my understanding of the flow for publishing an RTMP stream from iOS is wrong, and help me with building and linking FFmpeg for iOS.
I have the iOS 5.1 SDK and Xcode 4.2.
I thought I would write an answer to this question, even though it is old, because there is no accepted answer.
According to the explanation given in the question, I assume he was able to compile the FFmpeg source correctly. If anyone is afraid of doing this, there is an already-compiled version that you can download from here.
Anyway, I believe the error occurs because the Header Search Paths are not set in your PROJECT's Build Settings.
What you can do is add your Xcode project path to the Header Search Paths key and set it to recursive.
And I believe you have to link all three libraries mentioned below, not just the first two.
libz.dylib
libbz2.dylib
libiconv.dylib
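One more cause worth checking, since the nm output in the question shows the symbol is actually present in the library: if the FFmpeg headers are included from C++ or Objective-C++ (.mm) code, the calls get C++ name mangling, so the linker looks for a mangled symbol instead of the plain C symbol _av_register_all, producing exactly this "Symbol(s) not found" error. The FFmpeg headers do not wrap themselves in extern "C", so the including code has to:

```cpp
// When including FFmpeg's C headers from C++/Objective-C++ code,
// wrap them in extern "C" so the linker looks for the unmangled
// C symbols (e.g. _av_register_all) instead of mangled C++ ones.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}
```

Plain Objective-C (.m) and C files are not affected, since they compile as C and never mangle names.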
To compile FFmpeg for iOS SDK 5.1 with the App Store version of Xcode (which changes the script paths), you can use this script.
It worked for me. Also be sure to add these to your project's "Link Binary With Libraries" phase:
libz.dylib
libbz2.dylib
I'm also trying to get the H.264 byte stream from the .mov file while it's being written, so I'd be glad for any tips on step (2). Thanks :)