I have compiled the QtAV and PortAudio source code, but when I run QtAV's examples (such as vo-qt), there is only video and no audio. I checked the source code and got to this point in AVPlayer.cpp:
#if HAVE_PORTAUDIO
#include "QtAV/AOPortAudio.h"
and here
/*
#if HAVE_PORTAUDIO
    _audio = new QtAV::AOPortAudio();
*/
so I think PortAudio hasn't been included. I don't know where HAVE_PORTAUDIO is defined or how I can include PortAudio.
Thanks!
If your compiler can find PortAudio's header and library, then it will be enabled when running qmake.
https://github.com/wang-bin/QtAV/wiki/Build-QtAV
I have integrated the FFmpeg library into Xcode successfully, but I don't know how to use it or how to run FFmpeg commands. If anyone has an example of how to use it, that would be really appreciated.
I have integrated it by following this guide.
Any help will be highly appreciated. Thanks
Here is my solution. I hope it works for you.
Assuming you have integrated the FFmpeg library into Xcode successfully:
First, create a header file named "avcodec_shim.h":
//avcodec_shim.h
#pragma once
#include "libavcodec/avcodec.h"
Second, create a file with the fixed name "module.modulemap" and give it contents like this:
module CFFmpeg [system] {
    module AVCodec {
        header "avcodec_shim.h"
        link "avcodec"
        export *
    }
    module AVFormat {
        header "avformat_shim.h"
        link "avformat"
        export *
    }
    .....
}
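The module map also references an "avformat_shim.h" header that isn't shown above; assuming it follows the same pattern as the avcodec shim, it would simply be:
//avformat_shim.h
#pragma once
#include "libavformat/avformat.h"
(and likewise one shim header for each FFmpeg sub-library you list in the module map).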
Third, go to "TARGET" -> "Build Settings" and set "Import Paths" to the ABSOLUTE PATH OF THE FOLDER containing the "module.modulemap" file. NOTE: not the absolute path of the file itself.
Finally, import the module using "import CFFmpeg" and use the FFmpeg APIs as usual.
I have added this (https://github.com/kewlbear/FFmpeg-iOS-build-script) version of ffmpeg to my project. I can't see the entry point to the library in the headers included.
How do I get access to the same text command based system that the stand alone application has, or an equivalent?
I would also be happy if someone could point me towards documentation that allows you to use FFmpeg without the command line interface.
This is what I am trying to execute (I have it working on Windows and Android using the CLI version of ffmpeg):
ffmpeg -framerate 30 -i snap%03d.jpg -itsoffset 00:00:03.23333 -itsoffset 00:00:05 -i soundEffect.WAV -c:v libx264 -vf fps=30 -pix_fmt yuv420p result.mp4
Actually, you can build the ffmpeg library including the ffmpeg binary's code (ffmpeg.c). The only thing to take care of is to rename the function main(int argc, char **argv), for example to ffmpeg_main(int argc, char **argv) - then you can call it with arguments just as if you were executing the ffmpeg binary. Note that argv[0] should contain the program name; just "ffmpeg" should work.
The same approach was used in the library VideoKit for Android.
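As a rough sketch of that approach (assuming main() in ffmpeg.c has been renamed to ffmpeg_main() and declared in a header; the name is just an example), the command from the question, slightly simplified, could be invoked like this:
/* minimal sketch, assuming ffmpeg.c's main() was renamed to ffmpeg_main() */
extern int ffmpeg_main(int argc, char **argv);

static void encodeSlideshow(void)
{
    char *args[] = {
        "ffmpeg",                     /* argv[0]: the program name */
        "-framerate", "30",
        "-i", "snap%03d.jpg",
        "-itsoffset", "00:00:05",
        "-i", "soundEffect.WAV",
        "-c:v", "libx264",
        "-vf", "fps=30",
        "-pix_fmt", "yuv420p",
        "result.mp4",
        NULL
    };
    ffmpeg_main((int)(sizeof(args) / sizeof(args[0])) - 1, args);
}
Keep in mind that ffmpeg.c calls exit() on errors and at the end of a run, so this usually needs extra care if you want to invoke it more than once per process.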
To do what you want, you have to use your compiled FFmpeg library in your code.
What you are looking for is exactly the code provided by the FFmpeg documentation: libavformat/output-example.c (meaning FFmpeg's AVFormat and AVCodec libraries in general).
Stack Overflow is not a "do it for me please" platform, so I prefer to explain here what you have to do, and I will try to be precise and answer all your questions.
I assume that you already know how to link your compiled (static or shared) library to your Xcode project, this is not the topic here.
So, let's talk about this code. It creates a video (containing a randomly generated video stream and audio stream) based on a duration. You want to create a video based on a picture list and a sound file. Perfect; there are only three main modifications you have to make:
The end condition is not reaching a duration, but reaching the end of your file list (in the code there is already a #define STREAM_NB_FRAMES you can use to iterate over all your frames).
Replace the dummy fill_yuv_image() with your own method that loads and decodes an image buffer from a file (a sketch follows this list).
Replace the dummy write_audio_frame() with your own method that loads and decodes the audio buffer from your file.
(You can find a "how to load audio file content" example in the documentation starting at line 271, which is easily adaptable for video content.)
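For the second point, here is a hedged sketch (using the newer avcodec_send_packet/avcodec_receive_frame decoding API, with error handling stripped down) of loading one image file into an AVFrame; load_image_frame is an illustrative name, not part of output-example.c:
/* Sketch only: decode a single image file (e.g. one of the snap%03d.jpg frames)
   into an AVFrame, as a replacement for the dummy fill_yuv_image(). */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

static AVFrame *load_image_frame(const char *path)
{
    AVFormatContext *fmt = NULL;
    AVFrame *frame = av_frame_alloc();
    AVPacket pkt;

    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
        return NULL;

    AVCodec *codec = avcodec_find_decoder(fmt->streams[0]->codecpar->codec_id);
    AVCodecContext *dec = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(dec, fmt->streams[0]->codecpar);
    avcodec_open2(dec, codec, NULL);

    if (av_read_frame(fmt, &pkt) >= 0) {
        avcodec_send_packet(dec, &pkt);        /* feed the compressed image */
        avcodec_receive_frame(dec, frame);     /* frame now holds the decoded picture */
        av_packet_unref(&pkt);
    }

    avcodec_free_context(&dec);
    avformat_close_input(&fmt);
    return frame;                              /* caller frees with av_frame_free() */
}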
In this code, compared to your CLI command, you can see that:
const char *filename; in the main should be your output file "result.mp4".
#define STREAM_FRAME_RATE 25 (replace it with 30).
For MP4 generation, video frames will be encoded in H.264 by default (in this code, the GOP is 12), so there is no need to specify libx264.
#define STREAM_PIX_FMT PIX_FMT_YUV420P represents your desired yuv420p pixel format.
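Put together, the constant changes above amount to something like this (a sketch against output-example.c; AV_PIX_FMT_YUV420P is the modern spelling of PIX_FMT_YUV420P):
/* adjusted constants for the slideshow use case */
#define STREAM_FRAME_RATE 30                  /* was 25 in output-example.c */
#define STREAM_PIX_FMT    AV_PIX_FMT_YUV420P  /* matches -pix_fmt yuv420p   */
#define STREAM_NB_FRAMES  ((int)(STREAM_DURATION * STREAM_FRAME_RATE))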
Now, with these official examples and the related documentation, you can achieve what you want. Be careful: there are some differences between the FFmpeg version used in these examples and the current FFmpeg version. For example:
st = av_new_stream(oc, 1); // line 60
Could be replaced by:
st = avformat_new_stream(oc, NULL);
st->id = 1;
Or:
if (avcodec_open(c, codec) < 0) { // line 97
Could be replaced by:
if (avcodec_open2(c, codec, NULL) < 0) {
Or again:
dump_format(oc, 0, filename, 1); // line 483
Could be replaced by:
av_dump_format(oc, 0, filename, 1);
Or CODEC_ID_NONE by AV_CODEC_ID_NONE... etc.
Ask your questions, but you got all the keys! :)
MobileFFMpeg is an easy-to-use pod for this purpose. Instructions on how to use MobileFFMpeg are at: https://stackoverflow.com/a/59325680/1466453
MobileFFMpeg gives you a very simple way to translate ffmpeg commands into your iOS Objective-C program.
Virtually all ffmpeg commands and switches are supported. However, you have to get the pod with the appropriate license, e.g. min-gpl will not give you the features of libiconv. libiconv is covered in the video, gpl and full-gpl licenses.
Please highlight if you have specific issues regarding the use of MobileFFMpeg.
Unfortunately I can't manage to make clang_complete work, and I could use your help.
I've already compiled vim 7.4 with python support. Here is the output of vim --version | grep python:
+cryptv +linebreak +python/dyn +viminfo
-cscope +lispindent +python3/dyn +vreplace
I followed this guide: https://vtluug.org/wiki/Clang_Complete
Please note that I've started from a clean installation (i.e. no other plugins and no further entries in my .vimrc (except for those shown in the guide above)).
According to the tutorials I've seen so far everything should be working.
However, if I try to get code completion for the following example, nothing happens. If I press <C-x><C-u> I receive the error "completefunc not set".
#include <string>

int main()
{
    std::string s;
    s.
}
Moreover, I've installed the newest version of clang from source and it is in my $PATH.
Is there a way to verify that clang_complete is actually installed?
What might cause this problem?
Any help is much appreciated.
Add
filetype plugin indent on
to your vimrc; it's missing from the vimrc snippet in the link. This tells Vim to do filetype detection and fire autocommands related to those file types. Without it you won't run the following autocommands.
au FileType c,cpp,objc,objcpp call <SID>ClangCompleteInit()
au FileType c.*,cpp.*,objc.*,objcpp.* call <SID>ClangCompleteInit()
These are what probably initialize ClangComplete.
I have created a Wireshark dissector in Lua for an application over TCP. I am attempting to use zlib compression and base64 decoding. How do I actually create or call an existing C library in Lua?
The documentation I have seen just says that you can get the libraries and use either the require() call or the luaopen_ call, but not how to actually make the program find and recognize the actual library. All of this is being done on Windows.
You can't load an arbitrary C library that was not created for Lua with plain Lua; at least, it's not trivial.
The *.so/*.dll must follow a specific convention, which is covered in Programming in Lua #26.2 and on the lua-users wiki (code sample). A similar question is answered here.
There are two ways you could solve your problem:
Writing your own Lua zlib library wrapper, following those standards (a sketch follows this list).
Taking an already finished solution:
zlib#luapower
lua-zlib
ffi
Bigger list #lua-users wiki
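For the first route, here is a minimal sketch of a hand-written wrapper, assuming Lua 5.1 (the version bundled with Wireshark) and zlib's one-shot uncompress(); the module name myzlib and the function name inflate are made up for the example:
/* myzlib.c -- minimal Lua 5.1 wrapper around zlib's uncompress() (sketch only) */
#include <stdlib.h>
#include <zlib.h>
#include "lua.h"
#include "lauxlib.h"

static int l_inflate(lua_State *L) {
    size_t srclen;
    const char *src = luaL_checklstring(L, 1, &srclen);
    uLongf dstlen = (uLongf)luaL_checkinteger(L, 2);   /* expected uncompressed size */
    char *dst = malloc(dstlen);
    if (dst == NULL)
        return luaL_error(L, "out of memory");
    if (uncompress((Bytef *)dst, &dstlen, (const Bytef *)src, (uLong)srclen) != Z_OK) {
        free(dst);
        return luaL_error(L, "uncompress failed");
    }
    lua_pushlstring(L, dst, dstlen);                   /* return the inflated data */
    free(dst);
    return 1;
}

int __declspec(dllexport) luaopen_myzlib(lua_State *L) {
    lua_register(L, "inflate", l_inflate);             /* expose as a global function */
    return 0;
}
Build it as myzlib.dll, linking against zlib and the same Lua DLL Wireshark uses; a script can then do require "myzlib" and call local data = inflate(compressed, expected_size).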
The same applies to base64 encoding/decoding. The only difference is that there are already pure-Lua libraries for that. Code samples and a couple of links #lua-users wiki.
NOTE: Lua module package managers like LuaRocks or LuaDist might save you plenty of time.
Also, simply loading a Lua module usually consists of one line:
local zlib = require("zlib")
The module is searched for in the places defined in your Lua interpreter's luaconf.h file.
For 5.1 it's:
#if defined(_WIN32)
/*
** In Windows, any exclamation mark ('!') in the path is replaced by the
** path of the directory of the executable file of the current process.
*/
#define LUA_LDIR "!\\lua\\"
#define LUA_CDIR "!\\"
#define LUA_PATH_DEFAULT \
".\\?.lua;" LUA_LDIR"?.lua;" LUA_LDIR"?\\init.lua;" \
LUA_CDIR"?.lua;" LUA_CDIR"?\\init.lua"
#define LUA_CPATH_DEFAULT \
".\\?.dll;" LUA_CDIR"?.dll;" LUA_CDIR"loadall.dll"
#else
How do I actually create or call an existing C library in Lua?
An arbitrary library, not written for use by Lua? You generally can't.
A Lua consumable "module" must be linked against the Lua API -- the same version as the host interpreter, such as Lua5.1.dll in the root of the Wireshark directory -- and expose a C-callable function matching the lua_CFunction signature. Lua can load the library and call that function, and it's up to that function to actually expose functionality to Lua using the Lua API.
Your zlib and/or base64 libraries know nothing about Lua. If you had a Lua interpreter with a built-in FFI, or you found a FFI Lua module you could load, you could probably get this to work, but it's really more trouble than it's worth. Writing a Lua module is actually super easy, and you can tailor the interface to be more idiomatic for Lua.
I don't have zlib or a base64 C library handy, so for example's sake let's say we wanted to let our Lua script use the MessageBox function from the user32.dll library in Windows.
#include <windows.h>
#include "lauxlib.h"
static int luaMessageBox (lua_State* L) {
const char* message = luaL_checkstring(L,1);
MessageBox(NULL, message, "", MB_OK);
return 0;
}
int __declspec(dllexport) __cdecl luaopen_messagebox (lua_State* L) {
lua_register(L, "msgbox", luaMessageBox);
return 0;
}
To build this, we need to link against user32.dll (contains MessageBox) and lua5.1.dll (contains the Lua API). You can get Lua5.1.lib from the Wireshark source. Here's how to produce messagebox.dll with Microsoft's compiler:
cl /LD /Ilua-5.1.4/src messagebox.c user32.lib lua5.1.lib
Now your Lua scripts can write:
require "messagebox"
msgbox("Hello, World!")
Your only option is to use an FFI library like alien. See my answer to Disabling Desktop Composition using Lua Scripting for other FFI libraries.
This is a beginner question, since I am new to iOS (I started today), so please pardon my ignorance and lack of iOS knowledge.
After building and successfully using FFmpeg for Android, I wanted to do the same for iOS.
So I built FFmpeg successfully for iOS by following this link, but after all that pain I am confused about how to use FFmpeg in iOS. I mean, how can I pass command line arguments to the libffmpeg.a file?
I am assuming that there must be a way to run the .a file as an executable, pass it command line arguments, and hope for FFmpeg to do the magic. I did the same on Android and it worked beautifully.
I am also aware that I can use the ffmpeg.c file and its main method, but the question remains: how do I pass those command line arguments?
Is there something I am supposed to be aware of here? Is what I am doing now correct, or am I falling short in my approach?
I wanted to mix two audio files, so the command for doing that would be ffmpeg -i firstSound.wav -i secondSound.wav -filter_complex amix=inputs=2:duration=longest finalOutput.wav. How do I do the same in iOS?
Can someone please shed some light on this?
You don't pass arguments to a .a file, as it's a library file. It's something you build your application with, giving you access to the functions provided by the ffmpeg library. I'm not sure what the state of play is with Android, but it's likely generating a command line executable instead.
Have a look at the ffmpeg documentation; there's probably a way to do what you want with the library. However, building and running ffmpeg as a standalone, pass-in-arguments binary is unlikely to work.
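To give a feel for what "using the library" means in practice, here is a hedged sketch that just opens an input and dumps its stream info with libavformat (including the registration call older FFmpeg versions need); mixing two files as in the question would additionally involve libavfilter's amix, which is considerably more code:
/* sketch: open a media file with libavformat and print its streams */
#include <libavformat/avformat.h>

static int probe_file(const char *path)
{
    AVFormatContext *fmt = NULL;

    av_register_all();                        /* needed on older FFmpeg versions */
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }
    av_dump_format(fmt, 0, path, 0);          /* similar to the header ffmpeg -i prints */
    avformat_close_input(&fmt);
    return 0;
}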
You can do it in your main.c. Of course you wouldn't hardcode the args; these are just for illustration.
I assume you're using ffmpeg for playback since you're playing with iframeextractor. What actually is the goal of what you're trying to do?
/* Called from the main */
int main(int argc, char **argv)
{
    int flags, i;

    /*
    argv[1] = "-fs";
    argv[2] = "-skipframe";
    argv[3] = "30";
    argv[4] = "-fast";
    argv[5] = "-sync";
    argv[6] = "video";
    argv[7] = "-drp";
    argv[8] = "-skipidct";
    argv[9] = "10";
    argv[10] = "-skiploop";
    argv[11] = "50";
    argv[12] = "-threads";
    argv[13] = "5";
    //argv[14] = "-an";
    argv[15] = "http://172.16.1.33:63478/hulu-f4fa0821-767a-490a-8cb5-f03788760e31/1-hulu-f4fa0821-767a-490a-8cb5-f03788760e31.mpg";
    argc += 14;
    */

    /* register all codecs, demux and protocols */
    avcodec_register_all();
    avdevice_register_all();
    av_register_all();

    parse_options(argc, argv, options, opt_input_file);

    ...
}
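Tying this back to the amix command in the question: if you instead take the approach of renaming ffmpeg.c's main() (say to ffmpeg_main(); the name is just an example), the same hard-coded argv idea looks roughly like this:
/* sketch, assuming ffmpeg.c's main() has been renamed to ffmpeg_main() */
extern int ffmpeg_main(int argc, char **argv);

static void mixAudioFiles(void)
{
    char *args[] = {
        "ffmpeg",
        "-i", "firstSound.wav",
        "-i", "secondSound.wav",
        "-filter_complex", "amix=inputs=2:duration=longest",
        "finalOutput.wav",
        NULL
    };
    ffmpeg_main((int)(sizeof(args) / sizeof(args[0])) - 1, args);
}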