How can I use the Matroska container to save H.264 video? I have been looking for examples but have found none. Maybe I am searching with the wrong parameters. Can anyone point me in the right direction? I have looked at the Matroska source, but it seems overkill to study the whole source code to accomplish this. There should be a practical way to do it.
Working with DirectShow, you could use the Matroska Muxer filter,
or you could dump to an AVI and then use ffmpeg to copy the stream(s) into an MKV:
ffmpeg -i <input_file> -vcodec copy -acodec copy output.mkv
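If you want to do the same thing programmatically rather than from the command line, a minimal remuxing sketch with libavformat (the library behind ffmpeg) could look roughly like the following. It assumes a reasonably recent ffmpeg (3.x or later, for the codecpar API); the function name and file names are placeholders and error handling is trimmed:

    // Copy an existing H.264 stream (e.g. from an AVI/MP4) into Matroska without re-encoding.
    extern "C" {
    #include <libavformat/avformat.h>
    }

    int remux_to_mkv(const char *in_name, const char *out_name)  // e.g. "input.avi", "output.mkv"
    {
        AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
        av_register_all();

        if (avformat_open_input(&in_ctx, in_name, NULL, NULL) < 0) return -1;
        if (avformat_find_stream_info(in_ctx, NULL) < 0) return -1;

        // "matroska" selects the MKV muxer explicitly; it is also inferred from a .mkv extension.
        avformat_alloc_output_context2(&out_ctx, NULL, "matroska", out_name);

        for (unsigned i = 0; i < in_ctx->nb_streams; i++) {
            AVStream *out = avformat_new_stream(out_ctx, NULL);
            avcodec_parameters_copy(out->codecpar, in_ctx->streams[i]->codecpar); // stream copy, no transcode
            out->codecpar->codec_tag = 0;  // let the muxer pick a tag that is valid for MKV
        }

        avio_open(&out_ctx->pb, out_name, AVIO_FLAG_WRITE);
        avformat_write_header(out_ctx, NULL);

        AVPacket pkt;
        while (av_read_frame(in_ctx, &pkt) >= 0) {
            AVStream *in_st  = in_ctx->streams[pkt.stream_index];
            AVStream *out_st = out_ctx->streams[pkt.stream_index];
            av_packet_rescale_ts(&pkt, in_st->time_base, out_st->time_base); // adjust timestamps to the new container
            av_interleaved_write_frame(out_ctx, &pkt);
            av_packet_unref(&pkt);
        }

        av_write_trailer(out_ctx);
        avio_closep(&out_ctx->pb);
        avformat_close_input(&in_ctx);
        avformat_free_context(out_ctx);
        return 0;
    }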
I was unable to read video frames in OpenCV (Python and C++) after using ffmpeg.
Specifically, I had the following problems:
1) unable to open the video file using VideoCapture.
2) able to open the video file using VideoCapture, but reading zero frames and/or receiving frames of size 0x0 pixels.
I am pasting my solution below, hoping this will help others.
These problems were encountered on macOS Sierra 10.12.5, ffmpeg 3.2.4, Python 2.7.13, g++ 4.2.1.
1) OpenCV cannot read as many video formats as ffmpeg. Therefore, it is often possible to play videos in VLC but not to open them in OpenCV's VideoCapture. Hence, conversion with a tool such as ffmpeg is often required (see the next point). One format that worked for me is H.264.
2) OpenCV seemed to require the YUV 4:2:0 pixel format, whereas ffmpeg used YUV 4:4:4 by default. Therefore, the following command solved my problems:
ffmpeg -i input.avi -c:v libx264 -vf format=yuv420p output.mp4
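To check that the converted file is actually readable, a small sanity check along these lines (C++, assuming OpenCV was built with ffmpeg support; the file name matches the command above) can help:

    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main()
    {
        cv::VideoCapture cap("output.mp4");  // the file produced by the ffmpeg command above
        if (!cap.isOpened()) { std::cerr << "Cannot open file" << std::endl; return 1; }

        cv::Mat frame;
        cv::Size size;
        int count = 0;
        while (cap.read(frame) && !frame.empty()) { ++count; size = frame.size(); }

        std::cout << "Decoded " << count << " frames of "
                  << size.width << "x" << size.height << std::endl;
        return 0;
    }

If this prints zero frames or a 0x0 size, the container or pixel format is still one OpenCV cannot handle.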
I am trying to accomplish a task to live stream from the iPhone camera. I have done some research and found that I can use .m3u8 files for streaming live video, which should contain .ts (MPEG-2 transport stream) files.
Now, the file which I have on my iPhone is an .mp4 file and it does not work with .m3u8, so I figured out I will have to convert .mp4 to .ts for that, but I have not succeeded in doing so.
I found that it is possible to convert video with the ffmpeg lib, as mentioned in this article here. I have successfully imported the ffmpeg library but am not able to figure out how I can use it to convert a video, as I am using it for the first time.
One other thing: the Apple documentation says
There are a number of hardware and software encoders that can create MPEG-2 transport streams carrying MPEG-4 video and AAC audio in real time.
What is being said here? Is there any other way I can use .mp4 files for live streaming from iOS without converting them?
Let me know if I am not clear; I can provide more information. Any suggestion is appreciated. I would like to know: am I on the right path here?
EDIT
I am adding more info to my question. Basically, what I am asking is: we can convert an .mp4 video to .ts using the following command
ffmpeg -i file.mp4 -acodec libfaac -vcodec libx264 -an -map 0 -f segment -segment_time 10 -segment_list test.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream%05d.ts
How can I use the ffmpeg library to do what this command does on iOS?
You can look at ffmpeg.c to see what the ffmpeg command-line tool uses from the ffmpeg libraries. It's a pretty complicated file, however. You may want to rename main() and compile it as an object file; then you could call the new ffmpeg_main() function from your code.
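A hedged sketch of what that could look like; the name ffmpeg_main() and the idea of linking ffmpeg.c into the app come from the suggestion above, not from any public API, and ffmpeg.c calls exit() on errors, so that would need patching as well:

    // Assumes main() in ffmpeg.c has been renamed to ffmpeg_main() and the file is linked into the app.
    extern "C" int ffmpeg_main(int argc, char *argv[]);

    static void segmentToHLS(const char *inPath, const char *playlistPath)
    {
        // Roughly the same arguments as the command line in the question, minus the shell.
        const char *argv[] = {
            "ffmpeg",
            "-i", inPath,
            "-vcodec", "libx264",
            "-an",
            "-map", "0",
            "-f", "segment",
            "-segment_time", "10",
            "-segment_list", playlistPath,
            "-segment_format", "mpegts",
            "-vbsf", "h264_mp4toannexb",
            "stream%05d.ts"
        };
        ffmpeg_main(sizeof(argv) / sizeof(argv[0]), const_cast<char **>(argv));
    }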
Apple devices work with HLS only, so you need to convert the media files from .mp4 to .ts along with an .m3u8 playlist as the metadata file. You can use a streamer such as GStreamer to serve your .mp4 in HLS format, or the other way is to transcode on the fly, which is very slow.
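If you go the GStreamer route, the pipeline would look roughly like this (an untested sketch; element and property names can vary between versions, and audio is omitted):

    gst-launch-1.0 filesrc location=input.mp4 ! qtdemux name=demux demux.video_0 ! h264parse ! mpegtsmux ! hlssink playlist-location=stream.m3u8 location=segment%05d.ts target-duration=10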
I am trying to use ffmpeg for Windows to combine thousands of images and sounds into a single video file.
ffmpeg -i apples.jpg -i oranges.jpg -i orangessound.wav -i bananas.jpg -vcodec mpeg4 test.avi
My Delphi program generates the command line as a string, and I use ShellExecute to call it.
But is it true that a command line cannot be longer than 8191 characters? If so, can ffmpeg read the parameters from a file instead?
It is not possible in this case to rename the pictures with consecutive numbers.
You might want to consider using the open-source libavcodec library in your own code instead. It is the same library that ffmpeg uses internally (there is also a separate project called Libav, which is a fork of the original ffmpeg codebase). Then you can do whatever you want with the files. You can look at the ffmpeg source code to see how it interprets the parameters you want to use, then adapt that logic to your own code.
ffmpeg does not support taking the entire command line from a file; you would need to modify the code for your custom use.
It can take encoding parameters from a preset file using the "-fpre" option.
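For illustration only (the file name and values below are made up), a preset file is just option=value lines; note that this only moves encoding options into a file, not the list of input files:

    mysettings.ffpreset:
    vcodec=mpeg4
    b=1200k

    ffmpeg -i input.avi -fpre mysettings.ffpreset output.avi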
So I want to make an app for the iPhone that will play live mms:// video streams.
I have looked around, and everywhere it says that I'll need FFmpeg in order to accomplish it. So I successfully compiled the FFmpeg libraries, but now:
Do I have to convert the mms:// link to an .m3u8 link? Or can I just use Apple's AV Foundation framework?
Thanks!
You need libmms as well as the ffmpeg libs; however, I think the latest versions of ffmpeg have the MMS code built in, so you may not need libmms. MMS is just a streaming protocol, so the actual format is likely some MPEG variant: MP4, H.264.
Once you have that, you extract the frames and use ffmpeg's avcodec_decode_video2 to decode them to an AVFrame, just like any other video.
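A rough sketch of such a decode loop with the older ffmpeg decode API mentioned above (the URL is a placeholder and error handling is omitted):

    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    void decode_stream(const char *url)  // e.g. "mms://example.com/live" (placeholder)
    {
        av_register_all();
        avformat_network_init();

        AVFormatContext *fmt = NULL;
        avformat_open_input(&fmt, url, NULL, NULL);       // ffmpeg's mms/mmsh protocol handler is used here
        avformat_find_stream_info(fmt, NULL);

        AVCodec *dec = NULL;
        int vidx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
        AVCodecContext *cctx = fmt->streams[vidx]->codec; // old-style codec context, matches avcodec_decode_video2
        avcodec_open2(cctx, dec, NULL);

        AVFrame *frame = av_frame_alloc();
        AVPacket pkt;
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == vidx) {
                int got = 0;
                avcodec_decode_video2(cctx, frame, &got, &pkt);
                if (got) {
                    // frame->data / frame->linesize now hold the decoded picture (typically YUV420P)
                }
            }
            av_packet_unref(&pkt);
        }

        av_frame_free(&frame);
        avcodec_close(cctx);
        avformat_close_input(&fmt);
    }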
I need to extract only the I-frames from an MPEG-encoded video file (it may be an .avi or .mp4 file). Is it possible to at least identify the I-frames?
It looks like this ffmpeg enhancement can accomplish this for you, using the select filter to keep only the I-frames. You may need to build ffmpeg from source to get this capability; I'm not sure. Here is an Ubuntu guide with the details, and here is where you can get the nightly Windows binary version.
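If your ffmpeg build already has the select filter, the command-line form would be something like this (the output file pattern is just an example):

ffmpeg -i input.mp4 -vf "select='eq(pict_type,I)'" -vsync vfr iframe_%04d.png

The -vsync vfr part stops ffmpeg from duplicating frames to keep a constant frame rate, so only the selected I-frames are written out.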
EDIT: OpenCV relies on ffmpeg APIs to capture video from a file. However, the functionality you want is a bit specific, so you are going to need to write your own custom filtering module with ffmpeg's libavfilter. Here is one of their how-to guides on using libavfilter. In particular, as I mentioned above, you will probably want to use the select filter provided by libavfilter. This will filter all I-frames from the stream for you to process.
Hope that helps!