Live Stream using .m3u8 and .ts files with iPhone as server - ios
I am trying to accomplish the task of live streaming from the iPhone camera. I have done some research and found that I can use .m3u8 files for streaming live video, which should contain .ts (MPEG-2 transport stream) files.
Now, the file I have on my iPhone is an .mp4 file and it does not work with .m3u8, so I figured I will have to convert the .mp4 to .ts for that, but I have not succeeded in doing so.
I found that it is possible to convert video using the ffmpeg library, as mentioned in this article here. I have successfully imported the ffmpeg library but am not able to figure out how I can use it to convert a video, as I am using it for the first time.
One more thing: the Apple documentation says

    There are a number of hardware and software encoders that can create MPEG-2 transport streams carrying MPEG-4 video and AAC audio in real time.
What is being said here? Is there any other way I can use .mp4 files for live streaming from iOS without converting them?
Let me know if I am not clear; I can provide more information. Any suggestion is appreciated. I would like to know: am I on the right path here?
EDIT
I am adding more info to my question. Basically, what I am asking is that we can convert an .mp4 video into .ts segments using the following command:
    ffmpeg -i file.mp4 -acodec libfaac -vcodec libx264 -map 0 -f segment -segment_time 10 -segment_list test.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream%05d.ts
How can I use the ffmpeg library to do what this command does from iOS?
You can look at ffmpeg.c to see what the ffmpeg command-line tool uses from the ffmpeg libraries. It's a pretty complicated file, however. You may want to rename main() and compile it to an object; then you could call the new ffmpeg_main() function from your code.
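A minimal sketch of that approach, assuming ffmpeg.c's main() has been renamed to ffmpeg_main() and compiled into the app (the wrapper function, file paths, and codec choices below are illustrative, not part of the stock ffmpeg sources):

    /* Drive the renamed ffmpeg_main() with the same arguments as the
       command-line invocation above. Note that the stock ffmpeg.c calls
       exit() on fatal errors, so a real app would also patch exit_program(). */
    extern int ffmpeg_main(int argc, char **argv);

    static int segment_mp4_to_hls(const char *input_mp4,       /* e.g. ".../file.mp4"      */
                                  const char *playlist_path,   /* e.g. ".../test.m3u8"     */
                                  const char *segment_pattern) /* e.g. ".../stream%05d.ts" */
    {
        char *argv[] = {
            "ffmpeg",
            "-i",              (char *)input_mp4,
            /* libfaac/libx264 require an ffmpeg build with those encoders
               enabled; "-acodec copy -vcodec copy" is enough if the .mp4
               already holds AAC audio and H.264 video and only needs remuxing. */
            "-acodec",         "libfaac",
            "-vcodec",         "libx264",
            "-map",            "0",
            "-f",              "segment",
            "-segment_time",   "10",
            "-segment_list",   (char *)playlist_path,
            "-segment_format", "mpegts",
            "-vbsf",           "h264_mp4toannexb",
            "-flags",          "-global_header",
            (char *)segment_pattern,
            NULL
        };
        int argc = (int)(sizeof(argv) / sizeof(argv[0])) - 1;
        return ffmpeg_main(argc, argv);
    }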
Apple devices work with HLS only, so you need to convert the media files from .mp4 to .ts along with an .m3u8 playlist as the metadata file. You can use a streamer such as GStreamer to stream your .mp4 in HLS format, or the other way is to transcode on the fly, which is very slow.
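For reference, the .m3u8 playlist is just a plain-text index of the .ts segments. A minimal illustrative example (segment names and durations depend on the segmenter settings) looks like this:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    stream00000.ts
    #EXTINF:10.0,
    stream00001.ts
    #EXTINF:10.0,
    stream00002.ts
    #EXT-X-ENDLIST

For a live stream, new segment entries are appended as they are produced and the #EXT-X-ENDLIST tag is omitted until the stream ends.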
Related
How to build ffmpeg to get the smallest static library? I only want convert 3gp to mp4
I need to convert 3gp files into MP4 format files in my iOS app. I use ffmpeg to convert, but the complete ffmpeg package increased my app size by 6 MB. How do I build ffmpeg to get the smallest static library? I only need this one function: converting 3gp video into MP4 format.
How to play .mts file in iOS
Currently I'm working with a Wi-Fi camera device which is able to send videos only in .mts format. As far as I have investigated on Google, the conclusion seems to be that it is not possible to play this video on the iPhone, or maybe it is using Objective-C. Now my problem is this: there are still many paid applications that allow us to play .mts files. PlayerXtreme, for example, is able to run files in almost any video format; it currently has the following formats covered: 3gp, asf, avi, divx, dv, dat, flv, gxf, m2p, m2ts, m2v, m4v, mkv, moov, mov, mp4, mpeg, mpeg1, mpeg2, mpeg4, mpg, mpv, mt2s, mts, mxf, ogm, ogv, ps, qt, rm, rmvb, ts, vob, webm, wm, wmv (see the application on the iTunes App Store). The player I am currently trying is AVPlayer. How can I start coding to get this done in my app as well? Currently I'm trying to solve it by adding some Python scripts to my project as a build step and calling them to convert to .mp4 format. Has anyone worked with Python and Objective-C together in Xcode? Please give me some ideas.
MTS is a container format. You need to know what the video and audio codecs are, and see whether VideoToolbox and AudioToolbox (the underlying frameworks of AVPlayer) support those codecs. Assuming H264 for video and AAC or AC3 for audio, libavformat (part of ffmpeg) supports the MTS container. You can use that to demux the video and audio streams, and then use VTDecompressionSession and AudioQueueNewOutput (I think, I haven't used that API in a while).
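A rough sketch of that inspection/demux step with libavformat, just to find out which codecs the .mts file actually carries before deciding whether the iOS decoders can handle them (the file path and the older AVCodecContext-based stream fields are assumptions about the ffmpeg version in use):

    /* Open an .mts file with libavformat and report its streams.
       Error handling is minimal; the path is illustrative. */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    static void inspect_mts(const char *path)   /* e.g. "clip.mts" */
    {
        AVFormatContext *fmt = NULL;

        av_register_all();                      /* needed on older ffmpeg versions */
        if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
            return;
        if (avformat_find_stream_info(fmt, NULL) < 0) {
            avformat_close_input(&fmt);
            return;
        }

        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            AVCodecContext *cc = fmt->streams[i]->codec;   /* codecpar on newer ffmpeg */
            printf("stream %u: type=%d codec_id=%d\n", i, cc->codec_type, cc->codec_id);
            /* AV_CODEC_ID_H264 video plus AV_CODEC_ID_AAC/AC3 audio means the
               system decoders (VTDecompressionSession / AudioQueue) can take
               the demuxed packets. */
        }

        /* From here you would loop av_read_frame(fmt, &pkt) and feed video
           packets to a VTDecompressionSession and audio packets to an AudioQueue. */
        avformat_close_input(&fmt);
    }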
iOS, how to convert mp4 file into ts (transport stream) without ffmpeg?
As I described in the title, I need to convert an mp4 file to a ts file, but without using the ffmpeg library. Is it possible to do with a native API? Where should I look?
iOS Video Player FFmpeg
So I want to make an app for the iPhone that will play live mms:// video streams. I have looked around, and everywhere it says that I'll need FFmpeg in order to accomplish it. So I successfully compiled the FFmpeg libraries, but now: do I have to convert the mms:// link to an .m3u8 link? Or can I just use Apple's AVFoundation framework? Thanks!
You need libmms as well as the ffmpeg libs; however, I think the latest versions of ffmpeg have the MMS code built in, so you may not need libmms. MMS is just a streaming protocol, so the actual format is likely some MPEG variant (mp4, h264). Once you have that, you extract the frames and use ffmpeg's avcodec_decode_video2 to decode to an AVFrame, just like any other video.
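A condensed sketch of that decode loop using the older avcodec_decode_video2 API named above (the URL is illustrative and it assumes an ffmpeg build with the MMS protocol enabled):

    /* Open an mms stream and decode its video packets into AVFrames. */
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    static void play_mms(const char *url)   /* e.g. "mmsh://example.com/live" */
    {
        AVFormatContext *fmt = NULL;
        AVPacket pkt;

        av_register_all();
        avformat_network_init();

        if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
            return;

        if (avformat_find_stream_info(fmt, NULL) >= 0) {
            int vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
            AVCodecContext *dec = (vstream >= 0) ? fmt->streams[vstream]->codec : NULL;
            AVCodec *codec = dec ? avcodec_find_decoder(dec->codec_id) : NULL;

            if (codec && avcodec_open2(dec, codec, NULL) >= 0) {
                AVFrame *frame = av_frame_alloc();
                while (av_read_frame(fmt, &pkt) >= 0) {
                    if (pkt.stream_index == vstream) {
                        int got = 0;
                        avcodec_decode_video2(dec, frame, &got, &pkt);
                        if (got) {
                            /* frame now holds a decoded picture (typically YUV420P);
                               convert it and hand it to your view / OpenGL layer. */
                        }
                    }
                    av_free_packet(&pkt);   /* av_packet_unref() on newer ffmpeg */
                }
                av_frame_free(&frame);
            }
        }
        avformat_close_input(&fmt);
    }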
Containers that can hold H264 encoded video
How can I use the Matroska container to save H264 video? I have been looking for examples but have found none; maybe I am searching with the wrong parameters. Can anyone point me in the right direction? I have looked at the Matroska source, but it seems overkill to study the whole source code to accomplish this. There should be a practical way to do it.
Working with DirectShow, you could use the Matroska Muxer filter, or you could dump to an AVI and then use ffmpeg to copy the stream(s) to an MKV:

    ffmpeg -i <input_file> -vcodec copy -acodec copy output.mkv
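If you would rather do that stream copy in code than shell out to the ffmpeg binary, here is a rough remuxing sketch with libavformat (ffmpeg 3.x-style codecpar API; the function name and file names are placeholders), the in-code equivalent of the command above:

    /* Copy the streams of an input file into a Matroska container. */
    #include <libavformat/avformat.h>

    static int remux_to_mkv(const char *in_name, const char *out_name)
    {
        AVFormatContext *in = NULL, *out = NULL;
        AVPacket pkt;
        int ret;

        av_register_all();   /* unnecessary on ffmpeg 4.0+ */

        if ((ret = avformat_open_input(&in, in_name, NULL, NULL)) < 0)
            return ret;
        if ((ret = avformat_find_stream_info(in, NULL)) < 0)
            goto end;

        /* Ask for the Matroska muxer explicitly (the .mkv extension would also select it). */
        if ((ret = avformat_alloc_output_context2(&out, NULL, "matroska", out_name)) < 0)
            goto end;

        /* One output stream per input stream, codec parameters copied verbatim. */
        for (unsigned i = 0; i < in->nb_streams; i++) {
            AVStream *os = avformat_new_stream(out, NULL);
            if (!os) { ret = AVERROR(ENOMEM); goto end; }
            if ((ret = avcodec_parameters_copy(os->codecpar, in->streams[i]->codecpar)) < 0)
                goto end;
            os->codecpar->codec_tag = 0;
        }

        if ((ret = avio_open(&out->pb, out_name, AVIO_FLAG_WRITE)) < 0)
            goto end;
        if ((ret = avformat_write_header(out, NULL)) < 0)
            goto end;

        /* Copy packets, rescaling timestamps to the output stream time base. */
        while (av_read_frame(in, &pkt) >= 0) {
            AVStream *is = in->streams[pkt.stream_index];
            AVStream *os = out->streams[pkt.stream_index];
            av_packet_rescale_ts(&pkt, is->time_base, os->time_base);
            pkt.pos = -1;
            ret = av_interleaved_write_frame(out, &pkt);
            av_packet_unref(&pkt);
            if (ret < 0) break;
        }
        av_write_trailer(out);

    end:
        avformat_close_input(&in);
        if (out) {
            if (out->pb) avio_closep(&out->pb);
            avformat_free_context(out);
        }
        return ret;
    }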