Best way to convert RTMP to MP4 - iOS

I need to play an RTMP stream on a website.
I want it to work on iOS devices, but I know RTMP is not supported there.
So my idea is to convert the RTMP stream to some file format (MP4, for example), send it to the user, and play it with HTML5. Do you have a better idea for doing this? I have PHP 5 on the server, but I have no idea how to convert continuous (streaming) data into HTML5-compatible audio. For now it's only an idea I have...
Thanks in advance.

There is currently no live format that works with HTML5 everywhere. MP4 cannot be used as a live source, as it is not a streaming format. The only built-in support for live audio/video on iOS is HLS (HTTP Live Streaming).
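If you control the server, one common approach (a rough sketch only, with placeholder URLs and paths) is to pull the RTMP feed with ffmpeg and repackage it as HLS that Safari on iOS can play natively:

    ffmpeg -i rtmp://example.com/live/mystream \
           -c:v copy -c:a aac \
           -f hls -hls_time 10 -hls_list_size 6 \
           /var/www/html/stream/playlist.m3u8

The resulting playlist.m3u8 can then be referenced from a plain HTML5 video or audio element (for an audio-only feed, add -vn and keep just the AAC audio). Whether -c:v copy is enough or you need a re-encode depends on what codecs are inside the RTMP feed.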

Related

Re-stream a live video from iPhone to a server

I am getting a video feed in my app from a drone. The drone's SDK delivers the video to my app as Data or NSData. I want to stream (or divert) the same feed to a server (for example, a Wowza server). These two things should happen simultaneously.
You can use the ffmpeg library for restreaming it.
There are sample ffmpeg projects in Swift and Objective-C.
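I haven't tried this against the DJI SDK myself, but assuming the SDK is handing you a raw H.264 elementary stream, the restreaming step itself is the kind of thing ffmpeg does with a one-liner on a server (host and stream names below are placeholders):

    ffmpeg -f h264 -i pipe:0 -c copy -f flv rtmp://your-wowza-host:1935/live/droneFeed

Inside an iOS app you would do the equivalent through the libavformat API: open an flv output context pointed at the rtmp:// URL and write the packets you receive from the SDK into it.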
I've been trying to send the video to a streaming service like YouTube for a year. The video coming into the iPad from the controller (connected to a Phantom 3 Advanced) is in H.263 format. ffmpeg is best at bulk transcoding, not at a streaming environment. I tried https://github.com/LaiFengiOS/LFLiveKit, but it has bugs; it's a wrapper around ffmpeg that knows how to do RTMP.
The DJI Go app knows how to do this. I've asked in the forum for help or a code sample, but they won't help. Bottom line: I can find no way to stream the video from the drone to a streaming service like YouTube or Wowza. I wish Wowza could accept H.263 natively, but their site lists only H.264.
So I can't give you an answer, but I can give you what I've figured out over the last year.

How to encode and broadcast video for a live event on YouTube programmatically

I want to encode and upload video to YouTube for a live event, the same way the Wirecast software does, but I cannot find how to do that on the YouTube API site. Please help me.
You'll have to use a Flash media encoder and an RTMP streamer. I used the ffmpeg libraries for both and it worked fine.
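For reference, and treating it as a sketch rather than a verified recipe, the RTMP push side with ffmpeg looks roughly like this (the ingest URL and stream key come from the YouTube Live dashboard; the input file name is a placeholder):

    ffmpeg -re -i event.mp4 \
           -c:v libx264 -preset veryfast -b:v 3000k -g 60 \
           -c:a aac -b:a 128k \
           -f flv rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY

Creating and scheduling the broadcast itself is done through the YouTube Live Streaming API (the liveBroadcasts and liveStreams resources), which is the part Wirecast automates for you.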

How to play a non-RTMP stream

I need to stream a video conference to many mobile devices (Android, iPhone, etc.), and I'm having trouble with the streaming.
An RTMP stream relies on Flash and doesn't work on these devices.
What kind of stream can I use, and how can I convert RTMP to that supported stream?
For mobile devices you would need to convert your video to HTTP Live Streaming (HLS). If it's truly a live stream, you could use something like Zencoder Live Streaming to convert it. If the RTMP stream is just reading from an MP4, you could use Handbrake or Zencoder to transmux it to HLS.
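If you want to do the transmux yourself instead of going through Zencoder or Handbrake, ffmpeg can do it as well; as a sketch, with placeholder file names:

    ffmpeg -i source.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 output.m3u8

The -c copy means the existing H.264/AAC data is only repackaged into segments, not re-encoded, which is what transmuxing is.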

RTSP to HTTP Live Streaming (iOS)

I have an RTSP feed coming from my IP camera as the source input and need to publish it to HTTP Live Streaming (HLS) for playback on iOS devices. I have already tried RTSP-to-RTSP on iOS and it worked; this time I want to try RTSP-to-HLS. I already have Wowza and ffmpeg installed, but I just don't know what commands I should run to produce HLS. I've googled for the right commands but couldn't find them. What should I run?
Safari supports HLS. Does that mean UIWebView supports HLS as well?
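As for the original question about commands: a common pattern (sketched here with placeholder addresses, since the exact Wowza application name depends on your configuration) is to let ffmpeg pull the RTSP feed and push it into Wowza over RTMP, and let Wowza do the HLS packaging:

    ffmpeg -i rtsp://192.168.1.10/stream1 -c:v copy -c:a aac -f flv rtmp://wowza-host:1935/live/camera1

Wowza then exposes the stream as HLS at a URL along the lines of http://wowza-host:1935/live/camera1/playlist.m3u8. Alternatively, ffmpeg's own hls muxer (-f hls) can write the .m3u8 and .ts segments directly to a web-served directory without Wowza in the path.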

How to display RTSP from IP Camera/CCTV in iOS

There is obviously a way to do this, because so many applications are already doing it: NetCamViewer and iCamviewer, to name just two.
I have searched and searched, but I'm not finding anything of value that gives a hint as to how this is done. I'm reaching out hoping that someone will give me a clue.
I'm trying to connect to a video security camera (Y-CAM), which supports the RTSP protocol, and display the video from my iPhone/iPad application. The camera has an IP address and I can view the video from a web browser and from QuickTime running on my Mac. The problem is that RTSP is not supported on iOS, so even trying to connect using Safari on an iPad doesn't work.
I've read that some are trying to use live555, but I haven't seen an article that describes whether it has been done successfully and how.
An alternative is to capture the RTSP stream on a server, convert it to an HTTP Live stream and then connect to the HTTP Live stream from iOS. Unfortunately, this hasn't proved as easy as it sounds.
I'd prefer to go directly to the camera like the other applications I've seen do. The RTSP-to-HLS route is a fallback if I have to.
Any hints are greatly appreciated. Thanks!
This part is wrong :) or rather not necessary: "An alternative is to capture the RTSP stream on a server, convert it to an HTTP Live stream and then connect to the HTTP Live stream from iOS. Unfortunately, this hasn't proved as easy as it sounds."
You should use the ffmpeg library, as it can connect to almost any streaming server (RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; UIImage also works).
First of all, use avformat_open_input to connect to your camera's IP address.
Then use avcodec_find_decoder and avcodec_open2 to find the codecs and open them (you should call them for both audio and video).
Then, in a while loop, read packets from the server using av_read_frame.
When you get a frame, if it is audio, send it to an AudioUnit or AudioQueue;
if it is video, convert it from YUV to RGB using sws_scale and draw the picture to the screen.
That's all.
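A compressed sketch of that loop in C (the URL is a placeholder, error handling and the audio path are omitted, and it uses the newer send/receive decoding calls rather than avcodec_decode_video2, but the structure is the same as described above):

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>
    #include <libavutil/imgutils.h>

    void play_stream(const char *url) {   /* e.g. "rtsp://192.168.1.10/stream1" (placeholder) */
        avformat_network_init();

        /* 1. connect to the camera / server */
        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return;
        avformat_find_stream_info(fmt, NULL);

        /* 2. find and open the video decoder (do the same for the audio stream) */
        int vid = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        const AVCodec *dec = avcodec_find_decoder(fmt->streams[vid]->codecpar->codec_id);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vid]->codecpar);
        avcodec_open2(ctx, dec, NULL);

        /* YUV -> RGBA converter for drawing with OpenGL ES or UIImage */
        struct SwsContext *sws = sws_getContext(ctx->width, ctx->height, ctx->pix_fmt,
                                                ctx->width, ctx->height, AV_PIX_FMT_RGBA,
                                                SWS_BILINEAR, NULL, NULL, NULL);
        AVFrame *frame = av_frame_alloc();
        uint8_t *rgba[4]; int rgba_stride[4];
        av_image_alloc(rgba, rgba_stride, ctx->width, ctx->height, AV_PIX_FMT_RGBA, 1);

        /* 3. read packets from the server in a loop */
        AVPacket *pkt = av_packet_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vid) {
                avcodec_send_packet(ctx, pkt);
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    /* 4. convert YUV -> RGBA; rgba[0] is what you hand to the rendering code */
                    sws_scale(sws, (const uint8_t * const *)frame->data, frame->linesize,
                              0, ctx->height, rgba, rgba_stride);
                    /* draw_frame(rgba[0], rgba_stride[0], ctx->width, ctx->height);  -- your code */
                }
            }
            /* audio packets would be queued to an AudioQueue / AudioUnit here instead */
            av_packet_unref(pkt);
        }

        av_packet_free(&pkt);
        av_freep(&rgba[0]);
        av_frame_free(&frame);
        sws_freeContext(sws);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
    }

On iOS you would run this loop on a background thread and hand each converted buffer to the rendering layer (OpenGL ES texture upload or a UIImage) on the main thread.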
Also look at this wrapper (http://www.videostreamsdk.com); it's written on top of the ffmpeg library and supports iOS.
You really need to search Stack Overflow before posting; this question has been asked many times. Yes, live555 sort of works, and some of us have gotten it to work.
There are other players too, including ours http://www.streammore.tv/
You can find an open-source FFmpeg decoder for iOS (and some samples) on GitHub: https://github.com/mooncatventures-group
Sample use of this library: http://sol3.typepad.com/exotic_particles/
There are two general technologies for displaying RTSP video in iOS Safari:
RTSP / HLS (H.264+AAC)
RTSP / Websocket (H.264+AAC ==> MPEG+G.711 or H.264+?)
For HLS you can consider Wowza server.
For Websocket playback in iOS Safari you can use WCS4 server.
The main idea of WebSocket playback is direct HTML5 rendering to an HTML page Canvas element and an audio context. In the case of MPEG playback, video decoding is done on the iOS Safari side using plain JavaScript.
Another option is to install a WebRTC plugin with getUserMedia support and play the stream via WebRTC. In that case you will still need a server-side RTSP-to-WebRTC transcoder.
