Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (high quality is not required) from an iOS device to a remote PC, one way only: the iOS device will send a video stream to the server/PC, never the opposite.
After some googling and documentation browsing, it appears there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad does the streaming. From what appears on Apple's website, I understand that HLS is meant to be encoded on the server side and decoded on the iOS side. As for RTMP, most libraries that allow streaming from iOS have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoder libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/library, preferably open-source and free, that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
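To make question 2 concrete, the capture side I have in mind looks roughly like this (a rough sketch; encodeAndSendFrame: and the captureSession property are hypothetical placeholders for whatever compression/upload step would follow):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)startCapture
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium; // no high quality needed

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // Deliver raw frames to a delegate callback on a background queue.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("camera.frames", NULL)];
    [session addOutput:output];

    [session startRunning];
    self.captureSession = session; // keep a strong reference (hypothetical property)
}

// Called once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Compressing and sending each frame individually over HTTP would work,
    // but a real encoder (H.264 into RTMP or TS segments) uses far less bandwidth.
    [self encodeAndSendFrame:sampleBuffer]; // hypothetical
}
```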
I thank you very much in advance, dear friends.
Mehdi.

I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a Camera/Mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
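For reference, bringing up a session with that API looks roughly like this (a minimal sketch based on the VCSimpleSession.h header linked above; check the header for the exact initializer and delegate signatures, and substitute your own RTMP endpoint and stream key):

```objc
#import "VCSimpleSession.h"

VCSimpleSession *session =
    [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720)
                                     frameRate:30
                                       bitrate:1000000];

// Show the local camera preview.
[self.view addSubview:session.previewView];
session.previewView.frame = self.view.bounds;

// "rtmp://example.com/live" and "myStreamKey" are placeholders.
[session startRtmpSessionWithURL:@"rtmp://example.com/live"
                    andStreamKey:@"myStreamKey"];

// ... later, when done: [session endRtmpSession];
```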

Related

Stream video from camera to iPhone

I'm working on an app that connects to a security camera. The camera has its own SIP server (Asterisk).
I'm having a very hard time finding a reliable iOS library to connect to the camera.
Can anyone recommend a high-quality SIP library that will stream video? I've tried several so far and none of them are fit for the task (I don't want to mention them by name).
Or is there another way to access the video (using webRTC or possibly AVFoundation via the Asterisk server)?
I do not have a lot of experience with hardware, so I'm a bit lost.
What you are looking for is called an MCU (Multipoint Control Unit). There are some free ones available for video, but all are in early beta and very hard to set up.

Streaming live video from iOS using HTTP

I have read about HTTP Live Streaming from Apple. So far I understand that it was created for streaming video to iOS devices. But is it possible to use this approach to stream from an iOS device while recording with the camera? If so, can you give me a clue or tell me how to do it?
It is possible, yes. But I would not recommend it. HLS is a pull-based protocol: good for delivering to clients, but bad for ingesting. You would need to run a web server on the device and package the media into a transport stream. It's a lot of effort that is easier to handle on the server.
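To see why it is pull-based: an HLS client simply re-fetches a small playlist like the illustrative one below every few seconds, then pulls each listed transport-stream segment over plain HTTP, and that is exactly what the device itself would have to keep serving (segment names and durations here are made up):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:10.0,
segment120.ts
#EXTINF:10.0,
segment121.ts
#EXTINF:10.0,
segment122.ts
```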

Streaming HLS to iOS devices using Windows Azure Media Services

I have been doing some exploring of Azure Media Services, specifically with media converted to HLS. I walked through creating HLS content using a process similar to the one outlined in this HLS Walkthrough.
Now that I have my HLS content in Azure, I am hoping to stream it just as you would any m3u8 stream. I have tried the following:
WebView on iPad – works, but it's jumpy and not very smooth
Safari on OS X – does not work at all
VLC Player – does not work at all.
Granted, this is neither exhaustive nor thorough (yet), but before I continue I wanted to get feedback if anyone has any. I stumbled upon the WAMS Media Player for iOS, i.e. the Smooth Streaming player for iOS. Is the expectation here that the Smooth Streaming player developed for iOS is the best way to consume HLS-generated media from WAMS?
As I understand it, Safari's support (or lack thereof) for HLS depends on QuickTime -- there are versions of QuickTime that do support HLS (QuickTime Pro), but by default, the support is not there.
I suggest you transcode to both Smooth Streaming and HLS. Serve Smooth Streaming via Flash or Silverlight to Windows/Mac clients and HLS via HTML5 to iOS clients.
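On the iOS side, note that HLS also plays natively, so you don't strictly need a special player there. A minimal sketch (the playlist URL is a placeholder for your Azure origin URL):

```objc
#import <AVFoundation/AVFoundation.h>

NSURL *playlistURL =
    [NSURL URLWithString:@"http://example-origin.cloudapp.net/stream.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:playlistURL];

// Attach the player to a view and start playback.
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = self.view.bounds;
[self.view.layer addSublayer:layer];
[player play];
```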

How to display RTSP from IP Camera/CCTV in iOS

There is obviously a way to do this because so many applications are already doing it - NetCamViewer and iCamviewer to name just two.
I have searched and searched, but I'm not finding anything of value that gives a hint as to how this is done. I'm reaching out hoping that someone will give me a clue.
I'm trying to connect to a video security camera (Y-CAM), which supports the RTSP protocol, and display the video in my iPhone/iPad application. The camera has an IP address and I can view the video from a web browser and from QuickTime running on my Mac. The problem is that RTSP is not supported on iOS, so even trying to connect using Safari on an iPad doesn't work.
I've read that some are trying to use live555, but I haven't seen an article that describes whether it has been done successfully and how.
An alternative is to capture the RTSP stream on a server, convert it to an HTTP Live stream and then connect to the HTTP Live stream from iOS. Unfortunately, this hasn't proved as easy as it sounds.
I'd prefer to go directly to the camera like the other applications I've seen do; the RTSP-to-HLS conversion is a fallback if I have to.
Any hints are greatly appreciated. Thanks!
This is wrong :) or at least not necessary (referring to the suggestion of capturing the RTSP stream on a server and converting it to an HTTP Live stream).
You should use the ffmpeg library, as it can connect to almost any streaming server (RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; UIImage also works).
First of all, use avformat_open_input to connect to your IP address.
Then use avcodec_find_decoder and avcodec_open2 to find and open the codecs (you should call them for both audio and video).
Then, in a while loop, read packets from the server using av_read_frame.
When you get a frame: if it is audio, send it to an AudioUnit or AudioQueue;
if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
That's all; a sketch of the whole loop follows below.
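Put together, the steps above look roughly like this (a sketch against the ffmpeg 2.x-era C API, with error handling omitted and a placeholder RTSP URL):

```c
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

static void runDecodeLoop(const char *url) // e.g. "rtsp://camera-ip/stream"
{
    av_register_all();
    avformat_network_init();

    // 1. Connect to the camera.
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return;
    if (avformat_find_stream_info(fmt, NULL) < 0) return;

    // 2. Find the video stream, then find and open its decoder.
    int videoIdx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    AVCodecContext *ctx = fmt->streams[videoIdx]->codec;
    AVCodec *codec = avcodec_find_decoder(ctx->codec_id);
    avcodec_open2(ctx, codec, NULL);

    AVFrame *frame = av_frame_alloc();
    struct SwsContext *sws = NULL;
    AVPacket pkt;

    // 3. Read packets from the server in a loop.
    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.stream_index == videoIdx) {
            int gotFrame = 0;
            avcodec_decode_video2(ctx, frame, &gotFrame, &pkt);
            if (gotFrame) {
                // 4. Convert YUV to RGB with sws_scale, then hand the RGB
                //    buffer to OpenGL ES or wrap it in a UIImage to draw it.
                if (!sws)
                    sws = sws_getContext(ctx->width, ctx->height, ctx->pix_fmt,
                                         ctx->width, ctx->height, AV_PIX_FMT_RGB24,
                                         SWS_BILINEAR, NULL, NULL, NULL);
                // sws_scale(sws, (const uint8_t * const *)frame->data,
                //           frame->linesize, 0, ctx->height, rgbData, rgbLinesize);
            }
        }
        // (An audio stream would be decoded the same way and handed to an
        //  AudioUnit or AudioQueue.)
        av_free_packet(&pkt);
    }

    av_frame_free(&frame);
    if (sws) sws_freeContext(sws);
    avformat_close_input(&fmt);
}
```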
Also look at this wrapper (http://www.videostreamsdk.com); it's written on top of the ffmpeg library and supports iOS.
You really need to search Stack Overflow before posting; this question has been asked many times. Yes, live555 sort of works, and some of us have gotten it to work.
There are other players too, including ours http://www.streammore.tv/
You can find an open-source FFmpeg decoder for iOS (and some samples) on GitHub: https://github.com/mooncatventures-group
Sample use of this library: http://sol3.typepad.com/exotic_particles/
There are two general technologies for displaying RTSP video in iOS Safari:
RTSP / HLS (H.264+AAC)
RTSP / Websocket (H.264+AAC ==> MPEG+G.711 or H.264+?)
For HLS you can consider the Wowza server.
For WebSocket playback in iOS Safari you can use the WCS4 server.
The main idea behind WebSocket playback is direct HTML5 rendering to an HTML page's Canvas element and audio context. In the case of MPEG playback, video decoding is done on the iOS Safari side in plain JavaScript.
Another option: install a WebRTC plugin with getUserMedia support and play the stream via WebRTC. Either way, you will need a server-side RTSP-to-WebRTC transcoder in that case.

Capture video on iOS device and live stream it to a server (or another mobile)

I want to be able to record footage using my iOS device and stream it directly to a server.
There are quite a few articles on S.O. that talk about this, but I'm not sure any of them have answered the question very well.
Should I be using HTTP Live Streaming, or is this just for sending data to an iPhone?
Should I be using AVCaptureSession to grab the video (a segment at a time?), sending each segment to the server?
Should I be using AVCaptureVideoDataOutput and ffmpeg for streaming?
I'm a little lost with all this, so any sample code or docs or links would be really appreciated.
Thanks for your help guys.
Duncan
You have to choose a network protocol for that purpose and find an appropriate media server to receive and process the stream. If the RTMP protocol is OK for your project, check the angl library, which supports RTMP streaming from iOS. Currently it's compatible with iOS 6 and 7.
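To make the "a segment at a time" option from the question concrete, here is a rough sketch (self.movieOutput, self.segmentIndex and uploadSegmentAtURL: are hypothetical placeholders; note that stopping and restarting AVCaptureMovieFileOutput leaves small gaps between segments, so gapless segmenting would need AVAssetWriter instead):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)recordNextSegment
{
    NSURL *fileURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:
            [NSString stringWithFormat:@"segment-%d.mov", self.segmentIndex++]]];
    [self.movieOutput startRecordingToOutputFileURL:fileURL
                                  recordingDelegate:self];

    // Stop after ~5 seconds; the delegate callback uploads and re-arms.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [self.movieOutput stopRecording];
    });
}

// Called when a segment file has been fully written.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    [self uploadSegmentAtURL:outputFileURL]; // hypothetical upload helper
    [self recordNextSegment];
}
```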
