Streaming live video from iOS using HTTP

I have read about HTTP Live Streaming from Apple. So far I understand that it was created for streaming video to iOS devices. But is it possible to use this approach to stream from an iOS device while recording with the camera? If so, can you give me a clue or tell me how to do it?

It is possible, yes, but I would not recommend it. HLS is a pull-based protocol: good for delivering to clients, but bad for ingesting. You would need to run a web server on the device and package the media into an MPEG transport stream. It's a lot of effort that is easier to handle on the server.
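To make the effort concrete: the device would have to serve something like the following live media playlist over HTTP, rewrite it continually as new MPEG-TS segments are produced, and host the segments themselves, while every viewer polls it (segment names and durations here are illustrative):
```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:4.0,
segment120.ts
#EXTINF:4.0,
segment121.ts
#EXTINF:4.0,
segment122.ts
```
Pushing frames out over a protocol designed for ingest, such as RTMP or RTP, avoids hosting any of this on the phone.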

Related

iOS requirement for live streaming

I'm working on a live streaming app like Periscope and doing research on the requirements and restrictions on iOS.
I found out that Apple only allows HLS (HTTP Live Streaming) under certain conditions. I found the conditions below on Apple's site:
"If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming." (https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW5)
But I'm not sure whether HLS has to be used for both publishing and watching video, or whether using it only for watching is acceptable, because I'm thinking of using RTMP for publishing and HLS for watching.
I wrote an app similar to Periscope that is on the App Store now; it can use 2 Mbps and connects via the RTMP protocol to send the data. So my guess is they no longer enforce it. I also believe that at the time that guideline was written, cell-network load was possibly too high, and they were hoping HLS would help with that. Now with 4G LTE the networks can handle the load a little better. Again, that is just a guess. My app went up with no problem or mention of that rule, and the review team was more than aware of what the app did.

iOS RTP live audio receiving

I'm trying to receive a live RTP audio stream on my iPhone, but I don't know how to start. I'm looking for some samples, but I can't find them anywhere.
I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or a-law. This app works as an audio server that serves any incoming connection with that stream. I should say that I've developed an Android app that receives the stream and it works, so I want to replicate this functionality on iOS. In Android we have the "android.net.rtp" package to manage this and transmit or receive data streams over the network.
Is there any equivalent package for iOS to implement this? Could you give me any kind of reference or sample to do this, or just tell me where to start?
You can look at the HTTPLiveStreaming library, but its protocol may not be a standard one. You can check my fork, aelam/HTTPLiveStreaming-1; I'm still working on it, but its output can already be played by ffplay. Give it a try.
Check the file rtp.c in FFmpeg; I think it will help you out.
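There is no direct equivalent of android.net.rtp in the iOS SDK, so you end up assembling it from lower-level pieces yourself. Below is a minimal, untested sketch assuming the server sends plain G.711 µ-law over RTP/UDP (payload type 0, 8 kHz mono); the RTPAudioReceiver name is made up for illustration, and it ignores payload type, sequence numbers, and jitter:
```swift
import AVFoundation
import Network

// Decode one G.711 mu-law byte into a linear PCM sample (classic Sun g711.c algorithm).
func muLawToLinear(_ byte: UInt8) -> Int16 {
    let u = ~byte                                  // mu-law bytes are stored inverted
    var t: Int16 = (Int16(u & 0x0F) << 3) + 0x84   // mantissa plus bias
    t <<= (u & 0x70) >> 4                          // apply the segment (exponent)
    return (u & 0x80) != 0 ? (0x84 - t) : (t - 0x84)
}

final class RTPAudioReceiver {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // G.711 is 8 kHz mono; the engine resamples on the way to the mixer.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 8000, channels: 1)!
    private var listener: NWListener?

    func start(port: NWEndpoint.Port) throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()

        listener = try NWListener(using: .udp, on: port)
        listener?.newConnectionHandler = { [weak self] connection in
            connection.start(queue: .main)
            self?.receive(on: connection)
        }
        listener?.start(queue: .main)
    }

    private func receive(on connection: NWConnection) {
        connection.receiveMessage { [weak self] data, _, _, _ in
            // Skip the 12-byte fixed RTP header; a real client would also
            // validate the header and reorder packets by sequence number.
            if let data = data, data.count > 12 {
                self?.schedule(payload: data.dropFirst(12))
            }
            self?.receive(on: connection)
        }
    }

    private func schedule(payload: Data) {
        let frames = AVAudioFrameCount(payload.count)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
        buffer.frameLength = frames
        let out = buffer.floatChannelData![0]
        for (i, byte) in payload.enumerated() {
            out[i] = Float(muLawToLinear(byte)) / 32768.0
        }
        player.scheduleBuffer(buffer)
    }
}
```
Keep a strong reference to the receiver for as long as you listen, e.g. `try receiver.start(port: 5004)` with 5004 being whatever port your server streams to.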

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC in one direction, i.e., the iOS device will send a video stream to the server/PC but not the opposite.
After some googling and documentation browsing, it appears that there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is to be used from an encoding perspective server-side, and a decoding perspective iOS-side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/Library preferably open and free that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame, and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
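For context, a session with that API boils down to a few lines. This is an untested sketch based on the method names declared in VCSimpleSession.h; the RTMP URL and stream key are placeholders, the startBroadcast function is illustrative, and the exact Swift bridging of the Objective-C names may differ:
```swift
// Assumes the VideoCore pod is installed and exposed to Swift (e.g. via a bridging header).
import UIKit

func startBroadcast(in viewController: UIViewController) -> VCSimpleSession {
    let session = VCSimpleSession(videoSize: CGSize(width: 1280, height: 720),
                                  frameRate: 30,
                                  bitrate: 1_000_000,
                                  useInterfaceOrientation: false)

    // Show the live camera preview the session provides.
    session.previewView.frame = viewController.view.bounds
    viewController.view.addSubview(session.previewView)

    // Placeholder endpoint: point this at your own RTMP ingest server.
    session.startRtmpSession(withURL: "rtmp://example.com/live",
                             andStreamKey: "myStream")
    return session  // keep a strong reference for as long as you stream
}
```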

Capture video on iOS device and live stream it to a server (or another mobile)

I want to be able to record footage using my iOS device and stream it directly to a server.
There are quite a few articles on S.O. that talk about this, but I'm not sure any have answered the question very well.
Should I be using HTTP Live Streaming, or is this just for sending data to an iPhone?
Should I be using AVCaptureSession to grab the video (a segment at a time?), sending each segment to the server?
Should I be using AVCaptureVideoDataOutput and ffmpeg for streaming?
I'm a little lost with all this, so any sample code or docs or links would be really appreciated.
Thanks for your help, guys.
Duncan
You have to choose a network protocol for that purpose and find an appropriate media server to receive and process the stream. If the RTMP format is OK for your project, check the angl library, which supports RTMP streaming from iOS. Currently it's compatible with iOS 6 and 7.
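To the AVCaptureVideoDataOutput question above: yes, that is the usual starting point regardless of protocol. The delegate receives one CMSampleBuffer per frame, which you then hand to an encoder. A minimal sketch (the FrameGrabber name is illustrative; error handling and camera-permission checks are omitted):
```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        session.sessionPreset = .medium        // "no high quality required"
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per captured frame on the delegate queue.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Compress and send pixelBuffer here (VideoToolbox, an RTMP library, etc.);
        // raw frames are far too large to ship over the network as-is.
        _ = pixelBuffer
    }
}
```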

Sending video by a hand-made protocol

I'm going to develop an application whose main task will be to send video frames captured from the device camera to a server. The server uses a protocol over TCP. I heard that Apple restricts developers from using any video streaming protocols except HTTP Live Streaming. Is this information correct? Will there be any problems when my app is reviewed for the App Store?
After some digging into the topic, I found some info in another topic. And for video broadcasting from the device, we cannot use HLS.
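Leaving the review question aside, the technical side of a hand-made protocol over TCP is simple. A minimal sketch using the Network framework; the FrameSender name, the host and port, and the 4-byte length-prefix framing are all assumptions to adapt to your server:
```swift
import Foundation
import Network

// Sends length-prefixed frames over a raw TCP connection. The big-endian
// UInt32 length prefix lets the server split the byte stream back into frames.
final class FrameSender {
    private let connection: NWConnection

    init(host: String, port: NWEndpoint.Port) {
        connection = NWConnection(host: NWEndpoint.Host(host), port: port, using: .tcp)
        connection.start(queue: .global())
    }

    func send(frame: Data) {
        var length = UInt32(frame.count).bigEndian
        var packet = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
        packet.append(frame)
        connection.send(content: packet, completion: .contentProcessed { error in
            if let error = error { print("send failed: \(error)") }
        })
    }
}
```
Usage would be something like `FrameSender(host: "192.0.2.10", port: 9000).send(frame: encodedFrame)`, where the address is a placeholder and encodedFrame is an already compressed frame; sending raw camera frames over cellular would not be practical.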
