How to stream live video from the camera in iOS [closed]

How can I stream live video from a camera to an iOS application?
What are the possible ways to stream video from the camera to the cloud and view it in an iOS application?

It depends on what your camera supports. There are two common ways to implement live streaming in an iOS application.
HTTP Live Streaming: HTTP Live Streaming (also known as HLS) is an HTTP-based media streaming protocol that scales well and adapts to different network conditions. HLS works by breaking a video into a sequence of small HTTP file downloads, each download loading one short chunk of the stream.
Reference links:
https://developer.apple.com/streaming/
https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html
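As a minimal playback sketch, assuming your server already publishes an .m3u8 playlist (the URL below is a placeholder), an HLS stream can be played with AVPlayer:

```swift
import AVFoundation
import AVKit
import UIKit

final class LivePlayerViewController: UIViewController {
    func playLiveStream() {
        // Placeholder URL: replace with the .m3u8 playlist your server publishes.
        guard let url = URL(string: "https://example.com/live/stream.m3u8") else { return }

        // AVPlayer supports HLS natively: it fetches the playlist, then downloads
        // and plays the short media segments it references.
        let player = AVPlayer(url: url)
        let playerController = AVPlayerViewController()
        playerController.player = player

        present(playerController, animated: true) {
            player.play()
        }
    }
}
```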
RTSP Streaming: The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. The protocol is used to establish and control media sessions between endpoints. There are third-party libraries that can be used for RTSP streaming, such as FFmpeg and VideoKit.
Reference links:
https://www.ffmpeg.org
https://iosvideokit.com
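Playing an RTSP stream requires a third-party player. As a hedged sketch using MobileVLCKit (another third-party option alongside FFmpeg and VideoKit; the URL is a placeholder and the exact API may vary between library versions):

```swift
import UIKit
import MobileVLCKit  // third-party pod, not part of the iOS SDK

final class RTSPPlayerViewController: UIViewController {
    private let mediaPlayer = VLCMediaPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Placeholder RTSP URL: replace with your camera's stream address.
        guard let url = URL(string: "rtsp://192.168.1.10:554/stream1") else { return }

        mediaPlayer.drawable = view            // render the decoded video into this view
        mediaPlayer.media = VLCMedia(url: url) // RTSP demuxing/decoding handled by the library
        mediaPlayer.play()
    }
}
```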

There are several platforms out there where you can record, distribute and share your stream. Some of them are:
Ustream
Livestream
YouTube
YouTube in particular has extensive documentation on which apps to download, how to set up your encoder and how to distribute your playback URL. Their documentation is here: https://support.google.com/youtube/answer/2853700
Facebook
Facebook is more suited to distribution within the Facebook network only, so if you want to distribute your live stream publicly, you should probably go with one of the others.
There are certainly other platforms as well, but these are the ones that come to mind.

Related

Is it possible to use the YouTube Live Stream API to broadcast through my phone camera?

I want to create a basic app that allows users to start broadcasting video from their phone camera (front or back) just by pressing a button.
Does the YouTube Live Streaming API allow me to handle the video streaming process?
If so, is the YouTube Live Streaming API completely free of charge, or will I be asked to pay once I reach a certain amount of usage?
Creating a live event and live broadcast is language- and hardware-agnostic: just use YouTube's Live Streaming HTTP API. Read through the Core Concepts and Life of a Broadcast guides.
Your flow might look something like this:
Authenticate the user.
Set up and schedule your Live Broadcast object.
Start your video encoder and create a Live Stream object.
Bind your Live Stream to your Live Broadcast.
Test to verify your video is going through.
Set your Live Broadcast to Live.
At the conclusion of your event, set your Live Broadcast to Ended.
Note that setting up your encoder is on you. Asking "How do I create an RTMP or DASH video encoder for [hardware or software]" is too broad of a question for Stack Overflow.
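As a hedged sketch of step 2, the Live Broadcast can be scheduled with a plain HTTPS call to the liveBroadcasts endpoint; the access token, title and start time below are placeholders and error handling is omitted:

```swift
import Foundation

/// Schedules a YouTube Live Broadcast via the Data API v3 (liveBroadcasts.insert).
/// `accessToken` is assumed to come from your own OAuth 2.0 flow.
func scheduleBroadcast(accessToken: String) {
    var components = URLComponents(string: "https://www.googleapis.com/youtube/v3/liveBroadcasts")!
    components.queryItems = [URLQueryItem(name: "part", value: "snippet,status")]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "snippet": [
            "title": "My first broadcast",                    // placeholder title
            "scheduledStartTime": "2030-01-01T00:00:00.000Z"  // placeholder start time
        ],
        "status": ["privacyStatus": "private"]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // The response includes the broadcast id you later bind to a Live Stream object.
        if let data = data, let text = String(data: data, encoding: .utf8) {
            print(text)
        }
    }.resume()
}
```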
The YouTube API is free to use within a specific quota. If you hit that quota limit, there are ways to request additional quota from Google (potentially for a fee).
I answered a similar question about integrating with YouTube's Live Streaming API on iOS here: YouTube live on iOS?

What is the major role of a Streaming Media Server?

I am new to live streaming. I have been exploring the web to learn how to live stream video. I am an iOS developer and I want to develop an app that streams video.
I am clear about the fundamentals of live video streaming. I came to know that I will need a streaming media server that feeds the stream to the viewer. I also came to know that the viewer has to have a player that decodes the data and synchronizes the audio/video streams.
Wowza is one kind of streaming media server that is recommended. But I have the following questions:
(1) Why a media server? Why can't we run our own server? What does a media server actually do that makes its role necessary?
(2) In my app, I will have to integrate a library for encoding and feed the result to a streaming server like Wowza. But how would it be fed to the streaming server?
(3) How will my server communicate with a streaming server like Wowza?
(4) How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and wants to see the live stream?
(5) What should be on the receiving side? What will decode the stream and play it in AVPlayer?
I need to develop a streaming app with good quality, so I would rather understand the data flow first and then start.
It would be great if someone gave a graphical representation of the data flow.
Thanks a lot in advance!
Let me quickly add my understanding of your questions:
1a. Why a media server?
You could write your own software for distributing the stream data to all the players as well, but in that case you would need to implement various transport protocols and you would end up building a fairly big piece of software: your own home-grown media server.
1b. What does a media server actually do that makes its role necessary?
One way to see the role of the media server is this: it receives the live stream from a source and handles the distribution of that stream to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally the media server can change the way the video or audio is encoded (transcoding), or produce streams in different resolutions and qualities and provide the players with the list of available qualities in the form of a manifest file (e.g. an m3u8 or smil file) so that they can do so-called adaptive streaming.
Another typical use case for media servers is serving non-live video files to players from disk, as well as recording live streams, and so on. If you look at the feature list of a popular media server, you will see that they really do many things, so in practice this is something you probably want to get out of the box rather than implement yourself.
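For a concrete picture of such a manifest, here is an illustrative HLS master playlist (the bandwidths, resolutions and paths are made-up examples) listing several quality levels the player can switch between:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
high/index.m3u8
```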
2. In my app, I will have to integrate a library for encoding and feed the result to a streaming server like Wowza. But how would it be fed to the streaming server?
You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). It is best to search for tutorials to see what this looks like in code.
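A minimal sketch of the capture half of that pipeline, using plain AVFoundation; the encodeAndSend function is a hypothetical stand-in for whatever H.264/RTMP library you integrate:

```swift
import AVFoundation

final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "camera.frames")

    func start() {
        session.sessionPreset = .hd1280x720

        // Attach the back camera as the video input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Raw frames are delivered to captureOutput(_:didOutput:from:) below.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: outputQueue)
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hypothetical hand-off: compress with VideoToolbox (H.264), wrap in
        // MPEG-TS or FLV, and push over RTMP to the media server.
        encodeAndSend(sampleBuffer)
    }

    private func encodeAndSend(_ sampleBuffer: CMSampleBuffer) {
        // Placeholder for the encoder/packager integration.
    }
}
```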
3. How will my server communicate with a streaming server like Wowza?
The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.
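As a rough illustration with one open-source option, HaishinKit (not mentioned in the original answer; its API varies between versions, and the ingest URL and stream key below are placeholders), publishing the camera to an RTMP application on Wowza looks roughly like this:

```swift
import AVFoundation
import HaishinKit  // third-party pod; the API shown is approximate and version-dependent

final class RTMPPublisher {
    private let connection = RTMPConnection()
    private lazy var stream = RTMPStream(connection: connection)

    func start() {
        // Attach the microphone and back camera; the library encodes to AAC/H.264 internally.
        stream.attachAudio(AVCaptureDevice.default(for: .audio))
        stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))

        // Connect to the media server's RTMP application, then publish under a stream key.
        connection.connect("rtmp://your-wowza-host/live")  // placeholder ingest URL
        stream.publish("myStreamKey")                      // placeholder stream key
    }
}
```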
4. How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and wants to see the live stream?
The player software initiates the communication with Wowza. This is again protocol-dependent, but if you are using HLS, the player uses HTTP to discover the URLs of the consecutive video chunks, which it progressively downloads and displays to the user.
5. What should be on the receiving side? What will decode the stream and play it in AVPlayer?
It is not clear whether the app you are developing is the broadcaster side or the player side, but generally on the player side you need to find a library that can pull the stream from the media server with the protocol, transport and codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
I am not going to draw this, but imagine three boxes connected with arrows and that is the data flow: from encoder to streaming server and finally to the player. That's it, I guess. :-)

Streaming video/audio from iOS device

I have read several posts here about live streaming video/audio from an iOS device while the user is recording. Unfortunately, it seems there is no "good" solution.
I understand that I must have access to the files while I am recording and then send them to a server from which other users can watch my stream live (with a small time lag).
Working with iOS is not a problem for me; I am struggling more with the part where the data is handed to the server and with the processing on the server side.
I have several questions:
Saying just "server" is very vague; what kind of server should it be?
I understand that I must use some protocol to send data TO the server and then get data FROM the server so users can watch the live video. What protocols should I use?
I feel very lost about the whole server-side processing; what should be done with the files that were sent to the server?
All of this seems to be very nontrivial; is there any third-party solution? For example, what technology do apps like Periscope, Ustream or Meerkat use to provide a live-stream feature to their users?
I would also really appreciate it if the answers were more than one word long for each question.
Please find my answers to your questions:
There is a class of software called "media servers". E.g. Wowza, Red5, Nimble Streamer, nginx-rtmp-module and a few others.
The most common protocols for sending data TO a media server are RTMP and RTSP. Watching the video is done via several protocols, such as RTMP (requires Flash), HLS (native on iOS, supported by Android 4+ and by some web players) and DASH (supported by some players).
No files are needed; a media server can process the incoming live stream and handle connections from viewers directly.
Basically they use a combination of the technologies mentioned above plus their own know-how.
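To make the first and third points concrete, here is an illustrative minimal configuration for one of the media servers listed above, nginx with nginx-rtmp-module: it accepts RTMP ingest on port 1935 and repackages the stream as HLS for viewers; the application name and paths are placeholders:

```nginx
# Illustrative nginx-rtmp-module configuration (names and paths are placeholders).
rtmp {
    server {
        listen 1935;              # standard RTMP ingest port
        application live {
            live on;              # accept and relay live streams
            hls on;               # also repackage the stream as HLS
            hls_path /tmp/hls;    # where the .m3u8 playlist and segments are written
            hls_fragment 3s;      # segment duration
        }
    }
}
```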

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC in one direction, i.e., the iOS device will send a video stream to the server/PC but not the other way around.
After some googling and documentation browsing, it appears that there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad streams the video. From what appears on Apple's website, I understand that HLS is meant to be used for encoding on the server side and decoding on the iOS side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to use AVFoundation to capture camera frames, compress them frame by frame and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
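For reference, a rough usage sketch based on the VCSimpleSession.h header linked above; the initializer and method names are recalled from that header and should be double-checked against the current source, and the RTMP URL and stream key are placeholders:

```swift
import UIKit
// VideoCore is an Objective-C++ library; depending on your setup you may need a
// bridging header rather than a module import. Names below are recalled from
// VCSimpleSession.h and should be verified against the current header.
import VideoCore

final class BroadcastViewController: UIViewController {
    private var session: VCSimpleSession!

    override func viewDidLoad() {
        super.viewDidLoad()

        // One object configures capture plus H.264/AAC encoding.
        session = VCSimpleSession(videoSize: CGSize(width: 1280, height: 720),
                                  frameRate: 30,
                                  bitrate: 1_000_000,
                                  useInterfaceOrientation: false)

        // Show the camera preview.
        session.previewView.frame = view.bounds
        view.addSubview(session.previewView)

        // Start pushing RTMP (placeholder URL and stream key).
        session.startRtmpSession(withURL: "rtmp://example.com/live", andStreamKey: "myStream")
    }
}
```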

How to use HTTP Live Streaming protocol in iPhone SDK 3.0

I developed an iPhone application and submitted it to the App Store, but my application was rejected for the reason below.
Thank you for submitting your yyyyyyyy application. We have reviewed your application and have determined that it cannot be posted to the App Store at this time because it is not using the HTTP Live Streaming protocol to broadcast streaming video. HTTP Live Streaming is required when streaming video feeds over the cellular network, in order to have an optimal user experience and utilize cellular best practices. This protocol automatically determines bandwidth available to users and adjusts the bandwidth appropriately, even as bandwidth streams change. This allows you the flexibility to have as many streams as you like, as long as 64 kbps is set as the baseline feed.
In my app I have to stream prerecorded m4v and mp3 files from my server. I used MPMoviePlayerController to stream and play those video/audio files.
How do I implement the HTTP Live Streaming protocol in my app? Can I also get some sample code?
Thanks in advance!
There are many documents about Apple's HTTP Live Streaming:
HTTP Live Streaming Overview
IETF HTTP Live Streaming Internet-Draft
There are many encoder devices that claim to support this protocol, e.g.
Inlet's Spinnaker, acquired by Cisco and renamed to the Cisco Media Processor family.
For a software solution, take a look at Wowza.
Also note the following from Apple's documentation:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming.
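On the client side, very little changes once the server publishes an HLS playlist: you point the player at the .m3u8 URL instead of the raw .m4v file. A hedged sketch with MPMoviePlayerController (shown in Swift for brevity; the URL is a placeholder, and MPMoviePlayerController has since been deprecated in favour of AVPlayer):

```swift
import MediaPlayer
import UIKit

final class StreamViewController: UIViewController {
    private var moviePlayer: MPMoviePlayerController?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Point the player at the HLS variant playlist instead of the raw .m4v file.
        guard let url = URL(string: "https://example.com/video/prog_index.m3u8"),
              let player = MPMoviePlayerController(contentURL: url) else { return }

        player.view.frame = view.bounds
        view.addSubview(player.view)
        player.prepareToPlay()
        player.play()

        moviePlayer = player  // keep a strong reference while playing
    }
}
```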
