I have read several posts here about live streaming video/audio from an iOS device while the user is recording. Unfortunately, it seems there is no "good" solution.
I understand that I must have access to the files while I am recording and then send them to a server, from which other users can watch my stream live (with a small time lag).
Working with iOS is not a problem for me; I am struggling more with the part where the data is handed off to the server and with the whole processing on the server side.
I have several questions:
Saying just "server" is very vague; what kind of server should it be?
I understand that I must use some protocol to send data TO the server and then get data FROM the server so users can watch the live video; what protocol should I use?
I feel very lost with the whole server-side processing; what should be done with the files that were sent to the server?
All this seems very nontrivial; is there any third-party solution? For example, what technology do apps like Periscope, Ustream, or Meerkat use to provide the live-stream feature for their users?
I would also really appreciate it if the answers were more than one word long for each question.
Please find my answers to your questions:
There is a class of software called "media servers". E.g. Wowza, Red5, Nimble Streamer, nginx-rtmp-module and a few others.
The most common protocols for sending data TO a media server are RTMP and RTSP. Watching the video is done via several protocols, such as RTMP (requires Flash), HLS (native on iOS, supported by Android 4+, and working in some web players), and DASH (supported by some players).
No files are needed; the media server can process the incoming live stream and handle connections from viewers.
Basically, they use a combination of the technologies mentioned above plus their own "know-how".
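To make the playback side of the second point concrete, here is a minimal sketch of playing an HLS stream on iOS with AVPlayer; the playlist URL is a placeholder that your media server would actually provide:

```swift
import AVKit
import UIKit

final class LiveStreamViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Placeholder URL: a media server (Wowza, nginx-rtmp-module, ...) would
        // expose the real HLS playlist for the live stream.
        guard let url = URL(string: "https://example.com/live/mystream/playlist.m3u8") else { return }

        // AVPlayer handles HLS natively on iOS: it fetches the playlist and the
        // media segments, then decodes and synchronizes audio and video.
        let player = AVPlayer(url: url)
        let playerController = AVPlayerViewController()
        playerController.player = player
        present(playerController, animated: true) {
            player.play()
        }
    }
}
```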
Related
I am trying to build a web app that lets users easily add text (as open captions) and other assets as real-time overlays on their YouTube live-stream video.
They will use their camera to record their video, and select from my app which text should be added to the video.
Then, the video will be sent to YouTube Live through their API.
Here are my questions:
First of all, I was wondering if mixing video + subtitles and sending it to YouTube's RTMP URL can be done from the client side, so it stays simple and lightweight.
Second, should I encode the output being sent to YouTube? Can this be done from the browser too?
I'm only seeing a few Node.js frameworks, and even they are not very mature (or is WebCodecs meant for this purpose?). Is a web app a poor choice for this task?
Lastly, if I do need a server to process the video, where should the encoding happen (on the user's machine, on the server, or both)? Is my server likely to be the bottleneck given YouTube's infrastructure, since video files are huge and my server is limited?
I am new to video streaming, so please excuse my lack of understanding of the subject. Also, if there are any good resources for my problem, please share them with me.
First of all, I was wondering if mixing video + subtitles and sending it to YouTube's RTMP URL can be done from the client side, so it stays simple and lightweight.
You can do the video compositing, audio mixing, and so on in the browser, but browsers don't support RTMP. To get the data to an RTMP endpoint, you need to send it to your own server, which then proxies it off to the final RTMP URL.
They will use their camera to record their video, and select from my app which text should be added to the video.
Yeah, that's no problem at all. Draw everything to a canvas every frame.
Second, should I encode the output being sent to YouTube?
Yes, you must. Check out the MediaRecorder API.
Lastly, if I do need a server to process the video, where should the encoding happen (from the user's machine, or in the server, or both?)?
The video has to be encoded client-side to get to the server in the first place. The server can then hopefully just repackage it as FLV and send it along. If the browser doesn't support H.264 in its MediaRecorder API, then you'll end up with an intermediary codec like VP8, and you'll have to transcode server-side.
A few years ago, I wrote a tutorial covering all of these steps here: https://github.com/fbsamples/Canvas-Streaming-Example Note that the tutorial is in the context of Facebook, but it should teach you the concepts.
I am very new to real-time protocols and I have some questions about how WebRTC works and how I can implement it. I am trying to create a one-to-many livestream like Facebook or Periscope, where one user broadcasts and other users join and watch the stream. I am using Swift on the client end.
My questions are:
How do I broadcast a video using WebRTC?
Is there an SDK for WebRTC in Swift/iOS?
I know the questions are very vague, but guidance in the right direction would be great because I am not sure where to start.
You will need to use backend servers for that.
If you plan on broadcasting to multiple users directly from your mobile app then stop...
You need to connect your mobile app to a backend media server which then can be used to broadcast the video to a larger audience.
There are several commercial and open source alternatives that enable you to do that. I'd check Red5Pro, Wowza, SwitchRTC, Jitsi, Janus and Kurento for this task.
For the client side, look at react-native-webrtc
You can find more tools for WebRTC developers here.
Regarding your question (2), there is also an SDK for iOS here and a neat getting-started page here (although it is about 2.5 years old, I haven't found anything better so far).
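To give a feel for what the iOS client side looks like, here is a rough sketch using the GoogleWebRTC framework in Swift. The API names are written from memory and the STUN server, track ID, and stream ID are placeholders, so treat this as an outline rather than a working broadcaster; signalling, a peer-connection delegate, and actually starting the camera capture are still missing:

```swift
import WebRTC

// Factory that creates peer connections, media sources and tracks.
let factory = RTCPeerConnectionFactory()

// Minimal configuration with a public STUN server (placeholder).
let config = RTCConfiguration()
config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]
let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)

// In a real app the delegate would not be nil: it reacts to ICE candidates and
// connection-state changes, and you would exchange SDP offers/answers with the
// media server (SFU) over your own signalling channel.
let peerConnection = factory.peerConnection(with: config, constraints: constraints, delegate: nil)

// Capture the camera and publish it as a local video track.
let videoSource = factory.videoSource()
let capturer = RTCCameraVideoCapturer(delegate: videoSource)
// capturer.startCapture(with: device, format: format, fps: 30) would start the camera.
let videoTrack = factory.videoTrack(with: videoSource, trackId: "video0")
peerConnection.add(videoTrack, streamIds: ["stream0"])
```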
I am new to live streaming of data. I have been exploring the web to learn how to live stream video. I am an iOS developer and I want to develop an app that streams video.
I am clear about the fundamentals of live video streaming. I came to know that I will need a streaming media server which will feed the stream to the viewer. I also came to know that the viewer has to have a player which decodes the data and synchronizes the audio/video stream.
Now, Wowza is a kind of streaming media server which is often recommended. But I have the following questions:
(1) Why a media server? Why can't we have our own media server? What does a media server actually do that makes its role necessary?
(2) In my app, I will have to integrate a library for encoding and feeding to a streaming server like Wowza. But how would the stream be fed to the streaming server?
(3) How will my server communicate with a streaming server like Wowza?
(4) How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and needs to see the live stream?
(5) What should be on the receiving side? What will decode the stream and play it in AVPlayer?
Guys, I need to develop a streaming app with good quality, so I would rather first understand the flow of data and then start.
It would be great if someone could give a graphical representation of the data flow.
Thanks a lot in advance!
Let me quickly add my understanding for each of your questions:
1a. Why a media server? ..
You could write your own software for distributing the stream data to all the players as well. But in that case you would need to implement various transport protocols, and you would end up building a fairly big piece of software: your own home-grown media server.
1b. What does a media server actually do that makes its role necessary?
One way to see the role of the media server is that it receives the live stream from a source and handles the distribution of that stream to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally, the media server can change the way the video or the audio is encoded (transcoding), or produce streams in several resolutions and qualities and provide the players with a list of the available qualities in the form of a manifest file (e.g. an m3u8 or SMIL file) so they can do so-called adaptive streaming.
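For illustration, such a manifest is just a small text file listing the available renditions; a minimal HLS master playlist (the paths, bandwidths, and resolutions below are made-up placeholders) looks roughly like this:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/playlist.m3u8
```

The player picks a rendition based on its measured bandwidth and can switch between them mid-stream.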
Another typical use case of media servers is serving non-live video files from disk to players, as well as recording live streams, and so on. If you look at the feature list of popular media servers, you'll see that they really do many things, so practically this is something you probably want to get out of the box rather than implement yourself.
In my app, I will have to integrate a library for encoding and feeding to a streaming server like Wowza. But how would the stream be fed to the streaming server?
You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). It is best if you google for tutorials to see what this looks like in code.
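A rough Swift sketch of the capture part of that pipeline is below; the encoding and the RTMP push are only indicated by comments, and `publisher` is a hypothetical stand-in for whatever RTMP/VideoToolbox library you end up integrating:

```swift
import AVFoundation

final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "capture.output")

    func start() throws {
        // 1. Grab raw frames from the camera (audio would be added the same
        //    way with an AVCaptureAudioDataOutput).
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // 2. Encode each raw frame to H.264 (e.g. with VideoToolbox), and
        // 3. hand the compressed frames to a publishing library that wraps them
        //    in the container/transport (e.g. FLV over RTMP) and pushes them to
        //    the media server.
        // `publisher` below is a hypothetical placeholder for such a library.
        // publisher.send(sampleBuffer)
    }
}
```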
How will my server communicate with a streaming server like Wowza?
The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.
How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and needs to see the live stream?
The player software will initiate the communication with Wowza. This is again protocol-dependent, but in case you are using HLS, the player will use the HTTP protocol to find out the URLs of the consecutive video chunks that it will progressively download and display to the user.
What should be on the receiving side? What will decode the stream and play it in AVPlayer?
It's not clear whether the app you are developing is the broadcaster side or the player side. But generally, on the player side you need to find a library that is able to pull the stream from the media server with the protocol/transport/codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
I am not going to draw this, but imagine three boxes connected with arrows, and that's the data flow: from the encoder to the streaming server and finally to the player. That's it, I guess. :-)
I am trying to develop a live-streaming application like the Meerkat app, where user A can broadcast a live stream while other users are able to watch it. I am having trouble understanding the architecture and mechanisms used to upload video to a server. Currently, I am using a dedicated server with FFmpeg installed on it. I also know FFserver can be used to perform RTSP communication, but I am still unclear how to do this. Can anyone guide me on this?
I would like to know how to upload videos to a server or whether there is another way to perform a live stream. Open source frameworks are welcome.
For live streaming video/audio, http://www.wowza.com/ gives you the best functionality. You have to set up your server in Wowza, and you can also test it there.
For iOS, you can broadcast and receive using the demo below, which you can download from here.
I think it will be helpful to you. :)
Well, I was looking for something open source that could be implemented without any additional cost. Luckily, I found the Red5 server (open source): https://github.com/Red5/red5-server
I configured it on my dedicated server and it is running perfectly fine. Now that the server-side issue is solved, I needed something for the iOS side. For that I found https://github.com/slavavdovichenko/MediaLibDemos3x
So with the combination of these two repos I was able to make a live-streaming app like Meerkat.
Thanks
I am new to multimedia and iOS programming, and while Googling I came across WebORB, which provides an RTMP library for iOS. It doesn't clearly mention whether it can be used to stream live video through a media server like Red5.
If anyone has used this, please let me know whether it can be used to stream live video from an iPhone to a media server, and where it fits in the whole setup.
Does it act like a server itself between a media server and the iPhone application or does it also have its own media server?
I would also like some links to tutorials that can help me start the actual coding for RTMP streaming to a media server.
Thanks.
The short answer is yes, the RTMP library for iOS can be used with Red5, FMS, WebORB, etc. The library is not the server itself but a client. It establishes the RTMP connection to the server and encodes the stream before sending it to the server.
As I remember, the library distribution contains some examples that demonstrate how streaming works. Unfortunately, the official site doesn't show any examples related to streaming, but the available examples can be useful for starting to work with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date: http://www.themidnightcoders.com/fileadmin/docs/ios/.