iOS broadcasting live to Azure Media Services

I am trying to make a Periscope-like app (not practically, but the technical requirements are alike) where users can quickly start streaming from their iPhone to an unknown number of viewers, with both ends mobile. I am trying to use Azure Media Services for live video streaming, but even after reading pages of documentation I'm stuck.
I'm using VideoCore (https://github.com/jgh-/VideoCore) to publish from an iOS device to the RTMP server. Locally (using Wowza) I can just connect to the local server with my set username and password as shown:
vcSession = [[VCSimpleSession alloc] initWithVideoSize:CGSizeMake(1280, 720) frameRate:30 bitrate:1000000 useInterfaceOrientation:NO];
[self.view addSubview:vcSession.previewView];
vcSession.previewView.frame = self.view.bounds;
vcSession.delegate = self;
[vcSession startRtmpSessionWithURL:@"rtmp://172.20.10.2:1935/live?rtmpauth=test:test" andStreamKey:@"test"];
Here the rtmpauth parameter has the username:password format, both of which I've set to test on my local server. It works. In Azure, I've created a channel named test, and I've got the following ingest URL:
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string
In Wirecast, I'm able to stream to that URL (though it's EXTREMELY slow and the connection is frequently lost, I don't know why) by selecting Azure Media Services in Output Settings and typing in that ingest URL. On iOS, I have no idea how to connect to Azure Media Services.
In the startRtmpSessionWithURL:andStreamKey: method, I've tried all the possible combinations of URL and stream key, but no luck. I have no idea what my username/password is (nothing is given on the Azure side), what the stream key is (I've tried test, live, and an empty string) or what that long hexadecimal string is (some sources say it's called a locator, though).
What is the correct format of RTMP URL and stream key when connecting to Azure Media Services for streaming?

I'll find someone to help you. I think you are just missing a stream name after the long hex string in the URL.
rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string/[YOUR-CUSTOM-STREAM-NAME-Anything Really!]
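For the VideoCore call from the question, that would mean keeping the full ingest URL as the base and passing the custom stream name as the stream key. A minimal sketch, assuming VideoCore appends the stream key to the URL the same way it did for the local Wowza setup (myStream is an arbitrary placeholder):
[vcSession startRtmpSessionWithURL:@"rtmp://test-myappname.channel.mediaservices.windows.net:1935/live/some-long-hexadecimal-string"
                      andStreamKey:@"myStream"];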
Also, do you have any control over the encoding settings? It's possible that some encoding settings are not right. We have not tested with that VideoCore library, so it may also be that there is a slight variation in its RTMP implementation (the protocol is very poorly documented and there is a lot of missing information out there).
I'm curious why your Wirecast setup is having trouble as well; that doesn't sound good to start with. A network issue? Are you using the proper encoder preset with H.264 (and NOT x264)?
Review your settings in Wirecast against Cenk's blog post here: http://azure.microsoft.com/blog/2014/09/18/azure-media-services-rtmp-support-and-live-encoders/

Related

Architecture for a web app to add overlays to users' YouTube live stream video?

I am trying to build a web app that lets users easily add text (as open captions) and other assets from my app as overlays, in real time, to their YouTube live stream video.
They will use their camera to record their video, and select from my app which text should be added to the video.
Then, the video will be sent to YouTube Live through their API.
Here are my questions:
First of all, I was wondering if mixing video + subtitles and sending the result to YouTube's RTMP URL can be done from the client side, so it stays simple and lightweight.
Second, should I encode the output being sent to YouTube? Can this be done from the browser too?
I'm only seeing a few Node.js frameworks, and even those aren't very mature (or is WebCodecs meant for this purpose?). Is a web app a poor choice for this task?
Lastly, if I do need a server to process the video, where should the encoding happen (on the user's machine, on the server, or both)? Is my server most likely going to be the bottleneck given YouTube's infrastructure, since video files are huge and my server is limited?
I am new to video streaming, so please excuse my lack of understanding of the subject. Also, if there are any good resources for my problem, please share them with me.
First of all, I was wondering if mixing video + subtitles and sending the result to YouTube's RTMP URL can be done from the client side, so it stays simple and lightweight.
You can do the video compositing and audio mixing and whatnot, but browsers don't support RTMP. To get the data to an RTMP server, you need to send it to a server of your own, where it is proxied off to the final URL.
They will use their camera to record their video, and select from my app which text should be added to the video.
Yeah, that's no problem at all. Draw everything to a canvas every frame.
Second, should I encode the output being sent to YouTube?
Yes, you must. Check out the MediaRecorder API.
Lastly, if I do need a server to process the video, where should the encoding happen (on the user's machine, on the server, or both)?
The video has to be encoded client-side to get to the server in the first place. The server can then hopefully just repackage it into FLV and send it along. If the browser doesn't support H.264 in its MediaRecorder API, then you'll get an intermediary codec like VP8, and you'll have to transcode server-side.
A few years ago, I wrote a tutorial covering all of these steps: https://github.com/fbsamples/Canvas-Streaming-Example. Note that the tutorial is in the context of Facebook, but it should teach you the concepts.

What is the major role of a Streaming Media Server?

I am new to live streaming of data. I have been exploring the web for how to live stream video. I am an iOS developer and I want to develop an app that streams video.
I am clear about the fundamentals of live video streaming. I came to know that I will need a streaming media server, which will feed the stream to the viewers. I also came to know that the viewer has to have a player which decodes the data and synchronizes the audio/video stream.
Now, Wowza is one kind of streaming media server that is recommended, but I have the following questions:
(1) Why a media server? Why can't we have our own server? What does a media server actually do that makes its role necessary?
(2) In my app, I will have to integrate a library for encoding and feed the output to a streaming server like Wowza. But how would it be fed to the streaming server?
(3) How will my server communicate with a streaming server like Wowza?
(4) How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and needs to see the live stream?
(5) What should be at the receiving side? What will decode the stream and play it in AVPlayer?
I need to develop a streaming app with good quality, so I'd rather first understand the flow of data and then start.
It would be great if someone could give a graphical representation of the data flow.
Thanks a lot in advance!
Let me quickly add my understanding to your questions:
1a. Why a media server?
You could write your own software for distributing the stream data to all the players, but in that case you would need to implement various transport protocols, and you would end up building a fairly big piece of software: your home-grown media server.
1b. What does a media server actually do to make its role necessary?
One way to see the role of the media server is that it receives the live stream from a source and handles the distribution of this stream to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally, the media server can change the way the video or the audio is encoded (transcoding), or produce streams in different resolutions and qualities and provide the players with the list of available qualities in the form of a manifest file (e.g. an m3u8 or SMIL file) so they can do so-called adaptive streaming.
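For illustration, a minimal HLS master manifest advertising two quality levels might look like this (the variant paths are hypothetical):
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=640x360
360p/playlist.m3u8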
Another typical use case of media servers is serving non-live video files to players from disk, as well as recording live streams, and so on. If you look at the feature list of popular media servers, you'll see that they really do many things, so practically this is something you probably want to get out of the box rather than implement your own.
In my app, I will have to integrate a library for encoding and feed the output to a streaming server like Wowza. But how would it be fed to the streaming server?
You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). It's best if you google for tutorials to see how this looks in code.
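On iOS, for example, the codec side of this looks roughly like the following sketch using AVFoundation encoder settings; the container and transport steps (e.g. MPEG-TS, RTMP) are handled by whatever streaming library you integrate, and the exact values here are illustrative placeholders, not recommendations:
#import <AVFoundation/AVFoundation.h>

// H.264 video encoder settings (resolution is a placeholder).
NSDictionary *videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                 AVVideoWidthKey: @1280,
                                 AVVideoHeightKey: @720 };
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

// AAC audio encoder settings.
NSDictionary *audioSettings = @{ AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                                 AVNumberOfChannelsKey: @2,
                                 AVSampleRateKey: @44100.0,
                                 AVEncoderBitRateKey: @128000 };
AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];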
How will my server communicate with a streaming server like Wowza?
The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.
How will Wowza feed the stream to the receiving side, i.e. a user who has an iPhone and needs to see the live stream?
The player software initiates the communication with Wowza. This is again protocol dependent, but in case you are using HLS, the player will use the HTTP protocol to find out the URLs of the consecutive video chunks that it will progressively download and display to the user.
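On iOS this whole exchange is handled by the system player; a minimal sketch, assuming an HLS playlist URL like the ones a media server publishes (host and path are placeholders):
#import <AVFoundation/AVFoundation.h>

// AVPlayer fetches the m3u8 playlist, then downloads and decodes the
// consecutive chunks itself; no manual chunk handling is needed.
NSURL *url = [NSURL URLWithString:@"http://myMediaServer:1935/live/myStream/playlist.m3u8"];
AVPlayer *player = [AVPlayer playerWithURL:url];
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = self.view.bounds;
[self.view.layer addSublayer:layer];
[player play];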
What should be at the receiving side? What will decode the stream and play it in AVPlayer?
It's not clear whether the app you are developing is the broadcaster side or the player side. But generally, on the player side you need to find a library that is able to pull the stream from the media server with the protocol/transport/codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
I am not going to draw this, but imagine three boxes connected with arrows, and that's the data flow: from encoder to streaming server and finally to player. That's it, I guess. :-)

Streaming video/audio from iOS device

I have read several posts here about live streaming video/audio from an iOS device while the user is recording. Unfortunately it seems that there is no "good" solution.
I understand that I must have access to the files while I am recording and then send them to a server, from which other users can watch my stream live (with a small time lag).
Working with iOS is not a problem for me; I am struggling more with the part where the data is handed off to the server, and with the whole processing on the server side.
I have several questions:
Saying just "server" is very vague; what kind of server should it be?
I understand that I must use some protocol to send data TO the server and then to get data FROM the server so users can watch the live video; which protocols should I use?
I feel very lost with the whole server-side processing; what should be done with the files that were sent to the server?
All this seems very nontrivial. Is there any third-party solution? For example, what technology do apps like Periscope, Ustream or Meerkat use to provide the live stream feature for their users?
I would also really appreciate it if the answers were more than one word long for each question.
Please find my answers to your questions:
There is a class of software called "media servers". E.g. Wowza, Red5, Nimble Streamer, nginx-rtmp-module and a few others.
The most common protocols for sending data TO a media server are RTMP and RTSP. Watching the video is done via several protocols, such as RTMP (requires Flash), HLS (native on iOS, supported by Android 4+ and by some web players) and DASH (supported by some players).
No files are needed; a media server can process the incoming live stream and handle connections from viewers directly.
Basically they use a combination of the technologies mentioned above, plus their own know-how.

How to stream video over secure connection on iOS

I can play a video from a local resource (on the device).
I can stream a video from the unprotected internet.
I can't stream from the company intranet (either internally or externally).
It's a typical secure company network. The videos are stored in SharePoint 2007 lists (but I have the URL to the video file).
I've tried:
MPMoviePlayerController
MPMoviePlayerViewController
UIWebView (creating HTML on the fly using the <video> tag and the video URL)
and I can't get anything to work. Heck, I can't even get it to work by going directly to the link in Safari on the iPad. The only thing I haven't tried is downloading the video as a file and then playing it locally; due to a host of usability issues that would not be a preferred option.
There were two problems.
MPMoviePlayerController doesn't support all of the challenge-based authentication that NSURLConnection does. The solution is to make a "dummy" NSURLConnection request to somewhere inside your secure area, have it handle all the challenges, and set it to store the credentials for the session. From then on, MPMoviePlayerController and other connections that don't support the ins and outs of SSL requests will use the existing session.
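A minimal sketch of that workaround, assuming NSURLConnection (current at the time); the URL and credentials are placeholders:
// Fire a throwaway request into the secure area so the session
// credential is established before the movie player connects.
NSURL *probeURL = [NSURL URLWithString:@"https://intranet.example.com/videos/"];
[NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:probeURL] delegate:self];

// Delegate method that answers the challenge and stores the credential
// for the session, so MPMoviePlayerController can reuse it afterwards.
- (void)connection:(NSURLConnection *)connection
didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
{
    NSURLCredential *credential =
        [NSURLCredential credentialWithUser:@"user"
                                   password:@"password"
                                persistence:NSURLCredentialPersistenceForSession];
    [[challenge sender] useCredential:credential forAuthenticationChallenge:challenge];
}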
A valid intermediate certificate had to be installed on the server, something I never would have figured out myself. It is mentioned as a small item in one of the guides under "Secure connections".

Live Streaming over RTMP from Wowza to iOS

I'm trying to play a live stream that is being sent out by a Wowza server; we are using RTMP to handle the streams. We have an equivalent app that works on Android, and the way it works is by connecting to the server via the URL plus two parameters that identify the actual stream to play and whether you are allowed to see the live video. After the connection is attempted, the server does a callback sending an integer that is checked to see whether the user is logged in. Once the check has passed, the video is played.
I have no idea how to handle the callback, or how to properly set up the connection so that it takes both parameters and the URL.
One big issue is that the Wowza server was created by a third party that we are no longer in contact with, so I have no idea how the actual server is set up.
Any suggestions would be greatly appreciated.
I recommend a few steps to start with:
Determine how the Android app actually works. Is the server you speak of, to which it sends the two parameters, the Wowza server? If so, it is probably a custom plugin.
Get access to the server, so that you can configure it for iOS streaming.
You'll also need to check out some documentation: http://www.wowza.com/forums/content.php?3-quick-start-guide and http://www.wowza.com/forums/content.php?217#cupertinostreaming
Once you have a better understanding of the problem, the Wowza folks are very helpful at http://www.wowza.com/forums.
Good luck!
There is no need to mirror the Android app. Wowza supports output that iOS can play: since iOS can't play Flash, it can't play RTMP or RTSP streams, but it can play an HLS stream with a URL like http://myWowzaServer/myApplication/myStream/playlist.m3u8
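For example, on the iOS side that HLS URL can be handed straight to the system player; a quick sketch using MPMoviePlayerController (the standard player at the time), with the placeholder URL from above:
#import <MediaPlayer/MediaPlayer.h>

NSURL *streamURL = [NSURL URLWithString:@"http://myWowzaServer/myApplication/myStream/playlist.m3u8"];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
player.view.frame = self.view.bounds;
[self.view addSubview:player.view];
[player play];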
For more configuration details, please visit http://emrekaratasoglu.com/php-freelance-watch-wowza-live-stream-apple-ios-mobile-phone/
