Live video streaming formats supported in iOS programming

I am working on live video streaming in an iPad app. I have done it previously using MPMediaPlayer with HTTP servers.
From the link here I understand that "iOS devices support HTTP progressive download for .mp4 files; the server could simply be Apache or Nginx. The user experience is quite similar to HTTP Live Streaming. RTSP is also possible: you can port live555 to the iOS platform as the RTSP client, and use DarwinStreamingServer as the RTSP server."
But my client has provided video streaming URLs in UDP format (e.g. udp://225.X.X.X:XXXXX) and suggested links link1 and link2 to support his statement that this will work for live video streaming in iOS.
However, I am unable to relate the links he provided to my requirement.
My doubts are:
What formats does iOS support for live video streaming?
Is this UDP link of any use to me for video streaming in iOS?

On Wi-Fi, iOS supports everything, because you have access to raw sockets and to H.264 decoding via VideoToolbox, so any protocol can be implemented even if there is no out-of-the-box support. If you want the video to work over a cellular network, it MUST use HLS. No other options. (Unless you are Facebook; then Apple will grant exceptions to this policy.)
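As a rough illustration of that raw-sockets route, below is a minimal Swift sketch of the VideoToolbox decoding half. Everything feeding it (the UDP socket, depacketizing, SPS/PPS parsing, building the incoming sample buffers) is assumed to be your own code, and that is where the bulk of the work lies:

    import VideoToolbox

    // Sketch: decode H.264 access units received over your own transport
    // (e.g. a UDP socket). The SPS/PPS arrays and sample buffers are assumed
    // to come from your own depacketizing code.
    final class H264Decoder {
        private var formatDescription: CMVideoFormatDescription?
        private var session: VTDecompressionSession?

        // Build a format description from the stream's SPS/PPS, then a session.
        func prepare(sps: [UInt8], pps: [UInt8]) {
            sps.withUnsafeBufferPointer { spsBuf in
                pps.withUnsafeBufferPointer { ppsBuf in
                    let pointers: [UnsafePointer<UInt8>] = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
                    let sizes = [sps.count, pps.count]
                    _ = CMVideoFormatDescriptionCreateFromH264ParameterSets(
                        allocator: kCFAllocatorDefault,
                        parameterSetCount: 2,
                        parameterSetPointers: pointers,
                        parameterSetSizes: sizes,
                        nalUnitHeaderLength: 4,        // 4-byte AVCC length prefixes
                        formatDescriptionOut: &formatDescription)
                }
            }
            guard let desc = formatDescription else { return }
            _ = VTDecompressionSessionCreate(
                allocator: kCFAllocatorDefault,
                formatDescription: desc,
                decoderSpecification: nil,
                imageBufferAttributes: nil,
                outputCallback: nil,                   // nil: use the block-based call below
                decompressionSessionOut: &session)
        }

        // Feed one AVCC-framed H.264 access unit wrapped in a CMSampleBuffer.
        func decode(_ sampleBuffer: CMSampleBuffer) {
            guard let session = session else { return }
            _ = VTDecompressionSessionDecodeFrame(
                session,
                sampleBuffer: sampleBuffer,
                flags: [._EnableAsynchronousDecompression],
                infoFlagsOut: nil) { status, _, imageBuffer, presentationTime, _ in
                    guard status == noErr, let pixels = imageBuffer else { return }
                    // Hand the decoded CVImageBuffer to your renderer here.
                    _ = (pixels, presentationTime)
                }
        }
    }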

Related

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video being captured from the camera (no high quality required) from an iOS device to a remote PC one-way, i.e., the iOS device will be sending a video stream to the server/PC but not the opposite.
What appears after some googling and documentation browsing is that there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is to be used server-side from an encoding perspective and iOS-side from a decoding perspective. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation and capture camera frames, compress them frame by frame and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
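For what it's worth, the AVFoundation capture half of that plan is the straightforward part; here is a minimal sketch under that assumption (the encoder and the network upload are the hard, missing pieces, and every name here is illustrative):

    import AVFoundation

    // Sketch: grab raw camera frames with AVFoundation. The compression step
    // (H.264 encoding for RTMP, or segmenting for HLS) is not shown.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let frameQueue = DispatchQueue(label: "camera.frames")

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: camera)
            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: frameQueue)
            session.beginConfiguration()
            if session.canAddInput(input) { session.addInput(input) }
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
            session.startRunning()
        }

        // Called once per captured frame; this is where an encoder would go
        // before the compressed frame is sent over the network.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            _ = pixelBuffer // feed this to your encoder
        }
    }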
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
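For reference, here is a rough Swift sketch of what a VCSimpleSession setup looks like. The endpoint URL, stream key, and encoding parameters are placeholders, and since VideoCore is Objective-C, the exact Swift-imported names should be checked against VCSimpleSession.h:

    import UIKit
    // VideoCore is Objective-C; expose it to Swift via a bridging header:
    // #import <VideoCore/VCSimpleSession.h>

    final class BroadcastViewController: UIViewController {
        // Video size, frame rate and bitrate values here are illustrative.
        let session = VCSimpleSession(
            videoSize: CGSize(width: 1280, height: 720),
            frameRate: 30,
            bitrate: 1_000_000,
            useInterfaceOrientation: false)

        override func viewDidLoad() {
            super.viewDidLoad()
            session.previewView.frame = view.bounds   // live camera preview
            view.addSubview(session.previewView)
            // Placeholder RTMP endpoint and stream key:
            session.startRtmpSession(withURL: "rtmp://example.com/live",
                                     andStreamKey: "myStream")
        }
    }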

How to play an RTMP stream on iPhone on websites

I'm developing a website that needs an external RTMP stream.
I'm using jwplayer to run the stream using Flash (examples and information about it here).
My problem is that the stream does not work on iOS.
Can somebody suggest a solution?
iOS does not support the RTMP protocol; for that you have to use HTTP-based protocols, i.e., HTTP Live Streaming.

Can Weborb be used to do live video streaming from an iPhone through a media server?

I am new to multimedia and iOS programming and I came across WebORB while googling; it provides an RTMP library for iOS. It isn't clearly mentioned whether it can be used to stream live video through a media server like Red5.
If anyone has used this, please let me know whether it can be used to stream live video from the iPhone to a media server, and where it fits in the whole setup.
Does it act as a server itself between a media server and the iPhone application, or does it also have its own media server?
I would also like some links to tutorials that can help me start the real coding pertaining to RTMP streaming to a media server.
Thanks.
The short answer is yes: the RTMP library for iOS can be used with Red5, FMS, WebORB, etc. The library is not the server itself, but a client. It establishes the RTMP connection to the server and encodes the stream before sending it to the server.
As I remember, the library distribution contains some examples demonstrating how streaming works. Unfortunately, the official site doesn't show any examples related to streaming, but the available examples can be useful to start working with the library (http://www.themidnightcoders.com/products/weborb-for-mobile/ios-integration/rtmp-ios-examples-integration-between-java-net-and-ios.html). The documentation looks up to date: http://www.themidnightcoders.com/fileadmin/docs/ios/.

How to convert a 'Smooth Streaming' URL to Apple HLS format?

In an iOS app, for live streaming, I am getting a response from the server like this (you can check):
http://4a75a0cce3694e29bc670b3d574fec92.cloudapp.net/push.isml/manifest
which is a Smooth Streaming manifest.
How can I play this file in my iOS app?
Is there any runtime converter to convert this to Apple HLS?
Or any Smooth Streaming player, like:
OSMF plugin for iOS
HTML5 player
Silverlight plugin for iOS
Actually:
We have not tried much with the Azure framework. We set up the IIS server, got the live streaming working, and played the stream with an HTML5 video tag in a web view, by following this link:
http://www.hanselman.com/blog/CommentView.aspx?guid=86968cd5-feeb-47f2-b02e-1eb4fa556379#commentstart
That way we were able to play live streaming on our iOS devices. Still, I would be happy if I could configure the Windows Azure framework to do the same.
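For anyone reproducing that web-view workaround, here is a minimal Swift sketch. The stream URL is a placeholder; IIS Media Services can expose an Apple HLS rendition of a Smooth Streaming publishing point, which is what the HTML5 video tag ends up playing on iOS:

    import UIKit

    // Sketch: load an HTML5 <video> tag in a web view. The src URL below is
    // a placeholder for your server's HLS rendition of the stream.
    let html = """
    <video src="http://example.com/push.isml/manifest(format=m3u8-aapl)"
           controls autoplay width="100%"></video>
    """
    let webView = UIWebView(frame: UIScreen.main.bounds)
    webView.loadHTMLString(html, baseURL: nil)
    // Add `webView` to your view hierarchy to display it.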
You can use Windows Azure Media Services to convert Smooth Streaming media content into Apple HLS content. Either you use the Management Portal to upload your media assets and then encode them with the preset "Playback on iOS devices and PC/MAC", or you use the REST API for Windows Azure Media Services at runtime.
With the REST API you can utilize the Windows Media Packager to encode Smooth Streaming content for HLS. Find a sample configuration for that task here.

How to use HTTP Live Streaming protocol in iPhone SDK 3.0

I have developed an iPhone application and submitted it to the App Store, but my application got rejected based on the criteria below.
Thank you for submitting your yyyyyyyy application. We have reviewed your application and have determined that it cannot be posted to the App Store at this time because it is not using the HTTP Live Streaming protocol to broadcast streaming video. HTTP Live Streaming is required when streaming video feeds over the cellular network, in order to have an optimal user experience and utilize cellular best practices. This protocol automatically determines bandwidth available to users and adjusts the bandwidth appropriately, even as bandwidth streams change. This allows you the flexibility to have as many streams as you like, as long as 64 kbps is set as the baseline feed.
In my apps I have to stream prerecorded .m4v and .mp3 files from my server. I used MPMoviePlayerController to stream and play those video/audio files.
How to implement the HTTP Live Streaming Protocol in my apps? Also can I get some sample code?
Thanks in advance!
There are many documents about Apple's HTTP Live Streaming:
HTTP Live Streaming Overview
IETF HTTP Live Streaming Internet-Draft
There are many encoder devices that claim to support this protocol, e.g. Inlet's Spinnaker, acquired by Cisco and renamed the Cisco Media Processor family.
For a software solution, take a look at Wowza.
Also, please check the note below from the Apple documentation:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming.
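On the playback side nothing special is required: MPMoviePlayerController plays an HLS stream when pointed at the variant playlist (.m3u8) instead of an .m4v file; the segmenting has to happen server-side (Apple provides command-line tools such as mediafilesegmenter for this). A minimal sketch, shown in Swift with a placeholder URL:

    import MediaPlayer

    // Playing an HLS stream uses the same API as playing a file: just point
    // the player at the variant playlist. The URL below is a placeholder.
    let streamURL = URL(string: "http://example.com/stream/prog_index.m3u8")!
    let player = MPMoviePlayerController(contentURL: streamURL)
    player.view.frame = hostViewController.view.bounds  // `hostViewController` is assumed
    hostViewController.view.addSubview(player.view)
    player.prepareToPlay()
    player.play()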
