I'm playing videos using AVPlayer in my iOS application and now want to add Chromecast support.
1. As per this link, the Chromecast button can be shown while a video is playing. Is that also the case with AVPlayer?
2. As per Apple's requirements, my videos are encoded in the m3u8 (HLS) format. Can Chromecast play them?
You can start with the Google Cast documentation; it includes the API libraries and sample application code you need to add Cast support to your application. The APIs are documented in the API references, and the sample code is discussed in the Sender Applications and Receiver Applications overviews.
As for whether Chromecast can play the m3u8 format: check Supported Media for Google Cast, which lists all the media facilities and types Google Cast supports.
Note that some of these require additional coding or the Media Player Library. See Receiver Applications for more information about developing your receiver application to support these media types.
For more information, check these SO questions:
ChromeCast doesnt play HLS in .m3u8 format
Streaming .m3u8 format using Chromecast
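To your first question: the Cast button isn't something AVPlayer provides; it comes from the Google Cast sender SDK, and you can show it alongside your AVPlayer UI. Here is a minimal sketch, assuming the Cast iOS SDK (v4) is already initialized in the app:

```swift
import UIKit
import GoogleCast

class PlayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // The Cast button comes from the sender SDK, not from AVPlayer;
        // it becomes tappable whenever a Cast device is discovered on the network.
        let castButton = GCKUICastButton(frame: CGRect(x: 0, y: 0, width: 24, height: 24))
        navigationItem.rightBarButtonItem = UIBarButtonItem(customView: castButton)
    }
}
```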
The docs are a little hard to parse here. I was wondering if there was any way to:
Stream YouTube live into an iOS app, without significant/any YouTube branding.
Stream from an iOS device as a broadcast stream for YouTube live.
My initial Googling turned up mixed responses. I was hoping to see an example of this if it's possible, or save myself some time if it's not.
Suppose I have a person on AT&T next to a person on Verizon, both streaming content, and I want to make them appear as a single uninterrupted stream that switches back and forth. Does YouTube or any library do anything to facilitate this?
Streaming from an iOS device is no different from streaming from any other device. You would have to write an H.264 encoder and an RTMP packetizer, and send the video to your YouTube stream object's ingestionAddress. Outlining the details of the encoder beyond the above is too broad for Stack Overflow, but I highly recommend looking at the VideoCore iOS project.
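As a rough illustration of where ingestionAddress fits in (the struct below is purely illustrative; the field names follow the cdn.ingestionInfo object of the liveStream resource in the YouTube Live Streaming API):

```swift
import Foundation

// Illustrative model of the part of the liveStream resource the encoder needs
// (YouTube Live Streaming API, cdn.ingestionInfo).
struct IngestionInfo: Decodable {
    let ingestionAddress: String // e.g. "rtmp://a.rtmp.youtube.com/live2"
    let streamName: String       // the stream key for this broadcast
}

// The H.264/RTMP pipeline (VideoCore or your own) publishes to
// ingestionAddress + "/" + streamName.
func rtmpIngestURL(from info: IngestionInfo) -> URL? {
    URL(string: "\(info.ingestionAddress)/\(info.streamName)")
}
```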
As far as branding goes, the only way to play back YouTube content in an iOS app without breaking YouTube's terms of service is to play the video in a UIWebView or via YouTube's iOS player helper library (which is just a web view with some playback interfaces).
There is no way to completely remove YouTube branding from the IFrame player. However, there are branding options you can toggle using the modestBranding flag on the player. See the IFrame docs here.
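A short sketch of both points, assuming the youtube-ios-player-helper library; the video ID and outlet are placeholders:

```swift
import UIKit
import youtube_ios_player_helper

class VideoViewController: UIViewController {
    @IBOutlet weak var playerView: YTPlayerView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // modestbranding reduces (but does not remove) YouTube branding;
        // playsinline keeps playback inside the view instead of forcing fullscreen.
        let playerVars: [String: Any] = [
            "modestbranding": 1,
            "playsinline": 1
        ]
        playerView.load(withVideoId: "M7lc1UVf-VE", playerVars: playerVars)
    }
}
```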
Despite reading the documentation, it is not clear to me exactly what the "Google Cast Media Player Library" is and whether it is the route I need to take for my Chromecast app.
What I am trying to achieve is to play media from my local iOS device on Chromecast. My main aim is to play users' videos and photos, not necessarily DRM-protected media.
Up till now I have been doing this by exporting the AVAsset and then passing the file's address to a simple HTTP server. This seems horribly inefficient, and I thought I could use AVAssetReader to pass a stream to Chromecast instead. During my research I came across the terms:
MPEG-DASH
Smooth Streaming
HTTP Live Streaming (HLS)
But I do not understand whether I need such complex implementations.
I find the name "Google Cast Media Player Library" very ambiguous, and there is no concise explanation of what it is.
https://developers.google.com/cast/docs/player
This is a piece of the definition given there:
... It provides JavaScript support for parsing manifests and playing HTTP Live Streaming (HLS), MPEG-DASH, and Smooth Streaming content. It also provides support for HLS AES encryption, PlayReady DRM, and Widevine DRM.
I hope this is not ambiguous: if your media is encrypted and/or you are dealing with adaptive streams of the types specified above (HLS, MPEG-DASH, Smooth Streaming), then this library can help you. If you are playing a simple mp4 or showing images, you don't need it.
There are plenty of posts in this forum on how to cast local media; it amounts to embedding a tiny web server in your sender app, sending the URL of the media (which is now exposed through that embedded web server) to the Chromecast, and having your receiver show or play the media item via that URL.
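A rough sketch of that approach, assuming GCDWebServer as the embedded web server (any small HTTP server will do) and the Cast iOS SDK v4 on the sender side; the port, directory, and content type are placeholders:

```swift
import GoogleCast
import GCDWebServer

final class LocalCaster {
    private let server = GCDWebServer()

    // Expose a local directory over HTTP so the Chromecast (on the same
    // Wi-Fi network) can fetch files from this device.
    func startServer(servingDirectory directory: String) {
        server.addGETHandler(forBasePath: "/",
                             directoryPath: directory,
                             indexFilename: nil,
                             cacheAge: 3600,
                             allowRangeRequests: true) // range requests let the receiver seek
        if server.start(withPort: 8080, bonjourName: nil) {
            print("Serving \(directory) at \(server.serverURL?.absoluteString ?? "?")")
        }
    }

    // Send the now-reachable URL to the receiver; a plain mp4 or image
    // needs no DASH/HLS/Smooth Streaming and no Media Player Library.
    func cast(fileNamed name: String) {
        guard let base = server.serverURL,
              let url = URL(string: name, relativeTo: base) else { return }

        let builder = GCKMediaInformationBuilder(contentURL: url)
        builder.streamType = .buffered
        builder.contentType = "video/mp4"

        GCKCastContext.sharedInstance()
            .sessionManager
            .currentCastSession?
            .remoteMediaClient?
            .loadMedia(builder.build())
    }
}
```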
I've made an application to cast HLS streams on a Chromecast.
It works well with VOD streams (non-live), but not with a LIVE stream.
So here is my question: can Chromecast play LIVE streams?
Yes, it can, and many Chromecast applications already do. You may want to use our Media Player Library (MPL), or use your own player. You may need to write a custom receiver if the Styled or Default receiver does not do what you need.
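On the sender side, the only HLS-specific difference for live content is marking the stream as live; a small sketch, assuming the v4 sender SDK (the URL is a placeholder):

```swift
import GoogleCast

// For a live HLS playlist, declare the stream type as .live so the
// receiver doesn't treat it as a fixed-duration VOD item.
let builder = GCKMediaInformationBuilder(contentURL: URL(string: "https://example.com/live/master.m3u8")!)
builder.streamType = .live
builder.contentType = "application/x-mpegurl"
let liveMediaInfo = builder.build()
```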
I'm looking to implement DRM in an iOS video player, but I'm not sure how. In order to implement video DRM (while still using Apple's hardware-accelerated H.264 decode), I need a way to feed the decrypted H.264 stream into the standard iOS video playback APIs.
According to this question, it was not possible to implement 3rd party DRM in September 2010. There's a thread in the Apple Developer Forums that goes nowhere. However, as of today a number of 3rd party DRM libraries exist: Widevine, Irdeto (PDF), Marlin. They have clearly found some way to pass a custom stream to the media player in Apple approved apps.
I've found two leads. One is a suggestion to create a custom URL protocol, but people seem to have poor success using this with video. The other is to create a local HTTP server thread and provide the content by HTTP live streaming on 127.0.0.1 inside the iDevice. I'd like to be very sure that Apple will approve before going that route.
So, what Apple-approved APIs do third-party DRM implementations use to get decrypted video data into the video player?
Edit: the latest BBC iPlayer for iOS allows programmes to be downloaded for later viewing. Either they store the content in the clear, or they have cracked this problem.
You can begin decrypting the file into another file and play back that file as you decrypt. You'll need to let it buffer a few seconds' worth of video, but it will work.
Additionally, you'll need to make sure that the moov atom comes BEFORE the mdat atom in the file; otherwise it won't work. (AVFoundation, for example, creates MP4s where the moov atom comes after the mdat atom, so they would need to be modified to work.)
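If you control how the MP4 is produced, one hedged way to get the moov atom in front ("fast start") is to re-export with AVAssetExportSession and let it optimize for network use; the file URLs are placeholders:

```swift
import AVFoundation

// Re-export an MP4 so the moov atom is written before the mdat atom
// ("fast start"); progressive playback of a file that is still being
// written or decrypted depends on this layout.
func exportFastStartMP4(from sourceURL: URL, to destinationURL: URL,
                        completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetPassthrough) else {
        completion(false)
        return
    }
    export.outputURL = destinationURL
    export.outputFileType = .mp4
    export.shouldOptimizeForNetworkUse = true // places moov ahead of mdat
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```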
A working solution is a local HTTP server. Note, however, that a patent application covering this approach was submitted by AuthenTec:
http://www.google.com/patents/US20120284802
I am using a PhoneGap plugin for playing a video in my iOS app. I'm able to play a video with a URL like http://easyhtml5video.com/images/happyfit2.mp4.
How do I play YouTube videos using the phonegap-videoplayer-plugin?
YouTube Terms of Service: "You agree not to access Content through any technology or means other than the video playback pages of the Service itself, the Embeddable Player, or other explicitly authorized means YouTube may designate."
There are some methods that will give you a direct link to YouTube videos. Use the "gdata" option to find all possible video formats.
Then parse the result to get the desired link. Hope this might be useful.
But everywhere I look, the code to be edited is in .m and .h files. None of the answers explain how to use it.
That's because you can only directly use MPMoviePlayerController from native apps (written in Objective-C). For any other technology you'll need an intermediary layer in between.
From googling, I found this plugin for PhoneGap that claims to integrate with MPMoviePlayerController. I've no idea if it's any good, but it might do as a starting point.
Try giving the YouTube URL in this format:
http://www.youtube.com/embed/jxXukpxNSx4
You will get the desired result, but autoplay is not enabled, as Apple does not support autoplay in order to save users' bandwidth.
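For comparison, this is roughly what the same embed looks like in a native web view (a sketch using WKWebView rather than any PhoneGap internals; reuse your own video ID):

```swift
import UIKit
import WebKit

class EmbedPlayerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Allow inline playback so the embed doesn't force fullscreen.
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true

        let webView = WKWebView(frame: view.bounds, configuration: config)
        view.addSubview(webView)

        // The /embed/ URL form; autoplay is still blocked on iOS until
        // the user starts playback.
        if let url = URL(string: "https://www.youtube.com/embed/jxXukpxNSx4?playsinline=1") {
            webView.load(URLRequest(url: url))
        }
    }
}
```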