Playing a Widevine DRM DASH URL on iOS (Swift)

I want to play a Widevine URL on iOS. Is there any player on iOS, similar to Android's ExoPlayer, that supports Widevine DRM? On Android I am currently able to play a Widevine MPD file. On iOS I get this exception:
[ERROR:/Developer/playground/google3/third_party/video/widevine/cdm_release/core/src/license.cpp(452)] CdmLicense::HandleKeyResponse: unable to parse signed license response
[ERROR:/Developer/playground/google3/third_party/video/widevine/cdm_release/core/src/cdm_engine.cpp(312)] CdmEngine::AddKey: keys not added, result = 70
[ERROR:/Developer/playground/google3/third_party/video/widevine/cdm_release/cdm/src/cdm.cpp(497)] Unexpected error 70

Widevine does have an iOS SDK; you can see it mentioned on their update page:
We had a busy year in 2015 launching new and updated products like the iOS SDK; Shaka Player with offline playback, captioning and cast support; and key security improvements for Chrome and Android. We have a ton of exciting things planned for 2016 and look forward to sharing more details in the coming months!
To get access you need to be part of the partner program, AFAIK; details on the partner program are available on their website:
https://www.widevine.com/cwip/index.html

Related

How to fill metadata info for tvOS info panel when using Airplay?

I'm fairly new to iOS.
I'm able to play streams (not local video) via AVPlayer using AirPlay.
MPNowPlayingInfo and RemoteCommandManager are also supported, using external metadata not included in the streams.
But I would like to fill the info panel with title, artwork, etc. on Apple TV/tvOS.
The image is part of the WWDC17 talk titled "Now Playing and Remote Commands on tvOS".
My question is not about tvOS apps, which the referenced talk is about, but about an iOS app that plays video via AirPlay.
My guess is that the played AVAsset needs to have metadata, which is currently empty.
I've been checking AVMutableMetadataItem, but I still don't understand whether that's what I would need to use, nor how to do it.
Does anyone have any hints?
The WWDC 2019 talk "Delivering Intuitive Media Playback with AVKit" (https://developer.apple.com/videos/play/wwdc2019/503/) covers using external metadata during AirPlay, and explains that iOS now provides an API similar to what was already present on tvOS (see around the 7-minute mark of the video, where they explain this). Hope this helps :)
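For reference, a minimal Swift sketch of how this could look, assuming iOS 12.2 or later (where AVPlayerItem.externalMetadata became available on iOS). The stream URL, titles, and artwork asset name are placeholders, not values from the question:

```swift
import AVFoundation
import UIKit

// Helper to build a metadata item for the Now Playing / AirPlay info panel.
func makeMetadataItem(_ identifier: AVMetadataIdentifier,
                      value: NSCopying & NSObjectProtocol) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = identifier
    item.value = value
    item.extendedLanguageTag = "und" // "undetermined"; items need a language tag to be picked up
    return item
}

// Placeholder stream URL.
let asset = AVURLAsset(url: URL(string: "https://example.com/stream.m3u8")!)
let playerItem = AVPlayerItem(asset: asset)

var metadata = [
    makeMetadataItem(.commonIdentifierTitle, value: "My Show Title" as NSString),
    makeMetadataItem(.iTunesMetadataTrackSubTitle, value: "Episode 1" as NSString)
]
if let artwork = UIImage(named: "artwork"),        // placeholder asset name
   let artworkData = artwork.pngData() {
    metadata.append(makeMetadataItem(.commonIdentifierArtwork,
                                     value: artworkData as NSData))
}

// On iOS 12.2+, external metadata on the player item populates the
// info panel on the AirPlay receiver, as described in the WWDC session.
playerItem.externalMetadata = metadata
let player = AVPlayer(playerItem: playerItem)
```

This sketch only runs on-device, since it depends on AVFoundation and UIKit.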

HLS/fMP4 (CMAF) redundant/fallback (Primary/Backup) workflow on Apple Devices is not working

I'm trying to publish my HLS/fMP4 (CMAF) stream to Akamai with a Primary/Backup workflow.
When I tested with Shaka Player, it works fine: whenever publishing to the Primary entry point stops, the player properly switches to the Backup stream and keeps playing.
However, it does not work in Safari on iOS 11 or macOS High Sierra.
I'm wondering if this is a limitation of Apple devices, or whether there's a compatibility issue in my master playlist.
Here's the sample master playlist file.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="default-p",NAME="audio-eng-p",LANGUAGE="eng",DEFAULT=YES,URI="https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate128K.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="default-b",NAME="audio-eng-b",LANGUAGE="eng",DEFAULT=NO,URI="https://foo.akamaized.net/cmaf/live/123456-b/FailOverTest/index_bitrate128K.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default-p"
https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate800K.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default-b"
https://foo.akamaized.net/cmaf/live/123456-b/FailOverTest/index_bitrate800K.m3u8
NOTE: for readability, I included only one video/audio pair.
Please let me know if you notice something.
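For comparison, Apple's HLS failover mechanism expects redundant variants to be listed as additional EXT-X-STREAM-INF entries with matching attributes, so the client can move to the next matching variant when one fails; splitting primary and backup into separate audio groups may prevent that matching. A hedged sketch of that arrangement, reusing the placeholder URLs from the question (whether this fixes the iOS 11 behavior is an assumption, not something I have verified):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="default",NAME="audio-eng",LANGUAGE="eng",DEFAULT=YES,URI="https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate128K.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default"
https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate800K.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default"
https://foo.akamaized.net/cmaf/live/123456-b/FailOverTest/index_bitrate800K.m3u8
```

Here the backup variant immediately follows the primary with identical BANDWIDTH, RESOLUTION, CODECS, and AUDIO attributes, which is the pattern RFC 8216 describes for redundant streams.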

Google Cast plugin for Unity can't stream Video clip

I'm trying to play a video inside Unity and stream it to Google Cast.
Google provides a plugin that enables the connection to a Cast device, and it works fine once a correct Cast App ID is given.
Unity recently added a VideoPlayer component that enables video playback on mobile devices. I tried to use both of them together to stream video content to the Cast device, but when I play the video, the app stops responding with a SIGABRT signal at
reinterpret_cast<PInvokeFunc>(_native_GCKUnityRenderRemoteDisplay)();
I also tried to play the video using the AVPro plugin, but the same issue appeared.
The plugin works just fine without a video, and the last update to the plugin was April 2016, so I suspect it has a compatibility issue with Unity's newer VideoPlayer component.
Is there something I can do about it?
There are currently no plans to update the Google Cast Unity plugin.

Live stream implementation using RTSP protocol

I'm trying to gain access to a live stream through the RTSP protocol on iOS. I'm trying to run the example from this website: http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html. It's advertised that you can just take the URL (rtsp://) and paste it into QuickTime Player, VLC, or some other player, but whenever I try, it fails. When I try in QuickTime Player it gives me this error: The document “Macintosh HD” could not be opened. The file may be damaged or may not be a movie file that is compatible with QuickTime Player.
What am I doing wrong? Is the example broken, or do I need to update some specs in the code? I'm running iOS 9.3 and it's advertised to work on iOS 7.0 and later.
I was able to play this back in VLC when compiling and running on my iOS device. You need to ensure that you are on Wi-Fi (vs. LTE or 3G). I'm on iOS 9.2.1 and played back with VLC version 2.2.2.
You can then take it a step further, as I successfully ingested it into Wowza via a Stream file with the following configuration:
{
    uri : "rtsp://[rtsp-address-as-published-on-the-app]",
    streamTimeout: 12000,
    reconnectWaitTime: 12000,
    rtpTransportMode: "udp",
    rtspValidationFrequency: 15000,
    rtspFilterUnknownTracks: true,
    rtspStreamAudioTrack: false,
    rtspStreamVideoTrack: true,
    rtpDebugSession: true,
    rtspSessionTimeout: 12000,
    rtspConnectionTimeout: 12000
}
I would suggest reviewing what the console logs say in your iOS application (Xcode), and then also taking a look at your VLC error messages/logs to see what the exact issue is when you try to play back.

Upload file from iPhone media library, in 2012

I'm about to launch a service where one of the features is uploading files with an 'upload' button on a website. Some years ago I wrote programs for iPhone, and I remember that it was impossible to upload an MP3 from the library, because each app is in its own sandbox, though I was able to upload MP3s placed in the sandbox itself.
There is an old post on SO about the impossibility to upload from the library to a website:
A html5 web app for mobile safari to upload images from the Photos.app?
Is it possible, as of May 2012, for an iPhone/iPad to be prompted with the music library when clicking on an HTML upload button?
I don't think things will evolve your way on iPhone.
I assume your service will not be in native Objective-C.
Look at the features of PhoneGap to see what interactions are currently possible:
http://docs.phonegap.com/en/1.8.0/index.html
You can probably develop a dedicated app to extract the music files using the Media Player framework and send them to your service, but I highly doubt it would pass the Apple verification team.
Apple will not allow you to do this. Although it may be possible using private APIs or perhaps the Media Player framework, it will not be accepted by Apple.
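To illustrate the dedicated-app route mentioned above, here is a rough Swift sketch (modern syntax, not 2012-era Objective-C) of picking a song with the Media Player framework and exporting it to a file the app could then upload itself. The class name and the upload step are hypothetical, and DRM-protected or cloud-only items have no assetURL, so they cannot be exported this way:

```swift
import UIKit
import MediaPlayer
import AVFoundation

final class SongExporter: NSObject, MPMediaPickerControllerDelegate {

    // Present the system media picker from a view controller.
    func pickSong(from presenter: UIViewController) {
        let picker = MPMediaPickerController(mediaTypes: .music)
        picker.delegate = self
        presenter.present(picker, animated: true)
    }

    func mediaPicker(_ mediaPicker: MPMediaPickerController,
                     didPickMediaItems mediaItemCollection: MPMediaItemCollection) {
        mediaPicker.dismiss(animated: true)
        // assetURL is nil for DRM-protected or cloud-only items.
        guard let item = mediaItemCollection.items.first,
              let assetURL = item.assetURL else { return }

        let asset = AVURLAsset(url: assetURL)
        guard let export = AVAssetExportSession(asset: asset,
                                                presetName: AVAssetExportPresetAppleM4A)
        else { return }

        let outputURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("export.m4a")
        try? FileManager.default.removeItem(at: outputURL)
        export.outputFileType = .m4a
        export.outputURL = outputURL
        export.exportAsynchronously {
            if export.status == .completed {
                // Hypothetical next step: upload outputURL with URLSession.
            }
        }
    }
}
```

This only covers getting the file out of the library inside a native app; handing files to a website's upload button from Mobile Safari remains impossible, as the answers above say.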
