I'm looking to implement DRM in an iOS video player, but I'm not sure how to go about it. To implement video DRM (while still using Apple's hardware-accelerated H.264 decoding), I need a way to feed the decrypted H.264 stream into the standard iOS video playback APIs.
According to this question, it was not possible to implement third-party DRM in September 2010. There's a thread in the Apple Developer Forums that goes nowhere. However, as of today a number of third-party DRM libraries exist: Widevine, Irdeto (PDF), Marlin. They have clearly found some way to pass a custom stream to the media player in Apple-approved apps.
I've found two leads. One is a suggestion to create a custom URL protocol, but people seem to have had little success using this with video. The other is to run a local HTTP server thread and serve the content via HTTP Live Streaming on 127.0.0.1 on the device itself. I'd like to be very sure that Apple will approve before going that route.
So - what Apple approved APIs do 3rd party DRM implementations use to get decrypted video data into the video player?
Edit: the latest BBC iPlayer for iOS allows programmes to be downloaded for later viewing. Either they store the content in the clear, or they have cracked this problem.
You can begin decrypting the file into a second file and play that second file back as you decrypt. You'll need to let it buffer a few seconds' worth of video first, but it will work.
Additionally, you'll need to make sure that the moov atom comes BEFORE the mdat atom in the file, otherwise it won't work. (AVFoundation, for example, creates MP4s where the moov atom comes after the mdat atom, so those files would need to be modified to work.)
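For illustration, here is a rough sketch of the flow described above, not a production implementation: `decrypt(_:)` is a placeholder for whatever cipher the DRM scheme uses, the chunk size and the few-seconds head start are arbitrary, and error handling is omitted. It simply assumes that a progressively growing MP4 with moov before mdat is playable, as the answer claims.

```swift
import AVFoundation
import Foundation

// Sketch: decrypt the protected file into a second file on a background queue,
// then start AVPlayer on the partially written output after a short head start.
final class ProgressivePlayer {
    private var player: AVPlayer?

    func play(encryptedFileURL: URL, decryptedFileURL: URL, decrypt: @escaping (Data) -> Data) {
        try? Data().write(to: decryptedFileURL)            // create an empty output file
        guard let input = try? FileHandle(forReadingFrom: encryptedFileURL),
              let output = try? FileHandle(forWritingTo: decryptedFileURL) else { return }

        // Decrypt a chunk at a time so playback can start before the file is complete.
        DispatchQueue.global(qos: .userInitiated).async {
            while true {
                let chunk = input.readData(ofLength: 512 * 1024)
                if chunk.isEmpty { break }
                output.write(decrypt(chunk))
            }
            output.closeFile()
        }

        // Give the decryptor a head start so a few seconds of video exist on disk.
        // The moov atom must precede the mdat atom or playback of the growing file fails.
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            self.player = AVPlayer(url: decryptedFileURL)
            self.player?.play()
        }
    }
}
```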
A working solution is a local HTTP server. Note, however, that a patent application covering this approach was filed by AuthenTec:
http://www.google.com/patents/US20120284802
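To make the local-server idea concrete, here is a minimal sketch of a loopback HTTP responder using the Network framework. Everything specific is an assumption for illustration: the port, the one-request-per-connection handling, and the `contentProvider` closure that maps a request path to decrypted bytes (playlist or segment).

```swift
import Foundation
import Network

// Sketch: a tiny HTTP responder bound to the loopback interface, so AVPlayer can
// fetch app-decrypted HLS content from 127.0.0.1. Not production-ready: it ignores
// Content-Type (application/vnd.apple.mpegurl, video/mp2t), Range requests, and errors.
final class LoopbackServer {
    private let listener: NWListener

    init(port: UInt16) throws {
        let params = NWParameters.tcp
        params.requiredInterfaceType = .loopback          // nothing off-device can connect
        listener = try NWListener(using: params, on: NWEndpoint.Port(rawValue: port)!)
    }

    func start(contentProvider: @escaping (String) -> Data?) {
        listener.newConnectionHandler = { connection in
            connection.start(queue: .global())
            connection.receive(minimumIncompleteLength: 1, maximumLength: 64 * 1024) { data, _, _, _ in
                guard let data = data,
                      let request = String(data: data, encoding: .utf8),
                      let path = request.split(separator: " ").dropFirst().first else {
                    connection.cancel(); return
                }
                let body = contentProvider(String(path)) ?? Data()
                var response = "HTTP/1.1 200 OK\r\nContent-Length: \(body.count)\r\n\r\n".data(using: .utf8)!
                response.append(body)
                connection.send(content: response, completion: .contentProcessed { _ in
                    connection.cancel()
                })
            }
        }
        listener.start(queue: .global())
    }
}
```

AVPlayer would then be pointed at something like http://127.0.0.1:8080/index.m3u8 (names illustrative), with the closure returning the app-generated playlist and decrypted segments.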
I need a solution to protect music files downloaded in a music app.
We have all the rights to the audio, so we need to guarantee that only our app is able to play these files.
The app is currently streaming-only. The next update will add the ability to download and play music offline.
I know Spotify, for example, uses DRM protection, but that is a little controversial for some people, and I don't think it is what we need right now.
During my research, I haven't found any concrete solution. So my question is: which features, libraries, or resources can I use to protect the downloaded files?
Maybe I need to encrypt/decrypt the files? Does Swift have native functionality for this, and is there documentation available?
So, what can I use to protect the audio with Swift, and keep the audio playable only inside my own app?
This question gets asked almost daily and the answer is, and will always be, the same: if a user can play your audio on their device, then they can also extract and keep a copy of that audio; no amount of DRM, encryption or any other naive concept anyone dreams up can change this.
You can prevent "script kiddies" from just copying the files off their phone by embedding an encryption key in your app and streaming files through a stream cipher before playing them, but again, it's trivial to reverse engineer and get the key.
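As a concrete illustration of that "embedded key" speed bump, here is a minimal sketch using CryptoKit (iOS 13+). Note it uses AES-GCM rather than a literal stream cipher, the hard-coded key and the `AudioVault` name are invented for the example, and, as the answer says, anyone who reverse-engineers the app can recover the key.

```swift
import CryptoKit
import Foundation

// Sketch: encrypt the file on download, decrypt it in memory just before playback.
// The key baked into the binary is exactly what makes this easy to defeat.
enum AudioVault {
    private static let key = SymmetricKey(data: Data(repeating: 0x2A, count: 32)) // naive on purpose

    static func protect(downloadedData: Data, to url: URL) throws {
        let sealed = try AES.GCM.seal(downloadedData, using: key)
        try sealed.combined!.write(to: url)              // nonce + ciphertext + tag in one blob
    }

    static func openForPlayback(url: URL) throws -> Data {
        let box = try AES.GCM.SealedBox(combined: Data(contentsOf: url))
        return try AES.GCM.open(box, using: key)         // plaintext audio, kept in memory only
    }
}
```

The decrypted Data can then be handed to something like AVAudioPlayer(data:) so the plaintext never touches disk, but that only raises the bar slightly.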
You can transcode your .mp3 files to HLS, which will produce one master playlist and several segment files, and then apply AES-128 encryption to the segments using ffmpeg or Apple's media file segmenter.
For More Info:
https://www.theoplayer.com/blog/content-protection-for-hls-with-aes-128-encryption
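That covers the packaging side. On the playback side, the app can hand the AES-128 key to AVPlayer itself via AVAssetResourceLoaderDelegate. The sketch below assumes the playlist's #EXT-X-KEY URI uses a custom scheme (here "ckey://", an invented example) so the delegate is consulted, and the hard-coded key stands in for however you actually fetch or derive it.

```swift
import AVFoundation

// Sketch: supply the AES-128 key for an encrypted HLS stream from inside the app.
// Assumes a playlist line such as: #EXT-X-KEY:METHOD=AES-128,URI="ckey://track42"
final class KeyLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url, url.scheme == "ckey" else { return false }
        // Placeholder: look up the 16-byte key for this track however your backend works.
        let key = Data(repeating: 0x00, count: 16)
        loadingRequest.dataRequest?.respond(with: key)
        loadingRequest.finishLoading()
        return true
    }
}

let keyLoader = KeyLoader()                               // must stay retained while playing
let asset = AVURLAsset(url: URL(string: "https://example.com/track42/index.m3u8")!)
asset.resourceLoader.setDelegate(keyLoader, queue: DispatchQueue(label: "hls.key.loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()
```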
We are a couple of software developers and we were planning to build a commercial extension or a website through which users with slow internet connections or limited data could play almost any video through YouTube's API. However, while going through the API docs, we came across the following section:
Your API Client will not, and You will not encourage or create functionality for Your users or other third parties to:
"separate, isolate, or modify the audio or video components of any YouTube audiovisual content made available through the YouTube API"
-Kuan Yong, YouTube API Team
Is there no legal way for us to disable just the video and stream only the audio, not even with the commercial API?
Hoping for a positive reply.
Playing just audio means isolating audio, which is a clear violation of the terms and conditions of the YouTube API. However, as far as I know, there is no such restriction when not using the YouTube API. For example, youtube-dl, which is a video downloader for YouTube and hundreds of other websites, has the functionality to download just audio.
In fact, in YouTube's format, audio and video are actually stored as separate files on the server. This means you can acquire the path to the audio file and stream it to your users. Check out the open-source youtube-dl, specifically its youtube module, for more details.
Edit
The reason I am merely linking to a library is that YouTube continually changes the way they format, store, and stream video. This means that retrieving the direct link to a video's audio component is a task subject to frequent change. If you can link against youtube-dl's python module, you can make use of their updates whenever such changes occur.
Disclaimer
I am not fully sure if this is all legal and whatnot. Hence, I take no responsibility for your decision on what to do, nor do I condone this kind of behavior.
The docs are a little hard to parse here. I was wondering if there was any way to
Stream YouTube live into an iOS app, without significant/any YouTube branding.
Stream from an iOS device as a broadcast stream for YouTube live.
My initial Googling turned up mixed responses. I was hoping to see an example of this if it's possible, or save myself some time if it's not.
Suppose I have a person on AT&T next to a person on Verizon, both streaming content, and I want to make the two feeds appear as a single uninterrupted stream that switches back and forth. Does YouTube or any library do anything to facilitate this?
Streaming from an iOS device is no different than streaming from any other device. You would have to write an h264 encoder and RTMP packetizer, and send the video to your YouTube stream object's ingestionAddress. Outlining the details of the encoder beyond the above is too broad for Stack Overflow, but I highly recommend looking at the VideoCore iOS project.
As far as branding goes, the only way to play back YouTube content in an iOS app without breaking YouTube's terms of service is to play the video in a UIWebView or YouTube's iOS player helper library (which is just a web view with some playback interfaces).
There is no way to completely remove YouTube branding from the IFrame player. However, there are branding options you can toggle, such as the modestbranding player parameter. See the IFrame player parameters documentation.
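For reference, a minimal sketch of embedding the IFrame player in a WKWebView with reduced branding. The video ID, sizing and HTML are placeholders, and modestbranding only reduces, not removes, YouTube's branding.

```swift
import UIKit
import WebKit

// Sketch: play a YouTube video inline via the IFrame embed URL, with modestbranding=1.
final class PlayerViewController: UIViewController {
    private var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true          // play inline instead of forcing fullscreen
        webView = WKWebView(frame: view.bounds, configuration: config)
        view.addSubview(webView)

        let html = """
        <body style="margin:0">
          <iframe width="100%" height="100%" frameborder="0"
                  src="https://www.youtube.com/embed/VIDEO_ID?modestbranding=1&playsinline=1"
                  allowfullscreen></iframe>
        </body>
        """
        webView.loadHTMLString(html, baseURL: URL(string: "https://www.youtube.com"))
    }
}
```

YouTube's own iOS player helper library wraps essentially the same web view approach behind a native interface.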
I've been searching for a while on stackoverflow and around the web for a solution to my video-streaming problem. I need to stream live video being captured from the camera (no high-quality required) from an iOS device to a remote PC in one way, i.e., the iOS device will be sending a video stream to the server/PC but not the opposite.
What appears after some googling and documentation browsing is that there are two main major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is meant to be used from an encoding perspective on the server side and from a decoding perspective on the iOS side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/Library preferably open and free that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame, and send them over HTTP (a rough sketch of that idea follows below). Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
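Here is a hedged sketch of that capture-and-compress guess, using the hardware H.264 encoder via VideoToolbox rather than software compression. The resolution is arbitrary, NSCameraUsageDescription is assumed to be in Info.plist, and the part the question is really asking about (packaging the encoded samples for HLS, RTMP or plain HTTP) is deliberately left as a stub.

```swift
import AVFoundation
import VideoToolbox

// Sketch: capture camera frames with AVFoundation and feed them to the hardware H.264 encoder.
final class CameraEncoder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let captureSession = AVCaptureSession()
    private var compressor: VTCompressionSession?

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        captureSession.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.capture"))
        captureSession.addOutput(output)

        // Hardware H.264 encoder; 1280x720 is an arbitrary choice for this sketch.
        let status = VTCompressionSessionCreate(allocator: nil, width: 1280, height: 720,
                                                codecType: kCMVideoCodecType_H264,
                                                encoderSpecification: nil, imageBufferAttributes: nil,
                                                compressedDataAllocator: nil, outputCallback: nil,
                                                refcon: nil, compressionSessionOut: &compressor)
        guard status == noErr else { return }
        captureSession.startRunning()
    }

    // Called for every captured frame; hand the pixel buffer to the encoder.
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let compressor = compressor,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        _ = VTCompressionSessionEncodeFrame(
            compressor,
            imageBuffer: pixelBuffer,
            presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
            duration: .invalid,
            frameProperties: nil,
            infoFlagsOut: nil,
            outputHandler: { _, _, encodedBuffer in
                // `encodedBuffer` holds the encoded H.264 sample; an HLS segmenter or
                // RTMP packetizer would take over here and send it to the server.
            })
    }
}
```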
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
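For orientation, a rough Swift sketch of what using that VCSimpleSession API might look like. The initializer and method labels are guesses at how the Objective-C header bridges into Swift (they may differ by version), a bridging header or module setup is assumed, and the RTMP URL and stream key are placeholders.

```swift
import UIKit
// VideoCore is assumed to be installed via CocoaPods and exposed through a bridging header.

final class BroadcastViewController: UIViewController, VCSessionDelegate {
    // Approximate Swift bridging of VCSimpleSession's designated initializer.
    private lazy var session = VCSimpleSession(videoSize: CGSize(width: 1280, height: 720),
                                               frameRate: 30,
                                               bitrate: 1_000_000,
                                               useInterfaceOrientation: false)

    override func viewDidLoad() {
        super.viewDidLoad()
        session.previewView.frame = view.bounds           // live camera preview
        view.addSubview(session.previewView)
        session.delegate = self
        // Placeholder endpoint: for YouTube Live this would be the stream's ingestionAddress plus stream name.
        session.startRtmpSession(withURL: "rtmp://example.com/live", andStreamKey: "MY_STREAM_KEY")
    }

    func connectionStatusChanged(_ sessionState: VCSessionState) {
        // React here to the session state changes (starting, started, ended, error).
    }
}
```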
Apple's HTTP Live Streaming Overview document clearly states that streaming video exceeding 10 minutes cannot be delivered by progressive download and MUST be transferred using HTTP Live Streaming (HLS). It also states that the latency of HLS is in the neighbourhood of 30 seconds.
In my case, I am building an app that needs to receive live streamed video in near real time. So on the one hand I must use HLS for live streaming, but it is not fast enough; on the other hand, I cannot use anything else, because it seems anything other than HLS is not allowed. I know RTSP is possible on iOS, but will it be approved for the App Store?
Cheers,
M.
Apple is not transparent. The only way to know if an app will be rejected or accepted is to submit it.