Does AVPlayer support closed captions delivered in a separate text file? - ios

My team produces iOS apps that play video using AVPlayer. We've recently been told that we MUST allow display of closed captions for all videos... but that the closed captions would not be coming as a track within the video files (we already support closed captions that come in this way). Instead, we'll be getting them as a separate text file. I've seen a couple references to including the caption text file in the SMIL, but I've found nothing about how to incorporate this text file into the playback experience. Does anyone have any personal experience with this, or know of any online documentation/tutorials that would help?

OK, we have a plan now, though it's a little complicated because DRM is involved. The simplified version is that we're going to do what's described in the What's New in HTTP Live Streaming talk from WWDC 2012 (https://developer.apple.com/videos/wwdc/2012/?id=512): namely, create a playlist that references our webvtt file(s), and then reference that playlist from the main m3u8. This will give us closed-captions-as-subtitles in iOS6.
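For reference, the master playlist ends up looking something like this (a rough sketch; the group name, bitrate, codec string and file names are placeholders for our real values):

    #EXTM3U
    #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="subtitles/eng/captions.m3u8"
    #EXT-X-STREAM-INF:BANDWIDTH=1280000,CODECS="avc1.42e01e,mp4a.40.2",SUBTITLES="subs"
    video/1280k/prog_index.m3u8

The referenced captions.m3u8 is then just a playlist whose only segment is the WebVTT file (the 600 standing in for the video duration in seconds):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:600
    #EXT-X-PLAYLIST-TYPE:VOD
    #EXTINF:600.0,
    captions.webvtt
    #EXT-X-ENDLIST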

Related

Play streamed video on iPhone in an Ionic 3/Cordova iOS app, served from Node.js, without long load times

First, sorry, this is not a specific code issue (I can play videos on iPhone), so I'm not attaching any of my code; I'm asking for a technical solution.
I'm developing a mobile application (and also a webapp) that plays videos coming from a Node.js server. At first I noticed that in Safari you can only play videos via streaming (which is also the best practice in the rest of the browsers), but it was very slow (the video took a long time to load).
I came across this piece of code, and the author's post; it helped me improve my server-side streaming code:
https://github.com/davidgatti/How-to-Stream-Movies-using-NodeJS/blob/master/routes/video.js
I didn't need to change anything in the webapp, and now I can play videos much faster in Safari on the Mac (the HTML5 side just uses simple tags).
But nothing changed in the Ionic app... and I don't know how to proceed or where the problem could be (Ionic/Cordova or Node.js).
What could I be missing? Any link, known Ionic bug, or trick would help a lot.
UPDATE:
I'm trying with .mov and .mp4 video files. What's the ideal format (or compression) for iPhone?
UPDATE 2:
Is it a good choice to handle videos with a cloud video solution like Ustream, and embed them as an iframe (the solution Ustream provides)? Nothing else seems to improve the load time while managing videos on my own server and Ionic client.
Thanks so much

Can iOS Kamcord SDK be used to record screens of apps other than games?

I am looking for a solution to record the screen of my app - especially an online video. Initially I used this link to implement this:
http://codethink.no-ip.org/wordpress/archives/673
This does record everything selected in the view; however, the YouTube player shows up as a black screen. After some research I came across the Kamcord library used by games.
Anyone know if this can be used to record non-game app views - especially online videos embedded in UIWebView?
Do you know why the video in a UIWebView is treated differently? Is it using OpenGL?
I don't think this is the answer you're hoping for, but this may be helpful to someone looking to do this in the future.
The current version of Kamcord, 1.9, does not support any recording; Kamcord is moving into the live-stream hosting business. Version 1.7 is still obtainable via GitHub. Official support is supposed to end August 24th, 2015, but I haven't had a reply back, even to questions about the new version, for weeks now, a week before the official close.
The online implementation documentation was kept on their website, and when I asked for the 1.7 documents I was told they "did not have them" since no backup was kept. I suggested they look into source control systems. You can get some of the older iOS documents from the Wayback Machine (please consider donating), but the Android documents are nowhere to be found (nor did the SDK do basic things like let you control video length on Android).

Designing a library for hardware-accelerated playback of unsupported containers on iOS (and AirPlay)

I'm trying to put together an open source library that allows iOS devices to play files with unsupported containers, as long as the track formats/codecs are supported. e.g.: a Matroska video (MKV) file with an H264 video track and an AAC audio track. I'm making an app that surely could use that functionality and I bet there are many more out there that would benefit from it. Any help you can give (by commenting here or—even better— collaborating with me) is much appreciated. This is where I'm at so far:
I did a bit of research trying to find out how players like AVPlayerHD or Infuse can play non-standard containers and still have hardware acceleration. It seems like they transcode small chunks of the whole video file and play those in sequence instead.
It's a good solution. But if you want to throw that video to an Apple TV, things don't work as planned since the video is actually a bunch of smaller chunks being played as a playlist. This site has way more info, but at its core streaming to Apple TV is essentially a progressive download of the MP4/MPV file being played.
I'm thinking a sort of streaming proxy is the way to go. For the playing side of things, I've been investigating AVSampleBufferDisplayLayer (more info here) as a way of playing the video track. I haven't gotten to audio yet. Things get interesting when you think about the AirPlay side of things: by having a "container proxy", we can make any file look like it has the right container without the file size implications of transcoding.
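To show where I'm at with that, here is the rough shape of the AVSampleBufferDisplayLayer path I've been prototyping (a minimal Swift sketch; it assumes the demuxed samples are already readable through an AVAssetReader, which is exactly the part the container proxy would have to provide, and all names are mine):

    import AVFoundation

    // Rough sketch: read the compressed video samples and hand them straight to an
    // AVSampleBufferDisplayLayer, which decodes them in hardware (iOS 8+).
    func playVideoTrack(of asset: AVAsset, on layer: AVSampleBufferDisplayLayer) throws {
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        // nil outputSettings keeps the samples compressed (passthrough);
        // the display layer takes care of decoding.
        let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
        reader.add(output)
        guard reader.startReading() else { return }

        layer.requestMediaDataWhenReady(on: DispatchQueue(label: "video.feeder")) {
            while layer.isReadyForMoreMediaData {
                guard let sample = output.copyNextSampleBuffer() else {
                    layer.stopRequestingMediaData()
                    return
                }
                layer.enqueue(sample)
            }
        }
    }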
It seems like GStreamer might be a good starting point for the proxy. I need to read up on it; I've never used it before. Does this approach sound like a good one for a library that could be used for App Store apps?
Thanks!
Finally got some extra time to go over GStreamer. Especially this article about how it is already updated to use the hardware decoding provided by iOS 8. So no need to develop this; GStreamer seems to be the answer.
Thanks!
The 'chunked' solution is no longer necessary in iOS 8. You should simply set up a video decode session and pass in NALUs.
https://developer.apple.com/videos/wwdc/2014/#513
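As a rough illustration of the first step (a sketch only; the SPS and PPS byte arrays are assumed to have already been pulled out of the container, with start codes stripped), you build a CMVideoFormatDescription from the parameter sets. After that, each length-prefixed NALU gets wrapped in a CMSampleBuffer and handed to the decompression session or an AVSampleBufferDisplayLayer.

    import CoreMedia

    // Sketch: turn raw H.264 SPS/PPS NALUs into a CMVideoFormatDescription.
    // nalUnitHeaderLength = 4 means every subsequent NALU must carry a
    // 4-byte big-endian length prefix instead of an Annex-B start code.
    func makeH264FormatDescription(sps: [UInt8], pps: [UInt8]) -> CMVideoFormatDescription? {
        var formatDescription: CMVideoFormatDescription?
        sps.withUnsafeBufferPointer { spsBuffer in
            pps.withUnsafeBufferPointer { ppsBuffer in
                let pointers: [UnsafePointer<UInt8>] = [spsBuffer.baseAddress!, ppsBuffer.baseAddress!]
                let sizes = [sps.count, pps.count]
                let status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: 2,
                    parameterSetPointers: pointers,
                    parameterSetSizes: sizes,
                    nalUnitHeaderLength: 4,
                    formatDescriptionOut: &formatDescription)
                assert(status == noErr, "could not create the format description")
            }
        }
        return formatDescription
    }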

Progressive Video Download on iOS

I am trying to implement progressive downloading of a video in my iOS application that can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered I cannot play a file that is still being written to.
So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e. downloaded), probably using HLS.
Searching, I have come across this question, which implements progressive download through HLS, but other than that I can find no other way.
However, I keep coming across search results that describe how to configure web servers to leverage the iOS support for HTTP progressive downloading, with no mention of how to do it from the iOS side.
So, does anyone have any ideas and/or experience with this?
EDIT: I have also found there could be a way of doing it the other way around (i.e. streaming, then writing the streamed data to disk), which was suggested by this question, but I still cannot get it to work, as it seems it does not work with non-local assets!
From what you say, you might want to change your approach and attempt to stream the file. Downloading and playing at the same time is, I would say, the definition of streaming. I hate when people post links to the Apple documentation, but in this instance reading a tiny bit of it will help you more than I ever can. It should all make sense if you are already working with connections and video; you just need to change your approach.
The link: https://developer.apple.com/library/ios/documentation/networkinginternet/conceptual/streamingmediaguide/Introduction/Introduction.html
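To show how little code the streaming route needs, here is a minimal Swift sketch (modern API; the URL is a placeholder, and the host view is whatever view should show the video). Keep a reference to the returned player, or playback stops when it is deallocated:

    import AVFoundation
    import UIKit

    // Minimal sketch: point AVPlayer at a remote HLS playlist. Playback starts
    // as soon as enough of the stream has buffered; no local file is needed.
    func startStreaming(into hostView: UIView) -> AVPlayer {
        let url = URL(string: "https://example.com/video/master.m3u8")!   // placeholder URL
        let player = AVPlayer(url: url)

        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = hostView.bounds
        hostView.layer.addSublayer(playerLayer)

        player.play()
        return player
    }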

Multiple HTML5 media elements on one page in iOS (iPad)

My research has led me to learn that Apple's media element handler is a singleton, meaning I can't have a video playing while an audio is playing in the background. I'm tasked to build a slideshow presentation framework and the client wants a background audio track, timed audio voice-overs which match bullet points, and variable media which can either be an image or video - or a timed cycle of multiple media elements.
Of course, none of the media works on iOS. Each media element cancels out the previous.
My initial thought is to embed the voice-over audio into the video when there's a video present, but there's an existing Flash version of this setup which depends on existing assets so I pretty much have to use what's delivered.
Is there ANY work-around for this? I'm testing on iOS 4.3.5. The smartest devs in the world are on this site - we've got to be able to come up with something.
EDIT: Updated my iPad to iOS 5.0.1 and the issue remains.
How about using CSS to do the trick?
Maybe you know about a company called Vdopia that distributes video ads on mobile.
http://mobile.vdopia.com/index.php?page=mobilewebsolutions
They claim to have developed a so-called vdo video format, which is actually just a CSS sprite being run on the page :D
I mean you could have your "video" as a framed image, then attach an HTML5 audio tag to it.
I would like to know your response.
Are you working on a Web App or on a Native Application?
If you are working on a Web App you're in a world of hurt. This is because you simply do not have much control over things that Mobile Safari doesn't provide right away.
If this is the case I would come forth and be honest with the stakeholders.
If you are working on a Native Application you can resort to a mechanism that involves some back and forth communication between UIWebView and ObjC. It's actually doable.
The idea is the following:
Insert special <object> elements in your HTML5 documents, handcrafted according to your needs, taking special care to follow the data-* naming convention for non-standard attributes.
Here you could insert IDs, paths and other control variables in the multimedia artifacts that you want to play.
Then you could actually build some JavaScript (on top of jQuery, for example) that communicates with ObjC through the delegation mechanism on the UIWebView or through HTTP. I'll go over this choice down below.
Say that on $(document).ready() you go through all the objects that have a special class, one that you carefully choose to identify all the special <object> elements.
You build a list of such objects and pass them on to the ObjC part of your application. You could easily serialize such a list using JSON.
Then in ObjC you can do what you want with them. Play them through AVPlayer or some other framework whenever you want them played (again you would resort to a JS - ObjC bridge to actually signal the native part to play a particular element).
You can "communicate" with ObjC through the delegation pattern in UIWebView or through HTTP.
You would then have a JS - ObjC bridge in place.
The HTTP approach makes sense in some cases but it involves a lot of extra code and is resource hungry.
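To make the delegation idea concrete, here is a minimal sketch of the native half in Swift (the ObjC version is structurally identical). Everything here is illustrative: the myapp-media:// scheme, the query format and the catalog dictionary are made up, and the page would signal the native side with something like window.location = "myapp-media://play?id=intro".

    import AVFoundation
    import UIKit

    final class PresentationViewController: UIViewController, UIWebViewDelegate {

        @IBOutlet var webView: UIWebView!               // the web view hosting the HTML5 deck
        private var player: AVPlayer?
        private var mediaCatalog: [String: URL] = [:]   // id -> asset URL, built from the JSON list

        override func viewDidLoad() {
            super.viewDidLoad()
            webView.delegate = self
        }

        // JS signals the native side by navigating to a made-up scheme,
        // e.g. window.location = "myapp-media://play?id=intro";
        func webView(_ webView: UIWebView,
                     shouldStartLoadWith request: URLRequest,
                     navigationType: UIWebView.NavigationType) -> Bool {
            guard let url = request.url, url.scheme == "myapp-media" else {
                return true                             // normal navigation, let the web view handle it
            }

            let components = URLComponents(url: url, resolvingAgainstBaseURL: false)
            if url.host == "play",
               let id = components?.queryItems?.first(where: { $0.name == "id" })?.value,
               let assetURL = mediaCatalog[id] {
                player = AVPlayer(url: assetURL)
                player?.play()
            }
            return false                                // swallow the fake navigation
        }
    }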
If you are building an ObjC application and want further details on how to actually build an ObjC - JS bridge that fits these needs, get back to us :)
I'm halting this post as of now because it would be nice to know if it is in fact a Native App.
Cheers.
This is currently not possible. As you notice, when a video plays it takes up the full screen with QuickTime and moves the browser to the background. The only solution at this time is to merge the audio and video together into an MP4 and play that single item.
If I understand you correctly, you are not able to merge the audio and video together because it relies on Flash? Since iOS can't play Flash, you should merge the audio and video together and use Flash as a backup. There are numerous HTML5 players which use JavaScript to try to play the HTML5 video first and then fall back to Flash as a backup.
You mention there is an existing Flash setup of the video - is it an SWF file? Could you import it into video/audio editing software and add an audio track on top?
Something like this: http://www.youtube.com/watch?v=J2vvH7oi8m8&feature=youtube_gdata_player
Also, if it is a Flash file, will you be converting it to an AVI or the like for iOS? If you'd have to do that anyway, there is your chance to add an audio track.
Could you use a web service to merge the streams in real time with FFmpeg and then stream a single output to QuickTime?
To elaborate, maybe a library like http://directshownet.sourceforge.net/about.html could also work. It looks like they have this:
DESCombine – A class library that uses DirectShow Editing Services to combine video and audio files (or pieces of files) into a single output file. A help file (DESCombine.chm) is provided for using the class.
This could then be used to return the resulting data as the response to the call and load it via the HTML5 player.
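For what it's worth, the FFmpeg side of that merge is a one-liner (file names are placeholders; -c:v copy leaves the video untouched and only the audio gets encoded, and -shortest stops at the end of the shorter input):

    ffmpeg -i slides.mp4 -i voiceover.mp3 -c:v copy -c:a aac -shortest merged.mp4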
