Is there any way to buffer a live stream?
I searched a lot but didn't find any official answer; different people have different views on this.
Many people said it has been possible since iOS 10, but I didn't fully understand how.
Some answered that I should use a caching proxy, but I did not understand that either.
Thanks for your valuable time.
I'll share a good answer I found:
https://stackoverflow.com/a/36538295/2037169
To sum it up: it points to a WWDC 2016 session on how to work with HLS downloads from iOS 10.
I did a proof of concept today using this example and it works great.
I'm currently looking into how to use HLS on tvOS.
The AVFoundation class used in the example seems to apply to iOS only:
NS_CLASS_AVAILABLE_IOS(9_0) __TVOS_PROHIBITED
@interface AVAssetDownloadTask : NSURLSessionTask
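For anyone who wants to try it, here's a rough Swift sketch of how that download API is typically used from iOS 10. The playlist URL, session identifier, and title below are placeholders, not values from the linked answer:

import AVFoundation

// Downloads an HLS stream for offline playback using AVAssetDownloadURLSession.
// Note the __TVOS_PROHIBITED in the header above: this API is iOS-only.
class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    func startDownload() {
        // AVAssetDownloadURLSession requires a background configuration.
        let configuration = URLSessionConfiguration.background(withIdentifier: "hls-download")
        session = AVAssetDownloadURLSession(configuration: configuration,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)
        // Placeholder playlist URL.
        let asset = AVURLAsset(url: URL(string: "https://example.com/stream/playlist.m3u8")!)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "My Stream",
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    // Called with the on-disk location; persist this URL for offline playback.
    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Downloaded to \(location)")
    }
}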
First, sorry because this is not a specific code issue (I can play videos on the iPhone), so I'm not attaching any of my code, but asking for a technical solution.
I'm developing a mobile application (and also a web app) that plays videos coming from a Node.js server. At first I noticed that in Safari you can only play videos via streaming (which is also best practice in the rest of the browsers), but it was very slow (the video took a long time to load).
I came across this piece of code, and the author's post, which helped me improve my server-side streaming code:
https://github.com/davidgatti/How-to-Stream-Movies-using-NodeJS/blob/master/routes/video.js
I didn't need to change anything in the web app, but now I can play videos much faster in Mac/Safari (in HTML5 I have simple <video> tags).
But nothing changed in the Ionic app... and I don't know where to look next, or where the problem could be (Ionic/Cordova or Node.js).
What could I be missing? Any link, known Ionic bug, or trick would help a lot.
UPDATE:
I'm trying with .mov and .mp4 video files. What's the ideal format (or compression) for iPhone?
UPDATE 2:
Is it a good choice to handle videos with a cloud video solution like Ustream and embed them as an iframe (the solution provided by Ustream)? Nothing else seems to improve the load time while managing videos on my own server and Ionic client.
Thanks so much
I am currently in the research and prototyping stages of a project to develop a native iOS app (Swift 3) that includes a multi-channel audio player (multiple stereo MP3 files). I have found very limited information online, particularly written in Swift 3, so thought as I continue my research I would pose a question here.
Regarding frameworks it seems clear from what I've looked at so far that AVFoundation is going to do the job. It's not too low level and has a good set of functionality. It has support for playing multiple audio files with AVAudioPlayer. I am planning to start prototyping something with this soon.
But I am new to Swift and to iOS development with its huge number of libraries, so I'm wondering if I'm missing anything and whether I'm on the right track here. Any answers with general information and thoughts on this will be up-voted. For an accepted answer I'm looking for some sample outline code using an appropriate framework: AVFoundation or a justified alternative.
If no answer is forthcoming I will post my own code when I get there.
Specifically, I need from two to ten input channels, from MP3 files within the project resources, each with its own individually adjustable gain, all mixed, with their stereo channels maintained, to a single output (the device) with a master gain. Some of the tracks need to loop, others not. The tracks need to be accurately synchronised. This is just for info; outline code covering the important points would be fine.
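While I wait for answers, here is the kind of outline I have in mind: a minimal sketch using AVAudioEngine (part of AVFoundation), which looks better suited than multiple AVAudioPlayers for tight sync and per-track gain. The track names are hypothetical and the 0.1-second start offset is an arbitrary safety margin, so treat this as a sketch rather than a finished implementation:

import AVFoundation

final class MultiTrackPlayer {
    private let engine = AVAudioEngine()
    private var players: [AVAudioPlayerNode] = []

    func load(trackNames: [String], loopingTracks: Set<Int>) throws {
        for (index, name) in trackNames.enumerated() {
            guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else { continue }
            let file = try AVAudioFile(forReading: url)
            // Read the whole file into memory so looping is seamless.
            guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                                frameCapacity: AVAudioFrameCount(file.length)) else { continue }
            try file.read(into: buffer)

            let player = AVAudioPlayerNode()
            engine.attach(player)
            // Connecting with the buffer's own format preserves the stereo channels;
            // the main mixer sums all players into one output.
            engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
            player.scheduleBuffer(buffer, at: nil,
                                  options: loopingTracks.contains(index) ? .loops : [],
                                  completionHandler: nil)
            players.append(player)
        }
        try engine.start()
    }

    // Per-track gain, 0.0 to 1.0.
    func setGain(_ gain: Float, forTrack index: Int) {
        players[index].volume = gain
    }

    // Master gain applied at the mixer output.
    func setMasterGain(_ gain: Float) {
        engine.mainMixerNode.outputVolume = gain
    }

    // Start all tracks at a common future host time so they stay in sync.
    func playAll() {
        let startTime = AVAudioTime(hostTime: mach_absolute_time() +
                                    AVAudioTime.hostTime(forSeconds: 0.1))
        for player in players { player.play(at: startTime) }
    }
}

The key design point is scheduling all players against the same AVAudioTime, which gives much tighter synchronisation than calling play() on several AVAudioPlayers in a loop.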
Research Notes and Resources
Apple: AVFoundation
A collection of resources relating to AVFoundation.
Apple: AVFoundation Programming Guide
This document seems encouraging at first, but actually only deals with video. It says:
There are two facets to the AVFoundation framework—APIs related to video and APIs related just to audio. The older audio-related classes provide easy ways to deal with audio. They are described in the Multimedia Programming Guide, not in this document.
The "Multimedia Programming Guide" which is also mentioned elsewhere at Apple in relation to this, is never linked and Google results point to not found pages on the Apple site. It seems to have disappeared.
Rudi Strahl: Mixing Multiple Audio Tracks with AVFoundation
Compares using AVComposition to using multiple AVPlayers. Example code is Objective-C. Not sure how the AVPlayers are mixed in the second solution. Perhaps with AVAudioMix. Currently looking at this. The article talks a little about it but doesn't deliver any specifics.
Audio Session Programming Guide
This document looks at AVAudioSession, which provides supporting functionality (a minimal setup sketch follows the quoted list below):
AVAudioSession gives you control over your app's audio behavior. You can:
Select the appropriate input and output routes for your app
Determine how your app integrates audio from other apps
Handle interruptions from other apps
Automatically configure audio for the type of app you are creating
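For example, a minimal session setup for plain playback might look like this (Swift 3 API names; just a sketch, not something I've tested in anger yet):

import AVFoundation

// Configure the shared audio session for playback before starting the engine.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Swift 3 string constant; later SDKs use AVAudioSession.Category.playback.
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setActive(true)
    } catch {
        print("Failed to configure the audio session: \(error)")
    }
}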
Techotopia: Playing Audio on iOS 10 using AVAudioPlayer
Some useful information on using AVAudioPlayer.
Stack Overflow: Playing a Sound with AVAudioPlayer
Basic Swift code for playing a sound. Some answers include a little extra functionality.
Hacking with Swift: How to Play Sounds Using AVAudioPlayer
Again, covers the basics.
Sweet Tutos: How To Play Sound Files And Manage Duration Progress – AVAudioPlayer Tutorial
Updated to Swift 3. Some useful info.
Xamarin: Playing Sound with AVAudioPlayer
Written in Swift 2, I think.
Apple Video: WWDC 2013 Moving to AV Kit and AV Foundation
While not directly related, I found that the first 30 minutes of this video, which introduces developers to AV Kit and AV Foundation on OS X, provide a useful overview of the technology.
I was working on the same problem. The best I could do was to transcode the media content so it can be played using AVPlayer. Here is a draft; maybe it can help.
I'm trying to put together an open source library that allows iOS devices to play files with unsupported containers, as long as the track formats/codecs are supported. e.g.: a Matroska video (MKV) file with an H264 video track and an AAC audio track. I'm making an app that surely could use that functionality and I bet there are many more out there that would benefit from it. Any help you can give (by commenting here or, even better, collaborating with me) is much appreciated. This is where I'm at so far:
I did a bit of research trying to find out how players like AVPlayerHD or Infuse can play non-standard containers and still have hardware acceleration. It seems like they transcode small chunks of the whole video file and play those in sequence instead.
It's a good solution, but if you want to throw that video to an Apple TV, things don't work as planned, since the video is actually a bunch of smaller chunks being played as a playlist. This site has way more info, but at its core, streaming to Apple TV is essentially a progressive download of the MP4/M4V file being played.
I'm thinking a sort of streaming proxy is the way to go. For the playing side of things, I've been investigating AVSampleBufferDisplayLayer (more info here) as a way of playing the video track. I haven't gotten to audio yet. Things get interesting when you think about the AirPlay side of things: by having a "container proxy", we can make any file look like it has the right container without the file size implications of transcoding.
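To make the playback side concrete, here's a hedged sketch of feeding AVSampleBufferDisplayLayer. It assumes a demuxer that already produces CMSampleBuffers in AVCC (length-prefixed) form, with a format description built from the stream's SPS/PPS; nextSampleBuffer() is a hypothetical stand-in for that demuxer:

import AVFoundation
import CoreMedia

// Hypothetical demuxer call; replace with output from your container parser.
func nextSampleBuffer() -> CMSampleBuffer? {
    return nil // stub
}

let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = AVLayerVideoGravityResizeAspect

// The layer pulls samples on its own queue and decodes them in hardware.
let feedQueue = DispatchQueue(label: "video.feed")
displayLayer.requestMediaDataWhenReady(on: feedQueue) {
    while displayLayer.isReadyForMoreMediaData {
        guard let sample = nextSampleBuffer() else { return }
        displayLayer.enqueue(sample)
    }
}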
It seems like GStreamer might be a good starting point for the proxy. I need to read up on it; I've never used it before. Does this approach sound like a good one for a library that could be used for App Store apps?
Thanks!
I finally got some extra time to go over GStreamer, especially this article about how it has already been updated to use the hardware decoding provided by iOS 8. So there's no need to develop this; GStreamer seems to be the answer.
Thanks!
The 'chunked' solution is no longer necessary in iOS 8. You should simply set up a video decode session and pass in NALUs.
https://developer.apple.com/videos/wwdc/2014/#513
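For reference, a sketch of what such a decode session might look like with VideoToolbox. The callback just logs frames, and formatDesc is assumed to come from CMVideoFormatDescriptionCreateFromH264ParameterSets with the stream's SPS/PPS, so treat the details as assumptions rather than a complete recipe:

import VideoToolbox

func makeDecompressionSession(formatDesc: CMVideoFormatDescription) -> VTDecompressionSession? {
    var session: VTDecompressionSession?
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: { _, _, status, _, imageBuffer, presentationTime, _ in
            // Each decoded frame arrives here as a CVImageBuffer.
            guard status == noErr, let frame = imageBuffer else { return }
            print("Decoded frame at \(presentationTime.seconds): \(frame)")
        },
        decompressionOutputRefCon: nil)
    let status = VTDecompressionSessionCreate(allocator: kCFAllocatorDefault,
                                              formatDescription: formatDesc,
                                              decoderSpecification: nil,
                                              imageBufferAttributes: nil,
                                              outputCallback: &callback,
                                              decompressionSessionOut: &session)
    return status == noErr ? session : nil
}

Once the session exists, you feed it CMSampleBuffers built from your NALUs via VTDecompressionSessionDecodeFrame.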
I am trying to implement progressive downloading of a video in my iOS application so that it can be played through AVPlayer. I have already implemented a downloader module that can download the files to the iPad. However, I have discovered that I cannot play a file that is still being written to.
So, as far as I can tell, my only solution would be to download a list of file 'chunks' and then keep playing through each file as it becomes ready (i.e. downloaded), probably using HLS.
While searching, I have come across this question, which implements progressive download through HLS, but other than that I can find no other way.
However, I keep coming across search results explaining how to configure web servers to leverage the iOS support for HTTP progressive downloading, with no mention of how to do it from the iOS side.
So, does anyone have any ideas and/or experience with this?
EDIT: I have also found there could be a way of doing it the other way around (i.e. streaming, then writing the streamed data to disk), as suggested by this question, but I still cannot get it to work, as it seems not to work with non-local assets!
From what you say, you might want to change approach and attempt to stream the file; downloading and playing at the same time is, I would say, the definition of streaming. I hate it when people post links to the Apple documentation, but in this instance reading a tiny bit of it will help you more than I ever can. It should all make sense if you are already working with connections and video; you just need to change your approach.
The link: https://developer.apple.com/library/ios/documentation/networkinginternet/conceptual/streamingmediaguide/Introduction/Introduction.html
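In practice the change of approach can be this small. A sketch with a placeholder HLS playlist URL:

import AVFoundation

// Point AVPlayer at an HLS playlist and let it buffer and play as the data
// arrives, instead of downloading the whole file first.
let streamURL = URL(string: "https://example.com/video/playlist.m3u8")!
let player = AVPlayer(url: streamURL)
player.play()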
I'm working on improving the experience of a site by adding iPad support. This includes support for videos. Our client is pushing towards a YouTube model for storing and serving videos, which is great for us! I originally planned to use YouTube's new HTML5-supporting <iframe> snippets. This offloads the device detection to YouTube and makes embedding a video a cinch, as we don't need to worry about compatibility. It turns out that the CMS we're using, Sitecore CMS, strips <iframe>s out of our WYSIWYG editor content. After a lot of research, it looks like it's quite hard to prevent that.
Fast forward to now... I tested out the old-style <embed> code and discovered that even though iOS doesn't support Flash, these embeds seem to work fine on the iPad. Some Stack Overflow research led me to this post, which suggests it's because of the YouTube plugin /System/Library/Internet Plug-Ins/YouTubePlugIn.webplugin on iPads that allows the playback.
My question is: is there any documentation that this is the exact reason? I'd like to cite this as the reason we can use the regular <embed> code, but I need to back it up with proof via an official iOS document. Is this YouTube plug-in on every iPad by default, or do users need to install it manually? This seems like a great solution considering our unfortunate incompatibility with <iframe>s, but I need to support the use of <embed>s with hard facts. Thanks in advance.
The answer you are looking for is to be found in Apple's URL Scheme Reference. Basically, it's a mechanism that comes into play on iDevices to detect and specially handle certain types of URLs, for instance Google Maps, iTunes, and YouTube links.
Here are a few reference links.
https://developer.apple.com/library/archive/featuredarticles/iPhoneURLScheme_Reference/Introduction/Introduction.html#//apple_ref/doc/uid/TP40007899
https://developer.apple.com/library/archive/featuredarticles/iPhoneURLScheme_Reference/YouTubeLinks/YouTubeLinks.html#//apple_ref/doc/uid/TP40007895-SW1
And just for good measure, you might also want to take a look at the Safari Developer Library for the best practice recommendations on HTML5 Video and Audio embedding :-)