Set cache size for AVPlayer - ios

Is it possible to simply set the size of the cache for an AVPlayer? I have an app where the user will often go back and forth in the video, and I'd like to make sure that there's enough caching happening. I've found code that shows how to manually handle the network requests using an AVAssetResourceLoaderDelegate, and that's pretty involved. Ideally I'm looking for something like
let player = AVPlayer(url: url)
player.urlSession = myConfiguredURLSession
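As far as I know there is no public cache-size property on AVPlayer; the closest lightweight knob is AVPlayerItem.preferredForwardBufferDuration (iOS 10+), which only hints at how far ahead to buffer rather than setting a cache size. A minimal sketch, with a placeholder URL:

import AVFoundation

let url = URL(string: "https://example.com/video.mp4")!   // placeholder URL
let item = AVPlayerItem(asset: AVURLAsset(url: url))

// Hint to keep up to 60 seconds buffered ahead of the playhead.
// This is only a hint; it is not a persistent, size-limited cache.
item.preferredForwardBufferDuration = 60

let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = true

Anything beyond that (e.g. keeping already-watched data around for scrubbing back) still seems to require the AVAssetResourceLoaderDelegate route mentioned above.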

How to add trick play to a custom iOS video player built on top of AVPlayer to play .m3u8 files?

Goal:
To add a YouTube-like preview feature when the user seeks manually using the player's seek bar.
From what I understand so far, I will have to add an "I-Frame only playlist" to my stream to enable trick play, but I am not able to figure out how to use this to show the preview view on the video player.
Other solutions I considered:
AVAssetImageGenerator: It does not work on streams. Explained here.
This says that if my .m3u8 file contains an "I-Frame only playlist", AVAssetImageGenerator will start returning snapshots, but even if it does, generating thumbnails of a complete 1-hour video upfront is just not optimal.
AVPlayerItemVideoOutput: This also seems like a very brute-force way to approach the problem, as I need thumbnails of almost the complete video.
Current player implementation:
I have added AVPlayerLayer as a sublayer to my view controller's view and added custom controls on top of it.
I am thinking of using something like this https://github.com/pbs/iframe-playlist-generator to add the I-Frame playlist.
PS: I am new to this, so if I have made any wrong assumptions, please let me know.
Also, any links or references to some reading material I can use to dive in deeper are appreciated. Thanks.
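For what it's worth, here is a minimal sketch (not an authoritative answer) of generating a single thumbnail on demand for the time the user is scrubbing to, rather than pre-generating thumbnails for the whole video. Whether it returns images for an HLS stream depends on the playlist containing an I-frame-only rendition, as discussed above; the class name and size values are placeholders.

import AVFoundation
import UIKit

final class ScrubPreviewGenerator {
    private let imageGenerator: AVAssetImageGenerator

    init(asset: AVAsset) {
        imageGenerator = AVAssetImageGenerator(asset: asset)
        imageGenerator.appliesPreferredTrackTransform = true
        // Loose tolerances let the generator snap to a nearby I-frame,
        // which is much cheaper than decoding to an exact time.
        imageGenerator.requestedTimeToleranceBefore = CMTime(seconds: 2, preferredTimescale: 600)
        imageGenerator.requestedTimeToleranceAfter = CMTime(seconds: 2, preferredTimescale: 600)
        imageGenerator.maximumSize = CGSize(width: 200, height: 0)   // thumbnail-sized output
    }

    /// Call this from the seek bar's value-changed handler.
    func preview(at seconds: Double, completion: @escaping (UIImage?) -> Void) {
        imageGenerator.cancelAllCGImageGeneration()   // drop any stale request
        let time = NSValue(time: CMTime(seconds: seconds, preferredTimescale: 600))
        imageGenerator.generateCGImagesAsynchronously(forTimes: [time]) { _, cgImage, _, _, _ in
            let image = cgImage.map { UIImage(cgImage: $0) }
            DispatchQueue.main.async { completion(image) }
        }
    }
}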

AVPlayer sometimes doesn't load a video

I have a button and an AVPlayer. After the button is tapped I record a video using DBAttachmentPickerController and I want to load it into the AVPlayer.
In the function below I try to load it:
func refresh(attachmentInfo: AttachmentInfo) {
    self.videoLayer.player = nil
    if let url = attachmentInfo.url {
        self.player = AVPlayer(url: url)        // line A
        self.videoLayer.player = self.player    // line B
    }
}
In 6/10 cases it works fine, but sometimes the video doesn't load into the AVPlayer.
When I set breakpoints at lines A and B, it always works.
Any ideas?
I also hit this exact issue. In my case the AVPlayer and AVPlayerLayer were inside table view cells, which made the problem worse. However, I was using an API that returned the actual video URL plus an image URL for the video's first frame.
In the table view cell logic, I first attempt to load the video into the AVPlayer and then check whether the player item's asset has any video tracks. If it doesn't, I simply load the image from the other URL into the view (using a UIImageView, of course) with a "Play" button image in the center of the image. When the user taps the play button, the selector should try to load the AVPlayer again with the video URL. If the video loads and the asset has video tracks, call play(); otherwise simply return (and possibly alert the user that the video can't be loaded).
If you do not have the image of the first frame of the video, you can simply use a placeholder image and a button.
The way to check whether the player has a video or not:
if player.currentItem?.asset.tracks(withMediaType: .video).count == 0 {
    print("Oops! no video here")
    // load an image, and add a button here
} else {
    // do whatever needs to be done if video is available
}
Unfortunately, there isn't much else that can be done here unless you have some solid networking code wherein you can download the video on a utility queue and store it in a cache, but then again you have to be cognizant of the user's data usage, spawning too many download tasks, etc.
For image caching in the UITableView cell, I used the SDWebImage library from GitHub so that I can be (somewhat) assured that my image (for the video) at least would be cached in case of a bad/slow network or network interruptions.
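A possible contributing factor to the original intermittent failure is that the asset's tracks may not be loaded yet when the check runs, which would also explain why pausing at a breakpoint makes it work. A minimal sketch of checking only after the "tracks" key has loaded; the property names mirror the question's code and are otherwise placeholders:

import AVFoundation

func refresh(attachmentInfo: AttachmentInfo) {
    videoLayer.player = nil
    guard let url = attachmentInfo.url else { return }

    let asset = AVURLAsset(url: url)
    // Load the "tracks" key asynchronously before deciding whether the video is usable.
    asset.loadValuesAsynchronously(forKeys: ["tracks"]) { [weak self] in
        DispatchQueue.main.async {
            guard let self = self else { return }
            var error: NSError?
            let loaded = asset.statusOfValue(forKey: "tracks", error: &error) == .loaded
            guard loaded, !asset.tracks(withMediaType: .video).isEmpty else {
                // Fall back to the placeholder image + play button described above.
                return
            }
            self.player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
            self.videoLayer.player = self.player
        }
    }
}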

How to cache AVPlayer video?

I have AVPlayers embedded into UITableViewCells and need a way to cache the video that's loaded so that when I refresh the table view, it doesn't have to load the videos again. The code I'm using at the moment to load videos is as follows:
let videoString = "http://www.someurl.com"
let videoURL = NSURL(string: videoString)
videoPlayer = AVPlayer(URL: videoURL!)
videoPlayer!.actionAtItemEnd = AVPlayerActionAtItemEnd.None
videoView.playerLayer.player = videoPlayer
I cache images by storing the returned image from a server in a dictionary with its ID as the key. I'm looking for a similar way for videos; any help would be appreciated.
There is no built-in support for caching videos; however, depending on the size of the video, you may try to download it in the background once the video starts playing.
However, do note that this will increase the app's cache size and may consume a lot of space depending on the size of the videos, so you may want to cache only the few videos the user watched recently, or restrict it to the 5-10 latest videos.
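A minimal sketch of that idea, keying a local copy by an ID just like the image cache described in the question. The URL handling and file layout are illustrative only, and there is no eviction logic:

import AVFoundation

final class VideoCache {
    static let shared = VideoCache()

    private let directory: URL = {
        let dir = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("videos", isDirectory: true)
        try? FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
        return dir
    }()

    /// Returns a player for the cached file if present; otherwise streams the remote URL
    /// and downloads a copy in the background for next time.
    func player(for remoteURL: URL, id: String) -> AVPlayer {
        let localURL = directory.appendingPathComponent(id).appendingPathExtension("mp4")
        if FileManager.default.fileExists(atPath: localURL.path) {
            return AVPlayer(url: localURL)
        }
        URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, _ in
            guard let tempURL = tempURL else { return }
            try? FileManager.default.moveItem(at: tempURL, to: localURL)
        }.resume()
        return AVPlayer(url: remoteURL)
    }
}

In the cell you would then call VideoCache.shared.player(for: videoURL, id: videoID) instead of constructing the AVPlayer directly.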

Looping AVPlayer seamlessly

There has been some discussion before about how to loop an AVPlayer's video item, but no 'solution' is seamless enough to provide lag-less looping of a video.
I am developing a tvOS app that has a high-quality 10 second clip of 'scenery' in the background of one of its views, and simply restarting its AVPlayer the 'standard' way (subscribing to an NSNotification to catch the end of playback) is too jumpy not to notice and detracts from the user experience.
It seems as though the only way to achieve a truly seamless loop is to manually manage frames, at a lower-level (in OpenGL)...
Despite best efforts to read up on this, and as a novice in manipulating video pipelines, I have not come close enough to a comprehensible solution.
I am aware that external libraries exist to be able to perform this behaviour more easily; most notably GPUImage. However, the app I am developing is for tvOS and therefore has difficulty using quite a lot of the 3rd party iOS libraries in existence, GPUImage included. Another library I have come across is AVAnimator, which provides great functionality for light-weight animation videos, but not for dense, high-quality video clips of source footage encoded in .H264.
The closest I have come so far is Apple's own AVCustomEdit source code, however this primarily deals with static production of a 'transition' that, while seamless, is too complex for me to be able to discern how to make it perform simple looping functionality.
If anybody can chip in with experience of manipulating AVPlayer at a lower level, i.e. with image processing/buffers (or iOS development that doesn't rest on external libraries), I would be incredibly interested to know how I could make a start.
I had the same problem when streaming a video. After playing it for the first time, there was a black screen when the video loaded for the second time. I got rid of the black screen by seeking the video to 5ms ahead. It made a nearly seamless video loop. (Swift 2.1)
// Create player here..
let player = AVPlayer(URL: videoURL)

// Add notification block
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player.currentItem, queue: nil) { notification in
    let t1 = CMTimeMake(5, 100)
    player.seekToTime(t1)
    player.play()
}
If the video is very short (a few seconds), you can probably extract each frame as a CGImage and use CAKeyframeAnimation to animate it. I am using this technique to play GIF images in my app and the animation is very smooth.
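A rough sketch of that frame-extraction idea, assuming a short local clip; the frame rate and URL are placeholders, and error handling is omitted:

import AVFoundation
import UIKit

// Extract every frame of a short clip and loop it with a CAKeyframeAnimation on `contents`.
func makeLoopingLayer(clipURL: URL, frame: CGRect, completion: @escaping (CALayer) -> Void) {
    let asset = AVAsset(url: clipURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let duration = CMTimeGetSeconds(asset.duration)
    let fps = 30.0   // assumed frame rate of the clip
    let times = stride(from: 0.0, to: duration, by: 1.0 / fps).map {
        NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
    }

    var frames = [CGImage]()
    var handled = 0
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, _, _ in
        if let cgImage = cgImage { frames.append(cgImage) }
        handled += 1
        guard handled == times.count else { return }
        DispatchQueue.main.async {
            let layer = CALayer()
            layer.frame = frame
            let animation = CAKeyframeAnimation(keyPath: "contents")
            animation.values = frames
            animation.duration = duration
            animation.repeatCount = .infinity
            animation.calculationMode = .discrete
            layer.add(animation, forKey: "loop")
            completion(layer)
        }
    }
}

Note that this trades memory for smoothness, so it is only practical for clips of a few seconds, as the answer says.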
You mention that you looked at AVAnimator, but did you see my blog post on this specific subject of seamless looping? I specifically built seamless looping logic in because it could not be done properly with AVPlayer and the H.264 hardware.
I use two AVPlayerItems with the same AVAsset in an AVQueuePlayer and switch the items:
weak var w = self
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: nil, queue: nil) { (notification) -> Void in
    let queuePlayer = w!.playerController.player! as! AVQueuePlayer
    if queuePlayer.currentItem == playerItem1 {
        queuePlayer.insertItem(playerItem2, afterItem: nil)
        playerItem1.seekToTime(kCMTimeZero)
    } else {
        queuePlayer.insertItem(playerItem1, afterItem: nil)
        playerItem2.seekToTime(kCMTimeZero)
    }
}
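For context, a sketch of the setup that snippet assumes, in the same Swift 2 style: two AVPlayerItems built from the same asset and queued into an AVQueuePlayer. The videoURL and playerController names are carried over from the answer and are otherwise placeholders.

let asset = AVURLAsset(URL: videoURL)
let playerItem1 = AVPlayerItem(asset: asset)
let playerItem2 = AVPlayerItem(asset: asset)

let queuePlayer = AVQueuePlayer(items: [playerItem1, playerItem2])
queuePlayer.actionAtItemEnd = .Advance   // move on to the other copy when one finishes
playerController.player = queuePlayer
queuePlayer.play()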

AVComposition breaks on Airplay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
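(Aside: usesAirPlayVideoWhileAirPlayScreenIsActive has since been deprecated; on newer SDKs the equivalent AVPlayer switches are, to my knowledge:

player.allowsExternalPlayback = true
player.usesExternalPlaybackWhileExternalScreenIsActive = true
)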
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments, so I had to post this as an answer, although it might not fully respond to the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to do it myself by listening to UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen, and if there is, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that the content is playing on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it's not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
That was fine until I noticed that the composition plays back really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things like lowering resolutions, lowering the renderScale and so on, but with no success.
One thing that bothers me even more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition in order to display it), right after tapping the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, some issues.
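A minimal sketch of the second-screen approach described above, in current Swift; it assumes an existing player property, and teardown on UIScreen.didDisconnectNotification is omitted:

import AVFoundation
import UIKit

var externalWindow: UIWindow?

func startObservingExternalScreens(player: AVPlayer) {
    NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification,
                                           object: nil, queue: .main) { note in
        guard let screen = note.object as? UIScreen else { return }
        // Put a plain view controller with an AVPlayerLayer on the external screen.
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        let controller = UIViewController()
        let layer = AVPlayerLayer(player: player)
        layer.frame = controller.view.bounds
        layer.videoGravity = .resizeAspect
        controller.view.layer.addSublayer(layer)
        window.rootViewController = controller
        window.isHidden = false
        externalWindow = window   // keep a strong reference, or the window is deallocated
        // Show a "Playing on <AirPlay device>" message on the device's own screen here.
    }
}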
In the app's Info.plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!
