AVPlayer AVPlayerItem duration incorrect in iOS 7 - ios

I have the following code in my app:
NSURL *url = [NSURL fileURLWithPath: [self.DocDir stringByAppendingPathComponent: self.FileName] isDirectory: NO];
self.avPlayer = [AVPlayer playerWithURL: url];
Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
This worked fine on iOS 6, but on iOS 7 it returns NaN for some reason. When inspecting self.avPlayer.currentItem.duration, the CMTime value is all zeros with a flag of 17.
Interestingly, the player itself works fine; only the duration is wrong.
Has anyone else experienced the same issues? I am importing the following:
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVAsset.h>

After playing around with different ways of initializing the objects I arrived at a working solution:
AVURLAsset *asset = [AVURLAsset assetWithURL: url];
Float64 duration = CMTimeGetSeconds(asset.duration);
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem: item];
It appears the duration value isn't always immediately available from an AVPlayerItem, but it is available right away from an AVAsset.

In iOS 7, for an AVPlayerItem that has already been created, you can also get the duration from the underlying asset:
CMTimeGetSeconds([[[[self player] currentItem] asset] duration]);
instead of getting it directly from the AVPlayerItem, which gives you NaN:
CMTimeGetSeconds([[[self player] currentItem] duration]);

The recommended way of doing this, as described in the documentation, is to observe the player item's status:
[self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew|NSKeyValueObservingOptionInitial context:nil];
Then, inside observeValueForKeyPath:ofObject:change:context:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    // TODO: use either keyPath or context to differentiate between value changes
    if (self.avPlayer.currentItem.status == AVPlayerStatusReadyToPlay) {
        Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
        // ...
    }
}
Also, make sure that you remove the observer when you change the player item:
if (self.avPlayer.currentItem) {
    [self.avPlayer.currentItem removeObserver:self forKeyPath:@"status"];
}
Btw, you can also observe the duration property directly; however, it's been my personal experience that the results aren't as reliable as they should be ;-)

Swift version
You can get the duration from the AVAsset, which is a property of the AVPlayerItem:
func getVideoDuration(from player: AVPlayer) -> Double? {
    guard let duration = player.currentItem?.asset.duration else { return nil }
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
or by creating an AVAsset from scratch:
func getVideoDuration(for videoUrl: URL) -> Double {
    let asset = AVAsset(url: videoUrl)
    let duration = asset.duration
    let durationSeconds = CMTimeGetSeconds(duration)
    return durationSeconds
}
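Note that asset.duration in the snippets above is loaded synchronously, which can briefly block the calling thread for a remote asset. On iOS 15 and later there is an async alternative; a hedged Swift sketch (the function name is mine, not from the answer):
import AVFoundation

// Hedged sketch: load the duration asynchronously (iOS 15+) instead of
// touching asset.duration directly, so remote assets don't block the caller.
func videoDurationSeconds(for videoUrl: URL) async throws -> Double {
    let asset = AVAsset(url: videoUrl)
    let duration = try await asset.load(.duration) // suspends until the value is available
    return CMTimeGetSeconds(duration)
}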

Related

How to use AVPlayerLooper with AVPlayerItemVideoOutput?

AVPlayerLooper accepts a template AVPlayerItem and an AVQueuePlayer as setup parameters, then internally manipulates the items of the queue, so the player is constantly changing its currentItem.
This works perfectly with AVPlayerLayer, which accepts the looped player as a parameter and simply renders it, but how can I use it with AVPlayerItemVideoOutput, which has to be attached to a single AVPlayerItem while the player holds several of them internally? How do I reproduce what AVPlayerLayer does internally?
AVPlayerLooper setup example from the docs:
NSString *videoFile = [[NSBundle mainBundle] pathForResource:@"example" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:videoFile];
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [AVQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
_playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:_playerLayer];
[_player play];
This is how AVPlayerItemVideoOutput is supposed to be used:
[item addOutput:_videoOutput];
The only workaround I came up with is to observe changes to currentItem and, each time, detach the video output from the old item and attach it to the new one, as in the example below, but this apparently defeats the gapless playback I'm trying to achieve.
- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == currentItemContext) {
        AVPlayerItem *newItem = [change objectForKey:NSKeyValueChangeNewKey];
        AVPlayerItem *oldItem = [change objectForKey:NSKeyValueChangeOldKey];
        if (oldItem.status == AVPlayerItemStatusReadyToPlay) {
            [oldItem removeOutput:_videoOutput]; // detach from the outgoing item
        }
        if (newItem.status == AVPlayerItemStatusReadyToPlay) {
            [newItem addOutput:_videoOutput];
        }
        [self removeItemObservers:oldItem];
        [self addItemObservers:newItem];
    }
}
For more context, I'm trying to come up with a fix for flutter's video_player plugin https://github.com/flutter/flutter/issues/72878
Plugin's code can be found here https://github.com/flutter/plugins/blob/172338d02b177353bf517e5826cf6a25b5f0d887/packages/video_player/video_player/ios/Classes/FLTVideoPlayerPlugin.m
You can do this by subclassing AVQueuePlayer (yay OOP) and creating and adding AVPlayerItemVideoOutputs there, as needed. I've never seen multiple AVPlayerItemVideoOutputs before, but memory consumption seems reasonable and everything works.
@interface OutputtingQueuePlayer : AVQueuePlayer
@end

@implementation OutputtingQueuePlayer

- (void)insertItem:(AVPlayerItem *)item afterItem:(nullable AVPlayerItem *)afterItem
{
    if (item.outputs.count == 0) {
        NSLog(@"Creating AVPlayerItemVideoOutput");
        AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithOutputSettings:nil]; // or whatever
        [item addOutput:videoOutput];
    }
    [super insertItem:item afterItem:afterItem];
}

@end
The current output is accessed like so:
AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:_player.currentTime itemTimeForDisplay:nil];
// do something with pixelBuffer here
CVPixelBufferRelease(pixelBuffer);
and configuration becomes:
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [OutputtingQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
[self.view.layer addSublayer:_playerLayer];
[_player play];
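As an aside, pulling frames out of the looping player is usually driven by a display link rather than ad hoc calls, so the read automatically follows whichever item is current. A hedged Swift sketch (the FrameReader class and its names are illustrations, not part of the plugin code):
import AVFoundation
import UIKit

// Hedged sketch: poll the looping player's current output once per display refresh.
// `player` is assumed to be the OutputtingQueuePlayer from the answer above.
final class FrameReader {
    private let player: AVQueuePlayer
    private var displayLink: CADisplayLink?

    init(player: AVQueuePlayer) {
        self.player = player
    }

    func start() {
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        // The current item (and therefore its output) changes as the looper advances.
        guard let output = player.currentItem?.outputs.first as? AVPlayerItemVideoOutput else { return }
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else { return }
        // ... hand pixelBuffer to the renderer ...
        _ = pixelBuffer
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}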

iOS: AVPlayer - getting a snapshot of the current frame of a video

I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as UIImage. That's it.
My video is a m3u8 file located on the internet, it is played normally in the AVPlayerLayer without any problems.
What have I tried:
AVAssetImageGenerator. It is not working; the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
Taking a snapshot of the player view. I tried renderInContext: on AVPlayerLayer first, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should also be able to render the special layers, but no luck; I still got the UI snapshot with a blank black area where the video is shown.
AVPlayerItemVideoOutput. I have added a video output for my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
AVAssetReader. I was thinking of trying it but decided not to lose time after finding a related question here.
So isn't there any way to get a snapshot of something that I am anyway seeing right now on the screen? I can't believe this.
AVAssetImageGenerator is the best way to snapshot a video; this method returns a UIImage asynchronously:
import AVFoundation
// ...
var player: AVPlayer? // = ...

func screenshot(handler: @escaping (UIImage) -> Void) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let img = image {
            handler(UIImage(cgImage: img))
        }
    }
}
(It's Swift 4.2)
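Usage might look something like this (a small sketch; whether you pause first is up to you, and imageView is a placeholder, not part of the answer):
player?.pause()
screenshot { image in
    DispatchQueue.main.async {
        // imageView.image = image  // `imageView` stands in for wherever the frame goes
    }
}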
AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that conversion (a sketch also follows the code below).
This answer is mostly cribbed from here.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#interface ViewController ()
#property (nonatomic) AVPlayer *player;
#property (nonatomic) AVPlayerItem *playerItem;
#property (nonatomic) AVPlayerItemVideoOutput *playerOutput;
#end
#implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = #{ (id)kCVPixelBufferPixelFormatTypeKey : #(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];
[self.player play];
}
- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(#"The image: %#", buffer);
}
- (void)viewDidLoad {
[super viewDidLoad];
NSURL *someUrl = [NSURL URLWithString:#"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:#"tracks"] completionHandler:^{
NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:#"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(#"%# Failed to load the tracks.", self);
}
}];
}
#end
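To go from the CVPixelBuffer to a UIImage, one common route (a hedged Swift sketch, not part of the original answer) is through Core Image, rendering to a CGImage so the result can be displayed or encoded:
import CoreImage
import UIKit

// Hedged sketch: wrap the pixel buffer in a CIImage, render it with a CIContext,
// and back a UIImage with the resulting CGImage.
func image(from pixelBuffer: CVPixelBuffer, context: CIContext = CIContext()) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}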

Capture UIImage frame of AVPlayer streaming .m3u8 for given time?

So I have an AVPlayer that is playing a live stream of an .m3u8 video, and from what I've found searching, it looks like you can't use AVAssetImageGenerator to take a screenshot but should instead use AVPlayerItemVideoOutput and
- (CVPixelBufferRef)copyPixelBufferForItemTime:(CMTime)itemTime itemTimeForDisplay:(CMTime *)outItemTimeForDisplay
But when I try to get the outputs from my AVPlayer:
NSArray *outputs = self.mainPlayer.currentItem.outputs;
I get an empty array.
The video plays just fine. Ultimately, what I want is a method like this:
-(UIImage *)frameFor:(CMTime)time;
At some point the CALayer on the view needs to be getting this image data, so there has to be a way to grab it. I tried just capturing the CALayer my AVPlayerLayer is attached to, but I don't get anything more than the blank view color (bright pink, just to make sure it's returning something). There has to be some way of grabbing this data.
You're getting an empty array when you do
NSArray *outputs = self.mainPlayer.currentItem.outputs;
because the AVPlayerItemVideoOutput object needs to be added first
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
[playerItem addOutput:self.videoOutput];
and then, when the AVPlayerItem status is AVPlayerStatusReadyToPlay, you can capture the current frame using copyPixelBufferForItemTime:itemTimeForDisplay:. For that, you need to create your AVPlayer object and add an observer to the AVPlayerItem status property:
self.player = [AVPlayer playerWithPlayerItem:playerItem];
[self.player.currentItem addObserver:self forKeyPath:@"status" options:0 context:NULL];
and then in your callback function
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.currentItem.status == AVPlayerStatusReadyToPlay)
    {
        CMTime currentTime = self.player.currentItem.currentTime;
        CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
        UIImage *image = [UIImage imageWithCIImage:ciImage];
        CVPixelBufferRelease(buffer); // the CIImage retains the buffer, so the local reference can be released
        // use `image`
    }
}
If you only need to capture a thumbnail and you don't have to play the video first, you can use a simple utility that I've created that follows this logic.
(https://github.com/acotilla91/ACThumbnailGenerator)
How to use:
double bitRate = 1000000; // force video bit rate (can be used to cap video quality and improve performance). Pass 0 to use the default bit rate.
self.thumbnailGenerator = [[ACThumbnailGenerator alloc] initWithPreferredBitRate:bitRate];
NSURL *videoURL = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
int position = 10; // video position (in seconds) from where thumbnail should be extracted. Always pass 0 for live streams.
[self.thumbnailGenerator loadImageFrom:videoURL position:position withCompletionBlock:^(UIImage *image) {
// use `image`
}];
Hope it helps.
Swift version:
ACThumbnailGenerator-Swift
Usage:
var generator: ACThumbnailGenerator!

func captureSomeImage() {
    let streamUrl = URL(string: "https://p-events-delivery.akamaized.net/18oijbasfvuhbfsdvoijhbsdfvljkb6/m3u8/hls_vod_mvp.m3u8")!
    generator = ACThumbnailGenerator(streamUrl: streamUrl)
    generator.delegate = self
    generator.captureImage(at: 300)
}

func generator(_ generator: ACThumbnailGenerator, didCapture image: UIImage, at position: Double) {
    // Use `image`
}

AVPlayer Item get a nan duration

I'm streaming an .mp3 file from a URL.
I'm using AVPlayer, and when I try to get the total time to build a progress bar, the time is always NaN.
NSError *setCategoryError = nil;
if ([[AVAudioSession sharedInstance] isOtherAudioPlaying]) { // mix sound effects with music already playing
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategorySoloAmbient error:&setCategoryError];
} else {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&setCategoryError];
}
if (setCategoryError) {
    NSLog(@"Error setting category! %ld", (long)[setCategoryError code]);
}

NSURL *url = [NSURL URLWithString:@"http://..//46698"];
AVPlayer *player = [AVPlayer playerWithURL:url];
songPlayer = player;
[songPlayer addObserver:self forKeyPath:@"status" options:0 context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == songPlayer && [keyPath isEqualToString:@"status"]) {
        if (songPlayer.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (songPlayer.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
            [songPlayer play];
            [songPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
                CMTime aux = [songPlayer currentTime];
                AVPlayerItem *item = [songPlayer currentItem];
                CMTime dur = [item duration];
                NSLog(@"%f/%f", CMTimeGetSeconds(aux), CMTimeGetSeconds(dur));
            }];
        } else if (songPlayer.status == AVPlayerItemStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
I've tried everything.
[item duration]; /// Fail
[[item asset] duration]; /// Fail
and nothing works.
Anyone know why?
The value of the duration property will be reported as kCMTimeIndefinite until the duration of the underlying asset has been loaded. There are two ways to ensure that the value of duration is accessed only after it becomes available:
Wait until the status of the AVPlayerItem is AVPlayerItemStatusReadyToPlay.
Register for key-value observation of the duration property, requesting the initial value. If the initial value is reported as kCMTimeIndefinite, the AVPlayerItem will notify you of the availability of the item's duration via key-value observing as soon as its value becomes known.
For Swift:
if let currentItem = player.currentItem, currentItem.status == .readyToPlay {
    print(currentItem.duration.seconds) // it's not NaN
}
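The second approach described above, key-value observing the duration itself with the initial value requested, might look like this in Swift using block-based KVO; a hedged sketch (keep the observation token alive for as long as you need the callbacks):
var durationObservation: NSKeyValueObservation?

durationObservation = player.currentItem?.observe(\.duration, options: [.initial, .new]) { item, _ in
    guard item.duration.isNumeric else { return } // still kCMTimeIndefinite
    print("duration:", item.duration.seconds)
}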
I had this problem on iOS 12 (on iOS 13 everything works as expected): the current item's duration is always indefinite. I solved it by using player.currentItem?.asset.duration. Something like this:
private var currentItemDuration: CMTime? {
    if #available(iOS 13.0, *) {
        return player?.currentItem?.duration
    } else {
        return player?.currentItem?.asset.duration
    }
}
See this answer for macOS: https://stackoverflow.com/a/52668213/7132300 It looks like it's also valid for iOS 12.
@voromax is correct. I added the asset to the playerItem without getting the duration first, and the duration was NaN:
let asset = AVAsset(url: videoUrl)
self.playerItem = AVPlayerItem(asset: asset)
When I called asset.loadValuesAsynchronously first, there was no more NaN and I got the correct duration:
let assetKeys = ["playable", "duration"]
let asset = AVAsset(url: url)
asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
    DispatchQueue.main.async { [weak self] in
        self?.playerItem = AVPlayerItem(asset: asset, automaticallyLoadedAssetKeys: assetKeys)
    }
})
You can use the asset property. It will give you the duration:
self.player.currentItem?.asset.duration.seconds ?? 0
I had the same problem but I was able to get the duration with a different method. Please see my answer here: https://stackoverflow.com/a/38406295/3629481

Using the AVURLAsset to play a Remote Video File

I want to use AVURLAsset to play a video file that is on a server, not a local file. I have read that AVURLAsset can't be used directly for remote files.
I read another link on Stack Overflow:
AVURLAsset cannot load with remote file
That link has some methods for using AVURLAsset to play remote files, but I am not able to understand it fully. My observer is not being called. Can someone please help me? Actually, I don't want to use AVPlayer to play the video, for various reasons: I am grabbing frames from the AVAsset and then rendering them as textures in OpenGL, so I need to do this with AVURLAsset only.
Here is the code to look at:
- (void)startPlayer
{
    NSURL *url = [NSURL fileURLWithPath:@"http://gamooz.com/wildlife.mp4"];
    pItem = [AVPlayerItem playerItemWithURL:url];
    player = [AVPlayer playerWithPlayerItem:pItem];
    [player play];
    [pItem addObserver:self forKeyPath:@"status" options:0 context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    NSLog(@"heyy");
    if ([keyPath isEqualToString:@"status"])
    {
        AVPlayerItem *pItemTemp = (AVPlayerItem *)object;
        if (pItemTemp.status == AVPlayerItemStatusReadyToPlay)
        {
            /// now I can use the player item's asset
            asset = (AVURLAsset *)pItemTemp.asset;
        }
    }
}
but the observer is never getting called. Why is that?
Also, I put the observer code in another function and tried to check whether the playerItem is ready to play:
- (void)checkForPlayer
{
    if (pItem.status == AVPlayerItemStatusReadyToPlay)
    {
        asset = (AVURLAsset *)pItem.asset;
    }
}
It never gives a status equal to ready.
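A hedged side note (the thread above does not include an accepted answer): fileURLWithPath: builds a file:// URL for local paths, so pointing it at an HTTP address produces an asset that can never load, which would keep the item's status from ever reaching AVPlayerItemStatusReadyToPlay. A remote URL is normally built with URLWithString: (URL(string:) in Swift) instead; a small sketch:
import AVFoundation

// Hedged sketch: a remote MP4 needs a real http(s) URL, not a file URL.
let remoteUrl = URL(string: "http://gamooz.com/wildlife.mp4")!   // remote asset
// let localUrl = URL(fileURLWithPath: somePath)                 // fileURLWithPath is for local files only
let item = AVPlayerItem(url: remoteUrl)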
