Edit 3
I have found the root cause. CADisplayLink keeps a strong reference to its target, so it creates a retain cycle.
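One common way to break such a cycle (a sketch, not from the original post; WeakProxy and displayLinkDidFire: are illustrative names) is to give the display link a proxy object that holds the real target only weakly, so the real target can be deallocated and invalidate the link in its dealloc:
@interface WeakProxy : NSProxy
+ (instancetype)proxyWithTarget:(id)target;
@end

@implementation WeakProxy {
    __weak id _target; // the proxy does not retain the real target
}

+ (instancetype)proxyWithTarget:(id)target {
    WeakProxy *proxy = [WeakProxy alloc]; // NSProxy has no plain -init
    proxy->_target = target;
    return proxy;
}

- (NSMethodSignature *)methodSignatureForSelector:(SEL)sel {
    return [_target methodSignatureForSelector:sel];
}

- (void)forwardInvocation:(NSInvocation *)invocation {
    [invocation invokeWithTarget:_target];
}
@end

// The display link now retains only the proxy, not self.
self.displayLink = [CADisplayLink displayLinkWithTarget:[WeakProxy proxyWithTarget:self]
                                               selector:@selector(displayLinkDidFire:)];
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];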
Edit 2
Now I think a memory issue is causing the crash.
What I am doing is capturing the output of the player and drawing it on an OpenGL layer.
AVPlayerItem *item = ...;
if (!self.player) {
    self.player = [AVPlayer playerWithPlayerItem:item];
} else {
    [self.player replaceCurrentItemWithPlayerItem:item];
}

NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
[self.player.currentItem addOutput:self.videoOutput];

[self.player seekToTime:kCMTimeZero];
[self.player play];
In the callback of the CADisplayLink:
CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
BOOL hasNewContent = [self.videoOutput hasNewPixelBufferForItemTime:itemTime];
if (hasNewContent) {
    CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    // create a texture from pixelBuffer
    // display the texture on the OpenGL surface
    if (pixelBuffer != NULL) {
        CFRelease(pixelBuffer);
    }
}
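For reference, the "create texture" step above is typically done with a CVOpenGLESTextureCache. A rough sketch, assuming self.textureCache was created earlier with CVOpenGLESTextureCacheCreate against the EAGLContext and that GL_BGRA comes from <OpenGLES/ES2/glext.h> (these names are not from the original post):
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            self.textureCache,
                                                            pixelBuffer,
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                            (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                            GL_BGRA,
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &texture);
if (err == kCVReturnSuccess) {
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... draw the frame ...
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(self.textureCache, 0);
}
If the CVOpenGLESTextureRef objects are kept around or the cache is never flushed, memory can keep growing even though Instruments reports no leak, which matches the symptom below.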
Instruments reports no memory leaks, but memory keeps rising.
Edit 1:
I have found a workaround. The resolution of "video_1" and "video_3" is 3840 * 1920, and the resolution of "video_2" is 2160 * 1080.
When I use ffmpeg to change all the resolutions to 2160 * 1080, it works.
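For reference, the downscale can be done with an ffmpeg invocation along these lines (file names are illustrative):
ffmpeg -i video_1.mp4 -vf scale=2160:1080 -c:a copy video_1_2160x1080.mp4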
Original question
I want to play several videos in sequence and I am seeing very strange behavior.
AVPlayerItem *item = ...;
if (!self.player) {
    self.player = [AVPlayer playerWithPlayerItem:item];
} else {
    [self.player replaceCurrentItemWithPlayerItem:item];
}
[self.player seekToTime:kCMTimeZero];
[self.player play];
For example, I have three video files: video_1, video_2, and video_3.
First, I set the player item to "video_1", then I replace it with "video_2". That works.
But when I replace it with "video_3", the app crashes. I can't find any device log on my iPhone. Even stranger, when I was debugging and replaced the item with "video_3", the debugger disconnected and there was no exception!
More information:
"video_2" can replace "video_1"
"video_1" can replace "video_2"
"video_3" can replace "video_2"
"video_3" can't replace "video_1"
"video_1" can't replace "video_3"
All videos play normally on their own.
Try the code below:
if ([playerItemVideoOutput hasNewPixelBufferForItemTime:currentTime]) {
    __unsafe_unretained ViewController *weakSelf = self; // unretained reference to your view controller, to avoid the retain cycle
    CVPixelBufferRef pixelBuffer = [playerItemVideoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
    if (pixelBuffer) { // check that the buffer exists
        [weakSelf.metalView.realTimeRender setPixelBuffer:pixelBuffer]; // use weakSelf here
        CFRelease(pixelBuffer);
    }
}
Related
AVPlayerLooper accepts a template AVPlayerItem and an AVQueuePlayer as setup parameters; it then internally manipulates the items of the queue, and the player constantly changes its currentItem.
This works perfectly with AVPlayerLayer, which accepts the looped player as a parameter and just renders it, but how can I use it with AVPlayerItemVideoOutput, which is attached to an AVPlayerItem, when the player has multiple items inside it? How do I reproduce what AVPlayerLayer does internally?
AVPlayerLooper setup example from the docs:
NSString *videoFile = [[NSBundle mainBundle] pathForResource:@"example" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:videoFile];
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [AVQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
_playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:_playerLayer];
[_player play];
This is how AVPlayerItemVideoOutput is supposed to be used:
[item addOutput:_videoOutput];
The only workaround I came up with is to observe changes to currentItem and, each time, detach the video output from the old item and attach it to the new one, as in the example below, but this apparently defeats the gapless playback I'm trying to achieve.
- (void)observeValueForKeyPath:(NSString*)path
                      ofObject:(id)object
                        change:(NSDictionary*)change
                       context:(void*)context {
    if (context == currentItemContext) {
        AVPlayerItem* newItem = [change objectForKey:NSKeyValueChangeNewKey];
        AVPlayerItem* oldItem = [change objectForKey:NSKeyValueChangeOldKey];
        if (oldItem.status == AVPlayerItemStatusReadyToPlay) {
            [oldItem removeOutput:_videoOutput];
        }
        if (newItem.status == AVPlayerItemStatusReadyToPlay) {
            [newItem addOutput:_videoOutput];
        }
        [self removeItemObservers:oldItem];
        [self addItemObservers:newItem];
    }
}
For more context, I'm trying to come up with a fix for Flutter's video_player plugin: https://github.com/flutter/flutter/issues/72878
The plugin's code can be found here: https://github.com/flutter/plugins/blob/172338d02b177353bf517e5826cf6a25b5f0d887/packages/video_player/video_player/ios/Classes/FLTVideoPlayerPlugin.m
You can do this by subclassing AVQueuePlayer (yay OOP) and creating and adding AVPlayerItemVideoOutputs there, as needed. I've never seen multiple AVPlayerItemVideoOutputs before, but memory consumption seems reasonable and everything works.
@interface OutputtingQueuePlayer : AVQueuePlayer
@end

@implementation OutputtingQueuePlayer

- (void)insertItem:(AVPlayerItem *)item afterItem:(nullable AVPlayerItem *)afterItem
{
    if (item.outputs.count == 0) {
        NSLog(@"Creating AVPlayerItemVideoOutput");
        AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithOutputSettings:nil]; // or whatever
        [item addOutput:videoOutput];
    }
    [super insertItem:item afterItem:afterItem];
}

@end
The current output is accessed like so:
AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:_player.currentTime itemTimeForDisplay:nil];
// do something with pixelBuffer here
CVPixelBufferRelease(pixelBuffer);
and configuration becomes:
_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [OutputtingQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
[self.view.layer addSublayer:_playerLayer];
[_player play];
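If you drive the copy from a CADisplayLink instead of a button or timer, the per-frame pull could look like the sketch below (not part of the original answer; displayLinkDidFire: is an illustrative name):
- (void)displayLinkDidFire:(CADisplayLink *)link {
    AVPlayerItemVideoOutput *videoOutput = (AVPlayerItemVideoOutput *)_player.currentItem.outputs.firstObject;
    CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:nil];
        if (pixelBuffer) {
            // hand the buffer to your renderer here
            CVPixelBufferRelease(pixelBuffer);
        }
    }
}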
I have spent the whole day and gone through a lot of SO answers, Apple references, documentation, etc., but no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet; it plays normally in an AVPlayerLayer without any problems.
What I have tried:
AVAssetImageGenerator. It is not working; the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render these kinds of "special" layers. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck; I still got a UI snapshot with a blank black area where the video is shown.
AVPlayerItemVideoOutput. I added a video output to my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
AVAssetReader. I was thinking of trying it but decided not to lose time after finding a related question here.
So isn't there any way to get a snapshot of something that I can already see on the screen? I can't believe it.
AVAssetImageGenerator is the best way to snapshot a video; this method asynchronously returns a UIImage:
import AVFoundation

// ...

var player: AVPlayer? = // ...

func screenshot(handler: @escaping ((UIImage) -> Void)) {
    guard let player = player,
          let asset = player.currentItem?.asset else {
        return
    }
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let img = image {
            handler(UIImage(cgImage: img))
        }
    }
}
(It's Swift 4.2)
AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that (one sketch is included after the code below).
This answer mostly cribbed from here
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#interface ViewController ()
#property (nonatomic) AVPlayer *player;
#property (nonatomic) AVPlayerItem *playerItem;
#property (nonatomic) AVPlayerItemVideoOutput *playerOutput;
#end
#implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
NSDictionary* settings = #{ (id)kCVPixelBufferPixelFormatTypeKey : #(kCVPixelFormatType_32BGRA) };
self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addOutput:self.playerOutput];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = self.view.frame;
[self.view.layer addSublayer:playerLayer];
[self.player play];
}
- (IBAction)grabFrame {
CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
NSLog(#"The image: %#", buffer);
}
- (void)viewDidLoad {
[super viewDidLoad];
NSURL *someUrl = [NSURL URLWithString:#"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:#"tracks"] completionHandler:^{
NSError* error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:#"tracks" error:&error];
if (status == AVKeyValueStatusLoaded)
{
dispatch_async(dispatch_get_main_queue(), ^{
[self setupPlayerWithLoadedAsset:asset];
});
}
else
{
NSLog(#"%# Failed to load the tracks.", self);
}
}];
}
#end
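For completeness, here is one minimal sketch (not part of the original answer) of turning the copied CVPixelBuffer into a UIImage via Core Image; cache the CIContext if you call this per frame, since creating one is expensive:
#import <CoreImage/CoreImage.h>

static UIImage *UIImageFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    if (pixelBuffer == NULL) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}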
I am using AVPlayer to play an online video in my project. The video plays well. Now I want to reduce/increase the fps of the video. Below is the code I am using:
self.asset = [AVAsset assetWithURL:self.videoUrl];

// the video player
self.player = [AVPlayer playerWithURL:self.videoUrl];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;

self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.player currentItem]];

self.playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.myPlayerView.frame.size.height);
[self.myPlayerView.layer addSublayer:self.playerLayer];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *p = [notification object];
    [p seekToTime:kCMTimeZero];
}
Now how should I reduce/increase the fps for the online video?
You can do something like this:
- (float)getFrameRateFromAVPlayer
{
    float fps = 0.00;
    AVAsset *videoAsset = self.queuePlayer.currentItem.asset;
    if (videoAsset) {
        AVAssetTrack *videoATrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        if (videoATrack) {
            fps = videoATrack.nominalFrameRate;
        }
    }
    return fps;
}
OR
AVPlayerItem *item = self.queuePlayer.currentItem; // your current item
float fps = 0.00;
for (AVPlayerItemTrack *track in item.tracks) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeVideo]) {
        fps = track.currentVideoFrameRate;
    }
}
Hope this will help :)
AVPlayer allows you to set the current rate of playback. Basically, it accepts a range of values that control the current AVPlayerItem, such as slow forward, fast forward, or reverse playback with negative rates. As the documentation says, you should check whether the current item supports those playback modes.
Please check it out. The link for your reference: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instp/AVPlayer/rate
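A minimal sketch of what that looks like in code, using the canPlaySlowForward / canPlayFastForward checks on the current item (note this changes playback speed, not the frame rate encoded in the file):
AVPlayerItem *currentItem = self.player.currentItem;
if (currentItem.canPlayFastForward) {
    self.player.rate = 2.0;   // 2x forward
} else if (currentItem.canPlaySlowForward) {
    self.player.rate = 0.5;   // half speed
} else {
    self.player.rate = 1.0;   // normal playback
}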
AVPlayerItem has the property forwardPlaybackEndTime:
The value indicates the time at which playback should end when the
playback rate is positive (see AVPlayer’s rate property).
The default value is kCMTimeInvalid, which indicates that no end time
for forward playback is specified. In this case, the effective end
time for forward playback is the item’s duration.
But I don't know why it does not work. I tried setting it in AVPlayerItemStatusReadyToPlay, in the duration-available callback, and so on, but it has no effect; the item just plays to the end.
I think forwardPlaybackEndTime is meant to restrict the playhead, right?
In my app, I want to play only from the beginning to the middle of the movie.
My code looks like this
- (void)playURL:(NSURL *)URL
{
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:URL];
    if (self.avPlayer) {
        if (self.avPlayer.currentItem && self.avPlayer.currentItem != playerItem) {
            [self.avPlayer replaceCurrentItemWithPlayerItem:playerItem];
        }
    } else {
        [self setupAVPlayerWithPlayerItem:playerItem];
    }

    playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);

    // Play
    [self.avPlayer play];
}
How can I make forwardPlaybackEndTime work?
Try this:
playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
Here 5 is the time, in seconds, up to which the AVPlayerItem will play.
Set the following on your AVPlayer:
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
Then set up your notification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidPlayToEndTime:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
with the corresponding method:
- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    // do something here
}
Then set your forwardPlaybackEndTime:
self.avPlayer.currentItem.forwardPlaybackEndTime = CMTimeAdd(self.avPlayer.currentItem.currentTime, CMTimeMake(5, 1));
and start your AVPlayer playing:
self.avPlayer.rate = 1.0;
The notification will be triggered and your track will continue playing. In your handler you can stop it, do a seekToTime:, or whatever.
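For example, a possible body for the handler above (a sketch; adapt it to your own player property):
- (void)playerItemDidPlayToEndTime:(NSNotification *)notification
{
    [self.avPlayer pause];
    [self.avPlayer seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
        // the item is back at the start; play again or leave it paused
    }];
}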
Alternatively you can just set a boundary observer:
NSArray *array = [NSArray arrayWithObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(5.0, 1)]];
__weak OTHD_AVPlayer *weakself = self;
self.observer_End = [self addBoundaryTimeObserverForTimes:array queue:NULL usingBlock:^{
    if (weakself.rate >= 0.0) {
        [weakself endBoundaryHit];
    }
}];
I have checked the code below; it runs, and the streaming stops at the specified time:
- (void)playURL
{
    NSURL *url = [NSURL URLWithString:@"http://clips.vorwaerts-gmbh.de/VfE_html5.mp4"];
    self.playerItem = [AVPlayerItem playerItemWithURL:url];
    self.playerItem.forwardPlaybackEndTime = CMTimeMake(10, 1);
    self.avPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];
    [videoView setPlayer:self.avPlayer];

    // Play
    [self.avPlayer play];
}
I hope this will help you.
Also please check these tutorials: AVFoundation Framework
I think that you have to set it on the avPlayer's current item.
So, instead of:
playerItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
it should be:
self.avPlayer.currentItem.forwardPlaybackEndTime = CMTimeMake(5, 1);
I am implementing an HTTP Live Streaming player on OS X using AVPlayer.
I am able to stream properly, seek, get the duration, and so on.
Now I want to take screenshots and process the frames with OpenCV.
I went for using AVAssetImageGenerator. But there are no audio and video tracks in the AVAsset associated with player.currentItem.
The tracks do appear in player.currentItem.tracks.
So I am not able to use AVAssetImageGenerator. Can anybody help me find a solution for extracting screenshots and individual frames in such a scenario?
Please find below the code showing how I initiate the HTTP live stream.
Thanks in advance.
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];

[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];

AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];

[self.player play];
The following is how I am checking whether a video track is present in the asset:
case AVPlayerItemStatusReadyToPlay:
    [self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        [[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
        NSLog(@"%f,%f,%f", [self currentTime], [self duration], [[self player] rate]);

        AVPlayerItem *item = playeritem;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            AVAsset *asset = (AVAsset *)item.asset;
            long audiotracks = [[asset tracks] count];
            long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions] count];
            NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);
        }
    }]];
AVPlayerItem *item = self.player.currentItem;
if (item.status != AVPlayerItemStatusReadyToPlay) {
    return;
}

AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio] count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo] count];
NSLog(@"Track info Audio = %ld, Video = %ld", audiotracks, videotracks);
This is an older question, but in case someone needs help with it, I have an answer.
AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;

for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */; i++) {
    @autoreleasepool {
        CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);
        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
        // Do what you want with the image, for example save it as a UIImage
        UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
        CGImageRelease(image);
    }
}
You can easily get the FPS of a video by using this code:
float fps = 0.00;
if (asset) {
    AVAssetTrack *videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    if (videoATrack) {
        fps = [videoATrack nominalFrameRate];
    }
}
Hope that helps someone who is asking how to get all frames from a video, or just specific frames (with CMTime, for example). Please bear in mind that saving all frames to an array can have a heavy impact on memory!
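If memory is a concern, one option (a sketch, not from the original answer) is to write each frame to disk inside the @autoreleasepool of the loop above instead of keeping it in an array, reusing the loop's generatedImage and i variables; paths are illustrative:
NSData *jpegData = UIImageJPEGRepresentation(generatedImage, 0.8);
NSString *path = [NSTemporaryDirectory()
    stringByAppendingPathComponent:[NSString stringWithFormat:@"frame_%06.0f.jpg", i]];
[jpegData writeToFile:path atomically:YES];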