I'm writing a plugin for AVPlayer on iOS. I need to know when the user taps the Done button in AVPlayerViewController (i.e. when the user closes the video), but I don't have access to the AVPlayerViewController object. The only event I found is that the rate property of AVPlayer is set to 0, but the rate is also set to 0 when the video is paused. How can I tell these two situations apart?
Thanks all.
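For context, this is roughly what the rate observation described above looks like; the handler fires the same way for Pause and for Done, which is exactly the problem. The method name startObservingPlayer: and the context pointer are illustrative, not from an actual plugin.
// Illustrative sketch only: KVO on rate cannot tell a pause apart from a close.
static void * kRateContext = &kRateContext;

- (void)startObservingPlayer:(AVPlayer *)player {
    [player addObserver:self
             forKeyPath:@"rate"
                options:NSKeyValueObservingOptionNew
                context:kRateContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSString *, id> *)change
                       context:(void *)context {
    if (context == kRateContext) {
        if ([change[NSKeyValueChangeNewKey] floatValue] == 0.0f) {
            // Reached both when the user taps Pause and when they tap Done,
            // so rate alone is not enough to detect the close.
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}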
I ran into this problem while developing a player in windowed mode. It is currently impossible without tricks, so I used KVO to observe contentOverlayView, which is effectively a full-size view of AVPlayerViewController. The code is a bit involved. In the example below, the playerView property is a view from the xib/storyboard on the view controller (see attached).
#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

static NSString * const kBoundsProperty = @"bounds";
static void * kBoundsContext = &kBoundsContext;

@interface ViewController ()

@property (nonatomic, strong) AVPlayerViewController *playerViewController;
// View for windowed mode.
@property (weak, nonatomic) IBOutlet UIView *playerView;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.playerViewController = [[AVPlayerViewController alloc] init];
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Because in -viewDidLoad frame is unknown.
    [self loadPlayerView];
}

- (void)loadPlayerView {
    NSURL *videoURL = [NSURL URLWithString:@"https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4"];
    AVPlayer *player = [[AVPlayer alloc] initWithURL:videoURL];
    self.playerViewController.player = player;
    [player play];

    [self addChildViewController:self.playerViewController];
    [self.playerView addSubview:self.playerViewController.view];
    // MARK: I would recommend to use constraints instead of frame.
    self.playerViewController.view.frame = self.playerView.bounds;

    [self.playerViewController.contentOverlayView addObserver:self
                                                   forKeyPath:kBoundsProperty
                                                      options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                                                      context:kBoundsContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSString *,id> *)change context:(void *)context {
    if (context == kBoundsContext) {
        CGRect oldBounds = [change[NSKeyValueChangeOldKey] CGRectValue];
        CGRect newBounds = [change[NSKeyValueChangeNewKey] CGRectValue];
        BOOL wasFullscreen = CGRectEqualToRect(oldBounds, [UIScreen mainScreen].bounds);
        BOOL isFullscreen = CGRectEqualToRect(newBounds, [UIScreen mainScreen].bounds);
        if (isFullscreen && !wasFullscreen) {
            if (CGRectEqualToRect(oldBounds, CGRectMake(0.0, 0.0, newBounds.size.height, newBounds.size.width))) {
                NSLog(@"Rotated fullscreen");
            } else {
                NSLog(@"Entered fullscreen");
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"DidEnterInFullscreen" object:nil];
                });
            }
        } else if (!isFullscreen && wasFullscreen) {
            NSLog(@"Exited fullscreen");
            dispatch_async(dispatch_get_main_queue(), ^{
                [[NSNotificationCenter defaultCenter] postNotificationName:@"DidExitFromFullscreen" object:nil];
            });
        }
    }
}

@end
I have subclassed AVPlayerViewController so that I can show it in landscape mode on iPhone by overriding supportedInterfaceOrientations. This works fine. But when I tap the cloud icon at the bottom right to select the Subtitles and CC option, it opens in portrait mode. What could be the reason?
Is there any other way to display AVPlayerViewController in landscape mode without subclassing it?
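For reference, a minimal sketch of the kind of subclass the question describes; the class name is invented here, and overriding AVPlayerViewController is not officially supported, which may be why presented panels such as the Subtitles/CC selector ignore the override.
// Illustrative subclass along the lines the question describes (not the poster's code).
#import <AVKit/AVKit.h>

@interface LandscapePlayerViewController : AVPlayerViewController
@end

@implementation LandscapePlayerViewController

- (BOOL)shouldAutorotate {
    return YES;
}

- (UIInterfaceOrientationMask)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscape;
}

@end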
Do you mean just creating a simple class? Because it is simple: create a new class and add an AVPlayerLayer instead of creating an AVPlayerViewController:
NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"mp4"];
NSURL *videoURL = [NSURL fileURLWithPath:videoPath];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:videoURL];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];

AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];
[player play];
Then override the right methods:
- (BOOL)shouldAutorotate {
    return YES;
}

- (UIInterfaceOrientationMask)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscape;
}

- (UIInterfaceOrientation)preferredInterfaceOrientationForPresentation {
    return UIInterfaceOrientationLandscapeLeft;
}
In my application, I'm trying to play video from a URL on my server. I'm using a UITableView to display the video list, and tapping a cell plays the video in a subview. Now I want to play the video in landscape mode.
This is the current video code.
_movieplayer = [[MPMoviePlayerController alloc]initWithContentURL: [NSURL URLWithString:[self urlencode:self.strPlayUrl]]];
[[_movieplayer view] setFrame: CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[self.view addSubview: [_movieplayer view]];
[_movieplayer setShouldAutoplay:YES];
[_movieplayer prepareToPlay];
[self.movieplayer play];
Given the code above, how can I make the video play in landscape mode? Please guide me on this; I've been stuck on it for a long time.
Add this to the view controller that owns the movie player:
@property (nonatomic, strong) MPMoviePlayerController *mpc;

- (void)setUpMPC
{
    NSURL *m = [[NSBundle mainBundle] URLForResource:@"YourVideo" withExtension:@"mp4"];
    MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:m];
    self.mpc = mp; // retain policy
    self.mpc.shouldAutoplay = NO;
    [self.mpc prepareToPlay];
    self.mpc.view.frame = CGRectMake(50, 50, self.view.bounds.size.width, self.view.bounds.size.height);
}

- (NSUInteger)supportedInterfaceOrientations {
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPad) {
        return UIInterfaceOrientationMaskAll;
    }
    return UIInterfaceOrientationMaskLandscape;
}
You probably need to implement the shouldAutorotateToInterfaceOrientation: and supportedInterfaceOrientations methods in your view controller:
@property (nonatomic, strong) MPMoviePlayerController *moviePlayerController;

#pragma mark - Init -

- (void)createAndPlayMovieForURL:(NSURL *)movieURL
{
    self.moviePlayerController = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    [self.moviePlayerController.view setFrame:self.view.bounds];
    [self.view addSubview:self.moviePlayerController.view];
    [self.view bringSubviewToFront:self.overlayView];
}

#pragma mark - Rotation -

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return YES;
}

- (NSUInteger)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskAllButUpsideDown;
}
I have a 15-second animation in my app that is currently way too inefficient to be workable, as it uses 1.5 GB of RAM.
Am I using the animation properties of UIImageView incorrectly? Perhaps there is a different solution. All I want is for it to loop for a certain duration.
There are improvements I can make to the 539 images to make them more efficient, but they still need to be retina-sized, etc. Currently it runs at 30 fps.
- (void)viewDidLoad {
    [super viewDidLoad];
    _animationFrames = [[NSMutableArray alloc] init];
    for (long l = 0; l < 540; l++) {
        NSString *frameNumber = [NSMutableString stringWithFormat:@"AnimationFrames-%04ld", l];
        UIImage *frame = [UIImage imageNamed:frameNumber];
        [_animationFrames addObject:frame];
    }
    UIImageView *animation = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    animation.animationImages = _animationFrames;
    animation.animationDuration = 14;
    [self.view addSubview:animation];
    [animation startAnimating];
}
Thanks to Mirko Brunner's comments I discovered that I could use an embedded movie rather than play the animation through UIImageView, though I ended up going with AVPlayer rather than MPMoviePlayer.
#import "VideoViewController.h"
#import AVFoundation;
#interface VideoViewController ()
#property (weak, nonatomic) AVPlayer *avPlayer;
#property (weak, nonatomic) AVPlayerLayer *avPlayerLayer;
#end
#implementation VideoViewController
- (void)viewDidLoad {
[super viewDidLoad];
[self avPlayer];
[self.avPlayer play];
}
- (AVPlayer *)avPlayer {
if (!_avPlayer) {
NSURL *url = [[NSBundle mainBundle]
URLForResource: #"Video" withExtension:#"mp4"];
_avPlayer = [AVPlayer playerWithURL:url];
_avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:_avPlayer];
_avPlayerLayer.frame = self.view.layer.bounds;
[self.view.layer addSublayer: _avPlayerLayer];
_avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[[NSNotificationCenter defaultCenter] addObserver:self
selector:#selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[_avPlayer currentItem]];
}
return _avPlayer;
}
- (void)playerItemDidReachEnd:(NSNotification *)notification {
AVPlayerItem *p = [notification object];
[p seekToTime:kCMTimeZero];
}
I am using AVFoundation's AVPlayer to play 2 video clips made from 1 longer video (so the end of the first matches the beginning of the second).
When the first video ends and the user taps, I create a new AVPlayer, assign it to my PlayerView, and start playing the second clip.
This all works; however, there is a prominent screen "flicker".
My assumption is that this is caused by the player view removing the first clip and then showing the second clip.
What I need is for this flicker not to appear, so that going between the two clips is seamless.
Does anyone know if there is a way to stop this flicker, either via the AVPlayer* classes, or a way to "fake" it by doing something so it isn't visible?
Thanks
Below is the code of my load and play method:
- (void)loadAssetFromFile
{
    NSURL *fileURL = nil;
    switch (playingClip)
    {
        case 1:
            fileURL = [[NSBundle mainBundle] URLForResource:@"wh_3a" withExtension:@"mp4"];
            break;
        case 2:
            fileURL = [[NSBundle mainBundle] URLForResource:@"wh_3b" withExtension:@"mp4"];
            break;
        case 3:
            fileURL = [[NSBundle mainBundle] URLForResource:@"wh_3c" withExtension:@"mp4"];
            break;
        case 4:
            fileURL = [[NSBundle mainBundle] URLForResource:@"wh_3d" withExtension:@"mp4"];
            break;
        default:
            return;
    }

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
     ^{
         // The completion block goes here.
         NSError *error = nil;
         AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
         if (status == AVKeyValueStatusLoaded)
         {
             self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
             [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidReachEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
             self.player = [AVPlayer playerWithPlayerItem:playerItem];
             [playerView setPlayer:player];
             [self.player seekToTime:kCMTimeZero];
             [self play];
         }
         else {
             // Deal with the error appropriately.
             NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
         }
     }];
}
You do not need to re-create AVPlayer for this task. You can just have multiple AVPlayerItems and then switch which one is current via [AVPlayer replaceCurrentItemWithPlayerItem:item].
Also, you can observe when the current item has changed with the code below.
static NSString * const kCurrentItemKey = @"currentItem";
static void *CurrentItemObservationContext = &CurrentItemObservationContext;
...
After creating a player, register the observer:
[player1 addObserver:self
          forKeyPath:kCurrentItemKey
             options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
             context:CurrentItemObservationContext];
...
- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == CurrentItemObservationContext) {
        AVPlayerItem *item = [change objectForKey:NSKeyValueChangeNewKey];
        if (item != (id)[NSNull null]) {
            [player1 play];
        }
    }
}
There are two workarounds that I found. Both approaches worked for me, and I prefer the second one.
First, as @Alex Kennberg mentioned, create two sets of AVPlayerLayer and AVPlayer, and switch between them when you switch videos. Be sure to set the background color to clear.
Second, use a UIImageView as the owner view of the AVPlayerLayer. Create a thumbnail image of the video and set it on the image view before switching the video. Be sure to set the content mode correctly. A sketch of this idea follows.
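A minimal sketch of that second workaround, assuming the player layer is hosted by a playerImageView property and a player property already exists; those names and the switchToAsset: method are illustrative, not from the answer above.
// Sketch only: show a thumbnail of the next clip in the hosting UIImageView,
// then swap the player item behind it. `playerImageView` and `player` are assumed properties.
- (void)switchToAsset:(AVURLAsset *)asset {
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;

    CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:NULL];
    if (cgImage) {
        self.playerImageView.image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    }
    // Match the image view's content mode to the layer's videoGravity so the thumbnail lines up.
    self.playerImageView.contentMode = UIViewContentModeScaleAspectFit;

    // Swap the item; the thumbnail hides any momentary blank frame.
    [self.player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];
    [self.player play];
}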
I ran into the same issue with the video "flashing" and solved it this way in Swift 5.
I set my player variable to this:
var player = AVPlayer(playerItem: nil)
Then, inside my playVideo function, I changed this:
self.player.replaceCurrentItem(with: AVPlayerItem(url: fileURL))
to this:
player = AVPlayer(url: fileURL)
"fileURL" is the path to the video I want to play.
This removed the flash and played the next video seamlessly for me.
You can initialise the player item and seek it to zero some time before you assign it to the player; then the flickering disappears.
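A rough sketch of that idea, assuming a player property that already drives an AVPlayerLayer; the nextItem property and the split into two methods are assumptions for illustration.
// Sketch: build and prepare the next item ahead of time, then swap it in later.
// `player` and `nextItem` are assumed properties, not from the answer above.
- (void)preloadNextClipWithURL:(NSURL *)url {
    self.nextItem = [AVPlayerItem playerItemWithURL:url];
    // Seeking to zero up front gives the item time to get its first frame ready.
    [self.nextItem seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
        // First frame should be decoded by the time the item is shown.
    }];
}

- (void)showNextClip {
    [self.player replaceCurrentItemWithPlayerItem:self.nextItem];
    [self.player play];
}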
I tried this and it worked for me.
if (layerView1.playerLayer.superlayer) {
    [layerView1.playerLayer removeFromSuperlayer];
}
But I am also allocating my own AVPlayerLayer instead of using IB to do it.
After too many tries without success, I finally found a solution. It's not the best one, but it works.
My entire code is below; have a look at the loadVideo method.
#import "ViewController.h"
#import <AVKit/AVKit.h>
#interface ViewController ()<UIGestureRecognizerDelegate>
#property (nonatomic, strong) NSArray *videos;
#property (nonatomic, assign) NSInteger videoIndex;
#property (nonatomic, strong) AVPlayer *player;
#property (nonatomic, strong) AVPlayerLayer *playerLayer;
#end
#implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.view.backgroundColor = [UIColor blackColor];
self.videos = #[#"1.mp4", #"2.mp4", #"3.mp4", #"4.mp4", #"5.mp4", #"6.mp4"];
self.videoIndex = 0;
[self loadVideo];
[self configureGestures:self.view]; //changes video on swipe
}
- (void)prevZoomPanel {
    if (self.videoIndex <= 0) {
        NSLog(@"cant go prev");
        return;
    }
    self.videoIndex -= 1;
    [self loadVideo];
}

- (void)nextZoomPanel {
    if (self.videoIndex >= self.videos.count - 1) {
        NSLog(@"cant go next");
        return;
    }
    self.videoIndex += 1;
    [self loadVideo];
}
#pragma mark - Load Video

- (void)loadVideo {
    NSURL *bundle = [[NSBundle mainBundle] bundleURL];
    NSURL *file = [NSURL URLWithString:self.videos[self.videoIndex] relativeToURL:bundle];
    NSURL *absoluteFile = [file absoluteURL];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:absoluteFile];

    //*************
    // DO NOT USE '[self.player replaceCurrentItemWithPlayerItem:item]', it flashes; instead, initialise the instance again.
    // Why is replaceCurrentItemWithPlayerItem flashing but playerWithPlayerItem is NOT?
    // If you want to see the difference, uncomment the code below.
    self.player = [AVPlayer playerWithPlayerItem:item];
    // if (self.player == nil) {
    //     self.player = [AVPlayer playerWithPlayerItem:item];
    // } else {
    //     [self.player replaceCurrentItemWithPlayerItem:item];
    // }

    //*************
    // Create an instance of AVPlayerLayer and add it on self.view.
    // afraid of this
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    playerLayer.frame = self.view.layer.bounds;
    [self.view.layer addSublayer:playerLayer];

    //*************
    // Play the video before removing the old AVPlayerLayer instance; at this point there are 2 sublayers.
    [self.player play];
    NSLog(@"sublayers before: %zd", self.view.layer.sublayers.count);

    //*************
    // Remove all sublayers after 0.09 s to avoid the flash; 0.08 s still flashes.
    // TODO: tested on iPhone X, need to test on slower iPhones to check if the time is enough.
    // Why do I need to wait before removing? Is that safe? What if I swipe a lot, faster than 0.09 s?
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.09 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        NSArray *sublayers = [NSArray arrayWithArray:self.view.layer.sublayers];
        NSInteger idx = 0;
        for (CALayer *layer in sublayers) {
            if (idx < self.view.layer.sublayers.count && self.view.layer.sublayers.count > 1) {
                // To avoid a memory crash, remove all sublayers but keep the top one.
                [layer removeFromSuperlayer];
            }
            idx += 1;
        }
        NSLog(@"sublayers after: %zd", self.view.layer.sublayers.count);
    });

    //*************
    // The code below is the same as the above, but with no delay.
    // Uncomment the code below AND comment out the code above to test.
    // NSArray *sublayers = [NSArray arrayWithArray:self.view.layer.sublayers];
    // NSInteger idx = 0;
    //
    // for (CALayer *layer in sublayers) {
    //     if (idx < self.view.layer.sublayers.count && self.view.layer.sublayers.count > 1) {
    //         // To avoid a memory crash, remove all sublayers but keep the top one.
    //         [layer removeFromSuperlayer];
    //     }
    //     idx += 1;
    // }

    //*************
    // App's memory usage is about 14 MB constantly; it didn't increase when changing videos.
    // TODO: need to test with more than 100 heavy videos.
}
- (void)configureGestures:(UIView *)view {
    UISwipeGestureRecognizer *right = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(userDidSwipeScreen:)];
    right.direction = UISwipeGestureRecognizerDirectionRight;
    right.delegate = self;
    [view addGestureRecognizer:right];

    UISwipeGestureRecognizer *left = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(userDidSwipeScreen:)];
    left.direction = UISwipeGestureRecognizerDirectionLeft;
    left.delegate = self;
    [view addGestureRecognizer:left];
}

- (void)userDidSwipeScreen:(UISwipeGestureRecognizer *)swipeGestureRecognizer {
    switch (swipeGestureRecognizer.direction) {
        case UISwipeGestureRecognizerDirectionLeft:  [self nextZoomPanel]; break;
        case UISwipeGestureRecognizerDirectionRight: [self prevZoomPanel]; break;
        default: break;
    }
}

@end
I found a very simple solution (maybe too simple for some people, but it worked for me):
In Interface Builder I set the background color of the view that the video layer gets attached to to black. So it's just 'flashing' black now...
As @blancos says in this answer:
Firstly, AVPlayer doesn't show any white screen, its your background
which is white
He's 100% correct, because when I set my background to white, the flash was white, but when I set the background to green, the flash was green. So to fix it, I set the background to black:
view.backgroundColor = .black
When switching videos, I used player.replaceCurrentItem(...):
var playerItem: AVPlayerItem?

func switchVideos(url: URL) {
    playerItem = AVPlayerItem(url: url)
    player.replaceCurrentItem(with: playerItem!)
    // If necessary, use KVO to know when the video is ready to play.
}