I need to play the same local video file in four places at the same time, so I created four AVPlayer instances and added their AVPlayerLayers to the same superlayer. The problem is that the four players do not start at the same time. How can I make them begin together? Here is my code:
self.players = @[].mutableCopy;
CMAudioClockCreate(kCFAllocatorDefault, &_syncClock);
AVPlayerItem *item = [[AVPlayerItem alloc] initWithURL:self.url];
for (NSInteger i = 0; i < playerNum; i++) {
    AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:item.copy];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    // set the layer's scaling mode
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    playerLayer.contentsScale = [UIScreen mainScreen].scale;
    CGPoint pos = [playerOriginArr[i] CGPointValue];
    playerLayer.frame = CGRectMake(pos.x, pos.y, playerSize.width, playerSize.height);
    [self.playBackBgView.layer addSublayer:playerLayer];
    avPlayer.masterClock = _syncClock;
    [avPlayer.currentItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    [avPlayer play];
    [self.players addObject:avPlayer];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"] && [object isKindOfClass:[AVPlayerItem class]]) {
        AVPlayerItem *playerItem = (AVPlayerItem *)object;
        if (playerItem.status == AVPlayerStatusReadyToPlay) {
            for (AVPlayer *player in self.players) {
                if (player.currentItem == playerItem) {
                    [player prerollAtRate:1.0 completionHandler:^(BOOL finished) {
                        if (finished) {
                        }
                    }];
                    player.automaticallyWaitsToMinimizeStalling = NO;
                    NSLog(@"setRate");
                    [player setRate:1.0 time:kCMTimeInvalid atHostTime:CMClockGetTime(_syncClock)];
                }
            }
        }
    }
}
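For what it's worth, the pattern the AVPlayer documentation suggests for lockstep playback is to preroll every player first and only then call setRate:time:atHostTime: on all of them with the same host time slightly in the future, so no player starts before the others are ready. Below is a minimal sketch of that pattern, reusing self.players and _syncClock from the question; the 0.3-second lead time is an arbitrary assumption.

// A minimal sketch: preroll every player, then start them all at the same
// future host time. Call this once every item has reported ReadyToPlay.
- (void)startAllPlayersInSync {
    dispatch_group_t group = dispatch_group_create();
    for (AVPlayer *player in self.players) {
        // Required before prerollAtRate:completionHandler: may be used.
        player.automaticallyWaitsToMinimizeStalling = NO;
        dispatch_group_enter(group);
        [player prerollAtRate:1.0 completionHandler:^(BOOL finished) {
            dispatch_group_leave(group);
        }];
    }
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        // Pick one host time a little in the future so every player receives
        // the command before playback is due to begin.
        CMTime hostTime = CMTimeAdd(CMClockGetTime(self->_syncClock),
                                    CMTimeMakeWithSeconds(0.3, NSEC_PER_SEC));
        for (AVPlayer *player in self.players) {
            [player setRate:1.0 time:kCMTimeInvalid atHostTime:hostTime];
        }
    });
}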
Create just one AVPlayer and create multiple AVPlayerLayers from that.
The layers will be synchronized.
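A minimal sketch of that approach; fileURL and containerView are placeholder names (not from the original post), and the 2x2 layout is just an example:

// One AVPlayer drives several AVPlayerLayers, which all render the same frames.
AVPlayer *sharedPlayer = [AVPlayer playerWithURL:fileURL];

for (NSInteger i = 0; i < 4; i++) {
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:sharedPlayer];
    layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    // Lay the copies out in a 2x2 grid; adjust to taste.
    CGFloat w = containerView.bounds.size.width / 2.0;
    CGFloat h = containerView.bounds.size.height / 2.0;
    layer.frame = CGRectMake((i % 2) * w, (i / 2) * h, w, h);
    [containerView.layer addSublayer:layer];
}

[sharedPlayer play]; // every layer shows identical, synchronized frames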
Here is my code:
NSURL *fileURL = [NSURL URLWithString:@"http://myglams.com/UserUpload/vv/oSiFevyNHM.mp4"];
moviePlayerController = [[MPMoviePlayerController alloc] initWithContentURL:fileURL];
[moviePlayerController.view setFrame:CGRectMake(20, 100, 380, 150)];
[self.view addSubview:moviePlayerController.view];
moviePlayerController.fullscreen = YES;
moviePlayerController.allowsAirPlay = YES;
moviePlayerController.shouldAutoplay = YES;
moviePlayerController.controlStyle = MPMovieControlStyleEmbedded;
[moviePlayerController play]; // does not start streaming automatically; play has to be triggered every time
A better option for playing videos is AVPlayer with an AVPlayerLayer, observing currentItem.loadedTimeRanges.
Example
void *kTimeRangesKVO1 = &kTimeRangesKVO1;
playerView = [[AVPlayer alloc] initWithURL:url];
layer = [AVPlayerLayer playerLayerWithPlayer:playerView];
layer.frame = CGRectMake(0, 0, videoContentPlayer.frame.size.width, videoContentPlayer.frame.size.height);
[layer setVideoGravity:AVLayerVideoGravityResizeAspect];
[videoContentPlayer.layer insertSublayer:layer below:labelTime.layer];
[playerView addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:kTimeRangesKVO1];
// the callback method
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (kTimeRangesKVO1 == context) {
        NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
        if (timeRanges && [timeRanges count]) {
            CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
            float currentBufferDuration = CMTimeGetSeconds(CMTimeAdd(timerange.start, timerange.duration));
            CMTime duration = playerView.currentItem.asset.duration;
            float seconds = CMTimeGetSeconds(duration);
            // your code here, with your own rules
        }
    }
}
Remember to remove the observer when you are done, something like [playerView removeObserver:self forKeyPath:@"currentItem.loadedTimeRanges"];
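As one possible rule for the "your code here" spot above (a sketch, not part of the original answer; the 3-second threshold is an assumption): resume playback once roughly three seconds beyond the playhead are buffered, or the whole file has loaded.

// Drop-in body for the "your code here" comment in the callback above.
float bufferedAhead = currentBufferDuration - CMTimeGetSeconds([playerView currentTime]);
if (bufferedAhead >= 3.0f || currentBufferDuration >= seconds) {
    [playerView play]; // enough is buffered (or fully loaded), start/resume playback
}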
I am using AVPlayer to play remote .mp4 files (in Objective-C). The video loads from the remote URL, but after about a second of playback the player freezes; the video gets stuck and does not continue.
Can anyone please help me? Here is my code:
NSString *encodedString = [urlString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL *videoUrl = [[NSURL alloc] initWithString:encodedString];
self.asset = [AVAsset assetWithURL:videoUrl];
self.player = [AVPlayer playerWithURL:videoUrl];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.player addObserver:self forKeyPath:@"status" options:0 context:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[self.player currentItem]];
self.playerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
[self.myPlayerView.layer addSublayer:self.playerLayer];
self.videoPlaybackPosition = 0;
[self.player play];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == self.player && [keyPath isEqualToString:@"status"]) {
        if (self.player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed --> %ld", (long)self.player.currentItem.status);
        } else if (self.player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayerStatusReadyToPlay");
            [self.player play];
        } else if (self.player.status == AVPlayerStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
AVPlayerStatusReadyToPlay takes a while to buffer the video, and it only fires once. The same code works fine when I record a video and play it back from a temp file. But after uploading the video to the server, when I try to play it from there the video plays for just 1-2 seconds, then gets stuck and nothing happens. I can't work out where the problem is.
Can anyone please help me?
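A frequent cause of this symptom is that playback stalls while the remote file buffers. One way to check and recover, sketched below as an assumption rather than a confirmed fix for this exact code, is to observe the item's playbackBufferEmpty and playbackLikelyToKeepUp keys and resume playback once enough data has arrived.

// Register alongside the existing "status" observer; names reuse the
// question's self.player.
[self.player.currentItem addObserver:self
                          forKeyPath:@"playbackBufferEmpty"
                             options:NSKeyValueObservingOptionNew
                             context:nil];
[self.player.currentItem addObserver:self
                          forKeyPath:@"playbackLikelyToKeepUp"
                             options:NSKeyValueObservingOptionNew
                             context:nil];

// Add to the existing observeValueForKeyPath:ofObject:change:context: method:
if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
    NSLog(@"Playback stalled, buffering...");
} else if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]
           && self.player.currentItem.playbackLikelyToKeepUp) {
    [self.player play]; // enough data is buffered again, resume
}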
My app currently plays two videos at the same time and it works, but all the SO answers I've seen use a lot of key-value observing code. Is it bad/wrong if I just do the bare minimum listed below?
In viewDidLoad:
AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL URLWithString:firstVideo.videoURL]];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
self.player1 = [AVPlayer playerWithPlayerItem:item];
[topPlayer setMovieToPlayer:self.player1];
AVURLAsset *asset2 = [AVURLAsset assetWithURL:[NSURL URLWithString:secondVideo.videoURL]];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithAsset:asset2];
self.player2 = [AVPlayer playerWithPlayerItem:item2];
[bottomPlayer setMovieToPlayer:self.player2];
((AVPlayerLayer *)[self.topPlayer layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
((AVPlayerLayer *)[self.bottomPlayer layer]).videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.player1 play];
[self.player2 play];
The above code is all I use to play the videos, and it works fine. Occasionally there is a ~1s delay; how can I wait until both videos are ready to play and then start them together?
Use KVO to observe each player item's status, and start both videos only when both are ready.
[yourPlayerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:nil];
and in the KVO callback:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [change[NSKeyValueChangeNewKey] integerValue];
        switch (status) {
            case AVPlayerStatusUnknown:
                // do something
                break;
            case AVPlayerStatusReadyToPlay:
            {
                // check which player item is ready by testing whether object isEqual: yourPlayerItem
                // record readiness in a BOOL per player; once both are YES, start both videos
            }
                break;
            case AVPlayerStatusFailed:
            {
                AVPlayerItem *playerItem = (AVPlayerItem *)object;
                [self assetFailedToPrepareForPlayback:playerItem.error];
            }
                break;
        }
    }
}
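A concrete version of that two-BOOL idea might look like the sketch below; self.item1/self.item2 and the two ready flags are assumed properties standing in for the question's two player items.

// Sketch only: firstReady/secondReady are assumed BOOL properties, and
// item1/item2 the two AVPlayerItems whose "status" is being observed.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if (![keyPath isEqualToString:@"status"]) { return; }

    if ([change[NSKeyValueChangeNewKey] integerValue] == AVPlayerItemStatusReadyToPlay) {
        if ([object isEqual:self.item1]) { self.firstReady = YES; }
        if ([object isEqual:self.item2]) { self.secondReady = YES; }

        // Start both players only once both items are ready.
        if (self.firstReady && self.secondReady) {
            [self.player1 play];
            [self.player2 play];
        }
    }
}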
So I am building a custom video player using AVFoundation - AVPlayer and AVPlayerLayer.
Currently, all I want the player to do is play a video from the asset library using a hardcoded URL to that video. I would like this to be contained in a subclass of UIView so I can reuse it throughout my app.
Here is my code so far:
CUPlayer.h:
@interface CUPlayer : UIView
{
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    AVPlayerItem *item;
    NSURL *url;
}
@property(nonatomic) UIViewAutoresizing autoresizingMask;
@end
CUPlayer.m:
@implementation CUPlayer

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        // Initialization code
        self.backgroundColor = [UIColor redColor];
        [self setupURL];
    }
    return self;
}

- (void)setupURL
{
    NSLog(@"URL setting up");
    NSString *string = @"assets-library://asset/asset.mov?id=0A937F0D-6265-452D-8800-1A760E8E88B9&ext=mov";
    url = [[NSURL alloc] initFileURLWithPath:string];
    [self setupPlayerForURL];
}

- (void)setupPlayerForURL
{
    AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    item = [AVPlayerItem playerItemWithAsset:asset];
    player = [AVPlayer playerWithPlayerItem:item];
    [player addObserver:self forKeyPath:@"status" options:0 context:nil];
    playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    //player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [self.layer addSublayer:playerLayer];
    playerLayer.frame = CGRectMake(0, 0, 200, 200);
    //[playerLayer setBackgroundColor:[UIColor greenColor].CGColor];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (object == player && [keyPath isEqualToString:@"status"]) {
        if (player.status == AVPlayerStatusFailed) {
            NSLog(@"AVPlayer Failed");
        } else if (player.status == AVPlayerStatusReadyToPlay) {
            NSLog(@"AVPlayer Ready to Play");
            [player play];
        } else if (player.status == AVPlayerStatusUnknown) {
            NSLog(@"AVPlayer Unknown");
        }
    }
}
And then I am calling this from the View Controller:
CUPlayer *cuPlayer = [[CUPlayer alloc] initWithFrame:CGRectMake(0, 0, 250, 250)];
[self.view addSubview:cuPlayer];
This compiles but just gives me a red square without the video playing. The URL to the local file is definitely correct. I can get it working if I keep all the code in the view controller and call play in -(void)viewDidLayoutSubviews.
Help would be very much appreciated; I have read the documentation multiple times trying to work this out.
Tom
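One detail in the code above worth a second look: an assets-library:// string is a URL with its own scheme, not a filesystem path, so initFileURLWithPath: will wrap it in a file:// URL that does not point at the asset. It may not be the whole story here, but as a hedged sketch, setupURL built directly from the string would look like this:

- (void)setupURL
{
    NSLog(@"URL setting up");
    // assets-library:// is a URL scheme, so build the NSURL from the string
    // directly instead of treating it as a file path.
    NSString *string = @"assets-library://asset/asset.mov?id=0A937F0D-6265-452D-8800-1A760E8E88B9&ext=mov";
    url = [NSURL URLWithString:string];
    [self setupPlayerForURL];
}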