I have multiple UIImageView subviews that get stacked based on my incoming data. Each of these subviews is set to either an image or a video layer, depending on that data. The problem I have is playing videos: I can play the first video in the stack, but every video after that plays only the sound of the first video. How can I play each one correctly?
The views are navigated through with a tap event, like Snapchat. See below:
@interface SceneImageViewController ()
@property (strong, nonatomic) NSURL *videoUrl;
@property (strong, nonatomic) AVPlayer *avPlayer;
@property (strong, nonatomic) AVPlayerLayer *avPlayerLayer;
@end
@implementation SceneImageViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.mySubviews = [[NSMutableArray alloc] init];
self.videoCounterTags = [[NSMutableArray alloc] init];
int c = (int)[self.scenes count];
c--;
NSLog(#"int c = %d", c);
self.myCounter = [NSNumber numberWithInt:c];
for (int i=0; i<=c; i++) {
//create imageView
UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[imageView setUserInteractionEnabled:YES]; // <--- This is very important
imageView.tag = i; // <--- Add tag to track this subview in the view stack
[self.view addSubview:imageView];
NSLog(#"added image view %d", i);
//get scene object
PFObject *sceneObject = self.scenes[i];
//get the PFFile and filetype
PFFile *file = [sceneObject objectForKey:@"file"];
NSString *fileType = [sceneObject objectForKey:@"fileType"];
//check the filetype
if ([fileType isEqual:@"image"])
{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
//get image
NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
dispatch_async(dispatch_get_main_queue(), ^{
imageView.image = [UIImage imageWithData:imageData];
});
});
}
//its a video
else
{
// the video player
NSURL *fileUrl = [NSURL URLWithString:file.url];
self.avPlayer = [AVPlayer playerWithURL:fileUrl];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
//self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[self.avPlayer currentItem]];
CGRect screenRect = [[UIScreen mainScreen] bounds];
self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
[imageView.layer addSublayer:self.avPlayerLayer];
NSNumber *tag = [NSNumber numberWithInt:i+1];
NSLog(#"tag = %#", tag);
[self.videoCounterTags addObject:tag];
//[self.avPlayer play];
}
}
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
[self.view bringSubviewToFront:self.screen];
[self.screen addGestureRecognizer:tapGesture];
}
- (void)viewTapped:(UIGestureRecognizer *)gesture{
NSLog(#"touch!");
[self.avPlayer pause];
int i = [self.myCounter intValue];
NSLog(#"counter = %d", i);
for(UIImageView *subview in [self.view subviews]) {
if(subview.tag== i) {
[subview removeFromSuperview];
}
}
if ([self.videoCounterTags containsObject:self.myCounter]) {
NSLog(#"play video!!!");
[self.avPlayer play];
}
if (i == 0) {
[self.avPlayer pause];
[self.navigationController popViewControllerAnimated:NO];
}
i--;
self.myCounter = [NSNumber numberWithInt:i];
NSLog(#"counter after = %d", i);
}
What Brooks Hanes said is correct: you keep overriding the AVPlayer.
Here is what I suggest you do:
Add the tap gesture to the imageView instead of the screen (or, for a cleaner approach, use a UIButton instead):
UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[imageView setUserInteractionEnabled:YES]; // <--- This is very important
imageView.tag = i; // <--- Add tag to track this subview in the view stack
[self.view addSubview:imageView];
NSLog(#"added image view %d", i);
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)]; // target is the view controller, which implements viewTapped:
[imageView addGestureRecognizer:tapGesture];
This way, in your viewTapped: method, you can get the tag of the tapped image view from gesture.view.tag instead of using myCounter.
To get the videos working you could create a new AVPlayer for each video, but that could get expensive memory-wise. A better approach is to use AVPlayerItem and swap the AVPlayer's AVPlayerItem when changing videos.
So in the for loop, do something like this, where self.videoFiles is an NSMutableDictionary property:
// the video player
NSNumber *tag = [NSNumber numberWithInt:i+1];
NSURL *fileUrl = [NSURL URLWithString:file.url];
//save your video file url paired with the ImageView it belongs to.
[self.videoFiles setObject:fileUrl forKey:tag];
// you only need to initialize the player once.
if(self.avPlayer == nil){
AVAsset *asset = [AVAsset assetWithURL:fileUrl];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[self.avPlayer currentItem]];
}
// you don't need to keep the layer as a property
// (unless you need it for some reason)
AVPlayerLayer* avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
CGRect screenRect = [[UIScreen mainScreen] bounds];
avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
[imageView.layer addSublayer:avPlayerLayer];
NSLog(#"tag = %#", tag);
[self.videoCounterTags addObject:tag];
Now in your viewTapped:
if ([self.videoCounterTags containsObject:@(gesture.view.tag)]) { // box the NSInteger tag so it matches the stored NSNumbers
NSLog(@"play video!!!");
AVAsset *asset = [AVAsset assetWithURL:[self.videoFiles objectForKey:@(gesture.view.tag)]];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
[self.avPlayer replaceCurrentItemWithPlayerItem:item];
[self.avPlayer play];
}
Or use the self.videoFiles instead and then you don't need self.videoCounterTags at all:
NSURL* fileURL = [self.videoFiles objectForKey:@(gesture.view.tag)];
if (fileURL != nil) {
NSLog(@"play video!!!");
AVAsset *asset = [AVAsset assetWithURL:fileURL];
AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
[self.avPlayer replaceCurrentItemWithPlayerItem:item];
[self.avPlayer play];
}
That's the gist of it.
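Putting it together, here is a minimal sketch of the whole viewTapped: under this scheme. It assumes the tags were assigned exactly as above (imageView.tag = i, video URLs keyed by i + 1, so the key for the view revealed underneath a tapped view equals the tapped view's own tag):
- (void)viewTapped:(UIGestureRecognizer *)gesture {
    [self.avPlayer pause];
    NSInteger tag = gesture.view.tag;
    [gesture.view removeFromSuperview]; // reveal the view underneath (tag - 1)
    if (tag == 0) {
        [self.navigationController popViewControllerAnimated:NO];
        return;
    }
    // videos were stored under (scene index + 1), so the key for the
    // newly exposed view (index tag - 1) is exactly `tag`
    NSURL *fileURL = [self.videoFiles objectForKey:@(tag)];
    if (fileURL != nil) {
        AVAsset *asset = [AVAsset assetWithURL:fileURL];
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        [self.avPlayer replaceCurrentItemWithPlayerItem:item];
        [self.avPlayer play];
    }
}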
Take a look at the way you're setting up the myCounter variable. It is set once, to the count of scenes minus 1, and never changes until a view is tapped.
In addition, look at the way you're setting the _avPlayer pointer. It is assigned over and over in the for loop, so you end up with a single pointer to whichever player was created last, when you'd want to be storing a reference for each scene instead.
Also, from Apple's documentation:
You can create arbitrary numbers of player layers with the same AVPlayer object. Only the most recently created player layer will actually display the video content on-screen.
So, since you're using the same AVPlayer object to create all of these player layers, you're never going to see more than one actual video layer work.
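If you do want a separate player per video despite the memory cost, a rough sketch of the loop body (videoPlayers is a hypothetical NSMutableDictionary property you'd add, keyed by subview tag):
// hypothetical property: @property (strong, nonatomic) NSMutableDictionary *videoPlayers;
AVPlayer *player = [AVPlayer playerWithURL:fileUrl];
player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.videoPlayers[@(i)] = player; // strong reference per scene, so nothing gets overridden
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = imageView.bounds;
[imageView.layer addSublayer:layer]; // each layer now has its own backing player
Then viewTapped: looks up self.videoPlayers[@(tag)] and plays that instance. The single-player item swap shown above is still the lighter option.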
Related
Right now I'm allocating and initializing three UIImageViews that take up the entire screen and are stacked, all in the viewDidLoad method. It's actually taking some time to do this. Is there a way to do this ahead of time, so the view already has them before it's even loaded, like an init method that would speed this up?
- (void)viewDidLoad {
[super viewDidLoad];
self.mySubviews = [[NSMutableArray alloc] init];
self.videoCounterTags = [[NSMutableArray alloc] init];
int c = (int)[self.scenes count];
c--;
NSLog(#"int c = %d", c);
self.myCounter = [NSNumber numberWithInt:c];
for (int i=0; i<=c; i++) {
//create imageView
UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[imageView setUserInteractionEnabled:YES]; // <--- This is very important
imageView.tag = i; // <--- Add tag to track this subview in the view stack
[self.view addSubview:imageView];
NSLog(#"added image view %d", i);
//get scene object
PFObject *sceneObject = self.scenes[i];
//get the PFFile and filetype
PFFile *file = [sceneObject objectForKey:@"file"];
NSString *fileType = [sceneObject objectForKey:@"fileType"];
//check the filetype
if ([fileType isEqual:@"image"])
{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
//get image
NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
dispatch_async(dispatch_get_main_queue(), ^{
imageView.image = [UIImage imageWithData:imageData];
});
});
}
//its a video
else
{
// the video player
NSURL *fileUrl = [NSURL URLWithString:file.url];
self.avPlayer = [AVPlayer playerWithURL:fileUrl];
self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
//self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[self.avPlayer currentItem]];
CGRect screenRect = [[UIScreen mainScreen] bounds];
self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
[imageView.layer addSublayer:self.avPlayerLayer];
NSNumber *tag = [NSNumber numberWithInt:i+1];
NSLog(#"tag = %#", tag);
[self.videoCounterTags addObject:tag];
}
}
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
// dummyScreen is just a see through view that sits on top of my image stack and holds my tap gesture recognizer
[self.view bringSubviewToFront:self.dummyScreen];
[self.dummyScreen addGestureRecognizer:tapGesture];
}
The problem is this line:
dispatch_async(dispatch_get_global_queue...
That moves you onto a background thread, and thus there is no telling when the code will be executed. Hence the delay.
If these are local files (i.e., the URL is the file URL of an image file in your app bundle), there is no need for any dispatch_async in your code. Remove all of that and do everything on the main thread. That way, it will happen as fast as possible.
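For the local case, a sketch of the synchronous version of the loop body (same variable names as the question's code):
// local file: no networking involved, fine to run directly on the main thread
NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:file.url]];
imageView.image = [UIImage imageWithData:imageData];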
If these are remote files (i.e., you have to do networking to get hold of them), then there's probably nothing you can do to speed things up; networking takes time, and viewDidLoad is just about as early as you can possibly be notified that it's time to get hold of the images.
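One thing you can still do in the remote case is start the downloads before this controller is on screen and hand the finished images in. A rough sketch, where prefetchScenes:completion: is a hypothetical helper called from the presenting controller:
// Hypothetical helper: kick off all downloads up front; results are keyed by
// scene index because the tasks complete in no particular order.
- (void)prefetchScenes:(NSArray *)scenes completion:(void (^)(NSDictionary *imagesByIndex))completion {
    NSMutableDictionary *images = [NSMutableDictionary dictionary];
    dispatch_group_t group = dispatch_group_create();
    [scenes enumerateObjectsUsingBlock:^(PFObject *sceneObject, NSUInteger i, BOOL *stop) {
        PFFile *file = [sceneObject objectForKey:@"file"];
        dispatch_group_enter(group);
        [[[NSURLSession sharedSession] dataTaskWithURL:[NSURL URLWithString:file.url]
                                     completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            UIImage *img = data ? [UIImage imageWithData:data] : nil;
            if (img) {
                @synchronized (images) { images[@(i)] = img; }
            }
            dispatch_group_leave(group);
        }] resume];
    }];
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        completion(images);
    });
}
viewDidLoad then only has to assign each imageView.image from the prefetched dictionary, which is fast.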
I am using MPMoviePlayer to display a video from an external URL in my iPhone app; however, when I run the app, a black screen is all that shows.
Here is the URL I am using:
2015-04-27 00:11:29.655 Floadt[21069:2598414] https://scontent.cdninstagram.com/hphotos-xaf1/t50.2886-16/11179443_819874424728492_389701720_n.mp4
Here is my code to try to setup MPMoviePlayer:
if (entry[#"videos"] != nil) {
NSLog(#"There is a Video: %#", entry[#"videos"]);
NSString *urlString = entry[#"videos"][#"standard_resolution"][#"url"];
NSLog(urlString);
NSURL *url = [NSURL URLWithString:urlString];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL: url];
[player prepareToPlay];
[player.view setFrame: CGRectMake(10, 65, 299, 299)];
[cell.contentView addSubview: player.view];
player.shouldAutoplay = YES;
[player play];
}
You need to retain your instance of MPMoviePlayerController, i.e. as a property or an instance variable. The reference to the movie player is lost if you do not retain it.
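A minimal sketch of that fix (the property name is illustrative):
// in the class extension: a strong reference keeps the player alive
@property (strong, nonatomic) MPMoviePlayerController *moviePlayer;
// then, instead of the local variable:
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
[self.moviePlayer prepareToPlay];
[self.moviePlayer.view setFrame:CGRectMake(10, 65, 299, 299)];
[cell.contentView addSubview:self.moviePlayer.view];
self.moviePlayer.shouldAutoplay = YES;
[self.moviePlayer play];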
When we try to load a video from a URL, initially it will display only a blank screen; MPMoviePlayerController takes some time to load the video from the URL, so we can display the first frame of the video until it loads. For this we need to import two frameworks:
1. AVFoundation
2. AssetsLibrary
Using these two we can display the first frame of the video in a UIImageView as follows:
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
url=[NSURL URLWithString:@"https://scontent.cdninstagram.com/hphotos-xaf1/t50.2886-16/11179443_819874424728492_389701720_n.mp4"];
AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
if ([[avAsset tracksWithMediaType:AVMediaTypeVideo] count] > 0)
{
AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:avAsset];
NSError *error;
CMTime actualTime;
// grab the first frame (kCMTimeZero) to show while the movie loads
CGImageRef firstFrame = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:&actualTime error:&error];
if (firstFrame != NULL)
{
NSString *actualTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
NSLog(@"Got first frame at %@", actualTimeString);
UIImage *img = [UIImage imageWithCGImage:firstFrame];
_imgVw.image = img;
CGImageRelease(firstFrame); // copyCGImageAtTime returns a +1 reference
}
}
UITapGestureRecognizer *tap=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapped)];
[_imgVw addGestureRecognizer:tap];
}
-(void)tapped
{
MPMoviePlayerController *movPlayer=[[MPMoviePlayerController alloc] init];
[movPlayer setContentURL:url];
[movPlayer setMovieSourceType:MPMovieSourceTypeFile];
[movPlayer.view setFrame:CGRectMake(0, 0, _imgVw.frame.size.width, 250)];
[movPlayer prepareToPlay];
movPlayer.controlStyle = MPMovieControlStyleNone;
movPlayer.fullscreen = NO;
movPlayer.shouldAutoplay=YES;
[movPlayer setScalingMode:MPMovieScalingModeAspectFill];
[_imgVw addSubview:movPlayer.view];
[movPlayer play];
}
Here I am using a UIImageView for playing the video. In viewDidLoad I load the first frame and add a tap gesture to the UIImageView. When the image view is tapped, I play the video.
I am trying to embed different videos from YouTube, Vimeo, and Dailymotion.
Sadly, at the moment nothing is shown except the background color of my containerView:
UIView *containerView = [[UIView alloc] initWithFrame:CGRectMake(0.0f, 0, 320.0f, 200.0f)];
//item.url is my url, which I get from my webserver; it looks like http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL fileURLWithPath:item.url]];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
NSLog(#"%#",playerItem);
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = self.frame;
[containerView.layer addSublayer:avPlayerLayer];
[self addSubview:containerView];
[avPlayer play];
if (avPlayer.status == AVPlayerStatusReadyToPlay) {
//[playingLbl setText:@"Playing Audio"];
NSLog(@"It works");
} else if (avPlayer.status == AVPlayerStatusFailed) {
// something went wrong. player.error should contain some information
NSLog(@"Not works");
NSLog(@"%@",avPlayer.error);
}
else if (avPlayer.status == AVPlayerItemStatusUnknown) {
NSLog(@"AVPlayer Unknown");
}
containerView.backgroundColor = [UIColor blueColor];
NSLog(@"error: %@", avPlayer.error);
NSLog(@"AVPlayer: %@", avPlayer);
The AVPlayer error is null, and the only log I ever get from the status is AVPlayerItemStatusUnknown. Any ideas?
EDIT 1:
I changed my code to:
@implementation VideoView
GFBlockVideo *list;
- (id)initWithBlock:(GFBlock *)block {
self = [super initWithBlock:block];
if (self) {
if (block.values && block.values.count) {
list = (GFBlockVideo *) [block.values objectAtIndex:0];
for (int i=0; i<list.videos.count; ++i) {
GFBlockVideoItem *item = list.videos[i];
UIView *containerView = [[UIView alloc] initWithFrame:CGRectMake(0.0f, 0, 320.0f, 200.0f)];
//Like I said, item.url = http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
//@property (nonatomic, strong) NSString* url;
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:item.url]];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = containerView.frame;
[containerView.layer addSublayer:avPlayerLayer];
[self addSubview:containerView];
[avPlayer play];
containerView.backgroundColor = [UIColor blueColor];
Sadly, the only thing I can see is the blue containerView :/
I think the problem is not the AVPlayer itself, but maybe the frames and the layer...
You'd have to do the following:
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:item.url]];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = self.frame;
[containerView.layer addSublayer:avPlayerLayer];
[self addSubview:containerView];
[avPlayer play];
Hope this helps.
Addendum:
It also depends on your URL; if as you said, you have one such as in this format:
http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
Then you should use this instead:
[NSURL URLWithString:item.url];
Given that item is your object and url is a property of it, of type NSString.
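One more caveat, as a hedge: AVPlayer can only play a URL that points at an actual media resource (an .mp4 file, an HLS stream, and so on). A YouTube watch page like the one above is an HTML document, not a video file, so even with the correct constructor the item status may never leave unknown. A small sketch of picking the constructor based on the string (the helper name is illustrative):
// Illustrative helper: web URLs need URLWithString:, local paths need fileURLWithPath:
static NSURL *PlayableURLFromString(NSString *urlString) {
    if ([urlString hasPrefix:@"http://"] || [urlString hasPrefix:@"https://"]) {
        return [NSURL URLWithString:urlString];
    }
    return [NSURL fileURLWithPath:urlString];
}
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:PlayableURLFromString(item.url)];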
AVPlayer implementation which is working for me:
MP4:
player = [AVPlayer playerWithURL:videoPathUrl];
AVcontroller = [[AVPlayerViewController alloc] init];
[AVcontroller.view setFrame:CGRectMake(0, 0,self.view.frame.size.width, self.view.frame.size.width)];
AVcontroller.player = player;
AVcontroller.showsPlaybackControls = NO;
[self addChildViewController:AVcontroller];
[self.view addSubview:AVcontroller.view];
[AVcontroller didMoveToParentViewController:self]; // complete the view controller containment
[player play];
MP3:
playerItem = [AVPlayerItem playerItemWithURL:url];
player = [AVPlayer playerWithPlayerItem:playerItem];
// or, equivalently, build the player straight from the URL:
// player = [AVPlayer playerWithURL:url];
[player play];
For getting a thumbnail from the video, try this:
AVAsset *asset = [AVAsset assetWithURL:videoPathUrl];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
float tempTime = CMTimeGetSeconds(player.currentItem.duration);
CMTime time = CMTimeMake(tempTime, 1); // (1,1)
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *aThumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // copyCGImageAtTime returns a +1 reference
To stop the video:
[player pause];
player = nil;
Setting the play rate/speed for the video:
[player play];
[player setRate:currentRate];
To play from the start:
[player seekToTime:kCMTimeZero];
[player play];
To check whether the video is playing or not:
if ((player.rate != 0) && (player.error == nil)) {
// playing
}
To seek ahead (e.g. 10 seconds past the current position):
float tempSeekTime = CMTimeGetSeconds(player.currentTime) + 10;
CMTime targetTime = CMTimeMakeWithSeconds(tempSeekTime, NSEC_PER_SEC);
[player seekToTime:targetTime];
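If the seek needs to be frame-accurate rather than snapping to the nearest keyframe, there is also a tolerance-based variant; a quick sketch:
// zero tolerance forces an exact (but slower) seek
[player seekToTime:targetTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];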
Use the requestPlayerItemForVideo:options:resultHandler: method of PHImageManager to acquire an AVPlayerItem; it is the simplest, sure-fire way to play an AVAsset, performing flawlessly and consistently.
I use it here:
https://youtu.be/7QlaO7WxjGg
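A minimal sketch of that call, assuming you already have a PHAsset in hand (e.g. from a PHFetchResult):
#import <Photos/Photos.h>
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES; // allow fetching from iCloud if needed
[[PHImageManager defaultManager] requestPlayerItemForVideo:asset
                                                   options:options
                                             resultHandler:^(AVPlayerItem *playerItem, NSDictionary *info) {
    // the handler is not guaranteed to run on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        // attach `player` to an AVPlayerLayer and play as usual
        [player play];
    });
}];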
I am trying to process frames of a video that is loaded from a file (not a camera stream).
To do that, I've loaded the video (currently a test mp4 that's in the bundle, although I plan to have this be an external url).
I've read through a lot of documentation and tutorials on AVFoundation and found a number of different suggestions, but none of them have worked. What I was hoping to find was a delegate method or a notification for when a frame changes, but I don't think these actually exist (possibly because they would be too slow to handle frame streams).
Here's what I have so far. I need the [self doImageProcess:] selector to run every time an image is rendered from a video stream. How can this be done?
(I will post fully working code once this is working properly)
#import <AVFoundation/AVFoundation.h>
@implementation VTViewController
#pragma mark - View lifecycle
- (void)viewDidLoad {
[super viewDidLoad];
// load mp4
NSURL *theURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"]];
AVAsset *avAsset = [AVAsset assetWithURL:theURL];
AVPlayerItem *avPlayerItem =[[AVPlayerItem alloc] initWithAsset:avAsset];
avPlayer = [[AVPlayer alloc] initWithPlayerItem:avPlayerItem];
CMTime interval = CMTimeMake(33, 1000); // 30fps
id playbackObserver = [avPlayer addPeriodicTimeObserverForInterval:interval queue:nil usingBlock: ^(CMTime time) {
// get image
[self doImageProcess:nil];
}];
img1 = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 480, 320)];
[self.view addSubview:img1];
avPlayerLayer =[AVPlayerLayer playerLayerWithPlayer:avPlayer];
[avPlayerLayer setFrame:CGRectMake(0, 0, 480, 320)];//self.view.frame];
[self.view.layer addSublayer:avPlayerLayer];
[avPlayer seekToTime:kCMTimeZero];
[avPlayer play];
}
- (void)doImageProcess:(UIImage *)theImage {
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, 1.0, 1.0);
[avPlayerLayer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[arrImages addObject:img];
[img1 removeFromSuperview];
img1 = nil;
img1 = [[UIImageView alloc] initWithImage:img];
[self.view addSubview:img1];
if ([arrImages count] > 4)
[arrImages removeObjectAtIndex:0];
}
The addPeriodicTimeObserverForInterval:queue:usingBlock: method of AVPlayer may help. It can execute a block of code at a particular time interval of playback.
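As a hedged sketch of how that is typically wired up: the observer token is retained so it can be removed later (self.timeObserver is a hypothetical property you'd add), and the pixels are pulled through an AVPlayerItemVideoOutput, since snapshotting an AVPlayerLayer with renderInContext: generally does not capture video frames:
#import <CoreImage/CoreImage.h>
// assumes avPlayer / avPlayerItem exist as in the question's viewDidLoad
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[avPlayerItem addOutput:output];
CMTime interval = CMTimeMake(1, 30); // fire roughly every 1/30th of a second
self.timeObserver = [avPlayer addPeriodicTimeObserverForInterval:interval
                                                           queue:dispatch_get_main_queue()
                                                      usingBlock:^(CMTime time) {
    if ([output hasNewPixelBufferForItemTime:time]) {
        CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
        if (buffer) {
            CIImage *ci = [CIImage imageWithCVPixelBuffer:buffer];
            UIImage *frame = [UIImage imageWithCIImage:ci];
            [self doImageProcess:frame]; // hand each decoded frame to the processing method
            CVBufferRelease(buffer);
        }
    }
}];
// when done: [avPlayer removeTimeObserver:self.timeObserver];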
My code is:
-(void)playMoviesForItems:(NSArray *)shopItems{
_moviePlayer = [self.storyboard instantiateViewControllerWithIdentifier:@"videoPlayerController"];
[_moviePlayer playMoviesForItems:shopItems];
[self presentViewController:_moviePlayer animated:NO completion:^{
[self.scrollView scrollRectToVisible:CGRectZero animated:YES];
}];
}
On the line that includes presentViewController, I am getting
EXC_BAD_ACCESS (code=1, address=.....)
I cannot find out why I am getting this. Where am I wrong? How can I debug it?
UPDATE:
Movie Player is declared like:
@property(nonatomic,retain) VideoPlayerViewController* moviePlayer;
And the code in VideoPlayerController is:
-(VideoPlayerViewController*)playMoviesForItems:(NSArray*)items{
playerItems = [[NSMutableArray alloc] init];
for (ShopItem* item in items) {
NSURL *url = [NSURL fileURLWithPath:item.localUrl];
AVPlayerItem *videoItem = [AVPlayerItem playerItemWithURL:url];
[playerItems addObject:videoItem];
}
self.mPlayer = [[AVQueuePlayer alloc] init];
self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:mPlayer];
self.playerLayer.bounds = self.view.bounds;
self.view.backgroundColor = [UIColor blackColor];
[mPlaybackView setPlayer:self.mPlayer];
currentIndex = 0;
[self initScrubberTimer];
[self playAtIndex:currentIndex];
mScrubber.maximumValue = CMTimeGetSeconds(self.mPlayer.currentItem.duration);
mScrubber.value = 0.0;
[self syncScrubber];
[self showOverlays];
return self;
}
Sometimes it works fine, but sometimes it breaks.