Black screen when attempting to use MPMoviePlayer - iOS

I am using MPMoviePlayerController to display a video from an external URL in my iPhone app, but when I run the app all that shows is a black screen.
Here is the URL I am using (taken from my log output):
https://scontent.cdninstagram.com/hphotos-xaf1/t50.2886-16/11179443_819874424728492_389701720_n.mp4
Here is the code where I set up the MPMoviePlayerController:
if (entry[#"videos"] != nil) {
NSLog(#"There is a Video: %#", entry[#"videos"]);
NSString *urlString = entry[#"videos"][#"standard_resolution"][#"url"];
NSLog(urlString);
NSURL *url = [NSURL URLWithString:urlString];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL: url];
[player prepareToPlay];
[player.view setFrame: CGRectMake(10, 65, 299, 299)];
[cell.contentView addSubview: player.view];
player.shouldAutoplay = YES;
[player play];
}

You need to retain your MPMoviePlayerController instance, e.g. as a property or an instance variable. If you don't keep a strong reference, the player is deallocated as soon as the enclosing scope ends, and all that remains on screen is its (now blank) view.
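A minimal sketch of that fix (the controller and property names here are illustrative, not from the original code):

@interface FeedViewController () // hypothetical controller class
@property (strong, nonatomic) MPMoviePlayerController *moviePlayer;
@end

// ...when configuring the cell:
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
[self.moviePlayer prepareToPlay];
[self.moviePlayer.view setFrame:CGRectMake(10, 65, 299, 299)];
[cell.contentView addSubview:self.moviePlayer.view];
self.moviePlayer.shouldAutoplay = YES;
[self.moviePlayer play];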

When the video is first loaded from a URL, MPMoviePlayerController initially shows only a blank screen, because it takes some time to load the video from the URL. So you can display the first frame of the video until the video loads. For this you need to import two frameworks:
1. AVFoundation
2. AssetsLibrary
Using these you can render the first frame of the video into a UIImageView as follows:
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    url = [NSURL URLWithString:@"https://scontent.cdninstagram.com/hphotos-xaf1/t50.2886-16/11179443_819874424728492_389701720_n.mp4"];
    AVAsset *avAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    if ([[avAsset tracksWithMediaType:AVMediaTypeVideo] count] > 0)
    {
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:avAsset];
        NSError *error;
        CMTime actualTime;
        // Grab the first frame; pass a midpoint CMTime instead if you
        // prefer a more representative poster frame.
        CGImageRef firstFrame = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:&actualTime error:&error];
        if (firstFrame != NULL)
        {
            NSString *actualTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
            NSLog(@"Got first frame at %@", actualTimeString);
            UIImage *img = [UIImage imageWithCGImage:firstFrame];
            CGImageRelease(firstFrame); // copyCGImageAtTime returns a +1 reference
            _imgVw.image = img;
        }
    }
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped)];
    [_imgVw addGestureRecognizer:tap];
}
- (void)tapped
{
    MPMoviePlayerController *movPlayer = [[MPMoviePlayerController alloc] init];
    [movPlayer setContentURL:url];
    // this is a remote URL, so use the streaming source type rather than file
    [movPlayer setMovieSourceType:MPMovieSourceTypeStreaming];
    [movPlayer.view setFrame:CGRectMake(0, 0, _imgVw.frame.size.width, 250)];
    [movPlayer prepareToPlay];
    movPlayer.controlStyle = MPMovieControlStyleNone;
    movPlayer.fullscreen = NO;
    movPlayer.shouldAutoplay = YES;
    [movPlayer setScalingMode:MPMovieScalingModeAspectFill];
    [_imgVw addSubview:movPlayer.view];
    [movPlayer play];
}
Here I am using a UIImageView to host the video. In viewDidLoad I load the first frame and attach a tap gesture to the image view; when the image view is tapped, I play the video.
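One caveat: movPlayer above is a local variable, so under ARC it is subject to the same premature-deallocation problem described in the first answer. A sketch of the fix is to hold it in a property instead:

@property (strong, nonatomic) MPMoviePlayerController *movPlayer;

// then in -tapped, assign to the property rather than a local:
self.movPlayer = [[MPMoviePlayerController alloc] init];
[self.movPlayer setContentURL:url];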

Related

How do I place a label over an AVPlayer playing a video?

How do I place a label over or "on top of" an AVPlayer playing a video? I have tried adjusting the z-position as suggested in another SO post, but it is not working; the video player sits in front of all the labels.
//this is my label
self.commentLabel.layer.masksToBounds = YES;
self.commentLabel.layer.cornerRadius = 15.0;
[self.commentLabel sizeToFit];
self.commentLabel.center = CGPointMake(self.view.frame.size.width/2,self.view.frame.size.height/1.4);
self.commentLabel.layer.zPosition=100;
//this is my video player
self.playerViewController = [[AVPlayerViewController alloc] init];
NSString *str3 = [self.FRIENDDATA stringByReplacingOccurrencesOfString:@" " withString:@"_"];
NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"http://my.website.com/%@/%@.mp4", str3, self.RANDOMDATA]];
AVURLAsset *asset = [AVURLAsset assetWithURL: url];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset: asset];
AVPlayer * player = [[AVPlayer alloc] initWithPlayerItem: item];
self.playerViewController.player = player;
[self.playerViewController.view setFrame:self.referenceImageView.frame];
self.playerViewController.view.layer.zPosition=0;
self.playerViewController.showsPlaybackControls = YES;
[self.view addSubview:self.playerViewController.view];
Have you tried adding the label as a subview of the playerViewController's view?
[self.playerViewController.view addSubview:self.commentLabel];
From the code it is unclear whether commentLabel is initialized somewhere else in the code or from a storyboard. It would be better to create the label and add it to the playerViewController directly, like this:
UILabel *commentLabel = [[UILabel alloc] initWithFrame:CGRectMake(self.view.frame.size.width/2, self.view.frame.size.height/1.4, 100, 100)];
[self.playerViewController.view addSubview:commentLabel];
That should resolve your issue.
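As an alternative sketch: AVPlayerViewController also exposes contentOverlayView, a view layered between the video content and the playback controls, which is designed for exactly this kind of overlay:

// contentOverlayView sits between the video and the playback controls
[self.playerViewController.contentOverlayView addSubview:self.commentLabel];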

Playing stacked videos

I have multiple image-view subviews getting stacked based on my incoming data. Each of these subviews is set to either an image or a video layer. The problem I have is playing the videos: I can play the first video in the stack, but every video after that plays only the sound of the first video. How can I play each one correctly?
The views are navigated with a tap event, like Snapchat. See below:
@interface SceneImageViewController ()
@property (strong, nonatomic) NSURL *videoUrl;
@property (strong, nonatomic) AVPlayer *avPlayer;
@property (strong, nonatomic) AVPlayerLayer *avPlayerLayer;
@end

@implementation SceneImageViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.mySubviews = [[NSMutableArray alloc] init];
    self.videoCounterTags = [[NSMutableArray alloc] init];
    int c = (int)[self.scenes count];
    c--;
    NSLog(@"int c = %d", c);
    self.myCounter = [NSNumber numberWithInt:c];

    for (int i = 0; i <= c; i++) {
        //create imageView
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
        [imageView setUserInteractionEnabled:YES]; // <--- This is very important
        imageView.tag = i; // <--- Add tag to track this subview in the view stack
        [self.view addSubview:imageView];
        NSLog(@"added image view %d", i);

        //get scene object
        PFObject *sceneObject = self.scenes[i];

        //get the PFFile and filetype
        PFFile *file = [sceneObject objectForKey:@"file"];
        NSString *fileType = [sceneObject objectForKey:@"fileType"];

        //check the filetype
        if ([fileType isEqual:@"image"])
        {
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
                //get image
                NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
                NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
                dispatch_async(dispatch_get_main_queue(), ^{
                    imageView.image = [UIImage imageWithData:imageData];
                });
            });
        }
        //its a video
        else
        {
            // the video player
            NSURL *fileUrl = [NSURL URLWithString:file.url];
            self.avPlayer = [AVPlayer playerWithURL:fileUrl];
            self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
            self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
            //self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

            [[NSNotificationCenter defaultCenter] addObserver:self
                                                     selector:@selector(playerItemDidReachEnd:)
                                                         name:AVPlayerItemDidPlayToEndTimeNotification
                                                       object:[self.avPlayer currentItem]];

            CGRect screenRect = [[UIScreen mainScreen] bounds];
            self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
            [imageView.layer addSublayer:self.avPlayerLayer];

            NSNumber *tag = [NSNumber numberWithInt:i+1];
            NSLog(@"tag = %@", tag);
            [self.videoCounterTags addObject:tag];
            //[self.avPlayer play];
        }
    }

    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
    [self.view bringSubviewToFront:self.screen];
    [self.screen addGestureRecognizer:tapGesture];
}

- (void)viewTapped:(UIGestureRecognizer *)gesture {
    NSLog(@"touch!");
    [self.avPlayer pause];
    int i = [self.myCounter intValue];
    NSLog(@"counter = %d", i);
    for (UIImageView *subview in [self.view subviews]) {
        if (subview.tag == i) {
            [subview removeFromSuperview];
        }
    }
    if ([self.videoCounterTags containsObject:self.myCounter]) {
        NSLog(@"play video!!!");
        [self.avPlayer play];
    }
    if (i == 0) {
        [self.avPlayer pause];
        [self.navigationController popViewControllerAnimated:NO];
    }
    i--;
    self.myCounter = [NSNumber numberWithInt:i];
    NSLog(@"counter after = %d", i);
}
What Brooks Hanes said is correct: you keep overriding the avPlayer.
Here is what I suggest you do.
Add the tap gesture to the imageView instead of the screen (or, for a cleaner approach, use a UIButton instead):
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[imageView setUserInteractionEnabled:YES]; // <--- This is very important
imageView.tag = i; // <--- Add tag to track this subview in the view stack
[self.view addSubview:imageView];
NSLog(@"added image view %d", i);
// the target is the view controller, which implements viewTapped:
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
[imageView addGestureRecognizer:tapGesture];
This way, in your viewTapped: method you can get the tag of the tapped image via gesture.view.tag instead of using myCounter.
To get the videos working you could create a new AVPlayer for each video, but that could get quite expensive memory-wise. A better approach is to keep a single AVPlayer and swap its AVPlayerItem when changing videos.
So in the for loop do something like this, where self.videoFiles is an NSMutableDictionary property:
// the video player
NSNumber *tag = [NSNumber numberWithInt:i+1];
NSURL *fileUrl = [NSURL URLWithString:file.url];
// save your video file url paired with the tag of the imageView it belongs to
[self.videoFiles setObject:fileUrl forKey:tag];
// you only need to initialize the player once
if (self.avPlayer == nil) {
    AVAsset *asset = [AVAsset assetWithURL:fileUrl];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
    self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerItemDidReachEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:[self.avPlayer currentItem]];
}
// you don't need to keep the layer as a property
// (unless you need it for some reason)
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
CGRect screenRect = [[UIScreen mainScreen] bounds];
avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
[imageView.layer addSublayer:avPlayerLayer];
NSLog(@"tag = %@", tag);
[self.videoCounterTags addObject:tag];
Now in your viewTapped: (note that a view's tag is an NSInteger, so it must be boxed with @( ) before being compared against the stored NSNumbers):
if ([self.videoCounterTags containsObject:@(gesture.view.tag)]) {
    NSLog(@"play video!!!");
    AVAsset *asset = [AVAsset assetWithURL:[self.videoFiles objectForKey:@(gesture.view.tag)]];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:item];
    [self.avPlayer play];
}
Or use self.videoFiles directly, and then you don't need self.videoCounterTags at all:
NSURL *fileURL = [self.videoFiles objectForKey:@(gesture.view.tag)];
if (fileURL != nil) {
    NSLog(@"play video!!!");
    AVAsset *asset = [AVAsset assetWithURL:fileURL];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:item];
    [self.avPlayer play];
}
That's the gist of it.
Take a look at the way you're setting up the myCounter variable: it is set once and never changes until a view is tapped, at which point it is set to the count of scenes minus one.
In addition, look at how you assign the _avPlayer pointer. It is overwritten on every pass through the for loop, so instead of storing a reference per video you end up repeatedly pointing the same variable at the latest scene in the collection.
Also, from Apple's documentation:
You can create arbitrary numbers of player layers with the same AVPlayer object. Only the most recently created player layer will actually display the video content on-screen.
So, since you're using the same AVPlayer object to create all of these player layers, it's likely you will never see more than one video layer actually display anything.
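In other words, each visible video needs its own player behind its layer (or a single shared player whose item you swap, as the previous answer sketches). A minimal sketch of the per-view variant, with illustrative variable names:

// one AVPlayer per visible video; a single AVPlayer only renders
// into its most recently created AVPlayerLayer
AVPlayer *scenePlayer = [AVPlayer playerWithURL:fileUrl];
AVPlayerLayer *sceneLayer = [AVPlayerLayer playerLayerWithPlayer:scenePlayer];
sceneLayer.frame = imageView.bounds;
[imageView.layer addSublayer:sceneLayer];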

Run multiple video file in objective c

There are two video files. When I try to view them:
1) Only the last video can be played, seen in fullscreen mode, and minimised when the Done button is tapped; the first video cannot.
2) After a short time the first video also shows only a black screen.
- (void)viewDidLoad
{
    [super viewDidLoad];
    MPMoviePlayerController *moviePlayer;
    NSArray *filename = @[@"nissan1", @"nissan5"]; // nissan1, nissan5 are mp4 files
    NSURL *fileURL1 = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                       pathForResource:[filename objectAtIndex:0] ofType:@"mp4"]];
    moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:fileURL1];
    [moviePlayer.view setFrame:CGRectMake(5, 50, 100, 100)];
    moviePlayer.shouldAutoplay = NO;
    moviePlayer.repeatMode = MPMovieRepeatModeOne;
    moviePlayer.initialPlaybackTime = -1.0;
    moviePlayer.movieSourceType = MPMovieSourceTypeFile;
    [moviePlayer prepareToPlay];
    [self.view addSubview:moviePlayer.view];

    /* second video file */
    NSURL *fileURL2 = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                       pathForResource:[filename objectAtIndex:1] ofType:@"mp4"]];
    moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:fileURL2];
    [moviePlayer.view setFrame:CGRectMake(200, 50, 100, 100)];
    moviePlayer.shouldAutoplay = NO;
    moviePlayer.repeatMode = MPMovieRepeatModeNone;
    moviePlayer.initialPlaybackTime = -1.0;
    moviePlayer.movieSourceType = MPMovieSourceTypeFile;
    [moviePlayer prepareToPlay];
    [self.view addSubview:moviePlayer.view];
}
You should use two instances. As Cristik pointed out, because you reuse the same variable, ARC targets the first player for deallocation. Since you asked for sample code, try this:
- (void)viewDidLoad
{
    [super viewDidLoad];
    MPMoviePlayerController *moviePlayer1;
    MPMoviePlayerController *moviePlayer2;
    // filename is an array of file names
    NSURL *fileURL1 = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                       pathForResource:[filename objectAtIndex:0] ofType:@"mp4"]];
    moviePlayer1 = [[MPMoviePlayerController alloc] initWithContentURL:fileURL1];
    [moviePlayer1.view setFrame:CGRectMake(5, 50, 100, 100)];
    moviePlayer1.shouldAutoplay = NO;
    moviePlayer1.repeatMode = MPMovieRepeatModeOne;
    moviePlayer1.initialPlaybackTime = -1.0;
    moviePlayer1.movieSourceType = MPMovieSourceTypeFile;
    [moviePlayer1 prepareToPlay];
    [self.view addSubview:moviePlayer1.view];

    /* second video file */
    NSURL *fileURL2 = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                       pathForResource:[filename objectAtIndex:1] ofType:@"mp4"]];
    moviePlayer2 = [[MPMoviePlayerController alloc] initWithContentURL:fileURL2];
    [moviePlayer2.view setFrame:CGRectMake(200, 50, 100, 100)];
    moviePlayer2.shouldAutoplay = NO;
    moviePlayer2.repeatMode = MPMovieRepeatModeNone;
    moviePlayer2.initialPlaybackTime = -1.0;
    moviePlayer2.movieSourceType = MPMovieSourceTypeFile;
    [moviePlayer2 prepareToPlay];
    [self.view addSubview:moviePlayer2.view];
}
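One caveat with this sketch: moviePlayer1 and moviePlayer2 are still local variables, so for the same ARC reason they should really be promoted to properties so they outlive viewDidLoad:

@interface ViewController () // your actual controller class
@property (strong, nonatomic) MPMoviePlayerController *moviePlayer1;
@property (strong, nonatomic) MPMoviePlayerController *moviePlayer2;
@end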

AVPlayer doesn't show anything

I am trying to embed videos from YouTube, Vimeo, and Dailymotion.
Sadly, at the moment nothing is shown except the background color of my containerView:
UIView *containerView = [[UIView alloc] initWithFrame:CGRectMake(0.0f, 0, 320.0f, 200.0f)];
// item.url is the url I get from my webserver; it looks like
// http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL fileURLWithPath:item.url]];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
NSLog(@"%@", playerItem);
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = self.frame;
[containerView.layer addSublayer:avPlayerLayer];
[self addSubview:containerView];
[avPlayer play];
if (avPlayer.status == AVPlayerStatusReadyToPlay) {
    //[playingLbl setText:@"Playing Audio"];
    NSLog(@"It works");
} else if (avPlayer.status == AVPlayerStatusFailed) {
    // something went wrong. player.error should contain some information
    NSLog(@"Not works");
    NSLog(@"%@", avPlayer.error);
} else if (avPlayer.status == AVPlayerStatusUnknown) {
    NSLog(@"AVPlayer Unknown");
}
containerView.backgroundColor = [UIColor blueColor];
NSLog(@"error: %@", avPlayer.error);
NSLog(@"AVPlayer: %@", avPlayer);
The AVPlayer error is null, and the only log I ever get from the status check is "AVPlayer Unknown". Any ideas?
EDIT 1:
I changed my code to:
@implementation VideoView

GFBlockVideo *list;

- (id)initWithBlock:(GFBlock *)block {
    self = [super initWithBlock:block];
    if (self) {
        if (block.values && block.values.count) {
            list = (GFBlockVideo *)[block.values objectAtIndex:0];
            for (int i = 0; i < list.videos.count; ++i) {
                GFBlockVideoItem *item = list.videos[i];
                UIView *containerView = [[UIView alloc] initWithFrame:CGRectMake(0.0f, 0, 320.0f, 200.0f)];
                // Like I said, item.url = http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
                // @property (nonatomic, strong) NSString* url;
                AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:item.url]];
                AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
                AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
                AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
                avPlayerLayer.frame = containerView.frame;
                [containerView.layer addSublayer:avPlayerLayer];
                [self addSubview:containerView];
                [avPlayer play];
                containerView.backgroundColor = [UIColor blueColor];
            }
        }
    }
    return self;
}
Sadly, the only thing I can see is the blue containerView :/
I think the problem is not the AVPlayer itself, but maybe the frames and the layer...
You'd have to do the following:
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:item.url]];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
avPlayerLayer.frame = self.frame;
[containerView.layer addSublayer:avPlayerLayer];
[self addSubview:containerView];
[avPlayer play];
Hope this helps.
Addendum:
It also depends on your URL; if as you said, you have one such as in this format:
http://www.youtube.com/watch?v=zPP6lXaL7KA&feature=youtube_gdata_player
Then you should use this instead (note the capitalization; the method is URLWithString:):
[NSURL URLWithString:item.url];
given that item is your object and url is a property on it of type NSString.
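Two further notes. First, a YouTube watch?v=... URL points at an HTML page, not at a media file, so AVPlayer cannot play it directly even with a correctly constructed NSURL; you need a direct .mp4 or .m3u8 URL. Second, the status is populated asynchronously, which is why it reads as Unknown immediately after creation; a sketch of checking it properly with key-value observing:

// observe the item's status instead of polling it right after creation
[playerItem addObserver:self
             forKeyPath:@"status"
                options:NSKeyValueObservingOptionNew
                context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *item = (AVPlayerItem *)object;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSLog(@"ready to play");
    } else if (item.status == AVPlayerItemStatusFailed) {
        NSLog(@"failed: %@", item.error); // inspect the underlying error
    }
}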
An AVPlayer implementation that works for me:
MP4:
player = [AVPlayer playerWithURL:videoPathUrl];
AVcontroller = [[AVPlayerViewController alloc] init];
[AVcontroller.view setFrame:CGRectMake(0, 0,self.view.frame.size.width, self.view.frame.size.width)];
AVcontroller.player = player;
AVcontroller.showsPlaybackControls = FALSE;
[self addChildViewController:AVcontroller];
[self.view addSubview:AVcontroller.view];
[player play];
MP3 :
playerItem = [AVPlayerItem playerItemWithURL:url];
player = [AVPlayer playerWithPlayerItem:playerItem];
// or, equivalently, build the player straight from the URL:
player = [AVPlayer playerWithURL:url];
[player play];
For getting a thumbnail from a video, try this:
AVAsset *asset = [AVAsset assetWithURL:videoPathUrl];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
float tempTime = CMTimeGetSeconds(player.currentItem.duration);
CMTime time = CMTimeMake(tempTime, 1); // the capture time, in seconds
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *aThumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // copyCGImageAtTime returns a +1 reference
For stopping the video:
[player pause];
player = nil;
Setting the play rate/speed for the video:
[player play];
[player setRate:currentRate];
Playing from the start:
[player seekToTime:kCMTimeZero];
[player play];
For checking whether the video is playing:
if ((player.rate != 0) && (player.error == nil)) {
    // playing
}
For seeking the video (some duration ahead of the current position; note the offset is applied to currentTime, not the duration):
float tempSeekTime = CMTimeGetSeconds(player.currentTime) + 10;
CMTime targetTime = CMTimeMakeWithSeconds(tempSeekTime, NSEC_PER_SEC);
[player seekToTime:targetTime];
Use the requestPlayerItemForVideo method of PHImageManager to acquire an AVPlayerItem; it is the simplest, sure-fire way to play an AVAsset, performing flawlessly and consistently.
I use it here:
https://youtu.be/7QlaO7WxjGg
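A minimal sketch of that approach; it applies to videos in the user's photo library, and phAsset here stands for a PHAsset you have already fetched:

#import <Photos/Photos.h>

[[PHImageManager defaultManager]
    requestPlayerItemForVideo:phAsset   // a previously fetched PHAsset
                      options:nil
                resultHandler:^(AVPlayerItem *playerItem, NSDictionary *info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        // hand the player to an AVPlayerLayer or AVPlayerViewController
        [player play];
    });
}];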

How to get a notification when frames are being played of a video file, using iOS AVFoundation?

I am trying to process frames of a video that is loaded from a file (not a camera stream).
To do that, I've loaded the video (currently a test mp4 that's in the bundle, although I plan to have this be an external url).
I've read through a lot of documentation and tutorials on AVFoundation and found a number of different suggestions, but none of them worked. I was hoping to find a delegate method or a notification fired when a frame changes, but I don't think these actually exist (possibly because they would be too slow to handle frame streams).
Here's what I have so far. I need the [self doImageProcess:] selector to run every time a frame is rendered from the video stream. How can this be done?
(I will post fully working code once this is working properly)
#import <AVFoundation/AVFoundation.h>

@implementation VTViewController

#pragma mark - View lifecycle

- (void)viewDidLoad {
    [super viewDidLoad];
    // load mp4
    NSURL *theURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"mp4"]];
    AVAsset *avAsset = [AVAsset assetWithURL:theURL];
    AVPlayerItem *avPlayerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
    avPlayer = [[AVPlayer alloc] initWithPlayerItem:avPlayerItem];

    CMTime interval = CMTimeMake(33, 1000); // roughly one tick per frame at 30 fps
    id playbackObserver = [avPlayer addPeriodicTimeObserverForInterval:interval queue:nil usingBlock:^(CMTime time) {
        // get image
        [self doImageProcess:nil];
    }];

    img1 = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 480, 320)];
    [self.view addSubview:img1];

    avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    [avPlayerLayer setFrame:CGRectMake(0, 0, 480, 320)]; //self.view.frame];
    [self.view.layer addSublayer:avPlayerLayer];

    [avPlayer seekToTime:kCMTimeZero];
    [avPlayer play];
}
- (void)doImageProcess:(UIImage *)theImage {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, 1.0, 1.0);
    [avPlayerLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [arrImages addObject:img];

    [img1 removeFromSuperview];
    img1 = nil;
    img1 = [[UIImageView alloc] initWithImage:img];
    [self.view addSubview:img1];

    if ([arrImages count] > 4)
        [arrImages removeObjectAtIndex:0];
}
The addPeriodicTimeObserverForInterval:queue:usingBlock: method of AVPlayer may help; it executes a block of code at each specified interval of playback time.
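Note that the periodic observer only tells you that playback time advanced; snapshotting an AVPlayerLayer with renderInContext: will not reliably capture video frames. If you need the actual pixels of each frame, AVPlayerItemVideoOutput is the supported route. A rough sketch, reusing the avPlayerItem variable from the question's code:

// attach an output to the item so frames can be copied out
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:nil];
[avPlayerItem addOutput:videoOutput];

// then poll from a CADisplayLink callback:
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    if (pixelBuffer) {
        // ...process the frame here...
        CVBufferRelease(pixelBuffer);
    }
}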
