AVPlayer does not show video - iOS

I am trying to play a movie at the beginning of my game, using AVPlayer. My problem is that when I register a KVO observer to check the status of my AVPlayer, my game proceeds as usual without waiting for the video to load and finish. As a result, I can only hear the audio from my .mov file and can't see any video (since my game has already started).
I would like the video to load and finish before proceeding with the game.
Here's the code:
@interface RMVideoView : NSView
{
    NSURL* _videoURL;
    AVPlayer* _player;
    AVPlayerLayer* _playerLayer;
}
@property (nonatomic, readonly, strong) AVPlayer* player;
@property (nonatomic, readonly, strong) AVPlayerLayer* playerLayer;
@property (nonatomic, retain) NSURL* videoURL;
- (void) play;
@end
static void *RMVideoViewPlayerLayerReadyForDisplay = &RMVideoViewPlayerLayerReadyForDisplay;
static void *RMVideoViewPlayerItemStatusContext = &RMVideoViewPlayerItemStatusContext;
@interface RMVideoView()
- (void)onError:(NSError*)error;
- (void)onReadyToPlay;
- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys;
@end
@implementation RMVideoView
@synthesize player = _player;
@synthesize playerLayer = _playerLayer;
@synthesize videoURL = _videoURL;

- (id)initWithFrame:(NSRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        self.wantsLayer = YES;
        _player = [[AVPlayer alloc] init];
        [self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:RMVideoViewPlayerItemStatusContext];
    }
    return self;
}

- (void) setVideoURL:(NSURL *)videoURL {
    _videoURL = videoURL;
    [self.player pause];
    [self.playerLayer removeFromSuperlayer];
    AVAsset *asset = [AVAsset assetWithURL:self.videoURL];
    NSArray *assetKeysToLoadAndTest = [NSArray arrayWithObjects:@"playable", @"hasProtectedContent", @"tracks", @"duration", nil];
    [asset loadValuesAsynchronouslyForKeys:assetKeysToLoadAndTest completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setUpPlaybackOfAsset:asset withKeys:assetKeysToLoadAndTest];
        });
    }];
}
#pragma mark - KVO
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == RMVideoViewPlayerItemStatusContext)
    {
        AVPlayerItemStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        switch (status)
        {
            case AVPlayerItemStatusUnknown:
                break;
            case AVPlayerItemStatusReadyToPlay:
                [self onReadyToPlay];
                break;
            case AVPlayerItemStatusFailed:
                [self onError:nil];
                break;
        }
    }
    else if (context == RMVideoViewPlayerLayerReadyForDisplay)
    {
        if ([[change objectForKey:NSKeyValueChangeNewKey] boolValue])
        {
            self.playerLayer.hidden = NO;
        }
    }
    else
    {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
#pragma mark - Private
- (void)onError:(NSError*)error {
    // Notify delegate
}

- (void)onReadyToPlay {
    // Notify delegate
    [self.player play];
}

- (void)setUpPlaybackOfAsset:(AVAsset *)asset withKeys:(NSArray *)keys {
    for (NSString *key in keys) {
        NSError *error = nil;
        if ([asset statusOfValueForKey:key error:&error] == AVKeyValueStatusFailed) {
            [self onError:error];
            return;
        }
    }
    if (!asset.isPlayable || asset.hasProtectedContent) {
        [self onError:nil];
        return;
    }
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0)
    { // Asset has video tracks
        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        self.playerLayer.frame = self.layer.bounds;
        self.playerLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
        self.playerLayer.hidden = NO;
        [self.layer addSublayer:self.playerLayer];
        [self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:RMVideoViewPlayerLayerReadyForDisplay];
    }
    // Create a new AVPlayerItem and make it our player's current item.
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.player replaceCurrentItemWithPlayerItem:playerItem];
}

#pragma mark - Public
- (void) play {
    [self.player play];
}
@end
I am calling the above code from my entry function's -(void)drawView method this way:
-(void)drawView
{
    if(playVideo)
    {
        RMVideoView *rmVid = [[RMVideoView alloc] init];
        NSURL* MovieURL;
        NSBundle *bundle = [NSBundle mainBundle];
        if(bundle != nil)
        {
            NSString *moviePath = [bundle pathForResource:@"MyVideoResource" ofType:@"mov"];
            if (moviePath)
            {
                MovieURL = [NSURL fileURLWithPath:moviePath];
                [rmVid setVideoURL:MovieURL];
            }
        }
        playVideo = kFalse;
    }
}
The call to [rmVid setVideoURL:MovieURL] returns as soon as the KVO observer is set up, and the game runs forward.
Please help!

You can listen for a notification when the video playback reaches the end, like this:
AVPlayerItem *avPlayerItem = [[AVPlayerItem alloc] initWithAsset:avAsset];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachedEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:avPlayerItem];
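The matching handler would look something like the sketch below. The selector name matches the registration above; what you actually do when the movie ends (for example, tearing down the player view and starting the game) is application-specific and only an assumption here:
- (void)playerItemDidReachedEnd:(NSNotification *)notification
{
    AVPlayerItem *item = [notification object];
    // The movie has finished playing: remove the video view and start the
    // game here (placeholder - replace with your own logic).
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:item];
}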

Related

After AVPlayer's first failure, I can never receive KVO callbacks any more

As described in the title, if I start the app, first turn off Wi-Fi, and then try to play music with AVPlayer, I receive AVPlayerItemStatusFailed, but after that I can no longer receive any KVO callbacks, even if I turn Wi-Fi back on.
In the other situation, if I run the app with Wi-Fi on and play, I receive AVPlayerItemStatusReadyToPlay, and after that I continue to receive KVO callbacks even if I turn Wi-Fi off.
So how do I fix the first situation?
@interface ViewController ()
@property (strong, nonatomic) AVPlayer *player;
@property (assign, nonatomic) NSInteger index;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (IBAction)buttonPressed:(UIButton *)sender {
    [self.player.currentItem removeObserver:self forKeyPath:@"status"];
    NSLog(@"removed %@", self.player.currentItem);
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:self.urls[self.index]]];
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    NSLog(@"added %@", item);
    [self.player replaceCurrentItemWithPlayerItem:item];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        if ([object isMemberOfClass:[AVPlayerItem class]]) {
            AVPlayerItem *playerItem = (AVPlayerItem *)object;
            switch (playerItem.status) {
                case AVPlayerItemStatusFailed:
                    NSLog(@"failed");
                    break;
                case AVPlayerItemStatusReadyToPlay:
                    NSLog(@"ready");
                    break;
                case AVPlayerItemStatusUnknown:
                    NSLog(@"unknown");
                    break;
            }
        }
    }
}

- (AVPlayer *)player {
    if (!_player) {
        _player = [[AVPlayer alloc] init];
    }
    return _player;
}

- (NSArray *)urls {
    return @[@"https://of92d29bn.qnssl.com/record1.m4a", @"https://of92d29bn.qnssl.com/record2.m4a"];
}

- (NSInteger)index {
    if (_index == 0) {
        _index = 1;
    } else {
        _index = 0;
    }
    return _index;
}
@end
This is all my code.
I had the same issue and found a solution:
Reset current item to nil before using the new one:
[self.player replaceCurrentItemWithPlayerItem:nil];
[self.player replaceCurrentItemWithPlayerItem:item];
This doesn't make much sense, and there's probably a bug in AVPlayer's implementation somewhere. But this worked for me.
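Applied to the buttonPressed: method above, that would look roughly like this (a sketch; the observer bookkeeping is kept exactly as in the question, only the workaround line is added):
- (IBAction)buttonPressed:(UIButton *)sender {
    [self.player.currentItem removeObserver:self forKeyPath:@"status"];

    AVPlayerItem *item = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:self.urls[self.index]]];
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

    // Workaround: detach the failed item first, then install the new one.
    [self.player replaceCurrentItemWithPlayerItem:nil];
    [self.player replaceCurrentItemWithPlayerItem:item];
}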

AVPlayer starts playing a video only after a long delay

The delay is about 10 seconds.
Here is a snippet of my code, from the *.m file of HomeController:
@interface HomeController ()
@property(nonatomic, strong) AVPlayerViewController *playerViewController;
@end

@implementation HomeController

- (void)viewDidLoad
{
    [super viewDidLoad];
    ...
    self.playerViewController = [[AVPlayerViewController alloc] init];
}

- (IBAction)watchDemoToggle:(id)sender {
    NSURL *url = [NSURL URLWithString:@"http://blahblahblah.com/demo.mp4"];
    AVURLAsset *asset = [AVURLAsset assetWithURL:url];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    if (playerItem)
    {
        AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
        self.playerViewController.player = player;
        [self.playerViewController.player addObserver:self forKeyPath:@"status" options:0 context:nil];
        [self.playerViewController setShowsPlaybackControls:NO];
        [self presentViewController:self.playerViewController animated:YES completion:^{
            [self.playerViewController.player play];
            self.playerViewController.showsPlaybackControls = YES;
        }];
    }
}

#pragma mark - KVO
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [change[NSKeyValueChangeNewKey] integerValue];
        NSLog(@"status = %ld", (long)status);
    }
}
I added KVO and found that after tapping play, the player status is AVPlayerStatusUnknown. After that it never changes, even though the video is playing.
Does anyone have any idea?
Set the automaticallyWaitsToMinimizeStalling property of AVPlayer to NO in order to start playback immediately:
AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
player.automaticallyWaitsToMinimizeStalling = NO;
But if sufficient content is not available for playback, the player might stall.
For more details, please refer to the Apple documentation.
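Note that automaticallyWaitsToMinimizeStalling exists only on iOS 10 and later. If playback still starts late, logging the player's timeControlStatus and reasonForWaitingToPlay (also iOS 10+ AVPlayer properties) can show what it is waiting for; the snippet below is only a diagnostic sketch:
if (@available(iOS 10.0, *)) {
    player.automaticallyWaitsToMinimizeStalling = NO;
    // If the player still does not start, check what it is waiting for:
    NSLog(@"timeControlStatus = %ld, reason = %@",
          (long)player.timeControlStatus, player.reasonForWaitingToPlay);
}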

Playing continuous audio on iPhone

I have this piece of code for playing audio. Once playback finishes, I want to play the same audio again and again. I think I should use numberOfLoops = -1, but I'm not sure where to use it. Please help me.
#import "JetNapMusicPlayer.h"
#import <AVFoundation/AVFoundation.h>
#interface JetNapMusicPlayer()
#property(nonatomic,strong) AVQueuePlayer *avQueuePlayer;
#end
static JetNapMusicPlayer *sharedManager = nil;
#implementation JetNapMusicPlaye
#pragma mark Singleton Methods
+ (id)sharedManager {
#synchronized(self) {
if(sharedManager == nil)
sharedManager = [[super alloc] init];
}
return sharedManager;
}
- (id)init {
if (self = [super init]) {
// [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
MPRemoteCommandCenter *rcc = [MPRemoteCommandCenter sharedCommandCenter];
MPRemoteCommand *playCommand = rcc.playCommand;
[playCommand setEnabled:YES];
[playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
[(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] play];
return MPRemoteCommandHandlerStatusSuccess;
}];
MPRemoteCommand *pauseCommand = rcc.pauseCommand;
[pauseCommand setEnabled:YES];
[pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
[(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] pause];
return MPRemoteCommandHandlerStatusSuccess;
}];
}
return self;
}
- (void)dealloc {
[super dealloc];
}
- (AVQueuePlayer *)avQueuePlayer
{
    if (!_avQueuePlayer) {
        [self initSession];
        _avQueuePlayer = [[AVQueuePlayer alloc] init];
    }
    return _avQueuePlayer;
}

- (void)initSession
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(audioSessionInterrupted:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:[AVAudioSession sharedInstance]];
    // Set the audio category with options - for this demo we'll do playback only
    NSError *categoryError = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&categoryError];
    if (categoryError) {
        NSLog(@"Error setting category! %@", [categoryError description]);
    }
    // Activate the audio session
    NSError *activationError = nil;
    BOOL success = [[AVAudioSession sharedInstance] setActive:YES error:&activationError];
    if (!success) {
        if (activationError) {
            NSLog(@"Could not activate audio session. %@", [activationError localizedDescription]);
        } else {
            NSLog(@"audio session could not be activated!");
        }
    }
}
#pragma mark - notifications
- (void)audioSessionInterrupted:(NSNotification *)interruptionNotification
{
    NSLog(@"interruption received: %@", interruptionNotification);
}

#pragma mark - player actions
- (void)pause
{
    [[self avQueuePlayer] pause];
}

- (void)play
{
    [[self avQueuePlayer] play];
}

- (void)clear
{
    [[self avQueuePlayer] removeAllItems];
}

#pragma mark - remote control events
#pragma mark - Kony FFI
+ (BOOL)playMusic:(NSString *)filename artistname:(NSString *)artistname songname:(NSString *)songname {
    NSString *name = [filename stringByDeletingPathExtension];
    NSString *ext = [filename pathExtension];
    AVPlayerItem *avSongItem = [[AVPlayerItem alloc] initWithURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:[[NSString alloc] initWithFormat:name] ofType:ext]]];
    if (avSongItem) {
        [(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] clear];
        [[[JetNapMusicPlayer sharedManager] avQueuePlayer] insertItem:avSongItem afterItem:nil];
        [(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] play];
        [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{MPMediaItemPropertyTitle: songname, MPMediaItemPropertyArtist: artistname};
    }
    return YES;
}

+ (BOOL)stopMusic {
    [(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] pause];
    [(JetNapMusicPlayer *)[JetNapMusicPlayer sharedManager] clear];
    return YES;
}
@end
To loop a song, use the code below after the alloc/init of avSongItem:
avSongItem.actionAtItemEnd = AVPlayerActionAtItemEndNone;
More info: Looping a video with AVFoundation AVPlayer?
Also, as mentioned in that link, register for a notification:
avSongItem.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[avPlayer currentItem]];
This will prevent the player from pausing at the end.
In the notification handler:
- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *p = [notification object];
    [p seekToTime:kCMTimeZero];
}
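On iOS 10 and later, an alternative worth considering is AVPlayerLooper, which handles looping for an AVQueuePlayer automatically. A minimal sketch, assuming fileURL points at your local audio file:
// iOS 10+: AVPlayerLooper keeps replaying the template item with no gap.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:fileURL];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:@[item]];
AVPlayerLooper *looper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer templateItem:item];
[queuePlayer play];
// Keep strong references to both queuePlayer and looper for as long as looping should continue.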

AVPlayer stalling on large video files using resource loader delegate

I am using this approach to save the AVPlayer's buffer data for video files, found as the answer to this question: Saving buffer data of AVPlayer.
iPhone and iPad - iOS 8.1.3
I made the necessary changes to play video and it works very nicely, except that when I try to play a very long video (11-12 minutes long and about 85 MB in size), the video stalls roughly 4 minutes after the connection finishes loading. I get an event for playbackBufferEmpty and a player item stalled notification.
This is the gist of the code
viewController.m
@property (nonatomic, strong) NSMutableData *videoData;
@property (nonatomic, strong) NSURLConnection *connection;
@property (nonatomic, strong) AVURLAsset *vidAsset;
@property (nonatomic, strong) AVPlayerItem *playerItem;
@property (nonatomic, strong) AVPlayerLayer *avlayer;
@property (nonatomic, strong) NSHTTPURLResponse *response;
@property (nonatomic, strong) NSMutableArray *pendingRequests;

/**
 Startup a Video
 */
- (void)startVideo
{
    self.vidAsset = [AVURLAsset URLAssetWithURL:[self videoURLWithCustomScheme:@"streaming"] options:nil];
    [self.vidAsset.resourceLoader setDelegate:self queue:dispatch_get_main_queue()];
    self.pendingRequests = [NSMutableArray array];
    // Init Player Item
    self.playerItem = [AVPlayerItem playerItemWithAsset:self.vidAsset];
    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];
    self.player = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];
    // Init a video Layer
    self.avlayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.avlayer setFrame:self.view.frame];
    [self.view.layer addSublayer:self.avlayer];
}

- (NSURL *)getRemoteVideoURL
{
    NSString *urlString = @"http://path/to/your/long.mp4";
    return [NSURL URLWithString:urlString];
}

- (NSURL *)videoURLWithCustomScheme:(NSString *)scheme
{
    NSURLComponents *components = [[NSURLComponents alloc] initWithURL:[self getRemoteVideoURL] resolvingAgainstBaseURL:NO];
    components.scheme = scheme;
    return [components URL];
}
/**
 NSURLConnection Delegate Methods
 */
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    NSLog(@"didReceiveResponse");
    self.videoData = [NSMutableData data];
    self.response = (NSHTTPURLResponse *)response;
    [self processPendingRequests];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    NSLog(@"Received Data - appending to video & processing request");
    [self.videoData appendData:data];
    [self processPendingRequests];
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    NSLog(@"connectionDidFinishLoading::WriteToFile");
    [self processPendingRequests];
    [self.videoData writeToFile:[self getVideoCachePath:self.vidSelected] atomically:YES];
}
/**
 AVURLAsset resource loader methods
 */
- (void)processPendingRequests
{
    NSMutableArray *requestsCompleted = [NSMutableArray array];
    for (AVAssetResourceLoadingRequest *loadingRequest in self.pendingRequests)
    {
        [self fillInContentInformation:loadingRequest.contentInformationRequest];
        BOOL didRespondCompletely = [self respondWithDataForRequest:loadingRequest.dataRequest];
        if (didRespondCompletely)
        {
            [requestsCompleted addObject:loadingRequest];
            [loadingRequest finishLoading];
        }
    }
    [self.pendingRequests removeObjectsInArray:requestsCompleted];
}

- (void)fillInContentInformation:(AVAssetResourceLoadingContentInformationRequest *)contentInformationRequest
{
    if (contentInformationRequest == nil || self.response == nil)
    {
        return;
    }
    NSString *mimeType = [self.response MIMEType];
    CFStringRef contentType = UTTypeCreatePreferredIdentifierForTag(kUTTagClassMIMEType, (__bridge CFStringRef)(mimeType), NULL);
    contentInformationRequest.byteRangeAccessSupported = YES;
    contentInformationRequest.contentType = CFBridgingRelease(contentType);
    contentInformationRequest.contentLength = [self.response expectedContentLength];
}

- (BOOL)respondWithDataForRequest:(AVAssetResourceLoadingDataRequest *)dataRequest
{
    long long startOffset = dataRequest.requestedOffset;
    if (dataRequest.currentOffset != 0)
    {
        startOffset = dataRequest.currentOffset;
    }
    // Don't have any data at all for this request
    if (self.videoData.length < startOffset)
    {
        NSLog(@"NO DATA FOR REQUEST");
        return NO;
    }
    // This is the total data we have from startOffset to whatever has been downloaded so far
    NSUInteger unreadBytes = self.videoData.length - (NSUInteger)startOffset;
    // Respond with whatever is available if we can't satisfy the request fully yet
    NSUInteger numberOfBytesToRespondWith = MIN((NSUInteger)dataRequest.requestedLength, unreadBytes);
    [dataRequest respondWithData:[self.videoData subdataWithRange:NSMakeRange((NSUInteger)startOffset, numberOfBytesToRespondWith)]];
    long long endOffset = startOffset + dataRequest.requestedLength;
    BOOL didRespondFully = self.videoData.length >= endOffset;
    return didRespondFully;
}
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    if (self.connection == nil)
    {
        NSURL *interceptedURL = [loadingRequest.request URL];
        NSURLComponents *actualURLComponents = [[NSURLComponents alloc] initWithURL:interceptedURL resolvingAgainstBaseURL:NO];
        actualURLComponents.scheme = @"http";
        NSURLRequest *request = [NSURLRequest requestWithURL:[actualURLComponents URL]];
        self.connection = [[NSURLConnection alloc] initWithRequest:request delegate:self startImmediately:NO];
        [self.connection setDelegateQueue:[NSOperationQueue mainQueue]];
        [self.connection start];
    }
    [self.pendingRequests addObject:loadingRequest];
    return YES;
}

- (void)resourceLoader:(AVAssetResourceLoader *)resourceLoader didCancelLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest
{
    NSLog(@"didCancelLoadingRequest");
    [self.pendingRequests removeObject:loadingRequest];
}
/**
 KVO
 */
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == StatusObservationContext)
    {
        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            [self initHud];
            [self play:NO];
        } else if (status == AVPlayerStatusFailed) {
            NSLog(@"ERROR::AVPlayerStatusFailed");
        } else if (status == AVPlayerItemStatusUnknown) {
            NSLog(@"ERROR::AVPlayerItemStatusUnknown");
        }
    } else if (context == CurrentItemObservationContext) {
    } else if (context == RateObservationContext) {
    } else if (context == BufferObservationContext) {
    } else if (context == playbackLikelyToKeepUp) {
        if (self.player.currentItem.playbackLikelyToKeepUp) {
            // e.g. resume playback / hide the loading indicator
        }
    } else if (context == playbackBufferEmpty) {
        if (self.player.currentItem.playbackBufferEmpty)
        {
            NSLog(@"Video Asset is playable: %d", self.vidAsset.isPlayable);
            NSLog(@"Player Item Status: %ld", (long)self.player.currentItem.status);
            NSLog(@"Connection Request: %@", self.connection.currentRequest);
            NSLog(@"Video Data: %lu", (unsigned long)self.videoData.length);
        }
    } else if (context == playbackBufferFull) {
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
The problem seems to be that some time after the connection finishes loading, the player item buffer goes empty. My thought at the moment is that something is being deallocated when the connection finishes loading, messing up the playerItem buffer.
However, at the time the buffer goes empty, the playerItem status is good, the video asset is playable, and the video data is good.
If I throttle the Wi-Fi through Charles and slow down the connection, the video will play as long as the connection does not finish loading within a few minutes of the end of the video.
If I set the connection to nil on the finished-loading event, the resource loader will fire up a new connection when shouldWaitForLoadingOfRequestedResource fires again. In this case the loading starts all over again and the video continues playing.
I should mention that this long video plays fine if I play it as a normal HTTP URL asset, and also plays fine after being saved to the device and loaded from there.
When the resource loader delegate fires up the NSURLConnection, the connection takes over saving the NSData for the pending requests and processing them. When the connection finishes loading, the resource loader regains responsibility for handling the loading requests. The code was adding the loading requests to the pending requests array, but the issue was that they were not being processed. I changed the method to the following and it works:
//AVAssetResourceLoader
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    if (isLoadingComplete == YES)
    {
        //NSLog(@"LOADING WAS COMPLETE");
        [self.pendingRequests addObject:loadingRequest];
        [self processPendingRequests];
        return YES;
    }
    if (self.connection == nil)
    {
        NSURL *interceptedURL = [loadingRequest.request URL];
        NSURLComponents *actualURLComponents = [[NSURLComponents alloc] initWithURL:interceptedURL resolvingAgainstBaseURL:NO];
        actualURLComponents.scheme = @"http";
        self.request = [NSURLRequest requestWithURL:[actualURLComponents URL]];
        self.connection = [[NSURLConnection alloc] initWithRequest:self.request delegate:self startImmediately:NO];
        [self.connection setDelegateQueue:[NSOperationQueue mainQueue]];
        isLoadingComplete = NO;
        [self.connection start];
    }
    [self.pendingRequests addObject:loadingRequest];
    return YES;
}

Can't Play AVPlayer Local Video from Asset Library

So I am building a custom video player using AVFoundation - AVPlayer and AVPlayerLayer.
Currently, all I want the player to do is play a video from the asset library with a hardcoded URL to that video. I would like this to be contained in a subclass of UIView so I can use it all around my app.
Here is my code so far:
CUPlayer.h:
@interface CUPlayer : UIView
{
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    AVPlayerItem *item;
    NSURL *url;
}
@property(nonatomic) UIViewAutoresizing autoresizingMask;
@end
CUPlayer.m:
#implementation CUPlayer
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self)
{
// Initialization code
self.backgroundColor = [UIColor redColor];
[self setupURL];
}
return self;
}
-(void)setupURL
{
NSLog(#"URL setting up");
NSString *string = #"assets-library://asset/asset.mov?id=0A937F0D-6265-452D-8800- 1A760E8E88B9&ext=mov";
url = [[NSURL alloc] initFileURLWithPath: string];
[self setupPlayerForURL];
}
-(void)setupPlayerForURL
{
AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
item = [AVPlayerItem playerItemWithAsset:asset];
player = [AVPlayer playerWithPlayerItem:item];
[player addObserver:self forKeyPath:#"status" options:0 context:nil];
playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
//player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[self.layer addSublayer:playerLayer];
playerLayer.frame = CGRectMake(0, 0, 200, 200);
//[playerLayer setBackgroundColor:[UIColor greenColor].CGColor];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if (object == player && [keyPath isEqualToString:#"status"]) {
if (player.status == AVPlayerStatusFailed) {
NSLog(#"AVPlayer Failed");
} else if (player.status == AVPlayerStatusReadyToPlay) {
NSLog(#"AVPlayer Ready to Play");
[player play];
} else if (player.status == AVPlayerItemStatusUnknown) {
NSLog(#"AVPlayer Unknown");
}
}
}
And then I am calling this from the View Controller:
CUPlayer *cuPlayer = [[CUPlayer alloc]initWithFrame:CGRectMake(0, 0, 250, 250)];
[self.view addSubview:cuPlayer];
This compiles but just gives me a red square, without the video playing. The URL to the local file is definitely correct. I can get it working if I keep all the code in the view controller and start playback in -(void)viewDidLayoutSubviews.
Help would be very much appreciated; I have read all the documentation multiple times trying to work this thing out.
Tom
