Using AVFoundation to play video, video plays but no sound - iOS

I am trying to build an app that will change the composition of a video using AVFoundation, but first I would like to just be able to play the video. I have written some code to do that, but when I play the video there is no sound.
I based it on Apple's sample code. Below is my code:
-(void)loadAssets
{
NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
/* Tells the asset to load the values of any of the specified keys that are not already loaded. */
[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
^{
dispatch_async( dispatch_get_main_queue(),
^{
/* IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem. */
[self prepareToPlayAsset:asset withKeys:requestedKeys];
});
}];
}
- (void)prepareToPlayAsset:(AVURLAsset *)assetURL withKeys:(NSArray *)requestedKeys
{
/* Make sure that the value of each key has loaded successfully. */
for (NSString *thisKey in requestedKeys)
{
NSError *error = nil;
AVKeyValueStatus keyStatus = [assetURL statusOfValueForKey:thisKey error:&error];
if (keyStatus == AVKeyValueStatusFailed)
{
[self assetFailedToPrepareForPlayback:error];
return;
}
}
/* Use the AVAsset playable property to detect whether the asset can be played. */
if (!assetURL.playable)
{
/* Generate an error describing the failure. */
NSString *localizedDescription = NSLocalizedString(@"Item cannot be played", @"Item cannot be played description");
NSString *localizedFailureReason = NSLocalizedString(@"The assets tracks were loaded, but could not be made playable.", @"Item cannot be played failure reason");
NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
localizedDescription, NSLocalizedDescriptionKey,
localizedFailureReason, NSLocalizedFailureReasonErrorKey,
nil];
NSError *assetCannotBePlayedError = [NSError errorWithDomain:@"StitchedStreamPlayer" code:0 userInfo:errorDict];
/* Display the error to the user. */
[self assetFailedToPrepareForPlayback:assetCannotBePlayedError];
return;
}
/* At this point we're ready to set up for playback of the asset. */
/* Stop observing our prior AVPlayerItem, if we have one. */
if (self.playerItem)
{
/* Remove existing player item key value observers and notifications. */
[self.playerItem removeObserver:self forKeyPath:kStatusKey];
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.playerItem];
}
/* Create a new instance of AVPlayerItem from the now successfully loaded AVAsset. */
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
/* Observe the player item "status" key to determine when it is ready to play. */
[self.playerItem addObserver:self
forKeyPath:kStatusKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:VideoPlaybackViewControllerStatusObservationContext];
/* When the player item has played to its end time we'll toggle
the movie controller Pause button to be the Play button */
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.playerItem];
seekToZeroBeforePlay = NO;
/* Create new player, if we don't already have one. */
if (![self player])
{
/* Get a new AVPlayer initialized to play the specified player item. */
self.player=[AVPlayer playerWithPlayerItem:self.playerItem];
[self.playerView setPlayer:self.player];
// Observe the AVPlayer "currentItem" property to find out when any
//AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did
//occur.
[self.player addObserver:self
forKeyPath:kCurrentItemKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:VideoPlaybackViewControllerCurrentItemObservationContext];
}
// Make our new AVPlayerItem the AVPlayer's current item.
if (self.player.currentItem != self.playerItem)
{
// Replace the player item with a new player item. The item replacement occurs
//asynchronously; observe the currentItem property to find out when the
//replacement will/did occur
[[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
[self syncUI];
}
}
- (void)observeValueForKeyPath:(NSString*) path
ofObject:(id)object
change:(NSDictionary*)change
context:(void*)context
{
/* AVPlayerItem "status" property value observer. */
if (context == VideoPlaybackViewControllerStatusObservationContext)
{
[self syncUI];
AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status)
{
/* Indicates that the status of the player is not yet known because
it has not tried to load new media resources for playback */
case AVPlayerStatusUnknown:
{
[self disablePlayerButtons];
}
break;
case AVPlayerStatusReadyToPlay:
{
/* Once the AVPlayerItem becomes ready to play, i.e.
[playerItem status] == AVPlayerItemStatusReadyToPlay,
its duration can be fetched from the item. */
//[self initScrubberTimer];
//[self enableScrubber];
[self enablePlayerButtons];
}
break;
case AVPlayerStatusFailed:
{
AVPlayerItem *playerItem = (AVPlayerItem *)object;
[self assetFailedToPrepareForPlayback:playerItem.error];
}
break;
}
}
/* AVPlayer "currentItem" property observer.
Called when the AVPlayer replaceCurrentItemWithPlayerItem:
replacement will/did occur. */
else if (context == VideoPlaybackViewControllerCurrentItemObservationContext)
{
AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
/* Is the new player item null? */
if (newPlayerItem == (id)[NSNull null])
{
[self disablePlayerButtons];
//[self disableScrubber];
}
else /* Replacement of player currentItem has occurred */
{
/* Set the AVPlayer for which the player layer displays visual output. */
[self.playerView setPlayer:self.player];
//[self setViewDisplayName];
/* Specifies that the player should preserve the video’s aspect ratio and
fit the video within the layer’s bounds. */
//[mPlaybackView setVideoFillMode:AVLayerVideoGravityResizeAspect];
[self syncUI];
}
}
else
{
[super observeValueForKeyPath:path ofObject:object change:change context:context];
}
}
Any help would be appreciated.
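For reference, one common cause of silent video playback that the code above does not rule out is the audio session: the default Ambient/SoloAmbient category is muted by the ring/silent switch, so the video renders but audio does not. Below is a minimal, hedged sketch of activating a Playback session before starting the player; the helper method name and the idea of calling it before loadAssets are assumptions, not part of the original code.
#import <AVFoundation/AVFoundation.h>
// Hypothetical helper: call once before playback starts (e.g. before -loadAssets).
// AVAudioSessionCategoryPlayback ignores the silent switch, so audio keeps playing.
- (void)configureAudioSession
{
    NSError *sessionError = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    if (![session setCategory:AVAudioSessionCategoryPlayback error:&sessionError]) {
        NSLog(@"Could not set audio session category: %@", sessionError);
    }
    if (![session setActive:YES error:&sessionError]) {
        NSLog(@"Could not activate audio session: %@", sessionError);
    }
}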

Related

How to force AVPlayer to fail when AVAssetResourceLoadingRequest's finishLoadingWithError: is called

I want to force AVPlayer to surface a player error, either through the AVPlayerItemFailedToPlayToEndTimeNotification or by observing player.status via KVO, when a resource loading request handled by AVAssetResourceLoader finishes loading with an error.
I would rather not stop playback manually on the AVPlayer when the error occurs, to avoid a race condition between the manual stop and the KVO/notification callbacks.
I also tried returning NO from resourceLoader:shouldWaitForLoadingOfRequestedResource:, but that does not move the AVPlayer to the failed state, and no failure notification is sent.
@implementation AssetLoader // conforms to <AVAssetResourceLoaderDelegate>
- (AVPlayerItem *)setupLoader {
NSURL *playbackUrl = [NSURL URLWithString:@"example-url"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:playbackUrl options:nil];
[asset.resourceLoader setDelegate:self queue:_sample_queue];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
return playerItem;
}
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
NSURL *url = loadingRequest.request.URL;
if ([url.scheme isEqualToString:@"skd"] == NO) {
QPLogError(@"Unexpected url scheme: %@", url.absoluteString);
return NO;
}
LicenseAction *action = [[LicenseAction alloc] initWithLoadingRequest:loadingRequest];
[action execute:^(NSData *ckcData, NSError *error) {
if (error) {
[loadingRequest finishLoadingWithError:error]; //This should prompt AVPlayer to fail
} else {
[loadingRequest.dataRequest respondWithData:ckcData];
[loadingRequest finishLoading];
}
}];
return YES;
}
...
@end
@implementation Player {
AVPlayer *_player;
}
- (void)prepare {
[_player replaceCurrentItemWithPlayerItem:playerItem];
NSKeyValueObservingOptions options = (NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld | NSKeyValueObservingOptionInitial);
[_player addObserver:self forKeyPath:@"status" options:options context:&QPClearPlayerAVPlayerKVOContext];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemFailedToEnd:) name:AVPlayerItemFailedToPlayToEndTimeNotification object:_player.currentItem];
}
- (void)stopWithError {
...
[self reportPlayerError];
}
...
- (void)playerItemFailedToEnd:(NSNotification *)notification {
...
[self reportPlayerError];
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
if (object == _player && [keyPath isEqualToString:@"status"]) {
...
if (_player.status == AVPlayerStatusFailed) {
[self reportPlayerError];
}
}
}
...
@end
Expected
Upon invoking AVAssetResourceLoadingRequest's finishLoadingWithError:, the AVPlayer posts a failure notification or changes its KVO-observable status.
Actual
The AVPlayer neither changes status nor posts a failure notification.
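One hedged suggestion worth checking: the error passed to finishLoadingWithError: typically surfaces on the AVPlayerItem rather than on the AVPlayer itself, so observing the item's status (and reading item.error) may catch the failure even when player.status never changes. A minimal sketch, reusing the playerItem created in setupLoader; the context constant and helper method are illustrative:
// Hypothetical sketch: observe the item's status in addition to the player's.
static void *QPItemStatusContext = &QPItemStatusContext;

- (void)observeItem:(AVPlayerItem *)playerItem {
    [playerItem addObserver:self
                 forKeyPath:@"status"
                    options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial
                    context:QPItemStatusContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context == QPItemStatusContext) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        if (item.status == AVPlayerItemStatusFailed) {
            // item.error should carry the NSError handed to finishLoadingWithError:
            [self reportPlayerError];
        }
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}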

Sometimes AVPlayer stalls and seekToTime: does not respond

I'm playing YouTube videos using AVPlayer as follows:
- (void)startYoutubeVideoAtUrl:(NSURL *)videoUrl
{
NSLog(#"start player at url : %#", videoUrl);
[HCYoutubeParser h264videosWithYoutubeURL:videoUrl completeBlock:^(NSDictionary *videoDictionary, NSError *error) {
if (videoDictionary && videoDictionary.count > 0) {
NSString *URLString = [self chooseYoutubeUrlFromUrlList:videoDictionary];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[self videoURLWithCustomScheme:@"streaming" uRLString:URLString] options:nil];
[asset.resourceLoader setDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
pendingRequests = [NSMutableArray array];
avPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
[self startVideoPlayBack];
}
else {
[_delegate failedStartPalyInlineVideo];
}
}];
}
-(void)startVideoPlayBack
{
startTime = CFAbsoluteTimeGetCurrent();
avPlayer = [[AVQueuePlayer alloc] initWithPlayerItem:avPlayerItem];
avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
[avPlayerItem addObserver:self forKeyPath:@"status" options:0 context:nil];
[avPlayerLayer addObserver:self forKeyPath:@"readyForDisplay" options:NSKeyValueObservingOptionNew context:nil];
avPlayerLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
[self.view.layer addSublayer:avPlayerLayer];
[self watchApiCall];
avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:[avPlayer currentItem]];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemPlaybackStalled:)
name:AVPlayerItemPlaybackStalledNotification
object:[avPlayer currentItem]];
}
- (void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object
change:(NSDictionary *)change
context:(void *)context
{
@try {
if (!avPlayer) {
return;
}
if (avPlayerItem.status == AVPlayerStatusReadyToPlay) {
}
else if (avPlayerItem.status == AVPlayerStatusFailed) {
NSLog(#"----- AVPlayerStatusFailed ----");
[self playbackVideo];
}
if (object == avPlayerLayer && [keyPath isEqualToString:@"readyForDisplay"]) {
[self resetNetworkSpeedUsingLoadingTime];
if (avPlayerLayer.readyForDisplay) {
id<LoopingVideoDelegate> strongDelegate = self.delegate;
if([strongDelegate readyForVideoDisplay]) {
[avPlayer play];
}
else {
[self removePlayer];
}
}
}
}
@catch(NSException *ex) {
NSLog(@"EXCEPTION : %@", ex);
}
}
My issue is that sometimes the video stalls and fires AVPlayerItemPlaybackStalledNotification. Also, [avPlayerItem seekToTime:kCMTimeZero] sometimes does not respond after the AVPlayerItemDidPlayToEndTimeNotification selector has been called. I wasn't able to find a solution for this. I checked with,
[HCYoutubeParser thumbnailForYoutubeURL:videoUrl thumbnailSize:YouTubeThumbnailDefaultHighQuality completeBlock:^(UIImage *image, NSError *error) {
if (!error) {
[HCYoutubeParser h264videosWithYoutubeURL:videoUrl completeBlock:^(NSDictionary *videoDictionary, NSError *error) {
NSString *URLString = [self chooseYoutubeUrlFromUrlList:videoDictionary];
NSURL *urlToLoad = [NSURL URLWithString:URLString];
avPlayerItem = [[AVPlayerItem alloc] initWithURL:urlToLoad];
[self startVideoPlayBack];
}];
}
else {
NSLog(#"error in youtube parser");
}
}];
and there is no player stall issue or seekToTime: problem with that approach. Please help.
Observe your AVPlayerItem's loadedTimeRanges and seekableTimeRanges properties to make sure that the AVPlayer has actually loaded playable data. When streaming, the AVPlayer often pauses because it has run short of playable data (data that can be played by the AVPlayer). This should help you investigate the issue further. You can also try to start playback only when you get avPlayerItem.status == AVPlayerStatusReadyToPlay, by calling play on your AVPlayer.
From AV Foundation Programming Guide
Monitoring Playback
You can monitor a number of aspects of both the presentation state of a player and the player item being played. This is particularly useful for state changes that are not under your direct control. For example:
If the user uses multitasking to switch to a different application, a player’s rate property will drop to 0.0.
If you are playing remote media, a player item’s loadedTimeRanges and seekableTimeRanges properties will change as more data becomes available.
These properties tell you what portions of the player item’s timeline are available.
A player’s currentItem property changes as a player item is created for an HTTP live stream.
A player item’s tracks property may change while playing an HTTP live stream.
This may happen if the stream offers different encodings for the content; the tracks change if the player switches to a different encoding.
A player or player item’s status property may change if playback fails for some reason.
You can use key-value observing to monitor changes to values of these properties.
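To make the loadedTimeRanges suggestion concrete, here is a hedged sketch of observing the buffer-related properties on the question's avPlayerItem and only resuming once the item reports it is likely to keep up; the key paths are real AVPlayerItem properties, while the context constant and helper method are illustrative:
// Hypothetical sketch: watch buffering before relying on play/seek.
static void *BufferObservationContext = &BufferObservationContext;

- (void)observeBuffering
{
    [avPlayerItem addObserver:self forKeyPath:@"loadedTimeRanges"
                      options:NSKeyValueObservingOptionNew context:BufferObservationContext];
    [avPlayerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp"
                      options:NSKeyValueObservingOptionNew context:BufferObservationContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if (context == BufferObservationContext) {
        if ([keyPath isEqualToString:@"loadedTimeRanges"] && avPlayerItem.loadedTimeRanges.count > 0) {
            CMTimeRange range = [avPlayerItem.loadedTimeRanges.firstObject CMTimeRangeValue];
            NSLog(@"buffered up to %.1f s", CMTimeGetSeconds(CMTimeRangeGetEnd(range)));
        }
        else if ([keyPath isEqualToString:@"playbackLikelyToKeepUp"]
                 && avPlayerItem.playbackLikelyToKeepUp
                 && avPlayerItem.status == AVPlayerItemStatusReadyToPlay) {
            [avPlayer play]; // resume only once enough data is buffered
        }
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}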

iOS Exception [NSException raise:format:]

I have this exception. I know it's related to the AVPlayer, but I don't understand exactly what causes it (something with the observation, but what?).
#0 0x36342c64 in objc_exception_throw ()
#1 0x29282d74 in +[NSException raise:format:] ()
#2 0x29f94334 in NSKVODeallocate ()
#3 0x36358d5e in objc_object::sidetable_release(bool) ()
#4 0x2918a25c in CFRelease ()
#5 0x2919eafc in -[__NSSetM dealloc] ()
#6 0x36358d5e in objc_object::sidetable_release(bool) ()
#7 0x27d19d76 in -[AVPlayer dealloc] ()
#8 0x29f94254 in NSKVODeallocate ()
#9 0x36358d5e in objc_object::sidetable_release(bool) ()
#10 0x27d00a3a in -[AVPlayerLayer dealloc] ()
#11 0x29f94254 in NSKVODeallocate ()
#12 0x2c2a73ac in CA::Layer::free_transaction(CA::Transaction*) ()
#13 0x2c2a4052 in CA::Transaction::commit() ()
#14 0x2c29decc in CA::Transaction::observer_callback(__CFRunLoopObserver*, unsigned long, void*) ()
#15 0x292495bc in __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ ()
#16 0x29246c7a in __CFRunLoopDoObservers ()
#17 0x29247082 in __CFRunLoopRun ()
#18 0x29195610 in CFRunLoopRunSpecific ()
#19 0x29195422 in CFRunLoopRunInMode ()
#20 0x306af0a8 in GSEventRunModal ()
#21 0x2c8df484 in UIApplicationMain ()
#22 0x000f8408 in main at main.m:17
Thanks.
* UPDATE *
This is the code using AVPlayer and KVO (it's not all of the code, only the parts relevant to AVPlayer and KVO; there is too much code to include all of it here). I am using two players. Thanks.
- (void)observeValueForKeyPath:(NSString*) path
ofObject:(id)object
change:(NSDictionary*)change
context:(void*)context {
/* AVPlayerItem "status" property value observer. */
if (context == MainPlayerStatusObservationContext)
{
[self syncPlayPauseButtons];
AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status)
{
/* Indicates that the status of the player is not yet known because
it has not tried to load new media resources for playback */
case AVPlayerStatusUnknown:
{
[self removePlayerTimeObserver];
[self syncScrubber];
[self disableScrubber];
[self disablePlayerButtons];
}
break;
case AVPlayerStatusReadyToPlay:
{
/* Once the AVPlayerItem becomes ready to play, i.e.
[playerItem status] == AVPlayerItemStatusReadyToPlay,
its duration can be fetched from the item. */
SingrDLog(#"item is ready!");
sMainDuration = CMTimeGetSeconds(self.mPlayer.currentItem.duration);
// NSLog(#"SDURATION IS: %f", sDuration);
[self initScrubberTimer];
[self enableScrubber];
[self enablePlayerButtons];
if (_isMainPlayerPlaying) {
// make sure the video continues streaming when app goes to backgroud
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
[self.mPlayer play];
self.isBuffering = NO;
[self updatePlayingInfoCenter:self.title :_video.artist_name :@"personal album"];
}
}
break;
case AVPlayerStatusFailed:
{
NSLog(#"name: %#", [mURL lastPathComponent]);
AVPlayerItem *playerItem = (AVPlayerItem *)object;
[self assetFailedToPrepareForPlayback:playerItem.error];
}
break;
}
}
/* AVPlayer "rate" property value observer. */
else if (context == MainPlayerRateObservationContext)
{
[self syncPlayPauseButtons];
}
/* AVPlayer buffer size observer - update if main player is playing */
else if (context == MainPlayerBufferObservationContext)
{
NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
if (self.isMainPlayerPlaying) {
[self updateBufferProgressBar:timeRanges];
}
}
/* AVPlayer "currentItem" property observer.
Called when the AVPlayer replaceCurrentItemWithPlayerItem:
replacement will/did occur. */
else if (context == MainPlayerCurrentItemObservationContext)
{
AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
/* Is the new player item null? */
if (newPlayerItem == (id)[NSNull null])
{
[self disablePlayerButtons];
[self disableScrubber];
}
else /* Replacement of player currentItem has occurred */
{
/* Set the AVPlayer for which the player layer displays visual output.
(set it once)
*/
if (_mPlayerPlaybackViewSet == NO) {
_mPlayerPlaybackViewSet = YES;
[self.mPlaybackView setPlayer:mPlayer];
}
[self setViewDisplayName];
/* Specifies that the player should preserve the video’s aspect ratio and
fit the video within the layer’s bounds. */
[self.mPlaybackView setVideoFillMode:AVLayerVideoGravityResizeAspect];
[self syncPlayPauseButtons];
}
}
////////////////////////////////////////////////////
// Helper Player KVO
////////////////////////////////////////////////////
/* AVHelperPlayerItem "status" property value observer. */
else if (context == HelperPlayerStatusObservationContext)
{
[self syncHelperPlayPauseButtons];
AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status)
{
/* Indicates that the status of the player is not yet known because
it has not tried to load new media resources for playback */
case AVPlayerStatusUnknown:
{
[self removeHelperPlayerTimeObserver];
[self syncHelperScrubber];
[self disableHelperScrubber];
[self disableHelperPlayerButtons];
}
break;
case AVPlayerStatusReadyToPlay:
{
/* Once the AVHelperPlayerItem becomes ready to play, i.e.
[playerItem status] == AVHelperPlayerItemStatusReadyToPlay,
its duration can be fetched from the item. */
SingrDLog(#"helper item is ready!");
sHelperDuration = CMTimeGetSeconds(self.mHelperPlayer.currentItem.duration);
[self initHelperScrubberTimer];
[self enableHelperScrubber];
[self enableHelperPlayerButtons];
if (!_isMainPlayerPlaying) {
// make sure the video continues streaming when app goes to backgroud
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
[self.mHelperPlayer play];
self.isBuffering = NO;
[self updatePlayingInfoCenter:self.title :_video.artist_name :@"personal album"];
}
}
break;
case AVPlayerStatusFailed:
{
NSLog(#"name: %#", [mHelperURL lastPathComponent]);
AVPlayerItem *playerItem = (AVPlayerItem *)object;
[self helperAssetFailedToPrepareForPlayback:playerItem.error];
}
break;
}
}
/* AVHelperPlayer "rate" property value observer. */
else if (context == HelperPlayerRateObservationContext)
{
[self syncHelperPlayPauseButtons];
}
/* AVHelperPlayer buffer size observer - update if helper player is playing */
else if (context == HelperPlayerBufferObservationContext)
{
NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
if (!self.isMainPlayerPlaying) {
[self updateBufferProgressBar:timeRanges];
}
}
/* AVHelperPlayer "currentItem" property observer.
Called when the AVPlayer replaceCurrentItemWithPlayerItem:
replacement will/did occur. */
else if (context == HelperPlayerCurrentItemObservationContext)
{
AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
/* Is the new player item null? */
if (newPlayerItem == (id)[NSNull null])
{
[self disableHelperPlayerButtons];
[self disableHelperScrubber];
}
else /* Replacement of Helper player currentItem has occurred */
{
/* Set the AVPlayer for which the Helper player layer displays visual output. */
if (_mHelperPlayerPlaybackViewSet == NO) {
_mHelperPlayerPlaybackViewSet = YES;
[self.mHelperPlaybackView setHelperPlayer:self.mHelperPlayer];
}
[self setViewDisplayName];
/* Specifies that the Helper player should preserve the video’s aspect ratio and
fit the video within the layer’s bounds. */
[self.mHelperPlaybackView setHelperVideoFillMode:AVLayerVideoGravityResizeAspect];
[self syncHelperPlayPauseButtons];
}
}
else
{
[super observeValueForKeyPath:path ofObject:object change:change context:context];
} }
- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys
{
for (NSString *thisKey in requestedKeys)
{
NSError *error = nil;
AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
if (keyStatus == AVKeyValueStatusFailed)
{
NSLog(#"name: %#", [mURL lastPathComponent]);
[self assetFailedToPrepareForPlayback:error];
return;
}
/* If you are also implementing -[AVAsset cancelLoading], add your code here to bail out properly in the case of cancellation. */
}
/* Use the AVAsset playable property to detect whether the asset can be played. */
if (!asset.playable)
{
/* Generate an error describing the failure. */
NSString *localizedDescription = NSLocalizedString(@"Item cannot be played", @"Item cannot be played description");
NSString *localizedFailureReason = NSLocalizedString(@"The assets tracks were loaded, but could not be made playable.", @"Item cannot be played failure reason");
NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
localizedDescription, NSLocalizedDescriptionKey,
localizedFailureReason, NSLocalizedFailureReasonErrorKey,
nil];
NSError *assetCannotBePlayedError = [NSError errorWithDomain:@"StitchedStreamPlayer" code:0 userInfo:errorDict];
/* Display the error to the user. */
[self assetFailedToPrepareForPlayback:assetCannotBePlayedError];
return;
}
/* At this point we're ready to set up for playback of the asset. */
/* Stop observing our prior AVPlayerItem, if we have one. */
if (self.mPlayerItem)
{
/* Remove existing player item key value observers and notifications. */
//[self.mPlayerItem removeObserver:self forKeyPath:kStatusKey];
NSLog(#"calling NSNotificationCenter remove");
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.mPlayerItem];
}
/* Create a new instance of AVPlayerItem from the now successfully loaded AVAsset. */
self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
/* Observe the player item "status" key to determine when it is ready to play. */
NSLog(#"call mPlayerItem addObserver");
[self.mPlayerItem addObserver:self
forKeyPath:kStatusKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:MainPlayerStatusObservationContext];
/* When the player item has played to its end time we'll toggle
the movie controller Pause button to be the Play button */
NSLog(#"calling NSNotificationCenter add");
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.mPlayerItem];
seekToZeroBeforePlay = NO;
/* Create new player, if we don't already have one. */
if (!self.mPlayer)
{
/* Get a new AVPlayer initialized to play the specified player item. */
[self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];
/* Observe the AVPlayer "currentItem" property to find out when any
AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did
occur.*/
[self.player addObserver:self
forKeyPath:kCurrentItemKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:MainPlayerCurrentItemObservationContext];
/* Observe the AVPlayer "rate" property to update the scrubber control. */
[self.player addObserver:self
forKeyPath:kRateKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:MainPlayerRateObservationContext];
/* Observe the asset buffer loaded data. this indicates the buffering size */
[self.player addObserver:self
forKeyPath:kBufferLoadedKey
options:NSKeyValueObservingOptionNew
context:MainPlayerBufferObservationContext];
}
/* Make our new AVPlayerItem the AVPlayer's current item. */
if (self.player.currentItem != self.mPlayerItem)
{
/* Replace the player item with a new player item. The item replacement occurs
asynchronously; observe the currentItem property to find out when the
replacement will/did occur*/
[self.mPlayer replaceCurrentItemWithPlayerItem:self.mPlayerItem];
[self syncPlayPauseButtons];
}
[self.mScrubber setValue:0.0];
SingrDLog(#"set scrub to 0"); }
In viewWillDisappear I call this method:
#pragma mark - Observers -
- (void) removeObservers {
if (self.mPlayerItem)
{
/* Remove existing player item key value observers and notifications. */
[self.mPlayerItem removeObserver:self forKeyPath:kStatusKey];
NSLog(#"1 GETTING OUT: calling helper NSNotificationCenter remove");
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.mPlayerItem];
self.mPlayerItem = nil;
}
if (self.mHelperPlayerItem)
{
/* Remove existing player item key value observers and notifications. */
[self.mHelperPlayerItem removeObserver:self forKeyPath:kStatusKey];
NSLog(#"2 GETTING OUT: calling helper NSNotificationCenter remove");
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.mHelperPlayerItem];
self.mHelperPlayerItem = nil;
}
// methods check wheter observer exists
[self removePlayerTimeObserver];
[self removeHelperPlayerTimeObserver];
}
-(void)removePlayerTimeObserver
{
if (mTimeObserver) {
NSLog(#"calling mTimeObserver remove");
[self.mPlayer removeTimeObserver:mTimeObserver];
mTimeObserver = nil;
}
}
-(void)removeHelperPlayerTimeObserver
{
if (mHelperTimeObserver) {
NSLog(@"calling mHelperTimeObserver remove");
[self.mHelperPlayer removeTimeObserver:mHelperTimeObserver];
mHelperTimeObserver = nil;
}
}
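A hedged observation on the code above: removeObservers balances the item-level status observers and the time observers, but the observers added on the players themselves in prepareToPlayAsset: (kCurrentItemKey, kRateKey, kBufferLoadedKey) are never removed. If an AVPlayer is released while those registrations are still active, that produces exactly the NSKVODeallocate throw inside -[AVPlayer dealloc] shown in the backtrace. A minimal sketch of balancing them before the players go away; the assumption that the helper player observes the same key paths is mine:
- (void)removePlayerObservers
{
    if (self.mPlayer) {
        /* Mirror every addObserver: made on the main player in prepareToPlayAsset:. */
        [self.mPlayer removeObserver:self forKeyPath:kCurrentItemKey];
        [self.mPlayer removeObserver:self forKeyPath:kRateKey];
        [self.mPlayer removeObserver:self forKeyPath:kBufferLoadedKey];
    }
    if (self.mHelperPlayer) {
        /* Assumption: the helper player was registered with the same key paths. */
        [self.mHelperPlayer removeObserver:self forKeyPath:kCurrentItemKey];
        [self.mHelperPlayer removeObserver:self forKeyPath:kRateKey];
        [self.mHelperPlayer removeObserver:self forKeyPath:kBufferLoadedKey];
    }
}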

iOS AVPlayer never gets ready

I am using this code to start playing local video chunks that are referenced to form a playlist.
The very same code works in one project, but not in another.
In the project I am working on right now, I can see that the first chunk gets loaded and the first frame shows up, but the AVPlayer never starts playing because its status never changes to AVPlayerStatusReadyToPlay:
- (void)loadAssetAsync
{
NSLog(#"loadAssetAsync for URL: %#", videoURL);
/**
* Create an asset for inspection of a resource referenced by a given URL.
* Load the values for the asset keys "tracks", "playable".
*/
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
// Tells the asset to load the values of any of the specified keys that are not already loaded.
[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
^{
dispatch_async( dispatch_get_main_queue(),
^{
// IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem.
[self prepareToPlayAsset:asset withKeys:requestedKeys];
});
}];
}
/**
* Invoked at the completion of the loading of the values for all keys on the asset that required.
*/
- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys
{
//assert([NSThread isMainThread]);
// Make sure that the value of each key has loaded successfully.
for (NSString *thisKey in requestedKeys)
{
NSError *error = nil;
AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
if (keyStatus == AVKeyValueStatusFailed)
{
BVLogWarn(#"%#: %#", THIS_FILE, error.localizedDescription);
[self handleErrorForProxy:error];
[self assetFailedToPrepareForPlayback];
return;
}
}
if (!asset.playable)
{
BVLogWarn(#"%#: Item cannot be played", THIS_FILE);
[self handleErrorForProxy:nil];
[self assetFailedToPrepareForPlayback];
return;
}
// Create a new instance of AVPlayerItem from the now successfully loaded AVAsset.
playerItem = [[AVPlayerItem alloc] initWithAsset:asset];
// Observe the player item "status" key to determine when it is ready to play.
[playerItem addObserver:self
forKeyPath:kStatusKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:BVPlayerItemStatusObserverContext];
[playerItem addObserver:self
forKeyPath:kBufferEmpty
options:NSKeyValueObservingOptionNew
context:BVPLayerBufferEmptyObserverContext];
[playerItem addObserver:self
forKeyPath:kLikelyToKeepUp
options:NSKeyValueObservingOptionNew
context:BVPlayerLikelyToKeepUpObserverContext];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
// Get a new AVPlayer initialized to play the specified player item.
player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
// Do nothing if the item has finished playing
[player setActionAtItemEnd:AVPlayerActionAtItemEndNone];
/* Observe the AVPlayer "currentItem" property to find out when any
AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did
occur.*/
[player addObserver:self
forKeyPath:kCurrentItemKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:BVCurrentItemObserverContext];
// Observe the AVPlayer "rate" property to update the scrubber control.
[player addObserver:self
forKeyPath:kRateKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:BVRateObserverContext];
[player replaceCurrentItemWithPlayerItem:playerItem];
}
- (void)observeValueForKeyPath:(NSString*) keyPath
ofObject:(id)object
change:(NSDictionary*)change
context:(void*)context
{
// AVPlayerItem "status" property value observer.
if (context == BVPlayerItemStatusObserverContext)
{
AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status)
{
case AVPlayerStatusUnknown:
{
[self removeTimeObserver];
[self syncTimeScrubber];
[timeControl setEnabled:NO];
[playButton setEnabled:NO];
[fullscreenButton setEnabled:NO];
[loadingIndicator startAnimating];
}
break;
case AVPlayerStatusReadyToPlay:
{
if (firstPlayback|becomeActive)
{
[timeControl setEnabled:YES];
[playButton setEnabled:YES];
[fullscreenButton setEnabled:YES];
[upperControls setHidden:NO];
[lowerControls setHidden:NO];
[loadingIndicator stopAnimating];
if (firstPlayback) {
[playbackView setNeedsDisplay];
}
if (self.shouldAutoplay)
[player play];
if (firstPlayback) {
timeRemaining.text = [NSString stringWithFormat:@"-%@", timeStringForSeconds(CMTimeGetSeconds(playerItem.duration) )];
}
firstPlayback = NO;
controlsHidden = NO;
if (!isSeeking)
[self startHideControlsTimer];
}
if (becomeActive) {
dispatch_async(dispatch_get_main_queue(), ^{
[player seekToTime:CMTimeMakeWithSeconds(lastTimeStop, NSEC_PER_SEC)
toleranceBefore:kCMTimeZero
toleranceAfter:kCMTimeZero
completionHandler:^(BOOL finished) {
if (finished && rateToRestoreAfterScrubbing)
{
[player setRate:rateToRestoreAfterScrubbing];
rateToRestoreAfterScrubbing = 0.f;
}
[self addTimeObserver];
[playbackView setPlayer:player];
becomeActive = NO;
}];
});
}else{
[self addTimeObserver];
}
}
break;
case AVPlayerStatusFailed:
{
AVPlayerItem *thePlayerItem = (AVPlayerItem *)object;
BVLogWarn(#"%#: %#", THIS_FILE, thePlayerItem.error.localizedDescription);
[self handleErrorForProxy:thePlayerItem.error];
[self assetFailedToPrepareForPlayback];
}
break;
}
}
// AVPlayer "rate" property value observer.
else if (context == BVRateObserverContext)
{
[self updatePlayPauseButton];
}
// AVPlayer "currentItem" buffer is empty observer
else if (context == BVPLayerBufferEmptyObserverContext)
{
[loadingIndicator startAnimating];
}
// AVPlayer "currentItem" is likely to keep up observer
else if (context == BVPlayerLikelyToKeepUpObserverContext)
{
[loadingIndicator stopAnimating];
}
// AVPlayer "currentItem" property observer.
else if (context == BVCurrentItemObserverContext)
{
AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
// New player item null?
if (newPlayerItem == (id)[NSNull null])
{
[playButton setEnabled:NO];
[timeControl setEnabled:NO];
} else // Replacement of player currentItem has occurred
{
if (!becomeActive) {
[playbackView setPlayer:player];
}else{
}
[playbackView setVideoFillMode:[self scalingMode]];
[self updatePlayPauseButton];
}
}
else
{
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
return;
}
Did you try putting a log statement before your if? Maybe the status change is coming through but getting stuck on your if condition:
case AVPlayerStatusReadyToPlay:
{
NSLog(#"NOTIFICATION TEST PASSED");
if (firstPlayback|becomeActive) {}
}

AVPlayerItem initial timedMetadata not being observed (KVO)

I have a class that is handling an AVPlayer (and AVPlayerItem) that reports back state, time, and timedMetadata to a delegate.
It works well, except that about 70-80% of the time the initial timedMetadata is not key-value observed. However, after the first instance of timedMetadata is missed, all subsequent timedMetadata seems to be observed without issue.
As a temporary fix, I've started to embed dummy timedMetadata tags at the beginning of videos that do nothing but "kick the tires", so to speak, and everything works fine after that. Yet this seems pretty kludgy. I suspect that either I'm setting up the AVPlayerItem and KVO in a sub-optimal manner, or there's just a bug here.
Any ideas on why this might be happening are greatly appreciated! Code below....
// CL: Define constants for the key-value observation contexts.
static const NSString *ItemStatusContext;
static const NSString *ItemMetadataContext;
static const NSString *ItemPlaybackForcastContext;
- (id)initWithURL:(NSURL *)url
{
if (self = [super init]) {
__weak TFPAVController *_self = self;
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
^{
dispatch_async(dispatch_get_main_queue(),
^{
NSError *error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
[item addObserver:_self forKeyPath:@"status" options:0 context:&ItemStatusContext];
[item addObserver:_self forKeyPath:@"timedMetadata" options:0 context:&ItemMetadataContext];
[item addObserver:_self forKeyPath:@"playbackLikelyToKeepUp" options:0 context:&ItemPlaybackForcastContext];
[[NSNotificationCenter defaultCenter] addObserver:_self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:item];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
_self.totalRunTime = CMTimeGetSeconds(item.duration);
[_self.delegate avPlayerNeedsView:player];
_self.playerItem = item;
_self.player = player;
}
else {
NSLog(#"The asset's tracks were not loaded: %# // [%# %#]",
error.localizedDescription,
NSStringFromClass([self class]),
NSStringFromSelector(_cmd));
}
_self.playerObserver = [_self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, _FrameRate_)
queue:NULL
usingBlock: ^(CMTime time) {
_self.currentVideoTime = CMTimeGetSeconds([_self.playerItem currentTime]);
}];
});
}];
}
return self;
}
#pragma mark - KVO Response Methods
- (void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object
change:(NSDictionary *)change
context:(void *)context
{
__weak TFPAVController *_self = self;
if (context == &ItemStatusContext) {
dispatch_async(dispatch_get_main_queue(),
^{
if (((AVPlayerItem *)object).status == AVPlayerItemStatusReadyToPlay) {
[_self.delegate videoIsLoadedInPlayer:_self];
}
});
return;
}
else if (context == &ItemMetadataContext) {
dispatch_async(dispatch_get_main_queue(),
^{
[_self checkMetaDataForPlayerItem: (AVPlayerItem *)object];
});
return;
}
else if (context == &ItemPlaybackForcastContext) {
dispatch_async(dispatch_get_main_queue(),
^{
AVPlayerItem *playerItem = object;
if (CMTimeGetSeconds([playerItem currentTime]) <= 0) return;
NSDictionary *notificationDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:playerItem.playbackLikelyToKeepUp]
forKey:kAVPlayerStateKey];
[[NSNotificationCenter defaultCenter] postNotificationName:kAVPlayerNotification
object:self
userInfo:notificationDictionary];
});
return;
}
[super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
}
- (void)checkMetaDataForPlayerItem:(AVPlayerItem *)item
{
NSMutableDictionary *metaDict = [NSMutableDictionary dictionary];
// CL: make sure there's stuff there
if (item.timedMetadata != nil && [item.timedMetadata count] > 0) {
// CL: if there is, cycle through the items and create a Dictionary
for (AVMetadataItem *metadata in item.timedMetadata) {
[metaDict setObject:[metadata valueForKey:@"value"] forKey:[metadata valueForKey:@"key"]];
}
// CL: pass it to the delegate
[self.delegate parseNewMetaData:[NSDictionary dictionaryWithDictionary:metaDict]];
}
}
Ahhh, KVO. Probably one of Apple's all-time worst design decisions.
I guess it's no longer relevant, but at a guess the problem you're having is that sometimes the value you're trying to observe has already been assigned to the key when you get around to adding yourself as an observer, so your observer selector isn't called.
To avoid this you can add NSKeyValueObservingOptionInitial to the options when calling addObserver:forKeyPath:options:context:, and your observer method will be invoked immediately with the current value.
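Applied to the code in the question, that would mean registering the timedMetadata observer with the initial option; a sketch where only the options argument differs from the original addObserver call:
[item addObserver:_self
       forKeyPath:@"timedMetadata"
          options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
          context:&ItemMetadataContext];
With NSKeyValueObservingOptionInitial the observer fires once immediately at registration time, so metadata that is already attached to the item is not missed.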
