I am working on a custom video player using AVPlayer. I load videos stored on the local file system in the app's Caches directory. I initialize the player like this:
self.playerItem = [[AVPlayerItem alloc] initWithURL:self.localURL];
[self.playerItem addObserver:self forKeyPath:NSStringFromSelector(@selector(status)) options:NSKeyValueObservingOptionInitial context:nil];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];
[self.avPlayer addObserver:self forKeyPath:NSStringFromSelector(@selector(status)) options:NSKeyValueObservingOptionInitial context:nil];
This generally works fine. However, I see frequent failures in the status of the AVPlayerItem with this error:
NSLocalizedDescription = "The operation could not be completed";
NSLocalizedFailureReason = "An unknown error occurred (-12983)";
NSUnderlyingError = "Error Domain=NSOSStatusErrorDomain Code=-12983
The strange thing is that the same URLs that fail sometimes work shortly before and after such failures. Roughly every tenth load attempt fails. I can't figure out what causes this or where to look for answers; a search for the error code turned up empty. Any help or pointers are highly appreciated.
After a lengthy hunt, I was able to track down the source of the issue. The problem was an undocumented limit on the number of AVPlayer items that can exist concurrently. If there are too many instances, videos can no longer be loaded, and the AVPlayerItem fails with error -12983.
Other people seem to have run into the same issue as well: AVPlayerItemStatusFailed error when too many AVPlayer instances created. I solved it by reusing player instances and making sure that there are never too many active ones at the same time.
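To illustrate the workaround, here is a minimal sketch of such a reuse pool. Note the assumptions: the `PlayerPool` class and the cap of 4 are mine; the real ceiling is undocumented and may vary by device and OS version.

```objc
// Sketch only: cap the number of live AVPlayer instances by reusing them.
// kMaxConcurrentPlayers is a guess; the real limit is undocumented.
static const NSUInteger kMaxConcurrentPlayers = 4;

@interface PlayerPool : NSObject
@property (nonatomic, strong) NSMutableArray<AVPlayer *> *players;
@end

@implementation PlayerPool

- (AVPlayer *)playerForURL:(NSURL *)url
{
    if (self.players.count < kMaxConcurrentPlayers) {
        // Still under the cap: allocate a fresh player.
        AVPlayer *player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithURL:url]];
        [self.players addObject:player];
        return player;
    }
    // At the cap: recycle the oldest player instead of allocating a new one.
    AVPlayer *player = self.players.firstObject;
    [self.players removeObjectAtIndex:0];
    [self.players addObject:player];
    [player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:url]];
    return player;
}

@end
```

The key point is that `replaceCurrentItemWithPlayerItem:` swaps content without creating another player pipeline, which keeps the total count bounded.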
I was trying to play a haptic "AHAP" pattern from a file with the following code:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
The haptic does play successfully, but whenever a pattern is played, the first key press on the keyboard makes a very loud click sound (the default iOS keyboard click), which goes back to normal on the second press. I thought it was because the engine was still active, so I called
[engine stopWithCompletionHandler:nil];
but then the haptic doesn't play anymore (the click sound is, however, normal on the first key press). playPatternFromURL:error: is supposed to play synchronously, which means it should finish playing before stopWithCompletionHandler: executes (per the Apple docs). I honestly have no idea why or how this happens. CoreHaptics is rarely seen implemented in the wild or on GitHub outside the official Apple docs, so I have no useful references (except maybe this one on GitHub).
Any idea on this particular issue? Thanks in advance.
EDIT:
For future readers, I managed to mitigate this issue by playing it on another thread:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
});
Perhaps this is due to the framework being beta software as of writing.
EDIT 2:
The mitigation above, however, doesn't work if your pattern contains CHHapticEventTypeAudioCustom events.
I managed to solve it with the code below:
__strong static CHHapticEngine *engine;
engine = [[CHHapticEngine alloc] initAndReturnError:nil];
[engine startAndReturnError:nil];
[engine playPatternFromURL:[NSURL fileURLWithPath:@"/path/to/pattern.ahap"] error:nil];
[engine notifyWhenPlayersFinished:^CHHapticEngineFinishedAction(NSError * _Nullable error) {
    [engine stopWithCompletionHandler:nil];
    return CHHapticEngineFinishedActionStopEngine;
}];
It seems I need to observe when the pattern stops playing and then stop the engine (not due to the framework being beta software; my bad). However, for the method playPatternFromURL:error:, quoting from the Apple docs:
This method blocks processing on the current thread until the pattern
has finished playing.
doesn't seem to mean what it says, at least to my understanding. That's why calling stopWithCompletionHandler: right after playPatternFromURL:error: prevented any haptics from playing.
Solution:
engine.playsHapticsOnly = YES;
I am using AVPlayer to stream audio. Before beginning to play from a URL, I call
[self.currentlyLoadingAsset loadValuesAsynchronouslyForKeys:@[@"playable", @"duration"] completionHandler:^{
    [self evaluateAssetStatusForEpisode:episode asset:self.currentlyLoadingAsset];
}];
Only when that completionHandler executes will I reevaluate the item to see if it's ready to play.
But for one user, the completionHandler suddenly stopped firing. He cannot stream audio from the web (but can still play downloaded tracks). Logging
AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
I find that the status is continually AVKeyValueStatusLoading. It never updates, and statusOfValueForKey:error: never returns an error.
I have tried calling loadValuesAsynchronouslyForKeys: on the main thread, removing the duration key, and making currentlyLoadingAsset a property to ensure it isn't released prematurely.
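For reference, a minimal sketch of what the evaluation step could look like. The body of `evaluateAssetStatusForEpisode:asset:` and the `Episode` type are my assumptions; only the `statusOfValueForKey:error:` pattern comes from the question itself.

```objc
// Sketch: check the loaded key inside the completion handler and branch on it.
- (void)evaluateAssetStatusForEpisode:(Episode *)episode asset:(AVURLAsset *)asset
{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            // Safe to create the AVPlayerItem and start playback.
            break;
        case AVKeyValueStatusFailed:
            // The load finished but failed; error describes why.
            NSLog(@"Failed to load 'playable': %@", error);
            break;
        default:
            // Still loading or cancelled; do not read the value yet.
            break;
    }
}
```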
I'm using AVQueuePlayer to do playback of a list of videos in our app. I'm trying to cache the videos that are played by AVQueuePlayer so that the video doesn't have to be downloaded each time.
So, after the video finishes playing, I attempt to save the AVPlayerItem to disk so that next time it can be initialized with a local URL.
I've tried to achieve this through two approaches:
Use AVAssetExportSession
Use AVAssetReader and AVAssetWriter.
1) AVAssetExportSession approach
This approach works like this:
Observe when an AVPlayerItem finishes playing using AVPlayerItemDidPlayToEndTimeNotification.
When the above notification is received (a video finished playing, so it's fully downloaded), use AVAssetExportSession to export that video into a file in disk.
The code:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
then
- (void)videoDidFinishPlaying:(NSNotification *)note
{
    AVPlayerItem *itemToSave = [note object];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:itemToSave.asset presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = [NSURL fileURLWithPath:@"/path/to/Documents/video.mp4"];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting...");
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export completed, wohooo!!");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Waiting...");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed with error: %@", exportSession.error);
                break;
        }
    }];
}
The result in console of that code is:
Failed with error: Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x98916a0 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x99ddd10 "The operation couldn’t be completed. (OSStatus error -12109.)", NSLocalizedFailureReason=An unknown error occurred (-12109)}
2) AVAssetReader, AVAssetWriter approach
The code:
- (void)savePlayerItem:(AVPlayerItem *)item
{
    NSError *assetReaderError = nil;
    AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:item.asset error:&assetReaderError];
    // (algorithm continues)
}
That code throws an exception when trying to alloc/init the AVAssetReader with the following information:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReader initWithAsset:error:] Cannot initialize an instance of AVAssetReader with an asset at non-local URL 'https://someserver.com/video1.mp4''
***
Any help is much appreciated.
How I ended up managing this was to create the usual chain of AVPlayer > AVPlayerItem > AVURLAsset, but at the same time create an AVAssetExportSession right away, before the AVPlayer has completely loaded the video from the remote URL.
The AVAssetExportSession will actually export slowly as the AVPlayer pre-buffers the URL, regardless of whether the user has started playing the AVPlayer or not. However, the output file will not actually complete until the entire URL has been pre-buffered; at that point the AVAssetExportSession calls back with completion. This is also completely independent of whether the user is playing, or has finished playing, the video.
As such, by creating the AVPlayer/AVAssetExportSession combo early, it became my pre-buffer mechanism.
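A rough sketch of that combo follows. The method name `prebufferURL:toFile:` is mine, and the preset and output locations are placeholders; the idea is only that player and export session share the same asset.

```objc
// Sketch: start an export session alongside the player so the player's
// pre-buffering doubles as a cache write. The export only completes once
// the whole remote URL has been buffered.
- (void)prebufferURL:(NSURL *)remoteURL toFile:(NSURL *)localURL
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:remoteURL options:nil];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    self.player = [AVPlayer playerWithPlayerItem:item];

    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    export.outputFileType = AVFileTypeMPEG4;
    export.outputURL = localURL;
    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            // The remote video is fully buffered and now cached at localURL;
            // future playback can use [AVPlayerItem playerItemWithURL:localURL].
        }
    }];
}
```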
-- Update 2018-01-10 --
I want to add that, after deploying the above mechanism, we encountered one big caveat that I think deserves a mention.
You cannot have too many video pipelines in memory (an AVPlayer with an AVPlayerItem/AVAsset hooked up); AVFoundation will refuse to play. So do use the above mechanism to pre-buffer, but once the video is downloaded to a file, if the user is not viewing the video, deallocate the AVPlayer/AVAsset. Reallocate an AVPlayer when the user decides to play the video again, this time with the AVURLAsset pointed at your local buffered copy of the video.
I'm working on a plugin for Apache Cordova that will allow audio streaming from a remote URL through the Media API.
The issue that I'm experiencing is that I get an EXC_BAD_ACCESS signal whenever I try to access certain properties on the AVPlayer instance. currentTime and isPlaying are the worst offenders. The player will be playing sound through the speakers, but as soon as my code reaches a player.currentTime or [player currentTime] it throws the bad access signal.
[player play];
double position = round([player duration] * 1000) / 1000;
[player currentTime]; //This will cause the signal
I am using ARC, so I'm not releasing anything that shouldn't be.
Edit:
Everything that I've done has been hacking around on the Cordova 3 CDVSound class as a proof of concept for actual streaming on iOS.
The original code can be found here: https://github.com/apache/cordova-plugin-media/tree/master/src/ios
My code can be found here:
CDVSound.h
CDVSound.m
The method startPlayingAudio will trip an EXC_BAD_ACCESS at line 346. Removing line 346 lets audio play, but it will trip a bad access later on, when getCurrentPositionAudio is called and line 532 runs.
Edit / Solution
So it turns out that the best way to handle this is to use an AVPlayerItem and then access the time via player.currentItem.currentTime. The real question then becomes: why isn't this behavior documented for AVPlayer, and why does it behave like this?
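For completeness, the working access pattern looks roughly like this (the `url` variable and surrounding setup are a sketch; the property chain is the point):

```objc
// Sketch: read the time from the current item rather than the player directly.
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

CMTime time = player.currentItem.currentTime;
double seconds = CMTimeGetSeconds(time);
```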
I am developing an online radio app for iOS 6 devices. I've looked at various wrappers to achieve this task: AVPlayer, MPMoviePlayerController, etc.
I tried using AVPlayer first, as it sounds more correct for my purpose, since this is an audio-only application. But soon I came across this problem: Here
Therefore I switched to MPMoviePlayerController, and this is what I'm trying to do:
pPlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://launch.fusionradio.fm:8004"]];
pPlayer.movieSourceType = MPMovieSourceTypeStreaming;
pPlayer.view.hidden = YES;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
[pPlayer prepareToPlay];
[pPlayer play];
pPlayer.shouldAutoplay = YES;
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(StreamStateChanged) name:MPMoviePlayerLoadStateDidChangeNotification object:pPlayer];
In my StreamStateChanged method, I'm doing:
NSLog(@"Trying to replay");
[pPlayer pause];
[pPlayer play];
pPlayer is an MPMoviePlayerController. Everything is fine, except that when there is an interruption, the console spits out the following:
Took background task assertion (1) for playback stall.
Ending background task assertion (1) for playback stall.
The number after "assertion" keeps increasing, and playback recovers once the internet connection is stable again.
My question is: is this approach correct? Am I doing something wrong along the way? And is it OK to ignore those assert messages?
P.S.: Please suggest a better approach if there is a different API better suited to a radio streaming app than MPMoviePlayerController.
Thank you :)
You are entirely correct in ignoring those internal assert messages. There is nothing you can do about them.