I'm using AVQueuePlayer to play a list of videos in our app. I'm trying to cache the videos played by AVQueuePlayer so that each video doesn't have to be downloaded every time.
So, after a video finishes playing, I attempt to save the AVPlayerItem to disk, so that next time it can be initialized with a local URL.
I've tried to achieve this through two approaches:
Use AVAssetExportSession
Use AVAssetReader and AVAssetWriter.
1) AVAssetExportSession approach
This approach works like this:
Observe when an AVPlayerItem finishes playing using AVPlayerItemDidPlayToEndTimeNotification.
When the above notification is received (the video has finished playing, so it's fully downloaded), use AVAssetExportSession to export that video to a file on disk.
The code:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
then
- (void)videoDidFinishPlaying:(NSNotification *)note
{
    AVPlayerItem *itemToSave = [note object];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:itemToSave.asset presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = [NSURL fileURLWithPath:@"/path/to/Documents/video.mp4"];
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exportSession.status) {
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting...");
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export completed, wohooo!!");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Waiting...");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed with error: %@", exportSession.error);
                break;
        }
    }];
}
The console output of that code is:
Failed with error: Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x98916a0 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x99ddd10 "The operation couldn’t be completed. (OSStatus error -12109.)", NSLocalizedFailureReason=An unknown error occurred (-12109)}
2) AVAssetReader, AVAssetWriter approach
The code:
- (void)savePlayerItem:(AVPlayerItem *)item
{
    NSError *assetReaderError = nil;
    AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:item.asset error:&assetReaderError];
    // (algorithm continues)
}
That code throws an exception when trying to alloc/init the AVAssetReader with the following information:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetReader initWithAsset:error:] Cannot initialize an instance of AVAssetReader with an asset at non-local URL 'https://someserver.com/video1.mp4''
Any help is much appreciated.
The way I ended up managing to do this was to create the usual chain of AVPlayer > AVPlayerItem > AVURLAsset, but also to create an AVAssetExportSession right away, before the AVPlayer has completely loaded the video from the remote URL.
The AVAssetExportSession will actually export slowly while the AVPlayer pre-buffers the URL, regardless of whether the user starts playing the AVPlayer or not. However, the output file will not actually complete until the entire URL is pre-buffered, at which point the AVAssetExportSession calls back with the completion. This is also completely independent of whether the user has played, or finished playing, the video.
So by creating the AVPlayer/AVAssetExportSession combo early, it became my pre-buffer mechanism.
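A minimal sketch of that combo, assuming a remoteURL and a localFileURL of your choosing (the property names are mine):

```objc
// Create the playback chain and the export session at the same time.
// The export starts immediately and writes as the player buffers.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:remoteURL options:nil];
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
self.player = [AVPlayer playerWithPlayerItem:item];

AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
export.outputFileType = AVFileTypeMPEG4;
export.outputURL = localFileURL; // e.g. a file in the Caches directory
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        // The whole remote URL has been buffered and written to localFileURL;
        // next time, build the AVURLAsset from localFileURL instead.
    }
}];
self.prebufferExportSession = export; // keep a strong reference while exporting
```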
-- Update 2018-01-10 --
I want to add that, after deploying the above mechanism, we encountered one big caveat that I think deserves a mention.
You cannot have too many video pipelines (an AVPlayer with an AVPlayerItem/AVAsset hooked up) in memory at once; AVFoundation will refuse to play. So do use the above mechanism to pre-buffer, but once the video has been downloaded to a file, if the user is not viewing the video, dealloc the AVPlayer/AVAsset. Reallocate an AVPlayer when the user decides to play the video again, this time with the AVURLAsset pointed at your local buffered copy of the video.
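In other words, once the export completes the pipeline can be torn down and rebuilt on demand; a sketch (property names are mine):

```objc
// After the export completes and the user is not watching the video,
// release the whole pipeline so AVFoundation doesn't hit its instance limit.
self.player = nil; // also releases the AVPlayerItem and AVURLAsset
self.prebufferExportSession = nil;

// When the user plays the video again, rebuild against the local copy:
AVURLAsset *localAsset = [AVURLAsset URLAssetWithURL:localFileURL options:nil];
self.player = [AVPlayer playerWithPlayerItem:
                  [AVPlayerItem playerItemWithAsset:localAsset]];
```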
Related
I am working on a custom video player using AVPlayer. I load videos stored on the local file system in the app's Caches folder. I initialize the player like this:
self.playerItem = [[AVPlayerItem alloc] initWithURL:self.localURL];
[self.playerItem addObserver:self forKeyPath:NSStringFromSelector(@selector(status)) options:NSKeyValueObservingOptionInitial context:nil];
self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:self.playerItem];
[self.avPlayer addObserver:self forKeyPath:NSStringFromSelector(@selector(status)) options:NSKeyValueObservingOptionInitial context:nil];
This generally works fine. However I have frequent fails on the status of the AVPlayerItem with this error:
NSLocalizedDescription = "The operation could not be completed";
NSLocalizedFailureReason = "An unknown error occurred (-12983)";
NSUnderlyingError = "Error Domain=NSOSStatusErrorDomain Code=-12983
The strange thing is that the same URLs which fail sometimes work just shortly after and before such fails. I would say every 10th load attempt fails. I can't figure out what causes this or where to look for answers. A search for the error code turned up empty for me. Any help or pointers are highly appreciated.
After a lengthy hunt, I was able to track down the source of the issue. The problem was an undocumented limit on the number of AVPlayer items that can exist concurrently. If there are too many instances, videos can no longer be loaded and fail with the AVPlayerItem error -12983.
Other people seem to have run into the same issue as well: AVPlayerItemStatusFailed error when too many AVPlayer instances created. I solved the issue by reusing player instances and making sure that there are not too many active ones at the same time.
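A sketch of the reuse approach, assuming a single shared player property (the method and property names are mine); replaceCurrentItemWithPlayerItem: swaps the content without allocating a new playback pipeline:

```objc
// Reuse one AVPlayer per visible cell/screen instead of one per video.
// Only the AVPlayerItem is recreated; the underlying pipeline, which is
// what runs into the undocumented instance limit, is shared.
- (void)showVideoAtURL:(NSURL *)url
{
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    if (self.sharedPlayer == nil) {
        self.sharedPlayer = [AVPlayer playerWithPlayerItem:item];
    } else {
        [self.sharedPlayer replaceCurrentItemWithPlayerItem:item];
    }
    [self.sharedPlayer play];
}
```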
I am using AVPlayer to stream audio. Before beginning to play from a URL, I call
[self.currentlyLoadingAsset loadValuesAsynchronouslyForKeys:@[@"playable", @"duration"] completionHandler:^{
    [self evaluateAssetStatusForEpisode:episode asset:self.currentlyLoadingAsset];
}];
Only when that completionHandler executes will I reevaluate the item to see if it's ready to play.
But for one user, the completionHandler suddenly does not fire. He cannot stream audio from the web (but can still play downloaded tracks). Logging
AVKeyValueStatus status = [asset statusOfValueForKey:@"playable" error:&error];
I find that the status is continually AVKeyValueStatusLoading. It never updates, and statusOfValueForKey: never returns an error.
I have tried putting loadValuesAsynchronouslyForKeys on the main thread, removing the duration key, and making currentlyLoadingAsset a property to ensure it's not getting released.
While recording a video using AVFoundation's - (void)startRecordingToOutputFileURL:(NSURL*)outputFileURL recordingDelegate:(id<AVCaptureFileOutputRecordingDelegate>)delegate; method, if the video duration is more than 12 seconds, there is no audio track in the output file. Everything works fine if the video duration is less than 12 seconds...
The delegate method in which the output file URL is received is:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"AUDIO %@", [[[AVAsset assetWithURL:outputFileURL] tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]); // App crashes here...
    NSLog(@"VIDEO %@", [[AVAsset assetWithURL:outputFileURL] tracksWithMediaType:AVMediaTypeVideo]);
}
My app crashes for a video that is longer than 12 seconds with this error: *** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array
My guess is that AVCaptureMovieFileOutput has better support for QuickTime containers (.qt, .mov) than for .mp4, although the latter is the industry standard. For instance, when writing a movie file in fragments to an .mp4, something probably happens to the fragment table (sample table).
So you could either change the file format to .mov or turn off writing the file in fragments. See this question:
ios-8-ipad-avcapturemoviefileoutput-drops-loses-never-gets-audio-track-after
I spent almost a day solving this, and this is the solution that worked for me, with a lot of help from iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets audio track after 13 - 14 seconds of recording.
Just add this line
avCaptureMovieFileOutput.movieFragmentInterval = kCMTimeInvalid
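For context, a hedged sketch of where that line goes: the property must be set on the AVCaptureMovieFileOutput before recording starts (the capture session setup here is assumed):

```objc
// Disable fragmented movie writing; with the default ~10 s fragment interval,
// the audio track can get lost once the first fragment boundary is passed.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
movieOutput.movieFragmentInterval = kCMTimeInvalid;
[captureSession addOutput:movieOutput]; // captureSession: your configured AVCaptureSession
[movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
```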
I just changed the extension of the path the video is recorded to from .mp4 to .mov, and it worked...
I am using the MPMoviePlayerController to play an audio stream. To verify that there isn't a problem with playback, I set a movie playback error timer and I implement moviePreloadDidFinish. When moviePreloadDidFinish is called, I check the loadState for MPMovieLoadStatePlaythroughOK. If it is not called and my timer expires, I assume the download has failed.
- (void)moviePreloadDidFinish:(NSNotification *)notification
{
    if (self.moviePlayer.loadState & MPMovieLoadStatePlaythroughOK) {
        NSLog(@"The movie or mp3 finished loading and will now start playing");
        // Cancel the movie playback error timer.
    }
}
Occasionally, I do not receive this notification, yet audio keeps playing until my movie playback error timer expires (30 seconds). Does the absence of this moviePreloadDidFinish imply that the download of the audio stream is going to fail soon? If not, is there a better way to programmatically determine that there is a playback problem?
I'm working on a video editing application for iPhone/iPod touch. My app simply asks the user to choose one of the existing videos in the camera roll directory, then changes pixel values frame by frame and saves the result under a different name in the camera roll directory. Because video processing might take quite a long time, I really need to implement some way to resume a previously started session (i.e. when video processing reaches 30% of the total video length and the user presses the home button, or there is a phone call, then when my application is brought back to the foreground, processing should resume from 30%, not from the beginning).
The most important parts of my code (simplified a bit to be more readable):
AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:sourceAsset error:&error];
NSArray *videoTracks = [sourceAsset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
AVAssetReaderTrackOutput *assetReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:nil];
[assetReader addOutput:assetReaderOutput];
[mVideoWriter startWriting];
[mAssetReader startReading];
[mVideoWriter startSessionAtSourceTime:kCMTimeZero];
mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[mWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
    while (mWriterInput.readyForMoreMediaData) {
        CMSampleBufferRef nextBuffer = [mAssetReaderOutput copyNextSampleBuffer];
        if (nextBuffer) {
            // frame processing here
            CFRelease(nextBuffer); // copyNextSampleBuffer returns a retained buffer
        } // if
    } // while
}]; // block
// when done saving my video to camera roll directory
I'm listening to my app delegate's callback methods like applicationWillResignActive and applicationWillEnterForeground, but I'm not sure how to handle them in the proper way. What I've tried already:
1) [AVAssetWriter start/endSessionAtSourceTime]; unfortunately, stopping and restarting at the last frame's presentation time did not work for me
2) saving "by hand" every part of the movie that was processed and, when processing reaches 100%, merging all of them using AVMutableComposition; this solution, however, sometimes crashes in the dispatch queue
It would be really great if someone could give me some hints on how this should be done correctly...
I'm pretty sure AVAssetWriter can't append, so something along the lines of 2), saving the pieces then merging them, is probably the best solution if you must make your export restartable.
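A hedged sketch of that merge step with AVMutableComposition; `pieceURLs` (the saved segment files, in playback order) is an assumed input:

```objc
// Stitch the processed pieces back together end to end.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime cursor = kCMTimeZero;
NSError *error = nil;
for (NSURL *pieceURL in pieceURLs) {
    AVURLAsset *piece = [AVURLAsset URLAssetWithURL:pieceURL options:nil];
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, piece.duration);
    if (![composition insertTimeRange:range ofAsset:piece atTime:cursor error:&error]) {
        NSLog(@"Failed to append %@: %@", pieceURL, error);
        break;
    }
    cursor = CMTimeAdd(cursor, piece.duration);
}
// Export `composition` with an AVAssetExportSession to produce the final file.
```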
But first you have to resolve that crash.
Also, before you start creating hundreds of movie pieces, you should have a look at AVAssetWriter.movieFragmentInterval, as with careful management of presentation time stamps/durations you may be able to use it to minimise the number of pieces you have to merge.
Have you tried -[UIApplication beginBackgroundTaskWithExpirationHandler:]? This seems like a good place to request extra operating time in the background to finish the heavy lifting.
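A minimal sketch of that pattern (`finishProcessingWithCompletion:` is an illustrative method name standing in for your processing work):

```objc
// Ask for extra background time around the processing/export work.
__block UIBackgroundTaskIdentifier taskID =
    [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
        // Time is up: persist progress here so the session can resume later,
        // then end the task.
        [[UIApplication sharedApplication] endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    }];
[self finishProcessingWithCompletion:^{
    [[UIApplication sharedApplication] endBackgroundTask:taskID];
    taskID = UIBackgroundTaskInvalid;
}];
```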