iOS - How can I get the playable duration of AVPlayer?

The MPMoviePlayerController has a property called playableDuration:
playableDuration — the amount of currently playable content (read-only).
@property (nonatomic, readonly) NSTimeInterval playableDuration
For progressively downloaded network content, this property reflects the amount of content that can be played now.
Is there something similar for AVPlayer?
I can't find anything in the Apple docs or on Google (not even here on Stack Overflow).
Thanks in advance.

playableDuration can be roughly implemented with the following procedure:
- (NSTimeInterval)playableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = _moviePlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        double startTime = CMTimeGetSeconds(aTimeRange.start);
        double loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
        // FIXME: should we sum up all sections to have a total playable duration,
        // or should we just use the first section as the whole?
        NSLog(@"get time range, its start is %f seconds, its duration is %f seconds.", startTime, loadedDuration);
        return (NSTimeInterval)(startTime + loadedDuration);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
_moviePlayer is your AVPlayer instance. By checking the AVPlayerItem's loadedTimeRanges, you can compute an estimated playableDuration.
For videos that have only one section, you can use this procedure as-is; for multi-section videos, you may want to check all the time ranges in loadedTimeRanges to get the correct answer, as in the sketch below.
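For the multi-section case, a minimal sketch that sums every loaded range (assuming the same _moviePlayer instance; whether you sum all ranges or only use the range around the current time depends on what your UI should show):
// Hedged sketch: total buffered duration across all loaded time ranges.
- (NSTimeInterval)totalBufferedDuration
{
    AVPlayerItem *item = _moviePlayer.currentItem;
    if (item.status != AVPlayerItemStatusReadyToPlay) {
        return 0.0;
    }
    NSTimeInterval total = 0.0;
    for (NSValue *value in item.loadedTimeRanges) {
        CMTimeRange range = [value CMTimeRangeValue];
        total += CMTimeGetSeconds(range.duration);
    }
    return total;
}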

All you need is:
self.player.currentItem.asset.duration
Simply the best.
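Note that asset.duration is the total length of the item (a CMTime), not the amount buffered so far; for display you would normally convert it to seconds. A small sketch, assuming the same self.player property:
// Total duration of the current item, in seconds (this is not the buffered amount).
Float64 totalSeconds = CMTimeGetSeconds(self.player.currentItem.asset.duration);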

Building on John's Answer…
This is the apparent default behavior of Apple's players: show the max time of the playable range that encloses the current time.
- (NSTimeInterval)currentItemPlayableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = self.audioPlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTime currentTime = self.audioPlayer.currentTime;
        __block CMTimeRange aTimeRange;
        [timeRangeArray enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            aTimeRange = [obj CMTimeRangeValue]; // use the enumerated value, not always index 0
            if (CMTimeRangeContainsTime(aTimeRange, currentTime))
                *stop = YES;
        }];
        CMTime maxTime = CMTimeRangeGetEnd(aTimeRange);
        return CMTimeGetSeconds(maxTime);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
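A usage sketch: to keep a buffered-progress indicator in sync, this method could be polled from a periodic time observer (assuming the same self.audioPlayer; the actual progress update is left as a comment):
// Hedged sketch: poll the playable duration every half second on the main queue.
// Keep the returned token if you later need to remove the observer.
__weak typeof(self) weakSelf = self;
[self.audioPlayer addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC)
                                               queue:dispatch_get_main_queue()
                                          usingBlock:^(CMTime time) {
    NSTimeInterval playable = [weakSelf currentItemPlayableDuration];
    // update a scrubber's buffered indicator here
    NSLog(@"playable up to %.1f seconds", playable);
}];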

You will have to detect when the AVPlayer is ready to play your media file (a KVO sketch for this follows the code below). Let me know if you don't know how to do this.
However, once the media file is loaded, you can use this method:
#import <AVFoundation/AVFoundation.h>

/**
 * Get the duration for the currently set AVPlayer's item.
 */
- (CMTime)playerItemDuration {
    AVPlayerItem *playerItem = [mPlayer currentItem];
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        return [[playerItem asset] duration];
    }
    return (kCMTimeInvalid);
}
When you use this method it's important to understand (because you're streaming content) that the duration value may be invalid, so you must check it before using it for further processing.
CMTime playerDuration = [self playerItemDuration];
if (CMTIME_IS_INVALID(playerDuration)) {
    return;
}
double duration = CMTimeGetSeconds(playerDuration);
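For the "detect when the AVPlayer is ready" part, a common approach is key-value observing the item's status. A minimal sketch, assuming the same mPlayer ivar as above:
// Somewhere in your setup, after creating the player item:
[mPlayer.currentItem addObserver:self
                      forKeyPath:@"status"
                         options:NSKeyValueObservingOptionNew
                         context:nil];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            // Now it is safe to read the duration.
            NSLog(@"duration: %f seconds", CMTimeGetSeconds([self playerItemDuration]));
        }
    }
}
// Remember to call removeObserver:forKeyPath: when you are done.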

Swift version of the closest playable duration:
var playableDuration: TimeInterval? {
    guard let currentItem = currentItem else { return nil }
    guard currentItem.status == .readyToPlay else { return nil }

    let timeRangeArray = currentItem.loadedTimeRanges
    let currentTime = self.currentTime()

    for value in timeRangeArray {
        let timeRange = value.timeRangeValue
        if CMTimeRangeContainsTime(timeRange, currentTime) {
            return CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange))
        }
    }

    guard let timeRange = timeRangeArray.first?.timeRangeValue else { return 0 }
    let startTime = CMTimeGetSeconds(timeRange.start)
    let loadedDuration = CMTimeGetSeconds(timeRange.duration)
    return startTime + loadedDuration
}

Related

Play HLS video clips from a URL in AVPlayer when multiple startTimes and endTimes are received in JSON for multiple clips of the same video

I calculated the duration to stop the clip at the received end time. The start time works fine, but it never stops at the end time.
- (CMTime)playerItemDuration
{
    AVPlayerItem *playerItem = self.currentVideoVC.player.currentItem;
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        CGFloat duration = [self.currentVideoVC.currentPlay.endTime floatValue] - [self.currentVideoVC.currentPlay.startTime floatValue];
        playerItem = nil;
        return CMTimeMakeWithSeconds(duration, 1);
    }
    return (kCMTimeInvalid);
}

- (void)seekToTimeValue:(CGFloat)timeInSeconds
{
    [self.currentVideoVC.player seekToTime:CMTimeMakeWithSeconds(timeInSeconds, NSEC_PER_SEC) toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
    [self.scrubberView updateCurrentTime:timeInSeconds];
}
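To actually stop playback at the received end time (rather than only computing a duration), one option is a boundary time observer. A hedged sketch, reusing the question's currentVideoVC and currentPlay properties; the _endTimeObserver ivar is hypothetical:
- (void)installEndTimeObserver
{
    CMTime endTime = CMTimeMakeWithSeconds([self.currentVideoVC.currentPlay.endTime floatValue], NSEC_PER_SEC);
    __weak typeof(self) weakSelf = self;
    _endTimeObserver = [self.currentVideoVC.player addBoundaryTimeObserverForTimes:@[[NSValue valueWithCMTime:endTime]]
                                                                             queue:dispatch_get_main_queue()
                                                                        usingBlock:^{
        // Pause exactly when the playhead crosses the end time.
        [weakSelf.currentVideoVC.player pause];
    }];
}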

How to save the position of the audio file? (Objective-C)

How can I save the time at which the audio was stopped in one session and continue playback from that point in the next session?
My code:
- (void)initPlayer:(NSString *)audioFile fileExtension:(NSString *)fileExtension
{
    NSURL *audioFileLocationURL = [[NSBundle mainBundle] URLForResource:audioFile withExtension:fileExtension];
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileLocationURL error:&error];
    if ([audioFile isEqualToString:@"2"]) {
        _index = 1;
    }
    else if ([audioFile isEqualToString:@"3"]) {
        _index = 2;
    }
    [self song];
}

- (void)playAudio {
    [self.audioPlayer play];
}

- (void)pauseAudio {
    [self.audioPlayer pause];
}

- (BOOL)isPlaying {
    return [self.audioPlayer isPlaying];
}

- (NSString *)timeFormat:(float)value {
    float minutes = floor(lroundf(value) / 60);
    float seconds = lroundf(value) - (minutes * 60);
    int roundedSeconds = lroundf(seconds);
    int roundedMinutes = lroundf(minutes);
    NSString *time = [[NSString alloc] initWithFormat:@"%d:%02d", roundedMinutes, roundedSeconds];
    return time;
}

- (void)setCurrentAudioTime:(float)value {
    [self.audioPlayer setCurrentTime:value];
}

- (NSTimeInterval)getCurrentAudioTime {
    return [self.audioPlayer currentTime];
}

- (float)getAudioDuration {
    return [self.audioPlayer duration];
}
You can use AVPlayer's currentTime property. It returns the playback time of the current AVPlayerItem.
To restore the playback time in the next session, you can pass the stored time to AVPlayer's seekToTime:
[self.player seekToTime:storedPlaybackTime];
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/doc/uid/TP40009530-CH1-SW2
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVPlayer_Class/index.html#//apple_ref/occ/instm/AVPlayer/seekToTime%3a
To persist the CMTime returned by currentTime, you can use the AVFoundation convenience methods provided by NSValue.
To wrap CMTime in an NSValue, use valueWithCMTime:
[NSValue valueWithCMTime:player.currentTime];
To get a CMTime struct back from the persisted value, use:
CMTime persistedTime = [storedValue CMTimeValue];
After you have wrapped the CMTime struct in an NSValue instance, you can use a keyed archiver and NSData to write the time to disk.
NSHipster has a good article about that topic: http://nshipster.com/nscoding/
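A minimal sketch of that round trip, as described above (the file path is arbitrary and error handling is omitted):
// Saving the current playback position:
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"playbackTime.plist"];
NSValue *timeValue = [NSValue valueWithCMTime:self.player.currentTime];
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:timeValue];
[data writeToFile:path atomically:YES];

// Restoring it in the next session:
NSData *stored = [NSData dataWithContentsOfFile:path];
NSValue *storedValue = [NSKeyedUnarchiver unarchiveObjectWithData:stored];
if (storedValue) {
    [self.player seekToTime:[storedValue CMTimeValue]];
}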
The easiest way would be to keep a local db with the song name: when playback is stopped, add that data to the db. Then, when playback resumes later, first check the local db for an entry from a past session; if there is none, continue from the start.
Also make sure that no entry is made if the song finishes.
Hope this idea helps you...
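A minimal sketch of that idea, using NSUserDefaults keyed by file name as the simplest form of local storage instead of a full database (method names are hypothetical):
// Call when playback stops; audioFile is the same name passed to initPlayer:.
- (void)saveStopPointForFile:(NSString *)audioFile
{
    [[NSUserDefaults standardUserDefaults] setDouble:self.audioPlayer.currentTime forKey:audioFile];
}

// Call before resuming playback in the next session (returns 0 if nothing was saved).
- (void)restoreStopPointForFile:(NSString *)audioFile
{
    self.audioPlayer.currentTime = [[NSUserDefaults standardUserDefaults] doubleForKey:audioFile];
}

// Call when the song finishes so playback starts from the beginning next time.
- (void)clearStopPointForFile:(NSString *)audioFile
{
    [[NSUserDefaults standardUserDefaults] removeObjectForKey:audioFile];
}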

Slow buffering when streaming multiple remote videos with AVPlayer and AVMutableComposition

[Edit: I was able to figure out a workaround for this, see below.]
I'm trying to stream multiple remote MP4 clips from S3, playing them in a sequence as one continuous video (to enable scrubbing within and between clips) with no stuttering, without explicitly downloading them to the device first. However, I find that the clips buffer very slowly (even on a fast network connection) and have been unable to find an adequate way to address that.
I've been trying to use AVPlayer for this, since AVPlayer with AVMutableComposition plays the supplied video tracks as one continuous track (unlike AVQueuePlayer, which I gather plays each video separately and thus doesn't support continuous scrubbing between the clips).
When I stick one of the assets directly into an AVPlayerItem and play that (with no AVMutableComposition), it buffers fast. But using AVMutableComposition, the video starts stuttering very badly on the second clip (my test case has 6 clips, each around 6 seconds), while the audio keeps going. After it plays through once, it plays perfectly smoothly if I rewind to the beginning, so I assume the problem lies in the buffering.
My current attempt to fix this problem feels convoluted, given that this seems like a rather basic use-case for AVPlayer - I do hope there's a simpler solution to all this that works properly. Somehow I doubt that the buffering player I use below is really necessary, but I'm running out of ideas.
Here's the main code that sets up the AVMutableComposition:
// Build an AVAsset for each of the source URIs
- (void)prepareAssetsForSources:(NSArray *)sources
{
    NSMutableArray *assets = [[NSMutableArray alloc] init];  // the assets to be used in the AVMutableComposition
    NSMutableArray *offsets = [[NSMutableArray alloc] init]; // for tracking buffering progress
    CMTime currentOffset = kCMTimeZero;
    for (NSDictionary *source in sources) {
        bool isNetwork = [RCTConvert BOOL:[source objectForKey:@"isNetwork"]];
        bool isAsset = [RCTConvert BOOL:[source objectForKey:@"isAsset"]];
        NSString *uri = [source objectForKey:@"uri"];
        NSString *type = [source objectForKey:@"type"];
        NSURL *url = isNetwork ?
            [NSURL URLWithString:uri] :
            [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:uri ofType:type]];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        currentOffset = CMTimeAdd(currentOffset, asset.duration);
        [assets addObject:asset];
        [offsets addObject:[NSNumber numberWithFloat:CMTimeGetSeconds(currentOffset)]];
    }
    _clipAssets = assets;
    _clipEndOffsets = offsets;
}
// Called with _clipAssets
- (AVPlayerItem *)playerItemForAssets:(NSMutableArray *)assets
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    for (AVAsset *asset in assets) {
        CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
        NSError *editError;
        [composition insertTimeRange:editRange
                             ofAsset:asset
                              atTime:composition.duration
                               error:&editError];
    }
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    return playerItem; // this is used to initialize the main player
}
My initial thought was: Since it buffers fast with a vanilla AVPlayerItem, why not maintain a separate buffering player that's loaded with each asset in turn (with no AVMutableComposition) to buffer the assets for the main player?
- (void)startBufferingClips
{
    _bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
                                automaticallyLoadedAssetKeys:@[@"tracks"]];
    _bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
    _currentlyBufferingIndex = 0;
}

// called every 250 msecs via an addPeriodicTimeObserverForInterval on the main player
- (void)updateBufferingProgress
{
    // If the playable (loaded) range is within 100 milliseconds of the clip
    // currently being buffered, load the next clip into the buffering player.
    float playableDuration = [[self calculateBufferedDuration] floatValue];
    CMTime totalDurationTime = [self playerItemDuration:_bufferingPlayer];
    Float64 totalDurationSeconds = CMTimeGetSeconds(totalDurationTime);
    bool bufferingComplete = totalDurationSeconds - playableDuration < 0.1;
    float bufferedSeconds = [self bufferedSeconds:playableDuration];
    float playerTimeSeconds = CMTimeGetSeconds([_player currentTime]);
    __block NSUInteger playingClipIndex = 0;
    // find the index of _player's currently playing clip
    [_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
        if (playerTimeSeconds < [offset floatValue]) {
            playingClipIndex = idx;
            *stop = YES;
        }
    }];
    // TODO: if bufferedSeconds - playerTimeSeconds <= 0, pause the main player
    if (bufferingComplete && _currentlyBufferingIndex < [_clipAssets count] - 1) {
        // We're done buffering this clip, load the buffering player with the next asset
        _currentlyBufferingIndex += 1;
        _bufferingPlayerItem = [AVPlayerItem playerItemWithAsset:_clipAssets[_currentlyBufferingIndex]
                                    automaticallyLoadedAssetKeys:@[@"tracks"]];
        _bufferingPlayer = [AVPlayer playerWithPlayerItem:_bufferingPlayerItem];
    }
}
- (float)bufferedSeconds:(float)playableDuration {
    __block float seconds = 0.0; // total duration of clips already buffered
    if (_currentlyBufferingIndex > 0) {
        [_clipEndOffsets enumerateObjectsUsingBlock:^(id offset, NSUInteger idx, BOOL *stop) {
            if (idx + 1 >= _currentlyBufferingIndex) {
                seconds = [offset floatValue];
                *stop = YES;
            }
        }];
    }
    return seconds + playableDuration;
}
- (NSNumber *)calculateBufferedDuration {
    AVPlayerItem *video = _bufferingPlayer.currentItem;
    if (video.status == AVPlayerItemStatusReadyToPlay) {
        // initialized to 0 so the "not yet set" check below doesn't read garbage
        __block float longestPlayableRangeSeconds = 0.0;
        [video.loadedTimeRanges enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
            CMTimeRange timeRange = [obj CMTimeRangeValue];
            float seconds = CMTimeGetSeconds(CMTimeRangeGetEnd(timeRange));
            if (seconds > 0.1) {
                if (!longestPlayableRangeSeconds) {
                    longestPlayableRangeSeconds = seconds;
                } else if (seconds > longestPlayableRangeSeconds) {
                    longestPlayableRangeSeconds = seconds;
                }
            }
        }];
        Float64 playableDuration = longestPlayableRangeSeconds;
        if (playableDuration && playableDuration > 0) {
            return [NSNumber numberWithFloat:longestPlayableRangeSeconds];
        }
    }
    return [NSNumber numberWithInteger:0];
}
It initially seemed that this worked like a charm, but then I switched to another set of test clips and then the buffering was very slow again (the buffering player helped, but not enough). It seems like the loadedTimeRanges for the assets as loaded into the buffering player didn't match the loadedTimeRanges for the same assets inside the AVMutableComposition: Even after the loadedTimeRanges for each item loaded into the buffering player indicated that the whole asset had been buffered, the main player's video continued stuttering (while the audio played seamlessly to the end). Again, the playback was seamless after rewinding once the main player had run through all the clips once.
I hope the answer to this, whatever it is, will prove useful as a starting point for other iOS developers trying to implement this basic use-case. Thanks!
Edit: Since I posted this question, I made the following workaround for this. Hopefully this will save whoever runs into this some headache.
What I ended up doing was maintaining two buffering players (both AVPlayers) that started buffering the first two clips, moving on to the lowest-indexed unbuffered clip after their loadedTimeRanges indicated that buffering for their current clip was done. I made the logic pause/unpause playback based on the clips currently buffered, and the loadedTimeRanges of the buffering players, plus a small margin. This needed a few bookkeeping variables, but wasn't too complicated.
This is how the buffering players were initialized (I'm omitting the bookkeeping logic here):
- (void)startBufferingClips
{
    _bufferingPlayerItemA = [AVPlayerItem playerItemWithAsset:_clipAssets[0]
                                 automaticallyLoadedAssetKeys:@[@"tracks"]];
    _bufferingPlayerA = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemA];
    _currentlyBufferingIndexA = [NSNumber numberWithInt:0];

    if ([_clipAssets count] > 1) {
        _bufferingPlayerItemB = [AVPlayerItem playerItemWithAsset:_clipAssets[1]
                                     automaticallyLoadedAssetKeys:@[@"tracks"]];
        _bufferingPlayerB = [AVPlayer playerWithPlayerItem:_bufferingPlayerItemB];
        _currentlyBufferingIndexB = [NSNumber numberWithInt:1];
        _nextIndexToBuffer = [NSNumber numberWithInt:2];
    } else {
        _nextIndexToBuffer = [NSNumber numberWithInt:1];
    }
}
In addition, I needed to make sure that the video and audio tracks weren't being merged as they were added to the AVMutableComposition, as this apparently interfered with the buffering (perhaps they didn't register as the same video/audio tracks as those the buffering players were loading, and thus didn't receive the new data). Here's the code where the AVMutableComposition is built from an array of AVAssets:
- (AVPlayerItem *)playerItemForAssets:(NSMutableArray *)assets
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                          preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                          preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime timeOffset = kCMTimeZero;
    for (AVAsset *asset in assets) {
        CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), asset.duration);
        NSError *editError;
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        if ([videoTracks count] > 0) {
            AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
            [compVideoTrack insertTimeRange:editRange
                                    ofTrack:videoTrack
                                     atTime:timeOffset
                                      error:&editError];
        }
        if ([audioTracks count] > 0) {
            AVAssetTrack *audioTrack = [audioTracks objectAtIndex:0];
            [compAudioTrack insertTimeRange:editRange
                                    ofTrack:audioTrack
                                     atTime:timeOffset
                                      error:&editError];
        }
        if ([videoTracks count] > 0 || [audioTracks count] > 0) {
            timeOffset = CMTimeAdd(timeOffset, asset.duration);
        }
    }
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    return playerItem;
}
With this approach, buffering while using AVMutableComposition for the main player works nicely and quickly, at least in my setup.

How to get video full duration and current playing time?

I need to create a custom video plugin using Swift, but I don't know how to get the video's full duration and current playing time. The only output that appeared in my console was C.CMTime, and I'm not sure what's wrong with my code.
My code
let url = NSBundle.mainBundle().URLForResource("Video", withExtension:"mp4")
let asset = AVURLAsset(URL:url, options:nil)
let duration: CMTime = asset.duration
println(duration)
You can use CMTimeGetSeconds to convert a CMTime to seconds:
let durationTime = CMTimeGetSeconds(duration)
Or, using the Objective-C approach:
- (NSTimeInterval)playableDuration
{
    // use loadedTimeRanges to compute playableDuration.
    AVPlayerItem *item = _moviePlayer.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        double startTime = CMTimeGetSeconds(aTimeRange.start);
        double loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
        // FIXME: should we sum up all sections to have a total playable duration,
        // or should we just use the first section as the whole?
        NSLog(@"get time range, its start is %f seconds, its duration is %f seconds.", startTime, loadedDuration);
        return (NSTimeInterval)(startTime + loadedDuration);
    }
    else
    {
        return (CMTimeGetSeconds(kCMTimeInvalid));
    }
}
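The question also asks for the current playing time; a minimal sketch for that, assuming an AVPlayer instance named _moviePlayer as in the method above, is to install a periodic time observer:
// Hedged sketch: log current time vs. total duration once per second.
AVPlayer *player = _moviePlayer;
[player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC)
                                     queue:dispatch_get_main_queue()
                                usingBlock:^(CMTime time) {
    Float64 current = CMTimeGetSeconds(time);
    Float64 total = CMTimeGetSeconds(player.currentItem.asset.duration);
    NSLog(@"playing %.1f of %.1f seconds", current, total);
}];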

how to get the remaining time of the song which is played in AVAudioPlayer

I am new to iPhone development. I am working on an audio player and have to show the current time and the remaining time of the song. The video player shows this by default, but the audio player does not, so I wrote some logic to get the current time of the song. The code below is for that:
int minutes = (int)audioPlayer.currentTime / 60;
int seconds = (int)audioPlayer.currentTime % 60;
startDurationLabel.text = [NSString stringWithFormat:@"%d:%02d", minutes, seconds];
Here audioPlayer is an instance of AVAudioPlayer and startDurationLabel is the label that displays the current time of the song.
But I am struggling to adapt this logic to show the remaining time of the song.
If anybody knows how, please help me...
Try this:
CGFloat remainingTime = audioPlayer.duration - audioPlayer.currentTime;
or, equivalently:
NSTimeInterval remaining = audioPlayer.duration - audioPlayer.currentTime;
Or, if you are using AVPlayer (here named _player), try this:
NSString *strTimeLeft = [self getTimeFromTimeInterval:CMTimeGetSeconds(_player.currentItem.duration) - CMTimeGetSeconds(_player.currentTime)];
Add this method to your class:
- (NSString *)getTimeFromTimeInterval:(NSTimeInterval)timeInterval
{
    NSInteger interval = (NSInteger)timeInterval;
    NSInteger seconds = interval % 60;
    NSInteger minutes = (interval / 60) % 60;
    //NSInteger hr = (interval / 3600) % 60;
    NSString *strTime = [NSString stringWithFormat:@"%02ld:%02ld", (long)minutes, (long)seconds];
    return strTime;
}
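A usage sketch for keeping a label updated once per second (assuming the audioPlayer from the question and the helper above; remainingTimeLabel is a hypothetical outlet):
// Somewhere in your setup, e.g. when playback starts:
[NSTimer scheduledTimerWithTimeInterval:1.0
                                 target:self
                               selector:@selector(updateRemainingTime)
                               userInfo:nil
                                repeats:YES];

- (void)updateRemainingTime
{
    NSTimeInterval remaining = self.audioPlayer.duration - self.audioPlayer.currentTime;
    self.remainingTimeLabel.text = [self getTimeFromTimeInterval:remaining];
}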
