AVPlayer seekToTime: backward not working - iOS

I have the following code:
AVPlayerItem *currentItem = [AVPlayerItem playerItemWithURL:soundURL];
[self.audioPlayer replaceCurrentItemWithPlayerItem:currentItem];
[self.audioPlayer play];
where soundURL is a remote URL. It works fine: the AVPlayer plays the music perfectly. I have a progress bar, and I update it based on the current time of the player.
Everything works fine. My issue is that when I drag the progress bar forward, the audio player starts from the new location, but if I drag the progress bar backward, it doesn't start from the new location; it resumes from the previous location instead. Here is my progress bar drag start and stop code:
- (IBAction)progressBarDraggingStart:(id)sender
{
    if (self.audioPlayer.rate != 0.0)
    {
        [self.audioPlayer pause];
    }
}

- (IBAction)progressBarDraggindStop:(id)sender
{
    CMTime newTime = CMTimeMakeWithSeconds(self.progressBar.value, 1);
    [self.audioPlayer seekToTime:newTime];
    [self.audioPlayer play];
}
Can anyone help me fix this issue?

I suggest doing a couple of things. First, get the timescale value from the current item's asset and pass it to the CMTime struct. Second, use the seekToTime:toleranceBefore:toleranceAfter:completionHandler: method for more accurate seeking. For example, your code would look like this:
- (IBAction)progressBarDraggindStop:(id)sender {
    int32_t timeScale = self.audioPlayer.currentItem.asset.duration.timescale;
    [self.audioPlayer seekToTime:CMTimeMakeWithSeconds(self.progressBar.value, timeScale)
                 toleranceBefore:kCMTimeZero
                  toleranceAfter:kCMTimeZero
               completionHandler:^(BOOL finished) {
        [self.audioPlayer play];
    }];
}

I am using the code below for dragging. I added the completionHandler after @Corey's answer, and it works great without any web-service dependency:
- (void)sliderValueChanged:(id)sender {
    if ([sender isKindOfClass:[UISlider class]]) {
        UISlider *slider = sender;
        CMTime playerDuration = self.avPlayer.currentItem.duration;
        if (CMTIME_IS_INVALID(playerDuration)) {
            return;
        }
        double duration = CMTimeGetSeconds(playerDuration);
        if (isfinite(duration)) {
            float minValue = [slider minimumValue];
            float maxValue = [slider maximumValue];
            float value = [slider value];
            double time = duration * (value - minValue) / (maxValue - minValue);
            [self.avPlayer seekToTime:CMTimeMakeWithSeconds(time, NSEC_PER_SEC)
                      toleranceBefore:kCMTimeZero
                       toleranceAfter:kCMTimeZero
                    completionHandler:^(BOOL finished) {
                [self.avPlayer play];
            }];
        }
    }
}

Related

AVPlayer video playing method

I am working on an app in which I want to display the current playing time and the total time of a video. I already get the total time. Now, for showing the current playing time, which method gets called? Can anyone help? I am using AVPlayer. This is the code:
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self.avplayer pause];
    self.avplayer = [AVQueuePlayer playerWithURL:[NSURL URLWithString:@""]];
    self.avplayer = nil;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:NO];
    AVPlayerItem *currentItem = self.avplayer.currentItem;
    CMTime duration = currentItem.duration;        // total time
    CMTime currentTime = currentItem.currentTime;  // playing time
    NSUInteger durationSeconds = (long)CMTimeGetSeconds(duration);
    NSUInteger minutes = floor(durationSeconds % 3600 / 60);
    NSUInteger seconds = floor(durationSeconds % 3600 % 60);
    NSString *time = [NSString stringWithFormat:@"%02ld:%02ld", (unsigned long)minutes, (unsigned long)seconds];
    NSLog(@"Time|%@", time);
    lblTotaltime.text = time;
    NSUInteger durationSeconds1 = (long)CMTimeGetSeconds(currentTime);
    NSUInteger minutes1 = floor(durationSeconds1 % 3600 / 60);
    NSUInteger seconds1 = floor(durationSeconds1 % 3600 % 60);
    NSString *time1 = [NSString stringWithFormat:@"%02ld:%02ld", (unsigned long)minutes1, (unsigned long)seconds1];
    NSLog(@"Time|%@", time1);
    lblRemaningTime.text = time1;
}

#pragma mark PlayerMethods

- (void)itemDidFinishPlaying:(NSNotification *)notification {
    AVPlayerItem *player = [notification object];
    [player seekToTime:kCMTimeZero];
}

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    AVPlayerItem *p = [notification object];
    [p seekToTime:CMTimeMake(0, 3)];
}

- (void)playerStartPlaying
{
    [self.avplayer play];
}
You can get the current played time from the player's currentItem property, which is an AVPlayerItem:
AVPlayerItem *getcurrentItem = yourAVPlayerName.currentItem;
To get the total duration:
CMTime fullDuration = getcurrentItem.duration;
To get the current time:
CMTime playercurrentTime = getcurrentItem.currentTime;
Alternatively, in seconds:
NSTimeInterval playercurrentTime = CMTimeGetSeconds(getcurrentItem.currentTime);
NSLog(@"Current time of video: %f", playercurrentTime);

iOS: get audio duration from AVPlayer

I'm new to iOS. I have an app which contains an online audio player, and I need to get the total duration of the audio. I have tried a lot, but everything returns NaN or a duration of 0. What is the best way to get the total duration of the audio?
My code:
NSString *songUrl = @"http://9xmusiq.com/songs2/tamil/Kaatru%20Veliyidai/Azhagiye%20%5bStarmusiq.cc%5d.mp3";
AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL URLWithString:songUrl]];
AVPlayerItem *playerItem1 = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player1 = [AVPlayer playerWithPlayerItem:playerItem1];
AVPlayerLayer *playerLayer1 = [AVPlayerLayer playerLayerWithPlayer:player1];
playerLayer1.videoGravity = AVLayerVideoGravityResizeAspectFill;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    playerLayer1.frame = self.view.frame;
});
[self.view.layer insertSublayer:playerLayer1 atIndex:1];
[player1 play];
[playerItem1 addObserver:self forKeyPath:@"status" options:0 context:nil];
[playerItem1 addObserver:self forKeyPath:@"playbackBufferEmpty" options:0 context:nil];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([object isKindOfClass:[AVPlayerItem class]]) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        if ([keyPath isEqualToString:@"status"]) {
            switch (item.status) {
                case AVPlayerItemStatusFailed:
                    NSLog(@"player item status failed");
                    break;
                case AVPlayerItemStatusReadyToPlay:
                    NSLog(@"player item status is ready to play");
                    Float64 duration = CMTimeGetSeconds(self.avPlayer.currentItem.duration);
                    NSLog(@"Duration--> %f", duration); // NaN is returned here
                    break;
                case AVPlayerItemStatusUnknown:
                    NSLog(@"player item status is unknown");
                    break;
            }
        } else if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
            if (item.playbackBufferEmpty) {
                NSLog(@"player item playback buffer is empty");
            }
        }
    }
}
Thanks for your support, friends. I finally found the solution to my problem: instead of using AVPlayer I used AVAudioPlayer, which fixed the problem and gave me the audio duration.
NSString *resourcePath = @"http://9xmusiq.com/songs2/tamil/Kaatru%20Veliyidai/Azhagiye%20%5bStarmusiq.cc%5d.mp3"; // your url
NSData *_objectData = [NSData dataWithContentsOfURL:[NSURL URLWithString:resourcePath]];
NSError *error;
AVAudioPlayer *player1 = [[AVAudioPlayer alloc] initWithData:_objectData error:&error];
player1.numberOfLoops = 0;
player1.volume = 1.0f;
[player1 prepareToPlay];
NSLog(@"Total Duration : %f", player1.duration);
if (player1 == nil) {
    NSLog(@"%@", [error description]);
} else {
    [player1 play];
}
I'm not that familiar with AVPlayer, but in digging around in the docs it looks like the AVAsset (or in your case AVURLAsset) is the object that holds a duration.
Try querying the asset:
AVURLAsset *asset = [AVURLAsset assetWithURL:[NSURL URLWithString:songUrl]];
CMTime durationCMTime = asset.duration;
Float64 duration = CMTimeGetSeconds(durationCMTime);
NSLog(@"Duration of asset is %f", duration);
When your AVPlayer is ready to play (under the AVPlayerItemStatusReadyToPlay case), you can use:
CMTime duration = self.player.currentItem.asset.duration;
float seconds = CMTimeGetSeconds(duration);
You can access the duration of an AVPlayerItem's asset through its duration property. If you need precise seconds with decimals, use Float64 to receive the time from CMTimeGetSeconds. For regular use cases, an int should be sufficient.
CMTime duration = playerItem1.asset.duration;
int durationTotalSeconds = CMTimeGetSeconds(duration);
int durationHours = floor(durationTotalSeconds / 3600);
int durationMinutes = floor(durationTotalSeconds % 3600 / 60);
int durationSeconds = floor(durationTotalSeconds % 3600 % 60);
NSString *audioDurationString = [NSString stringWithFormat:@"%d:%d:%d", durationHours, durationMinutes, durationSeconds];
You can also read the duration from inside a periodic time observer, added with - (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(nullable dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block;
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) id obsever;

self.player = [[AVPlayer alloc] initWithURL:URL];
[self.player play];
self.obsever = [self.player addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(1.0, NSEC_PER_SEC) // e.g. a 1-second interval
                                                           queue:dispatch_get_main_queue()
                                                      usingBlock:^(CMTime time) {
    // You can get both values inside the block:
    Float64 currentSeconds = CMTimeGetSeconds(self.player.currentItem.currentTime);
    Float64 totalSeconds = CMTimeGetSeconds(self.player.currentItem.duration);
}];
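One addition worth noting: remove the observer when you no longer need it (for example in dealloc or when tearing down the player), otherwise the block keeps firing:
// Remove the periodic time observer when the player view goes away.
if (self.obsever) {
    [self.player removeTimeObserver:self.obsever];
    self.obsever = nil;
}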

How to make delays between AVPlayerItems in AVQueuePlayer in background task?

I am trying to make delays between AVPlayerItems, but I don't understand how to do this. I need to play sounds in the background (while the iOS app is in the background) with delays between the sounds, but AVQueuePlayer doesn't support this feature. When I try to pause the sound after it finishes, the next sound won't play. In the foreground, however, this algorithm works fine.
This is the code I am running in the background task:
AVQueuePlayer *player = [[AVQueuePlayer alloc] initWithItems:items];
[player play];
NSUInteger currentIndex = 0;
AVPlayerItem *lastItem = nil;
while (currentIndex < items.count && self.enabled) {
    AVPlayerItem *currentItem = [player currentItem];
    while (currentItem == lastItem || currentItem == nil) {
        [NSThread sleepForTimeInterval:0.2f];
        currentItem = [player currentItem];
    }
    while (currentItem.status != AVPlayerItemStatusReadyToPlay) {
        [NSThread sleepForTimeInterval:0.2f];
        NSLog(@"Loading...");
    }
    lastItem = currentItem;
    CGFloat time = 0.0f;
    CGFloat duration = 0.0f;
    while (duration < 0.5f) {
        [NSThread sleepForTimeInterval:0.5f];
        if (currentItem.duration.timescale > 0)
            duration = (double)currentItem.duration.value / (double)currentItem.duration.timescale;
    }
    while (time < duration - 0.1f) {
        [NSThread sleepForTimeInterval:0.08f];
        time = (double)currentItem.currentTime.value / (double)currentItem.currentTime.timescale;
    }
    [player pause];
    CGFloat delay = someDelayBetweenSounds;
    [NSThread sleepForTimeInterval:delay];
    [player play];
    currentIndex++;
}
My English is very bad, therefore I apologize
When an item finishes, you can pause the player and start it up again after a given delay. It's much better to do it this way than blocking an entire thread.
- (void)AVPlayerItemDidPlayToEndTimeNotification:(NSNotification *)notification {
    self.avQueuePlayer.rate = 0;
    CGFloat delayInMilliseconds = 500;
    dispatch_time_t when = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInMilliseconds * NSEC_PER_MSEC));
    dispatch_after(when, dispatch_get_main_queue(), ^{
        self.avQueuePlayer.rate = 1;
    });
}
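For completeness, this handler is only called if you register for the notification somewhere, for example right after creating the queue player. A minimal sketch, assuming the handler name used above:
// Sketch: register so AVPlayerItemDidPlayToEndTimeNotification: is called when each item ends.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(AVPlayerItemDidPlayToEndTimeNotification:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:nil];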

Why am I getting a progress slider exception error?

I am making an application which will play audio. I have used a slider to show the time elapsed and time remaining graphically, but I can't figure out how to get the value programmatically. I have used the following code, but it throws an exception.
- (void)updateMyProgress
{
    float progress = [avPlayer currentTime] / [avPlayer duration];
    self.myProgressView.progress = progress;
}
The code in my ViewController.m file is:
#import "ViewController.h"
#interface ViewController ()
{
AVAudioPlayer *avPlayer;
AVPlayer *avp;
}
#end
#implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
NSString *stringPath = [[NSBundle mainBundle]pathForResource:#"naat" ofType:#"mp3"];
NSURL *url = [NSURL fileURLWithPath:stringPath];
NSError *error;
avPlayer = [[AVAudioPlayer alloc]initWithContentsOfURL:url error:&error];
[avPlayer setNumberOfLoops:2];
[avPlayer setVolume:self.sliderVolumeOutlet.value];
[NSTimer scheduledTimerWithTimeInterval:1 target:self selector:#selector(updateMyProgress) userInfo:nil repeats:YES];
}
/*-(void) updateMyProgress
{
CGFloat progress = [avPlayer currentTime]/[avPlayer duration];
self.myProgressView.progress = progress;
}
*/
-(void) updateMyProgress
{
AVPlayerItem *currentItem = avp.currentItem ;
CMTime duration = currentItem.duration; //total time
CMTime currentTime = currentItem.currentTime; //playing time
CGFloat progress = currentTime.value/duration.value;
self.myProgressView.progress = progress;
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
- (IBAction)sliderVolumeAction:(id)sender
{
UISlider *myslider= sender;
[avPlayer setVolume:myslider.value];
}
- (IBAction)stopButton:(id)sender
{
[avPlayer stop];
[avPlayer setCurrentTime:0];
}
- (IBAction)pauseButton:(id)sender
{
[avPlayer pause];
}
- (IBAction)playButton:(id)sender
{
[avPlayer play];
}
#end![enter image description here][1]
A screenshot of the exception is here: http://i.stack.imgur.com/WsxtI.png
AVPlayer has no property called duration; duration is a property of AVPlayerItem. Both duration and currentTime are of type CMTime, so you have to convert the CMTime values to seconds before computing the progress:
- (void)updateMyProgress
{
    AVPlayerItem *currentItem = avPlayer.currentItem; // assuming avPlayer is your AVPlayer instance
    CMTime duration = currentItem.duration;        // total time
    CMTime currentTime = currentItem.currentTime;  // playing time
    CGFloat progress = CMTimeGetSeconds(currentTime) / CMTimeGetSeconds(duration);
    self.myProgressView.progress = progress;
}

Scrubber (UISlider) in AVPlayer?

When you play a remote video via AVPlayer and start scrubbing, the scrubber is buggy.
I'm building a player based on this Apple example.
How can I implement it smoothly?
Code from my project follows - https://github.com/nullproduction/Player
- (void)initScrubberTimer
{
    double interval = .1f;
    CMTime playerDuration = [self playerItemDuration];
    if (CMTIME_IS_INVALID(playerDuration))
    {
        return;
    }
    double duration = CMTimeGetSeconds(playerDuration);
    if (isfinite(duration))
    {
        CGFloat width = CGRectGetWidth([scrubberSlider bounds]);
        interval = 0.5f * duration / width;
    }
    __weak id weakSelf = self;
    CMTime intervalSeconds = CMTimeMakeWithSeconds(interval, NSEC_PER_SEC);
    mTimeObserver = [self.player addPeriodicTimeObserverForInterval:intervalSeconds
                                                               queue:dispatch_get_main_queue()
                                                          usingBlock:^(CMTime time) {
        [weakSelf syncScrubber];
    }];
}

- (void)syncScrubber
{
    CMTime playerDuration = [self playerItemDuration];
    if (CMTIME_IS_INVALID(playerDuration))
    {
        scrubberSlider.minimumValue = 0.0;
        return;
    }
    double duration = CMTimeGetSeconds(playerDuration);
    if (isfinite(duration))
    {
        float minValue = [scrubberSlider minimumValue];
        float maxValue = [scrubberSlider maximumValue];
        double time = CMTimeGetSeconds([self.player currentTime]);
        [scrubberSlider setValue:(maxValue - minValue) * time / duration + minValue];
    }
}

- (IBAction)beginScrubbing:(id)sender
{
    mRestoreAfterScrubbingRate = [self.player rate];
    [self.player setRate:0.f];
    [self removePlayerTimeObserver];
}

- (IBAction)scrub:(id)sender
{
    if ([sender isKindOfClass:[UISlider class]])
    {
        UISlider *slider = sender;
        CMTime playerDuration = [self playerItemDuration];
        if (CMTIME_IS_INVALID(playerDuration))
        {
            return;
        }
        double duration = CMTimeGetSeconds(playerDuration);
        if (isfinite(duration))
        {
            float minValue = [slider minimumValue];
            float maxValue = [slider maximumValue];
            float value = [slider value];
            double time = duration * (value - minValue) / (maxValue - minValue);
            [self.player seekToTime:CMTimeMakeWithSeconds(time, NSEC_PER_SEC)];
        }
    }
}

- (IBAction)endScrubbing:(id)sender
{
    if (!mTimeObserver)
    {
        CMTime playerDuration = [self playerItemDuration];
        if (CMTIME_IS_INVALID(playerDuration))
        {
            return;
        }
        double duration = CMTimeGetSeconds(playerDuration);
        if (isfinite(duration))
        {
            CGFloat width = CGRectGetWidth([scrubberSlider bounds]);
            double tolerance = 0.5f * duration / width;
            __weak id weakSelf = self;
            CMTime intervalSeconds = CMTimeMakeWithSeconds(tolerance, NSEC_PER_SEC);
            mTimeObserver = [self.player addPeriodicTimeObserverForInterval:intervalSeconds
                                                                       queue:dispatch_get_main_queue()
                                                                  usingBlock:^(CMTime time) {
                [weakSelf syncScrubber];
            }];
        }
    }
    if (mRestoreAfterScrubbingRate)
    {
        [self.player setRate:mRestoreAfterScrubbingRate];
        mRestoreAfterScrubbingRate = 0.f;
    }
}
I guess the problem is that your scrubber is still being updated from the video while you are using the seek bar. Implement it so that the player is paused during scrubbing, and you won't have this bug anymore. Check out my solution:
The function to update your player:
- (IBAction)seekbarAction:(UISlider *)sender {
    CMTime videoLength = playerItem1.duration; // the video duration
    float videoLengthInSeconds = (float)videoLength.value / videoLength.timescale; // convert the CMTime duration into seconds
    [player1 seekToTime:CMTimeMakeWithSeconds(videoLengthInSeconds * sender.value, NSEC_PER_SEC)];
}
And another seekbar action with "Touch down" in order to pause the video:
- (IBAction)pauseSeek:(id)sender {
    [player1 pause];
}
And another seekbar action with "Touch up" in order to resume the video when you release the scrubber. Hope this helps.
