Audio player progress iOS

I am playing an audio clip using AVAudioPlayer. Is there a way to find out how much time has been played, or how much time is left?
I want to animate a slider control in a view while the audio is being played. Is this possible?

You should read AVAudioPlayer's documentation. You'll find that the player has two properties that pertain to this: currentTime and duration.
audioPlayer.currentTime
audioPlayer.duration
Then, assuming you're leaving your slider's minimum and maximum values at their defaults (0 and 1), you can determine progress by simply dividing currentTime by duration.
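A minimal sketch of that approach, using a repeating NSTimer to drive the slider (the timer interval and property names are illustrative, not from the original post):

```objectivec
// Schedule when playback starts; invalidate when it stops.
self.progressTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                      target:self
                                                    selector:@selector(updateProgress)
                                                    userInfo:nil
                                                     repeats:YES];

- (void)updateProgress
{
    // Both properties are NSTimeInterval values in seconds,
    // so the quotient lands in the slider's default 0-1 range.
    self.slider.value = self.audioPlayer.currentTime / self.audioPlayer.duration;
}
```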

Related

UIProgressView progress completion at end of audio

I have a progress view that represents the playing progress of an audio track as it is played; 1.0 represents 100% completion of that progress view, of course. What I need is for the progress view's progress to increase gradually over the duration of the audio track, reaching 100% (progress = 1.0) exactly when the audio track finishes. I do have the duration value of the audio track as well. What I'm confused about is the math: how to gradually increase the progress view's progress based on the duration of the audio track.
Note: I will have an array of audio tracks that have different durations. Also, I am using AVPlayer to play these audio tracks.
Any help would be greatly appreciated. Thanks in advance.
After doing some more digging, I believe approaching it this way will work:
AVPlayerItem *currentItem = yourAVPlayer.currentItem;
CMTime duration = currentItem.duration; //total time
CMTime currentTime = currentItem.currentTime; //playing time
And then in the update method, convert the CMTime values to seconds first (a CMTime struct cannot be cast to CGFloat directly):
progressView.progress = (float)(CMTimeGetSeconds(currentTime) / CMTimeGetSeconds(duration));
Can someone please confirm if this will work as I am not at my workstation at the moment.
A simpler approach is to allow the maximum and minimum values of the progress view to be set. For example, in my app, I use a UISlider to show progress. When the user selects a track to play, I set the minimum and maximum values of the slider as follows:
_slider.minimumValue = 0;
_slider.maximumValue = itemDuration;
The progress value is then just the current time of the player.
[_slider setValue:currentPlayerTime animated:YES];
You do not say whether your progress view uses a UISlider, but even if it does not, you can probably use this approach.
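If you are driving such a slider from an AVPlayer rather than an AVAudioPlayer, a periodic time observer is a convenient way to get the update callbacks; a sketch under the same assumptions (the property names are illustrative):

```objectivec
// Ask the player for a callback roughly twice a second on the main queue.
CMTime interval = CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC);
__weak typeof(self) weakSelf = self;
self.timeObserver = [_player addPeriodicTimeObserverForInterval:interval
                                                          queue:dispatch_get_main_queue()
                                                     usingBlock:^(CMTime time) {
    [weakSelf.slider setValue:CMTimeGetSeconds(time) animated:YES];
}];
// When done: [_player removeTimeObserver:self.timeObserver];
```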

How do I get current playing time and total play time in AVQueuePlayer?

Is it possible to get the playing time and total play time in AVQueuePlayer? If yes, how can I do this?
What I have tried so far:
CMTime currentTime = [self.myAVQueuePlayerObject currentTime];
But it only returns the elapsed time of the currentItem in the queue.
What I want to achieve:
I want to get the current playback time with respect to all the items in the AVQueuePlayer. How can I do that?
Any Help is Highly Appreciated.
Since AVQueuePlayer inherits from AVPlayer...
Playing Time (note that currentTime returns a CMTime, which you need to convert to seconds):
float currentTime = CMTimeGetSeconds(queuePlayer.currentTime);
Total Play Time:
Use KVO to observe the events of AVQueuePlayer (similar to this), and calculate the total play time of your queue yourself.
For the total time of the currently playing item, use the duration of the current item (AVPlayer itself has no duration property):
float duration = CMTimeGetSeconds(queuePlayer.currentItem.duration);
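To get the total play time of the whole queue, one option is to sum the asset durations of every queued item yourself; a sketch (note that AVQueuePlayer drops items from its queue as they finish, so this only covers the remaining items, and it assumes the asset durations are already loaded):

```objectivec
// Sum of the durations of the items still in the queue, in seconds.
Float64 totalDuration = 0;
for (AVPlayerItem *item in queuePlayer.items) {
    totalDuration += CMTimeGetSeconds(item.asset.duration);
}
```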

Synchronize multiple AVPlayers

I am trying to find a solution for a problem I got. I have 5 UIViews, which are all at the same position. Each UIView holds an AVPlayer with different videos. To be more precise they are all the same video, but encoded with different playback speed.
Video 1x speed
Video 4x speed
Video 8x speed
Video 16x speed
Video 24x speed
By default video 1 is visible and playing, but I should be able to switch between the videos without the switch being visible to the user; therefore, I need to keep them synchronized. So if I am watching video 1 and switch to video 2, video 2 should play exactly at the position where video 1 stopped.
The idea is that it should look as though the video is speeding up after an action, e.g. a flick gesture.
I hope I described my issue well enough, and I am very thankful for any suggestion.
In the end I am using an observer that takes a snapshot of the currentTime every 5 seconds and calls seekToTime: on all the other AVPlayers. This works fine to keep them synchronized; I just needed to adapt the CMTime for each player's speed. As an example, here is the 4x video:
videoPosition = player1.currentTime; // gets the current playback position
videoPositionInSeconds = (Float64) videoPosition.value / videoPosition.timescale; // converts the CMTime into seconds
[player2 seekToTime: CMTimeMakeWithSeconds(videoPositionInSeconds / 4.0, player1.currentItem.asset.duration.timescale) toleranceBefore: kCMTimeZero toleranceAfter: kCMTimeZero];
Hope this helps.
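The 5-second snapshot observer described above can be set up with AVPlayer's periodic time observer API; a sketch assuming player1 is the 1x player and player2 the 4x one:

```objectivec
// Every 5 seconds of player1's timeline, re-align the 4x player.
CMTime interval = CMTimeMakeWithSeconds(5.0, NSEC_PER_SEC);
[player1 addPeriodicTimeObserverForInterval:interval
                                      queue:dispatch_get_main_queue()
                                 usingBlock:^(CMTime time) {
    Float64 seconds = CMTimeGetSeconds(time);
    // The 4x encode covers the same content in a quarter of the time.
    [player2 seekToTime:CMTimeMakeWithSeconds(seconds / 4.0, NSEC_PER_SEC)
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero];
}];
```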

AV Foundation reporting player item duration incorrectly

I am trying to play a video on iOS using AVPlayer that is encoded with zencoder. The problem I am seeing is that the duration that the player item reports is rounded / imprecise. For example, the video duration might be 173.134 and the player item will report it as a flat 174.0. This causes problems with detecting loaded percentage and other related things. If I try to use the video without encoding everything is reported correctly and precisely.
Has anyone else ever seen this or have a solution?
I had the same problem. I just check whether the current position is within 1 second of the item duration:
- (void)playing:(CMTime)time
{
    CMTime itemDuration = _player.currentItem.asset.duration;
    NSTimeInterval currentTime = CMTimeGetSeconds(time);
    NSTimeInterval duration = CMTimeGetSeconds(itemDuration);
    if (fabs(currentTime - duration) < 1) {
        // This is the end.
    }
}
The problem turned out to be with the source video / zencoder. The audio track was a slightly different length than the video track, which caused problems with the encoding. Cutting off the last second of the video so that the track durations matched up fixed the problem.

Reconciling CACurrentMediaTime() and deviceCurrentTime

I am trying to synchronize several CABasicAnimations with AVAudioPlayer. The issue I have is that CABasicAnimation uses CACurrentMediaTime() as a reference point when scheduling animations, while AVAudioPlayer uses deviceCurrentTime. Also, the animations use CFTimeInterval while the sound uses NSTimeInterval (not sure if they're "toll-free bridged" like other CF and NS types). I'm finding that the reference points are different as well.
Is there a way to ensure that the sounds and animations use the same reference point?
I don't know the "official" answer, but they are both double precision floating point numbers that measure a number of seconds from some reference time.
From the docs, it sounds like deviceCurrentTime is linked to the current audio session:
The time value, in seconds, of the audio output device. (read-only)
@property(readonly) NSTimeInterval deviceCurrentTime
Discussion
The value of this property increases monotonically while an audio player is playing or paused.
If more than one audio player is connected to the audio output device, device time continues incrementing as long as at least one of the players is playing or paused.
If the audio output device has no connected audio players that are either playing or paused, device time reverts to 0.
You should be able to start an audio output session, call CACurrentMediaTime() then get the deviceCurrentTime of your audio session in 2 sequential statements, then calculate an offset constant to convert between them. That offset would be accurate within a few nanoseconds.
The offset would only be valid while the audio output session is active. You'd have to recalculate it each time you remove all audio players from the audio session.
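A sketch of that offset calculation (assumes audioPlayer is an AVAudioPlayer and an audio session is already active, so deviceCurrentTime is advancing):

```objectivec
// Sample both clocks back to back; the difference converts one
// timeline into the other. Valid only while the session is active.
CFTimeInterval mediaNow = CACurrentMediaTime();
NSTimeInterval offset = audioPlayer.deviceCurrentTime - mediaNow;

// To line an animation up with audio scheduled at device time t,
// schedule the animation at CACurrentMediaTime()-based time t - offset.
```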
I think the official answer just changed, though currently under NDA.
See "What's New in Camera Capture", in particular the last few slides about the CMSync* functions.
https://developer.apple.com/videos/wwdc/2012/?id=520
