How to show AVPlayer current play duration in UISlider - iOS

I am using a custom UISlider to implement scrubbing while playing a video in AVPlayer. I'm trying to figure out the best way to show the current play time and the remaining duration at the respective ends of the UISlider, as is usually shown in MPMoviePlayerController.
Any help is appreciated.

Swift 4.x
let player = AVPlayer()
let timeObserverToken = player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 1), queue: .main) { time in
    if let duration = player.currentItem?.duration {
        let duration = CMTimeGetSeconds(duration), time = CMTimeGetSeconds(time)
        let progress = time / duration
        if progress > targetProgress { // targetProgress: whatever threshold you track elsewhere
            print(progress)
            // Update slider value
        }
    }
}
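Note that addPeriodicTimeObserver(forInterval:queue:using:) returns an opaque token, and Apple's documentation requires you to pass it back to removeTimeObserver(_:) before the player is released. A minimal sketch, assuming the token above is kept in an optional stored property:
deinit {
    if let token = timeObserverToken {
        // Stop the periodic callbacks before the player goes away
        player.removeTimeObserver(token)
        timeObserverToken = nil
    }
}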
or
extension AVPlayer {
    func addProgressObserver(action: @escaping (Double) -> Void) -> Any {
        return self.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 1), queue: .main) { [weak self] time in
            if let duration = self?.currentItem?.duration {
                let duration = CMTimeGetSeconds(duration), time = CMTimeGetSeconds(time)
                let progress = time / duration
                action(progress)
            }
        }
    }
}
Use
let player = AVPlayer()
_ = player.addProgressObserver { progress in
    // Update slider value
}

Put a UILabel at each end. Update them using -[AVPlayer addPeriodicTimeObserverForInterval:queue:usingBlock:]. Compute the time remaining using -[AVPlayer currentTime] and -[AVPlayerItem duration].
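A minimal sketch of that approach, assuming elapsedLabel and remainingLabel are UILabels at either end of the slider, timeObserverToken is a stored property, and format(seconds:) is whatever mm:ss formatter you use (one appears later in this thread):
let interval = CMTime(value: 1, timescale: 2) // tick twice per second
timeObserverToken = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
    guard let self = self, let item = self.player.currentItem else { return }
    let elapsed = CMTimeGetSeconds(time)
    let duration = CMTimeGetSeconds(item.duration)
    guard duration.isFinite else { return } // duration is indefinite until the item is ready
    self.elapsedLabel.text = self.format(seconds: elapsed)
    self.remainingLabel.text = "-" + self.format(seconds: duration - elapsed)
    self.slider.value = Float(elapsed / duration)
}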

[slider setMaximumValue:CMTimeGetSeconds(player.currentItem.duration)];
// use a repeating NSTimer to call this:
- (void)updateValueSlide {
    [slider setValue:CMTimeGetSeconds(player.currentTime)];
}

Related

UISlider as audio seekbar jumping to maximum value when changed

I am using a UISlider as a seek bar for audio. It works great for changing position in the track when it is not animating, and when animated it tracks along the bar in time with the track perfectly; but if you try to adjust it while the animation is active, it jumps to the maximum value of the bar. I assume there is a conflict between the two processes, but I'm struggling to work out a fix.
func changeProgressBar() {
    let trackLength = Float(AudioService.shared.playerItem?.duration.seconds ?? 0)
    Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
        let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
        let sliderPosition = currentTime / (trackLength / 100)
        self.progressBar.setValue(sliderPosition, animated: true)
        print("the current time is", currentTime)
        print("the slider position is", sliderPosition)
    }
}
@IBAction func progressBarValueChanged(_ sender: UISlider) {
    let trackLength = AudioService.shared.playerItem?.duration.seconds ?? 0
    let sliderValueFloat = progressBar.value * 100.00
    let sliderValueDouble = Double(sliderValueFloat)
    let targetTime = trackLength / 100 * sliderValueDouble
    let targetTimeActual = CMTimeMake(value: Int64(targetTime), timescale: 1)
    AudioService.shared.player!.seek(to: targetTimeActual)
}
I have buttons that add or subtract 30 seconds to skip forward or back in the track and they work fine even when the animation is active.
@IBAction func plus30Secs(_ sender: UIButton) {
    let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
    let seekTime = currentTime + 30
    let seekTimeActual = CMTimeMake(value: Int64(seekTime), timescale: 1)
    AudioService.shared.player!.seek(to: seekTimeActual)
}
@IBAction func minus30Secs(_ sender: UIButton) {
    let currentTime = Float(AudioService.shared.player?.currentTime().seconds ?? 0)
    let seekTime = currentTime - 30
    let seekTimeActual = CMTimeMake(value: Int64(seekTime), timescale: 1)
    AudioService.shared.player!.seek(to: seekTimeActual)
}
OK, I have fixed it.
The issue was that I had progressBar.maximumValue = 100, meaning progressBar.value * 100.00 was producing a value 100 times what it should have been, and therefore a seek target beyond the end of the track. I removed the * 100.00 and now it works great.
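For reference, with that fix the handler reduces to something like this (assuming maximumValue stays at 100, so the slider value maps directly onto percent of track):
@IBAction func progressBarValueChanged(_ sender: UISlider) {
    let trackLength = AudioService.shared.playerItem?.duration.seconds ?? 0
    // sender.value is already 0...100; no extra scaling needed
    let targetTime = trackLength / 100 * Double(sender.value)
    AudioService.shared.player?.seek(to: CMTimeMake(value: Int64(targetTime), timescale: 1))
}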

Swift: Trying to control time in AVAudioPlayerNode using UISlider

I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound.
To get the current time of the player, I'm doing this:
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        if let nodeTime = self.lastRenderTime, let playerTime = self.playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / playerTime.sampleRate
        }
        return 0
    }
}
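Worth noting: lastRenderTime (and hence playerTime(forNodeTime:)) is nil whenever the engine isn't running, so this property reads 0 before play() and after stop(). A small usage sketch, assuming playerNode is the AVAudioPlayerNode, audioFile is the AVAudioFile being played, and slider is a 0...1 UISlider:
// Duration in seconds derived from the file itself
let duration = Double(audioFile.length) / audioFile.processingFormat.sampleRate
Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
    slider.value = Float(playerNode.currentTime / duration)
}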
I have a slider that indicates the current time of the audio. When the user changes the slider value, on the .ended event I have to change the current time of the player to the one indicated by the slider.
To do so:
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        if let nodetime = self.lastRenderTime {
            let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodetime)!
            let sampleRate = self.outputFormat(forBus: 0).sampleRate
            let newsampletime = AVAudioFramePosition(Int(sampleRate * Double(value)))
            let length = duration - value
            let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
            self.stop()
            if framestoplay > 1000 {
                self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
            }
        }
        self.play()
    }
}
However, my seekTo function is not working correctly (printing currentTime before and after the call always shows a negative value, ~ -0.02). What am I doing wrong, and is there a simpler way to change the currentTime of the player?
I ran into the same issue. Apparently framestoplay was always 0, which happened because of the sample rate: the value of playerTime.sampleRate was always 0 in my case.
So,
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
must be replaced with
let framestoplay = AVAudioFrameCount(Float(sampleRate) * length)
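Folding that fix back into the question's extension gives a corrected sketch (same assumptions as the original: value and duration are in seconds, and the segment is scheduled relative to the start of the file):
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        guard lastRenderTime != nil else { return }
        // Use the node's output sample rate throughout, not playerTime.sampleRate
        let sampleRate = outputFormat(forBus: 0).sampleRate
        let newSampleTime = AVAudioFramePosition(sampleRate * Double(value))
        let framesToPlay = AVAudioFrameCount(Float(sampleRate) * (duration - value))
        stop()
        if framesToPlay > 1000 {
            scheduleSegment(audioFile, startingFrame: newSampleTime, frameCount: framesToPlay, at: nil, completionHandler: nil)
        }
        play()
    }
}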

How to loop AVPlayer from 4 second to 8 second in swift 3?

I have an AVPlayer in Swift 3 that plays video. The problem is that I want to loop from second A to second B (for example, from 4 to 8). Here is my code for the loop, but it didn't work:
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: self.Player.currentItem, queue: nil, using: { (_) in
    DispatchQueue.main.async {
        self.Player.seek(to: kCMTimeZero)
        self.Player.play()
        DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 4.0) {
            // check if player is still playing
            if self.Player.rate != 0 {
                print("OK")
                print("Player reached 4.0 seconds")
                let timeScale = self.Player.currentItem?.asset.duration.timescale
                // let seconds = kCMTimeZero
                let time = CMTimeMakeWithSeconds(8.0, timeScale!)
                self.Player.seek(to: time, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
                self.Player.play()
            }
        }
    }
})
The problem is that this loop doesn't work, and because it is driven by AVPlayerItemDidPlayToEndTime, the print("OK") won't fire until the player has finished the whole movie.
There are a few options:
If you want gapless playback, you can start off by using one of Apple's looping approaches:
Pre iOS 10: https://developer.apple.com/library/content/samplecode/avloopplayer/Introduction/Intro.html
iOS 10+: https://developer.apple.com/documentation/avfoundation/avplayerlooper
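For the iOS 10+ route, AVPlayerLooper accepts a time range directly, so the question's 4-to-8-second loop can be written in a few lines (a minimal sketch; videoURL and the player-layer setup are assumed):
let queuePlayer = AVQueuePlayer()
let item = AVPlayerItem(url: videoURL)
let range = CMTimeRange(start: CMTime(seconds: 4, preferredTimescale: 600),
                        end: CMTime(seconds: 8, preferredTimescale: 600))
// Keep a strong reference to the looper, or looping stops
let looper = AVPlayerLooper(player: queuePlayer, templateItem: item, timeRange: range)
queuePlayer.play()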
The pre-iOS 10 "solution" from Apple does work, and is the only way I have gotten gapless looping since I target iOS 9.
If you are using that solution, you also need to either feed it an AVPlayerItem of the right length, or extend the solution to cut the asset up as you send it to the player.
For that, you can do something like my changes to Apple's code below (sorry if it's a bit sparse; I'm just trying to show the main changes): basically, pass in the track and the chunk of time to use, then make that an AVMutableCompositionTrack. (I got rid of all the stuff for video; you will want to keep that in.)
class myClass: someClass {
    var loopPlayer: QueuePlayerLooper!
    var avAssetLength: Int64!
    var avAssetTimescale: CMTimeScale!
    var avAssetTimeRange: CMTimeRange!
    let composition = AVMutableComposition()
    var playerItem: AVPlayerItem!
    var avAssetTrack: AVAssetTrack!
    var compAudioTrack: AVMutableCompositionTrack!
    var uurl: URL!
    var avAsset: AVURLAsset!

    func createCMTimeRange(start: TimeInterval, end: TimeInterval) -> CMTimeRange {
        avAssetTimescale = avAssetTrack.naturalTimeScale
        let a = CMTime(seconds: start, preferredTimescale: avAssetTimescale)
        let b = CMTime(seconds: end, preferredTimescale: avAssetTimescale)
        return CMTimeRange(start: a, end: b)
    }

    func startLoopingSection() {
        // a_playbackPosition / b_playbackPosition: your loop start/end in seconds
        loopPlayer = QueuePlayerLooper(videoURL: uurl, loopCount: -1, timeRange: createCMTimeRange(start: a_playbackPosition, end: b_playbackPosition))
        loopPlayer.start()
    }
}
//--==--==--==--==--==--==--==--==--
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
An object that uses AVQueuePlayer to loop a video.
*/
// Marked changed code with ++
class QueuePlayerLooper: NSObject, Looper {
    // MARK: Types
    private struct ObserverContexts {
        static var playerStatus = 0
        static var playerStatusKey = "status"
        static var currentItem = 0
        static var currentItemKey = "currentItem"
        static var currentItemStatus = 0
        static var currentItemStatusKey = "currentItem.status"
        static var urlAssetDurationKey = "duration"
        static var urlAssetPlayableKey = "playable"
    }

    // MARK: Properties
    private var player: AVQueuePlayer?
    private var playerLayer: AVPlayerLayer?
    private var isObserving = false
    private var numberOfTimesPlayed = 0
    private let numberOfTimesToPlay: Int
    private let videoURL: URL
    ++let composition = AVMutableComposition()
    ++var currentTrack: AVAssetTrack!
    ++var compositionTrack: AVMutableCompositionTrack!
    ++var assetTimeRange: CMTimeRange!

    // MARK: Looper
    required init(videoURL: URL, loopCount: Int, ++timeRange: CMTimeRange) {
        self.videoURL = videoURL
        self.numberOfTimesToPlay = loopCount
        ++self.assetTimeRange = timeRange
        super.init()
    }

    func start(in parentLayer: CALayer) {
        stop()
        player = AVQueuePlayer()
        playerLayer = AVPlayerLayer(player: player)
        guard let playerLayer = playerLayer else { fatalError("Error creating player layer") }
        playerLayer.frame = parentLayer.bounds
        parentLayer.addSublayer(playerLayer)

        let videoAsset = AVURLAsset(url: videoURL)
        ++compositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
        ++currentTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first
        ++try! compositionTrack.insertTimeRange(assetTimeRange, of: currentTrack, at: CMTimeMake(0, 1))

        videoAsset.loadValuesAsynchronously(forKeys: [ObserverContexts.urlAssetDurationKey, ObserverContexts.urlAssetPlayableKey]) {
            /*
            The asset invokes its completion handler on an arbitrary queue
            when loading is complete. Because we want to access our AVQueuePlayer
            in our ensuing set-up, we must dispatch our handler to the main
            queue.
            */
            DispatchQueue.main.async(execute: {
                var durationError: NSError?
                let durationStatus = videoAsset.statusOfValue(forKey: ObserverContexts.urlAssetDurationKey, error: &durationError)
                guard durationStatus == .loaded else { fatalError("Failed to load duration property with error: \(durationError)") }

                var playableError: NSError?
                let playableStatus = videoAsset.statusOfValue(forKey: ObserverContexts.urlAssetPlayableKey, error: &playableError)
                guard playableStatus == .loaded else { fatalError("Failed to read playable duration property with error: \(playableError)") }

                guard videoAsset.isPlayable else {
                    print("Can't loop since asset is not playable")
                    return
                }

                guard CMTimeCompare(videoAsset.duration, CMTime(value: 1, timescale: 100)) >= 0 else {
                    print("Can't loop since asset duration too short. Duration is \(CMTimeGetSeconds(videoAsset.duration)) seconds")
                    return
                }

                /*
                Based on the duration of the asset, we decide the number of player
                items to add to demonstrate gapless playback of the same asset.
                */
                let numberOfPlayerItems = (Int)(1.0 / CMTimeGetSeconds(videoAsset.duration)) + 2
                for _ in 1...numberOfPlayerItems {
                    let loopItem = AVPlayerItem(asset: ++self.composition)
                    self.player?.insert(loopItem, after: nil)
                }

                self.startObserving()
                self.numberOfTimesPlayed = 0
                self.player?.play()
            })
        }
    }
}
You can add a periodic time observer to monitor the current time:
let timeObserverToken = player.addPeriodicTimeObserver(forInterval: someInterval, queue: DispatchQueue.main) { [unowned self] time in
    let seconds = CMTimeGetSeconds(time)
    if seconds >= 8.0 {
        // jump back to 4 seconds
        self.player.seek(to: CMTimeMakeWithSeconds(4.0, 600), toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
    }
}
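Alternatively, addBoundaryTimeObserver(forTimes:queue:using:) fires exactly when playback crosses the 8-second mark rather than polling on an interval; a minimal sketch in the same Swift 3 style (boundaryObserverToken is an assumed stored property):
let boundary = NSValue(time: CMTimeMakeWithSeconds(8.0, 600))
boundaryObserverToken = player.addBoundaryTimeObserver(forTimes: [boundary], queue: DispatchQueue.main) { [weak self] in
    // Crossed 8s: jump back to 4s
    self?.player.seek(to: CMTimeMakeWithSeconds(4.0, 600), toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
}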

Play/Pause and Elapsed Time not updating in iOS command center properly

I have a video player that can play from the iOS command center and lock screen. When I toggle a play/pause button in my app, it should update the play/pause button in the command center (MPRemoteCommandCenter) by updating the nowPlayingInfo (MPNowPlayingInfoCenter). I'm not sure why it isn't updating.
For example, if I pause the video with a custom button in my app, the command center still shows the pause button (meaning it thinks the video is still playing, which is wrong).
This is how I update the nowPlayingInfo:
func updateMPNowPlayingInforCenterMetadata() {
    guard video != nil else {
        nowPlayingInfoCenter.nowPlayingInfo = nil
        return
    }
    var nowPlayingInfo = nowPlayingInfoCenter.nowPlayingInfo ?? [String: Any]()
    let image: UIImage
    if let placeholderLocalURL = video.placeholderLocalURL, let placeholderImage = UIImage(contentsOfFile: placeholderLocalURL.path) {
        image = placeholderImage
    } else {
        image = UIImage()
    }
    let artwork = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ -> UIImage in
        return image
    })
    nowPlayingInfo[MPMediaItemPropertyTitle] = video.title
    nowPlayingInfo[MPMediaItemPropertyAlbumTitle] = video.creator?.name ?? " "
    nowPlayingInfo[MPMediaItemPropertyArtwork] = artwork
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = Float(video.duration)
    nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Float(currentTime) // CMTimeGetSeconds(player.currentItem!.currentTime())
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = player.rate
    nowPlayingInfo[MPNowPlayingInfoPropertyDefaultPlaybackRate] = player.rate
    nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    if player.rate == 0.0 {
        state = .paused
    } else {
        state = .playing
    }
}
With KVO, when the player's rate changes, I call this function:
// MARK: - Key-Value Observing Method
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    guard context == &assetPlaybackManagerKVOContext else {
        super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        return
    }
    if keyPath == #keyPath(AVPlayer.rate) {
        updateMPNowPlayingInforCenterMetadata()
    }
}
Any thoughts?
UPDATE
I found a solution, though it's not perfect in my case. In my app I have two view controllers; call them FeedVC and PlayerVC. FeedVC has AVPlayers that are always playing but muted. If you tap one of them, PlayerVC is created and plays the full video. If I pause the AVPlayers in FeedVC before going to PlayerVC, the play/pause button in the NowPlayingInfoCenter works perfectly!
Is there a way to make this work without having to pause the videos in FeedVC?
Another issue is that the elapsed time keeps counting if I don't pause the players in FeedVC. It seems that when multiple players are playing, the play/pause button and elapsed time are incorrect.
When setting the dictionary for nowPlayingInfo, you will want to set the MPNowPlayingInfoPropertyPlaybackRate value appropriately. MPNowPlayingInfoCenter expects either 1.0 (playing) or 0.0 (not playing) as a Double wrapped in an NSNumber. Below you will find the code for how I'm setting the nowPlayingInfo in my project.
func setNowPlayingMediaWith(song: SUSong, currentTime: Double?) {
    var playingInfo: [String: Any] = [:]
    if let title = song.title {
        playingInfo[MPMediaItemPropertyTitle] = title
    }
    if let songID = song.id {
        playingInfo[MPMediaItemPropertyPersistentID] = songID
    }
    if let artist = song.artist, let artistName = artist.name {
        playingInfo[MPMediaItemPropertyArtist] = artistName
    }
    if let album = song.album, let albumTitle = album.title {
        var artwork: MPMediaItemArtwork? = nil
        if let album = song.album, let artworkData = album.artwork {
            artwork = MPMediaItemArtwork(boundsSize: Constants.Library.Albums.thumbSize, requestHandler: { (size) -> UIImage in
                return UIImage(data: artworkData as Data)!
            })
        }
        if artwork != nil {
            playingInfo[MPMediaItemPropertyArtwork] = artwork!
        }
        playingInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
    }
    var rate: Double = 0.0
    if let time = currentTime {
        playingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = time
        playingInfo[MPMediaItemPropertyPlaybackDuration] = song.duration
        rate = 1.0
    }
    playingInfo[MPNowPlayingInfoPropertyPlaybackRate] = NSNumber(value: rate)
    playingInfo[MPNowPlayingInfoPropertyMediaType] = NSNumber(value: MPNowPlayingInfoMediaType.audio.rawValue)
    MPNowPlayingInfoCenter.default().nowPlayingInfo = playingInfo
}
In this method I am passing the song that my player has currently loaded. Whenever the user chooses to play or pause, I call setNowPlayingMediaWith(song:currentTime:) to update the device's control console.
I keep track of currentTime (a Double) as a property of my player. If a currentTime is passed in, it means we are meant to be playing, so set MPNowPlayingInfoPropertyPlaybackRate to 1.0 and MPNowPlayingInfoPropertyElapsedPlaybackTime to currentTime. This sets the starting time in the device's control console, which then counts up automatically.
Likewise, if no currentTime is passed, we have stopped or paused. In that case we set MPNowPlayingInfoPropertyPlaybackRate to 0.0 and do not include MPNowPlayingInfoPropertyElapsedPlaybackTime.
Download my app to see this in action.
EDIT (answer to comments)
"Is there a way to make this work without having to pause the videos in the FeedVC?"
Without seeing your code it is difficult to give a definite answer. It would make sense, though, to pause any ongoing media before starting your PlayerVC's media.
"The elapsed time keeps counting"
The elapsed time counts on automatically, driven by a timer that NowPlayingInfoCenter runs internally. That timer stops only when it reaches the value you set for MPMediaItemPropertyPlaybackDuration, or when you update MPNowPlayingInfoPropertyElapsedPlaybackTime.
My suggestion is to write a method that "clears out" the NowPlayingInfoCenter: set all of the key values to either 0.0 or nil, as appropriate. Call this "clear out" method each time before you play your media in PlayerVC. Then, once you play from PlayerVC, set the NowPlayingInfoCenter key values as in the method I pasted above, with the new values for the new playing media.
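A sketch of what that clear-out might look like (a hypothetical helper, not an Apple API; simply setting nowPlayingInfo = nil is the bluntest version):
func clearNowPlayingInfo() {
    var playingInfo: [String: Any] = [:]
    // Zero out the keys that drive the play/pause state and the elapsed-time counter
    playingInfo[MPNowPlayingInfoPropertyPlaybackRate] = NSNumber(value: 0.0)
    playingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = NSNumber(value: 0.0)
    playingInfo[MPMediaItemPropertyPlaybackDuration] = NSNumber(value: 0.0)
    MPNowPlayingInfoCenter.default().nowPlayingInfo = playingInfo
}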

How do I get current playing time and total play time in AVPlayer?

Is it possible to get the playing time and total play time in AVPlayer? If yes, how can I do this?
You can access the currently playing item by using the currentItem property:
AVPlayerItem *currentItem = yourAVPlayer.currentItem;
Then you can easily get the requested time values
CMTime duration = currentItem.duration; //total time
CMTime currentTime = currentItem.currentTime; //playing time
Swift 5:
if let currentItem = player.currentItem {
    let duration = CMTimeGetSeconds(currentItem.duration)
    let currentTime = CMTimeGetSeconds(currentItem.currentTime())
    print("Duration: \(duration) s")
    print("Current time: \(currentTime) s")
}
_audioPlayer = [self playerWithAudio:_audio];
_observer = [_audioPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                                        queue:dispatch_get_main_queue()
                                                   usingBlock:^(CMTime time) {
    _progress = CMTimeGetSeconds(time);
}];
Swift 3
let currentTime: Double = player.currentItem?.currentTime().seconds ?? 0
You can get the seconds of your current time by accessing the seconds property of currentTime(). This returns a Double that represents the seconds in time. You can then use this value to construct a readable time to present to your user.
First, include a method to return the time variables for H:mm:ss that you will display to the user:
func getHoursMinutesSecondsFrom(seconds: Double) -> (hours: Int, minutes: Int, seconds: Int) {
    let secs = Int(seconds)
    let hours = secs / 3600
    let minutes = (secs % 3600) / 60
    let seconds = (secs % 3600) % 60
    return (hours, minutes, seconds)
}
Next, a method that will convert the values you retrieved above into a readable string:
func formatTimeFor(seconds: Double) -> String {
    let result = getHoursMinutesSecondsFrom(seconds: seconds)
    let hoursString = "\(result.hours)"
    var minutesString = "\(result.minutes)"
    if minutesString.characters.count == 1 {
        minutesString = "0\(result.minutes)"
    }
    var secondsString = "\(result.seconds)"
    if secondsString.characters.count == 1 {
        secondsString = "0\(result.seconds)"
    }
    var time = "\(hoursString):"
    if result.hours >= 1 {
        time.append("\(minutesString):\(secondsString)")
    } else {
        time = "\(minutesString):\(secondsString)"
    }
    return time
}
Now, update the UI with the previous calculations:
func updateTime() {
    // Access current item
    if let currentItem = player.currentItem {
        // Get the current time in seconds
        let playhead = currentItem.currentTime().seconds
        let duration = currentItem.duration.seconds
        // Format seconds for human readable string
        playheadLabel.text = formatTimeFor(seconds: playhead)
        durationLabel.text = formatTimeFor(seconds: duration)
    }
}
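One caveat to hedge against here: for streaming items, currentItem.duration stays indefinite until the item reaches .readyToPlay, and .seconds then comes back as NaN, so it is worth guarding before formatting:
let duration = currentItem.duration.seconds
guard duration.isFinite else { return } // indefinite until the item is ready to play
durationLabel.text = formatTimeFor(seconds: duration)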
With Swift 4.2, use this:
let currentPlayer = AVPlayer()
if let currentItem = currentPlayer.currentItem {
    let duration = currentItem.asset.duration
}
let currentTime = currentPlayer.currentTime()
Swift 4
self.playerItem = AVPlayerItem(url: videoUrl!)
self.player = AVPlayer(playerItem: self.playerItem)
self.player?.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, 1), queue: DispatchQueue.main, using: { (time) in
    if self.player!.currentItem?.status == .readyToPlay {
        let currentTime = CMTimeGetSeconds(self.player!.currentTime())
        let secs = Int(currentTime)
        self.timeLabel.text = NSString(format: "%02d:%02d", secs/60, secs%60) as String // "\(secs/60):\(secs%60)"
    }
})
AVPlayerItem *currentItem = player.currentItem;
NSTimeInterval currentTime = CMTimeGetSeconds(currentItem.currentTime);
NSLog(@"Capturing time: %f", currentTime);
Swift:
if let currentItem = yourAVPlayer.currentItem {
    let duration = currentItem.asset.duration
    let currentTime = currentItem.currentTime()
}
Swift 5:
Timer.scheduledTimer seems better than addPeriodicTimeObserver if you want a smooth progress bar.
static public var currenTime = 0.0
static public var currenTimeString = "00:00"

Timer.scheduledTimer(withTimeInterval: 1/60, repeats: true) { timer in
    if self.player!.currentItem?.status == .readyToPlay {
        let timeElapsed = CMTimeGetSeconds(self.player!.currentTime())
        let secs = Int(timeElapsed)
        Self.currenTime = timeElapsed
        Self.currenTimeString = NSString(format: "%02d:%02d", secs/60, secs%60) as String
        print("AudioPlayer TIME UPDATE: \(Self.currenTime) \(Self.currenTimeString)")
    }
}
Swift 4.2:
if let currentItem = yourAVPlayer.currentItem {
    let duration = currentItem.asset.duration
    let currentTime = currentItem.currentTime()
}
In Swift 5+:
You can query the player directly to find the current time of the actively playing AVPlayerItem.
The time is stored in a CMTime struct for ease of conversion to various scales, such as 10ths or 100ths of a second.
In most cases we need to represent times in seconds, so the following gives you what you want:
let currentTimeInSecs = CMTimeGetSeconds(player.currentTime())
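If you do need one of those other scales, CMTime converts directly; a small illustration (the timescale value is chosen arbitrarily):
let time = player.currentTime()
// Re-express the same instant in 100ths of a second
let hundredths = CMTimeConvertScale(time, timescale: 100, method: .default)
print("\(hundredths.value) hundredths of a second")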
