Display Timecode for video track using Swift AVPlayer - ios

I am working on a little app to navigate and play videos that include framerate and initial timecode information (e.g. "TimeCode": "09:25:15:08"). I am using AVKit's AVPlayerViewController to display my video, and I would like to add a UILabel displaying the current timecode.
I'm all set on adding custom UI elements, but I'm lost on how to calculate the timecode and make it update on every frame the video plays.
I have read about AVFoundation's timecode support with AVAssetWriter and AVAssetReader, but I'm not sure I have understood it.
Any explanations, guidance, or content to look at would be really appreciated.
UPDATE:
After thinking about it for a while, I thought I could use the frame count to build my own timecode reference.
Note: item is the AVPlayer.
Using var totalTime = CMTimeGetSeconds(item.currentItem.asset.duration) I can get the total length in seconds of the video track, and currentTime = CMTimeGetSeconds(item.currentItem.currentTime()) to get its current time position.
Then I can do var fps = item.currentItem.asset.tracks[0].nominalFrameRate to get the framerate, and use this variable to multiply totalTime and currentTime to get the total frame count as well as the current frame.
With this, I'm considering building a pre-normalized array of the time in seconds for each frame, derived from the total frame count. That way I would know exactly which frame corresponds to a given timestamp.
I have never had to work with timecode or dates, so if someone has an idea about how to do this, I would appreciate the help.
Basically, a timecode looks like this: HH:MM:SS:FF (FF being the current frame).
If fps = 24, then every 24 frames one second is added to SS, and so on.
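That rollover rule can be sketched as a small pure function. This is just a sketch of the arithmetic (the helper name is mine), assuming an integral frame index and a whole-number, non-drop-frame rate:

```swift
import Foundation

// Convert an absolute frame index to an HH:MM:SS:FF timecode string.
// Assumes a non-drop-frame, whole-number frame rate (e.g. 24 fps).
func timecode(forFrame frame: Int, fps: Int) -> String {
    let ff = frame % fps             // frames left over after the last whole second
    let totalSeconds = frame / fps   // whole seconds elapsed
    let ss = totalSeconds % 60
    let mm = (totalSeconds / 60) % 60
    let hh = totalSeconds / 3600
    return String(format: "%02d:%02d:%02d:%02d", hh, mm, ss, ff)
}

print(timecode(forFrame: 25, fps: 24))  // "00:00:01:01"
```

At 24 fps, frame 25 is one second plus one frame, so the seconds column rolls over exactly as described above.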
TEST CASE:
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
    var player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let url = NSURL(string: "http://km.support.apple.com/library/APPLE/APPLECARE_ALLGEOS/HT1211/sample_iTunes.mov")!
        player = AVPlayer(URL: url)
        let playerController = AVPlayerViewController()
        playerController.player = player
        self.addChildViewController(playerController)
        self.view.addSubview(playerController.view)
        playerController.view.frame = self.view.frame

        // Debug button
        let btn = UIButton()
        btn.frame = CGRect(x: 10, y: 50, width: 100, height: 30)
        btn.setTitle("FPS", forState: .Normal)
        btn.addTarget(self, action: "buttonTapAction:", forControlEvents: UIControlEvents.TouchUpInside)
        playerController.view.addSubview(btn)

        player.play()
        // getFps(player)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func getFps(item: AVPlayer) {
        let fps = item.currentItem.asset.tracks[0].nominalFrameRate
        println("FPS: \(fps)")
        let timeRange = Float(CMTimeGetSeconds(item.currentItem.asset.duration))
        println("time range: \(timeRange)")
        let frameCount = timeRange * fps
        println("total frames: \(frameCount)")
        let timeIs = Float(CMTimeGetSeconds(item.currentItem.currentTime()))
        let frameIs = timeIs * fps
        println("current time: \(timeIs)")
        println("current frame: \(frameIs)")
    }

    func buttonTapAction(sender: UIButton!) {
        getFps(player)
    }
}
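For the per-frame label update the question asks about, AVPlayer's addPeriodicTimeObserver(forInterval:queue:using:) can fire a block at roughly the frame interval. A sketch in modern Swift; the label parameter and the formatTimecode helper are assumptions for illustration, not part of the code above:

```swift
import AVFoundation
import UIKit

// Sketch: refresh a label from the player clock roughly once per frame.
// Keep the returned token and pass it to player.removeTimeObserver(_:) when done.
func installTimecodeObserver(on player: AVPlayer, fps: Double, label: UILabel) -> Any {
    let interval = CMTimeMakeWithSeconds(1.0 / fps, preferredTimescale: Int32(NSEC_PER_SEC))
    return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
        // Derive the current frame index from the playhead time.
        let frame = Int(CMTimeGetSeconds(time) * fps)
        label.text = formatTimecode(frame: frame, fps: fps)   // hypothetical helper
    }
}
```

Note that periodic observers are driven by the player clock, not the display, so the callback cadence is approximate rather than frame-locked.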

Here's how I recently did it with Swift 3:
func formatTimecode(frame: Int, fps: Double) -> String {
    let FF = Int(Double(frame).truncatingRemainder(dividingBy: fps))
    let seconds = Int(Double(frame - FF) / fps)
    let SS = seconds % 60
    let MM = (seconds % 3600) / 60
    let HH = seconds / 3600
    let timecode = [String(format: "%02d", HH),
                    String(format: "%02d", MM),
                    String(format: "%02d", SS),
                    String(format: "%02d", FF)]
    return timecode.joined(separator: ":")
}

OK, so here is how I solved my problem and built the timecode from the variables I had. The h, m, s and f parameters are the starting offsets taken from the clip's initial timecode:
func pad(fps: Float, currentFrame: Float, h: Float = 0, m: Float = 0, s: Float = 0, f: Float = 0) -> String {
    let frame = currentFrame + f
    let ff = frame % fps
    let seconds = s + (frame - ff) / fps
    let ss = seconds % 60
    let minutes = m + (seconds - ss) / 60
    let mm = minutes % 60
    let hh = h + (minutes - mm) / 60
    return "\(showTwoDigits(hh)):\(showTwoDigits(mm)):\(showTwoDigits(ss)):\(showTwoDigits(ff))"
}
func showTwoDigits(number: Float) -> String {
    let string = "00" + String(format: "%.f", number)
    let range = Range(start: advance(string.endIndex, -2), end: string.endIndex)
    return string.substringWithRange(range)
}
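To supply those h/m/s/f starting offsets, the clip's initial timecode string (e.g. the "TimeCode": "09:25:15:08" value from the question) can be split on its colons. A minimal sketch in modern Swift; the function name is mine:

```swift
import Foundation

// Parse "HH:MM:SS:FF" into numeric components; returns nil on malformed input.
func parseTimecode(_ string: String) -> (h: Int, m: Int, s: Int, f: Int)? {
    let parts = string.split(separator: ":").compactMap { Int($0) }
    guard parts.count == 4 else { return nil }
    return (parts[0], parts[1], parts[2], parts[3])
}

// Example: the initial timecode from the question.
let start = parseTimecode("09:25:15:08")   // (h: 9, m: 25, s: 15, f: 8)
```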

Related

Swift - How to get the current position of AVAudioPlayerNode while it's looping?

I have an AVAudioPlayerNode looping a segment of a song:
audioPlayer.scheduleBuffer(segment, at: nil, options:.loops)
I want to get the current position of the song while it's playing. Usually this is calculated as currentFrame / audioSampleRate, where:
var currentFrame: AVAudioFramePosition {
    guard let lastRenderTime = audioPlayer.lastRenderTime,
          let playerTime = audioPlayer.playerTime(forNodeTime: lastRenderTime) else {
        return 0
    }
    return playerTime.sampleTime
}
However, when the loop ends and restarts, currentFrame does not reset. It keeps increasing, which makes currentFrame / audioSampleRate incorrect as the current position.
So what is the correct way to calculate the current position?
Good old modulo will do the job:
public var currentTime: TimeInterval {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return 0
    }
    let time = (Double(playerTime.sampleTime) / playerTime.sampleRate)
        .truncatingRemainder(dividingBy: Double(file.length) / Double(playerTime.sampleRate))
    return time
}
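The arithmetic itself is easy to check in isolation. A sketch with made-up numbers (a 4-second loop, 10 s of absolute playback):

```swift
import Foundation

// Position within a looping segment: absolute elapsed time modulo loop length.
func loopPosition(elapsedSeconds: Double, loopLengthSeconds: Double) -> Double {
    return elapsedSeconds.truncatingRemainder(dividingBy: loopLengthSeconds)
}

print(loopPosition(elapsedSeconds: 10.0, loopLengthSeconds: 4.0))  // 2.0
```

After two full loops of a 4 s segment, 10 s of absolute playback lands 2 s into the third pass.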

Equivalent of getBufferedPosition() in exoplayer for iOS AVPlayer

I am currently developing an AVPlayer app that calculates HLS streaming metrics, and I want to get the buffer level for the current item.
private var availableDuration: Double {
    guard let timeRange = player.currentItem?.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let startSeconds = timeRange.start.seconds
    let durationSeconds = timeRange.duration.seconds
    return startSeconds + durationSeconds
}
I am a little confused by the terminology used in Apple's documentation. Here I am getting the availableDuration of the current item, but I am not sure whether this represents the buffer level of the current item.
Your code seems OK; I used the same approach:
var bufferInSeconds: Double {
    guard let range = self.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let sec = range.start.seconds + range.duration.seconds
    return sec >= 0 ? sec : 0
}

mach_wait_until() Strange Behavior on iPad

I created a simple project to test out the functionality of mach_wait_until(). This code gives me an accurate printout of how precise the 1 second delay is. The console printout is virtually identical and extremely precise on both the iOS Simulator and on my iPad Air 2. However, on my iPad there is a HUGE delay, where the same 1 second delay takes about 100 seconds! And to add to the weirdness of it, the printout in the console says it only takes 1 second (with extremely low jitter and/or lag).
How can this be? Is there some timing conversion that I need to do for a physical iOS device when using mach_wait_until()?
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        playNoteTest()
    }

    var start = mach_absolute_time()
    var end = mach_absolute_time()

    func playNoteTest() {
        let when = mach_absolute_time() + 1000000000
        self.start = mach_absolute_time()
        mach_wait_until(when)
        self.end = mach_absolute_time()
        let timeDelta = self.end - self.start
        let newTimeDelta = Double(timeDelta) / 1000000000.0
        print("Delta Time = \(newTimeDelta)")
        playNoteTest()
    }
}
mach_absolute_time units are CPU dependent. You need to multiply by a device-specific constant in order to get real-world units. It is discussed in this Tech Q&A from Apple.
Here is some playground code that demonstrates the idea:
import PlaygroundSupport
import Foundation

PlaygroundPage.current.needsIndefiniteExecution = true

class TimeBase {
    static let NANOS_PER_USEC: UInt64 = 1000
    static let NANOS_PER_MILLISEC: UInt64 = 1000 * NANOS_PER_USEC
    static let NANOS_PER_SEC: UInt64 = 1000 * NANOS_PER_MILLISEC

    static var timebaseInfo: mach_timebase_info! = {
        var tb = mach_timebase_info(numer: 0, denom: 0)
        let status = mach_timebase_info(&tb)
        if status == KERN_SUCCESS {
            return tb
        } else {
            return nil
        }
    }()

    static func toNanos(abs: UInt64) -> UInt64 {
        return (abs * UInt64(timebaseInfo.numer)) / UInt64(timebaseInfo.denom)
    }

    static func toAbs(nanos: UInt64) -> UInt64 {
        return (nanos * UInt64(timebaseInfo.denom)) / UInt64(timebaseInfo.numer)
    }
}

let duration = TimeBase.toAbs(nanos: 10 * TimeBase.NANOS_PER_SEC)

DispatchQueue.global(qos: .userInitiated).async {
    print("Start")
    let start = mach_absolute_time()
    mach_wait_until(start + duration)
    let stop = mach_absolute_time()

    let elapsed = stop - start
    let elapsedNanos = TimeBase.toNanos(abs: elapsed)
    let elapsedSecs = elapsedNanos / TimeBase.NANOS_PER_SEC

    print("Elapsed nanoseconds = \(elapsedNanos)")
    print("Elapsed seconds = \(elapsedSecs)")
}
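The two conversions are inverses of each other, up to integer truncation. A pure restatement with fixed, hypothetical timebase values (numer = 125, denom = 3) that can be checked without the kernel call:

```swift
// Hypothetical timebase values; real ones come from mach_timebase_info().
let numer: UInt64 = 125
let denom: UInt64 = 3

func toNanos(abs: UInt64) -> UInt64 { (abs * numer) / denom }
func toAbs(nanos: UInt64) -> UInt64 { (nanos * denom) / numer }

// 1 second of nanoseconds round-trips exactly with these values.
print(toNanos(abs: toAbs(nanos: 1_000_000_000)))  // 1000000000
```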

AVPlayer seekToTime with UIPanGestureRecognizer

I'm trying to use seekToTime with a pan gesture recognizer, but it's not seeking as expected.
let totalTime = self.avPlayer.currentItem!.duration
print("time: \(CMTimeGetSeconds(totalTime))")
self.avPlayer.pause()
let touchDelta = swipeGesture.translationInView(self.view).x / CGFloat(CMTimeGetSeconds(totalTime))
let currentTime = CMTimeGetSeconds((avPlayer.currentItem?.currentTime())!) + Float64(touchDelta)
print(currentTime)
if currentTime >= 0 && currentTime <= CMTimeGetSeconds(totalTime) {
    let newTime = CMTimeMakeWithSeconds(currentTime, Int32(NSEC_PER_SEC))
    print(newTime)
    self.avPlayer.seekToTime(newTime)
}
What I'm doing wrong in here ?
Think about what's happening in this line here:
let touchDelta = swipeGesture.translationInView(self.view).x / CGFloat(CMTimeGetSeconds(totalTime))
You're dividing pixels (the translation along the x-axis) by time. That isn't really a "delta" or absolute difference; it's a ratio of sorts, and not a ratio with any meaning. You're then computing your new currentTime by just adding this ratio to the previous currentTime, so you're adding pixels-per-second to seconds, which doesn't give a logical or useful number.
What we need to do is take the x-axis translation from the gesture and apply a scale (which is a ratio) to it in order to get a useful number of seconds to advance/rewind the AVPlayer. The x-axis translation is in pixels, so we need a scale that describes seconds per pixel and multiply the two to get our number of seconds. The proper scale is the ratio between the total number of seconds in the video and the total number of pixels the user can move through in the gesture. Multiplying pixels by (seconds divided by pixels) gives us a number in seconds. In pseudocode:
scale = totalSeconds / totalPixels
timeDelta = translation * scale
currentTime = oldTime + timeDelta
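Plugging hypothetical numbers into that pseudocode (a 300 s video, a 400 pt wide view, an 80 pt drag; these values are made up for illustration):

```swift
// Made-up example values, not from the question.
let totalSeconds = 300.0   // video duration
let totalPixels = 400.0    // gesture range (view width in points)
let translation = 80.0     // pan translation in points

let scale = totalSeconds / totalPixels  // 0.75 seconds per point
let timeDelta = translation * scale     // 60.0 seconds to seek
print(timeDelta)  // 60.0
```

An 80 pt drag across a 400 pt view scrubs a fifth of the video, i.e. 60 seconds, which is the kind of proportional behavior the answer describes.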
So I would rewrite your code like this:
let totalTime = self.avPlayer.currentItem!.duration
print("time: \(CMTimeGetSeconds(totalTime))")
self.avPlayer.pause()

// BEGIN NEW CODE
let touchDelta = swipeGesture.translationInView(self.view).x
let scale = CGFloat(CMTimeGetSeconds(totalTime)) / self.view.bounds.width
let timeDelta = touchDelta * scale
let currentTime = CMTimeGetSeconds((avPlayer.currentItem?.currentTime())!) + Float64(timeDelta)
// END NEW CODE

print(currentTime)
if currentTime >= 0 && currentTime <= CMTimeGetSeconds(totalTime) {
    let newTime = CMTimeMakeWithSeconds(currentTime, Int32(NSEC_PER_SEC))
    print(newTime)
    self.avPlayer.seekToTime(newTime)
}
I had the same issue, so I created a UISlider and set its action method as shown below.
The AVPlayer is declared as var playerVal = AVPlayer().
@IBAction func sliderAction(sender: UISlider) {
    playerVal.pause()
    displayLink.invalidate()
    let newTime: CMTime = CMTimeMakeWithSeconds(Double(self.getAudioDuration() as! NSNumber) * Double(sender.value), playerVal.currentTime().timescale)
    playerVal.seekToTime(newTime)
    updateTime()
    playerVal.play()
    deepLink()
}
And another method is,
func updateTime() {
    let currentTime = Float(CMTimeGetSeconds(playerItem1.currentTime()))
    let minutes = floor(currentTime / 60)   // whole minutes; Float division does not truncate
    let seconds = currentTime - minutes * 60
    let maxTime = Float(self.getAudioDuration() as! NSNumber)
    let maxminutes = floor(maxTime / 60)
    let maxseconds = maxTime - maxminutes * 60
    startValue.text = NSString(format: "%.2f:%.2f", minutes, seconds) as String
    stopValue.text = NSString(format: "%.2f:%.2f", maxminutes, maxseconds) as String
}
I used a CADisplayLink, declared as var displayLink = CADisplayLink(); it keeps the slider updating automatically while the audio plays. The code is:
func deepLink() {
    displayLink = CADisplayLink(target: self, selector: ("updateSliderProgress"))
    displayLink.addToRunLoop(NSRunLoop.currentRunLoop(), forMode: NSDefaultRunLoopMode)
}

func updateSliderProgress() {
    let progress = Float(CMTimeGetSeconds(playerVal.currentTime())) / Float(self.getAudioDuration() as! NSNumber)
    sliderView.setValue(Float(progress), animated: false)
}
If you look through the answer above you'll get the idea; I hope it's helpful.

How do I get current playing time and total play time in AVPlayer?

Is it possible to get the playing time and total play time in AVPlayer? If yes, how can I do this?
You can access currently played item by using currentItem property:
AVPlayerItem *currentItem = yourAVPlayer.currentItem;
Then you can easily get the requested time values
CMTime duration = currentItem.duration; //total time
CMTime currentTime = currentItem.currentTime; //playing time
Swift 5:
if let currentItem = player.currentItem {
    let duration = CMTimeGetSeconds(currentItem.duration)
    let currentTime = CMTimeGetSeconds(currentItem.currentTime())
    print("Duration: \(duration) s")
    print("Current time: \(currentTime) s")
}
_audioPlayer = [self playerWithAudio:_audio];
_observer = [_audioPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                                       queue:dispatch_get_main_queue()
                                                  usingBlock:^(CMTime time) {
                                                      _progress = CMTimeGetSeconds(time);
                                                  }];
Swift 3
let currentTime: Double = player.currentItem!.currentTime().seconds
You can get the seconds of your current time by accessing the seconds property of the currentTime(). This will return a Double that represents the seconds in time. Then you can use this value to construct a readable time to present to your user.
First, include a method to return the time variables for H:mm:ss that you will display to the user:
func getHoursMinutesSecondsFrom(seconds: Double) -> (hours: Int, minutes: Int, seconds: Int) {
    let secs = Int(seconds)
    let hours = secs / 3600
    let minutes = (secs % 3600) / 60
    let seconds = (secs % 3600) % 60
    return (hours, minutes, seconds)
}
Next, a method that will convert the values you retrieved above into a readable string:
func formatTimeFor(seconds: Double) -> String {
    let result = getHoursMinutesSecondsFrom(seconds: seconds)
    let hoursString = "\(result.hours)"
    var minutesString = "\(result.minutes)"
    if minutesString.characters.count == 1 {
        minutesString = "0\(result.minutes)"
    }
    var secondsString = "\(result.seconds)"
    if secondsString.characters.count == 1 {
        secondsString = "0\(result.seconds)"
    }
    var time = "\(hoursString):"
    if result.hours >= 1 {
        time.append("\(minutesString):\(secondsString)")
    } else {
        time = "\(minutesString):\(secondsString)"
    }
    return time
}
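As a quick sanity check of that logic, here is the same formatting rule compressed into one helper (the name hms is mine, not from the answer above):

```swift
import Foundation

// Compact restatement of getHoursMinutesSecondsFrom + formatTimeFor:
// "H:MM:SS" when at least an hour has elapsed, "MM:SS" otherwise.
func hms(_ seconds: Double) -> String {
    let secs = Int(seconds)
    let h = secs / 3600
    let m = (secs % 3600) / 60
    let s = secs % 60
    return h >= 1 ? String(format: "%d:%02d:%02d", h, m, s)
                  : String(format: "%02d:%02d", m, s)
}

print(hms(3725))  // "1:02:05"
print(hms(65))    // "01:05"
```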
Now, update the UI with the previous calculations:
func updateTime() {
    // Access current item
    if let currentItem = player.currentItem {
        // Get the current time in seconds
        let playhead = currentItem.currentTime().seconds
        let duration = currentItem.duration.seconds
        // Format seconds for human readable string
        playheadLabel.text = formatTimeFor(seconds: playhead)
        durationLabel.text = formatTimeFor(seconds: duration)
    }
}
With Swift 4.2, use this:
let currentPlayer = AVPlayer()
if let currentItem = currentPlayer.currentItem {
    let duration = currentItem.asset.duration
}
let currentTime = currentPlayer.currentTime()
Swift 4
self.playerItem = AVPlayerItem(url: videoUrl!)
self.player = AVPlayer(playerItem: self.playerItem)
self.player?.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, 1), queue: DispatchQueue.main, using: { (time) in
    if self.player!.currentItem?.status == .readyToPlay {
        let currentTime = CMTimeGetSeconds(self.player!.currentTime())
        let secs = Int(currentTime)
        self.timeLabel.text = NSString(format: "%02d:%02d", secs / 60, secs % 60) as String
    }
})
AVPlayerItem *currentItem = player.currentItem;
NSTimeInterval currentTime = CMTimeGetSeconds(currentItem.currentTime);
NSLog(@"Capturing time: %f", currentTime);
Swift:
let currentItem = yourAVPlayer.currentItem!
let duration = currentItem.asset.duration
let currentTime = currentItem.currentTime()
Swift 5:
Timer.scheduledTimer seems better than addPeriodicTimeObserver if you want a smooth progress bar:
static public var currentTime = 0.0
static public var currentTimeString = "00:00"

Timer.scheduledTimer(withTimeInterval: 1 / 60, repeats: true) { timer in
    if self.player!.currentItem?.status == .readyToPlay {
        let timeElapsed = CMTimeGetSeconds(self.player!.currentTime())
        let secs = Int(timeElapsed)
        Self.currentTime = timeElapsed
        Self.currentTimeString = NSString(format: "%02d:%02d", secs / 60, secs % 60) as String
        print("AudioPlayer TIME UPDATE: \(Self.currentTime) \(Self.currentTimeString)")
    }
}
Swift 4.2:
let currentItem = yourAVPlayer.currentItem!
let duration = currentItem.asset.duration
let currentTime = currentItem.currentTime()
In Swift 5+:
You can query the player directly to find the current time of the actively playing AVPlayerItem.
The time is stored in a CMTime struct for easy conversion to various scales, such as 10ths or 100ths of a second.
In most cases we need to represent times in seconds, so the following gives you what you want:
let currentTimeInSecs = CMTimeGetSeconds(player.currentTime())
