I am trying to get the duration of an AVPlayer asset in Hours, Minutes, Seconds. I am able to get the time but it seems to be in seconds and milliseconds.
This is how I get the time:
let duration: CMTime = player.currentItem!.asset.duration
let seconds : Float64 = CMTimeGetSeconds(duration)
I am then applying that to a slider using
slider.maximumValue = Float(seconds)
The outcome of this obviously gives me the duration in seconds; however, I want to be able to use the duration to set the maximumValue of my slider for video clips that may be under a minute.
For Example: My code above returns 30.865 for a 30 second clip. I need it to return 0.30
This ended up working for me:
let duration: CMTime = player.currentItem!.asset.duration
let timeInMinutes = Float(CMTimeGetSeconds(duration) / 60) // convert total seconds to minutes
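Since the original goal was hours, minutes, and seconds, here is a minimal sketch (not from the original answer, assuming player is a configured AVPlayer) of formatting the duration for display:

import AVFoundation

let duration = player.currentItem!.asset.duration
let totalSeconds = CMTimeGetSeconds(duration)
// Break the total seconds into hour/minute/second components for a label.
let hours = Int(totalSeconds) / 3600
let minutes = Int(totalSeconds) / 60 % 60
let seconds = Int(totalSeconds) % 60
let formatted = String(format: "%02d:%02d:%02d", hours, minutes, seconds)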
I'm trying to append CVPixelBuffers to an AVAssetWriterInputPixelBufferAdaptor at the intended framerate, but it seems to be too fast, and my math is off. This isn't capturing from the camera, but capturing changing images. The actual video plays much faster than the elapsed time over which it was captured.
I have a function that appends the CVPixelBuffer every 1/24 of a second. So I'm trying to add an offset of 1/24 of a second to the last time.
I've tried:
let sampleTimeOffset = CMTimeMake(value: 100, timescale: 2400)
and:
let sampleTimeOffset = CMTimeMake(value: 24, timescale: 600)
and:
let sampleTimeOffset = CMTimeMakeWithSeconds(0.0416666666, preferredTimescale: 1000000000)
I'm adding onto the currentSampleTime and appending like so:
self.currentSampleTime = CMTimeAdd(currentSampleTime, sampleTimeOffset)
let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)
One other solution I thought of is to get the difference between the last time and the current time and add that onto the currentSampleTime for accuracy, but I'm unsure how to do it.
I found a way to accurately capture the time delay by comparing the last time in milliseconds to the current time in milliseconds.
First, I have a general current milliseconds time function:
func currentTimeInMilliSeconds() -> Int {
    let currentDate = Date()
    let since1970 = currentDate.timeIntervalSince1970
    return Int(since1970 * 1000)
}
When I create a writer (when I start recording video), I set a variable in my class to the current time in milliseconds:
currentCaptureMillisecondsTime = currentTimeInMilliSeconds()
Then, because the function that's supposed to be called every 1/24 of a second is not always called on time, I need to get the difference in milliseconds between when I started writing (or the last function call) and now. I convert that from milliseconds to seconds and pass it to CMTimeMakeWithSeconds.
let lastTimeMilliseconds = self.currentCaptureMillisecondsTime
let nowTimeMilliseconds = currentTimeInMilliSeconds()
let millisecondsDifference = nowTimeMilliseconds - lastTimeMilliseconds
// set new current time
self.currentCaptureMillisecondsTime = nowTimeMilliseconds
let millisecondsToSeconds: Float64 = Double(millisecondsDifference) * 0.001
let sampleTimeOffset = CMTimeMakeWithSeconds(millisecondsToSeconds, preferredTimescale: 1000000000)
I can now append my frame with the accurate delay that actually occurred.
self.currentSampleTime = CMTimeAdd(currentSampleTime, sampleTimeOffset)
let success = self.assetWriterPixelBufferInput?.append(cv, withPresentationTime: currentSampleTime)
When I finish writing the video and save it to my camera roll, it has the exact duration of the time I spent recording.
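Pulling the pieces above together, a consolidated sketch of the append path might look like this (names such as assetWriterPixelBufferInput, currentSampleTime, and currentCaptureMillisecondsTime follow the answer's code; the readiness check is an added assumption):

func appendFrame(_ pixelBuffer: CVPixelBuffer) {
    // Measure the real elapsed time since the last append.
    let nowMilliseconds = currentTimeInMilliSeconds()
    let elapsedSeconds = Double(nowMilliseconds - currentCaptureMillisecondsTime) * 0.001
    currentCaptureMillisecondsTime = nowMilliseconds

    // Advance the presentation time by the delay that actually occurred.
    let sampleTimeOffset = CMTimeMakeWithSeconds(elapsedSeconds, preferredTimescale: 1000000000)
    currentSampleTime = CMTimeAdd(currentSampleTime, sampleTimeOffset)

    // Only append when the writer input can accept more data (added assumption).
    if assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {
        assetWriterPixelBufferInput?.append(pixelBuffer, withPresentationTime: currentSampleTime)
    }
}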
I am implementing a custom camera, and for that I want to set the exposure duration.
My code for setting the slider's properties is:
slider.maximumValue = Float(CMTimeGetSeconds(camera.activeFormat.maxExposureDuration))
slider.minimumValue = Float(CMTimeGetSeconds(camera.activeFormat.minExposureDuration))
Now the problem comes while setting the exposure time whenever the slider is changed.
My code for that looks like this:
change(duration: CMTimeMakeWithSeconds(Double(slider.value), preferredTimescale: 600), iso: AVCaptureISOCurrent)
But in
func CMTimeMakeWithSeconds(_ seconds: Float64, preferredTimescale: Int32) -> CMTime
I am confused about preferredTimescale and what its value should be. It works fine with 600, but what is the ideal value?
You should view the discussion here
The preferred timescale is the denominator used to store your seconds value, not a divisor applied to it: CMTimeMakeWithSeconds(5, preferredTimescale: 60) still represents 5 seconds, stored as 300/60. It is CMTimeMake(value: 5, timescale: 60), where the first argument counts 1/timescale units rather than seconds, that gives 5/60 ≈ 1/12 of a second.
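A small sketch (not from the original answer) making the distinction concrete:

import CoreMedia

// CMTimeMakeWithSeconds: the first argument is already in seconds;
// preferredTimescale only sets the stored denominator (the precision).
let fiveSeconds = CMTimeMakeWithSeconds(5, preferredTimescale: 60)  // 300/60 = 5 seconds
// CMTimeMake: the first argument counts 1/timescale units.
let oneTwelfth = CMTimeMake(value: 5, timescale: 60)                // 5/60 ≈ 0.083 seconds
print(fiveSeconds.seconds)  // 5.0
print(oneTwelfth.seconds)   // 0.0833...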
I am retooling my audio player application to use AVQueuePlayer instead of AVAudioPlayer, as I now appreciate that I need to be able to queue up audio files for playback. Since AVQueuePlayer is a subclass of AVPlayer, which uses CMTime to keep track of AVPlayerItem time properties (duration, currentTime, etc...) I had to rework my animation function which updates the slider and UILabels of my AudioPlayer view.
I am finding that, now that I am using CMTime structs, instead of TimeIntervals, my currentSeconds variable returns some odd, inconsistent values as the audio file plays. You can see the printed results below the code I've included.
This also results in my track slider hiccuping to different spots and the duration label jumping to odd values, including 00:00 while half way through playback.
I'm new to a lot of what's going on here. I've tried changing the interval values passed to addPeriodicTimeObserver, thinking that might help, but it hasn't changed anything. This might be an issue with a Timer object that I've scheduled to fire animateFooterView() every 0.1 seconds... but I'm not sure how to address it.
func animateFooterView() {
    let track = audioQueuePlayer?.currentItem
    let duration: CMTime = (track?.asset.duration)!
    let durationSeconds: Float64 = CMTimeGetSeconds(duration)
    let currentSeconds: Float64 = CMTimeGetSeconds(track!.currentTime())
    audioQueuePlayer?.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, 10), queue: DispatchQueue.main, using: { (CMTime) -> Void in
        if track?.status == .readyToPlay {
            let trackProgress: Float = Float(currentSeconds / durationSeconds) //* (self.footerView.trackSlider.maximumValue)
            print("Progress in seconds: " + String(currentSeconds))
            self.footerView.trackSlider.value = trackProgress
            let minutesProgress = Int(currentSeconds) / 60 % 60
            let secondsProgress = Int(currentSeconds) % 60
            let minutesDuration = Int(durationSeconds) / 60 % 60
            let secondsDuration = Int(durationSeconds) % 60
            self.footerView.durationLabel.text = String(format: "%02d:%02d | %02d:%02d", minutesProgress, secondsProgress, minutesDuration, secondsDuration)
        }
    })
}
Print() output of currentSeconds:
Progress in seconds: 0.959150266
Progress in seconds: 0.459939791
Progress in seconds: 0.739364115
Progress in seconds: 0.0
Progress in seconds: 0.0
Progress in seconds: 0.253904687
Progress in seconds: 0.0
Progress in seconds: 0.15722927
Progress in seconds: 0.346783126
Progress in seconds: 0.050562017
Progress in seconds: 1.158813019
Progress in seconds: 1.252108811
Progress in seconds: 1.356793865
Progress in seconds: 1.458337661
Progress in seconds: 1.554249846
Progress in seconds: 1.615820093
Progress in seconds: 0.0
Progress in seconds: 0.0
Progress in seconds: 0.0
Progress in seconds: 0.050562017
Progress in seconds: 0.15722927
Progress in seconds: 0.253904687
Progress in seconds: 0.346783126
Progress in seconds: 0.459939791
Progress in seconds: 0.558436703
Progress in seconds: 0.656613436
Progress in seconds: 0.739364115
Progress in seconds: 0.854647401
Progress in seconds: 0.959150266
Progress in seconds: 1.057932049
Progress in seconds: 1.158813019
Here is a glimpse at the timer I mentioned, which calls animateFooterView every 0.1 seconds whenever the footerView plays.
func play() {
    ...
    timer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(animateFooterView), userInfo: nil, repeats: true)
    ...
}
Actually, I think I just realized what's going wrong here... I'm adding a periodic time observer every 0.1 seconds because of the timer firing. I think I just need to add that observer once, and then do away with the timer altogether?
I was correct in my assumption that the timer continually adding an observer to the AVPlayer was messing up the returned current time. I removed the timer and instead had the observer function update the current time. It works perfectly now.
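For reference, a minimal sketch of that fix (reusing the audioQueuePlayer and footerView names from the question): register the periodic observer once, e.g. when playback starts, and do all the UI updates inside its callback.

func addProgressObserver() {
    let interval = CMTimeMakeWithSeconds(0.1, preferredTimescale: 600)
    // timeObserverToken is an assumed Any? property, kept so the observer can be removed later.
    timeObserverToken = audioQueuePlayer?.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
        guard let self = self,
              let item = self.audioQueuePlayer?.currentItem,
              item.status == .readyToPlay else { return }
        let currentSeconds = CMTimeGetSeconds(time)
        let durationSeconds = CMTimeGetSeconds(item.asset.duration)
        guard durationSeconds > 0 else { return }
        self.footerView.trackSlider.value = Float(currentSeconds / durationSeconds)
        self.footerView.durationLabel.text = String(
            format: "%02d:%02d | %02d:%02d",
            Int(currentSeconds) / 60 % 60, Int(currentSeconds) % 60,
            Int(durationSeconds) / 60 % 60, Int(durationSeconds) % 60)
    }
}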
I'm using AVPlayer to play a live stream. The stream supports one-hour catch-up, which means the user can seek back up to one hour and play from there. But I have one question: how do I know the accurate position that the player is playing? I need to display the current position on the player view. For example, if the user is playing half an hour behind, display -30:00; if the user is playing the latest content, the player should show 00:00 or live. Thanks
Swift solution:
func getLiveDuration() -> Float {
    var result: Float = 0.0
    if let items = player.currentItem?.seekableTimeRanges, !items.isEmpty {
        let timeRange = items[items.count - 1].timeRangeValue
        let startSeconds = CMTimeGetSeconds(timeRange.start)
        let durationSeconds = CMTimeGetSeconds(timeRange.duration)
        result = Float(startSeconds + durationSeconds)
    }
    return result
}
To get the live position and seek to it, you can use the seekableTimeRanges of AVPlayerItem:
CMTimeRange seekableRange = [player.currentItem.seekableTimeRanges.lastObject CMTimeRangeValue];
CGFloat seekableStart = CMTimeGetSeconds(seekableRange.start);
CGFloat seekableDuration = CMTimeGetSeconds(seekableRange.duration);
CGFloat livePosition = seekableStart + seekableDuration;
[player seekToTime:CMTimeMake(livePosition, 1)];
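A hedged Swift sketch of the same seek-to-live logic (assuming player is an AVPlayer):

if let seekableRange = player.currentItem?.seekableTimeRanges.last?.timeRangeValue {
    // Live edge = start of the seekable window plus its duration.
    let livePosition = seekableRange.start.seconds + seekableRange.duration.seconds
    player.seek(to: CMTimeMakeWithSeconds(livePosition, preferredTimescale: 600))
}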
Also, when you have sought some time back, you can get the current playing position by calling the currentTime method:
CGFloat current = CMTimeGetSeconds([self.player.currentItem currentTime]);
CGFloat diff = livePosition - current;
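From that difference, a small sketch (a hypothetical helper, not from the original answer) of formatting the -30:00 / live label the question asks for:

func liveOffsetLabel(livePosition: Double, currentTime: Double) -> String {
    let diff = livePosition - currentTime
    // Treat anything within a second of the live edge as "live".
    guard diff > 1 else { return "live" }
    return String(format: "-%02d:%02d", Int(diff) / 60, Int(diff) % 60)
}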
I know this question is old, but I had the same requirement, and I believe the existing solutions don't properly address the intent of the question.
What I did for this same requirement was to gather the current point in time, the starting time, and the length of the total duration of the stream.
I'll explain something before going further: the current point in time can surpass (starting time + total duration). This is due to the way HLS is structured, as ts segments. Ts segments are small chunks of playable video; you could have in your seekable range 5 ts segments of 10 seconds each. This doesn't mean that 50 seconds is the full length of the live stream; there is around one full segment more (so 60 seconds of playtime total), but it isn't categorized as seekable since you shouldn't seek to that segment. If you were to do so, you would notice rebuffering in most instances (because the source may still be creating the next ts segment when you have already reached the end of playback).
What I did was check whether the current stream time is beyond the seekable range; if so, that means we are live on the stream. If it isn't, you can easily calculate how far behind live you are by subtracting the starting time and the total duration from the current time.
guard let timeRange = player.currentItem?.seekableTimeRanges.last?.timeRangeValue else { return }
let start = timeRange.start.seconds
let totalDuration = timeRange.duration.seconds
let currentTime = player.currentTime().seconds
let secondsBehindLive = currentTime - totalDuration - start
The code above will give you a negative number of seconds behind "live" (more specifically, behind the start of the latest ts segment), or a positive number or zero when it's playing the latest ts segment.
To be honest, I don't really know when seekableTimeRanges will have more than one value; it has always been just one for the streams I have tested with. But if you find more than one value in your streams, you may have to figure out whether you want to add up all the ranges' durations, which time range to use as the start value, etc. At least for my use case, this was enough.
I have seen some examples of CMTime (three separate links), but I still don't get it. I'm using an AVCaptureSession with AVCaptureVideoDataOutput and I want to set the max and min frame rate of the output. My problem is I just don't understand the CMTime struct.
Apparently CMTimeMake(value, timeScale) should give me value frames every 1/timeScale seconds for a total of value/timeScale seconds, or am I getting that wrong?
Why isn't this documented anywhere in order to explain what this does?
If it does truly work like that, how would I get it to have an indefinite number of frames?
If its really simple, I'm sorry, but nothing has clicked just yet.
A CMTime struct represents a length of time that is stored as a rational number (see the CMTime Reference). CMTime has a value and a timescale field, and represents the time value/timescale seconds.
CMTimeMake is a function that returns a CMTime structure, for example:
CMTime t1 = CMTimeMake(1, 10); // 1/10 second = 0.1 second
CMTime t2 = CMTimeMake(2, 1); // 2 seconds
CMTime t3 = CMTimeMake(3, 4); // 3/4 second = 0.75 second
CMTime t4 = CMTimeMake(6, 8); // 6/8 second = 0.75 second
The last two time values t3 and t4 represent the same time value, therefore
CMTimeCompare(t3, t4) == 0
If you set the videoMinFrameDuration of an AVCaptureConnection, it does not make a difference whether you set
connection.videoMinFrameDuration = CMTimeMake(1, 20); // or
connection.videoMinFrameDuration = CMTimeMake(2, 40);
In both cases the minimum time interval between frames is set to 1/20 = 0.05 seconds.
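For completeness, a hedged Swift sketch of the same setting (videoDataOutput is an assumed, already-configured AVCaptureVideoDataOutput):

if let connection = videoDataOutput.connection(with: .video),
   connection.isVideoMinFrameDurationSupported {
    // Cap capture at 20 fps: at least 1/20 of a second between frames.
    connection.videoMinFrameDuration = CMTimeMake(value: 1, timescale: 20)
}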
My experience differs.
For let testTime = CMTime(seconds: 3.833, preferredTimescale: 100)
If you set a breakpoint and look in the debugger side window it says:
"383 100ths of a second"
Testing by seeking to a fixed offset in a video in AVPlayer has confirmed this.
So put the actual number of seconds in the seconds field, and the precision in the preferredTimescale field: 100 means a precision of hundredths of a second.
Doing
let testTime = CMTime(seconds: 3.833, preferredTimescale: 1000)
Still seeks to the same place in the video, but it displays in the debugger side window as "3833 1000ths of a second"
Doing
let testTime = CMTime(seconds: 3.833, preferredTimescale: 1)
Does not seek to the same place in the video, because it's been truncated, and it displays in the debugger side window as "3 seconds". Notice that the .833 part has been lost due to the preferredTimescale.
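A quick illustration (not from the original answer) of how preferredTimescale bounds the precision:

import CoreMedia

let precise = CMTime(seconds: 3.833, preferredTimescale: 1000)  // stored as 3833/1000
let coarse = CMTime(seconds: 3.833, preferredTimescale: 1)      // stored as 3/1
print(precise.seconds)  // 3.833
print(coarse.seconds)   // 3.0, the .833 is truncated away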
CMTime(value: value, timescale: scale)
means value/scale seconds; the timescale is the number of units that make up one second.