I have an AVPlayer that I want to begin playing at the specific time of 11.593 seconds. I have this number in milliseconds retrieved from a URL string, converted to a Double, then to a CMTime like this:
http://www.mywebsite.com/video?s=11593
Extract the 11593 as a String -> convert to Double 11593.0.
Then I convert to a CMTime:
let time = CMTime(seconds: milliseconds,
                  preferredTimescale: CMTimeScale(NSEC_PER_MSEC))
Then I tell the AVPlayer to seek:
player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero)
But the player always seeks to 25.88 seconds. Why??
Your entire use of CMTime(seconds:preferredTimescale:) is wrong. The first argument should be a number of seconds (hence the name, seconds:), not a number of milliseconds; and the second argument should be a reasonable timescale, such as 600.
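So divide by 1000 before building the CMTime. A minimal sketch, assuming milliseconds already holds the value parsed from the URL and player is the question's AVPlayer:

import AVFoundation

let milliseconds = 11593.0                      // parsed from the ?s= query item
let seconds = milliseconds / 1000.0             // CMTime(seconds:) expects seconds, not milliseconds
let time = CMTime(seconds: seconds, preferredTimescale: 600)

player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero)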
Hello fellow AudioKit users,
I'm trying to set up AudioKit 5 with a playback time indication, and am having trouble.
If I use AudioPlayer's duration property, this is the total time of the audio file, not the current playback time.
ex:
let duration = player.duration
Always gives the file's total time.
Looking at old code from AKAudioPlayer, it seemed to have a "currentTime" property.
The migration guide (https://github.com/AudioKit/AudioKit/blob/v5-main/docs/MigrationGuide.md) mentions some potentially helpful classes from the old version; however, "AKTimelineTap" has no replacement and no comments from the developers... nice.
I'm also still not sure how to manipulate the current playback time...
I've also checked out AudioKit 5's Cookbook, but that covers adding effects and nodes, not playback display, etc.
Thanks for any help with this new version of AudioKit.
You can find playerNode in AudioPlayer; it's an AVAudioPlayerNode. Using its lastRenderTime and playerTime, you can calculate the current time.
ex:
// get playerNode in AudioPlayer.
let playerNode = player.playerNode
// get lastRenderTime, and transform to playerTime.
guard let lastRenderTime = playerNode.lastRenderTime else { return }
guard let playerTime = playerNode.playerTime(forNodeTime: lastRenderTime) else { return }
// use sampleRate and sampleTime to calculate current time in seconds.
let sampleRate = playerTime.sampleRate
let sampleTime = playerTime.sampleTime
let currentTime = Double(sampleTime) / sampleRate
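If you want to drive a playback-time display, one rough approach is to poll that calculation on a timer. In this sketch, currentTime(of:), the 0.25-second interval, and timeLabel are placeholders of mine (not AudioKit API), and a player: AudioPlayer is assumed to exist in scope:

import AudioKit
import UIKit

// Reads the player node's render clock and converts sample time to seconds.
func currentTime(of player: AudioPlayer) -> Double? {
    guard let nodeTime = player.playerNode.lastRenderTime,
          let playerTime = player.playerNode.playerTime(forNodeTime: nodeTime),
          playerTime.sampleRate > 0 else { return nil }
    return Double(playerTime.sampleTime) / playerTime.sampleRate
}

// Poll a few times per second to update the UI.
let displayTimer = Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { _ in
    if let seconds = currentTime(of: player) {
        timeLabel.text = String(format: "%.1f s", seconds)
    }
}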
I have to play a base64-encoded mp3 string and get callbacks at certain times so that I can do some operations.
I tried AVAudioPlayer's initWithData: and was able to play the audio, but AVAudioPlayer has no time-based callback method.
AVPlayer provides timed state observations, but it does not let me play a base64-encoded string since it has no initWithData: method.
Depending on your goal, you can observe the playback time at set intervals and, within that callback, run your logic when a specific time is reached.
Simply use addPeriodicTimeObserver(forInterval:queue:using:), which fires at the CMTime interval you specify and invokes your callback.
Complete details with examples are in the documentation:
https://developer.apple.com/documentation/avfoundation/media_assets_playback_and_editing/observing_the_playback_time
but here is the snippet for reference:
var player: AVPlayer!
var playerItem: AVPlayerItem!
var timeObserverToken: Any?
// Notify every half second
let timeScale = CMTimeScale(NSEC_PER_SEC)
let time = CMTime(seconds: 0.5, preferredTimescale: timeScale)
timeObserverToken = player.addPeriodicTimeObserver(forInterval: time, queue: .main) { [weak self] time in
    // implement your logic based on your required timings
}
Make sure to keep a strong reference to the returned observer token; otherwise it will be released and no callbacks will fire.
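As a rough sketch of the cleanup side, using the same player and timeObserverToken as the snippet above:

// Remove the periodic observer when it's no longer needed, then drop the token.
func removePeriodicTimeObserver() {
    if let token = timeObserverToken {
        player.removeTimeObserver(token)
        timeObserverToken = nil
    }
}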
I am playing a song using AVAudioPlayerNode and I am trying to control its time using a UISlider, but I can't figure out how to seek using AVAudioEngine.
After MUCH trial and error I think I have finally figured this out.
First you need to calculate the sample rate of your file. To do this, get the last render time of your player node:
guard let nodeTime = self.playerNode.lastRenderTime,
      let playerTime = self.playerNode.playerTime(forNodeTime: nodeTime) else { return }
let sampleRate = playerTime.sampleRate
Then, multiply your sample rate by the new time in seconds. This will give you the exact frame of the song at which you want to start the player:
var newsampletime = AVAudioFramePosition(sampleRate * Double(Slider.value))
Next, you are going to want to calculate the amount of frames there are left in the audio file:
var length = Float(songDuration!) - Slider.value
var framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
Finally, stop your node, schedule the new segment of audio, and start your node again!
playerNode.stop()
if framestoplay > 1000 {
playerNode.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
}
playerNode.play()
If you need further explanation I wrote a short tutorial here: http://swiftexplained.com/?p=9
For future readers, it's probably better to get the sample rate as:
playerNode.outputFormat(forBus: 0).sampleRate
Also take care when converting to AVAudioFramePosition, as it is an integer, while sample rate is a double. Without rounding the result, you may end up with undesirable results.
P.S. The above answer assumes that the file you are playing has the same sample rate as the output format of the player, which may or may not be true.
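Putting the answer and those caveats together, one possible consolidated sketch (assuming playerNode, audioFile, and a songDuration in seconds as a Double already exist, as in the answer):

import AVFoundation

func seek(to seconds: Double) {
    // Sample rate taken from the node's output format, per the comment above.
    let sampleRate = playerNode.outputFormat(forBus: 0).sampleRate

    // Frame to start from; round before converting to an integer frame position.
    let startFrame = AVAudioFramePosition((seconds * sampleRate).rounded())

    // Frames remaining in the file after the seek point.
    let remainingSeconds = max(0, songDuration - seconds)
    let framesToPlay = AVAudioFrameCount(remainingSeconds * sampleRate)

    playerNode.stop()
    if framesToPlay > 1000 {
        playerNode.scheduleSegment(audioFile, startingFrame: startFrame, frameCount: framesToPlay, at: nil, completionHandler: nil)
    }
    playerNode.play()
}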
I'm really out of ideas so I'll have to ask you guys again...
I'm building an iPhone application which uses three instances of AVPlayer. They all play at the same time and it's very important that they do so. I used to run this code:
CMClockRef syncTime = CMClockGetHostTimeClock();
CMTime hostTime = CMClockGetTime(syncTime);
[self.playerOne setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerTwo setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerThree setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
which worked perfectly. But a few days ago it just stopped working: the three players are delayed by about 300-400ms (which is way too much; everything under 100ms would be okay). Two of these AVPlayers do some audio processing, which takes more time than a "normal" AVPlayer, but it used to work before, and the currentTime property tells me that these players are delayed, so the syncing seems to fail.
I have no idea why it stopped working. I didn't really change anything, but I'm using an observer where I can query the self.playerX.currentTime property, which gives me a delay of about 0.3-0.4 seconds... I already tried to resync the players if the delay is greater than 0.1 s, but the delay is still there. So I think the audio processing of players 1 and 2 can't be responsible for the delay, as the currentTime property does know they are delayed (I hope you know what I mean). Maybe someone of you knows why I'm having such a horrible delay, or can offer another idea.
Thanks in advance!
So, I found the solution. I forgot to call [self.playerX prerollAtRate:]. I thought that once the status observer reports readyToPlay, the player is "really" ready. In fact, it is not. After AVPlayer is readyToPlay, it still has to be prerolled. Once that is done you can sync your players. The delay is now somewhere around 0.000006 seconds.
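For later readers, here is a rough Swift sketch of that order of operations (preroll every player, then start them all against one host time). The players array and the 1.0 rate are assumptions mirroring the question:

import AVFoundation

// Preroll each player, then start them all against the same host clock time.
// Note: on iOS 10+, setRate(_:time:atHostTime:) requires automaticallyWaitsToMinimizeStalling = false.
func startInSync(_ players: [AVPlayer]) {
    let group = DispatchGroup()
    for player in players {
        group.enter()
        player.preroll(atRate: 1.0) { _ in group.leave() }
    }
    group.notify(queue: .main) {
        let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
        for player in players {
            // .invalid keeps each player's current time, matching kCMTimeInvalid in the question.
            player.setRate(1.0, time: .invalid, atHostTime: hostTime)
        }
    }
}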
Full function to sync AVPlayers across multiple iOS devices:
private func startTribePlayer() {
    let dateFormatterGet = DateFormatter()
    dateFormatterGet.dateFormat = "yyyy-MM-dd"
    guard let refDate = dateFormatterGet.date(from: "2019-01-01") else { return }
    let tsRef = Date().timeIntervalSince(refDate)
    // currentDuration is avPlayerItem.duration.seconds
    let remainder = tsRef.truncatingRemainder(dividingBy: currentDuration)
    let ratio = remainder / currentDuration
    let seekTime = ratio * currentDuration
    let bufferTime = 0.5
    let bufferSeekTime = seekTime + bufferTime
    let mulFactor = 10000.0
    let timeScale = CMTimeScale(mulFactor)
    let seekCMTime = CMTime(value: CMTimeValue(CGFloat(bufferSeekTime * mulFactor)), timescale: timeScale)
    let syncTime = CMClockGetHostTimeClock()
    let hostTime = CMClockGetTime(syncTime)
    tribeMusicPlayer?.seek(to: seekCMTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { [weak self] (successSeek) in
        guard let tvc = self, tvc.tribeMusicPlayer?.currentItem?.status == .readyToPlay else { return }
        tvc.tribeMusicPlayer?.preroll(atRate: 1.0, completionHandler: { [tvc] (successPreroll) in
            tvc.tribePlayerDidPlay = true
            tvc.tribeMusicPlayer?.setRate(1.0, time: seekCMTime, atHostTime: hostTime)
        })
    })
}
I have an AVPlayer which is playing an HLS video stream. My user interface provides a row of buttons, one for each "chapter" in the video (the buttons are labeled "1", "2", "3"). The app downloads some metadata from a server which contains the list of chapter cut-in points denoted in seconds. For example, one video is 12 minutes in length - the list of chapter cut-in points is 0, 58, 71, 230, 530, etc.
When the user taps one of the "chapter buttons" the button handler code does this:
[self.avPlayer pause];
[self.avPlayer seekToTime:CMTimeMakeWithSeconds(seekTime, 600)
          toleranceBefore:kCMTimeZero
           toleranceAfter:kCMTimeZero
        completionHandler:^(BOOL finished) {
    [self.avPlayer play];
}];
Where "seekTime" is a local var which contains the cut-in point (as described above).
The problem is that the video does not always start at the correct point. Sometimes it does. But sometimes it is anywhere from a tenth of a second to 2 seconds BEFORE the requested seekTime. It NEVER starts after the requested seekTime.
Here are some stats on the video encoding:
Encoder: handbrakeCLI
Codec: h.264
Frame rate: 24 (actually, 23.976 - same as how it was shot)
Video Bitrate: multiple bitrates (64/150/300/500/800/1200)
Audio Bitrate: 128k
Keyframes: 23.976 (1 per second)
I am using the Apple mediafilesegmenter tool, of course, and the variantplaylistcreator to generate the playlist.
The files are being served from an Amazon Cloud/S3 bucket.
One area I remain unclear about is CMTimeMakeWithSeconds - I have tried several variations based on different articles/docs I have read. For example, in the above excerpt I am using:
CMTimeMakeWithSeconds(seekTime, 600)
I have also tried:
CMTimeMakeWithSeconds(seekTime, 1)
I can't tell which is correct, though BOTH seem to produce the same inconsistent results!
I have also tried:
CMTimeMakeWithSeconds(seekTime, 23.967)
Some articles claim this works like a numerator/denominator, so n/1 should be correct where 'n' is the number of seconds (as in CMTimeMakeWithSeconds(n, 1)). But the code was originally created by a different programmer (who is gone now), and he used 600 for the preferredTimeScale (i.e. CMTimeMakeWithSeconds(n, 600)).
Can anyone offer any clues as to what I am doing wrong, or even if the kind of accuracy I am trying to achieve is even possible?
And in case someone is tempted to offer "alternative" solutions: we are already considering breaking the video up into separate streams, one per chapter, but we do not believe that will give us the same performance, in the sense that changing chapters will take longer because a new AVPlayerItem will have to be created and loaded, etc. So if you think this is the only solution that will work (and we do expect it would achieve the result we want, i.e. each chapter WILL start exactly where we want it to), feel free to say so.
Thanks in advance!
int32_t timeScale = self.player.currentItem.asset.duration.timescale;
CMTime time = CMTimeMakeWithSeconds(77.000000, timeScale);
[self.player seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
I had a problem with seekToTime: and solved it with this code. The timescale is the trick here.
Swift version:
let playerTimescale = self.player.currentItem?.asset.duration.timescale ?? 1
let time = CMTime(seconds: 77.000000, preferredTimescale: playerTimescale)
self.player.seek(to: time, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero) { (finished) in /* Add your completion code here */
}
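To see why the timescale matters: it is just the resolution (ticks per second) used to store the value, so a coarse timescale truncates fractional seconds. A small illustration (the comments show the resulting value/timescale pairs):

import CoreMedia

let a = CMTimeMakeWithSeconds(58.0, preferredTimescale: 1)    // value 58,    timescale 1   -> 58.0 s
let b = CMTimeMakeWithSeconds(58.0, preferredTimescale: 600)  // value 34800, timescale 600 -> 58.0 s
let c = CMTimeMakeWithSeconds(58.4, preferredTimescale: 1)    // value 58,    timescale 1   -> 58.0 s (0.4 s lost)
let d = CMTimeMakeWithSeconds(58.4, preferredTimescale: 600)  // value 35040, timescale 600 -> 58.4 s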
My suggestion:
1) Don't use [avplayer seekToTime:toleranceBefore:toleranceAfter:]; it can delay your seek by 4-5 seconds.
2) HLS video is cut into 10-second segments, so your chapter start positions should be multiples of 10. Since each segment starts with an I-frame, this way you get a fast and accurate seek.
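If the chapters can't be re-authored onto segment boundaries, a rough approximation of this idea is to snap the requested time to the nearest segment boundary before seeking; the 10-second segment length and a seekTime Double (as in the question) are assumptions here:

import CoreMedia

let segmentDuration = 10.0
let snappedSeekTime = (seekTime / segmentDuration).rounded() * segmentDuration
let time = CMTimeMakeWithSeconds(snappedSeekTime, preferredTimescale: 600)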
Please use a call like [player seekToTime:CMTimeMakeWithSeconds(seekTime, 1)].
A tolerance of kCMTimeZero will take more time to seek. Instead of kCMTimeZero you can use kCMTimePositiveInfinity, which is equivalent to the call above.
This code may resolve your problem:
let targetTime = CMTimeMakeWithSeconds(videoLastDuration, 1) // videoLastDuration hold the previous video state.
self.playerController.player?.currentItem?.seekToTime(targetTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
Swift 5:
let seconds = 45.0
let time = CMTime(seconds: seconds, preferredTimescale: 1)
player?.seek(to: time, toleranceBefore: CMTime.zero, toleranceAfter: CMTime.zero)