I'm really out of ideas so I'll have to ask you guys again...
I'm building an iPhone application which uses three instances of AVPlayer. They all play at the same time and it's very important that they do so. I used to run this code:
CMClockRef syncTime = CMClockGetHostTimeClock();
CMTime hostTime = CMClockGetTime(syncTime);
[self.playerOne setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerTwo setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerThree setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
which worked perfectly. But a few days ago it just stopped working: the three players are delayed by about 300-400 ms (which is way too much; anything under 100 ms would be okay). Two of these AVPlayers do some audio processing, which takes a bit longer than the "normal" AVPlayer, but that used to work before, and the currentTime property tells me that these players are delayed, so the syncing seems to fail.
I have no idea why it stopped working; I haven't really changed anything. I'm using an observer where I can query the self.playerX.currentTime property, and it reports a delay of about 0.3-0.4 seconds. I already tried resyncing the players whenever the delay exceeds 0.1 s, but the delay is still there. So I think the audio processing of players one and two can't be responsible for the delay, since the currentTime property already knows they are delayed (I hope you know what I mean). Maybe one of you knows why I'm getting such a horrible delay, or can give me another idea.
Thanks in advance!
So, I found the solution. I forgot to call [self.playerX prerollAtRate:]. I thought that once the status observer reports AVPlayerStatusReadyToPlay, the player is "really" ready. In fact, it is not. After an AVPlayer is readyToPlay, it still has to be prerolled. Once that is done you can sync your players. The delay is now somewhere around 0.000006 seconds.
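For the three-player case from the question, here is a minimal Swift sketch of that order of operations (preroll everything first, then start all players against one host time); the helper name and the DispatchGroup are my own additions, not part of the original code:
import AVFoundation

// Sketch: preroll every player, then start them all against the same host time.
func startSynchronized(_ players: [AVPlayer]) {
    // setRate(_:time:atHostTime:) requires this to be false on iOS 10+.
    players.forEach { $0.automaticallyWaitsToMinimizeStalling = false }

    let group = DispatchGroup()
    for player in players {
        group.enter()
        player.preroll(atRate: 1.0) { _ in group.leave() }
    }

    group.notify(queue: .main) {
        let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
        for player in players {
            player.setRate(1.0, time: .invalid, atHostTime: hostTime)
        }
    }
}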
Full function to sync AVPlayers across multiple iOS devices:
private func startTribePlayer() {
    let dateFormatterGet = DateFormatter()
    dateFormatterGet.dateFormat = "yyyy-MM-dd"
    guard let refDate = dateFormatterGet.date(from: "2019-01-01") else { return }
    let tsRef = Date().timeIntervalSince(refDate)
    // currentDuration is avPlayerItem.duration.seconds
    let remainder = tsRef.truncatingRemainder(dividingBy: currentDuration)
    let ratio = remainder / currentDuration
    let seekTime = ratio * currentDuration
    let bufferTime = 0.5
    let bufferSeekTime = seekTime + bufferTime
    let mulFactor = 10000.0
    let timeScale = CMTimeScale(mulFactor)
    let seekCMTime = CMTime(value: CMTimeValue(CGFloat(bufferSeekTime * mulFactor)), timescale: timeScale)
    let syncTime = CMClockGetHostTimeClock()
    let hostTime = CMClockGetTime(syncTime)
    tribeMusicPlayer?.seek(to: seekCMTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { [weak self] (successSeek) in
        guard let tvc = self, tvc.tribeMusicPlayer?.currentItem?.status == .readyToPlay else { return }
        tvc.tribeMusicPlayer?.preroll(atRate: 1.0, completionHandler: { [tvc] (successPreroll) in
            tvc.tribePlayerDidPlay = true
            tvc.tribeMusicPlayer?.setRate(1.0, time: seekCMTime, atHostTime: hostTime)
        })
    })
}
Hello fellow AudioKit users,
I'm trying to set up AudioKit 5 with a playback time indication, and I'm having trouble.
If I use AudioPlayer's duration property, this is the total time of the audio file, not the current playback time.
ex:
let duration = player.duration
Always gives the file's total time.
Looking at old code from AKAudioPlayer, it seemed to have a "currentTime" property.
The migration guide (https://github.com/AudioKit/AudioKit/blob/v5-main/docs/MigrationGuide.md) mentions some potentially helpful classes from the old version, but "AKTimelineTap" has no replacement and no comments from the developers... nice.
I'm also still not sure how to manipulate the current playback time...
I've also checked out AudioKit 5's Cookbook, but that covers adding effects and nodes, not playback display and the like.
Thanks for any help with this new version of AudioKit.
You can find playerNode in AudioPlayer; it's an AVAudioPlayerNode.
Using its lastRenderTime and playerTime(forNodeTime:), you can calculate the current playback time.
ex:
// get playerNode in AudioPlayer.
let playerNode = player.playerNode
// get lastRenderTime, and transform to playerTime.
guard let lastRenderTime = playerNode.lastRenderTime else { return }
guard let playerTime = playerNode.playerTime(forNodeTime: lastRenderTime) else { return }
// use sampleRate and sampleTime to calculate current time in seconds.
let sampleRate = playerTime.sampleRate
let sampleTime = playerTime.sampleTime
let currentTime = Double(sampleTime) / sampleRate
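If you need this for a playback time display, one option is to poll that calculation and push the result into your UI. A sketch, assuming player is the AudioKit 5 AudioPlayer from above (the helper name and the timer interval are mine):
import Foundation
import AudioKit

// Hypothetical helper: elapsed playback time of an AudioPlayer, in seconds.
func currentPlayerTime(of player: AudioPlayer) -> Double? {
    let node = player.playerNode
    guard let lastRenderTime = node.lastRenderTime,
          let playerTime = node.playerTime(forNodeTime: lastRenderTime) else { return nil }
    return Double(playerTime.sampleTime) / playerTime.sampleRate
}

// Poll a few times per second to drive a label or progress view.
let displayTimer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
    if let seconds = currentPlayerTime(of: player) {
        // Update the UI here, e.g. timeLabel.text = String(format: "%.1f s", seconds)
        print(seconds)
    }
}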
I'm working on an iOS app which lets users play a song with multiple tracks. Every track is played simultaneously, and the volume of each track can be managed independently. The latency between all players is fine when the song is played for the first time within the lifecycle of the app. However, the latency between the players increases when the user stops playback of all tracks and resumes it afterwards.
I create an AVAudioPlayer for every track in my MusicPlayer class:
do {
    let data = try Data(contentsOf: url, options: .alwaysMapped)
    let player = try AVAudioPlayer(data: data)
    player.prepareToPlay()
    self.audioPlayers.append(player)
} catch {
    // Handle the loading error appropriately
    print(error)
}
After loading the players, playback of the song is started by the playSong method:
func playSong() {
    playerSemaphore.wait()
    NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
    for item in self.audioPlayers {
        item.prepareToPlay()
    }
    // Schedule all players to start at the same device time, slightly in the future
    let time = self.audioPlayers[0].deviceCurrentTime + 0.2
    for item in self.audioPlayers {
        item.play(atTime: time)
    }
    DispatchQueue.main.async {
        NotificationCenter.default.post(playerFinishedCommand.getNotification())
    }
    playerSemaphore.signal()
}
When the user decides to pause the song, or the audio session is interrupted, the pauseSong method pauses all the players by setting their volume to 0, pausing them, synchronizing their playback times, and restoring the proper volume:
func pauseSong() {
    playerSemaphore.wait()
    NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
    DispatchQueue.global().async {
        // Set volumes of all audio tracks to 0, so delay on pause is not noticed
        for audioTrackPlayer in self.audioPlayers {
            audioTrackPlayer.volume = 0
        }
        // Update Times and reset the volumes now the tracks are paused
        for (index, audioTrackPlayer) in self.audioPlayers.enumerated() {
            audioTrackPlayer.pause()
            audioTrackPlayer.currentTime = self.audioPlayers[0].currentTime
            if let currentSongID = self.loadedSongID, let currentVolume = sharedSongManager.getVolumeOfTrackOfSongID(id: currentSongID, track: index), let currentActivation = sharedSongManager.getActivationStatusOfTrackOfSongID(id: currentSongID, track: index) {
                if currentActivation == true {
                    audioTrackPlayer.volume = currentVolume
                }
            }
        }
        DispatchQueue.main.async {
            NotificationCenter.default.post(playerFinishedCommand.getNotification())
        }
    }
    playerSemaphore.signal()
}
For logging the player times I wrote a method which returns all the player times plus the difference between the max and min values of that array. It uses a time offset to get accurate values for the player times:
let startTime = CACurrentMediaTime()
var playerTimes = [TimeInterval]()
var delay: Double?
for item in self.audioPlayers.enumerated() {
    let playerTime = item.element.currentTime
    let currentTime = CACurrentMediaTime()
    let timeOffset = currentTime - startTime
    let correctedPlayerTime = playerTime - timeOffset
    playerTimes.append(correctedPlayerTime)
}
if let maxTime = playerTimes.max(), let minTime = playerTimes.min() {
    delay = Double(maxTime) - Double(minTime)
}
return (delay, playerTimes)
Here is a list of the delay (max value minus min value of the player times) before and after pausing:
And here are the player times for one log before and after pausing:
All logs were created on an iPhone X running the most recent iOS.
After loading a new song the lifecycle restarts, and I get low latency until the first time the user pauses the song. I don't like the idea of creating new AVAudioPlayers after every interaction, but I have tried a lot of things, like calculating the delay as I do in the logging function or calling the player instances asynchronously. However, most of the time this even increased the delay. Does anyone know a way to synchronize the players properly after resuming a song?
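One idea that might be worth trying (just a sketch under the assumption that the players stay alive across pause and resume; the resumeSong name is mine, not from the code above): resume the same way the song is started, i.e. align all currentTime values and then schedule the restart against a shared deviceCurrentTime reference with play(atTime:), instead of calling play() on each player in turn.
// Hypothetical resumeSong(); mirrors the synchronized start in playSong().
func resumeSong() {
    playerSemaphore.wait()
    // Align every player to the same position first.
    let referenceTime = audioPlayers[0].currentTime
    for player in audioPlayers {
        player.currentTime = referenceTime
    }
    // Schedule all players against the same device time, slightly in the future.
    let startTime = audioPlayers[0].deviceCurrentTime + 0.2
    for player in audioPlayers {
        player.play(atTime: startTime)
    }
    playerSemaphore.signal()
}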
I am having a really difficult time with playing audio in the background of my app. The app is a timer that is counting down and plays bells, and everything worked using the timer originally. Since you cannot run a timer over 3 minutes in the background, I need to play the bells another way.
The user has the ability to choose bells and set the time for these bells to play (e.g. play bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc).
So far I have tried using notifications with DispatchQueue.main, and this works fine if the user does not pause the timer. If they re-enter the app and pause, though, I cannot seem to cancel or pause this queue in any way.
Next I tried using AVAudioEngine, and created a set of nodes. These will play while the app is in the foreground but seem to stop upon backgrounding. Additionally when I pause the engine and resume later, it won't pause the sequence properly. It will squish the bells into playing one after the other or not at all.
If anyone has any ideas on how to solve my issue that would be great. Technically I could try removing everything from the engine and recreating it from the paused time when the user pauses/resumes, but this seems quite costly. It also doesn't solve the problem of the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under the background modes in capabilities.
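For reference, background playback also depends on an active audio session with the playback category, in addition to the background-modes entitlement. A minimal sketch of that setup, assuming it isn't already configured somewhere else in the app:
import AVFoundation

// Assumption: the session isn't configured elsewhere in the app.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .playback keeps audio running when the app moves to the background.
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}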
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several other times to create the chain of nodes (or is this done incorrectly?). The code is a bit messy at the moment because I have been trying many different ways to get this to work.
func setupSounds() {
    if attached {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true

    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))

    var bell = ""
    do {
        try engine.start()
    } catch {
        return
    }

    if currentSession.bellObject?.startBell != nil {
        bell = (currentSession.bellObject?.startBell)!
        guard let url = Bundle.main.url(forResource: bell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(capacity)) else {
            return
        }
        do {
            try file.read(into: buffer)
        } catch {
            return
        }
        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)
        player.scheduleBuffer(buffer, at: futureTime, options: AVAudioPlayerNodeBufferOptions(rawValue: 0), completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
Here is a link to a GIF of the problem:
https://gifyu.com/images/ScreenRecording2017-01-25at02.20PM.gif
I'm taking a PHAsset from the camera roll, adding it to a mutable composition, adding another video track, manipulating that added track, and then exporting it through AVAssetExportSession. The result is a QuickTime file with a .mov extension saved in NSTemporaryDirectory():
guard let exporter = AVAssetExportSession(asset: mergedComposition, presetName: AVAssetExportPresetHighestQuality) else {
    fatalError()
}
exporter.outputURL = temporaryUrl
exporter.outputFileType = AVFileTypeQuickTimeMovie
exporter.shouldOptimizeForNetworkUse = true
exporter.videoComposition = videoContainer

// Export the new video
delegate?.mergeDidStartExport(session: exporter)
exporter.exportAsynchronously() { [weak self] in
    DispatchQueue.main.async {
        self?.exportDidFinish(session: exporter)
    }
}
I then take this exported file and load it into a mapper object that applies 'slow motion' to the clip based on some time mappings given to it. The result here is an AVComposition:
func compose() -> AVComposition {
    let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    let emptyTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let asset = AVAsset(url: url)
    guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

    var segments: [AVCompositionTrackSegment] = []
    for map in timeMappings {
        let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
        segments.append(segment)
    }

    emptyTrack.preferredTransform = videoAssetTrack.preferredTransform
    emptyTrack.segments = segments
    if let _ = asset.tracks(withMediaType: AVMediaTypeVideo).first {
        audioTrack.segments = segments
    }

    return composition.copy() as! AVComposition
}
Then I load this file, as well as the original file (which has also been mapped to slow motion), into AVPlayerItems to play in AVPlayers connected to AVPlayerLayers in my app:
let firstItem = AVPlayerItem(asset: originalAsset)
let player1 = AVPlayer(playerItem: firstItem)
firstItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
player1.actionAtItemEnd = .none
firstPlayer.player = player1
// set up player 2
let secondItem = AVPlayerItem(asset: renderedVideo)
secondItem.seekingWaitsForVideoCompositionRendering = true //tried false as well
secondItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
secondItem.videoComposition = nil // tried AVComposition(propertiesOf: renderedVideo) as well
let player2 = AVPlayer(playerItem: secondItem)
player2.actionAtItemEnd = .none
secondPlayer.player = player2
I then have a start and end time to loop these videos over and over. I don't use PlayerItemDidReachEnd because I'm not interested in the end, I'm interested in the user's inputted time. I even use a DispatchGroup to ENSURE that both players have finished seeking before trying to replay the video:
func playAllPlayersFromStart() {
    let dispatchGroup = DispatchGroup()

    dispatchGroup.enter()
    firstPlayer.player?.currentItem?.seek(to: startTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero, completionHandler: { _ in
        dispatchGroup.leave()
    })

    DispatchQueue.global().async { [weak self] in
        guard let startTime = self?.startTime else { return }
        dispatchGroup.wait()

        dispatchGroup.enter()
        self?.secondPlayer.player?.currentItem?.seek(to: startTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero, completionHandler: { _ in
            dispatchGroup.leave()
        })
        dispatchGroup.wait()

        DispatchQueue.main.async { [weak self] in
            self?.firstPlayer.player?.play()
            self?.secondPlayer.player?.play()
        }
    }
}
The strange part here is that the original asset, which has also been mapped via my compose() function, loops perfectly fine. However, the renderedVideo, which has also been run through the compose() function, sometimes freezes when seeking during one of the CMTimeMapping segments. The only difference between the file that freezes and the file that doesn't freeze is that the freezing one has been exported to NSTemporaryDirectory via the AVAssetExportSession to combine the two video tracks into one. They're both the same duration. I'm also sure that it's only the video layer that freezes and not the audio, because if I add boundary time observers to the player that freezes, it still hits them and loops. Also, the audio loops properly.
To me the strangest part is that the video 'resumes' if it makes it past the spot where it paused to start the seek after a 'freeze'. I've been stuck on this for days and would really love some guidance.
Other odd things to note:
- Even though the CMTimeMappings of the original and the exported asset have the exact same durations, the rendered asset's slow-motion ramp is more 'choppy' than the original.
- Audio continues when video freezes.
- The video almost only ever freezes during slow-motion sections (caused by CMTimeMapping objects based on segments).
- The rendered video seems to have to play 'catch up' at the beginning. Even though I'm calling play after both have finished seeking, it seems to me that the right side plays faster at the beginning, as a catch-up. The strange part is that the segments are exactly the same, just referencing two separate source files: one located in the asset library, the other in NSTemporaryDirectory.
- It seems to me that both the AVPlayer and AVPlayerItem status are 'readyToPlay' before I call play.
- It seems to 'unfreeze' if the player proceeds PAST the point where it locked up.
- I tried to add observers for 'AVPlayerItemPlaybackStalled' but it was never called.
Cheers!
The problem was in the AVAssetExportSession. To my surprise, setting exporter.canPerformMultiplePassesOverSourceMediaData = true fixed the issue. Although the documentation is quite sparse and even says 'setting this property to true may have no effect', it did fix the issue. Very, very, very strange! I consider this a bug and will be filing a radar. Here are the docs on the property: canPerformMultiplePassesOverSourceMediaData
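In context, it's just one extra line on the exporter before calling exportAsynchronously (sketched against the export code shown earlier in the question):
// Opting the export session into multiple passes over the source media fixed the freeze.
exporter.canPerformMultiplePassesOverSourceMediaData = true
exporter.exportAsynchronously { [weak self] in
    DispatchQueue.main.async {
        self?.exportDidFinish(session: exporter)
    }
}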
It seems possible that in your playAllPlayersFromStart() method the startTime variable may have changed between the two dispatched tasks (this would be especially likely if that value updates based on scrubbing).
If you make a local copy of startTime at the start of the function, and then use it in both blocks, you may have better luck.
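A sketch of that suggestion applied to the method from the question (only the snapshot of startTime is new; everything else mirrors the original code):
func playAllPlayersFromStart() {
    // Snapshot startTime once so both seeks use the same value,
    // even if the property changes while the async work is in flight.
    let localStartTime = startTime

    let dispatchGroup = DispatchGroup()
    dispatchGroup.enter()
    firstPlayer.player?.currentItem?.seek(to: localStartTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero, completionHandler: { _ in
        dispatchGroup.leave()
    })

    DispatchQueue.global().async { [weak self] in
        dispatchGroup.wait()
        dispatchGroup.enter()
        self?.secondPlayer.player?.currentItem?.seek(to: localStartTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero, completionHandler: { _ in
            dispatchGroup.leave()
        })
        dispatchGroup.wait()
        DispatchQueue.main.async {
            self?.firstPlayer.player?.play()
            self?.secondPlayer.player?.play()
        }
    }
}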
I'm trying to change the audioTimePitchAlgorithm for playback at rates lower than 0.5, but I don't seem to be having much luck. I've scoured SO for solutions, but every implementation I try still seems to limit me to a rate of 0.5 or above...
var player: AVPlayer = AVQueuePlayer()

@IBAction func play(sender: AnyObject) {
    let assetQueue = [aVItem1, aVItem2, aVItem3, aVItem4, aVItem5, aVItem6, aVItem7, aVItem8, aVItem9, aVItem0]
    var itemQueue: [AVPlayerItem] = []
    for index in 0...9 {
        let nextItem: AVPlayerItem = AVPlayerItem(asset: assetQueue[index])
        nextItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmSpectral
        itemQueue.append(nextItem)
    }
    player = AVQueuePlayer(items: itemQueue)
    player.play()
    player.rate = 0.25
}
I apologise in advance if this is actually a straightforward process and I am perhaps too new to this to grasp some of the underlying concepts. I have also already tried creating an AVMutableAudioMixInputParameters object, assigning it to an AVMutableAudioMix object, and in turn assigning the mix object to the AVPlayerItem, but this yielded the same result (playback at a rate of 0.5), so I have only included my first, simpler attempt. Any help would be greatly appreciated :)