I'm working on an iOS app that lets users play a song with multiple tracks. All tracks are played simultaneously, and the volume of each track can be managed independently. The latency between the players is fine when the song is played for the first time within the lifecycle of the app. However, the latency between the players increases once the user stops playback of all tracks and resumes it afterwards.
I create an AVAudioPlayer for every track in my MusicPlayer class:
do {
    let data = try Data(contentsOf: url, options: .alwaysMapped)
    let player = try AVAudioPlayer(data: data)
    player.prepareToPlay()
    self.audioPlayers.append(player)
} catch {
    // Handle file loading/decoding errors
}
After loading the players, playback is started by the playSong method:
func playSong() {
    playerSemaphore.wait()
    NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
    for item in self.audioPlayers {
        item.prepareToPlay()
    }
    // Schedule all players against the same output-device clock,
    // 0.2 s in the future, so they start together
    let time = self.audioPlayers[0].deviceCurrentTime + 0.2
    for item in self.audioPlayers {
        item.play(atTime: time)
    }
    DispatchQueue.main.async {
        NotificationCenter.default.post(playerFinishedCommand.getNotification())
    }
    playerSemaphore.signal()
}
When the user decides to pause the song or the audio session is interrupted, the pauseSong method pauses all the players by setting their volume to 0, pausing them, synchronizing their playback times, and restoring the proper volume:
func pauseSong() {
    playerSemaphore.wait()
    NotificationCenter.default.post(playerDidReceiveCommand.getNotification())
    DispatchQueue.global().async {
        // Set the volume of every track to 0 so the delay on pause is not noticed
        for audioTrackPlayer in self.audioPlayers {
            audioTrackPlayer.volume = 0
        }
        // Pause, synchronize the playback times, and restore the volumes
        for (index, audioTrackPlayer) in self.audioPlayers.enumerated() {
            audioTrackPlayer.pause()
            audioTrackPlayer.currentTime = self.audioPlayers[0].currentTime
            if let currentSongID = self.loadedSongID,
               let currentVolume = sharedSongManager.getVolumeOfTrackOfSongID(id: currentSongID, track: index),
               let currentActivation = sharedSongManager.getActivationStatusOfTrackOfSongID(id: currentSongID, track: index),
               currentActivation {
                audioTrackPlayer.volume = currentVolume
            }
        }
        DispatchQueue.main.async {
            NotificationCenter.default.post(playerFinishedCommand.getNotification())
        }
        // Signal only after the players have actually been paused
        self.playerSemaphore.signal()
    }
}
For logging the player times I wrote a method that returns all the player times together with the difference between the maximum and minimum values of that array. It works with a time offset to get exact values for the player times:
let startTime = CACurrentMediaTime()
var playerTimes = [TimeInterval]()
var delay: Double?
for player in self.audioPlayers {
    let playerTime = player.currentTime
    // Subtract the time elapsed since the first reading so all values
    // refer to the same reference point
    let timeOffset = CACurrentMediaTime() - startTime
    playerTimes.append(playerTime - timeOffset)
}
if let maxTime = playerTimes.max(), let minTime = playerTimes.min() {
    delay = maxTime - minTime
}
return (delay, playerTimes)
Here is a list of the delays (max value minus min value of the player times) before and after pausing:
And here are the player times for one log before and after pausing:
All logs were created on an iPhone X running the most recent iOS.
After loading a new song the lifecycle restarts, and the latency stays low until the first time the user pauses the song. I don't like the idea of creating new AVAudioPlayers after every interaction, but I have tried many things, such as compensating for the delay as calculated in the logging function or calling the player instances asynchronously; most of the time this even increased the delay. Does anyone know a way to synchronize the players properly after resuming a song?
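For illustration, this is the kind of resume logic I have been experimenting with (a sketch only; resumeSong is a hypothetical counterpart to pauseSong that mirrors playSong by realigning the playback positions and restarting all players via play(atTime:) on the shared device clock):

// Hypothetical resume path, mirroring playSong
func resumeSong() {
    playerSemaphore.wait()
    // Realign every player to a common playback position first
    let referenceTime = self.audioPlayers[0].currentTime
    for item in self.audioPlayers {
        item.currentTime = referenceTime
        item.prepareToPlay()
    }
    // Restart all players 0.2 s in the future on the shared output-device clock
    let startTime = self.audioPlayers[0].deviceCurrentTime + 0.2
    for item in self.audioPlayers {
        item.play(atTime: startTime)
    }
    playerSemaphore.signal()
}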
I am building a karaoke app with the ability to sing with video, so here is my problem:
I am recording the user's video (video only, from the front camera) while applying voice filters with AudioKit to a separate audio recording.
Now on playback, I want to play the video and the audio in sync, but I haven't succeeded: the video and audio end up out of sync.
I am using AKPlayer for the audio so I can apply the voice effects, and VLCKit for playing the user's video.
do {
    //MARK: VLCKit part of the video setup
    Vlc_VideoPlayer = VLCMediaPlayer()
    Vlc_VideoPlayer.media = VLCMedia(url: recordVideoURL)
    Vlc_VideoPlayer.addObserver(self, forKeyPath: "time", options: [], context: nil)
    Vlc_VideoPlayer.addObserver(self, forKeyPath: "remainingTime", options: [], context: nil)
    Vlc_VideoPlayer.drawable = self.CameraView

    //MARK: AudioKit with AKPlayer setup
    file = try AKAudioFile(forReading: recordVoiceURL)
    player = AKPlayer(audioFile: file)
    self.player.preroll()
    delay = AKVariableDelay(player)
    delay.rampTime = 0.5
    delayMixer = AKDryWetMixer(player, delay)
    reverb = AKCostelloReverb(delayMixer)
    reverbMixer = AKDryWetMixer(delayMixer, reverb)
    booster = AKBooster(reverbMixer)
    tracker = AKAmplitudeTracker(booster)
    AudioKit.output = tracker
    try AudioKit.start()
} catch {
    print(error)
}
self.startPlayers()
Now the startPlayers function:
func startPlayers() {
    DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
        if AudioKit.engine.isRunning {
            self.Vlc_VideoPlayer.audio.isMuted = true
            self.Vlc_VideoPlayer.play()
            self.player.isLooping = false
            self.player.play()
        } else {
            // Engine not running yet; poll again in one second
            self.startPlayers()
        }
    }
}
I don't know anything about the VLC player, but with the built-in AVPlayer there is an option to sync to a clock:
var time: TimeInterval = 1 // 1 second in the future
videoPlayer.masterClock = CMClockGetHostTimeClock()
let hostTime = mach_absolute_time()
let cmHostTime = CMClockMakeHostTimeFromSystemUnits(hostTime)
let cmVTime = CMTimeMakeWithSeconds(time, preferredTimescale: videoPlayer.currentTime().timescale)
let futureTime = CMTimeAdd(cmHostTime, cmVTime)
videoPlayer.setRate(1, time: CMTime.invalid, atHostTime: futureTime)
AKPlayer then supports syncing to the mach_absolute_time() hostTime using its scheduling functions. As you have it above, the two will start close together, but there is no guarantee of any sync.
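For illustration, a sketch of what scheduling AKPlayer against that same host clock could look like (assuming AudioKit 4's play(at:) API on your existing player; the one-second lead is an arbitrary choice):

// Start AKPlayer one second in the future on the same host clock
// that AVPlayer's setRate(_:time:atHostTime:) was given above
let delayTicks = AVAudioTime.hostTime(forSeconds: 1.0)
let startTime = AVAudioTime(hostTime: mach_absolute_time() + delayTicks)
player.play(at: startTime)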
Trying to start two players like this will only work out of pure luck, and unless you have a means to synchronize playback after it has started, it will not be perfect. Ideally, you should play the audio with VLC as well, to make use of its internal synchronization tools.
To iterate on what you have right now, I would suggest starting playback with VLC until it has decoded the first frame, pausing, starting your audio, and resuming VLC playback as soon as the first audio sample has been decoded. This will still not be perfect, but probably better.
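A rough sketch of that sequencing, piggybacking on the KVO observers you already registered (the didHandshake flag is hypothetical, and the first "time" change is only a proxy for "first frame decoded"):

// First "time" KVO callback: hold the first frame, start audio, resume video
override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    if keyPath == "time", !didHandshake {
        didHandshake = true
        Vlc_VideoPlayer.pause()  // hold on the first decoded frame
        player.play()            // start the AKPlayer audio chain
        Vlc_VideoPlayer.play()   // resume the video right afterwards
    }
}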
I am having a really difficult time playing audio in the background of my app. The app is a countdown timer that plays bells, and everything worked when it was driven by the timer originally. Since you cannot run a timer for more than about 3 minutes in the background, I need to play the bells another way.
The user can choose bells and set the times for these bells to play (e.g. play a bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc.).
So far I have tried notifications with DispatchQueue.main, and this works fine as long as the user does not pause the timer. If they re-enter the app and pause, however, I cannot seem to cancel or pause this queue in any way.
Next I tried AVAudioEngine and created a set of nodes. These play while the app is in the foreground but seem to stop upon backgrounding. Additionally, when I pause the engine and resume later, it won't pause the sequence properly: it squishes the bells into playing one right after the other, or not at all.
If anyone has any ideas on how to solve this, that would be great. Technically I could remove everything from the engine and recreate it from the paused time when the user pauses/resumes, but this seems quite costly, and it doesn't solve the problem of the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under Background Modes in Capabilities.
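As far as I know, the background mode only takes effect while an AVAudioSession with the .playback category is active; a minimal sketch of that setup (the helper name is hypothetical):

import AVFoundation

// Minimal session setup for background playback
// (hypothetical helper; called before starting the engine)
func activatePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [])
    try session.setActive(true)
}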
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several more times to create the chain of nodes (or is this done incorrectly?). The code is a bit messy at the moment because I have been trying many approaches to get this to work.
func setupSounds() {
    if attached {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true
    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))
    do {
        try engine.start()
    } catch {
        return
    }
    if let startBell = currentSession.bellObject?.startBell {
        guard let url = Bundle.main.url(forResource: startBell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(capacity)) else {
            return
        }
        try file.read(into: buffer)
        // Schedule the buffer `delay` seconds into the player's timeline
        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)
        player.scheduleBuffer(buffer, at: futureTime, options: [], completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
I'm having a hard time understanding how to use SKVideoNode properly. I'm making a simple game where the player has to compete against the "computer", whose moves are rendered in the form of a pre-recorded video.
When the game begins the video should be paused, so I call pause() on it. Once the human player begins his/her moves, the artificial player's video starts playing. This part is working fine.
Once the time is up, the game scene is paused by setting isPaused = true on my SKScene subclass (let's call it GameScene).
However, the player also has a chance to restart the game by pressing a button. When the game is restarted, the game scene is "unpaused" and the game state is reset like this:
gameView.isPaused = false // gameView is an SKView presenting the scene
gameScene.resetGameState()
The resetGameState method simply resets the players' scores, then pauses and rewinds the video:
public func resetGameState() {
    isGameStarted = false
    isGameStopped = false
    artificialPlayerScoreboard.score = 0
    humanPlayerScoreboard.score = 0
    video.pause()
    videoPlayer.seek(to: CMTime(seconds: 0, preferredTimescale: 1))
}
Here is how I create the video and video player:
if let videoPath = Bundle.main.path(forResource: "video", ofType: "mp4") {
    videoPlayer = AVPlayer(url: URL(fileURLWithPath: videoPath))
    video = SKVideoNode(avPlayer: videoPlayer)
    video.size = playerScreenSize
    video.anchorPoint = CGPoint(x: 0, y: 0)
    video.position = CGPoint.zero
    video.pause()
    artificialPlayerScreen.addChild(video)
}
The problem is that the video starts playing immediately after the game is restarted despite being paused. I've been trying to figure out why and came up with this hack:
gameView.isPaused = false
// Can't do this immediately because the video will start playing.
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 0.1, execute: {
    self.gameScene.resetGameState()
})
Basically it does the same thing but pauses the video after a small delay.
Is this normal behaviour of SKVideoNode? Am I using it right?
I am using an AVPlayer object to play a live stream of an MP3, using just an HTTP link.
As with normal live video playing in iOS, there is a button that you can press to skip to live.
Is it possible to do this with AVPlayer?
E.g. I am listening live, pause, then play again after however long; it continues from where I left off. But what if I want to skip back to live?
I had the same need and made this extension to AVPlayer:
extension AVPlayer {
    func seekToLive() {
        if let items = currentItem?.seekableTimeRanges, !items.isEmpty {
            // The last seekable range ends at the live edge
            let range = items[items.count - 1]
            let timeRange = range.timeRangeValue
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            seek(to: CMTimeMakeWithSeconds(startSeconds + durationSeconds, preferredTimescale: 1))
        }
    }
}
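Usage is then just (assuming an AVPlayer instance named player):

player.seekToLive()
player.play()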
I'm really out of ideas, so I'll have to ask you guys again...
I'm building an iPhone application which uses three instances of AVPlayer. They all play at the same time, and it's very important that they do so. I used to run this code:
CMClockRef syncTime = CMClockGetHostTimeClock();
CMTime hostTime = CMClockGetTime(syncTime);
[self.playerOne setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerTwo setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
[self.playerThree setRate:1.0f time:kCMTimeInvalid atHostTime:hostTime];
which worked perfectly. But a few days ago it just stopped working: the three players are delayed by about 300-400 ms (which is way too much; anything under 100 ms would be okay). Two of these AVPlayers do some audio processing, which takes more time than a "normal" AVPlayer, but it used to work before, and the currentTime property tells me that these players are delayed, so the syncing seems to fail.
I have no idea why it stopped working; I didn't really change anything. I'm using an observer where I can query the self.playerX.currentTime property, which reports a delay of about 0.3-0.4 seconds. I already tried to resync the players if the delay exceeds 0.1 s, but the delay is still there. So I think the audio processing of players 1 and 2 can't be responsible for the delay, as the currentTime property does know they are delayed (I hope you know what I mean). Maybe some of you know why I'm having such a horrible delay, or can provide another idea.
Thanks in advance!
So, I found the solution. I forgot to call [self.playerX prerollAtRate:]. I thought that if the observer reports AVPlayerStatusReadyToPlay, it means the player is "really" ready. In fact, it does not. After an AVPlayer is readyToPlay, it has to be prerolled. Once that is done, you can sync your players. The delay is now somewhere around 0.000006 seconds.
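In Swift, the same pattern looks roughly like this (a sketch, assuming an AVPlayer named player; note that on iOS 10+ setRate(_:time:atHostTime:) also requires automaticallyWaitsToMinimizeStalling to be false):

// Preroll first, then start on the shared host clock
player.automaticallyWaitsToMinimizeStalling = false // required by setRate(_:time:atHostTime:)
player.preroll(atRate: 1.0) { finished in
    guard finished else { return }
    let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
    player.setRate(1.0, time: .invalid, atHostTime: hostTime)
}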
Full function to sync AVPlayers across multiple iOS devices:
private func startTribePlayer() {
    // Derive a shared playback position from the wall clock so that
    // every device seeks to the same offset within the track
    let dateFormatterGet = DateFormatter()
    dateFormatterGet.dateFormat = "yyyy-MM-dd"
    guard let refDate = dateFormatterGet.date(from: "2019-01-01") else { return }
    let tsRef = Date().timeIntervalSince(refDate)
    // currentDuration is avplayeritem.duration().seconds
    let remainder = tsRef.truncatingRemainder(dividingBy: currentDuration)
    let ratio = remainder / currentDuration
    let seekTime = ratio * currentDuration
    let bufferTime = 0.5
    let bufferSeekTime = seekTime + bufferTime
    let mulFactor = 10000.0
    let timeScale = CMTimeScale(mulFactor)
    let seekCMTime = CMTime(value: CMTimeValue(bufferSeekTime * mulFactor), timescale: timeScale)
    let syncTime = CMClockGetHostTimeClock()
    let hostTime = CMClockGetTime(syncTime)
    tribeMusicPlayer?.seek(to: seekCMTime, toleranceBefore: .zero, toleranceAfter: .zero, completionHandler: { [weak self] (successSeek) in
        guard let tvc = self, tvc.tribeMusicPlayer?.currentItem?.status == .readyToPlay else { return }
        // Preroll first, then start at the exact seek position on the host clock
        tvc.tribeMusicPlayer?.preroll(atRate: 1.0, completionHandler: { [tvc] (successPreroll) in
            tvc.tribePlayerDidPlay = true
            tvc.tribeMusicPlayer?.setRate(1.0, time: seekCMTime, atHostTime: hostTime)
        })
    })
}