Swift - How to get the current position of AVAudioPlayerNode while it's looping?

I have an AVAudioPlayerNode looping a segment of a song:
audioPlayer.scheduleBuffer(segment, at: nil, options: .loops)
I want to get the current position of the song while it's playing. Usually this is done by calculating currentTime = currentFrame / audioSampleRate,
where
var currentFrame: AVAudioFramePosition {
    guard let lastRenderTime = audioPlayer.lastRenderTime,
          let playerTime = audioPlayer.playerTime(forNodeTime: lastRenderTime) else {
        return 0
    }
    return playerTime.sampleTime
}
However, when the loop ends and restarts, currentFrame does not reset to zero; it keeps increasing, so currentFrame / audioSampleRate no longer gives the position within the loop.
So what is the correct way to calculate the current position?

Good old modulo will do the job:
public var currentTime: TimeInterval {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return 0
    }
    // Elapsed player time, wrapped by the duration of the looped file.
    let time = (Double(playerTime.sampleTime) / playerTime.sampleRate)
        .truncatingRemainder(dividingBy: Double(file.length) / Double(playerTime.sampleRate))
    return time
}

Related

Equivalent of getBufferedPosition() in ExoPlayer for iOS AVPlayer

I am currently developing an AVPlayer app which calculates HLS streaming metrics. I want to get the buffer level for the current item.
private var availableDuration: Double {
    guard let timeRange = player.currentItem?.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let startSeconds = timeRange.start.seconds
    let durationSeconds = timeRange.duration.seconds
    return startSeconds + durationSeconds
}
I am a little confused by the terminology used in Apple's documentation. Here I am getting the availableDuration of the current item, but I am not sure whether this represents the buffer level of the current item.
Your code seems OK; I used the same approach:
var bufferInSeconds: Double {
    guard let range = self.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let sec = range.start.seconds + range.duration.seconds
    return sec >= 0 ? sec : 0
}
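If what you actually want is how many seconds are buffered ahead of the playhead rather than the absolute end of the loaded range, one option (my own sketch, not part of the answer) is to subtract the current playback time:
// Sketch: seconds buffered ahead of the current playback position.
// Assumes `player` is the AVPlayer owning the current item.
var bufferedAhead: Double {
    guard let item = player.currentItem,
          let range = item.loadedTimeRanges.first?.timeRangeValue else {
        return 0.0
    }
    let bufferedEnd = range.start.seconds + range.duration.seconds
    return max(bufferedEnd - player.currentTime().seconds, 0)
}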

Set left and right headphone volume using two different sliders

I am generating a wave sound for different frequencies. The user should hear this sound through headphones only, and he/she will set the left and right headphone volumes using two different sliders. To produce the wave sound I wrote the code below, which works fine.
The problem is: for the last 5 days I have been trying to set the volume for the left and right headphones separately, but with no luck.
class Synth {
    // MARK: Properties
    public static let shared = Synth()
    public var volume: Float {
        set {
            audioEngine.mainMixerNode.outputVolume = newValue
        }
        get {
            audioEngine.mainMixerNode.outputVolume
        }
    }
    public var frequencyRampValue: Float = 0
    public var frequency: Float = 440 {
        didSet {
            if oldValue != 0 {
                frequencyRampValue = frequency - oldValue
            } else {
                frequencyRampValue = 0
            }
        }
    }
    private var audioEngine: AVAudioEngine
    private lazy var sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
        let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
        let localRampValue = self.frequencyRampValue
        let localFrequency = self.frequency - localRampValue
        let period = 1 / localFrequency
        for frame in 0..<Int(frameCount) {
            let percentComplete = self.time / period
            let sampleVal = self.signal(localFrequency + localRampValue * percentComplete, self.time)
            self.time += self.deltaTime
            self.time = fmod(self.time, period)
            for buffer in ablPointer {
                let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
                buf[frame] = sampleVal
            }
        }
        self.frequencyRampValue = 0
        return noErr
    }
    private var time: Float = 0
    private let sampleRate: Double
    private let deltaTime: Float
    private var signal: Signal
    // MARK: Init
    init(signal: @escaping Signal = Oscillator.square) {
        audioEngine = AVAudioEngine()
        let mainMixer = audioEngine.mainMixerNode
        let outputNode = audioEngine.outputNode
        let format = outputNode.inputFormat(forBus: 0)
        sampleRate = format.sampleRate
        deltaTime = 1 / Float(sampleRate)
        self.signal = signal
        let inputFormat = AVAudioFormat(commonFormat: format.commonFormat,
                                        sampleRate: format.sampleRate,
                                        channels: 1,
                                        interleaved: format.isInterleaved)
        audioEngine.attach(sourceNode)
        audioEngine.connect(sourceNode, to: mainMixer, format: inputFormat)
        audioEngine.connect(mainMixer, to: outputNode, format: nil)
        mainMixer.outputVolume = 0
        audioEngine.mainMixerNode.pan = 100 // this does not work
        //audioEngine.mainMixerNode.pan = 1.0 // this also does not work
        do {
            try audioEngine.start()
        } catch {
            print("Could not start engine: \(error.localizedDescription)")
        }
    }
    // This function will be called in the view controller to generate sound
    public func setWaveformTo(_ signal: @escaping Signal) {
        self.signal = signal
    }
}
With the above code I can hear the wave sound normally in both the left and right headphones.
I tried to set audioEngine.mainMixerNode.pan to 100 and -100, and also to -1.0 and 1.0, but this did not make any change.
The allowable range for the pan value is [-1.0, 1.0]. The values that you say you used are outside that range, so it's not surprising that they had no effect. Try 0.75 or -0.75 instead.
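For illustration, here is a sketch of one way two 0...1 slider values could drive the main mixer once pan stays inside that range. This is my own approximation (mapping the louder slider to outputVolume and the difference to pan), not code from the question or the answer, and it assumes the engine from the Synth class is reachable:
// Hypothetical mapping of two slider values onto pan + volume.
func applyHeadphoneLevels(left: Float, right: Float, engine: AVAudioEngine) {
    let mixer = engine.mainMixerNode
    let loudest = max(left, right)
    mixer.outputVolume = loudest
    // Pan must stay within -1.0 (full left) ... 1.0 (full right).
    mixer.pan = loudest > 0 ? (right - left) / loudest : 0
}
This only approximates independent channel levels; truly separate per-ear gain would need separate channel routing rather than the pan property alone.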

Swift: Trying to control time in AVAudioPlayerNode using UISlider

I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound.
To get the current time of the player I'm doing this:
extension AVAudioPlayerNode {
    var currentTime: TimeInterval {
        get {
            if let nodeTime: AVAudioTime = self.lastRenderTime,
               let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
                return Double(playerTime.sampleTime) / playerTime.sampleRate
            }
            return 0
        }
    }
}
I have a slider that indicates the current time of the audio. When the user changes the slider value, on the .ended event I have to change the current time of the player to the value indicated by the slider.
To do so:
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        if let nodetime = self.lastRenderTime {
            let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodetime)!
            let sampleRate = self.outputFormat(forBus: 0).sampleRate
            let newsampletime = AVAudioFramePosition(Int(sampleRate * Double(value)))
            let length = duration - value
            let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
            self.stop()
            if framestoplay > 1000 {
                self.scheduleSegment(audioFile, startingFrame: newsampletime, frameCount: framestoplay, at: nil, completionHandler: nil)
            }
        }
        self.play()
    }
}
However, my seekTo function is not working correctly (printing currentTime before and after the call always shows a negative value, roughly -0.02). What am I doing wrong, and is there a simpler way to change the currentTime of the player?
I ran into the same issue. Apparently framestoplay was always 0, which happened because of the sample rate: the value of playerTime.sampleRate was always 0 in my case.
So,
let framestoplay = AVAudioFrameCount(Float(playerTime.sampleRate) * length)
must be replaced with
let framestoplay = AVAudioFrameCount(Float(sampleRate) * length)
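Putting it together, a sketch of the extension with that change applied (lightly simplified from the question's code; the unused playerTime lookup is dropped):
extension AVAudioPlayerNode {
    func seekTo(value: Float, audioFile: AVAudioFile, duration: Float) {
        // Use the node's output sample rate; playerTime.sampleRate can be 0 here.
        let sampleRate = self.outputFormat(forBus: 0).sampleRate
        let newSampleTime = AVAudioFramePosition(sampleRate * Double(value))
        let framesToPlay = AVAudioFrameCount(Float(sampleRate) * (duration - value))
        self.stop()
        if framesToPlay > 1000 {
            self.scheduleSegment(audioFile, startingFrame: newSampleTime, frameCount: framesToPlay, at: nil, completionHandler: nil)
        }
        self.play()
    }
}
Also note that stop() resets the node's sample time, so after a seek the currentTime extension above reports time since the seek; when displaying the position you usually need to add the slider value back on.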

Split AKAudioFile into chunks separated by silence

Given a single AKAudioFile that has been created from an AKNodeRecorder containing a series of spoken words, where each word is separated by at least 1 second, what is the best approach to ultimately create a series of files with each file containing one word?
I believe this can be accomplished if there is a way to iterate the file in, for example, 100 ms chunks, and measure the average amplitude of each chunk. "Silent chunks" could be those below some arbitrarily small amplitude. While iterating, if I encounter a chunk with non-silent amplitude, I can grab the starting timestamp of this "non-silent" chunk to create an audio file that starts here and ends at the start time of the next "silent" chunk.
Whether it's a manual approach like the one above or a more built-in AudioKit processing technique, any suggestions would be greatly appreciated.
I don't have a complete solution, but I've started working on something similar to this. This function could serve as a jumping-off point for what you need. Basically you want to read the file into a buffer and then analyze the buffer data. At that point you could dice it up into smaller buffers and write those to file.
public class func guessBoundaries(url: URL, sensitivity: Double = 1) -> [Double]? {
    var out: [Double] = []
    guard let audioFile = try? AVAudioFile(forReading: url) else { return nil }
    let processingFormat = audioFile.processingFormat
    let frameCount = AVAudioFrameCount(audioFile.length)
    guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: processingFormat, frameCapacity: frameCount) else { return nil }
    audioFile.framePosition = 0
    do {
        try audioFile.read(into: pcmBuffer, frameCount: frameCount)
    } catch let err as NSError {
        AKLog("ERROR: Couldn't read data into buffer. \(err)")
        return nil
    }
    let channelCount = Int(pcmBuffer.format.channelCount)
    let bufferLength = 1024
    let inThreshold: Double = 0.001 / sensitivity
    let outThreshold: Double = 0.0001 * sensitivity
    let minSegmentDuration: Double = 1
    var counter = 0
    var thresholdCrossed = false
    var rmsBuffer = [Float](repeating: 0, count: bufferLength)
    var lastTime: Double = 0
    AKLog("inThreshold", inThreshold, "outThreshold", outThreshold)
    for i in 0 ..< Int(pcmBuffer.frameLength) {
        // n is the channel
        for n in 0 ..< channelCount {
            guard let sample: Float = pcmBuffer.floatChannelData?[n][i] else { continue }
            if counter == rmsBuffer.count {
                let time: Double = Double(i) / processingFormat.sampleRate
                // Average absolute amplitude over the last `bufferLength` samples.
                let avg = Double(rmsBuffer.reduce(0, +)) / Double(rmsBuffer.count)
                // AKLog("Average value at frame \(i):", avg)
                if avg > inThreshold && !thresholdCrossed && time - lastTime > minSegmentDuration {
                    thresholdCrossed = true
                    out.append(time)
                    lastTime = time
                } else if avg <= outThreshold && thresholdCrossed && time - lastTime > minSegmentDuration {
                    thresholdCrossed = false
                    out.append(time)
                    lastTime = time
                }
                counter = 0
            }
            rmsBuffer[counter] = abs(sample)
            counter += 1
        }
    }
    rmsBuffer.removeAll()
    return out
}
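If it helps, here is a rough follow-up sketch (my own, not part of the answer) that takes the returned boundary times, treats them as alternating segment start/end points, and writes each segment out with AVAudioFile. Naming and error handling are kept minimal:
import AVFoundation

// Sketch: write each detected word segment to its own file.
// `boundaries` is the [Double] returned by guessBoundaries(url:sensitivity:).
func writeSegments(from url: URL, boundaries: [Double], to directory: URL) throws {
    let source = try AVAudioFile(forReading: url)
    let format = source.processingFormat
    let sampleRate = format.sampleRate
    for (index, start) in stride(from: 0, to: boundaries.count - 1, by: 2).enumerated() {
        let startFrame = AVAudioFramePosition(boundaries[start] * sampleRate)
        let endFrame = AVAudioFramePosition(boundaries[start + 1] * sampleRate)
        let frameCount = AVAudioFrameCount(endFrame - startFrame)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { continue }
        source.framePosition = startFrame
        try source.read(into: buffer, frameCount: frameCount)
        let destination = directory.appendingPathComponent("word-\(index).caf")
        let output = try AVAudioFile(forWriting: destination, settings: format.settings)
        try output.write(from: buffer)
    }
}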

Need objects to move with a constant speed

I'm trying to create a game where the objects need to chase food. Right now the objects speed up when the food is within the given radius, but I need the speed to always be the same.
Any suggestions on how to fix this? I have tried to add an SKAction under the chase function, where I set position.x and position.y, but I can't make it work correctly.
Fish class:
class Fish: SKSpriteNode {
    private let kMovingAroundKey = "movingAround"
    private let kFishSpeed: CGFloat = 4.5
    private var swimmingSpeed: CGFloat = 100.0
    private let sensorRadius: CGFloat = 100.0
    private weak var food: SKSpriteNode! = nil // the food node that this fish currently chases
    override init(texture: SKTexture?, color: UIColor, size: CGSize) {
        super.init(texture: texture, color: color, size: size)
        physicsBody = SKPhysicsBody(rectangleOf: size)
        physicsBody?.affectedByGravity = false
        physicsBody?.categoryBitMask = Collider.fish
        physicsBody?.contactTestBitMask = Collider.food
        physicsBody?.collisionBitMask = 0x0 // No collisions with fish, only contact detection
        name = "fish"
        let sensor = SKShapeNode(circleOfRadius: 100)
        sensor.fillColor = .red
        sensor.zPosition = -1
        sensor.alpha = 0.1
        addChild(sensor)
    }
    func getDistanceFromFood() -> CGFloat? {
        if let food = self.food {
            return self.position.distance(point: food.position)
        }
        return nil
    }
    func lock(food: SKSpriteNode) {
        // We are chasing a food node at the moment
        if let currentDistanceFromFood = self.getDistanceFromFood() {
            if currentDistanceFromFood > self.position.distance(point: food.position) {
                // chase the closer food node
                self.food = food
                self.stopMovingAround()
            } // else, continue chasing the last locked food node
        // We are not chasing a food node at the moment
        } else {
            // go and chase it then
            if food.position.distance(point: self.position) <= self.sensorRadius {
                self.food = food
                self.stopMovingAround()
            }
        }
    }
    // Helper method. Not used currently. You can use this method to prevent chasing another (say closer) food while already chasing one
    func isChasing(food: SKSpriteNode) -> Bool {
        if self.food != nil {
            if self.food == food {
                return true
            }
        }
        return false
    }
    func stopMovingAround() {
        if self.action(forKey: kMovingAroundKey) != nil {
            removeAction(forKey: kMovingAroundKey)
        }
    }
    // MARK: Chasing the food
    // This method is called many times a second
    func chase(within rect: CGRect) {
        guard let food = self.food else {
            if action(forKey: kMovingAroundKey) == nil {
                self.moveAround(within: rect)
            }
            return
        }
        // Check if food is in the water
        if rect.contains(food.frame.origin) {
            // Take a detailed look at how chasing works in my Stack Overflow answer: https://stackoverflow.com/a/36235426
            let dx = food.position.x - self.position.x
            let dy = food.position.y - self.position.y
            let angle = atan2(dy, dx)
            let vx = cos(angle) * kFishSpeed
            let vy = sin(angle) * kFishSpeed
            position.x += vx
            position.y += vy
        }
    }
    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
    func moveAround(within rect: CGRect) {
        if scene != nil {
            // Go randomly around the screen within the view bounds
            let point = rect.randomPoint()
            // Formula: time = distance / speed
            let duration = TimeInterval(point.distance(point: position) / self.swimmingSpeed)
            let move = SKAction.move(to: point, duration: duration)
            let block = SKAction.run { [unowned self] in
                self.moveAround(within: rect)
            }
            let loop = SKAction.sequence([move, block])
            run(loop, withKey: kMovingAroundKey)
        }
    }
}
GameScene, where you can see the update function:
override func update(_ currentTime: TimeInterval) {
    self.enumerateChildNodes(withName: "fish") { [unowned self] node, stop in
        if let fish = node as? Fish {
            self.enumerateChildNodes(withName: "food") { node, stop in
                fish.lock(food: node as! SKSpriteNode)
            }
            fish.chase(within: self.water.frame)
        }
    }
}
Probably something like this (GameScene):
var prev: TimeInterval!
// MARK: Chasing the food
override func update(_ currentTime: TimeInterval) {
    defer { prev = currentTime }
    guard prev != nil else { return }
    let dt = currentTime - prev
    print("delta time \(dt)")
    self.enumerateChildNodes(withName: "fish") { [unowned self] node, stop in
        if let fish = node as? Fish {
            self.enumerateChildNodes(withName: "food") { node, stop in
                fish.lock(food: node as! SKSpriteNode)
            }
            fish.chase(within: self.water.frame, delta: CGFloat(dt))
        }
    }
}
The variable prev is a property of GameScene.
And change the chase() method in the Fish class:
// MARK: Chasing the food
func chase(within rect: CGRect, delta: CGFloat) {
    guard let food = self.food else {
        if action(forKey: kMovingAroundKey) == nil {
            self.moveAround(within: rect)
        }
        return
    }
    // Check if food is in the water
    if rect.contains(food.frame.origin) {
        // Take a detailed look at how chasing works in my Stack Overflow answer: https://stackoverflow.com/a/36235426
        // check for collision
        if self.frame.contains(food.frame.origin) {
            food.removeFromParent()
        } else {
            let dx = food.position.x - self.position.x
            let dy = food.position.y - self.position.y
            let angle = atan2(dy, dx)
            let vx = cos(angle) * self.swimmingSpeed * delta
            let vy = sin(angle) * self.swimmingSpeed * delta
            print("vx \(vx), vy \(vy)")
            position.x += vx
            position.y += vy
            // time = distance / speed
        }
    }
}
I have added a delta time parameter. You may wonder what delta time is; I will quote LearnCocos2d from that article:
Delta time is simply the time difference between the previous and the
current frame.
Why is this important for maintaining a constant speed of a node? Well, we use our Fish.swimmingSpeed variable to determine the speed of the fish (forget kFishSpeed, it doesn't have a purpose now).
Now, in the case of SKAction, the duration parameter directly determines the speed of the fish, because duration is a time and time = distance / speed, so currently we calculate the time like this:
let duration = TimeInterval(point.distance(point: position) / self.swimmingSpeed)
Now let's say that duration equals 1. That means the fish is going to move 100 pts per second. The difference with the update() method, as opposed to actions, is that it is executed 60 times per second. And because our chase() method is ideally called 60 times per second, our per-frame speed now has to be Fish.swimmingSpeed / 60.
And this is where delta time comes in. Because a frame may not be rendered in 1/60 of a second (0.016667), and rendering may instead take longer (e.g. 0.02 or 0.03 sec), we calculate that difference and use it to adjust the movement. That's kind of cheating in my opinion compared to the behaviour without delta time, because the player loses control for a moment if the game lags a lot (e.g. the hero teleports), but that part is off topic :) It is up to you to test what works / looks better for you.
So we do (in order to calculate the distance):
let vx = cos(angle) * self.swimmingSpeed * delta
let vy = sin(angle) * self.swimmingSpeed * delta
and that will give you a constant speed.
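A quick worked example with the numbers above (illustrative values only, not from the question):
let swimmingSpeed: CGFloat = 100      // points per second
let idealDelta: CGFloat = 1.0 / 60.0  // frame time at 60 fps
let laggyDelta: CGFloat = 0.02        // a slower frame
let normalStep = swimmingSpeed * idealDelta // ≈ 1.67 points moved this frame
let laggyStep = swimmingSpeed * laggyDelta  // 2 points for the longer frame
Either way the fish covers about 100 points per second, because a longer frame produces a proportionally larger step.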
I could go into more detail, but it is late here and you probably get the idea of how things work, so I will stop here. Happy coding!
