Playing an audio file repeatedly with AVAudioEngine - ios

I'm working on an iOS app with Swift and Xcode 6. I can play an audio file using an AVAudioEngine, and up to this point everything is OK. But how can I play it without stopping, i.e. so that when it finishes playing it starts again?
This is my code:
/*==================== CONFIGURES THE AVAUDIOENGINE ====================*/
audioEngine.reset() // Resets any previous configuration of the audio engine
let audioPlayerNode = AVAudioPlayerNode() // The node that will play the actual sound
audioEngine.attachNode(audioPlayerNode) // Attaches the node to the audio engine
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: nil) // Connects the applause playback node to the sound output
audioPlayerNode.scheduleFile(applause.applauseFile, atTime: nil, completionHandler: nil)
audioEngine.startAndReturnError(nil)
audioPlayerNode.play() // Plays the sound
Before telling me that I should use AVAudioPlayer for this: I can't, because later I will have to apply some effects and play three audio files at the same time, also repeatedly.
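For context, the multi-file-plus-effects requirement is exactly what AVAudioEngine's node graph is for. A minimal sketch (not from the original post; the file names `clip1`–`clip3` and the reverb preset are placeholders) of three looping players sharing one engine, with an effect on one of them:

```swift
import AVFoundation

let engine = AVAudioEngine()
let players = [AVAudioPlayerNode(), AVAudioPlayerNode(), AVAudioPlayerNode()]
let reverb = AVAudioUnitReverb()
reverb.loadFactoryPreset(.largeHall)
reverb.wetDryMix = 40
engine.attach(reverb)

for (index, player) in players.enumerated() {
    engine.attach(player)
    guard let url = Bundle.main.url(forResource: "clip\(index + 1)", withExtension: "wav"),
          let file = try? AVAudioFile(forReading: url),
          let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length))
    else { continue }
    try? file.read(into: buffer)

    // Route the first player through the reverb, the others straight to the mixer.
    if index == 0 {
        engine.connect(player, to: reverb, format: buffer.format)
        engine.connect(reverb, to: engine.mainMixerNode, format: buffer.format)
    } else {
        engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
    }
    player.scheduleBuffer(buffer, at: nil, options: .loops)
}

try? engine.start()
players.forEach { $0.play() }
```

Each player node loops its own buffer independently, and the mixer sums them, so all three files repeat simultaneously.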

I found the solution in another question, asked and self-answered by @CarveDrone, so I've just copied the code he used:
class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)

        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }
}
The only thing you have to change is the filePath constant. Here is the link to the original answer: Having AVAudioEngine repeat a sound

Swift 5 version, thanks to @Guillermo Barreiro:
var audioEngine: AVAudioEngine = AVAudioEngine()
var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

override func viewDidLoad() {
    super.viewDidLoad()

    guard let filePath: String = Bundle.main.path(forResource: "chimes", ofType: "wav") else { return }
    print("\(filePath)")
    let fileURL: URL = URL(fileURLWithPath: filePath)
    guard let audioFile = try? AVAudioFile(forReading: fileURL) else { return }
    let audioFormat = audioFile.processingFormat
    let audioFrameCount = UInt32(audioFile.length)
    guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }
    do {
        try audioFile.read(into: audioFileBuffer)
    } catch {
        print("Could not read file into buffer: \(error)")
    }

    let mainMixer = audioEngine.mainMixerNode
    audioEngine.attach(audioFilePlayer)
    audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
    try? audioEngine.start()

    audioFilePlayer.play()
    audioFilePlayer.scheduleBuffer(audioFileBuffer, at: nil, options: .loops)
}

Related

Getting distorted sound if changing rate of audio using AVAudioPlayerNode and AVAudioEngine

I want to change the rate and pitch of an audio file. When I change the rate, the pitch changes along with it and the audio no longer sounds clear.
I'm using this code:
var engine: AVAudioEngine!
var player: AVAudioPlayerNode!
var pitch: AVAudioUnitTimePitch!
var file = AVAudioFile()
var totalDuration: TimeInterval!

func configurePlayer() {
    engine = AVAudioEngine()
    player = AVAudioPlayerNode()
    player.volume = 1.0
    let path = Bundle.main.path(forResource: "sample", ofType: "wav")
    let url = URL(fileURLWithPath: path!)
    file = try! AVAudioFile(forReading: url)
    let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length))
    do {
        try file.read(into: buffer!)
    } catch {
        print("Could not read file into buffer: \(error)")
    }
    pitch = AVAudioUnitTimePitch()
    pitch.rate = 1
    engine.attach(player)
    engine.attach(pitch)
    engine.connect(player, to: pitch, format: buffer?.format)
    engine.connect(pitch, to: engine.mainMixerNode, format: buffer?.format)
    player.scheduleBuffer(buffer!, at: nil, options: .loops, completionHandler: nil)
    engine.prepare()
    do {
        try engine.start()
    } catch {
        print("Could not start engine: \(error)")
    }
}
@IBAction func slideValueChanged(_ sender: UISlider) {
    let newRate = sender.value / 120
    pitch.rate = newRate
}
When I change the rate with the slider, the sound is distorted. The slider's minimumValue is 60 and its maximumValue is 240.
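For reference, `AVAudioUnitTimePitch` is designed to change rate while preserving pitch (unlike `AVAudioUnitVarispeed`, which couples the two), so audible artifacts usually come from extreme rate values rather than from the node itself. A minimal sketch, not from the original question, assuming the 60–240 slider range described above and clamping the mapped rate to a moderate band:

```swift
import AVFoundation

let timePitch = AVAudioUnitTimePitch()

// Map a slider value in [60, 240] to a playback rate in [0.5, 2.0].
// AVAudioUnitTimePitch.rate accepts 1/32...32, but values near 1.0
// keep time-stretching artifacts to a minimum.
func applyRate(fromSliderValue value: Float) {
    let mapped = value / 120           // 60 -> 0.5, 120 -> 1.0, 240 -> 2.0
    timePitch.rate = min(max(mapped, 0.5), 2.0)
    timePitch.pitch = 0                // pitch is in cents; 0 leaves it unchanged
}
```

If rate must go beyond roughly 0.5–2.0 without quality loss, chunking the range (or pre-rendering at different rates) is usually needed; no time-stretcher is artifact-free at large ratios.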

Is there any way to make iOS device play sound signal in a buffer continuously in swift?

I am working with AVFoundation in Swift to emit an ultrasonic sine wave, and my current approach is to play a .wav file. I wonder if there is a way to play the sound continuously instead of relying on an extra .wav file.
Here is my code, but I don't think the new code will look similar to this:
let myThread = Thread(target: self,
                      selector: #selector(ZouViewController.play()),
                      object: nil)
myThread.start()
[...]
func play() {
    // rewrite soon
    let fileName = Bundle.main.path(forResource: "19kHz", ofType: "wav")
    let url = URL(fileURLWithPath: fileName!)
    soundPlayer = try? AVAudioPlayer(contentsOf: url)
    while true {
        soundPlayer?.play()
    }
}
The file 19kHz.wav contains a 19 kHz sine wave, but its duration is unavoidably finite, so there is a sudden discontinuity in the signal every time the loop restarts it. I want to abandon that approach and instead play the data continuously from a buffer. Is there any way to play a sound signal from a buffer?
I have addressed it with the following code:
import Foundation
import AVFoundation

class PlaySineWave {
    var audioEngine = AVAudioEngine()
    var audioFormat: AVAudioFormat
    let FL: AVAudioFrameCount = 44100
    let freq: Float = 19000 // 19 kHz
    var pcmBuffer: AVAudioPCMBuffer

    init() {
        self.audioFormat = AVAudioFormat(standardFormatWithSampleRate: 44100.0, channels: 1)
        self.pcmBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat,
                                          frameCapacity: AVAudioFrameCount(FL))
        self.pcmBuffer.frameLength = AVAudioFrameCount(FL)
    }

    func play() {
        // Fill the buffer with one second of a 19 kHz sine wave
        let floatData = self.pcmBuffer.floatChannelData!.pointee
        let step = 2 * Float.pi / Float(FL)
        for i in 0 ..< Int(FL) {
            floatData[i] = 0.3 * sinf(freq * Float(i) * step)
        }
        let playerNode = AVAudioPlayerNode()
        self.audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: pcmBuffer.format)
        do {
            try audioEngine.start()
        } catch let err as NSError {
            print("Oh, no! \(err.code) \(err.domain)")
        }
        playerNode.play()
        playerNode.scheduleBuffer(pcmBuffer, at: nil, options: [.loops]) { }
        //audioEngine.stop()
    }
}
After defining the class, it is called from the ViewController as follows:
override func viewDidLoad() {
    [...]
    let myThread = Thread(target: self,
                          selector: #selector(SpectralViewController.play),
                          object: nil)
    myThread.start()
}
[...]
func play() {
    let player = PlaySineWave()
    player.play()
}
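One detail worth noting about the looped sine buffer (an observation, not part of the original answer): the loop is only click-free if the buffer contains a whole number of sine cycles, so the waveform lines up at the loop point. With a 44100-frame buffer at 44.1 kHz, 19 000 Hz works out to exactly 19 000 cycles per buffer, but an arbitrary frequency would not. A small sketch of the check:

```swift
// A looped buffer of `frameCount` frames at `sampleRate` contains this many
// sine cycles; the loop is seamless only when the count is an integer.
func cyclesPerBuffer(frequency: Double, frameCount: Double, sampleRate: Double) -> Double {
    return frequency * frameCount / sampleRate
}

let exact = cyclesPerBuffer(frequency: 19000, frameCount: 44100, sampleRate: 44100)
// 19000.0 — a whole number of cycles, so the 19 kHz loop has no discontinuity

let offGrid = cyclesPerBuffer(frequency: 19000.5, frameCount: 44100, sampleRate: 44100)
// 19000.5 — half a cycle left over, which would click at every loop boundary
```

For frequencies that don't divide evenly, one common fix is to round the frequency to the nearest value that yields an integer cycle count for the chosen buffer length.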

Use peripheral as AVAudio output

Once I have connected a Bluetooth LE peripheral (headphones) to my device, how can I use it to play sound?
EDIT: I want to force the sound to play on the peripheral.
Currently I'm using this code to play sound on the device speaker:
var engine = AVAudioEngine()
var player = AVAudioPlayerNode()
var pitch = AVAudioUnitTimePitch()

override func viewDidLoad() {
    super.viewDidLoad()
    player.volume = 1.0

    let path = NSBundle.mainBundle().pathForResource("Test", ofType: "m4a")!
    let url = NSURL.fileURLWithPath(path)
    let file = try? AVAudioFile(forReading: url)
    let buffer = AVAudioPCMBuffer(PCMFormat: file!.processingFormat, frameCapacity: AVAudioFrameCount(file!.length))
    do {
        try file!.readIntoBuffer(buffer)
    } catch _ {
    }

    engine.attachNode(player)
    engine.attachNode(pitch)
    engine.connect(player, to: pitch, format: buffer.format)
    engine.connect(pitch, to: engine.mainMixerNode, format: buffer.format)
    player.scheduleBuffer(buffer, atTime: nil, options: .Loops, completionHandler: nil)
    engine.prepare()
    do {
        try engine.start()
    } catch _ {
    }
}
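Not part of the original post, but the usual lever here is AVAudioSession rather than AVAudioEngine: output routing is controlled by the session's category options, and classic Bluetooth audio (A2DP) headphones become an output route through the session, while a Core Bluetooth LE connection by itself is not an audio route. A sketch in current Swift:

```swift
import AVFoundation

// Configure the shared audio session so playback may route to
// Bluetooth A2DP headphones when they are connected.
func configureSessionForBluetoothPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback,
                                mode: .default,
                                options: [.allowBluetoothA2DP])
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
    // Inspect the current route to confirm where audio is actually going.
    for output in session.currentRoute.outputs {
        print("Output port: \(output.portType.rawValue) - \(output.portName)")
    }
}
```

Note that iOS picks the route; an app cannot force output to a specific device beyond choosing category options (and `overrideOutputAudioPort` for the built-in speaker).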

detect the end of a file in AVAudioPlayerNode

I have set up an audio multitrack player using Apple's AVFoundation. I use nine AVAudioPlayerNodes attached to an AVAudioEngine, and they are played at precisely the same time. In SpriteKit, in my game scene, I would like to detect the end of the file in any of the AVAudioPlayerNodes so that I can run subsequent code. How do I do that? Unfortunately AVAudioPlayerNode doesn't have the same convenient callbacks as the simple AVAudioPlayer class. Here is the multitrack function:
import SpriteKit
import AVFoundation
var onesie = AVAudioPlayer()
var singleTrack = AVAudioPlayerNode()
var trackOne = AVAudioPlayerNode()
var trackTwo = AVAudioPlayerNode()
var trackThree = AVAudioPlayerNode()
var trackFour = AVAudioPlayerNode()
var trackFive = AVAudioPlayerNode()
var trackSix = AVAudioPlayerNode()
var trackSeven = AVAudioPlayerNode()
var trackEight = AVAudioPlayerNode()
var trackNine = AVAudioPlayerNode()
//variables to hold NSURLs as AVAudioFiles for use in AudioPlayer Nodes.
var single = AVAudioFile()
var one = AVAudioFile()
var two = AVAudioFile()
var three = AVAudioFile()
var four = AVAudioFile()
var five = AVAudioFile()
var six = AVAudioFile()
var seven = AVAudioFile()
var eight = AVAudioFile()
var nine = AVAudioFile()
//variables for the audio engine and player nodes. The "mixer" is part of the engine and already hooked up to the output
var engine = AVAudioEngine()
//reference the mixer
let mainMixer = engine.mainMixerNode
func audioMultiTrack(trackOneFN: String, trackTwoFN: String, trackThreeFN: String, trackFourFN: String, trackFiveFN: String, trackSixFN: String, trackSevenFN: String, trackEightFN: String, trackNineFN: String){
/*access audio files for audio players (tracks)*/
//1
guard let trackOneFile = NSBundle.mainBundle().URLForResource(trackOneFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//2
guard let trackTwoFile = NSBundle.mainBundle().URLForResource(trackTwoFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//3
guard let trackThreeFile = NSBundle.mainBundle().URLForResource(trackThreeFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//4
guard let trackFourFile = NSBundle.mainBundle().URLForResource(trackFourFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//5
guard let trackFiveFile = NSBundle.mainBundle().URLForResource(trackFiveFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//6
guard let trackSixFile = NSBundle.mainBundle().URLForResource(trackSixFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//7
guard let trackSevenFile = NSBundle.mainBundle().URLForResource(trackSevenFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//8
guard let trackEightFile = NSBundle.mainBundle().URLForResource(trackEightFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//9
guard let trackNineFile = NSBundle.mainBundle().URLForResource(trackNineFN, withExtension: "mp3") else {
fatalError("File not found.")
}
//place NSURLs in AVAudioFile variables
//1
do {
try one = AVAudioFile(forReading: trackOneFile)
} catch {
fatalError("error loading track one file.")
}
//2
do {
try two = AVAudioFile(forReading: trackTwoFile)
} catch {
fatalError("error loading track two file.")
}
//3
do {
try three = AVAudioFile(forReading: trackThreeFile)
} catch {
fatalError("error loading track three file.")
}
//4
do {
try four = AVAudioFile(forReading: trackFourFile)
} catch {
fatalError("error loading track four file.")
}
//5
do {
try five = AVAudioFile(forReading: trackFiveFile)
} catch {
fatalError("error loading track five file.")
}
//6
do {
try six = AVAudioFile(forReading: trackSixFile)
} catch {
fatalError("error loading track six file.")
}
//7
do {
try seven = AVAudioFile(forReading: trackSevenFile)
} catch {
fatalError("error loading track seven file.")
}
//8
do {
try eight = AVAudioFile(forReading: trackEightFile)
} catch {
fatalError("error loading track eight file.")
}
//9
do {
try nine = AVAudioFile(forReading: trackNineFile)
} catch {
fatalError("error loading track nine file.")
}
/*hook up audio units*/
//attach audio players (tracks) to audio engine
engine.attachNode(trackOne)
engine.attachNode(trackTwo)
engine.attachNode(trackThree)
engine.attachNode(trackFour)
engine.attachNode(trackFive)
engine.attachNode(trackSix)
engine.attachNode(trackSeven)
engine.attachNode(trackEight)
engine.attachNode(trackNine)
//connect the tracks to the mixer
engine.connect(trackOne, to: mainMixer, format: nil)
engine.connect(trackTwo, to: mainMixer, format: nil)
engine.connect(trackThree, to: mainMixer, format: nil)
engine.connect(trackFour, to: mainMixer, format: nil)
engine.connect(trackFive, to: mainMixer, format: nil)
engine.connect(trackSix, to: mainMixer, format: nil)
engine.connect(trackSeven, to: mainMixer, format: nil)
engine.connect(trackEight, to: mainMixer, format: nil)
engine.connect(trackNine, to: mainMixer, format: nil)
//connect audio files to audio players (tracks)
trackOne.scheduleFile(one, atTime: nil, completionHandler: nil)
trackTwo.scheduleFile(two, atTime: nil, completionHandler: nil)
trackThree.scheduleFile(three, atTime: nil, completionHandler: nil)
trackFour.scheduleFile(four, atTime: nil, completionHandler: nil)
trackFive.scheduleFile(five, atTime: nil, completionHandler: nil)
trackSix.scheduleFile(six, atTime: nil, completionHandler: nil)
trackSeven.scheduleFile(seven, atTime: nil, completionHandler: nil)
trackEight.scheduleFile(eight, atTime: nil, completionHandler: nil)
trackNine.scheduleFile(nine, atTime: nil, completionHandler: nil)
//try to start the audio engine
do {
try engine.start()
} catch {
print("error starting engine")
}
//function to create a precise time to start all audio players (tracks)
func startTime () ->AVAudioTime{
let samplerate = one.processingFormat.sampleRate
let sampleTime = AVAudioFramePosition(samplerate)
let time = AVAudioTime(sampleTime: sampleTime, atRate: samplerate)
return time
}
//start audio players (tracks) at precise time
trackOne.playAtTime(startTime())
trackTwo.playAtTime(startTime())
trackThree.playAtTime(startTime())
trackFour.playAtTime(startTime())
trackFive.playAtTime(startTime())
trackSix.playAtTime(startTime())
trackSeven.playAtTime(startTime())
trackEight.playAtTime(startTime())
trackNine.playAtTime(startTime())
}
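The question itself points at the mechanism: `scheduleFile` takes a `completionHandler`, which fires once the player has consumed the scheduled file. A hedged sketch in modern Swift (the callback name is illustrative, not from the original code; `player`/`file` stand for any of the nine pairs above):

```swift
import AVFoundation

// Schedule one track and get notified when it finishes.
func scheduleWithEndDetection(player: AVAudioPlayerNode,
                              file: AVAudioFile,
                              onTrackEnded: @escaping () -> Void) {
    player.scheduleFile(file, at: nil) {
        // The handler runs on a background thread; hop to the main
        // queue before touching SpriteKit scene state.
        DispatchQueue.main.async {
            onTrackEnded()
        }
    }
}
```

One caveat: the plain completion handler fires when the data has been consumed by the engine, which can be slightly before the sound is audibly finished. On iOS 11 and later, `scheduleFile(_:at:completionCallbackType:completionHandler:)` with `.dataPlayedBack` fires when playback has actually completed.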

Having AVAudioEngine repeat a sound

I've been having trouble making the code below repeat the sound at audioURL over and over again. Right now it just plays it once when the view is opened, then stops.
import UIKit
import AVFoundation

class aboutViewController: UIViewController {

    var audioUrl = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!)
    var audioEngine = AVAudioEngine()
    var myPlayer = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        audioEngine.attachNode(myPlayer)
        let audioFile = AVAudioFile(forReading: audioUrl, error: nil)
        var audioError: NSError?
        audioEngine.connect(myPlayer, to: audioEngine.mainMixerNode, format: audioFile.processingFormat)
        myPlayer.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
        audioEngine.startAndReturnError(&audioError)
        myPlayer.play()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Thanks!
After hours and hours of searching, this did it:
class aboutViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let filePath: String = NSBundle.mainBundle().pathForResource("chimes", ofType: "wav")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)
        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)

        let mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: .Loops, completionHandler: nil)
    }
    ...
}
