How to play a short audio file with CoreMotion and AVAudioPlayer in iOS?

I'm trying to play short sounds (1 to 4 seconds) when I move the iPhone on the X axis (I'm using CoreMotion and AVAudioPlayer). I want to play one sound for each movement direction change.
I wrote the code below, but when I move the iPhone, the sound is played many times even though the movement direction has not changed. The print calls below show many Down and Up entries, such as Down Down Down Up Down Down Up Up.... If I comment out both play() calls, the prints show the alternation I expect: Down Up Down Up Down Up....
Why is AVAudioPlayer.play called more than once when the movement direction changes?
override func viewDidLoad() {
    super.viewDidLoad()

    // Audio
    audioURL = NSBundle.mainBundle().URLForResource("shortSound", withExtension: "wav")!
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: audioURL)
        audioPlayer.prepareToPlay()
    } catch {
        print("audioPlayer failure")
    }

    // Sensor
    lastDirection = 0
    threshold = 2.1
    motionManager = CMMotionManager()
    if motionManager.accelerometerAvailable {
        let queue = NSOperationQueue()
        motionManager.startAccelerometerUpdatesToQueue(queue, withHandler: {
            data, error in
            guard let data = data else {
                return
            }
            // Get the acceleration
            let xAccel = data.acceleration.x
            let xPositive = xAccel > 0
            // Run if the acceleration is higher than threshold
            if abs(xAccel) > self.threshold {
                // Run only if the direction has changed
                if self.lastDirection != 1 && xPositive {
                    print("Up")
                    self.play()
                    self.lastDirection = 1
                } else if self.lastDirection != -1 && !xPositive {
                    print("Down")
                    self.play()
                    self.lastDirection = -1
                }
            }
        })
    }
}

func play() {
    audioPlayer.currentTime = 0
    audioPlayer.play()
}

You probably have a threading problem. You are running the updates on a background queue (queue, an arbitrary NSOperationQueue, which by the way you are also failing to retain), but then you are talking to self.lastDirection and calling self.play() on that same background queue without regard to the thread-safety of those activities.
I would suggest at the very least rewriting this section:
if self.lastDirection != 1 && xPositive {
    print("Up")
    self.play()
    self.lastDirection = 1
} else if self.lastDirection != -1 && !xPositive {
    print("Down")
    self.play()
    self.lastDirection = -1
}
...more like this:
dispatch_async(dispatch_get_main_queue()) {
    if self.lastDirection != 1 && xPositive {
        self.lastDirection = 1
        print("Up")
        self.play()
    } else if self.lastDirection != -1 && !xPositive {
        self.lastDirection = -1
        print("Down")
        self.play()
    }
}
Note that I've made two changes: I've stepped out to the main thread for the entire check-print-play-toggle dance, and I've reversed the order of events so that it goes check-toggle-print-play.
Also I would suggest two other changes: retain the operation queue (i.e. make it a property instead of a local), and reduce the frequency of motion manager updates (by setting a longer accelerometerUpdateInterval).
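As a minimal sketch of those two changes, keeping the same Swift 2 / iOS 9 APIs used in the question (the property name motionQueue is just illustrative):

// Keep the queue alive as a property instead of a local.
let motionQueue = NSOperationQueue()

// In viewDidLoad, before starting updates:
motionManager.accelerometerUpdateInterval = 1.0 / 10.0 // e.g. 10 Hz; a longer interval means fewer handler calls
motionManager.startAccelerometerUpdatesToQueue(motionQueue, withHandler: { data, error in
    // handler body as before
})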

Related

AudioKit playback cracks

I want to analyze the microphone input frequency and then play the correct note closest to the detected frequency. I did that with AudioKit.
This is working right now, but since I implemented AudioKit to get the frequency feature, the sound that plays after the frequency detection sometimes cracks during playback. That started happening after I implemented AudioKit. Everything was fine before that...
var mic: AKMicrophone!
var tracker: AKFrequencyTracker!
var silence: AKBooster!

func initFrequencyTracker() {
    AKSettings.channelCount = 2
    AKSettings.audioInputEnabled = true
    AKSettings.defaultToSpeaker = true
    AKSettings.allowAirPlay = true
    AKSettings.useBluetooth = true
    AKSettings.allowHapticsAndSystemSoundsDuringRecording = true
    mic = AKMicrophone()
    tracker = AKFrequencyTracker(mic)
    silence = AKBooster(tracker, gain: 0)
}

func deinitFrequencyTracker() {
    AKSettings.audioInputEnabled = false
    plotTimer.invalidate()
    do {
        try AudioKit.stop()
        AudioKit.output = nil
    } catch {
        print(error)
    }
}

func initPlotTimer() {
    AudioKit.output = silence
    do {
        try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth, .allowAirPlay, .allowBluetoothA2DP])
        try AudioKit.start()
    } catch {
        AKLog("AudioKit did not start!")
    }
    setupPlot()
    plotTimer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updatePlotUI), userInfo: nil, repeats: true)
}
func setupPlot() {
    let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
    plot.translatesAutoresizingMaskIntoConstraints = false
    plot.alpha = 0.3
    plot.plotType = .rolling
    plot.shouldFill = true
    plot.shouldCenterYAxis = false
    plot.shouldMirror = true
    plot.color = UIColor(named: uiFarbe)
    audioInputPlot.addSubview(plot)
    // Pin the AKNodeOutputPlot to the audioInputPlot
    var constraints = [plot.leadingAnchor.constraint(equalTo: audioInputPlot.leadingAnchor)]
    constraints.append(plot.trailingAnchor.constraint(equalTo: audioInputPlot.trailingAnchor))
    constraints.append(plot.topAnchor.constraint(equalTo: audioInputPlot.topAnchor))
    constraints.append(plot.bottomAnchor.constraint(equalTo: audioInputPlot.bottomAnchor))
    constraints.forEach { $0.isActive = true }
}
@objc func updatePlotUI() {
    if tracker.amplitude > 0.3 {
        let trackerFrequency = Float(tracker.frequency)
        guard trackerFrequency < 7_000 else {
            // This is a bit of a hack because modern MacBooks give super high frequencies
            return
        }
        var frequency = trackerFrequency
        while frequency > Float(noteFrequencies[noteFrequencies.count - 1]) {
            frequency /= 2.0
        }
        while frequency < Float(noteFrequencies[0]) {
            frequency *= 2.0
        }
        var minDistance: Float = 10_000.0
        var index = 0
        for i in 0..<noteFrequencies.count {
            let distance = fabsf(Float(noteFrequencies[i]) - frequency)
            if distance < minDistance {
                index = i
                minDistance = distance
            }
            print(minDistance, distance)
        }
        // let octave = Int(log2f(trackerFrequency / frequency))
        frequencyLabel.text = String(format: "%0.1f", tracker.frequency)
        if frequencyTranspose(note: notesToTanspose[index]) != droneLabel.text {
            momentaneNote = frequencyTranspose(note: notesToTanspose[index])
            droneLabel.text = momentaneNote
            stopSinglePlayer()
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.03, execute: {
                self.prepareSinglePlayerFirstForStart(note: self.momentaneNote)
                self.startSinglePlayer()
            })
        }
    }
}
func frequencyTranspose(note: String) -> String {
    var indexNote = notesToTanspose.firstIndex(of: note)!
    let chosenInstrument = UserDefaults.standard.object(forKey: "whichInstrument") as! String
    if chosenInstrument == "Bb" {
        if indexNote + 2 >= notesToTanspose.count {
            indexNote -= 12
        }
        return notesToTanspose[indexNote + 2]
    } else if chosenInstrument == "Eb" {
        if indexNote - 3 < 0 {
            indexNote += 12
        }
        return notesToTanspose[indexNote - 3]
    } else {
        return note
    }
}
It appears that your implementation can be improved by putting iOS multithreading principles into practice. I'm not an expert in the subject, but the key statement is: "the sound which plays after the frequency detection cracks sometimes during playback".
The "cracks" occur at random, unpredictable moments, and they happen while the computation is running.
So, move the code that doesn't need to run on the main thread to a background thread (https://developer.apple.com/documentation/DISPATCH).
While refactoring, you can test this hypothesis by changing the Timer interval: lower it to 0.05, for example, to call the callback more often, or raise it to 0.2 to call it less often, in which case you'll probably hear fewer random crackles.
Now, this is easier said than done when concurrency is involved, but that's what you need to improve.
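As a rough illustration of that idea, a sketch (not the original code) that assumes the same properties as in the question and pushes the note-matching math off the main thread; nearestNoteIndex(for:) is a hypothetical helper wrapping the existing octave-folding and nearest-note loop:

@objc func updatePlotUI() {
    // Read the tracker values once, on the main thread.
    let amplitude = tracker.amplitude
    let trackerFrequency = Float(tracker.frequency)
    guard amplitude > 0.3, trackerFrequency < 7_000 else { return }

    // Do the heavy computation on a background queue.
    DispatchQueue.global(qos: .userInitiated).async {
        let index = self.nearestNoteIndex(for: trackerFrequency) // hypothetical helper

        // Come back to the main queue for UI updates and player calls.
        DispatchQueue.main.async {
            self.frequencyLabel.text = String(format: "%0.1f", trackerFrequency)
            let transposed = self.frequencyTranspose(note: self.notesToTanspose[index])
            if transposed != self.droneLabel.text {
                self.momentaneNote = transposed
                self.droneLabel.text = transposed
                self.stopSinglePlayer()
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.03) {
                    self.prepareSinglePlayerFirstForStart(note: self.momentaneNote)
                    self.startSinglePlayer()
                }
            }
        }
    }
}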

AudioKit output changes to ear speakers

I implemented the AudioKit "MICROPHONE ANALYSIS" example https://audiokit.io/examples/MicrophoneAnalysis/ in my App.
I want to analyze the microphone input frequency and then play the correct note closest to the detected frequency.
Normally the sound output is the speaker or a Bluetooth device connected to my iPhone, but after implementing the "MICROPHONE ANALYSIS" example the sound output changed to the small earpiece speaker at the top of the iPhone that is normally used for calls.
How can I switch back to the "normal" speaker or to the connected Bluetooth device, like before?
var mic: AKMicrophone!
var tracker: AKFrequencyTracker!
var silence: AKBooster!

func initFrequencyTracker() {
    AKSettings.audioInputEnabled = true
    mic = AKMicrophone()
    tracker = AKFrequencyTracker(mic)
    silence = AKBooster(tracker, gain: 0)
}

func deinitFrequencyTracker() {
    plotTimer.invalidate()
    do {
        try AudioKit.stop()
        AudioKit.output = nil
    } catch {
        print(error)
    }
}

func initPlotTimer() {
    AudioKit.output = silence
    do {
        try AudioKit.start()
    } catch {
        AKLog("AudioKit did not start!")
    }
    setupPlot()
    plotTimer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updatePlotUI), userInfo: nil, repeats: true)
}
func setupPlot() {
    let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
    plot.translatesAutoresizingMaskIntoConstraints = false
    plot.alpha = 0.3
    plot.plotType = .rolling
    plot.shouldFill = true
    plot.shouldCenterYAxis = false
    plot.shouldMirror = true
    plot.color = UIColor(named: uiFarbe)
    audioInputPlot.addSubview(plot)
    // Pin the AKNodeOutputPlot to the audioInputPlot
    var constraints = [plot.leadingAnchor.constraint(equalTo: audioInputPlot.leadingAnchor)]
    constraints.append(plot.trailingAnchor.constraint(equalTo: audioInputPlot.trailingAnchor))
    constraints.append(plot.topAnchor.constraint(equalTo: audioInputPlot.topAnchor))
    constraints.append(plot.bottomAnchor.constraint(equalTo: audioInputPlot.bottomAnchor))
    constraints.forEach { $0.isActive = true }
}
@objc func updatePlotUI() {
    if tracker.amplitude > 0.1 {
        let trackerFrequency = Float(tracker.frequency)
        guard trackerFrequency < 7_000 else {
            // This is a bit of a hack because modern MacBooks give super high frequencies
            return
        }
        var frequency = trackerFrequency
        while frequency > Float(noteFrequencies[noteFrequencies.count - 1]) {
            frequency /= 2.0
        }
        while frequency < Float(noteFrequencies[0]) {
            frequency *= 2.0
        }
        var minDistance: Float = 10_000.0
        var index = 0
        for i in 0..<noteFrequencies.count {
            let distance = fabsf(Float(noteFrequencies[i]) - frequency)
            if distance < minDistance {
                index = i
                minDistance = distance
            }
        }
        // let octave = Int(log2f(trackerFrequency / frequency))
        frequencyLabel.text = String(format: "%0.1f", tracker.frequency)
        if frequencyTranspose(note: notesToTanspose[index]) != droneLabel.text {
            note = frequencyTranspose(note: notesToTanspose[index])
            droneLabel.text = note
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.03, execute: {
                self.prepareSinglePlayerFirstForStart(note: self.note)
                self.startSinglePlayer()
            })
        }
    }
}
func frequencyTranspose(note: String) -> String {
    var indexNote = notesToTanspose.firstIndex(of: note)!
    let chosenInstrument = UserDefaults.standard.object(forKey: "whichInstrument") as! String
    if chosenInstrument == "Bb" {
        if indexNote + 2 >= notesToTanspose.count {
            indexNote -= 12
        }
        return notesToTanspose[indexNote + 2]
    } else if chosenInstrument == "Eb" {
        if indexNote - 3 < 0 {
            indexNote += 12
        }
        return notesToTanspose[indexNote - 3]
    } else {
        return note
    }
}
It's good practice to control the session settings yourself, so start by creating a method in your application that takes care of that during initialisation.
Here's an example where I set a category and the desired options:
func start() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
        try session.setActive(true, options: .notifyOthersOnDeactivation)
        try session.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
        try AudioKit.start()
    } catch {
        // your error handler
    }
}
You can call this start method where you currently call AudioKit.start() in initPlotTimer.
The example above uses AVAudioSession directly, which I believe is what AKSettings wraps (please feel free to edit my answer if that's inaccurate, as I'm not looking at the AudioKit source code at the moment).
Now that AVAudioSession has been shown, let's stick with the method offered by AudioKit, since that's what you're dealing with.
Here's another example using AKSettings:
func start() {
    do {
        AKSettings.channelCount = 2
        AKSettings.ioBufferDuration = 0.002
        AKSettings.audioInputEnabled = true
        AKSettings.bufferLength = .medium
        AKSettings.defaultToSpeaker = true
        // check docs for other options and settings
        try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth])
        try AudioKit.start()
    } catch {
        // your handler
    }
}
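As a usage sketch (assuming the initPlotTimer from the question), the call simply replaces the existing do/catch around AudioKit.start():

func initPlotTimer() {
    AudioKit.output = silence
    start() // configures the session and starts AudioKit as shown above
    setupPlot()
    plotTimer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updatePlotUI), userInfo: nil, repeats: true)
}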
Keep in mind that you don't necessarily have to call the method start, or structure it exactly this way; I'm just spelling out the initialisation phase to make it readable for you and other use cases.
Reference:
https://developer.apple.com/documentation/avfoundation/avaudiosession/categoryoptions
https://audiokit.io/docs/Classes/AKSettings.html

How to detect apple watch position using accelerometer and Gravity in swift?

I am creating an application for Apple Watch. The logic is: when the user raises their hand and taps a button in the app, I capture the accelerometer values. Then, whenever the user raises their hand and matches the captured position, I have to send a message to the iPhone.
I am getting the values correctly, but the check is based only on the accelerometer values. That means the values can match even when the user hasn't raised their hand, and the message is still sent to the phone.
func startUpadateAccelerometer() {
    self.motionManager.accelerometerUpdateInterval = 1.0 / 10.0
    self.motionManager.startAccelerometerUpdates(to: OperationQueue()) { (accelerometerData, error) -> Void in
        guard accelerometerData != nil else {
            print("There was an error: \(String(describing: error))")
            return
        }
        DispatchQueue.main.async {
            if self.can_reset {
                let differenceX: Bool = self.validateButtom(currentValue: accelerometerData!.acceleration.x, inititalValue: self.gravityReference.x)
                let differenceY: Bool = self.validateButtom(currentValue: accelerometerData!.acceleration.y, inititalValue: self.gravityReference.y)
                if differenceX && differenceY && self.gravityOffsetDifference(currentValue: accelerometerData!.acceleration.x, referenceValue: self.gravityReference.x) && self.gravityOffsetDifference(currentValue: accelerometerData!.acceleration.y, referenceValue: self.gravityReference.y) {
                    WKInterfaceDevice().play(.success)
                    // self.addLog(_logStr: EventsTypes.Achievements1.rawValue)
                    self.logString += String(format: "X: %0.3f Y: %0.3f Z: %0.3f \n", accelerometerData!.acceleration.x, accelerometerData!.acceleration.y, accelerometerData!.acceleration.z)
                    self.m_XYZValueLbl.setText(self.logString)
                    self.is_RechedZeroPos = true
                    self.session?.sendMessage(["msg": "\(self.logString)"], replyHandler: nil) { (error) in
                        NSLog("%@", "Error sending message: \(error)")
                    }
                } else {
                    if self.checkAchievements2_3(deviceMotionData: accelerometerData!.acceleration) == true {
                        if self.is_RechedZeroPos == true {
                            self.addLog(_logStr: EventsTypes.Achievements2.rawValue)
                            self.is_RechedZeroPos = false
                        } else {
                            self.addLog(_logStr: EventsTypes.Achievements3.rawValue)
                        }
                    }
                }
            } else {
                self.gravityReference = accelerometerData!.acceleration
                // self.logString = String(format: "Reference Acceleration %0.3f %0.3f %0.3f \n", self.gravityReference.x, self.gravityReference.y, self.gravityReference.z)
                self.can_reset = true
            }
        }
    }
}
func validateButtom(currentValue: Double, inititalValue: Double) -> Bool {
    if currentValue == 0 && inititalValue == 0 {
        return true
    } else if currentValue < 0 && inititalValue < 0 {
        return true
    } else if currentValue > 0 && inititalValue > 0 {
        return true
    } else {
        return false
    }
}

func gravityOffsetDifference(currentValue: Double, referenceValue: Double) -> Bool {
    var difference: Double!
    if fabs(currentValue) <= fabs(referenceValue) {
        difference = fabs(referenceValue) - fabs(currentValue)
    } else {
        difference = fabs(currentValue) - fabs(referenceValue)
    }
    if difference <= gravityOffset {
        return true
    } else {
        return false
    }
}
Please guide me to get the values only when the user captured the position.
Accelerometers measure changes in velocity along the x, y, and z axes
Fetching accelerometer data
let motion = CMMotionManager()

func startAccelerometers() {
    // Make sure the accelerometer hardware is available.
    if self.motion.isAccelerometerAvailable {
        self.motion.accelerometerUpdateInterval = 1.0 / 60.0 // 60 Hz
        self.motion.startAccelerometerUpdates()
        // Configure a timer to fetch the data.
        self.timer = Timer(fire: Date(), interval: (1.0 / 60.0),
                           repeats: true, block: { (timer) in
            // Get the accelerometer data.
            if let data = self.motion.accelerometerData {
                let x = data.acceleration.x
                let y = data.acceleration.y
                let z = data.acceleration.z
                // Use the accelerometer data in your app.
            }
        })
        // Add the timer to the current run loop.
        RunLoop.current.add(self.timer!, forMode: .defaultRunLoopMode)
    }
}
Important
If your app relies on the presence of accelerometer hardware, configure the UIRequiredDeviceCapabilities key of its Info.plist file with the accelerometer value. For more information about the meaning of this key, see Information Property List Key Reference.
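The accelerometer alone cannot distinguish "arm raised" from any other motion that happens to produce similar values. A hedged sketch of an alternative (not from the original answer): use device motion's gravity vector, which describes orientation independently of hand movement. The 10 Hz interval and the -0.8 threshold are illustrative assumptions, not tuned values.

import CoreMotion

let motionManager = CMMotionManager()

func startGravityUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 10.0
    motionManager.startDeviceMotionUpdates(to: OperationQueue()) { data, _ in
        guard let gravity = data?.gravity else { return }
        // With the watch raised and its face pointing up, gravity.z approaches -1.
        let isRaised = gravity.z < -0.8 // illustrative threshold
        DispatchQueue.main.async {
            if isRaised {
                // The captured position is matched: send the message to the iPhone here.
            }
        }
    }
}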

How do I use the Swift sampler to play a tone then pause before playing the next?

I have code to take a sequence of letters in a string and interpret them as notes. The code will then play the notes. The problem is that they all play at the same time. How do I play them each as a quarter note, essentially to play a note, wait for it to end, and then play the next note?
@IBAction func playButton(sender: AnyObject) {
    fractalEngine.output = "adgadefe"
    var notes = Array(fractalEngine.output.characters)
    var counter = 0
    while counter < notes.count {
        var note = notes[counter]
        if note == "a" {
            play(58)
        } else if note == "b" {
            play(59)
        } else if note == "c" {
            play(60)
        } else if note == "d" {
            play(61)
        } else if note == "e" {
            play(62)
        } else if note == "f" {
            play(63)
        } else {
            play(64)
        }
        counter += 1
    }
    //self.sampler.startNote(60, withVelocity: 64, onChannel: 0)
}

func play(note: UInt8) {
    sampler.startNote(note, withVelocity: 64, onChannel: 0)
}

func stop(note: UInt8) {
    sampler.stopNote(note, onChannel: 0)
}
Here's the code that initiates the sampler:
func initAudio() {
    engine = AVAudioEngine()
    self.sampler = AVAudioUnitSampler()
    engine.attachNode(self.sampler)
    engine.connect(self.sampler, to: engine.outputNode, format: nil)
    guard let soundbank = NSBundle.mainBundle().URLForResource("gs_instruments", withExtension: "dls") else {
        print("Could not initalize soundbank.")
        return
    }
    let melodicBank: UInt8 = UInt8(kAUSampler_DefaultMelodicBankMSB)
    let gmHarpsichord: UInt8 = 6
    do {
        try engine.start()
        try self.sampler.loadSoundBankInstrumentAtURL(soundbank, program: gmHarpsichord, bankMSB: melodicBank, bankLSB: 0)
    } catch {
        print("An error occurred \(error)")
        return
    }
    /*
    self.musicSequence = createMusicSequence()
    createAVMIDIPlayer(self.musicSequence)
    createAVMIDIPlayerFromMIDIFIle()
    self.musicPlayer = createMusicPlayer(musicSequence)
    */
}
It looks like you need to delay playing each in turn. Here's one implementation (that avoids blocking the main thread).
//global delay helper function
func delay(delay: Double, closure: () -> ()) {
    dispatch_after(
        dispatch_time(
            DISPATCH_TIME_NOW,
            Int64(delay * Double(NSEC_PER_SEC))
        ),
        dispatch_get_main_queue(), closure)
}

//inside your playButton
let delayConstant = 0.05 //customize as needed
for (noteNumber, note) in notes.enumerate() {
    delay(delayConstant * Double(noteNumber)) {
        play(note) // assumes note has already been mapped to its UInt8 MIDI number
        //handle stopping
        delay(delayConstant) { stop(note) }
    }
}
What this does is play each note after an escalating delay, then stop it after the note length, which here is assumed to be the delay constant.
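To connect this to the letter-to-note mapping from the question, here's a hedged sketch (Swift 2 syntax to match the rest; the dictionary and the 0.5 s quarter-note length are illustrative, not from the original answer):

// Map each letter to the MIDI note number used in playButton.
let noteForLetter: [Character: UInt8] = ["a": 58, "b": 59, "c": 60, "d": 61,
                                         "e": 62, "f": 63, "g": 64]
let quarterNote = 0.5 // seconds per note, illustrative

let letters = Array(fractalEngine.output.characters)
for (i, letter) in letters.enumerate() {
    let midiNote = noteForLetter[letter] ?? 64
    delay(quarterNote * Double(i)) {
        self.play(midiNote)                        // start the note
        delay(quarterNote) { self.stop(midiNote) } // stop it one quarter note later
    }
}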

Integrating audio and accelerometer for Swift 2.0? (Using coremotion)

import CoreMotion
var ButtonAudio4URL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("Swing", ofType: "wav")!)
var ButtonAudioPlayer4 = AVAudioPlayer()

do {
    try ButtonAudioPlayer4 = AVAudioPlayer(contentsOfURL: ButtonAudio4URL)
    ButtonAudioPlayer4.prepareToPlay()
} catch {
    print("audioPlayer failure")
}

func play() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.play()
}

func play2() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.play()
}

func play3() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.play()
}

func stop() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.stop()
}

func stop2() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.stop()
}

func stop3() {
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.stop()
}
lastDirection = 0   //Idle accel
threshold = 1.0     //Minimum positive(right) accel
nthreshold = -1.0   //Minimum negative(left) accel

if motionManager.accelerometerAvailable {
    let queue = NSOperationQueue()
    motionManager.startAccelerometerUpdatesToQueue(queue, withHandler: {
        data, error in
        guard let data = data else {
            return
        }
        // Get the acceleration
        let xAccel = data.acceleration.x   //X accel
        let yAccel = data.acceleration.y   //Y accel
        let zAccel = data.acceleration.z   //Z accel
        let xPositive = xAccel > 0   //Positive(right) x accel
        let xNegative = xAccel < 0   //Negative(left) x accel
        let yPositive = yAccel > 0   //Positive(up) y accel
        let yNegative = yAccel < 0   //Negative(down) y accel
        let zPositive = zAccel > 0   //Positive(front) z accel
        let zNegative = zAccel < 0   //Negative(back) z accel
        // Run if the acceleration is higher than threshold
        if abs(xAccel) > self.threshold {
            //If moved right
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && xPositive {
                    self.lastDirection = 1
                    print("Up")
                    self.play()
                } else if self.lastDirection != -1 && !xPositive {
                    self.lastDirection = -1
                    print("Down")
                    self.play()
                }
            }
        }
        // Run if the acceleration is higher than nthreshold
        if abs(xAccel) < self.nthreshold {
            //If moved left
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && xNegative {
                    self.lastDirection = 1
                    print("Up")
                    self.play2()
                } else if self.lastDirection != -1 && !xNegative {
                    self.lastDirection = -1
                    print("Down")
                    self.play2()
                }
            }
        }
        // Run if the acceleration is higher than threshold
        if abs(yAccel) > self.threshold {
            //If moved up
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && yPositive {
                    self.lastDirection = 1
                    print("Up")
                    self.play()
                } else if self.lastDirection != -1 && !yPositive {
                    self.lastDirection = -1
                    print("Down")
                    self.play()
                }
            }
        }
        // Run if the acceleration is higher than nthreshold
        if abs(yAccel) < self.nthreshold {
            //If moved left
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && yNegative {
                    self.lastDirection = 1
                    print("Up")
                    self.play2()
                } else if self.lastDirection != -1 && !yNegative {
                    self.lastDirection = -1
                    print("Down")
                    self.play2()
                }
            }
        }
        // Run if the acceleration is higher than threshold
        if abs(zAccel) > self.threshold {
            //If moved front
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && zPositive {
                    self.lastDirection = 1
                    print("Up")
                    self.play()
                } else if self.lastDirection != -1 && !zPositive {
                    self.lastDirection = -1
                    print("Down")
                    self.play()
                }
            }
        }
        // Run if the acceleration is higher than threshold
        if abs(zAccel) < self.nthreshold {
            //If moved back
            dispatch_async(dispatch_get_main_queue()) {
                if self.lastDirection != 1 && zNegative {
                    self.lastDirection = 1
                    print("Up")
                    self.play2()
                } else if self.lastDirection != -1 && !zNegative {
                    self.lastDirection = -1
                    print("Down")
                    self.play2()
                }
            }
        }
    })
}
Hello,
I have built an iOS app with Xcode 7.2 and Swift 2.0, for iOS 9.2.
I would like to add a feature where, when the user waves their device in the air, a sound is played. My problem is that, although the sound plays, it suddenly stops and is replayed when the device is moved again while the sound is still playing. I would like the sound to be played just once, without any overlapping or stopping. I have included all the code related to CoreMotion and excluded the rest.
Thank you.
If I understand your purpose correctly, you probably want a boolean barrier to prevent the sound from being replayed while it is still playing.
For example:
var isPlaying1 = false

func play() {
    if self.isPlaying1 == true {
        return
    }
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.play()
    self.isPlaying1 = true
}

func stop() {
    if self.isPlaying1 == false {
        return
    }
    ButtonAudioPlayer4.currentTime = 0
    ButtonAudioPlayer4.stop()
    self.isPlaying1 = false
}
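One detail the answer above leaves open is when the flag is cleared if stop() is never called. A hedged addition (assuming the containing class adopts AVAudioPlayerDelegate and sets ButtonAudioPlayer4.delegate = self after creating the player): let the player reset the flag when the sound finishes on its own.

// AVAudioPlayerDelegate callback (Swift 2 signature): fires when playback ends naturally.
func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
    self.isPlaying1 = false
}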
