AVAudioPlayer and AudioServices both delay sound if >1 s between plays - ios

I want a sound to play with negligible delay when a user taps down on a view. I've tried this with both AVAudioPlayer and AudioServices, but I'm experiencing the same issue with both: a slight delay if there is more than about a second between taps. In other words, here's what happens:
Tap many times in quick succession and only the first tap is delayed,
Pause for about a second or longer,
Tap many times again (or just once) and again only the first tap is delayed.
The delay we're talking about is short, maybe 100 ms or so, but enough so that the playback doesn't feel/sound instantaneous like the others. Here's the AVAudioPlayer version of the code:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var player: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        let soundUrl = Bundle.main.url(forResource: "short_sound", withExtension: "wav")!
        player = try! AVAudioPlayer(contentsOf: soundUrl)
        player.volume = 0 // "Prime the pump"
        player.play()

        let r = TouchDownGestureRecognizer(target: self, action: #selector(tapped(_:)))
        self.view.addGestureRecognizer(r)
    }

    @objc func tapped(_ gesture: TouchDownGestureRecognizer) { // @objc so #selector can see it
        player.volume = 1
        player.play()
    }
}
And here's the AudioServices version. In this case, there are other posts suggesting that a silent sound could be played during initialization to "prime the pump" but that would only address the first sound. This issue affects all sounds that occur after a 1+ second delay.
import UIKit
import AudioToolbox

class ViewController: UIViewController {

    var soundID: SystemSoundID = 0

    override func viewDidLoad() {
        super.viewDidLoad()

        let soundUrl = Bundle.main.url(forResource: "short_sound", withExtension: "wav")!
        AudioServicesCreateSystemSoundID(soundUrl as CFURL, &soundID)

        let r = TouchDownGestureRecognizer(target: self, action: #selector(tapped(_:)))
        self.view.addGestureRecognizer(r)
    }

    @objc func tapped(_ gesture: TouchDownGestureRecognizer) {
        AudioServicesPlaySystemSound(soundID)
    }
}
In both cases, this "touch down" recognizer (by #le-sang) is used:
import Foundation
import UIKit.UIGestureRecognizerSubclass

class TouchDownGestureRecognizer: UIGestureRecognizer {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        if self.state == .possible {
            self.state = .recognized
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        self.state = .failed
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        self.state = .failed
    }
}
I've also tried using a UILongPressGestureRecognizer with minimumPressDuration = 0, and the same thing happens. The same thing also happens if I replace the UIView with a UIControl and set up a target/action. Any insights would be much appreciated, thanks for reading.

Here's the fix I found. I'm not totally happy with this solution (still not sure why this was happening), but it does indeed work for both players.
The idea is simple: if you leave either player alone for a second or so, it starts to nap and then takes a moment to wake up when you need it. So, the solution is to "nudge" the player at regular intervals so it doesn't start napping.
For the AVAudioPlayer case:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var player: AVAudioPlayer!
    var ghostPlayer: AVAudioPlayer! // plays silently, just to keep the audio system awake
    var timer: Timer!

    override func viewDidLoad() {
        super.viewDidLoad()

        let soundUrl = Bundle.main.url(forResource: "short_sound", withExtension: "wav")!
        player = try! AVAudioPlayer(contentsOf: soundUrl)

        ghostPlayer = try! AVAudioPlayer(contentsOf: soundUrl)
        ghostPlayer.volume = 0
        ghostPlayer.play()

        timer = Timer.scheduledTimer(timeInterval: 0.5,
                                     target: self,
                                     selector: #selector(nudgeAudio),
                                     userInfo: nil,
                                     repeats: true)

        let r = TouchDownGestureRecognizer(target: self, action: #selector(tapped(_:)))
        self.view.addGestureRecognizer(r)
    }

    @objc func tapped(_ gesture: TouchDownGestureRecognizer) {
        player.play()
    }

    @objc func nudgeAudio() {
        ghostPlayer.play()
    }
}
And for AudioServices:
import UIKit
import AudioToolbox

class ViewController: UIViewController {

    var soundID: SystemSoundID = 0
    var ghostID: SystemSoundID = 1
    var timer: Timer!

    override func viewDidLoad() {
        super.viewDidLoad()

        let soundUrl = Bundle.main.url(forResource: "short_sound", withExtension: "wav")!
        AudioServicesCreateSystemSoundID(soundUrl as CFURL, &soundID)

        let ghostUrl = Bundle.main.url(forResource: "silence", withExtension: "wav")!
        AudioServicesCreateSystemSoundID(ghostUrl as CFURL, &ghostID)

        timer = Timer.scheduledTimer(timeInterval: 0.5,
                                     target: self,
                                     selector: #selector(nudgeAudio),
                                     userInfo: nil,
                                     repeats: true)

        let r = TouchDownGestureRecognizer(target: self, action: #selector(tapped(_:)))
        self.view.addGestureRecognizer(r)
    }

    @objc func tapped(_ gesture: TouchDownGestureRecognizer) {
        AudioServicesPlaySystemSound(soundID)
    }

    @objc func nudgeAudio() {
        AudioServicesPlaySystemSound(ghostID)
    }
}
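One small follow-up worth mentioning (my own note, not part of the original fix): a repeating Timer keeps a strong reference to its target, so if either version of this view controller can ever be dismissed, you may want to stop the nudging when it goes away, e.g.:

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Stop nudging; an un-invalidated repeating timer keeps firing and retains this controller.
    timer?.invalidate()
    timer = nil
}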

Related

How to pause background music in game using AVAudioPlayer for multiple scenes

So I have background music looping in my GameViewController. The pause-music button is available in the GameScene, where a user can mute or unmute the game music.
I have two global variables:
var muteButton = SKSpriteNode(imageNamed: "pause")
var mute: Bool = false
Inside my GameScene I've added the following, and things work like they're supposed to (the print responses are triggered).
class GameScene: SKScene {

    override func didMove(to view: SKView) {
        ...
        muteButton.position = CGPoint(x: self.size.width*0.2, y: self.size.height*0.90)
        muteButton.name = "Mute Button"
        muteButton.zPosition = 10
        self.addChild(muteButton)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch: AnyObject in touches {
            let pointOfTouch = touch.location(in: self)
            let nodeITapped = atPoint(pointOfTouch)

            if nodeITapped.name == "Mute Button" {
                if mute == false {
                    print("music will now turn OFF")
                    mute = true
                } else {
                    print("music will now turn ON")
                    mute = false
                }
            }
        }
    }
}
I suspect the mute variable is only checked once, in the GameViewController's viewDidLoad, and thus the if statement is evaluated only once. Since I have multiple scenes connected that all need to have music playing, the best place for me to put the backgroundAudio would be here.
In my GameViewController:
class GameViewController: UIViewController {

    var backgroundAudio = AVAudioPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Background audio plays throughout the game
        let filePath = Bundle.main.path(forResource: "Track1", ofType: "mp3")
        let audioNS_URL = NSURL(fileURLWithPath: filePath!)

        if mute == false {
            do { backgroundAudio = try AVAudioPlayer(contentsOf: audioNS_URL as URL) }
            catch { return print("No Audio Found") }
            // audio will loop forever
            backgroundAudio.numberOfLoops = -1
            backgroundAudio.play()
        } else {
            backgroundAudio.pause()
        }
    }
}
Add an observer inside GameViewController:
override func viewDidLoad() {
    super.viewDidLoad()
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(switchBackgroundAudio),
                                           name: NSNotification.Name.init("switchBackgroundAudio"),
                                           object: nil)
    //... other stuff here ...//
}
then add a function for switching the sound on/off:
@objc func switchBackgroundAudio() {
    if mute == false {
        backgroundAudio.play()
    } else {
        backgroundAudio.pause()
    }
}
finally, whenever the button is touched inside your GameScene, post the notification:
if nodeITapped.name == "Mute Button" {
    //... stuff here ...//
    NotificationCenter.default.post(Notification(name: NSNotification.Name("switchBackgroundAudio")))
}
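Optionally (my addition, not from the original answer), if the GameViewController can ever be deallocated you can remove the observer as well; on older iOS versions this was required, and it remains harmless on current ones:

deinit {
    // Balance the addObserver call from viewDidLoad.
    NotificationCenter.default.removeObserver(self)
}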

Create a random audio sound generator

Thanks for replying.
I am trying to make a program where, when I press a button once, two random sounds will play. I can get a random sound to play when I press the button, but I want a different random sound to play each time I press it.
I can paste the sounds together to hear them in the sequence I want, but I would like Swift to generate the sequence.
I thought of AVQueuePlayer to make it work like a playlist. I was thinking of this like a pair of dice: for example, if I were to throw the dice, the random sounds would occur.
I am still a newbie, and I tried to figure this out on my own because it seemed so simple, but I am out of options now.
Here is what I've got so far. This plays a random sound each time I press the button.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var player: AVAudioPlayer = AVAudioPlayer()
    var sounds = ["sound1", "sound2", "sound3"]

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func motionEnded(_ motion: UIEventSubtype, with event: UIEvent?) {
        if event!.subtype == UIEventSubtype.motionShake {
            let randomNumber = Int(arc4random_uniform(UInt32(sounds.count)))
            let fileLocation = Bundle.main.path(forResource: sounds[randomNumber], ofType: "mp3")
            var error: NSError? = nil
            do {
                try player = AVAudioPlayer(contentsOf: URL(fileURLWithPath: fileLocation!))
                player.play()
            } catch {}
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Using the same code, and adding a second random number with matching file location will allow the sounds to play back to back, both being random:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var player: AVAudioPlayer = AVAudioPlayer()
    var sounds = ["sound1", "sound2", "sound3"]

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func motionEnded(_ motion: UIEventSubtype, with event: UIEvent?) {
        if event!.subtype == UIEventSubtype.motionShake {
            let randomNumber1 = Int(arc4random_uniform(UInt32(sounds.count)))
            let randomNumber2 = Int(arc4random_uniform(UInt32(sounds.count)))
            let fileLocation1 = Bundle.main.path(forResource: sounds[randomNumber1], ofType: "mp3")
            let fileLocation2 = Bundle.main.path(forResource: sounds[randomNumber2], ofType: "mp3")
            //var error: NSError? = nil
            do {
                try player = AVAudioPlayer(contentsOf: URL(fileURLWithPath: fileLocation1!))
                player.play()
                try player = AVAudioPlayer(contentsOf: URL(fileURLWithPath: fileLocation2!))
                player.play()
            } catch {}
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
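A note on sequencing (my own sketch, not part of the original answer): because a single AVAudioPlayer property can only drive one clip at a time, reassigning player starts the second clip right away. If the two random clips must play strictly one after the other, one option is to chain them with AVAudioPlayerDelegate, roughly like this:

import AVFoundation

// Illustrative helper: plays a list of clips strictly back to back.
class SequentialSoundPlayer: NSObject, AVAudioPlayerDelegate {

    private var currentPlayer: AVAudioPlayer?
    private var queuedURLs: [URL] = []

    func play(urls: [URL]) {
        queuedURLs = urls
        playNext()
    }

    private func playNext() {
        guard !queuedURLs.isEmpty else { return }
        let url = queuedURLs.removeFirst()
        currentPlayer = try? AVAudioPlayer(contentsOf: url)
        currentPlayer?.delegate = self
        currentPlayer?.play()
    }

    // Called when the current clip finishes; start the next one.
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNext()
    }
}

You would keep an instance of this helper as a property and call play(urls:) with the two randomly chosen file URLs.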

Stopping the background music when starting a game

I have background music which starts when the app is launched in GameViewController.swift using the following code:
class GameViewController: UIViewController {

    // VARIABLES
    var backgroundMusicPlayer: AVAudioPlayer!

    // AUDIO PLAYER
    func playBackgroundMusic(filename: String) {
        let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
        var error: NSError? = nil
        do {
            backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
        } catch let error1 as NSError {
            error = error1
            backgroundMusicPlayer = nil
        }
        if backgroundMusicPlayer == nil {
            print("Could not create audio player: \(error!)")
            return
        }
        backgroundMusicPlayer.numberOfLoops = -1
        backgroundMusicPlayer.prepareToPlay()
        backgroundMusicPlayer.play()
    }

    func stopBackgroundMusic() {
        backgroundMusicPlayer.stop()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        playBackgroundMusic("MainTheme.mp3")
        << Various irrelevant code >>
    }
}
Because this is run in the view controller, it persists through changing scenes on the menu (i.e. opening the "shop" scene) and creates a seamless track. When I click the "Play" button on the menu scene I want the music to stop and the game to transition in. I have the stopBackgroundMusic() method in the GameViewController but I don't know how to call it from the menu scene. IN THE MENU SCENE I tried this:
// TOUCH
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
    let touch = touches.first as UITouch?
    let touchLocation = touch!.locationInNode(self)
    let touchedNode = self.nodeAtPoint(touchLocation)

    if touchedNode.name == "startGame" {
        GameViewController.stopBackgroundMusic()
        let transitionType = SKTransition.fadeWithDuration(2)
        let viewSize = self.view?.bounds.size
        let scene = GameScene(size: viewSize!)
        self.view?.presentScene(scene, transition: transitionType)
    }
}
But I get an error saying I'm missing parameter #1 in call for stopBackgroundMusic() which shouldn't require any parameters. Am I calling this method wrong? Thanks!
You are referring to your class by using GameViewController but your function is at the object instance level.
If you declare the variable and function at the class level, your code in the touchesBegan function should work fine.
static var backgroundMusicPlayer: AVAudioPlayer!

class func playBackgroundMusic(filename: String) ...
class func stopBackgroundMusic()

override func viewDidLoad() {
    super.viewDidLoad()
    GameViewController.playBackgroundMusic("MainTheme.mp3")
    << Various irrelevant code >>
}
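Putting that together with the playBackgroundMusic body from the question, the class-level version would look roughly like this (a sketch in the same Swift 2-era syntax as the question, shown only to make the suggestion concrete):

class GameViewController: UIViewController {

    // Class-level, so other scenes can reach it via GameViewController.
    static var backgroundMusicPlayer: AVAudioPlayer!

    class func playBackgroundMusic(filename: String) {
        let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
        do {
            backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
        } catch {
            print("Could not create audio player: \(error)")
            return
        }
        backgroundMusicPlayer.numberOfLoops = -1
        backgroundMusicPlayer.prepareToPlay()
        backgroundMusicPlayer.play()
    }

    class func stopBackgroundMusic() {
        backgroundMusicPlayer?.stop()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        GameViewController.playBackgroundMusic("MainTheme.mp3")
    }
}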

Make a playlist (start next song) in swift

I have created a sound player in Swift with AVFoundation. I am trying to start the next song in the array when the currently playing song finishes. I was trying to implement this code:

if (audioPlayer.currentTime >= audioPlayer.duration) {
    var recentSong = songPlaylist[selectedSongNumber + 1]
    audioPlayer = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath:
        NSBundle.mainBundle().pathForResource(recentSong, ofType: "mp3")!), error: nil)
    audioPlayer.play()
}

but I have not been able to make it work (I do not know where to put it). Here is my complete code:
import UIKit
import AVFoundation
import AVKit

public var audioPlayer = AVPlayer()
public var selectedSongNumber = Int()
public var songPlaylist: [String] = ["song1", "song2"]
public var recentSong = "song1"

let playImage = UIImage(named: "Play.png") as UIImage!
let pauseImage = UIImage(named: "Pause.png") as UIImage!

class FirstViewController: UIViewController {

    @IBOutlet weak var musicSlider: UISlider!
    @IBOutlet weak var PlayPause: UIButton!

    var audioPlayer = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath:
        NSBundle.mainBundle().pathForResource(recentSong, ofType: "mp3")!), error: nil)

    override func viewDidLoad() {
        super.viewDidLoad()
        musicSlider.maximumValue = Float(audioPlayer.duration)
        var timer = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector("updateMusicSlider"), userInfo: nil, repeats: true)

        if (audioPlayer.currentTime >= audioPlayer.duration) {
            var recentSong = songPlaylist[selectedSongNumber + 1]
            audioPlayer = AVAudioPlayer(contentsOfURL: NSURL(fileURLWithPath:
                NSBundle.mainBundle().pathForResource(recentSong, ofType: "mp3")!), error: nil)
            audioPlayer.play()
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func PlayPauseButton(sender: AnyObject) {
        if (audioPlayer.playing == false) {
            audioPlayer.play()
            PlayPause.setImage(pauseImage, forState: .Normal)
        } else {
            audioPlayer.pause()
            PlayPause.setImage(playImage, forState: .Normal)
        }
    }

    @IBAction func StopButton(sender: AnyObject) {
        audioPlayer.stop()
        audioPlayer.currentTime = 0
        PlayPause.setImage(playImage, forState: .Normal)
    }

    @IBAction func musicSliderAction(sender: UISlider) {
        audioPlayer.stop()
        audioPlayer.currentTime = NSTimeInterval(musicSlider.value)
        audioPlayer.play()
    }

    func updateMusicSlider() {
        musicSlider.value = Float(audioPlayer.currentTime)
    }
}
I am updating my code with something different:
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioPlayerDelegate {

    var counter = 0
    var song = ["1", "2", "3"]
    var player = AVAudioPlayer()

    @IBOutlet weak var musicSlider: UISlider!

    override func viewDidLoad() {
        super.viewDidLoad()
        musicSlider.value = 0.0
    }

    func updateMusicSlider() {
        musicSlider.value = Float(player.currentTime)
    }

    @IBAction func playSong(sender: AnyObject) {
        music()
    }

    @IBAction func sliderAction(sender: AnyObject) {
        player.stop()
        player.currentTime = NSTimeInterval(musicSlider.value)
        player.play()
    }

    func music() {
        var audioPath = NSBundle.mainBundle().pathForResource("\(song[counter])", ofType: "mp3")!
        var error: NSError? = nil
        player = AVAudioPlayer(contentsOfURL: NSURL(string: audioPath), error: &error)
        musicSlider.maximumValue = Float(player.duration)
        var timer = NSTimer.scheduledTimerWithTimeInterval(0.05, target: self, selector: Selector("updateMusicSlider"), userInfo: nil, repeats: true)
        player.delegate = self
        if error == nil {
            player.prepareToPlay()
            player.play()
        }
    }

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        println("Called")
        if flag {
            counter++
        }
        // wrap around after the last song
        if counter == song.count {
            counter = 0
        }
        music()
    }
}
You can do it this way.
Hope it helps, and HERE is a sample project for more info.
You need to implement the AVAudioPlayerDelegate protocol method:
optional func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer!, successfully flag: Bool)
Documentation link
Play your next music item there.
But I would not recommend it, since AVAudioPlayer can only play one item at a time; you have to instantiate it again with another music item after each completion. I suggest you use AVQueuePlayer instead. A detailed answer has been given here. Hope it helps!
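For reference, here is a minimal AVQueuePlayer sketch (my own illustration in current Swift, reusing the question's file names). AVQueuePlayer takes a list of items and advances through them automatically, so no manual chaining is needed:

import AVFoundation

// Build player items from the bundled mp3 files (names taken from the question).
let names = ["1", "2", "3"]
let items = names.compactMap { name -> AVPlayerItem? in
    guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else { return nil }
    return AVPlayerItem(url: url)
}

// Keep a strong reference to the player (e.g. in a property), otherwise playback stops.
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()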
I created a sound player in Swift with AVFoundation. I used a progress view to time the song, and when it hits 0.98743 it updates to the next song automatically. Here is the GitHub link: https://github.com/ryan-wlr/MusicPlayerIOS
func updateProgressView() {
    // Treat the track as finished once the progress bar is essentially full.
    if progressView.progress > 0.98743 {
        audioPlayerDidFinishPlayeing()
    }
    if audioPlayer.isPlaying {
        let progress = Float(audioPlayer.currentTime / audioPlayer.duration)
        progressView.setProgress(progress, animated: true)
    }
}

Remote Control event in iOS with Swift

Trying to figure out how to read the Apple headphone's volume buttons to use as a trigger for the camera shutter (as the Apple Camera app does).
From the documentation on Remote Control Events,
Remote Control Received With Event, and this git repo, I've pieced together that I'll probably need an AVAudioPlayer object, .beginReceivingRemoteControlEvents(), and remoteControlReceivedWithEvent, along with making this view canBecomeFirstResponder() return true.
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioPlayerDelegate {

    var player: AVAudioPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        var session: AVAudioSession = AVAudioSession.sharedInstance()
        session.setActive(true, error: nil)
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        println("viewDidAppear worked...")
        self.becomeFirstResponder()
        UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    }

    override func canBecomeFirstResponder() -> Bool {
        return true
    }

    override func remoteControlReceivedWithEvent(event: UIEvent) {
        let rc = event.subtype
        println("does this work? \(rc.rawValue)")
        //takePicture()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
I expected to get "does this work" when hitting the volume buttons on the headphones, instead I just see it adjust the headphone volume like normal. So I must be missing something, maybe with a delegate or AVSession?
I cross-posted this on r/swift, where I was told it probably requires playing audio (quoted straight from the documentation).
So while this isn't the ideal solution, it works for my own private use.
import UIKit
import AVFoundation
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {

    var testPlayer: AVAudioPlayer? = nil

    func loadSound(filename: NSString) -> AVAudioPlayer {
        let url = NSBundle.mainBundle().URLForResource(filename as String, withExtension: "caf")
        var error: NSError? = nil
        let player = AVAudioPlayer(contentsOfURL: url, error: &error)
        if error != nil {
            println("Error loading \(url): \(error?.localizedDescription)")
        } else {
            player.prepareToPlay()
        }
        return player
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.testPlayer = self.loadSound("silence")
        self.testPlayer?.numberOfLoops = -1
        self.testPlayer?.play()
    }

    override func canBecomeFirstResponder() -> Bool {
        return true
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        self.becomeFirstResponder()
        UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    }

    override func remoteControlReceivedWithEvent(event: UIEvent) {
        let rc = event.subtype
        println("rc.rawValue: \(rc.rawValue)")
        // take photo
    }
}
I noticed that in Apple's camera app, the +/- volume buttons trigger the camera, and the microphone button pauses/plays any audio running in another app, but in this implementation the volume buttons still control the volume (and any audio has been paused when the app is launched).
An rc.rawValue: 103 corresponds to a single click of the microphone button, a double click returns 104, and a triple click returns 105, and then sometimes bumping a couple at a time returns a 108 or 109.
Based on Cody's answer but updated for 2019 (Swift 5)
import UIKit
import AVFoundation
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {

    var myPlayer: AVAudioPlayer? = nil

    func loadSound(filename: NSString) -> AVAudioPlayer? {
        let url = Bundle.main.url(forResource: filename as String, withExtension: "mp3")
        do {
            let player = try AVAudioPlayer(contentsOf: url ?? URL(fileURLWithPath: ""))
            player.prepareToPlay()
            return player
        } catch {
            print("Error : \(error)")
            return nil
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let testPlayer = loadSound(filename: "silence") else {
            print("Not able to load the sound")
            return
        }
        testPlayer.delegate = self
        testPlayer.volume = 0.8
        testPlayer.numberOfLoops = -1
        myPlayer = testPlayer
        myPlayer?.play()
    }

    // Needed so becomeFirstResponder() succeeds and the controller receives remote-control events.
    override var canBecomeFirstResponder: Bool {
        return true
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.becomeFirstResponder()
        UIApplication.shared.beginReceivingRemoteControlEvents()
    }

    override func remoteControlReceived(with event: UIEvent?) {
        let rc = event?.subtype
        print("rc.rawValue: \(rc?.rawValue)")
        // Do your thing
    }
}
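As a follow-up to the raw values Cody mentioned: 103, 104, and 105 correspond to UIEvent.EventSubtype.remoteControlTogglePlayPause, .remoteControlNextTrack, and .remoteControlPreviousTrack, so the handler can match the named constants instead of magic numbers. A sketch (takePhoto() is a hypothetical stand-in for whatever the event should trigger):

override func remoteControlReceived(with event: UIEvent?) {
    guard let subtype = event?.subtype else { return }
    switch subtype {
    case .remoteControlTogglePlayPause:   // rawValue 103: single click of the mic button
        // takePhoto()  <- hypothetical camera trigger
        break
    case .remoteControlNextTrack:         // rawValue 104: double click
        break
    case .remoteControlPreviousTrack:     // rawValue 105: triple click
        break
    default:
        break
    }
}

As noted earlier in the thread, the app still needs to be actively playing audio (the looping silent file) for these events to arrive at all.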
