AVAudioPlayer multiple stop problem in Swift - iOS

I have a piano keyboard. When I press a key, I don't want the previously played note to be interrupted before func pianoKeyUp is called, so I create another player in pianoKeyDown.
The problem: when keys are pressed simultaneously, the duplicate AVAudioPlayers are either never removed or removed twice, which produces a missing-element (index out of range) error on the audioPlayers array and the app crashes. What is a better way to play a piano sound multiple times?
var audioPlayers = [KeyAudio]()
There is a struct for each piano key that is initialized in viewDidLoad() inside a for key in loop:
struct KeyAudio {

    let audioPlayer : AVAudioPlayer
    var playersArray : [AVAudioPlayer]

    init(audioPlayer: AVAudioPlayer) {
        self.audioPlayer = audioPlayer
        var array = [AVAudioPlayer]()
        array.append(audioPlayer)
        self.playersArray = array
    }
}
In viewDidLoad() I prepare each player to play and append it to the audioPlayers array via the KeyAudio initializer:
for key in 1...61 {
    do {
        let pianoSoundURL = URL(fileURLWithPath: Bundle.main.path(forResource: "\(key).wav", ofType: nil)!)
        let audioPlayer = try AVAudioPlayer(contentsOf: pianoSoundURL, fileTypeHint: nil)
        audioPlayer.volume = 0.1
        audioPlayer.prepareToPlay()
        let player = KeyAudio(audioPlayer: audioPlayer) // init
        self.audioPlayers.append(player)
    } catch(let error) {
        print(error)
    }
}
I also have callback functions from the custom piano view.
The first, pianoKeyDown, is triggered when a piano key is pressed. If that key's player isPlaying, I create another AVAudioPlayer and append it to the key's playersArray:
func pianoKeyDown(_ keyNumber: UInt8) {
    let number = Int(keyNumber)
    audioPlayers[number].audioPlayer.setVolume(0, fadeDuration: 0.05)

    if audioPlayers[number].audioPlayer.isPlaying {
        let pianoSoundURL = URL(fileURLWithPath: Bundle.main.path(forResource: "\(number+1).wav", ofType: nil)!)
        guard let duplicatePlayer = try? AVAudioPlayer(contentsOf: pianoSoundURL) else { return }
        audioPlayers[number].playersArray.append(duplicatePlayer)
        duplicatePlayer.prepareToPlay()
        duplicatePlayer.currentTime = 0
        DispatchQueue.global().async {
            duplicatePlayer.play()
            duplicatePlayer.setVolume(0.8, fadeDuration: 0.05)
        }
    } else {
        guard let firstTimePlayer = audioPlayers[number].playersArray.first else { return }
        firstTimePlayer.currentTime = 0
        DispatchQueue.global().async {
            firstTimePlayer.play()
            firstTimePlayer.setVolume(0.8, fadeDuration: 0.05)
        }
    }
}
The second, pianoKeyUp, is called when the finger is released: I stop the AVAudioPlayer created by the first tap, then check whether another AVAudioPlayer was created by a following tap.
And there is the problem:
func pianoKeyUp(_ keyNumber: UInt8) {
    let number = Int(keyNumber)

    if let firstPlayer = audioPlayers[number].playersArray.first, firstPlayer.isPlaying {
        audioPlayers[number].audioPlayer.setVolume(0, fadeDuration: 0.75)
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.75, execute: {
            DispatchQueue.global().async {
                if self.audioPlayers[number].audioPlayer.isPlaying {
                    self.audioPlayers[number].audioPlayer.stop()
                }
            }
        })
    }

    let isIndexValid = audioPlayers[number].playersArray.indices.contains(1)
    if isIndexValid, audioPlayers[number].playersArray[1].isPlaying {
        audioPlayers[number].playersArray[1].setVolume(0, fadeDuration: 0.75)
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.75, execute: {
            if self.audioPlayers[number].playersArray.indices.contains(1) {
                self.audioPlayers[number].playersArray[1].stop()
                self.audioPlayers[number].playersArray.remove(at: 1)
            }
        })
    }
}

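One direction that avoids the crash described above (a sketch of my own, not part of the original post): fade out and remove the specific player that a given key-up owns, removing it by object identity instead of always removing index 1, so two overlapping pianoKeyUp calls can never both mutate the same slot.

// Hypothetical helper, assumed to live on the same view controller that owns `audioPlayers`.
func fadeOutAndRemove(_ player: AVAudioPlayer, forKey number: Int) {
    player.setVolume(0, fadeDuration: 0.75)
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.75) { [weak self] in
        guard let self = self else { return }
        player.stop()
        // Remove exactly this player if it is still in the key's array; index 0,
        // the pre-created player, is kept for reuse.
        if let index = self.audioPlayers[number].playersArray.firstIndex(where: { $0 === player }),
           index != 0 {
            self.audioPlayers[number].playersArray.remove(at: index)
        }
    }
}

pianoKeyUp would then pass in whichever player it actually started, and all array mutation stays on the main queue.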
Related

Where to Add Current Time and Duration Time For AVPlayer and AVAudioPlayer using MPRemoteCommandCenter

On a screen inside my app I have both an AVAudioPlayer for music and an AVPlayer for videos. The user can swap out different songs and different videos but can only play one at a time: they either listen to the audioPlayer or watch videos on the avPlayer.
I have an MPRemoteCommandCenter that works fine for both when using pause/play/fast-forward/rewind. The issue is that I can't display the currentTime or duration for either on the lock screen. I tried this, but it doesn't say where to put the code.
This is what I tried, so that every time the user switches songs or videos I have all of the available data for the new item:
Audio-
do {
    audioPlayer = try AVAudioPlayer(contentsOf: audioTrack)
    audioPlayer?.delegate = self
    audioPlayer?.prepareToPlay()
    audioPlayer?.play()
    setupNowPlayingForAudio()
} catch {
}

func setupNowPlayingForAudio() {
    guard let audioPlayer = audioPlayer else { return }

    var nowPlayingInfo = [String : Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = "My App Name"
    nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = Float(audioPlayer.currentTime)
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = Float(audioPlayer.duration)
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = audioPlayer.rate

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}
Video-
playerStatusObserver = player?.observe(\.currentItem?.status, options: [.new, .old]) { [weak self] (player, _) in
    switch player.status {
    case .readyToPlay:
        player.play()
        self?.setupNowPlayingForVideo()
    default:
        break
    }
}
func setupNowPlayingForVideo() {
    guard let player = player, let playerItem = player.currentItem else { return }

    var nowPlayingInfo = [String : Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = "My App Name"
    nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = playerItem.currentTime().seconds
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = playerItem.asset.duration.seconds
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = player.rate

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}
The MPRemoteCommandCenter is set up in viewDidLoad along with the AVAudioSession.
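For reference, a minimal sketch of that viewDidLoad setup (assumed here; the question doesn't show it) could look like this:

import AVFoundation
import MediaPlayer

// Hypothetical setup called from viewDidLoad: activate the session and register remote commands.
func configureRemoteCommands() {
    try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    try? AVAudioSession.sharedInstance().setActive(true)

    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.playCommand.addTarget { [weak self] _ in
        self?.audioPlayer?.play()      // or player?.play() for video
        return .success
    }
    commandCenter.pauseCommand.addTarget { [weak self] _ in
        self?.audioPlayer?.pause()     // or player?.pause()
        return .success
    }
}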
I followed this answer, which says that you have to add it to any pause/play/fast-forward/rewind buttons, the slider, and any observers that listen to the player playing/stopping. Here is the way I did it; this works fine for me on the lock screen.
Here is where I used the function below for the AVAudioPlayer:
do {
    audioPlayer = try AVAudioPlayer(contentsOf: audioTrack)
    // ...
    audioPlayer?.play()
    setupNowPlaying(musicPlayer: audioPlayer)
} catch { }

func audioPlayerPausePlayButton() {
    // ...
    setupNowPlaying(musicPlayer: audioPlayer)
}

func audioPlayerFastForwardAndRewindButton() {
    // ... ff or rewind functions
    setupNowPlaying(musicPlayer: audioPlayer)
}
Here is where I used the function below for the AVPlayer:
playerStatusObserver = player?.observe(\.currentItem?.status, options: [.new, .old]) { [weak self] (player, _) in
    // ...
    switch player.status {
    case .readyToPlay:
        // ... this should occur on the main queue
        self?.player?.play()
        self?.setupNowPlaying(videoPlayer: self?.player)
    default:
        break
    }
}

// ... I also added it to any other observers that listen to the player stopping

func videoPlayerPausePlayButton() {
    // ...
    setupNowPlaying(videoPlayer: player)
}

func videoPlayerFastForwardAndRewindButton() {
    // ...
    player?.seek(to: whateverSeekTime) { [weak self] (_) in
        self?.setupNowPlaying(videoPlayer: self?.player)
    }
}
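As one example of such an additional observer (my own illustration, not from the original answer), a key-value observation on timeControlStatus can refresh the info whenever the video starts or stops:

// Hypothetical stored NSKeyValueObservation property; refreshes the lock-screen
// info whenever playback starts or pauses.
playerRateObserver = player?.observe(\.timeControlStatus, options: [.new]) { [weak self] _, _ in
    self?.setupNowPlaying(videoPlayer: self?.player)
}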
Dictionary values for the CommandCenter to show on the lock screen:
func setupNowPlaying(musicPlayer: AVAudioPlayer? = nil, videoPlayer: AVPlayer? = nil) {
    var nowPlayingInfo = [String : Any]()

    // audio
    if let musicPlayer = musicPlayer, let musicUrl = musicPlayer.url {
        nowPlayingInfo[MPNowPlayingInfoPropertyAssetURL] = musicUrl
        nowPlayingInfo[MPMediaItemPropertyTitle] = musicUrl.lastPathComponent
        nowPlayingInfo[MPNowPlayingInfoPropertyMediaType] = MPNowPlayingInfoMediaType.audio.rawValue

        let currentTime: TimeInterval = musicPlayer.currentTime
        let musicCurrentTimeCMTime = CMTime(seconds: currentTime, preferredTimescale: 1000)
        nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = CMTimeGetSeconds(musicCurrentTimeCMTime)

        let musicDuration: TimeInterval = musicPlayer.duration
        let musicDurationCMTime = CMTime(seconds: musicDuration, preferredTimescale: 1000)
        nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = CMTimeGetSeconds(musicDurationCMTime)

        if musicPlayer.isPlaying {
            nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 1
        } else {
            nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 0
        }
    }

    // video
    if let videoPlayer = videoPlayer, let currentItem = videoPlayer.currentItem {
        if let videoAsset = currentItem.asset as? AVURLAsset {
            let videoUrl = videoAsset.url
            nowPlayingInfo[MPNowPlayingInfoPropertyAssetURL] = videoUrl
            nowPlayingInfo[MPMediaItemPropertyTitle] = videoUrl.lastPathComponent
            nowPlayingInfo[MPNowPlayingInfoPropertyMediaType] = MPNowPlayingInfoMediaType.video.rawValue
        }

        if let videoDuration: CMTime = videoPlayer.currentItem?.duration {
            nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = CMTimeGetSeconds(videoDuration)
        }

        let videoCurrentTime: CMTime = videoPlayer.currentTime()
        let videoCurrentTimeAsSecs = CMTimeGetSeconds(videoCurrentTime)
        nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = videoCurrentTimeAsSecs
        print("videoCurrentTimeAsSecs: ", videoCurrentTimeAsSecs)

        // Note: AVPlayer has no built-in isPlaying property; this assumes a
        // convenience extension such as `var isPlaying: Bool { rate != 0 }`.
        if videoPlayer.isPlaying {
            nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 1
        } else {
            nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 0
        }
    }

    nowPlayingInfo[MPMediaItemPropertyTitle] = "Your App Name"
    nowPlayingInfo[MPNowPlayingInfoPropertyIsLiveStream] = false

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}

AVAudioEngine play / loop audio, multiple buttons

I have this button to play and loop a wav. But what if I have a second button with another loop? I want the second loop to start playing when its button is pressed, but only after the first loop has finished its current 'round', and vice versa (while the second loop is playing and looping, pressing the first button lets the first loop take over).
@IBAction func playButtonTapped(_ sender: Any) {
    guard let filePath: String = Bundle.main.path(forResource: "25loop110", ofType: "wav") else { return }
    print("\(filePath)")
    let fileURL: URL = URL(fileURLWithPath: filePath)
    guard let audioFile = try? AVAudioFile(forReading: fileURL) else { return }

    let audioFormat = audioFile.processingFormat
    let audioFrameCount = UInt32(audioFile.length)
    guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }

    do {
        try audioFile.read(into: audioFileBuffer)
        timeShift.rate = adjustedBpm/bpm
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
    } catch {
        print("over")
    }

    try? audioEngine.start()
    playerNode.play()
    playerNode.scheduleBuffer(audioFileBuffer, at: nil, options: .loops, completionHandler: nil)
}
You can handle this behavior by using the completionHandler parameter of .scheduleBuffer.
For example, you could do something like this:
var nextAudioFilePath: String
var isPlaying: Bool = false

@IBAction func playLoopA() {
    guard let path = Bundle.main.path(forResource: "audioFileA", ofType: "wav") else { return }
    nextAudioFilePath = path
    guard !isPlaying else { return }
    play()
}

@IBAction func playLoopB() {
    guard let path = Bundle.main.path(forResource: "audioFileB", ofType: "wav") else { return }
    nextAudioFilePath = path
    guard !isPlaying else { return }
    play()
}

private func play() {
    let fileURL = URL(fileURLWithPath: nextAudioFilePath)
    // ...
    playerNode.scheduleBuffer(audioFileBuffer, at: nil, options: [], completionHandler: { [weak self] in
        self?.play()
    })
}
I also found this solution:
playerNode.scheduleBuffer(audioFileBuffer, at: nil, options: [.interruptsAtLoop, .loops], completionHandler: nil)
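To spell out how that second option can cover the two-button case (my sketch, assuming both loops share one playerNode and a buffer(for:) helper that loads a file into an AVAudioPCMBuffer): schedule the first buffer with .loops, and when the other button is tapped, schedule the new buffer with [.interruptsAtLoop, .loops] so it takes over exactly when the current loop pass ends.

// Sketch only: `playerNode`, `audioEngine`, and `buffer(for:)` are assumed to exist elsewhere.
@IBAction func loopATapped(_ sender: Any) {
    startLoop(named: "audioFileA")
}

@IBAction func loopBTapped(_ sender: Any) {
    startLoop(named: "audioFileB")
}

private func startLoop(named name: String) {
    guard let buffer = buffer(for: name) else { return }
    if playerNode.isPlaying {
        // Take over at the end of the currently playing loop pass, then keep looping.
        playerNode.scheduleBuffer(buffer, at: nil, options: [.interruptsAtLoop, .loops], completionHandler: nil)
    } else {
        try? audioEngine.start()
        playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        playerNode.play()
    }
}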

Set a delay after UIButton activation

I am working on a dice game that rolls 4 dice with random numbers assigned to each of them. When I press the roll button, the sound effect plays as intended and the images are replaced, but I am looking for a way to prevent the user from pressing roll until the sound effect is finished (maybe 2 seconds).
This is my function that updates the dice images. I have been testing this problem by adding DispatchTime.now() + 2 to the if statements and before the arc4random_uniform calls, but to no avail:
func updateDiceImages() {
    randomDiceIndex1 = Int(arc4random_uniform(6))
    randomDiceIndex2 = Int(arc4random_uniform(6))
    randomDiceIndex3 = Int(arc4random_uniform(6))
    randomDiceIndex4 = Int(arc4random_uniform(6))
    randomMultiplier = Int(arc4random_uniform(4))

    // determine the operator dice at random
    if randomMultiplier == 0 {
        addDice()
    }
    if randomMultiplier == 1 {
        subDice()
    }
    if randomMultiplier == 2 {
        divDice()
    }
    if randomMultiplier == 3 {
        multDice()
    }

    // image changes to random dice index
    diceImageView1.image = UIImage(named: diceArray[randomDiceIndex1])
    diceImageView2.image = UIImage(named: diceArray[randomDiceIndex2])
    diceImageView3.image = UIImage(named: diceArray[randomDiceIndex3])
    diceImageView4.image = UIImage(named: diceArray[randomDiceIndex4])
    multImageView1.image = UIImage(named: multArray[randomMultiplier])
}
If necessary, here is also my function that plays the sound effect, where I also tried implementing DispatchTime.now() + 2:
func rollSound() {
    // Set the sound file name & extension
    let alertSound = URL(fileURLWithPath: Bundle.main.path(forResource: "diceRoll", ofType: "mp3")!)

    do {
        // Preparation
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    } catch _ {
    }
    do {
        try AVAudioSession.sharedInstance().setActive(true)
    } catch _ {
    }

    // Play the sound
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: alertSound)
    } catch _ {
    }
    audioPlayer.prepareToPlay()
    audioPlayer.play()
}
Here is the implementation that I feel is the closest, but I get many errors:
func rollSound() {
    // Set the sound file name & extension
    let when = DispatchTime.now() + 2 // change 2 to desired number of seconds
    DispatchQueue.main.asyncAfter(deadline: when) {
        let alertSound = URL(fileURLWithPath: Bundle.main.path(forResource: "diceRoll", ofType: "mp3")!)

        do {
            // Preparation
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        } catch _ {
        }
        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch _ {
        }

        // Play the sound
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: alertSound)
        } catch _ {
        }
        audioPlayer.prepareToPlay()
        audioPlayer.play()
    }
}
I have updated the code to fix your error.
func rollSound() {
    // Set the sound file name & extension
    let when = DispatchTime.now() + 2 // change 2 to desired number of seconds
    DispatchQueue.main.asyncAfter(deadline: when) {
        let alertSound = URL(fileURLWithPath: Bundle.main.path(forResource: "diceRoll", ofType: "mp3")!)

        do {
            // Preparation
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        } catch _ {
        }
        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch _ {
        }

        // Play the sound
        do {
            self.audioPlayer = try AVAudioPlayer(contentsOf: alertSound)
        } catch _ {
        }
        self.audioPlayer?.prepareToPlay()
        self.audioPlayer?.play()
    }
}
You need to add self to the property if you want to use a class instance property inside a DispatchQueue closure, and the same goes for updateDiceImages().
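To address the original goal of blocking further rolls until the sound has finished (my suggestion, beyond the fix above), you can also disable the button when it is tapped and re-enable it after the delay:

// Hypothetical action: `sender` is the roll UIButton wired to this @IBAction.
@IBAction func rollButtonPressed(_ sender: UIButton) {
    sender.isEnabled = false
    rollSound()
    updateDiceImages()

    // Re-enable after roughly the length of the sound effect (2 seconds here).
    DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
        sender.isEnabled = true
    }
}

Alternatively, set audioPlayer.delegate = self and re-enable the button in audioPlayerDidFinishPlaying(_:successfully:), which unlocks it exactly when playback ends.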

How to monitor audio input on iOS using Swift - example?

I want to write a simple app that 'does something' when the sound level at the mic reaches a certain level, showing the audio input levels for extra credit.
I can't find any examples in Swift that get to this. I don't want to record, just monitor.
I have been checking out the docs on the AVFoundation classes but can't get off the ground.
Thanks.
You can use the code below:
func initializeRecorder() {
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print(error)
    }

    let stringDir: NSString = self.getDocumentsDirectory()
    let audioFilename = stringDir.stringByAppendingPathComponent("recording.m4a")
    let audioURL = NSURL(fileURLWithPath: audioFilename)
    print("File Path : \(audioFilename)")

    // make a dictionary to hold the recording settings so we can instantiate our AVAudioRecorder
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 12000.0,
        AVNumberOfChannelsKey: 1 as NSNumber,
        AVEncoderBitRateKey: 12800 as NSNumber,
        AVLinearPCMBitDepthKey: 16 as NSNumber,
        AVEncoderAudioQualityKey: AVAudioQuality.High.rawValue
    ]

    do {
        if audioRecorder == nil {
            audioRecorder = try AVAudioRecorder(URL: audioURL, settings: settings)
            audioRecorder!.delegate = self
            audioRecorder!.prepareToRecord()
            audioRecorder!.meteringEnabled = true
        }
        audioRecorder!.recordForDuration(NSTimeInterval(5.0))
    } catch {
        print("Error")
    }
}

// GET DOCUMENT DIR PATH
func getDocumentsDirectory() -> String {
    let paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let documentsDirectory = paths[0]
    return documentsDirectory
}

// START RECORDING
@IBAction func btnStartPress(sender: AnyObject) {
    recordingSession = AVAudioSession.sharedInstance()
    do {
        recordingSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            dispatch_async(dispatch_get_main_queue()) {
                if allowed {
                    print("Allowed Permission Record!!")
                    self.initializeRecorder()
                    self.audioRecorder!.record()

                    // instantiate a timer to be called with whatever frequency we want to grab metering values
                    self.levelTimer = NSTimer.scheduledTimerWithTimeInterval(0.02, target: self, selector: Selector("levelTimerCallback"), userInfo: nil, repeats: true)
                } else {
                    // failed to record!
                    self.showPermissionAlert()
                    print("Failed Permission Record!!")
                }
            }
        }
    } catch {
        // failed to record!
        print("Failed Permission Record!!")
    }
}

// This selector/function is called every time our timer (levelTimer) fires
func levelTimerCallback() {
    // we have to update meters before we can get the metering values
    if audioRecorder != nil {
        audioRecorder!.updateMeters()

        let ALPHA: Double = 0.05
        let peakPowerForChannel: Double = pow(Double(10.0), (0.05) * Double(audioRecorder!.peakPowerForChannel(0)))
        lowPassResults = ALPHA * peakPowerForChannel + Double((1.0) - ALPHA) * lowPassResults
        print("low pass res = \(lowPassResults)")

        if lowPassResults > 0.7 {
            print("Mic blow detected")
        }
    }
}

// STOP RECORDING
@IBAction func btnStopPress(sender: AnyObject) {
    if audioRecorder != nil {
        audioRecorder!.stop()
        self.levelTimer.invalidate()
    }
}
With AVAudioRecorder you can "record" audio (you don't have to save it) and set meteringEnabled, which lets you use the function peakPowerForChannel(_:). It:
Returns the peak power for a given channel, in decibels, for the sound being recorded.
This link may provide sample code.
Let me know if it helps you.
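Since the answer above is written in Swift 2, here is a compact sketch of the same metering idea in current Swift (my own adaptation; the MicMonitor name is made up, and the microphone-permission request is omitted for brevity):

import AVFoundation

// Record to a throwaway file purely so the recorder can meter the input level.
final class MicMonitor {
    private var recorder: AVAudioRecorder?
    private var timer: Timer?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)

        let url = FileManager.default.temporaryDirectory.appendingPathComponent("monitor.m4a")
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 12000,
            AVNumberOfChannelsKey: 1
        ]
        recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder?.isMeteringEnabled = true
        recorder?.record()

        timer = Timer.scheduledTimer(withTimeInterval: 0.02, repeats: true) { [weak self] _ in
            guard let recorder = self?.recorder else { return }
            recorder.updateMeters()
            // Convert the decibel reading (-160...0) into a rough 0...1 linear level.
            let level = pow(10.0, 0.05 * Double(recorder.peakPower(forChannel: 0)))
            if level > 0.7 {
                print("Loud input detected: \(level)")
            }
        }
    }

    func stop() {
        timer?.invalidate()
        recorder?.stop()
    }
}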

Background music overlaps when returning to its creator viewController

I have a game which consists of 3 view controllers:
viewController
settingViewController
GameViewController
I have set up the background music and play it in the viewController class:
var backgroundMusic : AVAudioPlayer!

func setUpSounds() {
    // button sound
    if let buttonSoundPath = NSBundle.mainBundle().pathForResource("buttonClick", ofType: "mp3") {
        let buttonSoundURL = NSURL(fileURLWithPath: buttonSoundPath)
        do {
            try buttonSound = AVAudioPlayer(contentsOfURL: buttonSoundURL)
        } catch {
            print("could not setup button sound")
        }
        buttonSound.volume = 0.5
    }

    // background sound
    if let backgroundMusicPath = NSBundle.mainBundle().pathForResource("BackgroundMusic", ofType: "mp3") {
        let backgroundMusicURL = NSURL(fileURLWithPath: backgroundMusicPath)
        do {
            try backgroundMusic = AVAudioPlayer(contentsOfURL: backgroundMusicURL)
        } catch {
            print("could not setup background music")
        }
        backgroundMusic.volume = 0.2
        /*
         set any negative integer value to loop the sound
         indefinitely until you call the stop method
         */
        backgroundMusic.numberOfLoops = -1
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    self.setUpSounds()
    self.playBackgroundSound()
    // Do any additional setup after loading the view, typically from a nib.
}
When I move to the settingViewController and then back to the viewController, the sound starts again and overlaps the music that is already playing.
What is the solution to this problem?
You have to check whether the AVAudioPlayer is already playing music before calling playBackgroundSound.
You can also make your SoundManager a singleton, so you can manipulate it from other parts of the app:
class SoundManager {
    static var backgroundMusicSharedInstance: AVAudioPlayer?
}
In the View
func setUpSounds() {
    // background sound
    if SoundManager.backgroundMusicSharedInstance == nil {
        if let backgroundMusicPath = NSBundle.mainBundle().pathForResource("BackgroundMusic", ofType: "mp3") {
            let backgroundMusicURL = NSURL(fileURLWithPath: backgroundMusicPath)
            do {
                try SoundManager.backgroundMusicSharedInstance = AVAudioPlayer(contentsOfURL: backgroundMusicURL)
            } catch {
                print("could not setup background music")
            }
            SoundManager.backgroundMusicSharedInstance!.volume = 0.2
            /*
             set any negative integer value to loop the sound
             indefinitely until you call the stop method
             */
            SoundManager.backgroundMusicSharedInstance!.numberOfLoops = -1
        }
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    self.setUpSounds()

    if SoundManager.backgroundMusicSharedInstance!.playing == false {
        self.playBackgroundSound()
    }
}
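For completeness, playBackgroundSound() isn't shown in the question; assuming the singleton above, it could be as small as:

func playBackgroundSound() {
    SoundManager.backgroundMusicSharedInstance?.play()
}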
