I'm developing an app that plays narration by playing sentence-by-sentence sound files one after another.
With the code below, it played as expected. However, after adding a "Stop" button to stop whatever is playing, I found that the "Stop" button didn't stop the sound.
I tested the "Stop" button before pressing the "Play" button, and it worked fine (the message was printed). However, after pressing "Play", while NarrationPlayer is playing, the "Stop" button didn't work (no message was printed).
Any idea what's wrong?
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var NarrationPlayer: AVAudioPlayer = AVAudioPlayer()
    var soundlist: [String] = []
    var counter = 0

    func playSound(_ soundfile: String) {
        let NarPath = Bundle.main.path(forResource: soundfile, ofType: "mp3")!
        let NarUrl = URL(fileURLWithPath: NarPath)
        do {
            NarrationPlayer = try AVAudioPlayer(contentsOf: NarUrl)
            NarrationPlayer.delegate = self
        } catch {
            print(error)
        }
        NarrationPlayer.play()
    }

    @IBAction func play(_ sender: Any) {
        soundlist.append("a")
        soundlist.append("b")
        soundlist.append("c")
        playSound("first")
        while counter < soundlist.count {
            if NarrationPlayer.isPlaying == true {
            } else {
                playSound(soundlist[counter])
                counter += 1
            }
        }
    }

    @IBAction func StopPlay(_ sender: Any) {
        print("stop button worked")
    }
}
The problem you're running into is that this line here:
while counter < soundlist.count {
is hogging the main thread, which keeps any tap on your "Stop" button from ever being handled.
You've set a delegate, though, and one very handy thing you can do with it is increment your counter and play the next sound file each time the current one finishes.
Something like this:
func playSound(_ soundfile: String) {
    let NarPath = Bundle.main.path(forResource: soundfile, ofType: "mp3")!
    let NarUrl = URL(fileURLWithPath: NarPath)
    do {
        NarrationPlayer = try AVAudioPlayer(contentsOf: NarUrl)
        NarrationPlayer.delegate = self
    } catch {
        print(error)
    }
    NarrationPlayer.play()
}

func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    // Stop once every queued file has played (avoids an index-out-of-range crash).
    guard counter < soundlist.count else { return }
    playSound(soundlist[counter])
    counter += 1
}

@IBAction func play(_ sender: Any) {
    playSound("first")
    soundlist.append("a")
    soundlist.append("b")
    soundlist.append("c")
}
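With the loop gone, the "Stop" action can finally run while audio is playing. A minimal sketch of what it could do, using the same properties as above: stop the player and clear the queue so the delegate callback has nothing left to start.

@IBAction func StopPlay(_ sender: Any) {
    print("stop button worked")
    NarrationPlayer.stop()    // halts whatever is currently playing
    soundlist.removeAll()     // nothing left for audioPlayerDidFinishPlaying to queue up
    counter = 0
}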
One last piece of advice:
Change the name NarrationPlayer to narrationPlayer. Variable names in Swift, as in Objective-C, should start with a lowercase letter (lowerCamelCase).
Related
I have an array of audio instructions that I would like to randomize, play (one at a time), and do some data recording in between each instruction.
With the code below, a button press triggers beginTest(), but the instruction tracks play simultaneously, and it never gets to startRecord().
I've tried placing startRecord() inside audioPlayerDidFinishPlaying; in that case it does get to startRecord() for the first track, but doesn't continue to the next iteration of the loop inside beginTest() where playScript() is called.
func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    self.player = nil
    print("finished playing this file")
}

func playScript(c: String) {
    let path = Bundle.main.path(forResource: c, ofType: "m4a")!
    let url = URL(fileURLWithPath: path)
    do {
        self.player = try AVAudioPlayer(contentsOf: url)
        self.player?.delegate = self
        self.player?.play()
        print(c)
        print("playing script for " + c)
    } catch {
        print("couldn't load script")
    }
}

func beginTest() {
    for c in conditions.shuffled() {
        playScript(c: c)
        if self.player == nil {
            startRecord()
            print("started data recording")
        }
    }
    exportData()
}
I'm guessing that beginTest() keeps moving along the for loop before audioPlayerDidFinishPlaying resets the player to nil, but I don't know how to stall this.
The approach using audioPlayerDidFinishPlaying(_:successfully:) is correct. In your code, the for loop in beginTest() calls playScript(c:) for every audio file in the conditions array and reaches the startRecord() check almost immediately, while the first audio is still playing; each call also replaces self.player with a new player. I don't know the contents of the startRecord method, but I don't think it was supposed to be called like that.
To avoid those problems, you need to call playScript(c:), then wait for startRecord() to finish, and only then call playScript(c:) again. So you can try something like this:
import AVFoundation

class SomeClass: NSObject, AVAudioPlayerDelegate {
    var player: AVAudioPlayer?
    var conditions = [String]()
    var instructions: [() -> ()] = []

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        print("finished playing \(player.url)")
        startRecord {
            playNextInstruction()
        }
    }

    func playScript(c: String) {
        let path = Bundle.main.path(forResource: c, ofType: "m4a")!
        let url = URL(fileURLWithPath: path)
        do {
            self.player = try AVAudioPlayer(contentsOf: url)
            self.player?.delegate = self
            self.player?.play()
            print(c)
            print("playing script for " + c)
        } catch {
            print("couldn't load script")
        }
    }

    func beginTest() {
        instructions = conditions.shuffled().map { [unowned self] condition in
            return {
                self.playScript(c: condition)
            }
        }
        playNextInstruction()
    }

    func playNextInstruction() {
        if instructions.count != 0 {
            instructions.remove(at: 0)()
        } else {
            exportData()
        }
    }

    func exportData() {
    }

    func startRecord(completion: (() -> ())) {
    }
}
If that doesn't work, then I suggest looking for the problem somewhere else in your code.
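One detail worth making explicit: the queue above only advances when startRecord calls its completion handler. The body below is just a hypothetical placeholder illustrating that contract; if the recording work is asynchronous, the parameter would also need to be marked @escaping.

func startRecord(completion: (() -> ())) {
    // ... do whatever data recording the test needs ...
    // When that step is done, hand control back so the next
    // instruction in the queue can play.
    completion()
}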
The environment for this is iOS 13.6 and Swift 5. I have a very simple app that successfully plays an MP3 file in the foreground or background. I added MPRemoteCommandCenter play and pause command handlers to it. I play the sound file in the foreground and then pause it.
When I tap the play button from the lock screen, my code calls audioPlayer.play(), which returns true. I hear the sound start playing again, but the currentTime of the player does not advance. After that, the play and pause buttons on the lock screen do nothing. When I foreground the app again, the play button plays from where it was before I went to the lock screen.
Here is my AudioPlayer class:
import AVFoundation
import MediaPlayer
class AudioPlayer: RemoteAudioCommandDelegate {
var audioPlayer = AVAudioPlayer()
let remoteCommandHandler = RemoteAudioCommandHandler()
var timer:Timer!
func play(title: String) {
let path = Bundle.main.path(forResource: title, ofType: "mp3")!
let url = URL(fileURLWithPath: path)
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playback)
try AVAudioSession.sharedInstance().setActive(true)
audioPlayer = try AVAudioPlayer(contentsOf: url)
remoteCommandHandler.delegate = self
remoteCommandHandler.enableDisableRemoteCommands(true)
timer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(updateNowPlayingInfo), userInfo: nil, repeats: true)
} catch let error as NSError {
print("error = \(error)")
}
}
func play() {
print ("audioPlayer.play() returned \(audioPlayer.play())")
}
func pause() {
audioPlayer.pause()
}
func stop() {
audioPlayer.stop()
}
func currentTime() -> TimeInterval {
return audioPlayer.currentTime
}
func setCurrentTime(_ time:TimeInterval) {
audioPlayer.currentTime = time
}
@objc func updateNowPlayingInfo() {
// Hard-code the nowPlayingInfo since this is a simple test app
var nowPlayingDict =
[MPMediaItemPropertyTitle: "Tin Man",
MPMediaItemPropertyAlbumTitle: "The Complete Greatest Hits",
MPMediaItemPropertyAlbumTrackNumber: NSNumber(value: UInt(10) as UInt),
MPMediaItemPropertyArtist: "America",
MPMediaItemPropertyPlaybackDuration: 208,
MPNowPlayingInfoPropertyPlaybackRate: NSNumber(value: 1.0 as Float)] as [String : Any]
nowPlayingDict[MPNowPlayingInfoPropertyElapsedPlaybackTime] = NSNumber(value: audioPlayer.currentTime as Double)
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingDict
}
}
Here is my RemoteCommandHandler class:
import Foundation
import MediaPlayer
protocol RemoteAudioCommandDelegate: class {
func play()
func pause()
}
class RemoteAudioCommandHandler: NSObject {
weak var delegate: RemoteAudioCommandDelegate?
var remoteCommandCenter = MPRemoteCommandCenter.shared()
var playTarget: Any? = nil
var pauseTarget: Any? = nil
func enableDisableRemoteCommands(_ enabled: Bool) {
print("Called with enabled = \(enabled)")
remoteCommandCenter.playCommand.isEnabled = enabled
remoteCommandCenter.pauseCommand.isEnabled = enabled
if enabled {
addRemoteCommandHandlers()
} else {
removeRemoteCommandHandlers()
}
}
fileprivate func addRemoteCommandHandlers() {
print( "Entered")
if playTarget == nil {
print( "Adding playTarget")
playTarget = remoteCommandCenter.playCommand.addTarget { (event) -> MPRemoteCommandHandlerStatus in
print("addRemoteCommandHandlers calling delegate play")
self.delegate?.play()
return .success
}
}
if pauseTarget == nil {
print( "Adding pauseTarget")
pauseTarget = remoteCommandCenter.pauseCommand.addTarget { (event) -> MPRemoteCommandHandlerStatus in
print("addRemoteCommandHandlers calling delegate pause")
self.delegate?.pause()
return .success
}
}
}
fileprivate func removeRemoteCommandHandlers() {
print( "Entered")
if playTarget != nil {
print( "Removing playTarget")
remoteCommandCenter.playCommand.removeTarget(playTarget)
playTarget = nil
}
if pauseTarget != nil {
print( "Removing pauseTarget")
remoteCommandCenter.pauseCommand.removeTarget(pauseTarget)
pauseTarget = nil
}
}
}
I will gladly supply further info as required, because I'm baffled as to why this (in my mind) relatively straightforward code doesn't work.
Assistance is much appreciated!
After some more debugging, I found that the AVAudioPlayer started to play the sound from the lock screen, but stopped again shortly after.
I mitigated the problem by adding a Timer. The timer checks whether the last command from the user was play while the sound is not actually playing, and restarts playback if so. I also update that status when the user selects pause or when the song stops playing at its natural end.
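Roughly, the watchdog looks something like this; the lastCommand bookkeeping, the property names, and the one-second interval here are illustrative rather than the exact code:

enum LastCommand { case play, pause }

var lastCommand: LastCommand = .pause   // set to .play in play(), .pause in pause()/stop() and at the natural end
var watchdogTimer: Timer?

func startWatchdog() {
    // Every second, check whether the user last asked for play but the player
    // has silently stopped (the lock-screen symptom), and restart it if so.
    watchdogTimer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
        guard let self = self else { return }
        if self.lastCommand == .play && !self.audioPlayer.isPlaying {
            self.audioPlayer.play()
        }
    }
}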
I am still at a loss for an actual fix for this problem.
I have a set of buttons in a stack view; each button, when pressed, plays a different sound. I have a separate loop button that, when pressed, calls the loopButtonPressed function. My goal is that when this loop button is pressed, it loops through the subviews of the stack view that are buttons and plays each of their sounds sequentially, in order, using the soundButtonPressed function. I saw an approach, implemented below, using a run() function that schedules each consecutive call to happen after a given delay. Although this kind of works, it is not a great solution because the sound files are of varying length. I was thinking there may be a way to do this using dispatch groups, which I don't fully understand. If I take away the run function, it only plays the sound of the last button in the stack view. I am using AVFoundation to play the wav files. I appreciate any advice or direction, thanks.
func run(after milliseconds: Int, completion: @escaping () -> Void) {
    let deadline = DispatchTime.now() + .milliseconds(milliseconds)
    DispatchQueue.main.asyncAfter(deadline: deadline) {
        completion()
    }
}

@objc func loopButtonPressed(_ sender: UIButton) {
    var i = 1
    for case let button as UIButton in self.colorBubblesStackView.subviews {
        run(after: 800 * i) {
            self.soundButtonPressed(sender: button)
        }
        i += 1
    }
}
My soundButtonPressed function is just a switch statement where each case calls the function playSound() with the correct sound file name. Here is the playSound function:
func playSound(_ soundFileName: String) {
    guard let url = Bundle.main.url(forResource: soundFileName, withExtension: "wav") else { return }
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileType.wav.rawValue)
        guard let player = player else { return }
        player.play()
    } catch let error {
        print(error.localizedDescription)
    }
}
func playSound(name: String) {
    guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else {
        print("url not found")
        return
    }
    do {
        // This prepares the app to take over the device's audio output.
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        // Change fileTypeHint according to the type of your audio file (you can omit it).
        player = try AVAudioPlayer(contentsOf: url, fileTypeHint: AVFileType.mp3.rawValue)
        player?.delegate = self
        // No need for prepareToPlay(); it happens automatically when calling play().
        player!.play()
    } catch let error as NSError {
        print("error: \(error.localizedDescription)")
    }
}

func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    print("finished") // It is working now! Printed "finished"!
}
Conform to the protocol:
class ViewController: UIViewController, AVAudioPlayerDelegate {
Then, instead of looping, give each button a tag, starting the first at, say, 100. When the callback reports that the player finished playing, play the file for the next, incremented tag (say, 101).
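A rough sketch of that tag-based idea, assuming the buttons in colorBubblesStackView are tagged 100, 101, and so on, and that playSound sets player?.delegate = self as shown above:

var currentTag = 100   // tag of the button whose sound is currently playing

@objc func loopButtonPressed(_ sender: UIButton) {
    currentTag = 100
    if let firstButton = colorBubblesStackView.viewWithTag(currentTag) as? UIButton {
        soundButtonPressed(sender: firstButton)
    }
}

func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    // Move on to the next tagged button, if any, and play its sound.
    currentTag += 1
    if let nextButton = colorBubblesStackView.viewWithTag(currentTag) as? UIButton {
        soundButtonPressed(sender: nextButton)
    }
}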
Try AVQueuePlayer
A player used to play a number of items in sequence.
@objc func loopButtonPressed(_ sender: UIButton) {
    let allUrls = allSongs.compactMap { Bundle.main.url(forResource: $0, withExtension: "wav") }
    let items = allUrls.map { AVPlayerItem(url: $0) }
    let queuePlayer = AVQueuePlayer(items: items)
    queuePlayer.play()
}
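One caveat with the snippet above: if queuePlayer stays a local constant, nothing retains it after loopButtonPressed returns, so playback can stop almost immediately. A small variation that keeps a reference (the property name is my own):

var queuePlayer: AVQueuePlayer?   // strong reference so ARC doesn't release the player mid-playback

@objc func loopButtonPressed(_ sender: UIButton) {
    let allUrls = allSongs.compactMap { Bundle.main.url(forResource: $0, withExtension: "wav") }
    let items = allUrls.map { AVPlayerItem(url: $0) }
    queuePlayer = AVQueuePlayer(items: items)
    queuePlayer?.play()
}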
I have made a custom LaunchScreen in Main.storyboard to make it seem as if a sound is actually coming from the launch screen. The sound works fine, and so does the segue to the next view controller. The only problem is that the segue happens before the sound has stopped playing. I would like the sound to finish before making the segue. Logically, it should work, since performSegue comes directly after .play(). But it seems the two happen simultaneously. Here's my code:
super.viewDidLoad()
//PLAY SOUND CLIP//
let musicFile = Bundle.main.path(forResource: "fanfare", ofType: ".mp3")
do {
try musicSound = AVAudioPlayer(contentsOf: URL (fileURLWithPath: musicFile!))
} catch { print("ERROR PLAYING MUSIC")}
musicSound.play() //WORKS
//
DispatchQueue.main.async() {
self.performSegue(withIdentifier: "myLaunchSegue", sender: self) //WORKS
}
I have tried to add:
perform(Selector(("showNavController")), with: self, afterDelay: 3)
where "showNavController" simply is the segue:
func showNavController() {
performSegue(withIdentifier: "myLaunchSegue", sender: nil)
}
but the program crashes with the error "uncaught exception.... ....unrecognized selector sent to instance"
I have also tried to add a boolean to keep the program from progressing until the sound has played, but didn't get it to work. Any ideas?
//////////////////////////
Update:
I'm trying Russels' answer, but have a few questions. Setting AVAudioPlayer as the delegate, does that mean setting it next to the class like this:
class MyLaunchViewController: UIViewController, AVAudioPlayer { ...
Also, how do I call the function audioPlayerDidFinishPlaying? Like so:
audioPlayerDidFinishPlaying(musicSound, successfully: true)
I'll post the whole code block. Makes it easier to understand...
import UIKit
import AVFoundation //FOR SOUND
class MyLaunchViewController: UIViewController, AVAudioPlayer {
var musicSound: AVAudioPlayer = AVAudioPlayer() //FOR SOUND
override func viewDidLoad() {
super.viewDidLoad()
//PLAY SOUND CLIP//
let musicFile = Bundle.main.path(forResource: "fanfare", ofType: ".mp3")
do {
try musicSound = AVAudioPlayer(contentsOf: URL (fileURLWithPath: musicFile!))
} catch { print("ERROR PLAYING MUSIC")}
musicSound.play() //WORKS
//
audioPlayerDidFinishPlaying(musicSound, successfully: true)
}
optional func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
DispatchQueue.main.async() {
self.performSegue(withIdentifier: "myLaunchSegue", sender: self)
}
}
I get an error when writing AVAudioPlayer in the class declaration (perhaps I misunderstood what I was supposed to do?). It says I have multiple inheritance. Also, it doesn't want me to mark the new function as optional, as that's only for protocol members. Finally, if I correct the errors and run the program, the next segue still runs before the sound has finished playing... :( sad panda.
You need to make your LaunchScreen an AVAudioPlayerDelegate, and then use the audioPlayerDidFinishPlaying callback. Here's all you need in the first controller
import UIKit
import AVFoundation

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var musicSound: AVAudioPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        //PLAY SOUND CLIP//
        let musicFile = Bundle.main.path(forResource: "sound", ofType: ".wav")
        do {
            try musicSound = AVAudioPlayer(contentsOf: URL(fileURLWithPath: musicFile!))
        } catch { print("ERROR PLAYING MUSIC") }
        musicSound?.delegate = self
        musicSound!.play() //WORKS
        print("Next line after play")
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        DispatchQueue.main.async() {
            print("audioPlayerDidFinishPlaying")
            self.performSegue(withIdentifier: "myLaunchSegue", sender: self)
        }
    }
}
You can get more details here https://developer.apple.com/documentation/avfoundation/avaudioplayerdelegate/1389160-audioplayerdidfinishplaying
I am creating a simple music app, and I was wondering how I can make a UISlider follow the progress of an audio file. Here's my project so far:
Code:
import UIKit
import AVFoundation

class SongDetailViewController: UITableViewController {
    var audioPlayer = AVAudioPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: URL.init(fileURLWithPath: Bundle.main.path(forResource: "Song Name", ofType: "mp3")!))
            audioPlayer.prepareToPlay()
            var audioSession = AVAudioSession.sharedInstance()
            do {
                try audioSession.setCategory(AVAudioSessionCategoryPlayback)
            }
        } catch {
            print(error)
        }
    }

    // Buttons
    // Dismiss
    @IBAction func dismiss(_ sender: Any) {
        dismiss(animated: true, completion: nil)
    }

    // Play
    @IBAction func play(_ sender: Any) {
        audioPlayer.stop()
        audioPlayer.play()
    }

    // Pause
    @IBAction func pause(_ sender: Any) {
        audioPlayer.pause()
    }

    // Restart
    @IBAction func restart(_ sender: Any) {
        audioPlayer.currentTime = 0
    }
}
I want to create the UISlider similar to the one in the Apple Music app, where it follows the audio file's progress, and whenever the user slides the ball thing (lol) it goes to that point in the song. If you could give me some code to complete this, that would be amazing!
Please keep in mind that I am fairly new to coding and am still learning swift, so keep it simple :-) Thanks again!
If you switch to using an AVPlayer, you can add a periodicTimeObserver to your AVPlayer. In the example below you'll get a callback every 1/30 second…
let player = AVPlayer(url: Bundle.main.url(forResource: "Song Name", withExtension: "mp3")!)
player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 30), queue: .main) { time in
    let fraction = CMTimeGetSeconds(time) / CMTimeGetSeconds(player.currentItem!.duration)
    self.slider.value = Float(fraction)
}
Where you create an audioPlayer in your code, replace it with the code above.
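For the scrubbing half of the question, a slider action along these lines would seek the AVPlayer. It assumes the player above is kept in a property and the slider uses its default 0...1 range; the action name is made up:

@IBAction func sliderMoved(_ sender: UISlider) {
    guard let item = player.currentItem else { return }
    let duration = CMTimeGetSeconds(item.duration)
    guard duration.isFinite else { return }   // duration is indefinite until the item is ready
    // Map the slider's 0...1 value back to a time in the track and jump there.
    let target = CMTime(seconds: Double(sender.value) * duration, preferredTimescale: 600)
    player.seek(to: target)
}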
Using AVAudioPlayer you could create a periodic timer that fires several times a second (up to 60 times/second - any more would be a waste) and updates your slider based on your audio player's currentTime property. To sync the update with screen refresh you could use a CADisplayLink timer.
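A sketch of that CADisplayLink approach, assuming the view controller keeps its audioPlayer and a slider outlet (the property and method names here are made up):

var displayLink: CADisplayLink?

func startSliderUpdates() {
    // Fire once per screen refresh and mirror the player's position onto the slider.
    displayLink = CADisplayLink(target: self, selector: #selector(updateSlider))
    displayLink?.add(to: .main, forMode: .common)
}

@objc func updateSlider() {
    guard audioPlayer.duration > 0 else { return }
    slider.value = Float(audioPlayer.currentTime / audioPlayer.duration)
}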
Edit:
This part of my answer doesn't work:
It should also be possible to set up a Key Value Observer on your AVAudioPlayer's currentTime property so that each time the value changes your observer fires. (I haven't tried this, but it should work.)