I am trying to create an AVAudioPlayer that plays NSData downloaded from Parse.
I am pretty certain the sound (.wav format) has been uploaded to Parse. I am also certain that the sound can be downloaded from Parse in the NSData format. So I am creating an AVAudioPlayer object using the downloaded data from Parse:
if audioData != nil {
print("successful downloading audio!") //this prints out
let audioPlayer = try! AVAudioPlayer(data: audioData!, fileTypeHint: AVFileTypeWAVE)
audioPlayer.prepareToPlay()
audioPlayer.volume = 0.5
audioPlayer.play()
}
As you can see above, the audioPlayer is created, but it does not play the sound. What might be wrong?
Is this code within a function? If so, you're declaring the audioPlayer variable locally. This means the audioPlayer is created, starts playing, and is then deallocated at the end of the function call, resulting in no audio. Your audioPlayer object needs to be a class property or live in global scope (a singleton, for example) so that its lifetime persists after the function ends.
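A minimal sketch of that pattern (the class and method names here are illustrative, not from the question):
import AVFoundation

class SoundController {
    // Retained as a property so the player outlives the function call
    var audioPlayer: AVAudioPlayer?

    func playDownloadedAudio(_ audioData: Data) {
        do {
            // fileTypeHint helps AVAudioPlayer decode headerless or ambiguous data
            audioPlayer = try AVAudioPlayer(data: audioData, fileTypeHint: AVFileType.wav.rawValue)
            audioPlayer?.volume = 0.5
            audioPlayer?.play()
        } catch {
            print("could not create player: \(error)")
        }
    }
}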
I am working on a video calling application which involves streaming. I get a continuous NSData stream in a delegate callback. Could anyone tell me how to render this continuous NSData stream using Objective-C?
Should I use AVPlayer or MetalKit for rendering the data?
- (void)videoReceivedFrame:(NSData *)data;
This delegate method keeps getting called continuously while streaming (this is the place I get video data in the form of NSData). Could anyone tell me how to render it?
Here is one approach: write the data to a local file, then wrap that file in an AVPlayerItem:
var videoData: Data   // some video data
// A local path, preferably in the temporary directory (file name is illustrative):
let fileURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("video.mp4")
try! videoData.write(to: fileURL)
let item = AVPlayerItem(url: fileURL)
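To actually play it, hand the item to an AVPlayer (a minimal sketch; the player must be retained, for example as a property, for the duration of playback):
let player = AVPlayer(playerItem: item)
player.play()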
I have an app that people use to receive orders; it needs to play a continuous sound until staff attend to it. It worked for two months, then started crashing a lot. For whatever reason, it runs fine on an iPad but not on iPhones running a recent operating system.
When this bit of code gets called it crashes:
guard let path = Bundle.main.path(forResource: "alert.mp3", ofType: nil) else { return }
let url = URL(fileURLWithPath: path)
do {
self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
} catch let err {
print("err: \(err)")
}
DispatchQueue.main.async {
self.alertSoundEffect.numberOfLoops = -1
self.alertSoundEffect.prepareToPlay()
self.alertSoundEffect.play()
}
The fix suggested online, declaring the alertSoundEffect variable like this:
private var alertSoundEffect : AVAudioPlayer!
has not worked at all.
I tried moving everything but the line:
self.alertSoundEffect.play()
to viewDidLoad as I thought maybe that code couldn't get called more than once, but it didn't help.
Specifically, Xcode highlights this line when it crashes:
self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
I tried the try AVAudioPlayer initializer that takes a Data object as a parameter, and the one that includes the type of audio file to be played, but neither changed anything.
When I try using the AVAudioPlayer's delegate and set it like this:
self.alertSoundEffect.delegate = self
right before the first lines of code I shared above, Xcode instead highlights this line when it reliably crashes.
What else should I try?
I suppose your path is wrong.
Try this:
guard let path = Bundle.main.path(forResource: "alert", ofType: "mp3") else { return }
Also, if your audio file is short, like less than 30s, then try not to call self.alertSoundEffect.prepareToPlay(). Just call self.alertSoundEffect.play() right away.
Since iOS 13, that call was causing a bug in my app, which uses notification sounds 3-10 seconds long.
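Putting both suggestions together, the loading code from the question would look roughly like this:
guard let path = Bundle.main.path(forResource: "alert", ofType: "mp3") else { return }
let url = URL(fileURLWithPath: path)
do {
    self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
    self.alertSoundEffect.numberOfLoops = -1
    // For a short clip, skip prepareToPlay() and call play() directly
    self.alertSoundEffect.play()
} catch {
    print("err: \(error)")
}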
If you initialise your AVAudioPlayer with var wrongMusicPlayer: AVAudioPlayer = AVAudioPlayer(), or with wrongMusicPlayer = AVAudioPlayer() in any method, remove that and just declare it as var wrongMusicPlayer: AVAudioPlayer!.
iOS 13.1 crash in AVAudioPlayer
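A minimal sketch of the declare-only pattern described above (the class and method names are illustrative):
import AVFoundation

class AlertPlayer {
    var musicPlayer: AVAudioPlayer!   // declared only; no AVAudioPlayer() placeholder instance

    func play(url: URL) {
        musicPlayer = try? AVAudioPlayer(contentsOf: url)   // the real player is assigned here
        musicPlayer?.play()
    }
}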
I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time.
I initially set up my sounds using multiple AVAudioPlayer objects, assigning a sound to each player. While it did play multiple sounds simultaneously, it didn't seem like it was capable of starting two sounds at the same time (it seemed like it would delay the second sound just slightly after the first sound was started). Furthermore, if I pressed notes at a very fast rate, it seemed like the engine couldn't keep up, and later sounds would start well after I had pressed the later notes.
I am trying to solve this problem, and from the research I have done, it seems like using the AVAudioEngine to play sounds would be the best method, where I can set up the sounds in an array of buffers, and then have them play back from those buffers.
class ViewController: UIViewController
{
// Main audio engine and its corresponding mixer
var audioEngine: AVAudioEngine = AVAudioEngine()
var mainMixer = AVAudioMixerNode()
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
// Array of filepaths
let noteFilePath: [String] = [
Bundle.main.path(forResource: "note1", ofType: "wav")!,
Bundle.main.path(forResource: "note2", ofType: "wav")!,
Bundle.main.path(forResource: "note3", ofType: "wav")!]
// Array to store the note URLs
var noteFileURL = [URL]()
// One audio file per note
var noteAudioFile = [AVAudioFile]()
// One audio buffer per note
var noteAudioFileBuffer = [AVAudioPCMBuffer]()
override func viewDidLoad()
{
super.viewDidLoad()
do
{
// For each note, read the note URL into an AVAudioFile,
// setup the AVAudioPCMBuffer using data read from the file,
// and read the AVAudioFile into the corresponding buffer
for i in 0...2
{
noteFileURL.append(URL(fileURLWithPath: noteFilePath[i]))
// Read the corresponding url into the audio file
try noteAudioFile.append(AVAudioFile(forReading: noteFileURL[i]))
// Read data from the audio file, and store it in the correct buffer
let noteAudioFormat = noteAudioFile[i].processingFormat
let noteAudioFrameCount = UInt32(noteAudioFile[i].length)
noteAudioFileBuffer.append(AVAudioPCMBuffer(pcmFormat: noteAudioFormat, frameCapacity: noteAudioFrameCount)!)
// Read the audio file into the buffer
try noteAudioFile[i].read(into: noteAudioFileBuffer[i])
}
mainMixer = audioEngine.mainMixerNode
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...2
{
audioEngine.attach(audioFilePlayer[i])
audioEngine.connect(audioFilePlayer[i], to: mainMixer, fromBus: 0, toBus: i, format: noteAudioFileBuffer[i].format)
}
// Start the audio engine
try audioEngine.start()
// Setup the audio session to play sound in the app, and activate the audio session
try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.soloAmbient)
try AVAudioSession.sharedInstance().setMode(AVAudioSession.Mode.default)
try AVAudioSession.sharedInstance().setActive(true)
}
catch let error
{
print(error.localizedDescription)
}
}
func playSound(senderTag: Int)
{
let sound: Int = senderTag - 1
// Set up the corresponding audio player to play its sound.
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, options: .interrupts, completionHandler: nil)
audioFilePlayer[sound].play()
}
Each sound should play without interrupting the other sounds, only interrupting its own sound when it is played again. However, despite setting up multiple buffers and players, and assigning each one to its own bus on the audioEngine's mixer, playing one sound still stops any other sound from playing.
Furthermore, while leaving out .interrupts does prevent sounds from stopping other sounds, these sounds won't play until the sound that is currently playing completes. This means that if I play note1, then note2, then note3, note1 will play, while note2 will only play after note1 finishes, and note3 will only play after note2 finishes.
Edit: I was able to get the audioFilePlayer to reset to the beginning without using .interrupts, with the following code in the playSound function:
if audioFilePlayer[sound].isPlaying == true
{
audioFilePlayer[sound].stop()
}
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, completionHandler: nil)
audioFilePlayer[sound].play()
This still leaves me with figuring out how to play these sounds simultaneously, since playing another sound will still stop the currently playing sound.
Edit 2: I found the solution to my problem. My answer is below.
It turns out that the .interrupts option wasn't the issue (in fact, it turned out to be the best way to restart the sound that was playing, in my experience, as there was no noticeable pause during the restart, unlike with the stop() function). The actual problem preventing multiple sounds from playing simultaneously was this particular line of code:
// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)
What happened here is that each element of the array was assigned the exact same AVAudioPlayerNode instance, so they were all effectively sharing one node. As a result, AVAudioPlayerNode calls affected every element of the array instead of just the specified one. To fix this and give each element its own AVAudioPlayerNode, I changed the line above so that it starts as an empty array of type AVAudioPlayerNode instead.
// One AVAudioPlayerNode per note
var audioFilePlayer = [AVAudioPlayerNode]()
I then added a line at the beginning of the second for-loop in viewDidLoad() that appends a new AVAudioPlayerNode to this array.
// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...6
{
audioFilePlayer.append(AVAudioPlayerNode())
// audioEngine code
}
This gave each item in the array a different AVAudioPlayerNode value. Playing a sound or restarting a sound no longer interrupts the other sounds that are currently being played. I can now play any of the notes simultaneously and without any noticeable latency between note press and playback.
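The underlying pitfall can be shown in isolation: Array(repeating:count:) evaluates its argument once, so with a reference type every element aliases the same instance. A small self-contained sketch:
final class Node { var value = 0 }

let shared = Array(repeating: Node(), count: 3)   // one Node, three references to it
shared[0].value = 42
print(shared[1].value)                            // prints 42: the "other" elements changed too

let distinct = (0..<3).map { _ in Node() }        // three separate Node instances
distinct[0].value = 42
print(distinct[1].value)                          // prints 0: elements are independent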
In iOS 8/Xcode 6 I had a function that included a sound effect. It no longer works in iOS 9, even after changing the code multiple times. This is what I've tried:
Original:
let bangSoundEffect = SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false)
runAction(bangSoundEffect)
Other attempt:
self.runAction(SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false))
Also:
func playRocketExplosionSound(filename: String) {
    let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
    if (url == nil) {
        print("Could not find file: \(filename)")
        return
    }
    var error: NSError? = nil
    do {
        backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
    } catch let error1 as NSError {
        error = error1
        backgroundMusicPlayer = nil
    }
    if backgroundMusicPlayer == nil {
        print("Could not create audio player: \(error!)")
        return
    }
    backgroundMusicPlayer.numberOfLoops = 1
    backgroundMusicPlayer.prepareToPlay()
    backgroundMusicPlayer.play()
}
playRocketExplosionSound("Bang.mp3")
I'm pulling my hair out. I'm using the same code in a different scene for another sound effect and it works fine!! What's going wrong?
I've noticed that the sound effect begins to play sometimes in the simulator, however it doesn't complete and throws this error:
2015-09-24 19:12:14.554 APPNAME[4982:270835] 19:12:14.553 ERROR: 177: timed out after 0.012s (735 736); mMajorChangePending=0
It doesn't work at all on actual devices.
What is the problem? :'(
Possible problem with MP3 file
The problem is most likely connected with the MP3 file you're using. Since the code works for other sounds, this suggests the MP3 file might be corrupted, causing AVAudioPlayer to fail while decoding it. Try re-encoding the file and see if the problem persists. Or, even better, convert it to WAV.
Using WAVs
A general rule of thumb when creating short sound effects for games is to use WAV, unless you really feel you need to trim the fat.
Top-notch games go for top-of-the-line production quality, so they record and produce assets uncompressed at 24-bit/48kHz. Titles with slightly lesser ambitions might record and produce at 16-bit/44.1kHz, the official standard for CD-quality audio.
This has at least two benefits. First, the sound quality is better. Second, the CPU does not have to decode a compressed file to play it.
Corrupt data file | AVAudioPlayer out of scope
1. Corrupt data file
This will ensure you have found the file:
var backgroundMusicPlayer: AVAudioPlayer? = nil
if let url = Bundle.main.url(
    forResource: "Bang", withExtension: "mp3") {
    do {
        try backgroundMusicPlayer = AVAudioPlayer(contentsOf: url)
        backgroundMusicPlayer!.play()
    } catch {}
}
2. AVAudioPlayer out of scope
The variable retaining backgroundMusicPlayer must not go out of scope before playback has completed. This is generally achieved by using a class variable:
var backgroundMusicPlayer: AVAudioPlayer? = nil
Don't do this: the following sound will play for, at best, outOfScopeDelay seconds, due to the local scope of var audioPlayer.
let outOfScopeDelay = 0.5
do {
    var audioPlayer: AVAudioPlayer!   // incorrectly scoped variable
    try audioPlayer = AVAudioPlayer(contentsOf: audioRecorder.url)
    audioPlayer.play()
    Thread.sleep(forTimeInterval: outOfScopeDelay)
} catch {}
Try this:
dispatch_async(dispatch_get_main_queue(), {
    self.playRocketExplosionSound("Bang.mp3")
})
As of iOS 9, it is no longer safe to play audio on a background thread.
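In current Swift, the equivalent main-queue dispatch reads:
DispatchQueue.main.async {
    self.playRocketExplosionSound(filename: "Bang.mp3")   // modern Swift requires the argument label
}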
I am trying to play an audio file I saved on Parse. I get the URL from the PFFile of the object I saved to Parse. When I run the app, the AVPlayer produces no audio. I tested whether the AVPlayer was playing using the first code snippet below, and it prints out "Playing", which means the player is playing, but there is no audio. I also tried setting the volume on the AVPlayer, and that didn't help. I don't understand why it won't play, if anyone would like to help me out.
Audio File URL: http://files.parsetfss.com/292b6f11-5fee-4be7-b317-16fd494dfa3d/tfss-ccc3a843-967b-4773-b92e-1cf2e8f3c1c6-testfile.wav
This Code stops avplayer if it is playing:
if (player.rate > 0) && (player.error == nil) {
// player is playing
println("Playing")
} else {
println("Not Playing")
}
AVPlayer Code:
let objectAudio: PFObject = object as PFObject
let parseAudio: PFFile = objectAudio.valueForKey("audioFileParse") as PFFile
let audioPath: String = parseAudio.url
let urlParse: NSURL = NSURL(fileURLWithPath: audioPath)!
player = AVPlayer(URL: urlParse)
println(player) //prints out <AVPlayer: 0x79e863c0>
player.volume = 1.0
player.play()
You are using the wrong method to create an NSURL here: you are trying to build a local file URL from a URL that points to a resource on a remote server.
Instead of NSURL(fileURLWithPath: audioPath), use the initializer that accepts a URL string as input (see https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSURL_Class/#//apple_ref/occ/instm/NSURL/initWithString:).
Your current code points to a local resource that does not exist on the local filesystem, whereas it should point to the file on the Parse server.
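Applied to the code in the question (keeping its Swift 1 style), the fix is a one-line change:
let urlParse: NSURL = NSURL(string: audioPath)!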
As a reference on the difference between URLWithString and fileURLWithPath, see: What is difference between URLWithString and fileURLWithPath of NSURL?