I'm trying to use Convolution Reverb in a macOS app.
AudioKit 4.03
The playground example works for me, but when I try to replicate it in my app, I get this error, and no audio.
2017-11-18 20:21:36.116436-0500 convolutionVerb testing[37554:4533072] [avae] AVAEInternal.h:69:_AVAE_Check: required condition is false: [AVAudioEngine.mm:348:AttachNode: (node != nil)]
2017-11-18 20:21:36.116642-0500 convolutionVerb testing[37554:4533072] Failed to set (contentViewController) user defined inspected property on (NSWindow): required condition is false: node != nil
Here's my code (which is just a slightly modified version of the AudioKit playground)
class ViewController: NSViewController {

    // properties used below
    var file: AKAudioFile!
    var player: AKAudioPlayer!
    var stairwellConvolution: AKConvolution!
    var dishConvolution: AKConvolution!
    var mixer: AKDryWetMixer!
    var dryWetMixer: AKDryWetMixer!

    override func viewDidLoad() {
        super.viewDidLoad()

        do {
            file = try AKAudioFile(readFileName: "SAMPLES/Bell.wav")
            player = try AKAudioPlayer(file: file)
        } catch {
            print("DIDN'T LOAD")
        }
        player.looping = true

        let stairwell = Bundle.main.url(forResource: "stairwell", withExtension: "wav",
                                        subdirectory: "Impulse Responses")
        let dish = Bundle.main.url(forResource: "dish", withExtension: "wav",
                                   subdirectory: "Impulse Responses")

        stairwellConvolution = AKConvolution(player,
                                             impulseResponseFileURL: stairwell!,
                                             partitionLength: 8_192)
        dishConvolution = AKConvolution(player,
                                        impulseResponseFileURL: dish!,
                                        partitionLength: 8_192)

        mixer = AKDryWetMixer(stairwellConvolution, dishConvolution, balance: 0.5)
        dryWetMixer = AKDryWetMixer(player, mixer, balance: 0.5)

        AudioKit.output = dryWetMixer
        AudioKit.start()

        stairwellConvolution.start()
        dishConvolution.start()
        player.play()
        // Do any additional setup after loading the view.
    }
}
It's a bit hard to say for sure, but I'll venture a guess that it has something to do with your view controller's life cycle. You have this AudioKit setup code in viewDidLoad, which might not be safe. I think it's at least a worthwhile step to put the audio code into an Engine or Conductor singleton class accessible across your project and not tied to UI life cycles.
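For example, here's a minimal sketch of what such a Conductor singleton could look like, reusing the file names and AudioKit 4.x calls from the question; the class itself and its structure are just an illustration, not the library's prescribed pattern:

class Conductor {
    static let shared = Conductor()

    var player: AKAudioPlayer!
    var stairwellConvolution: AKConvolution!
    var dryWetMixer: AKDryWetMixer!

    private init() {
        do {
            let file = try AKAudioFile(readFileName: "SAMPLES/Bell.wav")
            player = try AKAudioPlayer(file: file)
            player.looping = true

            let stairwell = Bundle.main.url(forResource: "stairwell",
                                            withExtension: "wav",
                                            subdirectory: "Impulse Responses")!
            stairwellConvolution = AKConvolution(player,
                                                 impulseResponseFileURL: stairwell,
                                                 partitionLength: 8_192)
            dryWetMixer = AKDryWetMixer(player, stairwellConvolution, balance: 0.5)

            AudioKit.output = dryWetMixer
            AudioKit.start()   // later AudioKit versions require try AudioKit.start()
            stairwellConvolution.start()
        } catch {
            print("Conductor setup failed: \(error)")
        }
    }

    func play() {
        player.play()
    }
}

The view controller then just calls Conductor.shared.play() from wherever it makes sense, without owning any audio objects itself.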
Related
I'm on AudioKit 4.9.1 and can't manage to play a MIDI file with the new AKSequencer (which replaces AKAppleSequencer). No sound plays. Assume the MIDI file and samples load correctly, since they previously worked with AKAppleSequencer. Background audio mode is also enabled.
Here's the relevant code (I've also tried both AKSampler and AKAppleSampler, with the same result):
class MIDIPlayer {

    var sampler: AKSampler
    var legacySampler: AKAppleSampler
    var sequencer: AKSequencer

    init(withSfz sfz: String, orSf2 sf2: String, andMidiFile midiFile: String) {
        self.sampler = AKSampler()
        self.legacySampler = AKAppleSampler()
        try? legacySampler.loadSoundFont(sf2, preset: 0, bank: 0)
        sampler.loadSFZ(url: Bundle.main.url(forResource: sfz, withExtension: "sfz")!)
        AudioKit.output = sampler
        try? AudioKit.start()
        sequencer = AKSequencer(targetNode: sampler)
        // sequencer = AKSequencer(targetNode: legacySampler)
        let midi = AKMIDIFile(url: Bundle.main.url(forResource: midiFile, withExtension: "mid")!)
        sequencer.load(midiFile: midi)
    }

    func play() {
        sequencer.playFromStart()
    }
}
Is there some difference in how to set up the signal chain that I'm missing?
With the new sequencer, it has to be part of the signal chain. So, do something like
let mixer = AKMixer()
sampler >>> mixer
for track in sequencer.tracks { track >>> mixer }
AudioKit.output = mixer
and it should work. Sorry for the delay in seeing this on GitHub issues.
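Putting that together with the question's setup, here's a rough sketch of the whole routing; the "mySounds" and "song" resource names are placeholders, and error handling is kept minimal:

// Sketch only: substitute your own SFZ and MIDI resource names.
let sampler = AKSampler()
let mixer = AKMixer()
let sequencer = AKSequencer(targetNode: sampler)

sampler.loadSFZ(url: Bundle.main.url(forResource: "mySounds", withExtension: "sfz")!)
if let midiURL = Bundle.main.url(forResource: "song", withExtension: "mid") {
    sequencer.load(midiFile: AKMIDIFile(url: midiURL))
}

// The sampler and every sequencer track feed the mixer, and the mixer is the output.
sampler >>> mixer
for track in sequencer.tracks { track >>> mixer }
AudioKit.output = mixer

try? AudioKit.start()
sequencer.playFromStart()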
I have an app that staff use to receive orders; it needs to play a continuous sound until someone attends to it. It worked for two months and then just started crashing a lot. For whatever reason, it runs fine on an iPad but not on iPhones running a recent version of iOS.
When this bit of code gets called, it crashes:
guard let path = Bundle.main.path(forResource: "alert.mp3", ofType: nil) else { return }
let url = URL(fileURLWithPath: path)
do {
    self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
} catch let err {
    print("err: \(err)")
}
DispatchQueue.main.async {
    self.alertSoundEffect.numberOfLoops = -1
    self.alertSoundEffect.prepareToPlay()
    self.alertSoundEffect.play()
}
The fix suggested online, declaring the alertSoundEffect variable like this:
private var alertSoundEffect : AVAudioPlayer!
has not worked at all.
I tried moving everything but the line:
self.alertSoundEffect.play()
to viewDidLoad as I thought maybe that code couldn't get called more than once, but it didn't help.
Specifically, Xcode highlights this line when it crashes:
self.alertSoundEffect = try AVAudioPlayer(contentsOf: url)
I tried the AVAudioPlayer initializers that take a Data object or that include the audio file's type hint, but that did not change anything.
When I try to set the AVAudioPlayer's delegate like this:
self.alertSoundEffect.delegate = self
right before the first lines of code I shared above, Xcode highlights that line instead when it reliably crashes.
What else should I try?
I suppose your path is wrong.
Try this:
guard let path = Bundle.main.path(forResource: "alert", ofType: "mp3") else { return }
Also, if your audio file is short (less than 30 seconds or so), try not to call self.alertSoundEffect.prepareToPlay(). Just call self.alertSoundEffect.play() right away.
Since iOS 13 this was causing a bug in my app, because my notification sounds are 3-10 seconds long.
If you initialise your AVAudioPlayer like var wrongMusicPlayer: AVAudioPlayer = AVAudioPlayer() or wrongMusicPlayer = AVAudioPlayer() in any method, remove it and just declare it as var wrongMusicPlayer: AVAudioPlayer!.
iOS 13.1 Crash in AVAudio Player
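Putting the path fix and the strong-reference advice together, here's a minimal sketch; the AlertPlayer class name is just for illustration, and it assumes alert.mp3 is in the main bundle:

import AVFoundation

class AlertPlayer {
    // Keep a strong reference so the player isn't deallocated while looping.
    private var alertSoundEffect: AVAudioPlayer?

    func startAlert() {
        // Split the resource name from its extension when looking it up.
        guard let url = Bundle.main.url(forResource: "alert", withExtension: "mp3") else {
            print("alert.mp3 not found in bundle")
            return
        }
        do {
            let player = try AVAudioPlayer(contentsOf: url)
            player.numberOfLoops = -1   // loop until stopped
            player.play()               // prepareToPlay() skipped for a short clip
            alertSoundEffect = player
        } catch {
            print("Could not create AVAudioPlayer: \(error)")
        }
    }

    func stopAlert() {
        alertSoundEffect?.stop()
        alertSoundEffect = nil
    }
}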
I have an app that uses samplers to play loops. I am in the process of converting my app from using AVAudioEngine to AudioKit. My app now works well except for this: Approximately every 1-3 minutes, my app receives two .AVAudioEngineConfigurationChange notifications in a row. There is no apparent pattern to its repetition and this happens on both my iPhone 6s and new iPad.
Here is my init code for my "conductor" singleton:
init() {
    //sampler array
    //sampler array is cycled through as user changes sounds
    samplerArray = [sampler0, sampler1, sampler2, sampler3]

    //start by loading samplers with default preset
    for sampler in samplerArray {
        //get the sampler preset
        let presetPath = Bundle.main.path(forResource: currentSound, ofType: "aupreset")
        let presetURL = NSURL.fileURL(withPath: presetPath!)
        do {
            try sampler.samplerUnit.loadPreset(at: presetURL)
            print("rrob: loaded sample")
        } catch {
            print("rrob: failed to load sample")
        }
    }

    //signal chain
    samplerMixer = AKMixer(samplerArray)
    filter = AKMoogLadder(samplerMixer)
    reverb = AKCostelloReverb(filter)
    reverbMixer = AKDryWetMixer(filter, reverb, balance: 0.3)
    outputMixer = AKMixer(reverbMixer)
    AudioKit.output = outputMixer

    //AKSettings.enableRouteChangeHandling = false
    AKSettings.playbackWhileMuted = true
    do {
        try AKSettings.setSession(category: AKSettings.SessionCategory.playback,
                                  with: AVAudioSessionCategoryOptions.mixWithOthers)
    } catch {
        print("rrob: failed to set audio session")
    }

    //AudioBus recommended buffer length
    AKSettings.bufferLength = .medium

    AudioKit.start()
    print("rrob: did init autoEngine")
}
Any AudioKit experts have ideas for where I can start troubleshooting? Happy to provide more info. Thanks.
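Not a full answer, but one place to start troubleshooting is simply logging what the audio session looks like whenever the notification fires, to see whether a route or sample-rate change is triggering it. A small, purely illustrative sketch:

import AVFoundation

// Keep a reference to this token for as long as you want to observe.
let token = NotificationCenter.default.addObserver(
    forName: .AVAudioEngineConfigurationChange,
    object: nil,
    queue: .main
) { note in
    let session = AVAudioSession.sharedInstance()
    print("engine configuration change: \(note)")
    print("route: \(session.currentRoute)")
    print("sample rate: \(session.sampleRate), IO buffer: \(session.ioBufferDuration)")
}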
I would like to make a 5-band audio equalizer (60 Hz, 230 Hz, 910 Hz, 4 kHz, 14 kHz) using AVAudioEngine. The user should set the gain for each band with a vertical slider, and the playing audio should adjust accordingly. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I also tried hardcoding a gain value for each frequency, but it still does not work. Here is the code I have:
var audioEngine: AVAudioEngine = AVAudioEngine()
var equalizer: AVAudioUnitEQ!
var audioPlayerNode: AVAudioPlayerNode = AVAudioPlayerNode()
var audioFile: AVAudioFile!

// in viewDidLoad():
equalizer = AVAudioUnitEQ(numberOfBands: 5)
audioEngine.attach(audioPlayerNode)
audioEngine.attach(equalizer)

let bands = equalizer.bands
let freqs = [60, 230, 910, 4000, 14000]

audioEngine.connect(audioPlayerNode, to: equalizer, format: nil)
audioEngine.connect(equalizer, to: audioEngine.outputNode, format: nil)

for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}

bands[0].gain = -10.0
bands[0].filterType = .lowShelf
bands[1].gain = -10.0
bands[1].filterType = .lowShelf
bands[2].gain = -10.0
bands[2].filterType = .lowShelf
bands[3].gain = 10.0
bands[3].filterType = .highShelf
bands[4].gain = 10.0
bands[4].filterType = .highShelf

do {
    if let filepath = Bundle.main.path(forResource: "song", ofType: "mp3") {
        let filepathURL = NSURL.fileURL(withPath: filepath)
        audioFile = try AVAudioFile(forReading: filepathURL)
        audioEngine.prepare()
        try audioEngine.start()
        audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        audioPlayerNode.play()
    }
} catch _ {}
Since the low frequencies have a gain of -10 and the high frequencies have a gain of 10, there should be a very noticeable difference when playing any media. However, when the media starts playing, it sounds the same as if played without any equalizer attached.
I'm not sure why this is happening, but I tried several things to debug it. I thought it might be the order of the calls, so I tried calling audioEngine.connect after adjusting all of the bands, but that did not make a difference either.
I tried this same code with using an AVAudioUnitTimePitch, and it worked perfectly, so I am dumbfounded as to why it does not work with AVAudioUnitEQ.
I do not want to use any third-party libraries or cocoa pods for this project, I would like to do it using AVFoundation alone.
Any help would be greatly appreciated!
Thanks in advance.
AVAudioUnitEQFilterParameters
Looking through the documentation, I noticed that I had set all of the parameters except bypass, and it seems that changing this flag fixed everything!
So, I believe the main issue here is that each AVAudioUnitEQ band is bypassed by default; you have to explicitly set bypass = false on each band rather than rely on the system-provided values.
So, I changed
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}
to
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
    bands[i].bypass = false
    bands[i].filterType = .parametric
}
and everything started working. Furthermore, to make an effective equalizer that lets the user modify individual frequencies, the filterType for each band should be set to .parametric.
I am still unsure what I should set the bandwidth to, but I can probably check online for that or just experiment until the sound matches a different equalizer application.
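For what it's worth, here is a hedged sketch of how the bands could end up configured, with a slider action updating the gain. The 1.0-octave bandwidth, the iOS UISlider, and the tag/range convention are assumptions for illustration, not values from the answer:

import AVFoundation
import UIKit

// Sketch: five parametric bands, gain driven by vertical sliders tagged 0...4.
final class EqualizerController {
    let equalizer = AVAudioUnitEQ(numberOfBands: 5)
    private let freqs: [Float] = [60, 230, 910, 4000, 14000]

    init() {
        for (i, band) in equalizer.bands.enumerated() {
            band.filterType = .parametric
            band.frequency = freqs[i]
            band.bandwidth = 1.0      // octaves; a starting guess, tune by ear
            band.gain = 0.0           // flat until the user moves a slider
            band.bypass = false       // bands are bypassed by default
        }
    }

    // Slider action: tag selects the band, value is gain in dB (e.g. -12...12).
    func sliderChanged(_ sender: UISlider) {
        equalizer.bands[sender.tag].gain = sender.value
    }
}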
In iOS 8/Xcode 6 I had a function that played a sound effect. It no longer works in iOS 9, even after changing the code multiple times. This is what I've tried:
Original:
let bangSoundEffect = SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false)
runAction(bangSoundEffect)
Other attempt:
self.runAction(SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false))
Also:
func playRocketExplosionSound(filename: String) {
    let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
    if (url == nil) {
        print("Could not find file: \(filename)")
        return
    }

    var error: NSError? = nil
    do {
        backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
    } catch let error1 as NSError {
        error = error1
        backgroundMusicPlayer = nil
    }

    if backgroundMusicPlayer == nil {
        print("Could not create audio player: \(error!)")
        return
    }

    backgroundMusicPlayer.numberOfLoops = 1
    backgroundMusicPlayer.prepareToPlay()
    backgroundMusicPlayer.play()
}
playRocketExplosionSound("Bang.mp3")
I'm pulling my hair out. I'm using the same code in a different scene for another sound effect and it works fine!! What's going wrong?
I've noticed that the sound effect sometimes begins to play in the simulator, but it doesn't complete and throws this error:
2015-09-24 19:12:14.554 APPNAME[4982:270835] 19:12:14.553 ERROR: 177: timed out after 0.012s (735 736); mMajorChangePending=0
It doesn't work at all on actual devices.
What is the problem? :'(
Possible problem with MP3 file
The problem is most likely connected with the MP3 file you're using. Since the code works for other sounds, this suggests that the MP3 file might be corrupted and AVAudioPlayer fails to decode it. You can try re-encoding the file and see if the problem persists. Or, even better, convert it to WAV.
Using WAVs
A general rule of thumb when creating short sound effects for games is to use WAV unless you really feel you need to trim the fat.
Top-notch games go for top-of-the-line production quality, so they record and produce assets uncompressed at 24-bit/48 kHz. Titles with slightly lesser ambitions might record and produce at 16-bit/44.1 kHz, which is the official standard for CD-quality audio.
This has at least two benefits. One is better sound quality; the other is that the CPU does not have to decode the file before playing it.
Corrupt data file | AVAudioPlayer out of scope
1. Corrupt data file
This will ensure you have found the file:
var backgroundMusicPlayer: AVAudioPlayer? = nil

if let url = Bundle.main.url(forResource: "Bang", withExtension: "mp3") {
    do {
        try backgroundMusicPlayer = AVAudioPlayer(contentsOf: url)
        backgroundMusicPlayer!.play()
    } catch {}
}
2. AVAudioPlayer out of scope
The variable retaining the backgroundMusicPlayer must not go out of scope before the sound has finished playing. This is generally achieved by using a class variable:
var backgroundMusicPlayer: AVAudioPlayer? = nil
Don't do this: the following sound will play for, at best, outOfScopeDelay seconds, because of the local scope of var audioPlayer.
let outOfScopeDelay = 0.5
do {
    var audioPlayer: AVAudioPlayer!  // Incorrectly scoped variable
    try audioPlayer = AVAudioPlayer(contentsOf: audioRecorder.url)
    audioPlayer.play()
    Thread.sleep(forTimeInterval: outOfScopeDelay)
} catch {}
Try this:
dispatch_async(dispatch_get_main_queue(), {
    self.playRocketExplosionSound("Bang.mp3")
})
It's no longer safe to play audio on a background thread as of iOS 9.
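For reference, in Swift 3 and later the same dispatch would be written like this (note that the filename argument label becomes required):

// Swift 3+ equivalent of the dispatch_async call above.
DispatchQueue.main.async {
    self.playRocketExplosionSound(filename: "Bang.mp3")
}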