I am using AudioKit to mix WAV files together with MIDI files.
I also need to save the result in a separate file.
To mix the WAVs and MIDIs I am using an AKMIDISampler with an AKSequencer like this:
func add(track: MixerTrack) -> Bool {
    do {
        let trackSampler = AKMIDISampler()
        try trackSampler.loadWav(track.instrument.fileName)
        trackSampler.connect(to: mixer)

        let sequencer = AKSequencer(filename: track.midi.fileName)
        sequencer.setTempo(Double(tempo))
        sequencer.setRate(rate)
        sequencer.setGlobalMIDIOutput(trackSampler.midiIn)
        sequencer.enableLooping()

        sequencers.append(sequencer)
        tracks.append(track)
        return true
    } catch {
        return false
    }
}
I am using the SongProcessor example from AudioKit's examples for ideas on how to use AKOfflineRenderNode.
The thing is, the example works with AKAudioPlayer instances rather than the sequencers I am using. I believe I cannot use players because I need to mix the WAV and MIDI files, and I was only able to achieve that using sequencers.
My first question is: Is it possible to create files from sequencers the same way it is done in SongProcessor with players?
I was able to save an m4a file, but the result is strange. First, if I don't manually set the rate to a number like 40, the notes play very slowly. And when I do set it to a value like that, I can hear the sequence playing, but at the wrong rate: at some moments the beats play correctly, but they often drift too slow or too fast at different times.
Is there something I am doing wrong? Is this a bug in AKOfflineRenderNode, or is it just not meant to be used like this?
Here is the code I use to save the mix to disk:
func saveMixToDisk() -> URL? {
    do {
        let fileManager = FileManager.default
        let name = UUID().uuidString.appending(".m4a")
        let documentDirectory = try fileManager.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
        let fileURL = documentDirectory.appendingPathComponent(name)

        offlineRender.internalRenderEnabled = false
        let duration = sequencers.first!.length.seconds

        for sequencer in sequencers {
            sequencer.stop()
            sequencer.setTime(AKDuration(seconds: 0).musicTimeStamp)
            sequencer.rewind()
        }

        for sequencer in sequencers {
            // I would like to find a way to avoid having to set this, since the value
            // is hardcoded and I don't know how to find the correct one. (When I only
            // play through the sequencer inside the app the rate is perfect, but it
            // gets messed up when rendering to URL.)
            sequencer.setRate(40)
            sequencer.play()
        }

        try offlineRender.renderToURL(fileURL, seconds: duration * 10)

        for sequencer in sequencers {
            sequencer.stop()
            sequencer.setTime(AKDuration(seconds: 0).musicTimeStamp)
            sequencer.rewind()
        }

        offlineRender.internalRenderEnabled = true
        return fileURL
    } catch let error {
        print(error)
        return nil
    }
}
Any help is very much appreciated. I can't seem to get this to work, and sadly I don't know of any other options on iOS to achieve what I need.
Instead of using AKOfflineRenderNode, try the new AudioKit.renderToFile in AudioKit 4.0.4: https://github.com/AudioKit/AudioKit/commit/09aedf7c119a399ab00026ddfb91ae6778570176
I think you need to use this method on iOS 11:
[AudioKit renderToFile:file duration:self->_audioDurationSeconds error:&error prerender:^{
    [self.voicePlayer start];
}];
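For Swift projects, the same call looks roughly like the sketch below. This is a minimal, untested outline assuming AudioKit 4.0.4+ on iOS 11; the output settings and the idea of starting the sequencers inside the prerender closure are my own additions, mirroring the Objective-C snippet above.

import AudioKit
import AVFoundation

// Renders whatever graph is assigned to AudioKit.output into an .m4a file.
// `sequencers` is assumed to be the same array used in the question.
func renderMix(sequencers: [AKSequencer], duration: Double) throws -> URL {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString + ".m4a")
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2
    ]
    let file = try AVAudioFile(forWriting: url, settings: settings)
    // Playback must be started inside the prerender block, just like
    // [self.voicePlayer start] in the Objective-C example.
    try AudioKit.renderToFile(file, duration: duration) {
        sequencers.forEach { $0.play() }
    }
    return url
}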
Related
I'm currently working on a musician app. In my app, notes should be played with a specific duration. I won't go into detail about when the notes are played. Basically, there is a UI view (a vertical line) that moves, and when it hits my other UI views (rectangles), a note should be played. Important here: the note should keep playing until the line is no longer touching the rectangle.
Playing the note is no problem, but I can't find any way to set a duration. It should also be possible to play the same note multiple times with a delay.
So I tried to make this work with AudioKit, since it seems like the best solution for audio, but it has so much stuff. I took a look at their examples and found this:
let bundlePath = Bundle.main.bundlePath
let soundPath = ("\(bundlePath)/sounds")
let akSampler = AKAppleSampler()
let mixer = AKMixer(akSampler)
try! akSampler.loadSoundFont(soundPath, preset: 0, bank: 0)
mixer.start()
AudioKit.output = mixer
do {
    _ = try AudioKit.engine.start()
} catch {
    print("AudioKit wouldn't start!")
}

do {
    try akSampler.play(noteNumber: myNote.rawValue, velocity: 100, channel: 1)
} catch let e {
    print(e)
}
Unfortunately I can't pass any duration, and when I call akSampler.stop(noteNumber: myNote.rawValue) it also stops the other notes of the same type.
I tried to find a solution with AVFoundation like so:
engine = AVAudioEngine()
sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)
guard let bankURL = Bundle.main.url(forResource: "sounds", withExtension: "SF2") else {
    print("could not load sound font")
    return
}
... init engine
sampler.startNote(60, withVelocity: 64, onChannel: 0)
But I get the same result, and again I can't pass any duration.
I also dug into the MIDI sequencers, but it seems that they generate a sequence which I can play back, and that doesn't fit my problem.
Does someone have a solution here?
The laziest solution would be to just schedule a stop with asyncAfter when you trigger the note, e.g.,
func makeNote(note: MIDINoteNumber, dur: Double) {
    // AKAppleSampler's play/stop are throwing calls, as in the question's snippet.
    try? sampler.play(noteNumber: note, velocity: 100, channel: 0)
    DispatchQueue.main.asyncAfter(deadline: .now() + dur) {
        try? self.sampler.stop(noteNumber: note)
    }
}
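For example, assuming `sampler` is the AKAppleSampler from the question with a sound font already loaded, a half-second middle C would be:

makeNote(note: 60, dur: 0.5)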
A better solution would probably use either AKSequencer or AKAppleSequencer. Both allow you to create sequences on the fly by adding individual notes with a specified duration (in musical time, i.e., number of beats). AKSequencer is considerably more accurate, but AKAppleSequencer has more readily available code examples on the web. A little confusingly, the current AKAppleSequencer used to also be called AKSequencer, but their interfaces are sufficiently different that a quick look at the docs for the two classes will tell you which you're looking at.
Your question is asking about how to schedule MIDI events which is precisely what these classes are designed to do. You haven't really given a clear reason why generating a sequence doesn't fit your problem.
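For concreteness, here is a minimal sketch of scheduling notes with explicit durations on AKAppleSequencer (assuming AudioKit 4.8+, where the MusicSequence-backed sequencer got this name; all other names are illustrative):

import AudioKit

let sampler = AKMIDISampler()
// Load a sound as in the question, e.g.:
// try sampler.loadSoundFont(soundPath, preset: 0, bank: 0)

let sequencer = AKAppleSequencer()
let track = sequencer.newTrack()
track?.setMIDIOutput(sampler.midiIn)
sequencer.setTempo(120)

// A one-beat C4 at beat 0, then the same note again two beats later
// with a shorter duration:
track?.add(noteNumber: 60, velocity: 100,
           position: AKDuration(beats: 0), duration: AKDuration(beats: 1))
track?.add(noteNumber: 60, velocity: 100,
           position: AKDuration(beats: 2), duration: AKDuration(beats: 0.5))

do {
    AudioKit.output = sampler
    try AudioKit.start()
    sequencer.play()
} catch {
    print(error)
}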
OK. I have a nasty feeling that this will be met with the gentle chirp of crickets...
I base that on this and this.
I'm actually wondering if this is a feature, not a bug, as maybe there's a security issue with loading a movie locally, then playing it. I would think that isn't the case, but maybe. It should be noted that the loaded asset comes from a REST interaction with a server, in which the movie data is actually just a part of a data query response. It is not something that is loaded directly from a video streaming page (it is SSL, though).
I'm pretty green at AV Foundation.
I have the following code:
do {
    // We create a path to a unique temporary file to grab the media.
    let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(UUID().uuidString)
    // Store the media in the temp file.
    try myData.write(to: url, options: .atomic)

    let options = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
    let asset = AVURLAsset(url: url, options: options)

    if 0 < asset.tracks.count {
        print("YOU GET \(asset.tracks.count) TRACKS!")
    } else {
        print("NO TRACKS FOR YOU!")
    }
} catch let error {
    NSLog("Error Encoding AV Media: %@", error._domain)
}
Pretty basic, eh? The "myData" variable contains an MP4 movie (.m4v) that was downloaded. I write it to a temp file, then load that temp file with AVURLAsset, just like it says to do.
The problem is that I can never get the dangblammit movie to play. The file is where it's supposed to be. I can fish out the temp file, slap on a '.m4v' extension, and play it in the QT Viewer.
I am quite prepared to accept a slap upside the head, followed by "ya darn eedjut!", but I'd like to know which "M" I should "RTFM".
The problem seems to be with this line
let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(UUID().uuidString)
You should add the extension for the file.
let videoName = UUID().uuidString + ".mp4"
let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(videoName)
Hope this fixes it.
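Folding the fix back into the snippet from the question (with `myData` as before), only the URL construction changes:

do {
    // The extension lets AVURLAsset identify the container format.
    let videoName = UUID().uuidString + ".mp4"
    let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(videoName)
    try myData.write(to: url, options: .atomic)
    let options = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
    let asset = AVURLAsset(url: url, options: options)
    print("tracks: \(asset.tracks.count)")
} catch let error {
    print(error)
}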
I am currently working on code that downloads .m4a audio files from Firebase and lets the user hear the music play within the same view controller. Through lots of research on SO and other websites, I managed to get the audio downloaded, but when the audio is played back, I only hear static. In fact, it plays for the entire duration of the audio clip (e.g. if the audio is 6s long, the user hears static for exactly 6s), so I know for a fact it is actually downloading something; just nothing is playing.
Below is my source code for the download function and the play-music function. I tried searching this topic on SO, but there are very few articles pertaining to audio download to iOS from Firebase (mostly it seems to be "how to download images", etc.).
Thank you very much in advance!
Sam
// Function to start playing music
func playMusic() {
    do {
        self.audioPlay = try AVAudioPlayer(contentsOf: self.localSongURL)
        self.audioPlay.delegate = self
        self.audioPlay.prepareToPlay()
        self.audioPlay.play()
        self.audioPlay.volume = 1.0 // I tried adjusting volumes to see if it would make a difference
    } catch {
        createAlert(title: "File Not Found", message: "Audio downloaded cannot be interpreted.")
    }
}
// Function to download music
// In the Firebase storage, songs are listed underneath an ID, so the "tillAt"s
// help to parse the appropriate strings to get the right path for collection.
func downloadMusic() {
    let tillAt1 = self.songPath.components(separatedBy: ID + "/")
    let tillAt2 = tillAt1[1].components(separatedBy: ".")
    store = Storage.storage().reference().child(patientID).child(tillAt1[1])

    // From here is where I think the issue is
    self.localSongURL = try! FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent(tillAt2[0] + ".m4a")

    store.getData(maxSize: 128 * 1024 * 1024, completion: { (data, error) in
        if let error = error {
            print("Error Here _______________ Level 1")
            print(error)
        } else {
            if let d = data {
                do {
                    try d.write(to: self.localSongURL)
                } catch {
                    print("Error Here _______________ Level 2")
                    print(error)
                }
            }
        }
    })
}
I would like to make a 5-band audio equalizer (60Hz, 230Hz, 910Hz, 4kHz, 14kHz) using AVAudioEngine. I would like to have the user input gain per band through a vertical slider and accordingly adjust the audio that is playing. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I tried to hardcode in values to specify a gain at each frequency, but it still does not work. Here is the code I have:
var audioEngine: AVAudioEngine = AVAudioEngine()
var equalizer: AVAudioUnitEQ!
var audioPlayerNode: AVAudioPlayerNode = AVAudioPlayerNode()
var audioFile: AVAudioFile!
// in viewDidLoad():
equalizer = AVAudioUnitEQ(numberOfBands: 5)
audioEngine.attach(audioPlayerNode)
audioEngine.attach(equalizer)
let bands = equalizer.bands
let freqs = [60, 230, 910, 4000, 14000]
audioEngine.connect(audioPlayerNode, to: equalizer, format: nil)
audioEngine.connect(equalizer, to: audioEngine.outputNode, format: nil)
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}
bands[0].gain = -10.0
bands[0].filterType = .lowShelf
bands[1].gain = -10.0
bands[1].filterType = .lowShelf
bands[2].gain = -10.0
bands[2].filterType = .lowShelf
bands[3].gain = 10.0
bands[3].filterType = .highShelf
bands[4].gain = 10.0
bands[4].filterType = .highShelf
do {
    if let filepath = Bundle.main.path(forResource: "song", ofType: "mp3") {
        let filepathURL = NSURL.fileURL(withPath: filepath)
        audioFile = try AVAudioFile(forReading: filepathURL)
        audioEngine.prepare()
        try audioEngine.start()
        audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        audioPlayerNode.play()
    }
} catch _ {}
Since the low frequencies have a gain of -10 and the high frequencies have a gain of 10, there should be a very noticeable difference when playing any media. However, when the media starts playing, it sounds the same as if played without any equalizer attached.
I'm not sure why this is happening, but I tried several different things to debug. I thought it might be the order of the calls, so I tried switching things around so that audioEngine.connect is called after adjusting all of the bands, but that did not make a difference either.
I tried this same code with using an AVAudioUnitTimePitch, and it worked perfectly, so I am dumbfounded as to why it does not work with AVAudioUnitEQ.
I do not want to use any third-party libraries or cocoa pods for this project, I would like to do it using AVFoundation alone.
Any help would be greatly appreciated!
Thanks in advance.
AVAudioUnitEQFilterParameters
Looking through the documentation, I noticed that I had messed with all of the parameters except bypass, and it seems that changing this flag fixed everything!
So I believe the main issue here is that each AVAudioUnitEQ band is bypassed by default, and you must explicitly set bypass = false instead of relying on the system-provided values.
So, I changed
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}
to
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
    bands[i].bypass = false
    bands[i].filterType = .parametric
}
and everything started working. Furthermore, to make an effective equalizer that allows the user to modify individual frequencies, the filterType for each band should be set to .parametric.
I am still unsure what I should set the bandwidth to, but I can probably check online for that or just tweak it until the sound matches a different equalizer application.
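For reference, here is a sketch of the full loop with a placeholder bandwidth. The 0.5-octave value is purely illustrative, a starting point to tune by ear rather than anything from the docs:

for i in 0..<bands.count {
    bands[i].frequency = Float(freqs[i])
    bands[i].filterType = .parametric
    bands[i].bandwidth = 0.5   // in octaves; illustrative starting value
    bands[i].gain = 0.0        // flat until the user moves a slider
    bands[i].bypass = false    // bands are bypassed by default
}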
In iOS 8/Xcode 6 I had a function that included a sound effect. It no longer works in iOS 9 after changing the code multiple times. This is what I've tried:
Original:
let bangSoundEffect = SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false)
runAction(bangSoundEffect)
Other attempt:
self.runAction(SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false))
Also:
func playRocketExplosionSound(filename: String) {
    let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
    if (url == nil) {
        print("Could not find file: \(filename)")
        return
    }

    var error: NSError? = nil
    do {
        backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
    } catch let error1 as NSError {
        error = error1
        backgroundMusicPlayer = nil
    }

    if backgroundMusicPlayer == nil {
        print("Could not create audio player: \(error!)")
        return
    }

    backgroundMusicPlayer.numberOfLoops = 1
    backgroundMusicPlayer.prepareToPlay()
    backgroundMusicPlayer.play()
}
playRocketExplosionSound("Bang.mp3")
I'm pulling my hair out. I'm using the same code in a different scene for another sound effect and it works fine!! What's going wrong?
I've noticed that the sound effect sometimes begins to play in the simulator, but it doesn't complete and throws this error:
2015-09-24 19:12:14.554 APPNAME[4982:270835] 19:12:14.553 ERROR: 177: timed out after 0.012s (735 736); mMajorChangePending=0
It doesn't work at all on actual devices.
What is the problem? :'(
Possible problem with MP3 file
The problem is most likely connected with the MP3 file you're using. Since the code works for other sounds, this suggests that the MP3 file might be corrupted and AVAudioPlayer fails to decode it. You can try re-encoding the file and see if the problem persists. Or, even better, convert it to WAV.
Using WAVs
A general rule of thumb when creating short sound effects for games is to use WAV unless you really feel you need to trim the fat.
Top-notch games go for top-of-the-line production quality, so they record and produce assets uncompressed at 24-bit/48 kHz. Titles with slightly lesser ambitions might record and produce at 16-bit/44.1 kHz, the official standard for CD-quality audio.
This has at least two benefits. One is that the sound is of better quality. The other is that the CPU does not have to decode the file to play it.
Corrupt data file | AVAudioPlayer out of scope
1. Corrupt data file
This will ensure you have found the file:
var backgroundMusicPlayer: AVAudioPlayer? = nil

if let url = Bundle.main.url(forResource: "Bang", withExtension: "mp3") {
    do {
        try backgroundMusicPlayer = AVAudioPlayer(contentsOf: url)
        backgroundMusicPlayer!.play()
    } catch {}
}
2. AVAudioPlayer out of scope
The variable retaining the backgroundMusicPlayer must not go out of scope before playback has completed. This is generally achieved by using an instance variable:
var backgroundMusicPlayer: AVAudioPlayer? = nil
Don't do this: the following sound will play for, at best, outOfScopeDelay due to the local scope of var audioPlayer.
let outOfScopeDelay = 0.5
do {
    var audioPlayer: AVAudioPlayer! // Incorrectly scoped variable
    try audioPlayer = AVAudioPlayer(contentsOf: audioRecorder.url)
    audioPlayer.play()
    Thread.sleep(forTimeInterval: outOfScopeDelay)
} catch {}
► Find this solution on GitHub and additional details on Swift Recipes.
Try this:
dispatch_async(dispatch_get_main_queue(), {
    self.playRocketExplosionSound("Bang.mp3")
})
It's no longer safe to play audio on a child thread in iOS 9.
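In Swift 3 and later, the equivalent dispatch would look like this (note that the first argument label becomes required there):

DispatchQueue.main.async {
    self.playRocketExplosionSound(filename: "Bang.mp3")
}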