I'm working with AKSequence with just one track with AKMIDISampler:
var instrument1 = AKMIDISampler()
mixer.connect(input: instrument1, bus: 0)
sequencer.tracks[0].setMIDIOutput(instrument1.midiIn)
Need to export to wav, like this:
let audioFile = try! AKAudioFile(forWriting: fileURL, settings: settings)
try! AudioKit.renderToFile(audioFile, seconds: seconds, prerender: {
    self.sequencer.play()
})
return fileURL
The resulting audio file contains only the first note. What's wrong here?
I am new to Swift development and am trying to make some music apps. I have been using the AudioKit framework to solve some audio problems, and it has been very helpful. I use AppleSequencer to set up the audio, and playback works correctly.
But now I have a problem: when I export the file using engine.renderToFile, I get a file that contains only noise.
The main code is below; I have also uploaded the relevant code to GitHub: https://github.com/devlong/sequenceExport.git
import AudioKit
import AVFoundation
class SequenceExport: NSObject {
static let shared = SequenceExport.init()
private override init(){}
var mixer = Mixer();
var sequencer : AppleSequencer?
var engine = AudioEngine()
var currentMidiSample:MIDISampler?
var track : MusicTrackManager?
func setCurrentSequencer() {
let instrumentName = "edm-kit"
currentMidiSample = MIDISampler.init(name: instrumentName)
engine.output = currentMidiSample;
do {
try engine.start()
} catch {
Log("AudioKit did not start \(error)")
}
let directory = "Sounds/\(instrumentName)"
let resetPath = Bundle.main.path(forResource:instrumentName, ofType: "aupreset", inDirectory: directory)
do{
try currentMidiSample!.loadPath(resetPath!)
}catch let error {
print("************load resetPath error!!!!!!:\(error)")
}
sequencer = AppleSequencer(filename: "tracks")
track = sequencer!.newTrack()
for index in 0...4 {
track!.add(
noteNumber:MIDINoteNumber(5+index),
velocity: 100, // adjusts the volume of an individual note
position: Duration.init(beats: Double(index)),
duration: Duration.init(seconds: 0.5))
}
sequencer!.setGlobalMIDIOutput(currentMidiSample!.midiIn)
sequencer!.setTempo(120)
sequencer!.setLength(Duration.init(beats: 4))
sequencer!.enableLooping()
sequencer!.play()
}
func exportM4a() {
guard let outputURL = try? FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("audio_file_new.m4a") else { return }
print("outputURL !!!!!!:\(outputURL)")
guard let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 2, interleaved: true) else { // a nonzero channel count is required; 0 makes this init return nil
fatalError("Failed to create format")
}
let file = try! AVAudioFile(forWriting: outputURL, settings: format.settings)
do{
try engine.renderToFile(file, maximumFrameCount: 1_096, duration: 5) {
self.sequencer!.play()
} progress: { progress in
// print("progress !!!!!!:\(progress)")
}
}catch let error {
print("export !!!!!!:\(error)")
}
}
}
I have tried many approaches. AudioKit has a way to export a MIDI file, but that is not what I need: I need to export m4a, mp3, or other audio formats from an AppleSequencer. Can anyone help me?
If anyone has a solution, I would appreciate changes to the GitHub code (https://github.com/devlong/sequenceExport.git), which will also help others who run into the same problem.
I think that AppleSequencer just won't run outside of a real-time context, so nothing happens when you try to use renderToFile. You could use a NodeRecorder to record the track as it's being played, but that would be in real time, not super fast.
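Roughly, the real-time approach could look like the sketch below. This is only a sketch, assuming AudioKit 5's NodeRecorder and the engine / currentMidiSample / sequencer properties from your question; the 5-second wait before stopping is an assumption about the sequence length.
func recordSequenceInRealTime() {
    do {
        // Tap the sampler's output; NodeRecorder writes what it hears to a CAF file.
        let recorder = try NodeRecorder(node: currentMidiSample!)
        try engine.start()
        try recorder.record()
        sequencer!.play()

        // Stop once the sequence has played through (5 seconds is an assumption).
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            self.sequencer!.stop()
            recorder.stop()
            if let recordedFile = recorder.audioFile {
                print("recorded to \(recordedFile.url)") // written in real time as it played
            }
        }
    } catch {
        print("recording failed: \(error)")
    }
}
From there you could convert the CAF to m4a (for example with AVAssetExportSession) if you need a compressed file.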
I am building an app that needs to perform analysis on the audio it receives from the microphone in real time. In my app, I also need to play a beep sound and start recording audio at the same time; in other words, I can't play the beep sound and then start recording. This introduces the problem of hearing the beep sound in my recording (this might be because I am playing the beep sound through the speaker, but unfortunately I cannot compromise in this regard either).
Since the beep sound is just a tone of about 2350 Hz, I was wondering how I could exclude that range of frequencies (say from 2300 Hz to 2400 Hz) in my recordings and prevent it from influencing my audio samples. After doing some googling I came up with what I think might be the solution: a band-stop filter. According to Wikipedia, "a band-stop filter or band-rejection filter is a filter that passes most frequencies unaltered, but attenuates those in a specific range to very low levels". This seems like what I need to exclude frequencies from 2300 Hz to 2400 Hz in my recordings (or at least for the first second of the recording, while the beep sound is playing).
My question is: how would I implement this with AVAudioEngine? Is there a way I can turn off the filter after the first second of the recording, when the beep sound is done playing, without stopping the recording?
Since I am new to working with audio in AVAudioEngine (I've always just stuck to the higher levels of AVFoundation), I followed this tutorial to help me create a class to handle all the messy stuff. This is what my code looks like:
class Recorder {
enum RecordingState {
case recording, paused, stopped
}
private var engine: AVAudioEngine!
private var mixerNode: AVAudioMixerNode!
private var state: RecordingState = .stopped
private var audioPlayer = AVAudioPlayerNode()
init() {
setupSession()
setupEngine()
}
fileprivate func setupSession() {
let session = AVAudioSession.sharedInstance()
//The original tutorial sets the category to .record
//try? session.setCategory(.record)
try? session.setCategory(.playAndRecord, options: [.mixWithOthers, .defaultToSpeaker])
try? session.setActive(true, options: .notifyOthersOnDeactivation)
}
fileprivate func setupEngine() {
engine = AVAudioEngine()
mixerNode = AVAudioMixerNode()
// Set volume to 0 to avoid audio feedback while recording.
mixerNode.volume = 0
engine.attach(mixerNode)
//Attach the audio player node
engine.attach(audioPlayer)
makeConnections()
// Prepare the engine in advance, in order for the system to allocate the necessary resources.
engine.prepare()
}
fileprivate func makeConnections() {
let inputNode = engine.inputNode
let inputFormat = inputNode.outputFormat(forBus: 0)
engine.connect(inputNode, to: mixerNode, format: inputFormat)
let mainMixerNode = engine.mainMixerNode
let mixerFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: inputFormat.sampleRate, channels: 1, interleaved: false)
engine.connect(mixerNode, to: mainMixerNode, format: mixerFormat)
//AudioPlayer Connection
let path = Bundle.main.path(forResource: "beep.mp3", ofType:nil)!
let url = URL(fileURLWithPath: path)
let file = try! AVAudioFile(forReading: url)
engine.connect(audioPlayer, to: mainMixerNode, format: nil)
audioPlayer.scheduleFile(file, at: nil)
}
//MARK: Start Recording Function
func startRecording() throws {
print("Start Recording!")
let tapNode: AVAudioNode = mixerNode
let format = tapNode.outputFormat(forBus: 0)
let documentURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
// AVAudioFile uses the Core Audio Format (CAF) to write to disk.
// So we're using the caf file extension.
let file = try AVAudioFile(forWriting: documentURL.appendingPathComponent("recording.caf"), settings: format.settings)
tapNode.installTap(onBus: 0, bufferSize: 4096, format: format, block: {
(buffer, time) in
try? file.write(from: buffer)
print(buffer.description)
print(buffer.stride)
let floatArray = Array(UnsafeBufferPointer(start: buffer.floatChannelData![0], count:Int(buffer.frameLength)))
})
try engine.start()
audioPlayer.play()
state = .recording
}
//MARK: Other recording functions
func resumeRecording() throws {
try engine.start()
state = .recording
}
func pauseRecording() {
engine.pause()
state = .paused
}
func stopRecording() {
// Remove existing taps on nodes
mixerNode.removeTap(onBus: 0)
engine.stop()
state = .stopped
}
}
AVAudioUnitEQ supports a band-stop filter.
Perhaps something like:
// Create an instance of AVAudioUnitEQ and connect it to the engine's main mixer
let eq = AVAudioUnitEQ(numberOfBands: 1)
engine.attach(eq)
engine.connect(eq, to: engine.mainMixerNode, format: nil)
engine.connect(player, to: eq, format: nil)
eq.bands[0].frequency = 2350
eq.bands[0].filterType = .bandStop
eq.bands[0].bypass = false
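To address the second part of the question (turning the filter off after the first second), one option, sketched as an extension of the snippet above, is to flip the band's bypass flag once the beep is done; the engine and the recording tap keep running. The bandwidth value is my assumption for covering roughly 2300 Hz to 2400 Hz:
eq.bands[0].bandwidth = 0.06   // ~0.06 octaves centered at 2350 Hz spans roughly 2300-2400 Hz
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    eq.bands[0].bypass = true  // stop attenuating; the recording continues untouched
}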
A slightly more complete answer, hooked up to an IBAction. In this example I use .parametric for the filter type, with more bands than required, to give a broader idea of how to use it:
@IBAction func PlayWithEQ(_ sender: Any) {
self.engine.stop()
self.engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let url = Bundle.main.url(forResource:"yoursong", withExtension: "m4a")!
let f = try! AVAudioFile(forReading: url)
self.engine.attach(player)
// adding eq effect node
let effect = AVAudioUnitEQ(numberOfBands: 4)
let bands = effect.bands
let freq = [125, 250, 2350, 8000]
for i in 0...(bands.count - 1) {
bands[i].frequency = Float(freq[i])
}
bands[0].gain = 0.0
bands[0].filterType = .parametric
bands[0].bandwidth = 1
bands[1].gain = 0.0
bands[1].filterType = .parametric
bands[1].bandwidth = 0.5
// filter of interest, rejecting 2350 Hz (adjust bandwidth as needed)
bands[2].gain = -60.0
bands[2].filterType = .parametric
bands[2].bandwidth = 1
bands[3].gain = 0.0
bands[3].filterType = .parametric
bands[3].bandwidth = 1
self.engine.attach(effect)
self.engine.connect(player, to: effect, format: f.processingFormat)
let mixer = self.engine.mainMixerNode
self.engine.connect(effect, to: mixer, format: f.processingFormat)
player.scheduleFile(f, at: nil) {
delay(0.05) { // delay(_:) is a small helper (not shown) that runs this closure after the given interval
if self.engine.isRunning {
self.engine.stop()
}
}
}
self.engine.prepare()
try! self.engine.start()
player.play()
}
I want to record an audio file and save it with some effects applied.
Recording works, and playing the audio with the effect applied works too.
The problem is that when I try to save the audio offline, it produces an empty audio file.
Here is my code:
let effect = AVAudioUnitTimePitch()
effect.pitch = -300
self.addSomeEffect(effect)
func addSomeEffect(_ effect: AVAudioUnit) {
try? AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: .defaultToSpeaker)
let format = self.audioFile.processingFormat
self.audioEngine.stop()
self.audioEngine.reset()
self.audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
self.audioEngine.attach(audioPlayerNode)
self.audioEngine.attach(effect)
self.audioEngine.connect(audioPlayerNode, to: self.audioEngine.mainMixerNode, format: format)
self.audioEngine.connect(effect, to: self.audioEngine.mainMixerNode, format: format)
audioPlayerNode.scheduleFile(self.audioFile, at: nil)
do {
let maxNumberOfFrames: AVAudioFrameCount = 8096
try self.audioEngine.enableManualRenderingMode(.offline,
format: format,
maximumFrameCount: maxNumberOfFrames)
} catch {
fatalError()
}
do {
try audioEngine.start()
audioPlayerNode.play()
} catch {
}
let outputFile: AVAudioFile
do {
let url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("temp.m4a")
if FileManager.default.fileExists(atPath: url.path) {
try? FileManager.default.removeItem(at: url)
}
let recordSettings = self.audioFile.fileFormat.settings
outputFile = try AVAudioFile(forWriting: url, settings: recordSettings)
} catch {
fatalError()
}
let buffer = AVAudioPCMBuffer(pcmFormat: self.audioEngine.manualRenderingFormat,
frameCapacity: self.audioEngine.manualRenderingMaximumFrameCount)!
while self.audioEngine.manualRenderingSampleTime < self.audioFile.length {
do {
let framesToRender = min(buffer.frameCapacity,
AVAudioFrameCount(self.audioFile.length - self.audioEngine.manualRenderingSampleTime))
let status = try self.audioEngine.renderOffline(framesToRender, to: buffer)
switch status {
case .success:
print("Write to file")
try outputFile.write(from: buffer)
case .error:
fatalError()
default:
break
}
} catch {
fatalError()
}
}
print("Finish write")
audioPlayerNode.stop()
audioEngine.stop()
self.outputFile = outputFile
self.audioPlayer = try? AVAudioPlayer(contentsOf: outputFile.url)
}
AVAudioPlayer fails to open the file at the output URL. I looked at the file in the file system and it is empty and can't be played.
Picking different categories for AVAudioSession doesn't help either.
Thanks for your help!
UPDATE
I switched the output file to the .caf extension and it worked. Any idea why .m4a is not working?
You need to set outputFile to nil to flush the header and close the m4a file; AVAudioFile only finalizes compressed containers like m4a when it is deallocated.
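A minimal sketch of that change, assuming the rest of the function stays as in the question: keep outputFile as an optional var so the last reference can be dropped before opening the result. Until the AVAudioFile is released, the .m4a on disk looks empty.
var outputFile: AVAudioFile? = try? AVAudioFile(forWriting: url, settings: recordSettings)

// ... manual rendering loop as before, writing with outputFile?.write(from: buffer) ...

audioPlayerNode.stop()
audioEngine.stop()

outputFile = nil                                        // releasing the file finalizes the .m4a on disk
self.audioPlayer = try? AVAudioPlayer(contentsOf: url)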
I am new to Swift and making an audio app using AVAudioPlayer. I am using a remote URL mp3 file for the audio, and this works when it's static.
For my use case, I want to pull a URL for an mp3 file from a JSON array and then pass it into the AVAudioPlayer to run.
If I move the AVAudioPlayer block into viewDidLoad and make the mp3 file a static URL, it runs fine.
Then, when I move this code into the block that extracts an mp3 URL from the JSON, I can print the URL successfully, but when I pass it into my audio player, problems arise. Here's the code:
override func viewDidLoad() {
super.viewDidLoad()
let url = URL(string: "http://www.example.com/example.json")
URLSession.shared.dataTask(with:url!, completionHandler: {(data, response, error) in
guard let data = data, error == nil else { return }
let json: Any?
do{
json = try JSONSerialization.jsonObject(with: data, options: [])
}
catch{
return
}
guard let data_list = json as? [[String:Any]] else {
return
}
if let foo = data_list.first(where: {$0["episode"] as? String == "Example Preview"}) {
self.audiotest = (foo["audio"] as? String)!
print(self.audiotest) // this prints
// where i'm passing it into the audio player
if let audioUrl = URL(string: self.audiotest) {
// then lets create your document folder url
let documentsDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
// lets create your destination file url
let destinationUrl = documentsDirectoryURL.appendingPathComponent(audioUrl.lastPathComponent)
//let url = Bundle.main.url(forResource: destinationUrl, withExtension: "mp3")!
do {
audioPlayer = try AVAudioPlayer(contentsOf: destinationUrl)
} catch let error {
print(error.localizedDescription)
}
} // end player
// ....
Specifically, I get the error Thread 1: Fatal error: Unexpectedly found nil while unwrapping an Optional value when I tap a play button whose IBAction is connected to the audio player. That action function looks like this:
#IBAction func playPod(_ sender: Any) {
audioPlayer.play()
}
Do you know where I'm going wrong? I'm confused as to how I can print the URL successfully and yet still end up with nil in the same block, but maybe that's an asynchronous thing.
The problem is that you never saved the mp3 file to the documents directory, yet you are trying to play it from there.
this line
audioPlayer = try AVAudioPlayer(contentsOf: destinationUrl)
assumes that there is a saved mp3 file at that path, but actually there is no file there; you only appended the file name to the documents URL on the fly.
Besides, for streaming audio from a remote server, use AVPlayer instead of AVAudioPlayer.
AVPlayer Documentation
Also try this with URLs parsed from JSON:
let urlStr = (foo["audio"] as? String)!
self.audiotest = urlStr.addingPercentEncoding(withAllowedCharacters: .urlHostAllowed) ?? urlStr
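A sketch of the first point, with a hypothetical helper name, assuming the audioPlayer property from the question: download the remote mp3 into the documents directory first, then hand the local copy to AVAudioPlayer.
func downloadAndPlay(from audioUrl: URL) {
    let documentsDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let destinationUrl = documentsDirectoryURL.appendingPathComponent(audioUrl.lastPathComponent)

    // Download to a temporary location, then move it into the documents directory.
    URLSession.shared.downloadTask(with: audioUrl) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        try? FileManager.default.removeItem(at: destinationUrl)
        try? FileManager.default.moveItem(at: tempURL, to: destinationUrl)
        DispatchQueue.main.async {
            do {
                self.audioPlayer = try AVAudioPlayer(contentsOf: destinationUrl)
                self.audioPlayer.play()
            } catch {
                print(error.localizedDescription)
            }
        }
    }.resume()
}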
I want to make a piano player that plays predefined notes stored as .mid files. Here is my code, which is not working.
let soundPath: String? = Bundle.main.path(forResource: "0_100", ofType: "mid")
let midiFile:URL = URL(fileURLWithPath: soundPath ?? "")
var midiPlayer: AVMIDIPlayer?
do {
try midiPlayer = AVMIDIPlayer(contentsOf: midiFile, soundBankURL: nil)
midiPlayer?.prepareToPlay()
midiPlayer?.play {
print("finished playing")
}
} catch {
print("could not create MIDI player")
}
soundBankURL is missing in the following:
try midiPlayer = AVMIDIPlayer(contentsOf: midiFile, soundBankURL: nil)
It's required according to the doc: https://developer.apple.com/documentation/avfoundation/avmidiplayer/1389225-init
Important
For macOS the bankURL can be set to nil to use the default sound bank. However, iOS must always refer to a valid bank file.
A sound bank can be an .sf2 SoundFont file, for instance.
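For illustration, a minimal sketch on iOS, assuming a SoundFont named gs_instruments.sf2 (hypothetical file name) is bundled with the app:
let midiURL = Bundle.main.url(forResource: "0_100", withExtension: "mid")!
let bankURL = Bundle.main.url(forResource: "gs_instruments", withExtension: "sf2")!

do {
    // Keep a strong reference (e.g. a property) so the player isn't deallocated mid-playback.
    let midiPlayer = try AVMIDIPlayer(contentsOf: midiURL, soundBankURL: bankURL)
    midiPlayer.prepareToPlay()
    midiPlayer.play {
        print("finished playing")
    }
} catch {
    print("could not create MIDI player: \(error)")
}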