AudioKit AKFFTTap generates an array of 0 in offline rendering mode - iOS

I'm trying to use AudioKit and its AKFFTTap to get the FFT data of an audio file.
I manage to get them in real-time processing, but as soon as I switch to offline rendering mode the generated data are all 0.
So I was wondering: is it possible to get FFT data in offline rendering mode?
Here is the code I use:
class OfflineProcessingClass {
    var tracker: AKFrequencyTracker!
    var fftTap: AKFFTTap!
    // ....

    private func process(audioFile: AKAudioFile) throws {
        // Make connections
        let player = try AKAudioPlayer(file: audioFile)
        tracker = AKFrequencyTracker(player)
        fftTap = AKFFTTap(tracker)
        AudioKit.output = tracker

        // Set up offline rendering mode
        let timeIntervalInSeconds: TimeInterval = 0.1
        let sampleInterval = Int(floor(timeIntervalInSeconds * audioFile.sampleRate))
        try AudioKit.engine.enableManualRenderingMode(
            .offline,
            format: audioFile.fileFormat,
            maximumFrameCount: AVAudioFrameCount(sampleInterval)
        )

        // Set up the buffer
        let buffer = AVAudioPCMBuffer(
            pcmFormat: AudioKit.engine.manualRenderingFormat,
            frameCapacity: AudioKit.engine.manualRenderingMaximumFrameCount
        )!

        // Start processing
        try AudioKit.start()
        player.start()

        // Read the file offline
        while AudioKit.engine.manualRenderingSampleTime < audioFile.length {
            let frameCount = audioFile.length - AudioKit.engine.manualRenderingSampleTime
            let framesToRender = min(AVAudioFrameCount(frameCount), buffer.frameCapacity)
            try AudioKit.engine.renderOffline(framesToRender, to: buffer)
            // Tracker values are good
            print("\(tracker.amplitude) dB - \(tracker!.frequency) Hz")
            // Array of 0
            print(fftTap.fftData) /////////////// <====== Error is here
        }

        // End processing
        player.stop()
        AudioKit.engine.stop()
    }
}
Do you see something wrong in this code?

This is because handleTapBlock in BaseTap does a dispatch async on the main queue. That means that, since you're occupying the main queue in your while loop, the BaseTap will never get the opportunity to receive any callbacks. You'll need to relinquish the main queue for that to work.
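A minimal sketch of one way to do that, keeping the question's AudioKit 4 names (so treat it as an assumption, not a drop-in fix): move the render loop onto a background queue and hop back to the main queue to read the tap's data.

// Sketch: rendering happens off the main thread, so the tap's
// dispatch-async callbacks onto the main queue can actually run.
DispatchQueue.global(qos: .userInitiated).async {
    while AudioKit.engine.manualRenderingSampleTime < audioFile.length {
        let remaining = audioFile.length - AudioKit.engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(remaining), buffer.frameCapacity)
        try? AudioKit.engine.renderOffline(framesToRender, to: buffer)
        DispatchQueue.main.async {
            // By the time this runs, BaseTap has had a chance to
            // deliver its callback on the main queue.
            print(self.fftTap.fftData)
        }
    }
}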

Related

Import a big data set with relations into Core Data in batches

I'm trying to import a large data set of around 80k objects, following Apple's example.
I have two issues:
In the code example there is a comment:
// taskContext.performAndWait runs on the URLSession's delegate queue
// so it won’t block the main thread.
But in my case I'm not using URLSession to fetch the JSON; the file is bundled with the app. In this case, how do I make sure the import won't block the main thread? Should I create a custom queue? Any example?
In the example it's just importing an array of entities, but in my case I need to import a single entity that has 70k objects in a to-many relation.
So what I want to achieve is:
If there is a ContactBook, don't import anything, because we have already imported the JSON.
If there is no ContactBook, create one and import all the 70k Contact objects into the contacts relation of the ContactBook. This should happen in batches, as in the example, and should not block the UI.
What I have tried:
private func insertContactBookIfNeeded() {
    let fetch: NSFetchRequest<ContactBook> = ContactBook.fetchRequest()
    let contactBookCount = (try? context.count(for: fetch)) ?? 0
    if contactBookCount > 0 {
        return
    }
    let contacts = Bundle.main.decode([ContactJSON].self, from: "contacts.json")
    // Process records in batches to avoid a high memory footprint.
    let batchSize = 256
    let count = contacts.count
    // Determine the total number of batches.
    var numBatches = count / batchSize
    numBatches += count % batchSize > 0 ? 1 : 0
    for batchNumber in 0 ..< numBatches {
        // Determine the range for this batch.
        let batchStart = batchNumber * batchSize
        let batchEnd = batchStart + min(batchSize, count - batchNumber * batchSize)
        let range = batchStart..<batchEnd
        // Create a batch for this range from the decoded JSON.
        let contactsBatch = Array(contacts[range])
        // Stop the entire import if any batch is unsuccessful.
        if !importOneBatch(contactsBatch) {
            assertionFailure("Could not import batch number \(batchNumber) range \(range)")
            return
        }
    }
}
private func importOneBatch(_ contactsBatch: [ContactJSON]) -> Bool {
    var success = false
    // Create a private queue context.
    let taskContext = container.newBackgroundContext()
    taskContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
    // NOT TRUE IN MY CASE: (Any suggestion ??)
    // taskContext.performAndWait runs on the URLSession's delegate queue
    // so it won't block the main thread.
    print("isMainThread: \(Thread.isMainThread)") // prints true
    taskContext.performAndWait {
        let fetchRequest: NSFetchRequest<ContactBook> = ContactBook.fetchRequest()
        fetchRequest.returnsObjectsAsFaults = true
        fetchRequest.includesSubentities = false
        let contactBookCount = (try? taskContext.count(for: fetchRequest)) ?? 0
        var contactBook: ContactBook?
        if contactBookCount > 0 {
            do {
                contactBook = try taskContext.fetch(fetchRequest).first
            } catch let error as NSError {
                assertionFailure("Can't fetch the contact book \(error)")
            }
        } else {
            contactBook = ContactBook(context: taskContext)
        }
        guard let book = contactBook else {
            assertionFailure("Could not fetch the contact book")
            return
        }
        // Create a new record for each contact in the batch.
        for contactJSON in contactsBatch {
            // Create a Contact managed object on the private queue context.
            let contact = Contact(context: taskContext)
            // Populate the Contact's properties using the raw data.
            contact.name = contactJSON.name
            contact.subContacts = NSSet(array: contactJSON.subContacts.map { subC -> Contact in
                let subContact = Contact(context: taskContext)
                subContact.name = subC.name
                return subContact
            })
            book.addToContacts(contact)
        }
        // Save all insertions and deletions from the context to the store.
        if taskContext.hasChanges {
            do {
                try taskContext.save()
            } catch {
                print("Error: \(error)\nCould not save Core Data context.")
                return
            }
            // Reset the taskContext to free the cache and lower the memory footprint.
            taskContext.reset()
        }
        success = true
    }
    return success
}
The problem is that this is very slow, because in each batch I fetch the contact book (which is getting bigger with each iteration) so that I can insert the new batch of contacts into it. Is there an efficient way to avoid fetching the contact book in each batch? Also, any suggestions to make this faster? Increase the batch size? Create a background queue?
Update:
I have tried to create a ContactBook once in insertContactBookIfNeeded and pass it to importOneBatch with each iteration, but I get:
Thread 1: Exception: "Illegal attempt to establish a relationship 'contactBook' between objects in different contexts"
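That exception is expected: a managed object must not be used outside the context it belongs to. A hedged sketch of the usual workaround, assuming the same entity names as above — pass the book's NSManagedObjectID into each batch and rematerialize it on the task context:

// On the main context, after the ContactBook has been saved once
// (so its objectID is permanent):
let bookID = contactBook.objectID

// Inside importOneBatch, within taskContext.performAndWait:
guard let book = try? taskContext.existingObject(with: bookID) as? ContactBook else {
    return
}
// `book` now lives on taskContext, so book.addToContacts(contact)
// no longer crosses contexts, and no per-batch fetch is needed.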

WebRTC (iOS): local video is not streaming to the remote side

I am trying to make an app with audio/video calling using WebRTC.
Remote video and audio are working properly in my app, but my local stream is not appearing on the remote side.
Here is what I have written to add a video track:
let videoSource = self.rtcPeerFactory.videoSource()
let videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
guard let frontCamera = (RTCCameraVideoCapturer.captureDevices().first { $0.position == .front }),
    // choose highest resolution
    let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
        let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
        let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
        return width1 < width2
    }).last,
    // choose highest fps
    let fps = (format.videoSupportedFrameRateRanges.sorted { return $0.maxFrameRate < $1.maxFrameRate }.last) else {
        print(.error, "Error in createLocalVideoTrack")
        return nil
}
videoCapturer.startCapture(with: frontCamera,
                           format: format,
                           fps: Int(fps.maxFrameRate))
self.callManagerDelegate?.didAddLocalVideoTrack(videoTrack: videoCapturer)
let videoTrack = self.rtcPeerFactory.videoTrack(with: videoSource, trackId: K.CONSTANT.VIDEO_TRACK_ID)
And this is how I add the audio track:
let constraints: RTCMediaConstraints = RTCMediaConstraints.init(mandatoryConstraints: [:], optionalConstraints: nil)
let audioSource: RTCAudioSource = self.rtcPeerFactory.audioSource(with: constraints)
let audioTrack: RTCAudioTrack = self.rtcPeerFactory.audioTrack(with: audioSource, trackId: K.CONSTANT.AUDIO_TRACK_ID)
My full WebRTC log is attached here.
Some of the logs I am getting (I think something is wrong here):
(thread.cc:303): Waiting for the thread to join, but blocking calls have been disallowed
(basic_port_allocator.cc:1035): Port[31aba00:0:1:0:relay:Net[ipsec4:2405:204:8888:x:x:x:x:x/64:VPN/Unknown:id=2]]: Port encountered error while gathering candidates.
...
(basic_port_allocator.cc:1017): Port[38d7400:audio:1:0:local:Net[en0:192.168.1.x/24:Wifi:id=1]]: Port completed gathering candidates.
(basic_port_allocator.cc:1035): Port[3902c00:video:1:0:relay:Net[ipsec5:2405:204:8888:x:x:x:x:x/64:VPN/Unknown:id=3]]: Port encountered error while gathering candidates.
Finally, I found the solution:
It was due to the TCP protocol in the TURN server.
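In case it helps anyone else: one way to force the TURN allocation off TCP is the transport URI parameter (RFC 7065). A sketch with a hypothetical server address and credentials:

// Hypothetical TURN server; "?transport=udp" asks the client to
// reach the TURN server over UDP instead of TCP.
let iceServer = RTCIceServer(urlStrings: ["turn:turn.example.com:3478?transport=udp"],
                             username: "user",
                             credential: "secret")
let config = RTCConfiguration()
config.iceServers = [iceServer]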

Is the iOS iPhone simulator causing memory usage analysis to swell?

I am trying to process a large text file in my app. I know that I want to be careful with the amount of memory being consumed while I read the data. Once a piece of data is read the app does not need to keep the data around.
Thanks to “Martin R” and the post Read a file/URL line-by-line for helping me jump-start my effort.
I am trying to monitor the memory consumption of my app as it reads in the large data file so that I can be sure it is behaving as expected. Here’s where I am running into a problem.
When I run Instruments using Command-I from within Xcode and I monitor allocations, I see that during the read of the file the app peaks at ~15MB and then drops back down. This is fairly repeatable, +/- 0.5MB.
When I run the app using Command-R from within Xcode, let it finish reading through the file, and then press Record within Instruments, the memory consumption now swells to ~360MB.
So to clarify, the two ways I have measured memory allocations are:
Profile:
1. Xcode Command-I.
2. Instruments Record Allocations. Observe ~15MB
Simulate and Profile:
1. Xcode Command-R.
2. Let app run to “IDLE”.
3. Instruments Record. Observe ~360MB.
I have been trying to figure out a few things here.
Q1. Why the difference? (This may answer all my questions)
Q2. Do I have a real problem, or is this only a problem because of how the debug build is instrumented on the simulator?
Q3. Similar to Q2, if I run a debug build on a real device, will it have the same issue?
Q4. For my app, ~15MB is acceptable when parsing the file, but ~360MB is not. Is there another way I can continue to debug without taking this 360MB hit?
Xcode Version 8.1 (8B62), macOS Sierra, 2.7 GHz i5 MacBook Pro (circa 2015)
Sample code attached. The first part of the file is merely a copy of the code from the referenced post, for the reader's convenience. One can take this code as-is and run it in Xcode. At the bottom is the ViewController viewDidLoad() method where things "run". The memory "swell" happens after "File opened" is printed.
import UIKit

/* Originally from
 * Stack Overflow:
 * https://stackoverflow.com/questions/24581517/read-a-file-url-line-by-line-in-swift
 * posted by Martin R.
 * Much thanks!
 */
class StreamReader {
    let encoding: String.Encoding
    let chunkSize: Int
    var fileHandle: FileHandle!
    let delimData: Data
    var buffer: Data
    var atEof: Bool

    init?(path: String, delimiter: String = "\n", encoding: String.Encoding = .utf8,
          chunkSize: Int = 4096) {
        guard let fileHandle = FileHandle(forReadingAtPath: path),
            let delimData = delimiter.data(using: encoding) else {
                return nil
        }
        self.encoding = encoding
        self.chunkSize = chunkSize
        self.fileHandle = fileHandle
        self.delimData = delimData
        self.buffer = Data(capacity: chunkSize)
        self.atEof = false
    }

    deinit {
        self.close()
    }

    /// Return next line, or nil on EOF.
    func nextLine() -> String? {
        precondition(fileHandle != nil, "Attempt to read from closed file")
        // Read data chunks from file until a line delimiter is found:
        while !atEof {
            if let range = buffer.range(of: delimData) {
                // Convert complete line (excluding the delimiter) to a string:
                let line = String(data: buffer.subdata(in: 0..<range.lowerBound), encoding: encoding)
                // Remove line (and the delimiter) from the buffer:
                buffer.removeSubrange(0..<range.upperBound)
                return line
            }
            let tmpData = fileHandle.readData(ofLength: chunkSize)
            if tmpData.count > 0 {
                buffer.append(tmpData)
            } else {
                // EOF or read error.
                atEof = true
                if buffer.count > 0 {
                    // Buffer contains last line in file (not terminated by delimiter).
                    let line = String(data: buffer as Data, encoding: encoding)
                    buffer.count = 0
                    return line
                }
            }
        }
        return nil
    }

    /// Start reading from the beginning of file.
    func rewind() -> Void {
        fileHandle.seek(toFileOffset: 0)
        buffer.count = 0
        atEof = false
    }

    /// Close the underlying file. No reading must be done after calling this method.
    func close() -> Void {
        fileHandle?.closeFile()
        fileHandle = nil
    }
}

extension StreamReader: Sequence {
    func makeIterator() -> AnyIterator<String> {
        return AnyIterator {
            return self.nextLine()
        }
    }
}

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let path2WordList = Bundle.main.path(forResource: "large_text_file", ofType: "txt")
        var wordCnt: Int = 0
        if nil != path2WordList {
            if let aStreamReader = StreamReader(path: path2WordList!) {
                defer { aStreamReader.close() }
                print("File opened")
                /* Read and discard */
                while aStreamReader.nextLine() != nil {
                    wordCnt += 1
                }
            } // if let ...
        } // if nil ...
        print("Final wordCnt := \(wordCnt)")
    } // viewDidLoad

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
I've encountered problems like this when using long-running while loops. The problem is that anything allocated in the current autorelease pool won't get deallocated until the loop exits.
To guard against this, you can wrap the contents of your while loop in autoreleasepool(invoking:). This will cause each iteration of your loop to have its own autorelease pool that is drained each time.
It would look something like this:
/// Return next line, or nil on EOF.
func nextLine() -> String? {
    precondition(fileHandle != nil, "Attempt to read from closed file")
    var result: String? = nil
    // Read data chunks from file until a line delimiter is found:
    while !atEof, result == nil {
        result = autoreleasepool {
            if let range = buffer.range(of: delimData) {
                // Convert complete line (excluding the delimiter) to a string:
                let line = String(data: buffer.subdata(in: 0..<range.lowerBound), encoding: encoding)
                // Remove line (and the delimiter) from the buffer:
                buffer.removeSubrange(0..<range.upperBound)
                return line
            }
            let tmpData = fileHandle.readData(ofLength: chunkSize)
            if tmpData.count > 0 {
                buffer.append(tmpData)
            } else {
                // EOF or read error.
                atEof = true
                if buffer.count > 0 {
                    // Buffer contains last line in file (not terminated by delimiter).
                    let line = String(data: buffer as Data, encoding: encoding)
                    buffer.count = 0
                    return line
                }
            }
            return nil
        }
    }
    return result
}
As to whether your memory growth is a side effect of the debug environment, it's hard to say. But it would probably be wise to guard against this kind of growth regardless.
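If you'd rather leave nextLine() untouched, the same idea works at the call site — a sketch against the viewDidLoad loop above (autoreleasepool(invoking:) returns its closure's value, so this compiles as written):

// Drain a pool on every iteration of the read loop instead of
// inside nextLine(). Temporary Data/String objects are released
// each pass rather than accumulating until the loop ends.
while true {
    let line: String? = autoreleasepool {
        aStreamReader.nextLine()
    }
    if line == nil { break }
    wordCnt += 1
}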

Converting Objective-C block to Swift

I have a function written in Objective-C (below) and I want to convert it to Swift, but I keep getting errors.
Here is the Objective-C code:
- (void)openFileWithFilePathURL:(NSURL *)filePathURL
{
    self.audioFile = [EZAudioFile audioFileWithURL:filePathURL];
    self.filePathLabel.text = filePathURL.lastPathComponent;

    //
    // Plot the whole waveform
    //
    self.audioPlot.plotType = EZPlotTypeBuffer;
    self.audioPlot.shouldFill = YES;
    self.audioPlot.shouldMirror = YES;

    //
    // Get the audio data from the audio file
    //
    __weak typeof (self) weakSelf = self;
    [self.audioFile getWaveformDataWithCompletionBlock:^(float **waveformData,
                                                         int length)
    {
        [weakSelf.audioPlot updateBuffer:waveformData[0]
                          withBufferSize:length];
    }];
}
And here is my Swift code:
func openFileWithFilePathURL(url: NSURL) {
    let audioFile = EZAudioFile(URL: url)
    audioPlot.plotType = EZPlotType.Buffer
    audioPlot.shouldFill = true
    audioPlot.shouldMirror = true
    audioFile.getWaveformDataWithCompletionBlock({ (waveformData, length) in
        audioPlot.updateBuffer(waveformData[0], withBufferSize: length)
    })
}
And I always get the error
Command failed due to signal: Segmentation fault: 11
I'm new to iOS development and I have spent hours on this problem. I really have no clue how to fix it.
I guess the problem lies in how I converted the block from Objective-C to Swift.
Thank you for your help!!
You can try this:
func openFileWithFilePathURL(filePathURL: NSURL) {
    self.audioFile = EZAudioFile(URL: filePathURL)
    self.filePathLabel.text = filePathURL.lastPathComponent
    //
    // Plot the whole waveform
    //
    self.audioPlot.plotType = EZPlotType.Buffer
    self.audioPlot.shouldFill = true
    self.audioPlot.shouldMirror = true
    //
    // Get the audio data from the audio file
    //
    weak var weakSelf = self
    self.audioFile.getWaveformDataWithCompletionBlock({ (waveformData, length) in
        // Let the compiler infer the block's parameter types;
        // annotating them incorrectly is what breaks the build.
        weakSelf?.audioPlot.updateBuffer(waveformData[0], withBufferSize: length)
    })
}
This is typically an Xcode glitch. The only thing you can do is try to alter the syntax, first by changing the order of the lines and then the actual lines themselves (i.e. each kind of line has several possible variations). You can also submit a bug report to Apple if you still cannot fix it. (here)
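If you do want explicit parameter types in the closure, a hedged guess under the standard bridging rules (an assumption — verify against EZAudio's generated Swift interface) is that float ** comes across as UnsafeMutablePointer<UnsafeMutablePointer<Float>> and int as Int32:

// Assumed bridged signature; check EZAudio's Swift interface before relying on it.
self.audioFile.getWaveformDataWithCompletionBlock {
    (waveformData: UnsafeMutablePointer<UnsafeMutablePointer<Float>>, length: Int32) -> Void in
    weakSelf?.audioPlot.updateBuffer(waveformData[0], withBufferSize: UInt32(length))
}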

How to get the power of an audio buffer using The Amazing Audio Engine

I am fairly new to iOS Development, and I am a complete newbie with audio stuff.
I am trying to get the loudness or the power of the audio that is being played using TAAE. I am not sure whether what I am doing makes any sense.
Here is my code
static var gameStatus: GameStatus = .Starting

private init() {
    audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())
    initializeAudioTrack()
}

func initializeAudioTrack() {
    let file = NSBundle.mainBundle().URLForResource("01 Foreign Formula", withExtension: "mp3")
    let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
    let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
        let leftSample = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
        let rightSample = UnsafeMutablePointer<Float>(audioBufferList[1].mBuffers.mData)
        var accumulator = Float(0.0)
        // Sum of squares over the left channel; 0..<frames avoids
        // reading one sample past the end of the buffer.
        for i in 0..<Int(frames) {
            accumulator += leftSample[i] * leftSample[i]
        }
        let power = accumulator / Float(frames)
        println(power)
    }
    println(audioController?.masterOutputVolume)
    audioController?.addChannels([channel])
    audioController?.addOutputReceiver(receiver)
    audioController?.useMeasurementMode = true
    audioController?.preferredBufferDuration = 0.005
    audioController?.start(nil)
}
I looked everywhere trying to understand how to get this done, but it is hard for me to know what I should be looking for.
Basically, all I need is to find the power of the audio (intensity, bass, etc.) to determine and manipulate certain things in the game I am building.
I would really love any kind of explanation or help.
Feel free to write code in Objective-C or another language.
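For reference, the mean square of the samples (what the receiver block above computes) is the usual starting point; taking the square root gives RMS, and converting to decibels gives a perceptually friendlier number. A standalone sketch of that math in Swift, independent of TAAE:

import Foundation

/// RMS power of one channel's samples, in decibels relative to
/// full scale (0 dBFS is the maximum; silence tends toward -inf,
/// clamped here to -160 dB).
func powerInDecibels(samples: UnsafeMutablePointer<Float>, frameCount: Int) -> Float {
    var accumulator: Float = 0.0
    for i in 0..<frameCount {
        accumulator += samples[i] * samples[i]      // sum of squares
    }
    let meanSquare = accumulator / Float(max(frameCount, 1))
    let rms = sqrt(meanSquare)                      // root mean square
    return rms > 0 ? 20 * log10(rms) : -160
}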