If you have a file stored as Data that is much larger than your buffer, and you want to iterate over that data in buffer-sized pieces, what is a good way to do this? If you could provide some context, that would be great.
I was thinking,
let bufferSize: Int = 20000
let myData: Data = Data(..)
var buffer: ??? = ???
var bufferOffset: ??? = ???

func runWhenReady() {
    buffer = &myData
    let amount = sendingData(&buffer[bufferOffset], maxLength: bufferSize - bufferOffset)
    bufferOffset += amount
}
// pseudocode
// from foundation, but changed a bit (taken from Obj-C foundations just for types)
// writes the bytes from the specified buffer to the stream up to len bytes. Returns the number of bytes actually written.
func sendingData(_ buffer: const uint8_t *, maxLength len: NSUInteger) -> Int {
...
}
If you want to iterate, you need a loop. This example slices the data into chunks of bufferSize using stride(from:to:by:).
let bufferSize = 20000
var buffer = [UInt8]()
let myData = Data(..)
let dataCount = myData.count
for currentIndex in stride(from: 0, to: dataCount, by: bufferSize) {
    let length = min(bufferSize, dataCount - currentIndex) // ensures that the last chunk is the remainder of the data
    let endIndex = myData.index(currentIndex, offsetBy: length)
    buffer = [UInt8](myData[currentIndex..<endIndex])
    // do something with buffer
}
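If the goal is to feed each chunk to a write-style call such as the sendingData placeholder from the question, here is a minimal sketch, assuming sendingData takes an UnsafePointer<UInt8> plus a maxLength and returns the number of bytes it accepted (that signature is an assumption, not from the original post):
for currentIndex in stride(from: 0, to: dataCount, by: bufferSize) {
    let length = min(bufferSize, dataCount - currentIndex)
    let chunk = [UInt8](myData[currentIndex ..< currentIndex + length])

    // Keep calling the writer until the whole chunk has been accepted,
    // since a single call may write fewer bytes than requested.
    var offset = 0
    while offset < chunk.count {
        let written = chunk[offset...].withUnsafeBufferPointer {
            sendingData($0.baseAddress!, maxLength: $0.count)
        }
        if written <= 0 { break } // handle error
        offset += written
    }
}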
Open the file:
let fileUrl: URL = ...
let fileHandle = try! FileHandle(forReadingFrom: fileUrl)
defer {
    fileHandle.closeFile()
}
Create buffer:
let bufferSize = 20_000
let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: bufferSize)
defer {
    buffer.deallocate()
}
Read until end of file is reached:
while true {
    let bytesRead = read(fileHandle.fileDescriptor, buffer, bufferSize)
    if bytesRead < 0 {
        // handle error
        break
    }
    if bytesRead == 0 {
        // end of file
        break
    }
    // do something with data in buffer
}
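Alternatively, a sketch that is not from the original answer: FileHandle's readData(ofLength:) returns each chunk as a Data value, so no manual pointer allocation is needed.
let fileHandle = try FileHandle(forReadingFrom: fileUrl)
defer {
    fileHandle.closeFile()
}

while true {
    // Reads at most bufferSize bytes; returns an empty Data at end of file.
    let chunk = fileHandle.readData(ofLength: bufferSize)
    if chunk.isEmpty {
        break // end of file
    }
    // do something with chunk
}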
Related
Given a single AKAudioFile that has been created from an AKNodeRecorder containing a series of spoken words, where each word is separated by at least 1 second, what is the best approach to ultimately create a series of files with each file containing one word?
I believe this can be accomplished if there is a way to iterate the file in, for example, 100 ms chunks, and measure the average amplitude of each chunk. "Silent chunks" could be those below some arbitrarily small amplitude. While iterating, if I encounter a chunk with non-silent amplitude, I can grab the starting timestamp of this "non-silent" chunk to create an audio file that starts here and ends at the start time of the next "silent" chunk.
Whether it's a manual approach like the one above or a more built-in AudioKit processing technique, any suggestions would be greatly appreciated.
I don't have a complete solution, but I've started working on something similar. This function could serve as a jumping-off point for what you need. Basically, you want to read the file into a buffer and then analyze the buffer data. At that point you could dice it up into smaller buffers and write those to file.
public class func guessBoundaries(url: URL, sensitivity: Double = 1) -> [Double]? {
    var out: [Double] = []

    guard let audioFile = try? AVAudioFile(forReading: url) else { return nil }

    let processingFormat = audioFile.processingFormat
    let frameCount = AVAudioFrameCount(audioFile.length)

    guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: processingFormat, frameCapacity: frameCount) else { return nil }

    do {
        audioFile.framePosition = 0
        try audioFile.read(into: pcmBuffer, frameCount: frameCount)
    } catch let err as NSError {
        AKLog("ERROR: Couldn't read data into buffer. \(err)")
        return nil
    }

    let channelCount = Int(pcmBuffer.format.channelCount)
    let bufferLength = 1024

    let inThreshold: Double = 0.001 / sensitivity
    let outThreshold: Double = 0.0001 * sensitivity
    let minSegmentDuration: Double = 1

    var counter = 0
    var thresholdCrossed = false
    var rmsBuffer = [Float](repeating: 0, count: bufferLength)
    var lastTime: Double = 0

    AKLog("inThreshold", inThreshold, "outThreshold", outThreshold)

    for i in 0 ..< Int(pcmBuffer.frameLength) {
        // n is the channel
        for n in 0 ..< channelCount {
            guard let sample: Float = pcmBuffer.floatChannelData?[n][i] else { continue }

            if counter == rmsBuffer.count {
                let time: Double = Double(i) / processingFormat.sampleRate
                // average of the absolute sample values in the window
                let avg = Double(rmsBuffer.reduce(0, +)) / Double(rmsBuffer.count)
                // AKLog("Average Value at frame \(i):", avg)

                if avg > inThreshold && !thresholdCrossed && time - lastTime > minSegmentDuration {
                    thresholdCrossed = true
                    out.append(time)
                    lastTime = time
                } else if avg <= outThreshold && thresholdCrossed && time - lastTime > minSegmentDuration {
                    thresholdCrossed = false
                    out.append(time)
                    lastTime = time
                }
                counter = 0
            }

            rmsBuffer[counter] = abs(sample)
            counter += 1
        }
    }

    rmsBuffer.removeAll()
    return out
}
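A sketch of how the returned timestamps might be consumed, assuming the function above lives on a hypothetical Segmenter class and recordingURL points at your AKAudioFile's URL: consecutive pairs of boundaries delimit one word.
if let boundaries = Segmenter.guessBoundaries(url: recordingURL, sensitivity: 1) {
    for pairStart in stride(from: 0, to: boundaries.count - 1, by: 2) {
        let wordStart = boundaries[pairStart]     // first non-silent crossing
        let wordEnd = boundaries[pairStart + 1]   // following silent crossing
        print("Word candidate from \(wordStart)s to \(wordEnd)s")
        // Export this time range to its own file, e.g. with AKAudioFile's
        // extraction/export APIs (not shown here).
    }
}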
I have these two structs:
struct pcap_hdr_s {
UInt32 magic_number;
UInt16 version_major;
UInt16 version_minor;
int32_t thiszone;
UInt32 sigfigs;
UInt32 snaplen;
UInt32 network;
};
//packet header
struct pcaprec_hdr_s {
UInt32 ts_sec;
UInt32 ts_usec;
UInt32 incl_len;
UInt32 orig_len;
};
which are initialised as follows (for example):
let pcapHeader: pcap_hdr_s = pcap_hdr_s(magic_number: 0xa1b2c3d4,
                                        version_major: 2,
                                        version_minor: 4,
                                        thiszone: 0,
                                        sigfigs: 0,
                                        snaplen: pcap_record_size,
                                        network: LINKTYPE_ETHERNET)
let pcapRecHeader: pcaprec_hdr_s = pcaprec_hdr_s(ts_sec: UInt32(ts.tv_sec),
                                                 ts_usec: UInt32(ts.tv_nsec),
                                                 incl_len: plen,
                                                 orig_len: length)
I tried to create Data/NSData objects of the structs like this:
//write pcap header
let pcapHeaderData : NSData = NSData(bytes: pcapHeader, length: sizeofValue(pcapHeader))
//write pcaprec header
let pcapRecHeaderData : NSData = NSData(bytes: pcapRecHeader, length: sizeofValue(pcapRecHeader))
but I always get this error for each line:
"Cannot convert value of type 'pcap_hdr_s' to expected argument type 'UnsafeRawPointer?'"
I had a look at the documentation of UnsafeRawPointer in Swift, but I don't understand it well enough yet to create the NSData objects from the structs.
Am I on the right track, or is there a better way to accomplish my intent?
If this Data initialisation worked, my next steps would be to:
append pcapRecHeaderData to pcapHeaderData
write pcapHeaderData atomically to a file/URL with the function provided by Data/NSData
EDIT:
//packet ethernet header
struct ethernet_hdr_s {
let dhost : [UInt8]
let shost : [UInt8]
let type : UInt16
};
let src_mac : [UInt8] = [0x66, 0x77, 0x88, 0x99, 0xAA, 0xBB]
let dest_mac : [UInt8] = [0x00, 0x11, 0x22, 0x33, 0x44, 0x55]
let ethernetHeader : ethernet_hdr_s = ethernet_hdr_s(dhost: dest_mac, shost: src_mac, type: 0x0800)
EDIT 2:
let payloadSize = packet.payload.count
let plen = (payloadSize < Int(pcap_record_size) ? payloadSize : Int(pcap_record_size));
bytesWritten = withUnsafePointer(to: &(packet.payload)) {
    $0.withMemoryRebound(to: UInt8.self, capacity: Int(plen)) {
        ostream.write($0, maxLength: Int(plen))
    }
}
if bytesWritten != Int(plen) {
    // Could not write all bytes, report error ...
    NSLog("Error writing packet payload, not all bytes written: bytesWritten: %d | plen: %d", bytesWritten, Int(plen))
}
You can write arbitrary data to an OutputStream without creating a
(NS)Data object first. The "challenge" is how to convert the pointer to
the struct to a UInt8 pointer, as expected by the write method:
let ostream = OutputStream(url: url, append: false)! // Add error checking here!
ostream.open()
var pcapHeader = pcap_hdr_s(...)
let headerSize = MemoryLayout.size(ofValue: pcapHeader)
let bytesWritten = withUnsafePointer(to: &pcapHeader) {
    $0.withMemoryRebound(to: UInt8.self, capacity: headerSize) {
        ostream.write($0, maxLength: headerSize)
    }
}
if bytesWritten != headerSize {
    // Could not write all bytes, report error ...
}
In the same way, you can read data from an InputStream:
let istream = InputStream(url: url)! // Add error checking here!
istream.open()
let bytesRead = withUnsafeMutablePointer(to: &pcapHeader) {
    $0.withMemoryRebound(to: UInt8.self, capacity: headerSize) {
        istream.read($0, maxLength: headerSize)
    }
}
if bytesRead != headerSize {
    // Could not read all bytes, report error ...
}
If the file was possibly created on a different platform with a
different byte order then you can check the "magic" and swap bytes
if necessary (as described on https://wiki.wireshark.org/Development/LibpcapFileFormat):
switch pcapHeader.magic_number {
case 0xa1b2c3d4:
    break // Already in host byte order
case 0xd4c3b2a1:
    pcapHeader.version_major = pcapHeader.version_major.byteSwapped
    pcapHeader.version_minor = pcapHeader.version_minor.byteSwapped
    // ...
default:
    break // Unknown magic, report error ...
}
To simplify the task of writing and reading structs one can define
custom extension methods, e.g.
extension OutputStream {
    enum ValueWriteError: Error {
        case incompleteWrite
        case unknownError
    }

    func write<T>(value: T) throws {
        var value = value
        let size = MemoryLayout.size(ofValue: value)
        let bytesWritten = withUnsafePointer(to: &value) {
            $0.withMemoryRebound(to: UInt8.self, capacity: size) {
                write($0, maxLength: size)
            }
        }
        if bytesWritten == -1 {
            throw streamError ?? ValueWriteError.unknownError
        } else if bytesWritten != size {
            throw ValueWriteError.incompleteWrite
        }
    }
}
extension InputStream {
    enum ValueReadError: Error {
        case incompleteRead
        case unknownError
    }

    func read<T>(value: inout T) throws {
        let size = MemoryLayout.size(ofValue: value)
        let bytesRead = withUnsafeMutablePointer(to: &value) {
            $0.withMemoryRebound(to: UInt8.self, capacity: size) {
                read($0, maxLength: size)
            }
        }
        if bytesRead == -1 {
            throw streamError ?? ValueReadError.unknownError
        } else if bytesRead != size {
            throw ValueReadError.incompleteRead
        }
    }
}
Now you can write and read simply with
try ostream.write(value: pcapHeader)
try istream.read(value: &pcapHeader)
Of course this works only with "self-contained" structs like your
pcap_hdr_s and pcaprec_hdr_s.
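For a struct that is not self-contained, such as the ethernet_hdr_s from the question's edit, the raw memory of the struct holds array references rather than the bytes themselves, so each field has to be written separately. A rough sketch, reusing the write(value:) extension above and assuming the EtherType should go out in network byte order (error handling on the raw writes omitted):
func write(ethernetHeader: ethernet_hdr_s, to stream: OutputStream) throws {
    // Write the 6-byte MAC addresses directly from the arrays.
    _ = ethernetHeader.dhost.withUnsafeBufferPointer {
        stream.write($0.baseAddress!, maxLength: $0.count)
    }
    _ = ethernetHeader.shost.withUnsafeBufferPointer {
        stream.write($0.baseAddress!, maxLength: $0.count)
    }
    // Write the EtherType as big-endian (network byte order).
    try stream.write(value: ethernetHeader.type.bigEndian)
}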
You can convert pcap_hdr_s to Data and vice versa in Swift 3 with
pcap_hdr_s -> Data
var pcapHeader : pcap_hdr_s = pcap_hdr_s(magic_number ...
let data = withUnsafePointer(to: &pcapHeader) {
    Data(bytes: UnsafePointer($0), count: MemoryLayout.size(ofValue: pcapHeader))
}
Data -> pcap_hdr_s
let header: pcap_hdr_s = data.withUnsafeBytes { $0.pointee }
Reference: round trip Swift number types to/from Data
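For newer Swift versions, a sketch of the same round trip using the raw-buffer APIs (assuming Swift 5 or later, and that the Data is at least as large as the struct):
// pcap_hdr_s -> Data
let data = withUnsafeBytes(of: pcapHeader) { Data($0) }

// Data -> pcap_hdr_s
// load(as:) assumes the bytes are suitably aligned for the struct;
// if that is not guaranteed, copy the bytes into a variable first.
let header = data.withUnsafeBytes { $0.load(as: pcap_hdr_s.self) }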
You can view this project on GitHub here: https://github.com/Lkember/MotoIntercom/
The class of importance is PhoneViewController.swift.
I have an AVAudioPCMBuffer. The buffer is then converted to NSData using this function:
func audioBufferToNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
    let channelCount = 1
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
    let data = NSData(bytes: channels[0], length: Int(PCMBuffer.frameCapacity * PCMBuffer.format.streamDescription.pointee.mBytesPerFrame))
    return data
}
This data needs to be converted to UnsafePointer<UInt8>, according to the documentation on OutputStream.write.
https://developer.apple.com/reference/foundation/outputstream/1410720-write
This is what I have so far:
let data = self.audioBufferToNSData(PCMBuffer: buffer)
let output = self.outputStream!.write(UnsafePointer<UInt8>(data.bytes.assumingMemoryBound(to: UInt8.self)), maxLength: data.length)
When this data is received, it is converted back to an AVAudioPCMBuffer using this method:
func dataToPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false) // given NSData audio format
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.pointee.mBytesPerFrame)
    audioBuffer.frameLength = audioBuffer.frameCapacity
    let channels = UnsafeBufferPointer(start: audioBuffer.floatChannelData, count: Int(audioBuffer.format.channelCount))
    data.getBytes(UnsafeMutableRawPointer(channels[0]), length: data.length)
    return audioBuffer
}
Unfortunately, when I play this audioBuffer, I only hear static. I don't believe that it is an issue with my conversion from AVAudioPCMBuffer to NSData or my conversion from NSData back to AVAudioPCMBuffer. I imagine it is the way that I am writing NSData to the stream.
The reason I don't believe that it is my conversion is because I have created a sample project located here (which you can download and try) that records audio to an AVAudioPCMBuffer, converts it to NSData, converts the NSData back to AVAudioPCMBuffer and plays the audio. In this case there are no problems playing the audio.
EDIT:
I never showed how I actually get Data from the stream as well. Here is how it's done:
func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
    switch (eventCode) {
    case Stream.Event.hasBytesAvailable:
        DispatchQueue.global().async {
            var buffer = [UInt8](repeating: 0, count: 8192)
            let length = self.inputStream!.read(&buffer, maxLength: buffer.count)
            let data = NSData.init(bytes: buffer, length: buffer.count)
            print("\(#file) > \(#function) > \(length) bytes read on queue \(self.currentQueueName()!) buffer.count \(data.length)")

            if (length > 0) {
                let audioBuffer = self.dataToPCMBuffer(data: data)

                self.audioPlayerQueue.async {
                    self.peerAudioPlayer.scheduleBuffer(audioBuffer)
                    if (!self.peerAudioPlayer.isPlaying && self.localAudioEngine.isRunning) {
                        self.peerAudioPlayer.play()
                    }
                }
            }
            else if (length == 0) {
                print("\(#file) > \(#function) > Reached end of stream")
            }
        }
Once I have this data, I use the dataToPCMBuffer method to convert it to an AVAudioPCMBuffer.
EDIT 1:
Here is the AVAudioFormat that I use:
self.localInputFormat = AVAudioFormat.init(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)
Originally, I was using this:
self.localInputFormat = self.localInput?.inputFormat(forBus: 0)
However, if the channel count didn't equal the expected channel count, I was getting crashes, so I switched to the above.
The actual AVAudioPCMBuffer I'm using is in the installTap method (where localInput is an AVAudioInputNode):
localInput?.installTap(onBus: 0, bufferSize: 4096, format: localInputFormat) {
(buffer, time) -> Void in
Pretty sure you want to replace this:
let length = self.inputStream!.read(&buffer, maxLength: buffer.count)
let data = NSData.init(bytes: buffer, length: buffer.count)
With
let length = self.inputStream!.read(&buffer, maxLength: buffer.count)
let data = NSData.init(bytes: buffer, length: length)
Also, I am not 100% sure that arbitrary blocks of data will always be OK for building the audio buffers. You might need to collect the data into a bigger block of NSData first.
Right now, since you always pass in blocks of 8192 bytes (even if you read fewer), the buffer creation probably always succeeds. It might not once you only use the bytes that were actually read.
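A sketch of that collecting step (none of these names are from the question's project; expectedChunkSize is a hypothetical frame size): append whatever arrives to one growing Data value and only hand fixed-size blocks to dataToPCMBuffer.
var pendingData = Data()
let expectedChunkSize = 8192 // hypothetical block size for one audio buffer

func didReceive(_ buffer: [UInt8], length: Int) {
    // Keep only the bytes that were actually read.
    pendingData.append(contentsOf: buffer[0..<length])

    // Build audio buffers from complete blocks; keep the remainder for later.
    while pendingData.count >= expectedChunkSize {
        let chunk = Data(pendingData.prefix(expectedChunkSize))
        pendingData.removeFirst(expectedChunkSize)
        let audioBuffer = dataToPCMBuffer(data: chunk as NSData)
        // schedule audioBuffer for playback, e.g. on the audio player queue
        _ = audioBuffer
    }
}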
How can I convert a UIImage into a byte array, so I can upload it to my web service?
You can actually do it with a couple of lines:
guard let image = UIImage(named: "someImage"),
      let data = image.jpegData(compressionQuality: 1.0) else { return }
// OR
guard let image = UIImage(named: "someImage"),
      let data = image.pngData() else { return }
The compression quality should range from 0.0 to 1.0 and sets the JPEG quality. PNG is lossless, so there is no compression quality parameter, but be aware that the file size can be about 10 times larger.
--- update ---
Updated for Swift 5.1
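If the web service really needs a byte array rather than Data, note that Data is already a collection of UInt8, so the conversion is a one-liner (using the data constant from the guard above):
let byteArray = [UInt8](data)
// or equivalently: let byteArray = Array(data)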
You can convert UIImage to NSData and pass it to this method
func getArrayOfBytesFromImage(imageData: NSData) -> NSMutableArray
{
    // the number of elements:
    let count = imageData.length / sizeof(UInt8)

    // create array of appropriate length:
    var bytes = [UInt8](count: count, repeatedValue: 0)

    // copy bytes into array
    imageData.getBytes(&bytes, length: count * sizeof(UInt8))

    var byteArray: NSMutableArray = NSMutableArray()
    for (var i = 0; i < count; i++) {
        byteArray.addObject(NSNumber(unsignedChar: bytes[i]))
    }
    return byteArray
}
Swift 5, iOS 14 version based on toofani's answer, with minimal changes:
func getArrayOfBytesFromImage(imageData: NSData) -> Array<UInt8>
{
    // the number of elements:
    let count = imageData.length / MemoryLayout<Int8>.size

    // create array of appropriate length:
    var bytes = [UInt8](repeating: 0, count: count)

    // copy bytes into array
    imageData.getBytes(&bytes, length: count * MemoryLayout<Int8>.size)

    var byteArray: Array = Array<UInt8>()
    for i in 0 ..< count {
        byteArray.append(bytes[i])
    }
    return byteArray
}
So a complete sequence looks like this... assuming I have a UIImage, I extract the data and then recombine it.
let data = imageX.pngData()
let bytes = getArrayOfBytesFromImage(imageData: data! as NSData)
let datos: NSData = NSData(bytes: bytes, length: bytes.count)
let newImage = UIImage(data: datos as Data) // Note it's optional. Don't force unwrap!!!
I'd like to cache CAF files before converting them to PCM whenever they play.
For example,
char *mybuffer = malloc(mysoundsize);
FILE *f = fopen("mysound.caf", "rb");
fread(mybuffer, mysoundsize, 1, f);
fclose(f);
char *pcmBuffer = malloc(pcmsoundsize);
// Convert to PCM for playing
AudioFileReadBytes(mybuffer, false, 0, mysoundsize, &numbytes, pcmBuffer);
This way, whenever the sound plays, the compressed CAF file is already loaded into memory, avoiding disk access. How can I open a block of memory with an 'AudioFileID' to make AudioFileReadBytes happy? Is there another method I can use?
I have not done it myself, but from the documentation I would think that you have to use AudioFileOpenWithCallbacks and implement callback functions that read from your memory buffer.
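A minimal sketch of that approach; the MemoryAudioSource type and openInMemoryAudioFile function are made up for illustration, and only the read and get-size callbacks are supplied since the in-memory file is read-only:
import AudioToolbox
import Foundation

// Holds the already-loaded CAF bytes; the callbacks copy out of this.
final class MemoryAudioSource {
    let data: Data
    init(data: Data) { self.data = data }
}

// Read callback: copy up to requestCount bytes starting at inPosition.
let readProc: AudioFile_ReadProc = { clientData, inPosition, requestCount, buffer, actualCount in
    let source = Unmanaged<MemoryAudioSource>.fromOpaque(clientData).takeUnretainedValue()
    let available = max(0, Int64(source.data.count) - inPosition)
    let count = Int(min(Int64(requestCount), available))
    if count > 0 {
        source.data.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
            buffer.copyMemory(from: rawBuffer.baseAddress!.advanced(by: Int(inPosition)), byteCount: count)
        }
    }
    actualCount.pointee = UInt32(count)
    return noErr
}

// Size callback: report the total length of the in-memory "file".
let getSizeProc: AudioFile_GetSizeProc = { clientData in
    let source = Unmanaged<MemoryAudioSource>.fromOpaque(clientData).takeUnretainedValue()
    return Int64(source.data.count)
}

func openInMemoryAudioFile(_ data: Data) -> AudioFileID? {
    let source = MemoryAudioSource(data: data)
    // passRetained: the source must stay alive as long as the AudioFileID is in use
    // (balance with a release when you close the file).
    let clientData = UnsafeMutableRawPointer(Unmanaged.passRetained(source).toOpaque())
    var fileID: AudioFileID?
    let status = AudioFileOpenWithCallbacks(clientData, readProc, nil, getSizeProc, nil,
                                            kAudioFileCAFType, &fileID)
    return status == noErr ? fileID : nil
}
The returned AudioFileID can then be passed to AudioFileReadBytes just as if the file had been opened from disk.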
You can accomplish this with AudioFileStreamOpen:
fileprivate var streamID: AudioFileStreamID?

public func parse(data: Data) throws {
    let streamID = self.streamID!
    let count = data.count
    _ = try data.withUnsafeBytes { (bytes: UnsafePointer<UInt8>) in
        let result = AudioFileStreamParseBytes(streamID, UInt32(count), bytes, [])
        guard result == noErr else {
            throw ParserError.failedToParseBytes(result)
        }
    }
}
You can store the data in memory within the packet callback:
func ParserPacketCallback(_ context: UnsafeMutableRawPointer, _ byteCount: UInt32, _ packetCount: UInt32, _ data: UnsafeRawPointer, _ packetDescriptions: Optional<UnsafeMutablePointer<AudioStreamPacketDescription>>) {
    let parser = Unmanaged<Parser>.fromOpaque(context).takeUnretainedValue()

    /// At this point we should definitely have a data format
    guard let dataFormat = parser.dataFormatD else {
        return
    }

    let format = dataFormat.streamDescription.pointee
    let bytesPerPacket = Int(format.mBytesPerPacket)
    for i in 0 ..< Int(packetCount) {
        let packetStart = i * bytesPerPacket
        let packetSize = bytesPerPacket
        let packetData = Data(bytes: data.advanced(by: packetStart), count: packetSize)
        parser.packetsX.append(packetData)
    }
}
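For completeness, a sketch of how the streamID used above might be created; ParserPropertyCallback is a hypothetical property-listener sibling of ParserPacketCallback, and the file type hint should match your data:
// e.g. inside the Parser's throwing initializer:
let context = UnsafeMutableRawPointer(Unmanaged.passUnretained(self).toOpaque())
let result = AudioFileStreamOpen(context,
                                 ParserPropertyCallback, // hypothetical property listener
                                 ParserPacketCallback,
                                 kAudioFileCAFType,      // use the hint matching your data
                                 &streamID)
guard result == noErr else {
    throw ParserError.failedToParseBytes(result) // or a dedicated open-failure case
}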
The full code is in the GitHub repo.