NSURLConnection upload file, inconsistent bytesWritten value - iOS

I am using NSURLConnection and its delegates to upload a file to the server. I am using the method below to find out how much data has been sent so far:
- (void)connection:(NSURLConnection *)connection didSendBodyData:(NSInteger)bytesWritten
totalBytesWritten:(NSInteger)totalBytesWritten
totalBytesExpectedToWrite:(NSInteger)totalBytesExpectedToWrite
I am getting the output below:
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 32768
bytesWritten : 2408
bytesWritten : 3828
bytesWritten : 3828
bytesWritten : 2840
bytesWritten : 4260
bytesWritten : 5680
bytesWritten : 2840
bytesWritten : 7100
My question is: why am I getting inconsistent values? For the first few callbacks I get the maximum bytesWritten value, and after that the values are much smaller.
Can anyone help me understand this inconsistency?

I came to the conclusion that the connection starts by handing off the maximum number of bytes at once (presumably filling the socket's send buffer, hence the repeated 32768). Once the server starts acknowledging the incoming data, the per-callback byte count settles down to match the actual network throughput.
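In other words, the per-callback bytesWritten value tracks how fast the socket's send buffer drains rather than any fixed chunk size, so progress should be derived from the running totals instead. Here is a minimal sketch of that with URLSession, which replaced the now-deprecated NSURLConnection (the class name and print statement are illustrative):
import Foundation
final class UploadProgressDelegate: NSObject, URLSessionTaskDelegate {
    // Called repeatedly as body data is handed to the socket. Individual
    // bytesSent values vary with send-buffer drainage, so compute progress
    // from the running totals rather than summing the per-callback counts.
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didSendBodyData bytesSent: Int64,
                    totalBytesSent: Int64,
                    totalBytesExpectedToSend: Int64) {
        let fraction = Double(totalBytesSent) / Double(totalBytesExpectedToSend)
        print("upload progress: \(Int(fraction * 100))%")
    }
}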
Thank you.

Related

Why does escpos hang when trying to connect to printer?

I have a SureMark 4610 printer that I'm trying to connect to, but for some reason it hangs when I connect. I've installed the necessary drivers for the printer. What could be the problem?
I'm using Windows 10 if that is of any use to you all.
Here's the printer device info
DEVICE ID 04b3:4535 on Bus 000 Address 019 =================
bLength : 0x12 (18 bytes)
bDescriptorType : 0x1 Device
bcdUSB : 0x110 USB 1.1
bDeviceClass : 0x0 Specified at interface
bDeviceSubClass : 0x0
bDeviceProtocol : 0x0
bMaxPacketSize0 : 0x40 (64 bytes)
idVendor : 0x04b3
idProduct : 0x4535
bcdDevice : 0x216 Device 2.16
iManufacturer : 0x1 (c) Copyright IBM Corp. 2000
iProduct : 0x5 Printer Interface (Usage = 3500h, Usage Page = FF45h)
iSerialNumber : 0x3 060725163603C100193
bNumConfigurations : 0x1
CONFIGURATION 1: 0 mA ====================================
bLength : 0x9 (9 bytes)
bDescriptorType : 0x2 Configuration
wTotalLength : 0x22 (34 bytes)
bNumInterfaces : 0x1
bConfigurationValue : 0x1
iConfiguration : 0x0
bmAttributes : 0x40 Self Powered
bMaxPower : 0x0 (0 mA)
INTERFACE 1: Human Interface Device ====================
bLength : 0x9 (9 bytes)
bDescriptorType : 0x4 Interface
bInterfaceNumber : 0x1
bAlternateSetting : 0x0
bNumEndpoints : 0x1
bInterfaceClass : 0x3 Human Interface Device
bInterfaceSubClass : 0x0
bInterfaceProtocol : 0x0
iInterface : 0x5 Printer Interface (Usage = 3500h, Usage Page = FF45h)
ENDPOINT 0x82: Interrupt IN ==========================
bLength : 0x7 (7 bytes)
bDescriptorType : 0x5 Endpoint
bEndpointAddress : 0x82 IN
bmAttributes : 0x3 Interrupt
wMaxPacketSize : 0x10 (16 bytes)
bInterval : 0x4
Here is the simple code
from escpos import printer
p = printer.Usb(idVendor=0x04B3, idProduct=0x4535, timeout=10, in_ep=0x82)
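# Note: the descriptor above shows a single interrupt IN endpoint (0x82) on a
# HID-class interface and no OUT endpoint; python-escpos also needs a USB OUT
# endpoint to write to (its out_ep parameter, 0x01 by default), which may be
# why the call blocks indefinitely.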
print(p.idProduct)
I never get to the print call because it just hangs indefinitely at line 2 (the printer.Usb(...) call).

Low latency audio output problems on iOS (aka How to beat AUAudioUnit sampleRate, maximumFramesToRender, and ioBufferDuration into submission)

Okay, I'm clearly missing some important piece here. I'm trying to do low-latency audio across the network, and my fundamental frames are 10ms. I expected this to be no problem. My target is an iPhone X playing through its speakers--so my hardware sample rate should be locked to 48000Hz. I'm requesting 10ms, which is a nice even divisor and should come out to 480, 960, 1920 or 3840 depending upon how you want to slice frames/samples/bytes.
Yet, for the life of me, I absolutely cannot get iOS to do anything I regard as sane. I get a 10.667ms buffer duration, which is ludicrous--iOS is going out of its way to give me buffer sizes that aren't integer multiples of the sample rate. Even worse, the frame is slightly LONG, which means that I have to absorb not one but two packets of latency in order to be able to fill that buffer. I can't get maximumFramesToRender to change at all, and the system is returning 0 as my sample rate even though it quite plainly is rendering at 48000Hz.
I'm clearly missing something important--what is it? Did I forget to disconnect/connect something in order to get a direct hardware mapping? (My format is 1, which is pcmFormatFloat32--I would expect pcmFormatInt16 or pcmFormatInt32 for mapping directly to hardware, so something in the OS is probably getting in the way.) Pointers are appreciated and I'm happy to go read more. Or is AUAudioUnit simply half-baked and I need to go backward to older, more useful APIs? Or did I completely miss the plot and low-latency audio folks use a whole different set of audio management functions?
Thanks for the help--it's much appreciated.
Output from code:
2019-11-07 23:28:29.782786-0800 latencytest[3770:50382] Ready to receive user events
2019-11-07 23:28:34.727478-0800 latencytest[3770:50382] Start button pressed
2019-11-07 23:28:34.727745-0800 latencytest[3770:50382] Launching auxiliary thread
2019-11-07 23:28:34.729278-0800 latencytest[3770:50445] Thread main started
2019-11-07 23:28:35.006005-0800 latencytest[3770:50445] Sample rate: 0
2019-11-07 23:28:35.016935-0800 latencytest[3770:50445] Buffer duration: 0.010667
2019-11-07 23:28:35.016970-0800 latencytest[3770:50445] Number of output busses: 2
2019-11-07 23:28:35.016989-0800 latencytest[3770:50445] Max frames: 4096
2019-11-07 23:28:35.017010-0800 latencytest[3770:50445] Can perform output: 1
2019-11-07 23:28:35.017023-0800 latencytest[3770:50445] Output Enabled: 1
2019-11-07 23:28:35.017743-0800 latencytest[3770:50445] Bus channels: 2
2019-11-07 23:28:35.017864-0800 latencytest[3770:50445] Bus format: 1
2019-11-07 23:28:35.017962-0800 latencytest[3770:50445] Bus rate: 0
2019-11-07 23:28:35.018039-0800 latencytest[3770:50445] Sleeping 0
2019-11-07 23:28:35.018056-0800 latencytest[3770:50445] Buffer count: 2 4096
2019-11-07 23:28:36.023220-0800 latencytest[3770:50445] Sleeping 1
2019-11-07 23:28:36.023400-0800 latencytest[3770:50445] Buffer count: 190 389120
2019-11-07 23:28:37.028610-0800 latencytest[3770:50445] Sleeping 2
2019-11-07 23:28:37.028790-0800 latencytest[3770:50445] Buffer count: 378 774144
2019-11-07 23:28:38.033983-0800 latencytest[3770:50445] Sleeping 3
2019-11-07 23:28:38.034142-0800 latencytest[3770:50445] Buffer count: 566 1159168
2019-11-07 23:28:39.039333-0800 latencytest[3770:50445] Sleeping 4
2019-11-07 23:28:39.039534-0800 latencytest[3770:50445] Buffer count: 756 1548288
2019-11-07 23:28:40.041787-0800 latencytest[3770:50445] Sleeping 5
2019-11-07 23:28:40.041943-0800 latencytest[3770:50445] Buffer count: 944 1933312
2019-11-07 23:28:41.042878-0800 latencytest[3770:50445] Sleeping 6
2019-11-07 23:28:41.043037-0800 latencytest[3770:50445] Buffer count: 1132 2318336
2019-11-07 23:28:42.048219-0800 latencytest[3770:50445] Sleeping 7
2019-11-07 23:28:42.048375-0800 latencytest[3770:50445] Buffer count: 1320 2703360
2019-11-07 23:28:43.053613-0800 latencytest[3770:50445] Sleeping 8
2019-11-07 23:28:43.053771-0800 latencytest[3770:50445] Buffer count: 1508 3088384
2019-11-07 23:28:44.058961-0800 latencytest[3770:50445] Sleeping 9
2019-11-07 23:28:44.059119-0800 latencytest[3770:50445] Buffer count: 1696 3473408
Actual code:
import UIKit
import os.log
import Foundation
import AudioToolbox
import AVFoundation
class AuxiliaryWork: Thread {
let II_SAMPLE_RATE = 48000
var iiStopRequested: Int32 = 0; // Int32 is normally guaranteed to be atomic on most architectures
var iiBufferFillCount: Int32 = 0;
var iiBufferByteCount: Int32 = 0;
func requestStop() {
iiStopRequested = 1;
}
func myAVAudioSessionInterruptionNotificationHandler(notification: Notification ) -> Void {
os_log(OSLogType.info, "AVAudioSession Interrupted: %s", notification.debugDescription)
}
func myAudioUnitProvider(actionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>, timestamp: UnsafePointer<AudioTimeStamp>,
frameCount: AUAudioFrameCount, inputBusNumber: Int, inputData: UnsafeMutablePointer<AudioBufferList>) -> AUAudioUnitStatus {
let ppInputData = UnsafeMutableAudioBufferListPointer(inputData)
let iiNumBuffers = ppInputData.count
if (iiNumBuffers > 0) {
assert(iiNumBuffers == 2)
for bbBuffer in ppInputData {
assert(Int(bbBuffer.mDataByteSize) == 2048) // FIXME: This should be 960 or 1920 ...
iiBufferFillCount += 1
iiBufferByteCount += Int32(bbBuffer.mDataByteSize)
memset(bbBuffer.mData, 0, Int(bbBuffer.mDataByteSize)) // Just send silence
}
} else {
os_log(OSLogType.error, "Zero buffers from system")
assert(iiNumBuffers != 0) // Force crash since os_log would cause an audio hiccup due to locks anyway
}
return noErr
}
override func main() {
os_log(OSLogType.info, "Thread main started")
#if os(iOS)
let kOutputUnitSubType = kAudioUnitSubType_RemoteIO
#else
let kOutputUnitSubType = kAudioUnitSubType_HALOutput
#endif
let audioSession = AVAudioSession.sharedInstance() // FIXME: Causes the following message No Factory registered for id
try! audioSession.setCategory(AVAudioSession.Category.playback, options: [])
try! audioSession.setMode(AVAudioSession.Mode.measurement)
try! audioSession.setPreferredSampleRate(48000.0)
try! audioSession.setPreferredIOBufferDuration(0.010)
NotificationCenter.default.addObserver(
forName: AVAudioSession.interruptionNotification,
object: nil,
queue: nil,
using: myAVAudioSessionInterruptionNotificationHandler
)
let ioUnitDesc = AudioComponentDescription(
componentType: kAudioUnitType_Output,
componentSubType: kOutputUnitSubType,
componentManufacturer: kAudioUnitManufacturer_Apple,
componentFlags: 0,
componentFlagsMask: 0)
let auUnit = try! AUAudioUnit(componentDescription: ioUnitDesc,
options: AudioComponentInstantiationOptions())
auUnit.outputProvider = myAudioUnitProvider;
auUnit.maximumFramesToRender = 256
try! audioSession.setActive(true)
try! auUnit.allocateRenderResources() // Make sure audio unit has hardware resources--we could provide the buffers from the circular buffer if we want
try! auUnit.startHardware()
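// Note: audioSession.sampleRate and bus0.format.sampleRate logged below are
// Doubles; formatting a Double with %d is what prints "Sample rate: 0" and
// "Bus rate: 0" -- use %f to see the real values.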
os_log(OSLogType.info, "Sample rate: %d", audioSession.sampleRate);
os_log(OSLogType.info, "Buffer duration: %f", audioSession.ioBufferDuration);
os_log(OSLogType.info, "Number of output busses: %d", auUnit.outputBusses.count);
os_log(OSLogType.info, "Max frames: %d", auUnit.maximumFramesToRender);
os_log(OSLogType.info, "Can perform output: %d", auUnit.canPerformOutput)
os_log(OSLogType.info, "Output Enabled: %d", auUnit.isOutputEnabled)
//os_log(OSLogType.info, "Audio Format: %p", audioFormat)
var bus0 = auUnit.outputBusses[0]
os_log(OSLogType.info, "Bus channels: %d", bus0.format.channelCount)
os_log(OSLogType.info, "Bus format: %d", bus0.format.commonFormat.rawValue)
os_log(OSLogType.info, "Bus rate: %d", bus0.format.sampleRate)
for ii in 0..<10 {
if (iiStopRequested != 0) {
os_log(OSLogType.info, "Manual stop requested");
break;
}
os_log(OSLogType.info, "Sleeping %d", ii);
os_log(OSLogType.info, "Buffer count: %d %d", iiBufferFillCount, iiBufferByteCount)
Thread.sleep(forTimeInterval: 1.0);
}
auUnit.stopHardware()
}
}
class FirstViewController: UIViewController {
var thrAuxiliaryWork: AuxiliaryWork? = nil;
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
}
@IBAction func startButtonPressed(_ sender: Any) {
os_log(OSLogType.error, "Start button pressed");
os_log(OSLogType.error, "Launching auxiliary thread");
thrAuxiliaryWork = AuxiliaryWork();
thrAuxiliaryWork?.start();
}
@IBAction func stopButtonPressed(_ sender: Any) {
os_log(OSLogType.error, "Stop button pressed");
os_log(OSLogType.error, "Manually stopping auxiliary thread");
thrAuxiliaryWork?.requestStop();
}
@IBAction func muteButtonPressed(_ sender: Any) {
os_log(OSLogType.error, "Mute button pressed");
}
@IBAction func unmuteButtonPressed(_ sender: Any) {
os_log(OSLogType.error, "Unmute button pressed");
}
}
You cannot beat iOS silicon hardware into submission by assuming the API will do it for you. You have to do your own buffering if you want to abstract the hardware.
For the very best (lowest) latencies, your software will have to (potentially dynamically) adapt to the actual hardware capabilities, which can vary from device to device, and mode to mode.
The hardware sample rate appears to be either 44.1ksps (older iOS devices), 48ksps (newer arm64 iOS devices), or an integer multiple thereof (and potentially other rates when plugging in non-AirPod Bluetooth headsets, or external ADCs). The actual hardware DMA (or equivalent) buffers seem to always be a power of 2 in size, potentially down to 64 samples on the newest devices. However, various iOS power saving modes will increase the buffer size (by powers of 2) up to 4k samples, especially on older iOS devices. Your 10.667ms buffer duration is consistent with this: at 48ksps it is exactly 512 frames, a power of 2. If you request a sample rate other than the hardware rate, the OS might resample the buffers to a size other than a power of 2, and this size can change from one Audio Unit callback to the next if the resampling ratio isn't an exact integer.
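As a concrete illustration (my sketch, not code from the answer), you can request preferred values and then adapt to whatever the hardware actually granted; every call here is standard AVAudioSession API:
import AVFoundation
let session = AVAudioSession.sharedInstance()
try! session.setCategory(.playback, mode: .measurement, options: [])
try! session.setPreferredSampleRate(48_000)        // a request, not a guarantee
try! session.setPreferredIOBufferDuration(0.010)   // likewise
try! session.setActive(true)
// Only after activation do the actual values mean anything.
let actualRate = session.sampleRate                // e.g. 48000.0
let actualDuration = session.ioBufferDuration      // e.g. 0.010667 (512 frames)
let framesPerBuffer = Int((actualRate * actualDuration).rounded())
print("hardware grants about \(framesPerBuffer) frames per callback")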
Audio Units are the lowest level accessible via public API on iOS devices. Everything else is built on top, and thus potentially incurs greater latencies. For instance, if you use the Audio Queue API with non-hardware buffer sizes, the OS will internally use power-of-2 audio buffers to access the hardware, and chop them up or fractionally concatenate them to return or fetch Audio Queue buffers of non-hardware sizes. Slower and jittery.
Far from being half-baked, for a long time the iOS Audio Unit API was the only API usable on mobile phones and tablets for live low-latency music performance--but only by developing software matched to the hardware.
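The "do your own buffering" advice above might look like the following sketch (my illustration, not the answerer's code; single-writer/single-reader, and real audio code would need atomic head/tail indices rather than plain Ints): fixed-size network packets go in, and the render callback drains whatever count the hardware asks for.
final class FloatRingBuffer {
    private var storage: [Float]
    private var head = 0          // next write position
    private var tail = 0          // next read position
    private let capacity: Int     // must be a power of two for the masked wrap
    init(capacityPowerOfTwo: Int) {
        capacity = capacityPowerOfTwo
        storage = [Float](repeating: 0, count: capacity)
    }
    func write(_ samples: [Float]) {
        for s in samples {
            storage[head & (capacity - 1)] = s
            head += 1
        }
    }
    // Fill exactly `count` samples, padding with silence on underrun,
    // so the render callback can always satisfy the hardware's request.
    func read(into out: UnsafeMutablePointer<Float>, count: Int) {
        for i in 0..<count {
            if tail < head {
                out[i] = storage[tail & (capacity - 1)]
                tail += 1
            } else {
                out[i] = 0
            }
        }
    }
}
With this decoupling, a 480-frame (10ms) packet writer and a 512-frame hardware reader stay in step as long as the average rates match; the ring absorbs the mismatch instead of the callback.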

GCDWebServer for streaming

I am trying to stream an mp4 video from a remote server to an iOS device. I somehow get
[ERROR] Error while writing to socket 15: Broken pipe (32)
after a few writes. I am not sure what is going wrong. Can you kindly give any advice?
iOS re-routing requests w/ GCDWebServer (not redirecting)
Thanks!
Rao
@property (nonatomic, strong) GCDWebServerBodyReaderCompletionBlock cb;
[_webServer addDefaultHandlerForMethod:@"GET"
requestClass:[GCDWebServerRequest class]
processBlock:^GCDWebServerResponse *(GCDWebServerRequest* request) {
NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration ephemeralSessionConfiguration];
self.session = [NSURLSession sessionWithConfiguration:configuration delegate:self delegateQueue:[NSOperationQueue mainQueue]];
GCDWebServerStreamedResponse* response = [GCDWebServerStreamedResponse responseWithContentType:@"application/octet-stream" asyncStreamBlock:^(GCDWebServerBodyReaderCompletionBlock completionBlock) {
NSString * realURL = @"https://ia800309.us.archive.org/2/items/Popeye_Nearlyweds/Popeye_Nearlyweds_512kb.mp4";
self.cb = completionBlock;
NSURLSessionDataTask * dataTask = [self.session dataTaskWithURL:[[NSURL alloc] initWithString:realURL]];
[dataTask resume]; // the task must be started or no data will ever arrive
}];
return response;
}];
- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask
didReceiveResponse:(NSURLResponse *)response
completionHandler:(void (^)(NSURLSessionResponseDisposition disposition))completionHandler
{
self.response = response;
completionHandler(NSURLSessionResponseAllow);
}
- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask
didReceiveData:(NSData *)data
{
NSLog(@"got data size == %lu", data.length);
self.cb(data, nil);
}
- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task
didCompleteWithError:(NSError *)error
{
self.cb([NSData data], nil);
}
[DEBUG] Did open IPv4 listening socket 6
[DEBUG] Did open IPv6 listening socket 7
[INFO] GCDWebServer started on port 8080 and reachable at http://192.168.1.27:8080/
2016-02-23 15:34:39.096 Proxy2Proxy[5315:1148991] Visit http://192.168.1.27:8080/ in your web browser
[DEBUG] Did open connection on socket 15
[DEBUG] Did connect
[DEBUG] Did start background task
[DEBUG] Connection received 370 bytes on socket 15
[DEBUG] Connection on socket 15 preflighting request "GET /https://ia800309.us.archive.org/2/items/Popeye_Nearlyweds/Popeye_Nearlyweds_512kb.mp4" with 370 bytes body
[DEBUG] Connection on socket 15 processing request "GET /https://ia800309.us.archive.org/2/items/Popeye_Nearlyweds/Popeye_Nearlyweds_512kb.mp4" with 370 bytes body
[DEBUG] Connection sent 175 bytes on socket 15
2016-02-23 15:34:47.488 Proxy2Proxy[5315:1148991] got data size == 16059
[DEBUG] Connection sent 16067 bytes on socket 15
2016-02-23 15:34:47.488 Proxy2Proxy[5315:1148991] got data size == 32768
[DEBUG] Connection sent 32776 bytes on socket 15
2016-02-23 15:34:47.489 Proxy2Proxy[5315:1148991] got data size == 32768
[DEBUG] Connection sent 32776 bytes on socket 15
2016-02-23 15:34:47.489 Proxy2Proxy[5315:1148991] got data size == 98304
2016-02-23 15:34:47.490 Proxy2Proxy[5315:1148991] got data size == 16384
[DEBUG] Connection sent 98313 bytes on socket 15
[DEBUG] Connection sent 16392 bytes on socket 15
2016-02-23 15:34:47.506 Proxy2Proxy[5315:1148991] got data size == 16384
[DEBUG] Connection sent 16392 bytes on socket 15
2016-02-23 15:34:47.517 Proxy2Proxy[5315:1148991] got data size == 16384
[DEBUG] Connection sent 16392 bytes on socket 15
2016-02-23 15:34:47.530 Proxy2Proxy[5315:1148991] got data size == 16384
[DEBUG] Connection sent 16392 bytes on socket 15
2016-02-23 15:34:47.545 Proxy2Proxy[5315:1148991] got data size == 16384
[ERROR] Error while writing to socket 15: Broken pipe (32)

read()/write() calls on iOS seem to be limited to 2250 bytes

I am having a strange problem trying to read and write 9k bytes with open(), read() and write(). When I attempt to write 9k to a file and read it back, the data only goes up to 2250 bytes. After that everything is zeros.
Here is my code (except for the filename, which isn't relevant; I'm just putting the file in NSDocumentDirectory):
int fp = open([appFile cStringUsingEncoding:NSASCIIStringEncoding], O_RDWR | O_CREAT, 0644);
[_detailViewController log:#"first open() returns %i (err: %i)", fp, errno];
int data2[10000];
int data3[10000];
for (int i=0;i<10000;i++) data2[i] = 1;
[_detailViewController log:#"resetting seek to 0"];
int seekPos = lseek(fp, 0, SEEK_SET);
int result = write(fp, data2, 9000);
[_detailViewController log:@"wrote 9k, result is %i", result];
[_detailViewController log:@"resetting seek to 0"];
seekPos = lseek(fp, 0, SEEK_SET);
result = read(fp, data3, 9000);
[_detailViewController log:#"read 9k, result is %i", result];
[_detailViewController log:#"values of data2[2248-2252] = 0x%x 0x%x 0x%x 0x%x 0x%x", data2[2248], data2[2249], data2[2250], data2[2251], data2[2252]];
[_detailViewController log:#"values of data3[2248-2252] = 0x%x 0x%x 0x%x 0x%x 0x%x", data3[2248], data3[2249], data3[2250], data3[2251], data3[2252]];
close(fp);
And here is the strange output:
2013-02-13 14:08:38.290 FileTester[2800:907] first open() returns 6 (err: 3)
2013-02-13 14:08:38.295 FileTester[2800:907] resetting seek to 0
2013-02-13 14:08:38.301 FileTester[2800:907] wrote 9k, result is 9000
2013-02-13 14:08:38.306 FileTester[2800:907] resetting seek to 0
2013-02-13 14:08:38.311 FileTester[2800:907] read 9k, result is 9000
2013-02-13 14:08:38.319 FileTester[2800:907] values of data2[2248-2252] = 0x1 0x1 0x1 0x1 0x1
2013-02-13 14:08:38.327 FileTester[2800:907] values of data3[2248-2252] = 0x1 0x1 0x0 0x0 0x0
As you can see on the last line, the data suddenly goes to zero.
Any ideas what I might be doing wrong? The thing that really gets me is that both the read() and write() return 9000.
As mentioned by ughoavgfhw (thanks!), the problem was that I was mixing up bytes and ints: 9000 bytes is the same thing as 2250 ints, since each int is 4 bytes.
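The same pitfall as a minimal Swift sketch (the path and names are illustrative): the count passed to write() and read() is always in bytes, so an array of 4-byte integers needs count * MemoryLayout<Int32>.stride bytes.
import Darwin
let path = "/tmp/inttest.bin"                              // assumed scratch location
var data2 = [Int32](repeating: 1, count: 10_000)
var data3 = [Int32](repeating: 0, count: 10_000)
let byteCount = data2.count * MemoryLayout<Int32>.stride   // 40_000 bytes, not 10_000
let fd = open(path, O_RDWR | O_CREAT, 0o644)
data2.withUnsafeBytes { _ = write(fd, $0.baseAddress, byteCount) }
lseek(fd, 0, SEEK_SET)
data3.withUnsafeMutableBytes { _ = read(fd, $0.baseAddress, byteCount) }
close(fd)
// Writing only 9000 bytes would cover just the first 2250 Int32s (9000 / 4),
// leaving the rest of data3 as zeros -- exactly the behavior observed above.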

Is extra space allocated for variant records?

I'm working with the variant record below. The variable instance is Kro_Combi. SizeOf(Kro_Combi) reports 7812 bytes, while SizeOf(Kro_Combi.data) reports 7810 bytes.
The sum of the SizeOf of all the other data structures composing the "non-directmode" case of the variant record also adds up to 7810 bytes (24 + 64 + 888 + 230 + 98 + 510 + 2976 + 12 + 3008).
Why is there a two-byte difference? I would like the two variants to exactly overlay each other.
TKro_Combi = record
case directmode:boolean of
true : (
data : array[0..7809] of byte
);
false : (
Combi_Name : array[0..23] of char; //24
Gap1 : array[0..63] of byte; // 24-87 (64)
Ins_Effect_Group : array[1..12] of TIns_Effect_Params; //74 each, (Ins_Effect_Data=9 bytes) 74*12 = 888
Mast_Effect_Params : array[0..229] of byte; // 976-1205 : 230 bytes
Vect_Aud__Drum_Params : array[0..97] of byte; //1206-1303 : 98 bytes
Karma_Common : array[0..509] of byte; //1304-1813 : 510 bytes
Karma_Module : array[0..3] of TKarma_Module; //1814-2557 : 744 bytes each Total span 1814 - 4789 = 2976 bytes total
Common_Params : array[0..11] of byte; //4790-4801 = 12 bytes
Timbre_Group : array[1..16] of TTimbre_Params; ) // 4802 -4989 = 188 bytes each, 16 Timbres, 4802-7809 = 3008 bytes total for all
end;
First of all, there needs to be space for the directmode tag field, which accounts for one of the two extra bytes; if you really want the record to have size 7810 bytes, then you should remove that field. The other byte will be due to internal alignment and padding of the false part of the variant record; I can't quite work out where it comes from. No matter: you simply want to use a packed record (here also dropping the tag field) to avoid any padding bytes:
TKro_Combi = packed record
case boolean of
true : (
data : array[0..7809] of byte
);
false : (
Combi_Name : array[0..23] of char; //24
Gap1 : array[0..63] of byte; // 24-87 (64)
Ins_Effect_Group : array[1..12] of TIns_Effect_Params; //74 each, (Ins_Effect_Data=9 bytes) 74*12 = 888
Mast_Effect_Params : array[0..229] of byte; // 976-1205 : 230 bytes
Vect_Aud__Drum_Params : array[0..97] of byte; //1206-1303 : 98 bytes
Karma_Common : array[0..509] of byte; //1304-1813 : 510 bytes
Karma_Module : array[0..3] of TKarma_Module; //1814-2557 : 744 bytes each Total span 1814 - 4789 = 2976 bytes total
Common_Params : array[0..11] of byte; //4790-4801 = 12 bytes
Timbre_Group : array[1..16] of TTimbre_Params; ) // 4802 -4989 = 188 bytes each, 16 Timbres, 4802-7809 = 3008 bytes total for all
end;
