Convert TJpegImage to byte array - Delphi

I am implementing webcam functionality in a client/server, and I am sending/receiving each frame over the socket as a JPEG. In order to do this, I am converting the JPEG into a byte array and then sending it. The server receives it as a byte array and converts it to a JPEG.
My question is how to convert the JPEG to a byte array (and vice versa) efficiently.
The way that I'm doing it now seems like it's probably not ideal. I'm currently creating a TMemoryStream, saving the JPEG into it, and then reading the stream into a byte array. Then on the server side, once it receives the array, I'm creating a TMemoryStream, writing the array into it, and then creating a TJpegImage and loading the stream into it.
It seems like my way requires a lot of steps and memory allocations. Is there a better way?

There is no need for a conversion: you can save the JPEG image directly to a stream, transfer the stream, and load the JPEG from the stream on the other side.
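
For illustration, a minimal Delphi sketch of that round trip (assuming the VCL's Vcl.Imaging.jpeg unit, which is named jpeg in older Delphi versions; the actual socket send/receive calls are omitted):

uses
  System.Classes, Vcl.Imaging.jpeg;

// Sender: serialize the JPEG straight into the stream you hand to the socket.
procedure JpegToStream(Jpeg: TJPEGImage; Stream: TStream);
begin
  Jpeg.SaveToStream(Stream);  // no intermediate byte array needed
  Stream.Position := 0;       // rewind before sending
end;

// Receiver: rebuild the JPEG from the bytes that arrived.
procedure StreamToJpeg(Stream: TStream; Jpeg: TJPEGImage);
begin
  Stream.Position := 0;
  Jpeg.LoadFromStream(Stream);
end;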

Related

How can I load a jpeg image from a buffer?

I want to store a JPEG image in an RC_DATA resource, but not as a single image in one single RC_DATA. There are many things in that RC_DATA, all muxed together. At runtime I load that RC_DATA into a buffer and extract all the objects, including this JPEG. Now I have this image in a memory buffer and I need to load it into a TJpegImage or TBitmap. How can I do that? I saw that those classes don't have methods to achieve this...
Copy the JPEG bytes from your buffer into a TMemoryStream (or use a TCustomMemoryStream to point directly at the JPEG bytes and avoid making a copy). Then you can pass that stream to TJPEGImage.LoadFromStream().
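
A minimal sketch of the no-copy variant. SetPointer is a protected method of TCustomMemoryStream, so it has to be exposed through a small descendant (TBufferStream is a hypothetical name; older Delphi versions declare the Size parameter as Longint rather than NativeInt):

uses
  System.Classes, Vcl.Imaging.jpeg;

type
  TBufferStream = class(TCustomMemoryStream)
  public
    constructor Create(Buffer: Pointer; Size: NativeInt);
  end;

constructor TBufferStream.Create(Buffer: Pointer; Size: NativeInt);
begin
  inherited Create;
  SetPointer(Buffer, Size);  // no copy: the stream reads your buffer in place
end;

procedure LoadJpegFromBuffer(Buffer: Pointer; Size: NativeInt; Jpeg: TJPEGImage);
var
  Stream: TBufferStream;
begin
  Stream := TBufferStream.Create(Buffer, Size);
  try
    Jpeg.LoadFromStream(Stream);  // the buffer must stay alive while loading
  finally
    Stream.Free;
  end;
end;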

Reading file as stream of strings in Dart: how many events will be emitted?

The standard way to open a file in Dart as a stream is file.openRead(), which returns a Stream<List<int>>.
The next standard step is to transform this stream with the utf8.decoder StreamTransformer, which returns Stream<String>.
I noticed that with the files I've tried, this resulting stream only emits a single event with the whole file content represented as one string. But I feel like this should not be the general case, since otherwise the API wouldn't need to return a stream of strings; a Future<String> would suffice.
Could you explain how I can observe this stream emitting more than one event? Does this depend on the file size, the disk IO rate, or some buffer size?
It depends on the file size, the buffer size, and how the file operations are implemented.
If you read a large file, you will very likely get multiple events of a limited size. The UTF-8 decoder decodes chunks eagerly, so you should get roughly the same number of chunks after decoding. It might carry a few bytes across chunk boundaries, but the rest of the bytes are decoded as soon as possible.
Checking on my local machine, the buffer size seems to be 65536 bytes. Reading a file larger than that gives me multiple chunks.
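
A small Dart sketch to observe this yourself, assuming a hypothetical large_file.txt bigger than the platform's read buffer:

import 'dart:convert';
import 'dart:io';

Future<void> main() async {
  final file = File('large_file.txt'); // hypothetical large file

  // Count the raw byte-list events emitted by openRead().
  var byteChunks = 0;
  await for (final _ in file.openRead()) {
    byteChunks++;
  }

  // Count the string events after UTF-8 decoding.
  var stringChunks = 0;
  await for (final _ in file.openRead().transform(utf8.decoder)) {
    stringChunks++;
  }

  // For a file well over ~64 KiB, both counts should exceed 1.
  print('byte chunks: $byteChunks, string chunks: $stringChunks');
}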

How to store .raw or raw bytes of pictures to local disk documents folder without png or jpeg representation

I want to store the raw bytes of a captured picture using AVCaptureSession. But I have only seen examples using UIImagePNGRepresentation and UIImageJPEGRepresentation. I want to store the raw data bytes in the phone's local Documents directory so the image can be reopened at other times and converted into a UIImage for post-processing. Is there a way to do this?
for example:
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
GLubyte *rawImageBytes = CVPixelBufferGetBaseAddress(pixelBuffer);
Can I store rawImageBytes in documents to open it later?
Sure you can. Create an NSData object containing your bytes and save it using one of the NSData file-saving methods (e.g. writeToURL:atomically:).
You'll need to know the number of bytes in your pixelBuffer though. It looks like you should use CVPixelBufferGetDataSize to get the number of bytes.
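
Putting that together, a minimal sketch (the file name frame.raw is hypothetical; note that to rebuild a UIImage later you will also want to record the width, height, bytes per row, and pixel format somewhere, since the raw bytes alone don't carry them):

#import <CoreVideo/CoreVideo.h>
#import <Foundation/Foundation.h>

void SaveRawPixels(CVPixelBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    // Wrap the pixel buffer's bytes in an NSData (this copies them).
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t size = CVPixelBufferGetDataSize(pixelBuffer);
    NSData *data = [NSData dataWithBytes:base length:size];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    // Write the bytes into the app's Documents directory.
    NSURL *docs = [[[NSFileManager defaultManager]
                    URLsForDirectory:NSDocumentDirectory
                           inDomains:NSUserDomainMask] firstObject];
    NSURL *url = [docs URLByAppendingPathComponent:@"frame.raw"];
    [data writeToURL:url atomically:YES];
}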

Can I define in what endianness I read from NSData?

I have some files written on an Android device, which wrote the bytes in big endian.
Now I am trying to read these files on iOS, where I need them in little endian. I can write a for loop:
int temp;
for (...) {
    [readFile getBytes:&temp range:NSMakeRange(offset, sizeof(int))];
    target_array[i] = CFSwapInt32BigToHost(temp);
    // read more like that
}
However, it feels silly to read every single value and swap it before I can store it. Can I tell the NSData that I want the values read with a certain byte order, so that I can store them directly where they should go?
(That would also save some time, as the data can be quite large.)
I also worry about errors if some datatype changes and I forget to use the 16-bit instead of the 32-bit swap.
No, you need to swap every value. NSData is just a series of bytes with no value or meaning. It is your app that understands the meaning so it is your code logic that must swap each set of bytes as needed.
The data could be filled with all kinds of values of different sizes. 8-bit values, 16-bit values, 32-bit values, etc. as well as string data or just a stream of bytes that don't need any ordering at all. And the NSData can contain any combination of these values.
Given all of this, there is no simple way to tell NSData that the bytes need to be treated in a specific endianness.
If your data is, for example, nothing but 32-bit integer values stored in a specific endianness and you want to extract them as an array, create a helper function or class that does the conversion in one pass.
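
For instance, a minimal sketch of such a helper (ReadBigEndianInt32s is a hypothetical name), assuming the NSData contains nothing but big-endian 32-bit integers:

#import <Foundation/Foundation.h>

// Swaps a whole NSData of big-endian 32-bit integers into a
// host-order C array in one pass.
static void ReadBigEndianInt32s(NSData *data, uint32_t *out, NSUInteger count)
{
    NSCAssert(data.length >= count * sizeof(uint32_t), @"buffer too small");
    const uint32_t *src = (const uint32_t *)data.bytes;
    for (NSUInteger i = 0; i < count; i++) {
        out[i] = CFSwapInt32BigToHost(src[i]); // no-op on big-endian hosts
    }
}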

Finding the size of a deflated chunk of data

Let's suppose I have a chunk of deflated data (that is, the compressed data right after the 2-byte header and before the ADLER32 check in a zlib-compressed file).
How can I know the length of that chunk? How can I find where it ends?
You would need to get that information from some metadata external to the zlib block. (The same is true of the uncompressed size.) Or you could decompress and see where you end up.
Compressed blocks in deflate format are self-delimiting, so the decoder will terminate at the correct point (unless the datastream has been corrupted).
Most file storage and data transmission formats which include compressed data make this metadata available. But since it is not necessary for decompression, it is not stored in the compressed stream.
The only way to find where it ends is to decompress it. deflate streams are self-terminating.
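
As an illustration of that decompress-to-find-the-end approach, a sketch using zlib's raw-inflate mode (window bits of -15 skip the zlib header and trailer, since we are starting right after the 2-byte header):

#include <stdio.h>
#include <zlib.h>

/* Runs raw inflate over the buffer, discarding the output, and lets
 * strm.total_in report where the deflate data ends. Returns the number
 * of compressed bytes consumed, or 0 on error. */
size_t deflate_chunk_length(const unsigned char *buf, size_t len)
{
    z_stream strm = {0};
    unsigned char out[16384];

    if (inflateInit2(&strm, -15) != Z_OK)   /* -15 = raw deflate */
        return 0;

    strm.next_in = (unsigned char *)buf;
    strm.avail_in = (uInt)len;

    int ret;
    do {
        strm.next_out = out;
        strm.avail_out = sizeof(out);
        ret = inflate(&strm, Z_NO_FLUSH);   /* output is thrown away */
    } while (ret == Z_OK);

    size_t consumed = (ret == Z_STREAM_END) ? (size_t)strm.total_in : 0;
    inflateEnd(&strm);
    return consumed;
}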
