NSData length of an image compressed with UIImageJPEGRepresentation() - iOS

We know an image can be compressed with the method UIImageJPEGRepresentation(), as in the following code.
NSData *imgData = UIImageJPEGRepresentation(imageResized, 0.5);
NSLog(#"imgData.length :%d",imgData.length);
imageResized = [UIImage imageWithData:imgData];
NSData *imgData2 = UIImageJPEGRepresentation(imageResized, 1);
NSLog(#"imgData2.length :%d",imgData2.length);
The log is:
2013-02-25 00:33:14.756 MyApp[1119:440b] imgData.length :371155
2013-02-25 00:33:20.988 MyApp[1119:440b] imgData2.length :1308415
What I'm confused about is why the lengths of imgData and imgData2 are different. In my app, the image should be uploaded to the server. Should I upload the NSData to the server to save storage? Is it possible for an Android phone to download the NSData and convert it to an image? Any help will be appreciated!

You start with a UIImage of some size (say 1024x768). This takes 1024x768x4 bytes in memory. Then you compress it with a quality of 0.5 and get 371,155 bytes.
You then create a new UIImage from the compressed data. It is still a 1024x768 (or whatever) UIImage, so it takes the same amount of memory (1024x768x4) as the original image. You then convert it to a new JPEG with less compression, giving you 1,308,415 bytes.
Even though you create the second JPEG from the already-compressed image, its byte count comes from re-encoding the full-sized UIImage. The second, less-compressed file, though bigger, will still have the same lower quality as the first compressed image.
Since your data represents a JPG, anything that downloads the data will be able to treat the data as a JPG, including an Android phone.
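To see the three quantities side by side, here is a minimal sketch (the bundled image name "photo.jpg" is just a hypothetical example):
// Hypothetical bundled asset, for illustration only.
UIImage *image = [UIImage imageNamed:@"photo.jpg"];
// Approximate decompressed footprint while the image lives in memory.
size_t inMemoryBytes = CGImageGetBytesPerRow(image.CGImage) * CGImageGetHeight(image.CGImage);
// First pass: compress at quality 0.5, then decode and re-encode at quality 1.0.
NSData *halfQuality = UIImageJPEGRepresentation(image, 0.5);
NSData *reEncoded = UIImageJPEGRepresentation([UIImage imageWithData:halfQuality], 1.0);
// The re-encoded data is usually larger than halfQuality even though the pixels already
// lost detail in the first pass; the encoder always starts from the full-sized, decompressed bitmap.
NSLog(@"in memory: %zu, quality 0.5: %lu, re-encoded at 1.0: %lu",
      inMemoryBytes, (unsigned long)halfQuality.length, (unsigned long)reEncoded.length);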

The number of bytes is bigger for the second image because you passed a much higher compression quality value to UIImageJPEGRepresentation. Higher quality takes more bytes.
The file once uploaded to a server will be a standard JPEG file, viewable by any device, including Android.

Related

Why does UIImagePNGRepresentation return larger data (10 times larger) than the image's raw data size?

When I download an image using SDWebImage, I log the data lengths inside the image downloader operation's callback:
- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task didCompleteWithError:(NSError *)error {
UIImage *image = [UIImage sd_imageWithData:self.imageData];
NSString *key = [[SDWebImageManager sharedManager] cacheKeyForURL:self.request.URL];
image = [self scaledImageForKey:key image:image];
NSLog(#"didCompleteWithError:%# data lenght:%ld png::%ld,jpg:%ld",self.request.URL,[self.imageData length],[UIImagePNGRepresentation(image) length],[UIImageJPEGRepresentation(image, 1) length]);
The log shows:
data length:163480 png:202498 jpg:131774
The raw data length exactly matches the file size of the image on the server, but when I create an image from that raw data and use UIImagePNGRepresentation or UIImageJPEGRepresentation to get the UIImage's NSData length, it seems that both the PNG and the JPEG (no compression) representations of the UIImage are larger than the raw data. What makes this happen?
PNG Files
PNG format is a lossless compression file format, which makes it a common choice for use on the Web. PNG is a good choice for storing line drawings, text, and iconic graphics at a small file size.
Portable Network Graphics is a lossless file format created with the intent to replace the GIF format, due to the patent restrictions of GIF compression. The project was a success and we now have complete access to the format, which is patent-free, has great compression, and is widely supported by web browsers. PNG files are used primarily for transparent images, simple-color images, and images that have hard lines, like text. There are two versions of PNG files: 8-bit PNG (known as PNG-8) and 24-bit PNG (known as PNG-24). PNG-8 is limited to 256 indexed colors, while PNG-24 has millions.
JPEG Files
Joint Photographic Experts Group created a file format, creatively named JPEG \ˈjā-ˌpeg\, to handle complex-color photographic images. When saving a file as a JPEG, users have the choice of quality vs. compression. More compression results in a smaller file size, but you will lose quality. Obviously, less compression results in a larger file size, but also a higher-quality image. The great thing about JPEG compression is that you can usually find a balance that both looks good and has a small file size. Unfortunately, JPEG files have no transparency. Additionally, the file format is lossy, meaning that it loses some of its data each time it is compressed. If you re-save the same image multiple times, the image quality will degrade a little more each time.
JPG format is a lossy compressed file format. This makes it useful for storing photographs at a smaller size than a BMP. JPG is a common choice for use on the Web because it is compressed. For storing line drawings, text, and iconic graphics at a smaller file size, GIF or PNG are better choices because they are lossless.
JPEGs are for photographs and realistic images. PNGs are for line art, text-heavy images, and images with few colors. GIFs are just fail.
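To connect this back to the logs above: once the downloaded bytes have been decoded into a UIImage, UIImagePNGRepresentation and UIImageJPEGRepresentation re-encode the uncompressed bitmap from scratch, so their output need not match the original file's size or encoding settings. A minimal sketch (the URL is hypothetical):
// Hypothetical download, for illustration only.
NSData *rawData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"https://example.com/photo.jpg"]];
UIImage *image = [UIImage imageWithData:rawData];
// Both calls encode the decoded pixel buffer, not rawData itself,
// so these lengths usually differ from rawData.length.
NSUInteger pngLength = [UIImagePNGRepresentation(image) length];
NSUInteger jpgLength = [UIImageJPEGRepresentation(image, 1.0) length];
NSLog(@"raw: %lu png: %lu jpg: %lu", (unsigned long)rawData.length, (unsigned long)pngLength, (unsigned long)jpgLength);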

UIImagePNGRepresentation returns inappropriately large data

We have a UIImage of size 1280*854 and we are trying to save it in PNG format.
NSData *pngData = UIImagePNGRepresentation(img);
The problem is that the size of pngData is 9551944 bytes, which is unreasonably large for the input image size. Even for 24-bit PNG, it should be at most 1280*854*3 (3 bytes per pixel for 24-bit PNG).
BTW, this is only happening with images scaled with UIGraphicsGetImageFromCurrentImageContext. We also noticed that image._scale is set to 2.0 in images returned by UIGraphicsGetImageFromCurrentImageContext.
Any idea what's wrong?
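For what it's worth, a scale of 2.0 means the backing bitmap is 2560x1708 pixels rather than 1280x854, which by itself roughly quadruples the expected byte count. A minimal sketch of rendering at an explicit scale of 1.0 instead of the device scale (the size and the drawing step are placeholders):
// Placeholder size and drawing, for illustration.
CGSize size = CGSizeMake(1280, 854);
// Passing 0.0 as the scale uses the screen scale (2.0 on Retina), doubling each pixel
// dimension of the result; passing 1.0 keeps the bitmap at exactly 1280x854 pixels.
UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
// ... draw the source image into the context here ...
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngData = UIImagePNGRepresentation(scaled);
NSLog(@"png bytes: %lu", (unsigned long)pngData.length);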

Image size optimization for less data usage

I'm working on an Instagram-like app on iOS, and I'm wondering how to optimize the file size of each picture so users will use as little data as possible when fetching all those pictures from my backend. Downloading the files as-is will drain data, since high-resolution pictures are around 1.5 MB each. Is there a way to shrink the size of the picture while maintaining the quality of the picture as much as possible?
You can store the image as binary data, for example as JSON binary data, as plain Core Data binary data, or (more elegantly) as binary data in a Swift Realm database.
Do you really want to handle image resizing and compression on your own? There are libraries already available for doing that which are more effective and powerful, for example:
AFNetworking
It is wonderful: it can not only compress images to match the UIImageView's currently available size for the device's resolution, but also gives you image-caching flexibility.
See its pod and GitHub repository.
Just try it; you will love it.
You can compress a UIImage by converting it into NSData
UIImage *rainyImage = [UIImage imageNamed:@"rainy.jpg"];
NSData *imgData = UIImageJPEGRepresentation(rainyImage, 0.1); // the second parameter is the compressionQuality (0.0 = maximum compression, 1.0 = best quality)
Now use this NSData object to save or convert it into UIImage
To convert it again into UIImage
UIImage *image = [UIImage imageWithData:imgData];
Hope this will resolve your issue.
Image compression will lose some image quality.
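If you do want to shrink images by hand before uploading, one common approach is to redraw the image at a smaller size and then JPEG-compress the result. A minimal sketch (the asset name, target size, and quality are just example values, and the drawing ignores aspect ratio for brevity):
// Hypothetical source image and target size, for illustration only.
UIImage *original = [UIImage imageNamed:@"photo.jpg"];
CGSize targetSize = CGSizeMake(640, 640);
// Redraw at the smaller size (scale 1.0 so the bitmap is exactly 640x640 pixels).
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
[original drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// JPEG at a moderate quality; tune 0.7 to trade file size against quality.
NSData *uploadData = UIImageJPEGRepresentation(resized, 0.7);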

iPhone SDK: get actual size of image in bytes

How do I get the actual size of an image?
I am using
NSInteger actualSize = CGImageGetHeight(image.CGImage) * CGImageGetBytesPerRow(image.CGImage);
or
NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
[imageData2 length];
but I don't get the actual size of the image; it is either larger or smaller than the size on disk (I am using the simulator).
Is there any way to get actual size of the image?
It depends upon what you mean by "size".
If you want the amount of memory used while the image is loaded into memory and used by the app, the bytes-per-row times height is the way to go. This captures the amount of memory used by the uncompressed pixel buffer while the image is actively used by the app.
If you want the number of bytes used in persistent storage when you save the image (generally enjoying some compression), then grab the original asset's NSData and examine its length. Note, though, that if you load an image and then use UIImageJPEGRepresentation with a quality of 1, you'll generally get a size a good deal larger than the original compressed file.
Bottom line, standard JPEG and PNG files enjoy some compression, but when the image is loaded into memory it is uncompressed. You can't generally infer the original file size from a UIImage object. You have to look at the original asset.
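To make the distinction concrete, a minimal sketch (the file path is hypothetical):
// Hypothetical path to the original asset on disk.
NSString *path = @"/path/to/photo.jpg";
// Size in persistent storage: the compressed file itself.
NSData *fileData = [NSData dataWithContentsOfFile:path];
// Size in memory once decoded: bytes-per-row times height of the pixel buffer.
UIImage *image = [UIImage imageWithData:fileData];
size_t memoryBytes = CGImageGetBytesPerRow(image.CGImage) * CGImageGetHeight(image.CGImage);
NSLog(@"on disk: %lu bytes, in memory: %zu bytes", (unsigned long)fileData.length, memoryBytes);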
Try this (available on iOS 6.0 or later and OS X 10.8 or later):
NSLog(@"%@", [NSByteCountFormatter stringFromByteCount:imageData2.length countStyle:NSByteCountFormatterCountStyleFile]);
UPDATE:
Question: Can you post code where you initialise your image?
The above solution did not work for you, so let's try something else. You could try checking the image file size directly:
NSError* error;
NSDictionary *fileDictionary = [[NSFileManager defaultManager] attributesOfItemAtPath:mediaURL error:&error]; // attributesOfItemAtPath: expects a file-system path string
NSNumber *size = [fileDictionary objectForKey:NSFileSize];

UIImage takes up much more memory than its NSData

I'm loading a UIImage from NSData with the following code
var image = UIImage(data: data!)
However, there is a weird behavior.
At first, I used png data, and the NSData was about 80kB each.
When I set UIImage with the data, the UIImage took up 128kb each.
(Checked with Allocation instrument, the size of ImageIO_PNG_Data)
Then I changed to use jpeg instead, and the NSData became about 7kb each.
But still, the UIImage is 128kb each, so when displaying the image I get no memory advantage! (The NSData reduced to 80kb -> 7kb and still the UIImage takes up the same amount of memory)
It seems weird: why should the UIImage take up 128kb when the original data is just 7kb?
Can I reduce this memory usage by UIImage without shrinking the size of the UIImage itself?
Note that I'm not dealing with high resolution image so resizing the image is not an option (The NSData is already 7kb!!)
Any help will be appreciated.
Thanks!!
When you access the NSData, it is often compressed (with either PNG or JPEG). When you use the UIImage, there is an uncompressed pixel buffer which is often 4 bytes per pixel (one byte each for red, green, blue, and alpha). There are other formats, but that illustrates the basic idea: the JPEG or PNG representations can be compressed, but as soon as you start using an image, it is uncompressed.
In your conclusion, you say that resizing is not an option and that the NSData is already 7kb. I would suggest that resizing should be considered if the resolution of the image is greater than the resolution of the UIImageView in which you're using it (the points of its bounds/frame times the scale of the device). The question of whether to resize is not a function of the size of the NSData, but rather of the resolution of the view. So, if you have a 1000x1000 pixel image that you're using in a small thumbnail view in a table view, then regardless of how small the JPEG representation is, you should definitely resize the image.
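For a rough sense of the numbers (the dimensions here are only a guess to illustrate the arithmetic): an image that decodes to about 200x160 pixels at 4 bytes per pixel occupies 200 x 160 x 4 = 128,000 bytes, roughly 128kb, and that figure is the same whether it came from 80kb of PNG data or 7kb of JPEG data.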
This is normal. When the image is stored as NSData, it is compressed (usually using PNG or JPG compression). When it's a UIImage, the image is decompressed, which allows it to be drawn quickly on the screen.
