UIImagePNGRepresentation returns inappropriately large data - iOS

We have a UIImage of size 1280*854 and we are trying to save it in PNG format.
NSData *pngData = UIImagePNGRepresentation(img);
The problem is that pngData is 9551944 bytes, which seems inappropriately large for the input image size. Even for a 24-bit PNG, it should be at most around 1280*854*3 bytes (3 bytes per pixel), before any compression.
By the way, this only happens with images scaled with UIGraphicsGetImageFromCurrentImageContext. We also noticed that image._scale is set to 2.0 in the image returned by UIGraphicsGetImageFromCurrentImageContext.
Any idea what's wrong?
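A quick way to see what the PNG encoder is actually working with (a minimal sketch, assuming img is the image returned by UIGraphicsGetImageFromCurrentImageContext): UIImage reports its size in points, but the encoder sees the backing CGImage, which at scale 2.0 has twice the pixel dimensions, i.e. 2560*1708 rather than 1280*854.
size_t pixelWidth = CGImageGetWidth(img.CGImage);   // 2560 at scale 2.0
size_t pixelHeight = CGImageGetHeight(img.CGImage); // 1708 at scale 2.0
NSLog(@"%zu x %zu pixels at scale %.1f", pixelWidth, pixelHeight, img.scale);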

Related

iPhone SDK: get actual size of image in bytes

How can I get the actual size of an image?
I am using
NSInteger actualSize = CGImageGetHeight(image.CGImage) * CGImageGetBytesPerRow(image.CGImage);
or
NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
[imageData2 length];
but I don't get the actual size of the image; it is either larger or smaller than the size on disk (I am using the simulator).
Is there any way to get the actual size of the image?
It depends upon what you mean by "size".
If you want the amount of memory used while the image is loaded into memory and used by the app, the bytes-per-row times height is the way to go. This captures the amount of memory used by the uncompressed pixel buffer while the image is actively used by the app.
If you want the number of bytes used in persistent storage when you save the image (generally enjoying some compression), then grab the original asset's NSData and examine its length. Note, though, that if you load an image and then use UIImageJPEGRepresentation with a quality of 1, you'll generally get a size a good deal larger than the original compressed file.
Bottom line, standard JPEG and PNG files enjoy some compression, but when the image is loaded into memory it is uncompressed. You can't generally infer the original file size from a UIImage object. You have to look at the original asset.
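As a sketch of the two measurements side by side (imagePath is a hypothetical path to the original file):
UIImage *image = [UIImage imageWithContentsOfFile:imagePath];
// Memory used by the uncompressed pixel buffer while the image is in use:
NSUInteger memoryBytes = CGImageGetBytesPerRow(image.CGImage) * CGImageGetHeight(image.CGImage);
// Bytes used in persistent storage by the original compressed asset:
NSUInteger diskBytes = [NSData dataWithContentsOfFile:imagePath].length;
NSLog(@"in memory: %lu bytes, on disk: %lu bytes", (unsigned long)memoryBytes, (unsigned long)diskBytes);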
Try this (for iOS 6.0 or later and OS X 10.8 or later):
NSLog(@"%@", [NSByteCountFormatter stringFromByteCount:imageData2.length countStyle:NSByteCountFormatterCountStyleFile]);
UPDATE:
Question: Can you post code where you initialise your image?
The solution above did not work for you, so let's try something else. You could check the image file's size directly:
NSError *error = nil;
// Note: attributesOfItemAtPath: expects a file system path string, not an NSURL.
NSDictionary *fileDictionary = [[NSFileManager defaultManager] attributesOfItemAtPath:mediaURL error:&error];
NSNumber *size = [fileDictionary objectForKey:NSFileSize];

How to get NSData representation of UIGraphicsGetImageFromCurrentImageContext() [duplicate]

I'm taking a "snapshot" of the image context in UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, true, 0) and eventually creating a UIImage using
var renderedImage = UIGraphicsGetImageFromCurrentImageContext()
However I need to get the NSData representation of this UIImage without using UIImageJPEGRepresentation or UIImagePNGRepresentation (because these produce files that are way larger than the original UIImage). How can I do this?
Image files contain compressed data, while the raw pixel data of an image is not compressed. Therefore an NSData of raw pixels will in almost all cases be larger than an encoded image file when written to disk.
More info at another question: convert UIImage to NSData
I'm not sure what you mean by "way larger" than the original UIImage. The data backing the UIImage object is at least as big as the data you would get by converting it into a JPG, and roughly equivalent to the data you would get by converting it to a PNG.
The rendered image will be twice the screen size in pixels, because you have rendered a Retina (2x) screen into the image.
You can avoid this and render the image as non-retina by making your image context have a scale of 1:
UIGraphicsBeginImageContextWithOptions(UIScreen.mainScreen().bounds.size, true, 1)
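Putting the pieces together, here is an Objective-C sketch of the whole snapshot flow at scale 1 (the drawing step is elided; note that whatever NSData you end up with still has to be an encoded image, and PNG keeps the snapshot lossless):
UIGraphicsBeginImageContextWithOptions([UIScreen mainScreen].bounds.size, YES, 1);
// ... draw the content you want to snapshot here ...
UIImage *rendered = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *pngData = UIImagePNGRepresentation(rendered); // a quarter of the pixels of a 2x render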

UIImage takes up much more memory than its NSData

I'm loading a UIImage from NSData with the following code
var image = UIImage(data: data!)
However, there is a weird behavior.
At first I used PNG data, and the NSData was about 80 KB each.
When I created a UIImage from the data, each UIImage took up 128 KB.
(Checked with the Allocations instrument: the size of ImageIO_PNG_Data.)
Then I switched to JPEG instead, and the NSData became about 7 KB each.
But still, the UIImage is 128 KB each, so when displaying the image I get no memory advantage! (The NSData shrank from 80 KB to 7 KB, yet the UIImage takes up the same amount of memory.)
It is weird: why should the UIImage take up 128 KB when the original data is just 7 KB?
Can I reduce this memory usage by UIImage without shrinking the UIImage itself?
Note that I'm not dealing with a high-resolution image, so resizing it is not an option (the NSData is already 7 KB!).
Any help will be appreciated.
Thanks!!
When you access the NSData, it is often compressed (with either PNG or JPEG). When you use the UIImage, there is an uncompressed pixel buffer which is often 4 bytes per pixel (one byte each for red, green, blue, and alpha). There are other formats, but that illustrates the basic idea: the JPEG or PNG representations are compressed, but when you start using an image, it is uncompressed. That is why both your 80 KB PNG and your 7 KB JPEG decode to the same 128 KB buffer: at 4 bytes per pixel, 128 KB corresponds to about 32,000 pixels, roughly a 180x180 image if square, regardless of how well the file compressed.
In your conclusion, you say that resizing is not an option and that the NSData is already 7 KB. I would suggest that resizing should be considered if the resolution of the image is greater than the resolution of the UIImageView in which you're using it (the points of the bounds/frame times the scale of the device). The question of whether to resize is not a function of the size of the NSData, but rather of the resolution of the view. So, if you have a 1000x1000 pixel image that you're using in a small thumbnail view in a table view, then regardless of how small the JPEG representation is, you should definitely resize the image.
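For example, a minimal downscaling sketch (bigImage and the 100x100 target are hypothetical; passing 0 as the scale makes the context match the device's screen scale):
CGSize targetSize = CGSizeMake(100, 100); // the view's size in points
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0);
[bigImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();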
This is normal. When the image is stored as NSData, it is compressed (usually using PNG or JPG compression). When it's a UIImage, the image is decompressed, which allows it to be drawn quickly on the screen.

iOS returning image with different size?

I want to upload an image of up to 8 MB, so for testing I added an image of size 4.2 MB (dimensions: 3264 x 2443) to my photo gallery. But when I pick that image for uploading and check its size, it returns 9840076 bytes, i.e. 9.3842 MB, even though the image is actually 4.2 MB on disk. So the 4.2 MB image cannot be uploaded.
I used the method below to calculate the size of the image, and it returns 9840076 bytes.
[UIImageJPEGRepresentation(imageReturnedFromPhotoGallery, 1.0) length];
Am I doing something wrong in calculating the size of the image?
Please suggest the proper way.
Thanks in advance!
Why don't you try it like this? You will get the size in bytes:
NSData *data = [[NSData alloc] initWithData:UIImageJPEGRepresentation(yourImage, 0.5)]; // yourImage is the UIImage to measure
NSUInteger imageSize = data.length;
NSLog(@"size of image is: %lu", (unsigned long)imageSize);
You are decoding and then re-encoding the image, which is why the file size may be different. If you don't need to modify the image at all, use ALAssetsLibrary to get the image as an NSData object rather than a UIImage object. Then, when you look at the NSData's length property, your file size will match exactly. See Using ALAssetsLibrary and ALAsset take out Image as NSData for some sample code.
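A sketch of that approach, assuming assetURL came from the picker's UIImagePickerControllerReferenceURL info key (ALAssetsLibrary was the standard way to do this before the Photos framework):
#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    // rep.size is the exact byte count of the original file on disk.
    NSMutableData *data = [NSMutableData dataWithLength:(NSUInteger)rep.size];
    NSError *error = nil;
    [rep getBytes:(uint8_t *)data.mutableBytes fromOffset:0 length:(NSUInteger)rep.size error:&error];
    NSLog(@"original file size: %lld bytes", rep.size);
} failureBlock:^(NSError *error) {
    NSLog(@"could not load asset: %@", error);
}];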

NSData length of an Image compressed with UIImageJPEGRepresentation()

We know an image can be compressed with the method UIImageJPEGRepresentation(), as in the following code.
NSData *imgData = UIImageJPEGRepresentation(imageResized, 0.5);
NSLog(@"imgData.length :%lu", (unsigned long)imgData.length);
imageResized = [UIImage imageWithData:imgData];
NSData *imgData2 = UIImageJPEGRepresentation(imageResized, 1);
NSLog(@"imgData2.length :%lu", (unsigned long)imgData2.length);
The log is:
2013-02-25 00:33:14.756 MyApp[1119:440b] imgData.length :371155
2013-02-25 00:33:20.988 MyApp[1119:440b] imgData2.length :1308415
What I'm confused about is why the lengths of imgData and imgData2 are different. In my app, the image should be uploaded to the server. Should I upload the NSData to the server to save storage? Is it possible for an Android phone to download the NSData and convert it to an image? Any help will be appreciated!
You start with a UIImage of some size (say 1024x768). This takes 1024x768x4 = 3,145,728 bytes in memory. Then you compress it with a factor of 0.5 and get 371,155 bytes.
You then create a new UIImage with the compressed data. This is still a 1024x768 (or whatever) UIImage so it now takes the same amount of memory (1024x768x4) as the original image. You then convert it to a new JPG with less compression giving you 1,308,415 bytes.
Even though you create a less compressed version of the compressed image, the number of bytes comes from converting the full-sized UIImage. The second image, though bigger as a file, will still have the same lower quality as the compressed image.
Since your data represents a JPG, anything that downloads the data will be able to treat the data as a JPG, including an Android phone.
The number of bytes is bigger for the second image because you passed a much higher compression quality value to UIImageJPEGRepresentation. Higher quality takes more bytes.
The file once uploaded to a server will be a standard JPEG file, viewable by any device, including Android.
