memory size of UIImageView - ios

I'd like to know the memory size of a UIImageView object, as I need to show some large images and manage the memory. I guess it is decided by the image property, but I'm not sure how to calculate the actual memory size. Below is the code I wrote to test:
// 1.jpg has dimensions of 4016x2657 and a file size of 2.1 MB
NSData *imageData = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"1" ofType:@"jpg"]];
NSLog(@"imageData:%lu", (unsigned long)[imageData length]);
The console shows:
imageData:2070461
This is exactly the 2.1 MB file size.
However, in my opinion the UIImageView should decode every pixel to show the image; in other words, it should use roughly:
4016*2657*4/1024/1024 = 40.7+MB
That is quite large, and I don't know whether iOS does some optimization. I also can't find anything relevant in the documentation.
Could anyone tell me exactly how much memory a UIImageView object uses?

It is described in the question: the memory used is the actual decoded size of the photo, and UIImage will not do any optimization.

Related

Quick way to load images

Right now I'm loading images via a file URL and it's taking a long time. I've even put it on a high-priority queue, but it's still slow. This is actually in a loop for about 6 images. My question is:
Is there a faster way to load images to a view than this? Like an alternative to a file url?
// Check the file type
if ([fileType isEqual:@"image"])
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        // Get the image data off the main thread
        NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
        NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
        dispatch_async(dispatch_get_main_queue(), ^{
            imageView.image = [UIImage imageWithData:imageData];
        });
    });
}
My images are captured at the quality of this preset, and their size is the size of the iPhone screen, whether that be a 5, 6, or 6 Plus:
self.camera = [[LLSimpleCamera alloc] initWithQuality:AVCaptureSessionPreset1280x720
                                             position:CameraPositionBack
                                         videoEnabled:YES];
thanks,
While there isn't any shortcut to loading the images, there are things you can do to help.
Initially, load only the images that are visible; after that, load any that may become visible soon.
If an image is larger than its view, create a smaller version, perhaps a couple of them for different screen-resolution devices.
Create low-resolution versions, load them first to get something up for the user, and then load the full-resolution images, replacing the low-resolution versions.
Please add the sizes of the images, the size of the UIImageView, and the screen resolution to the question.
What I can suggest is:
Save a duplicate copy of each image at a lower resolution/size on the server.
Load the duplicate image on first load.
When the user views a particular image, load the original image then.

iphone sdk get actual size of image in bytes

How can I get the actual size of an image?
I am using
NSInteger actualSize = CGImageGetHeight(image.CGImage) * CGImageGetBytesPerRow(image.CGImage);
or
NSData *imageData2 = UIImageJPEGRepresentation(image, 1.0);
[imageData2 length];
but I don't get the actual size of the image; it is either larger or smaller compared to the size on disk (I am using the simulator).
Is there any way to get actual size of the image?
It depends upon what you mean by "size".
If you want the amount of memory used while the image is loaded into memory and used by the app, the bytes-per-row times height is the way to go. This captures the amount of memory used by the uncompressed pixel buffer while the image is actively used by the app.
If you want the number of bytes used in persistent storage when you save the image (generally enjoying some compression), then grab the original asset's NSData and examine its length. Note, though, that if you load an image and then use UIImageJPEGRepresentation with a quality of 1, you'll generally get a size a good deal larger than the original compressed file.
Bottom line, standard JPEG and PNG files enjoy some compression, but when the image is loaded into memory it is uncompressed. You can't generally infer the original file size from a UIImage object. You have to look at the original asset.
Try this (for iOS 6.0 or later and OS X 10.8):
NSLog(@"%@", [NSByteCountFormatter stringFromByteCount:imageData2.length countStyle:NSByteCountFormatterCountStyleFile]);
UPDATE:
Question: Can you post code where you initialise your image?
The above solution did not work for you, so let's try something else. You could check the image file's size on disk directly:
NSError* error;
NSDictionary *fileDictionary = [[NSFileManager defaultManager] attributesOfItemAtPath:mediaURL error: &error];
NSNumber *size = [fileDictionary objectForKey:NSFileSize];

iOS: Instruments shows imageio_png_data is 300x larger in size than its actual image size

I have an image that is only 28KB in size.
I'm adding it to my view using this code:
UIImageView *background = [UIImageView new];
background.frame = CGRectMake(0, 0, 1080, 1920);
background.image = [UIImage imageNamed:@"Submit.png"];
[self.view addSubview:background];
Now I'm profiling with Instruments Allocation and "Marking Generation" right before and right after the image is allocated:
Instruments indicates that it took 7.92MB to load the image into memory.
I'm seeing the same issue with other images as well.
Why is ImageIO_PNG_Data at 7.92MB when the image is only 28KB in size?
@matt and @dan did a really good job of explaining why an uncompressed image takes up literally 300x the memory of the actual PNG file to display on screen. What makes this issue worse is that iOS caches these images and does NOT release them from the cache EVER, even on memory warnings.
So here's a way to prevent image caching on iOS to save up a ton of memory, just use imageWithContentsOfFile instead of imageNamed:
Replace:
background.image = [UIImage imageNamed:@"Submit.png"];
With:
background.image = [UIImage imageWithContentsOfFile:[[[NSBundle mainBundle] bundlePath] stringByAppendingString:@"/Submit.png"]];
and now the ImageIO_PNG_Data's will be released when the view controller is dismissed.
It's all right here:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIImage_Class/#//apple_ref/occ/clm/UIImage/imageNamed:
If you have an image file that will only be displayed once and wish to ensure that it does not get added to the system's cache, you should instead create your image using imageWithContentsOfFile:. This will keep your single-use image out of the system image cache, potentially improving the memory use characteristics of your app.
It's because a PNG is compressed data describing what the image looks like, so a PNG that is nothing but a solid color is tiny because it is easy to describe. But the bitmap is the bitmap, just a grid of pixels, and its size depends purely on the dimensions of the image (which, in your case, are immense).

Preloading images ios app

I have an app with 150 local images (about 500kb each). I have loaded them all into an array like this:
allPics = [[NSMutableArray alloc] init];
//NSString *imagePath;
NSArray *result = [database performQuery:@"SELECT image_path FROM validWords ORDER BY valid_word"];
for (NSArray *row in result) {
    NSString *temp = [row objectAtIndex:0];
    NSLog(@"%@", temp);
    //imagePath = temp;
    UIImage *newImage = [UIImage imageNamed:temp];
    [allPics addObject:newImage];
}
When I later set my UIImageView to one of these pics, it hangs my interface for a second, due to lazy decoding from what I have read. I tried to prerender them, but that spiked my memory usage to over 3 GB before it got a third of the way through my images. Should I be looking to use a background thread to render each image when I need it? When I reduced the image total to 4, once all 4 had been rendered, the transitions between them were seamless.
I appreciate any and all tips and solutions!
Yes, I would suggest a background thread and paging. If the user is looking at image 7, you should load, say, images 5, 6, 8 and 9. If the user then moves on to image 8, you can discard image 5 and lazy load image 10. This way the user should be able to move through your images without a significant memory or performance overhead.
You can then also add heuristics such as 'if the user is paging through the images very quickly, don't load any images until they slow down'.
Another tip is to store a very low resolution version of each image (say, a 50KB version) at a different path. Then you can show the thumbnail images to the user and only lazy load the high-res image if the user stops on that image for a period of time.
Finally, be careful when you talk about image sizes. Is the 500KB compressed or uncompressed? If it is a 500KB compressed JPEG, the actual image on the device could be vastly bigger. A JPEG with fairly uniform colour and a reasonably high level of compression can be very small on disk, but decompressed it could be a massive image. This could be another source of the lag you experience.

How to determine the number of bytes used by a UIImage?

I would like to be able to calculate the total number of bytes a UIImage uses in memory.
I can make a rough estimate by multiplying the width by the height and then by a multiplier number of bytes, but I'd like to calculate the size exactly if possible.
In general, objects don't have a single meaningful "size", since they can allocate and release any number of other objects privately as needed. sizeof(*myObj) only gives you the size of the top level structure, not a very useful number. If you need the complete memory impact of allocating and using an object, run under Instruments and watch allocations.
For a UIImage, its practical size is the size of whatever is backing it, typically either an NSData containing a PNG, or a CGImageRef, plus the object overhead. (There's also the pixel buffer when it gets rendered to the screen or other context; but that buffer belongs to the view or context in question, not the UIImage. If a UIView is doing the rendering then that buffer is likely in GL texture memory anyway.)
[UIImage imageWithData:[NSData dataWithContentsOfFile:@"foo.png"]] gives you a UIImage that is the same size as the foo.png file, plus some inconsequential overhead. [UIImage imageNamed:@"foo.png"] does the same thing, except that the class maintains a cache table of one object per filename, and will cause that object to dump its memory copy of the png in low-memory situations, reducing its "size" to just the overhead.
imageWithCGImage: and variants give you a UIImage that uses a CGImage reference as its backing store, and CGImages can be any number of things depending on their source. If you've been painting into one, it's probably an uncompressed pixel buffer. Calculate its size exactly as you propose above. If you need what its size "would be" if it were from a file, inspect the result of the UIImagePNGRepresentation or UIImageJPEGRepresentation functions.
Width * height * 4 will get you close. I'm not sure there's a way to get the exact size, since width is rounded out to an arbitrary, undocumented boundary (at least 4 pixels or 16 bytes, I gather), and there are several extra internal pieces of the object that you'd need to count. Plus likely there are internal attributes that are hung on the object or not, based on its use.
I had to solve this for a twitter app I was writing. Twitter rejects images larger than 3MB, so I needed to compress the image just enough to get below the 3MB limit. Here is the code snippet I used:
float compression = 1.0f;
NSData *data = UIImageJPEGRepresentation(photo, compression);
while (data.length > 3145728 && compression > 0.0f) // 3 MB
{
    compression -= 0.1f;
    NSLog(@"Compressing image to: %f", compression);
    data = UIImageJPEGRepresentation(photo, compression);
    NSLog(@"Image bytes: %lu", (unsigned long)data.length);
}
The compression algorithm I used is non-optimized.
So, what is it doing?
Good question! The UIImageJPEGRepresentation method returns a byte array. To get the size, simply check the length of the array!
There is also a UIImagePNGRepresentation method. Keep in mind, these methods have to build byte arrays and, if needed, convert the binary representation of the data. This can take a bit of time. Luckily, in my case, most images taken by the iPhone are already less than 3MB and will only need the compression if there is a wide range of colors; but calling the UIImageJPEGRepresentation method repeatedly (which could happen in my posted code) can take some time.
