Loading large images faster in UIImageView (iOS)

Is there an accepted way, or updated way, to more quickly load images into a UIImageView?
My scenario: a collection view with a large UIImageView, where only one cell is displayed at a time. I have already implemented NSCache and prefetching on the collection view. Scrolling has a noticeable pause partway through. The images I am using are "relatively" large in order to accommodate both iPad and iPhone layouts. For example, images are 1600x1600 px, RGB, PNG (2-5 MB compressed, ~10 MB uncompressed, stored locally in the app).
Once the images are loaded, scrolling back and forth is usually fine, ~60 fps visually. But the first load is ALWAYS jittery. If I make the images physically smaller, say 800x800, they load quickly and I cannot see any jitter on scroll. So I am dealing with an image-size-versus-drawing-speed issue. The same issue appears on a 5s as on an iPhone X.
The same performance hit happens with [UIImage imageNamed:imageName] and with [UIImage imageWithContentsOfFile:imagePath].
I have been reading about how UIImages are decompressed before actually being drawn, and how, if the system has to draw a subsampled image, that can significantly affect main-thread performance. I've done a little Instruments testing and confirmed that none of my code is actually slow; the decoding and drawing of the PNG is what is slow.
Is there a newer way to do something similar to the content in the links below, i.e., draw the image into a CGContext once and hope it stays cached?
https://www.cocoanetics.com/2011/10/avoiding-image-decompression-sickness/
- (void)decompressImage:(UIImage *)image
{
    // Drawing the image, even into a throwaway 1x1 context, forces UIKit
    // to decode it; the decoded bitmap is then cached by the UIImage.
    UIGraphicsBeginImageContext(CGSizeMake(1, 1));
    [image drawAtPoint:CGPointZero];
    UIGraphicsEndImageContext();
}
https://gist.github.com/steipete/1144242
This seems like real overkill for my case:
https://github.com/path/FastImageCache

I have implemented NSCache and Prefetching on the collection view already.
Good, but:
(1) you should not put images in an NSCache (UIImage, via imageNamed:, already caches decoded images for you)
(2) you should downsize the image to the maximum size actually needed for display
(3) you should not be doing anything time-consuming in collectionView(_:cellForItemAt:) (you didn't show yours); you have only a couple of milliseconds to produce the cell and get out
(4) if you can't provide the image in time, provide a placeholder and get out of cellForItemAt:; you can always reload that item later, when you have the real image
(5) do all time-consuming work off the main thread; that includes converting data to a UIImage and drawing the UIImage to downsize it (see the downsampling sketch after this list)
(6) measure, measure, measure! This is why we have Instruments; do not guess where the problem is.
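The cleanest modern replacement for the draw-into-a-context trick is to downsample with ImageIO, the approach recommended in WWDC 2018's "Image and Graphics Best Practices" session: it decodes directly at the target size and never materializes the full 1600x1600 bitmap. A minimal sketch, assuming locally stored files; the function name DownsampledImage is mine:

#import <ImageIO/ImageIO.h>
#import <UIKit/UIKit.h>

// Decode an image file straight to a bitmap no larger than maxPixelSize
// on its longest side. The full-size bitmap is never created.
static UIImage *DownsampledImage(NSURL *imageURL, CGFloat maxPixelSize)
{
    NSDictionary *sourceOptions = @{
        (__bridge NSString *)kCGImageSourceShouldCache : @NO // don't decode yet
    };
    CGImageSourceRef source =
        CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL,
                                   (__bridge CFDictionaryRef)sourceOptions);
    if (source == NULL) { return nil; }

    NSDictionary *thumbnailOptions = @{
        (__bridge NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (__bridge NSString *)kCGImageSourceShouldCacheImmediately : @YES, // decode now, on this thread
        (__bridge NSString *)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (__bridge NSString *)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize)
    };
    CGImageRef cgImage =
        CGImageSourceCreateThumbnailAtIndex(source, 0,
                                            (__bridge CFDictionaryRef)thumbnailOptions);
    CFRelease(source);
    if (cgImage == NULL) { return nil; }

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}

Call this from the collection view's prefetch handler on a background queue; the returned UIImage is already decoded at display size, so assigning it to the cell's image view on the main thread is cheap, and it is this downsized, decoded image that is worth caching.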

Related

How to optimize the memory when displaying same image multiple times?

I have a UIImage instance with a size of 200 KB, and I create 5 UIImageView instances that all reference this same UIImage.
I wonder how much memory is allocated in this case: only 200 KB (for the one UIImage instance), or 1 MB (for 5 cloned UIImage instances)? If memory is being wasted, is there an effective way to solve it?
A couple of thoughts:
UIImage is a reference type, so when you reference the same image five times, you will generally have one image object in memory. It depends a little on how you do this. For example, if you call UIImage(data:) each time, or something like that, you may instantiate a new object each time; but if you instantiate one UIImage and then use it five times, you won't see duplicative memory consumption.
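For example, a minimal sketch of the single-object case (path and containerView are placeholders):

// One decoded UIImage, shared by five image views.
UIImage *photo = [UIImage imageWithContentsOfFile:path]; // decoded once, on first draw
for (NSInteger i = 0; i < 5; i++) {
    UIImageView *imageView = [[UIImageView alloc] initWithImage:photo]; // all five retain the same object
    imageView.frame = CGRectMake(0, i * 100, 100, 100);
    [containerView addSubview:imageView];
}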
As an aside:
You say the image has a size of 200 KB. Is that the size of the original asset, or have you figured out that this is how much memory it will take at run time?
The reason I ask is that JPG and PNG files are generally compressed, but when you use one in an image view, it gets uncompressed. The amount of memory an image takes has little to do with the file size of the original asset; it corresponds to the dimensions (in pixels) of the image. So a 676 KB PNG that is 2560 x 1440 pixels may actually require about 14 MB of memory (four bytes per pixel).
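The arithmetic: 2560 × 1440 pixels × 4 bytes per pixel = 14,745,600 bytes, roughly 14 MB, regardless of the 676 KB file size on disk.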
Note that this memory consumption corresponds to the dimensions of the image in question, not the dimensions of the image view to which you add it. If you're concerned about memory usage, and if the image dimensions exceed the size of the image view (times the device scale), then you might want to consider resizing the image.
In the future, you can answer these questions empirically using Instruments. For example, in the following timeline: at the green signpost, I loaded a UIImage from the 676 KB asset, with modest memory impact; at the purple signpost, I set the image view's image to this asset, with a significant memory impact as the 2560 x 1440 px image was uncompressed; and at the orange signpost, I loaded five more image views with the same image, with negligible further memory impact.

Bigger size image in smaller image view

This is more of a conceptual question; everything is working fine.
I have an image view whose images I download from our web server. The server keeps the largest size of each image, and I rescale the images down as required for each device. So let's say I have a UIImageView sized 200x200 and I download a 400x400 image: I rescale the image to 200x200 and then put it in the image view. I also tried putting the 400x400 image directly into the 200x200 image view, and it looks fine to me (no pixelation). The way I implemented the downscaling is
[image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
within an image context. Now I suspect Apple might be doing this anyway, since it rescales my image to fit the image view, so is the manual downscaling really required? Or should I just put high-resolution images directly into the image view?
Suggestions welcome.
You should be fine to just assign a 400x400 UIImage to a 200x200 UIImageView. Core Animation will deal with the image scaling underneath.
Image Quality
If you want to experiment with different image-scaling qualities, you can set the minificationFilter on the UIImageView's layer. The default is kCAFilterLinear, which I think would be fine for your usage: multiple pixels from the 400x400 image are selected and linearly blended together to produce each pixel color in the 200x200 result. kCAFilterNearest will get you better performance at the cost of image quality (a single pixel from the 400x400 image is selected to get the color for each 200x200 image pixel).
You could experiment with kCAFilterTrilinear instead, which should get you better image quality at the cost of some performance. The documentation doesn't make it clear on which devices this actually has an effect, although this guy had success using it on an iPad 2, which makes me think it may be supported on all devices now. The documentation also notes that your image may need power-of-two dimensions for it to take effect.
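For example, a one-line sketch (imageView is an assumed existing UIImageView; QuartzCore provides the filter constants):

imageView.layer.minificationFilter = kCAFilterTrilinear; // default is kCAFilterLinear; kCAFilterNearest is fastest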
Performance
You could scale the image down to 200x200 as a performance optimization, to save memory and Core Animation render time (including the image scaling), but I wouldn't do that unless you have reason to think your app's performance would actually benefit from it.

Display large images on iOS without precut tiles

I'm building a camera application that saves the image data to a single JPEG file in the sandbox. The images average about 2 MB in size.
Problem: I cannot display the images in a photo viewer, because having even a few images in memory throws memory warnings and makes scrolling through the images very slow.
I cannot split the image into tiles and save them to disk, because that's even more expensive than displaying the single image. I tried splitting the image into tiles upon capture, but on the 5S it took 5.5 seconds on average to save all the tiles to disk, and it will only get worse on older devices. This is very bad: what if the user exits the app in the middle of the save? I would be left with missing tiles and no uncompressed original from which to regenerate them later.
Question: what's the best way to show a full-sized image without causing memory issues while keeping scrolling fast? Tons of camera applications on the App Store do this, and the Photos app does this; there has to be a good solution.
Note: I'm currently showing a thumbnail of the image and loading the full-size image from disk on another thread. Once the full-size image has loaded, I present it on the main thread. This removes the memory issues, because I only ever have one full-size image in memory at once (plus two thumbnails), but it still causes lag in the scroll view, because drawing the full-size image on the main thread is still pretty expensive.
I would greatly appreciate any input!
You could create a downsized thumbnail for each image, save it in a separate "sandbox" folder, and read that while browsing; then load the full-size image only when the user wants to look at it full size.
One way to deal with this is to tile the image.
You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around, pull out only the tiles you need to actually display. You only ever need one tile in memory at a time, because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even keeping the whole image as tiles can relieve memory pressure, because you don't need one large contiguous block.)
This is how applications like Photoshop deal with this situation.
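A minimal sketch of that idea using CATiledLayer, which is also what Apple's PhotoScroller sample (mentioned below) is built around; the 256-point tile size and the tile_<col>_<row>.png naming scheme here are assumptions:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// A view backed by a CATiledLayer. Assumes tiles were saved in advance
// as "tile_<col>_<row>.png" inside tileDirectory.
@interface TiledImageView : UIView
@property (nonatomic, copy) NSString *tileDirectory;
@end

@implementation TiledImageView

+ (Class)layerClass {
    return [CATiledLayer class];
}

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        ((CATiledLayer *)self.layer).tileSize = CGSizeMake(256, 256);
    }
    return self;
}

// CATiledLayer calls drawRect: once per visible tile, on background
// threads, so only the tiles currently on screen are read into memory.
// (For simplicity this sketch ignores contentsScale/Retina details.)
- (void)drawRect:(CGRect)rect {
    NSInteger col = (NSInteger)(CGRectGetMinX(rect) / 256.0);
    NSInteger row = (NSInteger)(CGRectGetMinY(rect) / 256.0);
    NSString *name = [NSString stringWithFormat:@"tile_%ld_%ld.png", (long)col, (long)row];
    NSString *path = [self.tileDirectory stringByAppendingPathComponent:name];
    UIImage *tile = [UIImage imageWithContentsOfFile:path];
    [tile drawInRect:rect];
}

@end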
A second approach I suggest is to check Apple's example for processing large images, called PhotoScroller. Its images have already been tiled; if you need an example of tiling an image in Cocoa, check out cimgf.com.
Hope this helps.

Poor memory management performance for images on iOS devices

I have the following issue:
I have a primary view object (inheriting from UIView) that displays a grid of 16 squares (each an instance of a class I created that inherits from UIImageView), in a 4x4 layout.
Each of these 16 squares is 160x160 and contains an image (a different image for each square) that is no bigger than 30 KB. The image, however, is 500x500 (because it is used elsewhere in the program at its full size), so it gets resized in the "square" class to 160x160 by the setFrame method.
By looking at the memory report in Xcode while the app is running, I've noticed a few things:
- Each of these squares, when added to the primary view object, increases the memory usage of the app by 1 MB. This doesn't happen at instantiation, but only when they are added by [self addSubview:square] in the primary view object.
- If I use the same image for all the squares, the memory increase is minimal. If I initialize the square objects without any images, the increase is basically zero.
- The same app, when running in the simulator, uses 1/6 of the memory it does on an actual device.
The whole point here is: why does each square use up 1 MB of memory when loading a 30 KB image? Is there a way to reduce this? I've tried creating the images in a number of different ways: [UIImage imageNamed:img], [UIImage imageWithContentsOfFile:path], [UIImage imageWithData:imgData scale:scale], as well as not resizing the frame.
When you use a 500x500 image in a smaller UIImageView, the full 500x500 image is still loaded into memory. You can solve this by resizing the UIImage itself (not just adjusting the frame of the UIImageView), making a 160x160 image, and using that image in your view. See this answer for some code to resize the image, which can then be invoked as follows:
UIImage *smallImage = [image scaleImageToSizeAspectFill:CGSizeMake(160, 160)];
You might even want to save the resized image, so you're not constantly incurring the computational overhead of creating the smaller image every time, e.g.:
NSData *data = UIImagePNGRepresentation(smallImage); // encode the downsized image as PNG
[data writeToFile:path atomically:YES];              // cache it on disk for future use
You can then load that PNG file corresponding to your small image in future invocations of the view.
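If you don't have the category from that linked answer handy, a minimal stand-in using UIGraphicsImageRenderer (iOS 10+) could look like this; note it stretches to the target size rather than aspect-filling, which is a simplification:

// Hypothetical resize helper; returns a new image drawn at targetSize.
static UIImage *ResizedImage(UIImage *image, CGSize targetSize)
{
    UIGraphicsImageRenderer *renderer =
        [[UIGraphicsImageRenderer alloc] initWithSize:targetSize];
    return [renderer imageWithActions:^(UIGraphicsImageRendererContext *context) {
        [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    }];
}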
In answer to your question about why it takes up so much memory: while the image is probably stored as a compressed JPG or PNG in persistent storage, in memory it's held as an uncompressed bitmap. There are many internal formats, but a common one is 32 bits per pixel, with 8 bits each for red, green, blue, and alpha. Regardless of the specifics, you can quickly see how a 500 x 500 pixel representation at 4 bytes per pixel translates to about 1 MB of memory (500 × 500 × 4 = 1,000,000 bytes), while a 160 x 160 image should be roughly one tenth that size.

iOS optimization: loading images into UITableView

Hello, I have a problem with loading images in my UITableViewCells. Of course I use dequeueReusableCellWithIdentifier for my cells. The major problem appears when I scroll the table really fast: cells that display images freeze the app for about 0.1 s, which is jarring and user-unfriendly. The images are cached in an array as UIImages; the only thing I do with them is set them on a UIImageView. Any solutions?
You need to create a scaled-down version of each image to use in your table view. When you display an image on screen for the first time, iOS has to decode it, which of course takes longer the bigger the image is. That alone can cause a bad scrolling experience. Then, for each image on screen, the GPU has to read the huge image and scale it down; this also takes significant time and produces a lower-quality rendering than scaling the image with Core Graphics.
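A sketch of how that can fit into a cell: dequeue, show a placeholder, then decode and downscale off the main thread. DownsampledImage is the hypothetical ImageIO helper sketched near the top of this page, and placeholderImage and imageURLs are assumed properties:

- (UITableViewCell *)tableView:(UITableView *)tableView
         cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell"
                                                            forIndexPath:indexPath];
    cell.imageView.image = self.placeholderImage; // cheap; keeps scrolling smooth
    NSURL *url = self.imageURLs[indexPath.row];
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        UIImage *scaled = DownsampledImage(url, 120.0); // decode + downscale off-main
        dispatch_async(dispatch_get_main_queue(), ^{
            // Re-fetch the cell: it may have been reused for another row by now.
            UITableViewCell *current = [tableView cellForRowAtIndexPath:indexPath];
            current.imageView.image = scaled; // no-op if the cell scrolled away
            [current setNeedsLayout];
        });
    });
    return cell;
}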
Images with 2000x3000 dimensions are really big. Maybe try cropping or scaling them down using MGImageUtilities (https://github.com/michaelhenry/MGImageUtilities).

Resources