UICollectionView lags with locally stored images - iOS

I have a simple application that consumes data from a local Core Data store. I am displaying images in a collection view and it lags when I scroll. All the images come from the local bundle itself.
Can anyone help?
Thanks.

Store all the images in an array once and use that array instead of fetching images from Core Data every time the cell loads.

A typical solution to this would be to use the proxy design pattern and to load images asynchronously on a background thread.
You could have a UIImageView subclass that shows a placeholder image or a blank rectangle, and that is automatically updated once the image has been loaded. Some apps even use this to make a fancy "polaroid fade" effect, i.e. fade the image in from white or black when it goes on-screen.
To prevent placeholder images from ever being seen (although you'd want to test this on a device, as local storage might be fast enough to make this unnoticeable), you could pre-cache images in advance (for example by anticipating a number of extra rows in your cellForRowAtIndexPath method, or cellForItemAtIndexPath for a collection view).
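A minimal sketch of that proxy idea, assuming the images live in the app bundle (the AsyncImageView class name and the loadImageNamed:placeholder: method are just illustrative, not an existing API):

#import <UIKit/UIKit.h>

// AsyncImageView.h -- hypothetical proxy image view that shows a placeholder
@interface AsyncImageView : UIImageView
- (void)loadImageNamed:(NSString *)name placeholder:(UIImage *)placeholder;
@end

// AsyncImageView.m
@implementation AsyncImageView

- (void)loadImageNamed:(NSString *)name placeholder:(UIImage *)placeholder
{
    self.image = placeholder;  // show the placeholder (or blank rectangle) immediately

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Load off the main thread; imageWithContentsOfFile: bypasses the imageNamed: cache
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:nil];
        UIImage *loaded = [UIImage imageWithContentsOfFile:path];

        dispatch_async(dispatch_get_main_queue(), ^{
            // Cross-dissolve from the placeholder for the "polaroid fade" effect
            [UIView transitionWithView:self
                              duration:0.25
                               options:UIViewAnimationOptionTransitionCrossDissolve
                            animations:^{ self.image = loaded; }
                            completion:nil];
        });
    });
}

@end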

Related

Displaying multiple images causes massive memory usage

I have an app that allows a user to go into their photo library and select photos to add to the app; these are then displayed in a collection view.
After UIImagePickerController fetches each image, it is added to an array and the collection view is reloaded with the new images.
My problem is that this uses large amounts of memory. I need a way to display the added photos exactly like the native Photos app on the iPhone.
I've looked into lazy loading but I have no idea where to begin.
Could someone please tell me how I would go about displaying images in a UICollectionView with lazy loading, or at least reducing the amount of memory used?
The app generally uses around 10 MB of memory during normal use. This increases to 50 MB+ when displaying a multitude of images in the collection view.
Thanks.
Holding high-resolution images in an array will generally be problematic. Also, using full-resolution images for thumbnail-sized image views in a collection view is an extravagant use of memory.
So, when the user selects an image, capture a reference to that asset's URL. Then, as images are required by cellForItemAtIndexPath, retrieve the image, resize it to thumbnail dimensions (e.g. you could use something like this) and use that in your cell's image view.
If you want to be elegant about it, implement an NSCache in which you cache previously resized images, but make sure you have reasonable retention rules and have it purge itself under memory pressure. That way, cellForItemAtIndexPath can see if the image exists in the cache and, if so, use that; otherwise go back to the assets library and resize the image. By using a cache, you can speed up scrolling back through images that were previously resized.
But the key is to avoid holding high-resolution images in memory. And if you're going to hold even the thumbnails in memory, you might want to keep them in something like an NSCache rather than an array.
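Here is a rough sketch of that caching approach. MyCell, self.imagePaths, and the resizedImage:toSize: helper are placeholders for whatever you already have; the sketch also assumes a local file path per item, so if you're using the assets library you'd fetch via the asset URL instead:

// Somewhere in the controller: self.thumbnailCache = [[NSCache alloc] init];

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    MyCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"Cell"
                                                             forIndexPath:indexPath];
    NSString *path = self.imagePaths[indexPath.item];
    UIImage *thumbnail = [self.thumbnailCache objectForKey:path];

    if (thumbnail) {
        cell.imageView.image = thumbnail;      // cache hit: no disk access, no resizing
    } else {
        cell.imageView.image = nil;            // or a placeholder
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            UIImage *full = [UIImage imageWithContentsOfFile:path];
            UIImage *small = [self resizedImage:full toSize:CGSizeMake(100.0, 100.0)];
            if (!small) {
                return;   // file missing or unreadable; leave the placeholder in place
            }
            [self.thumbnailCache setObject:small forKey:path];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Only update the cell if it is still showing this item
                if ([[collectionView indexPathForCell:cell] isEqual:indexPath]) {
                    cell.imageView.image = small;
                }
            });
        });
    }
    return cell;
}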

Best image caching strategy in iOS

In my app, I have a UITableView which displays fairly large images, each loaded into a moderately designed XIB. Each image is around 700 KB to 1 MB in size. The feed is virtually never-ending: it loads more and more as you scroll down. So you can imagine that I am running into memory issues.
I have tried using SDImageCache and NSCache. The former uses disk storage for caching images. In both cases, the caches somehow didn't clear images automatically; I had to clear them manually when I got a "Received memory warning" prompt. And each time I clear these caches, the amount of memory freed seems to be smaller than the time before.
Now I am confused as to which caching strategy I should use for such a long list of images. Might I have some leaks somewhere? They certainly didn't show up when I profiled the app.
P.S.: I am loading the images from the web. Just to be clear.
From the docs:
UIImage
+(UIImage *)imageNamed:(NSString *)name
Discussion: This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method loads the image data from the specified file, caches it, and then returns the resulting object.
So I guess leaving this to the UIImage class is a good approach.
Hope this helps!
As we implemented it in both Android and iOS: at any one time you can only show 2-3 images on screen.
Keep 2 more in memory for the downward scroll and 2 more for the upward one, so you have 7 images in memory. Display those. The other images should be stored in files (when you download them). If the user scrolls too fast, don't show the whole sequence of images; show some "loading" icons in their place instead. When the scrolling stops, show the appropriate image plus the previous one and the next one, and prepare 2 more for upward scrolling and 2 more for downward scrolling.
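A hedged sketch of that windowing logic; loadThumbnailAtIndex:, releaseThumbnailAtIndex:, and self.imageCount stand in for whatever loading code and data source you already have:

// Keep only a small window of decoded images in memory around the visible row.
static const NSInteger kWindowRadius = 2;   // 2 ahead and 2 behind the visible row

- (void)updateImageWindowAroundIndex:(NSInteger)centerIndex
{
    NSInteger first = MAX(0, centerIndex - kWindowRadius);
    NSInteger last  = MIN(self.imageCount - 1, centerIndex + kWindowRadius);

    for (NSInteger i = 0; i < self.imageCount; i++) {
        if (i >= first && i <= last) {
            [self loadThumbnailAtIndex:i];     // read the file from disk if not already in memory
        } else {
            [self releaseThumbnailAtIndex:i];  // drop the decoded image; the file stays on disk
        }
    }
}

// Refresh the window when scrolling settles (UITableViewDelegate conforms to UIScrollViewDelegate)
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    NSIndexPath *visible = [[self.tableView indexPathsForVisibleRows] firstObject];
    if (visible) {
        [self updateImageWindowAroundIndex:visible.row];
    }
}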

How does the iPhone Photos app load so many images so quickly and smoothly?

I'm building an application that requires a bunch of local images to be displayed in the image view of a UITableViewCell. However, I'm having difficulty optimizing the performance of the UITableView. I've noticed two issues specifically: first, the view takes a while to load; second, the scrolling gets laggy when new cells are displayed.
My viewDidLoad loads the images like this:
for (Object *object in self.objects)
{
    object.thumbnail = [UIImage imageNamed:object.imageName];
}
This is obviously causing the long-load issue, but I'm not sure how else to get those images loaded. Is it a size issue? Is this just a bad way of doing it?
The process of displaying the images also seems to be problematic, in other words, even after the images have been assigned to the thumbnail property, they still take too long to be drawn.
Although this is a specific case, I'm curious more generally how Apple loads images in Photos so efficiently. Any insights? Thanks.
Whenever I find that my UI is lagging, the first thing I suspect is that I am performing some operation synchronously (on the main thread) that should be performed asynchronously (on a background thread).
I am also very curious as to how exactly Apple is achieving that performance in the photo app. I am writing an app that has similar requirements as yours right now. My current approach is to load a bunch of photos from disk into memory asynchronously as soon as the user opens my view controller, and continue to load (and remove photos) from memory - ahead of time - as the user scrolls.
Currently, I load each photo from disk asynchronously in cellForRowAtIndexPath, which is pretty fast, but it causes a cascading effect if you scroll quickly through the table view. That is, the cell appears empty for a moment before the photo appears.
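Roughly, the cellForRowAtIndexPath part of that approach looks like this (imagePathForIndexPath: is a stand-in for however you resolve the file path):

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell"
                                                            forIndexPath:indexPath];
    cell.imageView.image = nil;   // the cell may be reused, so clear the previous photo

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSString *path = [self imagePathForIndexPath:indexPath];   // hypothetical helper
        UIImage *photo = [UIImage imageWithContentsOfFile:path];

        dispatch_async(dispatch_get_main_queue(), ^{
            // Only set the image if this row is still on screen; otherwise the cell
            // has been reused for another row and we would show the wrong photo.
            UITableViewCell *visibleCell = [tableView cellForRowAtIndexPath:indexPath];
            if (visibleCell && photo) {
                visibleCell.imageView.image = photo;
                [visibleCell setNeedsLayout];
            }
        });
    });
    return cell;
}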
I hope this sheds some light for you.
You may also be interested in trying SDWebImage - which includes an image cache object that has been making my life easier when dealing with local photos.
They are most likely doing everything in the background on various detached threads. This allows for a great deal of fluidity in applications that host and present a great deal of information. I have created photo galleries myself in various applications, loading many photos simultaneously from different web APIs and so on, and by simply creating new threads and managing allocations efficiently and accurately, you can get a very smooth interface and interaction.
Well, the long-load issue is caused by this part of your code, [UIImage imageNamed:], because this method loads and caches all the images on the same (main) thread, and it could also crash the app if memory is exhausted.
Try looking at this library - it should do what you are trying to achieve :)
http://www.cocoacontrols.com/platforms/ios/controls/ktphotobrowser
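If you'd rather not pull in a library, a minimal variation on your viewDidLoad loop is to move it onto a background queue and use imageWithContentsOfFile:, which skips the imageNamed: cache. This is only a sketch; it assumes object.imageName includes the file extension, that the files are in the bundle, and that Object and self.objects are your own model types from the question:

- (void)viewDidLoad
{
    [super viewDidLoad];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        for (Object *object in self.objects) {
            // imageWithContentsOfFile: does not go through the imageNamed: system cache
            NSString *path = [[NSBundle mainBundle] pathForResource:object.imageName ofType:nil];
            object.thumbnail = [UIImage imageWithContentsOfFile:path];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.tableView reloadData];   // refresh the rows once the thumbnails are ready
        });
    });
}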

Download an image from the web and cache it using Core Data

Imagine you use a web service that delivers images via an API. Every image has a UID, and you have a Core Data entity Image with attributes uid (int) and image (transformable).
Now a gallery in your app needs to show many images (the UIDs are known). You don't know which of the images have already been downloaded and stored. How can you lazily download images with Core Data in the background, so that the view can show a UIActivityIndicatorView during loading and automatically show the image as soon as it is stored locally (e.g. by using the NSFetchedResultsControllerDelegate protocol)?
Is it useful to subclass UIImageView for that purpose?
Yes, you can use Core Data for this, but be forewarned that there is a performance hit when you use Core Data to store images. It's negligible if you're dealing with small (e.g. thumbnail) images, but for very large images the performance hit is noticeable. For large images, I'd store the images in the Documents folder (or, better, a subfolder) and use Core Data to keep track of which images have been downloaded, their filenames, etc. But if the images are smaller, keeping everything right there in Core Data is cleaner.
I probably would not want to use a subclassed UIImageView for this purpose, because you might want to decouple the presentation layer (the image view) from the caching of images. Also, for sophisticated user interfaces, UIImageView objects may be discarded or reused as the user scrolls through a big collection of images, so you might not want a hard link between the UIImageView and your caching logic. Also, depending upon your user interface, sometimes the images can dictate something broader than the UIImageView (e.g., if you're using a tableview, you might want to adjust the cell height based upon the image as it's downloaded). The particulars of the implementation of what the UI might do depend upon where the image is being used (a UITableView, a UIScrollView that is showing a grid of images, etc.).
So, bottom line: for simple user interfaces, perhaps you could subclass a UIImageView, but I'd generally advise having some custom image-caching object that does the lazy loading, with a delegate protocol to tell the UI when the image load is complete.
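To illustrate the Documents-folder variant, a rough sketch; the ImageRecord entity name and localFilename attribute are assumptions, since the question's entity stored the image itself in a transformable attribute instead:

// Write the downloaded bytes to Documents/Images/<uid>.jpg and let Core Data
// track only the filename, not the pixels.
- (void)storeImageData:(NSData *)imageData
                forUID:(NSInteger)uid
             inContext:(NSManagedObjectContext *)context
{
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                               NSUserDomainMask, YES) firstObject];
    NSString *folder = [documents stringByAppendingPathComponent:@"Images"];
    [[NSFileManager defaultManager] createDirectoryAtPath:folder
                               withIntermediateDirectories:YES
                                                attributes:nil
                                                     error:NULL];

    NSString *filename = [NSString stringWithFormat:@"%ld.jpg", (long)uid];
    [imageData writeToFile:[folder stringByAppendingPathComponent:filename] atomically:YES];

    NSManagedObject *record = [NSEntityDescription insertNewObjectForEntityForName:@"ImageRecord"
                                                            inManagedObjectContext:context];
    [record setValue:@(uid) forKey:@"uid"];
    [record setValue:filename forKey:@"localFilename"];
    [context save:NULL];
}

An NSFetchedResultsController watching that entity can then tell the gallery when a given UID's file has arrived, at which point the cell swaps its activity indicator for the image.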
I wouldn't use Core Data to store images; I'd store them on disk. I was trying to find the place in the documentation where Apple themselves warn against storing images larger than 100 KB or so in Core Data, since you'd run into performance issues.
However, I found this article that talks about Core Data Image Caching that may be of use.
Also, here's another Stack Overflow post with a good answer that tells you when to store images in a database and when to just use references to disk storage.
If you don't absolutely need Core Data, then may I recommend using MWPhotoBrowser?
https://github.com/mwaterfall/MWPhotoBrowser
It essentially generates a gallery view controller for you which you can push onto your navigation controller. The gallery view controller has scrollable images with pinch zoom, pan, everything, even emailing the photo to someone.
It also does the lazy loading of the image with the activity indicator.
Short answer: everything you wanted to do without reinventing the wheel.

Showing large UIImage causes jittering even though it is already in memory

I have an app similar to the sample app PhotoScroller, i.e. a lot of large images (2048x1536) in a scroll view. I am not using the tiling approach because I don't like the partial-load effect; I would like to show the whole image at once. I am loading images on a background thread. When I try to use a loaded image in a UIImageView for the first time, it blocks the main thread for half a second even though it is already in memory.
I used profiler to see where this lag is coming from but I couldn't find any useful information there.
Is iOS copying image data when it is used for the first time or something like that? Can I somehow do that in background thread as well?
EDIT: when I scroll away and back again and use that same UIImage a second time, there is no delay.
Try creating the image on a background thread too.
Are you loading the image from the internet or locally? From the bundle or a custom path?
I think I solved it. It turns out that when iOS loads a UIImage from disk or the web, it doesn't decode it into a drawable format right away. So when you want to display it, iOS has to decode the image into a format it can actually render, and this decoding can cause visible stuttering when the image is big enough.
To prevent this, you need to force the UIImage to decode after it has been loaded, and you can do that safely in the background (as far as I know). This link shows how to do it.
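The usual trick is to force the decode yourself on the background thread by drawing the image into an offscreen context once. A sketch (not the exact code from the link):

// Force UIImage to decompress its bitmap now, on the calling (background) thread,
// so the main thread does not stall the first time the image is drawn on screen.
- (UIImage *)decodedImageWithImage:(UIImage *)image
{
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawAtPoint:CGPointZero];    // drawing is what triggers the decode
    UIImage *decoded = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return decoded;
}

// Usage, on a background queue:
// UIImage *raw = [UIImage imageWithContentsOfFile:path];
// UIImage *ready = [self decodedImageWithImage:raw];
// dispatch_async(dispatch_get_main_queue(), ^{ imageView.image = ready; });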
