Memory Management in Large Image Collection Apps - iOS

What's the basic idea behind how large apps that contain many images (especially in a table or collection view) manage caching with dequeued cells? No one really answered my previous question about this (iOS Memory Warnings), and I'm trying to populate a collection view with 100+ images from Parse. After failing to find a good solution with manual code, along with down-scaling my images to super low quality, I've tried a number of different libraries, including SDWebImage, LRImageManager, Haneke, and PINRemoteImage. I just want a good idea of, or approach to, tackling my app's memory management. How is it that other apps can load hundreds of fair-quality photos in a dequeue-able view? Is it rocket science?

The answer is a combination of things. Don't load all your images into memory at the same time. Save them as files to your documents or caches directory, load the one each cell needs in cellForRowAtIndexPath, and simply discard the in-memory image when the cell scrolls off-screen.
Also, don't load a large image as a thumbnail. Instead, when you download it, immediately generate a thumbnail version at your target display size and save that (and the original version, if you need it). Then you can load and display the thumbnails with much lower memory impact. Since you're generating the thumbnails for a particular device, you can generate only the thumbnail size you need: retina or non-retina, sized for the screen of the current device (iPad, 4" phone, 5" phone, iPhone 6, 6 Plus, etc.).
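For example, here's a minimal sketch of that thumbnail step (the function name, the 0.8 JPEG quality, and the "-thumb.jpg" naming are just illustrative choices): render the downloaded image at the size the cell will display, then save that small version to the caches directory.

import UIKit

// Generate a display-sized thumbnail once, so cells never touch the full image.
func saveThumbnail(for image: UIImage, named name: String, displaySize: CGSize) -> URL? {
    // Draw into a context sized in points; the screen scale gives retina pixels
    UIGraphicsBeginImageContextWithOptions(displaySize, true, UIScreen.main.scale)
    image.draw(in: CGRect(origin: .zero, size: displaySize))
    let thumbnail = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    guard let data = thumbnail.flatMap({ UIImageJPEGRepresentation($0, 0.8) }) else { return nil }
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let thumbURL = caches.appendingPathComponent(name + "-thumb.jpg")
    do { try data.write(to: thumbURL) } catch { return nil }
    return thumbURL
}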

You should have a properly configured memory cache (NSCache, for instance) that stores images resized to their displayed size, which is good for both memory and drawing performance. There are tutorials on how to configure NSCache to store images, and that's exactly what DFImageManager does:
let targetSize = CGSize(width: 100, height: 100)
let request = DFImageRequest(resource: imageURL, targetSize: targetSize, contentMode: .AspectFill, options: nil)
let task = DFImageManager.imageTaskForRequest(request) { (image, _, _, _) -> Void in
    // The image is resized to fill the 100x100 px target size.
    // The resized image is stored in the memory cache, so the next request finishes synchronously.
    let fetchedImage = image   // use the resized image here
}
task.resume()
// NSURLSession stores the original data in NSURLCache.
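If you roll your own cache instead, a rough sketch of such a memory cache looks like this (the 200-image and 50 MB limits and the cost formula are assumptions; tune them for your app):

import UIKit

// A memory cache for decoded, display-sized images.
let imageCache: NSCache<NSString, UIImage> = {
    let cache = NSCache<NSString, UIImage>()
    cache.countLimit = 200                      // at most 200 thumbnails
    cache.totalCostLimit = 50 * 1024 * 1024     // ~50 MB of decoded bitmaps
    return cache
}()

func cache(_ image: UIImage, forKey key: String) {
    // Approximate decoded size: width * height in pixels, 4 bytes per pixel
    let cost = Int(image.size.width * image.scale * image.size.height * image.scale * 4)
    imageCache.setObject(image, forKey: key as NSString, cost: cost)
}

NSCache also evicts entries on its own when the system is low on memory, which is why it's preferable to a plain dictionary here.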

Related

Image optimisation using the Kingfisher library in iOS when loading an image?

I'm loading images into a list from a server. The UIImageView is very small (40 x 40), so ideally an image sized 40 x 40, or a little larger, should be loaded into it. However, the server is sending images at a much larger size (1920 x 1200).
When I profiled the view controller's memory, I noticed that it increases as the images load.
So my questions are:
Will the Kingfisher library download a smaller-size version of the image?
Will it abort the image loading if the image size is too big?
Is there any way I can specifically achieve the above?
Kingfisher: https://github.com/onevcat/Kingfisher
Will the Kingfisher library download a smaller size version of the image?
Neither Kingfisher nor any other library will download a smaller version of an image. The server has to send you a smaller version (a different URL, or the same URL with a parameter).
Will it abort the image loading if the image size is too big?
I don't know exactly how the library implements downloading, but unless you are talking about unreasonably big files (way bigger than the 1920 x 1200 resolution you expect), loading will not be aborted. Even then, it probably still won't be aborted; instead, the app may be killed by the system when it displays super-high-resolution images (again, way bigger than 1920 x 1200).
Is there any way I can specifically achieve the above?
If you want to display many images at once and are worried about their sizes, it's better to create thumbnails of them.
You can use Kingfisher to
1) first download the image,
2) then resize it,
3) then show it, and cache it if you need to.
Check this page to see a ton of code snippets for that library.
https://github.com/onevcat/Kingfisher/wiki/Cheat-Sheet
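As a rough sketch of that flow using the processor options from newer Kingfisher versions (the exact API names vary between releases, so treat this as an illustration rather than the canonical call): decode the download straight to the 40 x 40 display size, cache that, and keep the original bytes in the disk cache.

import UIKit
import Kingfisher

let imageView = UIImageView(frame: CGRect(x: 0, y: 0, width: 40, height: 40))
let url = URL(string: "https://example.com/large.jpg")!   // placeholder URL

let processor = DownsamplingImageProcessor(size: imageView.bounds.size)
imageView.kf.setImage(
    with: url,
    options: [
        .processor(processor),                 // resize while decoding
        .scaleFactor(UIScreen.main.scale),     // account for retina pixels
        .cacheOriginalImage                    // keep the full-size data in the disk cache
    ]
)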

UIImageJPEGRepresentation crashes in Share Extension on iPhone SE

I have a share extension that lets you crop an image and then upload it to our service. We call UIImageJPEGRepresentation to get the image's data before we upload it, but this causes a crash due to excessive memory use. It only happens with large images and, as far as we can tell, only on the iPhone SE, and didReceiveMemoryWarning is not called first. It happens when sharing from the Photos app.
Is there any way to safely call UIImageJPEGRepresentation, or to determine beforehand whether the image is too large?
Why not check the image's file size first? If it exceeds a certain quota, resize the image before uploading:
if let data = UIImagePNGRepresentation(image) {        // Data?, nil if encoding fails
    let imageSizeKB = Double(data.count) / 1024.0      // encoded size in KB
    // resize the image if imageSizeKB exceeds your quota
}
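A different way to keep the peak memory down in an extension (not the size check above, but a common alternative, sketched here with ImageIO) is to let CGImageSource produce a downscaled image directly from the original data, so the full-resolution bitmap is never decoded; you can then call UIImageJPEGRepresentation on the smaller image. The function name and parameter are illustrative.

import ImageIO
import UIKit

// Downscale straight from the encoded data; maxPixelSize is the longest edge in pixels.
func downsampledImage(from data: Data, maxPixelSize: CGFloat) -> UIImage? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
        kCGImageSourceCreateThumbnailWithTransform: true   // respect EXIF orientation
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}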

Memory issue in storing and retrieving images from a database

I am working on an application that includes a native SQLite database, in which I store and retrieve images to show in my application.
My problem: I am storing lots of images in a directory and saving their paths in the database. When I retrieve a path from the database and load the image into the application, memory increases by 10-20 MB per image.
I also tried storing the image data itself in the database, but I have the same issue: memory increases by 10-20 MB per image.
What should I do about this memory issue? Please help me with this.
Images, when used in the app, may require considerably more memory than the size of the asset in persistent storage would suggest. Assets are frequently compressed (e.g. JPG or PNG), but when you use an image it is uncompressed, often requiring 4 bytes per pixel (one byte each for red, green, blue, and alpha). For example, an iPhone 7 Plus full-screen retina image can require about 14 MB once you use it. So the memory-efficient technique is to employ lazy loading: don't create the UIImage objects until you absolutely need them. And, as Jerry suggested, because the amount of memory is determined by the size of the image, not of the image view in which you use it, if your images have dimensions greater than the UIImageView requires (i.e. the width and height of the image view times the "scale" of the device), you may want to resize the image accordingly.
It may be that the images you are trying to display are much larger than they need to be on screen. Try loading the image into memory, creating a version at the size you need, and using that; when the original image goes out of scope, its memory will be released. You could also implement some caching so you don't keep resizing the same images. If you always need your images at the same size, resize them before storing them in the database; but if you want to support multiple sizes (for different devices, perhaps), store the images at the largest size you need and resize for the others.
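To make the "image view size times device scale" point concrete, here is a small sketch (the function name is just illustrative): the bitmap only needs to be as large as the image view's size in pixels.

import UIKit

func image(_ source: UIImage, fittedTo imageView: UIImageView) -> UIImage? {
    let pointSize = imageView.bounds.size
    let scale = UIScreen.main.scale                  // 2.0 or 3.0 on retina devices
    // Drawing at pointSize with this scale yields a bitmap of pointSize * scale pixels,
    // e.g. a 40 x 40 view on a 3x device decodes to only 120 x 120 pixels.
    UIGraphicsBeginImageContextWithOptions(pointSize, false, scale)
    source.draw(in: CGRect(origin: .zero, size: pointSize))
    let resized = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return resized
}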

Camera App iOS memory usage

I'm making a camera app in iOS. Everything is fine except for the memory usage. This is the flow of the "takeImageTask":
AVCaptureStillImageOutput -> NSData -> UIImage (which I crop and do other changes to) -> save to ALAssetsLibrary
I need to keep the UIImages in an array because I need to show them in a UICollectionView, and I also have the option to show the images in a "big size" view.
This flow uses a lot of memory, so I thought a better solution might be to use the asset URL I get back from writeImageToSavedPhotosAlbum (after capturing with AVCaptureStillImageOutput) to fetch the images from the phone's storage. My app also has the option to fetch images from the photo album using UIImagePickerController, and I've noticed that this uses just a fraction of the memory compared to the "takeImageTask". I first made an Android version of the app, where I just stored all the URLs and used them to present images. Does anyone have experience using asset URLs to present images in iOS?
I need to keep the UIImages in an array because I need to show them in a UICollectionView, and I also have the option to show the images in a "big size" view.
This is flawed logic. You might want to keep minimal-size thumbnail images in memory for the collection view, but you should drop even these from memory before the count gets too big. The big images should be loaded from disk on demand, because that is relatively infrequent and relatively expensive (using the asset URL is a good plan for this).
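If you go the asset route, here is a hedged sketch using the Photos framework (the modern replacement for ALAssetsLibrary; the helper name is illustrative): ask the image manager for a bitmap at the cell's pixel size instead of holding full-resolution UIImages in an array.

import Photos
import UIKit

let manager = PHCachingImageManager()

func requestThumbnail(for asset: PHAsset, cellSize: CGSize, completion: @escaping (UIImage?) -> Void) {
    let pixelSize = CGSize(width: cellSize.width * UIScreen.main.scale,
                           height: cellSize.height * UIScreen.main.scale)
    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic      // fast degraded image first, better one later
    options.resizeMode = .fast
    manager.requestImage(for: asset,
                         targetSize: pixelSize,
                         contentMode: .aspectFill,
                         options: options) { image, _ in
        completion(image)                      // Photos keeps its own cache; don't retain full-size images
    }
}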

Display large images on iOS without precut tiles

I'm building a camera application that saves the image data to a single JPEG file in the sandbox. The images average about 2 MB in size.
Problem: I cannot display the images in a photo viewer, because having even a few images in memory throws memory warnings and makes scrolling through the images very slow.
I cannot split the image into tiles and save them to disk because that's even more expensive than displaying the single image. I tried splitting the image into tiles upon capture, but on the 5S it took, on average, 5.5 seconds to save all the tiles to disk, and it will only get worse as the app runs on older devices. This is very bad: what if the user exits the app in the middle of the save? I would have missing tiles and no uncompressed original file from which to regenerate them later.
Question: what's the best way to show a full-sized image without causing memory issues and while keeping the scrolling fast? Tons of camera applications on the App Store do this, and the Photos app does it, so there has to be a good solution.
Note: I'm currently showing a thumbnail of the image and then loading the full-size image from disk on another thread. Once the full-size image has loaded, I present it on the main thread. This removes the memory issues, because I only have one full-size image in memory at once (plus two thumbnails), but it still causes lag in the scroll view, because drawing the full-size image on the main thread is still pretty expensive.
I would greatly appreciate any input!
You could create a downsized thumbnail: generate a smaller image, save it in a different folder in your sandbox, and read that for browsing. Then load the original image only when the user wants to look at it full size.
One way to deal with this is to tile the image.
You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around pull out only the tiles you need to actually display. You only ever need 1 tile in memory at a time because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even having the whole image as tiles may relieve memory pressure as you don't need one large contiguous block.)
This is how applications like Photoshop deal with this situation.
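As a rough illustration of that idea (the 256-point tile size and the "tile_{row}_{col}.jpg" naming scheme are assumptions, not anything from the question): CATiledLayer calls draw(_:) once per visible tile, so only the tiles on screen are ever decoded.

import UIKit

final class TiledImageView: UIView {
    let tileSize = CGSize(width: 256, height: 256)   // in points
    var tileDirectory = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]

    override class var layerClass: AnyClass { return CATiledLayer.self }

    override init(frame: CGRect) {
        super.init(frame: frame)
        configureTiledLayer()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        configureTiledLayer()
    }

    private func configureTiledLayer() {
        let tiledLayer = layer as! CATiledLayer
        // CATiledLayer's tileSize is in pixels, so multiply by the screen scale
        tiledLayer.tileSize = CGSize(width: tileSize.width * UIScreen.main.scale,
                                     height: tileSize.height * UIScreen.main.scale)
    }

    override func draw(_ rect: CGRect) {
        // `rect` covers a single tile; work out which one and load only that file
        let col = Int(rect.minX / tileSize.width)
        let row = Int(rect.minY / tileSize.height)
        let tileURL = tileDirectory.appendingPathComponent("tile_\(row)_\(col).jpg")
        UIImage(contentsOfFile: tileURL.path)?.draw(in: rect)
        // The decoded tile goes out of scope here, so memory use stays bounded
    }
}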
A second approach I suggest is to check Apple's sample for processing large images, PhotoScroller, where the images have already been tiled. If you need an example of tiling an image in Cocoa, check out cimgf.com.
Hope this helps.
