I'm trying to optimize an iOS application that contains a lot of images and code. I have reduced the size of the images with some programs, but Instruments reveals that the application is still using between 70 and 90 MB of cached memory.
I have read that loading the resources (images) on demand and discarding them when they are no longer needed would be a good solution. How can I do it?
I also have a question:
When we use:
UIImage *aux = [UIImage imageNamed:@"image.png"];
and afterwards we write aux = nil;
is the image discarded from the cache?
Are only some of those images visible at a time? Write a system that loads only those images currently visible (and perhaps some that your application thinks might become visible soon). When you get a memory warning from the system, look for images you've loaded in the past that haven't been visible for a while and release them.
To answer your second question: yes, setting a reference to nil will release it IF you are using ARC (Automatic Reference Counting), and if the reference you set to nil is the only reference to that object. All references to an object must go away before it will be released; note that the imageNamed: cache holds its own reference, so the cached copy may stick around until the system decides to purge it.
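For the on-demand part of the question, a minimal sketch (the file name and the purge logic are placeholders, not a definitive implementation): imageWithContentsOfFile: skips the system cache, so under ARC the bitmap is freed as soon as your last strong reference goes away, and anything you do keep around can be purged when a memory warning arrives.

// Sketch only: "image.png" is a placeholder resource name.
// imageNamed: caches behind the scenes; imageWithContentsOfFile: does not.
NSString *path = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
UIImage *aux = [UIImage imageWithContentsOfFile:path]; // not added to the system cache

// ... display or draw the image ...

aux = nil; // under ARC, dropping the last strong reference frees the bitmap

// For images you keep alive yourself, release them when the system complains:
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
                // drop images that haven't been visible for a while
            }];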
I would look at some of the solutions available, such as Path's FastImageCache, and see if they meet your needs. FastImageCache stores uncompressed images on disk in a format similar to Sprite Sheets (used by 2D games) so they can be loaded quickly when needed. The emphasis here is on improving scrolling performance, so if that's not an issue for you, this might not be the right tool for the job.
You might also look at this thread, although this is aimed at caching web images.
You might also take a look at The Tumblr Image Cache
Related
I have been searching these threads and other sites but have not come across a way to do this both efficiently and memory friendly. And so, here is my story:
My iOS (iPad) app uses sprite sheets (a large image, such as 2k x 2k at 16 bpp, which is composed of many smaller sprites). I have created a sprite atlas class which manages these sheets, handles sprite animations, and provides other features.
The idea (in the load method) is to load in the sheet from the file system, split it apart into UIImages (one per sprite) using CGImageCreateWithImageInRect, and then dispose of the loaded sheet. Seems simple enough.
Note that the sheet is loaded into a UIImage by initWithContentsOfFile or imageNamed (more on that below).
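To illustrate that load-and-split step, here is a rough sketch; the sheet name, the fixed 128x128 sprite size, the assumption of a scale of 1, and the redraw-to-detach step are mine, not something from the original code:

// Sketch: split a sheet into standalone sprites, then let the sheet go.
NSString *path = [[NSBundle mainBundle] pathForResource:@"sheet" ofType:@"png"];
UIImage *sheet = [UIImage imageWithContentsOfFile:path]; // avoids the imageNamed: cache

NSMutableArray *sprites = [NSMutableArray array];
CGSize spriteSize = CGSizeMake(128, 128);
for (CGFloat y = 0; y < sheet.size.height; y += spriteSize.height) {
    for (CGFloat x = 0; x < sheet.size.width; x += spriteSize.width) {
        // rect is in pixels; multiply by sheet.scale if the scale is not 1
        CGRect rect = CGRectMake(x, y, spriteSize.width, spriteSize.height);
        CGImageRef subRef = CGImageCreateWithImageInRect(sheet.CGImage, rect);

        // Redraw into a fresh context so the sprite does not hold on to the
        // sheet's backing data (assumption: this detaches it; verify in Instruments).
        UIGraphicsBeginImageContextWithOptions(spriteSize, NO, sheet.scale);
        [[UIImage imageWithCGImage:subRef] drawInRect:
            CGRectMake(0, 0, spriteSize.width, spriteSize.height)];
        UIImage *sprite = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        CGImageRelease(subRef);

        [sprites addObject:sprite];
    }
}
sheet = nil; // under ARC the big sheet can now be released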
The desire is to save file memory by using sprite sheets, and then save runtime memory by only retaining the actual sprites themselves as UIImages. In my experiments thus far this is what I find happening:
If I use initWithContentsOfFile I see (from Instruments) that it appears to do a file open, fstat64, and close on the file for EACH sprite in it. This takes a horrendous amount of time to load all the sprites. It actually seems to load the entire sheet, grab one sprite, close the sheet, then load the entire sheet again for the next sprite, until they are all created. It also appears to consume lots of memory (proven by the "received memory warning" after just the second sheet loaded, as well as by the Allocations instrument).
Next I tried imageNamed (which caches the sheet). The file loading occurs once, so it is MUCH faster. All seems good, and in fact I can go on until many sheets are loaded. But eventually the dreaded "received memory warning" appears... and a few seconds later the app crashes. It appears that the cached image (even though the pointer is set to nil after pulling out the sprites) never goes away. I have read several posts that also state this seems to be its behavior (although other posts say otherwise, so that is not conclusive - does anyone know definitively?).
And so it appears that neither method is what is needed. What I want is to load the file, pull out each sprite into its own UIImage, then have the UIImage for the big sheet file released completely.
I have read one site that talked about using the initWithContentsOfFile approach to set up their own caching system (rather than trusting the iOS caching behind imageNamed) so they can release the image when desired. However, I don't think they had sprite sheets in mind.
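For reference, a rough sketch of what such an app-controlled cache could look like; the class and method names are hypothetical, and NSCache is used so everything can be evicted whenever you decide (for example on a memory warning, or right after a sheet has been split):

#import <UIKit/UIKit.h>

// Hypothetical helper: caches decoded images by path but stays under app control.
@interface SheetCache : NSObject
+ (instancetype)sharedCache;
- (UIImage *)imageAtPath:(NSString *)path;
- (void)purge;
@end

@implementation SheetCache {
    NSCache *_cache;
}

+ (instancetype)sharedCache {
    static SheetCache *shared = nil;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ shared = [[SheetCache alloc] init]; });
    return shared;
}

- (instancetype)init {
    if ((self = [super init])) {
        _cache = [[NSCache alloc] init];
    }
    return self;
}

- (UIImage *)imageAtPath:(NSString *)path {
    UIImage *image = [_cache objectForKey:path];
    if (!image) {
        image = [UIImage imageWithContentsOfFile:path]; // no system-level caching
        if (image) [_cache setObject:image forKey:path];
    }
    return image;
}

- (void)purge {
    [_cache removeAllObjects]; // e.g. on a memory warning, or once a sheet is split
}
@end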
And so, I turn this over to the experts out there to see if there are some ideas on how to get both fast load times AND use minimal memory.
[And yes, I know that iOS 7 has SpriteKit. But this needs to also work on iOS 6.x.]
One interesting data point is that on an original iPad the imageNamed version actually works fine with no "received memory warning". That might have been iOS 5.x. But the app will crash on an iPad 2 device.
I am not including code here because what I am after is an understanding of the mechanics involved with how memory is used with these functions related to image handling.
And while I am at it, can someone please clarify this point:
True or False: when using CGImageCreateWithImageInRect, it actually allocates new memory for a bitmap the size of the specified rectangle and COPIES the pixels from the original UIImage into that new memory (as opposed to setting up the bitmap format and having a pointer point into the original UIImage's pixel data). I think this is True, but want verification.
Thanks!
I'm working on an iPad-only iOS app that essentially downloads large, high quality images (JPEG) from Dropbox and shows the selected image in a UIScrollView and UIImageView, allowing the user to zoom and pan the image.
The app is mainly used for showing the images to potential clients who are interested in buying them as framed prints. The way it works is that the image is first shown, zoomed and panned to show the potential client if they like the image. If they do like it, they can decide if they want to crop a specific area (while keeping to specific aspect ratios/sizes) and the final image (cropped or not) is then sent as an email attachment to production.
The problem I've been facing for a while now is that even though the app will only be running on new iPads (i.e. more memory, etc.), I'm unable to find a method of handling the images so that the app doesn't get a memory warning and then crash.
Most of the images are sized 4256x2832, which brings the memory usage to at least 40 MB per image once decompressed (4256 × 2832 pixels × 4 bytes per pixel is roughly 48 MB). While I'm only displaying one image at a time, image cropping (which is the main memory/crash problem at the moment) creates a new cropped image, which in turn momentarily bumps the app's total RAM usage to about 120 MB, causing a crash.
So in short: I'm looking for a way to manage very large images, have the ability to crop them and after cropping still have enough memory to send them as email attachments.
I've been thinking about implementing a singleton image manager, which all the views would use and it would only contain one big image at a time, but I'm not sure if that's the right way to go, or even if it'd help in any way.
One way to deal with this is to tile the image. You can save the large decompressed image to "disk" as a series of tiles, and as the user pans around pull out only the tiles you need to actually display. You only ever need 1 tile in memory at a time because you draw it to the screen, then throw it out and load the next tile. (You'll probably want to cache the visible tiles in memory, but that's an implementation detail. Even having the whole image as tiles may relieve memory pressure as you don't need one large contiguous block.) This is how applications like Photoshop deal with this situation.
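A rough sketch of the tiling idea under a few assumptions: bigImage is the already-decompressed UIImage, tiles are 512 pixels square at a scale of 1, and the caches directory and file naming are just for illustration.

// Sketch: write 512x512 tiles of a large image to the caches directory.
NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                           NSUserDomainMask, YES) objectAtIndex:0];
CGFloat tileSize = 512;
for (CGFloat y = 0; y < bigImage.size.height; y += tileSize) {
    for (CGFloat x = 0; x < bigImage.size.width; x += tileSize) {
        CGRect rect = CGRectMake(x, y,
                                 MIN(tileSize, bigImage.size.width - x),
                                 MIN(tileSize, bigImage.size.height - y));
        CGImageRef tileRef = CGImageCreateWithImageInRect(bigImage.CGImage, rect);
        NSString *tilePath = [cachesDir stringByAppendingPathComponent:
            [NSString stringWithFormat:@"tile_%.0f_%.0f.png", x, y]];
        [UIImagePNGRepresentation([UIImage imageWithCGImage:tileRef])
            writeToFile:tilePath atomically:YES];
        CGImageRelease(tileRef);
    }
}
bigImage = nil; // the big decompressed bitmap is no longer needed in memory

// Later, when the user pans a given tile into view:
UIImage *visibleTile = [UIImage imageWithContentsOfFile:
    [cachesDir stringByAppendingPathComponent:@"tile_0_0.png"]];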
I ended up sort of solving the problem. Since I couldn't resize the original files in Dropbox (the client has their reasons), I went ahead and used BOSImageResizeOperation, which is essentially just a fast, thread-safe library for resizing images.
Using this library, I noticed that images that previously took 40-60 MB of memory each now only seemed to take roughly half that. Additionally, the resizing is so quick that the original image gets released from memory fast enough that iOS doesn't issue a memory warning.
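I can't reproduce BOSImageResizeOperation's API from memory here, but as an alternative illustration of the same idea, ImageIO can produce a smaller image without first decoding the full-size bitmap; a sketch (pathToLargeJPEG and the 2048-pixel limit are placeholders):

#import <ImageIO/ImageIO.h>   // link ImageIO.framework

// Sketch: downscale a large JPEG on disk to at most 2048px on its longest side.
NSURL *fileURL = [NSURL fileURLWithPath:pathToLargeJPEG]; // placeholder path variable
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
NSDictionary *options = @{
    (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
    (id)kCGImageSourceCreateThumbnailWithTransform   : @YES, // respect EXIF orientation
    (id)kCGImageSourceThumbnailMaxPixelSize          : @2048
};
CGImageRef scaledRef = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                                           (__bridge CFDictionaryRef)options);
UIImage *scaled = [UIImage imageWithCGImage:scaledRef];
CGImageRelease(scaledRef);
CFRelease(source);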
With this, I've gotten further with the app and I appreciate all the ideas, suggestions and comments. I'm hoping this will get the app done and I can get as far away from large image handling as possible, heh.
In its simplest form, my app displays 10 UIImageViews, each containing an image. Even with all UIImageViews containing images, my app uses a small enough memory footprint. However, there is a button to clear all the UIImageViews by setting all their images to nil. The problem is, when checking Memory Monitor in Instruments, the memory held by the UIImageViews is NOT going away. This doesn't appear in the Allocations instrument, confirming the remaining memory footprint is not an object, but instead graphics-based memory. If I resize the images to something smaller or larger, the memory remaining is also smaller or larger, respectively.
Why is the image data sticking around after the UIImageView's image has been set to nil?
I believe UIKit keeps a cache of images for reuse. UIImageView might be releasing the object, but a copy is kept around for performance reasons.
These images, though, should be released on receiving a memory warning. If they're not, there are two places I'd check:
Make sure the UIImageView is being dealloc'd. Use Allocations Instrument to profile your app and do whatever you need to do in the program to load those images. Then unload the images and do a search for UIImageView. As long as you're sure your program should have released all of them, if you find any in the search you know something is wrong.
I'd also check any places the image was created, for example: UIImage *image = [UIImage imageNamed:@"Foo.jpg"]; Make sure these are also being released. You can use Allocations to find UIImage instances, but it'll be harder to weed out the ones that should/should not be there.
Run the static analyzer: in Xcode 4 it's under Product -> Analyze. This is a great tool for finding logic errors, over/under-releases (if you're not using ARC), etc.
Until the UIImageViews themselves are released, their memory will remain allocated. Additionally, if you're using convenience methods on UIImage to obtain your images, e.g.:
UIImage *myImage = [UIImage imageNamed:@"myImage"];
note that your image may be cached behind the scenes by iOS, so even if the image has been released by you, the memory footprint may still reflect the presence of the image in memory (eventually iOS will release it, so this shouldn't adversely impact your resource consumption).
From the UIImage documentation:
In low-memory situations, image data may be purged from a UIImage object to free up memory on the system. This purging behavior affects only the image data stored internally by the UIImage object and not the object itself. When you attempt to draw an image whose data has been purged, the image object automatically reloads the data from its original file.
What about UIImages that were not loaded from a file, but instead drawn, say, with UIGraphicsGetImageFromCurrentImageContext()?
I'm trying to come up with ways to optimize the memory usage of UITableViewCells with UIImageViews containing UIImages as the cells enter and are pulled from the reuse queue.
Thoughts?
Mike,
My understanding is that the CGImage data is gone, so I think (this is for your custom-drawn image point) you are out of luck.
I actually just dealt with a similar issue with UITableViews. One thing that I did for performance was to create cells with a Nib; this was the single largest boost in performance of all the things I did, so if you are not using a Nib consider it.
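If you're on iOS 5 or later, the nib-based cell setup looks roughly like this (the nib name and reuse identifier are placeholders):

// Register the nib once, e.g. in viewDidLoad; the table view then
// instantiates and recycles cells from it.
UINib *cellNib = [UINib nibWithNibName:@"PhotoCell" bundle:nil];
[self.tableView registerNib:cellNib forCellReuseIdentifier:@"PhotoCell"];

// In tableView:cellForRowAtIndexPath:
UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"PhotoCell"];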
You might also consider some form of preloading if you have that much data. I don't know what you are trying to implement, so this may or may not be applicable.
One other note: after UIImages are purged, reloading them from their files is a significant memory hit, so if you are at that point you really need to look at your memory usage overall.
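One pattern the answer above doesn't show, but which speaks to the reuse-queue part of the question, is clearing the image as a cell is recycled; a minimal sketch in a UITableViewCell subclass:

// Let go of the bitmap as soon as the cell leaves the screen
// and returns to the reuse queue.
- (void)prepareForReuse {
    [super prepareForReuse];
    self.imageView.image = nil; // the image is re-set in cellForRowAtIndexPath
}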
When developing a mobile app that lets the user take photos (which will later be shown at full size, but are also viewed in table views at mid size and even in the Google Maps pin title view), should I create thumbnails for every image the user takes to use in the smaller views, or should I just use the regular image?
I am asking because, from the tutorials I saw, and as a web developer, all I could figure out is that when using a web service to get groups of small images you usually get the thumbnails first and only fetch the full-size image when needed.
But this is an embedded app (I know it is not really embedded, but I don't have a better way to describe it) where all the data sits on the device, so there are no upload performance issues, just memory and processor time issues (loading and viewing the big HD photos that today's cameras take is very heavy, I think).
Anyway, what is the best practice for this?
Thank you,
Erez
It's all about memory usage balanced with performance. If you don't create thumbnails for each photo, there are only so many photos you can hold in memory before you receive memory warnings or have your app terminated by the system (maybe only 6-8 full-size UIImages). To avoid that, you might write the photos out to the file system and keep a reference to their location. But then your tableview scrolling will suffer as it attempts to read photos from the file system for display.
So the solution is to create thumbnails for each photo so that you can store a lot of them in memory without any troubles. Your tableview will perform well as the photos are quickly accessible from memory. You'll also want to write the full size photos to the file system (and keep a reference to their location) to avoid having to store them in memory. When it's time to display the full size image, retrieve it from the file system and store it in memory. When it's no longer needed, release it.
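A sketch of that flow, with the thumbnail size, JPEG quality, and method name as placeholder choices of mine:

// Hypothetical helper: makes a thumbnail for list views and writes the
// full-size photo to disk so only the path needs to stay in memory.
- (UIImage *)storePhoto:(UIImage *)photo toPath:(NSString *)photoPath {
    // 1. Full-size photo goes to the file system.
    [UIImageJPEGRepresentation(photo, 0.9) writeToFile:photoPath atomically:YES];

    // 2. Small thumbnail for table views / map pins (size is an assumption).
    CGSize thumbSize = CGSizeMake(120, 120);
    UIGraphicsBeginImageContextWithOptions(thumbSize, YES, 0.0);
    [photo drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
    UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return thumbnail; // keep this in memory; reload the full image from photoPath when needed
}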
I'm assuming that you're on iOS 4 and you are saving the photos in the asset library; in that case there is already a method for you.
http://developer.apple.com/library/ios/#documentation/AssetsLibrary/Reference/ALAsset_Class/Reference/Reference.html
You're looking for the "thumbnail" method.
So, saving the large image and fetching the thumbnail when required is, I believe, the way to go.
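For example, pulling the prebuilt thumbnail out of the assets library looks roughly like this (assetURL is the NSURL you kept when the photo was saved; a sketch, not a drop-in solution):

#import <AssetsLibrary/AssetsLibrary.h>

// Sketch: fetch the system-generated thumbnail for a saved photo.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL   // placeholder: the URL returned when the photo was saved
         resultBlock:^(ALAsset *asset) {
             UIImage *thumb = [UIImage imageWithCGImage:[asset thumbnail]];
             // use thumb in your table view or pin callout
         }
        failureBlock:^(NSError *error) {
             NSLog(@"Could not load asset: %@", error);
        }];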