Reducing memory usage with UIImagePickerController (iOS)

In my app, the user can take multiple images using the UIImagePickerController, and those images are then displayed one by one in the view.
I've been having some trouble with memory management. With cameras on today's phones quickly rising in megapixels, UIImages returned from UIImagePickerController are memory hogs. On my iPhone 4S, each UIImage is around 5 MB; I can hardly imagine what they're like on newer and future models.
A friend of mine said that the best way to handle UIImages was to immediately save them to a JPEG file in my app's document directory and to release the original UIImage as soon as possible. So this is what I've been trying to do. Unfortunately, even after saving the UIImage to a JPEG and leaving no references to it in my code, it is not being deallocated.
Here are the relevant sections of my code. I am using ARC.
// Entry point: UIImagePickerController delegate method
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Process the image. The method returns a pathname.
    NSString *path = [self processImage:[info objectForKey:UIImagePickerControllerOriginalImage]];

    // Add the image to the view.
    [self addImage:path];
}
- (NSString *)processImage:(UIImage *)image {
    // Get a file path.
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *filename = [self makeImageFilename]; // implementation omitted
    NSString *imagePath = [documentsDirectory stringByAppendingPathComponent:filename];

    // Get the image data (blocking; around 1 second).
    NSData *imageData = UIImageJPEGRepresentation(image, 0.1);

    // Write the data to a file.
    [imageData writeToFile:imagePath atomically:YES];

    // Upload the image (non-blocking).
    [self uploadImage:imageData withFilename:filename];

    return imagePath;
}
- (void)uploadImage:(NSData *)imageData withFilename:(NSString *)filename {
    // This sends the upload job (implementation omitted) to a thread
    // pool, which in this case is managed by PhoneGap.
    [self.commandDelegate runInBackground:^{
        [self doUploadImage:imageData withFilename:filename];
    }];
}
- (void)addImage:(NSString *)path {
    // Implementation omitted: make a UIImageView (set bounds, etc.) and
    // save it in the variable iv.
    iv.image = [UIImage imageWithContentsOfFile:path];
    [iv setNeedsDisplay];
    NSLog(@"Displaying image named %@", path);
    self.imageCount++;
}
Notice how the processImage method takes a reference to a UIImage, but it uses it for only one thing: making the NSData* representation of that image. So, after the processImage method is complete, the UIImage should be released from memory, right?
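One ARC subtlety worth noting at this point: objects like the UIImage handed to the delegate and the NSData created in processImage are typically autoreleased, so they are reclaimed not the instant the method returns but when the run loop drains its autorelease pool. A minimal sketch of forcing an earlier drain (an editorial illustration, not the fix identified in the answer below):

// Sketch: drain autoreleased temporaries before the delegate returns,
// rather than waiting for the run loop's pool.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    @autoreleasepool {
        NSString *path = [self processImage:[info objectForKey:UIImagePickerControllerOriginalImage]];
        [self addImage:path];
    }
}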
What can I do to reduce the memory usage of my app?
Update
I now realize that a screenshot of the allocations profiler would be helpful for explaining this question.

Your processImage method is not your problem.
We can test your image-saving code by transplanting it into Apple's PhotoPicker demo app.
Conveniently, Apple's sample project is very similar to yours, with a method for taking repeated pictures on a timer. In the sample, the images are not saved to the filesystem but accumulate in memory, and the code ships with this warning:
/*
Start the timer to take a photo every 1.5 seconds.
CAUTION: for the purpose of this sample, we will continue to take pictures indefinitely.
Be aware we will run out of memory quickly. You must decide the proper threshold number of photos allowed to take from the camera.
One solution to avoid memory constraints is to save each taken photo to disk rather than keeping all of them in memory.
In low memory situations sometimes our "didReceiveMemoryWarning" method will be called in which case we can recover some memory and keep the app running.
*/
With your method added to Apple's code, we can address this issue.
The imagePicker delegate method looks like this:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
    [self.capturedImages removeAllObjects];                 // (1)
    [self.imagePaths addObject:[self processImage:image]];  // (2)
    [self.capturedImages addObject:image];

    if ([self.cameraTimer isValid])
    {
        return;
    }

    [self finishAndUpdate];  // (3)
}
(1) - our addition, to flush the live images from memory on each capture event.
(2) - our addition, to save each image to the filesystem and build a list of filesystem paths.
(3) - in our tests we use the cameraTimer to take repeated images, so finishAndUpdate never gets called.
I have used your processImage: method as is, with the line:
[self uploadImage:imageData withFilename:filename];
commented out.
I have also added a small makeImageFilename method:
static int imageName = 0;

- (NSString *)makeImageFilename {
    imageName++;
    return [NSString stringWithFormat:@"%d.jpg", imageName];
}
These are the only additions I have made to Apple's code.
Here is the memory footprint of Apple's original code (cameraTimer run without (1) and (2)): memory climbed to ~140 MB after capturing ~40 images.
Here is the memory footprint with the additions (cameraTimer run with (1) and (2)): the file-saving method fixes the memory issue. Memory stays flat, with transient spikes of ~30 MB per image capture.
These tests were run on an iPhone 5S. Uncompressed images are 3264 x 2448 px, which works out to around 24 MB (24-bit RGB). The JPEG-compressed (filesystem) size ranges from ~250 KB (0.1 quality, as in your code) through 1-2 MB (0.7 quality) up to ~6 MB (1.0 quality).
In a comment to your question, you suggest that a re-loaded image will benefit from that compression. This is not the case: when an image is loaded into memory, it must first be uncompressed. Its memory footprint will be approximately pixels x colours x bit depth per colour, regardless of how the image is stored on disk. As jrturton has pointed out, this at least suggests that you should avoid loading an image at a greater resolution than you need for display. Say you have a full-screen (retina) imageView of 832 x 640: if your user cannot zoom in, you are wasting memory by loading anything larger than that. The live footprint would then be ~1.6 MB, a huge improvement on your 24 MB original (but this is a digression from your main issue).
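Continuing that digression briefly: if you only need display resolution, ImageIO can decode a downsampled version directly from the file, so the full 24 MB bitmap never exists in memory. A minimal sketch (an editorial illustration, not part of the original answer; the method name is made up):

#import <ImageIO/ImageIO.h>

// Load a downsampled UIImage whose longest side is at most maxPixelSize.
// The full-resolution bitmap is never decoded.
- (UIImage *)downsampledImageAtPath:(NSString *)path maxPixelSize:(CGFloat)maxPixelSize
{
    NSURL *url = [NSURL fileURLWithPath:path];
    CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (!source) return nil;
    NSDictionary *options = @{ (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                               (id)kCGImageSourceCreateThumbnailWithTransform   : @YES, // honour EXIF orientation
                               (id)kCGImageSourceThumbnailMaxPixelSize          : @(maxPixelSize) };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!scaled) return nil;
    UIImage *result = [UIImage imageWithCGImage:scaled];
    CGImageRelease(scaled);
    return result;
}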
As processImage doesn't seem to be the cause of your memory trouble, you should look at other possibilities:
1/ You don't have a memory issue. How are you profiling the app?
2/ One of addImage or uploadImage is retaining memory. Try commenting each out in turn to identify which.
3/ The problem is elsewhere (something managed by PhoneGap?)
As regards those memory spikes: they are caused by the image-to-data JPEG compression line:
NSData* imageData = UIImageJPEGRepresentation(image, 0.1);
Under the hood that is ImageIO, and it is probably unavoidable when using UIImagePickerController. See here: Most memory efficient way to save a photo to disk on iPhone? If you switch to AVFoundation, you can get at the image as unconverted NSData and so avoid the spike.
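For illustration, a rough sketch of that AVFoundation route (assuming an AVCaptureStillImageOutput already attached to a running AVCaptureSession; self.stillImageOutput and imagePath are stand-ins, not names from the original post):

#import <AVFoundation/AVFoundation.h>

// Sketch: get hardware-encoded JPEG bytes without ever inflating a UIImage,
// so there is no UIImageJPEGRepresentation spike.
AVCaptureConnection *connection =
    [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (!sampleBuffer) return;
    NSData *jpegData =
        [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
    [jpegData writeToFile:imagePath atomically:YES]; // compressed data straight to disk
}];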

Related

Why the UIImage is not released by ARC when I used UIGraphicsGetImageFromCurrentImageContext inside of a block

I'm trying to download an image from a server using NSURLSessionDownloadTask (an iOS 7 API), and inside the completion block I want to resize the original image and store it locally. So I wrote a helper method that creates a bitmap context, draws the image, and then gets the new image from UIGraphicsGetImageFromCurrentImageContext(). The problem is that the image is never released each time I do this. However, if I skip the context and image drawing, things work fine and there is no memory growth. There is no CGImageCreate/Release call, so there is really nothing to release manually, and nothing is fixed by adding @autoreleasepool here. Is there any way to fix this? I really want to modify the original image after downloading it and before storing it.
Here are some snippets for the issue:
[self fetchImageByDownloadTaskWithURL:url completion:^(UIImage *image, NSError *error) {
    UIImage *modifiedImage = [image resizedImageScaleAspectFitToSize:imageView.frame.size];
    // save to local disk
    // ...
}];
// This is the resize method in a UIImage category
- (UIImage *)resizedImageScaleAspectFitToSize:(CGSize)size
{
    CGSize imageSize = [self scaledSizeForAspectFitToSize:size];
    UIGraphicsBeginImageContextWithOptions(imageSize, YES, 0.0);
    CGRect imageRect = CGRectMake(0.0, 0.0, imageSize.width, imageSize.height);
    [self drawInRect:imageRect]; // nothing changes if this is made weakSelf
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
Update:
When I dug into this with the Allocations instrument, I found that the memory growth is related to "VM: CG raster data". In my storing method I use an NSCache as a memory cache for photos before storing them persistently, and the raster data eats a lot of memory when the memory cache is in use. It seems that once the rendered image is cached, all of its drawing data also stays alive in memory until I release every cached image. If I don't memory-cache the image, none of the raster data coming from my image category method stays alive. I just can't figure out why the drawing data is not released once the image has been cached. Shouldn't it be released after drawing?
Further update:
I still haven't figured out why the raster data is not released while the image used for drawing is alive, and Analyze reports no warnings about it. So I guess I just have to avoid caching the huge source image used for drawing at large sizes, and remove cached drawing images as soon as I no longer need them. If I call [UIImage imageNamed:] and draw with the result, it seems never to be released (raster data included), since the image is cached by the system, so I called [UIImage imageWithContentsOfFile:] instead. With that change, memory behaves well. The remaining growth is something the Allocations instrument calls "non-object", which I haven't identified yet. Simulating a memory warning does release the system-cached images created by [UIImage imageNamed:]; as for the raster data, I will run some more tests tomorrow.
Try making your category method a class method instead. Perhaps the leak is the original CGImage data which you are overwriting when you call [self drawInRect:imageRect];.
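For concreteness, a sketch of that suggestion (the same drawing logic, with the source image passed as an explicit parameter instead of self; whether it actually releases the raster data would need profiling):

// Sketch of the suggested class-method variant of the category method.
+ (UIImage *)resizedImage:(UIImage *)sourceImage scaleAspectFitToSize:(CGSize)size
{
    CGSize imageSize = [sourceImage scaledSizeForAspectFitToSize:size];
    UIGraphicsBeginImageContextWithOptions(imageSize, YES, 0.0);
    [sourceImage drawInRect:CGRectMake(0.0, 0.0, imageSize.width, imageSize.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}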

How to fix this memory leak while drawing on a UIImage?

Addendum to the question below.
We have traced the growth in allocated memory to an NSMutableArray that points at a list of UIImages. The NSMutableArray is local to a method and has no outside pointers, strong or weak, pointing at it. Because the NSMutableArray is local to a method, shouldn't it, and all the objects it points to, be automatically deallocated at some point after the method returns?
How do we ensure that happens?
=================
(1) First, does calling this code cause a memory leak, or should we be looking elsewhere?
(It appears to us that this code does leak: when we look at Apple's Instruments, running this code seems to create a string of 1.19 MB mallocs from CVPixelBuffer, and skipping the code avoids that. Additionally, the malloc'd size continually creeps up across the execution cycle and never seems to be reclaimed. Adding an @autoreleasepool decreased peak memory use and helped keep the app from crashing, but there is a steady increase in baseline memory use, with the biggest culprit being these 1.19 MB mallocs.) image2 is an existing UIImage.
image2 = [self imageByDrawingCircleOnImage:image2 withX:newX withY:newY withColor:color];
- (UIImage *)imageByDrawingCircleOnImage:(UIImage *)image withX:(int)x withY:(int)y withColor:(UIColor *)color
{
    UIGraphicsBeginImageContext(image.size);
    [image drawAtPoint:CGPointZero];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    [color setStroke];
    CGRect shape = CGRectMake(x - 10, y - 10, 20, 20);
    shape = CGRectInset(shape, 0, 0);
    CGContextStrokeEllipseInRect(ctx, shape);
    UIImage *retImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return retImage;
}
(2) Second, if this code does leak, how do we prevent the leak and, more importantly, prevent a crash from memory shortage when we call this method many times in rapid succession? We notice memory use surging as we call the method repeatedly, which eventually leads to a crash. The question is how to ensure the rapid freeing of the discarded UIImages so that the app doesn't run out of memory.
running this code seems to create a string of 1.19MB mallocs from CVPixelBuffer
But do not make the mistake of calling memory use a memory leak. It's a leak only if the used memory can never be reclaimed. You have not proved that.
Lots of operations use memory, but that doesn't matter if the operation is performed once, because then your code ends and the memory is reclaimed.
Issues arise only if your code keeps going, possibly looping, so that there is never a chance for the memory to be reclaimed; in that situation, you can provide such a chance by wrapping each iteration of the loop in an @autoreleasepool block.
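Concretely, applied to the method in the question (a sketch; the points array is hypothetical):

// Sketch: each pass through the loop gets its own pool, so the discarded
// intermediate UIImages are released before the next iteration allocates more.
for (NSValue *pointValue in points) {
    @autoreleasepool {
        CGPoint p = [pointValue CGPointValue];
        image2 = [self imageByDrawingCircleOnImage:image2 withX:p.x withY:p.y withColor:color];
    }
}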
We found the leak elsewhere: we needed to release a pixel buffer. We were creating a CVPixelBuffer from a CGImage and appending the buffer to an AVAssetWriterInputPixelBufferAdaptor, but it was never released.
After this code which created the buffer:
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 480, 640,
                                      kCVPixelFormatType_32ARGB,
                                      (__bridge CFDictionaryRef)options,
                                      &pxbuffer);
...and this code which appended it to an AVAssetWriter:
[adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
...we needed to add this release code per this SO answer:
CVPixelBufferRelease(buffer);
After that code was added, the memory footprint of the app stayed constant.
Additionally, we added @autoreleasepool { } blocks at several points in the video-writing code; the memory usage spikes flattened out, which also stabilized the app.
Our simple conclusion is that SO should get a Nobel prize.

What is the advantage of imageNamed?

I know that loading an image like this
UIImage *image = [UIImage imageNamed:@"img"];
will cache the image, while loading it like this will not:
UIImage *image = [UIImage imageWithContentsOfFile:@"img.png"];
People say it will be faster to access the cached image because iOS accesses it from memory, avoiding the overhead of reading and decoding the file. OK, I see that, but suppose I use the second, non-cached method to load the image into a view that is a property, like this:
UIImage *image = [UIImage imageWithContentsOfFile:@"img.png"];
self.imageView = [[UIImageView alloc] initWithImage:image];
Isn't the image already in memory? If I want to access it, I simply use imageView.image and get it from memory.
I am probably tired, but I cannot imagine a single use for the cached version, or else I am not understanding what this cache means.
Care to explain? Thanks.
Imagine that your image is some icon that you use in 7 different view controllers... You could either load the image once and then pass it to each VC or you could use imageNamed... Your choice.
From the documentation:
This method looks in the system caches for an image object with the specified name and returns that object if it exists. If a matching image object is not already in the cache, this method locates and loads the image data from disk or the asset catalog, and then returns the resulting object. You can not assume that this method is thread safe.
Let's say you have an image in your app. When you need to use the image, you use this code:
UIImage *image = [UIImage imageWithContentsOfFile:@"img.png"];
iOS will then look for your image in the app bundle, load it into memory, and then decode it into the UIImage.
However, say you need 10 different objects to use the image, and you load it like this:
for (ClassThatNeedsImage *object in objects) {
    object.image = [UIImage imageWithContentsOfFile:@"img.png"];
}
(This isn't the best example since you could just load the image once and pass it to each of the objects. However, I have had more complex code where that is not an option.)
iOS will then look for the image 10 times, load it into memory 10 times, and then decode it 10 times. However, if you use imageNamed:
for (ClassThatNeedsImage *object in objects) {
    object.image = [UIImage imageNamed:@"img"];
}
From Wikipedia:
In computing, a cache is a component that transparently stores data so that future requests for that data can be served faster.
The cache used by UIImage is stored in memory, which is much faster to access than the disk.
The first time through the loop, iOS looks in the cache to see if the image is stored there. Assuming you haven't loaded this image with imageNamed previously, it doesn't find it, so it looks for the image, loads it into memory, decodes it, and then copies it into the cache.
On the other iterations, iOS looks in the cache, finds the image, and copies it into the UIImage object, so it doesn't have to do any hard disk access at all.
If you are only going to use the image once in the lifetime of your app, use imageWithContentsOfFile:. If you are going to use the image multiple times, use imageNamed:.
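If you want the speed of a cache but more control than imageNamed: gives you, a common middle ground (an editorial sketch, not from the original answer) is a small NSCache wrapper; NSCache evicts its contents automatically under memory pressure:

// Sketch: a hand-rolled image cache built on imageWithContentsOfFile:.
+ (UIImage *)cachedImageWithPath:(NSString *)path
{
    static NSCache *cache;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        cache = [[NSCache alloc] init];
    });
    UIImage *image = [cache objectForKey:path];
    if (image == nil) {
        image = [UIImage imageWithContentsOfFile:path];
        if (image) [cache setObject:image forKey:path];
    }
    return image;
}

Unlike the imageNamed: cache, you can call removeAllObjects on this one whenever you want the memory back.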

Core Graphics raster data not releasing from memory

So I'm getting my app to take a screenshot and save it to the photo album with the code below...
- (void)save {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(theImage, nil, NULL, NULL);
    NSData *theImageData = UIImageJPEGRepresentation(theImage, 1.0);
    [theImageData writeToFile:@"image.jpeg" atomically:YES];
}
How can I release the memory allocated by Core Graphics that is holding the screenshot raster data?
My project uses ARC for memory management. When profiling the app's allocations, I've noticed that memory is not released after taking the screenshot, causing the app to grow sluggish over time. The Allocation Summary in Instruments tells me the category is 'CG raster data' and the responsible caller is 'CGDataProviderCreateWithCopyOfData'.
Is there a solution involving CFRelease()?
This is my first app, so I'm pretty new to this; I've looked around the internet trying to resolve the issue, with no luck...
You could try wrapping the contents of your method in an @autoreleasepool block.
@autoreleasepool {
    ...
}
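Applied to the save method above, that would look something like this (a sketch; whether it frees the raster data also depends on what else retains theImage):

- (void)save {
    @autoreleasepool {
        UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
        [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        UIImageWriteToSavedPhotosAlbum(theImage, nil, NULL, NULL);
        NSData *theImageData = UIImageJPEGRepresentation(theImage, 1.0);
        [theImageData writeToFile:@"image.jpeg" atomically:YES];
        // Autoreleased temporaries (theImage, theImageData) are drained here.
    }
}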

Resize image data on iPhone

I get image data from an NSURLConnection and store it in an NSMutableData. The point is that I want to store a smaller version of the image.
Now I use the following code to get a smaller image:
UIImage *temp = [[UIImage alloc] initWithData:imageData];
image = [temp resizedImage:CGSizeMake(72, 72) interpolationQuality:kCGInterpolationHigh];
smallImageData = UIImagePNGRepresentation(image);
My question is: can I get smallImageData directly from imageData?
As it stands, this method seems costly when I have a number of connections fetching images.
This is a correct way to downsize images. If it runs in a tight loop, you can wrap each pass in an autorelease pool to make sure temporary objects are released promptly.
If you are running into memory issues, serialize your requests so they run one at a time.
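As for getting smallImageData directly from imageData: in principle, yes. ImageIO can decode a downsampled image straight from the bytes, so the full-size UIImage never has to exist (a sketch, using the same thumbnail options as the ImageIO example earlier on this page; error handling omitted):

#import <ImageIO/ImageIO.h>

// Sketch: decode at most 72 px on the longest side directly from the data.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
NSDictionary *options = @{ (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                           (id)kCGImageSourceThumbnailMaxPixelSize          : @72 };
CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
CFRelease(source);
UIImage *small = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);
NSData *smallImageData = UIImagePNGRepresentation(small);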
