Should retain count increase after an image rotation? - ios

I'm using the following code to rotate an image
http://www.platinumball.net/blog/2010/01/31/iphone-uiimage-rotation-and-scaling/
That's one of several image transformations I apply before uploading an image to the server; the others are normalize, crop, and resize.
Each transformation returns a (UIImage *), and I add these methods via a category. I use them like this:
UIImage *img = //image from camera;
img = [[[[img normalize] rotate] scale] resize];
[upload img];
After selecting 3~4 photos from the camera and running the same code each time, I get a memory warning in Xcode.
I'm guessing I have a memory leak somewhere (even though I'm using ARC). I'm not very experienced with the Xcode debugging tools, so I started printing the retain count after each method.
UIImage *img = //image from camera;
img = [img normalize];
img = [img rotate]; // retain count increases :(
img = [img scale];
img = [img resize];
The only operation that increases the retain count is the rotation. Is this normal?

The only operation that increases the retain count is the rotation. Is this normal?
It's quite possible that the UIGraphicsGetImageFromCurrentImageContext() call in your rotate function ends up retaining the image. If so, it almost certainly also autoreleases the image in keeping with the normal Cocoa memory management rules. Either way, you shouldn't worry about it. As long as your rotate function doesn't itself contain any unbalanced retain (or alloc, new, or copy) calls, you should expect to be free of leaks. If you do suspect a leak, it's better to track it down with Instruments than by watching retainCount yourself.
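Since those intermediate images are autoreleased, one way to keep them from piling up across several photos is to drain them per photo with an @autoreleasepool. A minimal sketch, assuming the category methods from the question (pickedImage and upload: are hypothetical names for illustration):

```objective-c
// Sketch: drain the autoreleased intermediate images after each photo,
// instead of letting them accumulate until the run loop drains the pool.
@autoreleasepool {
    UIImage *img = pickedImage; // image from the camera (hypothetical variable)
    img = [[[[img normalize] rotate] scale] resize];
    [self upload:img];          // hypothetical upload call
} // intermediates from normalize/rotate/scale/resize are released here
```

Each pass through the pool releases the three intermediate images before the next photo is processed, which keeps the high-water mark at roughly one photo's worth of bitmaps.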

Related

iOS Redrawing image to prevent deferred decompression resulting in a bigger image

I've noticed some people redraw images into a CGContext to prevent deferred decompression, and this has caused a bug in our app.
The bug is that the size of the image appears to remain the same, but the CGImageDataProvider data has extra bytes appended to each row.
For example, we have a 797x500 PNG image downloaded from the Internet, and AsyncImageView redraws it and returns the redrawn image.
Here is the code:
UIImage *image = [[UIImage alloc] initWithData:data];
if (image)
{
    // Log to compare size and data length...
    NSLog(@"BEFORE: %f %f", image.size.width, image.size.height);
    NSLog(@"LEN %ld", CFDataGetLength(CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage)))); // note: the copied CFData is leaked; acceptable for a quick diagnostic

    // Original code from AsyncImageView
    // redraw to prevent deferred decompression
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawAtPoint:CGPointZero];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Log to compare size and data length...
    NSLog(@"AFTER: %f %f", image.size.width, image.size.height);
    NSLog(@"LEN %ld", CFDataGetLength(CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))));

    // Some other code...
}
The log shows as follows:
BEFORE: 797.000000 500.000000
LEN 1594000
AFTER: 797.000000 500.000000
LEN 1600000
I decided to print the bytes one by one, and sure enough there were twelve 0s appended to each row (three padding pixels at 4 bytes each).
Basically, the redrawing was producing the pixel data of an 800x500 image. Because of this, our app was looking at the wrong pixel when it wanted the (797 * row + column)th pixel.
We're not using any big images, so deferred decompression doesn't pose any problems for us, but should I decide to use this method to redraw images, there's a chance I might introduce a subtle bug.
Does anyone have a solution to this? Or is this a bug introduced by Apple and we can't really do anything?
As you've discovered, rows are padded out to a convenient size (here, a multiple of 16 bytes). This is generally done to make vector algorithms more efficient. You just need to adapt to that layout if you're going to use CGImage this way: call CGImageGetBytesPerRow to find out the actual number of bytes allocated per row, and compute byte offsets as bytesPerRow * row + bytesPerPixel * column rather than assuming width * bytesPerPixel per row.
That's probably best for you, but if you need to get rid of the padding, you can do that by creating your own CGBitmapContext and rendering into it. That's a heavily covered topic around Stack Overflow if you're not familiar with it. For example: How to get pixel data from a UIImage (Cocoa Touch) or CGImage (Core Graphics)?
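The stride-aware lookup described above can be sketched like this (row and column are assumed to be in-range pixel coordinates; a 32-bit-per-pixel format is assumed):

```objective-c
// Sketch: index into CGImage pixel data using the real row stride,
// not the nominal width. Assumes a 32-bit-per-pixel bitmap.
CGImageRef cgImage = image.CGImage;
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
const UInt8 *bytes = CFDataGetBytePtr(pixelData);
size_t bytesPerRow = CGImageGetBytesPerRow(cgImage);        // may exceed width * 4 due to padding
size_t bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8; // typically 4

// Correct offset: stride times row, plus pixel size times column.
const UInt8 *pixel = bytes + bytesPerRow * row + bytesPerPixel * column;
// ...read pixel[0..bytesPerPixel-1] here...

CFRelease(pixelData); // CGDataProviderCopyData follows the Create/Copy rule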

Is there any alternate for animating array of images using UIImageView?

UIImageView has animationImages for animating a sequence of images. That works fine, but it holds on to the image objects, so there's a spike in memory use while the animation is happening. I tried using an NSTimer to set the image view's image property instead, but it doesn't work.
Can we achieve this in any other approach?
Instead of looking for alternatives, just try to correct the existing code.
The best practice for allocating the images in animationImages is to use imageWithContentsOfFile: instead of imageNamed:.
imageNamed: caches your images, so you lose control over the memory - there's no guarantee that releasing the object will actually release the image. In exchange, images load faster the second time around because they are cached.
imageWithContentsOfFile: does not cache images and is more memory friendly; however, because nothing is cached, heavier images load more slowly.
When the animation stops, release the image array. If you are using ARC, set the image array to nil.
Best Practice:
NSMutableArray *imageArray = [NSMutableArray array];
for (int itemIndex = 0; itemIndex < 20; itemIndex++) {
    NSString *fileName = [NSString stringWithFormat:@"myImage%d", itemIndex + 1];
    NSString *filePath = [[NSBundle mainBundle] pathForResource:fileName ofType:@"png"];
    UIImage *image = [[UIImage alloc] initWithContentsOfFile:filePath];
    [imageArray addObject:image];
}
After the Animation:
imageArray = nil;

the memory of using drawInRect to resize picture

Recently I replaced the old Assets framework with PHAsset in my project. However, when I use my app to scale some pictures, it usually crashes.
Using debug mode, I found it is a memory problem.
I use the code below to resize a picture:
+ (UIImage *)scaleRetangleToFitLen:(UIImage *)img sWidth:(float)wid sHeight:(float)hei {
    CGSize sb = img.size;
    if (img.size.height / img.size.width > hei / wid) {
        sb = CGSizeMake(wid, wid * img.size.height / img.size.width);
    } else {
        sb = CGSizeMake(img.size.width * hei / img.size.height, hei);
    }
    if (sb.width > img.size.width || sb.height > img.size.height) {
        sb = img.size;
    }
    UIImage *scaledImage = nil;
    UIGraphicsBeginImageContext(sb);
    [img drawInRect:CGRectMake(0, 0, sb.width, sb.height)];
    scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    img = nil;
    return scaledImage;
}
Memory increases by about 50 MB when the line
[img drawInRect:CGRectMake(0, 0, sb.width, sb.height)]
runs, and it is not freed even after the method returns.
The target width and height are 304x228, and the original image is about 3264x2448; the returned image is 304x228. In other words, the image I actually want in the end is just 304x228, yet the call takes 50+ MB of memory.
Is there any way to free the memory that drawInRect: takes?
(@autoreleasepool does not help ~ 😢 😢)
When loading an image, iOS usually doesn't decompress it until it really needs to. So the image you pass into your function is most likely a JPEG or PNG that iOS keeps in memory in its compressed state. The moment you draw it, it gets decompressed first, and therefore the memory use increases significantly. I would expect an increase of roughly 3264 x 2448 x 4 bytes ≈ 32 MB (not 50 MB).
To get rid of the memory again, make sure you release every reference to the image you pass into your function. So the problem is outside the code you show in your question.
For a more specific answer, you'll need to show all the code that works with the original image.
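A related option, if even briefly holding the decompressed full-size bitmap is a problem, is to let ImageIO downsample straight from the encoded file so the full-resolution bitmap is never materialized. A sketch (fileURL and the 304-pixel limit are assumptions for illustration, standing in for the asset's on-disk URL and the caller's target size):

```objective-c
#import <ImageIO/ImageIO.h>

// Sketch: downsample with ImageIO so the full 3264x2448 bitmap is never decoded.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)fileURL, NULL);
NSDictionary *options = @{
    (id)kCGImageSourceCreateThumbnailFromImageAlways: @YES,
    (id)kCGImageSourceThumbnailMaxPixelSize: @304,          // longest side of the result
    (id)kCGImageSourceCreateThumbnailWithTransform: @YES,   // honor EXIF orientation
};
CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                                       (__bridge CFDictionaryRef)options);
UIImage *scaled = [UIImage imageWithCGImage:thumb];
CGImageRelease(thumb);  // both Create calls follow the Create/Copy ownership rule
CFRelease(source);
```

The peak memory cost here is proportional to the thumbnail, not the original, which is exactly the asymmetry the question is running into.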

UIImageView stops displaying images after a specific amount of loop iterations

My iOS app utilizes a loop to cycle through images in a folder.
My application is supposed to loop through a total of 2031 images (sized 1200x900) inside a folder. The images were taken at 8 fps, and each image is displayed as the loop continues, to simulate a video clip. After the 696th picture, the images cease to be displayed in the UIImageView, although the app continues looping.
To test whether the cutoff happened because a picture didn't exist, I started the loop at picture 200; this time the UIImageView stopped displaying pictures after picture 896.
The Code:
imgName = [NSString stringWithFormat:@"subject_basline_mat k (%d).png", jojo];
jojo++;
imageToCrop.image = [UIImage imageNamed:imgName]; // imageToCrop is the UIImageView; its image is set here
imageToCrop.image = [self imageWithImage:imageToCrop.image convertToSize:self.imageToCrop.frame.size]; // resize the image to fit the view's 320x240 bounds
The code loops due to a timer that fires about once every 0.8 seconds.
I ran my code with Instruments to see if there was a memory problem occurring. Instruments is very heavy on my computer, so my application ran quite slowly; however, this time, when I arrived at the 696th picture, the pictures kept displaying. It was almost as if the application running too quickly caused the pictures not to be displayed... which I don't really understand.
The only memory-heavy part of the image switching seems to be the size conversion step, which is called by the line:
imageToCrop.image = [self imageWithImage:imageToCrop.image convertToSize:self.imageToCrop.frame.size];
The method "imageWithImage" is here:
- (UIImage *)imageWithImage:(UIImage *)image convertToSize:(CGSize)size {
    @autoreleasepool {
        UIGraphicsBeginImageContext(size);
        [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
        UIImage *destImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return destImage;
    }
}
The line [image drawInRect:CGRectMake(0, 0, size.width, size.height)]; uses the most memory of all the image handling in the app.
Any Ideas as to why my app will only display a certain amount of images?
Try loading the full-size images from the app bundle by URL. For example:
@autoreleasepool {
    NSString *imgName = [NSString stringWithFormat:@"subject_basline_mat k (%d)", jojo];
    NSURL *imageURL = [[NSBundle mainBundle] URLForResource:imgName withExtension:@"png"];
    UIImage *image = [UIImage imageWithContentsOfFile:[imageURL path]];
    imageToCrop.image = [self imageWithImage:image convertToSize:self.imageToCrop.frame.size];
}
Almost certainly your problem is [UIImage imageNamed:imgName]. There are hundreds of posts here on the pitfalls of using it. The issue is that it caches images - its real purpose is for a small number of images that live in your bundle.
If you have oodles of images, get the path to the image, then load it through a URL or file path. That way it's not cached. Note that when you do this, you lose the automatic "get-retina-image-automatically" behavior, so you will need to pick the appropriately sized image yourself depending on whether the device has a Retina screen.
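Picking the Retina variant by hand might look like the sketch below (baseName is a hypothetical image name without extension; the @2x file-naming convention is the standard bundle convention):

```objective-c
// Sketch: choose the @2x file manually when bypassing imageNamed:'s caching.
CGFloat screenScale = [UIScreen mainScreen].scale;
NSString *suffix = (screenScale >= 2.0) ? @"@2x" : @"";
NSString *name = [baseName stringByAppendingString:suffix]; // e.g. "frame100@2x"
NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
UIImage *image = [[UIImage alloc] initWithContentsOfFile:path];

// initWithContentsOfFile: does not infer scale from the @2x suffix the way
// imageNamed: does, so rewrap with an explicit scale to keep point sizes correct.
if (image && screenScale >= 2.0) {
    image = [UIImage imageWithCGImage:image.CGImage
                                scale:screenScale
                          orientation:UIImageOrientationUp];
}
```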

UIGraphicsGetImageFromCurrentImageContext leak

I have written the following code snippet to take screen snapshot:
UIGraphicsBeginImageContext(animationView.frame.size);
[[window layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage* screenshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
But, UIGraphicsGetImageFromCurrentImageContext seems to be leaking. Is this correct?
In Instruments I could not find the exact leak point. In Activity Monitor I observed that when I switch to the UI that executes the above code snippet, memory increases by some MB; after that point it never decreases.
Does UIGraphicsGetImageFromCurrentImageContext have a memory leak? How do I solve this?
Edit
Instruments analysis
Activity Monitor: shows the memory hike when this line of code is executed; it never decreases, even after releasing the screenshot (UIImage).
Leaks and Allocations, Heap Snapshot: show neither a leak nor this allocation.
You have just created a UIImage holding the bitmap data for your animationView (which can be several MB). The image is autoreleased, so if you want the memory back promptly, wrap the call in an autorelease pool (this code is pre-ARC; under ARC use an @autoreleasepool block instead):
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
//[screenshot retain]; // if you want precise control over when it is released and will use it later
[pool release];
Do not call CGContextRelease on the context returned by UIGraphicsGetCurrentContext(): you don't own that context, so releasing it violates the Core Foundation ownership rules. Balancing UIGraphicsBeginImageContext with UIGraphicsEndImageContext, as your snippet already does, is what cleans up the context.