Memory growth during image editing - iOS

I'm trying to fetch an image of a PDF page and edit it. Everything works, but there is huge memory growth. The profiler says there is no memory leak, and that 90% of the memory is allocated at UIGraphicsGetCurrentContext() and UIGraphicsGetImageFromCurrentImageContext(). This code does not run in a loop, so there is no need to wrap it in @autoreleasepool.
if ((pageRotation == 0) || (pageRotation == 180) || (pageRotation == -180)) {
    UIGraphicsBeginImageContextWithOptions(cropBox.size, NO, PAGE_QUALITY);
}
else {
    UIGraphicsBeginImageContextWithOptions(
        CGSizeMake(cropBox.size.height, cropBox.size.width), NO, PAGE_QUALITY);
}

CGContextRef imageContext = UIGraphicsGetCurrentContext();
[PDFPageRenderer renderPage:_PDFPageRef inContext:imageContext pagePoint:CGPointMake(0, 0)];

UIImage *pageImage = UIGraphicsGetImageFromCurrentImageContext();
[[NSNotificationCenter defaultCenter] postNotificationName:@"PAGE_IMAGE_FETCHED" object:pageImage];

UIGraphicsEndImageContext();
But debugging shows that the memory growth occurs only when I start editing the fetched image. For image editing I use the Leptonica library. For example:
+(void) testAction:(UIImage *)image {
    PIX *pix = [self getPixFromUIImage:image];
    pixConvertTo8(pix, FALSE);
    pixDestroy(&pix);
}
Before pixConvertTo8 the app takes 13 MB; after, 50 MB. Obviously the growth depends on the image size. The converting method:
+(PIX *) getPixFromUIImage:(UIImage *)image {
    CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
    UInt8 const *pData = (UInt8 *)CFDataGetBytePtr(data);
    Pix *myPix = (Pix *)malloc(sizeof(Pix));
    CGImageRef myCGImage = [image CGImage];
    myPix->w = CGImageGetWidth(myCGImage) - 1;
    myPix->h = CGImageGetHeight(myCGImage);
    myPix->d = CGImageGetBitsPerPixel([image CGImage]);
    myPix->wpl = CGImageGetBytesPerRow(myCGImage) / 4;
    myPix->data = (l_uint32 *)pData;
    myPix->colormap = NULL;
    myPix->text = "text";
    CFRelease(data);
    return myPix;
}
P.S. Sorry for my terrible English.
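For what it's worth, two things stand out in the code above. First, pixConvertTo8() returns a newly allocated PIX, and testAction: discards that return value, so the converted copy is never destroyed; that alone would produce growth proportional to the image size. Second, the PIX is built around a raw malloc and a buffer that is released (CFRelease(data)) before the PIX is used. A minimal sketch of a conversion that goes through Leptonica's own allocator instead, so that pixDestroy() can reclaim everything, might look like this (pixCopyFromUIImage is a made-up name; it assumes Leptonica's pixCreate(), pixGetData() and pixGetWpl(), and copies the pixels rather than borrowing the CFData buffer):

// Sketch (not the original code): let Leptonica allocate the PIX so that
// pixDestroy() frees both the header and the pixel data.
+(PIX *) pixCopyFromUIImage:(UIImage *)image {
    CGImageRef cgImage = [image CGImage];
    CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    PIX *pix = pixCreate((l_int32)width, (l_int32)height,
                         (l_int32)CGImageGetBitsPerPixel(cgImage));

    // Copy row by row: Leptonica's stride (wpl * 4 bytes) and Core
    // Graphics' bytes-per-row may differ because of padding.
    size_t srcStride = CGImageGetBytesPerRow(cgImage);
    size_t dstStride = pixGetWpl(pix) * 4;
    const UInt8 *src = CFDataGetBytePtr(data);
    UInt8 *dst = (UInt8 *)pixGetData(pix);
    for (size_t y = 0; y < height; y++) {
        memcpy(dst + y * dstStride, src + y * srcStride, MIN(srcStride, dstStride));
    }

    CFRelease(data); // safe: the PIX owns its own copy now
    return pix;
}

And in testAction:, the result of pixConvertTo8() would need its own pixDestroy():

PIX *pix8 = pixConvertTo8(pix, FALSE);
// ... use pix8 ...
pixDestroy(&pix8);
pixDestroy(&pix);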

Related

Why does SDWebImage use an @autoreleasepool block in the decodedImageWithImage method?

As the code snippet below shows, it uses an @autoreleasepool block in this method.
+ (UIImage *)decodedImageWithImage:(UIImage *)image {
    // while downloading huge amount of images
    // autorelease the bitmap context
    // and all vars to help system to free memory
    // when there are memory warning.
    // on iOS7, do not forget to call
    // [[SDImageCache sharedImageCache] clearMemory];

    if (image == nil) { // Prevent "CGBitmapContextCreateImage: invalid context 0x0" error
        return nil;
    }

    @autoreleasepool {
        // do not decode animated images
        if (image.images != nil) {
            return image;
        }

        CGImageRef imageRef = image.CGImage;

        CGImageAlphaInfo alpha = CGImageGetAlphaInfo(imageRef);
        BOOL anyAlpha = (alpha == kCGImageAlphaFirst ||
                         alpha == kCGImageAlphaLast ||
                         alpha == kCGImageAlphaPremultipliedFirst ||
                         alpha == kCGImageAlphaPremultipliedLast);

        if (anyAlpha) {
            return image;
        }

        // current
        CGColorSpaceModel imageColorSpaceModel = CGColorSpaceGetModel(CGImageGetColorSpace(imageRef));
        CGColorSpaceRef colorspaceRef = CGImageGetColorSpace(imageRef);

        BOOL unsupportedColorSpace = (imageColorSpaceModel == kCGColorSpaceModelUnknown ||
                                      imageColorSpaceModel == kCGColorSpaceModelMonochrome ||
                                      imageColorSpaceModel == kCGColorSpaceModelCMYK ||
                                      imageColorSpaceModel == kCGColorSpaceModelIndexed);
        if (unsupportedColorSpace) {
            colorspaceRef = CGColorSpaceCreateDeviceRGB();
        }

        size_t width = CGImageGetWidth(imageRef);
        size_t height = CGImageGetHeight(imageRef);
        NSUInteger bytesPerPixel = 4;
        NSUInteger bytesPerRow = bytesPerPixel * width;
        NSUInteger bitsPerComponent = 8;

        // kCGImageAlphaNone is not supported in CGBitmapContextCreate.
        // Since the original image here has no alpha info, use kCGImageAlphaNoneSkipLast
        // to create bitmap graphics contexts without alpha info.
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     width,
                                                     height,
                                                     bitsPerComponent,
                                                     bytesPerRow,
                                                     colorspaceRef,
                                                     kCGBitmapByteOrderDefault | kCGImageAlphaNoneSkipLast);

        // Draw the image into the context and retrieve the new bitmap image without alpha
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
        CGImageRef imageRefWithoutAlpha = CGBitmapContextCreateImage(context);
        UIImage *imageWithoutAlpha = [UIImage imageWithCGImage:imageRefWithoutAlpha
                                                         scale:image.scale
                                                   orientation:image.imageOrientation];

        if (unsupportedColorSpace) {
            CGColorSpaceRelease(colorspaceRef);
        }

        CGContextRelease(context);
        CGImageRelease(imageRefWithoutAlpha);

        return imageWithoutAlpha;
    }
}
(The method is in SDWebImageDecoder.m; the version is SDWebImage 3.7.0.)
I am confused by it: these temporary objects will be released after the method returns anyway, so is it necessary to use the autorelease pool to release them only slightly earlier? The autorelease pool itself also occupies memory.
Can anyone explain it? Thanks!
Go through this Apple doc. It mentions three occasions when you might use your own autorelease pool blocks:
1. If you are writing a program that is not based on a UI framework, such as a command-line tool.
2. If you write a loop that creates many temporary objects. You may use an autorelease pool block inside the loop to dispose of those objects before the next iteration. Using an autorelease pool block in the loop helps to reduce the maximum memory footprint of the application.
3. If you spawn a secondary thread. You must create your own autorelease pool block as soon as the thread begins executing; otherwise, your application will leak objects. (See Autorelease Pool Blocks and Threads for details.)
I am not sure about the first point, but SDWebImage surely uses an autorelease pool for the other two reasons.
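To illustrate the second point with the method above: in a hypothetical batch-decode loop (the imagePaths array and the decodedImages collection are made up for this example), the inner pool keeps each iteration's autoreleased temporaries from accumulating:

// Hypothetical loop, assuming decodedImageWithImage: is exposed as a
// UIImage category method, as in SDWebImage 3.7.x. Without the inner
// pool, every iteration's autoreleased temporaries would survive until
// the enclosing pool drains, so peak memory would grow with the count.
for (NSString *path in imagePaths) {
    @autoreleasepool {
        UIImage *raw = [UIImage imageWithContentsOfFile:path];
        UIImage *decoded = [UIImage decodedImageWithImage:raw];
        [decodedImages addObject:decoded];
    } // this iteration's temporaries are released here
}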

Intermittent crash on CFRelease() called with NULL for some JPG images

In my iOS app, I download game assets from S3. My artist updated one of the assets, and we immediately started seeing crashes with the new asset.
Here's the code that processes them:
+ (UIImage *)imageAtPath:(NSString *)imagePath scaledToSize:(CGSize)size
{
    // Create the image source (from path)
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:imagePath], NULL);
    NSParameterAssert(imageSource);

    // Get the image dimensions (without loading the image into memory)
    CGFloat width = 512.0f, height = 384.0f;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
    NSParameterAssert(imageProperties);
    if (imageProperties) {
        CFNumberRef widthNumRef = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
        if (widthNumRef != NULL) {
            CFNumberGetValue(widthNumRef, kCFNumberCGFloatType, &width);
        }
        CFNumberRef heightNumRef = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
        if (heightNumRef != NULL) {
            CFNumberGetValue(heightNumRef, kCFNumberCGFloatType, &height);
        }
        CFRelease(imageProperties);
    } else {
        // If the image info is somehow missing, make up some numbers so we don't divide by zero
        width = 512;
        height = 384;
    }

    // Create thumbnail options
    CGFloat maxDimension = size.height;
    if (useDeviceNativeScale) {
        maxDimension *= [UIScreen mainScreen].scale;
    }
    NSParameterAssert(maxDimension);

    // If we have a really wide image, scaling it to the screen height will make it too blurry.
    // Here we calculate the maximum dimension we want (probably width) to make the image be the full height of the device
    CGFloat imageAspectRatio = width / height;
    if (width > height) {
        maxDimension *= imageAspectRatio;
    }

    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
        (id) kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id) kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id) kCGImageSourceShouldCache : @YES,
        (id) kCGImageSourceThumbnailMaxPixelSize : @(maxDimension)
    };

    // Generate the thumbnail
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
    CFRelease(imageSource);
    // 🔻 Crashlytics says the crash is on this line, even though the line above is the one that uses CFRelease()
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    CGImageRelease(thumbnail);
    NSAssert(image.size.height - SCREEN_MIN_LENGTH * [UIScreen mainScreen].scale < 1, @"Image should be the height of the view");
    return image;
}
And here's a stack trace from Fabric's Crashlytics:
0. Crashed: com.apple.main-thread
0 CoreFoundation 0x1820072a0 CFRelease + 120
1 Homer 0x10014e7f0 +[UIImage(ResizeBeforeLoading) imageAtPath:scaledToSize:] (UIImage+ResizeBeforeLoading.m:98)
With this crash info key:
CRASH_INFO_ENTRY_0
* CFRelease() called with NULL *
The odd thing is that this crash is not 100% reproducible; it happens to a very low percentage of users. I can't reproduce it on any of my own devices. There is also no pattern to which iOS version or hardware it happens on.
Here are links to a file that does not cause a crash and one that does, in my production image host (S3 with CloudFront):
(good) http://d3iq9oupxk0b1m.cloudfront.net/PirateDinosaur/2x/dinoBack._k91G0.jpg
(crashy) http://d3iq9oupxk0b1m.cloudfront.net/PirateDinosaur/2x/PirateDino.3LHRmc.jpg
I can't see a meaningful difference between them, even when I look at them using ImageMagick's identify -verbose <filename>. Can any JPEG experts weigh in?
Note: In my next app version, to be released, I have added guards around releasing NULL references, to protect against this crash:
if (imageSource) { CFRelease(imageSource), imageSource = nil; }
...
if (thumbnail) { CGImageRelease(thumbnail), thumbnail = nil; }
However, I would still like to know what in the world is wrong with this JPEG that is causing the crash, so my artist can avoid it in the future.
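One detail worth noting about those guards: NSParameterAssert and NSAssert are typically compiled out of Release builds (ENABLE_NS_ASSERTIONS is off by default), so in production nothing stops a NULL imageSource from reaching CFRelease(); if CGImageSourceCreateWithURL() returns NULL for a corrupt or half-downloaded file, that CFRelease(imageSource) is one plausible crash site. CGImageRelease() is documented to tolerate NULL, but CFRelease() is not. A small NULL-safe wrapper avoids repeating the guard at every call site (SafeCFRelease is a made-up name for this sketch):

// Hypothetical NULL-safe wrapper around CFRelease(), which crashes on
// NULL, unlike CGImageRelease()/CGContextRelease(), which tolerate it.
static inline void SafeCFRelease(CFTypeRef ref) {
    if (ref) CFRelease(ref);
}

// Usage, replacing the bare call in imageAtPath:scaledToSize: above:
// SafeCFRelease(imageSource);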

Malloc pointer being freed was not allocated error when calling initWithBitmapData

When I create a CIImage by calling this routine, I get the malloc error in the title. When I call initWithBitmapData directly (not within the createCIimageFromData routine), it works fine.
I have seen references to possible bugs in iOS that might be related, but I can't tell for sure, and I certainly suspect my code more than Apple's!
My guess is that my additional indirection is somehow screwing things up, but it's cleaner to have the separate routine than to embed the code wherever I need it.
Thank you.
Fails:
- (CIImage *) createCIimageFromData:(unsigned char *)pData width:(int32_t)width height:(int32_t)height
{
    /*
     Once we have the raw data, we convert it into a CIImage.
     The following code does the required work.
     */
    NSLog(@"entering createciimage\n");
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceGray();
    NSData *_pixelsData = [NSData dataWithBytesNoCopy:pData
                                               length:(sizeof(unsigned char) * width * height)
                                         freeWhenDone:YES];
    CIImage *_dataCIImage = [[CIImage alloc] initWithBitmapData:_pixelsData
                                                    bytesPerRow:(width * sizeof(unsigned char))
                                                           size:CGSizeMake(width, height)
                                                         format:kCIFormatR8
                                                     colorSpace:colorSpaceRef];
    CGColorSpaceRelease(colorSpaceRef);
    /*
     newImage is the final image
     Do remember to release the allocated parts.
     */
    NSLog(@"done ciimage\n");
    return _dataCIImage;
}
Works:
void prepData(unsigned char *pData, // source-destination
              int strideSrc,        // stride
              int width,
              int height,
              double amount,
              int deltaLimit,
              id owner)
{
    //[owner createCIimageFromData:pData width:width height:height]; // <-- commented out
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceGray();
    NSData *_pixelsData = [NSData dataWithBytesNoCopy:pData
                                               length:(sizeof(unsigned char) * width * height)
                                         freeWhenDone:YES];
    CIImage *_dataCIImage = [[CIImage alloc] initWithBitmapData:_pixelsData
                                                    bytesPerRow:(width * sizeof(unsigned char))
                                                           size:CGSizeMake(width, height)
                                                         format:kCIFormatR8
                                                     colorSpace:colorSpaceRef];
    CGColorSpaceRelease(colorSpaceRef);
    // . . .
}
Evidently the problem is caused when the NSData object attempts to free the data. To avoid the problem, use freeWhenDone:NO and then free the data after you're done with the CIImage.
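A minimal sketch of that fix applied to the failing routine (logging dropped for brevity; the caller is assumed to free pData once the CIImage is no longer needed):

- (CIImage *)createCIimageFromData:(unsigned char *)pData
                             width:(int32_t)width
                            height:(int32_t)height
{
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceGray();

    // freeWhenDone:NO - the NSData no longer tries to free() a pointer
    // it does not own; the caller stays responsible for pData.
    NSData *pixelsData = [NSData dataWithBytesNoCopy:pData
                                              length:(size_t)width * height
                                        freeWhenDone:NO];
    CIImage *ciImage = [[CIImage alloc] initWithBitmapData:pixelsData
                                               bytesPerRow:width
                                                      size:CGSizeMake(width, height)
                                                    format:kCIFormatR8
                                                colorSpace:colorSpaceRef];
    CGColorSpaceRelease(colorSpaceRef);
    return ciImage;
}
// Caller: render/use the CIImage, then free(pData).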

Memory leak in malloc showed by instruments GPUImageFilter Objective C

I'm implementing Instagram-like image filters in my app and I'm using GPUImageFilter for that. But when I switch between different filters more than 10 times, the app crashes. I profiled with Instruments and found a large memory allocation in the GPUImageFilter class, caused by malloc. As I'm new to memory-leak-related issues, please help me out! Thanks.
Here is the GPUImageFilter code:
- (UIImage *)imageFromCurrentlyProcessedOutput {
    [GPUImageOpenGLESContext useImageProcessingContext];
    [self setFilterFBO];

    CGSize currentFBOSize = [self sizeOfFBO];
    NSUInteger totalBytesForImage = (int)currentFBOSize.width * (int)currentFBOSize.height * 4;
    GLubyte *rawImagePixels = (GLubyte *)malloc(totalBytesForImage); // here it's showing the large memory allocation
    glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);

    CGDataProviderRef dataProvider = CGDataProviderCreateWithData(NULL, rawImagePixels, totalBytesForImage, dataProviderReleaseCallback);
    CGColorSpaceRef defaultRGBColorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef cgImageFromBytes = CGImageCreate((int)currentFBOSize.width, (int)currentFBOSize.height, 8, 32, 4 * (int)currentFBOSize.width, defaultRGBColorSpace, kCGBitmapByteOrderDefault, dataProvider, NULL, NO, kCGRenderingIntentDefault);
    UIImage *finalImage = [UIImage imageWithCGImage:cgImageFromBytes scale:1.0 orientation:UIImageOrientationUp];

    // free(rawImagePixels);
    CGImageRelease(cgImageFromBytes);
    CGDataProviderRelease(dataProvider);
    CGColorSpaceRelease(defaultRGBColorSpace);

    return finalImage;
}
malloc doesn't free memory when what it allocated on one thread is deallocated on another thread.
Wrap your code in this:
dispatch_async(dispatch_get_main_queue(), ^{
    // malloc and whatever other code goes here...
});
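For context on the commented-out free(rawImagePixels): CGDataProviderCreateWithData() hands ownership of the buffer to the provider, which invokes the release callback when the provider is destroyed, so freeing the buffer manually as well would be a double free. GPUImage's dataProviderReleaseCallback is presumably along the lines of this sketch:

// Sketch of the release callback the code above passes to
// CGDataProviderCreateWithData(). Core Graphics invokes it when the
// provider's last reference goes away; only then may the pixels be freed.
void dataProviderReleaseCallback(void *info, const void *data, size_t size)
{
    free((void *)data);
}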

CGContextDrawImage memory is not freed

I'm using the PhotoScrollerNetwork project to provide a single high-resolution image to a view in my project and automatically tile it, so memory is managed properly. It uses this block of code to draw the full high-res image into memory, so that tiles can be calculated from it.
-(void)drawImage:(CGImageRef)image {
    madvise(ims[0].map.addr, ims[0].map.mappedSize - ims[0].map.emptyTileRowSize, MADV_SEQUENTIAL);

    unsigned char *addr = ims[0].map.addr + ims[0].map.col0offset + ims[0].map.row0offset * ims[0].map.bytesPerRow;
    CGContextRef context = CGBitmapContextCreate(addr, ims[0].map.width, ims[0].map.height, bitsPerComponent, ims[0].map.bytesPerRow, colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);
    assert(context);
    CGContextSetBlendMode(context, kCGBlendModeCopy); // Apple uses this in QA1708

    CGRect rect = CGRectMake(0, 0, ims[0].map.width, ims[0].map.height);
    CGContextDrawImage(context, rect, image);
    CGContextRelease(context);

    madvise(ims[0].map.addr, ims[0].map.mappedSize - ims[0].map.emptyTileRowSize, MADV_FREE);
}
In the class's dealloc method, ims is freed (free(ims)), so this should be handled properly. However, if I create a new view (and thus trigger a call to drawImage) repeatedly, my memory fills up. I found that if I comment out CGContextDrawImage(context, rect, image); the memory is fine, so I think something is being kept in memory, but I can't figure out what... The dealloc method is always called, so that's not the problem.
EDIT:
My image is also released properly; this is the complete flow:
- (void)myFunc {
    CFDictionaryRef options = [self createOptions];
    CGImageRef image = CGImageSourceCreateImageAtIndex(imageSourcRef, 0, options);
    CFRelease(options);
    CFRelease(imageSourcRef);
    if (image) {
        [self decodeImage:image];
        CGImageRelease(image);
    }
}

- (void)decodeImage:(CGImageRef)image {
    assert(decoder == cgimageDecoder);
    size_t width = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

#if LEVELS_INIT == 0
    zoomLevels = [self zoomLevelsForSize:CGSizeMake(width, height)];
    ims = calloc(zoomLevels, sizeof(imageMemory));
#endif

    [self mapMemoryForIndex:0 width:width height:height];
    [self drawImage:image];
    [self createLevelsAndTile];
}
Running with both local from-bundle images and network images, it appears any significant leak is gone. This is with iOS 7 and Xcode 5.
