Intermittent crash on CFRelease() called with NULL for some JPG images - iOS

In my iOS app, I download game assets from S3. My artist updated one of the assets, and we immediately started seeing crashes with the new asset.
Here's the code that processes them:
+ (UIImage *)imageAtPath:(NSString *)imagePath scaledToSize:(CGSize)size
{
    // Create the image source (from path)
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:imagePath], NULL);
    NSParameterAssert(imageSource);

    // Get the image dimensions (without loading the image into memory)
    CGFloat width = 512.0f, height = 384.0f;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
    NSParameterAssert(imageProperties);
    if (imageProperties) {
        CFNumberRef widthNumRef = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
        if (widthNumRef != NULL) {
            CFNumberGetValue(widthNumRef, kCFNumberCGFloatType, &width);
        }
        CFNumberRef heightNumRef = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
        if (heightNumRef != NULL) {
            CFNumberGetValue(heightNumRef, kCFNumberCGFloatType, &height);
        }
        CFRelease(imageProperties);
    } else {
        // If the image info is somehow missing, make up some numbers so we don't divide by zero
        width = 512;
        height = 384;
    }

    // Create thumbnail options
    CGFloat maxDimension = size.height;
    if (useDeviceNativeScale) {
        maxDimension *= [UIScreen mainScreen].scale;
    }
    NSParameterAssert(maxDimension);

    // If we have a really wide image, scaling it to the screen height will make it too blurry.
    // Here we calculate the maximum dimension we want (probably width) to make the image be
    // the full height of the device.
    CGFloat imageAspectRatio = width / height;
    if (width > height) {
        maxDimension *= imageAspectRatio;
    }
    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceShouldCache : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxDimension)
    };

    // Generate the thumbnail
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, options);
    CFRelease(imageSource);

    // 🔻 Crashlytics says the crash is on this line, even though the line above is the one that uses `CFRelease()`
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    CGImageRelease(thumbnail);

    NSAssert(image.size.height - SCREEN_MIN_LENGTH * [UIScreen mainScreen].scale < 1, @"Image should be the height of the view");
    return image;
}
And here's a stack trace from Fabric's Crashlytics:
0. Crashed: com.apple.main-thread
0 CoreFoundation 0x1820072a0 CFRelease + 120
1 Homer 0x10014e7f0 +[UIImage(ResizeBeforeLoading) imageAtPath:scaledToSize:] (UIImage+ResizeBeforeLoading.m:98)
With this crash info key:
CRASH_INFO_ENTRY_0
* CFRelease() called with NULL *
The odd thing is that this crash is not 100% reproducible; it happens to a very low percentage of users. I can't reproduce it on any of my own devices. There is also no pattern to which iOS version or which iOS hardware it happens on.
Here are a file that does not cause a crash and one that does, hosted on my production image host (S3 with CloudFront):
Good:
http://d3iq9oupxk0b1m.cloudfront.net/PirateDinosaur/2x/dinoBack._k91G0.jpg
Crashy:
http://d3iq9oupxk0b1m.cloudfront.net/PirateDinosaur/2x/PirateDino.3LHRmc.jpg
I can't see a meaningful difference between them, even when I look at them with ImageMagick's identify -verbose <filename>. Can any JPEG experts weigh in?
Note: In my next app version, to be released, I have added guards around releasing NULL references, to protect against this crash:
if (imageSource) { CFRelease(imageSource); imageSource = NULL; }
...
if (thumbnail) { CGImageRelease(thumbnail); thumbnail = NULL; }
However, I would still like to know what in the world is wrong with this jpeg that is causing this crash, so my artist can avoid it in the future.
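For completeness, a rough sketch of the guarded version (not the shipped code verbatim; I've dropped the asserts and the aspect-ratio handling for brevity, and the early nil returns are how I'd extend the guards):

+ (UIImage *)safeImageAtPath:(NSString *)imagePath scaledToSize:(CGSize)size
{
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)[NSURL fileURLWithPath:imagePath], NULL);
    if (imageSource == NULL) {
        return nil; // unreadable or missing file
    }

    CGFloat maxDimension = size.height * [UIScreen mainScreen].scale;
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES,
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxDimension)
    };

    // This call can return NULL for images the decoder cannot handle.
    CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
    CFRelease(imageSource);

    if (thumbnail == NULL) {
        return nil; // decode failed; let the caller fall back
    }
    UIImage *image = [UIImage imageWithCGImage:thumbnail];
    CGImageRelease(thumbnail);
    return image;
}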

Related

Memory growth during image editing

I'm trying to fetch an image of a PDF page and edit it. Everything works fine, but there is huge memory growth. The profiler says there is no memory leak. It also says that 90% of the memory is allocated in UIGraphicsGetCurrentContext() and UIGraphicsGetImageFromCurrentImageContext(). This code does not run in a loop, so there should be no need to wrap it in @autoreleasepool.
if ((pageRotation == 0) || (pageRotation == 180) || (pageRotation == -180)) {
    UIGraphicsBeginImageContextWithOptions(cropBox.size, NO, PAGE_QUALITY);
} else {
    UIGraphicsBeginImageContextWithOptions(
        CGSizeMake(cropBox.size.height, cropBox.size.width), NO, PAGE_QUALITY);
}
CGContextRef imageContext = UIGraphicsGetCurrentContext();
[PDFPageRenderer renderPage:_PDFPageRef inContext:imageContext pagePoint:CGPointMake(0, 0)];
UIImage *pageImage = UIGraphicsGetImageFromCurrentImageContext();
[[NSNotificationCenter defaultCenter] postNotificationName:@"PAGE_IMAGE_FETCHED" object:pageImage];
UIGraphicsEndImageContext();
But debugging shows that the memory growth occurs only when I start editing the fetched image. For image editing I use the Leptonica library. For example:
+ (void)testAction:(UIImage *)image {
    PIX *pix = [self getPixFromUIImage:image];
    pixConvertTo8(pix, FALSE);
    pixDestroy(&pix);
}
Before pixConvertTo8 the app uses 13 MB; after, 50 MB. Obviously the growth depends on the image size. The converting method:
+ (PIX *)getPixFromUIImage:(UIImage *)image {
    CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
    UInt8 const *pData = (UInt8 *)CFDataGetBytePtr(data);

    Pix *myPix = (Pix *)malloc(sizeof(Pix));
    CGImageRef myCGImage = [image CGImage];
    myPix->w = CGImageGetWidth(myCGImage) - 1;
    myPix->h = CGImageGetHeight(myCGImage);
    myPix->d = CGImageGetBitsPerPixel([image CGImage]);
    myPix->wpl = CGImageGetBytesPerRow(myCGImage) / 4;
    myPix->data = (l_uint32 *)pData;
    myPix->colormap = NULL;
    myPix->text = "text";

    CFRelease(data);
    return myPix;
}
P.S. Sorry for my terrible English.
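For what it's worth, a safer sketch of the conversion (hypothetical helper name; it copies the pixels so the CFData can be released, builds the PIX with Leptonica's own allocator, and captures the PIX that pixConvertTo8() returns so it can be destroyed):

#import "allheaders.h" // Leptonica umbrella header

// Sketch only: channel ordering (BGRA vs. Leptonica's RGBA) may still need adjusting.
+ (PIX *)pixCopyFromUIImage:(UIImage *)image
{
    CGImageRef cgImage = image.CGImage;
    CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

    l_int32 width  = (l_int32)CGImageGetWidth(cgImage);
    l_int32 height = (l_int32)CGImageGetHeight(cgImage);
    l_int32 depth  = (l_int32)CGImageGetBitsPerPixel(cgImage);

    PIX *pix = pixCreate(width, height, depth); // Leptonica owns this buffer
    if (pix != NULL) {
        size_t srcStride = CGImageGetBytesPerRow(cgImage);
        size_t dstStride = 4 * (size_t)pixGetWpl(pix);
        size_t rowBytes  = MIN(srcStride, dstStride);
        const uint8_t *src = CFDataGetBytePtr(data);
        uint8_t *dst = (uint8_t *)pixGetData(pix);
        for (l_int32 y = 0; y < height; y++) {
            memcpy(dst + (size_t)y * dstStride, src + (size_t)y * srcStride, rowBytes);
        }
    }
    CFRelease(data); // safe: the PIX has its own copy of the pixels
    return pix;      // caller balances with pixDestroy(&pix)
}

// Usage: pixConvertTo8() returns a *new* PIX that must also be destroyed.
PIX *pix  = [self pixCopyFromUIImage:image];
PIX *pix8 = pixConvertTo8(pix, FALSE);
// ... use pix8 ...
pixDestroy(&pix8);
pixDestroy(&pix);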

Memory increases when merging multiple high resolution images into a single image, iOS

I have to merge multiple high-resolution images into a single image, and it uses a lot of memory. I save the original images to a local directory and set resized copies on image views placed at different locations on the main image. Then, at the time of saving the final merged image, I read the original images back from the local directory. This is where the memory increases, which causes a crash (due to memory) for higher numbers of images.
Here is the code that retrieves an original image from the local directory:
UIImage *originalImage = [UIImage imageWithContentsOfFile:[self getOriginalImagePath:imageview.tag]];
Is there any other way to get images from the local directory without loading them into memory?
Thanks in advance.
There is no way to load an image without it going into memory. With some image formats you could, in theory, implement your own reader that scales the image down while reading the file, so that the original size never ends up in memory, but that would require a lot of work for little gain.
Overall you would be better off just saving the different sizes of images as separate files and loading only the correct size (you seem to be scaling them based on the screen size, so there are not that many different versions required).
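For example, a rough sketch of that approach (illustrative paths; scaleImageToSize:imageWithImage: is your existing helper): scale once, cache the screen-sized JPEG next to the original, and from then on load only the small file:

// Persist a screen-sized copy once, then always load the small file.
NSString *originalPath = [self getOriginalImagePath:imageView.tag];
NSString *thumbPath = [originalPath stringByAppendingString:@"_screen.jpg"];

if (![[NSFileManager defaultManager] fileExistsAtPath:thumbPath]) {
    @autoreleasepool {
        UIImage *original = [UIImage imageWithContentsOfFile:originalPath];
        UIImage *scaled = [self scaleImageToSize:[UIScreen mainScreen].bounds.size
                                  imageWithImage:original];
        [UIImageJPEGRepresentation(scaled, 0.9f) writeToFile:thumbPath atomically:YES];
    }
}
// Later, only the screen-sized version ever enters memory:
imageView.image = [UIImage imageWithContentsOfFile:thumbPath];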
If you do keep resizing them on the fly, try to ensure that you get rid of the original versions as soon as possible, i.e., don't hold on to any image reference that is no longer required, and perhaps wrap the whole thing in @autoreleasepool (assuming ARC is being used):
@autoreleasepool {
    UIImage *originalImage = [UIImage imageWithContentsOfFile:[self getOriginalImagePath:imageView.tag]];
    UIImage *pThumbImage = [self scaleImageToSize:CGSizeMake(AppScreenBound.size.width, AppScreenBound.size.height) imageWithImage:originalImage];
    originalImage = nil;
    imageView.image = pThumbImage;
    pThumbImage = nil;
    // …
}
Similarly treat any other image handling that creates intermediate versions, i.e., get rid of references no longer required as soon as possible (such as by assigning nil or having them fall out of scope), and put @autoreleasepool { … } around subsections that may generate temporary objects.
Found a solution; posting it as an answer to my own question in case it helps other people. Reference: the Image I/O Programming Guide.
As an alternative to imageWithContentsOfFile:, one can use an image source. Here is how I use it:
UIImage *originalWMImage = [self createCGImageFromFile:your-image-path];
The method createCGImageFromFile: gets the image content without loading it into memory:
- (UIImage *)createCGImageFromFile:(NSString *)path
{
    // Get the URL for the pathname passed to the function.
    NSURL *url = [NSURL fileURLWithPath:path];
    CGImageRef myImage = NULL;
    CGImageSourceRef myImageSource;
    CFDictionaryRef myOptions = NULL;
    CFStringRef myKeys[2];
    CFTypeRef myValues[2];

    // Set up options if you want them. The options here are for
    // caching the image in a decoded form and for using floating-point
    // values if the image format supports them.
    myKeys[0] = kCGImageSourceShouldCache;
    myValues[0] = (CFTypeRef)kCFBooleanTrue;
    myKeys[1] = kCGImageSourceShouldAllowFloat;
    myValues[1] = (CFTypeRef)kCFBooleanTrue;

    // Create the options dictionary
    myOptions = CFDictionaryCreate(NULL, (const void **)myKeys,
                                   (const void **)myValues, 2,
                                   &kCFTypeDictionaryKeyCallBacks,
                                   &kCFTypeDictionaryValueCallBacks);

    // Create an image source from the URL.
    myImageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, myOptions);
    CFRelease(myOptions);

    // Make sure the image source exists before continuing
    if (myImageSource == NULL) {
        fprintf(stderr, "Image source is NULL.");
        return nil;
    }

    // Create an image from the first item in the image source.
    myImage = CGImageSourceCreateImageAtIndex(myImageSource, 0, NULL);
    CFRelease(myImageSource);

    // Make sure the image exists before continuing
    if (myImage == NULL) {
        fprintf(stderr, "Image not created from image source.");
        return nil;
    }

    // Wrap the CGImage and release our Create-rule reference so it doesn't leak.
    UIImage *result = [UIImage imageWithCGImage:myImage];
    CGImageRelease(myImage);
    return result;
}
Here is the code where I resize the image and simply assign it to the image view. Then I perform scaling and rotation on the image view.
UIImage *pThumbImage = [self scaleImageToSize:CGSizeMake(AppScreenBound.size.width, AppScreenBound.size.height) imageWithImage:pOrignalImage];
[imageView setImage:pThumbImage];
And here is the saving code. It runs inside a for loop (up to the number of images to merge onto the main image):
// get size of the background (canvas) image
CGFloat backgroundWidth = canvasSize.width;
CGFloat backgroundHeight = canvasSize.height;

// Image view to be merged
UIImageView *imageView = [[UIImageView alloc] initWithImage:stampImage];
[imageView setFrame:CGRectMake(0, 0, stampFrameSize.size.width, stampFrameSize.size.height)];

// Rotate image view
CGAffineTransform currentTransform = imageView.transform;
CGAffineTransform newTransform = CGAffineTransformRotate(currentTransform, radian);
[imageView setTransform:newTransform];

// Scale image view
CGRect imageFrame = [imageView frame];

// Create final stamp view
UIView *finalStamp = [[UIView alloc] initWithFrame:CGRectMake(0, 0, imageFrame.size.width, imageFrame.size.height)];

// Set center of stamp image and add it to the stamp view
[imageView setCenter:CGPointMake(imageFrame.size.width / 2, imageFrame.size.height / 2)];
[finalStamp addSubview:imageView];

// Create image from the stamp view
UIGraphicsBeginImageContext(finalStamp.frame.size);
[finalStamp.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Create final image with the stamp drawn onto the canvas
UIGraphicsBeginImageContext(CGSizeMake(backgroundWidth, backgroundHeight));
[canvasImage drawInRect:CGRectMake(0, 0, backgroundWidth, backgroundHeight)];
[viewImage drawInRect:CGRectMake(stampFrameSize.origin.x, stampFrameSize.origin.y, stampFrameSize.size.width, stampFrameSize.size.height) blendMode:kCGBlendModeNormal alpha:fAlphaValue];
UIImage *pfinalMainImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
Everything is okay here; the problem occurs while saving or generating the merged image.
This is an old question, but I had to face something like this recently, so here is my answer.
I had to merge a lot of images into one and had the same problem: memory increased until the app crashed. The functions I had created returned UIImage, and that was the problem; ARC was not releasing them in time, so I changed them to return CGImageRef and released those at the proper time.
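A minimal sketch of that pattern (hypothetical helper, not my actual code): the helper returns a CGImageRef the caller owns under the Create rule, and the caller releases it as soon as the merged frame has been used, instead of waiting for an autorelease pool to drain:

// Render a canvas plus one stamp and hand back a CGImageRef the caller owns.
// Declare it CF_RETURNS_RETAINED in the header so the analyzer knows that too.
- (CGImageRef)newMergedImageWithCanvas:(UIImage *)canvasImage
                                 stamp:(UIImage *)stampImage
                             stampRect:(CGRect)stampRect
{
    CGSize size = canvasImage.size;
    UIGraphicsBeginImageContextWithOptions(size, NO, 1.0);
    [canvasImage drawInRect:CGRectMake(0, 0, size.width, size.height)];
    [stampImage drawInRect:stampRect blendMode:kCGBlendModeNormal alpha:1.0];
    // Grab the CGImage directly so no autoreleased UIImage has to outlive this scope.
    CGImageRef merged = CGBitmapContextCreateImage(UIGraphicsGetCurrentContext());
    UIGraphicsEndImageContext();
    return merged; // +1 reference; the caller must CGImageRelease() it
}

The caller writes the result to disk (or draws it into the final composition) and calls CGImageRelease() right away, so each iteration's pixels are freed immediately rather than piling up until the pool drains.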

Received memory warning when capturing the screen and saving to video, iOS

I am writing a program that captures the screen and converts it to video. I can successfully save the video if it is less than 10 seconds. Beyond that, I receive a memory warning and the application crashes. I wrote the code as follows. Where am I missing a release? I would like to know how to fix it.
- (void)captureAndSaveImage
{
    if (!stopCapturing) {
        if (assetWriterInput.readyForMoreMediaData)
        {
            keepTrackOfBackGroundMood++;
            NSLog(@"keepTrackOfBackGroundMood is %d", keepTrackOfBackGroundMood);

            CVReturn cvErr = kCVReturnSuccess;
            CGSize imageSize = screenCaptureAndDraw.bounds.size;
            CGFloat imageScale = 0; // if zero, it reduces processing time
            if (NULL != UIGraphicsBeginImageContextWithOptions)
            {
                UIGraphicsBeginImageContextWithOptions(imageSize, NO, imageScale);
            }
            else
            {
                UIGraphicsBeginImageContext(imageSize);
            }
            [self.hiddenView.layer renderInContext:UIGraphicsGetCurrentContext()];
            [self.screenCaptureAndDraw.layer renderInContext:UIGraphicsGetCurrentContext()];
            UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            image = (CGImageRef)[img CGImage];
            CVPixelBufferRef pixelBuffer = NULL;
            CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
            cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                 FRAME_WIDTH2,
                                                 FRAME_HEIGHT2,
                                                 kCVPixelFormatType_32BGRA,
                                                 (void *)CFDataGetBytePtr(imageData),
                                                 CGImageGetBytesPerRow(image),
                                                 NULL,
                                                 NULL,
                                                 NULL,
                                                 &pixelBuffer);
            //CFRelease(imageData);
            //CGImageRelease(image); // I can't add this call: I am not creating the image, and what I read online says it is not my responsibility to release it. If I do release it, the application crashes immediately.

            // calculate the time
            CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
            CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;

            // write the sample
            BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
            if (appended) {
                NSLog(@"appended sample at time %lf and keepTrackofappended is %d", CMTimeGetSeconds(presentationTime), keepTrackofappended);
                keepTrackofappended++;
            } else {
                NSLog(@"failed to append");
                [self stopRecording];
                //self.startStopButton.selected = NO;
                screenRecord = false;
            }
        }
    } // stop capturing
    // });
}
I agree that you don't want to do the CGImageRelease(image). This object was obtained by calling the CGImage method of a UIImage object, so ownership was not transferred; ARC still manages your img object, and no release of the image object is needed.
But I think you do want to restore your CFRelease(imageData). This is an object created by CGDataProviderCopyData, so you own it and must clean up.
I also think you have to release the pixelBuffer that you created with CVPixelBufferCreateWithBytes after you appendPixelBuffer. You can use the CVPixelBufferRelease function for that.
The Core Foundation memory rule is that if the function has Copy or Create in the name, you own that object and are responsible for releasing it. See the Create Rule in the Memory Management Programming Guide for Core Foundation.
I would have thought that the static analyzer (shift+command+B or "Analyze" from the Xcode "Product" menu) would have identified this issue, as it has gotten much better at finding Core Foundation memory issues (albeit, not perfect).
Alternatively, if you run your app through the Leaks tool in Instruments (which will also show you the Allocations tool at the same time), you can take a look at your memory usage. While the video capture requires a lot of Live Bytes, in my experience it stays pretty darn flat. If it's growing, you have a leak somewhere.
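Concretely, the per-frame cleanup described above would sit right after the appendPixelBuffer: call, something like this (a sketch against the variable names in the question):

// appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
//                                       withPresentationTime:presentationTime];

// We own imageData: CGDataProviderCopyData() has "Copy" in its name.
CFRelease(imageData);

// We own pixelBuffer: CVPixelBufferCreateWithBytes() has "Create" in its name.
if (pixelBuffer != NULL) {
    CVPixelBufferRelease(pixelBuffer);
}

// But no CGImageRelease(image): [img CGImage] follows the Get rule, so the
// CGImage is not ours to release.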

How can I get the dimensions of an image without downloading the entire image?

So far I have this method:
+ (CGSize)imageDimensions:(NSString *)url {
    NSURL *imageFileURL = [NSURL URLWithString:url];
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((__bridge CFURLRef)imageFileURL, NULL);
    if (imageSource == NULL) {
        return CGSizeMake(0, 0);
    }
    CGFloat width = 0.0f, height = 0.0f;
    CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
    if (imageProperties != NULL) {
        CFNumberRef widthNum = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
        if (widthNum != NULL) {
            // kCFNumberCGFloatType matches the CGFloat storage
            CFNumberGetValue(widthNum, kCFNumberCGFloatType, &width);
        }
        CFNumberRef heightNum = CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
        if (heightNum != NULL) {
            CFNumberGetValue(heightNum, kCFNumberCGFloatType, &height);
        }
        CFRelease(imageProperties);
    }
    CFRelease(imageSource); // we created the source, so release it to avoid a leak
    return CGSizeMake(width, height);
}
However, I did some tests, and it seems this method only runs about as fast as downloading the entire image and checking its size property. I had thought this method worked by downloading just the image's metadata or headers and reading the resolution from there. Am I wrong? Is there a quick way in Objective-C to get the dimensions of an image on the internet without downloading the whole image?
Most graphical formats have their dimension stored in the header. If you limit yourself to a subset of formats (PNG/JPG/GIF, for example), and implement your own header parsing code, then it's possible. You'll be on your own, though - the Core Graphics image APIs do not work with partial files.
To request a part of the file from the 'Net, you can use the Range header. Some Web servers might not honor it, though. Request the number of bytes that's maximum among header sizes of supported formats. Also, you might want to validate the magic bytes in the header.
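To illustrate (a sketch, not production code; the URL and byte count are made up): fetch the first few dozen bytes with a Range request and, for PNG specifically, read the dimensions straight out of the IHDR chunk. JPEG needs a scan of its SOF markers, which is omitted here.

#import <Foundation/Foundation.h>
#import <libkern/OSByteOrder.h>

// Ask for the first 64 bytes only; servers that ignore Range will send everything.
NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/image.png"]];
[request setValue:@"bytes=0-63" forHTTPHeaderField:@"Range"];

NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithRequest:request
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    static const uint8_t pngMagic[8] = {0x89, 'P', 'N', 'G', 0x0D, 0x0A, 0x1A, 0x0A};
    if (error != nil || data.length < 24 || memcmp(data.bytes, pngMagic, 8) != 0) {
        NSLog(@"Not a PNG, or the response was too short");
        return;
    }
    // PNG layout: 8-byte signature, 8-byte IHDR chunk header, then width and height
    // as big-endian 32-bit integers at offsets 16 and 20.
    uint32_t width  = OSReadBigInt32(data.bytes, 16);
    uint32_t height = OSReadBigInt32(data.bytes, 20);
    NSLog(@"Dimensions: %u x %u", width, height);
}];
[task resume];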

CGContextDrawImage memory is not freed

I'm using the PhotoScrollerNetwork project to provide a single high resolution image to a view in my project and automatically tile it, so memory is managed properly. It uses this block of code to draw the full high res image to memory, so that tiles can be calculated out of it.
- (void)drawImage:(CGImageRef)image {
    madvise(ims[0].map.addr, ims[0].map.mappedSize - ims[0].map.emptyTileRowSize, MADV_SEQUENTIAL);

    unsigned char *addr = ims[0].map.addr + ims[0].map.col0offset + ims[0].map.row0offset * ims[0].map.bytesPerRow;
    CGContextRef context = CGBitmapContextCreate(addr, ims[0].map.width, ims[0].map.height, bitsPerComponent, ims[0].map.bytesPerRow, colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);
    assert(context);
    CGContextSetBlendMode(context, kCGBlendModeCopy); // Apple uses this in QA1708
    CGRect rect = CGRectMake(0, 0, ims[0].map.width, ims[0].map.height);
    CGContextDrawImage(context, rect, image);
    CGContextRelease(context);

    madvise(ims[0].map.addr, ims[0].map.mappedSize - ims[0].map.emptyTileRowSize, MADV_FREE);
}
In the dealloc method of the class, ims is freed (free(ims)), so this should be handled properly. However, if I repeatedly create a new view (and thus a call to drawImage), my memory keeps filling up. I found that if I comment out CGContextDrawImage(context, rect, image);, the memory is fine, so I think something is being kept in memory, but I can't figure out what... The dealloc method is always called, so that's not the problem.
EDIT:
My image is also released properly, this is the complete flow:
- (void)myFunc {
    CFDictionaryRef options = [self createOptions];
    CGImageRef image = CGImageSourceCreateImageAtIndex(imageSourcRef, 0, options);
    CFRelease(options);
    CFRelease(imageSourcRef);
    if (image) {
        [self decodeImage:image];
        CGImageRelease(image);
    }
}

- (void)decodeImage:(CGImageRef)image {
    assert(decoder == cgimageDecoder);

    size_t width = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

#if LEVELS_INIT == 0
    zoomLevels = [self zoomLevelsForSize:CGSizeMake(width, height)];
    ims = calloc(zoomLevels, sizeof(imageMemory));
#endif
    [self mapMemoryForIndex:0 width:width height:height];
    [self drawImage:image];
    [self createLevelsAndTile];
}
Running with both local from-bundle images and network images, it appears any significant leak is gone. This is with iOS 7 and Xcode 5.
