Block leak with __block variable - iOS

I have a big memory leak that I have pinpointed to the requestContentEditingInputWithOptions: method. If I understand it correctly, it happens with the img variable. If I make it __block __weak, the image is already nil right after I assign it (img = [UIImage...]). Am I being silly somewhere? How would I avoid this memory leak?
- (UIImage *)getRightlySizedImgFromAsset:(PHAsset *)asset {
    __block UIImage *img;
    PHContentEditingInputRequestOptions *coptions = [PHContentEditingInputRequestOptions new];
    coptions.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmentData) { return NO; };
    // Semaphore used so the block runs synchronously and I can return img from this method at the end.
    dispatch_semaphore_t sem = dispatch_semaphore_create(0);
    [asset requestContentEditingInputWithOptions:coptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        NSURL *url = [contentEditingInput fullSizeImageURL];
        int orientation = [contentEditingInput fullSizeImageOrientation];
        CIImage *inputImage = [CIImage imageWithContentsOfURL:url options:nil];
        inputImage = [inputImage imageByApplyingOrientation:orientation];
        CIContext *context = [CIContext contextWithOptions:nil];
        img = [UIImage imageWithCGImage:[context createCGImage:inputImage fromRect:inputImage.extent]];
        dispatch_semaphore_signal(sem);
    }];
    dispatch_semaphore_wait(sem, DISPATCH_TIME_FOREVER);
    if (needToDoSomethingWithImg) {
        [self doSomethingWithImage:img];
    }
    return img;
}

Run this code through the static analyzer (shift+command+B, or choose "Analyze" from the "Product" menu) and it will point out that createCGImage:fromRect: is creating a CGImageRef that you never release. (Making the variable __weak doesn't help, by the way: nothing else holds a strong reference to the newly created UIImage, so ARC is free to deallocate it as soon as it's assigned, which is why you see nil.)
You might want to do something like:
CGImageRef imageRef = [context createCGImage:inputImage fromRect:inputImage.extent];
img = [UIImage imageWithCGImage:imageRef];
CFRelease(imageRef);
By the way, you should not do this synchronously. You should do something like:
- (void)getRightlySizedImgFromAsset:(PHAsset *)asset completionHandler:(void (^)(UIImage *))completionHandler {
    PHContentEditingInputRequestOptions *coptions = [PHContentEditingInputRequestOptions new];
    coptions.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmentData) { return NO; };
    [asset requestContentEditingInputWithOptions:coptions completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        NSURL *url = [contentEditingInput fullSizeImageURL];
        int orientation = [contentEditingInput fullSizeImageOrientation];
        CIImage *inputImage = [CIImage imageWithContentsOfURL:url options:nil];
        inputImage = [inputImage imageByApplyingOrientation:orientation];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef imageRef = [context createCGImage:inputImage fromRect:inputImage.extent];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CFRelease(imageRef);
        // If this stuff needs to happen on the main thread, dispatch it to the main thread.
        if (needtodosomethingwithit) {
            [self doSomethingWithImage:image];
        }
        if (completionHandler) {
            completionHandler(image);
        }
    }];
}
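A caller then consumes the image in the completion handler instead of blocking on a semaphore. A minimal usage sketch (self.imageView is a hypothetical UIImageView property, not from the code above):
[self getRightlySizedImgFromAsset:asset completionHandler:^(UIImage *image) {
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image; // hypothetical property; hop to main thread for UI work
    });
}];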

Rob is right on the money. And images can be big, which is why you have a big leak. The rule of thumb with Core Foundation objects is the "Create Rule." Search the Xcode documentation for "Create Rule" and read the article. The gist of it is this:
Core Foundation functions have names that indicate when you own a returned object:
- Object-creation functions that have "Create" embedded in the name;
- Object-duplication functions that have "Copy" embedded in the name.
If you own an object, it is your responsibility to relinquish ownership (using CFRelease) when you have finished with it.
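A minimal illustration of the rule (a hedged sketch; someImage and sampleBuffer are placeholders, not from the code above):
// "Create"/"Copy" in the name: you own the result and must release it.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef imageCopy = CGImageCreateCopy(someImage);
// ... use them ...
CGImageRelease(imageCopy);
CGColorSpaceRelease(colorSpace);
// "Get" in the name: you do NOT own the result, so you must not release it.
CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);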

Related

Memory Leak in CMSampleBufferGetImageBuffer

I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames like:
- (void)imageFromVideoBuffer:(void (^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);
        if (completion) completion(image);
        return;
    }
    if (completion) completion(nil);
}
Xcode and Instruments detect a memory leak, but I'm not able to get rid of it.
I'm releasing the CGImageRef and CMSampleBufferRef as usual:
CGImageRelease(cgImage);
CFRelease(sampleBuffer);
[UPDATE]
Below is the AVCapture output callback in which I get the sampleBuffer.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;
        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if (_context == nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                              fromRect:[ciImage extent]];
                //UIImage *image = [UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}
The code that leaks is the createCGImage:fromRect: call:
CGImageRef processedCGImage = [_context createCGImage:ciImage
                                              fromRect:[ciImage extent]];
even with an @autoreleasepool, a CGImageRelease of the CGImage reference, and the CIContext held as an instance property.
This seems to be the same issue addressed here: Can't save CIImage to file on iOS without memory leaks
[UPDATE]
The leak seems to be due to a bug. The issue is well described in
Memory leak on CIContext createCGImage at iOS 9?
A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip
The latest comments there confirm that:
It looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:
Here is the test code (Objective-C in this case):
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    if (error) return;
    __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg", (int)[NSDate date].timeIntervalSince1970]];
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^
    {
        @autoreleasepool
        {
            CIImage *enhancedImage = [CIImage imageWithData:imageData];
            if (!enhancedImage) return;
            static CIContext *ctx = nil;
            if (!ctx) ctx = [CIContext contextWithOptions:nil];
            CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];
            UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];
            [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];
            CGImageRelease(imageRef);
        }
    });
}];
and the workaround for iOS 9.0 (in Swift) should be:
extension CIContext {
    func createCGImage_(image: CIImage, fromRect: CGRect) -> CGImage {
        let width = Int(fromRect.width)
        let height = Int(fromRect.height)
        let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4)
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) { info, data, size in
            UnsafeMutablePointer<UInt8>(data).dealloc(size)
        }
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!
    }
}
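For the Objective-C code elsewhere in this thread, a hypothetical port of the same workaround might look like the sketch below: render the CIImage into a manually allocated bitmap and wrap it in a CGImage, bypassing createCGImage:fromRect:. The helper name newCGImageFromCIImage:context:rect: is an assumption, not from the linked project.
static void ReleaseBitmapData(void *info, const void *data, size_t size) {
    free((void *)data); // frees the buffer malloc'd below once the CGImage is done with it
}

// Caller owns the returned CGImageRef ("new" prefix, per the Create Rule above)
// and must CGImageRelease it when finished.
- (CGImageRef)newCGImageFromCIImage:(CIImage *)image context:(CIContext *)ctx rect:(CGRect)rect {
    size_t width = (size_t)CGRectGetWidth(rect);
    size_t height = (size_t)CGRectGetHeight(rect);
    size_t bytesPerRow = width * 4;
    void *rawData = malloc(height * bytesPerRow);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Render directly into our own buffer instead of calling createCGImage:fromRect:.
    [ctx render:image toBitmap:rawData rowBytes:bytesPerRow bounds:rect
         format:kCIFormatRGBA8 colorSpace:colorSpace];
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData, height * bytesPerRow, ReleaseBitmapData);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                       (CGBitmapInfo)kCGImageAlphaPremultipliedLast,
                                       provider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return cgImage;
}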
We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send off a frame every couple of seconds. After running for a while, we would end up with quite a few memory pressure messages.
We managed to rectify this by running our processing code in its own autorelease pool like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping):
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {
            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];
            if (imageData) {
                [self processImageData:imageData];
            }
            self.lastFrameSentAt = [NSDate date];
            imageData = nil;
        }
    }
}
I can confirm that this memory leak still exists on iOS 9.2. (I've also posted on the Apple Developer Forum.)
I get the same memory leak on iOS 9.2. I've tested dropping the EAGLContext by using MetalKit and MTLDevice. I've tested using different methods of CIContext, like drawImage, createCGImage, and render, but nothing seems to work.
It is very clear that this is a bug as of iOS 9. Try it out yourself by downloading the example app from Apple (see below), run the same project on a device with iOS 8.4 and then on a device with iOS 9.2, and pay attention to the memory gauge in Xcode.
Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109
Add this at APLEAGLView.h:20:
@property (strong, nonatomic) CIContext *ciContext;
Replace APLEAGLView.m:118 with this:
[EAGLContext setCurrentContext:_context];
_ciContext = [CIContext contextWithEAGLContext:_context];
And finally replace APLEAGLView.m:341-343 with this:
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
@autoreleasepool
{
    CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];
    CIImage *filteredImage = filter.outputImage;
    [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
}
glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);

Replicating convenience implementations in Swift

I am attempting to understand the following code and how you would convert it to Swift. Specifically, I understand this adds an instance method you can call on an instance of CIImage. My question is, how can you do the same thing in Swift?
This code is taken from AAPLAssetViewController.m in Apple's example app using the Photos framework.
@implementation CIImage (Convenience)

- (NSData *)aapl_jpegRepresentationWithCompressionQuality:(CGFloat)compressionQuality {
    static CIContext *ciContext = nil;
    if (!ciContext) {
        EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        ciContext = [CIContext contextWithEAGLContext:eaglContext];
    }
    CGImageRef outputImageRef = [ciContext createCGImage:self fromRect:[self extent]];
    UIImage *uiImage = [[UIImage alloc] initWithCGImage:outputImageRef scale:1.0 orientation:UIImageOrientationUp];
    if (outputImageRef) {
        CGImageRelease(outputImageRef);
    }
    NSData *jpegRepresentation = UIImageJPEGRepresentation(uiImage, compressionQuality);
    return jpegRepresentation;
}

@end
Call it like so:
NSData *jpegData = [myCIImage aapl_jpegRepresentationWithCompressionQuality:0.9f];
From The Swift Programming Language - Extensions:
Extensions add new functionality to an existing class, structure, or enumeration type. (...) Extensions are similar to categories in Objective-C.
https://developer.apple.com/library/ios/documentation/Swift/Conceptual/Swift_Programming_Language/Extensions.html
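A sketch of the equivalent extension, in the Swift 2-era syntax used elsewhere in this thread. Assumptions: the shared context is created with contextWithOptions: rather than the EAGL-backed one (for brevity), and it lives in a file-private global because Swift extensions cannot add stored properties.
private let aapl_sharedCIContext = CIContext(options: nil)

extension CIImage {
    func aapl_jpegRepresentationWithCompressionQuality(compressionQuality: CGFloat) -> NSData? {
        guard let outputImageRef = aapl_sharedCIContext.createCGImage(self, fromRect: extent) else { return nil }
        // CGImage values returned to Swift are memory-managed; no CGImageRelease is needed.
        let uiImage = UIImage(CGImage: outputImageRef, scale: 1.0, orientation: .Up)
        return UIImageJPEGRepresentation(uiImage, compressionQuality)
    }
}
Called the same way as the category, e.g. let jpegData = myCIImage.aapl_jpegRepresentationWithCompressionQuality(0.9).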

Image filtering leads to high memory consumption and crash

I am using the following code for applying image filters. In my app I am filtering for brightness, contrast, and saturation, with a separate slider for each value. As I keep moving the sliders, memory consumption goes over 1.5 GB and the app crashes. Is there a way to reduce this memory consumption for a crash-free implementation?
- (void)setBrightnessAndContrastOf:(UIImage *)image { // forTarget:(UIImageView *)imgView {
    if (!image) {
        return;
    }
    CIImage *inputImage = [[CIImage alloc] initWithImage:image];
    CIFilter *exposureAdjustmentFilter = [CIFilter filterWithName:@"CIColorControls"];
    [exposureAdjustmentFilter setDefaults];
    [exposureAdjustmentFilter setValue:inputImage forKey:@"inputImage"];
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.contrastValue] forKey:@"inputContrast"];     // default = 1.00
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.brightnessValue] forKey:@"inputBrightness"]; // default = 0.00
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.saturationValue] forKey:@"inputSaturation"]; // default = 1.00
    CIImage *outputImage = [exposureAdjustmentFilter valueForKey:@"outputImage"];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef tempImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:tempImage];
    [imageView performSelectorOnMainThread:@selector(setImage:) withObject:newImage waitUntilDone:NO];
    CGImageRelease(tempImage);
    inputImage = nil;
    context = nil;
    outputImage = nil;
    exposureAdjustmentFilter = nil;
}
You're not supposed to do heavy image manipulation on the main thread. Unless you've already implemented multithreading (none is visible in your code snippet), please do so.
You might try:
dispatch_queue_t backgroundQueue = dispatch_queue_create("com.yourorg", DISPATCH_QUEUE_SERIAL);
dispatch_queue_t mainQueue = dispatch_get_main_queue();
dispatch_async(backgroundQueue, ^{
    // setBrightnessAndContrastOf: method goes here
    dispatch_sync(mainQueue, ^{
        // notify the main thread about the processing status
    });
});
Since you're using ARC, a crash due to overconsumption of memory is not very likely. However, if you block the main thread for too long, the watchdog timer takes your app out through the backdoor and shoots it right in the head.
Use Instruments to monitor your heap size and try to find the root cause.
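One more thing worth trying, borrowing from the other answers in this thread: reuse a single CIContext across slider changes instead of creating one per call, and drain temporaries with @autoreleasepool. A hedged sketch, assuming ARC and the same properties as in the question:
- (void)setBrightnessAndContrastOf:(UIImage *)image {
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil]; // created once, not per slider event
    });
    @autoreleasepool {
        // Same filter setup as in the question, but temporaries are drained per call.
        CIImage *inputImage = [[CIImage alloc] initWithImage:image];
        CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
        [filter setValue:inputImage forKey:kCIInputImageKey];
        [filter setValue:@(self.contrastValue) forKey:@"inputContrast"];
        [filter setValue:@(self.brightnessValue) forKey:@"inputBrightness"];
        [filter setValue:@(self.saturationValue) forKey:@"inputSaturation"];
        CIImage *outputImage = filter.outputImage;
        CGImageRef tempImage = [context createCGImage:outputImage fromRect:outputImage.extent];
        UIImage *newImage = [UIImage imageWithCGImage:tempImage];
        CGImageRelease(tempImage);
        [imageView performSelectorOnMainThread:@selector(setImage:) withObject:newImage waitUntilDone:NO];
    }
}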
I am not sure what your setImage: method is doing, but I would move CGImageRelease(tempImage) before the performSelector call.

Error occurring when loading photo library images: ERROR: FigCreateCGImageFromJPEG returned -12910. 423114 bytes. We will fall back to software decode

I'm working on an application in which I load images from the photo library.
I'm using the following code to bind the image to an image view.
- (void)loadImage:(UIImageView *)imgView FileName:(NSString *)fileName
{
    typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
    typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        UIImage *lImage;
        if (iref)
        {
            lImage = [UIImage imageWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]];
        }
        else
        {
            lImage = [UIImage imageNamed:@"Nofile.png"];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [imgView setImage:lImage];
        });
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        UIImage *images = [UIImage imageNamed:@"Nofile.png"];
        dispatch_async(dispatch_get_main_queue(), ^{
            [imgView setImage:images];
        });
    };

    NSURL *asseturl = [NSURL URLWithString:fileName];
    ALAssetsLibrary *asset = [[ALAssetsLibrary alloc] init];
    [asset assetForURL:asseturl
           resultBlock:resultblock
          failureBlock:failureblock];
}
But when I run it, an error appears and the application sometimes crashes.
The error printed in the console is:
*** ERROR: FigCreateCGImageFromJPEG returned -12910. 423114 bytes. We will fall back to software decode.
Received memory warning.
My photo library contains high-resolution images, each between 10 and 30 MB.
Finally, I fixed the issue.
I think the problem is with fetching the full-resolution image.
Instead of:
CGImageRef iref = [rep fullResolutionImage];
I used:
CGImageRef iref = [myasset aspectRatioThumbnail];
and everything worked fine: no error in the console and no crash, but the quality/resolution of the image is reduced.
I have a similar error:
*** ERROR: FigCreateCGImageFromJPEG returned -12909. 0 bytes. We will fall back to software decode.
The app crashes on this call:
CGImageRef originalImage = [representation fullResolutionImage];
I fixed it by replacing that with:
CGImageRef originalImage = [representation fullScreenImage];
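Putting the two fixes together, a hedged sketch of choosing a representation defensively (names as in the question's code; fullScreenImage and aspectRatioThumbnail both return orientation-adjusted images):
// Prefer the decoded, screen-sized rendition and fall back to the asset's
// thumbnail, avoiding the full-resolution decode that triggers the error.
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage]; // not owned: no "Create"/"Copy" in the name
if (!iref) {
    iref = [myasset aspectRatioThumbnail];
}
UIImage *lImage = iref ? [UIImage imageWithCGImage:iref]
                       : [UIImage imageNamed:@"Nofile.png"];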
[UIImage imageWithCGImage:] returns an autoreleased object, which can pile up in the autorelease pool when the images are large. What about using alloc/init instead?
lImage = [[[UIImage alloc] initWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]] autorelease];

Memory management in dispatch

In my iPad app, I'm trying to render thumbnails of all the views in the background using the following code:
NSString *path = [self.page previewPathForOrientation:currentOrientation];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
    @autoreleasepool {
        UIGraphicsBeginImageContextWithOptions(self.previewView.bounds.size, NO, 0.0);
        [self.previewView.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        self.previewView = nil;

        float scale = [UIScreen mainScreen].scale;
        CGRect previewRect = currentOrientation == Landscape ? [[OrientationLandscape singleton] frameForPreviewImage] : [[OrientationPortrait singleton] frameForPreviewImage];
        CGSize previewSize = CGSizeMake(previewRect.size.width * scale, previewRect.size.height * scale);
        UIImage *scaledImage = [image scaleImageToSize:previewSize];

        CGImageDestinationRef imageDestination = CGImageDestinationCreateWithURL((__bridge CFURLRef)[[NSURL alloc] initFileURLWithPath:path], (__bridge CFStringRef)@"public.png", 1, NULL);
        CGImageDestinationAddImage(imageDestination, [scaledImage CGImage], NULL);
        CGImageDestinationFinalize(imageDestination);
        CFRelease(imageDestination);

        NSFileManager *fileMngr = [[NSFileManager alloc] init];
        if (![fileMngr fileExistsAtPath:path])
        {
            ZAssert(0, @"could not save preview file");
        }

        dispatch_async(dispatch_get_main_queue(), ^{
            rendered++;
            //DLog(@"rendered %d items", rendered);
            [GetController addSkipBackupAttributeToItemAtPath:path];
            [self.page setPreviewRenderedForOrientation:currentOrientation];
            contentsCount = 0;
            currentContentIndex = 0;
            //[self prepareOtherOrientation];
            if (self.journal == nil && (![self.page previewRenderedForOrientation:Landscape] || ![self.page previewRenderedForOrientation:Portrait])) {
                [self appendPage:self.page];
            }
            DLog(@"rendered page %@ in orientation %d", self.page, currentOrientation);
            self.page = nil;
            [self retry];
        });
    }
});
The retry method uses an NSTimer to start the same function again after a short delay, with a different page. Watching the Allocations instrument, the heap just keeps growing; after a while I get memory warnings, and shortly after that the app crashes.
Everything works fine when I remove all the dispatch calls, but of course that's not what I want. Also, when I increase the delay in the retry method to, say, 5 seconds, the problem disappears too, so it seems memory isn't released when things are processed in quick succession.
I have made absolutely sure that this method isn't running more than once at a time... any ideas what's going on here?
