CGImageRef consumes a lot of memory - iOS

I am creating a blurred image for one of my app's screens. For this I am using the following code:
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:5] forKey:@"inputRadius"];
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
blurrImage = [UIImage imageWithCGImage:cgImage];
self.blurrImageView.image = blurrImage;
CGImageRelease(cgImage);
From the above code I am getting the correct blurred image, but the problem is at this line: CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
Up to this line the memory usage shown is normal, but after it the memory usage increases abnormally.
Memory usage keeps increasing over the execution of this method; I have screenshots of the usage before and after executing the line CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
Is this common behaviour? I searched for an answer but didn't find one, so if anyone has faced the same problem, please help me with this.
One thing to note: I am not using ARC.

I experience the same memory consumption problems with Core Image.
If you're looking for alternatives, in iOS 7 you can use the UIImage+ImageEffects category, which is available as part of the iOS_UIImageEffects project on the WWDC 2013 sample code page. It provides a few new methods:
- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;
These don't suffer from the memory consumption issues that you experience with Core Image. (Plus, it's a much faster blurring algorithm.)
This technique is illustrated in WWDC 2013 video Implementing Engaging UI on iOS.
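For example, a minimal sketch of the blur case using that category (assuming the UIImage+ImageEffects files have been added to the project; blurrImageView is the image view from the question):

#import "UIImage+ImageEffects.h"

// Snapshot the view, then blur the snapshot with the WWDC 2013 category.
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// A saturationDeltaFactor of 1.0 leaves saturation unchanged.
self.blurrImageView.image = [snapshot applyBlurWithRadius:5
                                                tintColor:nil
                                    saturationDeltaFactor:1.0
                                                maskImage:nil];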

The fact that you are rendering a snapshot of the view can affect the memory usage; on a Retina display it will use more than on a non-Retina device. The doubling is OK in my opinion, because you have both the original UIImage and the blurred image living in memory, and the context will probably hold on to some memory as well. My guess:
You are using a lot of autoreleased objects; they will stay in memory until the pool is drained. Try wrapping the code in an @autoreleasepool block:
@autoreleasepool {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:5] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [context createCGImage:result fromRect:[inputImage extent]];
    blurrImage = [UIImage imageWithCGImage:cgImage];
    self.blurrImageView.image = blurrImage;
    CGImageRelease(cgImage);
}
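Also, since the image is going to be blurred anyway, a further saving is possible (a sketch, on the assumption that a slightly softer source image is acceptable): render the snapshot at a scale of 1.0 instead of the Retina screen scale, which quarters the bitmap's memory on a 2x display.

// Scale 1.0 forces a bounds-sized bitmap in pixels rather than
// bounds * screen scale; the blur hides the lost detail.
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();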

Related

Unable to draw CIImage on GLKView after a few frames since updating to iOS 10.2?

I am using the following code in my application; it was performing quite fine, drawing a CIImage on a GLKView again and again as received from AVCaptureOutput's -didOutputSampleBuffer, while I was on iOS <= 10.1.*
After updating the device to iOS 10.2.1 it has stopped working: after calling it for a few frames, the app crashes with a low-memory warning, whereas on iOS 10.1.1 and below the app runs smoothly even on an older device like the iPhone 5S.
[_glkView bindDrawable];
if (self.eaglContext != [EAGLContext currentContext]) {
    [EAGLContext setCurrentContext:self.eaglContext];
}
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
if (ciImage) {
    [_ciContext drawImage:ciImage inRect:gvRect fromRect:dRect];
}
[_glkView display];
This is how I am making the CIImage.
- (CIImage *)ciImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer ofSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CIImage *croppedImage = nil;
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:(NSDictionary *)attachments];
    if (attachments)
        CFRelease(attachments);
    croppedImage = ciImage;
    CIFilter *scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [scaleFilter setValue:croppedImage forKey:@"inputImage"];
    [scaleFilter setValue:[NSNumber numberWithFloat:self.zoom_Resize_Factor == 1 ? 0.25 : 0.5] forKey:@"inputScale"];
    [scaleFilter setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputAspectRatio"];
    croppedImage = [scaleFilter valueForKey:@"outputImage"];
    NSDictionary *options = @{(id)kCIImageAutoAdjustRedEye : @(false)};
    NSArray *adjustments = [ciImage autoAdjustmentFiltersWithOptions:options];
    for (CIFilter *filter in adjustments) {
        [filter setValue:croppedImage forKey:kCIInputImageKey];
        croppedImage = filter.outputImage;
    }
    CIFilter *selectedFilter = [VideoFilterFactory getFilterWithType:self.selectedFilterType]; // This line needs to be removed from here
    croppedImage = [VideoFilterFactory applyFilter:selectedFilter OnImage:croppedImage];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return croppedImage;
}
Here is an Imgur link to the VM Tracker and OpenGL ES Instruments results, in case it helps: http://imgur.com/a/u6Vyo. Thanks.
Your GLKView rendering implementation looks fine; the issue seems to be coming from the amount of processing you're doing on the pixel buffer after converting it into a CIImage.
The Imgur link you shared also shows that the GLKView is unable to prepare the VideoTexture object correctly, most probably due to the memory pressure created in each iteration. You need to optimise this CIFilter processing.
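One way to cut the per-frame cost (a sketch, not the asker's code; _scaleFilter is a hypothetical ivar, and it assumes the scale factor does not change between frames) is to create the filters once and only swap the input image inside the capture callback. Calling autoAdjustmentFiltersWithOptions: on every frame is particularly expensive, since it has to analyze the image each time.

// Created once, e.g. during capture session setup.
_scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
[_scaleFilter setValue:@(self.zoom_Resize_Factor == 1 ? 0.25 : 0.5) forKey:@"inputScale"];
[_scaleFilter setValue:@(1.0) forKey:@"inputAspectRatio"];

// Per frame: only the input image changes.
[_scaleFilter setValue:ciImage forKey:kCIInputImageKey];
CIImage *scaled = _scaleFilter.outputImage;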

Core Image Filter Crash issue

I'm trying to apply filters to a large image, with dimensions of roughly 5000x3000, which I downloaded from a web search. When applying these filters to the large image the app crashes and is terminated. Below is the code I'm currently using for the filter preview:
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter.filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *displayImage = [UIImage imageWithCGImage:cgimg];
The line of code below is causing the issue; has anyone come across this?
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
At this line
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
you create a new image reference and then create a new image from it.
Try adding this line below your last line:
CGImageRelease(cgimg);
ARC does not automatically release a CGImageRef, so you have to release this reference manually; then it will work on your side.
Code:
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter.filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *displayImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg); // this line releases the image reference

iOS-generated QR code not recognized on other platforms

I am generating a QR code image using the CIQRCodeGenerator filter available from CIFilter. The image is generated fine, and when it's displayed I can read it using AVCaptureSession. However, when I try to scan the QR code on a different platform (Android, BlackBerry, iOS 6), it doesn't recognize the image. According to Apple's documentation the generated image is compliant with the ISO/IEC 18004:2006 standard. Is the problem that I need something compliant with ISO 18004:2000?
Here is the code I'm using to generate the image:
NSData *stringData = [stringToEncode dataUsingEncoding:NSISOLatin1StringEncoding];
CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
[qrFilter setValue:stringData forKey:@"inputMessage"];
[qrFilter setValue:@"M" forKey:@"inputCorrectionLevel"];
CIImage *qrImage = qrFilter.outputImage;
return [UIImage squareUIImageFromCIImage:qrImage withSize:size];
Here is a sample QR code: (image attached in the original post)
Does anybody know if there is a way to generate a more universally recognized QR code image using CIFilter? I'd really prefer not to go back to using ZXing.
Thank you!
I'm not sure if the slight change makes a difference, but here's a snippet from my recent project that generates a QR code that is scanned from an iPad camera successfully in under a second:
CIFilter *filter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
[filter setDefaults];
NSData *data = [accountNumber dataUsingEncoding:NSUTF8StringEncoding];
[filter setValue:data forKey:@"inputMessage"];
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage
                                   fromRect:[outputImage extent]];
UIImage *barcode = [UIImage imageWithCGImage:cgImage
                                       scale:1.
                                 orientation:UIImageOrientationUp];
// EDIT:
CFRelease(cgImage);
You are using the ISO-8859-1 character set, but different QR code readers assume different things about the character encoding depending on which version of the standard they're following. UTF-8 seems to be more common than ISO-8859-1.
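Applied to the asker's code, that would be a one-line change (UTF-8 can represent any NSString content, so the encoding never fails):

NSData *stringData = [stringToEncode dataUsingEncoding:NSUTF8StringEncoding];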

Image filtering leads to high memory consumption and crash

I am using the following code for applying image filters. In my app I am filtering for brightness, contrast, and saturation, using three separate sliders to change the values. As I continue to move the sliders, the memory consumption goes over 1.5 GB and the app crashes. Is there a way to reduce this memory consumption for a crash-free implementation?
- (void)setBrightnessAndContrastOf:(UIImage *)image { // forTarget:(UIImageView *)imgView {
    if (!image) {
        return;
    }
    CIImage *inputImage = [[CIImage alloc] initWithImage:image];
    CIFilter *exposureAdjustmentFilter = [CIFilter filterWithName:@"CIColorControls"];
    [exposureAdjustmentFilter setDefaults];
    [exposureAdjustmentFilter setValue:inputImage forKey:@"inputImage"];
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.contrastValue] forKey:@"inputContrast"]; // default = 1.00
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.brightnessValue] forKey:@"inputBrightness"]; // default = 0.00
    [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:self.saturationValue] forKey:@"inputSaturation"]; // default = 1.00
    CIImage *outputImage = [exposureAdjustmentFilter valueForKey:@"outputImage"];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef tempImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:tempImage];
    [imageView performSelectorOnMainThread:@selector(setImage:) withObject:newImage waitUntilDone:NO];
    CGImageRelease(tempImage);
    inputImage = nil;
    context = nil;
    outputImage = nil;
    exposureAdjustmentFilter = nil;
}
You're not supposed to do heavy image manipulation on the main thread. Unless you've already implemented multithreading (which is not shown in your code snippet), please do so.
You might try:
dispatch_queue_t backgroundQueue = dispatch_queue_create("com.yourorg", DISPATCH_QUEUE_SERIAL);
dispatch_queue_t mainQueue = dispatch_get_main_queue();

dispatch_async(backgroundQueue, ^{
    // setBrightnessAndContrastOf: method goes here
    dispatch_sync(mainQueue, ^{
        // notify main thread about process status
    });
});
Since you're using ARC, a crash due to over-consumption of memory is not very likely. However, if you block the main thread for too long, the watchdog timer takes your app out through the back door and shoots it right in the head.
Use Instruments to monitor your heap size and try to find the root cause.
I am not sure what your setImage method is doing, but I would move CGImageRelease(tempImage) before the performSelector call.
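Another likely contributor (an assumption on my part, since the snippet creates a new CIContext on every slider change): CIContext is expensive to create and is meant to be reused. Keeping one context around, for example as a lazily created property, and using it for every filter pass should reduce the churn considerably.

// Hypothetical lazily-created context, reused for every slider change.
- (CIContext *)filterContext {
    if (!_filterContext) {
        _filterContext = [CIContext contextWithOptions:nil];
    }
    return _filterContext;
}

// ...then inside setBrightnessAndContrastOf: use self.filterContext
// instead of [CIContext contextWithOptions:nil].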

Better performance for FFmpeg decoding when editing image contrast for every frame

I'm developing an RTSP player using the FFmpeg library, and I must edit the image contrast for every frame of video. Searching the web, I found this code for editing contrast:
- (UIImage *)contrast
{
    CIImage *beginImage = [CIImage imageWithCGImage:[self CGImage]];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    self = newImg;
    CGImageRelease(cgimg);
    return self;
}
It works perfectly, but on the iPad I lose performance, and while decoding video a lot of noise shows up on screen. Is there a better-performing way to modify the contrast of an image?
Yes, there is a better way: use OpenGL ES 2.0 shaders, so that the hard work is done on the GPU.
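For illustration only (a hypothetical sketch, not the answerer's code; the uniform and varying names are made up): the contrast adjustment itself is a few lines of GLSL ES 2.0 in a fragment shader, which is essentially what GPU frameworks such as GPUImage do under the hood.

// Scales each pixel's distance from mid-grey by a 'contrast' uniform.
static NSString *const kContrastFragmentShader =
    @"varying highp vec2 textureCoordinate;"
    @"uniform sampler2D inputTexture;"
    @"uniform lowp float contrast;"
    @"void main() {"
    @"    lowp vec4 color = texture2D(inputTexture, textureCoordinate);"
    @"    gl_FragColor = vec4((color.rgb - 0.5) * contrast + 0.5, color.a);"
    @"}";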