I'm using the method below to blur some images. Profiling with Instruments shows the CIImages are leaking. I tried wrapping the code in an @autoreleasepool, but no luck. Any ideas?
-(UIImage *)blurImage:(UIImage *)image withStrength:(float)strength
{
    @autoreleasepool {
        CIContext *context = [CIContext contextWithOptions:nil];
        CIImage *inputImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
        [filter setValue:inputImage forKey:@"inputImage"];
        [filter setValue:[NSNumber numberWithFloat:strength] forKey:@"inputRadius"];
        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        float scale = [[UIScreen mainScreen] scale];
        CIImage *cropped = [result imageByCroppingToRect:CGRectMake(0, 0, image.size.width * scale, image.size.height * scale)];
        CGRect extent = [cropped extent];
        CGImageRef cgImage = [context createCGImage:cropped fromRect:extent];
        UIImage *returnImage = [UIImage imageWithCGImage:cgImage].copy;
        CGImageRelease(cgImage);
        return returnImage;
    }
}
I see the same leak you're seeing when profiling the code. Try this instead, which seems to avoid the leak and gives you the same results:
- (UIImage*)blurImage:(UIImage*)image withStrength:(float)strength
{
    @autoreleasepool {
        CIImage* inputImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur"];
        [filter setValue:inputImage forKey:@"inputImage"];
        [filter setValue:[NSNumber numberWithFloat:strength] forKey:@"inputRadius"];
        CIImage* result = [filter valueForKey:kCIOutputImageKey];
        float scale = [[UIScreen mainScreen] scale];
        CIImage* cropped = [result imageByCroppingToRect:CGRectMake(0, 0, image.size.width * scale, image.size.height * scale)];
        return [[UIImage alloc] initWithCIImage:cropped];
    }
}
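One caveat with this version (a known UIKit behavior, not something stated in the original answer): a UIImage created with initWithCIImage: is CIImage-backed, so its CGImage property is NULL. It displays fine in a UIImageView, but code that later reads .CGImage will not work. A sketch, where `photo` and `self.imageView` are placeholder names:

```objc
UIImage *blurred = [self blurImage:photo withStrength:8.0f];
self.imageView.image = blurred;  // works: UIImageView can render a CIImage-backed image
CGImageRef cg = blurred.CGImage; // NULL: no CGImage backing was ever created
if (cg == NULL) {
    // If a CGImage is required later (e.g. UIImagePNGRepresentation may return nil
    // for CIImage-backed images), render through a CIContext first.
}
```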
Did you try setting the CIImages to nil?
-(UIImage *)blurImage:(UIImage *)image withStrength:(float)strength
{
    @autoreleasepool {
        // ...your code...
        CGImageRelease(cgImage);
        cropped = nil;
        result = nil;
        inputImage = nil;
        context = nil;
        return returnImage;
    }
}
Related
I am manipulating an image with the following method and there is a memory leak on the first line. The code is not using ARC, so I have to release the memory manually. How can I fix the leak on the first line of the following function?
-(UIImage*) manipulateImage :(UIImage *)image :(int)intType
{
    CIImage* inputImage = [[[CIImage alloc] initWithImage:image] autorelease]; // leak is here
    CIFilter* filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@(intType) forKey:kCIInputSaturationKey];
    CIImage* result = [filter valueForKey:kCIOutputImageKey];
    CIImage* returnImage = [result imageByCroppingToRect:[result extent]];
    return [[[UIImage alloc] initWithCGImage:returnImage.CGImage] autorelease];
}
I am using CIFilter for image filtering in my iOS app. The following code works, but processing the image takes too long (I can't get a real-time effect). How can I solve this?
cameraImgView.image = [ImageProcessing sepian:myOriginalImage withInensity:0.5]; //method call
Method Definition
+(UIImage*)sepian:(UIImage*)img withInensity:(float)intensity
{
    CIImage *cimage = [[CIImage alloc] initWithImage:img];
    CIFilter *sepiaFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [sepiaFilter setDefaults];
    [sepiaFilter setValue:cimage forKey:@"inputImage"];
    [sepiaFilter setValue:[NSNumber numberWithFloat:intensity] forKey:@"inputIntensity"];
    CIImage *outputImage = [sepiaFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *resultUIImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return resultUIImage;
}
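On the real-time question: the biggest cost in the method above is usually [CIContext contextWithOptions:nil], which is expensive to create. Apple's Core Image documentation recommends creating a context once and reusing it across renders. A minimal sketch (the `sharedContext` helper name is my own, not from the original code):

```objc
// Cache a single CIContext instead of building a new one per image.
+ (CIContext *)sharedContext {
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}
```

You would then replace the contextWithOptions: line in the filter method with [self sharedContext]. For a genuinely live preview you would also move rendering off the main thread or use a GPU-backed context.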
Try this:
filter = [CIFilter filterWithName:@"CISepiaTone"
                    keysAndValues:kCIInputImageKey, rawImageData,
                                  @"inputIntensity", @3.5, nil];
Then get the output from the filter like below:
CIImage *filteredImageData = [filter valueForKey:@"outputImage"];
UIImage *filteredImage = [UIImage imageWithCIImage:filteredImageData];
self.img_view.image = filteredImage;
After changing the CIFilter on an imageView image, the image appears blurry. Trying to figure out why.
Here is the code:
- (UIImage *)convertImageToGrayScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTonal"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}

- (UIImage *)convertImageToColorScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}

- (IBAction)colorize:(id)sender {
    self.imageView.image = nil;
    if (self.hasColor) {
        self.hasColor = FALSE;
        self.imageView.image = [self convertImageToGrayScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"remove Color");
    }
    else {
        self.hasColor = TRUE;
        self.imageView.image = [self convertImageToColorScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"Add Color");
    }
}
When I add the filter the first time, it looks good. When I click the button to change the filter, it's blurry.
Try replacing the line return [UIImage imageWithCIImage:filter.outputImage]; with:
CIImage *outputImage = [filter outputImage];
CGImageRef imageRef = [[CIContext contextWithOptions:nil] createCGImage:outputImage fromRect:outputImage.extent];
UIImage *newImg = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return newImg;
My app crashes when I attempt to apply a filter to my user-selected UIImage (it was working fine before applying the filter). I added and imported the CoreImage framework to my project so I could create filters for user-selected images.
I am applying the filter by creating a category on UIImage (based on Apple's documentation) and then calling the corresponding method on the UIImage selected by the user. Below are my category header and implementation; what am I doing wrong? (Note: randColor is a UIColor category class method that generates a random color.)
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>
#import "UIColor+CustomColorCategory.h"

@interface UIImage (MonoChromeFilter)

- (UIImage *)applyMonoChromeWithRandColor;

@end

#import "UIImage+MonoChromeFilter.h"

@implementation UIImage (MonoChromeFilter)

- (UIImage *)applyMonoChromeWithRandColor
{
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *ciImage = [[CIImage alloc] initWithImage:self];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setValue:ciImage forKey:kCIInputImageKey];
    [filter setValue:[UIColor randColor] forKey:kCIAttributeTypeColor];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGRect extent = [result extent];
    CGImageRef cgImage = [context createCGImage:result fromRect:extent];
    UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
    return filteredImage;
}

@end
Here is the method in the viewController where this category is being called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:^{
        UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];
        editedImage = [editedImage applyMonoChromeWithRandColor];
        self.blogImageOutlet.image = editedImage;
        self.blogImageOutlet.layer.cornerRadius = self.blogImageOutlet.frame.size.width / 2.0;
        [self.blogImageOutlet setClipsToBounds:YES];
        [self saveImageToLibrary:editedImage];
    }];
}
I figured it out! After debugging and using some other projects as a reference, I realized I had two issues. First, I was passing a UIColor where a CIColor is required, which is not directly possible; I had to convert the UIColor to a CIColor first. Second, I was not using the correct strings for the CIFilter value keys. Here is the code after modifications (it works now!):
#import "UIImage+MonoChromeFilter.h"

@implementation UIImage (MonoChromeFilter)

+ (UIImage *)applyMonoChromeWithRandColor:(UIImage *)uIImage
{
    // Convert UIColor to CIColor
    CGColorRef colorRef = [UIColor randColor].CGColor;
    NSString *colorString = [CIColor colorWithCGColor:colorRef].stringRepresentation;
    CIColor *coreColor = [CIColor colorWithString:colorString];
    CIContext *context = [CIContext contextWithOptions:nil];
    // Convert UIImage to CIImage
    CIImage *ciImage = [[CIImage alloc] initWithImage:uIImage];
    // Set values for the CIColorMonochrome filter
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setValue:ciImage forKey:kCIInputImageKey];
    [filter setValue:@1.0 forKey:@"inputIntensity"];
    [filter setValue:coreColor forKey:@"inputColor"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGRect extent = [result extent];
    CGImageRef cgImage = [context createCGImage:result fromRect:extent];
    UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
    CGImageRelease(cgImage); // release the CGImage created by the context to avoid a leak
    return filteredImage;
}

@end
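Note that the fixed version turned applyMonoChromeWithRandColor into a class method taking the image as a parameter, so the header declaration and the call site in the picker delegate have to change with it. A sketch of the adjusted call (assuming the rest of the delegate method stays as shown earlier):

```objc
// Header now declares: + (UIImage *)applyMonoChromeWithRandColor:(UIImage *)uIImage;
UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];
editedImage = [UIImage applyMonoChromeWithRandColor:editedImage]; // class method, sent to UIImage itself
self.blogImageOutlet.image = editedImage;
```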
Try this; it works like a champ for me.
We have applied different CIFilters to a single image.
Below is the implementation:
// CISepiaTone effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, image,
                                            @"inputIntensity", @1, nil];
CIImage *result = [filter valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];

// CIVignette effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setDefaults];
[vignette setValue:image forKey:@"inputImage"];
[vignette setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];
[vignette setValue:[NSNumber numberWithFloat:10.0] forKey:@"inputRadius"];
CIImage *result = [vignette valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];
For the detailed implementation you can refer to this GitHub project: ImageFilter.
Hope this helps someone.
In my iOS app I want to apply a CIGaussianBlur filter to a UIImage, but when it gets an image with a large height, the image comes out rotated.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [[CIImage alloc] initWithImage:image]; // image to blur
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[blurFilter setValue:inputImage forKey:@"inputImage"];
CGFloat blurLevel = 0.0f; // set blur level
[blurFilter setValue:[NSNumber numberWithFloat:blurLevel] forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:@"outputImage"];
// Inset the crop rect by the blur radius to trim the soft edges
CGRect rect = inputImage.extent;
rect.origin.x += blurLevel;
rect.origin.y += blurLevel;
rect.size.height -= blurLevel * 2.0f;
rect.size.width -= blurLevel * 2.0f;
CGImageRef cgImage = [context createCGImage:outputImage fromRect:rect];
// Recreate the UIImage with the original scale and orientation
UIImageOrientation originalOrientation = _imageView.image.imageOrientation;
CGFloat originalScale = _imageView.image.scale;
UIImage *fixedImage = [UIImage imageWithCGImage:cgImage scale:originalScale orientation:originalOrientation]; // output of CIGaussianBlur
It works for me.
_imageView.image = image;
UIImageOrientation originalOrientation = _imageView.image.imageOrientation;
CGFloat originalScale = _imageView.image.scale;
UIImage *fixedImage = [UIImage imageWithCGImage:cgImage scale:originalScale orientation:originalOrientation];