Image becomes blurred when applying CIFilter - iOS

I am working on an iOS app that crops a rectangular image from the camera. I am using CIDetector to get the rect features and CIFilter to crop the rectangular image, but after applying the filter the quality of the resulting image becomes very poor.
Here is my code below.
I am getting the video capture output from the following delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Convert into a CIImage to find the rect features.
    self.sourceImage = [[CIImage alloc] initWithCGImage:[self imageFromSampleBuffer:sampleBuffer].CGImage options:nil];
}
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciimg = [CIImage imageWithCVPixelBuffer:pb];
    // show result
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef ref = [context createCGImage:ciimg fromRect:ciimg.extent];
    UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationUp];
    CFRelease(ref);
    return image;
}
And I am running an NSTimer in the background which detects rect features in the captured source image every 0.2 seconds:
- (void)performRectangleDetection:(CIImage *)image {
    if (image == nil)
        return;
    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    if ([rectFeatures count] > 0) {
        [self capturedImage:image];
    }
}
- (void)capturedImage:(CIImage *)image
{
    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    CIImage *resultImage = [image copy];
    for (CIRectangleFeature *feature in rectFeatures) {
        resultImage = [image imageByApplyingFilter:@"CIPerspectiveCorrection"
                               withInputParameters:@{@"inputTopLeft": [CIVector vectorWithCGPoint:feature.topLeft],
                                                     @"inputTopRight": [CIVector vectorWithCGPoint:feature.topRight],
                                                     @"inputBottomLeft": [CIVector vectorWithCGPoint:feature.bottomLeft],
                                                     @"inputBottomRight": [CIVector vectorWithCGPoint:feature.bottomRight]}];
    }
    UIImage *capturedImage = [[UIImage alloc] initWithCIImage:resultImage];
    UIImage *finalImage = [self imageWithImage:capturedImage scaledToSize:capturedImage.size];
}
The finalImage is produced by passing the captured image through this method:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
The final image sometimes comes out blurred. Is that because of the filter or because of the camera output image? Please help me solve this.

It is likely that you are not recreating the final image with the correct scale factor.
UIImage *finalImage = [UIImage imageWithCGImage:resultImage
                                          scale:original.scale
                                    orientation:original.imageOrientation];
If this doesn't solve the issue, please provide more sample code from the camera input and show how you converted the final CIImage from the filters into a UIImage.
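For reference, here is a minimal sketch of rendering the perspective-corrected CIImage through an explicit CIContext instead of -[UIImage initWithCIImage:]; folding the original scale and orientation back in is my assumption about the intent of the snippet above, not something the question's code does:
- (UIImage *)imageFromFilteredCIImage:(CIImage *)resultImage original:(UIImage *)original {
    // Render the filtered CIImage to a concrete CGImage at full resolution,
    // then rebuild the UIImage with the source image's scale and orientation.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:resultImage fromRect:resultImage.extent];
    UIImage *finalImage = [UIImage imageWithCGImage:cgImage
                                              scale:original.scale
                                        orientation:original.imageOrientation];
    CGImageRelease(cgImage);
    return finalImage;
}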

Use the following method to crop the image:
- (UIImage *)cropImage:(UIImage *)image withRect:(CGRect)rect {
    CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // CGImageCreateWithImageInRect returns an owned reference
    return croppedImage;
}
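One usage note (my addition, not part of the original answer): CGImageCreateWithImageInRect works in the pixel coordinates of the underlying CGImage, so a rect expressed in points generally needs to be scaled by the image's scale first, roughly like this:
// Hypothetical call site: convert a rect in points to pixels before cropping.
CGFloat s = image.scale;
CGRect pixelRect = CGRectMake(rect.origin.x * s, rect.origin.y * s,
                              rect.size.width * s, rect.size.height * s);
UIImage *cropped = [self cropImage:image withRect:pixelRect];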

Related

How to remove RedEye from image? [duplicate]

I want to remove the red-eye effect from a photo, but I have not found any sample. Can anyone help me with working demo code or a code snippet?
Thanks.
Use the below category on UIImage:
@interface UIImage (Utilities)
- (UIImage *)redEyeCorrection;
@end

@implementation UIImage (Utilities)
- (UIImage *)redEyeCorrection
{
    CIImage *ciImage = [[CIImage alloc] initWithCGImage:self.CGImage];
    // Get the auto-adjustment filters (red-eye only, since enhancement is disabled) and apply them to the image
    NSArray *filters = [ciImage autoAdjustmentFiltersWithOptions:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:kCIImageAutoAdjustEnhance]];
    for (CIFilter *filter in filters)
    {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }
    // Create the corrected image
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ctx createCGImage:ciImage fromRect:[ciImage extent]];
    UIImage *final = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return final;
}
@end
Usage: sample code is given below.
UIImage *redEyeImage = [UIImage imageNamed:@"redEye.jpg"];
if (redEyeImage) {
    UIImage *newRemovedRedEyeImage = [redEyeImage redEyeCorrection];
    if (newRemovedRedEyeImage) {
        imgView.image = newRemovedRedEyeImage;
    }
}
Refer to the NYXImagesKit UIImage Enhancing link.
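A side note (my assumption, not part of the original answer): creating a CIContext on every call is relatively expensive, so if you correct many photos it may help to reuse a single context inside redEyeCorrection, for example:
// Hypothetical helper: lazily create one shared CIContext and reuse it
// instead of building a new context on each call.
static CIContext *SharedCIContext(void) {
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}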

Why is my picture getting blurry after switching CIFilter?

After changing the CIFilter on an imageView image, the image appears blurry. Trying to figure out why.
Here is the code:
- (UIImage *)convertImageToGrayScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTonal"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}

- (UIImage *)convertImageToColorScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}

- (IBAction)colorize:(id)sender {
    self.imageView.image = nil;
    if (self.hasColor) {
        self.hasColor = FALSE;
        self.imageView.image = [self convertImageToGrayScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"remove Color");
    }
    else {
        self.hasColor = TRUE;
        self.imageView.image = [self convertImageToColorScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"Add Color");
    }
}
When I add the filter the first time, it looks good. When I click the button to change the filter, it's blurry.
Try replacing the line return [UIImage imageWithCIImage:filter.outputImage]; with:
CIImage *outputImage = [filter outputImage];
CGImageRef imageRef = [[CIContext contextWithOptions:nil] createCGImage:outputImage fromRect:outputImage.extent];
UIImage *newImg = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return newImg;
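For reference, a minimal sketch of the grayscale method with that fix folded in (the structure mirrors the question's code; rendering through a CIContext gives the returned UIImage a concrete CGImage backing instead of a lazily evaluated CIImage):
- (UIImage *)convertImageToGrayScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTonal"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];

    // Render the filter output to a CGImage before wrapping it in a UIImage.
    CIImage *outputImage = filter.outputImage;
    CGImageRef imageRef = [[CIContext contextWithOptions:nil] createCGImage:outputImage fromRect:outputImage.extent];
    UIImage *newImg = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return newImg;
}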

CGImageRef not being released under ARC

I know this question has been asked before but nothing seems to work.
I have a static method that returns a filtered image using CIImage and CGImage. I used the example on RayWenderlich.com and changed it to return a UIImage instead of setting it directly on a UIImageView.
My problem is that the CGImageRef cgimg is never released, which results in a rather fast memory leak. What am I doing wrong?
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;
    @autoreleasepool {
        CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        CIContext *context = [CIContext contextWithOptions:nil];
        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
        [filter setValue:beginImage forKey:kCIInputImageKey];
        CIImage *outputImage = [filter outputImage];
        CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
        UIImage *newImage = [UIImage imageWithCGImage:cgimg];
        CGImageRelease(cgimg);
        return newImage;
    }
}
It looks like cgimg is being released. I assume you mean that the CGImage is not deallocated. You wouldn't expect it to be, since you pass it to UIImage, which likely retains it.
Most likely you then later leak the UIImage (or keep it around when you didn't mean to, which is similar to a leak). I would audit the code around where you use and release the UIImage, and the CGImageRef will likely take care of itself.
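To illustrate the point, here is a minimal sketch of what such an audit often turns up when filtering many images in a loop; everything around the filter call (imageURLs, MyFilterClass, OutputPathForURL) is a hypothetical placeholder, not code from the question:
// Without a per-iteration autorelease pool, every autoreleased UIImage
// (and the CGImage it retains) stays alive until the whole loop finishes.
for (NSURL *url in imageURLs) {
    @autoreleasepool {
        UIImage *source = [UIImage imageWithContentsOfFile:url.path];
        UIImage *filtered = [MyFilterClass image:source withFilterName:@"CIPhotoEffectProcess"];
        // Persist and drop the result instead of holding every UIImage in memory.
        [UIImageJPEGRepresentation(filtered, 0.9) writeToFile:OutputPathForURL(url) atomically:YES];
    } // autoreleased intermediates are released here on each pass
}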
Try this way:
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;
    CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
    [filter setValue:beginImage forKey:kCIInputImageKey];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    beginImage = nil;
    context = nil;
    filter = nil;
    outputImage = nil;
    CGImageRelease(cgimg);
    return newImage;
}

iOS - Setting blurred image on top of other views, odd issues

So, I've got an odd scenario.
In my iOS app, I'm trying to blur the content area of the screen when a popover is opened.
I have this working when using Core Image, but only with Gaussian blur; none of the other blurs work, which is odd.
I tried doing the same with GPUImage, and it blurs far faster, but doesn't actually put the view on top of the other views!
To summarize: in the source below, setBlurOnView works properly; however, setBlurOnViewWithGPUImage does not appear to be working. The blur view (tag 6110) is created, but the app doesn't actually blur.
Note: This is on iOS 6, in the simulator.
Here's the relevant source:
// ScreenBlur.m
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <GPUImage/GPUImage.h>
#import "ScreenBlur.h"
#import "GlobalData.h"
#import "Logger.h"
@implementation ScreenBlur

+ (void)setBlurOnViewWithGPUImage:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view]];
    GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
    [imageSource addTarget:blur];
    [imageSource processImage];
    [self setImage:[imageSource imageFromCurrentlyProcessedOutput] toView:view];
}

+ (void)setBlurOnView:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    CIImage *inputImage = [CIImage imageWithCGImage:[self captureScreenInRect:view.frame inView:view].CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    // CIGaussianBlur has a tendency to shrink the image a little,
    // this ensures it matches up exactly to the bounds of our original image
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[UIImage imageWithCGImage:cgImage] toView:view];
}

+ (void)setImage:(UIImage *)blurredImage toView:(UIView *)view {
    UIView *blurView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, blurredImage.size.width, blurredImage.size.height)];
    [blurView setBackgroundColor:[UIColor colorWithPatternImage:blurredImage]];
    [blurView setTag:6110];
    //set the image as the foreground for the view
    [view addSubview:blurView];
    [view bringSubviewToFront:blurView];
}

//same as the method above, but resizes the screenshot before applying the blur for increased performance at the expense of image quality.
+ (void)setBlurOnViewPerformanceMode:(UIView *)view {
    //http://stackoverflow.com/questions/17041669/creating-a-blurring-overlay-view
    UIImage *screenShot = [self imageWithImage:[self captureScreenInRect:view.frame inView:view] scaledToSize:CGSizeMake(view.frame.size.width / 2, view.frame.size.height / 2)];
    CIImage *inputImage = [CIImage imageWithCGImage:screenShot.CGImage];
    //CIContext *context = [CIContext contextWithOptions:nil];
    if ([GlobalData getInstance].ciContext == nil) {
        [Logger Log:@"ciContext does not exist, creating..." fromClass:@"ScreenBlur"];
        // [GlobalData getInstance].ciContext = [CIContext contextWithOptions:nil]; //cpu context
        [GlobalData getInstance].ciContext = [CIContext contextWithEAGLContext:[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]];
    }
    //set up the blur filter
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:3.0f] forKey:@"inputRadius"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    //CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    CGImageRef cgImage = [[GlobalData getInstance].ciContext createCGImage:result fromRect:[inputImage extent]];
    [self setImage:[self imageWithImage:[UIImage imageWithCGImage:cgImage] scaledToSize:view.frame.size] toView:view];
}

+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    //UIGraphicsBeginImageContext(newSize);
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}

+ (void)removeBlurFromView:(UIView *)view {
    for (UIView *subView in view.subviews) {
        if (subView.tag == 6110) {
            [subView removeFromSuperview];
        }
    }
}

+ (UIImage *)captureScreenInRect:(CGRect)captureFrame inView:(UIView *)view {
    CALayer *layer = view.layer;
    UIGraphicsBeginImageContext(view.bounds.size);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureFrame);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenImage;
}

@end
And then in my view controller, it's simply called with
[ScreenBlur setBlurOnView:self.view];
I found a workaround for this (or, who knows, maybe this is how it was supposed to be done).
//ScreenBlur.m
+ (GPUImageView *)getBlurredImageWithGPUImageFromView:(UIView *)view {
    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:[self captureScreenInRect:view.frame inView:view] smoothlyScaleOutput:true];
    GPUImageFastBlurFilter *blur = [[GPUImageFastBlurFilter alloc] init];
    [blur setBlurPasses:3];
    [imageSource addTarget:blur];
    GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:view.frame];
    [blur addTarget:filteredView];
    [imageSource processImage];
    return filteredView;
}

//ViewController.m
//blur the main screen
GPUImageView *blurred = [ScreenBlur getBlurredImageWithGPUImageFromView:self.view];
[blurred setTag:6110];
[self.view addSubview:blurred];
[self.view bringSubviewToFront:blurred];
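Since the GPUImageView is tagged 6110 just like the Core Image overlay, the existing removeBlurFromView: should, as far as I can tell, take it down again later:
// Remove the blur overlay added above (it carries tag 6110).
[ScreenBlur removeBlurFromView:self.view];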

iOS UIImage being rotated after going through CIFilter

I'm filtering images that I take with the camera. I pass the image from the camera through the method below and send the returned UIImage to a UIImageView. For some reason, when it passes through this method the image gets rotated. What am I doing wrong?
- (UIImage *)applyFilterToImage:(UIImage *)image withFilter:(NSString *)filterName {
    beginImage = [[[CIImage alloc] initWithImage:image] autorelease];
    context = [CIContext contextWithOptions:nil];
    filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(cgimg);
    return scaleAndRotateImage(newImg);
}
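A likely culprit (my reading, not confirmed in the original post) is that the rebuilt UIImage forces UIImageOrientationUp, discarding the camera image's original orientation. A minimal sketch of carrying it through instead:
// Preserve the source image's scale and orientation when rebuilding the UIImage.
UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                      scale:image.scale
                                orientation:image.imageOrientation];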

Resources