I want to remove the red-eye effect from a photo, but I haven't found any sample. Can anyone help me with working demo code or a code snippet?
Thanks.
Use the following category on UIImage:
@interface UIImage (Utilities)
- (UIImage *)redEyeCorrection;
@end

@implementation UIImage (Utilities)
- (UIImage *)redEyeCorrection
{
    CIImage *ciImage = [[CIImage alloc] initWithCGImage:self.CGImage];

    // Get the auto-adjustment filters; with enhancement disabled,
    // Core Image returns only the red-eye correction filter(s)
    NSArray *filters = [ciImage autoAdjustmentFiltersWithOptions:
                           [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                       forKey:kCIImageAutoAdjustEnhance]];
    for (CIFilter *filter in filters)
    {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }

    // Render the corrected image
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ctx createCGImage:ciImage fromRect:[ciImage extent]];
    UIImage *final = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return final;
}
@end
Usage: sample code is given below.
UIImage *redEyeImage = [UIImage imageNamed:@"redEye.jpg"];
if (redEyeImage) {
    UIImage *newRemovedRedEyeImage = [redEyeImage redEyeCorrection];
    if (newRemovedRedEyeImage) {
        imgView.image = newRemovedRedEyeImage;
    }
}
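One caveat worth noting: autoAdjustmentFiltersWithOptions: runs face detection, which can take a noticeable amount of time on large photos. A minimal sketch of running the correction off the main thread (reusing the redEyeImage and imgView names from the snippet above):

dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    UIImage *corrected = [redEyeImage redEyeCorrection];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must be touched on the main thread
        if (corrected) {
            imgView.image = corrected;
        }
    });
});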
Also see the NYXImagesKit UIImage Enhancing link.
After changing the CIFilter on an image view's image, the image appears blurry, and I'm trying to figure out why.
Here is the code:
- (UIImage *)convertImageToGrayScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTonal"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}

- (UIImage *)convertImageToColorScale:(UIImage *)image {
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"];
    CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    return [UIImage imageWithCIImage:filter.outputImage];
}
- (IBAction)colorize:(id)sender {
    self.imageView.image = nil;
    if (self.hasColor) {
        self.hasColor = FALSE;
        self.imageView.image = [self convertImageToGrayScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"remove Color");
    }
    else {
        self.hasColor = TRUE;
        self.imageView.image = [self convertImageToColorScale:[UIImage imageWithContentsOfFile:self.filePath]];
        NSLog(@"Add Color");
    }
}
When I add the filter the first time, it looks good. When I click the button to change the filter, it's blurry.
The blur most likely comes from return [UIImage imageWithCIImage:filter.outputImage];. A UIImage created that way wraps an unrendered CIImage with no bitmap backing, and UIImageView does not reliably draw it at the correct scale. Render through a CIContext instead, replacing that line with:
CIImage *outputImage = [filter outputImage];
CGImageRef imageRef = [[CIContext contextWithOptions:nil] createCGImage:outputImage fromRect:outputImage.extent];
UIImage *newImg = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
return newImg;
I know this question has been asked before but nothing seems to work.
I have a static method that returns a filtered image using CIImage and CGImage. I used the example on RayWenderlich.com and changed it to return a UIImage instead of setting it directly on a UIImageView.
My problem is that the CGImageRef cgimg never seems to be released, which results in a rather fast memory leak. What am I doing wrong?
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;

    @autoreleasepool {
        CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        CIContext *context = [CIContext contextWithOptions:nil];
        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
        [filter setValue:beginImage forKey:kCIInputImageKey];
        CIImage *outputImage = [filter outputImage];
        CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
        UIImage *newImage = [UIImage imageWithCGImage:cgimg];
        CGImageRelease(cgimg);
        return newImage;
    }
}
It looks like cgimg is being released. I assume you mean that the CGImage is never deallocated. You wouldn't expect it to be, since you pass it to UIImage, which retains it.
Most likely you then leak the UIImage later (or keep it around when you didn't mean to, which amounts to the same thing). I would audit the code around where you use and release the UIImage; the CGImageRef will then take care of itself.
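One common culprit worth checking (a hypothetical scenario, not something shown in the question): calling a method like this in a tight loop keeps every autoreleased UIImage alive until the enclosing pool drains, which looks exactly like a leak in Instruments. Draining a pool per iteration bounds the peak memory:

// 'frames' and ImageHelper are stand-in names for illustration
for (UIImage *frame in frames) {
    @autoreleasepool {
        UIImage *filtered = [ImageHelper image:frame withFilterName:@"CIPhotoEffectProcess"];
        NSData *jpeg = UIImageJPEGRepresentation(filtered, 0.9);
        // ...write 'jpeg' out instead of holding 'filtered' in memory...
    }
}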
Try it this way:
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;

    CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
    [filter setValue:beginImage forKey:kCIInputImageKey];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];

    // Drop the strong references so ARC can release the intermediates immediately
    beginImage = nil;
    context = nil;
    filter = nil;
    outputImage = nil;

    CGImageRelease(cgimg);
    return newImage;
}
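Separately, note that creating a fresh CIContext on every call is expensive; caching one and reusing it across calls is usually worthwhile. A minimal sketch of that (my addition, not part of the original answer):

+ (CIContext *)sharedFilterContext
{
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}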
My app crashes when I attempt to apply a filter to my user-selected UIImage (it has been working fine without applying the filter). I added and imported the CoreImage framework to my project so I could create filters for user-selected images.
I am attempting to apply the filter by creating a category on UIImage (based on Apple's documentation) and then calling the corresponding method on the UIImage selected by the user. Below is the code of my category header and body; what am I doing wrong? (Please note: "randColor" is a UIColor category class method that generates a random color.)
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>
#import "UIColor+CustomColorCategory.h"

@interface UIImage (MonoChromeFilter)
- (UIImage *)applyMonoChromeWithRandColor;
@end

#import "UIImage+MonoChromeFilter.h"

@implementation UIImage (MonoChromeFilter)
- (UIImage *)applyMonoChromeWithRandColor
{
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *ciImage = [[CIImage alloc] initWithImage:self];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setValue:ciImage forKey:kCIInputImageKey];
    [filter setValue:[UIColor randColor] forKey:kCIAttributeTypeColor];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGRect extent = [result extent];
    CGImageRef cgImage = [context createCGImage:result fromRect:extent];
    UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
    return filteredImage;
}
@end
Here is the method in the viewController where this category is being called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:^{
        UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];
        editedImage = [editedImage applyMonoChromeWithRandColor];
        self.blogImageOutlet.image = editedImage;
        self.blogImageOutlet.layer.cornerRadius = self.blogImageOutlet.frame.size.width / 2.0;
        [self.blogImageOutlet setClipsToBounds:YES];
        [self saveImageToLibrary:editedImage];
    }];
}
I figured it out! After debugging and using some other projects as a point of reference, I realized I was experiencing two issues. First, I was trying to use a UIColor where a CIColor was required, which is not directly possible; I had to convert the UIColor to a CIColor first. Second, I was not using the correct strings for the CIFilter value keys. Here is the code after modifications (and now it works!):
#import "UIImage+MonoChromeFilter.h"
#implementation UIImage (MonoChromeFilter)
+ (UIImage *) applyMonoChromeWithRandColor: (UIImage *)uIImage
{
// Convert UIColor to CIColor
CGColorRef colorRef = [UIColor randColor].CGColor;
NSString *colorString = [CIColor colorWithCGColor:colorRef].stringRepresentation;
CIColor *coreColor = [CIColor colorWithString:colorString];
CIContext *context = [CIContext contextWithOptions:nil];
// Convert UIImage to CIImage
CIImage *ciImage = [[CIImage alloc] initWithImage:uIImage];
// Set values for CIColorMonochrome Filter
CIFilter *filter = [CIFilter filterWithName:#"CIColorMonochrome"];
[filter setValue:ciImage forKey:kCIInputImageKey];
[filter setValue:#1.0 forKey:#"inputIntensity"];
[filter setValue:coreColor forKey:#"inputColor"];
CIImage *result = [filter valueForKey:kCIOutputImageKey];
CGRect extent = [result extent];
CGImageRef cgImage = [context createCGImage:result fromRect:extent];
UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
return filteredImage;
}
#end
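Two small follow-ups on this fix. First, the method became a class method while the header above still declares an instance method, so the declaration and the call site in the picker callback must be updated to match. Second, the stringRepresentation round trip is unnecessary; +[CIColor colorWithCGColor:] can be used directly. A sketch of both (assuming the same randColor category):

// Updated call site (class-method form):
UIImage *edited = [UIImage applyMonoChromeWithRandColor:
                      [info objectForKey:UIImagePickerControllerEditedImage]];

// Direct UIColor -> CIColor conversion:
CIColor *coreColor = [CIColor colorWithCGColor:[UIColor randColor].CGColor];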
Try this; it works like a champ for me.
We applied different CIFilters to a single image.
Below is the implementation code:
// CISepiaTone effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, image,
                                            @"inputIntensity", @1, nil];
CIImage *result = [filter valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
CGImageRelease(cgImageRef); // release the CGImage; UIImage retains its own reference
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];
// CIVignette effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setDefaults];
[vignette setValue:image forKey:@"inputImage"];
[vignette setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];
[vignette setValue:[NSNumber numberWithFloat:10.00] forKey:@"inputRadius"];
CIImage *result = [vignette valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
CGImageRelease(cgImageRef); // release the CGImage; UIImage retains its own reference
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];
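If both effects should land on the same image, the filters can also be chained: feed the sepia output into the vignette input and render once at the end. A minimal sketch reusing the names above (the chaining itself is my addition, not part of the original answer):

CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"
                             keysAndValues:kCIInputImageKey, image,
                                           @"inputIntensity", @1, nil];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setValue:sepia.outputImage forKey:kCIInputImageKey]; // sepia output -> vignette input
[vignette setValue:@1.0 forKey:@"inputIntensity"];
[vignette setValue:@10.0 forKey:@"inputRadius"];
CIImage *chained = vignette.outputImage;
CGImageRef chainedRef = [imageContext createCGImage:chained fromRect:[chained extent]];
UIImage *chainedImage = [UIImage imageWithCGImage:chainedRef];
CGImageRelease(chainedRef);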
For the detailed implementation you can refer to this GitHub project: ImageFilter.
Hope this helps someone.
I have a UISlider which has a range from 0 to 30.
Unfortunately, when I'm using the UISlider, the image starts to blur, BUT it also rotates.
Here's the code:
- (IBAction)slider:(UISlider *)sender
{
    float slideValue = [sender value];
    CIImage *beginImage = [[CIImage alloc] initWithImage:image];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputRadius", [NSNumber numberWithFloat:slideValue], nil];
    CIImage *outputImage = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[beginImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    [imageView setImage:newImg];
    CGImageRelease(cgimg);
}
Where is the problem?
Thanks in advance.
The problem is this line:

    UIImage *newImg = [UIImage imageWithCGImage:cgimg];

This is too simple-minded: it discards the original image's scale and orientation, and camera photos store their pixels rotated, relying on the orientation flag to be displayed upright. Call [UIImage imageWithCGImage:cgimg scale:originalScale orientation:originalOrientation]; instead (you will need to supply the correct values for these).
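Concretely, assuming the image variable from the slider action is still in scope, the correct values can come straight from the source image:

UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                      scale:image.scale
                                orientation:image.imageOrientation];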
I'm working with filtering images that I take with the camera. I pass the image I get from the camera through the method below and send the returned UIImage to a UIImageView. For some reason, the image gets rotated as it passes through this method. What am I doing wrong?
- (UIImage *)applyFilterToImage:(UIImage *)image withFilter:(NSString *)filterName {
    beginImage = [[[CIImage alloc] initWithImage:image] autorelease];
    context = [CIContext contextWithOptions:nil];
    filter = [CIFilter filterWithName:@"CISepiaTone"
                        keysAndValues:kCIInputImageKey, beginImage,
                                      @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(cgimg);
    return scaleAndRotateImage(newImg);
}
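The rotation here has the same cause as in the previous question: Core Image operates on the raw pixel data, and hard-coding UIImageOrientationUp throws away the orientation the camera recorded. A likely fix, sketched under the assumption that scaleAndRotateImage then no longer needs to compensate:

// Preserve the source image's scale and recorded orientation
UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                      scale:image.scale
                                orientation:image.imageOrientation];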