UISlider which blurs image with CIGaussianBlur also rotates image - iOS

I have a UISlider which has a range from 0 to 30.
Unfortunately, when I'm using the UISlider, the image starts to blur, BUT it also rotates.
Here's the code:
-(IBAction)slider:(UISlider *)sender
{
    float slideValue = [sender value];
    CIImage *beginImage = [[CIImage alloc] initWithImage:image];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputRadius", [NSNumber numberWithFloat:slideValue], nil];
    CIImage *outputImage = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[beginImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
    [imageView setImage:newImg];
    CGImageRelease(cgimg);
}
Where is the problem?
Thanks in advance.

The problem is this line:
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
This is too simple-minded: imageWithCGImage: assumes a scale of 1.0 and UIImageOrientationUp, which throws away the original image's orientation metadata. Call [UIImage imageWithCGImage:cgimg scale:originalScale orientation:originalOrientation]; instead, supplying the source image's actual scale and orientation.
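A minimal corrected version of the slider action, assuming `image` and `imageView` are the same ivars as in the question; the only substantive change is carrying the source image's scale and orientation through to the output:

```objc
-(IBAction)slider:(UISlider *)sender
{
    float slideValue = [sender value];
    CIImage *beginImage = [[CIImage alloc] initWithImage:image];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputRadius", @(slideValue), nil];
    CIImage *outputImage = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[beginImage extent]];
    // Preserve the original scale and orientation instead of the defaults
    // (scale 1.0, UIImageOrientationUp), which is what caused the rotation.
    UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    [imageView setImage:newImg];
    CGImageRelease(cgimg);
}
```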

Related

Memory leaks with CoreImage

Instruments flags this code as leaking memory:
- (UIImage *)blurryOriginalImageWithLevel:(CGFloat)blur
{
    CIImage *inputImage = [CIImage imageWithCGImage:[UIImage imageNamed:@"ss"].CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, inputImage,
                                                @"inputRadius", @(blur),
                                                nil];
    CIImage *outputImage = inputImage;
    if (filter) {
        outputImage = filter.outputImage;
    }
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef outImage = [context createCGImage:outputImage fromRect:[inputImage extent]];
    UIImage *image = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return image;
}
But I can't find any problem in this code. Am I wrong?

How to get real time image processing using CIFilter?

I am using CIFilter for image filtering in my iOS app. The following code works, but processing the image takes too long, so I can't get a real-time effect. How can I solve this?
cameraImgView.image = [ImageProcessing sepian:myOriginalImage withInensity:0.5]; //method call
Method Definition
+(UIImage*)sepian:(UIImage*)img withInensity:(float)intensity{
    CIImage *cimage = [[CIImage alloc] initWithImage:img];
    CIFilter *sepiaFilter = [CIFilter filterWithName:@"CISepiaTone"];
    [sepiaFilter setDefaults];
    [sepiaFilter setValue:cimage forKey:@"inputImage"];
    [sepiaFilter setValue:[NSNumber numberWithFloat:intensity]
                   forKey:@"inputIntensity"];
    CIImage *outputImage = [sepiaFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:outputImage
                                       fromRect:[outputImage extent]];
    UIImage *resultUIImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return resultUIImage;
}
Try this:
filter = [CIFilter filterWithName:@"CISepiaTone"
                    keysAndValues:kCIInputImageKey, rawImageData,
                                  @"inputIntensity", @3.5, nil];
Then read the filter's output like this:
CIImage *filteredImageData = [filter valueForKey:@"outputImage"];
UIImage *filteredImage = [UIImage imageWithCIImage:filteredImageData];
self.img_view.image = filteredImage;
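A further cause of the slowness, separate from the answer above, is that the original method builds a fresh CIContext on every call; context creation is expensive, while rendering through an existing context is comparatively cheap. A sketch of that change, where the `filterContext` property and the `sepiaImage:intensity:` method name are my own, not from the question:

```objc
// Create the context once (e.g. in viewDidLoad) and reuse it for every frame.
@property (nonatomic, strong) CIContext *filterContext;

- (UIImage *)sepiaImage:(UIImage *)img intensity:(float)intensity {
    CIImage *input = [[CIImage alloc] initWithImage:img];
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"
                                 keysAndValues:kCIInputImageKey, input,
                                               @"inputIntensity", @(intensity), nil];
    CIImage *output = [sepia outputImage];
    // Reuse the long-lived context instead of constructing one per call.
    CGImageRef cg = [self.filterContext createCGImage:output fromRect:[output extent]];
    UIImage *result = [UIImage imageWithCGImage:cg];
    CGImageRelease(cg);
    return result;
}
```

For true per-frame camera filtering, rendering through a GPU-backed context into a GL or Metal view is faster still, but reusing a single CIContext is the first step.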

CGImageRef not being released under ARC

I know this question has been asked before but nothing seems to work.
I have a static method that returns a filtered image using CIImage and CGImage. I used the example on RayWenderlich.com and changed it to return a UIImage instead of setting it directly on a UIImageView.
My problem is that the CGImageRef cgimg is never being released. Which results in a rather fast memory leak. What am I doing wrong?
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;
    @autoreleasepool {
        CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
        CIContext *context = [CIContext contextWithOptions:nil];
        CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
        [filter setValue:beginImage forKey:kCIInputImageKey];
        CIImage *outputImage = [filter outputImage];
        CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
        UIImage *newImage = [UIImage imageWithCGImage:cgimg];
        CGImageRelease(cgimg);
        return newImage;
    }
}
It looks like cgimg is being released. I assume you mean that the CGImage is not deallocated. You wouldn't expect it to be, since you pass it to UIImage, which likely retains it.
Most likely you then later leak the UIImage (or keep it around when you didn't mean to, which is similar to a leak). I would audit the code around where you use and release the UIImage, and the CGImageRef will likely take care of itself.
Try this way:
+ (UIImage *)image:(UIImage *)image withFilterName:(NSString *)filterName
{
    if (!image)
        return nil;
    CIImage *beginImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectProcess"];
    [filter setValue:beginImage forKey:kCIInputImageKey];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    beginImage = nil;
    context = nil;
    filter = nil;
    outputImage = nil;
    CGImageRelease(cgimg);
    return newImage;
}

How do I apply a CIFilter to a UIImage?

My app is crashing when I attempt to apply a filter to my user-selected UIImage (It has been working fine without applying the filter). I added and imported the "CoreImage" framework to my project so I could create filters for user-selected images.
I am attempting to apply the filter by creating a category on UIImage (based on Apple's documentation) and then calling the corresponding method on the UIImage selected by the user. Following is the code of my category header and implementation; what am I doing wrong? (Please note, "randColor" is a UIColor category class method that generates a random color.)
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>
#import "UIColor+CustomColorCategory.h"

@interface UIImage (MonoChromeFilter)

- (UIImage *) applyMonoChromeWithRandColor;

@end

#import "UIImage+MonoChromeFilter.h"

@implementation UIImage (MonoChromeFilter)

- (UIImage *)applyMonoChromeWithRandColor
{
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *ciImage = [[CIImage alloc] initWithImage:self];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setValue:ciImage forKey:kCIInputImageKey];
    [filter setValue:[UIColor randColor] forKey:kCIAttributeTypeColor];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGRect extent = [result extent];
    CGImageRef cgImage = [context createCGImage:result fromRect:extent];
    UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
    return filteredImage;
}

@end
Here is the method in the viewController where this category is being called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissViewControllerAnimated:YES completion:^{
        UIImage *editedImage = [info objectForKey:UIImagePickerControllerEditedImage];
        editedImage = [editedImage applyMonoChromeWithRandColor];
        self.blogImageOutlet.image = editedImage;
        self.blogImageOutlet.layer.cornerRadius = self.blogImageOutlet.frame.size.width / 2.0;
        [self.blogImageOutlet setClipsToBounds:YES];
        [self saveImageToLibrary:editedImage];
    }];
}
I figured it out! After debugging and using some other projects as a point of reference, I realized that I was experiencing two issues. First, I was trying to use a UIColor where a CIColor is required, which is not directly possible; I first had to convert the UIColor to a CIColor. Second, I was not using the correct strings for the CIFilter value keys. Here is the code after modifications (and now it works!):
#import "UIImage+MonoChromeFilter.h"

@implementation UIImage (MonoChromeFilter)

+ (UIImage *) applyMonoChromeWithRandColor: (UIImage *)uIImage
{
    // Convert UIColor to CIColor
    CGColorRef colorRef = [UIColor randColor].CGColor;
    NSString *colorString = [CIColor colorWithCGColor:colorRef].stringRepresentation;
    CIColor *coreColor = [CIColor colorWithString:colorString];
    CIContext *context = [CIContext contextWithOptions:nil];
    // Convert UIImage to CIImage
    CIImage *ciImage = [[CIImage alloc] initWithImage:uIImage];
    // Set values for the CIColorMonochrome filter
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setValue:ciImage forKey:kCIInputImageKey];
    [filter setValue:@1.0 forKey:@"inputIntensity"];
    [filter setValue:coreColor forKey:@"inputColor"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGRect extent = [result extent];
    CGImageRef cgImage = [context createCGImage:result fromRect:extent];
    UIImage *filteredImage = [[UIImage alloc] initWithCGImage:cgImage];
    // Release the CGImage created above; createCGImage: follows the
    // Core Foundation Create rule, so skipping this would leak it.
    CGImageRelease(cgImage);
    return filteredImage;
}

@end
Try this; it works like a champ for me. We applied several different CIFilters to a single image. Below is the implementation:
// CISepiaTone effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, image,
                                            @"inputIntensity", @1, nil];
CIImage *result = [filter valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];

// CIVignette effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setDefaults];
[vignette setValue:image forKey:@"inputImage"];
[vignette setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];
[vignette setValue:[NSNumber numberWithFloat:10.00] forKey:@"inputRadius"];
CIImage *result = [vignette valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
detailsVc.filterImage = targetImage;
[self.navigationController pushViewController:detailsVc animated:YES];
For a detailed implementation you can refer to the GitHub project ImageFilter.
Hope this helps someone.

iOS UIImage being rotated after going through CIFilter

I'm filtering images that I take with the camera. I pass the image from the camera through the method below and set the returned UIImage on a UIImageView. For some reason, passing through this method rotates the image. What am I doing wrong?
- (UIImage *) applyFilterToImage:(UIImage *)image withFilter:(NSString *)filterName {
    beginImage = [[[CIImage alloc] initWithImage:image] autorelease];
    context = [CIContext contextWithOptions:nil];
    filter = [CIFilter filterWithName:@"CISepiaTone"
                        keysAndValues:kCIInputImageKey, beginImage,
                                      @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(cgimg);
    return scaleAndRotateImage(newImg);
}
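No answer is shown for this question, but the hard-coded UIImageOrientationUp on the imageWithCGImage: line is the likely culprit: camera photos carry an orientation in their metadata, and forcing Up discards it, so the pixels display rotated. A sketch of the usual fix, carrying the input image's own scale and orientation through instead:

```objc
// Preserve the camera image's scale and orientation rather than
// forcing scale 1.0 / UIImageOrientationUp:
UIImage *newImg = [UIImage imageWithCGImage:cgimg
                                      scale:image.scale
                                orientation:image.imageOrientation];
```

With that change, the separate scaleAndRotateImage() post-processing step may no longer be necessary.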
