Most performant way to change a UIImage's saturation (during animation) - iOS

I am trying to change a UIImage's saturation during an animation, but this approach seems too heavy/slow:
- (void)changeSaturation:(CGFloat)value {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];
    CIImage *image = self.imageView.image.CIImage;
    [colorControlsFilter setValue:image forKey:@"inputImage"];
    [colorControlsFilter setValue:[NSNumber numberWithFloat:value] forKey:@"inputSaturation"];
    CIImage *outputImage = [colorControlsFilter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    self.imageView.image = newImage;
    CGImageRelease(cgimg);
}
This method is called every time the user drags the view (as the drag distance changes, the saturation changes with it), which is obviously not the right way to do it, but I would like a similar behavior.
I wonder if I can achieve this with a layer on top of the UIImageView.
Can anyone advise me on how to achieve my goal?

1) Calculate the filtered image asynchronously:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    UIImage *filteredImage = /* your filter method */;
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = filteredImage;
    });
});
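One more thing worth noting: the code in the question also creates a new CIContext on every call, which is expensive. A minimal sketch of the same filter with a reused context (the ciContext and sourceImage properties here are assumptions, not part of the original answer):
// Sketch only: assumes `ciContext` (CIContext) and `sourceImage` (UIImage)
// properties exist on this class.
- (UIImage *)filteredImageWithSaturation:(CGFloat)value {
    if (!self.ciContext) {
        self.ciContext = [CIContext contextWithOptions:nil]; // created once, reused
    }
    CIImage *input = [CIImage imageWithCGImage:self.sourceImage.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:@(value) forKey:@"inputSaturation"];
    CGImageRef cgimg = [self.ciContext createCGImage:filter.outputImage
                                            fromRect:[input extent]];
    UIImage *result = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return result;
}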
2) Change the image smoothly using Core Animation (here self.layer is the top CALayer of your view):
- (UIImage *)changeSaturation:(CGFloat)value
{
    // your filter code, returning the filtered image
}
- (void)changeSaturation:(CGFloat)value duration:(CGFloat)duration
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSUInteger frameCount = ceil(duration * 60);
        CGFloat incrementValue = value / frameCount;
        NSMutableArray *animationValues = [NSMutableArray new];
        for (int frameIndex = 0; frameIndex <= frameCount; frameIndex++)
        {
            CGFloat currentValue = incrementValue * frameIndex;
            UIImage *imageForCurrentFrame = [self changeSaturation:currentValue];
            [animationValues addObject:(id)imageForCurrentFrame.CGImage];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            self.layer.contents = (id)[animationValues lastObject];
            CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
            [animation setValues:animationValues];
            [animation setDuration:duration];
            [animation setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseOut]];
            [animation setFillMode:kCAFillModeBoth];
            [animation setRemovedOnCompletion:NO];
            [self.layer addAnimation:animation forKey:@"imageAnimation"];
        });
    });
}
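Usage is then a single call; for example, to animate the saturation from 0 up to 1.0 over 0.3 seconds:
[self changeSaturation:1.0 duration:0.3];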

It turns out my main issue was that the view being dragged contained a very high-resolution image. The solution above works perfectly fine.

If you change the saturation between 0.0 and 1.0, you can add a black-and-white (grayscale) image above the normal image and change its alpha. This is the most performant option.
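A minimal sketch of that idea, assuming the grayscale copy is rendered once up front (grayscaleImageView and grayscaleImage are hypothetical names, not from the original answer):
// Sketch: overlay a pre-rendered grayscale copy and drive only its alpha.
// `grayscaleImageView` sits directly above `imageView` in the view hierarchy.
- (void)setupGrayscaleOverlay {
    self.grayscaleImageView = [[UIImageView alloc] initWithFrame:self.imageView.frame];
    self.grayscaleImageView.image = self.grayscaleImage; // rendered once, e.g. CIColorControls with inputSaturation 0
    [self.imageView.superview addSubview:self.grayscaleImageView];
}

// Called from the drag handler; no Core Image work per frame,
// just a cheap alpha change (saturation 1.0 -> overlay alpha 0.0).
- (void)updateSaturation:(CGFloat)saturation {
    self.grayscaleImageView.alpha = 1.0 - saturation;
}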

Related

Problem clipping background to Core Image Aztec generation with CIBlendWithMask

I am trying to add an image background to a generated Aztec code. So far I can get the Aztec code to generate, but I am having trouble with the CIBlendWithMask filter, and I'm not sure exactly what I'm doing wrong. I believe the user-selected background as kCIInputBackgroundImageKey is correct, and the Aztec output image as kCIInputImageKey is correct. I think where I'm going wrong is kCIInputMaskImageKey, but I'm not sure what I need to do; I thought the Aztec output would be a sufficient mask image. Do I need to select a color or something to get the background clipping to the Aztec image?
CIFilter *aztecFilter = [CIFilter filterWithName:@"CIAztecCodeGenerator"];
CIFilter *colorFilter = [CIFilter filterWithName:@"CIFalseColor"];
[aztecFilter setValue:stringData forKey:@"inputMessage"];
[colorFilter setValue:aztecFilter.outputImage forKey:@"background"];
NSData *imageData = [[NSUserDefaults standardUserDefaults] objectForKey:@"usertheme"];
CIImage *image = [UIImage imageWithData:imageData].CIImage;
[colorFilter setValue:[CIColor colorWithCGColor:[[UIColor blackColor] CGColor]] forKey:@"inputColor0"];
[colorFilter setValue:[CIColor colorWithRed:1 green:1 blue:1 alpha:0] forKey:@"inputColor1"];
CIFilter *blendFilter = [CIFilter filterWithName:@"CIBlendWithMask"];
[blendFilter setValue:colorFilter.outputImage forKey:kCIInputImageKey];
[blendFilter setValue:image forKey:kCIInputBackgroundImageKey];
[blendFilter setValue:colorFilter.outputImage forKey:kCIInputMaskImageKey];
Trying to create something like this, but for Aztec codes instead of QR codes.
You may be over-thinking this a bit...
You can use the CIBlendWithMask filter with only a background image and a mask image.
So, if we start with a gradient image (you may be generating it on-the-fly):
Then generate the Aztec Code image (yellow outline is just to show the frame):
We can use that code image as the mask.
Here is sample code:
- (void)viewDidLoad {
    [super viewDidLoad];

    self.view.backgroundColor = [UIColor systemYellowColor];

    // create a vertical stack view
    UIStackView *sv = [UIStackView new];
    sv.axis = UILayoutConstraintAxisVertical;
    sv.spacing = 8;
    sv.translatesAutoresizingMaskIntoConstraints = NO;
    [self.view addSubview:sv];

    // add 3 image views to the stack view
    for (int i = 0; i < 3; ++i) {
        UIImageView *imgView = [UIImageView new];
        [sv addArrangedSubview:imgView];
        [imgView.widthAnchor constraintEqualToConstant:240].active = YES;
        [imgView.heightAnchor constraintEqualToAnchor:imgView.widthAnchor].active = YES;
    }

    [sv.centerXAnchor constraintEqualToAnchor:self.view.safeAreaLayoutGuide.centerXAnchor].active = YES;
    [sv.centerYAnchor constraintEqualToAnchor:self.view.safeAreaLayoutGuide.centerYAnchor].active = YES;

    // load a gradient image for the background
    UIImage *gradientImage = [UIImage imageNamed:@"bkgGradient"];

    // put it in the first image view
    ((UIImageView *)sv.arrangedSubviews[0]).image = gradientImage;

    // create aztec filter
    CIFilter *aztecFilter = [CIFilter filterWithName:@"CIAztecCodeGenerator"];

    // give it some string data
    NSString *qrString = @"My string to encode";
    NSData *stringData = [qrString dataUsingEncoding:NSUTF8StringEncoding];
    [aztecFilter setValue:stringData forKey:@"inputMessage"];

    // get the generated aztec image
    CIImage *aztecCodeImage = aztecFilter.outputImage;

    // scale it to match the background gradient image
    float scaleX = gradientImage.size.width / aztecCodeImage.extent.size.width;
    float scaleY = gradientImage.size.height / aztecCodeImage.extent.size.height;
    aztecCodeImage = [[aztecCodeImage imageBySamplingNearest] imageByApplyingTransform:CGAffineTransformMakeScale(scaleX, scaleY)];

    // convert to UIImage and set the middle image view
    UIImage *scaledCodeImage = [UIImage imageWithCIImage:aztecCodeImage scale:[UIScreen mainScreen].scale orientation:UIImageOrientationUp];
    ((UIImageView *)sv.arrangedSubviews[1]).image = scaledCodeImage;

    // create a blend with mask filter
    CIFilter *blendFilter = [CIFilter filterWithName:@"CIBlendWithMask"];

    // set the background image
    CIImage *bkgInput = [CIImage imageWithCGImage:[gradientImage CGImage]];
    [blendFilter setValue:bkgInput forKey:kCIInputBackgroundImageKey];

    // set the mask image
    [blendFilter setValue:aztecCodeImage forKey:kCIInputMaskImageKey];

    // get the blended CIImage
    CIImage *output = [blendFilter outputImage];

    // convert to UIImage
    UIImage *blendedImage = [UIImage imageWithCIImage:output scale:[UIScreen mainScreen].scale orientation:UIImageOrientationUp];

    // set the bottom image view to the result
    ((UIImageView *)sv.arrangedSubviews[2]).image = blendedImage;
}
which produces:

How to increase the Gaussian Blur as the user scrolls up

I have a UIImage inside a UIImageView to which a Gaussian blur filter with a radius of 50 has been applied so far. Per a new requirement, I need to set the initial Gaussian blur to 3px, then increase it slowly from 3px to 10px as the user scrolls up the view. Could anybody please help me understand how this can be done?
This is the code I'm currently using to blur the image with a radius of 50.
- (UIImage *)blurWithCoreImage:(UIImage *)sourceImage blurValue:(int)valBlur
{
    CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

    // Apply Affine-Clamp filter to stretch the image so that it does not
    // look shrunken when gaussian blur is applied
    CGAffineTransform transform = CGAffineTransformIdentity;
    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setValue:inputImage forKey:@"inputImage"];
    [clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];

    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
    [gaussianBlurFilter setValue:@(valBlur) forKey:@"inputRadius"]; // inputRadius expects an NSNumber, not an NSString

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];

    // Set up output context.
    UIGraphicsBeginImageContext(self.view.frame.size);
    CGContextRef outputContext = UIGraphicsGetCurrentContext();

    // Invert image coordinates
    CGContextScaleCTM(outputContext, 1.0, -1.0);
    CGContextTranslateCTM(outputContext, 0, -self.view.frame.size.height);

    // Draw base image.
    CGContextDrawImage(outputContext, self.view.frame, cgImage);
    CGImageRelease(cgImage);

    // Apply white tint
    CGContextSaveGState(outputContext);
    CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
    CGContextFillRect(outputContext, self.view.frame);
    CGContextRestoreGState(outputContext);

    // Output image is ready.
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return outputImage;
}
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    /* This is the offset at the bottom of the scroll view. */
    CGFloat totalScroll = scrollView.contentSize.height - scrollView.bounds.size.height;
    /* This is the current offset. */
    CGFloat offset = -scrollView.contentOffset.y;
    /* This is the percentage of the current offset / bottom offset. */
    CGFloat percentage = offset / totalScroll;
    /* Map the percentage onto the required 3px-10px range (0 -> 3, 1 -> 10).
       blurLevel is assumed to be a custom property that re-applies the blur. */
    imageview.blurLevel = 3.0f + percentage * 7.0f;
}
Source : SFOS
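Re-running the Core Image blur on every scroll tick can stutter on large images. A cheaper sketch, borrowing the overlay idea from the saturation answer above: pre-render the two extremes once and crossfade between them while scrolling (lightBlurView, heavyBlurView, and sourceImage are assumptions, and the crossfade only approximates intermediate radii):
// Pre-render one image at the 3px floor and one at the 10px ceiling.
- (void)prepareBlurViews {
    self.lightBlurView.image = [self blurWithCoreImage:self.sourceImage blurValue:3];
    self.heavyBlurView.image = [self blurWithCoreImage:self.sourceImage blurValue:10];
    self.heavyBlurView.alpha = 0.0; // heavyBlurView sits directly above lightBlurView
}

// No Core Image work per scroll tick; just a cheap alpha change.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    CGFloat totalScroll = scrollView.contentSize.height - scrollView.bounds.size.height;
    CGFloat percentage = -scrollView.contentOffset.y / totalScroll;
    self.heavyBlurView.alpha = MAX(0.0, MIN(1.0, percentage));
}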

iOS 8: can I animate hue/saturation/brightness of a UIImage using CIFilter?

I'm looking at an Apple example that uses Core Image filters to adjust the hue/saturation/brightness of an image. In this case the input image has its properties adjusted, and another image is returned as a result of the filter operation. I'm interested in whether this transition can be animated (a step-by-step transition).
Is there a way for me to programmatically animate a black-and-white image so that its color slowly fades in?
I checked Apple's view programming guidelines, but I don't see anything about animating images or colors.
- (void)hueChanged:(id)sender
{
    CGFloat hue = [hueSlider value];
    CGFloat saturation = [saturationSlider value];
    CGFloat brightness = [brightnessSlider value];

    // Update labels
    [hueLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Hue: %f", @"Hue label format."), hue]];
    [saturationLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Saturation: %f", @"Saturation label format."), saturation]];
    [brightnessLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Brightness: %f", @"Brightness label format."), brightness]];

    // Apply effects to image
    dispatch_async(processingQueue, ^{
        if (!self.colorControlsFilter) {
            self.colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];
        }
        [self.colorControlsFilter setValue:self.baseCIImage forKey:kCIInputImageKey];
        [self.colorControlsFilter setValue:@(saturation) forKey:@"inputSaturation"];
        [self.colorControlsFilter setValue:@(brightness) forKey:@"inputBrightness"];
        CIImage *coreImageOutputImage = [self.colorControlsFilter valueForKey:kCIOutputImageKey];

        if (!self.hueAdjustFilter) {
            self.hueAdjustFilter = [CIFilter filterWithName:@"CIHueAdjust"];
        }
        [self.hueAdjustFilter setValue:coreImageOutputImage forKey:kCIInputImageKey];
        [self.hueAdjustFilter setValue:@(hue) forKey:@"inputAngle"];
        coreImageOutputImage = [self.hueAdjustFilter valueForKey:kCIOutputImageKey];

        CGRect rect = CGRectMake(0, 0, self.image.size.width, self.image.size.height);
        CGImageRef cgImage = [self.context createCGImage:coreImageOutputImage fromRect:rect];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        dispatch_async(dispatch_get_main_queue(), ^{
            [imageView setImage:image];
        });
    });
    // imageView.center = self.view.center;
    // imageView.frame = self.view.frame;
}
For slow fade animations you can use UIView block-based animations in iOS:
theView.alpha = 0;
[UIView animateWithDuration:1.0
                 animations:^{
                     // Apply effects to image here
                     theView.center = midCenter;
                     theView.alpha = 1;
                 }
                 completion:^(BOOL finished) {
                     [UIView animateWithDuration:1.0
                                      animations:^{
                                          // Apply effects to image here
                                      }
                                      completion:^(BOOL finished) {
                                      }];
                 }];
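For the specific case of a black-and-white image gaining color, a sketch of a crossfade variant (the two pre-rendered image views are assumptions, not from the answer above): put the color version underneath a grayscale copy, then fade the grayscale copy out.
// Sketch: fade color in by fading a grayscale overlay out.
// `grayscaleImageView` shows the image filtered once with
// CIColorControls (inputSaturation = 0); `colorImageView` sits below it.
self.grayscaleImageView.alpha = 1.0;
[UIView animateWithDuration:2.0
                 animations:^{
                     self.grayscaleImageView.alpha = 0.0; // color fades in
                 }];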

How to apply HSB color filters to UIImage

I've been struggling for a few days with a project on UIImage colorization.
The idea is that the app will ship with a set of images that I have to colorize with values retrieved from a web service. Some sort of theming, if you will.
The designer I work with gave me a background image along with all of his Photoshop values.
The first problem is that Photoshop uses HSL while iOS uses HSB, so the first challenge was to translate the values from Photoshop.
Photoshop HSL: -28 (range -180 => +180), 100 (range -100 => +100), 25 (range -100 => +100).
Luckily I found some code online; here it is.
// adapted from https://gist.github.com/peteroupc/4085710
- (void)convertLightnessToBrightness:(CGFloat)lightness withSaturation:(CGFloat)saturation completion:(void (^)(CGFloat, CGFloat))completion
{
    if (!completion)
        return; // What's the point of calling this method without a completion block!

    CGFloat brightness = 0.0f;
    CGFloat saturationOut = 0.0f;
    if (lightness > 0.0f)
    {
        CGFloat lumScale = (1.0f - MAX((lightness - 0.5f), 0.0f) * 2.0f);
        lumScale = ((lumScale == 0) ? 0 : (1.0f / lumScale));
        CGFloat lumStart = MAX(0.0f, (lumScale - 0.5f));
        CGFloat lumDiv = (lumScale - lumStart);
        lumDiv = (lumStart + (saturation * lumDiv));
        saturationOut = ((lumDiv == 0) ? 0.0f : (saturation / lumDiv));
        brightness = (lightness + (1.0f - lightness) * saturation);
    }
    NSLog(@"saturation: %0.2f - brightness: %0.2f", saturationOut, brightness);
    completion(saturationOut, brightness);
}
Using an online converter, I verified that this method returns the correct values.
First I needed to normalize the ranges (H: 0->360, S: 0->100, L: 0->100).
So HSL (152, 100, 62) gives HSB (152, 76, 100), and the method returns 75 for saturation and 100 for brightness, so we are good.
Next I needed to apply those values to the image; here is the code for each component.
The HUE:
#define DEGREES_TO_RADIANS(angle) ((angle) / 180.0f * M_PI)

- (void)colorize:(UIImage *)input hue:(CGFloat)hueDegrees completion:(void (^)(UIImage *outputHue))completion
{
    if (!completion)
        return; // What's the point of calling this method without a completion block!

    CGFloat hue = DEGREES_TO_RADIANS(hueDegrees);
    NSLog(@"degrees: %0.2f | radians: %0.2f", hueDegrees, hue);
    CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
    //---
    CIFilter *hueFilter = [CIFilter filterWithName:@"CIHueAdjust" keysAndValues:kCIInputImageKey, inputImage, nil];
    [hueFilter setDefaults];
    [hueFilter setValue:[NSNumber numberWithFloat:hue] forKey:kCIInputAngleKey];
    //---
    CIImage *outputImage = [hueFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    completion(outputUIImage);
}
The SATURATION:
- (void)colorize:(UIImage *)input saturation:(CGFloat)saturation completion:(void (^)(UIImage *outputSaturation))completion
{
    if (!completion)
        return; // What's the point of calling this method without a completion block!

    NSLog(@"saturation: %0.2f", saturation);
    CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
    //---
    CIFilter *saturationFilter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, inputImage, nil];
    [saturationFilter setDefaults];
    [saturationFilter setValue:[NSNumber numberWithFloat:saturation] forKey:@"inputSaturation"];
    //---
    CIImage *outputImage = [saturationFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    completion(outputUIImage);
}
The BRIGHTNESS:
- (void)colorize:(UIImage *)input brightness:(CGFloat)brightness completion:(void (^)(UIImage *outputBrightness))completion
{
    if (!completion)
        return; // What's the point of calling this method without a completion block!

    NSLog(@"brightness: %0.2f", brightness);
    CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
    //---
    CIFilter *brightnessFilter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, inputImage, nil];
    [brightnessFilter setDefaults];
    [brightnessFilter setValue:[NSNumber numberWithFloat:brightness] forKey:@"inputBrightness"];
    //---
    CIImage *outputImage = [brightnessFilter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    completion(outputUIImage);
}
And everything put together:
CGFloat hue = -28.0f; // 152 in 360° range (180 - 28 = 152)
CGFloat saturation = 1.0f; // ((100 + 100.0f) / 200.0f)
CGFloat lightness = 0.625f; // ((25 + 100.0f) / 200.0f)

[self convertLightnessToBrightness:lightness withSaturation:saturation completion:^(CGFloat saturationOut, CGFloat brightness) {
    // saturationOut = 0.75f and brightness = 1.0f
    [self colorize:input hue:hue completion:^(UIImage *outputHue) {
        [self colorize:outputHue saturation:saturationOut completion:^(UIImage *outputSaturation) {
            [self colorize:outputSaturation brightness:brightness completion:completion];
        }];
    }];
}];
The last completion block simply applies the output image to the image view.
Now here are the results:
Base image
Colorize (hue only)
Colorize (hue and saturation)
Colorize (hue, saturation and brightness)
Expected result
As you can see, the final image is completely white (brightness is 100%).
I'm completely lost here. I've tried many combinations (applying H, S, and B in every order) and other libraries such as iOS-Image-Filters, without any success. I've also read a lot of questions here on Stack Overflow.
Links:
Core Image Filter Reference
CIHueAdjust core image filter setup
How to programmatically change the hue of UIImage?
iOS: Values for CIFilter (Hue) from Photoshop
Has anyone succeeded in applying HSL/HSB values to UIImages?
Well, finally, we decided to change techniques.
I'm now using a semi-transparent image to which I apply a blend mode with the desired color.
Following this post I made a category on UIImage.
- (UIImage *)tintedBackgroundImageWithColor:(UIColor *)tintColor
{
    UIGraphicsBeginImageContextWithOptions(self.size, NO, 0.0f);
    [tintColor setFill];
    CGRect bounds = CGRectMake(0, 0, self.size.width, self.size.height);
    UIRectFill(bounds);
    [self drawInRect:bounds blendMode:kCGBlendModeSourceAtop alpha:1.0f];
    UIImage *tintedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return tintedImage;
}
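Usage from the theming code might then look like this (plugging in the converted HSB values from earlier; baseImage and backgroundImageView are hypothetical names):
// Build the target color from the converted HSB values and tint the asset.
UIColor *themeColor = [UIColor colorWithHue:152.0 / 360.0
                                 saturation:0.75
                                 brightness:1.0
                                      alpha:1.0];
self.backgroundImageView.image = [baseImage tintedBackgroundImageWithColor:themeColor];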

Gaussian Blur Filter Reversal: iOS

I've got a gaussian blur which I'm doing for an app.
// Get a UIImage from the UIView
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Blur the UIImage
CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey:@"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat:2] forKey:@"inputRadius"];
CIImage *resultImage = [gaussianBlurFilter valueForKey:@"outputImage"];
UIImage *endImage = [[UIImage alloc] initWithCIImage:resultImage];

// Place the UIImage in a UIImageView
newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
newView.image = endImage;
[self.view addSubview:newView];
It works great, but I want to be able to reverse it and return the view back to normal.
How can I do this? Simply removing the blurred view from its superview didn't work. I've also tried setting various properties to nil, with no luck.
Keep a pointer to viewImage in a property
@property (nonatomic, strong) UIImage *originalImage;
then after
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
add
self.originalImage = viewImage;
To revert your image:
newView.image = self.originalImage;
When you apply the blur, it does not alter viewImage: you have created a separate CIImage which gets blurred, and you make a new UIImage from the blurred CIImage.
The first thing I noticed in your code is that you don't apply the filter in an async task. Blurring the image takes time, so if you don't want to freeze the main thread you should use:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // blur the image in a second thread
    dispatch_async(dispatch_get_main_queue(), ^{
        // set the blurred image on your imageView in the main thread
    });
});
To be able to reverse to the original, just put the blurred copy in another image view that appears over the original. In my case the original isn't an image view but the view itself, so I only add an image view over my view to hold the blurred image, and set its image to nil when I want to reverse.
Finally, if you want to avoid a blink when you set the blurred image, you can animate the alpha to make a soft transition:
[self.blurredImageView setAlpha:0.0]; // make the imageView invisible
[self.blurredImageView setImage:blurredImage];

// and after setting the image, make it visible slowly
[UIView animateWithDuration:0.5 delay:0.1
                    options:UIViewAnimationOptionCurveEaseInOut
                 animations:^{
                     [self.blurredImageView setAlpha:1.0];
                 }
                 completion:nil];
Here are my complete methods:
- (void)makeBlurredScreenShot {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *snapshotImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *sourceImage = [CIImage imageWithCGImage:snapshotImage.CGImage];

    // Apply clamp filter:
    // this is needed because CIGaussianBlur, when applied, makes
    // a transparent border around the image
    NSString *clampFilterName = @"CIAffineClamp";
    CIFilter *clamp = [CIFilter filterWithName:clampFilterName];
    if (!clamp)
        return;
    [clamp setValue:sourceImage forKey:kCIInputImageKey];
    CIImage *clampResult = [clamp valueForKey:kCIOutputImageKey];

    // Apply Gaussian Blur filter
    NSString *gaussianBlurFilterName = @"CIGaussianBlur";
    CIFilter *gaussianBlur = [CIFilter filterWithName:gaussianBlurFilterName];
    if (!gaussianBlur)
        return;
    [gaussianBlur setValue:clampResult forKey:kCIInputImageKey];
    [gaussianBlur setValue:[NSNumber numberWithFloat:8.0] forKey:@"inputRadius"];
    CIImage *gaussianBlurResult = [gaussianBlur valueForKey:kCIOutputImageKey];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        CGImageRef cgImage = [context createCGImage:gaussianBlurResult fromRect:[sourceImage extent]];
        UIImage *blurredImage = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.blurredImageView setAlpha:0.0];
            [self.blurredImageView setImage:blurredImage];
            [UIView animateWithDuration:0.5 delay:0.1
                                options:UIViewAnimationOptionCurveEaseInOut
                             animations:^{
                                 [self.blurredImageView setAlpha:1.0];
                             }
                             completion:nil];
        });
    });
}

- (void)removeBlurredScreenShot {
    [UIView animateWithDuration:0.5 delay:0.1
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         [self.blurredImageView setAlpha:0.0];
                     }
                     completion:^(BOOL finished) {
                         [self.blurredImageView setImage:nil];
                     }];
}
Like I said in my previous comment, you could create a property/iVar that holds the image before the effect is applied and revert in case you want to undo.
if (!_originalImage)
    _originalImage = viewImage; // keep a reference; UIImage is immutable, so the saved original can't change

// Do your Blur Stuff

// Now somewhere down the line in your program, if you don't like the blur
// and the user would like to undo:
viewImage = _originalImage;
As per your comment on @HeWas's answer, you shouldn't blur the view completely. If the view is getting blurred, there's something else you are doing wrong elsewhere in your program.
