How to blur an image using CALayer - iOS

The following method attempts to apply a Gaussian blur to an image, but it isn't doing anything. Can you tell me what is wrong, and if you also know why it's wrong, that would help too. I am trying to learn about CALayers and QuartzCore.
Thanks
- (void)updateFavoriteRecipeImage {
    [self.favoriteRecipeImage setImageWithURL:[NSURL URLWithString:self.profileVCModel.favoriteRecipeImageUrl]
                             placeholderImage:[UIImage imageNamed:@"miNoImage"]];
    // Set content mode
    [self.favoriteRecipeImage setContentMode:UIViewContentModeScaleAspectFill];
    self.favoriteRecipeImage.layer.masksToBounds = YES;
    // Blur the image
    CALayer *blurLayer = [CALayer layer];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    blurLayer.backgroundFilters = [NSArray arrayWithObject:blur];
    [self.favoriteRecipeImage.layer addSublayer:blurLayer];
    [self.favoriteRecipeImage setAlpha:0];
    // Show image using fade
    [UIView animateWithDuration:.3 animations:^{
        // Load alpha
        [self.favoriteRecipeImage setAlpha:1];
        [self.favoriteRecipeImageMask setFrame:self.favoriteRecipeImage.frame];
    }];
}

The documentation of the backgroundFilters property says this:
Special Considerations
This property is not supported on layers in iOS.
As of iOS 6.1, there is no public API for applying live filters to layers on iOS. You can write code to draw the underlying layers to a CGImage and then apply filters to that image and set it as your layer's background, but doing so is somewhat complex and isn't “live” (it doesn't update automatically if the underlying layers change).

Try something like the following instead:
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"test.png"]];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[blurFilter setValue:inputImage forKey:@"inputImage"];
[blurFilter setValue:[NSNumber numberWithFloat:10.0f] forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
// createCGImage:fromRect: returns a +1 reference, so release it after use
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
self.bluredImageView.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
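If you need the blur to show what lies behind the view (the workaround the documentation note describes), you can first snapshot the underlying layers into a UIImage and feed that into the code above. A minimal sketch, assuming the image view sits inside self.view; remember this is not "live" and must be re-run whenever the layers underneath change:
// Snapshot whatever is currently behind the image view
UIGraphicsBeginImageContextWithOptions(self.favoriteRecipeImage.frame.size, NO, 0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Shift the context so only the region under the image view gets drawn
CGContextTranslateCTM(ctx, -self.favoriteRecipeImage.frame.origin.x,
                           -self.favoriteRecipeImage.frame.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Now run `snapshot` through the CIGaussianBlur code above and set the
// blurred result as the image view's background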

Related

Pixelated layer on image in iOS

I need to add a pixelated rectangular layer on top of a UIImage, and the effect must be undoable.
I used this code, but it's not doing what I need:
CALayer *maskLayer = [CALayer layer];
CALayer *mosaicLayer = [CALayer layer];
// Mask image ends with 0.15 opacity on both sides. Set the background color of the layer
// to the same value so the layer can extend the mask image.
mosaicLayer.contents = (id)[img CGImage];
mosaicLayer.frame = CGRectMake(0, 0, img.size.width, img.size.height);
UIImage *maskImg = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"mask" ofType:@"png"]];
maskLayer.contents = (id)[maskImg CGImage];
maskLayer.frame = CGRectMake(100, 150, maskImg.size.width, maskImg.size.height);
mosaicLayer.mask = maskLayer;
[imageView.layer addSublayer:mosaicLayer];
UIGraphicsBeginImageContext(imageView.layer.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *saver = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Is there any built-in filter by Apple for iOS? Please guide me. Thanks.
You can use GPUImage's GPUImagePixellateFilter https://github.com/BradLarson/GPUImage/blob/8811da388aed22e04ed54ca9a5a76791eeb40551/framework/Source/GPUImagePixellateFilter.h
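For reference, driving that filter typically looks something like this (a rough sketch; the API names are from the GPUImage framework and may differ slightly between versions, so check them against the one you link):
GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImagePixellateFilter *pixellate = [[GPUImagePixellateFilter alloc] init];
pixellate.fractionalWidthOfAPixel = 0.05; // block size, relative to image width
[source addTarget:pixellate];
[pixellate useNextFrameForImageCapture];
[source processImage];
UIImage *pixelated = [pixellate imageFromCurrentFramebuffer];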
We can use the GPUImage framework, but it's a lot better to use iOS's own filters. Easy coding :)
- (UIImage *)applyCIPixelateFilter:(UIImage *)fromImage withScale:(double)scale
{
    /*
     Makes an image blocky by mapping the image to colored squares whose color is defined by the replaced pixels.
     Parameters
     inputImage: A CIImage object whose display name is Image.
     inputCenter: A CIVector object whose attribute type is CIAttributeTypePosition and whose display name is Center.
     Default value: [150 150]
     inputScale: An NSNumber object whose attribute type is CIAttributeTypeDistance and whose display name is Scale.
     Default value: 8.00
     */
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIPixellate"];
    CIImage *inputImage = [[CIImage alloc] initWithImage:fromImage];
    CIVector *vector = [CIVector vectorWithX:fromImage.size.width / 2.0f Y:fromImage.size.height / 2.0f];
    [filter setDefaults];
    [filter setValue:vector forKey:@"inputCenter"];
    [filter setValue:[NSNumber numberWithDouble:scale] forKey:@"inputScale"];
    [filter setValue:inputImage forKey:@"inputImage"];
    CGImageRef cgiimage = [context createCGImage:filter.outputImage fromRect:filter.outputImage.extent];
    UIImage *newImage = [UIImage imageWithCGImage:cgiimage scale:1.0f orientation:fromImage.imageOrientation];
    CGImageRelease(cgiimage);
    return newImage;
}
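Usage is then a one-liner; originalImage is a placeholder for your source image, and the scale of 20 is just an illustrative choice (larger values give bigger blocks):
UIImage *pixelated = [self applyCIPixelateFilter:originalImage withScale:20.0];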

CIGaussianBlur issues

After much research I found out how to get iOS's CIGaussianBlur to semi-work. I am trying to blur a UIImageView, but instead it's blurring the entire view. Here is my code:
// Get a UIImage from the UIView
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Blur the UIImage
CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey:@"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat:2] forKey:@"inputRadius"];
CIImage *resultImage = [gaussianBlurFilter valueForKey:@"outputImage"];
UIImage *endImage = [[UIImage alloc] initWithCIImage:resultImage];
// Place the UIImage in a UIImageView
_backgroundImage = [[UIImageView alloc] initWithFrame:self.view.bounds];
_backgroundImage.image = endImage;
[self.view addSubview:_backgroundImage];
(My UIImageView's name is backgroundImage)
Thanks for any help!
Are you trying to recreate the iOS 7 blur? If so, you can use this category:
https://github.com/iGriever/TWSReleaseNotesView/tree/master/TWSReleaseNotesView
It is one that an Apple employee released to make it easy to recreate the iOS 7 blur.
Just take the snapshot and then use the category's "applyLightEffect" etc.
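As for why the entire view gets blurred: the snapshot in the question is taken from self.view.layer, so the whole hierarchy ends up in the image. To blur just the image view, render only its layer. A minimal sketch (imageViewToBlur is a placeholder name for your UIImageView):
// Snapshot only the image view, not the whole view hierarchy
UIGraphicsBeginImageContextWithOptions(imageViewToBlur.bounds.size, NO, 0);
[imageViewToBlur.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// ...then run viewImage through the CIGaussianBlur code from the question
// and assign the blurred result back to imageViewToBlur.image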

CIFilter (CIStripesGenerator) with SKTexture?

I am trying to generate stripes using CIFilter, then create an SKTexture from it.
Here is my code.
CIFilter *filter = [CIFilter filterWithName:@"CIStripesGenerator"];
[filter setDefaults];
[filter setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
[filter setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
// Updated the code with this line;
// still the same problem
CIImage *croppedImage = [filter.outputImage imageByCroppingToRect:CGRectMake(0, 0, 320, 480)];
SKTexture *lightTexture = [SKTexture textureWithImage:[UIImage imageWithCIImage:croppedImage]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:lightTexture size:self.size];
However, I receive a runtime error at the last line. Any help would be appreciated; apart from (lldb), the compiler does not give any more explanation.
UPDATE:
Thanks to rickster for guiding me towards the solution:
- (UIImage *)generateImage {
    // 1
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIStripesGenerator"];
    [filter setDefaults];
    [filter setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
    [filter setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
    // 2
    CGImageRef cgimg =
        [context createCGImage:filter.outputImage fromRect:CGRectMake(0, 0, 320, 480)];
    // 3
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    // 4
    CGImageRelease(cgimg);
    return newImage;
}
Then I can create a texture from the image:
SKTexture *stripesTexture = [SKTexture textureWithImage:[self generateImage]];
SKSpriteNode *stripes = [SKSpriteNode spriteNodeWithTexture:stripesTexture];
stripes.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
[self addChild:stripes];
There are a couple of issues here:
You don't have a Core Image context for rendering your image. Create one with:
CIContext *context = [CIContext contextWithOptions:nil];
This probably won't provide real-time rendering performance. But it looks like you just want one-time generation of a static texture, so that's okay.
Most of the generator filters produce images of infinite extent. You need to either add a crop filter to the filter chain, or render your image using a method that lets you specify what rect of the image you want, like createCGImage:fromRect:. Then make an SKTexture from the resulting CGImageRef.
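For the crop-filter route, a minimal sketch using CICrop (the crop rectangle is passed as a CIVector; `filter` is the stripes generator from above):
CIFilter *crop = [CIFilter filterWithName:@"CICrop"];
[crop setValue:filter.outputImage forKey:@"inputImage"];
[crop setValue:[CIVector vectorWithCGRect:CGRectMake(0, 0, 320, 480)]
        forKey:@"inputRectangle"];
CIImage *croppedOutput = crop.outputImage; // now has a finite extent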

Core Image filter CISourceOverCompositing not working properly

I am working on a photo editing app and I have to merge two images, one over the other.
I have implemented the following code to do so:
Here imgedit is the background image and
imgEdit is the UIImageView containing imgedit.
UIImage *tempImg = [UIImage imageNamed:@"borderImg"];
CIImage *inputBackgroundImage = [[CIImage alloc] initWithImage:imgedit];
CIImage *inputImage = [[CIImage alloc] initWithImage:tempImg];
CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[filter setDefaults];
[filter setValue:inputImage forKey:@"inputImage"];
[filter setValue:inputBackgroundImage forKey:@"inputBackgroundImage"];
CIImage *outputImage1 = [filter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage1 fromRect:outputImage1.extent];
imgEdit.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage: returns a +1 reference
But the outputImage I am getting after implementing above code is:
I have also tried to resize the input white frame image using the following code:
tempImg = [tempImg resizedImageToSize:CGSizeMake(imgEdit.image.size.width, imgEdit.image.size.height)];
The image gets resized properly this way, but that still does not fix the output.
Please help me out here. Your valuable help will be highly appreciated. Thank you in advance.
A better way to resize is as follows:
inputImage = [inputImage imageByApplyingTransform:
    CGAffineTransformMakeScale(inputBackgroundImage.extent.size.width / inputImage.extent.size.width,
                               inputBackgroundImage.extent.size.height / inputImage.extent.size.height)];
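Put together, the scale-then-composite flow looks roughly like this (a sketch reusing the question's variable names; error handling omitted):
CIImage *inputBackgroundImage = [[CIImage alloc] initWithImage:imgedit];
CIImage *inputImage = [[CIImage alloc] initWithImage:tempImg];
// Scale the frame so its extent matches the background's extent exactly
CGFloat sx = inputBackgroundImage.extent.size.width / inputImage.extent.size.width;
CGFloat sy = inputBackgroundImage.extent.size.height / inputImage.extent.size.height;
inputImage = [inputImage imageByApplyingTransform:CGAffineTransformMakeScale(sx, sy)];
CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[filter setValue:inputImage forKey:@"inputImage"];
[filter setValue:inputBackgroundImage forKey:@"inputBackgroundImage"];
CIContext *context = [CIContext contextWithOptions:nil];
// Crop to the background's extent so the merged image keeps its size
CGImageRef cgImage = [context createCGImage:filter.outputImage
                                   fromRect:inputBackgroundImage.extent];
imgEdit.image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);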

Applying a CIFilter to a CALayer

Core Image filters are now available in iOS 5, and I'm trying to apply one to a CALayer the way you'd do it on the Mac. Here's my code:
CALayer *myCircle = [CALayer layer];
myCircle.bounds = CGRectMake(0, 0, 30, 30);
myCircle.position = CGPointMake(100, 100);
myCircle.cornerRadius = 15;
myCircle.borderColor = [UIColor whiteColor].CGColor;
myCircle.borderWidth = 2;
myCircle.backgroundColor = [UIColor whiteColor].CGColor;
CIFilter *blurFilter = [CIFilter filterWithName:@"CIDiscBlur"];
[blurFilter setDefaults];
[blurFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
[myCircle setFilters:[NSArray arrayWithObjects:blurFilter, nil]];
[self.view.layer addSublayer:myCircle];
My white circle draws fine, but the filter isn't applied.
Aside from the fact that CIDiscBlur is not available (as of iOS SDK 5.1) and that setFilters: does not seem to be available either, you could do the following:
Create the input CIImage from the contents of your layer:
CIImage *inputImage = [CIImage imageWithCGImage:(CGImageRef)(myCircle.contents)];
Apply your filters and get the result in a CGImageRef:
CIFilter *filter = [CIFilter filterWith...]; // A filter that is available in iOS or a custom one :)
...
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
Finally set the CGImageRef to the layer:
[myCircle setContents:(id)cgimg];
This should work :)
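Put together, a minimal end-to-end sketch. Since the circle layer in the question draws itself via backgroundColor and cornerRadius (so its contents is empty), this version first renders the layer into a bitmap; CISepiaTone stands in here for whichever iOS-available filter you pick:
// Render the layer into an image first (the circle has no .contents of its own)
UIGraphicsBeginImageContextWithOptions(myCircle.bounds.size, NO, 0);
[myCircle renderInContext:UIGraphicsGetCurrentContext()];
UIImage *layerImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CIImage *inputImage = [CIImage imageWithCGImage:layerImage.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"]; // stand-in filter
[filter setDefaults];
[filter setValue:inputImage forKey:@"inputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
[myCircle setContents:(__bridge id)cgimg];
CGImageRelease(cgimg);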
