Core Image filters are now available in iOS 5, and I'm trying to apply one to a CALayer the way you'd do it on the Mac. Here's my code:
CALayer *myCircle = [CALayer layer];
myCircle.bounds = CGRectMake(0,0,30,30);
myCircle.position = CGPointMake(100,100);
myCircle.cornerRadius = 15;
myCircle.borderColor = [UIColor whiteColor].CGColor;
myCircle.borderWidth = 2;
myCircle.backgroundColor = [UIColor whiteColor].CGColor;
CIFilter *blurFilter = [CIFilter filterWithName:@"CIDiscBlur"];
[blurFilter setDefaults];
[blurFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
[myCircle setFilters:[NSArray arrayWithObjects:blurFilter, nil]];
[self.view.layer addSublayer:myCircle];
My white circle draws fine, but the filter isn't applied.
Aside from the fact that CIDiscBlur is not available (as of iOS SDK 5.1) and that setFilters: doesn't appear to be available either, you could do the following:
Create the input CIImage from the contents of your layer:
CIImage *inputImage = [CIImage imageWithCGImage:(CGImageRef)(myCircle.contents)];
Apply your filters and get the result as a CGImageRef:
CIFilter *filter = [CIFilter filterWith...]; // a filter that is available in iOS, or a custom one :)
...
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
Finally set the CGImageRef to the layer:
[myCircle setContents:(id)cgimg];
This should work :)
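To tie those steps together, here is a minimal end-to-end sketch, assuming CIGaussianBlur (which is available on iOS 5) stands in for the disc blur. One caveat the steps above gloss over: a layer drawn with backgroundColor and cornerRadius has nothing in its contents yet, so the sketch renders the layer into an image first:
// A minimal sketch, assuming CIGaussianBlur in place of the disc blur.
// The circle layer has no bitmap in .contents, so render it to an image first:
UIGraphicsBeginImageContextWithOptions(myCircle.bounds.size, NO, 0.0);
[myCircle renderInContext:UIGraphicsGetCurrentContext()];
UIImage *layerImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CIImage *inputImage = [CIImage imageWithCGImage:layerImage.CGImage];
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setDefaults];
[blur setValue:inputImage forKey:@"inputImage"];
[blur setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [blur outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
myCircle.contents = (__bridge id)cgimg; // plain (id)cgimg under manual reference counting
CGImageRelease(cgimg);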
I need to add a pixelated rectangular layer on a UIImage, which the user can then undo. I used this code, but it's not doing what I need:
CALayer *maskLayer = [CALayer layer];
CALayer *mosaicLayer = [CALayer layer];
// Mask image ends with 0.15 opacity on both sides. Set the background color of the layer
// to the same value so the layer can extend the mask image.
mosaicLayer.contents = (id)[img CGImage];
mosaicLayer.frame = CGRectMake(0,0, img.size.width, img.size.height);
UIImage *maskImg = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"mask" ofType:@"png"]];
maskLayer.contents = (id)[maskImg CGImage];
maskLayer.frame = CGRectMake(100,150, maskImg.size.width, maskImg.size.height);
mosaicLayer.mask = maskLayer;
[imageView.layer addSublayer:mosaicLayer];
UIGraphicsBeginImageContext(imageView.layer.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *saver = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Is there any built-in filter from Apple for iOS? Please guide me. Thanks!
You can use GPUImage's GPUImagePixellateFilter https://github.com/BradLarson/GPUImage/blob/8811da388aed22e04ed54ca9a5a76791eeb40551/framework/Source/GPUImagePixellateFilter.h
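Usage is roughly as follows; this is a sketch against GPUImage's filter API, where fractionalWidthOfAPixel controls the block size and sourceImage is assumed to be the UIImage you want to pixelate:
#import "GPUImagePixellateFilter.h"
GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
pixellateFilter.fractionalWidthOfAPixel = 0.05; // block size as a fraction of the image width
UIImage *pixelatedImage = [pixellateFilter imageByFilteringImage:sourceImage];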
We can use the GPUImage framework, but it's a lot better to use iOS's own filters. Easy coding :)
- (UIImage *)applyCIPixelateFilter:(UIImage*)fromImage withScale:(double)scale
{
/*
Makes an image blocky by mapping the image to colored squares whose color is defined by the replaced pixels.
Parameters
inputImage: A CIImage object whose display name is Image.
inputCenter: A CIVector object whose attribute type is CIAttributeTypePosition and whose display name is Center.
Default value: [150 150]
inputScale: An NSNumber object whose attribute type is CIAttributeTypeDistance and whose display name is Scale.
Default value: 8.00
*/
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIPixellate"];
CIImage *inputImage = [[CIImage alloc] initWithImage:fromImage];
CIVector *vector = [CIVector vectorWithX:fromImage.size.width /2.0f Y:fromImage.size.height /2.0f];
[filter setDefaults];
[filter setValue:vector forKey:@"inputCenter"];
[filter setValue:[NSNumber numberWithDouble:scale] forKey:#"inputScale"];
[filter setValue:inputImage forKey:#"inputImage"];
CGImageRef cgiimage = [context createCGImage:filter.outputImage fromRect:filter.outputImage.extent];
UIImage *newImage = [UIImage imageWithCGImage:cgiimage scale:1.0f orientation:fromImage.imageOrientation];
CGImageRelease(cgiimage);
return newImage;
}
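Calling it is then a one-liner (imageView stands in for whatever view holds the image from the question, and 20.0 is an arbitrary block scale):
// Pixelate the displayed image in place.
imageView.image = [self applyCIPixelateFilter:imageView.image withScale:20.0];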
I'm trying to use a color dodge blend mode (CIFilter) on an image and composite it with my whole scene (which is an SKScene node). Unfortunately, CIColorDodgeBlendMode only takes a CIImage as the input for the background. Is there a workaround?
Basically I want the same result as in Photoshop with two layers, where the upper layer has the color dodge blending mode applied.
Here is my code ('self' would be the SKScene node):
UIImage *inputUIImage = [UIImage imageNamed:@"inputImage.png"];
CIImage *inputCIImage = [[CIImage alloc] initWithImage:inputUIImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorDodgeBlendMode"];
[filter setValue:inputCIImage forKey:@"inputImage"];
[filter setValue:self forKey:@"inputBackgroundImage"];
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:outputImage fromRect:[outputImage extent]];
SKTexture *outputTexture = [SKTexture textureWithCGImage:cg];
SKSpriteNode * outputSprite = [[SKSpriteNode alloc]initWithTexture:outputTexture];
[self addChild:outputSprite];
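One possible workaround, sketched here as an assumption rather than a tested recipe: snapshot the scene into a texture with SKView's textureFromNode:, then turn that into the CIImage the filter wants (SKTexture's CGImage property requires iOS 9+):
// Snapshot the scene so it can serve as inputBackgroundImage.
SKTexture *sceneTexture = [self.view textureFromNode:self]; // self.view is the presenting SKView
CIImage *backgroundCIImage = [CIImage imageWithCGImage:sceneTexture.CGImage]; // CGImage is iOS 9+
[filter setValue:backgroundCIImage forKey:@"inputBackgroundImage"];
The rest of the code above should then work unchanged, although you may want to hide or remove the original nodes before adding outputSprite so they are not composited twice.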
I am trying to get stripes generated using a CIFilter, then create an SKTexture from it. Here is my code:
CIFilter *filter = [CIFilter filterWithName:@"CIStripesGenerator"];
[filter setDefaults];
[filter setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
[filter setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
// Updated the code with this line; still the same problem:
CIImage *croppedImage = [filter.outputImage imageByCroppingToRect:CGRectMake(0, 0, 320, 480)];
SKTexture *lightTexture = [SKTexture textureWithImage:[UIImage imageWithCIImage:croppedImage]];
SKSpriteNode *light = [SKSpriteNode spriteNodeWithTexture:lightTexture size:self.size];
However, I receive a runtime error at the last line. Any help would be appreciated; apart from (lldb), the compiler does not give any more explanation.
UPDATE:
Thanks to rickster for guiding me towards the solution:
-(UIImage*) generateImage {
// 1
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIStripesGenerator"];
[filter setDefaults];
[filter setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
[filter setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
// 2
CGImageRef cgimg =
[context createCGImage:filter.outputImage fromRect:CGRectMake(0, 0, 320, 480)];
// 3
UIImage *newImage = [UIImage imageWithCGImage:cgimg];
// 4
CGImageRelease(cgimg);
return newImage;
}
Then I can create a texture from the image:
SKTexture *stripesTexture = [SKTexture textureWithImage:[self generateImage]];
SKSpriteNode *stripes = [SKSpriteNode spriteNodeWithTexture:stripesTexture];
stripes.position=CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
[self addChild: stripes];
There are a couple of issues here:
You don't have a Core Image context for rendering your image. Create one with:
CIContext *context = [CIContext contextWithOptions:nil];
This probably won't provide real-time rendering performance. But it looks like you just want one-time generation of a static texture, so that's okay.
Most of the generator filters produce images of infinite extent. You need to either add a crop filter to the filter chain, or render your image using a method that lets you specify what rect of the image you want, like createCGImage:fromRect:. Then make an SKTexture from the resulting CGImageRef.
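For the crop-filter option, a sketch using CICrop (equivalent in effect to imageByCroppingToRect:) looks like this:
CIFilter *crop = [CIFilter filterWithName:@"CICrop"];
[crop setValue:filter.outputImage forKey:@"inputImage"];
[crop setValue:[CIVector vectorWithCGRect:CGRectMake(0, 0, 320, 480)] forKey:@"inputRectangle"];
CIImage *croppedImage = crop.outputImage; // now has a finite extent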
I'm merging two images together using the CIFilter @"CIDarkenBlendMode". It works fine except for one thing: I want the images to be exactly aligned on top of each other regardless of image size, but I am not able to achieve this. Do I have to create my own filter?
This is what I get:
This is what I want:
My merge-code:
-(void)mergeImagesWithCIImage:(UIImage*)image
{
CIImage *topImage = [[CIImage alloc]initWithImage:image];
CIImage *scaledImage = [self scaleImageWithCIImage:topImage];
CIImage *backgroundImage = [[CIImage alloc]initWithImage:self.vImage.image];
CIFilter *darkenFilter = [CIFilter filterWithName:@"CIDarkenBlendMode" keysAndValues:kCIInputImageKey, scaledImage,
@"inputBackgroundImage", backgroundImage, nil];
CIImage *filterOutputImage = darkenFilter.outputImage;
CIContext *ctx = [CIContext contextWithOptions:nil];
CGImageRef createdImage = [ctx createCGImage:filterOutputImage fromRect:filterOutputImage.extent];
UIImage *outputImage = [UIImage imageWithCGImage:createdImage];
CGImageRelease(createdImage);
createdImage = nil;
self.vImage.image = outputImage;
}
Instead of using a CIFilter I used:
[_image drawInRect:CGRectMake(centerX,centerY,_image.size.width,_image.size.height) blendMode:kCGBlendModeDarken alpha:0.8];
and centered the images.
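Spelled out, the replacement looks roughly like this; it's a sketch in which the centering math is my assumption about what "centered" means, and the 0.8 alpha comes from the snippet above:
- (UIImage *)mergeImage:(UIImage *)topImage overImage:(UIImage *)backgroundImage
{
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, 0.0);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
// Center the top image on the background regardless of the two sizes.
CGFloat centerX = (backgroundImage.size.width - topImage.size.width) / 2.0f;
CGFloat centerY = (backgroundImage.size.height - topImage.size.height) / 2.0f;
[topImage drawInRect:CGRectMake(centerX, centerY, topImage.size.width, topImage.size.height) blendMode:kCGBlendModeDarken alpha:0.8];
UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return merged;
}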
The following method attempts to apply a Gaussian blur to an image. However, it isn't doing anything. Can you please tell me what is wrong? If you also know the reason why it's wrong, that would help too. I am trying to learn about CALayers and QuartzCore.
Thanks
-(void)updateFavoriteRecipeImage{
[self.favoriteRecipeImage setImageWithURL:[NSURL URLWithString:self.profileVCModel.favoriteRecipeImageUrl] placeholderImage:[UIImage imageNamed:@"miNoImage"]];
//Set content mode
[self.favoriteRecipeImage setContentMode:UIViewContentModeScaleAspectFill];
self.favoriteRecipeImage.layer.masksToBounds = YES;
//Blur the image
CALayer *blurLayer = [CALayer layer];
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setDefaults];
blurLayer.backgroundFilters = [NSArray arrayWithObject:blur];
[self.favoriteRecipeImage.layer addSublayer:blurLayer];
[self.favoriteRecipeImage setAlpha:0];
//Show image using fade
[UIView animateWithDuration:.3 animations:^{
//Load alpha
[self.favoriteRecipeImage setAlpha:1];
[self.favoriteRecipeImageMask setFrame:self.favoriteRecipeImage.frame];
}];
}
The documentation of the backgroundFilters property says this:
Special Considerations
This property is not supported on layers in iOS.
As of iOS 6.1, there is no public API for applying live filters to layers on iOS. You can write code to draw the underlying layers to a CGImage and then apply filters to that image and set it as your layer's background, but doing so is somewhat complex and isn't “live” (it doesn't update automatically if the underlying layers change).
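If you do go that route, the "draw the underlying layers to a CGImage" step is just a snapshot; here is a sketch, where underlyingView is a placeholder for whatever sits behind the blur:
// Snapshot the layers that should appear blurred behind ours.
UIGraphicsBeginImageContextWithOptions(underlyingView.bounds.size, NO, 0.0);
[underlyingView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// snapshot can then be fed into Core Image, as in the code below.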
Try something like this:
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"test.png"]];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setDefaults];
[blurFilter setValue:inputImage forKey:@"inputImage"];
[blurFilter setValue:[NSNumber numberWithFloat:10.0f] forKey:@"inputRadius"];
CIImage *outputImage = [blurFilter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:outputImage.extent];
self.bluredImageView.image = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg); // createCGImage: returns a +1 reference, so release it to avoid a leak