I am having a problem adding a filter to a CALayer. Here is the code; the very last line is where I add the filter output to the CALayer.
CIImage *inputImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"%d.jpg"]];
CIFilter *minimumComponent = [CIFilter filterWithName:@"CIMinimumComponent"];
[minimumComponent setDefaults];
[minimumComponent setValue:inputImage forKey:@"inputImage"];
CIImage *outputImage = [minimumComponent valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
imageLayer = [UIImage imageWithCGImage:[context createCGImage:outputImage fromRect:outputImage.extent]];
I guess the error is here:
imageLayer = [UIImage imageWithCGImage:[context createCGImage:outputImage fromRect:outputImage.extent]];
Here is the link where I got the idea ...
OK, first off, I'm not 100% certain what error you are getting, but I'd suggest changing to something like this:
CIImage *outputImage = [minimumComponent valueForKey:@"outputImage"];
UIImage *uiImage = [UIImage imageWithCIImage:outputImage];
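Since the goal is to put the filtered image on a CALayer, here is a minimal sketch of the rendering route, assuming imageLayer is actually a CALayer (if it is a UIImage property, the two lines above are enough):

// Render the filter output to a concrete CGImage and hand it to the layer.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
imageLayer.contents = (__bridge id)cgImage;
CGImageRelease(cgImage);

Rendering through a CIContext produces an actual bitmap; a UIImage created with imageWithCIImage: defers rendering and cannot be assigned to a layer's contents directly.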
I need to filter an image like this.
This is the main image:
This is the output image:
Please help me to get this effect. I am trying the following, but I am not getting the exact output.
- (UIImage *)setImage:(UIImage *)image_
{
    UIImage *entryImage = image_;
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *image = [CIImage imageWithCGImage:[entryImage CGImage]];
    CIFilter *filter = [CIFilter filterWithName:@"CIMaskToAlpha"];
    [filter setDefaults];
    [filter setValue:image forKey:kCIInputImageKey];
    // CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CIImage *result = [filter outputImage];
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgImage scale:[entryImage scale] orientation:UIImageOrientationUp];
    CGImageRelease(cgImage);
    return newImage;
}
The best way is a LUT (Look Up Table) filter. You can find a sample LUT filter at the link below.
link : https://nghiatran.me/filter-me-color-lookup-table-part-2/
Brad Larson's GPUImage library provides a GPUImageLookupFilter for applying a LUT to an image; a minimal sketch follows below.
link : https://github.com/BradLarson/GPUImage
Example:
link : How to use GPUImageLookupFilter without GPUImageFilterGroup?
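For reference, a minimal sketch of GPUImageLookupFilter usage, assuming a LUT image named lookup.png in the bundle (the file name and the inputImage variable are illustrative):

#import "GPUImage.h"

// The source photo feeds the filter's first input, the LUT its second.
GPUImagePicture *sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImagePicture *lookupPicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"lookup.png"]];
GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];

[sourcePicture addTarget:lookupFilter];
[lookupPicture addTarget:lookupFilter];

// Capture the next rendered frame as a UIImage.
[lookupFilter useNextFrameForImageCapture];
[sourcePicture processImage];
[lookupPicture processImage];
UIImage *filteredImage = [lookupFilter imageFromCurrentFramebuffer];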
Feel free to ask anything :)
I am writing a function that will apply a filter to an image and return the new image. I wrote the following code:
+ (UIImage *)applyFilter:(UIImage *)photo {
    CIImage *image = [[CIImage alloc] initWithCGImage:photo.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIphotoEffectTransfer"
                                  keysAndValues:kCIInputImageKey, image,
                                                @"inputIntensity", @0.8, nil];
    CIImage *outputImage = [filter outputImage];
    UIImage *newPhoto = [self imageFromCIImage:outputImage];
    return newPhoto;
}
The problem I am running into is that the function returns a nil photo instead of one with the filter applied. Interestingly, if I change the filter name to @"CIVignetteEffect" it works. I don't understand why one filter works but the other does not. I found both filters in the following reference: https://developer.apple.com/library/tvos/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIPhotoEffectTransfer
I believe the correct name of the filter is CIPhotoEffectTransfer, not CIphotoEffectTransfer. Filter names are case sensitive, and [CIFilter filterWithName:] returns nil for an unrecognized name, which is why outputImage ends up nil. Note also that the CIPhotoEffect filters take no inputIntensity parameter.
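If in doubt about a name, you can list the built-in filters at runtime; a quick sketch:

// Print every built-in Core Image filter name to verify spelling.
for (NSString *name in [CIFilter filterNamesInCategory:kCICategoryBuiltIn]) {
    NSLog(@"%@", name);
}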
Try this code... I have used it for photo filtering in the past, so I know it works:
+ (UIImage *)applyFilter:(UIImage *)photo {
    UIImageOrientation orientation = photo.imageOrientation;
    CIImage *image = [CIImage imageWithCGImage:photo.CGImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTransfer"];
    [filter setValue:image forKey:kCIInputImageKey];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newPhoto = [UIImage imageWithCGImage:cgimg scale:1.0 orientation:orientation];
    CGImageRelease(cgimg);
    return newPhoto;
}
Try this. We have implemented CIFilter effects:
// CIVignette effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setDefaults];
[vignette setValue:image forKey:@"inputImage"];
[vignette setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];
[vignette setValue:[NSNumber numberWithFloat:10.00] forKey:@"inputRadius"];
CIImage *result = [vignette valueForKey:@"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
CGImageRelease(cgImageRef); // release the CGImage created above to avoid a leak
For a detailed implementation of multiple effects, you can refer to this GitHub project: ImageFilter.
Hope this answer helps someone.
For some reason, attempting to apply CIComicEffect gives me the GLES error I put in the title. Here is my code:
CGImageRef imageRef = imageToPass.CGImage;
CIContext *context = [CIContext contextWithOptions:nil]; // 1
CIImage *image = [CIImage imageWithCGImage:imageRef]; // 2
CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect"]; // 3
[filter setValue:image forKey:kCIInputImageKey];
//[filter setValue:@0.8f forKey:kCIInputIntensityKey];
CIImage *result = [filter valueForKey:kCIOutputImageKey]; // 4
CGRect extent = [result extent];
CGImageRef cgImage = [context createCGImage:result fromRect:extent]; // 5
UIImageOrientation originalOrientation = imageToPass.imageOrientation;
CGFloat originalScale = imageToPass.scale;
imageToPass = [UIImage imageWithCGImage:cgImage scale:originalScale orientation:originalOrientation];
I have no idea what is going wrong, and a Google search turns up practically nothing. Here are the relevant docs:
https://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIComicEffect
https://developer.apple.com/library/ios/documentation/GraphicsImaging/Reference/QuartzCoreFramework/Classes/CIFilter_Class/index.html#//apple_ref/occ/cl/CIFilter
I can't see where it happens in your code, but most likely you are executing that code repeatedly, so the filter gets applied over and over: imageToPass is both the result and the input for the next iteration.
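A minimal sketch of the usual fix, assuming the pristine source is kept in a separate property (originalImage here is a hypothetical name):

// Always filter the untouched original, never the previous output.
CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect"];
[filter setValue:input forKey:kCIInputImageKey];
CIImage *result = filter.outputImage;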
I want to convert an image to pure black & white. I tried and got the result shown in the left image attached; according to the requirements of the application, the result should be the right image attached.
I have used lots of CIFilters (CIColorMonochrome, CIColorControls, CIPhotoEffectTonal, etc.) but none of them is working.
Please check the code below and the attached result images.
- (UIImage *)imageBlackAndWhite
{
    CIImage *beginImage = [CIImage imageWithData:UIImageJPEGRepresentation(self.captureImage.image, .1)];
    //CIImage *beginImage = [CIImage imageWithCGImage:self.CGImage];
    CIImage *output = [CIFilter filterWithName:@"CIColorMonochrome"
                                 keysAndValues:kCIInputImageKey, beginImage,
                                               @"inputIntensity", [NSNumber numberWithFloat:1.0],
                                               @"inputColor", [[CIColor alloc] initWithColor:[UIColor whiteColor]],
                                               nil].outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
    UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
    CGImageRelease(cgiimage);
    return newImage;
}
(Left: my result. Right: the expected result.)
You can draw the image with blend mode kCGBlendModeLuminosity and an alpha value of 1.0.
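For illustration, a minimal Core Graphics sketch of that suggestion as a plain C helper (note this produces a grayscale image; a true two-tone black & white may additionally need thresholding):

// Sketch: grayscale via luminosity blending over a white fill.
static UIImage *LuminosityBlend(UIImage *image) {
    CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
    [[UIColor whiteColor] setFill];
    UIRectFill(rect);
    [image drawInRect:rect blendMode:kCGBlendModeLuminosity alpha:1.0];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}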
I'm creating a photo app that involves filters. The filters work, but they are permanent and overlap each other. I want, for example, the sepia filter to be removed from the image and replaced with the instant filter when the instant filter button is pressed. How do I achieve this? Here is some code from the project:
- (IBAction)sepiaFilter:(id)sender {
    CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.picture.image)];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];
    CGImageRef cging = [context createCGImage:outputImage fromRect:[outputImage extent]];
    self.picture.image = [UIImage imageWithCGImage:cging];
    CGImageRelease(cging);
}

- (IBAction)instantFilter:(id)sender {
    CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.picture.image)];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filterTwo = [CIFilter filterWithName:@"CIPhotoEffectInstant"
                                     keysAndValues:kCIInputImageKey, beginImage, nil];
    CIImage *outputImage = [filterTwo outputImage];
    CGImageRef cging = [context createCGImage:outputImage fromRect:[outputImage extent]];
    self.picture.image = [UIImage imageWithCGImage:cging];
    CGImageRelease(cging);
}
(I actually have nine filters, but I just need an idea of how to do this.) Thanks for any help!
It's quite simple: you are using self.picture.image as the input as well as the output. Keep the original UIImage separately for input, and use self.picture.image only for output. The only thing that changes in your code is the first line of both filter actions, i.e.

CIImage *beginImage = [CIImage imageWithData:UIImagePNGRepresentation(self.originalImage)];
Hope this helps.
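For completeness, a sketch of where the untouched original might be captured (the originalImage property name is illustrative):

// Capture the pristine image once, before any filter runs.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.originalImage = self.picture.image;
}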