How would I make a UIImageView have a gradient-blurred background? In other words, the image is fully blurred on the left and gets less blurred toward the right?
We did the same in a project! We used this method:
- (UIImage *)blurImage:(CGImageRef)image withBlurLevel:(CGFloat)blur
{
    CIImage *inputImage = [CIImage imageWithCGImage:image];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, inputImage,
                                                @"inputRadius", @(blur), nil];
    CIImage *outputImage = filter.outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef outImage = [context createCGImage:outputImage
                                        fromRect:[inputImage extent]];
    UIImage *returnImage = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return returnImage;
}
This code uses the GPU to do the blur, but you need to call the method on a background thread if you don't want your UI to block for a moment.
Hope it helps!
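To get the left-to-right falloff the question actually asks about (fully blurred on the left, sharp on the right), a uniform gaussian blur isn't enough; one option is CIMaskedVariableBlur driven by a horizontal gradient mask, where white areas of the mask receive the full blur radius and black areas stay sharp. A minimal sketch, assuming an iOS version that ships CIMaskedVariableBlur:

- (UIImage *)gradientBlurredImage:(UIImage *)source radius:(CGFloat)radius
{
    CIImage *input = [CIImage imageWithCGImage:source.CGImage];
    CGRect extent = input.extent;

    // Horizontal mask: white (full blur) at the left edge, black (no blur) at the right.
    CIFilter *gradient = [CIFilter filterWithName:@"CILinearGradient"];
    [gradient setValue:[CIVector vectorWithX:0 Y:0] forKey:@"inputPoint0"];
    [gradient setValue:[CIVector vectorWithX:extent.size.width Y:0] forKey:@"inputPoint1"];
    [gradient setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
    [gradient setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
    CIImage *mask = [gradient.outputImage imageByCroppingToRect:extent];

    // The blur radius at each pixel scales with the mask's brightness there.
    CIFilter *blur = [CIFilter filterWithName:@"CIMaskedVariableBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:mask forKey:@"inputMask"];
    [blur setValue:@(radius) forKey:kCIInputRadiusKey];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:blur.outputImage fromRect:extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}

The same threading advice applies: run this off the main thread and hop back to the main queue before setting the image view's image.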
I'm trying to add a blur effect using a category.
+ (UIImage *)blurImageWithImage:(UIImage *)imageName withView:(UIView *)view {
    UIImage *sourceImage = imageName;
    CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

    // Apply Affine-Clamp filter to stretch the image so that it does not
    // look shrunken when the gaussian blur is applied
    CGAffineTransform transform = CGAffineTransformIdentity;
    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setValue:inputImage forKey:@"inputImage"];
    [clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];

    // Apply gaussian blur filter with radius of 10
    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
    [gaussianBlurFilter setValue:@10 forKey:@"inputRadius"];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];

    // Set up output context.
    UIGraphicsBeginImageContext(view.frame.size);
    CGContextRef outputContext = UIGraphicsGetCurrentContext();

    // Invert image coordinates
    CGContextScaleCTM(outputContext, 1.0, -1.0);
    CGContextTranslateCTM(outputContext, 0, view.frame.size.height);

    // Draw base image.
    CGContextDrawImage(outputContext, view.frame, cgImage);

    // Apply white tint
    CGContextSaveGState(outputContext);
    CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
    CGContextFillRect(outputContext, view.frame);
    CGContextRestoreGState(outputContext);

    // Output image is ready.
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
Then I call this function inside a UIView like this:
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImageView *page = [[UIImageView alloc] initWithImage:[UIImage blurImageWithImage:image withView:self]];
If I add this function directly to the class it works, but it doesn't when I put it in a UIImage category.
I faced the same problem earlier, but thankfully I found a solution. Please follow these steps, and first make sure your blur function itself works correctly.
1) In the category, add an instance method instead of a class method, e.g.:
- (UIImage *)blurImageWithImage:(UIImage *)imageName withView:(UIView *)view
2) Import the category in your VC.
3) Use the category, e.g.:
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImageView *page = [[UIImageView alloc] initWithImage:[image blurImageWithImage:image withView:self]];
Let me know if this solution works for you.
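For reference, a minimal sketch of what the category files might look like (the file name UIImage+Blur and the exact split are illustrative):

// UIImage+Blur.h -- hypothetical category header (step 1)
#import <UIKit/UIKit.h>

@interface UIImage (Blur)
- (UIImage *)blurImageWithImage:(UIImage *)imageName withView:(UIView *)view;
@end

// UIImage+Blur.m would hold the implementation shown above, unchanged except
// for the leading "+" becoming "-". Then, in the view controller (step 2):
// #import "UIImage+Blur.h"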
Turns out the problem was that I forgot to add a "-" when doing the context translate. So what I ended up doing is creating a class method.
Interface:
+ (UIImage *)blurImageWithImageName:(NSString *)imageName withView:(UIView *)view;
Implementation:
+ (UIImage *)blurImageWithImageName:(NSString *)imageName withView:(UIView *)view {
    UIImage *sourceImage = [UIImage imageNamed:imageName];
    CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];

    // Apply Affine-Clamp filter to stretch the image so that it does not
    // look shrunken when the gaussian blur is applied
    CGAffineTransform transform = CGAffineTransformIdentity;
    CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
    [clampFilter setValue:inputImage forKey:@"inputImage"];
    [clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];

    // Apply gaussian blur filter with radius of 10
    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
    [gaussianBlurFilter setValue:@10 forKey:@"inputRadius"];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];

    // Set up output context.
    UIGraphicsBeginImageContext(view.frame.size);
    CGContextRef outputContext = UIGraphicsGetCurrentContext();

    // Invert image coordinates (note the "-" that was missing before)
    CGContextScaleCTM(outputContext, 1.0, -1.0);
    CGContextTranslateCTM(outputContext, 0, -view.frame.size.height);

    // Draw base image.
    CGContextDrawImage(outputContext, view.frame, cgImage);

    // Apply white tint
    CGContextSaveGState(outputContext);
    CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
    CGContextFillRect(outputContext, view.frame);
    CGContextRestoreGState(outputContext);

    // Output image is ready.
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
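As the first answer noted, Core Image work like this is best kept off the main thread. A hedged usage sketch (assuming the class method ended up on a UIImage category; self.imageView is a placeholder):

// Blur on a background queue, then update the UI on the main queue.
// Note: reading UIView geometry is only strictly safe on the main thread,
// so consider passing a CGSize into the blur method instead of the view.
UIView *targetView = self.view;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *blurred = [UIImage blurImageWithImageName:@"xxx" withView:targetView];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = blurred;
    });
});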
I'm trying to use a color dodge blend mode (CIFilter) on an image and composite it with my whole scene (which is an SKScene node). Unfortunately, CIColorDodgeBlendMode only takes a CIImage as the input for the background. Is there possibly a workaround?
Basically I want the same result as in Photoshop with two layers, where the upper layer has the color dodge blending mode applied.
Here is my code ('self' would be the SKScene node):
UIImage *inputUIImage = [UIImage imageNamed:@"inputImage.png"];
CIImage *inputCIImage = [[CIImage alloc] initWithImage:inputUIImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorDodgeBlendMode"];
[filter setValue:inputCIImage forKey:@"inputImage"];
[filter setValue:self forKey:@"inputBackgroundImage"]; // the problem: an SKScene is not a CIImage
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:outputImage fromRect:[outputImage extent]];
SKTexture *outputTexture = [SKTexture textureWithCGImage:cg];
SKSpriteNode *outputSprite = [[SKSpriteNode alloc] initWithTexture:outputTexture];
[self addChild:outputSprite];
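One possible workaround (a sketch, assuming the scene is currently presented, so its view property returns the presenting SKView): snapshot that view into a UIImage first, then hand the snapshot to the filter as the background image.

// self is the SKScene, whose view property is the presenting SKView.
SKView *skView = self.view;
UIGraphicsBeginImageContextWithOptions(skView.bounds.size, YES, 0);
[skView drawViewHierarchyInRect:skView.bounds afterScreenUpdates:YES];
UIImage *sceneSnapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Use the snapshot as the filter's background instead of the scene itself.
CIImage *backgroundCIImage = [[CIImage alloc] initWithImage:sceneSnapshot];
[filter setValue:backgroundCIImage forKey:@"inputBackgroundImage"];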
I'm applying a filter to a UIImage, but after conversion the image appears sideways. Below is my code.
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"
                              keysAndValues:kCIInputImageKey, ciImage, nil];
[filter setDefaults];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImage = [[UIImage alloc] initWithCIImage:outputImage];
It renders successfully into newImage with the correct dimensions, but the image is sideways. Is there anything in the code above that causes the orientation to change?
Scale and rotate the image to the proper orientation before applying the filter.
Follow this link: https://stackoverflow.com/q/538064/871102
Instead of:
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
write:
UIImage *properOrientedImage = [self scaleAndRotateImage:self.imgToProcess];
CIImage *ciImage = [[CIImage alloc] initWithImage:properOrientedImage];
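The linked answer contains the full scaleAndRotateImage: implementation. A minimal sketch of the core idea, as a hypothetical helper that simply bakes the orientation into the pixel data by redrawing:

- (UIImage *)normalizedImage:(UIImage *)image
{
    // Already upright: nothing to do.
    if (image.imageOrientation == UIImageOrientationUp) return image;

    // drawInRect: honors imageOrientation, so redrawing yields an image
    // whose underlying pixels are actually upright.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}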
Yes, you may consider setting the image orientation explicitly by initializing the UIImage with this method:
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
And you have a couple of image orientations to choose from:
typedef enum {
    UIImageOrientationUp,            // default orientation
    UIImageOrientationDown,          // 180 deg rotation
    UIImageOrientationLeft,          // 90 deg CCW
    UIImageOrientationRight,         // 90 deg CW
    UIImageOrientationUpMirrored,    // as above but image mirrored along
                                     // other axis; horizontal flip
    UIImageOrientationDownMirrored,  // horizontal flip
    UIImageOrientationLeftMirrored,  // vertical flip
    UIImageOrientationRightMirrored, // vertical flip
} UIImageOrientation;
UIImageOrientationUp is the default, so try rotating from there.
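Applied to the code in the question, that could look like this sketch: build the UIImage from the CGImage while reapplying the source image's scale and orientation.

CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
// Carry the original scale and orientation through, so UIKit displays it upright.
UIImage *newImage = [UIImage imageWithCGImage:cgImage
                                        scale:self.imgToProcess.scale
                                  orientation:self.imgToProcess.imageOrientation];
CGImageRelease(cgImage);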
I'm merging two images together using the CIFilter @"CIDarkenBlendMode". It works fine except for one thing: I want the images to be exactly aligned on top of each other regardless of the image size, but I am not able to achieve this. Do I have to create my own filter?
This is what I get:
This is what I want:
My merge code:
- (void)mergeImagesWithCIImage:(UIImage *)image
{
    CIImage *topImage = [[CIImage alloc] initWithImage:image];
    CIImage *scaledImage = [self scaleImageWithCIImage:topImage];
    CIImage *backgroundImage = [[CIImage alloc] initWithImage:self.vImage.image];
    CIFilter *darkenFilter = [CIFilter filterWithName:@"CIDarkenBlendMode"
                                        keysAndValues:kCIInputImageKey, scaledImage,
                                                      @"inputBackgroundImage", backgroundImage, nil];
    CIImage *filterOutputImage = darkenFilter.outputImage;
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef createdImage = [ctx createCGImage:filterOutputImage fromRect:filterOutputImage.extent];
    UIImage *outputImage = [UIImage imageWithCGImage:createdImage];
    CGImageRelease(createdImage);
    createdImage = nil;
    self.vImage.image = outputImage;
}
Instead of using a CIFilter I used:
[_image drawInRect:CGRectMake(centerX,centerY,_image.size.width,_image.size.height) blendMode:kCGBlendModeDarken alpha:0.8];
and centered the images.
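For context, a minimal sketch of that Core Graphics approach end to end (backgroundImage and topImage are placeholder names):

// Draw the background at full size, then the top image centered over it
// with a darken blend, and read the merged result back out.
UIGraphicsBeginImageContextWithOptions(backgroundImage.size, NO, backgroundImage.scale);
[backgroundImage drawInRect:CGRectMake(0, 0, backgroundImage.size.width, backgroundImage.size.height)];
CGFloat centerX = (backgroundImage.size.width - topImage.size.width) / 2.0;
CGFloat centerY = (backgroundImage.size.height - topImage.size.height) / 2.0;
[topImage drawInRect:CGRectMake(centerX, centerY, topImage.size.width, topImage.size.height)
           blendMode:kCGBlendModeDarken
               alpha:0.8];
UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();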
I have a black and white image, and I would like to change the black to blue and the white to yellow (for example) in Objective-C.
How could I do this?
Thanks
You can use Core Image to do this.
UIImage *bwImage = ... // Get your image from somewhere
CGImageRef bwCGImage = bwImage.CGImage;
CIImage *bwCIImage = [CIImage imageWithCGImage:bwCGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIHueAdjust"];
// Change the float value here to change the hue
[filter setValue:[NSNumber numberWithFloat:0.5] forKey:@"inputAngle"];
// Input black-and-white image
[filter setValue:bwCIImage forKey:kCIInputImageKey];
// Get output from filter
CIImage *hueImage = [filter valueForKey:kCIOutputImageKey];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:hueImage
                                   fromRect:[hueImage extent]];
UIImage *coloredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
See documentation for more info: Core Image Filter Reference
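One caveat: pure black and pure white carry no hue, so CIHueAdjust may leave them largely unchanged. If that happens, the CIFalseColor filter maps the image's dark and light ends to two explicit colors, which matches the black-to-blue, white-to-yellow example. A sketch:

// Map dark pixels to blue (inputColor0) and light pixels to yellow (inputColor1).
CIFilter *falseColor = [CIFilter filterWithName:@"CIFalseColor"];
[falseColor setValue:bwCIImage forKey:kCIInputImageKey];
[falseColor setValue:[CIColor colorWithRed:0 green:0 blue:1] forKey:@"inputColor0"];
[falseColor setValue:[CIColor colorWithRed:1 green:1 blue:0] forKey:@"inputColor1"];
CIImage *mappedImage = falseColor.outputImage;
// Render mappedImage through a CIContext exactly as above.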