I have a UIImage inside a UIImageView to which a Gaussian blur filter with a radius of 50 is currently applied. As per a new requirement, I need to set its initial Gaussian blur to 3px and then increase it gradually from 3px to 10px as the user scrolls up the view. Could anybody please help me understand how this can be done?
This is the code I'm currently using to blur the image with a radius of 50.
- (UIImage *)blurWithCoreImage:(UIImage *)sourceImage blurValue:(int)valBlur
{
CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];
// Apply Affine-Clamp filter to stretch the image so that it does not
// look shrunken when gaussian blur is applied
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:@"inputImage"];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
// Pass the radius as an NSNumber; CIGaussianBlur expects a number, not a string.
[gaussianBlurFilter setValue:@(valBlur) forKey:@"inputRadius"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];
// Set up output context.
UIGraphicsBeginImageContext(self.view.frame.size);
CGContextRef outputContext = UIGraphicsGetCurrentContext();
// Invert image coordinates
CGContextScaleCTM(outputContext, 1.0, -1.0);
CGContextTranslateCTM(outputContext, 0, -self.view.frame.size.height);
// Draw base image.
CGContextDrawImage(outputContext, self.view.frame, cgImage);
// Apply white tint
CGContextSaveGState(outputContext);
CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
CGContextFillRect(outputContext, self.view.frame);
CGContextRestoreGState(outputContext);
// Output image is ready.
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return outputImage;
}
-(void)scrollViewDidScroll:(UIScrollView *)scrollView {
/* This is the offset at the bottom of the scroll view. */
CGFloat totalScroll = scrollView.contentSize.height - scrollView.bounds.size.height;
/* This is the current offset. */
CGFloat offset = - scrollView.contentOffset.y;
/* This is the percentage of the current offset / bottom offset. */
CGFloat percentage = offset / totalScroll;
/* When percentage = 0, the Blur should be 3 so we should flip the percentage. */
imageview.blurLevel = (3.0f + percentage);
}
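One way to do the 3px-to-10px mapping (a minimal sketch, assuming the blurWithCoreImage:blurValue: method above, an imageview outlet, and a stored unblurred originalImage property; those names are placeholders) would be:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
CGFloat totalScroll = scrollView.contentSize.height - scrollView.bounds.size.height;
if (totalScroll <= 0) {
return; // nothing to scroll, keep the initial 3px blur
}
// Clamp progress to [0, 1] so over-scroll cannot push the radius outside 3-10.
CGFloat percentage = MAX(0.0f, MIN(1.0f, scrollView.contentOffset.y / totalScroll));
// Interpolate the radius: 3px at the top, 10px when fully scrolled up.
int radius = (int)roundf(3.0f + (percentage * 7.0f));
self.imageview.image = [self blurWithCoreImage:self.originalImage blurValue:radius];
}
Re-blurring a full-size image with Core Image on every scroll callback can be slow, so caching a few pre-blurred images or cross-fading between a sharp and a fully blurred layer might be needed in practice.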
Hello, I'd like to create the following black-and-white Photoshop effect on a UIImage:
https://drive.google.com/file/d/0B5dHxpdDwpPec3dPTWdLVnNhZFk/view?usp=sharing
It's an effect in which you can change the brightness of each of the six colors (reds, yellows, greens, cyans, blues, magentas).
I used this to make the image black and white, but it doesn't allow me to change the specific colors:
self.imageView.image = chosenImage;
CIImage *beginImage = [CIImage imageWithCGImage:chosenImage.CGImage];
CIImage *blackAndWhite = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, beginImage, @"inputBrightness", [NSNumber numberWithFloat:0.0], @"inputContrast", [NSNumber numberWithFloat:1.1], @"inputSaturation", [NSNumber numberWithFloat:0.0], nil].outputImage;
CIImage *output = [CIFilter filterWithName:@"CIExposureAdjust" keysAndValues:kCIInputImageKey, blackAndWhite, @"inputEV", [NSNumber numberWithFloat:0.7], nil].outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
self.imageView.image = newImage;
Thank you for your time.
I think you can accomplish that effect with the following function:
- (UIImage *)grayScaleImageWith:(UIImage *)image blackPoint:(const CGFloat *)blackPoint whitePoint:(const CGFloat *)whitePoint andGamma:(CGFloat)gamma {
// Create image rectangle with current image width/height
CGRect imageRect = CGRectMake(0, 0, image.size.width, image.size.height);
// Grayscale color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateCalibratedGray(whitePoint, blackPoint, gamma);
// Create bitmap content with current image size and grayscale colorspace
CGContextRef context = CGBitmapContextCreate(nil, image.size.width, image.size.height, 8, 0, colorSpace, kCGImageAlphaNone);
// Draw image into current context, with specified rectangle
// using previously defined context (with grayscale colorspace)
CGContextDrawImage(context, imageRect, [image CGImage]);
// Create bitmap image info from pixel data in current context
CGImageRef imageRef = CGBitmapContextCreateImage(context);
// Create a new UIImage object
UIImage *newImage = [UIImage imageWithCGImage:imageRef];
// Release colorspace, context and bitmap information
CGColorSpaceRelease(colorSpace);
CGContextRelease(context);
CFRelease(imageRef);
// Return the new grayscale image
return newImage;
}
Then call it, filling the black and white point arrays with the values selected in the UI:
CGFloat black[3] = { 0, 0, 0 }; // replace content with values from interface
CGFloat white[3] = { 100, 100, 100 }; // replace content with values from interface
UIImage *grayImage = [self grayScaleImageWith:image blackPoint:black whitePoint:white andGamma:1.8f];
I have not tested this code yet but I hope at least it points you in the right direction.
I'm trying to add a blur effect using category.
+ (UIImage *)blurImageWithImage:(UIImage*) imageName withView:(UIView*)view {
UIImage *sourceImage = imageName;
CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];
// Apply Affine-Clamp filter to stretch the image so that it does not
// look shrunken when gaussian blur is applied
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:@"inputImage"];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
// Apply gaussian blur filter with radius of 10
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
[gaussianBlurFilter setValue:@10 forKey:@"inputRadius"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];
// Set up output context.
UIGraphicsBeginImageContext(view.frame.size);
CGContextRef outputContext = UIGraphicsGetCurrentContext();
// Invert image coordinates
CGContextScaleCTM(outputContext, 1.0, -1.0);
CGContextTranslateCTM(outputContext, 0, view.frame.size.height);
// Draw base image.
CGContextDrawImage(outputContext, view.frame, cgImage);
// Apply white tint
CGContextSaveGState(outputContext);
CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
CGContextFillRect(outputContext, view.frame);
CGContextRestoreGState(outputContext);
// Output image is ready.
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return outputImage;
}
then I call this function inside a UIView like this:
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImageView *page = [[UIImageView alloc] initWithImage:[UIImage blurImageWithImage:image withView:self]];
If I add this function directly in the class, it works, but not if I put it in a UIImage category.
I faced the same problem earlier, but thankfully I found a solution.
Please follow these steps, and make sure your blur image function works fine.
1) In the category, add an instance method instead of a class method.
e.g.
- (UIImage *)blurImageWithImage:(UIImage*) imageName withView:(UIView*)view
2) Import the category in your view controller.
3) Use the category, e.g.
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImageView *page = [[UIImageView alloc] initWithImage:[image blurImageWithImage:image withView:self]];
Let me know if this solution works for you.
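For reference, a rough sketch of what that category could look like (UIImage+Blur is just a placeholder file and category name; the method body is the same blur code shown in the question):
// UIImage+Blur.h (hypothetical file and category name)
#import <UIKit/UIKit.h>
@interface UIImage (Blur)
- (UIImage *)blurImageWithImage:(UIImage *)imageName withView:(UIView *)view;
@end
// UIImage+Blur.m would contain @implementation UIImage (Blur) with the
// blur method body from the question, declared as an instance method.
// In the view controller, after #import "UIImage+Blur.h":
UIImage *image = [UIImage imageNamed:@"xxx"];
UIImageView *page = [[UIImageView alloc] initWithImage:[image blurImageWithImage:image withView:self]];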
Turns out the problem was that I forgot to add a "-" when doing the context translate.
So what I ended up doing is creating a class method.
Interface:
+ (UIImage *)blurImageWithImageName:(NSString*) imageName withView:(UIView*)view;
Implementation :
+ (UIImage *)blurImageWithImageName:(NSString*) imageName withView:(UIView*)view {
UIImage *sourceImage = [UIImage imageNamed:imageName];
CIImage *inputImage = [CIImage imageWithCGImage:sourceImage.CGImage];
// Apply Affine-Clamp filter to stretch the image so that it does not
// look shrunken when gaussian blur is applied
CGAffineTransform transform = CGAffineTransformIdentity;
CIFilter *clampFilter = [CIFilter filterWithName:@"CIAffineClamp"];
[clampFilter setValue:inputImage forKey:@"inputImage"];
[clampFilter setValue:[NSValue valueWithBytes:&transform objCType:@encode(CGAffineTransform)] forKey:@"inputTransform"];
// Apply gaussian blur filter with radius of 10
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:clampFilter.outputImage forKey:@"inputImage"];
[gaussianBlurFilter setValue:@10 forKey:@"inputRadius"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:gaussianBlurFilter.outputImage fromRect:[inputImage extent]];
// Set up output context.
UIGraphicsBeginImageContext(view.frame.size);
CGContextRef outputContext = UIGraphicsGetCurrentContext();
// Invert image coordinates
CGContextScaleCTM(outputContext, 1.0, -1.0);
CGContextTranslateCTM(outputContext, 0, -view.frame.size.height);
// Draw base image.
CGContextDrawImage(outputContext, view.frame, cgImage);
// Apply white tint
CGContextSaveGState(outputContext);
CGContextSetFillColorWithColor(outputContext, [UIColor colorWithWhite:1 alpha:0.2].CGColor);
CGContextFillRect(outputContext, view.frame);
CGContextRestoreGState(outputContext);
// Output image is ready.
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return outputImage;
}
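Usage from the view then looks something like this (a minimal sketch; the image name is a placeholder):
UIImageView *page = [[UIImageView alloc] initWithImage:[UIImage blurImageWithImageName:@"xxx" withView:self]];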
I've been struggling for a few days with a project involving UIImage colorization.
The idea is that the app will ship with a set of images that I will have to colorize with values retrieved from a web service; some sort of themes, if you wish.
The designer I work with gave me a background image along with all of his Photoshop values.
The first problem is that Photoshop uses HSL and iOS uses HSB. So the first challenge was to translate the values from Photoshop.
Photoshop HSL: -28 (range -180 => +180), 100 (range -100 => +100), 25 (range -100 => +100).
Luckily I found some code online, here it is.
//adapted from https://gist.github.com/peteroupc/4085710
- (void)convertLightnessToBrightness:(CGFloat)lightness withSaturation:(CGFloat)saturation completion:(void (^)(CGFloat, CGFloat))completion
{
if (!completion)
return; //What's the point of calling this method without a completion block!
CGFloat brightness = 0.0f;
CGFloat saturationOut = 0.0f;
if (lightness > 0.0f)
{
CGFloat lumScale = (1.0f - MAX((lightness - 0.5f), 0.0f) * 2.0f);
lumScale = ((lumScale == 0) ? 0 : (1.0f / lumScale));
CGFloat lumStart = MAX(0.0f, (lumScale - 0.5f));
CGFloat lumDiv = (lumScale - lumStart);
lumDiv = (lumStart + (saturation * lumDiv));
saturationOut = ((lumDiv == 0) ? 0.0f : (saturation / lumDiv));
brightness = (lightness + (1.0f - lightness) * saturation);
}
NSLog(@"saturation: %0.2f - brightness: %0.2f", saturationOut, brightness);
completion(saturationOut, brightness);
}
Using an online converter, I verified that this method returns the correct values.
I needed to change the ranges (H: 0->360, S: 0->100, L: 0->100)
So HSL 152, 100, 62 gives HSB 152, 76, 100. And the method returns 75 for saturation and 100 for brightness so we are good.
Next I needed to apply those values to the image, so here is the code to change...
The HUE:
#define DEGREES_TO_RADIANS(angle) ((angle) / 180.0f * M_PI)
- (void)colorize:(UIImage *)input hue:(CGFloat)hueDegrees completion:(void(^)(UIImage *outputHue))completion
{
if (!completion)
return; //What's the point of calling this method without a completion block!
CGFloat hue = DEGREES_TO_RADIANS(hueDegrees);
NSLog(@"degrees: %0.2f | radian: %0.2f", hueDegrees, hue);
CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
//---
CIFilter *hueFilter = [CIFilter filterWithName:@"CIHueAdjust" keysAndValues:kCIInputImageKey, inputImage, nil];
[hueFilter setDefaults];
[hueFilter setValue:[NSNumber numberWithFloat:hue] forKey:kCIInputAngleKey];
//---
CIImage *outputImage = [hueFilter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
completion(outputUIImage);
}
The SATURATION:
- (void)colorize:(UIImage *)input saturation:(CGFloat)saturation completion:(void(^)(UIImage *outputSaturation))completion
{
if (!completion)
return; //What's the point of calling this method without a completion block!
NSLog(@"saturation: %0.2f", saturation);
CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
//---
CIFilter *saturationFilter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, inputImage, nil];
[saturationFilter setDefaults];
[saturationFilter setValue:[NSNumber numberWithFloat:saturation] forKey:@"inputSaturation"];
//---
CIImage *outputImage = [saturationFilter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
completion(outputUIImage);
}
The BRIGHTNESS:
- (void)colorize:(UIImage *)input brightness:(CGFloat)brightness completion:(void(^)(UIImage *outputBrightness))completion
{
if (!completion)
return; //What's the point of calling this method without a completion block!
NSLog(@"brightness: %0.2f", brightness);
CIImage *inputImage = [CIImage imageWithCGImage:input.CGImage];
//---
CIFilter *brightnessFilter = [CIFilter filterWithName:@"CIColorControls" keysAndValues:kCIInputImageKey, inputImage, nil];
[brightnessFilter setDefaults];
[brightnessFilter setValue:[NSNumber numberWithFloat:brightness] forKey:@"inputBrightness"];
//---
CIImage *outputImage = [brightnessFilter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *outputUIImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
completion(outputUIImage);
}
And everything put together:
CGFloat hue = -28.0f; //152 in 360° range (180 - 28 = 152)
CGFloat saturation = 1.0f; //((100 + 100.0f) / 200.0f)
CGFloat lightness = 0.625f; //((25 + 100.0f) / 200.0f)
[self convertLightnessToBrightness:lightness withSaturation:saturation completion:^(CGFloat saturationOut, CGFloat brightness) {
//saturationOut = 0.75f and brightness = 1.0f
[self colorize:input hue:hue completion:^(UIImage *outputHue) {
[self colorize:outputHue saturation:saturationOut completion:^(UIImage *outputSaturation) {
[self colorize:outputSaturation brightness:brightness completion:completion];
}];
}];
}];
The last completion block simply applies the output image to the image view.
Now here are the results:
Base image
Colorize (hue only)
Colorize (hue and saturation)
Colorize (hue, saturation and brightness)
Expected result
As you can see, the final image is completely white (brightness is 100%).
I'm completely lost here. I've tried many combinations (applying H, S and B in every order) and I've tried other libs such as iOS-Image-Filters, without any success. I've also read a lot of questions here on Stack Overflow.
Links:
Core Image Filter Reference
CIHueAdjust core image filter setup
How to programmatically change the hue of UIImage?
iOS: Values for CIFilter (Hue) from Photoshop
Has anyone succeeded in applying HSL/HSB values to UIImages?
Well finally, we decided to change techniques.
I'm now using a semi-transparent image to which I apply a blend mode with the desired color.
Following this post I made a category on UIImage.
- (UIImage *)tintedBackgroundImageWithColor:(UIColor *)tintColor
{
UIGraphicsBeginImageContextWithOptions(self.size, NO, 0.0f);
[tintColor setFill];
CGRect bounds = CGRectMake(0, 0, self.size.width, self.size.height);
UIRectFill(bounds);
[self drawInRect:bounds blendMode:kCGBlendModeSourceAtop alpha:1.0f];
UIImage *tintedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return tintedImage;
}
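Usage then looks something like this (a minimal sketch; the image name and color values are placeholders standing in for the web-service values):
UIImage *base = [UIImage imageNamed:@"themeBackground"]; // semi-transparent template image (placeholder name)
self.imageView.image = [base tintedBackgroundImageWithColor:[UIColor colorWithRed:0.2f green:0.7f blue:0.5f alpha:1.0f]];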
I have an image, and I want to change its color programmatically.
UPDATE:
Use this method...
-(UIImage *)imageNamed:(NSString *)name withColor:(UIColor *)color {
// load the image
UIImage *img = [UIImage imageNamed:name];
// begin a new image context, to draw our colored image onto
UIGraphicsBeginImageContext(img.size);
// get a reference to that context we created
CGContextRef context = UIGraphicsGetCurrentContext();
// set the fill color
[color setFill];
// translate/flip the graphics context (for transforming from CG* coords to UI* coords
CGContextTranslateCTM(context, 0, img.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
// set the blend mode to color burn, and the original image
CGContextSetBlendMode(context, kCGBlendModeColorBurn);
CGRect rect = CGRectMake(0, 0, img.size.width, img.size.height);
CGContextDrawImage(context, rect, img.CGImage);
// set a mask that matches the shape of the image, then draw (color burn) a colored rectangle
CGContextClipToMask(context, rect, img.CGImage);
CGContextAddRect(context, rect);
CGContextDrawPath(context,kCGPathFill);
// generate a new UIImage from the graphics context we drew onto
UIImage *coloredImg = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//return the color-burned image
return coloredImg;
}
Use it like below...
yourImageView.image = [self imageNamed:@"yourImageName" withColor:[UIColor orangeColor]];
Here is the Swift version:
extension UIImage {
func colorizeWith(color: UIColor) -> UIImage {
UIGraphicsBeginImageContext(self.size)
let context = UIGraphicsGetCurrentContext()
color.setFill()
CGContextTranslateCTM(context, 0, self.size.height)
CGContextScaleCTM(context, 1.0, -1.0)
// draw the original image with a normal blend mode (the Objective-C version above uses color burn)
CGContextSetBlendMode(context, CGBlendMode.Normal);
let rect = CGRectMake(0, 0, self.size.width, self.size.height);
CGContextDrawImage(context, rect, self.CGImage);
// set a mask that matches the shape of the image, then draw (color burn) a colored rectangle
CGContextClipToMask(context, rect, self.CGImage);
CGContextAddRect(context, rect);
CGContextDrawPath(context, CGPathDrawingMode.Fill);
// generate a new UIImage from the graphics context we drew onto
let coloredImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//return the color-burned image
return coloredImage;
}
}
You can try Ankish Jain's answer; it works for me.
theImageView.image = [theImageView.image imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate];
[theImageView setTintColor:[UIColor redColor]];
Core Image color filters work great for this kind of task; I find them slightly more straightforward than the Core Graphics (CG...) classes. They work by letting you adjust the RGB and alpha characteristics of the image. I have been using them to change the white background of a QR code to a color; the RGBA of white is (1, 1, 1, 1), and in your case I believe you have to reverse the colour. Just check Apple's Core Image documentation; there are a few dozen filters available, and CIColorMatrix is just one of them.
CIImage *beginImage = [CIImage imageWithCGImage:image.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filtercd = [CIFilter filterWithName:@"CIColorMatrix" //rgreen
keysAndValues: kCIInputImageKey, beginImage, nil];
[filtercd setValue:[CIVector vectorWithX:0 Y:1 Z:1 W:0] forKey:#"inputRVector"]; // 5
[filtercd setValue:[CIVector vectorWithX:1 Y:0 Z:1 W:0] forKey:#"inputGVector"]; // 6
[filtercd setValue:[CIVector vectorWithX:1 Y:1 Z:0 W:0] forKey:#"inputBVector"]; // 7
[filtercd setValue:[CIVector vectorWithX:0 Y:0 Z:0 W:1] forKey:#"inputAVector"]; // 8
[filtercd setValue:[CIVector vectorWithX:1 Y:1 Z:0 W:0] forKey:#"inputBiasVector"];
CIImage *doutputImage = [filtercd outputImage];
CGImageRef cgimgd = [context createCGImage:doutputImage fromRect:[doutputImage extent]];
UIImage *newImgd = [UIImage imageWithCGImage:cgimgd];
filterd.image = newImgd;
CGImageRelease(cgimgd);
As I've answered here (iPhone - How do you color an image?), in my opinion the best way to colorize an image from iOS 7 onwards is by using
myImageView.image = [[UIImage imageNamed:@"myImage"] imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate];
and then change the tintColor of the imageView or whatever contains the image.
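For example (a minimal sketch, assuming an image view named myImageView):
myImageView.image = [[UIImage imageNamed:@"myImage"] imageWithRenderingMode:UIImageRenderingModeAlwaysTemplate];
myImageView.tintColor = [UIColor redColor]; // the template image is redrawn in this color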
In my project I have to take a screenshot of the screen and apply a blur to it to create a frosted-glass effect. Content can be moved under the glass, and then the bluredImageWithRect: method is called. I'm trying to optimize the following method to speed up the application. The main cost comes from applying the blur filter to the screenshot, so I'm looking for a way to take the screenshot at a lower resolution, apply the blur to that, and then stretch it to fit some rect.
- (CIImage *)bluredImageWithRect:(CGRect)rect {
CGSize smallSize = CGSizeMake(rect.size.width, rect.size.height);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(nil, smallSize.width, smallSize.height, 8, 0, colorSpaceRef, kCGImageAlphaPremultipliedFirst);
CGContextClearRect(ctx, rect);
CGColorSpaceRelease(colorSpaceRef);
CGContextSetInterpolationQuality(ctx, kCGInterpolationNone);
CGContextSetShouldAntialias(ctx, NO);
CGContextSetAllowsAntialiasing(ctx, NO);
CGContextTranslateCTM(ctx, 0.0, self.view.frame.size.height);
CGContextScaleCTM(ctx, 1, -1);
CGImageRef maskImage = [UIImage imageNamed:@"mask.png"].CGImage;
CGContextClipToMask(ctx, rect, maskImage);
[self.view.layer renderInContext:ctx];
CGImageRef imageRef1 = CGBitmapContextCreateImage(ctx);
CGContextRelease(ctx);
NSDictionary *options = @{(id)kCIImageColorSpace : (id)kCFNull};
CIImage *beforeFilterImage = [CIImage imageWithCGImage:imageRef1 options:options];
CGImageRelease(imageRef1);
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, beforeFilterImage, @"inputRadius", [NSNumber numberWithFloat:3.0f], nil];
CIImage *afterFilterImage = blurFilter.outputImage;
CIImage *croppedImage = [afterFilterImage imageByCroppingToRect:CGRectMake(0, 0, smallSize.width, smallSize.height)];
return croppedImage;
}
Here is a tutorial, iOS image processing with the Accelerate framework, that shows how to do a blur effect that may be fast enough for what you need.
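Another option, sticking with Core Image, is to render the layer into a bitmap that is only a fraction of the target size, blur that, and stretch the result back up; the blur hides the resolution loss. This is only a rough, untested sketch: the scale: parameter, variable names, and the assumption that rect covers the whole view are mine, and the mask clipping from the original is omitted.
- (CIImage *)bluredImageWithRect:(CGRect)rect scale:(CGFloat)scale {
// e.g. scale = 0.25 renders at quarter resolution (placeholder value, tune as needed)
CGSize smallSize = CGSizeMake(rect.size.width * scale, rect.size.height * scale);
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(nil, smallSize.width, smallSize.height, 8, 0, colorSpaceRef, kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpaceRef);
// Flip and shrink the coordinate system so the whole view fits into the small bitmap.
CGContextTranslateCTM(ctx, 0.0, smallSize.height);
CGContextScaleCTM(ctx, scale, -scale);
[self.view.layer renderInContext:ctx];
CGImageRef smallImageRef = CGBitmapContextCreateImage(ctx);
CGContextRelease(ctx);
// Blur the small screenshot; the radius is scaled down so the blur looks the same after upscaling.
CIImage *smallImage = [CIImage imageWithCGImage:smallImageRef options:@{(id)kCIImageColorSpace : (id)kCFNull}];
CGImageRelease(smallImageRef);
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, smallImage, @"inputRadius", @(3.0f * scale), nil];
CIImage *blurred = [blurFilter.outputImage imageByCroppingToRect:CGRectMake(0, 0, smallSize.width, smallSize.height)];
// Stretch the blurred result back up to the requested rect.
return [blurred imageByApplyingTransform:CGAffineTransformMakeScale(1.0f / scale, 1.0f / scale)];
}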