Get hue value from a UIColor? - ios

I am using the following code to change the hue of a UIImage:
UIImage *image = [UIImage imageNamed:@"Image.png"];
// Create a Core Image version of the image.
CIImage *sourceCore = [CIImage imageWithCGImage:[image CGImage]];
// Apply a CIHueAdjust filter
CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
[hueAdjust setDefaults];
[hueAdjust setValue: sourceCore forKey: @"inputImage"];
[hueAdjust setValue: [NSNumber numberWithFloat: 1.0f] forKey: @"inputAngle"];
CIImage *resultCore = [hueAdjust valueForKey: @"outputImage"];
// Convert the filter output back into a UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef resultRef = [context createCGImage:resultCore fromRect:[resultCore extent]];
UIImage *result = [UIImage imageWithCGImage:resultRef];
CGImageRelease(resultRef);
The code works fine; I'm just trying to find a way to get the correct NSNumber value of inputAngle for a specific color.
So can I maybe:
get the value by converting a [UIColor colorWithRed:0.89 green:0.718 blue:0.102 alpha:1.0],
or just use the UIColor in some way,
or is there a list with the specific numbers for each color?
Thanks in advance

This post might answer your question:
Is there function to convert UIColor to Hue Saturation Brightness?
The angle you need corresponds to the hue value you get there, converted from UIKit's 0–1 range into radians.
Here you can find some information on how the angle is to be understood:
iOS: Values for CIFilter (Hue) from Photoshop
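For reference: getHue: reports hue as a fraction in [0, 1], while CIHueAdjust's inputAngle is an angle in radians, so the two are related like this (a minimal sketch):
CGFloat hue = 0.33f;                   // hue fraction from getHue:, e.g. a green
CGFloat inputAngle = hue * 2.0 * M_PI; // CIHueAdjust expects radians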
EDIT
Here is some example code based on yours:
First, let's define the color you want to filter (this is what determines your inputAngle):
UIColor *myColor = [UIColor redColor]; // The color you want to filter
Then we determine the hue value of that color (the basis for the actual inputAngle):
CGFloat hue;
CGFloat saturation;
CGFloat brightness;
CGFloat alpha;
[myColor getHue:&hue saturation:&saturation brightness:&brightness alpha:&alpha];
This is your code (unchanged)
UIImage *image = [UIImage imageNamed:@"Image.png"];
// Create a Core Image version of the image.
CIImage *sourceCore = [CIImage imageWithCGImage:[image CGImage]];
// Apply a CIHueAdjust filter
CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
[hueAdjust setDefaults];
[hueAdjust setValue: sourceCore forKey: @"inputImage"];
Here we apply the filter using the determined hue of the chosen color. Note that getHue: returns the hue as a fraction in 0–1, while inputAngle is in radians, so we scale it by 2π:
[hueAdjust setValue: [NSNumber numberWithFloat: hue * 2.0 * M_PI] forKey: @"inputAngle"];
This is your code (unchanged)
CIImage *resultCore = [hueAdjust valueForKey: @"outputImage"];
// Convert the filter output back into a UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef resultRef = [context createCGImage:resultCore fromRect:[resultCore extent]];
UIImage *result = [UIImage imageWithCGImage:resultRef];
CGImageRelease(resultRef);
Hope this fits your needs.
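Put together, the whole thing can live in one small helper; this is a sketch based on the code above (the method name is illustrative, not from the original post):
- (UIImage *)imageByRotatingHueOfImage:(UIImage *)image byHueOfColor:(UIColor *)color {
    CGFloat hue, saturation, brightness, alpha;
    [color getHue:&hue saturation:&saturation brightness:&brightness alpha:&alpha];
    CIImage *input = [CIImage imageWithCGImage:image.CGImage];
    CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
    [hueAdjust setDefaults];
    [hueAdjust setValue:input forKey:kCIInputImageKey];
    [hueAdjust setValue:@(hue * 2.0 * M_PI) forKey:@"inputAngle"]; // fraction -> radians
    CIImage *output = [hueAdjust valueForKey:kCIOutputImageKey];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}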

Related

Imageview not resetting to original after sharpness filter used

I am not able to get the original image back after applying the filter to the image view. I want to recover the original image after moving the slider from 1 back to 0 (max to min, i.e. in the reverse direction).
Below is the code for applying the sharpness effect:
- (IBAction)actionSharp:(UISlider *)sender {
demoImage = _affectView.image;
CIImage* image = [CIImage imageWithCGImage:demoImage.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
NSNumber *testNSNumber = [NSNumber numberWithFloat:_sharpActionWithSlider.value];
NSLog(@"ffghhn %f",_sharpActionWithSlider.value);
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CISharpenLuminance" keysAndValues: @"inputImage", image, nil];
[gaussianBlurFilter setDefaults];
[gaussianBlurFilter setValue:testNSNumber forKey:kCIInputSharpnessKey];
[gaussianBlurFilter setDefaults];
CIImage *result2 = [gaussianBlurFilter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result2 fromRect:[result2 extent]];
UIImage *sharp= [UIImage imageWithCGImage:cgImage];
UIImage *p = sharp;
self.affectView.image= p;
self.affectView.alpha = sender.value;
CGImageRelease(cgImage);
}
Just copy the image you are filtering and keep it aside. Apply the effects to that copy and show the result in your _affectView image view.
In your slider action, always take the original image and apply the effect to it, rather than to the image currently in the image view.
Create a global variable, mainImage:
UIImage *mainImage;
In your viewDidLoad, assign the original image to it:
mainImage = [UIImage imageNamed:@"yourMainImage"]; // assign from a local source or URL
In your IBAction, on every slider change, take the original image instead of fetching the image from the image view:
- (IBAction)actionSharp:(UISlider *)sender {
UIImage *demoImage = mainImage; // always start from the untouched original
CIImage* image = [CIImage imageWithCGImage:demoImage.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
NSNumber *testNSNumber = [NSNumber numberWithFloat:_sharpActionWithSlider.value];
NSLog(@"slider value %f", _sharpActionWithSlider.value);
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CISharpenLuminance" keysAndValues: @"inputImage", image, nil];
[gaussianBlurFilter setDefaults];
// Set the sharpness after setDefaults; a second setDefaults call here would reset it.
[gaussianBlurFilter setValue:testNSNumber forKey:kCIInputSharpnessKey];
CIImage *result2 = [gaussianBlurFilter valueForKey:kCIOutputImageKey];
CGImageRef cgImage = [context createCGImage:result2 fromRect:[result2 extent]];
UIImage *sharp = [UIImage imageWithCGImage:cgImage];
self.affectView.image = sharp;
self.affectView.alpha = sender.value;
CGImageRelease(cgImage);
}
You are getting your demo image from _affectView, applying a filter to it, and then saving it back into _affectView. The next time you read _affectView into a UIImage it will not be the original but the modified one, so it is impossible to get back to the original.
Just once, at the start (outside this method), save your original image, and start from that saved copy every time.
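A minimal sketch of that idea (originalImage is an assumed property holding the untouched image):
- (void)viewDidLoad {
    [super viewDidLoad];
    // Keep the untouched image around; never overwrite it.
    self.originalImage = self.affectView.image;
}

- (IBAction)actionSharp:(UISlider *)sender {
    // Always filter the saved original, not the already-filtered view image.
    CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];
    // ... apply the sharpen filter to input as above and update self.affectView ...
}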

Objective C: CIFilter returning nil image

I am writing a function that will apply a filter to an image and return the new image. I wrote the following code:
+ (UIImage*)applyFilter:(UIImage*) photo {
CIImage *image = [[CIImage alloc] initWithCGImage:photo.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIphotoEffectTransfer"
keysAndValues: kCIInputImageKey, image,
@"inputIntensity", @0.8, nil];
CIImage *outputImage = [filter outputImage];
UIImage* newPhoto = [self imageFromCIImage:outputImage];
return newPhoto;
}
The problem I am running into is that the function is returning a nil photo instead of one with a filter applied. Interestingly, if I change the filter name to @"CIVignetteEffect" it will work. I don't understand why one filter works but the other will not. I found both of the filters from the following link: https://developer.apple.com/library/tvos/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIPhotoEffectTransfer
I believe the correct name of the filter is CIPhotoEffectTransfer, not CIphotoEffectTransfer; filter names are case-sensitive, and an unknown name makes filterWithName: return nil.
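Because of that, a defensive check makes this kind of typo easy to catch; a sketch, not part of the original answer:
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTransfer"];
if (filter == nil) {
    // Wrong name, or the filter is unavailable on this OS version;
    // dump the built-in filter names to compare against.
    NSLog(@"%@", [CIFilter filterNamesInCategory:kCICategoryBuiltIn]);
}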
Try this code... I have used it for photo filtering in the past, so I know it works:
+ (UIImage*)applyFilter:(UIImage*) photo {
UIImageOrientation orientation = photo.imageOrientation;
CIImage* image = [CIImage imageWithCGImage:photo.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectTransfer"];
[filter setValue:image forKey:kCIInputImageKey];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newPhoto = [UIImage imageWithCGImage:cgimg scale:1.0 orientation:orientation];
CGImageRelease(cgimg);
context = nil;
return newPhoto;
}
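Usage is then a one-liner (PhotoFilter is a placeholder for whatever class the method lives in):
UIImage *filtered = [PhotoFilter applyFilter:originalPhoto];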
Try this. We have implemented CIFilter effects; here is the CIVignette effect:
//CIVignette Effect
CIContext *imageContext = [CIContext contextWithOptions:nil];
CIImage *image = [[CIImage alloc] initWithImage:inputimage];
CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
[vignette setDefaults];
[vignette setValue: image forKey: @"inputImage"];
[vignette setValue: [NSNumber numberWithFloat: 1.0] forKey: @"inputIntensity"];
[vignette setValue: [NSNumber numberWithFloat: 10.00 ] forKey: @"inputRadius"];
CIImage *result = [vignette valueForKey: @"outputImage"];
CGImageRef cgImageRef = [imageContext createCGImage:result fromRect:[result extent]];
UIImage *targetImage = [UIImage imageWithCGImage:cgImageRef];
CGImageRelease(cgImageRef); // release the CGImage created above to avoid a leak
For a detailed implementation of multiple effects, you can refer to the GitHub project ImageFilter.
Hope this answer helps someone.

CIFilter iOS "Program exceeds GLES uniform limits."

For some reason, attempting to apply the CIComicEffect is giving me the GLES error I put in the title. Here is my code:
CGImageRef imageRef = imageToPass.CGImage;
CIContext *context = [CIContext contextWithOptions:nil]; // 1
CIImage *image = [CIImage imageWithCGImage:imageRef]; // 2
CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect"]; // 3
[filter setValue:image forKey:kCIInputImageKey];
//[filter setValue:@0.8f forKey:kCIInputIntensityKey];
CIImage *result = [filter valueForKey:kCIOutputImageKey]; // 4
CGRect extent = [result extent];
CGImageRef cgImage = [context createCGImage:result fromRect:extent]; // 5
UIImageOrientation originalOrientation = imageToPass.imageOrientation;
CGFloat originalScale = imageToPass.scale;
imageToPass = [UIImage imageWithCGImage:cgImage scale:originalScale orientation:originalOrientation];
I have no idea what is going wrong, and a Google search turns up practically nothing. Here are the relevant Apple docs:
https://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIComicEffect
https://developer.apple.com/library/ios/documentation/GraphicsImaging/Reference/QuartzCoreFramework/Classes/CIFilter_Class/index.html#//apple_ref/occ/cl/CIFilter
I can't see where it happens in your code, but most likely you are executing that code repeatedly, so the filter gets applied over and over: imageToPass is both the result and the input for the next iteration.
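The usual fix is to keep the pristine image around and always feed that to the filter; a minimal sketch (originalImage is an assumed property):
// Always start from the untouched original, not from imageToPass.
CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIComicEffect"];
[filter setValue:input forKey:kCIInputImageKey];
CIImage *result = [filter valueForKey:kCIOutputImageKey];
// Render result into imageToPass for display, as before.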

Get black & white image from UIImage in iPhone SDK?

I want to convert an image to pure black & white. I tried, and got the result shown on the left of the attached comparison; according to the requirements of the application, the result should look like the image on the right.
I have used lots of CIFilters (CIColorMonochrome, CIColorControls, CIPhotoEffectTonal, etc.), but none of them is working.
Please check the code below and the attached result images.
- (UIImage *)imageBlackAndWhite
{
CIImage *beginImage = [CIImage imageWithData: UIImageJPEGRepresentation(self.captureImage.image, .1)];
//CIImage *beginImage = [CIImage imageWithCGImage:self.CGImage];
CIImage *output = [CIFilter filterWithName:@"CIColorMonochrome" keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", [NSNumber numberWithFloat:1.0], @"inputColor", [[CIColor alloc] initWithColor:[UIColor whiteColor]], nil].outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgiimage = [context createCGImage:output fromRect:output.extent];
UIImage *newImage = [UIImage imageWithCGImage:cgiimage];
CGImageRelease(cgiimage);
return newImage;
}
[Image: my result (left) vs. expected result (right)]
You can use the blend mode kCGBlendModeLuminosity with an alpha value of 1.0.
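A sketch of that approach: fill the context with a flat, zero-saturation color, then draw the image on top with the luminosity blend mode so only its brightness survives:
UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
CGRect rect = (CGRect){CGPointZero, image.size};
[[UIColor whiteColor] setFill]; // the destination supplies hue/saturation (none)
UIRectFill(rect);
[image drawInRect:rect blendMode:kCGBlendModeLuminosity alpha:1.0];
UIImage *blackAndWhite = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();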

UIImageJPEGRepresentation with new alpha color (background color)

I have a partially-transparent UIImage that I would like to convert to a JPEG.
NSData * output = UIImageJPEGRepresentation(myUIImage,.90);
The JPEG always has a white background. I would like it to be black. How can I do that?
Performance is a concern. The image has just been rendered in Core Image, where it would also be possible to set a background color.
CIFilter *filter = [CIFilter filterWithName:@"CIPixellate"];
[filter setDefaults];
[filter setValue:[CIImage imageWithCGImage:editImage.CGImage] forKey:kCIInputImageKey];
[filter setValue:@(amount) forKey:@"inputScale"];
[filter setValue:vector forKey:@"inputCenter"];
CIImage* result = [filter valueForKey:kCIOutputImageKey];
Currently I immediately re-render 'result' into a new UIGraphics image context:
CGContextSetFillColorWithColor(ref, backgroundFillColor.CGColor);
CGRect drawRect = (CGRect){{0,0},editImage.size};
CGContextFillRect(ref, drawRect);
CGContextDrawImage(ref, drawRect, cgImage);
UIImage* filledImage = UIGraphicsGetImageFromCurrentImageContext();
but that adds up to 82% more execution time compared with skipping the step and accepting a white JPEG background.
I'd so appreciate help on this. Thank you.
Update: I tried the following with CISourceOverCompositing, which increased runtime by 198% in some cases:
CIFilter * constantColorFilter = [CIFilter filterWithName:@"CIConstantColorGenerator"];
[constantColorFilter setValue:[CIColor colorWithCGColor:[backgroundFillColor CGColor]] forKey:kCIInputColorKey];
CIFilter * composeFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
CIImage * bgColorResult = [constantColorFilter valueForKey:kCIOutputImageKey];
[composeFilter setValue:bgColorResult forKey:kCIInputBackgroundImageKey];
[composeFilter setValue:pixelateResult forKey:kCIInputImageKey];
result = [composeFilter valueForKey:kCIOutputImageKey];
I tried using singleton CIFilters to avoid re-creating CIFilter objects, but it had a trivial impact on performance.
I still get sub-optimal performance (an 80% runtime increase), but this is what I use now:
CIFilter *pixelateFilter = [CIFilter filterWithName:@"CIPixellate"];
[pixelateFilter setDefaults];
[pixelateFilter setValue:[CIImage imageWithCGImage:editImage.CGImage] forKey:kCIInputImageKey];
[pixelateFilter setValue:@(amount) forKey:@"inputScale"];
[pixelateFilter setValue:vector forKey:@"inputCenter"];
CIImage* result = [pixelateFilter valueForKey:kCIOutputImageKey];
CIContext *context = [CIContext contextWithOptions:nil];
CGRect extent = [result extent]; // was [pixelateResult extent]; result is the variable defined above
CGImageRef cgImage = [context createCGImage:result fromRect:extent];
UIGraphicsBeginImageContextWithOptions(editImage.size, YES, [editImage scale]);
CGContextRef ref = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ref, 0, editImage.size.height);
CGContextScaleCTM(ref, 1.0, -1.0);
CGContextSetFillColorWithColor(ref, backgroundFillColor.CGColor);
CGRect drawRect = (CGRect){{0,0},editImage.size};
CGContextFillRect(ref, drawRect);
CGContextDrawImage(ref, drawRect, cgImage);
UIImage* filledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
returnImage = filledImage;
CGImageRelease(cgImage);
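For the JPEG step itself, the same fill-then-draw pattern can be wrapped into a small helper; a sketch under the assumption that a plain UIKit round trip is acceptable (the function name is illustrative):
NSData *JPEGDataWithBackground(UIImage *image, UIColor *background, CGFloat quality) {
    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
    CGRect rect = (CGRect){CGPointZero, image.size};
    [background setFill];
    UIRectFill(rect);        // paint the background first
    [image drawInRect:rect]; // drawInRect: handles the coordinate flip
    UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return UIImageJPEGRepresentation(flattened, quality);
}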
