Change colors in an image (Objective-C, iOS)

I have a black and white image, and I would like to change the black to blue and the white to yellow (for example) in Objective-C.
How can I do this?
Thanks

You can use Core Image to do this.
UIImage *bwImage = ...; // get your image from somewhere
CGImageRef bwCGImage = bwImage.CGImage;
CIImage *bwCIImage = [CIImage imageWithCGImage:bwCGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIHueAdjust"];
// Change the float value here to change the hue
[filter setValue:[NSNumber numberWithFloat:0.5f] forKey:@"inputAngle"];
// Input black and white image
[filter setValue:bwCIImage forKey:kCIInputImageKey];
// Get the output from the filter
CIImage *hueImage = [filter valueForKey:kCIOutputImageKey];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:hueImage
                                   fromRect:[hueImage extent]];
UIImage *coloredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
See the Core Image Filter Reference documentation for more info.
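Note that CIHueAdjust rotates existing hues, so it has no visible effect on pure black and white pixels, which carry no hue. To map black and white directly to two chosen colors, CIFalseColor may be a better fit; a minimal sketch (assuming a UIImage named bwImage, as above):

```objc
// Map black -> blue and white -> yellow with CIFalseColor.
CIImage *input = [CIImage imageWithCGImage:bwImage.CGImage];
CIFilter *falseColor = [CIFilter filterWithName:@"CIFalseColor"];
[falseColor setValue:input forKey:kCIInputImageKey];
// inputColor0 replaces black, inputColor1 replaces white.
[falseColor setValue:[CIColor colorWithRed:0 green:0 blue:1] forKey:@"inputColor0"];
[falseColor setValue:[CIColor colorWithRed:1 green:1 blue:0] forKey:@"inputColor1"];
CIImage *mapped = falseColor.outputImage;
```

The result can then be rendered through a CIContext exactly as in the code above.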

Related

Compositing CIFilter with SKScene

I'm trying to use a color dodge blend mode (CIFilter) on an image and composite it with my whole scene (which is an SKScene node). Unfortunately, CIColorDodgeBlendMode only accepts a CIImage as the background input. Is there possibly a workaround?
Basically, I want the same result as in Photoshop: two layers, with the upper layer set to the color dodge blending mode.
Here is my code ('self' is the SKScene node):
UIImage *inputUIImage = [UIImage imageNamed:@"inputImage.png"];
CIImage *inputCIImage = [[CIImage alloc] initWithImage:inputUIImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorDodgeBlendMode"];
[filter setValue:inputCIImage forKey:@"inputImage"];
[filter setValue:self forKey:@"inputBackgroundImage"]; // this is the problem: self is an SKScene, not a CIImage
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:outputImage fromRect:[outputImage extent]];
SKTexture *outputTexture = [SKTexture textureWithCGImage:cg];
SKSpriteNode * outputSprite = [[SKSpriteNode alloc]initWithTexture:outputTexture];
[self addChild:outputSprite];
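One possible workaround (a sketch, not an answer from the thread; assumes iOS 9+ for SKTexture's CGImage accessor): capture the scene into an SKTexture with SKView's textureFromNode:, convert that to a CIImage, and use it as the background input instead of self:

```objc
// Render the whole scene to a texture, then wrap it as a CIImage.
SKTexture *sceneTexture = [self.view textureFromNode:self];
CIImage *backgroundCIImage = [CIImage imageWithCGImage:sceneTexture.CGImage];
[filter setValue:backgroundCIImage forKey:@"inputBackgroundImage"];
```

The rest of the code (creating the CGImage, texture, and sprite) can stay as written.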

Core Image filter CISourceOverCompositing not working properly

I am working on a photo editing app, and I have to merge two images one over the other, like this.
I have implemented the following code to do so.
Here imgedit is the background image and
imgEdit is the UIImageView containing imgedit.
UIImage *tempImg = [UIImage imageNamed:@"borderImg"];
CIImage *inputBackgroundImage = [[CIImage alloc] initWithImage:imgedit];
CIImage *inputImage = [[CIImage alloc] initWithImage:tempImg];
CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[filter setDefaults];
[filter setValue:inputImage forKey:@"inputImage"];
[filter setValue:inputBackgroundImage forKey:@"inputBackgroundImage"];
CIImage *outputImage1 = [filter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
imgEdit.image = [UIImage imageWithCGImage:[context createCGImage:outputImage1 fromRect:outputImage1.extent]];
But the output image I am getting after running the above code is:
I have also tried to resize the input white frame image using the following code:
tempImg=[tempImg resizedImageToSize:CGSizeMake(imgEdit.image.size.width,imgEdit.image.size.height)];
With that code the image resizes properly, but it still does not work.
Please help me out here.
Your valuable help will be highly appreciated.
Thank you in advance.
A better way to resize is as follows:
inputImage = [inputImage imageByApplyingTransform:CGAffineTransformMakeScale(inputBackgroundImage.extent.size.width/inputImage.extent.size.width, inputBackgroundImage.extent.size.height/inputImage.extent.size.height)];

UIImage sideways after converting it from CIImage

I'm applying a filter to a UIImage, but after the conversion the image appears sideways. Below is my code.
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"
                              keysAndValues:kCIInputImageKey, ciImage, nil];
[filter setDefaults];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImage = [[UIImage alloc]initWithCIImage:outputImage];
It renders successfully into newImage with the correct dimensions, but the image is sideways. Is there anything in the above code that could cause the orientation change?
Scale the image to the proper orientation before applying the filter.
Follow this link: https://stackoverflow.com/q/538064/871102
Instead of:
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
write:
UIImage *properOrientedImage = [self scaleAndRotateImage:self.imgToProcess];
CIImage *ciImage = [[CIImage alloc] initWithImage:properOrientedImage];
Yes, you may consider setting the image orientation explicitly by initializing the UIImage with this method:
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
These are the possible image orientations:
typedef enum {
    UIImageOrientationUp,            // default orientation
    UIImageOrientationDown,          // 180 deg rotation
    UIImageOrientationLeft,          // 90 deg CCW
    UIImageOrientationRight,         // 90 deg CW
    UIImageOrientationUpMirrored,    // as above but image mirrored along
                                     // other axis. horizontal flip
    UIImageOrientationDownMirrored,  // horizontal flip
    UIImageOrientationLeftMirrored,  // vertical flip
    UIImageOrientationRightMirrored, // vertical flip
} UIImageOrientation;
UIImageOrientationUp is the default, so try one of the rotated values.
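For example, a minimal sketch (assuming cgImage from the code above, and carrying the scale and orientation over from the original image):

```objc
// Rebuild the UIImage, preserving the source image's scale and orientation
// so UIKit draws it the right way up.
UIImage *newImage = [[UIImage alloc] initWithCGImage:cgImage
                                               scale:self.imgToProcess.scale
                                         orientation:self.imgToProcess.imageOrientation];
```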

Aligning two images when merging them with CIFilter

I'm merging two images by using the CIFilter @"CIDarkenBlendMode". It works fine except for one thing: I want the images to be exactly aligned on top of each other regardless of the image sizes, but I am not able to achieve this. Do I have to create my own filter?
This is what I get:
This is what I want:
My merge-code:
-(void)mergeImagesWithCIImage:(UIImage*)image
{
CIImage *topImage = [[CIImage alloc]initWithImage:image];
CIImage *scaledImage = [self scaleImageWithCIImage:topImage];
CIImage *backgroundImage = [[CIImage alloc]initWithImage:self.vImage.image];
CIFilter *darkenFilter = [CIFilter filterWithName:@"CIDarkenBlendMode" keysAndValues:kCIInputImageKey, scaledImage,
                          @"inputBackgroundImage", backgroundImage, nil];
CIImage *filterOutputImage = darkenFilter.outputImage;
CIContext *ctx = [CIContext contextWithOptions:nil];
CGImageRef createdImage = [ctx createCGImage:filterOutputImage fromRect:filterOutputImage.extent];
UIImage *outputImage = [UIImage imageWithCGImage:createdImage];
CGImageRelease(createdImage);
createdImage = nil;
self.vImage.image = outputImage;
}
Instead of using a CIFilter, I used:
[_image drawInRect:CGRectMake(centerX, centerY, _image.size.width, _image.size.height) blendMode:kCGBlendModeDarken alpha:0.8];
and centered the images.
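A fuller sketch of that approach (names are illustrative: bottomImage is the background UIImage, _image the top one; centerX/centerY center the top image as the answer suggests):

```objc
// Draw the background first, then blend the top image over it with darken mode.
UIGraphicsBeginImageContextWithOptions(bottomImage.size, NO, bottomImage.scale);
[bottomImage drawInRect:CGRectMake(0, 0, bottomImage.size.width, bottomImage.size.height)];
CGFloat centerX = (bottomImage.size.width  - _image.size.width)  / 2.0;
CGFloat centerY = (bottomImage.size.height - _image.size.height) / 2.0;
[_image drawInRect:CGRectMake(centerX, centerY, _image.size.width, _image.size.height)
         blendMode:kCGBlendModeDarken
             alpha:0.8];
UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```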

Blur UIImageView with gradient alpha

How would I give a UIImageView a gradient-blurred background? In other words, the image is fully blurred on the left and gets less blurred towards the right?
We did the same in a project! We used this method:
- (UIImage *)blurImage:(CGImageRef)image withBlurLevel:(CGFloat)blur
{
    CIImage *inputImage = [CIImage imageWithCGImage:image];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, inputImage, @"inputRadius", @(blur), nil];
    CIImage *outputImage = filter.outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef outImage = [context createCGImage:outputImage
                                        fromRect:[inputImage extent]];
    UIImage *returnImage = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return returnImage;
}
This code uses the GPU to compute the blur, but you need to call the method on another thread if you don't want your UI to be blocked for a moment.
Hope it helps!
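That gives a uniform blur. For a blur that actually fades from left to right, one option (not from the thread, a sketch assuming iOS 8+) is CIMaskedVariableBlur driven by a linear-gradient mask: white areas of the mask receive the full radius, black areas stay sharp:

```objc
// Blur at full strength on the left, fading to none on the right.
CIImage *input = [CIImage imageWithCGImage:image];
CGRect extent = input.extent;

// Horizontal gradient mask: white (blurred) on the left, black (sharp) on the right.
CIFilter *gradient = [CIFilter filterWithName:@"CILinearGradient"];
[gradient setValue:[CIVector vectorWithX:0 Y:0] forKey:@"inputPoint0"];
[gradient setValue:[CIVector vectorWithX:extent.size.width Y:0] forKey:@"inputPoint1"];
[gradient setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
[gradient setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];
CIImage *mask = [gradient.outputImage imageByCroppingToRect:extent];

CIFilter *variableBlur = [CIFilter filterWithName:@"CIMaskedVariableBlur"];
[variableBlur setValue:input forKey:kCIInputImageKey];
[variableBlur setValue:mask forKey:@"inputMask"];
[variableBlur setValue:@(blur) forKey:@"inputRadius"];
CIImage *blurred = [variableBlur.outputImage imageByCroppingToRect:extent];
```

The cropped output can then be rendered through a CIContext exactly as in the method above.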