I'm trying to apply a color dodge blend (CIFilter) to an image and composite it with my whole scene (an SKScene node). Unfortunately, CIColorDodgeBlendMode only accepts a CIImage as its background input. Is there a workaround?
Basically I want the same result as in Photoshop: two layers, with the color dodge blending mode applied to the upper layer.
Here is my code (`self` is the SKScene node):
UIImage *inputUIImage = [UIImage imageNamed:@"inputImage.png"];
CIImage *inputCIImage = [[CIImage alloc] initWithImage:inputUIImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorDodgeBlendMode"];
[filter setValue:inputCIImage forKey:@"inputImage"];
[filter setValue:self forKey:@"inputBackgroundImage"]; // this is the problem: self is an SKScene, not a CIImage
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:outputImage fromRect:[outputImage extent]];
SKTexture *outputTexture = [SKTexture textureWithCGImage:cg];
SKSpriteNode *outputSprite = [[SKSpriteNode alloc] initWithTexture:outputTexture];
[self addChild:outputSprite];
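One possible workaround (a sketch, not from the original post): render the scene into a texture with -[SKView textureFromNode:], convert that to a CIImage, and use it as the background input. This assumes iOS 9 or later for -[SKTexture CGImage]:

```objc
// Sketch: capture the whole scene as a CIImage first (assumes iOS 9+ for -[SKTexture CGImage]).
SKTexture *sceneTexture = [self.view textureFromNode:self];
CIImage *backgroundCIImage = [CIImage imageWithCGImage:[sceneTexture CGImage]];
[filter setValue:backgroundCIImage forKey:@"inputBackgroundImage"];
```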
I'm applying a filter to a UIImage, but after the conversion the image appears sideways. Below is my code.
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectChrome"
                              keysAndValues:kCIInputImageKey, ciImage, nil];
[filter setDefaults];
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImage = [[UIImage alloc]initWithCIImage:outputImage];
It renders successfully into newImage with the right dimensions, but the image is sideways. Is there anything in the code above that causes the orientation change?
Rotate the image to the proper orientation before applying the filter.
See this link: https://stackoverflow.com/q/538064/871102
Instead of:
CIImage *ciImage = [[CIImage alloc] initWithImage:self.imgToProcess];
write:
UIImage *properOrientedImage = [self scaleAndRotateImage:self.imgToProcess];
CIImage *ciImage = [[CIImage alloc] initWithImage:properOrientedImage]; // initWithImage: takes a UIImage, not a CGImage
Yes, to be sure of the orientation you can set it explicitly by initializing the UIImage with this method:
- (id)initWithCGImage:(CGImageRef)imageRef scale:(CGFloat)scale orientation:(UIImageOrientation)orientation
And there are a couple of image orientations to choose from:
typedef enum {
    UIImageOrientationUp,            // default orientation
    UIImageOrientationDown,          // 180 deg rotation
    UIImageOrientationLeft,          // 90 deg CCW
    UIImageOrientationRight,         // 90 deg CW
    UIImageOrientationUpMirrored,    // as above but image mirrored along
                                     // other axis. horizontal flip
    UIImageOrientationDownMirrored,  // horizontal flip
    UIImageOrientationLeftMirrored,  // vertical flip
    UIImageOrientationRightMirrored, // vertical flip
} UIImageOrientation;
UIImageOrientationUp is the default, so try the other values until the result displays upright.
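For example, a minimal sketch of re-wrapping the bitmap with an explicit orientation before filtering (UIImageOrientationRight is an assumption here; pick whichever value matches your source):

```objc
// Sketch: tell UIKit how the raw bitmap is oriented before handing it to Core Image.
UIImage *oriented = [[UIImage alloc] initWithCGImage:self.imgToProcess.CGImage
                                               scale:self.imgToProcess.scale
                                         orientation:UIImageOrientationRight];
CIImage *ciImage = [[CIImage alloc] initWithImage:oriented];
```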
I'm merging two images using the CIDarkenBlendMode CIFilter. It works fine except for one thing: I want the images to be exactly aligned on top of each other regardless of image size, but I can't achieve this. Do I have to create my own filter?
This is what I get:
This is what I want:
My merge-code:
- (void)mergeImagesWithCIImage:(UIImage *)image
{
    CIImage *topImage = [[CIImage alloc] initWithImage:image];
    CIImage *scaledImage = [self scaleImageWithCIImage:topImage];
    CIImage *backgroundImage = [[CIImage alloc] initWithImage:self.vImage.image];
    CIFilter *darkenFilter = [CIFilter filterWithName:@"CIDarkenBlendMode"
                                        keysAndValues:kCIInputImageKey, scaledImage,
                                                      @"inputBackgroundImage", backgroundImage, nil];
    CIImage *filterOutputImage = darkenFilter.outputImage;
    CIContext *ctx = [CIContext contextWithOptions:nil];
    CGImageRef createdImage = [ctx createCGImage:filterOutputImage fromRect:filterOutputImage.extent];
    UIImage *outputImage = [UIImage imageWithCGImage:createdImage];
    CGImageRelease(createdImage);
    createdImage = nil;
    self.vImage.image = outputImage;
}
Instead of using a CIFilter, I drew the top image with a Core Graphics blend mode:
[_image drawInRect:CGRectMake(centerX,centerY,_image.size.width,_image.size.height) blendMode:kCGBlendModeDarken alpha:0.8];
and centered the images.
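A fuller sketch of that approach (the names `bottomImage` and `topImage` are placeholders, assuming both images are drawn into a single bitmap context):

```objc
// Sketch: composite two UIImages with a darken blend, centering the top image.
UIGraphicsBeginImageContextWithOptions(bottomImage.size, NO, 0.0);
[bottomImage drawInRect:CGRectMake(0, 0, bottomImage.size.width, bottomImage.size.height)];
CGFloat centerX = (bottomImage.size.width  - topImage.size.width)  / 2.0;
CGFloat centerY = (bottomImage.size.height - topImage.size.height) / 2.0;
[topImage drawInRect:CGRectMake(centerX, centerY, topImage.size.width, topImage.size.height)
           blendMode:kCGBlendModeDarken
               alpha:0.8];
UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```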
How would I give a UIImageView a gradient blurred background? In other words, the image is fully blurred on the left and gets less blurred toward the right.
We did the same thing in a project! We used this method:
- (UIImage *)blurImage:(CGImageRef)image withBlurLevel:(CGFloat)blur
{
    CIImage *inputImage = [CIImage imageWithCGImage:image];
    CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, inputImage,
                                                @"inputRadius", @(blur), nil];
    CIImage *outputImage = filter.outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef outImage = [context createCGImage:outputImage
                                        fromRect:[inputImage extent]];
    UIImage *returnImage = [UIImage imageWithCGImage:outImage];
    CGImageRelease(outImage);
    return returnImage;
}
This code uses the GPU to perform the blur, but you should call the method on a background thread if you don't want your UI to be blocked for a second.
Hope it helps!
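Note that the method above applies a uniform blur, not a gradient one. For an actual left-to-right falloff, one hedged option (a sketch, assuming iOS 9's CIMaskedVariableBlur is available and `self.imageView` holds the source image; white areas of the mask are blurred the most):

```objc
// Sketch: blur strength controlled by a horizontal gradient mask.
CIImage *inputImage = [CIImage imageWithCGImage:self.imageView.image.CGImage];
CGRect extent = inputImage.extent;

// White on the left (fully blurred) fading to black on the right (sharp).
CIFilter *gradient = [CIFilter filterWithName:@"CISmoothLinearGradient"];
[gradient setValue:[CIVector vectorWithX:0 Y:0] forKey:@"inputPoint0"];
[gradient setValue:[CIVector vectorWithX:extent.size.width Y:0] forKey:@"inputPoint1"];
[gradient setValue:[CIColor colorWithRed:1 green:1 blue:1] forKey:@"inputColor0"];
[gradient setValue:[CIColor colorWithRed:0 green:0 blue:0] forKey:@"inputColor1"];

CIFilter *maskedBlur = [CIFilter filterWithName:@"CIMaskedVariableBlur"];
[maskedBlur setValue:inputImage forKey:kCIInputImageKey];
[maskedBlur setValue:[gradient.outputImage imageByCroppingToRect:extent] forKey:@"inputMask"];
[maskedBlur setValue:@10.0 forKey:kCIInputRadiusKey];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:maskedBlur.outputImage fromRect:extent];
self.imageView.image = [UIImage imageWithCGImage:cg];
CGImageRelease(cg);
```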
I have a black-and-white image. I would like to change the black to blue and the white to yellow (for example) in Objective-C.
How could I do this?
Thanks
You can use Core Image to do this.
UIImage *bwImage = ... // Get your image from somewhere
CGImageRef bwCGImage = bwImage.CGImage;
CIImage *bwCIImage = [CIImage imageWithCGImage:bwCGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIHueAdjust"];
// Change the float value here to change the hue
[filter setValue:[NSNumber numberWithFloat:0.5] forKey:@"inputAngle"];
// input black and white image
[filter setValue:bwCIImage forKey:kCIInputImageKey];
// get output from filter
CIImage *hueImage = [filter valueForKey:kCIOutputImageKey];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:hueImage
fromRect:[hueImage extent]];
UIImage *coloredImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
See documentation for more info: Core Image Filter Reference
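If the hue rotation doesn't produce the mapping you want (CIHueAdjust leaves pure black and pure white unchanged), a hedged alternative is CIFalseColor, which maps image luminance between two colors. A sketch, reusing `bwCIImage` from the snippet above:

```objc
// Sketch: map black -> blue (inputColor0) and white -> yellow (inputColor1).
CIFilter *falseColor = [CIFilter filterWithName:@"CIFalseColor"];
[falseColor setValue:bwCIImage forKey:kCIInputImageKey];
[falseColor setValue:[CIColor colorWithRed:0 green:0 blue:1] forKey:@"inputColor0"];
[falseColor setValue:[CIColor colorWithRed:1 green:1 blue:0] forKey:@"inputColor1"];
CIImage *mappedImage = falseColor.outputImage;
```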
CI Filters are now available in iOS 5, and I'm trying to apply one to a CALayer, the way you'd do it on Mac. Here's my code:
CALayer *myCircle = [CALayer layer];
myCircle.bounds = CGRectMake(0,0,30,30);
myCircle.position = CGPointMake(100,100);
myCircle.cornerRadius = 15;
myCircle.borderColor = [UIColor whiteColor].CGColor;
myCircle.borderWidth = 2;
myCircle.backgroundColor = [UIColor whiteColor].CGColor;
CIFilter *blurFilter = [CIFilter filterWithName:@"CIDiscBlur"];
[blurFilter setDefaults];
[blurFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
[myCircle setFilters:[NSArray arrayWithObjects:blurFilter, nil]];
[self.view.layer addSublayer:myCircle];
My white circle draws fine, but the filter isn't applied.
Aside from the fact that CIDiscBlur is not available (as of iOS SDK 5.1), and that setFilters: seems to be unavailable as well, you could do the following:
Create the input CIImage from the contents of your layer:
CIImage *inputImage = [CIImage imageWithCGImage:(CGImageRef)(myCircle.contents)];
Apply your filters and get the result as a CGImageRef:
CIFilter *filter = [CIFilter filterWith...]; // a filter that is available in iOS, or a custom one :)
...
CIImage *outputImage = [filter outputImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
Finally, set the CGImageRef as the layer's contents (and release it, since createCGImage: follows the create rule):
[myCircle setContents:(__bridge id)cgimg];
CGImageRelease(cgimg);
This should work :)
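One caveat (an assumption about this setup, not part of the original answer): a layer drawn purely via backgroundColor, borderWidth, and cornerRadius may have nil contents, so you may need to rasterize it first before step 1 works. A sketch:

```objc
// Sketch: rasterize the layer so there is a CGImage to feed into Core Image.
UIGraphicsBeginImageContextWithOptions(myCircle.bounds.size, NO, 0.0);
[myCircle renderInContext:UIGraphicsGetCurrentContext()];
UIImage *layerImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CIImage *inputImage = [CIImage imageWithCGImage:layerImage.CGImage];
```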