iOS color to transparent in UIImage

How can I clear out the magenta part of a UIImage and make it transparent?
I've looked through numerous answers and links on SO and nothing works (e.g. How to make one color transparent on a UIImage? — answer 1 removes everything but red, and answer 2 apparently doesn't work because of Why is CGImageCreateWithMaskingColors() returning nil in this case?).
Update:
If I use CGImageCreateWithMaskingColors with the UIImage, I get nil back. If I remove the alpha channel (by round-tripping the image through JPEG) CGImageCreateWithMaskingColors returns an image painted on a black background.
Update2, the code:
Returning nil:
const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef imageRef = CGImageCreateWithMaskingColors(anchorWithMask.CGImage, colorMasking);
NSLog(@"image ref %@", imageRef);
// this is going to return a nil imageRef.
UIImage *image = [UIImage imageWithCGImage:imageRef];
Returning an image with a black background (which is expected, since there is no alpha channel):
UIImage *inputImage = [UIImage imageWithData:UIImageJPEGRepresentation(anchorWithMask, 1.0)];
const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef imageRef = CGImageCreateWithMaskingColors(inputImage.CGImage, colorMasking);
NSLog(@"image ref %@", imageRef);
// imageRef is NOT nil.
UIImage *image = [UIImage imageWithCGImage:imageRef];
Update3:
I got it working by adding the alpha channel after the masking process.
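For reference, one way to add the alpha channel back after masking is to redraw the masked image into a non-opaque graphics context (a sketch; `maskedImage` stands in for the output of CGImageCreateWithMaskingColors):

```objc
// Redraw the masked image into an alpha-capable context so the
// masked-out pixels come out transparent instead of black.
UIGraphicsBeginImageContextWithOptions(maskedImage.size, NO, maskedImage.scale);
[maskedImage drawInRect:CGRectMake(0, 0, maskedImage.size.width, maskedImage.size.height)];
UIImage *imageWithAlpha = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```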

UIImage *image = [UIImage imageNamed:@"image.png"];
const float colorMasking[6] = {1.0, 1.0, 0.0, 0.0, 1.0, 1.0};
image = [UIImage imageWithCGImage: CGImageCreateWithMaskingColors(image.CGImage, colorMasking)];
You receive nil because the parameters you are passing are invalid. If you open the Apple documentation, you will see this:
Components
An array of color components that specify a color or range of colors
to mask the image with. The array must contain 2N
values { min[1], max[1], ... min[N], max[N] } where N is the number of
components in color space of image. Each value in components must be a
valid image sample value. If image has integer pixel components, then
each value must be in the range [0 .. 2**bitsPerComponent - 1] (where
bitsPerComponent is the number of bits/component of image). If image
has floating-point pixel components, then each value may be any
floating-point number which is a valid color component.
You can quickly open the documentation by Option-clicking a function or class name, such as CGImageCreateWithMaskingColors.
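To make that rule concrete: for a typical 8-bit-per-component RGB image (no alpha), N = 3, so the array needs 2 × 3 = 6 values, each in 0–255. A sketch for masking a magenta-ish range (the exact numbers are an assumption you would tune against your image):

```objc
// {minRed, maxRed, minGreen, maxGreen, minBlue, maxBlue}, each 0..255
const CGFloat colorMasking[6] = { 200, 255,   // red:   high
                                  0,   60,    // green: low
                                  200, 255 }; // blue:  high
CGImageRef masked = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
if (masked != NULL) { // still NULL if the source image has an alpha channel
    UIImage *result = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
}
```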

I made this static function that removes the white background; you can use it by replacing the mask with the color range you want to remove:
+ (UIImage *)processImage:(UIImage *)image
{
    const CGFloat colorMasking[6] = { 222, 255, 222, 255, 222, 255 };
    CGImageRef imageRef = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
    UIImage *imageB = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return imageB;
}

I did this with a CIImage, since the Vision post-processing gave me a pixel buffer:
let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
let filteredImage = ciImage.applyingFilter("CIMaskToAlpha")
self.picture.image = UIImage(ciImage: filteredImage)

Related

iOS: Change black color in UIImage to another color

I have an image with a white border and black color filled inside it. I want to change the black color to another color at runtime. The user will select the color at runtime in HSB format. How can I do that? I tried CGImageCreateWithMaskingColors by taking
const float colorMasking[4]={255, 255, 255, 255};
but I am getting a nil CGImageRef every time. Please help.
- (UIImage *)maskBlackInImage:(UIImage *)image color:(UIColor *)color
{
    // Note: an RGB image needs all six values, and these ranges
    // mask near-white pixels (222–255 in each channel).
    const CGFloat colorMasking[6] = { 222, 255, 222, 255, 222, 255 };
    CGImageRef imageRef = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
    UIImage *imageB = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return imageB;
}
I am attaching the image of the bulb: black color fill, white border, and transparent background.
Update:
I was able to fill the black color with another color using the code in the accepted answer. But I can see a little bit of color on the white border, and the image doesn't look sharp. Attaching the output:
Create a category on UIImage and add the following method:
- (UIImage *)imageTintedWithColor:(UIColor *)color
{
    UIImage *image;
    if (color) {
        // Construct a new image the same size as this one.
        UIGraphicsBeginImageContextWithOptions([self size], NO, 0.0); // scale 0.0 means "use the device's main screen scale".
        CGRect rect = CGRectZero;
        rect.size = [self size];
        // Tint the image.
        [self drawInRect:rect];
        [color set];
        UIRectFillUsingBlendMode(rect, kCGBlendModeScreen);
        // Restore the alpha channel.
        [self drawInRect:rect blendMode:kCGBlendModeDestinationIn alpha:1.0f];
        image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
    return image;
}

iOS: How to convert a grayscale image back to the original by finger touch?

First, I convert the original image to grayscale, and that works fine. The problem is how to convert the grayscale back to the original image where the user touches.
What I'm unable to understand is how to convert grayscale back to the original.
Here is my code (original to grayscale):
- (UIImage *)convertImageToGrayScale:(UIImage *)image
{
    CGRect imageRect = CGRectMake(0, 0, image.size.width, image.size.height);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(nil, image.size.width, image.size.height, 8, 0, colorSpace, kCGImageAlphaNone);
    CGContextDrawImage(context, imageRect, [image CGImage]);
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:imageRef];
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    CGImageRelease(imageRef);
    return newImage;
}
Guidance needed. Thanks in advance.
You can't convert a gray scale image back to color because you no longer have any color information in the image data.
If you mean you have a color image that you're converting to gray scale, and then when the user taps you show a color version, then instead you need to hang on to the original image and show that one in color.
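One way to implement that (a sketch, with hypothetical names like grayOverlayView): stack the grayscale copy in an image view on top of the color original, then clear a circle out of the grayscale overlay wherever the user touches, letting the color show through:

```objc
// Called from touchesMoved:withEvent: with the touch location.
- (void)revealColorAtPoint:(CGPoint)point
{
    UIGraphicsBeginImageContextWithOptions(self.grayOverlayView.bounds.size, NO, 0.0);
    [self.grayOverlayView.image drawInRect:self.grayOverlayView.bounds];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Punch a transparent hole in the grayscale layer.
    CGContextSetBlendMode(ctx, kCGBlendModeClear);
    CGContextFillEllipseInRect(ctx, CGRectMake(point.x - 20, point.y - 20, 40, 40));
    self.grayOverlayView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
```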
I am using fixed-size images, 98 by 98 pixels. What I ended up doing is creating a blank 98-by-98 PNG in Photoshop and calling it rgboverlay.png. Then I just overlay my grayscale image on top of the blank one, and the resulting image is RGB. Here's the code (I originally got it from code that overlays one image on another).
// originalimage = your grayscale image
UIImage *overlay = [UIImage imageNamed:@"rgboverlay.png"];
CGSize size = CGSizeMake(98, 98);
UIGraphicsBeginImageContext(size);
[overlay drawAtPoint:CGPointMake(0, 0)];
[originalimage drawAtPoint:CGPointMake(1, 1)];
// result is the new RGB image
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This ended up working for me.

CGImageCreateWithMaskingColors Doesn't Work with iOS7

I've developed an app on iOS 5 and iOS 6. After I upgraded to Xcode 5 and iOS 7, I have some new bugs to play with.
The main one is that the color masking no longer works. The exact same code still compiles and works on a phone with iOS 6; on iOS 7, the masked color is still there. I tried to find the answer on Google, but haven't found one. Is it a bug in iOS 7, or does anybody know of a better way of doing color masking?
Here is the code:
- (UIImage *)processImage:(UIImage *)image
{
    UIImage *inputImage = [UIImage imageWithData:UIImageJPEGRepresentation(image, 1.0)];
    const CGFloat colorMasking[6] = { 100.0, 255.0, 0.0, 100.0, 100.0, 255.0 };
    CGImageRef imageRef = CGImageCreateWithMaskingColors(inputImage.CGImage, colorMasking);
    UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return finalImage;
}
Here are a couple StackOverflow posts I found that helped me get it working in iOS6 the first place:
Transparency iOS
iOS color to transparent in UIImage
I have stumbled across some strange behavior of CGImageCreateWithMaskingColors in conjunction with UIImagePNGRepresentation. This may or may not be related to your problem. I have found that:
If I use CGImageCreateWithMaskingColors and immediately add that image to an image view, the transparency appears to have been applied correctly.
But in iOS 7, if I then:
take the image from CGImageCreateWithMaskingColors and create an NSData using UIImagePNGRepresentation, and
reload the image from that NSData using imageWithData, then the resulting image no longer has its transparency.
To confirm this, if I writeToFile that NSData and examine the saved image in a tool like Photoshop, I can confirm that the file does not have any transparency applied.
This only manifests itself in iOS 7; in iOS 6 it's fine.
But if I take the image in step 1 and roundtrip it through drawInRect, the same process of saving the image and subsequently loading it works fine.
This following code illustrates the issue:
- (UIImage *)processImage:(UIImage *)inputImage
{
    const CGFloat colorMasking[6] = { 255.0, 255.0, 255.0, 255.0, 255.0, 255.0 };
    CGImageRef imageRef = CGImageCreateWithMaskingColors(inputImage.CGImage, colorMasking);
    UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    // If I put this image in an image view, I see the transparency fine.
    self.imageView.image = finalImage; // this works

    // But if I save it to disk, the file does _not_ have any transparency.
    NSString *documentsPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    NSString *pathWithoutTransparency = [documentsPath stringByAppendingPathComponent:@"image-but-no-transparency.png"];
    NSData *data = UIImagePNGRepresentation(finalImage);
    [data writeToFile:pathWithoutTransparency atomically:YES]; // save it so I can check out the file in Photoshop

    // In iOS 7, the following image view does not honor the transparency.
    self.imageView2.image = [UIImage imageWithData:data]; // this does not work in iOS 7

    // But if I round-trip the original image through `drawInRect` one final time,
    // the transparency works.
    UIGraphicsBeginImageContextWithOptions(finalImage.size, NO, 1.0);
    [finalImage drawInRect:CGRectMake(0, 0, finalImage.size.width, finalImage.size.height)];
    UIImage *anotherRendition = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    data = UIImagePNGRepresentation(anotherRendition);
    NSString *pathWithTransparency = [documentsPath stringByAppendingPathComponent:@"image-with-transparency.png"];
    [data writeToFile:pathWithTransparency atomically:YES];

    // But this image is fine.
    self.imageView3.image = [UIImage imageWithContentsOfFile:pathWithTransparency]; // this does work

    return anotherRendition;
}
I was loading a JPEG which, for some reason, loads with an alpha channel, which won't work when masking, so here I recreate the CGImage ignoring the alpha channel. There may be a better way of doing this, but it works!
- (UIImage *)imageWithChromaKeyMasking {
    const CGFloat colorMasking[6] = { 255.0, 255.0, 255.0, 255.0, 255.0, 255.0 };
    CGImageRef oldImage = self.CGImage;
    CGBitmapInfo oldInfo = CGImageGetBitmapInfo(oldImage);
    CGBitmapInfo newInfo = (oldInfo & (UINT32_MAX ^ kCGBitmapAlphaInfoMask)) | kCGImageAlphaNoneSkipLast;
    // Note: CGImageGetDataProvider does not transfer ownership, so the provider must not be released.
    CGDataProviderRef provider = CGImageGetDataProvider(oldImage);
    CGImageRef newImage = CGImageCreate(self.size.width, self.size.height, CGImageGetBitsPerComponent(oldImage), CGImageGetBitsPerPixel(oldImage), CGImageGetBytesPerRow(oldImage), CGImageGetColorSpace(oldImage), newInfo, provider, NULL, false, kCGRenderingIntentDefault);
    CGImageRef im = CGImageCreateWithMaskingColors(newImage, colorMasking);
    UIImage *ret = [UIImage imageWithCGImage:im];
    CGImageRelease(newImage);
    CGImageRelease(im);
    return ret;
}

CGImageCreateWithMaskingColors changes all of the transparent region to black

I am using CGImageCreateWithMaskingColors() to remove a particular color from a UIImage. The color removal works fine, but the transparent region in the image turns black after the masking process. See the code pasted below.
CGImageRef imageRef = self.editedImage.CGImage;
CGImageRef myColorMaskedImage = CGImageCreateWithMaskingColors(imageRef, myMaskingColors);
UIImage *newImage = [self normalizeWithAlpha:[UIImage imageWithCGImage:myColorMaskedImage]];
CGImageRelease(myColorMaskedImage);
You have to do a second masking pass to mask the black:
CGFloat colorMaskingLow[6] = { 0, low, 0, low, 0, low };
CGFloat colorMaskingHigh[6] = { high, 255, high, 255, high, 255 };
UIImage *image = [self maskColors:colorMaskingHigh inImage:image];
return [self maskColors:colorMaskingLow inImage:image];
(where the maskColors:inImage: function is what you specify above)
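For completeness, a minimal sketch of what that maskColors:inImage: helper could look like (assuming an 8-bit RGB input with no alpha channel, since masking fails otherwise):

```objc
- (UIImage *)maskColors:(const CGFloat[6])colorMasking inImage:(UIImage *)image
{
    CGImageRef masked = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
    if (masked == NULL) {
        return image; // masking returns NULL for images with an alpha channel
    }
    UIImage *result = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return result;
}
```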

CGImageCreateWithMaskingColors leaving border when removing color

I'm using the code below to make the color magenta transparent:
UIImage *inputImage = [UIImage imageWithData:UIImageJPEGRepresentation(anchorWithMask, 1.0)];
const CGFloat colorMasking[6] = { 0.0, 255.0, 0.0, 2.0, 0.0, 255.0 };
CGImageRef imageRef = CGImageCreateWithMaskingColors(inputImage.CGImage, colorMasking);
UIImage *image = [UIImage imageWithCGImage:imageRef];
// add an alpha channel to the image.
The result is an image that has magenta borders: http://i.imgur.com/4g6lM.png. The borders are present before adding the alpha channel, so it's not related to that.
Is it possible to get rid of those borders?
CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), NO);
It will be ok.
The problem was in the mask I was building. Setting antialias to off fixed the issue.
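In context, the antialiasing call goes right before the drawing that produces the mask, so no partially blended magenta-ish pixels are left at the edges to escape the masking range (a sketch; the drawing itself is whatever builds your mask):

```objc
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
// Hard edges only: no intermediate colors at the borders of the mask.
CGContextSetShouldAntialias(ctx, NO);
// ... draw the magenta mask shapes here ...
UIImage *mask = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```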
