How to create reverse masks on CALayers - iOS

I'll make this as simple as possible.
How do I create reverse masks on CALayers in iOS?
I have a red view and an image that is used to mask the red view.
I use the view's CALayer mask property to apply the mask; the result is the following.
However, what I want is the opposite result:
(imagine that the white part here is actually the wood in the background, because I am not very good with image editing software)
To put it another way: I want the masking image to punch a hole through the view, not act as an actual masking layer.
Answers in either C# (MonoTouch) or Objective-C are fine.

Hope this is helpful.
Step 1: Create a mask image without any alpha channel.
Step 2: Create an inverted mask with the following method:
- (UIImage *)createInvertMask:(UIImage *)maskImage withTargetImage:(UIImage *)image {
    CGImageRef maskRef = maskImage.CGImage;
    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();

    // Rebuild the mask's pixel data as a grayscale image without alpha;
    // masking with a grayscale image instead of an image mask is what
    // inverts the effect here.
    CGImageRef mask = CGImageCreate(CGImageGetWidth(maskRef),
                                    CGImageGetHeight(maskRef),
                                    CGImageGetBitsPerComponent(maskRef),
                                    CGImageGetBitsPerPixel(maskRef),
                                    CGImageGetBytesPerRow(maskRef),
                                    graySpace,
                                    (CGBitmapInfo)kCGImageAlphaNone,
                                    CGImageGetDataProvider(maskRef),
                                    NULL,
                                    NO,
                                    kCGRenderingIntentDefault);
    CGColorSpaceRelease(graySpace);

    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);

    UIImage *result = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);   // maskRef belongs to maskImage, so it is not released here
    return result;
}
Step 3: Apply the inverted mask to a UIView with the following method:
- (void)maskWithImage:(UIImage *)maskImage targetView:(UIView *)targetView {
    CALayer *maskingLayer = [CALayer layer];
    maskingLayer.frame = targetView.bounds;
    maskingLayer.contents = (__bridge id)[maskImage CGImage];
    targetView.layer.mask = maskingLayer;
}
Finished.
How to call it:
UIImage *targetImage = [UIImage imageNamed:@"sky_bg.png"];
UIImage *mask = [UIImage imageNamed:@"mask01.png"];
UIImage *invertMask = [self createInvertMask:mask withTargetImage:targetImage];
[self maskWithImage:invertMask targetView:targetview];

You need to reverse the alpha of your mask so that the "holes" are the transparent pixels. You'll also want to stretch your mask image to cover the whole red view.
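A minimal sketch of one way to build such a reversed mask (the method name and the blend-mode approach are my assumptions, not part of the original answer): fill an opaque canvas the size of the view, then erase the mask's opaque pixels with the destination-out blend mode, so the result's alpha is the inverse of the mask's.
- (UIImage *)invertedAlphaMaskFromImage:(UIImage *)maskImage size:(CGSize)size {
    // Start with a fully opaque canvas covering the whole view to be masked...
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [[UIColor blackColor] setFill];
    UIRectFill(CGRectMake(0, 0, size.width, size.height));

    // ...then punch the mask out of it: destination-out keeps destination
    // alpha only where the source (the mask) is transparent.
    [maskImage drawInRect:CGRectMake(0, 0, size.width, size.height)
                blendMode:kCGBlendModeDestinationOut
                    alpha:1.0];

    UIImage *inverted = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return inverted;
}
Setting the returned image as the contents of the view's layer mask should then hide exactly the area the original mask image covered.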

Related

Cropping UIImage by custom shape

I have a background UIImage, and I would like to crop the background UIImage with a custom shape so this background image only "appears" through the custom shape. For example, I have a moon-shaped custom shape, and I would like the background image to only come through on the moon-shaped part.
Based on another answer, I am trying to impose a mask on the image like so:
- (UIImage *)createImageFromImage:(UIImage *)image
                    withMaskImage:(UIImage *)mask {
    CGImageRef imageRef = image.CGImage;
    CGImageRef maskRef = mask.CGImage;

    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                             CGImageGetHeight(maskRef),
                                             CGImageGetBitsPerComponent(maskRef),
                                             CGImageGetBitsPerPixel(maskRef),
                                             CGImageGetBytesPerRow(maskRef),
                                             CGImageGetDataProvider(maskRef),
                                             NULL,
                                             YES);

    CGImageRef maskedReference = CGImageCreateWithMask(imageRef, imageMask);
    CGImageRelease(imageMask);

    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
However, the result has several issues:
the moon image I have is 100×100, but it has been stretched to a strange proportion
my goal is for the image to come through only inside the moon shape; at the moment the moon is drawn as a white shape on top of the image
Any ideas about how I could fix the crop issue would be much appreciated. Thanks!
On your background image you'll have to add a custom mask through a CALayer.
Keep in mind that everything you color in mask.png (the moon) will be visible; everything transparent will not display.
UIImage *moonImage = [UIImage imageNamed:@"mask.png"];

CALayer *maskLayer = [CALayer layer];
[maskLayer setContents:(__bridge id)moonImage.CGImage];
[maskLayer setFrame:CGRectMake(0.0f, 0.0f, moonImage.size.width, moonImage.size.height)];

UIImageView *yourBackgroundImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 1024, 768)];
// Keep your existing image view setup intact here, just add:
[yourBackgroundImageView.layer setMask:maskLayer];
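If the 100×100 moon then shows up stretched or pinned to a corner, one possible tweak (my assumption, not part of the original answer) is to keep the mask at its natural size and center it over the background view:
// Hypothetical adjustment: keep the moon at its natural size, centered.
maskLayer.frame = CGRectMake(0.0f, 0.0f, moonImage.size.width, moonImage.size.height);
maskLayer.position = CGPointMake(CGRectGetMidX(yourBackgroundImageView.bounds),
                                 CGRectGetMidY(yourBackgroundImageView.bounds));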

iOS SDK - Image masking

How do I mask an image with another image's non-transparent pixels?
E.g.
When the mask image is black and white, I use this function:
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);

    UIImage *result = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return result;
}
But how do I do this in my case?
Here are the steps to achieve the result:
1) Create an image view and calculate the optimal size for it (and set the aspect-fit content mode).
2) Get the image view's frame and create a mask sized to it (i.e. scale your predefined mask image); see the sketch after the snippet below.
3) Mask the image view with the mask you created.
UIImageView *maskView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"star_mask_alpha.png"]];
self.needsMaskImageView.layer.mask = maskView.layer;
[self.needsMaskImageView setNeedsDisplay];
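A minimal sketch of steps 1–3 put together, assuming an aspect-fit image view and using AVFoundation's rect-fitting helper (the photo asset name is a placeholder):
#import <AVFoundation/AVFoundation.h>   // for AVMakeRectWithAspectRatioInsideRect (link AVFoundation)

UIImage *photo = [UIImage imageNamed:@"photo.png"];   // hypothetical asset
self.needsMaskImageView.contentMode = UIViewContentModeScaleAspectFit;
self.needsMaskImageView.image = photo;

// The rect the photo actually occupies inside the aspect-fit image view.
CGRect fittedRect = AVMakeRectWithAspectRatioInsideRect(photo.size,
                                                        self.needsMaskImageView.bounds);

// Scale the mask so it covers exactly that rect, then apply it.
UIImageView *maskView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"star_mask_alpha.png"]];
maskView.frame = fittedRect;
self.needsMaskImageView.layer.mask = maskView.layer;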

How to crop an image in a desired shape

I'm trying to develop a simple game, but I came across this problem with my UIImage.
When I import an image, I get this annoying background around the ball, not just the ball itself. So when I play with this ball, the background has to match the UIView's color, or else it looks weird. How do I solve this problem?
You can use the following code to get a shaped image from a mask:
// beCroppedImage is the image to be cropped; maskImage is an image with a white background and the black shape you want.
theShapeImage = [beCroppedImage maskImageWithMask:maskImage];
- (UIImage *)maskImageWithMask:(UIImage *)mask {
    CGImageRef imgRef = [self CGImage];
    CGImageRef maskRef = [mask CGImage];
    CGImageRef actualMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                              CGImageGetHeight(maskRef),
                                              CGImageGetBitsPerComponent(maskRef),
                                              CGImageGetBitsPerPixel(maskRef),
                                              CGImageGetBytesPerRow(maskRef),
                                              CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask(imgRef, actualMask);
    UIImage *image = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    CGImageRelease(actualMask);
    return image;
}

Masking UIImage in iOS

I'm trying to create a profile picture with a custom shape pattern using masking in iOS. Here is what I've used to create the masked image:
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef maskedImageRef = CGImageCreateWithMask([image CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef];
    CGImageRelease(mask);
    CGImageRelease(maskedImageRef);

    // returns new image with mask applied
    return maskedImage;
}
After implementing this method, I am calling it like this (_profileImage is a global UIImage):
UIImage *maskImage = [UIImage imageNamed:@"maskProfilePicture.png"];
UIImage *maskedImage = [self maskImage:_profileImage withMask:maskImage];
And the result (the image should show through the mask's shape, but it seems the mask image just overlaps the image that needs to be masked):
The mask image (maybe the properties of the mask image I created in Photoshop are wrong):
The Core Graphics calls look correct.
Change the mask image so it has black where you want the image to show through, and white elsewhere.
You can use CALayer's mask property to achieve the result.
CALayer Apple Documentation
Your masking layer should be transparent in places where you want to cut out the image.
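A minimal sketch of that approach, assuming the profile picture sits in an image view called profileImageView (a name made up here) and the mask PNG is opaque where the picture should stay visible:
// Hypothetical setup: apply the mask PNG as a layer mask instead of
// compositing the images with Core Graphics.
UIImage *maskImage = [UIImage imageNamed:@"maskProfilePicture.png"];

CALayer *maskLayer = [CALayer layer];
maskLayer.frame = profileImageView.bounds;             // profileImageView is assumed
maskLayer.contents = (__bridge id)maskImage.CGImage;

profileImageView.image = _profileImage;
profileImageView.layer.mask = maskLayer;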

Mask a UIImage that has been moved/scaled

I can accomplish masking like this:
CGAffineTransform t = self.imageForEditing.transform;
NSLog(@"x offset: %f, y offset: %f", self.imageForEditing.frame.origin.x, self.imageForEditing.frame.origin.y);
NSLog(@"scale: %f", sqrt(t.a * t.a + t.c * t.c));

UIImage *maskImage = [UIImage imageNamed:@"faceMask"];
UIImage *colorsImage = [self.imageForEditing.image imageRotatedByDegrees:180];

CGImageRef maskRef = maskImage.CGImage;
CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                         CGImageGetHeight(maskRef),
                                         CGImageGetBitsPerComponent(maskRef),
                                         CGImageGetBitsPerPixel(maskRef),
                                         CGImageGetBytesPerRow(maskRef),
                                         CGImageGetDataProvider(maskRef), NULL, false);
CGImageRef maskedRef = CGImageCreateWithMask(colorsImage.CGImage, imageMask);
UIImage *maskedImage = [UIImage imageWithCGImage:maskedRef];
You can see in the log statements that I can access how much the original image has been moved and/or scaled. How can I apply this information so that the new masked image takes it into account? Imagine the mask is just a circular cutout in the middle of the screen. self.imageForEditing (a UIImageView) can be moved around. When the new image is created, it should just be the part that is visible through the cutout. The code above works but doesn't take the moving/scaling of the underlying image into account.
EDIT: I think it might be easier to just create a new image based on the current state of the image view (self.imageForEditing). It lies within a container, self.editContainer. So how can I just create a new image based on the current pixels contained in self.editContainer?
Finally solved this by rendering the container view as an image, then using a UIImage category I found to trim the transparent pixels. In case it helps someone:
On the container UIView:
UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
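The trimming category itself isn't shown above; a minimal sketch of the idea (my own assumption of how such a category method might work, ignoring image orientation) is to scan an alpha-only bitmap for the bounding box of non-transparent pixels and crop to it:
- (UIImage *)imageByTrimmingTransparentPixels:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Render into an alpha-only bitmap whose bytes we can inspect directly.
    uint8_t *alpha = calloc(width * height, 1);
    CGContextRef ctx = CGBitmapContextCreate(alpha, width, height, 8, width,
                                             NULL, (CGBitmapInfo)kCGImageAlphaOnly);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(ctx);

    // Find the bounding box of pixels with non-zero alpha.
    size_t minX = width, minY = height, maxX = 0, maxY = 0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            if (alpha[y * width + x] > 0) {
                if (x < minX) minX = x;
                if (x > maxX) maxX = x;
                if (y < minY) minY = y;
                if (y > maxY) maxY = y;
            }
        }
    }
    free(alpha);
    if (maxX < minX || maxY < minY) return image;   // fully transparent image

    CGRect cropRect = CGRectMake(minX, minY, maxX - minX + 1, maxY - minY + 1);
    CGImageRef cropped = CGImageCreateWithImageInRect(cgImage, cropRect);
    UIImage *trimmed = [UIImage imageWithCGImage:cropped
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(cropped);
    return trimmed;
}
Calling this on the image rendered from the container view would then return just the non-transparent region.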
