Mask a UIImage that has been moved/scaled - ios

I can accomplish masking like this:
CGAffineTransform t = self.imageForEditing.transform;
NSLog(@"x offset: %f, y offset: %f", self.imageForEditing.frame.origin.x, self.imageForEditing.frame.origin.y);
NSLog(@"scale: %f", sqrt(t.a * t.a + t.c * t.c));
UIImage *maskImage = [UIImage imageNamed:@"faceMask"];
UIImage *colorsImage = [self.imageForEditing.image imageRotatedByDegrees:180];
CGImageRef maskRef = maskImage.CGImage;
CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                         CGImageGetHeight(maskRef),
                                         CGImageGetBitsPerComponent(maskRef),
                                         CGImageGetBitsPerPixel(maskRef),
                                         CGImageGetBytesPerRow(maskRef),
                                         CGImageGetDataProvider(maskRef), NULL, false);
CGImageRef maskedRef = CGImageCreateWithMask(colorsImage.CGImage, imageMask);
CGImageRelease(imageMask); // release the mask to avoid a leak
UIImage *maskedImage = [UIImage imageWithCGImage:maskedRef];
CGImageRelease(maskedRef);
You can see from the log statements that I can access how much the original image has been moved and/or scaled. How can I apply this information so that the new masked image takes it into account? Imagine the mask is just a circular cutout in the middle of the screen. self.imageForEditing (a UIImageView) can be moved around. When the new image is created, it should be just the part that is visible through the cutout. The code above works but doesn't take the moving/scaling of the underlying image into account.
EDIT: I think it might be easier to just create a new image based on the current state of the image view (self.imageForEditing). It lies within a container, self.editContainer. So how can I just create a new image based on the current pixels contained in self.editContainer?
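For anyone who wants to apply the logged offset/scale directly instead: mapping the cutout's frame into the image's own pixel coordinates amounts to undoing the view's translation and scale, then converting points to pixels. A minimal, platform-independent sketch of that arithmetic in plain C (all names and values are hypothetical; the real inputs would be the logged frame origin and sqrt(t.a*t.a + t.c*t.c)):

```c
#include <assert.h>

/* Minimal stand-ins for CGPoint/CGRect so the arithmetic runs anywhere. */
typedef struct { double x, y; } Point;
typedef struct { double x, y, w, h; } Rect;

/* Map a cutout rect given in container coordinates into the image's own
 * pixel coordinates by undoing the image view's translation and scale. */
Rect cutoutInImagePixels(Rect cutout, Point viewOrigin,
                         double viewScale, double imageScale) {
    Rect r;
    /* 1. translate into the image view's coordinate space */
    r.x = cutout.x - viewOrigin.x;
    r.y = cutout.y - viewOrigin.y;
    /* 2. undo the pinch/zoom scale applied to the view */
    r.x /= viewScale;
    r.y /= viewScale;
    r.w = cutout.w / viewScale;
    r.h = cutout.h / viewScale;
    /* 3. convert points to pixels (UIImage.scale), suitable for
     *    CGImageCreateWithImageInRect */
    r.x *= imageScale;
    r.y *= imageScale;
    r.w *= imageScale;
    r.h *= imageScale;
    return r;
}
```

The resulting pixel rect could then be handed to CGImageCreateWithImageInRect before masking, though rendering the container view (as in the accepted fix below on the original thread) sidesteps this bookkeeping entirely.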

Finally solved this by rendering the container view as an image, then using a UIImage category I found to trim the transparent pixels. In case it helps someone:
On the container UIView:
UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, 0.0);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return image;
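The trimming category isn't shown above; its core is just a bounding-box scan for non-transparent pixels. Here is a rough, platform-independent sketch of that scan (the real category would first draw the image into a CGBitmapContext to obtain an RGBA buffer, then crop with CGImageCreateWithImageInRect; the buffer here is supplied directly so the logic is testable anywhere):

```c
#include <assert.h>
#include <stddef.h>

typedef struct { int x, y, w, h; } Box;

/* Scan an RGBA8 buffer (4 bytes per pixel, alpha last) for the bounding
 * box of pixels whose alpha is non-zero. Returns a zero box if the image
 * is fully transparent. */
Box opaqueBounds(const unsigned char *rgba, int width, int height) {
    int minX = width, minY = height, maxX = -1, maxY = -1;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            unsigned char alpha = rgba[(size_t)(y * width + x) * 4 + 3];
            if (alpha != 0) {
                if (x < minX) minX = x;
                if (x > maxX) maxX = x;
                if (y < minY) minY = y;
                if (y > maxY) maxY = y;
            }
        }
    }
    if (maxX < 0) { Box empty = {0, 0, 0, 0}; return empty; }
    Box b = { minX, minY, maxX - minX + 1, maxY - minY + 1 };
    return b;
}
```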

Related

Cropping UIImage by custom shape

I have a background UIImage, and I would like to crop the background UIImage with a custom shape so this background image only "appears" through the custom shape. For example, I have a moon-shaped custom shape, and I would like the background image to only come through on the moon-shaped part.
Based on another answer, I am trying to impose a mask on the image like so:
- (UIImage *)createImageFromImage:(UIImage *)image
                    withMaskImage:(UIImage *)mask {
    CGImageRef imageRef = image.CGImage;
    CGImageRef maskRef = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                             CGImageGetHeight(maskRef),
                                             CGImageGetBitsPerComponent(maskRef),
                                             CGImageGetBitsPerPixel(maskRef),
                                             CGImageGetBytesPerRow(maskRef),
                                             CGImageGetDataProvider(maskRef),
                                             NULL,
                                             YES);
    CGImageRef maskedReference = CGImageCreateWithMask(imageRef, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
However, the result has several issues:
The moon image I have is 100×100, but it has been stretched to a strange proportion.
My goal is for the image to show through only the moon shape; instead, the moon appears as a white shape on top of the image.
Any ideas about how I could fix the crop issue would be much appreciated. Thanks!
On your background image you'll have to add a custom mask through CALayer.
Keep in mind, everything you color in the mask.png (moon) will be visible, everything transparent will not display.
UIImage *moonImage = [UIImage imageNamed:@"mask.png"];
CALayer *maskLayer = [CALayer layer];
[maskLayer setContents:(id)moonImage.CGImage];
[maskLayer setFrame:CGRectMake(0.0f, 0.0f, moonImage.size.width, moonImage.size.height)];
UIImageView *yourBackgroundImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 1024, 768)];
// Keep your image view in tact here, just add:
[yourBackgroundImageView.layer setMask:maskLayer];

How to crop an image which is coming inside circle in iOS

I am working on a project where I need to show a screen like the one below.
Here the image should be cropped so that only the part inside the circle is visible. I have tried image masking as below, but it always crops to a square.
- (UIImage *)maskImage1:(UIImage *)image withMask:(UIImage *)mask
{
    CGImageRef imageReference = image.CGImage;
    CGImageRef maskReference = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskReference),
                                             CGImageGetHeight(maskReference),
                                             CGImageGetBitsPerComponent(maskReference),
                                             CGImageGetBitsPerPixel(maskReference),
                                             CGImageGetBytesPerRow(maskReference),
                                             CGImageGetDataProvider(maskReference),
                                             NULL, // decode is NULL
                                             YES   // should interpolate
                                             );
    CGImageRef maskedReference = CGImageCreateWithMask(imageReference, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Please suggest how can I achieve this?
Use this demo to scale and crop an image to a circle: Circle Image Crop.
For masking your image into a circle, use the code below.
// Masking the image
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask); // release to avoid leaking the mask
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return maskedImage;
}
UIGraphicsBeginImageContextWithOptions(hiddenView.bounds.size, self.view.opaque, 0.0);
// Here I take a screenshot of a hiddenView that I added from IB with an
// arbitrary background color (orange in this case). Replace hiddenView
// with the object you want to capture.
[hiddenView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *theImageData = UIImageJPEGRepresentation(theImage, 1.0);
// I placed a UIImageView and set the screenshot into it, simply to cross-check
// that I'm getting the right screenshot. You can remove this line once verified.
imgView.image = [UIImage imageWithData:theImageData];

How to crop an image in a desired shape

I'm trying to develop a simple game, but I came across this problem with my UIImage.
When I import an image, I get an annoying background around the ball, not just the ball itself. So when I play with the ball, the background has to match the UIView color, or else it looks weird. How do I solve this problem?
You can use the following code to get a masked shape image:
// beCroppedImage is the image to be cropped; maskImage is an image with a
// white background and a black shape wherever you want the image to show through.
theShapeImage = [beCroppedImage maskImageWithMask:maskImage];
- (UIImage *)maskImageWithMask:(UIImage *)mask {
    CGImageRef imgRef = [self CGImage];
    CGImageRef maskRef = [mask CGImage];
    CGImageRef actualMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                              CGImageGetHeight(maskRef),
                                              CGImageGetBitsPerComponent(maskRef),
                                              CGImageGetBitsPerPixel(maskRef),
                                              CGImageGetBytesPerRow(maskRef),
                                              CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask(imgRef, actualMask);
    UIImage *image = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    CGImageRelease(actualMask);
    return image;
}

Masking UIImage in ios

I'm trying to create a profile picture with a custom shape pattern using masking in iOS. Here is what I've used to create the masked image:
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef maskedImageRef = CGImageCreateWithMask([image CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef];
    CGImageRelease(mask);
    CGImageRelease(maskedImageRef);
    // returns new image with mask applied
    return maskedImage;
}
After implementing this method, I call it like this (_profileImage is a UIImage instance variable):
UIImage *maskImage = [UIImage imageNamed:@"maskProfilePicture.png"];
UIImage *maskedImage = [self maskImage:_profileImage withMask:maskImage];
And the result (the image should show through the mask, but it seems the mask image overlaps the image that needs to be masked):
The mask image (maybe the mask image properties I set in Photoshop are wrong):
The Core Graphics calls look correct.
Change the mask image to have black where you want the image to show through, and white elsewhere.
You can use CALayer's mask property to achieve the result.
CALayer Apple Documentation
Your masking layer should be transparent in places where you want to cut out the image.
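Note that the two masking approaches in these answers use opposite conventions: CGImageMaskCreate treats the mask as grayscale (black shows the image, white hides it), while CALayer's mask keys on alpha (opaque shows, transparent hides). This illustrative model, in plain C, pins down the two rules (a simplification of how the frameworks actually blend):

```c
#include <assert.h>

/* CGImageMaskCreate semantics: darker mask values let more of the image
 * through -- 0 (black) = fully visible, 255 (white) = fully hidden. */
double cgMaskVisibleFraction(unsigned char gray) {
    return 1.0 - gray / 255.0;
}

/* CALayer.mask semantics: visibility follows the mask's alpha channel --
 * 255 (opaque) = fully visible, 0 (transparent) = fully hidden. */
double caLayerMaskVisibleFraction(unsigned char alpha) {
    return alpha / 255.0;
}
```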

iOS: Masking an image taking the retina scale factor into account

I want to mask an image by passing another image as the mask. I am able to mask the image, but the resulting image doesn't look good: it is jagged at the borders.
I guess the problem is related to retina graphics, since the scale properties of the two images differ:
The image I want to mask has a scale value of 1. This image generally has a resolution greater than 1000x1000 pixels.
The mask image (containing black and white colors only) has a scale value of 2 and is generally 300x300 pixels.
The resulting image has a scale value of 1.
The code I am using is:
+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return maskedImage;
}
How can I get a masked image which follows retina scale?
I had the same issue. It appears that this line ignores the scale factor:
UIImage *maskedImage = [UIImage imageWithCGImage:masked];
So you should draw the image yourself. Replace it with the following:
UIGraphicsBeginImageContextWithOptions(image.size, NO, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
// Core Graphics uses a flipped coordinate system relative to UIKit,
// so flip the context first or the result will be drawn upside down.
CGContextTranslateCTM(context, 0, image.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextDrawImage(context, rect, masked);
UIImage *maskedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Works fine for me.
EDIT
OR
UIImage *maskedImage = [UIImage imageWithCGImage:masked
                                           scale:[[UIScreen mainScreen] scale]
                                     orientation:UIImageOrientationUp];
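For reference, the mismatch behind the jagged edges comes down to point-vs-pixel arithmetic: a UIImage's pixel size is its point size times its scale, and CGImageCreateWithMask stretches the mask's pixel grid over the image's. Using the sizes from the question (the 1200x1200 image figure is hypothetical; the question only says "greater than 1000x1000"):

```c
#include <assert.h>

/* A UIImage's pixel dimension is its point dimension times its scale. */
double pixels(double points, double scale) {
    return points * scale;
}

/* Factor by which the mask's pixel grid is stretched over the image's. */
double maskStretchFactor(double imagePx, double maskPx) {
    return imagePx / maskPx;
}
```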
