Image Blur/Quality Loss when Masking in iOS

I am masking an image (left) with this function:
- (UIImage *)maskWithMask:(UIImage *)maskImage
{
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([self CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    UIGraphicsBeginImageContextWithOptions(maskedImage.size, NO, 0.0);
    [maskedImage drawAtPoint:CGPointZero];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
After the mask is applied, the resulting image appears to have lost quality / blurred slightly, as shown on the right of the photo. I cannot seem to figure out why. I know it isn't a major loss, but it's enough to notice on a retina display, which is what I am developing on. Any thoughts?

Related

Memory Increase When Masking Image

I am using the following code to create a masked UIImage. However, I am finding that when I run the code multiple times, the memory increases and is not released. Can someone see where there may be a leak?
- (UIImage *)processImage:(UIImage *)sourceImage maskImage:(UIImage *)maskImage {
    UIImage *editedImage = nil;
    UIImage *mask = [self createMaskImage:maskImage canvasImage:sourceImage maskWidth:50 maskHeight:50];
    editedImage = [self maskImage:sourceImage withMask:mask];
    return editedImage;
}
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
Since you are using Core Graphics, ARC will not manage these objects for you.
You are creating (allocating) an image with CGImageCreateWithMask, so you need to release it yourself with CGImageRelease.
From the Apple documentation:
An image created by masking image with mask. You are responsible for releasing this object by calling CGImageRelease.
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    // The mask created by CGImageMaskCreate must be released as well
    CGImageRelease(mask);
    // Create a UIImage from the CGImageRef
    UIImage *myImage = [UIImage imageWithCGImage:masked];
    // Release the masked CGImage; the UIImage keeps its own reference
    CGImageRelease(masked);
    return myImage;
}
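As a quick sanity check, here is a hedged usage sketch (sourceImage and maskImage stand for whatever images you already have): calling the corrected method repeatedly inside an @autoreleasepool keeps the autoreleased UIImage wrappers from piling up between runs.
for (NSInteger i = 0; i < 100; i++) {
    @autoreleasepool {
        // Each iteration's temporary UIImage objects are drained here,
        // so memory should now stay flat across repeated calls.
        UIImage *result = [self maskImage:sourceImage withMask:maskImage];
        (void)result; // use the result as needed
    }
}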

How to crop an image so that it shows inside a circle in iOS

I am working on a project where I need to show a screen like the one below.
The image should be cropped so that only the part inside the circle is visible. I have tried image masking as below, but it always crops to a square.
- (UIImage *)maskImage1:(UIImage *)image withMask:(UIImage *)mask
{
    CGImageRef imageReference = image.CGImage;
    CGImageRef maskReference = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskReference),
                                             CGImageGetHeight(maskReference),
                                             CGImageGetBitsPerComponent(maskReference),
                                             CGImageGetBitsPerPixel(maskReference),
                                             CGImageGetBytesPerRow(maskReference),
                                             CGImageGetDataProvider(maskReference),
                                             NULL, // Decode is null
                                             YES   // Should interpolate
                                             );
    CGImageRef maskedReference = CGImageCreateWithMask(imageReference, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Please suggest how I can achieve this.
Use this demo to scale and crop an image to a circle: Circle Image Crop.
For masking your image into a circle, use the code below.
// Masking the image
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    // Release the Core Graphics objects; the UIImage keeps its own reference
    CGImageRelease(mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return maskedImage;
}
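The method above assumes you already have a circular mask image. Here is a minimal sketch of building one with UIKit drawing (the method name circularMaskWithSize: is illustrative, not part of the original answer):
- (UIImage *)circularMaskWithSize:(CGSize)size {
    // Opaque context so the resulting mask image has no alpha channel
    UIGraphicsBeginImageContextWithOptions(size, YES, 0.0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // For CGImageMaskCreate, white areas hide the image and black areas show it
    CGContextSetFillColorWithColor(ctx, [UIColor whiteColor].CGColor);
    CGContextFillRect(ctx, CGRectMake(0, 0, size.width, size.height));
    CGContextSetFillColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextFillEllipseInRect(ctx, CGRectMake(0, 0, size.width, size.height));
    UIImage *maskImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return maskImage;
}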
// Take a screenshot of hiddenView (a view added from IB with any background color,
// orange in this case). Replace hiddenView with the object you want to capture.
UIGraphicsBeginImageContextWithOptions(hiddenView.bounds.size, self.view.opaque, 0.0);
[hiddenView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *theImageData = UIImageJPEGRepresentation(theImage, 1.0);
// Place the screenshot into a UIImageView simply to cross-check that the capture
// is correct; you can remove this line once you have verified it.
imgView.image = [UIImage imageWithData:theImageData];
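Tying the two pieces together, a hedged usage example (circularMaskWithSize: is the illustrative helper sketched above, not part of the original answer):
// Feed the screenshot through the masking method, using a circular mask
// sized to match the captured image.
UIImage *circleMask = [self circularMaskWithSize:theImage.size];
imgView.image = [self maskImage:theImage withMask:circleMask];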

How to get UIBezierPath of shape in UIImage or crop UIImage in a certain shape

I am new to iOS. I want to know whether I can get the UIBezierPath of a UIImage. I have a UIImage of a face layout and want to get its UIBezierPath, which would help me crop the UIImage.
Or, can anyone tell me about other ways of cropping UIImages? The crop has to be a custom shape (like a face, heart, etc.), not a rectangle.
Here is the code to mask an image with another image:
- (UIImage *)createImageFromImage:(UIImage *)image
                    withMaskImage:(UIImage *)mask {
    CGImageRef imageRef = image.CGImage;
    CGImageRef maskRef = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                             CGImageGetHeight(maskRef),
                                             CGImageGetBitsPerComponent(maskRef),
                                             CGImageGetBitsPerPixel(maskRef),
                                             CGImageGetBytesPerRow(maskRef),
                                             CGImageGetDataProvider(maskRef),
                                             NULL,
                                             YES);
    CGImageRef maskedReference = CGImageCreateWithMask(imageRef, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Usage:
UIImage *image = [UIImage imageNamed:@"Photo.png"];
UIImage *mask = [UIImage imageNamed:@"Mask.png"];
self.imageView.image = [self createImageFromImage:image
                                    withMaskImage:mask];
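If you specifically want to crop with a UIBezierPath rather than a mask image, a minimal sketch (using an oval path purely as a stand-in for whatever custom path you construct) could look like this:
UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
// Any UIBezierPath works here; an oval is used only as an example shape.
UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:
                         CGRectMake(0, 0, image.size.width, image.size.height)];
[path addClip];                  // clip all subsequent drawing to the path
[image drawAtPoint:CGPointZero]; // pixels outside the path stay transparent
UIImage *clippedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();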

How to crop an image in a desired shape

I'm trying to develop a simple game, but I came across this problem with my UIImage.
When I import an image, I get this annoying background around the ball rather than just the ball itself. So when I play with the ball, the background has to match the UIView color, or else it looks weird. How do I solve this problem?
You can use the following code to get a shaped image from a mask:
// beCroppedImage is the image to be cropped; maskImage is an image with a white
// background and a black shape of whatever outline you want.
theShapeImage = [beCroppedImage maskImageWithMask:maskImage];
- (UIImage *)maskImageWithMask:(UIImage *)mask {
    CGImageRef imgRef = [self CGImage];
    CGImageRef maskRef = [mask CGImage];
    CGImageRef actualMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                              CGImageGetHeight(maskRef),
                                              CGImageGetBitsPerComponent(maskRef),
                                              CGImageGetBitsPerPixel(maskRef),
                                              CGImageGetBytesPerRow(maskRef),
                                              CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask(imgRef, actualMask);
    UIImage *image = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    CGImageRelease(actualMask);
    return image;
}

iOS: Masking an image taking the retina scale factor into account

I want to mask an image by passing another image as the mask. I am able to mask the image, but the resulting image doesn't look good: it is jagged at the borders.
I guess the problem is related to retina graphics. The scale properties of the two images differ:
The image I want to mask has a scale of 1. This image generally has a resolution greater than 1000x1000 pixels.
The mask image (which contains only black and white) has a scale of 2. It is generally around 300x300 pixels.
The resulting image has a scale of 1.
The code I am using is:
+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return maskedImage;
}
How can I get a masked image that respects the retina scale?
I had the same issue. It appears that this line ignores the scale factor:
UIImage *maskedImage = [UIImage imageWithCGImage:masked];
So you should draw the image yourself. Replace it with the following:
UIGraphicsBeginImageContextWithOptions(image.size, NO, 0.0);
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
CGContextDrawImage(context, rect, masked);
UIImage * maskedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Works fine for me.
EDIT
OR
UIImage *maskedImage = [UIImage imageWithCGImage:masked
                                            scale:[[UIScreen mainScreen] scale]
                                      orientation:UIImageOrientationUp];
You can do:
UIImage *maskedImage = [UIImage imageWithCGImage:masked scale:[[UIScreen mainScreen] scale] orientation:UIImageOrientationUp];
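Putting either variant back into the original method, a sketch of the whole thing might look like this (using the screen scale, as in the answers above):
+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);
    // Preserve the retina scale instead of defaulting to a scale of 1
    UIImage *maskedImage = [UIImage imageWithCGImage:masked
                                               scale:[[UIScreen mainScreen] scale]
                                         orientation:UIImageOrientationUp];
    CGImageRelease(masked);
    return maskedImage;
}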
