Memory Increase When Masking Image - iOS

I am using the following code to create a masked UIImage. However, I am finding that when I run the code multiple times, memory keeps increasing and is never released. Can someone see where there may be a leak?
- (UIImage *)processImage:(UIImage *)sourceImage maskImage:(UIImage *)maskImage {
    UIImage *editedImage = nil;
    UIImage *mask = [self createMaskImage:maskImage canvasImage:sourceImage maskWidth:50 maskHeight:50];
    editedImage = [self maskImage:sourceImage withMask:mask];
    return editedImage;
}

- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}

Since these are Core Graphics objects, ARC does not manage them for you.
You are creating an image with CGImageCreateWithMask, so you are responsible for releasing it with CGImageRelease.

From the Apple documentation for CGImageCreateWithMask:
"An image created by masking image with mask. You are responsible for releasing this object by calling CGImageRelease."
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    // The image mask from CGImageMaskCreate must be released as well
    CGImageRelease(mask);
    // Create a UIImage from the CGImageRef
    UIImage *myImage = [UIImage imageWithCGImage:masked];
    // Release the masked CGImage now that the UIImage holds its own reference
    CGImageRelease(masked);
    return myImage;
}
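If you run this repeatedly (for example, processing many images in a loop), keep in mind that the UIImage returned by imageWithCGImage: is autoreleased and is only freed when the autorelease pool drains. A minimal sketch of wrapping each pass in an @autoreleasepool; the loop and the sourceImages / maskTemplate names are illustrative, not part of the original code:

NSMutableArray<UIImage *> *results = [NSMutableArray array];
// `sourceImages` (NSArray<UIImage *> *) and `maskTemplate` (UIImage *) are
// illustrative stand-ins for however you obtain the inputs.
for (UIImage *source in sourceImages) {
    @autoreleasepool {
        // processImage:maskImage: is the method from the question above.
        UIImage *edited = [self processImage:source maskImage:maskTemplate];
        if (edited != nil) {
            [results addObject:edited];
        }
    } // autoreleased intermediates are drained at the end of every iteration
}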

Related

How to crop the part of an image that is visible inside a circle in iOS

I am working on a project where I need to show a screen like the one below.
The image should be cropped so that only the part visible inside the circle remains. I have tried image masking as shown below, but it always crops to a square.
- (UIImage *)maskImage1:(UIImage *)image withMask:(UIImage *)mask
{
    CGImageRef imageReference = image.CGImage;
    CGImageRef maskReference = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskReference),
                                             CGImageGetHeight(maskReference),
                                             CGImageGetBitsPerComponent(maskReference),
                                             CGImageGetBitsPerPixel(maskReference),
                                             CGImageGetBytesPerRow(maskReference),
                                             CGImageGetDataProvider(maskReference),
                                             NULL, // decode is NULL
                                             YES   // should interpolate
                                             );
    CGImageRef maskedReference = CGImageCreateWithMask(imageReference, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Please suggest how I can achieve this.
Use this demo to scale and crop an image to a circle: Circle Image Crop
To mask your image into a circle, use the code below.
// Masking the image
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    // Release the Core Graphics objects to avoid leaking them
    CGImageRelease(mask);
    CGImageRelease(masked);
    return maskedImage;
}
// Take a screenshot of hiddenView (a view added from IB with any background color, orange in this case).
// Replace hiddenView with whatever object you want to capture.
UIGraphicsBeginImageContextWithOptions(hiddenView.bounds.size, self.view.opaque, 0.0);
[hiddenView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *theImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *theImageData = UIImageJPEGRepresentation(theImage, 1.0);
// Place the screenshot into a UIImageView simply to verify that the capture looks right;
// you can remove this line once you have confirmed you are getting the correct screenshot.
imgView.image = [UIImage imageWithData:theImageData];
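If all you need is a circular crop, another option is to skip the mask image entirely and clip the drawing context with a circular UIBezierPath. This is a minimal sketch rather than the answer's original approach; the method name is illustrative and it assumes you want the largest centered circle that fits the image:

- (UIImage *)circularImageFromImage:(UIImage *)image
{
    // Use the largest square that fits inside the image so the clip is a true circle.
    CGFloat side = MIN(image.size.width, image.size.height);
    CGRect squareRect = CGRectMake(0, 0, side, side);

    UIGraphicsBeginImageContextWithOptions(squareRect.size, NO, image.scale);
    // Everything drawn after addClip is limited to the circular path.
    [[UIBezierPath bezierPathWithOvalInRect:squareRect] addClip];
    // Center the image inside the square before drawing it.
    CGFloat dx = (side - image.size.width) / 2.0;
    CGFloat dy = (side - image.size.height) / 2.0;
    [image drawInRect:CGRectMake(dx, dy, image.size.width, image.size.height)];
    UIImage *circularImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return circularImage;
}

The area outside the circle comes back transparent, so the result can be placed over any background.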

How to get UIBezierPath of shape in UIImage or crop UIImage in a certain shape

I am new to iOS, and I want to know whether I can get the UIBezierPath of a UIImage. I have a UIImage of a face layout and want to get its UIBezierPath, which would help me crop the UIImage.
Alternatively, can anyone tell me about other ways of cropping UIImages? The crop must be to a custom shape (like a face, a heart, etc.), not a rectangle.
Here is the code to mask an image with another image:
- (UIImage *)cerateImageFromImage:(UIImage *)image
                    withMaskImage:(UIImage *)mask {
    CGImageRef imageRef = image.CGImage;
    CGImageRef maskRef = mask.CGImage;
    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                             CGImageGetHeight(maskRef),
                                             CGImageGetBitsPerComponent(maskRef),
                                             CGImageGetBitsPerPixel(maskRef),
                                             CGImageGetBytesPerRow(maskRef),
                                             CGImageGetDataProvider(maskRef),
                                             NULL,
                                             YES);
    CGImageRef maskedReference = CGImageCreateWithMask(imageRef, imageMask);
    CGImageRelease(imageMask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Usage:
UIImage *image = [UIImage imageNamed:@"Photo.png"];
UIImage *mask = [UIImage imageNamed:@"Mask.png"];
self.imageView.image = [self cerateImageFromImage:image
                                    withMaskImage:mask];
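If you do not have a pre-made mask image, a simpler alternative for custom shapes is to clip an image context with a UIBezierPath and redraw the image; everything outside the path comes out transparent. A minimal sketch, assuming the path is already expressed in the image's coordinate space (the method name is illustrative):

- (UIImage *)imageByCroppingImage:(UIImage *)image toPath:(UIBezierPath *)path
{
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    // Restrict all subsequent drawing to the interior of the path.
    [path addClip];
    [image drawAtPoint:CGPointZero];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return croppedImage;
}

For a face or heart shape you would build the UIBezierPath yourself (or load it from a vector asset); automatically extracting a path from a raster UIImage is a much harder edge-detection problem, which is why a hand-built path or a mask image is the usual route.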

Masking changes colors of UIImage - iOS

Here is what I am doing to mask a UIImage dynamically. It works, but for some reason the colors of the output image are not the same as the original. What could be causing this? Thanks.
- (void)setClippingPath:(UIBezierPath *)clippingPath :(UIImageView *)imgView {
    CAShapeLayer *maskLayer = [CAShapeLayer layer];
    maskLayer.frame = self.imgView.frame;
    maskLayer.path = [clippingPath CGPath];
    maskLayer.fillColor = [[UIColor whiteColor] CGColor];
    maskLayer.backgroundColor = [[UIColor clearColor] CGColor];
    self.imgView.image = [self maskImage:self.imgView.image withClippingMask:[self imageFromLayer:maskLayer]];
}

- (UIImage *)imageFromLayer:(CALayer *)layer
{
    UIGraphicsBeginImageContext([layer frame].size);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}

- (UIImage *)maskImage:(UIImage *)image withClippingMask:(UIImage *)maskImage
{
    CGImageRef maskRef = image.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef maskedImageRef = CGImageCreateWithMask([maskImage CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef];
    CGImageRelease(mask);
    CGImageRelease(maskedImageRef);
    // returns a new image with the mask applied
    return maskedImage;
}
Original Image
Mask
Output Image
The documentation for CGImageMaskCreate mentions:
When you draw into a context with a bitmap image mask, Quartz uses the mask to determine where and how the current fill color is applied to the image rectangle.
So if you just want to replace the black with white, you should be able to set the context fill color before creating the mask:
- (UIImage *)maskImage:(UIImage *)image withClippingMask:(UIImage *)maskImage
{
    CGImageRef maskRef = image.CGImage;
    CGContextSetFillColorWithColor(UIGraphicsGetCurrentContext(), [UIColor whiteColor].CGColor);
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef maskedImageRef = CGImageCreateWithMask([maskImage CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:maskedImageRef];
    CGImageRelease(mask);
    CGImageRelease(maskedImageRef);
    // returns a new image with the mask applied
    return maskedImage;
}
You might also want to switch your mask to a more basic greyscale image.
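If you would rather generate such a greyscale mask at runtime than ship a separate asset, one option is to redraw the rendered mask into a device-gray bitmap context. This is a sketch of that idea rather than part of the original answer; the method name is illustrative and the caller owns the returned CGImage:

- (CGImageRef)newGrayscaleMaskFromImage:(UIImage *)maskImage
{
    CGImageRef source = maskImage.CGImage;
    size_t width = CGImageGetWidth(source);
    size_t height = CGImageGetHeight(source);

    // 8 bits per component, no alpha, device gray color space.
    CGColorSpaceRef graySpace = CGColorSpaceCreateDeviceGray();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0,
                                                 graySpace, (CGBitmapInfo)kCGImageAlphaNone);
    CGColorSpaceRelease(graySpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), source);
    CGImageRef grayImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    return grayImage; // caller is responsible for calling CGImageRelease()
}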

CGImageRef or CGImageMaskCreate performance issue

Wondering if you have ever faced a scrolling performance issue while using code like the following.
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage
{
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
OK, so the code above does image masking, which I found on this site:
http://iosdevelopertips.com/cocoa/how-to-mask-an-image.html#comment-47347
I have successfully integrated it into my project, but it causes poor performance while scrolling the table view.
I suspect this is because every cell re-renders the image mask each time it appears, which causes the poor performance.
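One way to confirm (and fix) that suspicion is to perform the masking once per image and cache the result, so cells only pay the Core Graphics cost on first appearance. A rough sketch assuming an NSCache keyed by something that identifies the image; the method and key names are illustrative:

- (UIImage *)cachedMaskedImageForKey:(NSString *)key
                              source:(UIImage *)source
                                mask:(UIImage *)maskImage
{
    static NSCache *maskedImageCache;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        maskedImageCache = [[NSCache alloc] init];
    });

    UIImage *cached = [maskedImageCache objectForKey:key];
    if (cached != nil) {
        return cached;
    }

    // Fall back to the expensive masking only on a cache miss.
    UIImage *masked = [self maskImage:source withMask:maskImage];
    if (masked != nil) {
        [maskedImageCache setObject:masked forKey:key];
    }
    return masked;
}

You could also move the masking onto a background queue with dispatch_async and assign the result back on the main thread, but caching alone usually removes most of the stutter.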

Image Blur/Quality Loss when Masking in iOS

I am masking an image (left) with this function
- (UIImage *)maskWithMask:(UIImage *)maskImage
{
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([self CGImage], mask);
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);

    UIGraphicsBeginImageContextWithOptions(maskedImage.size, NO, 0.0);
    [maskedImage drawAtPoint:CGPointZero];
    UIImage *newImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImg;
}
After the mask, the resulting image appears to have lost quality and is slightly blurred, as shown on the right of the photo. I cannot seem to figure out why. I know it isn't a major loss, but it's enough to notice on a Retina display, which is what I am developing on. Any thoughts?
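One likely cause is that imageWithCGImage: ignores the receiver's scale, so on a Retina device the masked bitmap comes back as a 1x image and is then upscaled when it is redrawn into the @2x context, producing exactly this kind of slight softness. Below is a sketch of the same method (assumed, like the original, to live in a UIImage category) that preserves scale and orientation, and also releases the mask, which the original leaks:

- (UIImage *)maskWithMask:(UIImage *)maskImage
{
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);
    CGImageRef masked = CGImageCreateWithMask([self CGImage], mask);
    CGImageRelease(mask);

    // Keep the original scale so a @2x source stays @2x instead of being
    // treated as a 1x image at twice the point size.
    UIImage *maskedImage = [UIImage imageWithCGImage:masked
                                               scale:self.scale
                                         orientation:self.imageOrientation];
    CGImageRelease(masked);
    return maskedImage;
}

If the scale is preserved this way, the extra redraw through UIGraphicsBeginImageContextWithOptions should no longer be needed at all.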
