This question already has answers here: Draw iOS 7-style squircle programmatically (8 answers). Closed 7 years ago.
Please help me! How do I create a rounded, inflated-square UIImageView? I need to create contact icons like those in the Viber application.
Sample image: https://graphicdesign.stackexchange.com/questions/35579/create-a-rounded-inflated-square-in-illustrator-photoshop
I asked about a rounded INFLATED square, and I think the two questions are different.
The easy way to do it is to use a mask and apply it to your image.
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    // Build a Core Graphics image mask from the mask image's bitmap data.
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, false);

    // Apply the mask: black areas of the mask keep the image, white areas hide it.
    CGImageRef masked = CGImageCreateWithMask(image.CGImage, mask);
    UIImage *result = [UIImage imageWithCGImage:masked];

    // Release the intermediate CGImageRefs so they don't leak.
    CGImageRelease(mask);
    CGImageRelease(masked);
    return result;
}
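For example, assuming the photo and a grayscale squircle-shaped mask are bundled with the app ("contact.png" and "squircle_mask.png" are placeholder asset names), you could apply it like this; with a mask built via CGImageMaskCreate, the black areas of the mask keep the image and the white areas hide it:

UIImage *photo = [UIImage imageNamed:@"contact.png"];              // placeholder asset
UIImage *squircleMask = [UIImage imageNamed:@"squircle_mask.png"]; // placeholder asset
myImageView.image = [self maskImage:photo withMask:squircleMask];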
See this tutorial: http://iosdevelopertips.com/cocoa/how-to-mask-an-image.html#comment-47347
And the Stack Overflow topic about your issue:
How to Mask an UIImageView
Possible duplicate here: Draw iOS 7-style squircle programmatically
Anyway, what you want is a squircle/superellipse. Read up on UIBezierPath and adjust the curves to suit your usage.
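If you would rather not ship a mask image at all, here is a rough sketch of the UIBezierPath route: clip the image view's layer with a CAShapeLayer. Note that bezierPathWithRoundedRect:cornerRadius: only approximates a true superellipse, so tune the radius (or build a custom path) until the curve looks right:

#import <QuartzCore/QuartzCore.h>

// Sketch: clip a UIImageView to a squircle-like shape with a CAShapeLayer mask.
// The 0.3 corner-radius factor is a guess; adjust it (or replace the rounded
// rect with a hand-built superellipse path) to match the exact curve you want.
- (void)applySquircleMaskToImageView:(UIImageView *)imageView {
    CGRect bounds = imageView.bounds;
    UIBezierPath *path =
        [UIBezierPath bezierPathWithRoundedRect:bounds
                                   cornerRadius:bounds.size.width * 0.3];

    CAShapeLayer *maskLayer = [CAShapeLayer layer];
    maskLayer.frame = bounds;
    maskLayer.path = path.CGPath;
    imageView.layer.mask = maskLayer;
}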
How can I specify a mask image color for extracting a part of an image?
I have found this code:
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage
{
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
As the Apple documentation says:
image
The image to apply the mask parameter to. This image must not be an image mask and may not have an image mask or masking color associated with it.
mask
A mask. If the mask is an image, it must be in the DeviceGray color space, must not have an alpha component, and may not itself be masked by an image mask or a masking color. If the mask is not the same size as the image specified by the image parameter, then Quartz scales the mask to fit the image.
I have a blue source image:
and I have an image with specified colors:
So, based on these colors, I want to extract parts of the source image, and it would also be great if I could detect the center coordinates of the extracted parts within the source image. Right now the problem is that, using DeviceGray colors, I need one source image and five mask images; instead, I want to specify the color I want to extract and use only two images: the source and the "mask" image.
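One direction for the single-"mask"-image idea, sketched below and not tested against the images above: render the colored mask image into an RGBA buffer, keep only the pixels that match the chosen color, and turn the result into a Core Graphics image mask that CGImageCreateWithMask can use. The RGBA pixel layout, the assumption that the mask image is fully opaque, and the 10% color tolerance are all guesses to adapt:

// Frees the grayscale buffer once Core Graphics is done with it.
static void releaseGrayBuffer(void *info, const void *data, size_t size) {
    free((void *)data);
}

// Rough sketch: build an image mask from a colored "regions" image, keeping
// only the pixels that match `target` within a small tolerance.
// Caller releases the returned mask with CGImageRelease.
- (CGImageRef)createMaskFromImage:(UIImage *)regions matchingColor:(UIColor *)target {
    CGImageRef src = regions.CGImage;
    size_t width  = CGImageGetWidth(src);
    size_t height = CGImageGetHeight(src);

    // Draw the colored image into an RGBA buffer so we can inspect its pixels.
    uint8_t *rgba = calloc(width * height * 4, 1);
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(rgba, width, height, 8, width * 4, rgb,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), src);

    CGFloat tr, tg, tb, ta;
    [target getRed:&tr green:&tg blue:&tb alpha:&ta];

    // 0 (black) keeps a pixel, 255 (white) drops it, following the image-mask
    // convention used by CGImageCreateWithMask.
    uint8_t *gray = malloc(width * height);
    for (size_t i = 0; i < width * height; i++) {
        BOOL match = fabs(rgba[i * 4]     / 255.0 - tr) < 0.1 &&
                     fabs(rgba[i * 4 + 1] / 255.0 - tg) < 0.1 &&
                     fabs(rgba[i * 4 + 2] / 255.0 - tb) < 0.1;
        gray[i] = match ? 0 : 255;
    }

    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, gray, width * height, releaseGrayBuffer);
    CGImageRef mask = CGImageMaskCreate(width, height, 8, 8, width,
                                        provider, NULL, false);

    CGDataProviderRelease(provider);
    CGContextRelease(ctx);
    CGColorSpaceRelease(rgb);
    free(rgba);
    return mask;
}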
In one of my iOS applications, I am trying to cut a portion of an image using CGImageMask. I have succeeded in masking the image with the following code:
- (UIImage *)maskImage:(UIImage *)referenceImage withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, false);
    CGImageRef masked = CGImageCreateWithMask([referenceImage CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
So, my image will be:
myImageView.image = [self maskImage:[UIImage imageNamed:@"image.png"]
                           withMask:[UIImage imageNamed:@"mask.png"]];
Problem:
The output image is the same size as the reference image ('image.png'), with a transparent area around it. But I want to avoid that transparent area and crop the resulting image. How can I achieve this? There are several masks, and the mask frames are not the same for all of them. I am attaching a reference image of the problem overview here. Please help me, friends. Thanks in advance.
Look up auto-cropping a UIImage. This should crop out anything transparent.
How do I autocrop a UIImage?
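For reference, here is a rough sketch of that approach: scan the alpha channel for the bounding box of non-transparent pixels, then crop to it with CGImageCreateWithImageInRect. The RGBA buffer layout is an assumption, and very large images may need a less memory-hungry variant:

// Sketch: crop away fully transparent borders by finding the bounding box of
// all pixels with non-zero alpha.
- (UIImage *)autocropTransparentBorders:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Render into an RGBA buffer so the alpha of each pixel can be read.
    uint8_t *pixels = calloc(width * height * 4, 1);
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, width * 4, space,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);

    size_t minX = width, minY = height, maxX = 0, maxY = 0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            uint8_t alpha = pixels[(y * width + x) * 4 + 3];
            if (alpha > 0) {
                if (x < minX) minX = x;
                if (x > maxX) maxX = x;
                if (y < minY) minY = y;
                if (y > maxY) maxY = y;
            }
        }
    }

    CGContextRelease(ctx);
    CGColorSpaceRelease(space);
    free(pixels);

    if (minX > maxX || minY > maxY) {
        return image; // fully transparent, nothing to crop
    }

    CGRect cropRect = CGRectMake(minX, minY, maxX - minX + 1, maxY - minY + 1);
    CGImageRef cropped = CGImageCreateWithImageInRect(cgImage, cropRect);
    UIImage *result = [UIImage imageWithCGImage:cropped
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    CGImageRelease(cropped);
    return result;
}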
This question already has answers here: Cropping an UIImage (25 answers). Closed 8 years ago.
I'm making an application that needs to crop an image to another image.
I want to crop the source image (the green rectangle) down to the destination image (the white rectangle). I can get the sizes of the source and destination images and the x and y offsets. How can I get that cropped image and save it to the library?
You can see the attached image here:
How can I crop to that image? If you can, please give me some example source code.
Thanks so much.
Use this method and pass the image and rect as parameters. You can specify the x and y offsets in cropRect:
- (UIImage *)cropImage:(UIImage *)image rect:(CGRect)cropRect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return croppedImage;
}
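For example, with your measured offsets (the offsets, crop size, and asset name below are placeholders) you could call it and save the result to the photo library like this:

CGFloat offsetX = 40.0, offsetY = 60.0;                       // placeholder offsets
UIImage *sourceImage = [UIImage imageNamed:@"source.png"];    // placeholder asset
CGRect cropRect = CGRectMake(offsetX, offsetY, 100.0, 100.0); // placeholder crop size
UIImage *croppedImage = [self cropImage:sourceImage rect:cropRect];

// Save the cropped image to the photo library (needs the usual photo permission).
UIImageWriteToSavedPhotosAlbum(croppedImage, nil, NULL, NULL);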
Check the code below:
- (UIImage *)imageWithImageSimple:(UIImage *)image scaledToSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
Check the links below for reference:
http://code4app.net/ios/Image-crop-demo/501e1f3f6803faea5d000000
http://code4app.net/ios/Photo-Cropper-View-Controller/4f95519c06f6e7d870000000
http://code4app.net/ios/Image-Cropper/4f8cc87f06f6e7d86c000000
http://code4app.net/ios/Simple-Image-Editor-View/4ff2af4c6803fa381b000000
Get the sample code from there, then customize it for your own needs.
I am trying to apply a layer mask to a UIImage using CGImageCreateWithMask, but the mask over the image is coming out at very low resolution and appears blocky.
This is the code I'm trying (link to GitHub), but I don't know what's wrong.
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, true);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
Here's what it looks like in the end.
My mask is a transparent PNG with an alpha channel (the circle in the screen-shot).
I don't know what I'm doing wrong.
Thanks for any/all help.
I'm guessing your mask is getting resized to fit the passed image, and that is what causes the jagged edges.
From the docs for CGImageCreateWithMask:
If the mask is not the same size as the image specified by the image parameter, then Quartz scales the mask to fit the image.
If that is not the case, check the scale properties of the image and the mask.
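If the scale does turn out to be the mismatch, a variant of your maskImage:withMask: that carries the UIImage's scale and orientation through (and releases the intermediate CGImageRefs) is sketched below; whether it fixes the blockiness depends on your assets:

// Sketch: same as the question's method, but the result keeps the source
// image's scale and orientation, so a @2x/@3x image is not treated as 1x
// (which shows up as low-resolution, blocky output).
- (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, true);
    CGImageRef masked = CGImageCreateWithMask(image.CGImage, mask);
    UIImage *result = [UIImage imageWithCGImage:masked
                                          scale:image.scale
                                    orientation:image.imageOrientation];
    CGImageRelease(mask);
    CGImageRelease(masked);
    return result;
}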
I was never able to figure this out.
It was probably something wrong with how the iOS runtime was interpreting the alpha PNG I was using. After saving the mask as a JPG, everything started working without any code changes.
I find this very strange, since the exact same mask files worked on both Android and Windows Phone.
I assumed that my PNG files should work (as documented), but there must have been an encoding setting that was incompatible with Core Graphics. (In hindsight, this lines up with the documentation quoted earlier: a mask image must be grayscale with no alpha component, which a JPEG satisfies and an alpha PNG does not.)
Situation: I want to apply interesting photo frames to images, and the photo frame is implemented as a layer mask. Is it possible to build the layer mask dynamically by loading a photo frame template from outside the Objective-C code, so that I can change the frame layer without ever touching the code?
The end result will be something like this: http://a3.mzstatic.com/us/r1000/106/Purple/9e/b9/9b/mzl.rdrrpcgr.320x480-75.jpg, except that the photo edge/frame is loaded dynamically from outside the app rather than built into the app.
Ideally, I would like to create a photo frame in Photoshop as a PNG file where black pixels allow full transparency, and then load this photo frame in the iOS app so that the frame layer lets the layer underneath show through fully wherever the mask layer is black.
+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef),
                                        NULL, false);
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    return [UIImage imageWithCGImage:masked];
}
Then you could use this method with a UIImage you load from a URL.
The URL could serve a different UIImage, or could take a parameter for which mask image to load. Does this answer your question?
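For example (the URL is a placeholder, and a real app should download asynchronously with NSURLSession instead of blocking on dataWithContentsOfURL:):

// Usage sketch: fetch a frame/mask PNG from a server and apply it to a local photo.
NSURL *maskURL = [NSURL URLWithString:@"https://example.com/frames/frame1.png"]; // placeholder URL
NSData *maskData = [NSData dataWithContentsOfURL:maskURL];
UIImage *frameMask = [UIImage imageWithData:maskData];

UIImage *photo = [UIImage imageNamed:@"photo.png"]; // placeholder asset
// MyImageUtils stands in for whatever class defines the + maskImage:withMask: method above.
UIImage *framedPhoto = [MyImageUtils maskImage:photo withMask:frameMask];
myImageView.image = framedPhoto;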