Current code:
self.backgroundImageView.image = [self.message imageOfSize:self.message.size]; // Random image, random size
UIImage *rightBubbleBackground = [[UIImage imageNamed:@"BubbleRight"]
resizableImageWithCapInsets:BubbleRightCapInsets
resizingMode:UIImageResizingModeStretch];
CALayer *mask = [CALayer layer];
mask.contents = (id)[rightBubbleBackground CGImage];
mask.frame = self.backgroundImageView.layer.frame;
self.backgroundImageView.layer.mask = mask;
self.backgroundImageView.layer.masksToBounds = YES;
This does not work properly. Though the mask is applied, the rightBubbleBackground does not resize correctly to fit self.backgroundImageView, even though it has resizing cap insets (BubbleRightCapInsets) set.
Original Image:
Mask image (rightBubbleBackground):
Result:
I found this answer, but it only works for symmetrical images. Maybe I could modify that answer for my use.
I was wrong: that answer can be modified to work for asymmetrical images. I worked on that answer a bit and solved my own problem.
The following code made my cap insets work for the mask layer:
mask.contentsCenter = CGRectMake(BubbleRightCapInsets.left / rightBubbleBackground.size.width,
                                 BubbleRightCapInsets.top / rightBubbleBackground.size.height,
                                 1.0 / rightBubbleBackground.size.width,
                                 1.0 / rightBubbleBackground.size.height);
Result:
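For reference, here is the complete mask setup with the contentsCenter fix folded in - a sketch assuming the same BubbleRightCapInsets and view names as above (the contentsScale line anticipates the Retina issue discussed in the next answer):
UIImage *rightBubbleBackground = [[UIImage imageNamed:@"BubbleRight"]
                                  resizableImageWithCapInsets:BubbleRightCapInsets
                                  resizingMode:UIImageResizingModeStretch];
CALayer *mask = [CALayer layer];
mask.contents = (id)rightBubbleBackground.CGImage;
mask.frame = self.backgroundImageView.bounds; // the mask lives in the masked layer's coordinate space, so bounds rather than frame
mask.contentsScale = [UIScreen mainScreen].scale; // avoid a pixelated mask on Retina
mask.contentsCenter = CGRectMake(BubbleRightCapInsets.left / rightBubbleBackground.size.width,
                                 BubbleRightCapInsets.top / rightBubbleBackground.size.height,
                                 1.0 / rightBubbleBackground.size.width,
                                 1.0 / rightBubbleBackground.size.height);
self.backgroundImageView.layer.mask = mask;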
I had (part of) the same problem - i.e. the pixelated layer contents. For me it was solved by setting the contentsScale value on the CALayer - for some reason the default scale on CALayers is always 1.0, even on Retina devices.
i.e.
layer.contentsScale = [UIScreen mainScreen].scale;
Also, if you're drawing a shape using CAShapeLayer and wondering why its edges look a little jagged on Retina devices, try:
shapeLayer.rasterizationScale = [UIScreen mainScreen].scale;
shapeLayer.shouldRasterize = YES;
I have a UIImageView which, obviously, contains a UIImage. I want to create a shadow effect only on the UIImage. My problem is that I cannot get the CGRect of the UIImage inside the UIImageView, so I cannot apply the shadow effect to it using the following method.
mImageView.layer.shadowColor = [UIColor grayColor].CGColor;
mImageView.layer.shadowOffset = CGSizeMake(0.0f, 0.0f);
mImageView.layer.shadowOpacity = 0.9f;
mImageView.layer.masksToBounds = NO;
CGRect imageFrame = mImageView.frame;
UIEdgeInsets shadowInsets = UIEdgeInsetsMake(0, 0, -1.5f, 0);
UIBezierPath *shadowPath = [UIBezierPath bezierPathWithRect:UIEdgeInsetsInsetRect(imageFrame, shadowInsets)];
mImageView.layer.shadowPath = shadowPath.CGPath;
Please consider the image attached for this problem.
The problem is also critical because the UIImage can have arbitrary dimensions, since it is a cropped image, as you can see in the attached picture.
The UIImageView's bounds are equal to the view's bounds here. So when applying the effect using the method above, it creates a UIBezierPath around the whole UIImageView, not just the UIImage. As shown in the method, I cannot get the exact CGRect of the UIImage.
Any solution? What am I missing?
cropped image
UIImage is always rectangular, and so is UIImageView. I believe you want to put a shadow only around the jagged border of the cropped area, right? If that is the case, you cannot use this method. You need Core Graphics (or something similar) to get the effect you want. For example, you can create an in-memory copy of the image, blacken it, blur it, and paste it behind your image to create a shadowy effect.
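A minimal sketch of that idea using Core Graphics, assuming image is the cropped UIImage and the offset/blur values are arbitrary: once a shadow is set on the context, Core Graphics applies it to the alpha channel of whatever is drawn next, so the shadow hugs the jagged cropped border instead of a rectangle.
- (UIImage *)imageWithSoftShadow:(UIImage *)image
{
    CGFloat blur = 6.0; // hypothetical blur radius
    CGSize padded = CGSizeMake(image.size.width + 2.0 * blur,
                               image.size.height + 2.0 * blur); // leave room for the shadow
    UIGraphicsBeginImageContextWithOptions(padded, NO, 0.0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetShadowWithColor(ctx, CGSizeMake(0.0, 1.5), blur, [UIColor grayColor].CGColor);
    [image drawInRect:CGRectMake(blur, blur, image.size.width, image.size.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}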
My app sends a GET request to Google to obtain certain user information. One crucial piece of returned data is the user's picture, which is placed inside a UIImageView that is always exactly 100x100, then redrawn to create a round mask for the image view. These pictures come from different sources and thus always have different aspect ratios: some are narrower than they are tall, sometimes it's vice versa. This results in the image looking compressed. I've tried things such as the following (none of them worked):
_personImage.layer.masksToBounds = YES;
_personImage.layer.borderWidth = 0;
_personImage.contentMode = UIViewContentModeScaleAspectFit;
_personImage.clipsToBounds = YES;
Here is the code I use to redraw my images (obtained from user fnc12's answer, the third one, to Making a UIImage to a circle form):
/** Returns a redrawn image with a circular mask applied to the input image. */
- (UIImage *)roundedRectImageFromImage:(UIImage *)image size:(CGSize)imageSize withCornerRadius:(float)cornerRadius
{
    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0.0); // <== Notice 0.0 as the third (scale) parameter: it means "use the device's screen scale". Passing 1.0 draws a pixelated image on Retina.
    CGRect bounds = (CGRect){CGPointZero, imageSize};
    [[UIBezierPath bezierPathWithRoundedRect:bounds cornerRadius:cornerRadius] addClip];
    [image drawInRect:bounds];
    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return finalImage;
}
This method is always called like so:
[_personImage setImage:[self roundedRectImageFromImage:image size:CGSizeMake(_personImage.frame.size.width, _personImage.frame.size.height) withCornerRadius:_personImage.frame.size.width/2]];
So I end up with a perfectly round image, but the image itself isn't right aspect-wise. Please help.
P.S. Here's how images look when their width is roughly 70% that of their height before the redrawing of the image to create a round mask:
Here is my version that works:
Code in ViewController:
[self.profilePhotoImageView setContentMode:UIViewContentModeCenter];
[self.profilePhotoImageView setContentMode:UIViewContentModeScaleAspectFill]; // overrides the line above; aspect-fill is what takes effect
[CALayer roundView:self.profilePhotoImageView];
roundView function in My CALayer+Additions class:
+ (void)roundView:(UIView *)view {
    CALayer *viewLayer = view.layer;
    [viewLayer setCornerRadius:view.frame.size.width / 2];
    [viewLayer setBorderWidth:0];
    [viewLayer setMasksToBounds:YES];
}
Maybe you should change the way you create the rounded image view and try my version, which rounds the image view by modifying its layer. Hope it helps.
To maintain the aspect ratio of the UIImageView, use the following line of code after setting the image.
[_personImage setContentMode:UIViewContentModeScaleAspectFill];
For detailed description follow reference link:
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIImageView_Class/
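Alternatively, the distortion can be fixed inside the redraw method itself: the image looks compressed because roundedRectImageFromImage:size:withCornerRadius: draws it into the full target bounds regardless of aspect ratio. A hedged variation that computes an aspect-fill rect before drawing (same clipping, different draw rect):
- (UIImage *)roundedAspectFillImageFromImage:(UIImage *)image size:(CGSize)imageSize withCornerRadius:(float)cornerRadius
{
    UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0.0);
    CGRect bounds = (CGRect){CGPointZero, imageSize};
    [[UIBezierPath bezierPathWithRoundedRect:bounds cornerRadius:cornerRadius] addClip];

    // Scale so the image covers the target on both axes, then center the overflow.
    CGFloat scale = MAX(imageSize.width / image.size.width,
                        imageSize.height / image.size.height);
    CGSize scaled = CGSizeMake(image.size.width * scale, image.size.height * scale);
    [image drawInRect:CGRectMake((imageSize.width - scaled.width) / 2.0,
                                 (imageSize.height - scaled.height) / 2.0,
                                 scaled.width, scaled.height)];

    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return finalImage;
}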
I have an image (PNG) which must fill the entire screen of my app. I'm using CALayers and doing everything programmatically, but even though this sounds like something that should be trivial, I can't get it to work. I have two versions of the image: a Retina version (2048px x 1536px) and a non-Retina version (1024px x 768px). The image is listed as a universal image in the asset catalogue.
The code is simple enough I think:
// CREATE FULL SCREEN CALAYER
CALayer *myLayer = [[CALayer alloc] init];
[myLayer setBounds:CGRectMake(0, 0, bounds.size.width, bounds.size.height)];
[myLayer setPosition:CGPointMake(bounds.size.width/2, bounds.size.height/2)];
[self.view.layer addSublayer:myLayer];
// LOAD THE IMAGE INTO THE LAYER —— AM EXPECTING IT TO FILL THE LAYER
UIImage *layerImage = [UIImage imageNamed:@"infoScreen"];
CGImageRef image = [layerImage CGImage];
[myLayer setContents:(__bridge id)image];
[myLayer setContentsGravity:kCAGravityCenter]; /* IT WORKS FINE IF I USE setContentsGravity:kCAGravityResizeAspectFill */
This code works fine on a non-Retina iPad. However, on the Retina iPad the image is always loaded at twice its actual size (so it appears zoomed in). I'm using the Simulator and iOS 8. What am I doing wrong?
Begin your image processing with
func UIGraphicsBeginImageContextWithOptions(size: CGSize, opaque: Bool, scale: CGFloat)
The last parameter in the above function determines the scaling for the graphics. You can set this value by retrieving the scale property of the main screen. In Swift I would do it this way:
let screen = UIScreen.mainScreen()
let scale = screen.scale
Hope it helps.
Edit: code for doing this in Swift; you can modify it to suit your needs.
UIGraphicsBeginImageContextWithOptions(rect.size, true, 0.0) // 0.0 = use the main screen's scale
let ctx: CGContextRef = UIGraphicsGetCurrentContext() // handy if you need raw Core Graphics calls
image.drawInRect(rect) // `image` is the UIImage you are drawing
I had this same problem; it was solved by setting the contentsScale value on the CALayer - for some reason the default scale on CALayers is always 1.0, even on Retina devices.
i.e.
layer.contentsScale = [UIScreen mainScreen].scale;
Also, if you're drawing a shape using CAShapeLayer and wondering why its edges look a little jagged on Retina devices, try:
shapeLayer.rasterizationScale = [UIScreen mainScreen].scale;
shapeLayer.shouldRasterize = YES;
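Applied to the full-screen layer from the question, the whole setup might look like this - a sketch assuming bounds is self.view.bounds, as in the original code:
CALayer *myLayer = [[CALayer alloc] init];
[myLayer setFrame:bounds];
[myLayer setContents:(__bridge id)[UIImage imageNamed:@"infoScreen"].CGImage];
[myLayer setContentsGravity:kCAGravityCenter];
[myLayer setContentsScale:[UIScreen mainScreen].scale]; // match the screen scale so the @2x asset is not drawn at double size
[self.view.layer addSublayer:myLayer];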
My requirement is to crop the image using a mask image.
I am able to crop the image, but not in the exact ratio I expected. I googled around and tried several implementations, but unfortunately didn't get the expected result. This is what I am getting after cropping the image.
Following is the code I'm using.
- (UIImage *)maskImage1:(UIImage *)image withMask:(UIImage *)mask
{
    CGImageRef imageReference = image.CGImage;
    CGImageRef maskReference = mask.CGImage;

    CGImageRef imageMask = CGImageMaskCreate(CGImageGetWidth(maskReference),
                                             CGImageGetHeight(maskReference),
                                             CGImageGetBitsPerComponent(maskReference),
                                             CGImageGetBitsPerPixel(maskReference),
                                             CGImageGetBytesPerRow(maskReference),
                                             CGImageGetDataProvider(maskReference),
                                             NULL, // decode is NULL
                                             YES); // should interpolate

    CGImageRef maskedReference = CGImageCreateWithMask(imageReference, imageMask);
    CGImageRelease(imageMask);

    UIImage *maskedImage = [UIImage imageWithCGImage:maskedReference];
    CGImageRelease(maskedReference);
    return maskedImage;
}
Thanks!
An alternative
You can also achieve the same effect with CALayers, which, in my opinion, is clearer. Note that a layer mask applies to a view (or layer) rather than to a UIImage itself (UIImage has no layer property), so this variant masks the image view that displays the image:
- (void)applyMask:(UIImage *)mask toImageView:(UIImageView *)imageView
{
    CALayer *maskLayer = [CALayer layer];
    maskLayer.frame = imageView.bounds;
    maskLayer.contents = (__bridge id)mask.CGImage;
    imageView.layer.mask = maskLayer; // the view now renders only where the mask is opaque
}
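Hypothetical usage, assuming the photo is displayed in self.imageView and the mask ships as an asset named "cropMask":
[self applyMask:[UIImage imageNamed:@"cropMask"] toImageView:self.imageView]; // both names are placeholders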
Probably a solution
Your mask layer probably has the wrong contentsScale:
maskLayer.contentsScale = [UIScreen mainScreen].scale;
You can also force the mask layer's size to match the image view before setting it:
maskLayer.frame = imageView.bounds;
Maybe you have already solved this, but since I had the same problem and solved it, I will explain the solution: the mask is applied at the real size of the photo - in my case 3264x2448, which naturally is not the iPhone screen size - so when the mask was applied to the image, the mask came out very small relative to the photo. I solved it by creating a layer in Photoshop at that 3264x2448 size and scaling the mask so it stayed exactly the way it looked on the iPhone screen.
The other problem I had was that the image had a different orientation when I took the picture (I was using the camera), so I had to rotate the mask to the picture's orientation on this Photoshop layer. Changing the orientation swaps the sides - what was the height becomes the width and vice versa - so when calculating the scale I had to pay attention to which side should be multiplied to get the correct scale.
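The same rescaling can also be done in code instead of Photoshop. A sketch of one way to do it, assuming the mask encodes its shape in the alpha channel: redraw the mask over the photo's full size, then composite the photo through it with kCGBlendModeSourceIn (a compositing alternative to the CGImageMaskCreate approach above):
- (UIImage *)maskImage:(UIImage *)image withScaledMask:(UIImage *)mask
{
    CGRect drawRect = (CGRect){CGPointZero, image.size};
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [mask drawInRect:drawRect]; // stretch the mask over the whole photo
    [image drawInRect:drawRect blendMode:kCGBlendModeSourceIn alpha:1.0]; // keep the photo only where the mask is opaque
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}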
I have a CALayer and I want to add to it a stretchable image. If I just do:
_layer.contents = (id)[[UIImage imageNamed:@"grayTrim.png"] resizableImageWithCapInsets:UIEdgeInsetsMake(0.0, 15.0, 0.0, 15.0)].CGImage;
it won't work since the layer's default contentsGravity is kCAGravityResize.
I've read that this could be accomplished using contentsCenter, but I cannot seem to figure out how exactly I would use it to achieve the stretched image in my CALayer.
Any ideas are welcome!
Horatiu
The answer to this question is as follows. Let's say you have a stretchable image which stretches only in width and has a fixed height (for simplicity's sake).
The image is 31px wide: a 15px fixed-size cap on each side that doesn't stretch, and a 1px column in the middle that will be stretched.
Assuming your layer is a CALayer subclass, your init method should look like this:
- (id)init
{
    self = [super init];
    if (self) {
        UIImage *stretchableImage = [UIImage imageNamed:@"stretchableImage.png"];
        self.contents = (__bridge id)stretchableImage.CGImage;
        self.contentsScale = [UIScreen mainScreen].scale; // <- needed for the Retina display, otherwise the image will not be scaled properly
        self.contentsCenter = CGRectMake(15.0 / stretchableImage.size.width,
                                         0.0,
                                         1.0 / stretchableImage.size.width,
                                         1.0); // the 1px-wide center column stretches horizontally; the vertical extent scales as a whole
    }
    return self;
}
As per the documentation, the contentsCenter rectangle must have values between 0 and 1:
"Defaults to the unit rectangle (0.0, 0.0, 1.0, 1.0), resulting in the entire image being scaled. If the rectangle extends outside the unit rectangle the result is undefined."
This is it. Hopefully someone else will find this useful and it will save some development time.
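For example, usage might look like this (MyStretchyLayer is a hypothetical name for the CALayer subclass whose init is shown above, and 12.0 stands in for the image's fixed height):
MyStretchyLayer *bar = [[MyStretchyLayer alloc] init];
bar.frame = CGRectMake(20.0, 20.0, 200.0, 12.0); // any width works: the 15px caps stay crisp, only the 1px column stretches
[self.view.layer addSublayer:bar];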