Hi all, I am using a CALayer to mask a UIImage. When I add the layer as a mask on the image view, it displays only the layer's portion and hides the rest of the image, filled with white.
CALayer *mask = [CALayer layer];
mask.contents = (id)[[UIImage imageNamed:@"mask.png"] CGImage];
mask.frame = CGRectMake(0, 0, mainImageWidth+30, mainImageHeight);
mask.shadowOffset = CGSizeMake(0, 3);
mask.shadowOpacity = 1.5;
mainImageView.layer.mask = mask;
[mainImageView.layer setMasksToBounds:NO];
It hides my image view except for the layer's portion. How do I solve this?
Instead of creating the mask on the image view, I added three layers to the UIView, and now it works perfectly:
mainLayer.frame = CGRectMake(0, 0, mainImageWidth, mainImageHeight);
[self.view.layer addSublayer:mainLayer];
mainLayer.backgroundColor = [UIColor clearColor].CGColor;
secondLayer.frame = CGRectMake(gloss_x, gloss_y, gloss_w, gloss_h);
maskLayer = [CALayer layer];
UIImage *mask = [UIImage imageNamed:@"mask.png"];
maskLayer.contents = (id)mask.CGImage;
maskLayer.frame = maskRect;//CGRectMake(gloss_x, gloss_y-50, gloss_w, gloss_h);
secondLayer.contents = (id)s_glossImage.CGImage;
[mainLayer addSublayer:secondLayer];
mainLayer.mask = maskLayer;
[mainImageView.layer addSublayer:mainLayer];
Now it's working fine.
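For context, that behavior is what a layer mask is defined to do: the masked view stays visible only where the mask layer's content is opaque and is hidden everywhere else. If the goal is to overlay the image on top of the image view rather than cut the view down to the image's shape, adding it as a sublayer (as the solution above does) is the right approach. A minimal sketch of the overlay variant, assuming mask.png is the overlay image from the question:
// Overlay instead of mask: the image view underneath stays fully visible.
CALayer *overlay = [CALayer layer];
overlay.contents = (id)[UIImage imageNamed:@"mask.png"].CGImage;
overlay.frame = mainImageView.bounds;
[mainImageView.layer addSublayer:overlay];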
I have a UIView I'm trying to put together which will be layered above another view. This new view needs to have a fully transparent hole in it. I've attached a screenshot of what I'm trying to accomplish (the checkerboard pattern is the underlying UIView that this new UIView will be added to, and the red area is a UIImage).
I have the following code that will render the black background with the hole in the center:
- (void)drawRect:(CGRect)rect
{
CGRect boxRect = CGRectMake(_location.x - (kPointRadius/2), _location.y - (kPointRadius/2), kPointRadius, kPointRadius);
UIBezierPath *path = [UIBezierPath
bezierPathWithRoundedRect:CGRectMake(0, 0, self.layer.frame.size.width, self.layer.frame.size.height)
cornerRadius:0];
UIBezierPath *circlePath = [UIBezierPath
bezierPathWithRoundedRect:boxRect
cornerRadius:kPointRadius];
[path appendPath:circlePath];
[path setUsesEvenOddFillRule:YES];
CAShapeLayer *fillLayer = [CAShapeLayer layer];
fillLayer.path = path.CGPath;
fillLayer.fillRule = kCAFillRuleEvenOdd;
fillLayer.fillColor = [UIColor blackColor].CGColor;
fillLayer.borderWidth = 5.0;
fillLayer.borderColor = [UIColor redColor].CGColor;
fillLayer.opacity = 0.7;
[self.layer addSublayer:fillLayer];
}
However, when I add an image to that and use [[UIImage imageNamed:@"testimg"] drawAtPoint:CGPointMake(x, y)]; to draw the image, the image covers the hole.
Can anyone point me in the right direction here? I'm stumped.
EDIT: I'm now almost there. I can get everything I need EXCEPT that the image layered on top of the black, 90%-opaque background is also rendered at 90% opacity.
CGRect boxRect = CGRectMake(
_location.x - (kPointRadius/2),
_location.y - (kPointRadius/2),
kPointRadius,
kPointRadius);
UIBezierPath *path = [UIBezierPath
bezierPathWithRoundedRect:CGRectMake(0, 0, self.layer.frame.size.width, self.layer.frame.size.height) cornerRadius:0];
UIBezierPath *circlePath = [UIBezierPath
bezierPathWithRoundedRect:boxRect
cornerRadius:kPointRadius];
[path appendPath:circlePath];
[path setUsesEvenOddFillRule:YES];
UIImage *image = [UIImage imageNamed:@"testimg"];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(13, 57, 350, 230)];
imageView.image = image;
CGRect r = CGRectMake(self.layer.frame.origin.x, self.layer.frame.origin.y, self.layer.frame.size.width, self.layer.frame.size.height);
UIGraphicsBeginImageContextWithOptions(r.size, NO, 0);
CGContextRef c = UIGraphicsGetCurrentContext();
CGContextAddPath(c, CGPathCreateCopy(path.CGPath));
CGContextEOClip(c);
CGContextFillRect(c, r);
UIImage* maskim = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CALayer* mask = [CALayer layer];
mask.frame = r;
mask.contents = (id)maskim.CGImage;
imageView.layer.mask = mask;
self.layer.mask = mask;
self.layer.backgroundColor = [UIColor blackColor].CGColor;
self.layer.opacity = 0.8;
[self.layer addSublayer:imageView.layer];
EDIT: I'm now almost there. I can get everything I need EXCEPT that the image layered on top of the black, 90%-opaque background is also rendered at 90% opacity.
Instead of setting its opacity, you could set the layer's background color to [UIColor colorWithWhite:0.0 alpha:0.9].
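That works because a layer's opacity dims the layer and all of its sublayers (including the image), whereas an alpha component in backgroundColor only affects the layer's own fill. A minimal sketch of the suggested change, reusing the mask and imageView from the question's code:
// Keep the mask, but drop the opacity setting so sublayers render fully opaque.
self.layer.mask = mask;
self.layer.backgroundColor = [UIColor colorWithWhite:0.0 alpha:0.9].CGColor; // dims only the background fill
// self.layer.opacity = 0.8; // removed: this would also dim the image sublayer
[self.layer addSublayer:imageView.layer];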
I am working on an iOS application and I have an issue using CALayer: it has created a transparent mask on every image I put in the layer.
Please have a look at the code below :
self.mask = [CALayer layer];
self.mask.contents = CFBridgingRelease(([UIImage imageNamed:@"launch screen.png"].CGImage));
self.mask.bounds = CGRectMake(0, 0, 200, 200);
self.mask.anchorPoint = CGPointMake(0.5, 0.5);
self.mask.position = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds));
self.imageView = _imageView;
self.view.layer.mask = self.mask;
Please refer to this image for an example:
http://31.media.tumblr.com/10cc0ba92377a2cba9fb35c9943fd2ca/tumblr_inline_n6zpokNxpC1qh9cw7.gif
How can a texture image (like the one below) be applied to a CALayer with opacity levels, i.e. combining the texture with the CALayer's existing image?
Sample texture
You can use this
UIImage *yourImage = [UIImage imageNamed:@"yourTextureImage"];
self.yourView.layer.contents = (__bridge id) yourImage.CGImage;
EDIT: Try this, which adds a sublayer on the existing layer. It will give you an idea of how to do it.
//It will draw some circle
int radius = 30.0;
UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0, 0, 20.0, 20.0) cornerRadius:0];
UIBezierPath *circlePath = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0, 0, 2.0*radius, 2.0*radius) cornerRadius:radius];
[path appendPath:circlePath];
[path setUsesEvenOddFillRule:YES];
CAShapeLayer *fillLayer = [CAShapeLayer layer];
fillLayer.path = path.CGPath;
fillLayer.fillRule = kCAFillRuleEvenOdd;
fillLayer.fillColor = [UIColor blackColor].CGColor;
fillLayer.opacity = 0.7;
[self.view.layer addSublayer:fillLayer];
You can also try setting mask
[self.view.layer setMask:fillLayer];
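If the goal is to blend the texture with whatever the layer already shows rather than replace its contents, one option is to add the texture as a semi-transparent sublayer. A minimal sketch, reusing the yourTextureImage and yourView names from above (the 0.5 opacity is an arbitrary example value):
UIImage *texture = [UIImage imageNamed:@"yourTextureImage"];
CALayer *textureLayer = [CALayer layer];
textureLayer.frame = self.yourView.bounds;
textureLayer.contents = (__bridge id)texture.CGImage;
textureLayer.opacity = 0.5; // blend the texture with the layer's existing image
[self.yourView.layer addSublayer:textureLayer];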
I think, if I am not mistaken, you need to use CALayer's mask property to achieve the required result.
Please try the following
//create mask layer
CALayer *maskLayer = [CALayer layer];
maskLayer.frame = self.imageView.bounds;
UIImage *maskImage = [UIImage imageNamed:@"Your image"];
maskLayer.contents = (__bridge id)maskImage.CGImage;
//apply mask to image layer
self.imageView.layer.mask = maskLayer;
I have a UIViewController with a UIImageView (imageView), and I'm defining several layers that will be nested in the imageview as follows in viewDidLoad:
//container layer - the very top layer
CALayer *containerLayer = [CALayer layer];
containerLayer.opacity = 0;
containerLayer.bounds = [self.imageView.layer frame];
// Holder Layer
CALayer *holderLayer = [CALayer layer];
holderLayer.opacity = 0;
holderLayer.bounds = self.imageView.bounds;
// Hierarchy layers
[containerLayer setValue:holderLayer forKey:@"__holderLayer"];
[containerLayer addSublayer:holderLayer];
[self.imageView.layer addSublayer:containerLayer];
I have the following code that works when loading an image from a UIImagePicker:
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImageView *newImgView = [[UIImageView alloc] initWithFrame:self.imageView.frame];
CGRect frame = self.imageView.frame;
newImgView.image = image;
frame.origin = CGPointMake(0, 0);
newImgView.layer.frame = frame;
newImgView.layer.opacity = .9;
newImgView.layer.contentsGravity = kCAGravityResizeAspectFill;
CALayer * containerLayer = self.imageView.layer.sublayers[0];
if (containerLayer != nil)
{
[containerLayer setValue:newImgView.layer forKey:@"__imageLayer"];
CALayer * holderLayer = [containerLayer valueForKey:@"__holderLayer"];
if (holderLayer != nil)
{
//!!!!!line below doesn't work!!!
//[holderLayer addSublayer:newImgView.layer];
//line below works!
[self.imageView.layer addSublayer:newImgView.layer];
}
}
[self.imageView setNeedsDisplay];
[self checkAndPrintLayers];
So the first nested layer is containerLayer, then holderLayer, and I'm expecting to add various images as sublayers to the holderLayer and then manipulate them. However, calling
[holderLayer addSublayer:newImgView.layer];
doesn't work; the imageView stays blank. However, calling
[self.imageView.layer addSublayer:newImgView.layer];
and adding a sublayer to the top layer works just dandy.
Am I missing something obvious here? Would love any suggestions.
Thanks.
This is because holderLayer and containerLayer are transparent (holderLayer.opacity = 0 / containerLayer.opacity = 0), and a layer's opacity applies to all of its sublayers as well, so anything added to them is invisible. Make it:
//container layer - the very top layer
CALayer *containerLayer = [CALayer layer];
containerLayer.opacity = 1.0;
containerLayer.bounds = [self.imageView.layer frame];
// Holder Layer
CALayer *holderLayer = [CALayer layer];
holderLayer.opacity = 1.0;
holderLayer.bounds = self.imageView.bounds;
I'm trying to track down a bug in some iOS code for an iPad app. In one of our views, we've added sublayers to have a shadow and make sure the bottom of the view has rounded edges. Here's the code where we add the sublayers:
UIBezierPath *maskPath = [UIBezierPath bezierPathWithRoundedRect:self.bounds
byRoundingCorners:(UIRectCornerBottomLeft|UIRectCornerBottomRight)
cornerRadii:CGSizeMake(12.0f, 12.0f)];
// Create the shadow layer
shadowLayer = [CAShapeLayer layer];
[shadowLayer setFrame:self.bounds];
[shadowLayer setMasksToBounds:NO];
[shadowLayer setShadowPath:maskPath.CGPath];
shadowLayer.shadowColor = [UIColor blackColor].CGColor;
shadowLayer.shadowOffset = CGSizeMake(0.0f, 0.0f);
shadowLayer.shadowOpacity = 0.5f;
shadowLayer.shadowRadius = 6.0f;
roundedLayer = [CALayer layer];
[roundedLayer setFrame:self.bounds];
[roundedLayer setBackgroundColor:[UIColor colorFromHex:@"#e4ecef"].CGColor];
[self.layer insertSublayer:shadowLayer atIndex:0];
// Add inner view (since we're rounding corners, parent view can't mask to bounds b/c of shadow - need extra view)
maskLayer = [CAShapeLayer layer];
maskLayer.frame = self.bounds;
maskLayer.path = maskPath.CGPath;
innerView = [[UIView alloc] initWithFrame:self.bounds];
innerView.backgroundColor = [UIColor whiteColor];
innerView.layer.mask = maskLayer;
[self addSubview:innerView];
It shows up fine on the screen of the iPad, but I want to take a screenshot programmatically. I've added a category to UIView with this method:
- (UIImage*)screenshot {
UIGraphicsBeginImageContext(self.frame.size);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
return viewImage;
}
When I look at the screenshot that is taken, it no longer has the rounded corners or the shadow behind my view. Why aren't they showing up?
Found an explanation here: CALayer renderInContext
Additionally, layers that use 3D transforms are not rendered, nor are
layers that specify backgroundFilters, filters, compositingFilter, or
a mask values.
It looks like some sublayers can't be handled by renderInContext, which is why they aren't showing up in my screenshots.
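One workaround, assuming iOS 7 or later, is to snapshot with UIView's drawViewHierarchyInRect:afterScreenUpdates: instead of renderInContext:, since it renders the view the way it appears on screen, including masks and shadows. A sketch of the screenshot category rewritten that way:
- (UIImage*)screenshot {
// Use the main screen's scale (third argument 0) so the capture is not blurry on Retina displays.
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
// Renders the actual on-screen appearance, including layer masks and shadows (iOS 7+).
[self drawViewHierarchyInRect:self.bounds afterScreenUpdates:YES];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return viewImage;
}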
Gradient sublayers: the snapshot also does not capture gradient layers whose colors have an alpha channel. Removing the alpha from the colors worked for me.
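For illustration, a minimal sketch of a gradient layer built from fully opaque colors (the colors are arbitrary placeholders), which, per the note above, should survive a renderInContext: snapshot:
// Gradient built from fully opaque colors (alpha kept at 1.0) so renderInContext: can capture it.
CAGradientLayer *gradient = [CAGradientLayer layer];
gradient.frame = self.view.bounds;
gradient.colors = @[(id)[UIColor blackColor].CGColor,
(id)[UIColor colorWithWhite:0.2 alpha:1.0].CGColor];
[self.view.layer addSublayer:gradient];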