I want to apply a chat bubble mask to an image view, and I would like the mask image to scale to fit in the frame of the image view. This is the code I am using to create and apply the mask:
let mask = CALayer()
let maskImage = UIImage(named: "chatGif")!
mask.contents = maskImage.CGImage
mask.frame = CGRectMake(0, 0, imageView.frame.width, imageView.frame.height)
imageView.layer.mask = mask
imageView.layer.masksToBounds = true
And this is my mask:
However this is what I get:
And when I add slicing to the mask image (i.e. create resizable cap insets) I get this:
I made a minimal project that demonstrates the issue and put it here
I do not fundamentally understand why the right and bottom sides of my mask image are getting clipped out of view. The mask's frame and bounds match the image view's, and the image used for the mask is actually smaller than the image view, so if anything I would expect it to not cover the whole thing, rather than getting partially clipped.
Can someone please explain what's happening here? How do I prevent this behavior?
EDIT: I've been playing around with this some more, and the results make a little more sense when I set the mask's size to match the maskImage size like so:
let mask = CALayer()
let maskImage = UIImage(named: "chatGif")!
mask.contents = maskImage.CGImage
mask.frame = CGRectMake(0, 0, maskImage.size.width, maskImage.size.height)
imageView.layer.mask = mask
imageView.layer.masksToBounds = true
This gives me:
Or, with slicing on the mask asset, I get the image in J.Hunter's answer. This is still not what I want, because the mask is not being stretched to fill the imageView and a large portion around the sides is masked out when it shouldn't be, but I can at least understand what's happening in this case. I am setting the mask's frame to a size smaller than the imageView's frame, so of course it doesn't fill the entire area.
However, I would expect setting the frame of the mask to match that of the imageView to make the right and bottom edges of the mask that I can see in that last screenshot line up with the right and bottom edges of the imageView. If I print out the mask's frame, bounds, and contentsRect, they are exactly what I would expect: the frame and bounds match the imageView, and the contentsRect is (0.0, 0.0, 1.0, 1.0). It was my understanding that a CALayer displays a portion of its contents based on its contentsRect, so unless that rect were smaller than 1.0x1.0, the entire contents image should be displayed. Right? But why is that not what I'm seeing?
I downloaded your demo project and rewrote some of the code like this:
override func viewDidLoad() {
super.viewDidLoad()
let mask = CALayer()
let maskImage = UIImage(named: "chatGif")!
print("image size is \(maskImage.size)")
mask.contents = maskImage.CGImage
mask.frame = CGRect(origin: CGPointZero, size: maskImage.size)
imageView.layer.mask = mask
imageView.layer.masksToBounds = true
}
The image size of your maskImage is (39.5, 32.5). It looks like something is wrong with your chatGif.png.
Here is the snapshot:
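As a side note, a quick way to sanity-check the asset is to print both its point size and its scale (a minimal sketch; "chatGif" is the asset name from the question):

if let img = UIImage(named: "chatGif") {
    // UIImage.size is in points; multiply by scale to get the pixel dimensions
    print("points: \(img.size), scale: \(img.scale), pixels: \(img.size.width * img.scale) x \(img.size.height * img.scale)")
}

A half-point size like (39.5, 32.5) usually means the asset's pixel dimensions are odd numbers for its scale (for example, a 79 x 65 pixel image treated as @2x).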
Okay, I finally figured out what's going on. I added to the sample project the same debug print statements that I had put in my actual project and discovered that the imageView's frame at the time I was setting the mask was much larger than I expected, and not its final display size. That makes sense, because in the sample project I was setting the mask in viewDidLoad, before any subviews had been laid out.
I didn't realize this was the case when printing out values in my actual project because my UIImageView is in a UITableViewCell, so its uninitialized size was the same as the placeholder size in the storyboard, which wasn't that much larger than the final size, so it didn't read as obviously too big.
Now if only I could get the last 4 hours of my life back.
EDIT: To clarify, the initial code works fine if it's run after the image view has its final layout size set. So you should set the mask in viewDidLayoutSubviews, or if it's a table/collection cell you can check whether the frame size has changed in layoutSubviews.
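For example, a minimal sketch of the view-controller case in current Swift syntax ("chatGif" is the mask asset from the original code; a table/collection cell would do the same work in layoutSubviews):

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // By now the image view has its final size, so the mask lines up with it.
    let mask = CALayer()
    mask.contents = UIImage(named: "chatGif")?.cgImage
    mask.frame = imageView.bounds
    imageView.layer.mask = mask
    imageView.layer.masksToBounds = true
}

Since viewDidLayoutSubviews can run more than once, real code would probably keep a single mask layer around and only update its frame here.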
Related
I was wondering what I would need if I wanted to use a mask image to get my UIImageView in a specific shape. From what I understand, to create a mask, I need to have an image with the shape of the mask all black on top of a white background. Something like this, for example:
First of all, is this sufficient to shape an image view, and if so, how do I do it in Swift 3? I can only find masking code that is either outdated or written in Objective-C. I've tried simply assigning the image above to a UIImageView and then assigning that image view to the mask property of the UIImageView I want to shape, like so:
self.defaultImageView.mask = self.maskImageView
This didn't do anything except make self.maskImageView disappear (both image views were added through the storyboard and connected using IBOutlet properties). I'm sure I'm forgetting to do something; it can't be this simple. I would appreciate it if someone could help me out. Like I said, I put both image views in the exact same spot, on top of each other, in the storyboard.
UPDATE:
Here is my first attempt to set the mask programmatically after deleting it from my storyboard:
let layer:CALayer = CALayer()
let mask:UIImage = UIImage(named: "Black-Star-Photographic-Agency")!
layer.contents = mask
layer.frame = CGRect(x: 0, y: 0, width: ((self.defaultImageView.image?.size.width)!), height: (self.defaultImageView.image?.size.height)!)
self.defaultImageView.layer.mask = layer
self.defaultImageView.layer.masksToBounds = true
The result was that the image view completely disappeared and wasn't visible anymore. Am I doing something wrong, forgetting something, or both?
You should use a png image, which supports transparency, unlike jpg.
In Photoshop your image should look similar to this:
It doesn't matter if your shape is black or white. What matters is transparency of each pixel. Opaque area (black in this case) will be visible and transparent area will get trimmed.
Edit:
You should not create the mask view in the storyboard; if you do so, it is not going to be a part of your view hierarchy. Just add it programmatically, like this:
let maskView = UIImageView()
override func viewDidLoad() {
super.viewDidLoad()
maskView.image = UIImage(named: "mask")
imageView.mask = maskView
}
override func viewDidLayoutSubviews() {
super.viewDidLayoutSubviews()
maskView.frame = imageView.bounds
}
Output:
Here is a test project to show how it's working.
Also if you're using a custom frame/image and run into the mask not showing properly, try setting the content mode of the mask:
maskView.contentMode = .scaleAspectFit
A layer's borderWidth and borderColor properties draw a border inside the view. Remykits pointed this out here.
A layer's shadow... properties cannot be used to create a border that both appears on all four sides and is opaque for reasons I showed here.
What I failed to specify in that question (and the reason I've opened a new one) is that I want the border to be outside the view. Increasing the frame of the view to compensate for the space lost, as has been suggested, doesn't work; I'm using a UIImageView, so even if the frame is increased, the image is still cropped.
Another suggestion was to change the contentMode of the UIImageView to .Center, in combination with changing the size of the view, but this doesn't work as the view then isn't the proper size.
The solution I first thought of was to create another UIView "behind" this UIImageView, and give it a backgroundColor to mimic the effect of a border. I also thought of creating a custom subclass of UIImageView. Both courses of action, however, involve making calculations based on the frame of the view, and I've had many problems with the frame not being set by Auto Layout at the proper time, etc.
Other things that come to mind are digitally adding a border to the image or positioning the image in a specific part of the UIImageView. (My attempt at the latter was imageView.layer.contentsRect = CGRectInset(imageView.bounds, 4, 4), which resulted in a strangely pixellated image.)
To be clear, what I'm looking for is this:
It really feels like there should be a simpler way to do this than creating a new class or view. Any help appreciated.
Aha! Stitching together aykutt's comment about resizing the image and changing the contentMode, Paul Lynch's answer about resizing images, and rene's (life-saving) answer about what to do when your subviews aren't actually laid out yet in viewDidLayoutSubviews:
override func viewDidLayoutSubviews() {
super.viewDidLayoutSubviews()
self.myContainer.setNeedsLayout()
self.myContainer.layoutIfNeeded()
var width: CGFloat = 4 //the same width used for the border of the imageView
var rect = CGRectInset(imageView.bounds, width, width)
var size = CGSizeMake(rect.width, rect.height)
UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
image.drawInRect(CGRectMake(0, 0, size.width, size.height))
var new = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
self.imageView.contentMode = .Center
self.imageView.image = new
}
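If you can target iOS 10+, a sketch of the same idea with UIGraphicsImageRenderer, under the same assumptions as the code above (imageView and image already exist, and the layer border is 4 points wide):

let borderWidth: CGFloat = 4 // the same width used for the imageView's layer border
let insetSize = imageView.bounds.insetBy(dx: borderWidth, dy: borderWidth).size
// Redraw the image slightly smaller, then center it so the border sits around it
let smaller = UIGraphicsImageRenderer(size: insetSize).image { _ in
    image.draw(in: CGRect(origin: .zero, size: insetSize))
}
imageView.contentMode = .center
imageView.image = smaller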
I have a view where I need to place a UIImageView, and I was told to place it inside a rectangle that spans the screen width and whose height (a fixed size) is smaller than its width. However, the images I've been given are more or less square JPEG images, so the idea is to fill the width of the rectangle that should contain the image and set the height of the image in a way that keeps its aspect ratio. Then, I should be able to let the user scroll the image vertically to see the complete image.
How could I do this?
Thanks in advance
EDIT: I need to do this for several images that have different sizes, but they should all fit the same rectangular area within the view.
You can set the image view's content mode.
imageView.contentMode = UIViewContentModeScaleAspectFit;
This will make sure that the image is displayed while keeping its original aspect ratio.
Edit:
I think this is what you wanted:
UIImage *originalImage = [UIImage imageNamed:[picArr objectAtIndex:a]];
double width = originalImage.size.width;
double height = originalImage.size.height;
double aspect = width/height;
double nHeight = 320.f / aspect;
self.img.frame = CGRectMake(0, 0, 320, nHeight);
self.img.center = self.view.center;
self.img.image = originalImage;
Hope this helps.. :)
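A hedged Swift version of the same calculation, plus the vertical scrolling the question asks for (a sketch; imageView, scrollView, and originalImage are assumed to exist, view is the containing view controller's view, and its width replaces the hard-coded 320):

let rectWidth = view.bounds.width                                 // the fixed-width rectangle
let aspect = originalImage.size.width / originalImage.size.height
let scaledHeight = rectWidth / aspect                             // keeps the original aspect ratio

imageView.image = originalImage
imageView.frame = CGRect(x: 0, y: 0, width: rectWidth, height: scaledHeight)

scrollView.addSubview(imageView)
scrollView.contentSize = CGSize(width: rectWidth, height: scaledHeight) // lets the user scroll vertically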
I have put a UIImageView control on my view with IB.
The size of the control is just something I decided upon; a pretty random size, really.
What I want to do is the control to resize automatically whenever I set the image property to a new image. I want it to actually resize to the size of the image.
Can it be done automatically ? without any code intervention ?
If not - what will the best approach be in this case ?
What happens now is strange. I load images into the ImageView and I see the images displayed properly even though the size of the ImageView is not changed. This interferes with my intention of grabbing the user's touches over the ImageView. The user touches the actual image, but since some parts of the image are outside of the ImageView (and this is the strange part), point mapping goes crazy.
Can someone think of any explanation to this ?
thanks
The size of an image has no bearing on how large the UIImageView actually is; rather, the size of the UIImageView solely depends on the size given to it in Interface Builder (or that you assigned to it). Otherwise the images would be all wacky when you use the @2x images for Retina displays, for example.
If you want to fix this, you must change the frame when setting the image as well. If you're doing this now:
[imageView setImage:[UIImage imageNamed:@"myImage.jpg"]];
change it to:
UIImage *img = [UIImage imageNamed:@"myImage.jpg"];
[imageView setImage:img];
imageView.frame = CGRectMake(imageView.frame.origin.x, imageView.frame.origin.y,
img.size.width, img.size.height);
This will, however, not change the layout of the view it is contained within; you can make it change the sizes of the other views automatically under iOS 6 using layout constraints. If you are an Apple Developer you can watch the WWDC instruction videos; they explain how that system works quite well.
If you're fine with the view not growing, and the problem is just how the image overflows its bounds when you change it to one that does not match the dimensions of the containing view, you can set the "Clip Subviews" checkbox in Interface Builder for the image view. This will make it so that the view will not draw anything outside its own bounding box. If you also set the scaling mode to "Aspect Fill" or "Scale To Fill", the image will always fill up the entire bounds of the containing view.
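The same settings in code look roughly like this (a Swift sketch, assuming an imageView outlet):

imageView.clipsToBounds = true            // the "Clip Subviews" checkbox in Interface Builder
imageView.contentMode = .scaleAspectFill  // or .scaleToFill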
Here is the code snippet I cut and paste often, using the same method as Hampus Nilsson above -
Example
func demo(x0:CGFloat, y0:CGFloat) {
let imgView = UIImageView(image: UIImage(named: "someImg.png"));
imgView.sizeToImage();
imgView.center = CGPoint(x:x0, y:y0);
}
UIImageView Extension
extension UIImageView {
/******************************************************************************/
/** @fcn sizeToImage()
* @brief size view to image
* @assum (image != nil)
*/
/******************************************************************************/
func sizeToImage() {
//Grab the current location
let xC = self.center.x;
let yC = self.center.y;
//Size to fit (dividing by 2 assumes a @2x image scale)
self.frame = CGRect(x: 0, y: 0, width: (self.image?.size.width)!/2, height: (self.image?.size.height)!/2);
//Move back to the location
self.center = CGPoint(x: xC, y: yC);
return;
}
}
A wonderful cut & paste, use if needed!
let containerView = UIView(frame: CGRect(x:0,y:0,width:320,height:500))
let imageView = UIImageView()
if let image = UIImage(named: "Image_Name_Here") {
let ratio = image.size.width / image.size.height
if containerView.frame.width > containerView.frame.height {
let newHeight = containerView.frame.width / ratio
imageView.frame.size = CGSize(width: containerView.frame.width, height: newHeight)
}
else{
let newWidth = containerView.frame.height * ratio
imageView.frame.size = CGSize(width: newWidth, height: containerView.frame.height)
}
}
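Presumably you would then give the image view its image and attach it to the container, along the lines of:

imageView.image = UIImage(named: "Image_Name_Here")
containerView.addSubview(imageView)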
I've seen similar questions, but haven't found workable answers.
I want to mask a UIView using a grey image (which needs to be converted to an alpha channel for masking). The UIView has a background. It should be easy to mask an image, but I want to mask any UIView.
Any clues will be appreciated.
I've been working on this problem for a couple of hours and have a solution that I think will do what you want. First, create your masking image using whatever means you see fit. Note that we only need the alpha values here; all other colours will be ignored, so make certain that the method you use supports alpha values. In this example I'm loading from a .png file, but don't try it with .jpg files, as they don't have alpha values.
Next, create a new layer, assign your mask image to its contents, and set this new layer as the mask of your UIView's own layer, like so. You should find that this masks the UIView and all its attached subviews:
UIImage *_maskingImage = [UIImage imageNamed:@"mask"];
CALayer *_maskingLayer = [CALayer layer];
_maskingLayer.frame = theView.bounds;
[_maskingLayer setContents:(id)[_maskingImage CGImage]];
[theView.layer setMask:_maskingLayer];
With this done, you can set the UIView's background colour to whatever you like and the mask will be used to create a coloured filter.
EDIT: As of iOS 8 you can mask a view simply by assigning another view to its maskView property. The general rules stay the same, in that the maskView's alpha channel is used to determine the opacity of the view it is applied to.
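A minimal sketch of that iOS 8+ approach (in current Swift the property is spelled mask; theView is the view being masked, and "mask" is an assumed asset name):

let maskImageView = UIImageView(image: UIImage(named: "mask"))
maskImageView.frame = theView.bounds
theView.mask = maskImageView   // maskView in Objective-C and older Swift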
For apps targeting iOS 8.0+ this worked well (in this case, using a gradient as the mask). It avoids any need to resize or position the mask.
// Add gradient mask to view
func addGradientMask(targetView: UIView)
{
let gradientMask = CAGradientLayer()
gradientMask.frame = targetView.bounds
gradientMask.colors = [UIColor.blackColor().CGColor, UIColor.clearColor().CGColor]
gradientMask.locations = [0.8, 1.0]
let maskView: UIView = UIView()
maskView.layer.addSublayer(gradientMask)
targetView.maskView = maskView
}
In my case, I want to remove the mask once the user starts scrolling. This is done with:
func scrollViewWillBeginDragging(scrollView: UIScrollView) {
exerDetailsTableView.maskView = nil
}
where the view is defined as an @IBOutlet:
@IBOutlet weak var exerDetailsTableView: UITableView!
Result:
I don't know the exact code off the top of my head, but the basic idea is to have two UIViews. One UIView would have its image property set to the greyscale image, and the other UIView would be set up as usual; the only difference is that you would position the initial UIView directly on top of the UIView containing the "normal" image.
I hope that is enough to push your idea a step further.
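If the mask really is a grey image with no alpha channel (as in the original question), one hedged way to convert it before masking is Core Image's CIColorInvert followed by CIMaskToAlpha. This is only a sketch, and the asset and view names are assumptions:

import UIKit
import CoreImage

func alphaMask(fromGrayscale image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image),
          let invert = CIFilter(name: "CIColorInvert"),
          let toAlpha = CIFilter(name: "CIMaskToAlpha") else { return nil }
    // Invert first so dark areas of the original become light; CIMaskToAlpha then
    // maps lighter pixels to higher alpha, making the dark shape opaque in the mask.
    invert.setValue(input, forKey: kCIInputImageKey)
    guard let inverted = invert.outputImage else { return nil }
    toAlpha.setValue(inverted, forKey: kCIInputImageKey)
    guard let output = toAlpha.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}

// Usage sketch: size an image view with the converted image to the target view
// and assign it as that view's mask (iOS 8+); "greyMask" and targetView are assumed.
if let grey = UIImage(named: "greyMask"), let converted = alphaMask(fromGrayscale: grey) {
    let maskImageView = UIImageView(image: converted)
    maskImageView.frame = targetView.bounds
    targetView.mask = maskImageView
}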