What do I need for masking a UIImageView and how do I do it in Swift 3? - ios

I was wondering what I would need if I wanted to use a mask image to get my UIImageView in a specific shape. From what I understand, to create a mask, I need to have an image with the shape of the mask all black on top of a white background. Something like this, for example:
First of all, is this sufficient to shape an image view, and if so, how do I do it in Swift 3? I can only find masking code that is either outdated or written in Objective-C. I've tried simply assigning the image above to a UIImageView and then assigning that image view to the mask property of the UIImageView I want to shape, like so:
self.defaultImageView.mask = self.maskImageView
This didn't do anything. It just made self.maskImageView disappear (both image views were added through the storyboard and connected using IBOutlet properties). I'm sure I'm forgetting to do something; it can't be this simple. I would appreciate it if someone could help me out. Like I said, I put both image views in the exact same spot, on top of each other, in the storyboard.
UPDATE:
My first attempt to set the mask programmatically after deleting it from my storyboard.
let layer:CALayer = CALayer()
let mask:UIImage = UIImage(named: "Black-Star-Photographic-Agency")!
layer.contents = mask
layer.frame = CGRect(x: 0, y: 0, width: ((self.defaultImageView.image?.size.width)!), height: (self.defaultImageView.image?.size.height)!)
self.defaultImageView.layer.mask = layer
self.defaultImageView.layer.masksToBounds = true
The result was that the image view completely disappeared and wasn't visible anymore. Am I doing something wrong, am I forgetting something, or both?

You should use a PNG image, which supports transparency, unlike JPG.
In Photoshop your image should look similar to this:
It doesn't matter whether your shape is black or white; what matters is the transparency of each pixel. The opaque area (black in this case) will remain visible, and the transparent area will be trimmed away.
Edit:
You should not create the mask view in the storyboard; it is not going to be part of your view hierarchy. Just add it programmatically like this:
let maskView = UIImageView()

override func viewDidLoad() {
    super.viewDidLoad()
    maskView.image = UIImage(named: "mask")
    imageView.mask = maskView
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    maskView.frame = imageView.bounds
}
Output:
Here is a test project to show how it's working.

Also, if you're using a custom frame/image and run into the mask not showing properly, try setting the content mode of the mask:
maskView.contentMode = .scaleAspectFit

Related

Just want to add a darken layer or make the UIImageView darken by Swift

This seems simple, but Stack Overflow has no answers that work for me. Basically, I want to make an image darker, like the background image of the ancient Beijing building below:
I want to darken it so that the tags on it stand out with more contrast.
I tried adding a layer (one of the answers from Stack Overflow) and setting a tintColor, but neither of them worked. Is there a method that actually works? Thank you.
I'm on iOS 13 and Swift 5.1.
Add this UIView extension to your project
extension UIView {
    /// Adds a semi-transparent overlay view on top of any `UIView` to make it look darkened.
    func addoverlay(color: UIColor = .black, alpha: CGFloat = 0.6) {
        let overlay = UIView()
        overlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        overlay.frame = bounds
        overlay.backgroundColor = color
        overlay.alpha = alpha
        addSubview(overlay)
    }
}
Then use it on any UIView (in your case, yourImageView):
yourImageView.addoverlay()
Or you can specify your own overlay color and alpha value
yourImageView.addoverlay(color: .blue, alpha: 0.5)

Create shadow around a UIVisualEffectView without covering the whole view

Is it possible to create a shadow around a UIVisualEffectView with a UIBlurEffect without letting the UIVisualEffectView get coloured by the shadow underneath it?
I basically just want the shadow around the view, but with this code the shadow covers the whole view, which darkens the whole view too much:
let borderPath = UIBezierPath(roundedRect: view.bounds, byRoundingCorners: [.topLeft, .topRight], cornerRadii: CGSize(width: 15, height: 15)).cgPath
shadowView.frame = view.bounds
shadowView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
shadowView.layer.shadowOpacity = 0.3
shadowView.layer.shadowRadius = 3.0
shadowView.backgroundColor = UIColor.clear
shadowView.layer.shadowPath = borderPath
shadowView.layer.shadowOffset = CGSize(width: 0, height: 0)
self.view.insertSubview(shadowView, at: 0)
let blurEffect = UIBlurEffect(style: .extraLight)
let blurView = UIVisualEffectView(effect: blurEffect)
blurView.frame = view.bounds
blurView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
blurView.clipsToBounds = true
blurView.layer.cornerRadius = 15
view.insertSubview(blurView, aboveSubview: shadowView)
EDIT:
I need to achieve the same thing as in Apple's Maps application, where the draggable favourites view uses both a UIVisualEffectView and a shadow around its top, without the shadow interfering with the UIVisualEffectView's background.
See example screenshots:
OK, so the problem was that the background of the underlying view was white. With UIBlurEffect .extraLight used on a background that is lighter than the blur effect, the shadow beneath a UIVisualEffectView appears darker than it does with a more vivid background.
Also described in this question:
Fix UIVisualEffectView extra light blur being gray on white background
UPDATE
I found a project on GitHub explaining how to solve this. The solution involves creating a 9-part UIImage to represent the shadow. The creator also explains the underlying layers of the iOS 10 Maps app:
So I'm trying to recreate the look of iOS 10 Maps. I decided to attach the Maps app in the simulator to the debugger to see what was going on...
Apple actually gets around this by having a UIImage over the top of the content with the border and shadow. Not the most elegant way to do it, but I'm going for the exact look, so I'm going to take the exact approach.
I also grabbed the asset (using this) from the Maps app to save you making your own. Shame they only have @2x artwork in it though :/
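For illustration only, here is a rough sketch of how such a 9-part (resizable) shadow image could be placed behind the blur view. The asset name "cardShadow" and the inset values are hypothetical, not taken from the linked project:
// Sketch: a pre-rendered, stretchable shadow image placed behind the blur view.
// "cardShadow" and the 30pt cap insets are placeholder values; blurView is assumed
// to already be a subview of `container`.
func addImageShadow(behind blurView: UIVisualEffectView, in container: UIView) {
    let capInsets = UIEdgeInsets(top: 30, left: 30, bottom: 30, right: 30)
    let shadowImage = UIImage(named: "cardShadow")?
        .resizableImage(withCapInsets: capInsets, resizingMode: .stretch)
    let shadowImageView = UIImageView(image: shadowImage)
    // Extend a little beyond the blur view so only the surrounding shadow is visible.
    shadowImageView.frame = blurView.frame.insetBy(dx: -20, dy: -20)
    container.insertSubview(shadowImageView, belowSubview: blurView)
}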
I found that in iOS 12 this is no longer a problem: the UIVisualEffectView ignores the shadow of the views underneath it and simply sees through it, as if the shadow did not exist.
The trick is to not set the shadowPath of the layer. If it is set, the shadow is painted below the visual effect view and, as a consequence, darkens the blurred view.
The documentation for shadowPath states:
If you specify a value for this property, the layer creates its shadow using the specified path instead of the layer's composited alpha channel.
The "instead…" part is what we actually need: the shadow should be composed after rendering the content, based on what the content leaves transparent. Now you might be deterred from not setting a shadowPath because the documentation also states that an "explicit path usually improves rendering performance". However, I didn't see any issues in real life so far. In comparison to the rendering cost of the UIVisualEffectView, painting the shadow doesn't make a difference, I assume.
Also make sure to set the shadow on a superview of the UIVisualEffectView, not on a sibling view.
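A minimal sketch of that arrangement (inside a view controller; the sizes and blur style are just examples): the shadow lives on a plain container view, the blur view is its subview, and shadowPath is deliberately left unset:
let container = UIView(frame: CGRect(x: 40, y: 100, width: 300, height: 200))
container.backgroundColor = .clear              // keep the container itself invisible
container.layer.shadowOpacity = 0.3
container.layer.shadowRadius = 3.0
container.layer.shadowOffset = .zero
// Note: no shadowPath here, so the shadow is derived from the composited alpha channel.

let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .extraLight))
blurView.frame = container.bounds
blurView.clipsToBounds = true
blurView.layer.cornerRadius = 15
container.addSubview(blurView)
view.addSubview(container)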
I usually solve this kind of situation using composition of views. Instead of setting the shadow in the target view, I create a ShadowView and I put it behind the target view, both views with the same frame.
I've posted an example and code in this question.
An example of the result of this approach is the following:
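As a rough code illustration of that composition idea (this is not the code from the linked question; the names, radius, and shadow values are placeholders), the shadow can live on a separate view inserted behind the target view:
// Sketch: a dedicated shadow view sits behind the target view with the same frame.
func addShadowView(behind targetView: UIView, in container: UIView) {
    let shadowView = UIView(frame: targetView.frame)
    shadowView.backgroundColor = .white         // opaque content for the shadow to be cast from
    shadowView.layer.cornerRadius = 15
    shadowView.layer.shadowOpacity = 0.3
    shadowView.layer.shadowRadius = 3.0
    shadowView.layer.shadowOffset = .zero
    container.insertSubview(shadowView, belowSubview: targetView)
}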

How to prevent clipping of UIImageView mask

I want to apply a chat bubble mask to an image view, and I would like the mask image to scale to fit in the frame of the image view. This is the code I am using to create and apply the mask:
let mask = CALayer()
let maskImage = UIImage(named: "chatGif")!
mask.contents = maskImage.CGImage
mask.frame = CGRectMake(0, 0, imageView.frame.width, imageView.frame.height)
imageView.layer.mask = mask
imageView.layer.masksToBounds = true
And this is my mask:
However this is what I get:
And when I add slicing to the mask image (i.e. create resizable cap insets) I get this:
I made a minimal project that demonstrates the issue and put it here
I do not fundamentally understand why the right and bottom sides of my mask image are getting clipped out of view. The mask's frame and bounds match the image view's, and the image used for the mask is actually smaller than the image view, so if anything I would expect it to not cover the whole thing, rather than getting partially clipped.
Can someone please explain what's happening here? How do I prevent this behavior?
EDIT: I've been playing around with this some more, and the results make a little more sense when I set the mask's size to match the maskImage size like so:
let mask = CALayer()
let maskImage = UIImage(named: "chatGif")!
mask.contents = maskImage.CGImage
mask.frame = CGRectMake(0, 0, maskImage.size.width, maskImage.size.height)
imageView.layer.mask = mask
imageView.layer.masksToBounds = true
This gives me:
Or with slicing on the mask asset, I get the image in J.Hunter's answer. This is still not what I want because the mask is not being stretched to fill the imageView and a large portion around the sides is masked out that shouldn't be, but I can at least understand what's happening in this case. I am setting the mask's frame to a size smaller than the imageView's frame so of course it doesn't fill the entire area.
However, I would expect that setting the frame of the mask to match that of the imageView would make the right and bottom edges of the mask that I can see in that last screenshot line up with the right and bottom edges of the imageView. If I print out the mask's frame, bounds, and contentsRect, they are exactly what I would expect: the frame and bounds match the imageView's, and the contentsRect is (0.0, 0.0, 1.0, 1.0). It was my understanding that a CALayer will display a portion of its contents based on the contentsRect, so unless that rect was smaller than 1.0x1.0, the entire contents image should be displayed. Right? But why is that not what I'm seeing?
I downloaded your demo project and rewrote some of the code like this:
override func viewDidLoad() {
    super.viewDidLoad()
    let mask = CALayer()
    let maskImage = UIImage(named: "chatGif")!
    print("image size is \(maskImage.size)")
    mask.contents = maskImage.CGImage
    mask.frame = CGRect(origin: CGPointZero, size: maskImage.size)
    imageView.layer.mask = mask
    imageView.layer.masksToBounds = true
}
The image size of your maskImage is (39.5, 32.5). It looks like something is wrong with your chatGif.png.
Here is the snapshot:
Okay, I finally figured out what's going on. I added to my sample project the same debug print statements I had put in my actual project, and discovered that the imageView's frame at the time I was setting the mask was much larger than I expected, not its final display size. That makes sense, because in the sample project I was setting the mask in viewDidLoad, before any subviews had been laid out.
I didn't realize this was the case when printing out values in my actual project because my UIImageView is in a UITableViewCell, so its uninitialized size was the same as the placeholder size in the storyboard, which wasn't that much larger than the final size, so it didn't read as obviously too big.
Now if only I could get the last 4 hours of my life back.
EDIT: To clarify, the initial code works fine if it's run after the image view has its final layout size set. So you should set the mask in viewDidLayoutSubviews, or if it's a table/collection view cell, you can check in layoutSubviews whether the frame size has changed, as sketched below.
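A minimal sketch of that timing fix, written in Swift 3 syntax and assuming the same imageView outlet and chatGif asset used above:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // The image view now has its final size, so the mask frame matches what is on screen.
    let mask = CALayer()
    mask.contents = UIImage(named: "chatGif")?.cgImage
    mask.frame = imageView.bounds
    imageView.layer.mask = mask
    imageView.layer.masksToBounds = true
}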

iOS: Solid Border Outside UIView

A layer's borderWidth and borderColor properties draw a border inside the view. Remykits pointed this out here.
A layer's shadow... properties cannot be used to create a border that both appears on all four sides and is opaque for reasons I showed here.
What I failed to specify in that question (and the reason I've opened a new one) is that I want the border to be outside the view. Increasing the frame of the view to compensate for the space lost, as has been suggested, doesn't work; I'm using a UIImageView, so even if the frame is increased, the image is still cropped.
Another suggestion was to change the contentMode of the UIImageView to .Center, in combination with changing the size of the view, but this doesn't work as the view then isn't the proper size.
The solution I first thought of was to create another UIView "behind" this UIImageView and give it a backgroundColor to mimic the effect of a border. I also thought of creating a custom subclass of UIImageView. Both courses of action, however, involve making calculations based on the frame of the view, and I've had many problems with the frame not being set by Auto Layout at the proper time, etc.
Other things that come to mind are digitally adding a border to the image or positioning the image in a specific part of the UIImageView. (My attempt at the latter was imageView.layer.contentsRect = CGRectInset(imageView.bounds, 4, 4), which resulted in a strangely pixellated image.)
To be clear, what I'm looking for is this:
It really feels like there should be a simpler way to do this than creating a new class or view. Any help appreciated.
Aha! Stitching together aykutt's comment about resizing the image and changing the contentMode, Paul Lynch's answer about resizing images, and rene's (life-saving) answer about what to do when your subviews aren't actually laid out yet in viewDidLayoutSubviews:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    self.myContainer.setNeedsLayout()
    self.myContainer.layoutIfNeeded()
    let width: CGFloat = 4 // the same width used for the border of the imageView
    // Redraw the original image at a size inset by the border width, so the border
    // drawn inside the layer surrounds the image instead of cropping it.
    let rect = CGRectInset(imageView.bounds, width, width)
    let size = CGSizeMake(rect.width, rect.height)
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    image.drawInRect(CGRectMake(0, 0, size.width, size.height))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    self.imageView.contentMode = .Center
    self.imageView.image = newImage
}

How to mask UIViews in iOS

I've seen similar questions, but haven't found workable answers.
I want to mask a UIView using a grey image (it needs to be converted to alpha for masking). The UIView has a background. It should be easy to mask an image, but I want to mask any UIView.
Any clues will be appreciated.
I've been working on this problem for a couple of hours and have a solution that I think will do what you want. First, create your masking image using whatever means you see fit. Note that we only need the alpha values here, all other colours will be ignored, so make certain that the method you use supports alpha values. In this example I'm loading from a .png file, but don't try it with .jpg files as they don't have alpha values.
Next, create a new layer, assign your mask image to its contents, and set this new layer as the mask of your UIView's layer, like so. You should find that this masks the UIView and all of its subviews:
UIImage *_maskingImage = [UIImage imageNamed:@"mask"];
CALayer *_maskingLayer = [CALayer layer];
_maskingLayer.frame = theView.bounds;
[_maskingLayer setContents:(id)[_maskingImage CGImage]];
[theView.layer setMask:_maskingLayer];
With this done, you can set the UIView's background colour to whatever you like and the mask will be used to create a coloured filter.
EDIT: As of iOS 8 you can now mask a view simply by assigning another view to its maskView property. The general rules stay the same in that the maskView's alpha channel is used to determine the opacity of the view it is applied to.
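In Swift that looks roughly like the following (the view and image names are placeholders; since Swift 3 the property is imported as mask rather than maskView):
// Sketch: mask any UIView with the alpha channel of an image, using a view-based mask (iOS 8+).
let maskImageView = UIImageView(image: UIImage(named: "mask"))
maskImageView.frame = someView.bounds
someView.mask = maskImageView   // exposed as `maskView` in Objective-C and Swift 2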
For apps targeting iOS 8.0+ this worked well (in this case, using a gradient as the mask). It avoids any need to resize or position the mask.
// Add gradient mask to view
func addGradientMask(targetView: UIView) {
    let gradientMask = CAGradientLayer()
    gradientMask.frame = targetView.bounds
    gradientMask.colors = [UIColor.blackColor().CGColor, UIColor.clearColor().CGColor]
    gradientMask.locations = [0.8, 1.0]
    let maskView: UIView = UIView()
    maskView.layer.addSublayer(gradientMask)
    targetView.maskView = maskView
}
In my case, I want to remove the mask once the user starts scrolling. This is done with:
func scrollViewWillBeginDragging(scrollView: UIScrollView) {
    exerDetailsTableView.maskView = nil
}
where the view is defined as an @IBOutlet:
@IBOutlet weak var exerDetailsTableView: UITableView!
Result:
I don't know the exact code off the top of my head, but the basic idea is to have two views: one an image view with its image set to the greyscale image, and the other set up as usual. The only difference is that you would position the first view directly on top of the view containing the "normal" image.
I hope that is enough to push your idea a step further.
