iOS 12, Xcode 10: UIView setNeedsDisplay(_:) seems to be broken

After updating to Xcode 10 I realized that the draw(_ rect: CGRect) routine of my custom UIView subclass in my application was being called with the wrong rect. It is always called with rect equal to the full frame of the underlying UIView, instead of the rect specified in setNeedsDisplay(_ rect: CGRect).
Here is a code snippet that can be run as a playground, which at least in my setup shows the erroneous behavior described above in a minimalistic setting:
import Foundation
import UIKit
import PlaygroundSupport
class CustomView: UIView {
    override func draw(_ rect: CGRect) {
        print("rect = \(rect)")
    }
}
let customView = CustomView(frame: CGRect(origin: CGPoint.zero, size: CGSize(width: 200.0, height: 200.0)))
PlaygroundPage.current.liveView = customView
print("test")
customView.setNeedsDisplay(CGRect(origin: CGPoint.zero, size: CGSize(width: 100.0, height: 100.0)))
The output I get is
rect = (0.0, 0.0, 200.0, 200.0)
test
rect = (0.0, 0.0, 200.0, 200.0)
The first printed rect is the standard full redraw of the view; the second one, after printing "test", shows the problem. That output comes from the redraw triggered by the preceding call to customView.setNeedsDisplay and should be the smaller rectangle (0.0, 0.0, 100.0, 100.0) that was specified.
So my obvious questions are:
Can you reproduce this behavior?
Am I missing something obvious?
Is this a bug?

This is actually intentional with iOS 12's new dynamic backing store feature.
What is a backing store
A backing store is the memory that holds a view's drawn contents. How much memory it needs depends on how big the view is, since it is essentially a map from pixels to colours.
If you were to draw a grayscale image but the memory had been allocated for the wide colour gamut, you would end up with a lot of wasted memory (grayscale has a lower footprint than RGBA). To get around this, the dynamic backing store feature works by first drawing the whole content of a view and THEN working out how much memory it needs, rather than assuming everything needs a wide-colour backing store from the start.
The knock-on effect is that you can't redraw a smaller sub-section of the view, because doing so might change which store is required.
How to get around it
This is a great new feature, but if you really do need to work around it you can disable the dynamic backing store on your view. You do that by explicitly setting the contentsFormat property of the view's layer.
There are three options to choose from, corresponding to grayscale, 8-bit RGBA and 16-bit RGBA (wide colour),
so just call:
layer.contentsFormat = .RGBA16Float
and your setNeedsDisplay(_ rect: CGRect) will start working as expected again.
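For instance, a minimal sketch of where that one-liner could live, using the playground class from the question (placing it in the initialisers is just one option; the answer doesn't prescribe a place):
import UIKit

class CustomView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Opt this layer out of the dynamic backing store so that partial
        // invalidation rectangles are honoured again.
        layer.contentsFormat = .RGBA16Float
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        layer.contentsFormat = .RGBA16Float
    }

    override func draw(_ rect: CGRect) {
        // With the format pinned, this rect matches what was passed to
        // setNeedsDisplay(_:) rather than the full bounds.
        print("rect = \(rect)")
    }
}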
You can read up on the property here: https://developer.apple.com/documentation/quartzcore/calayer/1792104-contentsformat
There's also a great talk from WWDC 18 that explains the new dynamic backing store and (very quietly) mentions this technique
https://developer.apple.com/videos/play/wwdc2018/219/?time=1451

I tested this in Xcode 9, 10 & 10.1.
The behaviour has definitely changed between iOS 11 and iOS 12 / 12.1
There's no indication in the documentation or header file that this was intentional.
Looks like a bug to me.

Related

How to change the size of a customView passed as a UICalendarView Decoration?

I could not find much detail about how to add a custom view as a decoration for a UICalendarView. There are many blog posts showing how to add images, but I could not find anything about custom views. With images, we can return the decoration item with a size parameter; with a custom view, however, there is no option to pass a size along with the view you are adding. In the end I was able to add a view with a red background, but the size is wrong. I tried creating a view and giving it a frame, but that had no effect, so I'm confused about how to adjust its size. Here is the method in which I add the custom view I'm creating:
func calendarView(_ calendarView: UICalendarView, decorationFor dateComponents: DateComponents) -> UICalendarView.Decoration? {
    return .customView(addActivityCircle)
}
And this is my addActivityCircle method, which for now just creates a view with a red background color:
private func addActivityCircle() -> UIView {
    let view = UIView()
    view.backgroundColor = .red
    view.clipsToBounds = false
    view.frame = CGRect(x: 0, y: 0, width: 50, height: 50)
    return view
}
When I run this code I do see a red view, but it's a small rectangle, not 50x50. If I pass small values like 20x20 I do see a correspondingly small rectangle, but anything above that gives a rectangle of a fixed size. I think that's the size limit of the decoration item, but apps like Apple's Fitness app show much bigger activity rings, so there should be a way to have larger custom views; this one is just too small. The width is fine, but the height is far too small. This is what I'm getting, and it does not get any taller than this:

Slider - incorrect color at the beginning and at the end

I have a custom slider where I have to increase the slider's height (thickness). The code looks like this:
class CustomSlider: UISlider {
    override open func trackRect(forBounds bounds: CGRect) -> CGRect {
        let defaultBounds = super.trackRect(forBounds: bounds)
        let newHeight: CGFloat = 20
        return CGRect(x: defaultBounds.origin.x,
                      y: defaultBounds.origin.y + defaultBounds.size.height/2 - newHeight/2,
                      width: defaultBounds.size.width,
                      height: newHeight)
    }
}
The height is increased, but the problem now is that the slider is not colored properly at the beginning and at the end. For example, at the beginning it now looks like this:
wrong color at the beginning
After some point the color becomes correct and fills blue: correct color after some point
In the end there is the same problem, at first everything works as expected: normal behavior
But then after some point the color becomes updated to blue too soon: wrong color in the end
Has anyone experienced anything similar before? Is there any solution for this?
I have tried using setMinimumTrackImage and setMaximumTrackImage instead of minimumTrackTintColor and maximumTrackTintColor. That works, but I cannot use it, because when I rotate the screen the slider stretches and the image I am using stretches with it, so the slider's corner radius looks stretched and not the way it should.
Also, an interesting fact: the more I increase the slider height, the later the correct color appears at the beginning.
Since setting track images is working and the only issue is stretched corners, the latter can be solved by using resizableImage(withCapInsets:) on your track images, so only the middle part of the image will be stretched and the rest will remain untouched.
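For example, a rough sketch of that approach, assuming slider is an instance of the CustomSlider above, "track" is a pre-rendered rounded track image in the asset catalog, and the corner radius is 10 pt (these names and values are placeholders, not from the question):
if let trackImage = UIImage(named: "track") {
    // Only the region between the caps stretches, so the rounded ends
    // keep their shape when the slider resizes on rotation.
    let insets = UIEdgeInsets(top: 0, left: 10, bottom: 0, right: 10)
    let resizable = trackImage.resizableImage(withCapInsets: insets, resizingMode: .stretch)
    slider.setMinimumTrackImage(resizable, for: .normal)
    slider.setMaximumTrackImage(resizable, for: .normal)
}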
These articles cover the topic in great detail:
https://www.natashatherobot.com/ios-stretchable-button-uiedgeinsetsmake/
https://www.hackingwithswift.com/example-code/media/how-to-make-resizable-images-using-resizableimagewithcapinsets

Swift button frame height issue (viewDidLayoutSubviews)

I've got some square buttons that I'd like to add rounded corners to that are proportional to the button's height. In past versions of my app, I had implemented this feature without issues using viewDidLayoutSubviews(). For some reason, after pushing a new version of my app with other features I had tweaked, this section of code no longer functions as expected. Here is the code:
override func viewDidLayoutSubviews() {
    for button in buttons {
        button!.layer.shadowColor = UIColor.black.cgColor
        button!.layer.shadowOffset = CGSize(width: 0, height: 1.0)
        button!.layer.shadowOpacity = 0.4
        button!.layer.shadowRadius = button!.frame.height / 40
        button!.layer.cornerRadius = button!.frame.height / 10
    }
}
Again, this block of code used to work just fine but for some reason it no longer works. What I am experiencing is much larger relative radii on smaller buttons (iPhone SE) compared to bigger buttons (iPads).
To troubleshoot, in viewDidLayoutSubviews(), I'm printing the button!.frame.height and I'm noticing that no matter what device I use the frame height is 395.5, which I believe is the correct size only on the 12.9" iPad. Therefore, the buttons look correct on the 12.9" iPad but the radii end up being too large on all of the smaller devices.
Any idea what's going on here? Why is it that they're all returning the same frame height even though they're visually very different sizes on the different devices?
I copy and pasted the above code into the viewWillAppear() method and the problem was resolved. I then deleted the code from viewWillAppear(), leaving me with the original code from the question, and it continues to run as expected (working). What could possibly be the cause of this intermittent behavior?
The reason it still works after you initialized the buttons in viewWillAppear() and then removed that code again is that your buttons' frames did not change by the time viewDidLayoutSubviews() ran. viewDidLayoutSubviews() is only invoked when the controller's view is laid out again - updated, rotated, or otherwise changed - which in your case it is not.
If you rotate your device you will see the parent view's frame change.
For more information about the view hierarchy, see this article.
Try it like this:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    for button in buttons {
        button!.layer.shadowColor = UIColor.black.cgColor
        button!.layer.shadowOffset = CGSize(width: 0, height: 1.0)
        button!.layer.shadowOpacity = 0.4
        button!.layer.shadowRadius = button!.frame.height / 40
        button!.layer.cornerRadius = button!.frame.height / 10
    }
}

Gradient layer not in the right place

I have the following code as follows:
playView.layer.cornerRadius = 16
let gradient1 = CAGradientLayer()
gradient1.frame = playView.frame
gradient1.cornerRadius = 16
if #available(iOS 10.0, *) {
    // set P3 colour
} else {
    // set sRGB colour
}
gradient1.startPoint = CGPoint(x: 0, y: 0)
gradient1.endPoint = CGPoint(x: 1, y: 1)
playView.layer.insertSublayer(gradient1, at: 0)
On the 3rd line of the block of code above, I set the frame of the gradient equal to the frame I want it to fill.
When I run the app on different devices, the gradient layer will only fill the correct area if the device the app is being run on is the one selected in the Interface Builder.
I currently have the code in viewDidLoad(), and so the issue can be solved by moving the code to viewDidAppear(), but then when the app is loaded, there will be a slight delay before the gradient appears, not giving a smooth look and feel.
Is there another method I can put the code in, so that the gradient shows in the correct place, whilst at the same time being there as soon as the user sees the screen? Or alternatively, a way to make the gradient fill the view, whilst still keeping the code in viewDidLoad()?
EDIT: viewWillAppear() does not work, nor does viewWillLayoutSubviews(). Surely there must be a way to solve this?
You can put the code inside this block:
override func viewWillLayoutSubviews() {
    super.viewWillLayoutSubviews()
}
viewDidAppear() works. This screen is the root controller - I don't know if that makes a difference or not, but there is no visible delay on applying the gradient backgrounds.
I would be interested if anyone could explain this. I have a bar chart in another part of the app, and in viewDidAppear() there is code to complete the bar chart; however, there is a delay before it is filled in.
Set the layer's frame inside the viewDidLayoutSubviews() method. This makes sure that the subview (playView) already has its proper frame.
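A minimal sketch of that suggestion, assuming gradient1 has been promoted to a property of the view controller (e.g. let gradient1 = CAGradientLayer() at class scope) so it is reachable here; bounds rather than frame is used because the gradient is a sublayer of playView's own layer:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Re-apply the frame once Auto Layout has given playView its final size.
    gradient1.frame = playView.bounds
}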

Put a mask layer on a UIView of varying size inside UITableViewCell

I have a UITableView whose cells contain a subview on which I need to perform three things:
change its width constraint at runtime depending on a value inside an object specific to this cell
change its background color depending on that same value
round the top left and bottom left corners of the view but keep the corners on the right hand side as they are (so layer.cornerRadius is not an option)
I use the following code inside my custom UITableViewCell subclass to achieve the rounded corner effect on one side of the view only, and I call it from tableView(_:cellForRowAt:):
func roundLeadingEdgesOfBar() {
    let roundedLayer = CAShapeLayer()
    roundedLayer.bounds = viewInQuestion.frame
    roundedLayer.position = viewInQuestion.center
    roundedLayer.path = UIBezierPath(roundedRect: viewInQuestion.bounds,
                                     byRoundingCorners: [.topLeft, .bottomLeft],
                                     cornerRadii: CGSize(width: 2, height: 2)).cgPath
    viewInQuestion.layer.mask = roundedLayer
    print("frame: \(viewInQuestion.frame)")
}
However what I see when I run this code is an effect like this:
The print statement in the code above produces the following output, indicating that viewInQuestion has the same frame every time, when clearly on screen it does not:
frame: (175.0, 139.5, 200.0, 5.0)
frame: (175.0, 139.5, 200.0, 5.0)
frame: (175.0, 139.5, 200.0, 5.0)
frame: (175.0, 139.5, 200.0, 5.0)
So I assume the width constraint on the view has not yet been applied by the time I call this function. When I scroll the entire table view up until all cells are out of view, and then scroll them back into view, everything looks correct and the printed frames are all different, as I would expect:
frame: (136.5, 79.5, 238.5, 5.0)
frame: (169.5, 79.5, 205.5, 5.0)
frame: (226.0, 79.5, 149.0, 5.0)
frame: (247.5, 79.5, 127.5, 5.0)
I've read several times on SO to execute code that is dependent on constraints having been applied from within layoutSubviews, but that gave me the same result. I even tried calling roundLeadingEdgesOfBar from within tableView:willDisplay:forRowAt:, to no avail.
I also found this response to a similar problem which suggests putting the mask layer code inside drawRect:. This actually fixes 99% of the problem for me (leaving performance issues aside), but there are still corner cases (no pun intended) left for very long table views where I still see the wrong behavior.
My last resort was to call my rounding function via performSelector with a 0.00001 delay, which works in the sense that you see the bug for about a second on screen before it then disappears - still far from ideal behavior, let alone the awful code I had to write for it.
Is there any way to reliably apply the shape layer on the view inside a UITableViewCell using its correct runtime frame?
Instead of calling a function to "round the edges," I suggest creating a UIView subclass and let it handle the rounding.
For example:
class BulletBar: UIView {
    override func layoutSubviews() {
        super.layoutSubviews()
        let roundedLayer = CAShapeLayer()
        roundedLayer.frame = bounds
        roundedLayer.path = UIBezierPath(roundedRect: bounds,
                                         byRoundingCorners: [.topLeft, .bottomLeft],
                                         cornerRadii: CGSize(width: 2, height: 2)).cgPath
        layer.mask = roundedLayer
    }
}
Now, set the class of your "bar" subview in the cell to BulletBar. Use constraints to pin it to the right and bottom and to constrain the height and width. Create an IBOutlet for the width constraint, and then set the barWidthConstraint.constant as desired.
The class itself will handle rounding the corners.
Result:
I think what you should be doing is this:
Set the properties related to the current object in cellForRowAtIndexPath, and try setting the constraints in the setter of that particular value.
Call setNeedsLayout(), which will schedule a future call to layoutIfNeeded() -> layoutSubviews().
In your layoutSubviews, set the rounded corners, as in the sketch below.
I hope this will help you.
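A rough sketch of that flow, assuming a custom cell with a width-constraint outlet (barWidthConstraint) and a value driving the bar; all names here are illustrative, not taken from the original code:
import UIKit

class BarCell: UITableViewCell {
    @IBOutlet var viewInQuestion: UIView!
    @IBOutlet var barWidthConstraint: NSLayoutConstraint!

    // 1. Configure this from tableView(_:cellForRowAt:).
    var barWidth: CGFloat = 0 {
        didSet {
            barWidthConstraint.constant = barWidth
            // 2. Schedule a layout pass; layoutSubviews runs once frames are updated.
            setNeedsLayout()
        }
    }

    // 3. Apply the mask once the bar has its runtime frame.
    override func layoutSubviews() {
        super.layoutSubviews()
        // Force the contentView to lay out its subviews now, so the new
        // constraint constant is already reflected in viewInQuestion's bounds.
        contentView.layoutIfNeeded()
        let roundedLayer = CAShapeLayer()
        roundedLayer.path = UIBezierPath(roundedRect: viewInQuestion.bounds,
                                         byRoundingCorners: [.topLeft, .bottomLeft],
                                         cornerRadii: CGSize(width: 2, height: 2)).cgPath
        viewInQuestion.layer.mask = roundedLayer
    }
}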
