I'm trying to create a subclass of UIBezierPath to add some properties that are useful to me.
class MyUIBezierPath : UIBezierPath {
var selectedForLazo : Bool! = false
override init(){
super.init()
}
/* Compile Error: Must call a designated initializer of the superclass 'UIBezierPath' */
init(rect: CGRect){
super.init(rect: rect)
}
/* Compile Error: Must call a designated initializer of the superclass 'UIBezierPath' */
init(roundedRect: CGRect, cornerRadius: CGFloat) {
super.init(roundedRect: roundedRect, cornerRadius: cornerRadius)
}
required init(coder aDecoder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
}
EDIT:
I need this because in my code I write
var path = MyUIBezierPath(roundedRect: rect, cornerRadius: 7)
and it results in a compile error:
"Must call a designated initializer of the superclass 'UIBezierPath'"
I tried to add those initializers in the subclass, but it doesn't seem to work.
Can you help me please?
NOTE: This problem is solved in iOS 9, where the API has been rewritten so that init(rect:) exists, and so do all the others, as convenience initializers, as they should be.
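To illustrate that note, here's a minimal sketch of what compiles once you build against the iOS 9 SDK (same subclass shape as in the question):

```swift
import UIKit

// With the iOS 9 SDK, init(rect:) is a true convenience initializer,
// so a minimal subclass inherits it and this compiles:
class MyBezierPath: UIBezierPath {}
let b = MyBezierPath(rect: CGRectZero)   // OK when building against iOS 9+
```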
The Problem
In a nutshell, the problem you're experiencing is that the following code does not compile:
class MyBezierPath : UIBezierPath {}
let b = MyBezierPath(rect:CGRectZero)
From the Swift point of view, that seems just wrong. The documentation appears to say that UIBezierPath has an initializer init(rect:). But then why isn't UIBezierPath's init(rect:) being inherited in our subclass MyBezierPath? According to the normal rules of initializer inheritance, it should be.
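For contrast, here is the normal inheritance rule being appealed to, at work in a plain Swift class hierarchy (a minimal sketch with made-up names): a subclass that defines no designated initializers of its own inherits all of its superclass's.

```swift
class Shape {
    let name: String
    init(name: String) { self.name = name }   // designated initializer
}

// Circle defines no initializers of its own,
// so it inherits init(name:) from Shape.
class Circle: Shape {}

let c = Circle(name: "circle")   // compiles, as the rule promises
```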
The Explanation
UIBezierPath is not intended for subclassing. Accordingly, it doesn't have any initializers - except for init(), which it inherits from NSObject. In Swift, UIBezierPath looks as if it has initializers; but this is a false representation. What UIBezierPath actually has, as we can see if we look at the Objective-C headers, are convenience constructors, which are class methods, such as this:
+ (UIBezierPath *)bezierPathWithRect:(CGRect)rect;
Now, this method (along with its siblings) demonstrates some unusual features that Swift does not deal with very well:
It is not merely a variant of an initializer; it is a pure convenience constructor. Objective-C shows us that UIBezierPath has no corresponding true initializer initWithRect:. That's a very unusual situation in Cocoa.
It returns UIBezierPath*, not instancetype. This means that it cannot be inherited, because it returns an instance of the wrong type. In a subclass MyBezierPath, calling bezierPathWithRect: yields a UIBezierPath, not a MyBezierPath.
Swift copes badly with this situation. On the one hand, it translates the class method bezierPathWithRect: into an apparent initializer init(rect:), in accordance with its usual policy. But on the other hand, this is not a "real" initializer, and cannot be inherited by a subclass.
You have thus been misled by the apparent initializer init(rect:) and then surprised and stumped when you could not call it on your subclass because it isn't inherited.
NOTE: I'm not saying that Swift's behavior here is not a bug; I think it is a bug (though I'm a little hazy on whether to blame the bug on Swift or on the UIBezierPath API). Either Swift should not turn bezierPathWithRect: into an initializer, or, if it does make it an initializer, it should make that initializer inheritable. Either way, it should be inheritable. But it isn't, so now we have to look for a workaround.
Solutions
So what should you do? I have two solutions:
Don't subclass. Subclassing UIBezierPath was a bad idea to start with; it is not made for this sort of thing. Instead of a subclass, make a wrapper: a class or struct that, rather than being a UIBezierPath ("is-a"), has a UIBezierPath ("has-a"). Let's call it MyBezierPathWrapper:
struct MyBezierPathWrapper {
var selectedForLazo : Bool = false
var bezierPath : UIBezierPath!
}
This simply couples your custom properties and methods with a normal UIBezierPath. You could then create it in two steps, like this:
var b = MyBezierPathWrapper()
b.bezierPath = UIBezierPath(rect:CGRectZero)
If that feels unsatisfactory, you can make this a one-step creation by adding an initializer that takes the UIBezierPath:
struct MyBezierPathWrapper {
var selectedForLazo : Bool = false
var bezierPath : UIBezierPath
init(_ bezierPath:UIBezierPath) {
self.bezierPath = bezierPath
}
}
And now you can create it like this:
var b = MyBezierPathWrapper(UIBezierPath(rect:CGRectZero))
Subclass with a convenience constructor. If you insist on subclassing, even though UIBezierPath is not intended for that sort of thing, you can do it by supplying a convenience constructor. This works because the only important thing about a UIBezierPath is its CGPath, so you can make this convenience constructor a copy constructor merely transferring the path from a real UIBezierPath:
class MyBezierPath : UIBezierPath {
var selectedForLazo : Bool! = false
convenience init(path:UIBezierPath) {
self.init()
self.CGPath = path.CGPath
}
}
Now we can create one very similarly to the previous approach:
let b = MyBezierPath(path:UIBezierPath(rect:CGRectZero))
It isn't great, but I think it's marginally more satisfying than having to redefine all the initializers as your solution does. In the end I'm really doing exactly the same thing you're doing, in a more compressed way. But on balance I prefer the first solution: don't subclass in the first place.
I found this simple workaround that seems to do its job.
class MyUIBezierPath : UIBezierPath {
var selectedForLazo : Bool! = false
override init() {
super.init()
}
init(rect: CGRect){
super.init()
self.appendPath(UIBezierPath(rect: rect))
}
init(roundedRect: CGRect, cornerRadius: CGFloat) {
super.init()
self.appendPath(UIBezierPath(roundedRect: roundedRect, cornerRadius: cornerRadius))
}
required init(coder aDecoder: NSCoder) {
super.init()
}
}
Please post other solutions because they'll surely be better than mine.
Related
On whatever project you're currently working on, simply:
On any screen, drop in a UIView.
Add a width constraint (666 or whatever is fine).
Change the custom class of the constraint to Constrainty.
Run the app.
How can this possibly be?
Does it actually call for the value of the constant, "before the class is initialized," or some such?
How can it happen, and how to solve?
At first, I had the two variables as @IBInspectable and I was surprised that didn't work at all. But then I changed them to ordinary let constants, and that doesn't work either!
class Constrainty: NSLayoutConstraint {
let percentageWidth: CGFloat = 77 // nothing up my sleeve
let limitWidth: CGFloat = 350
override var constant: CGFloat {
get { return teste() }
set { super.constant = newValue }
}
func teste()->CGFloat {
print("\n\n HERE WE GO \(percentageWidth) \(limitWidth) \n\n")
if let sw = (firstItem as? UIView)?.superview?.bounds.width {
let w = sw * ( percentageWidth / 100.0 )
let r = w > limitWidth ? limitWidth : w
print("result is \(r) \n\n")
return r
}
return 50
}
}
I don't think it's a good idea to subclass NSLayoutConstraint. I don't think it was designed to be subclassed outside of Apple.
Anyway, the problem is that NSLayoutConstraint conforms to the NSCoding protocol, but doesn't declare that it conforms to NSCoding in the header files. Because of this, Swift doesn't know that NSLayoutConstraint can be initialized by -[NSLayoutConstraint initWithCoder:], so it doesn't generate an override of initWithCoder: that initializes the instance variables you add in your subclass.
Here's how to fix it.
First, if your project doesn't have a bridging header, add one. The easiest way to add one is to create a new Objective-C class in your project, accept Xcode's offer to create the bridging header, then delete the .h and .m files it created for the class (but keep the bridging header).
Then, in the bridging header, declare that NSLayoutConstraint conforms to NSCoding:
//
// Use this file to import your target's public headers that you would like to expose to Swift.
//
@import UIKit;
@interface NSLayoutConstraint (MyProject) <NSCoding>
@end
Finally, in your Constrainty class, override init(coder:) like this:
required init(coder decoder: NSCoder) {
super.init(coder: decoder)!
}
Et voilà:
HERE WE GO 77.0 350.0
result is 246.4
My wild guess: it's all because of a class named UIClassSwapper. It's a private class that handles all UI-object initialization from Interface Builder files. I would suggest replacing your let constants with computed properties.
//...
var percentageWidth: CGFloat { // nothing up my sleeve
return 77.0
}
var limitWidth: CGFloat {
return 110.0
}
//...
UPD
Swift default property values (properties assigned a value in their declaration) are set before the initializer body runs. E.g., if you have a class MyClass with a property let someVar: CGFloat = 12.0 and it's bridged to Objective-C, then when you allocate memory for your object without calling an initializer (MyClass *obj = [MyClass alloc]) the variable will hold a default value of 0.0, and it stays that way until you call an initializer like [obj init]. So my second wild guess is that because the NSLayoutConstraint class is written in Objective-C and its initWithCoder: initializer isn't declared in its header (it's private), the Objective-C/Swift bridging mechanism doesn't recognize the call as an initializer call (it treats it as a plain instance method), so your Swift properties with default values aren't initialized at all.
In my app I read calendar events of type EKEvent, and I've made an extension with a lot of computed vars so I can easily get the duration, number of man-hours, etc. for each event in the calendar. But at large scale the performance is bad, so I want to use lazy vars instead, to cache all my extra data.
Therefore, I want to make a subclass of EKEvent - called CustomEvent, which adds the lazy vars, but my problem is that the EKEventStore always returns EKEvents, and I need to convert that to instances of my CustomEvent subclass, in order to be able to access the lazy vars etc.
A simple typecast is not enough, and I've tried in a playground to see what could work, but got nothing useful. I need a special constructor for CustomRectangle that can initialize a CustomRectangle from a NativeRectangle. An alternative solution is to make a wrapper class that holds the original object as a property, but that wouldn't be my favorite solution, since I'd then have to map all the methods and properties.
class NativeRectangle: NSObject {
var width: Int
var height: Int
init(width: Int, height: Int) {
self.width = width
self.height = height
super.init()
}
}
class CustomRectangle: NativeRectangle {
var area: Int { return width * height}
}
let rect = NativeRectangle(width: 100, height: 20)
let customRect = CustomRectangle(rect) // This fails, i need a constructor
print(customRect.area)
There is no way in Swift (and in general in most object-oriented languages) to use an existing instance of a base-class object when creating a child-class instance.
From a general programming stand-point you have the two options in this situation:
Use composition: Make the CustomRectangle contain a NativeRectangle and forward all methods to it that you need.
Use a map to link NativeRectangles to additional information. In Objective-C and Swift you can use associated objects (objc_setAssociatedObject with an objc_AssociationPolicy) to have such an internal map most easily. See https://stackoverflow.com/a/43056053/278842
Btw, there is no way that you will see any speed-up from "caching" a computation as simple as width * height.
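A sketch of option 2, assuming you want to cache per-event data on EKEvent; the key and property names here are illustrative, not from the original post:

```swift
import EventKit
import ObjectiveC

// Unique address used as the association key.
private var durationKey: UInt8 = 0

extension EKEvent {
    // Computed once per instance, then stored via an associated object,
    // so the extension approach can be kept without subclassing.
    var cachedDuration: TimeInterval {
        if let cached = objc_getAssociatedObject(self, &durationKey) as? TimeInterval {
            return cached
        }
        let duration = endDate.timeIntervalSince(startDate)
        objc_setAssociatedObject(self, &durationKey, duration,
                                 .OBJC_ASSOCIATION_RETAIN_NONATOMIC)
        return duration
    }
}
```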
If you already work in the Objective-C land, there’s an option to wrap the native class and forward all (except the added) messages automatically:
- (NSMethodSignature*) methodSignatureForSelector: (SEL) selector
{
NSMethodSignature *ours = [super methodSignatureForSelector:selector];
return ours ?: [wrappedObject methodSignatureForSelector:selector];
}
I can’t remember if this is everything that was needed for the forwarding to work, but it should be pretty close. Also, I don’t know how this would play with Swift, so I guess we could consider this an interesting piece of trivia from the Objective-C days and look for a better solution…
A second, also slightly hacky option that comes to mind is using the associated objects feature to link the cached data to the original instance. That way you could keep your extensions approach.
Since you created your own initializer CustomRectangle(object: rect), Swift will not provide the default init() any more. You explicitly need to write one of your own that sets your properties and makes a call to super.init(), as your class also inherits from a superclass.
class NativeRectangle: NSObject {
var width: Int
var height: Int
// Super class custom init()
init(width: Int, height: Int) {
self.width = width
self.height = height
super.init()
}
}
class CustomRectangle: NativeRectangle {
// computed property area
var area: Int { return width * height}
// Sub class Custom Init
init(object:NativeRectangle) {
// call to super to check proper initialization
super.init(width: object.width, height: object.height)
}
}
let rect = NativeRectangle(width: 100, height: 20)
let customRect = CustomRectangle(object: rect)
print(customRect.area) //2000
I want to be able to instantiate a subclass, here named MyLabel, which is a subclass of UILabel, using array literals. I am making use of this in my framework ViewComposer which allows for creating UIViews using an array of enums attributing the view, like this:
let label: UILabel = [.text("Hello World"), .textColor(.red)]
In this question I have dramatically simplified the use case, instead allowing for writing:
let vanilla: UILabel = [1, 2, 3, 4] // works!
print(vanilla.text!) // prints: "Sum: 10"
What I want to do is use the same ExpressibleByArrayLiteral syntax, but on a subclass of UILabel, called MyLabel. However, the compiler stops me when I try to:
let custom: MyLabel = [1, 2, 3, 4] // Compilation error: "Could not cast value of type 'UILabel' to 'MyLabel'"
Instantiation of UILabel using array literals works, thanks to conformance to custom protocol Makeable below.
Is it somehow possible to make the compiler understand that I am referring to the array literal initializer of the subclass MyLabel and not its superclass UILabel?
The following code might not make any logical sense, but it is a minimal example, hiding away what I really want:
// This protocol has been REALLY simplified, in fact it has another name and important code.
public protocol ExpressibleByNumberArrayLiteral: ExpressibleByArrayLiteral {
associatedtype Element
}
public protocol Makeable: ExpressibleByNumberArrayLiteral {
// we want `init()` but we are unable to satisfy such a protocol from `UILabel`, thus we need this work around
associatedtype SelfType
static func make(values: [Element]) -> SelfType
}
public protocol Instantiatable: ExpressibleByNumberArrayLiteral {
init(values: [Element])
}
// My code might have worked if it would be possible to check for _NON-conformance_ in where clause
// like this: `extension Makeable where !(Self: Instantiatable)`
extension Makeable {
public init(arrayLiteral elements: Self.Element...) {
self = Self.make(values: elements) as! Self
}
}
extension Instantiatable {
init(arrayLiteral elements: Self.Element...) {
self.init(values: elements)
}
}
extension UILabel: Makeable {
public typealias SelfType = UILabel
public typealias Element = Int
public static func make(values: [Element]) -> SelfType {
let label = UILabel()
label.text = "Sum: \(values.reduce(0,+))"
return label
}
}
public class MyLabel: UILabel, Instantiatable {
public typealias Element = Int
required public init(values: [Element]) {
super.init(frame: .zero)
text = "Sum: \(values.reduce(0,+))"
}
public required init?(coder: NSCoder) { fatalError() }
}
let vanilla: UILabel = [1, 2, 3, 4]
print(vanilla.text!) // prints: "Sum: 10"
let custom: MyLabel = [1, 2, 3, 4] // Compilation error: "Could not cast value of type 'UILabel' to 'MyLabel'"
I have also tried conforming to ExpressibleByArrayLiteral protocol by extending ExpressibleByNumberArrayLiteral instead (I suspect the two solutions might be equivalent, and compile to the same code..), like this:
extension ExpressibleByNumberArrayLiteral where Self: Makeable {
public init(arrayLiteral elements: Self.Element...) {
self = Self.make(values: elements) as! Self
}
}
extension ExpressibleByNumberArrayLiteral where Self: Instantiatable {
init(arrayLiteral elements: Self.Element...) {
self.init(values: elements)
}
}
But that did not work either. The same compilation error occurs.
I've written a comment in the big code block above, the compiler might have been able to determine which array literal initializer I was referring to if I would have been able to use negation in the where clause:
extension Makeable where !(Self: Instantiatable)
But AFAIK that is not possible, that code does not compile at least. Nor does extension Makeable where Self != Instantiatable.
Is what I want to do possible?
I would be okay with having to make MyLabel a final class. But that makes no difference.
Please please please say that this is possible.
After going through the Apple docs, I initially thought this was not possible. However, I did find a post here which applied to Strings and other non-UI classes. From the post I gathered that you cannot apply ExpressibleByArrayLiteral to a subclass through the inheritance approach, which is probably why you get the error (which I could reproduce many times with many other approaches).
Finally, by moving the ExpressibleByArrayLiteral adoption directly onto your UILabel subclass, it seems to be working!
public class MyLabel: UILabel, ExpressibleByArrayLiteral {
public typealias Element = Int
public override init(frame: CGRect) {
super.init(frame: frame)
}
required public init(values: [Element]) {
super.init(frame: .zero)
text = "Sum: \(values.reduce(0,+))"
}
public convenience required init(arrayLiteral: Element...) {
self.init()
self.text = "Sum: \(arrayLiteral.reduce(0,+))"
}
public required init?(coder: NSCoder) { fatalError() }
}
let vanilla: MyLabel = [1, 2, 3, 4]
print(vanilla.text!) // prints Sum: 10
It turns out we can't even inherit expressibility on element-type classes (Strings, Ints, etc.); you still have to re-specify the initializer for it.
With some tweaking, I also applied these other approaches, for thought!
public class MyLabel: UILabel, ExpressibleByArrayLiteral {
public typealias Element = Int
private var values : [Element]?
public var append : Element? {
didSet {
if let t = append {
values?.append(t)
}
}
}
public var sum : Element {
get {
guard let s = values else {
return 0
}
return s.reduce(0,+)
}
}
public var sumString : String {
get {
return "\(sum)"
}
}
public var label : String {
get {
guard let v = values, v.count > 0 else {
return ""
}
return "Sum: \(sumString)"
}
}
public override init(frame: CGRect) {
super.init(frame: frame)
}
required public init(values: [Element]) {
super.init(frame: .zero)
text = "Sum: \(values.reduce(0,+))"
}
public convenience required init(arrayLiteral: Element...) {
self.init()
self.values = arrayLiteral
self.text = label
}
public required init?(coder: NSCoder) { fatalError() }
}
let vanilla: MyLabel = [1, 2, 3, 4]
print(vanilla.label) //prints out Sum: 10 , without unwrapping ;)
For now, I can't seem to apply the Expressibility to a protocol approach like you did. However, as workarounds go, this seems to do the trick. Guess for now we just have to apply the initializers to each subclass. Unfortunate, but still worth looking all this up!
UPDATE TO RE-AFFIRM THE ISSUE WITH THE EXTENSION APPROACH
Swift's inheritance rules prohibit such convenience inits when the compiler cannot guarantee that the superclass will not be altered dramatically. While your init does not change UILabel's properties, the strict typing for extensions just does not support the combination of required and convenience on this type of initializer.
I'm taking this from this post, which was included in the link above btw:
Because this is a non-final class. Consider if there were a subclass
to Stack that had its own required initializer. How would you ensure
that init(arrayLiteral:) called it? It couldn't call it (because it
wouldn't know that it existed). So either init(arrayLiteral:) has to
be required (which means it needs to be part of the main declaration
and not an extension), or Stack has to be final.
If you mark it final, this works like you expect. If you want it to be
subclassed, just move it out of the extension and into the main body.
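A quick sketch of the quoted point, using a hypothetical final subclass: once the class is final, the required-initializer restriction disappears and the conformance can live in an extension.

```swift
import UIKit

final class FinalLabel: UILabel {}   // illustrative name, not from the question

extension FinalLabel: ExpressibleByArrayLiteral {
    // Allowed in an extension only because FinalLabel is final:
    // no subclass can exist, so `required` is not needed.
    convenience init(arrayLiteral elements: Int...) {
        self.init(frame: .zero)
        text = "Sum: \(elements.reduce(0, +))"
    }
}

let f: FinalLabel = [1, 2, 3, 4]   // compiles; f.text is "Sum: 10"
```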
And if we look at the TWO errors you get, simply by trying to extend UILabel to ExpressibleByArrayLiteral directly, and not through a nested network of protocols like what you are doing:
Initializer requirement 'init(arrayLiteral:)' can only be satisfied by a 'required' initializer in the definition of non-final class 'UILabel'
'required' initializer must be declared directly in class 'UILabel' (not in an extension).
So, first: ExpressibleByArrayLiteral requires a 'required' initializer for conformance, which, as the compiler says, you cannot implement directly inside an extension of the superclass you wish to customize. Unfortunately, by this logic alone, your desired approach is flawed.
Second. The specific initializer you want, init(arrayLiteral:), works for final classes: classes you mark with the keyword final in their declaration, and value types (Strings are one, and so are the number types). UILabel is simply not a final class, so as to allow subclassing, and you cannot change this without hacking the language (which wouldn't get you through the App Store anyway). To illustrate non-final versus final, try to subclass String: you will get an error, since it is not a class or protocol. So by design, you just cannot use this method on UILabel itself via an extension.
Third. You take the custom-protocol approach and try to apply it to UILabel by extension and inheritance. I do apologize, but Swift does not ignore its language rules simply because you layer some custom code between the two ends of the constraint. Your protocol approach, elegant as it is, just nests the problem here rather than addressing it, because it re-applies the same initialization constraints back onto UILabel regardless of your own middleware.
Fourth. On a bit of a logical train of thought here: if you look at the Apple docs on the Expressible protocols inside Xcode (the code file itself), you'll notice that the protocols apply especially to RawRepresentable classes and types (Strings, Ints, Doubles, etc.):
For any enumeration with a string, integer, or floating-point raw
type, the Swift compiler automatically adds RawRepresentable
conformance. When defining your own custom enumeration, you give it a
raw type by specifying the raw type as the first item in the
enumeration's type inheritance list. You can also use literals to
specify values for one or more cases.
So anything which is a data representation at the root level of the class can adopt this by extension. You can clearly see above that by adding this protocol directly to UILabel, you're also imposing the RawRepresentable protocol on it, which it cannot adopt by nature; I'm willing to bet that is the source of the "MyLabel cannot be cast to UILabel" error. UILabel is not one of these things, which is partly why it gets the non-final-class attribute: it's a UI element, a combination of many RawRepresentables. So it makes sense that you should not be able to directly initialize a class that is an ensemble of RawRepresentables, because if some mix-up happened compiler-side during init and it did not produce the raw type you were aiming for, it could corrupt the class instance altogether and take everyone down a debugging nightmare.
To illustrate the RawRepresentable point I'm trying to make, here's what happens when you apply this approach to String, a RawRepresentable-conforming type:
extension String: ExpressibleByArrayLiteral {
public typealias Element = Int
public init(arrayLiteral elements: Element...) {
self.init()
self = "Sum: \(elements.reduce(0,+))"//WE GET NO ERRORS
}
}
let t : String = [1,2,3,4]
print(t) // prints -> Sum: 10
Whereas...
extension UILabel: ExpressibleByArrayLiteral {
public typealias Element = Int
public convenience required init(arrayLiteral elements: Element...) { // The compiler even keeps suggesting you add method modifiers here illogically, without taking into account what is already there; i.e., it's confused by what you're trying to do.
self.init()
self.text = "Sum: \(elements.reduce(0,+))" //WE GET THE ERRORS
}
}
//CANNOT PRINT.....
I'll even demonstrate how far the constraint goes in terms of adding Expressibles on UILabel subclasses, and their second-tier subclasses:
class Label : UILabel, ExpressibleByArrayLiteral {
public typealias Element = Int
override init(frame: CGRect) {
super.init(frame: frame)
}
public required init(arrayLiteral elements: Element...) {
super.init(frame: .zero)
self.text = "Sum: \(elements.reduce(0,+))"
}
public required init?(coder: NSCoder) { fatalError() }
}
let te : Label = [1,2,3,4]
print(te.text!) //prints: Sum: 10
class SecondLabel : Label {
public typealias Element = Int
required init(arrayLiteral elements: Element...) {
//THIS IS STILL REQUIRED... EVEN THOUGH WE DO NOT NEED TO MANUALLY ADOPT THE PROTOCOL IN THE CLASS HEADER
super.init(frame: .zero)
self.text = "Sum: \(elements.reduce(0,+))"
}
public required init?(coder: NSCoder) { fatalError() }
}
let ta : SecondLabel = [1,2,3,4]
print(ta.text!) //prints: Sum: 10
In conclusion.
Swift is designed this way. You cannot apply this particular protocol directly to UILabel because it is a framework-level superclass, and the people who come up with this stuff don't want you to have that much reach into UILabel. So you simply cannot do this, because this protocol cannot be applied through extensions of non-final superclasses, due to the nature of UILabel and of the protocol itself. They are just not compatible this way. However, you can apply it to UILabel's subclasses on a per-subclass basis, meaning you have to re-declare the conforming initializer each time. It sucks! But it's just how it works.
I commend your approach; it almost gets the extensions approach down to a T. However, there seem to be some things in how Swift is built that you just cannot circumvent. I am not the only one to affirm this conclusion (just check the links), so I would ask you to remove your downvote. I've given you a solution, I've given you the references to validate my point, and I've also provided code showing how you can work within this constraint of the language. Sometimes there is just no solution for the desired approach, and other times another approach is the only way to get around things.
I've ended up with a solution using a postfix operator.
Since I need to be able to instantiate UIKit's UILabel using ExpressibleByArrayLiteral, I cannot use murphguy's proposed solution with Label and SecondLabel.
My original code works by adding this postfix operator:
postfix operator ^
postfix func ^<I: Instantiatable>(attributes: [I.Element]) -> I {
return I(values: attributes)
}
Which makes the code compile and work. Although it feels a bit "hacky"...
let custom: MyLabel = [1, 2, 3, 4]^ // note use of caret operator. Now compiles
print(custom.text!) // prints "Sum 10"
If you are interested in why and how I use this you can take a look at my framework ViewComposer which enables this syntax:
let label: UILabel = [.text("Hello World"), .textColor(.red)]
But I also wanted to be able to create my own Composable subclass called MyLabel (or just Label..)
let myLabel: MyLabel = [.text("Hello World"), .textColor(.red)] // does not compile
Which did not work earlier, but now works using the caret postfix operator ^, like this:
let myLabel: MyLabel = [.text("Hello World"), .textColor(.red)]^ // works!
Which for now is the most elegant solution I think.
Here is the layout of an example Class, can someone guide me on what's best practice when creating a subclass of NSObject?
class MyClass: NSObject {
var someProperty: NSString! = nil
override init() {
self.someProperty = "John"
super.init()
}
init(fromString string: NSString) {
self.someProperty = string
super.init()
}
}
Is this correct, am I following best practice here?
I wonder if I'm correctly setting up the initializers (one that sets the string to a default, and one which I can pass in a string)?
Should I call super.init() at the end of each of the initializers?
Should my more specific (the one that takes a string) initializer simply call self.init() at the end rather than super.init()?
What is the right way to set up the initializers in Swift when subclassing NSObject, and how should I call the super init?
This question (albeit in Objective C) suggests you should have an init, which you always call and simply set the properties in more specific inits: Objective-C Multiple Initialisers
I'm not a Swift ninja, but I would write MyClass as:
class MyClass: NSObject {
var someProperty: NSString // no need for (!); it will be initialized by the initializer
init(fromString string: NSString) {
self.someProperty = string
super.init() // can actually be omitted in this example because it happens automatically
}
convenience override init() {
self.init(fromString: "John") // calls the above initializer with a default name
}
}
See the initialization section of the documentation
If someProperty can be nil, then I think you want to define the property as:
var someProperty: NSString?
This also eliminates the need for a custom initializer (at least, for this property), since the property doesn't require a value at initialization time.
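A minimal sketch of that suggestion: with an optional property there is nothing that must be set at initialization time, so NSObject's inherited init() suffices.

```swift
import Foundation

class MyClass: NSObject {
    var someProperty: NSString?   // nil until someone sets it
}

let a = MyClass()       // uses the inherited init(); someProperty is nil
a.someProperty = "John"
```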
To complement the answers: a good idea is to call super.init() before other statements. I think it's a stronger requirement in Swift because allocations are implicit.
As a learning exercise I am trying to implement a subclass of SKShapeNode that provides a new convenience initializer that takes a number and constructs a ShapeNode that is a square of number width and height.
According to the Swift Book:
Rule 1
If your subclass doesn’t define any designated initializers, it automatically inherits all of its superclass designated initializers.
Rule 2
If your subclass provides an implementation of all of its superclass designated initializers—either by inheriting them as per rule 1, or by providing a custom implementation as part of its definition—then it automatically inherits all of the superclass convenience initializers.”
However, the following class doesn't work:
class MyShapeNode : SKShapeNode {
convenience init(squareOfSize value: CGFloat) {
self.init(rectOfSize: CGSizeMake(value, value))
}
}
Instead I get:
Playground execution failed: error: <REPL>:34:9: error: use of 'self' in delegating initializer before self.init is called
self.init(rectOfSize: CGSizeMake(value, value))
^
<REPL>:34:14: error: use of 'self' in delegating initializer before self.init is called
self.init(rectOfSize: CGSizeMake(value, value))
^
<REPL>:35:5: error: self.init isn't called on all paths in delegating initializer
}
My understanding is that MyShapeNode should inherit all of SKShapeNode's convenience initializers because I am not implementing any of my own designated initializers, and because my convenience initializer is calling init(rectOfSize), another convenience initializer, this should work. What am I doing wrong?
There are two problems here:
SKShapeNode has only one designated initializer: init(). This means that we cannot get out of our initializer without calling init().
SKShapeNode has a property path declared as CGPath!. This means that we don't want to get out of our initializer without somehow initializing the path.
The combination of those two things is the source of the issue. In a nutshell, SKShapeNode is incorrectly written. It has a property path that must be initialized; therefore it should have a designated initializer that sets the path. But it doesn't (all of its path-setting initializers are convenience initializers). That's the bug. Putting it another way, the source of the problem is that, convenience or not, the shapeNodeWith... methods are not really initializers at all.
You can, nevertheless, do what you want to do — write a convenience initializer without being forced to write any other initializers — by satisfying both requirements in that order, i.e. by writing it like this:
class MyShapeNode : SKShapeNode {
    convenience init(squareOfSize value: CGFloat) {
        self.init()
        self.init(rectOfSize: CGSizeMake(value, value))
    }
}
It looks illegal, but it isn't. Once we've called self.init(), we've satisfied the first requirement, and we are now free to refer to self (we no longer get the "use of 'self' in delegating initializer before self.init is called" error) and satisfy the second requirement.
My understanding of Initializer Inheritance is the same as yours, and I think we are both well aligned with what the book states. I don't think it's an interpretation issue or a misunderstanding of the stated rules. That said, I don't think you're doing anything wrong.
I tested the following in a Playground and it works as expected:
class RectShape: NSObject {
    var size = CGSize(width: 0, height: 0)
    convenience init(rectOfSize size: CGSize) {
        self.init()
        self.size = size
    }
}

class SquareShape: RectShape {
    convenience init(squareOfSize size: CGFloat) {
        self.init(rectOfSize: CGSize(width: size, height: size))
    }
}
RectShape inherits from NSObject and doesn't define any designated initializers. Thus, as per Rule 1, it inherits all of NSObject's designated initializers. The convenience initializer I provided correctly delegates to a designated initializer before doing the setup for the instance.
SquareShape inherits from RectShape, doesn't provide a designated initializer, and, again as per Rule 1, inherits all of RectShape's designated initializers. As per Rule 2, it also inherits the convenience initializer defined in RectShape. Finally, the convenience initializer defined in SquareShape properly delegates to the inherited convenience initializer, which in turn delegates to the inherited designated initializer.
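That chain can be checked mechanically. Repeating the two classes so the snippet stands alone, every initializer the subclass should have is in fact callable:

```swift
import Foundation

// The answer's two classes, repeated so this snippet is self-contained.
class RectShape: NSObject {
    var size = CGSize(width: 0, height: 0)
    convenience init(rectOfSize size: CGSize) {
        self.init()  // delegate to the designated init() inherited from NSObject
        self.size = size
    }
}

class SquareShape: RectShape {
    convenience init(squareOfSize size: CGFloat) {
        // Delegates to the convenience initializer inherited per Rule 2.
        self.init(rectOfSize: CGSize(width: size, height: size))
    }
}

let plain  = SquareShape()                                         // inherited designated initializer
let rect   = SquareShape(rectOfSize: CGSize(width: 3, height: 4))  // inherited convenience initializer
let square = SquareShape(squareOfSize: 5)                          // its own convenience initializer
print(plain.size.width, rect.size.height, square.size.width)
```

All three calls compile and run, which is exactly what Rules 1 and 2 predict for pure-Swift classes; the SKShapeNode failure is specific to how that Objective-C class is written.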
So, given the fact you're doing nothing wrong and that my example works as expected, I am extrapolating the following hypothesis:
Since SKShapeNode is written in Objective-C, the rule which states that "every convenience initializer must call another initializer from the same class" is not enforced by the language. So, maybe the convenience initializer for SKShapeNode doesn't actually call a designated initializer. Hence, even though the subclass MyShapeNode inherits the convenience initializers as expected, they don't properly delegate to the inherited designated initializer.
But, again, it's only a hypothesis. All I can confirm is that the mechanics works as expected on the two classes I created myself.
Building on Matt's answer, we had to include an additional initializer (an explicit override of init()), or else the compiler complained about invoking an initializer with no arguments.
Here's what worked to subclass SKShapeNode:
class CircleNode : SKShapeNode {
    override init() {
        super.init()
    }
    convenience init(width: CGFloat, point: CGPoint) {
        self.init()
        self.init(circleOfRadius: width/2)
        // Do stuff
    }
    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
Good news from 2019! I can report that I now have an SKShapeNode subclass with the following three initializers:
override init() {
    super.init()
}

convenience init(width: CGFloat, point: CGPoint) {
    self.init(circleOfRadius: width/2)
    self.fillColor = .green
    self.position = point
}

required init?(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder)
}
and that behaves exactly as expected: when you call the convenience initializer, you get green dots in the desired position. (The double call to init() described by @matt and @Crashalot, on the other hand, now results in an error.)
I'd prefer to have the ability to modify SKShapeNodes in the .sks scene editor, but you can't have everything. YET.