Swift: Convert Foo<String> to Foo<Any>

Here is my code:
class Foo<T> {
}
class Main {
    static func test() {
        var foo: Foo<Any>
        var bar = Foo<String>()
        //Do all my stuff with foo necessitating String
        foo = bar
    }
}
When I try to assign foo = bar I get the error "Cannot assign a value of type Foo<String> to a value of type Foo<Any>".
I don't understand why I get this error, since String conforms to Any.
If I do exactly the same thing but using Array, I don't get any error:
static func test() {
    var foo: Array<Any>
    var bar = Array<String>()
    //Do all my stuff with foo necessitating String
    foo = bar
}
Does anyone know what's wrong with my code?
Thanks

Arrays and dictionaries are special types for which this kind of covariant behavior is built into the compiler. It does not, however, apply to custom generic types. The type Foo<Any> is not a supertype (or in this case a superclass) of Foo<String> even though Any is a supertype of String. Therefore, you cannot assign variables of these types to each other.
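To make the difference concrete, here is a minimal sketch (my own illustration, not part of the question) showing that the built-in covariance only applies to the standard collections:
class Box<T> {}                          // a custom generic type, analogous to Foo

let strings: Array<String> = ["a", "b"]
let anys: Array<Any> = strings           // OK: the compiler upcasts Array<String> to Array<Any>

let stringBox = Box<String>()
// let anyBox: Box<Any> = stringBox      // error: custom generic types are invariant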
Depending on your particular case, the solution outlined in Swift Cast Generics Type might work for you. When Foo wraps a value of type T, you can add a generic initializer that converts the value from a differently typed instance of Foo:
class Foo<T> {
    var val: T

    init(val: T) {
        self.val = val
    }

    init<O>(other: Foo<O>) {
        self.val = other.val as! T
    }
}

class Main {
    static func test() {
        var foo: Foo<Any>
        var bar = Foo<String>(val: "abc")
        //Do all my stuff with foo necessitating String
        foo = Foo<Any>(other: bar)
    }
}
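One caveat worth keeping in mind (my own note, not from the answer): the forced cast inside init(other:) only succeeds when the wrapped value can actually be cast to T; otherwise it traps at runtime.
let anyFoo = Foo<Any>(other: Foo<String>(val: "abc"))    // fine: a String value can be cast to Any
// let intFoo = Foo<Int>(other: Foo<String>(val: "abc")) // compiles, but traps at runtime: "abc" as! Int fails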

I have improved on @hennes's answer by adding the ability to define how you want to convert from type O to type T:
class Foo<T> {
    var val: T

    init(val: T) {
        self.val = val
    }

    init<O>(other: Foo<O>, conversion: (O) -> T) {
        self.val = conversion(other.val)
    }
}

class Main {
    static func test() {
        var foo: Foo<Int>
        var bar = Foo<String>(val: "10")
        //Do all my stuff with foo necessitating String
        foo = Foo<Int>(other: bar, conversion: { (val) in
            return Int(val)!
        })
        print(foo.val)             //prints 10
        print(foo.val.dynamicType) //prints Swift.Int
    }
}
This gives you the ability to convert between two types that don't support casting to each other. Additionally, an invalid conversion is flagged by the compiler, as opposed to crashing at runtime because of the forced cast.
No idea why the quality of the compiler diagnostic differs between one-line and multi-line closures, but it does.
(Image: bad compiler diagnostic for the 1-line closure)
(Image: good compiler diagnostic for the 2-line closure)
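As a quick sanity check (my own sketch, using the Foo defined just above), the closure-based initialiser surfaces a bad conversion at compile time, whereas the forced-cast initialiser from the previous answer would accept the same mismatch and only trap when it runs:
let stringFoo = Foo<String>(val: "not a number")

// Compiles and runs: the closure explains how to get an Int out of a String.
let length = Foo<Int>(other: stringFoo, conversion: { $0.characters.count })

// Rejected at compile time: the closure returns a String where T == Int.
// let broken = Foo<Int>(other: stringFoo, conversion: { $0 })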

Default type for a generic parameter in Swift

I have a generic class that I want to be able to use with a default type. Right now I can initialize it with any type, but I have to be explicit.
//Initialize with a type
MyManager<MyCustomerObject>()
// Initialize with NSObject (what I want to be my default type)
MyManager<NSObject>()
// This doesn't work, but I want this kind of functionality
class MyManager<T = NSObject> {}
// So I can create my manager like so and it inserts the default type as NSObject
MyManager() //Or MyManager<>()
Is this possible in Swift?
There's no support for default generic arguments, but you can fake it by defining the default init() on a type-constrained extension, in which case the compiler will be smart enough to use that type. E.g.:
class MyManager<T> {
    let instance: T

    init(instance: T) {
        self.instance = instance
    }
}

extension MyManager where T == NSObject {
    convenience init() {
        self.init(instance: NSObject())
    }
}
And now you can initialize the type with no argument and it will default to MyManager<NSObject>:
let mm1 = MyManager(instance: "Foo") // MyManager<String>
let mm2 = MyManager(instance: 1) // MyManager<Int>
let mm3 = MyManager() // MyManager<NSObject>
SwiftUI uses this technique quite a lot.
No, this currently isn't possible – although it is a part of the Generics Manifesto, so might be something that the Swift team will consider for a future version of the language.
Default generic arguments
Generic parameters could be given the ability to provide default
arguments, which would be used in cases where the type argument is not
specified and type inference could not determine the type argument.
For example:
public final class Promise<Value, Reason=Error> { ... }
func getRandomPromise() -> Promise<Int, Error> { ... }
var p1: Promise<Int> = ...
var p2: Promise<Int, Error> = p1 // okay: p1 and p2 have the same type Promise<Int, Error>
var p3: Promise = getRandomPromise() // p3 has type Promise<Int, Error> due to type inference
In the meantime however, a somewhat unsatisfactory compromise would be the use of a typealias:
class MyManager<T> {}
typealias MyManagerDefault = MyManager<NSObject>
let defaultManager = MyManagerDefault()
Not nearly as slick as just being able to say MyManager(), but it does show up next to MyManager in auto-complete, which is pretty handy.
If T is always an NSObject subclass, you can use a generic constraint in Swift 5.3:
class MyManager<T: NSObject> {
    let t = T()
}
class MyCustomerObject: NSObject { }
let a = MyManager()
let b = MyManager<MyCustomerObject>()

Initialiser Inheritance confusion

I am trying to build some mocking infrastructure; I want to be able to return a stubbed value and count the times the value was accessed. I have something simple like this:
class BasicMock<T> {
    var callsCount = 0
    private let backing: T

    var result: T {
        callsCount++
        return backing
    }

    init(result: T) {
        self.backing = result
    }
}

class MockTimeDefinitionSerialiser: BasicMock<[String: [AnyObject]]>, TimeDefinitionSerialiserProtocol {
    func serialiseTravelTime(travelTime: JSSTravelTime) -> [String: AnyObject] {
        return result
    }
}
However, trying to build it:
let mockTimeDefinitionSerialiser = MockTimeDefinitionSerialiser(result: ["": ""])
Emits the error 'MockTimeDefinitionSerialiser' cannot be constructed because it has no accessible initialisers
My interpretation of the Swift docs is that I should automatically inherit the initialiser as I have set all stored properties.
What am I doing wrong?
Please remove any unnecessary code when asking a question. I was able to reduce your problem to this:
class Base<T> {
    init(t: T) {}
}
class Sub: Base<Int> {}

Sub(t: 0) // error: 'Sub' cannot be constructed because it has no accessible initialisers
It seems that even though you specified T in the subclass, the compiler cannot infer what the initialiser should use for T. I couldn't find a way to get the initialiser to be inherited; you'll have to use a workaround:
class Sub: Base<Int> {
    override init(t: Int) {
        super.init(t: t)
    }
}
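With the initialiser spelled out, the reduced example from above compiles as expected (a quick check using the same types):
let sub = Sub(t: 0) // OK: Sub now declares init(t:) and forwards to super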

How do I add different types conforming to a protocol with an associated type to a collection?

As an exercise in learning I'm rewriting my validation library in Swift.
I have a ValidationRule protocol that defines what individual rules should look like:
protocol ValidationRule {
    typealias InputType
    func validateInput(input: InputType) -> Bool
    //...
}
The associated type InputType defines the type of the input to be validated (e.g. String). It can be explicit or generic.
Here are two rules:
struct ValidationRuleLength: ValidationRule {
    typealias InputType = String
    //...
}

struct ValidationRuleCondition<T>: ValidationRule {
    typealias InputType = T
    // ...
}
Elsewhere, I have a function that validates an input with a collection of ValidationRules:
static func validate<R: ValidationRule>(input i: R.InputType, rules rs: [R]) -> ValidationResult {
    let errors = rs.filter { !$0.validateInput(i) }.map { $0.failureMessage }
    return errors.isEmpty ? .Valid : .Invalid(errors)
}
I thought this was going to work but the compiler disagrees.
In the following example, even though the input is a String, rule1's InputType is String, and rule2's InputType is String...
func testThatItCanEvaluateMultipleRules() {
    let rule1 = ValidationRuleCondition<String>(failureMessage: "message1") { $0.characters.count > 0 }
    let rule2 = ValidationRuleLength(min: 1, failureMessage: "message2")
    let invalid = Validator.validate(input: "", rules: [rule1, rule2])
    XCTAssertEqual(invalid, .Invalid(["message1", "message2"]))
}
... I'm getting this extremely helpful error message:
_ is not convertible to ValidationRuleLength
which is cryptic but suggests that the types should be exactly equal?
So my question is... how do I append different types that all conform to a protocol with an associated type into a collection?
Unsure how to achieve what I'm attempting, or if it's even possible?
EDIT
Here it is without context:
protocol Foo {
    typealias FooType
    func doSomething(thing: FooType)
}

class Bar<T>: Foo {
    typealias FooType = T

    func doSomething(thing: T) {
        print(thing)
    }
}

class Baz: Foo {
    typealias FooType = String

    func doSomething(thing: String) {
        print(thing)
    }
}

func doSomethingWithFoos<F: Foo>(thing: [F]) {
    print(thing)
}
let bar = Bar<String>()
let baz = Baz()
let foos: [Foo] = [bar, baz]
doSomethingWithFoos(foos)
Here we get:
Protocol Foo can only be used as a generic constraint because it has
Self or associated type requirements.
I understand that. What I need to say is something like:
doSomethingWithFoos<F: Foo where F.FooType == F.FooType>(thing: [F]) {
}
Protocols with type aliases cannot be used this way. Swift doesn't have a way to talk directly about meta-types like ValidationRule or Array. You can only deal with instantiations like ValidationRule where... or Array<String>. With typealiases, there's no way to get there directly. So we have to get there indirectly with type erasure.
Swift has several type-erasers. AnySequence, AnyGenerator, AnyForwardIndex, etc. These are generic versions of protocols. We can build our own AnyValidationRule:
struct AnyValidationRule<InputType>: ValidationRule {
    private let validator: (InputType) -> Bool

    init<Base: ValidationRule where Base.InputType == InputType>(_ base: Base) {
        validator = base.validateInput
    }

    func validateInput(input: InputType) -> Bool {
        return validator(input)
    }
}
The deep magic here is validator. It's possible that there's some other way to do type erasure without a closure, but that's the best way I know. (I also hate the fact that Swift cannot handle validateInput being a closure property. In Swift, property getters aren't proper methods, so you need the extra indirection layer of validator.)
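For the curious, here is a closure-free alternative (my own sketch, not from the answer, with hypothetical names): the abstract-box pattern that the standard library's own erasers are commonly described as using.
// Abstract base box: declares the API in terms of the erased InputType.
private class _ValidationRuleBoxBase<InputType> {
    func validateInput(input: InputType) -> Bool {
        fatalError("Must be overridden")
    }
}

// Concrete box: holds the real rule and forwards to it.
private final class _ValidationRuleBox<Base: ValidationRule>: _ValidationRuleBoxBase<Base.InputType> {
    let base: Base
    init(_ base: Base) { self.base = base }
    override func validateInput(input: Base.InputType) -> Bool {
        return base.validateInput(input)
    }
}

// Public wrapper: same shape as AnyValidationRule, but backed by a box instead of a closure.
struct BoxedValidationRule<InputType>: ValidationRule {
    private let box: _ValidationRuleBoxBase<InputType>
    init<Base: ValidationRule where Base.InputType == InputType>(_ base: Base) {
        box = _ValidationRuleBox(base)
    }
    func validateInput(input: InputType) -> Bool {
        return box.validateInput(input)
    }
}
Functionally it behaves like the closure version above; it just trades the stored closure for a small class hierarchy.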
With that in place, you can make the kinds of arrays you wanted:
let len = ValidationRuleLength()
len.validateInput("stuff")
let cond = ValidationRuleCondition<String>()
cond.validateInput("otherstuff")
let rules = [AnyValidationRule(len), AnyValidationRule(cond)]
let passed = rules.reduce(true) { $0 && $1.validateInput("combined") }
Note that type erasure doesn't throw away type safety. It just "erases" a layer of implementation detail. AnyValidationRule<String> is still different from AnyValidationRule<Int>, so this will fail:
let len = ValidationRuleLength()
let condInt = ValidationRuleCondition<Int>()
let badRules = [AnyValidationRule(len), AnyValidationRule(condInt)]
// error: type of expression is ambiguous without more context

Closure expression {exp} vs {return exp}

Given the following function declaration
func foo(f: () -> Foo) -> Bar
What is the difference in the following two variants of code using Closure Expressions:
A)
let result = foo {
    return Foo()
}
B)
let result = foo {
    Foo()
}
Please notice that the type of the constant result is not specified and must be inferred.
The reason I am asking is that the compiler seems to treat them differently, at least currently. In quite a few scenarios it is unable to infer the type of the closure expression when the body is written as return Foo(). Omitting the return, on the other hand, may produce a different error, since the compiler sometimes insists on an explicit return (even though I disagree with the compiler there, but I digress...).
The issue can most often be solved by completely specifying the closure expression, e.g.:
let result = foo { () -> Foo in
    return Foo()
}
or sometimes it can be alleviated by explicitly specifying the type of result.
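For completeness, the second workaround mentioned above is just an explicit type annotation on the result (a sketch, using the same foo declaration):
let result: Bar = foo {
    return Foo()
}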
The foo function returns Bar, so the compiler can easily infer that result is Bar. Foo() returns a Foo instance, so the compiler can also verify that the closure is correct, whether or not you write return. Here is the code that I played with in an Xcode 6.0.1 playground:
struct Foo {
    func foos() {
        println("foos")
    }
}

struct Bar {
    func bars() {
        println("bars")
    }
}

func foo(f: () -> Foo) -> Bar {
    let foof = f()
    foof.foos()
    return Bar()
}

let result = foo {
    return Foo()
}
result.bars()

Cannot convert the expression's type 'StaticString' to type 'StringLiteralConvertible' (Swift)

Generics in a programming language that runs on iOS? Shut up and take my money!
Here's my experimental code (this is running in a playground):
class Foo<GenericType1> {
    var value: GenericType1?

    init(value: GenericType1) {
        self.value = value
    }
}

class Foo2<GenericType2>: Foo<GenericType2> {
    override init(value: GenericType2) {
        super.init(value: value)
    }
}

extension Foo {
    class func floop<T>(value: T) -> Foo2<T> {
        return Foo2<T>(value: value)
    }
}
Followed by:
var foo = Foo.floop("test")
The last part throws an error message:
Cannot convert the expression's type 'StaticString' to type 'StringLiteralConvertible'
I can't for the life of me figure out why. Any help would be much appreciated.
I ran into a similar problem and could make it work by specifying the type at the moment of using the class, i.e. you must say what type the generic parameter will have when the class is referenced. In your case it could be something like this:
var foo = Foo<String>.floop("test") //or,
var foo = Foo<Int>.floop("test")
I tried it in your code and it worked in a playground.
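The underlying issue is that floop's own T can be inferred from the argument, but the class's GenericType1 cannot, so the metatype Foo has to be written out in full. If spelling it out feels noisy, a free factory function avoids it entirely (my own sketch, with a hypothetical makeFoo2 name, not from the answers):
func makeFoo2<T>(_ value: T) -> Foo2<T> {
    return Foo2<T>(value: value)
}

var foo = makeFoo2("test") // inferred as Foo2<String>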
Hope this helps!
