How do you use perform(#selector(setter:)) to set the value of a property using Swift 4?

I'm trying to run this code but it's yielding unexpected results.
class Test: NSObject {
    @objc var property: Int = 0
}
var t = Test()
t.perform(#selector(setter: Test.property), with: 100)
print(t.property)
The value being printed is some junk number -5764607523034233277. How can I set the value of a property using the perform method?

The performSelector:withObject: method requires an object argument, so Swift is converting the primitive 100 to an object reference.
The setProperty: method (which is what #selector(setter: Test.property) evaluates to) takes a primitive NSInteger argument, so it treats that object reference as an integer; the junk number you see is the object's pointer bits reinterpreted as an Int.
Because of this mismatch, you cannot invoke setProperty: through performSelector:withObject: in a meaningful way.
You can, however, use setValue:forKey: and a #keyPath instead, because Key-Value Coding knows how to convert objects to primitives:
t.setValue(100, forKey: #keyPath(Test.property))
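Reading the value back, either directly or through Key-Value Coding, should then show the expected result:
print(t.property) // 100
print(t.value(forKey: #keyPath(Test.property))!) // 100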

You can't use perform(_:with:) to do that, since Int is not an NSObject subclass, which the with: argument requires. To control the argument and return types, you can work around it by calling the setter and getter implementations (IMPs) directly:
let setterSelector = #selector(setter: Test.property)
let setterIMP = t.method(for: setterSelector)
// Invoking the setter
unsafeBitCast(setterIMP, to: (@convention(c) (Any?, Selector, Int) -> Void).self)(t, setterSelector, 100)
let getterSelector = #selector(getter: Test.property)
let getterIMP = t.method(for: getterSelector)
// Invoking the getter
let intValue = unsafeBitCast(getterIMP, to: (@convention(c) (Any?, Selector) -> Int).self)(t, getterSelector)
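A quick check that the setter and getter round-trip behaves as expected:
print(intValue) // 100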
You can find more details and examples in my answer here

Related

Using init as a Closure

Recently I saw the following line of code in a book (about Core Data):
return modelURLs(in: modelName).compactMap(NSManagedObjectModel.init)
I know what the code does, but the question is: why and how does it work?
There should be a closure as the argument of the compactMap function, but there's only NSManagedObjectModel.init in normal parentheses. What's the secret about it? What is it doing there? I would understand it if there were a static/class property called init which returns a closure, but I don't think there is.
Unfortunately the book doesn't say more about this line of code. I would like to do further reading in the Apple docs, but I can't find anything, and a Google search for "init in closures" doesn't give helpful results.
So you guys are my last hope :)
By the way: the function modelURLs(in: modelName) returns an Array of URLs but that's not really important here.
When using closures, several syntaxes can be used, as in the example below that converts an Int array to a String array.
let array = [1, 2, 3]
The following calls to compactMap will all convert the array correctly and generate the same result:
let out1 = array.compactMap({return String($0)})
let out2 = array.compactMap({String($0)})
let out3 = array.compactMap {String($0)}
let out4 = array.compactMap(String.init)
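All four calls should produce the same output:
print(out1) // ["1", "2", "3"]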
When there are two init methods that take the same number and types of arguments, you must spell out the full signature of the init method to use. Consider this simple example struct:
struct TwoTimesInt: CustomStringConvertible {
    let value: Int
    let twiceTheValue: Int

    var description: String {
        return "\(value) - \(twiceTheValue)"
    }

    init(value: Int) {
        self.value = value
        self.twiceTheValue = 2 * value
    }
}
With only one init method we can do:
let out5 = array.compactMap(TwoTimesInt.init)
But if we add a second init method:
init(twiceTheValue: Int) {
    self.value = twiceTheValue / 2
    self.twiceTheValue = twiceTheValue
}
Then we need to give the full signature of the init method to use:
let out6 = array.compactMap(TwoTimesInt.init(value:))
Another thing worth mentioning, when it comes to which method is selected, is the full signature of the init method, including whether or not it returns an optional value. So, for example, if we change the signature of the second init method to return an optional value:
init?(twiceTheValue: Int) {
    self.value = twiceTheValue / 2
    self.twiceTheValue = twiceTheValue
}
then compactMap will favour this init, since it expects a closure that returns an optional value. So if we remove the argument name in the call,
let out7 = array.compactMap(TwoTimesInt.init)
will use the second init, while the map function, called the same way, will use the first init method:
let out8 = array.map(TwoTimesInt.init)
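With array = [1, 2, 3] as above, the two calls should therefore produce different results:
print(out7) // [0 - 1, 1 - 2, 1 - 3] (second init, element type TwoTimesInt)
print(out8) // [1 - 2, 2 - 4, 3 - 6] (first init, element type TwoTimesInt)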

Can't perform methods of objects stored in Array[Any]

I want to store objects of different types in an array.
The program below is only a minimal demo. An instance of Object1 is stored in the anyArray: [Any] array. The print statement prints the expected object type, and in the following line the test of the stored object's type returns true. This means that the correct object type is known at runtime, and everything seems to be fine.
class Object1 {
    var name = "Object1"
}
var anyArray: [Any] = [Object1()]
print("\(type(of: anyArray[0]))")
let testResult = anyArray[0] is Object1
print("Test result:\(testResult)")
//print("Name:\((anyArray[0]).name)")
Console output:
Object1
Test result:true
However, if I try to print out the name property of the object, I get an error message from the editor:
Value of type 'Any' has no member 'name'
Well, at compile time the object's type is unknown. That's why the compiler complains. How can I tell the compiler that it is OK to access the properties of the stored object?
The difference comes down to when type checking happens: at runtime or at compile time.
The is operator checks at runtime whether the expression can be cast to the specified type. type(of:) checks, at runtime, the exact type, without consideration for subclasses.
anyArray[0].name doesn't compile since the type Any doesn't have a name property.
If you're sure anyArray[0] is an Object1, you could use the downcast operator as!:
print("\((anyArray[0] as! Object1).name)")
To check at runtime whether an element of anyArray is an Object1, use optional binding with the conditional casting operator as? and if let:
if let object = anyArray[0] as? Object1 {
    print(object.name)
}
Or use the guard statement, if you want to use that object in the rest of the scope:
guard let object = anyArray[0] as? Object1 else {
    fatalError("The first element is not an Object1")
}
print(object.name)
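If you need to visit every matching element, a for-case loop is a compact alternative to binding each index separately:
for case let object as Object1 in anyArray {
    print(object.name)
}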
If all objects in your array have a name property, and you don't want to go through all the hoops of optional binding repeatedly, then use a protocol. Your code will look like this:
protocol Named {
    var name: String { get set }
}
class Object1: Named {
    var name = "Object1"
}
var anyArray: [Named] = [Object1()]
print("\(type(of: anyArray[0]))")
let testResult = anyArray[0] is Object1
print("Test result:\(testResult)")
print("Name:\(anyArray[0].name)")
Notice that anyArray is now an array of Named objects, and that Object1 conforms to the Named protocol.
To learn more about protocols, have a look here.
Your object is still of type Any. You only checked whether it can be of type Object1; you did not cast it. If you want the object as Object1, you need to cast it.
Also, if multiple classes can have name, you need to use a protocol, as @vadian mentioned in his comment, and cast to that protocol:
protocol NameProtocol {
    var name: String { get set }
}
class Object1: NameProtocol {
    var name = "Object1"
}
if let testResult = anyArray[0] as? NameProtocol {
    print(testResult.name)
}
Edit: "I want to store objects of different types in an array". The solution that you have marked as correct will not work if the objects do not all conform to the protocol.

.self after struct type in Swift

I’m confused by a line of code found in the Metal example where the memory pointer is bound to a type.
uniforms = UnsafeMutableRawPointer(uniformBuffer.contents()).bindMemory(to: Uniforms.self, capacity: 1)
My confusion is about the .self after the Uniforms type. Uniforms is a struct defined in an Objective-C file, and the code won't run without .self being in the call. Why is that necessary?
The .self returns the metatype instance for the corresponding type. Think of it as a typesafe type identifier (e.g., way safer than using a string for that). You can then safely call the available initializers, static methods, and static properties on such a metatype instance.
For instance, you could store it in a variable as well:
let metatype: Uniforms.Type = Uniforms.self
and Uniforms.Type is the actual metatype (i.e., the type's type).
Metatype crash course. A very quick example to get a feel for how this meta stuff might actually be useful:
class Super {
    let id: Int
    required init(id: Int) { self.id = id }
}
class SubA: Super { ... }
class SubB: Super { ... }
let subclass: Super.Type = SubA.self
and then, later on, use subclass to create an instance without hardcoding the actual subclass type name:
let obj = subclass.init(id: 123) // new SubA instance.
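Checking the dynamic type confirms which subclass was instantiated:
print(type(of: obj)) // SubA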
In Swift, .self can be used on a type to extract its metatype, or on an instance of a type. For example, use .self to get the metatype and pass it to an API:
self.tableView.register(
    UITableViewCell.self, forCellReuseIdentifier: "myUIViewCell")

Argument of '#selector' does not refer to an initializer or method

I updated my project from Swift 2.2 to Swift 3.0, but now I receive the error "Argument of '#selector' does not refer to an initializer or method".
Here is the code:
for object in Students {
    let sectionNumber = collation.section(for: object.firstName!, collationStringSelector: #selector(NSObjectProtocol.self))
    sections[sections.count - 1 - sectionNumber].append(object)
}
class Person: NSObject {
    @objc var name: String
    init(name: String) {
        self.name = name
    }
}
let p = Person(name: "Alice")
let collation = UILocalizedIndexedCollation.current()
collation.section(for: p, collationStringSelector: #selector(getter: Person.name))
This also works because Selector comes from Objective-C, which is why we need the NSObject subclass and the @objc attribute.
As per the docs:
func section(for object: Any, collationStringSelector selector: Selector) -> Int
Description
Returns an integer identifying the section in which a model object belongs.
The table-view controller should iterate through all model objects for the table view and call this method for each object. If the application provides a Localizable.strings file for the current language preference, the indexed-collation object localizes each string returned by the method identified by selector. It uses this localized name when collating titles. The controller should use the returned integer to identify a local “section” array in which it should insert object.
Parameters
object
A model object of the application that is part of the data model for the table view.
selector
A selector that identifies a method returning an identifying string for object that is used in collation. The method should take no arguments and return an NSString object. For example, this could be a name property on the object.
Returns
An integer that identifies the section in which the model object belongs. The numbers returned indicate a sequential ordering.
Solution
Change it like below:
collation.section(for: "test", collationStringSelector: #selector(getStr)) // here getStr is another function returning a String
func getStr() -> String {
    return "test" // this should return an identifying string for the object that is used in your collation
}
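Applied to the original loop, a minimal sketch (assuming Student is an NSObject subclass whose firstName is exposed to Objective-C with @objc) passes each object itself, together with a selector that exists on it:
for object in Students {
    let sectionNumber = collation.section(for: object, collationStringSelector: #selector(getter: Student.firstName))
    sections[sections.count - 1 - sectionNumber].append(object)
}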
I implemented user2215977's answer, but the app crashed again and again. Then I just changed #selector(NSObjectProtocol.self) to "self". All the errors are gone, but I received one warning: "Use of string literal for Objective-C selectors is deprecated; use #selector instead".
If anyone has an idea how to resolve this warning, please share it. Otherwise, the error is gone now.

Declare an array of Int in Realm Swift

How can I declare an array of integers inside RLMObject?
Like:
dynamic var key:[Int]?
Gives the following error :
Terminating app due to uncaught exception 'RLMException', reason: ''NSArray' is not supported as an RLMObject property. All properties must be primitives, NSString, NSDate, NSData, RLMArray, or subclasses of RLMObject. See https://realm.io/docs/objc/latest/api/Classes/RLMObject.html for more information.'
Lists of primitives are unfortunately not supported yet. There is issue #1120 to track adding support for that. You'll find some ideas there for how you can work around it currently.
The easiest workaround is to create an object that holds an Int value, and then have the model keep a List of that object:
class Foo: Object {
    let integerList = List<IntObject>() // Workaround
}
class IntObject: Object {
    @objc dynamic var value = 0 // @objc is required alongside dynamic in Swift 4
}
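Usage then goes through the wrapper objects (a sketch, assuming RealmSwift's List API):
let foo = Foo()
let wrapped = IntObject()
wrapped.value = 42
foo.integerList.append(wrapped)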
Fortunately, arrays of primitive types are now supported in Realm 3.0 and above (Oct 31, 2017):
You can now store primitive types or their nullable counterparts (more specifically: booleans, integer and floating-point number types, strings, dates, and data) directly within RLMArrays or Lists. If you want to define a list of such primitive values you no longer need to define cumbersome single-field wrapper objects. Instead, you can just store the primitive values themselves!
class MyObject: Object {
    @objc dynamic var myString: String = ""
    let myIntArray = List<Int>()
}
Source: https://realm.io/blog/realm-cocoa-reaches-3-0/
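Appending primitives then becomes direct (a sketch using List's append(objectsIn:)):
let obj = MyObject()
obj.myIntArray.append(objectsIn: [1, 2, 3])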
The accepted answer is very costly in terms of memory: you might end up with a List holding a very big number of objects.
It's not a matter of right and wrong, but I think it's good to write down a different workaround here.
Another approach:
I decided to use a single string to represent an Int array.
In my Realm class I defined a variable:
dynamic var arrInt: String? = nil
And use it very easily:
let arrToSave = [0, 1, 33, 12232, 394]
<MY_CUSTOM_REALM_CLASS>.arrInt = arrToSave.map { String(describing: $0) }.joined(separator: ",")
And the way back:
let strFetched = <MY_CUSTOM_REALM_CLASS>.arrInt
let intArray = strFetched?.components(separatedBy: ",").compactMap { Int($0) } ?? []
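Round-tripping the example array should restore the original values:
print(intArray) // [0, 1, 33, 12232, 394]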
I will be happy to hear your feedback, as I think this approach is better.
As the error message states, you have to use RLMArray, or rather its Swift equivalent, List.
See: Realm docs
