Convert Unmanaged<AnyObject>! to Bool in Swift - ios

I am trying to get the result of a method on an existing Objective C class, called using performSelector:
let result = controlDelegate.performSelector("methodThatReturnsBOOL") as? Bool
I need to cast this result to Swift's Bool type.
The code above gives a warning,
"Cast from 'Unmanaged<AnyObject>!' to unrelated type 'Bool' always fails"
and the result is always "false", even when the method returns YES.
Any suggestions for converting the result to Bool?
Signature for methodThatReturnsBOOL
- (BOOL)methodThatReturnsBOOL
{
    return YES;
}

It's been a long time and this has remained unanswered, so I'm adding what I have learned along the way.
To read a BOOL value returned by an Objective C method through performSelector, you can simply test whether the result is nil:
if let result = controlDelegate.performSelector("methodThatReturnsBOOL") {
    print("true")
} else {
    print("false")
}
Here you can also assign the value true/false to a Swift Bool, if required.
Note: I tried casting Unmanaged<AnyObject> directly to Bool using takeRetainedValue(), as suggested by many answers on SO, but that doesn't seem to work in this scenario.

You can't do what you want nicely in Swift. My issue with the accepted solution is that it takes advantage of the idea that 0x0 just so happens to be nil. This isn't actually guaranteed by the Swift and Objective-C specifications. The same applies to boolean values, since 0x0 being false and 0x1 being true is just an arbitrary implementation decision. Aside from being technically incorrect, it's also awful code to understand: without thinking about what a nil pointer is on most platforms (32/64 bits of zeros), what was suggested makes zero sense.
After talking to an engineer at WWDC '19 for a while, he suggested that you can actually use value(forKey:) with the key being the function name/selector description. This works since the Obj-C runtime will actually execute any function with the given name/key in order to evaluate the expression. This is still a bit hacky since it requires knowledge of the Objective-C runtime, however it is guaranteed to be platform and implementation independent because value(forKey:) returns an Any? which can be cast into an Int or a Bool without any trouble at all. By using the built-in casts instead of speculating on what 0x0 or 0x1 means, we avoid the whole issue of interpreting a nil pointer.
Example:
@objc func doThing() -> Bool {
    return true
}
...
let target = someObjectWithDoThing
let selectorCallResult = target.value(forKey: "doThing")
let intResult = selectorCallResult as? Int   // Optional(1)
let boolResult = selectorCallResult as? Bool // Optional(true)

This is the solution in Swift 3, as the methods are a bit different. This version also checks whether the Objective-C object responds to the selector, to avoid crashing.
import ObjectiveC
let selector = NSSelectorFromString("methodThatReturnsBOOL")
guard controlDelegate.responds(to: selector) else {
    return
}
if let result = controlDelegate.perform(selector) {
    print("true")
} else {
    print("false")
}
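If you need the result as an actual Swift Bool rather than two print branches, you can wrap the same nil check in a small helper. This is a minimal sketch, assuming controlDelegate is an NSObject subclass; the helper name boolResult(of:on:) is mine, not from the question, and it relies on the same "nil means NO" behaviour of perform(_:) used above:
import Foundation

func boolResult(of selectorName: String, on target: NSObject) -> Bool {
    let selector = NSSelectorFromString(selectorName)
    guard target.responds(to: selector) else { return false }
    // perform(_:) returns nil when the underlying Objective-C method returned NO
    return target.perform(selector) != nil
}

let flag = boolResult(of: "methodThatReturnsBOOL", on: controlDelegate)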

Similarly to my answer here, this can be done with @convention(c) instead:
let selector: Selector = NSSelectorFromString("methodThatReturnsBOOL")
let methodIMP: IMP! = controlDelegate.method(for: selector)
let boolResult: Bool = unsafeBitCast(methodIMP, to: (@convention(c) (Any?, Selector) -> Bool).self)(controlDelegate, selector)
This particular syntax is available since Swift 3.1; it is also possible with one extra variable in Swift 3.

A more compact way to get a Bool:
let result = controlDelegate.perform(NSSelectorFromString("methodThatReturnsBOOL")) != nil

Related

How can I convert a string in a textfield to an Int in Swift?

I tried for a long time to turn the text into an Int but it did not work. I tried it like this:
(AnzahlString is a textfield)
var AnzahlAInt = 0
if let AnzahlAString = AnzahlString.text {
    let AnzahlAInt = Int(AnzahlAString)
}
But then I always get the error:
Value of optional type 'Int?' must be unwrapped to a value of type 'Int'
Then I added a ! at the end, Int(AnzahlAString)!, so I don't get an error, but now when I press the button, the app crashes. It was predictable, but how can I change this to an Int without the !?
At first glance, it looks like you have two things to check for:
is AnzahlString.text present, and
does it represent an Int
The first check is in fact not necessary, since .text will never return nil, even though it's marked as Optional. This means you can safely force-unwrap it.
The second check is easily done by using the ?? operator:
let AnzahlAInt = Int(AnzahlString.text!) ?? 0
PS, just as a stylistic hint: variable names in Swift usually start with a lowercase letter; names starting with capital letters are used for types.
PPS: your code as written shadows AnzahlAInt, so the value of your var is never changed.
The reason the resulting Int is optional is that parsing might or might not succeed. For example, if you try to parse the string "Fluffy Bunnies" into an Int, there is no reasonable Int that can be returned, therefore the result of parsing that string will be nil.
Furthermore, if you force the parser by using !, you're telling Swift that you know for sure that the string you pass will always result in a valid Int, and when it doesn't, the app crashes.
You need to handle the situation in which the parse result is nil. For example:
if let AnzahlAIntResult = Int(AnzahlAString) {
    // We only get here if the parse was successful and we have an Int.
    // AnzahlAIntResult is now an Int, so it can be assigned to AnzahlAInt.
    AnzahlAInt = AnzahlAIntResult
}
You did a good job so far but missed one thing.
This line tries to convert the String into an Int. However, this can fail, since your String can be something like "dfhuse":
let AnzahlAInt = Int(AnzahlAString)
This is why the result of Int(AnzahlAString) is an Optional (Int?). To use it as a real Int, you have to unwrap it.
The first solution is !, but every time this fails your app crashes, so it's not a good idea.
The best solution is Optional Binding, which you already used to get the text of your text field:
if let AnzahlAString = AnzahlString.text {
    if let safeInt = Int(AnzahlAString) {
        // You can use safeInt as a real Int
    } else {
        print("Converting your String to an Int failed badly!")
    }
}
Hope this helps you. Feel free to ask again if something is unclear.
For unwrapping you can also use guard, like this.
Simple and easy:
guard let AnzahlAInt = Int(AnzahlString.text!) else {
    return
}
print(AnzahlAInt)
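If you also want to avoid force-unwrapping the text property, a short sketch (Swift 3 syntax, using the names from the question) binds both optionals in a single guard:
guard let text = AnzahlString.text, let anzahl = Int(text) else {
    return
}
print(anzahl)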

Result of call to finalizeStatusForAccount is Unused in swift 3

self.coreDataStack.finalizeStatusForAccount(currentAccount, fromStatus: ObjectStatus.tempClear, toStatus: ObjectStatus.clear)
self.coreDataStack.finalizeStatusForAccount(currentAccount, fromStatus: ObjectStatus.tempGreen, toStatus: ObjectStatus.green)
I'm getting these two warnings; can anyone please suggest how to remove them?
The warning says that the function finalizeStatusForAccount returns a value yet you don’t use it. You might want to assign the return value to a variable, like this:
let returnValue = self.coreDataStack.finalizeStatusForAccount...
Or if you know you don’t need to check the return value you would write this:
let _ = self.coreDataStack.finalizeStatusForAccount...
The latter is a standard pattern in Swift.
If finalizeStatusForAccount is your own function, you can add @discardableResult like this:
@discardableResult
func finalizeStatusForAccount() -> Any {
    return 100500
}
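With the attribute in place, a call that ignores the result no longer produces the warning. For example, with the simplified signature above:
finalizeStatusForAccount() // no "result of call is unused" warning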

How to convert 'void *' return from a C function to 'UnsafeMutableRawPointer' in Swift 3?

I'm trying to convert a Lua bridge from Swift 2 to Swift 3. I am not the original author, so there are aspects of the library I don't know very well, and the original author seems not interested in continuing to work on the project. I have most of the conversion done, but there remains one place where I'm stuck and could not figure it out. I've tried searching on SO and on the Internet but could not find anything that could help me solve the problem.
If anyone is interested in looking at the full source code, here is my fork of the project on GitHub: https://github.com/weyhan/lua4swift (my changes are in the Swift3 branch).
Allow me to set up the context for the error I'm stuck on. There is a Userdata class; specifically, in the method userdataPointer<T>() -> UnsafeMutablePointer<T>, the C function lua_touserdata returns the block address of the userdata as a void * pointer.
Original code written in Swift 2:
public class Userdata: StoredValue {
    public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
        push(vm)
        let ptr = lua_touserdata(vm.vm, -1)
        vm.pop()
        return UnsafeMutablePointer<T>(ptr)
    }

    public func toCustomType<T: CustomTypeInstance>() -> T {
        return userdataPointer().memory
    }

    public func toAny() -> Any {
        return userdataPointer().memory
    }

    override public func kind() -> Kind { return .Userdata }
}
After the conversion with the Xcode 8 migration tool, Xcode complains about the return line with the error Cannot invoke initializer for type 'UnsafeMutablePointer<T>' with an argument list of type '(UnsafeMutableRawPointer?)':
return UnsafeMutablePointer<T>(ptr)
I've fixed it with:
return (ptr?.assumingMemoryBound(to: T.self))!
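For context, here is roughly what userdataPointer looks like after that change (everything else is unchanged from the Swift 2 version above):
public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
    push(vm)
    // lua_touserdata is imported into Swift 3 as returning UnsafeMutableRawPointer?
    let ptr = lua_touserdata(vm.vm, -1)
    vm.pop()
    return (ptr?.assumingMemoryBound(to: T.self))!
}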
Following the above change, Xcode 8 now complains about the calling statement in createCustomType:
public func createCustomType<T: CustomTypeInstance>(setup: (CustomType<T>) -> Void) -> CustomType<T> {
    lua_createtable(vm, 0, 0)
    let lib = CustomType<T>(self)
    pop()

    setup(lib)

    registry[T.luaTypeName()] = lib
    lib.becomeMetatableFor(lib)
    lib["__index"] = lib
    lib["__name"] = T.luaTypeName()

    let gc = lib.gc
    lib["__gc"] = createFunction([CustomType<T>.arg]) { args in
        let ud = args.userdata
        // ******* Here's the line that is causing problem in Swift 3
        (ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
        // *******
        let o: T = ud.toCustomType()
        gc?(o)
        return .Nothing
    }

    if let eq = lib.eq {
        lib["__eq"] = createFunction([CustomType<T>.arg, CustomType<T>.arg]) { args in
            let a: T = args.customType()
            let b: T = args.customType()
            return .Value(eq(a, b))
        }
    }
    return lib
}
Where I'm getting stuck is the line :
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
I believe the original author is attempting to clear the memory block that the pointer returned by userdataPointer() points to.
With the Xcode 8 auto migration tool, the above line was converted to:
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
However, Xcode then complains that Cannot convert call result type 'UnsafeMutablePointer<_>' to expected type 'UnsafeMutableRawPointer'.
From my research, the change to the return line in userdataPointer seems correct, so I think the issue is with the cast to UnsafeMutableRawPointer. I've tried dropping the cast to UnsafeMutableRawPointer and invoking ud.userdataPointer().deinitialize() directly, but then I get the error Generic parameter 'T' could not be inferred.
Other things I've tried include converting the UnsafeMutablePointer to UnsafeMutableRawPointer, but it always results in Xcode 8 complaining about one thing or another. Any suggestion on how to get this to work?
As you may have already found out, Swift 3 attempts to provide better type safety when it comes to pointers. UnsafeMutablePointer can now only represent a pointer to an instance of a known type. In Swift 2, a C void * was represented by UnsafeMutablePointer<Void>, allowing void and non-void pointers to be treated in the same way, including trying to call a de-initializer of the pointed-to type, which is what the destroy() method in the problematic line of code does:
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
In Swift 3 the de-initializer on the pointee is called using the deinitialize() method of the UnsafeMutablePointer structure. It appears that the migration assistant got confused. The line
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
makes little sense because (1) UnsafeMutablePointer cannot be converted using as to UnsafeMutableRawPointer;
(2) UnsafeMutableRawPointer has no deinitialize() method. In Swift 3, UnsafeMutableRawPointer is a special type to represent void*. It is actually quite understandable why the migration tool made this mistake: it blindly replaced destroy() with deinitialize() and UnsafeMutablePointer<Void> with the corresponding Swift 3 type UnsafeMutableRawPointer, without realizing that the conversion would not work.
I don't quite understand why calling destroy() on a void pointer would work in Swift 2. Maybe this was a bug in the program, or some compiler trick allowed the correct de-initializer to be called. Without knowing enough about the code, I can't be more specific than to suggest analyzing it to figure out what type is pointed to by the pointer on which destroy() was called. For example, if we know for sure that it is always the placeholder type T used on the following line:
let o: T = ud.toCustomType()
then the offending line simply becomes
(ud.userdataPointer() as UnsafeMutablePointer<T>).deinitialize()
We need the conversion in the parentheses to allow the compiler to infer the generic parameter.
Thank you for bringing up an interesting problem. BTW, once you get over this obstacle, you are likely to run into other problems. One thing that jumps out is that UnsafeMutablePointer has no .memory in Swift 3; you'll have to use .pointee instead.
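For example (a small sketch based on the Userdata class shown earlier), toCustomType() would become:
public func toCustomType<T: CustomTypeInstance>() -> T {
    return userdataPointer().pointee // was .memory in Swift 2
}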
Here's an update. After playing with Swift 2.2 on Linux, I realize that calling destroy() on an UnsafeMutablePointer<A> cast as UnsafeMutablePointer<Void> won't call A's deinitializer, even if it has one. So, you have to be careful with that line...
Try creating an instance of UnsafeMutableRawPointer instead of trying to cast it:
UnsafeMutableRawPointer<T>(ud.userdataPointer()).destroy()

Swift iOS take in an input and use it for a function name

func callFunctionName(parameters: String) -> returnType
{
    var somevalue = parameters
    var returnValue = somevalue()
    return returnValue
}
Is there a way to take in an input and use it as a function name?
For example: let's say the input is green, I want to call the function green; if the input is red, call the function red, etc...
Or do I have to have a huge if statement that checks each input and calls a different function?
This is not possible in Swift. You will have to store any functions you want to call in your own dictionary, and then use that to look up functions by name.
A "huge statement" might be feasible for a small number of functions, and it would certainly perform faster, but the ideal approach would be to store them in a dictionary.
However, if you are dealing with objects:
if exampleObject.respondsToSelector("exampleFunction")
{
    exampleObject.performSelector("exampleFunction")
}
This approach currently works with all classes, be it Objective-C or Swift.
This is easily possible in Objective-C by using:
[self performSelector:NSSelectorFromString(@"green")];
But Swift is less dynamically typed than Objective-C and has less support for reflection. The Objective-C approach described above is very prone to crashes at runtime if the input doesn't match a function that exists (e.g. "purple" when you don't have a function for purple).
Using a big if statement is not an unreasonable way to approach it.
As the other answers said, an if statement is probably the best way to go about this.
override func viewDidLoad() {
    if someValue == "green" {
        green() // This will run whatever you have in the green() function below
    }
}

func green() {
    // put code for output of green here
}
Then all you have to do is make separate functions for all your outputs, such as the green() func above.
This is the closest I could get. (And it's partially based on the answer of Vatsal Manot)
The idea is to use closures.
First of all we define the return type of the closure: let's use Int (of course you can change this later).
typealias colorClosureReturnType = Int
Now let's define the type of a closure that receives no parameters (you can change this too) and returns colorClosureReturnType:
typealias colorClosureType = () -> (colorClosureReturnType)
Fine, now let's create a dictionary where the key is a String and the value is a closure of type colorClosureType:
let dict: [String: colorClosureType] = [
    "red": { return 0 /* you can write here all the logic you need */ },
    "green": { return 1 /* also here */ },
    "blue": { return 2 /* and here */ }
]
Usually I let Swift's type inference infer the type of the variable/constant, but this time, for the sake of clarity, I explicitly declared the type of the dictionary.
Now we can build a simple function that receives a String and returns an optional colorClosureReturnType.
func callClosure(colorName: String) -> colorClosureReturnType? {
    return dict[colorName]?()
}
As you can see, the function looks up in the dictionary the closure associated with the key received as a parameter. If it finds one, it runs the closure and returns the result.
If the dictionary does not contain the requested key, the function returns nil. That's why the return type of this function is colorClosureReturnType? and not colorClosureReturnType.
Finally some tests:
callClosure("red") // 0
callClosure("green") // 1
callClosure("blue") // 2
func callFunctionName(parameters: String) -> ()
{
    _ = NSTimer.scheduledTimerWithTimeInterval(0.1, target: self, selector: Selector(parameters), userInfo: nil, repeats: false)
}

func green() {
}

How to cast a typed array to one of the type's protocols?

I have this in my playground:
func foo(printables: [Printable]) {
    // do something
}

enum MenuOptions: String, Printable {
    case ChangePickup = "Change Pickup Time"
    case RequestSupport = "Request Support"

    var description: String {
        get {
            return self.rawValue
        }
    }
}

var menuOptions: [MenuOptions] = [.ChangePickup]
foo(menuOptions)
let printables = menuOptions as [Printable]
The last two lines give a compiler error. I would expect menuOptions to be implicitly cast to [Printable], but the compiler complains with a [MenuOptions] is not convertible to [Printable] error. Am I missing something?
If I do menuOptions as! AnyObject as! [Printable], the compiler doesn't complain and the code works properly, but this seems dirty. Funnily enough, doing foo(menuOptions.map { $0 }) also works! Is this just a compiler bug?
You want a function that accepts an array of any type conforming to the Printable protocol, so define it as a generic function with a type constraint instead:
func foo<T: Printable>(printables: [T]) {
    // do something
}
The function will now work in your context. It will take an array of any object type that conforms to Printable.
Here are the relevant docs on Generics, specifically Type Constraints.
As to why this works but your definition doesn't: this is just Swift syntax. Printable is a protocol, but T is a type (generic, but still a type). You want your function to accept any type (T) that conforms to a protocol (<T: Protocol>). This is just how the language has been designed.
Using menuOptions as! AnyObject as! [Printable] may work in this example, but it is a bad practice. Using as! is like telling the compiler "trust me, this will work." This is fine in limited cases, but sets you up for trouble in the long run. Using as! turns off the compiler's type checking, which means if there is a problem it will happen at runtime, and your app will crash.
Again, I highly suggest reading the docs on Type Constraints, which I linked above. It explains it in detail.
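With the generic definition above, the original call site compiles as written, since the element type satisfies the constraint:
foo(menuOptions) // T is inferred as MenuOptions, which conforms to Printable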
I found that this works. It seems the Swift compiler is not that smart.
var menuOptions: [MenuOptions] = [MenuOptions.ChangePickup]
let printables: [Printable] = menuOptions as! [Printable]
foo(printables)
I don't know why these are not working
foo(menuOptions)
foo(menuOptions as! [Printable])
