Converting Int64 to Float in Swift 2.2 - ios

Please let me know what I am doing wrong in the statements below; it's not compiling (where I have cast the Int64 to Float):
func connection(connection: NSURLConnection, didReceiveResponse response: NSURLResponse) {
    NSLog("data came")
    var expectedDownloadSize:Float = (Float) response.expectedContentLength
}

It looks like you are getting confused between Obj-C and Swift in a couple of places. As Robert says, you place the value you want to cast inside parentheses after the type, e.g. Float(response.expectedContentLength). It is also worth noting that print(...) is generally preferred over NSLog in Swift. Additionally, you no longer need to explicitly declare the variable type, as Swift will infer it automatically.

Casting in Swift is done slightly differently:
var expectedDownloadSize = Float(response.expectedContentLength)
Rather than casting the value, you actually use Float's initialiser, which takes an Int64 parameter.
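Putting those suggestions together, a minimal sketch of the corrected delegate method could look like this (Swift 2.2 syntax; the extra print of the size is just for illustration):
func connection(connection: NSURLConnection, didReceiveResponse response: NSURLResponse) {
    print("data came")
    // Float's initialiser accepts the Int64 returned by expectedContentLength.
    let expectedDownloadSize = Float(response.expectedContentLength)
    print("expecting \(expectedDownloadSize) bytes")
}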

In Swift you can do it like this:
var expectedDownloadSize: Float = Float(response.expectedContentLength)

Related

Syntax to Call Swift Extension Function from Objective-C

I have a project with both Objective-C and Swift. Everything is hooked up properly, so I can generally call extensions on classes without issue. In this case, however, I have to pass an argument to the extension and am getting hung up on the syntax.
Here is the Swift 3 Extension
extension Double {
    /// Rounds the double to decimal places value
    func rounded(toPlaces places: Int) -> Double {
        let divisor = pow(10.0, Double(places))
        return (self * divisor).rounded() / divisor
    }
}
From Swift you can call it with
let x = Double(0.123456789).rounded(toPlaces: 4)
The following is not working in Objective-C: I have played around with it but can't get the correct syntax:
double test = 0.123456789;
double roundedtot = test.roundedToPlaces:2;
The specific error is
'Member reference base type 'double' is not a structure or union'
which I gather is a C error, but since you can call the function in Swift it seems there ought to be a way to call it in Objective-C.
You need to add @objc to the extension definition in order to call it from Objective-C.
Also, it's important to note that only extensions for classes (not for structs or enums) are accessible from Objective-C.
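Given that restriction, one possible workaround (not from the original answer; the class name RoundingHelper and the bridging header name are assumptions) is to route the call through a small @objc class that wraps the Swift extension:
import Foundation

@objc(RoundingHelper) class RoundingHelper: NSObject {
    // Forwards to the Double extension defined above.
    @objc static func rounded(_ value: Double, toPlaces places: Int) -> Double {
        return value.rounded(toPlaces: places)
    }
}

// From Objective-C, after importing the generated "YourModule-Swift.h" header:
// double roundedtot = [RoundingHelper rounded:0.123456789 toPlaces:2];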

How to convert 'void *' return from a C function to 'UnsafeMutableRawPointer' in Swift 3?

I'm trying to convert a Lua bridge from Swift 2 to Swift 3. I am not the original author, so there are aspects of the library I don't know very well, and the original author doesn't seem interested in continuing to work on the project. I have most of the conversion done, but there remains one place where I'm stuck and could not figure it out. I've tried searching on SO and on the Internet but could not find anything that could help me solve the problem.
If anyone is interested in looking at the full source code, here is my fork of the project on GitHub: https://github.com/weyhan/lua4swift (my changes are in the Swift3 branch).
Allow me to set up the context for the error I'm stuck on. There is a Userdata class; specifically, in the method userdataPointer<T>() -> UnsafeMutablePointer<T>, the C function lua_touserdata returns the block address of the userdata as a void * pointer.
Original code written in Swift 2:
public class Userdata: StoredValue {
    public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
        push(vm)
        let ptr = lua_touserdata(vm.vm, -1)
        vm.pop()
        return UnsafeMutablePointer<T>(ptr)
    }

    public func toCustomType<T: CustomTypeInstance>() -> T {
        return userdataPointer().memory
    }

    public func toAny() -> Any {
        return userdataPointer().memory
    }

    override public func kind() -> Kind { return .Userdata }
}
After the conversion with the Xcode 8 migration tool, Xcode is complaining about the return line with the error Cannot invoke initializer for type 'UnsafeMutablePointer<T>' with an argument list of type '(UnsafeMutableRawPointer?)':
return UnsafeMutablePointer<T>(ptr)
I've fixed it with:
return (ptr?.assumingMemoryBound(to: T.self))!
Following the above change, Xcode 8 is now complaining about the calling statement in createCustomType:
public func createCustomType<T: CustomTypeInstance>(setup: (CustomType<T>) -> Void) -> CustomType<T> {
    lua_createtable(vm, 0, 0)
    let lib = CustomType<T>(self)
    pop()

    setup(lib)

    registry[T.luaTypeName()] = lib
    lib.becomeMetatableFor(lib)
    lib["__index"] = lib
    lib["__name"] = T.luaTypeName()

    let gc = lib.gc
    lib["__gc"] = createFunction([CustomType<T>.arg]) { args in
        let ud = args.userdata

        // ******* Here's the line that is causing a problem in Swift 3
        (ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
        // *******

        let o: T = ud.toCustomType()
        gc?(o)
        return .Nothing
    }

    if let eq = lib.eq {
        lib["__eq"] = createFunction([CustomType<T>.arg, CustomType<T>.arg]) { args in
            let a: T = args.customType()
            let b: T = args.customType()
            return .Value(eq(a, b))
        }
    }
    return lib
}
Where I'm getting stuck is the line :
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
I believe the original author is attempting to clear the memory block that the pointer returned by userdataPointer() points to.
With the Xcode 8 auto migration tool the above line is converted as below:
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
However, Xcode then complains that Cannot convert call result type 'UnsafeMutablePointer<_>' to expected type 'UnsafeMutableRawPointer'.
From my research, the change to the return line in userdataPointer seems correct, so I think the issue is with the cast to UnsafeMutableRawPointer. I've tried dropping the cast to UnsafeMutableRawPointer and invoking ud.userdataPointer().deinitialize() directly, but I get the error Generic parameter 'T' could not be inferred.
The other thing I've tried is converting the UnsafeMutablePointer to UnsafeMutableRawPointer, but it always results in Xcode 8 complaining about one thing or another. Any suggestion on how to get this to work?
As you may have already found out, Swift 3 attempts to provide better type safety when it comes to pointers. UnsafeMutablePointer can now only represent a pointer to an instance of a known type. In Swift 2, a C void * was represented by UnsafeMutablePointer<Void>, allowing void and non-void pointers to be treated in the same way, including trying to call a de-initializer of the pointed-to type, which is what the destroy() method in the problematic line of code does:
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
In Swift 3 the de-initializer on the pointee is called using the deinitialize() method of the UnsafeMutablePointer structure. It appears that the migration assistant got confused. The line
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
makes little sense because (1) UnsafeMutablePointer cannot be converted using as to UnsafeMutableRawPointer, and
(2) UnsafeMutableRawPointer has no deinitialize() method. In Swift 3, UnsafeMutableRawPointer is a special type representing void*. It is actually quite understandable why the migration tool made this mistake: it blindly replaced destroy() with deinitialize() and UnsafeMutablePointer<Void> with the corresponding Swift 3 type UnsafeMutableRawPointer, without realizing that the conversion would not work.
I don't quite understand why calling destroy() on a void pointer would work in Swift 2. Maybe this was a bug in the program, or some compiler trick allowed the correct de-initializer to be called. Without knowing enough about the code, I can't be more specific than to suggest analyzing it to figure out the type pointed to by the pointer on which destroy() was called. For example, if we know for sure that it is always the placeholder type T used on the following line:
let o: T = ud.toCustomType()
then the offending line simply becomes
(ud.userdataPointer() as UnsafeMutablePointer<T>).deinitialize()
We need the conversion in the parentheses to allow the compiler to infer the generic parameter.
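Putting the two changes together, here is a minimal sketch of what the Swift 3 version could look like, assuming the pointee really is the T from createCustomType (the userdataPointer body mirrors the asker's own fix):
public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
    push(vm)
    // lua_touserdata hands back the userdata block as an optional raw (void*) pointer.
    let ptr = lua_touserdata(vm.vm, -1)
    vm.pop()
    // Reinterpret the raw pointer as a typed pointer to T.
    return ptr!.assumingMemoryBound(to: T.self)
}

// Inside the __gc closure, run T's de-initializer explicitly:
(ud.userdataPointer() as UnsafeMutablePointer<T>).deinitialize()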
Thank you for bringing up an interesting problem. BTW, once you get over this obstacle, you are likely to run into other problems. One thing that jumps out is that UnsafeMutablePointer has no .memory in Swift 3; you'll have to use .pointee instead.
Here's an update. After playing with Swift 2.2 on Linux, I realize that calling destroy() on an UnsafeMutablePointer<A> cast as UnsafeMutablePointer<Void> won't call A's deinitializer, even if it has one. So, you have to be careful with that line...
Try creating an instance of UnsafeMutableRawPointer instead of trying to cast it:
UnsafeMutableRawPointer<T>(ud.userdataPointer()).destroy()

How do I encode a PersistentID for a song using NSCoder in Swift 3

This is my first time using NSCoder. There used to be a method called encodeInteger, but it seems to have vanished in Swift 3, and the docs don't seem to help.
It may be that the confusion lies in the difference between Int and UInt64. Are they the same?
Should I be using an NSKeyedArchiver, and if so, how does that work to comply with NSCoding?
[Screenshots in the original question show the code before, with the error, and after, without an error.]
Why don't you use NSNumber and encode it as an object? It'd look like this:
let bigNumber: UInt64 = 123
let number = NSNumber(value: bigNumber)

// Encoding it just like a String
coder.encode(number, forKey: "BigNumberKey")

// Decoding, then using the uint64Value property of NSNumber to get the UInt64 back
if let object = coder.decodeObject(forKey: "BigNumberKey") as? NSNumber {
    let decodedBigNumber = object.uint64Value
}
If that's a requirement for some reason, NSCoder supports the encoding of Int64 (and you could cast it, described here).
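A minimal sketch of that Int64 route (the key name and persistentID property are made up for illustration; it assumes the persistent ID is held as a UInt64):
// Encode: reinterpret the UInt64 bit pattern as an Int64, which NSCoder accepts directly.
coder.encode(Int64(bitPattern: persistentID), forKey: "PersistentIDKey")

// Decode: convert the bit pattern back to a UInt64.
let persistentID = UInt64(bitPattern: coder.decodeInt64(forKey: "PersistentIDKey"))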
The change from encodeInteger to just encode is part of SE-0005 (which affected a lot of different classes; UIColor.blueColor() is now UIColor.blue(), for instance).

Convert Unmanaged<AnyObject>! to Bool in Swift

I am trying to get the result of a method of an existing Objective-C class, called using performSelector:
let result = controlDelegate.performSelector("methodThatReturnsBOOL") as? Bool
I need to cast this result to Bool type of Swift.
The code mentioned above gives a warning
"Cast from 'Unmanaged<AnyObject>!' to unrelated type 'Bool' always fails"
and the result is always "false", even when the method returns YES.
Any suggestions for converting the result to Bool ?
Signature for methodThatReturnsBOOL
- (BOOL)methodThatReturnsBOOL
{
    return YES;
}
This has remained unanswered for a long time, so I'm adding what I have learned along the way.
To handle a BOOL value returned by an Objective-C method, you can simply check whether the returned value is nil:
if let result = controlDelegate.performSelector("methodThatReturnsBOOL") {
    print("true")
} else {
    print("false")
}
Here you can also assign the value true/false to a Swift Bool, if required.
Note: I tried converting Unmanaged<AnyObject> directly to Bool using takeRetainedValue(), as suggested by many answers on SO, but that doesn't seem to work in this scenario.
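For completeness, a minimal sketch of assigning that result straight to a Swift Bool with the same nil check (Swift 2 performSelector syntax assumed):
// performSelector returns nil when the BOOL result is NO and a non-nil
// Unmanaged<AnyObject> when it is YES, so a nil check yields a Swift Bool.
let returnedYes: Bool = controlDelegate.performSelector("methodThatReturnsBOOL") != nil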
You can't do what you want nicely in Swift. My issue with the accepted solution is that it takes advantage of the idea that 0x0 just so happens to be nil. This isn't actually guaranteed by the Swift and Objective-C specifications. The same applies to boolean values, since 0x0 being false and 0x1 being true is just an arbitrary implementation decision. Aside from being technically incorrect, it's also just awful code to understand: without thinking about what a nil pointer is on most platforms (32/64 bits of zeros), what was suggested makes zero sense.
After talking to an engineer at WWDC '19 for a while, he suggested that you can actually use value(forKey:) with the key being the function name/selector description. This works since the Obj-C runtime will actually execute any function with the given name/key in order to evaluate the expression. This is still a bit hacky since it requires knowledge of the Objective-C runtime; however, it is guaranteed to be platform- and implementation-independent because value(forKey:) returns an Any?, which can be cast to an Int or a Bool without any trouble at all. By using the built-in casts instead of speculating on what 0x0 or 0x1 means, we avoid the whole issue of interpreting a nil pointer.
Example:
@objc func doThing() -> Bool {
    return true
}
...
let target = someObjectWithDoThing
let selectorCallResult = target.value(forKey: "doThing")
let intResult = selectorCallResult as? Int   // Optional<Int(1)>
let boolResult = selectorCallResult as? Bool // Optional<Bool(true)>
This is the solution in Swift 3, as the methods are a bit different. This version also checks whether the Objective-C object responds to the selector, to avoid crashing.
import ObjectiveC

let selector = NSSelectorFromString("methodThatReturnsBOOL")

guard controlDelegate.responds(to: selector) else {
    return
}

if let result = controlDelegate.perform(selector) {
    print("true")
} else {
    print("false")
}
Similarly to my answer here, this can be done with @convention(c) instead:
let selector: Selector = NSSelectorFromString("methodThatReturnsBOOL")
let methodIMP: IMP! = controlDelegate.method(for: selector)
let boolResult: Bool = unsafeBitCast(methodIMP, to: (@convention(c) (Any?, Selector) -> Bool).self)(controlDelegate, selector)
This particular syntax is available since Swift 3.1; it is also possible with one extra variable in Swift 3.0.
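For reference, a minimal sketch of the Swift 3.0 variant with that extra step, naming the C function type through a typealias (the typealias name is made up):
// Swift 3.0: spell out the C function type first, then bit-cast the IMP to it.
typealias BoolReturningMethod = @convention(c) (Any?, Selector) -> Bool
let methodIMP: IMP! = controlDelegate.method(for: selector)
let typedMethod = unsafeBitCast(methodIMP, to: BoolReturningMethod.self)
let boolResult: Bool = typedMethod(controlDelegate, selector)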
A more compact way to get the Bool:
let result = controlDelegate.perform(NSSelectorFromString("methodThatReturnsBOOL")) != nil

Swift + CoreData: Can not set a Bool on NSManagedObject subclass - Bug?

I have a slightly strange issue that I can't seem to figure out. I have a simple entity with a custom NSManagedObject subclass:
@objc(EntityTest) class EntityTest: NSManagedObject {
    @NSManaged var crDate: NSDate
    @NSManaged var name: String
    @NSManaged var completed: Bool
    @NSManaged var completedOn: NSDate
}
This is the problem: I can create the object fine, set all the values, and store it in an array. However, later on, when I try to retrieve the same object, I can set all the values EXCEPT the "completed" field. I get a run-time error saying "EXC_BAD_ACCESS"; I can read the value, I just cannot set it.
The debugger points to:
0x32d40ae: je 0x32d4110 ; objc_msgSend + 108
0x32d40b0: movl (%eax), %edx
Maybe there is some issue due to it being treated as an Objective-C class and trying to send a message to set a boolean, which I know is a bit odd with Core Data originally representing them as NSNumbers.
Any ideas? I created the class myself; it is not generated.
EDIT:
entity.crDate = NSDate()                     // succeeds
entity.completed = false                     // fails
entity.setValue(false, forKey: "completed")  // succeeds
So for setting the Bool, using setValue(_:forKey:) on NSManagedObject works, but the direct setter does not, while for the non-Bool properties I can set them using the setters.
UPDATE:
While checking this a bit more, it seems that the first time I set the value after getting the object from NSEntityDescription, it uses normal Swift accessor methods. Later on, when I try to access the same object (which was stored in an array), it attempts to treat it as an Objective-C-style object and sends a message for a method named "setCompleted". I guess that makes sense, since I use dot notation to access it and I used the @objc directive.
I tested this by creating a "setCompleted" method; however, in that method I set the value using "completed = newValue", which makes a recursive call back to "setCompleted" and causes it to crash... Strange, so at this moment I still don't have a proper fix. It seems to only happen with Bools.
The only workaround is to use the setValue(_:forKey:) method of NSManagedObject. Perhaps I should file this as a bug report?
If you let Xcode 6 Beta 3 create the Swift files for your entities, it will create NSNumber properties for Core Data's Boolean type.
You can, however, just use Bool as the Swift type instead of NSNumber; that worked for me, though without using the dot syntax.
It will set the Swift Bool with an NSNumber, which maybe leads to a bug in the dot syntax.
To make it explicit, you should use the type NSNumber for attributes in the entity with the Boolean type. Then create a computed property (see The Swift Programming Language iBook, under Language Guide -> Properties -> Computed Properties) to return a Swift Bool. So you would never really store a Bool.
Like so:
@NSManaged var snack: NSNumber

var isSnack: Bool {
    get {
        return snack.boolValue
    }
    set {
        snack = NSNumber(bool: newValue)
    }
}
Of course it would be cool to hide the other (NSNumber attribute), but be patient and Apple will implement private attributes in the future.
Edit:
If you check the box to create scalar types, it will even use the type Bool in the automatically created Swift file!
So I think it is a bug.
Following on from CH Buckingham, who is entirely correct: you are attempting to store a primitive type in Core Data where it is expecting an NSNumber.
The correct usage would be entity.completed = NSNumber(bool: false).
This is also why you cannot retrieve this completed value as a Bool directly, and thus you would need to write:
let completed: Bool = entity.completed.boolValue
You can downcast your property from NSNumber to Bool type like this:
var someBoolVariable = numberValue as Bool
It works for me in this way:
self.twoFactorAuthEnabledSwitch.enabled = userProfile?.twoFactorEnabled as Bool
In Xcode 8, just set:
user.isLoggedIn = true
It works like a charm.
I haven't touched Swift, but in Objective-C a BOOL is not an object and cannot be an object. It's a primitive, and it looks like you are attempting to tell an Objective-C class to treat a BOOL like an object. But I could be making that up, as I'm not familiar with what @NSManaged does under the hood.
https://developer.apple.com/library/ios/documentation/cocoa/conceptual/ProgrammingWithObjectiveC/FoundationTypesandCollections/FoundationTypesandCollections.html
I found that it works fine if you specify the class name and module in the data model (instead of leaving the default NSManagedObject).
Once I set that, I can use Bool, Int16, Int32, etc., without any problems.
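For illustration, a minimal sketch of such a subclass once the model's class name and module point at it (the entity and attribute names here are hypothetical):
import CoreData

@objc(Task) class Task: NSManagedObject {
    @NSManaged var name: String
    @NSManaged var completed: Bool   // a scalar Bool sets fine once the model references this class
    @NSManaged var priority: Int16
}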
