I am just trying to use the ceil or round functions in Swift but am getting a compile time error:
Ambiguous reference to member 'ceil'.
I have already imported the Foundation and UIKit modules. I have tried to compile it with and without the import statements but no luck. Does anyone have any idea what I am doing wrong?
My code is as follows:
import UIKit
@IBDesignable class LineGraphView: GraphView {
    override func setMaxYAxis() {
        self.maxYAxis = ceil(yAxisValue.maxElement())
    }
}
This problem occurs for something that might seem strange at first, but it's easily resolved.
Put simply, you might think calling ceil() rounds a floating-point number up to its nearest integer, but actually it doesn't return an integer at all: if you give it a Float it returns a Float, and if you give it a Double it returns a Double.
So, this code works because c ends up being a Double:
let a = 0.5
let c = ceil(a)
…whereas this code causes your exact issue because it tries to force a Double into an Int without a typecast:
let a = 0.5
let c: Int = ceil(a)
The solution is to convert the return value of ceil() to be an integer, like this:
let a = 0.5
let c = Int(ceil(a))
The same is true of the round() function, so you'd need the same solution.
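For example, mirroring the ceil() snippet above:

let a = 0.5
let r = Int(round(a)) // 1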
Depending on the scope of where you call ceil, you may need to explicitly call Darwin's ceil function (deep in a closure, etc). Darwin is imported through Foundation, which is imported by UIKit.
let myFloat = 5.9
let myCeil = Darwin.ceil(myFloat) // 6.0
On Linux, Glibc is used in place of Darwin for C-level API access. You would have to explicitly import Glibc and call Glibc.ceil(myFloat) instead of using Darwin.
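If the same code needs to build on both platforms, one common pattern (a sketch, not part of the original answer) is a conditional import; after it, the unqualified ceil call resolves on either platform:

#if os(Linux)
import Glibc
#else
import Darwin
#endif

let myFloat = 5.9
let myCeil = ceil(myFloat) // 6.0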
I have a project with both objective-c and swift. Everything is hooked up properly so I can generally call extensions on classes without issue. In this case, however, I have to pass an argument to the extension and am getting hung up on the syntax.
Here is the Swift 3 Extension
extension Double {
    /// Rounds the double to decimal places value
    func rounded(toPlaces places: Int) -> Double {
        let divisor = pow(10.0, Double(places))
        return (self * divisor).rounded() / divisor
    }
}
From Swift you can call it with
let x = Double(0.123456789).rounded(toPlaces: 4)
The following is not working in Objective-C; I have played around with it but can't get the correct syntax:
double test = 0.123456789;
double roundedtot = test.roundedToPlaces:2;
The specific error is
'Member reference base type 'double' is not a structure or union'
which I gather is a C error, but since you can call the function in Swift it seems there ought to be a way to call it in Objective-C.
You need to add @objc to the extension definition in order to call it from Objective-C.
Also, it's important to note that only extensions of classes (not of structs or enums like Double) are accessible from Objective-C, so this particular extension can't be exposed directly.
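One way around that (a hypothetical sketch; the DoubleRounding class and its method name are made up and not part of the original code) is to expose the same rounding logic through an NSObject subclass that Objective-C can see:

import Foundation

@objc class DoubleRounding: NSObject {
    // Same rounding logic as the Double extension, but reachable from Objective-C.
    @objc static func round(_ value: Double, toPlaces places: Int) -> Double {
        let divisor = pow(10.0, Double(places))
        return (value * divisor).rounded() / divisor
    }
}

From Objective-C this could then be called as [DoubleRounding round:test toPlaces:2].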
Please let me know what I am doing wrong in the statements below; it's not compiling (where I have cast the Int64 to Float).
func connection(connection: NSURLConnection, didReceiveResponse response: NSURLResponse) {
    NSLog("data came")
    var expectedDownloadSize:Float = (Float) response.expectedContentLength
}
It looks like you are getting confused between Obj-C and Swift in a couple of places. As Robert says, you need to place the value you want to convert inside parentheses after the type, e.g. Float(response.expectedContentLength). It is also worth noting that in Swift print(...) is generally preferred over NSLog. Additionally, you no longer need to explicitly declare the variable type, as Swift will infer it automatically.
Casting in Swift is done slightly differently:
var expectedDownloadSize = Float(response.expectedContentLength)
Rather than casting the value, you actually use Float's initialiser, which takes an Int64 parameter.
In Swift you can do it like this:
var expectedDownloadSize: Float = Float(response.expectedContentLength)
I'm trying to convert a lua bridge from Swift 2 to Swift 3. I am not the original author, so there are aspects of the library I don't know very well, and the original author seems not interested in continuing to work on the project. I have most of the conversion done, but there remains one place where I'm stuck and could not figure out. I've tried searching on SO and on the Internet but could not find anything that could help me solve the problem.
If anyone is interested in looking at the full source code, here is my fork of the project on github: https://github.com/weyhan/lua4swift (My changes are in the Swift3 branch)
Allow me to set up the context for the error I'm stuck on. There is a Userdata class; specifically, in the method userdataPointer<T>() -> UnsafeMutablePointer<T> the C function lua_touserdata returns the block address of userdata as a void * pointer type.
Original code written in Swift 2:
public class Userdata: StoredValue {
    public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
        push(vm)
        let ptr = lua_touserdata(vm.vm, -1)
        vm.pop()
        return UnsafeMutablePointer<T>(ptr)
    }

    public func toCustomType<T: CustomTypeInstance>() -> T {
        return userdataPointer().memory
    }

    public func toAny() -> Any {
        return userdataPointer().memory
    }

    override public func kind() -> Kind { return .Userdata }
}
After the conversion with Xcode 8 migration tool, Xcode is complaining about the return line with error Cannot invoke initializer for type 'UnsafeMutablePointer<T>' with an argument list of type '(UnsafeMutableRawPointer?)':
return UnsafeMutablePointer<T>(ptr)
I've fixed it with:
return (ptr?.assumingMemoryBound(to: T.self))!
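For context, the whole method in Swift 3 then reads roughly like this (a sketch based only on the snippet above):

public func userdataPointer<T>() -> UnsafeMutablePointer<T> {
    push(vm)
    // lua_touserdata now returns UnsafeMutableRawPointer?, so rebind the raw
    // memory to the pointee type the caller expects.
    let ptr = lua_touserdata(vm.vm, -1)
    vm.pop()
    return (ptr?.assumingMemoryBound(to: T.self))!
}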
Following the above change, Xcode 8 is now complaining about the calling statement in createCustomType:
public func createCustomType<T: CustomTypeInstance>(setup: (CustomType<T>) -> Void) -> CustomType<T> {
    lua_createtable(vm, 0, 0)
    let lib = CustomType<T>(self)
    pop()

    setup(lib)

    registry[T.luaTypeName()] = lib
    lib.becomeMetatableFor(lib)
    lib["__index"] = lib
    lib["__name"] = T.luaTypeName()

    let gc = lib.gc
    lib["__gc"] = createFunction([CustomType<T>.arg]) { args in
        let ud = args.userdata

        // ******* Here's the line that is causing problem in Swift 3
        (ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
        // *******

        let o: T = ud.toCustomType()
        gc?(o)
        return .Nothing
    }

    if let eq = lib.eq {
        lib["__eq"] = createFunction([CustomType<T>.arg, CustomType<T>.arg]) { args in
            let a: T = args.customType()
            let b: T = args.customType()
            return .Value(eq(a, b))
        }
    }

    return lib
}
Where I'm getting stuck is this line:
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
I believe the original author is attempting to clear the memory block that the pointer returned by the userdataPointer() call points to.
With the Xcode 8 auto migration tool the above line is converted as below:
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
However, Xcode then complains that Cannot convert call result type 'UnsafeMutablePointer<_>' to expected type 'UnsafeMutableRawPointer'.
From my research, the change to the return line in userdataPointer seems correct, so I think the issue is with the cast to UnsafeMutableRawPointer. I've tried dropping the cast to UnsafeMutableRawPointer and invoke ud.userdataPointer().deinitialize() directly but I get this error Generic parameter 'T' could not be inferred.
Another thing I've tried is converting the UnsafeMutablePointer to an UnsafeMutableRawPointer, but it always results in Xcode 8 complaining about one thing or another. Any suggestions on how to get this to work?
As you may have already found out, Swift 3 attempts to provide better type safety when it comes to pointers. UnsafeMutablePointer can now only represent a pointer to an instance of a known type. In Swift 2, a C void * was represented by UnsafeMutablePointer<Void>, allowing void and non-void pointers to be treated in the same way, including trying to call a de-initializer of the pointed-to type, which is what the destroy() method in the problematic line of code does:
(ud.userdataPointer() as UnsafeMutablePointer<Void>).destroy()
In Swift 3 the de-initializer on the pointee is called using the deinitialize() method of the UnsafeMutablePointer structure. It appears that the migration assistant got confused. The line
(ud.userdataPointer() as UnsafeMutableRawPointer).deinitialize()
makes little sense because (1) UnsafeMutablePointer cannot be converted using as to UnsafeMutableRawPointer;
(2) UnsafeMutableRawPointer has no deinitialize() method. In Swift 3, UnsafeMutableRawPointer is a special type to represent void*. It is actually quite understandable why the migration tool made this mistake: it blindly replaced destroy() with deinitialize() and UnsafeMutablePointer<Void> with the corresponding Swift 3 type UnsafeMutableRawPointer, without realizing that the conversion would not work.
I don't quite understand why calling destroy() on a void pointer would work in Swift 2. Maybe this was a bug in the program, or some compiler trick allowed the correct de-initializer to be called. Without knowing enough about the code, I can't be more specific than to suggest analyzing it to figure out what type is pointed to by the pointer on which destroy() was called. For example, if we know for sure that it is always the placeholder type T used on the following line:
let o: T = ud.toCustomType()
then the offending line simply becomes
(ud.userdataPointer() as UnsafeMutablePointer<T>).deinitialize()
We need the conversion in the parentheses to allow the compiler to infer the generic parameter.
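Applied to the __gc closure above, that fix would look something like this (a sketch that assumes the pointee really is always T):

lib["__gc"] = createFunction([CustomType<T>.arg]) { args in
    let ud = args.userdata
    // Rebind to T so the compiler knows which de-initializer to run.
    (ud.userdataPointer() as UnsafeMutablePointer<T>).deinitialize()
    let o: T = ud.toCustomType()
    gc?(o)
    return .Nothing
}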
Thank you for bringing up an interesting problem. BTW, once you get over this obstacle, you are likely to run into other problems. One thing that jumps out is that UnsafeMutablePointer has no .memory in Swift 3; you'll have to use .pointee instead.
Here's an update. After playing with Swift 2.2 on Linux, I realize that calling destroy() on an UnsafeMutablePointer<A> cast as UnsafeMutablePointer<Void> won't call A's deinitializer, even if it has one. So, you have to be careful with that line...
Try creating an instance of UnsafeMutableRawPointer instead of trying to cast it:
UnsafeMutableRawPointer<T>(ud.userdataPointer()).destroy()
I am trying to get the result of a method of an existing Objective C class, called using performSelector
let result = controlDelegate.performSelector("methodThatReturnsBOOL") as? Bool
I need to cast this result to Swift's Bool type.
The code mentioned above, gives a warning
"Cast from 'Unmanaged!' to unrelated type 'Bool' always fails"
and the result is always "false", even when the method returns YES.
Any suggestions for converting the result to Bool ?
Signature for methodThatReturnsBOOL
- (BOOL)methodThatReturnsBOOL
{
    return YES;
}
This question has remained unanswered for a long time, so I'm adding what I have learned along the way.
To convert a BOOL value returned by an Objective-C method, you can simply check whether performSelector returned a value:
if let result = controlDelegate.performSelector("methodThatReturnsBOOL") {
    print("true")
} else {
    print("false")
}
Here you can also assign the value true/false to a Swift Bool, if required.
Note: I tried casting Unmanaged<AnyObject> directly to Bool using takeRetainedValue() as suggested by many answers on SO, but that doesn't seem to work in this scenario.
You can't do what you want nicely in Swift. My issue with the accepted solution is that it takes advantage of the idea that 0x0 just so happens to be nil. This isn't actually guaranteed by Swift and Objective-C specifications. The same applies to boolean values since 0x0 being false and 0x1 being true is just an arbitrary implementation decision. Aside from being technically incorrect, it's also just awful code to understand. Without thinking about what a nil pointer is on most platforms (32/64 bits of zeros), what was suggested makes zero sense.
After talking to an engineer at WWDC '19 for a while, I learned that you can actually use value(forKey:) with the key being the function name/selector description. This works since the Obj-C runtime will actually execute any function with the given name/key in order to evaluate the expression. This is still a bit hacky since it requires knowledge of the Objective-C runtime, however it is guaranteed to be platform and implementation independent because value(forKey:) returns an Any? which can be cast into an Int or a Bool without any trouble at all. By using the built-in casts instead of speculating on what 0x0 or 0x1 means, we avoid the whole issue of interpreting a nil pointer.
Example:
@objc func doThing() -> Bool {
    return true
}
...
let target = someObjectWithDoThing
let selectorCallResult = target.value(forKey: "doThing")
let intResult = selectorCallResult as? Int   // Optional<Int(1)>
let boolResult = selectorCallResult as? Bool // Optional<Bool(true)>
This is the solution in Swift 3, as the methods are a bit different. This approach also checks whether the Objective-C object responds to the selector, to avoid crashing.
import ObjectiveC

let selector = NSSelectorFromString("methodThatReturnsBOOL")
guard controlDelegate.responds(to: selector) else {
    return
}

if let result = controlDelegate.perform(selector) {
    print("true")
} else {
    print("false")
}
Similarly to my answer here, this can be done with @convention(c) instead:
let selector: Selector = NSSelectorFromString("methodThatReturnsBOOL")
let methodIMP: IMP! = controlDelegate.method(for: selector)
let boolResult: Bool = unsafeBitCast(methodIMP, to: (@convention(c) (Any?, Selector) -> Bool).self)(controlDelegate, selector)
This particular syntax is available since Swift 3.1; it's also possible with one extra variable in Swift 3.
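In plain Swift 3, that extra step would presumably be naming the function type and binding the cast result to a variable first, along these lines (a sketch; the typealias name is made up):

typealias BoolReturningMethod = @convention(c) (Any?, Selector) -> Bool

let selector: Selector = NSSelectorFromString("methodThatReturnsBOOL")
let methodIMP: IMP! = controlDelegate.method(for: selector)
let method = unsafeBitCast(methodIMP, to: BoolReturningMethod.self)
let boolResult: Bool = method(controlDelegate, selector)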
A more compact cast to Bool:
let result = controlDelegate.perform(NSSelectorFromString("methodThatReturnsBOOL")) != nil
I am querying HealthKit and saving it to CoreData. I fetch the data in a separate class. In TableViewController I append the data to an array:
if NSUserDefaults.standardUserDefaults().boolForKey("weightSwitch") == true {
    xAxisDatesArray.append(cdFetchWeight.queryCoreDataDate())
    yAxisValuesArray.append(cdFetchWeight.queryCoreDataData())
}
and pass it in tableView.dequeueReusableCellWithIdentifier:
myCell.xAxisDates = xAxisDatesArray[indexPath.row]
myCell.yAxisValues = yAxisValuesArray[indexPath.row]
In the UITableViewCell I initialise the variables (yAxisValues, xAxisDates) and pass them into a charting library, which takes the x and y values and plots a chart.
class TableViewCell: UITableViewCell, TKChartDelegate {
    var xAxisDates = []
    var yAxisValues = []
    plot....
I need to get the min and max values of yAxisValues so that I can set the appropriate y-axis range to the data.
I have tried to get the min and max with the following code:
func rangeMinAndMax() {
    let minYvalue = minElement(yAxisValues)
    let maxYvalue = maxElement(yAxisValues)
}
But this generates the error: Generic parameter 'R.Generator.Element' cannot be bound to non-#objc protocol type 'AnyObject'
Question: Why does this happen, and how can I fix it?
Any help would be much appreciated!
The thing to do in a situation like this is to throw away all the misleading dross and boil it down to the simplest possible case:
let arr : [AnyObject] = [1,2,3]
let min = minElement(arr) // same error: "Generic parameter blah-de-blah..."
So, you see, minElement doesn't work on an array of AnyObject, and that's what the error is telling you. And the reason is obvious: an AnyObject is not a Comparable. There is no "minimum" for a bunch of AnyObject things; the entire concept of one AnyObject being "less than" another AnyObject is undefined. You need to cast your array down to an array of something that minElement can work on, namely a Comparable of some kind.
For example, in that code, I can fix the problem like this:
let arr : [AnyObject] = [1,2,3]
let min = minElement(arr as [Int])
That is the sort of thing you need to be doing. Of course, what you cast down to depends upon what these elements actually are. It looks to me as if they will probably be Double and NSDate respectively, but that's just a guess; I don't know what's in your arrays. You do (presumably). Note that an NSDate is not a Comparable, so you will have a bit more work to do with that one.
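Concretely, for the arrays in the question that might look something like this (a sketch assuming the y values are Doubles, the x values are NSDates, and both arrays are non-empty):

// Y values: cast down to a Comparable element type first.
let values = yAxisValues as [Double]
let minYvalue = minElement(values)
let maxYvalue = maxElement(values)

// NSDate isn't Comparable, so use NSDate's own comparison helpers instead.
let dates = xAxisDates as [NSDate]
let earliest = dates.reduce(dates[0]) { $0.earlierDate($1) }
let latest = dates.reduce(dates[0]) { $0.laterDate($1) }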