I have a float (rate) that can be set by the user in this app. I would like to keep this variable persistent, so I would like to use the @AppStorage property wrapper. The problem I'm having is that @AppStorage("rate") var rate: Float = 0.5 gives "No exact matches in call to initializer". After some brief googling, I learned that you cannot store floats with AppStorage. Is there a way to work around this?
Looking at the definition of AppStorage you can find the following allowed types:
Bool
Int
Double
String
URL
Data
enum with Int raw value
enum with String raw value
Bool?
Int?
Double?
String?
URL?
Data?
optional enum with Int raw value
optional enum with String raw value
So you should use Double rather than Float. (You shouldn't really use Float anyway, since it's 32-bit rather than 64-bit.)
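A minimal sketch of the fix, assuming a SwiftUI view; the property and key name rate come from the question, and the slider is just for illustration:

```swift
import SwiftUI

struct RateView: View {
    // Double is on AppStorage's supported-type list, so this compiles
    // and persists under the UserDefaults key "rate".
    @AppStorage("rate") var rate: Double = 0.5

    var body: some View {
        // The slider binds directly to the persisted Double.
        Slider(value: $rate, in: 0...1)
    }
}

// If some API really does need a Float, convert at the use site:
// let f = Float(rate)
```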
I have the following C# function:
public (double? Average, int? Count) AverageQuotes(Candle.Intervals Interval, DateTime From, DateTime To)
and I get the data in F#:
let struct (average, count) = db.AverageQuotes(previous, time, time + timespan)
The problem is that Average leaves the C# side as a double? and arrives in the F# as Nullable, so there is a double -> float conversion happening somewhere.
How can I keep the result as a double?
In F#, float and double are both aliases for System.Double. The aliases for 32-bit floating-point numbers (System.Single) are float32 and single. See basic types or section 18.1 of the specification (PDF).
I found a bug in my code that is caused by NSDecimalNumber.notANumber.intValue returning 9, while I would expect NaN (as floatValue or doubleValue return). Does anybody know why?
As mentioned by Joakim Danielson and noted in the Apple Developer Documentation:
... Because numeric types have different storage capabilities, attempting to initialize with a value of one type and access the value of another type may produce an erroneous result ...
And since Swift's Int struct cannot represent NaN values, you get this erroneous result.
Instead you could use Int's failable initializer init(exactly:), which converts your NSDecimalNumber to an Int? that will either contain its value or be nil if it is not representable as an Int.
let strangeNumber = NSDecimalNumber.notANumber // nan
let integerRepresentation = Int(exactly: strangeNumber) // nil
I am calling a method of a .mm (Objective-C++) class from my Swift view controller via linked headers. They are successfully linked. However, I am struggling to pass parameters with matching data types.
Here is where I call the function in swift
OpenCVWrapper.thefunc(array1, otherstuff);
...array1 is of type [[Int]]
and here is the definition in objective-c
+(NSString*) thefunc: (int[][2])array1 otherstuff:(int)other {
but I get the error
Cannot convert value of type '[[Int]]' to expected argument type 'UnsafeMutablePointer<(Int32, Int32)>!'
My question is, how can I match the data types so they both handle a basic 2D array of type Int?
UPDATE: value / structure issues — the structure passed from Swift differs from the structure received in Objective-C (screenshots not included).
First of all, you may need to know that C arrays and Swift Arrays are different things. A C array represents a contiguous region of memory and is passed as a pointer to its first element.
Second, if you want to use imported Objective-C method from Swift, you'd better check the generated header of the method.
(Press the "four square icon" and choose "Generated Interface" while editor is showing the .h file.)
Tested with a small sample project, your method is imported as:
open class func thefunc(_ array1: UnsafeMutablePointer<(Int32, Int32)>!, otherstuff other: Int32) -> String!
(The corresponding type to int in Swift is Int32, not Int.)
So, you may need to pass a mutable pointer to tuple (Int32, Int32), to do that you need to declare a Swift Array of Element type (Int32, Int32) and pass it as an inout argument (using &).
So, you may need to write something like this:
//Assuming all inner Array of `array1` have two elements.
var convertedArray = array1.map {(Int32($0[0]), Int32($0[1]))}
MyClass.thefunc(&convertedArray, otherstuff: someInt32Value)
But the conversion of a huge array may take a significant amount of time, which can be critical in some cases.
You may declare your Swift side array1 as Array of (Int32, Int32) and modify other parts according to this change, and use it as:
//Somewhere in your code...
var array1: [(Int32, Int32)] = []
//...
//And call `thefunc` as:
MyClass.thefunc(&array1, otherstuff: someInt32Value)
I found I can compare two numbers in String format directly in Swift.
Initially, I was trying to cast my String-format numbers to Double or Int and then do the comparison. When I accidentally found I could compare them directly, I ran some tests in my playground. The results all seem correct, even when the numbers are empty strings or a Double-style string is compared with an Int-style one.
(Exception): test cases with negative numbers in String format fail the comparison.
Does anyone know if this is a default behavior in Swift String? I am not sure if I can utilize this in my program.
Example:
var numStrs1 = ["15", "12.2", "15", ""]
var numStrs2 = ["13", "12", "", "23.0"]
func compareNumStr(numStr1: String, numStr2: String) -> Bool {
    return numStr1 > numStr2
}
for i in 0..<numStrs1.count {
    _ = compareNumStr(numStr1: numStrs1[i], numStr2: numStrs2[i])
}
The default implementation of string comparison relies on the Unicode collation algorithm (source: http://oleb.net/blog/2014/07/swift-strings/).
The comparison you're making right now is not a numeric Int/Double comparison, but a comparison of the strings' Unicode representations. Depending on how you wish to use the information, it may not be suitable for all scenarios. As you found out with negative numbers, you should convert the strings to numbers (for example with Int(_:) or Double(_:)) and compare those instead of comparing the strings.
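To see why the direct comparison misleads, here is a small sketch contrasting lexicographic and numeric comparison (compareNumeric is a made-up helper name for illustration):

```swift
// Strings compare character by character: "9" > "10" because "9" > "1".
let lexicographic = "9" > "10"            // true, but numerically wrong

// Convert first, then compare; returns nil if either string isn't numeric.
func compareNumeric(_ a: String, _ b: String) -> Bool? {
    guard let x = Double(a), let y = Double(b) else { return nil }
    return x > y
}

let numeric = compareNumeric("9", "10")   // wraps false: 9 < 10
let negative = compareNumeric("-2", "-1") // wraps false: -2 < -1,
                                          // even though "-2" > "-1" as strings
```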
I am receiving a creation date for an object in a database as milliseconds (# of milliseconds since epoch or whatever) and would like to convert it to/from a string in Swift!
I think I'd need a data type of CUnsignedLong?
I am trying something like this but it outputs the wrong number:
var trial: CUnsignedLong = 1397016000000
println(trial) //outputs 1151628800 instead!
I'm guessing this is the wrong data type, so what would you advise in a situation like this?
In Java I was using long which worked.
Thanks!
func currentTimeMillis() -> Int64 {
    let nowDouble = NSDate().timeIntervalSince1970
    return Int64(nowDouble * 1000)
}
Working fine
On 32-bit platforms, CUnsignedLong is a 32-bit integer, which is not large
enough to hold the number 1397016000000. (This is different from Java, where
long is generally a 64-bit integer.)
You can use UInt64 or NSTimeInterval (a type alias for Double), which is what the
NSDate methods use.
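A minimal sketch of the round trip with a 64-bit value, using the modern names Date and TimeInterval (the current Swift spellings of NSDate and NSTimeInterval); the variable names are made up:

```swift
import Foundation

let millis: Int64 = 1397016000000

// Milliseconds since epoch -> Date (TimeInterval is a Double of seconds).
let date = Date(timeIntervalSince1970: TimeInterval(millis) / 1000)

// Date -> String (ISO 8601, UTC by default).
let text = ISO8601DateFormatter().string(from: date)

// And back to milliseconds; exact here because the value fits in a Double.
let roundTrip = Int64(date.timeIntervalSince1970 * 1000)
```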