How do you convert Int to Double in Swift? (iOS)

I've looked at the answers for converting Ints to Floats and other similar questions, but they don't do exactly what I want.
I'm trying to create a basic program that takes a number, performs several different calculations on it, and adds the results of those calculations together at the end.
For one of those calculations I created a segmented control with the 3 different values below:
var myValues: [Double] = [0.00, 1.00, 1.50]
var myValue = [myValuesSegmentedController.selectedSegmentIndex]
Then, when one of those values is picked, it's added to the final value. All the values added together are Doubles with 2 decimal places.
var totalAmount = valueA + valueB + valueC + myValue
The problem I'm having is that Swift won't let me add myValue to that final calculation. It gives me the error:
Swift Compiler Error. Cannot invoke '+' with an argument list of type '($T7, #lvalue [int])'
What do I need to do to change that value to a Double? Or what can I do to get a similar result?

You can convert it with Double() like this:
var totalAmount = valueA + valueB + valueC + Double(myValue)

The problem is that you are trying to add an array instead of an Int, so you don't actually need to convert anything, considering that all of your values are already Doubles and your index has to be an Int anyway. So:
let myValues = [0.00, 1.00, 1.50]
let myValue = [myValuesSegmentedController.selectedSegmentIndex] // your mistake is here: you are creating an array of integers with only one element (your index)
The correct code would be something like this:
let myValues = [0.00, 1.00, 1.50]
let totalAmount = myValues.reduce(0, combine: +) + myValues[myValuesSegmentedController.selectedSegmentIndex] // Swift 2 syntax; in Swift 3+ write reduce(0, +)
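For reference, a minimal sketch of the indexing approach closer to the original question, assuming valueA, valueB and valueC are Doubles and myValuesSegmentedController is the segmented control from the question:
let myValues: [Double] = [0.00, 1.00, 1.50]
let selectedIndex = myValuesSegmentedController.selectedSegmentIndex // an Int
let myValue = myValues[selectedIndex]                                // a Double, no conversion needed
let totalAmount = valueA + valueB + valueC + myValue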

Put this in a playground:
var myValues: [Double] = [0.00, 1.00, 1.50]
let valueA = 1
let valueB = 2
let valueC = 3
var totalAmount = Double(valueA + valueB + valueC) + myValues[2]
println(totalAmount) // output is 7.5 (println is Swift 1.x; use print in Swift 2 and later)
valueA/B/C are all inferred to be Int.
totalAmount is inferred to be a Double.

To convert a Float to an Int in Swift, note that a basic cast like the one below does not work, because these types are not primitives the way floats and ints are in Objective-C:
var float: Float = 2.2
var integer: Int = float as Int // does not compile: 'as' does not convert between numeric types
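The working approach, as a minimal sketch, is to build a new Int from the Float with the Int initializer, which truncates the fractional part:
var float: Float = 2.2
var integer = Int(float) // 2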

Related

Assign values of an Int to two separate variables [duplicate]

Say, for example, I have an Int: var firstInt = 23. I want to assign the value of firstInt to two separate variables, so the output would be var x = 2 and var y = 3. I tried converting firstInt to a string like so, var strFirstInt = String(firstInt), and wanted to assign the first index of the string to one variable and the second index to another, then convert them to Int, but I couldn't pick characters out of the string by index. Any ideas how to do this?
You can use compactMap on the String like this:
let numberInt = 23
let digits = String(numberInt).compactMap{ $0.wholeNumberValue}
Result:
[2, 3]
And with this array, you can assign the first element to one variable and the second to another:
var x = digits[0]
var y = digits[1]
print("The decade is \(x) and units is \(y)")
Response:
The decade is 2 and units is 3
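Note that Character.wholeNumberValue requires Swift 5; on earlier Swift versions, a minimal alternative sketch (same idea, going through String for each character) is:
let digits = String(numberInt).compactMap { Int(String($0)) } // [2, 3]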
Convert firstInt to a String and then to an Array:
var firstInt = 23
let arr = Array(String(firstInt)).map({ String($0 )})
Next, get the elements by index from the array:
var x = Int(arr[0]) // Int? — Optional(2)
var y = Int(arr[1]) // Int? — Optional(3)
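Since Int(_: String) returns an optional, a small sketch of unwrapping both values before use:
if let x = Int(arr[0]), let y = Int(arr[1]) {
    print("The tens digit is \(x) and the units digit is \(y)")
}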

Swift 3: Decimal to Int

I tried to convert a Decimal to an Int with the following code:
Int(pow(Decimal(size), 2) - 1)
But I get:
.swift:254:43: Cannot invoke initializer for type 'Int' with an argument list of type '(Decimal)'
I know pow is returning a Decimal here, but it seems that Int has no initializer or member function to convert a Decimal to an Int.
How can I convert Decimal to Int in Swift 3?
This is my updated answer (thanks to Martin R and the OP for the remarks). The OP's problem was just converting the result of the pow(x: Decimal, y: Int) -> Decimal function to an Int after subtracting 1 from it. I answered the question with the help of this SO post on NSDecimal and Apple's documentation on Decimal. You have to convert your result to an NSDecimalNumber, which can in turn be converted to an Int:
let size = Decimal(2)
let test = pow(size, 2) - 1
let result = NSDecimalNumber(decimal: test)
print(Int(result)) // testing the cast to Int
More compactly:
let decimalToInt = (yourDecimal as NSDecimalNumber).intValue
or, as @MartinR suggested:
let decimalToInt = NSDecimalNumber(decimal: yourDecimal).intValue
If you have a very long decimal, then beware of rounding errors
let decimal = Decimal(floatLiteral: 100.123456)
let intValue = (decimal as NSDecimalNumber).intValue // This is 100
However
let veryLargeDecimal = Decimal(floatLiteral: 100.123456789123)
let intValue = (veryLargeDecimal as NSDecimalNumber).intValue // This is -84 !
I ensured I rounded my Decimal before I converted it to an Int, using NSDecimalRound (which you can put in an extension of Decimal).
var veryLargeDecimal = Decimal(floatLiteral: 100.123456789123)
var rounded = Decimal()
NSDecimalRound(&rounded, &veryLargeDecimal, 0, .down)
let intValue = (rounded as NSDecimalNumber).intValue // This is now 100
There is nothing wrong with either of the posted answers, but I would like to offer up an extension that reduces the verbosity for scenarios where you need to use this frequently.
extension Decimal {
    var int: Int {
        return NSDecimalNumber(decimal: self).intValue
    }
}
To call it:
let powerDecimal: Decimal = pow(2, 2) // output is a Decimal
let powerInt = powerDecimal.int // Output is now an Int
Unfortunately there is an intermittent failure using some of the methods provided.
NSDecimalNumber(decimal: <num>).intValue can produce unexpected results...
(lldb) po NSDecimalNumber(decimal: self)
10.6666666666666666666666666666666666666
(lldb) po NSDecimalNumber(decimal: self).intValue
0
I think there is more discussion of it here, and @Martin pointed it out here.
Instead of using the decimal value directly, I made a workaround that converts the decimal to a whole number before converting the Decimal to an Int.
extension Decimal {
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .down, scale: Int = 0) -> Self {
        var result = Self()
        var number = self
        NSDecimalRound(&result, &number, scale, roundingMode)
        return result
    }
    var whole: Self { rounded(self < 0 ? .up : .down) }
    var fraction: Self { self - whole }
    var int: Int {
        NSDecimalNumber(decimal: whole).intValue
    }
}
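A short usage sketch of this extension; the 32/3 value here is only an assumption to reproduce a repeating decimal like the one in the debugger output above:
let value = Decimal(32) / Decimal(3) // 10.6666...
print(value.whole)    // 10
print(value.fraction) // 0.6666...
print(value.int)      // 10, rather than the surprising 0 that intValue gave on the raw decimal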
Just use the description of the Decimal and String to bridge it, instead of going through NSDecimalNumber:
extension Decimal {
    var intVal: Int? {
        return Int(self.description)
    }
}
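Note that this description-based bridge only succeeds for whole decimals, since Int(_: String) returns nil when the description contains a fractional part. A quick sketch of that behavior, with assumed values:
let whole = Decimal(100)
let fractional = Decimal(string: "100.5")!
print(whole.intVal as Any)      // Optional(100)
print(fractional.intVal as Any) // nil, because "100.5" is not a valid Int string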

Multiply Text Field Error?

I want to multiply a text field by a multiplier, but I keep getting the error below. Can anyone help? Using Swift.
Binary operator '*' cannot be applied to operands of type 'Int?' and 'Double'
var Number1 = Int(weight.text!)
let lidocainemult = (1.5)
var lidoresult = Number1 * lidocainemult
lidocaine.text = NSString(format:"%d",lidoresult)as String;
You're going to have to convert your variables into the same type first. Here Double would make the most sense, since there would be no loss of information (unlike rounding to produce an Int!).
var Number1 = Double(weight.text!) ?? 0 // Double(String) gives an optional, so supply a default
let lidocainemult = 1.5
var lidoresult = Number1 * lidocainemult
lidocaine.text = String(format: "%.1f", lidoresult) // use a floating-point format specifier, not %d
You must convert Number1 to a Double; the operands must be of the same type.
var Number1 = Double(weight.text!) // note: this yields a Double?
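A safer sketch that avoids force-unwrapping the text field and handles a non-numeric entry, using the names from the question:
if let text = weight.text, let weightValue = Double(text) {
    let lidoresult = weightValue * 1.5
    lidocaine.text = String(format: "%.1f", lidoresult)
}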

Cannot divide two NSNumbers in Swift?

I have two NSNumbers that I want to divide, but in the last line of the code I get the error: "Cannot invoke '/' with an argument list of type '(#lvalue NSNumber, #lvalue NSNumber)'"
var firstnumber: NSNumber = object["numberone"] as NSNumber
var secondnumber: NSNumber = object["numbertwo"] as NSNumber
var calculated: NSNumber = firstnumber / secondnumber
Can anyone help me with this??
NSNumber is an object which holds a value that can be retrieved as different types by using its properties such as floatValue, integerValue and so on.
So doing nsnumber1 / nsnumber2 would be like doing uiview1 / uiview2.
I guess your firstnumber and secondnumber are meant to be Float or Double; take Float as an example:
var calculated = firstnumber.floatValue / secondnumber.floatValue as NSNumber
If you need to perform division on NSNumbers, you can use NSDecimalNumber.
Perform the division as follows:
let object = ["numberone" : "33", "numbertwo" : "33"]
var firstnumber: NSDecimalNumber = NSDecimalNumber(string: object["numberone"])
var secondnumber: NSDecimalNumber = NSDecimalNumber(string: object["numbertwo"])
var calculated: NSDecimalNumber = firstnumber.decimalNumberByDividingBy(secondnumber)
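In Swift 3 naming, decimalNumberByDividingBy is spelled dividing(by:); a minimal sketch with assumed values:
let firstnumber = NSDecimalNumber(string: "33")
let secondnumber = NSDecimalNumber(string: "3")
let calculated = firstnumber.dividing(by: secondnumber) // 11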

How to cast numeric types?

I am using an Xcode playground to downcast in Swift. Typecasting would normally allow me to convert a type to a derived type using the as operator. But it gives me an error when I try to typecast var a as Double or String. Thanks in advance!
var a = 1
var b = a as Int
var c = a as Double
var d = a as String
You cannot cast them to each other because they are not related. You can only cast between related types, like UILabel and UIView or [AnyObject] and [String]. Casting an Int to a Double would be like trying to cast a CGPoint to a CGSize.
So to change, for example, an Int into a Double, you have to make a new Double from that Int by writing Double(yourInt).
This applies to all numeric types, like UInt, Int64, Float, CGFloat, etc.
Try this:
var a = 1
var b = Int(a)
var c = Double(a)
var d = String(a)
The cast as Int works because a is already an Int.
You should do it like this:
var c = Double(a)
var d = toString(a) // or String(a); toString is the Swift 1.x free function
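A small sketch of the initializer-based conversions mentioned above across several numeric types; CGFloat needs CoreGraphics, so this assumes an Apple platform:
import CoreGraphics

let value = 1
let asDouble = Double(value)   // 1.0
let asFloat = Float(value)     // 1.0
let asInt64 = Int64(value)     // 1
let asUInt = UInt(value)       // 1 — would trap at runtime for a negative value
let asCGFloat = CGFloat(value) // 1.0
let asString = String(value)   // "1"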
