I have two NSNumbers that I want to divide, but on the last line of the code below I get the error: "Cannot invoke '/' with an argument list of type '(#lvalue NSNumber, #lvalue NSNumber)'"
var firstnumber: NSNumber = object["numberone"] as NSNumber
var secondnumber: NSNumber = object["numbertwo"] as NSNumber
var calculated: NSNumber = firstnumber / secondnumber
Can anyone help me with this??
NSNumber is an object which holds a value that can be retrieved as different types by using its properties such as floatValue, integerValue, and so on.
So writing nsnumber1 / nsnumber2 makes as little sense as writing uiview1 / uiview2: there is no '/' operator defined for two objects.
I guess your firstnumber and secondnumber hold Float or Double values; taking Float as an example:
var calculated = firstnumber.floatValue / secondnumber.floatValue as NSNumber
If you need to perform division on NSNumber values, you can use NSDecimalNumber.
You would then perform the division as follows:
let object = ["numberone" : "33", "numbertwo" : "33"]
// NSDecimalNumber(string:) parses the dictionary's string values
var firstnumber: NSDecimalNumber = NSDecimalNumber(string: object["numberone"])
var secondnumber: NSDecimalNumber = NSDecimalNumber(string: object["numbertwo"])
// Swift 2 spelling; see the Swift 3+ version below
var calculated: NSDecimalNumber = firstnumber.decimalNumberByDividingBy(secondnumber)
I'm currently making an iOS app using Stripe. When I tried to implement a Stripe card object (see the code below), I got a compiler error on the lines
var expMonth: NSNumber = Int(expArr[0])!
var expYear: NSNumber = Int(expArr[1])!
saying
"Type of optional type [String] not wrapped. not unwrapped; did you mean to use ! or ??"
func buttonPressed(_: UIButton) {
    let creditCard = STPCardParams()
    creditCard.number = cardNumberTextField.text
    creditCard.cvc = cvvTextField.text
    if (expDateTextField.text?.isEmpty == nil) {
        let expArr = expDateTextField.text?.components(separatedBy: "/")
        if (expArr?.count)! > 1 {
            var expMonth: NSNumber = Int(expArr[0])!   // error here
            var expYear: NSNumber = Int(expArr[1])!    // error here
            creditCard.expMonth = expMonth.uintValue
            creditCard.expYear = expYear.uintValue
        }
    }
}
How can I fix this error? Your help would be appreciated!
Your expArr is an optional, I suspect; try unwrapping it first:
var expMonth: NSNumber = NSNumber(value: Int(expArr![0])!)
var expYear: NSNumber = NSNumber(value: Int(expArr![1])!)
Also, I'm not even sure you can directly assign an Int to an NSNumber, as NSNumber is a pointer type, but that might have changed in Swift 3.
Converting from Int to NSNumber:
NSNumber(value: intValue)
I have a custom data object with a NSNumber property. How would I sort the messages from one array into another array using the NSNumber property?
I tried this but I get an error
Binary operator '<' cannot be applied to two 'NSNumber?' operands
emptyArray = messagesArray.sort{$0.time < $1.time}
Ascending or descending doesn't matter
class Message {
    var toMessage: String?
    var time: NSNumber?
}
var messagesArray = [Message]()
let message1 = Message()
message1.toMessage = "Bonjour"
message1.time = 123

let message2 = Message()
message2.toMessage = "Hola"
message2.time = 456

let message3 = Message()
message3.toMessage = "Hello"
message3.time = 789

messagesArray.append(message1)
messagesArray.append(message2)
messagesArray.append(message3)
//I need to get the elements from messagesArray into emptyArray sorting by NSNumber time property
var emptyArray = [Message]()
I needed to use intValue; since time is optional, it also has to be unwrapped:
emptyArray = messagesArray.sort { $0.time!.intValue < $1.time!.intValue }
I used Core Data in my iOS Swift project and declared an attribute as Int32; in the generated class file it became an NSNumber. When I tried to increment the variable through an object of that class, I got the error that binary operator '+=' cannot be applied to NSNumber operands. Is it possible to increment the NSNumber, or should I choose Int16 or Int64 to access the variable?
Here are three different answers, from succinct to verbose:
Given that NSNumber is immutable, simply assign the variable a new value equal to what you want:
var num : NSNumber = NSNumber(integer: 1) // NSNumber of 1
num = num.integerValue + 1 // NSNumber of 2
Or you can assign it another way:
var num : NSNumber = NSNumber(integer: 1) // NSNumber of 1
num = NSNumber(integer: num.integerValue + 1) // NSNumber of 2
Or you can convert the NSNumber to an Int, increment the int, and reassign the NSNumber:
var num : NSNumber = NSNumber(integer: 1) // NSNumber of 1
var int : Int = Int(num)
int += 1
num = NSNumber(integer: int) // NSNumber of 2
var number = NSNumber(integer: 10)
number = number.integerValue + 1
Use var, because let declares a constant:
var mybalance = bankbalance as NSNumber
But NSNumber is an object, and mybalance.integerValue cannot be assigned to:
if let bankbalance: AnyObject = keystore.objectForKey("coinbalance") {
    let mybalance = bankbalance as! NSNumber
    var b = mybalance.integerValue + 50
}
It is impossible to increment an NSNumber once the object is created. There is no API that allows that.
You have to recreate the NSNumber object with a new (incremented) value:
let number = NSNumber(int: 15)
let incrementedNumber = NSNumber(int: number.intValue + 1)
I am using an Xcode playground to downcast in Swift. Typecasting would normally allow me to convert a type to a derived type using the as operator, but I get an error when I try to cast var a to Double or String. Thanks in advance!
var a = 1
var b = a as Int     // works
var c = a as Double  // error
var d = a as String  // error
You cannot cast these to each other because the types are unrelated. You can only cast between related types, like UILabel and UIView or [AnyObject] and [String]. Casting an Int to a Double would be like trying to cast a CGPoint to a CGSize.
So to change, for example, an Int into a Double, you have to create a new Double from that Int by calling Double's initializer: Double(someInt).
This applies to all the numeric types: UInt, Int64, Float, CGFloat, etc.
Try this:
var a = 1
var b = Int(a)
var c = Double(a)
var d = String(a)
The cast as Int works because a already is an Int.
You should do it like this:
var c = Double(a)
var d = toString(a) //or String(a)
I've looked at the answers for converting ints to floats and other similar answers, but they don't do exactly what I want.
I'm trying to create a basic program that takes a number, runs several different calculations on it, and adds the results of those calculations together at the end.
For one of those calculations, I created a segmented controller with the three values below:
var myValues: [Double] = [0.00, 1.00, 1.50]
var myValue = [myValuesSegmentedController.selectedSegmentIndex]
Then, when one of those values is picked, it's added to the final value. All the values added together are Doubles with 2 decimal places.
var totalAmount = valueA + valueB + valueC + myValue
The problem I'm having is that Swift won't let me add myValue in that final calculation. It gives me the error:
Swift Compiler Error: Cannot invoke '+' with an argument list of type '($T7, #lvalue [Int])'
What do I need to do to change that value to a Double? Or what can I do to get a similar result?
You can convert it with Double(), like this:
var totalAmount = valueA + valueB + valueC + Double(myValue)
The problem is that you are trying to add an array instead of an Int, so you don't even need to convert anything, considering that all of your values are already Doubles and your index actually has to be an Int. So:
let myValues = [0.00, 1.00, 1.50]
let myValue = [myValuesSegmentedController.selectedSegmentIndex] // the mistake is here: this creates an array of integers with a single element (your index)
The correct version would be something like this:
let myValues = [0.00, 1.00, 1.50]
let totalAmount = myValues.reduce(0, combine: +) + myValues[myValuesSegmentedController.selectedSegmentIndex]
Put this in a playground:
var myValues: [Double] = [0.00, 1.00, 1.50]
let valueA = 1
let valueB = 2
let valueC = 3
var totalAmount = Double(valueA + valueB + valueC) + myValues[2]
println(totalAmount) //output is 7.5
valueA, valueB, and valueC are all inferred to be Int; totalAmount is inferred to be a Double.
I want to convert a Float to an Int in Swift. Basic casting like this does not work, because these types are not primitives, unlike floats and ints in Objective-C:
var float: Float = 2.2
var integer: Int = float as Float // error