Rounding up/down FLOATS to INTS in alphanumeric NSString - ios

I'm stuck on this one.
I have an alphanumeric NSString. The numbers in the string have multiple decimal places, and I would like to round those numbers to integers.
The string looks like this:
VALUES: A:123.45678 B:34.55789 C:2.94567
and I'm trying to use this code:
[self log:[NSString stringWithFormat:@"VALUES: %.f\n", values.toString]];
to convert to this:
VALUES: A:123 B:35 C:3
Xcode offers this warning:
Format specifies type 'double' but the argument has type 'NSString *'
Replace '%.1f' with '%@'
I think I need a different way of "scanning" the string: identifying the numbers, rounding them up or down as needed, and then turning that into a new string. I'm just failing miserably with every attempt.
Any help will be appreciated.

I like NSScanner for this sort of thing. Here's a Swift solution; sorry, I'm too lazy to translate it into Objective-C, which is a little more indirect:
let s = "VALUES: A:123.45678 B:34.55789 C:2.94567"
let sc = Scanner(string:s)
sc.charactersToBeSkipped = nil
var arr = [String]()
while (true) {
if let prefix = sc.scanUpToCharacters(from: .decimalDigits) {
arr.append(prefix)
} else { break }
if let num = sc.scanDouble() {
let rounded = num.rounded()
arr.append(String(Int(rounded)))
} else { break }
}
let result = arr.joined()
print(result) // "VALUES: A:123 B:35 C:3"
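Note that the value-returning Scanner methods used above (scanUpToCharacters(from:) and scanDouble() returning optionals) require iOS 13 / macOS 10.15. On older SDKs the same loop can be written with the older by-reference API; a sketch of that variant:
let sc = Scanner(string: s)
sc.charactersToBeSkipped = nil
var arr = [String]()
var prefix: NSString?
var num = 0.0
while true {
    // scan the non-numeric prefix, then the number, as above
    if sc.scanUpToCharacters(from: .decimalDigits, into: &prefix), let p = prefix {
        arr.append(p as String)
    } else { break }
    if sc.scanDouble(&num) {
        arr.append(String(Int(num.rounded())))
    } else { break }
}
print(arr.joined()) // "VALUES: A:123 B:35 C:3"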

Related

How to get substring from user input?

I wrote code to get the characters the user enters in a text field and do math with them, like this:
@IBOutlet weak internal var textMeli: UITextField!
var myChar = textMeli.text
var numb = [myChar[0]*3 , myChar[1]*7]
but this is wrong.
textMeli.text is a String.
myChar is a String.
You can't access a Character from a String using bracket notation.
Take a look at the documentation for the String structure.
You'll see that you can access the string's characters through the characters property, which returns a collection of Characters. Initialize a new array with the collection and you can then use bracket notation.
let string = "Foo"
let character = Array(string.characters)[0]
character will be of type Character.
You'll then need to convert the Character to some sort of number type (Float, Int, Double, etc.) to use multiplication.
Type is important in programming. Make sure you are keeping track so you know what function and properties you can use.
Off the soap box. It looks like you're trying to take a string and convert it into a number. I would skip the step of using characters. Have two text fields: one to accept the first number (as a String) and the other to accept the second number (as a String). Use a number formatter to convert each string to a number. A number formatter returns an NSNumber. Check out the documentation and you'll see that you can "convert" the NSNumber to any number type you want. Then you can use multiplication.
Something like this:
let firstNumberTextField: UITextField!
let secondNumberTextField: UITextField!
let numberFormatter = NumberFormatter()
let firstNumber = numberFormatter.number(from: firstNumberTextField.text!)
let secondNumber = numberFormatter.number(from: secondNumberTextField.text!)
let firstInt = firstNumber?.intValue ?? 0 // or whatever type of number you need
let secondInt = secondNumber?.intValue ?? 0
let product = firstInt * secondInt
Dealing with Swift strings is kind of tricky because of the way they deal with Unicode and "grapheme clusters". You can't index into String objects using array syntax like that.
Swift also doesn't treat characters as interchangeable with 8-bit ints like C does, so you can't do math on characters the way you're trying to. You have to take the String and convert it to an Int type.
You could create an extension to the String class that WOULD let you use integer subscripts of strings:
extension String {
    subscript(index: Int) -> String {
        let first = self.startIndex
        let startIndex = self.index(first, offsetBy: index)
        let nextIndex = self.index(first, offsetBy: index + 1)
        return String(self[startIndex ..< nextIndex])
    }
}
And then:
let inputString = textMeli.text ?? ""
let firstVal = Int(inputString[0]) ?? 0
let secondVal = Int(inputString[2]) ?? 0
and
let result = firstVal * 3 + secondVal * 7
Note that the subscript extension above is inefficient and would be a bad way to do any sort of "heavy lifting" string parsing. Each use of square-bracket indexing has O(n) performance, so traversing an entire string this way gives roughly O(n^2) performance, which is very bad.
The code above also lacks range checking or error handling. It will crash if you pass it a subscript out of range.
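If you want the convenience without the crash, one option is a range-checked variant that returns an optional; a minimal sketch (the safe: label is my own addition, not part of the answer above):
extension String {
    // Returns nil instead of crashing when the index is out of range.
    subscript(safe index: Int) -> String? {
        guard index >= 0, index < self.characters.count else { return nil }
        let start = self.index(self.startIndex, offsetBy: index)
        return String(self[start ..< self.index(after: start)])
    }
}

// "Foo"[safe: 0] == "F"; "Foo"[safe: 5] == nil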
Note that it's very strange to take multiple characters as input and then do math on the individual characters as if they were separate values. This seems like a really bad user interface.
Why don't you step back from the details and tell us what you are trying to do at a higher level?

Hangman Program 2

I have asked a question about this program before, but it seems that not all problems are resolved. I am currently getting an error on the "guard let indexInWord" line that states: "Cannot convert value of type 'String' to expected argument type '_Element' (aka 'Character')":
guard let letterIndex = letters.indexOf(sender) else {
    return
}
let letter = letterArray[letterIndex]
guard let indexInWord = word.characters.indexOf(letter) else {
    print("no such letter in this word")
    return
}
// since we have spaces between dashes, we need to calculate the index this way
let indexInDashedString = indexInWord * 2
var dashString = wordLabel.text
dashString[indexInDashedString] = letter
wordLabel.text = dashString
I tried converting the String 'letter' to Character but it only resulted in more errors. I was wondering how I can possibly convert String to argument type "_Element." Please help.
It is hard to treat a string like a list in Swift, mostly because String.characters is not a typical array. Running a for loop over it works, but looking up a specific character at a given index is a bit more difficult. What I like doing is adding this function to String in an extension:
extension String {
    func getChars() -> [String] {
        var chars: [String] = []
        for char in characters {
            chars.append(String(char))
        }
        return chars
    }
}
I would use this to build an array of single-character strings when you receive input, then index into that array instead of String.characters, as in the sketch below.
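For example, the failing dashString[indexInDashedString] = letter line from the question could work via this extension by editing the array and re-joining it; a minimal sketch using the question's own names (and assuming indexInDashedString is an Int):
var dashChars = (wordLabel.text ?? "").getChars() // e.g. ["-", " ", "-", ...]
dashChars[indexInDashedString] = letter           // arrays do support subscript assignment
wordLabel.text = dashChars.joined()               // reassemble the display string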

`CountedSet` initialization issue

I'm comparing the characters contained within two words. Set (aka NSSet) seemed like the way to go to accomplish this task, but I've discovered it returns false positives on matches, so I am attempting to use CountedSet (aka NSCountedSet) instead.
I'm able to initialize a Set without issue, but I can't get the CountedSet initializer to work. Here's what I've done...
I start with a String:
// Let's say myTextField.text = "test"
let textFieldCharacters = myTextField.text?.characters
// word is a string from the ENABLE list of words
let wordCharacters = word.characters
Then I dump the characters into an Array:
var wordCharactersArray = [Character]()
for character in wordCharacters {
    wordCharactersArray.append(character)
}
var textFieldCharactersArray = [Character]()
for character in textFieldCharacters! {
    textFieldCharactersArray.append(character)
}
Then I create a Set from the character arrays:
let textFieldSet = Set<Character>(textFieldCharactersArray)
let wordSet = Set<Character>(wordCharactersArray)
Finally, I test to see if the textFieldSet is a superSet of wordSet with the following:
textFieldSet.isSuperset(of: wordSet)
Going back to my example, if myTextField.text is "test", the check passes for values of word whose characters are a superset of wordSet, even though the counts of the individual elements don't match the character counts of myTextField.text.
In researching my issue, I've found CountedSet (formerly NSCountedSet), which I think would resolve it. It has two initializers:
public convenience init(array: [AnyObject])
public convenience init(set: Set<NSObject>)
I've tried initializing the 2 sets of characters like so:
let textFieldSet = CountedSet(array: textFieldCharacterArray)
let wordSet = CountedSet(array: wordCharacterArray)
I get the following error for the sets
Cannot convert value of type '[Character]' to expected argument type
'[AnyObject]'.
So I tried initializing the set like this:
let textFieldSet = CountedSet(array: textFieldCharacterArray as! [AnyObject])
Which yields the following error:
'AnyObject' is not a subtype of 'Character'
I've also tried to initialize the CountedSet with a Set, per the method signature, but I get errors when I try to do that, too.
Any suggestions how to initialize a CountedSet would be greatly appreciated.
You are correct that if you need to compare not just the presence of elements but also their counts, you should use CountedSet, which is the Swift 3.0 renaming of NSCountedSet. The problem you are running into is that CountedSet can only accept elements that are objects, and Characters are not. As Eric D points out in their comment, the easiest way to get around this is by mapping your [Character] to [String], which will bridge to [NSString].
You are not running into this problem with Set, because it is a native Swift collection type that can be initialized with elements of any (hashable) type. This is why you can initialize a Set with [Character].
To see the difference:
let word = "helo"
let wordCharacters = Array(word.characters)
let wordSet = Set(wordCharacters)
let wordCharStrings = wordCharacters.map{String($0)}
let wordCountedSet = CountedSet(array: wordCharStrings)
let textField = "hello"
let textFieldCharacters = Array(textField.characters)
let textSet = Set(textFieldCharacters)
let textFieldCharStrings = textFieldCharacters.map{String($0)}
let textFieldCountedSet = CountedSet(array: textFieldCharStrings)
textFieldCountedSet.isSubset(of: wordCountedSet as! Set<NSObject>) // returns false, but if word had two or more l's it would return true
textSet.isSubset(of: wordSet) // returns true
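If you'd rather avoid the Foundation bridging (and the as! Set<NSObject> cast) entirely, a plain dictionary of character counts gives the same counted-subset semantics; a minimal pure-Swift sketch, not from the answer above:
// Count how many times each character occurs in a string.
func characterCounts(_ s: String) -> [Character: Int] {
    var counts: [Character: Int] = [:]
    for ch in s.characters { // just `for ch in s` in Swift 4+
        counts[ch] = (counts[ch] ?? 0) + 1
    }
    return counts
}

// True if `sub` uses no character more times than `sup` contains it.
func isCountedSubset(_ sub: String, of sup: String) -> Bool {
    let supCounts = characterCounts(sup)
    for (ch, n) in characterCounts(sub) where (supCounts[ch] ?? 0) < n {
        return false
    }
    return true
}

isCountedSubset("hello", of: "helo") // false: "hello" needs two l's
isCountedSubset("helo", of: "hello") // true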

How to use optional binding in Swift 2

I'm new to learning Swift so I decided I might as well learn Swift 2 instead. Everything has made sense to me so far except for the following code snippet. Hopefully someone can shed some light on this for me.
//: Playground - noun: a place where people can play
import UIKit
//Works
let possibleNumber = "2"
if let actualNumber = Int(possibleNumber) {
    print("\'\(possibleNumber)\' has an integer value of \(actualNumber)")
} else {
    print("could not be converted to integer")
}
//Doesn't Work and I'm not sure why
let testTextField = UITextField()
testTextField.text = "2"
let numberString = testTextField.text // I know this is redundant
if let num = Int(numberString) {
    print("The number is: \(num)")
} else {
    print("Could not be converted to integer")
}
The top section of the code is straight from Apple's Swift 2 ebook and it makes sense to me how it uses optional binding to convert the string to an int. The second piece of code is basically the same except that the string comes from the text property of a UITextField. The bottom part of the code gives the following error:
Playground execution failed: /var/folders/nl/5dr8btl543j51jkqypj4252mpcnq11/T/./lldb/843/playground21.swift:18:18: error: value of optional type 'String?' not unwrapped; did you mean to use '!' or '?'?
if let num = Int(numberString) {
I fixed the problem by using this line:
if let num = Int(numberString!) {
I just want to know why the second example needs the ! and the first doesn't. I'm sure the problem has to do with the fact that I'm getting the string from a textfield. Thanks!
The difference is that in the first case possibleNumber is not an optional variable. It is definitely a string. It cannot be nil.
In the second case textField.text returns an optional string and so numberString is an optional variable. It could be nil.
Now, the conversion Int(...) returns an optional Int: if the string is "abc", it cannot return a number, so it returns nil. This is what you are unwrapping with the if let... statement.
However, in the second case your string is also optional, and Int() will not accept an optional, so you are force-unwrapping it. This is dangerous, as it will crash the app if the string is nil.
What you could do instead is this...
if let numberString = textField.text,
    number = Int(numberString) {
    // use the number
}
This will unwrap the text first and, if it's available, use it to get the number. If that is not nil, then you enter the block.
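(In Swift 3 and later, each binding in the condition list needs its own let; a minimal sketch of the same pattern:)
if let numberString = textField.text,
   let number = Int(numberString) {
    print("The number is: \(number)")
}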
Since you are using Swift 2, you could also use a guard let statement here. You can do it this way:
func getNumber() -> Int {
    guard let numberString = textField.text,
        number = Int(numberString) else {
            return 0
    }
    return number
}

How do I convert an NSString to an integer using Swift?

I need to convert an NSString to an integer in Swift. Here's the current code I'm using; it doesn't work:
var variable = (NSString(data:data, encoding:NSUTF8StringEncoding))
exampleStruct.otherVariable = (variable).intValue
variable is a normal variable, and exampleStruct is a struct elsewhere in the code with a property otherVariable.
I expect it to set exampleStruct.otherVariable to the int value of the NSString, but I get the following error:
"Cannot convert the expression's type () to type Float"
How do I convert an NSString to int in Swift?
It looks to me like the problem might not be the Int conversion, but rather that the exampleStruct is expecting a Float.
If that's not the issue, however (and granted, Xcode errors for Swift often seem to be about the line number more than about the actual problem), then something like this should work for you:
var ns:NSString = "1234"
if let i = (ns as String).toInt() {
exampleStruct.otherVariable = i
}
I know you already got your answer, but I just want to explain what (I think) might not be trivial.
First, we have some NSData we want to convert to NSString. Because no one guarantees the data is a valid UTF-8 buffer, the initializer returns an optional:
var variable = NSString(data:data, encoding:NSUTF8StringEncoding)
Which means variable is an NSString? (an optional NSString).
Usually NSString is bridged to Swift's String, but in this case we use an NSString constructor; think of it as "Foundation"-style syntax that wasn't directly imported into Swift (as there's no bridge for NSData).
We can still use the 'Foundation' way with NSString:
if let unwrappedVariable = variable {
    var number = unwrappedVariable.intValue
}
If number is a Float, but the string is a string representation of an integer:
if let unwrappedVariable = variable {
    var number: Float = Float(unwrappedVariable.intValue)
}
If both number and the string (representation) are floats:
if let unwrappedVariable = variable {
    var number: Float = unwrappedVariable.floatValue
}
Anyway, there's a small problem with using Foundation for these types of conversions: it has no concept of an optional value (for int or float). It will return 0 if it cannot parse the string as an integer or float. That's why it's better to use Swift's native String:
if let variable: String = NSString(data: data, encoding: NSUTF8StringEncoding) {
    if let integer = variable.toInt() {
        var integerNumber = integer
        var floatNumber = Float(integer)
    }
}
Edit/update:
There's no need to use NSString when coding in Swift. You can use the native String(data:encoding:) initializer and then convert the string to Int:
if let variable = String(data: data, encoding: .utf8),
    let integer = Int(variable) {
    exampleStruct.otherVariable = integer
}
If otherVariable is a Float type:
if let variable = String(data: data, encoding: .utf8),
    let number = Float(variable) {
    exampleStruct.otherVariable = number
}
