I would like to create a function that looks at a string and, if it's a decimal string, returns it as a currency-formatted string. The function below does that; however, if I pass in a string that is already formatted, it will of course fail (it expects a string like '25' or '25.55', but not '$15.25').
Is there a way to modify my function below to add another if condition that says "if you've already been formatted as a currency string, or your string is not in the right format, return X"? (Maybe X will be 0, or maybe it will be self, the same string; I'm not sure yet.)
func toCurrencyStringFromDecimalString() -> String
{
    let numberFormatter = NSNumberFormatter()
    numberFormatter.numberStyle = NSNumberFormatterStyle.CurrencyStyle

    if (self.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceCharacterSet()).utf16Count == 0)
    {
        // If whitespace is passed in, just return 0.0 as default
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: "0.0"))!
    }
    else if (IS_NOT_A_DECIMAL_OR_ALREADY_A_CURRENCY_STRING)
    {
        // So obviously the check would go here to see if it's not a decimal
        // (or already contains a currency placeholder etc.)
    }
    else
    {
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: self))!
    }
}
Thank you for your help!
Sounds like you need to use NSScanner.
According to the docs, the scanDecimal function of NSScanner:
Skips past excess digits in the case of overflow, so the receiver’s
position is past the entire integer representation.
Invoke this method with NULL as value to simply scan past a decimal integer representation.
I've been mostly programming in Obj-C, so my Swift is rubbish, but here's my attempt at translating the appropriate code for detecting numeric strings (as also demonstrated in this answer):
let scanner = NSScanner(string: self)
let isNumeric = scanner.scanDecimal(nil) && scanner.atEnd
If the string is not a decimal representation, isNumeric should be false.
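Putting the pieces together with your original function, a rough sketch (untested, using the same Swift 1.x APIs as your code) might look like this:

extension String
{
    func toCurrencyStringFromDecimalString() -> String
    {
        let numberFormatter = NSNumberFormatter()
        numberFormatter.numberStyle = NSNumberFormatterStyle.CurrencyStyle

        let scanner = NSScanner(string: self)
        if scanner.scanDecimal(nil) && scanner.atEnd
        {
            // The whole string is a plain decimal, so format it.
            return numberFormatter.stringFromNumber(NSDecimalNumber(string: self))!
        }

        // Not a plain decimal: empty, already currency-formatted, or not a number.
        // Returning a formatted zero here; returning self is the other option
        // mentioned in the question.
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: "0.0"))!
    }
}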
I'm trying to sort an array by comparing a string property of two items; the values of the property are numbers, but of type String. How can I convert them to Int and check which is greater? My current code looks like this:
libraryAlbumTracks = tracks.sorted {
    $0.position!.compare($1.position!) == .orderedAscending
}
but values like "13" come before "2" because they are compared as strings. I tried to convert the values to Int, but because they are optional, I get an error that the operator ">" cannot be applied to operands of type Int?.
How can I get around this in the sorted function?
Provide the numeric option when using compare. This will properly sort strings containing numbers, and it also works if some of the strings don't actually contain numbers, or contain a combination of numbers and non-numbers.
libraryAlbumTracks = tracks.sorted {
    $0.position!.compare($1.position!, options: [ .numeric ]) == .orderedAscending
}
This avoids the need to convert the strings to Int.
Note: You should also avoid force-unwrapping position. Either don't make the values optional if it's always safe to force-unwrap them, or safely unwrap them, or use ?? to provide an appropriate default when comparing them.
libraryAlbumTracks = tracks.sorted {
    guard let leftPosition = $0.position,
          let leftInt = Int(leftPosition),
          let rightPosition = $1.position,
          let rightInt = Int(rightPosition) else {
        return false
    }

    return leftInt < rightInt
}
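Alternatively, using ?? as mentioned above, a more compact sketch (this assumes treating a missing or non-numeric position as 0 is acceptable for your data):

libraryAlbumTracks = tracks.sorted {
    (Int($0.position ?? "") ?? 0) < (Int($1.position ?? "") ?? 0)
}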
Consider the following:
extension String {
    func isValidEmail() -> Bool {
        let characterset = CharacterSet(charactersIn: "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789")
        print(characterset)
        if self.rangeOfCharacter(from: characterset.inverted) != nil {
            return false
        } else {
            return true
        }
    }
}
var name = "Login"
name.isValidEmail() // print true
var incorretLogin = "Loginъ"
incorretLogin.isValidEmail() // print false
Yes, the function works. But I'm confused: how does it work?
If I understand correctly, it works like this:
it takes a set of characters, then checks whether every character of the tested string comes from that set, and if not, it returns false.
OK, but what is inverted for? If I remove inverted, the result is wrong:
var name = "Login"
name.isValidEmail() // false
var incorretLogin = "Logъin"
incorretLogin.isValidEmail() // false
Now I understand nothing. If the function simply checks whether the string's letters come from the character set, why does it matter whether the set is inverted or not?
Could someone explain?
I played around a bit in a playground:
let characterset = CharacterSet(charactersIn: "a")
print(characterset)
print(characterset.inverted)
Both print the same result:
<CFCharacterSet Items(U+0061)>
<CFCharacterSet Items(U+0061)>
inverted "returns an inverted copy of the receiver." (see https://developer.apple.com/documentation/foundation/characterset).
In your case inverted means all the characters except the ones you provide in the initializer (all characters except letters and digits). So the method returns false if the email string contains any character that is not a letter or a digit.
According to the documentation:
rangeOfCharacter(from:)
Finds and returns the range in the receiver of the first character from a given character set.
The receiver is the string being checked. When no character from the set is found in the string, nil is returned.
When the set is inverted, it contains all invalid characters. Hence, rangeOfCharacter(from:) returns the location of the first invalid character. That is why your first approach works.
When you remove inverted, the call returns the location of the first valid character. Since "Logъin" has both valid and invalid characters, both calls return false. If you call your second function on a string consisting entirely of invalid characters, e.g. "Логин", you would get true.
Note that you can simplify the implementation by removing the if:
let characterset = CharacterSet(charactersIn: "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789")
return self.rangeOfCharacter(from: characterset.inverted) == nil
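Putting it together, the whole extension reduces to this (same behavior as your version):

extension String {
    // True only when the string contains nothing but ASCII letters and digits.
    func isValidEmail() -> Bool {
        let characterset = CharacterSet(charactersIn: "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789")
        return rangeOfCharacter(from: characterset.inverted) == nil
    }
}

"Login".isValidEmail()   // true
"Loginъ".isValidEmail()  // false
"Логин".isValidEmail()   // false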
I'm trying to pass an Int like 0902 to another class, and the result loses the first zero. This only happens when there is a leading zero.
I also have a function which tries to add it back in, but that doesn't work either. Is there a reason Swift doesn't keep a leading zero on an Int?
If I send a value of 0902 to another class, what gets shown afterwards is 902. Totally confused.
func convertToTime(_ value: Int) -> String {
    print(value) // 902
    var text = String(format: "%02d", value)
    text.insert(":", at: text.index(text.startIndex, offsetBy: +2))
    return text
}
The problem here is how you define the type of that value "0902".
Is it really a number?
Or is it actually a String that only contains numeric values?
Like a telephone number for instance. Is that a number? Or a string?
Would you want to add them together? Or multiply them? If not, make it a String.
Once you make it a string, the leading 0 is no longer an issue as it is just part of the string. As soon as you make it into a number then it makes no sense for the leading zero to be there as 0902 == 902.
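You can see this in a playground: Swift accepts leading zeros in an integer literal, but they carry no information and don't survive.

let a = 0902
print(a) // 902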
Additional
Having had another look at this... Is it even a string? You are dealing with time here? So surely you should be using an NSDate object?
The problem is still the same though. Make sure you define your types correctly.
Determine what this "0902" actually is. Then make it the correct type for that.
Whether that be a number, a string, or a date.
The correct type will then ensure that you get the correct formats, and functions and properties of it.
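For example, a minimal sketch of the date-based approach (assuming the input is always a four-digit "HHmm" string) might be:

import Foundation

let parser = DateFormatter()
parser.locale = Locale(identifier: "en_US_POSIX") // fixed-format parsing
parser.dateFormat = "HHmm"

let printer = DateFormatter()
printer.dateFormat = "HH:mm"

if let date = parser.date(from: "0902") {
    print(printer.string(from: date)) // "09:02"
}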
Change your function argument to String:
func convertToTime(_ value: String) -> String {
    print(value) // prints "0902"
    var text = value
    text.insert(":", at: text.index(text.startIndex, offsetBy: +2))
    return text
}
convertToTime("0902") // prints "09:02"
func convert(integerToTimeString int: Int) -> String {
    var string = "\(int)"
    if string.count > 3 {
        string.insert(":", at: string.index(string.startIndex, offsetBy: 2))
        return string
    }
    string.insert("0", at: string.startIndex)
    string.insert(":", at: string.index(string.startIndex, offsetBy: 2))
    return string
}
Try this code instead.
It converts the Int to a String first and checks whether it's 4 characters long before adding the ":".
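For example:

convert(integerToTimeString: 902)  // "09:02"
convert(integerToTimeString: 1345) // "13:45"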
I wrote code to get the characters the user enters in a text field and do math with them, like this:
#IBOutlet weak internal var textMeli: UITextField!
var myChar = textMeli.text
var numb = [myChar[0]*3 , myChar[1]*7]
but this is wrong.
textMeli.text is a String.
myChar is a String.
You can't access a Character from a String using bracket notation.
Take a look at the documentation for the String structure.
You'll see that you can access the string's characters through the characters property. This will return a collection of Characters. Initialize a new array with the collection, and you can then use bracket notation.
let string = "Foo"
let character = Array(string.characters)[0]
character will be of type Character.
You'll then need to convert the Character to some sort of number type (Float, Int, Double, etc.) to use multiplication.
Type is important in programming. Make sure you are keeping track so you know what function and properties you can use.
Off the soap box. It looks like you're trying to take a string and convert it into a number. I would skip the step of using characters. Have two text fields, one to accept the first number (as a String) and the other to accept the second number (as a String). Use a number formatter to convert your string to a number. A number formatter will return an NSNumber. Check out the documentation and you'll see that you can "convert" the NSNumber to any number type you want. Then you can use multiplication.
Something like this:
@IBOutlet weak var firstNumberTextField: UITextField!
@IBOutlet weak var secondNumberTextField: UITextField!

let numberFormatter = NumberFormatter()
// number(from:) returns an optional NSNumber, so unwrap before using it
if let firstNumber = numberFormatter.number(from: firstNumberTextField.text ?? ""),
   let secondNumber = numberFormatter.number(from: secondNumberTextField.text ?? "") {
    let firstInt = firstNumber.intValue // or whatever type of number you need
    let secondInt = secondNumber.intValue
    let product = firstInt * secondInt
    print(product)
}
Dealing with Swift strings is kind of tricky because of the way they deal with Unicode and "grapheme clusters". You can't index into String objects using array syntax like that.
Swift also doesn't treat characters as interchangeable with 8-bit ints like C does, so you can't do math on characters the way you're trying to. You have to take a String and convert it to an Int type.
You could create an extension to the String class that WOULD let you use integer subscripts of strings:
extension String {
    subscript(index: Int) -> String {
        let start = self.index(startIndex, offsetBy: index)
        let end = self.index(start, offsetBy: 1)
        // Wrap in String(...) so this also compiles in Swift 4,
        // where a range subscript returns a Substring.
        return String(self[start ..< end])
    }
}
And then:
let inputString = textMeli.text ?? ""
let firstVal = Int(inputString[0]) ?? 0 // default to 0 if the character isn't a digit
let secondVal = Int(inputString[2]) ?? 0
and
let result = firstVal * 3 + secondVal * 7
Note that the subscript extension above is inefficient and would be a bad way to do any sort of "heavy lifting" string parsing. Each use of square-bracket indexing costs up to O(n), meaning that traversing an entire string this way approaches O(n^2), which is very bad.
The code above also lacks range checking or error handling. It will crash if you pass it a subscript out of range.
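If you want the convenience without the crash, one possible variant (my addition, not part of the original extension) returns an optional instead:

extension String {
    subscript(safe offset: Int) -> String? {
        // index(_:offsetBy:limitedBy:) returns nil when the offset
        // would run past the end of the string.
        guard offset >= 0,
              let position = index(startIndex, offsetBy: offset, limitedBy: endIndex),
              position != endIndex
        else { return nil }
        return String(self[position])
    }
}

"abc"[safe: 1] // Optional("b")
"abc"[safe: 9] // nil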
Note that it's very strange to take multiple characters as input, then do math on the individual characters as if they were separate values. This seems like a really bad user interface.
Why don't you step back from the details and tell us what you are trying to do at a higher level?
Lots of details on this throughout SO and online. Unfortunately, I cannot get any of it to work with my string. I have this string:
https://maps.googleapis.com/maps/api/directions/json?origin=%#,%#&destination=%#,%#&sensor=false&units=metric&mode=driving
And all I'm trying to do is insert the necessary values into the string by doing
let url = String(format: Constants.GoogleDirectionsUrl, road.FromCoordinates.Latitude, road.FromCoordinates.Longitude, road.ToCoordinates.Latitude, road.ToCoordinates.Longitude)
The string though always prints out as
https://maps.googleapis.com/maps/api/directions/json?origin=(null),(null)&destination=(null),(null)&sensor=false&units=metric&mode=driving
even though all the coordinates are valid. When I use string interpolation, the correct value shows up:
print("coord -- \(road.FromCoordinates.Latitude)")
coord -- 29.613929
I've tried %l, %f, and %# in the string, all with the same results. Does anyone see what I'm doing incorrectly here?
Update
For anyone else, here is what I ended up doing to overcome the above. I followed the answer below a bit and created a class func that I keep in one of the global classes in the app. This allows me to call it from anywhere. Here is the function:
class func createUrlDrivingDirections(sLat: Double, sLon: Double, eLat: Double, eLon: Double) -> String {
    return "https://maps.googleapis.com/maps/api/directions/json?origin=\(sLat),\(sLon)&destination=\(eLat),\(eLon)&sensor=false&units=metric&mode=driving"
}
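Calling it then looks like this (the class name here is a placeholder for wherever the func lives, and the destination coordinates are made up):

let url = GlobalHelpers.createUrlDrivingDirections(sLat: 29.613929, sLon: -95.316, eLat: 29.749907, eLon: -95.358421)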
Why don't you use the following syntax (use Swift... not legacy obj-c coding):
let var1 = "xxx"
let var2 = "yyy"
let var3 = "zzz"
let var4 = "www"
let var5 = "kkk"
let s = "https://maps.googleapis.com/maps/api/directions/json?origin=\(var1),\(var2)&destination=\(var4),\(var5)&sensor=false&units=metric&mode=driving"
Make sure var1...var5 are not optionals; otherwise you have to unwrap them, or you'll get something like Optional(xxx) instead of xxx in the output string.
If you need special formatting, use NumberFormatter (see: Formatters).
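If your coordinates are optionals (which the (null) output suggests), here is a small sketch that unwraps them before interpolating. It assumes the coordinates are Doubles, so adapt it to your actual types:

func directionsUrl(sLat: Double?, sLon: Double?, eLat: Double?, eLon: Double?) -> String? {
    // Bail out with nil if any coordinate is missing,
    // so "Optional(...)" never ends up in the URL.
    guard let sLat = sLat, let sLon = sLon,
          let eLat = eLat, let eLon = eLon
    else { return nil }

    return "https://maps.googleapis.com/maps/api/directions/json?origin=\(sLat),\(sLon)&destination=\(eLat),\(eLon)&sensor=false&units=metric&mode=driving"
}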