I have an array of strings (they were URLs cast to strings). In the middle of each string they all have a UUID().uuidString, a count variable, and a .jpg.
let arr = ["htps://firebasestorage.googleapis.com/myApp.appspot.com/o/users%Mxd6EUO5l2LK-mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_3.jpg?alt=media&token=e215e6a1-f5b9-431e-83a3",
           "htps://firebasestorage.googleapis.com/myAapp.appspot.com/o/users%-Ll_Mxd6EUO5l2LK-mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_1.jpg?alt=media&token=f350cf36-4c4e-4faf",
           "htps://firebasestorage.googleapis.com/myAPp.appspot.com/o/users%mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_2.jpg?alt=media&token=123uyqtr",
           ...]
The first element has this in the middle of it: 2FBE26726D-B8E5-47C8-9A18-504D23B99090_3.jpg
The second element has this in the middle of it: 2FBE26726D-B8E5-47C8-9A18-504D23B99090_1.jpg
The third element has this in the middle of it: 2FBE26726D-B8E5-47C8-9A18-504D23B99090_2.jpg
And so on for the fourth element onward.
How can I sort the strings in this array based on either the UUID substring together with the _x.jpg suffix, or just the _x.jpg alone?
FYI I have access to the UUID beforehand
You can sort the array this way:

1. Convert the strings (back) to URL.
2. Get the lastPathComponent of each URL.
3. Extract the substring from the last underscore character to the end.
4. Compare the substrings using compare(_:options:) with the .numeric option, or using localizedStandardCompare(_:).

If your starting array is something like this:
let array = ["htps://firebasestorage.googleapis.com/myApp.appspot.com/o/users%Mxd6EUO5l2LK-mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_3.jpg?alt=media&token=e215e6a1-f5b9-431e-83a3","htps://firebasestorage.googleapis.com/myAapp.appspot.com/o/users%-Ll_Mxd6EUO5l2LK-mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_1.jpg?alt=media&token=f350cf36-4c4e-4faf","htps://firebasestorage.googleapis.com/myAPp.appspot.com/o/users%mKa%2F4606E275-B2C5-4A69-B997-01423ABFE3B7%2FBE26726D-B8E5-47C8-9A18-504D23B99090_2.jpg?alt=media&token=123uyqtr"]
Give this one a try.
You can split based on your .jpg suffix, and then based on your UUID.
let sortedArray = array.sorted { (first, second) -> Bool in
    let firstIndex = Int((first.components(separatedBy: ".jpg")[0]).components(separatedBy: "FBE26726D-B8E5-47C8-9A18-504D23B99090_")[1]) ?? -1
    let secondIndex = Int((second.components(separatedBy: ".jpg")[0]).components(separatedBy: "FBE26726D-B8E5-47C8-9A18-504D23B99090_")[1]) ?? -1
    return firstIndex < secondIndex
}
I'm trying to pass an Int like 0902 to another class, but the received value loses the leading zero. This only happens when a zero is leading.
I also have a function which tries to add it back in, but that doesn't work either. Is there a reason Swift doesn't keep a leading zero on an Int?
If I send a value of 0902 to another class, what gets shown afterwards is 902. Totally confused.
func convertToTime(_ value: Int) -> String {
    print(value) // 902
    var text = String(format: "%02d", value)
    text.insert(":", at: text.index(text.startIndex, offsetBy: 2))
    return text
}
The problem here is how you define the type of that value, "0902".
Is it really a number?
Or is it actually a String that only contains numeric values?
Like a telephone number for instance. Is that a number? Or a string?
Would you want to add them together? Or multiply them? If not, make it a String.
Once you make it a string, the leading 0 is no longer an issue, as it is just part of the string. As soon as you make it into a number, it makes no sense for the leading zero to be there, because 0902 == 902.
Additional
Having had another look at this... Is it even a string? You are dealing with time here? So surely you should be using an NSDate object?
The problem is still the same though. Make sure you define your types correctly.
Determine what this "0902" actually is. Then make it the correct type for that.
Whether that be a number, a string, or a date.
The correct type will then ensure that you get the correct formats, and functions and properties of it.
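If it really is a time, a rough sketch of the date route might look like this (modern DateFormatter API; the "HHmm" format string is an assumption about what "0902" encodes):

import Foundation

let formatter = DateFormatter()
formatter.dateFormat = "HHmm"               // read "0902" as 09:02
if let date = formatter.date(from: "0902") {
    formatter.dateFormat = "HH:mm"
    print(formatter.string(from: date))     // "09:02"
}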
Change your function argument to String:
func convertToTime(_ value: String) -> String {
    print(value) // prints "0902"
    var text = value
    text.insert(":", at: text.index(text.startIndex, offsetBy: 2))
    return text
}
convertToTime("0902") // prints "09:02"
func convert(integerToTimeString int: Int) -> String {
    var string = "\(int)"
    if string.count > 3 {
        string.insert(":", at: string.index(string.startIndex, offsetBy: 2))
        return string
    }
    string.insert("0", at: string.startIndex)
    string.insert(":", at: string.index(string.startIndex, offsetBy: 2))
    return string
}
Try this code instead. It converts the Int to a String first and checks whether it already has 4 digits before adding the ":"; if not, it pads a leading "0".
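For example, exercising both branches:

convert(integerToTimeString: 902)  // "09:02"
convert(integerToTimeString: 1345) // "13:45"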
What are the equivalent functions in Swift 2 of Java's charAt() and indexOf()?
Firstly, read this article about Swift strings and think about exactly what you mean by characters.
You can use the character view (or the utf16 view, if that is the sort of character you want) of the string to see it as a collection. If you really need to get characters by index (rather than by iteration), you may want to convert the string to an array; normally, though, you just need to advance the index.
let myString = "Hello, Stack overflow"
// Note this index is not an integer but an index into a character view (String.CharacterView.Index)
let index = myString.characters.indexOf( "," )
let character = myString[myString.startIndex.advancedBy(4)] // "o"
This is O(n) (where n is the number of characters into the String), as it needs to iterate over the string because Characters may vary in length in the encoding.
Old answer below. The character array may still be quicker for repeated access, as array accesses are O(1) after the one-off O(n) conversion to an array (n is the array length).
let cIndex = 5
// This initialises a new array from the characters collection
let characters = [Character](myString.characters)
if cIndex < characters.count {
    let character = characters[cIndex]
    // Use the character here
}
Obviously some simplification is possible if the index is guaranteed to be within the length of the string, but I prefer to demonstrate with some safety on SO.
You can extend the String class with the missing charAt(index: Int) function:
extension String {
    func charAt(index: Int) -> Character {
        return [Character](characters)[index]
    }
}
You have to convert the input string to an Array, then return the character at the specific index. Here is the code:
extension String {
    func charAt(_ index: Int) -> Character {
        let arr = Array(self.characters)
        return arr[index]
    }
}
This works the same as Java's charAt() method; you can use it like this:

var str: String = "Alex"
print(str.charAt(1)) // prints "l"
Did you read the NSString documentation? String and NSString are not identical, but as mentioned by this answer, most functions are more or less the same. Anyway, the two exact functions you ask for are right there:
Java's charAt:
func characterAtIndex(_ index: Int) -> unichar
Java's indexOf:
func rangeOfString(_ searchString: String) -> NSRange
The NSRange data type contains just 2 variables: location and length that tell you where the substring is in the original string and how long it is. See the documentation.
// Java
"abc".indexOf("b") // => 1

// Swift
"abc".rangeOfString("b").location // => 1
You can use this instead of charAt() in Swift 3:

func charAt(str: String, int: Int) -> Character {
    let index = str.index(str.startIndex, offsetBy: int)
    return str[index]
}
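For example, mirroring Java's "Hello".charAt(1):

charAt(str: "Hello", int: 1) // "e"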
I would like to create a function that looks at a string and, if it's a decimal string, returns it as a currency-formatted string. The function below does that; however, if I pass in a string that is already formatted, it will of course fail (it expects to see a string like '25' or '25.55', but not '$15.25').
Is there a way to modify my function below to add another if condition that says "if you've already been formatted as a currency string, or your string is not in the right format, return X" (maybe X will be 0, or maybe it will be self, the same string; I'm not sure yet)?
func toCurrencyStringFromDecimalString() -> String
{
    var numberFormatter = NSNumberFormatter()
    numberFormatter.numberStyle = NSNumberFormatterStyle.CurrencyStyle
    if (self.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceCharacterSet()).utf16Count == 0)
    {
        // If whitespace is passed in, just return 0.0 as default
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: "0.0"))!
    }
    else if (IS_NOT_A_DECIMAL_OR_ALREADY_A_CURRENCY_STRING)
    {
        // So obviously this would go here to see if it's not a decimal (or already contains a currency placeholder etc.)
    }
    else
    {
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: self))!
    }
}
Thank you for your help!
Sounds like you need to use NSScanner.
According to the docs, the scanDecimal function of NSScanner:

Skips past excess digits in the case of overflow, so the receiver’s position is past the entire integer representation. Invoke this method with NULL as value to simply scan past a decimal integer representation.
I've been mostly programming in Obj-C, so my Swift is rubbish, but here's my attempt at translating the appropriate code for detecting numeric strings (as also demonstrated in this answer):
let scanner: NSScanner = NSScanner(string: self)
let isNumeric = scanner.scanDecimal(nil) && scanner.atEnd
If the string is not a decimal representation, isNumeric will be false.
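Plugged into the question's function, the whole thing might look like this (a sketch using the same pre-Swift 3 API as the question; the early-return structure and the choice to fall back to 0.0 are mine):

extension String {
    func toCurrencyStringFromDecimalString() -> String {
        let numberFormatter = NSNumberFormatter()
        numberFormatter.numberStyle = NSNumberFormatterStyle.CurrencyStyle
        let scanner = NSScanner(string: self)
        // Not a plain decimal (e.g. whitespace, or already "$15.25"): return 0.0 as default.
        if !(scanner.scanDecimal(nil) && scanner.atEnd) {
            return numberFormatter.stringFromNumber(NSDecimalNumber(string: "0.0"))!
        }
        return numberFormatter.stringFromNumber(NSDecimalNumber(string: self))!
    }
}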
How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.
As of Swift 4+
It's just:
test1.count
(Thanks to Martin R)
As of Swift 2:
With Swift 2, Apple has changed global functions to protocol extensions, extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the count function:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints "unusualMenagerie has 40 characters"
right from the Apple Swift Guide
(note, for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
For your variable, it would be:
length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16count
TLDR:
For Swift 2.0 and 3.0, use test1.characters.count. But, there are a few things you should know. So, read on.
Counting characters in Swift
Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count
But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode scalars. This means that different characters—and different representations of the same character—can require different amounts of memory to store. Because of this, characters in Swift do not each take up the same amount of memory within a string’s representation. As a result, the number of characters in a string cannot be calculated without iterating through the string to determine its extended grapheme cluster boundaries. If you are working with particularly long string values, be aware that the characters property must iterate over the Unicode scalars in the entire string in order to determine the characters for that string.

The count of the characters returned by the characters property is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string’s UTF-16 representation and not the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "👍"
emoji.characters.count //returns 1
emoji.utf16.count //returns 2
Swift 1.2 Update: There's no longer a countElements for counting the size of collections. Just use the count function as a replacement: count("Swift")
Swift 2.0, 3.0 and 3.1:
let strLength = string.characters.count
Swift 4.2 (4.0 onwards): Apple Documentation - Strings
let strLength = string.count
Swift 1.1

extension String {
    var length: Int { return countElements(self) }
}

Swift 1.2

extension String {
    var length: Int { return count(self) }
}

Swift 2.0

extension String {
    var length: Int { return characters.count }
}

Swift 4.2

extension String {
    var length: Int { return self.count }
}
let str = "Hello"
let count = str.length // returns 5 (Int)
Swift 4
"string".count
;)
Swift 3
extension String {
    var length: Int {
        return self.characters.count
    }
}
Usage:
"string".length
If you are just trying to see whether a string is empty (checking for a length of 0), Swift offers a simple Boolean test on String:
myString.isEmpty
The other side of this coin was people asking in Objective-C how to check whether a string was empty, where the answer was to check for a length of 0:
NSString is empty
Swift 5.1, 5
let flag = "🇵🇷"
print(flag.count)
// Prints "1" -- Counts the characters and emoji as length 1
print(flag.unicodeScalars.count)
// Prints "2" -- Counts the unicode lenght ex. "A" is 65
print(flag.utf16.count)
// Prints "4"
print(flag.utf8.count)
// Prints "8"
tl;dr If you want the length of a String type in terms of the number of human-readable characters, use countElements(). If you want to know the length in terms of the number of extended grapheme clusters, use endIndex. Read on for details.
The String type is implemented as an ordered collection (i.e., sequence) of Unicode characters, and it conforms to the CollectionType protocol, which conforms to the _CollectionType protocol, which is the input type expected by countElements(). Therefore, countElements() can be called, passing a String type, and it will return the count of characters.
However, in conforming to CollectionType, which in turn conforms to _CollectionType, String also implements the startIndex and endIndex computed properties, which actually represent the position of the index before the first character cluster, and position of the index after the last character cluster, respectively. So, in the string "ABC", the position of the index before A is 0 and after C is 3. Therefore, endIndex = 3, which is also the length of the string.
So, endIndex can be used to get the length of any String type, then, right?
Well, not always...Unicode characters are actually extended grapheme clusters, which are sequences of one or more Unicode scalars combined to create a single human-readable character.
let circledStar: Character = "\u{2606}\u{20DD}" // ☆⃝
circledStar is a single character made up of U+2606 (a white star), and U+20DD (a combining enclosing circle). Let's create a String from circledStar and compare the results of countElements() and endIndex.
let circledStarString = "\(circledStar)"
countElements(circledStarString) // 1
circledStarString.endIndex // 2
In Swift 2.0 count doesn't work anymore. You can use this instead:
var testString = "Scott"
var length = testString.characters.count
Here's something shorter, and more natural than using a global function:
aString.utf16count
I don't know if it's available in beta 1, though. But it's definitely there in beta 2.
Updated for Xcode 6 beta 4: the method changed from utf16count to utf16Count.
var test1: String = "Scott"
var length = test1.utf16Count
Or
var test1: String = "Scott"
var length = test1.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)
As of Swift 1.2 utf16Count has been removed. You should now use the global count() function and pass the UTF16 view of the string. Example below...
let string = "Some string"
count(string.utf16)
For Xcode 7.3 and Swift 2.2.
let str = "🐶"
If you want the number of visual characters:
str.characters.count
If you want the "16-bit code units within the string’s UTF-16 representation":
str.utf16.count
Most of the time, the first is what you need. When would you need the second? I've found a use case for it:
let regex = try! NSRegularExpression(pattern: "🐶",
    options: NSRegularExpressionOptions.UseUnixLineSeparators)
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.utf16.count), withTemplate: "dog")
print(result) // dogdogdogdogdogdog
If you use the first, the result is incorrect:
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.characters.count), withTemplate: "dog")
print(result) // dogdogdog🐶🐶🐶
You could try it like this:
var test1: String = "Scott"
var length = test1.bridgeToObjectiveC().length
In Swift 2.x, the following is how to find the length of a string:

let findLength = "This is a string of text"
findLength.characters.count // returns 24
Swift 2.0:
Get a count: yourString.characters.count
A fun example of how this is useful is showing a character countdown from some number (150, for example) in a UITextView:
func textViewDidChange(textView: UITextView) {
    yourStringLabel.text = String(150 - yourStringTextView.text.characters.count)
}
In Swift 4 I had always used string.count; today I found that

string.endIndex.encodedOffset

is a faster substitution: for a 50,000-character string it is about 6 times faster than .count. .count depends on the string length, but .endIndex.encodedOffset doesn't.
But there is one big NO: it is not good for strings with emoji; it will give a wrong result, so only .count is correct.
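A quick illustration of that caveat (Swift 4; encodedOffset reports UTF-16 offsets, so the two diverge on emoji):

let plain = "hello"
plain.count                  // 5
plain.endIndex.encodedOffset // 5 (same here)

let wave = "👋"
wave.count                   // 1
wave.endIndex.encodedOffset  // 2 (UTF-16 units, not characters)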
In Swift 4:
If the string contains only ASCII characters, use the following:
let str : String = "abcd"
let count = str.count // output 4
If the string contains non-ASCII Unicode characters, use the following:
let spain = "España"
let count1 = spain.count // output 6
let count2 = spain.utf8.count // output 7
In Xcode 6.1.1
extension String {
    var length: Int { return self.utf16Count }
}
I think that brainiacs will change this on every minor version.
Get the string value from your text view or text field:

let textlengthstring = (yourtextview?.text)! as String

Find the count of the characters in the string:

let numberOfChars = textlengthstring.characters.count
Here is what I ended up doing
let replacementTextAsDecimal = Double(string)
if string.characters.count > 0 &&
    replacementTextAsDecimal == nil &&
    replacementTextHasDecimalSeparator == nil {
    return false
}
Swift 4 update, compared with Swift 3
Swift 4 removes the need for a characters array on String. This means that you can directly call count on a string without getting the characters array first.
"hello".count // 5
Whereas in Swift 3, you have to get the characters array and then count the elements in that array. Note that the following method is still available in Swift 4.0, as you can still call characters to access the characters array of a given string:
"hello".characters.count // 5
Swift 4.0 also adopts Unicode 9, and it can now interpret grapheme clusters correctly. For example, counting an emoji will give you 1, while in Swift 3.0 you may get a count greater than 1.
"👍🏽".count // Swift 4.0 prints 1, Swift 3.0 prints 2
"👨❤️💋👨".count // Swift 4.0 prints 1, Swift 3.0 prints 4
Swift 4
let str = "Your name"
str.count
Remember: Space is also counted in the number
You can get the length simply by writing an extension:
extension String {
    // MARK: Use if it's Swift 2
    func stringLength(str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 3
    func stringLength(_ str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 4
    func stringLength(_ str: String) -> Int {
        return str.count
    }
}
The best way to count a String in Swift is this:
var str = "Hello World"
var length = count(str.utf16)
String and NSString are toll-free bridged, so you can use all methods available on NSString with a Swift String:

let x = "test" as NSString
let y: NSString = "string 2"
let lenx = x.length // 4
let leny = y.length // 8
test1.characters.count
will get you the number of letters/numbers etc in your string.
ex:
test1 = "StackOverflow"
print(test1.characters.count)
(prints "13")
Apple made it different from other major languages. The current way is to call:
test1.characters.count
However, be careful: when you say length, you mean the count of characters, not the count of bytes, because those two can be different when you use non-ASCII characters.
For example;
"你好啊hi".characters.count will give you 5 but this is not the count of the bytes.
To get the real count of bytes, you need to do "你好啊hi".lengthOfBytes(using: String.Encoding.utf8). This will give you 11.
Right now (in Swift 2.3) if you use:
myString.characters.count
the method will return a "Distance" type; if you need the method to return an Integer, you should type cast like so:
var count = myString.characters.count as Int
My two cents for Swift 3/4:

If you need to compile conditionally:
#if swift(>=4.0)
let len = text.count
#else
let len = text.characters.count
#endif