I have a String with a different value every time:
String words = "My name is Rob Joe";
I want to get only the last word:
Joe
but I don't know in advance how many words the string consists of.
String words = "My name is Rob Joe";
var array = words.split(" "); // <-- [My, name, is, Rob, Joe]
print(array.last); // output 'Joe'
In Flutter (Dart), you can get the last word of a string by splitting the string into a list of substrings using the split method and then accessing the last element of the resulting list. Here's an example:
String words = "My name is Rob Joe";
List<String> wordsArray = words.split(" ");
String lastWord = wordsArray[wordsArray.length - 1];
print(lastWord); // "Joe"
Instead of splitting, you can also use substring and lastIndexOf:
final words = "My name is Rob Joe";
final lastWord = words.substring(words.lastIndexOf(" ") + 1);
print(lastWord); // prints 'Joe'; if there is no space, lastIndexOf returns -1 and the whole string is returned
If you want to find the last word, you should first properly define what a "word" is.
It's clearly obvious here, which is why it's doubly important to write it down, because something else may be just as obvious to someone else.
(Read: Nothing is obvious. Document it all!)
But let's say that a word is a maximal contiguous sequence of ASCII letters.
Then that's what you should look for.
Splitting on space characters works for this string, but won't if you have punctuation, or trailing whitespace, or any number of other complications.
I'd probably use a RegExp:
// Matches a word. If used properly, only matches entire words.
var wordRE = RegExp(r"[a-zA-Z]+");
// Assume at least one word in `words`. Otherwise need more error handling.
var lastWord = wordRE.allMatches(words).last[0]!;
This can be a little inefficient, if the string is long.
Another approach that might be more efficient, depending on the RegExp implementation, is to search backwards:
/// Captures first sequence of [a-zA-Z]+ looking backwards from end.
var lastWordRE = RegExp(r"$(?<=([a-zA-Z]+)[^a-zA-Z]*)");
var lastWord = lastWordRE.firstMatch(words)?[1];
If you don't want to rely on RegExps (which are admittedly not that readable, and their performance is not always predictable), you can search for letters manually:
String? lastWord(String words) {
  var cursor = words.length;
  while (--cursor >= 0) {
    if (_isLetter(words, cursor)) {
      var start = 0;
      var end = cursor + 1;
      while (--cursor >= 0) {
        if (!_isLetter(words, cursor)) {
          start = cursor + 1;
          break;
        }
      }
      return words.substring(start, end);
    }
  }
  return null;
}
bool _isLetter(String string, int index) {
  var char = string.codeUnitAt(index) | 0x20; // lower-cases the code unit if it's a letter
  return char >= 0x61 /*a*/ && char <= 0x7a /*z*/;
}
But first of all, decide what a word is.
Some very real words in common sentences might contain, e.g., ' or -, but whether they matter to you or not depends on your use-case.
More exotic cases may need you to decide whether "e.g." is one word or two. Is "and/or"? Is "i18n"?
It depends on what it'll be used for.
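As a hedged sketch of one such looser definition, here is a version in Swift where internal apostrophes and hyphens count as part of a word; the pattern and the helper name are illustrative assumptions, not part of the answer above:

let wordPattern = "[A-Za-z]+(?:['-][A-Za-z]+)*"

import Foundation

// Illustrative only: a "word" is ASCII letters, optionally joined by
// internal apostrophes or hyphens (so "Rob-Joe" and "don't" are single words).
func lastWord(in text: String) -> String? {
    guard let regex = try? NSRegularExpression(pattern: wordPattern) else { return nil }
    let whole = NSRange(text.startIndex..., in: text)
    guard let match = regex.matches(in: text, range: whole).last,
          let range = Range(match.range, in: text) else { return nil }
    return String(text[range])
}

print(lastWord(in: "My name is Rob-Joe!") ?? "no word") // Rob-Joe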
This question is specifically about converting an Array of type Character to a String. Converting an Array of Strings or numbers to a string is not the topic of discussion here.
In the following two lines, I would expect myStringFromArray to be set to "C,a,t,!,🐱"
var myChars: [Character] = ["C", "a", "t", "!", "🐱"]
let myStringFromArray = myChars.joinWithSeparator(",");
However, I can't execute that code because the compiler complains about an "ambiguous reference to member joinWithSeparator".
So, two questions:
1) Apple says,
"Every instance of Swift’s Character type represents a single extended
grapheme cluster. An extended grapheme cluster is a sequence of one or
more Unicode scalars that (when combined) produce a single
human-readable character."
Which to me sounds at least homogeneous enough to think it would be reasonable to implement the joinWithSeparator method to support the Character type. So, does anyone have a good answer as to why they don't do that???
2) What's the best way to transform an Array of type Character to a String in Swift?
Note: if you don't want a separator between the characters, the solution would be:
let myStringFromArray = String(myChars)
and that would give you "Cat!🐱"
Which to me sounds at least homogeneous enough to think it would be reasonable to implement the joinWithSeparator method to support the Character type. So, does anyone have a good answer as to why they don't do that???
This may be an oversight in the design. This error occurs because there are two possible candidates for joinWithSeparator(_:). I suspect this ambiguity exists because of the way Swift can implicitly interpret double-quoted literals as either String or Character. In this context, it's ambiguous which one to choose.
The first candidate is joinWithSeparator(_: String) -> String. It does what you're looking for.
If the separator is treated as a String, this candidate is picked, and the result would be: "C,a,t,!,🐱"
The second is joinWithSeparator<Separator : SequenceType where Separator.Generator.Element == Generator.Element.Generator.Element>(_: Separator) -> JoinSequence<Self>. It's called on a Sequence of Sequences, and given a Sequence as a separator. The method signature is a bit of a mouthful, so let's break it down. The argument to this function is of Separator type. This Separator is constrained to be a SequenceType where the elements of the separator sequence (Separator.Generator.Element) must have the same type as the elements of this sequence of sequences (Generator.Element.Generator.Element).
The point of that complex constraint is to ensure that the Sequence remains homogeneous. You can't join sequences of Int with sequences of Double, for example.
If the separator is treated as a Character, this candidate is picked, the result would be: ["C", ",", "a", ",", "t", ",", "!", ",", "🐱"]
The compiler throws an error to ensure you're aware that there's an ambiguity. Otherwise, the program might behave differently than you'd expect.
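For illustration, here is a sketch of candidate #2 doing its legitimate job, using the Swift 2 names discussed in this answer (the variable names are mine):

// Candidate #2 joins a sequence of Character sequences with a [Character]
// separator, producing a flattened sequence of Characters, not a String.
let groups: [[Character]] = [["C", "a"], ["t", "!"]]
let spliced = Array(groups.joinWithSeparator([","]))
// spliced == ["C", "a", ",", "t", "!"]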
You can disambiguate this situation by explicitly converting each Character into a String. Because String is NOT a SequenceType, candidate #2 is no longer possible.
var myChars: [Character] = ["C", "a", "t", "!", "🐱"]
var anotherVar = myChars.map(String.init).joinWithSeparator(",")
print(anotherVar) //C,a,t,!,🐱
This answer assumes Swift 2.2.
var myChars: [Character] = ["C", "a", "t", "!", "🐱"]
var myStrings = myChars.map({String($0)})
var result = myStrings.joinWithSeparator(",")
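For reference, Swift 3 renamed joinWithSeparator(_:) to joined(separator:), so in later Swift the same approach reads:

let myChars: [Character] = ["C", "a", "t", "!", "🐱"]
let result = myChars.map { String($0) }.joined(separator: ",")
print(result) // C,a,t,!,🐱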
joinWithSeparator is only available on sequences of String:
extension SequenceType where Generator.Element == String {
/// Interpose the `separator` between elements of `self`, then concatenate
/// the result. For example:
///
/// ["foo", "bar", "baz"].joinWithSeparator("-|-") // "foo-|-bar-|-baz"
@warn_unused_result
public func joinWithSeparator(separator: String) -> String
}
You could create a new extension to support Characters:
extension SequenceType where Generator.Element == Character {
    @warn_unused_result
    public func joinWithSeparator(separator: String) -> String {
        var str = ""
        let chars = Array(self)
        for (index, char) in chars.enumerate() {
            str.append(char)
            if index < chars.count - 1 {
                str.appendContentsOf(separator)
            }
        }
        return str
    }
}
var myChars: [Character] = ["C", "a", "t", "!", "🐱"]
let charStr = myChars.joinWithSeparator(",") // "C,a,t,!,🐱"
Related discussion on Code Review.SE.
Context: Swift 3 (beta)
TL;DR Goofy Solution
var myChars:[Character] = ["C", "a", "t", "!", "🐱"]
let separators = repeatElement(Character(","), count: myChars.count)
let zipped = zip(myChars, separators).lazy.flatMap { [$0, $1] }
let joined = String(zipped.dropLast())
Exposition
OK. This drove me nuts. In part because I got caught up in the join semantics. A join method is very useful, but when you back away from its very specific (yet common) case of string concatenation, it's doing two things at once. It's splicing other elements in with the original sequence, and then it's flattening the two-deep array of characters (an array of strings) into one single array (a string).
The OP's use of single characters in an Array sent my brain elsewhere. The answers given above are the simplest way to get what was desired. Convert the single characters to single-character strings and then use the join method.
If you want to consider the two pieces separately though... We start with the original input:
var myChars: [Character] = ["C", "a", "t", "!", "🐱"]
Before we can splice our characters with separators, we need a collection of separators. In this case, we want a pseudo collection that is the same thing repeated again and again, without having to actually make any array with that many elements:
let separators = repeatElement(Character(","), count: myChars.count)
This returns a Repeated object (which oddly enough you cannot instantiate with a regular init method).
Now we want to splice/weave the original input with the separators:
let zipped = zip(myChars, separators).lazy.flatMap { [$0, $1] }
The zip function returns a Zip2Sequence (which, also curiously, must be instantiated via a free function rather than a direct initializer). By itself, when enumerated, the Zip2Sequence just enumerates paired tuples of (eachSequence1, eachSequence2). The flatMap expression turns that into a single series of alternating elements from the two sequences.
For large inputs, this would create a largish intermediary sequence, just to be thrown away soon after. So we insert the lazy accessor, which lets the transform be computed only on demand, as we access elements from it (think iterator).
Finally, we know we can make a String from just about any sort of Character sequence. So we just pass this directly to the String creation. We add a dropLast() to avoid the last comma being added.
let joined = String(zipped.dropLast())
The valuable thing about decomposing it this way (it's definitely more lines of code, so there had better be a redeeming value) is that we gain insight into a number of tools we could use to solve problems similar, but not identical, to join. For example, say we want to keep the trailing comma? Joined isn't the answer. Suppose we want a non-constant separator? Just rework the second line. Etc.
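For completeness, here is a sketch of the same splice-and-flatten pipeline under post-Swift-3 naming (an assumption on my part, since this answer targets the Swift 3 beta):

// Splice a separator after each element, flatten lazily, drop the trailing
// separator, and build the String.
let chars: [Character] = ["C", "a", "t", "!", "🐱"]
let seps = repeatElement(Character(","), count: chars.count)
let spliced = zip(chars, seps).lazy.flatMap { pair in [pair.0, pair.1] }
let joinedString = String(spliced.dropLast())
print(joinedString) // C,a,t,!,🐱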
I'd like to capture a passcode that is between 6 and 8 digits long.
I'd like to match:
123-4567 and
12-34-56-78
And fail:
1234567890 and 123-456-7890
As it stands I'm using (\\b(?:\\d[-,\\h]?+){5,7}\\d\\b)
This successfully knocks back 1234567890, but gives a partial match on 123-456-7890. Is there a way for the word boundary to include hyphens within its count?
You can use lookarounds:
(?<!-)\b\d(?:[-,\h]?\d){5,7}(?!-)\b
See the regex demo
Swift regex uses ICU flavor, so both the lookbehind and a lookahead will work. The (?<!-) lookbehind makes sure there is no - before the digit that starts a new word (or after a word boundary), and (?!-) lookahead makes sure there is no - after the 8th digit right at the word boundary.
Do not forget to use double backslashes.
As @AlanMoore suggests, the word boundaries and the hyphen lookarounds can be replaced with the single lookarounds (?<![\w-]) and (?![\w-]). This will make the regex a bit more efficient, since there will be only one position to check at the start and one at the end:
(?<![\w-])\d(?:[-,\h]?\d){5,7}(?![\w-])
See another demo
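If it helps, here is a small sketch of wiring the improved pattern into Swift via Foundation's NSRegularExpression (the helper name is illustrative; note the doubled backslashes, as mentioned above):

import Foundation

// \h (horizontal whitespace) is an ICU construct, which NSRegularExpression uses.
let pattern = "(?<![\\w-])\\d(?:[-,\\h]?\\d){5,7}(?![\\w-])"

func isValidPasscode(_ candidate: String) -> Bool {
    guard let regex = try? NSRegularExpression(pattern: pattern) else { return false }
    let range = NSRange(candidate.startIndex..., in: candidate)
    return regex.firstMatch(in: candidate, range: range) != nil
}

isValidPasscode("123-4567")     // true
isValidPasscode("12-34-56-78")  // true
isValidPasscode("1234567890")   // false
isValidPasscode("123-456-7890") // false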
Not an exact literal answer, but an alternative native Swift solution:
enum CheckResult {
    case Success(String), Failure
}

func checkPassCode(string: String) -> CheckResult {
    let filteredArray = string.characters.filter { $0 != "-" }.map { String($0) }
    return (6...8).contains(filteredArray.count) ? .Success(filteredArray.joinWithSeparator("")) : .Failure
}
checkPassCode("123-4567") // Success(1234567)
checkPassCode("12-34-56-78") // Success(12345678)
checkPassCode("1234567890") // Failure
checkPassCode("123-456-7890") // Failure
How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.
As of Swift 4+
It's just:
test1.count
for reasons.
(Thanks to Martin R)
As of Swift 2:
With Swift 2, Apple changed global functions into protocol extensions, extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the global count function:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints "unusualMenagerie has 40 characters"
right from the Apple Swift Guide
(note, for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
for your variable, it would be
length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16count
TLDR:
For Swift 2.0 and 3.0, use test1.characters.count. But, there are a few things you should know. So, read on.
Counting characters in Swift
Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count
But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode
scalars. This means that different characters—and different
representations of the same character—can require different amounts of
memory to store. Because of this, characters in Swift do not each take
up the same amount of memory within a string’s representation. As a
result, the number of characters in a string cannot be calculated
without iterating through the string to determine its extended
grapheme cluster boundaries. If you are working with particularly long
string values, be aware that the characters property must iterate over
the Unicode scalars in the entire string in order to determine the
characters for that string.
The count of the characters returned by the characters property is not
always the same as the length property of an NSString that contains
the same characters. The length of an NSString is based on the number
of 16-bit code units within the string’s UTF-16 representation and not
the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "👍"
emoji.characters.count //returns 1
emoji.utf16.count //returns 2
Swift 1.2 Update: There's no longer a countElements for counting the size of collections. Just use the count function as a replacement: count("Swift")
Swift 2.0, 3.0 and 3.1:
let strLength = string.characters.count
Swift 4.2 (4.0 onwards): [Apple Documentation - Strings]
let strLength = string.count
Swift 1.1
extension String {
    var length: Int { return countElements(self) }
}
Swift 1.2
extension String {
    var length: Int { return count(self) }
}
Swift 2.0
extension String {
    var length: Int { return characters.count }
}
Swift 4.2
extension String {
    var length: Int { return self.count }
}
let str = "Hello"
let count = str.length // returns 5 (Int)
Swift 4
"string".count
;)
Swift 3
extension String {
    var length: Int {
        return self.characters.count
    }
}
usage
"string".length
If you are just trying to see if a string is empty or not (checking for length of 0), Swift offers a simple boolean test method on String
myString.isEmpty
The other side of this coin was people asking in Objective-C how to check whether a string was empty, where the answer was to check for a length of 0:
NSString is empty
Swift 5.1, 5
let flag = "🇵🇷"
print(flag.count)
// Prints "1" -- Counts the characters and emoji as length 1
print(flag.unicodeScalars.count)
// Prints "2" -- Counts the unicode lenght ex. "A" is 65
print(flag.utf16.count)
// Prints "4"
print(flag.utf8.count)
// Prints "8"
tl;dr If you want the length of a String type in terms of the number of human-readable characters, use countElements(). endIndex is not a reliable substitute, because a single character can be composed of multiple Unicode scalars. Read on for details.
The String type is implemented as an ordered collection (i.e., sequence) of Unicode characters, and it conforms to the CollectionType protocol, which conforms to the _CollectionType protocol, which is the input type expected by countElements(). Therefore, countElements() can be called, passing a String type, and it will return the count of characters.
However, in conforming to CollectionType, which in turn conforms to _CollectionType, String also implements the startIndex and endIndex computed properties, which actually represent the position of the index before the first character cluster, and position of the index after the last character cluster, respectively. So, in the string "ABC", the position of the index before A is 0 and after C is 3. Therefore, endIndex = 3, which is also the length of the string.
So, endIndex can be used to get the length of any String type, then, right?
Well, not always... Unicode characters are actually extended grapheme clusters, which are sequences of one or more Unicode scalars combined to create a single human-readable character.
let circledStar: Character = "\u{2606}\u{20DD}" // ☆⃝
circledStar is a single character made up of U+2606 (a white star), and U+20DD (a combining enclosing circle). Let's create a String from circledStar and compare the results of countElements() and endIndex.
let circledStarString = "\(circledStar)"
countElements(circledStarString) // 1
circledStarString.endIndex // 2
In Swift 2.0 count doesn't work anymore. You can use this instead:
var testString = "Scott"
var length = testString.characters.count
Here's something shorter, and more natural than using a global function:
aString.utf16count
I don't know if it's available in beta 1, though. But it's definitely there in beta 2.
Updated for Xcode 6 beta 4: the method utf16count changed to utf16Count.
var test1: String = "Scott"
var length = test1.utf16Count
Or
var test1: String = "Scott"
var length = test1.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)
As of Swift 1.2 utf16Count has been removed. You should now use the global count() function and pass the UTF16 view of the string. Example below...
let string = "Some string"
count(string.utf16)
For Xcode 7.3 and Swift 2.2.
let str = "🐶"
If you want the number of visual characters:
str.characters.count
If you want the "16-bit code units within the string’s UTF-16 representation":
str.utf16.count
Most of the time, 1 is what you need.
When would you need 2? I've found a use case for 2:
let regex = try! NSRegularExpression(pattern: "🐶",
    options: NSRegularExpressionOptions.UseUnixLineSeparators)
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.utf16.count), withTemplate: "dog")
print(result) // dogdogdogdogdogdog
If you use 1, the result is incorrect:
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.characters.count), withTemplate: "dog")
print(result) // dogdogdog🐶🐶🐶
You could try it like this:
var test1: String = "Scott"
var length = test1.bridgeToObjectiveC().length
In Swift 2.x, the following is how to find the length of a string:
let findLength = "This is a string of text"
findLength.characters.count
returns 24
Swift 2.0:
Get a count: yourStringTextView.text.characters.count
A fun example of how this is useful is showing a character countdown from some number (150, for example) in a UITextView:
func textViewDidChange(textView: UITextView) {
    yourStringLabel.text = String(150 - yourStringTextView.text.characters.count)
}
In Swift 4 I had always used string.count, until today, when I found that
string.endIndex.encodedOffset
is a faster substitution: for a 50,000-character string it is about 6 times faster than .count. The cost of .count depends on the string's length, but .endIndex.encodedOffset doesn't.
But there is one caveat: it is not good for strings with emoji; it gives the wrong result (it counts UTF-16 code units rather than characters), so only .count is correct there.
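To make the caveat concrete (encodedOffset is defined as a UTF-16 offset, which is why emoji throw it off):

let thumbsUp = "👍"
thumbsUp.count                  // 1 (Characters)
thumbsUp.endIndex.encodedOffset // 2 (UTF-16 code units)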
In Swift 4:
If the string does not contain non-ASCII Unicode characters, then use the following:
let str : String = "abcd"
let count = str.count // output 4
If the string contains non-ASCII Unicode characters, then note that the character count and the byte count differ:
let spain = "España"
let count1 = spain.count // output 6
let count2 = spain.utf8.count // output 7
In Xcode 6.1.1
extension String {
    var length: Int { return self.utf16Count }
}
I think that brainiacs will change this on every minor version.
Get the string value from your text view or text field:
let textlengthstring = (yourtextview?.text)! as String
Find the count of the characters in the string:
let numberOfChars = textlengthstring.characters.count
Here is what I ended up doing:
let replacementTextAsDecimal = Double(string)
if string.characters.count > 0 &&
    replacementTextAsDecimal == nil &&
    replacementTextHasDecimalSeparator == nil {
    return false
}
Swift 4 update, compared with Swift 3
Swift 4 removes the need for a characters array on String. This means that you can directly call count on a string without getting the characters array first.
"hello".count // 5
Whereas in Swift 3, you have to get the characters array and then count the elements in that array. Note that the following method is still available in Swift 4.0, as you can still call characters to access the characters array of a given string:
"hello".characters.count // 5
Swift 4.0 also adopts Unicode 9, and it can now correctly interpret grapheme clusters. For example, counting an emoji will give you 1, while in Swift 3.0 you may get a count greater than 1.
"👍🏽".count // Swift 4.0 prints 1, Swift 3.0 prints 2
"👨❤️💋👨".count // Swift 4.0 prints 1, Swift 3.0 prints 4
Swift 4
let str = "Your name"
str.count
Remember: spaces are also counted in the number.
You can get the length simply by writing an extension (keep only the variant that matches your Swift version; the Swift 3 and Swift 4 variants have identical signatures and can't coexist in one extension):
extension String {
    // MARK: Use if it's Swift 2
    func stringLength(str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 3
    func stringLength(_ str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 4
    func stringLength(_ str: String) -> Int {
        return str.count
    }
}
The best way to count a String in Swift is this:
var str = "Hello World"
var length = count(str.utf16)
String and NSString are toll-free bridged, so you can use all methods available on NSString with a Swift String:
let x = "test" as NSString
let y : NSString = "string 2"
let lenx = x.length
let leny = y.length
test1.characters.count
will get you the number of letters/numbers etc. in your string.
ex:
test1 = "StackOverflow"
print(test1.characters.count)
(prints "13")
Apple made it different from other major languages. The current way is to call:
test1.characters.count
However, be careful: when you say length, you might mean the count of characters, not the count of bytes, and those two can differ when the string contains non-ASCII characters.
For example;
"你好啊hi".characters.count will give you 5 but this is not the count of the bytes.
To get the real count of bytes, you need to do "你好啊hi".lengthOfBytes(using: String.Encoding.utf8). This will give you 11.
Right now (in Swift 2.3), if you use:
myString.characters.count
the property will return a "Distance" type. If you need an Integer, you should type-cast like so:
var count = myString.characters.count as Int
My two cents for Swift 3/4.
If you need to conditionally compile:
#if swift(>=4.0)
let len = text.count
#else
let len = text.characters.count
#endif