Get the length of a String - iOS

How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.

As of Swift 4+
It's just:
test1.count
because String is once again a Collection of its Characters in Swift 4.
(Thanks to Martin R)
As of Swift 2:
With Swift 2, Apple has changed global functions to protocol extensions, extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the count characters method:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints "unusualMenagerie has 40 characters"
right from the Apple Swift Guide
(note, for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
for your variable, it would be
length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16count

TLDR:
For Swift 2.0 and 3.0, use test1.characters.count. But, there are a few things you should know. So, read on.
Counting characters in Swift
Before Swift 2.0, count was a global function. As of Swift 2.0, it is available as a property on the string's characters view instead.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count
But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode
scalars. This means that different characters—and different
representations of the same character—can require different amounts of
memory to store. Because of this, characters in Swift do not each take
up the same amount of memory within a string’s representation. As a
result, the number of characters in a string cannot be calculated
without iterating through the string to determine its extended
grapheme cluster boundaries. If you are working with particularly long
string values, be aware that the characters property must iterate over
the Unicode scalars in the entire string in order to determine the
characters for that string.
The count of the characters returned by the characters property is not
always the same as the length property of an NSString that contains
the same characters. The length of an NSString is based on the number
of 16-bit code units within the string’s UTF-16 representation and not
the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "👍"
emoji.characters.count //returns 1
emoji.utf16.count //returns 2

Swift 1.2 Update: There's no longer a countElements for counting the size of collections. Just use the count function as a replacement: count("Swift")
Swift 2.0, 3.0 and 3.1:
let strLength = string.characters.count
Swift 4.2 (4.0 onwards): [Apple Documentation - Strings]
let strLength = string.count

Swift 1.1
extension String {
    var length: Int { return countElements(self) }
}
Swift 1.2
extension String {
    var length: Int { return count(self) }
}
Swift 2.0
extension String {
    var length: Int { return characters.count }
}
Swift 4.2
extension String {
    var length: Int { return self.count }
}
let str = "Hello"
let count = str.length // returns 5 (Int)

Swift 4
"string".count
;)
Swift 3
extension String {
    var length: Int {
        return self.characters.count
    }
}
usage
"string".length

If you are just trying to see if a string is empty or not (checking for length of 0), Swift offers a simple boolean test method on String
myString.isEmpty
This is the flip side of people asking in Objective-C how to tell whether a string was empty, where the answer was to check for a length of 0:
NSString is empty
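For example (a quick sketch; the variable is mine and the syntax is Swift 4+):
let myString = ""
myString.isEmpty        // true
myString.count == 0     // also true, but isEmpty is O(1) while count has to walk the string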

Swift 5.1, 5
let flag = "🇵🇷"
print(flag.count)
// Prints "1" -- Counts the characters and emoji as length 1
print(flag.unicodeScalars.count)
// Prints "2" -- Counts the unicode lenght ex. "A" is 65
print(flag.utf16.count)
// Prints "4"
print(flag.utf8.count)
// Prints "8"

tl;dr If you want the length of a String type in terms of the number of human-readable characters (extended grapheme clusters), use countElements(). If you want the length in terms of the underlying index positions, which can be larger for composed characters, use endIndex. Read on for details.
The String type is implemented as an ordered collection (i.e., sequence) of Unicode characters, and it conforms to the CollectionType protocol, which conforms to the _CollectionType protocol, which is the input type expected by countElements(). Therefore, countElements() can be called, passing a String type, and it will return the count of characters.
However, in conforming to CollectionType, which in turn conforms to _CollectionType, String also implements the startIndex and endIndex computed properties, which actually represent the position of the index before the first character cluster, and position of the index after the last character cluster, respectively. So, in the string "ABC", the position of the index before A is 0 and after C is 3. Therefore, endIndex = 3, which is also the length of the string.
So, endIndex can be used to get the length of any String type, then, right?
Well, not always...Unicode characters are actually extended grapheme clusters, which are sequences of one or more Unicode scalars combined to create a single human-readable character.
let circledStar: Character = "\u{2606}\u{20DD}" // ☆⃝
circledStar is a single character made up of U+2606 (a white star), and U+20DD (a combining enclosing circle). Let's create a String from circledStar and compare the results of countElements() and endIndex.
let circledStarString = "\(circledStar)"
countElements(circledStarString) // 1
circledStarString.endIndex // 2

In Swift 2.0 count doesn't work anymore. You can use this instead:
var testString = "Scott"
var length = testString.characters.count

Here's something shorter, and more natural than using a global function:
aString.utf16count
I don't know if it's available in beta 1, though. But it's definitely there in beta 2.

Updated for Xcode 6 beta 4: the property changed from utf16count to utf16Count.
var test1: String = "Scott"
var length = test1.utf16Count
Or
var test1: String = "Scott"
var length = test1.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)

As of Swift 1.2 utf16Count has been removed. You should now use the global count() function and pass the UTF16 view of the string. Example below...
let string = "Some string"
count(string.utf16)

For Xcode 7.3 and Swift 2.2.
let str = "🐶"
If you want the number of visual characters:
str.characters.count
If you want the "16-bit code units within the string’s UTF-16 representation":
str.utf16.count
Most of the time, the first (characters.count) is what you need.
When would you need the second (utf16.count)? I've found one use case:
let regex = try! NSRegularExpression(pattern: "🐶",
    options: NSRegularExpressionOptions.UseUnixLineSeparators)
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.utf16.count), withTemplate: "dog")
print(result) // dogdogdogdogdogdog
If you use 1, the result is incorrect:
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.characters.count), withTemplate: "dog")
print(result) // dogdogdog🐶🐶🐶

You could try it like this:
var test1: String = "Scott"
var length = test1.bridgeToObjectiveC().length

In Swift 2.x, the following is how to find the length of a string:
let findLength = "This is a string of text"
findLength.characters.count
returns 24

Swift 2.0:
Get a count: yourStringTextView.text.characters.count
Fun example of how this is useful would be to show a character countdown from some number (150 for example) in a UITextView:
func textViewDidChange(textView: UITextView) {
    yourStringLabel.text = String(150 - yourStringTextView.text.characters.count)
}

In Swift 4 I had always used string.count until today, when I found that
string.endIndex.encodedOffset
is a faster substitute: for a 50,000-character string it is about 6 times faster than .count. .count depends on the string length, but .endIndex.encodedOffset does not.
There is one big "no", though: it is not good for strings with emoji, where it gives the wrong result, so only .count is correct.
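A small illustration of that caveat (my own sketch; note that encodedOffset reports a UTF-16 offset and was itself deprecated later in favour of utf16Offset(in:)):
let plain = "Scott"
plain.count                      // 5
plain.endIndex.encodedOffset     // 5 – happens to match for ASCII
let thumb = "👍🏽"
thumb.count                      // 1 – one Character (grapheme cluster)
thumb.endIndex.encodedOffset     // 4 – UTF-16 code units, not characters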

In Swift 4 :
If the string contains only ASCII characters, use the following:
let str : String = "abcd"
let count = str.count // output 4
If the string contains non-ASCII (multi-byte) characters, the character count and the byte count differ:
let spain = "España"
let count1 = spain.count // output 6
let count2 = spain.utf8.count // output 7

In Xcode 6.1.1
extension String {
    var length : Int { return self.utf16Count }
}
I think that brainiacs will change this on every minor version.

Get string value from your textview or textfield:
let textlengthstring = (yourtextview?.text)! as String
Find the count of the characters in the string:
let numberOfChars = textlengthstring.characters.count

Here is what I ended up doing
let replacementTextAsDecimal = Double(string)
if string.characters.count > 0 &&
    replacementTextAsDecimal == nil &&
    replacementTextHasDecimalSeparator == nil {
    return false
}

Swift 4 update comparing with swift 3
Swift 4 removes the need for a characters array on String. This means that you can directly call count on a string without getting characters array first.
"hello".count // 5
Whereas in Swift 3, you have to get the characters array and then count the elements in that array. Note that the following method is still available in Swift 4.0, as you can still call characters to access the characters array of a given string:
"hello".characters.count // 5
Swift 4.0 also adopts Unicode 9 and can now interpret grapheme clusters correctly. For example, counting an emoji will give you 1, while in Swift 3.0 you may get counts greater than 1.
"👍🏽".count // Swift 4.0 prints 1, Swift 3.0 prints 2
"👨‍❤️‍💋‍👨".count // Swift 4.0 prints 1, Swift 3.0 prints 4

Swift 4
let str = "Your name"
str.count
Remember: spaces are also counted (the string above has a count of 9).
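If you want to exclude the space, a minimal sketch (using Swift 5's Character.isWhitespace; the filter line is my own addition):
str.count                               // 9 – the space is included
str.filter { !$0.isWhitespace }.count   // 8 – spaces excluded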

You can get the length simply by writing an extension:
extension String {
    // MARK: Use if it's Swift 2
    func stringLength(str: String) -> Int {
        return str.characters.count
    }
    // MARK: Use if it's Swift 3
    func stringLength(_ str: String) -> Int {
        return str.characters.count
    }
    // MARK: Use if it's Swift 4
    func stringLength(_ str: String) -> Int {
        return str.count
    }
}
(Keep only the variant that matches your Swift version; the Swift 3 and Swift 4 overloads share a signature, so they cannot both be present at once.)

Best way to count a String in Swift is this (note: it counts UTF-16 code units, which can differ from the number of characters):
var str = "Hello World"
var length = count(str.utf16)

String and NSString are toll-free bridged, so you can use all methods available on NSString with a Swift String:
let x = "test" as NSString
let y : NSString = "string 2"
let lenx = x.length
let leny = y.length

test1.characters.count
will get you the number of letters/numbers etc in your string.
ex:
test1 = "StackOverflow"
print(test1.characters.count)
(prints "13")

Apple made this different from other major languages. The current way is to call:
test1.characters.count
Be careful, though: when you say "length" here, you mean the count of characters, not the count of bytes, and those two can differ when you use non-ASCII characters.
For example:
"你好啊hi".characters.count will give you 5, but this is not the count of the bytes.
To get the real count of bytes, you need to do "你好啊hi".lengthOfBytes(using: String.Encoding.utf8). This will give you 11.
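The same comparison through the string's views (a quick sketch in Swift 4 syntax; utf8.count gives the same byte count as lengthOfBytes(using:), which comes from Foundation):
import Foundation
let s = "你好啊hi"
s.count                          // 5 characters
s.lengthOfBytes(using: .utf8)    // 11 bytes
s.utf8.count                     // 11 – same count, via the UTF-8 view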

Right now (in Swift 2.3), if you use:
myString.characters.count
the method will return a "Distance" type. If you need it as an Integer, you can cast it like so:
var count = myString.characters.count as Int

My two cents for Swift 3/4:
If you need to conditionally compile:
#if swift(>=4.0)
let len = text.count
#else
let len = text.characters.count
#endif

Related

warning: 'characters' is deprecated: Please use String or Substring directly

characters, an instance property of String, is deprecated as of Xcode 9.1.
It was very useful for getting a substring from a String, but now it has been deprecated and Xcode suggests using Substring. I've tried to check SO questions and Apple developer tutorials/guidelines for this, but could not find any solution/alternative suggested.
Here is the warning message:
'characters' is deprecated: Please use String or Substring
I have many string operations performed/handled using the characters property.
Does anyone have any idea/info about this change?
Swift 4 introduced changes to the String API.
You can just use !stringValue.isEmpty instead of stringValue.characters.count > 0.
For more information, see the examples below:
let edit = "Summary"
edit.count // 7
Swift 4 vs Swift 3 examples:
let myString = "test"
for char in myString.characters {print(char) } // Swift 3
for char in myString { print(char) } // Swift 4
let length = myString.characters.count // Swift 3
let length = myString.count // Swift 4
One of the most common cases for manipulating strings is with JSON responses. In this example I created an extension in my watch app to drop the last (n) characters of a Bitcoin JSON object.
Swift 3:
func dropLast(_ n: Int = 0) -> String {
    return String(characters.dropLast(n))
}
Xcode 9.1 Error Message:
'characters' is deprecated: Please use String or Substring directly
Xcode is telling us to use the string variable or method directly.
Swift 4:
func dropLast(_ n: Int = 0) -> String {
    return String(dropLast(n))
}
Complete Extension:
extension String {
    func dropLast(_ n: Int = 0) -> String {
        return String(dropLast(n))
    }
    var dropLast: String {
        return dropLast()
    }
}
Call:
print("rate:\(response.USDRate)")
let literalMarketPrice = response.USDRate.dropLast(2)
print("literal market price: \(literalMarketPrice)")
Console:
//rate:7,101.0888 //JSON float
//literal market price: 7,101.08 // JSON string literal
Additional Examples:
print("Spell has \(invisibleSpellName.count) characters.")
return String(dropLast(n))
return String(removeLast(n))
Documentation:
You'll often be using common methods such as dropLast() or removeLast() or count so here is the explicit Apple documentation for each method.
dropLast()
removeLast()
counting characters
The characters property existed because String stopped being a collection in Swift 2.0. That code is still valid in Swift 4, but it is no longer necessary now that String is a Collection again.
For example a Swift 4 String now has a direct count property that gives the character count:
// Swift 4
let spString = "Stack"
spString.count // 5
Examples for String and SubString.
String
A Swift 4 String now directly exposes its elements, so you can get the first character without going through characters (which used to be string.characters.first):
let spString = "Stack"
let firstElement = spString.first //S
SubString
Using SubString, get the first character:
let spstring = "Welcome"
let indexStartOfText = spstring.index(spstring.startIndex, offsetBy: 1)
let sub = spstring.substring(to: indexStartOfText)
print(sub) //W
That warning is just the tip of the iceberg; there were a lot of string changes. Strings are again a collection of characters, but we also got something new and cool: substrings :)
This is a great read about this:
https://useyourloaf.com/blog/updating-strings-for-swift-4/
Just remove characters
For example:
stringValue.characters.count
to
stringValue.count
You can also use this code for dictionary grouping without using { $0.characters.first! }.
let cities = ["Shanghai": 24_256_800, "Karachi": 23_500_000, "Beijing": 21_516_000, "Seoul": 9_995_000]
let groupedCities = Dictionary(grouping: cities.keys) { $0.first! }
print(groupedCities)
func validatePhoneNumber(number: String) -> Bool {
    if number.count < 10 {    // deprecated form: number.characters.count
        return false
    } else {
        return true
    }
}
Use .count directly; characters is deprecated.

How to get substring from user input?

I wrote code to get the characters the user enters in a text field and do math with them, like this:
@IBOutlet weak internal var textMeli: UITextField!
var myChar = textMeli.text
var numb = [myChar[0]*3 , myChar[1]*7]
but it doesn't work.
textMeli.text is a String.
myChar is a String.
You can't access a Character from a String using bracket notation.
Take a look at the documentation for the String structure.
You'll see that you can access the string's characters through the characters property. This returns a collection of Characters. Initialize a new array with the collection and you can then use bracket notation.
let string = "Foo"
let character = Array(string.characters)[0]
character will be of type Character.
You'll then need to convert the Character to some sort of number type (Float, Int, Double, etc.) to use multiplication.
Type is important in programming. Make sure you are keeping track so you know what function and properties you can use.
Off the soap box: it looks like you're trying to take a string and convert it into a number. I would skip the step of using characters. Have two text fields, one to accept the first number (as a String) and the other to accept the second number (as a String). Use a number formatter to convert your string to a number. A number formatter will return an NSNumber. Check out the documentation and you'll see that you can "convert" the NSNumber to any number type you want. Then you can use multiplication.
Something like this:
let firstNumberTextField: UITextField!
let secondNumberTextField: UITextField!
let numberFormatter = NumberFormatter()
let firstNumber = numberFormatter.number(from: firstNumberTextField.text!)
let secondNumber = numberFormatter.number(from: secondNumberTextField.text!)
let firstInt = firstNumber.integerValue //or whatever type of number you need
let secondInt = secondNumber.integerValue
let product = firstInt * secondInt
Dealing with Swift strings is kind of tricky because of the way they deal with Unicode and "grapheme clusters". You can't index into String objects using array syntax like that.
Swift also doesn't treat characters as interchangeable with 8 bit ints like C does, so you can't do math on characters like you're trying to do. You have to take a String and cast it to an Int type.
You could create an extension to the String class that WOULD let you use integer subscripts of strings:
extension String {
    subscript (index: Int) -> String {
        let first = self.startIndex
        let startIndex = self.index(first, offsetBy: index)
        let nextIndex = self.index(first, offsetBy: index + 1)
        return String(self[startIndex ..< nextIndex])
    }
}
And then:
let inputString = textMeli.text ?? ""
let firstVal = Int(inputString[0]) ?? 0    // nil-coalescing here is just for the sketch
let secondVal = Int(inputString[2]) ?? 0
and
let result = firstVal * 3 + secondVal * 7
Note that the subscript extension above is inefficient and would be a bad way to do any sort of "heavy lifting" string parsing. Each use of square-bracket indexing is O(n), meaning that traversing an entire string this way approaches O(n^2), which is very bad.
The code above also lacks range checking or error handling. It will crash if you pass it a subscript out of range.
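If you do want bounds checking, here is a minimal sketch (my own addition, not part of the original answer) of a variant that returns an optional instead of crashing:
extension String {
    // Returns nil instead of trapping when the index is out of range.
    subscript (safe index: Int) -> String? {
        guard index >= 0,
              let start = self.index(startIndex, offsetBy: index, limitedBy: endIndex),
              start < endIndex else { return nil }
        return String(self[start..<self.index(after: start)])
    }
}
"Foo"[safe: 1]   // Optional("o")
"Foo"[safe: 9]   // nil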
Note that it's very strange to take multiple characters as input, then do math on the individual characters as if they are separate values. This seems like a really bad user interface.
Why don't you step back from the details and tell us what you are trying to do at a higher level?

Cannot find char at position and index of char of a string var in Swift 2

Equivalent functions in Swift 2 of Java's charAt() and indexOf() ?
Firstly, read this article about Swift strings and think about exactly what you mean by characters.
You can use the character view (or the utf16 view, if that is the sort of character you want) to see the string as a collection. If you really need to get characters by index (rather than by iteration), you may want to convert it to an array, but normally you just need to advance the index.
let myString = "Hello, Stack overflow"
// Note this index is not an integer but an index into a character view (String.CharacterView.Index)
let index = myString.characters.indexOf( "," )
let character = myString[myString.startIndex.advancedBy(4)] // "o"
This is O(n) (where n is the number of characters into the String), as it needs to iterate over the string because Characters may vary in length in the underlying encoding.
Old answer below. The character array may still be quicker for repeated access, since array accesses are O(1) after the one-off O(n) conversion to an array (n is the array length).
let cIndex = 5
// This initialises a new array from the characters collection
let characters = [Character](myString.characters)
if cIndex < characters.count {
    let character = characters[cIndex]
    // Use the character here
}
Obviously some simplification is possible if the index is guaranteed to be within the length of the characters but I prefer to demonstrate with some safety on SO.
You can extend the String class with the missing charAt(index: Int) function:
extension String {
    func charAt(index: Int) -> Character {
        return [Character](characters)[index]
    }
}
You have to convert the input string to an Array, then return the character at the specified index. Here is the code:
extension String {
    func charAt(_ index: Int) -> Character {
        let arr = Array(self.characters)
        return arr[index]
    }
}
This works the same as Java's charAt() method; you can use it as:
var str:String = "Alex"
print(str.charAt(1))
Did you read the NSString documentation? String and NSString are not identical, but as mentioned by this answer most functions are more or less the same. Anyway, the two exact functions you ask for are right there:
Java's charAt:
func characterAtIndex(_ index: Int) -> unichar
Java's indexOf:
func rangeOfString(_ searchString: String) -> NSRange
The NSRange data type contains just 2 variables: location and length that tell you where the substring is in the original string and how long it is. See the documentation.
// Java
"abc".indexOf("b")                 // => 1
// Swift
"abc".rangeOfString("b").location  // => 1
You can use this instead of charAt() in Swift 3:
func charAt(str: String, int: Int) -> Character {
    let index = str.index(str.startIndex, offsetBy: int)
    return str[index]
}

Swift 2.0 String with substringWithRange

I am trying to get the first character from a String. It should be easy, but I can't do it in Swift 2.0 (with Xcode beta 6).
Get nth character of a string in Swift programming language
I have tried that method too. It uses an extension, but I can't retrieve the character using it. Can anyone tell me how to do this?
Two solutions without casting to NSString
let string = "Hello"
let firstChar1 = string.substringToIndex(string.startIndex.successor())
let firstChar2 = string.characters.first
Update for Swift 2:
Since characters.first returns a Character rather than a String in Swift 2, a new String must be created:
let firstChar2 = String(string.characters.first!)
Update for Swift 3:
successor() has been replaced with index(after:)
let firstChar1 = string.substring(to:string.index(after: string.startIndex))
Try this,
let str = "hogehoge"
let text = (str as NSString).substringFromIndex(1) // "ogehoge"
For what it's worth (and for people searching for and finding this topic), without casting the String to NSString, you need to do the following with Swift 2.1:
let myString = "Example String"
let mySubString = myString.substringWithRange(Range<String.Index>(start: myString.startIndex.advanceBy(0), end: myString.startIndex.advanceBy(4)))
print(mySubString) //'Exam'
This prints out "Exam". I must say it's much more verbose than in Obj-C, and that's saying something... ;-) But it gets the job done, and without casting to NSString.
Try this
let myString = "My String" as NSString
myString.substringWithRange(NSRange(location: 0, length: 3))
In Swift 2 a String is not a collection of anything. According to the documentation:
/// `String` is not itself a collection of anything. Instead, it has
/// properties that present the string's contents as meaningful
/// collections:
///
/// - `characters`: a collection of `Character` ([extended grapheme
/// cluster](http://www.unicode.org/glossary/#extended_grapheme_cluster))
/// elements, a unit of text that is meaningful to most humans.
///
/// - `unicodeScalars`: a collection of `UnicodeScalar` ([Unicode
/// scalar
/// values](http://www.unicode.org/glossary/#unicode_scalar_value))
/// the 21-bit codes that are the basic unit of Unicode. These
/// values are equivalent to UTF-32 code units.
///
/// - `utf16`: a collection of `UTF16.CodeUnit`, the 16-bit
/// elements of the string's UTF-16 encoding.
///
/// - `utf8`: a collection of `UTF8.CodeUnit`, the 8-bit
/// elements of the string's UTF-8 encoding.
Assuming you want to find the character at index n (here n = 2, the third character):
var str = "Hello, playground"
let chars = str.characters
let n = 2
let c = chars[chars.startIndex.advancedBy(n)]    // "l"
var mySuperCoolString = "Hello, World!!!!!!!!1!!";
println(mySuperCoolString.substringWithRange(Range<String.Index>(start: advance(mySuperCoolString.startIndex, 0), end: advance(mySuperCoolString.startIndex, 1))));
This should print out H
Swift 2.2:
let string = "12134"
string.substringWithRange(string.startIndex..<string.startIndex.advancedBy(2))
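For reference, a rough Swift 4 equivalent of that slice (my own sketch; it takes the first two characters, as above):
let end = string.index(string.startIndex, offsetBy: 2)
String(string[string.startIndex..<end])    // "12"
// or, more simply:
String(string.prefix(2))                   // "12"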

Swift 2.0 : Count is unavailable access the count property on the collection [duplicate]

Just downloaded Xcode 7 Beta, and this error appeared on enumerate keyword.
for (index, string) in enumerate(mySwiftStringArray)
{
}
Can anyone help me overcome this ?
Also, it seems like count() is no longer working for counting the length of a String.
let stringLength = count(myString)
On above line, compiler says :
'count' is unavailable: access the 'count' property on the collection.
Has Apple released any programming guide for Swift 2.0?
Many global functions have been replaced by protocol extension methods,
a new feature of Swift 2, so enumerate() is now an extension method
for SequenceType:
extension SequenceType {
    func enumerate() -> EnumerateSequence<Self>
}
and used as
let mySwiftStringArray = [ "foo", "bar" ]
for (index, string) in mySwiftStringArray.enumerate() {
print(string)
}
And String no longer conforms to SequenceType; you have to use the characters property to get the collection of Unicode characters. Also, count() is a protocol extension method of CollectionType instead of a global function:
let myString = "foo"
let stringLength = myString.characters.count
print(stringLength)
Update for Swift 3: enumerate() has been renamed to enumerated():
let mySwiftStringArray = [ "foo", "bar" ]
for (index, string) in mySwiftStringArray.enumerated() {
print(string)
}
There was an update in Swift 2 on using enumerate().
Instead of enumerate(...), you should now use
... .enumerate()
The reason is that many global functions have been replaced by protocol extension methods, so the old form produces an error.
Hope this helps.
All the best.
I know this is an old thread, but I've just been messing around with Swift 2.0 and Playgrounds and came across the same problem, so I thought I'd share a solution which uses the enumerate() method on a String:
// This line works in Swift 1.2
// for (idx, character) in enumerate("A random string, it has a comma.")

// Swift 2.x
let inputString = "A random string, it has a comma."
let characters = inputString.characters
for (idx, character) in characters.enumerate() where character == "," {
    // Do something with idx
}
Hope this helps
Thanks
Kai
