How the inverted property of CharacterSet works in Swift (5.0) - iOS

I have an example:
let stringToCheck: String = "42"
let numbers: CharacterSet = CharacterSet.decimalDigits
let stringIsANumber: Bool = stringToCheck.rangeOfCharacter(from: numbers.inverted) == nil
and I have two questions:
How does inverted work? What does it do?
What range does rangeOfCharacter(from:) return?

inverted means the opposite. For example, if your only characters are a, b, and c, and you have a character set consisting of a, its inversion is b and c. So your decimalDigits character set, inverted, means everything that is not a decimal digit.
A range is a contiguous stretch of something, specified numerically. For example, if you have the string "abc", the range of "bc" is "the second and third characters". The range of something that isn't there at all can be expressed as nil.
So the code you have shown looks for a character that is not a digit in the original string, and if it fails to find one (so that the range is nil), it says that the string is entirely a number.
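For illustration, a small sketch using only the API from the question (the helper name isAllDigits is mine):
import Foundation

// A string is "all digits" if no character outside .decimalDigits can be found.
let digits = CharacterSet.decimalDigits

func isAllDigits(_ s: String) -> Bool {
    // rangeOfCharacter(from:) returns the range (Range<String.Index>?) of the
    // first character belonging to the given set, or nil if none is found.
    return s.rangeOfCharacter(from: digits.inverted) == nil
}

print(isAllDigits("42"))   // true  - no non-digit exists
print(isAllDigits("4a2"))  // false - "a" is in the inverted set
print(isAllDigits(""))     // true  - nothing to find in an empty string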

Related

How to map many items

I'm trying to make a function that maps letters of the alphabet to other letters, and I'm wondering if there is a simple way to do this, or do I need to make a dictionary for each individual letter?
Not sure what language, but you could convert the character to its Unicode value and apply some arithmetic to it, e.g. in JS:
var letter = "a";
var code = letter.charCodeAt(0);
// should do some checking to make sure you stay in letter range
code = code + 4;
return String.fromCharCode(code);
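The same idea as a hedged sketch in Swift, the language of this page's main question; the wrap-around at "z", the default shift of 4, and the name shiftLetter are my own illustrative choices, not part of the answer:
// Map a lowercase ASCII letter to another letter by shifting its Unicode
// scalar value, wrapping around after "z".
func shiftLetter(_ letter: Character, by offset: UInt32 = 4) -> Character? {
    let a = ("a" as Unicode.Scalar).value
    let z = ("z" as Unicode.Scalar).value
    guard letter.unicodeScalars.count == 1,
          let scalar = letter.unicodeScalars.first,
          scalar.value >= a, scalar.value <= z
    else { return nil }                              // only plain lowercase ASCII letters
    let shifted = a + (scalar.value - a + offset) % 26
    return Unicode.Scalar(shifted).map { Character($0) }
}

print(shiftLetter("a") ?? "?")  // "e"
print(shiftLetter("z") ?? "?")  // "d" (wraps around)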

Cannot call value of non-function type double

I am quite new to programming in Swift, and I am working on a music app for iOS that adjusts the font size of the text in a UILabel in proportion to the string's length. In my code, I am trying to count the number of characters in the string and plug that into a formula, but for some reason Xcode gives me the error "Cannot call value of non-function type Double". I tried setting the value to a CGFloat, but it still gives me the same error on the "let b = 41.2 - .8(a)" line. Thank you so much, and sorry if this seems like a basic question.
let title = "Let It Bleed"
AlbumName.text = title
let a = title.characters.count
if (a <= 19) {
    let b = 41.2 - .8(a)
    let fontsize = CGFloat(b)
    AlbumName.font = AlbumName.font.fontWithSize(fontsize)
}
A screenshot of the code with the error
I assume you mean "0.8 times a" by .8(a).
Three things:
You need a leading 0 to represent fractional values in Swift.
You need the explicit operator * for multiplication.
You need to convert numeric types so they match for mathematical operations.
With all of these included, your line of interest becomes:
let b = 41.2 - 0.8 * CGFloat(a)
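Putting the three fixes together, a sketch of the corrected block could look like this (a UILabel parameter stands in for the AlbumName outlet from the question; in Swift 4+ characters.count is just count, and fontWithSize(_:) is spelled withSize(_:)):
import UIKit

// Sketch of the corrected logic; `label` stands in for the AlbumName outlet.
func adjustFontSize(of label: UILabel, for title: String) {
    label.text = title
    let a = title.count                       // Swift 4+: replaces characters.count
    if a <= 19 {
        let b = 41.2 - 0.8 * CGFloat(a)       // leading zero, explicit *, matching types
        label.font = label.font.withSize(b)   // withSize(_:) replaces fontWithSize(_:)
    }
}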

Get bounding rectangle for CGGlyphs of characters other than letters or numbers

I am looking for a way to acquire a character's glyph descent, as indicated in the picture:
The method needs to work for any given character (or at least all common Unicode characters).
Here is my current approach (in Swift, inspired by this and this question):
let char = "a"
let ctFont = CTFontCreateWithNameAndOptions("HelveticaNeue", 12, nil, nil)
var ctGlyph = CTFontGetGlyphWithName(ctFont, char)
let boundingBox = withUnsafePointer(&ctGlyph) { pointer -> CGRect in
    return CTFontGetBoundingRectsForGlyphs(ctFont, CTFontOrientation.OrientationDefault, pointer, nil, 1)
}
The descent I need is then simply -boundingBox.origin.y.
This approach works nicely for letter and number characters (see this answer for a graphical representation).
The problem is that for everything other than letters or numbers (for example: .,#')*) I get the same bounding rectangle: {x: 0.612, y: 0.012, w: 4.908, h: 8.532}, which is obviously incorrect.
How can I get the bounding rectangle (or the descent directly) for all characters?
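No answer is quoted here, but as a rough sketch (not from the thread): one way to avoid the problem is to look glyphs up with CTFontGetGlyphsForCharacters, which maps characters rather than glyph names; glyph names such as "period" differ from the characters themselves, which is likely why the code above returns one box for all punctuation. The helper name glyphDescent is illustrative:
import CoreGraphics
import CoreText

// Hypothetical helper, not from the thread: map the character to a glyph with
// CTFontGetGlyphsForCharacters, then read that glyph's bounding box.
func glyphDescent(of character: Character, fontName: String = "HelveticaNeue", size: CGFloat = 12) -> CGFloat? {
    let font = CTFontCreateWithName(fontName as CFString, size, nil)
    let utf16 = Array(String(character).utf16)
    var glyphs = [CGGlyph](repeating: 0, count: utf16.count)
    guard CTFontGetGlyphsForCharacters(font, utf16, &glyphs, utf16.count) else { return nil }
    // Bounding box of the first glyph; origin.y is negative when the glyph dips below the baseline.
    let box = CTFontGetBoundingRectsForGlyphs(font, .default, glyphs, nil, 1)
    return -box.origin.y
}

if let descent = glyphDescent(of: "g") { print("g: \(descent)") }   // descender letter
if let descent = glyphDescent(of: ",") { print(",: \(descent)") }   // punctuation gets its own box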

Swift countElements() returns incorrect value when counting flag emoji

let str1 = "🇩🇪🇩🇪🇩🇪🇩🇪🇩🇪"
let str2 = "🇩🇪.🇩🇪.🇩🇪.🇩🇪.🇩🇪."
println("\(countElements(str1)), \(countElements(str2))")
Result: 1, 10
But shouldn't str1 have 5 elements?
The bug seems to occur only when I use flag emoji.
Update for Swift 4 (Xcode 9)
As of Swift 4 (tested with Xcode 9 beta) grapheme clusters break after every second regional indicator symbol, as mandated by the Unicode 9
standard:
let str1 = "🇩🇪🇩🇪🇩🇪🇩🇪🇩🇪"
print(str1.count) // 5
print(Array(str1)) // ["🇩🇪", "🇩🇪", "🇩🇪", "🇩🇪", "🇩🇪"]
Also String is a collection of its characters (again), so one can
obtain the character count with str1.count.
(Old answer for Swift 3 and older:)
From "3 Grapheme Cluster Boundaries"
in the "Standard Annex #29 UNICODE TEXT SEGMENTATION":
(emphasis added):
A legacy grapheme cluster is defined as a base (such as A or カ)
followed by zero or more continuing characters. One way to think of
this is as a sequence of characters that form a "stack".
The base can be single characters, or be any sequence of Hangul Jamo
characters that form a Hangul Syllable, as defined by D133 in The
Unicode Standard, or be any sequence of Regional_Indicator (RI) characters. The RI characters are used in pairs to denote Emoji
national flag symbols corresponding to ISO country codes. Sequences of
more than two RI characters should be separated by other characters,
such as U+200B ZWSP.
(Thanks to @rintaro for the link).
A Swift Character represents an extended grapheme cluster, so it is (according
to this reference) correct that any sequence of regional indicator symbols
is counted as a single character.
You can separate the "flags" by a ZERO WIDTH NON-JOINER:
let str1 = "🇩🇪\u{200C}🇩🇪"
print(str1.characters.count) // 2
or insert a ZERO WIDTH SPACE:
let str2 = "🇩🇪\u{200B}🇩🇪"
print(str2.characters.count) // 3
This also resolves possible ambiguities, e.g. should "🇫​🇷​🇺​🇸"
be "🇫​🇷🇺​🇸" or "🇫🇷​🇺🇸"?
See also How to know if two emojis will be displayed as one emoji? about a possible method
to count the number of "composed characters" in a Swift string,
which would return 5 for your let str1 = "🇩🇪🇩🇪🇩🇪🇩🇪🇩🇪".
Here's how I solved that problem, for Swift 3:
let str = "🇩🇪🇩🇪🇩🇪🇩🇪🇩🇪" // or whatever the string of emojis is
let range = str.startIndex..<str.endIndex
var length = 0
str.enumerateSubstrings(in: range, options: NSString.EnumerationOptions.byComposedCharacterSequences) { (substring, substringRange, enclosingRange, stop) -> () in
    length = length + 1
}
print("Character Count: \(length)")
This fixes all the problems with character count and emojis, and is the simplest method I have found.

How can I put a string inside another string by position

My string is 'Hllo'.
I want to insert 'e' after the 'H' by its position, in this case position number 2.
local str = 'Hllo'
str = str:gsub('()',{[2]='e'})
You can simply cut the content up to the position where you want to place your character, then add the character, and finally concatenate the characters at and after that position.
src = "Hllo"
result = string.sub(src, 1, string.find(src, "H")) .. "e" .. string.sub(src, string.find(src, "H")+1)
The first part of the code gets the position of 'H' and cuts the start (in this case 'H' only).
The second part adds the character you want to insert. The third part appends every character after 'H' in the source string to the result.
You can try this out (in PHP):
$arr = str_split('hllo', 1);
$result = $arr[0].'e'.$arr[1].$arr[2].$arr[3];
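Not from this thread, but for comparison, a sketch of the same insertion in Swift (the language of this page's main question), using String.insert(_:at:):
// Insert "e" right after the leading "H" (offset 1 from the start).
var str = "Hllo"
let position = str.index(str.startIndex, offsetBy: 1)
str.insert("e", at: position)
print(str)  // "Hello"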
