How to convert from unichar to character - ios

Is there a better way to convert from unichar to Character?
I tried:
var unichar = ...
let str = NSString(characters: &unichar, length: 1) as String
let character = Array(str.characters)[0]

You can convert unichar -> UnicodeScalar -> Character:
let c = unichar(8364)
if let uc = UnicodeScalar(c) {
    let char = Character(uc)
    print(char) // €
} else {
    print("illegal input")
}
Input values in the range 0xD800...0xDFFF
(high and low surrogates) are not allowed because they do not
correspond to valid Unicode scalar values.
If it is guaranteed that those input values do not occur then you
can simplify the conversion to
let char = Character(UnicodeScalar(c)!)
To replace a possible invalid input value by a default character
(e.g. a question mark), use
let char = Character(UnicodeScalar(c) ?? "?")

public class func stringForIcon(_ icon: NSInteger) -> Character {
    return Character(UnicodeScalar(icon)!)
}
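For context, a minimal sketch (the helper name is hypothetical) that pulls a unichar out of an NSString and converts it safely, returning nil for surrogate code units:
import Foundation

// Hypothetical helper: convert a UTF-16 code unit (unichar) to Character.
// Returns nil for surrogate halves (0xD800...0xDFFF).
func character(from codeUnit: unichar) -> Character? {
    guard let scalar = UnicodeScalar(codeUnit) else { return nil }
    return Character(scalar)
}

let str = "€uro" as NSString
let unit = str.character(at: 0)     // unichar for "€" (U+20AC)
print(character(from: unit) ?? "?") // €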

How to convert sequence of ASCII code into string in swift 4?

I have a sequence of ASCII codes as a string, e.g. "7297112112121326610511411610410097121". How can I convert this into text?
I tried the following code:
func convertAscii(asciiStr: String) {
    var asciiString = ""
    for asciiChar in asciiStr {
        if let number = UInt8(asciiChar, radix: 2) { // Cannot invoke initializer for type 'UInt8' with an argument list of type '(Character, radix: Int)'
            print(number)
            let character = String(describing: UnicodeScalar(number))
            asciiString.append(character)
        }
    }
}
convertAscii(asciiStr: "7297112112121326610511411610410097121")
But I get an error on the if let number line.
As already mentioned, decimal ASCII values are in the range 0-255 and can have more than two digits.
Based on Sulthan's answer, and assuming there are no characters < 32 (0x20) or > 199 (0xC7) in the text, this approach checks the first character of the cropped string: if it's "1", the character is represented by 3 digits, otherwise 2.
func convertAscii(asciiStr: String) {
    var source = asciiStr
    var result = ""
    while source.count >= 2 {
        let digitsPerCharacter = source.hasPrefix("1") ? 3 : 2
        let charBytes = source.prefix(digitsPerCharacter)
        source = String(source.dropFirst(digitsPerCharacter))
        let number = Int(charBytes)!
        let character = UnicodeScalar(number)!
        result += String(character)
    }
    print(result) // "Happy Birthday"
}
convertAscii(asciiStr: "7297112112121326610511411610410097121")
If we consider the string to be composed of characters where every character is represented by exactly two decimal digits, then something like this would work (this is just an example, not optimal):
func convertAscii(asciiStr: String) {
    var source = asciiStr
    var characters: [String] = []
    let digitsPerCharacter = 2
    while source.count >= digitsPerCharacter {
        let charBytes = source.prefix(digitsPerCharacter)
        source = String(source.dropFirst(digitsPerCharacter))
        let number = Int(charBytes, radix: 10)!
        let character = UnicodeScalar(number)!
        characters.append(String(character))
    }
    let result: String = characters.joined()
    print(result)
}
convertAscii(asciiStr: "7297112112121326610511411610410097121")
However, the format itself is ambiguous, because ASCII characters can take from 1 to 3 decimal digits; to parse correctly, you need all characters to have the same length (e.g. 1 should be 001).
Note that I always take the same number of digits, convert them to a number, and then create a character from that number.
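For illustration, a sketch of that fixed-width variant (the function names are made up): every code is zero-padded to three digits, which removes the ambiguity.
import Foundation

// Hypothetical helpers: encode/decode with a fixed 3 digits per character.
func encodeAscii(_ text: String) -> String {
    return text.unicodeScalars.map { String(format: "%03d", $0.value) }.joined()
}

func decodeAscii(_ digits: String) -> String {
    var source = digits[...]
    var result = ""
    while source.count >= 3 {
        let chunk = source.prefix(3)
        source = source.dropFirst(3)
        if let value = UInt32(String(chunk)), let scalar = UnicodeScalar(value) {
            result.append(Character(scalar))
        }
    }
    return result
}

let encoded = encodeAscii("Happy Birthday") // "072097112112..."
print(decodeAscii(encoded))                 // "Happy Birthday"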

Remove special characters from the string

I am trying to use an iOS app to dial a number. The problem is that the number is in the following format:
po placeAnnotation.mapItem.phoneNumber!
"β€Ž+1 (832) 831-6486"
I want to get rid of some special characters and I want the following:
832-831-6486
I used the following code but it did not remove anything:
let charactersToRemove = CharacterSet(charactersIn: "()+-")
var telephone = placeAnnotation.mapItem.phoneNumber?.trimmingCharacters(in: charactersToRemove)
Any ideas?
placeAnnotation.mapItem.phoneNumber!.components(separatedBy: CharacterSet.decimalDigits.inverted)
.joined()
Here you go! I tested it and it works well.
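If you need this in more than one place, here is a sketch of the same idea as a computed property (the name digitsOnly is made up):
import Foundation

extension String {
    // Keep only the decimal digits of the receiver.
    var digitsOnly: String {
        return components(separatedBy: CharacterSet.decimalDigits.inverted).joined()
    }
}

print("+1 (832) 831-6486".digitsOnly) // "18328316486"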
If you want something similar to CharacterSet with some flexibility, this should work:
let phoneNumber = "1 (832) 831-6486"
let charsToRemove: Set<Character> = Set("()+-".characters)
let newNumberCharacters = String(phoneNumber.characters.filter { !charsToRemove.contains($0) })
print(newNumberCharacters) //prints 1 832 8316486
I know the question is already answered, but to format phone numbers in any way one could use a custom formatter like the one below:
class PhoneNumberFormatter: Formatter {
    var numberFormat: String = "(###) ### ####"

    override func string(for obj: Any?) -> String? {
        if let number = obj as? NSNumber {
            var input = number.int64Value
            var output = numberFormat
            while output.characters.contains("#") {
                if let range = output.range(of: "#", options: .backwards) {
                    output = output.replacingCharacters(in: range, with: "\(input % 10)")
                    input /= 10
                } else {
                    output = output.replacingOccurrences(of: "#", with: "")
                }
            }
            return output
        }
        return nil
    }

    func string(from number: NSNumber) -> String? {
        return string(for: number)
    }
}
let phoneNumberFormatter = PhoneNumberFormatter()
//Digits will be filled in backwards in place of the hashes. It is easy to change the custom formatter in any way
phoneNumberFormatter.numberFormat = "###-##-##-##-##"
phoneNumberFormatter.string(from: 18063783889)
Swift 3
func removeSpecialCharsFromString(_ str: String) -> String {
    struct Constants {
        static let validChars = Set("1234567890-".characters)
    }
    return String(str.characters.filter { Constants.validChars.contains($0) })
}
To Use
let str : String = "+1 (832) 831-6486"
let newStr : String = self.removeSpecialCharsFromString(str)
print(newStr)
Note: add any characters you want to keep in the resulting string to validChars.
If you have a number with special characters in String format, use the following code to remove the special characters:
let numberWithSpecialChar = "1800-180-0000"
let actualNumber = numberWithSpecialChar.components(separatedBy: CharacterSet.decimalDigits.inverted).joined()
Otherwise, if you have text with special characters in String format, use the following code to remove the special characters:
let charactersWithSpecialChar = "A man, a plan, a cat, a ham, a yak, a yam, a hat, a canal-Panama!"
let actualString = charactersWithSpecialChar.components(separatedBy: CharacterSet.letters.inverted).joined(separator: " ")
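As a side note, assuming Swift 5 or later, the same digit-only filtering can be written with Character's isNumber property:
let phoneNumber = "+1 (832) 831-6486"
let digits = phoneNumber.filter { $0.isNumber }
print(digits) // "18328316486"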
NSString *str = @"(123)-456-7890";
NSLog(@"String: %@", str);
// Create character set with specified characters
NSMutableCharacterSet *characterSet =
    [NSMutableCharacterSet characterSetWithCharactersInString:@"()-"];
// Build array of components using specified characters as separators
NSArray *arrayOfComponents = [str componentsSeparatedByCharactersInSet:characterSet];
// Create string from the array components
NSString *strOutput = [arrayOfComponents componentsJoinedByString:@""];
NSLog(@"New string: %@", strOutput);

String with Unicode (variable) [duplicate]

I have a problem I couldn't find a solution to.
I have a string variable holding the unicode "1f44d" and I want to convert it to a unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways but somehow the workings behind this magic stay hidden for me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
convert charAsString to Int code
convert code to unicode character
The second step can be done e.g. like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to an Int.
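For reference, a minimal sketch of that first step with the radix-aware integer initializer:
let charAsString = "1f44d"
let code = Int(charAsString, radix: 16) // Optional(128077), i.e. 0x1F44D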
As of Swift 2.0, every Int type has an initializer that can take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed the UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding
print("\(uScalar)")
You can use
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
Here are a couple ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
//print:
You cannot use string interpolation in Swift the way you are trying to. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using init(_:radix:) initializer) then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍

How to express Strings in Swift using Unicode hexadecimal values (UTF-16)

I want to write a Unicode string using hexadecimal values in Swift. I have read the documentation for String and Character so I know that I can use special Unicode characters directly in strings like the following:
var variableString = "Cat‼🐱" // "Cat" + Double Exclamation + cat emoji
But I would like to do it using the Unicode code points. The docs (and this question) show it for characters, but are not very clear about how to do it for strings.
(Note: Although the answer seems obvious to me now, it wasn't obvious at all just a short time ago. I am answering my own question below as a means of learning how to do this and also to help myself understand Unicode terminology and how Swift Characters and Strings work.)
Character
The Swift syntax for forming a hexadecimal code point is
\u{n}
where n is a hexadecimal number up to 8 digits long. The valid range for a Unicode scalar is U+0 to U+D7FF and U+E000 to U+10FFFF inclusive. (The U+D800 to U+DFFF range is for surrogate pairs, which are not scalars themselves, but are used in UTF-16 for encoding the higher value scalars.)
Examples:
// The following forms are equivalent. They all produce "C".
let char1: Character = "\u{43}"
let char2: Character = "\u{0043}"
let char3: Character = "\u{00000043}"
// Higher value Unicode scalars are done similarly
let char4: Character = "\u{203C}" // ‼ (DOUBLE EXCLAMATION MARK character)
let char5: Character = "\u{1F431}" // 🐱 (cat emoji)
// Characters can be made up of multiple scalars
let char7: Character = "\u{65}\u{301}" // é = "e" + accent mark
let char8: Character = "\u{65}\u{301}\u{20DD}" // é⃝ = "e" + accent mark + circle
Notes:
Leading zeros can be added or omitted
Characters are known as extended grapheme clusters. Even when they are composed of multiple scalars, they are still considered a single character. What is key is that they appear to be a single character (grapheme) to the user.
TODO: How to convert surrogate pair to Unicode scalar in Swift
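Pending that TODO, here is a minimal sketch (the function is hypothetical) using the standard UTF-16 decoding formula scalar = 0x10000 + (high - 0xD800) * 0x400 + (low - 0xDC00):
// Hypothetical helper: combine a UTF-16 surrogate pair into one scalar.
func scalar(high: UInt16, low: UInt16) -> UnicodeScalar? {
    guard (0xD800...0xDBFF).contains(high), (0xDC00...0xDFFF).contains(low) else {
        return nil
    }
    let value = 0x10000 + ((UInt32(high) - 0xD800) << 10) + (UInt32(low) - 0xDC00)
    return UnicodeScalar(value)
}

if let cat = scalar(high: 0xD83D, low: 0xDC31) {
    print(Character(cat)) // 🐱
}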
String
Strings are composed of characters. See the following examples for some ways to form them using hexadecimal code points.
Examples:
var string1 = "\u{0043}\u{0061}\u{0074}\u{203C}\u{1F431}" // Cat‼🐱
// pass an array of characters to a String initializer
let catCharacters: [Character] = ["\u{0043}", "\u{0061}", "\u{0074}", "\u{203C}", "\u{1F431}"] // ["C", "a", "t", "‼", "🐱"]
let string2 = String(catCharacters) // Cat‼🐱
Converting Hex Values at Runtime
At runtime you can convert hexadecimal or Int values into a Character or String by first converting it to a UnicodeScalar.
Examples:
// hex values
let value0: UInt8 = 0x43 // 67
let value1: UInt16 = 0x203C // 8252
let value2: UInt32 = 0x1F431 // 128049
// convert hex to UnicodeScalar
let scalar0 = UnicodeScalar(value0)
// make sure that UInt16 and UInt32 form valid Unicode values
guard
    let scalar1 = UnicodeScalar(value1),
    let scalar2 = UnicodeScalar(value2) else {
        return
}
// convert to Character
let character0 = Character(scalar0) // C
let character1 = Character(scalar1) // ‼
let character2 = Character(scalar2) // 🐱
// convert to String
let string0 = String(scalar0) // C
let string1 = String(scalar1) // ‼
let string2 = String(scalar2) // 🐱
// convert hex array to String
let myHexArray = [0x43, 0x61, 0x74, 0x203C, 0x1F431] // an Int array
var myString = ""
for hexValue in myHexArray {
    if let scalar = UnicodeScalar(hexValue) {
        myString.append(Character(scalar))
    }
}
print(myString) // Cat‼🐱
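Since the title mentions UTF-16, note that the utf16 view exposes these code units directly, including the surrogate pair that encodes the cat emoji:
let cat = "Cat‼🐱"
print(cat.utf16.map { String($0, radix: 16) })
// ["43", "61", "74", "203c", "d83d", "dc31"]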
Further reading
Strings and Characters docs
Glossary of Unicode Terms
Strings in Swift
Working with Unicode code points in Swift
From your hex "0x1F52D" to an actual emoji:
let c = 0x1F602
The next step would be getting a UInt32 from your hex:
let intEmoji = UnicodeScalar(UInt32(c))!.value
From this you can do something like:
titleLabel.text = String(UnicodeScalar(intEmoji)!)
Here you have a "😂".
It works with ranges of hexadecimal values too, for example to get multiple UInt32 values from hex ranges:
var data = [UInt32]() // collects the scalar values
let emojiRanges = [
    0x1F600...0x1F636,
    0x1F645...0x1F64F,
    0x1F910...0x1F91F,
    0x1F30D...0x1F52D
]
for range in emojiRanges {
    for i in range {
        let c = UnicodeScalar(i)!.value
        data.append(c)
    }
}
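A possible follow-up, sketched under the assumption that data is the [UInt32] array filled above: turn the collected values back into text.
let emojis = data.compactMap { value in
    UnicodeScalar(value).map { Character($0) }
}
print(String(emojis.prefix(3))) // 😀😁😂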

strtoul() Function- Swift

I'm trying to create a Swift iOS program that converts a number into dec, bin, and hex numbers. I've come across the strtoul function but don't quite understand how to use it. Would someone be able to explain it? Thanks!
The function strtoul is pretty simple to use. You will also need String(_:radix:) to convert in the other direction. You can create extensions to convert from hexadecimal or binary to decimal as follows:
Usage: String(_:radix:)
extension Int {
    var toBinary: String {
        return String(self, radix: 2)
    }

    var toHexa: String {
        return String(self, radix: 16)
    }
}
Usage: strtoul()
extension String {
    var hexaToDecimal: Int {
        return Int(strtoul(self, nil, 16))
    }

    var hexaToBinary: String {
        return hexaToDecimal.toBinary
    }

    var binaryToDecimal: Int {
        return Int(strtoul(self, nil, 2))
    }

    var binaryToHexa: String {
        return binaryToDecimal.toHexa
    }
}
Testing
let myBinFromInt = 255.toBinary // "11111111"
let myhexaFromInt = 255.toHexa // "ff"
let myIntFromHexa = "ff".hexaToDecimal // 255
let myBinFromHexa = "ff".hexaToBinary // "11111111"
let myIntFromBin = "11111111".binaryToDecimal // 255
let myHexaFromBin = "11111111".binaryToHexa // "ff"
The strtoul() function converts the string in str to an unsigned long value. The conversion is done according to the given base, which must be between 2 and 36 inclusive, or be the special value 0.
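To illustrate the special base 0 (a sketch; note that strtoul returns 0 for unparseable input rather than failing):
import Foundation

// Base 0 makes strtoul infer the base from the prefix:
print(strtoul("0xff", nil, 0)) // 255 (hexadecimal, "0x" prefix)
print(strtoul("0755", nil, 0)) // 493 (octal, leading "0")
print(strtoul("42", nil, 0))   // 42  (decimal)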
Really, it sounds like you want to use NSString.
From what it sounds like, you want to convert an unsigned integer to decimal, hex and binary.
For example, if you had an integer n:
var st = NSString(format:"%2X", n)
would convert the integer to hexadecimal and store it in the variable st.
//NSString(format:"%2X", 10) would give you 'A' as 10 is A in hex
//NSString(format:"%2X", 17) would give you 11 as 17 is 11 in hex
Binary: note that printf-style format strings have no binary conversion specifier (%u prints unsigned decimal), so use the radix initializer instead:
var st = String(n, radix: 2)
Decimal (2 decimal places; note that %f expects a floating-point value):
var st = NSString(format:"%.02f", Double(n))
