Swift Converting Character to String - ios

I have an issue converting a Character to a String. First of all, I have the following extension on String for finding the nth character within a String:
extension String {
    func characterAtIndex(index: Int) -> Character? {
        var cur = 0
        for char in self {
            if cur == index {
                return char
            }
            cur++
        }
        return nil
    }
}
This extension gives me what I want. However, when I use that nth character as the title of my custom UIButton, I get an error. My UIButton class is:
class hareketliHarfler: UIButton {
    init(frame: CGRect) {
        super.init(frame: frame)
        // Initialization code
    }
    func getLetter(letter: String!) {
        self.titleLabel.text = letter
    }
}
The error shows up when I try to call the getLetter(letter:String) function. Here is an example from my main view controller:
var harfim = hareketliHarfler(frame: CGRectMake(100, 100, 100, 100))
var str = "This is my String"
var bufi = str.characterAtIndex(3)
harfim.getLetter(bufi as AnyObject) // ****
In the line marked ****, I tried .getLetter(bufi) and .getLetter(bufi as String), and I also tried changing the parameter type of the function to func getLetter(letter: Character!), func getLetter(letter: AnyObject!), etc.
I couldn't find a way to make it work. I need help with this. Thank you.

How about the simple
String(theCharacter)
Works in Swift 4 and Swift 5
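For example (Swift 5, reusing the question's names; a minimal sketch rather than the exact original code):

let str = "This is my String"
let bufi = str[str.index(str.startIndex, offsetBy: 3)] // a Character, "s"
let title = String(bufi)                               // a String, "s"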

Your problem is quite simple: your characterAtIndex function returns a Character, and self.titleLabel.text is a String. You can't convert between the two implicitly. The easiest way would be to turn the Character into a String using the String initialiser:
// ch will be Character? type.
if let ch = str.characterAtIndex(3) {
    // Initialise a new String containing the single character 'ch'
    harfim.getLetter(String(ch))
} else {
    // str didn't have a character at index 3 (a fourth character).
}
Unlike other solutions, this is safe for unusual Unicode characters, and won't initialise a potentially large array or iterate the whole String just to get the third character.
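For readers on current Swift, the same idea can also be written with index(_:offsetBy:limitedBy:) instead of a manual loop. A minimal sketch (character(at:) is just an illustrative name, not part of the original question):

extension String {
    func character(at index: Int) -> Character? {
        // Walk at most `index` positions from the start; bail out if the
        // string is too short or the index is negative.
        guard index >= 0,
              let i = self.index(startIndex, offsetBy: index, limitedBy: endIndex),
              i < endIndex else { return nil }
        return self[i]
    }
}

"This is my String".character(at: 3) // Optional("s")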

Change this:
var bufi=str.characterAtIndex(3)
harfim.getLetter(bufi as AnyObject)
to this:
harfim.getLetter(String(Array(str)[3]))
So what is happening here:
We create an array from our string. The array elements are the Characters of the original string. This breakdown correctly handles characters that are represented by a sequence of two or more code points, e.g. emoji or flags, as noted by @MartinR.
We access the element at index 3 (the fourth character).
Note that because we create an array from the whole initial string, performance-wise it is better to use this method only with short strings and to avoid it in frequently called routines. But in your case it seems to be OK.

You can also use Character(text).isNumber if you want to handle localised digits.
Reference:
https://developer.apple.com/documentation/swift/character/3127015-isnumber
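A quick illustration (Swift 5):

Character("7").isNumber // true
Character("٣").isNumber // true (Arabic-Indic digit three)
Character("x").isNumber // false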

Related

How to extract emojis from a string?

I'm looking for a way to take a String and extract Emoji characters.
I know that Emojis are part of Unicode, so I need to isolate a certain subset of Unicode characters. I don't really know where to start.
The Set of Emoji characters
First of all we need a Set containing the unicode values representing the emoji.
Disclaimer
For this answer I am using the range of Emoticons (1F601-1F64F) and Dingbats (2702-27B0) to show you the solution. However keep in mind that you should add further ranges depending on your needs.
Extending Character
Now we need a way to calculate the Unicode Scalar Code Point of a Character. For this I am using the solution provided here.
extension Character {
    private var unicodeScalarCodePoint: Int {
        let characterString = String(self)
        let scalars = characterString.unicodeScalars
        return Int(scalars[scalars.startIndex].value)
    }
}
Extending String
This extension allows you to extract the emoji characters from a String.
extension String {
    var emojis: [Character] {
        let emojiRanges = [0x1F601...0x1F64F, 0x2702...0x27B0]
        let emojiSet = Set(emojiRanges.flatten())
        return self.characters.filter { emojiSet.contains($0.unicodeScalarCodePoint) }
    }
}
Testing
let sentence = "😃 hello world 🙃"
sentence.emojis // ["😃", "🙃"]
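For reference, on Swift 5 you no longer need to maintain explicit code point ranges: Unicode scalar properties can be queried directly. A minimal sketch (it only catches characters whose default rendering is emoji, so text-style symbols that rely on a variation selector are not matched):

extension String {
    var emojis: [Character] {
        return filter { character in
            character.unicodeScalars.contains { $0.properties.isEmojiPresentation }
        }
    }
}

"😃 hello world 🙃".emojis // ["😃", "🙃"]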

Extending Array to append SKTextures [duplicate]

This question already has answers here:
Is it possible to make an Array extension in Swift that is restricted to one class?
Being fairly new to Swift, I decided I would look at extending Array (or more specifically [SKTexture], arrays of SKTexture) with a function to add a specified number of frames from the application bundle.
// FRAMES
FuzzyRabbit_0001@2x.png
FuzzyRabbit_0002@2x.png
FuzzyRabbit_0003@2x.png
FuzzyRabbit_0004@2x.png
// CALL
var rabbitTextures = [SKTexture]()
self.rabbitTextures.textureFromFrames("FuzzyRabbit", count: 4)
My first attempt is listed below. I am getting the error Cannot invoke 'append' with an argument list of type '(SKTexture!)', which, judging from the function fuzzyPush, is because I am trying to append an SKTexture rather than the generic T.
Is this possible, or am I limited by the fact that I don't want the function to be generic but rather specific to Arrays of SKTexture.
extension Array {
    // ONLY SKTexture
    mutating func textureFromFrames(imageName: String, count: Int) {
        if !(self[0] is SKTexture) { return }
        for index in 1...count {
            let image = String(format: "\(imageName)_%04d", index)
            let texture = SKTexture(imageNamed: image)
            self.append(texture) // ERROR: Cannot invoke 'append' with an argument list of type '(SKTexture!)'
        }
    }
    // WORKS FINE
    mutating func fuzzyPush(newItem: T) {
        self.append(newItem)
    }
}
I was just curious whether this is something I could do with an extension; it's not a problem, as I already have this as a function that takes three parameters (imageName, count, arrayToAppend), so I can quite easily use that.
This extension is not possible to write today. You cannot apply an extension method to only certain types of arrays.
There are two good solutions. You can use a HAS-A pattern by creating a struct (TextureList) that contains a [SKTexture], or you can use a function.
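A minimal sketch of the HAS-A approach mentioned above, written against current Swift (TextureList and appendFrames(imageName:count:) are illustrative names, and this is untested against SpriteKit):

import SpriteKit

struct TextureList {
    private(set) var textures: [SKTexture] = []

    // Appends textures named "<imageName>_0001", "<imageName>_0002", ...
    mutating func appendFrames(imageName: String, count: Int) {
        for index in 1...count {
            let image = String(format: "\(imageName)_%04d", index)
            textures.append(SKTexture(imageNamed: image))
        }
    }
}

// Usage:
// var rabbitTextures = TextureList()
// rabbitTextures.appendFrames(imageName: "FuzzyRabbit", count: 4)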
You can replace:
self.append(texture)
with
self.append(texture as T)
I checked this on an array of Strings, though, and it worked.
Regarding the first check: also check whether the array is empty, otherwise the self[0] is SKTexture test will crash on an empty array.
This is the code I tested on an online Swift compiler (SKTexture was obviously not available there):
extension Array {
    mutating func textureFromFrames(imageName: String, count: Int) {
        for index in 1...count {
            let image = String(format: "\(imageName)_%04d", index)
            self.append(image as T)
        }
    }
}

var arr = Array<String>()
arr.textureFromFrames("testing", count: 4)
for tmp in arr {
    println("\(tmp)")
}
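Worth noting for later readers: Swift 3.1 and later allow same-type constraints on Array extensions, so the original idea becomes expressible directly. A sketch under that assumption (appendTextures(fromFrames:count:) is an illustrative name):

import SpriteKit

extension Array where Element == SKTexture {
    mutating func appendTextures(fromFrames imageName: String, count: Int) {
        for index in 1...count {
            let image = String(format: "\(imageName)_%04d", index)
            append(SKTexture(imageNamed: image))
        }
    }
}

// Usage:
// var rabbitTextures = [SKTexture]()
// rabbitTextures.appendTextures(fromFrames: "FuzzyRabbit", count: 4)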

AnyObject vs. Struct (Any)

I would like to create a method like this for my projects:
func print(obj: AnyObject) {
    if let rect = obj as? CGRect {
        println(NSStringFromCGRect(rect))
    } else if let size = obj as? CGSize {
        println(NSStringFromCGSize(size))
    }
    // ...
}
But I can't because CGRect and CGSize are structs and do not conform to the AnyObject protocol. So, any ideas on how this could be done?
Use Any instead of AnyObject.
Swift provides two special type aliases for working with non-specific types:
• AnyObject can represent an instance of any class type.
• Any can represent an instance of any type at all, including function types.
(The Swift Programming Language)
@nkukushkin's answer is correct; however, if what you want is a function that behaves differently depending on whether it's passed a CGRect or a CGSize, you are better off with overloading:
func print(rect: CGRect) {
    println(NSStringFromCGRect(rect))
}

func print(size: CGSize) {
    println(NSStringFromCGSize(size))
}
In comparison, the Any version will be both inefficient (converting your structs to Any and back could have a big impact if you do this a lot in a tight loop) and non-typesafe (you can pass anything into that function, and it will only fail at runtime).
If your intention is to coerce both types into a common type and then do the same operation on it, you can create a 3rd overload that takes that type, and have the other two call it.
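For what that could look like, here is a minimal sketch in the same Swift 1-era syntax as the answer above (printDescription is an illustrative name, chosen so it doesn't collide with the standard print):

import UIKit

// The common overload does the actual printing; the CGRect and CGSize
// overloads just convert their argument to a String and forward it.
func printDescription(string: String) {
    println(string)
}

func printDescription(rect: CGRect) {
    printDescription(NSStringFromCGRect(rect))
}

func printDescription(size: CGSize) {
    printDescription(NSStringFromCGSize(size))
}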
Just discovered a much better way of doing this. Swift has a function called dump, and it works with many kinds of data.
For example:
dump(CGRectMake(0, 5, 30, 60))
Will print:
{x 0 y 5 w 30 h 60}
If you just need to print a CGRect or CGSize, you could use:
println(rect)
or
println(size)
You left a '...' at the end of your function, so I assume there are more types that you need to print. To do that you need to make those types conform to the Printable protocol (unless they already do). Here's an example of how:
class Car {
    var mileage = 0
}

extension Car : Printable {
    var description: String {
        return "A car that has travelled \(mileage) miles."
    }
}
Then you can use:
let myCar = Car()
println(myCar)
Also, you may want to change the format of the way a type is currently printed. For example, if you wanted println(aRect) in the same format as returned by NSStringFromCGRect you could use the extension:
extension CGRect : Printable {
    public var description: String {
        return "{\(origin.x), \(origin.y)}, {\(size.width), \(size.height)}"
    }
}

How to implement functions count and dropLast in Swift, iOS?

I am making a calculator in Swift and am stuck on the backspace button. If the user presses a wrong digit, the backspace button should delete that digit from the display.
I wrote a dropLast function and it works; it returns the appropriate result. But how do I use the count method? I don't understand the return type of the count method.
@IBOutlet weak var display: UILabel!

@IBAction func backspace() {
    // how to use the count method to check the collection of elements?
    // dropLast drops the last digit and displays the result
    let dropedDigit = dropLast(display.text!)
    display.text = dropedDigit
}
How about something like this:
private func dropLast(text: String) -> String {
    let endIndex = advance(text.endIndex, -1)
    return text.substringToIndex(endIndex)
}
It calculates the index where you want to make the cut (endIndex of text - 1) and then returns the substring to this index. This function should drop the last character.
I am not using the count method here, but for your reference, Swift 1.2 introduces a count(_:) function that calculates the length of collections, including Strings.
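For readers on Swift 4 or later, the standard library already gives String a dropLast() method, so a helper like the one above is no longer needed. A minimal sketch, assuming the same display outlet as in the question:

if let text = display.text {
    display.text = String(text.dropLast()) // safe even when text is empty
}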
I know this thread is outdated, but I just went through the process of making this work, myself, in Swift 2.2, and figured I could help answer it.
@IBAction func delButton(sender: AnyObject) {
    if display.text != nil {
        var tempString = Array(display.text!.characters)
        tempString.removeLast(1)
        display.text = ""
        for num in 0..<tempString.count {
            display.text = display.text! + String(tempString[num])
        }
    }
}
Basically, we're checking that the display label has something in it, so we don't throw an error, and if so, making a variable in the scope of the function to hold the label's characters individually in an array. After that, we remove the last character from the array, clear out the label to ensure we aren't adding what's already there to our new values, then iterate through the updated array of characters and add them to the label.
It's important to note that we convert the values contained in the array to String, because they've been stored in the array as Character values, which behave differently from the String value the label expects.
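As a side note, the same effect can be written more compactly in Swift 2.2, since the character view already provides dropLast(). A sketch, assuming the same display outlet:

@IBAction func delButton(sender: AnyObject) {
    if let text = display.text {
        display.text = String(text.characters.dropLast())
    }
}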
Like I said, I know the thread is a little out of date, but I've been going through courses in Swift, and have discovered that while there is a plethora of information out there for Objective-C, there is perilously little information out there for how to do a lot of those things in Swift. Since the language is being updated repeatedly, I've noticed a growing divide between the two languages.

Get an Int out of an UILabel Swift

My problem is that I have a large number of buttons which have a number as their label, so I thought I could read the label as an integer instead of creating an action for every button.
@IBAction func NumberInput(sender: UIButton) {
    var input: Int = sender.titleLabel as Int
}
If you want to do this, you can convert the string to an Int by using string.toInt() such as:
if let input = sender.titleLabel?.text?.toInt() {
    // do something with input
} else {
    // The label couldn't be parsed into an int
}
However, I'd suggest either using UIView.tag or subclassing UIButton and adding an Int property to it to accomplish this, in case you ever change the display of your labels.
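A minimal sketch of that tag-based suggestion, assuming each button's tag has been set to its digit in Interface Builder or in code (numberInput is an illustrative rename of the question's action):

@IBAction func numberInput(sender: UIButton) {
    let input = sender.tag // the tag is already an Int, no string parsing needed
    // do something with input
}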
You should make sure that the text exists
var input:Int = (sender.titleLabel.text! as NSString).integerValue
You can't convert a UILabel to an Int. I think you want this instead:
var input : Int? = sender.titleLabel.text?.toInt()
Another way to convert a label in swift:
let num = getIntFromLabel(labelView)
Connect all your buttons to one IBAction, then create the following variable with the get/set accessors based on how you will use it.
Note: "something" is a UILabel. The computed variable below should help you do the conversions easily and with cleaner syntax. "newValue" is available in every setter; it represents whatever value is being used to set "num" to a new value.
var num: Int {
    get {
        // Read the label's text and convert it to an Int
        return Int(something.text!)!
    }
    set {
        // Write the new value back to the label as a String
        something.text = String(newValue)
    }
}
For Swift 3, what you can do is convert the String input directly to an integer, like this:
Int(input.text!)
And then, if for any reason you wish to print it out or return it as a String again, you can do:
String(Int(input.text!)!)
The exclamation marks force-unwrap the optionals: input.text is optional, and Int(_:) returns an optional.
