I have this code:
var i : AnyObject!
i = 10
println(i as String)
println(i.stringValue)
It crashes on the as String line but runs fine on the second line, i.stringValue.
What is the difference between as String and stringValue in the above lines?
.stringValue is a way to extract an integer as a string value, but as String will not work for that. If you use as String, Xcode will force you to add ! to the cast, which is not good: you can't cast an Int to a String, so the cast will never succeed and it will crash your app. That's why as! String crashes.
So casting is not a good idea here.
And here are some more ways to convert an integer to a string value:
let i : Int = 5 // 5
let firstWay = i.description // "5"
let anotherWay = "\(i)" // "5"
let thirdWay = String(i) // "5"
Here you cannot use let fourthWay = i.stringValue, because Int doesn't have a member named stringValue.
But you can do the same thing with AnyObject, as shown below:
let i : AnyObject = 5 // 5
let firstWay = i.description // "5"
let anotherWay = "\(i)" // "5"
let thirdWay = String(i) // "5"
let fourthWay = i.stringValue // "5" // now this will work
Both lines attempt to cast an Int to a String, and this no longer works; in Swift 2 it's not possible to do it like that. You should use:
let i = 5
print(String(format: "%i", i))
This specifically writes the Int value out as a String.
With as String you are not converting the value; you are asserting that the variable already contains a String, but in your case it is an Int, so it crashes. The other way, i.stringValue, actually converts your value into a String, so it doesn't crash and successfully produces a String value.
Note: Since you are using AnyObject, the variable has a stringValue member, but Int doesn't. To convert an Int value, check out @Dharmesh Kheni's answer.
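To see the difference without crashing, a conditional cast makes it explicit. A minimal sketch, assuming a recent Swift with Foundation imported:
import Foundation

let n: AnyObject = 10                          // stored as an NSNumber under the hood
let asString = n as? String                    // nil: the value is not a String, so the cast fails
let viaNumber = (n as? NSNumber)?.stringValue  // Optional("10"): NSNumber converts itself
print(asString as Any, viaNumber as Any)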
Related
I am trying to read the string from a label and remove the last character from it.
This is how I am trying to do it:
@IBAction func del(sender: UIButton) {
    let str = telephone.text!
    let newstr = str.remove(at: str.index(before: str.endIndex))
    telephone.text = newstr
}
When I run, I get an error:
"String" does not have a member named "remove"
Can someone help me figure out the problem?
Just started learning Swift :(
remove(at:) mutates the receiver, which must therefore be a variable string:
var str = telephone.text!
str.remove(at: str.index(before: str.endIndex))
telephone.text = str
Alternatively, use substring(to:), which returns the new string instead of modifying the receiver:
let str = telephone.text!
let newstr = str.substring(to: str.index(before: str.endIndex))
telephone.text = newstr
remove is defined as follows:
public mutating func remove(at i: String.Index) -> Character
See the mutating modifier? That means it mutates the instance on which the method is called. In your case, the instance is str, a constant. Since constants cannot be mutated, the code does not compile.
And since remove returns the removed character,
let newstr = str.remove(at: str.index(before: str.endIndex))
here newstr will hold the removed Character, not the string with the last character removed.
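A quick sketch of that behaviour on a standalone string:
var str = "12345"
let removed = str.remove(at: str.index(before: str.endIndex))
print(removed) // "5", the removed Character
print(str)     // "1234", the string after mutation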
You should rewrite the method like this:
telephone.text!.remove(at: telephone.text!.index(before: telephone.text!.endIndex))
You can use:
let idx = str.index(before: str.endIndex) // compute the index
let s = str.substring(to: idx) // get the substring
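As a side note, in Swift 4 and later dropLast() offers a concise alternative that is safe even on an empty string. A small sketch, assuming telephone.text may be nil:
if let str = telephone.text {
    telephone.text = String(str.dropLast()) // drops the last character
}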
In my previous app, we used
let value = String.stringFromStringNumberOrNil(myProperty?.value)
valueLabel.text = value
In the above code, myProperty.value was of type Any.
When I try to convert the same code to a Swift 3.0 app, it causes an error. I have written:
var value1 : Int = myProperty.value as Int
or
var value1 : String = aylaProperty.value as! String
The app gives the error:
Ambiguous reference to member 'value'
What should I do about this?
Use String's initializer:
let myPropertyValue: Any = 6
let x = myPropertyValue as! Int
let value = String(x)
let valueLabel = UILabel()
valueLabel.text = value
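If the value may not actually be an Int, a conditional cast avoids the crash from the forced as! Int. A minimal sketch under the same assumptions (a label and a value of type Any):
import UIKit

let valueLabel = UILabel()
let myPropertyValue: Any = 6
if let x = myPropertyValue as? Int {
    valueLabel.text = String(x) // "6"
}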
I have a problem I couldn't find a solution to.
I have a string variable holding the unicode "1f44d" and I want to convert it to a unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways, but the workings behind this magic remain hidden from me. Imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars; compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks whether the given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
convert charAsString to Int code
convert code to unicode character
The second step can be done e.g. like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, converting a String in hex representation to an Int, see the sketch below.
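A minimal sketch of that first step, using the failable radix initializer available since Swift 2:
let charAsString = "1f44d"
if let code = Int(charAsString, radix: 16) {
    print(code) // 128077, i.e. 0x1f44d
}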
As of Swift 2.0, every integer type has an initializer that can take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either a force unwrap or optional binding
print("\(uScalar)")
To go the other way, from a string to its scalar values, you can use:
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
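Going back the other way, you could rebuild the string from those scalar values. A sketch for modern Swift:
let values: [UInt32] = [45, 49, 50]
let scalars = values.compactMap { Unicode.Scalar($0) } // drop any invalid code points
let rebuilt = String(String.UnicodeScalarView(scalars))
print(rebuilt) // "-12"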
Here are a couple ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
// prints the character for code point U+E9C8
You cannot use string interpolation inside a Unicode escape sequence in Swift. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using the init(_:radix:) initializer), then create a Unicode scalar from that integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍
Newbie programmer here, and I can't figure out why this happens...
let a1 = "This is some text"
let x : Int = 1
var stringValue = "a\(x)"
print(stringValue)
I want it to print "This is some text", but it only ever prints a1.
Rather than storing 'This is some text' in an a1 string variable, save it in a dictionary so you can build the key (a1, a2, etc.) to access it:
let stringDictionary = ["a1": "This is some text", "a2": "This is more text"]
You can then pull it out using:
let x : Int = 1
let stringKey = "a\(x)"
let stringValue = stringDictionary[stringKey]
print(stringValue)
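Note that a dictionary lookup returns an optional, so the last line prints Optional("This is some text"). Unwrapping first gives the bare text:
if let stringValue = stringDictionary[stringKey] {
    print(stringValue) // This is some text
}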
What you want is an eval or evaluate type function, but that doesn't exist. What you're actually getting is a description of the x variable, which is the number 1 converted into text, hence a1.
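A quick sketch of what the interpolation actually produces:
let x = 1
let s = "a\(x)" // interpolation inserts x's textual description
print(s)        // "a1", not the contents of the variable a1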
If you want to get a value for a key you should use a dictionary.
let device = devices[indexPath.row]
let deviceTag = device["deviceID"] as? String
cell.slider.tag = deviceTag
I get an error from the above: Cannot assign value of type 'String?' to type 'Int'
This doesn't work (below):
cell.slider.tag = Int(deviceTag)
or what "fix-it" provides:
cell.slider.tag = Int(deviceTag!)!
You need to handle the optionals when converting from a string. Try the code below.
let device = devices[indexPath.row]
let deviceTag = device["deviceID"] as! String
cell.slider.tag = Int(deviceTag) ?? 0 // Int(String) is failable, so supply a default
Unless you're 100% sure the value in the device dictionary is a numeric string, you shouldn't force-unwrap those optionals.
You can use:
if let deviceTag = deviceTag, let tag = Int(deviceTag) { cell.slider.tag = tag }
or
cell.slider.tag = Int(deviceTag) ?? 0
But looking at your examples, it seems like deviceTag isn't a number at all. Perhaps you should inspect it in the debug area, or simply print(deviceTag), to see what the value actually is.