As the title says, what is the correct way to convert an UnsafeMutablePointer to a String in Swift?
// let's say x = UnsafeMutablePointer<Int8>
var str = x.memory.????
I tried using x.memory.description, but obviously that is wrong; it gives me the wrong string value.
If the pointer points to a NUL-terminated C string of UTF-8 bytes, you can do this:
import Foundation
let x: UnsafeMutablePointer<Int8> = ...
// or UnsafePointer<Int8>
// or UnsafePointer<UInt8>
// or UnsafeMutablePointer<UInt8>
let str = String(cString: x)
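For a self-contained example, strdup from the C standard library hands back exactly such a NUL-terminated buffer (a hypothetical standalone snippet, just for illustration):
import Foundation
// strdup returns a heap-allocated, NUL-terminated copy of the C string
let x: UnsafeMutablePointer<CChar> = strdup("hello")!
let str = String(cString: x)
print(str) // "hello"
free(x) // don't leak the buffer strdup allocated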
Times have changed. In Swift 3+ you would do it like this:
If you want the UTF-8 to be validated:
let str: String? = String(validatingUTF8: c_str)
If you want UTF-8 errors to be converted to the Unicode replacement character �:
let str: String = String(cString: c_str)
This assumes c_str is of type UnsafePointer<UInt8> or UnsafePointer<CChar> (an alias for UnsafePointer<Int8>), which is what most C functions return.
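To see the difference between the two initializers, here is a sketch with a deliberately invalid byte (0xFF can never occur in well-formed UTF-8):
// "Hi" followed by an invalid byte (0xFF as a signed CChar) and a NUL terminator
let bytes: [CChar] = [72, 105, -1, 0]
bytes.withUnsafeBufferPointer { buf in
    let c_str = buf.baseAddress!
    print(String(validatingUTF8: c_str) as Any) // nil – validation fails
    print(String(cString: c_str))               // "Hi�" – invalid byte becomes U+FFFD
}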
this:
let str: String? = String(validatingUTF8: c_str)
doesn't appear to work with UnsafeMutablePointer<UInt8>
(which is what appears to be in my data).
This is me trivially figuring out how to do something like the C/Perl system function:
let task = Process()
task.launchPath = "/bin/ls"
task.arguments = ["-lh"]
let pipe = Pipe()
task.standardOutput = pipe
task.launch()
let data = pipe.fileHandleForReading.readDataToEndOfFile()
var unsafePointer = UnsafeMutablePointer<UInt8>.allocate(capacity: data.count + 1)
data.copyBytes(to: unsafePointer, count: data.count)
unsafePointer[data.count] = 0 // NUL-terminate; String(cString:) expects a C string
let output : String = String(cString: unsafePointer)
print(output)
//let output : String? = String(validatingUTF8: unsafePointer)
//print(output!)
if I switch to validatingUTF8 (with optional) instead of cString, I get this error:
./ls.swift:19:37: error: cannot convert value of type 'UnsafeMutablePointer<UInt8>' to expected argument type 'UnsafePointer<CChar>' (aka 'UnsafePointer<Int8>')
let output : String? = String(validatingUTF8: unsafePointer)
^~~~~~~~~~~~~
Thoughts on how to validate UTF-8 on the output of the pipe (so I don't get the Unicode replacement character anywhere)?
(yes, I'm not doing proper checking of my optional for the print(), that's not the problem I'm currently solving ;-) ).
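For what it's worth, one way to sidestep the pointer-type mismatch entirely is to decode the Data directly: String(data:encoding:) returns nil on malformed UTF-8, so it validates for you. A sketch of that approach, reusing the pipe from above:
let data = pipe.fileHandleForReading.readDataToEndOfFile()
// Decodes and validates in one step: nil means the bytes were not valid UTF-8,
// so no replacement characters can sneak in.
if let output = String(data: data, encoding: .utf8) {
    print(output)
} else {
    print("output was not valid UTF-8")
}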
I am getting unexpected results while doing string manipulation and conversion to an integer value. Could someone please help me interpret why I am getting these results? I will show my code and the printed results, and explain what I expected instead.
var startIndex = string?.index((string?.startIndex)!, offsetBy: 10)
var endIndex = string?.index(of: ".")!
var field = String(describing: string?[startIndex!..<endIndex!])
print(field as Any)
Prints:
Optional("10")
Just what I expected.
print(field.lengthOfBytes(using: .utf8))
Prints:
14
Not what I expected. I expected a value of 2 since "10" is only 2 characters long.
print(Int(field) as Any)
Prints:
nil
Not what I expected. I expected a value of 10, since a string of "10" converted to integer would be an integer value of 10.
I just duplicated this code in Playground and it works as expected. I don't know why.
//: Playground - noun: a place where people can play
import Foundation
var string = "Network\t\t\t10.0.0.0/8\nClass\t\t\t\tA\nRequired Hosts:\n2\n\nRequired hosts\t2\nAvailable hosts\t2\nSubnet\t\t\t\t10.0.0.0/30\nRange start\t\t10.0.0.1\nRange end\t\t\t10.0.0.2\nBroadcast\t\t\t10.0.0.3\nMask\t\t\t\t255.255.255.252\n\n"
print(string)
var startIndex = string.index(string.startIndex, offsetBy: 10)
var endIndex = string.index(of: ".")!
var field = String(describing: string[startIndex..<endIndex])
print(field as Any)
print(field.lengthOfBytes(using: .utf8))
print(Int(field) as Any)
Prints:
Network 10.0.0.0/8
Class A
Required Hosts:
2
Required hosts 2
Available hosts 2
Subnet 10.0.0.0/30
Range start 10.0.0.1
Range end 10.0.0.2
Broadcast 10.0.0.3
Mask 255.255.255.252
10
2
Optional(10)
Try this, hope this will help:
var field = String("10")
var intValue = Int(field)
print(intValue!)
Result: 10
var field = String(describing: string?[startIndex!..<endIndex!])
field now has the value Optional("10").
print(field.lengthOfBytes(using: .utf8))
prints 14 because that's the length of the string Optional("10"). No surprise here.
print(Int(field) as Any)
is nil because the String Optional("10") is not convertible to Int.
If you remove the optionality from string, everything works just fine:
if let str = string {
    var startIndex = str.index(str.startIndex, offsetBy: 10)
    var endIndex = str.index(of: ".")!
    var field = String(describing: str[startIndex..<endIndex])
    print(field)
    print(field.lengthOfBytes(using: .utf8))
    print(Int(field))
}
Although the code is a bit cumbersome, it should work if the first digit is the 11th character in the string.
However, in this case Scanner is much more reliable because the offset can be dynamic and the result is an Int.
The scanner scans up to the first decimal digit, then scans an Int and assigns it to the result variable.
let string = "Network\t\t\t10.0.0.0/8\nClass\t\t\t\tA\nRequired Hosts:\n2\n\nRequired hosts\t2\nAvailable hosts\t2\nSubnet\t\t\t\t10.0.0.0/30\nRange start\t\t10.0.0.1\nRange end\t\t\t10.0.0.2\nBroadcast\t\t\t10.0.0.3\nMask\t\t\t\t255.255.255.252\n\n"
var result = 0
let scanner = Scanner(string: string)
scanner.scanUpToCharacters(from: .decimalDigits, into: nil)
scanner.scanInt(&result)
print(result) // 10
I am trying to read the string from a label and remove the last character from it.
This is how I am trying:
@IBAction func del(sender: UIButton) {
    let str = telephone.text!;
    let newstr = str.remove(at: str.index(before: str.endIndex))
    telephone.text = newstr;
}
When I run, I get an error:
"String" does not have a member named "remove"
Can someone help me figure out the problem?
Just started learning Swift :(
remove(at:) mutates the receiver, which must therefore be a variable string:
var str = telephone.text!
str.remove(at: str.index(before: str.endIndex))
telephone.text = str
Alternatively use substring(to:), which returns the new string
instead of modifying the receiver:
let str = telephone.text!
let newstr = str.substring(to: str.index(before: str.endIndex))
telephone.text = newstr
remove is defined as follows:
public mutating func remove(at i: String.Index) -> Character
See the mutating modifier? That means it mutates the instance on which the method is called. In your case, the instance is str, a constant. Since constants cannot be mutated, the code does not compile.
And since remove returns the removed character,
let newstr = str.remove(at: str.index(before: str.endIndex))
here newstr will not be storing the string with the last character removed.
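A minimal standalone illustration of both points:
var str = "12345"
let removed = str.remove(at: str.index(before: str.endIndex))
print(removed) // "5" – the removed Character, not the shortened string
print(str)     // "1234" – str itself was mutated in place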
You should rewrite the method like this:
telephone.text!.remove(at: telephone.text!.index(before: telephone.text!.endIndex))
You can use:
let idx = str.index(before: str.endIndex) // compute the index
let s = str.substring(to: idx) // get the substring
I have a problem I couldn't find a solution to.
I have a string variable holding the unicode "1f44d" and I want to convert it to a unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways but somehow the workings behind this magic stay hidden for me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
convert charAsString to an Int code
convert the code to a Unicode character
The second step can be done e.g. like this:
var code: UInt32 = 0x1f44d
var scalar = UnicodeScalar(code)! // failable in Swift 3+
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to Int.
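For the first step, Swift's failable Int(_:radix:) initializer does the job (a sketch; it returns nil for invalid input, hence the force unwrap here):
// Hex string → integer
let code = UInt32("1f44d", radix: 16)! // 128077 (0x1f44d)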
As of Swift 2.0, every Int type has an initializer able to take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed the UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, the radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional unwrapping
print("\(uScalar)")
You can use
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
Here are a couple of ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
//print:
You cannot use string interpolation inside a Unicode escape sequence in Swift, which is what you are trying to do. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using init(_:radix:) initializer) then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍
I have this code:
var i : AnyObject!
i = 10
println(i as String)
println(i.stringValue)
It crashes on the as String line but runs fine on the second one, i.stringValue.
What is the difference between as String and stringValue in the above lines?
.stringValue is a way to extract an Integer into a string value, but as String will not work for that. If you use as String, Xcode will force you to add ! to the as, which is not good: you can't cast an Int to a String, so the cast will never succeed. That's why when you do as! String it crashes the app.
So casting is not a good idea here.
And here are some more ways to extract an Integer into a string value:
let i : Int = 5 // 5
let firstWay = i.description // "5"
let anotherWay = "\(i)" // "5"
let thirdWay = String(i) // "5"
Here you cannot use let forthway = i.stringValue, because Int doesn't have a member named stringValue.
But you can do the same thing with AnyObject, as shown below:
let i : AnyObject = 5 // 5
let firstWay = i.description // "5"
let anotherWay = "\(i)" // "5"
let thirdWay = String(i) // "5"
let forthway = i.stringValue // "5" // now this will work.
Both are casting an Int to String, but this will not work anymore.
In Swift 2 it's not possible to do it like that.
You should use:
let i = 5
print(String(format: "%i", i))
This will specifically write the Int value as a String.
With as String, you do not convert the value; you assert that the variable contains a String. In your case it is an Int, so it crashes.
The other way, i.stringValue, converts your value into a String, so it doesn't give you any crash and successfully produces a String value.
Note: as you are using AnyObject, the variable has a stringValue member... but Int doesn't. To convert an Int value, check out @Dharmesh Kheni's answer.
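For the record, the reason stringValue works here at all is Objective-C bridging: behind AnyObject the 10 is an NSNumber, and NSNumber has a stringValue property. A sketch (assuming Foundation on an Apple platform):
import Foundation
let n: AnyObject = 10 as NSNumber
// n as! String would crash: an NSNumber is not a String
let s: String = n.stringValue // "10" – NSNumber's stringValue, reached via AnyObject dynamic lookup
print(s)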
I am trying to get the path to my application at runtime. I found some old C sources and converted them according to the function's parameter type definition:
var path = [Int8](count: 1024, repeatedValue: 0)
var bufsize: UInt32 = 1024
if _NSGetExecutablePath(&path, &bufsize) == 0 {
    println("executable path is \(path)")
}
It runs, but the function needs an Int8 array, not a string, so I have to search for the end of the character chain and convert it back to a string. What is the correct way to use this function in Swift?
You need to create a Swift String from a C string:
let executablePath = String(CString: path, encoding: NSASCIIStringEncoding)!
println("executable path is \(executablePath)")
But there is an easier way to get the path to the executable:
let executablePath = Bundle.main.executablePath!
In Swift 4
let executablePath = Bundle.main.executablePath!
print(executablePath)