I managed to convert an NSDecimalNumber to a hex String using the BInt library. Here is my code:
public func toDecimalHex(value: NSDecimalNumber) -> String {
    let bint = BInt(value.stringValue)
    return (bint?.asString(radix: 16))!
}
Example:
let number = NSDecimalNumber(string: "1000000000000000000000000")
let hex = toDecimalHex(value: number)
// result: d3c21bcecceda1000000
Basically, how can I do this conversion without using a library like BInt? It brings too much overhead, and I just want to get rid of it.
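One library-free approach (a minimal sketch, not from the original question): do schoolbook long division of the decimal digit string by 16, collecting the remainders as hex digits. This handles arbitrarily large non-negative integers:
func decimalStringToHex(_ decimal: String) -> String? {
    // Map each character to its base-10 digit; reject non-digit input.
    var digits = decimal.compactMap { $0.wholeNumberValue }
    guard digits.count == decimal.count, !digits.isEmpty,
          digits.allSatisfy({ (0...9).contains($0) }) else { return nil }
    let hexDigits = Array("0123456789abcdef")
    var hex = ""
    while digits.contains(where: { $0 != 0 }) {
        // One pass of long division by 16 over the decimal digits.
        var remainder = 0
        var quotient: [Int] = []
        for d in digits {
            let value = remainder * 10 + d
            quotient.append(value / 16)
            remainder = value % 16
        }
        hex = String(hexDigits[remainder]) + hex
        digits = Array(quotient.drop(while: { $0 == 0 })) // drop leading zeros
    }
    return hex.isEmpty ? "0" : hex
}
decimalStringToHex("1000000000000000000000000") // "d3c21bcecceda1000000"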
I am using the location to access the current temperature in my app. I got the weather temperature as a string, but I need only the double value from it.
let weatherTemp = "41.7°C"
The required result is:
weatherTemp = 41.7
How can I do that? I used:
extension Double {
    static func parse(from string: String) -> Double? {
        return Double(string.components(separatedBy: CharacterSet.decimalDigits.inverted).joined())
    }
}
This returns all the digits but drops the decimal point. I need the decimal point too.
Split the string on the "°" and convert the first part to Double:
if let tempString = weatherTemp.split(separator: "°").first, let temp = Double(tempString) {
    print(temp) // 41.7
}
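Alternatively (a sketch in the spirit of the original extension, assuming "." as the decimal separator): keep the decimal point and minus sign when filtering, so "41.7°C" yields 41.7 rather than 417:
extension Double {
    static func parse(from string: String) -> Double? {
        // Keep digits plus "." and "-", then parse the result.
        return Double(string.filter { "0123456789.-".contains($0) })
    }
}
Double.parse(from: "41.7°C") // 41.7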
I am getting some coordinates from the server in a string array, and I am trying to save those coordinates in an SQLite database by splitting them and converting them to double values. But some coordinates are getting saved in scientific notation. For example, I am getting the following coordinate from the server:
"-0.0000558,51.3368066"
I am splitting the string and converting the parts to doubles, resulting in the following values:
[-5.58e-05,51.3368066]
I have tried the following solutions, but they all return the same result:
1.
Double(latLongArr[0])
2.
extension String {
    var doubleValue: Double? {
        return NumberFormatter().number(from: self)?.doubleValue
    }
}
3.
extension String {
    var doubleValue: Double? {
        let numberFormatter = NumberFormatter()
        numberFormatter.allowsFloats = true
        numberFormatter.maximumFractionDigits = 10
        numberFormatter.numberStyle = .decimal
        return numberFormatter.number(from: "\(self)")!.doubleValue
    }
}
I have used the code above, but it still returns the value in scientific format, and I need it in plain decimal format. What is the issue?
The last option is the one I would go for, and I believe it works correctly.
I believe your issue appears only when you print to the console: the double is actually converted properly, but when it is formatted for console output it is displayed as a scientific-notation string.
Your other option besides using doubleValue is to use decimalValue.
I suggest setting a breakpoint and checking the actual value of your double rather than reviewing the console output, which is a formatted string.
For reference, the code used to verify this:
let number = "-0.0000558"
let numberFormatter = NumberFormatter()
numberFormatter.numberStyle = .decimal
numberFormatter.maximumFractionDigits = 10
let finalNumber = numberFormatter.number(from: number)
let decimalNumber = finalNumber!.decimalValue
let doubleNumber = finalNumber!.doubleValue
print(decimalNumber) // -0.0000558
print(doubleNumber)  // -5.58e-05
If you want to print your Doubles without scientific notation, use
String(format: "%.7f", value)
Example:
let value = Double(3.141592)
print(String(format: "%.7f", value))
will print 3.1415920.
I have used the extension below to represent scientific values in decimal format.
extension String {
    func getDecimalValue() -> Decimal {
        // Note: the force unwrap will crash if the string is not a valid number.
        return NSNumber(value: Double(self)!).decimalValue
    }
}
Usage:
let numberString = "+5.58e-05"
print(numberString.getDecimalValue()) // 0.0000558
I have a problem I couldn't find a solution to.
I have a string variable holding the Unicode code point "1f44d", and I want to convert it to the Unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways but somehow the workings behind this magic stay hidden for me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode: UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
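For instance (a quick illustration, not from the original answer), the failable initializer rejects surrogate code points:
UnicodeScalar(0xD800 as UInt32) // nil – UTF-16 surrogates are not valid scalars
UnicodeScalar(0x1f44d as UInt32)! // "👍"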
This can be done in two steps:
1. convert charAsString to an Int code
2. convert code to a Unicode character
The second step can be done e.g. like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to an Int.
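(For completeness, a one-line sketch of that first step using Int's radix initializer:)
let code = Int("1f44d", radix: 16) // Optional(128077)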
As of Swift 2.0, every Int type has an initializer that can take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed the UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, the radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding
print("\(uScalar)")
You can use:
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
(Note this goes in the opposite direction: it gives the Unicode scalar value of each character in the string.)
Here are a couple of ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true) // "👍"
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true) // "👍"
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
// prints the character for U+E9C8 (a Private Use Area code point, typically an icon-font glyph)
You cannot use string interpolation inside a Unicode escape sequence in Swift. Therefore, the following code won't even compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using the init(_:radix:) initializer), then create a Unicode scalar from that integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍
I'm trying to create a Swift iOS program that converts a number into its decimal, binary, and hex representations. I've come across the strtoul function but don't quite understand how to use it. Would someone be able to explain it? Thanks!
The strtoul function is pretty simple to use. You will also need String(_:radix:) to convert in the other direction. You can create extensions to convert from hex to decimal or from binary to decimal as follows:
Usage of String(_:radix:):
extension Int {
    var toBinary: String {
        return String(self, radix: 2)
    }
    var toHexa: String {
        return String(self, radix: 16)
    }
}
Usage of strtoul():
extension String {
    var hexaToDecimal: Int {
        return Int(strtoul(self, nil, 16))
    }
    var hexaToBinary: String {
        return hexaToDecimal.toBinary
    }
    var binaryToDecimal: Int {
        return Int(strtoul(self, nil, 2))
    }
    var binaryToHexa: String {
        return binaryToDecimal.toHexa
    }
}
Testing
let myBinFromInt = 255.toBinary // "11111111"
let myhexaFromInt = 255.toHexa // "ff"
let myIntFromHexa = "ff".hexaToDecimal // 255
let myBinFromHexa = "ff".hexaToBinary // "11111111"
let myIntFromBin = "11111111".binaryToDecimal // 255
let myHexaFromBin = "11111111".binaryToHexa // "ff"
The strtoul() function converts the string in str to an unsigned long
value. The conversion is done according to the given base, which must be between 2 and 36 inclusive, or be the special value 0.
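For example (a quick sketch, not from the original answer), the special base 0 infers the base from the string's prefix:
strtoul("0xff", nil, 0) // 255 – a "0x" prefix means hexadecimal
strtoul("0755", nil, 0) // 493 – a leading "0" means octal
strtoul("42", nil, 0)   // 42 – otherwise decimal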
Really, it sounds like you want to use NSString formatting.
From what it sounds like, you want to convert an unsigned integer to decimal, hex, and binary.
For example, if you had an integer n:
var st = NSString(format: "%2X", n)
would convert the integer to hexadecimal and store it in the variable st.
// NSString(format: "%2X", 10) would give you "A", as 10 is A in hex
// NSString(format: "%2X", 17) would give you "11", as 17 is 11 in hex
Binary: printf-style format strings have no binary conversion specifier (%u prints an unsigned decimal integer, not binary), so use the radix initializer instead:
var st = String(n, radix: 2)
Decimal (2 decimal places; note that n must be a floating-point value here):
var st = NSString(format: "%.02f", n)
How do I convert a string to a double in Swift? I've tried string.doubleValue and string.bridgeToObjectiveC().doubleValue, and neither works. Any ideas?
You can create a read-only computed property in a String extension to help you convert your strings to double. You can use NSNumberFormatter:
extension String {
    struct Number {
        static let formatter = NSNumberFormatter()
    }
    var doubleValue: Double {
        return Number.formatter.numberFromString(self)?.doubleValue ?? 0
    }
}
or you can cast it to NSString and extract its doubleValue property:
extension String {
var ns: NSString {
return self
}
var doubleValue: Double {
return ns.doubleValue
}
}
"2.35".doubleValue + 3.3 // 5.65
According to the Stanford CS193p course (Winter 2015), the correct way to get a double from a String in Swift is to use NSNumberFormatter instead of NSString:
let decimalAsString = "123.45"
let decimalAsDouble = NSNumberFormatter().numberFromString(decimalAsString)!.doubleValue
If you want to be safe (regarding the optional unwrapping and the decimal separator), you'd use:
let decimalAsString = "123.45"
var decimalAsDouble = 0.0
var formatter = NSNumberFormatter()
formatter.locale = NSLocale(localeIdentifier: "en_US")
if let decimalAsDoubleUnwrapped = formatter.numberFromString(decimalAsString) {
    decimalAsDouble = decimalAsDoubleUnwrapped.doubleValue
}
Safe unwrapping is particularly useful if you parse an XML or JSON file and you need to make sure you have a String that can be converted into a Double (the program will crash if you force-unwrap an optional that is actually nil).
/!\ EDIT: be careful, NSNumberFormatter works differently from NSString.
NSString allowed you to do things like (dictionary[key] as NSString).doubleValue, provided that dictionary[key] used '.' as the decimal separator (as in "123.45").
But NSNumberFormatter(), by default, initializes an instance of NSNumberFormatter with your current system locale!
Therefore, NSNumberFormatter().numberFromString(decimalAsString)!.doubleValue would work with "123.45" on a US device, but not on a French one, for example!
If you are parsing a JSON file, for example, and you know that values are stored using '.' as the decimal separator, you need to set the locale of your NSNumberFormatter instance accordingly:
var formatter = NSNumberFormatter()
formatter.locale = NSLocale(localeIdentifier: "en_US")
then:
let decimalAsString = "123.45"
if let decimalAsDoubleUnwrapped = formatter.numberFromString(decimalAsString) {
    decimalAsDouble = decimalAsDoubleUnwrapped.doubleValue
}
In that case, decimalAsDouble will correctly return 123.45 as a doubleValue.
But numberFromString would return nil if decimalAsString = "123,45", or if the NSLocale was set to "fr_FR".
On the other hand, an NSNumberFormatter using an NSLocale with fr_FR would work perfectly with "123,45", but return nil with "123.45".
I thought that was worth reminding, so I updated my answer accordingly.
EDIT: also, NSNumberFormatter wouldn't know what to do with things like "+2.45%" or "0.1146" (you would have to define the properties of your NSNumberFormatter instance very precisely), whereas NSString handles those natively.
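For instance (a sketch of the kind of configuration required, not from the original answer), a percent-style formatter can parse "%" strings, returning the fractional value:
let percentFormatter = NSNumberFormatter()
percentFormatter.numberStyle = .PercentStyle
percentFormatter.locale = NSLocale(localeIdentifier: "en_US")
percentFormatter.numberFromString("2.45%")?.doubleValue // 0.0245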
you can always just cast from String to NSString like this
let str = "5"
let dbl = (str as NSString).doubleValue
Try this:
let a: Double = NSString(string: "232.3").doubleValue
Try this:
let str = "5"
let double = Double(str.toInt()!) // note: toInt() only works for integer strings; "5.5".toInt() is nil, and force-unwrapping it would crash
another way is:
let mySwiftString = "5"
var string = NSString(string: mySwiftString)
string.doubleValue
this latter one was posted here:
Swift - How to convert String to Double
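For reference (a quick sketch; available in Swift 2.0 and later), Double also has a direct failable initializer, which avoids the force-unwrap crashes above:
let dbl = Double("5.5") // Optional(5.5)
let bad = Double("abc") // nil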