I'm trying to create a Swift iOS program that converts a number into dec, bin, and hex. I've come across the strtoul function but don't quite understand how to use it. Would someone be able to explain it? Thanks!
The function strtoul is pretty simple to use. You will also need String(_:radix:) to convert in the other direction. You can create extensions to convert from hexaToDecimal or from binaryToDecimal as follows:
Usage of String(_:radix:)
extension Int {
    var toBinary: String {
        return String(self, radix: 2)
    }
    var toHexa: String {
        return String(self, radix: 16)
    }
}
Usage of strtoul()
extension String {
    var hexaToDecimal: Int {
        return Int(strtoul(self, nil, 16))
    }
    var hexaToBinary: String {
        return hexaToDecimal.toBinary
    }
    var binaryToDecimal: Int {
        return Int(strtoul(self, nil, 2))
    }
    var binaryToHexa: String {
        return binaryToDecimal.toHexa
    }
}
Testing
let myBinFromInt = 255.toBinary // "11111111"
let myhexaFromInt = 255.toHexa // "ff"
let myIntFromHexa = "ff".hexaToDecimal // 255
let myBinFromHexa = "ff".hexaToBinary // "11111111"
let myIntFromBin = "11111111".binaryToDecimal // 255
let myHexaFromBin = "11111111".binaryToHexa // "ff"
The strtoul() function converts the string in str to an unsigned long
value. The conversion is done according to the given base, which must be between 2 and 36 inclusive, or be the special value 0.
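For illustration, here is a minimal sketch of the base argument (my own example values), including the special base 0 mentioned above, which detects the base from a "0x" or leading-zero prefix:
import Foundation

let fromHex = strtoul("ff", nil, 16)       // 255
let fromBin = strtoul("11111111", nil, 2)  // 255
let detected = strtoul("0xff", nil, 0)     // 255 – base 0 auto-detects the "0x" prefix
print(fromHex, fromBin, detected)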
Really, it sounds like you want to use NSString.
From what you describe, you want to convert an unsigned integer to decimal, hex and binary.
For example, if you had an integer n:
var st = NSString(format:"%2X", n)
would convert the integer to hexadecimal and store it in the variable st.
//NSString(format:"%2X", 10) would give you 'A' as 10 is A in hex
//NSString(format:"%2X", 17) would give you 11 as 17 is 11 in hex
Decimal (unsigned integer):
var st = NSString(format:"%u", n)
Note that printf-style formats have no binary specifier; for binary use String(n, radix: 2) as shown in the other answer.
Decimal with 2 decimal places (n would need to be a Float or Double here):
var st = NSString(format:"%.02f", n)
Related
Is there a better way to convert from unichar to Character?
I tried:
var unichar = ...
let str = NSString(characters: &unichar, length: 1) as String
let character = Array(str.characters)[0]
You can convert unichar -> UnicodeScalar -> Character:
let c = unichar(8364)
if let uc = UnicodeScalar(c) {
    let char = Character(uc)
    print(char) // €
} else {
    print("illegal input")
}
Input values in the range 0xD800...0xDFFF
(high and low surrogates) are not allowed because they do not
correspond to valid Unicode scalar values.
If it is guaranteed that those input values do not occur then you
can simplify the conversion to
let char = Character(UnicodeScalar(c)!)
To replace a possible invalid input value by a default character
(e.g. a question mark), use
let char = Character(UnicodeScalar(c) ?? "?")
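To see the surrogate case in action, here is a small sketch of my own (0xD800 is the first high-surrogate code unit):
import Foundation

let surrogate: unichar = 0xD800
if let scalar = UnicodeScalar(surrogate) {
    print(Character(scalar))
} else {
    print("0xD800 is not a valid Unicode scalar") // this branch is taken
}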
public class func stringForIcon(_ icon: NSInteger) -> Character {
    return Character(UnicodeScalar(icon)!)
}
I tried to convert Decimal to Int with the following code:
Int(pow(Decimal(size), 2) - 1)
But I get:
.swift:254:43: Cannot invoke initializer for type 'Int' with an argument list of type '(Decimal)'
Here I know pow is returning a Decimal, but it seems that Int has no initializer or member function to convert a Decimal to an Int.
How can I convert Decimal to Int in Swift 3?
This is my updated answer (thanks to Martin R and the OP for the remarks). The OP's problem was just casting the result of the pow(x: Decimal, y: Int) -> Decimal function to an Int after subtracting 1 from it. I answered the question with the help of this SO post for NSDecimal and Apple's documentation on Decimal. You have to convert your result to an NSDecimalNumber, which can in turn be converted to an Int:
let size = Decimal(2)
let test = pow(size, 2) - 1
let result = NSDecimalNumber(decimal: test)
print(Int(result)) // testing the cast to Int
let decimalToInt = (yourDecimal as NSDecimalNumber).intValue
or as @MartinR suggested:
let decimalToInt = NSDecimalNumber(decimal: yourDecimal).intValue
If you have a very long decimal, then beware of rounding errors:
let decimal = Decimal(floatLiteral: 100.123456)
let intValue = (decimal as NSDecimalNumber).intValue // This is 100
However
let veryLargeDecimal = Decimal(floatLiteral: 100.123456789123)
let intValue = (veryLargeDecimal as NSDecimalNumber).intValue // This is -84 !
I ensured I rounded my Decimal before I converted it to an Int, using NSDecimalRound (which you can put in an extension of Decimal).
var veryLargeDecimal = Decimal(floatLiteral: 100.123456789123)
var rounded = Decimal()
NSDecimalRound(&rounded, &veryLargeDecimal, 0, .down)
let intValue = (rounded as NSDecimalNumber).intValue // This is now 100
There is nothing wrong with either of the posted answers, but I would like to offer up an extension that reduces the verbosity for scenarios where you need to use this frequently.
extension Decimal {
    var int: Int {
        return NSDecimalNumber(decimal: self).intValue
    }
}
To call it:
let powerDecimal = pow(2, 2) // Output is Decimal
let powerInt = powerDecimal.int // Output is now an Int
Unfortunately there is an intermittent failure using some of the methods provided.
NSDecimalNumber(decimal: <num>).intValue can produce unexpected results...
(lldb) po NSDecimalNumber(decimal: self)
10.6666666666666666666666666666666666666
(lldb) po NSDecimalNumber(decimal: self).intValue
0
I think there is more of a discussion on it here, and @Martin pointed it out here.
Instead of using the decimal value directly, I made a work around that converts the decimal to a whole number before converting the Decimal to an Int.
extension Decimal {
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .down, scale: Int = 0) -> Self {
        var result = Self()
        var number = self
        NSDecimalRound(&result, &number, scale, roundingMode)
        return result
    }
    var whole: Self { rounded(self < 0 ? .up : .down) }
    var fraction: Self { self - whole }
    var int: Int {
        NSDecimalNumber(decimal: whole).intValue
    }
}
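A quick usage sketch, assuming the extension above is in scope (the value mirrors the 10.666… case from the debugger output):
import Foundation

let d = Decimal(string: "10.666666666666666666666666666666666666")!
print(d.whole)    // 10
print(d.fraction) // 0.666666666666666666666666666666666666
print(d.int)      // 10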
Or just use the description of Decimal and Int's String initializer to bridge it, instead of going through NSDecimalNumber:
extension Decimal {
    var intVal: Int? {
        return Int(self.description)
    }
}
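Note that this only works when the description is a whole number; anything with a fractional part makes Int.init fail and intVal return nil (a small sketch, assuming the extension above):
import Foundation

print(Decimal(10).intVal as Any)   // Optional(10)
print(Decimal(10.5).intVal as Any) // nil – "10.5" is not a valid Int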
I have a problem I couldn't find a solution to.
I have a string variable holding the unicode "1f44d" and I want to convert it to a unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways but somehow the workings behind this magic stay hidden for me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
convert charAsString to Int code
convert code to unicode character
The second step can be done, e.g., like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to an Int.
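For the first step, in current Swift the radix initializer does it; a small sketch of my own combining both steps:
let charAsString = "1f44d"
if let code = Int(charAsString, radix: 16),      // step 1: hex string -> Int
   let scalar = UnicodeScalar(code) {            // step 2: Int -> Unicode scalar
    print(String(scalar))                        // 👍
} else {
    print("invalid input")
}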
As of Swift 2.0, every Int type has an initializer that takes a String as input. You can then easily generate the corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding
print("\(uScalar)")
You can use:
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
Here are a couple of ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
//print:
You cannot use string interpolation in Swift the way you are trying to. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using the init(_:radix:) initializer), then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍
Basically, I have a Float, for example: 3.511054256.
How can I extract n digits after the decimal point?
I.e., I'd like to retrieve 0.51, 0.511, 0.5110, and so on.
I know I can easily achieve something like this:
var temp: Float = 3.511054256
var aStr = String(format: "%f", temp)
var arr: [AnyObject] = aStr.componentsSeparatedByString(".")
var tempInt: Int = Int(arr.last as! String)!
However, this gives me 511054. I'd like the option of retrieving any number of digits past the decimal point easily.
For a task I'm doing, I only need to retrieve the first 2 digits after the decimal point, but less restriction would be ideal.
Thanks.
You can specify the number of decimal digits, say N, in your format specifier as %.Nf, e.g., for 5 decimal digits, %.5f.
let temp: Float = 3.511054256
let aStr = String(format: "%.5f", temp).componentsSeparatedByString(".").last ?? "Unexpected"
print(aStr) // 51105
Alternatively, for a more dynamic usage, make use of an NSNumberFormatter:
/* Use NSNumberFormatter to extract specific number
of decimal digits from your float */
func getFractionDigitsFrom(num: Float, inout withFormatter f: NSNumberFormatter,
                           forNumDigits numDigits: Int) -> String {
    f.maximumFractionDigits = numDigits
    f.minimumFractionDigits = numDigits
    let localeDecSep = f.decimalSeparator
    return f.stringFromNumber(num)?.componentsSeparatedByString(localeDecSep).last ?? "Unexpected"
}
/* Example usage */
var temp: Float = 3.511054256
var formatter = NSNumberFormatter()
let aStr = getFractionDigitsFrom(temp, withFormatter: &formatter, forNumDigits: 5)
print(aStr) // 51105
Note that both solutions above will perform rounding; e.g., if var temp: Float = 3.519, then asking for 2 decimal digits will produce "52". If you really intend to treat your float temp purely as a String (with no rounding whatsoever), you could solve this using just String methods, e.g.
/* Just treat input as a string with known format rather than a number */
func getFractionDigitsFrom(num: Float, forNumDigits numDigits: Int) -> String {
    guard let foo = String(num).componentsSeparatedByString(".").last
        where foo.characters.count >= numDigits else {
            return "Invalid input" // or return nil, for '-> String?' return
    }
    return foo.substringWithRange(foo.startIndex..<foo.startIndex.advancedBy(numDigits))
}
/* Example usage */
let temp: Float = 3.5199
let aStr = getFractionDigitsFrom(temp, forNumDigits: 2) // 51
How do I convert a String to a Long in Swift?
In Java I would do Long.parseLong("str", Character.MAX_RADIX).
We now have these conversion functions built into the Swift Standard Library:
Encode using base 2 through 36: https://developer.apple.com/documentation/swift/string/2997127-init
Decode using base 2 through 36: https://developer.apple.com/documentation/swift/int/2924481-init
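A short sketch of those two initializers, mirroring the Long.parseLong call from the question (Character.MAX_RADIX is 36):
let encoded = String(60623, radix: 36)  // "1arz"
let decoded = Int("1ARZ", radix: 36)    // Optional(60623) – parsing accepts upper- and lowercase
print(encoded, decoded ?? "invalid")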
As noted here, you can use the C library function strtoul():
let base36 = "1ARZ"
let number = strtoul(base36, nil, 36)
println(number) // Output: 60623
The third parameter is the radix. See the man page for how the function handles whitespace and other details.
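For example, leading whitespace is skipped and parsing stops at the first character that is not a valid digit in the given base (my own example):
import Foundation

let value = strtoul("  1arz!", nil, 36)
print(value) // 60623 – whitespace skipped, parsing stops at "!"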
Here is parseLong() in Swift. Note that the function returns an Int? (optional Int) that must be unwrapped to be used.
// Function to convert a String to an Int?. It returns nil
// if the string contains characters that are not valid digits
// in the base or if the number is too big to fit in an Int.
func parseLong(string: String, base: Int) -> Int? {
    let digits = Array("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")
    var number = 0
    for char in string.uppercaseString {
        if let digit = find(digits, char) {
            if digit < base {
                // Compute the value of the number so far
                // allowing for overflow
                let newnumber = number &* base &+ digit
                // Check for overflow and return nil if
                // it did overflow
                if newnumber < number {
                    return nil
                }
                number = newnumber
            } else {
                // Invalid digit for the base
                return nil
            }
        } else {
            // Invalid character not in digits
            return nil
        }
    }
    return number
}
if let result = parseLong("1110", 2) {
println("1110 in base 2 is \(result)") // "1110 in base 2 is 14"
}
if let result = parseLong("ZZ", 36) {
println("ZZ in base 36 is \(result)") // "ZZ in base 36 is 1295"
}
Swift way:
"123".toInt() // Returns an optional
C way:
atol("1232")
Or use NSString's integerValue method:
("123" as NSString).integerValue