How to convert an Int to Hex String in Swift - ios

In Obj-C I used to convert an unsigned integer n to a hex string with
NSString *st = [NSString stringWithFormat:@"%2X", n];
I tried for a long time to translate this into Swift, but without success.

You can now do:
let n = 14
var st = String(format:"%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
0E is the hexadecimal representation of 14
Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0's if necessary. (Without the 0, the result would be padded with leading spaces). Of course, if the result is larger than two characters, the field length will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.
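For example, here is how those width and padding flags behave (a small sketch of my own):
import Foundation

String(format: "%X", 14)     // "E"   – no minimum width
String(format: "%2X", 14)    // " E"  – width 2, padded with a leading space
String(format: "%02X", 14)   // "0E"  – width 2, padded with a leading zero
String(format: "%02X", 4094) // "FFE" – wider than 2, so nothing is clipped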
Use uppercase X if you want A...F and lowercase x if you want a...f:
String(format: "%x %X", 64206, 64206) // "face FACE"
If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:
let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615
Original Answer
You can still use NSString to do this. The format is:
var st = NSString(format:"%2X", n)
This makes st an NSString, so things like += do not work. If you want to be able to append to the string with +=, make st into a String like this:
var st = NSString(format:"%2X", n) as String
or
var st = String(NSString(format:"%2X", n))
or
var st: String = NSString(format:"%2X", n)
Then you can do:
let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"

In Swift there is a specific init method on String for exactly this:
let hex = String(0xF, radix: 16, uppercase: false)
println("hex=\(hex)") // Output: f

With Swift 5, according to your needs, you may choose one of the following three methods to solve your problem.
#1. Using String's init(_:radix:uppercase:) initializer
Swift String has an init(_:radix:uppercase:) initializer with the following declaration:
init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger
Creates a string representing the given value in base 10, or some other specified base.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:
let string1 = String(2, radix: 16)
print(string1) // prints: "2"
let string2 = String(211, radix: 16)
print(string2) // prints: "d3"
let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
#2. Using String's init(format:_:) initializer
Foundation provides String with an init(format:_:) initializer, which has the following declaration:
init(format: String, _ arguments: CVarArg...)
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.
Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:
Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):
import Foundation
let string1 = String(format:"%X", 2)
print(string1) // prints: "2"
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"
#3. Using String's init(format:arguments:) initializer
Foundation provides String with an init(format:arguments:) initializer, which has the following declaration:
init(format: String, arguments: [CVarArg])
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user’s default locale.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):
import Foundation
let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"
let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"
let string3 = String(format:"%02X", arguments: [211])
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"

Swift 5.2.4
let value = 200
let hexString = String(format: "%02X", value)

The answers above work fine for values in the range of a 32-bit Int, but anything larger won't work because the value will roll over.
You need to use a length modifier for values greater than a 32-bit Int:
%x = Unsigned 32-bit integer (unsigned int)
ll = Length modifiers specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.
let hexString = String(format:"%llX", decimalValue)
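For example (a quick sketch of my own showing the roll-over):
import Foundation

let big: UInt64 = 0x100000001   // 4294967297, just above UInt32.max
String(format: "%X", big)       // the high 32 bits are dropped, so the value rolls over
String(format: "%llX", big)     // "100000001" – all 64 bits are kept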

To use
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
in Swift 3, importing Foundation is not required, at least not in a project. String has all the needed functionality of NSString.

Related

Adding float numbers to Decimal numbers

I have this code:
var weightSum: Float = 3.14159
let weightPerPortionGrams: Decimal = 0.999999
weightSum = weightSum + (weightPerPortionGrams)
The numbers are examples.
I get an error:
Binary operator '+' cannot be applied to operands of type 'Float' and 'Decimal'.
Does anyone know how to fix it?
To convert a Decimal to a float, you can either do this:
weightSum += Float(truncating: weightPerPortionGrams as NSNumber)
or this:
weightSum += (weightPerPortionGrams as NSNumber).floatValue
Swift needs both variables to be of the same type to make use of the "+" operator, so you would need to either convert your Decimal to type Float or the other way around before summing them:
weightPerPortionGramsFloat = (weightPerPortionGrams as NSNumber).floatValue
or
weightSumDecimal = (weightSum as NSNumber).decimalValue
Greetings!
func + (left: Float, right: Decimal) -> Float {
    return left + Float(right.description)!
}
var weightSum: Float = 3.14159
let weightPerPortionGrams: Decimal = 0.999999
weightSum = weightSum + weightPerPortionGrams
print(weightSum)
// prints 4.141589
Hope it helps!
This is because the variables weightSum and weightPerPortionGrams have different types. Luckily you can convert these variables in Swift.
So, to make this work, you should convert the type of weightPerPortionGrams to the one of weightSum, so both variables become the same type:
weightSum = weightSum + NSDecimalNumber(decimal: weightPerPortionGrams).floatValue
Note that the conversion between Decimal and floating-point types may not be exact.
You can also do an extra cast, to make sure that we are using the methods of NSDecimalNumber and not of NSNumber:
let double = NSDecimalNumber(decimal: weightPerPortionGrams).doubleValue
weightSum = weightSum + Float(double)
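If that precision matters, another option (my own sketch, not from the answers above) is to do the addition in Decimal and only convert back to Float at the end:
import Foundation

let weightSum: Float = 3.14159
let weightPerPortionGrams: Decimal = 0.999999

// Convert the Float to Decimal (via Double), add in Decimal, then convert back if a Float is needed.
let total = Decimal(Double(weightSum)) + weightPerPortionGrams
let totalAsFloat = NSDecimalNumber(decimal: total).floatValue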

how to take binary string input from user in swift

I want to take input from the user in binary, something like:
10101
11110
Then I need to perform a bitwise OR on these. I know how to take input and how to perform a bitwise OR; what I want to know is how to convert the strings, because what I am currently using does not give the right result. Here is what I tried:
let aBits: Int16 = Int16(a)! //a is String "10101"
let bBits: Int16 = Int16(b)! //b is String "11110"
let combinedbits = aBits | bBits
Edit: I don't need a decimal-to-binary conversion with radix, as my strings already contain only 0s and 1s.
A string can have up to 500 characters, for example:
1001101111101011011100101100100110111011111011000100111100111110111101011011011100111001100011111010
This is beyond the Int limit. How do I handle that in Swift?
Edit 2: As per vacawama's answer, the code below works great:
let maxAB = max(a.count, b.count)
let paddedA = String(repeating: "0", count: maxAB - a.count) + a
let paddedB = String(repeating: "0", count: maxAB - b.count) + b
let Str = String(zip(paddedA, paddedB).map({ $0 == ("0", "0") ? "0" : "1" }))
I can have an array of up to 500 strings, and each string can have up to 500 characters. Then I have to take all possible pairs, perform a bitwise OR on each, and find the maximum number of 1's. Any idea how to make the above solution more efficient? Thank you.
Since you need arbitrarily long binary numbers, do everything with strings.
This function first pads the two inputs to the same length, and then uses zip to pair the digits and map to compute the OR for each pair of characters. The resulting array of characters is converted back into a String with String().
func binaryOR(_ a: String, _ b: String) -> String {
    let maxAB = max(a.count, b.count)
    let paddedA = String(repeating: "0", count: maxAB - a.count) + a
    let paddedB = String(repeating: "0", count: maxAB - b.count) + b
    return String(zip(paddedA, paddedB).map({ $0 == ("0", "0") ? "0" : "1" }))
}
print(binaryOR("11", "1100")) // "1111"
print(binaryOR("1000", "0001")) // "1001"
I can have array of upto 500 string and each string can have upto 500
characters. Then I have to get all possible pair and perform bitwise
OR and count maximum number of 1's. Any idea to make above solution
more efficient?
You will have to do 500 * 499 / 2 (which is 124,750 comparisons). It is important to avoid unnecessary and/or repeated work.
I would recommend:
Do an initial pass to loop through your strings to find the length of the longest one, then pad all of your strings to this length. I would keep track of the original length of each string in a tiny struct:
struct BinaryNumber {
    var string: String  // padded string
    var length: Int     // original length before padding
}
Modify the binaryOR function to take BinaryNumbers and return Int, the count of "1"s in the OR.
func binaryORcountOnes(_ a: BinaryNumber, _ b: BinaryNumber) -> Int {
    let maxAB = max(a.length, b.length)
    return zip(a.string.suffix(maxAB), b.string.suffix(maxAB)).reduce(0) { total, pair in
        return total + (pair == ("0", "0") ? 0 : 1)
    }
}
Note: The use of suffix helps the efficiency by only checking the digits that matter. If the original strings had length 2 and 3, then only the last 3 digits will be OR-ed even if they're padded to length 500.
Loop and compare all pairs of BinaryNumbers to find largest count of ones:
var numbers: [BinaryNumber] // This array was created in step 1
var maxOnes = 0
for i in 0 ..< (numbers.count - 1) {
    for j in (i + 1) ..< numbers.count {
        let ones = binaryORcountOnes(numbers[i], numbers[j])
        if ones > maxOnes {
            maxOnes = ones
        }
    }
}
print("maxOnes = \(maxOnes)")
Additional idea for speedup
OR can't create more ones than were in the original two numbers, and the number of ones can't exceed the maximum length of either of the original two numbers. So, if you count the ones in each number when you are padding them and store that in your struct in a var ones: Int property, you can use that to see if you should even bother calling binaryORcountOnes:
var maxOnes = 0
for i in 0 ..< (numbers.count - 1) {
    for j in (i + 1) ..< numbers.count {
        if maxOnes < min(numbers[i].ones + numbers[j].ones, numbers[i].length, numbers[j].length) {
            let ones = binaryORcountOnes(numbers[i], numbers[j])
            if ones > maxOnes {
                maxOnes = ones
            }
        }
    }
}
By the way, the length of the original string should really just be the minimum length that includes the highest order 1. So if the original string was "00101", then the length should be 3 because that is all you need to store "101".
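One way to compute that trimmed length (a small sketch of mine, not part of the answer) is to find the first "1":
func effectiveLength(_ s: String) -> Int {
    // Length ignoring leading zeros, so "00101" reports 3.
    guard let firstOne = s.firstIndex(of: "1") else { return 0 }  // the string is all zeros
    return s.distance(from: firstOne, to: s.endIndex)
}

effectiveLength("00101")  // 3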
let number = Int(a, radix: 2)
The radix parameter lets you read the string as binary instead of decimal.
You can use radix for converting your string. Once converted, you can do a bitwise OR and then check the nonzeroBitCount to count the number of 1's
let a = Int("10101", radix: 2)!
let b = Int("11110", radix: 2)!
let bitwiseOR = a | b
let nonZero = bitwiseOR.nonzeroBitCount
As I already commented above, "10101" is actually a String, not a binary number, so "10101" | "11110" will not calculate what you actually need.
So what you need to do is convert both values to integers, apply the bitwise OR, and then convert the result back to a binary String (since what you have is the String "11111", not the number 11111).
let a1 = Int("10101", radix: 2)!
let b1 = Int("11110", radix: 2)!
var result = a1 | b1 // 21 | 30
print(result)
Output: 31
Now convert it back to binary string
let binaryString = String(result, radix: 2)
print(binaryString)
Output: 11111
EDIT:
I'm going to show a basic example of how to calculate the bitwise OR by hand, since the question specifically asks not to use the built-in conversion (the string is too large to be converted to an Int).
Algorithm: 1|0 = 1, 1|1 = 1, 0|0 = 0, 0|1 = 1
So what we do is fetch the characters of both Strings one by one, perform the | operation on each pair, and append the result to another String.
var str1 = "100101" // 37
var str2 = "10111"  // 23
/// Result should be "110111" -> 55

// #1. Make both strings the same length by padding the shorter one with leading zeros
let length1 = str1.count
let length2 = str2.count
if length1 != length2 {
    let maxLength = max(length1, length2)
    for _ in 0..<maxLength {
        if str1.count < maxLength {
            str1 = "0" + str1
        }
        if str2.count < maxLength {
            str2 = "0" + str2
        }
    }
}
// #2. Walk both strings index by index and OR the digits:
// a) 1 | 0 = 1
// b) 0 | 1 = 1
// c) 1 | 1 = 1
// d) 0 | 0 = 0
let length = max(str1.count, str2.count)
var newStr = ""
for index in 0..<length {
    let charOf1 = Int(String(str1[str1.index(str1.startIndex, offsetBy: index)]))!
    let charOf2 = Int(String(str2[str2.index(str2.startIndex, offsetBy: index)]))!
    let orResult = charOf1 | charOf2
    newStr.append("\(orResult)")
}
print(newStr)
Output: 110111 // 55
I would refer you to Understanding Bitwise Operators for more detail.
func addBinary(_ a: String, _ b: String) {
    var result = ""
    let arrA = Array(a)
    let arrB = Array(b)
    var lengthA = arrA.count - 1
    var lengthB = arrB.count - 1
    var sum = 0
    while lengthA >= 0 || lengthB >= 0 || sum == 1 {
        sum += (lengthA >= 0) ? Int(String(arrA[lengthA]))! : 0
        sum += (lengthB >= 0) ? Int(String(arrB[lengthB]))! : 0
        result = String((sum % 2)) + result
        sum /= 2
        lengthA -= 1
        lengthB -= 1
    }
    print(result)
}
addBinary("11", "1")

Dividing massive numbers in Swift

I have a UInt128 holding a massive number like 2000009100000000000000 and I want to divide it by 1/10^30
How do I do that?
Possibly by using NSDecimalNumber. For example,
let num1 = NSDecimalNumber(string: "2000009100000000000000")
let num2 = NSDecimalNumber(mantissa: 10, exponent: 30, isNegative: false)
let result = num1.dividing(by: num2)

Inconsistent swift behavior

I am new to Swift.
I have following code
class ViewController: UIViewController {
    let var1: Double = 0.0
    let var2: Int = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let someObject = TestViewController(x: 20, total: 30, taxPact: 40, subtotal: 50)
        var x = 1 + 1.0 /* COMPILER IS FINE WITH ADDING INT AND DOUBLE */
        print("sum is \(var1 + var2)") /* COMPILER COMPLAINS HERE BINARY OPERATOR + CANNOT BE APPLIED */
    }
}
Why do we see such inconsistent behavior?
The error message is unrelated to string interpolation, this
let var1: Double = 0.0
let var2: Int = 0
var x = var1 + var2 // error: binary operator '+' cannot be applied to operands of type 'Double' and 'Int'
does not compile either, and the reason is that there is no + operator which adds an Int to a Double, and Swift does not implicitly convert types. You have to convert explicitly, e.g.
var x = var1 + Double(var2)
print("sum is \(var1 + Double(var2))")
Your other statement
var x = 1 + 1.0
compiles because both Int and Double (and some other types) conform to the ExpressibleByIntegerLiteral protocol (formerly IntegerLiteralConvertible), so the literal 1 can be either an Int literal or a Double literal. Here the compiler chooses 1 to be a Double because that is the only choice for which a suitable + operator exists.
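To illustrate (my own example), the same literal adapts to the type the context needs, while variables do not:
let a = 1            // inferred as Int, the default integer literal type
let b: Double = 1    // the very same literal is used as a Double here
let c = 1 + 1.0      // the literal 1 is taken as a Double so Double's + applies; c == 2.0

let i = 1
let d = 1.0
let sum = Double(i) + d   // variables are not literals, so an explicit conversion is needed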

Swift CodeUnit to String

As stated in Apple Docs:
An arbitrary Unicode scalar, written as \u{n}, where n is a 1–8 digit hexadecimal number with a value equal to a valid Unicode code point
let dollarSign = "\u{24}" // $, Unicode scalar U+0024
My question is: if I have the hexadecimal digit, how can I turn it into a string? So if I have the following:
let dollarSignHex = 24
How can I map it to let dollarSignString = ????
24 is a decimal integer constant. If you want the Unicode code point
with the hexadecimal number 24 then you have to start with
let dollarCode = 0x24
or
let dollarCode = 36
Then you can create a string from that integer value with
let dollarSignString = String(UnicodeScalar(UInt32(dollarCode))!) // $
Alternatively, start with a string containing the hexadecimal
representation of the code point, and convert that to a number
and then to a string:
let dollarSignHex = "24"
let dollarCode = UInt32(dollarSignHex, radix: 16)! // 36
let dollarSignString = String(UnicodeScalar(dollarCode)!) // $
