Inconsistent Swift behavior - iOS

I am new to Swift.
I have the following code:

    class ViewController: UIViewController {
        let var1: Double = 0.0
        let var2: Int = 0
        override func viewDidLoad() {
            super.viewDidLoad()
            let someObject = TestViewController(x: 20, total: 30, taxPact: 40, subtotal: 50)
            var x = 1 + 1.0 // COMPILER IS FINE WITH ADDING INT AND DOUBLE
            print("sum is \(var1 + var2)") // COMPILER COMPLAINS HERE: BINARY OPERATOR '+' CANNOT BE APPLIED
        }
    }
Why do we see such inconsistent behavior?

The error message is unrelated to string interpolation; this

    let var1: Double = 0.0
    let var2: Int = 0
    var x = var1 + var2 // error: binary operator '+' cannot be applied to operands of type 'Double' and 'Int'

does not compile either. The reason is that there is no + operator which adds an Int to a Double, and Swift does not implicitly convert types. You have to convert explicitly, e.g.

    var x = var1 + Double(var2)
    print("sum is \(var1 + Double(var2))")
Your other statement

    var x = 1 + 1.0

compiles because both Int and Double (and some other types) conform to the ExpressibleByIntegerLiteral protocol (formerly called IntegerLiteralConvertible), so the literal 1 can be either an Int literal or a Double literal. Here the compiler infers 1 to be a Double because that is the only choice for which a suitable + operator exists.
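The difference is easy to see in a Playground; a minimal sketch (variable names are illustrative):

```swift
// A literal adapts to the type the context demands:
let d: Double = 1          // the literal 1 is inferred as Double (1.0)
let x = 1 + 1.0            // both operands inferred as Double; x == 2.0

// A typed constant does not adapt, so mixed arithmetic needs a conversion:
let n: Int = 1
// let bad = n + 1.0       // error: '+' cannot be applied to Int and Double
let good = Double(n) + 1.0 // explicit conversion; good == 2.0
```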

Related

Swift: initializer 'init(_:)' requires that 'Decimal' conform to 'BinaryInteger'

I'm trying to create a function that calculates and returns compound interest. The variables have different data types. Whenever I run the program I get the error initializer 'init(_:)' requires that 'Decimal' conform to 'BinaryInteger'. The following is my code:
    import Foundation

    class Compound {
        var p: Double
        var t: Int
        var r: Double
        var n: Int
        var interest: Double
        var amount: Double

        init(p: Double, t: Int, r: Double, n: Int) {
            self.p = p
            self.t = t
            self.r = r
            self.n = n
        }

        func calculateAmount() -> Double {
            amount = p * Double(pow(Decimal(1 + (r / Double(n))), n * t))
            return amount
        }
    }
The error:

    error: initializer 'init(_:)' requires that 'Decimal' conform to 'BinaryInteger'
    amount = p * Double(pow(Decimal(1 + (r / Double(n))),n * t))
                        ^
After looking at a similar problem I've also tried the following technique, but I'm still getting the same error:

    func calculateAmount() -> Double {
        let gg: Int = n * t
        amount = p * Double(pow(Decimal(1 + (r / Double(n))), Int(truncating: gg as NSNumber)))
        return amount
    }
How to solve this?
It would be easier to use the Double function pow(_: Double, _: Double) -> Double instead of the Decimal function pow(_ x: Decimal, _ y: Int) -> Decimal, considering that you want to return a Double:

    @discardableResult
    func calculateAmount() -> Double {
        amount = p * pow(1 + (r / Double(n)), Double(n) * Double(t))
        return amount
    }
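As a quick sanity check of the formula, here is a self-contained sketch (a standalone function with made-up sample values, not the asker's class):

```swift
import Foundation

// Compound interest: amount = p * (1 + r/n)^(n*t)
func compoundAmount(p: Double, r: Double, n: Int, t: Int) -> Double {
    return p * pow(1 + r / Double(n), Double(n) * Double(t))
}

// $1000 at 5% annual interest, compounded monthly for 10 years:
let amount = compoundAmount(p: 1000, r: 0.05, n: 12, t: 10)
print(amount) // ≈ 1647.01
```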

How to take binary string input from the user in Swift

I want to take input from the user in binary. What I want is something like:

    10101
    11110

Then I need to perform bitwise OR on this. I know how to take input and how to perform bitwise OR; I only want to know how to convert, because what I am currently using is not giving the right result. What I tried is below:

    let aBits: Int16 = Int16(a)! // a is String "10101"
    let bBits: Int16 = Int16(b)! // b is String "11110"
    let combinedbits = aBits | bBits

Edit: I don't need decimal-to-binary conversion with radix, as my string already has only 0s and 1s. A string can have up to 500 characters, like:

    1001101111101011011100101100100110111011111011000100111100111110111101011011011100111001100011111010

This is beyond the Int limit; how do I handle that in Swift?
Edit 2: As per vacawama's answer, the code below works great:

    let maxAB = max(a.count, b.count)
    let paddedA = String(repeating: "0", count: maxAB - a.count) + a
    let paddedB = String(repeating: "0", count: maxAB - b.count) + b
    let Str = String(zip(paddedA, paddedB).map({ $0 == ("0", "0") ? "0" : "1" }))

I can have an array of up to 500 strings, and each string can have up to 500 characters. Then I have to get all possible pairs, perform bitwise OR, and count the maximum number of 1's. Any idea to make the above solution more efficient? Thank you
Since you need arbitrarily long binary numbers, do everything with strings.
This function first pads the two inputs to the same length, and then uses zip to pair the digits and map to compute the OR for each pair of characters. The resulting array of characters is converted back into a String with String().
    func binaryOR(_ a: String, _ b: String) -> String {
        let maxAB = max(a.count, b.count)
        let paddedA = String(repeating: "0", count: maxAB - a.count) + a
        let paddedB = String(repeating: "0", count: maxAB - b.count) + b
        return String(zip(paddedA, paddedB).map({ $0 == ("0", "0") ? "0" : "1" }))
    }

    print(binaryOR("11", "1100"))   // "1111"
    print(binaryOR("1000", "0001")) // "1001"
I can have array of upto 500 string and each string can have upto 500
characters. Then I have to get all possible pair and perform bitwise
OR and count maximum number of 1's. Any idea to make above solution
more efficient?
You will have to do 500 * 499 / 2 (which is 124,750 comparisons). It is important to avoid unnecessary and/or repeated work.
I would recommend:
Do an initial pass over your strings to find the length of the largest one, then pad all of your strings to this length. I would keep track of the original length of each string in a tiny struct:

    struct BinaryNumber {
        var string: String  // padded string
        var length: Int     // original length before padding
    }
Modify the binaryOR function to take BinaryNumbers and return Int, the count of "1"s in the OR.
    func binaryORcountOnes(_ a: BinaryNumber, _ b: BinaryNumber) -> Int {
        let maxAB = max(a.length, b.length)
        return zip(a.string.suffix(maxAB), b.string.suffix(maxAB)).reduce(0) { total, pair in
            total + (pair == ("0", "0") ? 0 : 1)
        }
    }
Note: The use of suffix helps the efficiency by only checking the digits that matter. If the original strings had length 2 and 3, then only the last 3 digits will be OR-ed even if they're padded to length 500.
Loop and compare all pairs of BinaryNumbers to find largest count of ones:
    var numbers: [BinaryNumber]  // This array was created in step 1

    var maxOnes = 0
    for i in 0 ..< (numbers.count - 1) {
        for j in (i + 1) ..< numbers.count {
            let ones = binaryORcountOnes(numbers[i], numbers[j])
            if ones > maxOnes {
                maxOnes = ones
            }
        }
    }
    print("maxOnes = \(maxOnes)")
Additional idea for speedup
OR can't create more ones than were in the original two numbers, and the number of ones can't exceed the maximum length of either of the original two numbers. So, if you count the ones in each number when you are padding them and store that in your struct in a var ones: Int property, you can use that to see if you should even bother calling binaryORcountOnes:
    var maxOnes = 0
    for i in 0 ..< (numbers.count - 1) {
        for j in (i + 1) ..< numbers.count {
            if maxOnes < min(numbers[i].ones + numbers[j].ones, numbers[i].length, numbers[j].length) {
                let ones = binaryORcountOnes(numbers[i], numbers[j])
                if ones > maxOnes {
                    maxOnes = ones
                }
            }
        }
    }
By the way, the length of the original string should really just be the minimum length that includes the highest order 1. So if the original string was "00101", then the length should be 3 because that is all you need to store "101".
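That bookkeeping can be folded into the construction of the struct; a sketch (makeBinaryNumber is a hypothetical helper, and ones is the extra property suggested above):

```swift
struct BinaryNumber {
    var string: String  // padded string
    var length: Int     // minimal length covering the highest-order 1
    var ones: Int       // number of "1" digits
}

func makeBinaryNumber(_ raw: String, paddedTo width: Int) -> BinaryNumber {
    let significant = raw.drop(while: { $0 == "0" })  // "00101" -> "101"
    let padded = String(repeating: "0", count: width - raw.count) + raw
    return BinaryNumber(string: padded,
                        length: max(significant.count, 1), // keep one digit for "0"
                        ones: raw.filter { $0 == "1" }.count)
}

let bn = makeBinaryNumber("00101", paddedTo: 8)
print(bn.string, bn.length, bn.ones) // 00000101 3 2
```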
    let number = Int(a, radix: 2)

The radix parameter parses the string as binary instead of decimal.
You can use radix for converting your string. Once converted, you can do a bitwise OR and then check nonzeroBitCount to count the number of 1's:

    let a = Int("10101", radix: 2)!         // 21
    let b = Int("11110", radix: 2)!         // 30
    let bitwiseOR = a | b                   // 31 (0b11111)
    let nonZero = bitwiseOR.nonzeroBitCount // 5
As I already commented above, "10101" is a String, not a binary number, so "10101" | "11110" will not calculate what you actually need.
What you need to do is convert both values to Int, use bitwise OR, and then convert the result back to a binary String (your data is in the format "11111", not 11111):

    let a1 = Int("10101", radix: 2)! // 21
    let b1 = Int("11110", radix: 2)! // 30
    let result = a1 | b1
    print(result)

Output: 31
Now convert it back to a binary string:

    let binaryString = String(result, radix: 2)
    print(binaryString)

Output: 11111
--: EDIT :--
I'm going to show a basic example of how to calculate bitwise OR without the built-in conversion, since the question says the string is too large to be converted to an Int.
Algorithm: 1|0 = 1, 1|1 = 1, 0|0 = 0, 0|1 = 1
So what we do is fetch the characters from each String one by one, perform the | operation, and append the result to another String.
    var str1 = "100101" // 37
    var str2 = "10111"  // 23
    // Result should be "110111" -> 55

    // #1. Make both strings equal length by padding the shorter one
    // with leading zeros
    let maxLength = max(str1.count, str2.count)
    str1 = String(repeating: "0", count: maxLength - str1.count) + str1
    str2 = String(repeating: "0", count: maxLength - str2.count) + str2

    // #2. Walk the digits pairwise and apply bitwise OR:
    // 1|0 = 1, 0|1 = 1, 1|1 = 1, 0|0 = 0
    var newStr = ""
    for index in 0..<maxLength {
        let charOf1 = Int(String(str1[str1.index(str1.startIndex, offsetBy: index)]))!
        let charOf2 = Int(String(str2[str2.index(str2.startIndex, offsetBy: index)]))!
        newStr.append("\(charOf1 | charOf2)")
    }
    print(newStr)

Output: 110111 // 55
I would refer you to Understanding Bitwise Operators for more detail.
Note that the following performs binary addition of two binary strings, not OR:

    func addBinary(_ a: String, _ b: String) {
        var result = ""
        let arrA = Array(a)
        let arrB = Array(b)
        var indexA = arrA.count - 1
        var indexB = arrB.count - 1
        var sum = 0
        while indexA >= 0 || indexB >= 0 || sum == 1 {
            sum += (indexA >= 0) ? Int(String(arrA[indexA]))! : 0
            sum += (indexB >= 0) ? Int(String(arrB[indexB]))! : 0
            result = String(sum % 2) + result
            sum /= 2
            indexA -= 1
            indexB -= 1
        }
        print(result)
    }

    addBinary("11", "1") // prints "100"

Left side of nil coalescing operator '??' has non-optional type 'Int', so the right side is never used: warning after conversion from Swift 1.2 to 4

I've converted a line from Swift 1.2 to Swift 4, but it still gives a warning. If I apply the fix-it that removes the warning, one value is dropped from the addition. Can someone suggest whether I'm doing the conversion wrong?

Swift 1.2:

    let sumEmailRemoved = account.numMessageArchived.integerValue ?? 0 + account.numMessageDeleted.integerValue ?? 0

Swift 4:

    let sumEmailRemoved = account.numMessageArchived.intValue ?? 0 + account.numMessageDeleted.intValue ?? 0

I'm trying to convert a method from Swift 1.2 to Swift 4, but it does not satisfy the real requirement. Can somebody please convert it correctly or give a suggestion?
Swift 1.2 code below:

    func checkShouldAllow() -> Bool {
        let productPurchased = NSUserDefaults.standardUserDefaults().boolForKey(kFullVersionProductId)
        if productPurchased { return true }
        if let account = account {
            let sumEmailRemoved = account.numMessageArchived.integerValue ?? 0 + account.numMessageDeleted.integerValue ?? 0
            if sumEmailRemoved > numCap {
                return false
            }
        }
        let numContactsRemoved = NSUserDefaults.standardUserDefaults().integerForKey(kNumContactsRemoved)
        println(numContactsRemoved)
        return numContactsRemoved < numContactsCap
    }

Need Swift 4 code.
As you can deduce from the warning, one (though I would guess both) of account.numMessageArchived.intValue and account.numMessageDeleted.intValue is now non-optional, therefore the ?? is redundant. Try:

    let sumEmailRemoved = account.numMessageArchived.intValue + account.numMessageDeleted.intValue
It's about the precedence and associativity of operators.
Precedence = order of operations. E.g., * is higher than +, so it's evaluated first:

    1 + 2 * 3
    = 1 + (2 * 3)
    = 7

Associativity = whether the left or right side groups first. E.g., + is left-associative, so it's evaluated from the left:

    1 + 2 + 3
    = ((1 + 2) + 3)
    = 6
Getting back to your question, first let's simplify the code:

    var num1: Int? = 0
    var num2: Int = 0 // I believe this is non-optional, or else the compiler would give another error.
    let result = num1 ?? 0 + num2 ?? 0

Now, surprisingly, + has higher precedence than the ?? operator, so it's grouped first:

    num1 ?? (0 + num2) ?? 0
    num1 ?? 0 ?? 0

Then, ?? is right-associative, so it's grouped from the right:

    (num1 ?? (0 ?? 0))

The left 0 of (0 ?? 0) is already non-optional. That's why it gives the warning:

    Left side of nil coalescing operator '??' has non-optional type 'Int', so the right side is never used

The solution is to use parentheses to make the intent explicit:

    let result = (num1 ?? 0) + num2
Check out the table of precedence and associativity here, ordered from high to low:
https://developer.apple.com/documentation/swift/operator_declarations
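The practical effect, the value that silently disappears from the addition, can be shown with a short sketch (sample values are made up):

```swift
let num1: Int? = 5
let num2 = 10

// Without parentheses this parses as num1 ?? (0 + num2),
// so the addition is discarded whenever num1 is non-nil:
let surprising = num1 ?? 0 + num2   // 5

// Parenthesizing first gives the intended sum:
let intended = (num1 ?? 0) + num2   // 15
```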

Swift - Unexpected error in simple if statement

I'm new to Swift so this might turn out to be very simple, but I'll ask you anyway since I can't figure it out:
I was playing on the Playground and thought I'd write some lines of code in order to do this: generate a random number between two given values (a,b).
If, for example, a = 5 and b = 20, a random number between 5 and 20 is supposed to be generated.
But I get an unexpected error! I wrote these lines of code:

    var a = UInt32()
    var b = UInt32()
    var randomNumberInBetween = Int()
    a = 5
    b = 20
    if (b - a) > 0 {
        randomNumberInBetween = Int(arc4random_uniform(b - a) + a)
    } else {
        print("error", terminator: "")
    }

Now:
If b > a (so that (b - a) > 0), it works just fine.
If b = a, it prints "error", so it works correctly.
BUT if a > b, so that (b - a) < 0, it gives this error: "Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)."
My question is: If (b-a)<0 shouldn't it just run the "else" part of the if statement and just print "error" ? Why does this error occur?
The UInt32 version of - has the signature: (UInt32, UInt32) -> UInt32.
7 - 9 is -2.
You can't express -2 with an unsigned type such as UInt32.
Try something like:
    func random(inRange range: Range<Int>) -> Int {
        return range.lowerBound + Int(arc4random_uniform(UInt32(range.upperBound - range.lowerBound)))
    }

    func random(inRange range: ClosedRange<Int>) -> Int {
        return range.lowerBound + Int(arc4random_uniform(UInt32(range.upperBound - range.lowerBound + 1)))
    }

    print(random(inRange: 1..<10))
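As a side note, if Swift 4.2 or later is available, the standard library's bounds-checked Int.random(in:) avoids the unsigned-subtraction trap entirely; a sketch:

```swift
let a = 5
let b = 20

// Int.random(in:) requires a valid range, so guard explicitly
// instead of relying on unsigned arithmetic to catch the mistake:
if a <= b {
    let randomNumberInBetween = Int.random(in: a...b)
    print(randomNumberInBetween) // some value in 5...20
} else {
    print("error")
}
```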

How to convert an Int to Hex String in Swift

In Obj-C I used to convert an unsigned integer n to a hex string with

    NSString *st = [NSString stringWithFormat:@"%2X", n];
I tried for a long time to translate this into Swift language, but unsuccessfully.
You can now do:

    let n = 14
    var st = String(format: "%02X", n)
    st += " is the hexadecimal representation of \(n)"
    print(st)

    0E is the hexadecimal representation of 14
Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0's if necessary. (Without the 0, the result would be padded with leading spaces). Of course, if the result is larger than two characters, the field length will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.
Use uppercase X if you want A...F and lowercase x if you want a...f:

    String(format: "%x %X", 64206, 64206) // "face FACE"

If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:

    let n = UInt64.max
    print(String(format: "%llX is hexadecimal for \(n)", n))

    FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615
Original Answer
You can still use NSString to do this. The format is:

    var st = NSString(format: "%2X", n)

This makes st an NSString, so things like += do not work. If you want to be able to append to the string with +=, make st a String like this:

    var st = NSString(format: "%2X", n) as String

or

    var st = String(NSString(format: "%2X", n))

or

    var st: String = NSString(format: "%2X", n)

Then you can do:

    let n = 123
    var st = NSString(format: "%2X", n) as String
    st += " is the hexadecimal representation of \(n)"
    // "7B is the hexadecimal representation of 123"
In Swift there is a specific init method on String for exactly this:

    let hex = String(0xF, radix: 16, uppercase: false)
    print("hex=\(hex)") // Output: f
With Swift 5, according to your needs, you may choose one of the three following methods in order to solve your problem.
#1. Using String's init(_:radix:uppercase:) initializer
Swift String has an init(_:radix:uppercase:) initializer with the following declaration:

    init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger

Creates a string representing the given value in base 10, or some other specified base.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:), without having to import Foundation:

    let string1 = String(2, radix: 16)
    print(string1) // prints: "2"

    let string2 = String(211, radix: 16)
    print(string2) // prints: "d3"

    let string3 = String(211, radix: 16, uppercase: true)
    print(string3) // prints: "D3"
#2. Using String's init(format:_:) initializer
Foundation provides String an init(format:_:) initializer with the following declaration:

    init(format: String, _ arguments: CVarArg...)

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.
Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:

Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):

    import Foundation

    let string1 = String(format: "%X", 2)
    print(string1) // prints: "2"

    let string2 = String(format: "%02X", 1)
    print(string2) // prints: "01"

    let string3 = String(format: "%02X", 211)
    print(string3) // prints: "D3"

    let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
    print(string4) // prints: "0C, 79, FF"
#3. Using String's init(format:arguments:) initializer
Foundation provides String an init(format:arguments:) initializer with the following declaration:

    init(format: String, arguments: [CVarArg])

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user's default locale.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):

    import Foundation

    let string1 = String(format: "%X", arguments: [2])
    print(string1) // prints: "2"

    let string2 = String(format: "%02X", arguments: [1])
    print(string2) // prints: "01"

    let string3 = String(format: "%02X", arguments: [211])
    print(string3) // prints: "D3"

    let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
    print(string4) // prints: "0C, 79, FF"
Swift 5.2.4:

    let value = 200
    let hexString = String(format: "%02X", value)
The answers above work fine for values in the range of a 32-bit Int, but larger values won't work, as the value will roll over.
You need to use the length modifier for values greater than a 32-bit Int:

%x: Unsigned 32-bit integer (unsigned int)
ll: Length modifier specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.

    let hexString = String(format: "%llX", decimalValue)
To use

    let string2 = String(format: "%02X", 1)
    print(string2) // prints: "01"

in Swift 3, import Foundation is not required, at least not in a project.
String should have all the functionality of NSString.
