I'm making a calculator app in Swift. Once my answer is obtained, I want to display it in a UILabel. The only problem is that I want to limit the answer to 8 characters. Here is my code:
let answerString = "\(answer)"
println(answer)
calculatorDisplay.text = answerString.substringToIndex(advance(answerString.startIndex, 8))
This does not return any compiler errors but at runtime I get:
fatal error: can not increment endIndex
Any and all help would be greatly appreciated.
There are two different advance() functions:
/// Return the result of advancing `start` by `n` positions. ...
func advance<T : ForwardIndexType>(start: T, n: T.Distance) -> T
/// Return the result of advancing start by `n` positions, or until it
/// equals `end`. ...
func advance<T : ForwardIndexType>(start: T, n: T.Distance, end: T) -> T
Using the second one you can ensure that the result is within the valid bounds of the string:
let truncatedText = answerString.substringToIndex(advance(answerString.startIndex, 8, answerString.endIndex))
Update for Swift 2/Xcode 7:
let truncatedText = answerString.substringToIndex(answerString.startIndex.advancedBy(8, limit: answerString.endIndex))
But a simpler solution is
let truncatedText = String(answerString.characters.prefix(8))
Update for Swift 3/Xcode 8 beta 6: As of Swift 3, "collections move their indices", so the corresponding code is now
let to = answerString.index(answerString.startIndex,
                            offsetBy: 8,
                            limitedBy: answerString.endIndex)
let truncatedText = answerString.substring(to: to ?? answerString.endIndex)
The simpler solution
let truncatedText = String(answerString.characters.prefix(8))
still works.
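As a side note, here is a small self-contained sketch (Swift 3, with made-up sample values) showing why the prefix-based approach also avoids the original "can not increment endIndex" crash: for strings shorter than 8 characters, prefix(8) simply returns the whole string instead of trapping.
let longAnswer = "3.1415926535"
print(String(longAnswer.characters.prefix(8)))  // "3.141592"
let shortAnswer = "42"
print(String(shortAnswer.characters.prefix(8))) // "42" (no crash for short strings)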
I am getting unexpected results while doing string manipulation and conversion to an integer value. Could someone help me understand why I am getting these results? I will show my code and the printed results, and explain what I expected instead.
var startIndex = string?.index((string?.startIndex)!, offsetBy: 10)
var endIndex = string?.index(of: ".")!
var field = String(describing: string?[startIndex!..<endIndex!])
print(field as Any)
Prints:
Optional("10")
Just what I expected.
print(field.lengthOfBytes(using: .utf8))
Prints:
14
Not what I expected. I expected a value of 2 since "10" is only 2 characters long.
print(Int(field) as Any)
Prints:
nil
Not what I expected. I expected a value of 10, since a string of "10" converted to integer would be an integer value of 10.
I just duplicated this code in Playground and it works as expected. I don't know why.
//: Playground - noun: a place where people can play
import Foundation
var string = "Network\t\t\t10.0.0.0/8\nClass\t\t\t\tA\nRequired Hosts:\n2\n\nRequired hosts\t2\nAvailable hosts\t2\nSubnet\t\t\t\t10.0.0.0/30\nRange start\t\t10.0.0.1\nRange end\t\t\t10.0.0.2\nBroadcast\t\t\t10.0.0.3\nMask\t\t\t\t255.255.255.252\n\n"
print(string)
var startIndex = string.index(string.startIndex, offsetBy: 10)
var endIndex = string.index(of: ".")!
var field = String(describing: string[startIndex..<endIndex])
print(field as Any)
print(field.lengthOfBytes(using: .utf8))
print(Int(field) as Any)
Prints:
Network 10.0.0.0/8
Class A
Required Hosts:
2
Required hosts 2
Available hosts 2
Subnet 10.0.0.0/30
Range start 10.0.0.1
Range end 10.0.0.2
Broadcast 10.0.0.3
Mask 255.255.255.252
10
2
Optional(10)
Try this; hope this will help:
var field = String("10")
var intValue = Int(field)
print(intValue!)
Result: 10
var field = String(describing: string?[startIndex!..<endIndex!])
field now has the value Optional("10").
print(field.lengthOfBytes(using: .utf8))
prints 14 because that's the length of the string Optional("10"). No surprise here.
print(Int(field) as Any)
is nil because the String Optional("10") is not convertible to Int.
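Here is a small illustration (the optional value is made up for demonstration) of what String(describing:) does to an optional, which explains all three results above:
let value: String? = "10"
let field = String(describing: value)
print(field)                              // Optional("10")
print(field.lengthOfBytes(using: .utf8))  // 14, the "Optional(...)" wrapper text is counted too
print(Int(field) as Any)                  // nil, because "Optional(\"10\")" is not a number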
If you remove the optionality from string, everything works just fine:
if let str = string {
    var startIndex = str.index(str.startIndex, offsetBy: 10)
    var endIndex = str.index(of: ".")!
    var field = String(describing: str[startIndex..<endIndex])
    print(field)
    print(field.lengthOfBytes(using: .utf8))
    print(Int(field) as Any)
}
Although the code is a bit cumbersome, it should work if the first digit is the 11th character in the string.
However, in this case Scanner is much more reliable, because the offset can be dynamic and the result is an Int.
The scanner scans up to the first decimal digit, then scans an Int and assigns it to the result variable.
let string = "Network\t\t\t10.0.0.0/8\nClass\t\t\t\tA\nRequired Hosts:\n2\n\nRequired hosts\t2\nAvailable hosts\t2\nSubnet\t\t\t\t10.0.0.0/30\nRange start\t\t10.0.0.1\nRange end\t\t\t10.0.0.2\nBroadcast\t\t\t10.0.0.3\nMask\t\t\t\t255.255.255.252\n\n"
var result = 0
let scanner = Scanner(string: string)
scanner.scanUpToCharacters(from: .decimalDigits, into: nil)
scanner.scanInt(&result)
print(result) // 10
I implemented a function to calculate the Hamming distance in Swift. It uses the XOR operation x ^ y to get the differing bits. Then I convert the result from an Int to a String of 8 characters, which is the 8-bit representation of my XOR. However, I am getting the error:
Compile Error: ambiguous use of 'filter'
class Solution {
    func hammingDistance(_ x: Int, _ y: Int) -> Int {
        let xor = x ^ y //xor: compares bits
        let xorBinary = String(xor, radix: 2)
        let xor8BitBinaryStr = String(repeating: Character("0"), count: 8 - xorBinary.count) + xorBinary
        return xor8BitBinaryStr.filter({ $0 == "1" }).count
    }
}
let c = Solution()
print(c.hammingDistance(1, 4)) //prints 2
You can filter like this to avoid confusion for the compiler:
let items = xor8BitBinaryStr.filter({ $0 == "1"})
return items.count
OR
return Array(xor8BitBinaryStr).filter({ $0 == "1" }).count
To solve this, declare the type of the xor8BitBinaryStr before you perform operations on it.
let xor8BitBinaryStr : `data type here` = String(repeating: Character("0"), count: 8 - xorBinary.count) + xorBinary
In Swift 4.0, there are two filter methods on String which only differ by their return type.
One returns String, the other [Character].
If you don't explicitly declare the type of the return you expect, it defaults to String.
Therefore, if you want to get [Character], you need to do something like this:
let chars: [Character] = xor8BitBinaryStr.filter({ $0 == "1" })
return chars.count
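To make the difference visible, here is a short sketch (assuming Swift 4.1 or later, where the ambiguity has been resolved, and using an arbitrary bit string) contrasting the two overloads:
let bits = "00000101"
let asString: String = bits.filter { $0 == "1" }          // "11"
let asCharacters: [Character] = bits.filter { $0 == "1" } // ["1", "1"]
print(asString.count, asCharacters.count)                 // 2 2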
EDIT: This was a bug in Swift that was fixed, presumably in 4.1. It was marked Resolved on 11/17/17. See https://bugs.swift.org/browse/SR-5175?jql=text%20~%20%22filter%22
I was using this extension method to generate a random number:
func Rand(_ range: Range<UInt32>) -> Int {
    return Int(range.lowerBound + arc4random_uniform(range.upperBound - range.lowerBound + 1))
}
I liked it because it was no-nonsense; you just called it like this:
let test = Rand(1...5) //generates a random number between 1 and 5
I honestly don't know why things need to be so complicated in Swift, but I digress.
So I'm now receiving an error in Swift 3:
No '...' candidates produce the expected contextual result type 'Range<UInt32>'
Would anyone know what this means or how I could get my awesome Rand function working again? I guess x...y no longer creates a Range, or x...y must be explicitly defined as UInt32? Any advice for me to make things a tad easier?
Thanks so much, appreciate your time!
In Swift 3 there are four Range structures:
"x" ..< "y" ⇒ Range<T>
"x" ... "y" ⇒ ClosedRange<T>
1 ..< 5 ⇒ CountableRange<T>
1 ... 5 ⇒ CountableClosedRange<T>
(The operators ..< and ... are overloaded so that if the elements are stridable (random-access iterators e.g. numbers and pointers), a Countable Range will be returned. But these operators can still return plain Ranges to satisfy the type checker.)
Since Range and ClosedRange are different structures, you cannot implicitly convert one to the other, and thus the error.
If you want Rand to accept a ClosedRange as well as Range, you must overload it:
// accepts Rand(0 ..< 5)
func Rand(_ range: Range<UInt32>) -> Int {
    return Int(range.lowerBound + arc4random_uniform(range.upperBound - range.lowerBound))
}

// accepts Rand(1 ... 5)
func Rand(_ range: ClosedRange<UInt32>) -> Int {
    return Int(range.lowerBound + arc4random_uniform(range.upperBound + 1 - range.lowerBound))
}
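A quick usage sketch of the two overloads above (the results are random, of course):
print(Rand(0 ..< 5)) // an Int between 0 and 4
print(Rand(1 ... 5)) // an Int between 1 and 5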
A nice solution is presented in Generic Range Algorithms
(based on How to be DRY on ranges and closed ranges? in the swift-users mailing list).
It uses the fact that both CountableRange and CountableClosedRange
are collections, and in fact conform to RandomAccessCollection.
So you can define a single (generic) function which accepts both open and closed
integer ranges:
func rand<C: RandomAccessCollection>(_ coll: C) -> C.Iterator.Element {
    precondition(coll.count > 0, "Cannot select random element from empty collection")
    let offset = arc4random_uniform(numericCast(coll.count))
    let idx = coll.index(coll.startIndex, offsetBy: numericCast(offset))
    return coll[idx]
}
rand(1...5) // random number between 1 and 5
rand(2..<10) // random number between 2 and 9
but also:
rand(["a", "b", "c", "d"]) // random element from the array
Alternatively as a protocol extension method:
extension RandomAccessCollection {
    func rand() -> Iterator.Element {
        precondition(count > 0, "Cannot select random element from empty collection")
        let offset = arc4random_uniform(numericCast(count))
        let idx = index(startIndex, offsetBy: numericCast(offset))
        return self[idx]
    }
}
(1...5).rand()
(2..<10).rand()
["a", "b", "c", "d"].rand()
You could rewrite Rand() to use Int if that is your primary use case:
func Rand(_ range: Range<Int>) -> Int {
    let distance = UInt32(range.upperBound - range.lowerBound)
    return range.lowerBound + Int(arc4random_uniform(distance + 1))
}
Or as kennytm points out, use Rand(1..<6)
I'm migrating a project from Swift 2.2 to Swift 3, and I'm trying to get rid of old Cocoa data types when possible.
My problem is here: migrating NSDecimalNumber to Decimal.
I used to bridge NSDecimalNumber to Double both ways in Swift 2.2:
let double = 3.14
let decimalNumber = NSDecimalNumber(value: double)
let doubleFromDecimal = decimalNumber.doubleValue
Now, switching to Swift 3:
let double = 3.14
let decimal = Decimal(double)
let doubleFromDecimal = ???
decimal.doubleValue does not exist, nor Double(decimal), not even decimal as Double...
The only hack I come up with is:
let doubleFromDecimal = (decimal as NSDecimalNumber).doubleValue
But it would be completely stupid to try to get rid of NSDecimalNumber and still have to use it once in a while...
Well, either I missed something obvious, and I beg your pardon for wasting your time, or there's a gap that needs to be addressed, in my opinion...
Thanks in advance for your help.
Edit: Nothing more on the subject in Swift 4.
Edit: Nothing more on the subject in Swift 5.
NSDecimalNumber and Decimal are bridged
The Swift overlay to the Foundation framework provides the Decimal
structure, which bridges to the NSDecimalNumber class. The Decimal
value type offers the same functionality as the NSDecimalNumber
reference type, and the two can be used interchangeably in Swift code
that interacts with Objective-C APIs. This behavior is similar to how
Swift bridges standard string, numeric, and collection types to their
corresponding Foundation classes. (Apple Docs)
But, as with some other bridged types, certain elements are missing.
To regain the functionality you could write an extension:
extension Decimal {
    var doubleValue: Double {
        return NSDecimalNumber(decimal: self).doubleValue
    }
}
// implementation
let d = Decimal(floatLiteral: 10.65)
d.doubleValue
Another solution that works in Swift 3 is to cast the Decimal to NSNumber and create the Double from that.
let someDouble = Double(someDecimal as NSNumber)
As of Swift 4.2 you need:
let someDouble = Double(truncating: someDecimal as NSNumber)
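For a self-contained illustration (the value here is just an example):
let someDecimal = Decimal(string: "10.65")!
let someDouble = Double(truncating: someDecimal as NSNumber)
print(someDouble) // 10.65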
Solution that works in Swift 4
let double = 3.14
let decimal = Decimal(double)
let doubleFromDecimal = NSDecimalNumber(decimal: decimal).doubleValue
print(doubleFromDecimal)
Swift 5
let doubleValue = Double(truncating: decimalValue as NSNumber)
Decimal in Swift 3 is not NSDecimalNumber. It's NSDecimal, a completely different type.
You should just keep using NSDecimalNumber as you did before.
You are supposed to use the as operator to cast a Swift type to its bridged underlying Objective-C type, so just use as like this:
let p = Decimal(1)
let q = (p as NSDecimalNumber).doubleValue
In Swift 4, Decimal is NSDecimalNumber. Here's the citation from Apple's official documentation in Xcode 10.
Important
The Swift overlay to the Foundation framework provides the Decimal
structure, which bridges to the NSDecimalNumber class. For more
information about value types, see Working with Cocoa Frameworks in
Using Swift with Cocoa and Objective-C (Swift 4.1).
There's no NSDecimal anymore.
There was a confusing NSDecimal type in Swift 3, but it seems to have been a bug.
No more confusion.
Note
I see the OP is not interested in Swift 4, but I added this answer because mentioning only the (outdated) Swift 3 behavior was confusing.
In Swift open source, the implementation is actually done in Decimal.swift, but it is private. You can re-use the code from there.
extension Double {
    init(_ other: Decimal) {
        if other._length == 0 {
            self.init(other._isNegative == 1 ? Double.nan : 0)
            return
        }
        var d: Double = 0.0
        // Accumulate the (up to 8) 16-bit mantissa words, most significant first.
        for idx in (0..<min(other._length, 8)).reversed() {
            var m: Double = 0 // idx is always in 0...7, so one of the cases below matches
            switch idx {
            case 0: m = Double(other._mantissa.0)
            case 1: m = Double(other._mantissa.1)
            case 2: m = Double(other._mantissa.2)
            case 3: m = Double(other._mantissa.3)
            case 4: m = Double(other._mantissa.4)
            case 5: m = Double(other._mantissa.5)
            case 6: m = Double(other._mantissa.6)
            case 7: m = Double(other._mantissa.7)
            default: break
            }
            d = d * 65536 + m
        }
        // Apply the decimal exponent.
        if other._exponent < 0 {
            for _ in other._exponent..<0 {
                d /= 10.0
            }
        } else {
            for _ in 0..<other._exponent {
                d *= 10.0
            }
        }
        self.init(other._isNegative != 0 ? -d : d)
    }
}
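A brief usage sketch of the extension above (the sample value is arbitrary):
let dec = Decimal(string: "3.14159")!
let dbl = Double(dec) // uses the initializer defined above
print(dbl)            // approximately 3.14159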
For Swift 5, the function is
let doubleValue = Double(truncating: decimalValue as NSNumber)
The example below shows the value formatted to varying numbers of decimal places.
let decimalValue: Decimal = 3.14159
let doubleValue = Double(truncating: decimalValue as NSNumber)
print(String(format: "%.3f", doubleValue)) // 3.142
print(String(format: "%.4f", doubleValue)) // 3.1416
print(String(format: "%.5f", doubleValue)) // 3.14159
print(String(format: "%.6f", doubleValue)) // 3.141590
print(String(format: "%.7f", doubleValue)) // 3.1415900
Prior to the new Swift version I was using the following code in my app. Now it produces errors.
for (i, in 0 ..< len){
let length = UInt32 (letters.length)
let rand = arc4random_uniform(length)
randomString.appendFormat("%C", letters.characterAtIndex(Int(rand)))
}
Xcode says:
Expected pattern
Expected "," separator
Expected "in" after for-each pattern
Expected SequenceType expression for for-each loop
Changing the code according to the proposed solutions doesn't get rid of the errors.
Any help is welcome to update the code to the current Swift version.
The for syntax you are using has been deprecated and should be changed to
for _ in 0..<len
// rest of your code
The question already has a correct answer, but I have converted the code anyway and am posting it here in case it helps someone.
let len = 5
let letters: NSString = "str"
let randomString = NSMutableString()
for _ in 0 ..< len {
    let length = UInt32(letters.length)
    let rand = arc4random_uniform(length)
    randomString.appendFormat("%C", letters.character(at: Int(rand)))
}
As some of the variables are not shown in the original code, I have defined them based on the parameters used.