I'd like some clarification on optionals in Swift 1.2, e.g. the snippet below:
var numbers : [Array<String?>?]? = []
var getContact: String?
self.getContact = numbers![indexPath.row]?.first!
println(numbers![indexPath.row]?.first!)
println(getContact)
prints:
Optional("1-646-961-1869")
nil
It might help to read SomeType? as “something that might either be a value of type SomeType, or nil”.
Based on this, you could read this:
var numbers : [Array<String?>?]? = []
as:
numbers is a variable that might contain an array, or nil. In this part of the code it does contain an array, because you’ve assigned [] (an empty array) to it. The array contains values that are either arrays, or nil. And each element of those inner arrays is either a String, or nil.
What nil means depends on the context. It might indicate failure, or “not set yet”, or something else. For example, in let maybeInt = Int("foo"), maybeInt will be an Int? (which is shorthand for Optional<Int>), and will be set to nil, and in this case the nil means that "foo" cannot be turned into a number. In some other languages, you might instead get an exception, or the number 0 or -1, or a second parameter might be used to indicate failure, but Swift uses optionals.
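A minimal sketch of that example (modern print spelling):
let maybeInt = Int("foo")    // maybeInt is Int?, i.e. Optional<Int>
if let n = maybeInt {
    print("parsed \(n)")
} else {
    print("not a number")    // this branch runs: "foo" isn't numeric
}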
Optionals can be overused. For example, an optional array is often suspicious – usually an empty array is sufficient. If you choose to make an array optional, ask the question “what is different between the array being nil and the array being empty?”. If there’s no difference, it probably shouldn’t be an optional, just a regular array.
(if the library/framework you’re using likes to traffic in these things without a clear reason, it might be a sign it’s not a very good library)
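A minimal sketch of that advice (the names here are made up):
var phoneNumbers: [String] = []        // "no numbers yet" is just an empty array
// rather than:
var maybePhoneNumbers: [String]? = nil // forces every caller to unwrap first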
When you have an optional like numbers here, ! means “I know this is an optional, but I don’t want to deal with the optionality so just assume it’s not nil and if it is, crash.” Crash hard, right there, right then.
So, for example, the array property first returns the first element of the array, unless the array is empty (like it is in your example code), in which case it returns nil. So in your example, the ! in numbers![something] won't crash, because you assigned numbers a value, so it isn't nil (the subscript could still fail on an out-of-range index, but that's a different trap). But the trailing .first! might crash if the array at index something is empty.
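For example, a small sketch of the difference:
let empty: [String] = []
print(empty.first)     // nil – first is safe on an empty array
// print(empty.first!) // would crash: unexpectedly found nil
let filled = ["a", "b"]
print(filled.first!)   // "a" – fine, but only because we know it's non-empty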
Lots of ! in your code is usually another bad sign, often suggesting that you (or the author of your library) shouldn’t have made something optional because it isn’t ever nil, or that there are failure cases (like an array sometimes being empty) that you’re ignoring.
This is a difficult question to answer for a number of reasons. First, this kind of use of optionals is generally frowned upon because it's error-prone and hard to understand. Second, there isn't much context for us to work with. Third, most people are using Swift 2.0 at this point because it's backwards-compatible.
If you're trying to understand how optionals work, you might find my tutorial on Swift optionals useful.
Related
let dict = [1:"One", 2:"Two", 3:"Three"]
let values = dict.values
print(values.dynamicType)
prints:
LazyMapCollection<Dictionary<Int, String>, String>
There are two things I don't understand here. Wouldn't it be a bit more simple if values returned an Array? Also, what is LazyMapCollection? I looked into Apple's reference but it provides literally no information (or nothing I am able to make sense of).
You can iterate over this object, because it is CollectionType:
for v in values {
    print(v)
}
prints:
Two
Three
One
But for some reason Apple didn't use the Array type.
A LazyMapCollection is a lazy (only evaluated if necessary) view on a collection. By "view" I mean "window", "frame", "virtual subset" – this kind of concept.
To get the actual values from it, just use the Array initializer:
let values = dict.values
let result = Array(values)
What you have here is a serious speed optimisation. Creating an array is expensive. Instead you get an instance of some strange type that behaves in every way like an array. However, it doesn't have its data stored in a real array; instead, it accesses the data from the dictionary.
Say you have a dictionary with 10,000 string values. You don't want iOS to copy all 10,000 string values when you call dict.values, do you? That's what this type is there for: to prevent copying 10,000 strings. A real array would force copying.
And anyway, with your username you are asking for things like this, and Apple provides many examples. That's how they make iOS fast.
Both arrays and dictionaries are value types (structs). That means that once you request an array, the values must be copied. If Dictionary.values returned an array, this could be a performance-expensive operation – and one that is usually not needed, because most of the time you only want to iterate over the values.
So using a special (lazy) collection type is basically a way to prevent copying when the copying is not needed. If you want a copy, you have to ask for it explicitly.
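A small sketch of the difference, reusing the dictionary from the question:
let dict = [1: "One", 2: "Two", 3: "Three"]
let values = dict.values     // a lazy view – nothing is copied yet
for v in values {
    print(v)                 // iterating needs no copy at all
}
let copied = Array(values)   // an explicit copy, made only when you ask for it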
Is it normal to get an extra "Optional("")" wrapper on the strings I am fetching in Core Data?
Here is the code
bowtie.name = "My bow tie"
bowtie.lastWorn = NSDate()
And this is what I am getting in the Xcode output log.
Name: Optional("My bow tie"), Worn: Optional(2015-11-08 14:23:11 +0000)
Is there a way to get rid of the Optional("") thing?
Every time you force unwrap an Optional (using !) a kitten dies.
There are several safe ways of unwrapping an Optional such as using if let or flatMap (even though it's not a real flatMap).
In several cases you can use Optional Chaining so you don't have to deal with Optionals before you actually have to. The nil coalescing operator (??) is also pretty useful.
This SO answer is extremely helpful, you should definitely check it out.
If you want to fully understand the concept of Optionals take a look at the docs.
In that specific case, though, I'd recommend using something like let fetchedName = bowtie.name ?? "" (or any other fallback string that makes sense for your problem).
When you force unwrap and, for some bizarre reason, the value is nil, the app will crash. Nobody likes crashes, right?
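A minimal sketch of those options (property names taken from the question, modern Swift spelling):
if let name = bowtie.name, let worn = bowtie.lastWorn {
    print("Name: \(name), Worn: \(worn)")   // no Optional(...) wrapper
}
print("Name: \(bowtie.name ?? "unknown")")  // nil coalescing with a fallback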
Can I create a normal variable in Swift (I mean a non-optional) and assign a nil value to it, or let it become nil later during the app lifecycle?
It confuses me, since it's a little strange compared to traditional strongly typed languages like Java and C#.
No, that's not possible by design. This excerpt from the documentation explains why:
The concept of optionals doesn’t exist in C or Objective-C. The nearest thing in Objective-C is the ability to return nil from a method that would otherwise return an object, with nil meaning “the absence of a valid object.” However, this only works for objects—it doesn’t work for structures, basic C types, or enumeration values. For these types, Objective-C methods typically return a special value (such as NSNotFound) to indicate the absence of a value. This approach assumes that the method’s caller knows there is a special value to test against and remembers to check for it. Swift’s optionals let you indicate the absence of a value for any type at all, without the need for special constants.
You are describing optionals as a bad thing, whereas they are one of the features I appreciate most in the language, because they prevent most null pointer exception bugs.
Another advantage is that when a function can fail to return a value, you don't have to reserve a sentinel from the spectrum of possible values of the return type (nil for reference types in Objective-C, -1 for integers, etc.). Not to mention that such a sentinel is a convention that both the caller and the function/method must follow.
Last, if you are using too many question and exclamation marks in your code, then you should think about whether or not optionals are really appropriate for the problem (thanks @David for the hint), or take advantage of optional binding more frequently in the cases where optionals really are needed.
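A hedged sketch contrasting the two styles (the function names are made up, modern Swift spelling):
// Sentinel style: -1 is stolen from Int's value space, and every caller
// must remember the convention.
func legacyIndex(of item: Int, in items: [Int]) -> Int {
    for (i, x) in items.enumerated() where x == item { return i }
    return -1
}
// Optional style: absence is encoded in the type, and the compiler
// forces the caller to deal with it, e.g. via optional binding.
func index(of item: Int, in items: [Int]) -> Int? {
    for (i, x) in items.enumerated() where x == item { return i }
    return nil
}
if let i = index(of: 3, in: [1, 2, 3]) {
    print("found at \(i)")
}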
Suggested reading: Optionals
Addendum
Hint: I've frequently seen optionals used in cases where a variable is declared but cannot be initialized contextually. Non-optional mutable variables are not required to be declared and initialized on the same line – deferred initialization is allowed, provided that the variable is not used before its initialization. For example:
var x: Int // Variable declared here
for var counter = 0; counter < 10; ++counter {
    println(counter)
}
var array = [1, 2, 3]
// ... more lines of code NOT using the x variable
x = 5 // Variable initialized here
println(x)
Hopefully this feature will let you remove several optionals from your code...
Can I create a normal variable in Swift (I mean a non-optional) and assign a nil value to it, or let it become nil later during the app lifecycle?
No.
This is easily testable in the playground:
var str = "Hello, playground"
str = nil
The second line will get this error:
Type 'String' does not conform to protocol 'NilLiteralConvertible'
You might want to read more about Swift Literal Convertibles and see an example of how to use it.
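For reference, a minimal sketch of such a conformance (the protocol was renamed ExpressibleByNilLiteral in Swift 3; the MaybeEmpty type is made up):
struct MaybeEmpty: ExpressibleByNilLiteral {
    let values: [Int]
    init(nilLiteral: ()) { values = [] }           // this type decides nil means "empty"
    init(_ values: [Int]) { self.values = values }
}
var m: MaybeEmpty = nil   // legal only because of the conformance
print(m.values)           // prints []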
You are right, you cannot set a non-optional to nil. Although this seems like a burden at first, you gain a lot of safety and readability by giving away a tiny bit of flexibility. Once you get used to it, you will appreciate it more and more.
My immediate project is to develop a system of CheckSums for proving that two somewhat complex objects are (functionally) EQUAL – in the sense that they have the same values for the critical properties. (I have discovered that dates/times cannot be included, so, for my purposes, I can't use JSON on the bigger object – duh :) .)
To do this, calling the hashCode() method on selected strings seemed to be the way to go.
Upon implementing this, I note that in practice I am getting very different values on multiple runs for highest-level objects that are functionally 'identical'.
There are a number of "nums" that I have not rounded; beyond those, there are integers, bools, Strings and not much more.
I have 'always' thought that a hashCode on the same set of values would return the same number, am I missing something?
BTW, the only context in which I have found material on hashCode() has been WebSockets.
Of course I can write my own mapping from a String to a unique value, but I want to understand whether this is a problem with Dart or something else.
I can attempt to answer the question posed in the title: "Can hashCode() method calls return different values on equal (==) Objects?"
Short answer: hash codes for two objects must be the same if those two objects are equals (==).
If you override hashCode you must also override ==. Two objects that are equal, as defined by ==, must also have the same hash code.
However, hash codes do not have to be unique. That is, a perfectly valid hash code is the value 1. A good hash code, however, should be uniformly distributed.
From the docs for Object:
Hash codes are guaranteed to be the same for objects that are equal when compared using the equality operator ==. Other than that there are no guarantees about the hash codes. They will not be consistent between runs and there are no distribution guarantees.
If a subclass overrides hashCode it should override the equality operator as well to maintain consistency.
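The same contract exists in Swift, for comparison – a minimal sketch, not the Dart API:
struct Point: Hashable {   // == and hashValue are synthesized consistently
    let x: Int
    let y: Int
}
let a = Point(x: 1, y: 2)
let b = Point(x: 1, y: 2)
assert(a == b)
assert(a.hashValue == b.hashValue)   // guaranteed whenever a == b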
I found the immediate problem. The object's stringify() method, at one level, was not getting called; instead, some stringify property that must exist in all objects (?) was being used.
With this fixed, everything is working exactly as I would expect, and multiple runs of our Statistical Studies are returning exactly the same CheckSum at the highest levels (based on some 5 levels of hierarchy).
Meanwhile, JSON.stringify has continued to fail, even on the most basic object. I have not been able to determine what is causing it to fail. Of course, the question is not about how "stringify" is accomplished.
So, empirically at least, I believe it is true that "objects with equal properties" will return equal CheckSums in Dart. It was decided to round nums; I don't know if this was causing a problem – perhaps good to be aware of? And, of course, remember to beware of things like dates, times, or anything that could legitimately vary.
The doc linked by Seth Ladd now includes this info:
They need not be consistent between executions of the same program and there are no distribution guarantees.
So technically the hashCode value can change for the same object in different executions, which answers your question:
I have 'always' thought that a hashCode on the same set of values would return the same number, am I missing something?
I'm working on a Ruby app in which symbols are used in various places where one would usually use strings or enums in other languages (to specify configurations, mostly).
So my question is: why should I not add a to_str method to Symbol?
It seems sensible, as it allows implicit conversion between symbol and string. So I can do stuff like this without having to worry about calling :symbol.to_s:
File.join(:something, "something_else") # => "something/something_else"
The negative is the same as the positive: it implicitly converts symbols to strings, which can be REALLY confusing if it causes an obscure bug; but given how symbols are generally used, I'm not sure this is a valid concern.
Any thoughts?
When an object returns true for respond_to? :to_str, you expect it to really act like a String. This means it should implement all of String's methods, so you could potentially break some code relying on this.
to_s means that you get a string representation of your object; that's why so many objects implement it – but the string you get is far from being 'semantically' equivalent to your object (an_hash.to_s is far from being a Hash). The absence of :symbol.to_str reflects this: a symbol is NOT and MUST NOT be confused with a string in Ruby, because they serve totally different purposes.
You wouldn't think about adding to_str to an Integer, right? Yet an Integer has a lot in common with a symbol: each one of them is unique. When you have a symbol, you expect it to be unique and immutable as well.
You don't have to implicitly convert it, right? Because doing something like this will automatically coerce it to a string:
"#{:something}/something_else" # => "something/something_else"
The negative is what you say: at one point, anyway, some core Ruby had different behavior based on symbol vs. string. I don't know if that's still the case. The threat alone makes me a little twitchy, but I don't have a solid technical reason at this point. I guess the thought of making a symbol more string-like just makes me nervous.