Why do assignments involving Swift optionals type check? For example, in
var foo : Int? = 0
foo = foo!
foo and foo! do not have the same type, so why is this assignment allowed?
After reading rickster's answer, I came up with a simple layman's-terms answer. To me, the whole gist of his answer is:
Since an optional indicates the presence or absence of a value, you shouldn't have to do anything special to indicate the presence of a value other than provide one
An optional is an enum with two cases; call them a and b.
              String?
                 |
       an enum with 2 cases
          /            \
         a              b
         |              |
        Set          notSet
         |              |
  any value like "hi"  nil
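In real Swift, those two cases have names. Here's a simplified sketch of how the standard library declares Optional (the actual declaration has a few more members, but the two cases are exactly these):

// Simplified sketch of the standard library's Optional declaration.
// .none is the notSet case (what you spell as nil in source code);
// .some is the set case, carrying any value like "hi".
enum Optional<Wrapped> {
    case none
    case some(Wrapped)
}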
So when you assign to an optional, the value you provide is one of these: nil (case b), an actual value like "hi" (case a), or another optional, which simply copies whichever case that optional currently holds:
var str : String?
var anotherOptional : String?
str = nil // nil <-- this is like case b
str = "hi" // "hi" <-- this is like case a
str = anotherOptional // nil <-- copying another optional: it holds nil, so this is case b
anotherOptional = "hi again"
str = anotherOptional // "hi again" <-- copying again: now it holds a value, so this is case a
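If you want to see the two cases directly, you can peel the sugar back. A small sketch, using the real case names .none and .some in place of the a/b labels above:

var str2: String? = .none   // the desugared spelling of str2 = nil
str2 = .some("hi")          // the desugared spelling of str2 = "hi"

// Pattern matching reveals which case the optional is in:
switch str2 {
case .some(let value):
    print("set (case a): \(value)")
case .none:
    print("notSet (case b)")
}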
This is part of the syntactic sugar behind optionals. Assigning a non-optional value is how you wrap it in the optional type.
Since an optional indicates the presence or absence of a value, you shouldn't have to do anything special to indicate the presence of a value other than provide one. For example, in a function:
import Foundation  // for arc4random_uniform

func gimmeSomethingMaybe() -> String? {
    if arc4random_uniform(10) > 7 {
        return "something"
    } else {
        return nil
    }
}
Imagine if every time you wanted to return a real value from a function that's capable of returning nil, you had to write return Optional(value). That'd get old pretty fast, right? Optionals are an important feature of the language; even though they're actually implemented by the standard library, the syntactic sugar / automatic wrapping is there to keep them from being tedious to use.
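To make that concrete, here's a sketch of what the function above would look like without the automatic wrapping (the Verbose name is mine, just for contrast):

import Foundation  // for arc4random_uniform

// Hypothetical "no sugar" version: the explicit Optional(...) wrap on the
// success path is exactly what the language spares you from writing.
func gimmeSomethingMaybeVerbose() -> String? {
    if arc4random_uniform(10) > 7 {
        return Optional("something")
    } else {
        return Optional.none   // even nil is shorthand for this case
    }
}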
Edit: just to go a bit further into this... the sugar also helps to enforce the notion that a real value should not be optional. For example:
let one = 1
one? // error (in Swift 1.2, allowed but meaningless in Swift 1.1)
"two"? // error (ditto)
You can create an optional wrapping a real value with the Optional(one) initializer, but that has little semantic meaning on its own, so you almost never need to.
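For instance, both of these produce the same Int?; the initializer just spells out what the sugar already does on assignment:

let one = 1
let explicit = Optional(one)   // Int?, wrapped by hand; rarely needed
let sugared: Int? = one        // Int?, wrapped automatically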
Optionals should come into play when there's "mystery" as to whether a value is present or absent; that is, when whether one part of a program receives a value (or no value) depends on state unknown to that part of the program. If you know you have a real value, there's no mystery. Instead, you let the unknown come into play at the boundary between the code that knows the value and the code that doesn't: the function, method, or property definition that hands the value off to somewhere else.
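As a sketch of that boundary idea (the function and its data are made up for illustration): inside lookupAge, whether a value exists is genuinely unknown, so it returns Int?, and the caller resolves the mystery with if let:

// Hypothetical example: the dictionary stands in for state the caller can't see.
func lookupAge(forName name: String) -> Int? {
    let ages = ["Alice": 29]
    return ages[name]   // dictionary subscript already returns Int?
}

if let age = lookupAge(forName: "Alice") {
    print("Known age: \(age)")   // inside this branch, age is a plain Int
} else {
    print("No age on record")
}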