
29 Jun 2005, 11:47 a.m.
Thomas Davie
Okay, I see that, now I'm slightly intrigued -- if, as before, we have

    Just foo = undefined

Doesn't that have 2 effects:

    foo = undefined
    undefined :: Maybe a

and thus cause a type error (i.e. undefined has become too specific)?
No, 'foo' does not literally gain the value of 'undefined'. In fact, foo gets no value at all, because the computation diverges before the pattern can be matched. /Semantically/ divergence is equivalent to 'undefined', but syntactically, 'foo' and 'undefined' are separate bindings, and their types are therefore not constrained to be equal.

Regards,
    Malcolm
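[A small, self-contained sketch (mine, not from the thread) illustrating the point: the lazy pattern binding compiles without complaint, and the divergence only surfaces when 'foo' is actually forced. The 'Maybe Int' annotation and the exception handling around 'evaluate' are additions for the demonstration, not part of the original example.]

    import Control.Exception (SomeException, evaluate, try)

    -- The binding from the question, annotated at Maybe Int for the demo.
    -- No matching happens here; 'foo' is just a thunk.
    Just foo = undefined :: Maybe Int

    main :: IO ()
    main = do
      putStrLn "Program started; the binding itself raised no error."
      -- Forcing 'foo' finally matches the pattern against undefined,
      -- so the divergence (an ErrorCall exception in GHC) shows up here.
      r <- try (evaluate foo) :: IO (Either SomeException Int)
      case r of
        Left e  -> putStrLn ("Forcing foo diverged: " ++ show e)
        Right v -> print v

With GHC this type-checks, prints the first line, and only then reports the error from forcing foo -- there is no type error and no failure at startup, which is the behaviour Malcolm describes.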