
20 Jul 2009, 5:29 a.m.
GHC complains that u is out of scope in the following definition:

    f n = let u = [1..n] in g n
      where g x = u

Why? I understand that it's "just the way Haskell works", but I'm wondering about the language designers' rationale for this scoping behavior.
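
(For reference, a variant that does compile for me, assuming the intent is simply that g ignores its argument and returns [1..n]; the Int type signature and main are only there to make the snippet self-contained.)

    -- Binding u in the same where clause as g puts it in scope for g.
    f :: Int -> [Int]
    f n = g n
      where
        u   = [1 .. n]
        g _ = u

    -- Binding both u and g in the same let also compiles.
    f' :: Int -> [Int]
    f' n =
      let u = [1 .. n]
          g _ = u
      in g n

    main :: IO ()
    main = print (f 5)   -- prints [1,2,3,4,5]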