
Hi,
On Thu, 10 Feb 2005 16:18:19 -0800, Iavor Diatchki wrote:
> > because I don't like the current situation with (n+k)-patterns:
> > Everybody says they're evil, but hardly anybody can explain why he
> > thinks so.
> I think 'evil' may be a little too strong. I think the usual argument
> against 'n+k' patterns is that:
> i) they are a very special case, and may be confusing as they make it
> look as if '+' was a constructor, which it is not

agreed
> ii) they lead to some weird syntactic complications, e.g.
>   x + 3 = 5
> defines a function called '+', while
>   (x + 3) = 5
> defines a variable 'x' to be equal to 2. and there is other weirdness
> like:
>   x + 2 : xs = ...
> does this define '+' or ('x' and 'xs')? i think it is '+'.

IMO, that's not a big problem, because if ambiguities arise, only one
of the possible meanings will compile (e.g. if you use + somewhere else
in the module, ghc will complain about an ambiguous occurrence of `+').
All (rather strange) other cases are caught by ghc -Wall.
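To make the two readings concrete, here is a small sketch (the use of
the NPlusKPatterns pragma, which modern GHC needs for what Haskell 98
enabled by default, is mine):

  {-# LANGUAGE NPlusKPatterns #-}

  -- Function definition: this would introduce a new operator (+) whose
  -- second argument only matches the literal 3; any use of + elsewhere
  -- in the module would then be ambiguous with Prelude.(+).
  -- x + 3 = 5

  -- Pattern binding: the parentheses force the (n+k) reading, so this
  -- binds x to 5 - 3 = 2.
  (x + 3) = 5

  main :: IO ()
  main = print x   -- prints 2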
I found another disadvantage:
iii) As a side effect of how n+k patterns work, each instance of the
Num class must also be an instance of Eq, which of course doesn't make
sense for all numeric types.
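For reference, the Haskell 98 translation of an (n+k) match is just a
comparison plus a subtraction, which is where the class constraints
sneak in; a rough sketch (the function name is made up):

  -- With (n+k) patterns one would write
  --   sub2 (n + 2) = n
  --   sub2 _       = 0
  -- which desugars roughly into
  sub2 :: (Ord a, Num a) => a -> a
  sub2 v | v >= 2    = v - 2   -- the (>=) test needs Ord; matching plain
         | otherwise = 0      -- numeric literals needs (==), i.e. Eq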
> anyways when used as intended 'n+k' are cute. it is not clear if the
> complications in the language specification and implementations are
> worth the trouble though.

It's true that their functionality can be easily expressed without them.
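For instance, the classic

  fac 0       = 1
  fac (n + 1) = (n + 1) * fac n

goes through just as well with a guard (a throwaway sketch):

  fac :: Integer -> Integer
  fac 0         = 1
  fac n | n > 0 = n * fac (n - 1)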
I like to see them (well, n+1 patterns) as a special case of views,
because they allow numbers to be matched against something that is not
a constructor and involve a computation during pattern matching. An
unambiguous replacement using views could look somewhat like
  foo Zero     = 1
  foo (Succ n) = 2 * foo n
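With GHC's ViewPatterns extension (which appeared only later), the same
idea can be spelled out explicitly; the Nat type and the 'view' function
below are assumptions of mine:

  {-# LANGUAGE ViewPatterns #-}

  data Nat = Zero | Succ Integer

  -- decompose a non-negative Integer as Zero or Succ of its predecessor
  view :: Integer -> Nat
  view n
    | n == 0    = Zero
    | n > 0     = Succ (n - 1)
    | otherwise = error "view: negative number"

  foo :: Integer -> Integer
  foo (view -> Zero)   = 1
  foo (view -> Succ n) = 2 * foo n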
Thomas