
Luke Palmer wrote:
> This does not answer your question, but you can solve this problem
> without rewrite rules by having length return a lazy natural:
>
>     data Nat = Zero | Succ Nat
>
> And defining lazy comparison operators on it.
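For concreteness, here is a minimal sketch of that approach; lazyLength, toNat, and cmpNat are illustrative names of my own, not from the message above:

    data Nat = Zero | Succ Nat

    -- The length as a lazy Peano numeral: one Succ per cons cell,
    -- produced on demand, so an infinite list gives an infinite Nat.
    lazyLength :: [a] -> Nat
    lazyLength = foldr (\_ n -> Succ n) Zero

    -- Embed a non-negative Int so it can be compared against a length.
    toNat :: Int -> Nat
    toNat 0 = Zero
    toNat n = Succ (toNat (n - 1))

    -- Lazy comparison: it forces no more constructors than the smaller
    -- argument has, so it terminates even when the other argument is
    -- the length of an infinite list.
    cmpNat :: Nat -> Nat -> Ordering
    cmpNat Zero     Zero     = EQ
    cmpNat Zero     (Succ _) = LT
    cmpNat (Succ _) Zero     = GT
    cmpNat (Succ m) (Succ n) = cmpNat m n

With these definitions, cmpNat (lazyLength [1..]) (toNat 10) returns GT after forcing only eleven cons cells.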
And if you want to go that route, then see Data.List.Extras.LazyLength from list-extras[1]. Peano integers are quite inefficient, but this library does the same transform efficiently.

[1] http://hackage.haskell.org/cgi-bin/hackage-scripts/package/list-extras
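The efficient version of the trick needs no Peano numerals at all: walk the list while counting an Int bound down, and stop as soon as the answer is determined. A rough sketch of the idea (compareLength is an illustrative name of mine, not necessarily the library's interface; the library's explicit entry points are lengthBound and lengthCompare):

    -- Compare the length of a list against an Int bound, forcing at
    -- most n+1 cons cells and allocating no intermediate numerals.
    compareLength :: [a] -> Int -> Ordering
    compareLength _      n | n < 0 = GT
    compareLength []     n         = compare 0 n
    compareLength (_:xs) n         = compareLength xs (n - 1)

So compareLength [1..] 10 yields GT, and compareLength "ab" 5 yields LT, in constant space.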
> Of course you cannot replace usages of Prelude.length. But I am really
> not in favor of rules which change semantics, even if they only make
> things less strict. My argument is the following. I may come to rely
> on such nonstrictness, as in:
>
>     bad xs = (length xs > 10, length xs > 20)
>
> bad [1..] will return (True, True). However, if I do an obviously
> semantics-preserving refactoring:
>
>     bad xs = (l > 10, l > 20) where l = length xs
>
> my semantics are not preserved: bad [1..] = (_|_, _|_) (unless the
> compiler is clever, in which case my semantics depend on the
> compiler's cleverness, which is even worse).
Data.List.Extras.LazyLength does have rewrite rules to apply the lazy versions in place of Prelude.length where it can. My justification is two-fold. First, for finite lists the semantics are identical but the memory behavior is strictly better. Second, for non-finite lists the termination behavior is strictly better.

It's true that refactoring can disable either point, and in the latter case that can alter semantics. Since the module is explicit about having these rules, I would say that users should remain aware that they are taking advantage of them, or they should use the explicit lengthBound or lengthCompare functions instead.
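RULES match the program text syntactically, which is also why the refactoring above defeats them: once the length is shared through l, the pattern the rule looks for no longer appears for GHC to rewrite. As a rough illustration of the kind of rule involved (not the library's actual rules; it reuses the hypothetical compareLength sketched earlier, and GHC only applies RULES when compiling with optimization):

    -- Rewrite a strict length comparison into the lazy, early-stopping
    -- one. Caveat: `compare` is a class method, and GHC warns that
    -- methods may inline before rules get a chance to fire, so a real
    -- library has to be more careful than this sketch.
    {-# RULES
    "compare/length" forall xs n.
        compare (length xs) n = compareLength xs n
      #-}

--
Live well,
~wren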