
On Wed, 4 Jun 2008, Luke Palmer wrote:
On Wed, Jun 4, 2008 at 9:48 AM, Loup Vaillant wrote:
I see a problem with this particular fusion, though: it changes the space complexity of the program from linear to constant. With some programs, then, relying on this "optimization" can be a matter of correctness, not just performance. So if it is not clear when the compiler will apply it, I'd rather not use it. (A counter-example is tail calls, which are rather easily deducible, at least in a strict context.)
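To illustrate the tail-call counter-example: a strict accumulator loop runs in constant space for a reason the programmer can verify locally, without depending on a fusion pass firing. A minimal sketch (the name sumAcc is hypothetical, not from the thread):

```haskell
-- A strict, tail-recursive sum.  Forcing the accumulator at
-- each step with `seq` keeps the loop in constant space;
-- nothing here relies on the compiler fusing two traversals.
sumAcc :: Num a => [a] -> a
sumAcc = go 0
  where
    go acc []     = acc
    go acc (x:xs) = let acc' = acc + x
                    in acc' `seq` go acc' xs
```

The space behaviour is deducible from the definition alone, which is the property Loup asks for.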
Haskell is not required to be lazy, only non-strict. That is, an implementation is free to re-evaluate expressions bound in let clauses, and doing so can change the time and space complexity of a program.
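A small sketch of the sharing point, using Debug.Trace to make evaluation visible (an assumption about observable behaviour: GHC's call-by-need evaluation prints the trace message once, whereas a non-strict implementation that re-evaluated the let binding, as the Report permits, would compute 2 + 2 twice):

```haskell
import Debug.Trace (trace)

-- A let-bound value used twice.  Whether the binding is
-- computed once (call-by-need) or twice (call-by-name) is an
-- implementation choice; only the result 8 is guaranteed.
shared :: Int
shared = let y = trace "evaluating y" (2 + 2) in y + y
```

Either evaluation strategy yields 8; they differ only in work done, which is exactly why complexity is not pinned down by the language.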
To me, time and space complexity is a matter of performance, not correctness: given unbounded time and space, you will still arrive at the same result whatever the complexity class. And what makes the asymptotic class more blessed than the associated constants?
Is it possible to extend the garbage collector so that it not only checks whether references to a piece of data exist, but also tries to eliminate those references by further evaluation? Consider again

    mean xs = sum xs / fromIntegral (length xs)

If 'sum' forces construction of xs, the unevaluated 'length' thunk still points to the first element of xs. Could the garbage collector start counting on behalf of 'length', in order to free the first elements of xs?
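What that GC extension would achieve, the programmer can already do by hand: traverse the list once, accumulating sum and length together, so no second traversal retains the head of the list. A sketch (the name mean1 and the strictness details are mine, not from the thread; BangPatterns forces both accumulator components at each step):

```haskell
{-# LANGUAGE BangPatterns #-}
import Data.List (foldl')

-- One-pass mean: a single foldl' accumulates the running sum
-- and count, so each cons cell can be collected as soon as it
-- has been consumed.
mean1 :: [Double] -> Double
mean1 xs = s / fromIntegral n
  where
    (s, n) = foldl' step (0, 0 :: Int) xs
    step (!s', !n') x = (s' + x, n' + 1)
```

This is the same transformation the hoped-for fusion or counting collector would perform automatically, written out explicitly.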