
jerzy.karczmarczuk@info.unicaen.fr wrote:
> apparently Clean has better handling of strictness issues [saying at the
> same time that he/she doesn't use Clean...]
Uhm... well... does it? From what I've heard, Clean has the same mechanism as Haskell, namely the 'seq' primitive; Clean just adds some syntactic sugar to make functions strict in some arguments. If that's the only difference, I'm quite happy with the False-guard idiom (and might be even happier with !-patterns).
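For the record, the three styles look like this side by side. A small sketch (the function names are mine), all three forcing a fold's accumulator so no thunk chain builds up:

```haskell
{-# LANGUAGE BangPatterns #-}

-- 1. Explicit 'seq': force the new accumulator before recursing.
sumSeq :: [Int] -> Int
sumSeq = go 0
  where go acc []     = acc
        go acc (x:xs) = let acc' = acc + x in acc' `seq` go acc' xs

-- 2. The False-guard idiom: the guard forces 'acc' and then fails,
--    so evaluation falls through to the next clause.
sumGuard :: [Int] -> Int
sumGuard = go 0
  where go acc _ | acc `seq` False = undefined
        go acc []     = acc
        go acc (x:xs) = go (acc + x) xs

-- 3. Bang patterns (a GHC extension): '!acc' is forced on every call.
sumBang :: [Int] -> Int
sumBang = go 0
  where go !acc []     = acc
        go !acc (x:xs) = go (acc + x) xs

main :: IO ()
main = print (sumSeq [1..100], sumGuard [1..100], sumBang [1..100])
-- prints (5050,5050,5050)
```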
> And here, apparently, I am one of the rare people (I am not proud of it,
> rather quite sad) who defend laziness as an *algorithmisation tool*, which
> makes it easy and elegant to construct co-recursive code: circular
> programs, run-away infinite streams, hidden backtracking, etc.
And don't forget the Most Unreliable Method to Compute Pi! That would be plain impossible without lazy evaluation. (Nice blend of humor and insight in that paper, by the way.)
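That co-recursive style takes only a couple of lines to show. A sketch (the names are mine), with two run-away streams each defined in terms of itself, which only makes sense under lazy evaluation:

```haskell
-- The naturals, defined circularly: the list refers to itself.
nats :: [Integer]
nats = 0 : map (+1) nats

-- The Fibonacci numbers: the stream is zipped with its own tail.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

main :: IO ()
main = print (take 10 fibs)
-- prints [0,1,1,2,3,5,8,13,21,34]
```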
> In this context, I found Clean more helpful than Haskell, for ONE reason:
> Clean has a primitive datatype of unboxed, spine-lazy but head-strict
> lists.
If I understand correctly, you'd get the same in GHC by defining

	data IntList = Nil | Cons Int# IntList

though that is monomorphic, and you'd get the same semantics from

	data List a = Nil | Cons !a (List a)

Now it is polymorphic, and the element may even get unpacked.

Udo.
-- 
If your life was a horse, you'd have to shoot it.
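A quick sketch of what that strict-element list behaves like (the helper names are mine): the bang on the element makes every cons head-strict, while the tail stays lazy, so infinite lists still work.

```haskell
-- Head-strict, spine-lazy list: evaluating a Cons to WHNF forces
-- the element, so (Cons undefined Nil) diverges when matched on,
-- unlike (undefined : []). The tail is left lazy.
data List a = Nil | Cons !a (List a)

-- Convert back to an ordinary list, taking at most n elements.
takeL :: Int -> List a -> [a]
takeL 0 _           = []
takeL _ Nil         = []
takeL n (Cons x xs) = x : takeL (n - 1) xs

-- Spine laziness in action: an infinite head-strict list is fine.
ones :: List Int
ones = Cons 1 ones

main :: IO ()
main = print (takeL 5 ones)
-- prints [1,1,1,1,1]
```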