
Hello peterv,

Friday, July 13, 2007, 10:00:48 PM, you wrote:

You still have to choose between a strict algorithm, which GHC can compile to non-lazy code, and a lazy algorithm which, as you believe, should bring some other benefits :) In practice, for rather large algorithms, strictness analysis doesn't fully work (some parts of your code remain non-strict) and you get this orders-of-magnitude penalty. Look, for example, at http://www.cse.unsw.edu.au/~chak/papers/afp-arrays.ps.gz or at the ByteString paper, which emphasizes the same problem (and it was the reason for implementing strict ByteStrings).
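To illustrate the point about strict vs. lazy accumulation, here is a minimal sketch (the function names `sumLazy`/`sumStrict` are my own, for illustration): the lazy `foldl` builds a chain of unevaluated thunks that is only collapsed at the end, while `foldl'` forces the accumulator at every step and runs in constant space.

```haskell
import Data.List (foldl')

-- Lazy foldl: builds (((0+1)+2)+3)+... as unevaluated thunks,
-- which live on the heap until the final result is demanded.
sumLazy :: [Int] -> Int
sumLazy = foldl (+) 0

-- foldl' evaluates the accumulator at each step, so the running
-- total is always a plain Int and no thunk chain accumulates.
sumStrict :: [Int] -> Int
sumStrict = foldl' (+) 0

main :: IO ()
main = print (sumStrict [1 .. 1000000])  -- 500000500000
```

Both give the same answer on small inputs; the difference only shows as memory behaviour on large ones, which is exactly the "penalty" being discussed.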
Yes, but doesn't GHC have a good "strictness analyzer" (or whatever it is called)? I haven't looked at the generated assembly code yet (if it is readable at all; but good C/C++ compilers *do* generate reasonably readable assembly code).
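GHC does have a strictness (demand) analyser, and it often rescues simple accumulators on its own. A typical case where it can fail to help is an accumulator hidden inside a lazy constructor such as a pair. A hedged sketch (the names `meanLazy`/`meanStrict` are mine, not from the thread):

```haskell
{-# LANGUAGE BangPatterns #-}

-- Accumulating inside a lazy pair: thunks for s' + y and n' + 1 can
-- pile up inside (,) because nothing demands them until the very end.
meanLazy :: [Double] -> Double
meanLazy xs = s / fromIntegral n
  where
    (s, n) = go (0, 0 :: Int) xs
    go acc []          = acc
    go (s', n') (y:ys) = go (s' + y, n' + 1) ys

-- Bang patterns force both accumulators at every step, making the
-- strictness explicit instead of hoping the analyser infers it.
meanStrict :: [Double] -> Double
meanStrict = go 0 0
  where
    go !s !n []     = s / fromIntegral n
    go !s !n (y:ys) = go (s + y) (n + 1) ys

main :: IO ()
main = print (meanStrict [1, 2, 3, 4])  -- 2.5
```

Whether the analyser sees through the pair depends on the optimisation level and the surrounding code, which is why large programs can still end up with non-strict parts, as Bulat says.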
-----Original Message-----
From: Bulat Ziganshin [mailto:bulat.ziganshin@gmail.com]
Sent: Friday, July 13, 2007 6:43 PM
To: peterv
Cc: 'Lukas Mai'; haskell-cafe@haskell.org
Subject: Re[2]: [Haskell-cafe] Newbie question about tuples
Hello peterv,
Friday, July 13, 2007, 5:03:00 PM, you wrote:
think the latest compilers are much better). Now, when implementing something like this in Haskell, I would guess that its laziness would allow one to "interleave" many of the math operations, reordering them to be as optimal as possible and removing many intermediate results (like processing streams).
don't forget that laziness by itself makes programs orders of magnitude slower :)
-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com