
On 11/2/06, Chad Scherrer wrote:
Lemmih wrote:
Using 'seq' is generally a bad idea. It can worsen performance if not used carefully, and GHC's strictness analyser is usually good enough.
Is GHC.Conc.pseq any better? Usually the whole point of making things more strict is to optimize performance for pieces you know will be evaluated anyway. It's frustrating if there's not a consistent way to do this that works well.
pseq is just as bad. The problem is excessive use of strictness annotations in the hope of a magical performance improvement. Strictness annotations should be used with care and only placed where they're needed.
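A toy illustration of that failure mode (the names and numbers here are mine, not from the thread): a careless `seq` on a value that is only sometimes needed makes every call pay for it.

```haskell
-- 'pickLazy' builds 'heavy' as a thunk; it is evaluated only when demanded.
pickLazy :: Bool -> Int -> Int
pickLazy flag n =
  let heavy = sum [1 .. n]          -- expensive thunk, built but not forced
  in if flag then heavy else 0      -- lazy: the sum runs only when flag is True

-- 'pickStrict' adds a seq "for performance" and always pays for the sum,
-- even on the branch that throws the result away.
pickStrict :: Bool -> Int -> Int
pickStrict flag n =
  let heavy = sum [1 .. n]
  in heavy `seq` (if flag then heavy else 0)
```

Both functions return the same answers, so the extra cost of `pickStrict flag False` is invisible in testing and only shows up as wasted work.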
Lately, I've been using lots of strictness annotations and bang patterns - are there non-obvious places this could slow things down?
In the case of the MersenneTwister, forcing 'y4' from 'next32' would evaluate it almost 10,000,000 times more than needed.
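I don't have the MersenneTwister source in front of me, but the shape of the problem can be sketched (the names below are stand-ins, not the real 'next32'): when a consumer demands only a small fraction of a lazily produced sequence, forcing every intermediate value multiplies the work by the number of values that were never needed.

```haskell
-- Hypothetical stand-in for per-step intermediate values: each element is
-- an expensive thunk the consumer may never look at.
values :: [Int]
values = [ sum [1 .. k] | k <- [1 .. 1000] ]

-- Lazy consumer: forces exactly the one element it asks for.
oneValue :: Int
oneValue = values !! 999

-- Over-strict consumer: seq'ing every element pays for all 1000 of them,
-- for the same final answer.
allForced :: Int
allForced = foldr seq (values !! 999) values
```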
Would it be possible for the type system to distinguish at compile time whether something would need to be evaluated, and optimize away redundant `seq`s? Maybe this is what the strictness analyzer does already.
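The strictness (demand) analyser does something close to this already. In a loop like the one below, GHC at -O can typically see that the accumulator is always demanded and compile it strictly, so a hand-written `acc `seq` ...` would usually be redundant (this example is mine, for illustration):

```haskell
-- The accumulator is used on every path, so GHC's demand analysis can
-- normally make it strict without any annotation.
sumTo :: Int -> Int -> Int
sumTo acc 0 = acc
sumTo acc n = sumTo (acc + n) (n - 1)
```

Without optimisation the analyser doesn't run, which is one reason manual annotations still get reached for.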
Evaluating the seqs themselves isn't costly, afaik.

--
Cheers,
Lemmih