
Greg Buchholz wrote:
> I wasn't too worried about the energy function, since it only gets called
> twice, whereas "advance" gets called potentially millions of times (and
> must be the source of the space leak).
After running a few examples myself, it seems clear that the peak of the profile comes very early in the run, and then the space falls linearly to the end of the computation. A "construction" profile reveals that there is no single type of heap node responsible, but rather a mixture of list conses and unevaluated closures. Also, the initial peak seems to grow at least linearly with the number of iterations requested.

I conclude that the program is initially (and quickly) building a vast data structure filled with closures, and subsequently reducing this "latent" computation down to the single result value. The biographical profile confirms that the vast majority of the heap is 'lag': it is created too early, long before it is used. Given this observation, the initial producer of the peak is the iterations of 'advance', and the consumer of the peak is 'energy'.

Your target, therefore, is to structure the computation's use of data in a more linear fashion: produce some small portion of the data structure, partially reduce it, and only then continue to lazily produce some more of it.

So, here is a suggested improvement. It does more work than the original, and takes more time, but it totally eliminates the space leak:

    main = do ....
              let final = forceIterate iter (energy n) (advance 0.01) bodies
              putStrLn $ showFFloat (Just 9) final ""

    forceIterate :: Int -> (a -> b) -> (a -> a) -> a -> b
    forceIterate 0 reduce f x = reduce x
    forceIterate n reduce f x = let r = f x in
                                reduce r `seq` forceIterate (n-1) reduce f r

Perhaps you can be thus inspired to discover a more minimal way of forcing the necessary parts of the intermediate computation than calling 'energy' itself.

Regards,
    Malcolm
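
P.S. One possible direction, sketched below purely as an illustration: it assumes the state is a list of some Body record (the type and field names here are made up, not taken from your program). Walking the list and seq'ing each numeric field is O(n) per step, rather than the O(n^2) of 'energy', but should create just as much demand and so flatten the pending thunks equally well.

    -- Hypothetical Body type, standing in for whatever the real program uses.
    data Body = Body { pos  :: (Double, Double, Double)
                     , vel  :: (Double, Double, Double)
                     , mass :: Double }

    -- Reduce every field of every body to weak head normal form.
    -- Cheap (one pass over the list) compared with recomputing 'energy'.
    forceBodies :: [Body] -> ()
    forceBodies = foldr forceBody ()
      where forceBody (Body (x,y,z) (vx,vy,vz) m) k =
                x `seq` y `seq` z `seq` vx `seq` vy `seq` vz `seq` m `seq` k

    -- Variant of forceIterate that forces the state directly instead of
    -- calling the reducing function on every intermediate step.
    forceIterate' :: Int -> ([Body] -> b) -> ([Body] -> [Body]) -> [Body] -> b
    forceIterate' 0 reduce _ x = reduce x
    forceIterate' n reduce f x = let r = f x
                                 in forceBodies r `seq` forceIterate' (n-1) reduce f r

The same effect could be had by making the Body fields strict in the data declaration, so that 'advance' never builds the thunks in the first place.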