
Young Hyun wrote:
Here's what the program does. It reads a binary file containing a large number of varying-length structured records. [...]
Unless I'm badly mistaken, nothing about fixArts suggests that it would leak memory. So, it would appear parseArts is somehow to blame for the memory leak, but this is where it gets weird. If I just slightly change fixArts (see below), then I no longer get a memory leak (the process size stays at just over 3MB):
fixArts ((Right x):xs) = do hPutArts stderr x
                            fixArts xs
Most likely your parser is not building the small data structures you think it is; in reality it builds large and complicated suspended computations that never run and just take up space. As soon as you demand their result (hPutArts does this), they run and free up the space. GHC itself cannot help you much. Maybe it *could* determine that all these results are really demanded eventually and that it won't hurt to compute them earlier, but in practice this is too hard to do. You'll need to give some hints. You should change the fields in your record to strict types (!Integer instead of Integer), and you should force evaluation of these structures by placing a `seq' at the appropriate place. Unfortunately, finding this place is not easy. Probably the best way is to simulate lazy evaluation in your head and see where it goes wrong.
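To make the first hint concrete, here is a rough sketch with made-up field names (the real Arts record isn't shown in this thread):

data Arts = Arts
    { artsId   :: !Integer   -- strict fields: evaluated as soon as the constructor is,
    , artsSize :: !Integer   -- instead of being stored as unevaluated thunks
    }

With strict fields, evaluating an Arts value to weak head normal form is enough to force the Integers inside it; without them, each field can still hide an arbitrarily large suspended computation.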
parseArts (x:xs) ... = (createArts x) : parseArts xs
In this code the result of createArts is not demanded and is simply put into a potentially large data structure. It might help to change it to the following:

parseArts (x:xs) ... = ((:) $! createArts x) (parseArts xs)

Udo.

-- 
in the middle of a discussion about the "Evil Mangler" in GHC 5.04:
<shapr> the evil mangler uses *perl* ??
<ChilliX> yes...
<ChilliX> it is Evil after all
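Spelled out with the made-up Arts type from the sketch above (createArts here is just a stand-in for the real conversion), the strict cons looks like this; $! forces createArts x to weak head normal form before the cons cell is handed back, and the strict fields then force the rest:

createArts :: [Integer] -> Arts              -- hypothetical stand-in for the real record builder
createArts ws = Arts (fromIntegral (length ws)) (sum ws)

parseArts :: [[Integer]] -> [Arts]
parseArts []     = []
parseArts (x:xs) = ((:) $! createArts x) (parseArts xs)   -- force each record as the list is consumed

Equivalently, writing it as let a = createArts x in a `seq` (a : parseArts xs) puts the `seq' in the same place.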