RE: Two Questions: 'memory consumption' and '-pgmL'

First of all, let me thank you for writing and maintaining this excellent compiler! I have been using it a lot recently, and I couldn't be happier with it. Thanks! :-)
I wouldn't be posting here, though, if I didn't have a couple of questions ... So here I go:
(1) Using the DtdToHaskell tool, I converted the DocBook XML DTD to Haskell code. The resulting parser is amazing: it is almost 4 megabytes in size, 72,800 lines of code. When I tried to compile this beast, I promptly ran out of memory.
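For reference, the conversion step looks roughly like this (the file and module names here are illustrative, not the exact ones I used):

    $ DtdToHaskell docbook.dtd DocBook.hs    # HaXml's DtdToHaskell: DTD in, Haskell module out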
I fiddled with the RTS options to no avail. At some point GHC was consuming more than 800 megabytes of RAM, which resulted in serious thrashing (my machine has only 512 MB), and eventually the process was terminated.
Does anyone have any ideas how I could tackle this problem? Can I reduce GHC's memory requirements somehow? Can I split the module up and compile it in parts? In different phases? Anything? (If nothing comes up, I guess buying more RAM is the answer.)
You can try fiddling with GHC's GC settings to reduce memory consumption. The section of the User's Guide on runtime flags has some hints; I would try -c first (turn on the compacting collector). Adding more generations (e.g. -G3) might help, and setting a maximum heap size (e.g. -M512m) will cause GHC to try to trim down its memory use when it gets close to this boundary. Remember to surround any RTS options with +RTS ... -RTS.
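Putting those together, a compile command would look something like this (the module name is just a placeholder):

    $ ghc -c DocBook.hs +RTS -c -G3 -M512m -RTS

Note that the first -c is a GHC flag (compile only, don't link), while the -c inside +RTS ... -RTS is the RTS flag that enables the compacting collector.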
(2) According to the documentation, GHC allows for setting the literate pre-processor to be used when compiling an .lhs file. This is supposed to occur with the '-pgmL' option, but no matter what I try, GHC always tells me:
| ghc-5.04.2: unrecognised flags: -pgmL
| Usage: For basic information, try the `--help' option.
Am I doing something wrong or is the documentation out of synch with the implementation?
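For concreteness, the invocation the documentation suggests should work would be something like this (the pre-processor name is made up):

    $ ghc -pgmL my-unlit -c Program.lhs    # use 'my-unlit' instead of the default unlit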
Hmm, good point. This option is missing from the implementation; thanks for pointing it out.

Cheers,
Simon

Simon Marlow writes:
| I would try -c first (turn on the compacting collector). Adding more generations (e.g. -G3) might help, and setting a maximum heap size (e.g. -M512m) will cause GHC to try to trim down its memory use when it gets close to this boundary.
Unfortunately, neither of those helped. It appears that GHC simply _needs_ that amount of memory: no matter which options I gave, at some point it hit the 800 MB limit and aborted. (I specified -M800m because if it used more memory than that, the machine basically stood still, thrashing.) Looks like I'll have to support the memory chip industry ...

The problem is that if a 512 MB machine cannot compile it, I wonder how the _users_ of my program will get along. I guess they'll tend to have smaller machines than the average software developer does.

-peter