
On 27 June 2005 19:24, Arjun Guha wrote:
I have an extremely large source file of about 11 MB. It's the all-pairs shortest-paths data for a map of the Hyde Park area of Chicago (no real reason, really). I generated the information in Scheme and printed the result to a Haskell source file as a list. I then edited the file to initialize an array with the data.
GHC, with a 200 MB stack, ran for an hour and used 1.3 GB of memory before being killed by the system. How can I compile something of this size? I need the array of all-pairs shortest paths pre-computed. Any suggestions?
GHC doesn't have a good way to compile large amounts of static data. Either:

  (a) put the data into a static array in a C file, compile it with a C compiler, and access it via the FFI, or
  (b) encode it as a string, and access it using GHC's primitives, or
  (c) read it from a file at runtime.

(Happy and Alex use (b) for encoding parse tables, BTW.)

Cheers,
Simon
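
For reference, a minimal Haskell-side sketch of option (a). The C file, the array name all_pairs, the accessor shortest_path, and the matrix size are illustrative assumptions, not anything from the original thread:

{-# LANGUAGE ForeignFunctionInterface #-}
-- Option (a): the table lives in a C file such as
--
--   /* paths.c (hypothetical) */
--   #define N 1000
--   static const int all_pairs[N * N] = { 0, 3, 7, ... };
--   int shortest_path(int from, int to) { return all_pairs[from * N + to]; }
--
-- which a C compiler handles without trouble; Haskell only calls the accessor.
module Paths (shortestPath) where

import Foreign.C.Types (CInt (..))

-- A pure (non-IO) import is acceptable here because the C function only
-- reads a constant table and has no side effects.
foreign import ccall unsafe "shortest_path"
  c_shortest_path :: CInt -> CInt -> CInt

shortestPath :: Int -> Int -> Int
shortestPath from to =
  fromIntegral (c_shortest_path (fromIntegral from) (fromIntegral to))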
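
A minimal sketch of option (b), the approach Happy and Alex take for their parse tables. The encoding here (one Char code point per entry, suitable only for small non-negative distances) and the 3x3 example data are assumptions for illustration; a real generator would emit a denser encoding:

-- Option (b): GHC compiles one large string literal far more cheaply than
-- an 11 MB list expression, so the generator emits the matrix as a string
-- and the program decodes it on demand.
module PathTable (pathAt) where

import Data.Array (Array, listArray, (!))
import Data.Char (ord)

-- Hypothetical output of the generator: each character's code point holds
-- one shortest-path length (3x3 example matrix).
encodedPaths :: String
encodedPaths = "\0\3\7\3\0\4\7\4\0"

nodes :: Int
nodes = 3

table :: Array (Int, Int) Int
table = listArray ((0, 0), (nodes - 1, nodes - 1)) (map ord encodedPaths)

pathAt :: Int -> Int -> Int
pathAt from to = table ! (from, to)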
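
And a minimal sketch of option (c), reading the table at startup. The file name paths.txt and its format (one whitespace-separated row of Ints per line) are assumptions:

-- Option (c): keep the matrix out of the compiled program entirely and
-- load it from a data file at runtime.
module Main (main) where

import Data.Array (Array, listArray, (!))

loadPaths :: FilePath -> IO (Array (Int, Int) Int)
loadPaths file = do
  rows <- map (map read . words) . lines <$> readFile file
  let n = length rows
  return (listArray ((0, 0), (n - 1, n - 1)) (concat rows))

main :: IO ()
main = do
  paths <- loadPaths "paths.txt"
  print (paths ! (0, 1))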