
A recent post reminded me of a feature I'd like. For all I know it is already implemented in GHC, so pointers are welcome.
I'd like to be able to dump data structures to disk, and later load them.
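
For concreteness, here is a minimal sketch of the round trip being asked for, using nothing but the Prelude's Show and Read classes. This already works in any Haskell implementation, but the dump is text rather than a compact binary format, which is where a Binary library would improve matters; the file name and data are purely illustrative.

    -- Dump a structure to disk with Show, load it back with Read.
    main :: IO ()
    main = do
      let table = [(1, "one"), (2, "two"), (3, "three")] :: [(Int, String)]
      writeFile "table.dump" (show table)   -- dump the structure to disk
      s <- readFile "table.dump"            -- later: load it back
      print (read s :: [(Int, String)])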
A Binary library was discussed recently on the libraries list. The thread starts here:

    http://www.haskell.org/pipermail/libraries/2002-November/000691.html

It's currently stalled. There are several implementations of Binary: one that comes with NHC and is described in a paper (sorry, don't have a link to hand), a port of this library to GHC by Sven Panne (which suffers from bitrot), a derived/simplified version used in GHC which is heavily hacked for speed, and a further derived version of this library by Hal Daume, who is adapting it to support bit-by-bit serialisation.

I think the outstanding issues are:

 (a) is the API for GHC's Binary library acceptable, or do we need the
     extra bells and whistles that the NHC version has?

 (b) can we make a version of Binary that uses a bit-by-bit rather than
     byte-by-byte serialisation of the data that is as fast (or nearly
     as fast) as the current byte-by-byte implementation?  Perhaps
     performance isn't that important to the majority of people: please
     comment if you have an opinion.

 (c) how do we derive instances of Binary?

IMHO: something is better than nothing, so I'd be in favour of just plugging in the Binary library from GHC and marking it "experimental".

Cheers,
	Simon
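
To make (a) and (c) concrete, below is a small, self-contained sketch of a put/get-style interface. The class and its operations here are hypothetical, chosen only for illustration, and are not the actual API of GHC's or NHC's Binary library; the point is that the hand-written Colour instance is exactly the kind of boilerplate a deriving mechanism for Binary would generate.

    import Data.Word (Word8)

    -- Hypothetical Binary-style class, for illustration only: serialise
    -- to a list of bytes, and deserialise from one, returning the rest.
    class Binary a where
      put :: a -> [Word8]
      get :: [Word8] -> (a, [Word8])

    instance Binary Word8 where
      put b        = [b]
      get (b:rest) = (b, rest)
      get []       = error "get: no input"

    data Colour = Red | Green | Blue deriving Show

    -- Hand-written instance: one tag byte per constructor.  This is the
    -- boilerplate that a deriving mechanism (question (c)) would remove.
    -- Round trip: fst (get (put Green)) gives Green.
    instance Binary Colour where
      put Red   = [0]
      put Green = [1]
      put Blue  = [2]
      get (0:rest) = (Red,   rest)
      get (1:rest) = (Green, rest)
      get (2:rest) = (Blue,  rest)
      get _        = error "get: bad tag for Colour"

A bit-by-bit variant (question (b)) would have the same shape but emit a stream of bits rather than whole bytes, trading compactness against the speed of the current byte-by-byte scheme.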