
On Mon, Mar 03, 2008 at 05:20:09AM +0100, Bertram Felgenhauer wrote:
> > Another story from an (almost) happy Haskell user who finds himself overwhelmed by laziness/space leaks.
> >
> > I'm trying to parse a large file (>600MB) with a single S-expression-like structure. With the help of ByteStrings I'm down to 4 min processing time in constant space. However, when I try to wrap the parse results in a data structure, the heap blows up - even though I never actually inspect the structure being built! This bugs me, so I come here looking for answers.
> The polyparse library (http://www.cs.york.ac.uk/fp/polyparse/) offers some lazy parsers; maybe one of those fits your needs. Text.ParserCombinators.Poly.StateLazy is the obvious candidate.
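
The pattern I'm fighting, boiled down to a toy example (a made-up sketch with invented names, not the code from the paste linked below): the structure being built is threaded along as an accumulator, so it stays live for the whole traversal even though nothing ever looks at it.

{-# LANGUAGE BangPatterns #-}

-- Hypothetical stand-in for the real result type.
data SExpr = Atom Int | List [SExpr]

-- A pass that only counts tokens runs in constant space...
countOnly :: [Int] -> Int
countOnly = go 0
  where
    go !n []     = n
    go !n (_:xs) = go (n + 1) xs

-- ...but as soon as the same pass also collects a structure, the accumulator
-- is live until the end of the input, so peak heap grows with the file even
-- though the caller only ever looks at the count.
countAndCollect :: [Int] -> (Int, SExpr)
countAndCollect = go 0 []
  where
    go !n acc []     = (n, List acc)
    go !n acc (x:xs) = go (n + 1) (Atom x : acc) xs

main :: IO ()
main = print (fst (countAndCollect [1 .. 1000000]))   -- watch residency with +RTS -s

If I understand the suggestion correctly, a lazy parser helps because the structure is produced incrementally, so the consumer can traverse it (and the GC can discard it) as it is built, instead of the parser retaining all of it until the end of the input.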
I have tried both Poly.StateLazy and Poly.State, and they work quite well - at least the space leak is eliminated. Now evaluation of the parser state blows the stack... The code is at http://hpaste.org/6310

Thanks in advance,
--
Krzysztof Kościuszkiewicz
Skype: dr.vee, Gadu: 111851, Jabber: kokr@jabberpl.org
"Simplicity is the ultimate sophistication" -- Leonardo da Vinci
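
P.S. The stack overflow looks like the flip side of the same coin: if the state is built as a chain of lazy updates, nothing gets evaluated during the parse, and forcing the final state has to unwind one thunk per token in one go. A generic sketch of the symptom and of the usual cure - force the state at every step - with plain lists standing in for the real parser (again invented code, not what is in the paste):

{-# LANGUAGE BangPatterns #-}

-- Lazy state updates leave one unevaluated thunk per input element;
-- demanding the final state then unwinds the whole chain at once,
-- which is what overflows the stack on a large input.
lazyState :: [Int] -> Int
lazyState = go 0
  where
    go s []     = s              -- s is (((0+x1)+x2)+...) by now
    go s (x:xs) = go (s + x) xs  -- (s + x) is never forced inside the loop

-- Forcing the state on every update keeps both heap and stack flat.
strictState :: [Int] -> Int
strictState = go 0
  where
    go !s []     = s
    go !s (x:xs) = go (s + x) xs

main :: IO ()
main = print (strictState [1 .. 1000000])  -- for a large enough list, lazyState blows the stack here instead

If that is what is happening in my parser, the fix would be to make each state update strict (seq or a bang pattern on the new state) so the chain never builds up in the first place.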