
22 Oct 2007, 9:27 a.m.
Hi,

I'm struggling to get my HXT-based parser to handle a largish file (<300MB), even after breaking it into reasonably-sized chunks. The culprit appears to be parsing one element comprising 25K lines of text, which apparently requires more memory than the 2GB my computer is equipped with.

I'm wondering what approach others use for non-toy XML data. Is the problem due to some error I have made? Should I just ignore the XML and parse it "manually" by dissecting bytestrings, or would another XML library serve better?

-k

--
If I haven't seen further, it is by standing in the footprints of giants
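For what it's worth, here is a minimal sketch of the "manual dissection" fallback mentioned above: lazily scanning the input and pulling out element bodies without ever building a DOM, so memory use stays proportional to one element rather than the whole file. The element name `<entry>` and the helper names are illustrative assumptions, not from my actual data; a real version would work over lazy ByteStrings and handle attributes and nesting.

```haskell
-- Sketch only: stream out the bodies of <entry>...</entry> spans lazily.
-- Assumes flat, non-nested <entry> elements with no attributes.
import Data.List (isPrefixOf)

-- Drop everything up to and including the first occurrence of pat.
dropThrough :: String -> String -> Maybe String
dropThrough pat = go
  where
    go [] = Nothing
    go s@(_:rest)
      | pat `isPrefixOf` s = Just (drop (length pat) s)
      | otherwise          = go rest

-- Take characters until pat begins; also return the remainder after pat.
takeUntil :: String -> String -> (String, String)
takeUntil pat = go
  where
    go [] = ([], [])
    go s@(c:rest)
      | pat `isPrefixOf` s = ([], drop (length pat) s)
      | otherwise          = let (a, b) = go rest in (c : a, b)

-- Lazily collect the body of every <entry>...</entry> span.
extractEntries :: String -> [String]
extractEntries s = case dropThrough "<entry>" s of
  Nothing   -> []
  Just rest -> let (body, after) = takeUntil "</entry>" rest
               in body : extractEntries after
```

Because `extractEntries` only forces as much of the input as it consumes, feeding it the output of `readFile` (or a lazy ByteString variant) processes the file in constant space — the property HXT's in-memory tree loses on a 25K-line element.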