
Hi,

My program is eating too much memory:

    copyfile source.txt dest.txt +RTS -sstderr
    Reading file...
    Reducing structure...
    Writting file...
    Done in 20.277s

    1,499,778,352 bytes allocated in the heap
    2,299,036,932 bytes copied during GC (scavenged)
    1,522,112,856 bytes copied during GC (not scavenged)
       17,846,272 bytes maximum residency (198 sample(s))

          2860 collections in generation 0 ( 10.37s)
           198 collections in generation 1 (  8.35s)

            50 Mb total memory in use

      INIT  time    0.00s  (  0.00s elapsed)
      MUT   time    1.26s  (  1.54s elapsed)
      GC    time   18.72s  ( 18.74s elapsed)
      EXIT  time    0.00s  (  0.00s elapsed)
      Total time   19.98s  ( 20.28s elapsed)

      %GC time      93.7%  (92.4% elapsed)

      Alloc rate    1,186,898,778 bytes per MUT second

      Productivity   6.3% of total user, 6.2% of total elapsed

source.txt is 800 kB, and I expect files about 100 times larger, say 80 MB, so using -H800M will not help much here.

Profiling with -p says:

    Wed Nov 21 14:23 2007 Time and Allocation Profiling Report (Final)

       copyfile +RTS -p -RTS source.txt dest.txt

    total time  =        4.48 secs   (224 ticks @ 20 ms)
    total alloc = 1,500,359,340 bytes  (excludes profiling overheads)

    COST CENTRE  MODULE    %time  %alloc

    xparse       PdfModel   78.1    95.0

followed by a lot of 0.0 numbers. Here is the faulty xparse:

    xparse uniq borders body bin = do
        parseOperatorEq "xref"
        p <- P.many (parseXRefSection uniq)
        parseOperatorEq "trailer"
        e <- parseDict
        let entries = IntMap.fromList (map (\(a,b,c) -> (a,c)) (concat p))
        return (Body e entries)

where P is Text.ParserCombinators.ReadP adapted to work with [Word8].

How do I find out WHAT is doing all that allocation in this function? (A sketch of what I have in mind is below.)

My files are XML-like in structure. For now the task is simply to read, parse, and write the file back out. I can provide more details for interested souls.

How do I make my program allocate less? (See the ByteString sketch at the end.)
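One idea I have is to add manual SCC cost centres around the individual steps (compiling with -prof; -auto-all already gives one cost centre per top-level function, but manual SCCs can split a single function), so that each step gets its own line in the -p report. A minimal sketch, with cost-centre names I just made up:

    xparse uniq borders body bin = do
        {-# SCC "xparse_xref" #-}    parseOperatorEq "xref"
        p <- {-# SCC "xparse_sections" #-} P.many (parseXRefSection uniq)
        {-# SCC "xparse_trailer" #-} parseOperatorEq "trailer"
        e <- {-# SCC "xparse_dict" #-} parseDict
        -- entries is lazy, so its cost will be charged wherever it
        -- is finally forced, not necessarily here
        let entries = {-# SCC "xparse_entries" #-}
                      IntMap.fromList (map (\(a,b,c) -> (a,c)) (concat p))
        return (Body e entries)

The %alloc column should then point at the guilty step, and running with +RTS -hc -RTS and feeding the resulting .hp file to hp2ps would give a heap profile broken down by the same cost centres.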
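My own suspicion is that the [Word8] representation is a large part of the problem: every byte of input becomes a cons cell of several machine words, and ReadP's backtracking can keep long prefixes of that list alive. Just to illustrate the alternative representation (copyRaw is a made-up name, not my actual code), a strict ByteString holds the whole file in one contiguous buffer with no per-byte allocation:

    import qualified Data.ByteString as B

    -- Read the whole file into a single strict buffer instead of
    -- a [Word8]; no cons cell is allocated per input byte.
    copyRaw :: FilePath -> FilePath -> IO ()
    copyRaw src dst = do
        bytes <- B.readFile src
        B.writeFile dst bytes

Porting the parser to run over a strict ByteString with an explicit offset would presumably cut the allocation a lot, but perhaps there is something cheaper I am overlooking?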
-- Gracjan