
On Thu, Apr 02, 2009 at 07:55:07PM -0400, Rick R wrote:
You could profile your app for memory usage, then figure out just which function is blowing up the memory usage and how to optimize it.
http://book.realworldhaskell.org/read/profiling-and-optimization.html
2009/4/2
I'm relatively new to Haskell, so, as one does, I am rewriting an existing program in Haskell to help learn the language.
However, it eats up all my RAM whenever I run the program.
http://hpaste.org/fastcgi/hpaste.fcgi/view?id=3175#a3175
Obviously I'm doing something wrong, but without my magical FP pants I don't know what that might be.
I ran some profiling as suggested:
[SNIP]
total time  =  8.36 secs  (418 ticks @ 20 ms)
total alloc =  3,882,593,720 bytes  (excludes profiling overheads)

COST CENTRE  MODULE  %time  %alloc

line         PkgDb    89.7    93.5

                                           individual      inherited
COST CENTRE  MODULE  no.   entries  %time  %alloc   %time  %alloc

line         PkgDb   305   109771    89.7    93.3    89.7    93.3
[SNIP]
The line function is part of the file parser:
line :: Parser String
line = anyChar `manyTill` newline
files' :: Parser Files
files' = line `manyTill` newline
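
For what it's worth, the profile is consistent with line allocating a
fresh cons cell per character: anyChar `manyTill` newline builds each
line one Char at a time. For comparison, a minimal sketch using only
the Prelude (an editorial aside, not the Parsec code above) shows that
a strictly line-oriented format can be split without a parser library:

```haskell
-- Minimal sketch, base only: for a line-oriented format, Prelude's
-- `lines` does the same splitting as anyChar `manyTill` newline,
-- without parser-combinator overhead per character.
main :: IO ()
main = print (lines "a\nbb\nccc\n")
-- prints ["a","bb","ccc"]
```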
Perhaps I should also explain the structure of the file. It's for a
simple package manager called pkgutils, used for CRUX[1]. The file
contains information about all the installed packages and is
structured as follows:
<package name>
<package version>
<file>
<file>
...
<file>
<package name>
...
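
Assuming each record is terminated by a blank line (which matches the
files' parser above), a base-only sketch of reading this structure
could look like the following; the names Pkg, parseDb and splitOn are
made up for illustration and are not part of pkgutils:

```haskell
-- Sketch only (base package, no Parsec); Pkg/parseDb/splitOn are
-- illustrative names, not part of pkgutils.
data Pkg = Pkg
  { pkgName    :: String
  , pkgVersion :: String
  , pkgFiles   :: [String]
  } deriving (Show, Eq)

-- Assumes each record is terminated by a blank line, matching
-- files' = line `manyTill` newline above.
parseDb :: String -> [Pkg]
parseDb = map toPkg . filter (not . null) . splitOn null . lines
  where
    toPkg (n : v : fs) = Pkg n v fs
    toPkg r            = error ("malformed record: " ++ show r)
    -- split a list into groups at elements matching the predicate
    splitOn p xs = case break p xs of
      (g, [])       -> [g]
      (g, _ : rest) -> g : splitOn p rest

main :: IO ()
main = mapM_ print (parseDb "pkg1\n1.0\n/usr/bin/a\n\npkg2\n2.3\n/etc/b\n")
```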
The profiling shows that the memory is simply consumed by reading in
all the lines; the graph produced with -p -hd shows an almost
O(n log n) growth of the heap as the collection of lines grows.
Is there a better way to do this?
[1] http://crux.nu
--
Lucas Hazel