Have you tried the lazy bytestring version?
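A minimal sketch of what that lazy version could look like, assuming the same scanl' as in the original program below (Data.ByteString.Lazy.Char8's hGetContents pulls chunks from the handle on demand rather than materializing the whole stream first):

import Control.DeepSeq
import System.IO
import qualified Data.ByteString.Lazy.Char8 as BSL

-- same strict scanl' as in the original post
scanl' :: NFData a => (a -> b -> a) -> a -> [b] -> [a]
scanl' f q ls = q : case ls of
  []   -> []
  x:xs -> let q' = f q x
          in q' `deepseq` scanl' f q' xs

main :: IO ()
main = do
  file  <- openBinaryFile "/dev/zero" ReadMode
  chars <- BSL.hGetContents file   -- lazy: reads chunks as they are consumed
  let rv = drop 100000000000 $ scanl' (+) 0 $ map fromEnum $ BSL.unpack chars
  print (head rv)

It still never produces an answer from /dev/zero in any reasonable time, of course, but memory should stay flat instead of exhausting the heap.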
Unfortunately, readFile tries to get the file size.

On Tue, Apr 16, 2013 at 11:28 AM, Clark Gaebel <cg.wowus.cg@gmail.com> wrote:
See the comment for hGetContents: "This function reads chunks at a time, doubling the chunksize on each read. The final buffer is then realloced to the appropriate size. For files > half of available memory, this may lead to memory exhaustion. Consider using readFile in this case."

Maybe try lazy bytestrings?

- Clark
On Tuesday, April 16, 2013, Anatoly Yakovenko wrote:
-- So why does this code run out of memory?

import Control.DeepSeq
import System.IO
import qualified Data.ByteString.Char8 as BS

scanl' :: NFData a => (a -> b -> a) -> a -> [b] -> [a]
scanl' f q ls = q : case ls of
  []   -> []
  x:xs -> let q' = f q x
          in q' `deepseq` scanl' f q' xs

main = do
  file <- openBinaryFile "/dev/zero" ReadMode
  chars <- BS.hGetContents file
  let rv = drop 100000000000 $ scanl' (+) 0 $ map fromEnum $ BS.unpack chars
  print (head rv)

-- my scanl' implementation seems to do the right thing, because
--
--   main = print $ last $ scanl' (+) (0::Int) [0..]
--
-- runs without blowing up. so am i creating some thunk here? or is
-- hGetContents storing values? any way to get the exception handler to
-- print a trace of what caused the allocation?
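For comparison, here is a sketch that streams the handle manually with strict reads; the chunkSum helper is hypothetical, not from the thread, and on /dev/zero it never terminates, but it should run in constant space, which suggests the blow-up comes from the strict hGetContents buffering the entire stream before returning, not from scanl':

import System.IO
import qualified Data.ByteString.Char8 as BS

-- hypothetical helper: strictly accumulate a sum over fixed 64K chunks
chunkSum :: Handle -> Int -> IO Int
chunkSum h acc = do
  chunk <- BS.hGet h 65536
  if BS.null chunk
    then return acc                     -- EOF
    else let acc' = acc + BS.foldl' (\s c -> s + fromEnum c) 0 chunk
         in acc' `seq` chunkSum h acc'  -- force before looping: no thunk chain

main :: IO ()
main = do
  h <- openBinaryFile "/dev/zero" ReadMode
  print =<< chunkSum h 0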