
From: haskell-cafe-bounces@haskell.org [mailto:haskell-cafe-bounces@haskell.org] On Behalf Of Manlio Perillo Sent: 02 March 2009 11:01
Eugene Kirpichov wrote:
I'm not considering the lazy IO approach, as it doesn't involve any form of control over resources.
This is not always true. I'm using lazy IO and still have full control over the resources.
parse path = withFile path ReadMode parse'
  where
    parse' :: Handle -> IO (UArr Xxx)
    parse' handle = do
      contents <- L.hGetContents handle
      let v = toU $ xxx $ L.lines contents
      rnf v `seq` return v
The entire file is consumed before the result is returned.
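The idiom above can be sketched with standard types, assuming we want line lengths instead of the original `UArr Xxx` (the names `parseLengths` and `demo.txt` are made up for illustration). The point is the same: force the whole result with `deepseq` while the handle is still open, so closing it afterwards is safe.

```haskell
import qualified Data.ByteString.Lazy.Char8 as L
import Control.DeepSeq (force)
import Control.Exception (evaluate)
import System.IO

-- Force the fully evaluated result inside withFile, so the lazy
-- read is complete before the handle is closed on return.
parseLengths :: FilePath -> IO [Int]
parseLengths path = withFile path ReadMode $ \h -> do
  contents <- L.hGetContents h
  let v = map (fromIntegral . L.length) (L.lines contents)
  evaluate (force v)

main :: IO ()
main = do
  writeFile "demo.txt" "ab\ncdef\n"
  v <- parseLengths "demo.txt"
  print v  -- [2,4]
```

Without the `evaluate (force v)`, the unevaluated thunks would try to read from an already-closed handle once `withFile` returns.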
This only works if the entire file can reasonably fit into memory. If you want to process something really big, then you need some sort of streaming approach, where you look at only a small part of the file (a line, or a block) at a time. This is where the enumerator-iteratee approach looks better: the IO is strict, but you still take a stream-like approach to processing the contents.

BTW, does this discussion remind anyone (else) of Peter Simons's BlockIO proposal? http://cryp.to/blockio/fast-io.html

Alistair
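The strict, block-at-a-time processing described above can be sketched without any iteratee library as a left fold over fixed-size chunks (the names `foldFile` and `input.txt`, and the 64 KiB chunk size, are illustrative, not from any proposal). Only one chunk is resident at a time, and all IO happens eagerly.

```haskell
{-# LANGUAGE BangPatterns #-}
import qualified Data.ByteString as B
import System.IO

-- A strict chunk-at-a-time fold over a file: the whole contents
-- never reside in memory at once, only one 64 KiB block.
foldFile :: (a -> B.ByteString -> a) -> a -> FilePath -> IO a
foldFile step acc0 path = withFile path ReadMode (go acc0)
  where
    go !acc h = do
      chunk <- B.hGetSome h 65536
      if B.null chunk
        then return acc
        else go (step acc chunk) h

main :: IO ()
main = do
  writeFile "input.txt" "hello\nworld\n"
  -- count bytes without ever holding the whole file in memory
  n <- foldFile (\acc c -> acc + B.length c) 0 "input.txt"
  print n  -- 12
```

An enumerator-iteratee framework generalises this pattern: the fold's step function becomes a composable, resumable consumer, but resource handling stays just as strict.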