How to cut a file efficiently?

Hi, let us say I have a text file of a million lines, and I want to cut it into smaller (10K-line) files. How do I do this? I have tried a few ways, but none of them seem lazy (I mean, none avoid reading the whole file at the start). -- Dense bamboo does not keep the water from flowing past; high mountains cannot stop the wild clouds from flying.

split n [] = []
split n xs = take n xs : split n (drop n xs)

main = do
  text <- readFile "source"
  -- unlines rejoins each 10000-line chunk into a single String for writeFile
  mapM_ (\(n, dat) -> writeFile ("dest" ++ show n) (unlines dat))
    . zip [0..] . split 10000 . lines $ text

Modulo brain-os, but you get the idea. This is lazy (because readFile is). Luke

I would just seek to the approximate chunk boundaries (10K, 20K, etc.) and then read forward until hitting a newline. Cheers, Johan
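For what it's worth, a rough sketch of that seek-based idea is below, reusing the "source"/"dest" names from Luke's version; everything else (the ten-way split, the helper names) is made up for illustration. Note that it cuts the file into roughly equal-sized pieces by bytes rather than into exact 10K-line chunks, nudging each boundary forward to the next newline, and it assumes a reasonably large input file.

import System.IO
import qualified Data.ByteString as B

main :: IO ()
main = do
  -- binary mode so byte offsets from hTell/hSeek are exact
  h <- openBinaryFile "source" ReadMode
  total <- hFileSize h
  let chunkSize = max 1 (total `div` 10)   -- ten roughly equal pieces
  -- Seek to each approximate boundary, then read forward to the next newline.
  bounds <- mapM (boundary h) [chunkSize, 2 * chunkSize .. total - 1]
  let starts = 0 : bounds
      ends   = bounds ++ [total]
  mapM_ (copyChunk h) (zip3 [0 :: Int ..] starts ends)
  hClose h

-- Position just after the first newline at or beyond the given offset.
boundary :: Handle -> Integer -> IO Integer
boundary h off = do
  hSeek h AbsoluteSeek off
  _ <- hGetLine h                          -- discard the partial line
  hTell h

-- Copy the byte range [from, to) of the source into its own file.
copyChunk :: Handle -> (Int, Integer, Integer) -> IO ()
copyChunk h (n, from, to) = do
  hSeek h AbsoluteSeek from
  chunk <- B.hGet h (fromIntegral (to - from))
  B.writeFile ("dest" ++ show n) chunk

The trade-off against the lazy readFile version is that nothing has to walk the whole file line by line, but the resulting chunks are equal in size, not in line count.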
participants (3)
- Johan Tibell
- Luke Palmer
- Magicloud Magiclouds