
Dennis Raddle wrote:
I have the following code, which reads several thousand text files and creates some Map data. On Windows XP, it crapped out with the error "resource exhausted: too many open files." So I'm sure I need to make it stricter.
Yes.
I've never tried to do that before, so I'm not sure where the strictness is necessary.
When you use readFile, each file remains open until the entire file has been read. When reading is delayed due to laziness, the open file handles begin to pile up.
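A minimal sketch of the leaky pattern (the file names and the list-of-paths shape are illustrative assumptions, not from the original post):

```haskell
-- Lazy readFile opens each file and returns immediately with an
-- unevaluated thunk; the handle is closed only once the contents
-- have been consumed to the end. Mapping it over thousands of
-- paths therefore opens thousands of handles before any close.
readAllLazy :: [FilePath] -> IO [String]
readAllLazy = mapM readFile
```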
Maybe all I need is to switch to strict ByteStrings?
That should solve your problem in this case, yes.

You might consider using Text instead of ByteString, though, since you are interpreting those values as text. Text has built-in functions for parsing floating point numbers (in Data.Text.Read); with ByteString you would have to convert each line to a String first. I almost always prefer Text or ByteString over String nowadays anyway.

To solve the problem while still using readFile from the Prelude, you would need to make sure that each file is read all the way to the end as you go along. One trick sometimes used for that is the evaluate function from Control.Exception, forcing the length of each file's contents:

    b <- readFile s
    evaluate $ length b
    ...

That causes the entire contents of the file to be read into memory immediately and the file handle to be closed, matching the behavior of readFile for strict ByteString and strict Text.

-Yitz
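A minimal sketch of the strict-Text approach, assuming each file holds a single floating point number (the file names, the one-number-per-file format, and the Map shape are illustrative assumptions, not from the original post):

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import qualified Data.Text.Read as TR

-- Strict Data.Text.IO.readFile reads the whole file and closes its
-- handle before returning, so handles never accumulate across files.
-- Data.Text.Read.double parses a floating point number from the Text,
-- with no detour through String.
loadValues :: [FilePath] -> IO (Map.Map FilePath Double)
loadValues paths = Map.fromList <$> mapM load paths
  where
    load p = do
      txt <- TIO.readFile p              -- handle closed here
      case TR.double (T.strip txt) of
        Right (x, _) -> pure (p, x)
        Left err     -> fail (p ++ ": " ++ err)
```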