RE: [Haskell-cafe] lockFile: fd out of range

On 25 October 2005 17:02, Joel Reymont wrote:
Is there a set limit on the number of file descriptors that a Haskell program can open?
I'm using hs-plugins on FreeBSD to transparently compile, load and launch scripts that establish a connection to a server. I'm getting this error:
internal error: lockFile: fd out of range
Please report this as a bug to glasgow-haskell-bugs@haskell.org, or http://www.sourceforge.net/projects/ghc/
lockFile uses FD_SETSIZE as its idea of the maximum value of a file descriptor. If you can get file descriptors outside that range, then the above error would ensue. Not sure if this is the case on FreeBSD... seems odd if it is. Can you try a truss/ktrace and see the values of your file descriptors?

Cheers,
Simon
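A quick way to watch those descriptor values from the Haskell side, before reaching for truss/ktrace, is to open handles in a loop and print the raw fd of each one. A minimal sketch, assuming the unix package's System.Posix.IO.handleToFd (which invalidates the Handle but leaves the underlying descriptor open, so the numbers keep climbing):

import Control.Monad (forM_)
import System.IO
import System.Posix.IO (handleToFd)

-- Open /dev/null repeatedly and print each new descriptor number.
-- If lockFile is bounded by FD_SETSIZE, openFile should eventually
-- abort with "internal error: lockFile: fd out of range" once the
-- printed values reach that bound.
main :: IO ()
main = forM_ [1 .. 20000 :: Int] $ \_ -> do
  h  <- openFile "/dev/null" ReadMode
  fd <- handleToFd h
  print fd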

Actually, I think I was just hitting the top of FD_SETSIZE's range, about 8000 on that machine. Does FD_SETSIZE get hardcoded into GHC-built binaries? That is, if I increase the number of descriptors available per process with ulimit -n, will that be taken into account?

Thanks, Joel

On Oct 26, 2005, at 10:04 AM, Simon Marlow wrote:
lockFile uses FD_SETSIZE as its idea of the maximum value of a file descriptor. If you can get file descriptors outside this range, then the above error would ensue. Not sure if this is the case on FreeBSD... seems odd if it is. Can you try a truss/ktrace and see the values of your file descriptors?
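For comparison with the kernel side, the per-process descriptor limit (the value ulimit -n adjusts) can be read from Haskell via getrlimit. A small sketch, assuming System.Posix.Resource from the unix package; note that FD_SETSIZE itself is a compile-time C constant, so it is fixed when the runtime is built and does not track the kernel limit:

import System.Posix.Resource

-- Render a ResourceLimit in a readable form.
showLim :: ResourceLimit -> String
showLim ResourceLimitInfinity = "infinity"
showLim ResourceLimitUnknown  = "unknown"
showLim (ResourceLimit n)     = show n

-- Print the soft and hard limits on open files for this process.
-- ulimit -n moves the soft limit; a soft limit above FD_SETSIZE is
-- exactly the situation that can provoke the lockFile error.
main :: IO ()
main = do
  ResourceLimits soft hard <- getResourceLimit ResourceOpenFiles
  putStrLn ("soft: " ++ showLim soft ++ ", hard: " ++ showLim hard)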