
The attached short program (compile with "ghc VServer.hs -o v -package net") is supposed to set up a server on port 15151, wait for a connection, read the first character from the connection, and print it out. Unfortunately, when I test it by running it, starting "telnet [machine] 15151" somewhere else, and typing some random text, e.g. "foo[RETURN]", it does not work. The problem appears to be that VServer.hs issues hSetBuffering handle (BlockBuffering (Just 4096)) on the connection handle: when I change this to hSetBuffering handle NoBuffering, the program works.
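Roughly, the program looks like this (a minimal sketch only, since the attachment is not reproduced here; it assumes the listenOn/accept interface from the Network module in the net package):

    module Main where

    import Network (PortID (PortNumber), accept, listenOn)
    import System.IO

    main :: IO ()
    main = do
      sock <- listenOn (PortNumber 15151)
      (handle, _host, _port) <- accept sock
      -- The line that appears to cause the problem:
      hSetBuffering handle (BlockBuffering (Just 4096))
      c <- hGetChar handle   -- seems to block until the buffer fills
      putStrLn ("Read: " ++ [c])
      hClose handle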
However, switching to NoBuffering is not what I want to do! Setting NoBuffering on the handle means that when the server *outputs* something, it will potentially be written very expensively, character by character. How do I get block buffering on the server's output without having input to the server held up?
Hmm. I rather think that hGetChar should always return a character immediately if there is one available, regardless of the buffering mode. Looking at the source, it appears that hGetLine behaves like this, as does lazy reading with hGetContents. I can't see any reason for waiting for the buffer to be completely full before returning anything. If you have a source tree handy, try the enclosed patch. If not, make a copy of hGetChar from the sources in libraries/base/GHC/IO.hs, apply the patch, and compile it separately (you'll need to import GHC.Handle explicitly, amongst other things).

Cheers,
Simon
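In the meantime, since hGetLine is observed above to return as soon as input is available even under block buffering, one possible interim workaround (a sketch only, not the patch referred to above) is to read line by line while keeping block buffering for output:

    module Main where

    import Network (PortID (PortNumber), accept, listenOn)
    import System.IO

    main :: IO ()
    main = do
      sock <- listenOn (PortNumber 15151)
      (handle, _host, _port) <- accept sock
      hSetBuffering handle (BlockBuffering (Just 4096))
      line <- hGetLine handle            -- returns once a full line arrives
      putStrLn ("Read: " ++ take 1 line) -- we only need the first character
      hPutStrLn handle "ok"              -- output is still block-buffered
      hFlush handle                      -- flush explicitly when the reply is complete
      hClose handle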