
I still think Haskell is using screwy defaults for stdout buffering.
Line buffering would be an inefficient default. In C99, stdout is fully buffered when it can be determined that it does not refer to an interactive device.
Putting efficiency before efficacy is un-Haskellian. And how can the required determination be implemented? Stdout may be attached to a terminal indirectly, through a pipeline. It may be "attached" through a file that another process is watching. It may be attached to some other real-time device or system.

Buffering can cause unbounded delays in systems with sporadic input, and it can cause deadlock in systems that employ feedback, as most systems do. Terminals, for example, are unbuffered not merely because users are impatient, but more critically because users are feedback agents. Because the effects of buffering are not transparent, it would be wise to make it an optional optimization, not a default.
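For concreteness, here is a small sketch of the C99-style policy expressed with GHC's `System.IO` primitives (`hIsTerminalDevice`, `hSetBuffering`, `hGetBuffering`). Note that it also illustrates the objection above: `hIsTerminalDevice` only sees *direct* attachment, so a terminal at the far end of a pipeline, or a file another process is tailing, looks exactly like a non-interactive sink.

```haskell
import System.IO

main :: IO ()
main = do
  -- True only when stdout is directly a terminal device;
  -- a pipeline or watched file reports False here.
  istty <- hIsTerminalDevice stdout

  -- C99-style policy: line-buffer for a terminal,
  -- block-buffer (default-size buffer) otherwise.
  hSetBuffering stdout $
    if istty then LineBuffering else BlockBuffering Nothing

  -- Report the mode on stderr so the message itself is not buffered.
  mode <- hGetBuffering stdout
  hPutStrLn stderr ("stdout buffering: " ++ show mode)
```

A program that wants buffering strictly as an opt-in optimization can instead start with `hSetBuffering stdout NoBuffering` (or call `hFlush stdout` at each feedback point) and enable block buffering only where throughput demonstrably matters.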