
Hi Adrian,
The "bug" is in ghc stack management. Why is it so important that the stack size is arbitrarily limited?
It's not, but it makes some things easier and faster. A better question is why it is important for the stack to grow dynamically. The answer is that it's not.
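To make that concrete, here is the classic pattern that trips the limit (a sketch of my own, not taken from anyone's actual program): lazy foldl builds a long chain of thunks, and forcing the result then needs stack depth proportional to the length of the list.

    import Data.List (foldl')

    main :: IO ()
    main = do
      -- Lazy foldl builds ten million nested (+) thunks; forcing
      -- them typically aborts with "stack overflow" under GHC's
      -- default limit:
      -- print (foldl (+) 0 [1 .. 10000000 :: Int])
      -- The strict foldl' forces the accumulator as it goes and
      -- runs in constant stack space:
      print (foldl' (+) 0 [1 .. 10000000 :: Int])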
> It's just an intermediate data structure, no different from any other intermediate data structure you may build on the heap (well, apart from its efficiency). But I guess we would be in danger of having our programs run too fast if folk were silly enough to make use of the stack.
In C, putting something on the stack is massively more efficient than putting it on the heap. In Haskell there is nearly no difference, and I can imagine some situations where the heap is actually faster. I guess your comment about speed relates to that assumption?
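As an illustration (my own sketch, with made-up names sumStack and sumHeap), the same computation can keep its intermediate state either as stack frames or as an accumulator threaded through a tail-recursive loop, and in GHC the two are much closer in cost than their C analogues would be:

    {-# LANGUAGE BangPatterns #-}

    -- Intermediate state as stack frames: each call pushes a
    -- continuation waiting to add x to the result of the recursion.
    sumStack :: [Int] -> Int
    sumStack []       = 0
    sumStack (x : xs) = x + sumStack xs

    -- Intermediate state as a strict accumulator in a tail-recursive
    -- loop: stack usage stays constant (and once optimised, GHC may
    -- keep the accumulator in a register rather than on the heap).
    sumHeap :: [Int] -> Int
    sumHeap = go 0
      where
        go !acc []       = acc
        go !acc (x : xs) = go (acc + x) xs

    main :: IO ()
    main = print (sumHeap [1 .. 1000000])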
> So perhaps the current GHC defaults are too generous. What limit do you think should be placed on the stack size that a non-buggy program can use?
The current limits are fine in virtually all cases: they abort buggy programs, and it's rare that a non-buggy program needs to change them. In other words, years of experience have settled on good defaults; for the rare exceptions, see the sketch below.
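A program that legitimately needs a deeper stack can raise the limit per run with the RTS's -K flag rather than by changing the program (a sketch; depending on GHC version you may also need to compile with -rtsopts before the program will accept RTS options):

    $ ghc -O2 Main.hs
    $ ./Main +RTS -K64m -RTS    # allow the stack to grow to 64 megabytes

Thanks
Neil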