
#8189: Default to infinite stack size?
-------------------------------------+---------------------------------------
         Reporter:  nh2              |             Owner:
             Type:  bug              |            Status:  new
         Priority:  normal           |         Milestone:
        Component:  Runtime System   |           Version:  7.6.3
       Resolution:                   |          Keywords:
 Operating System:  Unknown/Multiple |      Architecture:  Unknown/Multiple
  Type of failure:  Runtime crash    |        Difficulty:  Easy (less than 1 hour)
        Test Case:                   |        Blocked By:
         Blocking:                   |   Related Tickets:
-------------------------------------+---------------------------------------

Comment (by simonmar):

There are a few things I'm not completely happy with here. By all means
turn off the default stack limit, but the business with `allocateFail()`
is misguided: `allocateFail()` will not return NULL here. It returns NULL
only when a single request for memory exceeds the maximum heap size, and
(a) by default there is no maximum heap size, and (b) even if there were,
it would be highly unlikely to be smaller than the stack chunk size (32K).
Did anyone test this? The `heapOverflow()` test in `allocate()` is purely
optional, see #1791.

The typical behaviour when someone writes a program that accidentally
blows the stack will be for the machine to grind to a halt swapping,
until the OS finally kills something (hopefully the right process). We
have this behaviour for space leaks now, so I guess it is no worse to
have it for stack leaks too. An actual "out of memory" error is rare;
typically it happens when you try to allocate an array larger than the
memory size, or something like that.

Austin: could you back this out please, and let's discuss it some more.

--
Ticket URL: http://ghc.haskell.org/trac/ghc/ticket/8189#comment:12
GHC http://www.haskell.org/ghc/
The Glasgow Haskell Compiler
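
A minimal C sketch of the failure mode described in the comment above: an
allocator that returns NULL only when a single request exceeds the maximum
heap size. The names (`allocate_fail`, `MAX_HEAP_BYTES`, `STACK_CHUNK_SIZE`)
and the `malloc`-based body are illustrative assumptions, not the actual RTS
code; the point it demonstrates is that with the default of no heap limit the
NULL branch can never fire, so a runaway stack grown one 32K chunk at a time
is never caught by checking the allocator's return value.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical stand-ins for the RTS configuration: -M sets the maximum
 * heap size in GHC, and 0 here models the default of "no limit". Stack
 * chunks are 32K, as mentioned in the comment. */
#define MAX_HEAP_BYTES   0            /* default: no maximum heap size */
#define STACK_CHUNK_SIZE (32 * 1024)  /* stack chunk size */

/* Sketch of an allocator that fails (returns NULL) only when a single
 * request is larger than the maximum heap size. */
static void *allocate_fail(size_t bytes)
{
    if (MAX_HEAP_BYTES != 0 && bytes > (size_t)MAX_HEAP_BYTES) {
        return NULL;  /* single over-sized request: the only NULL case */
    }
    /* Otherwise the allocation "succeeds"; a runaway stack just keeps
     * consuming memory until the OS intervenes. */
    return malloc(bytes);
}

int main(void)
{
    /* A 32K stack chunk is far below any plausible heap limit, so this
     * request can never take the NULL branch above. */
    void *chunk = allocate_fail(STACK_CHUNK_SIZE);
    printf("stack chunk allocation %s\n", chunk ? "succeeded" : "failed");
    free(chunk);
    return 0;
}
```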