
#8189: Default to infinite stack size?
------------------------------------+--------------------------------------------
        Reporter:  nh2              |            Owner:
            Type:  bug              |           Status:  patch
        Priority:  high             |        Milestone:  7.8.1
       Component:  Runtime System   |          Version:  7.6.3
      Resolution:                   |         Keywords:
Operating System:  Unknown/Multiple |     Architecture:  Unknown/Multiple
 Type of failure:  Runtime crash    |       Difficulty:  Easy (less than 1 hour)
                                    |       Blocked By:
                                    |  Related Tickets:
                                    |        Test Case:
                                    |         Blocking:
------------------------------------+--------------------------------------------

Comment (by rwbarton):

There are some issues of units here. The Win32 `getPhysicalMemorySize`
returns the physical memory size in bytes, while the POSIX one uses units
of pages (you need to multiply by `getPageSize()`), and
`RtsFlags.GcFlags.maxStkSize` expects units of words (`sizeof(W_)`).

Your `getPhysicalMemorySize` functions have the same unused variable as in
#8289.

A 32-bit system can easily have 4 GB (or even 16 GB) of physical memory,
in which case the physical memory size in bytes (respectively, words) will
overflow a `W_`. In this case, the stack size should be unlimited. I'm not
sure if this is what the `if (maxStkSize <= 0)` test is supposed to be
checking, but it won't work in general (imagine you somehow have just over
4 GB of RAM: then `getPhysicalMemorySize()` will overflow to a very small
number). You should return `StgWord64` from `getPhysicalMemorySize` and
use `StgWord64` when computing the default stack size; that should last a
while. :)

(FWIW I'm not totally sold on the idea of trying to guess an appropriate
maximum stack size, but the alternatives of the current small stack limit
or no stack limit at all aren't entirely satisfying either.)

--
Ticket URL: http://ghc.haskell.org/trac/ghc/ticket/8189#comment:23
GHC http://www.haskell.org/ghc/
The Glasgow Haskell Compiler
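
To make the unit conversions and the overflow hazard concrete, here is a minimal,
standalone sketch of the arithmetic being discussed. It is not the patch itself:
`sysconf(_SC_PHYS_PAGES)`/`sysconf(_SC_PAGESIZE)` (a common glibc extension) stand
in for the RTS's `getPhysicalMemorySize()`/`getPageSize()`, `size_t` stands in for
`W_`, and the "limit the stack to the size of physical memory" policy is purely
illustrative.

```c
/* Sketch only: derive a default maximum stack size from physical memory,
 * doing the arithmetic in 64 bits so it does not wrap on a 32-bit machine.
 * sysconf() stands in for the RTS helpers; size_t stands in for W_.
 * Error handling (sysconf returning -1) is omitted for brevity. */
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Physical memory is reported in pages; multiply by the page size to
     * get bytes.  Both factors are widened to 64 bits before multiplying. */
    uint64_t pages     = (uint64_t) sysconf(_SC_PHYS_PAGES);
    uint64_t page_size = (uint64_t) sysconf(_SC_PAGESIZE);
    uint64_t mem_bytes = pages * page_size;

    /* maxStkSize is measured in words, not bytes. */
    uint64_t mem_words = mem_bytes / sizeof(size_t);

    /* Illustrative policy: let the stack grow up to the size of physical
     * memory.  If that many words does not fit in a machine word (e.g. a
     * 32-bit system with 16 GB of RAM), report the stack as unlimited
     * rather than storing a wrapped-around, tiny value. */
    if (mem_words > (uint64_t) SIZE_MAX) {
        printf("default stack limit: unlimited\n");
    } else {
        printf("default stack limit: %llu words\n",
               (unsigned long long) mem_words);
    }
    return 0;
}
```

The same "64 bits first" shape is what the comment recommends for the RTS: have
`getPhysicalMemorySize` return `StgWord64`, keep the intermediate computation in
`StgWord64`, and only clamp (or disable the limit) when the result is stored into
the `W_`-sized `maxStkSize`.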