
On 06/06/2011 10:23, Max Bolingbroke wrote:
On 6 June 2011 02:34, KQ wrote:
> The shock here is that there was only one failure, whereas the "False ~=? True" should have failed.
I'm not sure, but at a glance it looks like you might have hit the usual problem where compiling your test with optimisations means that GHC optimises away the test failure. This is a known problem. If you compile with -fno-state-hack (or with -O0) it should work.
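For readers following along without the HUnit package installed, the relevant mechanism can be sketched with base alone. `assertEqual'` below is a hypothetical stand-in for HUnit's `assertEqual`/`(~=?)`, not the library's actual code; the point is only that a test failure is signalled by throwing an exception, which is the kind of effect the optimisation problem Max describes was reportedly discarding:

```haskell
import Control.Exception (IOException, throwIO, try)

-- Hypothetical, base-only stand-in for HUnit's assertEqual / (~=?):
-- a failure is reported by throwing an exception, as HUnit does.
assertEqual' :: (Eq a, Show a) => String -> a -> a -> IO ()
assertEqual' label expected actual
  | expected == actual = return ()
  | otherwise =
      throwIO (userError (label ++ ": expected " ++ show expected
                                ++ " but got " ++ show actual))

main :: IO ()
main = do
  assertEqual' "sanity" True True              -- passes silently
  r <- try (assertEqual' "shock" False True)   -- should fail
         :: IO (Either IOException ())
  case r of
    Left _  -> putStrLn "failure reported, as expected"
    Right _ -> putStrLn "failure was lost"
```

This sketch only shows how the failure is signalled; it does not by itself reproduce the optimisation behaviour under discussion.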
Can anyone expand on this? It seems odd to me that an optimisation that causes some programs to behave incorrectly is switched on as part of the optimisation suites, especially as there's no warning in the manual (or at least not under http://www.haskell.org/ghc/docs/latest/html/users_guide/options-optimise.htm... which seems like the obvious place).

I found some references to -fno-state-hack being required because compiling without it occasionally causes bad performance, and also to prevent unsafePerformIOs from being optimised away, but these problems seem a lot less bad. (Performance changes aren't as bad as incorrectness, and unsafePerformIO is known to be, well, unsafe.) I also found quite a few references to how beneficial this optimisation is to IO code, but I wonder whether program correctness is a price worth paying.

Or is this bad behaviour due to HUnit doing something unsafe?

Regards, Jim
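As an aside on the unsafePerformIO point above: the pattern usually cited is a nominally pure value carrying a hidden side effect, as in Debug.Trace. Here is a minimal hand-rolled version; `trace'` is a made-up name (Debug.Trace.trace is the real library function), and this code makes no claim about whether the effect survives optimisation:

```haskell
import System.IO.Unsafe (unsafePerformIO)

-- Hypothetical re-implementation of Debug.Trace.trace: a "pure"
-- function whose hidden putStrLn runs only as a side effect of
-- evaluation. Whether such effects are preserved under -O is exactly
-- what the state-hack discussion is about.
trace' :: String -> a -> a
trace' msg x = unsafePerformIO (putStrLn msg >> return x)

main :: IO ()
main = print (trace' "evaluating" (2 + 2 :: Int))
```

When the result is demanded, the hidden putStrLn runs before the value is returned; an optimiser that discards the unsafePerformIO would silently drop the message.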