
#9533: Signed/unsigned integer difference between compiled and interpreted code
-------------------------------------+-------------------------------------
        Reporter:  MichaelBurge      |             Owner:
            Type:  bug               |            Status:  new
        Priority:  high              |         Milestone:
       Component:  Compiler          |           Version:  7.8.3
      Resolution:                    |          Keywords:
Operating System:  Unknown/Multiple  |      Architecture:  x86_64 (amd64)
 Type of failure:  Incorrect result  |        Difficulty:  Unknown
  at runtime                         |        Blocked By:
       Test Case:                    |   Related Tickets:
        Blocking:                    |
Differential Revisions:              |
-------------------------------------+-------------------------------------

Comment (by rwbarton):

 Some more good stuff.

 {{{
 test = case 1 :: Int of
   18446744073709551617 -> "A" -- 2^64 + 1
   _ -> "B"

 main = putStrLn $ show test
 }}}

 prints "B". The `KnownBranch` optimization fires (even without `-O`) and
 thinks that 1 is not 18446744073709551617.

 I'm inclined to conclude that Core `Literal`s should always be wrapped to
 the range of integers that their type can represent. I assume that
 `HsLit`s should hold the actual integer in the source program, though, so
 we can display expressions as the user wrote them. Simon, any thoughts?

 Also, there is no warning about the out-of-range literal patterns in any
 of these programs. Perhaps that ought to be a separate feature request
 ticket.

--
Ticket URL: http://ghc.haskell.org/trac/ghc/ticket/9533#comment:3
GHC http://www.haskell.org/ghc/
The Glasgow Haskell Compiler
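
[Editorial note] To make the discrepancy above concrete: on a 64-bit GHC, converting the
out-of-range Integer 2^64 + 1 to Int wraps it to 1, so an evaluator that wraps the
pattern literal before comparing would take the "A" branch, while the compiled code
(via `KnownBranch`) compares against the unwrapped value and takes "B". A minimal
standalone check of the wrapping behaviour, assuming a 64-bit Int (this is an
illustration, not GHC's internal code):

{{{
-- Standalone illustration (not from the ticket): the pattern literal
-- 18446744073709551617 (2^64 + 1) wraps to 1 when forced into a 64-bit
-- Int, which is why the two evaluation strategies can disagree.
main :: IO ()
main = do
  let big     = 18446744073709551617 :: Integer  -- 2^64 + 1
      wrapped = fromInteger big :: Int           -- wraps to 1 on x86_64
  print wrapped           -- 1
  print (wrapped == 1)    -- True: a wrapped comparison would match "A"
}}}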
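As a sketch of the proposed fix, wrapping a literal into the range its type can
represent amounts to reducing it modulo 2^n into the signed n-bit interval. The
helper below is hypothetical (it is not GHC's `Literal` API); it only shows the
arithmetic the normalisation would perform:

{{{
-- Hypothetical helper, not GHC's actual code: normalise an Integer into
-- the representable range of a signed n-bit type.
wrapSigned :: Int -> Integer -> Integer
wrapSigned bits x = ((x + half) `mod` (2 ^ bits)) - half
  where half = 2 ^ (bits - 1)

-- wrapSigned 64 18446744073709551617 == 1, so a normalised Core literal
-- would agree with what the value wraps to at runtime.
}}}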
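For the suggested warning, the front end could flag literal patterns that fall
outside the matched type's bounds. A hedged sketch of such a check for Int, using
minBound/maxBound (hypothetical; no such warning exists in the version reported):

{{{
-- Hypothetical range check for the proposed warning on literal patterns.
outOfRangeForInt :: Integer -> Bool
outOfRangeForInt n = n < toInteger (minBound :: Int)
                  || n > toInteger (maxBound :: Int)

-- outOfRangeForInt 18446744073709551617 == True, so the patterns in the
-- examples above would be reported.
}}}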